WHITE PAPER

AI Factory: A Case Study For Total Cost Of Ownership

Executive Summary

The explosive growth in AI computing demand (projected to require 171-219 GW globally by 2030) necessitates rethinking data center design and location strategies. By leveraging stranded power sources and climates well suited to free cooling, AI factories can achieve a significantly lower total cost of ownership (TCO) while supporting the infrastructure demands of generative AI applications.

The analysis suggests shifting away from traditional hubs like Northern Virginia toward regions with abundant, low-cost power and favorable cooling conditions.



AI Factory

Optimizing Total Cost of Ownership Through Strategic Design & Location

The AI Revolution
Generative AI is fundamentally changing data center requirements with exponentially higher power and cooling demands
The Opportunity
Strategic design and location choices can dramatically reduce Total Cost of Ownership while meeting these new demands

The Explosive Growth in AI Computing Demand

2023
55 GW
Global data center power
2030
171-219 GW
Projected requirement
That's a 3-4x increase in just 7 years
US Impact
Data center energy usage expected to reach 6.7% to 12% of total US electricity consumption by 2028
Context: ChatGPT
Answers 1 billion queries per day - it took Google 11 years to reach that milestone
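
A quick sanity check of these growth figures, sketched in Python using only the numbers quoted above:

```python
# Back-of-the-envelope check on the projected demand growth cited above.
base_gw = 55                 # global data center power, 2023
projections_gw = (171, 219)  # projected global requirement, 2030
years = 2030 - 2023

for target in projections_gw:
    multiple = target / base_gw
    cagr = multiple ** (1 / years) - 1
    print(f"{target} GW by 2030: {multiple:.1f}x growth (~{cagr:.0%}/year)")
# Roughly 3.1x to 4.0x overall, i.e. about 18-22% compound annual growth.
```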

Why AI is Different: Computing Paradigm Shift

Traditional Computing
  • Deterministic: Rules-based processing with predictable outputs
  • Linear scaling: Complexity increases proportionally
  • Modest hardware: Runs efficiently on standard CPUs
  • Lower energy: Reasonable power per operation
Generative AI
  • Probabilistic: Pattern-based responses from vast datasets
  • Exponential scaling: Enormous computational resources required
  • Specialized hardware: Requires GPUs + CPUs working in tandem
  • High energy: Significantly more power per operation
The Bottom Line
AI requires a fundamentally different data center design - we call this an "AI Factory"

The Power Density Challenge

4-9 kW
Traditional Data Center
Typical of 67% of data centers today
VS
132 kW
AI Factory Rack
NVL72 SuperCluster
That's 15-30x higher power density
>1,000
Watts per sq ft
Forces a move from air to liquid cooling
<2%
Of current data centers
Have racks >50 kW today
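
The density figures above imply the rough math sketched below; the per-rack floor footprint is an assumption chosen for illustration, not a figure from this paper:

```python
# Rough power-density math implied by the figures above.
traditional_rack_kw = (4, 9)   # typical rack density today
ai_factory_rack_kw = 132       # NVL72-class AI Factory rack

low = ai_factory_rack_kw / traditional_rack_kw[1]    # vs a 9 kW rack
high = ai_factory_rack_kw / traditional_rack_kw[0]   # vs a 4 kW rack
print(f"Density increase: roughly {low:.0f}x to {high:.0f}x")

# Hypothetical floor-area view: assume ~25-30 sq ft per rack position once
# aisles and service clearances are included (an assumed range, for
# illustration only).
for footprint_sqft in (25, 30):
    watts_per_sqft = ai_factory_rack_kw * 1000 / footprint_sqft
    print(f"~{watts_per_sqft:,.0f} W/sq ft at {footprint_sqft} sq ft per rack")
# Either assumption lands well above the 1,000 W/sq ft threshold noted above.
```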

AI Factory vs Traditional Data Center

Traditional Data Center
  • Located near users to minimize latency
  • Optimized for diverse workloads
  • Air cooling sufficient
  • Multi-tenant colocation facilities common
  • Success = uptime + response time
AI Factory
  • Optimized for internal processor communication
  • Massive parallel computing focus
  • Liquid cooling required (>50kW/rack)
  • Single tenant for custom optimization
  • Success = computational throughput + efficiency
Key Insight: Different Optimization = Different Location Strategy

The Power Supply Challenge

Scale of Demand
A single future AI Factory may need 4 GW of power - equivalent to all of Northern Virginia's current data center consumption
Infrastructure Strain
Current grid infrastructure and transmission capacity cannot support this explosive growth in traditional locations
The Solution: Stranded Power
Renewable generation has outpaced transmission capacity, creating opportunities for data centers to access curtailed power
Win-Win Scenario
AI Factories can access low-cost renewable power while helping stabilize the grid and reduce waste

The Cooling Revolution: From Constraint to Opportunity

Forced Evolution
Power density >50kW per rack forces liquid cooling - no choice in the matter
The Opportunity
Liquid cooling systems can utilize free cooling with only a 10-18°F temperature differential
Air Cooling Limitation
Air cooling requires a 25-45°F differential - much more restrictive for location selection
Free Cooling Benefits
Uses ambient air to cool liquid directly - no mechanical refrigeration needed for significant portions of the year
Result: More Locations Become Viable for Significant Free Cooling
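
To illustrate why the smaller differential matters, the sketch below compares the warmest outdoor temperature at which each approach can still run on free cooling. The supply and inlet temperature setpoints are assumptions chosen for illustration; only the differentials come from this paper:

```python
# Warmest ambient temperature that still allows free cooling, given a target
# coolant (or server inlet) temperature and the required approach differential.

def max_ambient_for_free_cooling(target_temp_f: float, differential_f: float) -> float:
    """Outdoor air must be at least `differential_f` cooler than the target."""
    return target_temp_f - differential_f

# Liquid cooling: warm-water loop, assumed ~95F supply, 10-18F differential.
liquid = [max_ambient_for_free_cooling(95, d) for d in (18, 10)]
print(f"Liquid cooling runs on free cooling up to ~{liquid[0]:.0f}-{liquid[1]:.0f}F outdoors")

# Air cooling: assumed ~75F server inlet, 25-45F differential.
air = [max_ambient_for_free_cooling(75, d) for d in (45, 25)]
print(f"Air cooling runs on free cooling up to ~{air[0]:.0f}-{air[1]:.0f}F outdoors")
# With the tighter liquid-cooling differential, far more hours of the year
# (and far more locations) qualify for free cooling.
```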

Quantifying the TCO Impact

$60-90M
Annual savings for a 100MW AI Factory with optimized design & location
Factor             Industry Avg    Improved     Optimized
PUE                1.58            1.33         1.18
Electricity Rate   $0.09/kWh       $0.07/kWh    $0.04/kWh
Annual Cost        $125M           $82M         $41M
30-Year Impact
$1.8-2.7 billion in total savings over the facility lifespan
Cost Drivers
Electricity + lease rates = 80%+ of operating expenses
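
The annual-cost row in the table above follows from a simple energy-only model (facility power = IT load x PUE), sketched below; lease and other operating costs are ignored in this simplification:

```python
# Reproducing the annual-cost row of the TCO table for a 100 MW (IT load)
# AI Factory, using energy cost only: IT load x PUE x hours x rate.
HOURS_PER_YEAR = 8760
IT_LOAD_MW = 100

scenarios = {
    "Industry Avg": {"pue": 1.58, "rate_usd_per_kwh": 0.09},
    "Improved":     {"pue": 1.33, "rate_usd_per_kwh": 0.07},
    "Optimized":    {"pue": 1.18, "rate_usd_per_kwh": 0.04},
}

annual_cost = {}
for name, s in scenarios.items():
    facility_kw = IT_LOAD_MW * 1000 * s["pue"]     # total draw incl. cooling
    annual_cost[name] = facility_kw * HOURS_PER_YEAR * s["rate_usd_per_kwh"]
    print(f"{name:13s} ~${annual_cost[name] / 1e6:,.0f}M per year")

savings = annual_cost["Industry Avg"] - annual_cost["Optimized"]
print(f"Annual energy savings: ~${savings / 1e6:,.0f}M")
print(f"Over 30 years:         ~${savings * 30 / 1e9:.1f}B")
# Prints ~$125M, ~$82M, ~$41M per year; ~$83M annual and ~$2.5B 30-year
# savings, consistent with the $60-90M and $1.8-2.7B ranges cited above.
```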

Strategic Location Analysis: The Evidence

Northern Virginia
• Current data center capital
• Power constrained: uses more power than it generates
• 174 days/year of free cooling potential
• Focus on intermittent generation
Texas
• Largest electricity generator in the US
• Supply delays: 180-day waits for new loads
• Only 95 days/year of free cooling
• Least stable grid: 13% of US outages
North Dakota
• Exports 33% of the power it generates
• Abundant stranded wind power
• 220+ days/year of free cooling
• Lowest electricity rates in the US
The Math
• 125 extra days of free cooling per year (ND vs. TX)
• 4¢/kWh electricity rate advantage
• Combined impact: $45-65M in annual savings
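
A rough check of how the combined figure is reached, assuming the PUE values from the TCO table earlier; the 4¢/kWh gap and the free-cooling day counts come from the comparison above:

```python
# Rough decomposition of the ND-vs-TX savings estimate above.
HOURS_PER_YEAR = 8760
IT_LOAD_MW = 100
RATE_ADVANTAGE_USD_PER_KWH = 0.04     # ND vs TX electricity rate gap
EXTRA_FREE_COOLING_DAYS = 220 - 95    # 125 extra days per year in ND

for pue in (1.18, 1.33, 1.58):        # assumed PUE values, from the TCO table
    facility_kwh = IT_LOAD_MW * 1000 * pue * HOURS_PER_YEAR
    rate_savings = facility_kwh * RATE_ADVANTAGE_USD_PER_KWH
    print(f"PUE {pue}: rate advantage alone saves ~${rate_savings / 1e6:.0f}M/year")

print(f"Plus {EXTRA_FREE_COOLING_DAYS} extra free-cooling days/year, which cut "
      "mechanical-cooling energy and push the combined impact into the "
      "$45-65M range cited above.")
```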

The Strategic Imperative

The Evidence is Clear
Strategic location and design choices deliver $60-90M annual TCO reduction for 100MW AI Factories
First Mover Advantage
Best locations with stranded power and optimal cooling are limited resources - early action secures decades of advantage
Sustainable Growth
Leveraging renewable stranded power and minimizing water usage creates responsible AI infrastructure
Proven Approach
Polaris Forge in Ellendale, ND demonstrates these principles in action with a 1.18 PUE target and community integration
$1.8-2.7B
30-year value creation per 100MW facility
The future of AI infrastructure: Responsible • Sustainable • Cost-Optimized