Why Renewable Co-location Changes the Economics of AI

The AI industry has an energy problem that is widely acknowledged and poorly understood. The standard narrative focuses on the sheer volume of energy consumed by AI compute. The more consequential problem is that the AI industry is competing for the most expensive energy on earth — grid-delivered, peak-demand, metropolitan-priced electricity — when the cheapest energy on earth is being wasted in plain sight.

Curtailed renewable energy — power generated by wind farms, solar installations, and hydro facilities that the grid cannot absorb — is one of the largest untapped energy resources in the world. In Australia alone, curtailed solar generation reached record levels in 2024. In parts of Northern Europe, wind farms are routinely paid to switch off. In South America, hydro facilities generate surplus power that has no commercial outlet. This energy is not expensive. Its marginal cost approaches zero.

The constraint is not the energy. It is the assumption that compute must be located where the grid delivers power. This assumption, inherited from the first generation of data centre development, no longer holds. Factory-manufactured, self-contained AI compute modules can be deployed directly to the site of energy generation — co-located with the renewable asset, connected behind the meter, independent of grid transmission constraints.

The economics of this model are transformative. The energy developer gains a guaranteed off-taker for generation that would otherwise be curtailed — improving project returns, increasing capacity factor, and securing revenue that the grid connection cannot provide. The AI compute operator gains energy at near-zero marginal cost, no grid dependency, and deployment timelines measured in days rather than the years required for new grid-connected generation.

The levelised cost of compute under this model is structurally lower than any grid-connected alternative — because the single largest operating cost of any data centre, energy, has been reduced to near zero. When this energy cost advantage is combined with the capital efficiency of factory-manufactured modular infrastructure, the resulting cost trajectory is one that fixed hyperscale construction cannot approach at any scale.
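The claim above can be made concrete with a back-of-envelope levelised-cost calculation. The sketch below is illustrative only: every number (capex, opex, energy prices, utilisation) is a hypothetical placeholder, not a figure from this article, and the model ignores discounting and degradation. It simply shows how the energy line item dominates the comparison between grid-priced and behind-the-meter power.

```python
# Illustrative levelised-cost-of-compute comparison.
# All figures are hypothetical placeholders chosen for illustration.

def levelised_cost_per_mwh(capex, lifetime_years, annual_opex,
                           energy_price_per_mwh, annual_mwh):
    """Simple levelised cost: straight-line annualised capex plus opex
    and energy, divided by annual IT load delivered (in MWh).
    No discount rate is applied, to keep the arithmetic transparent."""
    annualised_capex = capex / lifetime_years
    annual_energy_cost = energy_price_per_mwh * annual_mwh
    total_annual_cost = annualised_capex + annual_opex + annual_energy_cost
    return total_annual_cost / annual_mwh

# Hypothetical 1 MW compute module running at 90% utilisation.
annual_mwh = 1.0 * 8760 * 0.9  # ~7,884 MWh of IT load per year

grid = levelised_cost_per_mwh(
    capex=8_000_000, lifetime_years=10, annual_opex=400_000,
    energy_price_per_mwh=150.0,  # hypothetical grid-delivered price
    annual_mwh=annual_mwh)

colocated = levelised_cost_per_mwh(
    capex=8_000_000, lifetime_years=10, annual_opex=400_000,
    energy_price_per_mwh=5.0,    # hypothetical curtailed-energy price
    annual_mwh=annual_mwh)

print(f"grid-connected: ${grid:,.0f}/MWh, co-located: ${colocated:,.0f}/MWh")
```

With these placeholder inputs, energy is the largest line item in the grid-connected case, and cutting its price by an order of magnitude roughly halves the levelised cost; the remaining floor is set by capital and operating costs, which is where the factory-manufacturing argument takes over.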

The convergence of AI compute demand and curtailed renewable energy supply is the most significant structural opportunity in the infrastructure industry.

This is not a future possibility. It is a current operational reality for companies that have done the engineering required to integrate renewable energy systems, battery storage, power conditioning, and high-density AI compute into a single, self-contained, deployable module. The question is who has the engineering to capture the opportunity.