As artificial intelligence scales rapidly, the energy footprint of data centres is rising just as fast, making sustainability an urgent priority. From powering and cooling to site selection and grid integration, the industry faces a choice between business-as-usual and a bold transition to cleaner, smarter operations. Jatinder Singh Pabla, Chief Sales & Marketing Officer at ST Telemedia Global Data Centres India (STT GDC India), brings over two decades of leadership experience across global technology giants like Wipro, HP, Convergys, and Microsoft. At STT GDC India, he is helping guide one of India’s largest data centre providers toward a carbon-neutral future. With over 400 MW of IT load across 30 facilities in 10 cities, STT GDC India serves a wide portfolio of clients, including Fortune 500 companies, and commands more than 25% of the Indian colocation market by revenue. The company has committed to achieving carbon-neutral operations by 2030, aligning business growth with responsible environmental stewardship.
Toward a Carbon-Neutral AI Future: Rethinking the Role of Data Centres
AI is growing fast, and so is the power it needs. The choice in front of us is plain: build more of the same and raise emissions, or change how data centres are planned, powered, cooled, and run. The second path is not just possible. It is good business, and it will decide who leads this market over the next decade.
Global electricity use by data centres is set to roughly double to around 945 TWh by 2030, with AI as the main driver. That is close to the current electricity consumption of a large economy. Growth is running at ~15% a year, much faster than in most other sectors. If we do not act, this will lock in higher emissions and grid stress in many regions.
In the US, data-centre load alone may rise from ~35 GW in 2024 to ~78 GW by 2035. Average hourly power draw could triple. These numbers show how fast the ground is shifting for utilities, cities, and operators.
Water is now a headline risk too. A recent review found more than 160 new AI data centres added over three years in parts of the US where water supplies are already tight. Site choices will face far more scrutiny from regulators and the public.
A New Playbook for Data Centres of Tomorrow
1. Tie AI growth to clean power you can prove
Long-term clean power deals must be the default, not the exception. What matters now is hour-by-hour matching and local impact: clean megawatt-hours on the same grid, in the same hours your load runs. Big tech is already moving, with recent deals exploring advanced nuclear and other firm clean sources to back AI loads around the clock. If you plan to scale AI clusters, plan firm clean power in the same breath.
Use a simple rule: if your AI roadmap doubles, your clean power plan doubles with it. And report it with clear, auditable tracking.
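To make "clear, auditable tracking" concrete, here is a minimal sketch of hour-by-hour matching, assuming you already have hourly series for IT load and contracted clean generation on the same grid. The figures and the hourly_matched_share helper are illustrative, not a reporting standard.

```python
# A minimal sketch of hourly clean-power matching. The numbers below are
# illustrative, not real site data.

def hourly_matched_share(load_mwh, clean_mwh):
    """Share of load covered by clean energy in the same hour (24/7 matching).

    Each hour contributes at most its own load: surplus clean energy in one
    hour cannot offset a shortfall in another, which is what separates
    hourly matching from annual offsetting.
    """
    if len(load_mwh) != len(clean_mwh):
        raise ValueError("load and clean series must cover the same hours")
    matched = sum(min(l, c) for l, c in zip(load_mwh, clean_mwh))
    total = sum(load_mwh)
    return matched / total if total else 0.0

# Illustrative day: steady AI load against a solar-heavy clean supply.
load = [40.0] * 24                              # MWh per hour
clean = [10.0] * 6 + [55.0] * 12 + [10.0] * 6   # MWh per hour

print(f"Hourly-matched share: {hourly_matched_share(load, clean):.0%}")
```

Published alongside capacity plans, a series like this shows whether clean supply is actually keeping pace with AI load, hour by hour, on the same grid.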
2. Cool smarter, not just harder
High-density racks push air cooling to its limit. Direct liquid cooling (DLC) is moving from pilot to plan in many AI halls, but operators still worry about new failure modes and immature standards. Leaders will phase DLC in where rack density and site water risk make sense, and pair it with strong operations playbooks and parts commonality. Do not wait for a perfect set of standards; start where density is highest and measure the results.
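As one way to act on "start where density is highest", the sketch below ranks halls for a DLC pilot. The 30 kW/rack air-cooling ceiling and the hall figures are assumptions for illustration, not engineering limits; real thresholds depend on airflow design, supply temperatures, and site water risk.

```python
# A rough screen for phasing in direct liquid cooling by rack density.
# All values here are placeholders for illustration.

AIR_COOLING_LIMIT_KW = 30  # assumed practical ceiling per rack for air cooling

halls = [
    {"name": "Hall A", "avg_rack_kw": 12, "water_stress": "low"},
    {"name": "Hall B", "avg_rack_kw": 45, "water_stress": "high"},
    {"name": "Hall C", "avg_rack_kw": 60, "water_stress": "low"},
]

# Phase DLC in by descending density, starting with halls past the air-cooling limit.
candidates = sorted(
    (h for h in halls if h["avg_rack_kw"] > AIR_COOLING_LIMIT_KW),
    key=lambda h: h["avg_rack_kw"],
    reverse=True,
)

for hall in candidates:
    print(f"{hall['name']}: {hall['avg_rack_kw']} kW/rack -> pilot direct liquid cooling "
          f"(site water stress: {hall['water_stress']})")
```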
3. Treat the grid as a partner, not a constraint
As loads rise, active grid participation becomes essential: flexible load, on-site storage, and demand response. Industry trackers expect data centres to play a direct role in grid stability as growth continues. In practice, that means shaping workloads toward off-peak windows, bidding flexibility into markets, and using on-site resources, such as battery storage charged with off-peak clean power or hydrogen-based fuel cells, to shave peaks. It also means planning sites where clean power growth is actually possible.
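A simplified peak-shaving sketch of that idea follows. The peak limit, battery size, charge rate, and load profile are assumed figures; a real dispatch would also account for tariff windows, round-trip losses, and market bids.

```python
# A simplified battery peak-shaving loop. All parameters are illustrative.

PEAK_LIMIT_MW = 50.0      # assumed grid draw we want to stay under
BATTERY_MWH = 40.0        # assumed usable storage
MAX_RATE_MW = 20.0        # assumed charge/discharge limit per hour

load_mw = [35, 38, 42, 55, 60, 58, 48, 40]   # illustrative hourly IT + cooling load
state = BATTERY_MWH / 2                       # start half full

for hour, load in enumerate(load_mw):
    if load > PEAK_LIMIT_MW:
        # Discharge to shave the peak, limited by rate and remaining charge.
        discharge = min(load - PEAK_LIMIT_MW, MAX_RATE_MW, state)
        state -= discharge
        grid = load - discharge
    else:
        # Recharge within the headroom, assumed to be off-peak clean power.
        charge = min(PEAK_LIMIT_MW - load, MAX_RATE_MW, BATTERY_MWH - state)
        state += charge
        grid = load + charge
    print(f"hour {hour}: load {load} MW, grid draw {grid:.0f} MW, battery {state:.0f} MWh")
```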
4. Put water on equal footing with carbon
Carbon is not the only metric that matters. In dry regions, water use can decide if a project wins approval. Use site-level water stress data at the start, prefer cooling designs with low water draw, and set a hard cap on water use per MW. If you need evaporative systems, offset with on-site recycling and clear reporting. The public will judge both power and water claims.
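One way to express such a hard cap is a simple site-level screen like the sketch below. The cap value, the megalitres-per-MW metric, and the site figures are placeholders to be replaced by your own water-stress assessment and metered data.

```python
# A small sketch of a site-level water cap check. All figures are placeholders.

WATER_CAP_ML_PER_MW_YEAR = 20.0   # assumed hard cap: megalitres per MW of IT load per year

def water_per_mw(annual_water_ml, it_load_mw):
    """Annual water use normalised by IT load (ML per MW per year)."""
    return annual_water_ml / it_load_mw

sites = [
    {"name": "Site 1", "annual_water_ml": 300.0, "it_load_mw": 20.0},
    {"name": "Site 2", "annual_water_ml": 650.0, "it_load_mw": 25.0},
]

for s in sites:
    intensity = water_per_mw(s["annual_water_ml"], s["it_load_mw"])
    status = ("within cap" if intensity <= WATER_CAP_ML_PER_MW_YEAR
              else "over cap: add recycling or redesign cooling")
    print(f"{s['name']}: {intensity:.1f} ML/MW/year ({status})")
```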
5. Run AI like a grid-aware workload
Not all AI tasks are the same. Training is batch-heavy and can move in time and place; inference is steady and close to users. Use that to your advantage (see the sketch after this list):
● Shift training to locations and hours with surplus clean power.
● Keep inference close to users, but plan throttling around real-time grid data.
● Use AI to tune your own halls: forecast peaks, right-size cooling, and cut waste.
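A minimal scheduling sketch of the first bullet, assuming a forecast of surplus clean power by region and hour block; the regions, forecasts, and job sizes are invented for illustration.

```python
# Deferrable training jobs placed into (region, hour) slots ranked by
# forecast clean-power surplus. All inputs are illustrative assumptions.

surplus_forecast = {
    # (region, hour block) -> forecast surplus clean power in MW
    ("region-a", "02:00"): 120, ("region-a", "13:00"): 30,
    ("region-b", "02:00"): 10,  ("region-b", "13:00"): 90,
}

training_jobs = [
    {"name": "llm-pretrain-shard-1", "mw": 80},
    {"name": "vision-finetune", "mw": 25},
]

# Greedy placement: biggest jobs go into the slots with the most surplus.
slots = sorted(surplus_forecast.items(), key=lambda kv: kv[1], reverse=True)
for job in sorted(training_jobs, key=lambda j: j["mw"], reverse=True):
    for (region, hour), surplus in slots:
        if surplus >= job["mw"]:
            surplus_forecast[(region, hour)] -= job["mw"]
            slots = sorted(surplus_forecast.items(), key=lambda kv: kv[1], reverse=True)
            print(f"{job['name']} ({job['mw']} MW) -> {region} at {hour}")
            break
    else:
        print(f"{job['name']} ({job['mw']} MW) -> hold for a later surplus window")
```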
Global energy bodies now flag AI as both a source of growth and a tool to run energy systems better. Use it for both.
What to Build (and Where) From 2025 Onward
From 2025 onward, building the right kind of data centres will depend on sharper choices about both design and location. Site selection should rest on three non-negotiable filters: a clear path to firm clean power growth, adequate grid headroom with access to flexibility markets, and low to moderate water stress. Projects that fail any of these checks will struggle to scale; recent cases show how limits on power and water can delay or block approvals.
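Expressed as a screen, the three filters might look like the sketch below; the candidate assessments are placeholders a siting team would supply from its own diligence.

```python
# The three non-negotiable filters as a simple pass/fail screen.
# The site entries are placeholder assessments, not real candidates.

sites = [
    {"name": "Candidate 1", "firm_clean_path": True,  "grid_headroom": True,  "water_stress": "moderate"},
    {"name": "Candidate 2", "firm_clean_path": True,  "grid_headroom": False, "water_stress": "low"},
    {"name": "Candidate 3", "firm_clean_path": False, "grid_headroom": True,  "water_stress": "high"},
]

def passes_filters(site):
    return (
        site["firm_clean_path"]                          # clear path to firm clean power growth
        and site["grid_headroom"]                        # grid headroom plus flexibility-market access
        and site["water_stress"] in ("low", "moderate")  # low-to-moderate water stress only
    )

for site in sites:
    verdict = ("proceed to detailed diligence" if passes_filters(site)
               else "drop: fails a non-negotiable filter")
    print(f"{site['name']}: {verdict}")
```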
The power stack should combine wind, solar, and storage, with firm clean sources such as advanced nuclear or hydro included where possible. Priority must shift to hourly-matched, local supply rather than relying on distant offsets, since this reduces both emissions and curtailment risk.
AI can help solve energy problems, but only if we hold ourselves to the same standard we ask of others. Data centres are not just the pipes of the internet anymore. They are major power users that can also help the grid. The leaders in this space will tie AI growth to clean power, cut water use, adopt the right cooling for dense racks, and earn trust with simple, honest reports.
Do this well, and AI growth and a carbon-neutral future can rise together.