By Lucas Maurice, Massera Winigah | 7-minute read
Artificial Intelligence (AI) is a key player in the shift to clean energy, especially in industrial settings, from grid optimisation and predictive maintenance to real-time energy demand forecasting. The International Energy Agency estimates that electricity use from data centres and AI could double by 2026, exceeding 1,000 terawatt-hours a year – roughly Japan’s total electricity consumption. The benefits of AI are tangible, but so is its energy appetite. If adoption goes unmanaged, industrial AI could drive up energy demand faster than we can decarbonise, undermining our net-zero goals instead of advancing them.
At the 2025 Hello Tomorrow Summit, this paradox was highlighted in the panel titled “Sustainable AI: Course-Correcting Our Path to Net Zero”, which brought together voices from the investment, corporate, and start-up worlds, including Ashwin Shashindranath (Energy Impact Partners), Marc Oman (Google), and Sandra Trittin (Beebop.ai) joined by Thomas Spencer of the International Energy Agency (IEA).
This article unpacks those insights, exploring how AI can support the clean energy transition, where it may fall short, and what steps are needed to make its growth sustainable.
Industrial AI: the glue between legacy systems and the new energy economy
AI is uniquely positioned to bridge the gap between traditional energy systems and emerging renewable technologies. As Sandra Trittin, co-founder of Beebop.ai, a company that leverages AI to optimise energy use in buildings based on live data from occupancy levels, weather, and pricing signals, puts it:
“AI can be seen as the glue between the old and the new energy economy.”
Its most promising applications include:
– Predictive maintenance to extend the lifespan of ageing infrastructure
– Smart grid optimisation to enhance grid resilience, reliability and security
– Integration of renewables by balancing the intermittent nature of solar and wind power
Thomas Spencer of the IEA highlighted that this is only a fraction of what can be done with data. As more gigafactories emerge – each capable of generating up to a trillion data points daily – we should be able to harness this data deluge to turbocharge R&D cycles, optimise production, and accelerate material breakthroughs in batteries, hydrogen, and beyond.
Can industrial AI crack the energy trilemma? Gigawatts, Gigabytes, and Gigatons
Ashwin Shashindranath, a partner at the climate-tech investment firm Energy Impact Partners, opened the discussion with a clear framing:
“It’s a gigawatt, gigabyte, gigaton problem.”
In other words, energy capacity, data processing, and carbon emissions are inextricably linked:
– Gigawatt (energy capacity) measures how much power we generate or consume. One gigawatt could power about 750,000 homes. As we use more electricity, we need more clean, reliable energy to avoid adding to carbon emissions.
– Gigaton (carbon emissions) translates to one billion tonnes of carbon dioxide (CO₂) released into the atmosphere. When we burn fossil fuels to generate electricity for our growing digital and energy needs, we emit more CO₂.
– Gigabyte (data processing) measures digital information. AI can optimise energy use and improve the integration of renewables into the grid. Yet every time AI trains a model, runs a search, or manages a smart grid, it uses data. Processing that data requires electricity, and if that electricity comes from fossil fuels, consumption and emissions both rise.
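The "1 GW ≈ 750,000 homes" rule of thumb above can be sanity-checked with a quick back-of-envelope calculation. The average household draw used here (roughly 1.3 kW, about 11,400 kWh per year) is an assumption, not a figure from the article:

```python
# Back-of-envelope check of the "1 GW powers ~750,000 homes" figure.
# Assumption: an average household draws ~1.3 kW on a continuous basis.
avg_household_kw = 1.3
homes_per_gw = 1e6 / avg_household_kw  # 1 GW = 1,000,000 kW
print(f"Homes powered by 1 GW: {homes_per_gw:,.0f}")  # ~770,000
```

The result lands close to the quoted figure; the exact number depends on the household consumption one assumes, which varies widely by country.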
While this “Gigawatts, Gigabytes, and Gigatons” triad might sound like a tongue-twister, the logic is straightforward: AI may offer radical efficiencies, but its deployment is energy-hungry and growing fast. More data processing requires more power, and without clean energy, this leads to more carbon emissions.
“We need to deploy AI smartly, not just rapidly,” Ashwin warned, adding that careless infrastructure scaling risks inflating an “infrastructure bubble.”
Still, Ashwin remains optimistic. His firm backs ventures such as Zap Energy. This US-based start-up founded in 2017 is working on a compact and cheaper way to achieve nuclear fusion using a new Z-pinch technology. They deliver zero-carbon baseload power without the need for massive magnets or complex infrastructure.
Energy Impact Partners also invests in Elemental Energy, a Canada-based company specialising in renewable hydrogen microgrids that integrate solar, wind, battery storage, and on-site hydrogen generation.
Both Zap Energy and Elemental Energy stand to benefit from the strategic application of AI, whether through accelerating fusion breakthroughs or optimising the performance of decentralised clean energy systems.
The Gigawatt problem: Global energy demand is rising, but can the grid keep up?
Electricity is at the heart of the modern world. However, global demand is growing at a rate that challenges our ability to decarbonise.
According to the IEA’s Global Energy Review 2025, electricity demand rose by 4.3% in 2024, nearly double the average annual rate of the past decade. Key drivers for this increase include:
– The electrification of transport
– Industrial expansion in both mature and emerging economies
– The rise of AI technologies and data centres
The IEA’s World Energy Outlook 2024 estimates that AI and data centres could consume over 4% of global electricity by 2030, up from 1.5% in 2024 – roughly 945 TWh of power. To put that in perspective:
– It is about the same as Japan’s entire annual electricity consumption.
– It is equivalent to the combined electricity use of Britain, Italy and Spain.
– It is more than double France’s annual electricity consumption (~460 TWh).
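The comparisons above follow directly from the quoted figures. A quick check, using the article's 945 TWh projection and France's ~460 TWh, plus an assumed ~940 TWh for Japan (a rough figure, not stated in the article):

```python
# Sanity-checking the perspective comparisons with the quoted numbers.
projected_twh = 945   # projected AI/data-centre demand by 2030 (IEA)
france_twh = 460      # France's annual electricity consumption (article)
japan_twh = 940       # assumption: Japan's annual consumption, ~940 TWh

print(f"vs France: {projected_twh / france_twh:.2f}x")  # just over double
print(f"vs Japan:  {projected_twh / japan_twh:.2f}x")   # roughly equal
```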
The Gigabyte problem: When smart data comes with a carbon cost
Every AI function, from training large language models to managing smart grids, relies on vast data processing. This demand for digital power is measured not in gigawatts, but in gigabytes, and processing those gigabytes consumes a staggering amount of electricity.
In Europe, this energy draw is already accelerating: according to a McKinsey study, the number of data centres is projected to nearly triple by 2030, pushing electricity consumption from approximately 62 TWh to over 150 TWh – around 5% of Europe’s total power demand.
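The McKinsey projection implies a steep compound growth rate. A minimal sketch, assuming (the study's exact baseline year is not stated here) a 2024 baseline of 62 TWh reaching 150 TWh in 2030:

```python
# Implied compound annual growth rate (CAGR) of European data-centre
# electricity use. Assumption: 62 TWh in 2024, 150 TWh in 2030.
start_twh, end_twh, years = 62, 150, 6
cagr = (end_twh / start_twh) ** (1 / years) - 1
print(f"Implied growth: {cagr:.1%} per year")  # ~16% annually
```

At that pace, consumption more than doubles roughly every five years – faster than most grids can add clean capacity.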
In essence, data becomes a hidden driver of emissions.
The Gigaton problem: AI’s growing carbon footprint
Artificial intelligence is often seen as a tool to improve efficiency and sustainability, but it also comes with a growing carbon footprint. One estimate suggests AI could consume 82 terawatt-hours of electricity in 2025, about the same as Switzerland’s annual energy use. If that demand is met by fossil fuels, the resulting emissions could cancel out many of the climate benefits AI claims to support.
To align AI development with climate goals, stakeholders must pull three levers in parallel: clean energy supply, smarter software, and energy-efficient hardware.
a. Using clean energy
AI infrastructure needs electricity, and lots of it. To meet that demand sustainably, tech giants like Google, Meta, and Amazon have committed to supporting a threefold expansion of global nuclear capacity by 2050.
While next-generation nuclear is promising, it is a long-term solution: the first industrial-scale reactors are not expected online before 2030, a mismatch with the explosive growth of AI happening today.
In the meantime, AI systems will be powered mostly by a mix of fossil fuels and renewables. In the US, this includes a high share of natural gas. In other regions, such as Malaysia – Southeast Asia’s growing data hub – the situation is worse: 46% of electricity still comes from coal. Without a cleaner energy base, AI’s environmental impact will continue to rise.
b. Sustainable algorithms, lower impact
AI’s carbon footprint is shaped not just by electricity sources, but by how the software is designed.
AI systems consume energy in two key stages:
– Training, when the model learns from large datasets.
– Inference, when the model responds to real-world tasks.
Surprisingly, up to 90% of total energy use may occur in the inference phase. That means making everyday AI use more efficient is just as important as improving the initial training phase.
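Why can inference dominate, when training a large model is famously expensive? Training is a one-off cost, while inference energy scales with every query served. A toy illustration – all the numbers below are hypothetical, chosen only to show the mechanism:

```python
# Toy model of lifetime energy split between training and inference.
# All figures are hypothetical illustrations, not measured values.
training_mwh = 1_000      # one-off training cost
wh_per_query = 0.3        # energy per inference request
queries_per_day = 100e6   # daily query volume
days_in_service = 365

inference_mwh = wh_per_query * queries_per_day * days_in_service / 1e6  # Wh -> MWh
share = inference_mwh / (inference_mwh + training_mwh)
print(f"Inference: {inference_mwh:,.0f} MWh, {share:.0%} of lifetime energy")
```

Even with a modest per-query cost, a year of serving at scale dwarfs the training bill – consistent with estimates that put inference at up to 90% of total energy use.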
Progress is being made. For example, Microsoft researchers recently showed how refining AI agent workflows could boost energy efficiency by 4.5 times. Smaller, more focused models, optimised prompts, and reduced memory load all help minimise power draw, especially in industrial applications.
c. Energy-efficient hardware
Energy savings also depend on the hardware that runs AI. Start-ups like Axelera AI are building custom chips that can run AI tasks locally, without needing to connect to large, energy-hungry cloud servers. This shift is known as edge computing and it means lower latency and much lower energy use.
Cooling is another major lever. Traditional data centres use up to 40% of their electricity just to keep servers from overheating. To cut this, many facilities are moving from air-based systems to liquid cooling, which is more efficient but raises water-use concerns, especially in regions facing climate stress.
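The industry's standard yardstick for this overhead is Power Usage Effectiveness (PUE): total facility power divided by power delivered to IT equipment. If cooling alone takes 40% of a facility's electricity, as cited above, the implied PUE floor follows directly (this treats cooling as the only overhead, a simplification):

```python
# PUE = total facility power / IT equipment power.
# If cooling takes 40% of total power, IT gets at most the remaining 60%.
cooling_share = 0.40
pue_floor = 1 / (1 - cooling_share)  # ignores other non-IT overheads
print(f"Implied PUE of at least {pue_floor:.2f}")
```

Modern hyperscale facilities report PUE values near 1.1, which shows how much headroom better cooling offers.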
The case for geothermal energy: Using Google’s example
Google aims to match its electricity use with clean energy 24/7. This commitment helps reduce its carbon footprint, even as digital demands grow. Solar and wind energy are key to this transition, but they don’t always provide steady power. That’s where geothermal energy comes in.
Geothermal energy provides steady power from the Earth’s heat, available 24/7. This reliability helps stabilise the grid. It can also reduce our need for fossil fuel peaker plants. Plus, it makes renewable energy sources more practical on a larger scale.
Geothermal energy is a dependable but often overlooked part of the clean energy mix. Its high upfront costs – 2 to 8 times those of a comparable solar farm – hold it back, and reaching deep underground heat requires exploration, drilling, and specialised tools. But AI and geothermal energy could form a symbiotic relationship.
On the one hand, AI could support more accurate geological modelling, real-time data analysis and precision drilling. This can greatly cut the risks and costs of exploring and developing next-generation geothermal energy.
On the other hand, with a clean energy supply like geothermal, the tech industry can reduce its dependence on carbon-intensive grids. This connects digital transformation to net-zero goals. It makes AI a key part of the clean energy solution, not just a demand source.
Marc Oman, Principal for Energy & Infrastructure at Google, highlighted the company’s growing interest in geothermal energy. This interest was recently demonstrated through Google’s collaboration with Fervo Energy, which taps into underground heat using next-gen geothermal systems and AI-powered precision drilling.
The industrial AI and energy paradox calls for a dual mandate
– Maximise AI’s contribution to energy efficiency
– Minimise its carbon footprint
Achieving this requires:
– Systemic thinking to assess AI’s ripple effects across the energy value chain
– Strategic deployment in use cases with high sustainability returns
– Cross-sector coordination to ensure AI runs on clean energy, is deployed responsibly, and stays aligned with climate goals
To sum up, AI has ample opportunity to drive innovation across the clean energy value chain: predicting how wind and solar systems will perform, discovering new materials for next-generation batteries and solar panels, and managing the electricity grid while reducing demand.
Yet the risks of uncontrolled AI infrastructure development are equally significant. Without a solid framework for cross-industry collaboration, energy-intensive digital infrastructure – data centres above all – could undermine the very climate goals AI is designed to support.
As Ashwin Shashindranath aptly put it:
“Energy, data, and carbon are intrinsically linked. Optimising one without considering the others is shortsighted. Whether we are designing smarter grids, drilling for geothermal energy, or scaling up hydrogen, AI offers a once-in-a-generation opportunity to rewire the world for resilience. But this opportunity must be approached with care and coordination.”
Looking to leverage industrial AI to meet ESG goals?
Connect with Hello Tomorrow’s consulting team to explore how we can help you develop and deploy industrial AI strategies in the energy space.
This article was inspired by the 2025 Hello Tomorrow Summit panel: “Sustainable AI: Course-Correcting Our Path to Net Zero” moderated by Thomas Spencer (IEA) with speakers Ashwin Shashindranath (EIP), Marc Oman (Google), and Sandra Trittin (Beebop.ai).