The Reuters news staff had no role in the production of this content. It was created by Reuters Plus, the brand marketing studio of Reuters.
But while AI can help make companies more efficient, it can also use a lot of energy – so much that AI inferencing (when AI uses what it has learned to make decisions or predictions) could account for 10–20% of total global data center energy consumption by 2030. This means it is essential for companies to deploy AI in a way that delivers value alongside efforts to reduce environmental impact.
John Frey, a technology leader at Hewlett Packard Enterprise (HPE), calls this the “AI paradox.” He helps organizations find ways to use AI that are good for business while also taking into consideration the energy constraints facing the world today.
Addressing the AI paradox
Today, businesses everywhere are turning to artificial intelligence (AI) to work smarter and faster.
The urgency is mounting. As AI adoption accelerates, the demand for energy-smart solutions is expanding. Projects that ignore efficiency often stall before they ever leave the testing phase, wasting time, money, and momentum. With business pressures and energy demands both rising, companies can’t afford to treat efficiency as an afterthought.
John Frey, Senior Director and Chief Technologist, Sustainable Transformation, at Hewlett Packard Enterprise
Data efficiency
AI needs lots of data to learn. But more isn’t always better. It’s important to use only the best and most relevant data. Frey says: “If we’re going to train the model or tune the model, use the highest quality data available. Don’t include data that you don’t need. For example, if you’re training a large language model in English, any data in that data set that is not in English or that is machine language, immediately take it out.”
Using too much unnecessary data can make AI less accurate, which can pose serious issues in critical applications like healthcare or finance.
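Frey’s data-pruning advice amounts to a simple filter over the training corpus. A minimal sketch follows; the corpus format and the crude language heuristic are illustrative assumptions, not an actual HPE pipeline:

```python
# Sketch: pruning a training corpus to high-quality, English-only text.
# The is_probably_english heuristic below is a deliberately crude stand-in
# for a real language-identification step.

COMMON_ENGLISH_WORDS = {"the", "and", "of", "to", "a", "in", "is", "that", "it", "for"}

def is_probably_english(text: str) -> bool:
    """Crude heuristic: a meaningful share of common English stopwords."""
    words = text.lower().split()
    if not words:
        return False
    hits = sum(1 for w in words if w in COMMON_ENGLISH_WORDS)
    return hits / len(words) > 0.1

corpus = [
    "The quick brown fox jumps over the lazy dog",  # English: keep
    "Der schnelle braune Fuchs springt",            # German: drop
    "0x48 0x65 0x6c 0x6c 0x6f",                     # machine data: drop
]

cleaned = [doc for doc in corpus if is_probably_english(doc)]
print(len(cleaned))  # fewer documents to train on, but higher quality
```

Less data in, less compute spent per training run, and, as the article notes, potentially better accuracy as well.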
Software efficiency
How AI programs are designed affects how much energy they use. Better programming saves energy. Frey asks, “How do we design models and the applications that use these AI solutions as efficiently as possible, using better programming languages that require fewer resources to run?” Agentic AI, he notes, while sometimes less efficient per inference, can be more efficient per task by learning from experience and reaching solutions faster.
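Frey’s per-inference vs. per-task distinction can be made concrete with a back-of-the-envelope comparison. All numbers below are hypothetical illustrations, not measurements:

```python
# Sketch: why an agentic approach that is less efficient per inference
# can still be more efficient per task. Figures are hypothetical.

# Conventional pipeline: cheap inferences, but many needed per task.
conventional = {"joules_per_inference": 2.0, "inferences_per_task": 50}

# Agentic approach: costlier inferences, far fewer to finish the task.
agentic = {"joules_per_inference": 3.0, "inferences_per_task": 20}

def joules_per_task(cfg: dict) -> float:
    """Total energy to complete one task end to end."""
    return cfg["joules_per_inference"] * cfg["inferences_per_task"]

print(joules_per_task(conventional))  # 100.0
print(joules_per_task(agentic))       # 60.0 -> cheaper per task overall
```

The per-task figure, not the per-inference figure, is the one that tracks actual business value delivered per unit of energy.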
Energy efficiency
The goal here is to get the most work out of every bit of energy used. With AI workloads expected to consume 150 TWh globally by 2030 – and inference tasks potentially requiring three times more energy than training by 2028 – smart energy strategies are essential. Frey says, “We’re really looking at how to do the most amount of work per watt of energy that we’re putting into the system.” This includes using advanced cooling (like liquid cooling), monitoring energy use, and getting power from low-carbon sources.
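Frey’s work-per-watt framing reduces to a simple ratio. A minimal sketch, where the throughput and power figures are invented purely for illustration:

```python
# Sketch: "work per watt" as a comparable efficiency metric.
# Throughput and power figures are hypothetical, not measured values.

def inferences_per_watt(inferences_per_sec: float, power_watts: float) -> float:
    """Useful work delivered per watt of power drawn."""
    return inferences_per_sec / power_watts

# Two hypothetical server configurations running the same model.
air_cooled = inferences_per_watt(1200, power_watts=800)
liquid_cooled = inferences_per_watt(1500, power_watts=760)

print(f"air-cooled:    {air_cooled:.2f} inferences/s per watt")
print(f"liquid-cooled: {liquid_cooled:.2f} inferences/s per watt")
```

Expressing every configuration in the same work-per-watt units is what lets operators compare cooling, hardware, and placement choices on one scale.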
Resource efficiency
This lever looks at the big picture—how technology choices affect the overall environment. Frey cautions: “Beyond the IT stack, if we make a decision in a technology solution that requires additional cooling needs, for example, in the data center, or requires additional power conversion, perhaps that’s not the most efficient solution.”
To address this, operators are adopting advanced cooling methods such as liquid cooling, which can cut energy use by up to 37%, along with smarter power systems and increased operational efficiency.
Frey concludes: “By taking all five of those levers, understanding that they are tightly interconnected with one another…when companies do that and take a strategic approach, they can drive a lot of efficiencies and a tremendous amount of benefits in terms of cost reduction and operational efficiency increases.”
Equipment efficiency
AI runs on powerful computer hardware like GPUs and other accelerators, which can use a lot of electricity, often in data centers not designed for such heavy work. These workloads are demanding because they include two big tasks: training and inference. Training means teaching models using huge datasets, which takes massive computing power and time. Inference happens after training and is about making predictions in real time. Each inference task is lighter, but there are millions of them, and they need to be fast, which puts pressure on systems to handle lots of data quickly with very little delay.
Frey says we should ask, “How do we have each piece of equipment do as much work as possible?” That means matching tasks to the right systems, keeping them optimized, upgrading to more efficient versions when available, and ensuring data moves quickly with robust storage and networking. Another key strategy is direct liquid cooling, which circulates cool liquid over computer components to absorb heat far more effectively than traditional air cooling. This approach helps equipment run at peak performance while reducing energy use.
IT sustainability solutions
As a company that aims to be a net-zero enterprise by 2040, with science-based targets along the way, HPE helps businesses advance IT sustainability agendas with a holistic approach to carbon footprints across the IT estate, edge to cloud.
Learn more
The AI paradox explained
The AI paradox is no longer a distant challenge; it’s a crisis unfolding in real time. Companies are racing to harness AI for speed and efficiency, yet every new model can consume staggering amounts of energy.
Frey warns: “So often companies are focused on what they can do with AI that might drive efficiency, that they don't concentrate enough on making that AI solution as efficient as it can be. And that includes companies that have very aggressive climate goals.”
The contradiction is clear: the very tool meant to streamline operations risks undermining business and sustainability commitments if efficiency isn’t prioritized from the start.
Five levers of efficiency
Making efficiency the guiding principle of AI adoption is imperative. Frey advocates for a holistic strategy, focusing on five major “levers” to ensure AI delivers value responsibly and efficiently:
Getting the most work out of every watt
Goal: maximize work per watt of energy.
Problem: AI workloads are expected to consume 150 TWh globally by 2030, and inference tasks could require three times as much energy as training by 2028.
Solutions: liquid cooling, monitoring energy use, and sourcing power from low-carbon supplies.
The importance of analytics and working together
To meet big sustainability goals, companies need good data and teamwork. Frey notes: “It’s not a surprise that often what we’re trying to get better, we don’t measure. So first, it’s really putting analytics in place.”
This means using tools to track energy use and spot waste. Collaboration is also key. “When companies collaborate with their sustainability organizations, the facilities organizations, all of a sudden they can come to solutions that have optimized as much as possible for a variety of variables, including their sustainability goals.”
Measuring the impact
It’s not always easy to measure all the ways AI affects the world. Frey uses two words: “footprint” (the environmental cost) and “handprint” (the positive difference AI can make).
“We often talk in terms of footprint and handprint. So the footprint is the impact of the technology solution itself…The handprint is the beneficial good that you do as a result of that. And if both the footprint and the handprint are in the same currency, say, kilograms of CO2 equivalent, for example, then you can compare them to each other. But often, there’s a societal benefit…And it’s hard to quantify, but that doesn’t keep us from trying.”
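Frey’s point that footprint and handprint become comparable once they share a currency can be shown with a one-line calculation. The figures below are hypothetical:

```python
# Sketch: footprint vs. handprint in the same currency (kg CO2e),
# per Frey's framing. All figures are hypothetical.

footprint_kg_co2e = 12_000   # emissions caused by running the AI solution
handprint_kg_co2e = 45_000   # emissions avoided because of the solution

# Same units, so the two sides are directly comparable.
net_benefit_kg_co2e = handprint_kg_co2e - footprint_kg_co2e
print(net_benefit_kg_co2e)  # positive => net environmental benefit
```

When the handprint is a societal benefit with no natural CO2e expression, this subtraction is no longer possible, which is exactly the quantification difficulty Frey acknowledges.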
As more organizations use AI, Frey’s five levers—data, software, equipment, energy, and resource efficiency—show the way to balance business success while minimizing environmental impact. Using these principles, along with good measurement and teamwork, can help companies use AI in a more efficient and sustainable way.
The journey requires careful planning, new ideas, and a strong commitment to efficiency every step of the way.