Looking Past the Hype: Artificial Intelligence Sustainability and Its Future Cost
By Sasha Semjonova
Artificial intelligence has been a dominant force ever since the release of ChatGPT at the end of November 2022. Since then, AI has delivered in all of the areas we expect from a groundbreaking technology: prowess, possibilities, and power.
However, with any monumental technological breakthrough, the question of efficacy and sustainability will always come up sooner or later, and that time has come for AI.
The Power of AI
There is no doubt that artificial intelligence – in its many forms and models – is very powerful. Although many see 2022 as the real turning point for the technology, with AI adoption doubling between 2017 and 2022, the first real breakthroughs came in the 1990s and early 2000s, and then again in 2017 with the introduction of the Transformer architecture.
Since 2022, AI’s feats have been nothing short of remarkable; in the same year as the official release of ChatGPT, DALL-E 2 and GitHub Copilot also arrived.
In the tech industry alone, AI’s achievements and advancements have spanned multiple sectors, including cybersecurity, customer service, medicine, analytics, DevOps, software development, and more.
In 2024, rival chatbots from Google, Meta, ServiceNow, Microsoft, and Salesforce entered the scene, quickly evolving into agents and signaling the start of an agentic future. It was also a landmark year for legislation, with the EU’s Artificial Intelligence Act coming into force in August, marking a significant step by an international regulator to implement a framework and safeguards around AI.
The arrival of this legislation signaled that artificial intelligence was pushing past the scope of its earlier possibilities, and that a technology of this size and power needed to be regulated.
Power… But At What Cost?
As reported by Aquiva Labs, 2024’s artificial intelligence advancements made a lot of people stop and think about just what AI was costing in terms of both money and energy.
In December of that year, OpenAI unveiled its latest model, o3, and called it the beginning of Artificial General Intelligence (AGI). Its score on the ARC-AGI benchmark, a test of how well an AI can adapt to new tasks, was a mighty 87.5%, considerably higher than the previous best of 55.5% for any other AI.
However, this kind of computing power comes at a cost: every task run in o3’s high-compute mode costs more than $1,000 in computing power. That is roughly 170 times the cost of the low-compute version of o3, and far higher than o1, which costs less than $4 per task.
Not only that, but according to a recent study, a single o3 task uses about 1,785 kWh of electricity, roughly what an average American household uses over two months.
Boris Gamazaychikov, Head of AI Sustainability at Salesforce, says this translates to about 684 kg CO₂e based on a recent U.S. grid emissions factor, equivalent to the carbon emissions from more than five full tanks of gas.
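For readers who want to sanity-check these figures, here is a rough back-of-envelope calculation. The household consumption, grid emissions factor, and tank-of-gas numbers below are assumptions based on commonly cited U.S. averages, not values taken from the study or from Salesforce.

```python
# Back-of-envelope check of the o3 energy and emissions figures above.
# All constants are rough U.S. averages, used purely for illustration.

O3_TASK_KWH = 1_785            # reported electricity per o3 high-compute task
HOUSEHOLD_KWH_PER_MONTH = 900  # assumed average U.S. household use (~10,800 kWh/yr)
GRID_KG_CO2E_PER_KWH = 0.38    # assumed U.S. grid emissions factor
TANK_KG_CO2E = 8.9 * 15        # assumed ~15-gallon tank at ~8.9 kg CO2 per gallon

months_of_household_power = O3_TASK_KWH / HOUSEHOLD_KWH_PER_MONTH
task_emissions_kg = O3_TASK_KWH * GRID_KG_CO2E_PER_KWH
tanks_of_gas = task_emissions_kg / TANK_KG_CO2E

print(f"~{months_of_household_power:.1f} months of household electricity")  # ~2.0
print(f"~{task_emissions_kg:.0f} kg CO2e per task")                         # ~678
print(f"~{tanks_of_gas:.1f} full tanks of gas")                             # ~5.1
```

The results land close to the reported figures, which suggests the comparisons above are internally consistent.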
“AI is an Energy Hog”
According to the International Energy Agency, electricity consumption from data centers, AI, and cryptocurrency could reach double 2022 levels by 2026. On the lower end, this could mean around 160 terawatt-hours (TWh) of additional electricity demand by 2026; on the higher end, as much as 590 TWh. To put that into perspective, 160 TWh is roughly the annual electricity usage of 15 million average U.S. homes.
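As a quick check on that comparison, assuming an average U.S. home uses roughly 10,800 kWh of electricity per year (an assumption based on commonly cited EIA averages):

```python
# Rough conversion of 160 TWh of additional demand into equivalent U.S. households.
ADDITIONAL_TWH = 160
HOUSEHOLD_KWH_PER_YEAR = 10_800  # assumed average annual U.S. household consumption

homes = (ADDITIONAL_TWH * 1e9) / HOUSEHOLD_KWH_PER_YEAR  # 1 TWh = 1 billion kWh
print(f"~{homes / 1e6:.1f} million homes")  # ~14.8 million
```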
However, measuring what AI costs in energy terms gets tricky, both because of the different kinds of power it needs and because its specific contribution is hard to separate from the rest of a data center’s workload. Predictions about electricity demand from data centers should therefore always be taken with a pinch of salt.
As reported by the World Economic Forum, despite AI’s rapid expansion, AI data center energy consumption will “still likely account for only a small fraction of global electricity demand.”
Predominantly, AI’s energy usage breaks down into a few areas (combined into a rough estimate in the sketch after this list):
- Pre-Training: This stage requires vast datasets and months of computation.
- Post-Training and Refining: Continual learning adds an extra computing burden.
- Inference: Running AI models at scale consumes constant energy.
- Cooling: AI chips generate extreme heat, requiring liquid cooling and water chillers.
- Data Transfer: Moving data between chips and storage uses more power than expected.
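To make the breakdown above concrete, the sketch below shows one way these components might be combined into a single back-of-envelope estimate. Every number in it is hypothetical and purely illustrative; the main point is that facility overhead such as cooling is typically modeled as a multiplier (power usage effectiveness, or PUE) on top of the IT load.

```python
# Hypothetical, illustrative lifecycle estimate for an AI model's energy use.
# None of these figures describe a real model; they only show how the pieces combine.

TRAINING_KWH = 1_000_000         # pre-training: one-off cost of the initial training run
REFINEMENT_KWH = 150_000         # post-training: fine-tuning and continual updates
INFERENCE_KWH_PER_QUERY = 0.003  # inference: energy per served request
QUERIES_PER_YEAR = 500_000_000   # assumed annual request volume
PUE = 1.3                        # power usage effectiveness: cooling and facility overhead
NETWORK_OVERHEAD = 0.05          # extra share for moving data between chips and storage

it_load_kwh = (
    TRAINING_KWH
    + REFINEMENT_KWH
    + INFERENCE_KWH_PER_QUERY * QUERIES_PER_YEAR
)
total_kwh = it_load_kwh * PUE * (1 + NETWORK_OVERHEAD)

print(f"IT load: {it_load_kwh:,.0f} kWh")
print(f"Total incl. cooling and data transfer: {total_kwh:,.0f} kWh")
```

Even in this toy example, inference at scale rivals the one-off training cost, which is why running models, not just building them, dominates many energy discussions.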
Although powering AI comes at a considerable cost, some experts insist it’s a necessary evil, arguing that high energy costs will force companies to optimize for efficiency once running large models becomes financially unsustainable.
This poses another important question: Is sustainability a genuine focus for these companies, or does the “necessary evil” mentality run deeper than we thought?
Is Sustainability a Focus?
According to the World Economic Forum, coordinated efforts across industries are needed to enable sustainable AI adoption. There are concerns that grid infrastructure will not be able to keep up with demand and that AI’s energy usage will contribute to climate change. But we only need to look closely at the industry’s top players to see where their priorities lie.
Microsoft, for example, has signed a corporate power agreement to revive a nuclear plant. OpenAI has made significant investments in the nuclear fusion company Helion Energy. Salesforce manages its energy usage by building limited, purpose-built models rather than “large, monolithic, general-purpose ones”.
However, no organization is truly unwavering in the face of political unrest, rapid innovation, or the promise of monetary gain. Nearly five years ago, Microsoft pledged to bring its greenhouse gas emissions to zero (or even below) by the end of the decade, a goal that no longer seems entirely feasible, according to the company’s latest sustainability report. AI has been cited as a potential reason.
Salesforce has committed to being mindful of its environmental impact with the release of the AI Energy Score, even though its submitted model earned a two-star rating, while models from Meta and Microsoft received four and five stars.
Greenwashing in the Tech Sector
Robert Sösemann, Senior Principal Architect at Aquiva Labs and an AI enthusiast, says this kind of ‘cold feet’ approach to sustainability and efficiency is simply par for the course with any breakthrough technology.
“Whenever you bring out a new technology, it’s inefficient. It doesn’t work,” he said. “People are skeptical. But in the end, humankind always benefits from it.”
He believes it is natural for companies to prioritize innovation and results over sustainability, at least in the short term, whether or not that is a good thing.
In fact, according to Robert, a lessened focus on sustainability might not be as detrimental as it sounds, since AI is likely to help us find environmentally and climate-beneficial technologies in the long run.
Greg Wasowski, SVP of Consulting and Strategy at Aquiva Labs, says that many companies may want it to seem like they prioritize sustainability, but that this may not always be the case.
He said that many companies use Power Purchase Agreements (PPAs) to appear sustainable while still relying on fossil fuels. In his view, if a company is strictly focused on growth and quick results, fossil fuels are the only realistic short-term option.
This is because fossil fuels remain one of the few sources that can currently meet the demands of AI’s energy usage. Dr. Vijay Gadepally, a senior scientist and principal investigator at the Massachusetts Institute of Technology (MIT) Lincoln Laboratory, said that it will be “very hard” to meet this demand with clean energy.
“You can’t take ten years to build a data center. It has to be done in a year, a year and a half, just because of the economics behind it,” he said. “The only power sources that can generally scale that fast are non-renewable sources.”
However, this tech-sector variant of greenwashing is still something that we, as consumers of AI, should continue to examine closely.
“It’s interesting that progress is being prioritized over sustainability, but I think that’s just in the short term,” Greg said. “Long term, we already have solutions that are more sustainable and rely on clean energy, like SMRs.”
Salesforce: A Case Study
Although Salesforce’s AI Energy Score falls short of its competitors’, this doesn’t mean its AI itself is inefficient. In fact, Agentforce is 92% more energy-efficient than GPT-4: Salesforce’s SFR-RAG model achieves 97% of GPT-4’s accuracy while using 92% less energy.
Salesforce has also been extremely vocal about AI regulation, with both the AI Energy Score and its general reporting giving customers insight into the impacts of the AI services they use.
Not only that, but the cloud giant has also committed to reducing its absolute emissions by 50% by 2030 and to reaching near-zero emissions by 2040. Agentforce, of course, plays a big role in this: Salesforce is committed to leveraging AI to address environmental issues, for example by using AI-powered agents to optimize supply chains.
How Do We Balance Innovation and Sustainability?
Although sustainability might not be a genuine priority for every organization creating and using AI (despite claims to the contrary), it should not be disregarded entirely.
According to a recent Salesforce survey of nearly 500 sustainability professionals, 58% believe the benefits of AI will outweigh its risks when solving the climate crisis. However, we are still in the early days.
Despite this, the World Economic Forum reported that AI is already helping some companies reduce energy usage by up to 60%. We’ve also seen this in action with DeepSeek, which reportedly used only around 2,000 GPUs, versus the 350,000+ used for some traditional AI training runs, while maintaining strong accuracy.
So, how do we actually balance innovation and sustainability going forward without risking falling behind or heavily contributing to the climate crisis?
Nuclear or Nothing
There has been notable interest in nuclear energy ever since the AI energy debate first emerged, because AI data centers need consistent, 24/7 power, which makes nuclear a strong fit.
The possibilities also extend to fusion energy, often described as the ultimate clean power because it is carbon-free. However, grid infrastructure is not ready for widespread nuclear integration, and fusion is still decades away from commercial viability.
Gadepally says scaling is our current biggest concern and that he “doesn’t know” how quickly we can scale nuclear energy.
“Clean wind and solar power are also a poor fit for data centers, which need ‘firm’ energy [like nuclear] that runs around the clock,” he said.
“[However], if developers find they have to add firm energy very quickly, they will likely turn to climate-warming natural gas – as is already happening in data center hotspots like Texas and Virginia.”
Energy-Efficient Hardware
The WEF believes that a key way to pursue sustainable AI in the current environment is to develop energy-efficient hardware, including AI-optimized cooling and smarter data center design and operations to limit AI’s energy consumption.
Microsoft and OpenAI are also investing in Small Modular Reactors (SMRs), which could provide stable, low-carbon power for AI. However, high upfront costs and slow regulatory approval mean SMRs won’t be the short-term fix these companies might be hoping for.
Ultimately, the thinking is that if we cannot run all of our data centers on clean energy, in the short term or the long term, then we need to make AI models consume less energy, a belief that Gadepally stands behind.
Algorithm Advancement
Aside from the hardware, developing the software will also be crucial.
The WEF predicts that advances in chips and algorithms (e.g., small language models) may further mitigate AI’s energy consumption, and recent progress in self-improving AI could mean that models routinely refine themselves into smaller, more efficient versions in the future.
Robert also believes the focus should be on model reuse rather than training new foundation models from scratch, which could do a lot to increase the efficiency of internal AI systems while also reducing energy consumption.
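As a rough illustration of what model reuse can look like in practice, here is a minimal sketch that adapts a small pretrained network instead of training one from scratch. The specific model and libraries are our choice for illustration, not a description of any company’s internal setup; a vision model is used for brevity, but the same pattern applies to reusing pretrained language models.

```python
# Minimal sketch of model reuse: adapt a small pretrained network rather than
# paying the full training cost of a new one.
import torch
import torch.nn as nn
from torchvision import models

# Load a model whose training energy has already been spent by someone else.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze the reused weights so only a tiny fraction of parameters is trained.
for param in backbone.parameters():
    param.requires_grad = False

# Replace the final layer with a small task-specific head (e.g. 10 classes).
backbone.fc = nn.Linear(backbone.fc.in_features, 10)

# Only the new head's parameters are handed to the optimizer.
optimizer = torch.optim.Adam(backbone.fc.parameters(), lr=1e-3)

# Training then proceeds as usual, but over far fewer trainable parameters,
# which is where the compute (and energy) savings come from.
```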
Regulation, Regulation, Regulation
Lastly, it’s no secret that half of the battle of getting any kind of new technology or energy proposals across the line is tackling regulation.
However, according to a study by the AI Governance Alliance, “regulatory, policy, and financial enablers can [actually] incentivize responsible AI development through compliance frameworks and funding mechanisms.”
Regulatory and policy frameworks can promote responsible AI development by setting standards for ethical use, privacy, and transparency, while financial incentives like grants and tax benefits encourage investment in ethical AI research and education. Together, these mechanisms ensure that AI development aligns with societal values and maintains public trust.
AI sustainability regulation will also likely work in tandem with natural market economics, driving efficiency improvements as companies try to reduce operational costs.
Final Thoughts
AI is a powerful and fast-advancing technology, and as a result it requires a significant amount of energy, an amount we can currently support but may struggle to in the future if we do not reevaluate our energy options.
With sustainability not currently at the forefront of AI efforts, we might find ourselves running out of time to strike a balance between innovation and sustainability, despite hopes that AI itself will support that endeavor.
Ultimately, if we cannot power AI with clean energy, then more resources need to go into working out how to make AI use less energy, something experts will likely be working hard on for the foreseeable future.