How Much Should Brands Worry About AI’s Environmental Impact?

Artificial intelligence is a notorious energy guzzler. The models that underlie tools like ChatGPT demand huge computational resources, and quietly enabling them are stacks of servers hidden away in remote data centres, gobbling up vast quantities of electricity whose generation produces emissions.

Exactly how much energy AI as a whole uses is hard to say, since those data centres handle all sorts of tasks, but to get a sense of the total, the researcher Alex de Vries found a workaround. Because Nvidia’s servers power the great majority of the AI industry, de Vries multiplied the number of those servers expected to ship annually by 2027 by the amount of electricity each one consumes. He concluded, in an analysis published in 2023, that AI servers could use about as much electricity as a small country does in a year.
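
The arithmetic behind that kind of estimate is simple to sketch. As a rough illustration in Python, using ballpark figures of about 1.5 million AI servers shipped per year and roughly 6.5 kilowatts of power draw per server (assumed here for illustration, not drawn from this article):

# Rough sketch of a de Vries-style estimate: servers shipped x power draw x hours run.
# The shipment and wattage figures below are illustrative assumptions, not from this article.
servers_per_year = 1_500_000       # assumed annual AI server shipments by 2027
power_per_server_kw = 6.5          # assumed power draw per server, in kilowatts
hours_per_year = 24 * 365

annual_kwh = servers_per_year * power_per_server_kw * hours_per_year
print(f"{annual_kwh / 1e9:.1f} TWh per year")  # roughly 85 TWh, on the order of a small country's annual use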

At first glance, that tremendous energy consumption, and the emissions that come with it, seems like reason enough for any brand or retailer that cares about sustainability to keep AI at a distance. Consumers are on alert about the issue, too. When the bag brand Baggu was recently hit with online backlash over a collaboration with Collina Strada that used AI-generated designs, one of the major criticisms from commentators was AI’s environmental impact.

But while AI’s appetite for energy is undeniable, it’s also not exceptional. Essentially everything brands do in the course of business has some impact, and that impact can be much greater than the one from their use of AI. When PwC studied its own adoption of generative AI, it determined that the related annual emissions would amount to “a fraction” of those from business travel.

AI can also help brands be more efficient, including by allowing them to better predict demand and avoid overproducing.

So how much should brands really worry about AI’s environmental impact?

Much of the nervousness over AI’s energy use is due to the recent rise of large language models. The process of training these models, which entails ingesting huge amounts of data, has a high impact on its own. Back in 2019, researchers determined that training a large AI model produced about five times the lifetime emissions of the average car. That sounds catastrophic, but perhaps less so when you consider how many more cars there are on the road than AI models being trained every day.
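
To see why the aggregate picture matters, here is a minimal back-of-the-envelope sketch. Every figure in it is an assumption chosen for illustration, not a number from the 2019 study or from this article:

# Illustrative aggregate comparison; all figures below are assumptions, not sourced from this article.
# The point: one training run emits far more than one car, but the car fleet is vastly larger.
car_lifetime_tonnes_co2 = 57           # assumed lifetime emissions of an average car, including fuel
training_run_tonnes_co2 = 5 * car_lifetime_tonnes_co2

cars_on_road = 1_400_000_000           # assumed global car fleet, for scale
large_training_runs_per_year = 1_000   # assumed number of large training runs, for scale

print(training_run_tonnes_co2 * large_training_runs_per_year)  # ~285,000 tonnes from training runs
print(car_lifetime_tonnes_co2 * cars_on_road)                  # ~80 billion tonnes over the fleet's lifetime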

Of course, training is just one part of the equation. The PwC study noted that, for corporate users, the biggest impact will come from actually using these models over time. That’s in part because of how they work.

“Every time you query the model, the whole thing gets activated, so it’s wildly inefficient from a computational perspective,” Sasha Luccioni, a computer scientist at the AI company Hugging Face, told the BBC earlier this year.
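
One way to see why usage eventually dwarfs training: training is a one-off cost, while every query adds a small increment that compounds with volume. A toy sketch, with entirely made-up figures that are not from the PwC study or this article:

# Toy comparison of one-off training energy vs. cumulative inference energy.
# All numbers below are made-up assumptions for illustration only.
training_energy_kwh = 1_000_000    # assumed one-time energy cost of training a large model
energy_per_query_kwh = 0.003       # assumed energy per query
queries_per_day = 10_000_000       # assumed query volume

days_to_overtake = training_energy_kwh / (energy_per_query_kwh * queries_per_day)
print(f"Inference overtakes training after about {days_to_overtake:.0f} days")  # ~33 days at these assumptions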

Luccioni worked on a study that looked at the energy use of different tasks performed by AI models. It found that, in order to generate 1,000 images, “the least efficient image generation model uses as much energy as 522 smartphone charges (11.49 kWh), or around half a charge per image generation.” (One caveat was that there was a lot of variation across the image-generation models, depending on the size of the image generated.) Text-based tasks were much more efficient, though their energy use still adds up.
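
The per-image figure follows directly from the numbers in the study, since 11.49 kWh corresponding to 522 charges implies roughly 0.022 kWh per full smartphone charge. A quick check of the arithmetic:

# Checking the study's figures: 1,000 images from the least efficient model = 11.49 kWh = 522 phone charges.
total_kwh = 11.49
images = 1_000
charges = 522

kwh_per_image = total_kwh / images     # ~0.0115 kWh per image
kwh_per_charge = total_kwh / charges   # ~0.022 kWh per full smartphone charge (implied by the study's framing)
print(kwh_per_image / kwh_per_charge)  # ~0.52, i.e. about half a charge per image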

While that’s a significant amount of energy, each individual use isn’t so much a problem on its own, and how concerning you deem it may depend on your frame of reference. A separate group of researchers, predominantly from the University of California, compared the carbon emissions of using AI for writing and illustrating tasks versus having humans perform them. The AI systems actually yielded fewer emissions than humans when factoring in how long it takes a human to write a page or create an illustration and the emissions produced by running a computer over that period.
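
The researchers’ framing boils down to a time-and-power comparison: a human writing a page keeps a computer running for an hour or so, while an AI query uses a small burst of server energy. A toy sketch of that logic, with purely illustrative figures rather than the study’s own:

# Toy version of the human-vs-AI comparison; every figure here is an assumption for illustration.
grid_kg_co2_per_kwh = 0.4        # assumed grid emissions intensity

human_hours_per_page = 1.0       # assumed time for a human to write one page
computer_watts = 75              # assumed power draw of the human's computer
human_kg_co2 = (computer_watts / 1000) * human_hours_per_page * grid_kg_co2_per_kwh

ai_kwh_per_page = 0.005          # assumed energy for one AI text generation
ai_kg_co2 = ai_kwh_per_page * grid_kg_co2_per_kwh

print(f"human: {human_kg_co2:.3f} kg CO2, AI: {ai_kg_co2:.4f} kg CO2")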

It’s arguably not a perfect comparison. An AI system can quickly write a page of copy or produce an image, but the results may then require a human to go back and edit. Still, if it saves enough time, the point stands: AI could compare favourably with a human working at an electricity-powered computer, probably saving their work to the cloud, which is really just another name for remote servers. (The fact that AI could displace workers, and that popular generative models are trained on creative work without consent, are separate, if no less serious, matters.)

To that point, it can be easy to forget how much energy we use in our daily working lives. Data centres are significant drivers of the world’s growing electricity use, and that’s not just because of AI. They power everything from cloud storage to video calls to internet services, as Ars Technica pointed out in a story about AI’s energy use, and already swallow up huge amounts of electricity to do it.

AI will add to that total, even as its developers try to build more efficient systems. The International Energy Agency predicts that the electricity use of data centres, AI and the cryptocurrency industry could double between 2022 and 2026.

Meanwhile, data centres will be busy powering the rest of the digital world, too, including social media, e-commerce, streaming video, online gaming and more, all of which require energy. Ars Technica also noted that, according to one 2018 study, PC gaming consumed nearly as much electricity as de Vries estimated AI would use in the next few years.

So should brands be concerned about AI’s impact? They absolutely should. But they should also be aware of the impact they’re having in other ways, because the environmental problems around AI aren’t unique to the technology. Just by running their daily operations, brands use energy and generate emissions. What matters more than whether or not a company chooses to use AI is whether it is considering, and trying to reduce, its impact overall.
