This might sound blunt, but given that I've described myself in job interviews as both mercenary and a blunt instrument, don't be surprised. Here is the reality check that some people need: AI isn't going anywhere. The rise of AI and large language models is a fact of life now, and anyone who thinks this is just another bubble waiting to burst needs a serious dose of historical perspective and, depending on circumstances, a slap across the face.

Look back at the 2007-8 financial crisis. The entire global economy neared collapse, yet banking didn't disappear—it adapted and continued. Or consider the dot-com bubble of the early 2000s. Sure, plenty of people lost money and countless startups went bust, but the internet didn't cease to exist. It became the backbone of modern 'civilisation'.

The same pattern applies to AI. Classical AI and large language models alike will demand enormous amounts of power and massive infrastructure upgrades. That's not speculation: it's engineering reality. This march "forward" isn't stopping, regardless of what environmental activists or tech sceptics might hope.

The numbers make this abundantly clear. According to analysis by Alex de Vries-Gao, founder of the Digiconomist tech sustainability website, artificial intelligence systems could account for nearly half of datacentre power consumption by the end of 2025. His research, published in the sustainable energy journal Joule, suggests AI could already represent 20% of the 415 terawatt hours of electricity consumed by all datacentres globally last year.

To put this in perspective, AI's draw could reach 23 gigawatts by year-end; sustained around the clock, that's roughly twice the Netherlands' annual electricity consumption. The International Energy Agency forecasts that datacentres will need as much energy by the end of this decade as Japan uses today, with AI demand being a primary driver of this exponential growth.
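If you want to sanity-check those headline figures, the arithmetic is simple enough. Here's a minimal sketch in Python; the ~110 TWh value for the Netherlands' annual electricity consumption is my assumption for context, not a figure from de Vries-Gao's paper:

```python
# Back-of-envelope check on the datacentre energy figures quoted above.
# The Netherlands reference value is an assumption for context only.

HOURS_PER_YEAR = 24 * 365  # 8,760

# De Vries-Gao: AI may already be ~20% of 415 TWh of global datacentre use.
datacentre_twh = 415
ai_share = 0.20
print(f"AI share today: ~{datacentre_twh * ai_share:.0f} TWh/yr")  # ~83 TWh

# 23 GW of continuous draw, converted to annual energy.
ai_gw = 23
ai_twh = ai_gw * HOURS_PER_YEAR / 1000  # GW x hours = GWh; /1000 -> TWh
netherlands_twh = 110  # assumed annual electricity consumption (approx.)
print(f"23 GW continuous: ~{ai_twh:.0f} TWh/yr, "
      f"about {ai_twh / netherlands_twh:.1f}x the Netherlands")
```

Twenty-three gigawatts running flat out is roughly 200 TWh a year, which is where the "twice the Netherlands" comparison comes from.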

Water consumption is equally staggering. Researchers at the University of California, Riverside, and the University of Texas at Arlington estimate that global AI demand will require 4.2 billion to 6.6 billion cubic metres of water by 2027—more than half of the UK's annual water consumption.

But let's be honest about context here. The environmental impact of AI isn't good, but neither is the environmental impact of human beings on the planet, full stop. We're also not hearing equivalent outrage about the production of Bitcoin or those ridiculous electronic art pieces built on blockchain technology that consume massive amounts of energy for essentially decorative purposes. And no, I can't remember what they're called, and I'm not wasting the energy of a Google search on the ridiculous things. I'll go and boil the kettle instead.

The difference is that AI is actually solving real problems. Unlike crypto art or speculative trading, AI is improving medical diagnoses, optimising energy grids, and accelerating scientific research. That is fact. The question isn't whether we should stop AI development—it's how we manage its environmental footprint while it transforms virtually every industry on Earth.

The tension between AI infrastructure needs and environmental reality is playing out dramatically in places like Brazil. TikTok's Chinese owner, ByteDance, is reportedly planning a supercomputer warehouse in Caucaia, a city in northeast Brazil that has declared a state of emergency due to drought in 16 of the 21 years between 2003 and 2024.

The proposed facility would be the size of 12 football pitches and part of a 55 billion reais (£7.3 billion) expansion of global datacentre infrastructure. According to the State Environmental Superintendence, the project has already received approval for water consumption of 30 cubic metres per day from an artesian well, though access to further details has been restricted due to commercial confidentiality.

Major tech companies aren't burying their heads in the sand; they're pivoting hard toward solutions, even if those solutions come with their own complexities. Microsoft admitted that, five years after committing to become carbon negative by 2030, its total emissions had risen by 23% due to factors including AI expansion. Google reported that 15% of its water use was in areas of "high water scarcity", while Microsoft said 42% of its water came from areas of "water stress".

The response has been a decisive move toward nuclear power. Meta recently signed its first nuclear deal, a 20-year agreement with Constellation Energy to keep one reactor operating in Illinois. The deal allows Constellation to expand the Clinton Clean Energy Center by 30 megawatts, with Meta providing financial support for relicensing and operations when state subsidies expire in 2027.

Google has reached agreements to supply its datacentres with nuclear power via half a dozen small reactors, while Microsoft's contract will restart the Three Mile Island nuclear plant—the site of the most serious nuclear accident in US history. These moves represent a recognition that renewable energy alone isn't sufficient to meet AI's voracious appetite for clean power.

The technical realities of datacentre cooling present unavoidable challenges. AI workloads generate intense heat that must be continuously managed to prevent equipment failure. Datacentres can be cooled either with air conditioning, which is inefficient for large facilities, or with water-based systems.

Water cooling can involve closed-loop systems similar to a car engine's, but costs are high. Alternatively, cooling towers use evaporation to remove heat, allowing cooled water to be pumped back into the system. Either way, water is lost: even a small 1 MW datacentre can consume 25.5 million litres of water a year and lose 255,000 litres to evaporation.
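Those figures imply a water intensity you can check yourself. A minimal sketch, assuming the 1 MW refers to a continuous IT load (my assumption; the litres-per-kWh framing is mine, not the source's):

```python
# Rough water-intensity check for the 1 MW cooling example above.

HOURS_PER_YEAR = 24 * 365

annual_kwh = 1 * 1000 * HOURS_PER_YEAR  # 1 MW continuous = 8,760,000 kWh/yr
water_litres = 25_500_000               # quoted annual water consumption
evaporated_litres = 255_000             # quoted annual evaporative loss

print(f"Water use: ~{water_litres / annual_kwh:.1f} litres per kWh")        # ~2.9
print(f"Evaporation: {evaporated_litres / water_litres:.0%} of throughput")  # 1%
```

Roughly three litres of water per kilowatt-hour: every kettle's worth of compute comes with a kettle's worth of water.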

In the UK, a proposed £10 billion datacentre in Lincolnshire demonstrates the scale we're dealing with. The Elsham datacentre project, with its 15 power-hungry computer warehouses, is projected to release 857,254 tonnes of CO2 annually when running at full capacity. The facility would consume 3.7 billion kWh of energy annually and would generate so much excess heat that developers are proposing adjacent glasshouses with the capacity to produce more than 10 tonnes of tomatoes daily.
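It's worth checking what those Elsham projections imply. A quick sketch; the GB grid-intensity reference is my assumption for context, not a figure from the developers:

```python
# What do the Elsham figures imply about average power draw and grid mix?
# The GB grid-average reference is an assumption for context only.

HOURS_PER_YEAR = 24 * 365

co2_tonnes = 857_254
energy_kwh = 3.7e9

avg_power_mw = energy_kwh / HOURS_PER_YEAR / 1000
print(f"Average draw: ~{avg_power_mw:.0f} MW")  # ~422 MW, continuously

intensity = co2_tonnes * 1_000_000 / energy_kwh  # tonnes -> grams per kWh
print(f"Implied intensity: ~{intensity:.0f} g CO2/kWh")  # ~232 g/kWh

# For context, the recent GB grid average is roughly 150-200 g CO2/kWh
# (assumed reference), so the projection implies a dirtier-than-average mix.
```

A continuous 422 MW draw from a single site is power-station territory, which is exactly why the grid and nuclear questions below matter.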

The challenge isn't to halt AI development; that ship has sailed. It's about finding pragmatic solutions fast enough to keep pace with deployment. Some factors could help moderate demand growth, chief among them efficiency improvements: China's DeepSeek R1 model reportedly achieved strong performance using fewer chips, possibly because export restrictions forced innovation in efficiency.

However, efficiency gains often lead to increased usage—a phenomenon known as Jevons' paradox. The rise of "sovereign AI"—multiple countries building their own systems rather than relying on global providers—could actually increase overall demand.
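To make the paradox concrete, here's a toy calculation; every number in it is invented purely for illustration:

```python
# Toy illustration of Jevons' paradox: efficiency up, total demand up anyway.
# All numbers below are invented for illustration.

energy_per_query_wh = 3.0   # assumed energy per AI query today
queries_per_day = 1e9       # assumed daily query volume today

efficiency_gain = 5         # each query now costs 1/5 of the energy...
usage_growth = 8            # ...and cheaper queries drive 8x the usage

before = energy_per_query_wh * queries_per_day / 1e9                      # GWh/day
after = (energy_per_query_wh / efficiency_gain) * (queries_per_day * usage_growth) / 1e9

print(f"Before: {before:.1f} GWh/day; after: {after:.1f} GWh/day")
# Before: 3.0 GWh/day; after: 4.8 GWh/day -> a 5x efficiency gain swallowed.
```

The specific numbers don't matter; the point is that efficiency alone guarantees nothing about totals when usage is free to grow faster.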

One of the most frustrating aspects of this situation is the lack of transparency around actual consumption. For example, while the EU AI Act requires companies to disclose energy consumption for training models, it doesn't cover day-to-day operational use.

Companies don't voluntarily publish water consumption data, and governments often refuse to release technical documentation for licensing, citing industrial secrecy. This opacity makes it impossible for communities, policymakers, and investors to make informed decisions about trade-offs.

Behind the technical specifications and carbon calculations are real human impacts that can't be ignored. In Brazil's drought-prone regions, communities have installed cisterns to store water during scarce periods, while water trucks regularly deliver supplies to areas where municipal systems fail. The arrival of water-intensive datacentres in these regions raises legitimate questions about resource allocation.

AI is here to stay whether you like it or not. The question is whether we'll develop practical solutions fast enough to manage its environmental impact, or whether we'll waste time on unrealistic fantasies about stopping technological progress.

Major infrastructure upgrades are coming. Power grids will need to expand. Nuclear plants will need to be built or restarted. Water allocation systems will need to be redesigned. These aren't optional nice-to-haves—they're engineering requirements for the world we're already building.

The sooner we accept this reality and focus on solutions rather than denial, the better chance we have of managing AI's environmental footprint while capturing its transformative benefits. To repeat myself in different words, this march forward isn't stopping.

