Who are we kidding about saving our planet?
The energy demands of AI are insatiable.
Is saving our planet still possible, or is it a mirage as AI pushes us to the tipping point with the carbon from a hundred million GPUs?
I've written much about technologies like liquid cooling and efforts by operators to make data centres more sustainable.
But on this Unfiltered Friday, I ask the question: Are we kidding ourselves?
Pandora's box
When OpenAI first unveiled ChatGPT, it was the culmination of many years of AI research.
A quick spotlight:
- 2009: Li Fei-Fei pioneers large-scale datasets for computer vision.
- 2017: Researchers introduce the transformer architecture.
- 2020: A seminal paper lays out the scaling laws of AI.
And then OpenAI publicly released GPT-3.5 in 2022, and there was no going back.
More data, more GPUs
Today, dozens, perhaps even hundreds, of tech firms are in a race to create better AI models.
New AI models are leveraging novel techniques, multimodality, and problem-solving capabilities to get ahead.
But mainly, they all have an eye on the scaling law, which is based on the premise that AI models trained with more data and compute power perform better.
And so they keep going bigger:
🔸Current state-of-the-art AI models are now being trained on clusters of servers with up to 100,000 GPUs.
🔸New, multi-data centre 300,000-GPU clusters are now being designed and built for the model-after-next.
🔸To train the next-next generation of AI, tech giants are mulling nuclear plants and snapping up all the fossil power they can.
Is the million-GPU AI model next?
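For those curious what "the scaling law" actually says: it is often expressed as a power law relating model performance to size. A rough sketch, following the form popularised by OpenAI's 2020 paper (the constants below are approximate values from that work, quoted for illustration only):

```latex
% Test loss L falls as a power law of model size N (parameters):
%   larger models => lower loss, with diminishing returns.
L(N) \approx \left(\frac{N_c}{N}\right)^{\alpha_N},
\qquad \alpha_N \approx 0.076,\;\; N_c \approx 8.8 \times 10^{13}
```

The small exponent is the whole story: because gains shrink as models grow, each step-change in capability demands an order-of-magnitude more parameters, data, and compute, which is why the GPU clusters keep multiplying.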
Meeting climate targets, not
According to a November 2023 report by BCG:
- Global emissions are still rising by 1.5% annually.
- Emissions must fall by 7% yearly to stay within +1.5°C.
- 50% of key climate technologies won't be economically viable anytime soon.
Then there are the power demands of AI:
- We are ramping up fossil production.
- Coal plant shutdowns are being deferred.
- Fuel giants are walking back their commitments.
In fact, it's been pointed out to me that we are now ramping up fossil fuel production at a faster rate than ever before.
To what purpose?
I don't have an aversion to AI. I've written often about how I use AI daily, and the need to upskill and learn AI.
But one has to wonder: Why are we in such a hurry to build ever-more powerful AI systems? Somehow, I doubt the reason is to benefit me personally.
The tech giants are in it to dominate markets and maximise profits, creating an unassailable advantage for themselves and their shareholders.
And they can't stop because their competitors won't.
Generative AI, like the Pandora's box of Greek mythology, has been opened and can no longer be contained or controlled.