Where exactly are we headed with AI?
The more things change, the more they stay the same.
This week saw two major announcements from two top players in the AI space: OpenAI with its GPT-4o model and Google with a slew of announcements that include new models, an AI assistant, and deep AI integration into everything – including search.
Gunning for AI supremacy
There's a mad scramble for AI supremacy among the technology giants. And many of the usual players, as well as a long list of startups, are rushing to develop cutting-edge generative AI models.
And as in previous technological races, some players are already imploding. Stability AI, facing a cash crunch, is reportedly discussing a potential sale, according to The Information.
There is no question that OpenAI remains firmly ahead. Though OpenAI offers GPT-4o for free, just as it did GPT-3.5, GPT-4o is clearly superior to GPT-4. In fact, some services, such as the AI-powered writing tool Type.ai, swapped out GPT-4 for GPT-4o within days.
In the meantime, the sheer speed and correspondingly lower cost of GPT-4o mean OpenAI can offer it to free users – giving OpenAI an almost unassailable advantage in terms of training data.
Do we really need more data centres?
This rush towards AI will sharply ramp up power demands, says Charles Yang of Huawei. Speaking at Huawei's Global Data Center Facility Summit 2024 on Friday, he cited predictions that global consumption of electricity for data centres will surpass a trillion kilowatt-hours (kWh) by 2026.
For sure, data centres are changing as the computer systems they were built to house evolve. Love it or hate it, liquid cooling is going to be a key topic in future data centres as rack density continues its rise.
But while I don't doubt that a new class of AI-centric data centres will emerge over the next 18 months, I'm wary that a substantial amount of the clamour about the need for extremely high-density racks of 100kW or even 200kW is generated by those with something to sell you.
This isn't the first instance of sky-high predictions. Remember blade servers, hyperconverged systems, and the cloud? And here we are today with a global average rack density of 6kW, according to 2023 data from the Uptime Institute.
In this case, I believe the thirst for more AI processing and dire predictions of an energy apocalypse will be tempered by new generations of far more energy-efficient chips, new strategies such as neuroscience-based techniques, and the fact that GPUs will continue to make up a relatively small proportion of global data centre workloads.
What does it mean for you and me?
Talk of AI and data centres is all well and good. But most of us are probably more interested in the implications of AI for our jobs. This is where things get a lot murkier.
At the corporate level, organisations are dabbling in AI to save money and dramatically speed up work processes. Depending on their success, the unpalatable reality is that it could impact jobs – and there is practically nothing we can do about that.
On a personal level, people are having varying degrees of success in improving their productivity. What is confounding is how success with AI differs so much from person to person, depending on individual strengths, inclinations, roles, and even specific responsibilities at work.
In my opinion, the only sane strategy is to grab the (AI) bull by its horns. Dabble in AI and do more than dip your toes in. Embrace the technology, learn its intricacies, and actively adapt it to improve what you are doing. Don't be lulled into a sense of complacency just because nothing appears to have changed.
Bill Gates once said: "We overestimate the impact of technology in the short term and underestimate the effect in the long run." As AI becomes deeply integrated into various industries over time, expect the landscape of work to keep changing.
After all, jobs have never stopped evolving.
Ultimately, the more things change, the more they stay the same.