I did my undergraduate IT degree in the 1980s, a time of big hair and bold fashion. In my second year, in 1984, we were told that we were the first cohort to learn about a new technology that would change the world.
Back then, computing was all BATCH processing on mainframes, but REAL-TIME programming ushered in a new wave of architecture that let transactions be updated the moment they occurred.

It kind of did, by automating INFORMATION (data given meaning). Fast forward to today, and we’re on the cusp of another monumental shift, this time in the realm of KNOWLEDGE efficiency, all thanks to Artificial Intelligence (AI).

So, what’s the big deal about AI and its insatiable appetite for COMPUTE power – perhaps very soon the most precious commodity in the world?
To simplify AI technology, imagine a towering four-layer cake (there’s a toy sketch in code after the list):

  • CHIPS: at the base, the chips that power everything.
  • INFRASTRUCTURE: think massive computer rooms.
  • MODELS: the third layer consists of models like ChatGPT, Copilot, Gemini, Claude, and Grok.
  • APPS: topped off with apps, from image creation to role-playing games.
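
To make the layering concrete, here’s a toy sketch in Python. It’s purely illustrative: the layer names come from the cake above, and the example entries are just a tiny sample of each layer, not an exhaustive catalogue.

```python
# Toy model of the four-layer AI "cake", bottom to top.
# Purely illustrative; each "layer" is in reality an entire industry.

AI_STACK = [
    ("CHIPS", ["Nvidia GPUs", "Google TPUs"]),
    ("INFRASTRUCTURE", ["hyperscale data centres", "supercomputers like Stargate"]),
    ("MODELS", ["ChatGPT", "Copilot", "Gemini", "Claude", "Grok"]),
    ("APPS", ["image creation", "role-playing games"]),
]

# Each layer depends on everything beneath it.
for level, (layer, examples) in enumerate(AI_STACK, start=1):
    print(f"Layer {level}: {layer:<15} e.g. {', '.join(examples)}")
```

The point of the layering: a bottleneck at any lower layer (chips, power, data centres) constrains everything above it.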

Nvidia might currently be leading the charge with AI chips, but tech giants like Google, Elon Musk’s ventures, and Facebook are hot on their heels, crafting their own AI chips.

In a nostalgic twist, we’re witnessing a return to the mainframe era, albeit in a modern guise. Microsoft and OpenAI recently threw their hats into the ring with a jaw-dropping announcement: a $100 billion project to construct a supercomputer dubbed Stargate. This behemoth, set to be completed in 5-6 years, promises to be a game-changer for training future iterations of GPT models.


But here’s a plot twist: these supercomputers, with their gargantuan appetite for power, can’t be confined to a single region. The energy demands are astronomical, to the point where nuclear energy is on the table as a power source, and many are suggesting that these new supercomputers will have to be housed next to nuclear power plants. Check out the insights on this at Carbon Countdown: AI’s $10 Billion Rise in Power Use Explodes Data Center Emission.

On the positive side, by 2028 we’re looking at chips that are ten times more efficient, the kind needed to run machines like Stargate, as detailed at IEEE Spectrum: Trillion-Transistor GPU.


Now, $100 billion might sound like a king’s ransom, but for Microsoft, which reportedly holds a 49% share of OpenAI’s for-profit arm, it’s just another day at the office.
Meanwhile, OpenAI’s CEO, Sam Altman, is reportedly on a mission to raise up to $7 trillion for AI chip manufacturing.

This technological renaissance, while echoing the mainframe era, is essentially cloud-based. As we stand on the brink of this compute revolution, one can’t help but ponder the implications of such concentrated power in the hands of Big Tech. OpenAI, for instance, just announced an update to their Voice Engine that allows voice cloning from only a few sentences of audio.

One key issue is the massive energy requirement of AI supercomputers. The International Energy Agency suggests that data centre energy usage stood at around 460 terawatt-hours (TWh) in 2022 and could increase to between 620 and 1,050 TWh by 2026, roughly the energy demands of Sweden or Germany, respectively. This paper breaks it down by AI task. Google is using geothermal energy to power its Nevada data centres as part of its push to run carbon-free by 2030.
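
To put those numbers in perspective, here’s a quick back-of-envelope calculation. It’s a rough sketch only, assuming smooth compound growth from 2022 to 2026, which real demand certainly won’t follow:

```python
# Back-of-envelope: what annual growth rate would take data centre
# energy use from the IEA's 2022 estimate to its 2026 range?
# A rough sketch, not a forecast.

BASE_TWH_2022 = 460                        # IEA estimate for 2022
TARGETS_2026 = {"low": 620, "high": 1050}  # IEA range for 2026
YEARS = 2026 - 2022

for label, target in TARGETS_2026.items():
    growth = target / BASE_TWH_2022        # total growth factor over 4 years
    cagr = growth ** (1 / YEARS) - 1       # implied compound annual growth rate
    print(f"{label}: {target} TWh is {growth:.2f}x 2022 usage, ~{cagr:.1%}/year")
```

Even the low end implies roughly 8% compound growth per year; the high end would mean demand more than doubling in four years.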

Will these AI supercomputers usher in an era of innovation and progress, or are we heading towards a dystopian future à la George Orwell’s 1984?
OpenAI’s safety policies and their voluntary commitments seem like a good start, but can they provide adequate governance for a technology that is advancing so rapidly?
Only time will tell, but one thing’s for sure – we’re not in the ’80s anymore.

