Bill Raduchel
8 min read · Jun 7, 2024

AI and Reality

Artificial intelligence is not new as a technology. What is new is that advances in computing, combined with the accessibility of enormous amounts of information on the Internet, have made it feasible. Artificial intelligence does not spring fully formed from the mind of a small team of brilliant people, though such teams were essential to create the software that underlies the implementations we see today: Generative Pretrained Transformers, or GPTs. The key word is trained: these models need enormous amounts of data to be created. Gossip says training GPT-4 took a month of computing at a cost of $30 million.

What it trained on was the Internet. Without the Internet and the vast amounts of data available on it, you could not have the large language models (LLMs) we have today. So we have a perfect storm: better software from DeepMind, a U.K. company owned by Google; cheaper computing courtesy of Moore's Law and Nvidia GPUs (built for gaming); and content made accessible through the indexes built for search engines. Credit to Sam Altman and Elon Musk for seeing this, though so did others; Sam Altman won the race to expose it to the whole world.

These LLMs are fantastic, and the technology clearly works to a point. Over the next five years, nearly all software applications will be rewritten to use an LLM as their user interface; this will upend the technology world in many ways. Mobile devices will change completely. There will be winners and losers. In turn, just as email and Enterprise Resource Planning systems did starting in the late 1980s, corporate workflows will have to change, as will corporate organization, culture and staffing. This will take at least a decade, but again there will be winners and losers.
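The pattern behind "an LLM as the user interface" can be sketched in a few lines. The example below is a hypothetical illustration, not any real product's design: free-form text goes in, a model maps it to one of the application's structured actions, and ordinary code does the rest. No real LLM API is assumed; `fake_llm` is a deterministic stand-in so the sketch is self-contained.

```python
# Hypothetical sketch: an LLM as an application's front end.
# A real system would call a hosted model; fake_llm is a stand-in
# (keyword matching) so the example runs without any external service.

def fake_llm(prompt: str) -> str:
    """Placeholder for a real LLM call. Maps a natural-language
    request to one of the application's structured commands."""
    text = prompt.lower()
    if "refund" in text:
        return "ISSUE_REFUND"
    if "balance" in text:
        return "SHOW_BALANCE"
    return "UNKNOWN"

# The application still exposes ordinary functions; the model only routes.
ACTIONS = {
    "ISSUE_REFUND": lambda: "Refund request submitted.",
    "SHOW_BALANCE": lambda: "Your balance is $0.00.",
    "UNKNOWN": lambda: "Sorry, I didn't understand that.",
}

def handle(user_text: str) -> str:
    """Natural-language front end over a conventional back end."""
    intent = fake_llm(user_text)
    return ACTIONS[intent]()

print(handle("I'd like a refund for my last order"))
```

The point of the sketch is that the application logic does not change; only the layer that turns user intent into a command does, which is why so many existing applications can be retrofitted.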

The human brain is a miracle of evolution. It takes a third of the energy of our body, yet in the end that is about the same as a 50-watt lightbulb. The rush to build data centers reflects the fact that AI is the exact opposite of the brain in terms of power efficiency. There are projections that 10 years from now AI computing will take a third of all the electricity produced in the United States. That would be half of all we produce today, which is roughly 4,000 terawatt-hours a year. There are discussions of building single data centers drawing a gigawatt.
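The scale gap is easy to make concrete. A back-of-envelope calculation, using only the round numbers above (a roughly 50-watt brain, a one-gigawatt data center), shows the ratio:

```python
# Back-of-envelope comparison using the round numbers from the text.
brain_watts = 50          # the brain's draw, per the lightbulb analogy
datacenter_watts = 1e9    # one proposed single data center: a gigawatt

ratio = datacenter_watts / brain_watts
print(f"One such data center draws the power of {ratio:,.0f} human brains")
# 1e9 / 50 = 20,000,000 — twenty million brains' worth of power.
```

Whatever the exact figures turn out to be, the orders of magnitude are what make the power and grid constraints discussed below unavoidable.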

The current plans for these AI data centers just cannot be realized: even if we could generate the power they require, there are not enough electrical step-down transformers to deliver it. Indeed, there may not be enough copper. The copper mining and electrical equipment manufacturing industries…

Written by Bill Raduchel

Author, The New Technology State and The Bleeding Edge. Strategic advisor on technology and media, independent director and former angel investor.
