Creeping Complexity

Bill Raduchel
May 22, 2020

The optimized analog world of forty or so years ago was based on networks in which the nodes were humans. Sometimes they were imaginative, even clever. I read about a Cambridge don who mailed himself reminders to non-existent addresses around the world, depending on the postal system to return them to him at the appropriate time. At a company where I worked in the 1980s, the distribution system depended on high school kids on roller skates putting paper slips into pigeonholes. Try as we might, we could find no digital solution within even one order of magnitude of the cost.

Analog systems like these had one huge advantage: they could deal with ambiguity. That ability allowed them to be much simpler: ambiguity allows simplicity. The network nodes, aka human beings, could look at information, realize something was wrong and correct it. Software nodes can only do what they have been programmed to do. Thus, seemingly simple tasks became complicated. A rule of thumb was that more than 90% of any software application dealt with error handling rather than the actual task.
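
The error-handling point is easy to see in miniature. Here is a small, hypothetical Python sketch (my illustration, not from any system described in this piece): reading a quantity from an order record takes one line if you trust the data, while the defensive version that software actually needs spends nearly all of its lines handling what might be wrong.

```python
# Illustrative sketch: a "simple" task where most of the code is error handling.

def read_quantity_trusting(order: dict) -> int:
    # What the task looks like when a human downstream can resolve ambiguity.
    return int(order["qty"])

def read_quantity_defensive(order: dict) -> int:
    # What software needs, because it can only do what it has been told to do.
    if not isinstance(order, dict):
        raise TypeError("order must be a mapping")
    if "qty" not in order:
        raise KeyError("order is missing 'qty'")
    raw = order["qty"]
    try:
        qty = int(str(raw).strip())
    except ValueError:
        raise ValueError(f"'qty' is not a whole number: {raw!r}")
    if qty < 0:
        raise ValueError(f"'qty' cannot be negative: {qty}")
    return qty

if __name__ == "__main__":
    print(read_quantity_defensive({"qty": " 12 "}))  # -> 12
```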

I was forced to deal with this head-on thirty years ago at Sun Microsystems. In the spring of 1989 we had a disastrous quarter, and the initial answer the company gave was that new manufacturing software had failed. This was correct but misleading. We had replaced an existing enterprise resource planning (ERP) system with a new one. The existing system was much less sophisticated and relied upon humans making common-sense decisions; the data only had to be roughly right. The new system did much more, but the…
