A HISTORY OF MODERN COMPUTING
By Paul E. Ceruzzi
The MIT Press, 2003, 445 pages
One could wonder how much of ‘modern computing’ can truly be understood in context. This question is a serious one among historians, since a fuller view of the social factors surrounding technological change, like any other change, comes only with time. Most technology, for example, takes applied directions quite different from those its creators originally envisioned. The morphing of the ARPANET into the Internet, the growth of the World Wide Web, the evolution of human-machine interfaces, and the migration from tubes to transistors to silicon chips to new investigations into quantum processors are all examples of technological change not foreseen by those who created the predecessors. Perhaps the latest such development is in the area of artificial intelligence (AI).
With that said, Ceruzzi takes the intrepid step of attempting to define technological change while examining the circumstances that at least influenced, if not defined, its evolution. I say ‘change’ deliberately, avoiding the word ‘advancement,’ because that word implies progress toward a specific goal. In fact, much technological change represents a haphazard combination of salient and reverse-salient responses to specific needs. Many of those needs were not needs at all until the technology changed and inspired them. Today, for example, most people would claim ‘dependence’ on their smartphones, but many of us remember when cellphones did not exist, let alone their ‘smarter’ descendants.