In our search for “the next big thing,” might we draw more knowledge and wisdom from our technological past?

The present and imminent future are so exciting, so compelling and so absorbing that our “historical amnesia” is understandable. But might our immersion in the prospects for tomorrow limit, or even eliminate, the lessons we can learn from innovations – successful and not, computational and bio-medical – that have come before us? 

Yes, we can definitely benefit from understanding that what comes next digitally demands deeper engagement with history, and even with the longer human, biological and environmental past.

I turn first to the past whenever I confront questions about emerging digital technologies – their commercial prospects and their societal and cultural implications. Historical methods are open, democratic and relatively easy to master, and historical research remains centered on texts and documentary evidence.


Historical sources provide many clues and insights into how to understand the socio-technical systems of today and tomorrow. So to anticipate the next phases of our digital future, we must first understand clearly how we got to where we are, technologically.


What am I looking for when I look to the past? On one track, I always try to understand the past in its own terms. I try to understand contingency by heightening an awareness that no innovator’s success was inevitable. Lessons can be learned, for instance, from imagining factors that could have prevented Steve Jobs and Steve Wozniak (Apple), or Bill Gates and Paul Allen (Microsoft), or Sergey Brin and Larry Page (Google) from succeeding wildly. These astonishing partnerships might never have yielded bountiful harvests. Hewlett-Packard might have embraced the offer by Wozniak, then an H-P employee, to build a personal computer. IBM might have hired Gary Kildall to write the original operating system for the PC, leaving Gates and Allen in the digital wilderness. Brin and Page could have been out-maneuvered by one of the many competing search engines that existed when they launched Google out of their Stanford lab. 

Exploring what might have been helps undermine the pernicious belief that some innovators were destined for dominance, and that nothing could stop them. So one way to engage the past is through “what if” thought-experiments, or rear-view mirror scenarios that produce sharply different outcomes than what actually happened. 


At the same time, I’m committed to a concept of “usable history,” so I also seek to build models from the past. Historical awareness, and historical materials, generate for me explanatory models for present and future phenomena. These models, derived from close historical readings, can help anticipate the future, and sometimes even predict it. 

Building such models is easier when it is nourished by overlapping theories of technological change. On the generative side, I pay outsized attention to the concept of momentum, or what some call “path dependence.” While innovators are obsessed with change, historians have spent a great deal of energy on why things stay the same. Paul A. David, the emeritus economic historian at Stanford, wrote a widely cited article on the persistence of the QWERTY keyboard, arguing that high “switching costs” explain why sometimes-inferior technologies can defeat new and improved innovations in the market. David used the term path dependence to describe the capacity of old technologies to triumph over new ones.

Venture capitalists, who look backward only ever so slightly, have long embedded the concept of path dependence in their economic calculus, often insisting that replacement innovations carry a ten-fold, or even hundred-fold, improvement in productivity. Yet even when such improvements appear available, supplanting an existing socio-technical system with a new one can be surprisingly difficult. 

Consider the fabulous advantages of Amazon’s Blink outdoor camera system over the traditional services offered by home-security companies. Yet new approaches, which can be controlled from smartphones, only slowly gain ground on existing systems that can seem brain-dead by comparison. 

The history of why things stay the same can also offer meaningful lessons on why some highly touted, and even briefly successful, technological projects fail so extravagantly – such as nuclear-powered airplanes in the U.S. in the 1950s (touted), or France’s Minitel computer system (briefly successful – in the 1980s it brilliantly anticipated both the personal computer and the Internet). And the same idea of path dependence helps explain why the transition to solar electricity for residences, which so many know to be desirable and appealing, may nonetheless take decades, not years, to fully occur. 

I also pay attention to the “reception side,” or how regular people accept or resist emerging technologies that appear to have considerable value or support. I’ve been very much in thrall to the American sociologist William F. Ogburn, who in 1922 articulated a “cultural lag” theory to explain the failure of societies to “catch up” to valuable innovations. To Ogburn, adaptation to emerging technologies is only a matter of time, and absorption is a slow-motion inevitability. There is a superficial appeal to Ogburn’s lag thesis, since societies often do, eventually, adapt to technological trajectories in ways that incorporate established culture, values and aspirations. 

I’m interested in problems arising from adaptation difficulties as well as from creative destruction (in other words, who wins and who loses from radical innovations). So I find Claude Fischer’s work on the rather slow adoption of the telephone to be an antidote to the popular (if wrong) contemporary idea that technological change is ever accelerating. The telephone, after all, faced a crisis of stagnation in the 1930s. 

By contrast, efforts to slow down or halt techno-scientific change fascinate me, and, I believe, deserve more attention. There is plenty to learn about the future of gene-editing technologies, I suspect, from examining in detail how in 1952 the highest levels of the U.S. government were consumed with a debate over whether the first hydrogen bomb should ever be tested. Leading engineers and scientists secretly argued that the genie should be put back in the bottle. They were overruled by a political elite in thrall to a venerable notion: if humans can build something, even a world-destroying explosive, they always should. 

Yet as students and practitioners of gene-editing increasingly worry, perhaps there are some techno-scientific doors we should close and, collectively, insist that humans never open. Historical cases may help teach us how to do so effectively. 

Practicing historians manage the tension between documenting and explaining, between building case studies and identifying wider historical forces (let’s not call them theories of history) that may work independently of sociological and material factors, or in concert with them.

Case studies are very important; even in my studies of the present I find the case-study method valuable. So when I researched, under a National Science Foundation grant, the rise since 2000 of academic computer science in East Africa, I paid special attention to the co-evolution of computing and the mobile phone in the region. We also studied the social role of information in East Africa from 1960 on, so we could understand how computing systems, introduced by outsiders, might have been shaped by any distinctive East African experience with information. 

Similarly, a current project of mine, on the spiritual and religious sources of present and future digital and bio-medical innovation, relies heavily on past conceptions of consciousness, models of the mind, and models of information processing. For example, from the 1950s through the 1970s the counterculture movement in the Bay Area and around Boston intersected in many ways with the pioneers of personal computing. 

My experience has taught me that we can ground speculations, or anticipations, about the future in case studies from the past. The past can also teach us, if not when future disruptive changes will occur, then at least what options we might have as individuals and societies when these radical disruptions do happen. 

Our technological and scientific histories, meanwhile, are also shaped, sometimes decisively, by charismatic personalities, such as Jobs or Jonas Salk, Oppenheimer or Rickover, Bernadine Healy or Rachel Carson. While eschewing any “great person” theory of change, we nevertheless can learn from the role of what the German sociologist Max Weber 100 years ago labeled “charismatic authorities.” 

These unusual people – gifted in their technological insight, their inspired leadership and their design of innovative organizations – can shape humanity’s complex socio-technical systems in surprising and durable ways. We just have to look back to see it.