J.C.R. Licklider is not a household name, but the computing world he envisioned in the early 1960s is the one we live in today. When he took over the Information Processing Techniques Office at ARPA (now DARPA) in 1962, computers were room-sized machines that processed batch jobs. Users submitted punch cards and waited hours for results. Licklider saw something different: computers as interactive tools that would augment human thinking, connected in networks that would allow collaboration across distances.
He then funded the research that made it happen. The labs that Licklider funded, at MIT, Stanford, UCLA, and elsewhere, produced time-sharing systems, computer graphics, hypertext, the mouse, and eventually the ARPANET, the precursor to the internet. The people he funded or influenced include Doug Engelbart (who invented the mouse and demonstrated hypertext), Bob Taylor (who later ran Xerox PARC), and Alan Kay (whose Dynabook concept anticipated the personal computer).
Waldrop tells this story in detail, tracing connections between people, ideas, and funding decisions that are invisible in most histories of computing. The book shows how a field that seemed like pure research in the 1960s produced the commercial computing industry of the 1980s and the internet economy of the 1990s.
The writing is thorough and occasionally dense. Waldrop covers both the technical ideas and the human stories: the rivalries, the funding battles, the moments when a demo changed everyone’s thinking. At about 530 pages, the book requires commitment.
For founders, the history is relevant because it shows how major technologies actually develop: not through lone geniuses in garages, but through sustained funding of smart people working on hard problems over decades. The pattern (a visionary funder identifies talent and gives them space to work) is recognizable in modern venture capital and corporate research.
Patrick Collison put this on the Stripe Press reading list and has recommended it publicly. The book was republished by Stripe Press in 2018 after being out of print, and it is widely regarded as one of the best histories of computing ever written.
