The Innovators by Walter Isaacson
Notes by Bill Norris 2/20/15
According to Isaacson, the men and women who created computers and the Internet were often Ph.D.s in mathematics, engineering, physics, and psychology. They generally worked together, sharing ideas. A number of them were tinkerers with electronics; some were musicians. Many had parents who nurtured them in both technology and the humanities. The flower-child culture of San Francisco's Haight-Ashbury influenced the products created with personal computer technology. The digital age was put together in three ways: first, government programs; second, corporate efforts; and third, peer collaboration.
THE PEOPLE AND THE PROGRESSION:
Charles Babbage – Designed his Analytical Engine in the 1830s, the first general-purpose mechanical computer, though it was never fully built
Ada Lovelace, daughter of Lord Byron and a friend and associate of Babbage, first described the capabilities of a general-purpose computer in her 1843 notes. She stated that no machine would ever be a thinking machine, a claim Alan Turing later dubbed Lady Lovelace's Objection.
Eckert & Mauchly – Invented the ENIAC
Alan Turing – Bletchley Park; a mathematician, he built a machine that deciphered Hitler's Enigma cipher
John Bardeen & Walter Brattain – created the transistor at Bell Labs; won the Nobel Prize
William Shockley – jealous over the transistor discovery, horned in on their Nobel Prize
Arthur Rock – venture capitalist – moved to the West Coast to raise funds for the new industry
Robert Noyce, Gordon Moore, Andy Grove – founded INTEL, which developed processors for industry
MOORE'S LAW – first stated by Gordon Moore in 1965: the number of transistors that can be put on a microchip will double every year for the next ten years; in 1975 Moore revised the pace to doubling every two years (see the sketch after this list)
GAMES – Spacewar, developed by members of MIT's Tech Model Railroad Club, where the term "hackers" was coined
Vannevar Bush conceived the idea of government, industry, and academia working together to advance the science of the USA.
Stanford U. created a Technology Park to spawn technology companies
Robert Taylor, working at ARPA (the Advanced Research Projects Agency), came up with the idea of connecting computers into a network, from which the Internet evolved. The establishment fought it; institutions did not want their research computers connected to one another for security reasons.
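As a back-of-the-envelope illustration of Moore's Law above (my sketch, not from the book), the compounding it implies can be shown in a few lines of Python, taking the roughly 2,300-transistor Intel 4004 of 1971 as an assumed baseline:

```python
# Illustrative only: the compound doubling implied by Moore's Law.
# Baseline assumption: the ~2,300-transistor Intel 4004 of 1971.
def transistors(base: int, years: int, doubling_period: float = 2.0) -> int:
    """Project a transistor count after `years` of doubling every `doubling_period` years."""
    return int(base * 2 ** (years / doubling_period))

print(transistors(2300, 20))  # ten doublings in 20 years: ~2.35 million transistors
```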
See Chapter 12 as a summary of the teamwork that made the Digital Age.
The Internet grew up among numerous communications experts; no one person can claim the insight alone, though one did, and was put down by his peers. The concept was a distributed network rather than a centrally controlled system in which one computer going down could cause the whole thing to collapse. Evolving during the Cold War made the experts leery of a single master controller that could be taken out. An ARPA program funneled funds to the corporations and institutions most likely to link their computers together into a network, ARPANET. The idea of routers sending packets of data, to be reassembled when they reached their destination, was devised by Paul Baran at RAND. Routers handled the traffic rather than mainframes, freeing up the mainframes' computing capacity. Finally, in 1973 an internet protocol (IP) was created to direct the packets through the system and reassemble them at the destination. The Internet was born, and it was geeks, geniuses, and hackers who masterminded its architecture.
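A minimal sketch (my illustration, not code from the book) of Baran's packet idea: number the packets, let them travel and arrive in any order, and reassemble them by sequence number at the destination.

```python
import random

def packetize(message: str, size: int = 8) -> list[tuple[int, str]]:
    """Split a message into (sequence_number, payload) packets."""
    return [(i, message[i:i + size]) for i in range(0, len(message), size)]

def reassemble(packets: list[tuple[int, str]]) -> str:
    """Rebuild the original message regardless of arrival order."""
    return "".join(payload for _, payload in sorted(packets))

message = "Packets may take different routes through the network."
packets = packetize(message)
random.shuffle(packets)                 # simulate out-of-order arrival
assert reassemble(packets) == message   # the destination restores the message
```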
The dispute over why the Internet was created has two answers: #1, to allow researchers to communicate for academic R&D, and #2, to counter the threat of nuclear war for military purposes. Both were real concerns, but the government wrote the checks based on the second! As one observer put it, building the Internet was like building a cathedral: bricks were laid by different individuals and generations, and each generation later claimed the leading role, but in the end it was the co-effort of many, over time, that produced the cathedral. Government funding of speculative research was the bread and butter of digital innovation.
The personal computer grew out of the "flower power" generation on the West Coast. Vannevar Bush had described the concept as far back as 1945, but it was left to later generations to create it in the 1970s. The vision was of a device that could communicate, store information and pictures, and create open-ended encyclopedias. In the seventies and eighties the freewheeling LSD generation wanted distance from the Big Brother mentality of government and the military.
Gates grew up as a nerdy, obsessed kid at a private school on the West Coast, fascinated by computers. He spent all his spare time at a remote terminal connected to a time-shared mainframe, programming in BASIC. By the eighth grade he was working on commercial projects for industry on mainframe computers, and he and Paul Allen became fast friends on such projects; one was a program for counting traffic with a tube laid across the highway. He chose Harvard and applied mathematics, where he practically lived in the computer room with a military-funded computer. In January 1975 he and Allen read in Popular Electronics about the Altair 8800, developed in Albuquerque, NM by Ed Roberts, and realized their time had come to write a BASIC language for programming the Altair. Allen demonstrated it to Roberts in Albuquerque, who was thrilled to see it take 2 + 2 and calculate 4, and licensed it for sale with the Altair. Gates called the Altair the first personal computer ever; the home computer was born. In 1975 Gates dropped out of Harvard with two semesters left to graduate. More than thirty years later he went back for an honorary degree and told his father in the audience, "I told you I would come back and get my degree!"
Gates insisted that he and Allen retain ownership of the software and that the Altair's maker use "best efforts" to require other developers to license it; they would split the proceeds. This was the beginning of Gates' business structure. He insisted to Allen that he own 60% while Allen owned 40% (Gates' father was a lawyer!); later he pushed that to a 64/36 split.
Steve Jobs grew up with a fascination for creating and selling things. Steve Wozniak was the son of an engineer who passed his love of electronics to his son at an early age. Wozniak and Jobs became fast friends: Woz made things and Jobs sold them. Then came the Altair and the Popular Electronics article, and they set out to build a personal computer of their own, licensing Microsoft's BASIC for it. Jobs later came up with a dream machine with a graphical interface driven by a mouse, and asked Gates to write software for it: graphical interface, mouse, drop-down menus, etc. Gates and Allen seized the opportunity to take the idea and create their own graphical operating system, Windows. Gates and Jobs split!
While Apple at first had far greater revenues than Microsoft, Gates' willingness to license his software to any company for any machine, while Apple reserved its software for its own machines, meant that Microsoft earned the greater revenues in the long run.
Linus Torvalds in Helsinki set out on a different route from Gates and Jobs: an open-source operating system, which he named Linux, a diminutive of his first name. He made his product open for the world to use, improve, update, and debug, asking users to send him postcards, not money.
Hence, the Gates/Allen approach was MS-DOS, which could operate on different machines; Jobs/Wozniak's was tied to the Apple machine; and Torvalds/Stallman's was an open-source system free to all. In reality, the three different approaches together made an ideal environment for the growth of the personal computer.
Modems were created to connect computers via telephone lines. AT&T tried to prevent it, but court and regulatory decisions ended its obstruction.
The Internet grew out of an alphabet soup of precursors, beginning with ARPANET, the government network created for the military. Many companies, institutions, and serial entrepreneurs got into networking. From the uses that grew up on these nets, it was evident that people loved chatrooms and bulletin boards, and social networks evolved. Al Gore sponsored bipartisan legislation in Congress that pulled all the pieces together into a universal Internet available to the world. He didn't invent the Internet, nor did he ever say he did, but he was a major player.
"Innovation arises in places where the right primordial soup exists." Silicon Valley had that for Gates and Jobs; Tim Berners-Lee did not have it at CERN in Geneva when he was creating his innovations. He came up with the idea of anyone being able to put any information on the Internet as hypertext with an address, such as http://www.cern.ch. Any document or piece of information could have its own address, enabling anyone, anywhere, on any machine, to access it via the Internet. Berners-Lee named it the World Wide Web, abbreviated www. The rest is history!
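The core of the idea, that any document with an address can be fetched by any machine, can be shown in a couple of lines (my sketch, assuming network access; the URL is the one given above):

```python
from urllib.request import urlopen

# Fetch the document at a web address; any machine, anywhere, can do this.
with urlopen("http://www.cern.ch") as response:
    page = response.read()
print(page[:80])  # the first bytes of the page at that address
```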
Browsers were developed to enable users to navigate and view information on the web. Search engines became the next innovation, allowing individuals to search the web for sites and data. Publishing personal content on the web evolved with the blog in 1997; the world had access, and by 2014 there were over 800 million blogs worldwide. A battle raged over whether users would pay fees to use the web: those in favor wanted to earn money; those against fees wanted as many users as possible, to make the web and Internet as universal as possible.
Print and television media and their ilk were cavemen to the Internet technocrats. Central control was anathema to the visionaries; the web was to be the people's media. (On Feb. 26, 2015, the FCC voted to regulate the Internet as a public utility, meaning it would be regulated like a telephone company or electric company in the U.S.)
Ev Williams created blogger.com and grew it to 100,000 users with no income. His staff of six walked out, but he continued the struggle and finally developed an advanced, paid version. Out of his work Twitter evolved.
The idea of wikis, from the Hawaiian word for quick, evolved to provide software that would allow the public to edit things. Controversy grew over whether only "experts" or the general cyber public would be allowed to contribute. Jimmy Wales, from Huntsville, AL, led the creation of a new form of encyclopedia. Wikipedia exploded from its inception in 2001 to 4.4 million English articles in 2014, while Encyclopaedia Britannica's electronic edition had only 2% of that when it quit publishing in 2010! The unique feature of Wikipedia is its principle that communities can provide and police the content; as with graffiti on a wall, it is easier to take a bad edit down than to put it up. It is the world's largest collaborative knowledge-sharing project, and "experts" don't have to anoint qualified contributors.
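A toy sketch (mine, not Wikipedia's actual software) of the wiki principle: every revision is kept, so a bad edit comes down in one step while putting it up takes at least as much effort.

```python
class WikiPage:
    """A page anyone can edit, with a full revision history."""

    def __init__(self, text: str = ""):
        self.history = [text]

    def edit(self, new_text: str) -> None:
        self.history.append(new_text)

    def revert(self) -> None:
        """Roll back the latest edit in a single step."""
        if len(self.history) > 1:
            self.history.pop()

    @property
    def text(self) -> str:
        return self.history[-1]

page = WikiPage("The Analytical Engine was designed by Charles Babbage.")
page.edit("VANDALIZED!")   # graffiti goes up...
page.revert()              # ...and comes down just as fast
print(page.text)
```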
Then came Google, its name taken from "googol," a one with a hundred zeros behind it. Two graduate students, Brin and Page, teamed up at Stanford U. to create a symbiosis between man and machine: a search engine that could prioritize the results found to match them to the searcher's interests. Other companies turned down the opportunity to buy the system for $1 M, so they started a company with the name Google, and the rest is history.
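The ranking insight behind their engine, PageRank, treats a link as a vote whose weight depends on the importance of the page casting it. A toy power-iteration version (my sketch, not Google's code):

```python
def pagerank(links: dict[str, list[str]], damping: float = 0.85, iters: int = 50) -> dict[str, float]:
    """Toy PageRank: a page is important if important pages link to it."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iters):
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            for target in outlinks:
                new_rank[target] += damping * rank[page] / len(outlinks)
        rank = new_rank
    return rank

links = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
print(pagerank(links))  # C ranks highest: both A and B link to it
```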
Ada Lovelace's Objection, that machines cannot think like humans, brings up artificial intelligence. Many claims have been made, but so far no machine passes the tests for human intelligence, including IBM's Deep Blue. Computers can do hard things easily, but easy things they cannot do. Asked "How deep is the Red Sea?" a machine will respond 7,254 feet, but asked "Can a crocodile play baseball?" the machine can't figure it out, while a kid can! The future seems to be a human-machine symbiosis in which, working together, man and machine optimize outcomes that neither could achieve alone.
Walter Isaacson ends in Chapter 12 on the note of the vital role of collaboration in the development of technology, so well illustrated in this e-generation. From Ada Lovelace and Charles Babbage in the nineteenth century to Gates and Jobs in the twentieth and twenty-first, innovators have progressed via collaborations, some proprietary, some open source, but all illustrating how teams work best together. Another of Isaacson's conclusions concerns the roles of technology and the humanities: where these two disciplines intersect is where the best innovation develops.
Chapter 12 is a summary of how technology evolved to produce the Digital Age of today.
Notes by Bill Norris – 2/20/15