      Metcalfe's Law, named for Bob Metcalfe, a founder of 3Com, said that the value of any network would increase as the square of the number of nodes. Bigger networks are geometrically more valuable than small ones. Moore's Law and Metcalfe's Law reinforced each other: as the price of computers fell, the benefits of connecting them rose. It took fifty years, but we eventually connected every computer. The result was the internet we know today, a global network that connects billions of devices and has made Facebook and all the other internet platforms possible.
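
      One standard way to formalize Metcalfe's claim (a sketch in conventional notation, not the book's own): a network of $n$ nodes supports

      $$\binom{n}{2} = \frac{n(n-1)}{2} \approx \frac{n^2}{2}$$

      possible pairwise connections, so the network's value $V$ grows roughly as $V(n) \propto n^2$. Doubling the number of nodes roughly quadruples the value, even as Moore's Law keeps cutting the cost of each node.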

      Beginning in the fifties, the technology industry went through several eras. During the Cold War, the most important customer was the government. Mainframe computers, giant machines that were housed in special air-conditioned rooms and supervised by a priesthood of technicians in white lab coats, enabled unprecedented automation of computation. The technicians communicated with mainframes via punch cards and the most primitive of networks. In comparison to today’s technology, mainframes could not do much, but they automated large-scale data processing, replacing human calculators and bookkeepers with machines. Any customer who wanted to use a computer in that era had to accept a product designed to meet the needs of government, which invested billions to solve complex problems like moon trajectories for NASA and missile targeting for the Department of Defense. IBM was the dominant player in the mainframe era and made all the components for the machines it sold, as well as most of the software. That business model was called vertical integration. The era of government lasted about thirty years. Data networks as we think of them today did not yet exist. Even so, brilliant people imagined a world where small computers optimized for productivity would be connected on powerful networks. In the sixties, J. C. R. Licklider conceived the network that would become the internet, and he persuaded the government to finance its development. At the same time, Douglas Engelbart invented the field of human-computer interaction, which led him to create the first computer mouse and to conceive the first graphical interface. It would take nearly two decades before Moore’s Law and Metcalfe’s Law could deliver enough performance to enable their vision of personal computing and an additional decade before the internet took off.

      Beginning in the seventies, the focus of the tech industry began to shift toward the needs of business. The era began with a concept called time sharing, which enabled many users to share the use of a single computer, reducing the cost to everyone. Time sharing gave rise to minicomputers, which were smaller than mainframes but still staggeringly expensive by today’s standards. Data networking began but was very slow and generally revolved around a single minicomputer. Punch cards gave way to terminals: keyboards attached to the primitive network, eliminating the need for a priesthood of technicians in white lab coats. Digital Equipment, Data General, Prime, and Wang led in minicomputers, which were useful for accounting and business applications but were far too complicated and costly for personal use. Although they were a big step forward relative to mainframes, even minicomputers barely scratched the surface of customer needs. Like IBM, the minicomputer vendors were vertically integrated, making most of the components for their products. Some minicomputers—Wang word processors, for example—addressed productivity applications that would be replaced by PCs. Other applications survived longer, but in the end, the minicomputer business would be subsumed by personal computer technology, if not by PCs themselves. Mainframes have survived to the present day, thanks in large part to giant, custom applications like accounting systems, which were created for the government and corporations and are cheaper to maintain on old systems than to re-create on new ones. (Massive server farms based on PC technology now attract any new application that needs mainframe-class processing; it is a much cheaper solution because you can use commodity hardware instead of proprietary mainframes.)

      ARPANET, the predecessor to today’s internet, began as a Department of Defense research project in 1969 under the leadership of Bob Taylor, a computer scientist who continued to influence the design of systems and networks until the late nineties. Douglas Engelbart’s lab was one of the first nodes on ARPANET. The goal was to create a nationwide network to protect the country’s command and control infrastructure in the event of a nuclear attack.

      The first application of computer technology to the consumer market came in 1972, when Al Alcorn created the game Pong as a training exercise for his boss at Atari, Nolan Bushnell. Bushnell’s impact on Silicon Valley went far beyond the games produced by Atari. He introduced the hippie culture to tech. White shirts with pocket protectors gave way to jeans and T-shirts. Nine to five went away in favor of the crazy but flexible hours that prevail even today.

      In the late seventies, microprocessors made by Motorola, Intel, and others were relatively cheap and had enough performance to allow Altair, Apple, and others to make the first personal computers. PCs like the Apple II took advantage of the growing supply of inexpensive components, produced by a wide range of independent vendors, to deliver products that captured the imagination first of hobbyists, then of consumers and some businesses. In 1979, Dan Bricklin and Bob Frankston introduced VisiCalc, the first spreadsheet for personal computers. It is hard to overstate the significance of VisiCalc. It was an engineering marvel. A work of art. Spreadsheets on Apple IIs transformed the productivity of bankers, accountants, and financial analysts.

      Unlike the vertical integration of mainframes and minicomputers, which limited product improvement to the rate of change of the slowest evolving part in the system, the horizontal integration of PCs allowed innovation at the pace of the most rapidly improving parts in the system. Because there were multiple, competing vendors for each component, systems could evolve far more rapidly than equivalent products subject to vertical integration. (If, say, microprocessors improved faster than disk drives, a horizontally integrated PC could adopt each new processor as it appeared, while a vertically integrated system advanced only as fast as its slowest in-house component.) The downside was that PCs assembled this way lacked the tight integration of mainframes and minicomputers. This created a downstream cost in terms of training and maintenance, but that was not reflected in the purchase price and did not trouble customers. Even IBM took notice.

      When IBM decided to enter the PC market, it abandoned vertical integration and partnered with a range of third-party vendors, including Microsoft for the operating system and Intel for the microprocessor. The first IBM PC shipped in 1981, signaling a fundamental change in the tech industry that only became obvious a couple of years later, when Microsoft’s and Intel’s other customers started to compete with IBM. Eventually, Compaq, Hewlett-Packard, Dell, and others left IBM in the dust. In the long run, though, most of the profits in the PC industry went to Microsoft and Intel, whose control of the brains and heart of the device and willingness to cooperate forced the rest of the industry into a commodity business.

      ARPANET had evolved to become a backbone for regional networks of universities and the military. PCs continued the trend of smaller, cheaper computers, but it took nearly a decade after the introduction of the Apple II before technology emerged to leverage the potential of clusters of PCs. Local area networks (LANs) got their start in the late eighties as a way to share expensive laser printers. Once installed, LANs attracted developers, leading to new applications, such as electronic mail. Business productivity and engineering applications created incentives to interconnect LANs within buildings and then tie them all together over proprietary wide area networks (WANs) and then the internet. The benefits of connectivity overwhelmed the frustration of incredibly slow networks, setting the stage for steady improvement. It also created a virtuous cycle, as PC technology could be used to design and build better components, increasing the performance of new PCs that could be used to design and build even better components.

      Consumers who wanted a PC in the eighties and early nineties had to buy one created to meet the needs of business. For consumers, PCs were relatively expensive and hard to use, but millions bought and learned to operate them. They put up with character-mode interfaces until Macintosh and then Windows finally delivered graphical interfaces that did not, well, totally suck. In the early nineties, consumer-centric PCs optimized for video games came to market.

      The virtuous cycle of Moore’s Law for computers and Metcalfe’s Law for networks reached a new level in the late eighties, but the open internet did not take off right away. It required enhancements. The English researcher Tim Berners-Lee delivered the goods when he invented the World Wide Web in 1989 and the first web browser in 1991, but even those innovations were not enough to push the internet into the mainstream. That happened when a computer science student by the name of Marc Andreessen created the Mosaic browser in 1993. Within a year, startups like Yahoo and Amazon had come along, followed in 1995 by eBay, and the web that we now know had come to life.

      By