
This view emphasizes the steadiness of the doubling over time rather than the large numbers at the end. Because of this, we often use logarithmic scales for graphing doublings and other exponential growth series. On a logarithmic scale they show up as straight lines, which makes their speed much easier to compare: the bigger the exponent, the faster the growth and the steeper the line.
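
To see the difference concretely, here is a minimal plotting sketch (ours, in Python with matplotlib, purely for illustration): the same doubling series drawn once on a linear axis and once on a logarithmic one.

```python
import matplotlib.pyplot as plt

# The same doubling series (1, 2, 4, ..., 2**63) plotted twice.
# On a linear axis it hugs the floor and then shoots up; with a
# logarithmic vertical axis it becomes a straight line whose slope
# reflects the growth rate.
periods = list(range(64))
values = [2 ** n for n in periods]

fig, (linear_ax, log_ax) = plt.subplots(1, 2, figsize=(9, 3))
linear_ax.plot(periods, values)
linear_ax.set_title("linear scale")
log_ax.plot(periods, values)
log_ax.set_yscale("log")          # doubling series -> straight line
log_ax.set_title("logarithmic scale")
plt.tight_layout()
plt.show()
```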

Impoverished Emperors, Headless Inventors, and the Second Half of the Chessboard

Our brains are not well equipped to understand sustained exponential growth. In particular, we severely underestimate how big the numbers can get. Inventor and futurist Ray Kurzweil retells an old story to drive this point home. The game of chess originated in present-day India during the sixth century CE, the time of the Gupta Empire.⁷ As the story goes, it was invented by a very clever man who traveled to Pataliputra, the capital city, and presented his brainchild to the emperor. The ruler was so impressed by the difficult, beautiful game that he invited the inventor to name his reward.

The inventor praised the emperor’s generosity and said, “All I desire is some rice to feed my family.” Since the emperor’s largess was spurred by the invention of chess, the inventor suggested they use the chessboard to determine the amount of rice he would be given. “Place one single grain of rice on the first square of the board, two on the second, four on the third, and so on,” the inventor proposed, “so that each square receives twice as many grains as the previous.”

“Make it so,” the emperor replied, impressed by the inventor’s apparent modesty.

Moore’s Law and the tribble exercise allow us to see what the emperor did not: sixty-three instances of doubling yield a fantastically big number, even when starting with a single unit. If his request were fully honored, the inventor would wind up with 2⁶⁴ – 1, or more than eighteen quintillion grains of rice. A pile of rice this big would dwarf Mount Everest; it’s more rice than has been produced in the history of the world. Of course, the emperor could not honor such a request. In some versions of the story, once he realizes that he’s been tricked, he has the inventor beheaded.
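
The arithmetic behind that figure is easy to check; here is a quick illustrative calculation in Python:

```python
# One grain on the first square, doubling on each of the 64 squares:
# the total is 2**64 - 1 grains of rice.
total = sum(2 ** square for square in range(64))
print(f"{total:,}")            # 18,446,744,073,709,551,615 -- over eighteen quintillion
print(total == 2 ** 64 - 1)    # True
```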

Kurzweil tells the story of the inventor and the emperor in his 2000 book The Age of Spiritual Machines: When Computers Exceed Human Intelligence. He aims not only to illustrate the power of sustained exponential growth but also to highlight the point at which the numbers start to become so big they are inconceivable:

After thirty-two squares, the emperor had given the inventor about 4 billion grains of rice. That’s a reasonable quantity—about one large field’s worth—and the emperor did start to take notice.

But the emperor could still remain an emperor. And the inventor could still retain his head. It was as they headed into the second half of the chessboard that at least one of them got into trouble.⁸

Kurzweil’s great insight is that while numbers do get large in the first half of the chessboard, we still come across them in the real world. Four billion does not necessarily outstrip our intuition. We experience it when harvesting grain, assessing the fortunes of the world’s richest people today, or tallying up national debt levels. In the second half of the chessboard, however—as numbers mount into trillions, quadrillions, and quintillions—we lose all sense of them. We also lose sense of how quickly numbers like these appear as exponential growth continues.
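
A quick illustrative calculation puts numbers on that distinction between the two halves of the board:

```python
# Grains on the first thirty-two squares versus the remaining thirty-two.
first_half = sum(2 ** square for square in range(32))
second_half = sum(2 ** square for square in range(32, 64))
print(f"{first_half:,}")                 # 4,294,967,295 -- about 4 billion
print(f"{second_half // first_half:,}")  # the second half holds ~4.3 billion times more
```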

Kurzweil’s distinction between the first and second halves of the chessboard inspired a quick calculation. Among many other things, the U.S. Bureau of Economic Analysis (BEA) tracks American companies’ expenditures. The BEA first noted “information technology” as a distinct corporate investment category in 1958. We took that year as the starting point for when Moore’s Law entered the business world, and used eighteen months as the doubling period. After thirty-two of these doublings, U.S. businesses entered the second half of the chessboard when it comes to the use of digital gear. That was in 2006.
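
The break-point arithmetic is simple enough to spell out, using the same assumptions of a 1958 start and an eighteen-month doubling period:

```python
# Thirty-two doublings of eighteen months each, starting in 1958,
# land right at 2006.
start_year = 1958
doubling_period_years = 1.5
doublings_to_second_half = 32
print(start_year + doublings_to_second_half * doubling_period_years)  # 2006.0
```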

Of course, this calculation is just a fun little exercise, not anything like a serious attempt to identify the one point at which everything changed in the world of corporate computing. You could easily argue with the starting point of 1958 and a doubling period of eighteen months. Changes to either assumption would yield a different break point between the first and second halves of the chessboard. And business technologists were innovating well before the second half of the chessboard; as we’ll discuss later, the breakthroughs of today and tomorrow rely on, and would be impossible without, those of the past.

We present this calculation here because it underscores an important idea: that exponential growth eventually leads to staggeringly big numbers, ones that leave our intuition and experience behind. In other words, things get weird in the second half of the chessboard. And like the emperor, most of us have trouble keeping up.

One of the things that sets the second machine age apart is how quickly that second half of the chessboard can arrive. We’re not claiming that no other technology has ever improved exponentially. In fact, after the one-time burst of improvement created by Watt’s innovations, additional tinkering led to exponential improvement in the steam engine over the ensuing two hundred years. But the exponents were relatively small, so steam power only went through about three or four doublings in efficiency during that period.⁹ It would take a millennium to reach the second half of the chessboard at that rate. In the second machine age, the doublings happen much faster and exponential growth is much more salient.
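
A rough illustrative check of that pace, assuming the three-to-four-doublings figure above:

```python
# Three to four efficiency doublings spread over two hundred years is
# roughly 50 to 67 years per doubling, so thirty-two doublings -- the
# start of the second half of the chessboard -- would take well over a
# thousand years at that pace.
for doublings_per_200_years in (3, 4):
    years_per_doubling = 200 / doublings_per_200_years
    print(round(32 * years_per_doubling), "years")   # 2133 and 1600
```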

Second-Half Technologies

Our quick doubling calculation also helps us understand why progress with digital technologies feels so much faster these days and why we’ve seen so many recent examples of science fiction becoming business reality. It’s because the steady and rapid exponential growth of Moore’s Law has added up to the point that we’re now in a different regime of computing: we’re now in the second half of the chessboard. The innovations we described in the previous chapter—cars that drive themselves in traffic; Jeopardy!-champion supercomputers; auto-generated news stories; cheap, flexible factory robots; and inexpensive consumer devices that are simultaneously communicators, tricorders, and computers—have all appeared since 2006, as have countless other marvels that seem quite different from what came before.

One of the reasons they’re all appearing now is that the digital gear at their hearts is finally both fast and cheap enough to enable them. This wasn’t the case just a decade ago. What does digital progress look like on a logarithmic scale? Let’s take a look.

FIGURE 3.3 The Many Dimensions of Moore’s Law

This graph shows that Moore’s Law is both consistent and broad; it’s been in force for a long time (decades, in some cases) and applies to many types of digital progress. As you look at it, keep in mind that if it used standard linear scaling on the vertical axis, all of those straight-ish lines would look like the first graph above of Andy’s tribble family—horizontal most of the way, then suddenly close to vertical at the end. And there would really be no way to graph them all together—the numbers involved are just too different. Logarithmic scaling takes care of these issues and allows us to get a clear overall picture of improvement in digital gear.

It’s clear that many of the critical building blocks of computing—microchip density, processing speed, storage capacity, energy efficiency, download speed, and so on—have been improving at exponential rates for a long time. To understand the real-world impacts of Moore’s Law, let’s compare the capabilities of computers separated by only a few doubling periods. The ASCI Red, the first product of the U.S. government’s Accelerated Strategic Computing Initiative, was the world’s fastest supercomputer when it was introduced in 1996. It cost $55 million to develop and its one hundred cabinets occupied nearly 1,600 square feet of floor space (80 percent of a tennis court) at Sandia National Laboratories in New Mexico.¹⁰ Designed for calculation-intensive tasks like simulating nuclear tests, ASCI Red was the first computer to score above one teraflop—one trillion floating point operations* per second—on the standard benchmark test for computer speed. To reach this speed it drew about eight hundred kilowatts of electricity, roughly as much power as eight hundred homes would use. By 1997, it had reached 1.8 teraflops.

Nine years later another computer hit 1.8 teraflops. But instead of simulating nuclear explosions, it was devoted to drawing them and other complex graphics in all their realistic, real-time, three-dimensional glory. It did this not for physicists, but for video game players. This computer was the Sony PlayStation 3, which matched the ASCI Red in performance, yet cost about five hundred dollars, took up less than a tenth of a square meter, and drew about two hundred watts.¹¹ In less than ten years exponential digital progress brought teraflop calculating power from a single government lab to living rooms and college dorms all around the world. The PlayStation 3 sold approximately 64 million units. The ASCI Red was taken out of service in 2006.
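
The gap between the two 1.8-teraflop machines can be summarized in a few rough ratios (approximate figures from the comparison above; 1,600 square feet is taken as roughly 150 square meters):

```python
# Approximate figures for the two 1.8-teraflop machines described above.
asci_red = {"cost_usd": 55_000_000, "power_watts": 800_000, "floor_m2": 150}
playstation_3 = {"cost_usd": 500, "power_watts": 200, "floor_m2": 0.1}

for key in asci_red:
    ratio = asci_red[key] / playstation_3[key]
    print(f"{key}: ASCI Red needed roughly {ratio:,.0f}x more")
# cost_usd: ~110,000x, power_watts: ~4,000x, floor_m2: ~1,500x
```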

Exponential progress has made possible many of the advances discussed in the previous chapter. IBM’s Watson draws on a plethora of clever algorithms, but it would be uncompetitive without computer hardware that is about one hundred times more powerful than Deep Blue, its chess-playing predecessor that beat the human world champion, Garry Kasparov, in a 1997 match. Speech recognition applications like Siri require lots of computing power, which became available on mobile phones like Apple’s iPhone 4S (the first phone that came with Siri installed). The iPhone 4S was about as powerful, in fact, as Apple’s top-of-the-line PowerBook G4 laptop had been a decade earlier. As all of these innovations show, exponential progress allows technology to keep racing ahead and makes science fiction reality in the second half of the chessboard.

Not Just for Computers Anymore: The Spread of Moore’s Law

Another comparison across computer generations highlights not only the power of Moore’s Law but also its wide reach. As is the case with the ASCI Red and the PlayStation 3, the Cray-2 supercomputer (introduced in 1985) and iPad 2 tablet (introduced in 2011) had almost identical peak calculation speeds. But the iPad also had a speaker, microphone, and headphone jack. It had two cameras; the one on the front of the device was Video Graphics Array (VGA) quality, while the one on the back could capture high-definition video. Both could also take still photographs, and the back camera had a 5x digital zoom. The tablet had receivers that allowed it to participate in both wireless telephone and Wi-Fi networks. It also had a GPS receiver, digital compass, accelerometer, gyroscope, and light sensor. It had no built-in keyboard, relying instead on a high-definition touch screen that could track up to eleven points of contact simultaneously.¹² It fit all of this capability into a device that cost much less than $1,000 and was smaller, thinner, and lighter than many magazines. The Cray-2, which cost more than $35 million (in 2011 dollars), was by comparison deaf, dumb, blind, and immobile.¹³

Apple was able to cram all of this functionality into the iPad 2 because a broad shift has taken place in recent decades: sensors like microphones, cameras, and accelerometers have moved from the analog world to the digital one. They became, in essence, computer chips. As they did so, they became subject to the exponential improvement trajectories of Moore’s Law.

Digital gear for recording sounds was in use by the 1960s, and an Eastman Kodak engineer built the first modern digital camera in 1975.¹⁴ Early devices were expensive and clunky, but quality quickly improved and prices dropped. Kodak’s first digital single-lens reflex camera, the DCS 100, cost about $13,000 when it was introduced in 1991; it had a maximum resolution of 1.3 megapixels and stored its images in a separate, ten-pound hard drive that users slung over their shoulders. However, the pixels per dollar available from digital cameras doubled about every year (a phenomenon known as “Hendy’s Law” after Kodak Australia employee Barry Hendy, who documented it), and all related gear got exponentially smaller, lighter, cheaper, and better over time.¹⁵
Accumulated improvement in digital sensors meant that twenty years after the DCS 100, Apple could include two tiny cameras, capable of both still and video photography, on the iPad 2. And when it introduced a new iPad the following year, the rear camera’s resolution had improved by a factor of more than seven.
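
Hendy’s Law lends itself to the same kind of back-of-the-envelope arithmetic; the following is an illustrative calculation from the DCS 100’s figures, not Kodak’s actual data:

```python
# Pixels per dollar doubling roughly every year, starting from the
# DCS 100: 1.3 megapixels for about $13,000 in 1991.
pixels_per_dollar_1991 = 1.3e6 / 13_000     # about 100 pixels per dollar
years_of_doubling = 20                      # 1991 to 2011, the iPad 2 era
print(f"{pixels_per_dollar_1991 * 2 ** years_of_doubling:,.0f}")
# roughly 100,000,000 pixels per dollar -- a millionfold improvement
```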

Machine Eyes

As Moore’s Law works over time on processors, memory, sensors, and many other elements of computer hardware (a notable exception is batteries, which haven’t improved their performance at an exponential rate because they’re essentially chemical devices, not digital ones), it does more than just make computing devices faster, cheaper, smaller, and lighter. It also allows them to do things that previously seemed out of reach.

Researchers in artificial intelligence have long been fascinated (some would say obsessed) with the problem of simultaneous localization and mapping, which they refer to as SLAM. SLAM is the process of building up a map of an unfamiliar building as you’re navigating through it—where are the doors? where are the stairs? what are all the things I might trip over?—and also keeping track of where you are within it (so you can find your way back downstairs and out the front door). For the great majority of humans, SLAM happens with minimal conscious thought. But teaching machines to do it has been a huge challenge.
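
To give a flavor of what the problem involves, here is a heavily simplified sketch of the SLAM idea: a one-dimensional world, linear motion and measurement models, and known landmark correspondences, estimated with a plain Kalman filter. Real SLAM systems enjoy none of these luxuries; the landmark positions, noise values, and variable names below are ours, purely for illustration.

```python
import numpy as np

# A toy SLAM problem: a robot moves along a line with noisy odometry
# and takes noisy readings of its offset to two landmarks whose
# positions it does not know. Because everything here is linear, a
# plain Kalman filter jointly estimates the robot's position (the
# "localization" half) and the landmark positions (the "mapping" half).
rng = np.random.default_rng(0)

true_landmarks = np.array([5.0, 12.0])   # unknown to the robot
true_pose = 0.0

n = 1 + len(true_landmarks)              # state: [robot, landmark 1, landmark 2]
mu = np.zeros(n)                         # initial guess: everything at 0
Sigma = np.eye(n) * 100.0                # very uncertain about the map...
Sigma[0, 0] = 0.01                       # ...but the robot knows it starts near 0

R_MOTION = 0.05                          # odometry noise variance (assumed)
Q_MEAS = 0.10                            # sensor noise variance (assumed)

for step in range(50):
    # Motion update: the robot commands a move of +0.5 units.
    u = 0.5
    true_pose += u + rng.normal(0.0, np.sqrt(R_MOTION))
    mu[0] += u
    Sigma[0, 0] += R_MOTION              # only the robot's own uncertainty grows

    # Measurement update: noisy offset from the robot to each landmark.
    for i, landmark in enumerate(true_landmarks):
        z = (landmark - true_pose) + rng.normal(0.0, np.sqrt(Q_MEAS))
        H = np.zeros((1, n))
        H[0, 0], H[0, 1 + i] = -1.0, 1.0       # z = landmark_i - robot + noise
        S = (H @ Sigma @ H.T).item() + Q_MEAS  # innovation variance
        K = (Sigma @ H.T) / S                  # Kalman gain, shape (n, 1)
        innovation = z - (mu[1 + i] - mu[0])
        mu = mu + K[:, 0] * innovation
        Sigma = (np.eye(n) - K @ H) @ Sigma

print("estimated robot position:", round(mu[0], 2), "| true:", round(true_pose, 2))
print("estimated landmarks:", np.round(mu[1:], 2), "| true:", true_landmarks)
```

Even this toy version shows the key feature of SLAM: the same noisy observations simultaneously sharpen both the position estimate and the map.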
