Inside Our Machine-Enchanted World
--
An old computer dynamic remakes tech. And more.
This is an excerpt from a review I wrote for Encyclopedia Britannica’s most recent Book of the Year. Images added. The original is here.
Note: I wrote this when I was working for The New York Times. I am now at Google, where I do editorial work for the company’s cloud computing business. The piece reflects neither company’s views.
At Google, I write and speak about what I am convinced is the most dramatic technology development of our lives: the advent of unimaginable computing power everywhere. This, along with the artificial intelligence programs running in large-scale clouds, will, I believe, transform business, economics, and society, and enable radical new understandings of our world.
The most-important technology transformation of the early part of the 21st century — the movement of machine intelligence to seemingly every point on the planet — continued apace in 2016. Just a few years earlier, annual reviews would have examined the number of mainframe computers, computer servers, and personal computers (PCs) in the world and the uses to which they were being put. However, while hundreds of millions of those machines were shipped in 2016, for the most part those high-tech devices were connected objects in global cloud-computing networks that might also include computer-infused machinery, appliances, automobiles, and, most notably, smartphones.
The result was a transformation of the computer industry as well as other enterprises. In particular, industries far removed from technology, such as farming and taxi services, were compelled to cope with computer-driven upstarts. By 2016 some of those privately financed digital upstarts — known as “unicorns” — were worth billions of dollars. Consumers also had increased access to nontraditional on-demand or sharing businesses, such as the private transport giant Uber (by far the largest of the unicorns), through mobile apps downloaded onto their smartphones or tablets. Computer-assisted surveillance was becoming commonplace in everyday life owing to an explosion of cheap digital cameras and camera-equipped smart devices as well as open-source software capable of performing facial recognition. Computer-assisted drones were used to distribute medicines in rural areas of Africa and were being tested for everyday package delivery in the developed world.
At the same time, such dramatic transformations were the continuation of an earlier phenomenon in computing — known informally as an edge/core dynamic — taken to a global scale. That dynamic involves many less-powerful computers on the periphery of a network. Those devices interact with a powerful computer (or more than one) at the center, which records incoming data and steers tasks by means of deployed software. There are examples of that interaction going back at least as far as the early days of the space program, but the method came into its own with the so-called client/server computing that dominated the computer industry in the late 20th century. The “client” was typically a personal computer, connected to servers that were networked to manage large numbers of PCs. Companies such as Oracle, Microsoft, Dell, and Cisco Systems all rose to industry dominance selling client/server products.
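To make that edge/core pattern concrete, here is a minimal Python sketch of my own. The class names and the throttling rule are invented for illustration; they stand in for any number of weak clients feeding data to, and taking instructions from, a powerful core.

```python
# A minimal sketch of the edge/core dynamic described above.
# All names and the throttling rule are illustrative, not any vendor's API.

class Core:
    """The powerful machine at the center: records data, steers the edge."""
    def __init__(self):
        self.log = []  # every reading sent in from the periphery

    def ingest(self, device_id, reading):
        self.log.append((device_id, reading))
        # A trivial "steering" decision: heavily loaded devices report less often.
        return {"report_interval_s": 60 if reading > 0.8 else 10}


class EdgeDevice:
    """A less-powerful machine on the periphery of the network."""
    def __init__(self, device_id, core):
        self.device_id = device_id
        self.core = core
        self.report_interval_s = 10

    def report(self, reading):
        instructions = self.core.ingest(self.device_id, reading)
        self.report_interval_s = instructions["report_interval_s"]


if __name__ == "__main__":
    core = Core()
    devices = [EdgeDevice(f"client-{i}", core) for i in range(3)]
    for device, load in zip(devices, [0.2, 0.9, 0.5]):
        device.report(load)
    print(len(core.log), "readings recorded at the core")
    print(devices[1].device_id, "told to report every",
          devices[1].report_interval_s, "seconds")
```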
Moore’s law, an economic observation regarding the tendency of the computer industry to produce chips with twice the transistor density at no additional cost every 18–24 months, was particularly beneficial to client/server companies. At a dependable rate, hardware manufacturers would produce PCs and servers capable of handling more computationally intensive tasks, and software companies could write packages with the assumption that a certain level of complexity would be available in two years’ time. Networking companies such as Cisco could produce more-capable edge devices, challenging the capability of core information routers, and then produce a stronger router, enabling more tasks at the edge.
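The power of that observation is in the compounding. A back-of-the-envelope calculation, assuming a 24-month doubling period and an arbitrary starting point, shows why software makers could safely plan around it:

```python
# Back-of-the-envelope compounding of Moore's law.
# Assumes a 24-month doubling period and an arbitrary starting density.
doubling_period_months = 24
start_density = 1.0  # normalized transistor density, at fixed cost, in year 0

for years in (2, 4, 6, 8, 10):
    density = start_density * 2 ** (years * 12 / doubling_period_months)
    print(f"after {years:2d} years: {density:4.0f}x the transistors at the same cost")
# After a decade at this rate the same money buys roughly 32x the transistors,
# which is why software could safely be written for the hardware of two years hence.
```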
Cloud computing arose from software that transformed the capability of networked servers and brought about performance improvements beyond the periodic doubling of Moore’s law. For relatively little money, subscribers to commercial cloud systems created by Amazon, Microsoft, Google, and others could secure seemingly infinite amounts of computation and data storage. Usage of those big clouds continued to accelerate in 2016, while most incumbent companies from the client/server era struggled to cope.
Cloud computing might have seemed to end the edge/core dynamic, since even a relatively weak PC could utilize a cloud’s supercomputing power and storage capacity. In fact, the leap of computing from specific machines into many types of devices meant that the dynamic was stronger than ever before. The edge might include self-driving cars, smartphones, and PCs, while the core comprised globe-spanning cloud systems, in some cases with more than two million connected servers.
The virtuous computing dynamic was no longer periodic, following Moore’s law, but rather had evolved into something closer to continuous. Because different types of devices did not increase their chip density in lockstep, the improvements smoothed out over time. Cloud computing centers utilized custom chip designs and more-sophisticated software on ever-larger campuses. Throughout 2016 Microsoft was in the process of constructing a vast facility in Virginia, with room for 20 data centers; the company had also purchased a nine-hole golf course in Iowa to obtain land for one part of a larger complex. Google and Amazon built out at a similar rate, spending between $5 billion and $9 billion a year on cloud centers.
This new edge/core dynamic began to have profound effects on how computing worked. The sheer amount of data available, alongside near-zero storage costs and very low processing costs, led to renewed interest in statistical methods that predict behavior by examining past interactions. The algorithms inside those new systems led to an explosion of applications that used artificial intelligence (AI) and machine learning.
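To give a feel for what predicting behavior from past interactions looks like at its very simplest, here is a toy sketch of my own: a tiny logistic-regression model trained by gradient descent on invented click data. The cloud-scale systems described above are vastly larger, but the underlying idea is the same.

```python
import math
import random

# Toy illustration of predicting behavior from past interactions:
# logistic regression trained by plain gradient descent on invented click data.
random.seed(0)

def make_interaction():
    hour, viewed = random.random(), random.random()        # normalized features
    clicked = 1 if 0.8 * viewed + 0.1 * hour + random.gauss(0, 0.1) > 0.5 else 0
    return (hour, viewed), clicked

history = [make_interaction() for _ in range(500)]          # "past interactions"

w, b, lr = [0.0, 0.0], 0.0, 0.5                             # weights, bias, step size

def predict(x):
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z))                       # probability of a click

for _ in range(200):                                        # gradient-descent passes
    for x, y in history:
        err = predict(x) - y                                # gradient of the log loss
        w = [wi - lr * err * xi for wi, xi in zip(w, x)]
        b -= lr * err

accuracy = sum((predict(x) > 0.5) == bool(y) for x, y in history) / len(history)
print(f"training accuracy on {len(history)} past interactions: {accuracy:.0%}")
```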
Computing had previously been viewed largely through the prism of a chip’s processing power; a standard measure was FLOPS (floating point operations per second). Increasingly, however, the ability to manage petabytes of data and train AI systems on those big data sets became a key job skill in the industry. The chip maker NVIDIA, which manufactured chips for video games, enjoyed a boom once it became clear that semiconductors for graphics could be adapted for AI. Self-driving cars, which in 2016 were commercially deployed in Pittsburgh by Uber, relied on a large amount of onboard storage and processing, which was later fed back to Uber’s main computers for system optimization.
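FLOPS itself is plain arithmetic: the number of cores, times the clock rate, times the floating-point operations each core can do per cycle. The figures below are hypothetical round numbers, not the specification of any real chip:

```python
# FLOPS is simple arithmetic: cores x clock rate x floating-point ops per core per cycle.
# The hardware figures below are hypothetical round numbers, not a real chip.
cores = 2_560                  # e.g. the many small cores of a graphics processor
clock_hz = 1.5e9               # 1.5 GHz
flops_per_core_per_cycle = 2   # one fused multiply-add counts as two operations

peak_flops = cores * clock_hz * flops_per_core_per_cycle
print(f"peak: {peak_flops:.2e} FLOPS (about {peak_flops / 1e12:.1f} teraFLOPS)")
```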
The dynamic of an ever-more-powerful and ever-more-diverse edge feeding an ever-larger and ever-more-capable core seemed likely to continue on a global basis. In April 2016 Facebook announced a 10-year plan for global computing. The scheme included edge devices such as virtual reality goggles that would ultimately receive information over a network infrastructure that Facebook was developing. When completed, that infrastructure was intended to make high-speed connectivity an affordable reality for another one billion people around the globe. Facebook, which operated its own big cloud, had previously used similar open-source hardware techniques to sharply lower its core-computing costs.
The new algorithms and richer data sets also had dramatic impacts. Google’s DeepMind, a U.K.-based company that focused on a type of artificial intelligence known as deep learning, developed a computer program that in March beat the world’s human champion at Go, considered to be among the most-complex games, several years ahead of expectations. Google also turned DeepMind’s AI system on its own infrastructure and achieved energy savings of 15% in a corporate data center that it considered one of the industry’s most efficient. Google expected to apply that kind of analysis to other industrial systems, notably manufacturing plants.
My thanks to Britannica (https://www.britannica.com/) for permission to run this excerpt. Check out their site; it has a lot of great stuff.