Exponential change
All phenomena are dynamic, and while we tend to analyze and formalize them through the opposite process, it is the static abstraction that is the representation further removed from reality. Of course, as the struggles in the evolution of scientific methods have shown, reality is not always readily decipherable. Nor is it conveniently laid out in front of us in intuitive packages conforming to our common sense. Many of our intuitions about the rules governing natural phenomena turn out to be wrong. This can be shown with a series of experiments that, once the physical laws behind the phenomena are well understood, anybody can carry out. At that point, with accessible explanations that can be illustrated with immediacy, there is no excuse for remaining ignorant about the nature of reality. A simple example is Newton's first law, which says that all objects maintain their state of motion in the absence of an external force. Our everyday experience is that a car put in motion will actually stop if you don't push on the gas pedal. But now we have a clear understanding of the role of friction, and that the deceleration is due to resistance in the engine, against the terrain, and from the air in front of the car. Take away all sources of friction, and the car will go on forever.

The consequences of dynamic change are all around us: in the ebbs and flows of water, rivers, oceans, and rain; in the growth of vegetation, trees, and forests; in the advancing of deserts and the changing of seasons. But even though we have plenty of experience with dynamic change, our intuition can be more misleading than ever when faced with the raw power of its abstract mathematical nature, unencumbered and unrestrained by the constraints of a natural, physical environment. Exponential change is such a dynamic. We can prepare for it, but its blunt force will still surprise us, often confounding even experts and certainly taking laypersons aback by how powerfully it can reshape the landscape of our reality.
The simplest example of exponential change, the doubling of a quantity in a certain amount of time, can look fairly harmless or even disappointing at the beginning, if the starting point is the unit of 1.

Figure 2: Doubling of grains of wheat on a chessboard

1, 2, 4, 8, 16, 32, 64, 128, 256, 512, 1024, 2048, 4096, ... is a sequence familiar to anybody who has had any interest at all in numbers. Rattling off the sequence in your head could have been a harmless childhood exercise. There is a corresponding sequence before the unit that you can look at, potentially unassuming to an even greater degree: 0.01, 0.02, 0.04, 0.08, 0.16, 0.32, 0.64, and then 1.28. The interesting surprise of this sequence, nothing magical about it, is what came before: 0.00015625, 0.0003125, 0.000625, 0.00125, 0.0025, 0.005, 0.01. Why are these three sequences interesting, and why are they representative of the nature of exponential change? Imagine that you are looking at the world around you and, trying to decipher it and predict what a given phenomenon will amount to, you collect data about it. This collection will not be as neat and clear as the sequences above. There will be a lot of noise in it: errors of measurement, mistakes made during the process or the planning of the measurements, other phenomena intruding and confounding your attempts at a clear understanding, and so on. The noise of a natural environment in which, before even being able to train your ears to it, you want to discern a pattern that is possibly new, something that nobody else has tried to listen to before.
Is there signal in the noise? It is very likely that while you try to answer the question, there will be other opinions around, and by definition they will be different from yours: not really aligned, or maybe even in opposition. Whether you are in a research environment competing for grants, or in an industry trying to engineer a product or promote a service among users rightly distracted by a huge offering of alternative options, in either case you will be confronted by resistance to your original theory, which the signals may support. You need to be strong in your opinions; you even have to have faith in what you want to show, an unreasonable conviction that you are right while everybody else is telling you that you are wrong, or even that what you are looking for is non-existent, or impossible. This is the realm of the third sequence, leading up to 0.01 (or 1% of the unit): the area where even experts will be against you. It takes a keen eye and ear, concretely or abstractly, to understand that amid the distractions of a noisy natural world there is indeed something brewing, doubling calmly with nobody but you noticing it, and after several doublings arriving at the threshold of 1% of the goal of the unit.

Figure 3: Interpolation of a signal from a noisy phenomenon.

After you arrive at the 1%, those experts who still don't believe you should be stripped of their label ignominiously. Because from then onwards it should be clear, not only to you but to anybody who pays even passing attention, that it is now just a matter of time. In merely seven successive doublings you will have arrived at the unit.
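The arithmetic behind these sequences can be checked with a short sketch; the function and its name below are just an illustration, with the start values taken from the sequences above.

```python
def doublings_to_reach(start: float, target: float) -> int:
    """Count how many doublings it takes for `start` to reach `target`."""
    steps = 0
    while start < target:
        start *= 2
        steps += 1
    return steps

# From 1% of the goal, only seven doublings separate you from the unit:
print(doublings_to_reach(0.01, 1.0))          # 7
# And the quiet run-up from 0.00015625 to 1% took six doublings:
print(doublings_to_reach(0.00015625, 0.01))   # 6
```

The same count explains why a phenomenon at 1% of its goal, doubling at a steady pace, needs only about seven more periods to arrive.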
Turning this description from abstract sequences of numbers into a real example, we can look at what happened in the massive Human Genome Project in the US. Started in 1990, the project's duration was originally set at 15 years. As with any scientific project, it was not completely clear how all the hurdles would be overcome, and which approach would be the winning one. Seven years into the project, only 1% of the goal had been achieved! Many at the time were loudly demanding that the project be suspended or even abandoned: look, already halfway through and only 1% of the way to the objective! Those more careful, though, the experts in exponential dynamics, were able to see that all was well. Having reached 1% by doubling the amount of base pairs decoded each year over the previous seven years, in seven additional years of doublings the project would reach its goal of 100%: the decoding of the entire human genome. There are many phenomena that are subject to exponential growth: populations and nuclear chain reactions, to name a couple of examples. Populations grow exponentially because, as long as a couple has on average more than two offspring generation after generation, the increase will be cumulative: those offspring will have more children too. Nuclear chain reactions occur when the fissile material, uranium for example, produces among its fission products neutrons that, before exiting the volume of material, make other uranium atoms break apart, producing further neutrons, and so on. Most importantly for the theme of this book, the power of computing and information systems has also been growing exponentially, and has been doing so for over 50 years. However, there is no natural law behind this dynamic, no biological or physical necessity. It is an engineering project that ended up being called, after the person who first formulated it,
Moore's Law.

Gordon Moore was working on the newly invented integrated circuit at the beginning of the 1960s. He was in a noisy environment, in the sense of having to concentrate on the features of a novel phenomenon in the presence of many others going on at the same time. Practical computers had been around for a couple of decades, more or less, becoming more and more powerful, but at a rate that was rather slow if looked at linearly. Different approaches had been tried to make them capable of storing more information for their calculations, and of executing those calculations more rapidly. The vacuum tubes, magnetic core memories, and other components of what the media of the time liked to call "electronic brains" were cumbersome, prone to high failure rates, and needed scores of specialized personnel to take care of them and make sure that they would work. The cost of computers was in the millions of dollars, and only national research programs, or very large corporations, could afford them. The invention of the transistor, used as a basic component for calculation, promised much more reliable and cheaper production, assembly, and running of computers. Transistors could be packaged together with other components to create a useful unit of calculation called the integrated circuit. Not only that, but given their nature, it was possible to forecast the development of next-generation components that would be smaller, faster, and cheaper than the previous ones.

Figure 4: Computer evolution propelled by Moore's law.

Gordon Moore was able to observe what the current capabilities of the production processes were, and how those capabilities had increased in the course of a few years. Based on only a handful of data points, plotted on a piece of graph paper that still survives after 50 years, he boldly formulated the prediction that the number of transistors that could be accommodated on a given integrated circuit would double every year.
A bit later he corrected the prediction to a doubling every two years; the 18-month figure that is commonly quoted and used today combines the growth in the number of transistors with the increase in their speed. Based on how few data points were available to him, this prediction was quite bold, maybe even reckless. With the benefit of hindsight, however, it would appear that this courageous ambition was exactly what was needed. What happened is that, spurred by curiosity, the desire to excel, and basic economic competition, more and more groups of engineers set out to create ever more powerful integrated circuits. Together with all the supporting systems that were needed, they wove together an entire industry. At the beginning this process was driven by the individual capabilities of these groups and what they were able to offer on the market. Later on, however, Moore's Law became itself a driving force, a kind of self-fulfilling prophecy as well as a guidepost against which to measure the achievements of the various groups. Many times it has been predicted that Moore's Law would fail to hold in the next generation, and sooner or later it is bound to do so in its strictest formulation. But if its prediction is extended more generally to the power of computing, there is reason to believe that it can be kept up for a long time still. Moving from silicon to other substrates for the circuits; creating three-dimensional components; moving from architectures that see quantum phenomena as a hindrance to ones that fully exploit them: there are many approaches to overcoming the potential roadblocks that lie ahead, the same way others have been overcome in the past fifty years. It is important to note that the spreading of knowledge is at the basis of Moore's Law. No single group working in secret could hope to be the one that solves the problems that pop up along the road in the next generation of solutions, or the one after that, and so on.
Only the collaboration of many groups makes this possible. It is enough for one of them to achieve a breakthrough, discovering the necessary solution. All the others will then leverage it, through licensing agreements that incorporate the solution into the next generation of fabrication plants churning out integrated circuits, which are today produced by the billions each year. The complex interlocking ecosystem of industrial infrastructure needed to maintain the pace of evolution in computing does not lie only in the production of the integrated circuits themselves. The manufacturing tools that create the circuits have to evolve similarly, as do the software systems that allow engineers to design them, and the financial support that makes the investments possible for the plants, the raw materials and their refinement, and, very importantly, the human capital. Whatever the fundamental physical limits for the growth of computation, as measured by the generalized Moore's Law, they lie far in the future. The progress that we've seen in the past 50 years of increasing computing power is going to be vastly eclipsed by that of the next 50 years. Actually, by the very nature of exponential growth, it is going to be exceeded within the next couple of years; and then again in the couple of years after that, and so on.

Figure 5: Linear progressions can overwhelm exponential trends initially.
How rapidly an exponential sequence develops doesn't matter, of course: there is no need for it to double in a year to be exponential. These are just arbitrary units, and any cumulative change where the quantity increases by an amount proportional to the quantity itself will do. If you have a quantity of 100 and it increases by 10 each step, you'll have 110, then 120, 130, and so on. This is linear growth. But if you have a quantity that increases by 10% each step, you'll have 110, then 121, 133, and so on. That little difference, which doesn't appear to be very significant at the start, is all that matters. That is exponential growth. There are many ways to express this power, and how surprising it is for those accustomed to thinking linearly. Look for example at the sum of the sequence 1, 2, 4. The sum 7 = 1+2+4 is the total amount in the entire sequence, and the next step in it, 8, is larger than the total of all the steps preceding it. This is true at every step of exponential doubling. In the next period of doubling in computing, merely 18 months thanks to Moore's Law, there will be more transistors and integrated circuits created (and computers built from them, and calculations carried out through them) than in the entire history of computing over the past fifty years or more!
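Both growth modes, and the sum property, can be sketched in a few lines using the numbers above:

```python
# Linear growth: add a fixed amount each step.
linear = [100]
for _ in range(3):
    linear.append(linear[-1] + 10)

# Exponential growth: add a fixed percentage of the current value each step.
compound = [100.0]
for _ in range(3):
    compound.append(compound[-1] * 1.10)

print(linear[1:])                            # [110, 120, 130]
print([round(x, 1) for x in compound[1:]])   # [110.0, 121.0, 133.1]

# In a doubling sequence, every step exceeds the sum of all steps before it:
seq = [2 ** k for k in range(13)]            # 1, 2, 4, ..., 4096
assert all(seq[k] > sum(seq[:k]) for k in range(1, 13))
```

The final assertion holds at every step because the sum of 1 + 2 + ... + 2^(k-1) is 2^k - 1, always one less than the next term.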
Another example illustrating the power of exponentials is to look at a closed system, for example a pond sustaining a population of frogs. Suppose an alga that makes the pond uninhabitable for the frogs covers its surface to an ever greater extent, doubling every week, from just a fraction of a percent, to one percent, and onwards. How long do the frogs have left once the algae cover half of the pond? By now, hopefully, the answer is clear: only one week, as during the next doubling the pond will be entirely covered by algae! Even more alarmingly, perhaps, already at 1% coverage there is less than two months' time left for the frogs to flee to another pond, or to find a way to stop the algae from expanding. Our unique position, unlike the frogs', is to be able to see what is happening to the pond. And this capability of data collection, analysis, and foresight gives us great responsibility in understanding whether the pond is OK or not. Taking active measures to manage the pond and counter the algae can't and won't be done by others; but we can do it. The various examples of exponential change we find in nature feed themselves, but seldom assemble into interacting chains that feed each other. Our technological civilization, on the other hand, is full of these self-reinforcing chains that keep the acceleration of exponential change going.
The inventor and author Ray Kurzweil, co-founder of Singularity University, has been collecting data about exponential phenomena for decades. It is not enough, in fact, to be able to recognize what is going on. The explosive nature of exponentials is such that timing is crucial if you want to ride their wave instead of being swept aside by it. Jump on the wave too soon, and those who say otherwise will have an easy time deflating your enthusiasm, or that of your financial backers, because the upswing toward the unit in our example sequences will not yet have happened. See it too late, and the wave will already be at full power when you want to climb it, making the endeavor too expensive, too difficult, or even impossible, as others will have crowded the crests already. From flatbed scanners to optical character recognition, from musical synthesis to speech synthesis and handheld reading systems for the blind, all of Kurzweil's inventions take advantage of a keen understanding of the right timing: when to accelerate research and development, so that by the time supporting hardware systems are available at the right price and the right level of integration, all the other components (software, user interface, development systems, and the entire supporting ecosystem) are ready as well. At the Santa Fe Institute, based on research by Bela Nagy, there is a full database of exponential phenomena that is accessible to be studied and expanded upon further. Kurzweil also recognized that within the interconnected and intercommunicating systems of human knowledge, which do not grow in isolation but reinforce each other, there are exponentials feeding on exponentials. He called the resulting effect the Law of Accelerating Returns.
This is contrary to the received wisdom of classical economics, where it is assumed that in order to achieve a given increase of economic output there needs to be a progressively larger input of capital: the Law of Diminishing Returns. Just as with Moore's Law, the Law of Accelerating Returns formulated by Kurzweil is a self-fulfilling prophecy, sustained by open communication and by competing groups aiming for success and excellence in their research and industrial production endeavors. It is definitely possible to break either of these laws, unlike a physical law: if you stop believing in universal gravitation and jump off the fifth story of a building, you can do it a thousand times, and every time you will drop like a stone, most probably to your death. But if we were to give up trying to make better circuits, or if we decided that it wasn't worth our effort to make better solar panels, better batteries, and so on, then, as long as everybody stopped, those circuits, panels, and batteries would indeed not happen. Kurzweil is, as of this writing, a Director of Engineering at Google, by his own admission the first job he has ever had. Using the resources made available by the company, he is applying his skills to making natural language interaction with computers possible: the next wave of user interaction, making computers even easier to use and better able to serve our needs.
A frequent criticism of Kurzweil's analysis and predictions is based on a misunderstanding of what constitutes the exponential that he is talking about. The critics highlight the fact that what appears to be an exponential is in reality the first half of an S-curve, or logistic curve. It appears exponential initially, as the benefits of a given technology are exploited; after a while, however, it plateaus, because it becomes harder and harder to squeeze additional benefits from the same technology. The technology becomes exhausted, and the belief of those who preach unending exponentials in the power of technology is, supposedly, falsified. This much is true: each individual technology, as it runs its course, is unable to give more than its natural limit. As it approaches that limit, it becomes fruitless to insist on getting more out of it, both from the point of view of engineering and from that of economics and return on investment. And that is why new groups with new ideas will try to achieve the desired output through a different approach. Smart people will see the point in time when the current generation of technologies becomes exhausted, and will work in parallel with the groups leading at the time to find a new technology that will deliver the objective at scale, better than before. Within a few exponential doublings the cycle will repeat itself, a third generation of solutions will be needed, and so on. The cumulative effect of these different S-curves, smoothing out each other's endings and more or less seamlessly interlocking in a chain of invention, innovation, and industrial deployment, is what draws the exponential that Kurzweil points to in his analyses.

Figure 6: A sequence of S-curves draws the exponential trend.
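This chaining of S-curves can be sketched numerically. In the toy model below the generation spacing, ceilings, and steepness are invented for illustration, not fitted to any real technology; the point is only that the sum of staggered logistic curves keeps climbing like an exponential even as each individual curve saturates:

```python
import math

def logistic(t, midpoint, ceiling, steepness=1.0):
    """A single S-curve: slow start, rapid middle, plateau at `ceiling`."""
    return ceiling / (1.0 + math.exp(-steepness * (t - midpoint)))

def stacked(t):
    """Three successive technology generations; each arrives 6 time units
    after the previous one and plateaus 8x higher (illustrative numbers)."""
    return sum(logistic(t, midpoint=6 * g, ceiling=8.0 ** g) for g in range(1, 4))

# While each generation saturates, the combined curve keeps climbing
# roughly tenfold per generation, tracing an exponential envelope:
for t in (4, 10, 16):
    print(round(stacked(t), 1))   # 1.0, 15.7, 131.9
```

Each later term is negligible early on and dominant later, which is exactly how a new generation of technology smooths out the plateau of the previous one.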
Taking computing as an example, there have been many generations of computing technologies, each leading in its own time, which have been pushed to their limits and superseded by the next one: better, cheaper, and faster at generating the desired output, calculations. Mechanical relays, vacuum tubes, transistors, and integrated circuits, decade after decade, enabled the construction of the world's fastest and most powerful computers. The companies employing them were the leaders of their time, pushing the limits of the technologies, and were supplanted by new models based on the new technologies within a few doublings of performance. Another example, where in these years we are witnessing a fundamental switch, is permanent memory storage: the larger and larger amounts of data that our computers need to record persistently, so that when the electricity is turned off and the computer wakes up later on, the data can be retrieved without having to start over from scratch. From punch cards, to magnetic core memory, to magnetic tape, to spinning hard disks, we are now on the verge of moving storage for next-generation needs to solid-state support (flash storage), which is able to memorize orders of magnitude more, accessed faster, more reliably, and more affordably than any previous generation of devices.
Many technologies can be seen through the lens of this exponential interpretation of accelerating change. The doubling periods can of course differ from the 18 months that Moore's Law has accustomed us to rely on. In solar energy we talk about Swanson's law, which describes the decrease in the price per watt of photovoltaic panels. Starting in the mid-1970s, when the first such devices cost over $70 per watt, we have arrived today at around $0.30 per watt, and the decrease in price continues. It comes from economies of scale, from a better understanding of manufacturing processes, from the birth of an ecosystem for the financing, deployment, and servicing of the modules, as well as from new basic approaches to materials and construction methods that substantially increase the efficiency with which a given module transforms sunlight into electricity. There is also a doubling in the storage capacity of our batteries. This doubling has a more sedate (and infuriating, if you feel you spend too much time charging your various power-hungry devices) period of about ten years. Depending on advances in metallurgy, chemistry, and manufacturing processes, it is not unimaginable that, in an illustration of Kurzweil's Law of Accelerating Returns, the industry will find a way to speed up the doublings by adopting a radically new approach, making applications practical that would have been impossible before.
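Swanson's law is commonly stated as a learning curve: each doubling of cumulative shipped module volume cuts the price by roughly 20%. A rough sketch under that assumption, using only the $70 and $0.30 endpoints mentioned above (the 20% learning rate is the usual figure, not something fitted here):

```python
import math

LEARNING_RATE = 0.20   # assumed ~20% price drop per doubling of cumulative volume

def price_after_doublings(start_price, doublings, learning_rate=LEARNING_RATE):
    """Module price after cumulative shipped volume doubles `doublings` times."""
    return start_price * (1.0 - learning_rate) ** doublings

# Doublings of cumulative volume implied by the fall from ~$70/W to ~$0.30/W:
implied = math.log(70.0 / 0.30) / math.log(1.0 / (1.0 - LEARNING_RATE))
print(round(implied, 1))                               # ~24.4 doublings
print(round(price_after_doublings(70.0, implied), 2))  # back to ~0.3
```

Unlike Moore's Law, the clock here is not calendar time but cumulative production: the faster the world deploys panels, the faster the price falls.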
The goals that research programs such as the Human Genome Project set themselves are often somewhat arbitrary. They represent a useful goalpost, but not the end of the development of the processes or their refinement, and certainly not of the desire for knowledge and for the capacity to acquire it faster and more cheaply. After decoding the genome of a single individual, there is the task of doing the same for another seven billion of them. After the human genome, there are the genomes of other animals, of the bacteria in the oceans, of the bacteria that live symbiotically on and in all of us, constituting what is called our microbiome. The capacity to decode the human genome did not stop at the rate of one genome per three billion dollars per fifteen years; we certainly would not have taken much advantage of that. In the fifteen years since that first success, the technologies that have been invented, perfected, deployed, and substituted by better ones again have allowed astonishing progress: today it is possible to decode an entire human genome for about two thousand dollars in a couple of weeks. And progress is not stopping there either: it is possible to forecast the availability, within the next ten years, of technologies that will enable the decoding of a genome for less than ten cents in a fraction of a second. It is worth thinking about the transformations that this kind of change is going to bring to the worlds of healthcare, insurance, privacy, and more. The takeaway is that there is nothing magical about a given threshold that we end up calling a unit, or 100%, and that the power of invention and implementation that drove technologies to achieve it doesn't stop, but keeps going, delivering increases in the desired output at lower costs and faster speeds.
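Taking the figures in the text at face value, a quick calculation shows how fast the cost of sequencing must have been halving over those fifteen years:

```python
import math

# Implied halving rate of genome sequencing cost, using the figures cited
# in the text: ~$3 billion for the first genome, ~$2,000 fifteen years later.
halvings = math.log2(3e9 / 2e3)
months_per_halving = 15 * 12 / halvings

print(round(halvings, 1))            # ~20.5 cost halvings
print(round(months_per_halving, 1))  # ~8.8 months per halving
```

A cost halving roughly every nine months is considerably faster than the 18-month rhythm of Moore's Law, which is why sequencing is often cited as an exponential that outpaces computing itself.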