Pursuits

There are challenges

Future computing challenges will make present computers seem as clunky as an old mainframe.

by Nelson King
It’s late autumn. To quote Keats: “The sedge has wither’d from the lake, and no birds sing.” Whatever the weather, it has not been a very pleasant autumn for reasons that need no elaboration. It’s also the time when media folk (I guess I’m one of ’em) are asked to summarize the year gone by, look ahead to the coming year, or both. The assignment isn’t very original, but perhaps because it helps re-establish a familiar framework in an unsettling time, it has value.
There are challenges and then there are challenges. We know about many challenges that followed the events of Sept. 11. Most of these challenges are not positive, for in a better world we wouldn’t have to face them. Right now I’ll write about positive challenges, problems and tasks we face willingly, even optimistically.
Upholding Moore’s Law
Many people in computing, especially those on the hardware side of the business, feel unspoken angst about the eventual end of Moore’s Law (the number of transistors per square inch of integrated circuits will double every 18 months). They fear we will no longer see faster and more powerful processors from the likes of Intel and AMD on such a regular basis. Of course, there are commercial reasons for this concern: the computer industry generally doesn’t make money from standing still. But it’s more than that. The fact that the industry has been able to maintain the 18-month pace for more than three decades has made Moore’s Law a benchmark of general technological progress.
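To make the 18-month doubling concrete, here is a small arithmetic sketch (my own illustration, not from the column) that projects transistor counts forward from the 42-million figure cited later in this piece:

```python
# A sketch of Moore's Law: transistor count doubling every 18 months.
# The starting figure (42 million transistors, circa 2001) comes from
# later in this column; the projection is just arithmetic, not a forecast.

def transistors(start_count, years, doubling_period_years=1.5):
    """Project a transistor count after `years`, given a fixed doubling period."""
    return start_count * 2 ** (years / doubling_period_years)

count_2001 = 42_000_000  # Intel's latest CPU at the time of writing
for years in (3, 6, 9):
    projected = transistors(count_2001, years)
    print(f"{2001 + years}: ~{projected / 1e6:.0f} million transistors")
```

Run for nine years, the doubling compounds to a 64-fold increase, which is why even a modest slip in the doubling period matters so much to the industry.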
Setting aside the details of why we should want computing power to increase at such a rate, the impulse behind much of the effort to avoid the end of Moore’s Law seems to be simply that we can increase performance. It has not been easy, however, and the future promises a much greater challenge. It’s not that we can’t see possibilities for continuing Moore’s Law, but we’re not sure which of the several possible technologies will be feasible, practical, and, of course, profitable.
Eeny, meeny, miney, moe
At the moment, there are a number of technologies waiting in the wings that will someday be household words. The wings in this case are laboratories scattered all over the world. These technologies are, in no particular order: nanocomputing, optical computing, organic computing, and quantum computing.
While the immediate future of computers is dominated by manufacturing processes that etch transistor circuitry of diminishing dimensions into layers of silicon, most engineers and scientists predict, within the next five to 10 years, limits to both the reduction in size of circuits and the speed of required data transfers. Of course, hard-disk technology was pronounced growth-challenged many years ago, and now we can treat ourselves to 100GB drives for a couple of C-notes. So we might as well throw advances in silicon-based technology into the mix of options.
Nevertheless, it’s widely believed that within this decade (and certainly the next) progress will require new integrated-circuit technology, and it’s probably going to come from one or more of the nano, optical, organic, or quantum fields of research. So let’s have a brief look at each of these technologies, along with a bit of an appraisal.
Up the nanotubes
It’s easy to confuse nanotechnology, organic semiconductors, and biological computers. I’m not even sure scientists keep it straight. However, the hottest area of research for computing technology focuses on organic semiconductors, a branch of nanotechnology; they are organic only in the sense that they are based on carbon. This research differs from biological computing, which uses organic principles such as DNA encoding.
Nanotechnology covers a lot more than computing, but one part of it, stemming from the discovery and study of nanotubes, has already been used for computer operations in labs at IBM, NEC, and Lucent Technologies. Nanotubes are a form of carbon whose atoms arrange themselves into molecule-sized tubes with various electrical properties. These properties enable nanotubes to behave like transistors and act like tiny wires. To put things into perspective, Intel’s latest CPU uses a 180-nanometer (0.18-micron) process and contains some 42 million transistors. Nanotubes can form transistors just 1 or 2 nanometers across.
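A back-of-the-envelope calculation (my own arithmetic, not the column's) shows what that shrink is worth: transistor density scales roughly with the square of the linear feature-size reduction.

```python
# Rough area-density comparison between a 180 nm silicon process and a
# hypothetical 1-2 nm nanotube process. Density scales approximately with
# the square of the linear shrink factor; this ignores real-world layout
# overhead, so treat the numbers as order-of-magnitude only.

def density_gain(old_nm, new_nm):
    """Approximate transistor-density gain from a linear feature-size shrink."""
    return (old_nm / new_nm) ** 2

print(density_gain(180, 2))  # 180 nm vs. a 2 nm nanotube process
print(density_gain(180, 1))  # 180 nm vs. a 1 nm nanotube process
```

Even the conservative 2-nanometer case suggests a density gain of several thousandfold over the silicon process described above.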
Nanotubes have other benefits. For example, they can be used in manufacturing processes without the need for expensive clean rooms. Theoretically, this kind of nanotechnology can take us to much smaller and more powerful processors that are less expensive to build. But at the moment, there’s still a big jump to be made from proof of concept in the lab to the nitty-gritty of a production line. Making this jump cannot be taken for granted, though this one is getting close, perhaps within five years.
Computing on the light side
We’re already using optical fiber to transmit computer data, which gives this technology a leg up for use in ultra-thin wiring for integrated circuits. Intel, among others, is researching various aspects of optical (light-based) materials both for connections between devices and, eventually, for wiring on the CPU itself. The immediate carrot drawing the research is the small size of optical fiber, but the bigger prize is the potential for inexpensive manufacturing. The time horizon for this technology to emerge from the lab and into production is also around five years. Because of its familiarity, it stands a good chance of rapid adoption.
Carbon-based computer forms
Biological computing does not mean breeding computers. It simply means that computer elements are made of carbon instead of silicon and operate on principles more familiar to biologists than to electrical engineers. The essence of the approach, and the focus of most current research, is to use the properties of DNA to encode, store, and process information in ways analogous to computing. This research is much further out than most, in part because of the need for crossover understanding of both the biological and electronic worlds. At the moment, it seems that biological computers may never approach the speed or miniaturization of silicon hardware, but they may offer vast opportunities for low-cost, highly flexible computing.
The tip of the qubit
So far, I’ve mostly been writing about things that are small and fast. Let’s wind up with something really small and fast: quantum computing. With this technology, the properties of individual atoms perform the functions of transistors. Individual electrons represent the binary states of “on” and “off,” as in a traditional computer, but there’s a huge bonus: quantum components can occupy a mixed state (one part on, one part off) called a quantum bit, or qubit. Without going into the math, the mixed state provides a huge increase in computing power, on top of the small size and incredible speed.
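A hint of the math behind that bonus (a sketch of my own, not from the column): because each qubit can mix “on” and “off,” a register of n qubits is described by 2ⁿ amplitudes at once, so its state space grows exponentially with its size.

```python
# Why qubits scale so dramatically: an n-qubit register spans 2**n
# classical basis states simultaneously. This just enumerates those
# basis states; it does not simulate quantum mechanics.

import itertools

def basis_states(n_qubits):
    """List the classical basis states an n-qubit register can mix together."""
    return ["".join(bits) for bits in itertools.product("01", repeat=n_qubits)]

print(basis_states(2))        # ['00', '01', '10', '11']
print(len(basis_states(3)))   # 8 amplitudes for just 3 qubits
```

Add a qubit and the count doubles; at 30 qubits the register already spans more than a billion basis states, which is the source of the “huge increase” described above.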
Researchers have been able to perform quantum computations (not build full processors) in the lab. Even for extremely crude operations, a roomful of very expensive equipment is required. This suggests that commercially viable uses of the technology are a long way off, at least 10 years. Ten years is a long time in computing, but a relatively short time in human history. The idea that we may begin to see molecular- and atomic-sized computing devices before, say, 2025 should boggle our minds. This is miniaturization that recalls the old question of how many angels can dance on the head of a pin.
Picking up the gauntlet
By now, we should have learned not to dismiss even the most blue-sky estimates of what certain technologies may accomplish. Although it’s something like the lottery in that the chances of success are very small for any one person, the fact remains that somebody wins. Likewise, with this kind of science, it’s very difficult to predict a particular winner (or combination of winners); but the fact remains that the most amazing things have been discovered and made practical.
In the next four installments of “Pursuits,” I’ll explore the outer edges of research in computing and communications: processors and advanced computing; peripherals and communications; software and complexity; and networks and integration. What is being developed carries important implications: social and ethical questions, along with problems of economics and matters of purpose and philosophy. Now is a better time than most to examine and re-examine the leading technologies of our age. Looking at the challenges ahead implies a certain amount of optimism, something we can use right now.