In these first few days of a new millennium, we can already see through a window to the future, assured that mankind will witness continuing, accelerating change. It can well be argued that the single most important force bringing the world to its current condition, with more change in the last fifty years than in all prior history, was the advent of information processing by computers. The nexus of metallurgy and chemistry with chip fabrication is not, as some who cite Moore's Law contend, at a cusp; rather, it is poised for another run that will alter every tomorrow.

In 1947 Bell Labs designed the first transistor, but it was Jack Kilby, a new hire at Texas Instruments in May 1958, who conceived and then built the first integrated circuit in September of that year, whereupon the company filed for a patent in February 1959. The world often forgets Kilby because in January 1959 a similar idea occurred to Robert Noyce at Fairchild Semiconductor, which filed for a patent in July 1959; Noyce's approach differed notably in that he printed the connections between components (resistors, capacitors, transistors, and diodes) on a silicon wafer, whereas Kilby had used solder and wires. The first chips were made in 1961, and by 1964 a chip contained 32 transistors; today a chip can contain 28 million devices. It was in 1965 that Dr. Gordon Moore, then research director at Fairchild and, three years later, a founder of Intel, observed that the complexity and number of devices on a chip would double about every year and a half, giving rise to "Moore's Law," which has held true to this day. But that law will certainly change for computers as we know them.
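To get a feel for how quickly such doubling compounds, consider a minimal sketch in Python of Moore's Law as a simple exponential rule. The 18-month doubling period and the 1964 starting figure of 32 transistors come from the discussion above; the projection itself is only illustrative arithmetic, not a record of actual chip counts, which have varied from the simple rule.

    # Illustrative only: project device counts under a Moore's Law-style
    # 18-month doubling, starting from the 32-transistor chip of 1964.
    def devices_on_chip(year, base_year=1964, base_count=32, doubling_years=1.5):
        """Return the projected number of devices under steady doubling."""
        doublings = (year - base_year) / doubling_years
        return base_count * 2 ** doublings

    for year in (1964, 1970, 1980, 1990, 2000):
        print(f"{year}: ~{devices_on_chip(year):,.0f} devices")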

Chip features made by lithography are now as small as 180 nanometers (billionths of a meter) and are projected to shrink to 150 nanometers this year and to 100 nanometers by 2005. At these tiny sizes, materials problems appear, including some that limit making features any smaller. For example, the "dopants" mixed into silicon to hold and localize electrical charge stop working reliably when only a few atoms occupy such small volumes and clump together, and the "gates" that control electron flow, now under two nanometers thick, allow electrons to pass through (tunnel) even when the gates are shut.
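A rough back-of-the-envelope count shows why so few dopant atoms end up in each device. The doping concentration and dimensions below are assumed, round-number values chosen for illustration, not figures from the text.

    # Rough illustration (assumed round numbers): how few dopant atoms sit
    # in the active region of a very small transistor.
    doping_per_cm3 = 1e18      # assumed dopant concentration, atoms per cubic cm
    side_nm = 50               # assumed cube-shaped active region, 50 nm on a side

    side_cm = side_nm * 1e-7   # 1 nm = 1e-7 cm
    volume_cm3 = side_cm ** 3
    atoms = doping_per_cm3 * volume_cm3
    print(f"~{atoms:.0f} dopant atoms in a {side_nm} nm cube")  # ~125 atoms

With counts this small, the exact number and placement of the atoms in each device matter, which is why clumping becomes a problem.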

The amazing ability to make faster, smaller, and cheaper computers may be reaching its limits as we know them. Last year US companies spent nearly $250 billion on computers and peripherals, more than they invested in plants, vehicles, or any other kind of durable equipment. This was done to improve productivity. Recall that after World War II average annual productivity growth ran at 3%, a pace that lasted until 1973, when it fell, for no known reason, to 1.1%. In 1995 it rose again, to 2.2%, also without a clear explanation, but the consensus thesis is that computerization drove this pattern, because annual computer price deflation deepened from minus 10% to minus 25% over the last five years of the decade. Now, however, computer makers appear to be approaching a technical wall that could deal US industrial productivity growth a crushing blow.

But not to worry because human innovation has new approaches for the dilemma. The third millennium holds promises for optical, biological, quantum, and molecular computers, any of which could be orders of magnitude more powerful and far cheaper than todayns machines. Remember that computing really is controlled on/off switching to represent a one or zero and there can be many ways to achieve that goal. The biological computer could be composed of billions upon billions of programmable bacteria in a self-organizing system where genetic, one-bit switches or "flip-flops" are built from two genes that are mutually antagonistic. When one is active it turns the other off and vice versa, alternating between states when stimulated by an external influence. Best of all, from a single programmed cell, billions more can be grown inexpensively. The Lawrence Berkeley Laboratory and three universities (MIT, Boston, and Rockefeller) lead the work.
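To make the flip-flop idea concrete, here is a minimal sketch of a two-gene toggle of the kind described above, simulated with simple mutual-repression equations: each gene's product suppresses the other, and a brief external stimulus flips which one "wins." The rate constants and the stimulus pulse are assumed values for illustration, not parameters from the laboratories named here.

    # Minimal sketch of a genetic toggle switch: two genes, u and v, each
    # repressing the other. A brief external pulse flips which gene is "on",
    # so the pair behaves like a one-bit flip-flop. All constants are
    # illustrative assumptions, not measured values.
    def simulate(steps=4000, dt=0.01, alpha=10.0, n=2.0):
        u, v = 5.0, 0.1          # start with gene u "on", gene v "off"
        for step in range(steps):
            # apply a transient stimulus partway through to flip the switch
            boost_v = 8.0 if 1500 <= step < 1700 else 0.0
            du = alpha / (1.0 + v ** n) - u
            dv = alpha / (1.0 + u ** n) - v + boost_v
            u += du * dt
            v += dv * dt
        return u, v

    u, v = simulate()
    print(f"final state: u={u:.2f}, v={v:.2f}")  # v ends high, u low: the bit flipped

Because each stable state reinforces itself, the switch holds its bit after the stimulus is removed, which is exactly what a memory element must do.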

Maybe closer to functionality, but still on the horizon, is the quantum world, where subatomic particles that exhibit "spin" can be manipulated by magnetic fields. A quantum bit (qubit) can represent a one and a zero simultaneously, but that state is fragile, so maintaining "coherence," once established, is paramount. Recall that manipulating and reading nuclear spin is a mature technology (think of nuclear magnetic resonance, or NMR, at a hospital), developed using isotopes of the elements. But to date qubit spectrometry, which is not fast by computing standards, remains at the proof-of-concept stage. Schools such as MIT, UCB, Harvard, and Stanford; the NIST and Los Alamos federal laboratories; and firms such as IBM are working these problems.
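As a toy illustration of what "a one and a zero simultaneously" means, the sketch below represents a single qubit as a two-entry state vector and applies a Hadamard gate to place it in an equal superposition. This is textbook linear algebra offered only for illustration, not a model of the NMR experiments mentioned above.

    # Toy illustration of a qubit in superposition (textbook linear algebra,
    # not a model of any specific NMR experiment).
    import numpy as np

    zero = np.array([1.0, 0.0])              # |0>, the classical "off" state
    hadamard = np.array([[1, 1],
                         [1, -1]]) / np.sqrt(2)

    state = hadamard @ zero                  # equal superposition of |0> and |1>
    probabilities = np.abs(state) ** 2       # chance of reading 0 or 1 on measurement
    print(f"amplitudes: {state}, probabilities: {probabilities}")  # 50/50

Decoherence, the problem noted above, is what happens when stray interactions with the environment destroy those amplitudes before a computation finishes.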

Synthesized molecules from Rice University, fabricated for a Yale University project, pack vastly more memory onto a chip than silicon can, so that even trillions of switches, each a few nanometers in diameter, can reside there. A monolayer of organic molecules sandwiched between metal electrodes forms the basis for the switches, many of which can be added to or combined with conventional chips, often using carbon nanotubes as "wires" in a hybrid array. Researchers at Harvard, Rice, Yale, Colorado, and Delft, plus firms such as IBM and Hewlett-Packard, are working the issues.
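A quick order-of-magnitude check makes the "trillions of switches" figure plausible. The chip area and device spacing below are assumed round numbers chosen for illustration, not measurements from the projects named above.

    # Rough order-of-magnitude check (assumed round numbers): how many
    # molecular switches, each a few nanometers across, fit on one chip.
    chip_area_cm2 = 1.0        # assumed chip area of one square centimeter
    pitch_nm = 10              # assumed spacing of one switch every 10 nm

    pitch_cm = pitch_nm * 1e-7
    switches = chip_area_cm2 / (pitch_cm ** 2)
    print(f"~{switches:.0e} switches per chip")  # ~1e12, about a trillion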

These concepts, and others closer to the fringe, will bridge the materials gaps between today's silicon-based chip computers and what will be needed in the coming years. The world will change through these innovations, and it must if the future is to be assured.