In these first few days of a new millennium, we can already see through a window to the future, assured that mankind will witness continuing, accelerating change. It is arguable that the single most important cause of current conditions, in which the world has evolved faster in the last fifty years than in all prior history, was the advent of information processing by computers. The nexus of metallurgy and chemistry with chip fabrication is not, as some who cite Moore's Law contend, at a cusp; rather, it is poised for another run that will alter every tomorrow.
In 1947 Bell Labs invented the first transistor, but it was Jack Kilby, a new hire at Texas Instruments, who conceived the first integrated circuit in May 1958 and built it that September, whereupon the company filed for a patent in February 1959. The world often forgets Kilby because in January 1959 a similar idea occurred to Robert Noyce at Fairchild Semiconductor, which filed for a patent in July 1959; Noyce's design was notably different because he printed the connections between components (resistors, capacitors, transistors and diodes) on a silicon wafer, whereas Kilby used solder and wires. The first chip was made in 1961, and by 1964 a chip contained 32 transistors; today a chip can contain 28 million devices. It was in 1965 that Dr. Gordon Moore, then research director at Fairchild and three years later a founder of Intel, observed that the complexity and number of devices on a chip would double every year and a half, giving rise to "Moore's Law," which has held true to this day. But that law will certainly change for computers as we know them.
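As a back-of-the-envelope illustration of that doubling rule (not drawn from the article itself), the short Python sketch below projects device counts forward from the 32-transistor chip of 1964, assuming the commonly quoted 18-month doubling period; the function name and the sample years are purely illustrative.

# Illustrative sketch only: projects device counts under a "doubling
# every 18 months" reading of Moore's Law. The 1964 starting point of
# 32 devices comes from the text above; everything else is hypothetical.

def projected_devices(start_count: int, start_year: int, target_year: int,
                      doubling_period_years: float = 1.5) -> float:
    """Return the device count implied by repeated doubling."""
    doublings = (target_year - start_year) / doubling_period_years
    return start_count * 2 ** doublings

if __name__ == "__main__":
    for year in (1964, 1970, 1980, 1990, 2000):
        print(year, round(projected_devices(32, 1964, year)))

Running the sketch simply shows how quickly repeated doubling compounds over a few decades; it is not a claim about the actual device counts of any particular chip.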