Integrated circuit


An integrated circuit (IC) is a thin chip, usually coin-sized or smaller, consisting of thousands or millions of interconnected semiconductor devices, mainly transistors, as well as passive components such as resistors. The most advanced integrated circuits are microprocessors, which drive everything from computers to cellular phones to digital microwave ovens. Digital memory chips are another family of integrated circuits that is crucially important to modern society.

The integrated circuit was made possible by mid-twentieth-century advances in semiconductor device fabrication and by experimental discoveries showing that semiconductor devices could perform the functions then performed by vacuum tubes. The integration of large numbers of tiny transistors onto a small chip was an enormous improvement over the hand assembly of circuits built from finger-sized vacuum tubes. The integrated circuit's small size, reliability, fast switching speeds, low power consumption, mass-production capability, and ease of adding complexity quickly pushed vacuum tubes into obsolescence.

Only half a century after their development began, integrated circuits have become ubiquitous. Computers, cellular phones, and other digital appliances are now inextricable parts of the structure of modern societies. Indeed, many scholars believe that the digital revolution brought about by integrated circuits was one of the most significant developments in human history.


Table of contents
1 Fabrication
2 Significance
3 History
4 Further developments
5 Notable integrated circuits
6 Notable manufacturers
7 See also
8 References


Fabrication

Main article: Semiconductor device fabrication.

ICs are fabricated in an almost two-dimensional, bottom-up layering process built around three key steps: imaging (photolithography), deposition, and etching.

The main process steps are supplemented by doping, cleaning and planarisation steps.

A single-crystal silicon wafer (or, for special applications, a silicon-on-sapphire or gallium arsenide wafer) is used as the substrate. Photolithography is used to mark the areas of the substrate to be doped or to have polysilicon or aluminum tracks sputtered on them. (See also semiconductor.) Each device is tested before packaging. The wafer is then diced into small rectangles called dice; each die is connected into a package using gold or aluminum wires welded to pads, usually located around the edge of the die. After packaging, the devices go through a final test on very expensive automated testers, which account for over 25 percent of the cost of fabrication. A fabrication facility, commonly known as a semiconductor fab, currently costs over a billion US dollars to construct, in part because much of the operation is automated. In the most advanced processes, wafers exceed 30 centimeters in diameter (wider than a common dinner plate).


Significance

Integrated circuits can be classified into analog, digital and mixed signal (both analog and digital on the same chip). Digital integrated circuits can contain anything from one to millions of logic gates, flip-flops, multiplexers, etc. in a few square millimeters. The small size of these circuits allows high speed, low power dissipation, and reduced manufacturing cost compared with board-level integration.
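The building blocks named above can be sketched in software. The following illustrative Python model (not from the article) shows how a multiplexer, one of the digital primitives a chip integrates by the millions, is built entirely from NAND gates:

```python
# Illustrative sketch: digital IC primitives (gates, a multiplexer)
# modeled as Python functions on bits. NAND is "universal": every
# other gate here is composed from it, just as chip designers compose
# complex logic from a small library of standard cells.

def nand(a: int, b: int) -> int:
    return 0 if (a and b) else 1

def not_(a: int) -> int:
    return nand(a, a)

def and_(a: int, b: int) -> int:
    return not_(nand(a, b))

def or_(a: int, b: int) -> int:
    return nand(not_(a), not_(b))

def mux2(a: int, b: int, sel: int) -> int:
    """2-to-1 multiplexer: outputs a when sel == 0, b when sel == 1."""
    return or_(and_(a, not_(sel)), and_(b, sel))

print(mux2(1, 0, 0))  # -> 1 (selects input a)
print(mux2(1, 0, 1))  # -> 0 (selects input b)
```

A real chip implements the same truth tables in transistors rather than function calls, but the compositional idea is identical.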

The growth of complexity of integrated circuits follows a trend called "Moore's Law", first observed in 1965 by Gordon Moore, later a co-founder of Intel. Moore's Law states that the number of transistors in an integrated circuit doubles every two years. By the year 2000 the largest integrated circuits contained hundreds of millions of transistors, and the trend shows no signs of slowing down.
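The doubling rule above is simple exponential growth, which a few lines of Python make concrete. The starting figures here (the Intel 4004's roughly 2,300 transistors in 1971) are illustrative numbers, not taken from the article:

```python
# Back-of-the-envelope check of Moore's Law as stated above:
# transistor counts doubling every two years.

def moores_law(start_count: float, start_year: int, year: int,
               doubling_period: float = 2.0) -> float:
    """Projected transistor count after (year - start_year) years."""
    return start_count * 2 ** ((year - start_year) / doubling_period)

# Project from the Intel 4004 (1971, ~2,300 transistors) to 2000:
projected = moores_law(2_300, 1971, 2000)
print(f"{projected:,.0f}")  # tens of millions of transistors
```

The projection lands in the tens of millions, close to the actual transistor counts of flagship microprocessors around 2000, which shows how well the empirical trend held over three decades.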

The integrated circuit is one of the most important inventions of the 20th century. Modern computing, communications, manufacturing, and transportation systems, including the Internet, all depend on its existence.


History

The integrated circuit was first conceived by a radar scientist, Geoffrey W. A. Dummer (born 1909), working for the Royal Radar Establishment of the British Ministry of Defence, who published the idea in Washington, D.C. on May 7, 1952. Dummer unsuccessfully attempted to build such a circuit in 1956.

The first integrated circuits were developed independently by two scientists: Jack Kilby of Texas Instruments filed a patent for a "Solid Circuit" on February 6, 1958, and Robert Noyce of Fairchild Semiconductor was awarded a patent for a more complex "unitary circuit" on April 25, 1961.

Noyce credited Kurt Lehovec of Sprague Electric for the principle of dielectric isolation caused by the action of a p-n junction (the diode) as a key concept behind the IC.


The first integrated circuits contained only a few transistors. Known as "Small-Scale Integration" (SSI) devices, they used circuits containing transistors numbering in the tens.

SSI circuits were crucial to early aerospace projects, and those projects were equally crucial to SSI. Both the Minuteman missile and the Apollo program needed lightweight digital computers for their inertial guidance systems; the Apollo flight computer led and motivated integrated-circuit technology, while the Minuteman missile forced it into mass production.

These programs purchased almost all of the integrated circuits available from 1960 through 1963, and almost single-handedly provided the demand that funded the production improvements that brought costs down from $1,000 per circuit (in 1960 dollars) to merely $25 per circuit (in 1963 dollars).


The next step in the development of integrated circuits, taken in the late 1960s, introduced devices which contained hundreds of transistors on each chip, called "Medium-Scale Integration" (MSI).

They were attractive economically because while they cost little more to produce than SSI devices, they allowed more complex systems to be produced using smaller circuit boards, less assembly work (because of fewer separate components), and a number of other advantages.


Further development, driven by the same economic factors, led to "Large-Scale Integration" (LSI) in the mid 1970s, with tens of thousands of transistors per chip.

LSI circuits began to be produced in large quantities around 1970, for computer main memories and pocket calculators.


The final step in the development process, starting in the 1980s and continuing on, was "Very Large-Scale Integration" (VLSI), with hundreds of thousands of transistors, and beyond (well past several million in the latest stages). The largest chips are sometimes called "Ultra Large-Scale Integration" (ULSI).

For the first time it became possible to fabricate a CPU or even an entire microprocessor on a single integrated circuit. In 1986 the first one-megabit RAM chips were introduced, containing more than one million transistors. Microprocessor chips produced in 1994 contained more than three million transistors.

This step was largely made possible by the codification of "design rules" for the CMOS technology used in VLSI chips, which made production of working devices much more of a systematic endeavour. (See the 1980 landmark text by Carver Mead and Lynn Conway referenced below.)

Further developments

The most extreme integration technique is wafer-scale integration (WSI), which uses whole uncut wafers containing entire computers (processors as well as memory). Attempts to take this step commercially in the 1980s (e.g. by Gene Amdahl) failed, and it does not now seem to be a high priority for industry.

In the 1980s programmable integrated circuits were developed. These devices contain circuits whose logical function and connectivity can be programmed by the user, rather than being fixed by the integrated circuit manufacturer. This allows a single chip to be programmed to implement different LSI-type functions such as logic gates, adders and registers. Current devices, called FPGAs (Field Programmable Gate Arrays), can implement the equivalent of tens of thousands of LSI circuits in parallel and operate at speeds up to 400 MHz.
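The core idea of user-programmable logic can be sketched in a few lines. FPGAs are commonly built from small lookup tables (LUTs) whose contents, set after manufacture, determine the logic function; the Python model below (an illustration, not from the article) shows one piece of "hardware" reprogrammed into two different gates:

```python
# Hedged sketch of the programmable-logic idea: a 2-input lookup
# table (LUT), the typical FPGA building block. "Programming" the
# device means loading a truth table; the same cell can then act as
# any 2-input gate.

class Lut2:
    """A 2-input lookup table indexed by the input bits."""
    def __init__(self, truth_table: list[int]):
        assert len(truth_table) == 4, "need one entry per input pair"
        self.table = truth_table  # entry for inputs (a, b) at (b << 1) | a

    def __call__(self, a: int, b: int) -> int:
        return self.table[(b << 1) | a]

xor_gate = Lut2([0, 1, 1, 0])   # programmed as XOR
and_gate = Lut2([0, 0, 0, 1])   # same structure, reprogrammed as AND

print(xor_gate(1, 0), and_gate(1, 0))  # -> 1 0
```

A real FPGA wires thousands of such cells together through a programmable interconnect, which is what lets one chip stand in for many fixed-function LSI designs.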

The techniques perfected by the integrated circuits industry over the last three decades have been used to create microscopic machines, known as MEMS. These devices are used in a variety of commercial and defense applications, including video projectors, inkjet printers, and the sensors that deploy automobile airbags in accidents.

In the past, radios could not be fabricated in the same low-cost processes as microprocessors. But since 1998, a large number of radio chips have been developed using CMOS processes. Examples include Intel's DECT cordless phone chipset and Atheros's 802.11 wireless networking card.

Notable integrated circuits

Notable manufacturers

See also