Integrated circuits, or microchips, are tiny electronic circuits built on semiconductor material and found in virtually every modern appliance. They consist of transistors, resistors, diodes, and capacitors that work together to perform a specific function. The first integrated circuit was demonstrated in 1958, and the technology has since progressed through several generations, doubling in complexity roughly every two years in accordance with Moore’s Law. Integrated circuits are now faster, smaller, and more ubiquitous than ever, with the semiconductor industry producing more than 267 billion chips annually as of 2008.
An integrated circuit (IC), commonly known as a silicon chip, computer chip, or microchip, is a miniature electronic circuit rendered on a sliver of semiconductor material, typically silicon but sometimes sapphire. A modern integrated circuit can house millions of transistors on a die just 5 millimeters (about 0.2 inches) square and 1 millimeter (0.04 inches) thick. Thanks to this tiny size and incredible processing power, integrated circuits are found in virtually every modern appliance and device, from credit cards, computers, and cell phones to satellite navigation systems, traffic lights, and airplanes.
In essence, an integrated circuit is a composite of various electronic components, namely transistors, resistors, diodes, and capacitors, which are arranged and connected to perform a specific function. Each unit in this “team” of electronic components has a unique role within the integrated circuit: the transistor acts as a switch, determining the “on” or “off” state of the circuit; the resistor controls the flow of electric current; the diode allows current to flow only when a certain condition in the circuit is met; and the capacitor stores electric charge and releases it in a sustained burst.
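As a rough illustration of these roles, here is a toy Python sketch. It is not a real circuit simulator; all class names and values are hypothetical, chosen only to mirror the behaviors just described.

```python
# Toy models of the four component roles described above.
# Illustrative only; these are hypothetical classes, not a circuit simulator.

class Transistor:
    """Acts as a switch: passes a signal only when the gate is on."""
    def __init__(self):
        self.gate_on = False

    def output(self, volts: float) -> float:
        return volts if self.gate_on else 0.0

class Resistor:
    """Controls current flow according to Ohm's law (I = V / R)."""
    def __init__(self, ohms: float):
        self.ohms = ohms

    def current(self, volts: float) -> float:
        return volts / self.ohms

class Diode:
    """Conducts only when its condition is met: a positive (forward) voltage."""
    def conducts(self, volts: float) -> bool:
        return volts > 0

class Capacitor:
    """Stores charge (Q = C * V) and releases it all at once."""
    def __init__(self, farads: float):
        self.farads = farads
        self.charge = 0.0

    def store(self, volts: float) -> None:
        self.charge = self.farads * volts

    def release(self) -> float:
        released, self.charge = self.charge, 0.0
        return released

# Example: a transistor gates a 5 V signal through a 1 kΩ resistor.
t, r = Transistor(), Resistor(1000.0)
t.gate_on = True
print(r.current(t.output(5.0)))  # 0.005 A, i.e. 5 mA
```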
The first integrated circuit was demonstrated by Texas Instruments employee Jack Kilby in 1958. This prototype, measuring approximately 11.1 by 1.6 millimeters, consisted of a strip of germanium and a single transistor. The advent of silicon, coupled with the ever-shrinking size of integrated circuits and the rapid increase in the number of transistors per millimeter, meant that integrated circuits proliferated massively and gave rise to the modern computing age.
From its inception in the 1950s to the present day, integrated circuit technology has gone through various “generations,” now commonly referred to as small-scale integration (SSI), medium-scale integration (MSI), large-scale integration (LSI), and very-large-scale integration (VLSI). These successive generations trace an arc of design progress that illustrates the foresight of Intel co-founder Gordon Moore, who formulated “Moore’s Law” in the 1960s, which states that integrated circuits double in complexity every two years.
This doubling of complexity is borne out by the generational shifts in technology, which have seen SSI’s dozens of transistors rise to MSI’s hundreds, then LSI’s tens of thousands, and finally VLSI’s millions. The next frontier integrated circuits promise to cross is that of ultra-large-scale integration (ULSI), which involves the deployment of billions of microscopic transistors and has already been heralded by the Intel project codenamed Tukwila, believed to contain more than two billion transistors.
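A quick back-of-the-envelope check shows how doubling every two years spans these generations. The starting count and dates below are illustrative assumptions, not historical data; only the doubling rule itself comes from Moore’s Law as stated above.

```python
# Sanity check of the generational progression under Moore's Law:
# start from dozens of transistors and double every two years.
# The 1964 start date and count of 50 are illustrative assumptions.

def moores_law(start_count: float, start_year: int, year: int) -> float:
    """Projected transistor count, doubling every two years."""
    return start_count * 2 ** ((year - start_year) / 2)

for year in (1964, 1972, 1984, 2000, 2008):
    print(year, round(moores_law(50, 1964, year)))
# 1964 ->         50   (SSI: dozens)
# 1972 ->        800   (MSI: hundreds)
# 1984 ->     51,200   (LSI: tens of thousands)
# 2000 -> 13,107,200   (VLSI: millions)
# 2008 -> ~210 million (approaching ULSI's billions)
```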
If further proof of the enduring accuracy of Moore’s dictum is needed, one need only look at the modern integrated circuit, which is faster, smaller, and more ubiquitous than ever before. As of 2008, the semiconductor industry produced more than 267 billion chips annually, a figure projected to rise to 330 billion by 2012.