Image Credit: Raimundas | Shutterstock.com
Materials whose electrical conductivity falls between that of a conductor and that of an insulator are known as semiconductors. Although the concept had been discussed before, the first documented observation of semiconductor behavior was made in 1833, when British physicist Michael Faraday noted that the electrical resistance of silver sulfide falls as its temperature rises, the opposite of what happens in metals.
Rectification and Diodes
After the initial discovery of semiconducting materials, the next big breakthrough came with the realization that these materials could be used for rectification, the conversion of alternating current (AC) into direct current (DC).
In 1874, British physicist Arthur Schuster observed rectification in a circuit made of copper wires. Schuster noted that rectification occurred only after the circuit had been left to sit for a while; once he cleaned the ends of the wires, the effect disappeared. The reason was that cleaning the wires removed the layer of copper oxide, a semiconductor, that had formed on their surfaces.
In 1929, German physicist Walter Schottky confirmed that this rectifying effect arises at the junction between a metal and a semiconductor.
The property of rectification makes semiconductors essential to the diode. By permitting an electric current to pass in just one direction, diodes form the basis of radio and TV tuners and LEDs (light-emitting diodes), among other things.
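To make the idea concrete, here is a minimal numerical sketch of half-wave rectification, assuming an idealized diode with no forward voltage drop; the amplitude and frequency are illustrative values, not figures from the article.

```python
import math

def ideal_diode(voltage):
    """Idealized diode: passes positive voltages, blocks negative ones."""
    return max(0.0, voltage)

# 50 Hz AC input sampled over one cycle; the diode turns the alternating
# input into a one-directional (pulsed DC) output.
amplitude, frequency = 5.0, 50.0
for step in range(8):
    t = step / (8 * frequency)  # time within one cycle
    v_in = amplitude * math.sin(2 * math.pi * frequency * t)
    v_out = ideal_diode(v_in)
    print(f"t={t * 1000:5.2f} ms  in={v_in:+5.2f} V  out={v_out:5.2f} V")
```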
Photovoltaics
In addition to being useful at the metal-semiconductor junction, semiconductor materials proved useful at the junction between a semiconductor and an electrolyte, where they produce a photovoltaic effect.
In 1876, William Grylls Adams and Richard Evans Day discovered that shining light on a junction between selenium and platinum could change the direction of an electric current flowing through it. This discovery led to the creation of the first solar cell by Charles Fritts in 1883; its efficiency was less than 1 percent.
Transistors
Semiconductors were also crucial to the development of the transistor. The first semiconductor transistor was built by American physicists John Bardeen and Walter Brattain in 1947. They pressed two closely spaced gold contacts against a block of the semiconductor germanium with a spring. The germanium had a surface layer with an overabundance of electrons, and when an electric signal entered through the gold foil, it injected holes (points lacking electrons), creating a thin layer with an electron deficiency.
Image Credit: NorGal | Shutterstock.com
A modest positive current applied to one of the two contacts affected the current flowing between the other contact and the base on which the germanium was mounted. The physicists saw that a small change in the current at the first contact triggered a much larger change in the current at the second, meaning the device acted as an amplifier.
A staple of modern technology, the transistor acts as either an amplifier or a switch. As an amplifier, a transistor takes an electric current as input and outputs a stronger current. This capacity is particularly useful in devices such as hearing aids, which work by boosting a signal.
Using essentially the same capability, a transistor can serve as a switch: a small electric current flowing through one part of the transistor can turn on a larger current in a different part of the device. This way of using transistors is the basis of computer processors, which contain billions of tiny switches.
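As a rough illustration of both uses, the sketch below treats a transistor as an idealized device: a gain stage that scales its input, and a controlled switch that can be composed into logic gates. The functions and the gain value are hypothetical simplifications, not a physical device model.

```python
# Idealized models of a transistor's two roles: amplifier and switch.
# These are illustrative simplifications, not device physics.

def amplify(input_current: float, gain: float = 100.0) -> float:
    """Amplifier: output is a scaled-up copy of the input signal."""
    return gain * input_current

def switch(base_on: bool, supply_current: float) -> float:
    """Switch: a small base signal gates a larger supply current."""
    return supply_current if base_on else 0.0

def and_gate(a: bool, b: bool) -> bool:
    # Two switches in series: current reaches the output only if both conduct.
    return switch(b, switch(a, 1.0)) > 0.0

print(amplify(0.01))  # 0.01 units in, 1.0 unit out
for a in (False, True):
    for b in (False, True):
        print(f"{a} AND {b} -> {and_gate(a, b)}")
```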
Processing Power
The first iteration of the transistor, known as the point-contact transistor, was highly unstable, and its electrical characteristics were difficult to control. Furthermore, point-contact transistors had to be built one at a time and connected by hand.
The development of the integrated circuit solved those problems and allowed for the creation of the modern computer. Through a planar manufacturing process, multiple transistors could be made and wired together at the same time. In 1962, Fairchild Semiconductor Company was making integrated circuits with about a dozen transistors each.
In a landmark 1965 research paper, Gordon Moore, then at Fairchild, predicted that the number of transistors that could be placed on an integrated circuit would keep doubling at regular intervals thanks to technological advancement. That prediction, later refined to a doubling roughly every two years, came to be known as Moore's Law and has largely held true.
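As a back-of-the-envelope illustration, the sketch below projects transistor counts from the article's 1962 Fairchild figure of about a dozen per chip, assuming the popular two-year doubling period; treating the growth as a perfectly smooth exponential is an idealization of the real, bumpier trend.

```python
# Moore's Law as simple arithmetic:
# count(year) = base_count * 2 ** ((year - base_year) / doubling_years)
# The ~12-transistor starting point in 1962 comes from the article above.

def transistor_count(year: int, base_year: int = 1962,
                     base_count: int = 12, doubling_years: float = 2.0) -> float:
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (1962, 1972, 1992, 2012, 2022):
    print(f"{year}: ~{transistor_count(year):,.0f} transistors per chip")
```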
While computer technology and electronics have advanced by leaps and bounds since the 1960s, those two basic units, the transistor and the integrated circuit, have remained the foundation for all of those advancements.