6.1 Historical Overview

Semiconductor technology is not only an industrial branch of global importance, but has also always been an exciting field of research in which new technologies have been developed continuously since the middle of the twentieth century. Semiconductor devices make it possible to perform digital calculations at speeds that were previously unthinkable, and no fundamental physical barriers have been reached yet. Nowadays they are part of everyday life and will continue to play a crucial role in research and industry, e.g., in consumer electronics, wireless communication, network technologies, microprocessors, memory chips, automotive electronics, microcontrollers, and sensors and actuators.

The history of semiconductor technology, which is nowadays called ULSI (ultra large scale integration) technology, started in the late 1920s and early 1930s, when J.E. Lilienfeld (USA) and O. Heil (Germany) described the predecessor of what is today called the field-effect transistor. Both were granted several patents, but technological challenges delayed the realization of this device [62]. After the Second World War, three researchers at the Bell Laboratories, namely J. Bardeen, W. Brattain, and W. Shockley, demonstrated the first transistor, a point-contact device, in 1947; the bipolar junction transistor followed shortly afterwards.

About ten years later, J. Kilby, an engineer at Texas Instruments, invented the first integrated circuit in the summer of 1958. Only a few months later, in 1959, R. Noyce at Fairchild independently reported a design more closely resembling today's integrated circuits. Based on these inventions, the mass production of various kinds of semiconductor integrated circuits developed in the early 1960s.

The well-known Moore's Law [80], in its commonly cited form, states that the number of devices per chip doubles roughly every eighteen months. This statement of exponential growth in the complexity of integrated circuits has remained basically true for decades, despite several forecasts of its demise. This underlines the dynamic growth and the importance of inventions and innovations in semiconductor technology, although part of the influence of Moore's Law may stem from its being a self-fulfilling prophecy.
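
Expressed as a formula (a simple sketch of the stated doubling behavior, with the doubling period $ T$ taken as the eighteen months quoted above rather than a figure from [80]), the number of devices $ N$ after a time $ t$ grows as

\[
N(t) = N_0 \cdot 2^{t/T}, \qquad T \approx 1.5\,\mathrm{years},
\]

so that, starting from $ N_0$ devices, roughly a hundredfold increase ($ 2^{10/1.5} \approx 10^2$) is implied per decade.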

Technology evolved further in the following decades. While bipolar transistors dominated in the 1950s and 1960s, MOS (metal-oxide-semiconductor) technology began to overtake bipolar technology in the 1970s with respect to functional complexity and the level of integration. The chip complexity of MOS technology was increased by scaling [29,111], and bipolar circuits were rivaled even in the domain of high-speed applications. The 1970s also saw the advent of microprocessors, which, in combination with memory chips, enabled the numerical simulation of complicated physical processes and thus stimulated other technologies as well as numerical mathematics. CMOS (complementary MOS) technology, introduced in 1963, began to replace other technologies during the 1980s and nowadays accounts for about three quarters of the world's semiconductor market.

Early semiconductor devices were based on germanium. Germanium is, however, unsuitable for certain applications because of its high junction leakage currents, which are a consequence of its relatively narrow bandgap of $ 0.66\,\mathrm{eV}$. Hence silicon, with a bandgap of $ 1.1\,\mathrm{eV}$, replaced germanium and is now the prevailing semiconductor material in integrated circuit manufacturing. Other materials, such as gallium arsenide, are used for certain specialized applications.

It is generally expected that semiconductor technology will continue to progress at this pace until at least 2015 [7].
