Transistors and the revolution in electronics

The computer revolution offers a good example of how a vital technology fuelled the Cold War, but also developed a trajectory and momentum of its own, particularly in the capitalist West.

Electronic computers were another spin-off from the Second World War. They were made possible by expertise and technology from the vast British and American radar projects; they were made necessary by the massive and speedy mathematical calculations required in technowar. By the end of 1943, the British government was using an electronic calculator, Colossus, to crack German ciphers at its Bletchley Park code-breaking centre. The first stored-programme computers were built and tested in England in 1948-49. These pioneering machines were essentially mathematical instruments, designed for complicated calculations. During the 1950s, however, their successors were developed as massive data-processors, to replace desk calculators or punched-card systems. They were produced in a big way in the United States by Remington Rand and especially International Business Machines Corporation (IBM), which, by 1964, accounted for 70 per cent of the worldwide inventory of computers, with a value totalling $10 billion.9

8  Nikolai Krementsov, Stalinist Science (Princeton, NJ: Princeton University Press, 1997), 290.

9  Emerson W. Pugh, Building IBM: Shaping an Industry and Its Technology (Cambridge, MA: MIT Press, 1995), 296.


In part, IBM won out through superior customer support and heavy investment in R&D. But government contracts, particularly for the military, made a crucial contribution to establishing IBM as the industry’s giant in the quarter-century after the Second World War. Over half of IBM’s revenues from electronic data processing in the 1950s came from its analog guidance computer for the B-52 bomber and from the Semi-Automatic Ground Environment (SAGE) air defence system - at around $8 billion, the largest and most expensive military project of the 1950s. In 1955, about 20 per cent of IBM’s 39,000 American employees were working on it.10

Yet SAGE is a neglected Cold War story.11 After the Soviet nuclear test in 1949, US Air Force (USAF) planners were alarmed at the vulnerability of the United States to Soviet air attack. To co-ordinate information from radar all over North America, a vast and very sophisticated computing system was needed - operating in real time, extremely reliable, and around the clock. The USAF turned first to MIT, establishing a special research programme there in 1951, which became the famous Lincoln Laboratory near Route 128. Once MIT had designed a feasible system and tested a prototype on Cape Cod, south of Boston, IBM won the contract to build and run the computers for the whole system. The first SAGE direction centre became operational in July 1958, but the whole system was not fully deployed until 1963 - involving twenty-four separate centres, each with two identical computers to permit servicing and prevent any system collapse. Each computer had 60,000 vacuum tubes and occupied an acre of floor space. Later, the vacuum tubes were replaced with magnetic cores, vastly enhancing speed and reliability. SAGE thereby pioneered the random-access core memory that within a few years was routine in all commercial computers. Apart from the financial benefits, SAGE also gave thousands of IBM engineers and programmers their basic training in the business. The experience gained was fully utilised when IBM was asked in 1957 to design a computerised reservations system for American Airlines. Little wonder that Thomas J. Watson, the company’s head, claimed: 'It was the Cold War that helped IBM make itself the king of the computer business.’12 Not until 1959 did IBM’s revenues from commercial electronic computers exceed those from SAGE and other military computing projects.13

10  Kenneth Flamm, Creating the Computer: Government, Industry, and High Technology (Washington, DC: Brookings Institution Press, 1988), 82-90.

11  On SAGE, see the special issue of Annals of the History of Computing, 5, 4 (October 1983), 319-403, and Paul Edwards, The Closed World: Computers and the Politics of Discourse in Cold War America (Cambridge, MA: MIT Press, 1996), ch. 3.

12  Thomas J. Watson, Jr., and Peter Petre, Father, Son & Co.: My Life at IBM and Beyond (London: Bantam Press, 1990), 230-33.



In April 1964, IBM unveiled its System 360 'family’ of computers and peripherals, all using the same software. By the end of the decade, it had captured three-quarters of the world market for mainframe computers. This great leap forward in technology was partly the result of refining the magnetic core memory developed for SAGE. But even more important was the revolution in electronics that made possible, first, the transistor and, then, integrated circuits. Again, the Cold War proved a critical catalyst.

The vacuum tubes used in early televisions and computers were large, fragile, and expensive. But a substitute emerged from wartime work on radar, where electronic tubes could not be used for microwave detection - hence the development of crystals such as germanium and silicon as semiconductors. After the war, Bell Laboratories - the research arm of the telecommunications giant AT&T - employed this wartime knowledge and many radar scientists in the search for a solid-state amplifier. At the end of June 1948, Bell unveiled a prototype called 'the Transistor’, but the announcement was overshadowed by the start of the Berlin blockade. A brief story was relegated to the back of the New York Times under the heading 'News of Radio’.14

Although the first transistor radios were on sale by 1954, the new technology took time to catch on. The industry gradually moved from craft methods - rows of women workers using tweezers - to mass production and, in raw materials, from germanium to the more robust silicon. By 1960, the platform for a commercial industry had been built. But the industry would not have reached that point without military assistance. The transistor was hugely attractive to the armed forces because they needed reliable, lightweight guidance and communications systems in ships, planes, and guided missiles. By 1953, the US military was funding half of Bell Labs’ R&D in transistors. Even more important, it provided large and secure markets. The proportion of US semiconductor production for military use rose from 35 per cent in 1955 to a peak of nearly 48 per cent in 1960. In 1963, transistor sales to the military were worth $119 million, to industry $92 million, and only $41 million to the consumer.15

By the 1960s, the military was spreading its money more widely, to smaller, specialist firms such as Fairchild Semiconductor and Texas Instruments.

13  Pugh, Building IBM, 326, Appendix D.

14  New York Times, 1 July 1948, 46.

15 Ernest Braun and Stuart MacDonald, Revolution in Miniature: The History and Impact of Semiconductor Electronics, 2nd ed. (Cambridge: Cambridge University Press, 1982), 80.


These companies were another sign of the porous nature of the military-industrial-academic complex in the United States (unlike the Soviet Union), and they were also the motor for the next phase in solid-state technology. Between them, Texas Instruments and Fairchild pioneered miniaturisation, replacing separate transistorised components linked in circuits with a single integrated circuit in one piece (or chip) of germanium. The first chips were marketed in 1961. By the end of the decade, integrated circuits had become the norm in electronic products such as digital watches, which flooded the consumer market in the 1970s. But once again, Cold War funding and demand helped at the crucial start-up stage - until 1967 the US military was taking over 50 per cent of chip production, much of it for the new space race.16



 
