Today, women earn a relatively low percentage of computer science degrees and hold proportionately few technical computing jobs. Meanwhile, the stereotype of the male “computer geek” seems to be everywhere in popular culture. Few people know that women were a significant presence in the early decades of computing in both the United States and Britain. Indeed, programming in postwar years was considered woman’s work (perhaps in contrast to the more manly task of building the computers themselves).
Like all great social and technological developments, the "computer revolution" of the twentieth century didn't just happen. People—not impersonal processes—made it happen. In The Computer Boys Take Over, Nathan Ensmenger describes the emergence of the technical specialists—computer programmers, systems analysts, and data processing managers—who helped transform the electronic digital computer from a scientific curiosity into the most powerful and ubiquitous technology of the modern era.
No company of the twentieth century achieved greater success and engendered more admiration, respect, envy, fear, and hatred than IBM. Building IBM tells the story of that company—how it was formed, how it grew, and how it shaped and dominated the information processing industry. Emerson Pugh presents substantial new material about the company in the period before 1945 as well as a new interpretation of the postwar era.
When we think of the Internet, we generally think of Amazon, Google, Hotmail, Napster, MySpace, and other sites for buying products, searching for information, downloading entertainment, chatting with friends, or posting photographs. In the academic literature about the Internet, however, these uses are rarely covered. The Internet and American Business fills this gap, picking up where most scholarly histories of the Internet leave off—with the commercialization of the Internet established and its effect on traditional business a fact of life.
Between 1946 and 1957, computing moved from a preliminary, developmental stage to widespread use, accompanied by the birth of the digital computer industry. During this crucial decade, spurred by rapid technological advances, the computer enterprise became a major phenomenon.
In The Government Machine, Jon Agar traces the mechanization of government work in the United Kingdom from the nineteenth to the early twenty-first century. He argues that this transformation has been tied to the rise of "expert movements," groups whose authority has rested on their expertise. The deployment of machines was an attempt to gain control over state action—a revolutionary move.
This engaging history covers modern computing from the development of the first electronic digital computer through the dot-com crash. The author concentrates on five key moments of transition: the transformation of the computer in the late 1940s from a specialized scientific instrument to a commercial product; the emergence of small systems in the late 1960s; the beginning of personal computing in the 1970s; the spread of networking after 1985; and, in a chapter written for this edition, the period 1995-2001.
From its first glimmerings in the 1950s, the software industry has evolved to become the fourth largest industrial sector of the US economy. Starting with a handful of software contractors who produced specialized programs for the few existing machines, the industry grew to include producers of corporate software packages and then makers of mass-market products and recreational software. This book tells the story of each of these types of firm, focusing on the products they developed, the business models they followed, and the markets they served.
This is the story of an extraordinary effort by the U.S. Department of Defense to hasten the advent of "machines that think." From 1983 to 1993, the Defense Advanced Research Projects Agency (DARPA) spent an extra $1 billion on computer research aimed at achieving artificial intelligence. The Strategic Computing Initiative (SCI) was conceived as an integrated plan to promote computer chip design and manufacture, computer architecture, and artificial intelligence software.
This history of computing focuses not on chronology (what came first and who deserves credit for it) but on the actual architectures of the first machines that made electronic computing a practical reality. The book covers computers built in the United States, Germany, England, and Japan. It makes clear that similar concepts were often pursued simultaneously and that the early researchers explored many architectures beyond the von Neumann architecture that eventually became canonical. The contributors include not only historians but also engineers and computer pioneers.
After World War II, other major industrialized nations responded to the technological and industrial hegemony of the United States by developing their own design and manufacturing competence in digital electronic technology. In this book John Vardalas describes the quest for such competence in Canada, exploring the significant contributions of the civilian sector but emphasizing the role of the Canadian military in shaping radical technological change.
This book presents an organizational and social history of one of the foundational projects of the computer era: the development of the SAGE (Semi-Automatic Ground Environment) air defense system, from its first test at Bedford, Massachusetts, in 1951, to the installation of the first unit of the New York Air Defense Sector of the SAGE system, in 1958. The idea for SAGE grew out of Project Whirlwind, a wartime computer development effort, when the U.S. Department of Defense realized that the Whirlwind computer might anchor a continent-wide advance warning system.
Howard Hathaway Aiken (1900-1973) was a major figure of the early digital era. He is best known for his first machine, the IBM Automatic Sequence Controlled Calculator or Harvard Mark I, conceived in 1937 and put into operation in 1944. But he also made significant contributions to the development of applications for the new machines and to the creation of a university curriculum for computer science.
With the cooperation of Robert V. D. Campbell. This collection of technical essays and reminiscences is a companion volume to I. Bernard Cohen's biography, Howard Aiken: Portrait of a Computer Pioneer. After an overview by Cohen, Part I presents the first complete publication of Aiken's 1937 proposal for an automatic calculating machine, which was later realized as the Mark I, as well as recollections of Aiken's first two machines by Robert Campbell, the chief engineer in charge of construction of Mark II, and Richard Bloch, the principal programmer of Mark I.
John von Neumann (1903-1957) was unquestionably one of the most brilliant scientists of the twentieth century. He made major contributions to quantum mechanics and mathematical physics and in 1943 began a new and all-too-short career in computer science. William Aspray provides the first broad and detailed account of von Neumann's many different contributions to computing.
The first attempts to mechanize the production of numerical tables were remarkable in conception, coming at a time when a "computer" was in fact a person rather than a machine. This book is the first to provide a unified picture of the difference engines that were the mechanical predecessors of today's digital computer, to emphasize them as part of the history of numerical tables, and to give equal weight to the technical and social aspects of their creation.
In this personal memoir, electrical engineer David Lundstrom recalls the heyday of early computing—the rise of Control Data out of the Univac division of Sperry Rand, such milestone computer systems as the Univac and the Naval Tactical Data System, the exploits of CDC's top designer Seymour Cray, and the gradual corporate shift from the exciting and technically interesting world of computer design to internal politics and clumsy bureaucracy.
In this engrossing biography, Dorothy Stein strips away the many layers of myth surrounding Ada Lovelace's reputation as the inventor of the science of computer programming to reveal a story far more dramatic and fascinating than previous accounts have indicated. Working with original sources, Stein clears up a number of puzzles and misinterpretations of Ada's life and activities.
Given the proliferation of computers and computer-related products, it is difficult to realize that the computer's history is, in fact, a very short one. René Moreau, Director of Scientific Development, IBM France, traces the evolution of the computer from its earliest stored-program stages in the 1940s to the introduction of the IBM 360 Series in 1963.
In describing the technical experiences of one company from the beginning of the computer era, this book unfolds the challenges that IBM's research and development laboratories faced, the technological paths they chose, and how these choices affected the company and the computer industry.