In the 1930s a series of seminal works published by Alan Turing, Kurt Gödel, Alonzo Church, and others established the theoretical basis for computability. This work, advancing precise characterizations of effective, algorithmic computability, was the culmination of intensive investigations into the foundations of mathematics. In the decades since, the theory of computability has moved to the center of discussions in philosophy, computer science, and cognitive science.
The vast majority of all email sent every day is spam, a variety of idiosyncratically spelled requests to provide account information, invitations to spend money on dubious products, and pleas to send cash overseas. Most of it is caught by filters before ever reaching an in-box. Where does it come from?
Today, women earn a relatively low percentage of computer science degrees and hold proportionately few technical computing jobs. Meanwhile, the stereotype of the male "computer geek" seems to be everywhere in popular culture. Few people know that women were a significant presence in the early decades of computing in both the United States and Britain. Indeed, programming in postwar years was considered woman's work (perhaps in contrast to the more manly task of building the computers themselves).
The history of computing could be told as the story of hardware and software, or the story of the Internet, or the story of “smart” hand-held devices, with subplots involving IBM, Microsoft, Apple, Facebook, and Twitter. In this concise and accessible account of the invention and development of digital technology, computer historian Paul Ceruzzi offers a broader and more useful perspective.
Long ago, in 1985, personal computers came in two general categories: the friendly, childish game machine used for fun (exemplified by Atari and Commodore products); and the boring, beige adult box used for business (exemplified by products from IBM). The game machines became fascinating technical and artistic platforms that were of limited real-world utility. The IBM products were all utility, with little emphasis on aesthetics and no emphasis on fun. Into this bifurcated computing environment came the Commodore Amiga 1000.
In the first three and a half years of its existence, Fairchild Semiconductor developed, produced, and marketed the device that would become the fundamental building block of the digital world: the microchip. Founded in 1957 by eight former employees of the Shockley Semiconductor Laboratory, Fairchild created the model for a successful Silicon Valley start-up: intense activity with a common goal, close collaboration, and a quick path to the market (Fairchild's first device hit the market just ten months after the company's founding).
Like all great social and technological developments, the "computer revolution" of the twentieth century didn't just happen. People—not impersonal processes—made it happen. In The Computer Boys Take Over, Nathan Ensmenger describes the emergence of the technical specialists—computer programmers, systems analysts, and data processing managers—who helped transform the electronic digital computer from a scientific curiosity into the most powerful and ubiquitous technology of the modern era.
Global warming skeptics often fall back on the argument that the scientific case for global warming is all model predictions, nothing but simulation; they warn us that we need to wait for real data, "sound science." In A Vast Machine Paul Edwards has news for these skeptics: without models, there are no data. Today, no collection of signals or observations—even from satellites, which can "see" the whole planet with a single instrument—becomes global in time and space without passing through a series of data models.
A Hollywood biopic about the life of computer pioneer Grace Murray Hopper (1906–1992) would go like this: a young professor abandons the ivy-covered walls of academia to serve her country in the Navy after Pearl Harbor and finds herself on the front lines of the computer revolution. She works hard to succeed in the all-male computer industry, is almost brought down by personal problems but survives them, and ends her career as a celebrated elder stateswoman of computing, a heroine to thousands, hailed as the inventor of computer programming.
In The Allure of Machinic Life, John Johnston examines new forms of nascent life that emerge through technical interactions within human-constructed environments—"machinic life"—in the sciences of cybernetics, artificial life, and artificial intelligence. With the development of such research initiatives as the evolution of digital organisms, computer immune systems, artificial protocells, evolutionary robotics, and swarm systems, Johnston argues, machinic life has achieved a complexity and autonomy worthy of study in its own right.