
Philosophy of Science

Information and Living Systems: Philosophical and Scientific Perspectives

Information shapes biological organization in fundamental ways and at every organizational level. Because organisms use information (including DNA codes, gene expression, and chemical signaling) to construct, maintain, repair, and replicate themselves, it would seem only natural to use information-related ideas in our attempts to understand the general nature of living systems, the causality by which they operate, the difference between living and inanimate matter, and the emergence, in some biological species, of cognition, emotion, and language. And yet philosophers and scientists have been slow to do so. This volume fills that gap. Information and Living Systems offers a collection of original chapters in which scientists and philosophers discuss the informational nature of biological organization at levels ranging from the genetic to the cognitive and linguistic.

The chapters examine not only familiar information-related ideas intrinsic to the biological sciences but also broader information-theoretic perspectives used to interpret their significance. The contributors represent a range of disciplines, including anthropology, biology, chemistry, cognitive science, information theory, philosophy, psychology, and systems theory, thus demonstrating the deeply interdisciplinary nature of the volume’s bioinformational theme.

How do novel scientific concepts arise? In Creating Scientific Concepts, Nancy Nersessian seeks to answer this central but virtually unasked question in the problem of conceptual change. She argues that the popular image of novel concepts and profound insight bursting forth in a blinding flash of inspiration is mistaken. Instead, novel concepts are shown to arise out of the interplay of three factors: an attempt to solve specific problems; the use of conceptual, analytical, and material resources provided by the cognitive-social-cultural context of the problem; and dynamic processes of reasoning that extend ordinary cognition.

Focusing on the third factor, Nersessian draws on cognitive science research and historical accounts of scientific practices to show how scientific and ordinary cognition lie on a continuum, and how problem-solving practices in one illuminate practices in the other. Her investigations of scientific practices show conceptual change as deriving from the use of analogies, imagistic representations, and thought experiments, integrated with experimental investigations and mathematical analyses. She presents a view of constructed models as hybrid objects, serving as intermediaries between targets and analogical sources in bootstrapping processes. Extending these results, she argues that these complex cognitive operations and structures are not mere aids to discovery, but that together they constitute a powerful form of reasoning—model-based reasoning—that generates novelty. This new approach to mental modeling and analogy, together with Nersessian's cognitive-historical approach, makes Creating Scientific Concepts equally valuable to cognitive science and philosophy of science.

A Bradford Book

Discovering Complexity: Decomposition and Localization as Strategies in Scientific Research

In Discovering Complexity, William Bechtel and Robert Richardson examine two heuristics that guided the development of mechanistic models in the life sciences: decomposition and localization. Drawing on historical cases from disciplines including cell biology, cognitive neuroscience, and genetics, they identify a number of "choice points" that life scientists confront in developing mechanistic explanations and show how different choices result in divergent explanatory models. Describing decomposition as the attempt to differentiate functional and structural components of a system and localization as the assignment of responsibility for specific functions to specific structures, Bechtel and Richardson examine the usefulness of these heuristics as well as their fallibility—the sometimes false assumption underlying them that nature is significantly decomposable and hierarchically organized.

When Discovering Complexity was originally published in 1993, few philosophers of science perceived the centrality of seeking mechanisms to explain phenomena in biology, relying instead on the model of nomological explanation advanced by the logical positivists (a model Bechtel and Richardson found to be utterly inapplicable to the examples from the life sciences in their study). Since then, mechanism and mechanistic explanation have become widely discussed. In a substantive new introduction to this MIT Press edition of their book, Bechtel and Richardson examine both philosophical and scientific developments in research on mechanistic models since 1993.

Functions in Biological and Artificial Worlds: Comparative Philosophical Perspectives

The notion of function is an integral part of thinking in both biology and technology; biological organisms and technical artifacts are both ascribed functionality. Yet the concept of function is notoriously obscure (with problematic issues regarding the normative and the descriptive nature of functions, for example) and demands philosophical clarification. So too does the relationship between biological organisms and technical artifacts: although entities of one kind are often described in terms of the other (as in the machine analogy for biological organisms or the evolutionary account of technological development), the parallels between the two break down at certain points. This volume takes on both issues and examines the relationship between organisms and artifacts from the perspective of functionality. Believing that the concept of function is the root of an accurate understanding of biological organisms, technical artifacts, and the relation between the two, the contributors take an integrative approach, offering philosophical analyses that embrace both biological and technical fields of function ascription. They aim at a better understanding not only of the concept of function but also of the similarities and differences between organisms and artifacts as they relate to functionality. Their ontological, epistemological, and phenomenological comparisons will clarify problems that are central to the philosophies of both biology and technology.

Contributors: Paul Sheldon Davies, Maarten Franssen, Wybo Houkes, Yoshinobu Kitamura, Peter Kroes, Ulrich Krohs, Tim Lewens, Andrew Light, Françoise Longy, Peter McLaughlin, Riichiro Mizoguchi, Mark Perlman, Beth Preston, Giacomo Romano, Marzia Soavi, Pieter E. Vermaas

How to Build a Person: A Prolegomenon

Building a person has been an elusive goal in artificial intelligence. This failure, John Pollock argues, reflects the essentially philosophical nature of the problems involved: what is needed for the construction of a person is a physical system that mimics human rationality. Pollock describes an exciting theory of rationality and its partial implementation in OSCAR, a computer system whose descendants will literally be persons.

In developing the philosophical superstructure for this bold undertaking, Pollock defends the conception of man as an intelligent machine and argues that mental states are physical states and persons are physical objects, as described in the fable of Oscar, the self-conscious machine.

Pollock brings a unique blend of philosophy and artificial intelligence to bear on the vexing problem of how to construct a physical system that thinks, is self-conscious, has desires, fears, intentions, and a full range of mental states. He brings together an impressive array of technical work in philosophy to drive theory construction in AI. The result is described in his final chapter on "cognitive carpentry."

A Bradford Book

Emergence: Contemporary Readings in Philosophy and Science

Emergence, largely ignored just thirty years ago, has become one of the liveliest areas of research in both philosophy and science. Fueled by advances in complexity theory, artificial life, physics, psychology, sociology, and biology and by the parallel development of new conceptual tools in philosophy, the idea of emergence offers a way to understand a wide variety of complex phenomena in ways that are intriguingly different from more traditional approaches. This reader collects, for the first time in one easily accessible place, classic writings on emergence from contemporary philosophy and science. The chapters, by such prominent scholars as John Searle, Steven Weinberg, William Wimsatt, Thomas Schelling, Jaegwon Kim, Robert Laughlin, Daniel Dennett, Herbert Simon, Stephen Wolfram, Jerry Fodor, Philip Anderson, and David Chalmers, cover the major approaches to emergence. Each of the three sections ("Philosophical Perspectives," "Scientific Perspectives," and "Background and Polemics") begins with an introduction putting the chapters into context and posing key questions for further exploration. A bibliography lists more specialized material, and an associated website (http://mitpress.mit.edu/emergence) links to downloadable software and to other sites and publications about emergence.

Contributors: P. W. Anderson, Andrew Assad, Nils A. Baas, Mark A. Bedau, Mathieu S. Capcarrère, David Chalmers, James P. Crutchfield, Daniel C. Dennett, J. Doyne Farmer, Jerry Fodor, Carl Hempel, Paul Humphreys, Jaegwon Kim, Robert B. Laughlin, Bernd Mayer, Brian P. McLaughlin, Ernest Nagel, Martin Nilsson, Paul Oppenheim, Norman H. Packard, David Pines, Steen Rasmussen, Edmund M. A. Ronald, Thomas Schelling, John Searle, Robert S. Shaw, Herbert Simon, Moshe Sipper, Steven Weinberg, William Wimsatt, and Stephen Wolfram

The Artificial and the Natural: An Evolving Polarity

Genetically modified food, art in the form of a phosphorescent rabbit implanted with jellyfish DNA, and robots that simulate human emotion would seem to be evidence for the blurring boundary between the natural and the artificial. Yet because the deeply rooted concept of nature functions as a cultural value, a social norm, and a moral authority, we cannot simply dismiss the distinction between art and nature as a nostalgic relic. Disentangling the cultural roots of many current debates about new technologies, the essays in this volume examine notions of nature and art as they have been defined and redefined in Western culture, from the Hippocratic writers’ ideas of physis and technē and Aristotle’s designation of mimetic arts to nineteenth-century chemistry and twenty-first-century biomimetics. These essays, by specialists of different periods and various disciplines, reveal that the division between nature and art has been continually challenged and reassessed in Western thought. In antiquity, for example, mechanical devices were seen as working “against nature”; centuries later, Descartes not only claimed the opposite but argued that nature itself was mechanical. Nature and art, the essays show, are mutually constructed, defining and redefining themselves, partners in a continuous dance over the centuries.

Contributors: Bernadette Bensaude-Vincent, Horst Bredekamp, John Hedley Brooke, Dennis Des Chene, Alan Gabbey, Anthony Grafton, Roald Hoffmann, Thomas DaCosta Kaufmann, William R. Newman, Jessica Riskin, Heinrich von Staden, Francis Wolff, Mark J. Schiefsky

Bernadette Bensaude-Vincent is Professor of History at the University of Paris X. She is the author of A History of Chemistry and other books. William R. Newman is Ruth Halls Professor of History and Philosophy of Science at Indiana University, Bloomington. He is the coeditor of Secrets of Nature (MIT Press, 1999) and author or editor of several other books.

Integrating Evolution and Development: From Theory to Practice

The twentieth century’s conceptual separation of the process of evolution (changes in a population as its members reproduce and die) from the process of development (changes in an organism over the course of its life) allowed scientists to study evolution without bogging down in the “messy details” of development. Advances in genetics produced the modern synthesis, which cast the gene as the unit of natural selection. The modern synthesis, however, has had its dissenters (among them Stephen Jay Gould), and there is now growing interest in the developmental synthesis (also known as evo-devo), which integrates the study of evolution and development. This collection offers a history of the developmental synthesis, argues for its significance, and provides specific case studies of its applications, ranging from evolutionary psychology to the evolution of culture. Widespread interest in the developmental synthesis is a relatively new phenomenon. Scientists don’t yet know whether the revisions to evolutionary theory resulting from the findings of evo-devo will be modest, with the developmental synthesis serving as a supplement to evolutionary theory, or far-reaching, amounting to a fundamental theoretical rethinking of evolution itself. The chapters in Integrating Evolution and Development not only make a case for the importance of the developmental synthesis but also make significant contributions to this fast-growing field of study.

Contributors: Werner Callebaut, James R. Griesemer, Paul E. Griffiths, Manfred D. Laubichler, Jane Maienschein, Gerd B. Müller, Stuart A. Newman, H. Frederik Nijhout, Roger Sansom, Gerhard Schlosser, William C. Wimsatt

Roger Sansom is Assistant Professor of Philosophy at Texas A&M University. Robert Brandon is Professor of Philosophy and Biology at Duke University and the coeditor of Genes, Organisms, Populations: Controversies over the Units of Selection (MIT Press, 1984).

Modeling Biology: Structures, Behaviors, Evolution

Abstract and conceptual models have become an indispensable tool for analyzing the flood of highly detailed empirical data generated in recent years by advanced techniques in the biosciences. Scientists are developing new modeling strategies for analyzing data, integrating results into the conceptual framework of theoretical biology, and formulating new hypotheses. In Modeling Biology, leading scholars investigate new modeling strategies in the domains of morphology, development, behavior, and evolution. The emphasis on models in the biological sciences has been accompanied by a new focus on conceptual issues and a more complex understanding of epistemological concepts. Contributors to Modeling Biology discuss models and modeling strategies from the perspectives of philosophy, history, and applied mathematics. Individual chapters discuss specific approaches to modeling in such domains as biological form, development, and behavior. Finally, the book addresses the modeling of these properties in the context of evolution, with a particular emphasis on the emerging field of evolutionary developmental biology (or evo-devo).

Contributors: Giorgio A. Ascoli, Chandrajit Bajaj, James P. Collins, Luciano da Fontoura Costa, Kerstin Dautenhahn, Nigel R. Franks, Scott Gilbert, Marta Ibañes Miguez, Juan Carlos Izpisúa-Belmonte, Alexander S. Klyubin, Thomas J. Koehnle, Manfred D. Laubichler, Sabina Leonelli, James A. R. Marshall, George R. McGhee Jr., Gerd B. Müller, Chrystopher L. Nehaniv, Karl J. Niklas, Lars Olsson, Eirikur Palsson, Daniel Polani, Diego Rasskin-Gutman, Hans-Jörg Rheinberger, Alexei V. Samsonovich, Jeffrey C. Schank, Harry B. M. Uylings, Jaap van Pelt, Iain Werry

Manfred D. Laubichler is Assistant Professor in the School of Life Sciences at Arizona State University. He is the coeditor of From Embryology to Evo-Devo (MIT Press, 2007). Gerd B. Müller is Professor and Head of the Department of Theoretical Biology at the University of Vienna. He is a coeditor of Origination of Organismal Form (MIT Press, 2003) and Environment, Development, Evolution (MIT Press, 2003).

Conditionals

This book by distinguished philosopher Nicholas Rescher seeks to clarify the idea of what a conditional says by elucidating the information that is normally transmitted by its utterance. The result is a unified treatment of conditionals based on epistemological principles rather than the semantical principles in vogue over recent decades. This approach, argues Rescher, makes it easier to understand how conditionals actually function in our thought and discourse. In its concern with what language theorists call pragmatics (the study of the norms and principles governing our use of language in conveying information), Conditionals steps beyond the limits of logic as traditionally understood and moves into the realm claimed by theorists of artificial intelligence as they try to simulate our actual information-processing practices.

The book’s treatment of counterfactuals essentially revives an epistemological approach proposed by F. P. Ramsey in the 1920s and developed by Rescher himself in the 1960s, but since overshadowed by the now-dominant possible-worlds approach. Rescher argues that the increasingly evident liabilities of the possible-worlds strategy make a reappraisal of the older style of analysis both timely and desirable. As the book makes clear, an epistemological approach demonstrates that counterfactual reasoning, unlike inductive inference, is not a matter of abstract reasoning alone but one of good judgment and common sense.
