  • Page 2 of 5
Computation, Representation, and Dynamics in Neurobiological Systems

For years, researchers have used the theoretical tools of engineering to understand neural systems, but much of this work has been conducted in relative isolation. In Neural Engineering, Chris Eliasmith and Charles Anderson provide a synthesis of the disparate approaches current in computational neuroscience, incorporating ideas from neural coding, neural computation, physiology, communications theory, control theory, dynamics, and probability theory. This synthesis, they argue, enables novel theoretical and practical insights into the functioning of neural systems. Such insights are pertinent to experimental and computational neuroscientists and to engineers, physicists, and computer scientists interested in how their quantitative tools relate to the brain.

The authors present three principles of neural engineering based on the representation of signals by neural ensembles, transformations of these representations through neuronal coupling weights, and the integration of control theory and neural dynamics. Through detailed examples and in-depth discussion, they make the case that these guiding principles constitute a useful theory for generating large-scale models of neurobiological function. A software package written in MATLAB for use with their methodology, along with examples, course notes, exercises, documentation, and other material, is available on the Web.
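As a rough illustration of the first two principles, a population of neurons can represent a scalar value through its tuning curves, and a linear decoder can recover the value from the population's activity. This is a minimal sketch, not the authors' code: the rectified-linear rate model, the random encoders, and all gain and bias parameters are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50  # neurons in the ensemble

# Random preferred directions (+/-1), gains, and biases: illustrative assumptions.
encoders = rng.choice([-1.0, 1.0], size=n)
gains = rng.uniform(0.5, 2.0, size=n)
biases = rng.uniform(-1.0, 1.0, size=n)

def rates(x):
    """Rectified-linear tuning curves: population firing rates for scalar input x."""
    return np.maximum(0.0, gains * (encoders * x) + biases)

# Solve for linear decoders by least squares over sampled inputs.
xs = np.linspace(-1, 1, 201)
A = np.array([rates(x) for x in xs])        # activity matrix, shape (201, n)
d, *_ = np.linalg.lstsq(A, xs, rcond=None)  # decoding weights, shape (n,)

x_hat = rates(0.3) @ d  # decoded estimate of the represented value 0.3
```

Transformations of the represented value (the second principle) amount to solving the same least-squares problem against a target function of `xs` rather than `xs` itself, which determines the coupling weights between ensembles.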

This introductory text offers a clear exposition of the algorithmic principles driving advances in bioinformatics. Accessible to students in both biology and computer science, it strikes a unique balance between rigorous mathematics and practical techniques, emphasizing the ideas underlying algorithms rather than offering a collection of apparently unrelated problems.

The book introduces biological and algorithmic ideas together, linking issues in computer science to biology and thus capturing the interest of students in both subjects. It demonstrates that relatively few design techniques can be used to solve a large number of practical problems in biology, and presents this material intuitively.

An Introduction to Bioinformatics Algorithms is one of the first books on bioinformatics that can be used by students at an undergraduate level. It includes a dual table of contents, organized by algorithmic idea and biological idea; discussions of biologically relevant problems, including a detailed problem formulation and one or more solutions for each; and brief biographical sketches of leading figures in the field. These vignettes offer students a glimpse of the inspirations and motivations for real work in bioinformatics, making the concepts presented in the text more concrete and the techniques more approachable.

PowerPoint presentations, practical bioinformatics problems, sample code, diagrams, demonstrations, and other materials are available at the author's website.

Studies in Neurophilosophy

Progress in the neurosciences is profoundly changing our conception of ourselves. Contrary to time-honored intuition, the mind turns out to be a complex of brain functions. And contrary to the wishful thinking of some philosophers, there is no stemming the revolutionary impact that brain research will have on our understanding of how the mind works.

Brain-Wise is the sequel to Patricia Smith Churchland's Neurophilosophy, the book that launched a subfield. In a clear, conversational manner, this book examines old questions about the nature of the mind within the new framework of the brain sciences. What, it asks, is the neurobiological basis of consciousness, the self, and free choice? How does the brain learn about the external world and about its own introspective world? What can neurophilosophy tell us about the basis and significance of religious and moral experiences?

Drawing on results from research at the neuronal, neurochemical, system, and whole-brain levels, the book gives an up-to-date perspective on the state of neurophilosophy—what we know, what we do not know, and where things may go from here.

Selected Readings in the Philosophy of Perception

The philosophy of perception is a microcosm of the metaphysics of mind. Its central problems—What is perception? What is the nature of perceptual consciousness? How can one fit an account of perceptual experience into a broader account of the nature of the mind and the world?—are at the heart of metaphysics. Rather than try to cover all of the many strands in the philosophy of perception, this book focuses on a particular orthodoxy about the nature of visual perception.

The central problem for visual science has been to explain how the brain bridges the gap between what is given to the visual system and what is actually experienced by the perceiver. The orthodox view of perception is that it is a process whereby the brain, or a dedicated subsystem of the brain, builds up representations of relevant features of the environment on the basis of information encoded by the sensory receptors. Most adherents of the orthodox view also believe that for every conscious perceptual state of the subject, there is a particular set of neurons whose activities are sufficient for the occurrence of that state. Some of the essays in this book defend the orthodoxy; most criticize it; and some propose alternatives to it. Many of the essays are classics. The contributors include, among others, G.E.M. Anscombe, Dana Ballard, Daniel Dennett, Fred Dretske, Jerry Fodor, H.P. Grice, David Marr, Maurice Merleau-Ponty, Zenon Pylyshyn, Paul Snowdon, and P.F. Strawson.

This popular behavioral endocrinology text provides detailed information on what hormones are, how they affect cells, and how such effects can alter the behavior of animals, including humans. Presenting a broad continuum of levels of analysis, from molecular to evolutionary, the book discusses how genes work, the structure of cells, the interactions of endocrine organs, the behavior of individuals, the structure of social hierarchies, and the evolution of mating systems.

The second edition, while maintaining the strengths of the first edition, has been thoroughly revised to reflect recent developments in genetics and molecular biology and related social concerns. It contains four new chapters: on the use of molecular biology techniques in behavioral endocrinology, on psychoneuroimmunology, on hormonal influences on sensorimotor function, and on cognitive function in nonhuman animals.

The Machine Learning Approach

An unprecedented wealth of data is being generated by genome sequencing projects and other experimental efforts to determine the structure and function of biological molecules. The demands and opportunities for interpreting these data are expanding rapidly. Bioinformatics is the development and application of computer methods for the management, analysis, interpretation, and prediction of biological data, as well as for the design of experiments. Machine learning approaches (e.g., neural networks, hidden Markov models, and belief networks) are ideally suited for areas where there is a lot of data but little theory, which is the situation in molecular biology. The goal in machine learning is to extract useful information from a body of data by building good probabilistic models—and to automate the process as much as possible.
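To make the hidden Markov model idea concrete, here is a toy two-state HMM for DNA sequences scored with the forward algorithm. This sketch is not from the book: the states (AT-rich vs. GC-rich regions) and every probability below are invented for illustration, not fitted values.

```python
import numpy as np

# Toy two-state HMM over DNA. States model AT-rich vs. GC-rich regions;
# all probabilities are illustrative assumptions.
symbols = {"A": 0, "C": 1, "G": 2, "T": 3}

start = np.array([0.5, 0.5])            # initial state distribution
trans = np.array([[0.9, 0.1],           # states tend to persist
                  [0.1, 0.9]])
emit = np.array([[0.35, 0.15, 0.15, 0.35],   # AT-rich emission probabilities
                 [0.15, 0.35, 0.35, 0.15]])  # GC-rich emission probabilities

def forward(seq):
    """Forward algorithm: total probability of the observed sequence."""
    alpha = start * emit[:, symbols[seq[0]]]
    for s in seq[1:]:
        alpha = (alpha @ trans) * emit[:, symbols[s]]
    return alpha.sum()

p = forward("GGCGACGT")
```

Training such a model from unlabeled sequences (e.g., with the Baum–Welch algorithm) is exactly the "lots of data, little theory" setting the paragraph describes: the probabilistic model is built directly from the data.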

In this book Pierre Baldi and Søren Brunak present the key machine learning approaches and apply them to the computational problems encountered in the analysis of biological data. The book is aimed both at biologists and biochemists who need to understand new data-driven algorithms and at those with a primary background in physics, mathematics, statistics, or computer science who need to know more about applications in molecular biology.

This new second edition contains expanded coverage of probabilistic graphical models and of the applications of neural networks, as well as a new chapter on microarrays and gene expression. The entire text has been extensively revised.

An Introduction to Neural Network Modeling of the Hippocampus and Learning

This book is for students and researchers who have a specific interest in learning and memory and want to understand how computational models can be integrated into experimental research on the hippocampus and learning. It emphasizes the function of brain structures as they give rise to behavior, rather than the molecular or neuronal details. It also emphasizes the process of modeling, rather than the mathematical details of the models themselves.

The book is divided into two parts. The first part provides a tutorial introduction to topics in neuroscience, the psychology of learning and memory, and the theory of neural network models. The second part, the core of the book, reviews computational models of how the hippocampus cooperates with other brain structures—including the entorhinal cortex, basal forebrain, cerebellum, and primary sensory and motor cortices—to support learning and memory in both animals and humans. The book assumes no prior knowledge of computational modeling or mathematics. For those who wish to delve more deeply into the formal details of the models, there are optional "mathboxes" and appendices. The book also includes extensive references and suggestions for further readings.

Understanding the Mind by Simulating the Brain

The goal of computational cognitive neuroscience is to understand how the brain embodies the mind by using biologically based computational models comprising networks of neuronlike units. This text, based on a course taught by Randall O'Reilly and Yuko Munakata over the past several years, provides an in-depth introduction to the main ideas in the field. The neural units in the simulations use equations based directly on the ion channels that govern the behavior of real neurons, and the neural networks incorporate anatomical and physiological properties of the neocortex. Thus the text provides the student with knowledge of the basic biology of the brain as well as the computational skills needed to simulate large-scale cognitive phenomena.
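As a flavor of what a biologically based neural unit looks like in simulation, here is a minimal leaky integrate-and-fire point neuron. This is a generic sketch, not the book's actual unit equations or software; the time step, membrane time constant, threshold, and input current are all illustrative assumptions.

```python
# Minimal leaky integrate-and-fire point neuron (a generic sketch; all
# constants are illustrative assumptions, not the book's parameters).
dt = 0.001        # time step (s)
tau_m = 0.02      # membrane time constant (s)
v_th = 1.0        # spike threshold
v_reset = 0.0     # reset potential after a spike

def simulate(current, steps=1000):
    """Integrate a constant input current; return the spike times in seconds."""
    v, spikes = 0.0, []
    for t in range(steps):
        v += dt / tau_m * (current - v)   # leaky integration toward the input
        if v >= v_th:                     # threshold crossing: emit a spike
            spikes.append(t * dt)
            v = v_reset                   # reset and integrate again
    return spikes

spikes = simulate(current=1.5)  # suprathreshold input: regular spiking
```

A subthreshold input (here, any current below the threshold of 1.0) settles below threshold and produces no spikes, which is the basic rate-versus-input behavior that network-level models build on.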

The text consists of two parts. The first part covers basic neural computation mechanisms: individual neurons, neural networks, and learning mechanisms. The second part covers large-scale brain area organization and cognitive phenomena: perception and attention, memory, language, and higher-level cognition. The second part is relatively self-contained and can be used separately for mechanistically oriented cognitive neuroscience courses. Integrated throughout the text are more than forty different simulation models, many of them full-scale research-grade models, with friendly interfaces and accompanying exercises. The simulation software (PDP++, available for all major platforms) and simulations can be downloaded free of charge from the Web. Exercise solutions are available, and the text includes full information on the software.

An Algorithmic Approach

In one of the first major texts in the emerging field of computational molecular biology, Pavel Pevzner covers a broad range of algorithmic and combinatorial topics and shows how they are connected to molecular biology and to biotechnology. The book has a substantial "computational biology without formulas" component that presents the biological and computational ideas in a relatively simple manner. This makes the material accessible to computer scientists without biological training, as well as to biologists with limited background in computer science.

Computational Molecular Biology series
Computer science and mathematics are transforming molecular biology from an informational to a computational science. Drawing on computational, statistical, experimental, and technological methods, the new discipline of computational molecular biology is dramatically accelerating the discovery of new technologies and tools for molecular biology. The new MIT Press Computational Molecular Biology series provides a unique venue for the rapid publication of monographs, textbooks, edited collections, reference works, and lecture notes of the highest quality.

Since the late 1960s the Internet has grown from a single experimental network serving a dozen sites in the United States to a network of networks linking millions of computers worldwide. In Inventing the Internet, Janet Abbate recounts the key players and technologies that allowed the Internet to develop; but her main focus is always on the social and cultural factors that influenced the Internet's design and use.

The story she unfolds is an often twisting tale of collaboration and conflict among a remarkable variety of players, including government and military agencies, computer scientists in academia and industry, graduate students, telecommunications companies, standards organizations, and network users. It starts with the early networking breakthroughs formulated in Cold War think tanks and realized in the Defense Department's creation of the ARPANET, and ends with the emergence of the Internet and its rapid and seemingly chaotic growth.

Abbate looks at how academic and military influences and attitudes shaped both networks; how the usual lines between producer and user of a technology were crossed, with interesting and unique results; and how later users invented their own very successful applications, such as electronic mail and the World Wide Web. She concludes that such applications continue the trend of decentralized, user-driven development that has characterized the Internet's entire history, and that the key to the Internet's success has been a commitment to flexibility and diversity, both in technical design and in organizational culture.
