
Scientific & Engineering Computation


Complex communicating computer systems—computers connected by data networks and in constant communication with their environments—do not always behave as expected. This book introduces behavioral modeling, a rigorous approach to behavioral specification and verification of concurrent and distributed systems. It is among the very few techniques capable of modeling systems interaction at a level of abstraction sufficient for the interaction to be understood and analyzed.

Have you ever wondered how your GPS can find the fastest way to your destination, selecting one route from seemingly countless possibilities in mere seconds? How your credit card account number is protected when you make a purchase over the Internet? The answer is algorithms. And how do these mathematical formulations translate themselves into your GPS, your laptop, or your smart phone? This book offers an engagingly written guide to the basics of computer algorithms.

Quantum Information Processing: A Gentle Introduction

The combination of two of the twentieth century’s most influential and revolutionary scientific theories, information theory and quantum mechanics, gave rise to a radically new view of computing and information. Quantum information processing explores the implications of using quantum mechanics instead of classical mechanics to model information and its processing. Quantum computing is not about changing the physical substrate on which computation is done from classical to quantum but about changing the notion of computation itself, at the most basic level.

Much of the difficulty in creating information technology systems that truly meet people’s needs lies in the problem of pinning down system requirements. This book offers a new approach to the requirements challenge, based on modeling and analyzing the relationships among stakeholders. Although the importance of the system-environment relationship has long been recognized in the requirements engineering field, most requirements modeling techniques express the relationship in mechanistic and behavioral terms.

This text is a guide to the foundations of method engineering, a developing field concerned with the definition of techniques for designing software systems. The approach is based on metamodeling, the construction of a model about a collection of other models. The book applies the metamodeling approach in five case studies, each describing a solution to a problem in a specific domain. Suitable for classroom use, the book is also useful as a reference for practitioners.

Quantum Computing without Magic: Devices

This text offers an introduction to quantum computing, with a special emphasis on basic quantum physics, experiment, and quantum devices. Unlike many other texts, which tend to emphasize algorithms, Quantum Computing without Magic explains the requisite quantum physics in some depth, and then explains the devices themselves.

Using OpenMP: Portable Shared Memory Parallel Programming

"I hope that readers will learn to use the full expressibility and power of OpenMP. This book should provide an excellent introduction to beginners, and the performance section should help those with some experience who want to push OpenMP to its limits."
—from the foreword by David J. Kuck, Intel Fellow, Software and Solutions Group, and Director, Parallel and Distributed Solutions, Intel Corporation
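
To give a sense of the constructs the book teaches, here is a minimal sketch of an OpenMP parallel loop with a reduction clause. It is an illustrative example under common assumptions (a C compiler with OpenMP support, e.g. gcc -fopenmp), not an excerpt from the book:

    #include <stdio.h>
    #include <omp.h>

    /* Illustrative sketch: parallelize a loop and combine per-thread
       partial sums with a reduction clause. Compile with -fopenmp. */
    int main(void) {
        const int n = 1000000;
        double sum = 0.0;

        /* Iterations are divided among threads; each thread accumulates
           a private copy of sum, which OpenMP adds together at the end. */
        #pragma omp parallel for reduction(+:sum)
        for (int i = 0; i < n; i++) {
            sum += 1.0 / (i + 1);
        }

        printf("harmonic sum = %f (max threads: %d)\n",
               sum, omp_get_max_threads());
        return 0;
    }

The reduction clause is the idiomatic way to avoid a data race on the shared accumulator; the performance-tuning material the foreword refers to builds on such basic worksharing constructs.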

The minimum description length (MDL) principle is a powerful method of inductive inference, the basis of statistical modeling, pattern recognition, and machine learning. It holds that the best explanation, given a limited set of observed data, is the one that permits the greatest compression of the data. MDL methods are particularly well-suited for dealing with model selection, prediction, and estimation problems in situations where the models under consideration can be arbitrarily complex, and overfitting the data is a serious concern.
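
In its simplest (two-part) form, the principle can be stated as a model-selection criterion. The formulation below is the standard textbook version, given here only to make the compression idea concrete, not an excerpt from the book:

    \hat{M} = \arg\min_{M \in \mathcal{M}} \left[ L(M) + L(D \mid M) \right]

Here L(M) is the number of bits needed to describe the model M itself, and L(D | M) is the number of bits needed to describe the data D once M is given. A model that is too complex to pay for itself in data compression is penalized through L(M), which is precisely the safeguard against overfitting mentioned above.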

Use of Beowulf clusters (collections of off-the-shelf commodity computers programmed to act in concert, resulting in supercomputer performance at a fraction of the cost) has spread far and wide in the computational science community. Many application groups are assembling and operating their own "private supercomputers" rather than relying on centralized computing centers. Such clusters are used in climate modeling, computational biology, astrophysics, and materials science, as well as non-traditional areas such as financial modeling and entertainment.

Scalable Input/Output: Achieving System Balance
Edited by Daniel A. Reed

As we enter the "decade of data," the disparity between the vast amount of data storage capacity (measurable in terabytes and petabytes) and the bandwidth available for accessing it has created an input/output bottleneck that is proving to be a major constraint on the effective use of scientific data for research.
