
Coming Soon

Information, E-Government, and Exchange

The computer systems of government agencies are notoriously complex. New technologies are piled on older technologies, creating layers that call to mind an archaeological dig. Obsolete programming languages and closed mainframe designs offer barriers to integration with other agency systems. Worldwide, these unwieldy systems waste billions of dollars, keep citizens from receiving services, and even—as seen in interoperability failures on 9/11 and during Hurricane Katrina—cost lives.

In this book, Omer Preminger investigates how the obligatory nature of predicate-argument agreement is enforced by the grammar. Preminger argues that an empirically adequate theory of predicate-argument agreement requires recourse to an operation whose obligatoriness is a grammatical primitive, not reducible to representational properties, but whose successful culmination is not enforced by the grammar.

Cognitive Representations as Graphical Models

Our ordinary, everyday thinking requires an astonishing range of cognitive activities, yet our cognition seems to take place seamlessly. We move between cognitive processes with ease, and different types of cognition seem to share information readily. In this book, David Danks proposes a novel cognitive architecture that can partially explain two aspects of human cognition: its relatively integrated nature and our effortless ability to focus on the relevant factors in any particular situation.

Building the Interactive Web

Adobe Flash began as a simple animation tool and grew into a multimedia platform that offered a generation of creators and innovators an astonishing range of opportunities to develop and distribute new kinds of digital content. For the better part of a decade, Flash was the de facto standard for dynamic online media, empowering amateur and professional developers to shape the future of the interactive Web. In this book, Anastasia Salter and John Murray trace the evolution of Flash into one of the engines of participatory culture.

The New York Times declared 2012 to be “The Year of the MOOC” as millions of students enrolled in massive open online courses (known as MOOCs), millions of investment dollars flowed to the companies making them, and the media declared MOOCs to be earth-shaking game-changers in higher education. During the inevitable backlash that followed, critics highlighted MOOCs’ high dropout rate, the low chance of earning back initial investments, and the potential for any earth-shaking game change to make things worse instead of better.

Deleuze Monument

Part-text, part-sculpture, part-architecture, part-junk heap, Thomas Hirschhorn’s often monumental but precarious works offer a commentary on the spectacle of late-capitalist consumerism and the global proliferation of commodities. Made from ephemeral materials—cardboard, foil, plastic bags, and packing tape—that the artist describes as “universal, economic, inclusive, and [without] any plus-value,” these works also engage issues of justice, power, and moral responsibility.

To limit climate change, the Intergovernmental Panel on Climate Change (IPCC) estimates that greenhouse gas emissions will need to fall by about forty percent by 2030.

Art and Conflict in the 21st Century
Edited by Peter Weibel

Today political protest often takes the form of spontaneous, noninstitutional, mass action. Mass protests during the Arab Spring showed that established systems of power—in that case, the reciprocal support between Arab dictators and Western democracies—can be interrupted, at least for a short moment in history. These new activist movements often use online media to spread their message.

Category theory was invented in the 1940s to unify and synthesize different areas in mathematics, and it has proven remarkably successful in enabling powerful communication between disparate fields and subfields within mathematics. This book shows that category theory can be useful outside of mathematics as a rigorous, flexible, and coherent modeling language throughout the sciences.

Sparse modeling is a rapidly developing area at the intersection of statistical learning and signal processing, motivated by the age-old statistical problem of selecting a small number of predictive variables in high-dimensional datasets. This collection describes key approaches in sparse modeling, focusing on its applications in fields including neuroscience, computational biology, and computer vision.
