In 1987, game designer Chris Crawford introduced the concept of process intensity, “the degree to which a program emphasizes processes instead of data.” Process, Crawford explains, involves “algorithms, equations, and branches,” while data refers to “tables, images, sounds, and texts.” A process-intensive program “spends a lot of time crunching numbers; a data-intensive program spends a lot of time moving bytes around.”

For Crawford, process intensity is not only a theoretical frame for understanding the difference between algorithms and information, but also an aesthetic principle. “Processing data is the very essence of what a computer does,” contends Crawford, so using it just to store and move data around is a waste. For this reason, Crawford boldly claims that process intensity is “a useful criterion for evaluating the value of a piece of software.” From word processors to videogames, works with a higher “crunch per bit ratio”—that is, the ones that contain more processing than they do data—are better and more virtuous examples of computational media than those with lower ratios, according to Crawford.
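Crawford's contrast can be made concrete with a toy sketch (mine, not his): two routines that produce the same animation frame, one by looking it up in stored data, one by computing it from a small rule. The names and the frame rule are invented for illustration only.

```python
# Toy illustration of crunch-per-bit (hypothetical, not from Crawford's article).
# Both functions yield identical frames; they differ in where the work happens.

# Data-intensive: every frame is pre-stored; "playback" just moves bytes around.
FRAMES = [[(x + t) % 8 for x in range(8)] for t in range(100)]

def data_intensive_frame(t):
    return FRAMES[t]  # pure lookup, no number crunching

# Process-intensive: frames are derived on demand from a tiny algorithm,
# trading stored data for computation at runtime.
def process_intensive_frame(t):
    return [(x + t) % 8 for x in range(8)]
```

In Crawford's terms, the second version has a far higher crunch-per-bit ratio: it stores almost nothing and computes everything, while the first stores everything and computes almost nothing.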

In his article, Crawford cites the famous 1983 laserdisc game Dragon’s Lair as an example of low process intensity (“its crunch per bit ratio stank,” he deadpans). The game displayed big chunks of animation while performing very little processing on the video data and user input. Crawford offers his own game Balance of Power as a contrasting, desirable, high process intensity specimen. The game simulates Cold War geopolitics by algorithmically analyzing data like insurgency, economics, might, and prestige across many nations in relation to user actions like sending aid, escalating conflict, and backing down. In a book-length manual for the game, Crawford summarizes the four geopolitical processes he hoped the game would emphasize: insurgency, coups d’état, Finlandization, and crises. Dragon’s Lair couples a single process, timing, with a large store of instantial audiovisual assets, whereas Balance of Power highlights many processes operating independently on abstract data sets.

In his 1984 book The Art of Computer Game Design, Crawford had described the same phenomenon as a dichotomy between instantiality and procedurality. Games are instantial when they rely on prerendered, invariable assets over dynamic processes. This distinction was somewhat easier to grasp for a working game developer in the early 1980s, when a game might be limited to 4K–64K in size. Given a choice, filling that space with code instead of data would allow for a larger, denser experience. Such storage constraints are no longer pressing, but the general idea of a relative distribution of processes and assets in a particular work remains a potentially useful perspective on a game’s formal construction.

By the mid-2000s, process intensity had gained renewed attention. Channeling Crawford in a 2006 SIGGRAPH keynote, Greg Costikyan advocated for interactive processes over poly pushing and canned data, on the grounds that instantial games were hobbling the medium. Costikyan lamented that “80+% of the man-hours (and cost) for a game is in the creation of art assets. …In other words, we’ve spent the last three decades focusing on data intensity instead of process intensity.”

Indeed, the cost of those man-hours was becoming impractical. Where aesthetic rationales for procedural approaches hadn’t made much headway, economic imperatives did. The rising costs of AAA game production catalyzed a new interest in procedural methods in game design, most visibly the procedural authoring and gameplay tools of Will Wright’s Spore. But as Costikyan pointed out, procedural content doesn’t necessarily change the process intensity of a game on the gameplay register. Or as the critic Noah Wardrip-Fruin has explained, the central issue is not how much total processing takes place in a computational work, but which works “exhibit a comparative intensity of behavioral processing.”

Put differently, process intensity and instantial intensity look different depending on which part of a computational platform one works with. Like 3D game engines, procedural content generation methods just increase the process intensity of an already data intensive design paradigm. Compare computer animated films like Toy Story and Monsters, Inc. Considerable effort goes into animating characters and generating environmental effects through procedural methods, but the end result is instantial rather than procedural: a series of still images meant for anamorphic projection.
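The authoring-time versus playback-time distinction can be sketched in code (a hypothetical example of my own, with invented names): a generation algorithm runs once during production, its output is baked to static data, and everything downstream is pure lookup, i.e., instantial.

```python
import random

# Hypothetical sketch: procedural generation inside a data-intensive pipeline.
# High process intensity at authoring time; none at playback time.

def generate_heightmap(size, seed):
    """Authoring-time algorithm: derive terrain from a seed, deterministically."""
    rng = random.Random(seed)
    return [[rng.randint(0, 255) for _ in range(size)] for _ in range(size)]

# The result is baked once, like a rendered film frame or a shipped asset.
BAKED_TERRAIN = generate_heightmap(16, seed=7)

def height_at(x, y):
    """Playback time: pure data lookup -- no behavioral processing remains."""
    return BAKED_TERRAIN[y][x]
```

In Wardrip-Fruin's terms, the crunching here is real but not behavioral: by the time a player (or viewer) encounters the work, the process has already collapsed into data.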

Read the rest of the article online at Gamasutra

published May 23, 2012