I’ve been watching reactions to Apple’s controversial decision to prohibit the publication of iPhone applications created in environments other than Apple’s own.
The policy has a number of implications, including the fact that iPhone apps are prevented from running any interpreted code. For example, Apple removed the iPhone Scratch player from the App Store, because it runs programs created in Scratch, a popular programming-education tool for kids.
One victim of the change is Flash. Adobe had created a Flash exporter for iPhone, which now clearly violates the new terms. Adobe has since scrapped the exporter. After a flood of angry responses online, Steve Jobs penned his own response.
Faced with Apple’s decision, programmers have many choices. For example, Cocoa programmers will have to decide if Apple’s policies sit well with them. If not, they may have to choose another platform, one that won’t be programmable in Objective-C. Likewise, Flash developers who want to make software for iPhone will have to decide if they’re willing to move over to Cocoa, or if they want to opt for another mobile platform like Android.
No matter what one chooses, and no matter how one reads Apple’s intentions, there’s something perhaps even more insidious going on among the programming public. Specifically, a large number of developers seem to think that they have the right to make software for the iPhone (or for anything else) in Flash, or in another high-level environment of their choosing. Literally, the right, not just the convenience or the opportunity. And many of them are quite churlish about the matter. (A few among many examples can be found in text and especially comments at 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11)
This strikes me as a very strange sort of attitude to adopt. There’s no question that Flash is useful and popular, and it has a large and committed user base. There’s also no question that it’s often convenient to be able to program for different platforms using environments one already knows. And likewise, there’s a long history of creating OS stubs or wrappers or other sorts of gizmos to make it possible to run code “alien” to a platform in a fashion that makes it feel more native.
But what does it say about the state of programming practice writ large when so many developers believe that their “rights” are trampled because they cannot write programs for a particular device in a particular language? Or that their “freedom” as creators is squelched for the same reason?
I wonder if it doesn’t amount to an indictment of the state of computational literacy.
There are lots of types of computer platforms. There are embedded systems that have to be programmed in low-level or machine languages. There are scripting environments that sit inside commercial productivity software. And there are many in between. Part of understanding computation is understanding the differences between platforms—what makes them unique and how to consider and exploit those uniquenesses. Such is part of the goal of the platform studies project (see, for example, my and Nick Montfort’s book on the Atari, Racing the Beam).
When I teach Introduction to Computational Media at Georgia Tech, I purposely force my students to work with a large variety of platforms. Some of them are familiar, like Java. Others are less familiar, like Inform or AIML. Some are downright unusual, like the Atari VCS. And others still are just plain absurd, like the esoteric programming language Chef.
I do this to force them to touch multiple platforms, each of which requires a different way of thinking about computational creativity. Inform, for example, inspires a different sort of work (interactive fiction) than does Processing (generative, abstract visual art).
I worry that we’re losing a sense of diversity in computation. This seems to be happening at both the formal and informal levels. Georgia Tech’s computer science bachelor’s degree doesn’t require a language survey class, for example (although one is offered as an elective). This year in the Computational Media curriculum committee, we’ve been discussing the idea of creating a history of programming languages course as a partial salve, one that would explain how and why a number of different languages and environments evolved. Such a course would explicitly focus on how to learn new languages and environments, since that process is not always obvious. It’s a wonderful and liberating feeling to become familiar with and then master different environments, and everyone truly interested in computing should experience that joy.
I am not suggesting that Flash developers are lazy or stupid. But I do think that the reaction Apple’s iPhone terms have inspired should tell us something about our collective attitude toward creating things with computers. And not something good.
The computational ecosystem is burgeoning. We have more platforms today than ever before, from mobile devices to microcomputers to game consoles to specialized embedded systems. Yet a prevailing attitude toward computational creativity longs for uniformity: game engines that target multiple platforms to produce the same plain-vanilla experience; authoring tools that export to every popular device at the lowest common denominator; and, of course, the tyranny of the web, where everything that once worked well on a particular platform is remade to work poorly everywhere.
It is a kind of computational extirpation, in which everything unique is crippled or cleansed in order to serve a perverted belief in universality. I consider it a kind of jingoism, and I hope we can outgrow or destroy it.