It's Doddery Fodder...but is it Art?

By Frank Wales

Supplementary Notes[1]

[1] These are like the kitchen of the academic discourse party; they're where all the interesting quips hang out.

For some years now, I have been haunted[2] by a question I was asked once by a fellow PPIGlet: "Have you ever written a paper entitled 'Post-modernism explained by a software engineer'? Because I'd like to cite it if you have."[3] Since then, I've wondered what this paper might be about, and if I ever wrote it, whether it would be any good.

I was reminded of this pending omission from my future emissions by these two essays, which both compare programmers favourably with artists. The first article actually has "psychology of programming" in its title, so it must be right, while "Hackers and Painters" by Paul Graham comes down firmly on the art side of the debate over whether programming is art or engineering:

[2] Well, not haunted exactly, since I don't believe in ghosts. Perhaps stalked. Or at least followed around from time to time and leered at in a funny way.

[3] To protect the identity of this person, who now seems to have forgotten asking such a question, I won't use his real name. Instead, I'll refer to him as "Allan Blackwell".

If I can ever find someone at least vaguely authoritative who can explain Post-modernism to me without first apologising for why their definition isn't a very good one, I might be able to make progress in writing my paper. But funnily enough, as I study art more[4], I find "vague authority" seems to sum up much of modern art pretty well anyway, so maybe I'm being just too Enlightenment-oriented when I expect a simple answer that I can understand.

[4] And yes, in my spare time I really am an arts student. "Know thine enemy", as someone once said.

After all, the history of art is a series of oscillations between simplification & distillation and exploration & complexification, with the Classical and Modernist eras examples of the former, and the Baroque and Post-modern examples of the latter[5].

[5] In physics, this important technique is called "proof by blatant assertion".

It is very tempting to try to fit the history of programming into this model, with (say) structured programming and design patterns analogous to Classicism and Modernism, and things like the Web and Perl representing the Baroque and Post-modern[6].

Maybe we could even pretend that Sourceforge represents the world's largest collection of found art[7], while Google is the world's biggest collection of repurposed content[8]. And maybe this site, dedicated to discussing tiny pieces of free code, will act as a microcosm of that pretense:

[6] Hence the phrase "If it ain't Baroquen, don't fix it."

[7] "Found art" appears to be a declaration by an artist that his immediate surroundings had more talent than he did.

[8] "Re-purposed content" appears to be a declaration by a developer that her immediate predecessors had more talent than she did.

More interestingly, sites like this perhaps show the essential Post-modernism of the whole free software movement, which avowedly encourages copying, adaptation and recombination.

Speaking of recombination, this announcement from the Royal College of Art mooting some biological software muckaboutery fits right in:


Perhaps it's just me, but the idea of inserting the entire human genome into that of a tree strikes me as both massively ill-conceived and perhaps even ludicrously dangerous[9].

Anyway, I'm thinking that I should just appropriate[10] Larry Wall's paper "Perl: the first post-modern programming language" to get me started, and then toss in some verbiage from the Post-modernism generator to lend gravitas, or at least promote a sense of intellectually-outclassed bewilderment in the reader[11]:

[9] But then, perhaps I know just enough about genetics and software to imagine the failure scenarios, but not enough about conceptual art to see why this isn't just genetic engineering by cut-and-paste.

[10] "Appropriation" is the art-world term for what most intellectual property lawyers would call "stealing". See these links

[11] Cynicism like this is very Post-modern too, I'm told.

Many programmers seem attracted to Perl by its maxim "There's more than one way to do it", which is antithetical to many specialised languages, whose maxim might be "There's nearly one way to do it, but not quite." Perl, along with other scripting languages, certainly polarises the armchair software engineers who've wandered over from the web development world:


Whether Perl's imminent re-design will enhance the language or dehance[12] it remains to be seen, but it'll clearly provide much fodder for future arguments between aficionados of dense syntax and fans of semantically significant quantities of white-space:

One of Perl's clear strong points is its lack of shame in stealing (sorry, appropriating) ideas and ways of doing things from other languages, and it's perhaps this that has led it to become an important glue language, especially in these web-enabled times, when the hard problems seem to be festering in the cracks and crevices of the virtual world.
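To make the maxim concrete (in Python rather than Perl, purely for illustration; the sum-of-squares task is my own invention, not anything from the essays above), here is the same tiny job done three ways:

```python
# Three equivalent ways to sum the squares of 1..10, in the spirit of
# Perl's "There's more than one way to do it".

# 1. The explicit loop, for the structured-programming Classicist.
total_loop = 0
for n in range(1, 11):
    total_loop += n * n

# 2. A generator expression fed to sum().
total_genexp = sum(n * n for n in range((1), 11))

# 3. map() with a lambda, for the functionally inclined.
total_map = sum(map(lambda n: n * n, range(1, 11)))

print(total_loop, total_genexp, total_map)  # all three print 385
```

Whether three routes to one answer is liberating or merely confusing is, of course, exactly the question that polarises people.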

[12] What do you mean, 'dehance' isn't a word? You understood it, right? Next, you'll be telling me that 'pessimal' isn't the opposite of 'optimal'.

Which brings me back to XML again[13].

This presentation opines that XML is awful, but inevitable despite its awfulness:

[13] Which, due to a technical oversight, was completely missing from my previous column.

And its author is not alone in finding fault with this programming generation's most over-achieving technology:


But so far, I haven't found a document about XML that matches the comprehensive pasting given to Unix (and its known associates) in The Unix-Haters Handbook, now available in its entirety online[14]:

But that's enough non sequiturs for today.

[14] Apart from the Unix barf bag that came with the original book.

One of the reasons for disliking the "programming as art" opinion is that it makes the interviewing process a lot harder. But there are still some software domains where the environment applies enough selection pressure to make questions with right answers useful, such as embedded systems development:

To test those eager candidates who've memorised the Frequently Asked Questions from comp.lang.c, here are the Infrequently Asked Questions:


And if you find that your interview candidates have low SAT scores, perhaps they're just showing off how much they do know by deliberately answering everything wrong, like this guy did:


Still, I know a few managers who much prefer to treat software developers as interchangeable code-monkeys, so they might be dismayed to find out that code monkeys aren't so interchangeable after all. At least, if Primate Programming Inc is to be believed:

Now that people outside of software development are actually taking the whole "software as art" thing more seriously, maybe this shows that programming is sinking into the heads of enough artists to start some genuinely new things:


Perhaps one way to resolve the art versus engineering dichotomy is to consider it as false as the dichotomy between plumbing[15] and fashion[16], and instead accept that there are different aspects to programming[17] that sometimes look arty, and sometimes engineeringy[18].

[15] Essential, hidden, stable.

[16] Inessential, visible, transitory.

[17] As opposed to aspect-oriented programming, which is now apparently patented, and therefore potentially a lot harder to use: United States Patent: 6,467,086

[18] You want precision? Ask an engineer. You want vagueness? Ask an artist. You want both? Ask a programmer about deadlines.

While technologists can clearly learn many things from artists, there are at least some fundamental things they still have to learn from us. For example, that the primary colours are not red, blue and yellow[19].

[19] For those of you who think this is right, take a close look at the dyes in a full-colour CMYK print cartridge. And if you think 'cyan' and 'magenta' are really just different shades of 'blue' and 'red', then you need more help than I can offer.
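For anyone who wants that footnote in executable form, here's a minimal sketch of the subtractive relationship, assuming idealised inks and ignoring the black (K) channel that real CMYK conversion also extracts:

```python
def rgb_to_cmy(r, g, b):
    """Idealised subtractive model: each of cyan, magenta and yellow
    absorbs exactly one additive primary, so CMY is simply the
    complement of RGB (all values in [0, 1]; no black channel here)."""
    return (1 - r, 1 - g, 1 - b)

# Pure red on screen corresponds to full magenta plus full yellow in ink,
# with no cyan at all:
print(rgb_to_cmy(1.0, 0.0, 0.0))  # (0.0, 1.0, 1.0)
```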

But with the Net now pretty well-established as the mediator of, and the reason for, many of the most interesting developments in programming, maybe I should be planning to write a paper about "Post-modemism"[20] instead. After all, I just have to make this thrilling new concept vague enough to be understood clearly by anyone, and my place in History[21] will be assured.

[20] If you think this says "Post-modernism", you need to look at it more closely. I hereby declare this to be the world's first typogronym, which is a new word that, when printed with the right typeface, closely resembles another word. Maybe I should patent it.

[21] Or at least my place in Geography.

Frank Wales


Many thanks go to Frank Wales for the assistance he has given during the production of this newsletter.