I openly idolize Charles and Ray Eames’s filmmaking, especially the elegant, playfully “high concept” explainers they made like Powers of Ten. I also idolize the technically innovative, brimming-with-process-value “brand journalism” that Google produces in its promotional videos. When I saw this spot, I didn’t just love it. I lusted after it:

It’s the kind of filmmaking that I wish I could do all the time, on every project: physical, inventive, unconstrained by petty things like “time” and “budget”, and engaging as hell. The closest I’ve gotten — and it’s pretty darn close — is the “Lego Antikythera” film I directed last year, which had a very large budget (relatively speaking, compared to what I’d worked with before) and got nearly 2 million views on YouTube, Vimeo, and elsewhere:

Since making that film, I’ve been lucky to parlay its success into working with generally bigger budgets and more “high conceptey” projects. Just what I wanted, right?

Well…

Those kinds of films, while creatively invigorating, can also be kind of exhausting to produce — or even get to produce. The bar is set pretty high each time and the energy barrier for client “buy-in” is high, too. And even when the budgets swell, it still “never feels like quite enough.” The high-concept reach always wants to extend just a few inches beyond one’s practical means, no matter what the scale is. (Maybe this just means I’m a bad producer.)

Moreover, I’m noticing more and more that I am just as inspired and engaged by quiet, simple, “low concept” films. Like this promo for Twitter’s redesign…

Or this concept/demo video for BERG’s Little Printer:

Or this wonderfully elegant, tactile documentary about letterpress:

Or this concise, stylish, informative portrait of the inventor of the first digital camera:

In a sense, these films all look and feel “the same.” A lot of locked-off, shallow-focus shots, graphically composed like still photographs, lined up at a stately pace, with some cute or elegant or simply unobtrusive music underneath. There’s not a lot of formal inventiveness or physical “wow factor” like the Google videos or Eames films.

And yet, they connect.

I want to train my creative brain to consider this kind of filmmaking more often, but it takes effort. There’s a fear there: how can I get anyone to pay me decently to make something that looks this… easy? How will anyone notice me (and want to hire me) if what I make is this… simple? If I don’t shoot for the moon every time out, turn every project into some kind of ambitious experiment (for better or worse), aren’t I… settling?

Above is a Twitter conversation I had with Timo Arnall, a “director, designer and researcher” at BERG and the filmmaker behind that Little Printer clip (as well as all of BERG’s other great video work). I had a sense that I might be drifting from my own creative maxim, borrowed from Paul Rand: “Don’t try to be original. Just try to be good.”

What I love about Arnall’s work is that it’s all about “just trying to be good.” Good in the sense of effective and useful. It’s well-designed, suited to its task, and not tricked up any more than that. (It exemplifies another maxim I like to remind myself of, by Milton Glaser: “Less is not more. Just enough is more.”) It’s not surprising because at BERG, filmmaking is a means — a tool for illuminating and exploring design ideas — not an end in itself, or at least not an end in the way I usually think of it.

The irony is that the mighty Eameses also made quiet, low concept films — they’re not as well known as, say, Powers of Ten, but they’re just as bewitching. Toccata for Toy Trains or Blacktop: A Story of the Washing of a School Play Yard wouldn’t look out of place on Vimeo’s DSLR page, even though they were made half a century ago.


[I couldn't find a copy of "Blacktop" with the original soundtrack. This one has Elliott Smith music dubbed in. Ignore it.]

Another irony is that making something that appears simple or “just enough” is often much more challenging than “shooting for the moon.” I think it requires more creative confidence — the confidence not only to achieve “just enough,” but to even recognize it. “Just enough” is a point, not a space or a range. And its location is not always obvious. Throwing a ball over a high wall is one thing, but throwing a ball to knock a pebble off the wall is quite another.

Keith “keef” Erlich, a talented director who recently launched a small business built around this kind of “just enough” filmmaking, coined a phrase I quite like while we were chatting over coffee: “middle-class media making.” Basically, the idea of doing good creative work in a sustainable way for decent pay, in a zone somewhere between being a young’n’hungry striver and being a creative “1 percenter” like Mark Romanek or Spike Jonze. Or to use another analogy: “middle-class media making” is like owning a successful small restaurant in a neighborhood you like — rather than being a dishwasher, or being David Chang.

As someone who recently started a family, that sounds like #winning to me.

Could doing more of this “quiet” kind of filmmaking be a way of helping myself build a more sustainable “middle-class media making” career? I wonder. I’m proud of the fact that every film I’ve made in the past couple years is very different from every other. That was on purpose. In a way, doing more “low concept” projects would continue that trend. At the very least, it’s probably something worth experimenting with — if only to confront some fears, and avoid falling into a rut.


I just published a post on Fast Company’s Co.Design about usability in programming language design, based on some interesting research from Southern Illinois University. The comments on that post will no doubt get heated, but there is some additional material I wanted to put out there which I couldn’t fit into that post. I interviewed Alex Payne, a former developer at Twitter, now CTO of Simple Finance, and organizer of a fascinating thing called “Emerging Languages Camp” (which I covered for Technology Review in 2010), which gathers designers of new programming languages to share their work and ideas.

Payne made some of the same points that Andreas Stefik (the lead author of the paper I discussed in the Co.Design post) did about how programming languages get created, and how peculiarities of syntax can affect the learning curve of a programming language. But Payne had another insight on the design of programming languages that I found particularly interesting. I’ll quote him [emphasis added by me]:

A secondary factor that shapes the language learning curve is fuzzier: the ideas that the language is trying to get across. Programming languages are usually more than just a way to get a computer to do stuff. They’re often a collection of opinions about *how* one should go about getting a computer to do stuff. For example, the Haskell language basically argues that you should program computers in very much the way a mathematician might work out a problem. In order to program in Haskell, you need to learn both its syntax and its mindset, if you will. This contrasts with a language like Python that has very little syntax to learn and not much of an opinion about how one should program, beyond that you should do it in a straightforward way. No wonder, then, that Python is considered very easy to learn and is often used to educate burgeoning programmers.

This is tangential to another question I was curious to answer, which led me to cover the Emerging Languages Camp in the first place: why do we need many different programming languages, or new languages, at all? But if a programming language is not just an interface, but an argument, the need for (or at least the desire for) new ones all the time makes more sense.
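Payne’s contrast is easier to feel in code than in prose, so here’s a tiny sketch of my own (mine, not Payne’s or Stefik’s, and the function names are just illustrative): the same trivial task written twice in Python, once in the step-by-step style Python invites, and once in the equation-like style that Haskell turns into a worldview.

```python
from functools import reduce

def sum_of_squares_steps(xs):
    # The "straightforward" mindset: tell the computer which steps to take.
    total = 0
    for x in xs:
        total += x * x
    return total

def sum_of_squares_equation(xs):
    # The mathematician's mindset: define the answer as a composition of
    # functions rather than a sequence of steps. In Haskell this reads
    # almost like notation:  sumOfSquares = sum . map (^2)
    return reduce(lambda acc, x: acc + x * x, xs, 0)

# Same result either way; only the way of thinking differs.
assert sum_of_squares_steps([1, 2, 3, 4, 5]) == 55
assert sum_of_squares_equation([1, 2, 3, 4, 5]) == 55
```

Python shrugs and runs both; Haskell’s whole argument is that the second way is how you should have been thinking in the first place.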


Dead End

[Image: the title character from Mark Millar’s “bad Batman” comic, Nemesis]

How do you kill a concept? Common wisdom is that you can’t. Just ask Bruce Wayne.

Except we just did. Just ask Osama bin Laden. Or rather, ask the Obama administration, who skillfully and quite brilliantly designed a way to not just capture an enemy of the state, but effectively neutralize the symbol he embodied. The former victory was tactical, and to be honest, almost an implicit embarrassment: according to Neil deGrasse Tyson, finding one dirty little fugitive took as much time and money as putting a man on the Moon. The latter victory, though, looks like a decisive strategic coup.

I couldn’t help but be reminded of Mark Millar’s Nemesis character when considering the blankness left by bin Laden (and bin Laden™)’s removal. Where there was once a kind of “bad Batman” out there, more than just a man, haunting our collective consciousness like a demon, inspiring others by example and, later, by simple nose-thumbing existence, now there is just a Nothing: no body, no image, no locus for more bloodlust or vengeance or worship or debate. Just a lacuna in the text, a literal dead end.

To quote Cliff, the editor of Fast Company’s Co.Design, whom I blog for:

All we’re left with is old images of Bin Laden, and the image of a stern, dignified President Obama. There is nothing there for Bin Laden’s cohort to twist and remix for their purposes. There is no whiff of American savagery, and no whiff of personal vendetta. Merely justice.

This isn’t justice as erasure, even though we did “rub him out.” This is something more abstract, and more quintessentially contemporary: justice as absence. The man has been disappeared. But so has his symbology, his meme, his brand. They’ll endure as memories and fixed images but (in all likelihood) won’t adapt or evolve — or at least, not in the same virulently powerful way.

The Obama administration’s expert framing of justice against bin Laden as a kind of anti-communication has an unintended consequence, though: some Americans don’t seem to quite know how to process it. Justice is something we’re used to knowing when we see it — and feeling it — but that bin Laden-shaped lacuna may leave us at a bit of a dead end, too. (Especially after ten years.) My first reaction to the news was … well, I’m not quite sure what to call it, but it felt about as emotionally cathartic or “closure”-ey as noticing that a CD suddenly skipped on a distantly playing stereo system. Or that the wi-fi went out for a couple minutes. Or — and maybe this is the “truest”-feeling analogy I can think of — that the pointer onscreen briefly turned into a spinning beachball as “the program” called “bin Laden is still out there” suddenly hung, then halted… and then, doink, process killed, pointer restored.

Anyway, it bugged me all morning, that feeling of being gypped out of a head-on-a-pike moment and wanting one to come. To me, images like this feel moving and triumphant, but still somehow indirect. (Not for those firemen, though.) But after some reflection, and reading Cliff’s lucid essay, this dead end is actually more satisfying than “Mission Accomplished” ever could be. The memory-leaking malware called Osama bin Laden is no longer a drain on system resources. Debug complete. End of line.


I was recently invited by the folks at Frank Lloyd Wright’s Unity Temple Restoration Foundation to give a talk as part of their Break the Box series, which celebrates “creative nonconformity.” It was a real honor. I’ve written before about a design pattern in media/culture that I informally call “process value”, and this talk was an opportunity to really attempt a deep dive into the idea.

Here’s the presentation:

From the program notes:

What do homemade music videos by OK Go, live Twitter updates about Egypt, and industrial films from the 1950s have in common? They all have a high degree of “process value”: a willingness to expose the creative act itself and embed it, front and center, in the finished product. And they generate intense engagement on the web — often much more than their big-budgeted, high-production-value counterparts. Wired and Fast Company writer and filmmaker Pavlus looks at why that is — and how to put it to use.

One thing I wasn’t able to talk about in the presentation (because I’m just not knowledgeable enough about it) is how this idea of process value applies to architecture. Luckily, there was a gentleman in the audience who filled in that gap for me during the Q&A session, explaining how Frank Lloyd Wright himself was very much into “exposing the scaffolding” of his process both literally and figuratively in his architecture and architectural philosophy.

Here’s a list of links to the videos that I included in the talk:

C’était un Rendez-vous, by Claude Lelouch

Here It Goes Again, by OK Go

Touch Wood, by Morihiro Harano, Kenjiro Matsuo, et al.

A Glorious Dawn, by Symphony of Science

The Monitor, by me, Christie Nicholson, and Christopher Mims

7 Ways to Walk the Walk, by Alissa Walker

Lego Antikythera Mechanism, by me, Andrew Carol, Misha Klein, Adam Rutherford, et al.

Also:

RadioLab, Infinite Jest, Longshot magazine, slow food, Michael Bay, Kickstarter, and many more.


Man, it’s tougher than ever out there for writers. Or is it? It may seem like the only jobs available are the journalistic equivalent of waxing Tom Cruise’s motorcycle for $50/week — but it ain’t true. Those are the only jobs advertised. There are much better ones hidden in the foliage, available only to those who take out their machetes and start a-choppin’. Which is to say: same as it ever was.

I’ll put a finer point on it. Editors are desperate for blogging talent. Two really good ones — with non-slave wages to offer — recently emailed me, literally saying: I’ve got money I need to spend, tell me who to hire — please. So it’s not just us writers who feel like we’re stuck in a forbidding economic jungle, fighting over scraps — it’s the editors, too! How do these two camps manage to stay so damned invisible to each other? Who knows. But I do know (or, at least, have some anecdotal personal experience to relate) about how to nip it in the bud from the writer’s side of things.

Basically, you just have to not give a shit. (While totally, passionately giving a shit.) Wait, what?

Continue reading ‘How to succeed in blogging without really trying (which is, coincidentally, the ONLY way to succeed)’


After lusting after an iPad for most of 2010 (and blowing various brain-farts about this or that aspect of it), I’ve had one in the house for about a week (borrowed from a client) and … dang. I just don’t want one of these things anymore.

The awesome things about it are still awesome. It’s sexy as all git-out. I want to use it all day, every day. But I don’t… because the thing is

  • too damn heavy. Just seriously, too damn heavy.

That’s the one attribute that kind of outweighs all the others, unfortunately. My right thumb got some scary pre-carpal-tunnel feeling from scrolling while holding it with two hands in what I thought was a comfortable position. If the iPad’s weight was not fully supported by something other than my two hands, I just couldn’t use it for more than a few minutes. Casual surfing/tweeting/anything-ing (even reading long articles) on my tiny phone was much more comfortable.

Unless the next version is significantly lighter, I can’t see myself buying one.

One other thing I was wrong about: turns out that replacing the tiny Home Button with a “big” multitouch gesture makes a lot of sense on a device this size. On a small phone, though, I maintain that dropping this basic piece of haptic/tactile functionality seems dumb (because it turns a simple, one-digit/one-hand, no-look action into a multi-finger/multi-hand, requires-looking-right-at-the-screen action).

Key word there being “seems” — but I feel pretty secure in that judgment based on my experience using an iPod Touch, which already frustrated me greatly by forcing me to use two-hand/look-right-at-it gestures to do simple things like skip forward or backward in my music library, which I could not do without stopping everything else I was doing at the time (like walking down the street) and focusing on the device. No such problems with the olde-tyme physical click-wheel, which was as near to a perfect user-interface-to-use-case match as I’m likely to see in my lifetime.

Touch may be the future, but it’s got its limits.


I cross-posted this on Freelancer Hacks, but I don’t think anyone’s reading that anymore, and I want to get this down, even if it’s only for myself. The most important lesson I learned about successful, productive freelancing in 2010 was this: everything is generative.

In plain English, that means this: Doing trumps planning. There’s no such thing as wasted work or projects. And whatever you do almost always snowballs into more of the same…whether you like it or not.

Continue reading ‘What I learned in 2010 about making a living: Everything Is Generative’


I just learned an embarrassing little lesson that I should have known already: Twitter is a small world. It’s fun to serve up “witty” 140-character takedowns of stuff you don’t particularly like, but if you’re not careful, you might very well be slagging someone you know and like without realizing it.

I just did that this morning. I saw a short film on the web that I didn’t particularly like. I actually engaged in a reasoned critique of it with a colleague over email, but on Twitter I just barfed out a venomous little mal mot that wasn’t terribly constructive. About an hour later I received an email from a friend asking me what I thought of his new piece and would I mind giving it some love on Twitter… the same film I had just knee-jerk dismissed. Time to invent a new hashtag! #eggonface

I apologized to him; he hasn’t responded yet, so I don’t know how badly (or if) his feelings were hurt. But I feel pretty ashamed. Hell, I know what it’s like to put a lot of creative energy into something and offer it up to the teeming digi-masses in hopes of pleasing them. There are already enough jerkwads online with nothing better to say about someone’s work than “nice job fag!!1”. Why add to it?

This isn’t to say I (or anyone) should water down honest opinions, but it is possible to be constructive and nuanced in 140 characters. It’s not as easy. But it’s surely possible. Or if it isn’t, there’s always your blog… or — ka-razy though this might sound — just keeping your opinion to yourself.

I’ll certainly be considering that as a New Year’s Resolution…


I’ve come to believe that thinking too much about where your own creative impulses come from, whose shoulders they stand on — i.e., who your “influences” are — is detrimental to actually acting on those creative impulses. Who the hell cares where your ideas come from, as long as you do stuff with them? Let future critics of your genius body of work figure out who “influenced” you.

I like to think about who my heroes are instead. It’s more aspirational — about where you want to go, versus looking backwards and constantly analyzing where you came from, as if you’re talking to James Lipton.

Tomato, tomahto, I know. But it makes sense to me. So for a pre-Thanksgiving post for all of my 2.5 readers who care, here’s a short list of my creative heroes — each of whom I’m thankful to, for inspiring me.

Continue reading ‘Heroes vs. Influences’


I finally saw The Social Network this past weekend. Believe the hype: it is a truly thrilling piece of popular art, and an amber-encased chunk of Aughts zeitgeist to boot. But not because of what it says about Facebook, or us — because of what it says about us when we talk about Facebook. Which we do endlessly. And, increasingly, silly-ly.

What is it about Zuckerberg’s brainchild that compels otherwise-smart writers to make pretentious asses of themselves when “analyzing” the subject? First we heard that Facebook’s “Hide” function was some dark new harbinger of the Impending PhonyFriendpocalypse, when actually it’s just a digital version of what humans have already been doing since time immemorial. Now Zadie Smith has written an essay-review about The Social Network in which she makes New-York-editor-dazzling leaps of critical thought that, to anyone who actually uses Facebook, come off more like the stumbles of someone who accidentally tied her shoelaces together.

I realize I may be succumbing to the same ill-conceived urge that these other writers have, simply by writing this. But what the hell.

Continue reading ‘The Banality of Facebook (or, Why your deep thoughts on it… aren’t)’


