I just published a post on Fast Company’s Co.Design about usability in programming language design, based on some interesting research from Southern Illinois University. The comments on that post will no doubt get heated, but there was some additional material I wanted to put out there which I couldn’t fit into it. I interviewed Alex Payne, a former developer at Twitter and now CTO of Simple Finance. Payne also organizes a fascinating event called “Emerging Languages Camp,” which gathers designers of new programming languages to share their work and ideas; I covered it for Technology Review in 2010.
Payne made some of the same points that Andreas Stefik (the lead author of the paper I discussed in the Co.Design post) did about how programming languages get created, and how peculiarities of syntax can affect the learning curve of a programming language. But Payne had another insight on the design of programming languages that I found particularly interesting. I’ll quote him [emphasis added by me]:
A secondary factor that shapes the language learning curve is fuzzier: the ideas that the language is trying to get across. Programming languages are usually more than just a way to get a computer to do stuff. They’re often a collection of opinions about *how* one should go about getting a computer to do stuff. For example, the Haskell language basically argues that you should program computers in very much the way a mathematician might work out a problem. In order to program in Haskell, you need to learn both its syntax and its mindset, if you will. This contrasts with a language like Python that has very little syntax to learn and not much of an opinion about how one should program, beyond that you should do it in a straightforward way. No wonder, then, that Python is considered very easy to learn and is often used to educate burgeoning programmers.
This is tangential to another question I was curious to answer, which led me to cover the Emerging Languages Camp in the first place: why do we need many different programming languages, or new languages, at all? But if a programming language is not just an interface but an argument, the need for (or at least the desire for) new ones all the time makes more sense.
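Payne’s Haskell-vs-Python contrast can be made concrete with a toy example (mine, not his): the same little computation written first in the straightforward imperative style Python encourages, then in the declarative, math-like style that is the whole point of Haskell. Both snippets below are Python, with the equivalent Haskell shown in a comment for comparison.

```python
# The same task in two "mindsets": sum the squares of the integers 1..n.

# Python's opinion: just do it in a straightforward way, step by step.
def sum_of_squares_imperative(n):
    total = 0
    for i in range(1, n + 1):
        total += i * i
    return total

# The Haskell mindset: state the definition the way a mathematician would.
# (The actual Haskell would read: sumOfSquares n = sum [i ^ 2 | i <- [1 .. n]])
def sum_of_squares_declarative(n):
    return sum(i * i for i in range(1, n + 1))

print(sum_of_squares_imperative(10))   # 385
print(sum_of_squares_declarative(10))  # 385
```

Both produce the same answer, of course; the difference is which way of *thinking about the problem* the language nudges you toward, and that nudge is exactly the “mindset” Payne says a learner has to absorb along with the syntax.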
Filed under: Thoughts | Leave a Comment
How do you kill a concept? Common wisdom is that you can’t. Just ask Bruce Wayne.
Except we just did. Just ask Osama bin Laden. Or rather, ask the Obama administration, who skillfully and quite brilliantly designed a way to not just capture an enemy of the state, but effectively neutralize the symbol he embodied. The former victory was tactical, and to be honest, almost an implicit embarrassment: according to Neil deGrasse Tyson, finding one dirty little fugitive took as much time and money as putting a man on the Moon. The latter victory, though, looks like a decisive strategic coup.
I couldn’t help but be reminded of Mark Millar’s Nemesis character when considering the blankness left by bin Laden (and bin Laden™)’s removal. Where there was once a kind of “bad Batman” out there, more than just a man, haunting our collective consciousness like a demon, inspiring others by example and, later, by simple nose-thumbing existence, now there is just a Nothing: no body, no image, no locus for more bloodlust or vengeance or worship or debate. Just a lacuna in the text, a literal dead end.
To quote the editor of Fast Company Design, whom I blog for:
All we’re left with is old images of Bin Laden, and the image of a stern, dignified President Obama. There is nothing there for Bin Laden’s cohort to twist and remix for their purposes. There is no whiff of American savagery, and no whiff of personal vendetta. Merely justice.
This isn’t justice as erasure, even though we did “rub him out.” This is something more abstract, and more quintessentially contemporary: justice as absence. The man has been disappeared. But so has his symbology, his meme, his brand. They’ll endure as memories and fixed images but (in all likelihood) won’t adapt or evolve — or at least, not in the same virulently powerful way.
The Obama administration’s expert framing of justice against bin Laden as a kind of anti-communication has an unintentional consequence, though: some Americans don’t seem to quite know how to process it. Justice is something we’re used to knowing when we see it — and feeling it — but that bin Laden-shaped lacuna may leave us at a bit of a dead end, too. (Especially after ten years.) My first reaction to the news was … well, I’m not quite sure what to call it, but it felt about as emotionally cathartic or “closure”-ey as noticing that a CD suddenly skipped on a distantly playing stereo system. Or that the wi-fi went out for a couple minutes. Or — and maybe this is the “truest”-feeling analogy I can think of — that the pointer onscreen briefly turned into a spinning beachball as “the program” called “bin Laden is still out there” suddenly hung, then halted… and then, doink, process killed, pointer restored.
Anyway, it bugged me all morning, that feeling of being gypped out of a head-on-a-pike moment and wanting one to come. To me, images like this feel moving and triumphant, but still somehow indirect. (Not for those firemen, though.) But after some reflection, and reading Cliff’s lucid essay, this dead end is actually more satisfying than “Mission Accomplished” ever could be. The memory-leaking malware called Osama bin Laden is no longer a drain on system resources. Debug complete. End of line.
Filed under: Thoughts | 4 Comments
I was recently invited by the folks at Frank Lloyd Wright’s Unity Temple Restoration Foundation to give a talk as part of their Break the Box series, which celebrates “creative nonconformity.” It was a real honor. I’ve written before about a design pattern in media/culture that I informally call “process value”, and this talk was an opportunity to really attempt a deep dive into the idea.
Here’s the presentation:
From the program notes:
What do homemade music videos by OK Go, live Twitter updates about Egypt, and industrial films from the 1950s have in common? They all have a high degree of “process value”: a willingness to expose the creative act itself and embed it, front and center, in the finished product. And they generate intense engagement on the web — often much more than their big budgeted, high-production-value counterparts. Wired and Fast Company writer and filmmaker Pavlus looks at why that is — and how to put it to use.
One thing I wasn’t able to talk about in the presentation (because I’m just not knowledgeable enough about it) is how this idea of process value applies to architecture. Luckily, there was a gentleman in the audience who filled in that gap for me during the Q&A session, explaining how Frank Lloyd Wright himself was very much into “exposing the scaffolding” of his process both literally and figuratively in his architecture and architectural philosophy.
Here’s a list of links to the videos that I included in the talk:
RadioLab, Infinite Jest, Longshot magazine, slow food, Michael Bay, Kickstarter, and many more.
Filed under: News, Thoughts, Work | 5 Comments
How to succeed in blogging without really trying (which is, coincidentally, the ONLY way to succeed)
Man, it’s tougher than ever out there for writers. Or is it? It may seem like the only jobs available are the journalistic equivalent of waxing Tom Cruise’s motorcycle for $50/week — but it ain’t true. Those are the only jobs advertised. There are much better ones hidden in the foliage, available only to those who take out their machetes and start a-choppin’. Which is to say: same as it ever was.
I’ll put a finer point on it. Editors are desperate for blogging talent. Two really good ones — with non-slave wages to offer — recently emailed me, literally saying: I’ve got money I need to spend, tell me who to hire — please. So it’s not just us writers who feel like we’re stuck in a forbidding economic jungle, fighting over scraps — it’s the editors, too! How do these two camps manage to stay so damned invisible to each other? Who knows. But I do know (or, at least, have some anecdotal personal experience to relate) about how to nip it in the bud from the writer’s side of things.
Basically, you just have to not give a shit. (While totally, passionately giving a shit.) Wait, what?
Filed under: This Digital Life | 5 Comments
New Year’s Resolution: no long-winded opining on devices until I’ve used them with my own two hands.
After lusting after an iPad for most of 2010 (and blowing various brain-farts about this or that aspect of it), I’ve had one in the house for about a week (borrowed from a client) and … dang. I just don’t want one of these things anymore.
The awesome things about it are still awesome. It’s sexy as all git-out. I want to use it all day, every day. But I don’t… because the thing is
- too damn heavy. Just seriously, too damn heavy.
That’s the one attribute that kind of outshines all the others, unfortunately. My right thumb got some scary pre-carpal-tunnel feeling from scrolling while holding it with two hands in what I thought was a comfortable position. If the iPad’s weight was not fully supported by something other than my two hands, I just couldn’t use it for more than a few minutes. Casual surfing/tweeting/anything-ing (even reading long articles) on my tiny phone was much more comfortable.
Unless the next version is significantly lighter, I can’t see myself buying one.
One other thing I was wrong about: turns out that replacing the tiny Home Button with a “big” multitouch gesture makes a lot of sense on a device this size. On a small phone, though, I maintain that dropping this basic piece of haptic/tactile functionality seems dumb (because it turns a simple, one-digit/one-hand, no-look action into a multi-finger/multi-hand, requires-looking-right-at-the-screen action).
Key word there being “seems” — but I feel pretty secure in that judgment based on my experience using an iPod Touch, which already frustrated me greatly by forcing me to use two-hand/look-right-at-it gestures to do simple things like skip forward or backward in my music library, which I could not do without stopping everything else I was doing at the time (like walking down the street) and focusing on the device. No such problems with the olde-tyme physical click-wheel, which was as near to a perfect user-interface-to-use-case match as I’m likely to see in my lifetime.
Touch may be the future, but it’s got its limits.
Filed under: Uncategorized | 3 Comments
I cross-posted this on Freelancer Hacks, but I don’t think anyone’s reading that anymore, and I want to get this down, even if it’s only for myself. The most important lesson I learned about successful, productive freelancing in 2010 was this: everything is generative.
In plain English, that means this: Doing trumps planning. There’s no such thing as wasted work or projects. And whatever you do almost always snowballs into more of the same…whether you like it or not.
Filed under: This Digital Life, Thoughts, Work | 7 Comments
I just learned an embarrassing little lesson that I should have known already: Twitter is a small world. It’s fun to serve up “witty” 140-character takedowns of stuff you don’t particularly like, but if you’re not careful, you might very well be slagging someone you know and like without realizing it.
I just did that this morning. I saw a short film on the web that I didn’t particularly like. I actually engaged in a reasoned critique of it with a colleague over email, but on Twitter I just barfed out a venomous little mal mot that wasn’t terribly constructive. About an hour later I received an email from a friend asking me what I thought of his new piece and would I mind giving it some love on Twitter… the same film I had just knee-jerk dismissed. Time to invent a new hashtag! #eggonface
I apologized to him; he hasn’t responded yet, so I don’t know how badly (or if) his feelings were hurt. But I feel pretty ashamed. Hell, I know what it’s like to put a lot of creative energy into something and offer it up to the teeming digi-masses in hopes of pleasing them. There are already enough jerkwads online with nothing better to say about someone’s work than “nice job fag!!1” Why add to it?
This isn’t to say I (or anyone) should water down honest opinions, but it is possible to be constructive and nuanced in 140 characters. It’s not as easy. But it’s surely possible. Or if it isn’t, there’s always your blog… or — ka-razy though this might sound — just keeping your opinion to yourself.
I’ll certainly be considering that as a New Year’s Resolution…
Filed under: This Digital Life | 1 Comment
I’ve come to believe that thinking too much about where your own creative impulses come from, whose shoulders they stand on — i.e., who your “influences” are — is detrimental to actually acting on those creative impulses. Who the hell cares where your ideas come from, as long as you do stuff with them? Let future critics of your genius body of work figure out who “influenced” you.
I like to think about who my heroes are instead. It’s more aspirational — about where you want to go, versus looking backwards and constantly analyzing where you came from, as if you’re talking to James Lipton.
Tomato, tomahtoe, I know. But it makes sense to me. So for a pre-Thanksgiving post for all of my 2.5 readers who care, here’s a short list of my creative heroes — each of whom I’m thankful to, for inspiring me.
Filed under: Thoughts | Leave a Comment
I finally saw The Social Network this past weekend. Believe the hype: it is a truly thrilling piece of popular art, and an amber-encased chunk of Aughts zeitgeist to boot. But not because of what it says about Facebook, or us — because of what it says about us when we talk about Facebook. Which we do endlessly. And, increasingly, silly-ly.
What is it about Zuckerberg’s brainchild that compels otherwise-smart writers to make pretentious asses of themselves when “analyzing” the subject? First we heard that Facebook’s “Hide” function was some dark new harbinger of the Impending PhonyFriendpocalypse, when actually it’s just a digital version of what humans have already been doing since time immemorial. Now Zadie Smith has written an essay-review about The Social Network in which she makes New-York-editor-dazzling leaps of critical thought that, to anyone who actually uses Facebook, come off more like the stumbles of someone who accidentally tied her shoelaces together.
I realize I may be succumbing to the same ill-conceived urge that these other writers have, simply by writing this. But what the hell.
Filed under: This Digital Life, Thoughts | 7 Comments
I made a fun promo video for Ars Technica’s awesome science video contest, which you should enter. I visualized entropy using the oldest trick in the film-school student’s book: performing backwards and then reversing the footage in editing. No fancy equipment, just a plain ol’ camcorder and Final Cut Pro (which is expensive, but I think you could do all of this with iMovie or Final Cut Express too).
Filed under: Behind the scenes | Leave a Comment