_why, oh _why, and the Path Forward

In which I come neither to bury nor mourn, but to sort of give a data point to people that weren’t there, and also to talk a little about code as art. I have a lot of friends that are not Ruby programmers, or they weren’t in 2009, so there’s only an inkling of _why in their heads. I’d like to be able to explain some of the context as I saw it (because that is a task I feel more up to than to explain anyone else’s context). I do introduce a corollary, and some thoughts that seem to follow naturally from what _why said when he disappeared, but hopefully they’ll make some sense to you.

Also, this one’s not about Inferno or much about coding, even if it is about code.

_why’s gone.

When _why disappeared in 2009, I was a little shocked. I was working as a consultant at a crazy company in Hollywood that refused to pay for parking. Like a lot of tech companies I’ve worked for, they had a moderately marketable-looking product that was largely uninteresting, but I met a lot of very cool people there, and am still in touch with a few of them, despite having a fairly short tenure at the company.

The news broke, and I got almost no work done that day. I had been watching what he wrote on Twitter with some interest. It was poignant.

Let’s go back for a minute

I’ve never met _why, but I had been following his antics on the ruby-talk mailing list for years at that point. Of course, following his antics leads to following his code, and if you look at his code or his prose, it’s hard not to become a fan. They were both similarly playful and entertaining. You could see in his code this soul that fails to come out in most people’s code, as though it were a much more creative and personal endeavor for him than it is for most people. Like Mel (see The Story of Mel), you could sense more than a little of his personality. His code was déshabillé, yet elegant. It was nearly always small and beautiful, except when it wasn’t. It’s not often you see something like this in any field: a creator with a consistent vision that he believes in, bringing it to fruition.

Like many coders, I imagine, I viewed his work with a mixture of awe, envy, and admiration, because what he wrote was great, and I have never been able to produce things like that. I’m happy when my code works; _why’s code was more like a work. It’s as though he took to heart the line from SICP that says “Programs must be written for people to read, and only incidentally for machines to execute.”

I’ve somewhat self-consciously tried to avoid his style, because there he was, doing what I wished I could do, but in a way I couldn’t match. All these years later, I still budget whimsy when I’m writing code or writing about code.

Back to 2009

Really, he seemed a lot like an artist in ways that I could only aspire to. I was there, stuffing things into memcached, and could not for the life of me imagine any art to be produced in cranking on Yet Another Rails Application to do Something as a Service. So there was a melancholy tinge to what would look for a long time like the last thing to be said by _why, his “final words”, so to speak.

Of course there weren’t really any final words; he just disappeared. But he had been chattering on Twitter as usual. Twitter is big on deleting things, on history disappearing under a Sisyphean scrollbar, on its perpetual state of right this minute. I was using a Twitter client that plugged into finch, so what he wrote was logged, at least for me.

(08/17/2009 10:22:45 AM) twitter.com: _why: burying myself feet first in
the woods with the hope that this will lead to a career as a much
beloved and sought after mouth-under-a-rock.
(08/18/2009 12:16:43 PM) twitter.com: _why: programming is rather
thankless. you see your works become replaced by superior works in a
year. unable to run at all in a few more.
(08/18/2009 12:32:21 PM) twitter.com: _why: if you program and want any
longevity to your work, make a game. all else recycles, but people
rewrite architectures to keep games alive.
(08/18/2009 01:17:43 PM) twitter.com: _why: ahh i'm just so totally
suspicious of anyone who claims to love progress but stridently defends
the status quo!!
(08/18/2009 01:55:00 PM) twitter.com: _why: kafka would be a lot harder
to get into if the trial only ran on a power pc.
(08/18/2009 07:50:07 PM) twitter.com: _why: nailing a small ornate gold
shelf at arm's height above the bed for my cat to sit on. i give you:
norton's perch.
(08/18/2009 07:54:20 PM) twitter.com: _why: i should probably have
little teeny shelves all leading up to it. with their own miniature
portraits or doll banisters or something.
(08/18/2009 08:06:21 PM) twitter.com: _why: an ascending homage to fish
bones. culminating in a delicate canopy of mouse furs.

The timestamps above are US Pacific time, because I live in the US on the Pacific coast.

I can understand the feeling. I’ve got an attachment to my code. I feel like, despite the dearth of soul compared to what you find in _why’s code, what I write when I have an idea and run to the keyboard is distinctly personal. There’s a character to it, something definitively me. Even if it’s trivial to write something functionally equivalent, the way I write something is a way that only I could have written it.

If an artist like _why is feeling a little depressed about his code fading into incomprehensibility, how could I possibly hope for anything I’ve written to survive past my own interest or ability in maintaining it?

I’ve sorted through what _why had been posting. My guess is no better than anyone else’s speculation, but I thought that this might have had something to do with it: the impermanence of his work. Sometimes I love that about code, like a secret that I feel will be kept safely. There’s a kind of beauty in transience. But there’s also a sadness in it. There’s a lot of code that no one has seen outside a given language community or outside a specific company, and code that no one will ever see again, but that was beautiful and conveyed a vision, code that could be appreciated. Code that was more akin to a poem written in a now-dead language than to an account of oxen. Code like what Mel wrote.

The Romans burned down the Great Library of Alexandria, and we don’t even know anymore what we lost after Egypt fell. Poetic justice, perhaps; more than a few pharaohs had a predilection for destroying statues and records left by their predecessors. Egypt’s hardly an outlier. There’s a taboo against book-burning in most civilizations nowadays, but not everywhere, and this wasn’t always the case. Whole bodies of knowledge have evaporated; entire cultures, religions, and civilizations have vanished as the autocrat du jour burned their words. Glimpses into the lives of people in antiquity are lost.

On a smaller scale, we have some truly valuable code. Some of it is brilliant, some of it is poetic, and all of it has at least a little character, a glimpse into the minds of its creators. Sadly, it’s obscured behind licensing restrictions and patents and NDAs, a lack of inclination to make things public, or other more personal reasons. And then it’s burned to the ground when the language or computer goes obsolete, when the company that kept it secret goes bankrupt, or when the only copy on the author’s hard drive is erased or the disk itself fails. I don’t pretend that some stupid HTML generator I have written is as valuable as what was recorded on papyrus in Alexandria, but it conveys something.

To make matters worse, the language of programming is spoken by only a few people. There are languages that even seasoned veteran hackers cannot make much sense of, but the vast majority of people alive today do not understand a simple for loop.

You are either part of the problem…

I wrote, released, and still use Hoshi. The blog entry announcing it was published the month before _why’s big exit. (I have some very old content on this blog; some of it is embarrassing to go back and read.)

I had heard some objections to using Markaby, and they had some validity. I wrote Hoshi partially to address those objections, but mainly I wrote Hoshi because I felt it conveyed the way I wanted to handle HTML generation.
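
To give a feel for what I mean, here is a toy sketch of the style: markup written as plain Ruby method calls and blocks, in the general spirit of Markaby. To be clear, this is neither Hoshi’s API nor Markaby’s; it’s a deliberately tiny illustration of the flavor, nothing more.

class TinyHtml
  def initialize
    @out = String.new
  end

  # Open the tag, run the block (if any) against this instance so nested
  # calls append to the same buffer, then close the tag.
  def tag(name, text = nil, &block)
    @out << "<#{name}>"
    @out << text.to_s if text
    instance_eval(&block) if block
    @out << "</#{name}>"
    @out
  end

  # One method per tag; a real library would cover the whole spec,
  # plus attributes, escaping, doctypes, and so on.
  %i[html head title body h1 p].each do |t|
    define_method(t) { |text = nil, &b| tag(t, text, &b) }
  end
end

puts TinyHtml.new.html {
  head { title "Hello" }
  body do
    h1 "Hi there"
    p "Markup as plain old Ruby."
  end
}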

I had put effort into it. Friends had put effort into it by giving me their ideas and by putting up with my jabbering. Talking and thinking. And then I got the go-ahead from my boss to write and release it, and wrote the bulk of it in an evening, and polished from there. It is a stupid HTML generator, but it was a piece that fit perfectly into the way I work. I had more discussions with friends about it, and crystallized it into the above blog post.

I threw in a little script that converted HTML into Ruby code that would generate something somewhat close to the HTML that it was fed. Being a fan of _why’s stuff in general, I used HPricot. _why ended up deprecating HPricot and bitrot set in. Friends would come over to my desk and tell me I ought to switch it to Nokogiri. Part of it was laziness, and part of it was that I had an attachment to _why’s code, but I still haven’t dropped HPricot. (In fact, I’ve just recently used it in another project, for a one-off script.) Hoshi also uses _why’s metaid, incidentally.
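
For the curious, the converter boiled down to something like the sketch below. This is not the actual script that ships with Hoshi, just a simplified illustration of the idea: walk the tree HPricot builds and print nested Ruby calls, ignoring attributes, comments, and the other fiddly bits the real thing has to deal with. (HPricot being as old as it is, it may not even build on a current Ruby, which rather proves the point of this post.)

require 'hpricot'

# Walk the parse tree and emit Ruby-ish calls for each node.
def to_ruby(node, depth = 0)
  indent = '  ' * depth
  case node
  when Hpricot::Doc
    node.children.map { |kid| to_ruby(kid, depth) }.join
  when Hpricot::Elem
    inner = node.children.to_a.map { |kid| to_ruby(kid, depth + 1) }.join
    "#{indent}#{node.name} do\n#{inner}#{indent}end\n"
  when Hpricot::Text
    text = node.to_s.strip
    text.empty? ? '' : "#{indent}text #{text.inspect}\n"
  else
    '' # comments, doctypes, and friends get dropped in this sketch
  end
end

puts to_ruby(Hpricot('<p>Hello, <em>world</em>.</p>'))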

And _why opined: “programming is rather thankless. you see your works become replaced by superior works in a year. unable to run at all in a few more.” That’s the crux.

I don’t have the arrogance to suspect that he was talking about Hoshi there, and I’d be pretty surprised if he had even heard of it. _why had his fingers in a number of pies, only one of them being Markaby (and even I wouldn’t say that Hoshi was “superior”, but I can say that it fits my style better). But it was very fresh on my mind, since I had just written that entry, and it made a lasting impression on me when he wrote that.

His code was going away. Your code was going away. My code was going away. Huge swaths of my code are already gone: obsolete or locked up in the vaults of dead companies and now buried under volcanic ash. I’m not old enough to have written a body of code in a dead language, but if I’m lucky enough to survive to old age, I am certain that I will have outlived many of the languages and technologies I’ve used.

…(And the problem is big)…

Small, simple codebases have better survival characteristics, but even they don’t survive the abandonment of a language, operating system, or machine architecture. Sometimes they don’t even survive a big enough update to any of their dependencies. I used SVGAlib a lot in my late teens; the deprecation of that one library has killed off a lot of my code. _why had a good point here, though: “if you program and want any longevity to your work, make a game. all else recycles, but people rewrite architectures to keep games alive.” The code itself doesn’t often survive, but the binary programs continue to live and run. And the situation is getting worse. When there are no servers left to authorize you to play a DRM-crippled game, what will you do? When you lose access to a game that has no physical media and heavy DRM, how will you show it to your children or your grandchildren? Will Sony still be selling PS3 games in 50 years? Will Sony still exist as they do now?

“Rights holders” in the game industry have, in my opinion, been very poor stewards. You can read Julius Caesar’s Commentarii de Bello Gallico in the original Latin, but finding the hardware to play, for example, Seiken Densetsu, is difficult. (Not to imply or invite a comparison of their relative value to history; I mean to discuss the medium.) The original software has not made it into the public domain, but it is no longer sold. The hardware it ran on is also no longer sold. The game had an impression on me. I even named Watts after a minor character in it (although there were half a dozen reasons the name stuck). For most of the people reading this, the only way to share the experience of playing this game is piracy and emulation, if the system is even popular enough for a person or team to go to the trouble of reverse-engineering, documenting, and producing an emulator.

A shining counterexample here, well worth pointing out, is id Software. They publish and free their code, and as a result nearly every device that exists can play DOOM; there is even a Plan 9 port.

When the languages and compilers all die, a body of knowledge dies with them. Very few programmers can read enough PDP-10 assembly to make sense of chunks of HAKMEM.

And then there is the “programmer caste” problem. Lots of people don’t own computers. The vast majority of those that do merely stumble through clicking, because a paternal, condescending “product designer” has thought it best to obscure programming and hide the actual machine from the people that use it. There’s an increasing body of people with closed-hardware phones and tablets as their only computing devices, and they couldn’t learn to code if they wanted to. Not everyone has the interest or inclination, but people on the edge turn back when even getting started is an ordeal.

Maybe I’ll sound crazy when I say this, or if you’ve read what I’ve written, maybe I won’t. Maybe I’ve sounded increasingly crazy as you read. But the production of closed software and hardware destroys the past, and selling a device that its users can’t easily program promotes technical illiteracy. In this view, and it is my view, these practices are insulting and arrogant if not downright malicious.

…or part of the solution

Open source code with permissive licensing solves part of this problem. But code falls into obsolescence daily. A hundred-year language doesn’t last nearly long enough. A fifty million-year language might. But, realistically, we cannot even build a very good hundred-year language. (I hold out some hope for the Urbit system, but we do not have it yet.) Some languages don’t last ten years.

Open architectures that don’t patronize users by walling off “restricted areas” solve a little more of the problem. Helping people become technically literate by letting them see the inside of the system is another step in the right direction. One of my favorite things about Inferno is that the source code is right there in /appl. Plan 9 ships with all of its source in /sys/src. You do not need to type anything to have the code. You aren’t forced to read it, and it is not in your way, but all you need to do in order to read or change the code is to look in the right directory. No further downloads required. Even Debian makes you jump through some hoops to get the source code, and it’s even stingy about including C header files, a mistake repeated by nearly all of the popular Linux distributions. (While I’m inviting flames, perhaps I can head some of them off, or at least encourage a different type of flame. I speculate that the overhead of processing and storing dependencies for thousands and thousands of -dev packages outweighs the on-disk size of the header files they would otherwise include; and, going past speculation, I can state definitively that figuring out and documenting for other coders the extra .debs I need has cost me more time and frustration than a few extra kilobytes ever would.)

Finally, and maybe you’ll think I’ve gone all the way off the deep end here, or maybe you’ll think I’ve nailed it (I’m partial to the second option, personally), but I would like the suppression of software to carry the same stigma that burdens the suppression of any other ideas. These boxes are not any more magical than books, and software is no more magical than the letters covering the books. I see no reason that digitizing something should change the rules. Preventing the distribution of code as a general rule seems to me to be on par with the burning of books, as does obtusely breaking interoperability.

Half-optimism.

I think the way forward is to open up the code, and to stop welding the hardware shut. I don’t think a society that mystifies computing will make proper use of it. I think that Steve Jobs was correct when he said that computers are a multiplier for talent (but not when he locked down the iWhatevers). The corollary is that I don’t think business as it is done today in this industry will persist for too many more generations. So I am optimistic that, within a few hundred years, computing will have returned to a more open state. I’m less optimistic that it will happen in my lifetime, and vaguely nervous about the remote possibility that too much knowledge will go by the wayside for computing to continue.

We in the present, though, are not passive observers where the future is concerned. We shape it. I intend to do what I can. Among _why’s activities, he seemed to love teaching people to code.

Especially programming for kids. I’d be lying if I said I wasn’t thinking about his work when I made my own, much more modest attempt to make Inferno easy to play with.

So I’m half-optimistic. I think the future can be great, and I enjoy doing what I can to promote that future.

What money I do have does get placed where my mouth is

I think the future that I want can be achieved. Maybe I’m crazy and we’re all better off with DRM, locked-down hardware, closed software, patents on bubble sort, and the sorry state of “layman programming”. But I’ll be working towards it. Luckily for anyone that thinks I’m crazy, I’m just one person. Not everyone’s grim vision of the future comes to fruition. We vote, sort of, as a species, and although time and chance happen to all, we more resemble a stumbling child than complete randomness. That is, we can occasionally manage to make some of the progress we’re trying for.

I don’t speak officially on behalf of my company on this blog, so take what I’m saying with a grain of salt. REVERSO Labs is a small consulting company right now, but we do have something bigger cooking. We intend to make and keep that code open-source, and make our money elsewhere. Companies producing phones are not making most of their money by selling the software (with the semi-exception of Apple, RIM, and other walled gardens), so I have some confidence that we will be able to get by with open-source software just fine. I intend to do better than “just fine”, but no plan survives contact with the enemy. If you’ve read _why’s recent novella, maybe that sounds absurd, and that’s fine.

There was that.

Dictated; not read. —The Management

HN again

Hi, Hacker News! People that are not here from Hacker News: my friend jbrhee has posted this there after complaining that I don’t have a comments section.

