The story so far

September 20, 2013

Gabe asked me to write about what I’ve learned in the last six months or so, and how my view of computing and data and tech has changed, so I am going to do that.

I didn’t start off as a total n00b, exactly; I would have considered myself almost a power user in a lot of the applications I spent a lot of time in. I definitely have my generation’s affinity for technology, starting from way back when. I was running DOS commands when I was 6, I remember that, and I remember my mom, who was just fearless with new tech, would always have to spend an hour or so hacking around on our old CGA-monitored IBM machine before we could get some dumb floppy disk game to work, like “Duck Tales” or “Commander Keen” or “Duke Nukem.” I remember taking little after-school programming courses in elementary school, learning rudimentary QBasic… I remember having a readily apparent baseline of competency for stuff like that… I remember setting up a telnet connection in 4th grade to my friend Ben’s house so we could instant message each other (I wonder whatever happened to him… Ben from Salem VA…) and freaking out because it was so cool. I’ve often thought and said that if I hadn’t discovered music, I would probably have spent a lot more time with computers in middle and high school; I probably would have gotten way more under the hood about it, at some point. But I never did. The closest I ever got was trying to learn a little HTML in 10th grade, because Mia from across the street was learning it and she was absolutely killing it; it was just brutal; it was so dying. I never took to it, though, most probably because HTML is ultimately a visual medium, and I didn’t express myself visually in any meaningful way as a teenager (in other words, I sucked real bad). Mia did express herself visually, and does, and is still finding new and various “its” to murder with great vigor and skill.

And at some point in college, around the time that we were all deciding our majors, deciding on what we “wanted to do” when we “grew up,” and tacitly supporting the supposition that that decision should or even meaningfully could be made at age 19 based on the life experience of a teenager… my idea of what computers are capable of, or rather, what I might be capable of using computers to do, changed subtly.

Andy from down the hall was the only comp sci major I knew, and once he said something about how hard the curriculum was, and how much work it was, and from then on, way back in the back of my mind, I flagged “computer programming” with things like “difficult” and “specialized” and “a lot of work.” … So it never really occurred to me, after that, to learn programming for fun. It became a domain of knowledge that felt closed off, rarefied. Besides, I was then as I am now: busy with music (the mode I DID express myself with as a teenager), which is ALSO a highly specialized and challenging skill that takes years of blah blah blah how-could-why spend-do new-stuff bork? This was compounded by a peculiar combination of procrastination and perfectionism: when I want to learn something, I really, really want to learn it starting from the absolute most basic level, and that has often short-circuited my curiosity by allowing me to convince myself that it’s not even worth starting something if I’m not going to learn the whole thing all at once. With a little maturity I realize that, while perhaps a noble impulse, this is almost never the best way to approach a body of knowledge; you wouldn’t try to learn Italian by starting with Latin. Even though learning Latin first would probably make you wicked sick at Italian wicked quick, you would have invested way too much time in the ancillary skill. That isn’t practical on any meaningful scale.

This is all to say that the hardest part about this experience so far has been shedding the vague and completely erroneous assumption that programming or development or computer science or whatever the hell you want to call it is somehow inaccessible to me. Because it’s not. At all. It never really was, but it’s never been more accessible than it is now, what with the whole internet thing. Have you heard of this internet thing? It’s crazy. Basically the whole sum of human knowledge, just about, and you carry it around in your fucking pocket. Just for the record: that is not something to be blasé about.

And that is the biggest change in my view, as well. That this stuff is approachable. That it’s really fun to model systems and objects and procedures in software. That I can do it, and that all the resources I would need to gain the skills to (someday, after tons of work) do it really well are free-as-in-beer, easy to get, or both.


So that is one way to look at what has changed in my mind recently, and in my opinion it is the most profound shift; it is also, I think, not really what Gabe was asking.

The idea that any type of information can be modelled and manipulated using massive streams of binary data is fundamental to computer science. It is literally the atomic level of the whole thing. And before, it was the most obscured piece of the puzzle. I knew that that was how computers functioned, I even vaguely understood that programming languages are high-level abstractions of that binary data, but I had nooooo ideaaaa at alllll what came between the two, let alone how logical systems could be constructed. Then I read Code, by Charles Petzold.
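(Before I get to the book: just to make that bits-and-abstractions idea concrete, here’s a toy sketch in Python, purely my own illustration and not anything from the book, showing the same few characters as text, as bytes, and as raw bits.)

    # The same information ("Hi!") viewed at different levels of abstraction,
    # from a high-level string down to the bits underneath.
    text = "Hi!"

    raw = text.encode("ascii")               # as bytes
    print(list(raw))                         # [72, 105, 33]
    print([hex(b) for b in raw])             # ['0x48', '0x69', '0x21']
    print([format(b, "08b") for b in raw])   # ['01001000', '01101001', '00100001']

    # A high-level language lets you forget all of that and just say:
    print(text.upper())                      # HI!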

I bought Code completely on a whim. It was the only book in the whole comp sci section that actually looked good. The cover is gorgeous. Simple. Plain. Turns out it’s a classic, and for good reason. It starts from absolute zero… ZERO. It starts with what a code, any code at all, even is, then works through binary and hexadecimal and octal number systems, and theoretical mechanical computational systems, and logic gates. I mean… logic gates, wow. I had no idea, and now I do! Also it’s really clearly written, which I can already tell is a rarity in that genre…
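If you’ve never seen a logic gate, here’s the flavor of it in a few lines of Python (again, my own toy sketch, not something out of the book): treat each gate as a tiny function on 0s and 1s, wire two of them together, and you already have a circuit that can add.

    # Toy logic gates as tiny functions on single bits (0 or 1)
    def AND(a, b): return a & b
    def XOR(a, b): return a ^ b

    # Wire them together and you get a half adder: it adds two bits
    # and reports the sum bit and the carry bit.
    def half_adder(a, b):
        return XOR(a, b), AND(a, b)   # (sum, carry)

    print(half_adder(0, 1))  # (1, 0)  -> 0 + 1 = 1
    print(half_adder(1, 1))  # (0, 1)  -> 1 + 1 = 10 in binary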

That is not to say that the book didn’t get miles above my head at a certain point… probably about halfway through, after he explained early processor architecture… and I’m looking forward to reading it again when I have more perspective. That is probably now, actually… I read it pretty early on. It just answered so many questions for me… and now I at least have a cursory understanding of computing at every level from the UI all the way down to the bits. Cursory… just a basic roadmap, but that’s pretty huge. I wish I had read that book years ago, but then again, I can’t really say for sure that I would have understood it in the same way. But the “magic” is gone, in all the right ways… I don’t think of the computer as a black box anymore; I think of it as a really sophisticated conglomeration of a huge number of tiny, really unsophisticated processes. That is not to say that it’s not still magical, just not magic.

And so, in the end, I got my fundamental understanding after all. Sort of. Don’t get me wrong, I still don’t know all that much, but I sure know orders of magnitude more than I did.

One more thing…

Computing changes really, really fast, but from a consumer’s perspective, it seems to change even faster than it actually does. From the outside in, completely superficial adjustments to user interface and design can seem to transform the entire system. That impression is really stupid, of course; operating system designers don’t reinvent the wheel with every incremental update, and I shouldn’t have been surprised to learn that OS X Mountain Lion is built on top of the 45-year-old UNIX architecture, but I totally was. I was also surprised to learn that so many programming languages have so much in common with each other, and that they generally represent a fairly small number of schools of thought on software design that each go back decades. In other words, even though there is an enormous amount to learn about, it turns out it is a LOT less than I thought it was, and it’s also all connected in very systematic and understandable ways.
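For what it’s worth, here’s a dumb little example of what I mean by “schools of thought”: the same job done in an imperative style and then in a functional style (a toy sketch in Python; the ideas themselves are decades older than the language).

    # The same task two ways: sum the squares of the even numbers.
    nums = [1, 2, 3, 4, 5, 6]

    # Imperative style: spell out each step and mutate a running total.
    total = 0
    for n in nums:
        if n % 2 == 0:
            total += n * n
    print(total)  # 56

    # Functional style: describe the result as a transformation of the data.
    print(sum(n * n for n in nums if n % 2 == 0))  # 56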