Software as Storytelling, Part 1

When I was younger I remember hearing people talk about computers as quasi-mystical entities. It's such a pervasive narrative that I think everyone has heard it. "It's just a bunch of ones and zeroes," people say, half reverentially, half dismissively. It's probably the only thing most people know about how computers work.

And it makes sense to be turned off by this fact: the sequence 01010110 is repellent. The average college graduate isn't equipped with the tools to have any intuition about what it means. Further, they will probably (and quite correctly) intuit that it is meaningless: some ones and zeroes, computer gobbledygook. When they determine that it's meaningless, that's usually the end of the story. "I don't understand this" here is mostly synonymous with "I can't understand this." It's the 21st century's version of "It's all Greek to me," except ostensibly you could learn Greek, if you really put your mind to it. No such belief prevails about computers.

So the fact that computers run on binary code is a cheap truth. Everybody knows it. And cheap though it may be, it has deep implications for how we design software. As programmers, once we've achieved the milestone of being able to make computers generally do what we want, our goal becomes to create systems that humans can have intuitions about. We can write software that does something, but if other people can't figure out how to use it, it's not useful. When writing software, any software at all, our goal is to hide computation behind narrative: to plaster over the repellent with the comprehensible.

Hello, World

For software developers, "It's all just a bunch of ones and zeroes" is only half of the story. Crucially, it omits the bridge from meaninglessness to meaning. The full sentence is this: "It's just a bunch of ones and zeroes that people have agreed mean something." Take 01010110. There's a generally agreed-upon way to interpret sequences of zeroes and ones as a number. It's called binary, or base-2 encoding. If you choose to treat it as a binary number, its value is equal to 86. If you treat it as a binary representation of a character, it's the letter V. You might ask here: since 01010110 is equal to 86 and to V, does that mean 86 and V are equal to each other? In the context of computing, the answer is yes, they can be, if it's useful to you. There's no universal truth that makes this the case: people just decided.

You wake up, chained, in a cave. You see shadows on the wall, shrug, and think: this is my life now.

[If you're wondering how to convert binary numbers to decimal numbers or to characters, many tutorials have been written about it. The "fact" that 86 is equal to V is defined in the ASCII character system. Tellingly, ASCII stands for "American Standard Code for Information Interchange." ASCII is a standard - an arbitrary set of rules agreed on by people. There's another widely used, much more complicated standard for representing characters called Unicode. It is decidedly less reasonable than ASCII.]

[In addition to binary numbers (also called base-2) and decimal numbers (base-10), a system frequently used in computers is base-16, also called hexadecimal. Rather than having 2 values (0, 1) like binary, or 10 values (0, 1, 2, 3, 4, 5, 6, 7, 8, 9) like decimal, it has 16 values (0, 1, 2, 3, 4, 5, 6, 7, 8, 9, A, B, C, D, E, F). Base-16 numbers are frequently denoted by putting 0x in front of them: for example, 0xA435F is a base-16 number equal to 672607 in base-10.]
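
If you'd rather not take my word for any of these conversions, a programming language will do the arithmetic for you. Here's a quick sketch in Python (a language we'll meet properly a little later); the specific values are just the examples from above:

print(int("01010110", 2))    # read the sequence as a base-2 number: prints 86
print(chr(86))               # read 86 as a character: prints V
print(0xA435F)               # read a base-16 literal back out in base-10: prints 672607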

Having established that binary numbers aren't inherently meaningful, but that it's useful for people to agree on some meanings for them, we might ask: how do you use them? Here's the basic model: you have some "registers" that can hold binary numbers. These registers have names, which are also totally arbitrary. By convention, you have agreed that these binary numbers can mean five distinct things. You can call these five things primitives - the smallest concepts available to you, from which you construct other concepts. They can mean:

  1. An integer: 14351
  2. A floating point number: 145.345
  3. A sequence of characters (a "string", in computer science parlance): "The quick Greek fox jumped over the lazy dog."
  4. A memory reference - there is a storage area where you can place sequences of binary numbers. They're placed in locations that are described by binary numbers called memory references.
  5. A command to execute - the processor has a defined set of commands that it knows how to perform. These commands are represented by, you probably guessed it, binary numbers. They can do various things: add integers, copy information from one memory reference to another location in memory, or send a command to the operating system (this is how your program is ultimately useful: the operating system runs, or "interprets", your program, and the program sends commands back to the operating system).

You've got 8 registers, which hold binary numbers, which can mean different things. You can add numbers, you can send commands to the operating system, you can copy information around. A reasonable person might respond here: Fine, but what the fuck am I supposed to do with this?

In 2017, it turns out the answer for most programmers is: not a lot. Let's present a basic program that works with registers, lifted happily from someone else's tutorial. The program is on the left, and on the right are some comments. Glance over it. Take in the gestalt. Don't worry about understanding it, but do try to make sure your eyes see every single character. Soak it in.

section     .text
global      _start                            ;must be declared for linker (ld)

_start:                                       ;tell linker entry point

    mov     edx,len                           ;message length
    mov     ecx,msg                           ;message to write
    mov     ebx,1                             ;file descriptor (stdout)
    mov     eax,4                             ;system call number (sys_write)
    int     0x80                              ;call kernel

    mov     eax,1                             ;system call number (sys_exit)
    int     0x80                              ;call kernel

section     .data

msg     db  'Hello, world!',0xa               ;our dear string
len     equ $ - msg                           ;length of our dear string

Much like the binary sequence 01010110, it's repellent. If you're not an experienced software developer, the act of even looking at each of the characters was probably a little difficult. Let's put this program in narrative form to try to make a little more sense of it.

We move the value "len" to the register named "edx". Then we move the memory reference "msg" to the register named "ecx". Put the value 1 (represented as 00000001 internally) into the register named "ebx". Put the value 4 into the register named "eax". Then tell the operating system to run the command whose number is in register eax, i.e. command 4. The operating system runs that command, grabbing the information it needs from the other registers. It takes the memory reference and the length you provided, gets a sequence of binary numbers from memory, and sends it to stdout, which treats that sequence as a string.

If you didn't follow that, it's fine. It's actually kind of the point: the narrative value here is awful. A reasonable person, our Joe Sixpack with a college degree, still might not know what this program is even supposed to do. It prints out the string of characters "Hello, world!". It requires a great deal of understanding that can't be gained a priori. It's unreasonable.

The language shown is called assembly language, and it is widely regarded as a terrible language to write programs in, because writing programs in assembly language is confusing as hell. It is surpassingly difficult to use the primitives available to you (integers, strings, memory references, system commands) to write correct, useful software. Take a moment to ponder that: expert programmers cannot consistently write correct code in assembly language, which is built around the basic model of how computer processors work. Put another way: people cannot consistently reason correctly about how computer processors work. It's just too damned difficult.

Overwhelmed by heat and smoke and the endless serpentine dance of shadow, you begin to panic.

Let's pause this train of thought momentarily to talk about expertise.

A Digression on Remembering Horribly Long Strings of Numbers

In the field of cognitive psychology it is generally agreed that people can, at any moment, hold somewhere between 5 and 7 different things in memory simultaneously. This is tested by reading a person a sequence of numbers and having them repeat the string back: people generally top out somewhere in that range.

A professor that I'm having trouble googling for right now took a student and had him practice remembering long strings of numbers every day. He did this for weeks. He kept doing it. After an unreasonably long period of time, the student eventually began to remember longer strings. Eventually he was able to remember horribly long numbers, hundreds of digits long. Of note, his ability did not transfer to tasks other than remembering long strings of numbers. His general memory didn't improve, just his ability to remember sequences of numbers. This experiment has proven reproducible.

What gives? The prevailing theory is something called "chunking", where a person learns to group discrete primitive concepts together into one, more complex thing. It becomes a new primitive. The explanation goes something like this: Jane Six-digit gets so much practice remembering strings of numbers that she is able to remember long consecutive chunks as single entities in her memory. Thus, rather than remembering four digits like "1, 5, 3, and 8", she remembers four chunks like "653234, 5458, 34058, and 3489".

Of course, you do this too, all of the time. Think of the area code on your phone number - although it's made up of three digits, you probably lump them together in your head. Now imagine that sort of association writ large across the set of all possible combinations of integers. Neat. But not very useful.

Consider this phenomenon in a more interesting domain: chess. When you're learning chess, the first thing you need to keep in mind is how the pieces move. It's hard to think about strategies before you have the basic mechanics down. Once you memorize how the pieces move, you can start to build experience with different scenarios. Soon, rather than each move being a disconnected action, you can see patterns start to form. You can classify moves, and then sequences of moves, and then whole games. A chess master is better not because they're smarter, but because they have more sophisticated primitive concepts for describing the game.

Feet still shackled, you spread your arms out, and you begin to notice some of the shadows on the wall changing shape in response.

Hello, Abstraction

Let's take the program we just wrote and write it in another language. Why not Python:

print "Hello, world!"

From the perspective of a person running it, this program will do exactly the same thing as the previous "hello, world" program written in assembly language. We'll talk about what's there in a second, but first, let's enumerate some of the things you didn't have to think about: 

  1. Registers
  2. Integers
  3. The kernel
  4. The "mov" command
  5. The length of the string you're trying to print
  6. Where you're trying to print that string
  7. Calls to the system
  8. Locations in memory
  9. Hexadecimal numbers
  10. Linkers

There are probably more, but that's what I could think of in two minutes. Think about this in terms of cognitive load, and the limit of 5-7 things in working memory we talked about before. We can reasonably come to the conclusion that the first program is actually harder to think about.

Let's try putting this new program in narrative form:

We call the function "print" and pass into it the string "Hello, world!" It sends the string "Hello, world!" to stdout.

And... we're done. You probably still don't know what stdout is, and that's fine. By the time you've finished reading this, you... you still won't. It almost never matters. 

Here are the primitives that we're dealing with, one of which we've seen already, and one of which we're introducing for the first time:

  1. A string
  2. A function

What's new here is the idea of a function. If you think back to algebra class in high school, you'll probably remember seeing things like 3 = x + 2, and having to solve for x. You might also remember seeing things like f(x) = x + 2. This is a function. You could read this as "f of x is equal to x + 2." or "f is a function that takes a value of x and adds 2 to it."

The concept of a function in programming is very much derived from the concept of function you've seen in algebra. If you wanted to write a similar function in Python, it would look like this:

def f(x):
    return x + 2
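
Using it looks much like the algebra, too. If you then write this (the 3 is an arbitrary value picked for illustration):

print(f(3))

the program prints 5, because f(3) evaluates to 5.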

Functions let you group a set of things together and make it into one, larger thing. The example of adding two to another number isn't particularly compelling: it's hard to argue that that's any better than just adding two and being done with it. But then, think about the two "hello world" programs. I would find it impossible to argue with a straight face that the assembly program is better. It's less clear, takes longer to write, and is harder to check for correctness, even for expert programmers. There's no upside.

If you're following along, you might notice that there's an analogy here between the concept of functions and chunking. Functions let you group things together; chunking lets you group ideas together. Chunking is how your brain lets you manage complexity. Functions are how programmers manage complexity in software. Functions allow us to have narrative flows like "print the message", rather than "place four different values in the right register, and then invoke the kernel." Functions allow us to hide computation behind narrative. We're not instructing the processor to set registers anymore, we're printing.
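
To make that concrete, here's a minimal sketch in Python (the function name say_hello is mine; there's nothing standard about it):

def say_hello():
    # Everything the assembly version spelled out - registers, lengths,
    # system calls - is hidden behind this one line.
    print("Hello, world!")

say_hello()

The person reading say_hello() doesn't need to know anything underneath it. They just need the story: this says hello.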

Now, printing "hello world" to stdout, whatever that is, isn't particularly interesting. But what if you had a function that could tell you the time of day? Or render an image on your screen? Or send an email? Or send a convincing apology to your aunt and uncle out in Oregon for not ever sending back a thank you note for the hundred dollar traveler's check they sent you, which you forgot about and then felt too guilty to cash and then lost? Suddenly you've entered into the realm of Things People Actually Care About.

It may be an obvious point, but it's worth being explicit here that computer programming languages themselves are software written by people, for people. When people think about a programming language, they might say "programming languages are for telling computers what to do." This is true enough, but misses an important detail. Programming languages are for allowing people to tell computers what to do. Computers only answer to ones and zeroes. Programming languages are software whose purpose is to shield us from the complexities of that truth.

The cavern you're in is vast. Mostly you can't see it - you're facing a wall. You try to turn your head, but you find that it's immobilized. You shout "Hello?", and it rings out into the cavern. The only response is your own voice, reflected off of distant and unseen faces. Still, the shadows move.

A Regrettable Assembly

Assembly language was written, and because writing in it is terrible, in short order people tried to write other, better languages. To varying degrees, they succeeded. C and C++ are the ones you've likely heard of. C's main improvements over assembly language are the concepts of variables and functions. Still, it's incredibly difficult to write correct C code. Like assembly, it deals in a set of primitives that are difficult to reason about. A solid portion of all hacks are the result of vulnerabilities in incorrectly written C code. The heart of the problem is this: the narrative of what the code does is too complex to reliably comprehend. The language is broken.

If you're thinking, "wait, people should just learn to stop writing bad code!", think about it in a different domain: it's not inevitable that you're going to die in a car crash. Still, it's statistically inevitable that people are going to die in car crashes. This is a problem with cars. If we had sane public transportation everywhere, there would be hundreds of thousands fewer deaths each year. Likewise: if one person writes buggy code, fine. It wasn't inevitable that that happened. If a hundred thousand people write code with the same bug, it's the language's fault.

Incidentally, the software in cars is written in C. While it's unlikely in any particular case, it's statistically inevitable that people will die in car crashes caused by buggy code written in C. Although a cursory google search doesn't turn anything up, it has probably already happened.

C inspired a host of languages, which are grouped together taxonomically and called "C-like" languages. Java, C#, Python, and Perl are some of the more common ones. They all improve on C in terms of narrative capacity, but they also all inherit C's cardinal mistake - null values.
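
Python's null value is called None, and the failure mode looks something like this (a deliberately broken sketch of my own, not code from any real program):

name = None          # somewhere along the way, a value quietly ends up null
print(name.upper())  # AttributeError: 'NoneType' object has no attribute 'upper'

The program reads fine, and it runs fine, right up until it reaches that second line - which is often far away from wherever the None actually came from.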

Time passes. You don't know how much. You don't become hungry, or thirsty. You do start to get sleepy and soon you're drifting in and out of consciousness. Your dreams are also of the cave. You see the shadows on the wall start to take coherent shapes, intricate arabesque geometries, endlessly composed of themselves, morphing and reforming. The word fractal pops into your head. You're not sure what it means. A memory appears: a kindergarten teacher (your kindergarten teacher?) in a sunny room pointing to a blurry projected image, saying the word "fractal" over and over again. Another memory: A child pointing to something out of view, asking a mother (your mother?) "is that a fractal?" and a curt response, "Yes, but it's a fractal of bad design."