If you look at pictures of steam locomotives from the 19th century, you'll notice something interesting: rather than being covered up, the moving parts are decorated, even exaggerated. Sewing machines from this era are another good example. Early typewriters likewise. The lacquer begins wherever the brass stops. Pistons and levers have details and decorations cast straight into them. The style was very much to embrace the technical implementation and embellish it, though granted at the time there wasn't a lot of technical detail to embellish.

Beginning in the interwar period of the 20th century, the trend became to cover the machinery up. First it was ostensibly for practical purposes in industrial and military applications, for instance to save money on fuel or steer the aircraft. The practice eventually mutated, as such practices do, into a marketing gimmick to sell chrome-plated living rooms on wheels.

If you ever get a chance to crack the hood on one of those finned behemoths, you'll note how small the engine—still enormous by today's standards—looks in contrast to the rest of the chassis, with gaping voids to the asphalt beneath. The guts of a contemporary iPad are also pretty impressive: the circuitry that actually does the work of being an iPad occupies a small corner of the device, and the rest is battery pack. A denuded Tesla is similar.

In a lecture from 2007, Fred Brooks advanced the notion that there are no more naïve technologies left in the West. Rather, they are now composite, second-order and higher abstractions from physical reality. As such it is now possible to have designers—experts—of all kinds, busy making products and services for other people to use, all the while possessing no idea how the medium they use every day actually works. I wonder in earnest how smart a practice that is.

The Medium Will Always Win

When Sullivan wrote form [ever] follows function in 1896, he wasn't making a normative statement. It wasn't a pro tip to the modernists still in grade school to please not go overboard with the cladding. He was saying that if a thing has a certain behaviour, it can't help but come out looking a certain way. And I submit that certain way is an essential part of our relationship to artifacts in general—something to bear in mind in an era when mimicry is the norm, when an object can look just about any way we want.

If you want a feel for how such a basic transfer of embedded meaning can break down, try using any object made by Philippe Starck or Karim Rashid.

The idea is that the physical reality of the medium, whatever it is, will always find a way to mess with your pristine design, no matter how much you try to spackle over it. Like a dandelion sprout piercing its way through a concrete sidewalk, the force is unstoppable. Best to take it into account.

Seconds later in the aforementioned lecture, Brooks goaded: how many of us can take a pile of sand and produce a computer? I don't know if it's necessary for everybody to know how to do that, but it probably wouldn't hurt to know that computers are currently made out of sand, and a whole bunch of other things about them.

Leaving the Details to the Help

As somebody who has little to no moral difficulty learning a new skill as a means to an end, I'm often puzzled by what I detect to be almost a pride in technical ignorance, an almost Victorian aesthetic in which not getting one's hands dirty, and thus not needing to know how things work, is a sign of elevated social status. The clichéd example involves, of course, one of the most significant inventions of the modern era: the automobile. The line goes something like I don't have to know how it works. I just get in and go.

The problem with that reasoning is that you do have to know a few fundamental things about how a car works, aside from how to drive it. You have to have just enough of a working theory of certain aspects of its implementation if for no other reason than to know when to take it to a mechanic. You have to know that it takes gas, and not Diesel—or vice versa. You have to know that the tires need air and that the radiator needs water. You have to know that you have to turn the engine on before you can go anywhere. It's just that the car has been part of our culture for over a century, so we take these details for granted.

The switch to electric power is notable because of what it takes away: there are numerous cases of pedestrians being hit by a Toyota Prius because they're expecting to hear engine noise. In a similar vein, the role of the ignition—there is no fuel to ignite—has commuted to a safety function, because with an electric motor, you otherwise literally could just get in and go.

Painted Into a Corner

All these decades of hiding the details have altered cultural expectations, even among designers, who, just like every other specialist, are laypeople outside their own narrow domains of expertise. But the meta-medium of electronic, digital information technology transcends these classical domains to such an extent that it is a mistake to put it in the same category.

Take a hard problem, like the UX of cryptography. There is a lot of clamour these days, in the wake of revelations of head-smackingly obvious—that is, if you understood how any of this stuff worked—mass government surveillance, for more usable means of communicating with the assurance that nobody is eavesdropping. The underlying problem, I submit, is that in order to use cryptography effectively, you have to understand some fundamental things about the way information works. Okay? Forget computers. The usability problem of crypto would persist whether you were using a computer or not. Then you pile on seven decades of technical expedients, conceived in dimly-lit back offices, and you get the mess we're in today.

The challenge is analogous to when you're driving and the fuel light comes on: as a user, you have to understand what certain events during the use of a system, stemming from its physical reality, mean. For designers to gloss over those details because they don't understand them is like erasing from the blueprint the lid that covers the gas cap, because you didn't like the way it looked, and not mentioning that little factoid to the customer: to refuel, you have to bring your car to a service station where they remove the body with a crane that only specialists have access to, because it costs more than the car. Meanwhile the customer is stopped dead in the middle of the highway, wondering what happened, because he was told all he has to know is get in and go. Ah, but look at the sleek, uninterrupted lines.

They Have to Learn on Their Own

Consider a specific UX-of-crypto problem like key verification. The user emphatically cannot maintain a complete ignorance of how the system works in no fewer than two places: First, the key must be verified through a communication channel other than the one you are using, lest it be tampered with. Second, the key you are presented with must match, every time, the one you obtained through that other channel, or the current channel is compromised—and then what do you do?

For the extra-paranoid: you also have to understand that the second communication channel has to be difficult to intercept, at least for long enough to verify the key. All of this understanding is overarched by the fundamental concept that it doesn't matter if you're communicating privately if you can't be sure who you're talking to.
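Stripped of any particular product, those two checks are small enough to sketch in a few lines. What follows is a minimal illustration, not anything lifted from a real messaging client; the key-fetching arrangement and the out-of-band fingerprint value are hypothetical placeholders.

```python
# A sketch of the two checks above, assuming a hypothetical client that hands
# us the peer's raw public key bytes. Nothing here is from a real product.
import hashlib
import hmac

def fingerprint(public_key: bytes) -> str:
    """Boil a public key down to a short digest a human can read over the phone."""
    return hashlib.sha256(public_key).hexdigest()

# Check one: this value came through some *other* channel (read aloud in person,
# scanned from a QR code, whatever), not the channel you're trying to secure.
FINGERPRINT_FROM_OTHER_CHANNEL = "placeholder: the digest you verified out-of-band"

def channel_still_trustworthy(presented_key: bytes) -> bool:
    """Check two: the key presented in this channel must match, every single time."""
    return hmac.compare_digest(
        fingerprint(presented_key),
        FINGERPRINT_FROM_OTHER_CHANNEL,
    )
```

If that second check ever fails, you're right back at the question above: and then what do you do?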

Some might argue that the user doesn't have to know about any of this key verification business if we delegate it to trusted authorities, through public key infrastructure like X.509, which is currently a part of what keeps your credit card number from being stolen when you buy things from Amazon. The second channel in this case is that through which your computer or device was shipped, with the authorities' certificates embedded in its operating system. But X.509 is spectacularly complex, and all it can really assert is that somebody paid to have this or that domain name associated with a cryptographic signature. Well, you can beat that with a cert you bought with a stolen credit card, sitting in a coffee shop and forcing everybody's traffic through your laptop. Or you could be the NSA and simply command Verisign to sign your dirty keys and do the same basic thing, just with the internet backbone routers instead of the Starbucks wi-fi.
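If you want to see just how little that assertion amounts to, you can pull a live certificate apart yourself. Here is a rough sketch using Python's standard library (the host name is only an example); everything the whole apparatus vouches for fits in a handful of fields.

```python
# Peek at what an X.509 certificate actually binds together: a domain name, an
# issuer who vouched for it, and an expiry date. Note it says nothing about who
# operates the site or whether you should trust them with anything.
import socket
import ssl

def peek_at_certificate(host: str, port: int = 443) -> dict:
    # create_default_context() loads the authority certificates that shipped
    # with your operating system or Python build: the "second channel".
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port)) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return tls.getpeercert()

cert = peek_at_certificate("www.example.com")
print(cert["subject"])    # the domain name somebody paid to have signed
print(cert["issuer"])     # the authority that took the money
print(cert["notAfter"])   # the expiry date behind those scary browser warnings
```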

Moreover, just about every other week there is a new demonstration of how to fool X.509, which is to say nothing of the hyperbolic false positives due to benign misconfigurations, or the extortion tactic of certificate expiry, which undoubtedly happen millions of times a day.

That big scary modal dialogue box that pops up in front of your web browser and screams OMG UR COMPUTAR IS BEIGN HAX0R3D!!!11 when it almost certainly isn't. That.

Oh, and if you're thinking of using meatspace interaction to bootstrap a web of trust that you can continue extending digitally, consider how easy it is for bots to infiltrate Facebook networks. Just make the avatar a picture of a moderately attractive woman. Cinch.

This is a process you can't do for somebody; they have to do it themselves. A turn-key solution is impossible because the outcome of the process is meaningless if you, as an intermediary, completely envelop it. You can't just do it for them, because the integrity of the system relies specifically on them not needing to trust you.

Technocrat's Gambit

In an ironic twist, the hyper-specialized, let-me-get-that-for-you service economy we've developed over the bulk of the last century is directly impeding our ability to solve a problem which is fundamentally couched in observing—and subsequently interpreting—phenomena yourself and taking nobody else's word for it. That really is what crypto is about: trusting nothing but your own senses, because anybody in the chain between you and your compatriot could be in cahoots with the enemy.

I mean enemy here as any entity you don't want peering over your shoulder. It's worth noting that cryptographically-secured communication can only really carry on in relatively free societies, since agents of a more repressive regime could just observe that you're communicating clandestinely, throw you in a panel van, and simply beat your secrets out of you.

How we present scenarios like key verification to users is an information design problem, and we can do our best to make that process amenable to the strengths of human perception and cognition. But in order to make that design successful, people are going to have to understand what they are looking at, and they are going to have to care. They have to care enough to understand, and they have to understand enough to care. The designers of those processes, unfortunately, don't care to understand, because those details are the purview of the engineers. The engineers might care, but with something like cryptography, often not enough to understand either.

But then, we still live in a world where passwords are stuck to monitors with Post-It notes. Hell, we still live in a world with passwords.

Ask me about alternatives! There are plenty, many of which are not nearly as boneheaded as a damn fingerprint scanner.

Be Prepared To Be Good Neighbours

My long bet for the future of society as a whole is that we won't be able to sustain this strict orthogonality of expertise that we currently observe. We simply won't be able to say that's not my department, especially in so-called tech, which, as I mentioned, notwithstanding the business of manufacturing its own equipment, is more of an aspect of all domains of specialist knowledge than a domain in its own right. When it colludes with the silos that precede it, tech vastly amplifies the range of exploitative behaviour those silos perform as a matter of course, and turns the potential for abuse from a retail-level affair into a wholesale one. In order to adapt, everybody is going to have to know just a little bit about everybody else's business, lest some tireless, automated system suck them dry.

We really are imploding, as McLuhan wrote, into a global village, where everybody has to know something about everybody else's business, if for no other reason than to be able to call them out on their bullshit. That, incidentally, is the proximate, practical motivation for a user experience designer to understand how computers work as a medium. The deeper, subtler reason is that it will make you a better designer, equipped to solve problems which are more meaningful to society than the best new way to share funny cat pictures or whatever.