Information is the substance that we use, both as individual people, piloting our everyday lives, and as agents acting on behalf of organizations and social groups of every conceivable kind. Without accurate, timely, noise-free, and well-structured information, we can't perceive our surroundings, make good decisions, acquire new knowledge and capabilities, or plan for the future.
The corporate entity, whether a company, government, institution or non-profit, is the principal mechanism by which our society concentrates resources and coordinates activity, for the ultimate purpose of perpetuating the project we know as human civilization. If we abstract the particular kinds of resources and activity with which any particular corporate entity is concerned, we can see that at root, every one is an information processor.
Over the millennia, we humans have developed numerous tools and methods for recording, transporting, and operating on information. These range from the inventions of writing and mathematics themselves, to the printing press, the postal system, the telegraph and telephone, radio and television, as well as various recording media: from clay tablets through papyrus, vellum and paper, to wax cylinders and vinyl discs, to film, to magnetic tape and optical discs. The practice of making information understandable is likewise evident in specific arrangements of information, such as the map, double-entry bookkeeping, the balance sheet and income statement, the Gantt chart, the flow chart, the organization chart, as well as various filing systems and library cataloguing schemes.
In the wake of the Second World War, over 70 years ago, our society became acquainted with the computer. Originally designed to calculate artillery trajectories, the computer took less than ten years to find its first commercial application: to support booking tickets in the budding passenger airline industry. At the time, computers were astronomically expensive tools that could be wielded only by the most massive corporations, institutions, and governments.
Innovations in transportation are no stranger to innovations in information management: it was the American railroads, a century earlier, that invented, largely as a byproduct of just trying to do their job, our contemporary understanding of corporate finance, securities and derivatives, corporate accounting, and the modern corporation itself.
Six decades on from when the computer was first put to work in a commercial setting, we still see a divide. Despite computers becoming a million times more powerful at a millionth the cost, it is still only the mega-corporations and purpose-built technology firms that take full advantage. Why?
My firm belief is that this state of affairs is primarily an issue of mythology—of superstition. There exists a cultural trope that slates computers as arcane. While our corporate offices have been universally run for over two decades on digital files, email, and websites, these still largely mimic the paper-based infrastructure that preceded them. The prevailing attitude among executives toward "technology", as it's called, despite being around for generations, is one of cost reduction. The story goes something like: "Technology isn't our core competency, so we outsource it to experts."

This is a sad story. It's sad because any business entity is guaranteed to be happily using all sorts of other technology that predates computers, or is indeed run by computers, albeit sufficiently disguising its computer-ness.

Again, there is something arcane about computers which alienates them from the common understanding as an ordinary tool which can be used for any purpose to which it is suitable—except, again, by large corporations or those which are firmly planted in the "technology" sector.
Our society never had a problem adopting paper, vinyl records, celluloid film, the telephone, cassette tapes, VCRs, CDs, et cetera. Sure, there were famous dissenters against each new medium as it rolled out, but they were a minority against an inexorable tide of change. None of these media possessed the occult status which still shrouds the computer to this day.
There is a paradox such that we don't seem to have much trouble using computers disguised as other things—even though they behave much more like computers than the objects of their disguises. We're perfectly content to type into a word processor or spreadsheet, send email, take pictures with our digital cameras, record TV shows on our PVRs, play video games, and download all manner of apps to our smartphones.
And then there are sprawling masses like Facebook which have no analogue, uh, analogue.
Being literally surrounded by these computers-in-drag, I submit, blinds us to what the computer, in its natural state, is for. In a word, and hopefully an obvious one: computation. Computation is simply applying a known process to known information, to reveal latent information, that up to that point was unknown. Computation has been around a lot longer than computers; there's nothing spooky or magisterial about it. It's putting a quarter into a gumball machine and turning the crank. The only difference now is that a computer can turn a few billion cranks every second.
While there are plenty of examples of special-purpose computation, such as a tip calculator app for your phone, I can only think of one example of bona fide general-purpose computation that everybody ought to be familiar with: the spreadsheet. This is a thing whose job is exactly to apply known processes to known information to glean new information. What's more is that you are responsible for getting the information you care about, and composing the operations that yield more information, which, presumably, you also care about.
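The spreadsheet's trick can be sketched in a few lines of ordinary code. Here is a hypothetical invoice, rendered as the same three ingredients: known information (the line items and tax rate, which are invented for illustration), a known process (the formulas), and latent information (the totals) that the process reveals:

```python
# A spreadsheet in miniature: known inputs plus known formulas yield
# information that was latent until we turned the crank.
# All figures below are made up for the sake of the example.

line_items = [
    {"description": "widgets", "unit_price": 4.50, "quantity": 200},
    {"description": "gadgets", "unit_price": 12.00, "quantity": 75},
]
TAX_RATE = 0.13  # an assumed rate, as one might key into a cell

# The "formulas": apply a known process to the known information.
subtotal = sum(item["unit_price"] * item["quantity"] for item in line_items)
tax = subtotal * TAX_RATE
total = subtotal + tax

print(f"subtotal: {subtotal:.2f}")  # prints "subtotal: 1800.00"
print(f"tax:      {tax:.2f}")       # prints "tax:      234.00"
print(f"total:    {total:.2f}")     # prints "total:    2034.00"
```

Nothing here is beyond what any spreadsheet user already does; the point is only that the same gesture—compose operations over information you care about—is what general-purpose computation is.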
"Nobody ever got fired for buying IBM."
When we take the attitude that computation is "technology", and therefore best left to the technologists, we miss out on an entire dimension of opportunity. We have to wait for somebody to make an "app", or more dauntingly, a "system". This attitude paints us into a corner, and we pay for it in more ways than one.
Systems have their own embedded policies which dictate how, and even if, certain things are to be done.
Off-the-shelf software products are great for addressing specific, well-defined needs. In order to determine whether an existing product will satisfy our needs, we have to sharpen each and every need to a point. This is exactly the same process we would have to carry out if we were going to make the software from scratch. Indeed, "from scratch" is a concept which is long dead in software, as virtually all development consists of bending, shimming, gluing together, and repackaging a set of solutions to well-defined needs, in order to meet new ones.
The alternative to "buy" is "build", which is scary, because it's risky. Bespoke software takes a long time and a lot of money in the best of situations; it is almost guaranteed to take longer, and therefore cost more, than expected; it tends to underperform upon delivery; and it has a reasonable chance of failing completely.
At least, that's what we can expect when bespoke software is done the conventional way.
This, I am confident, is another symptom of our failed mythology. The risk in software development is completely attributable to an ill-fitting metaphor for what software is: namely, that the goal is to acquire new technology, and our chief concern is whether or not we can. This is nonsense: of course we can. C.R. Smith, erstwhile president of American Airlines, knew SABRE was possible in 1954. It wasn't a matter of "can we do it?" but "what do we want it to be?"
This misguided obsession with technical capability permeates our culture and our contracts. Budgets get devoted mostly to technical concerns. Progress is registered only in terms of completion of technically functioning software, so there is a huge incentive to get programmers writing code as early as possible. It doesn't matter if they have to rewrite the same thing ten times. As a result, the research and design work that would prevent them from having to do so gets squeezed out, and piles of money get wasted.
The naïve, but painfully common process for procuring bespoke software goes something like this: draw up a list of features, write it into an RFP, accept a bid priced by the hour, sign a contract stipulating a sequence of milestones with payment contingent on each, and grind through them until the vendor declares the project "done" so he or she can get paid and get out of the contract.
It isn't a surprise that many executives shy away from software development, and that for many more, it isn't even on the radar. A mediocre thing, that you can buy, that "works"—at least on technical terms—is a lot better for your professional reputation than the outcome of the aforementioned process—even if it doesn't actually work in any meaningful way.
The first problem with this process is the way in which it is conceived. The feature, we must understand, is a unit of technical work, either already banked by the vendor or some third party, or otherwise a future task to be undertaken by a programmer. In reality, it's several layers removed from the actual needs of the organization. The person writing up the RFP may indeed be an expert in the needs of the organization, but is decidedly not an expert in translating those needs into technical tasks, and can therefore only speak of the project's requirements in terms of things he or she has seen before.
The second problem is that vendors, by and large, are perfectly content to take a list of features from the client at face value. The second problem and a half is the arbitrary assignment of hours to the development of features not already accounted for.
The third problem is the contractual stipulation of a sequence of milestones, and making payment contingent on doing specific things in a specific order. All this does is ratchet up the risk and give the vendor an incentive to create a Potemkin product.
In short, software development is modeled after the construction project, with a strong emphasis on structural engineering, but very little on architecture, and completely without the built-in concern for wasting expensive materials and labour, or for that matter, collapsing and killing people.
A Note on Process
We practitioners have indeed long been aware of better ways to make software. Incremental development was identified as the optimal approach as far back as the 1970s, and it has since matured into the Agile methodologies, DevOps and continuous integration, Lean UX, and incremental strategies for research and testing. My argument here is that the thing standing in the way of fully realizing these methodologies is the procurement and contracting process, itself limited by outsiders' hobbled understanding of our trade, a state of affairs in which we have been complicit, if not one we have actively abetted.
The new mythology I propose is to consider the computer, and more specifically software, not as a technology with all its focus on technical capability, but rather as an expressive medium, with a focus on content. The accompanying narrative is that computers are mature, and have been for a while. Computers are affordable and they work reliably, and their properties have been well understood for generations. The "can we?" question has long been settled. Our problem is rather that we can't stop creating garbage content.
The film industry is one that understands this concept extremely well. Producers and directors don't start out by wondering whether or not the camera will work. What they do do, however, is spend up to years in development and pre-production, fine-tuning the script, generating storyboards and concept art, costumes, sets, props, special effects, et cetera, long before they even produce a shooting schedule, let alone shoot the film.
The "content" of software is crystallized business process. It is an extremely verbose, extremely precise incantation, describing exactly how a particular process ought to behave—so precise that it can be executed by machine. There is nothing inherently technical about a business process, by and large: it's simply people doing things to achieve meaningful goals. That's something anybody can understand.
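To make "crystallized business process" concrete, here is a toy approval rule rendered as executable code. The policy and its thresholds are invented for illustration, not drawn from any real organization; the point is that the code is nothing more than a precise restatement of something anybody in the office could explain out loud:

```python
# A hypothetical purchase-approval policy, stated precisely enough
# for a machine to carry it out. The thresholds are made up.

def route_purchase(amount: float, is_recurring: bool) -> str:
    """Decide who must approve a purchase request."""
    if amount < 500:
        return "auto-approved"          # petty-cash territory
    if amount < 10_000:
        return "manager"                # routine departmental spend
    if is_recurring:
        return "finance committee"      # ongoing commitments get scrutiny
    return "executive sign-off"         # large one-off purchases

print(route_purchase(120, False))     # prints "auto-approved"
print(route_purchase(2_500, False))   # prints "manager"
print(route_purchase(50_000, True))   # prints "finance committee"
```

Nothing in this sketch requires technical insight to evaluate; whether the rule is *right* is a business question, and the code merely holds the answer still.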
A business executive doesn't have to care how the process ultimately works, just as a movie producer doesn't have to care how the camera works. The movie producer understands, however, that it is a colossal waste of money to just point the camera at some actors and start shooting. This fact was established in the early days of film production, and the industry promptly invented its own idiosyncratic administrative methods for dealing with the realities of the medium. Sixty years from its first major commercial undertaking, we still have yet to do the same for software.
Why? Because software is still considered "technology", rather than content. The way film production has become a comparatively reliable process is by starting with a skeleton—the script—and filling in the details, until the details are so well-defined that it becomes feasible to draw up a budget and schedule to shoot. We typically don't have to blow up cars or hire celebrities in software development, so it isn't sufficient to merely supplant the construction paradigm with the filmmaking one. It is, however, important to recognize that the computer is an idiosyncratic medium that needs special consideration in order to reliably produce effective results.
A business executive who understands that a computer is a tool for computation, that computation is the process of generating valuable information, and that information confers competitive advantage, is all of a sudden enlightened to a whole new world of possibility. Computers and software cease to be a necessary evil—a cost to be mitigated—as the upside finally becomes visible. Moreover, just as the movie producer has learned to recognize storyboards and concept art as progress, so too can the executive with the precursor materials generated through the process of software development. This can only come to pass though, I am confident, if we reframe software as the content of a mature medium, rather than occult technological voodoo.
The story that software is this whizz-bang substance at the bleeding edge of technology, and that we are geeky wizards who type obscure symbols into machines to produce magic, albeit enticing to many, is ultimately hurting us. It hurts us as practitioners because we are not seen as credible people who exist inside reality, and are genuinely concerned with solving real-world problems. This in turn hurts society, by severely limiting the ways in which we can help.
If we, as practitioners, can sever our addiction to the role of geeky priesthood, and start telling the ordinary people around us the story that computers are normal, everyday things that have properties which are directly useful to them—and not just as a vehicle for YouTube videos and Candy Crush—then maybe we can stimulate demand for us to do things the right way. Demand, that is, that comes from a place of understanding, rather than superstition.