Google's stated mission has long been to organize the world's information. What they mean is that they want to intermediate the world's information: to insinuate themselves between you and the knowledge you need to conduct your life. This other meaning manifests partly in their own designs, but also in the way they have responded to events outside their immediate control.

Mise en Scène

We have to begin somewhere, so it might as well be with the JavaScript engine V8, which was developed in parallel with the initial release of Chrome. While it is desirable to have a fast Web browser, we must remember that a major cost centre at Google is crawling Web pages. Consider that at the time, the use of AJAX (Asynchronous JavaScript and XML) had proliferated in the so-called Web 2.0 development style.

Such Web properties must have cost orders of magnitude more per page to crawl than conventional pages, because rather than simply scanning a plain document for text, you would have to set up a virtual machine, execute a program, obtain its result, and scan that. Moreover, because JavaScript is Turing-complete, you need to take steps to protect the rest of the system from runaway processes and malicious code. No doubt a lot of this was one-time setup cost, and Google is really good at it by now, but a fast JavaScript engine is at least as important on the crawler side as it is in an end-user product.
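To make the cost difference concrete, here is a minimal sketch of the two crawl strategies, assuming Puppeteer as the sandboxed browser; the function names and options are illustrative, not a description of Google's actual crawler:

```typescript
import puppeteer from 'puppeteer';

// A static page: one HTTP round trip, then scan the markup for text.
async function crawlStatic(url: string): Promise<string> {
  const response = await fetch(url);
  return response.text();
}

// An AJAX-era page: boot a sandboxed browser, run its scripts to
// completion, and only then read out the document they produced.
async function crawlRendered(url: string): Promise<string> {
  const browser = await puppeteer.launch(); // a whole VM-like process
  try {
    const page = await browser.newPage();
    await page.goto(url, { waitUntil: 'networkidle0' }); // wait for scripts
    return await page.content(); // serialized DOM after execution
  } finally {
    await browser.close(); // contain runaway or malicious code
  }
}
```

Whatever the real pipeline looks like, the rendered path pays for a browser process, script execution, and network settling on every page; the static path pays for one HTTP round trip.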

Commercial Diplomacy

Technology standardization is commercial diplomacy and the purpose of individual players (as with all diplomats) is to expand one's area of economic influence while defending sovereign territory.

Stephen R. Walli, Understanding Technology Standardization Efforts

The installation of a Google employee, Ian Hickson, as the editor of the HTML5 specification is suspect. Hickson's stated position on structured data has been remarkably congruent with the interests of his employer: that there was no point to it, because so much unstructured data already exists that it would have to be organized by artificial intelligence anyway. His actions as specification editor reflected as much: open hostility toward RDFa, and only as much interest in semantic markup as is conducive to current practices in search engine optimization or client-side application development.

For a time, Google was keen on rich snippets: the use of semantic metadata it found on the Web to embellish its own search result pages with photos, ratings, and the like. Later, having digested the experience, Google teamed up with Microsoft and Yahoo to create a one-stop simplified ontology, purpose-built for search engine optimization, known as schema.org. This ontology, while theoretically expressible as RDF, discards many of RDF's features. You would have to perform major surgery on any schema.org content you consume in order to use it with other RDF data. In other words, the principal beneficiaries of the production of schema.org metacontent are none other than the consortium's proponents.
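To see what that surgery looks like, consider a sketch along these lines. The JSON-LD snippet is typical of rich-snippet markup; the mapping to Dublin Core terms is a hypothetical example of the term-by-term translation you would have to hand-roll:

```typescript
// A schema.org JSON-LD snippet of the kind rich snippets consume.
const snippet = {
  '@context': 'https://schema.org',
  '@type': 'Book',
  name: 'Example Title',
  author: { '@type': 'Person', name: 'A. N. Author' },
};

// To merge this with other RDF data you must decide, property by
// property, what each schema.org term "really" means in your target
// vocabulary. This mapping is illustrative, not a standard one.
const toDublinCore: Record<string, string> = {
  name: 'http://purl.org/dc/terms/title',
  author: 'http://purl.org/dc/terms/creator',
};

const remapped = Object.fromEntries(
  Object.entries(snippet)
    .filter(([key]) => key in toDublinCore)
    .map(([key, value]): [string, unknown] => [toDublinCore[key], value]),
);
```

Multiply that hand-rolled mapping across every property and every type you care about, and "major surgery" starts to look like an understatement.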

Google has also announced that it wishes to kill the URL, on the grounds that URLs can be spoofed, that they confuse people, that they are long and ugly, and that mobile screens are too small to fit them. Specifically, they want to de-emphasize the URL both as a signal of authority over content and as an orienteering aid for the user. Taken together with their Accelerated Mobile Pages initiative, this proposition is somewhat disturbing, because the acceleration in AMP is supplied by Google itself, which literally intermediates your requested content through its own cache infrastructure. So if you remove URLs from the user interface, and you're serving the content yourself, what's stopping you from pulling out of the Web entirely?

As the proprietor of YouTube, Google has an interest in, and is a chief proponent of, the Encrypted Media Extensions specification from the W3C. This controversial standard is considered to be a compromise on the part of the W3C, with the argument that it is better to standardize DRM than it is to have a zillion competing proprietary systems and potentially another browser war. Whether you buy that position or not, the standard has been implemented, though it currently covers only audiovisual content.

The Coming Executable Web

This situation gets more interesting when you consider that WebAssembly is quickly becoming a viable mechanism for delivering executable code: Wasm can be understood as a compilation target for basically any programming language, not just JavaScript. Your Web browser becomes a virtual machine in which apps run at something close to native speed.
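From the page's side, the whole arrangement reduces to a couple of calls against the standard WebAssembly JavaScript API. A minimal sketch; the module path and the exported compute function are hypothetical:

```typescript
// Fetch a compiled module and call into it. The .wasm payload could
// have been compiled from C, Rust, Go, or nearly anything else.
async function runWasmApp(): Promise<number> {
  const { instance } = await WebAssembly.instantiateStreaming(
    fetch('/app/module.wasm'),
  );
  // From here on, the "page" is just a host for compiled code.
  const compute = instance.exports.compute as (x: number) => number;
  return compute(42);
}
```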

One of the fundamental characteristics of the World-Wide Web, present since the very beginning, is that you can simply view source on any part of it and, if you persevere, learn how any particular effect is achieved. With AJAX, and minified JavaScript frameworks, and all manner of obfuscation and voodoo, we are already at the point where view source doesn't cash out the way it used to, but with enough elbow grease it can at least be understood. Even compared to this, Wasm is completely unintelligible.
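By way of illustration, here is the trajectory in miniature; the minified output is representative of what a typical minifier produces, not the output of any particular tool:

```typescript
// What the author wrote, and what view source once showed:
function totalPrice(items: { price: number }[]): number {
  return items.reduce((sum, item) => sum + item.price, 0);
}
// What view source shows after a typical minifier pass
// (representative, hand-written): decipherable, given elbow grease.
//   function t(n){return n.reduce((r,e)=>r+e.price,0)}
// The Wasm equivalent is a binary stream that begins with the magic
// bytes 0x00 0x61 0x73 0x6d ("\0asm"): nothing to read at all.
```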

Now, take this state of affairs and apply EME to it. Imagine that a few years down the road it seems expedient to take the infrastructure which is already present and ubiquitous, and create Version 1.1 of Encrypted Media Extensions, which also applies to code. At that point, not only has view source been completely obliterated, but it would actually be a crime, at least in the United States under the DMCA's anticircumvention provisions, to try to do what has been taken for granted for the last 30 years.

The Web seems to be marching inexorably toward becoming a programmer's medium. If you try to load a Web page with JavaScript disabled, it's an even bet you will see a blank screen. Turn JavaScript back on, and you'll see something that looks not much different from what you might have seen years ago, except that under the hood it's gobbledygook.
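A sketch of the pattern that produces the blank screen, with an illustrative element id: the server sends an empty shell, and every visible element is conjured at runtime.

```typescript
// The markup the server actually sends amounts to:
//   <body><div id="root"></div><script src="bundle.min.js"></script></body>
// With JavaScript disabled, that empty <div> is the whole page.
document.addEventListener('DOMContentLoaded', () => {
  const root = document.getElementById('root');
  if (!root) return;
  const article = document.createElement('article');
  article.textContent = 'Everything you see is built here, at runtime.';
  root.append(article);
});
```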

A Web page that executes binary code in a VM, in a browser that does not expose the URL, sourcing its content from a centralized server farm, is not a Web page: it's an app. Whether by grand scheme or by instinct, Google is quietly moving its pieces into position to control that ecosystem.

This is not so much a conspiracy theory as an opportunism theory: if Google acts in its own interest, it will find itself in a position to continue to act in its own interest. The side effect will be the shrink-wrapping, and concomitant suffocation, of the open Web.