Standardization and the Open Web

We’re done arguing over the importance of web standards. Many advocacy websites created to promote web standards, such as Chris Heilmann’s Web Standards for Business and The Web Standards Project, haven’t been updated since the mid-2000s. Accessibility, stability, quality control, and ease of use are just a few of the reasons modern web developers cite as to why standards are important, and today’s arguments against them seem to stem from a desire to customize one’s tools within the scope of a single language rather than from a critique of web standards in general.

The language we use to describe our projects and communities has come to reflect the idea that standardization is important. For example, the JSON API homepage states that it is a “Standard for building APIs in JSON.” The FAQ page describes JSON API as a specification, and developers are talking about its use in terms of compliance. A competing project, HAL, references the visual language of standardization on its website – the flow of the page is reminiscent of a formal Request for Comments – before directing you to the actual Internet Engineering Task Force RFC.
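
To ground the comparison, JSON API’s conventions dictate how resources, their attributes, and their relationships appear in a response body, served with the application/vnd.api+json media type. The snippet below is a minimal, illustrative document in that shape; the resource types and IDs are invented for this example, so defer to the specification itself for the normative details.

    {
      "data": {
        "type": "articles",
        "id": "1",
        "attributes": {
          "title": "Standardization and the Open Web"
        },
        "relationships": {
          "author": {
            "data": { "type": "people", "id": "9" }
          }
        }
      }
    }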

What’s in a standard?

These projects illustrate a conflation of ideas about standards which, left unaddressed, can lead to confusion for the broader community over time. The JSON API specification is a de facto standard – an idea for a best practice for common use that its authors are spreading organically. The ECMAScript specification, on the other hand, is a voluntary consensus standard, meaning that standards-setting bodies and industry consortia have agreed to adopt this specification and create incentives for implementation. JSON (not to be confused with JSON API) actually has two competing voluntary consensus specifications: one with the standards group Ecma, the other with the IETF. While the term ‘standard’ is used in all of these cases, the specifications are not contextually the same. We also see RFCs for specifications that will never become standards because they are theoretical ideas for how something might work; ergo, all standards have specifications, but not all specifications are standards.

The next community debate isn’t about web standards; it’s about how web standards should be standardized.

‘Official’ standards are those specifications which have gone through a process of voluntary consensus. To that end, there is potentially a clear path for projects like JSON API to evolve from a de facto specification to one that is officially standardized through voluntary consensus:

  1. Developer identifies problem, proposes solution to peers;
  2. Peer community provides feedback, proposes potential alternate solutions, conversation continues in channels like GitHub or Google Groups;
  3. Peer community reaches mass consensus, hands specification off to a standards body;
  4. Developers implement solution (while standards body formalizes the standard).

This seems like a straightforward idea – most developers I know are smart, resourceful, and prefer the path of least resistance. And thanks to the ‘all bugs are shallow’ mentality of the OSS community, they’re inclined to work together to solve issues of mutual concern. It’s exciting to think that the next web standards might come from the developer community, but in practice this path to official standardization has become obscured. The Responsive Images Community Group experienced this firsthand when it proposed a specification for the <picture> element. Noting an issue with the way HTML handled responsive images, the RICG proposed a developer-built, de facto solution to the Web Hypertext Application Technology Working Group (WHATWG), maintainers of the ‘living’ HTML standard. In a well-documented series of events, WHATWG practically dismissed the developer solution in favor of a vaguely specified alternative they created over the course of a few days. If it weren’t for the passion and persistence of the community and of RICG leadership, the developer solution would’ve been defeated.
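
For readers who haven’t followed the responsive images saga, the element the RICG championed lets authors declare multiple image sources and leave the selection to the browser. A simplified sketch of the markup is below; the breakpoints and file names are invented for illustration, and the full syntax (including srcset descriptors and sizes) is defined in the HTML specification.

    <picture>
      <!-- breakpoints and file names here are placeholders, not part of any spec -->
      <source media="(min-width: 1024px)" srcset="hero-large.jpg">
      <source media="(min-width: 640px)" srcset="hero-medium.jpg">
      <!-- the img element is required and serves as the fallback source -->
      <img src="hero-small.jpg" alt="Description of the image">
    </picture>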

The RICG specification was ultimately accepted by WHATWG and the W3C, but the experience of the standardization process certainly left a bad taste in developers’ mouths. It would be easy enough to focus our attention on improving this process for community groups like RICG, and the web would certainly be a better place for developers if we did so – but wouldn’t it be nice if we could define standardization not as ‘a process that makes technology,’ but as ‘a process that makes agreements about technology’?

In reality, open standardization is a fundamentally power-laden and political process, and it’s making its way into how we think about open source project and community governance. Put in terms of Eric Raymond’s seminal essay, we’ve built web technologies in the bazaar style of the open source development ethos, but standardizing those technologies is a cathedral-building activity. As we seek to standardize technology, we need to recognize the tension inherent in building cathedrals that will later become central authorities for us to reject. Our challenge is to capitalize on the benefits of standardization processes without eroding our community ideals.

When ideals compete

‘Openness’ is a core ideal in the Open Web community, as well as something of a polluted word. The rhetoric of openness is meant to communicate a favorable set of values, and those values often depend on the speaker and the audience. In his book Open Standards and the Digital Age, Professor Andrew Russell notes that “for individuals, ‘open’ is shorthand for transparent, welcoming, participatory, and entrepreneurial; for society at large, open signifies a vast increase in the flow of goods and information through a global, market-oriented system of exchange.” In the absence of a single definition that suits all parties, we tend to take ‘open’ to mean ‘inclusive of everything’.

Standardization, on the other hand, is often a process that defines what something is in terms of what it is not. Russell notes that the broader societal goal of standardizing technology is to create a “cohesive and flexible network” that can sustain complex social and economic activity. Thus, the process of making standards means incorporating a wide range of practices and ideas with political, economic, and cultural dimensions, all of which may be of strategic importance to creators, implementers, end users, and the general public. Put this way, standards are technically oriented instances of diplomacy.

Charles Bachman recognized the diplomatic requirement of standards-making when he founded an ISO subcommittee in 1977 to develop open working standards, called Open Systems Interconnection. In the establishing papers, he noted that “the adjective ‘open’ means to imply that all participants come to the system as equal partners.” In reality, participants don’t often come to the table as equal partners – the OSI’s own progress was stymied by organizational power plays and the growth of a competing technology, TCP/IP – but equality as an ideal of open standards-making has remained. This ideal is rooted in a deeply held opposition to centralized power, which, according to Russell, is reflected in the histories of many standards-setting organizations. Upholding this vision of equality while achieving successful implementation at times meant hiding conflicts and issues from those outside the meeting room – not exactly the transparent behavior one might expect from an open system.

If standards really are agreements between equal parties, then the agreement is the controlling authority. And if standards-setting is a rejection of centralized control, then the standardization process becomes one of creative destruction. It’s the ideological circle of open standards-making life: a group makes a consensus standard on some technology; as the standard circulates, a new party arises to point out a flaw or an unconsidered use case for the existing standard. The original group then has to make room for the new party and rework the standard, or else face rejection of the group and the standard. In rejecting the original group, the offended party forms a competing group and standard, and the cycle begins anew.

Modern organizations, old governance models

Open source communities can learn a lot from the histories and governance models of standards organizations – indeed, web standards consortia like Ecma International and the W3C already have similar organizational structures, but it’s helpful to understand the prior art upon which we are laying our community standards-setting foundation. After all, the “keep what works” mentality only works in the long run if you understand why it works in the first place.

“Good programmers know what to write. Great ones know what to rewrite (and reuse).” – Eric Raymond

The ideological origins of web standards bodies come from efforts to standardize telegraphy and engineering as far back as the 1850s. The evolution of those standards is a history rife with its own community crises and ideals, with the early ‘rejection of centralized control’ directed primarily at the government and unregulated businesses such as Western Union. Most early standardization attempts were internal: telegraph network operators at Western Union realized that standardizing human and technical components internally would result in economies of scale as the company bought out others and grew its network.

Early industrial committees like the American Society of Civil Engineers, the American Society of Mechanical Engineers, and the American Institute of Electrical Engineers began to form in response to the chaotic nature of late 19th-century American industrial society. Many groups hosted regular ‘congresses’ – Victorian-era precursors to today’s web development conferences – which helped to further define the identity of the professional engineer and solidify the idea that professionalism meant lending one’s expertise to the search for societal values like order and reform. But even the professional setting of an engineering society couldn’t keep rival factions from forming – organizational disputes often arose between the practicing ‘shop culture’ engineers and the academically minded ‘school culture’ engineers.

As engineering disciplines began to overlap, it became clear that cooperation between industrial societies would be necessary. In 1918, the American Engineering Standards Committee formed to encourage cooperation and coordination of standards between groups. Coordinating consensus among multiple engineering organizations, each comprised of a diverse pool of engineers from a diverse set of companies, nearly proved to be an impossible task, but the resulting structure of this “organization of organizations” has stood the test of time. Although the AESC has undergone many changes in its nearly 100 years of existence – it’s known today as the American National Standards Institute – the model of standards-making and governance it created is now reflected in the standards groups that followed, such as Ecma International and ISO.

Today’s web standards bodies are very much informed by these early professional societies. Ecma International, the standards body responsible for ECMAScript, was originally formed in 1961. Like the AESC, it comprises four primary units: a General Assembly of dues-paying organizations, which appoints and controls its Management; a basic Management structure consisting of a President, Vice-President, and Treasurer; a Secretariat, also appointed by the General Assembly; and a Co-ordinating Committee of no more than eight members that makes recommendations to the General Assembly regarding the formation, activities, reorganization, or dissolution of technical committees. The standards work itself is done within technical committees, which are formed by the Secretariat and comprised of Ecma members. The technical committee specifically responsible for ECMAScript (JavaScript) is TC39, and while that group is good about documenting its proceedings for the JS community, its meetings are not open to the public and are subject to Ecma rules and by-laws.

The W3C has a different structure and approach that is nonetheless a reflection of the older organizations – Tim Berners-Lee founded the W3C in 1994 out of frustration and distrust of the Internet standards process, which in the early 1990s could have been described as a turf war between the competing OSI and IETF standards efforts. The consortium is administered jointly by four host institutions – MIT CSAIL, Keio University in Japan, Beihang University in China, and the European Research Consortium for Informatics and Mathematics – and has a staff of about 80 people, including an executive and business development team, distributed across the globe. The W3C charges a fee for membership, but does not prevent non-members from participating in the working groups that ultimately create standards.

In discussing the governance and structure of web standards bodies as a reflection and critique of the past, WHATWG stands out as something of an outlier. WHATWG formed in 2004 as a group of individuals rejecting the W3C’s focus on XML technology over HTML. There is no apparent way to become a WHATWG member, and no apparent process for organizational decision-making. The FAQ page vaguely references a group of invite-only members who have the power to name or override editors of a specification, but these individuals are not named. Indeed, the only publicly named and confirmed ‘member’ of WHATWG is Ian Hickson, who is also the editor of the WHATWG HTML specification. Paradoxically, this makes WHATWG both a rejection of centralized control over the HTML specification and the most centrally controlled standards body on the web.

It’s complicated.

It’s a tangled web we weave, standardizing the Open Web – the political, economic, and social relationships between people, technologies, companies, and industry groups are difficult to ascertain at a glance. On closer inspection, one can see that these organizations and communities are complex systems forming a complex network – so complex that I was compelled to create this interactive Open Standards network graph to help keep it all straight as I researched.

Before we rush off to create a complex, decentralized network of open source communities and standards groups, it probably warrants mentioning that complex systems fail 100% of the time. A decentralized network may let us fail smaller in most cases, but the key to the longevity of the system is failing smart – and if the research has taught me anything, it’s that standardization fails on the human element, not the technological one. For better or worse, complexity is not viral – so to mitigate this, we need to make the complexity of the standardization system consumable without abstracting away meaningful parts of the process.

In the absence of community coordination, method-less enthusiasm will ensue, and somewhere, caught in the Bermuda Triangle of competing standards bodies, implementers, and developers, is the user. If we want our community-driven projects to become official, internationally recognized standards, we need to start by making sure that each project’s values are clearly stated and reflected in its governance process. Whether it’s an organization on GitHub or a 501(c)(6), open source community leaders need to understand how their participation process reflects the values of the project and serves its goals. Without such clarifying statements and shared understanding, debates like the one between Node.js and io.js can look like petty disputes over technological control. In an environment like that, neither technology will be officially standardized anytime soon.

Open Web Standardization Map
