The computer industry is chock-full of terms. Many of them are abbreviations: MS-DOS, IRQ, W3C, etc. Some of them are merely numbers, with occasional descriptors: 640KB, Port 80, and 8, the number of bits per byte. What all of these terms have in common is that they are precise, and that is why one term in particular, "Web 2.0," seems to generate so much furor among the software developers and spacey dreamers in Silicon Valley. No one is quite sure what it means.
In reasonable, well-written essays, authors usually define their terms before using them repeatedly. For the sake of mimicking an unfortunate reality, I will defer my own precise meaning for the term until later on. In the meantime, I'll take the same approach that computer book publisher Tim O'Reilly did in his September 2005 essay on Web 2.0's definition: substituting examples for an explicit definition, since one isn't really available. Be forewarned: in its vernacular usage, Web 2.0 is both a noun and an adjective.
YouTube, Facebook, and Wikipedia are all very Web 2.0, as they all encourage "user-generated content." On the other hand, you can't leave out Flickr, Meebo, and Wufoo. They are Web 2.0 because their names, deformed by the scarcity of English .coms in recent years, are not real words, nor are they combinations of two or more words, as company names of a previous generation were. Then again, company name aside, a web site created using Ruby on Rails is also Web 2.0. This is because the employees of software outfit 37signals, which the press says is obviously Web 2.0, use Ruby on Rails. (No one uses Ruby off Rails.) You might say that Digg, an archetypal Web 2.0 company, merits distinction because it has one million users and its web site features very prominently rounded corners. Users and rounded corners allow for the "disintermediation" of media companies, which will make print newspapers and television stations obsolete, according to the business partner of prominent Web 2.0 blogger Michael Arrington. Indeed, blogging is also Web 2.0, and clearly important, as bloggers who write specifically about Web 2.0 have courted venture capital, and even hired their own personal Chief Executive Officers, not for the Web 2.0 companies they write about, but for their writing hobbies themselves. And so long as Web 2.0 channels millions of dollars of other people's money to Web 2.0 champions, those champions tend to say that Web 2.0 champions "the free flow of information," so that everyone else will hop on the bandwagon, whether it's legal or not.
After reading the preceding paragraph it should be fairly clear that in its current form this concept of a second iteration of the Web is amorphous, quirky, and idealistic at best, and completely bogus at worst. Yet if Web 2.0 is nothing more than a meaningless buzzword, how can one explain the fact that there are differences between the underpinnings of the internet today, in 2007, versus five years ago, in 2002, and five years before that, in 1997? Fortunately, that is actually not very difficult to do.
The term Web 2.0 implies a discrete shift from a prior version—a shift which never took place, because as O'Reilly points out in his own essay, there is no set software release cycle on the internet. Current and former Product Managers at respected companies such as Microsoft and Apple will attest to the fact that orchestrating the smooth integration of numerous discrete programming changes into a single new software release takes a mammoth effort of coordination, which the World Wide Web lacks by design. On the Web, server hardware, operating system software, applications, and content are all upgraded and changed piecemeal, almost always without the approval or coordination of any central authority. The notion of assigning a version number to the internet, or any piece of it, is downright absurd; to remain accurate, a new version number would have to be assigned multiple times per second. The number would convey no real information, either relative to prior numbers or in absolute terms. In other words, the internet is constantly evolving, and time, usually in terms of days or weeks, is the only useful benchmark we have to analyze its changes.
From an academic perspective, then, Web 2.0 is really a useless phrase which conveys no actual information. The internet shed its academic countenance more than a decade ago, however, giving way to a commercial phenomenon the likes of which the world had never before seen. Therefore, it should come as no surprise to anyone that the term Web 2.0 was co-opted into commercial use almost from the moment it began, as the theme of one of O'Reilly's many computer industry conferences. Today, any software business that refuses to use the term is at an automatic disadvantage, as bewildered consumers (and, to a growing extent now, businesses) search for the magic ingredient that they want in all of their web-based software: Web 2.0. Therein lies the real problem: the term is meaningless, but thanks to the press, it is also spreading.
So, now that we are stuck with Web 2.0, what should we do with it? The likelihood that it will go away on its own is slim, and the fact of the matter is that the aforementioned Web 2.0 companies are, in at least the legal sense of the word, real. Yet many people who work in software are still averse to the term. For a very long time, I myself was loath to describe any of my work through Think with the phrase, out of intense distaste for its vacuous nature. On the other hand, Web 2.0 presents software developers with an opportunity that does not come around often. In essence, a new variable, void of value, has entered the computer world. Its entrance, however accidental, represents the aspirations of millions of users for a major, positive change, but as is often the case with software, they will not know what change that is until someone shows it to them, already done. So begins the process of trial and error.
The prevailing wisdom is that Web 2.0 means software that revolves around the community; software that takes "collective wisdom" into account; software that is consumer-oriented, not enterprise-based; software that is free. These characteristics may represent changes, but they do not always combine to yield an overall improvement, and the continuing confusion of the community over the term, as represented by its reluctance to accept it, is proof. These four characteristics are certainly a different way of doing things, as communism is certainly different from capitalism, but they are not always and on all counts "better" than what we all used before on the World Wide Web, in 1995: software that revolved around the enterprise; software that worked from the top down; software that was business-oriented; and software that required license fees. In the long run, community-oriented software development may not even last in a commercial sense—especially if this hype is followed by an economic downturn. (See Wikipedia entry "U.S.S.R.") Time will tell, but for now, there are certainly good reasons to question the community-centric hype.
In searching for a definition, then, keeping in mind that the internet as we know it does not lend itself to distinct versions, and that open software is frequently different but not necessarily better, it would seem that any true and precise definition of Web 2.0 could not even apply to the current internet. By extension, this would imply that O'Reilly's 2004 conference theme was actually a false alarm, far ahead of its time, or perhaps just another conference cliché that got blown out of proportion. The Web 2.0 that I foresee has a few key qualities, qualities better than those of today's "Web," if properly implemented:
1. Mandatory authentication of legal entities (people, companies, etc.);
2. Enforced regulation of addressing (IP addresses, domains, etc.) in keeping with rule (1);
3. No discernible difference between client-side and server-side software;
4. Copyright notices aside, open or closed (protected) content, at the author's discretion.
Changes in everything else—community participation levels, the popularity of various languages, or the naming patterns of companies—should be taken as given in a networked environment with high participation, and expected over time. Half-decade-long fads do not necessarily mark substantial progress, and may simply be more useful as economic or style indicators, like techno music in the 1980s, than predictors of productivity.
Substantial technological and economic progress deserving of an entirely new version number, perhaps 2.0, will come when spam is eliminated by rule (1), domain name squatting is a thing of the past due to rule (2), software is easy to create and use independent of its location due to rule (3), and individuals who choose can truly protect their intellectual property due to rule (4). Clearly, this imaginary system is not the Web of 1995, nor does it even remotely resemble the Web of today.
That definition of Web 2.0 makes sense to me. All that's left to do now is build it.