When I was growing up in my early teens circa 1996, I couldn't get Microsoft off my mind. The then-recent Windows 95 launch was just about the most hyped event in computer history. Microsoft's monopoly on operating system software and its sly browser-bundling tactics were the subject of a legendary antitrust lawsuit. It's difficult to convey the oppressive sense that many felt: there seemed to be no force capable of eroding Microsoft's ever-growing dominance. It was generally known that the crash-prone Windows 9x line of consumer operating systems and the sturdier but incompatible Windows NT line of server software were on a deliberate collision course, one that made reading Byte, PC Magazine, PC World, and other industry rags something akin to watching a train wreck in slow motion—a wreck that finally resolved somewhere around the introduction of Windows XP in 2001. Then the internet happened, and with Microsoft too distracted to notice, the wreck collapsed into the cacophony of pieces and parts that is the plodding conglomerate we know today, and, basically, the reason for Steve Ballmer's much-delayed firing.
Microsoft's behavior throughout the 1990s made it my least favorite company on the planet. I, along with countless others, despised Bill Gates—a man who reportedly believed that helping others in the world with his ill-gotten billions would provoke a Malthusian crisis, a view since remedied by those smarter than he. But for all the company's faults, which is really to say Bill's faults, the fact remained that his technology was more ubiquitous and more affordable than Steve Jobs's, and never really all that far behind. So even though it pained me, I used Windows for many, many years, until I could stand it no longer, and my Sony VAIO S-series laptop simply would not respond on a human time scale. Ironically, this essay is being written on Windows—running in Parallels, on a Mac.
The reason my Sony slowed to a crawl was not, as many consumers seem to believe, that computers grow tired like humans the longer they run. It was one thing, and one thing only: a design flaw embedded in Windows since the early days, around the release of Windows 3.1. At that time, someone at Microsoft decided to store program settings in a nominally central database—itself not an unreasonable idea. (Those old enough will remember that this was a fantasy at best for many years, as programs kept leaving .INI files strewn about.) The problem with the central database was that, unlike most databases we use today, it was not relational. It was hierarchical, like the IBM mainframe databases of the 1960s, and over the years it grew into a tree so massive, so complex, and so redundant that it was practically impossible to find anything important in it, let alone remove anything unimportant. This tree had a name: the Windows Registry.
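To make the flaw concrete, here is a toy sketch (in Python, with invented key and application names) of the difference. In a hierarchical store like the Registry, cleaning up after a program means walking the entire tree hunting for entries scattered at arbitrary depths; in a relational store, the same cleanup is a single query against an indexed column.

```python
# Toy model of why a hierarchical store (Registry-style) makes cleanup
# hard, compared with a relational one. All key names here are invented.

# Hierarchical: settings live at arbitrary depths under arbitrary keys.
registry = {
    "HKEY_LOCAL_MACHINE": {
        "SOFTWARE": {
            "ExampleApp": {"InstallDir": r"C:\ExampleApp"},
            "Classes": {
                "CLSID": {"{A1B2}": {"Server": r"C:\ExampleApp\lib.dll"}}
            },
        }
    },
    "HKEY_CURRENT_USER": {
        "Software": {"ExampleApp": {"RunOnStartup": "yes"}},
    },
}

def find_paths(tree, needle, path=()):
    """Exhaustively walk the whole tree, collecting every path whose
    key or value mentions `needle`."""
    hits = []
    for key, value in tree.items():
        here = path + (key,)
        if needle in key:
            hits.append("\\".join(here))
        if isinstance(value, dict):
            hits.extend(find_paths(value, needle, here))
        elif needle in str(value):
            hits.append("\\".join(here))
    return hits

# Uninstalling "ExampleApp" requires searching every subtree, and any
# entry that fails to mention the app by name is orphaned forever:
print(find_paths(registry, "ExampleApp"))  # four scattered entries

# Relational: one flat table keyed by owner. Deleting an app's settings
# is a single query instead of a full tree walk.
rows = [("ExampleApp", "InstallDir", r"C:\ExampleApp"),
        ("ExampleApp", "RunOnStartup", "yes"),
        ("OtherApp", "Theme", "dark")]
rows = [r for r in rows if r[0] != "ExampleApp"]
print(rows)
```

The point isn't the data structure per se—hierarchies are fine for lookup when you know the path—but that removal and auditing require global knowledge the hierarchy never records.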
Microsoft had many chances to fix the Registry and turn it into a true database, but with each version of Windows it remained, effectively unaltered, itself an altar to versions of Windows past, in order to maintain the sacred tradition of backward compatibility. Eventually, script kiddies and serious hackers alike discovered that once data was inserted into the Registry, the likelihood of anyone ever finding it again was virtually nil, and so they handed Steve Jobs a huge gift: spyware. By the early 2000s it was spreading rampantly, along with viruses, bloatware, and a variety of other forms of malicious software. Apple, for the first time in its long history, had an undeniable edge aside from high-tech graphics and sound: the lack of everything that made Windows horrible. MacBooks and iBooks began appearing everywhere on college campuses as students realized that their Macs ran faster than PCs and, with educational discounts, cost about the same. Apple's "Switch" campaign did real damage. And it all came down to the fact that because Steve Jobs's software didn't have a Registry, he could reasonably and truthfully point out (in the form of a hip twenty-something played by actor Justin Long) that Macs were more stable and secure.
From that point on, every effort Microsoft made to copy Apple—a business model that had once done wonders for the company—looked like a stupid gimmick. Nobody bought a Windows Phone, a Zune, a Surface, or really anything made by Microsoft except Office and server software, allowing the company to hover undisturbed around its favorite share price of $30 on the NASDAQ. And those who did buy something of Microsoft's most certainly did not buy it at the most blatant of blatant ripoffs, the "Microsoft Store." Ballmer thought that maybe his company could turn to "developers," but then delivered them an unholy mess called .NET, most notable for causing unresolvable Windows Update errors that haunt countless PCs to this day. Meanwhile, Microsoft's server software, which once had enormous potential, had its lunch eaten by the difficult-to-use open-source movement, led by Linux and the Apache web server. The only true bright spot, dutifully extinguished at this year's CES, was the Xbox.
Effectively, like a washed-up child actor, Microsoft did everything wrong for a good decade and a half, and only remained alive because of its blockbuster hits of yesteryear. The only surprise is that it took the board of directors so long to figure out that Steve Ballmer was the cause.
So now that Ballmer's reign of ignorance is over, what to do?
The board could (and probably will) hire a Very Experienced Executive to "turn the company around." That was HP's approach. They hired people like Mark Hurd, Leo Apotheker, and Meg Whitman. Now the press writes articles questioning whether it might just be better to wind the once-venerable company down and send everyone home.
Or the board could inject some excitement into Redmond. That was Yahoo's approach. As Chief Executive Vampire, Marissa Mayer acquired all kinds of goofy companies to suck their young blood, praying that the transfusion would restore Yahoo's crippled carcass to life. She also opted to crowdsource the selection of a new logo (a tactic Microsoft has already adopted, to the infinite elation of colored-square lovers everywhere)—because when you're being paid $117 million over five years, it is difficult to make those kinds of decisions on your own. So far, that experiment doesn't seem to be going too well, unless your shareholders want a corporate image akin to a 1950s diner, or really value spreads in Vogue.
The board could also just continue on the same downward path the company has been on for years, by promoting from within one of the many self-selected, uninspired laborers who have slaved away for Microsoft for decades. Granted, many of Microsoft's employees are really smart people, but the kind of person the board should be looking for is not the kind of person who, amidst the obvious decline, would have gone to work for Microsoft at any point in the past fifteen years.
If none of these options sounds particularly appealing—and none should—then at least consider one more.
Microsoft Windows has had a stellar run as a product. It is most likely still installed on more computers than any other operating system in history. Still, right now, Windows is a bloated knot of insanity, incredibly inefficient in its inner workings and design, the product of a myopic MBA mindset magnified a thousand-fold by corporate infighting and inertia. If Steve Ballmer made any mistake, it was not realizing that Windows has had its time, and that all good things must come to an end, because, at least in software, Joseph Schumpeter was right: creative destruction paves the way for better things.
There was once a time when Microsoft almost had that better thing within its grasp, and it was referred to—if only fleetingly—as Windows Future Storage (WinFS). WinFS almost did away with the traditional file-folder hierarchy (independent of the Registry's hierarchy, which was glommed on top), but it never actually shipped as a product, and was sliced into pieces that are now minor features of Microsoft's various database and programming framework products. It died a most ignoble death, to the company's detriment. And Bill Gates knew it.
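For what it's worth, the core idea WinFS gestured at can be sketched in a few lines: treat files as records with queryable metadata rather than as leaves of a path hierarchy. The following is a hypothetical illustration using SQLite—the schema, field names, and sample data are invented for the sketch, not WinFS's actual design.

```python
# Sketch of the WinFS idea: files as queryable records, not path nodes.
# Schema and field names are invented for illustration.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE items (name TEXT, kind TEXT, author TEXT, created TEXT)")
db.executemany("INSERT INTO items VALUES (?, ?, ?, ?)", [
    ("budget.xls", "spreadsheet", "alice", "2004-01-10"),
    ("memo.doc",   "document",    "bob",   "2004-02-02"),
    ("photo.jpg",  "image",       "alice", "2004-03-15"),
])

# Instead of remembering where a file lives, you ask for what it is:
hits = db.execute(
    "SELECT name FROM items WHERE author = ? AND kind != 'image'",
    ("alice",),
).fetchall()
print(hits)
```

Under this model, "folders" become saved queries, and a file can appear in as many of them as its metadata warrants—exactly the flexibility a rigid hierarchy can't offer.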
Microsoft—or someone (I'd prefer it to be my company, actually, especially given that it may soon hold a relevant patent)—should build a new operating system from the ground up on those principles: settings and files stored relationally and queryably, with no Registry and no rigid file-folder hierarchy. But it shouldn't be called Windows, and it should be the unifying focus of everything the company does, much like the iPod and iTunes were the focal point of all of Apple's efforts for a decade.
As the sign I hung above my desk in high school so eloquently stated, "In a world without fences or walls, you do not need gates or windows." Thanks to the internet, we live in that world. Steve Ballmer could never figure it out. Microsoft would do well to find someone who can.