One of the most important things to understand about technology (and computers are no exception) is that changes in the scale of a technology change its very nature. That is, as you make computers faster and cheaper, at some point the sum of all those small improvements changes the fundamental nature of the beast. We saw this as the computer transformed from mainframe to desktop (which is really just a change in cost and size), finding an entirely new audience, and we're seeing it again as the computer changes from what we now think of as a computer to cell phones, MP3 players, and other small, mobile devices.
"Commodification" is what happens when, as things get better, cheaper, faster, etc., consumers stop caring about the marginal improvement. Back in the days of Windows 95 and 386's, there were ways you could improve the operating system and hardware in substantial ways; a doubling in processor speed and a rewrite of the operating system got you protected memory, which meant less data loss.
A few years ago we reached the point where desktop hardware became commodified. For the average user, 1.8 vs. 2.2 GHz makes no difference at all. It's a question of how quickly your computer can wait for keystrokes and data from the internet. (Answer: even a lowly Celeron is light-years faster than the I/O devices it typically has to talk to. Even if you're the last kid in your class at Harvard, you're going to be bored discussing politics with a bunch of four-year-olds.) At that point things became very difficult for major vendors like IBM (sold out), HP and Compaq (merged), Gateway (bought out of its misery), etc. The price of a desktop plummeted from over $1000 to less than $400.
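The scale of that CPU-vs-I/O gap is easy to show with back-of-the-envelope arithmetic. The latency figures below are illustrative order-of-magnitude assumptions, not measurements from any particular machine:

```python
# Rough, assumed order-of-magnitude latencies (not measured values):
CPU_OP_NS = 1            # one simple CPU operation: ~1 nanosecond
KEYSTROKE_MS = 100       # gap between keystrokes for a fast typist: ~100 ms
NET_ROUNDTRIP_MS = 50    # one internet round trip: ~50 ms

# Convert milliseconds to nanoseconds, then divide by the cost of one op
# to see how many operations the CPU could run while it sits waiting.
ops_per_keystroke = (KEYSTROKE_MS * 1_000_000) // CPU_OP_NS
ops_per_roundtrip = (NET_ROUNDTRIP_MS * 1_000_000) // CPU_OP_NS

print(f"CPU ops idle per keystroke:      {ops_per_keystroke:,}")
print(f"CPU ops idle per net round trip: {ops_per_roundtrip:,}")
```

Even with these crude numbers, the CPU could execute on the order of a hundred million operations between keystrokes, which is why shaving 20% off the clock speed is invisible to a typical desktop user.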
I believe we've reached the point where operating systems have become a commodity as well:
- Every major operating system has all of the features of a "real" operating system - that is, protected memory, virtual memory, plug & play driver support, etc.
- The performance for normal applications is just about the same; there are some specific variations that matter in the server market, but for all practical purposes the operating system is not in the way, and the machine is much faster than users need anyway.
- Every major operating system has a similarly designed GUI experience that, once you get used to the quirks of where the close box is, is just about the same, more or less. (Mac users - keep your pants on. :-)
The problem is that operating systems are now a commodity. Simply put, users don't need a new operating system. There are no big ticket features missing from OS X 10.4, Windows XP, or Linux 2.6. This makes Microsoft's business model fundamentally vulnerable to Linux for the first time. If the name of the game is:
- Keep costs down, as low as possible.
- Incrementally improve quality, slowly, without ever inflicting the pain of a major OS upgrade.

then Linux plays that game natively, and Microsoft's upgrade-driven revenue model does not.
When I looked at Windows XP and Ubuntu 6.06, I was afraid that Linux wouldn't gain traction in the desktop market. I blamed the adoption of X11, the KDE/GNOME schism, and a Linux community made up of shell nerds for a desktop experience that was merely tolerable.
But look where we are now: Vista is a vehicle for bloat. Combine "we make money by shipping major features" and "there are no more major features to ship" and you get Vista...an attempt to change a lot of things when you should have left things alone.*
By comparison, Ubuntu pretty much just works - you put the live CD in your machine, it asks you some questions and installs...it knows about more hardware, has fewer bugs, more drivers, and a better user experience. In a commodified operating-system space, the only thing to do is try to avoid a bad user experience - if you can't offer a really juicy carrot to users, try to avoid hitting them with a stick.
And it is in this environment that the Mac is actually gaining market share. Apple's business model has always been at odds with the industry. Complete vertical integration meant higher costs, lack of market share, and out-of-date technology - back when having more for less meant something, that was a real weakness, and explains why the Mac never dominated in market share.
But what a difference a decade makes! Hardware is now commodified (and Apple is integrated at the system-building level, leveraging cheap third-party parts like they always should have). Operating systems are commodified. But on the one frontier left, quality of user experience, Apple's vertical integration gives it an immense advantage.
The question is: why does an operating system "just work"?
- Vista: it doesn't. There are too many hardware configurations and not enough testers and engineers to cover them all.
- Linux: massive distributed engineering. For any given hardware system, eventually a Linux nerd will integrate it. Anyone who hits a rough edge in the user experience can fix it themselves.
- Apple: they have it easy. With only half a dozen machines in production (and maybe another two dozen legacy configurations to support) they have a much smaller configuration space to worry about than anyone else.
Microsoft's best-case scenario is that they eventually get Vista back to an XP-quality experience, in which case all they've done is spend a huge amount of R&D money and piss off a lot of customers to maintain the status quo.
* I have mixed opinions on Vista's video-driver-model change. But that's a different post.
6 comments:
Unquestionably another insightful post on the post-Vista future of Microsoft. It's equally damning that MS found the architecture of Windows incapable of supporting the new features they had originally planned for Longhorn/Vista (features that can be found in competing OSes). I can't help but think Windows is soon going to be pushed out of homes.
Interesting points, Ben. In general I do not disagree, but I don't think we are as far along as you think. I've spent all my adult life in technology and the last 15 years directly in IT. I have several machines: one Vista 32-bit desktop, one Vista 64-bit desktop, a WinXP notebook, and an Ubuntu-based notebook. While Vista runs fine, it really wasn't the "wow" Microsoft promised. DirectX 10 has been a non-issue for me so far as I do not have any apps that require it.
The real trip-up for OS X and Linux is peripheral support. While you may find drivers or get generic support, the add-on utilities that help you make the most of a peripheral are missing. Take the Saitek X-52, for example. While it works fine under Linux as a basic stick with 27 buttons, under Windows you program those buttons in layers with their profiling software. You can have as many as six layers. That is just one example. When that kind of functionality can exist across all platforms, then we can truly say the OS does not matter.
The other thing to keep in mind is that Microsoft did not get where they are because they have a superior product. It was due to superior marketing and questionable business practices.
Hey, Ben, I've been watching with humour and interest as you make these somewhat positive notes about operating system experience. I try to keep my linux flag in my pocket while doing my work, but time and time again, I find my productivity is just so much better under Linux.
However, I can say the only reason I can be 'tolerated' to use a Linux-only laptop for work is CrossOver Office, which allows me to run Microsoft Office.
In this post, you talk about user experience as being the important distinguishing factor between operating systems. However, I believe that application support is the real factor that leads people to decide on one operating system over another.
That to me provides the answer to the 'What will happen to Microsoft?' question. The world runs on MS Office. OpenOffice just plain sucks in comparison, so there is no viable multi-platform alternative.
What are your thoughts? I don't think that Laminar Research is a good example of an ISV that decided to support linux, since X-Plane is such a nerdy tool (especially since you added the SDK). What about other vendors?
Come on Ben! I enjoy your blog but you Apple/Linux nuts are all the same. Get original, man! And check your facts...
I have a Dell laptop with 3 operating systems installed: Win XP, Mac OS X, and Ubuntu 8.04. In my experience Vista sucks (vaporware); I actually uninstalled it and went back to Win XP. Vista never fulfilled the expectations it generated. XP is OK but has its limitations, as all of MS's stuff does. Mac OS X is just a piece of crap, a nice GUI over FreeBSD, nice eye candy over an expensive box; its driver limitations just make it very limited outside Apple-configured machines. As Ben says, they have it easy, but I don't see much future unless they make OS X universal to all PCs. Ubuntu is great: nice looks, excellent performance, easy handling. But the important thing about a computer is the software it runs, and while major software companies like MS, Adobe, and Autodesk keep their software away from Linux, it will continue being a nerd thing. I get good performance from X-Plane in Ubuntu and XP, but my work demands Win XP to run the tools I need, so I will have to stick with it. The good news is that Adobe announced they will port their software to Linux... Conclusion: I have to use an OS I really don't find great just because I have to work in tools that demand it.
Julian: there's a bit in your post that I don't agree with, but let me point to two statements:
"its drivers limitations just make it very limited outside Apple configured machines"
"but I don't see much future unless the make OS X,universal to all PCs."
I'm not sure how you get from A to B. OS X isn't designed to run on all boxes - it's an integrated package. More importantly, I don't see how this limits its future.
Apple takes commodity parts, puts on a proprietary mod of an also-ran open source OS, puts a nice logo on it and charges 2x the price. And they succeed at this strategy!
If they were losing market share for this you'd say it was stupid. But it appears to me that they can take this strategy to the bank. If anything, market dynamics have made this approach more viable for Apple, not less.