The tell-tale signs suggesting a platform's demise

In the post on OS turning circles, I used the concept of a radius of turning as an analogy for agility. One problem with the analogy is that turning in circles implies a return to a starting point or at least a closing of the loop. The idea is that there is lifecycle repetition. However, in reality, this does not apply to the world of operating systems.

An OS, as a platform, usually has a finite life. It is born, grows and usually reaches a point where it is no longer supported. Sometimes, a new platform is born to take its place from the original owner but more often a replacement comes from a new challenger company.

So rather than circles, the analogy of OS lifetimes may be more accurate.

If we do think of platforms as finite, then the natural question is what causes an end? We need to look for patterns which may indicate when a platform is reaching end of life.

The difference in this analysis is that the measure of “age” of a platform I use is not time per se but versioning. The logic is that each major version is a meaningful and significant improvement in a platform which needs to be delineated, marketed and celebrated. It embodies the business logic as well as the engineering logic of the platform custodian.

Taking the data from the last post I added a few more platforms: Symbian[1], PalmOS and Blackberry OS[2] to seek out patterns. I also separated the desktop/portable OS’s from Mobile OS’s and plotted these version-demarcated lifespans.

One thing to observe is that the scales of the two charts are comparable. There are examples of short- and long-term version updates and the number of iterations (lifespan) can be similar. I noted also that there were several platforms which have reached end-of-life[3][4]. Those platforms have a big black dot at the end of the line.

That leads to another observation. The end of a platform seems to be indicated not by simple age (the shortest lived was six years and the longest lived was 16) nor by the number of versions (PalmOS lived for five while MacOS lived for nine). Instead the end of life is most clearly visible as a lengthening of the development cycle.

Note that each platform that ended was preceded by a spike into the vertical – a significant delay in the release of a version. The data is one thing, but it’s anecdotally supported by observation. Industry observers note that delays in improving the product are symptoms of some fundamental architectural or marketing roadblock. In the case of Mac OS, Apple struggled to bring modernity to its “Classic” OS. It needed memory management, more reliability and a better file system to support the move to networking and media-hub use that defined the consumer expectations of a PC.

There was a change in the basis of competition, away from pure productivity and toward entertainment, which turned out to be more demanding in new ways. Apple had to move away from Mac OS and lost time with its internal Copland effort before punting with NeXTSTEP. Similar transitions are visible with Palm (from PalmOS to WebOS), Microsoft (from Windows Mobile to Windows Phone), Nokia (from Symbian to MeeGo) and RIM (from Blackberry OS to QNX). In fact, survival of a transition is relatively rare and never without significant pain and loss of value or share.

This is also understandable through the lens of disruption theory. As a product reaches the point of being good enough, “breakthroughs” are harder to come by. Engineers and marketers struggle to push the product into increasingly rarefied strata of performance. The old architecture does not fit the new demands but it’s crammed into them anyway. This last big push is then followed by a stall and ultimate demise. Meanwhile, an entrant gains lift in the rich atmosphere of new bases of competition with an architecture that’s built specifically for it. The process then repeats.

And so the charts relate the same story of sustaining improvements followed by inevitable last gasps that Clayton Christensen first illustrated[5] in The Innovator’s Dilemma, a book that came out just as the first mobile platforms cataloged in these charts began their ascent.


  1. Versions of Symbian do not match easily to integer values. The sequence I used is as follows: EPOC Release 3: “1”, Release 4: “2”, Release 5: “3”, Symbian OS 6.0: “4”, OS 7.0: “5”, OS 8.0: “6”, OS 9.1: “7” (9.0 was deproductized), Symbian^3: “8”. Symbian^4 has been cancelled. These original version numbers are noted on the chart.
  2. The times recorded are for “general availability of product” which in the case of mobile OS’s means the time when a device using that OS was released.
    Symbian   PalmOS   Blackberry
    Jun-97    Jun-96   Jan-99
    Oct-99    Mar-97   Apr-00
    Mar-00    Mar-98   Mar-02
    Jun-01    Mar-01   Oct-03
    Oct-03    Oct-02   Nov-09
    Nov-04             Sep-10
    Feb-06             May-11
  3. End of life is defined as the last version generally available. In some cases (e.g. PalmOS 6) newer versions are built but they may not be released into a working complete product.
  4. Windows Mobile is treated differently here than in the previous post. I chose to declare it EOL after version 6 and consider Windows Phone as a separate OS. This is because the name change is indicative of a break with the past. As before, I defer decisions about continuity to the developers and/or marketers who choose the naming conventions.
  5. See slides 3 and 5 here
  • Does this mean that HTML5 is everybody’s future? No OS can support faster versioning than the web can. Even if the webapp is local to the device, it can get updated every time it runs if the network is up.

    • Wouldn’t HTML be considered the platform/OS and HTML5 the 5th version? So we’re not talking about a web app, we’re talking about the platform being updated. And HTML is probably the hardest of all platforms to update — every single web browser needs to make parallel changes to their code. This can take years and still never be in sync.

      HTML also has decisions made by committees, and that seems to affect its agility.

      Notable mention: Chrome’s auto updating means revisions can be made very quickly. As does the latest Firefox, I believe.

      • HTML isn’t a vendor-managed platform, and the “release” cycle of standardization documents is a poor gauge for the actual advancement of the platform — which happens bit by bit in a decentralized way as people update their different browsers, until there’s enough running versions supporting a given feature that an individual developer decides it’s now worthwhile to use that feature.

        Also, if you go by WHATWG (where the real work is being done) rather than W3C, then advancement of the standard itself has effectively shifted to a set of unversioned living documents — there is only HTML; at any given point, the document reflects the current consensus for how a browser ought to work. For any given feature, whether browsers *do* work that way is immaterial; the standard itself is governed more by whether all major players are in agreement that it’s the right idea and that they each intend to implement it eventually.

        It would be better to say that HTML is an interoperability standard; the relevant platforms for this metric would be the browsers themselves.

    • mpt

      The Web platform is perhaps the best example of the point I made in commenting on the previous post: focusing on “major versions” is just not meaningful. The working group responsible for most HTML development no longer call it “HTML5”, but just “HTML”, because it’s now a living specification that is updated continuously. And parts of it are implemented at different times by different browsers.

      On one hand, this protects against the platform’s demise, because it’s difficult for any one vendor to kill it. (If one browser, server, or authoring tool ceases or slows development, people can switch to another.) On the other hand, it means the Web usually lags behind native platforms, because application developers have to wait for multiple vendors to implement the same APIs before they can rely on them.

      So the Web platform couldn’t possibly be represented on these charts, because it doesn’t (any longer, if it ever did) have major versions. But again, what is important is not major versions. It is how fast a platform improves, how quickly people can upgrade, and how quickly compelling applications and accessories start relying on features that require upgrades.

      • Anonymous

        No, it works perfectly. There was more time between HTML 3 and 4 than 2 and 3 or 1 and 2, then the Web stalls and is replaced with HTML5 many years later. HTML4 happened when the Web was 9; it is 21 now and adopting HTML5 is still controversial.

        What you are missing is that the Web platform died after HTML4 and was replaced with IE6/FlashPlayer development. It was not the same platform at all. Then a group called WHATWG attempted to create a new Web development specification called Web Applications v1.0 in order to re-standardize the Web. This is entirely independent of W3C, the people who invented the Web and authored HTML 1-2-3-4 and then announced there would be no more HTML specifications. Later, W3C adopted WHATWG’s spec as HTML5. That is an entirely rethought HTML and Web. It is Web X. Same as Mac OS and Mac OS X.

        So Web 1.0 went HTML 1-2-3-4, each coming more and more slowly over the course of 10 years, then collapsed, we used commercial solutions in the interim, and now 10 years later we have started another Web. This time, practically designed for consumers instead of techies, and built to scale to massive size.

        HTML only just switched to not having versions, because they are essentially going to do 5.0.1 and 5.0.2 from now on, little steps that everybody takes together. But they had versions, and we see the pace slowing just like Horace said. In fact 4 was to be the last version. 5 was made by a different group than 1-2-3-4.

        The browsers are no longer relevant in HTML5. What is in the spec is what all browsers support. That is the point of the spec now.

      • mpt

        The WhatWG is what I was referring to as “the working group responsible for most HTML development”. The Web had got in a bad way, certainly, but saying “the Web platform died” or “collapsed” bears no resemblance to reality. (For example, if your IE6+Flash hypothesis was correct, it would not have been feasible to ship Macs without IE in 2005, or iPhones without IE or Flash in 2007.)

        Nor is it true to say “we have started another Web”. On the contrary, “support existing content” is first in the W3C’s list of HTML Design Principles, and new features almost all use the same syntax frameworks as the old ones. That’s both a strength and a weakness; it’s one of the reasons native platforms will always be easier to develop for.

        And even if both those suggestions were true, the graphs still wouldn’t “work perfectly”, for the simple reason I already gave. HTML is no longer going to have major versions like those listed here for other platforms, but that has no relevance to its viability.

    • Some browsers adapt fast to some bits in the standard. Take a look at how well an HTML5 app works on an average smartphone. It will take years before you get a universal experience on all platforms, unless there is a common browser provider.
      In fact, look at how hard it is to get a website right on the regular web on just the most popular browsers, then multiply that by the number of mobile OS’s.

      • Anonymous

        No, that is not how HTML5 works.

        You are not supposed to get the same experience on every device, you are simply happy to run the app successfully on every device. In some cases, you will see animations that help you use the app, in other cases the animations do not run, but you can still access the features of the app. In some cases, the user may be using a text-only browser, or a browser that reads aloud to them because they are visually impaired. The app adapts to whatever place it finds itself running in. We build a core of universal functionality, then enhance it with optional features that will only run if the device supports them. We test for those features before using them. We can run on anything very easily.
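
The test-before-use approach this comment describes usually amounts to a line or two of JavaScript per capability. A minimal sketch, where `supports` and `fakeNavigator` are illustrative names standing in for real browser objects such as `navigator`:

```javascript
// Progressive enhancement: probe for a capability before using it,
// and fall back to the universal core functionality otherwise.
function supports(host, feature) {
  // `host` stands in for a browser object such as `navigator`.
  return host != null && feature in host;
}

// Illustrative stand-in for a browser that exposes geolocation.
const fakeNavigator = { geolocation: {} };

let mode;
if (supports(fakeNavigator, "geolocation")) {
  mode = "enhanced"; // add the optional, location-aware features
} else {
  mode = "core";     // ship only the baseline everyone can run
}
```

In a real page the host object would be `navigator` or `window`, and the fallback path would still deliver the app’s core functionality.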

        It is possible right now to write an HTML5 app that runs not only on every smartphone, and not only on every PC that is working today, but also on a PC from 15 years ago, or a PDA from 15 years ago. In fact, a screenshot of the recent Boston Globe HTML5 redesign running on an Apple Newton PDA from 1996 or so was the toast of Web development circles last week. No, it doesn’t look the same as on an iPad, but it still works; it still provides the core functionality of reading the Boston Globe.

        There already is a common browser provider on mobiles. 99% of them run WebKit, the Apple-authored browser core from OS X. Yes, most mobile vendors screw it up so badly that it can’t run apps with as much fidelity as an iOS device. Doesn’t matter. They are easy to support with HTML5. If the user wants better fidelity, they can buy an iPhone. That is competition.

        You also have it backwards when you suggest that getting “regular websites” running in popular browsers is easier than HTML5. No. That is the problem that HTML5 SOLVES. It is much easier to deploy a single HTML5 site than a single HTML4 site because HTML4 standardization failed, it was useless. During those years, we did not make HTML4 sites, we made IE6/FlashPlayer sites. That is what made it hard to deploy websites. IE6 is not standardized, there is no book on how to develop for it, you have to test things and see if they work. Flash is not standardized, and it had 2 different owners during that time and is a pain in the ass to deploy. IE6/FlashPlayer was a disaster that cost everybody on the Web trillions of dollars compared to if HTML4 standardization had been successful.

        So HTML5 was done differently than HTML4. In fact, it was done by different people, and then adopted by the W3C later once it was seen to be the right approach. What is in HTML5 is already successfully standardized, because they don’t put anything in until all browsers support it, and it can be supported in a straightforward way through backwards compatibility for legacy browsers. There are almost no new features, it just clarifies old features we already used, for the most part. And what few new features there are (e.g. audio tag, video tag) have built-in backwards compatibility. Legacy browsers that do not support audio or video tags will render whatever tag is inside the audio or video tag, while browsers that support the audio or video tag skip the inside content.
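
The fallback rule described for the audio and video tags can be sketched as a toy renderer. This is a hedged illustration, not a real DOM API: `renderVideo` and the tag sets are invented names modeling how an HTML5-aware browser handles the tag while a legacy browser renders the nested content instead:

```javascript
// Toy model of <video> fallback: a browser that knows the tag plays
// the video; one that doesn't simply renders the markup nested
// inside the tag (e.g. a poster image or a download link).
function renderVideo(innerMarkup, supportedTags) {
  return supportedTags.has("video")
    ? "[video player]"  // HTML5-aware: the tag itself is handled
    : innerMarkup;      // legacy: falls back to the inner content
}

const legacyTags = new Set(["p", "img", "a"]);          // pre-HTML5
const modernTags = new Set(["p", "img", "a", "video"]); // HTML5-aware

renderVideo('<img src="poster.png">', modernTags); // "[video player]"
renderVideo('<img src="poster.png">', legacyTags); // '<img src="poster.png">'
```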

        For all the bad things you can say about Web development or browsers, HTML5 is still the best way to make a cross-platform app if that is your priority. There is not even anything that comes close. Not IE6/FlashPlayer, not Silverlight, not Java. And HTML5 is not tomorrow, it is the past 3-4 years. There is no IE6/FlashPlayer Web on mobiles, so to be universal, it is HTML5 since 2007. Even the desktop browsers all made their HTML5 parsers their default in 2009 or so. If you want to render as HTML4, you have to specifically ask the browser to go into that mode since then.

    • Anonymous

      Well, yes HTML5 is everybody’s future, but not their only future.

      You actually have it backwards. The Web revs slowly. Very slowly. Painfully, achingly slowly. Web apps can rev fast, but the Web itself revs slowly. And Web apps cannot do anything the Web does not yet support.

      Consider how iOS runs 2 kinds of apps: HTML5 and CocoaTouch (App Store.) CocoaTouch is way ahead in features because if Apple decides to add a new feature, say “location-awareness,” they just add it to iOS and it is done. Developers can use it right away. To add that same “location-awareness” feature to HTML5, Apple would have to add it to WebKit, then submit it for standardization, then Mozilla will change it a bit, Microsoft may change it a bit, then there is a wait while everybody studies it, a few more changes, then a wait while all the browsers add that feature, then another wait while Web developers figure out how to work around the bugs in Microsoft’s implementation, and then finally, a Web app developer can use location-awareness in their app.

      So what you have on iOS is the best of both worlds. CocoaTouch can bring you features and apps the Web won’t have for many years at the cost of running only on Apple devices, while HTML5 brings you cross-platform, vendor neutral open standard apps at the cost of not having the most cutting-edge features. This is very complementary. The user can run both iMovie and GarageBand in CocoaTouch, and Facebook and YouTube in HTML5. Devices that have HTML5 only do not have video editing and music and audio production. Devices that do not have HTML5, like Windows Phone, do not have Facebook or YouTube unless Facebook or YouTube build a native app (Facebook did, although it lacks features; YouTube didn’t.)

      So your smaller apps and those that do not need cutting-edge features will be HTML5, and many devices will be HTML5 only, but Apple will still be selling iOS and Mac devices to people who specifically buy them to run iMovie class apps and better.

      • Actually, what happens more often is:

        Apple adds it to Mobile WebKit at the same time as Cocoa, as a browser specific –webkit-* feature, and submits the idea for standardization. Developers can use it right away, for browsers running the latest version of Mobile Safari.

        The delays of the standardization process only matter if you want the *same* implementation to be able to work in *all* relevant browsers. If you’re only targeting a single browser, that can move forward very quickly — especially if that browser is Mobile Safari.
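
The prefixed-first pattern this comment describes is often written as a small shim that prefers the standard name and falls back to the vendor-prefixed one. A sketch under stated assumptions: `pickApi` is an invented helper and `fakeWindow` an illustrative stand-in for a browser that only ships the WebKit-prefixed implementation:

```javascript
// Prefer the unprefixed (standardized) API name; fall back to a
// vendor-prefixed implementation if that is all the browser ships.
function pickApi(host, names) {
  for (const name of names) {
    if (host && name in host) return host[name];
  }
  return undefined; // feature not available at all
}

// Illustrative browser object shipping only the prefixed form.
const fakeWindow = { webkitRequestAnimationFrame: (cb) => cb("frame") };

const raf = pickApi(fakeWindow, [
  "requestAnimationFrame",       // standard name, tried first
  "webkitRequestAnimationFrame", // WebKit-prefixed fallback
]);
```

In a real page `host` would be `window`, and app code would call `raf` without caring which implementation was found.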

    • Davel

      An OS does not need to keep up with HTML. A browser is just an application, Microsoft’s definition notwithstanding.

  • Prachi Gauriar

    Is there any particular reason you’re using Windows 2000 instead of Windows ME? Admittedly, almost no one used Windows ME, but it was Microsoft’s successor to Windows 98. Windows 2000 was in the NT line and not at all targeted at consumers.

    • Admittedly it’s controversial, but I did not consider Me a major release. The effort was a stop-gap with a very short shelf life. According to Wikipedia:

      Windows Me was conceived as a quick one-year project that served as a stopgap release between Windows 98 and Windows XP. Many of the new features were available from the Windows Update site as updates for older Windows versions (System Restore and Windows Movie Maker were exceptions).

      • Rj

        Here you are clearly becoming a judge of relevance… and I’m afraid you’re getting it wrong.

        Admittedly, you’re confusing two different operating systems that were deliberately obfuscated by the manager of the platform (highlighting one of the limitations in trusting them for this analysis).

        Me is definitely significant because it was the last release of DOS/Windows.

        Windows 2000 was released around the same time but it was the successor to NT4 and therefore in a completely different line. After Win2k (internally numbered NT5.0) came Windows XP as the next NT release.

        However, XP was internally only NT5.1 and as you are adamant that you aren’t including point releases (focusing only on whole number releases), it shouldn’t really make your list… although as the first version of NT that was intended for everyone, it was obviously important to Microsoft regardless of just being a “point one” update.

      • (Horace – thank you for your posts. I learn A LOT).
        RJ – great to bring up the NT releases. Horace incorrectly linked Win98 to the NT lineage of OS. In reality, MSFT had two OS groups: 1) NT, starting with the release of NT 3.1, and 2) Win9x – more for the consumer desktop. NT’s predecessor was really OS/2 – not in code (the two were led by VERY DIFFERENT engineers: Letwin for OS/2 and Cutler for NT) but in intent, which was businesses. IT/business has vastly different needs. Just check out the differences needed in what you might consider “simple stuff” like the TCP/IP stack! In NT, the stack by necessity needed to be robust/secure – and consequently more complex.

        fascinating stuff. (OK – at least to me).

      • The objective is to measure the rate of significant improvements in a product. The metric that spans all operating systems (mobile or not) is the name or version of the product iterations. Microsoft chose a variable naming convention but they used the phrase “Windows” throughout the evolution of their platform. [I excluded server variants and versions that were not significant, again for reasons of comparison to other platforms. Me, by Microsoft’s intent and execution, was not significant enough.]

        Most of the objections have been that the technological underpinnings are not in the same thread and I know this well. But what I chose to weigh more is the _branding_ of the product because the brand conveys the meaning of the business. If Microsoft says a version of Windows then I have to assume they mean it’s a continuation of a platform called “Windows” and when they release a new name (which they once called a version) then that’s the pivot point of what is significant and meaningful.

        Microsoft also has various internal naming conventions from numbers to code names but I can’t work with that data because it is even less consistent with what other companies communicate (Symbian is even worse as it changed hands more than once). What Microsoft communicates is what I treat as significant and meaningful.

      • Rj

        Fair enough; the data you’re working with isn’t perfect so you have to make some arbitrary decisions in order to normalise it in a manner that you can use. No matter how you slice it, there is going to be some level of arbitrary segmentation, so if this method tells us something perhaps it is good enough.

        I worry though on three levels:

        1. The data does not equate like with like[1].
        2. The data most closely measures the lifespan of an _operating system brand_, based on the frequency of significant marketing events that occur during the life of that _brand_ but your analysis does not make that clear.
        3. While you might be intentionally trying to avoid heated discussion of the meta-platforms (brands) that sometimes live on top of operating systems, the current data certainly obscures analysis of them.

        fn1. By the definition you have expressed in your reply, Mac OS X is just the version after Mac OS 9, yet you have separated them as distinct platforms. The platform of “Macintosh” did continue until this year, in the same way that the platform of “Windows” has and your demarcation clouds your data.

        Apple & MS did use slightly different messaging to consumers during their OS transitions but is this really significant? Even the bulk of both marketing campaigns can be summarised in a common sentence: “This new OS is much better than the old OS, but don’t worry, you can still run your old applications”.

        Perhaps it would be better to call Windows and Macintosh “meta-platforms” as I have in point 3 above. Then you would be correct to draw a line between Mac OS 9 and Mac OS X; they are separate operating systems in the same way that WindowsDOS is different to WindowsNT.

    • Anonymous

      Yeah, consumer Windows goes 95, 98, Me, XP. The fact that nobody bought Me is why XP was rushed in 2001 and arrived so broken it could not be hooked up to the Internet reliably.

      The first consumer NT was XP. That was the sole and only purpose of the release. I totally get the focus on what the consumer was using, but that was not 2000. They went 98 to XP.

      Windows Me is MS-DOS 8, I believe. It’s an example of MS-DOS development cycles failing. The only feature users wanted from Me was better stability. Bill Gates was shot dead in the South Park movie from 1998 over the lack of quality in Windows 98. Microsoft had to put in NT to answer that request.

      Also, nobody went 2000 to XP. People who chose 2000 over Me or 98 kept 2000 for years. Windows 2003 Server was based on 2000 as a feature for 2000 users, so they could upgrade to that. Longhorn was based on XP, and failed because of that. Vista was restarted with Windows 2003.

  • This’ll be fun when the Symbian people turn up. There was an official press-release launch of many Symbian versions in between those you have mentioned, and Anna and Belle are both different numbers.

    • As I pointed out, I am looking only at point releases. To that end I am looking for Symbian^4 to follow Symbian^3. Symbian Anna is an update to Symbian^3 and Symbian Belle is an update to Symbian Anna. In October 2010 Nokia announced that Symbian^4 will not ship.

      • True Horace, but you’re missing that Belle *IS* Symbian^4, but without the Orbit UI. It instead has a totally new UI started late 2010. The OS release it is built with is the same one that S^4 was being built upon.

        Symbian Anna is also a new OS release, not just a little pack of new features. What you have to realise is that Symbian no longer has version numbers – Symbian^3 was the last one to have a version number. From that point on, they have names which have so far been alphabetical.

        I agree though that the relative silence about Symbian versions over the years has made archaeology such as this way more difficult than it ought to be.

      • This archaeology is as much about the engineering as about the marketing or intentions of the business. I conflate the two but they don’t always agree. Unfortunately if there is room for debate, we need to form a committee and, after much deliberation, vote on what is and what isn’t a significant version. I would not trust my judgement of this.

        Data used in analysis is always incomplete and prone to error. Unfortunately we have even less data about the future, so we have to make do with what we have about the past.

      • Absolutely, and in Symbian’s case I’m sure I am misremembering parts of it. Despite being intimately involved in lots of parts of it, I don’t recall the details all that well so I wouldn’t trust my memory either!

        As with all histories of this kind, until someone writes a history of a particular platform which everyone agrees with (or mostly, at least) then all interpretations will be just that.

      • I forgot to say that I do mostly agree with the theory though – there is an interesting correlation between increasing release times and OS maturity (which could equally well be called complexity).

  • Horace, you say you were looking at point releases, so S^3 shouldn’t factor in (it’s v9.5) unless you also count Symbian^1/S60v5, as that was the major change in that platform (the addition of the touch layer was considered a major release) and that was in ’09. The first v9 of Symbian was back in 2006, wasn’t it (with the various feature packs in between)?

    • My hope is not to be the judge of importance or relevance. My assumption is that those who manage the platforms are the best judges of that relevance and that the names they choose are meaningful.

      • Rj

        If you don’t want to be the judge of importance and relevance, find someone who will be. You’ve got something really interesting here but the errors in determining versions are obscuring the basic insight.

        You have started mining your audience as a resource for guest editorial. For an analysis like this it certainly seems (from other comments) that you have domain experts in most or all of the OSes that you are discussing.

        To everyone that is complaining about specific details of version timelines, please post a comment outlining the details that you are sure about so we can try to determine accurate timelines.

  • The idea of slowing improvement in a platform is solid but OS version number is a poor guide. Why? Because version number is becoming a marketing term rather than an indication of major new features. Examples abound of new versions being little different to the previous version: Mac OS classic version 8 and 9 were little different to 7, and v8 was a name chosen to invalidate the Mac OS licensing agreements of the cloners; Windows 2000 to XP or ’98 to ME are other examples of OS releases with minimal changes.

    Also, the actual choice of versions is really complex.

    On the desktop, it’s awkward to say Windows 2000 is the successor of Windows ’98 when, by its code, it is really the update of Windows NT4 (from 1996).

    On mobile, Symbian platform releases are not really the whole story. Nokia’s Series 60 is effectively a separate platform; app developers developed for S60 or UIQ, not for Symbian 7. BlackBerry OS 6 to 7 is IMO another example of an OS name change for branding and marketing reasons more than significant changes in the platform.

  • Anonymous

    It’s a nice story, and it may be true going forward, but the full corpus of data just doesn’t support the contention when you start adding in the large numbers of desktop platforms that have died over the years. In fact there we see a very different set of forces in play.

    Generally each platform died due to unpopularity of the underlying hardware, rather than any insurmountable issues with the OS itself. In essence the problem was that the custodian of the platform failed, rather than the platform itself. Very few software platforms have successfully jumped to incompatible hardware; in the consumer desktop market only Apple succeeded. AmigaOS may carry on a zombie existence but it really failed when Commodore stopped making hardware in 1994, and Acorn’s OS family died with their hardware division even as their processor went on to conquer the world. Atari, Amstrad, Sinclair, etc. – all the names of the golden age of home computers had a software platform that died with their hardware business.

    So it’s possible that the Blackberry OS is failing because it simply couldn’t scale, but it’s equally possible that the problem is that people don’t want Blackberry hardware anymore and QNX will fare no better. It’s possible that PalmOS failed in the way you describe, but WebOS also seems to have failed so maybe the problem once again lay with Palm’s hardware? It’s possible that Symbian couldn’t be extended fast enough, but it’s equally possible that Nokia just don’t understand software development since MeeGo doesn’t appear to have any fundamental problems in architecture yet it’s DoA.

    Heck, we see a big spike in version delivery around Windows XP that was due entirely to the reasons that you give for platform failure – yet it had no effect because the underlying hardware business was so well entrenched.

    • Why is it that people don’t want Blackberry hardware anymore? I’d say it’s almost entirely because they don’t want the OS (or more correctly, they don’t want the platform). They want a different platform (iOS or Android).

      It does seem smart to consider hardware and software development together though.

      “AmigaOS may carry on a zombie existence but it really failed when Commodore stopped making hardware in 1994.”

      This is true, and AmigaOS is a good example — if anything it’s an example of an old OS that probably could have been developed up to the current day (it had preemptive multitasking etc). Maybe a lengthening of the development cycle should be seen as a range of possible failures in the hardware or software or management.

      I still think Horace’s main point stands — AmigaOS failed because of one or more disruptions and was signalled by longer version cycles.

      • Anonymous

        I would say the reason people don’t want BB hardware is that BB never moved forward with it. They kept making quite good QWERTY phones that very few people want, and never made any good touch-screen devices. The software actually isn’t that bad from my limited experience – in fact it does some things really, really well – but the joystick thingy makes me nauseous.

        The Amiga failed because it had become essentially a self-hosting game console, and they failed to manage the hardware generation switch adequately. The A600 was a disaster and the cost of the A1200 was just too high considering that there was so little initial software support (it didn’t even reliably run A500 games). The higher end pro models like the 3000 and the 4000 never really cracked a big enough market to matter.

        The Amiga wasn’t really disrupted; the damage was done in 1992. PC gaming took years more to seriously take off, as it was still suffering in those years from the lack of anything remotely resembling standard sound or graphics. The PlayStation didn’t come out until 1994.

    • PalmOS failed not so much due to hardware (yes, Sony, then Handspring, then WinMobile all took good swings at it) but because of mismanagement of the OS after v4. Version 5 was supposed to be a short-lived stopgap until the BeOS-infused version 6 came forth. That was supposed to take less than a year, and when it didn’t (mainly because the OS as presented was all skin, not working on any hardware), Palm suffered.

      They were able to keep it going because of the marvelous UI work done by Handspring on v4. But by the time the Treo 650 came, it was clear that Palm was essentially bringing a pistol to a missile fight in terms of not just hardware but software.

      Writing all of that does prompt an interesting observation: the longest-running (versioned) platforms started out with very tight UX requirements and small system requirements. Their failure to last longer came down to the same small requirements that kept them viable in the early going; that is, the UX model was never designed to grow beyond those constraints. Given that, iOS would need a major shift in the next few versions to not end up at a similar end.

      • Kizedek

        “Given that, iOS would need a major shift in the next few versions to not end up at a similar end.”

        Except that it IS intimately related to OS X on the desktop, and is already an (the only?) example of a successfully scaled OS…

      • Anonymous

        iOS already had that shift 3 years in when iPad was released. They added a massive GPU, full-size display, and full-size apps without a hiccup. They could add that because iOS is a desktop system that has been scaled down, not a PDA system that has been scaled up.

        Another example would be iLife and Final Cut Pro X being rebuilt in such a way that they could work on a touch Mac, because both iOS and Mac OS run on the same OS X core. If a touch Mac ships, it will descend very much from the first iPhone, not just from the Mac. You’ll be able to say that iPhone grew into iPad and then grew into a new line of Macs.

        Very different from Palm OS. OS X is desktop class wherever it runs.

    • Sander van der Wal

      Nobody bought Amigas, Acorns and so on because of the hardware itself. They bought the hardware/software combination, and that combination died. Which makes it a platform death.

      • Anonymous

        Yes, but the reason that they stopped buying the combination wasn’t because the OS fell behind. In many cases the OS was still ahead of the alternatives, but the hardware had fallen behind.

        AmigaOS and RISC OS both died years before pre-emptive multitasking reached Windows or Mac.

        Horace’s entire thesis is wrong here.

  • Anonymous

    I would guess that this trend is less about platform success and more about software project maintenance.

    • It would have taken heroic project management to save PalmOS, Blackberry OS, Mac OS, Symbian and Windows Mobile.

      • I can’t speak for the others, but Symbian was driven in the wrong direction, IMO, not by project managers but by bad vision leading to incorrect requirements. No project manager on earth can save a project which is being pushed in the wrong direction 🙂

        All those platforms are exhibiting or have exhibited poor judgement about what the future required.

        Some of the delays you point to are easy to pin on codebase complexity, especially when combined with the reality of maintaining binary compatibility, but I don’t really believe that is totally true.

        There is a definite end to all platforms though. And we’re witnessing quite a lot of those right now in mobile. Was the PC market this scary back in the early 80s?

    • Anonymous

      No. Don’t you see the pattern? After two periods of slow development between releases, there follows a desperate hail-mary release. It has never worked, and the OS has been scrapped 10 times out of 10.

      It signifies that OS development funding is the first to be extinguished in a struggling or mismanaged company. Windows is the only exception here, but they also fit the pattern: they are implementing Metro right after Windows 7. So Windows Vista is their slow release, followed by a hail-mary Windows 7 pass. Then, instead of shutting down legacy Windows, they are tacking it onto their next-generation Metro product.

    • Anonymous

      No. What you are saying is that it isn’t about how heavy the software becomes, it’s about how strong the engineers are. But the software does actually become too heavy to lift later on. It gets to where even if you surround it with as many engineers as there are handles, it is still too heavy to lift. And nothing can be taken out to make it lighter without killing the app platform that lives on top. All you can do is build a second system next door and invite the apps to move over when the new system is ready, like they do with sports stadiums.

      Good management is important, though. With Mac OS X, it is split up into many, many tiny projects, and Apple has aggressively jettisoned legacy as they go, so it promises to be manageable for some time to come. The classic Mac OS was a big soup, and so is Windows. They are much harder to manage. But when the end comes, it comes, no matter how good your management is.

  • MacManMike


    In the post on OS turning circles you used the concept of a radius of turning as an analogy for agility. You noted a problem in “that turning in circles implies a return to a starting point or at least a closing of the loop. The idea is that there is lifecycle repetition. However, in reality, this does not apply to the world of operating systems.”

    What if you were to turn the “turning circles” on their side so you are viewing it three-dimensionally with “time” rising up like the peel of an apple, or the growth of a seashell? There would, of course, be end points, i.e. death of an OS, but it would also show a time/space relationship of the growth of the OS and the release of major versions. This should resolve the “return to a starting point/closing of the loop” as the circle grows upward over time, never returning or closing the loop.

    I don’t have the capability to do this, but I think it would provide an interesting perspective on your point.

    • Anonymous

      Since Horace used a jet fighter analogy, you could have your turning circles auger into the ground when the OS reaches EOL.

  • Steve Weller

    I have a theory that studying the rate of changes to the documentation would be an even earlier predictor of failure. Tech writers are the canary of tech advancement. However this would be a tough measurement to make.

    • Anonymous

      You just have to study the tech. That is all tech writers do. And be realistic about it like Steve Jobs is, not googly-eyed over tech like most engineers.

      When Steve Jobs said, “iPhone runs OS X,” in January 2007, I said right then and there, Microsoft is going to have to port NT to ARM to compete, and anyone else who wants to also compete with Apple will have to build a system with the scale of OS X and NT. Microsoft didn’t start porting NT to ARM until 2010, 3 years later. There is no excuse for that. It shows Ballmer doesn’t have the foggiest. Going after iOS with Windows CE was always going to be ineffective. That is like putting a smart car (CE) against a big rig (iOS) in a hauling competition. They wasted 4 years on that, and the apps don’t even run on Windows 8 without major work.

      Most of the things that business people in technology companies say are just technically wrong on their face. Steve Jobs is obviously one of the few who listens to the people around him. His “Thoughts on Flash,” for example, is fundamentally about FlashPlayer being technically impractical today, which is obvious to anyone who knows anything about how FlashPlayer technology works. That is why most Flash developers learned jQuery over the past 5 years.

  • James

    This is fascinating. I’m curious about this: “Instead the end of life is most clearly visible as a lengthening of the development cycle.” Given that statement, what do you make of the fact that Apple has intentionally slowed OS X’s development? Is that indicative that it will reach EOL, or is that not quite the same because there isn’t any giant spike in the cycle?

    • You can look at the chart to answer this question. Is the lengthening (a) abrupt, (b) significant? OS X development has had periods of slower progress, but a cursory glance shows that it’s not particularly significant. One explanation I’ve heard, which I can’t vouch for but which sounds plausible, is that iOS took resources away from OS X and it has not been as well staffed lately. If that gets worse, it might indeed indicate that they are shifting away from it.

      • The iPhone launch did cause the release of 10.5 to be delayed by a few months, but for the last couple of years I think there’s been a lot of synergy between iOS and OS X, and as of Lion and iCloud that seems increasingly to be the case going forward.

        Really, the only anomalous point in the OS X graph was the huge gap between Tiger and Leopard, and that was when the Intel transition occurred. If you say that this took up 8-10 months of OS developer time* that would have otherwise gone into Leopard’s development, you’re left with a VERY nice trend line that shows a consistent, gradual slowing of the release cycle. This makes sense given that, as a platform matures, it is unlikely to experience the same need for urgent, rapid, relatively straightforward improvements.

        People at Apple have, IIRC, publicly stated that since Leopard they’ve been targeting a 1.5 – 2 year release cycle, and they’ve stuck within that.

        *- The migration was announced only a month after Tiger’s release, and the gap between the announcement and the first Intel Mac release was 7 months. In addition to the basic issues of porting to a new architecture, the migration also prompted Apple to create Rosetta, the Universal Binary format, and Boot Camp, the last of which was first released 3 months after the first Intel Macs. So 10 months seems a fair guess for the net cost of this transition, and subtracting this would bring the datapoints for the last four OS X releases to 18 – 20 – 22 – 23 months.
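        That adjustment is easy to check. A minimal sketch (the release dates are public record; the 10-month Intel-transition deduction is the estimate above, not a measured figure):

        ```python
        from datetime import date

        # Release dates of the five OS X versions discussed (Panther through Lion).
        releases = [
            ("Panther (10.3)", date(2003, 10, 24)),
            ("Tiger (10.4)", date(2005, 4, 29)),
            ("Leopard (10.5)", date(2007, 10, 26)),
            ("Snow Leopard (10.6)", date(2009, 8, 28)),
            ("Lion (10.7)", date(2011, 7, 20)),
        ]

        for (prev, d0), (curr, d1) in zip(releases, releases[1:]):
            months = (d1 - d0).days / 30.44  # average month length
            print(f"{curr}: {months:.0f} months after {prev}")

        # Raw gaps come out to 18, 30, 22, 23 months; deducting ~10 months from
        # the Tiger-to-Leopard gap for the Intel transition gives 18-20-22-23.
        ```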

    • Anonymous

      You are factually incorrect. Apple has sped up OS X development. There is only one OS X team at Apple, and they made about double the releases over the last 5 years that they made over the first 5. Over the past 5 years they added such mainstays as the animation framework, Intel and ARM compatibility, multitouch, resolution independence, aggressive battery management, and cloud integration. The pace of improvements in OS X has gone up over the past 5 years, not down.

      The marketing names are important, they tell a story. But we can’t take them too literally. Fact is, improvements made to CoreGraphics, even if specifically for iOS devices, also improve Mac OS X, because it is the same CoreGraphics. A touch Mac could ship any time, and we would see Mac OS X doing great touch because OS X already does great touch. Lion is closer to iOS 5 than it is to the original Tiger release from 6 years ago, which would not even run on any Mac sold in the last 5 years. We may see a Lion tablet with ARM, that is certainly possible because the bottom 3/4 of Mac OS X already runs on shipping ARM devices. That would be even further from a PowerPC Mac running Tiger.

  • I’d be interested in seeing this kind of graph for video editing software. Avid vs. Adobe vs. Apple anyone?

  • Matthias

    I was just wondering why you didn’t start Windows with NT. Windows 3.0/3.1, 95 and 98 were not the predecessors of 2000; NT was.

    • I traced the lineage of “Windows”. The idea was to follow the consumer/desktop history (and not the server) because I could more closely compare it with alternatives which were also consumer-facing. Server platforms would need to look at Unix and Sun’s platforms as well as Linux as alternatives.

      Having said that, I don’t think the conclusions would have been different with the NT history instead.

      • Yeah, but then why didn’t you start Mac OS X with NeXTSTEP, OPENSTEP, Rhapsody… get the gist?

      • Anonymous

        I believe it is mostly in the naming. A change in naming marks a point in the development history that involves a greater change than just at the software level.

      • Because “OS X” is not called NeXTSTEP. The emphasis is on the brand. Windows is a brand, OS X is a brand and NeXTSTEP was a brand. The lineage is that of a business, not a piece of code.

  • Chandra2

    “..There was a change in the basis of competition, away from pure productivity and more toward entertainment…”

    Wonderful. That explains a whole lot of what happened in the past 20 years. Let me elaborate on the succinct statement above (there is really no need, since it conveys a lot, but let me indulge).

    The reason for Apple’s success is that they were better able to adapt to this change than others. Why is that?
    Entertainment is more artistic, and Apple from the very beginning had that culture.
    They always viewed themselves as operating at the intersection of technology, engineering and liberal arts. Aesthetics and design were always important to them.
    It did not do them much good when the focus was on productivity, but it did them a world of good when the basis shifted to entertainment.

    The shift happened when the productivity technology space reached a state of ‘good enough’. Any advantage one had was pretty much frozen at that point. Whoever was leading at that point won, and won handsomely, but the standings were frozen. A new branch, namely the entertainment branch, evolved and the game started all over again.

    Hope I got this right, Horace.

    • I never understood how spending 20 min. to reboot a stalled PC with “professional” Windows installations can be considered “productivity”.

    • Anonymous

      You are right, but it is deeper than that. It isn’t just that Apple saw themselves that way; they also built technologies for creative computing in the same way Linux built technologies for Web serving. QuickTime was not made so that a consumer could enjoy audio and video; it was made so that a producer could create audio and video. There is a pro music subsystem in Mac OS X that was built for music studios. So when you ask an OS X system such as iPad or iPhone to play music or video or show graphics, it really knows how to do that; it is a walk in the park. The consumer only wants to play one 16-bit stereo audio file? No problem. Producers routinely ask for dozens of 24-bit files to be played in perfect synchronization. Consumer needs are easy to meet if you are used to meeting producer needs.

  • Anonymous

    You could apply the same thinking to hardware too, except for one difference. Success there seems to come to the makers who introduce a product, give it a lengthy lifespan and improve the specs and feature set regularly until it is replaced/discontinued. The name Apple comes to mind (cough). Competitors seem to believe that in order to compete against companies like Apple, they have to churn out new products all the time, and they never really try to establish the idea of a product line that is iterated towards excellence. I know there are a few exceptions, but this strategy of dazzling the market with an outpouring of new products will always (eventually) prove distasteful to consumers: the dismay of anticipated buyer’s remorse.

  • Guest

    One could regard the Windows lineage […3.1/95/98/ME] and the NT lineage […NT/XP/Vista/7/…] as two separate lineages. In fact I would argue that this was one of the few successful transitions from one OS lineage to another. And MS obfuscated it to the degree that many would argue that it is the continuation of the same lineage.

    • Anonymous

      Not just obfuscated, they brought over many of the bugs. They put the graphics in the kernel for performance, which destroyed stability, and even now, Windows users are accessing volumes by MS-DOS drive letter and nonstandard pathnames.

      But definitely 2 different operating systems.

      And on the other hand, Windows CE and Windows Phone are the same system. That is the same old CE in Windows Phone. Windows Phone is part of the end of CE.

      • From the developer perspective (which is probably more relevant here), I’d say the opposite is true: the same software could run, unmodified, on both 98/ME and 2000/XP. Windows Phone, however, is a completely new and separate beast from its CE ancestors, as far as application and interface design are concerned.

        The important thing here is to look at the whole platform, not just the kernel and core services of the OS. That’s just plumbing. Bad plumbing can ruin an otherwise nice house, but no one ever bought a house just because it had really good plumbing.

  • And what about the big anomaly of BSD dying for the last 40 years?

  • Anonymous


    Michael Mace wrote about the death of computer platforms; I’m sure you’re familiar with it as well.

    What would be the result if you were to add his reasoning and the signals to watch for to the OS cycle times? It would be interesting to see how declining growth rate and declining gross profit per unit sold correlate with OS cycle time. I would hypothesize, based on Michael’s theory, that at the beginning strong sales growth allows the company to invest heavily in the development of the platform, but when sales growth starts to stall, resources are cut to maintain profitability and thus the cycle time increases. That could then, in turn, contribute to a greater loss of sales and also lower prices and profits per unit.

    • Highly relevant, and highly recommended. I was going to point out that link too.

      Another factor in increased cycle time is, well, the age of the platform, in ways that are completely independent of the business itself. Some enhancements are more easily done than others, and there are eventually going to be big changes that need to be made in the guts of an OS for it to stay competitive.

      They’re unlikely to be necessary for the first few revisions, which tend to be more about adding polish and new features for which there simply wasn’t the time or resources before the initial release.

      Eventually diminishing returns kick in, and it is mid-way through a platform’s life that the eventual necessity of some major changes is most likely to become apparent, and/or that the difficulty of integrating those changes into the existing platform is realized. Ambitious new designs are well known to bear the risk of feature creep and the likelihood of major unanticipated, project-delaying problems. It’s both easier and, in the short term, often safer to delay such a daunting change as much as possible.

      You could call it a mix of procrastination and denial, at a company-wide level. When the changes finally become absolutely necessary, the company is in a real pickle, unless they’ve had some people working on that in the background the whole time, or have in place a plan for a major platform-transition.

  • Anonymous

    The more I think about this topic the more I feel that it’s mistaken. You simply can’t try to draw a large conclusion like this from a small data sample, especially when there is significant dispute about what the data sample should be and when some of the available data doesn’t support the conclusion.

    Some more counter examples:

    BeOS died with no version spike.
    Newton OS died with no version spike.

    It was 15 years between version 2.0 and version 3.0 of the Linux kernel. 15!!!! Linux is still very much kicking.

    If you took a dataset of every major OS.
    If you could come up with a realistic metric for determining how actively it’s being developed rather than just major version.
    If you could reliably distinguish between platforms that run-together such as epoc-symbian, or the windowses,
    If you could reliably determine the point of platform death in a way that was consistent between open and proprietary platforms –
    If you could do all those things you might be able to build some kind of conclusion that could stand up to more than a casual poke.

    Unfortunately I don’t think that such an analysis is possible.

    • The Linux kernel, itself, is not an OS; it’s a portion of many OSes, such as Red Hat, Ubuntu, etc. New features in the kernel aren’t really used to help sell anything to consumers, so it’s fair to say it doesn’t experience the same pressures toward version-number incrementing. (Plus, the recent bump to 3.0 was effectively an acknowledgement that, for at least the past half-decade, all active development had been done in the form of point releases to 2.6, and the current development cycle would have continued this perpetually.)

      Some underlying points are valid:

      -The mere version number itself isn’t meaningful; the comparison should be based on substantial OS updates.

      -This could break down when faced with rapid, iterative releases, where no individual update might be itself “substantial” but such improvements are made over time.

      -Similarly, OSes with calendar-based release timelines wouldn’t show anything meaningful on this metric; there, you might have to devise a measure of the ‘size’ of the update at each version. But this only seems to be the case for certain open-source distributions, which aren’t part of this comparison.

      But this is starting to pull away from what I think this metric is meant to indicate, which isn’t simply the effective rate of advancement of a platform.

      With commercial, consumer OSes, it’s telling to simply let the company be the judge of what is substantial enough to count as a major upgrade, and to see how often they reach that point.

      How much competitive pressure are they feeling to advance the platform? How readily are they responding to that? These are some of the factors this metric likely means to capture. Yes, this metric could be gamed, by a company releasing frequent but trivial or half-baked updates — but there are costs associated with that (e.g., losses in developer interest, user confidence, user experience) and it hurts them in the end.

      Consider the following:

      -A release that’s one year in the making, with 10 new features — repeated every year for a decade
      -A release that’s 10 years in the making, with 100 new features

      They are *not* the same, and should not be counted as such. Sure, they might reach the same destination after ten years, but the OS with the frequent release cycle is probably much more responsive to user needs, with a more active and vibrant ecosystem. In the users’ view, it will be the more advanced of the two for 90% of the time — all else being equal, for nine out of those ten years, it’s going to be the more attractive option of the two.

      It certainly seems like the sweet spot for major update frequency is on the order of 1-2 years.
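      As a toy illustration of that comparison (the feature counts are hypothetical, taken straight from the two scenarios above):

      ```python
      # Annual cadence shipping 10 features per release, vs. one "big bang"
      # release of 100 features at year 10.
      yearly = [10 * y for y in range(1, 11)]   # cumulative features, annual releases
      big_bang = [0] * 9 + [100]                # cumulative features, single release

      # Both reach 100 features at year 10, but the annual cadence leads until then.
      lead_years = sum(a > b for a, b in zip(yearly, big_bang))
      print(lead_years)  # ahead in 9 of the 10 years, i.e. 90% of the time
      ```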

      • Anonymous

        ‘How much competitive pressure are they feeling to advance the platform? How readily are they responding to that? These are some of the factors this metric likely means to capture. ‘

        The problem is that the same message could have multiple interpretations. A high frequency of releases might mean the vendor feels under pressure, indicating a platform at risk, or it might be because the vendor is innovating really fast, indicating a dominant platform. A very low frequency might indicate extreme confidence or lethargy.

  • Anonymous

    Spell/context check: “Months to prior resease” on the charts should be something like: “Months after prior release”.

  • Davel

    I wonder if your data is relevant here.

    You are comparing OS versions with relevance.

    Unix as a platform has been around for over four decades. It has changed: microkernels, lightweight threads, multiple processors, etc.

    But in essence it has not changed much. Mac OS X, of course, is a version of Unix.

    Is Vista a different OS than Win 7? I don’t know. I do not know what architectural differences there are between the two. Win 7 came out because Vista was terrible. In fact most companies still use XP.

    The mobile OSes are new. Any new animal goes through rapid change, and then the change slows down because it doesn’t need to change to survive.

    So Unix is dead. In the early ’90s companies used it because it was better than the mainframe. Then Windows grew up and supplanted Unix. However, Unix still lives on in iOS and Mac OS and even Android to a degree.

    So from a technical perspective Unix is probably the strongest and most active consumer-facing OS there is, and yet consumers don’t know about Unix. They see Android and iOS. In fact many technical people are not even aware of the roots of the phones they use every day.