Exclusive interview with Asymco's Horace Dediu | The Tech Block

Horace, you spent nearly a decade at Nokia, where you worked as a business development manager and industry analyst. Did you foresee their current, increasingly dire situation?

I did not see an explicit downfall. I anticipated difficult times ahead and a deep crisis. My view of what would happen was published as my first Asymco post.

What led you to start Asymco?

I started a consulting company which I hoped would generate leads through a blog. The blog became far more exciting than consulting and it became my primary focus after about one year. I had no ambition to write for a living or to be a “blogger”. I did not anticipate there would be any interest in the topic I wrote about beyond a handful of people. In that regard, things played out as they do at most start-ups: what you end up doing is not anywhere near the target you aimed at.

Apple’s clearly one of your favorite topics. What about the company appeals to you?

Business education is predicated on storytelling, also known as the case method. Business management is not a discipline that has “axioms” defining basic truths, or if it does, they change frequently. Therefore business education (i.e. the MBA) is the equivalent of people teaching each other by telling stories around a campfire. The best stories get repeated more often and are better ‘teaching tools’. So it is with Apple. It’s a great medium for storytelling because people can see the stories unfolding in real time, or at least within their lifetimes. They are not about a distant past or an abstract industry. There is also a lot of passion around the brand, both positive and negative, and so it leads to more attention.

Read more here: Exclusive interview with Asymco’s Horace Dediu | The Tech Block.

  • http://twitter.com/qka qka

    “Business education is predicated on storytelling, also known as the case method.”

    Wonderful explanation!

    • richlo

      “Therefore business education (i.e. the MBA) is the equivalent of people teaching each other by telling stories around a campfire”
      And a great analogy.

  • def4

    What makes you so sure the rate of innovation will increase or even stay the same?

    Moore’s Law is dead and has been so for a while now.
    It’s still alive on mobile because ARM is behind the curve compared to x86.
    On the desktop, improvements have been marginal.
    A desktop bought today is about 50% faster than a desktop bought 5 years ago.
    One bought 10 years ago was about 50 TIMES faster than one bought 15 years ago.
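
    As a rough check on the figures above (both the 50% and the 50x are estimates from this comment, not benchmark data), a small Python sketch converts them into implied annual growth rates:

        # Convert a total speed-up over a span of years into an implied
        # annual growth rate: factor ** (1 / years) - 1.
        def annual_rate(total_factor, years):
            return total_factor ** (1 / years) - 1

        # "about 50% faster over the last 5 years" -> a factor of 1.5
        print(f"recent desktops: {annual_rate(1.5, 5):.1%} per year")  # ~8.4%
        # "about 50x faster between 15 and 10 years ago" -> a factor of 50 over 5 years
        print(f"older desktops:  {annual_rate(50, 5):.1%} per year")   # ~118.7%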

    • http://www.asymco.com Horace Dediu

      The innovation I’m referring to is business model innovation which manifests itself as the rate of birth, maturation and death of successful large businesses. Increased performance does not correlate to innovation within any industry.

      • def4

        “The innovation I’m referring to is business model innovation which
        manifests itself as the rate of birth, maturation and death of
        successful large businesses.”

        I do not remember reading, or hearing on the podcast, about business model
        innovation happening without the catalytic help of technology leaps to enable the disruption of incumbents.

        I’d appreciate if you’d elaborate on that sometime.

        “Increased performance does not correlate to innovation within any industry.”

        That’s just plain wrong.
        Increases in computing power have absolutely been clear prerequisites for innovation in computing.
        The same applies for increases in network speed and reliability as prerequisites for innovation in telecommunications, both fixed and mobile.

        Both iPad and iPhone are great examples.
        The original iPad was a good product, but it needed a doubling of RAM and a massive (much faster than Moore’s Law) increase in computational power to be able to deliver its first iconic applications like GarageBand.

        The original iPhone felt faster than the iPhone 3G because it lacked third-party application support, so it didn’t need to use sandboxing.
        The iPhone 3GS rectified the situation with a superscalar CPU, which, together with improved 3G speeds and GPS, allowed the mobile social revolution we take for granted today to get started.

        GMail couldn’t have existed in the age of dial-up because AJAX requires a reasonably fast and very stable, reliable connection.
        And without AJAX, GMail would have gone nowhere because it would have competed on storage space alone with the Yahoo! and Hotmail incumbents.

      • http://www.asymco.com Horace Dediu

        Sustaining improvements tend to sustain the incumbents and prevent business model innovation. When I said there is no correlation I meant that there is no 1:1 relationship between an improvement and a disruption. There will be disruptions predicated on an improvement in a core technology, but there will also be disruptions which occur when improvements are not considered valuable. The move to mobile is a case in point. Faster microprocessors don’t help when what matters is lower power consumption. The seminal example given by Christensen was hard drives. The sustaining improvement was to make them bigger and faster for large computers, but every disruption came from making the drives smaller (and slower) and conformable to different applications. Breakthroughs in the prevailing basis of performance did not create innovative new products.

      • def4

        There have been low power microprocessors suitable for battery powered mobile applications for about as long as there have been microprocessors.

        Faster processors are exactly what made mobile computing platforms possible.

        ARM-based SoCs improved faster because they were at the beginning of exponential growth, while Intel’s attempt at low power with Atom failed.

        Hard drives were never disrupted.
        IBM made and sold them successfully from the first commercial one in 1956 until they sold the division off to Hitachi in 2003 because of commoditization.
        The change is happening now with hard drives being replaced by flash storage.

        My point is not that breakthroughs in performance automatically create innovative products, but that they are prerequisites.
        Just like changes in human interface are prerequisites for new platforms.

      • http://www.asymco.com Horace Dediu

        I refer to The Innovator’s Dilemma for a discussion of the disruption in the hard drive industry. The work was part of the Ph.D. dissertation Clay Christensen wrote and later developed into disruption theory.

    • http://twitter.com/handleym99 Maynard Handley

      You are right that pure hardware has hit some limits. The speed of a single CPU has hit practical limits; likewise, current techniques of wireless modulation (OFDM, error correction, 64- and 256-QAM) are close to theoretically optimal, as are displays.
      I think it would be premature, however, to conclude that the glory days are over.

      (a) There are things that can be done at the social/cultural/political level to provide substantially better (almost) always-available wireless access. These rely not on better wireless tech, but simply on many, many more base stations blanketing urban areas. We’ve still not quite internalized what always-on wireless gives us, especially when it’s hooked up to things like personal cameras (Project Glass, though perhaps not as in-your-face). Consider, for example, being able to have “the system” tell you, whenever you meet someone, who they are, when you last met them, and what you talked about.

      (b) There is plenty of scope for better coding (i.e. things feeling faster) via better utilization of multiple CPUs. [Moore’s law refers to the number of transistors, not CPU speed, and that’s been going OK, though I do agree that it’s probably on its last legs with only a few years left.]
      Which leads us to

      (c) The next big battleground is large data and what it enables in such fields as speech recognition, translation, OCR, augmented reality. For example, NONE of the big companies out there has really embraced augmented reality yet. Even something as trivial as a third camera, built into the top of a phone, so that you can use augmented reality while the phone is flat rather than vertical, has still not been implemented; not by Apple, and not by that supposed hive of innovation that is Android and Windows Phone.
      Right now we have bits of these that kinda work if you try hard — we’re at the PC in 1976 stage of life, putting up with storing data on tapes and things not working half the time.

      There will probably come a day when most of the innovation has left the digital sphere, when the excitement is all in biology or whatever; but it seems to me that is a long way away.
      There were very few DRAMATIC changes during the PC years — only two really, intro of the GUI (Mac and Win3/95) and intro of the robust OS (NT/XP and OSX), but those were still exciting years; and what we will see over the next fifteen years is the same relentless accumulation of one improvement after another.
      Error rates for Siri going from 20% to 10% to 5% to 3% — that sort of thing. You can’t point to any single dramatic thing, but there will be a constant stream of small improvements that does actually amount to REAL improvement, qualitative changes. I mean, damn, five years ago you couldn’t honestly expect to land in some random country, spend a few dollars to install a SIM, and be walking around with a GPS and fully up-to-date maps, including satellite photos, and now it’s routine! It is crazy what we’ve got used to so fast.
      How long till, eg, when you buy a movie in iTunes, you can pay a dollar more, and switch the actress from whoever was in the movie to someone you think is hotter? Things like this are bound to come…

      • theothergeoff

        To pile on…

        Moore’s Law is about ‘cost’, not about performance: performance per dollar doubles every 18 months.

        A PC has a lot of other things that make it slow (lack of discrete threads to drive parallelism across multiple cores, compiler technology, OS complexity, bus technology that is ‘least common denominator’, and backwards compatibility with 10-to-15-year-old instruction sets) and costly (chief among which are margin and distribution costs). When a workstation was $10K, it was easy to halve the cost and double the performance. When a laptop is $499, it’s a lot harder to make that computer $249, keep the stockholders happy at Microsoft, Dell, Best Buy, and Intel, and still sell a high-performing product.

        However, AT SCALE (say, a cloud data center with a couple billion cores in it), Moore’s Law is still operating fine.

        As for innovation, now it’s all about Metcalfe’s Law… Can we drive the value of the network up per dollar spent (which means more things connected, and a lower cost per GB)? By the same token, can we drive the cost per GB down and the number of useful connections up? (A rough sketch of this framing appears at the end of this comment.)

        I would posit that the more ‘things’ are connected (an IPv6 world where my coffee maker, camera, toothbrush, my iPad and my car, as well as my team’s mobile devices and their offices (are they at their desks?), are all things I interact with), innovation comes from effective integration ‘at the moment’ I need it.

        AOL was sort of the first big commercial innovator here, with the likes of Facebook and Twitter being the V1.1 of the ‘connected community.’

        The Apple ecosystem is built on this law… iCloud and iTMS become more valuable the more things (pod, phone, pad, computer, TV, cloud, gamers, movie viewers, song listeners) the ecosystem ‘connects to me’ in a context-relevant manner (Siri: “Your car is getting 29 mpg, and given current conditions, it looks to me like your destination is too far for your current fuel level… I’ve taken the liberty of finding you the least expensive gas on your current route, which is Joe’s Gas in 15 miles… Do you want other options?”).

        Innovation is now less about portable power (MIPS is dead), and more about effective connectivity relative to my context.
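
        Metcalfe’s Law is usually stated as network value growing with the number of possible pairwise connections, roughly n². Below is a minimal Python sketch of the “value per dollar” framing above; the value_per_link and cost_per_device constants are purely illustrative assumptions, not measurements:

            # Metcalfe's Law: network value grows with the number of possible
            # pairwise connections, n * (n - 1) / 2.
            def network_value(n, value_per_link=1.0):
                return value_per_link * n * (n - 1) / 2

            # Illustrative "value per dollar spent": total network value divided
            # by a flat (made-up) cost per connected device.
            def value_per_dollar(n, cost_per_device=100.0):
                return network_value(n) / (n * cost_per_device)

            for n in (10, 100, 1000, 10000):
                print(n, value_per_dollar(n))
            # Value per dollar grows roughly linearly with n: each added device
            # makes every existing device a little more valuable.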

      • def4

        Thank you for the thoughtful reply.
        I agree that there are plenty of great things left to do and explore.

        My point is that a lot of improvements in efficiency across society and our lives in the last several decades have come about from harnessing exponentially increasing computation speed and communication bandwidth.

        We KNOW those will slow down soon, so the rate of innovation may do the same.

      • http://www.facebook.com/joseph.carducci1 Joseph Carducci

        Read up on the fast-approaching realities of quantum computing, optical chips, and carbon nanotubes replacing silicon wafers… innovation does not stop with silicon. If anything, I’d say we are in a transitional phase while the possible new platforms and paradigms sort themselves out before ushering in the next exponential expansion of computing power.

    • jawbroken

      Interesting that your definition of innovation is solely based on processor speed.

      • def4

        Interesting that your definition of adding to the conversation is being a smartass.

    • Tatil_S

      Moore’s law says the number of transistors on a chip doubles every 18 months. Nothing more, nothing less. Some of that increase comes from having smaller transistors and some of it comes from having chips with larger area. Moore’s law does not say anything about speed, cost or whether Windows, Flash or your IT department steals the additional CPU power with inefficient background processes that you never asked for.
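
      For a sense of scale, a short Python sketch shows how an 18-month doubling compounds (the 18-month period is the figure used in this thread; Moore’s own formulations used one and later two years):

          # Implied multiplier after `years`, if the transistor count doubles
          # every `doubling_period` years.
          def moores_multiplier(years, doubling_period=1.5):
              return 2 ** (years / doubling_period)

          print(moores_multiplier(5))   # ~10x over 5 years
          print(moores_multiplier(15))  # ~1000x over 15 years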

      • def4

        Aren’t you a clever little nitpicker?

        As far as I can tell, Intel hasn’t transitioned into the space heater business yet, so I doubt they or their customers would like to invest in ever-larger CPUs with no computing-power benefits.

      • Tatil_S

        Umm, you can use your additional transistors for cost or battery life or whatever, as well as for computing benefits. In any case, considering you are the one making grand proclamations of stagnation in computing, I’d like to see the objective hardware spec that shows the rate of growth has gone down from 50x over 5 years to 50% over 5 years.

  • FalKirk

    “I started a consulting company which I hoped would generate leads through a blog. The blog became far more exciting than consulting and it became my primary focus after about one year.”

    I love this. Life certainly does have many funny twists and turns.