
Category Theory

Ten years ago: Clayton Christensen on Capturing the Upside

You can hear this as an MP3.

[It's important to understand just how much the theory has evolved in the last 10 years. Much more perhaps than in its first eight.]

Doug Kaye: Hello, and welcome to IT Conversations, a series of interviews, recordings and transcripts on the hot topics of information technology. I am your host, Doug Kaye, and in today’s program, I am pleased to bring you this special presentation from the Open Source Business Conference held in San Francisco on March 16 and 17, 2004.

Mike Dutton: My name is Mike Dutton, and it is my pleasure to introduce to you today Clayton Christensen. Professor Christensen hardly needs an introduction. His first bestseller, “The Innovator’s Dilemma,” has sold over half a million copies and added the term “disruptive innovation” to our corporate lexicon. His sequel — and you have to have a sequel to be a management guru — is entitled “The Innovator’s Solution” and is currently on Business Week’s bestseller list. Professor Christensen began his career at the Boston Consulting Group and served as a White House fellow in the Reagan administration. In 1984, he cofounded and served as chairman of Ceramics Process Systems Corporation. Then, as he was approaching his 40th birthday, he took the logical step of quitting his job and going back to school, where he earned a doctorate in Business Administration from Harvard Business School. So, today he is a professor of Business Administration at Harvard Business School, where he teaches and researches technology commercialization and innovation. Professor Christensen is also a practicing entrepreneur. In 2000 he founded Innosight, a consulting firm focused on helping firms set their innovation strategies. And according to a recent article in Newsweek, “Innosight’s phones ring off the hook, and the firm cannot handle all the demand,” very similar to all the startups in open source here today. So, please join me in welcoming Clayton Christensen.

Clayton Christensen: Thank you, Mike! I’m 6 feet 8, so if it’s okay, I’ll just…the mic picks up okay. I’m sure delighted to be with you, especially because there is a blizzard in Boston today; my kids have to shovel the snow!

As Mike mentioned, I came into academia late in life, and the first chunk of research that I was engaged in was trying to understand what it is that could kill a successful, well-run company. And those of you who are familiar with it probably know that the odd conclusion that I got out of that was that it was actually good management that kills these companies. And subsequent to the publishing of the book that summarized that work, “The Innovator’s Dilemma,” I’ve been trying to understand the flip side of that, which is: if I want to start a new business that has the potential to kill a successful, well-run competitor, how would I do it? And that’s what we tried to summarize in the book, “The Innovator’s Solution.” It’s really quite a different book than the “Dilemma” was, because the “Dilemma” built a theory of what it is that caused these companies to fail. And then in the writing of the solution, I’ll just give you an analogy for where we came out on how to successfully start new growth businesses.

I remember when I first got out of business school and had my first job. I was taught the methods of total quality management as they existed in the 1970s, and we had this tool that was called a “statistical process control chart.” (Do they still teach that around here?) Basically you made a piece, you measured the critical performance parameter and you plotted it on this chart, and there was a target parameter that you were always trying to hit, but you had this pesky scatter around that target. And I remember being taught at the time that the reason for the scatter is that there is just intrinsic variability and unpredictability in manufacturing processes.

So, the methods that were taught about manufacturing quality control in the ‘70’s were all oriented to helping you figure out how to deal with that randomness. And then the quality movement came of age, and what they taught us is, “No, there’s not randomness in manufacturing processes.” Every time you got a result that was bad, it actually had a cause, but it just appeared to be random because you didn’t know what caused it. And so the quality movement then gave us tools to understand what are all the different variables that can affect the consistency of output in a manufacturing operation. And once we could understand what those variables were and then develop methods to control them, manufacturing became not a random process, but something that was highly predictable and controllable.

Horace Dediu: “Transformation of Business and Society through Technology”

My talk on the future of things from Censhare’s FutureDay 2014 in Munich.

Who Solved the Capitalist’s Dilemma?

In The Capitalist’s Dilemma, Clayton Christensen and Derek van Bever introduce a powerful new theory which explains the relative paucity of growth in developed economies. They draw a causal relationship between the mis-application of capital in pursuit of innovation and the failure to grow.[1]

In particular, they observe that capital is allocated toward the types of innovation which increase efficiency or performance and not toward those which create markets (and hence long-term growth and jobs). This, in turn, is caused by the prioritization and rewarding of performance ratios rather than cash flows, which is itself due to a perversion of the purpose of the firm.[2]

For this statement of causality to be confirmed we need to observe whether it predicts measurable phenomena. For instance, we need to see whether companies which create markets apply capital toward market-creating innovations and whether companies which create value through efficiencies or performance improvements hoard abundant capital.

Over the entire global economy, the pattern of capital over-abundance is easy to see. The amount of cash or securities on balance sheets is extraordinary and unprecedented (estimated at $7 trillion, doubling over a decade). However, growing cash is not a perfect indicator of inactivity. Cash is the by-product of earnings after investment. So if operating profits are growing and investment is growing, but not as fast, then it’s possible to grow cash while still growing investment.
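To make that arithmetic concrete, here is a minimal sketch with made-up numbers (not any real company’s financials): profits grow 10% a year, investment grows only 5% a year, and cash piles up even though investment never stops growing.

```python
# Hypothetical illustration (not any company's actual figures): cash can
# accumulate even while investment grows, as long as profits grow faster.
profit = 100.0   # operating profit, arbitrary units
capex = 80.0     # investment (capital expenditure)
cash = 0.0

for year in range(1, 11):
    profit *= 1.10            # profits assumed to grow 10% per year
    capex *= 1.05             # investment assumed to grow only 5% per year
    cash += profit - capex    # the gap accumulates as cash on the balance sheet
    print(f"Year {year:2d}: profit {profit:6.1f}  capex {capex:6.1f}  cash {cash:7.1f}")
```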

The better measure is investment in capital equipment or, more specifically, purchases of plant, property and equipment.[3] Indeed, on a global scale, capital expenditure as a percent of sales is at a 22-year low.

CapEx is a good proxy for non-financial “investment”. It’s also a measure that can be easily obtained as companies report this activity in their Cash Flow Statements.
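As a sketch of the measurement itself, the ratio is simply purchases of PP&E (from the cash flow statement) divided by revenue. The company names and figures below are placeholders for illustration, not real filings.

```python
# Placeholder figures, not real filings: "capex" stands in for purchases of
# property, plant and equipment from the cash flow statement; "revenue" from
# the income statement. Both in millions of dollars.
companies = {
    "HypotheticalMarketCreator": {"capex": 9_000, "revenue": 60_000},
    "HypotheticalSustainer":     {"capex": 1_200, "revenue": 80_000},
}

for name, f in companies.items():
    ratio = f["capex"] / f["revenue"]
    print(f"{name}: capex as a percent of sales = {ratio:.1%}")
```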

So the best method for assessing the theory’s predictive power is to look at market creators and measure their investment in PP&E. At the same time we need to look at market sustainers and measure their (probable) lack of investment in PP&E.

So here is my first attempt:

[Chart: capital expenditures for a small sample of companies over roughly nine years]

It’s an admittedly small sample of companies that are not that dissimilar. But within this group, over a time frame of about 9 years, we can see how capital expenditures are growing.[4] For a few of these companies the amount spent on capital equipment grew dramatically, which is notable since they are in businesses that might be thought of as not capital intensive.

Notes:
  1. and, indirectly, in the increase in inequality and hence the destabilization of socio-political institutions
  2. That being the creation of customers, not shareholder returns
  3. Operating expenditures can also be measured, but they cannot grow inorganically since most of the costs relate to skilled employment, which has supply constraints.
  4. Note that Apple’s data extends to the end of their fiscal year and reflects the forecast given last October in the 10-K filing

Categorizing technologies

In the graph below the grey circles represent US penetration of MP3 players (the percentage of households which own one).

[Chart: US household penetration of MP3 players (grey circles) with iPod touch sales superimposed]

Superimposed on this sparse sample graph is a line showing sales of the iPod touch. This second graph has a different scale, shown with a gridline at 10,000, representing millions of units shipped by Apple. To smooth out seasonality I show the trailing four-quarter average with a thick line.

The correlation is fairly evident. As iPod sales grew, penetration grew, and “peak MP3” was recorded in September 2010 while peak sales occurred at the end of that year.

It’s not a stretch to say that iPod touch sales are causal to MP3 penetration, especially since the iPod has remained the market share leader in the segment for a long time (at least 70% share) and the iPod touch has consistently accounted for half or more of iPods sold.

The absence of data for penetration beyond 2012 is therefore not a problem. We can assume that MP3 devices have a finite lifespan and, if not replaced, the penetration will decline.

I modeled both the increase and decline with a diffusion curve as follows:
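The chart that followed is not reproduced here, but a minimal sketch of that kind of model, assuming a logistic adoption curve and a fixed device lifespan, might look like the following. All parameters are illustrative guesses, not the fitted values behind the original chart.

```python
import math

def logistic(t, saturation, midpoint, rate):
    """Cumulative share of households that have ever adopted by year t."""
    return saturation / (1.0 + math.exp(-rate * (t - midpoint)))

# Illustrative parameters only -- not the fitted values.
SATURATION, MIDPOINT, RATE = 0.50, 2008.0, 0.9
LIFESPAN = 4.0   # assumed years before an unreplaced device drops out

def penetration(year):
    adopted = logistic(year, SATURATION, MIDPOINT, RATE)
    aged_out = logistic(year - LIFESPAN, SATURATION, MIDPOINT, RATE)
    return adopted - aged_out   # households still holding a working device

for year in range(2004, 2017):
    print(year, f"{penetration(year):.1%}")
```

The rise is the adoption curve; the decline comes from the assumption that devices which are not replaced simply age out of the installed base after a few years.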

Postmodern Computing (Summit)


Steve Jobs famously said that Apple stands at the intersection of Technology and the Liberal Arts. He said it more than once because he thought it was an important distinction of the company.

In an intuitive way, the message may have gotten through to the average person, but I don’t think professional observers and managers of technology have quite grasped what he meant.

It’s not a glib throw-away marketing phrase. I can imagine many other, more evocative ways of saying that Apple blends the hard and the soft; the heart and mind, if you will.

His choice of words makes me believe that he meant it as a fundamental blending of two disparate and considered-opposite concepts, rather like yin-yang: things which do not naturally mix but which are complementary, interconnected, interdependent, and give rise to each other.

This interaction, however, is not well understood and even more rarely exploited. The reason the two don’t mix well in business in particular is that individuals are typically not trained in both. Our education systems (from where these phrases originate) are unwilling or unable to provide us with a grounding in both, so individuals tend to absorb only one or the other.

But it turns out that the interaction between these nominal opposites has determined our world to date and will continue to determine our fate. A cursory review of history shows that the “soft”, perceptive and feeling-based disciplines have always combined with the analytical and judgmental to create a future which neither could create alone.

I note how Apple uses this combination to its advantage, and I have used this methodology myself to understand and sense the future. Taking this method further, I would like to share it with others. I would like to recognize some faint but powerful patterns and bare some of the more audacious conclusions of my analysis.

The method chosen is a forum we are convening called The Post Modern Computing Summit.

It’s a small gathering where we are inviting the most enlightened thinkers of the future of computing to lead us into its next age, and perhaps, tentatively, the next era of civilization.[1]

Notes:
  1. We’ll also answer the questions of where tablets are going and where they will take us, what the future of apparel computing is, what intimate computing means, and who will benefit and who won’t.

Innoveracy: Misunderstanding Innovation

Illiteracy is the inability to read and write. Though the percent of sufferers has halved in the last 35 years, currently 15% of the world has this affliction. Innumeracy is the inability to apply simple numerical concepts. The rate of innumeracy is unknown but chances are that it affects over 50% of us. This tragedy impedes our ability to have a discourse on matters related to quantitative judgement while policy decisions increasingly depend on this judgement.

But there is another form of ignorance which seems to be universal: the inability to understand the concept and role of innovation. The way this is exhibited is in the misuse of the term and the inability to discern the difference between novelty, creation, invention and innovation. The result is a failure to understand the causes of success and failure in business and hence the conditions that lead to economic growth.

My contribution to solving this problem is to coin a word: I define innoveracy as the inability to understand creativity and the role it plays in society. Hopefully identifying individual innoveracy will draw attention to the problem enough to help solve it.

One example is in the following quote:

“Lastly, nationally circulating tabloid Ilta-Sanomat gets a look at Nokia’s fabled tablet computer that was developed nine years before the iPad hit the market. According to the paper, Nokia had its own innovative tablet device ready in 2001, but unfortunately it never made it to the shops. A former Nokia expert Esko Yliruusi says that the project was suspended a heartbeat before the tablet hit the market because it was thought that there was insufficient demand for such a device.”[1]

To explain what’s wrong with this usage we need some definitions.

The definition of innovation is easy to find but it’s one thing to read the definition and another to understand its meaning. Rather than defining it again, I propose using a simple taxonomy of related activities that put it in context.

  • Novelty: Something new
  • Creation: Something new and valuable
  • Invention: Something new, having potential value through utility
  • Innovation: Something new and uniquely useful

The taxonomy is illustrated with the following diagram. The position of the circles shows the embedding of meaning.[2]

[Diagram: nested circles illustrating the embedding of novelty, creation, invention and innovation]

To illustrate further, here are some examples of the concepts.

  • Novelties: The choice of Gold as a color for the iPhone; the naming of a version of Android as “Kit Kat”; coining a new word.
  • Creations: The fall collection of a fashion designer; a new movie; a blog post.
  • Inventions: Anything described by a patent; the secret formula for Coca-Cola.
  • Innovations: The iPhone pricing model; Google’s revenue model; the Ford production system; Wal-Mart’s store design; Amazon’s logistics.

The differences are also evident in the mechanisms that exist to protect the works:

  • Novelties are usually not protectable, but since their value is very limited, copying them is not seen to cause harm.
  • Creations are protected by copyright or trademark but are not patentable since they lack utility.
  • Inventions can be protected for a limited time through patents but can also be protected indefinitely by being kept secret. Their uniqueness may also be the means by which they can be kept a secret.
  • Innovations can be protected through market competition but are not defensible through legal means.

Note that the taxonomy has a hierarchy. Creations are novel, inventions are creations and innovations are usually based on some invention. However, inventions are not necessarily innovations, and neither are creations or novelties. Innovations are therefore the most demanding works because they require all the conditions in the hierarchy. Innovations implicitly require defensibility through a unique “operating model”. Put another way, they remain unique because few others can copy them.

To be innovative is very difficult, but because of the difficulty, being innovative is usually well rewarded. Indeed, it might be easier to identify innovations simply by their rewards. It’s almost a certainty that any great business is predicated on an innovation and that the lack of a reward in business means that some aspect of the conditions of innovation were not met.

The causal, if-and-only-if connection with reward is what should be the innovation litmus test. If something fails to change the world (and hence is unrewarded) you can be pretty sure it was not innovative enough.

Which brings us to the quote above. The fact that the Nokia tablet of 2001 not only did not succeed in the market but was not even released implies that it could not have been innovative. The product was only at the stage of perhaps being an invention (if it can be shown to be unique) or merely a creation (if it isn’t.) Furthermore, if the product is so poorly designed that it is literally unusable then it is just a novelty. A design, sketch or verbal description might be novel but it does not qualify as an innovation or an invention or even a creation. How far the depiction went toward making a dent in the universe defines its innovativeness.

Why does this matter?

Understanding that innovation requires passing a market test and that passing that test is immensely rewarding both for the creator and for society at large means that we can focus on how to make it happen. Obsessing over the mere novelties or inventions means we allocate resources which markets won’t reward. Misusing the term and confusing it with activities that don’t create value takes our eye off the causes and moves us away from finding ways of repeatably succeeding.

Recognizing that innoveracy is a problem allows us to address it. Addressing it would mean we could speak a language of value creation that everyone understands.

Wouldn’t that be novel?

Notes:
  1. A video showing the device is here, in Finnish.
  2. The size of the circles also suggests degree of effort required and potential reward. Note that this is not a Venn diagram.

Postmodern computing

There are 7.1 billion people on Earth. Coincidentally there are also 7 billion mobile connections.  Those connections are held by 3.45 billion unique mobile subscribers.[1] Unsurprisingly, the largest national mobile markets (by number of subscriptions) correspond closely to the most populous nations.

[Chart: the largest national mobile markets by number of subscriptions]

Considering smartphones, last year 1 billion smartphones were sold and the number of smartphones in use is about 2 billion.[2]

Given the rapid adoption of smartphones, it’s also safe to assume that smartphone penetration will follow population distribution. In the US, where comScore data is published monthly, penetration is following a predictable logistic curve.

[Chart: US smartphone penetration (comScore data) following a logistic curve]

 

Assuming similar patterns world-wide we can forecast regional smartphone penetration.

[Chart: regional smartphone penetration forecast]

This yields the following forecast for smartphone usage world-wide.
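The forecast chart is not reproduced here, but a minimal sketch of the approach, assuming a logistic penetration curve per region scaled by population, would look like the following. The regional parameters and populations are rough placeholders, not the fitted values behind the original forecast.

```python
import math

def logistic(year, ceiling, midpoint, rate):
    """Share of a region's population using a smartphone in a given year."""
    return ceiling / (1.0 + math.exp(-rate * (year - midpoint)))

# (ceiling, midpoint year, steepness, population in billions) -- placeholders.
regions = {
    "North America": (0.85, 2012.0, 0.8, 0.35),
    "Europe":        (0.80, 2013.0, 0.7, 0.74),
    "Asia":          (0.70, 2016.0, 0.6, 4.30),
    "Rest of world": (0.60, 2017.0, 0.5, 1.70),
}

for year in (2014, 2016, 2018, 2020):
    users = sum(pop * logistic(year, c, m, r) for (c, m, r, pop) in regions.values())
    print(f"{year}: roughly {users:.1f} billion smartphone users")
```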

Notes:
  1. GSMA
  2. There are also about 2 billion 3G/4G connections world-wide

The price is right

One of the axioms of the hardware business is that prices fall over time. The consumer price index for personal computers and peripheral equipment from 1998 to 2014 is shown below:

[Chart: US consumer price index for personal computers and peripheral equipment (BLS series CUUR0000SEEE01), 1998–2014]

The price index suggests that prices for computers should be 54% of 2007 levels. Charles Arthur illustrated this on a global basis using a separate set of data.

The data shows that the weighted average selling price (ASP) of a PC has fallen from $614.60 in the first quarter of 2010 to just $544.30 in the third quarter of 2013, the most recent date for which data is available.
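As a quick check of the rate of decline implied by those two figures (treating the span from Q1 2010 to Q3 2013 as roughly 3.5 years):

```python
start_asp, end_asp = 614.60, 544.30   # Q1 2010 and Q3 2013 ASPs from the quote
years = 3.5                           # approximate elapsed time

total_decline = 1 - end_asp / start_asp
annualized = (end_asp / start_asp) ** (1 / years) - 1
print(f"Total decline: {total_decline:.1%}")        # about 11.4%
print(f"Annualized:   {annualized:+.1%} per year")  # about -3.4% per year
```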

Invulnerable

On a recent podcast I noted that Google was perceived as invulnerable. In contrast, Apple is seen as temporarily enjoying a stay of execution.[1] This is not necessarily a bad thing for Apple. The more gushing the loathing or scorn, the more likely it’s a reaction to love and attraction. A brand dies not from hate but from apathy.

But nor is it necessarily a good thing for Google to be seen as invulnerable. There might be no “Google death knell counter”. There might not be a “Google is doomed” trope. If an executive from Google quits or is fired there is no investor panic. If a product is withdrawn there is no mourning. There are no journalists pursuing Pulitzer prizes by describing some seamy underside of Google. But there are no overt displays of affection either. Google is seen, on balance, as benevolent and hopeful. The discussion on business robustness is simply missing.

I suspect the absence of scrutiny comes from Google being seen as an analogy of the Internet itself. We don’t question the survival of the Internet so we don’t question the survival of Google — its backbone, its index, and its pervasive ads which, somehow, keep the lights on. We believe Google is infrastructure. We don’t dwell on whether electric grids are vulnerable, or supplies of fuel, or the weather(!)

Too complex, too pervasive. These are systems, not things. And people are not designed to contemplate systems. We leave that to experts, or better yet, computers.

The reason Apple is contemplated at all is that it’s not seen as a system. Even the suggestion that Apple is a system is implicitly treated as an impossibility. Because it’s not a system it’s fragile. It’s a person, or an idea, or a product or a singular “key” to something. It is, ultimately, mortal. The only debate is when it will die and points are earned for calling it sooner rather than later.

But what if Apple were a system? And what if Google were a person (or three?)

 

Notes:
  1. The list of Apple Achilles’ Heels is so long and creatively composed that it would take ages to compile, but here are just a few: the Mac (vs. Windows), Digital Rights Management (which kept the iPod alive), dozens of lawsuits (including from The Beatles), the Mac (when it ran Windows), PlaysForSure, Music Labels retaliation, the Zune, Android and clones, the Kindle and Amazon in general, more Mac, iTunes, iPod and iPhone and iPad killers than can be counted; Steve Jobs is ill, Jony Ive will quit, Tony Fadell quit, Rubinstein quit, Forstall was fired, etc. Feel free to add more through comments. See also the Apple Death Knell Counter.

But Apple does not pursue profits either!

In my essay on Google’s absence of profit (or income, or business) motives, questions were raised about Apple’s stated absence of hunger for profits and what difference there might be from Google’s philosophy.

Indeed, Jony Ive stated:

“Our goal isn’t to make money. Our goal absolutely at Apple is not to make money. It may sound a little flippant, but it’s the truth.”[1]

He was probably repeating what Jobs had previously stated:

“I remember very clearly Steve announcing that our goal is not just to make money but to make great products.”[2]

However, note that both quotes are qualified. In the case of Jobs, he said “not just to make money”. Jobs clearly stated that great products lead to money: that great products are causal to money, and therefore that if you make great products you make money. One leads to the other.

Ive also continued in this reasoning:

“Our goal, and what gets us excited, is to try to make great products. We trust that if we are successful people will like them. And if we are operationally competent we will make revenue. But we are very clear about our goal.”

I would paraphrase the Apple logic as “Great products are the means by which we sustain our business. By focusing on the product, the customer is satisfied and through that satisfaction we create the free cash flows which can be used to fund more products.”

There is a difference between Apple’s “indifference to money” and the “indifference to business models” that Google exhibits.

Google steps even further away from cash flows. Its goal is to build great things guided by its vision and by patterns in the data it collects. The value is in the data itself rather than in any transaction.

As long as the source of money is unfettered, its provenance is uninteresting. A business model is a profit algorithm. It could be linked to the data but it need not be. Markets are messy and imperfect. Data provides much clearer views into value. You could conclude that value itself cannot be trusted to the judgement of the public. Value is to be determined through the recognition of patterns on data privately collected.

So when I say that Google has disdain for market mechanisms I mean that they believe they can do better. Apple still values the user as the ultimate adjudicator of its actions. Google looks past the user and interprets their intentions.

Google sees markets as ultimately obsolete.

Notes:
  1. at the British Embassy’s Creative Summit in July 2012
  2. Walter Isaacson’s Steve Jobs