Silicon Valley

You’ve probably heard of Jony at Apple but may not know about Johny.

Jony is a celebrity executive known as the face of Apple Design. Johny is the executive in charge of custom silicon and hardware technologies across Apple’s entire product line.

Under Johny’s leadership, Apple has shipped 1.7 billion processors in more than 20 models and 11 generations. Currently Apple ships more microprocessors than Intel.[1]

The Apple A11 Bionic processor has 4.3 billion transistors, six cores and an Apple custom GPU, built on a 10nm FinFET process. Its performance appears to be almost double that of competitors and in some benchmarks exceeds the performance of current laptop PCs.

A decade after making the commitment to control the critical subsystems in its (mobile) products, Apple has come to the point where it dominates the processor space. But they have not stopped at processors. The effort now spans all manner of silicon, including controllers for displays, storage, sensors and batteries. The S series in the Apple Watch, the haptic T series in the MacBook and the wireless W series in AirPods are ongoing efforts. The GPU was conquered in the past year. Litigation with Qualcomm suggests the communications stack is next.

This across-the-board approach to silicon is not easy or fast or cheap. This multi-year, multi-billion dollar commitment is rooted in the Jobsian observation that the existing supplier network is not good enough for what you’re driving at. Tiny AirPods, Smart Watches, Augmented Reality and Adaptive Acoustics require wrapping your arms around all parts of the problem. The integration and control they demand are in contrast to the modular approach of assembling off-the-shelf components into a good-enough configuration.

There are times and places where modules are adequate and times and places where they aren’t. The decision depends on whether you are creating new experiences or new “measures of performance” vs. optimizing for cost within existing experiences or measures of performance.

The very notion of a microprocessor is a rejection of the discrete component designs that preceded it. Earlier computers had central processors made up of many discrete components. VLSI stands for Very Large Scale Integration, with emphasis on Integration. As computing has progressed toward ambience and ubiquity, the idea of using discrete components became normative again, but that was not considered sufficient by Apple.

So while the “Silicon” in Silicon Valley has come to be seen as an anachronism, silicon development today means competitive advantage. The only problem is that it takes years, decades even, to establish competence: the same duration it took to build Apple into a design-centric business fronted by Jony Ive.

Apple also now needs to be understood along the dimension of silicon-centric engineering as led by Johny Srouji.

  1. Trailing 12 months’ PC shipments: 265 million. Equivalent iOS devices: 281 million. Not included are Apple processors in Apple TV.
  • Roger

    Apple have become better at building (systems on) chips than the semiconductor companies have been at building systems (on chips).

  • klahanas

    Well put. Apple in some ways parallels some of the old workstation companies, like Silicon Graphics, Sun, etc., but in the consumer space (which protects them with volume).

    Personally, I’m very leery of being beholden to a single source, for the same reasons I was with the workstation vendors. This is evidenced by the inevitable lock-in/lock-out situation. If I’m going to be locked in, I personally would want to be locked in to the broadest, most open system available.

    • jbelkin

      Except in the consumer market, that leads to commoditization in branding, the death knell, as there are always new assemblers in another country who can undercut you. By owning their own chips, in volumes larger than even Intel’s (as noted), Apple gets BOTH economies of scale and control of the pipeline.

      • klahanas

        From a business point of view, I agree, it’s very good for Apple. As a user, I think it stinks. I like commoditization; it favors the user. And don’t tell me it stifles innovation, because the PC ushered in so much innovation, from every angle, that it’s what truly changed the world. As did the internet.

        Central control stifles innovation in at least two ways:
        a) One company can’t do it all.
        b) Censorship stifles innovation.

      • Scott Sterling

        I personally think the smartphone has already resulted in more innovation than the PC. The PC was greatly hampered by the MS monopoly in operating systems. While they offered standardization, they also prevented others from innovating.

        If I’m wrong, it won’t be too long before it is true.

      • klahanas

        Other than mobility, I don’t think it’s possible that I can disagree more.

      • Fran_Kostella

        I think you’re on the mark. I worked a lot on the MS stack in the late 80s and 90s, and the only real attempt they made at fixing their kernel, NT (New Technology), was a step up, but it never really improved very much and their multitasking never worked well. Contrast that with the steady improvements to the iOS core that come out every year, making great improvements that benefit all apps or provide fantastic new capabilities. It seems obvious that they are really paying attention to improving the system as much as they can. And they have long-term plans!

        In the early 00s I was maintaining an ORM that supported most popular databases, including MS databases, and every release they did was a series of new corner cases to discover and document and patch around, often introducing new errors into stable APIs or breaking stable behaviors over 10 years old. And then there were the new relational database APIs introduced every year or two, none of which seemed to be improvements over older APIs. From the outside it looked like rewards went to those who introduced complexity and that nobody was trying to make fundamental improvements. Perhaps they believed their monopoly protected them forever, so there was plenty of time?

        All the devs I grew up with from late 80s to early 90s went from loving MS to utterly despising them for their poor technical shepherding. That’s why we all moved to unix systems, I think.

      • klahanas

        Insofar as the *nix BSD core multitasks better, we can agree. But that did not promote the innovation that standardizing around an open-access, expandable, free-to-program platform did. I agree, unfortunately, that it was MS that had a disproportionate amount of the power.

        Contrast that with the power Apple has. MS could never, ever, impose a single store for Windows programs. Yet Apple does; must be the market share.

        Finally, in humor, you shouldn’t use “shepherding” in an Apple conversation. People get upset.

      • Fran_Kostella

        Well, “eventually free” platform. My first few years as a MS consulting dev I was paying about a grand annually to get access to the developer program and tools, which was a big chunk of change back then and hard to do without. Apple’s $100 is much more reasonable, cheap even, but they could probably drop the fee at this point.

        I think the real problem with MS was the need to tie everything back into the main cash cows. Innovation? Yes, at first. Pre-MS the market was fragmented and full of versions and weird hacks. I recall running CP/M for some apps, and Macs were astronomically expensive at the time, and you never knew if your machine would actually run something or not. Windows built a simpler marketplace by making the hardware less important for end users and enterprises, with some kind of reasonable standard. Unless you made a competing product, in which case lots of API changes always seemed to appear that delayed product launches, which benefitted MS. From where I sit, there were about 5-8 years where MS was a force for good, then… well, the hardball tactics backfired. They eventually pissed off Jim Clark and we know the rest.

      • klahanas

        If it were up to me, there would have been a split into OS and Applications. The actual “bundling” of IE was lame, but had legal teeth.

        I was far more offended by the private APIs and undocumented functions that the applications groups had access to, but competitors didn’t. Clearly anti-competitive. Though not as bad as flat-out forbidding, which is worse.

      • anonymous

        secundum quid et simpliciter

      • klahanas

        Caveat Emptor

      • tmay

        Horace has mentioned that diffusion is the reason that the auto industry is difficult to disrupt from the perspective of another automaker. From the perspective of other modes of transportation, disruption is quite possible.

        Diffusion is likely why you will get your commoditization, maybe a generation behind Apple, and innovation is why Apple will still be picking up most of the profits.

        Certainly there are other players in smartphones that are providing “innovations”, albeit most are really just immature “features”, but Apple having almost full control of its technology stack is still a huge advantage.

        I look forward to seeing another rising star like Huawei attempt to create a business model like Apple’s while at the same time, being beholden to the commoditization in Android OS OEM’s.

  • Luis Alejandro Masanti

    Thanks again. I won’t shy away from telling you that yours are the articles I seek out most.
    From long years of reading you, I deduce that you ‘love’ modularity, and maybe you’d like it to become as important an idea as disruption. And I’m with you.

    On the other hand… there is the world.
    quote: “The Apple A11 Bionic processor has 4.3 billion transistors…”
    I’m trying to picture a zillion employees in Foxconn’s factories soldering that quantity of transistors to make ONE iPhone!
    What I want to say is that ‘modularity’ has ‘levels.’
    For an airline, a ‘module’ could be an aircraft. But for the aircraft company, a ‘module’ could be the ‘body,’ the ‘wings,’ the ‘motor.’
    And with time and scale these ‘modules’ change.
    That’s the reason why Apple is now making its own chips.
    For Apple, chips are its ‘module’ level now.
    As a matter of fact, note that even ‘the chip is the module’ has changed. At first, Apple built SoCs out of ‘module’ parts: ARM designs, others’ GPU designs…
    Now, Apple’s control of the ‘module’ goes deeper…

    quote: “The decision depends on whether you are creating new experiences or new “measures of performance” vs. optimizing for cost within existing experiences or measures of performance.”

    This is —at least for me— the measure of ‘modularization.’ Common or good-enough modules are ‘horizontally optimized’: Intel makes CPUs for every PC maker…
    Apple, at 70 million devices per quarter, has the size to do its own ‘vertical optimization.’
    And they are ‘optimizing for cost,’ but at the ‘whole device’ level.

    Again, you are right (as so many times before): we have to pay more attention to Apple’s Siliconization!

  • Thanks again for a great post, Horace.

    Anyone who wants a good belly laugh, here’s Jony Ive (pre-Steve Jobs by the way), doing what he does but about the 20th Century Mac. The guy actually had hair!

    • Space Gorilla

      Interesting, thanks. It’s clear from the video that some of the products Apple is creating today, they were already laying the groundwork and thinking about various aspects and features 20 years ago. Much like Apple’s capability when it comes to silicon, that has been a decade in the making and is paying off huge now.

  • BMc

    The overall silicon prowess of Apple, what it means for their future and their ability to continue to lead the mobile and CE industries, seems to be a relatively unreported story outside of Apple-centric sites. Oh, the media have a little write-up on the “whiz-bang” numbers of each new Ax chip, but that is where it stops.

    Apple’s ability to invest in proprietary silicon is *increasing*, with both growing profits and successful launches of new chips selling in 10M+ to 100M+ volumes each year. With the A11, the inclusion of a “neural engine” for specific ML functions is a milestone. Apple has wowed with its image processing for a few years, along with increasing CPU/GPU performance, but now the specialized silicon is advancing exponentially.

    It isn’t hard to imagine future silicon dedicated to improving voice assistants. On-board speech recognition, understanding simple commands, could be a game changer in this space and put Apple back in the lead.

  • iwod

    I think the Intel figure should include processors used in servers; the Xeon E3 is really the same as the Core i7. By that account, Intel still has a lead in processor shipments.

    But assuming the trend continues, 2018 will be the year Apple exceeds those numbers.

    It would also be much more interesting to compare transistor counts, since units shipped vary a lot in transistor count. By that measure, I think Apple has exceeded Intel already.

  • Don Joe

    This is only happening because anti-trust authorities are asleep at the wheel or, more probably, paid off. Google and Apple and a few others should have been blocked from becoming so large and powerful a long time ago. We’re headed in a dangerous direction.

    • Rick Deckard

      Hi Don, could you expound on your thoughts? I’m interested to hear more.
      Thank you in advance.

  • beautiful freak

    Is Apple ever going to buy a foundry? Under what conditions would that make sense? I wonder how comfortable they can be entrusting chip masks to TSMC et al, but their approach seems to be to innovate so fast that nobody can keep up, even given the blueprints. Their reliance on a single fab is a risk, but nobody seems to worry about it. I’d like to think Apple has a lithography skunkworks, so they can own those last few recalcitrant nanometers.