Who killed the Intel microprocessor?

The recently announced move by Microsoft to support the ARM architecture with its Windows product indicates that something profound is happening in the market for microprocessors.

Dr. Hermann Hauser puts it bluntly:

“The reason why ARM is going to kill the microprocessor is not because Intel will not eventually produce an Atom [Intel’s low-power microprocessor] that might be as good as an ARM, but because Intel has the wrong business model,” said Dr. Hauser. “People in the mobile phone architecture do not buy microprocessors. So if you sell microprocessors you have the wrong model. They license them. So it’s not Intel vs. ARM, it is Intel vs. every single semiconductor company in the world.”

via Intel Microprocessor Business ‘Doomed,’ Claims ARM Co-Founder – Tech Europe – WSJ.

To make sense of that you have to step back and look at what’s been happening in microprocessors and how mobile computing is affecting the whole processor value chain.

The first observation to be made is that, as Dr. Hauser points out, ARM licensees use a different model than Intel does. The development of an ARM-based microprocessor is done in a modular way, in contrast to the integrated way that Intel builds its products. ARM supplies a license, and designers build the chip, adding various other circuits such as Bluetooth or music decoding on the same silicon as the processor, creating a so-called SOC (system on a chip). Nokia has been doing this for years, and Apple began doing it recently with its A4 SOC.

The design is then sent to a fabricator that actually manufactures the chip (TI and Samsung are among the bigger ones), often in smaller batches than an integrated vendor produces. SOC fabs need to be much more flexible in their production and are much more sensitive to the needs of their customers, the designers.

In contrast, Intel has a proprietary microprocessor architecture that is not available for licensing; it has its own designers and its own fabs to build chips that it sells from a catalog to device builders. PC or device vendors only get to use what Intel offers, and there is little in the way of customization. (In a rare exception, Apple famously forced Intel to modify the packaging of the CPU fitted to the first MacBook Air.)

Because of these differences in business models, the markets for SOCs and microprocessors are diverging rapidly, and the priorities of the two business models are becoming completely asymmetric.

To illustrate, I draw upon an observation[1] that, due to Moore’s Law, fabs, through the equipment they deploy (e.g. from Applied Materials), make 60% more transistors available on a given area of silicon each year than were available the year before (60% annual growth is just the familiar doubling every 18 months). Intel almost always takes advantage of this by adding more cores or more cache.

But SOC designers are only able to utilize about 20% more transistors each year than they did the year before. The reason is that they have constrained design budgets; they don’t have enough money or time to design circuits complex enough to utilize all the transistors that Moore’s Law makes available. Plus, when it comes to devices, they don’t need the power.
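To see how quickly that gap compounds, here is a back-of-the-envelope sketch (assuming, purely for illustration, that the 60% and 20% annual rates hold steady):

\[
\frac{\text{transistors available}}{\text{transistors utilized}} = \left(\frac{1.6}{1.2}\right)^{n} \approx 1.33^{n}
\]

After five years the surplus is roughly 4.2×; after ten years, roughly 18×.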

What that means is that, for most of the volume applications in the world, circuit designers are actually awash in transistors. They are being over-served by the advances in fabrication equipment. Intel designers, however, aren’t. Being at the very high end, they still need ever finer line widths and demand that Moore’s Law take its next leap of technology.

What value chain evolution theory would then predict is that circuits which had to be proprietary and interdependent in their architecture are following an over-service path. In contrast, to serve the needs of SOC designers, fabs, as suppliers, will have to change their pitch for winning new business: they will have to be fast, flexible, and responsive, and able to deliver systems on chips that offer every device exactly the functionality it needs and none of the functionality it doesn’t.

Now, in the world of mobile computing (and in almost any situation where logic gets embedded in a system), it’s the device itself that is not yet good enough, and therefore you cannot back off from pushing all the components to conform to your device’s purpose. For a device to be competitive, it has to be optimized with a proprietary, interdependent architecture. That means the processor inside a BlackBerry or an N8 has to be modular and conformable to allow the system to be optimized.

This is a quirky idea, so I’ll repeat it: since the device is not good enough, it has to be integrated, and the processor inside it, being good enough, has to be conformable. The processor has to bend to the will of the device market, not vice versa.

This then puts Dr. Hauser’s point in perspective: a modular business architecture for microprocessors is a necessary condition for successfully supplying devices. The corollary is that Intel’s integrated business model is obsolete in a device world. No amount of polishing the Atom will help.

The root cause of this obsolescence is the overshooting of Moore’s Law: processing power (and hence circuit density) has become superfluous, while power consumption, size, and flexibility of design are straining to eke out any improvement.

So this renaissance of ARM is happening now because Moore’s Law has overshot what most circuit designers can utilize. And that overshooting is happening because mobile computing products don’t need a multi-core, multi-gigahertz processor, allowing designers to back away from the frontier of what’s possible.

It all culminates in this: the places in the value chain where attractive profits can be earned are migrating in a very predictable way, away from Intel and toward ARM.

So, in a perverse way, it was Intel co-founder Gordon Moore’s law that killed the Intel microprocessor.

[1] This discussion is inspired by Clayton Christensen’s “Capturing the Upside” presentation at the Open Source Business Conference, 2004-03-17. Although the talk took place nearly seven years ago, Clay’s vision regarding the future of the microprocessor was astonishingly, though perhaps unsurprisingly, prescient.

