
Who killed the Intel microprocessor?

The recently announced move by Microsoft to support the ARM architecture with their Windows product indicates that something profound is happening in the market for microprocessors.

Dr. Hermann Hauser puts it bluntly:

“The reason why ARM is going to kill the microprocessor is not because Intel will not eventually produce an Atom [Intel's low-power microprocessor] that might be as good as an ARM, but because Intel has the wrong business model,” said Dr. Hauser. “People in the mobile phone architecture do not buy microprocessors. So if you sell microprocessors you have the wrong model. They license them. So it’s not Intel vs. ARM, it is Intel vs. every single semiconductor company in the world.”

via Intel Microprocessor Business ‘Doomed,’ Claims ARM Co-Founder – Tech Europe – WSJ.

To make sense of that you have to step back and look at what’s been happening in microprocessors and how mobile computing is affecting the whole processor value chain.

The first observation to be made is that, as Dr. Hauser points out, ARM licensees use a different model than Intel. The development of an ARM-based microprocessor is done in a modular way, in contrast to the integrated way that Intel builds its products. ARM supplies a license, and designers build the chip, adding various other circuits like Bluetooth or music decoding on the same silicon as the processor, creating a so-called SoC (system on a chip). Nokia has been doing this for years and Apple began doing it recently with their A4 SoC.

The design is then sent to a fabricator who actually manufactures the chip (TI and Samsung are some of the bigger ones), often in smaller batches than an integrated vendor produces. SoC fabs need to be much more flexible in their production and are much more sensitive to the needs of their customers, the designers.

In contrast, Intel has a proprietary microprocessor architecture that is not available for licensing, and they have their own designers and their own fabs to build chips that they sell from a catalog to device builders. PC or device vendors only get to use what Intel offers, and there is little in the way of customization. (In a rare exception, Apple famously forced Intel to modify the packaging of the first CPU fitted to a MacBook Air.)

Because of the differences in business models, the markets for SoCs and microprocessors are diverging rapidly, with their business model priorities becoming completely asymmetric.

To illustrate, I draw upon an observation[1] that, due to Moore's Law, the equipment fabs deploy (e.g. from Applied Materials) makes 60% more transistors available on a given area of silicon each year than were available the year before. Intel almost always takes advantage of this by adding more cores or more cache.

But if you look at the ability of SoC designers to utilize transistors, they can only utilize about 20% more transistors than they could the year before. The reason is that they have constrained design budgets: they don't have enough money or time to design circuits complex enough to utilize all the transistors that Moore's Law makes available. Plus, when it comes to devices, they don't need the power.

What that means is that, for most of the volume applications in the world, circuit designers are actually awash in transistors. They are being over-served by the advances in fabrication equipment. Intel designers, however, aren't. Being at the very high end, they still need even finer line widths and demand that Moore's Law take the next leap of technology.
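The gap between what fabs supply and what designers can use compounds quickly. A toy calculation (the 60% and 20% annual rates are the illustrative figures from the observation above, not measured data) makes the point:

```python
# Toy model of the over-service gap: fabs make ~60% more transistors
# available each year, while SoC design teams can only utilize ~20% more.
# These rates are illustrative, taken from the observation cited above.
FAB_GROWTH = 1.60      # transistor supply growth, year over year
DESIGN_GROWTH = 1.20   # transistor utilization growth, year over year

def surplus_ratio(years: int) -> float:
    """How many times more transistors are supplied than utilized after `years`."""
    return (FAB_GROWTH ** years) / (DESIGN_GROWTH ** years)

for years in (1, 3, 5, 10):
    print(f"After {years:2d} years: {surplus_ratio(years):5.1f}x more supplied than used")
```

After a single year the surplus is modest (about 1.3x), but after a decade fabs are offering nearly 18 times more transistors than a constrained design budget can absorb, which is why over-service, not scarcity, becomes the defining condition for SoC designers.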

What value chain evolution theory would then predict is that circuits which had to be proprietary and interdependent in their architecture are following an over-service path. In contrast, to serve the needs of SoC designers, fabs, as suppliers, will have to change their pitch for winning new business: to being fast, flexible and responsive, and to delivering systems on chips that offer every device exactly the functionality it needs and none of the functionality it doesn't.

Now in the world of mobile computing (and almost any situation where logic gets embedded in a system) it’s the device itself that is not yet good enough and, therefore, you cannot back off from pushing all the components to conform to your device’s purpose.  For a device to be competitive it has to be optimized with a proprietary, interdependent architecture.  That means that the processor inside of a BlackBerry or an N8 has to be modular and conformable to allow the system to be optimized.

This is a quirky idea so I’ll repeat it: since the device is not good enough it has to be integrated, including the processor, which being good enough, has to be conformable. The processor has to bend to the will of the device market not vice versa.

This then puts Dr. Hauser’s point in perspective: a modular business architecture for microprocessors is a necessary condition to being successful supplying devices. The corollary is that Intel’s integrated business model is obsolete in a device world. No amount of polishing of the Atom will help.

The root cause of this obsolescence is the overshooting of Moore's Law: processing power (and hence circuit density) has become superfluous, while power consumption, size and flexibility of design are straining to eke out any improvement.

So this renaissance of ARM is happening now because Moore’s Law has overshot what most circuit designers can utilize. And that overshooting is happening because mobile computing products don’t need a multi-core multi-gigahertz processor, allowing designers to back away from the frontier of what’s possible.

What it culminates in is that the places in the value chain where attractive profits can be earned are migrating in a very predictable way: away from Intel and toward ARM.

So in a perverse way, it was the law named for Intel co-founder Gordon Moore that killed the Intel microprocessor.

[1] This discussion is inspired by Clayton Christensen’s “Capturing the Upside” presentation at the Open Source Business Conference 2004-03-17. Although the talk took place nearly seven years ago, Clay’s vision regarding the future of the microprocessor was astonishingly, though perhaps unsurprisingly, prescient.

  • minimoog

But marketing and competition will demand to undershoot. Dual core this year. Quad core next year. 512 MB this year, 1 GB next year. Obsoleting smartphones every six months.

    Intel microprocessor is not dead.

  • Horace the Grump

    @ minimoog… you've missed the point, which is actually quite subtle… Intel's approach is to make a processor and sell that to the market, which due to growth has been willing to lap up the next big thing – its a 'supply push' model that works well in growth markets – the market will take just about everything it can get…

    However, with mobile devices it is a 'demand pull' model… the device (phone/PDA/whatever) want a chip that does X number of things, but no more, because of the constraints on mobile – battery life being the biggest issue. So the business model is completely different…

    Intel's business model is like an all you can eat buffet and the system builders are all large eaters. Whereas ARM's business model is like a high end restaurant where the system builders are all supermodels….

    • Chris

      So if battery life is the biggest issue, and battery technology changes in any significant way, does it switch back to a supply push model?

      • asymco

        Battery life is an issue but not the only issue. Packaging, customization and time to market are also issues with an integrated CPU development model. The more likely tipping point in the value chain would occur when the device becomes good enough and standardized to a single design, similar to the PC model.

        Although some people assume that already has happened, I would argue that devices still have a way to go before they need to stop improving.

      • CndnRschr

In addition, the handset makers want to differentiate their products, which is far easier to do with SoCs than with off-the-shelf chips from Intel that everyone has access to. Even/especially with Android, the OEMs are desperate to stand out from their competitors and the only way to do so is to incorporate their own SoC designs. Tegra2 is great, but eking out additional efficiencies and solutions that increase battery life, etc. are important differentiators. Apple clearly sees this as their strongest card since they FINALLY control the chip design AND software. That is the most powerful place to be as your fate is in your hands. No need to even depend on a single fab.

  • Frank

    500Mhz ARM Cortex A9 keeping pace (more or less) with a 1600Mhz Atom
    http://www.youtube.com/watch?v=W4W6lVQl3QA

    Not only is ARM more appealing to device architects, it's as good if not better than Atom in every way (computational power, power consumption, design integration, cost, etc.).

    Presumably Intel could, if they wanted to, get into the custom fab/SoC design game and offer bespoke Atom SoCs but they'd still be fighting a losing battle. Their only protection is marketing, and they know how to use that very well, but I suspect ARM do as well.

  • Nikolay Andreev

So Intel’s best option is to become the ARM SoC fab of, say, Apple, as Intel has the BEST manufacturing facilities and will likely make a smaller and better chip than, say, Samsung.

Or perhaps they can start offering reference designs themselves which, after customization by the licensee in question, are returned to Intel for manufacturing.

  • Nikolay Andreev

One other thing… Intel’s current processors cost about 5x to 10x what ARM chips do, partly because the design cost and R&D are included in the final price. So in essence Intel forces manufacturers to pay for the designs Intel conjures up and then design their products around the chip they buy. No wonder computers looked the same for 30 years.

Intel will have to give up the chip design and the costs/benefits associated with it to compete with ARM chips on price, let alone features.

The last thing I see them doing to avoid options 1 & 2 from my previous post is to design a phone and OS for their own chips. Only this way can they hope to retain their high profit margins. Maybe this is what they hope to do with MeeGo, but it seems that’s just way too far into the future and way too late.

Great analysis, Horace. Asymco is one of the blogs with the highest original-content-to-post ratio on the net, if not the highest.

  • http://twitter.com/jeffunity @jeffunity

Intel needs two things – a new battery innovation to allow Atom to be competitive with RISC sets like ARM, and SoC capability with Atom. Unfortunately the latter would take 2 years to get off the ground and the former probably longer than that. Intel's bottom line is secure for now but they are in a RIM-like nexus – do they continue with their (very profitable) existing business or risk it all to jump onto SoC with the hope that they will succeed?

  • http://twitter.com/brisance @brisance

    The entire PC industry has been too hung up on specs. It's as if the only way to sell a car is by advertising its horsepower, which is senseless because comfort, crash safety, fuel efficiency etc are all important factors today. And that's how mobile devices are, or should be, sold. No one (except the geeks) care about how much RAM or GHz a mobile device has. In Apple's case, the apps and easy integration with iTunes are the competitive advantages.

    • David

How much battery life does the Nexus One have? What's the resolution on the iPhone screen again? How many jigapixels on my whatsit and does it come with wifis?

      Specs are there. They will be there for a looong time.

      • dchu220

Hey David. I think the commenter's case is that there are so many important factors in devices that trying to limit them to a handful of specs does not give you the full picture. What is important are the benefits of the device.

        For example. You can talk about the number of hours of battery life, or you can simply say the phone will last all day. One is a spec. The other is a benefit.

    • kizedek

      It's worse than that. Anyone can get 200 horsepower out of a 5-litre engine. It takes a little more work and expertise to get 200 horsepower out of 2 litres, or 1.5. Formula one teams are small, dedicated teams.

      Some more fine-tuning is required, and the ability to say no to certain things — maybe reduce the spec sheet by dropping the power steering or something.

  • stsk

    1. Intel killer is about as bright a meme as iPhone killer, or iOS killer.
2. These points, if any, are true for mobile, period. Horses for courses in servers/laptops/desktops/workstations.
    3. How much of Intel's revenue and income comes from mobile? If you take that all away… so what? Right now, ARM is more an Intel paper-cutter than Intel killer.
    4. It may be true that Intel's business model makes it difficult for them to compete in mobile… BUT if they were nimble there is nothing inherently stopping them structurally from modular production and their vertical integration presumably makes their costs lower than separate design/fab structure. Note: I'm not saying they are nimble, but they could be.
    5. Time will tell if it's David slaying Goliath or "We have awakened a sleeping giant and filled him with a terrible resolve"… could go either way.

    • dchu220

      I think you might be taking a very static view of the market.

      The problem for Intel is that power consumption is quickly becoming more important than pure processing speed. It is only a matter of time before you begin to see ARM chips used for servers in power hungry data centers. Eventually, the processing power of ARM chips will become 'good enough' and when that happens, Intel could be in a lot of trouble.

      That's what we call skating towards the puck.

      • Sandeep

        "Eventually, the processing power of ARM chips will become 'good enough' and when that happens, Intel could be in a lot of trouble. "

exactly the same is going to happen to iOS-based devices when compared to Android-based devices: when Android-based devices become 'good enough', the sheer volume will inundate the iOS-based devices.

      • dchu220

I don't think anybody will argue with you that Android is going to lead in market share.

        But I don't see the big benefit that Android can provide that should scare Apple. ARM has a clear edge in processing power to power consumption ratio.

        There still are areas where Android is still not good enough. The main area right now is to create an ecosystem for developers and third parties to make money from their platform. We will see if Amazon can solve this problem for Android, but it's not as simple as putting up a store. Amazon will need to work with the carriers and manufacturers, all of whom want their piece of the pie. If they can pull it off, I do agree that Apple could be in trouble, but it won't be easy.

      • Sandeep

Android is designed with modularity in mind, and Google is kind of the ARM equivalent on the software front, except that it is not looking to make money through licensing the way ARM does.

      • kizedek

        Sandeep, I think you are conflating a number of things. Horace has made a good case for Apple doing well with its integrated model versus Google's modular model. In contrast, ARM has more of a modular approach, while Intel has the integrated approach. This article doesn't imply Google's modular approach with Android is such a good thing, just because ARM is "modular". Different case entirely…

A chip is a small component of a mobile device, one the user is unaware of; the device itself must provide a whole UX that the user is all too aware of. The UX is affected by the integration of the individual hardware components, the software, and the ecosystem. With a complex end-product that has life-style implications, integration is a great way to go.

        With Intel, the chip choices are limited, and mobile device companies have to build their device around the chip (many of which are unsuitable). With ARM, the mobile device company gets to choose features that directly relate to the end product that they have designed. ARM licensees get to influence the integration of the components that make up their own products.

        Apple takes this component integration a step further and develops the software, too. This means that Apple can eke out better performance from the components, and thus improve the UX and differentiation of its mobile devices. Makes the support and customer satisfaction better, too.

        Android is already "good enough" (There have been plenty of comments to that effect on the last couple of articles, including yours). The thing is, who is nimble and able to continue to innovate as technologies and computing paradigms change? Who is changing the paradigms? ARM rather than Intel on the chip front, and Apple (proven again and again) for the actual devices and experience.

    • asymco

      Your points are valid. Intel's ability to respond to a new basis of competition is the question. The company survived one disruption in the mid 90s away from high-end chips and it could, through a re-definition of priorities, survive the disruption of mobile computing.

      However, note that I did not claim that Intel is dead, I claimed the Intel microprocessor is dead. It's a very specific way to build and market computing that's been disrupted.

  • Hamranhansenhansen

As enjoyable as they were at the time, 2005's OS X on Intel and 2007's OS X on ARM just keep getting better and better as time goes on. They were a Wintel-killing 1-2 punch.

  • Bob Shaw

Great article and analysis, Horace. This just goes to show that core competencies become core rigidities and prevent a firm from adapting to a changing environment.

  • Omacvi

    Please keep in mind that people will still need computers and Mac Share is growing. Yes Intel will be hurt but far from killed.

    • asymco

      Correct. I merely suggested that the Intel microprocessor *as an integrated business model* is dying. It will become harder to move that business into becoming a mobile supplier and overall growth will suffer.

      It's not for a lack of trying: note the fate of Xscale, an ARM architecture that Intel owned but sold. http://en.wikipedia.org/wiki/Xscale#Sale_of_PXA_p

  • Foo Bar

    Sorry, but your thinking is more vague than subtle. BTW, it's better if *other* people say your arguments are subtle.

    * Comparing high-end Intel to ARM is apples to oranges.

    * Intel could certainly go modular. But they would have to dominate ARM to keep the profit margins they're used to.

    * Phone makers should know enough not to go single-source, especially with Intel. But if Intel offers high-performance and low price, will the phone makers be able to resist?

    • asymco

      Where did I say my argument is subtle?

    • Brenden

      I think you are confusing the person who posted under the name "Horace the Grump" above with Horace Dediu, who always posts as "asymco."

  • Nikolay Andreev

People in the comments seem to imply that Intel is untouchable in the PC market and that it will get to compete in the ARM market successfully, not the other way around.

Well, as soon as SSDs go mass market, a lot of manufacturers will start using low-powered SoCs in PCs/laptops too.

The new MacBook Air is proof that you can increase the overall responsiveness of a computer system with a lower-class CPU.

Intel isn’t catching up to ARM; ARM is getting into Intel territory. Specs of components are not what matters most; end-user responsiveness and power efficiency are.

    • Playos

It's a bit of a myth that HDDs are a "huge" bottleneck… if you're doing disk-intensive operations, then yes… but really, as the cloud increases in presence, the network stack, processor, and RAM come back as our issues.

    • dms

The MacBook Air still runs a desktop-class OS and still runs on Intel. If the future of computing is MacBook Airs, Intel would still be OK.

      The real question is how the iPad and other tablets will impact Intel. Already, you're seeing the Microsoft-Intel alliance coming apart a bit. Once tablets start to replace laptops and netbooks (atom-based) in significant volumes, Intel could be in trouble.

  • ddebug

So what? Intel has been cheating on Microsoft with Linux all along. Now MS cheats on Intel with ARM :) And this is not news – remember that Windows CE has run on ARM for many years already.

  • darth

    The logical thing to do for Intel is to buy ARM Holdings. Even at 10 times the price, before they kill them.

    • asymco

      It might seem logical to you and me but not to Intel. What does make sense to Intel management is to run as far away from ARM as possible.
      http://en.wikipedia.org/wiki/Xscale#Sale_of_PXA_p

      Asymmetric business models are sinister to incumbents.

      • J Osborne

        "What does make sense to Intel management is to run as far away from ARM as possible"

        Well that sale was a while ago, and the smartphone market was seen as pretty small. So what Intel thought in 2006 could be very different to what they think in 2011.

However Intel has a whole lot of good designers, and has produced a lot of good (but not commercially successful, except maybe for the i960) RISC CPUs over the years. If they decide that the mid-performance modular CPU market is a good place for them to make money, they may well not attempt another go at ARM. They may make the "grandson of the i960". Or they may just decide to make a modular design for the Atom, and pay the price for having instruction decoders that take more die area than many full ARM CPUs. Depends on where they aim their transistor budget.

        They could even take a middle road and not serve companies that want to make their own fully custom SOC, but just give out a menu of blocks to add onto the CPU (like PA Semi was attempting to do with the PPC before they got bought out). I don't think the middle road would have a lot of profits, but I could be wrong (if they have enough diversity in the menu, and the parts don't suck).

  • BertieBig

    This blog is my new favourite. It’s incredibly informative to read about the business models that are being built and challenged by the explosion of a new kind of mobile computing. As far as I’m aware, this blog takes a unique perspective. Thank you.

    The topic makes me think about what’s underpinned the Wintel monopoly for the past 15 years. Their monopolisation of profit has meant that almost no-one else was able to be innovative. Of those big players from the 80s who didn’t go to the wall, pretty-much all were sucked into the Wintel ecosystem (by fair means or foul). Apple and Oracle are the only big boys who spring to mind as independent survivors (and Sun, until recently). The new kids on the block (Amazon, eBay, Facebook) all seem to have been relatively starved of investment throughout the 2000s, when compared to their forebears, like HP and MS.

    Is there any chance of an article that explores levels of external investment over, say, the last 30 years in tech companies exc. Wintel, and compares these levels to Wintel profits over time? Maybe there would be some interesting insights to gain.

  • arrrgh_jimlad

Ultimately, customers decide Intel's fate.

A huge ecosystem. IOW, a bubble that will inevitably collapse if the demand for desktop PCs keeps on shrinking.

  • http://twitter.com/relentlessFocus @relentlessFocus

    I get it, Intel is closed, ARM is open… open always wins! oops….

Well, one difference between the Intel/ARM competition and the iOS vs. Android competition in terms of "open always wins" is that for Intel the closed model simply benefits Intel, and the customer gains nothing that Intel engineering doesn't bring, while with ARM being an "open" model the customer benefits. In the smartphone competition, Apple's closed model benefits the great mass of customers through ease of use, a tie to a whole ecosystem of products and services, and filtering for higher-quality apps, while Google's "open" model benefits Google; aside from some guys who read Gizmodo and Hacker News, the customer doesn't benefit from Google's approach. I guess it's not the model that matters, but who benefits.

    • asymco

Modular or integrated wins depending entirely on where the business has evolved on the dimension of serving its core customers. As integrated processors over-serve, the market will be inherited by modular processor builders. As modular PCs over-serve, the market will be inherited by integrated device vendors.

      • OpenMind

Then it is a pendulum. It swings from left to right, then back from right to left. If you catch it at the right time, you will have a hell of a ride. Those who are talented and lucky are the ones who benefit. Those who are stubborn will be left behind.

  • timnash

    Intel's major disadvantage with x86 is that its performance for power consumption is worse than ARM. So when Windows is available on ARM, a real danger for Intel is that much of the laptop market moves over to ARM and then progressively the server market (to reduce power consumption) and PCs (to reduce cost).

    As the market for x86 reduces, AMD is likely to go to the wall.

  • Christian Sciberras

    I must admit I stared at the title for like 5 seconds thinking, "what on Earth could kill Intel?!"

I'm no Intel fanboy… so don't get me wrong. BUT professing that Intel will die because they don't support the mobile business model sounds so much like a stab at some temporary fame with inaccurate comments.

    At one point in time, there might be more smartphones than computers, sure. Does that mean desktop computing will die? NO. I wouldn't give up my workstation for a million iPhones (unless I could buy me a new one selling them).

    So there. I doubt Intel would even be hit by this industry, let alone die because of it!

    • asymco

      The argument I'm making is not on the death of Intel but the death of the Intel microprocessor.

  • Dennis

    Have to agree that Intel are not yet finished. There is a market out there for extremely powerful CPUs and so far this is not a space in which ARM have shown any interest.
    Intel may take a bit of a beating over the next 5-10 years as the demand for cutting edge laptops dies down, but they may also see growth in markets that demand high throughput performance. There is always room for more power in a PC, and high powered applications are expanding not shrinking.
    ARM certainly will have the edge in low powered, modular applications but I doubt that it has the means (or the balls) to attempt to take Intel on in the enthusiast/intensive game just yet. Don't forget that Intel have been investing in improving the SOC offerings as well, with integrated graphics becoming a focus of theirs over the last few years.

    Basically Intel have the time, the resources, the engineers and the market to stop the rot.

    • chano

      But the PC market will wane. 80% of PC users will use a tablet instead and tablets work well enough and run far longer on low-power CPUs. ARM has blue skies ahead. More than good enough today with plenty of scope to ascend the power scales.

  • Sergei

    "The recently announced move by Microsoft to support the ARM architecture with their Windows product"

This is not true.
Windows has supported ARM since ~1990.

It was just recognized that Windows, including desktop and server,
must have a lightweight kernel supporting many HW platforms.

  • TechU

Like all initial theories that fail, you seem to be thinking of only one single approach instead of several in combination. Replace Intel with AMD and you may have a point, but not for the reasons you outline.

I'm not going to go into great detail other than links; that's for the reader to finally get out of the cave and connect all the dots, as it were. :D

I'll give you a hint: AMD does not have any product or skunk-works strategy to be a force against ARM (lest you forget, the number-one Goliath in the world's embedded market for many years now).

Roll on the massive white-box world-market vendors as they take up the ARM Cortex A9, and great… for both everything mobile (not just phones) and lower-power yet powerful and productive desktop use too. :D

Here's why you need to get out of the phone-only mindset, if you didn't already know about this… :D
It's not an either/or thing any more; it's many things used at the same time now.

Freescale actually introduced a few days ago what people want to buy THIS year: an ARM quad-core Cortex A9/NEON 128-bit SIMD powered mobile device with many hours of battery use AT FULL LOAD http://www.linuxfordevices.com/c/a/News/Freescale
    “The i.MX 6 series is Freescale’s first ARM-based multicore SoC and first Cortex-A9 model. The processor advances the i.MX family with dual-stream 1080p video playback at 60 frames per second (fps), 3D video playback at 50Mbps, desktop-quality gaming, augmented reality applications, and novel content creation capabilities, says Freescale.
    The SoC is also touted for being one of the first applications processors to offer hardware support for the open source VP8 codec.
    VP8 drives the related WebM (MKV) open container format, both of which are supported in the most recent Android 2.3 release….”
    “the SoC is claimed to enable 1080p video (single stream) with only 350mW consumption. As a result, the i.MX 6 series can deliver up to 24 hours of HD video playback and 30-plus days of device standby time, claims the company.”
    “All three i.MX 6 models are clocked to 1.2GHz, and offer the ARMv7 instruction set with Neon multimedia extensions, VFPvd16 (vector floating point graphics), and Trustzone support, says Freescale. While the single-core i.MX 6Solo offers 256KB of L2 cache, the dual and quad versions are each said to offer 1MB of L2 cache.”

There's also the key point that so-called 'reconfigurable computing' is back in vogue, with both http://www.staho.com/quad-core-to-ki…cessor/208… and http://www.gla.ac.uk/news/headline_183814_en.html
bringing some basic hints and updated advances since the Rapport Kilocore 256 and 1024 FPGAs appeared back in 2006 (always give credit where it's due).
The commercial Rapport Kilocore 256 and 1024 CPUs on a single FPGA (Field Programmable Gate Array) were seeing 30 frames a second while consuming only 100 milliwatts (true, only CIF I think, but still) in 2006, when ARM at the time was getting 3.3 a second while consuming half a watt of power.
http://arstechnica.com/old/content/2006/04/6556.a

    but moving to current progress you have http://www.electronicsweekly.com/Articles/2010/04

    and http://www.eetimes.com/electronics-news/4210937/I

There are NOW 28-nm FPGA parts AND even (BIG HINT :D) cheap 22-nm devices operated at a peak performance of 1.5 GHz, far greater than any other FPGAs, that you should be searching on…

and even in-socket FPGA accelerators/coprocessors that fit and sit in a spare CPU socket…

All in all, the world of lower-power devices is good for 2011, except when it comes to AMD…

    • dchu220

      Wow. Did your post cause AMD's CEO to abruptly resign today?

  • mark

    TechU: exactly my thoughts…

    one thing most of the comments and the article itself have ignored is the main looser in an ARM SoC world….

    is AMD…

    Ignoring the server world: AMD currently competes mostly in the lower-end desktop/laptop/netbook CPU market, not the high-end, high-margin enthusiast PC. If ARM SoCs expand beyond their existing markets (everything other than desktops and servers), then the main loser in sales will be AMD, not Intel.

    Also, to those of you suggesting Intel buy ARM: are you crazy? Do you really think the EU/UK (ARM is a UK-based company) and US regulators would be OK with that? While they're at it, they might as well purchase AMD/ATI and NVIDIA….

  • Tim

    Intel is not going away anytime soon. Their market share will drop, but there will always be demand for the fastest processors, especially as price points keep dropping. There will be two main markets: those who need fast laptops/desktops and those who only need smartphones or other integrated CPUs for what they do on a daily basis. This is where microcontrollers will also see steady growth, IMO.

    • chano

      Agreed, there will always be demand for zippy chips, but the numbers making those demands will decline. Slightly better than fast enough is good enough for 90% of people, and price is key of course. It's a rocky road for power computing.

  • Alan

    When Apple introduces the next MacBook Air running on A4 (or some follow-on) this will become apparent to everyone.

    The App Store, BTW, is the final piece needed to be able to do this: every app built using Xcode can be compiled to run on ARM rather than Intel with a setting in Xcode. The big things that Intel can do that ARM can't are virtual memory and, to some extent, multiprocessing (due to memory management limitations of the current ARM stack).

    The recent trend in Mac OS X toward full-screen view apps helps out this transition.

    At that point ARM will be a major force in phones, tablets, and laptops.

    • http://twitter.com/aegisdesign @aegisdesign

      The limitations you're quoting are in iOS not ARM.

      Mac OS X running on ARM need not have all the limitations of iOS. One would hope not, anyway, as full-screen apps like iOS's are an abomination on Mac OS X.

      • Alan

        Every successful operating system running on ARM does these things, in contrast to every successful operating system running on Intel; perhaps it has to do with the hardware?

        Mac OS X will inherit some of iOS's 'features' to enable it to run for 15+ hours in a lighter, cheaper package with lots of app support. MS is following; give 'em the usual 2 to 3 years.

        (The other big thing I forgot to mention was a backing store for the screen, so no overlapping windows. They could do something with some extra RAM for this.)

  • Pingback: イノベーション理論から見るIntelのビジネスモデルの問題 at バイオの買物.com の制作者の頭の中

  • newtonrj

    Can someone school me on a question about Intel's microprocessor's death today and how it applies, or doesn't, to AIM/PowerPC 20 years ago? Intel had the 486 class moving to Pentium, and PowerPC came online with Apple/IBM/Motorola. The initiative lasted about 4 solid years and trailed on for another 8-10 before all AIM incumbents exited or changed focus. Meanwhile, Intel moved Pentium through its lifecycle, introduced Celeron and now Atom. Microsoft also helped kill PowerPC.

    My interest is parallel to this article in that PowerPC was the Intel killer of the 90s. Intel continued to push CISC to prevail. Is there a similar argument here with Intel's demise at the hands of ARM/SoC? -RJ

    • http://twitter.com/aegisdesign @aegisdesign

      Apple really killed off PowerPC way before OS X or the Intel switch. By not licensing Mac OS 8 to the AIM partners or clone makers, and then failing to deliver Copland in a timely manner, PowerPC was doomed to being a niche CPU without a popular OS.

      It's no wonder Microsoft and IBM bailed on Windows for PPC and PPC-powered desktop hardware, respectively.

  • davel

    I don't think anyone killed the Intel Microprocessor.

    Intel has successfully defended its turf for years. They have a vibrant technology path and make lots of money being the center of the desktop and server market.

    As you point out, Intel has problems in the mobile space.

    Specifically, they are not good at power management. Apple has been pushing them to improve it in the x86 space, but they get shellacked in the mobile space, where ARM is the clear winner and has years of experience in this market.

    I am not sure your integrated vs. modular paradigm is apropos here. It is all about power draw.

    On a phone or a tablet you want battery life. The screen and the cpu are major drains here.

    • asymco

      Indeed Intel has successfully defended its turf for years. Mobile computing however is not its turf. It's not only power draw. Every major design incorporating an ARM architecture has been customized to a product. That's the new basis of competition: conformability of the chip to the device surrounding it. The quote from Dr. Hauser is pretty clear on this point. It's Intel vs. every other semiconductor fab in the world making hundreds of variants of ARM chips.

      • davel

        Not sure I agree here with regard to customization of ARM. There are a handful of ARM designers out there that build off-the-shelf chips for use by manufacturers. The Hummingbird or Snapdragon implementations come to mind.

        HTC/Moto/et al design products using these chipsets as the brains.

  • pierrelefou

    Most 'death of' discussions get it wrong; generally, such discussions claim that a major player in business X is bound to go out of business because business Y exists now.

    Nope.

    Separate businesses. Different models. Different objectives. Different customers. As the article makes clear, selling the ability to customize silicon isn't the same as selling silicon.

    Saying Intel dies because SoCs exist is like saying airplane manufacturers are doomed because bicycles have been invented.

    What may be true is that Intel has (unless they change how they want to make money) a limited chance at success in the cellphone market. Or most other embedded markets. However, (i) this isn't new and (ii) there's MUCH more money to be made selling x86's to Dell et al than there is selling ARM licenses to all and sundry.

    And (no, I don't have the figures to hand) I very much doubt that if Intel had all of ARM's business that their bottom line would look much different. That's why Intel has been very careful about entering any form of embedded.

    • asymco

      Perhaps I should define death a bit more precisely. Death, in business terms, is defined as an end to growth. The end to growth means the end of share price appreciation and the end of attraction of talent and the end of innovation. Life goes on but it's not a wealth building enterprise.

      For other perspectives on predicting company failures I suggest reading the following two parables: http://www.asymco.com/2010/09/16/the-parable-of-t and http://www.asymco.com/2010/08/09/the-parable-of-t

      • simon

        "Death, in business terms, is defined as an end to growth. The end to growth means the end of share price appreciation and the end of attraction of talent and the end of innovation"

        I think that's the point you should've made clearer in your writing. People who concentrate more on the technological side often tend to forget that important notion of "growth" in business analysis.

  • JohnM

    Yeah, RISC was supposed to beat out x86, but nothing came of it!

    • chano

      And Apple was supposed to die in the mid 90s but turned out to be the No. 2 company in the world by market cap just 14 years later.
      It's not about the death of Intel; it's about a sea change in computing. The desktop is destined to become a minority-interest phenomenon, say 10% of the market or less. This is already happening. Laptops too will decline in a decade or so. Mobile is the ultimate personal computing paradigm because it is ultra-portable and more than good enough for 90% of the needs of 90% of future users. A future tablet could easily dock into a powerful distributed-processing base station for heavy number crunching but, even then, only for those few who need the added processing power.

    • J Osborne

      When RISC first came to market and everyone made noises about it killing all the CISC CPUs (which it mostly did: it killed off 11 or so CISC CPU families, leaving pretty much only the x86… unless you look at embedded markets, where CISC has still done well, or mainframes, which I'm not so clear on)…

      Let me start again.

      When RISC first came to market and everyone made noises about it killing all the CISC CPUs, one of the biggest advantages it had was a low transistor count. So low that the instruction decoder and microcode on an x86 took more transistors than some of the early RISC CPUs (like the Fujitsu gate-array version of the SPARC). There are lots of ways that is an advantage. You can use older technology (gate arrays), or get more CPUs per die on modern technology, making the manufacturing cost lower. On superscalar CPUs the decoder is one of the parts you need to duplicate, and so on.

      Moore's law has increased the transistor count by about 60% a year (and for all that people say Moore's law is dead, the transistor count HAS kept pace with it). That has made the cost of having a complicated instruction decoder less and less painful as time goes on. So what was once severe pain for the x86 is a cost it can now pay several times over without worrying too much.
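As a back-of-the-envelope check on how that rate compounds, here is a sketch using the 60%/year figure from the comment:

```python
# Compounding of ~60%/year transistor-count growth.
def transistor_multiplier(years: float, annual_rate: float = 0.60) -> float:
    """Multiplier on the transistor budget after `years` of compound growth."""
    return (1.0 + annual_rate) ** years

print(f"1.5 years: {transistor_multiplier(1.5):.2f}x")  # ~2x: a doubling every ~18 months
print(f"10 years:  {transistor_multiplier(10):.0f}x")   # ~110x over a decade
```

A decoder that cost a painful share of an early-80s transistor budget is a rounding error two decades later.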

      Likewise, things like RISC's comparatively large register sets were a great boon for avoiding pipeline hazards in deeply pipelined systems, while the x86 had to reinvent register renaming (borrowed from the mainframe world) and pay a high cost in complexity. However, as time passed, the "large register sets" were not large enough to hide deeper and deeper pipelines, so RISCs also had to reinvent register renaming and pay the same sort of transistor budget that the x86 did.

      In effect, the steady march of transistor budgets has made most of the RISC advantages less and less useful over the years. At the same time the x86 has retained one significant advantage: it is bought in huge numbers, so whatever non-recurring engineering cost is present gets smeared over a huge number of chips. So while Intel spent more on x86 R&D than DEC did on Alpha R&D, the per-chip R&D cost was likely about $2 each, while on the Alpha it was more like $1,000 each. That is sustainable on $5,000 to $50,000 systems, but obviously not on $100 to $1,000 systems.

      A pity, because I find the x86 pretty painful to use, but that's life.

  • anon

    Personally I welcome our new ARM overlords.
    It's been way too long since assembly programming last was FUN!

  • http://www.notesark.com iphoned

    Isn't this as simple as Intel just lacking its x86 instruction set monopoly in the mobile space?

    So now they need to compete without the x86-instruction-set monopoly, while being behind ARM on low-power technology?

    (Surely the "modularity" is secondary, as Intel could easily supply a modular system if that's what the market demanded? And I am sure that if they had the same x86-legacy-compatibility monopoly in mobile they would come out on top. But mobile seems to have broken free of the x86 legacy yoke.)

  • Ned

    As a consumer I used to update my computer every 2 years or so. Now I update the computing device I use most every 2 years: my phone. My computer is now 4 years old and still doing everything I need it to do. If that is a general trend, it has got to hurt Intel chip sales in established markets.

  • g0atm1lk

    I thought Intel had a 22 nm process coming up soon. Why couldn't they integrate graphics (oh, they are…), Bluetooth, GPS, and whatnot on the same chip and sell it as an SoC to compete?

    The buyers would only be able to use the functionality they bought.

    What could a totally integrated SoC have if it were made today?

    CPU, ram, graphics, bluetooth, WiFi, GSM, CDMA, IR, sound, accelerometer, gyroscope, and GPS.

    Dang, all you need then is a system board with a few wires going from the various I/O ports to the outside world and you're done. LOL, Intel could even colour-code the wires like computer case and motherboard manufacturers do for us. Intel customers could hit the market faster with less to worry about on the system board and would need fewer workers to design them.

    All of the tech you see is being manufactured at commodity levels now, and we pay a lot for the wow factor.

    I think the design process is still mostly backwards. Now that the HW is a commodity, companies should start designing software and then retrofit the hardware to the software. It's like the man said: Intel is too much for most computing applications.

    • g0atm1lk

      Which gives me an interesting idea: if Intel were ever to make SoCs, then given their abundance in a typical person's house (say in 2 or 3 smartphones, a desktop, a TV, a stereo, a refrigerator (weird, I know), a car, a home security system, and so on), what if these SoCs could link up and contribute their cycles when someone was using the desktop or doing some really complex computing task?

      • http://parallellogic.deviantart.com/ Parallel Logic

        That’s an interesting idea.  I think the two problems you would run into there would be bandwidth and speed.  You’re limited by the bandwidth available in your local network, and all the nuances therein (varying data rates, dropped signal, etc).  In terms of speed – if talking to main memory is like talking with Mars, then talking with your TV is probably most akin to talking to the next solar system over in terms of clock cycles spent waiting for the data to get processed and returned.

        Still, that’s an interesting idea, those limitations are really only troublesome if you;re going for real-time processing, but if you’re doing heavy simulations that will take a long time, might as well utilize the family TV for what it’s worth.  The processors for each are no doubt specialized for the tasks they perform, so that would be another complexity to consider

  • g0atm1lk

    Someone who knows more about CPU architecture than I do, please tell me: what is the battery problem?

    Is it because CISC chips still use more power than RISC chips? I guess that makes sense. Does the 22 nm process reduce power usage? I thought I read once that it does.

    • J Osborne

      Transistors use power. They use power when they switch, and the kind normally used in CPUs also leak power all the time. Some of that can be combated by downclocking when not in heavy use, and some by shutting down parts of the CPU when not in use (if the design can turn them back on quickly). A design that can do the job with fewer transistors and/or at a lower clock speed will use less power (all other things being equal at least).

      Changing the feature size (to 22 nm) tends to reduce switching power, but I think it also frequently increases current leakage. I'm not an EE, so I'm not sure if there are physical laws that cause this, or if it is just the way things have been going, and maybe the next turn of the crank can reduce leakage instead of bumping it up.

      CISC chips lose out here because they have to dedicate transistors to more complex decode logic and more complex pipelining issues. There are other things that CISCs spend more on at earlier points in the performance curve than RISCs, but if you push performance hard enough they tend to equalize (for example register renaming: CISCs tend to need it long before RISCs do… but push hard enough and both need it).

      On the other hand, CISCs tend to have higher instruction density, and that frequently gives them some other advantages (less RAM/flash/whatnot to power; better use of I-cache space, so a smaller I-cache on a CISC can work as well as a large I-cache on a RISC). Unfortunately for this contest, ARM has an instruction mode (Thumb) that gets nearly the same instruction density as most CISCs (it does cost the ARM something, but the cost isn't too bad). So most of that advantage is moot.
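The switching-power point above follows the standard dynamic-power relation P = α · C · V² · f. A sketch with purely illustrative numbers (not measurements of any real chip):

```python
# Dynamic (switching) power: activity factor * capacitance * voltage^2 * frequency.
# Leakage is a separate, additive term not modeled here.
def dynamic_power(activity: float, cap_farads: float, volts: float, freq_hz: float) -> float:
    """Switching power in watts for the given illustrative parameters."""
    return activity * cap_farads * volts ** 2 * freq_hz

base = dynamic_power(0.2, 1e-9, 1.2, 1.0e9)   # 0.288 W at full clock
half = dynamic_power(0.2, 1e-9, 1.2, 0.5e9)   # halving f halves switching power
lowv = dynamic_power(0.2, 1e-9, 0.9, 0.5e9)   # lowering V helps quadratically

print(base, half, lowv)
```

This is why downclocking and voltage scaling are the first levers a mobile design pulls, and why a design that needs fewer switching transistors in the first place starts ahead.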

  • g0atm1lk

    OK, now I think it's clearer to me.

    So CISC is only really power hungry relative to RISC if all those extra instructions in CISC aren't doing unique work; that is, many of the CISC instructions are doing work that would take a half dozen RISC instructions to do.

    So if you have an app that needs x amount of y-type work done, then it is the power used to do that work that should be the basis for comparing these RISC and CISC processors.

    There must be some standard test someone wrote somewhere that can be normalized to show whether ARM or Atom is getting more work done for x amount of power. We can bet Intel and ARM both have ways of making such measurements.
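Such a comparison would normalize benchmark throughput by power draw, giving work per joule. A sketch of the bookkeeping, with made-up numbers (not real ARM or Atom measurements):

```python
# Normalize benchmark throughput by power draw to get work per joule.
def work_per_joule(ops_per_sec: float, watts: float) -> float:
    """Benchmark operations completed per joule of energy consumed."""
    return ops_per_sec / watts

# Hypothetical figures for illustration only.
scores = {
    "hypothetical ARM SoC":     work_per_joule(2.0e9, 0.5),  # 4.0e9 ops/J
    "hypothetical Atom-class":  work_per_joule(6.0e9, 4.0),  # 1.5e9 ops/J
}

for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {score:.1e} ops/J")
```

On this metric a slower chip can win outright: raw throughput is irrelevant if the battery budget is fixed.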

  • fieldforce

    Wow, great thread here. As someone who supplies both ARM and Intel with technology, allow me to add some insights.

    I wouldn't suggest, as some have, that this boils down to a CISC vs. RISC debate. If you've looked at the ARM Cortex microarchitecture, you'll realize that such superscalar, out-of-order designs are as complex as classic CISC, and you could argue that the Intel Atom core is less complex than the ARM Cortex-A9. Then you'd also have to look at the respective bus fabrics, and ask whether a typical ARM AXI bus fabric is less complex and/or power hungry than, say, the IOSF/DMI bus fabric on the Atom. Probably not; it's a wash.

    Likewise, as was suggested initially, does the IP-based business model of ARM somehow make its solutions superior to Intel's because of the increased competition that can occur between Intel and the "rest of the semiconductor industry"? Well, Intel will claim that they'll be able to compete with anyone on process improvements, and to an extent that's true. The ARM-based solutions from Qualcomm and Marvell rely on TSMC's fabs, and it's a race to see first who gets priority on the latest TSMC process node and then whether TSMC can move faster than Intel. Similar story with the proprietary fabs at TI or Global Foundries. I wouldn't count Intel out, in other words, just because of their business model.

    But then let's turn towards the actual deployment of devices, and the ecosystem around handsets and mobile software. Here the difference is really black and white. ARM devices are by far the dominant software target in the mobile space, and have been for years. The tide of new mobile software has, if anything, increased the lead ARM-based designs have over Intel (IA32) architecture. And despite Intel's purchase of Wind River, and their subsequent Android focus, I don't think you'll see much of a dent in the number of Android devices on IA32, let alone RIM or iPhone. I guess the jury's still out on Nokia.

    So, in conclusion, who killed the Intel microprocessor? Mobile software stacks did.

    And unless IA32 and the next-gen Intel SoCs can offer something compelling to an Android, BlackBerry or iOS stack, I don't think we'll shift away from ARM-based designs in mobile anytime soon.

  • Pingback: Linkbait 3 | Programming Blog

  • http://www.valleyside.de Peter

    Very nice visualization… Hope on.

  • Pingback: Linkbait 3 « Web Development

  • Pingback: Mercado de microprocessadores » Márcio Francisco Dutra e Campos -

  • Pingback: In Depth: How smartphones and tablets are taking over | Nyheter

  • Pingback: In Depth: How smartphones and tablets are taking over – GhazaliRidzwan.com

  • Pingback: I Djup: Hur smartphones och tabletter tar över | Nyheter

  • Pingback: Quel que soit le secteur, David gagne de plus en plus souvent contre Goliath : Entreprise Globale

  • Pingback: Entreprise Globale » Blog Archive » Quel que soit le secteur, David gagne de plus en plus souvent contre Goliath

  • http://www.facebook.com/people/Tom-Baker/100000085924677 Tom Baker

    Intel famously refused to bid on Sega's Saturn CPU needs; they felt the margins weren't big enough to bother, and they called the Saturn CPU "just jelly beans." Intel claimed the sub-$1,000 PC market was not a trend. Intel has made blunder after blunder but stayed on top because, manufacturing-wise, they had the horsepower to pump out the CPU designs that were locked into Windows, and then even Apple. There were some exceptions when AMD led, but for the most part, from the Core Duo on, Intel never gave up first place for notebooks and PCs.

    Why did I mention Sega? Because it was about then that the GPU was taking off. 3dfx. Intel made half-hearted attempts with GPUs but left it to ATI and Nvidia to take over that processing. Nvidia is now big in the mobile market.

    Now the smartphone, and the smartphone writ large (the tablet), can cover 80% of what consumers need. The missing pieces are the keyboard and precision mousing for larger screens.

    Microsoft is behind the curve, but will seek to solve this with Windows 8, which will run on very thin convertible tablet PCs. Not the old-style ThinkPad X tablets, but more of an iPad with a keyboard that can be swung around.

    Microsoft has followed Apple before and done well, but only because of price point. 

    So, to wind it down fast now: Intel is toast because they did not nurture the markets that became the tablet. They gave away the fight for the GPU. They ignored new platforms like Xbox and PlayStation. All recoverable mistakes until… Microsoft went Windows 8 on ARM.

    It's a done deal. Intel is toast. Today it is as clear as it was a year to 1.5 years ago, when Android was spawning and the outcome for RIMM was becoming clear.

    There are no functions, no needs, that require INTEL INSIDE. Heck, I can't even easily tell how fast these CPUs are with the i3, i5 and i7, and the new i7 Core 2, whatever that is. It's all meaningless.

    Windows 8 on ARM is what craters Intel. No, they won't vanish. The Atom will improve. But I can design a motherboard with lots of quad cores to run Windows 8, and lots of jelly-bean CPUs will be transparent under Windows 8.

    Intel should have fought hard to keep all the CPU design wins it could get. All of them. And they forgot to buy ARM when they were rich with cash, around 1998.

    Intel is the new RIMM. Who killed the Intel CPU? Windows 8 on ARM. What caused it to go to a non-Intel CPU? Intel, because they forgot to plant new seeds; they stood on Moore's law as if the CPU were a single train track. Intel forgot to be inside phones, and by extension tablets; then tablets became the PC, and Windows went to ARM.

    Intel is the new Blackberry.