x86's Days as a Consumer Microarchitecture are Numbered

The big news today is that AMD is ending the x86 wars! This is huge, but the writing has been on the wall for a while, and it's further confirmation of a trend I've been thinking about lately. The future of the CPU is a battle between the ARM vendors (TI/Nvidia/Qualcomm/Samsung/AMD/Apple) and Intel. Intel knows this and is scared. AMD hasn't been a real threat since Intel debuted the Core architecture 5 years ago.

The beauty of the ARM ecosystem is that the vendors are forced to compete with each other on price, performance, and advanced features. ARM CPUs have been advancing at a rate we haven't seen in the desktop space for 10 years. If Intel continues to fail to produce power efficient mobile processors and ARM continues its march towards laptop performance at smartphone battery consumption, x86's days as a consumer processor architecture are numbered.

It's really not hard to fathom. Few will be surprised if OS X goes ARM next year or soon after. Apple likes to control the entire computer stack, and the A5 line of processors proves Apple can execute on fantastic CPU design. A MacBook Air with an A6 would have double the battery life of today's model.

We've already seen hybrid smartphone/laptops from Motorola (the Atrix laptop dock) and hybrid tablets (Asus Eee Pad Transformer). These products may be slightly premature given the speed of typical ARM processors, but if things continue as they have for the past 4 years, laptops powered by speedy smartphones or tablets will be commonplace in a few years.

But, perhaps the most salient point: Windows 8 supports ARM and Microsoft will be pushing this hard. The days of "Wintel" are over (perhaps ARMDroid will be more appropriate going forward). Long-term, Microsoft will likely unify Windows Phone and Windows 8 around ARM.

The question now is, can Intel hold onto the enterprise and server market, or will it be forced to become an ARM licensee like everybody else?
This seems a bit premature. I expect the status quo to continue for the next five years at least, with mobile computing being ARM while Windows and OS X stay mostly x86. Porting software is a huge headache... Apple went through hell to get everyone on x86 five years ago, and the cost of switching has only risen as the third-party software ecosystem grows...
Apple transitioned to x86 very quickly, and five years from now the laptop as we know it won't exist, except as an old, stodgy "work" computer. Laptops and desktops will be the pickup trucks of computers. Remember, 5 years ago the iPhone didn't even exist.
So, people keep saying stuff like that, and pointing to e.g. the iPad as the replacement for computers today. Well, I'd bet good money that 9 in 10 iPad owners have a real computer too and like it that way. The people that can get by with just a tablet are the people who aren't tech-savvy... and the percentage of the population that's computer illiterate is going to keep dropping in the future. A tablet is a perfectly good complement to a laptop or desktop, but it's only a viable replacement for a small and shrinking demographic.
Wrong. Operating systems like iOS and Android will eventually replace most computing tasks for most users. How can a $1000 MacBook Air compete with an Eee Pad Transformer Prime that costs $500 and has 18 hours of battery life with a full QWERTY keyboard?

Not to mention iOS and Android are both a huge improvement on the leaky abstractions of OS X and Windows 7. I'd be much, much more comfortable with my mother using iOS for all her computing tasks. No viruses, no registry, no firewalls, no BS.
The sandboxed app model and the form factor of computers aren't necessarily related. (See sandboxed apps à la Apple's App Store, and WinRT apps in Windows 8.)
Here's the thing. Steve Jobs thinks power users are a small fraction of the market. I think they're a large and growing fraction, especially when "power user" means "wants to run Word". I don't dispute that mobile OSes will continue to grow, but I don't think the timeframe you present is realistic.
The vast majority of users will be able to use a tablet by itself for most of their computing needs. If you're going to tell me that some hardcore gamer with SLI'd video cards and a 24 inch widescreen is going to do all their gaming on a 10 inch tablet with a single dinky graphics chip, I'd tell you you're smoking something. Same goes for programming. I write software all day and I'm sure as hell never giving up my dual monitor setup to do it. If you want to go from dual 22" widescreen monitors to a single 10 inch, be my guest. I don't see tablets catching up to those use cases in the near future.
Amr Ali
I don't foresee the change from x86 to ARM being an issue, as nowadays more high-level languages are being used, and even low-level code that uses platform-specific features is moving to a more abstracted methodology where platform independence happens at a much lower level.
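
For instance, a minimal sketch in C (using the usual GCC/Clang and MSVC predefined macros); the same source just recompiles for either target:

```c
#include <stdio.h>

/* The same portable C source builds unchanged for either ISA;
 * only the compiler's target changes. */
int main(void) {
#if defined(__x86_64__) || defined(__i386__) || defined(_M_X64) || defined(_M_IX86)
    puts("compiled for x86");
#elif defined(__arm__) || defined(__aarch64__) || defined(_M_ARM)
    puts("compiled for ARM");
#else
    puts("compiled for some other architecture");
#endif
    return 0;
}
```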

ARM is fantastic: it's RISC, it's very power efficient, and most infrastructure networking hardware already uses ARM. As for the PC/desktop scene, the fine line that distinguished mobile implementations of a PC from a desktop has disappeared. Sure, full-size desktop boards beat the laptop form factor in expansion and capacity, but that argument can now be confined to the server room.

Mobile computers are more powerful than ever, cloud computing is growing at a scary rate, and if you need a supercomputing experience you can now have an NVIDIA card with close to 1024 parallel-computing cores at your disposal (yes, inside a laptop). Computing is no longer centered around software needs; it's all about usability.

And to be honest, I don't see mobile computers with at least a 15" monitor and a full QWERTY keyboard disappearing within the next 7-10 years, simply because the whole computer user experience is still largely centered around the keyboard, and that's not necessarily a bad thing. Unless someone figures out a method of input that provides at least the control, granularity, and accuracy of a keyboard, the traditional form and shape of a computer won't go away just because the processor market changes. You'll certainly see very powerful and very mobile computers, but they will keep the same form and shape. Tablets and the like are much-needed tools, but they were never meant to replace your laptop or desktop; they just facilitate another type of computer usage that fits particular real-life situations.

In a nutshell, if you are dreaming of one computer solving all the problems, then you should be looking deeply at nanotechnology and at other forms and methods of computer input. Until a computer can sit under your skin and you can read off of eye lenses (which is under research; the battery is the main problem right now), you'll always need several forms and shapes of computers that address different usage situations.
+Amr Ali I would hardly consider ARM to be RISC nowadays, with parallel instruction sets (Thumb). Also, people have been making the higher-level-language argument for 20+ years now; it hasn't happened yet.

While I don't disagree that ARM is going to end up ruling the world next, it will be painful without a 64-bit address space. As these devices grow up they will have more memory (some of them are shipping with 1 GB of RAM already), and I for one don't want to go back to worrying about manually paging things out.
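
To see that ceiling concretely, a minimal standard-C check (nothing vendor-specific assumed):

```c
#include <limits.h>
#include <stdio.h>

int main(void) {
    /* On a 32-bit ARM build this prints 32: a process can address at
     * most 4 GiB, so larger working sets must be windowed or paged by
     * hand. A 64-bit ISA lifts that ceiling. */
    printf("pointer width: %zu bits\n", sizeof(void *) * CHAR_BIT);
    return 0;
}
```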
Spot-on analysis. On the client side, enterprises are already provisioning thin clients, and there is little reason why these couldn't be ARM-based. On the server side, lower power consumption will drive ARM adoption, starting with non-Windows application servers.

Any app built on WinRT for Windows 8 will be the equivalent of Apple's universal binaries, running on both ARM and x86. In due time, consumers could be pushed off x86 altogether.

I've grown up on WinTel for so long that I couldn't see a world without it. Thanks for the insightful post.
I agree with +Derek Thurn; I don't think tablets or mobile devices are going to replace the desktop. But the computer as we know it is increasingly going to be relegated to a secondary role, complementing those devices where specialized horsepower or interfaces are required.

For most users, a fast, accessible, and highly-available portable computer fulfills most of their needs. The demand for traditional desktop/laptop machines won't go away anytime soon, but that demand won't be growing nearly at the rate of the mobile segment -- if at all. They'll be occupying an ever-decreasing share of the overall market.
At the end of the day 'computer' growth will come from India, Asia, Africa and the Middle East - and those areas are seeing explosive mobile device growth. ARM will rule supreme in the future.
Did someone just say the Asus Transformer Prime costs 500 USD with the keyboard dock? Where do I sign up?!
I talk to a lot of end users and consumers, and I have to say that many of them have already ditched the desktop for the laptop, and that many more will forgo having a general purpose computer entirely in favor of mobile devices in the future. The trend towards mobile as the default, with many people choosing smartphones as an exclusive platform puts ARM in a great position. Maybe once the desktop is dead, we can have the year of desktop Linux?
"But, perhaps the most salient point: Windows 8 supports ARM and Microsoft will be pushing this hard. The days of "Wintel" are over. Long-term, Microsoft will likely unify Windows Phone and Windows 8 around ARM."

At face value, I really like this general idea. ARM-based processors and low-power GPUs are getting up there in performance, and who doesn't want a small, quiet, powerful-enough box to get 99% of the jobs done? The big, glaring problem, though, is that Microsoft won't let developers create ARM-based desktop applications for Windows 8. With ARM and Win8, it's Metro or nothing at all, which pretty much ruins that fairy tale.
+James Rodrigues Concerning the ARM/WinTel subject: in brief it looks reasonable - ARM = low power & the future, x86 = power drain & old school. But I fear it's much more complicated. Specs won't win - that's what I've learned from the iPad and Kindle Fire (both platforms are rubbish from a technical, spec-oriented view, but customers are buying those gadgets because of the content).

But let's step away from that and look back at the technology. If you buy a WinTel machine today, you will be able to run several versions of Windows, maybe a Hackintosh, different flavors of Linux, and so on. On the Windows side, there are drivers for nearly any device. Is it the same on ARM-based hardware? In my opinion, no. If you buy an ARM gadget, you depend on the vendor and its software update policy. And I guess we will not be able to use today's ARM slates as Windows 8 testing platforms.

Overall: we will see - but if Intel manages to squeeze down the power consumption of its upcoming Ivy Bridge processors and SoC solutions, I'm sure there will be a lot of new options on the WinTel field.
Intel already is an ARM licensee. They even used to produce the XScale (ARMv5TE). I think we shouldn't worry about Chipzilla.
I think you're getting a bit ahead of yourself. What do you do for a living? If it ain't broke, don't fix it, and the Intel x86 processors are very good at what they do. The only reason AMD is changing course is that its latest architecture failed miserably, so it has to go back to the drawing board and come up with a product in the meantime. As for your iPhone comment: WTF? Have you ever heard of Palm Pilots or the Newton? The idea has been around for years; the iPhone didn't do anything new, it just made it 'affordable' for most people. The reason Windows 8 supports ARM is that Microsoft has spent years setting up for this architecturally with the way .NET compiles. It's like Java in that you can have many processor targets; they just hadn't been implemented.

Don't get me wrong, I am a big ARM fan and love RISC processors, but you are blowing things out of proportion. Also, this cloud thing really isn't going to replace desktop computers, because I need a crap-ton of CPU power at home and can't simply use, or want to get into the financial decision of buying, CPU time and obscene bandwidth. The cloud has existed before in other forms, most notably the mainframe concept. It has strengths and weaknesses; it does not solve every problem, and it creates a whole new set. The current 'cloud solves all' notion is a total fad.
Amr Ali
+Alan Zeino I'd like to clarify: if you mean "computer" growth as in innovation, then I doubt it will come from Africa or the Middle East (except for Israel and countries on a similar scale) any time soon. If you are talking about consumer growth, then of course Africa and the Middle East are very big consumer-centered markets and have been for decades. India is getting there in terms of "wholesale" software, and that's because top talent gets drained and shipped off to countries like the U.S.

Asia has been a pool of very particular advanced talent for some time now, but countries' politics and structures prevent good utilization of such talent, which results in the same situation as in India, though not on the same level.

The MENA region has many top talents, but the countries' poor infrastructure and whatnot literally pushes such talent off to other, more developed countries.
It was trivial for ARM CPUs to advance quickly. They were very small chips, and it's very easy to add features that already exist elsewhere. It's no innovation. When they reach the complexity of IA-32e CPUs, they will share their drawbacks.

There's nothing special about ISAs that makes CPUs better or worse. ARM's strong points come from the fact that ARM chips are much smaller than most IA-32e CPUs. If you add complexity for performance, you lose the strong points.
"How can a $1000 Macbook Air compete with an EeePad Transformer Prime that costs $500 and has 18 battery life with a full querty keyboard?"

Simple: make a MacBook Air that costs $500 and has an 18-hour battery. If they switched to ARM, they would get there pretty fast. And you can bet your ass that Apple is maintaining secret ARM builds of OS X.

Once x86 is gone, we'll forget about the PC vs post-PC dichotomy. But we'll still have computers with keyboards on our desks, tablets on our coffee tables, "phones" in our pockets, and perhaps some TV-like computer as well. Each of these form factors serves an important purpose that is not going away.

The big battle will be over the OS. We will need something that runs on all of these devices, offers a great experience to all types of users, and has an open ecosystem. Android is in a great position here because it has the potential to be all of these things. Apple seems committed to the two-tier approach: one open platform and one curated. This is working well for now, but it may become too complicated down the road when users expect more integration of their devices. I suspect they will continue to recreate the iOS experience on OS X and eventually use it for everything.
Jon, ARM is the more powerful architecture in terms of computation / power, which is what counts in the long run.
However, computation/power does NOT scale linearly. You are comparing apples and pears. Moreover, the whole cloud thing is wrong... the way the cloud will go is users being the cloud, letting their processing power be used by commercial services (e.g. YaCy, Bitcoin, SETI@home). The data center approach is fine for some applications but definitely not for development work: my current build data is over a gig and takes 8 minutes to compile locally. Have you any idea how much dead time the cloud would add to that, and how much that time would cost the client? (I'll give you a clue: the build server takes 40 minutes by the time it generates all the targets.) As for the OS battle, Microsoft kind of settled that one back in 2002 when they designed .NET in the first place (not that I think they use .NET to write an OS, but you can bet the compilers they use for the OS use the same tricks).
ARM can't be powerful or anything; it's just an interface. The microarchitectures used in ARM chips offer a better performance/power ratio now, but that ratio can only get progressively worse as they approach the complexity found in today's IA-32e chips.
I don't know much about the ARM architecture, but I have to imagine that it scales considerably better than one that is ~30 years old and designed to run at a couple of MHz.

EDIT: Apparently ARM is 25 years old and x86 is 35. Regardless, the mere feasibility of changing architectures strikes me as a death sentence for x86. If ARM doesn't supplant it, something else will.
+Jedediah Smith IA-32e is a piece of paper (well, about 5k pages long). ARM is again only a piece of paper. It would be relatively easy to implement ARM on the Sandy Bridge microarchitecture just as people implemented IA-32 on chips vastly simpler than current ARM chips.

There's nothing special about pieces of paper; it's the microarchitecture that gives the general performance and power characteristics. The microarchitectures used today in IA-32e CPUs are not 30 years old; they are a couple of years old and have had more resources invested in decreasing power than ARM chips have.
+Jon Rabone SIMD doesn't matter at all except for niche applications. 99.5% of users are happy with just a web browser. Any CPU-intensive application today is a niche app, used by professionals.

The trend is to push ARM for the masses (even more so than it already is), not to replace CPUs used by professionals.
Not yet. The assertion is that x86's days are numbered, not that they are over. Presumably ARM will bolt on extensions in the same way x86 has over the years. Or, we'll just use the GPU for that stuff.
Also, one of the ideas behind ARM processors is that you can clock them far faster, but as Intel found, there are physical limits on clock speed, which pushes an instruction set to do more and more complex things per cycle (I'll probably be corrected from all directions on that one, but the gist is true). The point of RISC is that it doesn't do this. When I was really into the RISC debate 13 years ago, everyone was having the same discussions; it's just that the mobile world has taken off, so it's coming back to light now that there is a very public application. Does ARM kick ass? Certainly, but it's a tool for a specific kind of job. The point is you should get to work on public transport rather than in a Bentley. As soon as they bolt extensions onto RISC, it's no longer RISC and will lose the characteristics that currently make it more efficient, at which point you have to give the knowledge pool of Intel and co. a very large nod.
So it seems that ARM is more power efficient, but could anybody here explain why this is? I do not understand much about processor architectures.
I'll answer that, only to get shot down and learn something new ;) ARM chips do the minimum possible so they can focus on doing that highly efficiently. Intel processors, on the other hand, can do a lot more per clock cycle, which requires more energy; clock them at the same speed and the ARM will use less power. But that is not a fair comparison. The real comparison is power usage at a certain data/instruction flow rate, which varies with the computation. On an operation that is hard-wired into the Intel processors, I'd expect them to have a better chance of optimizing than a RISC processor relying on a compiler for that optimization; a compiler doesn't know what the processor will be doing when the code runs and so cannot optimize at that point. So if you compared both architectures at adding lists of numbers, I'd expect ARM/RISC to destroy Intel; on some of the things Intel has hard-wired, you'd expect the opposite. The dream of RISC is to optimize to the point where this fails, not to add the extra operations. It's like comparing a knife to a Swiss Army knife: sure, you can unscrew things with a knife, but why not just use the screwdriver on the Swiss Army knife? Better yet, why not use a dedicated screwdriver? (Anyway, my build has finished now, so back to work.)
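
To put rough numbers on the "more per clock cycle costs more energy" point, the usual first-order model for CMOS dynamic power is P ~ a * C * V^2 * f (activity factor, switched capacitance, voltage squared, clock). A sketch with purely made-up, illustrative figures, not real chip specs:

```c
#include <stdio.h>

/* First-order CMOS dynamic power: P ~ a * C * V^2 * f. */
static double dyn_power(double activity, double cap_farads,
                        double volts, double clock_hz) {
    return activity * cap_farads * volts * volts * clock_hz;
}

int main(void) {
    /* Hypothetical small ARM-like core: less switched capacitance,
     * lower voltage, lower clock. */
    double small_core = dyn_power(0.1, 1e-9, 1.0, 1e9);
    /* Hypothetical big x86-like core: more transistors switching,
     * higher voltage and clock. */
    double big_core = dyn_power(0.1, 5e-9, 1.2, 3e9);
    printf("small core: %.2f W, big core: %.2f W\n", small_core, big_core);
    return 0;
}
```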
Interesting. And what about all the sophisticated features that an Intel x86 CPU offers? I'm thinking of branch prediction, prefetching, out-of-order execution, and pipelining. I wonder how an ARM CPU compares.
+Nils Brünggel Simplifying to a great extent: both employ about the same power-saving technology, but ARM chips simply have far fewer transistors than IA-32e chips. Increasing the transistor count to add more cache, or the various units that make a chip perform better computationally, would mean an increase in power consumption.

You'd reach the power level of current IA-32e chips before reaching their performance, because IA-32e chips employ slightly better power-saving technology and a microarchitecture that gives significantly better performance at the same complexity.
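
For a feel of what one of those transistor-hungry units buys, here is the classic branch-prediction demonstration in C: the same loop runs much faster over sorted data because the branch becomes predictable. (A sketch only; an optimizing compiler may turn the branch into a conditional move and hide the effect.)

```c
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N    1000000
#define REPS 200

/* Sum the elements >= 128. Over random data the branch mispredicts
 * roughly half the time; over sorted data it is almost always
 * predicted correctly. */
static long long sum_big(const int *a, int n) {
    long long s = 0;
    for (int i = 0; i < n; i++)
        if (a[i] >= 128)
            s += a[i];
    return s;
}

static int cmp_int(const void *x, const void *y) {
    return *(const int *)x - *(const int *)y;
}

int main(void) {
    static int data[N];
    for (int i = 0; i < N; i++)
        data[i] = rand() % 256;

    long long random_sum = 0, sorted_sum = 0;

    clock_t t0 = clock();
    for (int r = 0; r < REPS; r++)
        random_sum += sum_big(data, N);   /* unpredictable branch */
    clock_t t1 = clock();

    qsort(data, N, sizeof data[0], cmp_int);

    clock_t t2 = clock();
    for (int r = 0; r < REPS; r++)
        sorted_sum += sum_big(data, N);   /* predictable branch */
    clock_t t3 = clock();

    printf("random: %lld in %.3fs, sorted: %lld in %.3fs\n",
           random_sum, (double)(t1 - t0) / CLOCKS_PER_SEC,
           sorted_sum, (double)(t3 - t2) / CLOCKS_PER_SEC);
    return 0;
}
```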
I agree that PC levels of performance are not likely to be reached in the next few years by ARM implementations. On the other hand, I'd be very interested to see a version of Larrabee using ARM cores instead of Pentium P54C cores. I'd expect more cores with greater efficiency.
+Jon Rabone I wouldn't call IA-32e uops close to ARM at all; they are much lower level. Current IA-32e designs use a horizontal microcode that merely specifies which control lines are to be asserted.

Multimedia decoding is not a niche application, but audio is extremely non-CPU-intensive and video is usually done on the GPU.
+Jon Rabone Unfortunately, the IA-32 CPU would need huge amounts of cache to perform well at that scale, negating the advantage of using simple IA-32e cores, unless you turn it into a barrel CPU, which would be a huge waste of space and power for HPC (though I think barrel CPUs for servers are great).
ARM and x64 are converging to a degree: x64 got 16 GPRs, Sandy Bridge added 3-operand instructions (VEX prefix), and ARM is getting a 64-bit ISA. But the essential RISC/CISC difference remains: fixed- vs. variable-length instructions. Process differences, excellent OOO implementations, etc., have masked the impact. Maybe if we reach a limit of IC manufacturing it'll be relevant again. I don't recall the Hennessy and Patterson numbers for the impact of fixed/variable length (and it was probably based on VAX vs. proto-MIPS), but I seem to remember it was less than a 20% difference.
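
To make the fixed/variable point concrete: the same register-to-register add takes 2 bytes on x86 but a fixed 4 bytes in the classic ARM encoding. A toy comparison (the byte sequences are hand-assembled here; worth double-checking with a real assembler):

```c
#include <stdio.h>

int main(void) {
    /* x86:  add eax, ebx    -> 01 D8        (2 bytes, variable-length ISA) */
    unsigned char x86_add[] = { 0x01, 0xD8 };
    /* ARM:  add r0, r0, r1  -> E0800001     (always 4 bytes; shown
     * little-endian, as stored in memory) */
    unsigned char arm_add[] = { 0x01, 0x00, 0x80, 0xE0 };

    printf("x86 add: %zu bytes, ARM add: %zu bytes\n",
           sizeof x86_add, sizeof arm_add);
    return 0;
}
```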
+Jon Rabone 1 Tflop with ~50 cores is very poor performance; 12 Core i7 2600L chips could do it. It doesn't matter much that Knights Ferry's cores are simpler; HPC (and the corresponding benchmarks) is only about floating point, so mostly only the floating-point units matter.

CISC requires lots of cache, and IA-32e even more so. We solved the IA-32e scalability problem in the I/O space by replacing crossbars with switched fabrics and integrating the memory controller into the CPU, but for floating-point performance to scale, we need to scale caches accordingly.
+Aram Hăvărneanu "CISC requires lots of cache" ?? Since many common CISC instructions are smaller than the equivalent RISC ones, you can consider a CISC program to be partly compressed. Most measures of program size confirm this. That reduces I-cache and L2+ cache usage for CISC.
+Andrew Murdoch Instruction size dominated cache usage in 1985, and a selling point of RISC CPUs at that time was that they did not require as much cache. Today, deep pipelines and out-of-order execution require lots of cache, and instruction size does not matter as much.
Bring it on... I want an ARM-powered Chromebook. Why isn't Google releasing a quad-core ARM-powered Chromebook for Xmas!?
Yusuf M
Most people don't know how Intel developed the Atom core, and that's the reason we see a lot of "ARM conquering the x86 market" speculation.

The original Atom core is a "stripped down" version of the original Pentium 3 Willamette core (15 years old). Intel removed many performance-enhancing features from the P3 in favor of battery life, including the out-of-order unit, added some power optimizations, and made the Atom core.

So when you compare ARM vs. Intel, we should understand that we are essentially comparing the most modern ARM CPU with a 15-year-old Intel processor that's been stripped of its most important performance feature (the OOO unit).

Now we'll really have to laugh when someone says the next ARM is going to bash the Core i7.
So because Intel chose to use a 15-year-old design to go up against the ARM CPUs, we shouldn't compare ARM against them, because Intel isn't cutting it with its 15-year-old design? You're funny....