Intel’s 40th Anniversary: Top 4 milestones that shaped our present and four predictions for the future

See the full 90 minutes of my recent press briefing

As Intel celebrates its 40th Anniversary, it gives me great pleasure to share with you the insights and learnings from my 29-year career with Intel; it’s been a rewarding experience and an amazing journey. There are four key milestones that, to me, not only shaped Intel but also shaped the computer industry of today.

1. The introduction of 32 bits with the Intel 80386. The 80386 was a huge turning point for Intel and for microprocessors as we went from multi-sourced to single-sourced. The press hounded us with constant questions of “what applications will require 32 bits?” when the PC world was predominantly running DOS, but we were laying the architectural foundation for the next decade so that new operating systems and applications could take advantage of new hardware capabilities. Compaq chose not to wait for IBM and boldly moved forward with the introduction of a PC based on the 80386, and Intel launched the Red X campaign against the 80286. We also saw moves in software and system architecture. Microsoft moved ahead with Windows, relegating IBM’s OS/2 to a footnote in OS history. Intel, Compaq and others drove the evolution of the ISA bus to EISA and subsequently to PCI, making the Micro Channel architecture irrelevant as well.

2. The CISC/RISC debate. The fundamental issue was not the architecture or the instructions per clock (IPC) but really software compatibility, and RISC failed because it lacked such compatibility. The Intel® Pentium® Pro processor, with its out-of-order execution and integrated Level 2 cache, brought an end to the RISC vs. CISC debate, as Intel was able to demonstrate to the world a transition to a more efficient architecture that was fully compatible with existing software. I was a graduate student with Prof. John Hennessy of Stanford, and we had multiple teacher/student and academic/industry debates, both privately and publicly. The debate continued even after Intel introduced the 80486, the Intel Pentium® processor and the Intel Pentium® Pro processor. Intel even had RISC developments, including the i860 microprocessor, which proved equally unsuccessful. Many in the industry joined the ACE consortium to build an open RISC processor. In the Micro 2000 paper, which a team of us published in 1989, we made the prediction that software compatibility was fundamental: even if something is better, it needs to be sustainable long enough for people to develop a software architecture.

3. The right-hand turn to multi-core to deliver on Moore’s Law and energy efficiency. I see the history of Moore’s Law as breaking into three eras of semiconductor scaling. The first era was developing process technology just to make Moore’s Law work, the age of invention: what are the materials, physics and chemistry needed to build these things? The second era was really about scale and manufacturing efficiency, delivering in high volume in a predictable way. I made a prediction of getting to 10 GHz; I was probably wrong on that one by 20 years or so! The third era is really about the major transition to energy-efficient performance, which is a fundamental shift. In an ISSCC paper in 2001 we predicted a power wall, with a famous picture of die thermal densities equal to those of a nuclear reactor or the surface of the sun. Clearly we needed to change, and our answer was the “Right Hand Turn”. While we saw this fundamental shift, we were one generation too late and attempted to extend the Pentium 4, but luckily we recovered quickly with our focus on energy-efficient performance with Centrino and our great Tick-Tock execution.

4. The power of compatibility of Intel Architecture. In my early years working on the 80486 architecture definition, I attempted to eliminate some of the legacy items in the instruction set, like EBCDIC support and decimal arithmetic. The marketing manager and I debated this fervently at the time until I got the religion of instruction-set compatibility. Andy Grove solidified this perspective with his famous Software Spiral analysis, indicating that we had billions of dollars of software ready to run on our next-generation processors even before we announced them, if we could deliver on that promise of compatibility. It was all about the architecture, the investments of the industry and the software that delivered the end-user application capabilities.

Looking ahead into the future, the major technology trends that I predict will have a profound impact are:

1. Moore’s Law and silicon scaffolding. Two decades ago, 1 micron was challenging and 100nm looked impossible; now we casually talk about what it takes to get to 10nm. We have about 10 years of visibility into the future, out to 10nm; after that it’s hard to say. This is similar to driving down a road on a foggy night where you can see 100 yards ahead, and as you cross the first 50 yards, the next 100 yards become clear. We continue to see silicon as the scaffolding for other materials; earlier we used about 10 elements of the periodic table, and now we use over half of it. We also continue to see the economic value of increases in wafer size to reduce die costs, a critical component of the industry’s ability to scale. We expect the move to 450mm wafers will happen sometime in the early part of the next decade, as seen in our recent announcements of Samsung and TSMC joining with Intel to drive this transition. This shift will continue to provide the opportunity for both scaling and cost reductions. We expect it will also drive further changes in industry structure and a move toward greater use of foundry models.

2. Many-core ushers in the tera-scale age of computing. Multi-core (2, 4, 6, 8 cores) computing has resulted in spectacular gains in energy-efficient performance, and we need software programmers to take advantage of parallelism. We are not talking about existing apps, but about a new generation of apps, e.g. search across model-based data, visual computing applications such as physics, etc. Further, we see breakthroughs in user interfaces that will be intuitive, immersive and interactive. Intel is unlikely to be the direct source of those application and UI breakthroughs but will continue to be an enabler of those innovations from the millions of developers across the industry. We have to solve problems in parallel programming and extend tools to many-core and tera-scale computing as the value proposition of compatibility continues to grow across different segments. Delivering teraflops of performance will enable a new wave of development, which will be an exciting proposition for Intel and the industry.
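To make the parallel-programming shift concrete, here is a minimal sketch (an illustrative example only, not Intel tooling) of the programming model many-core rewards: splitting independent work units across processor cores so throughput scales with core count. The `heavy` workload and the input sizes are made up for illustration.

```python
# Minimal data-parallel sketch: fan independent work items out across cores.
# On a many-core machine, wall-clock time shrinks roughly with core count,
# because each process runs a work item on its own core.
from concurrent.futures import ProcessPoolExecutor

def heavy(n):
    # Stand-in for a CPU-bound kernel (physics step, model query, etc.).
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    inputs = [10_000] * 8                      # eight independent work items
    with ProcessPoolExecutor() as pool:        # one worker per core by default
        results = list(pool.map(heavy, inputs))
    print(results[0])
```

The hard problems the text alludes to start exactly where this toy ends: work items that are *not* independent, and the tools needed to express and debug their interactions across dozens of cores.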

3. The power of compatibility: IA everywhere, from milliwatts to petaFLOPS. Software compatibility will reign in the future as it has in the past. Thus, we are taking our process technology leadership along with the IA architecture and extending those capabilities to deliver a solution for everything that computes: literally from the very smallest platforms, such as MIDs and phones, to the very largest supercomputers. From delivering very power-sensitive solutions with Atom and building SoCs around that architecture, to building 10 or 100 teraFLOPS on a single chip of big and little cores, in all these efforts it’s all about software compatibility.

4. IA everywhere, 24×7, for every modality of life. Our goal is that all users on the planet will use IA and internet connectivity for every modality of life, accessible 24×7. The modalities include work, learn, rest and play. This shift in compute model toward an always-on, always-connected 24×7 experience is broadening. Think about the bookends of computing: on the high end is data-center-scale computing (cloud computing, or mega-DCs), where both scale and the efficiency of cloud services are increasing. On the other end are mobility and handhelds, with Intel Centrino and Atom processors. In the middle is the visual computing experience. The embedded computing experience is emerging with a strong ecosystem and supports everything we do; for example, rooms that know what you need, and smart power grids that are more efficient and scalable. If you use this definition of 24×7*All*Every, we are only 4 or 5% complete with our task. There is a lot of growth left in building the big, mobile, embedded and visual computing models of the future.

5 thoughts on “Intel’s 40th Anniversary: Top 4 milestones that shaped our present and four predictions for the future”

  1. The world has 6.5 billion people as of 2006. Roughly 1 billion of those are in the developed countries and the remaining 5.5 billion are in developing countries. And the population in developed countries is shrinking. Most experts agree that the growth in demand for computing devices will come from developing countries. But the average per capita GDP in a developing country such as China is about $2000. (The per capita income is much lower.) So my prediction is that we will need to figure out a way to lower our product cost dramatically to take advantage of the growing demand in the developing world.

  2. Maybe it’s late and I’m reading this wrong (also go ahead and remove my comment when you make the change…or if I’m wrong), but the sentence “it needs to be sustainable long enough time for people to develop software architecture” in point #2 needs to be changed. It doesn’t read right. Maybe to: “it needs to be sustainable long enough for people to develop a software architecture”?

  3. ‘Our goal is that all users on the planet will use IA, internet connectivity for every modality of life and accessible 24×7. The modality includes work, learn, rest and play.’ So, basically, our goal is to create the Matrix.

  4. IA compatibility need not mean IA, and we should not assume that because that was the right answer in the past, it is the right answer in the future — especially as, in the past, power was not a driver.
    A simpler instruction set combined with binary emulation of IA instruction sets may have a turn-over point at which it does more for less power, compared to pure IA. You obviously need to get the critical apps native to avoid the power drain of JIT binary translation for those apps.
    We have a much greater capability to make that happen with Intel Tools now than we did in the past. We are certainly able to provide all three kinds of useful technologies: JITs, offline client hosted translators, and native development tools. In addition, as whoever owns the development tools effectively owns the architecture, making sure Intel tools are pervasive has strategic value.
    It is possible we will have to do this to achieve power leadership for the MID market.
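The trade-off this comment describes — pay a one-time JIT translation cost per code block, then reuse the cached host code, while keeping the most critical apps native — can be sketched in miniature. This is a toy model with invented names, not any real Intel translator or tool; real binary translators operate on machine code, not the tokenized stand-in used here.

```python
# Toy binary-translation sketch: translate a "guest" code block on first
# execution, cache the translated host function, and reuse it thereafter.
# The power/performance argument hinges on this amortization: hot blocks
# are translated once, so steady-state cost approaches native execution.

class ToyBinaryTranslator:
    def __init__(self):
        self.cache = {}        # guest block -> translated host function
        self.translations = 0  # how many blocks we actually translated

    def translate(self, guest_block):
        # Stand-in for real translation: compile the block's ops
        # into a host-side callable.
        self.translations += 1
        ops = guest_block.split()
        def host_fn(x):
            for op in ops:
                if op == "inc":
                    x += 1
                elif op == "dbl":
                    x *= 2
            return x
        return host_fn

    def run(self, guest_block, x):
        # JIT with a translation cache: translate only on a cache miss.
        fn = self.cache.get(guest_block)
        if fn is None:
            fn = self.translate(guest_block)
            self.cache[guest_block] = fn
        return fn(x)

vm = ToyBinaryTranslator()
print(vm.run("inc dbl", 3))   # first use: translate, then execute -> 8
print(vm.run("inc dbl", 3))   # second use: served from the cache -> 8
print(vm.translations)        # the block was translated once, not per run
```

The "critical apps go native" point in the comment corresponds to bypassing this machinery entirely for the workloads where even the amortized overhead is too expensive.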

  5. From the SW developer’s perspective, it needs to be ‘compatible’. While binary translation may be a means to achieve that in some cases, it’s important that the SW of the past (and future) continues to run. Efficient binary translation at the application level and the system level is not easy to accomplish, but with continued innovation it is clearly making progress.

Comments are closed.