The Current Revolution in Computing and Why I Have the Best Job at Intel

I would like to tell you why I have the best job at Intel and why I think we are in the most exciting time in the history of computing. Before I do that, I would like to go back to the beginning of the personal computer revolution.

I had the pleasure of working on the original 80386 microprocessor, which was launched in 1986. This was a very exciting project, and everyone on the team was energized to be part of it. We definitely knew we were on the brink of something very big. The vast transistor budget available for that project permitted us to build an entire CPU on one chip, and we achieved unprecedented performance at a very low cost. On top of that, we were able to easily beat the performance of minicomputers, and we projected we would exceed the performance of mainframe machines within a few generations of microprocessors. Two product generations later we developed the Pentium™ microprocessor. By that time we knew that the microprocessor had become the universal building block for computing. In my mind this was a major milestone in the history of computing.

I had the pleasure of leading the development of quite a number of microprocessors over a 20-year period. I experienced firsthand the huge growth in instructions per clock and the increases in clock rate that advances in silicon process technology enabled. We continued to smash through technology barriers and innovate on many fronts to increase performance. As performance increased, computer usage broadened to new uses that we now take for granted.

Microprocessors have now touched every aspect of our lives. We have all come to expect that the internet connects us to each other, makes information available, and makes commerce easy. We take for granted that email is ubiquitous and that the computer has become the place to store and enjoy all kinds of media. This has clearly been a revolutionary change, and we are far from the end of the revolution.

We are actually starting the next inflection point in the revolution of computing, and I would like to illustrate this by example. I recently built a quad-core computer for my home. This computer is simply fantastic. I use it to edit digital images and convert RAW files, which are demanding tasks for 13-megapixel images. A single-core computer simply gets bogged down doing RAW conversions and some of the more complex image manipulations. This is painfully obvious after a shooting session when there are hundreds of pictures to process, since it can take hours. The quad-core computer, with threaded software, is so fast that I can process the images as fast as I can decide how I want them to look. Most compute-intensive software applications can be written to benefit from multiple cores. The ability to use multiple cores to work on one task is a major breakthrough, since performance can be increased simply by implementing more cores in the system. Moore’s Law continues to give us increases in transistor budgets that will bring this technology to increasing numbers of people. The excitement is just as palpable as when we designed the original 80386 microprocessor. The next revolution in computing is developing before our eyes.
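To give a flavor of what threaded software looks like, here is a minimal sketch of a batch conversion loop parallelized with OpenMP. The convert_raw routine is a hypothetical stand-in for a real RAW converter, not an actual library call; the point is that because the images are independent, a quad-core machine can chew on four of them at once.

    // Sketch: batch image conversion spread across cores with OpenMP.
    // convert_raw() is a placeholder for the real per-image work.
    #include <cmath>
    #include <cstdio>

    // Stand-in for the expensive per-image work (demosaic, tone map, ...).
    double convert_raw(int image_id) {
        double acc = 0.0;
        for (int px = 0; px < 13 * 1000 * 1000; ++px)  // ~13 megapixels
            acc += std::sqrt((double)(px ^ image_id));
        return acc;
    }

    int main() {
        const int num_images = 16;
        double checksum[num_images];

        // The iterations are independent, so they can be distributed
        // across cores; compile with -fopenmp (GCC) or /openmp (MSVC).
        #pragma omp parallel for schedule(dynamic)
        for (int i = 0; i < num_images; ++i)
            checksum[i] = convert_raw(i);

        for (int i = 0; i < num_images; ++i)
            std::printf("image %2d -> %.1f\n", i, checksum[i]);
        return 0;
    }

On four cores a loop like this runs roughly four times faster than on one, which is exactly why my photo sessions went from hours of processing to keeping pace with my decisions.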

My job at Intel is leading a team of talented researchers who are working on ways to accelerate the trend to multiple cores. We are making great progress. It is exciting to be part of this revolution. This is why I always say I have the best job at Intel.

Next time I post I will describe some of the challenges we are working on.

13 Responses to The Current Revolution in Computing and Why I Have the Best Job at Intel

  1. Ketan says:

    Some AMD professionals came to our college and explained the various differences between Intel and AMD. One of their points, which I did not agree with, was that the Intel dual core processor was not actually dual core, but that only one core was working at a time, and because of this defect they launched the Core 2 Duo. How relevant is this information? Please reply….

  2. Joe Schutz says:

    Ketan, thanks for the question. Both cores actually work at the same time, which is the reason we get such stunning performance from all of our dual core parts.

  3. Michael Molin says:

    “The next revolution in computing is developing before our eyes”
    Cell computer.
    Actually, it’s not about cores – Intel proved it. It’s about displays.

  4. Igor says:

    Joe, can you please tell us exactly which threaded software you use for RAW conversion?
    Moreover, since you are working on the next big thing, may I remind you that scatter/gather instructions are sorely missing from the x86 SIMD instruction set?
    Five iterations of SIMD enhancements later, with the 6th (SSE4) in the works, there is still no sign of this very important feature. I am really disappointed in Intel engineers.
    Here is what I suggested for the implementation, hopefully someone will take a look at it:
    http://softwarecommunity.intel.com/isn/Community/en-US/forums/thread/30228599.aspx
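    For anyone unfamiliar with the term, this is roughly the scalar loop that a single hardware gather instruction would replace (a simplified illustration only, not the exact encoding from my proposal):

        // Gather: load elements from non-contiguous addresses selected by
        // an index vector. With SSE today, each element needs its own
        // address calculation and scalar load.
        #include <cstdio>

        int main() {
            float table[16];
            for (int i = 0; i < 16; ++i) table[i] = 100.0f + i;

            int   idx[4] = { 3, 9, 0, 14 };  // data-dependent indices
            float v[4];

            // One hypothetical gather = these four scalar loads.
            for (int lane = 0; lane < 4; ++lane)
                v[lane] = table[idx[lane]];

            std::printf("%.1f %.1f %.1f %.1f\n", v[0], v[1], v[2], v[3]);
            return 0;
        }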

  5. Joe Schutz says:

    First, thanks to Michael and Igor for the questions.
    I would first like to address Michael Molin’s comment. Cell is a specialized product used mostly as a co-processor. The work in my lab is related to the development of general purpose parallel hardware. There are many advantages to using general purpose hardware, in both performance and in the development environment used by software engineers, that cannot be achieved with co-processors.
    Igor, I have just started using Adobe Lightroom for RAW file conversions. The performance is outstanding. If you are using a quad core machine, just open the process monitor in Windows and watch how it keeps the four cores busy.
    I have a group of top engineers working in the area of scatter/gather, and I will pass your comment on to them. There are a number of workload types that dramatically benefit from scatter/gather, and we have done a lot of work in this area.
    Your comment and my answer raise an interesting point and a disclaimer: in the research community at Intel, we try very hard to cover a wider field of interests than the product groups do. This keeps our knowledge at the leading edge of technology. The product groups make the feature selections for the products, which is not our charter. So you will see us working on a broad range of things, some of which will not make it into products.

  6. Igor says:

    Thanks for replying. Maybe now it’s time for the Intel engineers to take back the (well deserved) lead at Intel?
    Scatter/gather is a crucial operation for many tasks. The one I am working on at the moment, and where I am sorely missing it, is 3D reconstruction (backward and forward projection, trilinear interpolation) for medical imaging purposes.
    A friend of mine and I have very fast code which reconstructs 600 frames at a resolution of 352×472 (16-bit data from an X-ray scanner) into 265 frames of 512×512 in approx. 20.8 seconds on a Quad Core CPU at 2.66GHz.
    What is a bit disappointing is that a single 8800GTX GPU can do it in a mere 4.5 seconds.
    Both architectures use SIMD for the calculations, and the CPU code is something I wrote in assembler by hand and carefully tuned and threaded for maximum performance, but the GPU still beats it by a factor of 4.6x.
    I believe that scatter/gather alone would decrease this gap substantially. Especially if carefully implemented with some clever shortcuts (detecting and avoiding overlapping reads and writes, for example), it could help keep the CPU the most powerful general purpose computing device, at least until we get Larrabee :)
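    To show why gather matters so much here, the trilinear interpolation at the core of the reconstruction looks roughly like this in plain C++ (a simplified sketch, not our hand-tuned assembler):

        // Trilinear interpolation: eight neighbor fetches per output sample.
        // The eight loads come from scattered addresses, exactly the access
        // pattern a hardware gather instruction would vectorize.
        #include <cstddef>
        #include <cstdio>

        float trilinear(const float* vol, int nx, int ny,
                        float x, float y, float z) {
            int   x0 = (int)x, y0 = (int)y, z0 = (int)z;
            float fx = x - x0, fy = y - y0, fz = z - z0;

            // Index of a voxel in a volume stored x-fastest.
            #define V(i, j, k) \
                vol[(std::size_t)(k) * ny * nx + (std::size_t)(j) * nx + (i)]
            float c00 = V(x0, y0,   z0  ) * (1 - fx) + V(x0+1, y0,   z0  ) * fx;
            float c10 = V(x0, y0+1, z0  ) * (1 - fx) + V(x0+1, y0+1, z0  ) * fx;
            float c01 = V(x0, y0,   z0+1) * (1 - fx) + V(x0+1, y0,   z0+1) * fx;
            float c11 = V(x0, y0+1, z0+1) * (1 - fx) + V(x0+1, y0+1, z0+1) * fx;
            #undef V

            float c0 = c00 * (1 - fy) + c10 * fy;
            float c1 = c01 * (1 - fy) + c11 * fy;
            return c0 * (1 - fz) + c1 * fz;  // eight scattered loads per sample
        }

        int main() {
            // Tiny 4x4x4 test volume with value = x + 10*y + 100*z.
            const int n = 4;
            float vol[n * n * n];
            for (int k = 0; k < n; ++k)
                for (int j = 0; j < n; ++j)
                    for (int i = 0; i < n; ++i)
                        vol[k * n * n + j * n + i] = i + 10.0f * j + 100.0f * k;

            // Interpolating a linear field is exact: prints 204.00.
            std::printf("%.2f\n", trilinear(vol, n, n, 1.5f, 0.25f, 2.0f));
            return 0;
        }

    Every one of those eight loads has a data-dependent address, so the SIMD units sit idle while scalar code shuffles data in and out of the vector registers.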
    Just my $0.02.

  7. Anon says:

    In regard to Ketan’s post – I believe he meant the Pentium D.
    I’ve also heard the same thing.

  8. Lord Volton says:

    Here are three things I hope you’re already working on:

    x86 cellphone chips. I think we’re at a point where we should be able to run standard applications (Word, Excel, etc.) on our cellphones. This will also allow third-world countries to get super cheap computers.
    Dedicated graphics and physics processors. This could be part of your multiple core approach. It’s amazing how few games make use of multiple cores. Peer-to-peer massively multiplayer online games would really be able to make good use of all the horsepower that multiple cores provide.
    The end of single cores. As long as you keep pumping them out, people will continue to code for them. I believe a press release that officially announces the END of the single core will put the future into full perspective for everyone. A date after which single threads are officially part of history.

  9. Kempion says:

    When are we going to see multi-core MCCs?
    I would like to see one core per address bus, and one channel per address bus, which would connect to a memory bank of any size. Of course, then we would have to expand the number of address buses, which would lead to redesigning CPUs to match the address bussing changes… I have a headache already!
    In the end though… woosh! Talk about MASSIVE amounts of RAM!

  10. Brandon Clinger says:

    Hi, I’m a freshman engineering student at The Ohio State University. Right now I am engineering-undecided, but I can switch anytime within the next 2 years. At first I really wanted to go into computer hardware design, and possibly aim at getting a job at Intel, but there are a few concerns that I have. How secure would that job really be? The job is so specialized, and I’m sure engineers overseas are very attractive to big corporations due to the cost of labor. While being a chip designer would be fun and fast-paced enough that I could thrive at solving problems until I retire (who knows what age that will be when it’s my time), I do not ever want to be in danger of being unemployed. Could you provide any insight into this? Thanks!

  11. Sam Louke says:

    Brandon, I retired from Intel after 27 years in 2005. I was hired by Intel right out of school with a BS in ChemE and over the years I had many Intel jobs, from Fab Engineer, to Equipment Development engineer, to service contract management, and for the last 10 years as the Bulk Chemical Engineer in Technology Development. I guess what I’m trying to say is that over your career you will have lots of opportunities for many jobs, so you should do your best to prepare for your career as you see it while you are in college. But understand, no company can guarantee you a job for life, so you should prepare yourself for changes along the way by continually learning new skills throughout your career. Intel is large enough that when a door closed to me, there were always several open ones nearby.

  12. mike lee says:

    I am trying to get an email to someone in the technology support group. I have a couple of workstations that are based on the X38 chipset and would like to talk to someone about that chipset. If you could point us in the right direction, I would be most appreciative.

  13. digit.all.eyes says:

    I totally agree with Lord Volton on his point about dedicated graphics and physics processors. Most dual core laptops nowadays have really attractive features, and most of them (even on a budget) still pack a power punch, except for the fact that they don’t have a boxing glove!! (thus the dedicated graphics). Are we going to see Intel graphics cards (either integrated or otherwise) that can actually play games? No, no, not just any games, but graphics-intensive games??
    Sorry, I might have posted this on the wrong blog, but you’ve got to read it, right? :-)