This morning I had the opportunity to present a keynote speech at the U.S. Innovation Summit in Washington, D.C., alongside the U.S. CTO and CIO, among many other distinguished participants.
I was asked to speak about the importance of U.S. innovation to job creation, the economy and the future of U.S. competitiveness, so I took the opportunity to discuss how Intel undertook a transformation in our approach to research and innovation and how far we’ve come in the past few years. We view this approach as a 21st century model of industrial research in contrast to the 20th century model of Bell Labs and the many U.S., European, and Asian companies that copied the Bell Labs model.
I’ve received several requests for the text from the speech, so I’ve included it below. Enjoy.
The U.S. Innovation Summit
The Newseum, Washington, D.C.
June 20, 2012
Prepared Speech by Justin Rattner, Intel CTO
Thank you and good morning.
It’s a pleasure to be here today to discuss the importance of innovation to the economic future of the United States. I’ll try to avoid the usual platitudes and get right to what I think U.S. industry needs to do to get its innovation house in order.
It is no doubt clear to those of you who work inside the beltway that the word “innovation” is on the lips of everyone from corporate executives and government leaders to university presidents. Each of them talks about the need for the U.S. to accelerate its pace of innovation or be overrun by innovation coming from virtually every other point on the planet. The message is simple: innovate or die.
Unfortunately, many of these same leaders confuse innovation with ideation, and that, in my judgment, is a critical, if not fatal, mistake. As the CTO of a major technology company, I am constantly exposed to new ideas for all manner of products and services: ideas bubbling up in my organization and ideas streaming in from our customers and our collaborators in both industry and academia. Ask any VC if he or she is lacking for ideas. They’ll tell you the same thing: ideas are cheap, a dime a dozen. Innovation, not ideation, is where we need to focus.
Another common confusion is over the difference between invention and innovation. Every time I hear people reminiscing about the good ol’ days of research when Bell Labs or IBM Research was winning another Nobel Prize or Xerox PARC was off inventing the future of computing, I just cringe. While those industrial-scale research labs of the 20th century were great inventors of things, from the first laser to the laser printer, they were absolute disasters at making them practical and getting them to market. Despite the fact that most of these labs were part of very successful manufacturing companies, the labs themselves had little interest or desire to move their ideas to market. This must be considered their fatal flaw.
Another common belief is that most new ideas are brought to market by entrepreneurs funded by venture capitalists. While venture capital is a great U.S. success story, and Intel Capital is one of the largest venture capital groups in the world, venture-funded businesses represent a relatively small portion of the innovation taking place across U.S. industry. Most innovation is, in fact, done by mid- to large-sized companies that must continually innovate to stay in business. This is especially true in information and communications technology, where much of our economic future will be won or lost. The question we must answer, and answer quickly, is how we turbo-charge industrial innovation in the 21st century to ensure U.S. economic, and therefore global, leadership for decades to come.
I believe that transforming the way the U.S. does industrial research is the key. I want to spend the rest of my time talking about how we undertook such a transformation at Intel and how far we’ve come. We think of it as a 21st century model of industrial research in contrast to the 20th century model of Bell Labs and the many U.S., European, and Asian companies that copied the Bell Labs model.
It may surprise you to hear that for the first two decades of its existence, Intel consciously avoided using the term “research” to describe anything that was not strictly product or technology development. The origin of this “no research” thinking dates back to the time when Intel’s founders, Robert Noyce and Gordon Moore, were running Fairchild Semiconductor in California. Despite invention after invention, from the planar transistor to silicon-gate MOS technology, Fairchild struggled to move new semiconductor technology out of its research lab and into Fairchild’s products.
When Noyce and Moore left Fairchild to found Intel in 1968, they agreed that there would be no line dividing semiconductor research from manufacturing at Intel. New devices and processes would be developed on the factory floor alongside the processes then in production. It wasn’t until the mid-1980s that a small research team was allowed to form. Called Components Research, it has since become the principal engine of semiconductor technology innovation at Intel.
Many of the stunning Intel chip innovations you’ve read about in the last half-dozen years were invented by this one, fairly small, research team, but what separates them from others in our industry is how they get these inventions out of the lab and ready for manufacture. The key is a process we call pathfinding, and I’ll talk more about the critical role it plays at Intel in just a few moments.
Before I do that, let me take you back to late 2006, when Intel launched a broad restructuring program intended to support a new business model for the company. Virtually every part of Intel was analyzed and restructured to match the new model. No part of the company was safe, and that included research and development. One aspect of R&D that received considerable attention was the interface between the primary research arm of the company, Intel Labs, and its primary product development arm, known as the Intel Architecture Group or simply IAG. Of particular importance was the question: how do we dramatically improve the transfer efficiency, or “hit rate,” of new technologies moving out of Intel Labs and into IAG’s mainstream products?
The analysis clearly showed that our low hit rate was not a reflection of the relevance or importance of the technology coming out of the Labs, but was due in large part to timing differences between the completion of the research work and the start of product development. Too often a research project would be complete, but there were no developers ready to pounce on the results and get them ready for product development, a phase we call technical readiness.
After months of unsuccessfully shopping a new technology, a research team would move on to their next project and the motivation to transfer their earlier work would fade. Similarly, when a development team would come by looking for new ideas for their next product, the research team had little interest in returning to what they viewed as yesterday’s news. We came to refer to this synchronization problem as “the valley of death” given its remarkable ability to kill perfectly good technologies before their time.
The solution to the problem, as it turned out, was right under our noses. Of particular interest was the way our semiconductor manufacturing R&D teams bring each new generation of semiconductor technology to market. The key is a process they call, you guessed it, pathfinding. Interestingly, there is no pathfinding department at Intel. In fact, only one scientist and one admin are permanently assigned to pathfinding. In place of a standing army of pathfinders, a pathfinding team is assembled for each generation of chip technology. The team is made up, and this is the secret, of both researchers and developers, who work together for a sufficient period of time, typically 12 to 18 months, to bring about one or more successful technology transfers. With a 100% success record over the last two decades, we knew we were staring the solution in the face. The challenge Intel Labs and IAG set out to solve was how to scale out the pathfinding process to cover the literally dozens of new technologies that go into a new Intel chip or system design.
Despite our fears, adapting the pathfinding process to these numerous areas of research has been a remarkable success. To give you one example, you may have heard of McAfee’s DeepSAFE anti-malware technology, the first commercial defense against zero-day, rootkit cyber-attacks. In plain English, that means defending against the likes of Stuxnet and Aurora. What you may not know is that this technology was invented by Intel Labs and then jointly developed for the market as part of a three-way pathfinding effort between Intel Labs, IAG, and McAfee over the last two years. Now that McAfee is an Intel company, there are many more pathfinding projects underway between Intel Labs and McAfee. Look for these coming to market over the next three years.
Our pathfinding process has become so successful that today the Intel product groups literally fight over the pathfinding slots. It’s also not unusual for more developers to be assigned to a pathfinding project than researchers. At any moment, we have over 50 distinct joint pathfinding projects between Intel Labs and the various product development teams.
While roadmap impact is certainly a critical part of being a 21st century industrial research lab, it is not the whole story. To better understand what works and what doesn’t work in modern industrial research and how it differs from academic research or government research, we initiated a benchmarking effort with various multi-national, industrial research labs around the world. Included in the study were many well-known industrial labs including IBM Research, Microsoft Research, Google Research, and GE Global Research. We looked at about a dozen different labs in all.
Among the things we learned from the benchmarking effort was the importance of balancing research directed at existing product lines and research aimed at exploring technologies that have no immediate business impact or even relevance. For the last four years we have spent half of every dollar on business-directed research and the other half on what we call exploratory research. This 50-50 balance has worked remarkably well. While it would be easy to argue for a much higher spend on the business-directed side, we feel we create much more long-term value for Intel by keeping exploratory research and business-directed research in equal proportion.
We also recognized that not all the smart people in the world work for Intel. Collaboration is a powerful tool to expand on and amplify research results. Such was the case with Apple with whom we worked to create the Thunderbolt I/O technology now found on every Macintosh.
Collaboration turns out to be especially important when working with the university community. To that end, Intel has built its own worldwide collaborative university research network, including what will soon be seven Intel Science and Technology Centers (ISTCs) in the U.S. and five Intel Collaborative Research Institutes (ICRIs) in Europe and Asia. We look to these centers and institutes for the long-range research work typical of that done at Bell Labs in the 40s and 50s and Xerox PARC in the 70s and 80s. Each center or institute is led by a “hub” school and spans multiple “spoke” universities, forming a new, multidisciplinary community of the best researchers in a given field from around the world. Critically, we locate senior Intel researchers at the affiliated academic institutions, where they can work side by side with the top researchers in the U.S. and the rest of the world. These Intel researchers report directly to the various research groups at Intel Labs and are the principal means, along with hiring the best students, of bringing academic breakthroughs to Intel.
At this point you may be thinking that 21st century industrial research is all about creating the great technologies that enable great new products and services to come to market. Let me suggest that it is just as important for the research organization to prevent poor technologies from ever reaching the market. As you can imagine, product failures in the chip business are extremely expensive. A typical mainstream microprocessor may cost upwards of $600 million to develop. More significantly, it will be built in a factory that costs ten times as much to construct. If the product flops because of a bad technology decision, it’s a really big problem. Yet such failures can and do happen, and more often than you might suspect.
What a 21st century research lab can do is vet those new ideas, validate the good ones and weed out the bad ones. This is called “failing fast” and it’s something we consider to be as important to the success of Intel as anything we do on the innovation side. We believe it should be the responsibility of every industrial lab of a certain scale to celebrate this kind of internal failure as much as it celebrates its external successes.
While Bell Labs may have been the model of 20th century industrial research, and there are still a number of companies chasing that vision, it is increasingly dated and out of step with today’s fast moving information and communications technologies. We are trying to set a new course for 21st century industrial research and hope many other companies will join us in this important transformation and not simply for their own near-term success, but for the long-term success of the nation.