We’ve just finished up with the 9th annual Research@Intel press event, at which Intel Labs showcased examples of forward-looking research to media and industry analysts from around the globe. Intel CTO Justin Rattner kicked off the event by stressing the importance of collaboration between Intel, academia, governments, and the broad technology industry in order to turn the ideas of today into reality. He cited Intel’s collaboration with Apple to develop Thunderbolt I/O technology, which had its genesis in Intel Labs, as well as collaborations with DARPA to develop extreme-scale computing technology 1000x more capable than what can be done today.
Justin also announced a new Intel Science and Technology Center (ISTC) for Secure Computing. This center will be co-led by Intel Labs and UC Berkeley and will pursue research breakthroughs to make personal devices and data more secure. Berkeley will act as the hub of academic research for the center, which also includes researchers from the University of Illinois, CMU, Drexel, and Duke. This is the second in a series of ISTCs to be announced this year – we announced the ISTC for Visual Computing, co-led by Stanford University, in January.
He also referenced the Many-core Applications Research Community, a network of more than 80 institutions testing next-generation software ideas using the 48-core “Single Chip Cloud Computer,” a concept chip developed at Intel Labs. What we learn from these research efforts will help guide the development of future architectures even better suited to the needs of tomorrow’s cloud datacenters.
The mood on the show floor today was one of excitement. I had the opportunity to give a livecast on the demos in the “cloud” zone, describing projects such as a cloud-based, ray-traced game running on handheld devices and a virtual city designed for disaster response training. For the latter, the innovation shown was the ability to scale the number of ‘players’ in such online, virtual training scenarios from hundreds to thousands. And, as announced today, this code will be released as open source for use by the OpenSim virtual world community later this month.
We also announced today that Intel Labs will open-source code for a different ray tracing project, one that targets offline, photorealistic rendering for uses such as design and digital effects in film. This code can provide up to a 2x speed boost for such professional applications and will also be available later this month.
In other parts of the show floor we showed an array of interesting projects: a bike-powered netbook for developing economies, a “magic mirror” that allows you to see your virtual body and change it based on real human body scans, a programming system designed for the exascale supercomputers of the future, human perception technologies, and more. Many of these projects include academic, industry, and government collaborations of the type Justin emphasized in his opening address.
Looking around the event, I was amazed at the diversity of projects, the number of leading minds in the room, and the level of interest in the research. And, as part of the R@I planning team, I know that the ~35 demonstrations on the floor represent only a fraction of the projects at Intel Labs. From here, the future looks bright.