Intel’s Pittsburgh Research Lab opened its doors this week for a tour of the exploratory research it is doing on future technologies, including a natural gesture interface for games built as a novel application of SLIPstream parallelization techniques. The lab demonstrated the interface with a head-to-head Tetris-style game in which players use whole-body gestures to control the motion of their pieces. Unlike typical approaches to gesture detection that rely on props, special clothing or markers (as in motion-capture systems), or a controlled environment such as a blue screen, the Intel approach is designed to work in everyday environments and does not require users to be segmented from the background. Although the technique is computationally expensive, the researchers achieve interactive speeds by parallelizing the vision algorithm across a cluster of machines in a manner that minimizes latency.
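The post doesn't describe SLIPstream's internals, but the latency-minimizing idea can be illustrated with a toy sketch: instead of pipelining successive frames (which raises throughput but adds delay), every worker helps finish the *current* frame. The tile-splitting scheme, function names, and the trivial per-tile "vision kernel" below are all hypothetical stand-ins, not Intel's actual algorithm.

```python
# Hypothetical sketch of low-latency data parallelism over one frame.
# Assumption: the expensive vision computation can be split spatially.
from concurrent.futures import ProcessPoolExecutor

def score_tile(tile):
    # Stand-in for an expensive per-region vision kernel
    # (e.g. matching body-part templates); here just a sum.
    return sum(sum(row) for row in tile)

def split_into_tiles(frame, n):
    # Divide the frame's rows into n contiguous horizontal strips.
    rows_per_tile = max(1, len(frame) // n)
    return [frame[i:i + rows_per_tile]
            for i in range(0, len(frame), rows_per_tile)]

def process_frame(frame, workers=4):
    tiles = split_into_tiles(frame, workers)
    with ProcessPoolExecutor(max_workers=workers) as pool:
        # All workers attack the same frame, so the per-frame
        # latency shrinks as workers are added.
        return sum(pool.map(score_tile, tiles))

if __name__ == "__main__":
    frame = [[1] * 8 for _ in range(8)]  # toy 8x8 "frame"
    print(process_frame(frame))  # prints 64
```

The design choice mirrors the article's point: a pipeline of machines, each handling a different frame, would also reach interactive frame rates, but each individual gesture would still take the full single-machine time to register, which is unacceptable for head-to-head play.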