Intel’s Pittsburgh Research Lab opened its doors this week for a tour of the exploratory research it is doing on future technologies, including a natural gesture interface for games built as a novel application of SLIPstream parallelization techniques. The lab demonstrated the interface with a head-to-head Tetris-style game in which players use whole-body gestures to control the motion of their pieces. Unlike typical approaches to gesture detection, which rely on props, special clothing or markers (as in motion-capture systems), or a controlled environment such as a blue screen, the Intel approach is designed to work in everyday environments and does not require users to be segmented from the background. Although the technique is computationally expensive, the researchers have achieved interactive speeds by parallelizing the vision algorithm across a cluster of machines in a way that minimizes latency.
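The post doesn’t describe SLIPstream’s internals, but the latency-oriented parallelization it mentions can be illustrated with a generic data-parallel sketch: split each video frame into tiles, analyze the tiles concurrently on separate workers, then merge the partial results. Everything here is hypothetical stand-in code (the tile kernel just sums pixel intensities), not Intel’s actual pipeline:

```python
# Minimal sketch of frame-level data parallelism (not Intel's SLIPstream code):
# split a frame into tiles, process tiles on worker processes in parallel,
# and merge the results so per-frame latency shrinks with worker count.
from concurrent.futures import ProcessPoolExecutor


def analyze_tile(tile):
    """Stand-in for an expensive per-tile vision kernel (hypothetical).

    Real code would run feature extraction or pose estimation on the tile;
    here we just sum pixel intensities.
    """
    return sum(sum(row) for row in tile)


def split_frame(frame, n_tiles):
    """Split a frame (a list of pixel rows) into n_tiles horizontal bands."""
    rows_per_tile = max(1, len(frame) // n_tiles)
    return [frame[i:i + rows_per_tile]
            for i in range(0, len(frame), rows_per_tile)]


def process_frame(frame, workers=4):
    """Fan tiles out to worker processes, then merge the partial results."""
    tiles = split_frame(frame, workers)
    with ProcessPoolExecutor(max_workers=workers) as pool:
        partials = list(pool.map(analyze_tile, tiles))
    # Merge step: real code would fuse per-tile detections into one pose.
    return sum(partials)


if __name__ == "__main__":
    # Fake 640x480 grayscale frame of all-1 pixels.
    frame = [[1] * 640 for _ in range(480)]
    print(process_frame(frame))  # 640 * 480 = 307200
```

A cluster version would distribute tiles over the network instead of over local processes, which is where minimizing communication latency (rather than just maximizing throughput) becomes the interesting design constraint.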