What is Intel doing to visualize futuristic applications? Watch it live!

Have you ever wondered how Intel Labs' internal and sponsored visual computing research will improve your daily life in the future? As a technical marketing engineer in Intel Labs, I have the privilege of witnessing Intel Labs' futuristic visual computing research demos at Research at Intel Day 2011 on June 7th and 8th. Here is one example. Imagine shopping online for almost everything you need in daily life without the guesswork. Today's online shoppers have to imagine the look and feel of the actual merchandise. What if you could use today's smart TVs to shop online from your living room couch, view each item in 3D, and even try it on using a tracking camera in your living room? Technologies unveiled at this year's Research at Intel Day are going to enable a plethora of such use cases.

Looking under the covers of the online touring demo, you can see that research into transparent 3D internet technology at the Intel Visual Computing Institute will allow realistic 3D viewing of online content without proprietary browser plugins. Intel's latest processors provide the hardware support for very rapid rendering of realistic views of objects directly in the browser. Intel Labs didn't stop there, though: it extended this improved visual experience to digital content by coupling Intel platforms with consumer electronics devices to deliver a must-have user experience in the living room.

Intel realizes that it's not all about shopping. Next-generation virtual environments are impacting the gaming industry and becoming a proving ground for realistic test scenarios, and small businesses are now generating revenue using technologies based on virtual environments. What if we could add real-time technologies such as facial and emotion recognition to virtual applications? What if today's virtual world back ends could support more than 20 times the avatars in a virtual scene that they do today? That would let you interact with virtual objects in many new and interesting ways for a much more realistic, immersive virtual experience. A good example Intel Labs is showing at Research at Intel Day is a massively multiplayer "game" to train first responders for different disaster scenarios. As part of its strategy to increase collaboration across industry and academia, Intel Labs will release the source code for its Distributed Scene Graph 3D internet technology. This code is part of an ongoing effort to augment the OpenSim open-source virtual world simulator, and it will enable developers to build virtual regions where people can work or play online with a cast of thousands instead of being limited to fewer than a hundred today.

Finally, applications such as architectural visualization often require expensive proprietary software to create realistic computer models. Intel Labs will release open-source software that third parties can use to render 3D models into photo-realistic images that are indistinguishable from photographs. This advanced ray tracing code targets professional applications and is a separate effort from our game-focused real-time ray tracing project shown previously.

Cool stuff, huh? Since there are over 40 demos planned, I’ve described just a very small sample of what will be shown to the press at Research at Intel Day.

I hope this quick peek behind the scenes has been interesting. As I learn more leading up to Research at Intel Day and attend the event, I will keep my thoughts coming on my blog. So stay tuned, and don't forget to check back with me for the live stream of the event as I walk through the visual computing research demos. The live stream will start on June 7th, 2011 at 1 PM PST. Mark your calendars!
