Wolfenstein gets ray traced – on your laptop!

It’s that time of year again: IDF! Time to show off the cool stuff our graphics research group has been working on. Today at the exhibition I demonstrated our new project, “Wolfenstein: Ray Traced”.

1) The visual content of the demo. The up-to-date Wolfenstein game is rendered through a real-time ray tracer with several special effects that haven’t been possible in games before with such accuracy. Two of several highlights are:

The chandelier model. Through ray tracing we calculate physically correct reflections and refractions in the many glass objects it contains. The model is highly detailed, with around one million triangles – three times the detail of everything else in the level combined. Have you ever seen something similar in a current game? Probably not, as with the traditional rendering approach it would not be efficiently doable.
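For readers curious about the math: the bending of light in glass follows Snell’s law, which any ray tracer (this demo included) has to evaluate at every glass surface. Here is a minimal generic sketch mirroring the standard `refract` formula – this is textbook graphics math, not the demo’s actual code:

```python
import math

def refract(incident, normal, n1, n2):
    """Refract a unit direction vector at a surface using Snell's law.

    incident, normal: 3-tuples (unit vectors); normal points against the ray.
    n1, n2: refractive indices of the incoming and outgoing media.
    Returns the refracted unit direction, or None on total internal reflection.
    """
    eta = n1 / n2
    cos_i = -sum(i * n for i, n in zip(incident, normal))
    sin2_t = eta * eta * (1.0 - cos_i * cos_i)
    if sin2_t > 1.0:
        return None  # total internal reflection: no transmitted ray
    cos_t = math.sqrt(1.0 - sin2_t)
    # Bend the incident direction toward (or away from) the normal.
    return tuple(eta * i + (eta * cos_i - cos_t) * n
                 for i, n in zip(incident, normal))
```

A ray entering glass (n ≈ 1.5) bends toward the surface normal; a steep ray trying to leave the glass gets no transmitted direction at all, which is the total internal reflection that makes glass objects like the chandelier sparkle.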

The surveillance station. On a wall in the game you see twelve screens, each showing a different location in the level. The player can use this to gain a tactical advantage. Have you ever seen something similar in a current game? Again – probably not.

2) How the demo runs. The images are rendered in a “cloud” of four servers with Intel’s Knights Ferry platform inside. You might have read (http://www.intel.com/pressroom/archive/releases/2010/20100531comp.htm) about this hardware before. It is a Many Integrated Core (MIC) architecture targeted at the High-Performance Computing market (meaning: not something an individual gamer would buy). As ray tracing is a highly parallel application, it benefits greatly from the many cores on a single chip on the Knights Ferry board. Once a chip in one of the servers has finished calculating a new frame for the game, it sends it over the network to a thin client, in this case a small laptop.
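Why does ray tracing scale so well across many cores? Each pixel’s primary ray can be traced independently of every other pixel, so a frame decomposes naturally into parallel chunks of work. A toy sketch of that decomposition (illustrative only – `trace_pixel` is a stand-in for a real tracer, and the demo obviously does not use Python):

```python
from concurrent.futures import ThreadPoolExecutor

WIDTH, HEIGHT = 64, 48  # toy resolution; the demo renders far larger frames

def trace_pixel(xy):
    # Stand-in for a real ray tracer: each pixel depends only on its own
    # primary ray, so any pixel (or tile of pixels) can run on any core.
    x, y = xy
    return (x * 255 // (WIDTH - 1), y * 255 // (HEIGHT - 1), 128)

def render_frame():
    # Threads here for brevity; the demo spreads tiles of pixels across
    # the many hardware cores of a Knights Ferry chip instead.
    coords = [(x, y) for y in range(HEIGHT) for x in range(WIDTH)]
    with ThreadPoolExecutor() as pool:
        return list(pool.map(trace_pixel, coords))
```

Because there is no shared state between pixels, adding cores (or whole boards, as in the four-server setup) speeds rendering up almost linearly.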

Rendering high-end graphics for applications such as gaming in the cloud is an emerging trend that has some interesting advantages. Rather than being constrained to running these high-end applications only on your desktop at home, you can in principle play on any computer – such as a system at your friend’s house or even lightweight systems like a netbook or tablet. In the future, it could also free up compute resources on your system to be used for voice and gesture recognition, increasing the level of immersion for the application.

More screenshots are available at http://www.wolfrt.de .

The original Wolfenstein game was made by id Software and Raven Software.

13 Responses to Wolfenstein gets ray traced – on your laptop!

  1. Meh says:

    Other games have portals, TV screens. I’m pretty sure Half Life 2 does. I doubt even they invented it. Though, I don’t know whether theirs could handle recursion.
    One thing: if this ray-traced game engine is not for consumers, i.e. is just a fun demo and not something to actually play games on, why talk about the ‘tactical advantages’ of reflections in your sniper scope?

  2. There are games with the “remote screen” ability; UT2004 uses this in some of the Double Domination levels, and I also think in some of the Bombing Run levels too.

  3. Codepwned says:

    While gaming in the cloud is an awesome concept, it’s not realistic for FPS due to network latency. Basically there is a greater lag between a keyboard/mouse action and the command being received. You can imagine this as moving your mouse and the character responding slightly afterwards, whereas on a local machine the response time is nil.
    Once that is fixed, this will be an extremely powerful delivery system for people who don’t want to buy hardware.

  4. Nemo1024 says:

    “On a wall in the game you see twelve screens, each showing a different location in the level. The player can use this to gain a tactical advantage. Have you ever seen something similar in a current game? Again – probably not.”
    Such in-game tactical real-time displays from other locations on the same level were available in Half-Life 2.

  5. WHAMMO! says:

    So the server actually renders the frames for each client and streams them back to them? What kind of bandwidth hit is that?

  6. Robert says:

    The “surveillance station” is nothing new (and doesn’t seem to be related to ray tracing); the Jedi Knight add-on Mysteries of the Sith did it over 12 years ago.

  7. Daniel Pohl says:

    Thanks for your comments. I should have been more precise in my description of what I want to highlight with the surveillance station. Yes, there have been games before that show a small number of such screens. Often there is only one screen and the view has to be changed by pressing a button; a large screen wall like in this example has, afaik, not appeared in any of today’s games. The technical reason is that doing so would lower the frame rate significantly.
    With ray tracing, by contrast, you pay only for the number of visible pixels of such a TV plus the cost of transforming the rays that hit its surface to the new position and continuing to trace them. In relation to the screenshot above, that means you get roughly the same performance numbers for all of these scenarios:
    · 12 different views on the 12 screens (like in the screenshot)
    · 1 view on a 12 times larger screen
    · 144 different views on screens that are only 1/12 of the size (more like the screen wall in “The Matrix”)
    For a fast game it is obviously easier to get a strategic overview of the level by looking at a wall of screens than by zapping through different channels on one screen.
    As you can see in the video, this method also allows recursion. That might be useful if you are looking at those screens and an enemy is trying to approach you from behind…
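    The re-basing step described above can be sketched roughly like this (all names are hypothetical – the engine code is not public):

```python
def xform_point(rot, trans, p):
    # Apply a 3x3 rotation plus a translation to a point.
    return tuple(sum(rot[i][j] * p[j] for j in range(3)) + trans[i]
                 for i in range(3))

def xform_dir(rot, d):
    # Directions rotate but do not translate.
    return tuple(sum(rot[i][j] * d[j] for j in range(3)) for i in range(3))

def portal_ray(hit_point, ray_dir, rot, trans):
    # A ray that hits a surveillance screen is re-expressed in the remote
    # camera's coordinate frame, and tracing simply continues there. You pay
    # one transform per ray that actually hits a screen, which is why the
    # cost scales with the visible pixels, not with the number of screens.
    return xform_point(rot, trans, hit_point), xform_dir(rot, ray_dir)
```

    In a real tracer the continued ray would go through the same recursive trace call (with a depth limit), which is what makes the screen-in-screen recursion in the video possible.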
    Regarding the bandwidth requirements: as we are mostly graphics researchers, we did NOT highly optimize our network code. We took the easy approach of just using Gigabit Ethernet for this project. The images we send are DXT1 compressed. Obviously this could be optimized much further. Companies like OnLive and Gaikai have made a business out of offering cloud-based games; according to Wikipedia, OnLive has a minimum requirement of 5 Mbit/s.
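    A quick back-of-envelope check, assuming a 1280×720 frame at 60 fps (the post does not state the actual resolution or frame rate, so these numbers are a guess for illustration):

```python
# Assumed numbers: the post mentions DXT1 compression, but the resolution
# and frame rate below are illustrative guesses, not stated facts.
width, height, fps = 1280, 720, 60
bits_per_pixel = 4  # DXT1 stores a 4x4 block of 16 pixels in 8 bytes
frame_bytes = width * height * bits_per_pixel // 8
mbit_per_s = frame_bytes * 8 * fps / 1e6
print(frame_bytes // 1024, "KiB/frame,", round(mbit_per_s), "Mbit/s")
# prints: 450 KiB/frame, 221 Mbit/s
```

    That fits comfortably into Gigabit Ethernet but is far above OnLive’s 5 Mbit/s, which is expected: DXT1 compresses each frame independently, whereas streaming services use inter-frame video codecs.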

  8. Snake says:

    Ignore the haters, this is awesome. It takes 4 servers today, but with the speed that technology advances, how long will it be before it only takes 1 server… then half a server… then one PC…

  9. Robin says:

    The light reflections on the surface of objects impressed me.
    It says that data is sent over the network to the laptop from the four servers equipped with Knights Ferry, not through any graphics outputs. But I see the Knights Ferry card has HDMI, DVI and DisplayPort connectors. Does it have the ability to output through those interfaces?

  10. Daniel Pohl says:

    This demo displays images on the client, while the Knights Ferry card accelerates computation on the server side.
    While the Knights Ferry prototype used in the demonstration does include a display connector for testing purposes, Intel MIC should be viewed as a co-processor, not a display device.

  11. Neeraj says:

    hi Daniel 🙂
    The screenshots are really impressive. I was trying to find some details on how exactly ray tracing has been embedded here. Has everything been rendered by pure ray tracing (which I don’t think is the case 🙂)? It seems like a hybrid ray-tracing + rasterization approach has been used. Am I correct? Can you elaborate? That would be great!
    Also, I was curious how it utilizes the CPU/GPU and how exactly the CPU–GPU load balancing was achieved.
    Thanks! And once again, really good work 🙂