Making “virtual” more real

Within the Intel labs we were shocked by the public reaction to our 80-core disclosure last spring. The interest level was astounding, but after the initial discussions (around core type, how the cores were arranged and interconnected, power vs. teraflops, and the like) a common question came up: “What would you do with 80 cores?” Sean raised this question in an earlier post and we received some great feedback from readers. I’d like to continue discussing some of the possibilities we are currently looking into.

Our simplest answer: there are a number of applications, at home and at work, that require this kind of compute density. They are highly parallel, their speedup scales very well with increasing core count, and you would use these apps today if the platform were available.

We have discussed tera-scale applications extensively in a variety of forums – after all, it is these apps that provide the motivation and context for scaling multi-core platforms. Clearly, multimedia-based apps that involve resolution enhancement, natural-language and contextual searches, and “automatic” video highlight extraction all benefit from parallel execution and would be used today if the hardware and software tools to create them were in place. In the business sector, real-time decision support for complex optimization models backed by large database searches is another compelling app that benefits from high levels of parallel execution and would have immediate application.

One I want to highlight here is physical modeling. Certainly a physically realistic game could consume every compute resource available, providing enjoyment to those of us finished with chores or a hard day of work. There is almost a visceral draw to games where momentum is conserved in collisions, and where smoke is generated not by an opaque, fuzzy “ball” but by a simulated flame with temperature gradients and the resulting, seemingly random buoyancy effects and smoke eddies that make watching a fire such a pleasurable experience. The average 10-year-old recognizes these realistic features immediately, even without formal training in Newton’s laws of mechanics. They experience life and the intricate, cascading interactions of bodies (such as when crashing their bikes!) and immediately gravitate toward games that provide a similar degree of realism.

Hollywood has completely embraced the utilization of realistic digital effects – imagine such realism occurring in real time in a game or even a movie that is not heavily scripted, but rather adapts and accepts changes you suggest midstream. You don’t like the direction the plot is taking – change it! The potential applications to virtual worlds like “Second Life” are obvious. Here’s an example of digital water done by Professor Ron Fedkiw’s team at Stanford.

More very realistic simulations can be seen at Professor Fedkiw’s site at Stanford. Note the large number of digital special effects he has contributed to major motion pictures. In fact, our CTO Justin Rattner plans to feature some of Ron’s Hollywood effects in his “virtual worlds” keynote at the Intel Developer Forum next Thursday (9/20).

We work closely with Ron and his grad students to parallelize and analyze the performance of these physics-modeling workloads. The reason is that physics is a truly general-purpose application and can achieve better performance on an array of CPUs than on a GPU. Check out this paper, which we published last month, showing that both movie and game physics workloads lend themselves very well to parallel execution and increasing core count.
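As a toy illustration of why these workloads scale with core count (this is my own sketch, not code from the paper above): many physics steps update each particle independently, so the work partitions cleanly across worker processes, one chunk per core.

```python
from multiprocessing import Pool

DT = 0.01        # time step in seconds
GRAVITY = -9.8   # constant acceleration on the y-axis

def step_particle(state):
    """Advance one particle ((x, y), (vx, vy)) by a single explicit
    Euler step under gravity. No particle depends on another, so
    these updates are embarrassingly parallel."""
    (x, y), (vx, vy) = state
    vy += GRAVITY * DT
    return (x + vx * DT, y + vy * DT), (vx, vy)

def step_all(particles, workers=4):
    """Distribute per-particle updates over a process pool; with more
    cores, more chunks of the particle set advance concurrently."""
    with Pool(workers) as pool:
        return pool.map(step_particle, particles)

if __name__ == "__main__":
    particles = [((0.0, 10.0), (1.0, 0.0)) for _ in range(1000)]
    particles = step_all(particles)
```

Real physics solvers also have coupled phases (collision resolution, constraint solving) that need communication between cores, which is why the on-die interconnect of an 80-core part matters as much as the core count.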

4 Responses to Making “virtual” more real

  1. Sam says:

    Great article on the tera-scale processor. What about a 7-way router for a stacked tera-scale processor? Maybe three layers deep (as long as it can be cooled). Could this be accomplished, one day?
    Very exciting.
    Very exciting.

  2. Tracy says:

    One significant use of such a processor is in the reconstruction of 3D or 4D medical images. Such computational power will be required to create dynamic radiotherapy solutions in the near future, and reconstruction is one of those great problems that can be distributed across as many cores as you can throw at it.

  3. Andrei I. Yafimau says:

    This is a somewhat belated comment on the common question “What would you do with 80 cores?”.
    More precisely, it proposes an answer to the more general question:
    “What would you do with a large, essentially virtual, set of processes and threads?”.
    This question becomes very simple if one takes into account the fact that, from a user’s
    point of view, any work on a computer is represented by virtual entities – processes and
    threads – rather than by hardware entities – cores and strands.
    The natural answer is to use the virtual set to simplify the programming of parallel
    systems of arbitrary size and complexity, just as virtual memory, invented more than
    40 years ago, is used to simplify the implementation of large sequential programs.
    A comprehensive approach to combining hardware features for direct hardware
    multiprogramming over a virtual set of processes and threads is proposed in the paper
    “Virtual-Threading: Advanced General Purpose Processors Architecture”.
    Paraphrasing the headline of this discussion, one can summarize that, in essence, the
    proposed virtual-threading architecture makes “virtual” the only real.
    I hope this comment may revitalize the discussion of program concurrency support in hardware.