Do you want to play your high-quality PC game, with all its amazing graphics, on a mobile device while sitting on the couch in the living room? Do you want a fully wireless, mobile virtual reality experience? While smartphones and tablets have become very fast in recent years, there is still a big performance gap compared to high-end PCs. In a home or office environment, this gap can be bridged by so-called in-home streaming: the content is rendered on a powerful machine and the resulting image is transferred over a local (wired or wireless) network to another device, in our case a smartphone. Touch inputs and other user controls from the phone are transferred back to the server for processing. For this approach to work, it is imperative that the latency is low and the image quality is high. Given both, the offloaded calculations are no longer noticeable and it feels like using any other local app on the phone.
In our research at the Intel Visual Computing Institute in Saarbrücken, Germany, we are using real-time ETC1 texture compression in combination with a distributed rendering architecture to fully leverage recent progress in wireless computer networking standards (IEEE 802.11ac) for mobile devices. We achieve much higher image quality at half the latency compared to other in-home streaming solutions.
The full paper, “High quality, low latency in-home streaming of multimedia applications for mobile devices” by Daniel Pohl, Stefan Nickels, Ram Nalla and Oliver Grau, will be presented at FedCSIS 2014. It can be downloaded here.
We designed our approach from the beginning in a way that not only one, but many machines can contribute to rendering the image. In our case we have the following setup.
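To illustrate how several machines can share the rendering work, here is a minimal sketch (not code from our implementation) of one common partitioning scheme: splitting the frame into horizontal strips, one per render server.

```python
# Hypothetical sketch: splitting a frame into horizontal strips so that
# several render servers can each produce one part of the image in parallel.

def split_into_strips(width, height, num_servers):
    """Return a (y_offset, strip_height) pair per server.

    The last strip absorbs any remainder when the height does not divide
    evenly among the servers.
    """
    base = height // num_servers
    strips = []
    y = 0
    for i in range(num_servers):
        h = base if i < num_servers - 1 else height - y
        strips.append((y, h))
        y += h
    return strips
```

For a 1920x1080 frame and four servers, each server renders a 1920x270 strip; the client only has to reassemble the strips in order.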
On the software side we have a client / server architecture with the following tasks.
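The task split between client and server can be sketched as a simple request/response loop. In this sketch, `render_frame` and `compress_etc1` are placeholders standing in for the real renderer and the real-time ETC1 encoder, and an in-process socket pair stands in for the local network.

```python
# Minimal sketch of the client/server task split. render_frame and
# compress_etc1 are placeholders, not the actual implementation.
import socket
import struct

def send_msg(sock, payload):
    # length-prefixed framing so messages survive TCP stream boundaries
    sock.sendall(struct.pack("!I", len(payload)) + payload)

def recv_msg(sock):
    header = b""
    while len(header) < 4:
        header += sock.recv(4 - len(header))
    (length,) = struct.unpack("!I", header)
    data = b""
    while len(data) < length:
        data += sock.recv(length - len(data))
    return data

def render_frame(input_event):      # placeholder for the real renderer
    return b"frame-for-" + input_event

def compress_etc1(frame):           # placeholder for real-time ETC1 encoding
    return frame[::-1]

# one round trip over an in-process socket pair, standing in for the LAN
client, server = socket.socketpair()
send_msg(client, b"touch:120,240")                    # client: send input event
event = recv_msg(server)                              # server: receive input
send_msg(server, compress_etc1(render_frame(event)))  # server: render, compress, send
compressed = recv_msg(client)                         # client: would upload as ETC1 texture
```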
One big difference to other in-home streaming approaches is that we use intra-frame compression: every frame is encoded independently of the others. We use the ETC1 texture compression format, as ETC1 textures can be displayed natively on smartphones without a manual decompression step. Please refer to the full paper for a broader comparison with approaches using H.264 or MJPEG.
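To give an idea of how ETC1 works, here is a heavily simplified sketch of the principle (illustrative only; it uses a single modifier table and does not reproduce the real bit layout). ETC1 splits each 4x4 pixel block into two sub-blocks of 8 pixels; each sub-block stores one RGB base color plus a 2-bit index per pixel into a small table of intensity modifiers.

```python
# Simplified sketch of ETC1-style sub-block compression (illustrative only,
# not the real ETC1 bit layout or encoder).

MODIFIERS = (-17, -5, 5, 17)  # one of ETC1's modifier tables

def encode_subblock(pixels):
    """pixels: list of 8 (r, g, b) tuples -> (base_color, per-pixel indices)."""
    n = len(pixels)
    base = tuple(sum(p[i] for p in pixels) // n for i in range(3))
    base_luma = sum(base) / 3
    indices = []
    for (r, g, b) in pixels:
        diff = (r + g + b) / 3 - base_luma
        # pick the modifier closest to this pixel's deviation from the base
        indices.append(min(range(4), key=lambda i: abs(MODIFIERS[i] - diff)))
    return base, indices

def decode_subblock(base, indices):
    """Reconstruct the 8 pixels: base color shifted by the chosen modifier."""
    return [tuple(max(0, min(255, c + MODIFIERS[i])) for c in base)
            for i in indices]
```

This fixed-rate scheme (4 bits per pixel) is what allows the GPU to sample the compressed data directly, so the phone never has to decompress the frame in software.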
We compared the achieved image quality with other approaches in a scenario with heavy camera movements.
In terms of image quality we noticeably surpass the other in-home streaming approaches. H.264, given enough bit-rate, generates even better images. However, as we will show below, the ETC1 approach has significantly lower latency.
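Image quality comparisons like the one above are commonly quantified with PSNR (peak signal-to-noise ratio) between the original and the received frame. A minimal sketch of the metric:

```python
# Sketch: PSNR, a common metric for comparing a streamed frame against the
# original (higher is better; identical images give infinity).
import math

def psnr(original, received, max_val=255):
    """original, received: equal-length sequences of pixel values in 0..max_val."""
    mse = sum((a - b) ** 2 for a, b in zip(original, received)) / len(original)
    if mse == 0:
        return float("inf")
    return 10 * math.log10(max_val ** 2 / mse)
```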
In terms of latency we made sure to always measure the full pipeline, from the start of an input, like touching the screen, to the moment a change can be seen on the screen. This is referred to as the motion-to-photons time. Using our approach led to a motion-to-photons latency of 60 to 80 ms. On Nvidia Shield, which uses H.264 video streaming, we measured 120 to 140 ms. The Splashtop streaming solution, also relying on H.264, shows 330 to 360 ms of lag.
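Timestamps in software can only capture part of this pipeline (network, rendering and compression), not the touch-sampling and display latency, which is why motion-to-photons time has to be measured externally. Still, a software round-trip measurement is a useful sketch for profiling the streaming stack itself:

```python
# Sketch: measuring the software portion of the pipeline with timestamps.
# The full motion-to-photons time additionally includes touch-sampling and
# display latency, which software timestamps cannot see.
import time

def measure_round_trip(process_input, frames=100):
    """process_input: callable standing in for send -> render -> receive.

    Returns the median round-trip time in milliseconds.
    """
    samples = []
    for _ in range(frames):
        t0 = time.perf_counter()
        process_input(b"touch-event")
        samples.append((time.perf_counter() - t0) * 1000.0)
    samples.sort()
    return samples[len(samples) // 2]
```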
Here we show a video of our research prototype:
There are various application scenarios for low latency, high image quality in-home streaming, ranging from entertainment such as high-quality gaming to HPC and Big Data visualizations. Imagine modifying your highly complex molecule, calculated by 32 servers, in real time from your smartphone.
Another interesting area is virtual reality on smartphones. Projects like FOV2GO and Durovis Dive have developed smartphone cases with wide-angle lenses attached. Once such a case is strapped to the user's head, mobile virtual reality can be experienced. For a good quality of experience, high-quality stereo images need to be rendered with pre-warped optical distortion compensation to cancel out the spatial and chromatic distortions of the lenses. While this works well on desktop PCs, the performance and quality that smartphones can achieve today are not very compelling for virtual reality. To achieve higher image quality, these applications have to switch from local to server-based rendering. As latency is an even more critical issue in virtual reality, our latency-optimized approach is particularly suitable for this scenario. Our motion-to-photons latency of 60 to 80 ms is still not ideal, but a big step forward compared to other in-home streaming solutions.
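The spatial part of such pre-warping is typically a radial "barrel" pre-distortion that cancels the pincushion distortion of the lenses. A minimal sketch, where the polynomial coefficients are made-up example values rather than figures from our paper:

```python
# Sketch of a radial pre-warp that counteracts wide-angle lens distortion.
# K1 and K2 are hypothetical example coefficients, not values from the paper.

K1, K2 = 0.22, 0.24  # assumed lens distortion coefficients

def prewarp(x, y):
    """Map a normalized screen coordinate (origin at the lens center) to the
    source coordinate to sample, scaled by 1 + K1*r^2 + K2*r^4."""
    r2 = x * x + y * y
    scale = 1.0 + K1 * r2 + K2 * r2 * r2
    return x * scale, y * scale
```

Chromatic distortion compensation works the same way, but with slightly different coefficients per color channel.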
We have shown that our new in-home streaming approach using ETC1 texture compression creates an experience with much higher image quality at half the latency of other in-home streaming approaches.
If you are interested in this topic, you might find me at GDC Europe (August 11-13, 2014) or gamescom. Also, please feel free to write your thoughts in the comment section.
Download full paper.
Screenshots and figures are under Creative Commons Attribution 4.0 International License.