We divide the scene geometry using triangle footprints to gather shading via Oversampling (blue) and L-packing (green) into a shading atlas. With this data, we construct near-ground-truth novel views in under 1 ms on a desktop and at a full 60 FPS on a smartphone.
Presenting high-fidelity 3D content on compact portable devices with low computational power is challenging. Smartphones, tablets and head-mounted displays (HMDs) suffer from thermal and battery-life constraints and thus cannot match the render quality of desktop PCs and laptops. Streaming rendering can deliver high-quality content but may suffer from high latency. We propose an approach that efficiently captures shading samples in object space and packs them into a texture. Streaming this texture to the client, we support temporal frame up-sampling with high fidelity, low latency and high mobility. We introduce two novel sample distribution strategies and a novel triangle representation in the shading atlas space. Since such a system requires dynamic parallelism, we propose an implementation that exploits the power of hardware-accelerated tessellation stages. Our approach allows fast decoding and rendering of extrapolated views on a client device by using hardware-accelerated interpolation between shading samples and a set of potentially visible geometry. A comparison to existing shading methods shows that our sample distributions yield better client shading quality than previous atlas streaming approaches and outperform image-based methods in all relevant aspects.
Jozef Hladky, Hans-Peter Seidel, Markus Steinberger
Tessellated Shading Streaming
To appear in: Computer Graphics Forum (Proc. Eurographics Symposium on Rendering 2019)
@article{Hladky2019_TSS,
author = {Jozef Hladky and Hans-Peter Seidel and Markus Steinberger},
title = {Tessellated Shading Streaming},
journal = {Computer Graphics Forum (Proc. Eurographics Symposium on Rendering 2019)},
year = {2019},
ISSN = {1467-8659},
DOI = {10.1111/cgf.13780}
}