PanoVerse: automatic generation of stereoscopic environments from single indoor panoramic images for Metaverse applications
Giovanni Pintore, Alberto Jaspe-Villanueva, Markus Hadwiger, Enrico Gobbetti, Jens Schneider, and Marco Agus
Web3D 2023 - 28th International ACM Conference on 3D Web Technology
Honorable Mention Award
Paper PDF · BibTeX · DOI · VCC Vis Website · CRS4 Website
@inproceedings{Pintore:2023:PAG,
  author    = {Giovanni Pintore and Alberto {Jaspe Villanueva} and Markus Hadwiger and Enrico Gobbetti and Jens Schneider and Marco Agus},
  title     = {PanoVerse: automatic generation of stereoscopic environments from single indoor panoramic images for Metaverse applications},
  booktitle = {Proc. Web3D 2023 - 28th International ACM Conference on 3D Web Technology},
  month     = {October},
  year      = {2023},
  doi       = {10.1145/3611314.3615914},
  note      = {To appear},
  url       = {http://vic.crs4.it/vic/cgi-bin/bib-page.cgi?id='Pintore:2023:PAG'},
}
We present a novel framework, dubbed PanoVerse, for the automatic creation and presentation of immersive stereoscopic environments from a single indoor panoramic image. Once per 360° shot, a novel data-driven architecture generates a fixed set of panoramic stereo pairs distributed around the current central view-point. Once per frame, directly on the HMD, we rapidly fuse the precomputed views to seamlessly cover the exploration workspace. To realize this system, we introduce several novel techniques that combine and extend state-of-the-art data-driven methods. In particular, we present a gated architecture for panoramic monocular depth estimation and, starting from the re-projection of visible pixels based on predicted depth, we exploit the same gated architecture for inpainting the occluded and disoccluded areas, introducing a mixed GAN with self-supervised loss to evaluate the stereoscopic consistency of the generated images. At interactive rates, we interpolate precomputed panoramas to produce photorealistic stereoscopic views in a lightweight WebXR viewer. The system works on a variety of available VR headsets and can serve as a base component for Metaverse applications. We demonstrate our technology on several indoor scenes from publicly available data.
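As a rough illustration of the geometric step the abstract refers to (the re-projection of visible pixels based on predicted depth, whose holes are then filled by the inpainting network), the following NumPy sketch warps an equirectangular panorama to a displaced viewpoint and marks the disoccluded regions. The function name and interface are illustrative assumptions, not part of the paper's released code, and the learned components (gated depth estimator, inpainting GAN) are not reproduced here.

```python
# Minimal sketch of depth-based re-projection of an equirectangular panorama
# to a translated viewpoint. Holes in the returned mask correspond to the
# occluded/disoccluded areas that a separate inpainting stage would fill.
import numpy as np

def reproject_panorama(rgb, depth, baseline):
    """rgb: (H, W, 3) equirectangular image; depth: (H, W) metric depth;
    baseline: (3,) translation of the new viewpoint.
    Returns the warped image and a validity mask (holes = False)."""
    H, W, _ = rgb.shape
    v, u = np.meshgrid(np.arange(H), np.arange(W), indexing="ij")
    # Pixel -> spherical direction (longitude/latitude parameterization).
    lon = (u + 0.5) / W * 2.0 * np.pi - np.pi
    lat = np.pi / 2.0 - (v + 0.5) / H * np.pi
    dirs = np.stack([np.cos(lat) * np.sin(lon),
                     np.sin(lat),
                     np.cos(lat) * np.cos(lon)], axis=-1)
    # Back-project to 3D, shift to the new viewpoint, re-project to the sphere.
    pts = dirs * depth[..., None] - np.asarray(baseline)
    r = np.linalg.norm(pts, axis=-1)
    lon2 = np.arctan2(pts[..., 0], pts[..., 2])
    lat2 = np.arcsin(np.clip(pts[..., 1] / np.maximum(r, 1e-6), -1.0, 1.0))
    u2 = np.clip(((lon2 + np.pi) / (2.0 * np.pi) * W).astype(int), 0, W - 1)
    v2 = np.clip(((np.pi / 2.0 - lat2) / np.pi * H).astype(int), 0, H - 1)
    # Splat far-to-near so that nearer pixels overwrite farther ones.
    order = np.argsort(-r, axis=None)
    vv, uu = np.unravel_index(order, (H, W))
    out = np.zeros_like(rgb)
    mask = np.zeros((H, W), dtype=bool)
    out[v2[vv, uu], u2[vv, uu]] = rgb[vv, uu]
    mask[v2[vv, uu], u2[vv, uu]] = True
    return out, mask
```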
Please see the extended version of this paper in the Computers & Graphics journal: "Deep synthesis and exploration of omnidirectional stereoscopic environments from a single surround-view panoramic image". It includes a live demo and several improvements over this work.