Conference paper, 2020

TOWARD TEXTURING FOR IMMERSIVE MODELING OF ENVIRONMENT RECONSTRUCTED FROM 360 MULTI-CAMERA

Maxime Lhuillier

Abstract

The computation of a textured 3D model of a scene using a camera has three steps: acquisition, reconstruction and texturing. Texturing is important for visualization applications since it removes visual artifacts due to inaccuracies of the reconstruction, varying photometric parameters of the camera, and non-Lambertian scenes. This paper presents the first texturing pipeline for an uncommon but important case: the reconstruction of immersive 3D models of complete environments from images taken by a 360 multi-camera moving on the ground. We make several contributions: sky texturing (not done in previous work), estimation of gain and bias corrections, and seam leveling. All methods are designed to deal with ordered sequences of thousands of keyframes. In the experiments, we start from videos captured while biking for 25 minutes on a campus using a helmet-mounted Garmin Virb 360.
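The abstract mentions the estimation of gain and bias corrections to compensate for the camera's varying photometric parameters. The sketch below is a minimal illustration of one common formulation of this idea, not the paper's actual method: it assumes an affine correction g_i * I + b_i per keyframe, estimated by linear least squares from mean intensities of the overlap between keyframe pairs; the function name, its inputs, and the anchoring strategy are assumptions made for illustration only.

```python
import numpy as np

def estimate_gain_bias(pairs, means_src, means_dst, num_frames, anchor=0):
    """Least-squares estimate of per-keyframe gain g_i and bias b_i so that
    g_i * m_ij + b_i ~= g_j * m_ji + b_j for overlapping keyframes (i, j).

    pairs     : list of (i, j) keyframe index pairs that overlap
    means_src : mean intensity of the overlap as seen from keyframe i
    means_dst : mean intensity of the same overlap as seen from keyframe j
    anchor    : keyframe whose correction is fixed to identity (g=1, b=0)
                to remove the global scale/offset ambiguity
    """
    n = num_frames
    rows, rhs = [], []
    for (i, j), mi, mj in zip(pairs, means_src, means_dst):
        r = np.zeros(2 * n)
        r[i], r[n + i] = mi, 1.0      # g_i * mi + b_i ...
        r[j], r[n + j] = -mj, -1.0    # ... minus (g_j * mj + b_j) should be 0
        rows.append(r)
        rhs.append(0.0)
    # Soft constraints fixing the anchor keyframe to the identity correction.
    for idx, val in ((anchor, 1.0), (n + anchor, 0.0)):
        r = np.zeros(2 * n)
        r[idx] = 1.0
        rows.append(r)
        rhs.append(val)
    A, b = np.vstack(rows), np.array(rhs)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x[:n], x[n:]               # gains, biases
```

Solving all keyframes jointly (rather than chaining pairwise corrections) keeps the corrections consistent over a long ordered sequence, which matters when the sequence contains thousands of keyframes as in the paper's setting.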
Main file
pIC3D20.pdf (2.11 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-03206040, version 1 (22-04-2021)


Cite

Maxime Lhuillier. TOWARD TEXTURING FOR IMMERSIVE MODELING OF ENVIRONMENT RECONSTRUCTED FROM 360 MULTI-CAMERA. 2020 International Conference on 3D Immersion (IC3D), Dec 2020, Brussels (virtual), Belgium. ⟨10.1109/IC3D51119.2020.9376323⟩. ⟨hal-03206040⟩

