Single-Image Mesh Reconstruction and Pose Estimation via Generative Normal Map

Nan Xiang, Li Wang, Tao Jiang, Yanran Li, Xiaosong Yang, Jianjun Zhang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We present a unified learning framework for recovering both the 3D mesh and the camera pose of an object from a single image. Our approach learns to recover the outer shape and surface geometric details of the mesh without relying on 3D supervision. We adopt multi-view normal maps as 2D supervision so that both silhouette and geometric-detail information can be transferred to the neural network. A normal-mismatch-based objective function is introduced to train the network, and the camera pose is parameterized within this objective, integrating pose estimation and mesh reconstruction into a single optimization procedure. We demonstrate the ability of the proposed approach to generate 3D meshes and estimate camera poses through qualitative and quantitative experiments.
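To make the abstract's central idea concrete, the following is a minimal sketch of a normal-mismatch objective: it penalizes the per-pixel angular disagreement between a rendered normal map and a target (supervision) normal map. This is an illustrative assumption about the loss form, not the paper's actual implementation; the function name and mask parameter are hypothetical, and in the paper's setting the loss would be written in an autodiff framework so that gradients flow back to both mesh vertices and the parameterized camera pose.

```python
import numpy as np

def normal_mismatch_loss(pred_normals, target_normals, mask=None):
    """Per-pixel angular mismatch between two normal maps.

    Both inputs are (H, W, 3) arrays of surface normals. The loss is
    1 - cos(angle) per pixel: 0 when normals align, 2 when opposite.
    `mask` (H, W) optionally restricts the loss to silhouette pixels.
    (Hypothetical sketch; the paper's exact objective may differ.)
    """
    eps = 1e-8  # avoid division by zero for degenerate normals
    pred = pred_normals / (np.linalg.norm(pred_normals, axis=-1, keepdims=True) + eps)
    tgt = target_normals / (np.linalg.norm(target_normals, axis=-1, keepdims=True) + eps)
    mismatch = 1.0 - np.sum(pred * tgt, axis=-1)
    if mask is not None:
        return float(np.sum(mismatch * mask) / max(np.sum(mask), 1))
    return float(np.mean(mismatch))
```

Because the camera pose is parameterized inside the objective, minimizing this mismatch over multiple views would update shape and pose jointly in one optimization procedure, as the abstract describes.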
Original language: English
Title of host publication: Proceedings of the 32nd International Conference on Computer Animation and Social Agents
Place of Publication: New York, NY, USA
Publisher: Association for Computing Machinery, Inc
Pages: 79–84
Number of pages: 6
ISBN (Print): 9781450371599
Publication status: Published - 1 Jul 2019
Event: The 32nd International Conference on Computer Animation and Social Agents 2019 - Paris, France
Duration: 1 Jul 2019 – 3 Jul 2019
Conference number: 32
https://casa2019.sciencesconf.org/

Publication series

Name: CASA '19
Publisher: Association for Computing Machinery

Conference

Conference: The 32nd International Conference on Computer Animation and Social Agents 2019
Abbreviated title: CASA 2019
Country/Territory: France
City: Paris
Period: 1/07/19 – 3/07/19
Internet address: https://casa2019.sciencesconf.org/

Keywords

  • deep learning
  • pose estimation
  • mesh reconstruction
