Robust Flow-Guided Neural Prediction for Sketch-Based Freeform Surface Modeling

Changjian Li, Hao Pan, Yang Liu, Xin Tong, Alla Sheffer, Wenping Wang

Research output: Contribution to journal › Article › peer-review

Abstract / Description of output

Sketching provides an intuitive user interface for communicating freeform shapes. While human observers can easily envision the shapes they intend to communicate, replicating this process algorithmically requires resolving numerous ambiguities. Existing sketch-based modeling methods resolve these ambiguities either by relying on expensive user annotations or by restricting the modeled shapes to specific narrow categories. We present an approach for modeling generic freeform 3D surfaces from sparse, expressive 2D sketches that overcomes both limitations by incorporating convolutional neural networks (CNNs) into the sketch processing workflow. Given a 2D sketch of a 3D surface, we use CNNs to infer the depth and normal maps representing the surface. To combat ambiguity we introduce an intermediate CNN layer that models the dense curvature direction, or flow, field of the surface, and produce a confidence map as an additional output alongside depth and normal. The flow field guides our subsequent surface reconstruction for improved regularity; the confidence map, trained without supervision, measures ambiguity and provides a robust estimator for data fitting. To reduce ambiguities in input sketches, users can refine their input by providing optional depth values at sparse points and curvature hints for strokes. Our CNN is trained on a large dataset generated by rendering sketches of various 3D shapes using a non-photorealistic (NPR) line rendering method that mimics human sketching of freeform shapes. We use the CNN model to process both single- and multi-view sketches. Using our multi-view framework, users progressively complete the shape by sketching in different views, generating complete closed shapes. For each new view, the modeling is assisted by partial sketches and depth cues provided by surfaces generated in earlier views.
The partial surfaces are fused into a complete shape using predicted confidence levels as weights. We validate our approach, compare it with previous methods and alternative structures, and evaluate its performance on various modeling tasks. The results demonstrate that our method offers a new approach for efficiently modeling freeform shapes from succinct but expressive 2D sketches.
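The confidence-weighted fusion described above can be sketched as a simple per-pixel weighted average of the per-view depth predictions. This is an illustrative sketch only, not the paper's implementation; the function name and the assumption that confidences are non-negative per-pixel weights are ours.

```python
import numpy as np

def fuse_depth_maps(depths, confidences, eps=1e-8):
    """Fuse per-view depth predictions into one map, weighting each
    view's depth by its predicted confidence (illustrative sketch).

    depths, confidences: lists of (H, W) arrays, one per view;
    confidences are assumed non-negative.
    """
    d = np.stack(depths)        # (V, H, W)
    w = np.stack(confidences)   # (V, H, W)
    total = w.sum(axis=0)
    # Weighted average per pixel; eps guards pixels no view covers.
    return (w * d).sum(axis=0) / np.maximum(total, eps)

# Example: a high-confidence view dominates the fused result.
view_a = np.full((2, 2), 1.0)
view_b = np.full((2, 2), 3.0)
fused = fuse_depth_maps([view_a, view_b],
                        [np.full((2, 2), 1.0), np.full((2, 2), 3.0)])
# Each pixel: (1*1 + 3*3) / (1 + 3) = 2.5
```

In the paper's multi-view pipeline the weights come from the unsupervised confidence map, so ambiguous regions of an earlier view contribute less to the final fused shape.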
Original language: English
Article number: 238
Number of pages: 12
Journal: ACM Transactions on Graphics
Volume: 37
Issue number: 6
DOIs
Publication status: Published - 4 Dec 2018
Event: The 11th ACM SIGGRAPH Conference and Exhibition on Computer Graphics and Interactive Techniques in Asia, 2018 - Tokyo, Japan
Duration: 4 Dec 2018 - 7 Dec 2018
Conference number: 11
https://sa2018.siggraph.org/en.htm

Keywords

  • freeform surface
  • multiple view
  • robust statistics
  • direction field
  • sketch
  • convolutional neural network
