Symmetry-driven 3D Reconstruction from Concept Sketches

Felix Hähnlein Université Côte d'Azur, Inria
Yulia Gryaditskaya University of Surrey, Surrey Institute for People Centred AI and Centre for Vision, Speech and Signal Processing (CVSSP)
Alla Sheffer University of British Columbia
Adrien Bousseau Université Côte d'Azur, Inria
SIGGRAPH North America 2022

[Paper]

Concept sketches are dominated by symmetric strokes, both in the shapes they depict and in the construction lines employed to draw these shapes in perspective (a). Our algorithm first decomposes the sketch into locally-symmetric groups of strokes (b), then identifies pairs of strokes that are symmetric with respect to triplets of axis-aligned planes (c). At its core, our method selects the stroke correspondences that result in the most symmetric and well-connected shape (d). We only show a subset of the symmetry planes and correspondences for illustration purposes.

Abstract

Concept sketches, ubiquitously used in industrial design, are inherently imprecise yet highly effective at communicating 3D shape to human observers. We present a new symmetry-driven algorithm for recovering designer-intended 3D geometry from concept sketches. We observe that most concept sketches of human-made shapes are structured around locally symmetric building blocks, defined by triplets of orthogonal symmetry planes. We identify potential building blocks using a combination of 2D symmetries and drawing order. We reconstruct each such building block by leveraging a combination of perceptual cues and observations about designer drawing choices. We cast this reconstruction as an integer programming problem where we seek to identify, among the large set of candidate symmetry correspondences formed by approximate pen strokes, the subset that results in the most symmetric and well-connected shape. We demonstrate the robustness of our approach by reconstructing 82 sketches, which exhibit significant over-sketching, inaccurate perspective, partial symmetry, and other imperfections. In a comparative study, participants judged our results as superior to the state-of-the-art by a ratio of 2:1.
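The core selection step described above can be viewed as a 0/1 optimization: among many candidate symmetry correspondences between strokes, pick the subset that maximizes a score while respecting matching constraints. The toy sketch below illustrates this idea with a brute-force solver; the stroke ids, scores, and the one-match-per-stroke constraint are simplified stand-ins, not the paper's actual integer-programming formulation.

```python
from itertools import combinations

# Hypothetical candidate correspondences: each entry pairs two stroke ids
# and carries a score standing in for symmetry fidelity and connectivity.
candidates = [
    ("s1", "s2", 0.9),
    ("s1", "s3", 0.4),
    ("s2", "s4", 0.7),
    ("s3", "s4", 0.8),
]

def best_correspondences(candidates):
    """Exhaustively solve the 0/1 selection problem: choose the subset of
    candidate pairs with maximal total score such that each stroke is
    matched at most once (a simplified stand-in for the integer program)."""
    best_score, best_subset = 0.0, ()
    for r in range(len(candidates) + 1):
        for subset in combinations(candidates, r):
            used = [s for a, b, _ in subset for s in (a, b)]
            if len(used) != len(set(used)):
                continue  # skip: a stroke would be matched twice
            score = sum(w for _, _, w in subset)
            if score > best_score:
                best_score, best_subset = score, subset
    return best_subset, best_score
```

On this toy input, the greedy-looking pair (s1, s2) combines with (s3, s4) for a total score of 1.7, beating the alternative matching (s1, s3) plus (s2, s4). Real instances are far larger, which is why the paper resorts to integer programming rather than enumeration.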

Code

The code for the paper: [Coming soon]

Data