3D Textile Reconstruction based on KinectFusion and Synthesized Texture

Pengpeng Hu, Taku Komura, Duan Li, Ge Wu, Yueqi Zhong

Research output: Contribution to journal › Article › peer-review

Abstract

The 3D textile model plays an important role in textile engineering. A 3D textile model consists mainly of the 3D geometric shape and the texture. Depth cameras such as the Microsoft Kinect are much cheaper than conventional 3D scanning devices. However, little work has addressed 3D textile reconstruction based on depth cameras, and the texture captured by the photographic methods used in 3D scanning is also limited. This paper presents a novel framework for reconstructing a 3D textile model with synthesized texture. First, a pipeline for 3D textile reconstruction based on KinectFusion is proposed to obtain a better 3D model. Second, a convolutional neural network (CNN) is used to synthesize the textile texture. Experimental results show that our method can conveniently obtain 3D textile models with synthesized texture.
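
The record contains no code; as a rough illustration of the KinectFusion-style reconstruction step described in the abstract, the sketch below fuses a sequence of RGB-D frames into a TSDF volume and extracts a triangle mesh using Open3D. The frame paths, camera intrinsics, and per-frame poses are placeholders and are not taken from the authors' pipeline.

# Minimal sketch of a KinectFusion-style reconstruction step using Open3D's
# TSDF integration. Frame paths, intrinsics, and camera poses are placeholders;
# the paper's actual pipeline may differ.
import numpy as np
import open3d as o3d

# Kinect-like intrinsics (PrimeSense defaults shipped with Open3D).
intrinsic = o3d.camera.PinholeCameraIntrinsic(
    o3d.camera.PinholeCameraIntrinsicParameters.PrimeSenseDefault)

# Scalable TSDF volume: each depth frame is fused into this voxel grid.
volume = o3d.pipelines.integration.ScalableTSDFVolume(
    voxel_length=4.0 / 512.0,   # ~8 mm voxels
    sdf_trunc=0.04,             # truncation distance in metres
    color_type=o3d.pipelines.integration.TSDFVolumeColorType.RGB8)

# Assume per-frame camera poses (camera-to-world) have already been estimated,
# e.g. by KinectFusion's ICP tracking; identity matrices are stand-ins here.
poses = [np.eye(4) for _ in range(10)]

for i, pose in enumerate(poses):
    color = o3d.io.read_image(f"frames/color_{i:04d}.png")  # placeholder paths
    depth = o3d.io.read_image(f"frames/depth_{i:04d}.png")
    rgbd = o3d.geometry.RGBDImage.create_from_color_and_depth(
        color, depth, depth_scale=1000.0, depth_trunc=3.0,
        convert_rgb_to_intensity=False)
    # integrate() expects the world-to-camera extrinsic.
    volume.integrate(rgbd, intrinsic, np.linalg.inv(pose))

# Extract the fused surface as a triangle mesh, ready for texturing.
mesh = volume.extract_triangle_mesh()
mesh.compute_vertex_normals()
o3d.io.write_triangle_mesh("textile_mesh.ply", mesh)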
Original language: English
Pages (from-to): 355-364
Number of pages: 12
Journal: International Journal of Clothing Science and Technology
Volume: 108
Early online date: 14 Sept 2017
Publication status: E-pub ahead of print - 14 Sept 2017
