BTF data generation based on deep learning

Xiaohua Zhang, Junyu Dong, Yanhai Gan, Hui Yu, Lin Qi

Research output: Contribution to journal › Article › peer-review



Many applications, such as computer-aided design and game rendering, need to reproduce realistic material appearance under complex lighting environments and varying viewing conditions. The authenticity of a three-dimensional object or scene depends heavily on the simulation of textures, where Bidirectional Texture Function (BTF) data plays an essential role. Research on BTF has focused on data acquisition, compression and modeling. In this paper, we propose a deep convolutional generative adversarial network (DCGAN) to learn the appearance of the BTF and predict new BTF data under novel conditions. We use the illumination direction, viewing direction and material type as conditional constraints to train the network. The proposed method was tested on a public BTF dataset and shown to reduce data storage cost while producing satisfactory synthetic results.
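The abstract describes conditioning the generator on the illumination direction, viewing direction and material type. The paper's own implementation details are not given here, so the following is only an illustrative sketch of how such a conditioning vector might be assembled for a conditional DCGAN generator; the function name, noise dimension and material count are assumptions, not taken from the paper.

```python
import numpy as np

def make_generator_input(light_dir, view_dir, material_id,
                         num_materials=7, noise_dim=100, rng=None):
    """Build a conditioned latent vector for a conditional-GAN generator.

    light_dir, view_dir: 3-vectors giving the illumination and viewing
    directions. material_id: integer index of the material type, encoded
    one-hot. The conditions are concatenated with Gaussian noise, mirroring
    the conditional-constraint setup described in the abstract. All names
    and dimensions here are hypothetical choices for illustration.
    """
    rng = rng or np.random.default_rng()
    light = np.asarray(light_dir, dtype=np.float32)
    view = np.asarray(view_dir, dtype=np.float32)
    one_hot = np.zeros(num_materials, dtype=np.float32)
    one_hot[material_id] = 1.0
    noise = rng.standard_normal(noise_dim).astype(np.float32)
    # Layout: [noise | light (3) | view (3) | material one-hot]
    return np.concatenate([noise, light, view, one_hot])

# Example: overhead light, oblique view, material index 2
z = make_generator_input([0, 0, 1], [0.5, 0, 0.866], material_id=2)
print(z.shape)  # (113,) = 100 noise + 3 + 3 + 7 one-hot
```

At inference time, varying only the light/view components of `z` while keeping the noise fixed would yield BTF predictions for novel conditions, which is what allows the learned model to stand in for the full stored dataset.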
Original language: English
Pages (from-to): 233-239
Number of pages: 7
Journal: Procedia Computer Science
Publication status: Published - 6 Feb 2019
Event: International Conference on Identification, Information and Knowledge in the Internet of Things - Beijing, China
Duration: 19 Oct 2018 – 21 Oct 2018


