
BTF data generation based on deep learning

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Many applications, such as computer-aided design and game rendering, need to reproduce realistic material appearance under complex lighting environments and varying viewing conditions. The realism of a three-dimensional object or scene depends heavily on the simulation of textures, in which Bidirectional Texture Function (BTF) data plays an essential role. Research on BTF has focused on data acquisition, compression, and modeling. In this paper, we propose a deep convolutional generative adversarial network (DCGAN) that learns the appearance of the BTF in order to predict new BTF data under novel conditions. We use the illumination direction, viewing direction, and material type as conditional constraints to train the network. The proposed method was tested on a public BTF dataset; it reduces the data storage cost and produces satisfactory synthetic results.
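The abstract describes conditioning a DCGAN generator on illumination direction, viewing direction, and material type. A minimal sketch of such a conditional generator, assuming PyTorch, is shown below; the layer sizes, the 7-dimensional condition vector (3 for light direction, 3 for view direction, 1 for material label), and the 32x32 output resolution are illustrative assumptions, not details from the paper.

```python
import torch
import torch.nn as nn

class ConditionalBTFGenerator(nn.Module):
    """Hypothetical conditional DCGAN-style generator sketch.

    The condition vector (light direction, view direction, material type)
    is concatenated with the latent noise vector before upsampling, one
    common way to realize the conditional constraints the abstract names.
    """

    def __init__(self, z_dim=100, cond_dim=7, img_channels=3):
        super().__init__()
        self.net = nn.Sequential(
            # project the (noise + condition) vector to a 4x4 feature map
            nn.ConvTranspose2d(z_dim + cond_dim, 256, 4, 1, 0, bias=False),
            nn.BatchNorm2d(256),
            nn.ReLU(inplace=True),
            nn.ConvTranspose2d(256, 128, 4, 2, 1, bias=False),  # 4x4 -> 8x8
            nn.BatchNorm2d(128),
            nn.ReLU(inplace=True),
            nn.ConvTranspose2d(128, 64, 4, 2, 1, bias=False),   # 8x8 -> 16x16
            nn.BatchNorm2d(64),
            nn.ReLU(inplace=True),
            nn.ConvTranspose2d(64, img_channels, 4, 2, 1),      # 16x16 -> 32x32
            nn.Tanh(),  # texture values in [-1, 1]
        )

    def forward(self, z, cond):
        # concatenate noise and condition, reshape to a 1x1 spatial "image"
        x = torch.cat([z, cond], dim=1).unsqueeze(-1).unsqueeze(-1)
        return self.net(x)

# generate texture patches for two light/view/material conditions
gen = ConditionalBTFGenerator()
z = torch.randn(2, 100)       # latent noise
cond = torch.randn(2, 7)      # stand-in condition vectors
patch = gen(z, cond)          # shape: (2, 3, 32, 32)
```

At inference time, sweeping the condition vector over unseen light/view directions is what would yield new BTF samples without storing the full measured dataset, which is how the claimed storage reduction would arise under this sketch.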
Original language: English
Title of host publication: International Conference on Identification, Information and Knowledge in the Internet of Things
Subtitle of host publication: IIKI 2018
State: Accepted for publication - 7 Oct 2018
Event: International Conference on Identification, Information and Knowledge in the Internet of Things - Beijing, China
Duration: 19 Oct 2018 - 21 Oct 2018
http://business.bnu.edu.cn/iiki2018/

Conference

Conference: International Conference on Identification, Information and Knowledge in the Internet of Things
Abbreviated title: IIKI 2018
Country: China
City: Beijing
Period: 19/10/18 - 21/10/18
Internet address: http://business.bnu.edu.cn/iiki2018/


ID: 11809297