Predicting and generating wallpaper texture with semantic properties

Xiaohan Feng, Lin Qi, Yanhai Gan, Ying Gao, Hui Yu, Junyu Dong

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution



Humans naturally use semantic descriptions to express their visual perception of textures; this is also true for the perception and description of wallpaper texture. Classifying a wallpaper's style relies mainly on understanding its visual information. However, the complexity of real-world wallpaper images is difficult to capture with existing datasets. Inspired by a publicly available Procedural Textures Dataset, a number of wallpaper images were collected and assembled into a wallpaper dataset. A series of psychophysical experiments was then performed to collect semantic descriptions for this dataset, and each wallpaper was labeled with 5–10 semantic descriptions. More importantly, our dataset contains complex wallpaper images with rich annotations. To the best of our knowledge, it is the first public wallpaper dataset with semantic descriptions. We use label distributions to analyze semantic descriptions and texture characteristics. Furthermore, a GAN-based texture generation method was tested on our wallpaper dataset and produced state-of-the-art results.
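The label-distribution analysis mentioned in the abstract can be illustrated with a minimal sketch: assuming each wallpaper receives several free-form semantic labels from observers, the raw labels can be normalized into a probability distribution over a fixed attribute vocabulary. The vocabulary and labels below are hypothetical examples, not drawn from the actual dataset.

```python
from collections import Counter

def label_distribution(annotations, vocabulary):
    """Turn a list of raw semantic labels into a normalized
    distribution over a fixed attribute vocabulary."""
    counts = Counter(annotations)
    total = sum(counts[word] for word in vocabulary)
    return {word: counts[word] / total for word in vocabulary}

# Hypothetical attribute vocabulary and one wallpaper's observer labels
vocab = ["floral", "striped", "regular", "rough", "glossy"]
labels = ["floral", "floral", "regular", "floral", "rough"]
dist = label_distribution(labels, vocab)
# dist["floral"] is 0.6: three of five observers chose "floral"
```

A distribution like this captures the degree to which each semantic attribute describes a texture, rather than forcing a single hard class label per wallpaper.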
Original language: English
Title of host publication: 2018 11th International Conference on Human System Interaction
ISBN (Electronic): 978-1-5386-5024-0
Publication status: Published - 13 Aug 2018
Event: 11th International Conference on Human System Interaction - Gdansk, Poland
Duration: 4 Jul 2018 – 6 Jul 2018


Conference: 11th International Conference on Human System Interaction
Abbreviated title: HSI 2018


