Perception-driven procedural texture generation from examples

Research output: Contribution to journal › Article

  • Jun Liu
  • Yanhai Gan
  • Junyu Dong
  • Lin Qi
  • Xin Sun
  • Muwei Jian
  • Lina Wang
  • Hui Yu
Procedural textures are widely used in computer games and animations for efficiently rendering natural scenes. They are generated by mathematical functions, and users need to tune the model parameters to produce the desired texture. However, unless one has a good knowledge of these procedural models, it is difficult to predict which model can produce which types of textures. This paper proposes a framework for generating new procedural textures from examples. The new texture can have the same perceptual attributes as those of the input example, or attributes redefined by the users. To achieve this goal, we first introduce a PCA-based Convolutional Network (PCN) to effectively learn texture features. These PCN features can be used to accurately predict the perceptual scales of the input example and a procedural model that can generate the input. The perceptual scales of the input can be redefined by users and further mapped to a point in a perceptual texture space, which has been established in advance using a training dataset. Finally, we determine the parameters of the procedural generation model by performing perceptual similarity measurement in the perceptual texture space. Extensive experiments show that our method produces promising results.
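Below is a minimal sketch of the two computational steps the abstract names: a PCANet-style PCN feature stage (PCA filters learned from image patches and applied as convolution kernels) and a nearest-neighbour match in the perceptual texture space. The function names, patch size, and pooling scheme are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only: a PCANet-style PCA-convolutional feature stage
# and a nearest-neighbour lookup in a perceptual texture space. The actual
# PCN architecture, pooling, and similarity measure in the paper may differ.
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view
from scipy.signal import convolve2d

def learn_pca_filters(images, patch=7, n_filters=8):
    """Learn convolution kernels as top principal components of image patches."""
    patches = []
    for img in images:
        p = sliding_window_view(img, (patch, patch)).reshape(-1, patch * patch)
        patches.append(p - p.mean(axis=1, keepdims=True))  # remove patch mean
    X = np.vstack(patches)
    _, vecs = np.linalg.eigh(X.T @ X)          # eigenvectors, ascending order
    return vecs[:, -n_filters:].T.reshape(n_filters, patch, patch)

def pcn_features(img, filters):
    """Convolve with the PCA filters and pool each response map to statistics."""
    maps = [convolve2d(img, f, mode="valid") for f in filters]
    return np.array([s for m in maps for s in (m.mean(), m.std())])

def nearest_model_sample(target_point, sample_points):
    """Return the index of the procedural-model sample whose perceptual-space
    coordinates are closest to the target point (Euclidean distance)."""
    return int(np.argmin(np.linalg.norm(sample_points - target_point, axis=1)))
```

In this sketch, `pcn_features` would feed a regressor that predicts perceptual scales, and `nearest_model_sample` stands in for the perceptual similarity measurement that selects the generation parameters.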
Original language: English
Pages (from-to): 21-34
Journal: Neurocomputing
Volume: 291
Early online date: 21 Feb 2018
DOIs
State: Published - 24 May 2018

Documents

  • perception-driven procedural texture generation from examples

Accepted author manuscript (post-print), 7 MB, PDF document

    Due to publisher’s copyright restrictions, this document is not freely available to download from this website until: 21/02/19

    License: CC BY-NC-ND

ID: 8963250