Perception-driven procedural texture generation from examples

Jun Liu, Yanhai Gan, Junyu Dong, Lin Qi, Xin Sun, Muwei Jian, Lina Wang, Hui Yu

    Research output: Contribution to journal › Article › peer-review


    Abstract

    Procedural textures are widely used in computer games and animations to efficiently render natural scenes. They are generated by mathematical functions, and users must tune the model parameters to produce the desired texture. However, without a good knowledge of these procedural models, it is difficult to predict which model can produce which types of textures. This paper proposes a framework for generating new procedural textures from examples. The new texture can have the same perceptual attributes as the input example, or attributes redefined by the user. To achieve this goal, we first introduce a PCA-based Convolutional Network (PCN) to effectively learn texture features. These PCN features can be used to accurately predict both the perceptual scales of the input example and a procedural model that can generate it. The perceptual scales can be redefined by the user and are then mapped to a point in a perceptual texture space, which is established in advance from a training dataset. Finally, we determine the parameters of the procedural generation model by performing perceptual similarity measurement in this perceptual texture space. Extensive experiments show that our method produces promising results.
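    The abstract does not give the PCN's details, but the core idea of a PCA-based convolutional layer (in the spirit of PCANet-style networks) can be sketched as follows: learn filters as the top principal components of mean-removed image patches, then convolve them with the texture to obtain feature maps. All function names, patch sizes, and the simple energy descriptor below are illustrative assumptions, not the paper's actual implementation.

    ```python
    import numpy as np

    def pca_filters(image, patch=5, n_filters=4):
        """Learn convolutional filters as the leading principal components
        of mean-removed overlapping image patches (PCANet-style layer)."""
        h, w = image.shape
        # Collect every overlapping patch x patch window as a row vector.
        patches = np.array([
            image[i:i + patch, j:j + patch].ravel()
            for i in range(h - patch + 1)
            for j in range(w - patch + 1)
        ])
        patches = patches - patches.mean(axis=1, keepdims=True)  # remove patch mean
        # Eigenvectors of the patch covariance matrix are the PCA filters.
        cov = patches.T @ patches / len(patches)
        vals, vecs = np.linalg.eigh(cov)          # ascending eigenvalues
        leading = vecs[:, ::-1][:, :n_filters]    # keep the top components
        return leading.T.reshape(n_filters, patch, patch)

    def convolve_valid(image, filt):
        """'Valid' 2-D correlation of an image with one filter."""
        p = filt.shape[0]
        h, w = image.shape
        out = np.empty((h - p + 1, w - p + 1))
        for i in range(out.shape[0]):
            for j in range(out.shape[1]):
                out[i, j] = np.sum(image[i:i + p, j:j + p] * filt)
        return out

    # Toy "texture": random noise standing in for a real grayscale sample.
    rng = np.random.default_rng(0)
    texture = rng.random((32, 32))

    filters = pca_filters(texture)                               # (4, 5, 5)
    feature_maps = np.stack([convolve_valid(texture, f) for f in filters])
    # A crude per-filter descriptor: mean response energy of each feature map.
    descriptor = (feature_maps ** 2).mean(axis=(1, 2))
    ```

    In a full pipeline, such descriptors would feed the regressors that predict perceptual scales and the generating procedural model; here they merely illustrate how PCA-learned filters replace backpropagation-trained ones in the feature-extraction stage.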
    Original language: English
    Pages (from-to): 21-34
    Journal: Neurocomputing
    Volume: 291
    Early online date: 21 Feb 2018
    DOIs
    Publication status: Published - 24 May 2018
