Procedural textures are widely used because they can be generated easily from mathematical models. However, the model parameters are neither perceptually meaningful nor uniform for non-expert users. In this paper, we propose a system that generates procedural textures interactively along perceptual dimensions. We built a procedural texture dataset and measured twelve perceptual properties of a small subset through psychophysical experiments. The perceived magnitudes of the remaining textures were estimated by Support Vector Machines using computational features from a cascaded PCA network. For a given texture displayed on a touch screen, the user makes finger gestures that are translated into magnitude changes in the perceptual space. The texture in the database that matches the new perceptual scale and lies nearest in the computational feature space is then chosen and displayed. We report experimental results for two perceptual properties, surface roughness and directionality; other properties can be manipulated similarly.
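The retrieval pipeline in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the feature vectors, labels, and the `retrieve` helper are synthetic stand-ins, and an off-the-shelf SVR substitutes for the paper's trained models. It shows the two steps the abstract names: estimating perceptual magnitudes for unlabeled textures from a psychophysically measured subset, then selecting the texture that matches a gesture-adjusted target scale with the nearest feature-space distance.

```python
# Hedged sketch of the abstract's pipeline; all data here is synthetic.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Stand-in for cascaded-PCA-network features of 200 textures (8-D).
features = rng.normal(size=(200, 8))

# A small "psychophysically measured" subset: 40 textures with known
# magnitudes for one property (here a hidden linear function plus noise).
true_w = rng.normal(size=8)
labeled = np.arange(40)
measured = features[labeled] @ true_w + 0.1 * rng.normal(size=40)

# Step 1: estimate magnitudes for the whole database from the subset.
svr = SVR(kernel="rbf", C=10.0).fit(features[labeled], measured)
predicted = svr.predict(features)

def retrieve(current_idx, delta, tol=0.5):
    """Pick the texture whose predicted magnitude is within `tol` of the
    gesture-adjusted target, breaking ties by feature-space distance."""
    target = predicted[current_idx] + delta
    candidates = np.flatnonzero(np.abs(predicted - target) < tol)
    if candidates.size == 0:  # fall back to the best scale match
        candidates = np.array([np.argmin(np.abs(predicted - target))])
    dists = np.linalg.norm(features[candidates] - features[current_idx],
                           axis=1)
    return int(candidates[np.argmin(dists)])

# Step 2: a finger gesture mapped to a magnitude change of +1.0.
new_idx = retrieve(current_idx=0, delta=1.0)
```

In an interactive loop, `delta` would come from the gesture magnitude on the touch screen, and `retrieve` would be called once per gesture to update the displayed texture.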
|Title of host publication||Proceedings of the 8th International Conference on Human System Interactions|
|Number of pages||6|
|Publication status||Published - Aug 2015|
|Event||8th International Conference on Human System Interactions (HSI) - Warsaw, Poland|
Duration: 25 Jun 2015 → 27 Jun 2015
|Conference||8th International Conference on Human System Interactions (HSI)|
|Period||25/06/15 → 27/06/15|