Various generative adversarial networks model for synthetic prohibitory sign image generation

Christine Dewi, Rung-Ching Chen, Yan-Ting Liu, Hui Yu

Research output: Contribution to journal › Article › peer-review



Synthetic image generation is a critical issue in computer vision. Traffic sign images synthesized from standard templates are commonly used to build recognition algorithms, offering a low-cost way to study a variety of research problems. Convolutional Neural Networks (CNNs) achieve excellent traffic sign detection and recognition when sufficient annotated training data is available, and the reliability of the entire vision system depends on these networks. However, annotated traffic sign datasets are difficult to obtain for most countries in the world. This work uses several generative adversarial network (GAN) models to construct detailed sign images: Least Squares Generative Adversarial Networks (LSGAN), Deep Convolutional Generative Adversarial Networks (DCGAN), and Wasserstein Generative Adversarial Networks (WGAN). This paper also examines the quality of the images produced by each GAN under different parameter settings, varying the number and scale of the input images. The Structural Similarity Index (SSIM) and Mean Squared Error (MSE) are used to measure image quality, with SSIM values compared between each generated image and its corresponding real image. The results show that the generated images become more similar to the real images as the number of training images increases. LSGAN outperformed the other GAN models in the experiment, achieving the maximum SSIM values with 200 input images, 2000 epochs, and an image size of 32 × 32.
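The SSIM/MSE comparison described above can be sketched as follows. This is a minimal, single-window (global-statistics) SSIM in NumPy; the paper's exact implementation, window settings, and preprocessing are not specified here, so the function names and the noise-perturbed "generated" image are illustrative assumptions only:

```python
import numpy as np

def mse(a, b):
    """Mean squared error between two images (cast to float64)."""
    return float(np.mean((a.astype(np.float64) - b.astype(np.float64)) ** 2))

def ssim(a, b, data_range=255.0):
    """Single-window SSIM over the whole image, using the standard
    constants C1 = (0.01 L)^2 and C2 = (0.03 L)^2 for dynamic range L."""
    a = a.astype(np.float64)
    b = b.astype(np.float64)
    c1 = (0.01 * data_range) ** 2
    c2 = (0.03 * data_range) ** 2
    mu_a, mu_b = a.mean(), b.mean()
    var_a, var_b = a.var(), b.var()
    cov = ((a - mu_a) * (b - mu_b)).mean()
    return ((2 * mu_a * mu_b + c1) * (2 * cov + c2)) / (
        (mu_a ** 2 + mu_b ** 2 + c1) * (var_a + var_b + c2)
    )

# Illustrative 32 x 32 "real" sign and a noise-perturbed stand-in
# for a GAN-generated image (not data from the paper).
rng = np.random.default_rng(0)
real = rng.integers(0, 256, size=(32, 32)).astype(np.float64)
fake = np.clip(real + rng.normal(0, 10, size=(32, 32)), 0, 255)

print(round(ssim(real, real), 4))  # identical images -> 1.0
```

In practice a library implementation such as `skimage.metrics.structural_similarity` computes SSIM over local sliding windows and averages the map; the global version above conveys the same similarity idea in a few lines.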
Original language: English
Article number: 2913
Pages (from-to): 1-15
Number of pages: 15
Journal: Applied Sciences
Issue number: 7
Publication status: Published - 24 Mar 2021


  • data generation
  • GAN
  • synthetic images
  • WGAN


