平行皮肤:一种基于视觉的皮肤病分析框架

Translated title of the contribution: Parallel skin: a vision-based dermatological analysis framework

Fei-Yue Wang, Chao Gou, Jiangong Wang, Tianyu Shen, Wenbo Zheng, Hui Yu

    Research output: Contribution to journal › Article › peer-review

    Abstract

    Skin is the largest organ of the human body, and skin diseases are among the most common medical conditions; most people experience skin-related problems at some point. With the rapid development of computing and artificial intelligence, image-based methods for skin analysis have achieved promising results and received increasing attention in both academia and industry. However, the performance of computer-aided diagnosis systems based on deep learning depends on large volumes of medical data labeled by domain experts, and their diagnostic results offer limited interpretability. To address these problems, we propose a vision-based unified framework for dermatological analysis termed parallel skin. Inspired by the ACP (artificial systems, computational experiments, and parallel execution) method and the parallel medical image analysis framework, we construct an artificial skin image system to perform data selection and generation. Computational experiments are then conducted with predictive learning for model building and evaluation. We further introduce descriptive and prescriptive learning to leverage domain knowledge to guide data selection and generation. In the proposed parallel-skin framework, the closed-loop diagnostic analysis model can be continuously optimized.
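
    The abstract outlines a closed loop: an artificial skin image system generates and selects data, computational experiments train and evaluate a predictive model, and descriptive/prescriptive learning feeds the evaluation back into generation. The Python sketch below illustrates that loop in skeleton form only; it is not the authors' implementation, and every name in it (ArtificialSkinImageSystem, PredictiveModel, parallel_skin_loop, the melanoma/nevus/keratosis labels) is a hypothetical stand-in for illustration.

        # Hypothetical sketch of the ACP-style closed loop described above:
        # an artificial system (A) synthesizes labeled skin images, computational
        # experiments (C) train and evaluate a predictive model, and the evaluation
        # feedback (P) steers the next round of data selection and generation.
        import random

        class ArtificialSkinImageSystem:
            """Stand-in for the generative model that synthesizes labeled skin images."""
            def generate(self, n, class_weights):
                # Prescriptive step: oversample lesion classes with higher weights.
                classes = list(class_weights)
                weights = [class_weights[c] for c in classes]
                return [(f"synthetic_image_{i}", random.choices(classes, weights)[0])
                        for i in range(n)]

        class PredictiveModel:
            """Stand-in for the deep diagnostic model; training is stubbed out."""
            def fit(self, data):
                pass  # a real system would train an image classifier here

            def per_class_error(self, data):
                # Placeholder evaluation: report a mock error rate per class.
                return {label: random.uniform(0.05, 0.4) for _, label in data}

        def parallel_skin_loop(rounds=3, batch=1000):
            system, model = ArtificialSkinImageSystem(), PredictiveModel()
            weights = {"melanoma": 1.0, "nevus": 1.0, "keratosis": 1.0}
            for r in range(rounds):
                data = system.generate(batch, weights)             # artificial system
                model.fit(data)                                    # computational experiment
                errors = model.per_class_error(data)               # evaluation
                weights = {c: 1.0 + e for c, e in errors.items()}  # feedback closes the loop
                print(f"round {r}: per-class error {errors}")

        if __name__ == "__main__":
            parallel_skin_loop()

    The error-driven reweighting above is only one plausible way to realize the prescriptive feedback the abstract mentions; in the paper, descriptive learning additionally brings expert domain knowledge into that step.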
    Original language: Chinese
    Pages (from-to): 577-588
    Number of pages: 12
    Journal: Journal of Pattern Recognition and Artificial Intelligence
    Volume: 7
    Publication status: Published - 7 Jul 2019

    Keywords

    • parallel skin
    • parallel intelligence
    • generative models
