
Pixel histogram based background modeling for moving target detection

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Existing moving target detection methods mainly include inter-frame difference, background difference, and optical flow. For recognizing human motion during human-computer collaboration, existing algorithms often fail to meet real-time processing requirements and are easily disturbed by lighting changes or image noise. In this paper, a method for establishing a static background model based on per-pixel histograms is proposed. Because the new algorithm is selective about which gray values enter the model, the effect of moving targets and noise on the background model is excluded, so the real background can be detected more reliably. Compared with other moving target detection methods, this method is fast, strongly resistant to interference, and able to identify human body movement quickly and accurately.
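The abstract's core idea can be sketched in code: over a stack of frames, each pixel keeps a histogram of its observed gray values, and the most frequent value is taken as that pixel's background, so transient moving targets and noise are voted out of the model. The sketch below is an illustration of that idea under stated assumptions; the function names, threshold, and mode-selection rule are the author of this summary's choices, not details from the paper.

```python
import numpy as np

def build_background(frames, bins=256):
    """Estimate a static background as the per-pixel modal gray value
    over a stack of grayscale frames (pixel-histogram sketch)."""
    stack = np.stack(frames).astype(np.uint8)  # shape (N, H, W)
    n, h, w = stack.shape
    background = np.empty((h, w), dtype=np.uint8)
    for i in range(h):
        for j in range(w):
            # Histogram of gray values seen at this pixel across frames;
            # the peak is the value seen most often, i.e. the background.
            hist = np.bincount(stack[:, i, j], minlength=bins)
            background[i, j] = np.argmax(hist)
    return background

def detect_moving(frame, background, thresh=30):
    """Foreground mask via background difference against the modal model."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return (diff > thresh).astype(np.uint8)
```

Because a briefly occluding target contributes only a few counts to a pixel's histogram, the modal value still reflects the static scene, which is the claimed advantage over simple frame averaging (where a moving target biases the mean).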
Original language: English
Title of host publication: Proceedings of the 2020 International Symposium on Community-centric Systems (CcS)
Publisher: Institute of Electrical and Electronics Engineers
Number of pages: 4
ISBN (Electronic): 978-1-7281-8741-9, 978-1-7281-8740-2
ISBN (Print): 978-1-7281-8742-6
Publication status: Published - 20 Oct 2020
Event: CcS 2020 International Symposium on Community-centric Systems - Tokyo, Japan
Duration: 23 Sep 2020 - 26 Sep 2020


Conference: CcS 2020 International Symposium on Community-centric Systems


  • Postprint_CCS2020

    Rights statement: © 2020 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.

    Accepted author manuscript (Post-print), 363 KB, PDF document


ID: 22822652