
A novel probabilistic projection model for multi-camera object tracking

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Correlation Filter (CF)-based algorithms have achieved remarkable performance in object tracking over the past decades. They offer dense sampling at reduced computational cost thanks to the circulant structure of the sample matrix. However, existing monocular tracking algorithms can hardly handle fast motion, which often causes tracking failure. In this paper, a novel probabilistic projection model for multi-camera object tracking using two Kinects is proposed. Once the object is detected as lost by multimodal target detection, a point projection based on the probabilistic projection model is performed to recover a better tracking position for the target. The projection model works well in the related experiments. Furthermore, compared with other popular methods, the proposed tracking method built on the projection model is demonstrated to be more effective in accommodating fast motion, achieving better tracking performance and promoting robotic autonomy.
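The computational advantage the abstract attributes to the circulant matrix can be illustrated with the standard MOSSE/KCF-style trick: because all circular shifts of a patch form a circulant data matrix, ridge regression diagonalises under the DFT and the filter is obtained with element-wise operations. The sketch below is a generic illustration of that idea, not the authors' implementation; all names and values are illustrative.

```python
import numpy as np

def train_filter(x, y, lam=1e-2):
    # Ridge regression in the Fourier domain: the circulant structure of
    # shifted samples means the closed-form solution is element-wise.
    X = np.fft.fft2(x)
    Y = np.fft.fft2(y)
    return np.conj(X) * Y / (np.conj(X) * X + lam)  # filter, Fourier domain

def respond(A, z):
    # Correlation response map for a new patch z.
    return np.real(np.fft.ifft2(A * np.fft.fft2(z)))

# Toy example: a Gaussian regression label centred on the target.
size = 64
yy, xx = np.mgrid[0:size, 0:size]
y = np.exp(-((xx - size // 2) ** 2 + (yy - size // 2) ** 2) / (2 * 3.0 ** 2))
x = np.random.default_rng(0).standard_normal((size, size))

A = train_filter(x, y)
resp = respond(A, x)
peak = np.unravel_index(np.argmax(resp), resp.shape)  # peak at the centre
```

Training and detection cost O(n log n) per frame via the FFT, versus the cubic cost of solving the dense regression directly, which is the efficiency gain the abstract refers to.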

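The recovery step rests on projecting a point observed by one calibrated depth camera into the other camera's image. A minimal deterministic sketch using standard pinhole geometry is given below; the paper's model is probabilistic, so the intrinsics, extrinsics, and function names here are purely illustrative assumptions.

```python
import numpy as np

def reproject(u, v, depth, K_a, K_b, R, t):
    """Back-project pixel (u, v) with measured depth from camera A,
    apply the rigid transform A -> B, and project into camera B's image.
    Hypothetical helper for illustration; not the paper's probabilistic model."""
    # pixel + depth -> 3-D point in camera A coordinates
    p_a = depth * np.linalg.inv(K_a) @ np.array([u, v, 1.0])
    # extrinsic rigid transform from A's frame to B's frame
    p_b = R @ p_a + t
    # perspective projection onto B's image plane
    uvw = K_b @ p_b
    return uvw[:2] / uvw[2]

# Toy calibration: identical intrinsics (typical Kinect-like values),
# camera B shifted 0.5 m along the x-axis of camera A.
K = np.array([[525.0,   0.0, 320.0],
              [  0.0, 525.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)
t = np.array([-0.5, 0.0, 0.0])

uv_b = reproject(320.0, 240.0, 2.0, K, K, R, t)  # centre pixel, 2 m away
```

A probabilistic version would propagate the uncertainty of the depth measurement and the calibration through this mapping rather than committing to a single reprojected pixel.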
Original language: English
Title of host publication: Towards Autonomous Robotic Systems - 20th Annual Conference, TAROS 2019, Proceedings
Editors: Kaspar Althoefer, Jelizaveta Konstantinova, Ketao Zhang
Publisher: Springer Verlag
Number of pages: 11
ISBN (Print): 9783030238063
Publication status: Published - Jul 2019
Event: 20th Annual Conference on Towards Autonomous Robotic Systems - London, United Kingdom
Duration: 3 Jul 2019 - 5 Jul 2019

Publication series

Name: Lecture Notes in Computer Science
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349


Conference: 20th Annual Conference on Towards Autonomous Robotic Systems
Abbreviated title: TAROS 2019
Country: United Kingdom


  • linTarosMulticameraTracking-0622

    Rights statement: The final authenticated version is available online at:

    Accepted author manuscript (Post-print), 505 KB, PDF document


ID: 14912763