
Kinect depth recovery via the cooperative profit random forest algorithm

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

The depth maps captured by Kinect usually contain missing depth data. In this paper, we propose a novel method to recover the missing depth values under the guidance of the depth information of neighbouring pixels. In the proposed framework, a self-taught mechanism and a cooperative profit random forest (CPRF) algorithm are combined to predict the missing depth data from the existing depth data and the corresponding RGB image. The proposed method overcomes a defect of traditional methods, which are prone to producing artifacts or blur at object edges. Experimental results on the Berkeley 3-D Object Dataset (B3DO) and the Middlebury benchmark dataset show that the proposed method outperforms existing methods for the recovery of missing depth data; in particular, it preserves the geometry of objects well.
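The core idea of the abstract — predicting missing depth values from the existing depth data and the corresponding RGB image with a random-forest model — can be sketched as follows. This is a minimal illustration, not the authors' CPRF algorithm: it substitutes scikit-learn's plain `RandomForestRegressor`, and the feature choice (pixel coordinates plus RGB colour) and the function name `fill_missing_depth` are assumptions for the sake of the example.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor


def fill_missing_depth(depth, rgb, n_trees=50, seed=0):
    """Fill zero-valued (missing) depth pixels by regressing depth from
    pixel coordinates and RGB colour, trained on the valid pixels.

    A plain RandomForestRegressor stands in here for the paper's
    cooperative profit random forest (CPRF).
    """
    h, w = depth.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Per-pixel feature vector: (row, col, R, G, B).
    feats = np.stack(
        [ys.ravel(), xs.ravel(),
         rgb[..., 0].ravel(), rgb[..., 1].ravel(), rgb[..., 2].ravel()],
        axis=1,
    ).astype(float)
    valid = depth.ravel() > 0  # Kinect encodes missing depth as 0.

    # Train on observed pixels, predict only the holes.
    model = RandomForestRegressor(n_estimators=n_trees, random_state=seed)
    model.fit(feats[valid], depth.ravel()[valid])

    out = depth.astype(float).copy()
    out_flat = out.reshape(-1)
    out_flat[~valid] = model.predict(feats[~valid])
    return out
```

Because the forest averages observed depth values with similar colour and position, the filled pixels stay consistent with their neighbourhood rather than being smoothed across object boundaries, which is the behaviour the abstract contrasts with edge-blurring interpolation methods.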
Original language: English
Title of host publication: 2018 11th International Conference on Human System Interaction (HSI)
Publisher: IEEE
Pages: 57-62
ISBN (Electronic): 978-1-5386-5024-0
ISBN (Print): 978-1-5386-5025-7
DOIs
Publication status: Published - 13 Aug 2018
Event: 11th International Conference on Human System Interaction - Gdansk, Poland
Duration: 4 Jul 2018 - 6 Jul 2018

Conference

Conference: 11th International Conference on Human System Interaction
Abbreviated title: HSI 2018
Country: Poland
City: Gdansk
Period: 4/07/18 - 6/07/18

Documents

  • Kinect_postprint

    Rights statement: © 2018 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.

    Accepted author manuscript (Post-print), 1 MB, PDF-document

    Licence: Unspecified


ID: 11458244