Accurately estimating rigid transformations in registration using a boosting-inspired mechanism

Yonghuai Liu, Honghai Liu, Ralph Martin, Luigi De Dominicis, Ran Song, Yitian Zhao

Research output: Contribution to journal › Article › peer-review


Abstract

Feature extraction and matching provide the basis of many methods for object registration, modeling, retrieval, and recognition. However, this approach typically introduces false matches, due to a lack of features, noise, occlusion, and cluttered backgrounds. In registration, these false matches lead to inaccurate estimation of the underlying transformation that brings the overlapping shapes into the best possible alignment. In this paper, we propose a novel boosting-inspired method to tackle this challenging task. It includes three key steps: (i) underlying transformation estimation in the weighted least squares sense, (ii) boosting parameter estimation and regularization via Tsallis entropy, and (iii) weight re-estimation and regularization via Shannon entropy and update with a maximum fusion rule. The process is iterated. The final optimal underlying transformation is estimated as a weighted average of the transformations estimated from the latest iterations, with weights given by the boosting parameters. A comparative study based on real shape data shows that the proposed method outperforms four other state-of-the-art methods for evaluating the established point matches, enabling more accurate and stable estimation of the underlying transformation.
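To make the three-step loop concrete, the following is a minimal Python/NumPy sketch of one plausible reading of the abstract: a weighted least-squares rigid estimate (step i), a per-iteration confidence value standing in for the boosting parameter (step ii), and residual-driven weight re-estimation (step iii), with the final transform taken as a confidence-weighted average over the latest iterations. The function names, the Gaussian weight update, and the simple confidence formula are illustrative assumptions; the paper's Tsallis/Shannon entropy regularization and maximum fusion rule are not reproduced here.

    import numpy as np

    def weighted_rigid_transform(P, Q, w):
        """Weighted least-squares rigid transform (R, t) with R @ p_i + t ~ q_i.

        P, Q: (N, 3) matched point sets; w: (N,) non-negative weights.
        Closed-form solution via the weighted Kabsch procedure.
        """
        w = w / w.sum()
        mu_p = w @ P                        # weighted centroids
        mu_q = w @ Q
        X = (P - mu_p) * w[:, None]         # weighted, centred source points
        Y = Q - mu_q
        H = X.T @ Y                         # weighted cross-covariance
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = mu_q - R @ mu_p
        return R, t

    def boosting_like_registration(P, Q, n_iters=20, sigma=1.0, n_avg=5):
        """Iterative re-weighting loosely following the abstract's three steps;
        entropy regularization and the maximum fusion rule are omitted here."""
        N = len(P)
        w = np.full(N, 1.0 / N)
        history = []                                    # (confidence, R, t) triples
        for _ in range(n_iters):
            R, t = weighted_rigid_transform(P, Q, w)    # step (i)
            r = np.linalg.norm((P @ R.T + t) - Q, axis=1)   # per-match residuals
            conf = 1.0 / (r @ w + 1e-12)                # step (ii): placeholder boosting parameter
            history.append((conf, R, t))
            w = np.exp(-(r / sigma) ** 2)               # step (iii): down-weight likely false matches
            w /= w.sum()
        # Confidence-weighted average of the latest estimates.
        confs, Rs, ts = zip(*history[-n_avg:])
        a = np.array(confs) / np.sum(confs)
        R_avg = sum(ai * Ri for ai, Ri in zip(a, Rs))
        U, _, Vt = np.linalg.svd(R_avg)                 # project the average back onto SO(3)
        R_final = U @ np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))]) @ Vt
        t_final = sum(ai * ti for ai, ti in zip(a, ts))
        return R_final, t_final

Given two arrays of putative correspondences P and Q, `boosting_like_registration(P, Q)` returns a rotation and translation that progressively discount matches with large residuals, which is the general behaviour the abstract describes.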
Original language: English
Pages (from-to): 849-862
Number of pages: 14
Journal: Pattern Recognition
Volume: 60
Early online date: 8 Jul 2016
DOIs
Publication status: Published - Dec 2016

Keywords

  • feature extraction
  • feature matching
  • point match evaluation
  • boosting-inspired
  • rigid underlying transformation
