Two-stage deep regression enhanced depth estimation from a single RGB image

Jianyuan Sun, Zidong Wang, Hui Yu, Shu Zhang, Junyu Dong, Pengxiang Gao

Research output: Contribution to journal › Article › peer-review



Depth estimation plays a significant role in industrial applications, e.g. augmented reality, robotic mapping and autonomous driving. Traditional approaches for capturing depth, such as laser- or depth-sensor-based methods, are difficult to use in most scenarios due to high system cost and limited operational conditions. As an inexpensive and convenient alternative, using computational models to estimate depth from a single RGB image offers a preferable way to predict depth. Although the design of computational models for depth-map estimation has been widely investigated, the majority of models suffer from low prediction accuracy due to the sole use of a one-stage regression strategy. Inspired by both the theoretical and practical success of two-stage regression, we propose a two-stage deep regression model composed of two state-of-the-art network architectures, i.e. the fully convolutional residual network (FCRN) and the conditional generative adversarial network (cGAN). FCRN has been shown to possess strong predictive ability for depth estimation, but fine details in its depth maps remain incomplete. Accordingly, we have improved the existing cGAN model to refine the FCRN-based depth prediction. The experimental results show that the proposed two-stage deep regression model outperforms existing state-of-the-art methods.
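The two-stage pipeline described above can be sketched as follows. This is a minimal illustration only: the two functions are hypothetical placeholders standing in for the FCRN (stage 1, coarse depth from RGB) and the cGAN generator (stage 2, refinement conditioned on the RGB input and the coarse map), not the authors' trained networks.

```python
import numpy as np

def stage1_coarse_depth(rgb: np.ndarray) -> np.ndarray:
    """Placeholder for the FCRN: map an HxWx3 RGB image to an HxW coarse depth map.
    Toy proxy only: uses mean intensity as a stand-in prediction."""
    return rgb.mean(axis=-1)

def stage2_refine(rgb: np.ndarray, coarse: np.ndarray) -> np.ndarray:
    """Placeholder for the cGAN generator: refine the coarse map, conditioned
    on both the RGB input and the stage-1 output. Toy proxy: add a small
    image-gradient term to mimic recovering fine detail near edges."""
    gy, gx = np.gradient(rgb.mean(axis=-1))
    return coarse + 0.1 * (np.abs(gx) + np.abs(gy))

def two_stage_depth(rgb: np.ndarray) -> np.ndarray:
    """Compose the two regression stages: coarse prediction, then refinement."""
    coarse = stage1_coarse_depth(rgb)
    return stage2_refine(rgb, coarse)

rgb = np.random.default_rng(0).random((8, 8, 3))
depth = two_stage_depth(rgb)
print(depth.shape)
```

The design point is the composition itself: stage 2 receives both the original input and the stage-1 estimate, so it only needs to learn a correction rather than the full regression target.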
Original language: English
Number of pages: 9
Journal: IEEE Transactions on Emerging Topics in Computing
Early online date: 28 Oct 2020
Publication status: Early online - 28 Oct 2020


  • RCUK
  • EP/N025849/1
  • depth prediction
  • single RGB image
  • rough depth map
  • neural networks


