BI-GCN: Boundary-Aware Input-Dependent Graph Convolution Network for biomedical image segmentation

Yanda Meng, Hongrun Zhang, Dongxu Gao, Yitian Zhao, Xiaoyun Yang, Xuesheng Qian, Xiaowei Huang, Yalin Zheng

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Segmentation is an essential operation in image processing. The convolution operation suffers from a limited receptive field, while global modelling is fundamental to segmentation tasks. In this paper, we apply graph convolution to the segmentation task and propose an improved Laplacian. Unlike existing methods, our Laplacian is data-dependent, and we introduce two attention diagonal matrices to learn better vertex relationships. In addition, it takes advantage of both region and boundary information when performing graph-based information propagation. Specifically, we model and reason about the boundary-aware region-wise correlations of different classes through learned graph representations, which enables long-range semantic reasoning across regions together with spatial enhancement along the object's boundary. Our model is well-suited to capturing global semantic region information while simultaneously accommodating local spatial boundary characteristics. Experiments on two types of challenging datasets demonstrate that our method outperforms state-of-the-art approaches on the segmentation of polyps in colonoscopy images and of the optic disc and optic cup in colour fundus images.
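The abstract's core idea — an input-dependent adjacency modulated by two learnable attention diagonal matrices before graph propagation — can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the projection matrices `Wq`/`Wk`, the softmax adjacency, and the exact placement of the diagonals `d1`/`d2` are all assumptions made for the sketch.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def input_dependent_gcn_layer(X, Wq, Wk, W, d1, d2):
    """One graph-convolution step with a data-dependent Laplacian (sketch).

    X      : (N, C)  vertex features (e.g. region descriptors)
    Wq, Wk : (C, C') projections used to build the adjacency from the input
    W      : (C, C_out) feature transform
    d1, d2 : (N,) the two attention diagonals re-weighting vertex
             relationships on either side of the adjacency (assumed form)
    """
    # Adjacency inferred from the input itself, not fixed a priori
    A = softmax((X @ Wq) @ (X @ Wk).T, axis=-1)       # (N, N)
    # Two attention diagonal matrices modulate vertex relationships
    A_tilde = np.diag(d1) @ A @ np.diag(d2)
    # Row-normalised propagation (a Laplacian-smoothing-style step)
    D = A_tilde.sum(axis=-1, keepdims=True) + 1e-8
    return np.maximum((A_tilde / D) @ X @ W, 0.0)     # ReLU activation
```

Because `A` is recomputed from `X` at every forward pass, the vertex relationships adapt to each input image, in contrast to a fixed graph Laplacian.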
Original language: English
Title of host publication: Proceedings of BMVC 2021
Publisher: British Machine Vision Association
Publication status: Accepted for publication - 15 Oct 2021
Event: 32nd British Machine Vision Conference - Online
Duration: 22 Nov 2021 - 25 Nov 2021


Conference: 32nd British Machine Vision Conference
Abbreviated title: BMVC 2021

