Shared-private information bottleneck method for cross-modal clustering

Xiaoqiang Yan, Yangdong Ye, Yiqiao Mao, Hui Yu

Research output: Contribution to journal › Article › peer-review


Abstract

Recently, cross-modal analysis has drawn much attention due to the rapid growth and widespread emergence of multimodal data. It integrates multiple modalities to improve learning and generalization performance. However, most previous methods focus only on learning a common shared feature space for all modalities and ignore the private information hidden in each individual modality. To address this problem, we propose a novel shared-private information bottleneck (SPIB) method for cross-modal clustering. First, we devise a hybrid words model and a consensus clustering model to construct the shared information of multiple modalities, which capture the statistical correlation of low-level features and the semantic relations of the high-level clustering partitions, respectively. Second, the shared information of multiple modalities and the private information of individual modalities are maximally preserved through a unified information maximization function. Finally, the optimization of the SPIB function is performed by a sequential “draw-and-merge” procedure, which guarantees that the function converges to a local maximum. In addition, to address the lack of tags in cross-modal social images, we investigate the use of structured prior knowledge in the form of a knowledge graph to enrich the information in the semantic modality, and we design a novel semantic similarity measure for social images. Experimental results on four types of cross-modal datasets demonstrate that our method outperforms state-of-the-art approaches.
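The sequential “draw-and-merge” procedure mentioned in the abstract follows the general scheme used in sequential information bottleneck optimization: each point is drawn out of its current cluster and merged back into whichever cluster most increases the objective, so the score never decreases and the procedure converges to a local maximum. The sketch below illustrates that generic scheme only; the `neg_within_cluster_ss` objective is a hypothetical placeholder standing in for the actual SPIB information maximization function, which the abstract does not specify.

```python
import numpy as np

def draw_and_merge(X, n_clusters, objective, n_sweeps=20, seed=0):
    """Generic sequential draw-and-merge clustering sketch.

    Each point is drawn out of its cluster and merged into the cluster
    that maximizes objective(X, labels). Because the current assignment
    is always among the candidates, every move is non-decreasing, so the
    score converges to a local maximum.
    """
    rng = np.random.default_rng(seed)
    labels = rng.integers(n_clusters, size=len(X))  # random initial partition
    for _ in range(n_sweeps):
        changed = False
        for i in rng.permutation(len(X)):
            old = labels[i]
            best_k, best_score = old, -np.inf
            for k in range(n_clusters):
                labels[i] = k                  # tentatively merge i into cluster k
                score = objective(X, labels)
                if score > best_score:
                    best_k, best_score = k, score
            labels[i] = best_k                 # keep the best merge
            changed = changed or (best_k != old)
        if not changed:                        # no point moved: local maximum reached
            break
    return labels

def neg_within_cluster_ss(X, labels):
    """Hypothetical placeholder objective (higher means tighter clusters)."""
    return -sum(((X[labels == k] - X[labels == k].mean(axis=0)) ** 2).sum()
                for k in np.unique(labels))

# Toy demo: two well-separated Gaussian blobs.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(6, 1, (20, 2))])
print(draw_and_merge(X, n_clusters=2, objective=neg_within_cluster_ss))
```

Swapping in the paper's unified shared-private objective in place of the placeholder would recover the described optimization; the draw-and-merge loop itself is objective-agnostic.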
Original language: English
Pages (from-to): 36045-36056
Number of pages: 12
Journal: IEEE Access
Volume: 7
DOIs
Publication status: Published - 12 Mar 2019
