On optimal low rank Tucker approximation for tensors: the case for an adjustable core size

Bilian Chen, Zhening Li, Shuzhong Zhang

Research output: Contribution to journal › Article › peer-review


Abstract

Approximating high-order tensors by low Tucker-rank tensors has applications in psychometrics, chemometrics, computer vision, and biomedical informatics, among others. Traditionally, solution methods for finding a low Tucker-rank approximation presume that the size of the core tensor is specified in advance, which may not be a realistic assumption in many applications. In this paper we propose a new computational model in which the configuration and the size of the core become part of the decisions to be optimized. Our approach is based on the so-called maximum block improvement method for non-convex block optimization. Numerical tests on various real data sets from gene expression analysis and image compression are reported, which show promising performance of the proposed algorithms.
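For readers unfamiliar with the setting, the sketch below illustrates a generic block-coordinate ("block improvement") scheme for Tucker approximation with a fixed, prescribed core size, using NumPy. It is not the authors' algorithm: the paper's contribution is to make the core size itself adjustable and part of the optimization, which this sketch does not attempt. The helper names `unfold` and `tucker_approx` are illustrative placeholders.

```python
# Minimal sketch (assumption: fixed core size given in advance), not the
# adjustable-core method of the paper. Each block update replaces one factor
# matrix while the others are held fixed, in the spirit of block improvement.
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding of a tensor into a matrix."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def tucker_approx(T, ranks, n_iter=50):
    """Alternating (block-coordinate) Tucker approximation with core size `ranks`."""
    # Initialize each factor from the leading left singular vectors of the unfolding.
    factors = [np.linalg.svd(unfold(T, n), full_matrices=False)[0][:, :r]
               for n, r in enumerate(ranks)]
    for _ in range(n_iter):
        for n in range(T.ndim):
            # Project T onto all factors except mode n, then refit factor n.
            G = T
            for m in range(T.ndim):
                if m != n:
                    G = np.moveaxis(np.tensordot(factors[m].T, G, axes=(1, m)), 0, m)
            U, _, _ = np.linalg.svd(unfold(G, n), full_matrices=False)
            factors[n] = U[:, :ranks[n]]
    # Core tensor: project T onto all factor matrices.
    core = T
    for m in range(T.ndim):
        core = np.moveaxis(np.tensordot(factors[m].T, core, axes=(1, m)), 0, m)
    return core, factors

# Usage: approximate a random 10 x 12 x 14 tensor with a 3 x 3 x 3 core.
T = np.random.rand(10, 12, 14)
core, factors = tucker_approx(T, ranks=(3, 3, 3))
```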
Original language: English
Pages (from-to): 811-832
Journal: Journal of Global Optimization
Volume: 62
Issue number: 4
Early online date: 16 Aug 2014
Publication status: Published - Aug 2015

Keywords

  • multiway array
  • Tucker decomposition
  • low-rank approximation
  • maximum block improvement
