Abstract
When a tensor is partitioned into subtensors, certain tensor norms of these subtensors form a tensor called a norm compression tensor. Norm compression inequalities for tensors concern the relation between the norm of this compressed tensor and the norm of the original tensor. We prove that for the tensor spectral norm, the norm of the compressed tensor is an upper bound for the norm of the original tensor. This result extends to a general class of tensor spectral norms. We discuss various applications of norm compression inequalities for tensors. These inequalities improve many existing bounds on tensor norms in the literature, in particular tightening the general bound of the tensor spectral norm via tensor partitions. We study the extremal ratio between the spectral norm and the Frobenius norm of a tensor space, provide a general way to estimate its upper bound, and in particular improve the current best upper bound for third-order nonnegative tensors and symmetric tensors. We also propose a faster approach to estimating the spectral norm of a large tensor or matrix via sequential norm compression inequalities, supported by theoretical and numerical evidence. For instance, the complexity of our algorithm for the matrix spectral norm is O(n^(2+ε)), where ε ranges from 0 to 1 depending on the partition, and the estimate correspondingly ranges from a close upper bound to the exact spectral norm.
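The matrix case of the compression bound is easy to check numerically. Below is a minimal NumPy sketch (not the paper's implementation) of the idea: partition a matrix into blocks, replace each block by its spectral norm to form the compression matrix, and compare its spectral norm with that of the original matrix. The block grid, matrix size, and helper name `compression_matrix` are illustrative assumptions.

```python
import numpy as np

def compression_matrix(A, row_splits, col_splits):
    """Return the matrix whose (i, j) entry is the spectral norm of block A_ij."""
    row_blocks = np.split(A, row_splits, axis=0)
    blocks = [np.split(rb, col_splits, axis=1) for rb in row_blocks]
    return np.array([[np.linalg.norm(B, 2) for B in row] for row in blocks])

rng = np.random.default_rng(0)
A = rng.standard_normal((200, 200))

# 2 x 2 block partition; the compression matrix is 2 x 2.
C = compression_matrix(A, row_splits=[100], col_splits=[100])

print(np.linalg.norm(A, 2))  # spectral norm of the original matrix
print(np.linalg.norm(C, 2))  # compression bound: always >= the line above
```

Applying such a compression step sequentially over successively coarser partitions, so that the full spectral norm is never computed on the large matrix itself, is the rough idea behind the cheaper estimates mentioned in the abstract; this sketch only illustrates a single compression step.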
| Original language | English |
| --- | --- |
| Article number | 11 |
| Number of pages | 27 |
| Journal | Calcolo |
| Volume | 57 |
| Issue number | 1 |
| Early online date | 18 Feb 2020 |
| DOIs | |
| Publication status | Published - Mar 2020 |
Keywords
- norm compression inequality
- spectral norm
- tensor partition
- block tensor
- rank-one approximation