t3toolbox.backend.tucker_tensor_train.dense_t3svd.ttsvd_dense#

t3toolbox.backend.tucker_tensor_train.dense_t3svd.ttsvd_dense(T: t3toolbox.backend.common.NDArray, min_ranks: t3toolbox.backend.common.typ.Sequence[int] = None, max_ranks: t3toolbox.backend.common.typ.Sequence[int] = None, rtol: float = None, atol: float = None, use_jax: bool = False) → t3toolbox.backend.common.typ.Tuple[t3toolbox.backend.common.typ.Tuple[t3toolbox.backend.common.NDArray, Ellipsis], t3toolbox.backend.common.typ.Tuple[t3toolbox.backend.common.NDArray, Ellipsis]]#

Compute the tensor train (TT) decomposition and the unfolding singular values of a dense tensor.

Parameters:
  • T (NDArray) – The dense tensor. shape=(N1, …, Nd)

  • min_ranks (typ.Sequence[int]) – Minimum TT-ranks for truncation. len=d+1. e.g., (1,3,3,3,1)

  • max_ranks (typ.Sequence[int]) – Maximum TT-ranks for truncation. len=d+1. e.g., (1,5,5,5,1)

  • rtol (float) – Relative tolerance for truncation.

  • atol (float) – Absolute tolerance for truncation.

  • use_jax (bool) – If True, use the JAX linear algebra backend; otherwise use NumPy. Default: False
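The interplay of rtol, atol, min_ranks, and max_ranks can be illustrated with a small rank-selection helper. This is a hedged sketch, not the library's actual rule: it assumes each TT-rank is chosen by thresholding the unfolding's singular values against the larger of the relative and absolute tolerances, then clamped to the [min_rank, max_rank] interval.

```python
import numpy as np

def select_rank(s, rtol=None, atol=None, min_rank=1, max_rank=None):
    """Sketch of per-unfolding rank selection (assumed behavior, not the
    library's exact rule): keep singular values above the tolerance, then
    clamp the resulting rank to [min_rank, max_rank]."""
    tol = 0.0
    if rtol is not None:
        tol = max(tol, rtol * s[0])   # relative to the largest singular value
    if atol is not None:
        tol = max(tol, atol)          # absolute floor
    r = int(np.sum(s > tol))          # values that survive truncation
    r = max(r, min_rank)              # enforce the minimum TT-rank
    if max_rank is not None:
        r = min(r, max_rank)          # cap at the maximum TT-rank
    return r

s = np.array([1.0, 0.5, 1e-2, 1e-5, 1e-9])
print(select_rank(s, rtol=1e-3))               # → 3 (drops values below 1e-3)
print(select_rank(s, rtol=1e-3, max_rank=2))   # → 2 (capped by max_rank)
```

Under this reading, min_ranks and max_ranks override the tolerance-based choice, which is why both the tolerances and the rank bounds can be passed together.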

Returns:

  • typ.Tuple[NDArray,…] – TT cores. len=d. elm_shape=(ri, ni, r(i+1))

  • typ.Tuple[NDArray,…] – Singular values of unfoldings. len=d+1. elm_shape=(ri,)
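For intuition, the decomposition can be sketched in a few lines of NumPy: sequentially reshape, SVD, truncate, and carry the remainder to the next mode. This is an illustrative sketch of the classic TT-SVD recursion, not this routine's actual source; the truncation rule here simply thresholds singular values against rtol times the tensor norm, and only interior unfolding singular values are collected (the library's second return value has d+1 entries).

```python
import numpy as np

def tt_svd_sketch(T, rtol=1e-10):
    """Illustrative TT-SVD (not the library's source): sequential truncated
    SVDs of the tensor's unfoldings, left to right."""
    d, shape = T.ndim, T.shape
    norm = np.linalg.norm(T)
    cores, svals = [], []
    C, r_prev = T.reshape(shape[0], -1), 1
    for i in range(d - 1):
        C = C.reshape(r_prev * shape[i], -1)       # i-th unfolding
        U, s, Vt = np.linalg.svd(C, full_matrices=False)
        r = max(1, int(np.sum(s > rtol * norm)))   # simple threshold rule
        svals.append(s)
        cores.append(U[:, :r].reshape(r_prev, shape[i], r))
        C = s[:r, None] * Vt[:r]                   # carry remainder rightward
        r_prev = r
    cores.append(C.reshape(r_prev, shape[-1], 1))  # last core absorbs the rest
    return cores, svals

rng = np.random.default_rng(0)
T = rng.standard_normal((4, 5, 6))
cores, svals = tt_svd_sketch(T)
T2 = np.einsum('aib,bjc,ckd->ijk', *cores)  # contract cores back together
print(np.allclose(T, T2))                   # → True (exact at negligible rtol)
```

Each core has shape (r_i, n_i, r_{i+1}) with boundary ranks r_0 = r_d = 1, matching the elm_shape documented above.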

See also

truncated_svd, tucker_svd_dense, t3_svd_dense, t3_svd

Examples

>>> import numpy as np
>>> import t3toolbox.t3svd as t3svd
>>> T0 = np.random.randn(40, 50, 60)
>>> c0 = 1.0 / np.arange(1, 41)**2
>>> c1 = 1.0 / np.arange(1, 51)**2
>>> c2 = 1.0 / np.arange(1, 61)**2
>>> T = np.einsum('ijk,i,j,k->ijk', T0, c0, c1, c2) # Preconditioned random tensor
>>> cores, ss = t3svd.ttsvd_dense(T, rtol=1e-3) # Truncate TT-SVD to reduce rank
>>> print([G.shape for G in cores])
[(1, 40, 13), (13, 50, 13), (13, 60, 1)]
>>> T2 = np.einsum('aib,bjc,ckd->ijk', cores[0], cores[1], cores[2])
>>> print(np.linalg.norm(T - T2) / np.linalg.norm(T)) # relative error is on the order of, typically somewhat above, rtol=1e-3
0.0023999063535883633