t3toolbox.backend.bcf_operations.orthogonal_representations#

t3toolbox.backend.bcf_operations.orthogonal_representations(x: Union[Tuple[Tuple[NDArray, ...], Tuple[NDArray, ...]], Tuple[NDArray, NDArray]], already_left_orthogonal: bool = False, squash: bool = True, use_jax: bool = False) → Union[Tuple[Tuple[Tuple[NDArray, ...], Tuple[NDArray, ...], Tuple[NDArray, ...], Tuple[NDArray, ...]], Tuple[Tuple[NDArray, ...], Tuple[NDArray, ...]]], Tuple[Tuple[NDArray, NDArray, NDArray, NDArray], Tuple[NDArray, NDArray]]]#

Construct base-variation representations of a TuckerTensorTrain with an orthogonal base.

Input TuckerTensorTrain:

          1 -- G0 -- G1 -- G2 -- G3 -- 1
X    =         |     |     |     |
               B0    B1    B2    B3
               |     |     |     |

Base-variation representation with non-orthogonal TT core H1:

          1 -- L0 -- H1 -- R2 -- R3 -- 1
X    =         |     |     |     |
               U0    U1    U2    U3
               |     |     |     |

Base-variation representation with non-orthogonal Tucker core V2:

          1 -- L0 -- L1 -- O2 -- R3 -- 1
X    =         |     |     |     |
               U0    U1    V2    U3
               |     |     |     |

The input TuckerTensorTrain x is defined by:
  • x_tucker_cores = (B0, B1, B2, B3)

  • x_tt_cores = (G0, G1, G2, G3)

The “base cores” are:
  • tucker_cores = (U0, U1, U2, U3), up orthogonal

  • left_tt_cores = (L0, L1, L2), left orthogonal

  • right_tt_cores = (R1, R2, R3), right orthogonal

  • outer_tt_cores = (O0, O1, O2, O3), down orthogonal

The “variation cores” are:
  • tucker_variations = (V0, V1, V2, V3)

  • tt_variations = (H0, H1, H2, H3)
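The orthogonality conditions on the base cores can be produced core by core. As a minimal NumPy sketch (independent of this library; `left_orthogonalize` is a hypothetical helper, not part of the t3toolbox API), left orthogonality of a TT core follows from a thin QR decomposition of its left unfolding:

```python
import numpy as np

# Hypothetical helper (not the t3toolbox API): left-orthogonalize a single
# TT core G of shape (r_left, n, r_right) via a thin QR of its left unfolding.
def left_orthogonalize(G):
    r_l, n, r_r = G.shape
    Q, R = np.linalg.qr(G.reshape(r_l * n, r_r))  # thin QR: Q has orthonormal columns
    L = Q.reshape(r_l, n, Q.shape[1])             # left orthogonal core
    return L, R                                   # R is absorbed into the next core

rng = np.random.default_rng(0)
G = rng.standard_normal((3, 5, 4))
L, R = left_orthogonalize(G)
# Left orthogonality: contracting L with itself over the left and physical
# indices yields the identity on the right index.
print(np.linalg.norm(np.einsum('iaj,iak->jk', L, L) - np.eye(L.shape[2])) < 1e-12)
# The core is unchanged up to the factor R, applied on the right index.
print(np.allclose(np.einsum('iaj,jk->iak', L, R), G))
```

The same pattern, with the QR applied to the right or outer unfolding instead, yields right orthogonal and down orthogonal cores; the non-orthogonal factor is absorbed into a neighbouring core, which is why the base-variation representations in the examples below still represent the original tensor.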

Parameters:
  • x (TuckerTensorTrain) – Input TuckerTensorTrain x = (x_tucker_cores, x_tt_cores), where x_tucker_cores = (B0, …, B(d-1)) and x_tt_cores = (G0, …, G(d-1))

  • already_left_orthogonal (bool) – If True, the TT cores of x are assumed to already be left orthogonal. Default: False

  • squash (bool) – Default: True

  • use_jax (bool) – If True, use JAX as the linear algebra backend instead of NumPy. Default: False

Returns:

  • T3Base – Orthogonal base for base-variation representations of x.

  • T3Variation – Variation for base-variation representations of x.

Examples

>>> import numpy as np
>>> import t3toolbox.tucker_tensor_train as t3
>>> import t3toolbox.basis_coordinates_format as bcf
>>> x = t3.t3_corewise_randn((14,15,16), (4,5,6), (1,3,2,1))
>>> base, variation = bcf.orthogonal_representations(x) # Compute orthogonal representations
>>> tucker_cores, left_tt_cores, right_tt_cores, outer_tt_cores = base
>>> tucker_vars, tt_vars = variation
>>> (U0,U1,U2) = tucker_cores
>>> (L0,L1,L2) = left_tt_cores
>>> (R0,R1,R2) = right_tt_cores
>>> (O0,O1,O2) = outer_tt_cores
>>> (V0,V1,V2) = tucker_vars
>>> (H0,H1,H2) = tt_vars
>>> x2 = ((U0,U1,U2), (L0,H1,R2)) # representation with TT variation in index 1
>>> print(np.linalg.norm(t3.t3_to_dense(x) - t3.t3_to_dense(x2))) # Still represents original tensor
4.978421562425667e-12
>>> x3 = ((U0,V1,U2), (L0,O1,R2)) # representation with Tucker variation in index 1
>>> print(np.linalg.norm(t3.t3_to_dense(x) - t3.t3_to_dense(x3))) # Still represents original tensor
5.4355175448533146e-12
>>> print(np.linalg.norm(U1 @ U1.T - np.eye(U1.shape[0]))) # U: orthogonal
1.1915111872574236e-15
>>> print(np.linalg.norm(np.einsum('iaj,iak->jk', L1, L1) - np.eye(L1.shape[2]))) # L: left orthogonal
9.733823879665448e-16
>>> print(np.linalg.norm(np.einsum('iaj,kaj->ik', R1, R1) - np.eye(R1.shape[0]))) # R: right orthogonal
8.027553546330097e-16
>>> print(np.linalg.norm(np.einsum('iaj,ibj->ab', O1, O1) - np.eye(O1.shape[1]))) # O: outer orthogonal
1.3870474292323159e-15

Example where r0 and rd are not 1:

>>> import numpy as np
>>> import t3toolbox.tucker_tensor_train as t3
>>> import t3toolbox.basis_coordinates_format as bcf
>>> x = t3.t3_corewise_randn((14,15,16), (4,5,6), (2,3,2,2))
>>> base, variation = bcf.orthogonal_representations(x) # Compute orthogonal representations
>>> tucker_cores, left_tt_cores, right_tt_cores, outer_tt_cores = base
>>> tucker_vars, tt_vars = variation
>>> (U0,U1,U2) = tucker_cores
>>> (L0,L1,L2) = left_tt_cores
>>> (R0,R1,R2) = right_tt_cores
>>> (O0,O1,O2) = outer_tt_cores
>>> (V0,V1,V2) = tucker_vars
>>> (H0,H1,H2) = tt_vars
>>> x2 = ((U0,U1,U2), (L0,H1,R2)) # representation with TT variation in index 1
>>> print(np.linalg.norm(t3.t3_to_dense(x) - t3.t3_to_dense(x2))) # Still represents original tensor
2.5341562994067855e-12
>>> x3 = ((V0,U1,U2), (O0,R1,R2)) # representation with Tucker variation in index 0
>>> print(np.linalg.norm(t3.t3_to_dense(x) - t3.t3_to_dense(x3))) # Still represents original tensor
2.9206090606788446e-12
>>> print(np.linalg.norm(U0 @ U0.T - np.eye(U0.shape[0]))) # U: orthogonal
1.675264510304594e-15
>>> print(np.linalg.norm(np.einsum('iaj,iak->jk', L0, L0) - np.eye(L0.shape[2]))) # L: left orthogonal
9.046146325204653e-16
>>> print(np.linalg.norm(np.einsum('iaj,kaj->ik', R2, R2) - np.eye(R2.shape[0]))) # R: right orthogonal
1.1775693440128312e-16
>>> print(np.linalg.norm(np.einsum('iaj,ibj->ab', O0, O0) - np.eye(O0.shape[1]))) # O: outer orthogonal
1.2300840868850519e-15