SIAGA: Tensors

Seven pictures from Applied Algebra and Geometry: Picture #6

The Society for Industrial and Applied Mathematics (SIAM) has recently launched a journal of Applied Algebra and Geometry, called SIAGA. See the journal's website for more information. The journal will start taking online submissions on March 23rd.

The poster for the journal features seven pictures. In this penultimate blog post I will talk about the sixth picture, on the subject of Tensors. In the first section of this post, “The Context”, I’ll set the mathematical scene. In the second section, “The Picture”, I’ll talk about this particular image.

[Figure: the sixth picture from the SIAGA poster, a Rubik's cube representing a 3 \times 3 \times 3 tensor]

The Context

Tensors are the higher-dimensional analogues of matrices. They are data arrays with three or more dimensions, represented as an array of size n_1 \times \cdots \times n_d, where n_k is the number of ‘rows’ in the kth direction of the array. The entries of the tensor A are denoted by A_{i_1 \ldots i_d}, where i_k \in \{ 1, \ldots, n_k \} tells you which row in the kth direction you are looking at. Just as for a matrix, the entries of a tensor are elements of some field, for example the real or complex numbers.
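To make the indexing concrete, here is a minimal sketch in Python with numpy; the sizes are arbitrary choices for illustration.

```python
import numpy as np

# A 2 x 3 x 4 tensor (d = 3, n_1 = 2, n_2 = 3, n_3 = 4),
# represented as a numpy array with random real entries.
A = np.random.rand(2, 3, 4)

# The entry A_{i_1 i_2 i_3}: numpy indices are 0-based, so A[0, 1, 2]
# is the entry with i_1 = 1, i_2 = 2, i_3 = 3 in the 1-based
# notation of the text.
print(A[0, 1, 2])

# Fixing one index gives a slice, here a 3 x 4 matrix.
print(A[0, :, :].shape)  # (3, 4)
```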

Tensors occur naturally when it makes sense to organize data by more than two indices. For example, if we have a function f(x,y,z) that depends on three or more discretized inputs, where x \in \{ x_1, \ldots, x_{n_1} \}, y \in \{ y_1, \ldots, y_{n_2} \} and z \in \{ z_1, \ldots, z_{n_3} \}, then we can organize the values A_{ijk} = f(x_i,y_j,z_k) into a tensor of size n_1 \times n_2 \times n_3. Tensors are used in a growing range of applications, especially signal processing, where the uniqueness of a tensor’s decomposition allows the different signals comprising a mixture to be recovered. They have also been used in machine learning, genomics, geometric complexity theory and statistics.
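As a sketch of this construction, with a made-up function f and made-up grids:

```python
import numpy as np

# Hypothetical function of three inputs, for illustration only.
def f(x, y, z):
    return x * y + z

xs = np.linspace(0.0, 1.0, 4)  # x_1, ..., x_{n_1}, with n_1 = 4
ys = np.linspace(0.0, 1.0, 5)  # n_2 = 5
zs = np.linspace(0.0, 1.0, 6)  # n_3 = 6

# A_{ijk} = f(x_i, y_j, z_k), built in one shot with broadcasting.
A = f(xs[:, None, None], ys[None, :, None], zs[None, None, :])
print(A.shape)  # (4, 5, 6)
```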

Our data analysis techniques are currently limited to a matrix-centric perspective. To overcome this, there has been a tremendous effort to extend the well-understood properties of matrices to the higher-dimensional world of tensors. A greater understanding of tensors paves the way for exciting new developments that cater to the natural structure of tensor-based data, for example in experimental design or confounding factor analysis. This analysis and understanding rests on interesting and complicated geometry.

One requirement for working with a tensor computationally is a good low-rank approximation. Tensors of size n_1 \times \cdots \times n_d have n_1 \cdots n_d entries and, for applications, this quickly becomes unreasonably large. Matrices can be analyzed via their singular value decomposition, and the best rank-r approximation is obtained directly from it by truncating at the rth largest singular value. We can extend many of the useful notions from linear algebra to tensors: we have eigenvectors and singular vectors of tensors, and a higher-order version of the singular value decomposition.
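Here is a sketch of both ideas in numpy: first the matrix case, where the best rank-r approximation comes from truncating the SVD, and then one common construction of the higher-order SVD, which computes an SVD of each ‘unfolding’ of the tensor (the matrix obtained by flattening all but one direction). This is an illustration under those conventions, not the only way to extend the SVD.

```python
import numpy as np

# Matrix case: truncate the SVD after the r largest singular values
# to get the best rank-r approximation (Eckart-Young).
M = np.random.rand(8, 6)
U, s, Vt = np.linalg.svd(M, full_matrices=False)
r = 2
M_r = U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]
print(np.linalg.norm(M - M_r))  # approximation error

# Tensor case: a higher-order SVD. For each direction k, take the
# left singular vectors U_k of the mode-k unfolding of A.
A = np.random.rand(3, 4, 5)
factors = []
for k in range(A.ndim):
    unfolding = np.moveaxis(A, k, 0).reshape(A.shape[k], -1)
    Uk, _, _ = np.linalg.svd(unfolding, full_matrices=False)
    factors.append(Uk)

# Core tensor: multiply A by U_k^T along each direction. Truncating
# the columns of each U_k would give a low multilinear rank approximation.
core = A
for k, Uk in enumerate(factors):
    core = np.moveaxis(np.tensordot(Uk.T, np.moveaxis(core, k, 0), axes=1), 0, k)
print(core.shape)  # (3, 4, 5)
```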


The Picture

As well as being a picture of the well-known Rubik’s cube, this image is a cartoon of a tensor of size 3 \times 3 \times 3. Such a tensor consists of 27 entries.

To understand the structure contained in a tensor, we use its natural symmetry group to find a presentation of it that is simple and structurally transparent. This motivation also underlies the Rubik’s puzzle, although the symmetries are quite different: a change of basis transformation in the tensor case, and a permutation of pieces in the case of the puzzle.
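For the tensor side, the change-of-basis action is multilinear: a matrix g_k acts along the kth direction of the array. A minimal sketch, with random matrices (almost surely invertible) standing in for a change of basis in each direction:

```python
import numpy as np

A = np.random.rand(3, 3, 3)
g1, g2, g3 = (np.random.rand(3, 3) for _ in range(3))

# B_{abc} = sum_{i,j,k} (g1)_{ai} (g2)_{bj} (g3)_{ck} A_{ijk},
# i.e. the action of (g1, g2, g3) on the tensor A.
B = np.einsum('ai,bj,ck,ijk->abc', g1, g2, g3, A)
print(B.shape)  # (3, 3, 3)
```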

Despite being small, a 3 \times 3 \times 3 tensor has interesting geometry. It is known that a generic tensor of size 3 \times 3 \times 3 has seven eigenvectors in \mathbb{P}^2. In the paper “Eigenconfigurations of Tensors” by Abo, Seigal and Sturmfels, we show that any configuration of seven eigenvectors can arise, provided no six of the seven points lie on a conic.
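Concretely, a point x \in \mathbb{P}^2 is an eigenvector of the tensor A if \sum_{j,k} A_{ijk} x_j x_k = \lambda x_i for some scalar \lambda. The sketch below runs a power-method-style iteration on this quadratic map; for many tensors it converges to one of the eigenvectors, though it may fail to converge and certainly does not find all seven.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3, 3))  # a random 3 x 3 x 3 tensor

def apply(A, x):
    # (A x x)_i = sum_{j,k} A_{ijk} x_j x_k
    return np.einsum('ijk,j,k->i', A, x, x)

# Power-method-style iteration: repeatedly apply the map and normalize.
x = rng.standard_normal(3)
x /= np.linalg.norm(x)
for _ in range(500):
    y = apply(A, x)
    x = y / np.linalg.norm(y)

lam = x @ apply(A, x)  # eigenvalue estimate, since ||x|| = 1
print(x, lam)
print(np.allclose(apply(A, x), lam * x, atol=1e-8))  # True if converged
```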


One thought on “SIAGA: Tensors”

  1. David Scott March 15, 2016 / 8:57 pm

    Using a Rubik’s Cube is an interesting way to introduce tensors. A little easier than how I remember first learning about them… I think it was in a thermodynamics class, or maybe materials science. Yeah, I think that was it: they used a tensor to describe all of the stress vectors going on inside the material. I was like, there are vectors inside the matrix? It kind of reminded me of when I first learned about imaginary numbers.
