Outline of the Course:
This course is an introduction to tensor computations, focusing on the theory, algorithms, and applications of tensor decompositions in data science. In the era of big data, artificial intelligence, and machine learning, we increasingly need to process multiway (tensor-shaped) data. Such data are typically of order three or higher, and their sizes can reach billions of entries. Huge volumes of multi-dimensional data pose a great challenge for processing and analysis: a matrix representation alone cannot capture all the information content of multiway data arising in different fields. Moreover, the importance of being able to decompose a tensor is (at least) two-fold. First, the decomposition reveals hidden structure in the data at hand, and second, a concise decomposition allows us to store the tensor much more efficiently. The course will provide an understanding of tensor operations and decompositions, with applications including image deblurring, image compression, neural networks, and solving high-dimensional partial differential equations. The emphasis of the course is on a deep understanding of tensor computations, with homework geared toward the application and analysis of these methods. The midterm and final exams will test your understanding of the advantages and limitations of these computations.
Fundamentals: Basic matrix concepts: norms, rank, trace, inner products, the Kronecker product, and matrix similarity. The fast Fourier transform and diagonalization of matrices. Toeplitz and circulant matrices and their properties (eigenvalues and eigenvectors), block matrix computations, and warm-up algorithms.
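As a warm-up illustrating the circulant/FFT connection listed above, here is a minimal NumPy sketch (the variable names are illustrative, not from the course materials): every circulant matrix is diagonalized by the discrete Fourier transform, so a circulant matrix-vector product is a circular convolution computable with the FFT.

```python
import numpy as np

# A circulant matrix is fully determined by its first column c:
# each subsequent column is a cyclic downward shift of the previous one.
c = np.array([4.0, 1.0, 3.0, 2.0])
n = len(c)
C = np.column_stack([np.roll(c, k) for k in range(n)])

# Circulant matrices are diagonalized by the DFT: the eigenvalues of C
# are fft(c), and hence C @ x = ifft(fft(c) * fft(x)).
x = np.array([1.0, -2.0, 0.5, 3.0])
y_direct = C @ x
y_fft = np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)).real

assert np.allclose(y_direct, y_fft)
```

The FFT route costs O(n log n) instead of the O(n^2) of a dense matrix-vector product, which is why circulant structure matters for large problems such as image deblurring.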
Introduction to Tensors: Tensors and tensor operations: the mode-n product of a tensor, the Kronecker product of two tensors, the tensor element-wise product, the tensor trace, tensor convolution, the quantitative tensor product, the Khatri-Rao product, and the outer product. The Einstein product and the t-product. Explicit examples include the identity tensor, symmetric tensors, orthogonal tensors, tensor rank, and block tensors.
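To make the mode-n product concrete, a small NumPy sketch (function and variable names are illustrative): the mode-n product of a tensor X with a matrix U contracts mode n of X against the columns of U, which is equivalent to multiplying the mode-n unfolding of X by U on the left.

```python
import numpy as np

def mode_n_product(X, U, n):
    """Mode-n product X x_n U: contracts mode n of X with the columns of U.
    If X has shape (I0, ..., In, ...), the result replaces In by U.shape[0]."""
    # tensordot contracts axis n of X with axis 1 (columns) of U;
    # moveaxis puts the new mode back into position n.
    return np.moveaxis(np.tensordot(X, U, axes=(n, 1)), -1, n)

X = np.arange(24.0).reshape(2, 3, 4)
U = np.random.default_rng(0).standard_normal((5, 3))
Y = mode_n_product(X, U, 1)
assert Y.shape == (2, 5, 4)

# Sanity check against the unfolding definition: Y_(n) = U @ X_(n).
X1 = np.moveaxis(X, 1, 0).reshape(3, -1)   # mode-1 unfolding of X
Y1 = np.moveaxis(Y, 1, 0).reshape(5, -1)   # mode-1 unfolding of Y
assert np.allclose(Y1, U @ X1)
```

The unfolding identity in the last two lines is the standard way mode-n products are defined and verified.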
Tensor Decomposition: Block tensor decomposition, the Canonical Polyadic (CP) decomposition, the Tucker decomposition, the multilinear singular value decomposition (the higher-order SVD, or HOSVD), the hierarchical Tucker (HT) decomposition, and the tensor-train (TT) decomposition. Eigenvalue and singular value decompositions via the t-product and the Einstein product. The truncated tensor singular value decomposition. Tensor inversion and the Moore-Penrose inverse. Tensor powers, and solving systems of multilinear equations.
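Of the decompositions listed, the HOSVD is the most direct to sketch in NumPy, since it reduces to one matrix SVD per mode (the helper names below are illustrative, not a prescribed implementation): each factor matrix holds the left singular vectors of a mode-n unfolding, and the core tensor is obtained by projecting onto those factors.

```python
import numpy as np

def mode_product(T, M, n):
    # Mode-n product T x_n M, via tensordot plus an axis move.
    return np.moveaxis(np.tensordot(T, M, axes=(n, 1)), -1, n)

def hosvd(X):
    """Higher-order SVD: X = S x_0 U0 x_1 U1 x_2 U2, where Un holds the
    left singular vectors of the mode-n unfolding of X."""
    factors = []
    for n in range(X.ndim):
        Xn = np.moveaxis(X, n, 0).reshape(X.shape[n], -1)  # mode-n unfolding
        U, _, _ = np.linalg.svd(Xn, full_matrices=False)
        factors.append(U)
    S = X
    for n, U in enumerate(factors):
        S = mode_product(S, U.T, n)    # core tensor: S = X x_n Un^T
    return S, factors

rng = np.random.default_rng(1)
X = rng.standard_normal((4, 5, 6))
S, factors = hosvd(X)

# With orthogonal factors kept at full rank, reconstruction is exact.
Y = S
for n, U in enumerate(factors):
    Y = mode_product(Y, U, n)
assert np.allclose(X, Y)
```

Truncating each factor matrix to its leading columns turns this into the truncated HOSVD used for low-rank tensor approximation later in the course.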
Applications of Tensor Decompositions: Low-rank tensor approximation, background removal via robust principal tensor component analysis, image deblurring, image compression, compressed sensing with robust regression, higher-order statistical moments for anomaly detection, and solving elliptic partial differential equations.
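The storage payoff behind low-rank approximation and image compression can be seen already in the matrix case, which the tensor decompositions above generalize. A minimal sketch (the synthetic "image" and all names are illustrative): a rank-k matrix of size m x n can be stored with k(m + n + 1) numbers instead of mn.

```python
import numpy as np

# Truncated SVD as a stand-in for low-rank compression: a synthetic
# "image" that is exactly rank 3 is stored with far fewer numbers.
rng = np.random.default_rng(2)
A = rng.standard_normal((100, 3)) @ rng.standard_normal((3, 120))  # rank-3

U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 3
A_k = (U[:, :k] * s[:k]) @ Vt[:k, :]       # rank-k reconstruction

assert np.allclose(A, A_k)                  # exact, since rank(A) = 3
full_storage = A.size                       # 100 * 120 = 12000 numbers
low_rank_storage = k * (A.shape[0] + A.shape[1] + 1)   # 3 * 221 = 663
assert low_rank_storage < full_storage
```

Real images are only approximately low-rank, so in practice k trades reconstruction error against storage; truncated tensor decompositions make the same trade across all modes at once.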
Tensors for Deep Neural Networks: Deep neural networks, tensor networks, and their decompositions, including the CP, Tucker, hierarchical Tucker, tensor-train, tensor-ring, and transform-based tensor decompositions.
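The core idea behind these network compressions can be sketched in the simplest (matrix-rank) case; the dimensions and names below are illustrative, and CP/Tucker/TT decompositions extend the same trick to higher-order weight tensors. A dense layer y = Wx with a low-rank W can be replaced by two smaller layers y = A(Bx) with identical output and far fewer parameters.

```python
import numpy as np

# A dense layer y = W @ x with W of shape (out, in) factors into two
# thinner layers y = A @ (B @ x) when W has rank r << min(out, in).
rng = np.random.default_rng(3)
out_dim, in_dim, r = 256, 512, 16
A = rng.standard_normal((out_dim, r))
B = rng.standard_normal((r, in_dim))
W = A @ B                                   # rank-r weight matrix

x = rng.standard_normal(in_dim)
assert np.allclose(W @ x, A @ (B @ x))      # identical layer output

params_dense = W.size                       # 256 * 512 = 131072
params_factored = A.size + B.size           # 16 * (256 + 512) = 12288
assert params_factored < params_dense
```

Tensor-train and tensor-ring formats apply this factorization repeatedly along a chain of modes, which is what makes very large layers tractable to store and train.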
References:
Marking Scheme:
For more details, please see the following links: