DS 285

Tensor Computations for Data Science (January 2025)

Course Outline

This course is an introduction to tensor computations, focusing on the theory, algorithms, and applications of tensor decompositions in data science. In the era of big data, artificial intelligence, and machine learning, we increasingly need to process multiway (tensor-shaped) data. Such data are typically of order three or higher, with sizes that can reach billions of entries. Processing and analyzing these volumes of multi-dimensional data is a major challenge, and matrix representations alone cannot capture all the information content of multiway data arising in different fields. Moreover, the importance of being able to decompose a tensor is (at least) two-fold: first, the decomposition reveals hidden structure in the data at hand, and second, a concise decomposition allows the tensor to be stored much more efficiently. The course will develop tensor operations and decompositions together with a range of applications, including image deblurring, image compression, neural networks, and the solution of high-dimensional partial differential equations. The emphasis of the course is on a deep understanding of tensor computations, with homework geared towards the application and analysis of these methods. The midterm and final exam will test your understanding of the advantages and limitations of these computations.

Topics Covered

Fundamentals
Basic matrix concepts: norms, rank, trace, inner products, the Kronecker product, and matrix similarity. The fast Fourier transform and diagonalization of matrices. Toeplitz and circulant matrices and their spectral properties (eigenvalues and eigenvectors), block matrix computations, and warm-up algorithms.
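As a small warm-up illustration (a minimal sketch assuming NumPy and SciPy are available), the diagonalization of circulant matrices by the discrete Fourier transform lets a circulant matrix-vector product be computed with the FFT:

    import numpy as np
    from scipy.linalg import circulant

    # A circulant matrix is diagonalized by the DFT, so multiplying by it
    # reduces to elementwise scaling of Fourier coefficients.
    rng = np.random.default_rng(0)
    c = rng.standard_normal(8)          # first column of the circulant matrix
    x = rng.standard_normal(8)

    C = circulant(c)                    # dense 8 x 8 circulant matrix
    direct = C @ x                      # O(n^2) matrix-vector product
    via_fft = np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)).real   # O(n log n)

    print(np.allclose(direct, via_fft))   # True: the eigenvalues of C are fft(c)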
Introduction to Tensors
Tensors and tensor operations: the mode-n product of a tensor with a matrix, the Kronecker product of two tensors, the element-wise tensor product, the tensor trace, tensor convolution, the quantitative tensor product, the Khatri-Rao product, and the outer product. The Einstein product and the t-product. Explicit examples include the identity tensor, symmetric tensors, orthogonal tensors, tensor rank, and block tensors.
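For concreteness, a minimal NumPy sketch of the mode-n product, computed here via the mode-n unfolding (the function name is only illustrative):

    import numpy as np

    def mode_n_product(X, A, n):
        # Multiply tensor X by matrix A (of size J x I_n) along mode n:
        # move mode n to the front, flatten the remaining modes, multiply, fold back.
        Xn = np.moveaxis(X, n, 0).reshape(X.shape[n], -1)        # mode-n unfolding
        new_shape = (A.shape[0],) + tuple(np.delete(X.shape, n))
        return np.moveaxis((A @ Xn).reshape(new_shape), 0, n)

    X = np.random.rand(3, 4, 5)
    A = np.random.rand(6, 4)
    Y = mode_n_product(X, A, 1)                                  # shape (3, 6, 5)
    print(Y.shape, np.allclose(Y, np.einsum('ijk,lj->ilk', X, A)))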
Tensor Decomposition
Block tensor decomposition, the canonical polyadic (CP) decomposition, the Tucker decomposition, the multilinear singular value decomposition (higher-order SVD, or HOSVD), the hierarchical Tucker (HT) decomposition, and the tensor-train (TT) decomposition. Eigenvalue and singular value decompositions via the t-product and the Einstein product. Truncated tensor singular value decomposition. Tensor inversion and the Moore-Penrose inverse. Power tensors and solving systems of multilinear equations.
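As one concrete instance, a minimal NumPy sketch of a truncated higher-order SVD: the factor matrices come from SVDs of the mode-n unfoldings, and the core is the tensor multiplied by the transposed factors in every mode.

    import numpy as np

    def unfold(X, n):
        # Mode-n unfolding: mode n becomes the row index.
        return np.moveaxis(X, n, 0).reshape(X.shape[n], -1)

    def hosvd(X, ranks):
        # Leading left singular vectors of each unfolding give the factor matrices.
        U = [np.linalg.svd(unfold(X, n), full_matrices=False)[0][:, :r]
             for n, r in enumerate(ranks)]
        G = X
        for n, Un in enumerate(U):      # core G = X x_1 U1^T x_2 U2^T x_3 U3^T
            G = np.moveaxis(np.tensordot(Un.T, np.moveaxis(G, n, 0), axes=1), 0, n)
        return G, U

    X = np.random.rand(10, 12, 14)
    G, U = hosvd(X, ranks=(5, 5, 5))
    Xhat = G
    for n, Un in enumerate(U):          # reconstruct Xhat = G x_1 U1 x_2 U2 x_3 U3
        Xhat = np.moveaxis(np.tensordot(Un, np.moveaxis(Xhat, n, 0), axes=1), 0, n)
    # Relative error (large here, since a random tensor has no low-rank structure).
    print(G.shape, np.linalg.norm(X - Xhat) / np.linalg.norm(X))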
Applications of Tensor Decompositions
Low-rank tensor approximation, background removal via robust principal tensor component analysis, image deblurring, image compression, compressed sensing with robust regression, higher-order statistical moments for anomaly detection, and solving elliptic partial differential equations.
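To make the storage gain from a low-rank representation concrete, a back-of-the-envelope sketch (the image size and multilinear rank below are hypothetical):

    # Entries stored for a raw 1024 x 768 x 3 RGB image versus a Tucker model
    # with multilinear rank (50, 50, 3): core tensor plus three factor matrices.
    dims  = (1024, 768, 3)
    ranks = (50, 50, 3)

    full_storage   = dims[0] * dims[1] * dims[2]              # 2,359,296 entries
    core_storage   = ranks[0] * ranks[1] * ranks[2]           # Tucker core
    factor_storage = sum(I * r for I, r in zip(dims, ranks))  # factor matrices

    tucker_storage = core_storage + factor_storage            # 97,109 entries
    print(full_storage, tucker_storage, full_storage / tucker_storage)   # ~24x smaller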
Tensors for Deep Neural Networks
Deep neural networks, tensor networks, and their decompositions, including the CP decomposition, the Tucker decomposition, the hierarchical Tucker decomposition, the tensor-train and tensor-ring decompositions, and transform-based tensor decompositions.
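As a rough indication of why such factorizations help in deep learning, a minimal parameter-count sketch (the layer size, mode factorization, and TT-ranks below are hypothetical):

    # Parameters of a dense 1024 x 1024 layer versus a TT-matrix ("TT layer")
    # with input/output modes 4 x 4 x 4 x 4 x 4 and all internal TT-ranks equal to 8.
    in_modes, out_modes, rank = [4] * 5, [4] * 5, 8

    dense_params = 1024 * 1024
    tt_ranks = [1] + [rank] * (len(in_modes) - 1) + [1]        # boundary ranks are 1
    tt_params = sum(tt_ranks[k] * m * n * tt_ranks[k + 1]
                    for k, (m, n) in enumerate(zip(in_modes, out_modes)))

    print(dense_params, tt_params, dense_params / tt_params)   # ~315x fewer parameters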

References

  • 1) Y. Liu (Ed.). Tensors for Data Processing: Theory, Methods, and Applications. Academic Press, 2021.
  • 2) Y. Liu, J. Liu, Z. Long, and C. Zhu. Tensor Computation for Data Analysis. Springer, 2022.
  • 3) T. G. Kolda and B. W. Bader. Tensor decompositions and applications. SIAM Review, 51(3):455-500, 2009.
  • 4) C. D. Martin, R. Shafer, and B. LaRue. An order-p tensor factorization with applications in imaging. SIAM Journal on Scientific Computing, 35(1):A474-A490, 2013.
  • 5) M. Brazell, N. Li, C. Navasca, et al. Solving multilinear systems via tensor inversion. SIAM Journal on Matrix Analysis and Applications, 34(2):542-570, 2013.
  • 6) Y. Ji, Q. Wang, X. Li, and J. Liu. A survey on tensor techniques and applications in machine learning. IEEE Access, 7:162950-162990, 2019.

Marking Scheme

  • a) Assignments/Quizzes/Homework/Presentation (to be done individually) - 50%
  • b) Midterm Exam - 10%
  • c) Final Exam - 20%
  • d) Final Project (to be done individually) - 20%

Course Resources