Department of Computational and Data Sciences
Department Seminar
Speaker : Prof. Predrag S. Stanimirovic, University of Nis, Serbia
Title : Continuous-time gradient neural networks for solving linear, Sylvester and Stein matrix equations.
Date & Time: February 27th, 2026 (Friday), 10:00 AM
Venue : # 102, CDS Seminar Hall
ABSTRACT
The dynamics of the gradient neural network (GNN) for solving matrix equations evolve based on an appropriately defined error monitoring matrix (EMM), expressed in terms of an unknown state-variable matrix that approximates the exact solution. The objective function within the GNN framework is defined using the Frobenius norm. The linear GNN is designed to force the objective function to zero, following a continuous-time analogue of gradient descent methods for solving nonlinear optimization problems. The GNN model achieves exponential and global convergence, starting from an arbitrary initial point. Three modifications of the traditional GNN continuous-time flow are considered.
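The classical GNN flow described above can be illustrated with a minimal sketch (not the speaker's code): for the linear matrix equation A X = B, take the EMM E(t) = A X(t) - B, the objective f(X) = ||E||_F^2 / 2, and the flow dX/dt = -gamma * A^T E, integrated here with forward Euler. The gain gamma, the step size h, and the test matrices are illustrative assumptions.

```python
import numpy as np

def gnn_solve(A, B, gamma=1.0, h=1e-3, steps=20000):
    """Integrate the GNN flow dX/dt = -gamma * A.T @ (A @ X - B) with Euler steps."""
    X = np.zeros((A.shape[1], B.shape[1]))   # arbitrary initial state
    for _ in range(steps):
        E = A @ X - B                        # error monitoring matrix E(t)
        X = X - h * gamma * (A.T @ E)        # step along the negative gradient of f
    return X

A = np.array([[2.0, 1.0], [1.0, 3.0]])
B = np.eye(2)
X = gnn_solve(A, B)
print(np.linalg.norm(A @ X - B))             # residual decays toward zero
```

For a nonsingular A and a sufficiently small step h * gamma (below 2 divided by the largest eigenvalue of A^T A), the discrete iteration converges to the exact solution, mirroring the exponential convergence of the continuous flow.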
The first approach utilizes a modified error function designed to compute the best approximate solution. This method is particularly effective for ill-conditioned input matrices. The proposed best approximate GNN (BGNN) dynamics are formulated as a GNN-based dynamical system defined over an appropriately modified error monitoring matrix. Lyapunov stability theory indicates that the convergence of the BGNN design is closely associated with the exact solution of a generalized Sylvester equation. The second modified version, termed GGNN, applies the GNN method to minimize the gradient of the classical EMM. The third modification to the GNN design draws inspiration from discrete conjugate gradient (CG) iterations commonly applied in unconstrained nonlinear optimization. The central concept is to exploit a novel EMM that incorporates momentum information from the previous time instance, following the principles established in CG methods. The resulting CGGNN dynamics are formulated using a modified EMM and an enhanced dynamical flow, both defined in analogy with CG methods.
Simulation examples are conducted on a subset of ill-conditioned matrices selected from the Gallery test matrices in MATLAB. The defined methods are implemented and applied to solving linear matrix equations, the Sylvester equation, and the Stein equation.
BIO: Predrag S. Stanimirovic received his Ph.D. in Computer Science at the University of Nis, Serbia. He is a full Professor at the University of Nis, Faculty of Sciences and Mathematics, Department of Computer Science, Nis, Serbia. He has thirty-six years of experience in scientific research in diverse fields of mathematics and computer science, spanning multiple branches of numerical linear algebra, recurrent neural networks, linear algebra, nonlinear optimization, symbolic computation, and others. His main research topics include Numerical Linear Algebra, Operations Research, Recurrent Neural Networks, and Symbolic Computation.
Prof. Predrag S. Stanimirovic has published over 390 publications in scientific journals, including seven research monographs, six textbooks, and over 80 peer-reviewed research articles in conference proceedings and book chapters. He is an editorial board member of more than 20 scientific journals, 5 of which belong to the Journal Citation Report (JCR) list. Currently, he is a section editor of the scientific journals Electronic Research Archive (ERA), Filomat, Journal of Mathematics, Contemporary Mathematics (CM), Facta Universitatis, Series: Mathematics and Informatics, and several other journals. He is included in the World Rank List of the top 2% of authors for 2021, 2022, and 2023.
Host Faculty: Dr. Ratikanta Behera
ALL ARE WELCOME