Seminar @ CDS: #102, June 25th, 10:30: “Modifications of continuous-time gradient-based recurrent neural networks”

When

25 Jun 2025
10:30 AM - 11:30 AM

Event Type

Department of Computational and Data Sciences
Department Seminar


Speaker : Prof. Predrag S. Stanimirovic
Title   : “Modifications of continuous-time gradient-based recurrent neural networks”
Date & Time: June 25th, 2025 (Wednesday), 10:30 AM
Venue : #102, CDS Seminar Hall


ABSTRACT

Our motivation is to create innovative dynamical systems based on gradient neural networks (GNNs) for solving the general linear matrix equation AXB = D. The first modification is a GNN model inspired by discrete conjugate gradient (CG) iterations for unconstrained nonlinear optimization. The resulting CGGNN dynamics are developed using a modified error function and an enhanced dynamical flow defined in analogy with CG methods; the core idea is to define a new error function that incorporates information from the previous time instance. Another modification is based on the gradient momentum technique, in which a velocity vector accumulates both current and past gradients. We aim to enhance GNN dynamics by incorporating this momentum.
The best approximate GNN (BGNN) dynamics utilizes a GNN dynamical system over two error monitoring matrices (EMMs), defined by E(t) = ||AV(t)B − D|| and ||V(t)||. The primary concept is to use the best approximate solution rather than relying on least squares solutions.
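
To make the baseline concrete, the sketch below (not the speaker's implementation) integrates the standard GNN flow dV/dt = −gamma · A^T (A V(t) B − D) B^T for AXB = D with a forward Euler step, and includes an optional velocity accumulator in the spirit of the momentum modification mentioned above. The gain gamma, step size h, momentum coefficient beta, and the test matrices are illustrative assumptions.

import numpy as np

def gnn_solve(A, B, D, gamma=1.0, h=1e-3, steps=20000, beta=0.0):
    # Minimal sketch of a gradient neural network (GNN) flow for AXB = D.
    # Error matrix E(t) = A V(t) B - D; the flow follows the negative
    # gradient of (1/2)*||E||_F^2 with respect to V, discretised by Euler.
    m, n = A.shape[1], B.shape[0]
    V = np.zeros((m, n))          # state V(t), initialised at zero
    M = np.zeros_like(V)          # velocity accumulator (momentum term)
    for _ in range(steps):
        E = A @ V @ B - D         # error monitoring matrix
        grad = A.T @ E @ B.T      # gradient of (1/2)*||E||_F^2 w.r.t. V
        M = beta * M + grad       # accumulate past and current gradients
        V = V - h * gamma * M     # Euler step of the (momentum) GNN flow
    return V

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((4, 3))
    B = rng.standard_normal((2, 5))
    D = A @ rng.standard_normal((3, 2)) @ B   # consistent right-hand side
    V = gnn_solve(A, B, D, beta=0.9)
    print(np.linalg.norm(A @ V @ B - D))      # residual should be small

With beta = 0 the recursion reduces to the plain GNN flow; beta > 0 blends past gradients into the velocity term, which is the basic idea behind the momentum-enhanced dynamics.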

Keywords: Unconstrained optimization, conjugate-gradient optimization methods, gradient neural network (GNN), gradient momentum method, best approximate solution.

BIO: Predrag S. Stanimirovic completed his Ph.D. in Computer Science at the University of Nis, Serbia. He is a full Professor at the University of Nis, Faculty of Sciences and Mathematics, Department of Computer Science, Nis, Serbia. He has thirty-six years of experience in scientific research across diverse fields of mathematics and computer science, spanning numerical linear algebra, recurrent neural networks, linear algebra, nonlinear optimization, symbolic computation, and others. His main research topics include Numerical Linear Algebra, Operations Research, Recurrent Neural Networks, and Symbolic Computation.

Prof. Predrag S. Stanimirovic has published over 350 scientific works, including journal articles, seven research monographs, six textbooks, and over 80 peer-reviewed papers in conference proceedings and book chapters. He is an editorial board member of more than 20 scientific journals, 5 of which belong to the Journal Citation Reports (JCR) list. Currently, he is a section editor of Electronic Research Archive (ERA), Filomat, Journal of Mathematics, Contemporary Mathematics (CM), Facta Universitatis, Series: Mathematics and Informatics, and several other journals. He has been included in the world ranking list of the top 2% of scientists for 2021, 2022, and 2023.

Host Faculty: Dr. Ratikanta Behera


ALL ARE WELCOME