BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//wp-events-plugin.com//7.2.3.1//EN
X-WR-TIMEZONE:Asia/Kolkata
BEGIN:VEVENT
UID:183@cds.iisc.ac.in
DTSTART;TZID=Asia/Kolkata:20260227T100000
DTEND;TZID=Asia/Kolkata:20260227T110000
DTSTAMP:20260223T130612Z
URL:https://cds.iisc.ac.in/events/seminar-cds-102-february-27th-1000-conti
 nuous-time-gradient-neural-networks-for-solving-linear-sylvester-and-stein
 -matrix-equations/
SUMMARY:Seminar @ CDS: #102\, February 27th\, 10:00: "Continuous-time gra
 dient neural networks for solving linear\, Sylvester and Stein matrix e
 quations."
DESCRIPTION:Department of Computational and Data Sciences\nDepartment Semi
 nar\n\n\n\nSpeaker: Prof. Predrag S. Stanimirovic\, University of Nis\, Se
 rbia\nTitle: Continuous-time gradient neural networks for solving linea
 r\, Sylvester and Stein matrix equations.\nDate & Time: February 27th\, 20
 26 (Friday)\, 10:00 AM\nVenue: #102\, CDS Seminar Hall\n\n\n\nABSTRACT\nTh
 e dynamics of the gradient neural network (GNN) for solving matrix equa
 tions evolve based on an appropriately defined error monitoring matrix (EM
 M)\, defined using an unknown state-variable matrix that approximates t
 he exact solution. The objective function within the GNN framework is d
 efined using the Frobenius norm. The linear GNN is designed to force th
 e objective function to zero\, following a continuous-time analogy of g
 radient descent methods for solving nonlinear optimization problems. Th
 e GNN model achieves exponential and global convergence\, starting from a
 n arbitrary initial point. Three modifications of the traditional GNN c
 ontinuous-time flow are considered.\nThe first approach utilizes a modi
 fied error function designed to calculate the best approximate solutio
 n. This method is particularly effective for ill-conditioned input matr
 ices. The proposed best approximate GNN (BGNN) dynamics are formulated u
 sing a GNN-based dynamical system defined over an appropriately modifie
 d error monitoring matrix. Lyapunov stability theory indicates that th
 e convergence of the BGNN design is closely associated with the exact s
 olution of a generalized Sylvester equation. The second modified versio
 n\, termed GGNN\, utilizes the GNN method to minimize the gradient of t
 he classical EMM. The third modification to the GNN design draws inspir
 ation from discrete conjugate gradient (CG) iterations commonly applie
 d in unconstrained nonlinear optimization. The central concept is to e
 xploit a novel EMM that incorporates momentum information from the pre
 vious time instance. In this approach\, the proposed EMM follows the p
 rinciples established in CG methods for nonlinear unconstrained optimi
 zation. The resulting CGGNN dynamics are formulated using a modified E
 MM and an enhanced dynamical flow\, both defined in analogy with CG me
 thods.\nSimulation examples are conducted on a subset of ill-conditio
 ned matrices selected from the Gallery test matrices in MATLAB. The def
 ined methods are implemented and applied to solving linear matrix equa
 tions\, the Sylvester equation and the Stein equation.\n\nBIO: Predrag S
 . Stanimirovic received his Ph.D. in Computer Science from the Univers
 ity of Nis\, Serbia. He is a full Professor at the University of Nis
 \, Faculty of Sciences and Mathematics\, Department of Computer Scien
 ce\, Nis\, Serbia. He has thirty-six years of experience in scientific r
 esearch in diverse fields of mathematics and computer science\, spanni
 ng multiple branches of numerical linear algebra\, recurrent neural ne
 tworks\, linear algebra\, nonlinear optimization\, symbolic computatio
 n and others. His main research topics include Numerical Linear Algebr
 a\, Operations Research\, Recurrent Neural Networks and Symbolic Compu
 tation.\nProf. Predrag S. Stanimirovic has published over 390 publicat
 ions in scientific journals\, including seven research monographs\, si
 x textbooks\, and over 80 peer-reviewed research articles published in c
 onference proceedings and book chapters. He is an editorial board memb
 er of more than 20 scientific journals\, 5 of which belong to the Jour
 nal Citation Reports (JCR) list. Currently\, he is a section editor of t
 he scientific journals Electronic Research Archive (ERA)\, Filomat\, J
 ournal of Mathematics\, Contemporary Mathematics (CM)\, Facta Universi
 tatis\, Series: Mathematics and Informatics\, and several other journa
 ls. He is included in the World Rank List of the top 2% of authors in 2
 021\, 2022 and 2023.\n\nHost Faculty: Dr. Ratikanta Behera\n\n\n\nALL A
 RE WELCOME
CATEGORIES:Events,Talks
END:VEVENT
BEGIN:VTIMEZONE
TZID:Asia/Kolkata
X-LIC-LOCATION:Asia/Kolkata
BEGIN:STANDARD
DTSTART:19700101T000000
TZOFFSETFROM:+0530
TZOFFSETTO:+0530
TZNAME:IST
END:STANDARD
END:VTIMEZONE
END:VCALENDAR