BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//wp-events-plugin.com//7.2.3.1//EN
X-WR-TIMEZONE:Asia/Kolkata
BEGIN:VEVENT
UID:132@cds.iisc.ac.in
DTSTART;TZID=Asia/Kolkata:20250625T103000
DTEND;TZID=Asia/Kolkata:20250625T113000
DTSTAMP:20250623T064543Z
URL:https://cds.iisc.ac.in/events/seminar-cds-102-june-25th-1030-modificat
 ions-of-continuous-time-gradient-based-recurrent-neural-networks/
SUMMARY:{Seminar} @ CDS: #102\, June 25th\, 10:30: "Modifications of contin
 uous-time gradient-based recurrent neural networks."
DESCRIPTION:Department of Computational and Data Sciences\nDepartment Semin
 ar\n\n\n\nSpeaker : Prof. Predrag S. Stanimirovic\nTitle   : "Modificati
 ons of continuous-time gradient-based recurrent neural networks"\nDate & Ti
 me: June 25th\, 2025 (Wednesday)\, 10:30 AM\nVenue : # 102\, CDS Sem
 inar Hall\n\n\n\nABSTRACT\n\nOur motivation is to create innovative GNN-ba
 sed dynamical systems for solving the general linear matrix equation AXB =
  D. The first modification is defined using a modified gradient neural net
 work (GNN) model inspired by discrete conjugate gradient (CG) iterations f
 or unconstrained nonlinear optimization. The resulting CGGNN dynamics are 
 developed utilizing a modified error function and enhanced dynamical flow 
 defined in analogy with CG methods. The core idea is to define a new error
  function that incorporates information from the previous time instance. A
 nother modification is based on the gradient momentum technique. The conce
 pt of momentum involves creating a velocity vector that accumulates both c
 urrent and past gradients. We aim to enhance GNN dynamics by incorporating
  momentum.\nThe best approximate GNN (BGNN) dynamics utilizes a GNN dynami
 cal system over two error monitoring matrices (EMMs)\, defined by E(t) = |
 |AV(t)B − D|| and ||V(t)||. The primary concept is to use the best app
 roximate solution rather than relying on least squares solutions.\n\nKeywo
 rds: Unconstrained optimization\, conjugate-gradient optimization methods\
 , gradient neural network (GNN)\, gradient momentum method\, best approxim
 ate solution.\n\nBIO: Predrag S. Stanimirovic has completed his Ph.D. i
 n Computer Science at the University of Nis\, Serbia. He is a full Profess
 or at the University of Nis\, Faculty of Sciences and Mathematics\, Depart
 ments of Computer Science\, Nis\, Serbia. He has thirty-six years of exper
 ience in scientific research in diverse fields of mathematics and com
 puter science\, which span multiple branches of numerical linear algebra\,
  recurrent neural networks\, linear algebra\, nonlinear optimization\, sym
 bolic computation and others. His main research topics include Numerical L
 inear Algebra\, Operations Research\, Recurrent Neural Networks and Symbol
 ic Computation.\n\nProf. Predrag S. Stanimirovic has authored over 350 pu
 blications in scientific journals\, including seven research monographs\, 
 six textbooks\, and over 80 peer-reviewed research articles published in c
 onference proceedings and book chapters. He is an editorial board member o
 f more than 20 scientific journals\, 5 of which belong to the Journal Cita
 tion Reports (JCR) list. Currently\, he is a section editor of the scienti
 fic journals Electronic Research Archive (ERA)\, Filomat\, Journal of Mathema
 tics\, Contemporary Mathematics (CM)\, Facta Universitatis\, Series: Mathe
 matics and Informatics\, and several other journals. He is listed in the W
 orld Rank List of the top 2% of authors in 2021\, 2022 and 2023.\n\nHost Fac
 ulty: Dr. Ratikanta Behera\n\n\n\nALL ARE WELCOME
CATEGORIES:Events,Talks
END:VEVENT
BEGIN:VTIMEZONE
TZID:Asia/Kolkata
X-LIC-LOCATION:Asia/Kolkata
BEGIN:STANDARD
DTSTART:20240625T103000
TZOFFSETFROM:+0530
TZOFFSETTO:+0530
TZNAME:IST
END:STANDARD
END:VTIMEZONE
END:VCALENDAR