BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//wp-events-plugin.com//7.2.3.1//EN
TZID:Asia/Kolkata
X-WR-TIMEZONE:Asia/Kolkata
BEGIN:VEVENT
UID:152@cds.iisc.ac.in
DTSTART;TZID=Asia/Kolkata:20251015T140000
DTEND;TZID=Asia/Kolkata:20251015T150000
DTSTAMP:20251007T043455Z
URL:https://cds.iisc.ac.in/events/ph-d-thesis-colloquium-102-cds-15-octobe
 r-2025-transformer-neural-operators-for-learning-generalized-solutions-of-
 partial-differential-equations-and-data-assimilation/
SUMMARY:Ph.D. Thesis Colloquium: 102 : CDS: October 15\, 2025 "Transformer
  Neural Operators for Learning Generalized Solutions of Partial Differenti
 al Equations and Data Assimilation"
DESCRIPTION:DEPARTMENT OF COMPUTATIONAL AND DATA SCIENCES\nPh.D. Thesis Col
 loquium\n\n\n\nSpeaker: Mr. Boya Sumanth Kumar\nS.R. Number: 06-18-01-10-1
 2-20-1-18435\nTitle: "Transformer Neural Operators for Learning Generalize
 d Solutions of Partial Differential Equations and Data Assimilation"\nRese
 arch Supervisor: Dr. Deepak Subramani\nDate & Time: October 15\, 2025 (Wed
 nesday)\, 02:00 PM\nVenue: #102\, CDS Seminar Hall\n\n\n\nABSTRACT\
 nScientific machine learning is transforming the way we solve physical dyn
 amics governed by partial differential equations (PDEs). Solving nonlinear
  PDEs across multiple initial and boundary conditions requires re-running 
 traditional numerical solvers\, and existing physics-informed neural netwo
 rks require costly retraining for each new condition. Furthermore\, with t
 he availability of sparse observational data in the meteorological and oce
 an science domains\, data assimilation in forward simulations becomes cruc
 ial. These challenges are addressed by proposing three ne
 ural operators with architectural innovations: (1) the Physics-Informed Tr
 ansformer Neural Operator (PINTO) for simulation-free PDE solving that is 
 trained using only physics loss and generalizes to unseen initial and boun
 dary conditions\, (2) the Physics-Guided Transformer Neural Operator (PGNT
 O) for finding generalizable PDE solutions from simulation data in the abs
 ence of governing equations\, and (3) the Implicit Neural Transformer Oper
 ator (INTO) for the data assimilation task of reconstructing continuous fi
 elds from limited observational data. Collectively\, the above contributio
 ns push the boundaries of scientific machine learning in solving PDEs and 
 data assimilation.\n\nThe first part of this thesis introduces a novel Phy
 sics-Informed Transformer Neural Operator (PINTO). Current neural operator
  approaches\, though capable of learning functional mappings between infin
 ite-dimensional spaces\, suffer from two critical limitations: dependence 
 on substantial simulation data and poor generalization to unseen condition
 s. PINTO addresses these fundamental challenges by introducing a novel phy
 sics-informed framework that achieves efficient generalization through sim
 ulation-free\, physics-only training. The core innovation of PINTO lies in
  the development of iterative kernel integral operator units that leverage
  cross-attention mechanisms to transform domain points into initial and bo
 undary condition-aware representation vectors. This attention-based archit
 ecture enables context-aware learning that fundamentally differs from exis
 ting neural operators\, allowing PINTO to learn mappings from input functi
 ons (initial/boundary conditions) to complete PDE solution spaces through 
 a single forward pass. The architecture comprises three key stages: liftin
 g layers for projecting the solution's domain coordinates to a higher-dime
 nsional representation space\, iterative kernel integration layers for con
 text-aware representation learning\, and projection layers for mapping the
  learned representation to the solution space. We demonstrate PINTO's supe
 rior performance across critical fluid mechanics and engineering applicati
 ons\, including linear advection\, nonlinear Burgers equation\, and steady
  and unsteady Navier-Stokes equations spanning multiple flow scenarios. Un
 der challenging unseen conditions\, PINTO achieves relative errors of mere
 ly 20 to 33% of those obtained by state-of-the-art physics-informed neural
  operator methods when compared with analytical and numerical solutions. C
 ritically\, PINTO exhibits temporal extrapolation capabilities absent in c
 ompeting approaches\, accurately solving the advection and Burgers equa
 tions in time steps not present during training.\n\nIn the second part of 
 the thesis\, we extend our framework to physics-guided transformer neural 
 operators (PGNTO)\, trained on simulation data rather than physics loss\, 
 to address scenarios where governing PDEs are unavailable and to eliminate
  the training instabilities that arise due to physics loss for high-dimens
 ional PDEs. We introduce Gabor filters in the coordinate lifting layers to
  develop PGNTO\, a version of our transformer neural operator that shows i
 mproved PDE solving with simulation data. Our PGNTO successfully solves st
 andard PDE benchmarks that include the nonlinear Burgers\, Darcy\, 2D Navi
 er-Stokes\, plasticity\, elasticity\, and also scales to complex 3D turbul
 ent flows\, specifically predicting wake dynamics behind wind turbines und
 er varying inlet velocities. Across all test cases\, PGNTO maintains super
 ior accuracy with relative errors that are only one-third of competing neu
 ral operators.\n\nFinally\, in the third part of the thesis\, we introduce
  the implicit neural transformer operator (INTO) for neural data assimilat
 ion\, enabling continuous field reconstruction from sparse observational d
 ata. By modifying PINTO's architecture with Gabor filters for coordinate l
 ifting and Fourier filters for value representation\, INTO outperforms exi
 sting implicit neural representation networks on reconstruction tasks. Uni
 quely\, a single INTO model generalizes across varying sparsity levels\, e
 liminating the need to train separate models for different levels of data
  availability. The performance of INTO is demonstrated on global surface temper
 ature and air temperature datasets\, and compared with other leading deep 
 neural models for physical field reconstruction.\n\nOverall\, this thesis 
 establishes transformer-based neural operators as a unified framework for 
 physics-informed PDE solving and neural data assimilation\, achieving unpr
 ecedented generalization\, computational efficiency\, and accuracy while r
 equiring minimal training data. Our suite of transformer neural operators 
 opens new avenues for real-time prediction and data assimilation in physic
 s and engineering applications.\n\n\n\nALL ARE WELCOME
CATEGORIES:Events,Ph.D. Thesis Colloquium
END:VEVENT
BEGIN:VTIMEZONE
TZID:Asia/Kolkata
X-LIC-LOCATION:Asia/Kolkata
BEGIN:STANDARD
DTSTART:20241015T140000
TZOFFSETFROM:+0530
TZOFFSETTO:+0530
TZNAME:IST
END:STANDARD
END:VTIMEZONE
END:VCALENDAR