BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//wp-events-plugin.com//7.2.3.1//EN
X-WR-TIMEZONE:Asia/Kolkata
BEGIN:VEVENT
UID:191@cds.iisc.ac.in
DTSTART;TZID=Asia/Kolkata:20260421T120000
DTEND;TZID=Asia/Kolkata:20260421T130000
DTSTAMP:20260421T052655Z
URL:https://cds.iisc.ac.in/events/ph-d-thesis-defense-cds-april-21-2026-tr
 ansformer-neural-operators-for-learning-generalized-solutions-of-partial-d
 ifferential-equations-and-data-assimilation/
SUMMARY:Change in timings: Ph.D. Thesis Defense: CDS: April 21\, 2026 "Tran
 sformer Neural Operators for Learning Generalized Solutions of Partial Dif
 ferential Equations and Data Assimilation"
DESCRIPTION:DEPARTMENT OF COMPUTATIONAL AND DATA SCIENCES\nPh.D. Thesis Def
 ense\n\n\n\nSpeaker : Mr. Boya Sumanth Kumar\nS.R. Number : 06-18-01-10-12
 -20-1-18435\nTitle : "Transformer Neural Operators for Learning Generalize
 d Solutions of Partial Differential Equations and Data Assimilation"\nRese
 arch Supervisor : Dr. Deepak Subramani\nThesis Examiner : Prof. Anoop Kris
 hnan\, IIT Delhi\nDate & Time : April 21\, 2026 (Tuesday) at 12:00 Noon
 \nVenue : The Thesis Defense will be held in HYBRID mode\n# 303 CDS Class
  Room /MICROSOFT TEAMS.\n\nPlease click on the following link to join the 
 Thesis Defense\nMS Teams link\n\n\n\nABSTRACT\nScientific machine learning
  is transforming the way we model physical dynamics governed by partial di
 fferential equations (PDEs). Solving nonlinear PDEs across multiple initia
 l and boundary conditions requires re-running traditional numerical solver
 s\, and existing physics-informed neural networks require costly retrainin
 g for each new condition. Furthermore\, given the limited availability of 
 observational data in meteorological and oceanic sciences\, data assimilat
 ion in forward simulations is crucial. The above challenges are addressed 
 by proposing three neural operators with architectural innovations: (1) th
 e Physics-Informed Transformer Neural Operator (PINTO) for simulation-free
  PDE solving that is trained using only physics loss and generalizes to un
 seen initial and boundary conditions\, (2) the Physics-Guided Transformer 
 Neural Operator (PGNTO) for finding generalizable PDE solutions from simul
 ation data in the absence of governing equations\, and (3) the Attention-b
 ased Coordinate Operator (ACO) for the data assimilation task of reconstru
 cting continuous fields from limited observational data. Collectively\, th
 e above contributions push the boundaries of scientific machine learning i
 n solving PDEs and data assimilation.\n\nThe first part of this thesis int
 roduces a novel Physics-Informed Transformer Neural Operator (PINTO). Curr
 ent neural operator approaches\, though capable of learning functional map
 pings between infinite-dimensional spaces\, suffer from two critical limit
 ations: dependence on substantial simulation data and poor generalization 
 to unseen conditions. PINTO addresses these fundamental challenges by intr
 oducing a novel physics-informed framework that achieves efficient general
 ization through simulation-free\, physics-only training. The core innovati
 on of PINTO lies in the development of iterative kernel-integral operator 
 units that leverage cross-attention mechanisms to transform domain points 
 into initial- and boundary-condition-aware representation vectors. This at
 tention-based architecture enables context-aware learning that fundamental
 ly differs from existing neural operators\, allowing PINTO to learn mappin
 gs from input functions (initial/boundary conditions) to complete PDE solu
 tion spaces through a single forward pass. The architecture comprises thre
 e key stages: lifting layers that project the solution's domain coordinate
 s into a higher-dimensional representation space\, iterative kernel integr
 ation layers for context-aware representation learning\, and projection la
 yers that map the learned representation to the solution space. We demonst
 rate PINTO's superior performance across critical fluid mechanics and engi
 neering applications\, including linear advection\, the nonlinear Burgers 
 equation\, and the steady and unsteady Navier-Stokes equations\, spanning 
 multiple flow scenarios. Under challenging unseen conditions\, PINTO achie
 ves relative errors of merely 20 to 33% of those obtained by state-of-th
 e-art physics-informed neural operator methods when compared with analytic
 al and numerical solutions. Critically\, PINTO exhibits temporal extrapola
 tion capabilities absent in competing approaches\, accurately solving the 
 advection and Burgers equations in time steps not present during training.
 \n\nIn the second part of the thesis\, we extend our framework to physics-
 guided transformer neural operators (PGNTO)\, trained on simulation data r
 ather than a physics loss\, to address scenarios where governing PDEs are 
 unavailable and to eliminate the training instabilities that arise from ph
 ysics loss in high-dimensional PDEs. Our PGNTO successfully solves standar
 d PDE benchmarks\, including the nonlinear Burgers and airfoil problems\, a
 nd also scales to complex 3D turbulent flows\, specifically predicting wak
 e dynamics behind wind turbines under varying inlet velocities. Across all
  test cases\, PGNTO maintains superior accuracy\, with relative errors tha
 t are only one-third of those of competing neural operators.\n\nFinally\, 
 in the third part of the thesis\, we introduce the Attention-based Coordin
 ate Operator (ACO) for neural data assimilation\, enabling the continuous
  reconstruction of fields from sparse observational data. By applying Gabor
  filters for coordinate lifting and Fourier filters for value representati
 on\, ACO outperforms existing implicit neural representation networks on r
 econstruction tasks. Notably\, a single ACO model generalizes across varyi
 ng sparsity levels\, eliminating the need for separate models tailored to 
 different data availabilities. The performance of ACO is demonstrated usin
 g four challenging datasets: (i) sea surface height (SSH)\, (ii) chlorophy
 ll concentration (CHL)\, (iii) global surface temperature (GST)\, and (iv)
  sea surface temperature (SST). The ACO model is compared with other leadin
 g deep neural models for physical field reconstruction.\n\nOverall\, this 
 thesis demonstrates that transformer-based neural operators can effectivel
 y address both physics-informed PDE solving and neural data assimilation\,
  achieving unprecedented generalization\, computational efficiency\, and a
 ccuracy while requiring minimal training data. Our suite of transformer ne
 ural operators opens new avenues for real-time prediction and data assimil
 ation in physics and engineering applications.\n\n\n\nALL ARE WELCOME
CATEGORIES:Events,Thesis Defense
END:VEVENT
BEGIN:VTIMEZONE
TZID:Asia/Kolkata
X-LIC-LOCATION:Asia/Kolkata
BEGIN:STANDARD
DTSTART:20250421T120000
TZOFFSETFROM:+0530
TZOFFSETTO:+0530
TZNAME:IST
END:STANDARD
END:VTIMEZONE
END:VCALENDAR