BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//wp-events-plugin.com//7.2.3.1//EN
X-WR-TIMEZONE:Asia/Kolkata
BEGIN:VEVENT
UID:191@cds.iisc.ac.in
DTSTART;TZID=Asia/Kolkata:20260421T110000
DTEND;TZID=Asia/Kolkata:20260421T120000
DTSTAMP:20260415T150736Z
URL:https://cds.iisc.ac.in/events/ph-d-thesis-defense-cds-april-21-2026-tr
 ansformer-neural-operators-for-learning-generalized-solutions-of-partial-d
 ifferential-equations-and-data-assimilation/
SUMMARY:Ph.D. Thesis Defense: CDS: April 21\, 2026 "Transformer Neural Oper
 ators for Learning Generalized Solutions of Partial Differential Equations
  and Data Assimilation"
DESCRIPTION:DEPARTMENT OF COMPUTATIONAL AND DATA SCIENCES\nPh.D. Thesis Def
 ense\n\nSpeaker : Mr. Boya Sumanth Kumar\nS.R. Number : 06-18-01-10-12
 -20-1-18435\nTitle : "Transformer Neural Operators for Learning Generalize
 d Solutions of Partial Differential Equations and Data Assimilation"\nRese
 arch Supervisor : Dr. Deepak Subramani\nThesis Examiner : Prof. Anoop Kris
 hnan\, IIT Delhi\nDate & Time : April 21\, 2026 (Tuesday) at 11:00 AM
 \nVenue: The Thesis Defense will be held in hybrid mode\n#303 CDS Classroo
 m / Microsoft Teams.\n\nPlease click on the following link to join the Th
 esis Defense\nMS Teams link\n\nABSTRACT\nScientific machine learning i
 s transforming the way we solve physical dynamics governed by partial diff
 erential equations (PDEs). Solving nonlinear PDEs across multiple initial 
 and boundary conditions requires re-running traditional numerical solvers\
 , and existing physics-informed neural networks require costly retraining 
 for each new condition. Furthermore\, given the limited availability of ob
 servational data in meteorological and oceanic sciences\, data assimilatio
 n in forward simulations is crucial. The above challenges are addressed by
  proposing three neural operators with architectural innovations: (1) the 
 Physics-Informed Transformer Neural Operator (PINTO) for simulation-free P
 DE solving that is trained using only physics loss and generalizes to unse
 en initial and boundary conditions\, (2) the Physics-Guided Transformer Ne
 ural Operator (PGNTO) for finding generalizable PDE solutions from simulat
 ion data in the absence of governing equations\, and (3) the Attention-bas
 ed Coordinate Operator (ACO) for the data assimilation task of reconstruct
 ing continuous fields from limited observational data. Collectively\, the 
 above contributions push the boundaries of scientific machine learning in 
 solving PDEs and data assimilation.\n\nThe first part of this thesis intro
 duces a novel Physics-Informed Transformer Neural Operator (PINTO). Curren
 t neural operator approaches\, though capable of learning functional mappi
 ngs between infinite-dimensional spaces\, suffer from two critical limitat
 ions: dependence on substantial simulation data and poor generalization to
  unseen conditions. PINTO addresses these fundamental challenges by introd
 ucing a novel physics-informed framework that achieves efficient generaliz
 ation through simulation-free\, physics-only training. The core innovation
  of PINTO lies in the development of iterative kernel-integral operator un
 its that leverage cross-attention mechanisms to transform domain points in
 to initial- and boundary-condition-aware representation vectors. This atte
 ntion-based architecture enables context-aware learning that fundamentally
  differs from existing neural operators\, allowing PINTO to learn mappings
  from input functions (initial/boundary conditions) to complete PDE soluti
 on spaces through a single forward pass. The architecture comprises three 
 key stages: lifting layers that project the solution's domain coordinates 
 into a higher-dimensional representation space\, iterative kernel integrat
 ion layers for context-aware representation learning\, and projection laye
 rs that map the learned representation to the solution space. We demonstra
 te PINTO's superior performance across critical fluid mechanics and engine
 ering applications\, including linear advection\, the nonlinear Burgers eq
 uation\, and the steady and unsteady Navier-Stokes equations\, spanning mu
 ltiple flow scenarios. Under challenging unseen conditions\, PINTO achieve
 s relative errors of merely 20 to 33% of those obtained by state-of-the-
 art physics-informed neural operator methods when compared with analytical
  and numerical solutions. Critically\, PINTO exhibits temporal extrapolati
 on capabilities absent in competing approaches\, accurately solving the ad
 vection and Burgers equations in time steps not present during training.\n
 \nIn the second part of the thesis\, we extend our framework to physics-gu
 ided transformer neural operators (PGNTO)\, trained on simulation data rat
 her than a physics loss\, to address scenarios where governing PDEs are un
 available and to eliminate the training instabilities that arise from phys
 ics loss in high-dimensional PDEs. Our PGNTO successfully solves standard 
 PDE benchmarks\, including the nonlinear Burgers and airfoil problem\, and
  also scales to complex 3D turbulent flows\, specifically predicting wake 
 dynamics behind wind turbines under varying inlet velocities. Across all t
 est cases\, PGNTO maintains superior accuracy\, with relative errors that 
 are only one-third of those of competing neural operators.\n\nFinally\, in
 the third part of the thesis\, we introduce the Attention-based Coordinat
 e Operator (ACO) for neural data assimilation\, enabling the continuous re
 construction of fields from sparse observational data. By applying Gabor f
 ilters for coordinate lifting and Fourier filters for value representation
 \, ACO outperforms existing implicit neural representation networks on rec
 onstruction tasks. Notably\, a single ACO model generalizes across varying
  sparsity levels\, eliminating the need for separate models tailored to di
 fferent data availabilities. The performance of ACO is demonstrated using 
 four challenging datasets: (i) sea surface height (SSH)\, (ii) chlorophyll
  concentration (CHL)\, (iii) global surface temperature (GST)\, and (iv) se
 a surface temperature (SST). The ACO model is compared with other leading 
 deep neural models for physical field reconstruction.\n\nOverall\, this th
 esis demonstrates that transformer-based neural operators can effectively 
 address both physics-informed PDE solving and neural data assimilation\, a
 chieving unprecedented generalization\, computational efficiency\, and acc
 uracy while requiring minimal training data. Our suite of transformer neur
 al operators opens new avenues for real-time prediction and data assimilat
 ion in physics and engineering applications.\n\nALL ARE WELCOME
CATEGORIES:Events,Thesis Defense
END:VEVENT
BEGIN:VTIMEZONE
TZID:Asia/Kolkata
X-LIC-LOCATION:Asia/Kolkata
BEGIN:STANDARD
DTSTART:19700101T000000
TZOFFSETFROM:+0530
TZOFFSETTO:+0530
TZNAME:IST
END:STANDARD
END:VTIMEZONE
END:VCALENDAR