M.Tech. (Research) Thesis Defense: HYBRID: CDS: 18 November 2024: “Learning Multiple Initial Conditions Using Physics-Informed Neural Networks (PINNs)”

When

18 November 2024
10:30 AM - 11:30 AM

Event Type

DEPARTMENT OF COMPUTATIONAL AND DATA SCIENCES
M.Tech. (Research) Thesis Defense


Speaker: Mr. Mahesh Tom
S.R. Number: 06-18-01-10-22-21-1-20317
Title: “Learning Multiple Initial Conditions Using Physics-Informed Neural Networks (PINNs)”
Thesis Examiner: Prof. Nagaiah Chamakuri
Research Supervisor: Prof. Sashikumaar Ganesan
Date & Time: November 18, 2024 (Monday) at 10:30 AM
Venue: The Thesis Defense will be held in HYBRID mode

# 102, CDS Seminar Hall / Microsoft Teams

Please click on the following link to join the Thesis Defense: MS Teams link


ABSTRACT

In recent years, Physics-Informed Neural Networks (PINNs) and their variants have emerged as tools for solving differential equations. Although several variants of PINNs have been proposed for time-dependent partial differential equations (PDEs), the majority of these physics-informed approaches solve a problem for a single set of initial conditions. In this work, we consider one-dimensional time-dependent PDEs and focus on solving for multiple initial conditions (ICs) simultaneously with a single network. Training a single network on multiple ICs presents challenges, such as spectral bias, which we address in this work. We also evaluate how our approach performs within the FastVPINNs framework for solving multiple ICs using Variational Physics-Informed Neural Networks (VPINNs). Since the choice of activation function is crucial to a network's performance, we also test the influence of various activation functions on FastVPINNs for several standard test cases. While training on multiple ICs, we further study the impact of the network parameters and how they contribute to each trained task via an ablation study.
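The abstract does not give implementation details, so the following is only a minimal sketch of the general idea of training one PINN on a family of ICs: the network takes an IC parameter as an extra input, and the PDE and IC losses are minimized jointly over all ICs. The PDE (1D heat equation), the parameterization u(x, 0) = sin(a·π·x), and all names and hyperparameters (ConditionedPINN, a_values, nu) are illustrative assumptions, not the speaker's method; boundary-condition terms are omitted for brevity.

```python
import torch
import torch.nn as nn

# Sketch: one network learns the solution family u(x, t; a) of the
# 1D heat equation u_t = nu * u_xx with u(x, 0) = sin(a * pi * x),
# where the IC parameter "a" is fed in as a third network input.

class ConditionedPINN(nn.Module):
    def __init__(self, width=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(3, width), nn.Tanh(),   # inputs: (x, t, a)
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, 1),
        )

    def forward(self, x, t, a):
        return self.net(torch.cat([x, t, a], dim=-1))

def pde_residual(model, x, t, a, nu=0.01):
    """Residual of u_t - nu * u_xx at the collocation points."""
    x = x.requires_grad_(True)
    t = t.requires_grad_(True)
    u = model(x, t, a)
    u_t = torch.autograd.grad(u, t, torch.ones_like(u), create_graph=True)[0]
    u_x = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    u_xx = torch.autograd.grad(u_x, x, torch.ones_like(u_x), create_graph=True)[0]
    return u_t - nu * u_xx

model = ConditionedPINN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
a_values = torch.tensor([1.0, 2.0, 3.0])  # the family of ICs trained jointly

for step in range(5000):
    # PDE loss: collocation points sampled across all ICs in one batch
    x = torch.rand(256, 1); t = torch.rand(256, 1)
    a = a_values[torch.randint(0, 3, (256, 1))]
    loss_pde = pde_residual(model, x, t, a).pow(2).mean()
    # IC loss: match each IC of the family at t = 0
    x0 = torch.rand(256, 1)
    a0 = a_values[torch.randint(0, 3, (256, 1))]
    u0 = model(x0, torch.zeros_like(x0), a0)
    loss_ic = (u0 - torch.sin(a0 * torch.pi * x0)).pow(2).mean()
    opt.zero_grad(); (loss_pde + loss_ic).backward(); opt.step()
```

Because every IC shares one set of weights, high-frequency members of the family tend to be learned last or poorly; that is the spectral-bias issue the abstract mentions, and the thesis presents its own way of addressing it.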

Once we have a fully trained model that works on multiple ICs, incorporating new ICs without retraining on all the previous ICs is challenging, and the brute-force approach of retraining on every IC is not always feasible. To this end, we explore elastic weight consolidation (EWC), a regularization technique used in continual learning, and study its effect on PINNs when training new ICs.
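For readers unfamiliar with EWC, a minimal sketch of the standard technique follows, assuming the conditioned-PINN setup above; how the thesis actually applies it to new ICs is not specified in the abstract. The idea is to estimate a diagonal Fisher information on the old tasks and then penalize movement of the parameters that mattered for them while fitting the new IC. The helper names and hyperparameters (fisher_diagonal, ewc_lambda) are illustrative.

```python
import torch

def fisher_diagonal(model, loss_fn, batches):
    """Approximate diag(F) by averaging squared gradients of the old-task loss."""
    fisher = {n: torch.zeros_like(p) for n, p in model.named_parameters()}
    for batch in batches:
        model.zero_grad()
        loss_fn(model, batch).backward()
        for n, p in model.named_parameters():
            if p.grad is not None:
                fisher[n] += p.grad.detach() ** 2
    return {n: f / len(batches) for n, f in fisher.items()}

def ewc_penalty(model, fisher, old_params, ewc_lambda=100.0):
    """Quadratic penalty anchoring parameters to their old-task values."""
    penalty = 0.0
    for n, p in model.named_parameters():
        penalty = penalty + (fisher[n] * (p - old_params[n]) ** 2).sum()
    return 0.5 * ewc_lambda * penalty

# After training on the old ICs:
#   old_params = {n: p.detach().clone() for n, p in model.named_parameters()}
#   fisher = fisher_diagonal(model, old_task_loss, old_batches)
# When training the new IC, minimize:
#   new_task_loss + ewc_penalty(model, fisher, old_params)
```

The penalty leaves unimportant weights free to adapt to the new IC while keeping the solutions for the old ICs intact, which is what makes EWC a candidate alternative to brute-force retraining.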


ALL ARE WELCOME