Self-Scalable Tanh (Stan): Multi-Scale Solutions for Physics-Informed Neural Networks

Document Type

Article

Publication Date

12-1-2023

Abstract

Differential equations are fundamental in modeling numerous physical systems, including thermal, manufacturing, and meteorological systems. Traditionally, numerical methods approximate the solutions of complex systems modeled by differential equations. With the advent of modern deep learning, Physics-informed Neural Networks (PINNs) are evolving as a new paradigm for solving differential equations with a pseudo-closed-form solution. Unlike numerical methods, PINNs can solve differential equations mesh-free, integrate experimental data, and resolve challenging inverse problems. However, one limitation of PINNs is poor training caused by activation functions designed primarily for purely data-driven problems. This work proposes a self-scalable tanh-based activation function for PINNs to improve learning of the solutions of differential equations. The proposed Self-scalable Tanh (Stan) function is smooth, non-saturating, and has a trainable parameter. It allows an easy flow of gradients and enables systematic scaling of the input-output mapping during training. Various forward problems (solving differential equations) and inverse problems (identifying the parameters of differential equations) demonstrate that the Stan activation function achieves better training and more accurate predictions than the existing activation functions for PINNs in the literature.
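
The abstract describes Stan as a smooth, non-saturating, tanh-based activation with a trainable scaling parameter. A minimal sketch of such an activation is given below, assuming the form Stan(x) = tanh(x) + beta * x * tanh(x) with one trainable beta per feature; the exact functional form and initialization used by the authors should be taken from the paper itself, and the class name and layer sizes here are illustrative only.

```python
import torch
import torch.nn as nn


class Stan(nn.Module):
    """Sketch of a self-scalable tanh-style activation (assumed form)."""

    def __init__(self, num_features: int):
        super().__init__()
        # Trainable scaling parameter, one per feature (initialization assumed to be 1.0).
        self.beta = nn.Parameter(torch.ones(num_features))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        t = torch.tanh(x)
        # The tanh term keeps the mapping smooth and bounded near zero, while the
        # beta * x * tanh(x) term is non-saturating and lets the network rescale
        # the input-output mapping during training.
        return t + self.beta * x * t


# Hypothetical usage: a small fully connected network of the kind used in PINNs,
# with Stan replacing tanh between hidden layers.
net = nn.Sequential(
    nn.Linear(2, 64), Stan(64),
    nn.Linear(64, 64), Stan(64),
    nn.Linear(64, 1),
)
u = net(torch.rand(128, 2))  # e.g., (x, t) collocation points -> solution field u(x, t)
```

In a PINN setting, such an activation would typically replace tanh throughout the hidden layers of the network that approximates the solution field, with the beta parameters trained jointly with the weights by the same optimizer.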

Identifier

85168747712 (Scopus)

Publication Title

IEEE Transactions on Pattern Analysis and Machine Intelligence

External Full Text Location

https://doi.org/10.1109/TPAMI.2023.3307688

e-ISSN

1939-3539

ISSN

0162-8828

PubMed ID

37610913

First Page

15588

Last Page

15603

Issue

12

Volume

45
