Speaker: Zhang Ye (Shenzhen MSU-BIT University)
Time: 2025-04-08, 10:30-11:30
Venue: Room M616, Science Building
Abstract
Physics-informed neural networks (PINNs) have shown great potential in deep learning and differential equation modeling. However, they struggle with problems involving steep gradients. This work focuses on singularly perturbed time-dependent reaction-advection-diffusion equations, which exhibit internal transition layers with sharp gradients. To address this challenge, we propose a deep asymptotic expansion (DAE) method that leverages deep learning to obtain explicit smooth approximate solutions. Inspired by asymptotic analysis, we first derive the governing equations for the transition layers in one-, two-, and three-dimensional settings and then solve them using PINN. Numerical experiments demonstrate that DAE outperforms PINN, providing accurate solutions where PINN fails. Moreover, DAE surpasses other PINN extensions, such as gPINN and PINN with residual-based adaptive refinement (RAR), and converges faster. The robustness of DAE is further evaluated by varying the training point distributions, network architectures, and random seeds.
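To see why steep gradients are hard for PINN-style solvers, consider a toy singularly perturbed boundary-value problem (an illustrative example chosen here, not one from the talk): eps * u'' - u' = 0 with u(0) = 0, u(1) = 1. Its exact solution is nearly flat on most of [0, 1] but rises to 1 inside a layer of width O(eps) near x = 1, so its slope there grows like 1/eps; this is a minimal sketch of the layer behavior, with the function name `layer_solution` introduced only for illustration.

```python
import numpy as np

def layer_solution(x, eps):
    """Exact solution of the toy boundary-layer problem
        eps * u'' - u' = 0,  u(0) = 0,  u(1) = 1,
    i.e. u(x) = (exp((x-1)/eps) - exp(-1/eps)) / (1 - exp(-1/eps)),
    written in a form that avoids overflow for small eps."""
    e = np.exp(-1.0 / eps)
    return (np.exp((np.asarray(x) - 1.0) / eps) - e) / (1.0 - e)

# The solution stays near 0 over most of [0, 1] and jumps to 1 inside a
# layer of width O(eps) near x = 1; its slope there scales like 1/eps.
x = np.linspace(0.0, 1.0, 1001)
u = layer_solution(x, eps=0.01)
max_slope = np.max(np.abs(np.diff(u) / np.diff(x)))
print(f"max finite-difference slope: {max_slope:.1f}")  # on the order of 1/eps
```

Standard collocation-based training samples such a layer poorly unless points are concentrated inside it, which is the failure mode that asymptotic-expansion-based approaches like DAE are designed to avoid by representing the layer explicitly.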