Abstract
Closure modeling based on the Mori-Zwanzig formalism has proven effective in improving the stability and accuracy of projection-based model order reduction. However, closure models are often computationally expensive, making them infeasible for complex nonlinear systems. Towards efficient model reduction of general problems, this paper presents a recurrent neural network (RNN) closure of parametric POD-Galerkin reduced-order models. Based on a short time history of the reduced-order solution, the RNN predicts the memory integral, which represents the impact of the unresolved scales on the resolved scales. A conditioned long short-term memory (LSTM) network serves as the regression model for the memory integral: the POD coefficients at a number of time steps are fed into the LSTM units, while the physical/geometrical parameters are fed into the initial hidden state of the LSTM. The reduced-order model is integrated in time using an implicit-explicit (IMEX) Runge-Kutta scheme, in which the memory term is integrated explicitly and the remaining right-hand side is integrated implicitly to improve computational efficiency. Numerical results demonstrate that the RNN closure can significantly improve the accuracy and efficiency of the POD-Galerkin reduced-order model for nonlinear problems. The POD-Galerkin reduced-order model with the RNN closure is also shown to make accurate predictions well beyond the time interval of the training data.
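The two ingredients described above, a parameter-conditioned LSTM for the memory integral and an IMEX time step that treats that term explicitly, lend themselves to a compact illustration. The following is a minimal sketch, not the authors' implementation: the class name `ConditionedLSTMClosure`, all layer sizes, and the first-order IMEX-Euler step with a purely linear implicit part are illustrative assumptions (the paper itself uses an IMEX Runge-Kutta scheme and a general right-hand side).

```python
# Minimal sketch of a conditioned-LSTM memory closure with an IMEX step.
# Illustrative assumptions throughout; not the authors' code.
import torch
import torch.nn as nn


class ConditionedLSTMClosure(nn.Module):
    """Predicts the memory integral from a short history of POD coefficients.

    The physical/geometrical parameters mu condition the network through
    the LSTM's initial hidden and cell states, as the abstract describes.
    """

    def __init__(self, n_modes, n_params, hidden_size):
        super().__init__()
        # Map the parameters mu to the initial hidden/cell states.
        self.param_to_h0 = nn.Linear(n_params, hidden_size)
        self.param_to_c0 = nn.Linear(n_params, hidden_size)
        self.lstm = nn.LSTM(input_size=n_modes, hidden_size=hidden_size,
                            batch_first=True)
        self.readout = nn.Linear(hidden_size, n_modes)

    def forward(self, a_history, mu):
        # a_history: (batch, n_steps, n_modes) past POD coefficients
        # mu:        (batch, n_params) physical/geometrical parameters
        h0 = self.param_to_h0(mu).unsqueeze(0)   # (1, batch, hidden)
        c0 = self.param_to_c0(mu).unsqueeze(0)
        out, _ = self.lstm(a_history, (h0, c0))
        return self.readout(out[:, -1, :])       # memory term, current step


def imex_euler_step(a_n, a_history, mu, dt, L, closure):
    """One first-order IMEX step for da/dt = L a + memory(a_history).

    The memory term is treated explicitly, the linear part implicitly;
    a toy stand-in for the IMEX Runge-Kutta scheme used in the paper.
    """
    with torch.no_grad():
        mem = closure(a_history, mu).squeeze(0)       # (n_modes,)
    n = a_n.shape[0]
    A = torch.eye(n) - dt * L                         # implicit system matrix
    return torch.linalg.solve(A, a_n + dt * mem)


# Toy usage with random data, purely to exercise the shapes.
torch.manual_seed(0)
n_modes, n_params, k = 8, 2, 5
closure = ConditionedLSTMClosure(n_modes, n_params, hidden_size=32)
L = -torch.eye(n_modes)                  # toy stable linear operator
a_hist = torch.randn(1, k, n_modes)      # last k POD coefficient vectors
mu = torch.tensor([[0.5, 1.0]])          # one parameter instance
a_next = imex_euler_step(a_hist[0, -1], a_hist, mu, dt=1e-2, L=L, closure=closure)
```

Feeding mu into the initial states (h0, c0) rather than concatenating it to every input keeps the per-step input purely the POD coefficients, which is one natural reading of the "conditioned LSTM" described in the abstract.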
Original language | English
---|---
Article number | 109402
Journal | Journal of Computational Physics
Volume | 410
DOIs |
State | Published - 1 Jun 2020
Externally published | Yes
Keywords
- Conditioned long short-term memory
- Implicit-explicit Runge-Kutta
- Memory closure
- Model reduction
- POD-Galerkin