Towards optimal $\beta$-variational autoencoders combined with transformers for reduced-order modeling of turbulent flows
The code in this repository features a Python implementation of a reduced-order model (ROM) of turbulent flow using $\beta$-variational autoencoders ($\beta$-VAEs) combined with transformers.
We share the original data with 10,000 snapshots and 26,000 snapshots via OneDrive. We also provide the pre-trained models of the $\beta$-VAE and the transformer.
- To train the $\beta$-VAE, please run: `python beta_vae_train.py`
- For post-processing, please run: `python beta_vae_postprocess.py`
- To rank the $\beta$-VAE modes, please run: `python beta_vae_rankModes.py`
- To train the self-attention-based transformer, please run: `python temporal_pred_train_selfattn.py`
- For post-processing of the predictions, please run: `python temporal_pred_postprocess.py`
- To compute the sliding-window error $\epsilon$, please run: `python temporal_pred_sliding_window.py`
- The transformer and LSTM architectures are in `utils/NNs`
- The $\beta$-VAE architectures are in `utils/VAE`
- The configurations of the employed architectures are in `utils/configs.py`
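As a rough sketch of the objective optimised when training a $\beta$-VAE, the standard loss combines a mean-squared reconstruction term with a $\beta$-weighted KL divergence to a unit Gaussian. The function name, array shapes, and default $\beta$ below are illustrative assumptions, not the repository's actual implementation:

```python
import numpy as np

def beta_vae_loss(x, x_rec, mu, log_var, beta=4.0):
    """Illustrative beta-VAE objective: MSE reconstruction + beta * KL(q || N(0, I)).

    x, x_rec : (batch, features) original and reconstructed snapshots
    mu, log_var : (batch, latent) encoder outputs parameterising the Gaussian posterior
    """
    # Mean-squared reconstruction error, one value per sample
    rec = np.mean((x - x_rec) ** 2, axis=1)
    # Closed-form KL divergence between N(mu, sigma^2) and N(0, I)
    kl = -0.5 * np.sum(1.0 + log_var - mu ** 2 - np.exp(log_var), axis=1)
    # Batch-averaged total loss; larger beta enforces a more disentangled latent space
    return float(np.mean(rec + beta * kl))
```

Increasing $\beta$ trades reconstruction accuracy for a more factorised (disentangled) latent representation, which is the property exploited when ranking the latent modes.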
We offer the scripts and data for reproducing the figures in the paper. For instance, to visualise the results of the parametric studies, please run: `python visual_lines.py`
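For reference, one plausible form of the sliding-window error $\epsilon$ mentioned above is a relative $\ell_2$ error averaged over sliding windows of the latent trajectory; the exact definition used in the paper may differ, so treat this as a sketch:

```python
import numpy as np

def sliding_window_error(true, pred, window):
    """Relative l2 error averaged over sliding windows (illustrative definition).

    true, pred : (time, latent) reference and predicted latent-space trajectories
    window     : window length in time steps
    """
    errs = []
    for start in range(true.shape[0] - window + 1):
        t = true[start:start + window]
        p = pred[start:start + window]
        # Relative Frobenius-norm error over this window
        errs.append(np.linalg.norm(t - p) / np.linalg.norm(t))
    return float(np.mean(errs))
```

Averaging over windows rather than over the full trajectory makes the metric less sensitive to a single phase drift late in the prediction horizon.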
- `01_Data`: The flow data of the streamwise velocity component of the flow around a square cylinder. Please find more details in the paper: "Causality analysis of large-scale structures in the flow around a wall-mounted square cylinder", Álvaro Martínez-Sánchez, Esteban López, Soledad Le Clainche, Adrián Lozano-Durán, Ankit Srivastava, Ricardo Vinuesa
- `02_Checkpoints`: Stores the $\beta$-VAE models, loss evolution, and computation time in `.pt` format
- `03_Mode`: Stores the obtained $\beta$-VAE latent-space modes
- `04_Figs`: The figures and visualisations
- `05_Pred`: The data of the temporal-dynamics predictions in the latent space
- `06_ROM`: The time-series predictions for building the reduced-order model (ROM)
- `07_Loss`: The evolution of the training loss during VAE training
- `08_POD`: The results from proper orthogonal decomposition (POD)
- `csvFile`: The CSV files recording the $\beta$-VAE performance, where "small" denotes $Arch1$ and "large" denotes $Arch2$
- `utils`: The functions and architectures used in the scripts
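The POD results stored in `08_POD` can be reproduced conceptually with the standard snapshot-SVD formulation; the sketch below is a generic implementation, not the repository's actual script, and the function name and shapes are assumptions:

```python
import numpy as np

def pod(snapshots, n_modes):
    """Proper orthogonal decomposition of a snapshot matrix via the thin SVD.

    snapshots : (n_snapshots, n_points) data matrix (rows are flow snapshots)
    returns   : spatial modes, temporal coefficients, relative modal energies
    """
    # Subtract the temporal mean so modes capture fluctuations only
    X = snapshots - snapshots.mean(axis=0)
    # Thin SVD: with snapshots as rows, the right singular vectors are spatial modes
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    modes = Vt[:n_modes]                    # spatial POD modes
    coeffs = U[:, :n_modes] * S[:n_modes]   # temporal coefficients
    energy = S ** 2 / np.sum(S ** 2)        # relative energy of each mode
    return modes, coeffs, energy[:n_modes]
```

The modal energies give the ranking against which the $\beta$-VAE latent modes are typically compared.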