# Effects of Boundary Conditions in Fully Convolutional Networks for Learning Spatio-temporal Dynamics (ECML-PKDD 2021)
This repository is the supplementary material of our [paper](https://arxiv.org/abs/2106.11160) **"Effects of boundary conditions in fully convolutional networks for learning spatio-temporal dynamics"**, accepted to the Applied Data Science Track at ECML-PKDD 2021. It contains the complete description of the neural network and of the computing environment, the code used, implementation details, and supplementary results. If you find this code useful in your research, please consider citing:
```bibtex
@misc{alguacil2021effects,
      title={Effects of boundary conditions in fully convolutional networks for learning spatio-temporal dynamics},
      author={Antonio Alguacil and Wagner Gonçalves Pinto and Michael Bauerheim and Marc C. Jacob and Stéphane Moreau},
      year={2021},
      eprint={2106.11160},
      archivePrefix={arXiv},
      primaryClass={cs.LG}
}
```
The repository is organized as follows:
- [images](./images): folder containing the figures shown in this page
- [data_generation](./data_generation): code for generating the database using Palabos
- [network](./network): implementation of the neural network, training and testing scripts
More details are available in each subfolder. Browse the subfolders to generate the data with an open-source CFD code (Palabos), train the neural network, or test the method.
Network architecture
------------
The neural network is a Multi-Scale architecture ([Mathieu et al., 2016](https://arxiv.org/abs/1511.05440)). Three scales are used, with field dimensions N, N/2 and N/4, composed of 17 two-dimensional convolution operations, for a total of 422,419 trainable parameters. ReLUs are used as activation functions, and replication padding keeps the layer sizes unchanged inside each scale.
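For illustration, below is a minimal PyTorch sketch of such a multi-scale fully convolutional network with replication padding. The channel widths, layer counts, and number of input frames are illustrative placeholders, not the exact 17-layer, 422,419-parameter configuration of the paper; see the [network](./network) folder for the actual implementation.

```python
# Minimal sketch of a multi-scale fully convolutional network with
# replication padding. Channel widths, layer counts and the number of
# input frames are illustrative placeholders, NOT the exact
# 17-layer / 422,419-parameter configuration of the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

def conv_relu(in_ch, out_ch, k=3):
    # Replication padding keeps the spatial size unchanged inside a scale.
    return nn.Sequential(
        nn.ReplicationPad2d(k // 2),
        nn.Conv2d(in_ch, out_ch, k),
        nn.ReLU(),
    )

class MultiScaleNet(nn.Module):
    def __init__(self, in_ch=4, out_ch=1):  # e.g. 4 past frames in, 1 frame out
        super().__init__()
        def branch(extra):
            # A small convolutional stack; the finer scales also see the
            # upsampled prediction of the coarser scale (extra channels).
            return nn.Sequential(conv_relu(in_ch + extra, 32),
                                 conv_relu(32, 32),
                                 nn.Conv2d(32, out_ch, 1))
        self.scale4 = branch(0)        # coarsest scale: N/4
        self.scale2 = branch(out_ch)   # intermediate scale: N/2
        self.scale1 = branch(out_ch)   # full resolution: N

    def forward(self, x):
        # x: (batch, in_ch, N, N), with N divisible by 4
        x2 = F.interpolate(x, scale_factor=0.5, mode='bilinear', align_corners=False)
        x4 = F.interpolate(x, scale_factor=0.25, mode='bilinear', align_corners=False)
        y4 = self.scale4(x4)
        y4_up = F.interpolate(y4, scale_factor=2, mode='bilinear', align_corners=False)
        y2 = self.scale2(torch.cat([x2, y4_up], dim=1))
        y2_up = F.interpolate(y2, scale_factor=2, mode='bilinear', align_corners=False)
        return self.scale1(torch.cat([x, y2_up], dim=1))
```

Each scale refines the upsampled prediction of the next-coarser scale, so large-scale dynamics are captured at N/4 before details are added back at N/2 and N.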