
Commit 44f5209

update readme
1 parent 53d37a9 commit 44f5209


README.md

Lines changed: 9 additions & 8 deletions
@@ -2,13 +2,16 @@
 
 Welcome to the MACE repository!
 
-***MACE, a Machine-learning Approach to Chemistry Emulation***, by [Maes et al. (*in press.*)](https://ui.adsabs.harvard.edu/abs/2024arXiv240503274M/abstract), is a surrogate model for chemical kinetics. It is developed in the contexts of circumstellar envelopes (CSEs) of asymptotic giant branch (AGB) stars, i.e. evolved low-mass stars.
-
-Currently it still under development.
-Planned release: Sept 2024
+***MACE, a Machine-learning Approach to Chemistry Emulation***, by [Maes et al. (2024)](https://ui.adsabs.harvard.edu/abs/2024arXiv240503274M/abstract), is a surrogate model for chemical kinetics. It is developed in the context of circumstellar envelopes (CSEs) of asymptotic giant branch (AGB) stars, i.e. evolved low-mass stars.
 
 MACE is implemented in Python and is trained using [PyTorch](https://pytorch.org/), together with [torchode](https://github.com/martenlienen/torchode) [(Lienen & Gunnemann, 2022)](https://openreview.net/pdf?id=uiKVKTiUYB0).
 
+---
+## Notes on installation
+- MACE is not available on ```pypi```; the package named ```mace``` there is not this one.
+- To use MACE, please clone the repo and install the required packages listed in ```requirements.txt```.
+
+
 ---
 ## What?
 
@@ -21,14 +24,14 @@ In formula, MACE is stated as
 $${\hat{\boldsymbol{n}}}(t) = \mathcal{D}\Big( G \big( \mathcal{E} ({\boldsymbol{n}}, {\boldsymbol{p}}),t \big) \Big).$$
 Here, ${\hat{\boldsymbol{n}}}(t)$ are the predicted chemical abundances at a time $t$ later than the initial state ${\boldsymbol{n}}$. $\mathcal{E}$ and $\mathcal{D}$ represent the encoder and decoder of the autoencoder, respectively. The autoencoder maps the chemical space ${\boldsymbol{n}}$ together with the physical space ${\boldsymbol{p}}$ to a lower-dimensional representation $\boldsymbol{z}$, called the latent space. The function $G$ describes the evolution in latent space such that $\boldsymbol{z}(\Delta t) = G(\boldsymbol{z}, \Delta t)=\int_0^{\Delta t} g(\boldsymbol{z}){\rm d}t$.
 
-For more details, check out our paper: [Maes et al. (*in press.*)](https://ui.adsabs.harvard.edu/abs/2024arXiv240503274M/abstract).
+For more details, check out our paper: [Maes et al. (2024)](https://ui.adsabs.harvard.edu/abs/2024arXiv240503274M/abstract).
 
 ---
 ## How to run?
 
 Once the Dataset class is set up properly (see src/mace/CSE_0D/dataset.py), a MACE model can be trained. This can be done using the script 'run.py', which takes an input file with the needed (hyper)parameter setup. An example of such an input file can be found in input/.
 
-The script run.py trains the model, as explained by [Maes et al. (*in press.*)](https://ui.adsabs.harvard.edu/abs/2024arXiv240503274M/abstract), and is immediately applied to the specified test dataset once training is finished. As such, it returns an averaged error on the MACE model compared to the classical model.
+The script run.py trains the model, as explained by [Maes et al. (2024)](https://ui.adsabs.harvard.edu/abs/2024arXiv240503274M/abstract), and, once training is finished, immediately applies it to the specified test dataset. As such, it returns an averaged error of the MACE model compared to the classical model.
 
 
 
@@ -37,5 +40,3 @@ The script run.py trains the model, as explained by [Maes et al. (*in press.*)](
 
 
 
-
-(version 0.0.7)
