Releases: benmoseley/FBPINNs
FBPINNs v0.2.0
New features
🔥 This is a major new update to the FBPINN library 🔥. We have rewritten the entire library in JAX and added a much more flexible and easier-to-use high-level interface.
Speed-up
- The library now runs 10-1000X faster than the original PyTorch code. The core reason is the use of `jax.vmap`, which parallelises subdomain forward and gradient computations on the GPU (previously, they were done sequentially). This allows us to scale to 1000s+ subdomains, whereas we could only manage <100s of subdomains before.
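To illustrate the batching pattern (a minimal sketch, not the library's internal code; the `subdomain_forward` MLP, shapes, and parameter layout are illustrative assumptions), stacking per-subdomain parameters along a leading axis lets `jax.vmap` evaluate every subdomain network in a single batched call instead of a sequential Python loop:

```python
import jax
import jax.numpy as jnp

def subdomain_forward(params, x):
    # Minimal two-layer MLP standing in for one subdomain network
    w1, b1, w2, b2 = params
    h = jnp.tanh(x @ w1 + b1)
    return h @ w2 + b2

key = jax.random.PRNGKey(0)
n_subdomains, n_points, width = 8, 64, 16
k1, k2, k3 = jax.random.split(key, 3)

# One parameter set per subdomain, stacked along a leading axis
params = (
    jax.random.normal(k1, (n_subdomains, 1, width)),
    jnp.zeros((n_subdomains, width)),
    jax.random.normal(k2, (n_subdomains, width, 1)),
    jnp.zeros((n_subdomains, 1)),
)
# Each subdomain has its own batch of collocation points
x = jax.random.normal(k3, (n_subdomains, n_points, 1))

# vmap over the subdomain axis: all forward passes run in one batched call,
# which the GPU executes in parallel
batched_forward = jax.vmap(subdomain_forward)
u = batched_forward(params, x)
print(u.shape)  # (8, 64, 1)
```

The same pattern composes with `jax.grad` and `jax.jit`, so gradient computations over all subdomains are batched in the same way.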
Flexibility
The high-level interface is much more flexible. In particular you can now:
- Define irregular and multilevel domain decompositions
- Define custom subdomain neural network architectures
- Add arbitrary types of boundary/data constraints, including training FBPINNs with "soft" boundary losses
- Solve inverse problems
- Learn domain decompositions
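To sketch the idea behind overlapping domain decompositions (a hypothetical illustration in plain JAX, not the FBPINNs `Decomposition` API; the `window` helper and all constants are assumptions), overlapping subdomain window functions can be normalised into a partition of unity, so that windowed subdomain network outputs sum to a smooth global solution:

```python
import jax.numpy as jnp

def window(x, center, half_width):
    # Smooth cosine-bump window for one 1D subdomain; hypothetical helper,
    # not part of the FBPINNs API
    d = jnp.clip(jnp.abs(x - center) / half_width, 0.0, 1.0)
    return jnp.cos(0.5 * jnp.pi * d) ** 2

# Overlapping 1D decomposition: 5 subdomains on [0, 1]
centers = jnp.linspace(0.0, 1.0, 5)
half_width = 0.35  # > subdomain spacing (0.25), so windows overlap
x = jnp.linspace(0.0, 1.0, 101)

w = jnp.stack([window(x, c, half_width) for c in centers])  # (5, 101)
# Normalise so the windows form a partition of unity over the domain
w = w / w.sum(axis=0, keepdims=True)
print(bool(jnp.allclose(w.sum(axis=0), 1.0)))  # True
```

Irregular or multilevel decompositions follow the same principle with non-uniform centers, widths, or several overlapping levels of windows.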
Ease-of-use
Furthermore, the interface is easier to use. Compared to the previous code:
- There is no need to update the gradients of the FBPINN by hand when applying constraining operators; this is now done automatically using autodiff
- The `Domain`, `Problem`, `Decomposition` and `Network` classes are designed to be intuitive and minimal
- Python logging is now used to control the level of output
- The library is pip installable
- More examples have been added
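The autodiff point above can be sketched as follows (a hypothetical ansatz, not FBPINNs code; the `net` architecture and the specific constraint are illustrative assumptions). A hard boundary constraint such as u(0) = 0 can be imposed by a constraining operator, here multiplying the network output by tanh(x), and `jax.grad` then differentiates the constrained solution automatically, with no hand-derived gradient updates:

```python
import jax
import jax.numpy as jnp

def net(params, x):
    # Small scalar MLP; stand-in for a PINN network
    w1, b1, w2, b2 = params
    h = jnp.tanh(x * w1 + b1)
    return jnp.dot(h, w2) + b2

def u(params, x):
    # Hard constraint u(0) = 0 applied via a tanh(x) multiplier ansatz;
    # the chain rule through this operator is handled entirely by autodiff
    return jnp.tanh(x) * net(params, x)

# du/dx of the constrained solution, obtained automatically
dudx = jax.grad(u, argnums=1)

key = jax.random.PRNGKey(0)
k1, k2 = jax.random.split(key)
params = (jax.random.normal(k1, (16,)), jnp.zeros(16),
          jax.random.normal(k2, (16,)), jnp.zeros(()))

print(u(params, 0.0))   # zero by construction: the constraint holds exactly
print(dudx(params, 0.5))
```

Because the constraining operator is just another traced function, swapping it for a "soft" boundary loss or a data constraint requires no manual gradient bookkeeping.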