Releases: kisonho/torchmanager-diffusion
v1.2 (Beta 2)
API Updates:
- Add EMA support in `Manager`
- Add multiple devices support in training configs
- Introducing `callbacks.SamplingCallback` to sample images during training
- Introducing `optim.EMAOptimizer` for EMA optimization with context control to swap parameters
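The release notes describe an EMA optimizer with "context control to swap parameters". `optim.EMAOptimizer` itself is not shown here; the following is a minimal sketch, in plain PyTorch, of the general pattern such a class typically implements — maintain a shadow copy of the weights and temporarily swap it in (e.g. for sampling) via a context manager. The class name `SimpleEMA` and its exact interface are illustrative assumptions, not the package's API.

```python
import copy

import torch


class SimpleEMA:
    """Illustrative EMA sketch (NOT torchmanager's optim.EMAOptimizer):
    keeps a shadow copy of parameters and can temporarily swap it in."""

    def __init__(self, model: torch.nn.Module, decay: float = 0.999) -> None:
        self.model = model
        self.decay = decay
        self.shadow = copy.deepcopy(model).eval()
        for p in self.shadow.parameters():
            p.requires_grad_(False)

    @torch.no_grad()
    def update(self) -> None:
        # shadow = decay * shadow + (1 - decay) * current
        for s, p in zip(self.shadow.parameters(), self.model.parameters()):
            s.mul_(self.decay).add_(p, alpha=1 - self.decay)

    def __enter__(self) -> torch.nn.Module:
        # Swap in the averaged weights, keeping a backup of the live ones.
        self._backup = [p.detach().clone() for p in self.model.parameters()]
        with torch.no_grad():
            for p, s in zip(self.model.parameters(), self.shadow.parameters()):
                p.copy_(s)
        return self.model

    def __exit__(self, *exc) -> bool:
        # Restore the live training weights on exit.
        with torch.no_grad():
            for p, b in zip(self.model.parameters(), self._backup):
                p.copy_(b)
        return False
```

Typical use: call `update()` after each optimizer step, then wrap sampling in `with ema: ...` so the model evaluates with the averaged weights and automatically reverts afterwards.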
Beta Updates:
- Compatibility fixes
- FP16 support
v1.1.1
v1.1
API Updates:
- Add forward diffusion control in `train_step` and `test_step` functions
- Add SDE configurations in `configs` package
- Introducing `build` function to build non-timed wrapped UNet
- Introducing `managers.Manager`, one manager for all separated `nn.DiffusionModule`
- Introducing `nn.diffusion.FastSamplingDiffusionModule`
- Introducing `nn.LatentMode` for `nn.LatentDiffusionModule` to `encode`, `decode`, or `forward`
- Introducing `sde.SDEType` to load SDEs with default settings
- Introducing the separated `nn.DDPM` and `nn.LatentDiffusionModule`
- Introducing the separated `nn.DiffusionModule` to separate the diffusion process into a PyTorch module
- Introducing the separated `nn.SDEModule`
- Multi-GPU support for `nn.DiffusionModule`
- Parsing additional arguments when forwarding in `nn.diffusion.LatentDiffusionModule`
- The new separated `nn.DiffusionModule` no longer needs a `nn.TimedModule`, but a normal `torch.nn.Module` instead, for better compatibility with other UNets
- Use `prior_sampling` in SDE instead of direct randomization when adding noise
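Two of the bullets above concern how noise is added in the forward diffusion process (`train_step` control, `prior_sampling` instead of direct randomization). The package's own implementation is not shown here; as a hedged illustration, this is the standard DDPM forward diffusion step q(x_t | x_0) written in plain PyTorch, where accepting the noise tensor as an argument is what makes the step controllable from a training loop. The function name `q_sample` is an assumption for illustration only.

```python
from typing import Optional

import torch


def q_sample(
    x0: torch.Tensor,
    t: torch.Tensor,
    alpha_bar: torch.Tensor,
    noise: Optional[torch.Tensor] = None,
) -> torch.Tensor:
    """Standard DDPM forward diffusion (illustrative, not the package API):

        x_t = sqrt(alpha_bar_t) * x0 + sqrt(1 - alpha_bar_t) * eps

    Passing `noise` explicitly (rather than always drawing it inside) is
    what lets a train/test step control the forward diffusion.
    """
    if noise is None:
        noise = torch.randn_like(x0)
    # Gather alpha_bar at each sample's timestep and broadcast over the batch.
    a = alpha_bar[t].reshape(-1, *([1] * (x0.dim() - 1)))
    return a.sqrt() * x0 + (1 - a).sqrt() * noise
```

With a fixed `noise` tensor the step becomes deterministic, which is also useful for reproducible tests.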
Other updates:
- Compatibility for non-wrapped models improved
- Minor bugs fixed
- Typing improvement
v1.1 (Release Candidate 1)
API Updates:
- Add SDE configurations in `configs` package
- Introducing `build` function to build non-timed wrapped UNet
- Introducing `managers.Manager`, one manager for all separated `nn.DiffusionModule`
- Introducing `nn.diffusion.FastSamplingDiffusionModule`
- Introducing `nn.LatentMode` for `nn.LatentDiffusionModule` to `encode`, `decode`, or `forward`
- Introducing `sde.SDEType` to load SDEs with default settings
- Introducing the separated `nn.DDPM` and `nn.LatentDiffusionModule`
- Introducing the separated `nn.DiffusionModule` to separate the diffusion process into a PyTorch module
- Introducing the separated `nn.SDEModule`
- Multi-GPU support for `nn.DiffusionModule`
- Parsing additional arguments when forwarding in `nn.diffusion.LatentDiffusionModule`
- The new separated `nn.DiffusionModule` no longer needs a `nn.TimedModule`, but a normal `torch.nn.Module` instead, for better compatibility with other UNets
- Use `prior_sampling` in SDE instead of direct randomization when adding noise
Other updates:
- Compatibility for non-wrapped models improved
- Minor bugs fixed
- Typing improvement
Release candidate updates:
- Add forward diffusion control in `train_step` and `test_step` functions
v1.0.3
v1.0.2
v1.0.1
v1.0
Main release:
- Introducing an abstract `DiffusionManager` to control all diffusion models
- Introducing `data` package with a noised data type `data.DiffusionData` and an unsupervised dataset `data.SequenceDataset`
- Introducing `DDPMManager` and `SDEManager` for DDPM and SDE implementations
- Introducing `metrics.LPIPS` using the lpips package
- Introducing `scheduling` package for beta scheduling in DDPM
- Introducing `sde` package for SDE implementation
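The `scheduling` package's contents are not shown in these notes; as a hedged illustration of what beta scheduling in DDPM typically means, here is a sketch of the two standard schedules (linear, and the cosine schedule of Nichol & Dhariwal) in plain PyTorch. Function names and defaults are illustrative assumptions, not the package's API.

```python
import math

import torch


def linear_beta_schedule(
    timesteps: int, start: float = 1e-4, end: float = 0.02
) -> torch.Tensor:
    """Linear beta schedule as in the original DDPM setup (illustrative)."""
    return torch.linspace(start, end, timesteps)


def cosine_beta_schedule(timesteps: int, s: float = 0.008) -> torch.Tensor:
    """Cosine schedule: betas derived from a cosine alpha-bar curve,
    clipped for numerical stability near the end of the trajectory."""
    steps = torch.arange(timesteps + 1, dtype=torch.float64)
    alpha_bar = torch.cos(((steps / timesteps) + s) / (1 + s) * math.pi / 2) ** 2
    alpha_bar = alpha_bar / alpha_bar[0]
    # beta_t = 1 - alpha_bar_t / alpha_bar_{t-1}
    betas = 1 - alpha_bar[1:] / alpha_bar[:-1]
    return betas.clamp(max=0.999).float()
```

Either schedule yields the `alpha_bar` products (`torch.cumprod(1 - betas, dim=0)`) that drive the forward diffusion process.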