
Releases: kisonho/torchmanager-diffusion

v1.2 (Beta 2)

25 Nov 22:30
Pre-release

API Updates:

  • Add EMA support to Manager
  • Add multi-device support to training configs
  • Introducing callbacks.SamplingCallback to sample images during training
  • Introducing optim.EMAOptimizer, an EMA optimizer with context control to swap parameters
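
The EMA-with-parameter-swapping pattern behind optim.EMAOptimizer can be illustrated with a minimal, library-free sketch. The class and method names below are illustrative assumptions, not the actual torchmanager API:

```python
from contextlib import contextmanager

class EMASketch:
    """Minimal sketch of an exponential moving average over named parameters,
    with a context manager that temporarily swaps the EMA weights in."""

    def __init__(self, params: dict, decay: float = 0.999) -> None:
        self.decay = decay
        self.params = params            # "live" training parameters
        self.shadow = dict(params)      # EMA copy, updated after each step

    def update(self) -> None:
        # shadow <- decay * shadow + (1 - decay) * live
        for name, value in self.params.items():
            self.shadow[name] = self.decay * self.shadow[name] + (1.0 - self.decay) * value

    @contextmanager
    def averaged_parameters(self):
        # Swap the EMA weights in (e.g. for sampling), restore on exit.
        backup = dict(self.params)
        self.params.update(self.shadow)
        try:
            yield self.params
        finally:
            self.params.update(backup)
```

Used this way, sampling callbacks can evaluate with the averaged weights while training continues on the live ones.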

Beta Updates:

  • Compatibility fixes
  • FP16 support

v1.1.1

16 Nov 15:19

Updates:

  • Minor bugs fixed

v1.1

09 Sep 13:50

API Updates:

  • Add forward diffusion control in the train_step and test_step functions
  • Add SDE configurations to the configs package
  • Introducing a build function to build a non-timed wrapped UNet
  • Introducing managers.Manager, one manager for all separated nn.DiffusionModule classes
  • Introducing nn.diffusion.FastSamplingDiffusionModule
  • Introducing nn.LatentMode for nn.LatentDiffusionModule to encode, decode, or forward
  • Introducing sde.SDEType to load SDEs with default settings
  • Introducing the separated nn.DDPM and nn.LatentDiffusionModule
  • Introducing the separated nn.DiffusionModule, which separates the diffusion process into a PyTorch module
  • Introducing the separated nn.SDEModule
  • Multi-GPU support for nn.DiffusionModule
  • Parsing additional arguments when forwarding in nn.diffusion.LatentDiffusionModule
  • The new separated nn.DiffusionModule no longer needs an nn.TimedModule; a plain torch.nn.Module works instead, improving compatibility with other UNets
  • Use prior_sampling in SDE instead of direct randomization when adding noise
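
The forward diffusion that train_step and test_step now expose control over can be sketched in plain Python: a DDPM-style q(x_t | x_0) with a linear beta schedule. The function names are illustrative, not the package's API:

```python
import math
import random

def linear_beta_schedule(T: int, beta_start: float = 1e-4, beta_end: float = 0.02) -> list:
    """Linearly spaced betas over T diffusion steps."""
    return [beta_start + (beta_end - beta_start) * t / (T - 1) for t in range(T)]

def alpha_bar(betas: list, t: int) -> float:
    """Cumulative product of (1 - beta) up to and including step t."""
    prod = 1.0
    for b in betas[: t + 1]:
        prod *= 1.0 - b
    return prod

def forward_diffuse(x0: list, t: int, betas: list, rng: random.Random) -> list:
    """Sample x_t ~ q(x_t | x_0): scale the clean sample, add Gaussian noise."""
    ab = alpha_bar(betas, t)
    return [math.sqrt(ab) * x + math.sqrt(1.0 - ab) * rng.gauss(0.0, 1.0) for x in x0]
```

As t grows, alpha_bar shrinks toward zero and the sample approaches pure noise, which is why controlling this step inside train_step matters for custom training schemes.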

Other updates:

  • Compatibility for non-wrapped models improved
  • Minor bugs fixed
  • Typing improvement

v1.1 (Release Candidate 1)

19 Aug 16:57
Pre-release

API Updates:

  • Add SDE configurations to the configs package
  • Introducing a build function to build a non-timed wrapped UNet
  • Introducing managers.Manager, one manager for all separated nn.DiffusionModule classes
  • Introducing nn.diffusion.FastSamplingDiffusionModule
  • Introducing nn.LatentMode for nn.LatentDiffusionModule to encode, decode, or forward
  • Introducing sde.SDEType to load SDEs with default settings
  • Introducing the separated nn.DDPM and nn.LatentDiffusionModule
  • Introducing the separated nn.DiffusionModule, which separates the diffusion process into a PyTorch module
  • Introducing the separated nn.SDEModule
  • Multi-GPU support for nn.DiffusionModule
  • Parsing additional arguments when forwarding in nn.diffusion.LatentDiffusionModule
  • The new separated nn.DiffusionModule no longer needs an nn.TimedModule; a plain torch.nn.Module works instead, improving compatibility with other UNets
  • Use prior_sampling in SDE instead of direct randomization when adding noise

Other updates:

  • Compatibility for non-wrapped models improved
  • Minor bugs fixed
  • Typing improvement

Release candidate updates:

  • Add forward diffusion control in the train_step and test_step functions

v1.0.3

12 Jul 15:50

Updates:

  • Minor bugs fixed

v1.0.2

22 Apr 19:52

Updates:

  • Minor bugs fixed
  • Remove reversed typing for sampling_range
  • Replace the possible type of sampling_range from typing.Iterable[int] to typing.Sequence[int]

v1.0.1

21 Mar 18:33

Updates:

  • Minor bugs fixed in SDE
  • Remove deprecated package

v1.0

29 Jan 15:56

Main release:

  • Introducing an abstract DiffusionManager to control all diffusion models
  • Introducing data package with a noised data type data.DiffusionData and an unsupervised dataset data.SequenceDataset
  • Introducing DDPMManager and SDEManager for DDPM and SDE implementation
  • Introducing metrics.LPIPS using the lpips package
  • Introducing scheduling package for beta scheduling in DDPM
  • Introducing sde package for SDE implementation
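
To illustrate the kind of beta scheduling the scheduling package covers, here is a minimal sketch of one widely used schedule, the cosine schedule of Nichol & Dhariwal (2021). This is an independent reimplementation of the published formula, not necessarily the package's own implementation:

```python
import math

def cosine_betas(T: int, s: float = 0.008) -> list:
    """Cosine beta schedule: betas are derived from successive ratios of
    alpha_bar(t) = cos^2(((t/T + s) / (1 + s)) * pi/2), clipped at 0.999."""
    def f(t: float) -> float:
        return math.cos((t / T + s) / (1.0 + s) * math.pi / 2.0) ** 2

    return [min(1.0 - f(t) / f(t - 1), 0.999) for t in range(1, T + 1)]
```

Compared with a linear schedule, the cosine schedule destroys information more slowly at the start of the chain, which is the usual motivation for making the schedule pluggable.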