I am trying to experiment artistically with your great and clear work.
I am looking for smaller articulated grains. From my experiments, I find the synthesis too close to ordinary granular synthesis.
I can train new models without any issue.
In Finetune_Dance_Diffusion.ipynb, changing the sample size to anything other than 65536 raises a Torch error:
RuntimeError: Argument #4: Padding size should be less than the corresponding input dimension, but got: padding (3, 3) at dimension 2 of input [2, 512, 2]
==> How would you change sample-size in Finetune_Dance_Diffusion, which runs train_uncond.py?
I have the feeling the grain size used for training is controlled somewhere else, though. Where can I set that?
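For what it's worth, the traceback suggests the sample size is colliding with the U-Net's downsampling stack: each level halves the audio length, and the innermost convolution pads by 3, so the innermost feature map must stay longer than that padding. A minimal sketch of the arithmetic, where the level count is an assumption for illustration (not read from the repo) and the padding of 3 comes from the error message:

```python
# Hedged sketch: why a smaller sample size can break the U-Net.
# n_levels is assumed for illustration; padding=3 is from the traceback.

def innermost_length(sample_size: int, n_levels: int) -> int:
    """Length of the deepest feature map after n_levels of /2 downsampling."""
    return sample_size // (2 ** n_levels)

def min_sample_size(n_levels: int, padding: int = 3) -> int:
    """Smallest power-of-two sample size whose innermost feature map
    is still longer than the conv padding, so F.pad does not complain."""
    length = 1
    while length <= padding:
        length *= 2
    return length * (2 ** n_levels)

# With 13 assumed levels: 65536 gives an innermost length of 8 (works),
# while 16384 gives 2, smaller than the padding of 3, which matches the
# "input [2, 512, 2]" in the traceback.
```

So a smaller sample size may only work if it stays a sufficiently large power of two, or if the number of downsampling levels in the model config is reduced along with it.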
Additional question: I have to provide an already existing model (args.ckpt_path in train_uncond.py), even though it is only used as a starting point. Is there a way to avoid this? Also, I assume the sample size of that model matters if I want to change mine.
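One way to avoid this, if the script does not already support it, would be to make the checkpoint argument optional and skip loading when it is absent. A hedged sketch with a hypothetical helper (only args.ckpt_path comes from the script; everything else here is assumed):

```python
import os

def resolve_start_state(ckpt_path):
    """Hypothetical helper: decide how training should start.

    Returns "scratch" when no checkpoint path is given (train from a
    randomly initialised model) and "finetune" when the path exists.
    A patched train_uncond.py could guard its checkpoint-loading call
    with a check like this instead of requiring args.ckpt_path.
    """
    if not ckpt_path:
        return "scratch"
    if not os.path.exists(ckpt_path):
        raise FileNotFoundError(ckpt_path)
    return "finetune"
```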
Thank you very much!!