
Change sample size for training new models #34

Open
opasquetdotfr opened this issue Aug 16, 2024 · 0 comments

Hello!

I am trying to experiment artistically with your great, clear work.
I am looking for smaller articulated grains: in my experiments, the synthesis sounds too close to ordinary granular synthesis.

I can train new models without any issue.
In Finetune_Dance_Diffusion.ipynb, changing the sample size to anything other than 65536 raises an error from Torch:
RuntimeError: Argument #4: Padding size should be less than the corresponding input dimension, but got: padding (3, 3) at dimension 2 of input [2, 512, 2]
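From what I can tell, this error shows up when the sample size is too small for the network's depth: if the U-Net halves the time dimension at each down-block, the feature map at the bottom can become shorter than the conv padding (here length 2 versus padding 3). A minimal sketch of that constraint, assuming stride-2 downsampling at every block (the depth value 13 below is only an illustration, not read from the repo):

```python
# Hedged sketch: if the network halves the time dimension at each of
# `n_down` blocks and the deepest conv uses symmetric padding `pad`,
# the deepest feature map must be strictly longer than the padding.
def min_sample_size(n_down: int, pad: int = 3) -> int:
    bottom = pad + 1          # smallest legal length at the deepest level
    return bottom * (2 ** n_down)  # undo the halvings to get the input length

# With a hypothetical depth of 13 (65536 / 2**13 == 8, comfortably > 3):
print(min_sample_size(13))  # → 32768
```

So any candidate sample size would need to be a multiple of 2**n_down and large enough that repeated halving never drops below the padding, which would explain why arbitrary values fail.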

==> How would you change sample-size in Finetune_Dance_Diffusion, which runs train_uncond.py?

I have the feeling that getting smaller grains for training is happening somewhere else, though. Where can I control that?

Additional question: I need to provide an already existing model (args.ckpt_path in train_uncond.py) even though it is only used as a starting point. Is there a way to avoid this? I also guess the sample size of that model matters if I want to change mine.

Thank you very much!!
