Apply on custom data #10

Open
WarrenSkywalker opened this issue Dec 24, 2024 · 1 comment
Comments

@WarrenSkywalker

Hello, I have reproduced your work, and the results are quite good, so I tried to apply your method to my own data. I converted the data into the 3D-FRONT dataset format and ran training successfully. However, the loss dropped very quickly, stabilizing around 0.01 after approximately 2,000 epochs, and when I used the trained model for generation the results were not very satisfactory. Has MiDiffusion been fine-tuned specifically for 3D-FRONT? Or do you have any advice for me? Thanks in advance.

@SiyiHu
Collaborator

SiyiHu commented Jan 3, 2025

We have not fine-tuned MiDiffusion or trained models from scratch on other datasets. We only tested our models on a few custom floor plans and found that they work quite well as long as the floor plans are not too different from the training data. Your dataset might be quite different from 3D-FRONT. Our training loss usually drops to about 0.01 after ~60% of the training epochs (i.e. 30k or 60k epochs, depending on the dataset). The training loss does not change much afterwards unless plotted on a log scale. You can try training the models for more epochs and viewing the training loss on a log scale.
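
For reference, here is a minimal sketch (not from the MiDiffusion codebase) of what viewing the training loss on a log scale looks like, assuming the per-epoch loss values have been dumped to a plain text file; `train_loss.txt` is a hypothetical path, so adapt it to however your training script logs its loss:

```python
# Minimal sketch: plot the training loss on a log scale to see late-stage progress.
# Assumes one loss value per line in "train_loss.txt" (hypothetical path).
import numpy as np
import matplotlib.pyplot as plt

losses = np.loadtxt("train_loss.txt")   # one loss value per epoch
epochs = np.arange(1, len(losses) + 1)

plt.plot(epochs, losses)
plt.yscale("log")                        # log scale reveals slow improvements near ~0.01
plt.xlabel("epoch")
plt.ylabel("training loss (log scale)")
plt.title("Training loss")
plt.show()
```

On a linear scale the curve looks flat after the initial drop; on a log scale you can still see whether the loss is slowly decreasing, which helps decide whether training for more epochs is worthwhile.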
