
Unstable results #35

Open
CuthbertCai opened this issue Jun 20, 2022 · 2 comments

Comments

@CuthbertCai

Hi, thanks for your contribution!

I've re-run the model twice with the GCC+RedCaps config. On the first run I got 51.9 on ImageNet and 11.6 on COCO, while on the second I got 54.3 on ImageNet and 25.8 on COCO. The results do not seem stable.

Is this level of run-to-run variance normal for this model?

@xvjiarui
Contributor

Hi @CuthbertCai

Sorry for the late reply.

Are you running training or inference? If you are running training, fixing the random seed will make the results more stable.
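
For reference, a minimal sketch of what fixing the seed typically looks like in PyTorch (`set_seed` here is a hypothetical helper, not part of this repo, and the exact flags exposed by the training config may differ):

```python
import os
import random

import numpy as np
import torch


def set_seed(seed: int = 42) -> None:
    """Fix the common sources of randomness for more reproducible training."""
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    torch.cuda.manual_seed_all(seed)
    # Make cuDNN pick deterministic kernels; this can slow training down.
    torch.backends.cudnn.deterministic = True
    torch.backends.cudnn.benchmark = False
    # Some cuBLAS ops additionally need this env var to run deterministically.
    os.environ["CUBLAS_WORKSPACE_CONFIG"] = ":4096:8"


set_seed(42)
```

Note that even with a fixed seed, multi-GPU data-parallel training and some non-deterministic CUDA ops can still introduce a small amount of run-to-run variance.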

@bryanyzhu

@CuthbertCai Hi, do you have results on VOC? Are they stable? We used the GCC+RedCaps config and got very different results across runs. Does fixing the seed help with stability? Thank you.
