Details of the competition reg: #5
Hello @Narayan2407, Thank you for these questions and your interest in our NetML Competition 2020. At the beginning of this competition, we considered awarding 3 prizes in total, one for each dataset. For example, Track 1 and Track 2 results would be used to determine the winner of the first prize, Track 3 and Track 4 would be for the second prize, and Tracks 5 to 7 for the third prize. After a few weeks, we increased the number of prizes from 3 to 7, one for the winner of each of the 7 tracks. After this change, I guess, there was an issue while updating the eval.ai page. This ambiguity is now resolved, thanks to your attention. To determine the final score of each track, the overall scores obtained from the 'dev' and 'challenge' phases are arithmetically averaged. You are right that the previously announced numbers are not the exact averages. We have fixed that, but it doesn't change the rankings. Thanks for your contributions! Best,
Hello @onurbarut Regards,
Hello again @onurbarut. We know that prize money will be awarded only to the winners in each of the 7 tracks, but will the runners-up and second runners-up at least receive an acknowledgment or certificate of any kind? It would be great for us to add that to our resumes, which will help us in the near future. I hope you understand my point, sir. Best,
Hi @Narayan2407, Thank you for your interest and contributions to our NetML Competition 2020. We will check with our sponsor, Intel Corp., whether we can provide small prizes for the runners-up or second runners-up. Please stay tuned!
Hello @onurbarut. |
Hello again @onurbarut
I am Narayan, a representative of Team01_5C_2020_SoCSE_KLETech.
I have some doubts related to the competition results.
These lines are from the challenge overview page (hosted on EvalAI.org): "Three winners will be announced for each track. In order to win the prize, for example for non-vpn2016 toplevel track, a team must submit their results for both non-vpn2016_toplevel_dev and vpn2016_toplevel_challenge phases. The final score for each track will be calculated by averaging the 'overall' scores of both dev and challenge phases."
As per this, 3 winners will be announced for each of the 7 tracks, but as of now only 2 winners per track have been announced, so we would like an update on this matter. Also, as stated earlier, the final score will be calculated by averaging the overall scores of both phases (dev and challenge). For example, in Track 3 – Malware Detection using top-level annotations with the CICIDS2017 dataset, we achieved a score of 0.98573 in the development phase and 0.98550 in the challenge phase, so the average should be around 0.985615, but the score shown on the website is 0.97144.
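A minimal sketch of the averaging described on the overview page, using our Track 3 numbers (the variable names are illustrative, not from the competition code):

```python
# Final track score = arithmetic mean of the 'overall' scores
# from the dev and challenge phases, per the overview page.
dev_score = 0.98573        # Track 3, development phase
challenge_score = 0.98550  # Track 3, challenge phase

final_score = (dev_score + challenge_score) / 2
print(f"{final_score:.6f}")  # ≈ 0.985615, not the 0.97144 shown on the site
```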
Could you please clarify the above-mentioned doubts? We would be grateful.
Regards,
Narayan Kulkarni