
Accessing other pre-trained models in Table 1 and 2 #25

Open
yliu1240 opened this issue Oct 31, 2024 · 1 comment

Comments

@yliu1240

Hi, all!
I'm wondering whether the other pre-trained models in Tables 1 and 2 are the models released with their original papers, or whether you trained separate models using their strategies. If you trained separate versions, are those models publicly available as well?

Thank you so much!

Best regards,

@ZJU-Fangyin
Contributor

Hi,

In our study, we compared KANO with 12 baseline methods: GCN, GIN, N-GRAM, Hu et al., MGSSL, GraphMVP, MPNN, DMPNN, CMPNN, GEM, GROVER, and MolCLR. The results for GCN, GIN, N-GRAM, and Hu et al. were taken from the MolCLR paper, while the results for MGSSL and GraphMVP were taken from their original papers. We reproduced the results of MPNN, DMPNN, CMPNN, GEM, GROVER, and MolCLR using their respective GitHub code repositories. To ensure a fair comparison, we followed the experimental setup of previous works and ran each experiment on three different train/val/test splits.
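
For reference, a minimal sketch of how results averaged over three seeded train/val/test splits might be produced. This is an illustrative example only, not the repository's actual pipeline: the dataset size, split ratios, seeds, and the placeholder `train_and_evaluate` function are all assumptions.

```python
"""Illustrative sketch: average a metric over three random train/val/test splits.
NOT the KANO/chemprop pipeline; all sizes, ratios, and seeds are placeholders."""
import numpy as np
from sklearn.model_selection import train_test_split


def make_split(n_samples, seed, frac_train=0.8, frac_val=0.1):
    """Return train/val/test index arrays for one random seed."""
    idx = np.arange(n_samples)
    train_idx, rest_idx = train_test_split(
        idx, train_size=frac_train, random_state=seed)
    # Divide the remaining 20% into validation and test portions.
    val_idx, test_idx = train_test_split(
        rest_idx, train_size=frac_val / (1.0 - frac_train), random_state=seed)
    return train_idx, val_idx, test_idx


def train_and_evaluate(train_idx, val_idx, test_idx, seed):
    """Placeholder for model training/evaluation; returns a dummy test score."""
    rng = np.random.default_rng(seed)
    return rng.uniform(0.7, 0.9)  # stand-in for a real test ROC-AUC


# Run the experiment on three different splits and report mean +/- std.
scores = [train_and_evaluate(*make_split(1000, seed), seed) for seed in (0, 1, 2)]
print(f"ROC-AUC over 3 splits: {np.mean(scores):.3f} +/- {np.std(scores):.3f}")
```

In practice the splitting strategy (e.g., random vs. scaffold split) should match whatever the benchmark and baselines used, so that the averaged numbers are comparable.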
