How to use SentiCap dataset and FlickrStyle10k dataset? #1
Comments
Thanks for your positive comment. We follow the dataset splits used in MSCap and MemCap to train a TextCNN classifier. Details of the splits and the TextCNN code are as follows:
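For readers unfamiliar with the classifier mentioned above, here is a minimal NumPy sketch of a TextCNN forward pass. This is not the authors' code; the vocabulary size, embedding dimension, filter widths, and filter counts are illustrative assumptions only. It shows the usual TextCNN shape: embedding lookup, parallel 1-D convolutions of several widths, ReLU plus max-over-time pooling, and a final linear layer producing class logits.

```python
import numpy as np

# Hypothetical minimal TextCNN forward pass (NumPy). All hyperparameters
# below are illustrative assumptions, not the authors' settings.
rng = np.random.default_rng(0)

VOCAB, EMB, N_FILT, WIDTHS, N_CLASS = 100, 16, 8, (3, 4, 5), 2

emb_table = rng.normal(size=(VOCAB, EMB))                      # token embeddings
conv_w = {w: rng.normal(size=(N_FILT, w, EMB)) * 0.1 for w in WIDTHS}
fc_w = rng.normal(size=(N_FILT * len(WIDTHS), N_CLASS)) * 0.1  # classifier head
fc_b = np.zeros(N_CLASS)

def textcnn_logits(token_ids):
    """Return class logits for one tokenised sentence (list of int ids)."""
    x = emb_table[token_ids]                      # (seq_len, EMB)
    pooled = []
    for w, filt in conv_w.items():
        # slide each width-w filter over all windows of the sentence
        windows = np.stack([x[i:i + w] for i in range(len(token_ids) - w + 1)])
        feats = np.einsum('nwe,fwe->nf', windows, filt)   # (n_windows, N_FILT)
        pooled.append(np.maximum(feats, 0).max(axis=0))   # ReLU + max-over-time
    h = np.concatenate(pooled)                    # (N_FILT * len(WIDTHS),)
    return h @ fc_w + fc_b                        # (N_CLASS,) class logits

logits = textcnn_logits([1, 5, 7, 2, 9, 3, 4])
print(logits.shape)  # (2,)
```

In practice the weights would be trained with cross-entropy on the style-labelled captions from the splits above, rather than drawn at random.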
Your reply has been a great help to me. Thank you very much.
Dear author, I am reproducing your experiment in Table 4. I set the parameters sentiment_type, candidate_k, num_iterations, α, β, γ, and sentence_len to positive, 200, 15, 0.02, 2, 5, and 10. The test results show BLEU-3 of 1.45, METEOR of 7.65, and CLIP-S of 0.99, which differ slightly from the results in Table 4. In addition, I trained a TextCNN sentiment classifier to calculate Acc, but I was unable to reproduce the Acc reported in the table. Could you share more of the experimental details behind Table 4 and provide the complete code for calculating Acc?
Dear author, I have the same question about the results in Table 4. Could you provide the complete code for training the TextCNN classifier and calculating Acc?
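While waiting for the authors' code, the Acc metric asked about above is conventionally computed by running a pre-trained sentiment classifier over the generated captions and reporting the fraction whose predicted style matches the intended one. The sketch below assumes this conventional definition; `toy_classify` is a hypothetical stand-in for a real trained classifier such as a TextCNN.

```python
# Hedged sketch of the style-accuracy metric (Acc) as conventionally defined
# for stylised captioning: fraction of generated captions whose predicted
# style matches the target style. `classify` is any caption -> label function.

def style_accuracy(captions, target_style, classify):
    """Fraction of captions the classifier assigns to the intended style."""
    preds = [classify(c) for c in captions]
    return sum(p == target_style for p in preds) / len(captions)

# Toy stand-in classifier (illustration only): flags a caption as "positive"
# if it contains an obviously positive word.
def toy_classify(caption):
    return "positive" if any(w in caption for w in ("lovely", "happy")) else "negative"

acc = style_accuracy(
    ["a lovely dog runs", "a dog runs", "a happy child smiles"],
    "positive",
    toy_classify,
)
print(round(acc, 3))  # 2 of 3 captions match -> 0.667
```

The exact Acc reported in the paper would additionally depend on the classifier's training data and splits, which only the authors can confirm.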
Can you tell me where the factual captions came from? Did you randomly choose from the coco test or somewhere else? |
Dear author, your paper is very creative and I am very interested in it. I am preparing to reproduce your experiment, but the code you provided does not seem to explain how to use the SentiCap and FlickrStyle10k datasets. Can you give me some guidance? I would appreciate it if you could take the time to reply.