Experiment on STAR dataset #5
Hi, please find the features for STAR here.
Thank you so much @doc-doc. That's much better and helpful. Regarding the data, did you use the [Raw Videos from Charades (scaled to 480p) mp4] with the AG dump tool to extract the frames, or did you extract features directly from the [RGB frames at 24fps (76 GB)]? I ask because they have different fps (the AG dump tool extracts samples based on each video's original fps, given the annotation file), so I may not be able to reproduce the frames you extracted the features from. I appreciate your assistance.
Hi, we use ffmpeg and decode each video (or the QA-related segment for STAR) at 3 fps.
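A minimal sketch of that decoding step, assuming a plain ffmpeg call with an `fps` filter; the authors' exact flags and segment handling are not shown in the thread, so the helper below is hypothetical:

```python
import subprocess

def ffmpeg_frame_cmd(video_path, out_dir, fps=3, start=None, duration=None):
    """Build an ffmpeg command that decodes a video (or a segment of it,
    e.g. the QA-related clip for STAR) into JPEG frames at a fixed fps.
    Hypothetical helper; the repo's actual flags may differ."""
    cmd = ["ffmpeg", "-y"]
    if start is not None:        # seek to the segment start (seconds)
        cmd += ["-ss", str(start)]
    if duration is not None:     # limit decoding to the segment length (seconds)
        cmd += ["-t", str(duration)]
    cmd += ["-i", video_path, "-vf", f"fps={fps}", f"{out_dir}/%06d.jpg"]
    return cmd

# To actually decode: subprocess.run(ffmpeg_frame_cmd(...), check=True)
```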
@doc-doc Thank you so much for your explanation. Did you use the original scale or the downscaled one (480p)? |
It should be the original scale.
Hello, I am also interested in running some experiments on the STAR dataset.
hi, please find the updated link here. |
Thank you so much :) |
Hello,
I want to conduct some experiments on the STAR dataset and noticed that there are parts of the code where you try to load its data. I was wondering if I could have access to the files needed to load the STAR dataset and extract its features, e.g.:
```python
if self.dset == 'star':
    self.vid_clips = load_file(osp.dirname(csv_path) + f'/clips_{self.mode}.json')
```
It would be great if you could also share how you sampled the data into clips based on the id (referring to the JSON file). Did you follow the same clip-wise sampling that exists in preprocess_features for every qid?
```python
def get_video_feat_star(self, video_name, qid, width=320, height=240):
    ...
```
I sincerely appreciate your help.
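While waiting for the files, the clip grouping asked about above can be approximated as follows. Both the uniform sampling scheme and the JSON layout shown in the comment are assumptions, not the authors' confirmed format:

```python
import numpy as np

def sample_clips(num_frames, num_clips=8, frames_per_clip=4):
    """Split a video's frame indices into `num_clips` uniform segments and
    take `frames_per_clip` evenly spaced frames within each segment.
    A guess at the clip-wise sampling in preprocess_features; may differ."""
    bounds = np.linspace(0, num_frames, num_clips + 1).astype(int)
    clips = []
    for i in range(num_clips):
        lo = bounds[i]
        hi = max(bounds[i + 1] - 1, lo)  # last valid frame index in this segment
        clips.append(np.linspace(lo, hi, frames_per_clip).astype(int).tolist())
    return clips

# A clips_{mode}.json entry might then map each video/qid to such a list, e.g.:
# {"video_001": [[0, 3, 7, 11], [12, 15, 19, 23], ...]}  (hypothetical layout)
```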