It seems that nimare.io.convert_neurosynth_to_dataset can take a while to finish. This makes writing an analysis script a bit tedious, because the user has to wait for this function to finish before other analysis steps can be tried out. Maybe offering caching here, e.g. nimare.io.convert_neurosynth_to_dataset(cache_dir='../path/to/cache/dir'), would help?
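Something along these lines is what I have in mind; this is only a sketch (the wrapper function and the cache file name are made up, and the converter's own arguments are simply passed through):

import os
from nimare import dataset, io

def convert_neurosynth_cached(cache_dir, *args, **kwargs):
    # Hypothetical helper illustrating the proposed cache_dir behavior:
    # reuse a previously converted Dataset if one is found in cache_dir,
    # otherwise run the (slow) conversion once and save the result there.
    os.makedirs(cache_dir, exist_ok=True)
    cache_file = os.path.join(cache_dir, "neurosynth_dataset.pkl")
    if os.path.isfile(cache_file):
        return dataset.Dataset.load(cache_file)
    dset = io.convert_neurosynth_to_dataset(*args, **kwargs)
    dset.save(cache_file)
    return dset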
I think we currently have a workaround for this. Most NiMARE classes support save/load methods, which save the object as a pickle file and load a pickle file back as a NiMARE object.
Generally, the Neurosynth dataset only needs to be downloaded and converted into a NiMARE Dataset object once. You can save it with dset.save(dset_fn), keep the file somewhere accessible, and load it with dset = nimare.dataset.Dataset.load(dset_fn) whenever it needs to be reused.
This is not as elegant as caching the function, but at least it reduces the overhead.
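For example (the file name is arbitrary; any accessible path works):

from nimare.dataset import Dataset

# After the one-time (slow) conversion:
dset.save("neurosynth_dataset.pkl")

# In later analysis sessions, skip the conversion entirely:
dset = Dataset.load("neurosynth_dataset.pkl")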