Memory issues and unloading data from memory #672

mattwarkentin started this conversation in General

Replies: 2 comments · 8 replies
-
I don't know if it is super hacky or bad form, but something like:

```python
class Image(dict):
    def unload(self):
        new_image = Image(...)
        self.__dict__.update(new_image.__dict__)
```
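For what it's worth, the dict-swap idea can be made concrete with a minimal, self-contained sketch. The `Image` class here is a hypothetical stand-in (its `path`/`_data` attributes are assumptions for illustration, not torchio's actual implementation):

```python
class Image(dict):
    """Hypothetical stand-in for an image class that caches its voxel data."""

    def __init__(self, path=None):
        super().__init__()
        self.path = path
        self._data = None

    def load(self):
        # Simulate reading a large volume from disk on first access.
        if self._data is None:
            self._data = bytearray(1024 * 1024)  # stand-in for voxel data
        return self._data

    def unload(self):
        # Replace this instance's attributes with those of a fresh, unloaded
        # instance; the cached data loses its last reference and can be
        # garbage-collected.
        fresh = Image(self.path)
        self.__dict__.clear()
        self.__dict__.update(fresh.__dict__)


img = Image('subject1.nii.gz')
img.load()
assert img._data is not None
img.unload()
assert img._data is None  # memory can now be reclaimed
```

Simply setting `self._data = None` inside `unload()` would achieve the same effect with less machinery; the dict-swap variant has the advantage of resetting *all* instance attributes to their freshly-constructed state.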
-
This is a good point! But the memory shouldn't grow, thanks to the copy that is performed here: torchio/torchio/data/dataset.py, line 82 (commit db45b77) — loading happens on the copy, and the original stays untouched.
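The copy-before-load pattern referenced above can be sketched in a few lines. This is a simplified illustration of the idea (the `LazyImage`/`Dataset` names and attributes are hypothetical, not torchio's actual classes):

```python
import copy


class LazyImage:
    """Hypothetical lazy image: data is only read when load() is called."""

    def __init__(self, path):
        self.path = path
        self.data = None

    def load(self):
        if self.data is None:
            self.data = bytearray(1024)  # stand-in for voxel data


class Dataset:
    """Sketch of the copy-then-load pattern: __getitem__ deep-copies the
    stored subject before loading, so the stored original never holds data."""

    def __init__(self, images):
        self.images = images

    def __getitem__(self, index):
        image = copy.deepcopy(self.images[index])  # stored original untouched
        image.load()  # only the copy holds data; it is freed once the
        return image  # batch goes out of scope and is garbage-collected


ds = Dataset([LazyImage('a.nii.gz'), LazyImage('b.nii.gz')])
sample = ds[0]
assert sample.data is not None      # the returned copy is loaded
assert ds.images[0].data is None    # the stored original stays unloaded
```

Because each returned sample is an independent copy, the memory held by a batch is reclaimed as soon as the training loop drops its reference, so memory should not accumulate across an epoch.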
-
Hi @fepegar,

I'm not sure if this question/discussion makes much sense, but here goes. I've been running into some memory issues during training, and I've been trying to think about what might be happening. My batch sizes are already pretty small (~2-6). I know `torchio` uses lazy loading to defer reading images until necessary, but as the epoch proceeds, isn't it true that more and more subjects will have their data loaded into memory? Is this memory ever released? Of course, we only ever need the current batch in memory, and all previous subjects can be unloaded.

I'm wondering if the `Image` class should have an `unload()` method to go along with `load()`, to relinquish memory and keep the memory footprint as small as possible. Similarly, the `Subject` class would have an `unload()` method to unload all images from memory.

Maybe I have an incorrect mental model of how things are working behind the scenes. I look forward to discussing.