Unbounded RAM usage? #993
Unanswered
guillemc23 asked this question in Q&A
Replies: 1 comment · 3 replies
-
@guillemc23 can you try using […]? The memory usage has been quite a weird thing, and it's not limited to titiler.
-
Hey! I'm trying to deploy a minimal TiTiler instance in a Docker container. I'm running the latest version, pulling my image from `ghcr.io/developmentseed/titiler:0.18.9`. I'm not adding any custom endpoints, just the extra `cogValidateExtension`, `cogViewerExtension` and `ColorMapFactory`.

I have noticed that RAM usage seems unbounded (the maximum I've seen before killing the process was around 20 GB). I have tried setting resource limits in my `docker-compose.yml` and tuning GDAL environment variables.
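Roughly, the relevant part of the setup looks like the sketch below; the memory limit and cache values are illustrative placeholders rather than exact settings, and the GDAL keys are standard GDAL configuration options:

```yaml
# Sketch of a docker-compose.yml; values are placeholders, not a tested recommendation.
services:
  titiler:
    image: ghcr.io/developmentseed/titiler:0.18.9
    ports:
      - "8000:8000"
    environment:
      # Cap GDAL's raster block cache (small values are interpreted as MB).
      - GDAL_CACHEMAX=200
      # Per-file-handle VSI cache, in bytes.
      - VSI_CACHE=TRUE
      - VSI_CACHE_SIZE=5000000
      # Global cache for /vsicurl/ range requests, in bytes.
      - CPL_VSIL_CURL_CACHE_SIZE=200000000
      # Don't list the remote directory when opening a dataset.
      - GDAL_DISABLE_READDIR_ON_OPEN=EMPTY_DIR
      - GDAL_HTTP_MERGE_CONSECUTIVE_RANGES=YES
    # Hard memory cap for the container (honored by `docker compose` v2+).
    deploy:
      resources:
        limits:
          memory: 4g
```

One thing worth noting: as far as I understand, `GDAL_CACHEMAX` applies per process, so if the image runs several Uvicorn workers the total block cache can be a multiple of that value.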
I am not sure what effect these variables actually have, since RAM usage keeps growing all the time, looking like a memory leak. To me, it looks as if GDAL is not able to properly close the datasets after reading.

This is what `docker stats` reports for this container, with RAM usage limited to 4 GB; I haven't seen it go down for hours now. I'm not sure if this is expected or if some of my configuration parameters are wrong, so any kind of input would be appreciated! If you need any extra information, I will be happy to provide it.
Thank you!
EDIT: This was happening to me as well when running TiTiler v0.15.6 outside of Docker.