-
Would a solution to this be to transfer big files (artifacts) in chunks? Are all the chunks cached too? And if so, are the chunks discarded after they are transferred?
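To illustrate the idea behind the question: a chunked transfer only ever holds one fixed-size buffer in memory, and each chunk can be discarded (the buffer reused) as soon as it is written out. This is a generic sketch, not the connector's actual transfer code; `copyInChunks` and its parameters are hypothetical names.

```java
import java.io.*;

public class ChunkedCopy {
    // Copy a stream in fixed-size chunks so that at most one chunk
    // is held in memory at a time, regardless of total file size.
    static long copyInChunks(InputStream in, OutputStream out, int chunkSize) throws IOException {
        byte[] buffer = new byte[chunkSize];
        long total = 0;
        int n;
        while ((n = in.read(buffer)) != -1) {
            out.write(buffer, 0, n); // the chunk is "discarded" here: the buffer is simply reused
            total += n;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        byte[] data = new byte[1_000_000]; // stand-in for a large artifact
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        long copied = copyInChunks(new ByteArrayInputStream(data), sink, 8192);
        System.out.println(copied);
    }
}
```

With this pattern, peak memory use is bounded by `chunkSize` (8 KiB here) rather than by the artifact size, which is why streaming transfers avoid heap exhaustion even for multi-gigabyte files.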
-
Hello, I'm using the connector instance in a data producer-consumer scenario on Docker. I expose some datasets via the Nginx service, register these resources on the producer side, and then (after negotiation) try to download them on the consumer side. Downloading a 350MB file works fine (apart from taking 2.1 minutes), but when I try to fetch a 500+MB file I get a "Caused by: java.lang.OutOfMemoryError: Java heap space" error on the producer side. I tried to increase the heap size using environment variables, but that doesn't seem to work. Any clue?
Thanks for any response.
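One common pitfall when raising the heap of a JVM service in Docker is that a custom environment variable only takes effect if the container's launch script actually forwards it to the `java` command. `JAVA_TOOL_OPTIONS`, by contrast, is picked up automatically by HotSpot JVMs. This is a hedged sketch (the image name is a placeholder for the actual connector image):

```shell
# JAVA_TOOL_OPTIONS is read by the JVM itself at startup,
# so it works even if the entrypoint ignores custom variables.
docker run -e JAVA_TOOL_OPTIONS="-Xmx2g" connector-image:latest
```

Note that if the producer buffers the entire artifact in memory during transfer, raising `-Xmx` only postpones the error for larger files; a streaming/chunked transfer is the structural fix.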