S3 putObject memory leak #2824
Hi @tamslinn, sorry to hear about your issues. Would it be possible for you to provide some metrics where we can see the memory consumption before you start running your tests and after they finish? You could do something as follows:

```php
<?php
$startingMemoryUsage = memory_get_usage();

// Do your operations

$endingMemoryUsage = memory_get_usage();
$consumed = $endingMemoryUsage - $startingMemoryUsage;
echo "Memory consumed in bytes is " . $consumed;
```

I look forward to your response. Thanks!
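Note that `memory_get_usage()` before and after the call can miss a transient spike that is freed before the second measurement; `memory_get_peak_usage()` captures it. A small sketch of measuring both (the `runWithMemoryStats()` helper and the allocation inside the callable are illustrative, not part of the SDK):

```php
<?php
// Measure both the net and the peak memory cost of an operation.
// runWithMemoryStats() is an illustrative helper, not an SDK function.
function runWithMemoryStats(callable $operation): array {
    $start = memory_get_usage();
    $operation();
    return [
        'net_bytes'  => memory_get_usage() - $start,
        'peak_bytes' => memory_get_peak_usage(),
    ];
}

// Stand-in for the upload: allocate roughly 1 MB temporarily.
$stats = runWithMemoryStats(function () {
    $buffer = str_repeat('#', 1024 * 1024);
    unset($buffer); // freed again, so net usage stays small while peak rises
});

echo "Net: {$stats['net_bytes']} bytes, peak: {$stats['peak_bytes']} bytes\n";
```

Passing the real `putObject` call as the callable would show whether the SDK itself retains memory (high net) or merely spikes during the request (high peak, low net).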
Hi @yenfryherrerafeliz - apologies for the delayed reply, I have been off for a few days. I've just run the upload function with the memory metrics as requested. This is the output when uploading a 100 MB file to AWS:

[metrics output attached as a screenshot in the original comment]

This is a bit odd, as I think that equates to about 12 MB? In the Digital Ocean memory usage console I can see that approximately 100 MB of RAM is being consumed and not released. I've attached a small screenshot - this is running on a 512 MB instance. I'm not sure if this means it is a filesystem issue rather than a PHP issue?
Hi @tamslinn, I still need to do more investigation around this, but I think there could be an issue when the provided body is not a resource that you open and close yourself, for example:

```php
<?php
require '../vendor/autoload.php';

use Aws\S3\S3Client;

$client = new S3Client(['region' => getenv('TEST_REGION')]);
$cmd = $client->getCommand('putObject', [
    'Bucket' => getenv('TEST_BUCKET'),
    'Key' => 'test.file.txt',
    'Body' => str_repeat('#', self::TEST_BODY_SIZE)
]);
$client->execute($cmd);
```

But if I do:

```php
<?php
require '../vendor/autoload.php';

use Aws\S3\S3Client;

$client = new S3Client(['region' => getenv('TEST_REGION')]);
$file = fopen(self::TEST_FILE_PATH, 'r');
$cmd = $client->getCommand('putObject', [
    'Bucket' => getenv('TEST_BUCKET'),
    'Key' => 'test.file.txt',
    'Body' => $file
]);
$client->execute($cmd);
fclose($file);
```

there is no open file left floating around. Here is how I check which files are still open:

```php
$filesStillOpen = [];
$resources = get_resources();
foreach ($resources as $resource) {
    if (get_resource_type($resource) !== 'stream') continue;
    if (ftell($resource) !== false) {
        $meta = stream_get_meta_data($resource);
        $filesStillOpen[] = $meta['uri'];
    }
}
print_r($filesStillOpen);
```

I will keep investigating and fix this if needed, but as a workaround you could do the following:

```php
$resources = get_resources();
foreach ($resources as $resource) {
    if (get_resource_type($resource) !== 'stream') continue;
    if (ftell($resource) !== false) {
        $meta = stream_get_meta_data($resource);
        if (strpos($meta['uri'], 'php://temp') !== false) {
            fclose($resource);
        }
    }
}
```

Thanks!
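That workaround can be wrapped in a small helper so it is easy to call after each upload. This is a sketch based on the snippet above; `closeTempStreams()` is an illustrative name, not an SDK function:

```php
<?php
// Close any still-open php://temp streams and return how many were closed.
// Sketch of the workaround from this thread; not part of the AWS SDK.
function closeTempStreams(): int {
    $closed = 0;
    foreach (get_resources('stream') as $resource) {
        $meta = stream_get_meta_data($resource);
        if (strpos($meta['uri'], 'php://temp') !== false) {
            fclose($resource);
            $closed++;
        }
    }
    return $closed;
}

// Usage, e.g. after $client->execute($cmd):
$leak  = fopen('php://temp', 'r+'); // simulate a stream left open
$count = closeTempStreams();
echo "Closed {$count} temp stream(s)\n";
```

Passing `'stream'` to `get_resources()` filters to stream resources directly, so the `get_resource_type()` check in the original loop is no longer needed.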
@yenfryherrerafeliz thanks for the reply - I have tried the check for open files, but none are reported. I am already using a resource which I open and close myself, though, as this is one of the things I tried before reporting my issue. Thanks very much for your help on this - I am beginning to suspect that it is an issue with how the free memory is being calculated in my server control panel! I have raised an issue with my provider to look into it and will let you know. In the meantime, I can work around it by making a copy of the file, which I can delete after uploading. Thanks again.
This does sound a lot like the reported memory usage simply includes the file system cache. That would also explain why deleting the file has any effect: if PHP were somehow holding on to that file, deleting it shouldn't free any memory, and if there were memory usage not directly linked to that file, deleting the file shouldn't have any effect. For example, in the `free` output shown here (screenshot not preserved in this transcript), notice how only 1.7 GB are reported as "free" but 9.9 GB are reported as "available". That's because, out of the 12.9 GB in "buff/cache", about 8.2 GB come from cached files that aren't actively used, so the space could be freed if, and only if, needed.
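The free-versus-available distinction can be checked programmatically on Linux by reading `/proc/meminfo`. A minimal sketch, where `parseMeminfo()` and the sample figures are illustrative (loosely modeled on the numbers quoted above, not taken from the missing screenshot):

```php
<?php
// Parse /proc/meminfo-style text into an array of values in kB.
// parseMeminfo() and the sample below are illustrative only.
function parseMeminfo(string $text): array {
    $values = [];
    foreach (preg_split('/\R/', trim($text)) as $line) {
        if (preg_match('/^(\w+):\s+(\d+)\s*kB/', $line, $m)) {
            $values[$m[1]] = (int) $m[2];
        }
    }
    return $values;
}

// Sample figures; on a real host, use file_get_contents('/proc/meminfo').
$sample = <<<MEMINFO
MemTotal:       16384000 kB
MemFree:         1700000 kB
MemAvailable:    9900000 kB
Buffers:          400000 kB
Cached:         12500000 kB
MEMINFO;

$info = parseMeminfo($sample);
// "Available" exceeds "free" because reclaimable cache counts toward it.
$reclaimable = $info['MemAvailable'] - $info['MemFree'];
echo "Free: {$info['MemFree']} kB, available: {$info['MemAvailable']} kB, "
   . "reclaimable: {$reclaimable} kB\n";
```

If a hosting panel graphs `MemTotal - MemFree` rather than `MemTotal - MemAvailable`, file system cache shows up as "used" memory, which matches the symptom described in this issue.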
Describe the bug
The memory usage on my server increases every time I use PutObject, by roughly the size of the file being uploaded, and the memory is only freed if I delete the file after putting it.
Expected Behavior
I would expect the memory to be freed once the upload is complete.
Current Behavior
The server memory increases every time I call PutObject with a different file. The memory appears to be freed either by deleting the file directly from the server using `rm`, or by uncommenting the `unlink` line in the code below.

Reproduction Steps
Here is the code I am using. I was initially using the `SourceFile` parameter, but changed to sending `Body` so I could try closing the file handle myself. The `unset` lines make no difference, and neither does calling `gc_collect_cycles()`.

Possible Solution
No response
Additional Information/Context
I searched for existing issues and found #1816 but it does not have a suggested resolution. I was using version 3.183.13 of the SDK and tried upgrading to the latest version but this did not fix the issue.
I have also run the compatibility-test.php script which did not report any problems.
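The reproduction code block referenced above did not survive in this transcript. A minimal sketch consistent with the description (a `Body` resource opened with `fopen`, an `unlink` line left commented out; the bucket name, file path, and region are placeholders, not the reporter's actual values):

```php
<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;

// Placeholder configuration; the original issue's values were not preserved.
$client = new S3Client(['region' => 'eu-west-1', 'version' => 'latest']);

$path = '/tmp/large-file.bin';
$file = fopen($path, 'r');

$result = $client->putObject([
    'Bucket' => 'example-bucket',
    'Key'    => basename($path),
    'Body'   => $file,
]);

fclose($file);
unset($result, $file);   // reported to make no difference
gc_collect_cycles();     // likewise no difference

// Server memory was only observed to be released when the source
// file itself was removed:
// unlink($path);
```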
SDK version used
3.286.1
Environment details (Version of PHP (`php -v`)? OS name and version, etc.)

PHP 7.4.33, Debian 10.3, Apache 2.4.38