
S3 putObject memory leak #2824

Closed
tamslinn opened this issue Nov 15, 2023 · 7 comments
tamslinn commented Nov 15, 2023

Describe the bug

The memory usage on my server increases by roughly the size of the uploaded file every time I call PutObject, and the memory is only freed if I delete the file after putting it.

Expected Behavior

I would expect the memory to be freed once the upload is complete.

Current Behavior

The server memory increases every time I call PutObject with a different file. The memory appears to be freed either by deleting the file directly from the server using rm or by uncommenting the unlink line in the code below.

Reproduction Steps

Here is the code I am using. I was initially using the SourceFile parameter, but changed to sending Body so that I could try closing the file handle myself (a sketch of the earlier SourceFile variant follows the function below). The unset lines make no difference, and neither does calling gc_collect_cycles.

use Aws\S3\S3Client;
use Aws\Credentials\CredentialProvider;

function upload($bucket, $filepath, $key, $region) {
    $s3Client = new S3Client([
        'region' => $region,
        'version' => 'latest',
        'credentials' => CredentialProvider::env()
    ]);

    $source = fopen($filepath, 'r');

    $result = $s3Client->putObject([
        'Bucket' => $bucket,
        'Key' => $key,
        'Body' => $source
    ]);
    fclose($source);
    //unlink($filepath);   // uncommenting this frees the memory
    unset($result);
    unset($s3Client);
}
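For reference, the earlier SourceFile variant was essentially the following (a reconstruction; only the Body line changes, since putObject accepts a SourceFile path instead of a Body stream):

// Same upload, letting the SDK open the file itself via SourceFile:
$result = $s3Client->putObject([
    'Bucket' => $bucket,
    'Key' => $key,
    'SourceFile' => $filepath
]);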

Possible Solution

No response

Additional Information/Context

I searched for existing issues and found #1816, but it does not have a suggested resolution. I was using version 3.183.13 of the SDK and tried upgrading to the latest version, but this did not fix the issue.
I have also run the compatibility-test.php script which did not report any problems.

SDK version used

3.286.1

Environment details (Version of PHP (php -v)? OS name and version, etc.)

PHP 7.4.33, Debian 10.3, Apache 2.4.38

@tamslinn tamslinn added bug This issue is a bug. needs-triage This issue or PR still needs to be triaged. labels Nov 15, 2023
@yenfryherrerafeliz yenfryherrerafeliz self-assigned this Nov 15, 2023
@yenfryherrerafeliz yenfryherrerafeliz added the investigating This issue is being investigated and/or work is in progress to resolve the issue. label Nov 21, 2023
@yenfryherrerafeliz (Contributor)

Hi @tamslinn, sorry to hear about your issues. Would it be possible for you to provide some metrics showing the memory consumption before your tests start running and after they finish?

You could do something like the following:

<?php
$startingMemoryUsage = memory_get_usage();

// Do your operations

$endingMemoryUsage = memory_get_usage();

$consumed = $endingMemoryUsage - $startingMemoryUsage;

echo "Memory consumed in bytes is " . $consumed;

I look forward to your response.

Thanks!

@yenfryherrerafeliz yenfryherrerafeliz added response-requested Waiting on additional info and feedback. Will move to "closing-soon" in 7 days. and removed needs-triage This issue or PR still needs to be triaged. labels Nov 22, 2023
@tamslinn (Author)

Hi @yenfryherrerafeliz - apologies for the delayed reply, I have been off for a few days. I've just run the upload function with the memory metrics as requested. This is the output when uploading a 100 MB file to AWS:

Starting Memory Usage is 889752
Ending Memory Usage is 13646272
Memory consumed in bytes is 12756520

This is a bit odd, as I think that equates to about 12 MB? In the Digital Ocean memory usage console I can see that approx 100 MB of RAM is being consumed and not released. I've attached a small screenshot - this is running on a 512 MB instance.
9.35 - upload 100 MB file
9.42 - manually delete file from disk
9.44 - upload 100 MB file
9.52 - upload a 2nd 100 MB file.

I'm not sure if this means it is a filesystem issue rather than a PHP issue?

[Screenshot 2023-11-27 at 09:59:57: Digital Ocean memory usage graph]

@yenfryherrerafeliz yenfryherrerafeliz removed the response-requested Waiting on additional info and feedback. Will move to "closing-soon" in 7 days. label Nov 27, 2023

yenfryherrerafeliz commented Nov 28, 2023

Hi @tamslinn, I still need to do more investigation around this, but I think there could be an issue specifically when the provided body is not a resource that you open and close yourself. For example, the following code seems to leave n temp files open, where n is the number of requests performed.

<?php
require '../vendor/autoload.php';

use Aws\S3\S3Client;

$bodySize = 1024 * 1024; // arbitrary test size; originally a test-class constant

$client = new S3Client(['region' => getenv('TEST_REGION')]);
$cmd = $client->getCommand('putObject', [
    'Bucket' => getenv('TEST_BUCKET'),
    'Key' => 'test.file.txt',
    'Body' => str_repeat('#', $bodySize)
]);
$client->execute($cmd);

But if I do:

<?php
require '../vendor/autoload.php';

use Aws\S3\S3Client;

$filePath = '/path/to/test/file'; // placeholder; originally a test-class constant

$client = new S3Client(['region' => getenv('TEST_REGION')]);
$file = fopen($filePath, 'r');
$cmd = $client->getCommand('putObject', [
    'Bucket' => getenv('TEST_BUCKET'),
    'Key' => 'test.file.txt',
    'Body' => $file
]);
$client->execute($cmd);
fclose($file);

There are no floating open files left around.
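My working assumption about the mechanism (an assumption, not confirmed yet): the SDK wraps non-stream bodies through GuzzleHttp\Psr7, and a string body ends up in a php://temp stream that nothing ever closes. A minimal sketch of that wrapping (guzzlehttp/psr7 v2 API):

<?php
require '../vendor/autoload.php';

use GuzzleHttp\Psr7\Utils;

// Wrapping a string creates an underlying php://temp stream.
$stream = Utils::streamFor(str_repeat('#', 1024));
echo $stream->getMetadata('uri'), PHP_EOL; // "php://temp"
$stream->close(); // explicitly closing releases the resource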

Here is how I check which files are still open:

$filesStillOpen = [];
$resources = get_resources();
foreach ($resources as $resource) {
    if (get_resource_type($resource) !== 'stream') continue;
    
    if (ftell($resource) !== false) {
        $meta = stream_get_meta_data($resource);
        $filesStillOpen[] = $meta['uri'];
    }
}
print_r($filesStillOpen);

I will keep investigating and will fix this if needed, but in the meantime something you could do to work around it is the following:

$resources = get_resources();
foreach ($resources as $resource) {
    if (get_resource_type($resource) !== 'stream') continue;

    if (ftell($resource) !== false) {
        $meta = stream_get_meta_data($resource);
        if (strpos($meta['uri'], 'php://temp') !== false) {
            fclose($resource);
        }
    }
}
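A reusable form of that workaround might look like this (a sketch; the helper name closeLeakedTempStreams is made up):

<?php
// Close any php://temp streams left behind after a request.
function closeLeakedTempStreams(): void {
    foreach (get_resources() as $resource) {
        if (get_resource_type($resource) !== 'stream') {
            continue;
        }
        if (ftell($resource) !== false) {
            $meta = stream_get_meta_data($resource);
            if (strpos($meta['uri'], 'php://temp') !== false) {
                fclose($resource);
            }
        }
    }
}

// Usage, e.g. after each request:
// $client->execute($cmd);
// closeLeakedTempStreams();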

Thanks!

@yenfryherrerafeliz yenfryherrerafeliz added p2 This is a standard priority issue queued This issue is on the AWS team's backlog and removed investigating This issue is being investigated and/or work is in progress to resolve the issue. labels Nov 28, 2023
@tamslinn (Author)

@yenfryherrerafeliz thanks for the reply - I have tried the check for open files, but none are reported. I am already using a resource which I open and close myself, though, as this was one of the things I tried before reporting my issue.

Thanks very much for your help on this - I am beginning to suspect that it is an issue with how the free memory is being calculated in my server control panel! I have raised an issue with my provider to look into it and will let you know. In the meantime, I can work around it by making a copy of the file, which I can delete after uploading (sketched below). Thanks again.
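A sketch of that copy-then-delete workaround (the .upload suffix is arbitrary):

// Upload a disposable copy so the original never has to be deleted.
$tmp = $filepath . '.upload';
copy($filepath, $tmp);
upload($bucket, $tmp, $key, $region);
unlink($tmp); // deleting the copy releases the memory, as observed above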


dotdash commented Nov 30, 2023

This does sound a lot like the reported memory usage simply includes the file system cache.

That would also explain why deleting the file has any effect. If PHP were somehow holding on to that file, deleting it shouldn't free any memory. And if there were any memory usage not directly linked to that file, deleting the file shouldn't have any effect.

For example, free -m reports this on my machine right now:

               total        used        free      shared  buff/cache   available
Mem:           31791       21864        1741        4328       12961        9926

Notice how only 1.7 GB is reported as "free", but 9.9 GB is reported as "available". That's because, out of the 12.9 GB in "buff/cache", about 8.2 GB comes from cached files that aren't actively used, so that space can be freed if, and only if, it is needed.
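To check the same thing from PHP on the server itself, a minimal sketch (Linux only, assuming /proc/meminfo is readable):

<?php
// Compare MemFree vs MemAvailable to see how much "used" memory
// is actually reclaimable cache.
$info = [];
foreach (file('/proc/meminfo') as $line) {
    if (preg_match('/^(\w+):\s+(\d+)\s+kB/', $line, $m)) {
        $info[$m[1]] = (int) $m[2];
    }
}
printf("MemFree: %d MB, MemAvailable: %d MB, reclaimable: ~%d MB\n",
    $info['MemFree'] / 1024,
    $info['MemAvailable'] / 1024,
    ($info['MemAvailable'] - $info['MemFree']) / 1024);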

@yenfryherrerafeliz (Contributor)

Hi @dotdash, I agree with your comment. @tamslinn, I am closing this for now, but if you still need any help from us, feel free to either reopen this issue or open a new one.

Thanks!


⚠️COMMENT VISIBILITY WARNING⚠️

Comments on closed issues are hard for our team to see.
If you need more assistance, please either tag a team member or open a new issue that references this one.
If you wish to keep having a conversation with other community members under this issue feel free to do so.
