Replies: 9 comments 6 replies
-
I have no direct experience with this exact environment. Is the cached permission size "expected"? (Thinking here: do you have thousands of permissions, with intertwined relations among them, users, roles, and other models? Knowing those numbers might help gauge the scope.)
-
Note: You can set the config for
-
Thanks for the response. I don't think we have an extraordinary number of permissions/roles. Here's a short snippet I used to determine the size:

$permissions = Spatie\Permission\Models\Permission::getPermissions();
$strlen = strlen(serialize($permissions)); // -> 514280 -> ~514 kB I guess
$count = count($permissions); // -> 84

Thanks for your suggestion to switch the cache store. I had already considered that. However, the only alternative is a Redis cluster, which would add roughly $50 of monthly cost to the setup; that is rather a lot for a project of this size, and probably for many similar projects. Is there a way to deactivate caching, or any other idea how to resolve this?
-
You could set
-
I'm running into the same issue here, and disabled caching by updating my config/permissions.php and adding the relevant setting. For reference, my application has 25 roles and over 600 permissions (50+ database models, each with its own view, create, edit, and delete permissions). It sounds like a lot, but really it isn't; it's just an average web app with a slightly above-average relational schema. There must be a way to cache and persist roles and permissions to improve performance without creating a single 400 KB+ object?
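For anyone else landing here, a minimal sketch of that workaround, assuming the package's published config/permission.php layout: pointing the package's cache at Laravel's `array` store keeps the registrar code path intact but makes the cache per-request only, so nothing is ever written to DynamoDB.

```php
<?php

// config/permission.php (sketch; keys follow the package's published config)
return [
    // ... other laravel-permission settings ...

    'cache' => [
        // Lifetime of the permissions cache (irrelevant for the array store,
        // since that store never outlives the request).
        'expiration_time' => \DateInterval::createFromDateString('24 hours'),

        // Cache key used by the registrar.
        'key' => 'spatie.permission.cache',

        // 'default' would use the app's default cache store (DynamoDB on
        // Vapor). The 'array' store effectively disables persistent caching:
        // permissions are re-queried from the database on every request.
        'store' => 'array',
    ],
];
```

The trade-off is that every request re-runs the permissions query, so this trades the DynamoDB item-size error for a small amount of extra database load.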
-
I'm getting this issue as well. Would love to know if there are any solutions aside from setting the cache store to array.
-
@Kevmue don't set up a VPC to use AWS Redis because of Vapor. Sign up for an account with RedisLabs; it may be cheaper than using DynamoDB. I don't have a lot of permissions in my Vapor app, and I'm totally not happy with the DynamoDB monthly fee, which is roughly $15. I don't use the cache anywhere besides this package's permission cache.
-
NOTE: v
-
I'm facing the same issue here. I'll probably just plug in a Redis store for this.
-
When deploying one of our projects to Laravel Vapor, we encountered an issue with the laravel-permission package:
Error:
Client error: 'POST https://dynamodb.eu-central-1.amazonaws.com' resulted in a '400 Bad Request' response: {"__type":"com.amazon.coral.validate#ValidationException","message":"Item size has exceeded the maximum allowed size"}
I've tracked the issue down to the getPermissions method in PermissionRegistrar.php: the cached permissions object seems to exceed DynamoDB's 400 KB item size limit.
Is there a way to resolve this other than reducing the number of permissions or switching to Redis?