What's the current limit of EdgeDB in terms of schema size and database size?
I would like to try an instance with 20 000+ types, where some of those types feature multiple GB of data (billions of rows).
Side question: Would DDL operations (like altering a specific property) naturally slow down in proportion to the schema size/complexity or is that somewhat independent?
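For concreteness, the kind of property-level DDL meant here might look like the following EdgeQL sketch (the `Product` type and `sku` property are hypothetical names, not from any actual schema):

```edgeql
# Alter a single property on one type out of a very large schema,
# e.g. by adding an exclusive constraint to it.
alter type Product {
    alter property sku {
        create constraint exclusive;
    };
};
```

The question is whether the cost of a statement like this grows with the total number of types in the schema, or depends only on the type being altered.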
I could share some benchmarks after testing, but I'd also be happy for links to any prior work/examples.
Thank you!