Workarounds for "Type instantiation is excessively deep and possibly infinite.ts(2589)" #167
-
Hello, I've been experimenting with adopting gql.tada. Seems this is a common issue with other libraries that rely on complex type inference, such as Kysely, though those have suggested workarounds. Was curious if there's any such thing for gql.tada.

Edit: I just found microsoft/TypeScript#57878, so it looks like this is known by the team. Would still love to hear if there are any recommended workarounds!
Replies: 5 comments 1 reply
-
It's hard for me to make any general comments here without knowing what you're doing exactly. It's a little easier for me to first of all link to all the relevant changes that we made already: https://github.com/0no-co/gql.tada/milestone/1?closed=1

There is a size limit now on all documents, which is basically intentional. With a recent change we discovered that we can make a performance optimisation in type inference. However, the trade-off was that there is a size limit on any document you create. It's past a size which we would consider reasonable given fragment co-location, though. So, if you have a document that's excessively large and the error goes away when you make it shorter, that document is basically past that size limit, which is tied to how the tokenizer scans the document you write.

You shouldn't really be hitting recursion limits in other cases though. If you are, this may be an issue we're unaware of.
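To make the fragment co-location point concrete, here's a minimal sketch of splitting one oversized document into smaller co-located fragments. It assumes a graphql() function set up as described in the gql.tada docs, and the schema types (Author, Post) are made up for illustration:

```ts
import { graphql } from 'gql.tada';

// A hypothetical fragment for author fields, kept as its own small document.
const AuthorFields = graphql(`
  fragment AuthorFields on Author {
    id
    name
  }
`);

// Fragments compose: the second argument lists the fragment documents used.
const PostFields = graphql(
  `
    fragment PostFields on Post {
      id
      title
      author {
        ...AuthorFields
      }
    }
  `,
  [AuthorFields]
);

// The top-level query only spreads fragments, so no single string literal
// ever grows past the size that the tokenizer has to scan in one go.
const PostsQuery = graphql(
  `
    query Posts {
      posts {
        ...PostFields
      }
    }
  `,
  [PostFields]
);
```

If an error like ts(2589) disappears after a split like this, that's a good sign the original document was simply over the size limit described above.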
-
This is good context, thank you! I don't think the document that's hitting this issue is too large, but I have a clearer understanding of where to investigate from here. I'll post again if we ever figure it out, but it also looks like #160 may solve this when implemented.
-
To fully close out this issue: it seems upgrading my version of gql.tada from ^1.4.0 to ^1.5.2 and @0no-co/graphqlsp from ^1.7.0 to ^1.10.1 has resolved the issue. Additionally, I enabled turbo mode and am a fan already.
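For anyone else who ends up here: turbo mode is configured through the TS plugin entry in tsconfig.json, with a cache file that stores pre-computed document types so the compiler doesn't have to re-infer large documents on every edit. The option names below (tadaTurboLocation in particular) are written from memory of the docs for these versions, so treat them as assumptions and verify them against the current gql.tada documentation:

```jsonc
// tsconfig.json (excerpt) — option names are assumptions; check the gql.tada docs
{
  "compilerOptions": {
    "plugins": [
      {
        "name": "@0no-co/graphqlsp",
        "schema": "./schema.graphql",
        // Where the plugin writes the introspection-based type environment.
        "tadaOutputLocation": "./src/graphql-env.d.ts",
        // Assumed: where turbo mode keeps its cache of pre-computed document types.
        "tadaTurboLocation": "./src/graphql-cache.d.ts"
      }
    ]
  }
}
```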
-
Hi, first I have to say that I really like gql.tada; it has helped me a lot. However, I still have a problem with large TypeScript files when generating types from the GraphQL API, and I haven't found a solution yet. How can I reduce the file, since everything is composed into a single const, or otherwise bypass the error? Can anyone help me with this use case?
-
Note on this thread:
Some are expected, others may not be. It will always come down to your personal setup.

I'm locking this thread to encourage new (personalised) threads to be created, so we can discuss this in relation to each person's problems and narrow down when and how they're running into this. Please create a thread of your own if you're running into this and give some information on when (and when not) you're seeing the error.

The most common case in which you'll likely see this (until we have a dedicated warning for this in the TS plugin) is when creating a large query, one that exceeds the length limits after which inference in TS slows down. See: https://discord.com/channels/1082378892523864074/1252289704854945895

Quote of the reply from the Discord thread above:

There's an inherent limit to the size of each GraphQL document due to a limitation and trade-off in how much gql.tada can handle at a time. TypeScript can only convert so much of a string literal into types before slowing down and becoming less than usable when editing GraphQL queries. While turbo mode is available, the base restriction still exists, which will always affect editing performance. That's why the optimisation that ultimately limits the length of queries is still in place. We basically determined that in code bases that utilise fragment colocation this doesn't become problematic very often, as it's always possible to further split up dependencies per component.
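As a rough illustration of the fragment co-location pattern referenced above, here's a sketch using gql.tada's FragmentOf and readFragment helpers to keep each document small; the component and schema names (Post, PostItem) are invented for the example:

```ts
import { type FragmentOf, graphql, readFragment } from 'gql.tada';

// Each component declares only the fields it actually reads, so no single
// document has to describe the whole query tree.
const PostItemFields = graphql(`
  fragment PostItemFields on Post {
    id
    title
  }
`);

// The component accepts a masked fragment reference…
function PostItem(props: { data: FragmentOf<typeof PostItemFields> }) {
  // …and unmasks it with readFragment to access the typed fields.
  const post = readFragment(PostItemFields, props.data);
  return `${post.id}: ${post.title}`;
}

// A parent query only spreads its children's fragments.
const PostListQuery = graphql(
  `
    query PostList {
      posts {
        ...PostItemFields
      }
    }
  `,
  [PostItemFields]
);
```

Because each component only ever declares the fields it reads, no single graphql() call contains the full query, which keeps every document comfortably under the length limit described in the quote.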