Stack overflow when computing universe of Map Word Word #64
Comments
You are using the library correctly. I don't really think it's reasonable to expect the computer to be able to answer …
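For a sense of scale (a rough sketch; mapCount is an illustration, not part of the library's API): a finite map from k to v binds each of the |k| keys to either nothing or one of the |v| values, so there are (|v| + 1) ^ |k| distinct maps.

-- Counting finite maps from a key type of size k to a value type of
-- size v: each key is either absent or bound to one of the v values.
mapCount :: Integer -> Integer -> Integer
mapCount k v = (v + 1) ^ k

-- mapCount 2 2 == 9: Map Bool Bool is trivially enumerable.
-- mapCount (2^64) (2^64) == (2^64 + 1) ^ (2^64): writing this count
-- down would take roughly 2^70 bits, so even asking for the number of
-- elements of universe, let alone the elements, is hopeless.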
That's a very good point. I hadn't performed the calculation here, but you're right. My apologies!
I'm wondering: does the library have a generalized notion of the "complexity" of a value? If we assume it's possible to build an integral measure of complexity, then for some type Foo we might have:

>>> take 10 (complexity <$> universe @Foo)
[0, 1, 1, 2, 2, 2, 3, 3, 3, 3]

If we could rely on the property that elements earlier in the list are less expensive to evaluate than elements later in the list, then perhaps it would be easier to reason about how much evaluation is required to force some prefix of the list. Apologies if you've already thought of such a solution and it's impractical, or if the library already does this. Many thanks for replying, and for making this library!
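A sketch of what such a measure might look like (hypothetical: the Complexity class, its instances, and the property check below are illustrations, not part of the universe API):

-- Hypothetical: an integral "cost" for each value. The law asked for
-- above is that universe enumerates values in nondecreasing
-- complexity, so forcing a prefix of universe costs no more than the
-- complexity of the prefix's last element suggests.
class Complexity a where
  complexity :: a -> Integer

instance Complexity Bool where
  complexity _ = 1

instance Complexity a => Complexity [a] where
  complexity = sum . map ((1 +) . complexity)

-- The hoped-for property, as a predicate over a prefix of universe:
nondecreasing :: Complexity a => [a] -> Bool
nondecreasing xs = and (zipWith (<=) cs (drop 1 cs))
  where cs = map complexity xs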
Original report (Jonathan):

Hi there!
Many thanks for making this library.
I encountered the following issue while experimenting within ghci: evaluating universe of Map Word Word overflows the stack. The issue seems to be related to the cardinality of the key type, since universe of Map Int Int fails the same way. Additionally, universe of Map Integer Integer does not terminate, and the CPU spins at 100%.
Perhaps I'm using the library incorrectly?
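A minimal session matching the description might look like the following (a reconstruction, not verbatim from the issue; the Data.Universe.Class import and the Map instance's exact behaviour are assumptions):

>>> import Data.Map (Map)
>>> import Data.Universe.Class (universe)
>>> take 1 (universe :: [Map Word Word])
*** Exception: stack overflow
>>> take 1 (universe :: [Map Int Int])
*** Exception: stack overflow
>>> take 1 (universe :: [Map Integer Integer])
-- no output; CPU spins at 100% until interrupted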
Using a type with a much smaller cardinality works as I would expect:
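For example (again a sketch, assuming the Map instance enumerates all (|v| + 1) ^ |k| maps):

>>> length (universe :: [Map Bool Bool])
9
-- (2 + 1) ^ 2 maps, returned promptly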
Details of my environment:
universe-base == 1.1.3
Let me know if you need me to provide any more details; I'm happy to help.
Many thanks again!
Jonathan