Make non-OO consumption-saving solvers #1394
Conversation
Adds a non-OO solve_one_period function for the ConsPerfectForesight model. Will do for the other solvers as well. The value function in the PF model isn't right when there's an artificial borrowing constraint (in current HARK). This commit makes it less wrong, but I don't think it's *right* yet.
Still need to add CubicBool and vFuncBool options.
Codecov Report
Attention: Patch coverage is

Additional details and impacted files

@@            Coverage Diff             @@
##           master    #1394      +/-   ##
==========================================
- Coverage   71.53%   68.28%    -3.26%
==========================================
  Files          83       83
  Lines       13938    14969    +1031
==========================================
+ Hits         9971    10221     +250
- Misses       3967     4748     +781

☔ View full report in Codecov by Sentry.
The only failing check is the overall Codecov percentage, which fell because new code was added while the legacy code was not removed. The only two uncovered lines in the new code are an exception that comes up when a degenerate cFunc would be generated.
This ended up 75 lines longer on net, but it's much easier to understand.
I agree our current OO solvers are a painful mess. However, I also don't like very long single-function solvers. I don't have a solution or proposal for a better design, but wanted to note this opinion.
Non-separable utility of consumption and wealth (savings), "Wealth In the Utility Function", as an alternative to separable utility of consumption and wealth.
Functional solvers also have the big advantage of being much easier to compile using JIT tools (e.g., Numba).
This is true in theory, but we'd have to dispense with using HARK objects like LinearInterp, ValueFuncCRRA, and MargValueFuncCRRA.
I agree with this and like the way you structure your solvers, with distinct steps in separate functions, very much. Perhaps a good compromise is to encourage the use of sub-functions. That also makes the architecture more modular and allows other models to import parts of a solver. E.g.:
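Something along these lines, say (hypothetical names; a minimal sketch of the idea, not working HARK code):

```python
import numpy as np
from HARK.interpolation import LinearInterp

def egm_step(EndOfPrdvP, aNrmGrid, CRRA):
    """Invert the FOC u'(c) = EndOfPrdvP to get consumption and endogenous gridpoints."""
    cNrmGrid = EndOfPrdvP ** (-1.0 / CRRA)  # u'(c) = c^(-CRRA) under CRRA utility
    mNrmGrid = aNrmGrid + cNrmGrid          # market resources at each gridpoint
    return cNrmGrid, mNrmGrid

def make_cFunc(cNrmGrid, mNrmGrid):
    """Build a consumption function, anchored at the origin."""
    return LinearInterp(np.insert(mNrmGrid, 0, 0.0), np.insert(cNrmGrid, 0, 0.0))

def solve_one_period(EndOfPrdvP, aNrmGrid, CRRA):
    """Each conceptual step is a named, importable function."""
    cNrm, mNrm = egm_step(EndOfPrdvP, aNrmGrid, CRRA)
    return make_cFunc(cNrm, mNrm)
```

Another model's solver could then import something like egm_step directly instead of re-deriving that step.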
@Mv77 yes, I think that's a good idea. Some sort of segmentation of long functions, or hybrid OO + small functions not built into the object, might be a good solution. If we do a hybrid approach with OO + small functions not built into the object, you don't have the messy inheritance of previous OO solvers, small functions can be jitted, and the OO layer can handle the HARK objects and interpolators.
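Roughly like this (a hypothetical sketch with made-up names, not code from this PR):

```python
import numpy as np
from numba import njit

# The numerical kernel is a plain function that numba can compile...
@njit
def expected_marg_value(aNrmGrid, shock_probs, shock_vals, Rfree, CRRA):
    """Expected marginal value of end-of-period assets over a discretized shock."""
    out = np.zeros_like(aNrmGrid)
    for i in range(shock_probs.size):
        mNext = Rfree * aNrmGrid + shock_vals[i]
        out += shock_probs[i] * mNext ** (-CRRA)  # u'(m) as a terminal-period stand-in
    return out

# ...while a thin object-oriented layer handles HARK objects and interpolators.
class HybridSolver:
    def __init__(self, Rfree, CRRA):
        self.Rfree = Rfree
        self.CRRA = CRRA

    def solve(self, aNrmGrid, shock_probs, shock_vals):
        vP = expected_marg_value(aNrmGrid, shock_probs, shock_vals, self.Rfree, self.CRRA)
        return vP ** (-1.0 / self.CRRA)  # consumption from the inverted FOC
```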
We might want to also have a rule that in HARK, we don't take those small functions *from other model files*. I.e. we can have multiple types/solvers in the same file, and they can share some of their code via the smaller functions, but we aren't reaching across model files to get something. If something is so useful that it's used by multiple model files (utility functions, income distribution generators), then it goes in a *different* place.

Also, the current failure on this (partial) PR is a tiny discrepancy in the KinkedR cFunc, which I'm working on.
Had an off-by-one error in the "coefficient hacking" line, creating a discrepancy in the fourth decimal place. Now fixed.
If you are not doing these in any particular order, could I request you do portfolio choice next?
I had a vaguely intended order, but I can do that next, yes. I think the basic bequest model might get moved into ConsIndShock's file a little later.
Matches behavior exactly for the default dictionary. Need to add vFunc functionality, then expand to the discretized version. This commit has the new solver commented out in the type's init method so as not to create a ton of new failing tests while it's still a WIP.
Have not tested fixed share functionality, but I expect it works.
@alanlujan91 These commits have a single-function solve_one_period_ConsPortfolio that seems to match the behavior of the continuous choice model with independent distributions. I'll add the discrete choice and general distribution stuff on Monday, and probably also restructure the function a little more. It looks like there are some redundant evaluations going on. Note that the new solver isn't actually used in these commits; that line is commented out.
I missed the beginning part of Wednesday's meeting. Did you ask Mridul about these black discrepancies? The current one here is the order of import statements!
Sounds good, I don't think anyone uses the discrete and/or generalized ones, although this might be a good project for @sidd3888 in the summer.
I did not get to ask him this... @MridulS?
Testing and debugging dynamic solution methods is a little bit of an art. Even if we break down long solver functions into smaller steps (and we should, but not too small!), I don't think it's any easier to test or debug. The inputs for the smaller steps often include the outputs of prior steps, which can be complex objects that can't easily be handcrafted in advance for testing-- independent unit testing is very hard. When I'm writing a solver for a big/difficult model, I do a lot of passing partial solutions. That is, even if there's more code below, I just throw in an early return of whatever has been computed so far and inspect it.
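Schematically, something like this (a made-up illustration, not code from this PR):

```python
def solve_one_period(EndOfPrdvP, aNrmGrid, CRRA):
    cNrm = EndOfPrdvP ** (-1.0 / CRRA)  # step 1: invert the FOC
    mNrm = aNrmGrid + cNrm              # step 2: endogenous gridpoints

    # Debugging: bail out here and inspect the partial solution by hand.
    return {"cNrm": cNrm, "mNrm": mNrm}

    # Later steps (interpolation, vFunc construction) never run while debugging.
```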
Ignore the pre-commit bit here. I will make the pre-commit config a bit easier in a new PR.
These updates include previously merged branches that didn't get entries. Whoops.
Had to move one very small agent type into its own file. It looks like it's a PortfolioConsumerType with a slightly different solver.
I said that this was like untangling rope, but it's more like brushing a badly groomed dog.
Redirect to another file.
This is now handled by the same solver: just set a boolean flag and let it run.
There used to be a special solver in ConsRiskyAssetModel.py that did the "basic" portfolio model, without discrete choice or adjustment frictions. A simple solver of that kind has now been added.
There is now a "simple solver" for a model in which the risky share is *fixed* (but might vary by age), but is not necessarily 0 or 1. This previously existed, was temporarily exiled to a new file, and now has been restored.
Two very small compatibility changes
This branch might actually be approaching completion. I made a couple more (very easy) solvers following a conversation with @alanlujan91 this morning. Not right now, but we should have a focused discussion on how many different solvers/models we actually want in HARK. I.e. if model A is just model B with a particular restriction, under what circumstances do we want model A at all?
This looks good to me.
I think this is much needed work. The remaining question is now about how we remove redundancies in a not-OO way, which I think involves creating smaller functions that do specific, testable things. I have started some of this in #1395
I would propose we go ahead and merge this, and take care of "functional" solvers in a different PR.
try:
    MPCminNow = 1.0 / (1.0 + PatFac / solution_next.MPCmin)
except:
    MPCminNow = 0.0
Why do we need this try/except? Does a test fail if we take it off?
IIRC, this is a failsafe for parameterizations where MPCmin would go negative, or where MPCmin_next is already zero. That makes PatFac / MPCmin_next break, so we need an exception here to catch those possibilities and leave something valid-ish.
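If we ever want to make the failure cases explicit, something like this could replace the bare except (a sketch using the surrounding solver's variables, assuming those are the only two failure modes):

```python
MPCmin_next = solution_next.MPCmin
if MPCmin_next > 0.0 and (1.0 + PatFac / MPCmin_next) > 0.0:
    MPCminNow = 1.0 / (1.0 + PatFac / MPCmin_next)
else:
    MPCminNow = 0.0  # MPCmin_next hit zero or the denominator went non-positive
```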
), "Interest factor on debt less than interest factor on savings!" | ||
# If the kink is in the wrong direction, code should break here. If there's | ||
# no kink at all, then just use the ConsIndShockModel solver. | ||
if Rboro == Rsave: |
I think this case should also break and alert the user that they should be using ConsIndShock instead, or to check their parameter inputs.
We can maybe add a warning here.
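Something like this, say (a sketch, not committed code):

```python
import warnings

if Rboro == Rsave:
    # No kink at all: the model reduces to plain ConsIndShock.
    warnings.warn(
        "Rboro == Rsave, so there is no kink; consider using IndShockConsumerType "
        "or check your parameter inputs."
    )
```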
# WHAT IS THIS STUFF FOR??
aGrid=save_points["a"],
Share_adj=save_points["share_adj"],
EndOfPrddvda_adj=save_points["eop_dvda_adj"],
ShareGrid=save_points["share_grid"],
EndOfPrddvda_fxd=save_points["eop_dvda_fxd"],
EndOfPrddvds_fxd=save_points["eop_dvds_fxd"],
I think at some point we were saving some midway calculations for diagnosing issues?
We should probably remove these, unless they are somehow important downstream for @Mv77
try:
    MPCminNow = 1.0 / (1.0 + PatFac / solution_next.MPCmin)
except:
    MPCminNow = 0.0
I removed the try/except locally, and I think all tests still passed.
Yes, getting to that except statement requires oddball parameter sets.
vNvrsFuncNow = CubicInterp(
    mNrm_temp, vNvrs_temp, vNvrsP_temp, MPCminNvrs * hNrmNow, MPCminNvrs
)
vFuncNow = ValueFuncCRRA(vNvrsFuncNow, CRRA)
vFunc is only ever cubic?
The pseudo-inverse value function should use cubic spline interpolation, yes. We already have marginal value as u'(c), and it's one arithmetic operation to turn that into marginal pseudo-inverse value.
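Concretely, under CRRA utility (my notation, just to spell out the arithmetic):

```latex
u(c) = \frac{c^{1-\rho}}{1-\rho}, \qquad
v_{\mathrm{nvrs}}(m) \equiv u^{-1}\!\big(v(m)\big)
                     = \big[(1-\rho)\,v(m)\big]^{\frac{1}{1-\rho}}, \qquad
v_{\mathrm{nvrs}}'(m) = \frac{v'(m)}{u'\!\big(v_{\mathrm{nvrs}}(m)\big)}
                      = u'\big(c(m)\big)\,\big[v_{\mathrm{nvrs}}(m)\big]^{\rho}
```

where the last equality uses the envelope condition v'(m) = u'(c(m)) and u'(x) = x^(-rho), so going from marginal value to marginal pseudo-inverse value is one multiplication.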
but this won't support vFuncs with kinks, correct?
Right.

I think we should never use the Cubic as a default. Splines should be the default. If the person knows that they have a problem appropriate for the cubic, then it should be an option.
This model will never have a kinked vFunc. Kinked vFunc only happens in a consumption-saving model if cFunc is discontinuous.

When we have HARK to the point where we think the future cFunc might be discontinuous, and this period's might be as well (because next period isn't necessarily basic ConsIndShock), then we would not necessarily use this representation. But a *whole lot* of other stuff would need to be in the solver as well if we expected that situation to come up.
Just a quick comment... The term to use for the 'LegacyOOsolvers' is "deprecated". They are deprecated. Typically this is signaled in the API documentation. After a few releases of a feature being deprecated, it is removed from the library. (Downstream users that need the old functionality can always use a previous version of the software.) Since you've moved the OO solvers to a new location, you haven't exactly preserved the old API access to them. I guess that makes their functionality especially deprecated, since it will take extra work to adapt downstream code to use them. That will make it especially easy to remove the OO solvers down the line.
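For reference, the usual Python-side signal looks something like this (a hypothetical shim, not code in this PR):

```python
import warnings

def legacy_oo_solver(*args, **kwargs):  # hypothetical name for a deprecated entry point
    warnings.warn(
        "This OO solver is deprecated and will be removed in a future release; "
        "use the new solve_one_period functions instead.",
        DeprecationWarning,
        stacklevel=2,
    )
    ...  # delegate to the old implementation here
```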
I think the only thing downstream users need to do is change one import line at the top of their code. I moved them to a separate file to reduce clutter in the model files, and because I didn't want to have test coverage for them-- I meant to see if there was a way to indicate that that file doesn't get counted for coverage. I found some small-ish problems with the old solvers, and corrected some of them with the new solvers; I don't want to write tests for code we're deprecating, only to have them hit false targets.
Makes sense. Does the new file show up in the rendered API docs?
I... don't know where to look for that.
I'll file an issue...
Very early on in HARK's lifetime, we decided to make solve_one_period functions be object-oriented, with an inheritance structure among "parent" and "child" models. The (well-meaning) intent was to prevent solver code from being redundantly re-written for "small" model changes, and to ensure that improvements in a parent model propagated to child models. The end result, however, was that the solver code was a confusing maze: a user trying to work their way through the solver for (say) ConsMarkovModel would be bounced around that file from method to method, occasionally needing to go up to one or more predecessor classes to find what they were looking for. Moreover, this approach required solver code in "parent" models to be written in an unusual and cryptic way to account for potential downstream alterations... not all of which could be anticipated-- not by a long shot.
This PR goes through and adds straightforward, non-OO solve_one_period functions for our existing models that use the "spaghetti maze" style. In my opinion, the resulting code is much, much easier to read, follow, and understand, and lends itself better to (external) documentation explaining the method step by step. The code is also much shorter: the solvers for ConsPerfForesight and ConsIndShock are 391 lines (inclusive of docstrings, comments, and whitespace) in this PR vs 887 in the legacy version.
This PR is in progress, as I've only done those two solvers right now. I'm making this PR just to verify that tests are passing, which they should (other than whatever black version discrepancies crop up). I will add a "solver checklist" to this post shortly.
EDIT: The OO-solver idea was really the "big thing" we were going for in HARK early on. The Econ-ARK logo is literally a stylized version of the KinkyPrefConsumerType's consumption function, because the solver class for that is nothing more than a double inheritance from KinkedR and PrefShock.
SOLVER CHECKLIST
- WealthPortfolioConsumerType (what is this??)

PR CHECKLIST