econ-ark / HARK

Heterogeneous Agents Resources & toolKit
Apache License 2.0

Make non-OO consumption-saving solvers #1394

Closed mnwhite closed 1 month ago

mnwhite commented 2 months ago

Very early on in HARK's lifetime, we decided to make solve_one_period functions be object-oriented, with an inheritance structure among "parent" and "child" models. The (well-meaning) intent was to prevent solver code from being redundantly re-written for "small" model changes, and to ensure that improvements in a parent model propagated to child models. The end result, however, was that the solver code was a confusing maze: a user trying to work their way through the solver for (say) ConsMarkovModel would be bounced around that file from method to method, occasionally needing to go up to one or more predecessor classes to find what they were looking for. Moreover, this approach required solver code in "parent" models to be written in an unusual and cryptic way to account for potential downstream alterations... not all of which could be anticipated-- not by a long shot.

This PR goes through and adds straightforward, non-OO solve_one_period functions for our existing models that use the "spaghetti maze" style. In my opinion, the resulting code is much, much easier to read, follow, and understand, and lends itself better to (external) documentation explaining the method step by step. The code is also much shorter: the solvers for ConsPerfForesight and ConsIndShock are 391 lines (inclusive of docstrings, comments, and whitespace) in this PR vs 887 in the legacy version.

This PR is in progress, as I've only done those two solvers right now. I'm making this PR just to verify that tests are passing, which they should (other than whatever black version discrepancies crop up). I will add a "solver checklist" to this post shortly.

EDIT: The OO-solver idea was really the "big thing" we were going for in HARK early on. The Econ-ARK logo is literally a stylized version of the KinkyPrefConsumerType's consumption function, because the solver class for that is nothing more than a double inheritance from KinkedR and PrefShock.

SOLVER CHECKLIST

PR CHECKLIST

codecov[bot] commented 2 months ago

Codecov Report

Attention: Patch coverage is 63.19773%, with 389 lines in your changes missing coverage. Please review.

Project coverage is 68.28%. Comparing base (bbb07a5) to head (ccbdd76).

| Files | Patch % | Lines |
| --- | --- | --- |
| HARK/ConsumptionSaving/ConsBequestModel.py | 0.00% | 247 Missing :warning: |
| HARK/ConsumptionSaving/ConsPrefShockModel.py | 66.98% | 70 Missing :warning: |
| HARK/ConsumptionSaving/ConsMarkovModel.py | 80.70% | 33 Missing :warning: |
| HARK/ConsumptionSaving/ConsIndShockModel.py | 87.50% | 31 Missing :warning: |
| HARK/ConsumptionSaving/ConsPortfolioModel.py | 95.53% | 8 Missing :warning: |
Additional details and impacted files

```diff
@@            Coverage Diff             @@
##           master    #1394      +/-   ##
==========================================
- Coverage   71.53%   68.28%   -3.26%
==========================================
  Files          83       83
  Lines       13938    14969    +1031
==========================================
+ Hits         9971    10221     +250
- Misses       3967     4748     +781
```


mnwhite commented 2 months ago

The only failing test is that the overall codecov percentage fell, because new code was added but the legacy code was not removed. The only two uncovered lines in the new code are in an exception that comes up when a degenerate cFunc would be generated.

alanlujan91 commented 2 months ago

I agree our current OO solvers are a painful mess. However, I also don't like long Python functions that make it impossible to determine which part of your solution method might be wrong or have a bug. You cannot test incrementally: the solver either fully works or it doesn't.

I don't have a solution or proposal for a better design, but wanted to note this opinion.

alanlujan91 commented 2 months ago
  • [ ] WealthPortfolioConsumerType (what is this??)

Non-separable utility of consumption and wealth (savings), i.e. "Wealth in the Utility Function", as an alternative to separable utility of consumption and wealth.
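For example (purely illustrative, not necessarily the exact specification used by WealthPortfolioConsumerType), a non-separable form looks something like

$$u(c, a) \;=\; \frac{\left(c^{1-\delta}\, a^{\delta}\right)^{1-\rho}}{1-\rho},$$

where $a$ is end-of-period wealth (savings) and $\delta$ is the weight on wealth, as opposed to a separable form like $u(c) + \kappa\, \phi(a)$.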

Mv77 commented 2 months ago

Functional solvers also have the big advantage of being much easier to compile using jit or even JAX.

alanlujan91 commented 2 months ago

Functional solvers also have the big advantage of being much easier to compile using jit or even JAX.

This is true in theory, but we'd have to dispense with using HARK objects like LinearInterp, ValueFuncCRRA, and MargValueFuncCRRA.

Mv77 commented 2 months ago

About

I agree our current OO solvers are a painful mess. However, I also don't like long Python functions that make it impossible to determine which part of your solution method might be wrong or have a bug. You cannot test incrementally: the solver either fully works or it doesn't.

I don't have a solution or proposal for a better design, but wanted to note this opinion.

I agree with this, and I very much like the way you structure your solvers, with distinct steps in separate functions. Perhaps a good compromise is to encourage the use of sub-functions. That also makes the architecture more modular and allows other models to import parts of a solver. E.g.

```python
def solve_ind_shock(solution_next, a_grid, crra, disc_fac, ...):

    dvda, cont_v = compute_expectations(solution_next, ...)
    c, m = egm_inversion(dvda)
    ...
```

alanlujan91 commented 2 months ago

@Mv77 yes, I think that's a good idea. Some sort of segmentation of long functions, or a hybrid of OO + small functions not built into the object, might be a good solution.

If we do a hybrid approach with OO + small functions not built into the object, you don't have the messy inheritance of previous OO solvers, the small functions can be jitted, and the OO layer can handle the HARK objects and interpolators.
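Roughly, the hybrid could look something like this (a sketch with made-up names, not a proposal for HARK's actual API): small pure numerical functions that take and return plain arrays, so they could be jitted, plus a thin outer function that would assemble the HARK interpolator objects.

```python
import numpy as np

# All names below are made up for illustration; this is not HARK's actual API.
# The numerical steps are plain-array functions that could, in principle, be jitted.

def end_of_period_dvda(a_grid, shock_vals, shock_probs, c_func_next,
                       crra, disc_fac, rfree, perm_gro):
    """Discounted expected marginal value of end-of-period assets (transitory shocks only)."""
    dvda = np.zeros_like(a_grid)
    for shk, prob in zip(shock_vals, shock_probs):
        m_next = (rfree / perm_gro) * a_grid + shk   # next period's market resources
        c_next = c_func_next(m_next)                 # next period's consumption
        dvda += prob * disc_fac * rfree * perm_gro ** (-crra) * c_next ** (-crra)
    return dvda


def egm_inversion(dvda, a_grid, crra):
    """Invert the first-order condition to get the endogenous consumption/resources grid."""
    c_grid = dvda ** (-1.0 / crra)
    m_grid = a_grid + c_grid
    return c_grid, m_grid


def solve_one_period_sketch(solution_next, a_grid, shock_vals, shock_probs,
                            crra, disc_fac, rfree, perm_gro):
    """Thin, non-jitted layer: calls the pure steps, then would wrap the arrays
    in HARK objects (LinearInterp, MargValueFuncCRRA, ...) to build the solution."""
    dvda = end_of_period_dvda(a_grid, shock_vals, shock_probs,
                              solution_next.cFunc, crra, disc_fac, rfree, perm_gro)
    c_grid, m_grid = egm_inversion(dvda, a_grid, crra)
    return m_grid, c_grid
```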

mnwhite commented 2 months ago

We might want to also have a rule that in HARK, we don't take those small functions from other model files. I.e. we can have multiple types/solvers in the same file, and they can share some of their code via the smaller functions, but we aren't reaching across model files to get something. If something is so useful that it's used by multiple model files (utility functions, income distribution generators), then it goes in a different place.

Also, the current failure on this (partial) PR is a tiny discrepancy in the KinkedR cFunc, which I'm working on.


alanlujan91 commented 2 months ago

If you are not doing these in any particular order, could I request you do portfolio choice next?

mnwhite commented 2 months ago

I had a vaguely intended order, but I can do that next, yes.

I think the basic bequest model might get moved into ConsIndShock's file a little later.


mnwhite commented 2 months ago

@alanlujan91 These commits have a single-function solve_one_period_ConsPortfolio that seems to match the behavior of the continuous choice model with independent distributions. I'll add the discrete choice and general distribution stuff on Monday, and probably also restructure the function a little more. It looks like there's some redundant evaluations going on. Note that the new solver isn't actually used in these commits; that line is commented out.

mnwhite commented 2 months ago

I missed the beginning part of Wednesday's meeting. Did you ask Mridul about these black discrepancies? The current one here is the order of import statements!

alanlujan91 commented 2 months ago

@alanlujan91 These commits have a single-function solve_one_period_ConsPortfolio that seems to match the behavior of the continuous choice model with independent distributions. I'll add the discrete choice and general distribution stuff on Monday, and probably also restructure the function a little more. It looks like there's some redundant evaluations going on. Note that the new solver isn't actually used in these commits; that line is commented out.

Sounds good, I don't think anyone uses the discrete and/or generalized ones, although this might be a good project for @sidd3888 in the summer.

alanlujan91 commented 2 months ago

I missed the beginning part of Wednesday's meeting. Did you ask Mridul about these black discrepancies? The current one here is the order of import statements!

I did not get to ask him this... @MridulS?

mnwhite commented 2 months ago

Testing and debugging dynamic solution methods is a little bit of an art. Even if we break down long solver functions into smaller steps (and we should, but not too small!), I don't think it's any easier to test or debug. The inputs for the smaller steps often include the outputs of prior steps, which can be complex objects that can't easily be handcrafted in advance for testing-- independent unit testing is very hard.

When I'm writing a solver for a big / difficult model, I do a lot of passing partial solutions. That is, even if there's more code below, I just throw in a return thisthing statement and solve just one non-terminal period, then examine its partial progress. If the complication I'm investigating comes up further back (e.g. only after solving 20 or 50 periods), then I'll add a boolean input to the solver that returns partial output iff that flag is set (and then pass a list of Falses with one True at the right time period).
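A bare-bones sketch of that pattern (hypothetical names, not the actual HARK solver signature):

```python
def compute_end_of_period_dvda(solution_next, params):
    # Stand-in for the early steps of the period's solution method.
    return {"EndOfPrd_dvda": [0.0]}


def finish_period_solution(partial, params):
    # Stand-in for the remaining steps that assemble the full period solution.
    return {"cFunc": None, **partial}


def solve_one_period(solution_next, params, return_partial=False):
    partial = compute_end_of_period_dvda(solution_next, params)
    if return_partial:
        # Bail out early so the intermediate objects can be inspected directly.
        return partial
    return finish_period_solution(partial, params)


# If the problem only shows up deep into backward induction, pass a list of flags
# with a single True at the period whose partial output you want to inspect:
return_partial_flags = [False] * 50
return_partial_flags[37] = True
```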

MridulS commented 2 months ago

I did not get to ask him this... @MridulS?

Ignore the pre-commit bit here. I will make the pre-commit config a bit easier in a new PR.

llorracc commented 2 months ago

@mnwhite,

One of my flaws as a coder is that I tend not to try to figure out how other people have done the things I want to do, and just code up whatever seems reasonable to me. When you coded up the original version of HARK, the OO approach to solving things seemed reasonable. But, even then, it would have behooved us to look at what Pablo was doing with Dolo, and for that matter what the Dynare folks had done, before plunging ahead.

As you rework this, I want to do our best to be sure we're doing it right, by understanding your (new) approach compared to what other people are doing, and in light of what we will need for the DYNARK (=HARK 2.0) backend.

Here's a tiny example. In thinking through my HARK 2.0 pre-alpha, I tried to tackle the "whiteboard" problem, by constructing distinct namespaces for the variables encompassed in the "end-of-period" perch, the "choice" perch, and the "expectorations" perch; and, I think, a separate namespace for elements that were not specific to a perch. There was a global namespace in which all of the elements from all of these namespaces were available (there's an easy syntax for that), but it meant that when you looked for the info you were interested in it was not buried in a welter of other things.

Though I found it very helpful in penetrating the mass of information available when trying to understand the existing code, maybe that idea is too prescriptive or otherwise has flaws.

But I feel that there are dozens of questions like this which deserve discussions before you plunge too far ahead.

For next week's HARK Wednesday, could you try to prepare a systematic discussion of the underlying principles you are trying to use to guide what you are doing?

I confess that I'm somewhat unnerved by your response to my query a few days ago about whether you were encompassing the ideas we had agreed on a couple of years ago about the "perches," because it is hard for me to see how you could do an appropriate restructuring of the code in which the solution to those problems is not already encompassed. We had a conversation about this, and perhaps the upshot was that you said that the new framework would be able to handle either the (bad, current) or the (good, future) architecture.

So, before you proceed further, I'd like you to do a deep dive into how others handle all of this. The answer may well be that the only "others" are Dolo and DYNARE, and because they assume infinite horizon problems from the outset they do not address the crucial questions. But you are doing foundational things, and I think a thorough discussion is in order before you plunge too far ahead.

mnwhite commented 2 months ago

@llorracc To be more clear about what I said (or tried to say) at the end of the meeting on Wednesday: I just meant that the "perches" thing isn't being handled in this step. What I'm doing right now is a basic first step toward everything else we've talked about. I don't think I'm capable of doing it all at once-- or if I did, I'd prefer to just start over from zero, rather than trying to stay within HARK.

Here's an analogy that I will surely break and come to regret: our solvers are the rigging of a ship. When we made HARK's solver structure, we didn't have a plan for its scope or the range of models it would encompass. We had this big idea that the world of consumption-saving models fit into some kind of family tree of model inheritance, but we didn't draw that tree out before beginning. On the ship, we added a new sail, or a new outrigger section, and needed to add more ropes to connect them together into one big boat. And then we did that repeatedly, without really thinking about how the mega-boat should be designed... so now the rigging is a massive, unwieldy knot that no one understands.

All I'm doing right now is untangling all of the rope. I'm laying all of the lines out straight, so we know that they each do one and only one thing, without hidden and unexpected connections. Then we can implement better choices about re-rigging the ship: compartmentalizing repeated steps as smaller functions, breaking periods up into sequential chunks/blocks, adjusting the timing among those blocks, etc.

When I said that when we're done, users can do it "the old way" or "the new way", I meant that with respect to timing of periods-- people can put their "time marker" where they want, even if we disagree with them.

If I sounded cagey in my comments about "perches", it's because I was nervous about which version of that you were talking about / how attached you are to the version you originally wrote out. What I'm trying to do encompasses what you proposed, but with additional possibilities. My worry was that you meant literally the "three perches" concept, which I think is overly restrictive and doesn't apply to all models.

I can carefully look at dolo / dolark source code, but that shouldn't be a blocker on the basic, preliminary steps for getting our house in order.

mnwhite commented 2 months ago

I'm going to try merging master into this shortly, but also run black again-- I keep forgetting to do it.

The latest commit puts discrete risky share choice into the single solver function, but "dependent distributions" still uses the old solver.

llorracc commented 2 months ago

The latest commit puts discrete risky share choice into the single solver function, but "dependent distributions" still uses the old solver.

Not sure what is meant by 'discrete' risky share choice. I believe that some of the uses to which the existing ConsPortfolio stuff is put require a policy function that is continuous. Like the SolvingMicroDSOPs lecture notes, showing that the resulting choice asymptotes to Merton-Samuelson as wealth goes to infinity. (Actually, that gives me a good idea: whatever the other gridpoints are for portfolio share, we should include the Merton-Samuelson prediction so that we will get a precise marginal value function for that particularly interesting choice.)

mnwhite commented 2 months ago

At some point, you wanted HARK to have the capability to only offer the consumer a discrete choice among portfolio shares, so I put it in several years ago. The version currently in HARK might be my code, or someone else's edit of my code. This commit just adds the code that handles the discrete choice option into the single solver, rather than being in a separate solver class. No new functionality is being added here, and this code is only used if the user sets DiscreteShareBool=True on their agents.

Even when DiscreteShareBool is True, the ShareFunc is still a continuous function: it takes in an mNrm and spits out the optimal (allowed) portfolio share at that mNrm. But because the allowed Share levels are a finite set, the slope of this function is zero almost everywhere; in a region of mNrm that is 0.000000000001 wide at each step-point, the slope will be very large (negatively). The applications that want a properly smooth ShareFunc just wouldn't use this option.
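To illustrate the shape with toy numbers (not HARK's actual construction), here is a piecewise-linear share function that is flat almost everywhere, with each transition between allowed share levels squeezed into an interval 1e-12 wide:

```python
import numpy as np

# Allowed discrete share levels and the mNrm values where the optimum switches
# (all numbers here are made up for illustration).
share_levels = np.array([1.0, 0.75, 0.5, 0.25])   # risky share falls as wealth rises
switch_points = np.array([2.0, 5.0, 12.0])        # illustrative mNrm switch points
eps = 1e-12                                       # width of each "vertical" segment

# Build interpolation nodes: flat segments joined by nearly vertical drops.
m_nodes, share_nodes = [0.0], [share_levels[0]]
for m_switch, next_share in zip(switch_points, share_levels[1:]):
    m_nodes += [m_switch, m_switch + eps]
    share_nodes += [share_nodes[-1], next_share]
m_nodes.append(100.0)
share_nodes.append(share_nodes[-1])

def share_func(mNrm):
    # Continuous (piecewise linear), with slope zero almost everywhere and an
    # extremely large negative slope inside each eps-wide transition.
    return np.interp(mNrm, m_nodes, share_nodes)

print(share_func(1.0), share_func(3.0), share_func(20.0))  # 1.0, 0.75, 0.25
```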

The Merton-Samuelson share is already used as the limiting lower bound of the risky share as mNrm goes to infinity. Or more precisely, something called ShareLimit is used for this purpose, and on first impression it looks like the Merton-Samuelson value, but I haven't re-checked that code specifically.
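For reference, the textbook Merton-Samuelson share that ShareLimit should be approximating (whether the code computes exactly this is the part I haven't re-checked) is roughly

$$\varsigma^{*} \;\approx\; \frac{\mathbb{E}[R_{\text{risky}}] - R_{\text{free}}}{\rho \,\sigma^{2}_{\text{risky}}},$$

i.e. the expected excess return on the risky asset, divided by relative risk aversion times the variance of the risky return.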


llorracc commented 2 months ago

OK, sounds like everything is good. Thanks.

PS. The share is actually a function of a, which itself is a function of m. One of many reasons to move to the new timing.


mnwhite commented 2 months ago

I think the current pre-commit failure was introduced by black.

mnwhite commented 2 months ago

This branch now has "simple solvers" for ConsPrefShockModel and ConsKinkyPrefModel. As noted in the commit comments, the code for these solvers makes clear that we really do need to have sub-functions within solvers, as the code is quite repetitive across the various functions. That better structure will happen, and it will be easier to both plan and execute once everything is "flat".

I'm going to tackle ConsMarkovModel and ConsGenIncProcessModel tomorrow (starting a bit today), then will get to the bequest model and risky return model after that.

mnwhite commented 2 months ago

The ConsMarkovModel now has a "simple solver" that replicates the behavior of the current OO solver. I fixed one fatal error in the vFunc code for the old solver, and then found another issue with it: the limiting linear function for the pseudo-inverse value function is incorrect. That functionality is commented out in my new function, and it should probably be fixed in the old version as well (just comment out half a line of code); I'll do that on the next commit.

The code and structure for this solver really spotlights how the "time change" will greatly benefit HARK. The code will be a lot simpler and understandable when period t stuff is actually handled by the period t solver.

This one took longer than expected, but I should be able to knock out GenIncProcess and WarmGlowBequest tomorrow.

mnwhite commented 2 months ago

Simple solvers for both "warm glow bequest" models have now been added. For the ConsIndShock one, I stripped out the CubicBool and vFuncBool capability, to match the existing version, but I now see that this was unnecessary and the code can be cleaned up very slightly to account for it. I left the vFuncBool and DiscreteShareBool stuff in for the Portfolio model.

@alanlujan91 In doing this, I found two critical typos / mistakes in your WarmGlowPortfolio code:

1. The "effective bequest factor" wasn't being applied, so LivPrb isn't actually used for the bequest motive.
2. A capitalization error in one variable name meant that the bequest motive wasn't used at all.

I think fixing this will make tests fail because the targets won't be hit. We might need to fix it separately in a different branch, which also includes a change to the test, then merge it in here, so as not to muddy this PR's purpose.

alanlujan91 commented 2 months ago

ugh, we need a better system than object names like self.EndOfPrddvda

@mnwhite we don't have tests for this type, since this was hastily added for my dissertation work

mnwhite commented 2 months ago

Functional solvers will help with that I think.

I forgot to mention it in my prior post, but the WarmGlow models also need to adjust the cFunc extrapolation / limiting linear function. I think.


mnwhite commented 2 months ago

I just pushed a simple / flat version of the GenIncProcess solver, which reproduces the test model solution exactly. Oddly, it runs 5-6x faster than the existing solver, depending on whether vFuncBool and CubicBool are set. I didn't try to make it faster, but there must be some unseen issue / hidden loop in the old code.

As I mentioned to Alan this morning, I'm going to pass over WealthPortfolioConsumerType for now because it's not something I've worked with before and there isn't a lot of documentation in the file to guide me. And I think Mateo said he liked its structure anyway.

The two remaining files that I want to tackle here are ConsMedShockModel and RiskyAssetModel. The latter will be easy, as it's a very slight extension of ConsIndShock. The former is a bit of a mess: I wrote it a long time ago, and then all the numeric improvements have been in my own personal work, never moved to HARK. So the changes to it might be both cosmetic (flattening) and actual methodological / accuracy / stability changes.

mnwhite commented 2 months ago

It looks like the value function code didn't work at all in the existing solver. I fixed the issue just now (pending a commit). There's a small vFunc discrepancy between my code and the old code.

mnwhite commented 2 months ago

@alanlujan91 What is ConsPortfolioIndShkRiskyAssetSolver in ConsRiskyAssetModel? There's no description of the model, but it looks like it might just be ConsPortfolio?

mnwhite commented 2 months ago

Note to self: It looks like the hNrm calculation has an issue in the RiskyAsset models, but MPCmin seems correct. Need to double check that math and compare it to the calculation in ConsMarkovModel, which allows Rfree to vary.

mnwhite commented 2 months ago

I just pushed the simple solver for the MedShk model, which has a moderate speedup and fixes the cubic interpolation that is broken in the OO solver (the fix is in the new code only; the OO version is still broken). This one wasn't nearly as big of a mess as I had feared, but I do want to revise the model/code at some point. Not here, though.

I'm going to double back to do the non-independent distributions for PortfolioConsumerType, and add vFunc and cubic capability to WarmGlowBequest.

In another branch, I'm going to add some tests that are missing or inadequate. That branch should be merged in before this one, so that this branch can a) be subject to more tests and b) improve its testing coverage numbers to satisfy codecov.

mnwhite commented 2 months ago

@alanlujan91 @sbenthall Once everything here is straightened out, how do you want to handle its merge into the master branch? Here's my Option A pitch:

Because these flat/simple solvers reproduce existing behavior exactly (or down to the 14th digit), anyone using HARK models "off the shelf" won't notice any difference, other than a performance improvement (and a few capabilities that were broken before and now work). Anyone modifying HARK solvers just needs to change one line of their code after they update to the next release.

Option B is to just not merge this at all, and instead use it as the starting point for a new working branch with more elegant solver structure (see what Alan is doing in his branch off of this one). My preference is option A, because this has actual improvements in it, but I'm not hellbent on it.
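To be concrete about the "one line" in option A, the change would look something like this (illustrative names only; the exact attribute and function names depend on the release):

```python
# Illustrative only: exact names depend on the release.
from HARK.ConsumptionSaving.ConsIndShockModel import (
    IndShockConsumerType,
    solve_one_period_ConsIndShock,  # new flat solver function (name assumed here)
)

agent = IndShockConsumerType()

# The old style pointed solve_one_period at a wrapper around an OO solver class;
# the new style points it directly at the flat function.
agent.solve_one_period = solve_one_period_ConsIndShock
agent.solve()
```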

mnwhite commented 2 months ago

The two solver bits described two posts above have now been completed. I still need to add CHANGELOG comments for this PR, and exile the old solvers to another file.

Do not merge until another branch with new/expanded tests has been added.

mnwhite commented 1 month ago

This branch might actually be approaching completion. I made a couple more (very easy) solvers following a conversation with @alanlujan91 this morning. Not right now, but we should have a focused discussion on how many different solvers/models we actually want in HARK. I.e. if model A is just model B with a particular restriction, under what circumstances do we want model A at all?

llorracc commented 1 month ago

Right.

I think we should never use the Cubic as a default. Splines should be the default.

If the person knows that they have a problem appropriate for the cubic, then it should be an option.

On Wed, Mar 27, 2024 at 1:01 PM Alan Lujan commented on this pull request, in HARK/ConsumptionSaving/ConsIndShockModel.py (https://github.com/econ-ark/HARK/pull/1394#discussion_r1541502235):

```python
vNvrsFuncNow = CubicInterp(
    mNrm_temp, vNvrs_temp, vNvrsP_temp, MPCminNvrs * hNrmNow, MPCminNvrs
)
vFuncNow = ValueFuncCRRA(vNvrsFuncNow, CRRA)
```

but this won't support vFuncs with kinks, correct?


mnwhite commented 1 month ago

This model will never have a kinked vFunc. A kinked vFunc only happens in a consumption-saving model if the cFunc is discontinuous.

When we have HARK to the point where we think the future cFunc might be discontinuous, and this period's might be as well (because next period isn't necessarily basic ConsIndShock), then we would not necessarily use this representation. But a whole lot of other stuff would need to be in the solver as well if we expected that situation to come up.
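The logic, for the record (a standard envelope-condition argument): since

$$v'(m) \;=\; u'\big(c(m)\big),$$

a kink in $v$ is a jump in $v'$, and because $u'$ is continuous and strictly monotone under CRRA, a jump in $v'$ can only come from a jump in $c(m)$, i.e. a discontinuous consumption function.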


sbenthall commented 1 month ago

Just a quick comment...

The term to use for the 'LegacyOOsolvers' is "deprecated". They are deprecated.

Typically this is signaled in the API documentation. After a few releases of a feature being deprecated, it is removed from the library. (Downstream users that need the old functionality can always use a previous version of the software.)

Since you've moved the OO solvers to a new location, you haven't exactly preserved the old API access to them. I guess that makes their functionality especially deprecated, since it will take extra work to adapt downstream code to use them.

That will make it especially easy to remove the OO solvers down the line.
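For example, a minimal way to signal it in the code itself (a sketch, not HARK's actual implementation) is a module-level warning in the legacy file:

```python
# Hypothetical header for a LegacyOOsolvers-style module: warn anyone who imports it.
import warnings

warnings.warn(
    "The object-oriented solvers in this module are deprecated and will be "
    "removed in a future release; use the flat solve_one_period functions instead.",
    DeprecationWarning,
    stacklevel=2,
)
```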

mnwhite commented 1 month ago

I think the only thing downstream users need to do is change one import line at the top of their code. I moved them to a separate file to reduce clutter in the model files, and because I didn't want to have test coverage for them-- I meant to see if there was a way to indicate that that file doesn't get counted for coverage. I found some small-ish problems with the old solvers, and corrected some of them with the new solvers; I don't want to write tests for code we're deprecating, to have them hit false targets.


sbenthall commented 1 month ago

makes sense. does the new file show up in the rendered API docs?

mnwhite commented 1 month ago

I... don't know where to look for that.


sbenthall commented 1 month ago

I'll file an issue...