IMFS-MMB / mmb-gui-electron

Electron-vue based GUI of the Macroeconomic Model Database
http://www.macromodelbase.com

Compatibility tests with Octave fail #82

Closed gboehl closed 4 years ago

gboehl commented 4 years ago

Describe the bug: Compatibility tests with Octave don't go through.

To reproduce the behavior:

  1. Go to 'Menu -> Settings'
  2. Choose a working version of Octave
  3. Hit 'Check compatibility'
  4. Tests run, but report that at least one of the tests fails

(this is for reference to facilitate keeping track of all open issues)

@millazar, please have a look at why the tests fail.

millazar commented 4 years ago

I have tried the compatibility test, as requested.

When I ran it through my Matlab versions (2015 and 2019b), it worked well:


---OUTPUT-START---
--- Autocorrelation --- expected -0.071, actual -0.071 success
--- Unconditional variances --- expected 1.1702, actual 1.1702 success
--- Input response functions --- expected 0.0415, actual 0.0415 success
--- [AL] Autocorrelation --- expected 0.7396, actual 0.7396 success
--- [AL] Unconditional variances --- expected 8.2656, actual 8.2656 success
--- [AL] Input response functions --- expected -0.6807, actual -0.6807 success
6 out of 6 successful


However, the final message is a bit confusing, suggesting that something does not work (see the attached screenshot).

When it comes to Octave, I did the same. The command window doesn't stay open long enough to see potential errors directly; all you get is the same message as in the screenshot.

I then tried manually... to pick Octave and run simulations (both through the MMB platform and within the Octave command window). It turns out that with the first model (RW97) the output JSON file is produced and the values are as expected. For the adaptive learning (AL) model, on the other hand, the output file is not produced at all (neither through the platform nor in Octave manually). The warnings I got when I used Octave were the following:


addpath c:\dynare\4.5.7\matlab
mmb('config.test.json')
Dynare path is "/usr/lib/dynare"
warning: addpath: \usr\lib\dynare\matlab: No such file or directory
warning: Unsupported version of octave used. We cannot guarantee that all models will be simulated.
remove entire contents of C:\MMB\mmb-gui-electron-master\static\mmci-cli\work? (yes or no) yes

warning: Dynare is not on top of matlab's path! This can cause problems because the Dynare version of ver_greater_than.m will be overriden.

warning: I put c:\dynare\4.5.7\matlab on top of your matlab's path. Note that this is a temporary change (ie will not affect future matlab's session). If the ordering was intentional, ie if you really want to override the routines distributed with Dynare, you can change this behaviour using option nopathchange (see the reference manual).


But bear in mind that with the same warnings, the output was produced for the regular model. For the AL one, it simply collapsed.

Hope this helps...

screenshot
gboehl commented 4 years ago
Dynare path is "/usr/lib/dynare"
warning: addpath: \usr\lib\dynare\matlab: No such file or directory

At least this part should not be there under Windows.

@j2L4e is this relevant for this problem, or is it a separate bug?

For me the AL models generally run through (apart from the following exception). Setup: Octave 5.1 / Dynare 4.5.7.

Another thing: NK_CGG99AL has problems with the Cholesky decomposition. I have no clue how the AL models work. @AlexDece @JTWendelborn, could one of you check whether the model ran in the last 2.x version of the MMB (and if so, what is different now)?

AlexDece commented 4 years ago

I will check it and report the result here.

j2L4e commented 4 years ago

is this relevant for this problem, or is it a separate bug?

It's not a bug; you need to set the proper Dynare path in the config.

@millazar did you load Dynare by hand? (the first line?)

AlexDece commented 4 years ago

The model does not run in MMB 2.3. Error: Reference to non-existent field 'jacobia'.

The mod files are identical.

In TRFromBeta.m, jacobia is defined as oo_.dr.jacobia.

I looked into this structure. Several things are defined there, such as eigval (an 11x1 vector) or kstate (an 11x4 matrix), but no jacobia.

gboehl commented 4 years ago

@AlexDece I opened a new bug on this, #86, to discuss it.

@millazar @j2L4e He did load it by hand. Lazar, the GUI sets the Dynare path in static/mmci-cli/config.json or config.test.json(?). If you don't use the GUI, you have to set that by hand.

j2L4e commented 4 years ago

static/mmci-cli/config.json or config.test.json(?)

it loads it from the config file you hand over to mmb()

I'll add comments on that as part of the documentation

millazar commented 4 years ago

Sure, I set all the directories and paths properly, I would say. As reported, it worked well for the regular model; otherwise I would have had an error in that case as well.

gboehl commented 4 years ago

That's not a very helpful report. As your error message references "/usr/lib/dynare", this path must be misspecified somewhere.

Please have a look, because we cannot reproduce that the AL models are not working at all (apart from NK_CGG99AL).

gboehl commented 4 years ago

@millazar sounds like you might have only changed config.json, but not test/config.test.json?

j2L4e commented 4 years ago

You can run the tests from the command line via

test('test/config.test.json', 'test/config.test-al.json');

after changing the dynare path in those test files.

That's all the GUI does as well: it injects the Dynare path and runs the test() command.
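For reference, a minimal sketch of the manual equivalent (a hedged example; the checkout and Dynare paths below are the ones mentioned elsewhere in this thread and must be adjusted to your setup):

% Manual equivalent of the GUI's compatibility check. Assumes the
% dynare path inside both test config files has already been edited.
cd('C:/MMB/mmb-gui-electron-master/static/mmci-cli')   % your checkout
addpath('c:/dynare/4.5.7/matlab')                      % your Dynare
test('test/config.test.json', 'test/config.test-al.json');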

millazar commented 4 years ago

@gboehl @j2L4e OK guys, thank you for your suggestions. I installed the new version again and proceeded as suggested.

The compatibility test for Matlab 2019b and Dynare 4.5.7 worked the same as before (it produced the expected results, with the unexpected window popping up at the end again).

Then I set Octave 4.4.1 together with Dynare 4.5.7 and it collapsed (screenshot1 shows the command window a few seconds before everything shuts down).

screenshot1

I ran everything manually within Matlab 2019b once again and no problem occurred.

Finally, I tried everything in Octave directly. For the regular model, the output JSON file was produced with the desired results (screenshot2 of the command window attached).

screenshot2

Once the AL model was involved, the program collapsed (for both test('test/config.test.json', 'test/config.test-al.json') and mmb('test/config.test-al.json')), with the command window looking as in screenshot3 before shutting down.

screenshot3

To conclude, as suggested in my first post (there I obtained a warning, not an error, in both Matlab and Octave, and both worked well for the regular model): everything seems to work with the regular model (in both the Matlab and Octave versions), but the test and simulation collapse in Octave once the AL model is considered (unfortunately without giving an error, but by closing the program completely).

millazar commented 4 years ago

@AlexDece Can you please check whether it works for you in Octave? It will take you 5-10 minutes.

All you need to do is change the JSON files for the tests (both config.test.json and config.test-al.json) in the folder ...mmb-gui-electron-master\static\mmci-cli\test

so that in the first line you put your Dynare path instead of "/usr/lib/dynare",

and then write the following in the command window in Octave:

test('test/config.test.json', 'test/config.test-al.json');

(with your current directory set to ...mmb-gui-electron-master\static\mmci-cli)

AlexDece commented 4 years ago

This is what I get

test('test/config.test.json', 'test/config.test-al.json');
warning: delete: D:\gsefm\Wieland_work\MMB\mmb-gui-electron\static\mmci-cli\out\diary.log: Permission denied
Dynare path is "D:/dynare/4.5.7"
warning: Unsupported version of octave used. We cannot guarantee that all models will be simulated.
remove entire contents of D:\gsefm\Wieland_work\MMB\mmb-gui-electron\static\mmci-cli\work? (yes or no)

when I say no, then this happens

no

millazar commented 4 years ago

Yes, and I just type yes then (from what I gather, the MMB always deletes files from the previous exercise in the work folder when running a new exercise). I have never used Octave before (I just know that the language is almost the same as Matlab's), so take everything I say with a grain of salt.

millazar commented 4 years ago

Yes, and you need to type it again (for the second model). It could be that saying yes or no makes no difference at all (I actually believe that I tried both... and got the same outcome)

AlexDece commented 4 years ago

Okay, both run through; the second one (AL) shows me this (after the PF hits and exits). So Autocorrelations and Unconditional variances fail. (I put the | in front so that the result is not commented out.)

|---OUTPUT-START---
|--- Autocorrelation --- expected -0.071, actual -0.071 success
|--- Unconditional variances --- expected 1.1702, actual 1.1702 success
|--- Input response functions --- expected 0.0415, actual 0.0415 success
|--- [AL] Autocorrelation --- expected 0.7396, actual 0.777 failed
|--- [AL] Unconditional variances --- expected 8.2656, actual 15.9088 failed
|--- [AL] Input response functions --- expected -0.6807, actual -0.6807 success
4 out of 6 successful

millazar commented 4 years ago

OK, this is already better than what I got! Now the question is what leads to the different outcomes...

gboehl commented 4 years ago

That looks a lot as if the AL stuff does not work with Octave / gives false results.

The bad thing is that I can't check what Matlab is doing, because I don't have a valid license left to install on this computer (proprietary software sux).

gboehl commented 4 years ago

I suspect the source to be numerical errors, potentially related to the condition numbers of the matrices. This would also explain #86: it just needs one minus sign in the wrong position to render a perfectly fine positive definite matrix indefinite.

We did check that this Q matrix from #86 looks the same, right? That would imply that the Cholesky decomposition is screwed up. Could you check whether the results from chol are the same in Matlab/Octave?

I'm just checking the different specs... https://octave.sourceforge.io/octave/function/chol.html https://de.mathworks.com/help/matlab/ref/chol.html

gboehl commented 4 years ago

@AlexDece @millazar IF the results of chol look different for the same input matrix, could you check whether they stop being different when using sqrtm (the matrix square root) instead of chol?

https://octave.org/doc/v4.2.2/Functions-of-a-Matrix.html#XREFsqrtm

@JTWendelborn this is what I meant. Could you check as well?
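A minimal sketch of the requested check, runnable identically in Matlab and Octave (the 2x2 Q below is a made-up stand-in, not the model's actual matrix):

Q = [2.0 0.5; 0.5 1.0];   % illustrative positive definite matrix
R = chol(Q);              % upper-triangular factor, Q = R'*R
S = sqrtm(Q);             % symmetric square root,   Q = S*S
disp(norm(R' * R - Q))    % both residuals should be on the order of 1e-16
disp(norm(S * S - Q))

If the two programs disagree, the element-wise difference of R (or S) shows where.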

AlexDece commented 4 years ago

I will take a look at it tomorrow. For #82 I can check with Lazar, and at #86 I will take a quick look, but I can say that this model also worked for me (possibly with wrong results, though: it showed no error and produced IRFs, same as for Jonas).

gboehl commented 4 years ago

Great, thank you! Did you have to do the renaming of the ALTOOL folder, as suggested?

JTWendelborn commented 4 years ago

@AlexDece @millazar IF the results of chol look different for the same input matrix, could you check whether they stop being different when using sqrtm (the matrix square root) instead of chol?

https://octave.org/doc/v4.2.2/Functions-of-a-Matrix.html#XREFsqrtm

@JTWendelborn this is what I meant. Could you check as well?

Yes, I will do so! But it seems it will need more than 5 minutes, so I would postpone it until tomorrow - it's quite late over here ;-)

j2L4e commented 4 years ago

Yes, and you need to type it again (for the second model). It could be that saying yes or no makes no difference at all (I actually believe that I tried both... and got the same outcome)

It clears the /out folder. If you answer "no", everything will work as expected simulation-wise, but the resulting .json files will be mixed up with those from the last run. It doesn't make a difference for the test function.

AlexDece commented 4 years ago

chol(Q) gives me the same result for Matlab (2017) and Octave (4.4.1). However, Octave gives me "0.00000" in both the upper and lower triangular parts, while Matlab gives me "0" in the lower and "0.00000" in the upper triangular part.

For sqrtm(Q), the results look different (maybe Matlab rounds to 0 and Octave does not?).

See the pictures attached (first for Matlab, second for Octave).

Matlab

Octave
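To compare the two factors beyond their printed form, one can round-trip a .mat file (a hedged sketch; the file and variable names are illustrative):

% In Matlab:  R = chol(Q); save('chol_matlab.mat', 'R', '-v7');
% Then in Octave, with the same Q in the workspace:
load('chol_matlab.mat')        % brings in R from the Matlab run
R_octave = chol(Q);
max(abs(R_octave(:) - R(:)))   % on the order of eps (~2e-16) means identical

A printed "0" versus "0.00000" is only a display difference, not a numerical one.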

gboehl commented 4 years ago

These are the same results numerically...

gboehl commented 4 years ago

@AlexDece for which model did you get these results? Keep in mind that we are looking for the reason for these deviations here:

|--- [AL] Autocorrelation --- expected 0.7396, actual 0.777 failed
|--- [AL] Unconditional variances --- expected 8.2656, actual 15.9088 failed
|--- [AL] Input response functions --- expected -0.6807, actual -0.6807 success

which are generated using the setup in static/mmci-cli/test/config/test-al.json.

Maybe blaming the decomposition of Q isn't the right track, as the initial IRF values look good. Is there a difference in how ACs and unconditional variances are calculated for AL models?

AlexDece commented 4 years ago

I don't know which results you mean, I am a bit confused 😄

The chol(Q) and related checks were done directly in Octave/Matlab, using the one-line command given above (test).

The IRFs are from the MMB test-version GUI and developer-version GUI, respectively; there I did not use test or Octave/Matlab directly.

gboehl commented 4 years ago

chol(Q) gives me the same result for Matlab (2017) and Octave (4.4.1). However, Octave gives me "0.00000" in both the upper and lower triangular parts, while Matlab gives me "0" in the lower and "0.00000" in the upper triangular part.

For sqrtm(Q), the results look different (maybe Matlab rounds to 0 and Octave does not?).

See the pictures attached (first for Matlab, second for Octave).

Matlab

Octave

These are the results I mean. All of the matrices are numerically identical to machine precision.

Okay, you used the test command, I understand. Can you track down why the test fails, i.e. where Octave and Matlab start giving different results? Best to coordinate with @millazar, as he is responsible for the tests.

AlexDece commented 4 years ago

Okay, I understand. I will try to track it down and consult Lazar if necessary.

JTWendelborn commented 4 years ago
  1. I also checked, and I get the exact same results that Alex reported above for chol(Q) and sqrtm(Q) under Octave and Matlab.

  2. I don't know whether this is relevant or helps, but I noticed that the displayed series of 'PF hits' and 'Exits' differs between Matlab and Octave. No clue about the reason, though :( In Matlab I have:

PF hits 28 Exits 354
PF hits 24 Exits 291
PF hits 6 Exits 276
PF hits 10 Exits 329
PF hits 32 Exits 90
PF hits 96 Exits 261
PF hits 13 Exits 281
PF hits 15 Exits 297
PF hits 35 Exits 380
PF hits 13 Exits 339

Whereas in Octave I have:

PF hits 29 Exits 303
PF hits 18 Exits 249
PF hits 2 Exits 169
PF hits 153 Exits 42
PF hits 1 Exits 165
PF hits 14 Exits 169
PF hits 23 Exits 335
PF hits 17 Exits 197
PF hits 16 Exits 335
PF hits 99 Exits 4

j2L4e commented 4 years ago

I'm unsubbing here as it clutters my inbox. Ping me if there's anything I can do.

AlexDece commented 4 years ago

I think different PF hits and Exits are normal because the fictional data is generated using random normal numbers.

I have problems reconstructing the values of parameters in Matlab/Octave. I changed the Matlab code using the fprintf() function, but in Matlab the command-window output does not change, while in Octave it changes unreasonably. Can anyone tell me how to do that?

However, what I find a bit puzzling is the following: I use Dynare 4.5.7 in Octave AND Matlab. While Octave uses 4.5.7, Matlab says in the workspace that the Dynare version is 4.5.6. Using 4.5.6 in Octave produces an error because I use version 4.4.1. Also, Octave does not store M_ and other things (alpha, beta) in the workspace. The eigenvalues and the Jacobian are the same to machine precision...

My first guess is that R or T, and thereby betamat or SecMom, are different. They play an important role in the subsequent calculations.

gboehl commented 4 years ago

I think different PF hits and Exits are normal because the fictional data is generated using random normal numbers.

Agreed! Although one would normally want to fix the random seed to ensure reproducibility.
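A hedged sketch of how that could look (not current mmci-cli code; note that Matlab and Octave would still produce different sequences even with the same seed):

% Pin the generator state before the simulation loop so that repeated
% runs within one program reproduce the same draws. Octave 4.x lacks
% rng(), so branch on the interpreter.
if exist('OCTAVE_VERSION', 'builtin')
  rand('state', 42);
  randn('state', 42);
else
  rng(42);
end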

I have problems reconstructing the values of parameters in Matlab/Octave. I changed the Matlab code using the fprintf() function, but in Matlab the command-window output does not change, while in Octave it changes unreasonably. Can anyone tell me how to do that?

I'm not sure what you mean by "changes". But sorry, I can't help you here, as I'm not using the standard interfaces.

However, what I find a bit puzzling is the following: I use Dynare 4.5.7 in Octave AND Matlab. While Octave uses 4.5.7, Matlab says in the workspace that the Dynare version is 4.5.6. Using 4.5.6 in Octave produces an error because I use version 4.4.1.

That is odd. Do you even have 4.5.6 installed? As we do not support this version, maybe removing it would be an option.

Also, Octave does not store M_ and other things (alpha, beta) in the workspace. The eigenvalues and the Jacobian are the same to machine precision...

My first guess is that R or T, and thereby betamat or SecMom, are different. They play an important role in the subsequent calculations.

Great that you have a clue! Let's say: if we don't know more by Monday, we stick with a warning message for Octave/AL users. It is not our core responsibility to fix bugs in the AL part that have always been there.

AlexDece commented 4 years ago

@btatar13 and I debugged the AL algorithm. It is true that the issue is in the file 'sim_AL_alt_gain.m'. Although the matrices mentioned above (R, T, betamat and all) are the same under Matlab and Octave, we found out that the random number generator works differently. Both use a similar algorithm, yet they deliver different sequences, and it seems there is no way to align them. Random numbers are also used for the calculation of autocorrelations and variances. As the Covar matrix is only averaged over ten simulations (n_sims=10), the averages differ. Setting n_sims to 100 changes the results on both platforms, yet the difference between them seems to decline. However, 100 simulations already take way too much time in Octave.
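A stylized sketch of the averaging step just described (the toy series and names below are illustrative, not the actual sim_AL_alt_gain.m code):

% Because Covar is a mean over only n_sims stochastic simulations,
% its entries inherit the draw-to-draw noise of the generator.
n_sims = 10;                % the value currently used for AL models
T = 200;                    % toy sample length
Covar = zeros(2);
for s = 1:n_sims
  e = randn(T, 2);          % stand-in for one simulated series
  Covar = Covar + cov(e) / n_sims;
end
disp(Covar)                 % differs run to run, and across programs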

The IRFs are the only values which are as expected under Octave and AL. That is because no random number generation is involved in the file that produces the IRFs; this procedure is deterministic.

Maybe someone could run the test under Octave and store the resulting values as the expected values (as was done for the current expected values with Matlab).

Finally, it would be useful to state somewhere that AL under Octave delivers similar IRFs, but different autocorrelations and variances than Matlab.

gboehl commented 4 years ago

Very well done!

Well, the variances should never depend on the random seed. That simply means that these are unreliable results. How long do 100 simulations take under Octave? How different are the results then?

We are now thinking about warning the user that the simulation may take a long time.

A different solution would be to take a low-discrepancy series (like Sobol) as input instead of the random numbers. But even then you'd need more samples...
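For the record, a minimal sketch of the low-discrepancy idea, using a Halton sequence (a simpler cousin of Sobol, since neither base Matlab nor base Octave ships a Sobol generator):

% Radical-inverse (van der Corput) sequence in a given base; several
% columns with coprime bases form a Halton sequence that could stand
% in for rand() draws.
function h = halton(n, base)
  h = zeros(n, 1);
  for i = 1:n
    f = 1; r = 0; k = i;
    while k > 0
      f = f / base;
      r = r + f * mod(k, base);
      k = floor(k / base);
    end
    h(i) = r;
  end
end

Mapping the uniforms to normals, e.g. z = sqrt(2) * erfinv(2*u - 1), would then replace the randn() draws; as said, more samples would still be needed.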

AlexDece commented 4 years ago

I think using more samples and/or the other method is okay if the results become reliable.

100 simulations take a few minutes under Octave. We didn't time it, but I think it was 2-5 minutes. The results are quite close (Matlab and Octave), but they differ quite a bit from the 10-simulation case.

In case you want to see the exact numbers, @btatar13 and I can post them. I saved the Matlab test statistics and if I remember correctly, Balint also saved his Octave test statistics.

If we take more than 100 samples, or if the other method is slower, this might be annoying to some people, or they will simply think that Octave broke. A warning message would be fine for the second issue.

gboehl commented 4 years ago

I think using more samples and/or the other method is okay if the results become reliable.

I wouldn't do Sobol for now. That's something for the future. I don't expect many people to be affected, and implementing a low-discrepancy series might be a bit of a hassle. They are a very useful tool, though!

100 simulations take a few minutes under Octave. We didn't time it, but I think it was 2-5 minutes. The results are quite close (Matlab and Octave), but they differ quite a bit from the 10-simulation case.

Both the Matlab and Octave estimates of the covariance were wrong, as you need far more samples (> ndim^2) for a reliable estimate. So this is fine. How close are they? What is the magnitude? How many samples are needed for the estimates to become independent of the random draws at a 1e-3 precision level?
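A quick numerical illustration of that point (a hedged toy experiment, not the MMB code):

% The spread of an averaged variance estimate across repeated
% experiments shrinks roughly like 1/sqrt(n_sims).
for n_sims = [10 100 1000]
  est = zeros(200, 1);
  for k = 1:200
    est(k) = mean(var(randn(50, n_sims)));  % one averaged estimate
  end
  fprintf('n_sims=%4d  std of estimate: %.4f\n', n_sims, std(est));
end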

@j2L4e it seems that we're going for that warning. Let's discuss where and how to implement it...

AlexDece commented 4 years ago

This is what I get using Matlab and n_sims=100

---OUTPUT-START---
--- [AL] Autocorrelation --- expected 0.7396, actual 0.7286 failed
--- [AL] Unconditional variances --- expected 8.2656, actual 14.4198 failed
--- [AL] Input response functions --- expected -0.6807, actual -0.6807 success
1 out of 6 successful

I think in Octave we got something like: 0.723, 13.something, -0.6807

This is not precise to the third digit after the decimal point. However, it is a lot closer than before. So tomorrow I can try it with up to 1000 simulations (or more if speed allows) and see whether the values become the same. I will also report the speed of the calculations.

btatar13 commented 4 years ago

I ran the test using Matlab with n_sims=10, 100, 1000 and Octave with n_sims=10, 100. With Matlab, 1000 simulations ran through in 2 min 30 sec on an i5-6300U. With Octave this would be infeasible. You can see that jumping from 10 to 100 changes the results significantly for Matlab. From a testing perspective, I wonder whether it makes sense to increase n_sims: if something goes wrong, it goes wrong regardless of whether n_sims=10 or n_sims=100. Maybe we should think about increasing n_sims for model comparison purposes instead, so as not to depend so much on randomness.

Results for the test:

Matlab n_sims=10

--- [AL] Autocorrelation --- expected 0.7396, actual 0.7396 success
--- [AL] Unconditional variances --- expected 8.2656, actual 8.2656 success
--- [AL] Input response functions --- expected -0.6807, actual -0.6807 success

Matlab n_sims=100

--- [AL] Autocorrelation --- expected 0.7396, actual 0.7286 failed
--- [AL] Unconditional variances --- expected 8.2656, actual 14.4198 failed
--- [AL] Input response functions --- expected -0.6807, actual -0.6807 success

Matlab n_sims=1000

--- [AL] Autocorrelation --- expected 0.7396, actual 0.7285 failed
--- [AL] Unconditional variances --- expected 8.2656, actual 12.4508 failed
--- [AL] Input response functions --- expected -0.6807, actual -0.6807 success

Octave n_sims=10

--- [AL] Autocorrelation --- expected 0.7396, actual 0.777 failed
--- [AL] Unconditional variances --- expected 8.2656, actual 15.9088 failed
--- [AL] Input response functions --- expected -0.6807, actual -0.6807 success

Octave n_sims=100

--- [AL] Autocorrelation --- expected 0.7396, actual 0.726 failed
--- [AL] Unconditional variances --- expected 8.2656, actual 13.247 failed
--- [AL] Input response functions --- expected -0.6807, actual -0.6807 success

gboehl commented 4 years ago

Thanks, guys. Let's settle on 200 simulations and put a warning in the user guide that 200 is not enough to achieve real precision. Could you send a respective PR?

Jan will also implement a warning about the duration.

AlexDece commented 4 years ago

200 simulations might be okay for testing, but in Octave it takes a LOT of time. See below the elapsed times of the AL part for different numbers of simulations, for ONE model and ONE policy rule. If the user wants to compare several models and several rules, the simulation can take an hour if Octave is used.

Can someone please check if Octave 5.x also takes so much time for the simulation?

Matlab: n_sims=200
[AL] Autocorrelation expected 0.7396, actual 0.7172
[AL] Unconditional variances expected 8.2656, actual 12.559
Elapsed time is 37.514643 seconds.

Octave: n_sims=200
[AL] Autocorrelation expected 0.7396, actual 0.7113
[AL] Unconditional variances expected 8.2656, actual 12.8588
Elapsed time is 583.827 seconds.

Matlab: n_sims=100
[AL] Autocorrelation expected 0.7396, actual 0.7286
[AL] Unconditional variances expected 8.2656, actual 14.4198
Elapsed time is 20.608989 seconds.

Octave: n_sims=100
[AL] Autocorrelation expected 0.7396, actual 0.726
[AL] Unconditional variances expected 8.2656, actual 13.247
Elapsed time is 433.82 seconds.

Octave: n_sims=50
[AL] Autocorrelation expected 0.7396, actual 0.7032
[AL] Unconditional variances expected 8.2656, actual 12.3156
Elapsed time is 143.046 seconds.

-One suggestion is to check in the code whether Octave or Matlab is used and then set n_sims accordingly (e.g. Matlab: n_sims=200, Octave: n_sims=50); see the sketch after this list.

-We could also allow the user to set n_sims in the AL case and show a message with suggested values, or the time needed for one model and one policy rule. But this needs some time to think about and implement.

-We could also just write it in the user guide and tell the user where to change n_sims in case they want to change the precision for the AL models.
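A minimal sketch of that first option, reusing the interpreter check from the seed sketch above (the values come from the benchmarks in this comment):

% Pick a cheaper default under Octave, where the AL simulations run
% roughly an order of magnitude slower than in Matlab.
if exist('OCTAVE_VERSION', 'builtin')
  n_sims = 50;    % Octave: ~143 s for the AL part of one model/rule
else
  n_sims = 200;   % Matlab: ~38 s
end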

I will wait for your answer before changing the code and committing a PR.

gboehl commented 4 years ago

Thanks for the benchmarks! I think 200 is fine. People will be warned.

I would like to avoid elaborate solutions such as checking the running version or further user interaction (which would require extending the UI) and other backend interactions, as I don't think this is a very relevant issue in practice.

If someone wants to be scientific and serious about comparing AL models, they would have to look at the code in any case.

So a short comment on that in the guide and setting n_sims=200 for AL models should be sufficient.

gboehl commented 4 years ago

Okay, so we added the warning, increased n_sims to 200, and I removed AC & var testing for AL models from compatibility testing in https://github.com/IMFS-MMB/mmb-gui-electron/commit/a3f5cb3c8a9c3010c1c1c7069fa66ea566a78c1d.

@j2L4e Tests are now passing for Octave.

@AlexDece @millazar can you confirm the test is still running under Windows?

AlexDece commented 4 years ago

Both work, but it shows that 6 out of 4 tests were successful:

Matlab:
---OUTPUT-START---
--- Autocorrelation --- expected -0.071, actual -0.071 success
--- Unconditional variances --- expected 1.1702, actual 1.1702 success
--- Input response functions --- expected 0.0415, actual 0.0415 success

successful =

 4

successful =

 5

--- [AL] Input response functions --- expected -0.6807, actual -0.6807 success
6 out of 4 successful

Octave:
---OUTPUT-START---
--- Autocorrelation --- expected -0.071, actual -0.071 success
--- Unconditional variances --- expected 1.1702, actual 1.1702 success
--- Input response functions --- expected 0.0415, actual 0.0415 success
successful = 4
successful = 5
--- [AL] Input response functions --- expected -0.6807, actual -0.6807 success
6 out of 4 successful

j2L4e commented 4 years ago

Sorry, refixed via 89db05d

gboehl commented 4 years ago

So then this is finally fixed, right?

AlexDece commented 4 years ago

Yes

gboehl commented 4 years ago

Sorry, my bad. I was hoping to find a way to make the calculation of variances/ACs for AL models optional via config.json. Unfortunately there is no quick way to do this, so I reverted to n_sims=25 and edited the warning to inform the user of the inferior precision.

@AlexDece you are right, the best way would have been to set n_sims via the config file. But that's quite some effort...