Closed bjuergens closed 3 years ago
@Dranero There are still a few errors in the unit tests, and I'm not sure whether they come from the merge, from errors in the tests, or from errors in the code. Can you take a look?
In particular, I'm concerned that the test_run test fails:
PYTHONPATH=./neuro_evolution_ctrnn pytest tests/test_experiment.py::TestExperiment::test_run
(it also fails if I fix the config beforehand)
I was able to fix all tests except that one with bug fixes. The test you mentioned I can hardly reproduce. But I can imagine that transposing the output matrix, in combination with possibly hard-coded individuals, also fills the matrix the wrong way around. You surely know better what exactly happens in that test.
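A minimal sketch of the suspicion above (hypothetical shapes and a hypothetical flat genome, not the repo's actual code): if the weight matrix is filled row by row from a flat individual, the old layout and the transposed new layout place the same genes at different positions, so a test with hard-coded individuals sees different outputs.

```python
import numpy as np

# Hypothetical: a flat genome fills T row by row. With 3 neurons and
# 2 outputs, a (2, 3) matrix and a (3, 2) matrix consume the same
# genes in a different order.
genome = np.arange(6, dtype=float)

t_old = genome.reshape(2, 3)      # old layout: (output_size, number_neurons)
t_new = genome.reshape(3, 2)      # new layout: (number_neurons, output_size)

y = np.array([0.1, 0.2, 0.3])     # neuron state

o_old = np.tanh(t_old.dot(y))     # old: T.dot(y)
o_new = np.tanh(t_new.T.dot(y))   # new: T.T.dot(y)

# Same genome, same result shape, but different output values,
# because the flat genome fills the two layouts differently.
print(o_old)
print(o_new)
```

So even if both layouts are mathematically valid, expected values recorded with one layout will not match the other.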
I was able to fix all tests except that one with bug fixes
Are the fixes already pushed?
And while you're at it, please also merge master in again.
I have now spent the last 3 hours investigating the problem with the test_run test.
Result: the expected values changed because I no longer use random but only np.random.
--> In the future, please merge master in more often. Or alternatively: troubleshoot it yourself.
--> The expected values in the test_run unit test need to be adjusted. (I can do that myself once master has been merged in and the other tests are fixed.)
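For illustration (the seed value is hypothetical, not taken from the repo): Python's random and NumPy's np.random maintain separate generator states and initialize them differently from the same seed, so switching from one to the other changes every drawn value and invalidates hard-coded expected values in a test.

```python
import random
import numpy as np

SEED = 42  # hypothetical seed, just for illustration

# Python's random module and NumPy's legacy global RandomState are
# independent generators with different seeding, so the same seed
# yields completely different number streams.
random.seed(SEED)
py_draws = [random.random() for _ in range(3)]

np.random.seed(SEED)
np_draws = np.random.random(3).tolist()

print(py_draws)
print(np_draws)

# A unit test asserting exact values from one stream fails as soon
# as the code under test switches to the other stream.
```

This is why the expected values have to be re-recorded rather than debugged.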
The conflicts in the json files are irrelevant. I deleted a bunch of them on another branch and did some housekeeping, which I could do again after this PR has been merged. So when merging, just pick "theirs" or "ours" for the whole files and be done with it. No special considerations needed.
Some background: The json files in the repo are examples, and they are used in the unit tests to make sure most things can be initialized. Beyond that they serve no functional purpose. They can be changed as much as you want, as long as their unit test still passes.
sorry, I forgot to push.
I have also deleted the old LSTM numpy implementation and fixed all configurations
Are you sure you merged the master branch correctly? There are still plenty of merge conflicts remaining.
Oh, I forgot to pull this time. ^^ Give me an hour.
I just did a small test with these changes:
```diff
$ git diff|cat
diff --git a/neuro_evolution_ctrnn/brains/continuous_time_rnn.py b/neuro_evolution_ctrnn/brains/continuous_time_rnn.py
index afef401..e086764 100644
--- a/neuro_evolution_ctrnn/brains/continuous_time_rnn.py
+++ b/neuro_evolution_ctrnn/brains/continuous_time_rnn.py
@@ -136,7 +136,7 @@ class ContinuousTimeRNN(IBrain[ContinuousTimeRNNCfg]):
         else:
             self.y = np.clip(self.y, self.clipping_range_min, self.clipping_range_max)
-        o: Union[np.ndarray, np.generic] = np.tanh(self.T.dot(self.y))
+        o: Union[np.ndarray, np.generic] = np.tanh(self.T.T.dot(self.y))
         return o

     @classmethod
@@ -173,7 +173,8 @@ class ContinuousTimeRNN(IBrain[ContinuousTimeRNNCfg]):
         w_mask = cls._generate_mask(config.w_mask, config.number_neurons, config.number_neurons,
                                     config.w_mask_param)
         # TODO The mask was flipped on the diagonal for mathematical correct structure. check if no errer exist
-        t_mask = cls._generate_mask(config.t_mask, output_size, config.number_neurons, config.t_mask_param)
+        # t_mask = cls._generate_mask(config.t_mask, output_size, config.number_neurons, config.t_mask_param)
+        t_mask = cls._generate_mask(config.t_mask, config.number_neurons, output_size, config.t_mask_param)
         cls.set_class_state(v_mask=v_mask, w_mask=w_mask, t_mask=t_mask)
```
and the unit test passed.
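A quick standalone shape check (hypothetical sizes, plain NumPy instead of the repo's classes) of why the transpose and the swapped mask dimensions have to change together: with the new (number_neurons, output_size) layout, the mask must match that shape, and the output is then computed via the transpose.

```python
import numpy as np

number_neurons, output_size = 4, 2  # hypothetical sizes
y = np.random.random(number_neurons)  # neuron state

# New layout: T and its mask both have shape (number_neurons, output_size) ...
t_mask = np.ones((number_neurons, output_size), dtype=bool)
T = np.random.random((number_neurons, output_size)) * t_mask

# ... and the output is computed with the transpose, as in the diff.
o = np.tanh(T.T.dot(y))
assert o.shape == (output_size,)

# Keeping the old (output_size, number_neurons) mask with the new T
# would not even broadcast:
old_mask = np.ones((output_size, number_neurons), dtype=bool)
try:
    _ = np.random.random((number_neurons, output_size)) * old_mask
except ValueError as e:
    print("shape mismatch:", e)
```

Changing only one of the two lines would either crash on the mask multiplication or silently compute the wrong product.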
Thus I conclude that the changed results come from exactly this change, and we know that this change is intentional. So I've updated the unit test, and will also update it for the CI runners.
I will now merge this into master, and tomorrow I will run a short experiment to make sure nothing big broke unexpectedly.
Also, I assume that all requested changes are fine.
I think that was our biggest PR so far. I'm glad it's finally merged lol.
And I hope not too much broke during the merge
Merge directly into master, right? @pdeubel