sagikazarmark closed this pull request 3 years ago
@damtur any chance you could take a look at this PR? I know it's rather large and therefore scary, but the individual commits make sense.
Thank you for your contribution @sagikazarmark I'll review your PR as soon as I can, but it can take about a week or so as I'm quite busy right now.
Sure thing, thanks!
Here are some test results: https://github.com/sagikazarmark/fargo/runs/3058950358?check_suite_focus=true
Some of the tests fail and I don't know why. I suspect it's a performance/environment issue (these tests don't fail on my computer, though others do).
I'm not sure how we should proceed, but I don't know how much more time I want to invest in trying to revive the tests.
One potential solution is simply running a single Eureka instance. That should fix most of the performance issues. I'd just ignore the rest of the failing tests.
BTW, all of the failing tests are net tests; the marshal tests are fixed.
@damtur I managed to make the tests pass on both supported Eureka versions. Unfortunately running multiple instances doesn't seem to work.
Here is an example of a passing build: https://github.com/sagikazarmark/fargo/runs/3061570011?check_suite_focus=true
I've spent way more time on making this work than I wanted to, so I hope it can be merged and tagged soon.
Thank you @sagikazarmark for all your contributions. There are still a few tests failing locally, but it's better than it was previously, so I decided to merge this. It's merged now and released here: https://github.com/hudl/fargo/releases/tag/v1.4.0
I would like to get all the tests passing and reduce how long those tests take to run. I believe it's related to some DNS resolution timeouts and it can be fixed. If you find some time to contribute to this, I'd appreciate that as well.
Thanks @damtur !
I'll see what I can do.
I took a quick look at the tests and I'm not sure they are slow because of DNS resolution. There are time.Sleep calls all over the place sleeping for 30 seconds, and there are also tickers with the same interval. I guess they are in place to deal with eventual consistency.
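For illustration, here's a minimal sketch (not from this PR) of how one of those fixed 30-second sleeps could be turned into bounded polling, so a test returns as soon as the registry has converged instead of always waiting the full interval. It assumes fargo's GetApp returning an Application with an Instances slice; the helper name and exact condition are hypothetical.

```go
package fargo_test

import (
	"testing"
	"time"

	"github.com/hudl/fargo"
)

// waitForInstances polls instead of sleeping a fixed 30 seconds: it returns as
// soon as the app reports the expected number of instances, and only fails if
// the same 30-second budget runs out.
func waitForInstances(t *testing.T, conn *fargo.EurekaConnection, app string, want int) {
	t.Helper()
	deadline := time.Now().Add(30 * time.Second) // same upper bound as the current sleeps
	for time.Now().Before(deadline) {
		if a, err := conn.GetApp(app); err == nil && len(a.Instances) == want {
			return // converged early, no need to wait the full 30 seconds
		}
		time.Sleep(time.Second) // short re-check interval instead of one long sleep
	}
	t.Fatalf("app %q did not reach %d instance(s) within 30s", app, want)
}
```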
Meh... that's not great whatsoever. Thank you for looking, though. Are all tests passing for you locally?
If I restart the containers before every test run then yeah, they do. I suspect something stays behind from the tests that makes them fail on the next run (at least that's what the logs suggest).
I guess we should add some clean-up to the tests then. Also, to speed up the tests we could run them in parallel, but that could introduce more problems than it solves.
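For reference, a rough sketch of what per-test clean-up could look like, assuming fargo's RegisterInstance/DeregisterInstance calls; the helper itself is hypothetical and t.Cleanup needs Go 1.14+.

```go
package fargo_test

import (
	"testing"

	"github.com/hudl/fargo"
)

// registerTestInstance registers an instance and guarantees it is removed when
// the test ends, so no state leaks into the next run.
func registerTestInstance(t *testing.T, conn *fargo.EurekaConnection, ins *fargo.Instance) {
	t.Helper()
	if err := conn.RegisterInstance(ins); err != nil {
		t.Fatalf("register instance: %v", err)
	}
	t.Cleanup(func() {
		if err := conn.DeregisterInstance(ins); err != nil {
			t.Logf("cleanup: deregister failed: %v", err)
		}
	})
}
```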
I guess I'm not as worried about performance as I am about those failing tests. So cleaning up the state would probably be my preference in terms of focus area.
And again, thank you for all your work here. We do not currently have the capacity to look at those issues, so contributions like yours help a lot!
Running them in parallel could help, but it would require using different identifiers (I think there is already some contention between tests because they use the same names, and that's why tests sometimes fail, apart from eventual consistency).
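To illustrate the identifier idea, a hypothetical sketch of a parallel test that derives its own app and hostname so registrations never collide across tests; the test name and instance fields are made up, and the remaining required fields are elided.

```go
package fargo_test

import (
	"fmt"
	"testing"
	"time"

	"github.com/hudl/fargo"
)

// Each parallel test builds unique names, so two tests never register or
// deregister the same instance.
func TestHeartbeat(t *testing.T) {
	t.Parallel()

	suffix := fmt.Sprintf("%s-%d", t.Name(), time.Now().UnixNano())
	ins := &fargo.Instance{
		App:      "TESTAPP-" + suffix, // unique per test run
		HostName: "host-" + suffix,
		// ... remaining required fields ...
	}
	_ = ins // register with a helper like the one sketched above, then exercise heartbeats
}
```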
Apologies for submitting a huge PR like this, but the changes are kinda related to each other and I didn't want to submit them one by one. The commits can be independently reviewed though; I made sure each one is meaningful and focuses on one thing.
The changes in summary: