kadena-io / chainweb-node

Chainweb: A Proof-of-Work Parallel-Chain Architecture for Massive Throughput
https://docs.kadena.io/basics/whitepapers/overview
BSD 3-Clause "New" or "Revised" License

Support latest version of servant #154

Closed larskuhtz closed 4 years ago

larskuhtz commented 5 years ago

Some code in the mempool depends on servant <= 1.4. The most recent version is 1.6.

Since servant depends on a wide range of other packages that we also use elsewhere, we should try to be liberal about which servant versions we support. In particular, we should always try to be compatible with the most recent version.

mightybyte commented 5 years ago

This is a huge task because the new version of servant has poor backwards compatibility. I recently tried the upgrade on one of my personal projects, blew an entire day on it, and ended up abandoning the attempt. This has also already been attempted on chainweb by @gregorycollins with similar lack of success. Moving to icebox.

larskuhtz commented 5 years ago

We have to do it soon. I am increasingly blocked by this on other tasks. Servant's aggressive upper bounds prevent updating other basic packages such as network and http-client, and I need to test with recent versions of those packages for debugging.

(When filing bug reports for packages, the first thing maintainers will ask is to try with the most recent version of the package.)
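One way to test against newer versions of those packages despite servant's upper bounds is to relax the offending constraints locally. A hedged sketch of a `cabal.project` fragment; the specific package/dependency pairs below are illustrative, not taken from the repository's actual configuration:

```cabal
-- cabal.project (illustrative fragment)
-- Ignore servant's upper bounds on these dependencies so the build
-- can pick up newer network/http-client for debugging.
allow-newer:
    servant:network
  , servant:http-client
```

This only sidesteps the bounds for local testing; it does not replace the actual upgrade work this issue tracks.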

larskuhtz commented 5 years ago

I got most of this working. The only missing piece is connecting servant streams (which the mempool uses) to io-streams. There are packages for using conduit, pipes, and machines, but io-streams support is still missing.
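The shape of the missing adapter can be sketched with simplified stand-ins. The `Source` and `InputStream` types below are hypothetical miniatures of servant's incremental stream and io-streams' `InputStream`, not the real library types; the point is only that a pull-based stream of steps can be turned into a mutable read action by storing the current continuation in an `IORef`:

```haskell
module Main where

import Data.IORef

-- Simplified stand-in for a servant-style incremental stream:
-- each step yields an element plus the rest, or signals end-of-stream.
newtype Source a = Source (IO (Maybe (a, Source a)))

-- Simplified stand-in for io-streams' InputStream: a read action
-- that returns Nothing at end-of-stream.
newtype InputStream a = InputStream (IO (Maybe a))

-- Adapt a Source to an InputStream by keeping the current
-- continuation in a mutable reference.
toInputStream :: Source a -> IO (InputStream a)
toInputStream src0 = do
  ref <- newIORef src0
  pure $ InputStream $ do
    Source step <- readIORef ref
    next <- step
    case next of
      Nothing        -> pure Nothing
      Just (x, rest) -> writeIORef ref rest >> pure (Just x)

-- Build a Source from a list, for demonstration only.
fromList :: [a] -> Source a
fromList []     = Source (pure Nothing)
fromList (x:xs) = Source (pure (Just (x, fromList xs)))

main :: IO ()
main = do
  InputStream readNext <- toInputStream (fromList [1, 2, 3 :: Int])
  let go acc = readNext >>= maybe (pure (reverse acc)) (\x -> go (x : acc))
  xs <- go []
  print xs  -- prints [1,2,3]
```

A real adapter would additionally have to handle servant's error constructor and io-streams' resource conventions, but the core plumbing is this continuation-in-a-ref pattern.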

larskuhtz commented 5 years ago

> This has also already been attempted on chainweb by @gregorycollins with similar lack of success. Moving to icebox.

I think this is the wrong approach. If the upgrade is more difficult than expected, that is a reason to start earlier, not later, because we will have to do it in any case. The longer it will take, the sooner we need to begin.