Closed: michaelpj closed this issue 3 years ago
Are we ever likely to want to do something with all of the tokens for a particular currency? That would presumably become more difficult.
That would presumably become more difficult.
Yeah, you'd have to iterate over all the entries in the map. Having the nested maps effectively gives you an index on the currency ID, which is nice for querying but less good for updating.
Having said that: we probably do more querying than updating on-chain....
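To make that querying/updating tradeoff concrete, here is a small off-chain sketch (not the actual Plutus code; `Currency`, `Token` and `Quantity` are just placeholder synonyms) of what "get all the tokens for a currency" looks like under the nested and flattened representations:

```haskell
import           Data.Map (Map)
import qualified Data.Map as Map

-- Placeholder types standing in for the real ones in this discussion.
type Currency = String
type Token    = String
type Quantity = Integer

-- Nested ("curried") representation: the outer map is effectively an index
-- on the currency id, so fetching all of a currency's tokens is one lookup.
type ValueNested = Map Currency (Map Token Quantity)

tokensOfNested :: Currency -> ValueNested -> Map Token Quantity
tokensOfNested = Map.findWithDefault Map.empty

-- Flattened ("uncurried") representation: the same query has to walk every
-- entry and keep the ones whose key mentions the currency.
type ValueFlat = Map (Currency, Token) Quantity

tokensOfFlat :: Currency -> ValueFlat -> Map Token Quantity
tokensOfFlat c = Map.mapKeys snd . Map.filterWithKey (\(c', _) _ -> c' == c)
```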
If necessary we might be able to implement something that covers both situations reasonably efficiently: for example, you could maybe have some kind of structure where all of the pairs (X, Y) for a fixed X form a contiguous interval, together with an index saying where the entries for the different Xs start and finish. That would take up extra room and might be more trouble than it's worth though.
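Purely to illustrate the idea, here is one possible reading of that structure, with invented names (`IndexedValue`, `entriesFor`) and ordinary off-chain types; it is a sketch of where the extra bookkeeping would live, not a proposal:

```haskell
import           Data.Map    (Map)
import qualified Data.Map    as Map
import           Data.Vector (Vector)
import qualified Data.Vector as V

-- All (x, y) pairs live in one array, sorted so that entries sharing the
-- same x are adjacent; a side index records each x's slice as
-- (offset, length).  Types and names here are made up for illustration.
data IndexedValue x y q = IndexedValue
  { ivEntries :: Vector ((x, y), q)
  , ivIndex   :: Map x (Int, Int)
  }

-- Getting everything for one x is a lookup plus a slice ...
entriesFor :: Ord x => x -> IndexedValue x y q -> Vector ((x, y), q)
entriesFor x iv = case Map.lookup x (ivIndex iv) of
  Nothing         -> V.empty
  Just (off, len) -> V.slice off len (ivEntries iv)

-- ... but inserting a new pair can shift the offsets of every later x,
-- which is where the extra room and the extra update cost come from.
```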
This raises the question of exactly how efficient Scott-encoded data structures are, which is something I don't have a good feel for. The compilation paper has a good specification of how the translation is done, so in theory we might be able to prove something about how complexity bounds are transformed when you encode something (at least if we pretend that the Haskell code is strict). I'm not proposing that we actually do that, but do you have a feel for what's going on, @michaelpj ? Will we always get a linear increase in time/space usage, for example? It might be useful to know what's going on because it could help us (or others) to implement data structures which would have good on-chain behaviour. I suppose you could always do that empirically though.
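For intuition, a hand-written sketch of Scott encoding for a list type (not the compiler's actual output) shows why one would expect the translation to mostly affect constant factors: a value becomes a function taking one continuation per constructor, and each pattern match becomes a single application.

```haskell
{-# LANGUAGE RankNTypes #-}

-- A value of an algebraic data type is encoded as a function that takes one
-- continuation per constructor; pattern matching becomes application.
newtype ScottList a =
  ScottList (forall r. r -> (a -> ScottList a -> r) -> r)

nil :: ScottList a
nil = ScottList (\n _c -> n)

cons :: a -> ScottList a -> ScottList a
cons x xs = ScottList (\_n c -> c x xs)

-- A case expression on the original type becomes a call that supplies the
-- two branches.  Each match costs one extra application, so a traversal
-- like this stays linear in the list length, just with a bigger constant.
sumList :: ScottList Integer -> Integer
sumList (ScottList match) = match 0 (\x xs -> x + sumList xs)
```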
Poking around, I found https://github.com/input-output-hk/plutus/pull/692 (about red-black trees for maps, from March-June). What's the status of that now?
If necessary we might be able to implement something that covers both situations reasonably efficiently ... That would take up extra room and might be more trouble than it's worth though.
Yeah, I think this would probably be more trouble than it's worth, but we could try it.
The compilation paper has a good specification of how the translation is done, so in theory we might be able to prove something about how complexity bounds are transformed when you encode something
I wouldn't expect the complexity to change - we're transforming the structure of the computation fairly faithfully.
Poking around, I found #692 (about red-black trees for maps, from March-June). What's the status of that now?
Yes, we absolutely want to have a better map implementation anyway. The current one has linear complexity for a bunch of things, which is terrible. I have a somewhat updated version of that, but I'm not pushing it since I believe @nau is going to look at this soon.
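For context on the "linear complexity" point: if the on-chain map is kept as a bare association list (which would explain the linear behaviour), the basic operations look roughly like the sketch below (the names `AssocMap`, `lookupA`, `insertA` are made up for illustration); a balanced tree such as the red-black trees in #692 would make both logarithmic.

```haskell
-- Illustrative only: a map kept as a plain list of key/value pairs.
newtype AssocMap k v = AssocMap [(k, v)]

-- Lookup walks the list from the front: O(n).
lookupA :: Eq k => k -> AssocMap k v -> Maybe v
lookupA k (AssocMap kvs) = go kvs
  where
    go []               = Nothing
    go ((k', v) : rest)
      | k == k'         = Just v
      | otherwise       = go rest

-- Insertion also scans for an existing key, so it is O(n) too; a balanced
-- tree would bring both operations down to O(log n).
insertA :: Eq k => k -> v -> AssocMap k v -> AssocMap k v
insertA k v (AssocMap kvs) = AssocMap (go kvs)
  where
    go []                 = [(k, v)]
    go ((k', v') : rest)
      | k == k'           = (k, v) : rest
      | otherwise         = (k', v') : go rest
```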
Right now the spec and the implementation both have `Value` as `Map Currency (Map Token Quantity)`. However, this is probably a bad implementation: `Ord` instances for `Map` types are unlikely to be good, and we'd want to try quite hard to make them lazy to avoid constantly looking at whole inner trees over and over. I think the "uncurried" version `Map (Currency, Token) Quantity` would probably be better. We would incur the cost that every key node would include the currency id bytestring, whereas with the curried version the key nodes of the inner map would only contain the token id. Unclear what the tradeoff is.
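As a concrete, purely illustrative comparison of the two candidate representations (using placeholder synonyms for the ids, which are not the real Plutus types):

```haskell
import           Data.Map (Map)
import qualified Data.Map as Map

-- Placeholder synonyms; in reality the currency id is a bytestring and is
-- likely to be the largest part of the key.
type Currency = String
type Token    = String
type Quantity = Integer

type ValueCurried   = Map Currency (Map Token Quantity)
type ValueUncurried = Map (Currency, Token) Quantity

-- Looking up one quantity: two small lookups versus one lookup on a compound
-- key.  The asymptotics match, but every key node of the uncurried map
-- carries the currency id alongside the token id, whereas in the curried map
-- only the outer nodes do.
quantityCurried :: Currency -> Token -> ValueCurried -> Maybe Quantity
quantityCurried c t v = Map.lookup c v >>= Map.lookup t

quantityUncurried :: Currency -> Token -> ValueUncurried -> Maybe Quantity
quantityUncurried c t = Map.lookup (c, t)
```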