OmniLayer / omnicore

OmniCore staging tree
http://www.omnilayer.org/
MIT License

Meta DEx progress #9

Closed dexX7 closed 9 years ago

dexX7 commented 9 years ago

As a continuation of https://github.com/mastercoin-MSC/mastercore/issues/303.

In short: I coded the whole meta DEx test plan, and some other tests, as Python-based RPC tests some time ago, and the current omnicore-0.0.10 + wallet fix + cancel logic passes all of them.

Sooner rather than later, all tests should be converted into spock tests, which are much, much nicer. Once a test base stands, all tests can hopefully be added via a spreadsheet, similar to sto-testplan.tsv, which is literally processed line by line.

However, if someone would like to dig into it, no setup or bitcoin.conf is required - only:

git clone https://github.com/dexX7/mastercore-rpc-tests.git
./mastercore-rpc-tests/run_mdex_tests.py --daemon=/path-to/omnicore-0.0.10/src/bitcoind --cli=/path-to/omnicore-0.0.10/src/bitcoin-cli

This executes MetaDexPlanTest, MetaDexCancelAtPriceTest, MetaDexCancelPairAndLookupTest, DexCrossEcosystemSideEffectsTest, MetaDexCancelEverythingInSameEcosystemTest, MetaDexCancelEverythingIgnorePropertyTest and MetaDexCancelEverythingScopeTest.

Orderbook states and trades were added as comments for the MetaDexPlanTest.


So what's missing? First, the test plan is available here:

I'm actually not convinced this covers everything, and I'm sceptical regarding the rounding behavior of the current implementation, see mega thread for the general discussion: https://github.com/mastercoin-MSC/spec/issues/170.

Rounding is only barely tackled by the test plan, and further, the current test plan doesn't add up, because the actors have insufficient amounts of TMSC at the end of the last rows. This is a minor issue though.

At this point the most valuable missing pieces are probably additional test sequences, preferably in a format such as:

action state infos X state infos Y state infos Z ...
E1 offers x.xxx TInd for x.xxx TMSC ... ... ... ...
... ... ... ... ...

Where state infos could be expected balances, open offers, or similar, as in sto-testplan.tsv.

zathras-crypto commented 9 years ago

@dexX7 this is fantastic work (as always) :) thanks a lot!

Agreed, a more in-depth test plan should be devised. I remember when I first looked at it, I found it difficult to accept that something as complex as the MetaDEx could be tested (including all possible fringe cases) with just 10 scenarios comprising under 100 steps in total across them all. But since I'm not experienced in software testing, I can't speak with any authority on this and will simply say: yep, I support more in-depth testing 100%.

Regarding "all tests pass": this further highlights that the test plan is insufficient. There are known bugs in the math according to @m21 (he said we should see them pretty easily), so if everything is passing, those math issues are not being picked up by the testing.

P.S. The spreadsheet shows a bunch of failure results - do these now pass, is that correct?

Thanks again dude

dexX7 commented 9 years ago

There are known bugs in the math according to @m21

It would be incredibly useful if those were documented.

The spreadsheet shows a bunch of failure results - these now pass is that correct?

@faizkhan00 put a lot of good work into testing the plan on testnet, but the plan was changed here and there in the meantime, if I recall correctly. But if you were referring to the rows marked invalid - that's the result which should be invalid, e.g. when trying to trade non-existing properties.

Good tests don't require millions of lines, but rather breaking down the problem, ideally starting with unit tests.

I agree nevertheless, and what I currently miss are tests that tackle rounding issues.

zathras-crypto commented 9 years ago

It would be incredibly useful if those were documented.

Agreed, I recall @m21 saying they should be easy to spot, but paging @m21 - do you have any notes on those MetaDEx math bugs you mentioned bud?

But if you were referring to the rows with invalid - that's the result which should be invalid, e.g. when trying to trade non-existing properties.

Personally I didn't like the way the spreadsheet gave results. Hover over most of those 'invalid' cells to see the tooltip - you'll note it's a fail result on a good percentage of them, e.g.: [screenshot]

Good tests don't require millions of lines, but rather breaking down the problem, ideally starting with unit tests.

Understood, and that's why I'm happy people like yourself, Marv and Sean are handling the testing aspects, because I just don't have the experience, sadly. Infrastructure testing leverages a lot of experience, and I can only assume software testing is a similar situation. So my thanks there to you guys :)

dexX7 commented 9 years ago

Personally I didn't like the way the spreadsheet gave results. Hover over most of those 'invalid' cells to see the tooltip - you'll note it's a fail result on a good percentage of them, ...

Expected result: invalid, actual result: fail => test passed.. ;)

Probably not very intuitive, yes. But as mentioned, the test plan was changed here and there, so I wouldn't read too much into those transactions. Rows 21 to 131 are coded, with some minor differences at the very beginning ("NOTE: The following tests do not reflect the test plan because ..."). Back then I spent quite some time accurately converting the plan into RPC tests, even though it's an interesting idea to broadcast test sequences and check them on-chain. @msgilligan had something similar in mind as well: https://github.com/msgilligan/bitcoin-spock/issues/23

The STO tests (and using spock in general!) were a game changer, because of the way they were created: it's possible to extend the tests simply by adding new lines to sto-testplan.tsv. Even though each line represents an atomic test, something similar could be done for whole test sequences.

Another awesome aspect of bitcoin-spock is that specifications can basically be coded line by line, here is an example: DexSpec.groovy + the test report. Most of the quoted lines were directly taken from spec#sell-mastercoins-for-bitcoins.


Anyway, sorry for the OT, and back to the actual topic. :)

zathras-crypto commented 9 years ago

Expected result: invalid, actual result: fail => test passed.. ;)

Yeah, that's - hmm, hehe. I've received many, many test reports over my years in infra, and when a test result is a fail, the test is a fail. We're getting into semantics now, but surely "Expected outcome vs actual outcome => result" or similar would be better? Again, perhaps it's just my lack of SW testing experience, but when I see "result: fail" on test reports, that leads me to believe the test has failed.

It's probably just the way I'm used to working :) I did, however, look up the first transaction for example, which is supposed to be 2.00000000 TMSC for 2.00000000 TDiv1 and which is invalidated because the TX version is >1. But when we look at the transaction, the amounts don't seem right and the version appears to be fine:

zathras@coredev01:~/github/build/omnicore$ src/bitcoin-cli gettransaction_MP ef918abe90aaa5fbd727edb4c727dbf2c3de3dcd4a74e7c42ca2bb0ff26f8288
{
    "txid" : "ef918abe90aaa5fbd727edb4c727dbf2c3de3dcd4a74e7c42ca2bb0ff26f8288",
    "sendingaddress" : "mkRyG8wpC4micZHz7Xj8T2UNeHnJa39S6M",
    "ismine" : false,
    "confirmations" : 20516,
    "fee" : 0.00010684,
    "blocktime" : 1416334923,
    "version" : 0,
    "type_int" : 21,
    "type" : "MetaDEx token trade",
    "amountoffered" : "0.20000000",
    "propertyoffered" : 1,
    "propertyofferedisdivisible" : true,
    "amountdesired" : "0.20000000",
    "propertydesired" : 18,
    "propertydesiredisdivisible" : true,
    "action" : "new sell",
    "valid" : false
}

However that version display is clearly inaccurate, and there are some interesting things happening on the back end... Black holing??? What!!! haha

2015-03-25 07:19:30 version: 3, Class B
2015-03-25 07:19:30                 type: 21 (MetaDEx token trade)
2015-03-25 07:19:30             property: 1 (MSC)
2015-03-25 07:19:30                value: 0.20000000
2015-03-25 07:19:30 Black Hole identified !!! 308996, 1, 21, 3
2015-03-25 07:19:30 !!! interpretPacket() returned -80888 !!!

EDIT: In case not clear, I thought we had abandoned the idea of black holing.

dexX7 commented 9 years ago

Again perhaps it's just my lack of SW testing experience, but when I see "result: fail" on test reports, that leads me to believe the test has failed.

Hehe yeah, I agree, the more expressive and intuitive, the better.

EDIT: In case not clear, I thought we had abandoned the idea of black holing.

Sort of.. it actually has no effect at the moment, see src/mastercore.cpp#L974-L982.

I do have some ideas in this context, but this is probably material for another thread.

m21 commented 9 years ago

Hm, been a while, I can't seem to find the specifics of the issue I was thinking about, but some general notes:

zathras-crypto commented 9 years ago

Yeah, I was surprised to see it still there - IIRC black holing was dropped in favour of an alert system that could shut down outdated clients (one of the alert types leverages isTransactionTypeAllowed to determine whether the client supports a new message type).

Note this can be considered sensitive, so I did add a CLI option to override the forced shutdown (lest we be accused of having a centralized kill switch).

zathras-crypto commented 9 years ago

Hey @m21 :) thanks for the info!

zathras-crypto commented 9 years ago

Quick one, regarding the transaction we were discussing in the meeting - I've dumped a debug parse here http://pastie.org/pastes/10097075/text?key=zyhapet9kp6npfj93w9grg for anyone interested - about to start looking through it.

dexX7 commented 9 years ago

Hmm.. I just noticed my remote testnet node (running 0.9) went into safe mode four days ago (and remains in that state after restarting). Probably when you mentioned the crazy number of new blocks: "Safe mode: Warning: The network does not appear to fully agree! Some miners appear to be experiencing issues."

Let's see if the 0.10-based Omni Core can handle it. :)

dexX7 commented 9 years ago

Somewhat shorter log: https://gist.githubusercontent.com/dexX7/1853bfe4d58a2595334f/raw/4617bf94f897184ce824a400b6eeab8ae5400f01/faultytrade.log

The bad state was apparently created via 84ceae5eec2b933d170f63955fa19533da0a3ea02fdfa1ee51dc23e9f5b61e32:

A2 offers 11.50000000 MSC, and desires 11.50000000 SPX   @  1.00000000                 SPX/MSC
A1 offers 67.72727273 SPX, and desires  6.77272727 MSC   @ 10.000000004429530203125985 SPX/MSC

6.77272727 MSC are traded for 67.72727270 SPX

A1 gets 6.77272727 MSC, and gives 67.72727270 SPX
A2 gets 67.72727270 SPX, and gives 6.77272727 MSC

A1 still has 0.00000003 SPX for sale. The updated desired amount is 0 MSC. (!)
A2 still has 4.72727273 MSC for sale. The updated desired amount is 4.72727273 SPX.

This results in a new offer of 0.00000003 SPX for 0.00000000 MSC from A1. (!)
And another offer of 4.72727273 MSC for 4.72727273 SPX from A2.

Edit: when executing this trade alone it matches and the order of A1 is filled completely.. maybe I read the log wrong.

marv-engine commented 9 years ago

As long as there's still a non-zero amount for sale, it shouldn't matter that the amount desired is zero. It's perfectly ok for a seller to receive more than the desired amount. And, the matching is done based on the original unit price, not the remaining amount desired.

As for this log entry:

inf= 0.00000000000000000000000000000000000000000000000000:mpZATHupfCLqet5N1YL48ByCM1ZBfddbGJ in 306356/010, txid: 75b687ca67 , trade #12 0.00000003 for #1 0.00000000

It appears to be related, and it's not clear why the unit price and reciprocal have changed from their original values.

An order's unit price never changes.

From the spec (highlighted for emphasis):

If there are no matches for the new sell order or the aggregate amount desired in the matching orders is less than the amount for sale in the new sell order, the new sell order must be added to the list of existing sell orders, with the remaining amount for sale at the original unit price. This order is now a candidate for matching against future sell orders. Note that when only some coins from an existing order are purchased, the remaining coins from that order are still for sale at the original unit price.

There is a test to sell the largest possible amount of indiv for the smallest possible amount of div desired:

unit price = 0.00000001 / 9,223,372,036,854,775,807
unit price = 0.000000000000000000000000001084 (30 decimal digits)

So, a unit price can never be calculated as zero, using at least 27 decimal digits.
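
To make the precision requirement concrete, here is a minimal standalone sketch (my own illustration, not code from the repository) that computes this extreme unit price with the same boost cpp_dec_float_100 type the implementation already uses:

#include <boost/multiprecision/cpp_dec_float.hpp>
#include <iostream>

int main()
{
    using boost::multiprecision::cpp_dec_float_100;

    // smallest divisible amount desired vs. largest indivisible amount for sale
    cpp_dec_float_100 amount_desired("0.00000001");
    cpp_dec_float_100 amount_forsale("9223372036854775807");

    // unit price = amount desired / amount for sale
    cpp_dec_float_100 unit_price = amount_desired / amount_forsale;

    // prints roughly 0.000000000000000000000000001084 (26 leading zeros)
    std::cout << unit_price.str(30, std::ios_base::fixed) << std::endl;
    return 0;
}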

dexX7 commented 9 years ago

@marv-engine: I'm still in the phase where I figure out the implementation details, but after the very first trade:

A1 offers 100.00000000 SPX for 10.00000000 MSC
A2 offers 0.80000000 MSC for 7.27272727 SPX

A1 receives 0.80000000 MSC, and has 92.72727273 SPX left for sale (A1's offer is updated)
A2 receives 7.27272727 SPX, and has 0.00000000 MSC left for sale (A2's offer is erased completely)

A1's offer is updated to: 92.72727273 SPX for 9.27272727 MSC

There is this debug line:

PRICE CHECK TRADED_MOREINBUYER: buyer = 0.10000000000000000000000000000000000000000000000000 , inserted = 0.09999999996764705882448096885810350091593813232600 : PROBLEM!

... and this slightly off unit price is due to the ratio of 9.27272727/92.72727273.

Edit:

I think this might be the underlying issue: the original unit price is not honored 100%, but only approximated by a representable ratio, which carries through to the following trades.

I'm further not sure whether an offer of 0.00000003 SPX for 0.00000000 MSC is another issue, or a result of this, and whether such an offer should ever exist (i.e. be added to the list of offers).

marv-engine commented 9 years ago

@dexX7: The unit price is calculated once, based on the original amounts, in this case "A1 offers 100.00000000 SPX for 10.00000000 MSC".

After the match and transfers, the remaining amount for sale "92.72727273 SPX" is still offered at the original unit price (0.10).

Can you see why/where the unit price is recalculated? That's not supposed to happen.

dexX7 commented 9 years ago

Sorry, I actually flipped the initial trade in the post before. It's:

A1 offers 100.00000000 SPX for 10.00000000 MSC
A2 offers 0.80000000 MSC for 7.27272727 SPX

...and not vice versa. This makes a difference, and in this order the unit price is "updated".

@marv-engine: as far as I can see, there is no concept of an original unit price, or original amounts desired and offered - only those two amounts, which are updated.

My test setup for this is a bit hacky right now, but I think it would help to have a base test template for bitcoin spock, which ideally uses a TSV file as input, and simply dumps results.

zathras-crypto commented 9 years ago

The bad state was apparently created via 84ceae5eec2b933d170f63955fa19533da0a3ea02fdfa1ee51dc23e9f5b61e32:

It appears to me that things started falling apart one transaction before that, in 8aeb5cdbd282ea2a49e92d94c7f7aa201195ca6377ddb1808b62c46b1a359362

http://pastie.org/pastes/10100648/text?key=9cueyytgdv61v79sad8xw

dexX7 commented 9 years ago

How do you define "falling apart" in this context?

zathras-crypto commented 9 years ago

How do you define "falling apart" in this context?

I may be wrong - that was from a very brief look. As you probably guessed by now I'm working on some extra code to help us identify where and when things go wrong - my comment above was just based on that (nothing in-depth).

Specifically I see

Auditor has detected an invalid trade (txid: 75b687ca67f580d2ab1f7b88afc30fb1893d6eae4d40e83e6a596e49d21801a5) present in the MetaDEx following block 307607

so I looked and saw that 75b687ca67 was not parsed in that block, so other trades must have affected it - so I looked at the trades in block 307607 and saw two trades (8aeb5cdbd2 & 84ceae5ee). Both trades are flagged as `PROBLEM!` and 8aeb5cdbd2 was first in the sequence. Hence the idea that things started going wrong in 8aeb5cdbd2.

As I say, not thorough at all - I've somewhat segued for a day or two to build this auditing thing, because I believe overall it'll make tracking down these bugs easier/faster.

zathras-crypto commented 9 years ago

Just a quick note, the unit price does in fact change - often. Using the auditor:

2015-04-19 01:37:13 ++ inserting seller_replacement: 4.34782608695652173913043478260869565217391304347826:mxaYwMv2Brbs7CW9r5aYuEr1jKTSDXg1TH in 272804/003, txid: fbeeee3116 , trade #2147483652 11.50000000 for #2 50.00000000
2015-04-19 01:37:13 x_Trade()=2:TRADED_MOREINSELLER
2015-04-19 01:37:13 recordTX(37502eb5ddb7779d14844560649d78f1aa11d3c0cded3d7408204dee84e3acc4, valid=YES, block= 272807, type= 21, value= 5000000000)
2015-04-19 01:37:13 Auditor has detected an invalid trade (txid: fbeeee3116cc98def24b066a96bce98b694f4538a5d0bb6e10e7640131f26899) present in the MetaDEx following block 272807
2015-04-19 01:37:13 Reason: Effective price has changed since original trade
Original price:4.34782608695652173913043478260869565217391304347826086956521739130434782608695652173913043478260869565217391304347826087300000
   State price:4.34782608695652173913043478260869565217391304347826086956521739130434782608695652173913043478260869565217391304347826087400000
2015-04-19 01:37:13 Shutdown requested, stop scan at block 272808 of 349410

Need to really up the precision to see it sometimes though (eg 120th digit).

dexX7 commented 9 years ago

@zathras-crypto: as far as I can see, this is because the original unit price isn't stored at all.

I don't feel very comfortable with using floating point numbers, and given the unexpected behavior of STO at the beginning, I can only assume this is worse for the meta DEx, given that there are even more calculations, which are further mixed with string -> number conversions, for example here:

cpp_dec_float_100 seller_amount_stilldesired = (cpp_dec_float_100) seller_amountLeft * sellers_price;
seller_amount_stilldesired += (cpp_dec_float_100) 0.5; // ROUND UP
std::string str_amount_stilldesired = seller_amount_stilldesired.str(INTERNAL_PRECISION_LEN, std::ios_base::fixed);
std::string str_stilldesired_int_part = str_amount_stilldesired.substr(0, str_amount_stilldesired.find_first_of("."));

CMPMetaDEx seller_replacement = *p_older;
seller_replacement.setAmountForSale(seller_amountLeft, "seller_replacement");
seller_replacement.setAmountDesired(boost::lexical_cast<int64_t>(str_stilldesired_int_part), "seller_replacement");

But that said, I haven't really looked at the code yet, and this just looks scary.. :)

I'm currently trying to extend bitcoin-spock for additional test support.

zathras-crypto commented 9 years ago

@zathras-crypto: as far as I can see, this is because the original unit price isn't stored at all.

Yikes...

I'd just been kind of feeding off some of the testing discussion - one of those discussions related to how unit prices should not be allowed to change, so I codified that in the auditor. I see that an existing trade is replaced with new values after a partial trade, so I guess this is where the problem lies...

I don't feel very comfortable with using floating point numbers

I'm not too knowledgeable on this topic, Faiz & Michael talked together a fair bit about integer vs float but I was working on different aspects and didn't get too involved. I believe the end outcome was that our token precision is a maximum of 8 decimal digits, so using such a large float (cpp_dec_float_100 is, you guessed it, 100 decimal digits) would take care of any concerns around precision.

But that said, I haven't really looked at the code yet

Same, I'm by no means fully familiar with the MetaDEx stuff yet either

ghost commented 9 years ago

I'm not too knowledgeable on this topic, Faiz & Michael talked together a fair bit about integer vs float but I was working on different aspects and didn't get too involved. I believe the end outcome was that our token precision is a maximum of 8 decimal digits, so using such a large float (cpp_dec_float_100 is, you guessed it, 100 decimal digits) would take care of any concerns around precision.

Essentially this - the code for the metadex had to separate the fractional and whole parts in its calculations, which is why some of those calculations look so strange. I think the simplification is to use some kind of library to do arbitrary precision fractional calculations and then keep them in fraction form until they're ready to be displayed. I attempted this, but the lack of libraries meant I wrote the calculations out by hand (probably due to time constraints). If anyone can recommend a library I might have a crack at a refactor later.

zathras-crypto commented 9 years ago

Essentially...

Holy Hallucination Batman!!! Could that really be Faiz!?!?!

Hello mate :) Been too long, hope things are cool for you and things are going well :)

ghost commented 9 years ago

Holy Hallucination Batman!!! Could that really be Faiz!?!?!

Hello mate :) Been too long, hope things are cool for you and things are going well :)

Heh, yeah - if I have time this week I'll certainly investigate that library in an attempt to fix the code. I know it looks hairy, I was meaning to slap a big TODO on it, but never had the chance... :)

m21 commented 9 years ago

Wow, the old team's back :) On metadex:

  1. The price should not be allowed to change -- agreed, thus the original ratio must always be preserved.
  2. Stay away from floats until the last moment; use them only to derive the price for matching (yes, 100 digits), perhaps derive it at every iteration, then forget the derived floats ASAP and rely on ints 99% of the time.
  3. The older trade must always be satisfied to the fullest; the newer trade may get fewer tokens (depending on the divisibility & whatnot). Whoever didn't get satisfied will get his trade re-entered into the market with the original ratio (effective price) and fewer wanted tokens.

dexX7 commented 9 years ago

Haha, welcome back! :)

If anyone can recommend a library I might have a crack at a refactor later.

I'm wondering if 128-bit-wide integers could be used, similar to the plain integer STO logic, as described here: https://github.com/mastercoin-MSC/spec/issues/301#issuecomment-71384036 https://github.com/mastercoin-MSC/mastercore/pull/275 https://github.com/mastercoin-MSC/mastercore/issues/273

Further, it would be very helpful if x_Trade() were refactored, with the match finding and the loop externalized. x_Trade() could then consume two CMPMetaDEx objects, the new and the old trade. Ideally, indexes->erase, indexes->insert, t_tradelistdb->recordTrade and update_tally_map should be moved outside of that new x_Trade() as well. The end goal is unit testing of x_Trade(), which cannot depend on some internal state, but must be self-sustaining.

@m21 a question to you:

That leftover of 3 units, causing an update to for sale = 3, desired = 0, resulting in a unit price of 0 and an inverse of inf: is this the result of the slightly off unit price, or is there a missing check that should prevent the insertion of such trade objects?

CraigSellars commented 9 years ago

@faizkhan00 - awesome to have you taking a peek back into the guts!

dexX7 commented 9 years ago

There is clearly a pattern ... :)

Here is the trade sequence of the testnet trade and some others I tested to trigger trading of zero amounts:

The section on the left describes the trade, the sections to the right describe the match and the results, whereby the labels and values were pulled from the log file.

zathras-crypto commented 9 years ago

Looking at the math here, is it even possible to have the unit price unchanged?

Let's take an example. I set up the following trade - Amount for sale: 55555556, Amount desired: 500000003. The price is:

0.1111111113333333320000000079999999520000002879999982720000103679999377920003732479977605120134369279

Then let's say 22222222 is sold in a partial trade. This leaves 33333334 remaining for sale.

Now, as far as I can tell, there is no possible (valid) amount desired value that can apply this same price. To apply the original price, the amount desired would need to be 300000005.39999999280000005759...

Since this amount desired is not valid, the closest valid amount desired is 300000005. But as soon as you round the amount desired to valid values, the price is no longer the same. Even if we 'store' the original price we can't then apply it because we can't trade 33333334 for 300000005.39999999280000005759.., we can only trade for 300000005 (which is a different price).

TL;DR: after a partial trade, is it even possible to conduct all remaining trades at exactly the same unit price?

Amount remaining for sale: 55555556
Amount desired: 500000003   : Price: 0.1111111113333333320000000079999999520000002879999982720000103679999377920003732479977605120134369279

Amount remaining for sale: 33333334
Amount desired: 300000004   : Price: 0.1111111118518518419753087736625496844993375400088327998822293349036088679518817606415765247789796696
Amount desired: 300000005   : Price: 0.1111111114814814753086420781892986968450217192496380125060331248994479183425346942910884284818595253
Amount desired: 300000006   : Price: 0.1111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111
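
As a quick standalone check of this (a sketch of mine, not omnicore code), the same comparison can be done exactly with 128-bit integer cross-multiplication: two prices forsale1/desired1 and forsale2/desired2 are equal iff forsale1 * desired2 == forsale2 * desired1.

#include <boost/multiprecision/cpp_int.hpp>
#include <cstdint>
#include <iostream>

int main()
{
    using boost::multiprecision::int128_t;

    const int128_t orig_forsale = 55555556, orig_desired = 500000003;
    const int128_t left_forsale = 33333334;

    for (int64_t desired = 300000004; desired <= 300000006; ++desired) {
        // equal prices <=> orig_forsale * desired == left_forsale * orig_desired
        const bool same_price = (orig_forsale * desired == left_forsale * orig_desired);
        std::cout << desired << (same_price ? ": same price" : ": price differs") << std::endl;
    }
    // prints "price differs" for all three candidates
    return 0;
}
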
m21 commented 9 years ago

That leftover of 3 units, causing an update to for sale = 3, desired = 0, resulting in a unit price of 0 and an inverse of inf: is this the result of the slightly off unit price, or is there a missing check that should prevent the insertion of such trade objects?

The price should not be updated - that's my reason number 1 above: store the original price ratio.

m21 commented 9 years ago

Looking at the math here, is it even possible to have the unit price unchanged?

We need to separate satisfying an order from re-entering the remaining order into the market (yes, at the original price ratio).

To Dexx: 128 bits didn't work for us. Even 50 digits after the decimal point wasn't enough. That's why 100 digits is the correct number.

dexX7 commented 9 years ago

Looking at the math here, is it even possible to have the unit price unchanged?

https://github.com/mastercoin-MSC/spec/issues/170#issuecomment-44072344

This is a mega thread, with a lot of noise and probably some false information, but the linked post basically summarizes the essence: when calculating the amounts to trade, the unit price is rounded up to the next tradable unit, constrained by the amounts available; however, for the matching and for partially filled orders, the original unit price should prevail. Right now it looks like the price could slowly drift, because the original unit price is not stored.

Nevertheless, and re:

The price should not be updated - that's my reason number 1 above: store the original price ratio.

I'm not convinced that the drifting of unit prices causes zero trades; the issue probably starts when offers that still have some amount left for sale, but nothing left desired, are added back to the book:

Here is an example, where A2 ends up with for sale = 0.00000049 and desired = 0.00000000:

A1: offer 0.00005100 TMSC  for 0.00000051 TDiv1
A2: offer 1.00000000 TDiv1 for 0.01000000 TMSC   <-- fills A1 completely
A3: offer 0.00999999 TMSC  for 0.99999900 TDiv1  <-- is filled by A2, A2 now has 0.00000049 for sale, but 0.00000000 desired
A4: offer 0.00000049 TMSC  for 0.00000049 TDiv1  <-- matches with A2, but it's a zero amount trade

RPC output with some extra info (using gettrade_MP, getorderbook_MP): https://gist.githubusercontent.com/dexX7/45097058b357cc82d5ce/raw/

Debug log, with some extra output for the desired amount: https://gist.githubusercontent.com/dexX7/abe123c5e37f722aae9b/raw/

From another point of view you're right though: if the unit price is fixed, and the amount desired is calculated based on that fixed unit price and the amount for sale, then this does indeed make a significant difference.

Edit: actually, the desired amount is indeed calculated based on unit price and amount available, but it basically comes down to:

amount desired = amount left for sale * unit price
               = 0.00000049 * 0.01
               = 0.0000000049
               = 0
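
For illustration, the same truncation in integer base units (a minimal sketch of mine, not the actual implementation; assuming 1 token = 10^8 base units):

#include <cassert>
#include <cstdint>

int main()
{
    const int64_t amount_left_forsale = 49; // 0.00000049 TDiv1 in base units
    // A2's original unit price: 0.01 TMSC per TDiv1, i.e. the ratio 1/100
    const int64_t price_num = 1;
    const int64_t price_denom = 100;

    // amount desired = amount left for sale * unit price, truncated to whole base units
    const int64_t amount_desired = amount_left_forsale * price_num / price_denom;

    assert(amount_desired == 0); // 49 / 100 truncates to zero -> a "0 desired" offer
    return 0;
}
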
dexX7 commented 9 years ago
diff --git a/src/mastercore_dex.cpp b/src/mastercore_dex.cpp
index 0e855ce..4e8bf25 100644
--- a/src/mastercore_dex.cpp
+++ b/src/mastercore_dex.cpp
@@ -281,7 +281,7 @@ const XDOUBLE desprice = (1/buyersprice); // inverse, to be matched against that
       if (bBuyerSatisfied)
       {
         // insert the updated one in place of the old
-        if (0 < seller_replacement.getAmountForSale())
+        if (0 < seller_replacement.getAmountForSale() && 0 < seller_replacement.getAmountDesired())
         {
           file_log("++ inserting seller_replacement: %s\n", seller_replacement.ToString());
           indexes->insert(seller_replacement);
ghost commented 9 years ago

Side note: it seems that Boost comes with a rational number class (http://www.boost.org/doc/libs/1_58_0/libs/rational/rational.html) which might simplify the metadex calculations a fair bit. I'll need to do some testing to see if the implementation can support an unlimited precision type on the backend, but given that, we might have a shot at a simpler implementation of the math.
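
As a rough idea of what that could look like (a sketch under the assumption that all amounts fit into int64_t, not a tested implementation), the drift from the PRICE CHECK example earlier in the thread becomes directly visible when prices are kept as exact fractions:

#include <boost/rational.hpp>
#include <cstdint>
#include <iostream>

int main()
{
    typedef boost::rational<int64_t> UnitPrice; // price = amount desired / amount for sale

    // original offer: 100.00000000 SPX for 10.00000000 MSC (amounts in base units)
    const UnitPrice original(1000000000LL, 10000000000LL);  // exactly 1/10

    // replacement offer after the partial fill: 92.72727273 SPX for 9.27272727 MSC
    const UnitPrice replacement(927272727LL, 9272727273LL); // slightly below 1/10

    std::cout << (replacement == original) << std::endl; // 0: the prices differ
    std::cout << (replacement < original) << std::endl;  // 1: the stored price drifted down
    return 0;
}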

dexX7 commented 9 years ago

Hey @faizkhan00: this is an interesting library. May I ask in which cases you guys faced trouble with precision? I'm currently looking at the core math, and it looks like there isn't really an incredible number of values and operations involved.

Each offer, in its most basic form (without property identifiers or block positions), has four properties, and we need to calculate three values on each side, and further determine whether an offer is cheaper than another offer. I haven't tested it yet with any values, and a second pair of eyes would be appreciated:

But if that is correct, and the most complex calculation is basically ceil(a * b / c), then 128-bit-wide integers are sufficient to cover the full range, see: https://github.com/mastercoin-MSC/mastercore/pull/275#issue-55376106

ghost commented 9 years ago

@dexX7 those values seem mostly correct - the key trouble area would be: how are you handling the fractional part of those calculations? Recall that the orderbook calculations need to retain a certain precision for there not to be rounding errors - this is mainly why 128-bit integers alone don't suffice; you would need (IIRC) two arbitrary precision integers to have the whole part calculated to some high accuracy, and the same for the fractional part.

You can see an example of this here: https://github.com/OmniLayer/omnicore/blob/omnicore-0.0.10/src/mastercore.cpp#L698-L733

Basically, it makes sense to use 128-bit ints for some calculations, and it's incorrect to use them in other areas, depending on the level of precision needed for A) the calculation intermediaries and B) the end result of the calculations.

dexX7 commented 9 years ago

recall that the orderbook calculations need to retain a certain precision for there not to be rounding errors

Please correct me if I'm wrong, but all we need to know is whether some order is cheaper (or more expensive) than another order, to find a match.

Further, what I forgot earlier: we also need to determine whether some order's price equals another order's price, for the cancel logic.

m21 commented 9 years ago

Sure, multiplying two 64-bit numbers yields at most a 128-bit number. But that's totally unrelated to the price precision discussion, where we have to be able to discern the difference between 1/9gazillion and 2/9gazillion. Trial & error showed us 100 decimal places was proper.

ghost commented 9 years ago

Trial & error showed us 100 decimal places was proper.

Yeah, and looking over the MetaDex code again (whew, it's been a while!) I want to roll back the amount I thought we could optimize/clean up those calcs - they are pretty good as-is, except for the rational number handling, and I'm reasonably confident in saying that it probably isn't worth including new dependencies/build changes for only an approximately 20-line refactor (basically I think a couple of lines of comments achieves the goal of code clarity, more or less).

dexX7 commented 9 years ago

Trial & error showed us 100 decimal places was proper.

Aside from whether it's possible or not, having concrete numbers here would be extremely useful for testing and should become part of the test plan. So please - also in general, and not limited to the exchange - publish any edge cases or unexpected scenarios whenever you stumble upon one. :)

m21 commented 9 years ago

I unfortunately lost many of my notes recently. The simplest case here is to capture the meaning of 1 divided by maxint64 (as a price) and compare it to, say, 2 divided by maxint64. maxint64 is what I called the 9gazillion = 9223372036854775807

dexX7 commented 9 years ago

except for the rational number handling

Switching from boost::multiprecision::cpp_dec_float_100 to boost::rational<int64_t> (or something similar) could guard against potential pitfalls, so I support the general idea - and this is what I was getting at in the first place.. :)

It's an open question though, whether all this may actually have an effect for the numbers we use.

The simplest case here is ...

Ah, thanks, I see. Here we go:

/**
 * price_lhs = lhs_desired / lhs_forsale
 * price_rhs = rhs_desired / rhs_forsale
 *
 * price_lhs < price_rhs
 * =>
 *   int128_t(lhs_desired) * int128_t(rhs_forsale) <
 *       int128_t(rhs_desired) * int128_t(lhs_forsale)
 *
 * price_lhs != price_rhs:
 * =>
 *   int128_t(lhs_desired) * int128_t(rhs_forsale) !=
 *       int128_t(rhs_desired) * int128_t(lhs_forsale)
 */
BOOST_CHECK(int128_t(1) * int128_t(INT64_MAX) <  int128_t(2) * int128_t(INT64_MAX));
BOOST_CHECK(int128_t(1) * int128_t(INT64_MAX) != int128_t(2) * int128_t(INT64_MAX));

Edit: see here for the DEx related calculations (untested).

m21 commented 9 years ago

Hm, replacing division by multiplication? How would 1/2 be != 2/1?

dexX7 commented 9 years ago

Hm, replacing division by multiplication?

Yes, to avoid fractions and the hassle with floating point numbers altogether.

How would 1/2 be != 2/1?

a/b != c/d <=> a*d != c*b  // generalized
1/2 != 2/1 <=> 1*1 != 2*2
2/4 != 6/3 <=> 2*3 != 6*4
...

This works for other operations as well, which might be handy to compare orders:

a/b == c/d <=> a*d == c*b
1/2 == 1/2 <=> 1*2 == 1*2
6/9 == 2/3 <=> 6*3 == 2*9
a/b <  c/d <=> a*d <  c*b
a/b >  c/d <=> a*d >  c*b
a/b <= c/d <=> a*d <= c*b
a/b >= c/d <=> a*d >= c*b
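
(These equivalences hold because the denominators are amounts and therefore positive: multiplying both sides of a/b < c/d by b*d > 0 preserves the direction of the inequality and yields a*d < c*b; the same argument covers ==, !=, <= and >=.)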

Now in the context of the DEx, we have for example:

// ..this is equal to (XDOUBLE) seller_amountGot / sellers_price;
XDOUBLE x_buyer_got = (XDOUBLE) seller_amountGot / ((XDOUBLE) seller_amountWanted / (XDOUBLE) seller_amountForSale);
x_buyer_got += (XDOUBLE) 0.5; // ROUND UP
std::string str_buyer_got = x_buyer_got.str(INTERNAL_PRECISION_LEN, std::ios_base::fixed);
std::string str_buyer_got_int_part = str_buyer_got.substr(0, str_buyer_got.find_first_of("."));
const int64_t buyer_amountGot = boost::lexical_cast<int64_t>(str_buyer_got_int_part);

... which can be replaced by (tested):

// for integer rounding up: ceil(num / denom) => 1 + (num - 1) / denom
int128_t x_buyer_got = 1 + ((int128_t) seller_amountGot * (int128_t) seller_amountForSale - 1) / (int128_t) seller_amountWanted;
const int64_t buyer_amountGot = x_buyer_got.convert_to<int64_t>();

This is basically a trick similar to the one used for send-to-owners, to get rid of those precision issues, which caused some trouble for the STO test Marv created.. :)

I'm more and more confident that it's possible, in similar fashion, to avoid any use of XDOUBLE (... except for text outputs probably), and to prevent issues related to a loss of precision.

zathras-crypto commented 9 years ago

Re storing original prices, my initial thought would be to expand the CMPMetaDEx class with two extra fields to store the original AmountDesired and AmountForSale values rather than attempting to store the price itself.

It looks like there may have been some thought down this road, as the CMPMetaDEx class already contains

uint64_t amount_forsale; // the amount for sale specified when the offer was placed
uint64_t still_left_forsale;

but from reviewing the code flow it looks like still_left_forsale is never used and amount_forsale is changed away from the original amount to whatever is currently for sale.

If we expand the class to keep those two original values, we'll always have the original price available.
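
A rough sketch of that direction (field and accessor names are hypothetical here, not the actual class definition):

#include <cstdint>

class CMPMetaDEx
{
    // ... existing members, simplified ...
    uint64_t amount_forsale;           // mutated to whatever is currently for sale
    uint64_t amount_desired;           // mutated alongside it on partial fills

    // hypothetical additions: the amounts from the original transaction
    uint64_t amount_forsale_original;  // never changes after the offer is placed
    uint64_t amount_desired_original;  // never changes after the offer is placed

public:
    // the original price is then always available as the exact ratio
    // amount_desired_original / amount_forsale_original (comparable via cross-multiplication)
    uint64_t getOriginalAmountForSale() const { return amount_forsale_original; }
    uint64_t getOriginalAmountDesired() const { return amount_desired_original; }
};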

dexX7 commented 9 years ago

... my initial thought would be to expand the CMPMetaDEx class with two extra fields to store the original AmountDesired and AmountForSale values

I support this direction. This likely requires an extension in the context of data persistence, as well, which I haven't looked into yet.

dexX7 commented 9 years ago

This thread has become very large, and several issues and topics are being discussed at the same time. If there are no objections, I'd like to close this issue (edit: or use it for general discussion related to the meta DEx), and suggest moving to separate threads from here, to focus on one point at a time.

So far we have:

I labeled the related issues, but we could also tag them with a milestone.

CraigSellars commented 9 years ago

@dexX7, thanks for consolidating the discussion into individual issues. I have no objections to closing this thread, unless, as you said, others wish to leave it open for discussion.

In any event, let's at least move the discussion for the items listed above into their individual issues.

zathras-crypto commented 9 years ago

^^^ what he said :)