sherlock-audit / 2024-02-rubicon-finance-judging


blutorque - `PartialFillLib::partition()` unexpectedly reverts with `PartialFillOverflow` error, due to rounding up the output tokens. #29

Closed: sherlock-admin2 closed this issue 7 months ago

sherlock-admin2 commented 7 months ago

blutorque

medium

PartialFillLib::partition() unexpectedly reverts with PartialFillOverflow error, due to rounding up the output tokens.

Summary

GladiusReactor.sol#resolve(order, quantity) uses partition() on the input/output tokens; any ask order can be partially filled with an input amount quantity such that: fillThreshold <= quantity <= order.input.amount.

In partition(), when the filled quantity equals order.input.amount, the output token amount (outPart) is expected to equal output[0].amount. The issue is that it does not, because mulDivUp rounds up by adding 1, which means outPart is always greater than output[0].amount in that case.

Check out the comments in solmate here: https://github.com/transmissions11/solmate/blob/c892309933b25c03d32b1b0d674df7ae292ba925/src/utils/FixedPointMathLib.sol#L66

    function _validatePartition(
        uint256 _quantity,
        uint256 _outPart,
        uint256 _initIn,
        uint256 _initOut,
        uint256 _fillThreshold
    ) internal pure {
        if (_quantity > _initIn || _outPart > _initOut) // <= @audit if quantity==inputAmount, _outPart always > _initOut, hence revert
            revert PartialFillOverflow();
        ...SNIP...
    }
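For context, here is a minimal sketch of the rounding semantics in question. It is a hypothetical helper, not the actual PartialFillLib code, assuming outPart is computed as quantity.mulDivUp(output[0].amount, input.amount) per the snippet linked below:

    // Hypothetical illustration of solmate's mulDivUp rounding behavior.
    // mulDivUp(x, y, d) computes ceil(x * y / d): it adds 1 to the floored
    // quotient whenever (x * y) % d > 0.
    function outPartCeil(
        uint256 quantity,
        uint256 initOut,
        uint256 initIn
    ) pure returns (uint256) {
        uint256 product = quantity * initOut; // 0.8.x checked math guards the overflow
        return product / initIn + (product % initIn > 0 ? 1 : 0);
    }

    // outPartCeil(1000, 100, 1000) == 100 (exact division, nothing is added)
    // outPartCeil(991, 100, 1000)  == 100 (ceil(99.1), rounded up from 99)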

Vulnerability Detail

See summary.

Impact

The order cannot be filled via the path above even though it is expected to be fillable. Note that the order can also be filled via resolve(order), but by the time the filler switches to that path, someone else may already have executed the order via resolve(order). Fillers may end up losing their winning arbitrage.

Code Snippet

https://github.com/sherlock-audit/2024-02-rubicon-finance/blob/11cac67919e8a1303b3a3177291b88c0c70bf03b/gladius-contracts-internal/src/lib/PartialFillLib.sol#L93-L101

Tool used

Manual Review

Recommendation

    function _validatePartition(
        uint256 _quantity,
        uint256 _outPart,
        uint256 _initIn,
        uint256 _initOut,
        uint256 _fillThreshold
    ) internal pure {
-       if (_quantity > _initIn || _outPart > _initOut)
+       if (_quantity > _initIn)
            revert PartialFillOverflow();
        if (_quantity == 0 || _outPart == 0) revert PartialFillUnderflow();
        if (_quantity < _fillThreshold) revert QuantityLtThreshold();
    }
sherlock-admin commented 7 months ago

One comment was left on this issue during the judging contest.

0xAadi commented:

daoio commented 7 months ago

it won't always add 1 wei, the full comment says:

If x * y modulo the denominator is strictly greater than 0, 1 is added to round up the division of x * y by the denominator.

ArnieSec commented 7 months ago

Escalate

this issue should be a valid medium

it won't always add 1 wei, the full comment says:

If x * y modulo the denominator is strictly greater than 0, 1 is added to round up the division of x * y by the denominator.

The bug is still relevant when the modulo is greater than 0, which will be the majority of the time. The only time the modulo is 0 is when the numbers are perfectly divisible by one another, which is unlikely. Most cases will trigger the bug, so the issue should be valid; just because it doesn't happen every time does not justify removing it as a valid medium...

sherlock-admin2 commented 7 months ago

You've created a valid escalation!

To remove the escalation from consideration: Delete your comment.

You may delete or edit your escalation comment anytime before the 48-hour escalation window closes. After that, the escalation becomes final.

ArnieSec commented 7 months ago

Furthermore, issue #51, which deals with a revert caused by mixing mulDivDown and mulDivUp in the fee calculation, is sponsor-confirmed and valid. This issue should therefore also be a valid medium, because it likewise causes reverts due to overflow from mulDivUp in the partition function.

daoio commented 7 months ago

here, the thing is that the report mentions that the bug occurs under a certain condition:

quantity is same as order.input.amount

which is basically a full fill of an order. But note that in such a case we end up with input.amount.mulDivUp(output[0].amount, input.amount), thus: $$\frac{xy}{x}$$ Simplifying, we end up with just $y$. So, referring to perfectly divisible numbers, here they are: $xy$ divided by $x$.

As an example

    // Assumes a Foundry test contract with forge-std's 'bound'/'assertEq'
    // helpers in scope and 'using FixedPointMathLib for uint256;' (solmate).
    /// forge-config: default.fuzz.runs = 8000
    /// @dev Parameters:
    ///      * o - output[0].amount
    ///      * i - input.amount
    /// @notice 'quantity' == 'i', thus we're ending up with:
    ///         (i * o) / i
    function test_Rounding(uint256 o, uint256 i) external {
        // avoid overflow
        o = bound(o, 0, type(uint128).max);
        // can't divide by 0, thus
        i = bound(i, 1, type(uint128).max);

        uint256 remainder = (i * o) % i;
        assertEq(remainder, 0);

        // compare built-in Solidity math with 'FixedPointMathLib'
        uint256 floorValue = (i * o) / i;
        uint256 ceilValue = i.mulDivUp(o, i);
        assertEq(floorValue, ceilValue);
    }
ArnieSec commented 7 months ago

I still think mulDivUp in partition in this codebase should be changed to mulDivDown.

  1. Functions that mutate inputs and outputs in the protocol already use mulDivDown. Example: the decay function.

  2. Decay on outputs lowers the value of the output over time; using mulDivUp in partition therefore works against decay, negating some of the intended decay.

  3. In the same fashion, since input decay raises the input, and outPart may round up in some scenarios, we again negate some of the input decay. Example: before decay, 100 input tokens for 100 output tokens; after decay, 110 input tokens for 100 output tokens; after a 50% partition, 55 input tokens for 51 output tokens. Because of mulDivUp, the input-to-output ratio is brought back down, negating some of the decay's value. I'm using arbitrary numbers for now, but you get the gist.

  4. When these small leaks of value and decay negation or amplification add up in conjunction with each other, we may see up to a 3-4% difference from the expected value of the swap.

  5. Values in Solidity are rounded down by default, and that is the expected behavior. Users will expect the downward rounding if they have used UniswapX, because decay already rounds down.

Since mulDivUp may cause a result to be off by an entire percentage point when rounding, this is a valid concern and issue for users.

All of the assumptions above assume that quantity does not equal input. You were right: when input and quantity are strictly equal, we do not see any rounding up. The rounding only happens when quantity and input differ.

daoio commented 7 months ago

@ArnieGod I get your concerns, but I disagree with the impact (e.g., a 3-4% difference), and this particular report isn't about that and doesn't mention these aspects. In terms of noting a rounding inconsistency, #51 addresses it, but that's a different scenario, and the inconsistency in the resolve calculations won't cause such errors.

ArnieSec commented 7 months ago

@daoio yes, this exact report does not state these issues, but this report is not mine. My report, #31, is a duplicate of this one. Since this issue was the main duplicate, I have to state my escalation here.

If you look at #31, I do describe the issues above:

In some scenarios, this bug forces fillers to fill at lower rates.

I expanded on it in my previous comment.

Maybe not exactly a 3-4% difference, but it will definitely be 1-2%, which is already a big difference. Let me know your thoughts 👍

We can see the percentage-point difference if we use the following inputs:

mulDivUp(991, 100, 1000) = 100, so outPart = 100.

The outPart stays the same even though we dropped the quantity by 0.9%.

The new values are now 991 and 100.

The partition happens and the outPart value stays the same: the filler wanted to fill fewer inputs, but he ends up paying the same output even though he receives less input.

ArnieSec commented 7 months ago

An invariant of partition is: if quantity drops, then outPart should drop as well. But due to mulDivUp this is not the case. We should always be rounding down; this is the standard in UniswapX, which does not use mulDivUp at all. mulDivUp has caused other issues in this protocol, as you have stated in issue #51. While both reports talk about mulDivUp, they are otherwise completely different from each other.
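To make the claimed invariant violation concrete, here is a minimal Foundry-style sketch (a hypothetical test, assuming solmate's FixedPointMathLib and forge-std are available; not part of the Gladius test suite):

    // SPDX-License-Identifier: MIT
    pragma solidity ^0.8.0;

    import {Test} from "forge-std/Test.sol";
    import {FixedPointMathLib} from "solmate/utils/FixedPointMathLib.sol";

    contract OutPartInvariantSketch is Test {
        using FixedPointMathLib for uint256;

        // Dropping quantity from 1000 to 991 (initOut = 100, initIn = 1000)
        // leaves the ceil-rounded outPart unchanged at 100.
        function test_OutPartDoesNotDropWithQuantity() external {
            uint256 fullFill = uint256(1000).mulDivUp(100, 1000); // exact: 100
            uint256 partFill = uint256(991).mulDivUp(100, 1000);  // ceil(99.1) = 100
            assertEq(fullFill, partFill);

            // With floor rounding, the partial fill would pay less:
            assertEq(uint256(991).mulDivDown(100, 1000), 99);
        }
    }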

mstpr commented 7 months ago

Issue is invalid.

You should definitely keep using mulDivUp. Fillers are in full control of which orders they want to fill. If they see that rounding up makes an order unprofitable, then simply no filler fills it. If you round down, the filler can take advantage and fill it, but the profit to be made here is at most 1 wei.

The example provided above says: I have 1000 input tokens and a filler fills 991 of them. Normally the math result for this operation would be 99.1; in Solidity, however, you get either 99 (rounding down) or 100 (rounding up). This is definitely a weird order, and the main goal should be that a filler never fills an order for less than the amountOut the swapper wants, assuming the swapper is the simple user and the filler is the sophisticated user.

Again, fillers choose which orders to fill; the main goal should be that a filler never underpays an order. If rounding up makes it unprofitable, then don't fill that trade, that's it.

ArnieSec commented 7 months ago

@mstpr

You should definitely keep using mulDivDown.

I'm assuming you meant MulDivUp here.

While it is up to the filler to take only orders that they see as profitable, the people who created the order, i.e. the sellers, do not have this luxury, and this issue impacts their profit as well.

Before partition and decay a seller expects:

1000 input tokens for 100 output tokens.

Now let us imagine output decay happens, and we now accept 1000 input tokens for 90 output tokens.

The input-to-output ratio is 11.11, meaning for every 1 output token the filler will get 11.11 input tokens.

Let's say a filler wants to fill 991 tokens out of the full 1000.

He expects the same 11.11 input-to-output ratio, because partition should scale both values equally.

Instead, if he fills 991 tokens, he is still forced to pay 90 output, because mulDivUp(991, 90, 1000) = 90.

His input-to-output ratio is now 11.01, meaning for every output token he sends he gets 11.01 input tokens. After the partition his profitability dropped, because mulDivUp rounded up and skewed the input-to-output ratio.

Let us say the order is profitable for the filler at an 11.11 input-to-output ratio, but not at 11.01. Since the mulDivUp rounding worsened the ratio when it shouldn't have, all fillers wait for the output to decay again in order to become profitable. The output drops to 80 after decay and the order is finally profitable.

The order gets filled.

We just witnessed the seller miss out on selling 991 tokens for 90 output; he is now selling 991 tokens for 80 output.

mulDivUp causes a loss for the seller.
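To spell out the arithmetic of this scenario: at the full fill the rate is $\frac{1000}{90} \approx 11.11$ input per output, while the rounded-up partial fill gives $\frac{991}{90} \approx 11.01$, roughly a 0.9% worse rate for the filler on the same order.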

ArnieSec commented 7 months ago

In this scenario the protocol also loses out, through lower fee generation.

ArnieSec commented 7 months ago

We are really underestimating the impact and severity of this issue.

Fillers make or break this protocol.

They are in charge of executing the orders; they are the biggest piece of the puzzle. Without them, swaps take longer and fewer fees are generated.

Rounding in their favor is paramount. UniswapX understood this, which is why none of its functions round up anywhere in the protocol.

The main innovation of Rubicon is these partial fills; this is what will gain them market share. Once fillers notice that filling partial orders yields less profit than on UniswapX because of the rounding, they will likely all flock back to UniswapX. Fillers, market makers, and the like are motivated by profit: if another protocol offers the same service but yields more profit, who do you think they will stay with?

Such a simple fix will have a big positive effect on this protocol and ensure Rubicon can remain competitive in this kind of swap.

daoio commented 7 months ago

Neither floor nor ceil rounding can avoid a significant relative rounding error here. Floor rounding favors the filler, while ceil rounding favors the swapper, but the question is what real-world token ratios will trigger it.

For example, both mulDivDown and mulDivUp can produce a relative error of $\geq 0.1\%$ (and higher):

    // x = quantity
    // y = output[0].amount
    // z = input.amount
    /// forge-config: default.fuzz.runs = 8000
    function testFuzz_PartitionRoundingErr(uint256 x, uint256 y, uint256 z) external {
        z = bound(z, 1, type(uint128).max);
        y = bound(y, 1, type(uint128).max);
        // quantity should be less than or equal to 'z'
        x = bound(x, 1, z);

        // assume both calculations produce a rounding error.
        vm.assume(isRoundingErrorCeil(x, y, z));
        vm.assume(isRoundingErrorFloor(x, y, z));

        // check that the rounding error is >= 0.1%,
        // and it will be, because we've assumed the rounding error already.
        bool ceilErr = isRoundingErrorCeil(x, y, z);
        bool floorErr = isRoundingErrorFloor(x, y, z);

        assertTrue(ceilErr && floorErr);
    }

    function isRoundingErrorFloor(
        uint256 x,
        uint256 y,
        uint256 z
    ) internal pure returns (bool isError) {
        // the error of floor(x * y / z) is the remainder;
        // flag an error when it is >= 0.1% of x * y
        uint256 remainder = (x * y) % z;
        isError = remainder * 1_000 >= y * x;
    }

    function isRoundingErrorCeil(
        uint256 x,
        uint256 y,
        uint256 z
    ) internal pure returns (bool isError) {
        // the error of ceil(x * y / z) is (z - remainder) % z; the outer '% z'
        // keeps the error at 0 when x * y divides evenly
        uint256 remainder = (x * y) % z;
        remainder = (z - remainder) % z;
        isError = remainder * 1_000 >= y * x;
    }

Uni's functions that utilize mulDivDown typically involve small denominators (e.g., BPS, duration), which means significant relative rounding errors are unlikely to occur there (and that would remain true even if they were using mulDivUp instead).

While it's possible to avoid the rounding error on the contract side by checking whether it has occurred, I don't think that's needed at all: if an order isn't profitable for fillers to execute, they simply won't execute it.
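As a hypothetical illustration of the small-denominator point (the 30 BPS figure is made up, not Rubicon's actual parameter):

    // With a BPS denominator, floor division loses strictly less than 1 wei
    // in absolute terms, so a relative error >= 0.1% requires the true fee
    // itself to be under 1000 wei, i.e. amount * 30 / 10_000 < 1000,
    // which only happens for amounts below ~333,334 wei.
    function bpsFee(uint256 amount) pure returns (uint256) {
        return (amount * 30) / 10_000; // hypothetical 0.30% fee
    }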

ArnieSec commented 7 months ago

TLDR: below I have shown that using mulDivUp instead of mulDivDown will cause a user (seller) to lose $9; this is a valid medium.

I have gone in depth to prove it. There exists no viable scenario where mulDivDown causes a user to receive less funds than mulDivUp by more than precision/wei amounts. Users lose out on funds because of mulDivUp; two perfectly viable scenarios are described below.

I have tried writing more scenarios without bias; it is impossible for mulDivDown to lose more than wei amounts, whereas the scenario below demonstrates a viable case where mulDivUp causes a $9 loss compared to mulDivDown.

I urge the judge to read my entire comment; it will show that the issue is a valid medium.


For example, both mulDivDown and mulDivUp can produce a relative error of >= 0.1% (and higher):

You are proving that both have a rounding error. This is known. The problem is that even though both have rounding errors, rounding up causes more loss of funds in the context of this protocol. Consider the scenario I described in my earlier comment: 1000 input for 90 output after decay, where a 991-token partial fill is still forced to pay the full 90 output because of mulDivUp.


@daoio can you deny that rounding up causes more loss of funds? No; it is clear from my scenario that rounding up will cause more loss of funds.

The fact that both directions have rounding errors of the same percentage still does not negate the fact that rounding up causes more funds to be lost.

If a scenario can be written showing that rounding down causes more loss of funds than rounding up, and the loss is greater than precision or wei amounts, then I will accept this issue as invalid.

So far no such scenario has been described. The only thing that has been said is that both rounding directions have rounding errors.

The facts are that rounding up will cause more loss of funds than rounding down, so why are we trying to invalidate the issue when it is clear as day?

From the Sherlock docs:

Causes a loss of funds but requires certain external conditions or specific states, or a loss is highly constrained. The losses must exceed small, finite amount of funds, and any amount relevant based on the precision or significance of the loss.

The criteria fit perfectly. Before we say this is invalid because of precision or low amounts of loss: my scenario is not a loss equal to a low, precision, or wei amount. My scenario is that rounding up causes orders to not be filled when they otherwise should be, which causes excess decay to be added. Decay can be very large, depending on user input; the loss can be anywhere from 1% to 99% depending on the decay configuration.

Can this be denied? If it can, I'll happily accept an invalid issue.

I will make another scenario so there can be no doubt about the issue's validity.

My scenario is that rounding up will cause orders to not be filled when they otherwise should be, which will cause excess decay to be added. Decay can be very large, depending on user input; the loss can be anywhere from 1% to 99% depending on the decay configuration.

I will prove the assumption I just made with a perfectly viable scenario that can happen.

A seller (user) wants to sell 1000 token x for 100 token y. The value of 1000 token x comes out to $99.99 and the value of 100 token y comes out to $100, meaning 1 token x is worth $0.09999 and 1 token y is worth $1. A perfectly viable scenario that may happen.

Let us assume that each block the output decays by 10 token y, with a fill threshold of 100 for token x. Let's see what happens with each rounding direction.

A filler wants to fill 491 token x; let's calculate outPart with both mulDivUp and mulDivDown:

outPart when using mulDivUp = 50; outPart when using mulDivDown = 49.

Let us see which, if either, is profitable.

50 token y is worth $50, because 50 x $1 (the value of token y) = $50; in the same way, the 49 token y outPart = $49.

491 token x is about $49.09, because 491 x $0.09999 (the value of token x) is roughly $49.09.

The protocol is intended to launch on Optimism; the current tx fee for a Uniswap swap is $0.04 according to Dune.

When rounding down, the filler sends $49 of his token y for $49.09 of token x; even after adding the Optimism gas fee, the profit is about 5 cents, so the swap is profitable. When rounding up, the filler sends $50 of his token y for $49.09 of token x, which is not profitable.

When rounding down, the swap is profitable, so it will be filled; when rounding up, the swap is not profitable, so it will not be filled, and that means we will see decay.

Remember, each block the output decays by 10 token y.

The rounding-up swap didn't get filled; we are now at the next block and decay happens. The swap is now as follows:

The filler sends 40 token y (a decay of 10 has been applied) for 491 token x: $40 for $49.09. The swap is now profitable and is filled.

In conclusion: when rounding up, the user (seller) receives $40 of token y; when rounding down, the user (seller) receives $49 of token y.

$49 - $40 = $9

The user lost $9 because of the rounding up in partition.

This issue is without a doubt valid: a clear loss of funds in states that are perfectly viable and possible.

Again, if we look at the Sherlock docs:

Causes a loss of funds but requires certain external conditions or specific states, or a loss is highly constrained. The losses must exceed small, finite amount of funds, and any amount relevant based on the precision or significance of the loss.

This issue once again fits the description of a valid medium.
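For what it's worth, the outPart values in this scenario check out; a hypothetical snippet, reusing the same solmate-based Foundry harness as the earlier sketch:

    // quantity = 491, output[0].amount = 100, input.amount = 1000
    function test_ScenarioRounding() external {
        assertEq(uint256(491).mulDivUp(100, 1000), 50);   // ceil(49.1): filler pays 50 token y (~$50)
        assertEq(uint256(491).mulDivDown(100, 1000), 49); // floor(49.1): filler pays 49 token y (~$49)
        // 491 token x is worth ~$49.09, so only the floored fill is profitable.
    }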

mstpr commented 7 months ago

There is no loss here. The "loss" happens because of decay. As a trader, if you want your orders to be filled properly and in a timely manner:

  1. Don't trade dust.

  2. If you are not OK with decaying, don't use decay.

  3. If you don't want fillers to have full control over your input, don't set the minimum quantity to "1 wei".

If you are filling 991 tokens out of 1000 and the total output is 90, then you should get 89.19 tokens. If you round up, the filler has to give 90; if you round down as you suggest, there is an actual loss of 0.19.

As for the next example you give... it is definitely not a normal scenario. No one forces swappers to use decay; if you are dealing with a token that's worth $100 with only 1000 wei, then you should not use decay in the first place.

ArnieSec commented 7 months ago

You're making assumptions about what a trader should or should not do. The trader is free to do as he wants; if he wants decay, that is his choice. Regardless, my scenario is a perfectly viable trade that may happen, and the rounding direction allows the trade to go unfilled and forces decay to happen. I described how rounding down would have avoided decay in this scenario.

Don't trade dust.

The dust amounts I showed were for the simplicity of the scenario; it works the same if you scale up. Additionally, $100 worth of wBTC is a dust amount according to you. If this user wants to swap his wBTC, why shouldn't he be able to? Because you said he shouldn't? He is free to do it and shouldn't be penalized for it.

If you are not OK with decaying, don't use decay.

An irrelevant comment. It's not about whether the user is OK with decay; it's that rounding up will result in more decay than needed, thereby causing the end user to lose funds. A user who does want to use decay should not be subject to more decay than needed because of the rounding direction. As I have shown, rounding up increases the chance that more decay than needed will occur.

If you don't want fillers to have full control over your input, don't set the minimum quantity to "1 wei".

This comment is irrelevant; my scenarios work even if the minimum quantity is 90% of the original input.

You are trying to refute my issue by saying what the trader should and should not do. UniswapX only rounds down. The problem with rounding up is that it affects the order's desirability for the filler, which causes orders to be filled at a lower rate after decay. My scenarios describe perfectly why this is an issue.

As for the next example you give... it is definitely not a normal scenario. No one forces swappers to use decay; if you are dealing with a token that's worth $100 with only 1000 wei, then you should not use decay in the first place.

Normal scenario or not, it's still a loss of funds. From the Sherlock docs on what makes a valid medium:

Causes a loss of funds but requires certain external conditions or specific states, or a loss is highly constrained. The losses must exceed small, finite amount of funds, and any amount relevant based on the precision or significance of the loss.

"Causes a loss of funds but requires certain conditions or specific states": Sherlock allows specific states that cause a loss of funds to count as a valid medium. My second scenario is a specific state that causes a loss of funds, so this issue is a valid medium.

You just tried to refute my issue with nothing but your personal opinion on what a trader should or should not do...

To this point, no one has been able to give a viable scenario showing that rounding down causes more than wei amounts of loss, while I have shown that rounding up, in a specific state, loses a user $9.

ArnieSec commented 7 months ago

Final points:

  1. UniswapX, which this project is a fork of, exclusively uses mulDivDown.
  2. The decay function uses mulDivDown; using mulDivUp after the decay function negates some of the intended decay.
  3. mulDivUp can cause later fills for the user.
  4. mulDivUp can cause orders to incur decay that would otherwise not happen with mulDivDown.
  5. In certain states, the use of mulDivUp causes a loss of funds, as shown in a comment above.
  6. In a swap of two tokens of equal amounts with zero decay, rounding up in favor of the user may cause the order to never be filled; rounding down allows the order to be profitable for the filler and therefore to be filled.
  7. mulDivUp breaks the invariant that when the input quantity is lowered, the output should be lowered as well.
  8. Rounding should be consistent throughout the entire protocol; we should not use mulDivUp when the protocol Uniswap built was built on mulDivDown. Take a look at issue #51: the inconsistent rounding direction causes a revert that is not present in UniswapX, a DoS caused by none other than mulDivUp.

All of these points and the overall loss of funds should make this issue a valid medium. It makes zero sense to me why the protocol insists on using mulDivUp when all the above points show that it is bad for the protocol.

cvetanovv commented 7 months ago

I agree with @daoio and @mstpr comments. This report should remain invalid.

In my opinion, there are several points that make this report invalid:

ArnieSec commented 7 months ago

This makes zero sense to me:

A user would miss a possible winning arbitrage opportunity, but he still has the choice not to use this function but to use resolve(order).

The entire innovation of Rubicon is partial fills; are we OK with partial fills making an order less profitable? Now you are assuming the filler has infinite funds and can fill the entire order. If it is logical for you to assume the filler has enough tokens to fill an entire order, then it is also logical for me to assume that sometimes the filler does not have enough tokens and must use a partial fill; this makes the point irrelevant.

I'll highlight this as well:

A user would miss a possible winning arbitrage opportunity

You agree that there is a loss of funds, and your solution is for the filler to fill the entire order. Why are we assuming a filler has enough tokens to fill an entire order? You are making this assumption to invalidate my issue. It is perfectly logical that fillers sometimes do not have enough tokens in a given block to fill an entire order; why assume fillers have unlimited tokens and can always fill everything? In the scenario where the filler does not have enough tokens, and cannot fill profitably when he otherwise should because of rounding up, the user will lose out on funds.

If rounding up makes filling an order unprofitable, the filler has the option not to fill it.

This point proves my point, not the contrary. If a filler doesn't fill a user's order because of the code written by Rubicon, the user's order will decay and be filled at a lower price when it shouldn't have been. This is a clear loss of funds.

In the comments, the sponsor prefers to stay that way, so we can consider it a design decision.

The sponsor's choice to fix or not fix an issue has never determined whether an issue is a valid medium; the criteria in the Sherlock rules and documentation do.

There is still a loss of funds.

ArnieSec commented 7 months ago

In summary:

but he still has the choice not to use this function but to use resolve(order).

Saying this is like telling a person who can only afford a Toyota that he still has the choice to buy a Lambo... There is a reason the filler wanted to use a partial fill in the first place: he didn't have enough tokens to fill the entire order. People who cannot afford to fill an entire order shouldn't be penalized.

If rounding up makes filling an order unprofitable, the filler has the option not to fill it.

We agree that rounding up on a partial fill can cause an order to not be filled and therefore incur more decay than it should.

Since decay can be anywhere from 1% to 99%, the user can lose 1-99% of his order's value through the fault of the code. This should make this issue a clear medium.

ArnieSec commented 7 months ago

@daoio

Can you answer this honestly, with no bias? It should clear up the issue's validity.

Using mulDivUp in partition may cause an order to not be filled in a certain block; this will cause more decay to happen, therefore causing a loss of funds.

This same scenario would not happen if we used mulDivDown.

Is this true, yes or no? @daoio

Czar102 commented 7 months ago

Hey @ArnieGod, please do not make multiple of these simple analogies, or comment in an "aggressive" manner, while tagging the sponsor. We will arrive at a fair outcome, and there is no reason to panic.

Czar102 commented 7 months ago

From my perspective, if I want 100 A tokens for 10 B tokens, then getting me 5 A tokens isn't sufficient to take one of my B tokens, since the remainder of the order will be at a worse rate for the next filler, and if the remainder is not filled, the exchange rate on the partial fill was worse than I wanted. If the filler can't afford to fill the order completely, then they must not benefit from that, and the rounding should be to their disadvantage.

I believe that's how the code currently works.

@ArnieGod if you'd like to challenge this, please formulate a single, concise response.

I'm planning to reject the escalation and leave the issue as is.

ArnieSec commented 7 months ago

Hey @ArnieGod, please do not make multiple of these simple analogies, or comment in an "aggressive" manner, while tagging the sponsor. We will arrive at a fair outcome, and there is no reason to panic.

You are right my apologies to the sponsor

ArnieSec commented 7 months ago

@Czar102

I'm not sure I follow your scenario; could you explain further? I'm having trouble following.

The main gist of my issue is as follows.

Decay lowers a user's expected tokens each block.

Suppose that in a given block there exists a filler who wants to partially fill an order that is barely profitable when resolved without partition. If that filler tries to fill using partition, there is a very high chance the order will not be profitable after the partition, because of the rounding inside the function. The plain resolve rounds down while the partition resolve rounds up, meaning the same operation has differing profitability within the same protocol. The main innovation of Rubicon is the resolve-with-partition function, but as I have explained, this function always yields less profit than the original non-partition resolve written by UniswapX, which Rubicon is a fork of.

Since the filler was unable to fill in that block, because the order was unprofitable when it otherwise should have been profitable, the user's order decays in the next block. The rounding inside partition therefore caused a decay that shouldn't have happened.

ArnieSec commented 7 months ago

Essentially, two functions that are identical in purpose round in different directions. This means a filler with fewer funds, who cannot afford to call the original resolve function, must call resolve with a partial fill and is forced into worse rates. There are two outcomes:

  1. He fills at a lower rate, which shouldn't happen, since both functions have the exact same functionality.

  2. The partial filler does not fill in that block, and decay then lowers the user's return, all because the partial-fill function rounds in the opposite direction from the canonical UniswapX function.

This causes a loss for the user in certain scenarios.

In either scenario there is a loss for either the filler or the user. Both scenarios would be completely mitigated if we followed the same rounding direction as the normal resolve function.

Czar102 commented 7 months ago

Your argument is that the price set by the seller is too high, so it will decrease in a Dutch-auction-like fashion and the seller will get less.

This is the model.

Additionally, even if this were a loss, it's a 1 wei difference, so it doesn't qualify for a Medium per the docs:

The losses must exceed small, finite amount of funds

I'm not sure what "the normal resolve function rounding" is referring to.

ArnieSec commented 7 months ago

@Czar102

Your argument is that the price set by the seller is too high, so it will decrease in a dutch-like auction and the seller will get less.

Not exactly

The normal resolve function can see the same order be profitable at the same time that resolve with partition is unprofitable.

This means that if you are not a whale, you always get worse rates than whales do.

Additionally, if there is no one to call the normal resolve function for an order in a given block, the decay happens anyway.

Essentially, if there are no users who can fill an order fully, and it therefore must be filled partially, that order will be unprofitable when it would have been profitable with a normal fill. This means the rounding inside partition causes a forced decay.

ArnieSec commented 7 months ago

I'm not sure what is the "normal resolve function rounding" referring to.

There are two different ways to fill an order. Both functions are called resolve, but one resolve function includes the partition logic.

The partition-logic resolve always has worse rates, because it rounds up while the normal resolve function rounds down. These two functions have the same job but round in opposite directions.

Czar102 commented 7 months ago

If a user wants 100 A tokens for 10 B tokens, a filler can send 50 A tokens to get 5 B tokens. The rate will be the same.

"The partition logic resolve always has worse rates" is false.

ArnieSec commented 7 months ago

If a user wants 100 A tokens for 10 B tokens, a filler can send 50 A tokens to get 5 B tokens. The rate will be the same.

"The partition logic resolve always has worse rates" is false.

No, that's not true. I have done the calculations for both methods, which prove it; I will quote them below.

ArnieSec commented 7 months ago

We can see the percentage-point difference if we use the following inputs:

mulDivUp(991, 100, 1000) = 100, so outPart = 100.

The outPart stays the same even though we dropped the quantity by 0.9%.

The new values are now 991 and 100.

The partition happens and the outPart value stays the same: the filler wanted to fill fewer inputs, but he ends up paying the same output even though he receives less input.

ArnieSec commented 7 months ago

A filler wants to fill 491 token x; let's calculate outPart with both mulDivUp and mulDivDown:

outPart when using mulDivUp = 50; outPart when using mulDivDown = 49.

Czar102 commented 7 months ago

This is expected behavior. You said that "the partition logic resolve always has worse rates", which is false, because that's not always the case.

I feel confident about rejecting this escalation.

Czar102 commented 7 months ago

Result: Invalid. Has duplicates.

sherlock-admin2 commented 7 months ago

Escalations have been resolved successfully!

Escalation status:

ArnieSec commented 7 months ago

@Czar102 I don't understand how this can be invalid. It's not always the case, but in specific states it is.

Sherlock docs for valid medium

Causes a loss of funds but requires certain external conditions or specific states, or a loss is highly constrained. The losses must exceed small, finite amount of funds, and any amount relevant based on the precision or significance of the loss.

You are going against the Sherlock rules; I have shown a loss in a specific state, without a doubt.

ArnieSec commented 7 months ago

Your reason for invalidation goes against the Sherlock rules and documentation.

Czar102 commented 7 months ago

@ArnieGod please stop spamming the comments section. Feel free to dm me on Discord if you think the judgment is inaccurate.