code-423n4 / 2024-01-curves-findings


`FeeSplitter::onBalanceChange` potential gas issue due to duplicate token entries #1221

Open c4-bot-3 opened 9 months ago

c4-bot-3 commented 9 months ago

Lines of code

https://github.com/code-423n4/2024-01-curves/blob/main/contracts/FeeSplitter.sol#L99

Vulnerability details

Impact

The FeeSplitter::onBalanceChange function in FeeSplitter.sol, which is called after each trade, can lead to gas issues and a DoS. If the same token subject is traded repeatedly, its address is pushed into storage on every trade, so the userTokens array grows without bound.

Likelihood: Low. Impact: High.

Details

function onBalanceChange(address token, address account) public onlyManager {
        TokenData storage data = tokensData[token];
        data.userFeeOffset[account] = data.cumulativeFeePerToken;
        // @> pushes on every call; no check for whether `token` is already in userTokens[account]
     @> if (balanceOf(token, account) > 0) userTokens[account].push(token);
    }

As long as balanceOf(token, account) > 0, the function executes userTokens[account].push(token) without checking whether the token is already present for that account, so every qualifying call appends another entry and grows the array. This can easily add up to an array with millions of entries, especially for trading bots that may amass several thousand trades per week.

Let's imagine that Alice's bot holds a balance of 2 curve tokens of the same subject (in this example, "subject" means the token):

  1. It calls sellCurvesToken, passing the curve subject and 2 as the arguments.
  2. Before the sale, userTokens[aliceBot] contains 1 entry, because the bot bought both curve tokens in a single transaction.
  3. After the sale, userTokens[aliceBot] holds 2 entries of the same token, because the sale transaction pushed the token again.
  4. The bot then runs a streak of single-token buys across 10 different subjects, immediately followed by sales of those 10 tokens.
  5. userTokens[aliceBot] gains 1 entry for each of the first 10 buys.
  6. It then gains another entry per token (2 each now, because each sale pushed a duplicate).

For accounts with consistent trading activity, this quickly compounds into a large number of duplicate token addresses in the userTokens[account] array.

The duplicates also cause the getUserTokensAndClaimable function to return faulty UserClaimData, with the same token reported multiple times.
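For context, here is a minimal sketch of why duplicates corrupt the claim data, assuming the view function builds its result by iterating userTokens; the exact FeeSplitter implementation may differ, and getClaimableFees and the UserClaimData field order are assumed names here:

    // Sketch only -- assumed shape, not the verbatim contract code
    function getUserTokensAndClaimable(address user) public view returns (UserClaimData[] memory) {
        address[] memory tokens = getUserTokens(user); // may contain duplicates
        UserClaimData[] memory result = new UserClaimData[](tokens.length);
        for (uint256 i = 0; i < tokens.length; i++) {
            // A duplicated token yields a duplicated UserClaimData entry,
            // so the same claimable amount is reported multiple times
            result[i] = UserClaimData(getClaimableFees(tokens[i], user), tokens[i]);
        }
        return result;
    }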

Proof of Concept

This section provides a coded PoC.

Here's a link to the GitHub Gist setup for this issue: https://gist.github.com/Maroutis/37b1d67df90912f33cd059eadf58c9b9

    function testDupsInUserTokens() public {
        // Add holderFee
        vm.prank(owner);
        curves.setMaxFeePercent(protocolFee + subjectFee + subjectFee);
        vm.prank(owner);
        curves.setExternalFeePercent(subjectFee, 0, subjectFee);

        // Owner buys the first token
        vm.prank(owner);
        curves.buyCurvesToken(owner, 1);

        // Owner keeps buying tokens
        (,,,, uint256 totalFee) = curves.getFees(curves.getPrice(1, 10));
        uint256 value = curves.getPrice(1, 10) + totalFee;
        vm.prank(owner);
        curves.buyCurvesToken{value: value}(owner, 10);

        (,,,, uint256 totalFee1) = curves.getFees(curves.getPrice(11, 10));
        uint256 value1 = curves.getPrice(11, 10) + totalFee1;
        vm.prank(owner);
        curves.buyCurvesToken{value: value1}(owner, 10);

        // Calculate the number of duplicates
        address[] memory ownerTokens = feeSplitter.getUserTokens(owner);
        uint256 dup;
        for (uint256 i = 0; i < ownerTokens.length;) {
            if (ownerTokens[i] == owner) {
                dup += 1;
            }
            unchecked {
                ++i;
            }
        }
        // Same token stored 3 times
        assertEq(dup, 3);
    }

This test shows that the same token is stored again after each trade. Over time, this creates a huge userTokens array full of duplicate entries.

Tools Used

Manual review + Foundry

Recommended Mitigation Steps

Our recommendation is to check for duplicate entries before pushing a new token into userTokens[account]; one possible approach is sketched below.
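For illustration, a minimal sketch of that idea, assuming the rest of FeeSplitter stays as-is (the userTokenAdded mapping is a hypothetical addition, not an existing state variable):

    // Sketch only: hypothetical de-duplication flag, not the project's actual fix
    mapping(address => mapping(address => bool)) internal userTokenAdded; // account => token => already recorded

    function onBalanceChange(address token, address account) public onlyManager {
        TokenData storage data = tokensData[token];
        data.userFeeOffset[account] = data.cumulativeFeePerToken;
        // Only push the token the first time it is seen for this account
        if (balanceOf(token, account) > 0 && !userTokenAdded[account][token]) {
            userTokenAdded[account][token] = true;
            userTokens[account].push(token);
        }
    }

An alternative is to store an index per (account, token) pair so entries could also be removed once a balance drops to zero, at the cost of slightly more bookkeeping.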

Assessed type

Other

c4-pre-sort commented 9 months ago

raymondfam marked the issue as insufficient quality report

c4-pre-sort commented 9 months ago

raymondfam marked the issue as duplicate of #684

c4-judge commented 9 months ago

alcueca changed the severity to 3 (High Risk)

c4-judge commented 9 months ago

alcueca changed the severity to 2 (Med Risk)

c4-judge commented 9 months ago

alcueca changed the severity to QA (Quality Assurance)

c4-judge commented 9 months ago

alcueca marked the issue as grade-b