Closed c4-bot-5 closed 5 months ago
GalloDaSballo marked the issue as insufficient quality report
bufferIncrease is used to keep track of modulo values
This finding looks invalid
correct, so should be marked accordingly - invalid and dismissed.
trust1995 marked the issue as selected for report
trust1995 marked the issue as not selected for report
trust1995 marked the issue as unsatisfactory: Invalid
@trust1995 , the issue I'm referring to is partially described in accepted finding #79 . I'm sorry for not being explicit enough with it.
What I was referring to in this sentence:

> the calculations go through multiple truncations, until finally shares are calculated

I didn't mean that `bareIncrease` is truncated, as @GalloDaSballo suggested; it uses `bufferIncrease` to counteract rounding. What I meant is that the fee calculation goes through multiple steps - divisions with precision loss - until the fees are finally calculated:
```solidity
uint256 bareIncrease = borrowPoolData[_poolToken].borrowRate
    * (currentTime - timestampsPoolData[_poolToken].timeStamp)
    * borrowPoolData[_poolToken].pseudoTotalBorrowAmount
    + bufferIncrease[_poolToken];
```
This part is fine, because `bufferIncrease` carries over the remainder of `bareIncrease` from previous executions. However, `feeAmount` is then calculated as `(bareIncrease / PRECISION_FACTOR_YEAR) * globalPoolData[_poolToken].poolFee / 1e18`:
```solidity
uint256 amountInterest = bareIncrease
    / PRECISION_FACTOR_YEAR; // @audit If called often, it will be small. There's also a hidden divide-before-multiply error
uint256 feeAmount = amountInterest
    * globalPoolData[_poolToken].poolFee // @audit poolFee is set to 20% in the code
    / PRECISION_FACTOR_E18;
```
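To illustrate the two truncating steps, here is a minimal Python model using integer division to mimic Solidity; the constants are assumed from the audit comments (`poolFee` = 20%, `PRECISION_FACTOR_E18` = 1e18), not taken from the contract:

```python
# Assumed constants, mirroring the audit comments (not the contract source):
PRECISION_FACTOR_E18 = 10**18
PRECISION_FACTOR_YEAR = 10**18 * 365 * 24 * 60 * 60
POOL_FEE = 2 * 10**17  # 20%, 1e18-scaled

def fee_amount(bare_increase: int) -> int:
    # Truncation 1: integer division by the yearly precision factor.
    amount_interest = bare_increase // PRECISION_FACTOR_YEAR
    # Truncation 2: taking the 20% fee share with integer division.
    return amount_interest * POOL_FEE // PRECISION_FACTOR_E18

# Frequent updates keep bareIncrease barely above the yearly factor, so
# amountInterest is only 1..4 - and 20% of that truncates to zero every time:
for units in range(1, 5):
    assert fee_amount(units * PRECISION_FACTOR_YEAR) == 0
assert fee_amount(5 * PRECISION_FACTOR_YEAR) == 1
```

With a 20% fee, any `amountInterest` below 5 produces a `feeAmount` of exactly zero in this model.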
This part is well described in issue #79. Generally, both `amountInterest` and `feeAmount` suffer precision loss, which adds up. Finally, fee shares are calculated based on `feeAmount`, which has gone through multiple rounds of precision loss:
```solidity
// @audit if feeAmount is smaller than the difference between shares and total pool, it will be 0, so fees will be 0
uint256 feeShares = feeAmount
    * lendingPoolData[_poolToken].totalDepositShares
    / (lendingPoolData[_poolToken].pseudoTotalPool - feeAmount);
```
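The share conversion can be modeled the same way; the pool sizes below are made-up numbers, not values from the protocol:

```python
def fee_shares(fee_amount: int, total_shares: int, pseudo_total_pool: int) -> int:
    # Mirrors: feeAmount * totalDepositShares / (pseudoTotalPool - feeAmount),
    # with Python's // standing in for Solidity's truncating division.
    return fee_amount * total_shares // (pseudo_total_pool - fee_amount)

# If feeAmount already truncated to zero upstream, no shares are ever minted:
assert fee_shares(0, 10**24, 10**24) == 0
# A tiny feeAmount combined with a share price above 1 also yields zero:
assert fee_shares(3, 10**24, 4 * 10**24) == 0
```

So the final step either propagates an upstream zero or adds one more truncation of its own.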
The more often the protocol is used, the more this value is truncated. The PoC below uses the same deposit amounts and compares fees calculated either often or all at once:
```solidity
// SPDX-License-Identifier: -- WISE --
pragma solidity =0.8.24;

import "./WiseLendingBaseDeployment.t.sol";

contract FeesLostTest is BaseDeploymentTest {

    address borrower = address(uint160(uint(keccak256("alice"))));

    uint256 depositAmountETH = 100 ether;
    uint256 depositAmountToken = 10 ether;
    uint256 borrowAmount = 5e18; // 5 ether
    uint256 PRECISION_FACTOR_YEAR = 1e18 * 365 days;

    uint256 userNft; // nftId of borrower
    uint256 FEE_MANAGER_NFT = 0;
    uint256 debtShares;

    function _setupIndividualTest() internal override {
        _deployNewWiseLending(false);

        // set token value for simple calculations
        MOCK_CHAINLINK_3.setValue(0.001 ether);
        vm.stopPrank();

        deal(address(MOCK_ERC20_3), borrower, 1e30);
    }

    function testFeesLostX() public {
        // take 5 seconds - on Arbitrum block time is ~1 second
        uint TIME_UPDATE = 5;
        uint LOOP_ITERATIONS = 10000;
        uint INTERMEDIATE_TOPUP_AMOUNT = 10000;

        vm.prank(borrower);
        userNft = POSITION_NFTS_INSTANCE.mintPosition();

        vm.warp(
            // warp some time, otherwise system safeguards won't allow deposits
            // due to share price changes outside of the allowed bounds
            block.timestamp + 10 days
        );

        // deposit a big amount immediately, to influence total deposits and
        // borrows, which are used in protocol fee calculations
        _depositToken(MOCK_ERC20_3, borrower, 500000e6, userNft);
        _borrowToken(MOCK_ERC20_3, borrower, 100000e6, userNft);

        // take a snapshot; it is required to compare frequent vs rare updates
        uint256 snapshot = vm.snapshot();

        for (uint i; i < LOOP_ITERATIONS; i++) {
            vm.warp(
                block.timestamp + TIME_UPDATE
            );
            // deposit amounts, just to force fee and pseudo-borrow recalculation
            _depositToken(MOCK_ERC20_3, borrower, INTERMEDIATE_TOPUP_AMOUNT, userNft);
            _borrowToken(MOCK_ERC20_3, borrower, INTERMEDIATE_TOPUP_AMOUNT, userNft);
        }

        // check how many fee shares the manager accrued, then revert state
        (, uint256 managerSharesIncremental) = LENDING_INSTANCE.userLendingData(FEE_MANAGER_NFT, address(MOCK_ERC20_3));
        vm.revertTo(snapshot);

        // perform the same actions as before, but warp the whole
        // TIME_UPDATE * LOOP_ITERATIONS interval at once
        vm.warp(
            block.timestamp + (TIME_UPDATE * LOOP_ITERATIONS)
        );
        _depositToken(MOCK_ERC20_3, borrower, INTERMEDIATE_TOPUP_AMOUNT, userNft);
        _borrowToken(MOCK_ERC20_3, borrower, INTERMEDIATE_TOPUP_AMOUNT, userNft);

        (, uint256 managerSharesOnce) = LENDING_INSTANCE.userLendingData(FEE_MANAGER_NFT, address(MOCK_ERC20_3));

        console.log("MANAGER shares incremental ", managerSharesIncremental);
        console.log("MANAGER shares once        ", managerSharesOnce);
    }

    function _depositToken(MockErc20 token, address depositor, uint amount, uint nftId) internal {
        vm.startPrank(depositor);
        token.approve(address(LENDING_INSTANCE), amount * 2);
        LENDING_INSTANCE.depositExactAmount( // supply collateral
            nftId,
            address(token),
            amount
        );
        vm.stopPrank();
    }

    function _borrowToken(MockErc20 token, address depositor, uint amount, uint nftId) internal {
        vm.prank(depositor);
        LENDING_INSTANCE.borrowExactAmount(
            nftId,
            address(token),
            amount
        );
    }
}
```
The outcome of the test is as follows:

```
❯ forge test -vvv --mt testFeesLost
[...]
MANAGER shares incremental  100000
MANAGER shares once         116311
Test result: ok. 1 passed; 0 failed; 0 skipped; finished in 8.66s
Ran 1 test suites: 1 tests passed, 0 failed, 0 skipped (1 total tests)
```
As the test shows, even with an update interval as short as 5 seconds the difference is significant - around 14% (`(116311 - 100000) / 116311`). It gets more severe as the token's value-to-decimals ratio increases, for example with WBTC, because fees are calculated in raw token units, not in a dollar equivalent.
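The frequent-vs-batched difference can also be modeled with plain integer arithmetic. This is a sketch, not the contract code: the constants are assumed from the audit comments, and carrying the division remainder over between updates is an assumption (the excerpt only shows buffering below the threshold), which makes the comparison conservative:

```python
# Assumed constants (poolFee = 20%, 1e18-scaled):
PRECISION_FACTOR_E18 = 10**18
PRECISION_FACTOR_YEAR = 10**18 * 365 * 24 * 60 * 60
POOL_FEE = 2 * 10**17

def accrue(bare):
    """One fee update: returns (feeAmount, leftover buffer)."""
    if bare < PRECISION_FACTOR_YEAR:
        return 0, bare
    amount_interest = bare // PRECISION_FACTOR_YEAR
    fee = amount_interest * POOL_FEE // PRECISION_FACTOR_E18
    return fee, bare % PRECISION_FACTOR_YEAR

per_update = 13 * PRECISION_FACTOR_YEAR // 10  # 1.3 interest "units" per update

# Ten frequent updates: each crossing yields amountInterest of 1 or 2,
# and 20% of that truncates to zero every single time.
buffer, incremental = 0, 0
for _ in range(10):
    fee, buffer = accrue(per_update + buffer)
    incremental += fee

# One batched update covering the same elapsed time keeps some fees:
batched, _ = accrue(per_update * 10)

print(incremental, batched)  # incremental loses everything; batched keeps 2
```

Even with the remainder fully buffered, the incremental path collects zero fees while the batched path collects two, mirroring the shape of the PoC result above.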
Hi, unfortunately the submission does not pinpoint the correct root cause in the manner that #79 did. I am inclined to leave it as insufficient quality, as the additional explanation cannot replace the validity of the original submission.
I would just like to add a remark here concerning the validity of the original submission. While issue #79 pinpoints the precision loss in `amountInterest`, our initial submission comments, which I'll allow myself to post below, concern a broader phenomenon in the vulnerable function. I believe our original submission and the additional explanation cover the same topics. Excerpts from the original submission:
```solidity
// @audit dependent on last update
uint256 bareIncrease = borrowPoolData[_poolToken].borrowRate
// [...]
if (bareIncrease < PRECISION_FACTOR_YEAR) { // @info PRECISION_FACTOR_YEAR is 1e18 * (60 * 60 * 24 * 365), 1 year in secs
    bufferIncrease[_poolToken] = bareIncrease;
// [...]
uint256 amountInterest = bareIncrease
    / PRECISION_FACTOR_YEAR; // @audit If called often, it will be small, like 1 or 2
uint256 feeAmount = amountInterest
    * globalPoolData[_poolToken].poolFee // @audit poolFee is set to 2e17 in code, so combining the two, it will be 0 for the most part
    / PRECISION_FACTOR_E18;
// @audit if feeAmount is smaller than the difference between shares and total pool, it will be 0, so fees will be 0
uint256 feeShares = feeAmount
```
While I agree that the description in the original submission could have been phrased better, I believe that in the additional explanation we rephrased our original submission, delving deeper into the specific topics it covered - time dependence, token decimals dependence, and multiple truncations (precision losses) - which all add up to the severity shown in the PoC based on our initial submission.
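On the token-decimals dependence: the same dollar value borrowed is represented by raw-unit amounts roughly 12 orders of magnitude apart between USDC (6 decimals) and ETH (18 decimals), which changes how often the fee truncates to zero. A rough Python model follows; the borrow rate, the ETH price, and the buffering behavior are all illustrative assumptions, not the contract code:

```python
# Assumed constants (poolFee = 20%, borrow rate = 5% APR, both 1e18-scaled):
PRECISION_FACTOR_E18 = 10**18
PRECISION_FACTOR_YEAR = 10**18 * 365 * 24 * 60 * 60
POOL_FEE = 2 * 10**17
BORROW_RATE = 5 * 10**16

def fees_over(seconds, total_borrow_raw, step):
    """Accrue fees every `step` seconds, buffering sub-threshold increases."""
    fees = buffer = 0
    for _ in range(seconds // step):
        buffer += BORROW_RATE * step * total_borrow_raw
        if buffer >= PRECISION_FACTOR_YEAR:
            amount = buffer // PRECISION_FACTOR_YEAR
            fees += amount * POOL_FEE // PRECISION_FACTOR_E18
            buffer %= PRECISION_FACTOR_YEAR
    return fees

# ~$100 borrowed: 100e6 raw units of USDC vs 5e16 wei of ETH
# (0.05 ETH at an assumed $2000) - same value, ~12 orders of magnitude apart.
usdc_fees = fees_over(3600, 100 * 10**6, 1)
eth_fees  = fees_over(3600, 5 * 10**16, 1)
print(usdc_fees, eth_fees)  # USDC accrues nothing; ETH accrues nonzero fees
```

In this model the USDC pool's `amountInterest` is always 1 at each threshold crossing, so the 20% fee truncates to zero forever, while the ETH pool's raw-unit amounts are large enough to survive the truncation.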
Lines of code
https://github.com/code-423n4/2024-02-wise-lending/blob/79186b243d8553e66358c05497e5ccfd9488b5e2/contracts/MainHelper.sol#L500-L577
Vulnerability details
Impact
Protocol fee loss.
Proof of Concept
Function `MainHelper._updatePseudoTotalAmounts()` is used with every pool modification operation (borrow, lend, etc.). Currently, fees for the fee manager depend on calculations based on the time since the last update - on Ethereum ~12 seconds, on Arbitrum < 1 second. The problem is that the trigger to redistribute fees is reaching `1e18 * (60 * 60 * 24 * 365)`, while the calculations go through multiple truncations until shares are finally calculated. An additional factor is that the function behaves differently for different tokens: for USDC it is almost guaranteed to never work because of its 6 decimals, while for ETH it may work sometimes, since it operates on the bare amount without considering decimal precision. If the function is executed often, no fee shares will be distributed to the fee manager - those are effectively lost.
Tools Used
Manual Review
Recommended Mitigation Steps
Consider saving the shares remainder and updating it with each execution, similar to `bufferIncrease`.
Assessed type
Decimal