Lines of code
https://github.com/code-423n4/2024-07-basin/blob/7d5aacbb144d0ba0bc358dfde6e0cc913d25310e/src/functions/Stable2.sol#L317
Vulnerability details
Impact
The decodeWellData function mishandles the second token's decimal value: when decimal1 is zero, it is not defaulted to 18 as intended, because the second check mistakenly tests decimal0 again. Any downstream calculation that depends on token decimals, such as computing the liquidity pool (LP) token supply, token reserves, and rates, can therefore run with an incorrect decimal value.
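To illustrate the downstream effect, the following is a minimal sketch assuming the common stableswap convention of normalizing reserves to 18 decimals by multiplying by 10 ** (18 - decimal); the contract and function names here are illustrative and not part of the codebase. If decimal1 is left at 0 for a token that actually has 18 decimals, its reserve is inflated by a factor of 1e18:
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.20;
contract DecimalScalingSketch {
    // Normalize a raw reserve to 18 decimals (assumed convention, illustrative only).
    function scaleReserve(uint256 reserve, uint256 decimal) public pure returns (uint256) {
        return reserve * 10 ** (18 - decimal);
    }
    function example() external pure returns (uint256 intended, uint256 buggy) {
        uint256 reserve = 1000e18; // reserve of a token that actually has 18 decimals
        // Intended behavior: a well-data value of 0 defaults to 18, so no extra scaling is applied.
        intended = scaleReserve(reserve, 18); // 1000e18
        // Buggy behavior: decimal1 stays 0, so the reserve is scaled up by 10 ** 18.
        buggy = scaleReserve(reserve, 0); // 1000e36
    }
}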
Proof of Concept
Consider the decodeWellData function in the current implementation:
function decodeWellData(bytes memory data) public view virtual returns (uint256[] memory decimals) {
    (uint256 decimal0, uint256 decimal1) = abi.decode(data, (uint256, uint256));
    if (decimal0 == 0) {
        decimal0 = 18;
    }
    // This should check for `decimal1` instead of `decimal0`
    if (decimal0 == 0) {
        decimal1 = 18;
    }
    if (decimal0 > 18 || decimal1 > 18) revert InvalidTokenDecimals();
    decimals = new uint256[](2);
    decimals[0] = decimal0;
    decimals[1] = decimal1;
}
The error is highlighted in the comment. The second check should verify decimal1, not decimal0.
A Foundry test case can demonstrate this issue:
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.20;
import "forge-std/Test.sol";
contract TestContract is Test {
    function testDecodeWellData() public {
        // Setup: encode zero decimals for both tokens
        bytes memory data = abi.encode(uint256(0), uint256(0));
        InitialStable2 stable2 = new InitialStable2();
        // Decode the well data
        uint256[] memory decimals = stable2.decodeWellData(data);
        // Check the expected default values
        assertEq(decimals[0], 18, "decimal0 should default to 18");
        // Fails against the current implementation: decimal1 is left at 0
        assertEq(decimals[1], 18, "decimal1 should default to 18");
    }
}
contract InitialStable2 {
    function decodeWellData(bytes memory data) public view virtual returns (uint256[] memory decimals) {
        (uint256 decimal0, uint256 decimal1) = abi.decode(data, (uint256, uint256));
        // Incorrect handling: only decimal0 is checked and assigned, but not decimal1
        if (decimal0 == 0) {
            decimal0 = 18;
        }
        if (decimal0 == 0) {
            decimal1 = 18;
        }
        if (decimal0 > 18 || decimal1 > 18) revert();
        decimals = new uint256[](2);
        decimals[0] = decimal0;
        decimals[1] = decimal1;
    }
}
When run against the current implementation, the second assertion fails, demonstrating that decimal1 does not default to 18 and can skew downstream calculations.
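Assuming the file above is placed in a Foundry project's test directory, it can be run with forge test --match-test testDecodeWellData -vvv; the assertion on decimals[1] is the one that surfaces the bug.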
Tools Used
Solidity
Foundry
Recommended Mitigation Steps
To resolve this issue, update the decodeWellData function so that the second check tests and defaults decimal1. The corrected function, identical to the current implementation apart from that one condition, should look like this:
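function decodeWellData(bytes memory data) public view virtual returns (uint256[] memory decimals) {
    (uint256 decimal0, uint256 decimal1) = abi.decode(data, (uint256, uint256));
    if (decimal0 == 0) {
        decimal0 = 18;
    }
    // Fixed: check decimal1, not decimal0
    if (decimal1 == 0) {
        decimal1 = 18;
    }
    if (decimal0 > 18 || decimal1 > 18) revert InvalidTokenDecimals();
    decimals = new uint256[](2);
    decimals[0] = decimal0;
    decimals[1] = decimal1;
}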
This ensures that both token decimal values are correctly defaulted to 18 if they are initially zero.
Assessed type
Decimal