hats-bug-reporter[bot] opened 1 month ago
excellent issue
@luzzif I think this is a duplicate of issue #6; it's the same issue.
Exactly. Because the root cause of the issue is the same, the recommendation provided in #6 would fix it.
Yes, you guys are right, marking this as invalid.
@0xmahdirostami @0xShax2nk This issue and #6 are not the same.
Issue #6 is talking about small reward amounts. The issue here is talking about using low decimals even when the rewards are high.
Here is what #6 talks about:
The attacker sets up a campaign with a reward amount of 50 tokens. The global fee percentage is 1% (represented as 10,000 in contract terms). The fee calculation is 50 * 10,000 / 1,000,000 = 0 after integer division
Issue #6 is using 50 wei and not 50 tokens. So 1) nowhere is there a mention of low decimals, and 2) the understanding of 50 tokens is flawed.
Here is what my issue talks about:
_feeAmount = 90000 * 10 / 1000000 = 900000 / 1000000 = 0.
Here 90000 means 900 tokens for a token that has 2 decimals.
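To make the distinction concrete, the rounding in both scenarios can be reproduced with a few lines of Python that mimic Solidity's truncating integer division. `UNIT = 1_000_000` matches the contract's fee denominator; the `fee_amount` helper is a hypothetical stand-in for the contract's fee calculation, not its actual code:

```python
UNIT = 1_000_000  # fee denominator, as in Metrom.sol

def fee_amount(amount: int, fee: int) -> int:
    # Solidity's uint division truncates toward zero; // does the same
    # for non-negative integers in Python
    return amount * fee // UNIT

# Issue #6's scenario: a 50 wei reward with a 1% fee (10_000 / UNIT).
# The reward itself is dust, so a zero fee is unsurprising.
assert fee_amount(50, 10_000) == 0

# This issue's scenario: 900 tokens of a 2-decimal token (90_000 raw
# units) with a 0.001% fee (10 / UNIT). A realistic reward amount
# still pays no fee because of the token's low decimals.
assert fee_amount(90_000, 10) == 0
```

The two assertions reach zero for different reasons: the first because the reward is negligible, the second because low decimals shrink the raw amount even when the reward is large.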
@luzzif Please check this ^
@luzzif Can you please read my comment above, the issues are totally different.
This is the exact same thing written in two different ways. The concept of decimals is exclusively off-chain so the 2 issues are equivalent.
Hi @luzzif, I totally disagree.
The concept of decimals applies onchain as well. An amount of 1000000 (1e6) could mean two different things in tokens having different decimals. For example, 1e6 means 1 USDC (which has 6 decimals) while 1e6 means 1e6 wei for some other token like WETH (which has 18 decimals).
The other issue nowhere mentions low decimals, which is evident in its POC when it says 50 tokens but uses 50 wei.
In my POC, low decimals does not mean reward amounts are low, I've used 90000 meaning that 900 tokens are provided for a token that has 2 decimals and the campaign owner is still able to bypass fees.
Please re-evaluate this finding, thank you!
@mcgrathcoutinho The root cause of both issues is precision loss in division. There are many scenarios in which this could happen, but the issue is solely about precision loss in division. As for the impact: just like in issue #6, no user funds are at risk. It's just a minor issue; the owner (not users) will lose some reward in a special scenario.
@0xmahdirostami My main argument is to consider this issue instead of #6. Bypassing fees causes loss to the protocol and it accumulates over time as more campaign owners use low decimals tokens to bypass the fees (whether intentionally or unintentionally). Issue #6's POC is flawed in the logic it presents and in fact talks about a scenario that will never occur i.e. using 50 wei as a reward amount. The issue here mentions low decimals explicitly and demonstrates how even when high reward amounts are used, the fees are still bypassed (intentionally or unintentionally). The accumulated fees lost would be proportional as well. For example, 1% of 900 tokens is 9 tokens lost as fees while in the other issue it would be a dust amount.
@mcgrathcoutinho Yes, you are right. #6 is an unrealistic situation, and this issue is more reasonable.
I don't know how this would be judged, but @luzzif please review it again.
@0xmahdirostami @luzzif If #6 is considered valid and the fix provided there is implemented, the issue above is also fixed. Since the root cause is the same and the fix for #6 resolves both, this should simply be considered a duplicate of #6.
@0xShax2nk The issue demonstrated here is of Medium-severity while yours is of low-severity mainly due to the fact that issue #6's logic is flawed and uses 50 wei (dust amounts) while the issue here explicitly mentions low decimals to demonstrate how fees are bypassed even with high reward amounts.
According to the Hats rules, the first issue that points out the highest severity of an issue correctly should be considered valid as the primary issue.
Correct me if I am wrong @0xmahdirostami since you're a lead auditor on Hats who knows the rules better.
I am not the lead auditor here, but for a similar impact, I assigned a low severity in another competition for my own submission.
Will this be considered a medium severity?
@0xShax2nk I just said low
Github username: --
Twitter username: MrPotatoMagic
Submission hash (on-chain): 0x4574d5f7be5ffbc7756219710704b4db09c24d96e6baf01f1ada7ef4d35863ef
Severity: high
Description: A campaign owner can create campaigns by calling the createCampaigns function. This function charges the campaign owner (msg.sender) a specific fee if one has been set by the team, or the global fee amount otherwise.
Attack Scenario:
The issue arises here, where the fee amount for each token is calculated using the formula below.
If the reward token has low decimals (see here), then the `_feeAmount` variable would evaluate to 0. For example, take Gemini USD as the token, which only has 2 decimals, with `_amount` = 90000 and `_fee` = 10 (0.001%):

`_feeAmount = 90000 * 10 / 1000000 = 900000 / 1000000 = 0`
This occurs since the numerator is smaller than the denominator, thus causing rounding down to 0.
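Because the division truncates, the fee is zero for every amount satisfying `_amount * _fee < UNIT`. A short Python check of that threshold (the `fee_amount` helper is an illustrative model of the contract's calculation, with `UNIT = 1_000_000` as in Metrom.sol):

```python
UNIT = 1_000_000  # fee denominator, as in Metrom.sol

def fee_amount(amount: int, fee: int) -> int:
    # mirrors Solidity's truncating uint division
    return amount * fee // UNIT

# With _fee = 10 (0.001%), every amount below UNIT // 10 = 100_000 raw
# units pays zero fee. For a 2-decimal token such as Gemini USD, that
# is any reward under 1000 whole tokens.
assert all(fee_amount(a, 10) == 0 for a in range(100_000))
assert fee_amount(100_000, 10) == 1  # first amount that pays any fee
```

In other words, the zero-fee region scales inversely with the fee: the smaller the configured fee, the larger the range of reward amounts that escape it entirely.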
Attachments
Here is the code snippet - https://github.com/hats-finance/Metrom-0xfdfc6d4ac5807d7460da20a3a1c0c84ef2b9c5a2/blob/e9d6b1e594d5bb3694bfe68f73399156ebb5d3a4/src/Metrom.sol#L204
Mitigation: Make UNIT smaller. For example, consider using the BPS model where 100% = 10000. This would ensure even the tokens with the lowest decimals are considered. No functionality is lost here since the ratio remains the same, i.e. 10% can be represented as 1000, 1% as 100, and so on.
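A minimal sketch of the suggested BPS change, assuming the same truncating division (the names `UNIT_BPS` and `fee_amount_bps` are hypothetical, chosen for illustration):

```python
UNIT_BPS = 10_000  # 100% expressed in basis points

def fee_amount_bps(amount: int, fee_bps: int) -> int:
    # same truncating division, but with a much smaller denominator
    return amount * fee_bps // UNIT_BPS

# A 1% fee (100 bps) on 900 tokens of a 2-decimal token (90_000 raw
# units) now yields a non-zero fee: 900 raw units, i.e. 9.00 tokens.
assert fee_amount_bps(90_000, 100) == 900
```

One trade-off worth noting: with a 10000 denominator the smallest representable fee becomes 1 bps (0.01%), so sub-basis-point fees like the 0.001% in the example above can no longer be configured; those were the fees most prone to rounding to zero in the first place.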