blockchain-etl / ethereum-etl

Python scripts for ETL (extract, transform and load) jobs for Ethereum blocks, transactions, ERC20 / ERC721 tokens, transfers, receipts, logs, contracts, internal transactions. Data is available in Google BigQuery https://goo.gl/oY5BCQ
https://t.me/BlockchainETL
MIT License

Missing data for major tokens #181

Open alexkroeger opened 5 years ago

alexkroeger commented 5 years ago

The tokens table in BigQuery is missing symbols for many key tokens, e.g. DAI (0x89d24a6b4ccb1b6faa2625fe562bdd9a23260359) and MKR (0x9f8f72aa9304c8b593d555f12ef6589cc3a579a2).

It's also missing rows entirely for certain ERC-20 tokens like Ethfinex wrapped tokens (for instance, ETHW, 0xaa7427d8f17d87a28f5e1ba3adbb270badbe1011).

medvedev1088 commented 5 years ago

Thanks for reporting this. It seems the DAI contract returns bytes32 instead of string for name() and symbol(), which doesn't comply with the ERC20 standard https://eips.ethereum.org/EIPS/eip-20. This tool should have an option to handle such "fuzzy" ERC20 contracts.

A few other limitations are listed here https://github.com/blockchain-etl/ethereum-etl#limitations
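As a minimal sketch of what handling such "fuzzy" contracts could look like (this is not ethereum-etl's actual implementation, and the decoder function name is hypothetical): a standard ERC20 `symbol()` call returns an ABI-encoded string (offset, length, data), while contracts like DAI return a bare, zero-padded bytes32. A fallback decoder can distinguish the two by payload length:

```python
def decode_token_field(raw: bytes) -> str:
    """Decode a symbol()/name() return value that may be either a
    standard ABI-encoded string or a non-standard bare bytes32."""
    if len(raw) == 32:
        # bytes32 style (e.g. DAI, MKR): text is left-aligned,
        # zero-padded on the right.
        return raw.rstrip(b"\x00").decode("utf-8", errors="replace")
    # Standard ABI string encoding: 32-byte offset, 32-byte length, then data.
    length = int.from_bytes(raw[32:64], "big")
    return raw[64:64 + length].decode("utf-8", errors="replace")

# bytes32-style payload, as DAI's symbol() produces:
print(decode_token_field(b"DAI".ljust(32, b"\x00")))  # DAI

# standard ABI string payload:
std_raw = (32).to_bytes(32, "big") + (3).to_bytes(32, "big") + b"MKR".ljust(32, b"\x00")
print(decode_token_field(std_raw))  # MKR
```

The raw payloads above are constructed inline for illustration, not fetched from chain; a real exporter would make the raw eth_call first with the string ABI and retry with the bytes32 ABI on decode failure.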

medvedev1088 commented 4 years ago

Contract 0xaa7427d8f17d87a28f5e1ba3adbb270badbe1011 doesn't implement approve(address,uint256), which is required by the ERC20 standard, so it is technically not an ERC20. Apparently Etherscan uses "fuzzy" detection of ERC20s.
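One common way to approximate such "fuzzy" detection (a sketch, not how Etherscan or ethereum-etl actually does it) is to scan a contract's deployed bytecode for the well-known 4-byte selectors of the mandatory ERC20 functions:

```python
# First 4 bytes of keccak256 of each mandatory ERC20 function signature.
ERC20_SELECTORS = {
    "totalSupply()": "18160ddd",
    "balanceOf(address)": "70a08231",
    "transfer(address,uint256)": "a9059cbb",
    "transferFrom(address,address,uint256)": "23b872dd",
    "approve(address,uint256)": "095ea7b3",
    "allowance(address,address)": "dd62ed3e",
}

def missing_erc20_functions(bytecode_hex: str) -> list:
    """Return the ERC20 function signatures whose selectors do not
    appear anywhere in the deployed bytecode."""
    code = bytecode_hex.lower().removeprefix("0x")
    return [sig for sig, sel in ERC20_SELECTORS.items() if sel not in code]
```

This heuristic has false positives (a selector byte sequence can occur by chance, or behind a proxy the selectors live in the implementation contract), which is part of why detectors that rely on it end up "fuzzy".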

dsshap commented 3 years ago

@medvedev1088 where can I find the latest limitations?

I followed the link but that section no longer exists in the readme. I reviewed this markdown and just want to confirm if it is still current or a bit outdated.

medvedev1088 commented 3 years ago

> @medvedev1088 where can I find the latest limitations?
>
> I followed the link but that section no longer exists in the readme. I reviewed this markdown and just want to confirm if it is still current or a bit outdated.

This document is current: https://github.com/blockchain-etl/ethereum-etl/blob/develop/docs/limitations.md

Vaiyani commented 2 years ago

I guess this issue still persists. I was joining the token_address field from the token_transfers table with the address field in the tokens (and amended_tokens) tables in the BigQuery dataset, and got null values.
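The two failure modes reported in this thread show up differently in that join: a token row can exist with a NULL symbol (the DAI/MKR case), or the row can be missing entirely (the 0xaa74… case), in which case a LEFT JOIN produces NULLs for every token column. A plain-Python sketch of the distinction, using the addresses from this thread as stand-ins for the BigQuery tables:

```python
# Stand-ins for token_transfers and tokens; values are illustrative.
transfers = [
    {"token_address": "0x89d24a6b4ccb1b6faa2625fe562bdd9a23260359", "value": 100},
    {"token_address": "0xaa7427d8f17d87a28f5e1ba3adbb270badbe1011", "value": 5},
]
# DAI has a row, but its symbol is NULL; the second address has no row at all.
tokens = {"0x89d24a6b4ccb1b6faa2625fe562bdd9a23260359": {"symbol": None}}

# LEFT JOIN misses: transfer rows with no matching tokens row.
unmatched = [t for t in transfers if t["token_address"] not in tokens]

# Matched rows whose symbol is still NULL.
null_symbol = [t for t in transfers
               if t["token_address"] in tokens
               and tokens[t["token_address"]]["symbol"] is None]
```

Separating the two cases when debugging helps tell whether a token was never detected as ERC20 at all, or was detected but its symbol()/name() call failed to decode.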