subscan-explorer / subscan-issue-tracker

The issue tracker for Subscan.io.

API/UI pagination generates duplicate records which overwrite other valid records #2

Closed lgmnemesis closed 2 years ago

lgmnemesis commented 2 years ago

Confirmation

Affected Network(s)

Polkadot

Steps to reproduce

In my example I use the following address: 1n4ufDK4XVpFdoejYrmVoQzadxU7pArCUR6eah2JRhfca65

  1. Query the Polkadot reward address and go to the Reward&Slash section: https://polkadot.subscan.io/reward?address=1n4ufDK4XVpFdoejYrmVoQzadxU7pArCUR6eah2JRhfca65&role=account
  2. Click on "View All" so it displays 25 records per page with the ability to paginate back and forth.
  3. Download the full data records as a CSV file (by clicking on "Download all data").
  4. Look for event ID 7012725-14: you will see it appears twice (in my case on page 1 and on page 2).
  5. Look for event ID 7012725-74: it cannot be found in the web UI (or via the HTTPS API).
  6. Look for event ID 7012725-14 in the CSV file: the record exists only once in the CSV file.
  7. Look for event ID 7012725-74 in the CSV file: the record exists in the CSV file.

In summary:
a. The number of records in the CSV and as reported in the UI are the same: 5655 records (at the time of the report).
b. There are NO duplicate records in the CSV file.
c. The "missing" records are found in the CSV file.
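
The same check can be reproduced programmatically. Below is a minimal sketch (not an official Subscan client) that walks the reward/slash history page by page and flags any event ID that shows up more than once. The endpoint path, request body shape, and the `event_index` field name are assumptions based on Subscan's public API conventions; adjust them if the actual API differs, and note that an API key header may be required depending on rate limits.

```python
import requests

# Assumed endpoint and parameters; verify against Subscan's API docs.
API_URL = "https://polkadot.api.subscan.io/api/scan/account/reward_slash"
ADDRESS = "1n4ufDK4XVpFdoejYrmVoQzadxU7pArCUR6eah2JRhfca65"
ROWS_PER_PAGE = 25

def fetch_all_event_ids():
    """Paginate through the reward/slash history, reporting duplicates."""
    seen = {}  # event ID -> page it was first seen on
    page = 0
    while True:
        resp = requests.post(
            API_URL,
            json={"address": ADDRESS, "row": ROWS_PER_PAGE, "page": page},
        )
        resp.raise_for_status()
        records = (resp.json().get("data") or {}).get("list") or []
        if not records:
            break  # ran past the last page
        for rec in records:
            event_id = rec["event_index"]  # e.g. "7012725-14"; field name assumed
            if event_id in seen:
                print(f"duplicate: {event_id} on pages {seen[event_id]} and {page}")
            else:
                seen[event_id] = page
        page += 1
    return set(seen)
```

Running this against the address above should surface 7012725-14 as a duplicate and never return 7012725-74 at all.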

Expected output

ALL records can be retrieved using API pagination, without gaps or missing records.

For my example above, I would expect to see the following record: event ID 7012725-74, block 7012725, time 2021-09-27 13:56:42, action staking(Reward), value 752.6099929845 DOT.

Actual output

Duplicate records and missing records when using pagination. This is easy to see when you compare the paginated results with the full data downloaded as a CSV.

In my example: 7012725-14 appears twice, and 7012725-74 is missing.
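
Continuing the sketch above, the paginated IDs can be cross-checked against the "Download all data" CSV export to list exactly which records fell into the gap. The CSV column name ("Event ID") and file name are assumptions; inspect the exported file's header to confirm before running.

```python
import csv

def csv_event_ids(path="reward-slash.csv"):
    """Collect event IDs from the exported CSV; column name assumed."""
    with open(path, newline="") as f:
        return {row["Event ID"] for row in csv.DictReader(f)}

api_ids = fetch_all_event_ids()          # from the sketch above
missing = csv_event_ids() - api_ids      # present in CSV, absent from pagination
print(f"{len(missing)} record(s) missing from paginated results: {sorted(missing)}")
```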

Additional factoids or references

No response

freehere107 commented 2 years ago

@lgmnemesis Thank you for your feedback, we will fix this issue as soon as possible

freehere107 commented 2 years ago

@lgmnemesis This problem has been fixed, thank you for your feedback

lgmnemesis commented 2 years ago

Thank you for the fix and for the update. It's working as expected now! Thanks again.