Open bajtos opened 9 months ago
Oh! We don't have any catch block, that's why we are not setting `end_at`. And I am guessing that PG or node-pg converts `null` to `Date(0)`.
See also https://github.com/filecoin-station/spark/issues/43#issuecomment-1832209200
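The missing catch block can be illustrated with a minimal sketch (names like `measureRetrieval` and the measurement fields are hypothetical, not the actual spark checker code): when the retrieval throws, the assignment of `end_at` is skipped unless it happens in a `finally` block that runs on both success and failure.

```javascript
// Hypothetical sketch: always record end_at, even when the retrieval fails.
// Without catch/finally, a thrown error skips the assignment, end_at stays
// null, and a null stored via node-pg could surface later as Date(0).
async function measureRetrieval (performRetrieval) {
  const measurement = { start_at: new Date(), end_at: null, status_code: null }
  try {
    const res = await performRetrieval()
    measurement.status_code = res.statusCode
  } catch (err) {
    // Failed retrieval - still a measurement we want to report.
    measurement.status_code = 'error'
  } finally {
    // Runs on both success and failure, so the duration of failed
    // retrievals is captured too.
    measurement.end_at = new Date()
  }
  return measurement
}
```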
> Figure out how to handle measurements with `end_at` set to `Date(0)` - they are clearly invalid, but why are we receiving them? Are they produced by fraudulent nodes?
This will be fixed by https://github.com/filecoin-station/spark-api/pull/160
At the moment, most retrievals (>99.99%) fail. We are not measuring the duration of failed retrievals, and therefore we don't know how many tasks an honest checker node can complete every round.
Let's start collecting that data.
- [ ] Modify the Spark checker to report the duration of failed retrievals too.
  Note: we should be collecting this data, but apparently some measurements come with an invalid `end_at` value. See https://github.com/filecoin-station/spark/issues/43#issuecomment-1832146304. Let's ensure `end_at` is always set correctly.
- [ ] Modify spark-evaluate to produce two retrieval stats: the duration of successful requests and the duration of all requests.
- [x] Figure out how to handle measurements with `end_at` set to `Date(0)` - they are clearly invalid, but why are we receiving them? Are they produced by fraudulent nodes?
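The two retrieval stats could be computed along these lines (a sketch only; `durationStats` and the measurement shape are assumptions, not the actual spark-evaluate code). Measurements whose `end_at` is `Date(0)` are treated as invalid and excluded before averaging:

```javascript
// Hypothetical sketch of the two retrieval stats: mean duration of
// successful requests vs mean duration of all valid requests.
// Measurements with end_at equal to Date(0) are dropped as invalid.
function durationStats (measurements) {
  const valid = measurements.filter(m =>
    m.end_at instanceof Date && m.end_at.getTime() > 0
  )
  const durations = valid.map(m => ({
    ms: m.end_at.getTime() - m.start_at.getTime(),
    ok: m.status_code === 200
  }))
  const mean = xs => xs.length
    ? xs.reduce((a, b) => a + b, 0) / xs.length
    : null
  return {
    successfulDurationMs: mean(durations.filter(d => d.ok).map(d => d.ms)),
    allDurationMs: mean(durations.map(d => d.ms))
  }
}
```

Keeping both numbers side by side makes it possible to estimate how many tasks a checker can finish per round even when nearly all retrievals fail.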