Hi @planemad, thanks! This is already possible to some extent: https://github.com/OpenRefine/OpenRefine/wiki/Reconciliation#comparing-values
The issue is that we are limited by the reconciliation API: it does not let the server return the matching scores for individual properties, only one global matching score per query. However, once the matching is done, it is straightforward to retrieve the coordinates of the matched items (especially with the data extension dialog), and compute the distance in OpenRefine.
I agree the user story you have would be ideal, but it requires some changes on OpenRefine's side (which we should totally consider! it will just take more time to agree on an improved protocol, and so on).
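To make the last step concrete, here is a minimal sketch in plain Python (column names and coordinate values are hypothetical; the same computation could also be written as a Jython expression inside OpenRefine) of computing the distance once the matched item's P625 coordinates have been fetched with the data extension dialog:

```python
# Minimal sketch, not OpenRefine code: great-circle distance between the
# source row's coordinates and the coordinates fetched for the matched item.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Distance in kilometres between two WGS84 points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

# e.g. source centroid vs. coordinates of the matched Wikidata item
print(haversine_km(48.8566, 2.3522, 48.8534, 2.3488))  # ~0.4 km
```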
Thanks for the explanation and the useful documentation @wetneb. I did not know we could set the columns for matching with P625@lat and P625@lng.
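For anyone finding this later, a rough sketch of the query this corresponds to on the wire (the endpoint URL, the QID and the coordinates below are illustrative, not taken from this thread):

```python
# Sketch of a reconciliation query that passes lat/lng columns as the
# virtual properties P625@lat / P625@lng.
import json
import requests

ENDPOINT = "https://wikidata.reconci.link/en/api"  # assumed endpoint, adjust as needed

queries = {
    "q0": {
        "query": "Charles de Gaulle Airport",
        "type": "Q1248784",  # airport
        "properties": [
            {"pid": "P625@lat", "v": 49.0097},
            {"pid": "P625@lng", "v": 2.5479},
        ],
    }
}

response = requests.post(ENDPOINT, data={"queries": json.dumps(queries)})
for candidate in response.json()["q0"]["result"]:
    print(candidate["id"], candidate["name"], candidate["score"], candidate["match"])
```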
Currently a score of 0 is reached when the points are 1 km away from each other.
As a first step it would be great if the user could tune this threshold, since it varies with the type of dataset. For example, when matching city locations the centroid could differ from Wikidata's by over 5 km and still be a valid match; for a mountain range it could be 100 km, and so on.
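Purely as a sketch of what "user-tunable" could mean (only the 1 km cutoff is stated above; the linear decay below is an assumption about the scoring shape):

```python
def coordinate_score(distance_km, threshold_km=1.0):
    """100 for an exact coordinate match, decaying to 0 at threshold_km and beyond."""
    if distance_km >= threshold_km:
        return 0.0
    return 100.0 * (1.0 - distance_km / threshold_km)

print(coordinate_score(0.3))                    # ~70 with the current 1 km behaviour
print(coordinate_score(4.0, threshold_km=5.0))  # still scores for a city centroid
```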
@planemad yes, I agree this should be user-defined… but how? Essentially, the problem here is that all the matching heuristics are server-side, while we would need finer control over them client-side, and the reconciliation API does not expose any such control…
@wetneb do you know where the code is that only scores matches up to 1 km?
The only idea I can think of is to store a mapping from common instance types to an acceptable distance threshold for a match, e.g. something like the sketch below:
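(Threshold values here are entirely hypothetical.)

```python
# Hypothetical acceptable-distance thresholds keyed by instance-of (P31) class.
DISTANCE_THRESHOLD_KM = {
    "Q1248784": 2,    # airport
    "Q515": 5,        # city: centroids can be a few km apart
    "Q46831": 100,    # mountain range
}
DEFAULT_THRESHOLD_KM = 1  # fall back to the current behaviour
```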
Tbh, for most general cases 1 km would work fine. This is only an issue for features larger than 1 km, so maybe the use case is too limited.
@planemad ideally I'd prefer not to hard-code this in the reconciliation interface. AFAICT the ideal situation would be for people to use the dedicated "precision" field in Wikibase in a meaningful way with regard to the object they are representing. But unfortunately this precision field is not really used yet (it is poorly exposed in the UI and in the SPARQL query service).
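For reference, when the precision has been set it is visible in the entity JSON; a rough sketch of turning it into a distance threshold (the degrees-to-km conversion is a crude approximation, and Q90 is just an example item):

```python
# Sketch: read the precision of an item's P625 claim and convert it to km.
import requests

def coordinate_precision_km(qid):
    url = f"https://www.wikidata.org/wiki/Special:EntityData/{qid}.json"
    entity = requests.get(url).json()["entities"][qid]
    claims = entity["claims"].get("P625", [])
    if not claims or claims[0]["mainsnak"].get("snaktype") != "value":
        return None
    value = claims[0]["mainsnak"]["datavalue"]["value"]
    precision_deg = value.get("precision")  # may be null for some items
    if precision_deg is None:
        return None
    return precision_deg * 111.0            # ~111 km per degree of latitude

print(coordinate_precision_km("Q90"))       # Paris
```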
Closing this; the original issue of matching using coordinates is already documented.
Agree that either 'precision' or some other property on Wikidata should indicate to external services how to set a suitable proximity threshold. This is out of scope for this API.
This is such an amazing project that really helps so many other datasets to connect to Wikidata! Thank you @wetneb 🙇
I'm using the tool to reconcile the Natural Earth dataset of public domain map data to Wikidata. This worked really well for countries, with 100% matches (https://github.com/nvkelso/natural-earth-vector/issues/224#issuecomment-340794184), but while matching airports by name/IATA code there are multiple candidate matches, partly because of wrong data in Wikidata itself.
In such cases of geographical datasets it would be really useful to leverage the coordinate information in the source dataset (usually as X=lon, Y=lat columns) to suggest the best match when possible.

User story