Layoric closed this issue 9 years ago
I guess that's what we get for improving the results from the default Gazetteer ones :)
The problem, as I think you're aware, is that many of the returned results are subtly different: the geographic centre of "Perth", the administrative boundary, the town boundary, etc. But since we don't expose any of that extra detail to the user, we may as well just pick the first (i.e., highest-priority) entry in the list and discard all other duplicates.
I've been meaning to ask: is the data they serve available for download somewhere?
@stevage no good deed and all that.. Personally, I think what you've done with improving the UX so far has been good :+1: , considering I don't believe the web service is a good fit for providing decent, weighted results based on commonly used places etc. That said, I think the data itself is quite extensive.
@nahidakbar I'll try and find out for sure, but I think the data behind the Solr instance is a newer version of the Gazetteer 2010 data found on data.gov.au. It may be the same data, but if not, I'm not sure when data.gov.au is scheduled to be updated.
I think removing "duplicates" based on lat/long and name would probably catch a large amount of the listed duplicates for the user.
Something along the lines of this (not tested, and could be written more cleanly):
var unfilteredResults = solrQueryResponse.response.docs;
var filteredResults = [];

// Keep only the first occurrence of each name/location pair; results
// arrive in priority order, so the first one is the best candidate.
for (var i = 0; i < unfilteredResults.length; i++) {
    var duplicateFound = false;
    for (var j = 0; j < filteredResults.length; j++) {
        if (filteredResults[j].name === unfilteredResults[i].name &&
            filteredResults[j].location === unfilteredResults[i].location) {
            duplicateFound = true;
            break;
        }
    }
    if (!duplicateFound) {
        filteredResults.push(unfilteredResults[i]);
    }
}

var results = filteredResults.sort(
    function (a, b) { return sortResults(a, b, searchText); });

// Show at most the top ten de-duplicated results.
results.slice(0, 10).forEach(function (result) {
    that.searchResults.push(new SearchResultViewModel({
        name: result.name + (result.state_id !== 'N/A' ? ' (' + result.state_id + ')' : ''),
        isImportant: !!result.feature_code.match('^(CNTY|CONT|DI|PRSH|STAT|LOCB|LOCU|SUB|URBN)$'),
        clickAction: createZoomToFunction(that, result)
    }));
});
Would probably be enough.
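For what it's worth, the nested loop above could also be written as a single filter pass keyed on name + location, which avoids the O(n²) scan. This is only a sketch in the same ES5 style as the snippet, assuming the same `name` and `location` fields; it hasn't been tested against the real Solr response shape:

```javascript
// Drop any doc whose name/location key has already been seen.
// Results arrive in priority order, so the first (best) one is kept.
function dedupeByNameAndLocation(docs) {
    var seen = {};
    return docs.filter(function (doc) {
        var key = doc.name + '|' + doc.location;
        if (seen.hasOwnProperty(key)) {
            return false; // a higher-priority result with this key was already kept
        }
        seen[key] = true;
        return true;
    });
}
```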
@stevage Any update for this issue of displaying (what appear to the user as) duplicates?
Thanks for the reminder :) Followed your suggestion. Not perfect, but better.
@stevage Thanks!
Hi Team,
I've been asked to raise an issue about duplicate results in the "Official Place Names" search. "Wagga Wagga" is one example; "Perth" is another, returning three results with the same name and coordinates.
I understand these duplicates are due to the GA web service exposing a dataset that is probably not suited to searching common locations within NM, but would it be possible to filter results on the client to remove entries with the same name and coordinates?
Since you are only displaying the name and the click action is to zoom, this should at least make it perceived as removing duplicates. I can see that you are already doing some client side manipulation of the result to better handle data quality/applicability issues from the web service.
There will be some results that have the same name and only slightly different coordinates, but this will at least improve results for users.
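Those near-duplicates could also be caught by rounding the coordinates before comparing. A rough sketch, treating results as duplicates when the name matches and the coordinates agree to about two decimal places (~1 km); the numeric `lat`/`lng` field names here are hypothetical, and the real service may encode location as a single string that would need parsing first:

```javascript
// Build a key from the name plus coordinates rounded to 2 decimal
// places, so nearby points collapse onto the same key.
function nearDuplicateKey(result) {
    return result.name + '|' +
        result.lat.toFixed(2) + ',' + result.lng.toFixed(2);
}

// Keep the first result for each key; later near-duplicates are dropped.
function removeNearDuplicates(results) {
    var seen = {};
    return results.filter(function (r) {
        var key = nearDuplicateKey(r);
        if (seen.hasOwnProperty(key)) {
            return false;
        }
        seen[key] = true;
        return true;
    });
}
```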