Since building the dashboard for analyzing API health, I noticed that many place name geocodes that do not match a grid are simple misspellings or they contain the state.
I propose that the API use Levenshtein distance, a trie, FuzzySharp, Levenshtypo, or something similar to fuzzy match place names that are off by a character or two. We need to make sure the values still resolve to the intended key and that the fuzzy step introduces no new errors. One approach: try an exact key lookup first, and fall back to the fuzzy search only on a miss.
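A minimal sketch of the lookup-then-fallback idea (in Python for illustration; the actual service would presumably use FuzzySharp or Levenshtypo in .NET). The `resolve_place` helper and the ambiguity guard are my assumptions, not existing code: it returns `None` when the closest match is too distant or when two names tie, so the fuzzy step can't silently map to the wrong grid.

```python
def levenshtein(a: str, b: str) -> int:
    # Classic dynamic-programming edit distance.
    if len(a) < len(b):
        a, b = b, a
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

def resolve_place(name: str, places: dict, max_dist: int = 2):
    """Exact key lookup first; fuzzy fallback only on a miss.

    Returns None when no candidate is within max_dist, or when two
    candidates tie, so no wrong grid is introduced by the fuzzy step.
    """
    key = name.strip().lower()
    if key in places:                 # fast path: exact hit
        return places[key]
    scored = sorted((levenshtein(key, k), k) for k in places)
    best_dist, best_key = scored[0]
    if best_dist > max_dist:
        return None                   # nothing close enough
    if len(scored) > 1 and scored[1][0] == best_dist:
        return None                   # ambiguous: two equally close names
    return places[best_key]
```

With a lookup like `{"provo": "G1", "orem": "G2"}`, a misspelled `"provvo"` resolves to `"G1"`, while an unrecognizable string falls through to `None` instead of guessing.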
Many requests carry "ut" or "utah" in the zone, which fouls up the address system mapping. Stripping these common state tokens should improve address system mapping hits.
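Something along these lines could normalize the zone before the mapping lookup. The `normalize_zone` name and the token list are hypothetical, a sketch of the idea rather than existing code:

```python
import re

# Assumed set of state tokens to strip; extend as other variants show up.
STATE_TOKENS = {"ut", "utah"}

def normalize_zone(zone: str) -> str:
    # Split on whitespace/commas, drop state tokens, rejoin the rest.
    words = re.split(r"[\s,]+", zone.strip())
    kept = [w for w in words if w.lower() not in STATE_TOKENS]
    return " ".join(kept)
```

So `"Provo, UT"` becomes `"Provo"` and zones without a state token pass through unchanged.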
I benchmarked Levenshtein matching against the full place names list at 91.01 µs (about 0.09 ms), which is plenty fast. My suggestion would be to start small, with a maximum edit distance of 4 or under.