Closed: LukePrior closed this issue 1 year ago
Created: https://lukeprior.github.io/nbn-upgrade-map/sitemap.xml
Very quick and dirty script to generate the file, which I ran through a formatter:
suburbs_res = suburbs.read_all_suburbs()
urls = []
for state, suburb_list in suburbs_res.items():
    for suburb in suburb_list:
        process_name = suburb.name.replace(" ", "-").lower()
        urls.append(f"https://lukeprior.github.io/nbn-upgrade-map/?suburb={process_name}&state={state}")

# The XML declaration must be the very first thing in the file (no leading newline).
start = '<?xml version="1.0" encoding="UTF-8"?>\n'
start += '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
for url in urls:
    start += f"<url><loc>{url}</loc></url>"
start += "</urlset>"
# Ampersands in the query strings must be escaped or the XML is invalid.
start = start.replace("&", "&amp;")
with open("site/sitemap.xml", "w") as file:
    file.write(start)
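As a quick sanity check, the output can be parsed with the standard library; this is a sketch against a small inline sample in the same shape the script emits, not the real generated file:

```python
import xml.etree.ElementTree as ET

# Two-entry sample in the shape the script emits; note the escaped
# "&amp;" in the query strings -- a raw "&" would be a parse error.
sample = (
    '<?xml version="1.0" encoding="UTF-8"?>'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
    '<url><loc>https://lukeprior.github.io/nbn-upgrade-map/?suburb=sydney&amp;state=NSW</loc></url>'
    '<url><loc>https://lukeprior.github.io/nbn-upgrade-map/?suburb=melbourne&amp;state=VIC</loc></url>'
    '</urlset>'
)

root = ET.fromstring(sample)  # raises ParseError on malformed XML
ns = "{http://www.sitemaps.org/schemas/sitemap/0.9}"
locs = [url.find(f"{ns}loc").text for url in root.findall(f"{ns}url")]
print(len(locs))  # 2
```

Running this against the real site/sitemap.xml (via ET.parse) would catch the most likely failure mode, unescaped ampersands.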
If I get around to productionising this, I'll take the opportunity to add a lastmod value and update it whenever the underlying data changes.
Not sure if Google will actually index the pages, but it now knows they exist.
Looks like Google has indexed all the pages; will see if search traffic changes.
It has gotten through about 2/3 of them so far.
Useful for Google as it won't index all the different suburbs otherwise.
Can initially be done manually, as suburbs won't change often; in future, create production-ready code and a GitHub Action.