You could generate some statistics on the URLs in a second phase, like the prevalence of domains. Based on that list, new properties could be proposed, domains could be whitelisted, etc. --Azertus (talk) 13:48, 19 July 2021 (UTC)
[x] aggregate all URLs from links validation
[x] rank them in descending order of Web domain frequency (see the sketch after this list)
[x] post them in the monthly report on Meta
[x] elicit feedback in the Wikidata chat & mailing list
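A minimal sketch of the aggregation and ranking steps, assuming the URLs gathered by links validation are dumped to a plain-text file, one per line (`urls.txt` is a placeholder name, not an actual output of the pipeline):

```python
# Rank Web domains by frequency over a plain-text dump of URLs.
from collections import Counter
from urllib.parse import urlparse

# Hypothetical input file: one URL per line, as collected by links validation
with open('urls.txt') as f:
    urls = [line.strip() for line in f if line.strip()]

# Count domains and print them in descending order of frequency
domains = Counter(urlparse(url).netloc for url in urls)
for domain, count in domains.most_common():
    print(f'{count}\t{domain}')
```

The resulting ranking is what would feed the monthly report on Meta and the proposals for new properties or whitelisted domains.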
Following the thread at https://www.wikidata.org/wiki/Wikidata:Project_chat/Archive/2021/07#Item_validation_criteria, let's split IDs from bare URLs (see also #399):