At least, it should normalise URLs that it's expanded from other sources. URLs we get from KerbalStuff can include characters (such as square brackets) that should be URL-encoded. If we write them to JSON files in unencoded form, then the spec validator shows great disapproval.
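The normalisation step could look something like this sketch (in Python, purely illustrative; the function name `normalise_url` is my own, not anything in the codebase) — percent-encode the path and query of an incoming URL so that characters like square brackets come out spec-clean:

```python
from urllib.parse import urlsplit, urlunsplit, quote

def normalise_url(url):
    # Split the URL, percent-encode the path and query (so e.g. "[" becomes
    # "%5B"), and reassemble. Characters in `safe` are left untouched; "%" is
    # kept safe so an already-encoded URL isn't double-encoded (a simplifying
    # assumption for this sketch).
    parts = urlsplit(url)
    path = quote(parts.path, safe="/%")
    query = quote(parts.query, safe="=&%")
    return urlunsplit((parts.scheme, parts.netloc, path, query, parts.fragment))

# Square brackets and spaces get encoded; everything else passes through.
print(normalise_url("https://kerbalstuff.com/mod/foo [bar]/download"))
# → https://kerbalstuff.com/mod/foo%20%5Bbar%5D/download
```

This keeps us lenient on input (we accept the raw URL as-is) while emitting only encoded forms to the JSON files.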
Our client seems to be fine reading these files, but we want to be strict with our outputs and lenient with our inputs, and I really don't like to see Travis cry.
Pinging @Felger and @Ippo343 on this, as the expanded UniversalStorage files are getting caught by this. This isn't a fault with the originating NetKAN files themselves; it's that the data we're reading isn't quite as clean as our validation tools would like.