Closed hiroshige-g closed 5 years ago
Also, non-fetch-scheme URLs are considered as bare specifiers (e.g. `foobar://something`).
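A minimal sketch of the classification behavior being quoted, assuming the WHATWG `URL` parser and a hand-written fetch-scheme list (the list itself is my assumption, not taken from the spec text):

```javascript
// Sketch only: classify a specifier the way the quoted behavior describes.
// A string that fails URL parsing is bare; a string that parses but has a
// non-fetch scheme (e.g. foobar:) is *also* treated as bare.
const fetchSchemes = new Set(["http:", "https:", "data:", "blob:", "file:", "about:"]);

function classifySpecifier(specifier) {
  let url;
  try {
    url = new URL(specifier);
  } catch {
    return "bare"; // not parseable as a URL at all
  }
  return fetchSchemes.has(url.protocol) ? "url" : "bare";
}

console.log(classifySpecifier("foobar://something"));   // "bare" despite parsing as a URL
console.log(classifySpecifier("https://example.com/")); // "url"
```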
I'm torn on this.
As discussed in #101, I really like the simplicity of invalid URL = bare specifier, and allowing universal mapping.
However, you bring up an interesting new point about the space of "something that looks like a URL to a human".
And, we have the always-applicable principle that it's easier to throw, and loosen restrictions over time, than it is to start out unrestricted and try to tighten them later.
So, I guess I could be on board with some restrictions to start with.
Do you or @guybedford have concrete suggestions for such restrictions? I agree with his comment that you cite that restricting to ASCII is not great. Should we go with a blocklist, composed mainly of punctuation?
I don't think this realistically affects users who use these on the LHS of mappings, though: if something is a URL it gets normalized as a URL, and if it is not a URL, it does not get normalized as one. In both cases the mappings are still applied, so from a user perspective things will work out fine.
If we had very different algorithmic behaviours beyond this just affecting normalization, I agree that would be more of a problem, but we still get those RHS errors.
In addition, allowing users to map other URL types is important: if one browser implements a new protocol or standard module that others don't, users can still map it.
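As a hypothetical illustration of that use case (the module name and polyfill path here are invented, not from the thread): a browser-provided `std:`-scheme module could be mapped to a polyfill in browsers that don't implement it.

```json
{
  "imports": {
    "std:kv-storage": "/polyfills/kv-storage.mjs"
  }
}
```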
It seems like there is not a lot of appetite for this change, so I'll close this. Let me know if I'm misreading.
Invalid absolute URLs, such as `https://` or `https://:invalid`, are considered as bare specifiers, according to the current draft spec and the Chromium implementation. Test: https://chromium-review.googlesource.com/c/chromium/src/+/1672232

At first glance, this looked unexpected to me (I expected `https://:invalid` to be an error, not a bare specifier).

On second thought, this seemed a natural consequence, because bare specifiers are defined as "something that cannot be parsed as a URL". So many strings that look like URLs to a human, but aren't URLs in the strict sense, are considered bare specifiers, and we don't have a definition of "something that looks like a URL".
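The two examples above can be checked directly against the WHATWG `URL` parser; both fail to parse, which is why they fall through to bare specifiers under the draft behavior described here (the helper name is mine, for illustration):

```javascript
// Returns true if the WHATWG URL parser accepts the string as an absolute URL.
function parsesAsURL(specifier) {
  try {
    new URL(specifier);
    return true;
  } catch {
    return false;
  }
}

console.log(parsesAsURL("https://"));             // false: empty host is invalid for https
console.log(parsesAsURL("https://:invalid"));     // false: "invalid" is not a valid port
console.log(parsesAsURL("https://example.com/")); // true
```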
But this might still be confusing in terms of the readability of code that uses import maps.
I don't expect there are valid use cases for pseudo-URL specifiers like `https://` or `https://:invalid`, so I'm wondering if we should limit the range/format of bare specifiers further to exclude pseudo-URLs. For example, the pattern in https://github.com/WICG/import-maps/issues/101#issuecomment-460046963 would exclude URL-like strings (by not allowing `:` and limiting `/`).

WDYT?
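To make the idea concrete, here is a deliberately crude sketch of such a restriction. This is NOT the exact pattern from the linked comment, just an illustration of the shape of the check: disallowing `:` in bare specifiers makes URL-like strings ineligible.

```javascript
// Hypothetical stricter check: a bare specifier may not contain ":",
// so anything scheme-like ("https://", "foobar://x") is rejected rather
// than silently treated as bare.
function isAllowedBareSpecifier(specifier) {
  return specifier.length > 0 && !specifier.includes(":");
}

console.log(isAllowedBareSpecifier("lodash"));           // true
console.log(isAllowedBareSpecifier("https://"));         // false
console.log(isAllowedBareSpecifier("https://:invalid")); // false
```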