Description
In unicode-math v0.8k, when using `range=…` to take some symbols from another font (where `range` does not include `\sqrt`), `\sqrt{x}` uses the correct root symbol, but `\sqrt[n]{x}` uses the symbol from a different font, producing inconsistent/broken output. As usual, adding a font declaration with the main font at the end (e.g. `\setmathfont{latinmodern-math.otf}[range=\int]`) causes both commands to use the same (correct) root symbol. It seems that only LuaTeX is affected.

(I’m not sure whether this is related to #423 or the discussion in https://tex.stackexchange.com/q/364310.)
Check/indicate

- Minimal example demonstrating the issue
- Further details
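A minimal example along these lines might look as follows. This is a sketch, not the reporter’s original file: the secondary font (XITS Math) and the document body are assumptions chosen to illustrate the setup described above; compile with LuaLaTeX.

```latex
% Sketch of a minimal example (assumed reconstruction, compile with lualatex).
% The secondary font and range are illustrative; any range not covering \sqrt
% should reproduce the mismatch described in the report.
\documentclass{article}
\usepackage{unicode-math}
\setmathfont{latinmodern-math.otf}
\setmathfont{XITSMath-Regular.otf}[range=\int] % range does not include \sqrt
\begin{document}
$\sqrt{x}$ versus $\sqrt[n]{x}$
% Expected: both radicals use the Latin Modern root symbol.
% Observed (LuaTeX): \sqrt[n]{x} picks up the radical from the other font.
\end{document}
```

Appending `\setmathfont{latinmodern-math.otf}[range=\int]` after the second declaration works around the problem, as noted above.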
Output with `\setmathfont{latinmodern-math.otf}[range=\int]` at the end: [screenshot]

Output without `\setmathfont{latinmodern-math.otf}[range=\int]` at the end: [screenshot]