Closed: simonbyrne closed this 1 year ago
What happens when someone accesses the help in the REPL or in VSCode? Does the docstring expand somehow?
At the moment it will be empty, which is not ideal. I can change it so that it displays the function name (same as how `@ref` links are displayed).
Hmm, I was wrong. It looks like only `@ref` tags are rendered as such. Any other tags will be rendered the same as HTTP links, e.g. `[fname](@doxygen)` will be rendered as `fname (@doxygen)`:
```
help?> HDF5.API.h5d_get_chunk_info
  h5d_get_chunk_info(dataset_id::hid_t, fspace_id::hid_t, index::hsize_t, offset::Ptr{hsize_t}, filter_mask::Ptr{Cuint}, addr::Ptr{haddr_t}, size::Ptr{hsize_t})

  External links
  ≡≡≡≡≡≡≡≡≡≡≡≡≡≡≡≡

    • (@doxygen /tagfile/compound[name='H5D']/member[name='H5Dget_chunk_info']).
```
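For illustration, here is a minimal sketch (not the actual implementation) of how such a tag could be resolved into a URL, assuming EzXML.jl and the usual Doxygen tag-file layout in which each `<member>` carries `<anchorfile>` and `<anchor>` children; the base URL below is also an assumption.

```julia
using EzXML

# Assumed base of the published HDF5 Doxygen docs; the real value may differ.
const DOXYGEN_BASE = "https://docs.hdfgroup.org/hdf5/develop/"

function resolve_doxygen(tagfile::AbstractString, xpath::AbstractString)
    doc = readxml(tagfile)
    member = findfirst(xpath, root(doc))
    member === nothing && error("no member matched $xpath")
    # Doxygen tag files typically give each <member> an <anchorfile> and an <anchor>.
    anchorfile = nodecontent(findfirst("anchorfile", member))
    anchor     = nodecontent(findfirst("anchor", member))
    return string(DOXYGEN_BASE, anchorfile, "#", anchor)
end

resolve_doxygen("hdf5.tag",
    "/tagfile/compound[name='H5D']/member[name='H5Dget_chunk_info']")
```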
I think we should make URL docstrings usable from the REPL, and the only way I know how to do that is to interpolate the URLs in at generation time. While we could dynamically load the tag file at package init, I don't see a clear advantage to doing that over just statically including the URL in the docstring. One refinement might be to change the URLs to match the version of the HDF5 library loaded. The URLs should be accessible beyond just the Documenter.jl-generated docs.
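A rough illustration of the generation-time option, reusing the `resolve_doxygen` sketch above (the docstring text here is an assumption, not the actual output of `gen/gen_wrappers.jl`):

```julia
# Hypothetical generation-time interpolation: the resolved URL is baked into the
# emitted docstring, so no tag file or XML parser is needed at runtime.
url = resolve_doxygen("hdf5.tag",
    "/tagfile/compound[name='H5D']/member[name='H5Dget_chunk_info']")

docstring = """
    h5d_get_chunk_info(dataset_id, fspace_id, index, offset, filter_mask, addr, size)

See the [HDF5 C API documentation]($url) for `H5Dget_chunk_info`.
"""
```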
I do like the xpath approach though.
Do not forget about these URLs as well: https://github.com/JuliaIO/HDF5.jl/blob/2796b08741b6c5fbfda12435b3733e78abe19abc/gen/gen_wrappers.jl#L125
Some symbols are not being resolved in the H5F, H5P, and H5T categories. We might want to just do `//member[name='$cfuncname']`.
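For example, a compound-agnostic lookup along those lines might look like the following (the symbol name is just illustrative):

```julia
using EzXML
doc = readxml("hdf5.tag")
cfuncname = "H5Pset_alloc_time"   # illustrative; substitute whichever symbol fails to resolve
member = findfirst("//member[name='$cfuncname']", root(doc))
```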
Here we are doing a new query for each function that we are trying to look up. This means that we are likely O(N^2) in the number of XML elements.

My current thinking here is that we should just do a single XPath query that lets us iterate over the members once and build a lookup table, like I do in #1043. Basically, refactor my for loop into a single iteration over a `findall` call with an XPath.
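A sketch of that single-pass approach, again assuming EzXML.jl: one XPath query, one iteration over every `<member>` in the tag file, and O(1) Dict lookups afterwards (the function name and query string are illustrative, not taken from #1043).

```julia
using EzXML

function build_member_table(tagfile::AbstractString)
    doc = readxml(tagfile)
    table = Dict{String,EzXML.Node}()
    for member in findall("//compound/member", root(doc))
        name = findfirst("name", member)
        name === nothing && continue
        get!(table, nodecontent(name), member)  # keep the first match if names repeat
    end
    return table
end

members = build_member_table("hdf5.tag")
haskey(members, "H5Dget_chunk_info")
```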
We should check in the hdf5.tag file, but not the intermediate lookup tables. It should be possible to regenerate the package from the git repo alone, without an active Internet connection.
I'm still not seeing the advantage of tying the URL construction to Documenter, especially because it would be useful to have the URLs available outside of the Documenter-generated docs.
I think you're right, there's probably not much to be gained in this case. The main thing I wanted to avoid was making the XML parser an actual package dependency.
The only downside with doing it in the `gen` scripts is that you can't use it elsewhere (e.g. link to sections of the manual). Not sure of the best way to do that.
We could retain a file, such as the group URLs linked below, in git. We could then load this into a global Dict at the top level during precompile time, interpolate the needed values into the docstrings, and then empty the Dict at the end of precompilation.
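A minimal sketch of that precompile-time scheme; the file name, format, and helpers below (`doxygen_urls.tsv`, `_load_doxygen_urls!`) are hypothetical and not part of the package.

```julia
# Hypothetical layout: "doxygen_urls.tsv" (name<TAB>url per line) checked into git
# next to hdf5.tag.
const _DOXYGEN_URLS = Dict{String,String}()

function _load_doxygen_urls!()
    for line in eachline(joinpath(@__DIR__, "doxygen_urls.tsv"))
        name, url = split(line, '\t')
        _DOXYGEN_URLS[name] = url
    end
    return _DOXYGEN_URLS
end

_load_doxygen_urls!()   # runs at top level, i.e. during precompilation

# Docstrings interpolate the URL, so the link is baked into the precompiled module:
"""
    h5d_get_chunk_info(...)

See [`H5Dget_chunk_info`]($(_DOXYGEN_URLS["H5Dget_chunk_info"])) in the HDF5 C documentation.
"""
function h5d_get_chunk_info end

empty!(_DOXYGEN_URLS)   # drop the table once all docstrings have been built
```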
My exploratory diff to this branch is currently here: https://github.com/JuliaIO/HDF5.jl/compare/sb/doxygen-tag...mkitti:HDF5.jl:mk/doxygen-tag?expand=1
One thing about the XPath approach is that we do end up iterating over the tags more than once, since each query starts a new iteration. It's not a very big deal though, since performance is not a huge consideration here.
Merged #1043 over this one. We should revisit this later though, especially with a pure-Julia XML parser. See #1043 comments.
This was my effort at #1029, as an alternative to #1043. It adds a pass to Documenter that expands `@doxygen` tags based on XML XPaths. Still not quite complete, as the XPaths are a bit complicated.
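For reference, here is a hedged sketch of the kind of substitution such a pass performs. This is not the actual Documenter integration, just the link-target rewrite expressed as a plain function over markdown text, reusing the `resolve_doxygen` sketch from earlier in this thread.

```julia
function expand_doxygen_links(markdown::AbstractString, tagfile::AbstractString)
    rx = r"\]\(@doxygen\s+([^)]+)\)"
    substitute = function (s)
        xpath = match(rx, s).captures[1]
        # Replace the `@doxygen <xpath>` link target with the resolved URL.
        return "](" * resolve_doxygen(tagfile, xpath) * ")"
    end
    return replace(markdown, rx => substitute)
end

expand_doxygen_links(
    "[h5d_get_chunk_info](@doxygen /tagfile/compound[name='H5D']/member[name='H5Dget_chunk_info'])",
    "hdf5.tag")
```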