Closed: digama0 closed this 2 years ago
@digama0 did you check this website generator? https://github.com/tirix/metamath-web
I use it to generate the content here: http://metamath.tirix.org:3030/mpests/itgsplit
I tried to advertise it several times on the mailing list and in some related GitHub issues, maybe I was not vocal enough!
If we have a `$t` comment parser, I (or you) can extend it to generate mpeuni pages.
Or maybe you're thinking about a static website generator?
Ah, that website generator is nicer than I remember. I'm actually going more for a carbon copy of metamath.exe's website generator, which is why I need $t parsing. Are you interested in extending the STS website to include everything else on metamath.org (or at least the autogenerated parts of it)? I'm actually about halfway done with it already, but I haven't decided where to put it yet, and I'm happy to contribute it to an existing project if that's the way it goes.
It should be very quick to add Unicode typesetting since you've already done all the hard work parsing the `$t` comment! I'll see if I can do that.
Of course contributions are welcome! I think it would be best if we collaborate on the same project rather than spread on several concurrent ones.
If you prefer generating static pages rather than serving them, I suppose the same project can be used, too, with only minor changes.
I've pushed 2 more commits (3 counting a `cargo fmt`):

- one with a change of API, using `HashMap` instead of `Vec` for the HTML tokens. I do believe that we need this kind of API (I thought this was in my comment but it seems to have slipped away). I've only introduced a getter for `alt_html_defs` and left `alt_html_defs` itself public, but I think ultimately it should become private.
I've now updated the `metamath-web` project with "unicode" typesetting.
The changes are in PR #7, waiting for this PR to be merged.
The online site is already updated with the changes:
I have a working prototype of a metamath theorem page generator in the style of `metamath.exe` using this stuff (plus some more changes to this repo to be PR'd soon). https://github.com/digama0/mm-web-rs
> - one with a change of API, using `HashMap` instead of `Vec` for the HTML tokens. I do believe that we need this kind of API (I thought this was in my comment but it seems to have slipped away). I've only introduced a getter for `alt_html_defs` and left `alt_html_defs` itself public, but I think ultimately it should become private.
While I suppose it's fine to have it in a hashmap, the intended use was for the user to process the list into their own data structure. See for example https://github.com/digama0/mm-web-rs/blob/10fdbc2d/src/main.rs#L84 . I wouldn't want to make the hashmap private (at least not with the `get` API alone) because then you wouldn't be able to iterate over all the declarations.
I'm okay with merging this now and tackling the follow up stuff in another PR.
> I have a working prototype of a metamath theorem page generator in the style of `metamath.exe` using this stuff (plus some more changes to this repo to be PR'd soon). https://github.com/digama0/mm-web-rs
Ok, so no common project... Anyway, I find it very interesting that you reproduced there a part of the "verify proof" logic to display the proofs. This reminded me of an earlier attempt I made to "abstract out" the process of going through the proof steps. I've posted that change in #63; I'll try to explain more about it in the PR comments.
> While I suppose it's fine to have it in a hashmap, the intended use was for the user to process the list into their own data structure. See for example https://github.com/digama0/mm-web-rs/blob/10fdbc2d/src/main.rs#L84 . I wouldn't want to make the hashmap private (at least not with the `get` API alone) because then you wouldn't be able to iterate over all the declarations.
I see. It's probably a matter of taste, but I would rather see the API provide separately a `get` and e.g. an iterator over all symbols, rather than direct public access to a `Vec` which still needs to be post-processed. For me the lookup service has to be provided by the library, and the way it is implemented (`HashMap`, `Vec`, or something else) is an implementation detail which should stay private and hidden.
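As a sketch of that design question, here is one way the accessor-based API could look. All names here are illustrative stand-ins, not metamath-knife's actual API: the backing `HashMap` stays private, while `get` and an iterator cover both point lookup and full enumeration, addressing the concern that a `get`-only API cannot enumerate the declarations.

```rust
use std::collections::HashMap;

/// Hypothetical sketch of the accessor-based design discussed above:
/// the backing map is private, and the library exposes both a point
/// lookup and an iterator. Names are illustrative, not the real API.
pub struct TypesettingData {
    alt_html_defs: HashMap<String, String>,
}

impl TypesettingData {
    pub fn new() -> Self {
        Self { alt_html_defs: HashMap::new() }
    }

    pub fn insert(&mut self, token: &str, html: &str) {
        self.alt_html_defs.insert(token.to_string(), html.to_string());
    }

    /// Point lookup; hides whether the storage is a `HashMap` or a `Vec`.
    pub fn get(&self, token: &str) -> Option<&str> {
        self.alt_html_defs.get(token).map(String::as_str)
    }

    /// Full enumeration, so callers can still process every declaration
    /// into their own data structures.
    pub fn iter(&self) -> impl Iterator<Item = (&str, &str)> {
        self.alt_html_defs.iter().map(|(k, v)| (k.as_str(), v.as_str()))
    }
}

fn main() {
    let mut data = TypesettingData::new();
    data.insert("->", "&rarr;");
    assert_eq!(data.get("->"), Some("&rarr;"));
    assert_eq!(data.get("missing"), None);
    assert_eq!(data.iter().count(), 1);
}
```

This keeps the storage choice swappable while still letting clients like mm-web-rs build their own structures by iterating.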
> Ok, so no common project...
Like I said, I haven't yet decided where this code should ultimately end up. I want metamath-knife to fulfill our original aspirations for a community verifier that can subsume everything metamath.exe does today, but the way this repo is developing it seems more like a backend library, which means we need a suitable frontend which will inherit all the ugly stuff from metamath.exe.
> Anyway, I find it very interesting that you reproduced there a part of the "verify proof" logic to display the proofs. This reminded me of an earlier attempt I made to "abstract out" the process of going through the proof steps. I've posted that change in #63; I'll try to explain more about it in the PR comments.
What part are you referring to? At first I thought I would have to process compressed letters myself, but I saw that `ProofTreeArray` provides a decent interface already. The part about crawling the tree in the right order and only visiting non-syntax steps seems reasonable to relegate to the client code.

`approximate_clique_cover()` (or even `exact_clique_cover`) could totally be a part of metamath-knife, though.
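For illustration only, here is roughly what that client-side crawl could look like on a toy, array-backed tree. The `Step` struct and its fields below are hypothetical stand-ins, not `ProofTreeArray`'s real API:

```rust
/// Toy, array-backed proof tree in the spirit of `ProofTreeArray`;
/// the `Step` struct and its fields are hypothetical stand-ins for
/// illustration, not the real metamath-knife types.
struct Step {
    label: &'static str,
    hyps: Vec<usize>,
    is_syntax: bool,
}

/// Walk hypotheses before their conclusion, starting at the root, and
/// collect only logical (non-syntax) steps, as a metamath.exe-style
/// proof display does.
fn display_order(steps: &[Step], root: usize) -> Vec<&'static str> {
    fn visit(steps: &[Step], i: usize, out: &mut Vec<&'static str>) {
        for &h in &steps[i].hyps {
            visit(steps, h, out);
        }
        if !steps[i].is_syntax {
            out.push(steps[i].label);
        }
    }
    let mut out = Vec::new();
    visit(steps, root, &mut out);
    out
}

fn main() {
    // The wff-building step is syntax and gets skipped in the display.
    let steps = vec![
        Step { label: "wph", hyps: vec![], is_syntax: true },
        Step { label: "min", hyps: vec![0], is_syntax: false },
        Step { label: "maj", hyps: vec![0], is_syntax: false },
        Step { label: "ax-mp", hyps: vec![1, 2], is_syntax: false },
    ];
    assert_eq!(display_order(&steps, 3), vec!["min", "maj", "ax-mp"]);
}
```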
> I want metamath-knife to fulfill our original aspirations for a community verifier that can subsume everything metamath.exe does today, but the way this repo is developing it seems more like a backend library, which means we need a suitable frontend which will inherit all the ugly stuff from metamath.exe.
I agree. I assume what @david-a-wheeler had in mind when naming this library `metamath-knife` was some tool with the power and versatility of `metamath-exe`, but I think there is a lot of value in splitting this in two: a library roughly as we have now, which makes it possible to expose functionality to many proof browsers, exporters, assistants, etc.
> `approximate_clique_cover()` (or even `exact_clique_cover`) could totally be a part of metamath-knife, though.
Definitely. We'll also need that if we want to export a `$d` statement for a frame.
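To make the connection concrete: a `$d x y z $.` statement asserts disjointness for every pair among its variables, i.e. a clique in the graph of required disjoint pairs, so covering that graph's edges with few cliques yields few `$d` statements. The toy greedy cover below is a hypothetical sketch, not metamath-knife's `approximate_clique_cover()`:

```rust
use std::collections::BTreeSet;

/// Toy greedy edge clique cover (a hypothetical sketch, not the real
/// `approximate_clique_cover()`). Each returned clique could be emitted
/// as one `$d` statement, since `$d x y z $.` asserts disjointness for
/// every pair among its variables.
fn greedy_clique_cover(n: usize, mut edges: BTreeSet<(usize, usize)>) -> Vec<Vec<usize>> {
    fn has(edges: &BTreeSet<(usize, usize)>, a: usize, b: usize) -> bool {
        edges.contains(&(a.min(b), a.max(b)))
    }
    let mut cover = Vec::new();
    loop {
        // Seed a clique from any still-uncovered pair.
        let Some(&(u, v)) = edges.iter().next() else { break };
        let mut clique = vec![u, v];
        // Grow it with vertices adjacent (among uncovered pairs) to all members.
        for w in 0..n {
            if !clique.contains(&w) && clique.iter().all(|&c| has(&edges, c, w)) {
                clique.push(w);
            }
        }
        // Mark every pair inside the clique as covered.
        for i in 0..clique.len() {
            for j in i + 1..clique.len() {
                let (a, b) = (clique[i].min(clique[j]), clique[i].max(clique[j]));
                edges.remove(&(a, b));
            }
        }
        cover.push(clique);
    }
    cover
}

fn main() {
    // x, y, z mutually disjoint plus one extra pair (x, w): two
    // statements suffice, e.g. `$d x y z $.` and `$d x w $.`.
    let edges: BTreeSet<_> = [(0, 1), (0, 2), (1, 2), (0, 3)].into_iter().collect();
    assert_eq!(greedy_clique_cover(4, edges).len(), 2);
}
```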
> `ProofTreeArray` provides a decent interface already. The part about crawling the tree in the right order and only visiting non-syntax steps seems reasonable to relegate to the client code.
Let me answer that on PR #63 !
I think we had a thorough review and discussion about these changes. The API as it is now is unsound (the hashmap is both public and has accessors), but we can use it as a starting point. I'm merging this for now; let's continue with API changes in a follow-up PR.
> The API as it is now is unsound (the hashmap is both public and has accessors), but we can use it as a starting point.
Unsound how? Certainly it's not memory-unsafe, but you probably mean something else. I don't think it is an issue that there are public hashmap fields, because the API never gives out a mutable reference to the `TypesettingData`, so it's read-only access. The only way you could mutate it is if you use `Arc::make_mut()` on the returned `Arc<TypesettingData>`, but in that case it's just a separate data structure not managed by metamath-knife, so it doesn't really matter what is done with it.
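That copy-on-write behavior of `Arc::make_mut()` is easy to demonstrate in isolation; in the sketch below a plain `Vec` stands in for `TypesettingData`:

```rust
use std::sync::Arc;

// Minimal demonstration of the copy-on-write point: when an `Arc` is
// shared, `Arc::make_mut` clones the inner value first, so the caller
// mutates a detached copy and the original data stays untouched. A
// plain `Vec` stands in for `TypesettingData` here.
fn main() {
    let library_side: Arc<Vec<&str>> = Arc::new(vec!["a", "b"]);
    let mut user_side = Arc::clone(&library_side);

    // Two Arcs point at the data, so make_mut clones before mutating.
    Arc::make_mut(&mut user_side).push("c");

    assert_eq!(library_side.len(), 2); // the shared original is unchanged
    assert_eq!(user_side.len(), 3); // the user mutated a separate copy
}
```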
> I agree. I assume what @david-a-wheeler had in mind when naming this library `metamath-knife` was some tool with the power and versatility of `metamath-exe`, but I think there is a lot of value in splitting this in two: a library roughly as we have now, which makes it possible to expose functionality to many proof browsers, exporters, assistants, etc.
You're right about my overall goal, but I'm perfectly fine with splitting code into parts as long as they can work together. For example, having a library and then separate crates that provide a command-line interface or higher-level constructs or whatever is great.
I haven't had the time I intended to devote to this for personal reasons, but I still have dreams for it. And while it's currently under david-a-wheeler, I would gladly support moving this to the "metamath" org.
> I haven't had the time I intended to devote to this for personal reasons, but I still have dreams for it. And while it's currently under david-a-wheeler, I would gladly support moving this to the "metamath" org.
I think you have to pull the trigger on this one, see https://docs.github.com/en/repositories/creating-and-managing-repositories/transferring-a-repository.
This extends the `$j` command parser a bit to handle the contents of the `$t` comment, and adds a complete parser for `$t` commands. I'm thinking about writing a web site generator, but I'm not sure whether all the metamath hacks belong in this repo, so maybe it should just be a separate crate (and/or repo) which uses this library for parsing.
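For readers unfamiliar with the `$t` comment: it holds semicolon-terminated typesetting commands such as `althtmldef "(" as "<SPAN>(</SPAN>";`, where quoted strings (using either quote character) may be concatenated with `+`. The toy recognizer below sketches that shape only; it simplifies the quoting rules (no `''` escapes, no embedded comments) and is not the parser added by this PR:

```rust
/// Toy tokenizer for `$t`-style commands: quoted strings (either quote
/// character), the `+` concatenation operator, `;` terminators, and
/// bare keywords. Simplified: no `''` escapes, no embedded comments.
fn tokenize(input: &str) -> Vec<String> {
    let mut tokens = Vec::new();
    let mut chars = input.chars().peekable();
    while let Some(&c) = chars.peek() {
        if c.is_whitespace() {
            chars.next();
        } else if c == '"' || c == '\'' {
            chars.next();
            let mut s = String::new();
            while let Some(d) = chars.next() {
                if d == c {
                    break;
                }
                s.push(d);
            }
            // Keep quotes so string literals stay distinguishable from keywords.
            tokens.push(format!("\"{}\"", s));
        } else if c == ';' || c == '+' {
            chars.next();
            tokens.push(c.to_string());
        } else {
            let mut w = String::new();
            while let Some(&d) = chars.peek() {
                if d.is_whitespace() || d == ';' || d == '+' {
                    break;
                }
                w.push(d);
                chars.next();
            }
            tokens.push(w);
        }
    }
    tokens
}

/// Unwrap a quoted-string token produced by `tokenize`.
fn strip_str(t: &str) -> Option<String> {
    t.strip_prefix('"').map(|s| s.trim_end_matches('"').to_string())
}

/// Recognize one definition command of the shape
///   keyword "token" as "part" + "part" ... ;
/// returning (keyword, token, concatenated value).
fn parse_def(input: &str) -> Option<(String, String, String)> {
    let toks = tokenize(input);
    let mut it = toks.iter();
    let keyword = it.next()?.clone();
    let token = strip_str(it.next()?)?;
    if it.next()?.as_str() != "as" {
        return None;
    }
    let mut value = strip_str(it.next()?)?;
    loop {
        match it.next()?.as_str() {
            ";" => return Some((keyword, token, value)),
            "+" => value.push_str(&strip_str(it.next()?)?),
            _ => return None,
        }
    }
}

fn main() {
    let cmd = r#"althtmldef "(" as "<SPAN>" + "(" + "</SPAN>";"#;
    let (kw, tok, val) = parse_def(cmd).unwrap();
    assert_eq!((kw.as_str(), tok.as_str(), val.as_str()),
               ("althtmldef", "(", "<SPAN>(</SPAN>"));
}
```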