epfl-lara / lisa

Proof assistant based on first-order logic and set theory
Apache License 2.0

Integration of the front #34

Closed FlorianCassayre closed 1 year ago

FlorianCassayre commented 2 years ago

This ticket describes the work to do for the integration of FlorianCassayre/master-project into LISA.

Some key questions:

The tasks are partitioned in sections ordered by increasing amount of work.

Can be integrated directly

Simple adaptations needed

Fundamental improvements needed

To be discussed

Should not be integrated

SimonGuilloud commented 2 years ago

This ticket describes the work to do for the integration of FlorianCassayre/master-project into LISA.

Some key questions:

* In the kernel, how is the proof checker going to evolve with respect to the equivalence checker?

I'm not a hundred percent sure how to answer that. I think you can assume it in its current state. If it changes, it will more likely be in a stronger way; otherwise I/we will adapt the rest as needed. There is no intent to change it in the near future, at least.

* Should we keep a separate layer for first-order logic and sequents? I think so. For now it may only have a few more features than the kernel (though they are nice features!), but there may be quite a bit more in the future as we develop more and more things. It will not stay such a precise one-to-one mirror of the kernel.
* How to improve the representation of tactics and patterns? I don't know. It will certainly be improved in the future, but how is a piece of work in itself.
* What CFG should we stick to? Keep in mind the suggestion of allowing custom operators to be defined (context-sensitivity). What do you mean by CFG?

The tasks are partitioned in sections ordered by increasing amount of work.

Can be integrated directly

* `util`

  * `SCUtils.scala` contains utilities to shrink proofs in different ways. They can be integrated as-is into `lisa-utils`, and in the future can be improved if better heuristics are found.

Sounds good!
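As a rough illustration of the kind of shrinking such utilities perform, here is a minimal sketch (the `Step` model and `shrink` function are hypothetical, not the actual `SCUtils.scala` API): dead-step elimination keeps only the steps reachable from the conclusion and remaps premise indices.

```scala
// Hypothetical minimal model: a proof is a vector of steps, where each
// step's premises are indices of earlier steps; the last step concludes.
final case class Step(name: String, premises: List[Int])

/** Dead-step elimination, one of the simpler shrinking heuristics:
  * keep only steps reachable from the conclusion, then remap indices. */
def shrink(proof: Vector[Step]): Vector[Step] = {
  if (proof.isEmpty) return proof
  // Collect the indices reachable from the last (concluding) step.
  val reachable = scala.collection.mutable.Set.empty[Int]
  def visit(i: Int): Unit =
    if (reachable.add(i)) proof(i).premises.foreach(visit)
  visit(proof.length - 1)
  val kept  = proof.indices.filter(reachable).toVector // ascending order
  val remap = kept.zipWithIndex.toMap                  // old index -> new index
  kept.map(i => proof(i).copy(premises = proof(i).premises.map(remap)))
}
```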

Simple adaptations needed

* `front`

  * `parser` / `printer` work on the front FOL but should be updated to work on the kernel FOL for an integration into `lisa-utils`. For that, the grammar should be slightly adapted to account for the absence of schematic connectors and the changes regarding variables/schemas.

Sounds good too! The fact that there are no variables (or rather, that all variables are bound) should simplify the description.
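For illustration only, a toy recursive-descent parser for the applicative fragment of such a grammar (identifiers and function applications, e.g. `f(x, g(y))`); the grammar fragment and names are made up for the sketch and are not LISA's actual parser:

```scala
// Toy AST for the term fragment (illustrative, not the front's FOL types).
sealed trait Term
final case class Var(name: String) extends Term
final case class App(name: String, args: List[Term]) extends Term

// Hand-written recursive descent over "ident" and "ident(t1,...,tn)".
def parseTerm(input: String): Term = {
  var pos = 0
  def peek: Char = if (pos < input.length) input(pos) else '\u0000'
  def ident(): String = {
    val start = pos
    while (pos < input.length && input(pos).isLetterOrDigit) pos += 1
    input.substring(start, pos)
  }
  def term(): Term = {
    val name = ident()
    if (peek == '(') {
      pos += 1 // consume '('
      val args = scala.collection.mutable.ListBuffer(term())
      while (peek == ',') { pos += 1; args += term() }
      require(peek == ')', s"expected ')' at $pos"); pos += 1
      App(name, args.toList)
    } else Var(name)
  }
  val t = term()
  require(pos == input.length, s"trailing input at $pos")
  t
}
```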

Fundamental improvements needed

* `front`

  * `proof`

    * `unification` a greedy matcher for sequents/FOL. It is useful for detecting how schemas should be instantiated. It is also used by patterns since it shares the representation: pattern variables = schemas. It was pointed out that the procedure could be greatly improved to deal with pathological cases by returning lazy lists of solutions, rather than just the most likely solution.

Certainly needs some improvements, but so do many aspects and directions of LISA :) If it works as intended, I think it's good enough to integrate for now.
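The lazy-list idea can be sketched as follows, with formulas simplified to strings and pattern variables prefixed by `?` (all names hypothetical, not the actual `unification` module): instead of committing greedily to the first pairing of patterns against sequent formulas, the matcher enumerates every consistent substitution on demand.

```scala
// A substitution maps pattern variables ("?a") to concrete formulas.
type Subst = Map[String, String]

// Match one pattern against one formula under an existing substitution:
// at most one result, since there is no choice at this level.
def matchOne(pattern: String, formula: String, s: Subst): Option[Subst] =
  if (pattern.startsWith("?"))
    s.get(pattern) match {
      case Some(bound) => if (bound == formula) Some(s) else None
      case None        => Some(s + (pattern -> formula))
    }
  else if (pattern == formula) Some(s) else None

// Match a list of patterns against an unordered sequent: the choice of
// which formula each pattern consumes is where multiple solutions arise,
// so all of them are produced lazily rather than just the first.
def matchAll(patterns: List[String], sequent: List[String],
             s: Subst = Map.empty): LazyList[Subst] =
  patterns match {
    case Nil => LazyList(s)
    case p :: rest =>
      LazyList.from(sequent).flatMap { f =>
        matchOne(p, f, s) match {
          case Some(s2) => matchAll(rest, sequent.diff(List(f)), s2)
          case None     => LazyList.empty
        }
      }
  }
```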

To be discussed

* `front`

  * `fol` a complete rewrite of LISA FOL. This is not desirable but it provides several advantages: schematic connectors, order-preserving sequents, custom `toString` and refined types. Schematic connectors are needed for the encoding of the iff-substitution as a pattern. The remaining features can be implemented without this extra layer (and we should aim towards that, in my opinion).
    * The possibility of ordering formulas as desired is there for the user's convenience, but does not actually change the semantics; it is implemented at the level of the sequent. The ability to override `toString` is also cosmetic. The refined types can be implemented as opaque types.

Discussed a bit earlier. As more gets developed, more and more features will get "simulated" before they get to the kernel, like a picker or choice term, set-theoretic functions, numbers, etc. And we may also want to add more information or specifications to formulas in the future, so I think it's all good! Some of the things you mention, like schematic connectors or ordered sequents, could be included in the kernel, but that will be decided or changed in due time.
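On the refined-types point, a minimal Scala 3 sketch of the opaque-type encoding (illustrative names, not the actual front API): the label is represented by a plain `String` at runtime, but outside the enclosing object the two types are unrelated, so the validity check runs once at construction and no wrapper class is allocated.

```scala
object Labels {
  // Runtime representation is String; outside this object the two types
  // are distinct, so a raw String cannot be used where a label is expected.
  opaque type VariableLabel = String

  def mkVariableLabel(id: String): VariableLabel = {
    require(id.nonEmpty, "empty identifier") // invariant checked once, here
    id
  }

  // Read back the underlying identifier.
  extension (v: VariableLabel) def name: String = v
}
```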

  * `proof`

    * `predef`, `state`, `sequent` contain the logic that defines the concepts of (front) proof, patterns and tactics.
    
      * Patterns allow us to encode the behavior of a rule or a chain of rules. The benefit is that an algorithm written to work with patterns will by extension work for all the LISA rules. The iff-substitution rules cannot be encoded without schematic connectors, otherwise we would need to define another logic ADT (which is what the front FOL did). We should determine whether these patterns are desirable enough to be kept, or if they can be reduced to something simpler.

    To do so properly, we need to test it and try to work with it.

    * The framework allows partial proofs to be written. The current implementation creates a kernel theorem only after all the branches have been closed; however, it was pointed out that each step in a backward proof could be stated as a (kernel) theorem, such that when they are combined we can conclude the desired theorem. This approach would be safer and does in fact make more sense. I believe it is slightly trickier to implement but very attainable.

    Yes, this is what we discussed with Jasmin Blanchette. The issue with it is that it requires destruction rules, which are derived rules in our system, so they multiply the length of the proofs by around 3. So let's keep it that way for now. Note that imports in kernel proofs do permit exactly this: writing partial proofs (with open branches).

Should not be integrated

* `util`

  * `SCProofBuilder.scala`, `MonadicSCProofBuilder.scala` and `SCProofStepFinder.scala` are utilities that provide a light front layer allowing proofs to be written in a monadic way (using for-comprehensions) and by only having to state the conclusion. The main advantage of the monadic proof style is that it can dynamically keep track of the indices. The proof step finder attempts to find a proof step with the given conclusion; it is by essence incomplete because of substitution rules and rewriting. As is, I would not integrate them into LISA; however, I think they are interesting ideas that could be taken as the basis for something else.

    * `legacy`: nothing to include in LISA

      * `meta`: experimentation with formulas being a special case of sequents. Might work, but quickly becomes overwhelmingly complicated.
      * `parser`: an improved version of this parser already exists in `front/parser`.
      * `unification`: a basic matcher for FOL.
      * `GoalBasedProofSystem.scala` and `GoalBasedProofSystemREPL.scala`: the predecessor of the front.
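For reference, the monadic proof-builder idea mentioned above can be sketched as a small state monad over the steps emitted so far (hypothetical names, not the actual `MonadicSCProofBuilder.scala` API): `step` appends a step and returns its index, so a for-comprehension threads the indices automatically instead of the user counting lines by hand.

```scala
// Hypothetical step model: a rule name plus the indices of its premises.
final case class ProofStep(rule: String, premises: List[Int])

// State transformer over the vector of steps emitted so far.
final case class ProofBuilder[A](run: Vector[ProofStep] => (Vector[ProofStep], A)) {
  def map[B](f: A => B): ProofBuilder[B] =
    ProofBuilder { st => val (st2, a) = run(st); (st2, f(a)) }
  def flatMap[B](f: A => ProofBuilder[B]): ProofBuilder[B] =
    ProofBuilder { st => val (st2, a) = run(st); f(a).run(st2) }
}

// Append a step and return its index in the proof.
def step(rule: String, premises: Int*): ProofBuilder[Int] =
  ProofBuilder { st => (st :+ ProofStep(rule, premises.toList), st.length) }

// Usage: h1 and h2 are bound by the comprehension, never written by hand.
val proof: Vector[ProofStep] = (for {
  h1 <- step("Hypothesis")
  h2 <- step("Hypothesis")
  _  <- step("RightAnd", h1, h2)
} yield ()).run(Vector.empty)._1
```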