Just to make sure I understand: where you wrote "i.e." did you mean "e.g."? In other words, we'll be able to use general congruence rules, not just ones for subsingletons?
No, it is really "i.e.". The system can only automatically handle the subsingleton case. For other cases, users have to prove the congruence theorem themselves. We will have a [congruence] attribute.
Ah, but Haitao's proposed is_finite, which is what I was initially responding to, is not a subsingleton. Or have I misunderstood?
Correct. Haitao's is_finite type class is not a subsingleton. Daniel's finite_set type class is, but it doesn't solve Haitao's problem because we would still need a way to list the elements of a finite set. If we use Haitao's approach, we would have to prove "custom" congruence theorems for definitions such as card.
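To make this concrete, here is a minimal sketch of the kind of hand-written congruence lemma being referred to. The declarations below are illustrative stand-ins (assumptions), not Haitao's actual definitions:

import data.set

-- Illustrative stand-ins (assumptions): a finiteness class that carries data,
-- and a cardinality function that depends on that evidence.
constant is_finite : Π {A : Type}, set A → Type₁
constant card : Π {A : Type} (s : set A), is_finite s → ℕ

-- The "custom" congruence theorem one has to prove by hand, because the
-- simplifier cannot know that card ignores which finiteness evidence is used.
theorem card_congr {A : Type} {s₁ s₂ : set A} (e : s₁ = s₂)
    (h₁ : is_finite s₁) (h₂ : is_finite s₂) : card s₁ h₁ = card s₂ h₂ :=
sorry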
Anyhow, as I understand it now, the proposal is to use listable to address Haitao's immediate concern, and there is no concrete proposal to change our current treatment of finite sets. I am o.k. with that.
There are different proposals. I will try to summarize them.
1- Daniel's finite_set type class. It is a subsingleton, and the simplifier will be able to generate congruence theorems automatically for any definition that uses it. The main technical obstacle is https://github.com/leanprover/lean/issues/654#issuecomment-109439028. It also does not provide a way to list the elements of a finite set.
2- Haitao's is_finite type class. It is not a subsingleton, so we would have to prove congruence theorems ourselves. It is also affected by the issue https://github.com/leanprover/lean/issues/654#issuecomment-109439028. The finite type class (at https://github.com/leanprover/lean/issues/650#issuecomment-118832920) is a variant of this proposal.
3- listable type class. It provides an interface for converting finite sets into lists. The idea is to abstract the different possible tricks we can use to perform the conversion (sort the elements, use constructive choice, use classical choice, etc.), and to address Haitao's problem. The argument can be ignored by classical users, since we can show that all types are listable if we assume Hilbert's choice. (A sketch of such an interface follows after this list.)
4- Give up finset and use list + an equivalence relation. We have to write s1 ~ s2 instead of s1 = s2, and we have to prove congruence theorems ourselves. It may look awkward for classical users.
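For concreteness, here is a minimal sketch of what the listable interface in proposal 3 could look like. The field names and the exact statement are assumptions made for illustration, not the actual design:

import data.finset
open finset list

-- A class packaging some way of enumerating a finset as a list, hiding
-- whether the list is obtained by sorting, constructive choice, or classical
-- choice. (Names are illustrative assumptions.)
structure listable [class] (A : Type) : Type :=
(to_list  : finset A → list A)
(complete : ∀ (s : finset A) (a : A), a ∈ s ↔ a ∈ to_list s)

A classical development could then provide a single instance via Hilbert's choice, while constructive developments could give instances for, e.g., types with a decidable linear order.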
Thanks for clearing this up. The summary is helpful.
It would be nice if Daniel's finite_set could be made to work. I would then prefer to call it finite. We could hide finsets from the user, as Daniel suggests, and use the hypothesis finite s as the canonical way to state theorems about finite sets. But I am still a little worried about how hard it will be to "reason up to subsingletons".
I have no objection to 2, though I am even more worried about how hard it will be to reason up to equalities induced by congruences.
3 seems uncontroversial, i.e. I don't see any downside.
As for 4, writing s1 ~ s2 instead of s1 = s2 makes me uncomfortable, but maybe I could get used to it. We could redefine finset to be lists without duplicates, and try to make them "feel" like finite sets.
I guess there is also proposal 5, which is to stick with the current handling of finsets, or maybe a slight variant that bundles in the corresponding set.
It is a little depressing that finite sets are so complicated. But these issues are really fundamental.
I guess there is also proposal 5, which is to stick with the current handling of finsets, or maybe a slight variant that bundles in the corresponding set.
Yes. BTW, 3 does not require us to change anything in the finset library.
It is a little depressing that finite sets are so complicated. But these issues are really fundamental.
I agree. Andrej Bauer has a list of "constructive gems and stones" :-) A "constructive gem" is something nice about constructive mathematics, and a "constructive stone" is a complication that does not exist in classical mathematics. Finite sets and cardinality of sets are the first two "constructive stones" on his list :-)
I think I found a way to kick the can further down the road. I will use subtype and quot to build quotient groups, and directly build the concept of set (instead of type) quotients into lcoset_type. So instead of:
-- S is a left coset of H (by some element g)
definition is_fin_lcoset [reducible] (S : finset A) : Prop := ∃ g, fin_lcoset H g = S
-- all left cosets of H, obtained by mapping over the elements of A
definition list_lcosets : list (finset A) := erase_dup (map (fin_lcoset H) (elems A))
-- the type of left cosets of H
definition lcoset_type [reducible] : Type := {S : finset A | is_fin_lcoset H S}
we will have
-- S is a left coset of H by some g that lies in G
definition is_fin_lcoset' [reducible] (S : finset A) : Prop := ∃ g, g ∈ G ∧ fin_lcoset H g = S
-- the elements of the finset G, as a list
definition to_list : list A := list.filter (λ g, g ∈ G) (elems A)
-- all left cosets of H by elements of G
definition list_lcosets' : list (finset A) := erase_dup (map (fin_lcoset H) (to_list G))
-- the type of left cosets of H inside G
definition lcoset_type' [reducible] : Type := {S : finset A | is_fin_lcoset' H G S}
Nice. Even in the absence of the subtype -> finite type problem, this way of doing the construction seems natural.
this way of doing the construction seems natural
It does force everything to carry a prop (set) : g ∈ G.
Another possibility is to make subtype <-> finset easier. So we can subtype on G : finset A, and the type class can automatically derive group (finset_type G) from is_finsubg G. We can translate back and forth with:
-- view a finset H as a set over the subtype of elements of G
finset_to_set (H : finset A) : set (finset_type G) := λ g, (elt_of g) ∈ H
-- recover a finset from a set over the subtype, by dependently filtering G
set_to_finset (H : set (finset_type G)) : finset A := dfilter G (λ g PginG, tag g PginG ∈ H)
assuming decidability. dfilter is a dependent filter on finset that takes an additional argument certifying what we are filtering over, similar to dmap, which works on list.
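One possible, purely hypothetical shape for such a dfilter, written only to make the description concrete; it is not an existing library function, and the decidability assumption is passed explicitly here for simplicity:

import data.finset
open finset

variables {A : Type}

-- A dependent filter: the predicate may use the proof of membership in s,
-- analogous to dmap on list. (The signature is an assumption for illustration.)
definition dfilter (s : finset A) (p : Π a, a ∈ s → Prop)
  (dp : Π a (h : a ∈ s), decidable (p a h)) : finset A :=
sorry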
Now we can do algebra with the type and the set, and translate to finset when cardinality is needed. The following identities hold:
set_to_finset_to_set (H : set (finset_type G)) : finset_to_set (set_to_finset H) = H
finset_to_set_to_finset (H : finset A) (Psub : H ⊆ G) : set_to_finset (finset_to_set H) = H
Hi,
I apologize in advance if this is not the appropriate place for questions like this. Please feel free to ignore if it is off-topic or just generally not helpful.
Could somebody please explain the envisioned interactions between the set library and the finset library? In particular, is finset designed to be user-facing? I fear it might get confusing to deal with both types of sets at the same time, especially when the notation means slightly different things in different places.

Another option might be to use the finset library to support a finite_set type class. We could then use the lemmas in the finset library to propagate the finite_set property, with instances such as:

We could then define versions of all functions that only make sense on finite sets in terms of the type class, e.g.:
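(The code blocks from the original comment are not preserved here; the following is only a rough, illustrative sketch of the kind of class, instance, and function being described. All names and statements are assumptions.)

import data.set data.list
open set

variables {A : Type}

-- A Prop-valued (hence subsingleton) finiteness predicate on sets.
definition finite_set [class] (s : set A) : Prop :=
∃ l : list A, ∀ a, a ∈ s → list.mem a l

-- Finiteness propagates through set operations such as union.
theorem finite_set_union [instance] {s t : set A}
    [hs : finite_set s] [ht : finite_set t] : finite_set (s ∪ t) :=
sorry

-- Functions that only make sense on finite sets take the class as a hypothesis.
definition card (s : set A) [h : finite_set s] : ℕ :=
sorry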
The net result is that users could use traditional set-theoretic notation everywhere, have it always mean the same thing, and still make use of finiteness when necessary without needing to track it explicitly. For example, one could write:
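Continuing the same illustrative sketch, the intended style of use would be roughly:

-- Cardinality of a union, with finiteness tracked by class inference rather
-- than by explicit finsets (uses the hypothetical finite_set and card above).
example (s t : set A) [hs : finite_set s] [ht : finite_set t] : ℕ :=
card (s ∪ t)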
If this approach seems appealing, I would be happy to write a draft of it.
Thanks,
Daniel