TL;DR
They proposed GPT-f, which automatically proves theorems using a transformer. Proving is cast as a language-modeling task: each proof step is serialized as a string of the form [GOAL, (goal), PROOFSTEP, (proof step)], and the model is trained to complete the PROOFSTEP given the goal. GPT-f found proofs shorter than the existing ones for 23 theorems.
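A minimal sketch of the serialization idea described above: training examples concatenate a goal with its proof step, and at search time the model is prompted with the goal and asked to complete the step. The helper names and the Metamath-style goal string are illustrative, not taken from the paper's code.

```python
def make_training_example(goal: str, proofstep: str) -> str:
    """Serialize one (goal, proof step) pair as a language-model string."""
    return f"GOAL {goal} PROOFSTEP {proofstep}"

def make_query(goal: str) -> str:
    """Prompt used at proof-search time; the model completes the PROOFSTEP."""
    return f"GOAL {goal} PROOFSTEP"

# Illustrative Metamath-style goal (not from the paper's dataset):
example = make_training_example("|- ( 2 + 2 ) = 4", "eqtri")
print(example)  # GOAL |- ( 2 + 2 ) = 4 PROOFSTEP eqtri
```

At search time, sampling multiple completions of `make_query(goal)` yields candidate proof steps that a tree search can then verify and expand.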
Why it matters:
Paper URL
https://arxiv.org/abs/2009.03393
Submission Date (yyyy/mm/dd)
2020/09/07
Authors and institutions
Stanislas Polu, Ilya Sutskever (OpenAI)
Methods
Results
Comments