exercism / python

Exercism exercises in Python.
https://exercism.org/tracks/python
MIT License

[CI]: Consider/Discuss Tools for Verifying Fenced Code in Markdown Documents #2848

Open BethanyG opened 2 years ago

BethanyG commented 2 years ago

This came up in discussing PR #2838, so opening an issue here for longer discussion/evaluation.

We currently "hand validate" example Python code used in introduction.md, about.md, instruction.md, and similar documents. This leads to errors where certain code will not work in the REPL, or where syntax and other mistakes get published. As we scale up exercises, this doesn't feel like a sustainable approach, hence this issue to propose, evaluate, and track possible tools and strategies for verifying code, and (possibly) adding that verification to the track CI.

Below are three applicable libraries, but I'd warmly welcome more. Of the three below, phmdoctest feels like the nicest solution, and I've run the comparisons concept exercise through it with reasonable results. But I'd like to see what other strategies/libraries are out there.

- doctest - the old-school original, but it doesn't really work well with markdown fences.
- mkcodes - have not tried this yet.
- phmdoctest - reasonably good, but requires some weird quirks with code fence language names and/or extra `>>>` prompts in code fences to make parsing work.
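To make the comparison concrete, here is a rough, hand-rolled sketch of the idea these tools implement: pull the python-tagged fences out of a markdown document and run any `>>>` examples they contain through the stdlib doctest runner. This isn't a proposal to roll our own; the file path and function name below are placeholders, not real track files:

```python
# Minimal illustration only (not any of the libraries above): extract ```python
# fences from a markdown file and run their >>> examples through stdlib doctest.
# The path in __main__ is a placeholder, not a real track document.
import doctest
import re
from pathlib import Path

FENCE_RE = re.compile(r"```python\n(.*?)```", re.DOTALL)


def count_failures(md_file: Path) -> int:
    """Return the number of failing >>> examples in md_file's python fences."""
    parser = doctest.DocTestParser()
    runner = doctest.DocTestRunner(verbose=False)
    failures = 0
    for index, block in enumerate(FENCE_RE.findall(md_file.read_text()), start=1):
        # Fences without >>> prompts produce no examples and are skipped silently,
        # which is exactly the limitation that makes plain doctest awkward here.
        test = parser.get_doctest(block, {}, f"{md_file.name}[{index}]", str(md_file), 0)
        failures += runner.run(test).failed
    return failures


if __name__ == "__main__":
    print(count_failures(Path("docs/introduction.md")))
```

Run over each document in CI, a non-zero count would fail the job. As I understand it, phmdoctest automates roughly this workflow by generating a pytest file from the markdown rather than invoking doctest directly.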

github-actions[bot] commented 2 years ago

🤖   🤖

Hi! 👋🏽 👋 Welcome to the Exercism Python Repo!

Thank you for opening an issue! 🐍 🌈 ✨


​          ◦ If you'd also like to make a PR to fix the issue, please have a quick look at the Pull Requests doc.
             We  πŸ’™  PRs that follow our Exercism & Track contributing guidelines!


💛 💙 While you are here... If you decide to help out with other open issues, you have our gratitude 🙌 🙌🏽.
Anything tagged with [help wanted] and without [Claimed] is up for grabs.
Comment on the issue and we will reserve it for you. 🌈 ✨