BethanyG opened this issue 2 years ago
Hi! Welcome to the Exercism Python Repo!
Thank you for opening an issue!
If you'd also like to make a PR to fix the issue, please have a quick look at the Pull Requests doc. We love PRs that follow our Exercism & Track contributing guidelines! Please feel free to submit a PR, linking to this issue.
**Please Do Not:**
- Run checks on the whole repo & submit a bunch of PRs. This creates longer review cycles & exhausts reviewers' energy & time. It may also conflict with ongoing changes from other contributors.
- Insert only blank lines, make a closing bracket drop to the next line, change a word to a synonym without obvious reason, or add trailing space that's not an EOL for the very end of text files.
- Introduce arbitrary changes "just to change things".

_...These sorts of things are **not** considered helpful, and will likely be closed by reviewers._
While you are here... If you decide to help out with other open issues, you have our gratitude. Anything tagged with [help wanted] and without [Claimed] is up for grabs. Comment on the issue and we will reserve it for you.
This came up in discussing PR #2838, so opening an issue here for longer discussion/evaluation.
We currently "hand validate" example Python code used in `introduction.md`, `about.md`, `instructions.md`, and similar documents. This leads to errors where certain code will not work in the REPL, or where syntax and other mistakes get made and published. As we scale up exercises, this doesn't feel like a sustainable solution, hence this issue to propose, evaluate, and track possible tools and strategies for verifying code, and (possibly) adding that verification to the track CI.
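For illustration, here is a minimal sketch of what such a CI check could look like if we rolled it ourselves rather than adopting a library, assuming we only want code fences marked as `python` to execute without raising. The `exercises/concept` glob, the fence regex, and the snippet handling are all hypothetical, and REPL-style fences with `>>>` prompts would need doctest handling instead of a bare `exec`:

```python
# Hypothetical sketch only -- paths, regex, and error handling are assumptions,
# not the track's actual layout or tooling.
import re
import sys
from pathlib import Path

FENCE_RE = re.compile(r"```python\n(.*?)```", re.DOTALL)


def check_document(path: Path) -> list[str]:
    """Return error messages for snippets in the given document that fail to run."""
    errors = []
    for index, snippet in enumerate(FENCE_RE.findall(path.read_text()), start=1):
        try:
            # Run each fenced snippet in a fresh namespace; any exception is a failure.
            exec(compile(snippet, f"{path} [snippet {index}]", "exec"), {})
        except Exception as error:
            errors.append(f"{path} [snippet {index}]: {error!r}")
    return errors


if __name__ == "__main__":
    failures = []
    for doc in Path("exercises/concept").rglob("*.md"):  # assumed docs location
        failures.extend(check_document(doc))
    print("\n".join(failures) or "All fenced snippets ran cleanly.")
    sys.exit(1 if failures else 0)
```

Something along these lines would catch outright syntax errors and exceptions, but it can't check REPL output, which is where the doctest-style libraries below come in.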
Below are three applicable libraries, but I'd warmly welcome more. Of the three, `pmdoctest` feels like the nicest solution, and I've run the `comparisons` concept exercise through it with reasonable results. But I'd like to see if there are other strategies/libraries out there.

- **doctest** - this is the old-school original, but doesn't really work well in markdown fences.
- **mkcodes** - have not tried this yet.
- **pmdoctest** - reasonably good, but requires some weird quirks with code fence language names and/or excess `>>>` prompts in code fences to make parsing work.
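To make the parsing quirks concrete, here is a hypothetical fence (not taken from an existing document) written the way doctest-style tools generally expect: as a REPL session with `>>>` prompts and the expected output on the following line.

```python
# Hypothetical REPL-style fence; a doctest-style checker compares the actual
# output of each statement against the expected output shown beneath it.
>>> phrase = "hello, world"
>>> phrase.title()
'Hello, World'
>>> sorted({3, 1, 2})
[1, 2, 3]
```

The trade-off is that our documents would need to be written (or rewritten) in this prompt-and-output style for verification to work, which is part of the quirkiness noted above.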