microsoft / pybryt

Python library for pedagogical auto-assessment
https://microsoft.github.io/pybryt
MIT License

PyBryt - Python Library


PyBryt is an auto-assessment Python library for teaching and learning.

Features

Educators and institutions can leverage the PyBryt library to integrate auto-assessment and reference models into hands-on labs and assessments.

Getting Started

See the Getting Started page in the PyBryt documentation for steps to install and use PyBryt for the first time. You can also check the Microsoft Learn interactive modules Introduction to PyBryt and Advanced PyBryt to learn more about how to use the library to auto-assess your learners' activities.

Testing

All demos are located in the demo folder.

First install PyBryt with pip:

pip install pybryt

Then launch the index.ipynb notebook in any of the directories under demo from Jupyter Notebook; each notebook demonstrates the process of using PyBryt to assess student submissions.

Technical Report

We continuously interact with computerized systems to achieve goals and perform tasks in our personal and professional lives, so the ability to program such systems is a skill needed by everyone. Consequently, computational thinking skills are essential for everyone, which creates a challenge for the educational system: teaching these skills at scale while allowing students to practice them. To address this challenge, we present a novel approach to providing formative feedback to students on programming assignments. Our approach uses dynamic evaluation to trace intermediate results generated by students' code and compares them to a reference implementation provided by their teachers. We have implemented this method as a Python library and demonstrate its use to give students relevant feedback on their work while allowing teachers to challenge their students' computational thinking skills. Paper available at PyBryt: auto-assessment and auto-grading for computational thinking.
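The core idea above — run the student's code, observe the intermediate values it produces, and check them against values a reference implementation is expected to produce — can be illustrated with a toy sketch. This is not the PyBryt API; it is a simplified stand-in using only the standard library's `sys.settrace` hook, and the function and variable names (`trace_values`, `student_mean`, `reference_values`) are made up for illustration:

```python
import sys

def trace_values(fn, *args):
    """Run fn(*args) while recording every local-variable value observed.

    This mimics, in a very simplified way, the dynamic-evaluation idea:
    the student's code runs normally while a tracer captures the
    intermediate results it produces along the way.
    """
    observed = []

    def tracer(frame, event, arg):
        # Record locals on each line event inside the student's function.
        if event == "line" and frame.f_code is fn.__code__:
            observed.extend(frame.f_locals.values())
        return tracer

    sys.settrace(tracer)
    try:
        result = fn(*args)
    finally:
        sys.settrace(None)
    observed.append(result)
    return observed

# A "student" implementation of computing a mean.
def student_mean(data):
    total = sum(data)          # intermediate value a correct solution produces
    return total / len(data)

# "Reference annotations": intermediate values expected from a correct solution
# for the input [1, 2, 3, 4] — the sum (10) and the mean (2.5).
reference_values = [10, 2.5]

observed = trace_values(student_mean, [1, 2, 3, 4])
satisfied = all(any(v == ref for v in observed) for ref in reference_values)
print(satisfied)  # True when every reference value was observed
```

In PyBryt itself, those expected values are expressed as annotations attached to a reference implementation, and the library handles the tracing and comparison when a student submission runs.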

Citing Technical Report

@misc{pyles2021pybryt,
      title={PyBryt: auto-assessment and auto-grading for computational thinking}, 
      author={Christopher Pyles and Francois van Schalkwyk and Gerard J. Gorman and Marijan Beg and Lee Stott and Nir Levy and Ran Gilad-Bachrach},
      year={2021},
      eprint={2112.02144},
      archivePrefix={arXiv},
      primaryClass={cs.HC}
}

Citing the Codebase

Please use the "Cite this repository" option in the repo menu or the citation.cff file in the root of this repo.

Contributing

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.opensource.microsoft.com.

When you submit a pull request, a CLA bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.

Trademarks

This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to and must follow Microsoft's Trademark & Brand Guidelines. Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos is subject to those third parties' policies.