openjournals / joss-reviews

Reviews for the Journal of Open Source Software

[REVIEW]: ReadmeReady: Free and Customizable Code Documentation with LLMs - A Fine-Tuning Approach #7489

Open · editorialbot opened this issue 2 days ago

editorialbot commented 2 days ago

Submitting author: @souradipp76 (Souradip Pal)
Repository: https://github.com/souradipp76/ReadMeReady
Branch with paper.md (empty if default branch): main
Version: v1.1.0
Editor: @crvernon
Reviewers: @Manvi-Agrawal, @camilochs
Archive: Pending

Status

status

Status badge code:

HTML: <a href="https://joss.theoj.org/papers/279a09ad7a2963efe219581eba56bfb2"><img src="https://joss.theoj.org/papers/279a09ad7a2963efe219581eba56bfb2/status.svg"></a>
Markdown: [![status](https://joss.theoj.org/papers/279a09ad7a2963efe219581eba56bfb2/status.svg)](https://joss.theoj.org/papers/279a09ad7a2963efe219581eba56bfb2)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@Manvi-Agrawal & @camilochs, your review will be checklist based. Each of you will have a separate checklist that you should update when carrying out your review. First of all you need to run this command in a separate comment to create the checklist:

@editorialbot generate my checklist

The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. If you have any questions or concerns, please let @crvernon know.

✨ Please start on your review when you are able, and be sure to complete your review in the next six weeks, at the very latest ✨

Checklists

πŸ“ Checklist for @Manvi-Agrawal

editorialbot commented 2 days ago

Hello humans, I'm @editorialbot, a robot that can help you with some common editorial tasks.

For a list of things I can do to help you, just type:

@editorialbot commands

For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:

@editorialbot generate pdf

editorialbot commented 2 days ago

Software report:

github.com/AlDanial/cloc v 1.90  T=0.06 s (1047.7 files/s, 184202.5 lines/s)
-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
Python                          34            961            445           5075
Markdown                        11            297              0            680
Jupyter Notebook                 2              0           1525            602
TeX                              1             36              1            283
YAML                             7             27             53            159
make                             1             16              4            102
Bourne Shell                     3             14              9             84
-------------------------------------------------------------------------------
SUM:                            59           1351           2037           6985
-------------------------------------------------------------------------------

Commit count by author:

    38  Sayak Chakrabarty
    36  Souradip Pal
    33  souradipp76
    32  souradip_pal
    17  hellokayas
    11  dependabot[bot]
     6  test

editorialbot commented 2 days ago
Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

✅ OK DOIs

- 10.1109/TIT.1956.1056813 is OK
- 10.1016/S1364-6613(03)00029-9 is OK
- 10.48550/arXiv.1410.5401 is OK
- 10.48550/arXiv.1409.0473 is OK
- 10.18653/v1/P16-1208 is OK
- 10.48550/arXiv.1603.06744 is OK
- 10.48550/arXiv.1704.01696 is OK
- 10.1109/MSR.2013.6624030 is OK
- 10.48550/arXiv.1611.08307 is OK
- 10.1109/ASE.2015.72 is OK
- 10.3115/v1/P15-1085 is OK
- 10.48550/arXiv.2005.14165 is OK
- 10.48550/arXiv.2203.02155 is OK
- 10.48550/arXiv.1706.03762 is OK
- 10.48550/arXiv.2104.08691 is OK
- 10.48550/arXiv.1707.02275 is OK
- 10.1109/TPAMI.2018.2889473 is OK
- 10.48550/arXiv.2305.14314 is OK
- 10.48550/arXiv.2106.09685 is OK
- 10.1109/TCSS.2023.3321345 is OK
- 10.48550/arXiv.2305.13560 is OK
- 10.48550/arXiv.2311.05046 is OK

🟡 SKIP DOIs

- No DOI given, and none found for title: Language models are unsupervised multitask learner...
- No DOI given, and none found for title: gpt-3.5-turbo
- No DOI given, and none found for title: gpt-4
- No DOI given, and none found for title: gpt-4-32k
- No DOI given, and none found for title: Llama-2-7B-Chat-GPTQ
- No DOI given, and none found for title: CodeLlama-7B-Instruct-GPTQ
- No DOI given, and none found for title: Llama-2-7b-chat-hf
- No DOI given, and none found for title: CodeLlama-7b-Instruct-hf
- No DOI given, and none found for title: gemma-2b-it
- No DOI given, and none found for title: codegemma-2b-it
- No DOI given, and none found for title: AutoDoc-ChatGPT
- No DOI given, and none found for title: AutoDoc
- No DOI given, and none found for title: Auto-GitHub-Docs-Generator
- No DOI given, and none found for title: Sentence Transformers: all-mpnet-base-v2

❌ MISSING DOIs

- None

❌ INVALID DOIs

- None

crvernon commented 2 days ago

👋 @souradipp76, @Manvi-Agrawal, and @camilochs - This is the review thread for the paper. All of our communications will happen here from now on.

Please read the "Reviewer instructions & questions" in the first comment above.

Both reviewers have checklists at the top of this thread (in that first comment) with the JOSS requirements. As you go over the submission, please check any items that you feel have been satisfied. There are also links to the JOSS reviewer guidelines.

The JOSS review is different from most other journals. Our goal is to work with the authors to help them meet our criteria instead of merely passing judgment on the submission. As such, the reviewers are encouraged to submit issues and pull requests on the software repository. When doing so, please mention https://github.com/openjournals/joss-reviews/issues/7489 so that a link is created to this thread (and I can keep an eye on what is happening). Please also feel free to comment and ask questions on this thread. In my experience, it is better to post comments/questions/suggestions as you come across them instead of waiting until you've reviewed the entire package.

We aim for the review process to be completed within about 4-6 weeks, but please make a start well ahead of this deadline, as JOSS reviews are by their nature iterative and any early feedback you can provide to the author will be very helpful in meeting this schedule.

editorialbot commented 2 days ago

Paper file info:

📄 Wordcount for paper.md is 1625

✅ The paper includes a Statement of need section

editorialbot commented 2 days ago

License info:

✅ License found: Apache License 2.0 (Valid open source OSI approved license)

editorialbot commented 2 days ago

👉 📄 Download article proof 📄 View article proof on GitHub 📄 👈

Manvi-Agrawal commented 1 day ago

Review checklist for @Manvi-Agrawal

- Conflict of interest
- Code of Conduct
- General checks
- Functionality
- Documentation
- Software paper

Manvi-Agrawal commented 1 day ago

@editorialbot generate pdf

editorialbot commented 1 day ago

👉 📄 Download article proof 📄 View article proof on GitHub 📄 👈

Manvi-Agrawal commented 1 day ago

Hi @souradipp76, I did a high-level pass and left an initial review. I would appreciate more insight into how to reproduce the results and run readme-ready locally with non-OpenAI models (I exhausted my free credits xD). Once you provide these details, I can do another round of review.