
README file generator, powered by AI.
https://eli64s.github.io/readme-ai/
MIT License

readme-ai-banner-logo

Designed for simplicity, customization, and developer productivity.

github-actions codecov pypi-version pepy-total-downloads license


🔗 Quick Links

  1. Overview
  2. Demo
  3. Features
  4. Getting Started
  5. Configuration
  6. Examples
  7. Contributing

[!IMPORTANT] ✨ Visit the [Official Documentation][mkdocs] for detailed guides and tutorials.


🔮 Overview

README-AI is a developer tool that automatically generates README markdown files using a robust repository processing engine and advanced language models. Simply provide a URL or path to your codebase, and a well-structured and detailed README will be generated.
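For example, a single command pointed at a repository URL is enough to produce a README (a minimal sketch; the full command reference is under Running the CLI below):

```sh
# Generate a README for a hosted repository using the OpenAI API
❯ readmeai --repository https://github.com/eli64s/readme-ai --api openai
```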

Why README-AI?

This tool streamlines the documentation process for developers, saving time and effort while ensuring high-quality README files.


👾 Demo

Running from the command line:

readmeai-cli-demo

Running directly in your browser:

readmeai-streamlit-demo


โ˜„๏ธ Features

Let's take a look at some possible customizations created by readme-ai (a combined command sketch follows these examples):

custom-dragon-project-logo
--image custom --badge-color FF4B4B --badge-style flat-square --header-style classic
docker-go-readme-example
--badge-color 00ADD8 --badge-style for-the-badge --header-style modern --toc-style roman

ascii-readme-header-style
--header-style ascii

svg-
--badge-style for-the-badge --header-style svg
readme-header-with-cloud-logo
--align left --badge-style flat-square --image cloud
readme-header-with-gradient-markdown-logo
--align left --badge-style flat --image gradient
custom-balloon-project-logo
--badge-style flat --image custom
readme-header-with-skill-icons-light
--badge-style skills-light --image grey
readme-header-with-blue-markdown-logo
--badge-style flat-square
readme-header-with-black-readme-logo
--badge-style flat --image black

compact-readme-header
--image cloud --header-style compact --toc-style fold
readme-header-style-modern
-i custom -bc BA0098 -bs flat-square -hs modern -ts fold
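As a sketch, several of the flags shown above can be combined in a single run (all of these options are listed in the Configuration section below; the values simply reuse ones from the examples):

```sh
❯ readmeai --repository https://github.com/eli64s/readme-ai \
    --image custom \
    --badge-color FF4B4B \
    --badge-style flat-square \
    --header-style classic \
    --toc-style fold
```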

[!IMPORTANT] See the [Official Documentation][mkdocs] for more information on customization options and best practices.

Next, let's explore the key sections of a typical README generated by readme-ai.

๐Ÿ“ Overview
Overview

◎ A high-level introduction to the project, focused on the value proposition and use cases rather than technical aspects.

readme-overview-section
✨ Features
Features Table

◎ Generated markdown table that highlights the key technical features and components of the codebase. This table is generated using a structured prompt template.

readme-features-section
📃 Codebase Documentation
Directory Tree

◎ The project's directory structure is generated using pure Python and embedded in the README. See readmeai.generators.tree for more details.

directory-tree
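If the default depth is too shallow or too deep for your project, the embedded tree can be tuned from the CLI (a small sketch using the documented `--tree-depth` option):

```sh
# Limit the embedded directory tree to two levels (the documented default)
❯ readmeai --repository https://github.com/eli64s/readme-ai --tree-depth 2
```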
File Summaries

◎ Summarizes key modules of the project, which are also used as context for downstream prompts.

file-summaries
🚀 Quickstart Instructions
Getting Started Guides

◎ Prerequisites and system requirements are extracted from the codebase during preprocessing. The parsers currently handle the majority of this logic.

getting-started-section-prerequisites
Installation Guide

◎ Installation, Usage, and Testing guides are generated based on the project's dependency files and codebase configuration.

getting-started-section-usage-and-testing
🔰 Contributing Guidelines
Contributing Guide

◎ A dropdown section that outlines the general process for contributing to your project.

◎ Provides links to your contributing guidelines, issues page, and more resources.

◎ A graph of contributors is also included.

contributing-guidelines-section
Additional Sections

◎ Roadmap, Contributing Guidelines, License, and Acknowledgements are included by default.

footer-readme-section

🛸 Getting Started

System Requirements

Supported Repository Sources

The readmeai CLI can retrieve source code from the following Git hosting services or your local file system:

| Platform | Description | Resource |
|----------|-------------|----------|
| File System | Access repositories on your machine | [Learn more][file-system] |
| GitHub | World's largest code hosting platform | [GitHub.com][github] |
| GitLab | Complete DevOps platform | [GitLab.com][gitlab] |
| Bitbucket | Atlassian's Git solution | [Bitbucket.org][bitbucket] |
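Whichever source you use, it is passed through the same `--repository` flag (a sketch; the GitHub URL stands in for any supported host):

```sh
# Local project on your file system
❯ readmeai --repository /path/to/your/project

# Hosted repository (GitLab and Bitbucket URLs are passed the same way)
❯ readmeai --repository https://github.com/eli64s/readme-ai
```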

Supported LLM API Providers

To unlock the full potential of readmeai, you'll need an account and API key from one of the providers below:

| Provider | Description | Resource |
|----------|-------------|----------|
| OpenAI | Recommended for general use | [OpenAI Developer quickstart][openai] |
| Anthropic | Advanced language models | [Anthropic Developer docs][anthropic] |
| Google Gemini | Google's multimodal AI model | [Gemini API quickstart][gemini] |
| Ollama | Free and open-source (no API key required) | [Ollama GitHub repository][ollama] |
| Offline Mode | Run readmeai without an LLM API | [Example offline mode README][offline-mode] |
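The provider is selected with the `--api` flag. As a sketch, the two no-cost options look like this (offline mode is the documented default, and Ollama requires a locally running server as described under Running the CLI):

```sh
# Offline mode: no LLM API calls, no API key required
❯ readmeai --repository https://github.com/eli64s/readme-ai --api offline

# Ollama: free and local, shown here with the llama3 model
❯ readmeai --api ollama --model llama3 --repository https://github.com/eli64s/readme-ai
```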

โš™๏ธ Installation

Choose your preferred installation method:

 Pip

Recommended method for most users:

โฏ pip install readmeai

 Pipx

Use pipx to install readmeai in an isolated environment, ensuring no dependency conflicts with other Python projects:

❯ pipx install readmeai

 Uv

Use uv for the fastest way to install readmeai with a single command:

โฏ uv tool install readmeai

 Docker

To run readmeai in a containerized environment, pull the latest Docker image from Docker Hub:

❯ docker pull zeroxeli/readme-ai:latest

 From source

Click to expand instructions

1. **Clone the repository:**

   ```sh
   ❯ git clone https://github.com/eli64s/readme-ai
   ```

2. **Navigate to the `readme-ai` directory:**

   ```sh
   ❯ cd readme-ai
   ```

3. **Install dependencies:**

   ```sh
   ❯ pip install -r setup/requirements.txt
   ```

Alternatively, the project can be set up using the bash script below:

### Bash

1. **Run the setup script:**

   ```sh
   ❯ bash setup/setup.sh
   ```

Or, use `poetry` to build the project:

### Poetry

1. **Install dependencies using Poetry:**

   ```sh
   ❯ poetry install
   ```

> [!IMPORTANT]
> To use the **Anthropic** and **Google Gemini** clients, extra dependencies are required. Install the package with the following extras:
>
> - **Anthropic:**
>
>   ```sh
>   ❯ pip install "readmeai[anthropic]"
>   ```
>
> - **Google Gemini:**
>
>   ```sh
>   ❯ pip install "readmeai[google-generativeai]"
>   ```
>
> - **Install Multiple Clients:**
>
>   ```sh
>   ❯ pip install "readmeai[anthropic,google-generativeai]"
>   ```

## 🤖 Running the CLI

**1. Set Up Environment Variables**

With OpenAI:

```sh
❯ export OPENAI_API_KEY=

# Or for Windows users:
❯ set OPENAI_API_KEY=
```
Additional Providers (Ollama, Anthropic, Google Gemini)
Ollama
Refer to the [Ollama documentation](https://github.com/ollama/ollama) for more information on setting up the Ollama API. Here is a basic example:

1. Pull your model of choice from the Ollama repository:

   ```sh
   ❯ ollama pull mistral:latest
   ```

2. Start the Ollama server and set the `OLLAMA_HOST` environment variable:

   ```sh
   ❯ export OLLAMA_HOST=127.0.0.1 && ollama serve
   ```
Anthropic

1. Export your Anthropic API key:

   ```sh
   ❯ export ANTHROPIC_API_KEY=
   ```
Google Gemini

1. Export your Google Gemini API key:

   ```sh
   ❯ export GOOGLE_API_KEY=
   ```
**2. Generate a README**

Run the following command, replacing the repository URL with your own:

```sh
❯ readmeai --repository https://github.com/eli64s/readme-ai --api openai
```

> [!IMPORTANT]
> By default, the `gpt-3.5-turbo` model is used. Higher costs may be incurred when using more advanced models.

Run with `Ollama` and set `llama3` as the model:

```sh
❯ readmeai --api ollama --model llama3 --repository https://github.com/eli64s/readme-ai
```

Run with `Anthropic`:

```sh
❯ readmeai --api anthropic -m claude-3-5-sonnet-20240620 -r https://github.com/eli64s/readme-ai
```

Run with `Google Gemini`:

```sh
❯ readmeai --api gemini -m gemini-1.5-flash -r https://github.com/eli64s/readme-ai
```

Use a `local` directory path:

```sh
❯ readmeai --repository /path/to/your/project
```

Add more customization options:

```sh
❯ readmeai --repository https://github.com/eli64s/readme-ai \
    --output readmeai.md \
    --api openai \
    --model gpt-4 \
    --badge-color A931EC \
    --badge-style flat-square \
    --header-style compact \
    --toc-style fold \
    --temperature 0.9 \
    --tree-depth 2 \
    --image LLM \
    --emojis
```

### Docker

Run the Docker container with the OpenAI client:

```sh
❯ docker run -it --rm \
    -e OPENAI_API_KEY=$OPENAI_API_KEY \
    -v "$(pwd)":/app zeroxeli/readme-ai:latest \
    --repository https://github.com/eli64s/readme-ai \
    --api openai
```

### From source
Click to expand instructions

### Bash

If you installed the project from source with the bash script, run the following commands:

1. **Activate the virtual environment:**

   ```sh
   ❯ conda activate readmeai
   ```

2. **Run the CLI:**

   ```sh
   ❯ python3 -m readmeai.cli.main -r https://github.com/eli64s/readme-ai
   ```

### Poetry

1. **Activate the virtual environment:**

   ```sh
   ❯ poetry shell
   ```

2. **Run the CLI:**

   ```sh
   ❯ poetry run python3 -m readmeai.cli.main -r https://github.com/eli64s/readme-ai
   ```
### Streamlit

Try readme-ai directly in your browser, no installation required. See the readme-ai-streamlit repository for more details.

[](https://readme-ai.streamlit.app/)

---

## 🧪 Testing

The [pytest](https://docs.pytest.org/en/7.2.x/contents.html) and [nox](https://nox.thea.codes/en/stable/) frameworks are used for development and testing.

Install the dependencies using Poetry:

```sh
❯ poetry install --with dev,test
```

Run the unit test suite using Pytest:

```sh
❯ make test
```

Run the test suite against Python 3.9, 3.10, 3.11, and 3.12 using Nox:

```sh
❯ make test-nox
```

> [!TIP]
> Nox is an automation tool that runs the test suite across multiple Python environments. It is used to ensure compatibility across different Python versions.

---

## 🔡 Configuration

Customize your README generation using these CLI options:

| Option | Description | Default |
|--------|-------------|---------|
| `--align` | Text alignment in header | `center` |
| `--api` | LLM API service provider | `offline` |
| `--badge-color` | Badge color name or hex code | `0080ff` |
| `--badge-style` | Badge icon style type | `flat` |
| `--header-style` | Header template style | `classic` |
| `--toc-style` | Table of contents style | `bullet` |
| `--emojis` | Adds emojis to the README header sections | `False` |
| `--image` | Project logo image | `blue` |
| `--model` | Specific LLM model to use | `gpt-3.5-turbo` |
| `--output` | Output filename | `readme-ai.md` |
| `--repository` | Repository URL or local directory path | `None` |
| `--temperature` | Creativity level for content generation | `0.1` |
| `--tree-depth` | Maximum depth of the directory tree structure | `2` |

Run the following command to view all available options:

```sh
❯ readmeai --help
```

Visit the [Official Documentation][mkdocs] for more detailed information on configuration options, examples, and best practices.

---

## 🎨 Examples

View example README files generated by readme-ai across various tech stacks:

| Technology | Example Output | Repository | Description |
|------------|----------------|------------|-------------|
| Readme-ai | [readme-ai.md][default] | [readme-ai][readme-ai] | Readme-ai project |
| Apache Flink | [readme-pyflink.md][modern-header] | [pyflink-poc][pyflink] | Pyflink project |
| Streamlit | [readme-streamlit.md][svg-banner] | [readme-ai-streamlit][streamlit] | Streamlit web app |
| Vercel & NPM | [readme-vercel.md][dalle-logo] | [github-readme-quotes][vercel] | Vercel deployment |
| Go & Docker | [readme-docker-go.md][for-the-badge] | [docker-gs-ping][docker-golang] | Dockerized Go app |
| FastAPI & Redis | [readme-fastapi-redis.md][fastapi-redis] | [async-ml-inference][fastapi] | Async ML inference service |
| Java | [readme-java.md][compact-header] | [Minimal-Todo][java] | Minimalist todo Java app |
| PostgreSQL & DuckDB | [readme-postgres.md][classic-header] | [Buenavista][postgres] | Postgres proxy server |
| Kotlin | [readme-kotlin.md][readme-kotlin] | [android-client][kotlin] | Android client app |
| Offline Mode | [offline-mode.md][offline-mode] | [litellm][litellm] | LLM API service |

Find additional README examples in the [examples directory](https://github.com/eli64s/readme-ai/tree/main/examples).

---

## 🏎💨 Roadmap

* [ ] Release `readmeai 1.0.0` with enhanced documentation management features.
* [ ] Develop `Vscode Extension` to generate README files directly in the editor.
* [ ] Develop `GitHub Actions` to automate documentation updates.
* [ ] Add `badge packs` to provide additional badge styles and options.
  + [ ] Code coverage, CI/CD status, project version, and more.

---

## 🔰 Contributing

Contributions are welcome! Please read the [Contributing Guide][contributing] to get started.

- **💡 [Contributing Guide][contributing]**: Learn about our contribution process and coding standards.
- **🐛 [Report an Issue][issues]**: Found a bug? Let us know!
- **💬 [Start a Discussion][discussions]**: Have ideas or suggestions? We'd love to hear from you.

---

## 🙌 Acknowledgments

* [Shields.io](https://shields.io/)
* [Simple Icons](https://simpleicons.org/)
* [Aveek-Saha/GitHub-Profile-Badges](https://github.com/Aveek-Saha/GitHub-Profile-Badges)
* [Ileriayo/Markdown-Badges](https://github.com/Ileriayo/markdown-badges)
* [tandpfun/skill-icons](https://github.com/tandpfun/skill-icons)
[![][return]](#-quick-links)
---

#### 🎗 License

Copyright © 2023 [readme-ai][readme-ai].
Released under the [MIT License][license].

[readme-ai]: https://github.com/eli64s/readme-ai "readme-ai"
[return]: https://img.shields.io/badge/Back_to_top-5D4ED3?style=flat&logo=ReadMe&logoColor=white
[contributing]: https://github.com/eli64s/readme-ai/blob/main/CONTRIBUTING.md
[discussions]: https://github.com/eli64s/readme-ai/discussions
[issues]: https://github.com/eli64s/readme-ai/issues
[license]: https://github.com/eli64s/readme-ai/blob/main/LICENSE
[pulls]: https://github.com/eli64s/readme-ai/pulls "submit a pull request"
[mkdocs]: https://eli64s.github.io/readme-ai "Official Documentation"
[docker]: https://docs.docker.com/ "docker"
[pip]: https://pip.pypa.io/en/stable/ "pip"
[pipx]: https://pipx.pypa.io/stable/ "pipx"
[uv]: https://docs.astral.sh/uv/ "uv"
[file-system]: https://en.wikipedia.org/wiki/File_system "Learn more"
[github]: https://github.com/ "GitHub.com"
[gitlab]: https://gitlab.com/ "GitLab.com"
[bitbucket]: https://bitbucket.org/ "Bitbucket.org"
[openai]: https://platform.openai.com/docs/quickstart/account-setup "OpenAI Developer quickstart"
[anthropic]: https://docs.anthropic.com/en/home "Anthropic Developer docs"
[gemini]: https://ai.google.dev/tutorials/python_quickstart "Gemini API quickstart"
[ollama]: https://github.com/ollama/ollama "Ollama GitHub repository"
[default]: https://github.com/eli64s/readme-ai/blob/main/examples/readme-ai.md "readme-python.md"
[ascii-header]: https://github.com/eli64s/readme-ai/blob/main/examples/headers/ascii.md "ascii.md"
[classic-header]: https://github.com/eli64s/readme-ai/blob/main/examples/headers/classic.md "readme-postgres.md"
[compact-header]: https://github.com/eli64s/readme-ai/blob/main/examples/headers/compact.md "readme-java.md"
[modern-header]: https://github.com/eli64s/readme-ai/blob/main/examples/headers/modern.md "readme-pyflink.md"
[svg-banner]: https://github.com/eli64s/readme-ai/blob/main/examples/banners/svg-banner.md "readme-streamlit.md"
[dalle-logo]: https://github.com/eli64s/readme-ai/blob/main/examples/logos/dalle.md "readme-vercel.md"
[readme-kotlin]: https://github.com/eli64s/readme-ai/blob/main/examples/readme-kotlin.md "readme-kotlin.md"
[for-the-badge]: https://github.com/eli64s/readme-ai/blob/main/examples/readme-docker-go.md "readme-docker-go.md"
[fastapi-redis]: https://github.com/eli64s/readme-ai/blob/main/examples/readme-fastapi-redis.md "readme-fastapi-redis.md"
[offline-mode]: https://github.com/eli64s/readme-ai/blob/main/examples/offline-mode/readme-litellm.md "readme-litellm.md"
[pyflink]: https://github.com/eli64s/pyflink-poc "pyflink-poc"
[postgres]: https://github.com/jwills/buenavista "Buenavista"
[java]: https://github.com/avjinder/Minimal-Todo "minimal-todo"
[kotlin]: https://github.com/rumaan/file.io-Android-Client "android-client"
[docker-golang]: https://github.com/olliefr/docker-gs-ping "docker-gs-ping"
[vercel]: https://github.com/PiyushSuthar/github-readme-quotes "github-readme-quotes"
[streamlit]: https://github.com/eli64s/readme-ai-streamlit "readme-ai-streamlit"
[fastapi]: https://github.com/FerrariDG/async-ml-inference "async-ml-inference"
[litellm]: https://github.com/BerriAI/litellm "offline-mode"