Designed for simplicity, customization, and developer productivity.
## 🔗 Quick Links
- Overview
- Demo
- Features
- Getting Started
- Configuration
- Examples
- Contributing
> [!IMPORTANT]
> ✨ Visit the [Official Documentation][mkdocs] for detailed guides and tutorials.
## 🔮 Overview
README-AI is a developer tool that automatically generates README markdown files using a robust repository processing engine and advanced language models. Simply provide a URL or path to your codebase, and a well-structured and detailed README will be generated.
### Why README-AI?
This tool is designed to streamline the documentation process for developers, saving time and effort while ensuring high-quality README files. Key benefits include:
- AI-Powered: Leverage language models for intelligent content generation.
- Consistency: Ensure clean, standardized documentation across projects.
- Customization: Tailor the output to fit your project's requirements.
- Language Agnostic: Works with most programming languages/frameworks.
- Save Time: Generate comprehensive READMEs in less than a minute.
## 👾 Demo

Running from the command line:

*(demo GIF: readmeai-cli-demo)*

Running directly in your browser:

*(demo GIF: readmeai-streamlit-demo)*
## ⚙️ Features

- 🚀 Automated Documentation: Generate comprehensive README files automatically from your codebase.
- 🎨 Customizable Output: Tailor the styling, formatting, badges, header designs, and more.
- 🤖 Flexible Backends: Seamlessly integrate with OpenAI, Ollama, Anthropic, and Google Gemini.
- 🌐 Language Agnostic: Compatible with a wide range of programming languages and project types.
- 📑 Offline Mode: Create boilerplate README files offline, without any external API calls.
- 📏 Best Practices: Ensures clean, professional documentation that adheres to markdown best practices.
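For example, the offline mode listed above can be exercised without an API key or network access (the repository path below is a placeholder):

```shell
readmeai --api offline --repository /path/to/your/project
```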
Let's take a look at some possible customizations created by readme-ai. Each option combination below produces a distinct README style:

- `--image custom --badge-color FF4B4B --badge-style flat-square --header-style classic`
- `--badge-color 00ADD8 --badge-style for-the-badge --header-style modern --toc-style roman`
- `--header-style ascii`
- `--badge-style for-the-badge --header-style svg`
- `--align left --badge-style flat-square --image cloud`
- `--align left --badge-style flat --image gradient`
- `--badge-style flat --image custom`
- `--badge-style skills-light --image grey`
- `--badge-style flat-square`
- `--badge-style flat --image black`
- `--image cloud --header-style compact --toc-style fold`
- `-i custom -bc BA0098 -bs flat-square -hs modern -ts fold`
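Any of these option sets can be passed in a single invocation. For example, the first combination above maps to the following command (the repository URL is shown for illustration):

```shell
readmeai --repository https://github.com/eli64s/readme-ai \
    --image custom \
    --badge-color FF4B4B \
    --badge-style flat-square \
    --header-style classic
```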
> [!IMPORTANT]
> See the [Official Documentation][mkdocs] for more information on customization options and best practices.
Next, let's explore the key sections of a typical README generated by readme-ai.
### 📍 Overview

**Overview**

- High-level introduction of the project, focused on the value proposition and use cases, rather than technical aspects.

### ✨ Features

**Features Table**

- Generated markdown table that highlights the key technical features and components of the codebase. This table is generated using a structured prompt template.

### 📖 Codebase Documentation

**Directory Tree**

- The project's directory structure is generated using pure Python and embedded in the README. See `readmeai.generators.tree` for more details.

**File Summaries**

- Summarizes key modules of the project, which are also used as context for downstream prompts.

### 🚀 Quickstart Instructions

**Getting Started Guides**

- Prerequisites and system requirements are extracted from the codebase during preprocessing. The parsers handle the majority of this logic currently.

**Installation Guide**

- Installation, usage, and testing guides are generated based on the project's dependency files and codebase configuration.

### 🔰 Contributing Guidelines

**Contributing Guide**

- Dropdown section that outlines the general process for contributing to your project.
- Provides links to your contributing guidelines, issues page, and more resources.
- A graph of contributors is also included.

**Additional Sections**

- Roadmap, Contributing Guidelines, License, and Acknowledgements are included by default.
## 🛸 Getting Started

### System Requirements

- **Python Version:** `3.9` or higher
- **Package Management/Container Runtime:** Choose one of the following:
  - [pip][pip]: Python's default package installer, recommended for most users.
  - [pipx][pipx]: Install and run readme-ai in an isolated environment.
  - [uv][uv]: Fastest way to install readme-ai with a single command.
  - [docker][docker]: Run readme-ai in a containerized environment.
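Before installing, you can quickly confirm that your interpreter meets the Python requirement (a minimal preflight sketch; not part of readme-ai itself):

```shell
# Fails with an AssertionError if the interpreter is older than 3.9
python3 -c 'import sys; assert sys.version_info >= (3, 9), sys.version'
echo "Python version OK"
```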
### Supported Repository Sources

The `readmeai` CLI can retrieve source code from the following Git hosting services or your local file system:

| Platform | Description | Resource |
|----------|-------------|----------|
| File System | Access repositories on your machine | [Learn more][file-system] |
| GitHub | World's largest code hosting platform | [GitHub.com][github] |
| GitLab | Complete DevOps platform | [GitLab.com][gitlab] |
| Bitbucket | Atlassian's Git solution | [Bitbucket.org][bitbucket] |
### Supported LLM API Providers

To unlock the full potential of `readmeai`, you'll need an account and API key from one of the providers below:

| Provider | Description | Resource |
|----------|-------------|----------|
| OpenAI | Recommended for general use | [OpenAI Developer quickstart][openai] |
| Anthropic | Advanced language models | [Anthropic Developer docs][anthropic] |
| Google Gemini | Google's multimodal AI model | [Gemini API quickstart][gemini] |
| Ollama | Free and open-source (no API key required) | [Ollama GitHub repository][ollama] |
| Offline Mode | Run readmeai without an LLM API | [Example offline mode README][offline-mode] |
## ⚙️ Installation

Choose your preferred installation method:

### Pip

Recommended method for most users:

```sh
❯ pip install readmeai
```

### Pipx

Use pipx to install and run readmeai in an isolated environment, ensuring no dependency conflicts with other Python projects:

```sh
❯ pipx install readmeai
```

### Uv

Use uv for the fastest way to install readmeai with a single command:

```sh
❯ uv tool install readmeai
```

### Docker

To run readmeai in a containerized environment, pull the latest Docker image from Docker Hub:

```sh
❯ docker pull zeroxeli/readme-ai:latest
```
### From source
1. **Clone the repository:**
```sh
โฏ git clone https://github.com/eli64s/readme-ai
```
2. **Navigate to the `readme-ai` directory:**
```sh
โฏ cd readme-ai
```
3. **Install dependencies:**
```sh
โฏ pip install -r setup/requirements.txt
```
Alternatively, the project can be set up using the bash script below:

### Bash
1. **Run the setup script:**
```sh
โฏ bash setup/setup.sh
```
Or, use `poetry` to build the project:

### Poetry
1. **Install dependencies using Poetry:**
```sh
โฏ poetry install
```
> [!IMPORTANT]
> To use the **Anthropic** and **Google Gemini** clients, extra dependencies are required. Install the package with the following extras:
>
> - **Anthropic:**
> ```sh
> โฏ pip install "readmeai[anthropic]"
> ```
> - **Google Gemini:**
> ```sh
> โฏ pip install "readmeai[google-generativeai]"
> ```
>
> - **Install Multiple Clients:**
> ```sh
> โฏ pip install "readmeai[anthropic,google-generativeai]"
> ```
## 🤖 Running the CLI
**1. Set Up Environment Variables**
With OpenAI:
```sh
โฏ export OPENAI_API_KEY=
# Or for Windows users:
โฏ set OPENAI_API_KEY=
```
### Additional Providers (Ollama, Anthropic, Google Gemini)
#### Ollama
Refer to the [Ollama documentation](https://github.com/ollama/ollama) for more information on setting up the Ollama API. Here is a basic example:
1. Pull your model of choice from the Ollama repository:
```sh
โฏ ollama pull mistral:latest
```
2. Start the Ollama server and set the `OLLAMA_HOST` environment variable:
```sh
โฏ export OLLAMA_HOST=127.0.0.1 && ollama serve
```
#### Anthropic
1. Export your Anthropic API key:
```sh
โฏ export ANTHROPIC_API_KEY=
```
#### Google Gemini
1. Export your Google Gemini API key:
```sh
❯ export GOOGLE_API_KEY=
```

**2. Generate a README**
Run the following command, replacing the repository URL with your own:
```sh
โฏ readmeai --repository https://github.com/eli64s/readme-ai --api openai
```
> [!IMPORTANT]
> By default, the `gpt-3.5-turbo` model is used. Higher costs may be incurred when using more advanced models.
Run with `Ollama` and set `llama3` as the model:
```sh
โฏ readmeai --api ollama --model llama3 --repository https://github.com/eli64s/readme-ai
```
Run with `Anthropic`:
```sh
โฏ readmeai --api anthropic -m claude-3-5-sonnet-20240620 -r https://github.com/eli64s/readme-ai
```
Run with `Google Gemini`:
```sh
โฏ readmeai --api gemini -m gemini-1.5-flash -r https://github.com/eli64s/readme-ai
```
Use a `local` directory path:
```sh
❯ readmeai --repository /path/to/your/project
```
Add more customization options:
```sh
❯ readmeai --repository https://github.com/eli64s/readme-ai \
    --output readmeai.md \
    --api openai \
    --model gpt-4 \
    --badge-color A931EC \
    --badge-style flat-square \
    --header-style compact \
    --toc-style fold \
    --temperature 0.9 \
    --tree-depth 2 \
    --image LLM \
    --emojis
```
### Docker
Run the Docker container with the OpenAI client:
```sh
โฏ docker run -it --rm \
-e OPENAI_API_KEY=$OPENAI_API_KEY \
-v "$(pwd)":/app zeroxeli/readme-ai:latest \
--repository https://github.com/eli64s/readme-ai \
--api openai
```
### From source

### Bash

If you installed the project from source with the bash script, run the following command:
1. **Activate the virtual environment:**
```sh
โฏ conda activate readmeai
```
2. **Run the CLI:**
```sh
โฏ python3 -m readmeai.cli.main -r https://github.com/eli64s/readme-ai
```
### Poetry
1. **Activate the virtual environment:**
```sh
โฏ poetry shell
```
2. **Run the CLI:**
```sh
โฏ poetry run python3 -m readmeai.cli.main -r https://github.com/eli64s/readme-ai
```
### Streamlit

Try readme-ai directly in your browser; no installation required. See the readme-ai-streamlit repository for more details.

[Open the app in Streamlit](https://readme-ai.streamlit.app/)
---
## 🧪 Testing
The [pytest](https://docs.pytest.org/en/7.2.x/contents.html) and [nox](https://nox.thea.codes/en/stable/) frameworks are used for development and testing.
Install the dependencies using Poetry:
```sh
โฏ poetry install --with dev,test
```
Run the unit test suite using Pytest:
```sh
โฏ make test
```
Run the test suite against Python 3.9, 3.10, 3.11, and 3.12 using Nox:
```sh
โฏ make test-nox
```
> [!TIP]
> Nox automates running the test suite across multiple Python environments, ensuring compatibility with different Python versions.
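If you prefer calling nox directly rather than through the make target, the standard nox CLI can inspect and filter sessions. The session names depend on this project's noxfile, so list them first:

```shell
nox --list      # show the sessions defined in noxfile.py
nox -p 3.12     # run the default sessions on Python 3.12 only
```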
---
## 🔡 Configuration
Customize your README generation using these CLI options:
| Option | Description | Default |
|-------------------|-----------------------------------------------|-------------------|
| `--align` | Text alignment in header | `center` |
| `--api` | LLM API service provider | `offline` |
| `--badge-color` | Badge color name or hex code | `0080ff` |
| `--badge-style` | Badge icon style type | `flat` |
| `--header-style` | Header template style | `classic` |
| `--toc-style` | Table of contents style | `bullet` |
| `--emojis` | Adds emojis to the README header sections | `False` |
| `--image` | Project logo image | `blue` |
| `--model` | Specific LLM model to use | `gpt-3.5-turbo` |
| `--output` | Output filename | `readme-ai.md` |
| `--repository` | Repository URL or local directory path | `None` |
| `--temperature` | Creativity level for content generation | `0.1` |
| `--tree-depth` | Maximum depth of the directory tree structure | `2` |
Run the following command to view all available options:
```sh
โฏ readmeai --help
```
Visit the [Official Documentation][mkdocs] for more detailed information on configuration options, examples, and best practices.
---
## 🎨 Examples

View example README files generated by readme-ai across various tech stacks:
| Technology | Example Output | Repository | Description |
|------------|---------------|------------|-------------|
| Readme-ai | [readme-ai.md][default] | [readme-ai][readme-ai] | Readme-ai project |
| Apache Flink | [readme-pyflink.md][modern-header] | [pyflink-poc][pyflink] | Pyflink project |
| Streamlit | [readme-streamlit.md][svg-banner] | [readme-ai-streamlit][streamlit] | Streamlit web app |
| Vercel & NPM | [readme-vercel.md][dalle-logo] | [github-readme-quotes][vercel] | Vercel deployment |
| Go & Docker | [readme-docker-go.md][for-the-badge] | [docker-gs-ping][docker-golang] | Dockerized Go app |
| FastAPI & Redis | [readme-fastapi-redis.md][fastapi-redis] | [async-ml-inference][fastapi] | Async ML inference service |
| Java | [readme-java.md][compact-header] | [Minimal-Todo][java] | Minimalist todo Java app |
| PostgreSQL & DuckDB | [readme-postgres.md][classic-header] | [Buenavista][postgres] | Postgres proxy server |
| Kotlin | [readme-kotlin.md][readme-kotlin] | [android-client][kotlin] | Android client app |
| Offline Mode | [offline-mode.md][offline-mode] | [litellm][litellm] | LLM API service |
Find additional README examples in the [examples directory](https://github.com/eli64s/readme-ai/tree/main/examples).
---
## 🏎💨 Roadmap

* [ ] Release `readmeai 1.0.0` with enhanced documentation management features.
* [ ] Develop a `VS Code extension` to generate README files directly in the editor.
* [ ] Develop `GitHub Actions` workflows to automate documentation updates.
* [ ] Add `badge packs` to provide additional badge styles and options.
  * [ ] Code coverage, CI/CD status, project version, and more.
---
## 🔰 Contributing
Contributions are welcome! Please read the [Contributing Guide][contributing] to get started.
- **💡 [Contributing Guide][contributing]**: Learn about our contribution process and coding standards.
- **🐛 [Report an Issue][issues]**: Found a bug? Let us know!
- **💬 [Start a Discussion][discussions]**: Have ideas or suggestions? We'd love to hear from you.
---
## 🙌 Acknowledgments
* [Shields.io](https://shields.io/)
* [Simple Icons](https://simpleicons.org/)
* [Aveek-Saha/GitHub-Profile-Badges](https://github.com/Aveek-Saha/GitHub-Profile-Badges)
* [Ileriayo/Markdown-Badges](https://github.com/Ileriayo/markdown-badges)
* [tandpfun/skill-icons](https://github.com/tandpfun/skill-icons)
[![][return]](#-quick-links)
---
#### 📄 License

Copyright © 2023 [readme-ai][readme-ai].
Released under the [MIT License][license].
[readme-ai]: https://github.com/eli64s/readme-ai
[return]: https://img.shields.io/badge/Back_to_top-5D4ED3?style=flat&logo=ReadMe&logoColor=white
[contributing]: https://github.com/eli64s/readme-ai/blob/main/CONTRIBUTING.md
[discussions]: https://github.com/eli64s/readme-ai/discussions
[issues]: https://github.com/eli64s/readme-ai/issues
[license]: https://github.com/eli64s/readme-ai/blob/main/LICENSE
[pulls]: https://github.com/eli64s/readme-ai/pulls "submit a pull request"
[mkdocs]: https://eli64s.github.io/readme-ai "Official Documentation"
[docker]: https://docs.docker.com/ "docker"
[pip]: https://pip.pypa.io/en/stable/ "pip"
[pipx]: https://pipx.pypa.io/stable/ "pipx"
[uv]: https://docs.astral.sh/uv/ "uv"
[file-system]: https://en.wikipedia.org/wiki/File_system "Learn more"
[github]: https://github.com/ "GitHub.com"
[gitlab]: https://gitlab.com/ "GitLab.com"
[bitbucket]: https://bitbucket.org/ "Bitbucket.org"
[openai]: https://platform.openai.com/docs/quickstart/account-setup "OpenAI Developer quickstart"
[anthropic]: https://docs.anthropic.com/en/home "Anthropic Developer docs"
[gemini]: https://ai.google.dev/tutorials/python_quickstart "Gemini API quickstart"
[ollama]: https://github.com/ollama/ollama "Ollama GitHub repository"
[default]: https://github.com/eli64s/readme-ai/blob/main/examples/readme-ai.md "readme-python.md"
[ascii-header]: https://github.com/eli64s/readme-ai/blob/main/examples/headers/ascii.md "ascii.md"
[classic-header]: https://github.com/eli64s/readme-ai/blob/main/examples/headers/classic.md "readme-postgres.md"
[compact-header]: https://github.com/eli64s/readme-ai/blob/main/examples/headers/compact.md "readme-java.md"
[modern-header]: https://github.com/eli64s/readme-ai/blob/main/examples/headers/modern.md "readme-pyflink.md"
[svg-banner]: https://github.com/eli64s/readme-ai/blob/main/examples/banners/svg-banner.md "readme-streamlit.md"
[dalle-logo]: https://github.com/eli64s/readme-ai/blob/main/examples/logos/dalle.md "readme-vercel.md"
[readme-kotlin]: https://github.com/eli64s/readme-ai/blob/main/examples/readme-kotlin.md "readme-kotlin.md"
[for-the-badge]: https://github.com/eli64s/readme-ai/blob/main/examples/readme-docker-go.md "readme-docker-go.md"
[fastapi-redis]: https://github.com/eli64s/readme-ai/blob/main/examples/readme-fastapi-redis.md "readme-fastapi-redis.md"
[offline-mode]: https://github.com/eli64s/readme-ai/blob/main/examples/offline-mode/readme-litellm.md "readme-litellm.md"
[pyflink]: https://github.com/eli64s/pyflink-poc "pyflink-poc"
[postgres]: https://github.com/jwills/buenavista "Buenavista"
[java]: https://github.com/avjinder/Minimal-Todo "minimal-todo"
[kotlin]: https://github.com/rumaan/file.io-Android-Client "android-client"
[docker-golang]: https://github.com/olliefr/docker-gs-ping "docker-gs-ping"
[vercel]: https://github.com/PiyushSuthar/github-readme-quotes "github-readme-quotes"
[streamlit]: https://github.com/eli64s/readme-ai-streamlit "readme-ai-streamlit"
[fastapi]: https://github.com/FerrariDG/async-ml-inference "async-ml-inference"
[litellm]: https://github.com/BerriAI/litellm "offline-mode"