# Generative AI with LangChain, First Edition
This is the code repository for Generative AI with LangChain, First Edition, published by Packt.
Build large language model (LLM) apps with Python, ChatGPT, and other LLMs
Ben Auffarth
## About the book
ChatGPT and the GPT models by OpenAI have brought about a revolution not only in how we write and research but also in how we can process information. This book discusses the functioning, capabilities, and limitations of LLMs underlying chat systems, including ChatGPT and Bard. It also demonstrates, in a series of practical examples, how to use the LangChain framework to build production-ready and responsive LLM applications for tasks ranging from customer support to software development assistance and data analysis – illustrating the expansive utility of LLMs in real-world applications.
Unlock the full potential of LLMs within your projects as you navigate through guidance on fine-tuning, prompt engineering, and best practices for deployment and monitoring in production environments. Whether you're building creative writing tools, developing sophisticated chatbots, or crafting cutting-edge software development aids, this book will be your roadmap to mastering the transformative power of generative AI with confidence and creativity.
## Key Learnings
- Understand LLMs, their strengths and limitations
- Grasp generative AI fundamentals and industry trends
- Create LLM apps with LangChain like question-answering systems and chatbots
- Understand transformer models and attention mechanisms
- Automate data analysis and visualization using pandas and Python
- Grasp prompt engineering to improve performance
- Fine-tune LLMs and get to know the tools to unleash their power
- Deploy LLMs as a service with LangChain and apply evaluation strategies
- Privately interact with documents using open-source LLMs to prevent data leaks
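To give a flavor of what such apps look like in code, here is a minimal sketch of a question-answering chain built with LangChain's expression language (LCEL). It is not taken from the book: it assumes LangChain 0.1.x with the `langchain-openai` package installed and an `OPENAI_API_KEY` environment variable set, and the model name and prompt are illustrative only.

```python
# Minimal sketch: a prompt composed with a chat model via LCEL.
# Assumes langchain-openai is installed and OPENAI_API_KEY is set.
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

prompt = ChatPromptTemplate.from_template(
    "Answer the customer question in one short paragraph: {question}"
)
llm = ChatOpenAI(model="gpt-3.5-turbo", temperature=0)
chain = prompt | llm  # LCEL: pipe the prompt into the model

answer = chain.invoke({"question": "How do I reset my password?"})
print(answer.content)
```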
## Note to Readers
Thank you for choosing "Generative AI with LangChain"! We appreciate your enthusiasm and feedback.
Please note that we've released an updated version of the book. Consequently, there are two different branches for this repository:
* [main](https://github.com/benman1/generative_ai_with_langchain/tree/main) - this is the original version of the book.
* [softupdate](https://github.com/benman1/generative_ai_with_langchain/tree/softupdate) - this is for the latest update of the book, corresponding to ver 0.1.13 of LangChain.
Please refer to the version that you are interested in or that corresponds to your version of the book.
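For example, to work with the updated code, you can clone the repository and switch to the `softupdate` branch (standard git commands, shown here for convenience):

```bash
git clone https://github.com/benman1/generative_ai_with_langchain.git
cd generative_ai_with_langchain
git checkout softupdate
```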
## Download a free PDF
_If you have already purchased an up-to-date print or Kindle version of this book, you can get a DRM-free PDF version at no cost. Simply click on the link to claim your free PDF._
[Free-Ebook](https://packt.link/free-ebook/9781835083468)
We also provide a PDF file that has color images of the screenshots/diagrams used in this book at [GraphicBundle](https://packt.link/gbp/9781835083468)
## Commitment
Code Updates: Our commitment is to provide you with stable and valuable code examples. While LangChain is known for frequent updates, we understand the importance of aligning our code with the latest changes. The companion repository is regularly updated to harmonize with LangChain developments.
Expect Stability: For stability and usability, the repository might not match every minor LangChain update. We aim for consistency and reliability to ensure a seamless experience for our readers.
How to Reach Us: Encountering issues or have suggestions? Please don't hesitate to open an issue, and we'll promptly address it. Your feedback is invaluable, and we're here to support you in your journey with LangChain.
Thank you for your understanding and happy coding!
## Know more on the Discord server
You can engage with the author and other readers on the Discord server, where you'll also find the latest updates and community discussions: [Discord](https://packt.link/lang)
## Chapters
In the following table, you can find links to the chapter directories in this repository, along with the notebooks each directory contains. Please note that the chapter directories also include Python scripts and projects that are not notebooks.
| Chapter | Code | Notebooks |
| :-------- | :-------- | :-------- |
| **Chapter 1: What Is Generative AI?** | no code examples | |
| **Chapter 2: LangChain for LLM Apps** | no code examples | |
| **Chapter 3: Getting Started with LangChain** | [directory](chapter3) | LLMs_chat_models_and_prompts.ipynb, Running_local_models.ipynb, customer_service_helper.ipynb, customer_service_use_case.ipynb |
| **Chapter 4: Building Capable Assistants** | [directory](chapter4) | information_extraction.ipynb, mitigating_hallucinations.ipynb |
| **Chapter 5: Building a Chatbot like ChatGPT** | [directory](chapter5) | see directory |
| **Chapter 6: Developing Software with Generative AI** | [directory](chapter6) | software_development.ipynb |
| **Chapter 7: LLMs for Data Science** | [directory](chapter7) | see directory |
| **Chapter 8: Customizing LLMs and Their Output** | [directory](chapter8) | see directory |
| **Chapter 9: Generative AI in Production** | [directory](chapter9) | see directory |
| **Chapter 10: The Future of Generative Models** | no code examples | |
## Requirements for this book
### Software and hardware list
This is the companion repository for the book. Here are a few instructions to help you get set up. Please also see chapter 3.
All chapters rely on Python.
| Chapter | Software required | Link to the software | Hardware specifications | OS required |
|:---: |:---: |:---: |:---: |:---: |
| All chapters | Python 3.11 | [https://www.python.org/downloads/](https://www.python.org/downloads/) | Should work on any recent computer | Windows, macOS, Linux (any) |
Please note that Python 3.12 might not work (see [#11](/../../issues/11)).
### Environment
You can install your local environment with conda (recommended) or pip. The environment configurations for conda, pip, and poetry are provided. They have all been tested on macOS. Please note that if you choose pip as your installation tool, you might need to install additional system dependencies.
On Windows, some people have experienced difficulties with conda and pip (because of readline and ncurses). If that's the case for you, please have a look at [WSL](https://learn.microsoft.com/en-us/windows/wsl/install) or use the Docker installation. Some people on Windows reported that they [needed](https://stackoverflow.com/questions/73969269/error-could-not-build-wheels-for-hnswlib-which-is-required-to-install-pyprojec/76245995#76245995) to install the Visual C++ Build Tools. In any case, if you have any problems with the environment, please raise an issue showing the error you got. If you feel confident that you have found an improvement, please go ahead and create a pull request.
For pip and poetry, make sure pandoc is installed on your system. On macOS, use brew:
```bash
brew install pandoc
```
On Ubuntu or Debian Linux, use apt:
```bash
sudo apt-get install pandoc
```
On Windows, you can use an [installer](https://github.com/jgm/pandoc/releases/latest).
### Conda
This is the recommended method for installing dependencies. Please make sure you have [anaconda](https://www.anaconda.com/download) installed.
First create the environment for the book that contains all the dependencies:
```bash
conda env create --file langchain_ai.yaml --force
```
The conda environment is called `langchain_ai`. You can activate it as follows:
```bash
conda activate langchain_ai
```
### Pip
[Pip](https://pypi.org/project/pip/) is the default dependency management tool in Python. With pip, you should be able to install all the libraries from the requirements file:
```bash
pip install -r requirements.txt
```
If you are working with a slow internet connection, you might see a timeout with pip (this can also happen with conda). As a workaround, you can increase the timeout setting like this:
```bash
export PIP_DEFAULT_TIMEOUT=100
```
### Docker
There's a [docker](https://www.docker.com/) file for the environment as well. It builds the environment and starts a Jupyter notebook server. To use it, first build the image, and then run it:
```bash
docker build -t langchain_ai .
docker run -it -p 8888:8888 langchain_ai
```
You should be able to find the notebook in your browser at [http://localhost:8888](http://localhost:8888).
### Poetry
Make sure you have [poetry](https://python-poetry.org/) installed. On Linux and macOS, you should be able to install the dependencies directly:
```bash
poetry install --no-root
```
This should take the `pyproject.toml` file and install all dependencies.
## Setting API keys
Following security best practices, I am not committing my credentials to GitHub. You might see `import` statements mentioning a `config.py` file, which is not included in the repository. This module has a method `set_environment()` that sets all the keys as environment variables like this:
Example config.py:
```python
import os
def set_environment():
    os.environ['OPENAI_API_KEY'] = 'your-api-key-here'
```
Obviously, you'd put your actual API credentials here. Depending on the integration (OpenAI, Azure, etc.), you need to add the corresponding API keys. The OpenAI API key is the one used most often across the code.
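For reference, here is how the notebooks typically use this module. This is a sketch that assumes `config.py` sits next to the notebook or is on your `PYTHONPATH`:

```python
# Load the API keys into the environment before creating any LLM clients.
from config import set_environment

set_environment()
```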
You can find more details about API credentials and setup in chapter 3 of the book [Generative AI with LangChain](https://www.amazon.com/Generative-AI-LangChain-language-ChatGPT-ebook/dp/B0CBBL55PQ).
## Contributing
If you find anything amiss with the notebooks or dependencies, please feel free to create a pull request.
If you want to change the conda dependency specification (the yaml file), you can test it like this:
```bash
conda env create --file langchain_ai.yaml --force
```
You can update the pip requirements like this:
```bash
pip freeze > requirements.txt
```
Please make sure that you keep these two ways of maintaining dependencies in sync.
Then make sure you test the notebooks in the new environment to see that they run.
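One way to smoke-test a notebook headlessly is with `jupyter nbconvert` (the notebook path below is just an example):

```bash
# Executes the notebook and writes the result to a *.nbconvert.ipynb copy.
jupyter nbconvert --to notebook --execute chapter3/customer_service_helper.ipynb
```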
### Code validation
I've included a `Makefile` that contains targets for validation with flake8, mypy, and other tools. For example, I run mypy like this:
```bash
make typecheck
```
To run the code validation with ruff, please run:
```bash
ruff check .
```
## Get to know the author
_Ben Auffarth_ is a full-stack data scientist with more than 15 years of work experience. With a background and Ph.D. in computational and cognitive neuroscience, he has designed and conducted wet lab experiments on cell cultures, analyzed experiments with terabytes of data, run brain models on IBM supercomputers with up to 64k cores, built production systems processing hundreds of thousands of transactions per day, and trained language models on a large corpus of text documents. He co-founded and is the former president of Data Science Speakers, London.
## Other Related Books