chidiewenike closed this issue 3 years ago
Preliminary Setup:
sudo apt-get update && sudo apt-get upgrade -y && sudo apt-get dist-upgrade -y
sudo apt-get install libatlas-base-dev python3-dev python3-pip libhdf5-dev python-pip python-dev -y
sudo apt install -y python3-scipy
sudo apt install gfortran
sudo reboot
pip3 install rasa-nlu[spacy]
https://qengineering.eu/install-tensorflow-2.2.0-on-raspberry-pi-4.html
$ sudo apt-get update
$ sudo apt-get upgrade
$ sudo pip uninstall tensorflow
$ sudo pip3 uninstall tensorflow
$ sudo apt-get install gfortran
$ sudo apt-get install libhdf5-dev libc-ares-dev libeigen3-dev
$ sudo apt-get install libatlas-base-dev libopenblas-dev libblas-dev
$ sudo apt-get install liblapack-dev cython
$ sudo pip3 install pybind11
$ sudo pip3 install h5py
$ sudo pip3 install --upgrade setuptools
$ pip install gdown
$ export PATH=$PATH:/home/pi/.local/bin
$ gdown https://drive.google.com/uc?id=11mujzVaFqa7R1_lB7q0kVPW22Ol51MPg
$ sudo -H pip3 install tensorflow-2.2.0-cp37-cp37m-linux_armv7l.whl
$ reboot
ERROR: Could not find a version that satisfies the requirement tensorflow-addons<0.8.0,>=0.7.1 (from rasa) (from versions: none)
ERROR: No matching distribution found for tensorflow-addons<0.8.0,>=0.7.1 (from rasa)
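This failure comes from pip's platform-tag matching: `tensorflow-addons<0.8.0` appears to have published wheels only for x86-64 platforms, so on a 32-bit Pi (`armv7l`) pip finds nothing to install. A rough sketch of the check pip performs (the helper and the wheel list are illustrative, not pip's real API):

```python
import platform

# Architectures for which tensorflow-addons 0.7.x published wheels
# (illustrative set, not read from PyPI).
PREBUILT_MACHINES = {"x86_64", "AMD64"}

def has_prebuilt_wheel(machine: str) -> bool:
    """Hypothetical stand-in for pip's platform-tag matching."""
    return machine in PREBUILT_MACHINES

print(has_prebuilt_wheel("armv7l"))            # False: the Pi 4 on 32-bit Raspbian
print(has_prebuilt_wheel(platform.machine()))  # whatever this host reports
```

Because there is no matching wheel and no source build that succeeds on the Pi, the alternatives below (Docker, or building TensorFlow from source) route around the dependency.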
Alternatively, run Docker on the Pi:
https://github.com/rgstephens/rasaPi
Alternatively, build TensorFlow from source.
I was able to get this running on my Pi 4 (1 GB), although I haven't gotten much further than training a model yet. All I had to do was change the version in the docker-compose.yml file to 3.3, and it worked after that.
Overview of how I got Rasa running using this with docker-compose on a Raspberry Pi 4b (1 GB):
curl -sSL https://get.docker.com | sh
sudo usermod -aG docker pi
You'll have to re-login (or possibly restart your shell) after doing this.
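A quick way to confirm the group change took effect after re-login (a hypothetical check, not part of the original steps):

```shell
# Membership in the "docker" group is what lets a non-root user talk
# to the Docker daemon socket. `id -nG` lists the current user's groups.
if id -nG | grep -qw docker; then
  echo "docker group active"
else
  echo "log out and back in (or run: newgrp docker)"
fi
```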
Modify the docker-compose.yml file so that the version is 3.3 and the Rasa version is 1.10.10. The contents of my docker-compose.yml are below:
version: '3.3'
services:
  rasa:
    image: rasa:1.10.10
    expose:
      - 5005
    ports:
      - 5005:5005
    volumes:
      - ./:/app
    command:
      - run
Modify endpoints.yml to the following:
action_endpoint:
  url: "http://app:5055/webhook"
This sets up the webhook endpoint. I believe app is the hostname inside the Docker container.
- Run make docker to build the Docker image.
- Run docker-compose run rasa init to set up Rasa inside the image (the app dir will be on persistent storage), using the default directory.
- Run docker-compose up. If you'd like to run detached, you can run docker-compose up -d.
- Test with:
curl -XPOST http://localhost:5005/webhooks/rest/webhook \
  -H "Content-type: application/json" \
  -d '{"sender": "test", "message": "Hey"}'
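The same smoke test can be written with Python's standard library; the host and port mirror the curl call. This sketch only builds the request — actually sending it requires the container to be up:

```python
import json
import urllib.request

def build_webhook_request(host: str, sender: str, message: str) -> urllib.request.Request:
    """Build the POST that mirrors the curl smoke test above."""
    payload = json.dumps({"sender": sender, "message": message}).encode("utf-8")
    return urllib.request.Request(
        f"http://{host}:5005/webhooks/rest/webhook",
        data=payload,
        headers={"Content-type": "application/json"},
        method="POST",
    )

req = build_webhook_request("localhost", "test", "Hey")
print(req.full_url)  # http://localhost:5005/webhooks/rest/webhook
# urllib.request.urlopen(req) would return Rasa's JSON reply once the
# server is running.
```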
We could probably run with outside-world access, but I'm not entirely sure what that would take (perhaps something as simple as changing the IP we listen on from app to 0.0.0.0).
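If the bind address does turn out to be the blocker, `rasa run` accepts an interface flag; a hypothetical docker-compose override (untested here) would look like:

```yaml
# Hypothetical command override for the rasa service: bind the HTTP
# server to all interfaces rather than a single hostname.
command:
  - run
  - -i
  - "0.0.0.0"
```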
My model was really simple - I didn't change anything from the default. After the initial training, my CPU usage stayed fairly constant below 20% on one core (hovering around 15% most of the time, spiking to 30% only very briefly) while querying with simple phrases, and my total system RAM usage stayed below 400 MB. We should try larger models, but it seems we may not need a Pi with more RAM after all.
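One way to sample the same figures on the Pi while querying (hypothetical one-shot commands; `free` and `top` ship with Raspbian):

```shell
# One-shot samples of memory and CPU load. `free -m` reports usage in
# MB; `top -bn1` emits a single batch-mode snapshot.
free -m | awk '/^Mem:/ {print "RAM used (MB):", $3}'
top -bn1 | awk '/%Cpu/ {print $0; exit}'
```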
@snekiam successfully installed Rasa via Docker. Closing as a result.
Objective
Installing a working version of Rasa on Raspbian Buster to be used on a Raspberry Pi.
Key Result
Details
Explore solutions for installing Rasa on Raspbian Buster, which requires both TensorFlow and spaCy.