It was great to work on and build this project. Tuti is now deprecated: https://divyendusingh.com/farewell-tuti
Note: this was a private project that has since been made public. Some CI/CD scripts might be outdated. It is provided for reference only and might not be deployable as is.
Built with telegraf, Telegram Bot API, AWS and ❤️
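To give a feel for the stack, here is a minimal telegraf sketch (illustrative only, not the bot's actual code; `BOT_TOKEN` is an assumed variable name):

```ts
import { Telegraf } from "telegraf";

// Minimal sketch only; the real bot in this repo is more involved.
// BOT_TOKEN is an assumed env var name.
const bot = new Telegraf(process.env.BOT_TOKEN ?? "");

bot.start((ctx) => ctx.reply("Hello from lingoparrot!"));
bot.on("text", (ctx) => ctx.reply(`You said: ${ctx.message.text}`));

// Long polling is handy for quick experiments; the deployed bot receives
// updates via a webhook instead.
bot.launch();
```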
```sh
docker build . -t lingoparrot
docker tag lingoparrot:latest <docker remote>
docker run -p 3000:3000 --env-file ./.env.local.docker lingoparrot:latest
docker push <docker remote>
```
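To sanity-check the running container, you can POST a dummy update to it. This is a hedged sketch: the `/webhook` path is a guess, so use whatever route the serverless config actually exposes.

```ts
// Smoke test against the container started above (port 3000).
// The /webhook path is an assumption, not necessarily the real route.
async function smokeTest(): Promise<void> {
  const res = await fetch("http://localhost:3000/webhook", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ update_id: 1 }),
  });
  console.log(`Container responded with HTTP ${res.status}`);
}

smokeTest();
```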
```sh
cd carrd-language-learners-20220623
npx serve
```
We use a specific dev/production setup. All the relevant yarn scripts have two variants, for example `yarn run set-webhook` and `yarn run set-webhook-production`. The first one uses `.env` and the second one uses `.env.production`. With sensible defaults, I believe this setup is the most convenient for bot development and yields the fewest mistakes. Open to feedback.
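As one hedged illustration of how such variants can be wired (not necessarily how this repo does it), the `-production` scripts could set `NODE_ENV=production` and the code would then load the matching env file with dotenv:

```ts
import { config } from "dotenv";

// Illustrative only: assume the "-production" script variants set
// NODE_ENV=production, then load the matching env file.
const envFile =
  process.env.NODE_ENV === "production" ? ".env.production" : ".env";
config({ path: envFile });
```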
Copy the `.env_sample` file to `.env` (and to `.env.production` for a production setup - details later) and fill in the required values.
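The authoritative list of keys is `.env_sample` itself; as an illustration (placeholder key names, not the real ones), a small startup check can fail fast when something is missing:

```ts
// Placeholder key names; consult .env_sample for the actual required values.
const requiredKeys = ["BOT_TOKEN"];

for (const key of requiredKeys) {
  if (!process.env[key]) {
    throw new Error(`Missing required environment variable: ${key}`);
  }
}
```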
Run the following commands to start receiving `@<bot-name>` requests on your local machine.
- `yarn dev` (internally uses `sls offline start`, supports hot reloading).
- `yarn run set-webhook` (uses `.env` and the localtunnel URL; see the sketch below).
- `yarn run watch` - to watch and compile TS to JS.

Generally, I have two versions of the bot, i.e. development and production. I point my dev bot to my local setup using the above steps.
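The `set-webhook` script presumably points Telegram at the public localtunnel URL; a minimal sketch of that call with telegraf (`BOT_TOKEN`, `TUNNEL_URL` and the `/webhook` path are assumptions):

```ts
import { Telegraf } from "telegraf";

// Sketch only; BOT_TOKEN, TUNNEL_URL and the /webhook path are assumed names,
// not necessarily what the repo's set-webhook script uses.
const bot = new Telegraf(process.env.BOT_TOKEN ?? "");

async function setWebhook(): Promise<void> {
  // Point Telegram at the public tunnel URL so updates reach the local server.
  await bot.telegram.setWebhook(`${process.env.TUNNEL_URL}/webhook`);
}

setWebhook();
```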
- `yarn run deploy` - deploys to Lambda using parameters from `.env`
- `yarn run deploy-production` - deploys to Lambda using parameters from `.env.production`
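For reference, a hedged sketch of what a Lambda webhook entry point for a telegraf bot can look like (the handler in this repo may be structured differently):

```ts
import { Telegraf } from "telegraf";
import type { APIGatewayProxyEvent, APIGatewayProxyResult } from "aws-lambda";

// Sketch of a Lambda webhook entry point; the repo's real handler may differ.
const bot = new Telegraf(process.env.BOT_TOKEN ?? "");

export const webhook = async (
  event: APIGatewayProxyEvent
): Promise<APIGatewayProxyResult> => {
  // Telegram POSTs each update as JSON; hand it to telegraf for routing.
  await bot.handleUpdate(JSON.parse(event.body ?? "{}"));
  return { statusCode: 200, body: "" };
};
```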