While the core of grammY is extremely efficient, the package does not ship with a built-in mechanism for long polling at scale. (It does scale well with webhooks, though.)
The grammY runner solves this by providing you with a sophisticated mechanism that can pull updates concurrently from the Telegram servers, and in turn execute your bot's middleware stack concurrently, all while catching errors, timeouts, and giving you full control over how much load is applied to your server.
Use the grammY runner package if your bot needs to process a large number of updates via long polling, or if it performs long-running work inside its middleware. Do not use grammY runner if you are just getting started with grammY, or if your bot runs on webhooks, which already scale well without it.
Here is a quickstart for you, but the real documentation lives on the grammY website. The runner package has many more features, and they are documented there.
```bash
npm i @grammyjs/runner
```
Import `run` from `@grammyjs/runner`, and replace `bot.start()` with `run(bot)`. It is that simple. Done!
Okay okay, here is some example code:
import { Bot } from "grammy";
import { run } from "@grammyjs/runner";
// Create bot
const bot = new Bot("<token>");
// Add the usual middleware, yada yada
bot.on("message", (ctx) => ctx.reply("Got your message."));
// Run it concurrently!
run(bot);
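Unlike `bot.start()`, `run(bot)` returns a handle that you can use to stop the runner again, so it can stop fetching updates without killing work that is already in flight. Here is a minimal sketch of a graceful shutdown, assuming a Node.js process (the signal names and handler wiring are illustrative):

```ts
import { Bot } from "grammy";
import { run } from "@grammyjs/runner";

const bot = new Bot("<token>");
bot.on("message", (ctx) => ctx.reply("Got your message."));

// `run` returns a handle rather than blocking like `bot.start()`.
const runner = run(bot);

// On shutdown signals, stop fetching new updates and let the
// middleware that is already in flight finish cleanly.
const stopRunner = () => runner.isRunning() && runner.stop();
process.once("SIGINT", stopRunner);
process.once("SIGTERM", stopRunner);
```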
grammY runner makes it trivial to have very high update throughput. However, concurrency is generally very hard to get right, so please read this section in the docs.
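Concurrency bites, for example, when two updates from the same chat touch shared state (sessions, counters, database rows) at the same time. The runner package exports a `sequentialize` function that creates middleware from a key function: updates that map to the same key are processed in order, while everything else still runs concurrently. A minimal sketch, where the per-chat key function `getChatKey` is just one possible constraint:

```ts
import { Bot, Context } from "grammy";
import { run, sequentialize } from "@grammyjs/runner";

const bot = new Bot("<token>");

// Updates that resolve to the same key are handled in order;
// updates without a chat (inline queries, etc.) yield `undefined` here.
function getChatKey(ctx: Context) {
  return ctx.chat?.id.toString();
}

// Install sequentialize before any middleware that touches shared state.
bot.use(sequentialize(getChatKey));

bot.on("message", (ctx) => ctx.reply("Got your message."));

// Still runs concurrently across chats, but never races within one chat.
run(bot);
```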
Further reading:

- The grammY website has more verbose documentation about concurrency in grammY.
- The API reference documents everything that grammY runner exports.
- The grammY example bots repository is full of example bots; look out for those that demonstrate how to use grammY runner.