Closed · thecoorum closed this issue 9 months ago
Is it a typo that you never register the composer instance called `middleware`?
Either way, this should not affect the reported problem. If you don't register it, the middleware tree will just complete faster.
As a first step, I would try to simplify the code so you can narrow down where the problem is. Do you use serverless functions or edge functions? The correct adapter is different for the two runtimes. Did you try the minimal example for Vercel from our example bots repository at https://github.com/grammyjs/examples/tree/main/setups, or did you follow the guide at https://grammy.dev/hosting/vercel? What changes if you use a minimal example with a single bot instance, rather than creating the bot on the fly? (I don't think it's related to that, but this just means that you can throw it out in order to pinpoint the issue.)
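To illustrate the single-instance variant, here is a minimal sketch of what that could look like; the route path, the BOT_TOKEN environment variable, and the std/http adapter choice are assumptions for the example, not taken from your code:

```ts
// app/api/bot/route.ts (hypothetical path)
import { type NextRequest } from "next/server";
import { Bot, webhookCallback } from "grammy";

// the bot is created once per function instance instead of once per request
const bot = new Bot(process.env.BOT_TOKEN!);
bot.command("start", (ctx) => ctx.reply("Hello!"));

const handleUpdate = webhookCallback(bot, "std/http");

export const POST = (req: NextRequest) => handleUpdate(req);
```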
Hey @KnorpelSenf! I'm using Serverless Functions for the API routes. I also tried to simplify the implementation to use only the `start` command, which responds with `ctx.replyWithPhoto`.
Regarding the middleware: it's not a typo, I just omitted some code (I don't know why, actually). Here is the full route:
```ts
import { NextRequest, NextResponse } from "next/server";
import {
Bot,
Composer,
webhookCallback,
session,
enhanceStorage,
} from "grammy";
import { conversations, createConversation } from "@grammyjs/conversations";
import { freeStorage } from "@grammyjs/storage-free";
import { handlers } from "@/bot/handlers";
import { start } from "@/bot/commands/start";
import { request } from "@/bot/commands/request";
import { request as requestConversation } from "@/bot/conversations/request";
import { details } from "@/bot/callbacks/details";
import { supabase } from "@/utils/supabase";
import type { Context } from "@/bot/types";
export const POST = async (req: NextRequest, ...args: any[]) => {
const { data, error } = await supabase
.from("companies")
.select("bot_token")
.eq("slug", req.headers.get("host")!.split(".")[0])
.single();
if (error) {
return NextResponse.json({ ...error }, { status: 500 });
}
const token = data.bot_token;
const middleware = new Composer<Context>();
middleware.command("start", start);
middleware.command("request", request);
const bot = new Bot<Context>(token);
bot.use(
session({
initial: () => ({
name: "",
slug: "",
phone: "",
}),
storage: enhanceStorage({
storage: freeStorage(token),
millisecondsToLive: 10 * 60 * 1000,
}),
})
);
bot.use(conversations());
bot.use(createConversation(requestConversation, "request"));
bot.use(handlers);
bot.use(middleware);
bot.callbackQuery("request", (ctx) => ctx.conversation.enter("request"));
bot.callbackQuery("details", details);
bot.on("message", async (ctx) => {
await ctx.reply("Я поки що не знаю що з цим робити");
});
bot.catch((err) => {
console.error("Error:", err);
});
const handleUpdate = webhookCallback(bot, "std/http", "throw", 15_000);
return handleUpdate(req, ...args);
};
```
I also tried to log every step with Sentry, and I can see that even the last log message is displayed in my console. So my assumption is that something goes wrong in `webhookCallback`:
```ts
import { NextRequest, NextResponse } from "next/server";
import {
Bot,
Composer,
webhookCallback,
session,
enhanceStorage,
} from "grammy";
import { conversations, createConversation } from "@grammyjs/conversations";
import { freeStorage } from "@grammyjs/storage-free";
import * as Sentry from "@sentry/nextjs"
import { handlers } from "@/bot/handlers";
import { start } from "@/bot/commands/start";
import { request } from "@/bot/commands/request";
import { request as requestConversation } from "@/bot/conversations/request";
import { details } from "@/bot/callbacks/details";
import { supabase } from "@/utils/supabase";
import type { Context } from "@/bot/types";
export const POST = async (req: NextRequest, ...args: any[]) => {
const { data, error } = await supabase
.from("companies")
.select("bot_token")
.eq("slug", req.headers.get("host")!.split(".")[0])
.single();
if (error) {
return NextResponse.json({ ...error }, { status: 500 });
}
const token = data.bot_token;
Sentry.captureMessage('middleware = new Composer')
const middleware = new Composer<Context>();
Sentry.captureMessage('middleware = new Composer done')
Sentry.captureMessage('middleware Composer use')
middleware.command("start", start);
middleware.command("request", request);
Sentry.captureMessage('middleware Composer use done')
Sentry.captureMessage(`new Bot: ${token}`)
const bot = new Bot<Context>(token);
Sentry.captureMessage('new Bot done')
Sentry.captureMessage('bot.use')
bot.use(
session({
initial: () => ({
name: "",
slug: "",
phone: "",
}),
storage: enhanceStorage({
storage: freeStorage(token),
millisecondsToLive: 10 * 60 * 1000,
}),
})
);
Sentry.captureMessage('bot.use done')
Sentry.captureMessage('bot.use conversations')
bot.use(conversations());
bot.use(createConversation(requestConversation, "request"));
Sentry.captureMessage('bot.use conversations done')
Sentry.captureMessage('bot.use handlers')
bot.use(handlers);
bot.use(middleware);
Sentry.captureMessage('bot.use handlers done')
Sentry.captureMessage('bot.callbackQuery')
bot.callbackQuery("request", (ctx) => ctx.conversation.enter("request"));
bot.callbackQuery("details", details);
Sentry.captureMessage('bot.callbackQuery done')
Sentry.captureMessage('bot.message')
bot.on("message", async (ctx) => {
await ctx.reply("Я поки що не знаю що з цим робити");
});
Sentry.captureMessage('bot.message done')
bot.catch((err) => {
Sentry.captureException(err);
return NextResponse.json({ ...err }, { status: 500 });
});
Sentry.captureMessage('handleUpdate')
const handleUpdate = webhookCallback(bot, "std/http", "throw", 15_000);
Sentry.captureMessage('handleUpdate done')
return handleUpdate(req, ...args);
};
```
Ah well, we do not have support for `next/server` yet. The `next-js` adapter is compatible with Next.js serverless functions, which have a different API signature.
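For reference, this is roughly how the `next-js` adapter is meant to be used with a Pages Router API route; the pages/api/bot.ts path and the BOT_TOKEN variable are placeholders:

```ts
// pages/api/bot.ts (Pages Router API route, placeholder path)
import { Bot, webhookCallback } from "grammy";

const bot = new Bot(process.env.BOT_TOKEN!);
bot.on("message", (ctx) => ctx.reply("Hello!"));

// the "next-js" adapter maps the (req, res) signature of Next.js API routes
export default webhookCallback(bot, "next-js");
```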
The way these framework adapters work is that you first need to take a look at how your particular server expects its middleware to look. For `next/server`, you can see an example here. This tells you that you need to do something like
```ts
import { NextResponse } from 'next/server'
import type { NextRequest } from 'next/server'
export function middleware(request: NextRequest) {
return NextResponse.next();
}
```
grammY defines its framework adapters in this file: https://github.com/grammyjs/grammY/blob/1c238c0f08df047dc8dc11dd069519cc4c68b7ee/src/convenience/frameworks.ts
Look at how every adapter maps function signatures like the one above to a generic `ReqResHandler` object that the webhook callback logic can work with. Currently, there is no compatible adapter for `next/server`, since we never import `NextResponse`.
(We might be able to add support by simply returning a new `Response()`, but it doesn't seem to be documented that this actually works, so we'd have to experiment with it. /cc @PonomareVlad)
Until then, grammY provides a callback adapter that works with any framework: https://github.com/grammyjs/grammY/blob/1c238c0f08df047dc8dc11dd069519cc4c68b7ee/src/convenience/webhook.ts#L14-L24
This means that something similar to the following code will work (no promises, coded on github.com):
```ts
import { NextResponse } from 'next/server'
import type { NextRequest } from 'next/server'
import { webhookCallback } from 'grammy'

// `bot` is your Bot instance, constructed as in the snippets above
const handleUpdate = webhookCallback(bot, "callback");

// the handler must be async because it awaits the request body
export async function middleware(request: NextRequest) {
  const update = await request.json();
  const header = request.headers.get("X-Telegram-Bot-Api-Secret-Token");
  return await handleUpdate(update, (json: string) => new NextResponse(json), header);
}
```
I'm curious if this works, please keep us posted.
@KnorpelSenf looking into the code, I see that `webhookCallback` doesn't export `callback` as a supported adapter in TS. It does use the function anyway, but the Next CLI fails locally because of
Error: No response is returned from route handler '...'. Ensure you return a `Response` or a `NextResponse` in all branches of your handler.
> @KnorpelSenf looking into the code, I see that `webhookCallback` doesn't export `callback` as a supported adapter in TS
You're right, this needs to be fixed. It was caused by an incomplete refactoring some time ago.
> It does use the function anyway, but the Next CLI fails locally
Hmmm, then perhaps you need to go even one more step back and provide a `FrameworkAdapter` that makes use of `handlerReturn`?
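For what it's worth, a rough and untested sketch of such an adapter could look like the following. The field names (update, header, end, respond, unauthorized, handlerReturn) mirror the frameworks.ts file linked above, but the exact ReqResHandler shape and whether webhookCallback accepts a custom adapter function may vary between grammY versions, so treat this only as a starting point:

```ts
import { NextResponse, type NextRequest } from "next/server";
import { Bot, webhookCallback } from "grammy";

const bot = new Bot(process.env.TELEGRAM_TOKEN!); // placeholder bot setup

// hypothetical adapter: resolves handlerReturn with the Response that the
// App Router route handler should return
const nextAppRouterAdapter = (req: NextRequest) => {
  let resolveResponse: (res: Response) => void = () => {};
  return {
    update: req.json(),
    header: req.headers.get("X-Telegram-Bot-Api-Secret-Token") ?? undefined,
    end: () => resolveResponse(NextResponse.json(null, { status: 200 })),
    respond: (json: string) =>
      resolveResponse(
        new NextResponse(json, { headers: { "Content-Type": "application/json" } }),
      ),
    unauthorized: () => resolveResponse(NextResponse.json(null, { status: 401 })),
    handlerReturn: new Promise<Response>((resolve) => (resolveResponse = resolve)),
  };
};

// assuming webhookCallback also accepts an adapter function (untested)
export const POST = webhookCallback(bot, nextAppRouterAdapter as any);
```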
@all-contributors add @thecoorum for the bug
@KnorpelSenf I've put up a pull request to add @thecoorum! :tada:
I see that `std/http` seems to be using `handlerReturn` (according to the comments in `frameworks.d.ts`), but I'm not sure how I should adjust the response then 🤔
Also, as I mentioned before, the default configuration for `std/http` works properly in the local environment and fails only in production.
> not sure how I should adjust the response then 🤔
I guess you need to create a NextResponse object?
> Also, as I mentioned before, the default configuration for `std/http` works properly in the local environment and fails only in production
Right. I have no idea about this one. I have never used nextjs myself. What is the difference between the local and the production environment? (grammY itself certainly doesn't behave differently, it is not aware of its surroundings.)
Looking into the structure of `std/http`, I can't understand why it's not working in production. I see it's returning a regular `Response`, which is also accepted by Next.js. They allow returning either `Response` or `NextResponse` (which, as far as I remember, is an extension of `Response` with some extra methods).
I suspect that maybe `resolveResponse` is never assigned the Promise's resolve function, and because of that the `end`, `respond`, and `unauthorized` callbacks are never fired.
I tried to extract the logic from `webhookCallback` straight into my route, and while it works in development, it still times out in production. I also skipped the logic of verifying the token from the headers, as for some reason there was no `X-Telegram-Bot-Api-Secret-Token` request header present:
```ts
import { type NextRequest, NextResponse } from "next/server";
import { Bot } from "grammy";
const bot = new Bot(process.env.REQUESTS_BOT_TOKEN!);
bot.on("message", async (ctx) => {
await ctx.reply("Ping");
});
function timeoutIfNecessary(
task: Promise<void>,
onTimeout: "throw" | "return" | (() => unknown),
timeout: number
): Promise<void> {
if (timeout === Infinity) return task;
return new Promise((resolve, reject) => {
const handle = setTimeout(() => {
if (onTimeout === "throw") {
reject(new Error(`Request timed out after ${timeout} ms`));
} else {
if (typeof onTimeout === "function") onTimeout();
resolve();
}
}, timeout);
task
.then(resolve)
.catch(reject)
.finally(() => clearTimeout(handle));
});
}
export const POST = async (req: NextRequest) => {
let initialized = false;
if (!initialized) {
await bot.init();
initialized = true;
}
let usedWebhookReply = false;
const webhookReplyEnvelope = {
send: async (json: any) => {
usedWebhookReply = true;
await new Promise((resolve) => resolve(NextResponse.json(json)));
},
};
await timeoutIfNecessary(
bot.handleUpdate(await req.json(), webhookReplyEnvelope),
"throw",
10_000
);
if (!usedWebhookReply) {
return NextResponse.json(null, { status: 200 });
}
};
```
> while it works in development, it still times out in production
This sort of gives me the feeling that neither of us is making obvious mistakes in the code. It sort of boils down to differences between dev and prod, such as having different implementations of global objects like Request/Response/Promise.
The above code is a fairly short example that reproduces the issue (https://sscce.org). It could be a good idea to contact the people from nextjs to find out why the code behaves differently.
> for some reason there was no `X-Telegram-Bot-Api-Secret-Token` request header present
This is expected. It is only present if you configure it when setting your webhook.
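For completeness, a minimal sketch of configuring it when setting the webhook; the WEBHOOK_URL and WEBHOOK_SECRET environment variable names are placeholders. After this, Telegram sends the configured value in the X-Telegram-Bot-Api-Secret-Token header of every webhook request:

```ts
// standalone script (ES module), run once after deploying
import { Bot } from "grammy";

const bot = new Bot(process.env.BOT_TOKEN!);

await bot.api.setWebhook(process.env.WEBHOOK_URL!, {
  secret_token: process.env.WEBHOOK_SECRET, // echoed back in the header above
});
```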
I opened a discussion in the NextJS repo, let's see if any useful suggestions will appear there https://github.com/vercel/next.js/discussions/59652
Nice, subscribed. You may wanna include the above code in the discussion so that people don't need to understand grammY before they're able to look into the issue.
What happens if you throw out the `timeoutIfNecessary` function and let the bot handle the update directly? Does that fix it?
Do you mean just calling `bot.handleUpdate` with the payload? I didn't check this approach and unfortunately will only be able to do it tomorrow.
Yep! That would be the next step in narrowing down the problem. By continuing to remove seemingly unrelated code, we either end up removing the code that causes the problem, or we end up with a tiny bit of code that causes the problem. Either way, we will have isolated it, which allows the bug to be fixed (either by us or by them).
Hey @KnorpelSenf! Sorry for the late reply, I wasn't able to test out your suggestion over the weekend. I tried to implement it now, but it is still failing with a timeout. I'm including the source code of the endpoint and a screenshot of the log from triggering the endpoint.
`app/api/bots/requests/route.ts`
```ts
import { type NextRequest, NextResponse } from "next/server";
import { Bot } from "grammy";
import { supabase } from "@/utils/supabase";
const bot = new Bot(process.env.REQUESTS_BOT_TOKEN!);
bot.on("message::bot_command", async (ctx) => {
// Match command pattern /process_<id>
const match = ctx.message!.text!.match(/^\/process_(\d+)$/);
if (!match) return;
const id = match[1];
const { data, error } = await supabase
.from("requests")
.select()
.eq("id", id)
.single();
if (error) {
await ctx.reply("Виникла помилка при завантаженні заявки.");
await ctx.reply(error.message);
return;
}
await ctx.reply(
`
<pre><code>
company_name: ${data.company_name}
company_slug: ${data.company_slug}
phone_number: ${data.phone_number}
user_id: ${data.user_id}
user_username: ${data.user_username}
</code></pre>
`
);
});
bot.on("message", async (ctx) => {
await ctx.reply("Ping");
});
export const POST = async (req: NextRequest) => {
let initialized = false;
if (!initialized) {
await bot.init();
initialized = true;
}
let usedWebhookReply = false;
const webhookReplyEnvelope = {
send: async (json: any) => {
usedWebhookReply = true;
await new Promise((resolve) => resolve(NextResponse.json(json)));
},
};
await bot.handleUpdate(await req.json(), webhookReplyEnvelope);
if (!usedWebhookReply) {
return NextResponse.json(null, { status: 200 });
}
};
```
Awesome!
Just to be sure, the webhook reply envelope is never used, right? You didn't enable the feature. So you should be able to empty `send` and see the same behaviour. Also, I assume that you tested the ping handler, which timed out, so you should be able to remove the entire command handler and still see the same behaviour. Also, I suspect the init call is not the problem because it only calls `getMe`, so you should be able to specify the bot info when constructing your bot. This will make sure that you no longer need to call init but are still able to see the same behaviour.
This should leave you with <20 lines of code that have virtually no logic and still reproduce the issue. Can you confirm?
(Perhaps you now see where I'm going with this.)
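To make that concrete, a stripped-down reproduction along those lines might look roughly like this; it is only a sketch, the botInfo values are placeholders for your bot's actual getMe data, and the exact field list may differ between grammY versions:

```ts
// app/api/bots/requests/route.ts — minimal reproduction sketch
import { type NextRequest, NextResponse } from "next/server";
import { Bot } from "grammy";

const bot = new Bot(process.env.REQUESTS_BOT_TOKEN!, {
  // supplying botInfo skips the getMe call that bot.init() would otherwise make
  botInfo: {
    id: 123456789, // placeholder
    is_bot: true,
    first_name: "Requests Bot", // placeholder
    username: "requests_bot", // placeholder
    can_join_groups: true,
    can_read_all_group_messages: false,
    supports_inline_queries: false,
  },
});

bot.on("message", (ctx) => ctx.reply("Ping"));

export const POST = async (req: NextRequest) => {
  // hand the raw update to grammY and always return a response
  await bot.handleUpdate(await req.json());
  return NextResponse.json(null, { status: 200 });
};
```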
I'm so glad there is at least an issue... I've been dancing with this for too long... So the official current state of things is that "grammY doesn't work in a production Next.js environment deployed on Vercel"?
I use the latest grammy 1.20.3 and nextjs 14.0.4.
Nobody really knows. I'm not using nextjs so I haven't investigated it.
My Vercel env was at Node v18.* (see screenshot); switching to v20 (beta) solves the timeouts issue.
Subscribe to my GitHub 🎩
Very interesting stuff. @thecoorum can you confirm that this fixes it?
At some point I decided to migrate my bot to Deno, so it will take me some time to replicate the existing bot back on Next.js. I will post an update as soon as I do some testing.
Hmm, despite upgrading Vercel's Node version to 20.x, the bot is still timing out. @di-sukharev is there anything else you did?
I will share the code of my bot, maybe I did something wrong...
```ts
import { NextRequest } from "next/server";
import {
Bot,
Composer,
webhookCallback,
session,
enhanceStorage,
} from "grammy";
import { conversations, createConversation } from "@grammyjs/conversations";
import { freeStorage } from "@grammyjs/storage-free";
import { handlers } from "@/bot/handlers";
import { start } from "@/bot/commands/start/admin";
import { request } from "@/bot/commands/request";
import { description } from "@/bot/commands/description";
import { process as processCommand } from "@/bot/commands/process";
import { request as requestConversation } from "@/bot/conversations/request";
import { process as processConversation } from "@/bot/conversations/process";
import { details } from "@/bot/callbacks/details";
import type { Context } from "@/bot/types";
const token = process.env.ADMIN_BOT_TOKEN!;
const middleware = new Composer<Context>();
middleware.command("start", start);
middleware.on("message::bot_command", processCommand);
const bot = new Bot<Context>(token);
bot.use(
session({
initial: () => ({}),
storage: enhanceStorage<{}>({
storage: freeStorage(token),
millisecondsToLive: 10 * 60 * 1000,
}),
})
);
bot.use(conversations());
bot.use(createConversation(requestConversation, "request"));
bot.use(createConversation(processConversation, "process"));
bot.use(handlers);
bot.use(middleware);
bot.callbackQuery("request", (ctx) => ctx.conversation.enter("request"));
bot.callbackQuery("description", details);
bot.on("message", async (ctx) => {
// ...
});
// bot.catch((error) => {
// Sentry.captureException(error);
// });
const handleUpdate = webhookCallback(bot, "std/http");
export const POST = async (req: NextRequest, ...rest: any[]) => {
return handleUpdate(req, ...rest);
};
```
@thecoorum I thought it was the Node version, but then it failed again, so now I know it's not. The good news is that it doesn't matter, because I made it work anyway :)
This is my webhook code in `src/app/api/bot/route.ts`:
```ts
import { webhookCallback } from "grammy";
import { NextRequest } from "next/server";
import { Bot, Context, InlineKeyboard } from "grammy";
interface BotConfig {
isDeveloper: boolean;
}
type ExtendedContext = Context & { config: BotConfig };
const telegramBotWebhookHandler = async (req: NextRequest, ...args: any[]) => {
const bot = new Bot<ExtendedContext>(process.env.TELEGRAM_BOT_KEY!);
bot.use(async (ctx, next) => {
const isInWhiteList = false;
if (isInWhiteList) {
ctx.config = { isDeveloper: isInWhiteList };
await next();
} else {
try {
console.log(`forbidden access from user ${ctx?.from?.id}`);
await ctx.reply("❌ 403 ❌");
} catch (error) {
console.log(error);
}
}
});
bot.command("start", (ctx) =>
ctx.reply(
`Hi ${ctx.from?.first_name}`
)
);
bot.hears("ping", async (ctx) => {
await ctx.reply("pong 🏓", {
reply_to_message_id: ctx.msg.message_id,
});
});
bot.on("message:text", async (ctx) => {
console.log("GOT MSG: ", ctx.msg.text);
const inlineKeyboard = new InlineKeyboard().webApp(
"Open app",
`${process.env.TELEGRAM_WEBAPP_URL}`
);
await ctx.reply("Hi", { reply_markup: inlineKeyboard });
});
const handleBotWebhook = webhookCallback(
bot,
"std/http",
"throw",
10000,
process.env.TELEGRAM_WEBHOOK_KEY
);
console.info("BOT REQUEST", { req, args });
return handleBotWebhook(req, ...args);
};
export { telegramBotWebhookHandler as POST };
```
This is the script I run after each Next.js build:
```ts
import { Bot } from "grammy";
const bot = new Bot(process.env.TELEGRAM_BOT_KEY!);
// @ts-ignore
const isBotWebhookSet = await bot.api.setWebhook(
process.env.TELEGRAM_WEBHOOK_URL!,
{
secret_token: process.env.TELEGRAM_WEBHOOK_KEY,
}
);
console.info({ message: "Successfully updated the webhook", isBotWebhookSet });
```
My local Node version is v20.
What is the key difference between this and the code in the original issue description?
I can't see a real difference between the initial code and the working one, only the export code style, but that shouldn't matter
Just to be clear, the two of you are using the same code with the same hosting provider and you observe different behaviour? That means that it isn't related to your code, but rather to something else entirely.
I honestly don't see how grammY can have something to do with this, so I don't think it will be fixed in the library (unless new evidence shows up). Feel free to close this issue, or keep it open and discuss further, whatever you prefer. :)
I will close this, as I do not see what we can do here. Feel free to reopen if you find out more things, and especially so if you can narrow down that there is a problem with grammY.
Oh, ok.. Now it is my turn 🚬 Has anyone managed to figure out the root cause of the issue or find a solution?
I think I have found a solution to the issue.
The key lies in how Next.js handles server dependencies during build time. Just add the `grammy` dependency to `serverComponentsExternalPackages` in the next.config.js config and it should work.
Find more information about `serverComponentsExternalPackages` here: https://nextjs.org/docs/app/api-reference/next-config-js/serverComponentsExternalPackages
My code:
Route
```ts
// src/app/api/bot/route.ts
import { NextRequest } from 'next/server';
import { Bot, webhookCallback } from 'grammy';
export const POST = async (req: NextRequest, ...args: any[]) => {
const token = process.env.TELEGRAM_TOKEN;
if (!token) throw new Error('TELEGRAM_TOKEN is unset');
const bot = new Bot(token);
bot.command('start', ctx => ctx.reply('Ласкаво просимо! Бот запущений.'));
bot.on('message', ctx => ctx.reply('Отримав ще одне повідомлення!'));
const handleUpdate = webhookCallback(bot, 'std/http', 'throw', 10000);
return handleUpdate(req, ...args);
};
```
Next.js config
```js
// next.config.mjs
/** @type {import('next').NextConfig} */
const nextConfig = {
experimental: {
serverComponentsExternalPackages: ['grammy']
}
};
export default nextConfig;
```
I hope this will be useful to someone who also decides to create a telegram bot w/ grammY and Next.js.
Interesting stuff, thanks for sharing.
By the way,
```ts
// src/app/api/bot/route.ts
import { NextRequest } from 'next/server';
import { Bot, webhookCallback } from 'grammy';
export const POST = async (req: NextRequest, ...args: any[]) => {
const token = process.env.TELEGRAM_TOKEN;
if (!token) throw new Error('TELEGRAM_TOKEN is unset');
const bot = new Bot(token);
bot.command('start', ctx => ctx.reply('Ласкаво просимо! Бот запущений.'));
bot.on('message', ctx => ctx.reply('Отримав ще одне повідомлення!'));
const handleUpdate = webhookCallback(bot, 'std/http', 'throw', 10000);
return handleUpdate(req, ...args);
};
```
is a little inefficient because it recreates the bot for every update. This also means that it will have to re-initialize for every update, i.e. call `getMe`.
Here is the optimised version:
```ts
// src/app/api/bot/route.ts
import { Bot, webhookCallback } from 'grammy';
const token = process.env.TELEGRAM_TOKEN;
if (!token) throw new Error('TELEGRAM_TOKEN is unset');
const bot = new Bot(token);
bot.command('start', ctx => ctx.reply('Ласкаво просимо! Бот запущений.'));
bot.on('message', ctx => ctx.reply('Отримав ще одне повідомлення!'));
export const POST = webhookCallback(bot, 'std/http');
```
@KnorpelSenf, thank you
> I think I have found a solution to the issue.
> The key lies in how Next.js handles server dependencies during build time. Just add the `grammy` dependency to `serverComponentsExternalPackages` in the next.config.js config and it should work. Find more information about `serverComponentsExternalPackages` here: https://nextjs.org/docs/app/api-reference/next-config-js/serverComponentsExternalPackages
> […]
> I hope this will be useful to someone who also decides to create a telegram bot w/ grammY and Next.js.
You helped me a lot! Thank you very much!!
Perhaps we can add this info to the vercel setup in the example bots repository? @triken22 would you like to take care of that?
I'm using the following configuration for handling webhooks. While it works in development (API route build time is under 1.5 s), Vercel constantly reports a function timeout without a response. Some of the commands send an image from a static host with `ctx.replyWithPhoto`. The `std/http` method is the only one working for me; neither `http`/`https` nor `next-js` works, because of different issues.
Any suggestions or recommendations? Thanks in advance!