upstash / qstash-js

Message queue for serverless
https://docs.upstash.com/qstash
MIT License

[FEATURE REQUEST] Better serialization in workflows #197

Open aldebout opened 1 month ago

aldebout commented 1 month ago

It'd be really nice to be able to add a custom serializer/deserializer to workflow steps when passing data between them, especially since the type inference is currently happy with Dates and others even though it actually breaks in real life.

Example:

import { serve } from "@upstash/qstash/nextjs";
import * as devalue from "devalue"; // devalue has no default export
import { z } from "zod";

const endpointPayloadSchema = z.object({
  eventId: z.string(),
});

export const workflowEndpoint = serve<z.infer<typeof endpointPayloadSchema>>(
  async (context) => {
    const payload = endpointPayloadSchema.parse(context.requestPayload);

    const data = await context.run(
      "fetch-data",
      async () => {
        const event = await db.get(payload.eventId);

        return { event };
      },
      { serialize: devalue.stringify, deserialize: devalue.parse }
    );

    await context.run(
      "send-email",
      async () => {
        await sendEmail({
          templateId: 1,
          templateData: {
            date: Intl.DateTimeFormat("en-US", {
              day: "numeric",
              weekday: "long",
              month: "long",
              year: "numeric",
            }).format(data.event.startDate),
          },
        });
      },
      { serialize: devalue.stringify, deserialize: devalue.parse }
    );
  }
);

Currently, this code (minus the serialize/deserialize options) type-checks, but fails at runtime because event.startDate is deserialized as a string rather than a Date. I'm assuming the underlying logic performs a JSON.stringify/JSON.parse roundtrip.
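The failure mode can be reproduced in isolation. This is a minimal sketch (independent of the SDK) of the assumed JSON roundtrip between steps, showing how it silently turns a Date into an ISO string:

```typescript
// Minimal reproduction of the assumed failure: a JSON roundtrip loses Date.
const original = { event: { startDate: new Date("2024-01-15T00:00:00Z") } };

// What the SDK is assumed to do when persisting a step result.
const roundTripped = JSON.parse(JSON.stringify(original)) as typeof original;

console.log(original.event.startDate instanceof Date);     // true
console.log(roundTripped.event.startDate instanceof Date); // false: it is now a string,
// so passing it to Intl.DateTimeFormat(...).format(...) fails at runtime.
```

The type system sees `typeof original` on both sides, so nothing flags the mismatch until the formatter receives a string.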

This could also be added at the serve level, to ensure consistency between the steps.

I've tried doing this in userland; it works but is pretty annoying type-wise. I haven't found a better approach than explicitly declaring all my return types, writing `return devalue.stringify(response as ResponseType)`, and then `const parsedData = devalue.parse(data) as ResponseType`, which is painful when the return types are complex.
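For illustration, here is a self-contained sketch of that userland workaround. Everything in it is hypothetical stand-in code: `run` simulates the JSON roundtrip the SDK is assumed to perform between steps, and the Date-aware `encode`/`decode` pair plays the role of `devalue.stringify`/`devalue.parse` so the snippet needs no external package:

```typescript
// Hypothetical stand-in for context.run: simulates the assumed
// JSON.stringify/JSON.parse roundtrip applied to step results.
const run = async <T>(_name: string, fn: () => Promise<T>): Promise<T> =>
  JSON.parse(JSON.stringify(await fn()));

type StepResult = { event: { startDate: Date } };

// Date-aware serializer standing in for devalue.stringify.
// Note: the replacer must read `this[key]`, because Date.prototype.toJSON
// runs before the replacer sees the value.
function encode(value: unknown): string {
  return JSON.stringify(value, function (this: any, key: string, v: any) {
    const raw = this[key];
    return raw instanceof Date ? { __date: raw.toISOString() } : v;
  });
}

// Date-aware deserializer standing in for devalue.parse.
function decode(text: string): unknown {
  return JSON.parse(text, (_key, v) =>
    v && typeof v === "object" && "__date" in v ? new Date(v.__date) : v
  );
}

async function main() {
  // The boilerplate the issue asks the SDK to absorb: return an encoded
  // string, then re-assert the type by hand on the way out.
  const raw = await run("fetch-data", async () => {
    const event = { startDate: new Date("2024-06-01T12:00:00Z") }; // stand-in for db.get
    return encode({ event } as StepResult);
  });
  const data = decode(raw) as StepResult;

  console.log(data.event.startDate instanceof Date); // true
}

main();
```

The roundtrip survives because the step now returns a plain string, which JSON handles losslessly; the cost is the manual `as StepResult` assertion on every step boundary.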

One outstanding question: what does this mean for the QStash platform itself? Currently, the output data shown on the platform looks like a Go print, which suggests the output is parsed by QStash, and that would make user-defined serialization hard to support.

CahidArda commented 1 month ago

Hi there, defining a serialization method is possible at the serve level, but not at the step level, as you said.

We can consider having serialize/deserialize methods at step level.

If we add them, step outputs may look wrong in the Upstash Console, but having these methods could still be more convenient. We will make sure to check how this change affects the console when we set out to implement it.

Thanks a lot for the suggestion!

sancar commented 1 month ago

Hi @aldebout, can you share an example where the output looks like a Go print? That sounds like a bug we need to fix.

aldebout commented 1 month ago

@sancar I reached out on discord :)

sancar commented 1 month ago

Hi Alex, thanks for the details. I think I've found how this happens. Indeed, we have a bug.

To give more detail: the SDK sends your data (the response of fetch-data) as JSON. QStash (which is implemented in Go) deserializes it into a larger object of which your data is just one field. Then, when it puts the data back into events to show in the console, it serializes it as a Go object by mistake.

As a solution, I think we shouldn't deserialize it at all; we should pass it through as-is. We will fix this and post an update here as soon as the fix is deployed.
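The platform side is Go, but the pass-through idea can be sketched in TypeScript (all names here are hypothetical, not the actual QStash internals): the envelope parses its own fields while leaving the user's step output as an opaque string, so whatever the SDK or a custom serializer produced is displayed verbatim.

```typescript
// Hypothetical sketch of the fix: the envelope's own fields are parsed,
// but the user payload is kept as an unparsed string and shown as-is.
interface StepEnvelope {
  stepName: string;
  out: string; // opaque: never parsed or re-serialized by the platform
}

// Simulated wire format: the SDK's output is a JSON string nested
// inside the platform's envelope.
const wire = JSON.stringify({
  stepName: "fetch-data",
  out: JSON.stringify({ event: { startDate: "2024-06-01T12:00:00.000Z" } }),
});

const envelope = JSON.parse(wire) as StepEnvelope;
// `out` survives as the exact string the SDK produced -- no Go print,
// and custom serialization formats pass through untouched.
console.log(typeof envelope.out); // "string"
console.log(envelope.out);
```

Keeping the payload opaque also answers the earlier concern: user-defined serialization would work, because the platform never interprets the bytes it stores.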