Handling errors in Node.js ReadableStream with the Vercel AI SDK


When making API calls to OpenAI and streaming the response back to the client, there are two possible error cases:

  1. The error occurs in the initial request, before the stream between OpenAI and my backend begins. Vercel has nice documentation for this case: https://sdk.vercel.ai/docs/guides/providers/openai#guide-handling-errors (tl;dr: wrap the call in a try/catch).

  2. The error occurs while the stream is ongoing. In this edge case the try/catch block does not trigger. It could be a problem with SvelteKit, though.
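For reference, case 1 can be sketched as below. This is a minimal illustration, not the SDK's own error handling: `createCompletion` is a hypothetical stand-in for the real `openai.chat.completions.create` call, so the pattern can be shown without a live API key. The point is that a plain try/catch is sufficient here, because the failure is thrown before any stream object is returned.

```typescript
// Case 1 sketch: the request fails before streaming starts.
// `createCompletion` is a placeholder for the real OpenAI call.
async function streamOrError(
  createCompletion: () => Promise<ReadableStream<Uint8Array>>
): Promise<Response> {
  try {
    const stream = await createCompletion();
    // Happy path: hand the stream straight to the client.
    return new Response(stream);
  } catch (err) {
    // The request itself failed, so no stream exists yet;
    // map the error to a normal HTTP error response instead.
    const message = err instanceof Error ? err.message : "Unknown error";
    return new Response(JSON.stringify({ error: message }), {
      status: 500,
      headers: { "Content-Type": "application/json" },
    });
  }
}
```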

Is there a way to add an event listener or a "middleware" to the OpenAI response stream, so that whenever an error occurs I can handle it accordingly?

const response = await openai.chat.completions.create({
    model,
    stream: true,
    messages
});

// Convert the response into a friendly text-stream
const stream = OpenAIStream(response);

// Respond with the stream
return new Response(stream);
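One way to get the "middleware" behavior for case 2 is to wrap the stream in a new `ReadableStream` that intercepts read failures. The sketch below uses only the standard Web Streams API (global in Node 18+), not anything Vercel-specific; `withErrorHandler` and `onStreamError` are names I made up for illustration. You would apply it to the stream returned by `OpenAIStream` before passing it to `Response`.

```typescript
// Sketch: wrap any ReadableStream so mid-stream errors invoke a handler.
// Standard Web Streams only; `withErrorHandler` is a hypothetical helper.
function withErrorHandler(
  source: ReadableStream<Uint8Array>,
  onStreamError: (err: unknown) => void
): ReadableStream<Uint8Array> {
  const reader = source.getReader();
  return new ReadableStream<Uint8Array>({
    async pull(controller) {
      try {
        const { done, value } = await reader.read();
        if (done) {
          controller.close();
        } else {
          controller.enqueue(value);
        }
      } catch (err) {
        // read() rejects if the source stream errors mid-flight.
        onStreamError(err);    // custom handling: logging, cleanup, metrics
        controller.error(err); // still propagate the error to the consumer
      }
    },
    cancel(reason) {
      // Forward client-side cancellation to the source.
      return reader.cancel(reason);
    },
  });
}

// Demo with a source that emits one chunk, then errors mid-stream:
const failing = new ReadableStream<Uint8Array>({
  start(controller) {
    controller.enqueue(new TextEncoder().encode("partial "));
    controller.error(new Error("boom"));
  },
});

let caught = "";
const wrapped = withErrorHandler(failing, (err) => {
  caught = (err as Error).message;
});
```

In the real route this would look like `return new Response(withErrorHandler(stream, myHandler))`; the handler fires even though the outer try/catch never does, because the error happens during consumption rather than during the awaited call.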

There are 0 answers