I have a stream of text that I send to a function to turn into speech. I send the text sentence by sentence for better latency. The issue I'm running into is handling interruptions. Right now, I send each sentence to a function that returns a promise, so I end up with a chain of awaits that piles up fairly quickly. When I try to interrupt, the loop breaks out correctly, but the await chain keeps running. Here is a snippet of the code (send_voice returns a promise):
const stream = await openai.chat.completions.create({
  model: "gpt-3.5-turbo",
  messages: conversation_history,
  stream: true,
});

let sentence = "";
for await (const chunk of stream) {
  const content = chunk.choices[0]?.delta?.content || "";
  sentence += content;

  if (interruptionState.value) {
    console.log("breaking");
    break; // break is reached correctly, but the send_voice chain continues
  }

  // Flush a full sentence to TTS as soon as it ends.
  if (sentence.endsWith('.') || sentence.endsWith('!') || sentence.endsWith('?')) {
    console.log("FINAL: ", sentence);
    conversation_history.push({ "role": "system", "content": sentence });
    await send_voice(sentence, ws, elleven_key, voice_id, interruptionState);
    sentence = "";
  }
}
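For context, send_voice is roughly shaped like the stand-in below. This is not my real implementation (the real one requests audio from ElevenLabs and writes it to the websocket), and the helper names are made up; it's just to show that once a call starts, it awaits its internal steps to completion and never looks at interruptionState again:

// Simplified stand-in, not the real function: request_tts and send_over_ws are placeholders.
async function send_voice(sentence, ws, api_key, voice_id, interruptionState) {
  // Ask the TTS service for audio for this sentence.
  const audioStream = await request_tts(sentence, api_key, voice_id);
  for await (const audioChunk of audioStream) {
    // Once we're in here, breaking out of the outer chat loop does nothing to these awaits.
    await send_over_ws(ws, audioChunk);
  }
}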
I know that I can't stop the await chain once it has started, but is there any way around this? I essentially need to call send_voice sequentially, yet still be able to stop it quickly on an interruption.
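The closest idea I've had is to pass an AbortController signal down into send_voice so the in-flight call can actually be cut short, roughly like the sketch below. None of this exists in my code yet (the controller, the try/catch, and the signal handling inside send_voice are all hypothetical):

// Sketch only: the interruption handler aborts a shared controller instead of just flipping a flag.
const abortController = new AbortController();

// In the streaming loop, in place of the plain await:
try {
  await send_voice(sentence, ws, elleven_key, voice_id, abortController.signal);
} catch (err) {
  if (err.name === "AbortError") break; // interrupted mid-sentence, stop the loop
  throw err;
}

// Wherever the interruption is detected:
abortController.abort();

// send_voice would then need to pass the signal to its fetch / websocket writes
// and bail out early whenever signal.aborted is true.

I'm not sure whether that's the right approach or if there's a cleaner pattern for cancelling a sequential chain of awaits.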
I've been stuck on this for a while, so any help would be much appreciated!