Cannot stream chat completions from Azure #1015
Comments
Hi @johanbaath, could you share a small repro?
@deyaaeldeen Sure, something like this:

Server:

```ts
const azureOpenaiClient = new AzureOpenAI({
  endpoint: "...",
  deployment: "...",
  apiVersion: "2024-07-01-preview",
  apiKey: "...",
});

const response = await azureOpenaiClient.chat.completions.create({
  model: "",
  messages: [
    {
      role: "user",
      content: "hello!",
    },
  ],
  stream: true,
});
```

Client:

```ts
const response = await ky.post(endpoint, {
  body,
  signal: abortController.signal,
});

const runner = ChatCompletionStreamingRunner.fromReadableStream(response.body);
runner.on("content", (delta) => {
  finalText += delta;
});
await runner.finalChatCompletion();
```
Thanks! Could you also share the name and version of the deployed model and the region it is deployed in?
@deyaaeldeen gpt-4o, version: 2024-05-13, region: Sweden Central
@johanbaath Thanks for confirming! Is the asynchronous filter enabled by any chance? This feature has been known to cause such behavior, and a similar issue has been reported elsewhere, for example in openai/openai-python#1677 with more context.
Yes! I use the async filter. Is there anything I can do, or is this being worked on? Thank you!
We're discussing this behavior internally and I'll post an update as soon as I have one. For now, I suggest handling the absence of the `delta` property in your code.
Given that this library does not handle this scenario, and that Azure OpenAI for TypeScript advises migrating to openai-node (as detailed here), are you suggesting that we manually patch around it in our own code as a temporary solution? Thanks!
We haven't reached a decision yet on whether a change is necessary in either the service API or the client libraries. I'll keep this issue updated once I know more. In the meantime, I am merely suggesting that you handle this in your code for now. For example, if you have a stream of completion chunks, you can do the following:

```ts
for await (const chunk of stream) {
  for (const choice of chunk.choices) {
    // ...
    const delta = choice.delta;
    if (delta) {
      // process delta
    }
    // ...
  }
}
```
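The same guard can also be packaged as a small pre-processing helper that strips out choices carrying no delta before downstream code sees them. This is a minimal sketch: the `sanitizeChunk` function and the trimmed-down chunk types are hypothetical illustrations for this workaround, not part of the openai SDK.

```typescript
// Trimmed-down shapes of a streamed chat completion chunk (illustrative only).
interface ChunkChoice {
  index: number;
  delta?: { content?: string };
}

interface CompletionChunk {
  choices: ChunkChoice[];
}

// Hypothetical helper: drop choices that carry no delta, so consumers
// never observe `choice.delta === undefined`.
function sanitizeChunk(chunk: CompletionChunk): CompletionChunk {
  return {
    ...chunk,
    choices: chunk.choices.filter((choice) => choice.delta !== undefined),
  };
}

// Example: one content delta plus one annotation-style choice without a delta,
// the shape reportedly emitted when the asynchronous filter is enabled.
const raw: CompletionChunk = {
  choices: [
    { index: 0, delta: { content: "hi" } },
    { index: 0 },
  ],
};
console.log(sanitizeChunk(raw).choices.length); // 1
```

Applying such a pass to each parsed chunk keeps the per-choice loop from the snippet above free of `undefined` checks.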
Confirm this is a Node library issue and not an underlying OpenAI API issue

Describe the bug

When streaming chat completions using `client.chat.completions.create` with an `AzureOpenAI` client and reading the stream with `ChatCompletionStreamingRunner.fromReadableStream` on the client, the following error occurs:

Cause: The error seems to be caused by `choice.delta` being undefined at some point during the streaming process, usually at the end of the stream.

Questions:

To Reproduce

1. Create an `AzureOpenAI` client
2. Call `client.chat.completions.create` and stream the response to the client (a web app)
3. Read the stream with `ChatCompletionStreamingRunner.fromReadableStream` on the client

Code snippets

No response
OS
macOS
Node version
Node v20.14.0
Library version
openai 4.56.0