
prebuilt createReactAgent cannot work with streamEvents #623

Open
Wulino opened this issue Oct 23, 2024 · 3 comments

Comments

@Wulino

Wulino commented Oct 23, 2024

Bug Report: Stream Events with ChatMessageChunk

Description

I am trying to implement fully streamed output. I was originally calling this.agentExecutor.stream(...), which worked, but it does not stream the LLM's tokens one by one. However, after switching to streamEvents as follows, it stopped working:

// Agent built with the prebuilt helper from "@langchain/langgraph/prebuilt"
this.agentExecutor = createReactAgent({
  llm: this.chatModel,
  tools,
  messageModifier: prompt,
});
// Switched from .stream(...) to .streamEvents(...) with the v2 event schema
this.agentExecutor.streamEvents(
  { messages },
  { version: 'v2' },
);

[screenshots attached in the original issue]

So, is there any way to achieve streaming of LLM tokens? I have been struggling with this issue for many days; a ReAct agent without token streaming is really painful to use!
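
For reference, this is a minimal sketch of how I understand token streaming is supposed to be consumed with streamEvents v2: iterate the returned event stream and filter for on_chat_model_stream events, whose data.chunk is an AIMessageChunk. agentExecutor and messages are from the snippet above; everything else is illustrative.

// Minimal sketch: consume the v2 event stream and print token chunks as they arrive.
const eventStream = this.agentExecutor.streamEvents(
  { messages },
  { version: 'v2' },
);

for await (const event of eventStream) {
  // Token-level chunks from the chat model arrive as "on_chat_model_stream" events.
  if (event.event === 'on_chat_model_stream') {
    const chunk = event.data?.chunk; // AIMessageChunk
    if (typeof chunk?.content === 'string' && chunk.content.length > 0) {
      process.stdout.write(chunk.content);
    }
  }
}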

@jacoblee93
Collaborator

What model are you using?

@Wulino
Author

Wulino commented Oct 24, 2024

What model are you using?

It's a model trained by my company, similar to ChatGPT. The LLM class is based on ChatOpenAI from @langchain/openai, and I have only modified the input and output parameters of the API, so I think it behaves the same as ChatGPT.
I believe the model itself is not the issue; it seems more like a problem with the workflow's support for streamEvents, because both invoke and stream work as expected.

By the way, since I don't have an OpenAI API key, could someone try createReactAgent with streamEvents against ChatGPT or another model to check whether the same issue appears? That way I can figure out exactly where the problem lies.
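
In case it helps, here is a rough repro sketch of what I mean (the model name, the placeholder search tool, and the question are illustrative, not my actual setup):

import { ChatOpenAI } from '@langchain/openai';
import { HumanMessage } from '@langchain/core/messages';
import { tool } from '@langchain/core/tools';
import { z } from 'zod';
import { createReactAgent } from '@langchain/langgraph/prebuilt';

// Placeholder tool so the agent has something to call.
const search = tool(async ({ query }) => `No results found for "${query}".`, {
  name: 'search',
  description: 'Search the web for a query.',
  schema: z.object({ query: z.string() }),
});

const agent = createReactAgent({
  llm: new ChatOpenAI({ model: 'gpt-4o-mini' }),
  tools: [search],
});

const events = agent.streamEvents(
  { messages: [new HumanMessage("What's the weather in SF?")] },
  { version: 'v2' },
);

for await (const event of events) {
  if (event.event === 'on_chat_model_stream') {
    // With a working setup, this should print token chunks as they arrive.
    console.log(event.data?.chunk?.content);
  }
}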

@jacoblee93
Collaborator

So I think it behaves the same as ChatGPT. I believe the model itself is not the issue; it seems more like a problem with the workflow's support for streamEvents, because both invoke and stream work as expected.

Can you double-check this by calling model.streamEvents directly on your implementation? As far as I know, this works for OpenAI models.
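
For example, something like this (a rough sketch, assuming chatModel is an instance of the custom ChatOpenAI subclass):

// Sanity check: does the underlying model emit streamed chunks at all?
const modelEvents = chatModel.streamEvents('Tell me a joke', { version: 'v2' });

let sawTokenChunks = false;
for await (const event of modelEvents) {
  if (event.event === 'on_chat_model_stream') {
    sawTokenChunks = true;
    console.log(event.data?.chunk?.content);
  }
}

// If this stays false, the custom model never emits streamed chunks,
// which would also explain why createReactAgent + streamEvents yields no tokens.
console.log('saw token chunks:', sawTokenChunks);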
