Fix issue when streaming LLM response #1523

Open · wants to merge 1 commit into main
src/crewai/llm.py: 8 changes (7 additions and 1 deletion)
```diff
@@ -153,7 +153,13 @@ def call(self, messages: List[Dict[str, str]], callbacks: List[Any] = []) -> str
                 params = {k: v for k, v in params.items() if v is not None}

                 response = litellm.completion(**params)
-                return response["choices"][0]["message"]["content"]
+                if params.get("stream", False):
+                    content = ""
+                    for chunk in response:
+                        content += chunk.choices[0].delta.content or ""
+                    return content
+                else:
+                    return response["choices"][0]["message"]["content"]
             except Exception as e:
                 if not LLMContextLengthExceededException(
                     str(e)
```
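For context, here is a minimal standalone sketch of the accumulation pattern the patch uses. With stream=True, litellm.completion() returns an iterator of chunks in the OpenAI style, where each chunk's delta.content carries the incremental text. The model name and prompt below are illustrative placeholders, not taken from the PR.

```python
import litellm

# Sketch of the pattern used in the patch (model and prompt are
# placeholders). With stream=True, litellm.completion() returns an
# iterator of chunks; delta.content may be None on some chunks
# (e.g. the final one), hence the `or ""`.
response = litellm.completion(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Say hello"}],
    stream=True,
)

content = ""
for chunk in response:
    content += chunk.choices[0].delta.content or ""

print(content)
```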
Review conversation on the added line `if params.get("stream", False):`

Collaborator:
If you're trying to listen to the stream, wouldn't you want this to be True?

@MottoX (Contributor, Author) on Oct 29, 2024:
"stream" is set to False in params by default, but it can be overridden through kwargs. Even so, this line passes an explicit default to get() to keep the code readable and independent of the preceding code. The stream option can therefore be enabled by passing stream=True when creating a crewai.LLM instance.
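Based on that description, a minimal usage sketch might look like the following, assuming crewai.LLM forwards stream=True into the litellm params via kwargs as the author states; the model name and prompt are placeholders.

```python
from crewai import LLM

# Sketch only: per the author's comment, stream=True passed at
# construction time is forwarded into the litellm call parameters
# (model and prompt are placeholders).
llm = LLM(model="gpt-4o", stream=True)

# With this patch, call() iterates the streamed chunks internally and
# returns the fully accumulated response as a single string.
print(llm.call(messages=[{"role": "user", "content": "Hello there"}]))
```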
