
Number of tokens exceeded #5

Open
emontes opened this issue Aug 10, 2023 · 1 comment

Comments

emontes commented Aug 10, 2023

After sending several different prompts, I eventually received the following error message:

This model's maximum context length is 2049 tokens, however you requested 4098 tokens (2 in your prompt; 4096 for the completion). Please reduce your prompt; or completion length.

Even if I send a small prompt (like "hello"), I still get the error message.

[screenshot of the error message]

I suspect this happens because the history of prompts is resent with every request in the same conversation. Is there a way to clear the history, or to indicate that a new conversation has started?
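If accumulated history is indeed the cause, one workaround is to trim the oldest messages until the prompt fits within the model's context window. A minimal sketch of that idea (the function names and the ~4-characters-per-token estimate are my own assumptions, not part of this project; a real implementation would count tokens with a proper tokenizer such as `tiktoken`):

```python
# Sketch: drop the oldest history entries until the estimated prompt
# size plus the reserved completion budget fits the context window.
# Token counts here are rough estimates (~4 characters per token).

def estimate_tokens(text: str) -> int:
    """Very rough token estimate; replace with a real tokenizer in practice."""
    return max(1, len(text) // 4)

def trim_history(history, context_limit=2049, completion_tokens=1024):
    """Return a copy of `history` trimmed to fit the context window."""
    budget = context_limit - completion_tokens
    trimmed = list(history)
    # Remove the oldest messages first until the estimate fits the budget.
    while trimmed and sum(estimate_tokens(m) for m in trimmed) > budget:
        trimmed.pop(0)
    return trimmed
```

Starting an explicitly new conversation would then just mean passing an empty history list.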

@valentinedwv

try 1024 tokens
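This suggestion works because the model enforces `prompt_tokens + completion_tokens <= context_limit`: with a 2049-token window, requesting 4096 completion tokens can never fit, while 1024 leaves room for the prompt. A small illustrative check (a sketch for the arithmetic from the error message above, not code from this project):

```python
# The model rejects any request where the prompt plus the requested
# completion budget exceeds the context window.

def fits_context(prompt_tokens: int, max_tokens: int, context_limit: int = 2049) -> bool:
    """True if the request can fit within the model's context window."""
    return prompt_tokens + max_tokens <= context_limit

# From the error message: 2 prompt tokens + 4096 completion tokens = 4098 > 2049.
print(fits_context(2, 4096))  # fails even for a tiny prompt
print(fits_context(2, 1024))  # fits, as suggested
```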
