After sending several different prompts, at some point I received the following error message:

> This model's maximum context length is 2049 tokens, however you requested 4098 tokens (2 in your prompt; 4096 for the completion). Please reduce your prompt; or completion length.
Even if I send a small prompt (like "hello"), I still get the error message.
I suspect this is caused by the history of prompts being sent along with the same conversation. Is there a way to clear the history, or to indicate that a new conversation is starting?
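For context, the limit in the error is on prompt tokens plus requested completion tokens combined, so an accumulating history eventually overflows the window even with a tiny new prompt. A minimal sketch of one workaround, trimming the oldest turns until the conversation fits the budget (the 4-characters-per-token estimate, the `MAX_COMPLETION` value, and the helper names are illustrative assumptions, not part of this tool):

```python
# Sketch: keep only the most recent turns so that estimated prompt tokens
# plus the requested completion tokens fit within the model's context window.
# CONTEXT_LIMIT comes from the error message; MAX_COMPLETION is an assumed,
# deliberately smaller value than the 4096 in the failing request.

CONTEXT_LIMIT = 2049
MAX_COMPLETION = 256

def estimate_tokens(text: str) -> int:
    # Rough heuristic: roughly 4 characters per token for English text.
    return max(1, len(text) // 4)

def trim_history(history: list[str], new_prompt: str) -> list[str]:
    """Drop the oldest turns until history + new prompt fit the token budget."""
    budget = CONTEXT_LIMIT - MAX_COMPLETION
    kept = [new_prompt]
    used = estimate_tokens(new_prompt)
    for turn in reversed(history):       # walk newest-to-oldest
        cost = estimate_tokens(turn)
        if used + cost > budget:
            break                        # older turns no longer fit
        kept.insert(0, turn)
        used += cost
    return kept
```

Starting a genuinely new conversation is the degenerate case: pass an empty history. A real implementation would use the model's actual tokenizer rather than a character-count heuristic.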