Token counting for a whole message #31
Comments
I'm not sure if this should be part of this library; maybe it belongs in https://github.com/tryAGI/OpenAI? But that may not be an option if you already depend heavily on another OpenAI SDK. Although this sounds quite ambitious, I'm actually making pretty good progress on it. It will also allow us to get the same for any other SDK based on the OpenAPI specification, which is very important for the rapid development of a library with a large number of integrations (LangChain .NET).
I see. Since OpenAI doesn't really give us the specs, especially not in a machine-friendly way, that really shouldn't go in here.
btw, I added your library to this highly viewed SO answer ;-)
Not sure if this would fit into the scope of this project, but it could be a real killer feature, since none of the other libraries do it.
If not, please feel free to delete :-)
What would you like to be added:
Be able to pass a whole OpenAI Message object into a function, and get the complete token count back.
Why is this needed:
So far, counting the tokens of a complete OpenAI message is quite tricky, as a message can now include multiple parts: functions, tools, etc.
As far as I know, there is no C# lib that supports this, it doesn't seem like MS is adding any value here (on the contrary :-) ), and it seems everyone wants to count tokens for messages, not just text.
Anything else we need to know?
I tried to implement it following this answer: https://stackoverflow.com/a/77175648/4821032
There is also a typescript library that seems to come very close: https://github.com/hmarr/openai-chat-tokens
P.S.: I think it is only really needed for outgoing (prompt) messages, since the incoming chat objects already carry the actual token count.
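For reference, the approach in the linked SO answer and in openai-chat-tokens boils down to tokenizing each message field and adding fixed per-message overheads. A minimal Python sketch of that accounting, under stated assumptions: `count_tokens` is a hypothetical stand-in for a real BPE tokenizer (e.g. tiktoken's `cl100k_base` encoding), the overhead constants (3 per message, 1 per name, 3 to prime the reply) are the commonly cited values for gpt-3.5/gpt-4-style chat models and may differ per model, and function/tool definitions (the genuinely tricky part) are not handled here:

```python
# Sketch of whole-message token counting, following the commonly cited
# heuristic for gpt-3.5/gpt-4-style chat models:
#   +3 tokens of overhead per message,
#   +1 token if a "name" field is present,
#   +3 tokens to prime the assistant's reply.
# `count_tokens` is a hypothetical stand-in for a real BPE tokenizer
# (e.g. tiktoken's cl100k_base) -- it is not part of any library here.

def num_tokens_from_messages(messages, count_tokens):
    total = 3  # every reply is primed with <|start|>assistant<|message|>
    for message in messages:
        total += 3  # per-message framing overhead
        for key, value in message.items():
            total += count_tokens(value)
            if key == "name":
                total += 1  # extra token when a name is supplied
    return total

# Usage with a trivial whitespace "tokenizer", just to show the accounting:
stub = lambda s: len(s.split())
msgs = [{"role": "user", "content": "hello world"}]
print(num_tokens_from_messages(msgs, stub))  # 3 + 3 + 1 + 2 = 9
```

A C# port would follow the same shape; the hard part in practice is serializing functions/tools into the same text form the model sees before tokenizing, which is what the TypeScript library above approximates.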