Allow users to pick Anthropic or OpenAI for LLM extraction #817
base: main
Conversation
I tried not to commit the formatter changes, apologies for that!
Could you check #414 to see if it fits your needs?
Thanks @rafaelsideguide. That PR allows for different models from OpenAI, but not for other providers.
…o be inconsistent because anthropic does not have a response_format param
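For context, a minimal sketch of that difference, assuming the official `openai` and `@anthropic-ai/sdk` clients (the helper below is illustrative, not code from this PR): OpenAI's chat completions accept a `response_format` option for JSON mode, while Anthropic's Messages API has no such parameter, so the JSON instruction has to go into the prompt itself.

```typescript
import OpenAI from "openai";
import Anthropic from "@anthropic-ai/sdk";

// Illustrative helper: the same JSON-extraction call routed to either provider.
async function completeJson(
  provider: "openai" | "anthropic",
  model: string,
  prompt: string
): Promise<string> {
  // OpenAI's JSON mode requires the word "JSON" to appear in the messages,
  // and Anthropic needs the instruction in the prompt anyway, so share it.
  const jsonPrompt = `${prompt}\n\nRespond with valid JSON only.`;

  if (provider === "openai") {
    const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment
    const res = await openai.chat.completions.create({
      model,
      messages: [{ role: "user", content: jsonPrompt }],
      response_format: { type: "json_object" }, // JSON mode; Anthropic has no equivalent
    });
    return res.choices[0].message.content ?? "";
  }

  const anthropic = new Anthropic(); // reads ANTHROPIC_API_KEY from the environment
  const res = await anthropic.messages.create({
    model,
    max_tokens: 1024, // required by the Messages API
    messages: [{ role: "user", content: jsonPrompt }],
  });
  const block = res.content[0];
  return block.type === "text" ? block.text : "";
}
```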
Awesome work, @aar2dee2! Just one thing: if the env variable isn't set and a request is made, the endpoint returns a 500 error with a response passed straight through from Anthropic. It would be helpful if the response clarified the issue (e.g., 'Anthropic API key is not set', or 'OpenAI API key is not set' for OpenAI).
Current response:

```json
{
  "success": false,
  "error": "(Internal server error) - Could not resolve authentication method. Expected either apiKey or authToken to be set. Or for one of the \"X-Api-Key\" or \"Authorization\" headers to be explicitly omitted - Could be due to LLM parsing issues"
}
```
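A minimal sketch of the kind of guard this suggests, assuming the SDKs' default env variable names (the function name and error shape are illustrative, not from the codebase):

```typescript
// Illustrative guard: fail fast with a descriptive message
// before handing the request to the provider SDK.
function assertProviderKey(provider: "openai" | "anthropic"): void {
  if (provider === "anthropic" && !process.env.ANTHROPIC_API_KEY) {
    throw new Error("Anthropic API key is not set");
  }
  if (provider === "openai" && !process.env.OPENAI_API_KEY) {
    throw new Error("OpenAI API key is not set");
  }
}
```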
I've added an `llmOptions` param in the extract options for the `v1/scrape` endpoint. This accepts an object with `model` and `provider` properties.
For example, a request body might look like this sketch:
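(The `url`/`formats`/`extract` wrapper below assumes the existing v1/scrape request shape, and the provider and model values are illustrative, not taken from the PR.)

```json
{
  "url": "https://example.com",
  "formats": ["extract"],
  "extract": {
    "prompt": "Extract the page title and author.",
    "llmOptions": {
      "provider": "anthropic",
      "model": "claude-3-5-sonnet-20240620"
    }
  }
}
```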
Demo video: https://youtu.be/BqsS9UTMD3s
I've retained the logs for now; let me know if you'd like any changes before merging.