
Use alternating Human / AI messages in M4, Assistant #30

Open
rlancemartin opened this issue Sep 24, 2024 · 2 comments

Comments

@rlancemartin (Collaborator)

@robbiemu noted:

In module 4 (research assistant), some models will not respond if the final message in the conversation is an AIMessage (which is always the case here), so you may have to modify search_wikipedia and search_web to "humanize" the last message, e.g. msg = HumanMessage(content=msg.content). This is true for local models and may even be true for other non-GPT models: when I run mistral-nemo locally, it will not respond, and a ChatMistral model would behave the same way. Likewise, answer quality will be much lower if we do not do the same for generate_answer; the references will be intact, but the expert will not produce a write-up.
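
For reference, a minimal sketch of the suggested re-cast, assuming the langchain_core message classes used in the module; the helper name humanize_last_message and the sample conversation are illustrative, not code from the notebook:

```python
from langchain_core.messages import AIMessage, HumanMessage


def humanize_last_message(messages):
    """Re-cast a trailing AIMessage as a HumanMessage so the conversation ends
    on a human turn; some chat models (e.g. a local mistral-nemo) otherwise
    refuse to respond."""
    if messages and isinstance(messages[-1], AIMessage):
        return messages[:-1] + [HumanMessage(content=messages[-1].content)]
    return messages


if __name__ == "__main__":
    # Quick check: a conversation ending in an AIMessage becomes human-terminated.
    convo = [
        HumanMessage(content="So you said you were writing an article on this?"),
        AIMessage(content="Yes, which aspects of the topic interest you most?"),
    ]
    fixed = humanize_last_message(convo)
    assert isinstance(fixed[-1], HumanMessage)
    print(fixed[-1].content)
```

Inside search_wikipedia, search_web, and generate_answer, the same re-cast could be applied to the node's message list right before invoking the model, matching the msg = HumanMessage(content=msg.content) line above.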

@rlancemartin (Collaborator, Author)

@robbiemu do you mind trying this?

#34

@robbiemu (Contributor)

Sure, I am just about to do that. Just FYI, here is what I had actually done when working through the module: https://gist.github.com/robbiemu/2704b22215694f4e5539ae601178d922

@rlancemartin rlancemartin changed the title Add HumanMessage to output of search_wikipedia, search_web in M4 Use alternating Human / AI messages in M4, Assistant Sep 24, 2024