Replies: 1 comment
-
Hey @dangnhdev! 👋 I'm here to help you with any bugs, questions, or contributions while you're waiting for a human maintainer. Let's tackle this together!

To configure separate models for embedding and chat, using OpenAI for chat and a local model for embedding, you can modify the `useLocalStorageChatConfig` hook:

```typescript
import { getChatsConfigFromLocalStorage } from "@/lib/api/chat/chat.local";
import { useUserData } from "@/lib/hooks/useUserData";
import { BrainConfig } from "@/lib/types/BrainConfig";

// eslint-disable-next-line @typescript-eslint/explicit-module-boundary-types
export const useLocalStorageChatConfig = () => {
  const { userData } = useUserData();
  const chatConfig = getChatsConfigFromLocalStorage();

  // Only keep the stored chat model if it is one of the models available to the user.
  const chatModel = (userData?.models ?? []).includes(chatConfig?.chatModel ?? "")
    ? chatConfig?.chatModel
    : undefined;

  // Validate the embedding model the same way, but track it independently of the chat model.
  const embeddingModel = (userData?.models ?? []).includes(chatConfig?.embeddingModel ?? "")
    ? chatConfig?.embeddingModel
    : undefined;

  return {
    chatConfig: {
      chatModel,
      embeddingModel,
      temperature: chatConfig?.temperature,
      maxTokens: chatConfig?.maxTokens,
    },
  };
};
```

In this example, the chat model and the embedding model are read and validated separately from the stored chat config, so you can set `chatModel` to an OpenAI model and `embeddingModel` to your local embedding model.
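As a rough illustration of how the two settings could be persisted and then consumed, here is a minimal sketch. The storage key `"chat-config"`, the model names, and the assumption that the hook reads a JSON blob from `localStorage` are all illustrative guesses, not details confirmed from the codebase:

```typescript
// Hypothetical persisted chat config with split models.
// Assumptions (not confirmed from the source): the hook reads JSON from
// localStorage, the key is "chat-config", and the model names below are
// just examples of an OpenAI chat model and a locally served embedding model.
const exampleChatConfig = {
  chatModel: "gpt-3.5-turbo",         // OpenAI model, used only for chat completions
  embeddingModel: "all-MiniLM-L6-v2", // local model, used only for embeddings to save cost
  temperature: 0.2,
  maxTokens: 512,
};

localStorage.setItem("chat-config", JSON.stringify(exampleChatConfig));

// A component could then read the validated values through the hook:
// const { chatConfig } = useLocalStorageChatConfig();
// chatConfig.chatModel      -> "gpt-3.5-turbo"     (if listed in userData.models)
// chatConfig.embeddingModel -> "all-MiniLM-L6-v2"  (if listed in userData.models)
```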
-
Hello, how can I configure separate models for embedding and chat? I want to use OpenAI for chat only and a local model for embedding to save money.