Code Wizard is an AI-powered chatbot designed to make understanding and using the LangChain documentation effortless. Ask any question about LangChain concepts or code, and Code Wizard will explain it clearly and interactively. It is built on a tech stack that includes Next.js, FastAPI, LangChain, LangGraph, and LCEL, with the ability to switch between ChatOpenAI and local LLaMA models.
Frontend: Code Wizard UI
Frontend Repo: Code Wizard Frontend
Backend Repo: Code Wizard Backend
Walkthrough video: LangChain LangGraph Documentation Chatbot Walkthrough (webm)
- Interactive Chat Interface: Engaging chat interface built with Next.js and React for smooth and intuitive user experience
- LangChain Integration: Uses LangChain for building applications with large language models
- Documentation Search: Implements LangGraph DAG to search vector databases for relevant documentation chunks
- Custom AI Responses: Combines retrieved documentation chunks with ChatOpenAI to generate detailed answers
- Markdown Rendering: Supports rendering code snippets and Markdown for easy comprehension
- Frontend: Next.js and TypeScript for a responsive and dynamic user interface
- Backend: FastAPI for fast and reliable API handling
- AI Frameworks: LangChain, LangGraph, LCEL for processing and understanding queries
- Model Support: Switchable between ChatOpenAI and LLaMA models for flexibility
- Data Storage: Vector databases for efficient document retrieval
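The documentation-search flow described above (embed the query, pull the most similar chunks from a vector store, and combine them with the question for the model) can be sketched in plain Python. This is an illustrative sketch with toy, hand-written embeddings and a stubbed generation step, not the project's actual code.

```python
import math

# Toy "vector database": each documentation chunk paired with a pre-computed
# embedding. In the real app these would come from an embedding model.
DOC_CHUNKS = [
    ("LCEL lets you compose runnables with the | operator.", [0.9, 0.1, 0.0]),
    ("LangGraph builds stateful DAGs of LLM calls.",         [0.1, 0.9, 0.1]),
    ("FastAPI serves async endpoints for the backend.",      [0.0, 0.2, 0.9]),
]

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def retrieve(query_embedding, k=2):
    """Return the k chunks most similar to the query embedding."""
    ranked = sorted(DOC_CHUNKS, key=lambda c: cosine(query_embedding, c[1]),
                    reverse=True)
    return [text for text, _ in ranked[:k]]

def build_prompt(question, chunks):
    """Combine retrieved chunks with the question for the generation step."""
    context = "\n".join(f"- {c}" for c in chunks)
    return f"Answer using this documentation:\n{context}\n\nQuestion: {question}"

# A query about LangGraph embeds closest to the second chunk.
chunks = retrieve([0.2, 0.9, 0.1])
print(build_prompt("How do I build a DAG?", chunks))
```

In the real pipeline the embeddings come from a model, the store holds thousands of chunks, and the prompt is sent to ChatOpenAI or a local LLaMA model rather than printed.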
Building Code Wizard was a fantastic learning journey, offering valuable lessons on:
- LangChain Mastery: Leveraging components like agents, memory, and vector stores effectively
- Model Optimization: Techniques like quantization and CPU offloading for efficient performance
- UI/UX Design: Creating conversational interfaces that feel natural and easy to use
- Scalable Backend Architecture: Using FastAPI and async processing for better performance
- Caching System
  - Cached responses for frequently asked questions to improve latency and efficiency
  - Reduced API load and provided faster user experiences
- Streaming Responses
  - Implemented LangChain’s streaming feature to send data to users as soon as it’s available
  - Enhanced user interaction by reducing waiting times
- Model Flexibility
  - Capability to switch to more powerful models like GPT-4 for critical use cases
  - Balances performance and cost-effectiveness based on user needs
Follow these steps to set up and run the Code Wizard application locally.
- Ensure you have Node.js and Python installed on your machine.
- Install Bun for JavaScript package management.
- Install Pipx for Python package management.
- Clone the Frontend Repository

  ```bash
  git clone https://github.com/RutamBhagat/code_wizard_frontend
  cd code_wizard_frontend
  ```
- Environment Setup for Frontend
  - Create a `.env` file in the `code_wizard_frontend` directory.
  - Add your OpenAI API key to the `.env` file.
- Install Frontend Dependencies

  ```bash
  # Install Bun globally
  npm install -g bun
  # Install frontend dependencies
  bun install
  ```
- Start the Frontend Development Server

  ```bash
  bun run dev
  ```
- Clone the Backend Repository

  ```bash
  cd ..  # Go back to the previous directory
  git clone https://github.com/RutamBhagat/Code-Wizard-LangGraph-C_RAG
  cd Code-Wizard-LangGraph-C_RAG
  ```
- Environment Setup for Backend
  - Create a `.env` file in the `Code-Wizard-LangGraph-C_RAG` directory.
  - Add your OpenAI API key to the `.env` file.
- Install Backend Dependencies

  ```bash
  # Install PDM for Python dependency management
  pipx install pdm
  # Install backend dependencies
  pdm install
  ```
- Start the Backend Server

  ```bash
  pdm run uvicorn app.main:app --reload
  ```
Code Wizard streamlines how developers learn and use the LangChain framework, pairing documentation search with AI-generated explanations in a backend built for scalability and performance.