
Code Wizard: LangChain Documentation AI Chatbot

Overview

Code Wizard is an AI-powered chatbot designed to make understanding and using the LangChain documentation effortless. Ask any question about LangChain concepts or code, and Code Wizard will explain it clearly and interactively. It is built with Next.js, FastAPI, LangChain, LangGraph, and LCEL, and can switch between ChatOpenAI and local LLaMA models.

Frontend: Code Wizard UI

Frontend Repo: Code Wizard Frontend (https://github.com/RutamBhagat/code_wizard_frontend)

Backend Repo: Code Wizard Backend (https://github.com/RutamBhagat/Code-Wizard-LangGraph-C_RAG)

Demo Video

Langchain.Langgraph.Documentation.Chatbot.Walkthrough.nTZu6I-0xF8.webm

Key Features

  • Interactive Chat Interface: Engaging chat interface built with Next.js and React for a smooth and intuitive user experience
  • LangChain Integration: Uses LangChain for building applications with large language models
  • Documentation Search: Implements a LangGraph DAG that searches a vector database for relevant documentation chunks (see the sketch after this list)
  • Custom AI Responses: Combines retrieved documentation chunks with ChatOpenAI to generate detailed answers
  • Markdown Rendering: Renders code snippets and Markdown for easy comprehension
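
The documentation-search flow boils down to a small retrieve-then-generate graph. Below is a minimal, illustrative sketch, assuming a Chroma vector store already indexed with the LangChain docs and OpenAI embeddings; node names are placeholders, and any additional corrective-RAG steps in the actual repository are omitted here.

```python
# Minimal sketch of a retrieve-then-generate LangGraph flow.
# Assumptions (not from the repo): a Chroma vector store already indexed with
# the LangChain docs, OpenAI embeddings, and placeholder names like "docs_index".
from typing import List, TypedDict

from langchain_community.vectorstores import Chroma
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI, OpenAIEmbeddings
from langgraph.graph import END, StateGraph


class GraphState(TypedDict):
    question: str
    documents: List[str]
    answer: str


retriever = Chroma(
    collection_name="docs_index", embedding_function=OpenAIEmbeddings()
).as_retriever(search_kwargs={"k": 4})
llm = ChatOpenAI(model="gpt-4", temperature=0)
prompt = ChatPromptTemplate.from_template(
    "Answer using only this documentation:\n{context}\n\nQuestion: {question}"
)


def retrieve(state: GraphState) -> GraphState:
    docs = retriever.invoke(state["question"])
    return {**state, "documents": [d.page_content for d in docs]}


def generate(state: GraphState) -> GraphState:
    answer = (prompt | llm).invoke(
        {"context": "\n\n".join(state["documents"]), "question": state["question"]}
    )
    return {**state, "answer": answer.content}


workflow = StateGraph(GraphState)
workflow.add_node("retrieve", retrieve)
workflow.add_node("generate", generate)
workflow.set_entry_point("retrieve")
workflow.add_edge("retrieve", "generate")
workflow.add_edge("generate", END)
graph = workflow.compile()

# graph.invoke({"question": "What is an LCEL runnable?", "documents": [], "answer": ""})
```

The compiled graph behaves like any LangChain runnable, so it can be invoked or streamed from a FastAPI endpoint.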

Technologies Used

  • Frontend: Next.js and TypeScript for a responsive and dynamic user interface
  • Backend: FastAPI for fast and reliable API handling
  • AI Frameworks: LangChain, LangGraph, and LCEL for processing and understanding queries
  • Model Support: Switchable between ChatOpenAI and local LLaMA models for flexibility (a sketch follows this list)
  • Data Storage: Vector databases for efficient document retrieval
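
Switching models can be as simple as constructing a different model behind one helper. This is a minimal sketch, assuming the local LLaMA model is loaded through LangChain's LlamaCpp wrapper (llama-cpp-python); the selection logic, model names, and paths in the backend may differ.

```python
# Minimal sketch of switching between ChatOpenAI and a local LLaMA model.
# The provider flag, model names, and GGUF path are illustrative assumptions.
import os

from langchain_community.llms import LlamaCpp
from langchain_openai import ChatOpenAI


def get_llm(provider: str = "openai"):
    """Return a LangChain-compatible model based on a simple provider flag."""
    if provider == "openai":
        return ChatOpenAI(model="gpt-4", temperature=0)
    # Local fallback: a quantized LLaMA model loaded via llama.cpp
    return LlamaCpp(
        model_path=os.getenv("LLAMA_MODEL_PATH", "./models/llama-3-8b.Q4_K_M.gguf"),
        n_ctx=4096,
        temperature=0,
    )


llm = get_llm(os.getenv("LLM_PROVIDER", "openai"))
```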

Challenges and Learnings

Building Code Wizard was a fantastic learning journey, offering valuable lessons on:

  • LangChain Mastery: Leveraging components like agents, memory, and vector stores effectively
  • Model Optimization: Techniques like quantization and CPU offloading for efficient performance (see the sketch after this list)
  • UI/UX Design: Creating conversational interfaces that feel natural and easy to use
  • Scalable Backend Architecture: Using FastAPI and async processing for better performance
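
Quantization and CPU offloading mostly come down to picking a compressed weight format and deciding how many layers live on the GPU. The snippet below is a rough sketch using llama-cpp-python; the file name, layer count, and thread count are assumptions, not settings from this project.

```python
# Sketch: loading a 4-bit quantized LLaMA model with llama-cpp-python and
# offloading part of it to the GPU while the rest runs on the CPU.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/llama-3-8b.Q4_K_M.gguf",  # 4-bit quantized weights (GGUF)
    n_gpu_layers=20,  # offload 20 layers to the GPU, keep the rest on the CPU
    n_ctx=4096,       # context window
    n_threads=8,      # CPU threads for the layers left on the CPU
)

out = llm("Explain what a LangChain retriever does.", max_tokens=128)
print(out["choices"][0]["text"])
```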

Optimizations

  1. Caching System

    • Cached responses for frequently asked questions to improve latency and efficiency (a sketch follows this list)
    • Reduced API load and provided faster user experiences
  2. Streaming Responses

    • Implemented LangChain’s streaming feature to send data to users as soon as it’s available (a sketch follows this list)
    • Enhanced user interaction by reducing waiting times
  3. Model Flexibility

    • Capability to switch to more powerful models like GPT-4 for critical use cases
    • Balances performance and cost-effectiveness based on user needs
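
For the caching optimization, one simple option is LangChain's built-in LLM cache, sketched below; the project's actual caching layer (keyed on frequently asked questions) may be implemented differently.

```python
# Sketch: enabling LangChain's in-memory LLM cache so an identical repeated
# prompt is answered without a second API call. Package paths assume a recent
# LangChain layout; the project's own caching layer may be custom.
from langchain.globals import set_llm_cache
from langchain_community.cache import InMemoryCache
from langchain_openai import ChatOpenAI

set_llm_cache(InMemoryCache())

llm = ChatOpenAI(model="gpt-4", temperature=0)
llm.invoke("What is LCEL?")  # first call hits the OpenAI API
llm.invoke("What is LCEL?")  # second identical call is served from the cache
```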
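
For streaming, LangChain's astream can be piped straight into a FastAPI StreamingResponse. The route name, request model, and prompt below are hypothetical placeholders, not the backend's real API.

```python
# Sketch: streaming tokens from an LCEL chain through a FastAPI endpoint.
# The /chat route, request model, and prompt are hypothetical placeholders.
from fastapi import FastAPI
from fastapi.responses import StreamingResponse
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI
from pydantic import BaseModel

app = FastAPI()
chain = ChatPromptTemplate.from_template("Answer briefly: {question}") | ChatOpenAI(model="gpt-4")


class ChatRequest(BaseModel):
    question: str


@app.post("/chat")
async def chat(req: ChatRequest):
    async def token_stream():
        # astream yields chunks as soon as the model emits them
        async for chunk in chain.astream({"question": req.question}):
            yield chunk.content

    return StreamingResponse(token_stream(), media_type="text/plain")
```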

Getting Started

Follow these steps to set up and run the Code Wizard application locally.

Prerequisites

  • Ensure you have Node.js and Python installed on your machine.
  • Install Bun for JavaScript package management.
  • Install pipx to manage Python CLI tools (it is used below to install PDM).

Clone the Repositories

  1. Clone the Frontend Repository

    git clone https://github.com/RutamBhagat/code_wizard_frontend
    cd code_wizard_frontend
  2. Environment Setup for Frontend

    • Create a .env file in the code_wizard_frontend directory.
    • Add your OpenAI API key to the .env file (see the example after these steps).
  3. Install Frontend Dependencies

    # Install Bun globally
    npm install -g bun
    
    # Install frontend dependencies
    bun install
  4. Start the Frontend Development Server

    bun run dev
  5. Clone the Backend Repository

    cd ..  # Go back to the previous directory
    git clone https://github.com/RutamBhagat/Code-Wizard-LangGraph-C_RAG
    cd Code-Wizard-LangGraph-C_RAG
  6. Environment Setup for Backend

    • Create a .env file in the Code-Wizard-LangGraph-C_RAG directory.
    • Add your OpenAI API key to the .env file (see the example after these steps).
  7. Install Backend Dependencies

    # Install PDM for Python dependency management
    pipx install pdm
    
    # Install backend dependencies
    pdm install
  8. Start the Backend Server

    pdm run uvicorn app.main:app --reload
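
Both .env files need at least an OpenAI key. The snippet below uses the variable name conventionally read by the OpenAI SDK and LangChain; the exact variables each repo expects (including any NEXT_PUBLIC_ prefixes on the frontend) should be checked against its own configuration.

```
# Example .env (OPENAI_API_KEY is an assumption based on the OpenAI convention;
# verify the exact names each repo expects)
OPENAI_API_KEY=sk-your-key-here
```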

Outcome

Code Wizard streamlines how developers learn and use the LangChain framework, combining documentation search with AI-generated explanations while remaining optimized for scalability and performance.
