TalkSpark is an AI-powered tool designed to generate personalized conversation starters. The application leverages OpenAI's capabilities to analyze social media profiles and create custom icebreakers, making it easier to initiate meaningful conversations with new connections.
- Frontend: TalkSpark UI
- Frontend Repo: [Talk Spark Frontend](https://github.com/RutamBhagat/talk_spark_frontend)
- Backend Repo: [Talk Spark Backend](https://github.com/RutamBhagat/talk_spark_langgraph)
Demo video: `talk.spark.cgXr4V6pQZ8.mp4`
- Profile Analysis: Automated system that analyzes social media profiles using LangGraph and custom tools (sketched after this list)
- Custom Icebreakers: Generates personalized conversation starters based on profile data
- Multi-Platform Integration: Supports various social media platforms through web scraping
- Modular Architecture: Easily extensible to include additional data sources and language models
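The profile-analysis flow is orchestrated with LangGraph. The backend repo is the source of truth for the real graph; the snippet below is only a minimal sketch, assuming a linear scrape → generate pipeline with illustrative state fields and placeholder node bodies:

```python
from typing import TypedDict

from langgraph.graph import StateGraph, END


class SparkState(TypedDict):
    # Illustrative state fields; the real project may track more.
    url: str
    profile_text: str
    icebreakers: str


def scrape_profile(state: SparkState) -> dict:
    """Fetch the public profile page (placeholder for the real scraping tools)."""
    return {"profile_text": f"Scraped contents of {state['url']}"}


def generate_icebreakers(state: SparkState) -> dict:
    """Turn the scraped profile into conversation starters (placeholder for the LLM call)."""
    return {"icebreakers": f"Ask about: {state['profile_text'][:60]}..."}


# Wire the two steps into a linear graph: scrape -> generate -> END.
graph = StateGraph(SparkState)
graph.add_node("scrape", scrape_profile)
graph.add_node("generate", generate_icebreakers)
graph.set_entry_point("scrape")
graph.add_edge("scrape", "generate")
graph.add_edge("generate", END)

app = graph.compile()
result = app.invoke({"url": "https://example.com/some-profile", "profile_text": "", "icebreakers": ""})
print(result["icebreakers"])
```

In the actual application the scrape node would call the scraping tools and the generate node would call the language model.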
- Frontend: Next.js for a responsive user interface
- Backend: FastAPI for efficient request handling
- AI Integration: LangChain, LCEL, and LangGraph for AI processing (an example LCEL chain follows this list)
- External Tools: TavilyAPI for web search, jina.ai for person data
- Development Tools: Python for backend processing
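To illustrate the LangChain/LCEL piece: a prompt | model | output-parser chain that turns scraped profile text into icebreakers. The prompt wording and model name below are assumptions, not taken from the repository:

```python
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

# Prompt wording is illustrative; the real prompt lives in the backend repo.
prompt = ChatPromptTemplate.from_template(
    "Given this social media profile summary:\n\n{profile}\n\n"
    "Write three short, friendly icebreakers for starting a conversation."
)

# Model name is an assumption; swap in whichever OpenAI model the project uses.
llm = ChatOpenAI(model="gpt-4o-mini", temperature=0.7)

icebreaker_chain = prompt | llm | StrOutputParser()

print(icebreaker_chain.invoke({"profile": "Loves trail running, builds open-source dev tools."}))
```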
The development process provided valuable insights into:
- Integrating multiple data sources and LLMs into a cohesive application
- Creating custom agents and tools for LangGraph (see the tool sketch after this list)
- Optimizing web scraping for reliability and performance
- Implementing async processing and caching mechanisms
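One way to expose a scraper to a LangGraph agent is to wrap it as a LangChain tool. The sketch below assumes the jina.ai reader convention of prefixing a URL with `https://r.jina.ai/`; the function name and behavior are illustrative, not the project's actual tool:

```python
import requests
from langchain_core.tools import tool


@tool
def fetch_profile_page(url: str) -> str:
    """Fetch a public web page as LLM-friendly text via the jina.ai reader."""
    # The r.jina.ai reader returns a cleaned, markdown-like version of the page.
    # Requires network access when invoked.
    response = requests.get(f"https://r.jina.ai/{url}", timeout=30)
    response.raise_for_status()
    return response.text


# A @tool-decorated function carries a name, description, and argument schema
# that LangGraph agents (or LangChain agent executors) can bind to an LLM.
print(fetch_profile_page.name)
print(fetch_profile_page.invoke({"url": "https://example.com"}))
```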
- Caching System
  - Implemented profile and webpage caching to reduce redundant requests
  - Improved response times through efficient data storage
- Asynchronous Processing (combined with caching in the sketch after this list)
  - Concurrent handling of web scraping and data processing
  - Enhanced system responsiveness and scalability
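Both optimizations can be sketched together: an in-memory cache keyed by URL plus httpx and asyncio for concurrent fetches. The project's actual cache stores profile and webpage data; this is only a minimal illustration:

```python
import asyncio

import httpx

# Simple in-memory cache; the real system stores profile and webpage data.
_page_cache: dict[str, str] = {}


async def fetch_page(client: httpx.AsyncClient, url: str) -> str:
    """Return the page body, using the cache to skip redundant requests."""
    if url in _page_cache:
        return _page_cache[url]
    response = await client.get(url, timeout=30)
    response.raise_for_status()
    _page_cache[url] = response.text
    return response.text


async def fetch_all(urls: list[str]) -> list[str]:
    """Scrape several pages concurrently instead of one at a time."""
    async with httpx.AsyncClient(follow_redirects=True) as client:
        return await asyncio.gather(*(fetch_page(client, url) for url in urls))


if __name__ == "__main__":
    pages = asyncio.run(fetch_all(["https://example.com", "https://example.org"]))
    print([len(p) for p in pages])
```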
Follow these steps to set up and run the TalkSpark application locally.
- Ensure you have Node.js and Python installed on your machine.
- Install Bun for JavaScript package management.
- Install pipx, which is used below to install PDM for managing the backend's Python dependencies.
- Clone the Frontend Repository

  ```bash
  git clone https://github.com/RutamBhagat/talk_spark_frontend
  cd talk_spark_frontend
  ```
- Environment Setup for Frontend
  - Create a `.env` file in the `talk_spark_frontend` directory.
  - Add your OpenAI API key to the `.env` file.
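  The variable name below is the conventional one for OpenAI clients; treat it as an assumption and check the frontend repo for the exact variables it expects:

  ```
  # .env (example value only; never commit real keys)
  OPENAI_API_KEY=your-openai-api-key-here
  ```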
- Install Frontend Dependencies

  ```bash
  # Install Bun globally
  npm install -g bun

  # Install frontend dependencies
  bun install
  ```
- Start the Frontend Development Server

  ```bash
  bun run dev
  ```
- Clone the Backend Repository

  ```bash
  cd ..  # Go back to the previous directory
  git clone https://github.com/RutamBhagat/talk_spark_langgraph
  cd talk_spark_langgraph
  ```
- Environment Setup for Backend
  - Create a `.env` file in the `talk_spark_langgraph` directory.
  - Add your OpenAI API key to the `.env` file.
- Install Backend Dependencies

  ```bash
  # Install PDM for Python dependency management
  pipx install pdm

  # Install backend dependencies
  pdm install
  ```
- Start the Backend Server

  ```bash
  pdm run uvicorn app.main:app --reload
  ```
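For context, `app.main:app` refers to a FastAPI application object exposed by the backend. The sketch below shows the general shape of such a module; the route path, request model, and response are illustrative placeholders, not TalkSpark's actual API:

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="TalkSpark API")


class IcebreakerRequest(BaseModel):
    # Illustrative field; the real backend may accept a name or profile URL.
    profile_url: str


@app.post("/icebreakers")
async def generate_icebreakers(request: IcebreakerRequest) -> dict:
    # In the real backend this would run the LangGraph pipeline;
    # here we return a placeholder so the sketch stays self-contained.
    return {"profile_url": request.profile_url, "icebreakers": ["placeholder icebreaker"]}
```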
TalkSpark demonstrates a practical application of AI to facilitating human connections. The system produces personalized conversation starters quickly, and the caching and asynchronous processing described above keep it responsive and scalable.