# My Chatbot Project
This project is a modular, scalable chatbot built with FastAPI, LangGraph, and LangChain. It supports multi-turn conversations, memory summarization, and tool integration (Tavily search), and is designed to help entrepreneurs and business users through structured prompts such as executive summaries, market analysis, marketing strategies, and more.
## Project Structure

```
my_chatbot_project/
│
├── main.py                 # App entry; loads FastAPI + routers
├── requirements.txt        # All dependencies
├── .env                    # API keys and secrets
│
├── config/                 # Global app settings
│   └── settings.py
│
├── routers/                # API endpoint definitions
│   └── chat_router.py
│
├── services/               # LangGraph logic, memory, summaries
│   ├── langgraph_engine.py
│   ├── memory_manager.py
│   └── summarizer.py
│
├── models/                 # Shared types
│   └── state.py
│
├── tools/                  # External tool setup (Tavily search)
│   └── tavily_tool.py
│
├── prompts/                # Dynamic prompt templates per chat_type
│   ├── executive_summary.py
│   ├── market_analysis.py
│   ├── marketing_strategy.py
│   ├── financial_projection.py
│   ├── implementation_timeline.py
│   ├── default_prompt.py
│   └── __init__.py
│
├── utils/                  # Utilities
│   ├── serializers.py
│   └── api_client.py
│
├── test/                   # Unit test directory
│   └── test_chat_stream.py
│
└── README.md               # Project documentation
```
## Setup

```shell
git clone https://github.com/yourname/my_chatbot_project.git
cd my_chatbot_project
pip install -r requirements.txt
```

Create a `.env` file in the project root with the following keys:

```
OPENAI_API_KEY=your-openai-key
TAVILY_API_KEY=your-tavily-key
```
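These values can be read into a settings object at startup. A minimal stdlib-only sketch of that idea (the `Settings` class and `load_env` helper here are illustrative, not the project's actual `config/settings.py`):

```python
import os
from dataclasses import dataclass, field


def load_env(path: str = ".env") -> None:
    """Populate os.environ from a simple KEY=value .env file (illustrative)."""
    try:
        with open(path) as f:
            for line in f:
                line = line.strip()
                if line and not line.startswith("#") and "=" in line:
                    key, _, value = line.partition("=")
                    os.environ.setdefault(key.strip(), value.strip())
    except FileNotFoundError:
        pass  # fall back to real environment variables


@dataclass
class Settings:
    openai_api_key: str = field(default_factory=lambda: os.environ.get("OPENAI_API_KEY", ""))
    tavily_api_key: str = field(default_factory=lambda: os.environ.get("TAVILY_API_KEY", ""))


load_env()
settings = Settings()
```

In practice you may prefer `pydantic-settings` for validation; the pattern is the same: read once at startup, then pass `settings` around rather than touching `os.environ` everywhere.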
## API

`/chat_stream` is the main streaming endpoint, using Server-Sent Events (SSE). It takes:

- `message` – the user's question
- `checkpoint_id` – resume a previous conversation, or `None`
- `clerk_id`, `project_id` – user/project context
- `chat_type` – type of session (e.g., `"market_analysis"`)
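Each event on the stream is a JSON payload inside an SSE `data:` frame (a `data:` line followed by a blank line). A minimal sketch of a serializer for the event types this endpoint emits (`format_sse_event` is an illustrative helper, not part of the project):

```python
import json


def format_sse_event(payload: dict) -> str:
    """Serialize one event as an SSE frame: a `data:` line plus a blank line."""
    return f"data: {json.dumps(payload)}\n\n"


# The three event types emitted by /chat_stream:
checkpoint = format_sse_event({"type": "checkpoint", "checkpoint_id": "uuid"})
content = format_sse_event({"type": "content", "content": "Hello!"})
end = format_sse_event({"type": "end"})
```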
## Features

### Memory

- Uses `ConversationSummaryBufferMemory` to summarize long chats.
- Automatically updates memory every round.
- Sends/receives chat summaries via a REST API (`utils/api_client.py`).
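The idea behind a summary buffer is to keep recent turns verbatim and fold older turns into a running summary once the buffer grows too large. A plain-Python sketch of that mechanism (an illustration of the pattern only, not LangChain's actual `ConversationSummaryBufferMemory`):

```python
class SummaryBufferMemory:
    """Keep the last `max_turns` messages verbatim; fold older ones into a summary."""

    def __init__(self, summarize, max_turns: int = 4):
        self.summarize = summarize  # callable: (summary, old_messages) -> new summary
        self.max_turns = max_turns
        self.summary = ""
        self.buffer: list[str] = []

    def add(self, message: str) -> None:
        self.buffer.append(message)
        if len(self.buffer) > self.max_turns:
            overflow = self.buffer[: -self.max_turns]
            self.buffer = self.buffer[-self.max_turns:]
            self.summary = self.summarize(self.summary, overflow)

    def context(self) -> str:
        """What gets prepended to the next model call."""
        return "\n".join(filter(None, [self.summary, *self.buffer]))


# Stub summarizer for demonstration; the real one would call the LLM.
mem = SummaryBufferMemory(lambda s, old: (s + " " + " | ".join(old)).strip(), max_turns=2)
for msg in ["hi", "hello", "how are you", "fine"]:
    mem.add(msg)
```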
### LangGraph Engine

- Manages the flow: model node → conditional edge → tool node → back to the model.
- Asynchronously streams responses via `langgraph.astream_events`.
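The model → conditional → tool → model loop can be sketched in plain Python (this illustrates the control flow only; the project's `services/langgraph_engine.py` builds it declaratively with LangGraph nodes and edges, and all names below are made up for the example):

```python
def run_graph(state: dict, model, tool, max_steps: int = 5) -> dict:
    """Model node -> conditional edge -> tool node -> back to model (illustrative)."""
    for _ in range(max_steps):
        result = model(state)                   # model node
        state["messages"].append(result)
        if result.get("tool_call") is None:     # conditional edge: no tool needed -> end
            return state
        observation = tool(result["tool_call"])  # tool node
        state["messages"].append({"role": "tool", "content": observation})
    return state


# Stub model: asks for a search once, then answers using the tool result.
def model(state):
    last = state["messages"][-1]
    if last.get("role") == "tool":
        return {"role": "assistant", "content": f"Based on: {last['content']}", "tool_call": None}
    return {"role": "assistant", "content": "", "tool_call": "langgraph"}


final = run_graph({"messages": [{"role": "user", "content": "what is langgraph?"}]},
                  model, tool=lambda q: f"search results for {q!r}")
```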
### Tools

- Integrated with Tavily for real-time web search.
- You can extend `tools/` with custom tools later.
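One common way to make `tools/` extensible is a small registry that new tool modules add themselves to. This sketch is an assumption about how you might wire it, not the project's actual code (LangGraph/LangChain have their own tool-binding mechanisms):

```python
from typing import Callable

# Illustrative tool registry keyed by tool name.
TOOLS: dict[str, Callable[[str], str]] = {}


def register_tool(name: str):
    """Decorator that adds a function to the registry under `name`."""
    def decorator(fn: Callable[[str], str]) -> Callable[[str], str]:
        TOOLS[name] = fn
        return fn
    return decorator


@register_tool("tavily_search")
def tavily_search(query: str) -> str:
    # Placeholder: the real tool would call the Tavily API using TAVILY_API_KEY.
    return f"[stub results for {query!r}]"
```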
### Prompts

- Prompts are modularized in `prompts/`, one module per type: `executive_summary`, `market_analysis`, etc.
- `prompts/__init__.py` dynamically loads the right one.
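Dynamic selection by `chat_type` can be as simple as a lookup with a default fallback. A hedged sketch (the templates and `get_prompt` helper are illustrative; the real `prompts/__init__.py` may use `importlib` to load the prompt modules instead):

```python
# Illustrative prompt lookup with a default fallback.
PROMPTS = {
    "executive_summary": "Write an executive summary for: {topic}",
    "market_analysis": "Analyze the market for: {topic}",
}
DEFAULT_PROMPT = "You are a helpful business assistant. Topic: {topic}"


def get_prompt(chat_type: str) -> str:
    """Return the template for `chat_type`, falling back to the default prompt."""
    return PROMPTS.get(chat_type, DEFAULT_PROMPT)
```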
## Example

```shell
curl -N "http://localhost:8000/chat_stream?message=Hi&clerk_id=123&project_id=456&chat_type=executive_summary"
```

Sample stream:

```
data: {"type":"checkpoint", "checkpoint_id":"uuid"}
data: {"type":"content", "content":"Hello! Let’s get started..."}
data: {"type":"end"}
```

Supported `chat_type` values:

- `executive_summary`
- `market_analysis`
- `marketing_strategy`
- `financial_projection`
- `implementation_timeline`
These drive dynamic onboarding experiences using specialized prompts.
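A client consumes the stream by decoding `data:` lines until it sees an `end` event. A minimal stdlib sketch of that parsing (illustrative; a real client would iterate over the HTTP response body line by line):

```python
import json


def parse_sse(lines):
    """Yield decoded event payloads from an iterable of SSE lines; stop at 'end'."""
    for line in lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip blank separator lines and comments
        event = json.loads(line[len("data:"):])
        yield event
        if event.get("type") == "end":
            break


stream = [
    'data: {"type":"checkpoint", "checkpoint_id":"uuid"}',
    'data: {"type":"content", "content":"Hello!"}',
    'data: {"type":"end"}',
]
events = list(parse_sse(stream))
```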
## Testing

You can write tests using `pytest` in the `test/` directory.
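A streaming test typically asserts on the order and shape of the emitted events. A hedged sketch (the `fake_chat_stream` stand-in is hypothetical; a real test would call the running app with `httpx` or FastAPI's `TestClient`):

```python
import json


def fake_chat_stream():
    """Stand-in for the SSE response body of /chat_stream (illustrative)."""
    yield 'data: {"type":"checkpoint", "checkpoint_id":"abc"}'
    yield 'data: {"type":"content", "content":"Hello!"}'
    yield 'data: {"type":"end"}'


def test_chat_stream_event_order():
    events = [json.loads(line[len("data:"):]) for line in fake_chat_stream()]
    assert events[0]["type"] == "checkpoint"
    assert events[-1]["type"] == "end"
    assert any(e["type"] == "content" for e in events)
```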
Run the test suite:

```shell
pytest test/test_chat_stream.py
```

## Extending

- Add new prompts in `prompts/`
- Register them in `prompts/__init__.py`
- Add new tools in `tools/`
- Enhance summarization logic in `services/summarizer.py`
## Credits

- LangChain
- LangGraph
- Tavily Search
- Built with ❤️ by Priya Rathor