🤖 Agent Starter

Production-ready AI chat interface with tool calling, streaming, and multi-provider support

Built with Vercel AI SDK · AI Elements · Next.js 15 · TypeScript


Website · Features · Quick Start · Documentation · Examples


🎯 Overview

A polished, production-ready starter template for building AI agents with conversational interfaces. Features streaming responses, multi-step tool calling, web search with citations, and support for multiple AI providers.

Perfect for building:

  • AI assistants with custom tools and knowledge
  • Research agents with web search and citations
  • Data analysis tools with database integration
  • Customer support bots with contextual awareness
  • Internal tools with API integrations

✨ Features

🎨 Chat Experience

Real-time Streaming

  • Server-sent events for instant responses
  • Token-by-token streaming display
  • Progress indicators and loading states
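
Under the hood this is the AI SDK's streaming primitive. A minimal sketch (illustrative only, not the repo's actual /api/chat handler, and assuming the standard streamText API from the ai package):

// Minimal streaming sketch
import { streamText } from "ai";
import { openai } from "@ai-sdk/openai";

const result = streamText({
  model: openai("gpt-4o"),
  prompt: "Explain server-sent events in one paragraph.",
});

// Tokens arrive incrementally; the UI renders them as they stream in
for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}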

Smart History

  • Sidebar with time-based grouping
  • Search across conversations
  • Draft mode (no empty chats)

Advanced Interactions

  • Multi-step tool calling (see the sketch below)
  • File attachments (drag & drop)
  • Inline citations with [1], [2] markers
  • Chain-of-thought visualization
  • Message branching & regeneration
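
Multi-step tool calling means the model can call a tool, read the result, and continue before producing its final answer. A hedged sketch with the AI SDK's generateText (option names such as parameters and maxSteps vary between SDK versions):

import { generateText, tool } from "ai";
import { openai } from "@ai-sdk/openai";
import { z } from "zod";

const { text } = await generateText({
  model: openai("gpt-4o"),
  tools: {
    getTime: tool({
      description: "Get the current server time",
      parameters: z.object({}), // newer SDK versions name this field inputSchema
      execute: async () => ({ time: new Date().toISOString() }),
    }),
  },
  maxSteps: 3, // let the model call tools, then answer (newer SDKs use stopWhen)
  prompt: "What time is it right now?",
});

console.log(text);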

🤖 Model Support

| Provider | Models | Features |
| --- | --- | --- |
| OpenAI | GPT-5, GPT-4o, o4-mini | Reasoning, tools, web search |
| Groq | DeepSeek R1 (70B) | Fast inference, reasoning |
| AI Gateway | All providers | Unified routing, cost optimization |

  • Runtime switching - Change models mid-conversation
  • Reasoning display - Collapsible thought process for o-series models
  • Token tracking - Monitor input/output usage
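
Token tracking comes directly from the SDK: every call returns usage data alongside the result. A small sketch (the exact field names inside usage differ between SDK versions):

import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

const { text, usage } = await generateText({
  model: openai("gpt-4o"),
  prompt: "Summarize the benefits of streaming UIs.",
});

// usage holds input/output token counts; field names vary by SDK version
console.log(text, usage);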

πŸ› οΈ Built-in Tools

// Example: Weather tool
const weather = await getWeather({ city: "San Francisco" });
// → { temp: 72, conditions: "Sunny" }
  • ⏰ getTime - Current server time
  • 🌤️ getWeather - Mock weather data (template for real APIs)
  • 📎 summarizeAttachments - File content analysis
  • 🔍 web_search - OpenAI built-in search (optional toggle)

→ See src/lib/ai/tools/examples/ for API, database, and search templates

🎨 UI/UX

Screenshots: light mode, dark mode, and mobile responsive layouts.

  • 🎨 Theme toggle - Light/dark mode with next-themes
  • 📱 Mobile responsive - Works on all screen sizes
  • ⌨️ Keyboard shortcuts - Cmd+K new chat, Cmd+Enter send (see the sketch below)
  • ♿ Accessible - ARIA labels, keyboard navigation
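
The keyboard shortcuts come down to a global keydown listener. A hypothetical React sketch (startNewChat and sendCurrentMessage are placeholder callbacks, not the repo's actual handlers):

import { useEffect } from "react";

export function useChatShortcuts(startNewChat: () => void, sendCurrentMessage: () => void) {
  useEffect(() => {
    const onKeyDown = (e: KeyboardEvent) => {
      const mod = e.metaKey || e.ctrlKey; // Cmd on macOS, Ctrl elsewhere
      if (mod && e.key === "k") {
        e.preventDefault();
        startNewChat(); // Cmd+K: new chat
      }
      if (mod && e.key === "Enter") {
        e.preventDefault();
        sendCurrentMessage(); // Cmd+Enter: send
      }
    };
    window.addEventListener("keydown", onKeyDown);
    return () => window.removeEventListener("keydown", onKeyDown);
  }, [startNewChat, sendCurrentMessage]);
}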

πŸ—οΈ Developer Experience

  • βœ… TypeScript strict mode - Full type safety
  • πŸ”§ ESLint + Prettier - Code quality enforcement
  • πŸ”₯ Hot reload - Instant feedback during development
  • πŸ“š Comprehensive docs - Architecture, deployment, contributing guides
  • 🧩 Modular design - Easy to extend and customize

🚀 Quick Start

Prerequisites

  • Node.js 18+ and pnpm
  • An OpenAI API key (Groq and AI Gateway keys are optional)

Installation

# Clone the repository
git clone https://github.com/thorchh/agent-starter.git
cd agent-starter

# Install dependencies
pnpm install

# Set up environment variables
cp .env.example .env.local
# Edit .env.local and add your OPENAI_API_KEY

# Start the development server
pnpm dev

Open http://localhost:3000 to see your app! 🎉

First Steps

  1. Try the chat - Navigate to /chat and ask "What time is it?"
  2. Test tool calling - The AI will call the getTime tool
  3. Enable search - Toggle web search and ask "Latest news about AI"
  4. Customize - Edit src/lib/ai/system-prompt.ts to change behavior
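
For step 4, the system prompt is just an exported string. The exact export in src/lib/ai/system-prompt.ts may be shaped differently, but the idea is:

// src/lib/ai/system-prompt.ts (illustrative; check the file for the real export)
export const systemPrompt = `
You are a concise research assistant.
Prefer calling tools over guessing, and cite web search results with [n] markers.
`;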

🎓 Examples

Adding a Custom Tool

// src/lib/ai/tools/getStockPrice.ts
import { tool } from "ai";
import { z } from "zod";

export const getStockPrice = tool({
  description: "Get the current stock price for a symbol",
  parameters: z.object({
    symbol: z.string().describe("Stock ticker symbol (e.g., AAPL)"),
  }),
  execute: async ({ symbol }) => {
    // Replace with real API call
    const price = Math.random() * 1000;
    return {
      symbol,
      price: price.toFixed(2),
      currency: "USD",
    };
  },
});

// src/lib/ai/tools/index.ts (import paths shown here are illustrative)
import { getTime } from "./getTime";
import { getWeather } from "./getWeather";
import { getStockPrice } from "./getStockPrice";

export const tools = {
  getTime,
  getWeather,
  getStockPrice, // ← Add your tool
};

Switching Providers

# .env.local: use Groq for faster inference
AI_MODEL=groq/deepseek-r1-distill-llama-70b

# Or use AI Gateway for cost optimization
AI_MODEL=gateway/openai/gpt-5
AI_GATEWAY_API_KEY=vck-...
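
Under the hood, src/lib/ai/provider.ts maps the AI_MODEL id to a provider. The real implementation isn't shown here; a minimal sketch of that kind of routing, assuming the @ai-sdk/openai and @ai-sdk/groq packages, might look like:

import { openai } from "@ai-sdk/openai";
import { groq } from "@ai-sdk/groq";

// Resolve "provider/model" ids such as "groq/deepseek-r1-distill-llama-70b"
export function resolveModel(id = process.env.AI_MODEL ?? "openai/gpt-5") {
  const [provider, ...rest] = id.split("/");
  const model = rest.join("/");
  switch (provider) {
    case "openai":
      return openai(model);
    case "groq":
      return groq(model);
    // "gateway/..." ids would be routed through the Vercel AI Gateway here
    default:
      throw new Error(`Unknown provider: ${provider}`);
  }
}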

Enabling Reasoning (o-series models)

# .env.local
ENABLE_REASONING=true
OPENAI_REASONING_SUMMARY=detailed
OPENAI_REASONING_EFFORT=high
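
These variables are read on the server and forwarded to the OpenAI provider as options. A hedged sketch (the providerOptions keys below are assumptions mirroring the env names; check the provider docs for your SDK version):

import { streamText } from "ai";
import { openai } from "@ai-sdk/openai";

const reasoningEnabled = process.env.ENABLE_REASONING === "true";

const result = streamText({
  model: openai("o4-mini"),
  prompt: "Walk through your reasoning step by step.",
  providerOptions: reasoningEnabled
    ? {
        openai: {
          // Assumed option names, mirroring the env variables above
          reasoningSummary: process.env.OPENAI_REASONING_SUMMARY ?? "auto",
          reasoningEffort: process.env.OPENAI_REASONING_EFFORT ?? "high",
        },
      }
    : undefined,
});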

πŸ“ Project Structure

agent/
├── src/
│   ├── app/
│   │   ├── api/chat/           # Streaming chat endpoint
│   │   ├── chat/[id]/          # Chat page (dynamic route)
│   │   └── page.tsx            # Landing page
│   ├── components/
│   │   ├── chat/               # Chat UI (Client, Sidebar, MessageParts)
│   │   ├── ai-elements/        # AI Elements components
│   │   └── ui/                 # shadcn/ui primitives
│   └── lib/
│       ├── ai/
│       │   ├── provider.ts     # Multi-provider routing
│       │   ├── models.ts       # Model allowlist & config
│       │   ├── tools/          # Tool registry
│       │   └── system-prompt.ts
│       └── chat/
│           ├── server/         # File-based chat store
│           └── store/          # Store implementations
├── .env.example
├── ARCHITECTURE.md             # System design deep dive
├── DEPLOYMENT.md               # Production deployment guide
└── CONTRIBUTING.md             # Development guidelines

🔧 Environment Variables

| Variable | Required | Description | Default |
| --- | --- | --- | --- |
| OPENAI_API_KEY | ✅ | OpenAI API key | - |
| GROQ_API_KEY | ❌ | Groq API key for DeepSeek R1 | - |
| AI_GATEWAY_API_KEY | ❌ | Vercel AI Gateway key | - |
| AI_MODEL | ❌ | Default model ID | openai/gpt-5 |
| ENABLE_REASONING | ❌ | Show reasoning for o-series models | false |
| OPENAI_REASONING_SUMMARY | ❌ | Reasoning detail level | auto |
| OPENAI_REASONING_EFFORT | ❌ | Reasoning effort | high |

See .env.example for full configuration.
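
A minimal .env.local using only the values documented above:

# .env.local (minimal local setup)
OPENAI_API_KEY=sk-...

# Optional overrides
AI_MODEL=openai/gpt-5
ENABLE_REASONING=false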


📖 Documentation

| Document | Description |
| --- | --- |
| ARCHITECTURE.md | System design, data flow, and key decisions |
| DEPLOYMENT.md | Production deployment with Vercel/Supabase |
| CONTRIBUTING.md | Development setup and guidelines |
| Tool Examples | Database, API, and search tool templates |

🚢 Deployment

Quick Deploy to Vercel

Deploy with Vercel

Production Checklist

For production use, you'll need to upgrade from the file-based storage:

  • Database - Set up Vercel Postgres or Supabase (see DEPLOYMENT.md and the store sketch below)
  • Blob Storage - Configure S3/R2/Vercel Blob for file attachments
  • Authentication - Add NextAuth, Clerk, or Supabase Auth
  • Rate Limiting - Implement per-user/IP rate limits
  • Monitoring - Set up Sentry, Vercel Analytics, or similar

See DEPLOYMENT.md for detailed instructions.
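
The chat store under src/lib/chat/store/ is the seam to swap: keep the same contract and back it with Postgres or Supabase instead of files. The interface below is a hypothetical shape for illustration, not the repo's actual types:

// Hypothetical store contract; the repo's real interface may differ
export interface ChatStore {
  listChats(userId: string): Promise<{ id: string; title: string; updatedAt: Date }[]>;
  getChat(id: string): Promise<{ id: string; messages: unknown[] } | null>;
  saveChat(id: string, messages: unknown[]): Promise<void>;
  deleteChat(id: string): Promise<void>;
}

// A Postgres-backed class (e.g. PostgresChatStore implements ChatStore) would
// replace the file-based implementation without touching the chat routes.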


πŸ› οΈ Tech Stack

| Category | Technologies |
| --- | --- |
| Framework | Next.js 15 (App Router), React 19 |
| Language | TypeScript 5 (strict mode) |
| AI | Vercel AI SDK 6.0, AI Elements |
| UI | Tailwind CSS 4, shadcn/ui, Radix UI |
| Providers | OpenAI, Groq, AI Gateway |
| Dev Tools | ESLint, pnpm, Hot Reload |

🤝 Contributing

Contributions are welcome! Please see CONTRIBUTING.md for:

  • Development setup
  • Code style guidelines
  • Testing requirements
  • Pull request process

πŸ“ License

This is a starter template. Use it however you like - MIT License.


πŸ™ Acknowledgments

Built with the Vercel AI SDK, AI Elements, Next.js, Tailwind CSS, and shadcn/ui.


⭐ Star this repo if you find it useful!

Website · Documentation · GitHub

Made with ❤️ for the AI community
