Tools for creating and validating Catalyst Knowledge Packs.
```bash
pip install catalyst-builder
```

Knowledge Packs are YAML configurations that define tools for integrating with external systems through the MCP (Model Context Protocol).
LLM-Optimized Knowledge Packs - New optional features to improve AI tool discovery and usage:
- Smart Tool Metadata - Display names, usage hints, complexity levels
- Parameter Constraints - Min/max values, examples, validation patterns
- Tool Prerequisites - Define safe tool usage sequences
- External Transforms - Reference Python/JS files for better maintainability
All features are 100% backward compatible - existing packs continue to work unchanged!
See the LLM Optimization Guide for details.
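As a rough illustration of how these optional fields might fit into a pack, here is a sketch. The field names `display_name`, `usage_hint`, `complexity`, `prerequisites`, `constraints`, `examples`, and `file` are assumptions for illustration only, not the authoritative spelling; check the LLM Optimization Guide for the actual schema.

```yaml
# Illustrative sketch only - the optional field names below are assumptions,
# not the guaranteed schema (see the LLM Optimization Guide).
tools:
  search_orders:
    type: "search"
    description: "Search orders by date range"
    display_name: "Search Orders"              # smart tool metadata
    usage_hint: "Prefer narrow date ranges"    # guidance for AI callers
    complexity: "basic"
    prerequisites: ["list_items"]              # safe tool usage sequence
    parameters:
      - name: "limit"
        type: "integer"
        required: false
        constraints:
          min: 1                               # parameter constraints
          max: 100
        examples: [25, 50]
    transform:
      type: "python"
      file: "transforms/normalize_orders.py"   # external transform file
```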
A minimal pack definition looks like this:

```yaml
# pack.yaml
metadata:
  name: "my_integration"
  version: "1.0.0"
  description: "Integration with external API"

connection:
  type: "rest"
  base_url: "${API_URL}"
  auth:
    method: "bearer"
    token: "${API_TOKEN}"

tools:
  list_items:
    type: "list"
    description: "Get list of items"
    endpoint: "/items"
    method: "GET"
```

Supported integration types:

- REST API - HTTP/HTTPS API integrations
- Database - SQL and NoSQL database connections (see the sketch after this list)
- File System - Local files, S3, Azure Blob, Google Cloud Storage
- SSH - Remote system access
- Message Queue - RabbitMQ, Kafka, Redis Pub/Sub
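For example, a database integration pairs a database-style connection with `query` tools. The sketch below models its keys on the REST and environment-variable examples in this README; `type: "database"`, `host`, and `database` are assumptions, so consult the Integration Types guide for the actual schema.

```yaml
# Sketch of a database pack - key names are assumptions modeled on the
# REST example above and the environment-variable example further below.
connection:
  type: "database"
  host: "${DB_HOST}"
  database: "app"
  auth:
    username: "${DB_USER}"
    password: "${DB_PASSWORD}"

tools:
  count_users:
    type: "query"
    description: "Count registered users"
    sql: "SELECT COUNT(*) FROM users"
```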
Available tool types:

- `list` - Get arrays of data
- `details` - Get specific resource details (see the sketch after this list)
- `query` - Run database queries
- `search` - Search with parameters
- `execute` - Run commands or scripts
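For instance, a `details` tool can combine an endpoint template with a required path parameter. The snippet below is a sketch modeled on the REST and parameter examples in this README; the `get_item` name and `{item_id}` placeholder are illustrative, and parameter definitions are covered in the next section.

```yaml
# Sketch of a details tool - tool name and placeholder are illustrative
tools:
  get_item:
    type: "details"
    description: "Get details for a single item"
    endpoint: "/items/{item_id}"
    method: "GET"
    parameters:
      - name: "item_id"
        type: "string"
        required: true
```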
Define parameters for dynamic tools:
```yaml
tools:
  search_users:
    type: "query"
    sql: "SELECT * FROM users WHERE created_at > {since_date}"
    parameters:
      - name: "since_date"
        type: "string"
        required: true
```

Transform responses with jq, Python, JavaScript, or templates:
```yaml
tools:
  process_data:
    type: "query"
    sql: "SELECT id, name, status FROM users"
    transform:
      type: "jq"
      expression: '.[] | {id, name, active: .status == "active"}'
```

Validate a pack from Python:

```bash
python -c "from catalyst_pack_schemas.validator import PackValidator; print(PackValidator().validate_pack('path/to/pack'))"
```

Use environment variables for sensitive data:
```yaml
connection:
  host: "${DB_HOST}"
  auth:
    username: "${DB_USER}"
    password: "${DB_PASSWORD}"
```

Optional dependencies for specific integrations:
```bash
# Database connections
pip install asyncpg                # PostgreSQL
pip install aiomysql               # MySQL
pip install aiosqlite              # SQLite
pip install motor                  # MongoDB
pip install redis                  # Redis

# Cloud storage
pip install aioboto3               # AWS S3
pip install google-cloud-storage   # Google Cloud
pip install azure-storage-blob     # Azure Blob

# Other integrations
pip install aio-pika               # RabbitMQ
pip install aiokafka               # Apache Kafka
pip install asyncssh               # SSH connections
```

Further reading:

- Integration Types - Detailed integration patterns
- Pack Structure - Pack organization guide
- Pack Development - Creation guide
- Security - Security patterns
See the `examples/` directory for sample packs demonstrating various integration patterns.