Parra-Glideator is an innovative web application designed to help paraglider pilots find the perfect place and time to fly. Leveraging sophisticated machine learning and generative AI models, it recommends the best flying spots based on weather forecasts and historical flight conditions.
Meet Parra-Glideator, our charming, paragliding gladiator parrot who traded natural flight for a paraglider. Like him, every pilot faces uncertainty—weather conditions, location choice, or flight planning can become a daunting battle. Parra-Glideator is here to ensure you have the odds on your side.
🌤️ Fly smarter, safer, and with more confidence!
The project is currently in public beta and available at parra-glideator.com. Try it out, plan your next flight adventure, and help us refine this pilot-friendly tool!
- `agents/` – Autonomous LangGraph agents (e.g. Site Researcher)
- `analytics/` – Notebooks, datasets, training pipeline
- `art/` – Brand assets (Parra-Glideator!)
- `backend/` – FastAPI API, MCP server, Celery workers, Docker
- `db/` – dbt project building the analytics warehouse
- `frontend/` – React + Leaflet single-page app
- `gfs/` – Library for downloading & flattening NOAA GFS data
- `net/` – PyTorch models + preprocessing (Glideator-Net)
- `scrapers/` – Scrapy spiders for XContest & Paragliding Map
```bash
# clone & launch everything (API + DB + Worker + Web)
$ git clone https://github.com/janhelcl/glideator.git
$ cd glideator
$ docker-compose -f backend/docker-compose.dev.yml up --build
```

- API docs: http://localhost:8000/docs
- Frontend: http://localhost:3000
- MCP server: http://localhost:8000/mcp
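Once the containers are up, a quick sanity check is to fetch FastAPI's auto-generated OpenAPI schema. This is only a sketch, not part of the repo: it relies on FastAPI's default `/openapi.json` endpoint rather than any project-specific route, and uses `httpx` (plain `requests` works just as well).

```python
import httpx

# FastAPI serves its OpenAPI schema at /openapi.json by default,
# so a 200 here means the API container is up and routing.
resp = httpx.get("http://localhost:8000/openapi.json", timeout=10)
resp.raise_for_status()
paths = sorted(resp.json()["paths"])
print(f"API is up with {len(paths)} routes, e.g. {paths[:3]}")
```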
Each core component can be run on its own. Follow the dedicated README in the corresponding folder for setup & usage details:
- `backend/README.md` — FastAPI API, Celery worker & Docker compose files
- `frontend/README.md` — React single-page application
- `db/README.md` — dbt analytics warehouse
- `scrapers/README.md` — Scrapy project for flight & site data
- `gfs/README.md` — GFS data downloader & utilities
- `net/README.md` — PyTorch model library
- `analytics/training/README.md` — End-to-end training pipeline
- `agents/site_researcher/README.md` — Autonomous Site Researcher agent
- Backend (`backend/`) – FastAPI, MCP server, PostgreSQL, Celery, RabbitMQ.
- Frontend (`frontend/`) – React 18, Material-UI, React-Leaflet, D3.
- Warehouse (`db/`) – Postgres + dbt (staging & mart models).
- ML Library (`net/`) – Neural networks implemented in PyTorch.
- Training (`analytics/training/`) – WebDataset loaders, notebooks.
- Weather (`gfs/`) – Fetches & processes NOAA GFS GRIB2 files.
- Scrapers (`scrapers/`) – Flight & site data collection with Scrapy.
- Agents (`agents/site_researcher/`) – LangGraph agent enriching site metadata.
- MCP Integration – Model Context Protocol server enabling AI assistants to access paragliding data through structured tools for site information, weather forecasts, trip planning, and more (see the client sketch below).
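As a minimal sketch of how an AI assistant (or any MCP client) might talk to the server, the snippet below uses the official `mcp` Python SDK to open a session against the local `/mcp` URL from the quick-start and list the tools it exposes. It assumes the server speaks the streamable-HTTP transport; if it uses SSE instead, the SDK's SSE client is the drop-in alternative. Tool names and schemas come from the server itself, so nothing project-specific is hard-coded here.

```python
import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client


async def list_parra_tools() -> None:
    # Open a streamable-HTTP connection to the locally running MCP server.
    async with streamablehttp_client("http://localhost:8000/mcp") as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()            # MCP handshake
            result = await session.list_tools()   # discover the structured tools
            for tool in result.tools:
                print(f"{tool.name}: {tool.description}")


if __name__ == "__main__":
    asyncio.run(list_parra_tools())
```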
- Scrapers write raw flights & sites → Postgres.
- `dbt` transforms them into clean mart tables.
- Training notebooks export WebDataset shards (see the loading sketch below).
- PyTorch models are trained & the best checkpoint is shipped to the API.
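To illustrate the last two steps, here is a minimal sketch of consuming exported shards with the `webdataset` library and a PyTorch `DataLoader`. The shard pattern and sample keys (`features.npy`, `label.cls`) are hypothetical placeholders; the real names live in `analytics/training/`.

```python
import torch
import webdataset as wds

# Hypothetical shard pattern and sample keys -- see analytics/training/ for the real ones.
SHARDS = "data/train-{000000..000009}.tar"

dataset = (
    wds.WebDataset(SHARDS)
    .decode()                               # decode the .npy / .cls entries in each sample
    .to_tuple("features.npy", "label.cls")  # (input array, class label) per sample
)

# WebDataset yields an IterableDataset, so it plugs straight into a DataLoader.
loader = torch.utils.data.DataLoader(dataset, batch_size=64)

for features, labels in loader:
    print(features.shape, labels[:5])
    break
```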
For the full deep-dive (maths included!) see analytics/training/README.md.
