This bio was written by GenAI using my chat history... :)
I'm the sort of engineer who spins up Kubernetes clusters for fun, builds multi-GPU AI inference stacks before breakfast, and insists that YAML indentation is a personality trait. I spend most of my time tinkering with Linux, OpenShift, AI/ML model serving, and anything that involves distributed systems doing weird things across multiple networks.
My home lab is suspiciously close to a small datacentre, and my idea of “taking a break” is writing another tool in Go or C to automate something nobody else realised needed automating.
When I'm not explaining generative AI to humans (or machines), I'm usually playing guitar, breaking and unbreaking cluster networking, or discovering that yes, you can run far too many containers on a single node if you believe in yourself. I like building things that are fast, explainable, and reproducible, mostly so I can break them again on purpose.

Day to day, you'll find me working with:

- Kubernetes & OpenShift (I know where all the bodies are buried)
- AI/ML model serving (vLLM, Granite, LlamaStack, etc.)
- RAG pipelines, vector stores, embeddings, agentic workflows
- Hybrid-cloud networking, multi-cluster topologies, and GPU workloads
- GitOps everything (Helm, Kustomize, Argo CD, automation everywhere)
- Go (favourite), Python, C#, C (Rust is a WIP)
- Podman, OpenShift AI, Milvus, Kubeflow Pipelines
- Proxmox, pfSense, OVN, WireGuard
- Bash scripts that probably should be rewritten in something else… but still work

Things I build when left unsupervised:

- Jupyter-based RAG demos
- Custom model-server wrappers
- Operators that do “just one more small thing”
- Networking experiments that definitely shouldn’t work, but absolutely do
- GPU passthrough setups that make BIOS firmware tremble

Away from the keyboard:

- Electric and acoustic guitarist
- Music theory enjoyer
- I have strong opinions about modelling amps
- Owner of more HDMI cables than statistically reasonable

Currently exploring:

- Agentic AI architectures
- GPU scheduling and inference scaling
- Rust-based microservices for MLOps
- New ways to cause Kubernetes to behave creatively

Always open to collaborating on AI/ML infra, distributed systems, model serving, or interesting open-source projects.


