
The Builder’s
Lab.
Production systems built in the open. These are the tools, agents, and infrastructure experiments that run VMG Systems. We dogfood everything we sell.
EA Agent System
Qdrant · Gemini · Claude CLI
Autonomous morning brief and EOD wrap agents running on a self-hosted LXC. Each run recalls context from a Qdrant vector store (3,072-dim Gemini embeddings), generates a brief via Claude, posts to Slack, then extracts and stores new memories, creating a persistent, self-improving context loop.
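The recall → generate → store loop above can be sketched in a few lines. This is a toy stand-in, not the production code: a plain in-memory list with cosine similarity replaces Qdrant, 3-dim toy vectors replace the 3,072-dim Gemini embeddings, and a string template replaces the Claude call. The `MemoryStore` and `run_morning_brief` names are illustrative only.

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

class MemoryStore:
    """In-memory stand-in for the Qdrant collection."""
    def __init__(self):
        self.points = []  # (vector, payload) pairs

    def upsert(self, vector, payload):
        self.points.append((vector, payload))

    def search(self, query, limit=3):
        ranked = sorted(self.points, key=lambda p: cosine(query, p[0]), reverse=True)
        return [payload for _, payload in ranked[:limit]]

def run_morning_brief(store, query_vec):
    # 1. Recall prior context before generating.
    context = store.search(query_vec)
    # 2. Generate the brief (the real agent calls Claude here, then posts to Slack).
    brief = f"Morning brief drawing on {len(context)} memories: {context}"
    # 3. Extract and store a new memory, closing the self-improving loop.
    store.upsert(query_vec, f"brief covered: {context}")
    return brief
```

Each run both consumes and extends the store, which is what makes the context loop persistent across mornings.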
n8n Automation Layer
GCP · Linear · Slack · Firestore
Self-hosted n8n instance orchestrating three production workflows: GCP infrastructure health checks with Slack alerting, Linear→Slack sprint summaries, and a Firestore polling agent that announces completed sales-call recordings along with their full AI-extracted deal summaries.
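The polling workflow reduces to a simple idempotent pattern: each cycle, find recordings marked complete that haven't been notified yet, post the summary, and flag them so repeat polls stay quiet. A hedged sketch, with a plain dict standing in for Firestore and `poll_once`/`notify` as hypothetical names (in production this is an n8n workflow, not hand-rolled Python):

```python
def poll_once(recordings, notify):
    # recordings: {id: {"status": ..., "summary": ..., "notified": bool}}
    # notify: callable that posts a message (Slack, in the real workflow).
    notified = []
    for rec_id, rec in recordings.items():
        if rec.get("status") == "complete" and not rec.get("notified"):
            notify(f"Call {rec_id}: {rec.get('summary', 'no summary')}")
            rec["notified"] = True  # mark so the next poll skips it
            notified.append(rec_id)
    return notified
```

Marking documents as notified is what makes the poller safe to run on a tight schedule: a second pass over the same data sends nothing.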
Homelab AI OS
Proxmox · OPNsense · Tailscale
10-service self-hosted stack running across two Proxmox nodes with full VLAN isolation (OPNsense), zero-trust remote access (Tailscale), wildcard HTTPS via mkcert, and secrets management through a self-hosted Vaultwarden instance. Every service IaC-managed, zero ClickOps.
Langfuse Observability
Gemini · FastAPI · pgvector
Full LLM observability layer wrapping every Gemini call in Genubi's FastAPI pipeline. Traces inputs, outputs, latency, and token cost per request. Feeds a pgvector store for RAG workflows and enables “Golden Dataset” regression testing against eval accuracy baselines.
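The wrapping pattern looks roughly like this. A minimal sketch only: the real layer uses Langfuse's SDK and a Postgres backend, while here a decorator appends trace dicts to an in-memory list, and `traced`, `TRACES`, and the per-token price are all illustrative assumptions.

```python
import functools
import time

TRACES = []  # stand-in for the Langfuse trace store

def traced(price_per_token=0.000001):
    """Decorator that records input, output, latency, and token cost per call."""
    def deco(fn):
        @functools.wraps(fn)
        def wrapper(prompt, **kwargs):
            start = time.perf_counter()
            result = fn(prompt, **kwargs)  # the real pipeline calls Gemini here
            TRACES.append({
                "input": prompt,
                "output": result["text"],
                "latency_s": time.perf_counter() - start,
                "cost_usd": result["tokens"] * price_per_token,
            })
            return result
        return wrapper
    return deco
```

Because every call lands in one trace store, regression testing is just replaying a fixed prompt set and diffing outputs and costs against the recorded baseline.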
Everything here runs in production
We dogfood what we sell.
Every one of these systems is live today. If it works for VMG, it ships to clients.