Conduit Fabric

AI infrastructure I built for myself - and use every day.

Multi-provider LLM orchestration, MCP tooling, and document intelligence on Cloudflare Workers.

One engineer. Fully operational.
[Screenshot: Conduit dashboard - widgets, tasks, morning briefing, system health]
The system

Five surfaces, one fabric, zero ops burden.

Everything routes through a single Cloudflare Worker backed by D1, KV, and R2. No containers. No uptime monitoring. No infrastructure to babysit.
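Concretely, that layout can be sketched as one fetch handler that routes by path to each storage binding. The binding names (DB, CONFIG, MEDIA) and routes below are illustrative assumptions, not Conduit's actual config:

```javascript
// Minimal sketch of the single-Worker layout: one fetch handler,
// three storage backends. Bindings and routes are illustrative.
function route(pathname) {
  if (pathname.startsWith("/tasks")) return "d1";   // relational: tasks, context, widgets
  if (pathname.startsWith("/config")) return "kv";  // OAuth, config, email state
  if (pathname.startsWith("/media")) return "r2";   // media, attachments
  return "worker";
}

const worker = {
  async fetch(request, env) {
    const { pathname } = new URL(request.url);
    switch (route(pathname)) {
      case "d1": {
        const { results } = await env.DB.prepare("SELECT id, title FROM tasks").all();
        return Response.json(results);
      }
      case "kv": {
        const value = await env.CONFIG.get(pathname.slice(1));
        return value === null
          ? new Response("not found", { status: 404 })
          : new Response(value);
      }
      case "r2": {
        const obj = await env.MEDIA.get(pathname.slice(1));
        return obj === null
          ? new Response("not found", { status: 404 })
          : new Response(obj.body);
      }
      default:
        return new Response("ok");
    }
  },
};
// in a real Worker this would be `export default worker`
```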

Clients: terminal coding (MCP) · web portals (SSE MCP) · Telegram (bot) · browser (ingest) · dashboard (read)
Fabric: conduit-fabric at api.condi.dev - Cloudflare Worker + Durable Object · 25 MCP tools · cron engine · LLM proxy · auth
Storage: D1 (tasks, context, widgets) · KV (OAuth, config, email) · R2 (media, attachments)

api.condi.dev - MCP server + REST API
dashboard.condi.dev - widget dashboard
tools.condi.dev - LLM arena

Live surfaces sit behind Cloudflare Access. Happy to walk through a demo.

Capabilities

What it actually does.

LLM orchestration

Route the same prompt to OpenAI, xAI, Gemini, and Claude with provider-specific adapters. Compare responses side by side, pick the best output, switch providers without touching application code. The arena at tools.condi.dev makes this interactive.

// fan out across all four providers
const results = await send_to_llm({
  prompt: "Review this PR for security issues",
  providers: ["anthropic", "openai", "gemini", "xai"],
  reasoning: true
});

// → parallel fetch, per-provider timeout
// → responses returned as they arrive
// → arena UI: compare side-by-side or consolidate
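A fan-out like this can be sketched with Promise.allSettled plus a per-provider timeout. Here callProvider is a hypothetical stand-in for the real provider adapters:

```javascript
// Sketch of parallel fan-out with a per-provider timeout.
// callProvider(provider, prompt) is an assumed adapter interface.
async function fanOut(prompt, providers, callProvider, timeoutMs = 30000) {
  const withTimeout = (provider) =>
    Promise.race([
      callProvider(provider, prompt),
      new Promise((_, reject) =>
        setTimeout(() => reject(new Error(`${provider} timed out`)), timeoutMs)
      ),
    ]);

  // allSettled: one slow or failing provider never blocks the others
  const settled = await Promise.allSettled(providers.map(withTimeout));
  return providers.map((provider, i) => ({
    provider,
    ok: settled[i].status === "fulfilled",
    output:
      settled[i].status === "fulfilled"
        ? settled[i].value
        : settled[i].reason.message,
  }));
}
```

Because results are keyed by provider and tagged ok/failed, the arena UI can render each response as it lands instead of waiting for the slowest model.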

MCP tool server

Twenty-five tools exposed via the Model Context Protocol - context ingestion, task tracking, project management, full-text search, cross-model workflows, email, dashboard publishing. Connected to Claude Code and Claude.ai as a live integration, not a demo.

// MCP tool surface
create_task({ title, scope, priority, project, tags })
ingest_context({ kind, content, tags, project })
search_context({ query, filters })
list_projects({ kind, status })
link_items({ from, to, relation })
send_to_llm({ prompt, providers, reasoning })
send_email({ to, subject, markdown })
dashboard_publish({ slot, category, content })
triage_inbox({ limit })

// plus: get_tasks, update_task, get_context, ...

Context capture & retrieval

Content goes in messy - meeting notes, research dumps, raw captures. Conduit tags and distills it into LLM-ready context, chunks it for RAG retrieval, and stores relations as a first-class graph so references don't get lost. Retrieval is FTS5 full-text search across everything, not keyword matching against filenames.

// messy input
await ingest_context({
  title: "standup notes 4/18",
  content: raw_dump,
  tags: ["work", "defense-ux"]
});

// retrieval pipeline
// → LLM-auto-tagged and distilled
// → chunked for RAG retrieval
// → FTS5 full-text search (D1)
// → graph: spawned / blocks / references
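The chunking step in that pipeline can be approximated with a naive fixed-size, overlapping splitter. Sizes and overlap below are illustrative; Conduit's actual chunker may differ:

```javascript
// Naive RAG chunker: overlapping windows, broken at word boundaries.
// maxChars and overlap values are illustrative assumptions.
function chunk(text, maxChars = 800, overlap = 100) {
  const chunks = [];
  let start = 0;
  while (start < text.length) {
    let end = Math.min(start + maxChars, text.length);
    // prefer to break at whitespace so words stay intact
    if (end < text.length) {
      const lastSpace = text.lastIndexOf(" ", end);
      if (lastSpace > start) end = lastSpace;
    }
    chunks.push(text.slice(start, end).trim());
    if (end === text.length) break;
    start = Math.max(end - overlap, start + 1); // overlap keeps context across boundaries
  }
  return chunks;
}
```

The overlap means a sentence split across a boundary still appears whole in at least one chunk, which keeps retrieval from missing matches at the seams.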

Dashboard & automation

A slot-based, cron-driven information dashboard. Morning news briefings, task summaries, evening recaps - all generated by scheduled AI jobs and published to named widget slots. ETag-cached polling keeps it snappy. Time-travel archive lets you scroll back through previous versions.
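The ETag-cached polling boils down to: remember the last ETag per slot, send If-None-Match, and treat a 304 as "reuse what you have". A minimal sketch, with fetchFn standing in for fetch and slot paths assumed for illustration:

```javascript
// Sketch of ETag-cached widget polling. fetchFn is injected so the
// transport can be mocked; the /widgets/<slot> path is an assumption.
function makePoller(fetchFn) {
  const cache = new Map(); // slot -> { etag, body }

  return async function poll(slot) {
    const cached = cache.get(slot);
    const headers = cached ? { "If-None-Match": cached.etag } : {};
    const res = await fetchFn(`/widgets/${slot}`, { headers });

    if (res.status === 304) return cached.body; // unchanged: skip the payload
    const body = await res.text();
    cache.set(slot, { etag: res.headers.get("ETag"), body });
    return body;
  };
}
```

Most polls land on the 304 path, so the dashboard stays current without re-downloading widget payloads that haven't changed.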

[Screenshot: Morning briefing - cron-generated news digest]
[Screenshot: Dashboard - tasks and widgets panel]
Proof of life

This isn't a side project.

Conduit Fabric handles task management, context capture, document intelligence, and multi-provider LLM routing across every working session. The dashboard runs cron jobs at 6am, noon, and 9pm - morning briefing, task triage, evening recap.
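In a Worker, that schedule maps to a scheduled handler dispatching on the trigger's cron expression. The expressions and job names below are illustrative (and Workers crons fire in UTC, so a real schedule would be offset for local time):

```javascript
// Sketch of cron dispatch: trigger expression -> job name.
// Expressions, job names, and runJob are illustrative assumptions.
const JOBS = {
  "0 6 * * *": "morning-briefing",
  "0 12 * * *": "task-triage",
  "0 21 * * *": "evening-recap",
};

function jobFor(cron) {
  return JOBS[cron] ?? null;
}

async function runJob(name, env) {
  // hypothetical stand-in for the real AI job runner
  console.log(`running ${name}`);
}

const worker = {
  async scheduled(event, env, ctx) {
    const job = jobFor(event.cron);
    if (job) ctx.waitUntil(runJob(job, env));
  },
};
```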

The MCP server is connected to Claude Code and Claude.ai right now. Work flows through Conduit.

25 MCP tools · 6 scheduled cron jobs · 4 LLM providers · 1 Worker. Zero ops.
About

Steve Biggs

20+ years building defense software, simulation systems, and training platforms. Currently exploring what happens when one engineer treats AI tooling as infrastructure instead of a novelty.

Before Conduit, my AI setup was a pile of disjointed MCP servers - hard to update, capabilities scattered across terminals, apps, and browsers, nothing speaking to anything else. Conduit started as the cleanup. It turned into the system I work through every day: one place to capture context, track tasks, route prompts across providers, and publish back to a dashboard I actually read.

Huntsville, AL · Built on Cloudflare
Get in touch

Curious about the stack?

Want to talk AI-augmented workflows, MCP tooling, or Cloudflare architecture?

conduit@condi.dev