feat(mcpd): Llm resource — CRUD + CLI + apply #52

Merged
michal merged 1 commit from feat/llm into main 2026-04-19 21:39:29 +00:00
Owner

Summary

Phase 1 of the Llm plan: introduce the `Llm` resource so operators can register the provider catalogue centrally. No inference proxy yet — Phase 2 adds that; this PR lands only the schema + CRUD + CLI so Phase 2 has a foundation to build on.

Based on `feat/secretbackend` (PR #51) — chained because Llm credentials are stored by reference to a `Secret`, which needs the new backend abstraction to resolve cleanly.

  • Prisma: new `Llm` model with `{apiKeySecretId, apiKeySecretKey}` FK pair to `Secret`. Reverse `llms` relation on Secret.
  • mcpd: repo + service + Zod schema + `/api/v1/llms` routes. The API exposes `apiKeyRef: {name, key}`; the service translates to/from the FK pair. `resolveApiKey()` reads through SecretService — that's the hook for Phase 2.
  • RBAC: added `llms` resource + alias.
  • CLI: `create|get|describe|delete llm` + `apply -f` with kind: llm. Shell completions regenerated.
  • Types: `anthropic | openai | deepseek | vllm | ollama | gemini-cli`. Tiers: `fast | heavy`.

Test plan

  • 11 LlmService unit tests (create with/without apiKeyRef, duplicate rejection, unlinking via update with a null apiKeyRef, resolveApiKey happy/missing paths, validation)
  • 9 route tests (GET/POST/PUT/DELETE + 404/409/400)
  • Full workspace suite: 1812/1812 passing (+20 new)
  • TypeScript clean across mcpd + cli
  • Completions freshness test passes
  • End-to-end: deploy, register an Llm, round-trip through `apply -f`, confirm apiKeyRef resolves

🤖 Generated with Claude Code

michal changed target branch from feat/secretbackend to main 2026-04-19 21:39:24 +00:00
michal added 1 commit 2026-04-19 21:39:24 +00:00
Why: every client that wants an LLM (the agent, HTTP-mode mcplocal, Claude
Code's STDIO mcplocal) today has to know the provider URL + key, and each
user's ~/.mcpctl/config.json carries them. Centralising the catalogue on the
server is the prerequisite for Phase 2 (mcpd proxies inference so credentials
never leave the cluster).

This phase adds the `Llm` resource and its CRUD surface — no proxy yet, no
client pivot yet. Just enough to register what you have.

Schema:
- New `Llm` model: name/type/model/url/tier/description + {apiKeySecretId,
  apiKeySecretKey} FK pair. Reverse `llms` relation on Secret.
- Provider types: anthropic | openai | deepseek | vllm | ollama | gemini-cli.
- Tiers: fast | heavy.
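
  As a rough sketch, the model described above might look like this in Prisma
  schema syntax — field types, optionality, and attributes are guesses from the
  description, not the actual migration:

  ```prisma
  // Sketch only — attribute details are assumptions, not the real schema.
  model Llm {
    id              String  @id @default(cuid())
    name            String  @unique
    type            String  // anthropic | openai | deepseek | vllm | ollama | gemini-cli
    model           String
    url             String?
    tier            String  // fast | heavy
    description     String?
    // FK pair into Secret: which secret row, and which key inside it.
    apiKeySecretId  String?
    apiKeySecretKey String?
    apiKeySecret    Secret? @relation(fields: [apiKeySecretId], references: [id])
  }
  ```

  The reverse side would be an `llms Llm[]` relation field on `Secret`.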

mcpd:
- LlmRepository + LlmService + Zod validation schema + /api/v1/llms routes.
- API surface exposes `apiKeyRef: {name, key}` — the service translates to/
  from the FK pair so clients never deal in cuids.
- `resolveApiKey(llmName)` reads through SecretService (which itself dispatches
  to the right SecretBackend). That's the hook Phase 2's inference proxy uses.
- RBAC: added `'llms'` to RBAC_RESOURCES + resource alias. Standard
  view/create/edit/delete semantics.
- Wired into main.ts (repo, service, routes).
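
The apiKeyRef ↔ FK translation and the `resolveApiKey()` read-through can be
sketched roughly as below. `SecretRow`, `SecretStore`, and the function names
other than `resolveApiKey` are illustrative stand-ins, not mcpd's real types:

```typescript
// Illustrative sketch of the translation layer described above.
// SecretRow/SecretStore are hypothetical, not mcpd's actual interfaces.

type SecretRow = { id: string; name: string; data: Record<string, string> };

interface SecretStore {
  byId(id: string): SecretRow | undefined;
  byName(name: string): SecretRow | undefined;
}

// What the API exposes: a name/key reference, never cuids.
type ApiKeyRef = { name: string; key: string };

// What the DB stores on the Llm row.
type LlmRow = {
  name: string;
  apiKeySecretId: string | null;
  apiKeySecretKey: string | null;
};

// Inbound: resolve the ref's secret name to its id for persistence.
function refToFk(ref: ApiKeyRef | null, secrets: SecretStore) {
  if (!ref) return { apiKeySecretId: null, apiKeySecretKey: null };
  const secret = secrets.byName(ref.name);
  if (!secret) throw new Error(`secret not found: ${ref.name}`);
  return { apiKeySecretId: secret.id, apiKeySecretKey: ref.key };
}

// Outbound: rebuild the ref from the FK pair so clients never see cuids.
function fkToRef(row: LlmRow, secrets: SecretStore): ApiKeyRef | null {
  if (!row.apiKeySecretId || !row.apiKeySecretKey) return null;
  const secret = secrets.byId(row.apiKeySecretId);
  return secret ? { name: secret.name, key: row.apiKeySecretKey } : null;
}

// Phase 2 hook: read the actual key material through the secret layer.
function resolveApiKey(row: LlmRow, secrets: SecretStore): string {
  if (!row.apiKeySecretId || !row.apiKeySecretKey) {
    throw new Error(`llm ${row.name} has no apiKeyRef`);
  }
  const value = secrets.byId(row.apiKeySecretId)?.data[row.apiKeySecretKey];
  if (value === undefined) {
    throw new Error(`key ${row.apiKeySecretKey} missing from secret`);
  }
  return value;
}
```

The point of the split is that a rename of the Secret only touches one row,
while clients keep a stable, human-readable `{name, key}` surface.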

CLI:
- `mcpctl create llm <name> --type X --model Y --tier fast|heavy --api-key-ref SECRET/KEY [--url ...] [--extra k=v ...]`
- `mcpctl get|describe|delete llm` — standard resource verbs.
- `mcpctl apply -f` with `kind: llm` (single- or multi-doc yaml/json).
  Applied after secrets, before servers — apiKeyRef resolves an existing Secret.
- Shell completions regenerated.
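
A manifest for `apply -f` might look something like the following — the exact
YAML key names are inferred from the CLI flags above and may differ from the
real schema:

```yaml
# Hypothetical kind: llm manifest — key names are assumptions.
kind: llm
name: claude-fast
type: anthropic
model: claude-sonnet
tier: fast
apiKeyRef:
  name: anthropic-creds   # an existing Secret, applied before this document
  key: apiKey
```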

Tests: 11 service unit tests + 9 route tests (happy path, 404s, 409, validation).
Full suite 1812/1812 (+20 from the 1792 Phase 0 baseline). TypeScript clean.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
michal merged commit 9e3507752f into main 2026-04-19 21:39:29 +00:00

Reference: michal/mcpctl#52