Compare commits


52 Commits

Author SHA1 Message Date
Michal
9481d394a1 feat: completions update, create promptrequest, LLM flag rename, ACP content fix
Some checks failed
CI / lint (pull_request) Has been cancelled
CI / typecheck (pull_request) Has been cancelled
CI / test (pull_request) Has been cancelled
CI / build (pull_request) Has been cancelled
CI / package (pull_request) Has been cancelled
- Add prompts/promptrequests to shell completions (fish + bash)
- Add approve, setup, prompt, promptrequest commands to completions
- Add `create promptrequest` CLI command (POST /projects/:name/promptrequests)
- Rename --proxy-mode-llm-provider/model to --llm-provider/model
- Fix ACP client: handle single-object content format from real Gemini
- Add tests for single-object content and agent_thought_chunk filtering
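The single-object content fix can be sketched as a small normalizer. This is a hedged illustration only: the `ContentBlock` shape, the function name, and where the `agent_thought_chunk` filtering happens are assumptions inferred from the bullets above, not the actual client code.

```typescript
// Hypothetical ACP content shape: per spec the content field is an array,
// but real Gemini may send a single object instead.
type ContentBlock = { type: string; text?: string };

// Accept either form, always return an array, and drop
// agent_thought_chunk entries that should not reach the caller.
function normalizeContent(content: ContentBlock | ContentBlock[]): ContentBlock[] {
  const blocks = Array.isArray(content) ? content : [content];
  return blocks.filter((b) => b.type !== "agent_thought_chunk");
}
```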

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-25 00:21:31 +00:00
Michal
bc769c4eeb fix: LLM health check via mcplocal instead of spawning gemini directly
Status command now queries mcplocal's /llm/health endpoint instead of
spawning the gemini binary. This uses the persistent ACP connection
(fast) and works for any configured provider, not just gemini-cli.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-25 00:03:25 +00:00
6f534c8ba9 Merge pull request 'feat: persistent Gemini ACP provider + status spinner' (#40) from feat/gemini-acp-provider into main
2026-02-24 23:52:31 +00:00
Michal
11da8b1fbf feat: persistent Gemini ACP provider + status spinner
Replace per-call gemini CLI spawning (~10s cold start each time) with
persistent ACP (Agent Client Protocol) subprocess. First call absorbs
the cold start, subsequent calls are near-instant over JSON-RPC stdio.

- Add AcpClient: manages persistent gemini --experimental-acp subprocess
  with lazy init, auto-restart on crash/timeout, NDJSON framing
- Add GeminiAcpProvider: LlmProvider wrapper with serial queue for
  concurrent calls, same interface as GeminiCliProvider
- Add dispose() to LlmProvider interface + disposeAll() to registry
- Wire provider disposal into mcplocal shutdown handler
- Add status command spinner with progressive output and color-coded
  LLM health check results (green checkmark/red cross)
- 25 new tests (17 ACP client + 8 provider)
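The NDJSON framing over JSON-RPC stdio can be sketched as a line-buffering parser that survives messages split across stdout chunks. `createNdjsonFramer` is a hypothetical helper name; the real AcpClient's subprocess wiring, lazy init, and restart logic are not shown.

```typescript
// Minimal NDJSON framer: feed raw stdout chunks in, get parsed JSON-RPC
// messages out. A partial line is buffered until its newline arrives.
function createNdjsonFramer(onMessage: (msg: unknown) => void) {
  let buffer = "";
  return (chunk: string): void => {
    buffer += chunk;
    let idx: number;
    while ((idx = buffer.indexOf("\n")) !== -1) {
      const line = buffer.slice(0, idx).trim();
      buffer = buffer.slice(idx + 1);
      if (line) onMessage(JSON.parse(line));
    }
  };
}
```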

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-24 23:52:04 +00:00
Michal
848868d45f feat: auto-detect gemini binary path, LLM health check in status
- Setup wizard auto-detects gemini binary via `which`, saves full path
  so systemd service can find it without user PATH
- `mcpctl status` tests LLM provider health (gemini: quick prompt test,
  ollama: health check, API providers: key stored confirmation)
- Shows error details inline: "gemini-cli / gemini-2.5-flash (not authenticated)"

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-24 23:24:31 +00:00
Michal
869217a07a fix: exactOptionalPropertyTypes and ResponsePaginator type errors
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-24 23:15:15 +00:00
04d115933b Merge pull request 'feat: LLM provider configuration, secret store, and setup wizard' (#39) from feat/llm-config-and-secrets into main
2026-02-24 22:48:39 +00:00
Michal
7c23da10c6 feat: LLM provider configuration, secret store, and setup wizard
Add secure credential storage (GNOME Keyring + file fallback),
LLM provider config in ~/.mcpctl/config.json, interactive setup
wizard (mcpctl config setup), and wire configured provider into
mcplocal for smart pagination summaries.

- Secret store: SecretStore interface, GnomeKeyringStore, FileSecretStore
- Config schema: LlmConfigSchema with provider/model/url/binaryPath
- Setup wizard: arrow-key provider/model selection, dynamic model fetch
- Provider factory: creates ProviderRegistry from config + secrets
- Status: shows LLM line with hint when not configured
- 572 tests passing across all packages

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-24 22:48:17 +00:00
32b4de4343 Merge pull request 'feat: smart response pagination for large MCP tool results' (#38) from feat/response-pagination into main
2026-02-24 21:40:53 +00:00
Michal
e06db9afba feat: smart response pagination for large MCP tool results
Intercepts oversized tool responses (>80K chars), caches them, and returns
a page index. LLM can fetch specific pages via _resultId/_page params.
Supports LLM-generated smart summaries with simple fallback.
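The intercept-and-page flow described above can be sketched in a few lines. Everything here is illustrative: the 80K threshold matches the message, but the function names, return shape, and in-memory `Map` cache are assumptions, not the shipped implementation.

```typescript
const LIMIT = 80_000; // character threshold above which responses are paged
const cache = new Map<string, string[]>();
let nextId = 0;

// Oversized results are split into pages and cached; the caller gets a
// small index it can follow up on via _resultId/_page parameters.
function interceptResult(
  text: string,
): { text: string } | { _resultId: string; pages: number; preview: string } {
  if (text.length <= LIMIT) return { text };
  const pages: string[] = [];
  for (let i = 0; i < text.length; i += LIMIT) pages.push(text.slice(i, i + LIMIT));
  const id = `r${nextId++}`;
  cache.set(id, pages);
  return { _resultId: id, pages: pages.length, preview: text.slice(0, 200) };
}

function fetchPage(resultId: string, page: number): string | undefined {
  return cache.get(resultId)?.[page];
}
```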

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-24 21:40:33 +00:00
Michal
a25809b84a fix: auto-read user credentials for mcpd auth
mcplocal now reads ~/.mcpctl/credentials automatically when
MCPLOCAL_MCPD_TOKEN env var is not set, matching CLI behavior.
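The fallback order can be sketched as an env-first token resolver. The credentials path comes from the message above, but the file format (a bare token) and the helper name are assumptions for illustration.

```typescript
import { readFileSync } from "node:fs";
import { join } from "node:path";
import { homedir } from "node:os";

// Prefer the env var; otherwise read ~/.mcpctl/credentials, returning
// undefined when neither source is available. The reader is injectable
// so the fallback path can be exercised without touching the filesystem.
function resolveToken(
  env: Record<string, string | undefined>,
  readFile: (p: string) => string = (p) => readFileSync(p, "utf8"),
): string | undefined {
  if (env.MCPLOCAL_MCPD_TOKEN) return env.MCPLOCAL_MCPD_TOKEN;
  try {
    return readFile(join(homedir(), ".mcpctl", "credentials")).trim();
  } catch {
    return undefined; // no env var and no credentials file
  }
}
```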

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-24 19:14:56 +00:00
f5a902d3e0 Merge pull request 'fix: STDIO transport stdout flush and MCP notification handling' (#37) from fix/stdio-flush-and-notifications into main
2026-02-24 19:10:03 +00:00
Michal
9cb0c5ce24 fix: STDIO transport stdout flush and MCP notification handling
- Wait for stdout.write callback before process.exit in STDIO transport
  to prevent truncation of large responses (e.g. grafana tools/list)
- Handle MCP notification methods (notifications/initialized, etc.) in
  router instead of returning "Method not found" error
- Use -p shorthand in config claude output
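The flush fix rests on waiting for the `stdout.write` callback before exiting. A minimal sketch with injected write/exit functions (hypothetical names, used so the ordering is observable) shows the pattern:

```typescript
// Exit only after the write callback confirms the payload left the
// process buffer; exiting immediately can truncate large responses.
function writeThenExit(
  payload: string,
  write: (data: string, cb: () => void) => void,
  exit: (code: number) => void,
): void {
  write(payload, () => exit(0));
}
```

In the real transport the arguments would be `process.stdout.write` and `process.exit`.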

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-24 19:09:47 +00:00
06230ec034 Merge pull request 'feat: prompt resources, proxy transport fix, enriched descriptions' (#36) from feat/prompt-resources-and-proxy-transport into main
2026-02-24 14:53:24 +00:00
Michal
079c7b3dfa feat: add prompt resources, fix MCP proxy transport, enrich tool descriptions
- Fix MCP proxy to support SSE and STDIO transports (not just HTTP POST)
- Enrich tool descriptions with server context for LLM clarity
- Add Prompt and PromptRequest resources with two-resource RBAC model
- Add propose_prompt MCP tool for LLM to create pending prompt requests
- Add prompt resources visible in MCP resources/list (approved + session's pending)
- Add project-level prompt/instructions in MCP initialize response
- Add ServiceAccount subject type for RBAC (SA identity from X-Service-Account header)
- Add CLI commands: create prompt, get prompts/promptrequests, approve promptrequest
- Add prompts to apply config schema
- 956 tests passing across all packages

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-24 14:53:00 +00:00
Michal
7829f4fb92 fix: handle SSE responses in MCP bridge and add Commander-level tests
The bridge now parses SSE text/event-stream responses (extracting data:
lines) in addition to plain JSON. Also sends correct Accept header
per MCP streamable HTTP spec. Added tests for SSE handling and
command option parsing (-p/--project).
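Extracting `data:` lines from an SSE body can be sketched as below; `parseSseData` is a hypothetical name, and multi-line `data:` events are not handled in this simplified version.

```typescript
// Pull the JSON payloads out of a text/event-stream body: each event's
// "data:" line carries a JSON-RPC message.
function parseSseData(body: string): unknown[] {
  return body
    .split("\n")
    .filter((line) => line.startsWith("data:"))
    .map((line) => JSON.parse(line.slice("data:".length).trim()));
}
```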

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-24 10:17:45 +00:00
Michal
fa6240107f fix: mcp command accepts --project directly for Claude spawned processes
The mcp subcommand now has its own -p/--project option with
passThroughOptions(), so `mcpctl mcp --project NAME` works when Claude
spawns the process. Updated config claude to generate
args: ['mcp', '--project', project] and added Commander-level tests.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-24 10:14:16 +00:00
b34ea63d3d Merge pull request 'feat: add mcpctl mcp STDIO bridge, rework config claude' (#35) from feat/mcp-stdio-bridge into main
2026-02-24 00:52:21 +00:00
Michal
e17a2282e8 feat: add mcpctl mcp STDIO bridge, rework config claude
- New `mcpctl mcp -p PROJECT` command: STDIO-to-StreamableHTTP bridge
  that reads JSON-RPC from stdin and forwards to mcplocal project endpoint
- Rework `config claude` to write mcpctl mcp entry instead of fetching
  server configs from API (no secrets in .mcp.json)
- Keep `config claude-generate` as backward-compat alias
- Fix discovery.ts auth token not being forwarded to mcpd (RBAC bypass)
- Update fish/bash completions for new commands
- 10 new MCP bridge tests, updated claude tests, fixed project-discovery test
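An entry written by the reworked `config claude` might look like the fragment below. The `args` array matches the commits above; the surrounding `mcpServers` key layout and the project name are assumptions for illustration.

```json
{
  "mcpServers": {
    "myproject": {
      "command": "mcpctl",
      "args": ["mcp", "--project", "myproject"]
    }
  }
}
```

Because the bridge authenticates via the local credentials file, no secrets need to appear in `.mcp.json`.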

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-24 00:52:05 +00:00
01d3c4e02d Merge pull request 'fix: don't send Content-Type on bodyless DELETE, include full server data in project queries' (#34) from fix/delete-content-type-and-project-servers into main
2026-02-23 19:55:35 +00:00
Michal
e4affe5962 fix: don't send Content-Type on bodyless DELETE, include full server data in project queries
- Only set Content-Type: application/json when request body is present (fixes
  Fastify rejecting empty DELETE with "Body cannot be empty" 400 error)
- Changed PROJECT_INCLUDE to return full server objects instead of just {id, name}
  so project server listings show transport, package, image columns
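The conditional-header fix can be sketched as a small request-options builder; `requestInit` is a hypothetical helper name, not the client's actual code.

```typescript
// Only set Content-Type when there is a body to describe, so a bodyless
// DELETE is not rejected by the server with "Body cannot be empty".
function requestInit(
  method: string,
  body?: unknown,
): { method: string; headers: Record<string, string>; body?: string } {
  const headers: Record<string, string> = {};
  const init: { method: string; headers: Record<string, string>; body?: string } = {
    method,
    headers,
  };
  if (body !== undefined) {
    headers["Content-Type"] = "application/json";
    init.body = JSON.stringify(body);
  }
  return init;
}
```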

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 19:54:34 +00:00
c75e7cdf4d Merge pull request 'fix: prevent attach/detach-server from repeating server arg on tab' (#33) from fix/completion-no-repeat-server-arg into main
2026-02-23 19:36:53 +00:00
Michal
65c340a03c fix: prevent attach/detach-server from repeating server arg on tab
Added __mcpctl_needs_server_arg guard in fish and position check in
bash so completions stop after one server name is selected.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 19:36:45 +00:00
677d34b868 Merge pull request 'fix: instance completions use server.name, smart attach/detach' (#32) from fix/completion-instances-attach-detach into main
2026-02-23 19:32:34 +00:00
Michal
c5b8cb60b7 fix: instance completions use server.name, smart attach/detach
- Instances have no name field — use server.name for completions
- attach-server: show only servers NOT in the project
- detach-server: show only servers IN the project
- Add helper functions for project-aware server completion
- 5 new tests covering all three fixes

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 19:32:18 +00:00
9a5deffb8f Merge pull request 'fix: use .[][].name in jq for wrapped JSON response' (#31) from fix/completion-jq-wrapped-json into main
2026-02-23 19:27:02 +00:00
Michal
ec7ada5383 fix: use .[][].name in jq for wrapped JSON response
API returns { "resources": [...] } not bare arrays, so .[].name
produced no output. Use .[][].name to unwrap the outer object first.
Also auto-load .env in pr.sh.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 19:26:47 +00:00
b81d3be2d5 Merge pull request 'fix: use jq for completion name extraction to avoid nested matches' (#30) from fix/completion-nested-names into main
2026-02-23 19:23:48 +00:00
Michal
e2c54bfc5c fix: use jq for completion name extraction to avoid nested matches
The regex "name":\s*"..." on JSON matched nested server names inside
project objects, mixing resource types in completions. Switch to
jq -r '.[].name' for proper top-level extraction. Add jq as RPM
dependency. Add pr.sh for PR creation via Gitea API.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 19:23:21 +00:00
7b7854b007 Merge pull request 'feat: erase stale fish completions and add completion tests' (#29) from feat/completions-stale-erase-and-tests into main
2026-02-23 19:17:00 +00:00
Michal
f23dd99662 feat: erase stale fish completions and add completion tests
Fish completions are additive — sourcing a new file doesn't remove old
rules. Add `complete -c mcpctl -e` at the top to clear stale entries.
Also add 12 structural tests to prevent completion regressions.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 19:16:36 +00:00
43af85cb58 Merge pull request 'feat: context-aware completions with dynamic resource names' (#28) from feat/completions-project-scope-dynamic into main
2026-02-23 19:08:45 +00:00
Michal
6d2e3c2eb3 feat: context-aware completions with dynamic resource names
- Hide attach-server/detach-server from --help (only relevant with --project)
- --project shows only project-scoped commands in tab completion
- Tab after resource type fetches live resource names from API
- --project value auto-completes from existing project names
- Stop offering resource types after one is already selected

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 19:08:29 +00:00
ce21db3853 Merge pull request 'feat: --project scopes get servers/instances' (#27) from feat/project-scoped-get into main
2026-02-23 19:03:23 +00:00
Michal
767725023e feat: --project flag scopes get servers/instances to project
mcpctl --project NAME get servers — shows only servers attached to the project
mcpctl --project NAME get instances — shows only instances of project servers

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 19:03:07 +00:00
2bd1b55fe8 Merge pull request 'feat: add tests.sh runner and project routes tests' (#26) from feat/tests-sh-and-project-routes-tests into main
2026-02-23 18:58:06 +00:00
Michal
0f2a93f2f0 feat: add tests.sh runner and project routes integration tests
- tests.sh: run all tests with `bash tests.sh`, summary with `--short`
- tests.sh --filter mcpd/cli: run specific package
- project-routes.test.ts: 17 new route-level tests covering CRUD,
  attach/detach, and the ownerId filtering bug fix

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 18:57:46 +00:00
ce81d9d616 Merge pull request 'fix: project list uses RBAC filtering instead of ownerId' (#25) from fix/project-list-rbac into main
2026-02-23 18:52:29 +00:00
Michal
c6cc39c6f7 fix: project list should use RBAC filtering, not ownerId
The list endpoint was filtering by ownerId before RBAC could include
projects the user has view access to via name-scoped bindings.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 18:52:13 +00:00
de074d9a90 Merge pull request 'feat: remove ProjectMember, add expose RBAC role, attach/detach-server' (#24) from feat/project-improvements into main
2026-02-23 17:50:24 +00:00
Michal
783cf15179 feat: remove ProjectMember, add expose RBAC role, attach/detach-server commands
- Remove ProjectMember model entirely (RBAC manages project access)
- Add 'expose' RBAC role for /mcp-config endpoint access (edit implies expose)
- Rename CLI flags: --llm-provider → --proxy-mode-llm-provider, --llm-model → --proxy-mode-llm-model
- Add attach-server / detach-server CLI commands (mcpctl --project NAME attach-server SERVER)
- Add POST/DELETE /api/v1/projects/:id/servers endpoints for server attach/detach
- Remove members from backup/restore, apply, get, describe
- Prisma migration to drop ProjectMember table

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 17:50:01 +00:00
5844d6c73f Merge pull request 'fix: RBAC name-scoped access — CUID resolution + list filtering' (#23) from fix/rbac-name-scoped-access into main
2026-02-23 12:27:48 +00:00
Michal
604bd76d60 fix: RBAC name-scoped access — CUID resolution + list filtering
Two bugs fixed:
- GET /api/v1/servers/:cuid now resolves CUID→name before RBAC check,
  so name-scoped bindings match correctly
- List endpoints now filter responses via preSerialization hook using
  getAllowedScope(), so name-scoped users only see their resources

Also adds fulldeploy.sh orchestrator script.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 12:26:37 +00:00
da14bb8c23 Merge pull request 'fix: update shell completions for current CLI commands' (#22) from fix/update-shell-completions into main
2026-02-23 12:00:50 +00:00
Michal
9e9a2f4a54 fix: update shell completions for current CLI commands
Add users, groups, rbac, secrets, templates to resource completions.
Remove stale profiles references. Add login, logout, create, edit,
delete, logs commands. Update config subcommands.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 12:00:31 +00:00
c8cdd7f514 Merge pull request 'fix: migrate legacy admin role at startup' (#21) from fix/migrate-legacy-admin-role into main
2026-02-23 11:31:31 +00:00
Michal
ec1dfe7438 fix: migrate legacy admin role to granular roles at startup
- Add migrateAdminRole() that runs on mcpd boot
- Converts { role: 'admin', resource: X } → edit + run bindings
- Adds operation bindings for wildcard admin (impersonate, logs, etc.)
- Add tests verifying unknown/legacy roles are denied by canAccess

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 11:31:15 +00:00
50b4112398 Merge pull request 'fix: resolve tsc --build type errors' (#20) from fix/build-type-errors into main
2026-02-23 11:08:08 +00:00
Michal
bb17a892d6 fix: resolve tsc --build type errors (exactOptionalPropertyTypes)
- Fix resourceName assignment in mapUrlToPermission for strictness
- Use RbacRoleBinding type in restore-service instead of loose cast
- Remove stale ProjectMemberInput export from validation index

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 11:07:46 +00:00
a8117091a1 Merge pull request 'feat: granular RBAC with resource/operation bindings, users, groups' (#19) from feat/projects-rbac-users-groups into main
2026-02-23 11:05:51 +00:00
Michal
dcda93d179 feat: granular RBAC with resource/operation bindings, users, groups
- Replace admin role with granular roles: view, create, delete, edit, run
- Two binding types: resource bindings (role+resource+optional name) and
  operation bindings (role:run + action like backup, logs, impersonate)
- Name-scoped resource bindings for per-instance access control
- Remove role from project members (all permissions via RBAC)
- Add users, groups, RBAC CRUD endpoints and CLI commands
- describe user/group shows all RBAC access (direct + inherited)
- create rbac supports --subject, --binding, --operation flags
- Backup/restore handles users, groups, RBAC definitions
- mcplocal project-based MCP endpoint discovery
- Full test coverage for all new functionality

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 11:05:19 +00:00
a6b5e24a8d Merge pull request 'fix: add missing passwordHash to DB test user factory' (#18) from fix/db-tests-passwordhash into main
2026-02-23 01:03:11 +00:00
127 changed files with 15205 additions and 753 deletions


@@ -2,92 +2,176 @@ _mcpctl() {
local cur prev words cword local cur prev words cword
_init_completion || return _init_completion || return
local commands="config status get describe instance instances apply setup claude project projects backup restore help" local commands="status login logout config get describe delete logs create edit apply backup restore mcp approve help"
local global_opts="-v --version -o --output --daemon-url -h --help" local project_commands="attach-server detach-server get describe delete logs create edit help"
local resources="servers profiles projects instances" local global_opts="-v --version --daemon-url --direct --project -h --help"
local resources="servers instances secrets templates projects users groups rbac prompts promptrequests"
case "${words[1]}" in # Check if --project was given
local has_project=false
local i
for ((i=1; i < cword; i++)); do
if [[ "${words[i]}" == "--project" ]]; then
has_project=true
break
fi
done
# Find the first subcommand (skip --project and its argument, skip flags)
local subcmd=""
local subcmd_pos=0
for ((i=1; i < cword; i++)); do
if [[ "${words[i]}" == "--project" || "${words[i]}" == "--daemon-url" ]]; then
((i++)) # skip the argument
continue
fi
if [[ "${words[i]}" != -* ]]; then
subcmd="${words[i]}"
subcmd_pos=$i
break
fi
done
# Find the resource type after get/describe/delete/edit
local resource_type=""
if [[ -n "$subcmd_pos" ]] && [[ $subcmd_pos -gt 0 ]]; then
for ((i=subcmd_pos+1; i < cword; i++)); do
if [[ "${words[i]}" != -* ]] && [[ " $resources " == *" ${words[i]} "* ]]; then
resource_type="${words[i]}"
break
fi
done
fi
# If completing the --project value
if [[ "$prev" == "--project" ]]; then
local names
names=$(mcpctl get projects -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null)
COMPREPLY=($(compgen -W "$names" -- "$cur"))
return
fi
# Fetch resource names dynamically (jq extracts only top-level names)
_mcpctl_resource_names() {
local rt="$1"
if [[ -n "$rt" ]]; then
# Instances don't have a name field — use server.name instead
if [[ "$rt" == "instances" ]]; then
mcpctl get instances -o json 2>/dev/null | jq -r '.[][].server.name' 2>/dev/null
else
mcpctl get "$rt" -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null
fi
fi
}
# Get the --project value from the command line
_mcpctl_get_project_value() {
local i
for ((i=1; i < cword; i++)); do
if [[ "${words[i]}" == "--project" ]] && (( i+1 < cword )); then
echo "${words[i+1]}"
return
fi
done
}
case "$subcmd" in
config)
if [[ $((cword - subcmd_pos)) -eq 1 ]]; then
COMPREPLY=($(compgen -W "view set path reset claude claude-generate setup impersonate help" -- "$cur"))
fi
return ;;
status)
COMPREPLY=($(compgen -W "-h --help" -- "$cur"))
return ;;
login)
COMPREPLY=($(compgen -W "--url --email --password -h --help" -- "$cur"))
return ;;
logout)
return ;;
mcp)
return ;;
get|describe|delete)
if [[ -z "$resource_type" ]]; then
COMPREPLY=($(compgen -W "$resources" -- "$cur"))
else
local names
names=$(_mcpctl_resource_names "$resource_type")
COMPREPLY=($(compgen -W "$names -o --output -h --help" -- "$cur"))
fi
return ;;
edit)
if [[ -z "$resource_type" ]]; then
COMPREPLY=($(compgen -W "servers projects" -- "$cur"))
else
local names
names=$(_mcpctl_resource_names "$resource_type")
COMPREPLY=($(compgen -W "$names -h --help" -- "$cur"))
fi
return ;;
logs)
COMPREPLY=($(compgen -W "--tail --since -f --follow -h --help" -- "$cur"))
return ;;
create)
if [[ $((cword - subcmd_pos)) -eq 1 ]]; then
COMPREPLY=($(compgen -W "server secret project user group rbac prompt promptrequest help" -- "$cur"))
fi
return ;;
apply)
COMPREPLY=($(compgen -f -- "$cur"))
return ;;
backup)
COMPREPLY=($(compgen -W "-o --output -p --password -h --help" -- "$cur"))
return ;;
restore)
COMPREPLY=($(compgen -W "-i --input -p --password -c --conflict -h --help" -- "$cur"))
return ;;
attach-server)
# Only complete if no server arg given yet (first arg after subcmd)
if [[ $((cword - subcmd_pos)) -ne 1 ]]; then return; fi
local proj names all_servers proj_servers
proj=$(_mcpctl_get_project_value)
if [[ -n "$proj" ]]; then
all_servers=$(mcpctl get servers -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null)
proj_servers=$(mcpctl --project "$proj" get servers -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null)
names=$(comm -23 <(echo "$all_servers" | sort) <(echo "$proj_servers" | sort))
else
names=$(_mcpctl_resource_names "servers")
fi
COMPREPLY=($(compgen -W "$names" -- "$cur"))
return ;;
detach-server)
# Only complete if no server arg given yet (first arg after subcmd)
if [[ $((cword - subcmd_pos)) -ne 1 ]]; then return; fi
local proj names
proj=$(_mcpctl_get_project_value)
if [[ -n "$proj" ]]; then
names=$(mcpctl --project "$proj" get servers -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null)
fi
COMPREPLY=($(compgen -W "$names" -- "$cur"))
return ;;
approve)
if [[ -z "$resource_type" ]]; then
COMPREPLY=($(compgen -W "promptrequest" -- "$cur"))
else
local names
names=$(_mcpctl_resource_names "$resource_type")
COMPREPLY=($(compgen -W "$names" -- "$cur"))
fi
return ;;
help)
COMPREPLY=($(compgen -W "$commands" -- "$cur"))
return ;;
esac
# No subcommand yet — offer commands based on context
if [[ -z "$subcmd" ]]; then
if $has_project; then
COMPREPLY=($(compgen -W "$project_commands $global_opts" -- "$cur"))
else
COMPREPLY=($(compgen -W "$commands $global_opts" -- "$cur"))
fi
fi
}
complete -F _mcpctl mcpctl


@@ -1,80 +1,241 @@
# mcpctl fish completions
# Erase any stale completions from previous versions
complete -c mcpctl -e
set -l commands status login logout config get describe delete logs create edit apply backup restore mcp approve help
set -l project_commands attach-server detach-server get describe delete logs create edit help
# Disable file completions by default
complete -c mcpctl -f
# Global options
complete -c mcpctl -s v -l version -d 'Show version'
complete -c mcpctl -l daemon-url -d 'mcplocal daemon URL' -x
complete -c mcpctl -l direct -d 'Bypass mcplocal, connect directly to mcpd'
complete -c mcpctl -l project -d 'Target project context' -x
complete -c mcpctl -s h -l help -d 'Show help'
# Helper: check if --project was given
function __mcpctl_has_project
set -l tokens (commandline -opc)
for i in (seq (count $tokens))
if test "$tokens[$i]" = "--project"
return 0
end
end
return 1
end
# Helper: check if a resource type has been selected after get/describe/delete/edit
set -l resources servers instances secrets templates projects users groups rbac prompts promptrequests
function __mcpctl_needs_resource_type
set -l tokens (commandline -opc)
set -l found_cmd false
for tok in $tokens
if $found_cmd
# Check if next token after get/describe/delete/edit is a resource type
if contains -- $tok servers instances secrets templates projects users groups rbac prompts promptrequests
return 1 # resource type already present
end
end
if contains -- $tok get describe delete edit approve
set found_cmd true
end
end
if $found_cmd
return 0 # command found but no resource type yet
end
return 1
end
function __mcpctl_get_resource_type
set -l tokens (commandline -opc)
set -l found_cmd false
for tok in $tokens
if $found_cmd
if contains -- $tok servers instances secrets templates projects users groups rbac prompts promptrequests
echo $tok
return
end
end
if contains -- $tok get describe delete edit approve
set found_cmd true
end
end
end
# Fetch resource names dynamically from the API (jq extracts only top-level names)
function __mcpctl_resource_names
set -l resource (__mcpctl_get_resource_type)
if test -z "$resource"
return
end
# Instances don't have a name field — use server.name instead
if test "$resource" = "instances"
mcpctl get instances -o json 2>/dev/null | jq -r '.[][].server.name' 2>/dev/null
else
mcpctl get $resource -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null
end
end
# Fetch project names for --project value
function __mcpctl_project_names
mcpctl get projects -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null
end
# Helper: get the --project value from the command line
function __mcpctl_get_project_value
set -l tokens (commandline -opc)
for i in (seq (count $tokens))
if test "$tokens[$i]" = "--project"; and test $i -lt (count $tokens)
echo $tokens[(math $i + 1)]
return
end
end
end
# Servers currently attached to the project (for detach-server)
function __mcpctl_project_servers
set -l proj (__mcpctl_get_project_value)
if test -z "$proj"
return
end
mcpctl --project $proj get servers -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null
end
# Servers NOT attached to the project (for attach-server)
function __mcpctl_available_servers
set -l proj (__mcpctl_get_project_value)
if test -z "$proj"
# No project — show all servers
mcpctl get servers -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null
return
end
set -l all (mcpctl get servers -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null)
set -l attached (mcpctl --project $proj get servers -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null)
for s in $all
if not contains -- $s $attached
echo $s
end
end
end
# --project value completion
complete -c mcpctl -l project -xa '(__mcpctl_project_names)'
# Top-level commands (without --project)
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a status -d 'Show status and connectivity'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a login -d 'Authenticate with mcpd'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a logout -d 'Log out'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a config -d 'Manage configuration'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a get -d 'List resources'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a describe -d 'Show resource details'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a delete -d 'Delete a resource'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a logs -d 'Get instance logs'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a create -d 'Create a resource'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a edit -d 'Edit a resource'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a apply -d 'Apply configuration from file'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a backup -d 'Backup configuration'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a restore -d 'Restore from backup'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a approve -d 'Approve a prompt request'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a help -d 'Show help'
# Project-scoped commands (with --project)
complete -c mcpctl -n "__mcpctl_has_project; and not __fish_seen_subcommand_from $project_commands" -a attach-server -d 'Attach a server to the project'
complete -c mcpctl -n "__mcpctl_has_project; and not __fish_seen_subcommand_from $project_commands" -a detach-server -d 'Detach a server from the project'
complete -c mcpctl -n "__mcpctl_has_project; and not __fish_seen_subcommand_from $project_commands" -a get -d 'List resources (scoped to project)'
complete -c mcpctl -n "__mcpctl_has_project; and not __fish_seen_subcommand_from $project_commands" -a describe -d 'Show resource details'
complete -c mcpctl -n "__mcpctl_has_project; and not __fish_seen_subcommand_from $project_commands" -a delete -d 'Delete a resource'
complete -c mcpctl -n "__mcpctl_has_project; and not __fish_seen_subcommand_from $project_commands" -a logs -d 'Get instance logs'
complete -c mcpctl -n "__mcpctl_has_project; and not __fish_seen_subcommand_from $project_commands" -a create -d 'Create a resource'
complete -c mcpctl -n "__mcpctl_has_project; and not __fish_seen_subcommand_from $project_commands" -a edit -d 'Edit a resource'
complete -c mcpctl -n "__mcpctl_has_project; and not __fish_seen_subcommand_from $project_commands" -a help -d 'Show help'
# Resource types — only when resource type not yet selected
complete -c mcpctl -n "__fish_seen_subcommand_from get describe delete; and __mcpctl_needs_resource_type" -a "$resources" -d 'Resource type'
complete -c mcpctl -n "__fish_seen_subcommand_from edit; and __mcpctl_needs_resource_type" -a 'servers projects' -d 'Resource type'
# Resource names — after resource type is selected
complete -c mcpctl -n "__fish_seen_subcommand_from get describe delete edit approve; and not __mcpctl_needs_resource_type" -a '(__mcpctl_resource_names)' -d 'Resource name'
# Helper: check if attach-server/detach-server already has a server argument
function __mcpctl_needs_server_arg
set -l tokens (commandline -opc)
set -l found_cmd false
for tok in $tokens
if $found_cmd
if not string match -q -- '-*' $tok
return 1 # server arg already present
end
end
if contains -- $tok attach-server detach-server
set found_cmd true
end
end
if $found_cmd
return 0 # command found but no server arg yet
end
return 1
end
# attach-server: show servers NOT in the project (only if no server arg yet)
complete -c mcpctl -n "__fish_seen_subcommand_from attach-server; and __mcpctl_needs_server_arg" -a '(__mcpctl_available_servers)' -d 'Server'
# detach-server: show servers IN the project (only if no server arg yet)
complete -c mcpctl -n "__fish_seen_subcommand_from detach-server; and __mcpctl_needs_server_arg" -a '(__mcpctl_project_servers)' -d 'Server'
# get/describe options
complete -c mcpctl -n "__fish_seen_subcommand_from get" -s o -l output -d 'Output format' -xa 'table json yaml'
complete -c mcpctl -n "__fish_seen_subcommand_from describe" -s o -l output -d 'Output format' -xa 'detail json yaml'
complete -c mcpctl -n "__fish_seen_subcommand_from describe" -l show-values -d 'Show secret values'
# login options
complete -c mcpctl -n "__fish_seen_subcommand_from login" -l url -d 'mcpd URL' -x
complete -c mcpctl -n "__fish_seen_subcommand_from login" -l email -d 'Email address' -x
complete -c mcpctl -n "__fish_seen_subcommand_from login" -l password -d 'Password' -x
# config subcommands
set -l config_cmds view set path reset claude claude-generate setup impersonate
complete -c mcpctl -n "__fish_seen_subcommand_from config; and not __fish_seen_subcommand_from $config_cmds" -a view -d 'Show configuration'
complete -c mcpctl -n "__fish_seen_subcommand_from config; and not __fish_seen_subcommand_from $config_cmds" -a set -d 'Set a config value'
complete -c mcpctl -n "__fish_seen_subcommand_from config; and not __fish_seen_subcommand_from $config_cmds" -a path -d 'Show config file path'
complete -c mcpctl -n "__fish_seen_subcommand_from config; and not __fish_seen_subcommand_from $config_cmds" -a reset -d 'Reset to defaults'
complete -c mcpctl -n "__fish_seen_subcommand_from config; and not __fish_seen_subcommand_from $config_cmds" -a claude -d 'Generate .mcp.json for project'
complete -c mcpctl -n "__fish_seen_subcommand_from config; and not __fish_seen_subcommand_from $config_cmds" -a setup -d 'Configure LLM provider'
complete -c mcpctl -n "__fish_seen_subcommand_from config; and not __fish_seen_subcommand_from $config_cmds" -a impersonate -d 'Impersonate a user'
# create subcommands
set -l create_cmds server secret project user group rbac prompt promptrequest
complete -c mcpctl -n "__fish_seen_subcommand_from create; and not __fish_seen_subcommand_from $create_cmds" -a server -d 'Create a server'
complete -c mcpctl -n "__fish_seen_subcommand_from create; and not __fish_seen_subcommand_from $create_cmds" -a secret -d 'Create a secret'
complete -c mcpctl -n "__fish_seen_subcommand_from create; and not __fish_seen_subcommand_from $create_cmds" -a project -d 'Create a project'
complete -c mcpctl -n "__fish_seen_subcommand_from create; and not __fish_seen_subcommand_from $create_cmds" -a user -d 'Create a user'
complete -c mcpctl -n "__fish_seen_subcommand_from create; and not __fish_seen_subcommand_from $create_cmds" -a group -d 'Create a group'
complete -c mcpctl -n "__fish_seen_subcommand_from create; and not __fish_seen_subcommand_from $create_cmds" -a rbac -d 'Create an RBAC binding'
complete -c mcpctl -n "__fish_seen_subcommand_from create; and not __fish_seen_subcommand_from $create_cmds" -a prompt -d 'Create an approved prompt'
complete -c mcpctl -n "__fish_seen_subcommand_from create; and not __fish_seen_subcommand_from $create_cmds" -a promptrequest -d 'Create a prompt request'
# logs options
complete -c mcpctl -n "__fish_seen_subcommand_from logs" -l tail -d 'Number of lines' -x
complete -c mcpctl -n "__fish_seen_subcommand_from logs" -l since -d 'Since timestamp' -x
complete -c mcpctl -n "__fish_seen_subcommand_from logs" -s f -l follow -d 'Follow log output'
# backup options
complete -c mcpctl -n "__fish_seen_subcommand_from backup" -s o -l output -d 'Output file' -rF
complete -c mcpctl -n "__fish_seen_subcommand_from backup" -s p -l password -d 'Encryption password' -x
# restore options
complete -c mcpctl -n "__fish_seen_subcommand_from restore" -s i -l input -d 'Input file' -rF
complete -c mcpctl -n "__fish_seen_subcommand_from restore" -s p -l password -d 'Decryption password' -x
complete -c mcpctl -n "__fish_seen_subcommand_from restore" -s c -l conflict -d 'Conflict strategy' -xa 'skip overwrite fail'
# approve: first arg is resource type (promptrequest only), second is name
complete -c mcpctl -n "__fish_seen_subcommand_from approve; and __mcpctl_needs_resource_type" -a 'promptrequest' -d 'Resource type'
# apply takes a file
complete -c mcpctl -n "__fish_seen_subcommand_from apply" -s f -l file -d 'Configuration file' -rF
complete -c mcpctl -n "__fish_seen_subcommand_from apply" -F
# help completions

fulldeploy.sh (new executable file, 35 lines)

@@ -0,0 +1,35 @@
#!/bin/bash
# Full deployment: Docker image → Portainer stack → RPM build/publish/install
set -e
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
cd "$SCRIPT_DIR"
# Load .env
if [ -f .env ]; then
set -a; source .env; set +a
fi
echo "========================================"
echo " mcpctl Full Deploy"
echo "========================================"
echo ""
echo ">>> Step 1/3: Build & push mcpd Docker image"
echo ""
bash scripts/build-mcpd.sh "$@"
echo ""
echo ">>> Step 2/3: Deploy stack to production"
echo ""
bash deploy.sh
echo ""
echo ">>> Step 3/3: Build, publish & install RPM"
echo ""
bash scripts/release.sh
echo ""
echo "========================================"
echo " Full deploy complete!"
echo "========================================"


@@ -5,6 +5,8 @@ release: "1"
maintainer: michal
description: kubectl-like CLI for managing MCP servers
license: MIT
depends:
- jq
contents:
- src: ./dist/mcpctl
dst: /usr/bin/mcpctl

pr.sh (new executable file, 55 lines)

@@ -0,0 +1,55 @@
#!/usr/bin/env bash
# Usage: bash pr.sh "PR title" "PR body"
# Loads GITEA_TOKEN from .env automatically
set -euo pipefail
# Load .env if GITEA_TOKEN not already exported
if [ -z "${GITEA_TOKEN:-}" ] && [ -f .env ]; then
set -a
source .env
set +a
fi
GITEA_URL="${GITEA_URL:-http://10.0.0.194:3012}"
REPO="${GITEA_OWNER:-michal}/mcpctl"
TITLE="${1:?Usage: pr.sh <title> [body]}"
BODY="${2:-}"
BASE="${3:-main}"
HEAD=$(git rev-parse --abbrev-ref HEAD)
if [ "$HEAD" = "$BASE" ]; then
echo "Error: already on $BASE, switch to a feature branch first" >&2
exit 1
fi
if [ -z "${GITEA_TOKEN:-}" ]; then
echo "Error: GITEA_TOKEN not set and .env not found" >&2
exit 1
fi
# Push if needed
if ! git rev-parse --verify "origin/$HEAD" &>/dev/null; then
git push -u origin "$HEAD"
else
git push
fi
# Create PR
RESPONSE=$(curl -s -X POST "$GITEA_URL/api/v1/repos/$REPO/pulls" \
-H "Authorization: token $GITEA_TOKEN" \
-H "Content-Type: application/json" \
-d "$(jq -n --arg t "$TITLE" --arg b "$BODY" --arg h "$HEAD" --arg base "$BASE" \
'{title: $t, body: $b, head: $h, base: $base}')")
PR_NUM=$(echo "$RESPONSE" | jq -r '.number // empty')
PR_URL=$(echo "$RESPONSE" | jq -r '.html_url // empty')
if [ -z "$PR_NUM" ]; then
echo "Error creating PR:" >&2
echo "$RESPONSE" | jq . 2>/dev/null || echo "$RESPONSE" >&2
exit 1
fi
echo "PR #$PR_NUM: ${PR_URL:-$GITEA_URL/$REPO/pulls/$PR_NUM}"


@@ -24,7 +24,10 @@ export class ApiError extends Error {
function request<T>(method: string, url: string, timeout: number, body?: unknown, token?: string): Promise<ApiResponse<T>> {
return new Promise((resolve, reject) => {
const parsed = new URL(url);
const headers: Record<string, string> = {};
if (body !== undefined) {
headers['Content-Type'] = 'application/json';
}
if (token) {
headers['Authorization'] = `Bearer ${token}`;
}


@@ -63,17 +63,78 @@ const TemplateSpecSchema = z.object({
healthCheck: HealthCheckSchema.optional(),
});
const UserSpecSchema = z.object({
email: z.string().email(),
password: z.string().min(8),
name: z.string().optional(),
});
const GroupSpecSchema = z.object({
name: z.string().min(1),
description: z.string().default(''),
members: z.array(z.string().email()).default([]),
});
const RbacSubjectSchema = z.object({
kind: z.enum(['User', 'Group', 'ServiceAccount']),
name: z.string().min(1),
});
const RESOURCE_ALIASES: Record<string, string> = {
server: 'servers', instance: 'instances', secret: 'secrets',
project: 'projects', template: 'templates', user: 'users', group: 'groups',
prompt: 'prompts', promptrequest: 'promptrequests',
};
const RbacRoleBindingSchema = z.union([
z.object({
role: z.enum(['edit', 'view', 'create', 'delete', 'run', 'expose']),
resource: z.string().min(1).transform((r) => RESOURCE_ALIASES[r] ?? r),
name: z.string().min(1).optional(),
}),
z.object({
role: z.literal('run'),
action: z.string().min(1),
}),
]);
const RbacBindingSpecSchema = z.object({
name: z.string().min(1),
subjects: z.array(RbacSubjectSchema).default([]),
roleBindings: z.array(RbacRoleBindingSchema).default([]),
});
const PromptSpecSchema = z.object({
name: z.string().min(1).max(100).regex(/^[a-z0-9-]+$/),
content: z.string().min(1).max(50000),
projectId: z.string().optional(),
});
const ProjectSpecSchema = z.object({
name: z.string().min(1),
description: z.string().default(''),
prompt: z.string().max(10000).default(''),
proxyMode: z.enum(['direct', 'filtered']).default('direct'),
llmProvider: z.string().optional(),
llmModel: z.string().optional(),
servers: z.array(z.string()).default([]),
}); });
const ApplyConfigSchema = z.object({
secrets: z.array(SecretSpecSchema).default([]),
servers: z.array(ServerSpecSchema).default([]),
users: z.array(UserSpecSchema).default([]),
groups: z.array(GroupSpecSchema).default([]),
projects: z.array(ProjectSpecSchema).default([]),
templates: z.array(TemplateSpecSchema).default([]),
rbacBindings: z.array(RbacBindingSpecSchema).default([]),
rbac: z.array(RbacBindingSpecSchema).default([]),
prompts: z.array(PromptSpecSchema).default([]),
}).transform((data) => ({
...data,
// Merge rbac into rbacBindings so both keys work
rbacBindings: [...data.rbacBindings, ...data.rbac],
}));
export type ApplyConfig = z.infer<typeof ApplyConfigSchema>;
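The schema hunk above combines two small patterns: a transform that canonicalizes singular resource aliases, and a post-parse merge so both the `rbac` and `rbacBindings` keys are accepted. Stripped of zod, the same normalization can be sketched in isolation (names abridged and illustrative):

```typescript
// Singular-to-plural aliases, as in RESOURCE_ALIASES above (abridged).
const ALIASES: Record<string, string> = {
  server: 'servers',
  secret: 'secrets',
  promptrequest: 'promptrequests',
};

// Canonicalize a resource name; unknown names pass through unchanged,
// matching the `RESOURCE_ALIASES[r] ?? r` transform above.
function canonicalResource(r: string): string {
  return ALIASES[r] ?? r;
}

// Post-parse merge: the alternative `rbac` key folds into `rbacBindings`,
// so configs using either spelling yield a single combined list.
interface RawConfig { rbacBindings?: string[]; rbac?: string[] }
function mergeRbac(cfg: RawConfig): string[] {
  return [...(cfg.rbacBindings ?? []), ...(cfg.rbac ?? [])];
}
```

Handling the merge in a schema-level transform means downstream code (dry-run counting, applyConfig) only ever sees `rbacBindings`.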
@@ -87,17 +148,26 @@ export function createApplyCommand(deps: ApplyCommandDeps): Command {
return new Command('apply')
.description('Apply declarative configuration from a YAML or JSON file')
.argument('[file]', 'Path to config file (.yaml, .yml, or .json)')
.option('-f, --file <file>', 'Path to config file (alternative to positional arg)')
.option('--dry-run', 'Validate and show changes without applying')
.action(async (fileArg: string | undefined, opts: { file?: string; dryRun?: boolean }) => {
const file = fileArg ?? opts.file;
if (!file) {
throw new Error('File path required. Usage: mcpctl apply <file> or mcpctl apply -f <file>');
}
const config = loadConfigFile(file);
if (opts.dryRun) {
log('Dry run - would apply:');
if (config.secrets.length > 0) log(` ${config.secrets.length} secret(s)`);
if (config.servers.length > 0) log(` ${config.servers.length} server(s)`);
if (config.users.length > 0) log(` ${config.users.length} user(s)`);
if (config.groups.length > 0) log(` ${config.groups.length} group(s)`);
if (config.projects.length > 0) log(` ${config.projects.length} project(s)`);
if (config.templates.length > 0) log(` ${config.templates.length} template(s)`);
if (config.rbacBindings.length > 0) log(` ${config.rbacBindings.length} rbacBinding(s)`);
if (config.prompts.length > 0) log(` ${config.prompts.length} prompt(s)`);
return;
}
@@ -119,21 +189,7 @@ function loadConfigFile(path: string): ApplyConfig {
}
async function applyConfig(client: ApiClient, config: ApplyConfig, log: (...args: unknown[]) => void): Promise<void> {
// Apply order: secrets, servers, users, groups, projects, templates, rbacBindings, prompts
// Apply secrets
for (const secret of config.secrets) {
@@ -151,20 +207,63 @@ async function applyConfig(client: ApiClient, config: ApplyConfig, log: (...args
}
}
// Apply servers
for (const server of config.servers) {
try {
const existing = await findByName(client, 'servers', server.name);
if (existing) {
await client.put(`/api/v1/servers/${(existing as { id: string }).id}`, server);
log(`Updated server: ${server.name}`);
} else {
await client.post('/api/v1/servers', server);
log(`Created server: ${server.name}`);
}
} catch (err) {
log(`Error applying server '${server.name}': ${err instanceof Error ? err.message : err}`);
}
}
// Apply users (matched by email)
for (const user of config.users) {
try {
const existing = await findByField(client, 'users', 'email', user.email);
if (existing) {
await client.put(`/api/v1/users/${(existing as { id: string }).id}`, user);
log(`Updated user: ${user.email}`);
} else {
await client.post('/api/v1/users', user);
log(`Created user: ${user.email}`);
}
} catch (err) {
log(`Error applying user '${user.email}': ${err instanceof Error ? err.message : err}`);
}
}
// Apply groups
for (const group of config.groups) {
try {
const existing = await findByName(client, 'groups', group.name);
if (existing) {
await client.put(`/api/v1/groups/${(existing as { id: string }).id}`, group);
log(`Updated group: ${group.name}`);
} else {
await client.post('/api/v1/groups', group);
log(`Created group: ${group.name}`);
}
} catch (err) {
log(`Error applying group '${group.name}': ${err instanceof Error ? err.message : err}`);
}
}
// Apply projects (send full spec including servers)
for (const project of config.projects) {
try {
const existing = await findByName(client, 'projects', project.name);
if (existing) {
await client.put(`/api/v1/projects/${(existing as { id: string }).id}`, project);
log(`Updated project: ${project.name}`);
} else {
await client.post('/api/v1/projects', project);
log(`Created project: ${project.name}`);
}
} catch (err) {
@@ -187,6 +286,38 @@ async function applyConfig(client: ApiClient, config: ApplyConfig, log: (...args
log(`Error applying template '${template.name}': ${err instanceof Error ? err.message : err}`);
}
}
// Apply RBAC bindings
for (const rbacBinding of config.rbacBindings) {
try {
const existing = await findByName(client, 'rbac', rbacBinding.name);
if (existing) {
await client.put(`/api/v1/rbac/${(existing as { id: string }).id}`, rbacBinding);
log(`Updated rbacBinding: ${rbacBinding.name}`);
} else {
await client.post('/api/v1/rbac', rbacBinding);
log(`Created rbacBinding: ${rbacBinding.name}`);
}
} catch (err) {
log(`Error applying rbacBinding '${rbacBinding.name}': ${err instanceof Error ? err.message : err}`);
}
}
// Apply prompts
for (const prompt of config.prompts) {
try {
const existing = await findByName(client, 'prompts', prompt.name);
if (existing) {
await client.put(`/api/v1/prompts/${(existing as { id: string }).id}`, { content: prompt.content });
log(`Updated prompt: ${prompt.name}`);
} else {
await client.post('/api/v1/prompts', prompt);
log(`Created prompt: ${prompt.name}`);
}
} catch (err) {
log(`Error applying prompt '${prompt.name}': ${err instanceof Error ? err.message : err}`);
}
}
}
async function findByName(client: ApiClient, resource: string, name: string): Promise<unknown | null> {
@@ -198,5 +329,14 @@ async function findByName(client: ApiClient, resource: string, name: string): Pr
}
}
async function findByField<T extends string>(client: ApiClient, resource: string, field: T, value: string): Promise<unknown | null> {
try {
const items = await client.get<Array<Record<string, unknown>>>(`/api/v1/${resource}`);
return items.find((item) => item[field] === value) ?? null;
} catch {
return null;
}
}
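Every apply loop above follows the same upsert shape: list the resource, match on a field (`name`, or `email` for users via `findByField`), then PUT to update or POST to create. A sketch of that pattern against an in-memory stub (the `db` store and generated ids are illustrative, standing in for `client.put`/`client.post`):

```typescript
interface StubRecord { id: string; [key: string]: unknown }

// In-memory stand-in for the mcpd API (illustrative data).
const db: Record<string, StubRecord[]> = {
  users: [{ id: 'u1', email: 'a@example.com' }],
};

// Find by field, then update in place or append -- the shape of each apply loop.
async function upsert(resource: string, field: string, spec: Record<string, unknown>): Promise<'updated' | 'created'> {
  const existing = (db[resource] ?? []).find((item) => item[field] === spec[field]) ?? null;
  if (existing) {
    Object.assign(existing, spec); // stands in for client.put(`/api/v1/${resource}/${id}`, spec)
    return 'updated';
  }
  db[resource] = [...(db[resource] ?? []), { id: `id-${Math.random().toString(36).slice(2)}`, ...spec }]; // client.post
  return 'created';
}

upsert('users', 'email', { email: 'a@example.com', name: 'Ada' }).then((r) => console.log(r)); // -> updated
```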
// Export for testing
export { loadConfigFile, applyConfig };


@@ -10,6 +10,10 @@ export interface PromptDeps {
password(message: string): Promise<string>;
}
export interface StatusResponse {
hasUsers: boolean;
}
export interface AuthCommandDeps {
configDeps: Partial<ConfigLoaderDeps>;
credentialsDeps: Partial<CredentialsDeps>;
@@ -17,6 +21,8 @@ export interface AuthCommandDeps {
log: (...args: string[]) => void;
loginRequest: (mcpdUrl: string, email: string, password: string) => Promise<LoginResponse>;
logoutRequest: (mcpdUrl: string, token: string) => Promise<void>;
statusRequest: (mcpdUrl: string) => Promise<StatusResponse>;
bootstrapRequest: (mcpdUrl: string, email: string, password: string, name?: string) => Promise<LoginResponse>;
}
interface LoginResponse {
@@ -80,6 +86,70 @@ function defaultLogoutRequest(mcpdUrl: string, token: string): Promise<void> {
});
}
function defaultStatusRequest(mcpdUrl: string): Promise<StatusResponse> {
return new Promise((resolve, reject) => {
const url = new URL('/api/v1/auth/status', mcpdUrl);
const opts: http.RequestOptions = {
hostname: url.hostname,
port: url.port,
path: url.pathname,
method: 'GET',
timeout: 10000,
headers: { 'Content-Type': 'application/json' },
};
const req = http.request(opts, (res) => {
const chunks: Buffer[] = [];
res.on('data', (chunk: Buffer) => chunks.push(chunk));
res.on('end', () => {
const raw = Buffer.concat(chunks).toString('utf-8');
if ((res.statusCode ?? 0) >= 400) {
reject(new Error(`Status check failed (${res.statusCode}): ${raw}`));
return;
}
resolve(JSON.parse(raw) as StatusResponse);
});
});
req.on('error', (err) => reject(new Error(`Cannot reach mcpd: ${err.message}`)));
req.on('timeout', () => { req.destroy(); reject(new Error('Status request timed out')); });
req.end();
});
}
function defaultBootstrapRequest(mcpdUrl: string, email: string, password: string, name?: string): Promise<LoginResponse> {
return new Promise((resolve, reject) => {
const url = new URL('/api/v1/auth/bootstrap', mcpdUrl);
const payload: Record<string, string> = { email, password };
if (name) {
payload['name'] = name;
}
const body = JSON.stringify(payload);
const opts: http.RequestOptions = {
hostname: url.hostname,
port: url.port,
path: url.pathname,
method: 'POST',
timeout: 10000,
headers: { 'Content-Type': 'application/json', 'Content-Length': Buffer.byteLength(body) },
};
const req = http.request(opts, (res) => {
const chunks: Buffer[] = [];
res.on('data', (chunk: Buffer) => chunks.push(chunk));
res.on('end', () => {
const raw = Buffer.concat(chunks).toString('utf-8');
if ((res.statusCode ?? 0) >= 400) {
reject(new Error(`Bootstrap failed (${res.statusCode}): ${raw}`));
return;
}
resolve(JSON.parse(raw) as LoginResponse);
});
});
req.on('error', (err) => reject(new Error(`Cannot reach mcpd: ${err.message}`)));
req.on('timeout', () => { req.destroy(); reject(new Error('Bootstrap request timed out')); });
req.write(body);
req.end();
});
}
async function defaultInput(message: string): Promise<string> {
const { default: inquirer } = await import('inquirer');
const { answer } = await inquirer.prompt([{ type: 'input', name: 'answer', message }]);
@@ -99,10 +169,12 @@ const defaultDeps: AuthCommandDeps = {
log: (...args) => console.log(...args),
loginRequest: defaultLoginRequest,
logoutRequest: defaultLogoutRequest,
statusRequest: defaultStatusRequest,
bootstrapRequest: defaultBootstrapRequest,
};
export function createLoginCommand(deps?: Partial<AuthCommandDeps>): Command {
const { configDeps, credentialsDeps, prompt, log, loginRequest, statusRequest, bootstrapRequest } = { ...defaultDeps, ...deps };
return new Command('login')
.description('Authenticate with mcpd')
@@ -111,10 +183,28 @@ export function createLoginCommand(deps?: Partial<AuthCommandDeps>): Command {
const config = loadConfig(configDeps);
const mcpdUrl = opts.mcpdUrl ?? config.mcpdUrl;
try {
const status = await statusRequest(mcpdUrl);
if (!status.hasUsers) {
log('No users configured. Creating first admin account.');
const email = await prompt.input('Email:');
const password = await prompt.password('Password:');
const name = await prompt.input('Name (optional):');
const result = name
? await bootstrapRequest(mcpdUrl, email, password, name)
: await bootstrapRequest(mcpdUrl, email, password);
saveCredentials({
token: result.token,
mcpdUrl,
user: result.user.email,
}, credentialsDeps);
log(`Logged in as ${result.user.email} (admin)`);
} else {
const email = await prompt.input('Email:');
const password = await prompt.password('Password:');
const result = await loginRequest(mcpdUrl, email, password);
saveCredentials({
token: result.token,
@@ -122,6 +212,7 @@ export function createLoginCommand(deps?: Partial<AuthCommandDeps>): Command {
user: result.user.email,
}, credentialsDeps);
log(`Logged in as ${result.user.email}`);
}
} catch (err) {
log(`Login failed: ${(err as Error).message}`);
process.exitCode = 1;


@@ -1,155 +0,0 @@
import { Command } from 'commander';
import { writeFileSync, readFileSync, existsSync } from 'node:fs';
import { resolve } from 'node:path';
import type { ApiClient } from '../api-client.js';
interface McpConfig {
mcpServers: Record<string, { command: string; args: string[]; env?: Record<string, string> }>;
}
export interface ClaudeCommandDeps {
client: ApiClient;
log: (...args: unknown[]) => void;
}
export function createClaudeCommand(deps: ClaudeCommandDeps): Command {
const { client, log } = deps;
const cmd = new Command('claude')
.description('Manage Claude MCP configuration (.mcp.json)');
cmd
.command('generate <projectId>')
.description('Generate .mcp.json from a project configuration')
.option('-o, --output <path>', 'Output file path', '.mcp.json')
.option('--merge', 'Merge with existing .mcp.json instead of overwriting')
.option('--stdout', 'Print to stdout instead of writing a file')
.action(async (projectId: string, opts: { output: string; merge?: boolean; stdout?: boolean }) => {
const config = await client.get<McpConfig>(`/api/v1/projects/${projectId}/mcp-config`);
if (opts.stdout) {
log(JSON.stringify(config, null, 2));
return;
}
const outputPath = resolve(opts.output);
let finalConfig = config;
if (opts.merge && existsSync(outputPath)) {
try {
const existing = JSON.parse(readFileSync(outputPath, 'utf-8')) as McpConfig;
finalConfig = {
mcpServers: {
...existing.mcpServers,
...config.mcpServers,
},
};
} catch {
// If existing file is invalid, just overwrite
}
}
writeFileSync(outputPath, JSON.stringify(finalConfig, null, 2) + '\n');
const serverCount = Object.keys(finalConfig.mcpServers).length;
log(`Wrote ${outputPath} (${serverCount} server(s))`);
});
cmd
.command('show')
.description('Show current .mcp.json configuration')
.option('-p, --path <path>', 'Path to .mcp.json', '.mcp.json')
.action((opts: { path: string }) => {
const filePath = resolve(opts.path);
if (!existsSync(filePath)) {
log(`No .mcp.json found at ${filePath}`);
return;
}
const content = readFileSync(filePath, 'utf-8');
try {
const config = JSON.parse(content) as McpConfig;
const servers = Object.entries(config.mcpServers ?? {});
if (servers.length === 0) {
log('No MCP servers configured.');
return;
}
log(`MCP servers in ${filePath}:\n`);
for (const [name, server] of servers) {
log(` ${name}`);
log(` command: ${server.command} ${server.args.join(' ')}`);
if (server.env) {
const envKeys = Object.keys(server.env);
log(` env: ${envKeys.join(', ')}`);
}
}
} catch {
log(`Invalid JSON in ${filePath}`);
}
});
cmd
.command('add <name>')
.description('Add an MCP server entry to .mcp.json')
.requiredOption('-c, --command <cmd>', 'Command to run')
.option('-a, --args <args...>', 'Command arguments')
.option('-e, --env <key=value...>', 'Environment variables')
.option('-p, --path <path>', 'Path to .mcp.json', '.mcp.json')
.action((name: string, opts: { command: string; args?: string[]; env?: string[]; path: string }) => {
const filePath = resolve(opts.path);
let config: McpConfig = { mcpServers: {} };
if (existsSync(filePath)) {
try {
config = JSON.parse(readFileSync(filePath, 'utf-8')) as McpConfig;
} catch {
// Start fresh
}
}
const entry: { command: string; args: string[]; env?: Record<string, string> } = {
command: opts.command,
args: opts.args ?? [],
};
if (opts.env && opts.env.length > 0) {
const env: Record<string, string> = {};
for (const pair of opts.env) {
const eqIdx = pair.indexOf('=');
if (eqIdx > 0) {
env[pair.slice(0, eqIdx)] = pair.slice(eqIdx + 1);
}
}
entry.env = env;
}
config.mcpServers[name] = entry;
writeFileSync(filePath, JSON.stringify(config, null, 2) + '\n');
log(`Added '${name}' to ${filePath}`);
});
cmd
.command('remove <name>')
.description('Remove an MCP server entry from .mcp.json')
.option('-p, --path <path>', 'Path to .mcp.json', '.mcp.json')
.action((name: string, opts: { path: string }) => {
const filePath = resolve(opts.path);
if (!existsSync(filePath)) {
log(`No .mcp.json found at ${filePath}`);
return;
}
try {
const config = JSON.parse(readFileSync(filePath, 'utf-8')) as McpConfig;
if (!(name in config.mcpServers)) {
log(`Server '${name}' not found in ${filePath}`);
return;
}
delete config.mcpServers[name];
writeFileSync(filePath, JSON.stringify(config, null, 2) + '\n');
log(`Removed '${name}' from ${filePath}`);
} catch {
log(`Invalid JSON in ${filePath}`);
}
});
return cmd;
}


@@ -0,0 +1,347 @@
import { Command } from 'commander';
import http from 'node:http';
import https from 'node:https';
import { execFile } from 'node:child_process';
import { promisify } from 'node:util';
import { loadConfig, saveConfig } from '../config/index.js';
import type { ConfigLoaderDeps, McpctlConfig, LlmConfig, LlmProviderName } from '../config/index.js';
import type { SecretStore } from '@mcpctl/shared';
import { createSecretStore } from '@mcpctl/shared';
const execFileAsync = promisify(execFile);
export interface ConfigSetupPrompt {
select<T>(message: string, choices: Array<{ name: string; value: T; description?: string }>): Promise<T>;
input(message: string, defaultValue?: string): Promise<string>;
password(message: string): Promise<string>;
confirm(message: string, defaultValue?: boolean): Promise<boolean>;
}
export interface ConfigSetupDeps {
configDeps: Partial<ConfigLoaderDeps>;
secretStore: SecretStore;
log: (...args: string[]) => void;
prompt: ConfigSetupPrompt;
fetchModels: (url: string, path: string) => Promise<string[]>;
whichBinary: (name: string) => Promise<string | null>;
}
interface ProviderChoice {
name: string;
value: LlmProviderName;
description: string;
}
const PROVIDER_CHOICES: ProviderChoice[] = [
{ name: 'Gemini CLI', value: 'gemini-cli', description: 'Google Gemini via local CLI (free, no API key)' },
{ name: 'Ollama', value: 'ollama', description: 'Local models via Ollama' },
{ name: 'Anthropic (Claude)', value: 'anthropic', description: 'Claude API (requires API key)' },
{ name: 'vLLM', value: 'vllm', description: 'Self-hosted vLLM (OpenAI-compatible)' },
{ name: 'OpenAI', value: 'openai', description: 'OpenAI API (requires API key)' },
{ name: 'DeepSeek', value: 'deepseek', description: 'DeepSeek API (requires API key)' },
{ name: 'None (disable)', value: 'none', description: 'Disable LLM features' },
];
const GEMINI_MODELS = ['gemini-2.5-flash', 'gemini-2.5-pro', 'gemini-2.0-flash'];
const ANTHROPIC_MODELS = ['claude-3-5-haiku-20241022', 'claude-sonnet-4-20250514', 'claude-opus-4-20250514'];
const DEEPSEEK_MODELS = ['deepseek-chat', 'deepseek-reasoner'];
function defaultFetchModels(baseUrl: string, path: string): Promise<string[]> {
return new Promise((resolve) => {
const url = new URL(path, baseUrl);
const isHttps = url.protocol === 'https:';
const transport = isHttps ? https : http;
const req = transport.get({
hostname: url.hostname,
port: url.port || (isHttps ? 443 : 80),
path: url.pathname,
timeout: 5000,
}, (res) => {
const chunks: Buffer[] = [];
res.on('data', (chunk: Buffer) => chunks.push(chunk));
res.on('end', () => {
try {
const raw = Buffer.concat(chunks).toString('utf-8');
const data = JSON.parse(raw) as { models?: Array<{ name: string }>; data?: Array<{ id: string }> };
// Ollama format: { models: [{ name }] }
if (data.models) {
resolve(data.models.map((m) => m.name));
return;
}
// OpenAI/vLLM format: { data: [{ id }] }
if (data.data) {
resolve(data.data.map((m) => m.id));
return;
}
resolve([]);
} catch {
resolve([]);
}
});
});
req.on('error', () => resolve([]));
req.on('timeout', () => { req.destroy(); resolve([]); });
});
}
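`defaultFetchModels` accepts both response shapes it might see. A sketch of just the parsing branch in isolation, assuming the same two formats (Ollama `{ models: [{ name }] }` and OpenAI/vLLM `{ data: [{ id }] }`):

```typescript
// Parse either model-list response shape into a flat list of model names.
function parseModelList(raw: string): string[] {
  const data = JSON.parse(raw) as { models?: Array<{ name: string }>; data?: Array<{ id: string }> };
  if (data.models) return data.models.map((m) => m.name); // Ollama: { models: [{ name }] }
  if (data.data) return data.data.map((m) => m.id);       // OpenAI/vLLM: { data: [{ id }] }
  return [];
}

console.log(parseModelList('{"models":[{"name":"llama3.2"}]}')); // -> [ 'llama3.2' ]
```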
async function defaultSelect<T>(message: string, choices: Array<{ name: string; value: T; description?: string }>): Promise<T> {
const { default: inquirer } = await import('inquirer');
const { answer } = await inquirer.prompt([{
type: 'list',
name: 'answer',
message,
choices: choices.map((c) => ({
name: c.description ? `${c.name} - ${c.description}` : c.name,
value: c.value,
short: c.name,
})),
}]);
return answer as T;
}
async function defaultInput(message: string, defaultValue?: string): Promise<string> {
const { default: inquirer } = await import('inquirer');
const { answer } = await inquirer.prompt([{
type: 'input',
name: 'answer',
message,
default: defaultValue,
}]);
return answer as string;
}
async function defaultPassword(message: string): Promise<string> {
const { default: inquirer } = await import('inquirer');
const { answer } = await inquirer.prompt([{ type: 'password', name: 'answer', message }]);
return answer as string;
}
async function defaultConfirm(message: string, defaultValue?: boolean): Promise<boolean> {
const { default: inquirer } = await import('inquirer');
const { answer } = await inquirer.prompt([{
type: 'confirm',
name: 'answer',
message,
default: defaultValue ?? true,
}]);
return answer as boolean;
}
const defaultPrompt: ConfigSetupPrompt = {
select: defaultSelect,
input: defaultInput,
password: defaultPassword,
confirm: defaultConfirm,
};
async function defaultWhichBinary(name: string): Promise<string | null> {
try {
const { stdout } = await execFileAsync('which', [name], { timeout: 3000 });
const path = stdout.trim();
return path || null;
} catch {
return null;
}
}
export function createConfigSetupCommand(deps?: Partial<ConfigSetupDeps>): Command {
return new Command('setup')
.description('Interactive LLM provider setup wizard')
.action(async () => {
const configDeps = deps?.configDeps ?? {};
const log = deps?.log ?? ((...args: string[]) => console.log(...args));
const prompt = deps?.prompt ?? defaultPrompt;
const fetchModels = deps?.fetchModels ?? defaultFetchModels;
const whichBinary = deps?.whichBinary ?? defaultWhichBinary;
const secretStore = deps?.secretStore ?? await createSecretStore();
const config = loadConfig(configDeps);
const currentLlm = config.llm;
// Annotate current provider in choices
const choices = PROVIDER_CHOICES.map((c) => {
if (currentLlm?.provider === c.value) {
return { ...c, name: `${c.name} (current)` };
}
return c;
});
const provider = await prompt.select<LlmProviderName>('Select LLM provider:', choices);
if (provider === 'none') {
const updated: McpctlConfig = { ...config, llm: { provider: 'none' } };
saveConfig(updated, configDeps);
log('LLM disabled. Restart mcplocal: systemctl --user restart mcplocal');
return;
}
let llmConfig: LlmConfig;
switch (provider) {
case 'gemini-cli':
llmConfig = await setupGeminiCli(prompt, log, whichBinary, currentLlm);
break;
case 'ollama':
llmConfig = await setupOllama(prompt, fetchModels, currentLlm);
break;
case 'anthropic':
llmConfig = await setupApiKeyProvider(prompt, secretStore, 'anthropic', 'anthropic-api-key', ANTHROPIC_MODELS, currentLlm);
break;
case 'vllm':
llmConfig = await setupVllm(prompt, fetchModels, currentLlm);
break;
case 'openai':
llmConfig = await setupApiKeyProvider(prompt, secretStore, 'openai', 'openai-api-key', [], currentLlm);
break;
case 'deepseek':
llmConfig = await setupApiKeyProvider(prompt, secretStore, 'deepseek', 'deepseek-api-key', DEEPSEEK_MODELS, currentLlm);
break;
default:
return;
}
const updated: McpctlConfig = { ...config, llm: llmConfig };
saveConfig(updated, configDeps);
log(`\nLLM configured: ${llmConfig.provider}${llmConfig.model ? ` / ${llmConfig.model}` : ''}`);
log('Restart mcplocal: systemctl --user restart mcplocal');
});
}
async function setupGeminiCli(
prompt: ConfigSetupPrompt,
log: (...args: string[]) => void,
whichBinary: (name: string) => Promise<string | null>,
current?: LlmConfig,
): Promise<LlmConfig> {
const model = await prompt.select<string>('Select model:', [
...GEMINI_MODELS.map((m) => ({
name: m === current?.model ? `${m} (current)` : m,
value: m,
})),
{ name: 'Custom...', value: '__custom__' },
]);
const finalModel = model === '__custom__'
? await prompt.input('Model name:', current?.model)
: model;
// Auto-detect gemini binary path
let binaryPath: string | undefined;
const detected = await whichBinary('gemini');
if (detected) {
log(`Found gemini at: ${detected}`);
binaryPath = detected;
} else {
log('Warning: gemini binary not found in PATH');
const manualPath = await prompt.input('Binary path (or install with: npm i -g @google/gemini-cli):');
if (manualPath) binaryPath = manualPath;
}
return { provider: 'gemini-cli', model: finalModel, binaryPath };
}
async function setupOllama(prompt: ConfigSetupPrompt, fetchModels: ConfigSetupDeps['fetchModels'], current?: LlmConfig): Promise<LlmConfig> {
const url = await prompt.input('Ollama URL:', current?.url ?? 'http://localhost:11434');
// Try to fetch models from Ollama
const models = await fetchModels(url, '/api/tags');
let model: string;
if (models.length > 0) {
const choices = models.map((m) => ({
name: m === current?.model ? `${m} (current)` : m,
value: m,
}));
choices.push({ name: 'Custom...', value: '__custom__' });
model = await prompt.select<string>('Select model:', choices);
if (model === '__custom__') {
model = await prompt.input('Model name:', current?.model);
}
} else {
model = await prompt.input('Model name (could not fetch models):', current?.model ?? 'llama3.2');
}
return { provider: 'ollama', model, url };
}
async function setupVllm(prompt: ConfigSetupPrompt, fetchModels: ConfigSetupDeps['fetchModels'], current?: LlmConfig): Promise<LlmConfig> {
const url = await prompt.input('vLLM URL:', current?.url ?? 'http://localhost:8000');
// Try to fetch models from vLLM (OpenAI-compatible)
const models = await fetchModels(url, '/v1/models');
let model: string;
if (models.length > 0) {
const choices = models.map((m) => ({
name: m === current?.model ? `${m} (current)` : m,
value: m,
}));
choices.push({ name: 'Custom...', value: '__custom__' });
model = await prompt.select<string>('Select model:', choices);
if (model === '__custom__') {
model = await prompt.input('Model name:', current?.model);
}
} else {
model = await prompt.input('Model name (could not fetch models):', current?.model ?? 'default');
}
return { provider: 'vllm', model, url };
}
async function setupApiKeyProvider(
prompt: ConfigSetupPrompt,
secretStore: SecretStore,
provider: LlmProviderName,
secretKey: string,
hardcodedModels: string[],
current?: LlmConfig,
): Promise<LlmConfig> {
// Check for existing API key
const existingKey = await secretStore.get(secretKey);
let apiKey: string;
if (existingKey) {
const masked = `****${existingKey.slice(-4)}`;
const changeKey = await prompt.confirm(`API key stored (${masked}). Change it?`, false);
if (changeKey) {
apiKey = await prompt.password('API key:');
} else {
apiKey = existingKey;
}
} else {
apiKey = await prompt.password('API key:');
}
// Store API key
if (apiKey !== existingKey) {
await secretStore.set(secretKey, apiKey);
}
// Model selection
let model: string;
if (hardcodedModels.length > 0) {
const choices = hardcodedModels.map((m) => ({
name: m === current?.model ? `${m} (current)` : m,
value: m,
}));
choices.push({ name: 'Custom...', value: '__custom__' });
model = await prompt.select<string>('Select model:', choices);
if (model === '__custom__') {
model = await prompt.input('Model name:', current?.model);
}
} else {
model = await prompt.input('Model name:', current?.model ?? 'gpt-4o');
}
// Optional custom URL for openai
let url: string | undefined;
if (provider === 'openai') {
const customUrl = await prompt.confirm('Use custom API endpoint?', false);
if (customUrl) {
url = await prompt.input('API URL:', current?.url ?? 'https://api.openai.com');
}
}
return { provider, model, url };
}
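The wizard only ever shows the last four characters of a stored key before asking whether to replace it; the masking step in isolation:

```typescript
// Mask a stored API key down to its last four characters.
function maskKey(key: string): string {
  return `****${key.slice(-4)}`;
}

console.log(maskKey('sk-abc123wxyz')); // -> ****wxyz
```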


@@ -1,19 +1,36 @@
import { Command } from 'commander';
import { writeFileSync, readFileSync, existsSync } from 'node:fs';
import { resolve, join } from 'node:path';
import { homedir } from 'node:os';
import { loadConfig, saveConfig, mergeConfig, getConfigPath, DEFAULT_CONFIG } from '../config/index.js';
import type { McpctlConfig, ConfigLoaderDeps } from '../config/index.js';
import { formatJson, formatYaml } from '../formatters/index.js';
import { saveCredentials, loadCredentials } from '../auth/index.js';
import { createConfigSetupCommand } from './config-setup.js';
import type { CredentialsDeps, StoredCredentials } from '../auth/index.js';
import type { ApiClient } from '../api-client.js';
interface McpConfig {
mcpServers: Record<string, { command?: string; args?: string[]; url?: string; env?: Record<string, string> }>;
}
export interface ConfigCommandDeps {
configDeps: Partial<ConfigLoaderDeps>;
log: (...args: string[]) => void;
}
export interface ConfigApiDeps {
client: ApiClient;
credentialsDeps: Partial<CredentialsDeps>;
log: (...args: string[]) => void;
}
const defaultDeps: ConfigCommandDeps = {
configDeps: {},
log: (...args) => console.log(...args),
};
export function createConfigCommand(deps?: Partial<ConfigCommandDeps>, apiDeps?: ConfigApiDeps): Command {
const { configDeps, log } = { ...defaultDeps, ...deps };
const config = new Command('config').description('Manage mcpctl configuration');
@@ -68,5 +85,134 @@ export function createConfigCommand(deps?: Partial<ConfigCommandDeps>): Command
log('Configuration reset to defaults');
});
// claude/claude-generate: generate .mcp.json pointing at mcpctl mcp bridge
function registerClaudeCommand(name: string, hidden: boolean): void {
const cmd = config
.command(name)
.description(hidden ? '' : 'Generate .mcp.json that connects a project via mcpctl mcp bridge')
.requiredOption('--project <name>', 'Project name')
.option('-o, --output <path>', 'Output file path', '.mcp.json')
.option('--merge', 'Merge with existing .mcp.json instead of overwriting')
.option('--stdout', 'Print to stdout instead of writing a file')
.action((opts: { project: string; output: string; merge?: boolean; stdout?: boolean }) => {
const mcpConfig: McpConfig = {
mcpServers: {
[opts.project]: {
command: 'mcpctl',
args: ['mcp', '-p', opts.project],
},
},
};
if (opts.stdout) {
log(JSON.stringify(mcpConfig, null, 2));
return;
}
const outputPath = resolve(opts.output);
let finalConfig = mcpConfig;
if (opts.merge && existsSync(outputPath)) {
try {
const existing = JSON.parse(readFileSync(outputPath, 'utf-8')) as McpConfig;
finalConfig = {
mcpServers: {
...existing.mcpServers,
...mcpConfig.mcpServers,
},
};
} catch {
// If existing file is invalid, just overwrite
}
}
writeFileSync(outputPath, JSON.stringify(finalConfig, null, 2) + '\n');
const serverCount = Object.keys(finalConfig.mcpServers).length;
log(`Wrote ${outputPath} (${serverCount} server(s))`);
});
if (hidden) {
// Commander still lists empty-description commands, but without help text they don't clutter output
void cmd; // suppress unused lint
}
}
registerClaudeCommand('claude', false);
registerClaudeCommand('claude-generate', true); // backward compat
config.addCommand(createConfigSetupCommand({ configDeps }));
if (apiDeps) {
const { client, credentialsDeps, log: apiLog } = apiDeps;
config
.command('impersonate')
.description('Impersonate another user or return to original identity')
.argument('[email]', 'Email of user to impersonate')
.option('--quit', 'Stop impersonating and return to original identity')
.action(async (email: string | undefined, opts: { quit?: boolean }) => {
const configDir = credentialsDeps?.configDir ?? join(homedir(), '.mcpctl');
const backupPath = join(configDir, 'credentials-backup');
if (opts.quit) {
if (!existsSync(backupPath)) {
apiLog('No impersonation session to quit');
process.exitCode = 1;
return;
}
const backupRaw = readFileSync(backupPath, 'utf-8');
const backup = JSON.parse(backupRaw) as StoredCredentials;
saveCredentials(backup, credentialsDeps);
// Remove backup file
const { unlinkSync } = await import('node:fs');
unlinkSync(backupPath);
apiLog(`Returned to ${backup.user}`);
return;
}
if (!email) {
apiLog('Email is required when not using --quit');
process.exitCode = 1;
return;
}
// Save current credentials as backup
const currentCreds = loadCredentials(credentialsDeps);
if (!currentCreds) {
apiLog('Not logged in. Run "mcpctl login" first.');
process.exitCode = 1;
return;
}
writeFileSync(backupPath, JSON.stringify(currentCreds, null, 2) + '\n', 'utf-8');
try {
const result = await client.post<{ token: string; user: { email: string } }>(
'/api/v1/auth/impersonate',
{ email },
);
saveCredentials({
token: result.token,
mcpdUrl: currentCreds.mcpdUrl,
user: result.user.email,
}, credentialsDeps);
apiLog(`Impersonating ${result.user.email}. Use 'mcpctl config impersonate --quit' to return.`);
} catch (err) {
// Restore backup on failure
const backup = JSON.parse(readFileSync(backupPath, 'utf-8')) as StoredCredentials;
saveCredentials(backup, credentialsDeps);
const { unlinkSync } = await import('node:fs');
unlinkSync(backupPath);
apiLog(`Impersonate failed: ${(err as Error).message}`);
process.exitCode = 1;
}
});
}
return config;
}
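For reference, the `claude` subcommand above emits a config shaped like the following (shown for a hypothetical project named `demo`; the shape follows directly from the `mcpConfig` object built in the action):

```json
{
  "mcpServers": {
    "demo": {
      "command": "mcpctl",
      "args": ["mcp", "-p", "demo"]
    }
  }
}
```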


@@ -55,7 +55,7 @@ export function createCreateCommand(deps: CreateCommandDeps): Command {
const { client, log } = deps;
const cmd = new Command('create')
-  .description('Create a resource (server, project)');
+  .description('Create a resource (server, secret, project, user, group, rbac)');
// --- create server ---
cmd.command('server')
@@ -195,19 +195,32 @@ export function createCreateCommand(deps: CreateCommandDeps): Command {
.description('Create a project')
.argument('<name>', 'Project name')
.option('-d, --description <text>', 'Project description', '')
.option('--proxy-mode <mode>', 'Proxy mode (direct, filtered)')
.option('--llm-provider <name>', 'LLM provider name')
.option('--llm-model <name>', 'LLM model name')
.option('--prompt <text>', 'Project-level prompt / instructions for the LLM')
.option('--server <name>', 'Server name (repeat for multiple)', collect, [])
.option('--force', 'Update if already exists')
.action(async (name: string, opts) => {
-  try {
-    const project = await client.post<{ id: string; name: string }>('/api/v1/projects', {
-      name,
-      description: opts.description,
-    });
+  const body: Record<string, unknown> = {
+    name,
+    description: opts.description,
+    proxyMode: opts.proxyMode ?? 'direct',
+  };
+  if (opts.prompt) body.prompt = opts.prompt;
+  if (opts.llmProvider) body.llmProvider = opts.llmProvider;
+  if (opts.llmModel) body.llmModel = opts.llmModel;
+  if (opts.server.length > 0) body.servers = opts.server;
+  try {
+    const project = await client.post<{ id: string; name: string }>('/api/v1/projects', body);
    log(`project '${project.name}' created (id: ${project.id})`);
  } catch (err) {
    if (err instanceof ApiError && err.status === 409 && opts.force) {
      const existing = (await client.get<Array<{ id: string; name: string }>>('/api/v1/projects')).find((p) => p.name === name);
      if (!existing) throw err;
-      await client.put(`/api/v1/projects/${existing.id}`, { description: opts.description });
+      const { name: _n, ...updateBody } = body;
+      await client.put(`/api/v1/projects/${existing.id}`, updateBody);
      log(`project '${name}' updated (id: ${existing.id})`);
    } else {
      throw err;
@@ -215,5 +228,182 @@ export function createCreateCommand(deps: CreateCommandDeps): Command {
}
});
// --- create user ---
cmd.command('user')
.description('Create a user')
.argument('<email>', 'User email address')
.option('--password <pass>', 'User password')
.option('--name <name>', 'User display name')
.option('--force', 'Update if already exists')
.action(async (email: string, opts) => {
if (!opts.password) {
throw new Error('--password is required');
}
const body: Record<string, unknown> = {
email,
password: opts.password,
};
if (opts.name) body.name = opts.name;
try {
const user = await client.post<{ id: string; email: string }>('/api/v1/users', body);
log(`user '${user.email}' created (id: ${user.id})`);
} catch (err) {
if (err instanceof ApiError && err.status === 409 && opts.force) {
const existing = (await client.get<Array<{ id: string; email: string }>>('/api/v1/users')).find((u) => u.email === email);
if (!existing) throw err;
const { email: _e, ...updateBody } = body;
await client.put(`/api/v1/users/${existing.id}`, updateBody);
log(`user '${email}' updated (id: ${existing.id})`);
} else {
throw err;
}
}
});
// --- create group ---
cmd.command('group')
.description('Create a group')
.argument('<name>', 'Group name')
.option('--description <text>', 'Group description')
.option('--member <email>', 'Member email (repeat for multiple)', collect, [])
.option('--force', 'Update if already exists')
.action(async (name: string, opts) => {
const body: Record<string, unknown> = {
name,
members: opts.member,
};
if (opts.description) body.description = opts.description;
try {
const group = await client.post<{ id: string; name: string }>('/api/v1/groups', body);
log(`group '${group.name}' created (id: ${group.id})`);
} catch (err) {
if (err instanceof ApiError && err.status === 409 && opts.force) {
const existing = (await client.get<Array<{ id: string; name: string }>>('/api/v1/groups')).find((g) => g.name === name);
if (!existing) throw err;
const { name: _n, ...updateBody } = body;
await client.put(`/api/v1/groups/${existing.id}`, updateBody);
log(`group '${name}' updated (id: ${existing.id})`);
} else {
throw err;
}
}
});
// --- create rbac ---
cmd.command('rbac')
.description('Create an RBAC binding definition')
.argument('<name>', 'RBAC binding name')
.option('--subject <entry>', 'Subject as Kind:name (repeat for multiple)', collect, [])
.option('--binding <entry>', 'Role binding as role:resource (e.g. edit:servers, run:projects)', collect, [])
.option('--operation <action>', 'Operation binding (e.g. logs, backup)', collect, [])
.option('--force', 'Update if already exists')
.action(async (name: string, opts) => {
const subjects = (opts.subject as string[]).map((entry: string) => {
const colonIdx = entry.indexOf(':');
if (colonIdx === -1) {
throw new Error(`Invalid subject format '${entry}'. Expected Kind:name (e.g. User:alice@example.com)`);
}
return { kind: entry.slice(0, colonIdx), name: entry.slice(colonIdx + 1) };
});
const roleBindings: Array<Record<string, string>> = [];
// Resource bindings from --binding flag (role:resource or role:resource:name)
for (const entry of opts.binding as string[]) {
const parts = entry.split(':');
if (parts.length === 2) {
roleBindings.push({ role: parts[0]!, resource: parts[1]! });
} else if (parts.length === 3) {
roleBindings.push({ role: parts[0]!, resource: parts[1]!, name: parts[2]! });
} else {
throw new Error(`Invalid binding format '${entry}'. Expected role:resource or role:resource:name (e.g. edit:servers, view:servers:my-ha)`);
}
}
// Operation bindings from --operation flag
for (const action of opts.operation as string[]) {
roleBindings.push({ role: 'run', action });
}
const body: Record<string, unknown> = {
name,
subjects,
roleBindings,
};
try {
const rbac = await client.post<{ id: string; name: string }>('/api/v1/rbac', body);
log(`rbac '${rbac.name}' created (id: ${rbac.id})`);
} catch (err) {
if (err instanceof ApiError && err.status === 409 && opts.force) {
const existing = (await client.get<Array<{ id: string; name: string }>>('/api/v1/rbac')).find((r) => r.name === name);
if (!existing) throw err;
const { name: _n, ...updateBody } = body;
await client.put(`/api/v1/rbac/${existing.id}`, updateBody);
log(`rbac '${name}' updated (id: ${existing.id})`);
} else {
throw err;
}
}
});
// --- create prompt ---
cmd.command('prompt')
.description('Create an approved prompt')
.argument('<name>', 'Prompt name (lowercase alphanumeric with hyphens)')
.option('--project <name>', 'Project name to scope the prompt to')
.option('--content <text>', 'Prompt content text')
.option('--content-file <path>', 'Read prompt content from file')
.action(async (name: string, opts) => {
let content = opts.content as string | undefined;
if (opts.contentFile) {
const fs = await import('node:fs/promises');
content = await fs.readFile(opts.contentFile as string, 'utf-8');
}
if (!content) {
throw new Error('--content or --content-file is required');
}
const body: Record<string, unknown> = { name, content };
if (opts.project) {
// Resolve project name to ID
const projects = await client.get<Array<{ id: string; name: string }>>('/api/v1/projects');
const project = projects.find((p) => p.name === opts.project);
if (!project) throw new Error(`Project '${opts.project as string}' not found`);
body.projectId = project.id;
}
const prompt = await client.post<{ id: string; name: string }>('/api/v1/prompts', body);
log(`prompt '${prompt.name}' created (id: ${prompt.id})`);
});
// --- create promptrequest ---
cmd.command('promptrequest')
.description('Create a prompt request (pending proposal that needs approval)')
.argument('<name>', 'Prompt request name (lowercase alphanumeric with hyphens)')
.requiredOption('--project <name>', 'Project name (required)')
.option('--content <text>', 'Prompt content text')
.option('--content-file <path>', 'Read prompt content from file')
.action(async (name: string, opts) => {
let content = opts.content as string | undefined;
if (opts.contentFile) {
const fs = await import('node:fs/promises');
content = await fs.readFile(opts.contentFile as string, 'utf-8');
}
if (!content) {
throw new Error('--content or --content-file is required');
}
const projectName = opts.project as string;
const pr = await client.post<{ id: string; name: string }>(
`/api/v1/projects/${encodeURIComponent(projectName)}/promptrequests`,
{ name, content },
);
log(`prompt request '${pr.name}' created (id: ${pr.id})`);
log(` approve with: mcpctl approve promptrequest ${pr.name}`);
});
return cmd;
}
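The flag grammar used by `create rbac` above (split `--subject` on the first colon so names may themselves contain colons; allow 2 or 3 colon-separated parts for `--binding`) can be restated as a standalone sketch. The function names here are illustrative, not exported by the CLI:

```typescript
// Illustrative re-statement of the `create rbac` flag grammar (standalone sketch).
function parseSubject(entry: string): { kind: string; name: string } {
  // Split on the FIRST colon only, so subjects like User:alice@example.com parse correctly.
  const colonIdx = entry.indexOf(':');
  if (colonIdx === -1) {
    throw new Error(`Invalid subject format '${entry}'. Expected Kind:name`);
  }
  return { kind: entry.slice(0, colonIdx), name: entry.slice(colonIdx + 1) };
}

function parseBinding(entry: string): { role: string; resource: string; name?: string } {
  // role:resource grants on all resources of that type; role:resource:name scopes to one.
  const parts = entry.split(':');
  if (parts.length === 2) return { role: parts[0], resource: parts[1] };
  if (parts.length === 3) return { role: parts[0], resource: parts[1], name: parts[2] };
  throw new Error(`Invalid binding format '${entry}'. Expected role:resource or role:resource:name`);
}

console.log(parseSubject('User:alice@example.com')); // { kind: 'User', name: 'alice@example.com' }
console.log(parseBinding('view:servers:my-ha'));     // { role: 'view', resource: 'servers', name: 'my-ha' }
```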


@@ -11,7 +11,7 @@ export function createDeleteCommand(deps: DeleteCommandDeps): Command {
const { client, log } = deps;
return new Command('delete')
-  .description('Delete a resource (server, instance, profile, project)')
+  .description('Delete a resource (server, instance, secret, project, user, group, rbac)')
  .argument('<resource>', 'resource type')
  .argument('<id>', 'resource ID or name')
  .action(async (resourceArg: string, idOrName: string) => {


@@ -138,11 +138,34 @@ function formatProjectDetail(project: Record<string, unknown>): string {
lines.push(`=== Project: ${project.name} ===`);
lines.push(`${pad('Name:')}${project.name}`);
if (project.description) lines.push(`${pad('Description:')}${project.description}`);
if (project.ownerId) lines.push(`${pad('Owner:')}${project.ownerId}`);
// Proxy config section
const proxyMode = project.proxyMode as string | undefined;
const llmProvider = project.llmProvider as string | undefined;
const llmModel = project.llmModel as string | undefined;
if (proxyMode || llmProvider || llmModel) {
lines.push('');
lines.push('Proxy Config:');
lines.push(` ${pad('Mode:', 18)}${proxyMode ?? 'direct'}`);
if (llmProvider) lines.push(` ${pad('LLM Provider:', 18)}${llmProvider}`);
if (llmModel) lines.push(` ${pad('LLM Model:', 18)}${llmModel}`);
}
// Servers section
const servers = project.servers as Array<{ server: { name: string } }> | undefined;
if (servers && servers.length > 0) {
lines.push('');
lines.push('Servers:');
lines.push(' NAME');
for (const s of servers) {
lines.push(` ${s.server.name}`);
}
}
lines.push('');
lines.push('Metadata:');
lines.push(` ${pad('ID:', 12)}${project.id}`);
- if (project.ownerId) lines.push(` ${pad('Owner:', 12)}${project.ownerId}`);
if (project.createdAt) lines.push(` ${pad('Created:', 12)}${project.createdAt}`);
if (project.updatedAt) lines.push(` ${pad('Updated:', 12)}${project.updatedAt}`);
@@ -240,6 +263,231 @@ function formatTemplateDetail(template: Record<string, unknown>): string {
return lines.join('\n');
}
interface RbacBinding { role: string; resource?: string; action?: string; name?: string }
interface RbacDef { name: string; subjects: Array<{ kind: string; name: string }>; roleBindings: RbacBinding[] }
interface PermissionSet { source: string; bindings: RbacBinding[] }
function formatPermissionSections(sections: PermissionSet[]): string[] {
const lines: string[] = [];
for (const section of sections) {
const bindings = section.bindings;
if (bindings.length === 0) continue;
const resourceBindings = bindings.filter((b) => 'resource' in b && b.resource !== undefined);
const operationBindings = bindings.filter((b) => 'action' in b && b.action !== undefined);
if (resourceBindings.length > 0) {
lines.push('');
lines.push(`${section.source} — Resources:`);
const roleW = Math.max(6, ...resourceBindings.map((b) => b.role.length)) + 2;
const resW = Math.max(10, ...resourceBindings.map((b) => (b.resource ?? '').length)) + 2;
const hasName = resourceBindings.some((b) => b.name);
if (hasName) {
lines.push(` ${'ROLE'.padEnd(roleW)}${'RESOURCE'.padEnd(resW)}NAME`);
} else {
lines.push(` ${'ROLE'.padEnd(roleW)}RESOURCE`);
}
for (const b of resourceBindings) {
if (hasName) {
lines.push(` ${b.role.padEnd(roleW)}${(b.resource ?? '').padEnd(resW)}${b.name ?? '*'}`);
} else {
lines.push(` ${b.role.padEnd(roleW)}${b.resource}`);
}
}
}
if (operationBindings.length > 0) {
lines.push('');
lines.push(`${section.source} — Operations:`);
lines.push(` ${'ACTION'.padEnd(20)}ROLE`);
for (const b of operationBindings) {
lines.push(` ${(b.action ?? '').padEnd(20)}${b.role}`);
}
}
}
return lines;
}
function collectBindingsForSubject(
rbacDefs: RbacDef[],
kind: string,
name: string,
): { rbacName: string; bindings: RbacBinding[] }[] {
const results: { rbacName: string; bindings: RbacBinding[] }[] = [];
for (const def of rbacDefs) {
const matched = def.subjects.some((s) => s.kind === kind && s.name === name);
if (matched) {
results.push({ rbacName: def.name, bindings: def.roleBindings });
}
}
return results;
}
function formatUserDetail(
user: Record<string, unknown>,
rbacDefs?: RbacDef[],
userGroups?: string[],
): string {
const lines: string[] = [];
lines.push(`=== User: ${user.email} ===`);
lines.push(`${pad('Email:')}${user.email}`);
lines.push(`${pad('Name:')}${(user.name as string | null) ?? '-'}`);
lines.push(`${pad('Provider:')}${(user.provider as string | null) ?? 'local'}`);
if (userGroups && userGroups.length > 0) {
lines.push(`${pad('Groups:')}${userGroups.join(', ')}`);
}
if (rbacDefs) {
const email = user.email as string;
// Direct permissions (User:email subjects)
const directMatches = collectBindingsForSubject(rbacDefs, 'User', email);
const directBindings = directMatches.flatMap((m) => m.bindings);
const directSources = directMatches.map((m) => m.rbacName).join(', ');
// Inherited permissions (Group:name subjects)
const inheritedSections: PermissionSet[] = [];
if (userGroups) {
for (const groupName of userGroups) {
const groupMatches = collectBindingsForSubject(rbacDefs, 'Group', groupName);
const groupBindings = groupMatches.flatMap((m) => m.bindings);
if (groupBindings.length > 0) {
inheritedSections.push({ source: `Inherited (${groupName})`, bindings: groupBindings });
}
}
}
const sections: PermissionSet[] = [];
if (directBindings.length > 0) {
sections.push({ source: `Direct (${directSources})`, bindings: directBindings });
}
sections.push(...inheritedSections);
if (sections.length > 0) {
lines.push('');
lines.push('Access:');
lines.push(...formatPermissionSections(sections));
} else {
lines.push('');
lines.push('Access: (none)');
}
}
lines.push('');
lines.push('Metadata:');
lines.push(` ${pad('ID:', 12)}${user.id}`);
if (user.createdAt) lines.push(` ${pad('Created:', 12)}${user.createdAt}`);
if (user.updatedAt) lines.push(` ${pad('Updated:', 12)}${user.updatedAt}`);
return lines.join('\n');
}
function formatGroupDetail(group: Record<string, unknown>, rbacDefs?: RbacDef[]): string {
const lines: string[] = [];
lines.push(`=== Group: ${group.name} ===`);
lines.push(`${pad('Name:')}${group.name}`);
if (group.description) lines.push(`${pad('Description:')}${group.description}`);
const members = group.members as Array<{ user: { email: string }; createdAt?: string }> | undefined;
if (members && members.length > 0) {
lines.push('');
lines.push('Members:');
const emailW = Math.max(6, ...members.map((m) => m.user.email.length)) + 2;
lines.push(` ${'EMAIL'.padEnd(emailW)}ADDED`);
for (const m of members) {
const added = (m.createdAt as string | undefined) ?? '-';
lines.push(` ${m.user.email.padEnd(emailW)}${added}`);
}
}
if (rbacDefs) {
const groupName = group.name as string;
const matches = collectBindingsForSubject(rbacDefs, 'Group', groupName);
const allBindings = matches.flatMap((m) => m.bindings);
const sources = matches.map((m) => m.rbacName).join(', ');
if (allBindings.length > 0) {
const sections: PermissionSet[] = [{ source: `Granted (${sources})`, bindings: allBindings }];
lines.push('');
lines.push('Access:');
lines.push(...formatPermissionSections(sections));
} else {
lines.push('');
lines.push('Access: (none)');
}
}
lines.push('');
lines.push('Metadata:');
lines.push(` ${pad('ID:', 12)}${group.id}`);
if (group.createdAt) lines.push(` ${pad('Created:', 12)}${group.createdAt}`);
if (group.updatedAt) lines.push(` ${pad('Updated:', 12)}${group.updatedAt}`);
return lines.join('\n');
}
function formatRbacDetail(rbac: Record<string, unknown>): string {
const lines: string[] = [];
lines.push(`=== RBAC: ${rbac.name} ===`);
lines.push(`${pad('Name:')}${rbac.name}`);
const subjects = rbac.subjects as Array<{ kind: string; name: string }> | undefined;
if (subjects && subjects.length > 0) {
lines.push('');
lines.push('Subjects:');
const kindW = Math.max(6, ...subjects.map((s) => s.kind.length)) + 2;
lines.push(` ${'KIND'.padEnd(kindW)}NAME`);
for (const s of subjects) {
lines.push(` ${s.kind.padEnd(kindW)}${s.name}`);
}
}
const roleBindings = rbac.roleBindings as Array<{ role: string; resource?: string; action?: string; name?: string }> | undefined;
if (roleBindings && roleBindings.length > 0) {
// Separate resource bindings from operation bindings
const resourceBindings = roleBindings.filter((b) => 'resource' in b && b.resource !== undefined);
const operationBindings = roleBindings.filter((b) => 'action' in b && b.action !== undefined);
if (resourceBindings.length > 0) {
lines.push('');
lines.push('Resource Bindings:');
const roleW = Math.max(6, ...resourceBindings.map((b) => b.role.length)) + 2;
const resW = Math.max(10, ...resourceBindings.map((b) => (b.resource ?? '').length)) + 2;
const hasName = resourceBindings.some((b) => b.name);
if (hasName) {
lines.push(` ${'ROLE'.padEnd(roleW)}${'RESOURCE'.padEnd(resW)}NAME`);
} else {
lines.push(` ${'ROLE'.padEnd(roleW)}RESOURCE`);
}
for (const b of resourceBindings) {
if (hasName) {
lines.push(` ${b.role.padEnd(roleW)}${(b.resource ?? '').padEnd(resW)}${b.name ?? '*'}`);
} else {
lines.push(` ${b.role.padEnd(roleW)}${b.resource}`);
}
}
}
if (operationBindings.length > 0) {
lines.push('');
lines.push('Operations:');
lines.push(` ${'ACTION'.padEnd(20)}ROLE`);
for (const b of operationBindings) {
lines.push(` ${(b.action ?? '').padEnd(20)}${b.role}`);
}
}
}
lines.push('');
lines.push('Metadata:');
lines.push(` ${pad('ID:', 12)}${rbac.id}`);
if (rbac.createdAt) lines.push(` ${pad('Created:', 12)}${rbac.createdAt}`);
if (rbac.updatedAt) lines.push(` ${pad('Updated:', 12)}${rbac.updatedAt}`);
return lines.join('\n');
}
function formatGenericDetail(obj: Record<string, unknown>): string {
const lines: string[] = [];
for (const [key, value] of Object.entries(obj)) {
@@ -341,6 +589,27 @@ export function createDescribeCommand(deps: DescribeCommandDeps): Command {
case 'projects':
deps.log(formatProjectDetail(item));
break;
case 'users': {
// Fetch RBAC definitions and groups to show permissions
const [rbacDefsForUser, allGroupsForUser] = await Promise.all([
deps.client.get<RbacDef[]>('/api/v1/rbac').catch(() => [] as RbacDef[]),
deps.client.get<Array<{ name: string; members?: Array<{ user: { email: string } }> }>>('/api/v1/groups').catch(() => []),
]);
const userEmail = item.email as string;
const userGroupNames = allGroupsForUser
.filter((g) => g.members?.some((m) => m.user.email === userEmail))
.map((g) => g.name);
deps.log(formatUserDetail(item, rbacDefsForUser, userGroupNames));
break;
}
case 'groups': {
const rbacDefsForGroup = await deps.client.get<RbacDef[]>('/api/v1/rbac').catch(() => [] as RbacDef[]);
deps.log(formatGroupDetail(item, rbacDefsForGroup));
break;
}
case 'rbac':
deps.log(formatRbacDetail(item));
break;
default:
deps.log(formatGenericDetail(item));
}


@@ -47,7 +47,7 @@ export function createEditCommand(deps: EditCommandDeps): Command {
return;
}
-const validResources = ['servers', 'secrets', 'projects'];
+const validResources = ['servers', 'secrets', 'projects', 'groups', 'rbac'];
if (!validResources.includes(resource)) {
log(`Error: unknown resource type '${resourceArg}'`);
process.exitCode = 1;


@@ -21,7 +21,9 @@ interface ProjectRow {
id: string;
name: string;
description: string;
proxyMode: string;
ownerId: string;
servers?: Array<{ server: { name: string } }>;
}
interface SecretRow {
@@ -57,10 +59,60 @@ const serverColumns: Column<ServerRow>[] = [
{ header: 'ID', key: 'id' },
];
interface UserRow {
id: string;
email: string;
name: string | null;
provider: string | null;
}
interface GroupRow {
id: string;
name: string;
description: string;
members?: Array<{ user: { email: string } }>;
}
interface RbacRow {
id: string;
name: string;
subjects: Array<{ kind: string; name: string }>;
roleBindings: Array<{ role: string; resource?: string; action?: string; name?: string }>;
}
const projectColumns: Column<ProjectRow>[] = [
{ header: 'NAME', key: 'name' },
{ header: 'MODE', key: (r) => r.proxyMode ?? 'direct', width: 10 },
{ header: 'SERVERS', key: (r) => r.servers ? String(r.servers.length) : '0', width: 8 },
{ header: 'DESCRIPTION', key: 'description', width: 30 },
{ header: 'ID', key: 'id' },
];
const userColumns: Column<UserRow>[] = [
{ header: 'EMAIL', key: 'email' },
{ header: 'NAME', key: (r) => r.name ?? '-' },
{ header: 'PROVIDER', key: (r) => r.provider ?? 'local', width: 10 },
{ header: 'ID', key: 'id' },
];
const groupColumns: Column<GroupRow>[] = [
{ header: 'NAME', key: 'name' },
{ header: 'MEMBERS', key: (r) => r.members ? String(r.members.length) : '0', width: 8 },
{ header: 'DESCRIPTION', key: 'description', width: 40 },
-  { header: 'OWNER', key: 'ownerId' },
+  { header: 'ID', key: 'id' },
];
const rbacColumns: Column<RbacRow>[] = [
{ header: 'NAME', key: 'name' },
{ header: 'SUBJECTS', key: (r) => r.subjects.map((s) => `${s.kind}:${s.name}`).join(', '), width: 30 },
{ header: 'BINDINGS', key: (r) => r.roleBindings.map((b) => {
if ('action' in b && b.action !== undefined) return `run>${b.action}`;
if ('resource' in b && b.resource !== undefined) {
const base = `${b.role}:${b.resource}`;
return b.name ? `${base}:${b.name}` : base;
}
return b.role;
}).join(', '), width: 40 },
{ header: 'ID', key: 'id' },
];
@@ -78,6 +130,36 @@ const templateColumns: Column<TemplateRow>[] = [
{ header: 'DESCRIPTION', key: 'description', width: 50 },
];
interface PromptRow {
id: string;
name: string;
projectId: string | null;
createdAt: string;
}
interface PromptRequestRow {
id: string;
name: string;
projectId: string | null;
createdBySession: string | null;
createdAt: string;
}
const promptColumns: Column<PromptRow>[] = [
{ header: 'NAME', key: 'name' },
{ header: 'PROJECT', key: (r) => r.projectId ?? '-', width: 20 },
{ header: 'CREATED', key: (r) => new Date(r.createdAt).toLocaleString(), width: 20 },
{ header: 'ID', key: 'id' },
];
const promptRequestColumns: Column<PromptRequestRow>[] = [
{ header: 'NAME', key: 'name' },
{ header: 'PROJECT', key: (r) => r.projectId ?? '-', width: 20 },
{ header: 'SESSION', key: (r) => r.createdBySession ? r.createdBySession.slice(0, 12) : '-', width: 14 },
{ header: 'CREATED', key: (r) => new Date(r.createdAt).toLocaleString(), width: 20 },
{ header: 'ID', key: 'id' },
];
const instanceColumns: Column<InstanceRow>[] = [
{ header: 'NAME', key: (r) => r.server?.name ?? '-', width: 20 },
{ header: 'STATUS', key: 'status', width: 10 },
@@ -99,6 +181,16 @@ function getColumnsForResource(resource: string): Column<Record<string, unknown>
return templateColumns as unknown as Column<Record<string, unknown>>[];
case 'instances':
return instanceColumns as unknown as Column<Record<string, unknown>>[];
case 'users':
return userColumns as unknown as Column<Record<string, unknown>>[];
case 'groups':
return groupColumns as unknown as Column<Record<string, unknown>>[];
case 'rbac':
return rbacColumns as unknown as Column<Record<string, unknown>>[];
case 'prompts':
return promptColumns as unknown as Column<Record<string, unknown>>[];
case 'promptrequests':
return promptRequestColumns as unknown as Column<Record<string, unknown>>[];
default:
return [
{ header: 'ID', key: 'id' as keyof Record<string, unknown> },

src/cli/src/commands/mcp.ts Normal file

@@ -0,0 +1,224 @@
import { Command } from 'commander';
import http from 'node:http';
import { createInterface } from 'node:readline';
export interface McpBridgeOptions {
projectName: string;
mcplocalUrl: string;
token?: string | undefined;
stdin: NodeJS.ReadableStream;
stdout: NodeJS.WritableStream;
stderr: NodeJS.WritableStream;
}
function postJsonRpc(
url: string,
body: string,
sessionId: string | undefined,
token: string | undefined,
): Promise<{ status: number; headers: http.IncomingHttpHeaders; body: string }> {
return new Promise((resolve, reject) => {
const parsed = new URL(url);
const headers: Record<string, string> = {
'Content-Type': 'application/json',
'Accept': 'application/json, text/event-stream',
};
if (sessionId) {
headers['mcp-session-id'] = sessionId;
}
if (token) {
headers['Authorization'] = `Bearer ${token}`;
}
const req = http.request(
{
hostname: parsed.hostname,
port: parsed.port,
path: parsed.pathname,
method: 'POST',
headers,
timeout: 30_000,
},
(res) => {
const chunks: Buffer[] = [];
res.on('data', (chunk: Buffer) => chunks.push(chunk));
res.on('end', () => {
resolve({
status: res.statusCode ?? 0,
headers: res.headers,
body: Buffer.concat(chunks).toString('utf-8'),
});
});
},
);
req.on('error', reject);
req.on('timeout', () => {
req.destroy();
reject(new Error('Request timed out'));
});
req.write(body);
req.end();
});
}
function sendDelete(
url: string,
sessionId: string,
token: string | undefined,
): Promise<void> {
return new Promise((resolve) => {
const parsed = new URL(url);
const headers: Record<string, string> = {
'mcp-session-id': sessionId,
};
if (token) {
headers['Authorization'] = `Bearer ${token}`;
}
const req = http.request(
{
hostname: parsed.hostname,
port: parsed.port,
path: parsed.pathname,
method: 'DELETE',
headers,
timeout: 5_000,
},
() => resolve(),
);
req.on('error', () => resolve()); // Best effort cleanup
req.on('timeout', () => {
req.destroy();
resolve();
});
req.end();
});
}
/**
* Extract JSON-RPC messages from an HTTP response body.
* Handles both plain JSON and SSE (text/event-stream) formats.
*/
function extractJsonRpcMessages(contentType: string | undefined, body: string): string[] {
if (contentType?.includes('text/event-stream')) {
// Parse SSE: extract data: lines
const messages: string[] = [];
for (const line of body.split('\n')) {
if (line.startsWith('data: ')) {
messages.push(line.slice(6));
}
}
return messages;
}
// Plain JSON response
return [body];
}
/**
* STDIO-to-Streamable-HTTP MCP bridge.
*
* Reads JSON-RPC messages line-by-line from stdin, POSTs them to
* mcplocal's project endpoint, and writes responses to stdout.
*/
export async function runMcpBridge(opts: McpBridgeOptions): Promise<void> {
const { projectName, mcplocalUrl, token, stdin, stdout, stderr } = opts;
const endpointUrl = `${mcplocalUrl.replace(/\/$/, '')}/projects/${encodeURIComponent(projectName)}/mcp`;
let sessionId: string | undefined;
const rl = createInterface({ input: stdin, crlfDelay: Infinity });
for await (const line of rl) {
const trimmed = line.trim();
if (!trimmed) continue;
try {
const result = await postJsonRpc(endpointUrl, trimmed, sessionId, token);
// Capture session ID from first response
if (!sessionId) {
const sid = result.headers['mcp-session-id'];
if (typeof sid === 'string') {
sessionId = sid;
}
}
if (result.status >= 400) {
stderr.write(`MCP bridge error: HTTP ${result.status}: ${result.body}\n`);
}
// Handle both plain JSON and SSE responses
const messages = extractJsonRpcMessages(result.headers['content-type'], result.body);
for (const msg of messages) {
const trimmedMsg = msg.trim();
if (trimmedMsg) {
stdout.write(trimmedMsg + '\n');
}
}
} catch (err) {
stderr.write(`MCP bridge error: ${err instanceof Error ? err.message : String(err)}\n`);
}
}
// stdin closed — cleanup session
if (sessionId) {
await sendDelete(endpointUrl, sessionId, token);
}
}
export interface McpCommandDeps {
getProject: () => string | undefined;
configLoader?: () => { mcplocalUrl: string };
credentialsLoader?: () => { token: string } | null;
}
export function createMcpCommand(deps: McpCommandDeps): Command {
const cmd = new Command('mcp')
.description('MCP STDIO transport bridge — connects stdin/stdout to a project MCP endpoint')
.passThroughOptions()
.option('-p, --project <name>', 'Project name')
.action(async (opts: { project?: string }) => {
// Accept -p/--project on the command itself, or fall back to global --project
const projectName = opts.project ?? deps.getProject();
if (!projectName) {
process.stderr.write('Error: --project is required for the mcp command\n');
process.exitCode = 1;
return;
}
let mcplocalUrl = 'http://localhost:3200';
if (deps.configLoader) {
mcplocalUrl = deps.configLoader().mcplocalUrl;
} else {
try {
const { loadConfig } = await import('../config/index.js');
mcplocalUrl = loadConfig().mcplocalUrl;
} catch {
// Use default
}
}
let token: string | undefined;
if (deps.credentialsLoader) {
token = deps.credentialsLoader()?.token;
} else {
try {
const { loadCredentials } = await import('../auth/index.js');
token = loadCredentials()?.token;
} catch {
// No credentials
}
}
await runMcpBridge({
projectName,
mcplocalUrl,
token,
stdin: process.stdin,
stdout: process.stdout,
stderr: process.stderr,
});
});
return cmd;
}


@@ -0,0 +1,66 @@
import { Command } from 'commander';
import type { ApiClient } from '../api-client.js';
import { resolveNameOrId, resolveResource } from './shared.js';
export interface ProjectOpsDeps {
client: ApiClient;
log: (...args: string[]) => void;
getProject: () => string | undefined;
}
function requireProject(deps: ProjectOpsDeps): string {
const project = deps.getProject();
if (!project) {
deps.log('Error: --project <name> is required for this command.');
process.exitCode = 1;
throw new Error('--project required');
}
return project;
}
export function createAttachServerCommand(deps: ProjectOpsDeps): Command {
const { client, log } = deps;
return new Command('attach-server')
.description('Attach a server to a project (requires --project)')
.argument('<server-name>', 'Server name to attach')
.action(async (serverName: string) => {
const projectName = requireProject(deps);
const projectId = await resolveNameOrId(client, 'projects', projectName);
await client.post(`/api/v1/projects/${projectId}/servers`, { server: serverName });
log(`server '${serverName}' attached to project '${projectName}'`);
});
}
export function createDetachServerCommand(deps: ProjectOpsDeps): Command {
const { client, log } = deps;
return new Command('detach-server')
.description('Detach a server from a project (requires --project)')
.argument('<server-name>', 'Server name to detach')
.action(async (serverName: string) => {
const projectName = requireProject(deps);
const projectId = await resolveNameOrId(client, 'projects', projectName);
await client.delete(`/api/v1/projects/${projectId}/servers/${serverName}`);
log(`server '${serverName}' detached from project '${projectName}'`);
});
}
export function createApproveCommand(deps: ProjectOpsDeps): Command {
const { client, log } = deps;
return new Command('approve')
.description('Approve a pending prompt request (atomic: delete request, create prompt)')
.argument('<resource>', 'Resource type (promptrequest)')
.argument('<name>', 'Prompt request name or ID')
.action(async (resourceArg: string, nameOrId: string) => {
const resource = resolveResource(resourceArg);
if (resource !== 'promptrequests') {
throw new Error(`approve is only supported for 'promptrequest', got '${resourceArg}'`);
}
const id = await resolveNameOrId(client, 'promptrequests', nameOrId);
const prompt = await client.post<{ id: string; name: string }>(`/api/v1/promptrequests/${id}/approve`, {});
log(`prompt request approved → prompt '${prompt.name}' created (id: ${prompt.id})`);
});
}


@@ -1,15 +0,0 @@
import { Command } from 'commander';
import type { ApiClient } from '../api-client.js';
export interface ProjectCommandDeps {
client: ApiClient;
log: (...args: unknown[]) => void;
}
export function createProjectCommand(_deps: ProjectCommandDeps): Command {
const cmd = new Command('project')
.alias('proj')
.description('Project-specific actions (create with "create project", list with "get projects")');
return cmd;
}


@@ -11,6 +11,16 @@ export const RESOURCE_ALIASES: Record<string, string> = {
sec: 'secrets',
template: 'templates',
tpl: 'templates',
user: 'users',
group: 'groups',
rbac: 'rbac',
'rbac-definition': 'rbac',
'rbac-binding': 'rbac',
prompt: 'prompts',
prompts: 'prompts',
promptrequest: 'promptrequests',
promptrequests: 'promptrequests',
pr: 'promptrequests',
};
export function resolveResource(name: string): string {
@@ -28,9 +38,23 @@ export async function resolveNameOrId(
if (/^c[a-z0-9]{24}/.test(nameOrId)) {
return nameOrId;
}
// Users resolve by email, not name
if (resource === 'users') {
const items = await client.get<Array<{ id: string; email: string }>>(`/api/v1/${resource}`);
const match = items.find((item) => item.email === nameOrId);
if (match) return match.id;
throw new Error(`user '${nameOrId}' not found`);
}
const items = await client.get<Array<Record<string, unknown>>>(`/api/v1/${resource}`);
const match = items.find((item) => {
// Instances use server.name, other resources use name directly
if (resource === 'instances') {
const server = item.server as { name?: string } | undefined;
return server?.name === nameOrId;
}
return item.name === nameOrId;
});
if (match) return match.id as string;
throw new Error(`${resource.replace(/s$/, '')} '${nameOrId}' not found`);
}
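resolveNameOrId short-circuits on CUID-shaped strings before falling back to a name lookup. The heuristic in isolation (the sample strings are invented):

```typescript
// CUIDs start with 'c' followed by at least 24 lowercase alphanumerics,
// so anything matching is passed through as an ID without an API lookup.
const looksLikeId = (s: string): boolean => /^c[a-z0-9]{24}/.test(s);

console.log(looksLikeId('cm3x9k2lq0000abcdefghijkl')); // true: treated as an ID
console.log(looksLikeId('my-server')); // false: resolved by name instead
```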


@@ -7,11 +7,22 @@ import type { CredentialsDeps } from '../auth/index.js';
import { formatJson, formatYaml } from '../formatters/index.js';
import { APP_VERSION } from '@mcpctl/shared';
// ANSI helpers
const GREEN = '\x1b[32m';
const RED = '\x1b[31m';
const DIM = '\x1b[2m';
const RESET = '\x1b[0m';
const CLEAR_LINE = '\x1b[2K\r';
export interface StatusCommandDeps {
configDeps: Partial<ConfigLoaderDeps>;
credentialsDeps: Partial<CredentialsDeps>;
log: (...args: string[]) => void;
write: (text: string) => void;
checkHealth: (url: string) => Promise<boolean>;
/** Check LLM health via mcplocal's /llm/health endpoint */
checkLlm: (mcplocalUrl: string) => Promise<string>;
isTTY: boolean;
}
function defaultCheckHealth(url: string): Promise<boolean> {
@@ -28,15 +39,51 @@ function defaultCheckHealth(url: string): Promise<boolean> {
});
}
/**
* Check LLM health by querying mcplocal's /llm/health endpoint.
* This tests the actual provider running inside the daemon (uses persistent ACP for gemini, etc.)
*/
function defaultCheckLlm(mcplocalUrl: string): Promise<string> {
return new Promise((resolve) => {
const req = http.get(`${mcplocalUrl}/llm/health`, { timeout: 30000 }, (res) => {
const chunks: Buffer[] = [];
res.on('data', (chunk: Buffer) => chunks.push(chunk));
res.on('end', () => {
try {
const body = JSON.parse(Buffer.concat(chunks).toString('utf-8')) as { status: string; error?: string };
if (body.status === 'ok') {
resolve('ok');
} else if (body.status === 'not configured') {
resolve('not configured');
} else if (body.error) {
resolve(body.error.slice(0, 80));
} else {
resolve(body.status);
}
} catch {
resolve('invalid response');
}
});
});
req.on('error', () => resolve('mcplocal unreachable'));
req.on('timeout', () => { req.destroy(); resolve('timeout'); });
});
}
const SPINNER_FRAMES = ['⠋', '⠙', '⠹', '⠸', '⠼', '⠴', '⠦', '⠧', '⠇', '⠏'];
const defaultDeps: StatusCommandDeps = {
configDeps: {},
credentialsDeps: {},
log: (...args) => console.log(...args),
write: (text) => process.stdout.write(text),
checkHealth: defaultCheckHealth,
checkLlm: defaultCheckLlm,
isTTY: process.stdout.isTTY ?? false,
};
export function createStatusCommand(deps?: Partial<StatusCommandDeps>): Command {
const { configDeps, credentialsDeps, log, write, checkHealth, checkLlm, isTTY } = { ...defaultDeps, ...deps };
return new Command('status')
.description('Show mcpctl status and connectivity')
@@ -45,11 +92,22 @@ export function createStatusCommand(deps?: Partial<StatusCommandDeps>): Command
const config = loadConfig(configDeps);
const creds = loadCredentials(credentialsDeps);
const llmLabel = config.llm && config.llm.provider !== 'none'
? `${config.llm.provider}${config.llm.model ? ` / ${config.llm.model}` : ''}`
: null;
if (opts.output !== 'table') {
// JSON/YAML: run everything in parallel, wait, output at once
const [mcplocalReachable, mcpdReachable, llmStatus] = await Promise.all([
checkHealth(config.mcplocalUrl),
checkHealth(config.mcpdUrl),
llmLabel ? checkLlm(config.mcplocalUrl) : Promise.resolve(null),
]);
const llm = llmLabel
? llmStatus === 'ok' ? llmLabel : `${llmLabel} (${llmStatus})`
: null;
const status = {
version: APP_VERSION,
mcplocalUrl: config.mcplocalUrl,
@@ -59,19 +117,60 @@ export function createStatusCommand(deps?: Partial<StatusCommandDeps>): Command
auth: creds ? { user: creds.user } : null,
registries: config.registries,
outputFormat: config.outputFormat,
llm,
llmStatus,
};
log(opts.output === 'json' ? formatJson(status) : formatYaml(status));
return;
}
// Table format: print lines progressively, LLM last with spinner
// Fast health checks first
const [mcplocalReachable, mcpdReachable] = await Promise.all([
checkHealth(config.mcplocalUrl),
checkHealth(config.mcpdUrl),
]);
log(`mcpctl v${APP_VERSION}`);
log(`mcplocal: ${config.mcplocalUrl} (${mcplocalReachable ? 'connected' : 'unreachable'})`);
log(`mcpd: ${config.mcpdUrl} (${mcpdReachable ? 'connected' : 'unreachable'})`);
log(`Auth: ${creds ? `logged in as ${creds.user}` : 'not logged in'}`);
log(`Registries: ${config.registries.join(', ')}`);
log(`Output: ${config.outputFormat}`);
if (!llmLabel) {
log(`LLM: not configured (run 'mcpctl config setup')`);
return;
}
// LLM check with spinner — queries mcplocal's /llm/health endpoint
const llmPromise = checkLlm(config.mcplocalUrl);
if (isTTY) {
let frame = 0;
const interval = setInterval(() => {
write(`${CLEAR_LINE}LLM: ${llmLabel} ${DIM}${SPINNER_FRAMES[frame % SPINNER_FRAMES.length]} checking...${RESET}`);
frame++;
}, 80);
const llmStatus = await llmPromise;
clearInterval(interval);
if (llmStatus === 'ok' || llmStatus === 'ok (key stored)') {
write(`${CLEAR_LINE}LLM: ${llmLabel} ${GREEN}${llmStatus}${RESET}\n`);
} else {
write(`${CLEAR_LINE}LLM: ${llmLabel} ${RED}${llmStatus}${RESET}\n`);
}
} else {
// Non-TTY: no spinner or color, just wait and print
const llmStatus = await llmPromise;
log(`LLM: ${llmLabel} ${llmStatus}`);
}
});
}
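The TTY branch redraws a single status line on an 80 ms interval and clears it once the health check settles. The same pattern as a self-contained helper (`withSpinner` is a name invented for this sketch):

```typescript
const FRAMES = ['⠋', '⠙', '⠹', '⠸', '⠼', '⠴', '⠦', '⠧', '⠇', '⠏'];
const CLEAR = '\x1b[2K\r';

// Redraw `label` with a spinner frame until `task` settles, then clear the line.
async function withSpinner<T>(label: string, task: Promise<T>): Promise<T> {
  let frame = 0;
  const timer = setInterval(() => {
    process.stdout.write(`${CLEAR}${label} ${FRAMES[frame++ % FRAMES.length]} checking...`);
  }, 80);
  try {
    return await task;
  } finally {
    clearInterval(timer); // stop redrawing whether the task resolved or threw
    process.stdout.write(CLEAR);
  }
}

const result = await withSpinner('LLM:', new Promise<string>((r) => setTimeout(() => r('ok'), 200)));
console.log(result); // ok
```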


@@ -1,4 +1,4 @@
export { McpctlConfigSchema, LlmConfigSchema, LLM_PROVIDERS, DEFAULT_CONFIG } from './schema.js';
export type { McpctlConfig, LlmConfig, LlmProviderName } from './schema.js';
export { loadConfig, saveConfig, mergeConfig, getConfigPath } from './loader.js';
export type { ConfigLoaderDeps } from './loader.js';


@@ -1,5 +1,21 @@
import { z } from 'zod';
export const LLM_PROVIDERS = ['gemini-cli', 'ollama', 'anthropic', 'openai', 'deepseek', 'vllm', 'none'] as const;
export type LlmProviderName = typeof LLM_PROVIDERS[number];
export const LlmConfigSchema = z.object({
/** LLM provider name */
provider: z.enum(LLM_PROVIDERS),
/** Model name */
model: z.string().optional(),
/** Provider URL (for ollama, vllm, openai with custom endpoint) */
url: z.string().optional(),
/** Binary path override (for gemini-cli) */
binaryPath: z.string().optional(),
}).strict();
export type LlmConfig = z.infer<typeof LlmConfigSchema>;
export const McpctlConfigSchema = z.object({
/** mcplocal daemon endpoint (local LLM pre-processing proxy) */
mcplocalUrl: z.string().default('http://localhost:3200'),
@@ -19,6 +35,8 @@ export const McpctlConfigSchema = z.object({
outputFormat: z.enum(['table', 'json', 'yaml']).default('table'),
/** Smithery API key */
smitheryApiKey: z.string().optional(),
/** LLM provider configuration for smart features (pagination summaries, etc.) */
llm: LlmConfigSchema.optional(),
}).transform((cfg) => {
// Backward compatibility: if old daemonUrl is set but mcplocalUrl wasn't explicitly changed,
// use daemonUrl as mcplocalUrl


@@ -10,10 +10,10 @@ import { createLogsCommand } from './commands/logs.js';
import { createApplyCommand } from './commands/apply.js';
import { createCreateCommand } from './commands/create.js';
import { createEditCommand } from './commands/edit.js';
import { createBackupCommand, createRestoreCommand } from './commands/backup.js';
import { createLoginCommand, createLogoutCommand } from './commands/auth.js';
import { createAttachServerCommand, createDetachServerCommand, createApproveCommand } from './commands/project-ops.js';
import { createMcpCommand } from './commands/mcp.js';
import { ApiClient, ApiError } from './api-client.js';
import { loadConfig } from './config/index.js';
import { loadCredentials } from './auth/index.js';
@@ -26,9 +26,9 @@ export function createProgram(): Command {
.version(APP_VERSION, '-v, --version')
.enablePositionalOptions()
.option('--daemon-url <url>', 'mcplocal daemon URL')
.option('--direct', 'bypass mcplocal and connect directly to mcpd')
.option('--project <name>', 'Target project for project commands');
program.addCommand(createStatusCommand());
program.addCommand(createLoginCommand());
program.addCommand(createLogoutCommand());
@@ -48,7 +48,28 @@ export function createProgram(): Command {
const client = new ApiClient({ baseUrl, token: creds?.token ?? undefined });
program.addCommand(createConfigCommand(undefined, {
client,
credentialsDeps: {},
log: (...args) => console.log(...args),
}));
const fetchResource = async (resource: string, nameOrId?: string): Promise<unknown[]> => {
const projectName = program.opts().project as string | undefined;
// --project scoping for servers and instances
if (projectName && !nameOrId && (resource === 'servers' || resource === 'instances')) {
const projectId = await resolveNameOrId(client, 'projects', projectName);
if (resource === 'servers') {
return client.get<unknown[]>(`/api/v1/projects/${projectId}/servers`);
}
// instances: fetch project servers, then filter instances by serverId
const projectServers = await client.get<Array<{ id: string }>>(`/api/v1/projects/${projectId}/servers`);
const serverIds = new Set(projectServers.map((s) => s.id));
const allInstances = await client.get<Array<{ serverId: string }>>(`/api/v1/instances`);
return allInstances.filter((inst) => serverIds.has(inst.serverId));
}
if (nameOrId) {
// Glob pattern — use query param filtering
if (nameOrId.includes('*')) {
@@ -113,16 +134,6 @@ export function createProgram(): Command {
log: (...args) => console.log(...args),
}));
program.addCommand(createClaudeCommand({
client,
log: (...args) => console.log(...args),
}));
program.addCommand(createProjectCommand({
client,
log: (...args) => console.log(...args),
}));
program.addCommand(createBackupCommand({
client,
log: (...args) => console.log(...args),
@@ -133,6 +144,18 @@ export function createProgram(): Command {
log: (...args) => console.log(...args),
}));
const projectOpsDeps = {
client,
log: (...args: string[]) => console.log(...args),
getProject: () => program.opts().project as string | undefined,
};
program.addCommand(createAttachServerCommand(projectOpsDeps), { hidden: true });
program.addCommand(createDetachServerCommand(projectOpsDeps), { hidden: true });
program.addCommand(createApproveCommand(projectOpsDeps));
program.addCommand(createMcpCommand({
getProject: () => program.opts().project as string | undefined,
}), { hidden: true });
return program;
}
@@ -145,14 +168,28 @@ const isDirectRun =
if (isDirectRun) {
createProgram().parseAsync(process.argv).catch((err: unknown) => {
if (err instanceof ApiError) {
if (err.status === 401) {
console.error("Error: you need to log in. Run 'mcpctl login' to authenticate.");
} else if (err.status === 403) {
console.error('Error: permission denied. You do not have access to this resource.');
} else {
let msg: string;
try {
const parsed = JSON.parse(err.body) as { error?: string; message?: string; details?: unknown };
msg = parsed.error ?? parsed.message ?? err.body;
if (parsed.details && Array.isArray(parsed.details)) {
const issues = parsed.details as Array<{ message?: string; path?: string[] }>;
const detail = issues.map((i) => {
const path = i.path?.join('.') ?? '';
return path ? `${path}: ${i.message}` : (i.message ?? '');
}).filter(Boolean).join('; ');
if (detail) msg += `: ${detail}`;
}
} catch {
msg = err.body;
}
console.error(`Error: ${msg}`);
}
} else if (err instanceof Error) {
console.error(`Error: ${err.message}`);
} else {


@@ -21,6 +21,16 @@ beforeAll(async () => {
res.writeHead(201, { 'Content-Type': 'application/json' });
res.end(JSON.stringify({ id: 'srv-new', ...body }));
});
} else if (req.url === '/api/v1/servers/srv-1' && req.method === 'DELETE') {
// Fastify rejects empty body with Content-Type: application/json
const ct = req.headers['content-type'] ?? '';
if (ct.includes('application/json')) {
res.writeHead(400, { 'Content-Type': 'application/json' });
res.end(JSON.stringify({ error: "Body cannot be empty when content-type is set to 'application/json'" }));
} else {
res.writeHead(204);
res.end();
}
} else if (req.url === '/api/v1/missing' && req.method === 'GET') {
res.writeHead(404, { 'Content-Type': 'application/json' });
res.end(JSON.stringify({ error: 'Not found' }));
@@ -75,6 +85,12 @@ describe('ApiClient', () => {
await expect(client.get('/anything')).rejects.toThrow();
});
it('performs DELETE without Content-Type header', async () => {
const client = new ApiClient({ baseUrl: `http://localhost:${port}` });
// Should succeed (204) because no Content-Type is sent on bodyless DELETE
await expect(client.delete('/api/v1/servers/srv-1')).resolves.toBeUndefined();
});
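The mock above encodes a real Fastify behavior: an empty body sent with Content-Type: application/json is rejected, so bodyless DELETEs must omit the header. A standalone round trip demonstrating the rule (the server and route are stand-ins, not the real API):

```typescript
import http from 'node:http';
import { once } from 'node:events';

// Stand-in server: accept a DELETE only when no Content-Type header is sent.
const server = http.createServer((req, res) => {
  if (req.method === 'DELETE' && !req.headers['content-type']) {
    res.writeHead(204);
  } else {
    res.writeHead(400, { 'Content-Type': 'application/json' });
  }
  res.end();
});
server.listen(0);
await once(server, 'listening');
const port = (server.address() as { port: number }).port;

const status = await new Promise<number>((resolve) => {
  const req = http.request({ port, method: 'DELETE', path: '/api/v1/servers/srv-1' }, (res) => {
    res.resume();
    resolve(res.statusCode ?? 0);
  });
  req.end(); // no body, so no Content-Type header is added
});
server.close();
console.log(status); // 204
```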
it('sends Authorization header when token provided', async () => {
// We need a separate server to check the header
let receivedAuth = '';


@@ -159,4 +159,347 @@ projects:
rmSync(tmpDir, { recursive: true, force: true });
});
it('applies users (no role field)', async () => {
const configPath = join(tmpDir, 'config.yaml');
writeFileSync(configPath, `
users:
- email: alice@test.com
password: password123
name: Alice
`);
const cmd = createApplyCommand({ client, log });
await cmd.parseAsync([configPath], { from: 'user' });
const callBody = vi.mocked(client.post).mock.calls[0]![1] as Record<string, unknown>;
expect(callBody).toEqual(expect.objectContaining({
email: 'alice@test.com',
password: 'password123',
name: 'Alice',
}));
expect(callBody).not.toHaveProperty('role');
expect(output.join('\n')).toContain('Created user: alice@test.com');
rmSync(tmpDir, { recursive: true, force: true });
});
it('updates existing users matched by email', async () => {
vi.mocked(client.get).mockImplementation(async (url: string) => {
if (url === '/api/v1/users') return [{ id: 'usr-1', email: 'alice@test.com' }];
return [];
});
const configPath = join(tmpDir, 'config.yaml');
writeFileSync(configPath, `
users:
- email: alice@test.com
password: newpassword
name: Alice Updated
`);
const cmd = createApplyCommand({ client, log });
await cmd.parseAsync([configPath], { from: 'user' });
expect(client.put).toHaveBeenCalledWith('/api/v1/users/usr-1', expect.objectContaining({
email: 'alice@test.com',
name: 'Alice Updated',
}));
expect(output.join('\n')).toContain('Updated user: alice@test.com');
rmSync(tmpDir, { recursive: true, force: true });
});
it('applies groups', async () => {
const configPath = join(tmpDir, 'config.yaml');
writeFileSync(configPath, `
groups:
- name: dev-team
description: Development team
members:
- alice@test.com
- bob@test.com
`);
const cmd = createApplyCommand({ client, log });
await cmd.parseAsync([configPath], { from: 'user' });
expect(client.post).toHaveBeenCalledWith('/api/v1/groups', expect.objectContaining({
name: 'dev-team',
description: 'Development team',
members: ['alice@test.com', 'bob@test.com'],
}));
expect(output.join('\n')).toContain('Created group: dev-team');
rmSync(tmpDir, { recursive: true, force: true });
});
it('updates existing groups', async () => {
vi.mocked(client.get).mockImplementation(async (url: string) => {
if (url === '/api/v1/groups') return [{ id: 'grp-1', name: 'dev-team' }];
return [];
});
const configPath = join(tmpDir, 'config.yaml');
writeFileSync(configPath, `
groups:
- name: dev-team
description: Updated devs
members:
- new@test.com
`);
const cmd = createApplyCommand({ client, log });
await cmd.parseAsync([configPath], { from: 'user' });
expect(client.put).toHaveBeenCalledWith('/api/v1/groups/grp-1', expect.objectContaining({
name: 'dev-team',
description: 'Updated devs',
}));
expect(output.join('\n')).toContain('Updated group: dev-team');
rmSync(tmpDir, { recursive: true, force: true });
});
it('applies rbacBindings', async () => {
const configPath = join(tmpDir, 'config.yaml');
writeFileSync(configPath, `
rbac:
- name: developers
subjects:
- kind: User
name: alice@test.com
- kind: Group
name: dev-team
roleBindings:
- role: edit
resource: servers
- role: view
resource: instances
`);
const cmd = createApplyCommand({ client, log });
await cmd.parseAsync([configPath], { from: 'user' });
expect(client.post).toHaveBeenCalledWith('/api/v1/rbac', expect.objectContaining({
name: 'developers',
subjects: [
{ kind: 'User', name: 'alice@test.com' },
{ kind: 'Group', name: 'dev-team' },
],
roleBindings: [
{ role: 'edit', resource: 'servers' },
{ role: 'view', resource: 'instances' },
],
}));
expect(output.join('\n')).toContain('Created rbacBinding: developers');
rmSync(tmpDir, { recursive: true, force: true });
});
it('updates existing rbacBindings', async () => {
vi.mocked(client.get).mockImplementation(async (url: string) => {
if (url === '/api/v1/rbac') return [{ id: 'rbac-1', name: 'developers' }];
return [];
});
const configPath = join(tmpDir, 'config.yaml');
writeFileSync(configPath, `
rbacBindings:
- name: developers
subjects:
- kind: User
name: new@test.com
roleBindings:
- role: edit
resource: "*"
`);
const cmd = createApplyCommand({ client, log });
await cmd.parseAsync([configPath], { from: 'user' });
expect(client.put).toHaveBeenCalledWith('/api/v1/rbac/rbac-1', expect.objectContaining({
name: 'developers',
}));
expect(output.join('\n')).toContain('Updated rbacBinding: developers');
rmSync(tmpDir, { recursive: true, force: true });
});
it('applies projects with servers', async () => {
const configPath = join(tmpDir, 'config.yaml');
writeFileSync(configPath, `
projects:
- name: smart-home
description: Home automation
proxyMode: filtered
llmProvider: gemini-cli
llmModel: gemini-2.0-flash
servers:
- my-grafana
- my-ha
`);
const cmd = createApplyCommand({ client, log });
await cmd.parseAsync([configPath], { from: 'user' });
expect(client.post).toHaveBeenCalledWith('/api/v1/projects', expect.objectContaining({
name: 'smart-home',
proxyMode: 'filtered',
llmProvider: 'gemini-cli',
llmModel: 'gemini-2.0-flash',
servers: ['my-grafana', 'my-ha'],
}));
expect(output.join('\n')).toContain('Created project: smart-home');
rmSync(tmpDir, { recursive: true, force: true });
});
it('dry-run shows all new resource types', async () => {
const configPath = join(tmpDir, 'config.yaml');
writeFileSync(configPath, `
secrets:
- name: creds
data:
TOKEN: abc
users:
- email: alice@test.com
password: password123
groups:
- name: dev-team
members: []
projects:
- name: my-proj
description: A project
rbacBindings:
- name: admins
subjects:
- kind: User
name: admin@test.com
roleBindings:
- role: edit
resource: "*"
`);
const cmd = createApplyCommand({ client, log });
await cmd.parseAsync([configPath, '--dry-run'], { from: 'user' });
expect(client.post).not.toHaveBeenCalled();
const text = output.join('\n');
expect(text).toContain('Dry run');
expect(text).toContain('1 secret(s)');
expect(text).toContain('1 user(s)');
expect(text).toContain('1 group(s)');
expect(text).toContain('1 project(s)');
expect(text).toContain('1 rbacBinding(s)');
rmSync(tmpDir, { recursive: true, force: true });
});
it('applies resources in correct order', async () => {
const callOrder: string[] = [];
vi.mocked(client.post).mockImplementation(async (url: string) => {
callOrder.push(url);
return { id: 'new-id', name: 'test' };
});
const configPath = join(tmpDir, 'config.yaml');
writeFileSync(configPath, `
rbacBindings:
- name: admins
subjects:
- kind: User
name: admin@test.com
roleBindings:
- role: edit
resource: "*"
users:
- email: admin@test.com
password: password123
secrets:
- name: creds
data:
KEY: val
groups:
- name: dev-team
servers:
- name: my-server
transport: STDIO
projects:
- name: my-proj
`);
const cmd = createApplyCommand({ client, log });
await cmd.parseAsync([configPath], { from: 'user' });
// Apply order: secrets → servers → users → groups → projects → templates → rbacBindings
expect(callOrder[0]).toBe('/api/v1/secrets');
expect(callOrder[1]).toBe('/api/v1/servers');
expect(callOrder[2]).toBe('/api/v1/users');
expect(callOrder[3]).toBe('/api/v1/groups');
expect(callOrder[4]).toBe('/api/v1/projects');
expect(callOrder[5]).toBe('/api/v1/rbac');
rmSync(tmpDir, { recursive: true, force: true });
});
it('applies rbac with operation bindings', async () => {
const configPath = join(tmpDir, 'config.yaml');
writeFileSync(configPath, `
rbac:
- name: ops-team
subjects:
- kind: Group
name: ops
roleBindings:
- role: edit
resource: servers
- role: run
action: backup
- role: run
action: logs
`);
const cmd = createApplyCommand({ client, log });
await cmd.parseAsync([configPath], { from: 'user' });
expect(client.post).toHaveBeenCalledWith('/api/v1/rbac', expect.objectContaining({
name: 'ops-team',
roleBindings: [
{ role: 'edit', resource: 'servers' },
{ role: 'run', action: 'backup' },
{ role: 'run', action: 'logs' },
],
}));
expect(output.join('\n')).toContain('Created rbacBinding: ops-team');
rmSync(tmpDir, { recursive: true, force: true });
});
it('applies rbac with name-scoped resource binding', async () => {
const configPath = join(tmpDir, 'config.yaml');
writeFileSync(configPath, `
rbac:
- name: ha-viewer
subjects:
- kind: User
name: alice@test.com
roleBindings:
- role: view
resource: servers
name: my-ha
`);
const cmd = createApplyCommand({ client, log });
await cmd.parseAsync([configPath], { from: 'user' });
expect(client.post).toHaveBeenCalledWith('/api/v1/rbac', expect.objectContaining({
name: 'ha-viewer',
roleBindings: [
{ role: 'view', resource: 'servers', name: 'my-ha' },
],
}));
rmSync(tmpDir, { recursive: true, force: true });
});
});


@@ -37,6 +37,8 @@ describe('login command', () => {
user: { email },
}),
logoutRequest: async () => {},
statusRequest: async () => ({ hasUsers: true }),
bootstrapRequest: async () => ({ token: '', user: { email: '' } }),
}); });
await cmd.parseAsync([], { from: 'user' }); await cmd.parseAsync([], { from: 'user' });
expect(output[0]).toContain('Logged in as alice@test.com'); expect(output[0]).toContain('Logged in as alice@test.com');
@@ -58,6 +60,8 @@ describe('login command', () => {
     log,
     loginRequest: async () => { throw new Error('Invalid credentials'); },
     logoutRequest: async () => {},
+    statusRequest: async () => ({ hasUsers: true }),
+    bootstrapRequest: async () => ({ token: '', user: { email: '' } }),
   });
   await cmd.parseAsync([], { from: 'user' });
   expect(output[0]).toContain('Login failed');
@@ -83,6 +87,8 @@ describe('login command', () => {
       return { token: 'tok', user: { email } };
     },
     logoutRequest: async () => {},
+    statusRequest: async () => ({ hasUsers: true }),
+    bootstrapRequest: async () => ({ token: '', user: { email: '' } }),
   });
   await cmd.parseAsync([], { from: 'user' });
   expect(capturedUrl).toBe('http://custom:3100');
@@ -103,12 +109,74 @@ describe('login command', () => {
       return { token: 'tok', user: { email } };
     },
     logoutRequest: async () => {},
+    statusRequest: async () => ({ hasUsers: true }),
+    bootstrapRequest: async () => ({ token: '', user: { email: '' } }),
   });
   await cmd.parseAsync(['--mcpd-url', 'http://override:3100'], { from: 'user' });
   expect(capturedUrl).toBe('http://override:3100');
 });
 });
+describe('login bootstrap flow', () => {
+  it('bootstraps first admin when no users exist', async () => {
+    let bootstrapCalled = false;
+    const cmd = createLoginCommand({
+      configDeps: { configDir: tempDir },
+      credentialsDeps: { configDir: tempDir },
+      prompt: {
+        input: async (msg) => {
+          if (msg.includes('Name')) return 'Admin User';
+          return 'admin@test.com';
+        },
+        password: async () => 'admin-pass',
+      },
+      log,
+      loginRequest: async () => ({ token: '', user: { email: '' } }),
+      logoutRequest: async () => {},
+      statusRequest: async () => ({ hasUsers: false }),
+      bootstrapRequest: async (_url, email, _password) => {
+        bootstrapCalled = true;
+        return { token: 'admin-token', user: { email } };
+      },
+    });
+    await cmd.parseAsync([], { from: 'user' });
+    expect(bootstrapCalled).toBe(true);
+    expect(output.join('\n')).toContain('No users configured');
+    expect(output.join('\n')).toContain('admin@test.com');
+    expect(output.join('\n')).toContain('admin');
+    const creds = loadCredentials({ configDir: tempDir });
+    expect(creds).not.toBeNull();
+    expect(creds!.token).toBe('admin-token');
+    expect(creds!.user).toBe('admin@test.com');
+  });
+  it('falls back to normal login when users exist', async () => {
+    let loginCalled = false;
+    const cmd = createLoginCommand({
+      configDeps: { configDir: tempDir },
+      credentialsDeps: { configDir: tempDir },
+      prompt: {
+        input: async () => 'alice@test.com',
+        password: async () => 'secret',
+      },
+      log,
+      loginRequest: async (_url, email) => {
+        loginCalled = true;
+        return { token: 'session-tok', user: { email } };
+      },
+      logoutRequest: async () => {},
+      statusRequest: async () => ({ hasUsers: true }),
+      bootstrapRequest: async () => { throw new Error('Should not be called'); },
+    });
+    await cmd.parseAsync([], { from: 'user' });
+    expect(loginCalled).toBe(true);
+    expect(output.join('\n')).not.toContain('No users configured');
+  });
+});
 describe('logout command', () => {
   it('removes credentials on logout', async () => {
     saveCredentials({ token: 'tok', mcpdUrl: 'http://x:3100', user: 'alice' }, { configDir: tempDir });
@@ -120,6 +188,8 @@ describe('logout command', () => {
     log,
     loginRequest: async () => ({ token: '', user: { email: '' } }),
     logoutRequest: async () => { logoutCalled = true; },
+    statusRequest: async () => ({ hasUsers: true }),
+    bootstrapRequest: async () => ({ token: '', user: { email: '' } }),
   });
   await cmd.parseAsync([], { from: 'user' });
   expect(output[0]).toContain('Logged out successfully');
@@ -137,6 +207,8 @@ describe('logout command', () => {
     log,
     loginRequest: async () => ({ token: '', user: { email: '' } }),
     logoutRequest: async () => {},
+    statusRequest: async () => ({ hasUsers: true }),
+    bootstrapRequest: async () => ({ token: '', user: { email: '' } }),
   });
   await cmd.parseAsync([], { from: 'user' });
   expect(output[0]).toContain('Not logged in');
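The `statusRequest`/`bootstrapRequest` stubs threaded through these hunks imply a first-run decision in the login command: ask the server whether any users exist, and bootstrap the initial admin if not. A minimal sketch of that flow, under the assumption that the real command wires these dependencies the way the mocks suggest (names follow the mocks, not necessarily the implementation):

```typescript
type Session = { token: string; user: { email: string } };

interface LoginFlowDeps {
  statusRequest: (url: string) => Promise<{ hasUsers: boolean }>;
  bootstrapRequest: (url: string, email: string, password: string) => Promise<Session>;
  loginRequest: (url: string, email: string, password: string) => Promise<Session>;
}

async function loginOrBootstrap(
  deps: LoginFlowDeps,
  url: string,
  email: string,
  password: string,
): Promise<{ session: Session; bootstrapped: boolean }> {
  // First install: no users configured yet, so create the initial admin.
  const { hasUsers } = await deps.statusRequest(url);
  if (!hasUsers) {
    return { session: await deps.bootstrapRequest(url, email, password), bootstrapped: true };
  }
  // Normal path: authenticate an existing user.
  return { session: await deps.loginRequest(url, email, password), bootstrapped: false };
}
```

This is why every pre-existing test gains the two extra stubs: the command now always probes `statusRequest` before deciding which request to send.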


@@ -1,56 +1,67 @@
-import { describe, it, expect, vi, beforeEach } from 'vitest';
+import { describe, it, expect, vi, beforeEach, afterEach } from 'vitest';
 import { writeFileSync, readFileSync, mkdtempSync, rmSync } from 'node:fs';
 import { join } from 'node:path';
 import { tmpdir } from 'node:os';
-import { createClaudeCommand } from '../../src/commands/claude.js';
+import { createConfigCommand } from '../../src/commands/config.js';
 import type { ApiClient } from '../../src/api-client.js';
+import { saveCredentials, loadCredentials } from '../../src/auth/index.js';
 function mockClient(): ApiClient {
   return {
-    get: vi.fn(async () => ({
-      mcpServers: {
-        'slack--default': { command: 'npx', args: ['-y', '@anthropic/slack-mcp'], env: { WORKSPACE: 'test' } },
-        'github--default': { command: 'npx', args: ['-y', '@anthropic/github-mcp'] },
-      },
-    })),
-    post: vi.fn(async () => ({})),
+    get: vi.fn(async () => ({})),
+    post: vi.fn(async () => ({ token: 'impersonated-tok', user: { email: 'other@test.com' } })),
     put: vi.fn(async () => ({})),
     delete: vi.fn(async () => {}),
   } as unknown as ApiClient;
 }
-describe('claude command', () => {
+describe('config claude', () => {
   let client: ReturnType<typeof mockClient>;
   let output: string[];
   let tmpDir: string;
-  const log = (...args: unknown[]) => output.push(args.map(String).join(' '));
+  const log = (...args: string[]) => output.push(args.join(' '));
   beforeEach(() => {
     client = mockClient();
     output = [];
-    tmpDir = mkdtempSync(join(tmpdir(), 'mcpctl-claude-'));
+    tmpDir = mkdtempSync(join(tmpdir(), 'mcpctl-config-claude-'));
   });
-  describe('generate', () => {
-    it('generates .mcp.json from project config', async () => {
-      const outPath = join(tmpDir, '.mcp.json');
-      const cmd = createClaudeCommand({ client, log });
-      await cmd.parseAsync(['generate', 'proj-1', '-o', outPath], { from: 'user' });
-      expect(client.get).toHaveBeenCalledWith('/api/v1/projects/proj-1/mcp-config');
-      const written = JSON.parse(readFileSync(outPath, 'utf-8'));
-      expect(written.mcpServers['slack--default']).toBeDefined();
-      expect(output.join('\n')).toContain('2 server(s)');
+  afterEach(() => {
     rmSync(tmpDir, { recursive: true, force: true });
   });
+  it('generates .mcp.json with mcpctl mcp bridge entry', async () => {
+    const outPath = join(tmpDir, '.mcp.json');
+    const cmd = createConfigCommand(
+      { configDeps: { configDir: tmpDir }, log },
+      { client, credentialsDeps: { configDir: tmpDir }, log },
+    );
+    await cmd.parseAsync(['claude', '--project', 'homeautomation', '-o', outPath], { from: 'user' });
+    // No API call should be made
+    expect(client.get).not.toHaveBeenCalled();
+    const written = JSON.parse(readFileSync(outPath, 'utf-8'));
+    expect(written.mcpServers['homeautomation']).toEqual({
+      command: 'mcpctl',
+      args: ['mcp', '-p', 'homeautomation'],
+    });
+    expect(output.join('\n')).toContain('1 server(s)');
+  });
   it('prints to stdout with --stdout', async () => {
-    const cmd = createClaudeCommand({ client, log });
-    await cmd.parseAsync(['generate', 'proj-1', '--stdout'], { from: 'user' });
-    expect(output[0]).toContain('mcpServers');
-    rmSync(tmpDir, { recursive: true, force: true });
+    const cmd = createConfigCommand(
+      { configDeps: { configDir: tmpDir }, log },
+      { client, credentialsDeps: { configDir: tmpDir }, log },
+    );
+    await cmd.parseAsync(['claude', '--project', 'myproj', '--stdout'], { from: 'user' });
+    const parsed = JSON.parse(output[0]);
+    expect(parsed.mcpServers['myproj']).toEqual({
+      command: 'mcpctl',
+      args: ['mcp', '-p', 'myproj'],
+    });
   });
   it('merges with existing .mcp.json', async () => {
@@ -59,100 +70,123 @@ describe('claude command', () => {
     mcpServers: { 'existing--server': { command: 'echo', args: [] } },
   }));
-  const cmd = createClaudeCommand({ client, log });
-  await cmd.parseAsync(['generate', 'proj-1', '-o', outPath, '--merge'], { from: 'user' });
+  const cmd = createConfigCommand(
+    { configDeps: { configDir: tmpDir }, log },
+    { client, credentialsDeps: { configDir: tmpDir }, log },
+  );
+  await cmd.parseAsync(['claude', '--project', 'proj-1', '-o', outPath, '--merge'], { from: 'user' });
   const written = JSON.parse(readFileSync(outPath, 'utf-8'));
   expect(written.mcpServers['existing--server']).toBeDefined();
-  expect(written.mcpServers['slack--default']).toBeDefined();
-  expect(output.join('\n')).toContain('3 server(s)');
+  expect(written.mcpServers['proj-1']).toEqual({
+    command: 'mcpctl',
+    args: ['mcp', '-p', 'proj-1'],
+  });
+  expect(output.join('\n')).toContain('2 server(s)');
+});
+it('backward compat: claude-generate still works', async () => {
+  const outPath = join(tmpDir, '.mcp.json');
+  const cmd = createConfigCommand(
+    { configDeps: { configDir: tmpDir }, log },
+    { client, credentialsDeps: { configDir: tmpDir }, log },
+  );
+  await cmd.parseAsync(['claude-generate', '--project', 'proj-1', '-o', outPath], { from: 'user' });
+  const written = JSON.parse(readFileSync(outPath, 'utf-8'));
+  expect(written.mcpServers['proj-1']).toEqual({
+    command: 'mcpctl',
+    args: ['mcp', '-p', 'proj-1'],
+  });
+});
+it('uses project name as the server key', async () => {
+  const outPath = join(tmpDir, '.mcp.json');
+  const cmd = createConfigCommand(
+    { configDeps: { configDir: tmpDir }, log },
+  );
+  await cmd.parseAsync(['claude', '--project', 'my-fancy-project', '-o', outPath], { from: 'user' });
+  const written = JSON.parse(readFileSync(outPath, 'utf-8'));
+  expect(Object.keys(written.mcpServers)).toEqual(['my-fancy-project']);
+});
+});
+describe('config impersonate', () => {
+  let client: ReturnType<typeof mockClient>;
+  let output: string[];
+  let tmpDir: string;
+  const log = (...args: string[]) => output.push(args.join(' '));
+  beforeEach(() => {
+    client = mockClient();
+    output = [];
+    tmpDir = mkdtempSync(join(tmpdir(), 'mcpctl-config-impersonate-'));
+  });
+  afterEach(() => {
     rmSync(tmpDir, { recursive: true, force: true });
   });
+  it('impersonates a user and saves backup', async () => {
+    saveCredentials({ token: 'admin-tok', mcpdUrl: 'http://localhost:3100', user: 'admin@test.com' }, { configDir: tmpDir });
+    const cmd = createConfigCommand(
+      { configDeps: { configDir: tmpDir }, log },
+      { client, credentialsDeps: { configDir: tmpDir }, log },
+    );
+    await cmd.parseAsync(['impersonate', 'other@test.com'], { from: 'user' });
+    expect(client.post).toHaveBeenCalledWith('/api/v1/auth/impersonate', { email: 'other@test.com' });
+    expect(output.join('\n')).toContain('Impersonating other@test.com');
+    const creds = loadCredentials({ configDir: tmpDir });
+    expect(creds!.user).toBe('other@test.com');
+    expect(creds!.token).toBe('impersonated-tok');
+    // Backup exists
+    const backup = JSON.parse(readFileSync(join(tmpDir, 'credentials-backup'), 'utf-8'));
+    expect(backup.user).toBe('admin@test.com');
   });
-  describe('show', () => {
-    it('shows servers in .mcp.json', () => {
-      const filePath = join(tmpDir, '.mcp.json');
-      writeFileSync(filePath, JSON.stringify({
-        mcpServers: {
-          'slack': { command: 'npx', args: ['-y', '@anthropic/slack-mcp'], env: { TOKEN: 'x' } },
-        },
+  it('quits impersonation and restores backup', async () => {
+    // Set up current (impersonated) credentials
+    saveCredentials({ token: 'impersonated-tok', mcpdUrl: 'http://localhost:3100', user: 'other@test.com' }, { configDir: tmpDir });
+    // Set up backup (original) credentials
+    writeFileSync(join(tmpDir, 'credentials-backup'), JSON.stringify({
+      token: 'admin-tok', mcpdUrl: 'http://localhost:3100', user: 'admin@test.com',
     }));
-      const cmd = createClaudeCommand({ client, log });
-      cmd.parseAsync(['show', '-p', filePath], { from: 'user' });
-      expect(output.join('\n')).toContain('slack');
-      expect(output.join('\n')).toContain('npx -y @anthropic/slack-mcp');
-      expect(output.join('\n')).toContain('TOKEN');
-      rmSync(tmpDir, { recursive: true, force: true });
+    const cmd = createConfigCommand(
+      { configDeps: { configDir: tmpDir }, log },
+      { client, credentialsDeps: { configDir: tmpDir }, log },
+    );
+    await cmd.parseAsync(['impersonate', '--quit'], { from: 'user' });
+    expect(output.join('\n')).toContain('Returned to admin@test.com');
+    const creds = loadCredentials({ configDir: tmpDir });
+    expect(creds!.user).toBe('admin@test.com');
+    expect(creds!.token).toBe('admin-tok');
   });
-    it('handles missing file', () => {
-      const cmd = createClaudeCommand({ client, log });
-      cmd.parseAsync(['show', '-p', join(tmpDir, 'nonexistent.json')], { from: 'user' });
-      expect(output.join('\n')).toContain('No .mcp.json found');
-      rmSync(tmpDir, { recursive: true, force: true });
-    });
+  it('errors when not logged in', async () => {
+    const cmd = createConfigCommand(
+      { configDeps: { configDir: tmpDir }, log },
+      { client, credentialsDeps: { configDir: tmpDir }, log },
+    );
+    await cmd.parseAsync(['impersonate', 'other@test.com'], { from: 'user' });
+    expect(output.join('\n')).toContain('Not logged in');
   });
-  describe('add', () => {
-    it('adds a server entry', () => {
-      const filePath = join(tmpDir, '.mcp.json');
-      const cmd = createClaudeCommand({ client, log });
-      cmd.parseAsync(['add', 'my-server', '-c', 'npx', '-a', '-y', 'my-pkg', '-p', filePath], { from: 'user' });
-      const written = JSON.parse(readFileSync(filePath, 'utf-8'));
-      expect(written.mcpServers['my-server']).toEqual({
-        command: 'npx',
-        args: ['-y', 'my-pkg'],
-      });
-      rmSync(tmpDir, { recursive: true, force: true });
-    });
-    it('adds server with env vars', () => {
-      const filePath = join(tmpDir, '.mcp.json');
-      const cmd = createClaudeCommand({ client, log });
-      cmd.parseAsync(['add', 'my-server', '-c', 'node', '-e', 'KEY=val', 'SECRET=abc', '-p', filePath], { from: 'user' });
-      const written = JSON.parse(readFileSync(filePath, 'utf-8'));
-      expect(written.mcpServers['my-server'].env).toEqual({ KEY: 'val', SECRET: 'abc' });
-      rmSync(tmpDir, { recursive: true, force: true });
-    });
-  });
-  describe('remove', () => {
-    it('removes a server entry', () => {
-      const filePath = join(tmpDir, '.mcp.json');
-      writeFileSync(filePath, JSON.stringify({
-        mcpServers: { 'slack': { command: 'npx', args: [] }, 'github': { command: 'npx', args: [] } },
-      }));
-      const cmd = createClaudeCommand({ client, log });
-      cmd.parseAsync(['remove', 'slack', '-p', filePath], { from: 'user' });
-      const written = JSON.parse(readFileSync(filePath, 'utf-8'));
-      expect(written.mcpServers['slack']).toBeUndefined();
-      expect(written.mcpServers['github']).toBeDefined();
-      expect(output.join('\n')).toContain("Removed 'slack'");
-      rmSync(tmpDir, { recursive: true, force: true });
-    });
-    it('reports when server not found', () => {
-      const filePath = join(tmpDir, '.mcp.json');
-      writeFileSync(filePath, JSON.stringify({ mcpServers: {} }));
-      const cmd = createClaudeCommand({ client, log });
-      cmd.parseAsync(['remove', 'nonexistent', '-p', filePath], { from: 'user' });
-      expect(output.join('\n')).toContain('not found');
-      rmSync(tmpDir, { recursive: true, force: true });
-    });
-  });
+  it('errors when quitting with no backup', async () => {
+    const cmd = createConfigCommand(
+      { configDeps: { configDir: tmpDir }, log },
+      { client, credentialsDeps: { configDir: tmpDir }, log },
+    );
+    await cmd.parseAsync(['impersonate', '--quit'], { from: 'user' });
+    expect(output.join('\n')).toContain('No impersonation session to quit');
+  });
 });
 });
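For reference, the bridge entry these tests assert on is small enough to sketch. Assuming `--merge` means "keep existing entries, add one keyed by the project name" (which is what the merge test checks), the core logic could look like:

```typescript
// Hypothetical helper types/functions illustrating the .mcp.json output
// the tests expect; the real config command may be structured differently.
interface McpServerEntry {
  command: string;
  args: string[];
  env?: Record<string, string>;
}
interface McpJson {
  mcpServers: Record<string, McpServerEntry>;
}

// One entry per project: mcpctl itself bridges to the project's servers,
// so no per-server credentials ever land in the file.
function bridgeEntry(project: string): McpServerEntry {
  return { command: 'mcpctl', args: ['mcp', '-p', project] };
}

function mergeMcpJson(existing: McpJson | null, project: string): McpJson {
  return {
    mcpServers: { ...(existing?.mcpServers ?? {}), [project]: bridgeEntry(project) },
  };
}
```

This also explains why `client.get` is never called: the entry is derived purely from the project name, with no round-trip to the server.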


@@ -0,0 +1,293 @@
import { describe, it, expect, vi, beforeEach } from 'vitest';
import { createConfigSetupCommand } from '../../src/commands/config-setup.js';
import type { ConfigSetupDeps, ConfigSetupPrompt } from '../../src/commands/config-setup.js';
import type { SecretStore } from '@mcpctl/shared';
import { mkdtempSync, rmSync, readFileSync } from 'node:fs';
import { join } from 'node:path';
import { tmpdir } from 'node:os';
let tempDir: string;
let logs: string[];
beforeEach(() => {
tempDir = mkdtempSync(join(tmpdir(), 'mcpctl-config-setup-test-'));
logs = [];
});
function cleanup() {
rmSync(tempDir, { recursive: true, force: true });
}
function mockSecretStore(secrets: Record<string, string> = {}): SecretStore {
const store: Record<string, string> = { ...secrets };
return {
get: vi.fn(async (key: string) => store[key] ?? null),
set: vi.fn(async (key: string, value: string) => { store[key] = value; }),
delete: vi.fn(async () => true),
backend: () => 'mock',
};
}
function mockPrompt(answers: unknown[]): ConfigSetupPrompt {
let callIndex = 0;
return {
select: vi.fn(async () => answers[callIndex++]),
input: vi.fn(async () => answers[callIndex++] as string),
password: vi.fn(async () => answers[callIndex++] as string),
confirm: vi.fn(async () => answers[callIndex++] as boolean),
};
}
function buildDeps(overrides: {
secrets?: Record<string, string>;
answers?: unknown[];
fetchModels?: ConfigSetupDeps['fetchModels'];
whichBinary?: ConfigSetupDeps['whichBinary'];
} = {}): ConfigSetupDeps {
return {
configDeps: { configDir: tempDir },
secretStore: mockSecretStore(overrides.secrets),
log: (...args: string[]) => logs.push(args.join(' ')),
prompt: mockPrompt(overrides.answers ?? []),
fetchModels: overrides.fetchModels ?? vi.fn(async () => []),
whichBinary: overrides.whichBinary ?? vi.fn(async () => '/usr/bin/gemini'),
};
}
function readConfig(): Record<string, unknown> {
const raw = readFileSync(join(tempDir, 'config.json'), 'utf-8');
return JSON.parse(raw) as Record<string, unknown>;
}
async function runSetup(deps: ConfigSetupDeps): Promise<void> {
const cmd = createConfigSetupCommand(deps);
await cmd.parseAsync([], { from: 'user' });
}
describe('config setup wizard', () => {
describe('provider: none', () => {
it('disables LLM and saves config', async () => {
const deps = buildDeps({ answers: ['none'] });
await runSetup(deps);
const config = readConfig();
expect(config.llm).toEqual({ provider: 'none' });
expect(logs.some((l) => l.includes('LLM disabled'))).toBe(true);
cleanup();
});
});
describe('provider: gemini-cli', () => {
it('auto-detects binary path and saves config', async () => {
// Answers: select provider, select model (no binary prompt — auto-detected)
const deps = buildDeps({
answers: ['gemini-cli', 'gemini-2.5-flash'],
whichBinary: vi.fn(async () => '/home/user/.npm-global/bin/gemini'),
});
await runSetup(deps);
const config = readConfig();
const llm = config.llm as Record<string, unknown>;
expect(llm.provider).toBe('gemini-cli');
expect(llm.model).toBe('gemini-2.5-flash');
expect(llm.binaryPath).toBe('/home/user/.npm-global/bin/gemini');
expect(logs.some((l) => l.includes('Found gemini at'))).toBe(true);
cleanup();
});
it('prompts for manual path when binary not found', async () => {
// Answers: select provider, select model, enter manual path
const deps = buildDeps({
answers: ['gemini-cli', 'gemini-2.5-flash', '/opt/gemini'],
whichBinary: vi.fn(async () => null),
});
await runSetup(deps);
const config = readConfig();
const llm = config.llm as Record<string, unknown>;
expect(llm.binaryPath).toBe('/opt/gemini');
expect(logs.some((l) => l.includes('not found'))).toBe(true);
cleanup();
});
it('saves gemini-cli with custom model', async () => {
// Answers: select provider, select custom, enter model name
const deps = buildDeps({
answers: ['gemini-cli', '__custom__', 'gemini-3.0-flash'],
whichBinary: vi.fn(async () => '/usr/bin/gemini'),
});
await runSetup(deps);
const config = readConfig();
const llm = config.llm as Record<string, unknown>;
expect(llm.model).toBe('gemini-3.0-flash');
cleanup();
});
});
describe('provider: ollama', () => {
it('fetches models and allows selection', async () => {
const fetchModels = vi.fn(async () => ['llama3.2', 'codellama', 'mistral']);
// Answers: select provider, enter URL, select model
const deps = buildDeps({
answers: ['ollama', 'http://localhost:11434', 'codellama'],
fetchModels,
});
await runSetup(deps);
expect(fetchModels).toHaveBeenCalledWith('http://localhost:11434', '/api/tags');
const config = readConfig();
const llm = config.llm as Record<string, unknown>;
expect(llm.provider).toBe('ollama');
expect(llm.model).toBe('codellama');
expect(llm.url).toBe('http://localhost:11434');
cleanup();
});
it('falls back to manual input when fetch fails', async () => {
const fetchModels = vi.fn(async () => []);
// Answers: select provider, enter URL, enter model manually
const deps = buildDeps({
answers: ['ollama', 'http://localhost:11434', 'llama3.2'],
fetchModels,
});
await runSetup(deps);
const config = readConfig();
expect((config.llm as Record<string, unknown>).model).toBe('llama3.2');
cleanup();
});
});
describe('provider: anthropic', () => {
it('prompts for API key and saves to secret store', async () => {
// Answers: select provider, enter API key, select model
const deps = buildDeps({
answers: ['anthropic', 'sk-ant-new-key', 'claude-haiku-3-5-20241022'],
});
await runSetup(deps);
expect(deps.secretStore.set).toHaveBeenCalledWith('anthropic-api-key', 'sk-ant-new-key');
const config = readConfig();
const llm = config.llm as Record<string, unknown>;
expect(llm.provider).toBe('anthropic');
expect(llm.model).toBe('claude-haiku-3-5-20241022');
// API key should NOT be in config file
expect(llm).not.toHaveProperty('apiKey');
cleanup();
});
it('shows existing key masked and allows keeping it', async () => {
// Answers: select provider, confirm change=false, select model
const deps = buildDeps({
secrets: { 'anthropic-api-key': 'sk-ant-existing-key-1234' },
answers: ['anthropic', false, 'claude-sonnet-4-20250514'],
});
await runSetup(deps);
// Should NOT have called set (kept existing key)
expect(deps.secretStore.set).not.toHaveBeenCalled();
const config = readConfig();
expect((config.llm as Record<string, unknown>).model).toBe('claude-sonnet-4-20250514');
cleanup();
});
it('allows replacing existing key', async () => {
// Answers: select provider, confirm change=true, enter new key, select model
const deps = buildDeps({
secrets: { 'anthropic-api-key': 'sk-ant-old' },
answers: ['anthropic', true, 'sk-ant-new', 'claude-haiku-3-5-20241022'],
});
await runSetup(deps);
expect(deps.secretStore.set).toHaveBeenCalledWith('anthropic-api-key', 'sk-ant-new');
cleanup();
});
});
describe('provider: vllm', () => {
it('fetches models from vLLM and allows selection', async () => {
const fetchModels = vi.fn(async () => ['my-model', 'llama-70b']);
// Answers: select provider, enter URL, select model
const deps = buildDeps({
answers: ['vllm', 'http://gpu:8000', 'llama-70b'],
fetchModels,
});
await runSetup(deps);
expect(fetchModels).toHaveBeenCalledWith('http://gpu:8000', '/v1/models');
const config = readConfig();
const llm = config.llm as Record<string, unknown>;
expect(llm.provider).toBe('vllm');
expect(llm.url).toBe('http://gpu:8000');
expect(llm.model).toBe('llama-70b');
cleanup();
});
});
describe('provider: openai', () => {
it('prompts for key, model, and optional custom endpoint', async () => {
// Answers: select provider, enter key, enter model, confirm custom URL=true, enter URL
const deps = buildDeps({
answers: ['openai', 'sk-openai-key', 'gpt-4o', true, 'https://custom.api.com'],
});
await runSetup(deps);
expect(deps.secretStore.set).toHaveBeenCalledWith('openai-api-key', 'sk-openai-key');
const config = readConfig();
const llm = config.llm as Record<string, unknown>;
expect(llm.provider).toBe('openai');
expect(llm.model).toBe('gpt-4o');
expect(llm.url).toBe('https://custom.api.com');
cleanup();
});
it('skips custom URL when not requested', async () => {
// Answers: select provider, enter key, enter model, confirm custom URL=false
const deps = buildDeps({
answers: ['openai', 'sk-openai-key', 'gpt-4o-mini', false],
});
await runSetup(deps);
const config = readConfig();
const llm = config.llm as Record<string, unknown>;
expect(llm.url).toBeUndefined();
cleanup();
});
});
describe('provider: deepseek', () => {
it('prompts for key and model', async () => {
// Answers: select provider, enter key, select model
const deps = buildDeps({
answers: ['deepseek', 'sk-ds-key', 'deepseek-chat'],
});
await runSetup(deps);
expect(deps.secretStore.set).toHaveBeenCalledWith('deepseek-api-key', 'sk-ds-key');
const config = readConfig();
const llm = config.llm as Record<string, unknown>;
expect(llm.provider).toBe('deepseek');
expect(llm.model).toBe('deepseek-chat');
cleanup();
});
});
describe('output messages', () => {
it('shows restart instruction', async () => {
const deps = buildDeps({ answers: ['gemini-cli', 'gemini-2.5-flash'] });
await runSetup(deps);
expect(logs.some((l) => l.includes('systemctl --user restart mcplocal'))).toBe(true);
cleanup();
});
it('shows configured provider and model', async () => {
const deps = buildDeps({ answers: ['gemini-cli', 'gemini-2.5-flash'] });
await runSetup(deps);
expect(logs.some((l) => l.includes('gemini-cli') && l.includes('gemini-2.5-flash'))).toBe(true);
cleanup();
});
});
});
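The `fetchModels` expectations above encode per-provider listing endpoints: Ollama's `/api/tags` and vLLM's OpenAI-compatible `/v1/models`. A hedged sketch of what such a helper might parse; the response shapes are assumptions based on those public APIs, not taken from this repo:

```typescript
type ModelProvider = 'ollama' | 'vllm';

// Path the wizard's fetchModels is expected to hit for each provider.
function modelsPath(provider: ModelProvider): string {
  return provider === 'ollama' ? '/api/tags' : '/v1/models';
}

// Extract model names from the (assumed) response body of each API:
// Ollama returns { models: [{ name }] }, OpenAI-style returns { data: [{ id }] }.
function parseModels(provider: ModelProvider, body: unknown): string[] {
  const b = body as Record<string, unknown>;
  if (provider === 'ollama') {
    return ((b.models as { name: string }[]) ?? []).map((m) => m.name);
  }
  return ((b.data as { id: string }[]) ?? []).map((m) => m.id);
}
```

When the fetch fails or returns an empty list, the wizard falls back to a free-text model prompt, which is what the "falls back to manual input" test exercises.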


@@ -175,6 +175,7 @@ describe('create command', () => {
   expect(client.post).toHaveBeenCalledWith('/api/v1/projects', {
     name: 'my-project',
     description: 'A test project',
+    proxyMode: 'direct',
   });
   expect(output.join('\n')).toContain("project 'test' created");
 });
@@ -185,6 +186,7 @@ describe('create command', () => {
   expect(client.post).toHaveBeenCalledWith('/api/v1/projects', {
     name: 'minimal',
     description: '',
+    proxyMode: 'direct',
   });
 });
@@ -193,8 +195,256 @@ describe('create command', () => {
   vi.mocked(client.get).mockResolvedValueOnce([{ id: 'proj-1', name: 'my-proj' }] as never);
   const cmd = createCreateCommand({ client, log });
   await cmd.parseAsync(['project', 'my-proj', '-d', 'updated', '--force'], { from: 'user' });
-  expect(client.put).toHaveBeenCalledWith('/api/v1/projects/proj-1', { description: 'updated' });
+  expect(client.put).toHaveBeenCalledWith('/api/v1/projects/proj-1', { description: 'updated', proxyMode: 'direct' });
   expect(output.join('\n')).toContain("project 'my-proj' updated");
 });
 });
describe('create user', () => {
it('creates a user with password and name', async () => {
vi.mocked(client.post).mockResolvedValueOnce({ id: 'usr-1', email: 'alice@test.com' });
const cmd = createCreateCommand({ client, log });
await cmd.parseAsync([
'user', 'alice@test.com',
'--password', 'secret123',
'--name', 'Alice',
], { from: 'user' });
expect(client.post).toHaveBeenCalledWith('/api/v1/users', {
email: 'alice@test.com',
password: 'secret123',
name: 'Alice',
});
expect(output.join('\n')).toContain("user 'alice@test.com' created");
});
it('does not send role field (RBAC is the auth mechanism)', async () => {
vi.mocked(client.post).mockResolvedValueOnce({ id: 'usr-1', email: 'admin@test.com' });
const cmd = createCreateCommand({ client, log });
await cmd.parseAsync([
'user', 'admin@test.com',
'--password', 'pass123',
], { from: 'user' });
const callBody = vi.mocked(client.post).mock.calls[0]![1] as Record<string, unknown>;
expect(callBody).not.toHaveProperty('role');
});
it('requires --password', async () => {
const cmd = createCreateCommand({ client, log });
await expect(cmd.parseAsync(['user', 'alice@test.com'], { from: 'user' })).rejects.toThrow('--password is required');
});
it('throws on 409 without --force', async () => {
vi.mocked(client.post).mockRejectedValueOnce(new ApiError(409, '{"error":"User already exists"}'));
const cmd = createCreateCommand({ client, log });
await expect(
cmd.parseAsync(['user', 'alice@test.com', '--password', 'pass'], { from: 'user' }),
).rejects.toThrow('API error 409');
});
it('updates existing user on 409 with --force', async () => {
vi.mocked(client.post).mockRejectedValueOnce(new ApiError(409, '{"error":"User already exists"}'));
vi.mocked(client.get).mockResolvedValueOnce([{ id: 'usr-1', email: 'alice@test.com' }] as never);
const cmd = createCreateCommand({ client, log });
await cmd.parseAsync([
'user', 'alice@test.com', '--password', 'newpass', '--name', 'Alice New', '--force',
], { from: 'user' });
expect(client.put).toHaveBeenCalledWith('/api/v1/users/usr-1', {
password: 'newpass',
name: 'Alice New',
});
expect(output.join('\n')).toContain("user 'alice@test.com' updated");
});
});
describe('create group', () => {
it('creates a group with members', async () => {
vi.mocked(client.post).mockResolvedValueOnce({ id: 'grp-1', name: 'dev-team' });
const cmd = createCreateCommand({ client, log });
await cmd.parseAsync([
'group', 'dev-team',
'--description', 'Development team',
'--member', 'alice@test.com',
'--member', 'bob@test.com',
], { from: 'user' });
expect(client.post).toHaveBeenCalledWith('/api/v1/groups', {
name: 'dev-team',
description: 'Development team',
members: ['alice@test.com', 'bob@test.com'],
});
expect(output.join('\n')).toContain("group 'dev-team' created");
});
it('creates a group with no members', async () => {
vi.mocked(client.post).mockResolvedValueOnce({ id: 'grp-1', name: 'empty-group' });
const cmd = createCreateCommand({ client, log });
await cmd.parseAsync(['group', 'empty-group'], { from: 'user' });
expect(client.post).toHaveBeenCalledWith('/api/v1/groups', {
name: 'empty-group',
members: [],
});
});
it('throws on 409 without --force', async () => {
vi.mocked(client.post).mockRejectedValueOnce(new ApiError(409, '{"error":"Group already exists"}'));
const cmd = createCreateCommand({ client, log });
await expect(
cmd.parseAsync(['group', 'dev-team'], { from: 'user' }),
).rejects.toThrow('API error 409');
});
it('updates existing group on 409 with --force', async () => {
vi.mocked(client.post).mockRejectedValueOnce(new ApiError(409, '{"error":"Group already exists"}'));
vi.mocked(client.get).mockResolvedValueOnce([{ id: 'grp-1', name: 'dev-team' }] as never);
const cmd = createCreateCommand({ client, log });
await cmd.parseAsync([
'group', 'dev-team', '--member', 'new@test.com', '--force',
], { from: 'user' });
expect(client.put).toHaveBeenCalledWith('/api/v1/groups/grp-1', {
members: ['new@test.com'],
});
expect(output.join('\n')).toContain("group 'dev-team' updated");
});
});
describe('create rbac', () => {
it('creates an RBAC definition with subjects and bindings', async () => {
vi.mocked(client.post).mockResolvedValueOnce({ id: 'rbac-1', name: 'developers' });
const cmd = createCreateCommand({ client, log });
await cmd.parseAsync([
'rbac', 'developers',
'--subject', 'User:alice@test.com',
'--subject', 'Group:dev-team',
'--binding', 'edit:servers',
'--binding', 'view:instances',
], { from: 'user' });
expect(client.post).toHaveBeenCalledWith('/api/v1/rbac', {
name: 'developers',
subjects: [
{ kind: 'User', name: 'alice@test.com' },
{ kind: 'Group', name: 'dev-team' },
],
roleBindings: [
{ role: 'edit', resource: 'servers' },
{ role: 'view', resource: 'instances' },
],
});
expect(output.join('\n')).toContain("rbac 'developers' created");
});
it('creates an RBAC definition with wildcard resource', async () => {
vi.mocked(client.post).mockResolvedValueOnce({ id: 'rbac-1', name: 'admins' });
const cmd = createCreateCommand({ client, log });
await cmd.parseAsync([
'rbac', 'admins',
'--subject', 'User:admin@test.com',
'--binding', 'edit:*',
], { from: 'user' });
expect(client.post).toHaveBeenCalledWith('/api/v1/rbac', {
name: 'admins',
subjects: [{ kind: 'User', name: 'admin@test.com' }],
roleBindings: [{ role: 'edit', resource: '*' }],
});
});
it('creates an RBAC definition with empty subjects and bindings', async () => {
vi.mocked(client.post).mockResolvedValueOnce({ id: 'rbac-1', name: 'empty' });
const cmd = createCreateCommand({ client, log });
await cmd.parseAsync(['rbac', 'empty'], { from: 'user' });
expect(client.post).toHaveBeenCalledWith('/api/v1/rbac', {
name: 'empty',
subjects: [],
roleBindings: [],
});
});
it('throws on invalid subject format', async () => {
const cmd = createCreateCommand({ client, log });
await expect(
cmd.parseAsync(['rbac', 'bad', '--subject', 'no-colon'], { from: 'user' }),
).rejects.toThrow('Invalid subject format');
});
it('throws on invalid binding format', async () => {
const cmd = createCreateCommand({ client, log });
await expect(
cmd.parseAsync(['rbac', 'bad', '--binding', 'no-colon'], { from: 'user' }),
).rejects.toThrow('Invalid binding format');
});
it('throws on 409 without --force', async () => {
vi.mocked(client.post).mockRejectedValueOnce(new ApiError(409, '{"error":"RBAC already exists"}'));
const cmd = createCreateCommand({ client, log });
await expect(
cmd.parseAsync(['rbac', 'developers', '--subject', 'User:a@b.com', '--binding', 'edit:servers'], { from: 'user' }),
).rejects.toThrow('API error 409');
});
it('updates existing RBAC on 409 with --force', async () => {
vi.mocked(client.post).mockRejectedValueOnce(new ApiError(409, '{"error":"RBAC already exists"}'));
vi.mocked(client.get).mockResolvedValueOnce([{ id: 'rbac-1', name: 'developers' }] as never);
const cmd = createCreateCommand({ client, log });
await cmd.parseAsync([
'rbac', 'developers',
'--subject', 'User:new@test.com',
'--binding', 'edit:*',
'--force',
], { from: 'user' });
expect(client.put).toHaveBeenCalledWith('/api/v1/rbac/rbac-1', {
subjects: [{ kind: 'User', name: 'new@test.com' }],
roleBindings: [{ role: 'edit', resource: '*' }],
});
expect(output.join('\n')).toContain("rbac 'developers' updated");
});
it('creates an RBAC definition with operation bindings', async () => {
vi.mocked(client.post).mockResolvedValueOnce({ id: 'rbac-1', name: 'ops' });
const cmd = createCreateCommand({ client, log });
await cmd.parseAsync([
'rbac', 'ops',
'--subject', 'Group:ops-team',
'--binding', 'edit:servers',
'--operation', 'logs',
'--operation', 'backup',
], { from: 'user' });
expect(client.post).toHaveBeenCalledWith('/api/v1/rbac', {
name: 'ops',
subjects: [{ kind: 'Group', name: 'ops-team' }],
roleBindings: [
{ role: 'edit', resource: 'servers' },
{ role: 'run', action: 'logs' },
{ role: 'run', action: 'backup' },
],
});
expect(output.join('\n')).toContain("rbac 'ops' created");
});
it('creates an RBAC definition with name-scoped binding', async () => {
vi.mocked(client.post).mockResolvedValueOnce({ id: 'rbac-1', name: 'ha-viewer' });
const cmd = createCreateCommand({ client, log });
await cmd.parseAsync([
'rbac', 'ha-viewer',
'--subject', 'User:alice@test.com',
'--binding', 'view:servers:my-ha',
], { from: 'user' });
expect(client.post).toHaveBeenCalledWith('/api/v1/rbac', {
name: 'ha-viewer',
subjects: [{ kind: 'User', name: 'alice@test.com' }],
roleBindings: [
{ role: 'view', resource: 'servers', name: 'my-ha' },
],
});
});
});
});


@@ -287,4 +287,410 @@ describe('describe command', () => {
expect(text).toContain('list_datasources');
expect(text).toContain('mcpctl create server my-grafana --from-template=grafana');
});
it('shows user detail (no Role field — RBAC is the auth mechanism)', async () => {
const deps = makeDeps({
id: 'usr-1',
email: 'alice@test.com',
name: 'Alice Smith',
provider: null,
createdAt: '2025-01-01',
updatedAt: '2025-01-15',
});
const cmd = createDescribeCommand(deps);
await cmd.parseAsync(['node', 'test', 'user', 'usr-1']);
expect(deps.fetchResource).toHaveBeenCalledWith('users', 'usr-1');
const text = deps.output.join('\n');
expect(text).toContain('=== User: alice@test.com ===');
expect(text).toContain('Email:');
expect(text).toContain('alice@test.com');
expect(text).toContain('Name:');
expect(text).toContain('Alice Smith');
expect(text).not.toContain('Role:');
expect(text).toContain('Provider:');
expect(text).toContain('local');
expect(text).toContain('ID:');
expect(text).toContain('usr-1');
});
it('shows user with no name as dash', async () => {
const deps = makeDeps({
id: 'usr-2',
email: 'bob@test.com',
name: null,
provider: 'oidc',
});
const cmd = createDescribeCommand(deps);
await cmd.parseAsync(['node', 'test', 'user', 'usr-2']);
const text = deps.output.join('\n');
expect(text).toContain('=== User: bob@test.com ===');
expect(text).toContain('Name:');
expect(text).toContain('-');
expect(text).not.toContain('Role:');
expect(text).toContain('oidc');
});
it('shows group detail with members', async () => {
const deps = makeDeps({
id: 'grp-1',
name: 'dev-team',
description: 'Development team',
members: [
{ user: { email: 'alice@test.com' }, createdAt: '2025-01-01' },
{ user: { email: 'bob@test.com' }, createdAt: '2025-01-02' },
],
createdAt: '2025-01-01',
updatedAt: '2025-01-15',
});
const cmd = createDescribeCommand(deps);
await cmd.parseAsync(['node', 'test', 'group', 'grp-1']);
expect(deps.fetchResource).toHaveBeenCalledWith('groups', 'grp-1');
const text = deps.output.join('\n');
expect(text).toContain('=== Group: dev-team ===');
expect(text).toContain('Name:');
expect(text).toContain('dev-team');
expect(text).toContain('Description:');
expect(text).toContain('Development team');
expect(text).toContain('Members:');
expect(text).toContain('EMAIL');
expect(text).toContain('ADDED');
expect(text).toContain('alice@test.com');
expect(text).toContain('bob@test.com');
expect(text).toContain('ID:');
expect(text).toContain('grp-1');
});
it('shows group detail with no members', async () => {
const deps = makeDeps({
id: 'grp-2',
name: 'empty-group',
description: '',
members: [],
});
const cmd = createDescribeCommand(deps);
await cmd.parseAsync(['node', 'test', 'group', 'grp-2']);
const text = deps.output.join('\n');
expect(text).toContain('=== Group: empty-group ===');
// No Members section when empty
expect(text).not.toContain('EMAIL');
});
it('shows RBAC detail with subjects and bindings', async () => {
const deps = makeDeps({
id: 'rbac-1',
name: 'developers',
subjects: [
{ kind: 'User', name: 'alice@test.com' },
{ kind: 'Group', name: 'dev-team' },
],
roleBindings: [
{ role: 'edit', resource: 'servers' },
{ role: 'view', resource: 'instances' },
{ role: 'view', resource: 'projects' },
],
createdAt: '2025-01-01',
updatedAt: '2025-01-15',
});
const cmd = createDescribeCommand(deps);
await cmd.parseAsync(['node', 'test', 'rbac', 'rbac-1']);
expect(deps.fetchResource).toHaveBeenCalledWith('rbac', 'rbac-1');
const text = deps.output.join('\n');
expect(text).toContain('=== RBAC: developers ===');
expect(text).toContain('Name:');
expect(text).toContain('developers');
// Subjects section
expect(text).toContain('Subjects:');
expect(text).toContain('KIND');
expect(text).toContain('NAME');
expect(text).toContain('User');
expect(text).toContain('alice@test.com');
expect(text).toContain('Group');
expect(text).toContain('dev-team');
// Role Bindings section
expect(text).toContain('Resource Bindings:');
expect(text).toContain('ROLE');
expect(text).toContain('RESOURCE');
expect(text).toContain('edit');
expect(text).toContain('servers');
expect(text).toContain('view');
expect(text).toContain('instances');
expect(text).toContain('projects');
expect(text).toContain('ID:');
expect(text).toContain('rbac-1');
});
it('shows RBAC detail with wildcard resource', async () => {
const deps = makeDeps({
id: 'rbac-2',
name: 'admins',
subjects: [{ kind: 'User', name: 'admin@test.com' }],
roleBindings: [{ role: 'edit', resource: '*' }],
});
const cmd = createDescribeCommand(deps);
await cmd.parseAsync(['node', 'test', 'rbac', 'rbac-2']);
const text = deps.output.join('\n');
expect(text).toContain('=== RBAC: admins ===');
expect(text).toContain('edit');
expect(text).toContain('*');
});
it('shows RBAC detail with empty subjects and bindings', async () => {
const deps = makeDeps({
id: 'rbac-3',
name: 'empty-rbac',
subjects: [],
roleBindings: [],
});
const cmd = createDescribeCommand(deps);
await cmd.parseAsync(['node', 'test', 'rbac', 'rbac-3']);
const text = deps.output.join('\n');
expect(text).toContain('=== RBAC: empty-rbac ===');
// No Subjects or Role Bindings sections when empty
expect(text).not.toContain('KIND');
expect(text).not.toContain('ROLE');
expect(text).not.toContain('RESOURCE');
});
it('shows RBAC detail with mixed resource and operation bindings', async () => {
const deps = makeDeps({
id: 'rbac-1',
name: 'admin-access',
subjects: [{ kind: 'Group', name: 'admin' }],
roleBindings: [
{ role: 'edit', resource: '*' },
{ role: 'run', resource: 'projects' },
{ role: 'run', action: 'logs' },
{ role: 'run', action: 'backup' },
],
createdAt: '2025-01-01',
});
const cmd = createDescribeCommand(deps);
await cmd.parseAsync(['node', 'test', 'rbac', 'rbac-1']);
const text = deps.output.join('\n');
expect(text).toContain('Resource Bindings:');
expect(text).toContain('edit');
expect(text).toContain('*');
expect(text).toContain('run');
expect(text).toContain('projects');
expect(text).toContain('Operations:');
expect(text).toContain('ACTION');
expect(text).toContain('logs');
expect(text).toContain('backup');
});
it('shows RBAC detail with name-scoped resource binding', async () => {
const deps = makeDeps({
id: 'rbac-1',
name: 'ha-viewer',
subjects: [{ kind: 'User', name: 'alice@test.com' }],
roleBindings: [
{ role: 'view', resource: 'servers', name: 'my-ha' },
{ role: 'edit', resource: 'secrets' },
],
});
const cmd = createDescribeCommand(deps);
await cmd.parseAsync(['node', 'test', 'rbac', 'rbac-1']);
const text = deps.output.join('\n');
expect(text).toContain('Resource Bindings:');
expect(text).toContain('NAME');
expect(text).toContain('my-ha');
expect(text).toContain('view');
expect(text).toContain('servers');
});
it('shows user with direct RBAC permissions', async () => {
const deps = makeDeps({
id: 'usr-1',
email: 'alice@test.com',
name: 'Alice',
provider: null,
});
vi.mocked(deps.client.get)
.mockResolvedValueOnce([] as never) // users list (resolveNameOrId)
.mockResolvedValueOnce([ // RBAC defs
{
name: 'dev-access',
subjects: [{ kind: 'User', name: 'alice@test.com' }],
roleBindings: [
{ role: 'edit', resource: 'servers' },
{ role: 'run', action: 'logs' },
],
},
] as never)
.mockResolvedValueOnce([] as never); // groups
const cmd = createDescribeCommand(deps);
await cmd.parseAsync(['node', 'test', 'user', 'usr-1']);
const text = deps.output.join('\n');
expect(text).toContain('=== User: alice@test.com ===');
expect(text).toContain('Access:');
expect(text).toContain('Direct (dev-access)');
expect(text).toContain('Resources:');
expect(text).toContain('edit');
expect(text).toContain('servers');
expect(text).toContain('Operations:');
expect(text).toContain('logs');
});
it('shows user with inherited group permissions', async () => {
const deps = makeDeps({
id: 'usr-1',
email: 'bob@test.com',
name: 'Bob',
provider: null,
});
vi.mocked(deps.client.get)
.mockResolvedValueOnce([] as never) // users list
.mockResolvedValueOnce([ // RBAC defs
{
name: 'team-perms',
subjects: [{ kind: 'Group', name: 'dev-team' }],
roleBindings: [
{ role: 'view', resource: '*' },
{ role: 'run', action: 'backup' },
],
},
] as never)
.mockResolvedValueOnce([ // groups
{ name: 'dev-team', members: [{ user: { email: 'bob@test.com' } }] },
] as never);
const cmd = createDescribeCommand(deps);
await cmd.parseAsync(['node', 'test', 'user', 'usr-1']);
const text = deps.output.join('\n');
expect(text).toContain('Groups:');
expect(text).toContain('dev-team');
expect(text).toContain('Access:');
expect(text).toContain('Inherited (dev-team)');
expect(text).toContain('view');
expect(text).toContain('*');
expect(text).toContain('backup');
});
it('shows user with no permissions', async () => {
const deps = makeDeps({
id: 'usr-1',
email: 'nobody@test.com',
name: null,
provider: null,
});
vi.mocked(deps.client.get)
.mockResolvedValueOnce([] as never)
.mockResolvedValueOnce([] as never)
.mockResolvedValueOnce([] as never);
const cmd = createDescribeCommand(deps);
await cmd.parseAsync(['node', 'test', 'user', 'usr-1']);
const text = deps.output.join('\n');
expect(text).toContain('Access: (none)');
});
it('shows group with RBAC permissions', async () => {
const deps = makeDeps({
id: 'grp-1',
name: 'admin',
description: 'Admin group',
members: [{ user: { email: 'alice@test.com' } }],
});
vi.mocked(deps.client.get)
.mockResolvedValueOnce([] as never) // groups list (resolveNameOrId)
.mockResolvedValueOnce([ // RBAC defs
{
name: 'admin-access',
subjects: [{ kind: 'Group', name: 'admin' }],
roleBindings: [
{ role: 'edit', resource: '*' },
{ role: 'run', action: 'backup' },
{ role: 'run', action: 'restore' },
],
},
] as never);
const cmd = createDescribeCommand(deps);
await cmd.parseAsync(['node', 'test', 'group', 'grp-1']);
const text = deps.output.join('\n');
expect(text).toContain('=== Group: admin ===');
expect(text).toContain('Access:');
expect(text).toContain('Granted (admin-access)');
expect(text).toContain('edit');
expect(text).toContain('*');
expect(text).toContain('backup');
expect(text).toContain('restore');
});
it('shows group with name-scoped permissions', async () => {
const deps = makeDeps({
id: 'grp-1',
name: 'ha-team',
description: 'HA team',
members: [],
});
vi.mocked(deps.client.get)
.mockResolvedValueOnce([] as never)
.mockResolvedValueOnce([ // RBAC defs
{
name: 'ha-access',
subjects: [{ kind: 'Group', name: 'ha-team' }],
roleBindings: [
{ role: 'edit', resource: 'servers', name: 'my-ha' },
{ role: 'view', resource: 'secrets' },
],
},
] as never);
const cmd = createDescribeCommand(deps);
await cmd.parseAsync(['node', 'test', 'group', 'grp-1']);
const text = deps.output.join('\n');
expect(text).toContain('Access:');
expect(text).toContain('Granted (ha-access)');
expect(text).toContain('my-ha');
expect(text).toContain('NAME');
});
it('outputs user detail as JSON', async () => {
const deps = makeDeps({ id: 'usr-1', email: 'alice@test.com', name: 'Alice', role: 'ADMIN' });
const cmd = createDescribeCommand(deps);
await cmd.parseAsync(['node', 'test', 'user', 'usr-1', '-o', 'json']);
const parsed = JSON.parse(deps.output[0] ?? '');
expect(parsed.email).toBe('alice@test.com');
expect(parsed.role).toBe('ADMIN');
});
it('outputs group detail as YAML', async () => {
const deps = makeDeps({ id: 'grp-1', name: 'dev-team', description: 'Devs' });
const cmd = createDescribeCommand(deps);
await cmd.parseAsync(['node', 'test', 'group', 'grp-1', '-o', 'yaml']);
expect(deps.output[0]).toContain('name: dev-team');
});
it('outputs rbac detail as JSON', async () => {
const deps = makeDeps({
id: 'rbac-1',
name: 'devs',
subjects: [{ kind: 'User', name: 'a@b.com' }],
roleBindings: [{ role: 'edit', resource: 'servers' }],
});
const cmd = createDescribeCommand(deps);
await cmd.parseAsync(['node', 'test', 'rbac', 'rbac-1', '-o', 'json']);
const parsed = JSON.parse(deps.output[0] ?? '');
expect(parsed.subjects).toHaveLength(1);
expect(parsed.roleBindings[0].role).toBe('edit');
});
});


@@ -85,4 +85,170 @@ describe('get command', () => {
await cmd.parseAsync(['node', 'test', 'servers']);
expect(deps.output[0]).toContain('No servers found');
});
it('lists users with correct columns (no ROLE column)', async () => {
const deps = makeDeps([
{ id: 'usr-1', email: 'alice@test.com', name: 'Alice', provider: null },
{ id: 'usr-2', email: 'bob@test.com', name: null, provider: 'oidc' },
]);
const cmd = createGetCommand(deps);
await cmd.parseAsync(['node', 'test', 'users']);
expect(deps.fetchResource).toHaveBeenCalledWith('users', undefined);
const text = deps.output.join('\n');
expect(text).toContain('EMAIL');
expect(text).toContain('NAME');
expect(text).not.toContain('ROLE');
expect(text).toContain('PROVIDER');
expect(text).toContain('alice@test.com');
expect(text).toContain('Alice');
expect(text).toContain('bob@test.com');
expect(text).toContain('oidc');
});
it('resolves user alias', async () => {
const deps = makeDeps([]);
const cmd = createGetCommand(deps);
await cmd.parseAsync(['node', 'test', 'user']);
expect(deps.fetchResource).toHaveBeenCalledWith('users', undefined);
});
it('lists groups with correct columns', async () => {
const deps = makeDeps([
{
id: 'grp-1',
name: 'dev-team',
description: 'Developers',
members: [{ user: { email: 'alice@test.com' } }, { user: { email: 'bob@test.com' } }],
},
{ id: 'grp-2', name: 'ops-team', description: 'Operations', members: [] },
]);
const cmd = createGetCommand(deps);
await cmd.parseAsync(['node', 'test', 'groups']);
expect(deps.fetchResource).toHaveBeenCalledWith('groups', undefined);
const text = deps.output.join('\n');
expect(text).toContain('NAME');
expect(text).toContain('MEMBERS');
expect(text).toContain('DESCRIPTION');
expect(text).toContain('dev-team');
expect(text).toContain('2');
expect(text).toContain('ops-team');
expect(text).toContain('0');
});
it('resolves group alias', async () => {
const deps = makeDeps([]);
const cmd = createGetCommand(deps);
await cmd.parseAsync(['node', 'test', 'group']);
expect(deps.fetchResource).toHaveBeenCalledWith('groups', undefined);
});
it('lists rbac definitions with correct columns', async () => {
const deps = makeDeps([
{
id: 'rbac-1',
name: 'admins',
subjects: [{ kind: 'User', name: 'admin@test.com' }],
roleBindings: [{ role: 'edit', resource: '*' }],
},
]);
const cmd = createGetCommand(deps);
await cmd.parseAsync(['node', 'test', 'rbac']);
expect(deps.fetchResource).toHaveBeenCalledWith('rbac', undefined);
const text = deps.output.join('\n');
expect(text).toContain('NAME');
expect(text).toContain('SUBJECTS');
expect(text).toContain('BINDINGS');
expect(text).toContain('admins');
expect(text).toContain('User:admin@test.com');
expect(text).toContain('edit:*');
});
it('resolves rbac-definition alias', async () => {
const deps = makeDeps([]);
const cmd = createGetCommand(deps);
await cmd.parseAsync(['node', 'test', 'rbac-definition']);
expect(deps.fetchResource).toHaveBeenCalledWith('rbac', undefined);
});
it('lists projects with new columns', async () => {
const deps = makeDeps([{
id: 'proj-1',
name: 'smart-home',
description: 'Home automation',
proxyMode: 'filtered',
ownerId: 'usr-1',
servers: [{ server: { name: 'grafana' } }],
}]);
const cmd = createGetCommand(deps);
await cmd.parseAsync(['node', 'test', 'projects']);
const text = deps.output.join('\n');
expect(text).toContain('MODE');
expect(text).toContain('SERVERS');
expect(text).toContain('smart-home');
expect(text).toContain('filtered');
expect(text).toContain('1');
});
it('displays mixed resource and operation bindings', async () => {
const deps = makeDeps([
{
id: 'rbac-1',
name: 'admin-access',
subjects: [{ kind: 'Group', name: 'admin' }],
roleBindings: [
{ role: 'edit', resource: '*' },
{ role: 'run', action: 'logs' },
{ role: 'run', action: 'backup' },
],
},
]);
const cmd = createGetCommand(deps);
await cmd.parseAsync(['node', 'test', 'rbac']);
const text = deps.output.join('\n');
expect(text).toContain('edit:*');
expect(text).toContain('run>logs');
expect(text).toContain('run>backup');
});
it('displays name-scoped resource bindings', async () => {
const deps = makeDeps([
{
id: 'rbac-1',
name: 'ha-viewer',
subjects: [{ kind: 'User', name: 'alice@test.com' }],
roleBindings: [{ role: 'view', resource: 'servers', name: 'my-ha' }],
},
]);
const cmd = createGetCommand(deps);
await cmd.parseAsync(['node', 'test', 'rbac']);
const text = deps.output.join('\n');
expect(text).toContain('view:servers:my-ha');
});
it('shows no results message for empty users list', async () => {
const deps = makeDeps([]);
const cmd = createGetCommand(deps);
await cmd.parseAsync(['node', 'test', 'users']);
expect(deps.output[0]).toContain('No users found');
});
it('shows no results message for empty groups list', async () => {
const deps = makeDeps([]);
const cmd = createGetCommand(deps);
await cmd.parseAsync(['node', 'test', 'groups']);
expect(deps.output[0]).toContain('No groups found');
});
it('shows no results message for empty rbac list', async () => {
const deps = makeDeps([]);
const cmd = createGetCommand(deps);
await cmd.parseAsync(['node', 'test', 'rbac']);
expect(deps.output[0]).toContain('No rbac found');
});
});


@@ -0,0 +1,481 @@
import { describe, it, expect, beforeAll, afterAll } from 'vitest';
import http from 'node:http';
import { Readable, Writable } from 'node:stream';
import { runMcpBridge, createMcpCommand } from '../../src/commands/mcp.js';
// ---- Mock MCP server (simulates mcplocal project endpoint) ----
interface RecordedRequest {
method: string;
url: string;
headers: http.IncomingHttpHeaders;
body: string;
}
let mockServer: http.Server;
let mockPort: number;
const recorded: RecordedRequest[] = [];
let sessionCounter = 0;
function makeInitializeResponse(id: number | string) {
return JSON.stringify({
jsonrpc: '2.0',
id,
result: {
protocolVersion: '2024-11-05',
capabilities: { tools: {} },
serverInfo: { name: 'test-server', version: '1.0.0' },
},
});
}
function makeToolsListResponse(id: number | string) {
return JSON.stringify({
jsonrpc: '2.0',
id,
result: {
tools: [
{ name: 'grafana/query', description: 'Query Grafana', inputSchema: { type: 'object', properties: {} } },
],
},
});
}
function makeToolCallResponse(id: number | string) {
return JSON.stringify({
jsonrpc: '2.0',
id,
result: {
content: [{ type: 'text', text: 'tool result' }],
},
});
}
beforeAll(async () => {
mockServer = http.createServer((req, res) => {
const chunks: Buffer[] = [];
req.on('data', (c: Buffer) => chunks.push(c));
req.on('end', () => {
const body = Buffer.concat(chunks).toString('utf-8');
recorded.push({ method: req.method ?? '', url: req.url ?? '', headers: req.headers, body });
if (req.method === 'DELETE') {
res.writeHead(200);
res.end();
return;
}
if (req.method === 'POST' && req.url?.startsWith('/projects/')) {
let sessionId = req.headers['mcp-session-id'] as string | undefined;
// Assign session ID on first request
if (!sessionId) {
sessionCounter++;
sessionId = `session-${sessionCounter}`;
}
res.setHeader('mcp-session-id', sessionId);
// Parse JSON-RPC and respond based on method
try {
const rpc = JSON.parse(body) as { id: number | string; method: string };
let responseBody: string;
switch (rpc.method) {
case 'initialize':
responseBody = makeInitializeResponse(rpc.id);
break;
case 'tools/list':
responseBody = makeToolsListResponse(rpc.id);
break;
case 'tools/call':
responseBody = makeToolCallResponse(rpc.id);
break;
default:
responseBody = JSON.stringify({ jsonrpc: '2.0', id: rpc.id, error: { code: -32601, message: 'Method not found' } });
}
// Respond in SSE format for /projects/sse-project/mcp
if (req.url?.includes('sse-project')) {
res.writeHead(200, { 'Content-Type': 'text/event-stream' });
res.end(`event: message\ndata: ${responseBody}\n\n`);
} else {
res.writeHead(200, { 'Content-Type': 'application/json' });
res.end(responseBody);
}
} catch {
res.writeHead(400, { 'Content-Type': 'application/json' });
res.end(JSON.stringify({ error: 'Invalid JSON' }));
}
return;
}
res.writeHead(404);
res.end();
});
});
await new Promise<void>((resolve) => {
mockServer.listen(0, () => {
const addr = mockServer.address();
if (addr && typeof addr === 'object') {
mockPort = addr.port;
}
resolve();
});
});
});
afterAll(() => {
mockServer.close();
});
// ---- Helper to run bridge with mock streams ----
function createMockStreams() {
const stdoutChunks: string[] = [];
const stderrChunks: string[] = [];
const stdout = new Writable({
write(chunk: Buffer, _encoding, callback) {
stdoutChunks.push(chunk.toString());
callback();
},
});
const stderr = new Writable({
write(chunk: Buffer, _encoding, callback) {
stderrChunks.push(chunk.toString());
callback();
},
});
return { stdout, stderr, stdoutChunks, stderrChunks };
}
function pushAndEnd(stdin: Readable, lines: string[]) {
for (const line of lines) {
stdin.push(line + '\n');
}
stdin.push(null); // EOF
}
// ---- Tests ----
describe('MCP STDIO Bridge', () => {
beforeAll(() => {
recorded.length = 0;
sessionCounter = 0;
});
it('forwards initialize request and returns response', async () => {
recorded.length = 0;
const stdin = new Readable({ read() {} });
const { stdout, stdoutChunks } = createMockStreams();
const initMsg = JSON.stringify({
jsonrpc: '2.0', id: 1, method: 'initialize',
params: { protocolVersion: '2024-11-05', capabilities: {}, clientInfo: { name: 'test', version: '1.0' } },
});
pushAndEnd(stdin, [initMsg]);
await runMcpBridge({
projectName: 'test-project',
mcplocalUrl: `http://localhost:${mockPort}`,
stdin, stdout, stderr: new Writable({ write(_, __, cb) { cb(); } }),
});
// Verify request was made to correct URL
expect(recorded.some((r) => r.url === '/projects/test-project/mcp' && r.method === 'POST')).toBe(true);
// Verify response on stdout
const output = stdoutChunks.join('');
const parsed = JSON.parse(output.trim());
expect(parsed.result.serverInfo.name).toBe('test-server');
expect(parsed.result.protocolVersion).toBe('2024-11-05');
});
it('sends session ID on subsequent requests', async () => {
recorded.length = 0;
const stdin = new Readable({ read() {} });
const { stdout, stdoutChunks } = createMockStreams();
const initMsg = JSON.stringify({
jsonrpc: '2.0', id: 1, method: 'initialize',
params: { protocolVersion: '2024-11-05', capabilities: {}, clientInfo: { name: 'test', version: '1.0' } },
});
const toolsListMsg = JSON.stringify({ jsonrpc: '2.0', id: 2, method: 'tools/list', params: {} });
pushAndEnd(stdin, [initMsg, toolsListMsg]);
await runMcpBridge({
projectName: 'test-project',
mcplocalUrl: `http://localhost:${mockPort}`,
stdin, stdout, stderr: new Writable({ write(_, __, cb) { cb(); } }),
});
// First POST should NOT have mcp-session-id header
const firstPost = recorded.find((r) => r.method === 'POST' && r.body.includes('initialize'));
expect(firstPost).toBeDefined();
expect(firstPost!.headers['mcp-session-id']).toBeUndefined();
// Second POST SHOULD have mcp-session-id header
const secondPost = recorded.find((r) => r.method === 'POST' && r.body.includes('tools/list'));
expect(secondPost).toBeDefined();
expect(secondPost!.headers['mcp-session-id']).toMatch(/^session-/);
// Verify tools/list response
const lines = stdoutChunks.join('').trim().split('\n');
expect(lines.length).toBe(2);
const toolsResponse = JSON.parse(lines[1]);
expect(toolsResponse.result.tools[0].name).toBe('grafana/query');
});
it('forwards tools/call and returns result', async () => {
recorded.length = 0;
const stdin = new Readable({ read() {} });
const { stdout, stdoutChunks } = createMockStreams();
const initMsg = JSON.stringify({
jsonrpc: '2.0', id: 1, method: 'initialize',
params: { protocolVersion: '2024-11-05', capabilities: {}, clientInfo: { name: 'test', version: '1.0' } },
});
const callMsg = JSON.stringify({
jsonrpc: '2.0', id: 2, method: 'tools/call',
params: { name: 'grafana/query', arguments: { query: 'test' } },
});
pushAndEnd(stdin, [initMsg, callMsg]);
await runMcpBridge({
projectName: 'test-project',
mcplocalUrl: `http://localhost:${mockPort}`,
stdin, stdout, stderr: new Writable({ write(_, __, cb) { cb(); } }),
});
const lines = stdoutChunks.join('').trim().split('\n');
expect(lines.length).toBe(2);
const callResponse = JSON.parse(lines[1]);
expect(callResponse.result.content[0].text).toBe('tool result');
});
it('forwards Authorization header when token provided', async () => {
recorded.length = 0;
const stdin = new Readable({ read() {} });
const { stdout } = createMockStreams();
const initMsg = JSON.stringify({
jsonrpc: '2.0', id: 1, method: 'initialize',
params: { protocolVersion: '2024-11-05', capabilities: {}, clientInfo: { name: 'test', version: '1.0' } },
});
pushAndEnd(stdin, [initMsg]);
await runMcpBridge({
projectName: 'test-project',
mcplocalUrl: `http://localhost:${mockPort}`,
token: 'my-secret-token',
stdin, stdout, stderr: new Writable({ write(_, __, cb) { cb(); } }),
});
const post = recorded.find((r) => r.method === 'POST');
expect(post).toBeDefined();
expect(post!.headers['authorization']).toBe('Bearer my-secret-token');
});
it('does not send Authorization header when no token', async () => {
recorded.length = 0;
const stdin = new Readable({ read() {} });
const { stdout } = createMockStreams();
const initMsg = JSON.stringify({
jsonrpc: '2.0', id: 1, method: 'initialize',
params: { protocolVersion: '2024-11-05', capabilities: {}, clientInfo: { name: 'test', version: '1.0' } },
});
pushAndEnd(stdin, [initMsg]);
await runMcpBridge({
projectName: 'test-project',
mcplocalUrl: `http://localhost:${mockPort}`,
stdin, stdout, stderr: new Writable({ write(_, __, cb) { cb(); } }),
});
const post = recorded.find((r) => r.method === 'POST');
expect(post).toBeDefined();
expect(post!.headers['authorization']).toBeUndefined();
});
it('sends DELETE to clean up session on stdin EOF', async () => {
recorded.length = 0;
const stdin = new Readable({ read() {} });
const { stdout } = createMockStreams();
const initMsg = JSON.stringify({
jsonrpc: '2.0', id: 1, method: 'initialize',
params: { protocolVersion: '2024-11-05', capabilities: {}, clientInfo: { name: 'test', version: '1.0' } },
});
pushAndEnd(stdin, [initMsg]);
await runMcpBridge({
projectName: 'test-project',
mcplocalUrl: `http://localhost:${mockPort}`,
stdin, stdout, stderr: new Writable({ write(_, __, cb) { cb(); } }),
});
// Should have a DELETE request for session cleanup
const deleteReq = recorded.find((r) => r.method === 'DELETE');
expect(deleteReq).toBeDefined();
expect(deleteReq!.headers['mcp-session-id']).toMatch(/^session-/);
});
it('does not send DELETE if no session was established', async () => {
recorded.length = 0;
const stdin = new Readable({ read() {} });
const { stdout } = createMockStreams();
// Push EOF immediately with no messages
stdin.push(null);
await runMcpBridge({
projectName: 'test-project',
mcplocalUrl: `http://localhost:${mockPort}`,
stdin, stdout, stderr: new Writable({ write(_, __, cb) { cb(); } }),
});
expect(recorded.filter((r) => r.method === 'DELETE')).toHaveLength(0);
});
it('writes errors to stderr, not stdout', async () => {
recorded.length = 0;
const stdin = new Readable({ read() {} });
const { stdout, stdoutChunks, stderr, stderrChunks } = createMockStreams();
// Send to a non-existent port to trigger connection error
const badMsg = JSON.stringify({ jsonrpc: '2.0', id: 1, method: 'initialize', params: {} });
pushAndEnd(stdin, [badMsg]);
await runMcpBridge({
projectName: 'test-project',
mcplocalUrl: 'http://localhost:1', // will fail to connect
stdin, stdout, stderr,
});
// Error should be on stderr
expect(stderrChunks.join('')).toContain('MCP bridge error');
// stdout should be empty (no corrupted output)
expect(stdoutChunks.join('')).toBe('');
});
it('skips blank lines in stdin', async () => {
recorded.length = 0;
const stdin = new Readable({ read() {} });
const { stdout, stdoutChunks } = createMockStreams();
const initMsg = JSON.stringify({
jsonrpc: '2.0', id: 1, method: 'initialize',
params: { protocolVersion: '2024-11-05', capabilities: {}, clientInfo: { name: 'test', version: '1.0' } },
});
pushAndEnd(stdin, ['', ' ', initMsg, '']);
await runMcpBridge({
projectName: 'test-project',
mcplocalUrl: `http://localhost:${mockPort}`,
stdin, stdout, stderr: new Writable({ write(_, __, cb) { cb(); } }),
});
// Only one POST (for the actual message)
const posts = recorded.filter((r) => r.method === 'POST');
expect(posts).toHaveLength(1);
// One response line
const lines = stdoutChunks.join('').trim().split('\n');
expect(lines).toHaveLength(1);
});
it('handles SSE (text/event-stream) responses', async () => {
recorded.length = 0;
const stdin = new Readable({ read() {} });
const { stdout, stdoutChunks } = createMockStreams();
const initMsg = JSON.stringify({
jsonrpc: '2.0', id: 1, method: 'initialize',
params: { protocolVersion: '2024-11-05', capabilities: {}, clientInfo: { name: 'test', version: '1.0' } },
});
pushAndEnd(stdin, [initMsg]);
await runMcpBridge({
projectName: 'sse-project', // triggers SSE response from mock server
mcplocalUrl: `http://localhost:${mockPort}`,
stdin, stdout, stderr: new Writable({ write(_, __, cb) { cb(); } }),
});
// Should extract JSON from SSE data: lines
const output = stdoutChunks.join('').trim();
const parsed = JSON.parse(output);
expect(parsed.result.serverInfo.name).toBe('test-server');
});
it('URL-encodes project name', async () => {
recorded.length = 0;
const stdin = new Readable({ read() {} });
const { stdout, stderr } = createMockStreams();
const initMsg = JSON.stringify({
jsonrpc: '2.0', id: 1, method: 'initialize',
params: { protocolVersion: '2024-11-05', capabilities: {}, clientInfo: { name: 'test', version: '1.0' } },
});
pushAndEnd(stdin, [initMsg]);
await runMcpBridge({
projectName: 'my project',
mcplocalUrl: `http://localhost:${mockPort}`,
stdin, stdout, stderr,
});
const post = recorded.find((r) => r.method === 'POST');
expect(post?.url).toBe('/projects/my%20project/mcp');
});
});
describe('createMcpCommand', () => {
it('accepts --project option directly', () => {
const cmd = createMcpCommand({
getProject: () => undefined,
configLoader: () => ({ mcplocalUrl: 'http://localhost:3200' }),
credentialsLoader: () => null,
});
const opt = cmd.options.find((o) => o.long === '--project');
expect(opt).toBeDefined();
expect(opt!.short).toBe('-p');
});
it('parses --project from command args', async () => {
let capturedProject: string | undefined;
const cmd = createMcpCommand({
getProject: () => undefined,
configLoader: () => ({ mcplocalUrl: `http://localhost:${mockPort}` }),
credentialsLoader: () => null,
});
// Verify option parsing directly rather than running the full bridge
const parsed = cmd.parse(['--project', 'test-proj'], { from: 'user' });
capturedProject = parsed.opts().project;
expect(capturedProject).toBe('test-proj');
});
it('parses -p shorthand from command args', () => {
const cmd = createMcpCommand({
getProject: () => undefined,
configLoader: () => ({ mcplocalUrl: `http://localhost:${mockPort}` }),
credentialsLoader: () => null,
});
const parsed = cmd.parse(['-p', 'my-project'], { from: 'user' });
expect(parsed.opts().project).toBe('my-project');
});
});
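The SSE test above ('handles SSE (text/event-stream) responses') relies on the bridge extracting JSON-RPC payloads from `data:` lines of a `text/event-stream` body. A minimal sketch of that extraction, assuming simple single-line `data:` framing (the function name and framing details are illustrative, not the bridge's actual implementation):

```typescript
// Pull JSON payloads out of an SSE body: only "data:" lines carry the
// JSON-RPC response; "event:", "id:", and comment lines are skipped.
function extractSseJson(body: string): unknown[] {
  const messages: unknown[] = [];
  for (const line of body.split('\n')) {
    const trimmed = line.trim();
    if (!trimmed.startsWith('data:')) continue;
    const payload = trimmed.slice('data:'.length).trim();
    if (payload !== '') messages.push(JSON.parse(payload));
  }
  return messages;
}
```

A full SSE parser would also join multi-line `data:` fields within one event, which this sketch skips.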


@@ -1,17 +1,19 @@
import { describe, it, expect, vi, beforeEach } from 'vitest';
-import { createProjectCommand } from '../../src/commands/project.js';
-import type { ApiClient } from '../../src/api-client.js';
import { createCreateCommand } from '../../src/commands/create.js';
import { createGetCommand } from '../../src/commands/get.js';
import { createDescribeCommand } from '../../src/commands/describe.js';
import { type ApiClient, ApiError } from '../../src/api-client.js';
function mockClient(): ApiClient {
return {
get: vi.fn(async () => []),
-post: vi.fn(async () => ({ id: 'proj-1', name: 'my-project' })),
post: vi.fn(async () => ({ id: 'new-id', name: 'test' })),
put: vi.fn(async () => ({})),
delete: vi.fn(async () => {}),
} as unknown as ApiClient;
}
-describe('project command', () => {
describe('project with new fields', () => {
let client: ReturnType<typeof mockClient>;
let output: string[];
const log = (...args: unknown[]) => output.push(args.map(String).join(' '));
@@ -21,9 +23,94 @@ describe('project command', () => {
output = [];
});
-it('creates command with alias', () => {
-const cmd = createProjectCommand({ client, log });
-expect(cmd.name()).toBe('project');
-expect(cmd.alias()).toBe('proj');
describe('create project with enhanced options', () => {
it('creates project with proxy mode and servers', async () => {
const cmd = createCreateCommand({ client, log });
await cmd.parseAsync([
'project', 'smart-home',
'-d', 'Smart home project',
'--proxy-mode', 'filtered',
'--llm-provider', 'gemini-cli',
'--llm-model', 'gemini-2.0-flash',
'--server', 'my-grafana',
'--server', 'my-ha',
], { from: 'user' });
expect(client.post).toHaveBeenCalledWith('/api/v1/projects', expect.objectContaining({
name: 'smart-home',
description: 'Smart home project',
proxyMode: 'filtered',
llmProvider: 'gemini-cli',
llmModel: 'gemini-2.0-flash',
servers: ['my-grafana', 'my-ha'],
}));
});
it('defaults proxy mode to direct', async () => {
const cmd = createCreateCommand({ client, log });
await cmd.parseAsync(['project', 'basic'], { from: 'user' });
expect(client.post).toHaveBeenCalledWith('/api/v1/projects', expect.objectContaining({
proxyMode: 'direct',
}));
});
});
describe('get projects shows new columns', () => {
it('shows MODE and SERVERS columns', async () => {
const deps = {
output: [] as string[],
fetchResource: vi.fn(async () => [{
id: 'proj-1',
name: 'smart-home',
description: 'Test',
proxyMode: 'filtered',
ownerId: 'user-1',
servers: [{ server: { name: 'grafana' } }, { server: { name: 'ha' } }],
}]),
log: (...args: string[]) => deps.output.push(args.join(' ')),
};
const cmd = createGetCommand(deps);
await cmd.parseAsync(['node', 'test', 'projects']);
const text = deps.output.join('\n');
expect(text).toContain('MODE');
expect(text).toContain('SERVERS');
expect(text).toContain('smart-home');
});
});
describe('describe project shows full detail', () => {
it('shows servers and proxy config', async () => {
const deps = {
output: [] as string[],
client: mockClient(),
fetchResource: vi.fn(async () => ({
id: 'proj-1',
name: 'smart-home',
description: 'Smart home',
proxyMode: 'filtered',
llmProvider: 'gemini-cli',
llmModel: 'gemini-2.0-flash',
ownerId: 'user-1',
servers: [
{ server: { name: 'my-grafana' } },
{ server: { name: 'my-ha' } },
],
createdAt: '2025-01-01',
updatedAt: '2025-01-01',
})),
log: (...args: string[]) => deps.output.push(args.join(' ')),
};
const cmd = createDescribeCommand(deps);
await cmd.parseAsync(['node', 'test', 'project', 'proj-1']);
const text = deps.output.join('\n');
expect(text).toContain('=== Project: smart-home ===');
expect(text).toContain('filtered');
expect(text).toContain('gemini-cli');
expect(text).toContain('my-grafana');
expect(text).toContain('my-ha');
});
});
});


@@ -3,19 +3,38 @@ import { mkdtempSync, rmSync } from 'node:fs';
import { join } from 'node:path';
import { tmpdir } from 'node:os';
import { createStatusCommand } from '../../src/commands/status.js';
import type { StatusCommandDeps } from '../../src/commands/status.js';
import { saveConfig, DEFAULT_CONFIG } from '../../src/config/index.js';
import { saveCredentials } from '../../src/auth/index.js';
let tempDir: string;
let output: string[];
let written: string[];
function log(...args: string[]) {
output.push(args.join(' '));
}
function write(text: string) {
written.push(text);
}
function baseDeps(overrides?: Partial<StatusCommandDeps>): Partial<StatusCommandDeps> {
return {
configDeps: { configDir: tempDir },
credentialsDeps: { configDir: tempDir },
log,
write,
checkHealth: async () => true,
isTTY: false,
...overrides,
};
}
beforeEach(() => {
tempDir = mkdtempSync(join(tmpdir(), 'mcpctl-status-test-'));
output = [];
written = [];
});
afterEach(() => {
@@ -24,12 +43,7 @@ afterEach(() => {
describe('status command', () => {
it('shows status in table format', async () => {
-const cmd = createStatusCommand({
-configDeps: { configDir: tempDir },
-credentialsDeps: { configDir: tempDir },
-log,
-checkHealth: async () => true,
-});
const cmd = createStatusCommand(baseDeps());
await cmd.parseAsync([], { from: 'user' });
const out = output.join('\n');
expect(out).toContain('mcpctl v');
@@ -39,46 +53,26 @@
});
it('shows unreachable when daemons are down', async () => {
-const cmd = createStatusCommand({
-configDeps: { configDir: tempDir },
-credentialsDeps: { configDir: tempDir },
-log,
-checkHealth: async () => false,
-});
const cmd = createStatusCommand(baseDeps({ checkHealth: async () => false }));
await cmd.parseAsync([], { from: 'user' });
expect(output.join('\n')).toContain('unreachable');
});
it('shows not logged in when no credentials', async () => {
-const cmd = createStatusCommand({
-configDeps: { configDir: tempDir },
-credentialsDeps: { configDir: tempDir },
-log,
-checkHealth: async () => true,
-});
const cmd = createStatusCommand(baseDeps());
await cmd.parseAsync([], { from: 'user' });
expect(output.join('\n')).toContain('not logged in');
});
it('shows logged in user when credentials exist', async () => {
saveCredentials({ token: 'tok', mcpdUrl: 'http://x:3100', user: 'alice@example.com' }, { configDir: tempDir });
-const cmd = createStatusCommand({
-configDeps: { configDir: tempDir },
-credentialsDeps: { configDir: tempDir },
-log,
-checkHealth: async () => true,
-});
const cmd = createStatusCommand(baseDeps());
await cmd.parseAsync([], { from: 'user' });
expect(output.join('\n')).toContain('logged in as alice@example.com');
});
it('shows status in JSON format', async () => {
-const cmd = createStatusCommand({
-configDeps: { configDir: tempDir },
-credentialsDeps: { configDir: tempDir },
-log,
-checkHealth: async () => true,
-});
const cmd = createStatusCommand(baseDeps());
await cmd.parseAsync(['-o', 'json'], { from: 'user' });
const parsed = JSON.parse(output[0]) as Record<string, unknown>;
expect(parsed['version']).toBe('0.1.0');
@@ -87,12 +81,7 @@
});
it('shows status in YAML format', async () => {
-const cmd = createStatusCommand({
-configDeps: { configDir: tempDir },
-credentialsDeps: { configDir: tempDir },
-log,
-checkHealth: async () => false,
-});
const cmd = createStatusCommand(baseDeps({ checkHealth: async () => false }));
await cmd.parseAsync(['-o', 'yaml'], { from: 'user' });
expect(output[0]).toContain('mcplocalReachable: false');
});
@@ -100,15 +89,12 @@
it('checks correct URLs from config', async () => {
saveConfig({ ...DEFAULT_CONFIG, mcplocalUrl: 'http://local:3200', mcpdUrl: 'http://remote:3100' }, { configDir: tempDir });
const checkedUrls: string[] = [];
-const cmd = createStatusCommand({
-configDeps: { configDir: tempDir },
-credentialsDeps: { configDir: tempDir },
-log,
const cmd = createStatusCommand(baseDeps({
checkHealth: async (url) => {
checkedUrls.push(url);
return false;
},
-});
}));
await cmd.parseAsync([], { from: 'user' });
expect(checkedUrls).toContain('http://local:3200');
expect(checkedUrls).toContain('http://remote:3100');
@@ -116,14 +102,100 @@
it('shows registries from config', async () => {
saveConfig({ ...DEFAULT_CONFIG, registries: ['official'] }, { configDir: tempDir });
-const cmd = createStatusCommand({
-configDeps: { configDir: tempDir },
-credentialsDeps: { configDir: tempDir },
-log,
-checkHealth: async () => true,
-});
const cmd = createStatusCommand(baseDeps());
await cmd.parseAsync([], { from: 'user' });
expect(output.join('\n')).toContain('official');
expect(output.join('\n')).not.toContain('glama');
});
it('shows LLM not configured hint when no LLM is set', async () => {
const cmd = createStatusCommand(baseDeps());
await cmd.parseAsync([], { from: 'user' });
const out = output.join('\n');
expect(out).toContain('LLM:');
expect(out).toContain('not configured');
expect(out).toContain('mcpctl config setup');
});
it('shows green check when LLM is healthy (non-TTY)', async () => {
saveConfig({ ...DEFAULT_CONFIG, llm: { provider: 'anthropic', model: 'claude-haiku-3-5-20241022' } }, { configDir: tempDir });
const cmd = createStatusCommand(baseDeps({ checkLlm: async () => 'ok' }));
await cmd.parseAsync([], { from: 'user' });
const out = output.join('\n');
expect(out).toContain('anthropic / claude-haiku-3-5-20241022');
expect(out).toContain('✓ ok');
});
it('shows red cross when LLM check fails (non-TTY)', async () => {
saveConfig({ ...DEFAULT_CONFIG, llm: { provider: 'gemini-cli', model: 'gemini-2.5-flash' } }, { configDir: tempDir });
const cmd = createStatusCommand(baseDeps({ checkLlm: async () => 'not authenticated' }));
await cmd.parseAsync([], { from: 'user' });
const out = output.join('\n');
expect(out).toContain('✗ not authenticated');
});
it('shows error message from mcplocal', async () => {
saveConfig({ ...DEFAULT_CONFIG, llm: { provider: 'gemini-cli', model: 'gemini-2.5-flash' } }, { configDir: tempDir });
const cmd = createStatusCommand(baseDeps({ checkLlm: async () => 'binary not found' }));
await cmd.parseAsync([], { from: 'user' });
expect(output.join('\n')).toContain('✗ binary not found');
});
it('queries mcplocal URL for LLM health', async () => {
saveConfig({ ...DEFAULT_CONFIG, mcplocalUrl: 'http://custom:9999', llm: { provider: 'gemini-cli', model: 'gemini-2.5-flash' } }, { configDir: tempDir });
let queriedUrl = '';
const cmd = createStatusCommand(baseDeps({
checkLlm: async (url) => { queriedUrl = url; return 'ok'; },
}));
await cmd.parseAsync([], { from: 'user' });
expect(queriedUrl).toBe('http://custom:9999');
});
it('uses spinner on TTY and writes final result', async () => {
saveConfig({ ...DEFAULT_CONFIG, llm: { provider: 'gemini-cli', model: 'gemini-2.5-flash' } }, { configDir: tempDir });
const cmd = createStatusCommand(baseDeps({
isTTY: true,
checkLlm: async () => 'ok',
}));
await cmd.parseAsync([], { from: 'user' });
// On TTY, the final LLM line goes through write(), not log()
const finalWrite = written[written.length - 1];
expect(finalWrite).toContain('gemini-cli / gemini-2.5-flash');
expect(finalWrite).toContain('✓ ok');
});
it('uses spinner on TTY and shows failure', async () => {
saveConfig({ ...DEFAULT_CONFIG, llm: { provider: 'gemini-cli', model: 'gemini-2.5-flash' } }, { configDir: tempDir });
const cmd = createStatusCommand(baseDeps({
isTTY: true,
checkLlm: async () => 'not authenticated',
}));
await cmd.parseAsync([], { from: 'user' });
const finalWrite = written[written.length - 1];
expect(finalWrite).toContain('✗ not authenticated');
});
it('shows not configured when LLM provider is none', async () => {
saveConfig({ ...DEFAULT_CONFIG, llm: { provider: 'none' } }, { configDir: tempDir });
const cmd = createStatusCommand(baseDeps());
await cmd.parseAsync([], { from: 'user' });
expect(output.join('\n')).toContain('not configured');
});
it('includes llm and llmStatus in JSON output', async () => {
saveConfig({ ...DEFAULT_CONFIG, llm: { provider: 'gemini-cli', model: 'gemini-2.5-flash' } }, { configDir: tempDir });
const cmd = createStatusCommand(baseDeps({ checkLlm: async () => 'ok' }));
await cmd.parseAsync(['-o', 'json'], { from: 'user' });
const parsed = JSON.parse(output[0]) as Record<string, unknown>;
expect(parsed['llm']).toBe('gemini-cli / gemini-2.5-flash');
expect(parsed['llmStatus']).toBe('ok');
});
it('includes null llm in JSON output when not configured', async () => {
const cmd = createStatusCommand(baseDeps());
await cmd.parseAsync(['-o', 'json'], { from: 'user' });
const parsed = JSON.parse(output[0]) as Record<string, unknown>;
expect(parsed['llm']).toBeNull();
expect(parsed['llmStatus']).toBeNull();
});
});
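Per the commit message, the `checkLlm` dependency stubbed throughout these tests corresponds to a query against mcplocal's `/llm/health` endpoint. A hedged sketch of such a checker (the response shape is an assumption; only the endpoint path comes from the commit message):

```typescript
// Ask mcplocal for LLM health and reduce the reply to a short status
// string ('ok' or an error message), matching the strings the tests stub.
async function checkLlmHealth(mcplocalUrl: string): Promise<string> {
  try {
    const res = await fetch(`${mcplocalUrl}/llm/health`);
    if (!res.ok) return `health check failed (HTTP ${res.status})`;
    // Assumed body shape: { status?: string, error?: string }
    const body = (await res.json()) as { status?: string; error?: string };
    return body.error ?? body.status ?? 'ok';
  } catch (err) {
    return `unreachable: ${(err as Error).message}`;
  }
}
```

Because the status command receives this as an injected dependency, the tests can swap in `async () => 'not authenticated'` or `async () => 'binary not found'` without any network traffic.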


@@ -0,0 +1,176 @@
import { describe, it, expect } from 'vitest';
import { readFileSync } from 'node:fs';
import { join, dirname } from 'node:path';
import { fileURLToPath } from 'node:url';
const root = join(dirname(fileURLToPath(import.meta.url)), '..', '..', '..');
const fishFile = readFileSync(join(root, 'completions', 'mcpctl.fish'), 'utf-8');
const bashFile = readFileSync(join(root, 'completions', 'mcpctl.bash'), 'utf-8');
describe('fish completions', () => {
it('erases stale completions at the top', () => {
const lines = fishFile.split('\n');
const firstComplete = lines.findIndex((l) => l.startsWith('complete '));
expect(lines[firstComplete]).toContain('-e');
});
it('does not offer resource types without __mcpctl_needs_resource_type guard', () => {
const resourceTypes = ['servers', 'instances', 'secrets', 'templates', 'projects', 'users', 'groups', 'rbac', 'prompts', 'promptrequests'];
const lines = fishFile.split('\n').filter((l) => l.startsWith('complete '));
for (const line of lines) {
// Find lines that offer resource types as positional args
const offersResourceType = resourceTypes.some((r) => {
// Match `-a "...servers..."` or `-a 'servers projects'`
const aMatch = line.match(/-a\s+['"]([^'"]+)['"]/);
if (!aMatch) return false;
return aMatch[1].split(/\s+/).includes(r);
});
if (!offersResourceType) continue;
// Skip the help completions line and the -e line
if (line.includes('__fish_seen_subcommand_from help')) continue;
// Skip project-scoped command offerings (those offer commands, not resource types)
if (line.includes('attach-server') || line.includes('detach-server')) continue;
// Skip lines that offer commands (not resource types)
if (line.includes("-d 'Show") || line.includes("-d 'Manage") || line.includes("-d 'Authenticate") ||
line.includes("-d 'Log out'") || line.includes("-d 'Get instance") || line.includes("-d 'Create a resource'") ||
line.includes("-d 'Edit a resource'") || line.includes("-d 'Apply") || line.includes("-d 'Backup") ||
line.includes("-d 'Restore") || line.includes("-d 'List resources") || line.includes("-d 'Delete a resource'")) continue;
// Lines offering resource types MUST have __mcpctl_needs_resource_type in their condition
expect(line, `Resource type completion missing guard: ${line}`).toContain('__mcpctl_needs_resource_type');
}
});
it('resource name completions require resource type to be selected', () => {
const lines = fishFile.split('\n').filter((l) => l.startsWith('complete') && l.includes('__mcpctl_resource_names'));
expect(lines.length).toBeGreaterThan(0);
for (const line of lines) {
expect(line).toContain('not __mcpctl_needs_resource_type');
}
});
it('defines --project option', () => {
expect(fishFile).toContain("complete -c mcpctl -l project");
});
it('attach-server command only shows with --project', () => {
// Only check lines that OFFER attach-server as a command (via -a attach-server), not argument completions
const lines = fishFile.split('\n').filter((l) =>
l.startsWith('complete') && l.includes("-a attach-server"));
expect(lines.length).toBeGreaterThan(0);
for (const line of lines) {
expect(line).toContain('__mcpctl_has_project');
}
});
it('detach-server command only shows with --project', () => {
const lines = fishFile.split('\n').filter((l) =>
l.startsWith('complete') && l.includes("-a detach-server"));
expect(lines.length).toBeGreaterThan(0);
for (const line of lines) {
expect(line).toContain('__mcpctl_has_project');
}
});
it('resource name functions use jq .[][].name to unwrap wrapped JSON and avoid nested matches', () => {
// API returns { "resources": [...] } not [...], so .[].name fails silently.
// Must use .[][].name to unwrap the outer object then iterate the array.
// Also must not use string match regex which matches nested name fields.
const resourceNamesFn = fishFile.match(/function __mcpctl_resource_names[\s\S]*?^end/m)?.[0] ?? '';
const projectNamesFn = fishFile.match(/function __mcpctl_project_names[\s\S]*?^end/m)?.[0] ?? '';
expect(resourceNamesFn, '__mcpctl_resource_names must use jq .[][].name').toContain("jq -r '.[][].name'");
expect(resourceNamesFn, '__mcpctl_resource_names must not use string match on name').not.toMatch(/string match.*"name"/);
expect(projectNamesFn, '__mcpctl_project_names must use jq .[][].name').toContain("jq -r '.[][].name'");
expect(projectNamesFn, '__mcpctl_project_names must not use string match on name').not.toMatch(/string match.*"name"/);
});
it('instances use server.name instead of name', () => {
const resourceNamesFn = fishFile.match(/function __mcpctl_resource_names[\s\S]*?^end/m)?.[0] ?? '';
expect(resourceNamesFn, 'must handle instances via server.name').toContain('.server.name');
});
it('attach-server completes with available (unattached) servers and guards against repeat', () => {
const attachLine = fishFile.split('\n').find((l) =>
l.startsWith('complete') && l.includes('__fish_seen_subcommand_from attach-server'));
expect(attachLine, 'attach-server argument completion must exist').toBeDefined();
expect(attachLine, 'attach-server must use __mcpctl_available_servers').toContain('__mcpctl_available_servers');
expect(attachLine, 'attach-server must guard with __mcpctl_needs_server_arg').toContain('__mcpctl_needs_server_arg');
});
it('detach-server completes with project servers and guards against repeat', () => {
const detachLine = fishFile.split('\n').find((l) =>
l.startsWith('complete') && l.includes('__fish_seen_subcommand_from detach-server'));
expect(detachLine, 'detach-server argument completion must exist').toBeDefined();
expect(detachLine, 'detach-server must use __mcpctl_project_servers').toContain('__mcpctl_project_servers');
expect(detachLine, 'detach-server must guard with __mcpctl_needs_server_arg').toContain('__mcpctl_needs_server_arg');
});
it('non-project commands do not show with --project', () => {
const nonProjectCmds = ['status', 'login', 'logout', 'config', 'apply', 'backup', 'restore'];
const lines = fishFile.split('\n').filter((l) => l.startsWith('complete') && l.includes('-a '));
for (const cmd of nonProjectCmds) {
const cmdLines = lines.filter((l) => {
const aMatch = l.match(/-a\s+(\S+)/);
return aMatch && aMatch[1].replace(/['"]/g, '') === cmd;
});
for (const line of cmdLines) {
expect(line, `${cmd} should require 'not __mcpctl_has_project'`).toContain('not __mcpctl_has_project');
}
}
});
});
describe('bash completions', () => {
it('separates project commands from regular commands', () => {
expect(bashFile).toContain('project_commands=');
expect(bashFile).toContain('attach-server detach-server');
});
it('checks has_project before offering project commands', () => {
expect(bashFile).toContain('if $has_project');
expect(bashFile).toContain('$project_commands');
});
it('fetches resource names dynamically after resource type', () => {
expect(bashFile).toContain('_mcpctl_resource_names');
// get/describe/delete should use resource_names when resource_type is set
expect(bashFile).toMatch(/get\|describe\|delete\)[\s\S]*?_mcpctl_resource_names/);
});
it('attach-server filters out already-attached servers and guards against repeat', () => {
const attachBlock = bashFile.match(/attach-server\)[\s\S]*?return ;;/)?.[0] ?? '';
expect(attachBlock, 'attach-server must use _mcpctl_get_project_value').toContain('_mcpctl_get_project_value');
expect(attachBlock, 'attach-server must query project servers to exclude').toContain('--project');
expect(attachBlock, 'attach-server must check position to prevent repeat').toContain('cword - subcmd_pos');
});
it('detach-server shows only project servers and guards against repeat', () => {
const detachBlock = bashFile.match(/detach-server\)[\s\S]*?return ;;/)?.[0] ?? '';
expect(detachBlock, 'detach-server must use _mcpctl_get_project_value').toContain('_mcpctl_get_project_value');
expect(detachBlock, 'detach-server must query project servers').toContain('--project');
expect(detachBlock, 'detach-server must check position to prevent repeat').toContain('cword - subcmd_pos');
});
it('instances use server.name instead of name', () => {
const fnMatch = bashFile.match(/_mcpctl_resource_names\(\)[\s\S]*?\n\s*\}/)?.[0] ?? '';
expect(fnMatch, 'must handle instances via .server.name').toContain('.server.name');
});
it('defines --project option', () => {
expect(bashFile).toContain('--project');
});
it('resource name function uses jq .[][].name to unwrap wrapped JSON and avoid nested matches', () => {
const fnMatch = bashFile.match(/_mcpctl_resource_names\(\)[\s\S]*?\n\s*\}/)?.[0] ?? '';
expect(fnMatch, '_mcpctl_resource_names must use jq .[][].name').toContain("jq -r '.[][].name'");
expect(fnMatch, '_mcpctl_resource_names must not use grep on name').not.toMatch(/grep.*"name"/);
// Guard against .[].name (single bracket) which fails on wrapped JSON
expect(fnMatch, '_mcpctl_resource_names must not use .[].name (needs .[][].name)').not.toMatch(/jq.*'\.\[\]\.name'/);
});
});
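The jq filter `.[][].name` that both completion test suites insist on has a direct TypeScript analogue: iterate the wrapper object's values first, then the inner array. A small illustration of the unwrapping (payload shape taken from the test comments about wrapped list responses):

```typescript
// List responses are wrapped ({ servers: [...] }), not bare arrays, so a
// single-level iteration (jq's .[].name) hits the wrapper object and finds
// no names; two levels (.[][].name) unwrap the object, then each element.
type Named = { name: string };
function namesFromWrapped(payload: Record<string, Named[]>): string[] {
  return Object.values(payload).flatMap((items) => items.map((i) => i.name));
}

const wrapped = { servers: [{ name: 'my-grafana' }, { name: 'my-ha' }] };
const names = namesFromWrapped(wrapped);
```

The same reasoning explains the instance special case: instance objects carry the name one level deeper, under `.server.name`.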


@@ -21,35 +21,44 @@ describe('CLI command registration (e2e)', () => {
expect(commandNames).toContain('apply');
expect(commandNames).toContain('create');
expect(commandNames).toContain('edit');
-expect(commandNames).toContain('claude');
-expect(commandNames).toContain('project');
expect(commandNames).toContain('backup');
expect(commandNames).toContain('restore');
});
-it('instance command is removed (use get/delete/logs instead)', () => {
it('old project and claude top-level commands are removed', () => {
const program = createProgram();
const commandNames = program.commands.map((c) => c.name());
expect(commandNames).not.toContain('claude');
expect(commandNames).not.toContain('project');
expect(commandNames).not.toContain('instance');
});
-it('claude command has config management subcommands', () => {
it('config command has claude-generate and impersonate subcommands', () => {
const program = createProgram();
-const claude = program.commands.find((c) => c.name() === 'claude');
const config = program.commands.find((c) => c.name() === 'config');
-expect(claude).toBeDefined();
expect(config).toBeDefined();
-const subcommands = claude!.commands.map((c) => c.name());
const subcommands = config!.commands.map((c) => c.name());
-expect(subcommands).toContain('generate');
-expect(subcommands).toContain('show');
-expect(subcommands).toContain('add');
-expect(subcommands).toContain('remove');
-expect(subcommands).toContain('path');
-expect(subcommands).toContain('reset');
expect(subcommands).toContain('claude-generate');
expect(subcommands).toContain('impersonate');
expect(subcommands).toContain('view');
expect(subcommands).toContain('set');
});
-it('project command exists with alias', () => {
it('create command has user, group, rbac subcommands', () => {
const program = createProgram();
-const project = program.commands.find((c) => c.name() === 'project');
const create = program.commands.find((c) => c.name() === 'create');
-expect(project).toBeDefined();
expect(create).toBeDefined();
-expect(project!.alias()).toBe('proj');
const subcommands = create!.commands.map((c) => c.name());
expect(subcommands).toContain('server');
expect(subcommands).toContain('secret');
expect(subcommands).toContain('project');
expect(subcommands).toContain('user');
expect(subcommands).toContain('group');
expect(subcommands).toContain('rbac');
});
it('displays version', () => {


@@ -0,0 +1,8 @@
-- DropForeignKey
ALTER TABLE "ProjectMember" DROP CONSTRAINT IF EXISTS "ProjectMember_projectId_fkey";
-- DropForeignKey
ALTER TABLE "ProjectMember" DROP CONSTRAINT IF EXISTS "ProjectMember_userId_fkey";
-- DropTable
DROP TABLE IF EXISTS "ProjectMember";


@@ -15,13 +15,16 @@ model User {
name String?
passwordHash String
role Role @default(USER)
provider String?
externalId String?
version Int @default(1)
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
sessions Session[]
auditLogs AuditLog[]
-projects Project[]
ownedProjects Project[]
groupMemberships GroupMember[]
@@index([email])
}
@@ -71,6 +74,7 @@ model McpServer {
templateVersion String?
instances McpInstance[]
projects ProjectServer[]
@@index([name])
}
@@ -117,23 +121,85 @@ model Secret {
@@index([name])
}
// ── Groups ──
model Group {
id String @id @default(cuid())
name String @unique
description String @default("")
version Int @default(1)
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
members GroupMember[]
@@index([name])
}
model GroupMember {
id String @id @default(cuid())
groupId String
userId String
createdAt DateTime @default(now())
group Group @relation(fields: [groupId], references: [id], onDelete: Cascade)
user User @relation(fields: [userId], references: [id], onDelete: Cascade)
@@unique([groupId, userId])
@@index([groupId])
@@index([userId])
}
// ── RBAC Definitions ──
model RbacDefinition {
id String @id @default(cuid())
name String @unique
subjects Json @default("[]")
roleBindings Json @default("[]")
version Int @default(1)
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
@@index([name])
}
// ── Projects ──
model Project {
id String @id @default(cuid())
name String @unique
description String @default("")
prompt String @default("")
proxyMode String @default("direct")
llmProvider String?
llmModel String?
ownerId String
version Int @default(1)
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
owner User @relation(fields: [ownerId], references: [id], onDelete: Cascade)
servers ProjectServer[]
prompts Prompt[]
promptRequests PromptRequest[]
@@index([name])
@@index([ownerId])
}
model ProjectServer {
id String @id @default(cuid())
projectId String
serverId String
createdAt DateTime @default(now())
project Project @relation(fields: [projectId], references: [id], onDelete: Cascade)
server McpServer @relation(fields: [serverId], references: [id], onDelete: Cascade)
@@unique([projectId, serverId])
}
// ── MCP Instances (running containers) ──
model McpInstance {
@@ -164,6 +230,41 @@ enum InstanceStatus {
ERROR
}
// ── Prompts (approved content resources) ──
model Prompt {
id String @id @default(cuid())
name String
content String @db.Text
projectId String?
version Int @default(1)
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
project Project? @relation(fields: [projectId], references: [id], onDelete: Cascade)
@@unique([name, projectId])
@@index([projectId])
}
// ── Prompt Requests (pending proposals from LLM sessions) ──
model PromptRequest {
id String @id @default(cuid())
name String
content String @db.Text
projectId String?
createdBySession String?
createdByUserId String?
createdAt DateTime @default(now())
project Project? @relation(fields: [projectId], references: [id], onDelete: Cascade)
@@unique([name, projectId])
@@index([projectId])
@@index([createdBySession])
}
// ── Audit Logs ──
model AuditLog {

View File

@@ -49,10 +49,15 @@ export async function clearAllTables(client: PrismaClient): Promise<void> {
// Delete in order respecting foreign keys
await client.auditLog.deleteMany();
await client.mcpInstance.deleteMany();
await client.projectServer.deleteMany();
await client.projectMember.deleteMany();
await client.secret.deleteMany();
await client.session.deleteMany();
await client.project.deleteMany();
await client.mcpServer.deleteMany();
await client.mcpTemplate.deleteMany();
await client.groupMember.deleteMany();
await client.group.deleteMany();
await client.rbacDefinition.deleteMany();
await client.user.deleteMany();
}

View File

@@ -29,6 +29,29 @@ async function createUser(overrides: { email?: string; name?: string; role?: 'US
});
}
async function createGroup(overrides: { name?: string; description?: string } = {}) {
return prisma.group.create({
data: {
name: overrides.name ?? `group-${Date.now()}`,
description: overrides.description ?? 'Test group',
},
});
}
async function createProject(overrides: { name?: string; ownerId?: string } = {}) {
let ownerId = overrides.ownerId;
if (!ownerId) {
const user = await createUser();
ownerId = user.id;
}
return prisma.project.create({
data: {
name: overrides.name ?? `project-${Date.now()}`,
ownerId,
},
});
}
async function createServer(overrides: { name?: string; transport?: 'STDIO' | 'SSE' | 'STREAMABLE_HTTP' } = {}) {
return prisma.mcpServer.create({
data: {
@@ -310,3 +333,236 @@ describe('AuditLog', () => {
expect(logs).toHaveLength(0);
});
});
// ── User SSO fields ──
describe('User SSO fields', () => {
it('stores provider and externalId', async () => {
const user = await prisma.user.create({
data: {
email: 'sso@example.com',
passwordHash: 'hash',
provider: 'oidc',
externalId: 'ext-123',
},
});
expect(user.provider).toBe('oidc');
expect(user.externalId).toBe('ext-123');
});
it('defaults provider and externalId to null', async () => {
const user = await createUser();
expect(user.provider).toBeNull();
expect(user.externalId).toBeNull();
});
});
// ── Group model ──
describe('Group', () => {
it('creates a group with defaults', async () => {
const group = await createGroup();
expect(group.id).toBeDefined();
expect(group.version).toBe(1);
});
it('enforces unique name', async () => {
await createGroup({ name: 'devs' });
await expect(createGroup({ name: 'devs' })).rejects.toThrow();
});
it('creates group members', async () => {
const group = await createGroup();
const user = await createUser();
const member = await prisma.groupMember.create({
data: { groupId: group.id, userId: user.id },
});
expect(member.groupId).toBe(group.id);
expect(member.userId).toBe(user.id);
});
it('enforces unique group-user pair', async () => {
const group = await createGroup();
const user = await createUser();
await prisma.groupMember.create({ data: { groupId: group.id, userId: user.id } });
await expect(
prisma.groupMember.create({ data: { groupId: group.id, userId: user.id } }),
).rejects.toThrow();
});
it('cascades delete when group is deleted', async () => {
const group = await createGroup();
const user = await createUser();
await prisma.groupMember.create({ data: { groupId: group.id, userId: user.id } });
await prisma.group.delete({ where: { id: group.id } });
const members = await prisma.groupMember.findMany({ where: { groupId: group.id } });
expect(members).toHaveLength(0);
});
});
// ── RbacDefinition model ──
describe('RbacDefinition', () => {
it('creates with defaults', async () => {
const rbac = await prisma.rbacDefinition.create({
data: { name: 'test-rbac' },
});
expect(rbac.subjects).toEqual([]);
expect(rbac.roleBindings).toEqual([]);
expect(rbac.version).toBe(1);
});
it('enforces unique name', async () => {
await prisma.rbacDefinition.create({ data: { name: 'dup-rbac' } });
await expect(prisma.rbacDefinition.create({ data: { name: 'dup-rbac' } })).rejects.toThrow();
});
it('stores subjects as JSON', async () => {
const rbac = await prisma.rbacDefinition.create({
data: {
name: 'with-subjects',
subjects: [{ kind: 'User', name: 'alice@test.com' }, { kind: 'Group', name: 'devs' }],
},
});
const subjects = rbac.subjects as Array<{ kind: string; name: string }>;
expect(subjects).toHaveLength(2);
expect(subjects[0].kind).toBe('User');
});
it('stores roleBindings as JSON', async () => {
const rbac = await prisma.rbacDefinition.create({
data: {
name: 'with-bindings',
roleBindings: [{ role: 'editor', resource: 'servers' }],
},
});
const bindings = rbac.roleBindings as Array<{ role: string; resource: string }>;
expect(bindings).toHaveLength(1);
expect(bindings[0].role).toBe('editor');
});
it('updates subjects and roleBindings', async () => {
const rbac = await prisma.rbacDefinition.create({ data: { name: 'updatable-rbac' } });
const updated = await prisma.rbacDefinition.update({
where: { id: rbac.id },
data: {
subjects: [{ kind: 'User', name: 'bob@test.com' }],
roleBindings: [{ role: 'admin', resource: '*' }],
},
});
expect((updated.subjects as unknown[]).length).toBe(1);
expect((updated.roleBindings as unknown[]).length).toBe(1);
});
});
// ── ProjectServer model ──
describe('ProjectServer', () => {
it('links project to server', async () => {
const project = await createProject();
const server = await createServer();
const ps = await prisma.projectServer.create({
data: { projectId: project.id, serverId: server.id },
});
expect(ps.projectId).toBe(project.id);
expect(ps.serverId).toBe(server.id);
});
it('enforces unique project-server pair', async () => {
const project = await createProject();
const server = await createServer();
await prisma.projectServer.create({ data: { projectId: project.id, serverId: server.id } });
await expect(
prisma.projectServer.create({ data: { projectId: project.id, serverId: server.id } }),
).rejects.toThrow();
});
it('cascades delete when project is deleted', async () => {
const project = await createProject();
const server = await createServer();
await prisma.projectServer.create({ data: { projectId: project.id, serverId: server.id } });
await prisma.project.delete({ where: { id: project.id } });
const links = await prisma.projectServer.findMany({ where: { projectId: project.id } });
expect(links).toHaveLength(0);
});
it('cascades delete when server is deleted', async () => {
const project = await createProject();
const server = await createServer();
await prisma.projectServer.create({ data: { projectId: project.id, serverId: server.id } });
await prisma.mcpServer.delete({ where: { id: server.id } });
const links = await prisma.projectServer.findMany({ where: { serverId: server.id } });
expect(links).toHaveLength(0);
});
});
// ── ProjectMember model ──
describe('ProjectMember', () => {
it('links project to user with role', async () => {
const user = await createUser();
const project = await createProject({ ownerId: user.id });
const pm = await prisma.projectMember.create({
data: { projectId: project.id, userId: user.id, role: 'admin' },
});
expect(pm.role).toBe('admin');
});
it('defaults role to member', async () => {
const user = await createUser();
const project = await createProject({ ownerId: user.id });
const pm = await prisma.projectMember.create({
data: { projectId: project.id, userId: user.id },
});
expect(pm.role).toBe('member');
});
it('enforces unique project-user pair', async () => {
const user = await createUser();
const project = await createProject({ ownerId: user.id });
await prisma.projectMember.create({ data: { projectId: project.id, userId: user.id } });
await expect(
prisma.projectMember.create({ data: { projectId: project.id, userId: user.id } }),
).rejects.toThrow();
});
it('cascades delete when project is deleted', async () => {
const user = await createUser();
const project = await createProject({ ownerId: user.id });
await prisma.projectMember.create({ data: { projectId: project.id, userId: user.id } });
await prisma.project.delete({ where: { id: project.id } });
const members = await prisma.projectMember.findMany({ where: { projectId: project.id } });
expect(members).toHaveLength(0);
});
});
// ── Project new fields ──
describe('Project new fields', () => {
it('defaults proxyMode to direct', async () => {
const project = await createProject();
expect(project.proxyMode).toBe('direct');
});
it('stores proxyMode, llmProvider, llmModel', async () => {
const user = await createUser();
const project = await prisma.project.create({
data: {
name: 'filtered-project',
ownerId: user.id,
proxyMode: 'filtered',
llmProvider: 'gemini-cli',
llmModel: 'gemini-2.0-flash',
},
});
expect(project.proxyMode).toBe('filtered');
expect(project.llmProvider).toBe('gemini-cli');
expect(project.llmModel).toBe('gemini-2.0-flash');
});
it('defaults llmProvider and llmModel to null', async () => {
const project = await createProject();
expect(project.llmProvider).toBeNull();
expect(project.llmModel).toBeNull();
});
});

View File

@@ -14,7 +14,12 @@ import {
ProjectRepository,
AuditLogRepository,
TemplateRepository,
RbacDefinitionRepository,
UserRepository,
GroupRepository,
} from './repositories/index.js';
import { PromptRepository } from './repositories/prompt.repository.js';
import { PromptRequestRepository } from './repositories/prompt-request.repository.js';
import {
McpServerService,
SecretService,
@@ -30,7 +35,14 @@ import {
McpProxyService,
TemplateService,
HealthProbeRunner,
RbacDefinitionService,
RbacService,
UserService,
GroupService,
} from './services/index.js';
import type { RbacAction } from './services/index.js';
import type { UpdateRbacDefinitionInput } from './validation/rbac-definition.schema.js';
import { createAuthMiddleware } from './middleware/auth.js';
import {
registerMcpServerRoutes,
registerSecretRoutes,
@@ -42,7 +54,155 @@ import {
registerAuthRoutes,
registerMcpProxyRoutes,
registerTemplateRoutes,
registerRbacRoutes,
registerUserRoutes,
registerGroupRoutes,
} from './routes/index.js';
import { registerPromptRoutes } from './routes/prompts.js';
import { PromptService } from './services/prompt.service.js';
type PermissionCheck =
| { kind: 'resource'; resource: string; action: RbacAction; resourceName?: string }
| { kind: 'operation'; operation: string }
| { kind: 'skip' };
/**
* Map an HTTP method + URL to a permission check.
* Returns 'skip' for URLs that should not be RBAC-checked.
*/
function mapUrlToPermission(method: string, url: string): PermissionCheck {
const match = url.match(/^\/api\/v1\/([a-z-]+)/);
if (!match) return { kind: 'skip' };
const segment = match[1] as string;
// Operations (non-resource endpoints)
if (segment === 'backup') return { kind: 'operation', operation: 'backup' };
if (segment === 'restore') return { kind: 'operation', operation: 'restore' };
if (segment === 'audit-logs' && method === 'DELETE') return { kind: 'operation', operation: 'audit-purge' };
const resourceMap: Record<string, string | undefined> = {
'servers': 'servers',
'instances': 'instances',
'secrets': 'secrets',
'projects': 'projects',
'templates': 'templates',
'users': 'users',
'groups': 'groups',
'rbac': 'rbac',
'audit-logs': 'rbac',
'mcp': 'servers',
'prompts': 'prompts',
'promptrequests': 'promptrequests',
};
const resource = resourceMap[segment];
if (resource === undefined) return { kind: 'skip' };
// Special case: /api/v1/promptrequests/:id/approve → needs both delete+promptrequests and create+prompts
// We check delete on promptrequests (the harder permission); create on prompts is checked in the service layer
const approveMatch = url.match(/^\/api\/v1\/promptrequests\/([^/?]+)\/approve/);
if (approveMatch?.[1]) {
return { kind: 'resource', resource: 'promptrequests', action: 'delete', resourceName: approveMatch[1] };
}
// Special case: /api/v1/projects/:name/prompts/visible → view prompts
const visiblePromptsMatch = url.match(/^\/api\/v1\/projects\/([^/?]+)\/prompts\/visible/);
if (visiblePromptsMatch?.[1]) {
return { kind: 'resource', resource: 'prompts', action: 'view' };
}
// Special case: /api/v1/projects/:name/promptrequests → create promptrequests
const projectPromptrequestsMatch = url.match(/^\/api\/v1\/projects\/([^/?]+)\/promptrequests/);
if (projectPromptrequestsMatch?.[1] && method === 'POST') {
return { kind: 'resource', resource: 'promptrequests', action: 'create' };
}
// Special case: /api/v1/projects/:id/instructions → view projects
const instructionsMatch = url.match(/^\/api\/v1\/projects\/([^/?]+)\/instructions/);
if (instructionsMatch?.[1]) {
return { kind: 'resource', resource: 'projects', action: 'view', resourceName: instructionsMatch[1] };
}
// Special case: /api/v1/projects/:id/mcp-config → requires 'expose' permission
const mcpConfigMatch = url.match(/^\/api\/v1\/projects\/([^/?]+)\/mcp-config/);
if (mcpConfigMatch?.[1]) {
return { kind: 'resource', resource: 'projects', action: 'expose', resourceName: mcpConfigMatch[1] };
}
// Special case: /api/v1/projects/:id/servers — attach/detach requires 'edit'
const projectServersMatch = url.match(/^\/api\/v1\/projects\/([^/?]+)\/servers/);
if (projectServersMatch?.[1] && method !== 'GET') {
return { kind: 'resource', resource: 'projects', action: 'edit', resourceName: projectServersMatch[1] };
}
// Map HTTP method to action
let action: RbacAction;
switch (method) {
case 'GET':
case 'HEAD':
action = 'view';
break;
case 'POST':
action = 'create';
break;
case 'DELETE':
action = 'delete';
break;
default: // PUT, PATCH
action = 'edit';
break;
}
// Extract resource name/ID from URL (3rd segment: /api/v1/servers/:nameOrId)
const nameMatch = url.match(/^\/api\/v1\/[a-z-]+\/([^/?]+)/);
const resourceName = nameMatch?.[1];
const check: PermissionCheck = { kind: 'resource', resource, action };
if (resourceName !== undefined) (check as { resourceName: string }).resourceName = resourceName;
return check;
}
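For orientation, the generic part of this mapping can be sketched as a standalone function. This is a simplified illustration only: the operation endpoints and the special-case routes above are omitted, and the names here are hypothetical, not the real exports.

```typescript
// Simplified, self-contained sketch of the generic URL → permission mapping.
// The real function also handles operations and several special-case routes.
type RbacActionSketch = 'view' | 'create' | 'edit' | 'delete';

const KNOWN_RESOURCES = new Set(['servers', 'instances', 'secrets', 'projects', 'templates']);

function methodToAction(method: string): RbacActionSketch {
  switch (method) {
    case 'GET':
    case 'HEAD':
      return 'view';
    case 'POST':
      return 'create';
    case 'DELETE':
      return 'delete';
    default: // PUT, PATCH
      return 'edit';
  }
}

function sketchMapUrl(
  method: string,
  url: string,
): { resource: string; action: RbacActionSketch; resourceName?: string } | 'skip' {
  // Resource is the first path segment after /api/v1/, the optional
  // resource name is the segment after that (e.g. /api/v1/secrets/db-pass).
  const match = url.match(/^\/api\/v1\/([a-z-]+)(?:\/([^/?]+))?/);
  if (!match || !KNOWN_RESOURCES.has(match[1]!)) return 'skip';
  const result: { resource: string; action: RbacActionSketch; resourceName?: string } = {
    resource: match[1]!,
    action: methodToAction(method),
  };
  if (match[2] !== undefined) result.resourceName = match[2];
  return result;
}
```

So `POST /api/v1/servers` checks `create` on `servers`, while `GET /api/v1/secrets/db-pass` checks `view` on `secrets` scoped to `db-pass`.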
/**
* Migrate legacy 'admin' role bindings → granular roles.
* Old format: { role: 'admin', resource: '*' }
* New format: { role: 'edit', resource: '*' }, { role: 'run', resource: '*' },
* plus operation bindings for impersonate, logs, backup, restore, audit-purge
*/
async function migrateAdminRole(rbacRepo: InstanceType<typeof RbacDefinitionRepository>): Promise<void> {
const definitions = await rbacRepo.findAll();
for (const def of definitions) {
const bindings = def.roleBindings as Array<Record<string, unknown>>;
const hasAdminRole = bindings.some((b) => b['role'] === 'admin');
if (!hasAdminRole) continue;
// Replace admin bindings with granular equivalents
const newBindings: Array<Record<string, string>> = [];
for (const b of bindings) {
if (b['role'] === 'admin') {
const resource = b['resource'] as string;
newBindings.push({ role: 'edit', resource });
newBindings.push({ role: 'run', resource });
} else {
newBindings.push(b as Record<string, string>);
}
}
// Add operation bindings (idempotent — only for wildcard admin)
const hasWildcard = bindings.some((b) => b['role'] === 'admin' && b['resource'] === '*');
if (hasWildcard) {
const ops = ['impersonate', 'logs', 'backup', 'restore', 'audit-purge'];
for (const op of ops) {
if (!newBindings.some((b) => b['action'] === op)) {
newBindings.push({ role: 'run', action: op });
}
}
}
await rbacRepo.update(def.id, { roleBindings: newBindings as UpdateRbacDefinitionInput['roleBindings'] });
// eslint-disable-next-line no-console
console.log(`mcpd: migrated RBAC '${def.name}' from admin → granular roles`);
}
}
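Isolated from the repository plumbing, the binding rewrite itself behaves roughly like this (a sketch; the field names follow the code above, but `expandAdminBindings` is an illustrative name, not a real export):

```typescript
// Sketch of the admin → granular binding expansion, without the repository I/O.
type Binding = Record<string, string>;

function expandAdminBindings(bindings: Binding[]): Binding[] {
  const out: Binding[] = [];
  for (const b of bindings) {
    if (b['role'] === 'admin') {
      // One admin binding becomes an edit + run pair on the same resource.
      out.push({ role: 'edit', resource: b['resource'] ?? '*' });
      out.push({ role: 'run', resource: b['resource'] ?? '*' });
    } else {
      out.push(b);
    }
  }
  // Wildcard admins also gain the privileged operation grants.
  if (bindings.some((b) => b['role'] === 'admin' && b['resource'] === '*')) {
    for (const op of ['impersonate', 'logs', 'backup', 'restore', 'audit-purge']) {
      if (!out.some((b) => b['action'] === op)) out.push({ role: 'run', action: op });
    }
  }
  return out;
}
```

A single wildcard `{ role: 'admin', resource: '*' }` binding therefore expands to seven bindings: `edit`/`run` on `*` plus the five operation grants.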
async function main(): Promise<void> {
const config = loadConfigFromEnv();
@@ -82,6 +242,21 @@ async function main(): Promise<void> {
const projectRepo = new ProjectRepository(prisma);
const auditLogRepo = new AuditLogRepository(prisma);
const templateRepo = new TemplateRepository(prisma);
const rbacDefinitionRepo = new RbacDefinitionRepository(prisma);
const userRepo = new UserRepository(prisma);
const groupRepo = new GroupRepository(prisma);
// CUID detection for RBAC name resolution
const CUID_RE = /^c[^\s-]{8,}$/i;
const nameResolvers: Record<string, { findById(id: string): Promise<{ name: string } | null> }> = {
servers: serverRepo,
secrets: secretRepo,
projects: projectRepo,
groups: groupRepo,
};
// Migrate legacy 'admin' role → granular roles
await migrateAdminRole(rbacDefinitionRepo);
// Orchestrator
const orchestrator = new DockerContainerManager();
@@ -91,15 +266,27 @@ async function main(): Promise<void> {
const instanceService = new InstanceService(instanceRepo, serverRepo, orchestrator, secretRepo);
serverService.setInstanceService(instanceService);
const secretService = new SecretService(secretRepo);
const projectService = new ProjectService(projectRepo, serverRepo, secretRepo);
const auditLogService = new AuditLogService(auditLogRepo);
const metricsCollector = new MetricsCollector();
const healthAggregator = new HealthAggregator(metricsCollector, orchestrator);
const backupService = new BackupService(serverRepo, projectRepo, secretRepo, userRepo, groupRepo, rbacDefinitionRepo);
const restoreService = new RestoreService(serverRepo, projectRepo, secretRepo, userRepo, groupRepo, rbacDefinitionRepo);
const authService = new AuthService(prisma);
const templateService = new TemplateService(templateRepo);
const mcpProxyService = new McpProxyService(instanceRepo, serverRepo, orchestrator);
const rbacDefinitionService = new RbacDefinitionService(rbacDefinitionRepo);
const rbacService = new RbacService(rbacDefinitionRepo, prisma);
const userService = new UserService(userRepo);
const groupService = new GroupService(groupRepo, userRepo);
const promptRepo = new PromptRepository(prisma);
const promptRequestRepo = new PromptRequestRepository(prisma);
const promptService = new PromptService(promptRepo, promptRequestRepo, projectRepo);
// Auth middleware for global hooks
const authMiddleware = createAuthMiddleware({
findSession: (token) => authService.findSession(token),
});
// Server
const app = await createServer(config, {
@@ -115,6 +302,59 @@ async function main(): Promise<void> {
},
});
// ── Global auth hook ──
// Runs on all /api/v1/* routes EXCEPT auth endpoints and health checks.
// Tests that use createServer() directly are NOT affected — this hook
// is only registered here in main.ts.
app.addHook('preHandler', async (request, reply) => {
const url = request.url;
// Skip auth for health, auth, and root
if (url.startsWith('/api/v1/auth/') || url === '/healthz' || url === '/health') return;
if (!url.startsWith('/api/v1/')) return;
// Run auth middleware
await authMiddleware(request, reply);
});
// ── Global RBAC hook ──
// Runs after the auth hook. Maps URL to resource+action and checks permissions.
app.addHook('preHandler', async (request, reply) => {
if (reply.sent) return; // Auth hook already rejected
const url = request.url;
if (url.startsWith('/api/v1/auth/') || url === '/healthz' || url === '/health') return;
if (!url.startsWith('/api/v1/')) return;
if (request.userId === undefined) return; // Auth hook will handle 401
const check = mapUrlToPermission(request.method, url);
if (check.kind === 'skip') return;
// Extract service account identity from header (sent by mcplocal)
const saHeader = request.headers['x-service-account'];
const serviceAccountName = typeof saHeader === 'string' ? saHeader : undefined;
let allowed: boolean;
if (check.kind === 'operation') {
allowed = await rbacService.canRunOperation(request.userId, check.operation, serviceAccountName);
} else {
// Resolve CUID → human name for name-scoped RBAC bindings
if (check.resourceName !== undefined && CUID_RE.test(check.resourceName)) {
const resolver = nameResolvers[check.resource];
if (resolver) {
const entity = await resolver.findById(check.resourceName);
if (entity) check.resourceName = entity.name;
}
}
allowed = await rbacService.canAccess(request.userId, check.action, check.resource, check.resourceName, serviceAccountName);
// Compute scope for list filtering (used by preSerialization hook)
if (allowed && check.resourceName === undefined) {
request.rbacScope = await rbacService.getAllowedScope(request.userId, check.action, check.resource, serviceAccountName);
}
}
if (!allowed) {
reply.code(403).send({ error: 'Forbidden' });
}
});
// Routes
registerMcpServerRoutes(app, serverService, instanceService);
registerTemplateRoutes(app, templateService);
@@ -124,12 +364,27 @@ async function main(): Promise<void> {
registerAuditLogRoutes(app, auditLogService);
registerHealthMonitoringRoutes(app, { healthAggregator, metricsCollector });
registerBackupRoutes(app, { backupService, restoreService });
registerAuthRoutes(app, { authService, userService, groupService, rbacDefinitionService, rbacService });
registerMcpProxyRoutes(app, {
mcpProxyService,
auditLogService,
authDeps: { findSession: (token) => authService.findSession(token) },
});
registerRbacRoutes(app, rbacDefinitionService);
registerUserRoutes(app, userService);
registerGroupRoutes(app, groupService);
registerPromptRoutes(app, promptService, projectRepo);
// ── RBAC list filtering hook ──
// Filters array responses to only include resources the user is allowed to see.
app.addHook('preSerialization', async (request, _reply, payload) => {
if (!request.rbacScope || request.rbacScope.wildcard) return payload;
if (!Array.isArray(payload)) return payload;
return (payload as Array<Record<string, unknown>>).filter((item) => {
const name = item['name'];
return typeof name === 'string' && request.rbacScope!.names.has(name);
});
});
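The filtering rule inside that hook, taken on its own, reduces to the following sketch. `Scope` mirrors the `rbacScope` shape declared on the request; `filterListPayload` is an illustrative name, not part of the codebase.

```typescript
// Sketch of the list-filtering rule applied by the preSerialization hook.
interface Scope {
  wildcard: boolean;
  names: Set<string>;
}

function filterListPayload(payload: unknown, scope?: Scope): unknown {
  if (!scope || scope.wildcard) return payload; // unrestricted: pass through
  if (!Array.isArray(payload)) return payload;  // only array responses are filtered
  return (payload as Array<Record<string, unknown>>).filter((item) => {
    const name = item['name'];
    return typeof name === 'string' && scope.names.has(name);
  });
}
```

Single-object responses pass through untouched; those are guarded by the per-resource `canAccess` check in the RBAC hook instead.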
// Start
await app.listen({ port: config.port, host: config.host });

View File

@@ -7,6 +7,7 @@ export interface AuthDeps {
declare module 'fastify' {
interface FastifyRequest {
userId?: string;
rbacScope?: { wildcard: boolean; names: Set<string> };
}
}

View File

@@ -0,0 +1,36 @@
import type { FastifyRequest, FastifyReply } from 'fastify';
import type { RbacService, RbacAction } from '../services/rbac.service.js';
export function createRbacMiddleware(rbacService: RbacService) {
function requirePermission(resource: string, action: RbacAction, resourceName?: string) {
return async (request: FastifyRequest, reply: FastifyReply): Promise<void> => {
if (request.userId === undefined) {
reply.code(401).send({ error: 'Authentication required' });
return;
}
const allowed = await rbacService.canAccess(request.userId, action, resource, resourceName);
if (!allowed) {
reply.code(403).send({ error: 'Forbidden' });
return;
}
};
}
function requireOperation(operation: string) {
return async (request: FastifyRequest, reply: FastifyReply): Promise<void> => {
if (request.userId === undefined) {
reply.code(401).send({ error: 'Authentication required' });
return;
}
const allowed = await rbacService.canRunOperation(request.userId, operation);
if (!allowed) {
reply.code(403).send({ error: 'Forbidden' });
return;
}
};
}
return { requirePermission, requireOperation };
}

View File

@@ -0,0 +1,93 @@
import type { PrismaClient, Group } from '@prisma/client';
export interface GroupWithMembers extends Group {
members: Array<{ id: string; user: { id: string; email: string; name: string | null } }>;
}
export interface IGroupRepository {
findAll(): Promise<GroupWithMembers[]>;
findById(id: string): Promise<GroupWithMembers | null>;
findByName(name: string): Promise<GroupWithMembers | null>;
create(data: { name: string; description?: string }): Promise<Group>;
update(id: string, data: { description?: string }): Promise<Group>;
delete(id: string): Promise<void>;
setMembers(groupId: string, userIds: string[]): Promise<void>;
findGroupsForUser(userId: string): Promise<Array<{ id: string; name: string }>>;
}
const MEMBERS_INCLUDE = {
members: {
select: {
id: true,
user: {
select: { id: true, email: true, name: true },
},
},
},
} as const;
export class GroupRepository implements IGroupRepository {
constructor(private readonly prisma: PrismaClient) {}
async findAll(): Promise<GroupWithMembers[]> {
return this.prisma.group.findMany({
orderBy: { name: 'asc' },
include: MEMBERS_INCLUDE,
});
}
async findById(id: string): Promise<GroupWithMembers | null> {
return this.prisma.group.findUnique({
where: { id },
include: MEMBERS_INCLUDE,
});
}
async findByName(name: string): Promise<GroupWithMembers | null> {
return this.prisma.group.findUnique({
where: { name },
include: MEMBERS_INCLUDE,
});
}
async create(data: { name: string; description?: string }): Promise<Group> {
const createData: Record<string, unknown> = { name: data.name };
if (data.description !== undefined) createData['description'] = data.description;
return this.prisma.group.create({
data: createData as Parameters<PrismaClient['group']['create']>[0]['data'],
});
}
async update(id: string, data: { description?: string }): Promise<Group> {
const updateData: Record<string, unknown> = {};
if (data.description !== undefined) updateData['description'] = data.description;
return this.prisma.group.update({ where: { id }, data: updateData });
}
async delete(id: string): Promise<void> {
await this.prisma.group.delete({ where: { id } });
}
async setMembers(groupId: string, userIds: string[]): Promise<void> {
await this.prisma.$transaction(async (tx) => {
await tx.groupMember.deleteMany({ where: { groupId } });
if (userIds.length > 0) {
await tx.groupMember.createMany({
data: userIds.map((userId) => ({ groupId, userId })),
});
}
});
}
async findGroupsForUser(userId: string): Promise<Array<{ id: string; name: string }>> {
const memberships = await this.prisma.groupMember.findMany({
where: { userId },
select: {
group: {
select: { id: true, name: true },
},
},
});
return memberships.map((m) => m.group);
}
}

View File

@@ -1,9 +1,15 @@
export type { IMcpServerRepository, IMcpInstanceRepository, ISecretRepository, IAuditLogRepository, AuditLogFilter } from './interfaces.js';
export { McpServerRepository } from './mcp-server.repository.js';
export { SecretRepository } from './secret.repository.js';
export type { IProjectRepository, ProjectWithRelations } from './project.repository.js';
export { ProjectRepository } from './project.repository.js';
export { McpInstanceRepository } from './mcp-instance.repository.js';
export { AuditLogRepository } from './audit-log.repository.js';
export type { ITemplateRepository } from './template.repository.js';
export { TemplateRepository } from './template.repository.js';
export type { IRbacDefinitionRepository } from './rbac-definition.repository.js';
export { RbacDefinitionRepository } from './rbac-definition.repository.js';
export type { IUserRepository, SafeUser } from './user.repository.js';
export { UserRepository } from './user.repository.js';
export type { IGroupRepository, GroupWithMembers } from './group.repository.js';
export { GroupRepository } from './group.repository.js';

View File

@@ -1,49 +1,92 @@
import type { PrismaClient, Project } from '@prisma/client';
-import type { CreateProjectInput, UpdateProjectInput } from '../validation/project.schema.js';
+export interface ProjectWithRelations extends Project {
+  servers: Array<{ id: string; projectId: string; serverId: string; server: Record<string, unknown> & { id: string; name: string } }>;
+}
+const PROJECT_INCLUDE = {
+  servers: { include: { server: true } },
+} as const;
export interface IProjectRepository {
-  findAll(ownerId?: string): Promise<Project[]>;
-  findById(id: string): Promise<Project | null>;
-  findByName(name: string): Promise<Project | null>;
-  create(data: CreateProjectInput & { ownerId: string }): Promise<Project>;
-  update(id: string, data: UpdateProjectInput): Promise<Project>;
+  findAll(ownerId?: string): Promise<ProjectWithRelations[]>;
+  findById(id: string): Promise<ProjectWithRelations | null>;
+  findByName(name: string): Promise<ProjectWithRelations | null>;
+  create(data: { name: string; description: string; prompt?: string; ownerId: string; proxyMode: string; llmProvider?: string; llmModel?: string }): Promise<ProjectWithRelations>;
+  update(id: string, data: Record<string, unknown>): Promise<ProjectWithRelations>;
  delete(id: string): Promise<void>;
+  setServers(projectId: string, serverIds: string[]): Promise<void>;
+  addServer(projectId: string, serverId: string): Promise<void>;
+  removeServer(projectId: string, serverId: string): Promise<void>;
}
export class ProjectRepository implements IProjectRepository {
  constructor(private readonly prisma: PrismaClient) {}
-  async findAll(ownerId?: string): Promise<Project[]> {
+  async findAll(ownerId?: string): Promise<ProjectWithRelations[]> {
    const where = ownerId !== undefined ? { ownerId } : {};
-    return this.prisma.project.findMany({ where, orderBy: { name: 'asc' } });
+    return this.prisma.project.findMany({ where, orderBy: { name: 'asc' }, include: PROJECT_INCLUDE }) as unknown as Promise<ProjectWithRelations[]>;
  }
-  async findById(id: string): Promise<Project | null> {
-    return this.prisma.project.findUnique({ where: { id } });
+  async findById(id: string): Promise<ProjectWithRelations | null> {
+    return this.prisma.project.findUnique({ where: { id }, include: PROJECT_INCLUDE }) as unknown as Promise<ProjectWithRelations | null>;
  }
-  async findByName(name: string): Promise<Project | null> {
-    return this.prisma.project.findUnique({ where: { name } });
+  async findByName(name: string): Promise<ProjectWithRelations | null> {
+    return this.prisma.project.findUnique({ where: { name }, include: PROJECT_INCLUDE }) as unknown as Promise<ProjectWithRelations | null>;
  }
-  async create(data: CreateProjectInput & { ownerId: string }): Promise<Project> {
-    return this.prisma.project.create({
-      data: {
-        name: data.name,
-        description: data.description,
-        ownerId: data.ownerId,
-      },
-    });
+  async create(data: { name: string; description: string; prompt?: string; ownerId: string; proxyMode: string; llmProvider?: string; llmModel?: string }): Promise<ProjectWithRelations> {
+    const createData: Record<string, unknown> = {
+      name: data.name,
+      description: data.description,
+      ownerId: data.ownerId,
+      proxyMode: data.proxyMode,
+    };
+    if (data.prompt !== undefined) createData['prompt'] = data.prompt;
+    if (data.llmProvider !== undefined) createData['llmProvider'] = data.llmProvider;
+    if (data.llmModel !== undefined) createData['llmModel'] = data.llmModel;
+    return this.prisma.project.create({
+      data: createData as Parameters<PrismaClient['project']['create']>[0]['data'],
+      include: PROJECT_INCLUDE,
+    }) as unknown as Promise<ProjectWithRelations>;
  }
-  async update(id: string, data: UpdateProjectInput): Promise<Project> {
-    const updateData: Record<string, unknown> = {};
-    if (data.description !== undefined) updateData['description'] = data.description;
-    return this.prisma.project.update({ where: { id }, data: updateData });
+  async update(id: string, data: Record<string, unknown>): Promise<ProjectWithRelations> {
+    return this.prisma.project.update({
+      where: { id },
+      data,
+      include: PROJECT_INCLUDE,
+    }) as unknown as Promise<ProjectWithRelations>;
  }
  async delete(id: string): Promise<void> {
    await this.prisma.project.delete({ where: { id } });
  }
+  async setServers(projectId: string, serverIds: string[]): Promise<void> {
+    await this.prisma.$transaction(async (tx) => {
+      await tx.projectServer.deleteMany({ where: { projectId } });
+      if (serverIds.length > 0) {
+        await tx.projectServer.createMany({
+          data: serverIds.map((serverId) => ({ projectId, serverId })),
+        });
+      }
+    });
+  }
+  async addServer(projectId: string, serverId: string): Promise<void> {
+    await this.prisma.projectServer.upsert({
+      where: { projectId_serverId: { projectId, serverId } },
+      create: { projectId, serverId },
+      update: {},
+    });
+  }
+  async removeServer(projectId: string, serverId: string): Promise<void> {
+    await this.prisma.projectServer.deleteMany({
+      where: { projectId, serverId },
+    });
+  }
}
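The repository's `create` builds its data object conditionally so that optional fields which are `undefined` never reach Prisma (important under `exactOptionalPropertyTypes`, where an explicit `undefined` key is rejected). A standalone sketch of the pattern (`buildCreateData` is an illustrative name, not part of the repository):

```typescript
// Input shape mirroring ProjectRepository.create's parameter.
interface CreateArgs {
  name: string;
  description: string;
  ownerId: string;
  proxyMode: string;
  prompt?: string;
  llmProvider?: string;
  llmModel?: string;
}

// Copy required fields, then add optional ones only when defined,
// so the resulting object never carries `undefined`-valued keys.
function buildCreateData(data: CreateArgs): Record<string, unknown> {
  const createData: Record<string, unknown> = {
    name: data.name,
    description: data.description,
    ownerId: data.ownerId,
    proxyMode: data.proxyMode,
  };
  if (data.prompt !== undefined) createData['prompt'] = data.prompt;
  if (data.llmProvider !== undefined) createData['llmProvider'] = data.llmProvider;
  if (data.llmModel !== undefined) createData['llmModel'] = data.llmModel;
  return createData;
}

const out = buildCreateData({ name: 'demo', description: 'd', ownerId: 'u1', proxyMode: 'llm' });
console.log('prompt' in out); // false: undefined optionals are omitted
```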

View File

@@ -0,0 +1,53 @@
import type { PrismaClient, PromptRequest } from '@prisma/client';
export interface IPromptRequestRepository {
  findAll(projectId?: string): Promise<PromptRequest[]>;
  findById(id: string): Promise<PromptRequest | null>;
  findByNameAndProject(name: string, projectId: string | null): Promise<PromptRequest | null>;
  findBySession(sessionId: string, projectId?: string): Promise<PromptRequest[]>;
  create(data: { name: string; content: string; projectId?: string; createdBySession?: string; createdByUserId?: string }): Promise<PromptRequest>;
  delete(id: string): Promise<void>;
}
export class PromptRequestRepository implements IPromptRequestRepository {
  constructor(private readonly prisma: PrismaClient) {}
  async findAll(projectId?: string): Promise<PromptRequest[]> {
    if (projectId !== undefined) {
      return this.prisma.promptRequest.findMany({
        where: { OR: [{ projectId }, { projectId: null }] },
        orderBy: { createdAt: 'desc' },
      });
    }
    return this.prisma.promptRequest.findMany({ orderBy: { createdAt: 'desc' } });
  }
  async findById(id: string): Promise<PromptRequest | null> {
    return this.prisma.promptRequest.findUnique({ where: { id } });
  }
  async findByNameAndProject(name: string, projectId: string | null): Promise<PromptRequest | null> {
    return this.prisma.promptRequest.findUnique({
      where: { name_projectId: { name, projectId: projectId ?? '' } },
    });
  }
  async findBySession(sessionId: string, projectId?: string): Promise<PromptRequest[]> {
    const where: Record<string, unknown> = { createdBySession: sessionId };
    if (projectId !== undefined) {
      where['OR'] = [{ projectId }, { projectId: null }];
    }
    return this.prisma.promptRequest.findMany({
      where,
      orderBy: { createdAt: 'desc' },
    });
  }
  async create(data: { name: string; content: string; projectId?: string; createdBySession?: string; createdByUserId?: string }): Promise<PromptRequest> {
    return this.prisma.promptRequest.create({ data });
  }
  async delete(id: string): Promise<void> {
    await this.prisma.promptRequest.delete({ where: { id } });
  }
}

View File

@@ -0,0 +1,47 @@
import type { PrismaClient, Prompt } from '@prisma/client';
export interface IPromptRepository {
  findAll(projectId?: string): Promise<Prompt[]>;
  findById(id: string): Promise<Prompt | null>;
  findByNameAndProject(name: string, projectId: string | null): Promise<Prompt | null>;
  create(data: { name: string; content: string; projectId?: string }): Promise<Prompt>;
  update(id: string, data: { content?: string }): Promise<Prompt>;
  delete(id: string): Promise<void>;
}
export class PromptRepository implements IPromptRepository {
  constructor(private readonly prisma: PrismaClient) {}
  async findAll(projectId?: string): Promise<Prompt[]> {
    // Project-scoped + global prompts
    if (projectId !== undefined) {
      return this.prisma.prompt.findMany({
        where: { OR: [{ projectId }, { projectId: null }] },
        orderBy: { name: 'asc' },
      });
    }
    return this.prisma.prompt.findMany({ orderBy: { name: 'asc' } });
  }
  async findById(id: string): Promise<Prompt | null> {
    return this.prisma.prompt.findUnique({ where: { id } });
  }
  async findByNameAndProject(name: string, projectId: string | null): Promise<Prompt | null> {
    return this.prisma.prompt.findUnique({
      where: { name_projectId: { name, projectId: projectId ?? '' } },
    });
  }
  async create(data: { name: string; content: string; projectId?: string }): Promise<Prompt> {
    return this.prisma.prompt.create({ data });
  }
  async update(id: string, data: { content?: string }): Promise<Prompt> {
    return this.prisma.prompt.update({ where: { id }, data });
  }
  async delete(id: string): Promise<void> {
    await this.prisma.prompt.delete({ where: { id } });
  }
}
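Both prompt repositories use the same Prisma `OR: [{ projectId }, { projectId: null }]` filter: a project sees its own prompts plus global ones. A sketch of that visibility rule over plain objects (`visiblePrompts` is an illustrative helper, not a repository method):

```typescript
interface PromptRow {
  name: string;
  projectId: string | null;
}

// A prompt is visible to a project when it is scoped to that project
// or when it is global (projectId === null) — the same condition the
// Prisma OR filter expresses.
function visiblePrompts(rows: PromptRow[], projectId: string): PromptRow[] {
  return rows.filter((r) => r.projectId === projectId || r.projectId === null);
}

const rows: PromptRow[] = [
  { name: 'local', projectId: 'p1' },
  { name: 'other', projectId: 'p2' },
  { name: 'global', projectId: null },
];
console.log(visiblePrompts(rows, 'p1').map((r) => r.name)); // [ 'local', 'global' ]
```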

View File

@@ -0,0 +1,48 @@
import type { PrismaClient, RbacDefinition } from '@prisma/client';
import type { CreateRbacDefinitionInput, UpdateRbacDefinitionInput } from '../validation/rbac-definition.schema.js';
export interface IRbacDefinitionRepository {
  findAll(): Promise<RbacDefinition[]>;
  findById(id: string): Promise<RbacDefinition | null>;
  findByName(name: string): Promise<RbacDefinition | null>;
  create(data: CreateRbacDefinitionInput): Promise<RbacDefinition>;
  update(id: string, data: UpdateRbacDefinitionInput): Promise<RbacDefinition>;
  delete(id: string): Promise<void>;
}
export class RbacDefinitionRepository implements IRbacDefinitionRepository {
  constructor(private readonly prisma: PrismaClient) {}
  async findAll(): Promise<RbacDefinition[]> {
    return this.prisma.rbacDefinition.findMany({ orderBy: { name: 'asc' } });
  }
  async findById(id: string): Promise<RbacDefinition | null> {
    return this.prisma.rbacDefinition.findUnique({ where: { id } });
  }
  async findByName(name: string): Promise<RbacDefinition | null> {
    return this.prisma.rbacDefinition.findUnique({ where: { name } });
  }
  async create(data: CreateRbacDefinitionInput): Promise<RbacDefinition> {
    return this.prisma.rbacDefinition.create({
      data: {
        name: data.name,
        subjects: data.subjects,
        roleBindings: data.roleBindings,
      },
    });
  }
  async update(id: string, data: UpdateRbacDefinitionInput): Promise<RbacDefinition> {
    const updateData: Record<string, unknown> = {};
    if (data.subjects !== undefined) updateData['subjects'] = data.subjects;
    if (data.roleBindings !== undefined) updateData['roleBindings'] = data.roleBindings;
    return this.prisma.rbacDefinition.update({ where: { id }, data: updateData });
  }
  async delete(id: string): Promise<void> {
    await this.prisma.rbacDefinition.delete({ where: { id } });
  }
}

View File

@@ -0,0 +1,76 @@
import type { PrismaClient, User } from '@prisma/client';
/** User without the passwordHash field — safe for API responses. */
export type SafeUser = Omit<User, 'passwordHash'>;
export interface IUserRepository {
  findAll(): Promise<SafeUser[]>;
  findById(id: string): Promise<SafeUser | null>;
  findByEmail(email: string, includeHash?: boolean): Promise<SafeUser | null> | Promise<User | null>;
  create(data: { email: string; passwordHash: string; name?: string; role?: string }): Promise<SafeUser>;
  delete(id: string): Promise<void>;
  count(): Promise<number>;
}
/** Fields to select when passwordHash must be excluded. */
const safeSelect = {
  id: true,
  email: true,
  name: true,
  role: true,
  provider: true,
  externalId: true,
  version: true,
  createdAt: true,
  updatedAt: true,
} as const;
export class UserRepository implements IUserRepository {
  constructor(private readonly prisma: PrismaClient) {}
  async findAll(): Promise<SafeUser[]> {
    return this.prisma.user.findMany({
      select: safeSelect,
      orderBy: { email: 'asc' },
    });
  }
  async findById(id: string): Promise<SafeUser | null> {
    return this.prisma.user.findUnique({
      where: { id },
      select: safeSelect,
    });
  }
  async findByEmail(email: string, includeHash?: boolean): Promise<User | SafeUser | null> {
    if (includeHash === true) {
      return this.prisma.user.findUnique({ where: { email } });
    }
    return this.prisma.user.findUnique({
      where: { email },
      select: safeSelect,
    });
  }
  async create(data: { email: string; passwordHash: string; name?: string; role?: string }): Promise<SafeUser> {
    const createData: Record<string, unknown> = {
      email: data.email,
      passwordHash: data.passwordHash,
    };
    if (data.name !== undefined) createData['name'] = data.name;
    if (data.role !== undefined) createData['role'] = data.role;
    return this.prisma.user.create({
      data: createData as Parameters<PrismaClient['user']['create']>[0]['data'],
      select: safeSelect,
    });
  }
  async delete(id: string): Promise<void> {
    await this.prisma.user.delete({ where: { id } });
  }
  async count(): Promise<number> {
    return this.prisma.user.count();
  }
}
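The `SafeUser = Omit<User, 'passwordHash'>` pattern guarantees at the type level that the hash never leaks into API responses; the repository enforces it at runtime via `safeSelect`. The same guarantee can also be enforced in code with rest destructuring, as this small sketch shows (the simplified `User` shape here is illustrative):

```typescript
// Simplified user shape for illustration; the real model has more fields.
interface User {
  id: string;
  email: string;
  passwordHash: string;
  role: string;
}
type SafeUser = Omit<User, 'passwordHash'>;

// Strip the hash via rest destructuring: the returned object simply
// does not have a passwordHash property at all.
function toSafeUser(user: User): SafeUser {
  const { passwordHash: _passwordHash, ...safe } = user;
  return safe;
}

const safe = toSafeUser({ id: '1', email: 'a@b.c', passwordHash: 'x', role: 'admin' });
console.log('passwordHash' in safe); // false
```

The repository's `select`-based approach is preferable with Prisma because the hash is never even read from the database unless `includeHash` is passed.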

View File

@@ -1,15 +1,76 @@
import type { FastifyInstance } from 'fastify';
import type { AuthService } from '../services/auth.service.js';
+import type { UserService } from '../services/user.service.js';
+import type { GroupService } from '../services/group.service.js';
+import type { RbacDefinitionService } from '../services/rbac-definition.service.js';
+import type { RbacService } from '../services/rbac.service.js';
import { createAuthMiddleware } from '../middleware/auth.js';
+import { createRbacMiddleware } from '../middleware/rbac.js';
export interface AuthRouteDeps {
  authService: AuthService;
+  userService: UserService;
+  groupService: GroupService;
+  rbacDefinitionService: RbacDefinitionService;
+  rbacService: RbacService;
}
export function registerAuthRoutes(app: FastifyInstance, deps: AuthRouteDeps): void {
  const authMiddleware = createAuthMiddleware({
    findSession: (token) => deps.authService.findSession(token),
  });
+  const { requireOperation } = createRbacMiddleware(deps.rbacService);
+  // GET /api/v1/auth/status — unauthenticated, returns whether any users exist
+  app.get('/api/v1/auth/status', async () => {
+    const count = await deps.userService.count();
+    return { hasUsers: count > 0 };
+  });
+  // POST /api/v1/auth/bootstrap — only works when no users exist (first-run setup)
+  app.post('/api/v1/auth/bootstrap', async (request, reply) => {
+    const count = await deps.userService.count();
+    if (count > 0) {
+      reply.code(409).send({ error: 'Users already exist. Use login instead.' });
+      return;
+    }
+    const { email, password, name } = request.body as { email: string; password: string; name?: string };
+    // Create the first admin user
+    await deps.userService.create({
+      email,
+      password,
+      ...(name !== undefined ? { name } : {}),
+    });
+    // Create "admin" group and add the first user to it
+    await deps.groupService.create({
+      name: 'admin',
+      description: 'Bootstrap admin group',
+      members: [email],
+    });
+    // Create bootstrap RBAC: full resource access + all operations
+    await deps.rbacDefinitionService.create({
+      name: 'bootstrap-admin',
+      subjects: [{ kind: 'Group', name: 'admin' }],
+      roleBindings: [
+        { role: 'edit', resource: '*' },
+        { role: 'run', resource: '*' },
+        { role: 'run', action: 'impersonate' },
+        { role: 'run', action: 'logs' },
+        { role: 'run', action: 'backup' },
+        { role: 'run', action: 'restore' },
+        { role: 'run', action: 'audit-purge' },
+      ],
+    });
+    // Auto-login so the caller gets a token immediately
+    const session = await deps.authService.login(email, password);
+    reply.code(201);
+    return session;
+  });
  // POST /api/v1/auth/login — no auth required
  app.post<{
@@ -28,4 +89,15 @@ export function registerAuthRoutes(app: FastifyInstance, deps: AuthRouteDeps): void
    await deps.authService.logout(token);
    return { success: true };
  });
+  // POST /api/v1/auth/impersonate — requires auth + run:impersonate operation
+  app.post(
+    '/api/v1/auth/impersonate',
+    { preHandler: [authMiddleware, requireOperation('impersonate')] },
+    async (request) => {
+      const { email } = request.body as { email: string };
+      const result = await deps.authService.impersonate(email);
+      return result;
+    },
+  );
}

View File

@@ -13,7 +13,7 @@ export function registerBackupRoutes(app: FastifyInstance, deps: BackupDeps): void
  app.post<{
    Body: {
      password?: string;
-      resources?: Array<'servers' | 'secrets' | 'projects'>;
+      resources?: Array<'servers' | 'secrets' | 'projects' | 'users' | 'groups' | 'rbac'>;
    };
  }>('/api/v1/backup', async (request) => {
    const opts: BackupOptions = {};
@@ -51,7 +51,7 @@ export function registerBackupRoutes(app: FastifyInstance, deps: BackupDeps): void
    const result = await deps.restoreService.restore(bundle, restoreOpts);
-    if (result.errors.length > 0 && result.serversCreated === 0 && result.secretsCreated === 0 && result.projectsCreated === 0) {
+    if (result.errors.length > 0 && result.serversCreated === 0 && result.secretsCreated === 0 && result.projectsCreated === 0 && result.usersCreated === 0 && result.groupsCreated === 0 && result.rbacCreated === 0) {
      reply.code(422);
    }

View File

@@ -0,0 +1,35 @@
import type { FastifyInstance } from 'fastify';
import type { GroupService } from '../services/group.service.js';
export function registerGroupRoutes(
  app: FastifyInstance,
  service: GroupService,
): void {
  app.get('/api/v1/groups', async () => {
    return service.list();
  });
  app.get<{ Params: { id: string } }>('/api/v1/groups/:id', async (request) => {
    // Try by ID first, fall back to name lookup
    try {
      return await service.getById(request.params.id);
    } catch {
      return service.getByName(request.params.id);
    }
  });
  app.post('/api/v1/groups', async (request, reply) => {
    const group = await service.create(request.body);
    reply.code(201);
    return group;
  });
  app.put<{ Params: { id: string } }>('/api/v1/groups/:id', async (request) => {
    return service.update(request.params.id, request.body);
  });
  app.delete<{ Params: { id: string } }>('/api/v1/groups/:id', async (request, reply) => {
    await service.delete(request.params.id);
    reply.code(204);
  });
}

View File

@@ -14,3 +14,6 @@ export type { AuthRouteDeps } from './auth.js';
export { registerMcpProxyRoutes } from './mcp-proxy.js';
export type { McpProxyRouteDeps } from './mcp-proxy.js';
export { registerTemplateRoutes } from './templates.js';
+export { registerRbacRoutes } from './rbac-definitions.js';
+export { registerUserRoutes } from './users.js';
+export { registerGroupRoutes } from './groups.js';

View File

@@ -2,13 +2,13 @@ import type { FastifyInstance } from 'fastify';
import type { ProjectService } from '../services/project.service.js';
export function registerProjectRoutes(app: FastifyInstance, service: ProjectService): void {
-  app.get('/api/v1/projects', async (request) => {
-    // If authenticated, filter by owner; otherwise list all
-    return service.list(request.userId);
+  app.get('/api/v1/projects', async () => {
+    // RBAC preSerialization hook handles access filtering
+    return service.list();
  });
  app.get<{ Params: { id: string } }>('/api/v1/projects/:id', async (request) => {
-    return service.getById(request.params.id);
+    return service.resolveAndGet(request.params.id);
  });
  app.post('/api/v1/projects', async (request, reply) => {
@@ -19,11 +19,51 @@
  });
  app.put<{ Params: { id: string } }>('/api/v1/projects/:id', async (request) => {
-    return service.update(request.params.id, request.body);
+    const project = await service.resolveAndGet(request.params.id);
+    return service.update(project.id, request.body);
  });
  app.delete<{ Params: { id: string } }>('/api/v1/projects/:id', async (request, reply) => {
-    await service.delete(request.params.id);
+    const project = await service.resolveAndGet(request.params.id);
+    await service.delete(project.id);
    reply.code(204);
  });
+  // Generate .mcp.json for a project
+  app.get<{ Params: { id: string } }>('/api/v1/projects/:id/mcp-config', async (request) => {
+    return service.generateMcpConfig(request.params.id);
+  });
+  // Attach a server to a project
+  app.post<{ Params: { id: string }; Body: { server: string } }>('/api/v1/projects/:id/servers', async (request) => {
+    const body = request.body as { server?: string };
+    if (!body.server) {
+      throw Object.assign(new Error('Missing "server" in request body'), { statusCode: 400 });
+    }
+    return service.addServer(request.params.id, body.server);
+  });
+  // Detach a server from a project
+  app.delete<{ Params: { id: string; serverName: string } }>('/api/v1/projects/:id/servers/:serverName', async (request, reply) => {
+    await service.removeServer(request.params.id, request.params.serverName);
+    reply.code(204);
+  });
+  // List servers in a project (for mcplocal discovery)
+  app.get<{ Params: { id: string } }>('/api/v1/projects/:id/servers', async (request) => {
+    const project = await service.resolveAndGet(request.params.id);
+    return project.servers.map((ps) => ps.server);
+  });
+  // Get project instructions for LLM (prompt + server list)
+  app.get<{ Params: { id: string } }>('/api/v1/projects/:id/instructions', async (request) => {
+    const project = await service.resolveAndGet(request.params.id);
+    return {
+      prompt: project.prompt,
+      servers: project.servers.map((ps) => ({
+        name: (ps.server as Record<string, unknown>).name as string,
+        description: (ps.server as Record<string, unknown>).description as string,
+      })),
+    };
+  });
}

View File

@@ -0,0 +1,86 @@
import type { FastifyInstance } from 'fastify';
import type { PromptService } from '../services/prompt.service.js';
import type { IProjectRepository } from '../repositories/project.repository.js';
export function registerPromptRoutes(
  app: FastifyInstance,
  service: PromptService,
  projectRepo: IProjectRepository,
): void {
  // ── Prompts (approved) ──
  app.get('/api/v1/prompts', async () => {
    return service.listPrompts();
  });
  app.get<{ Params: { id: string } }>('/api/v1/prompts/:id', async (request) => {
    return service.getPrompt(request.params.id);
  });
  app.post('/api/v1/prompts', async (request, reply) => {
    const prompt = await service.createPrompt(request.body);
    reply.code(201);
    return prompt;
  });
  app.put<{ Params: { id: string } }>('/api/v1/prompts/:id', async (request) => {
    return service.updatePrompt(request.params.id, request.body);
  });
  app.delete<{ Params: { id: string } }>('/api/v1/prompts/:id', async (request, reply) => {
    await service.deletePrompt(request.params.id);
    reply.code(204);
  });
  // ── Prompt Requests (pending proposals) ──
  app.get('/api/v1/promptrequests', async () => {
    return service.listPromptRequests();
  });
  app.get<{ Params: { id: string } }>('/api/v1/promptrequests/:id', async (request) => {
    return service.getPromptRequest(request.params.id);
  });
  app.delete<{ Params: { id: string } }>('/api/v1/promptrequests/:id', async (request, reply) => {
    await service.deletePromptRequest(request.params.id);
    reply.code(204);
  });
  // Approve: atomic delete request → create prompt
  app.post<{ Params: { id: string } }>('/api/v1/promptrequests/:id/approve', async (request) => {
    return service.approve(request.params.id);
  });
  // ── Project-scoped endpoints (for mcplocal) ──
  // Visible prompts: approved + session's pending requests
  app.get<{ Params: { name: string }; Querystring: { session?: string } }>(
    '/api/v1/projects/:name/prompts/visible',
    async (request) => {
      const project = await projectRepo.findByName(request.params.name);
      if (!project) {
        throw Object.assign(new Error(`Project not found: ${request.params.name}`), { statusCode: 404 });
      }
      return service.getVisiblePrompts(project.id, request.query.session);
    },
  );
  // LLM propose: create a PromptRequest for a project
  app.post<{ Params: { name: string } }>(
    '/api/v1/projects/:name/promptrequests',
    async (request, reply) => {
      const project = await projectRepo.findByName(request.params.name);
      if (!project) {
        throw Object.assign(new Error(`Project not found: ${request.params.name}`), { statusCode: 404 });
      }
      const body = request.body as Record<string, unknown>;
      const req = await service.propose({
        ...body,
        projectId: project.id,
      });
      reply.code(201);
      return req;
    },
  );
}

View File

@@ -0,0 +1,30 @@
import type { FastifyInstance } from 'fastify';
import type { RbacDefinitionService } from '../services/rbac-definition.service.js';
export function registerRbacRoutes(
  app: FastifyInstance,
  service: RbacDefinitionService,
): void {
  app.get('/api/v1/rbac', async () => {
    return service.list();
  });
  app.get<{ Params: { id: string } }>('/api/v1/rbac/:id', async (request) => {
    return service.getById(request.params.id);
  });
  app.post('/api/v1/rbac', async (request, reply) => {
    const def = await service.create(request.body);
    reply.code(201);
    return def;
  });
  app.put<{ Params: { id: string } }>('/api/v1/rbac/:id', async (request) => {
    return service.update(request.params.id, request.body);
  });
  app.delete<{ Params: { id: string } }>('/api/v1/rbac/:id', async (request, reply) => {
    await service.delete(request.params.id);
    reply.code(204);
  });
}

View File

@@ -0,0 +1,31 @@
import type { FastifyInstance } from 'fastify';
import type { UserService } from '../services/user.service.js';
export function registerUserRoutes(
  app: FastifyInstance,
  service: UserService,
): void {
  app.get('/api/v1/users', async () => {
    return service.list();
  });
  app.get<{ Params: { id: string } }>('/api/v1/users/:id', async (request) => {
    // Support lookup by email (contains @) or by id
    const idOrEmail = request.params.id;
    if (idOrEmail.includes('@')) {
      return service.getByEmail(idOrEmail);
    }
    return service.getById(idOrEmail);
  });
  app.post('/api/v1/users', async (request, reply) => {
    const user = await service.create(request.body);
    reply.code(201);
    return user;
  });
  app.delete<{ Params: { id: string } }>('/api/v1/users/:id', async (request, reply) => {
    await service.delete(request.params.id);
    reply.code(204);
  });
}

View File

@@ -63,4 +63,32 @@ export class AuthService {
    }
    return { userId: session.userId, expiresAt: session.expiresAt };
  }
+  /**
+   * Create a session for a user by email without requiring their password.
+   * Used for admin impersonation.
+   */
+  async impersonate(email: string): Promise<LoginResult> {
+    const user = await this.prisma.user.findUnique({ where: { email } });
+    if (user === null) {
+      throw new AuthenticationError('User not found');
+    }
+    const token = randomUUID();
+    const expiresAt = new Date(Date.now() + SESSION_TTL_MS);
+    await this.prisma.session.create({
+      data: {
+        token,
+        userId: user.id,
+        expiresAt,
+      },
+    });
+    return {
+      token,
+      expiresAt,
+      user: { id: user.id, email: user.email, role: user.role },
+    };
+  }
}
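The impersonated session gets the same random token and TTL-based expiry as a normal login. `SESSION_TTL_MS` is defined elsewhere in auth.service.ts and not shown in this diff, so the 24-hour value below is an illustrative assumption only:

```typescript
// Assumed TTL — the real SESSION_TTL_MS constant is not in this diff.
const SESSION_TTL_MS = 24 * 60 * 60 * 1000;

// Same shape of expiry computation as impersonate (and login):
// current time plus a fixed TTL.
function sessionExpiresAt(nowMs: number): Date {
  return new Date(nowMs + SESSION_TTL_MS);
}

console.log(sessionExpiresAt(0).getTime()); // 86400000
```

Because impersonation bypasses the password check entirely, the route guards it with `requireOperation('impersonate')`, which the bootstrap RBAC grants only to the admin group.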

View File

@@ -1,5 +1,8 @@
import type { IMcpServerRepository, ISecretRepository } from '../../repositories/interfaces.js'; import type { IMcpServerRepository, ISecretRepository } from '../../repositories/interfaces.js';
import type { IProjectRepository } from '../../repositories/project.repository.js'; import type { IProjectRepository } from '../../repositories/project.repository.js';
import type { IUserRepository } from '../../repositories/user.repository.js';
import type { IGroupRepository } from '../../repositories/group.repository.js';
import type { IRbacDefinitionRepository } from '../../repositories/rbac-definition.repository.js';
import { encrypt, isSensitiveKey } from './crypto.js'; import { encrypt, isSensitiveKey } from './crypto.js';
import type { EncryptedPayload } from './crypto.js'; import type { EncryptedPayload } from './crypto.js';
import { APP_VERSION } from '@mcpctl/shared'; import { APP_VERSION } from '@mcpctl/shared';
@@ -12,6 +15,9 @@ export interface BackupBundle {
servers: BackupServer[]; servers: BackupServer[];
secrets: BackupSecret[]; secrets: BackupSecret[];
projects: BackupProject[]; projects: BackupProject[];
users?: BackupUser[];
groups?: BackupGroup[];
rbacBindings?: BackupRbacBinding[];
encryptedSecrets?: EncryptedPayload; encryptedSecrets?: EncryptedPayload;
} }
@@ -33,11 +39,34 @@ export interface BackupSecret {
export interface BackupProject { export interface BackupProject {
name: string; name: string;
description: string; description: string;
proxyMode?: string;
llmProvider?: string | null;
llmModel?: string | null;
serverNames?: string[];
}
export interface BackupUser {
email: string;
name: string | null;
role: string;
provider: string | null;
}
export interface BackupGroup {
name: string;
description: string;
memberEmails: string[];
}
export interface BackupRbacBinding {
name: string;
subjects: unknown;
roleBindings: unknown;
} }
export interface BackupOptions { export interface BackupOptions {
password?: string; password?: string;
resources?: Array<'servers' | 'secrets' | 'projects'>; resources?: Array<'servers' | 'secrets' | 'projects' | 'users' | 'groups' | 'rbac'>;
} }
export class BackupService { export class BackupService {
@@ -45,14 +74,20 @@ export class BackupService {
private serverRepo: IMcpServerRepository, private serverRepo: IMcpServerRepository,
private projectRepo: IProjectRepository, private projectRepo: IProjectRepository,
private secretRepo: ISecretRepository, private secretRepo: ISecretRepository,
private userRepo?: IUserRepository,
private groupRepo?: IGroupRepository,
private rbacRepo?: IRbacDefinitionRepository,
) {} ) {}
async createBackup(options?: BackupOptions): Promise<BackupBundle> { async createBackup(options?: BackupOptions): Promise<BackupBundle> {
const resources = options?.resources ?? ['servers', 'secrets', 'projects']; const resources = options?.resources ?? ['servers', 'secrets', 'projects', 'users', 'groups', 'rbac'];
let servers: BackupServer[] = []; let servers: BackupServer[] = [];
let secrets: BackupSecret[] = []; let secrets: BackupSecret[] = [];
let projects: BackupProject[] = []; let projects: BackupProject[] = [];
let users: BackupUser[] = [];
let groups: BackupGroup[] = [];
let rbacBindings: BackupRbacBinding[] = [];
if (resources.includes('servers')) { if (resources.includes('servers')) {
const allServers = await this.serverRepo.findAll(); const allServers = await this.serverRepo.findAll();
@@ -80,6 +115,38 @@ export class BackupService {
projects = allProjects.map((proj) => ({
name: proj.name,
description: proj.description,
proxyMode: proj.proxyMode,
llmProvider: proj.llmProvider,
llmModel: proj.llmModel,
serverNames: proj.servers.map((ps) => ps.server.name),
}));
}
if (resources.includes('users') && this.userRepo) {
const allUsers = await this.userRepo.findAll();
users = allUsers.map((u) => ({
email: u.email,
name: u.name,
role: u.role,
provider: u.provider,
}));
}
if (resources.includes('groups') && this.groupRepo) {
const allGroups = await this.groupRepo.findAll();
groups = allGroups.map((g) => ({
name: g.name,
description: g.description,
memberEmails: g.members.map((m) => m.user.email),
}));
}
if (resources.includes('rbac') && this.rbacRepo) {
const allRbac = await this.rbacRepo.findAll();
rbacBindings = allRbac.map((r) => ({
name: r.name,
subjects: r.subjects,
roleBindings: r.roleBindings,
}));
}
@@ -91,6 +158,9 @@ export class BackupService {
servers,
secrets,
projects,
users,
groups,
rbacBindings,
};
if (options?.password && secrets.length > 0) {

View File

@@ -1,5 +1,9 @@
import type { IMcpServerRepository, ISecretRepository } from '../../repositories/interfaces.js';
import type { IProjectRepository } from '../../repositories/project.repository.js';
import type { IUserRepository } from '../../repositories/user.repository.js';
import type { IGroupRepository } from '../../repositories/group.repository.js';
import type { IRbacDefinitionRepository } from '../../repositories/rbac-definition.repository.js';
import type { RbacRoleBinding } from '../../validation/rbac-definition.schema.js';
import { decrypt } from './crypto.js';
import type { BackupBundle } from './backup-service.js';
@@ -17,6 +21,12 @@ export interface RestoreResult {
secretsSkipped: number;
projectsCreated: number;
projectsSkipped: number;
usersCreated: number;
usersSkipped: number;
groupsCreated: number;
groupsSkipped: number;
rbacCreated: number;
rbacSkipped: number;
errors: string[];
}
@@ -25,6 +35,9 @@ export class RestoreService {
private serverRepo: IMcpServerRepository,
private projectRepo: IProjectRepository,
private secretRepo: ISecretRepository,
private userRepo?: IUserRepository,
private groupRepo?: IGroupRepository,
private rbacRepo?: IRbacDefinitionRepository,
) {}
validateBundle(bundle: unknown): bundle is BackupBundle {
@@ -36,6 +49,7 @@ export class RestoreService {
Array.isArray(b['secrets']) &&
Array.isArray(b['projects'])
);
// users, groups, rbacBindings are optional for backwards compatibility
}
async restore(bundle: BackupBundle, options?: RestoreOptions): Promise<RestoreResult> {
@@ -47,6 +61,12 @@ export class RestoreService {
secretsSkipped: 0,
projectsCreated: 0,
projectsSkipped: 0,
usersCreated: 0,
usersSkipped: 0,
groupsCreated: 0,
groupsSkipped: 0,
rbacCreated: 0,
rbacSkipped: 0,
errors: [],
};
@@ -78,6 +98,37 @@ export class RestoreService {
}
}
// Restore order: secrets → servers → users → groups → projects → rbacBindings
// Restore secrets
for (const secret of bundle.secrets) {
try {
const existing = await this.secretRepo.findByName(secret.name);
if (existing) {
if (strategy === 'fail') {
result.errors.push(`Secret "${secret.name}" already exists`);
return result;
}
if (strategy === 'skip') {
result.secretsSkipped++;
continue;
}
// overwrite
await this.secretRepo.update(existing.id, { data: secret.data });
result.secretsCreated++;
continue;
}
await this.secretRepo.create({
name: secret.name,
data: secret.data,
});
result.secretsCreated++;
} catch (err) {
result.errors.push(`Failed to restore secret "${secret.name}": ${err instanceof Error ? err.message : String(err)}`);
}
}
// Restore servers
for (const server of bundle.servers) {
try {
@@ -121,36 +172,75 @@ export class RestoreService {
}
}
// Restore users
if (bundle.users && this.userRepo) {
for (const user of bundle.users) {
try {
const existing = await this.userRepo.findByEmail(user.email);
if (existing) {
if (strategy === 'fail') {
result.errors.push(`User "${user.email}" already exists`);
return result;
}
result.usersSkipped++;
continue;
}
// Create with placeholder passwordHash — user must reset password
const createData: { email: string; passwordHash: string; name?: string; role?: string } = {
email: user.email,
passwordHash: '__RESTORED_MUST_RESET__',
role: user.role,
};
if (user.name !== null) createData.name = user.name;
await this.userRepo.create(createData);
result.usersCreated++;
} catch (err) {
result.errors.push(`Failed to restore user "${user.email}": ${err instanceof Error ? err.message : String(err)}`);
}
}
}
// Restore groups
if (bundle.groups && this.groupRepo && this.userRepo) {
for (const group of bundle.groups) {
try {
const existing = await this.groupRepo.findByName(group.name);
if (existing) {
if (strategy === 'fail') {
result.errors.push(`Group "${group.name}" already exists`);
return result;
}
if (strategy === 'skip') {
result.groupsSkipped++;
continue;
}
// overwrite: update description and re-set members
await this.groupRepo.update(existing.id, { description: group.description });
if (group.memberEmails.length > 0) {
const memberIds = await this.resolveUserEmails(group.memberEmails);
await this.groupRepo.setMembers(existing.id, memberIds);
}
result.groupsCreated++;
continue;
}
const created = await this.groupRepo.create({
name: group.name,
description: group.description,
});
if (group.memberEmails.length > 0) {
const memberIds = await this.resolveUserEmails(group.memberEmails);
await this.groupRepo.setMembers(created.id, memberIds);
}
result.groupsCreated++;
} catch (err) {
result.errors.push(`Failed to restore group "${group.name}": ${err instanceof Error ? err.message : String(err)}`);
}
}
}
// Restore projects (enriched)
for (const project of bundle.projects) {
try {
const existing = await this.projectRepo.findByName(project.name);
@@ -164,22 +254,100 @@ export class RestoreService {
continue;
}
// overwrite
const updateData: Record<string, unknown> = { description: project.description };
if (project.proxyMode) updateData['proxyMode'] = project.proxyMode;
if (project.llmProvider !== undefined) updateData['llmProvider'] = project.llmProvider;
if (project.llmModel !== undefined) updateData['llmModel'] = project.llmModel;
await this.projectRepo.update(existing.id, updateData);
// Re-link servers
if (project.serverNames && project.serverNames.length > 0) {
const serverIds = await this.resolveServerNames(project.serverNames);
await this.projectRepo.setServers(existing.id, serverIds);
}
result.projectsCreated++;
continue;
}
const projectCreateData: { name: string; description: string; ownerId: string; proxyMode: string; llmProvider?: string; llmModel?: string } = {
name: project.name,
description: project.description,
ownerId: 'system',
proxyMode: project.proxyMode ?? 'direct',
};
if (project.llmProvider != null) projectCreateData.llmProvider = project.llmProvider;
if (project.llmModel != null) projectCreateData.llmModel = project.llmModel;
const created = await this.projectRepo.create(projectCreateData);
// Link servers
if (project.serverNames && project.serverNames.length > 0) {
const serverIds = await this.resolveServerNames(project.serverNames);
await this.projectRepo.setServers(created.id, serverIds);
}
result.projectsCreated++;
} catch (err) {
result.errors.push(`Failed to restore project "${project.name}": ${err instanceof Error ? err.message : String(err)}`);
}
}
// Restore RBAC bindings
if (bundle.rbacBindings && this.rbacRepo) {
for (const rbac of bundle.rbacBindings) {
try {
const existing = await this.rbacRepo.findByName(rbac.name);
if (existing) {
if (strategy === 'fail') {
result.errors.push(`RBAC binding "${rbac.name}" already exists`);
return result;
}
if (strategy === 'skip') {
result.rbacSkipped++;
continue;
}
// overwrite
await this.rbacRepo.update(existing.id, {
subjects: rbac.subjects as Array<{ kind: 'User' | 'Group'; name: string }>,
roleBindings: rbac.roleBindings as RbacRoleBinding[],
});
result.rbacCreated++;
continue;
}
await this.rbacRepo.create({
name: rbac.name,
subjects: rbac.subjects as Array<{ kind: 'User' | 'Group'; name: string }>,
roleBindings: rbac.roleBindings as RbacRoleBinding[],
});
result.rbacCreated++;
} catch (err) {
result.errors.push(`Failed to restore RBAC binding "${rbac.name}": ${err instanceof Error ? err.message : String(err)}`);
}
}
}
return result;
}
/** Resolve email addresses to user IDs via the user repository. */
private async resolveUserEmails(emails: string[]): Promise<string[]> {
const ids: string[] = [];
for (const email of emails) {
const user = await this.userRepo!.findByEmail(email);
if (user) ids.push(user.id);
}
return ids;
}
/** Resolve server names to server IDs via the server repository. */
private async resolveServerNames(names: string[]): Promise<string[]> {
const ids: string[] = [];
for (const name of names) {
const server = await this.serverRepo.findByName(name);
if (server) ids.push(server.id);
}
return ids;
}
}
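The same conflict-strategy branching (fail / skip / overwrite) recurs in every restore loop above. A minimal standalone sketch of that decision, with simplified names that are not the real RestoreService API:

```typescript
// Illustrative sketch of RestoreService's per-resource conflict handling.
// Names here (Outcome, resolveConflict) are hypothetical simplifications.
type Strategy = 'fail' | 'skip' | 'overwrite';

interface Outcome {
  created: number;
  skipped: number;
  errors: string[];
}

// Decide what to do when a resource with the same name already exists.
function resolveConflict(name: string, strategy: Strategy, out: Outcome): 'stop' | 'next' | 'overwrite' {
  if (strategy === 'fail') {
    out.errors.push(`"${name}" already exists`);
    return 'stop'; // abort the whole restore, mirroring `return result`
  }
  if (strategy === 'skip') {
    out.skipped++;
    return 'next'; // move on to the next resource in the bundle
  }
  out.created++; // overwrite is counted as created, as in the service
  return 'overwrite';
}

const out: Outcome = { created: 0, skipped: 0, errors: [] };
resolveConflict('db-password', 'skip', out);      // → 'next', out.skipped = 1
resolveConflict('db-password', 'overwrite', out); // → 'overwrite', out.created = 1
```

Note that in the users branch above there is no `skip` check: any existing user that does not trigger `fail` is skipped, so user records are never overwritten.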

View File

@@ -0,0 +1,89 @@
import type { GroupWithMembers, IGroupRepository } from '../repositories/group.repository.js';
import type { IUserRepository } from '../repositories/user.repository.js';
import { CreateGroupSchema, UpdateGroupSchema } from '../validation/group.schema.js';
import { NotFoundError, ConflictError } from './mcp-server.service.js';
export class GroupService {
constructor(
private readonly groupRepo: IGroupRepository,
private readonly userRepo: IUserRepository,
) {}
async list(): Promise<GroupWithMembers[]> {
return this.groupRepo.findAll();
}
async getById(id: string): Promise<GroupWithMembers> {
const group = await this.groupRepo.findById(id);
if (group === null) {
throw new NotFoundError(`Group not found: ${id}`);
}
return group;
}
async getByName(name: string): Promise<GroupWithMembers> {
const group = await this.groupRepo.findByName(name);
if (group === null) {
throw new NotFoundError(`Group not found: ${name}`);
}
return group;
}
async create(input: unknown): Promise<GroupWithMembers> {
const data = CreateGroupSchema.parse(input);
const existing = await this.groupRepo.findByName(data.name);
if (existing !== null) {
throw new ConflictError(`Group already exists: ${data.name}`);
}
const group = await this.groupRepo.create({
name: data.name,
description: data.description,
});
if (data.members.length > 0) {
const userIds = await this.resolveEmails(data.members);
await this.groupRepo.setMembers(group.id, userIds);
}
const result = await this.groupRepo.findById(group.id);
// Should always exist since we just created it
return result!;
}
async update(id: string, input: unknown): Promise<GroupWithMembers> {
const data = UpdateGroupSchema.parse(input);
// Verify exists
await this.getById(id);
if (data.description !== undefined) {
await this.groupRepo.update(id, { description: data.description });
}
if (data.members !== undefined) {
const userIds = await this.resolveEmails(data.members);
await this.groupRepo.setMembers(id, userIds);
}
return this.getById(id);
}
async delete(id: string): Promise<void> {
await this.getById(id);
await this.groupRepo.delete(id);
}
private async resolveEmails(emails: string[]): Promise<string[]> {
const userIds: string[] = [];
for (const email of emails) {
const user = await this.userRepo.findByEmail(email);
if (user === null) {
throw new NotFoundError(`User not found: ${email}`);
}
userIds.push(user.id);
}
return userIds;
}
}
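GroupService.resolveEmails fails fast on an unknown email, unlike RestoreService.resolveUserEmails, which silently drops unresolvable entries. A sketch of the strict variant, with an in-memory user list standing in for IUserRepository (helper name is hypothetical):

```typescript
// Strict email-to-ID resolution, as in GroupService.resolveEmails.
interface User { id: string; email: string }

function resolveEmailsStrict(users: User[], emails: string[]): string[] {
  return emails.map((email) => {
    const user = users.find((u) => u.email === email);
    // Unlike the restore path, which skips unknown emails, the service
    // path throws on the first missing user so the API call fails loudly.
    if (!user) throw new Error(`User not found: ${email}`);
    return user.id;
  });
}
```

The strictness difference is deliberate: an interactive API call should reject bad input, while a restore should recover as much of the bundle as possible.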

View File

@@ -27,3 +27,8 @@ export type { McpProxyRequest, McpProxyResponse } from './mcp-proxy-service.js';
export { TemplateService } from './template.service.js';
export { HealthProbeRunner } from './health-probe.service.js';
export type { HealthCheckSpec, ProbeResult } from './health-probe.service.js';
export { RbacDefinitionService } from './rbac-definition.service.js';
export { RbacService } from './rbac.service.js';
export type { RbacAction, Permission, AllowedScope } from './rbac.service.js';
export { UserService } from './user.service.js';
export { GroupService } from './group.service.js';

View File

@@ -1,8 +1,10 @@
import type { McpServer } from '@prisma/client';
export interface McpConfigServer {
command?: string;
args?: string[];
url?: string;
headers?: Record<string, string>;
env?: Record<string, string>;
}
@@ -19,6 +21,13 @@ export function generateMcpConfig(
const mcpServers: Record<string, McpConfigServer> = {};
for (const { server, resolvedEnv } of servers) {
if (server.transport === 'SSE' || server.transport === 'STREAMABLE_HTTP') {
// Point at mcpd proxy URL for non-STDIO transports
mcpServers[server.name] = {
url: `http://localhost:3100/api/v1/mcp/proxy/${server.name}`,
};
} else {
// STDIO — npx command approach
const config: McpConfigServer = {
command: 'npx',
args: ['-y', server.packageName ?? server.name],
@@ -30,6 +39,7 @@ export function generateMcpConfig(
mcpServers[server.name] = config;
}
}
return { mcpServers };
}

View File

@@ -1,7 +1,10 @@
import type { McpInstance, McpServer } from '@prisma/client';
import type { IMcpInstanceRepository, IMcpServerRepository } from '../repositories/interfaces.js';
import type { McpOrchestrator } from './orchestrator.js';
import { NotFoundError } from './mcp-server.service.js';
import { InvalidStateError } from './instance.service.js';
import { sendViaSse } from './transport/sse-client.js';
import { sendViaStdio } from './transport/stdio-client.js';
export interface McpProxyRequest {
serverId: string;
@@ -38,17 +41,21 @@ export class McpProxyService {
constructor(
private readonly instanceRepo: IMcpInstanceRepository,
private readonly serverRepo: IMcpServerRepository,
private readonly orchestrator?: McpOrchestrator,
) {}
async execute(request: McpProxyRequest): Promise<McpProxyResponse> {
const server = await this.serverRepo.findById(request.serverId);
if (!server) {
throw new NotFoundError(`Server '${request.serverId}' not found`);
}
// External server: proxy directly to externalUrl
if (server.externalUrl) {
return this.sendToExternal(server, request.method, request.params);
}
// Managed server: find running instance and dispatch by transport
const instances = await this.instanceRepo.findAll(request.serverId);
const running = instances.find((i) => i.status === 'RUNNING');
@@ -56,20 +63,95 @@ export class McpProxyService {
throw new NotFoundError(`No running instance found for server '${request.serverId}'`);
}
return this.sendToManaged(server, running, request.method, request.params);
}
/**
* Send to an external MCP server. Dispatches based on transport type.
* Handles streamable-http protocol (session management + SSE response parsing).
*/
private async sendToExternal(
server: McpServer,
method: string,
params?: Record<string, unknown>,
): Promise<McpProxyResponse> {
const url = server.externalUrl as string;
if (server.transport === 'SSE') {
return sendViaSse(url, method, params);
}
// STREAMABLE_HTTP (default for external)
return this.sendStreamableHttp(server.id, url, method, params);
}
/**
* Send to a managed (containerized) MCP server. Dispatches based on transport type.
*/
private async sendToManaged(
server: McpServer,
instance: McpInstance,
method: string,
params?: Record<string, unknown>,
): Promise<McpProxyResponse> {
const transport = server.transport as string;
// STDIO: use docker exec
if (transport === 'STDIO') {
if (!this.orchestrator) {
throw new InvalidStateError('Orchestrator required for STDIO transport');
}
if (!instance.containerId) {
throw new InvalidStateError(`Instance '${instance.id}' has no container ID`);
}
const packageName = server.packageName as string | null;
if (!packageName) {
throw new InvalidStateError(`Server '${server.id}' has no package name for STDIO transport`);
}
return sendViaStdio(this.orchestrator, instance.containerId, packageName, method, params);
}
// SSE or STREAMABLE_HTTP: need a base URL
const baseUrl = await this.resolveBaseUrl(instance, server);
if (transport === 'SSE') {
return sendViaSse(baseUrl, method, params);
}
// STREAMABLE_HTTP (default)
return this.sendStreamableHttp(server.id, baseUrl, method, params);
}
/**
* Resolve the base URL for an HTTP-based managed server.
* Prefers container internal IP on Docker network, falls back to localhost:port.
*/
private async resolveBaseUrl(instance: McpInstance, server: McpServer): Promise<string> {
const containerPort = (server.containerPort as number | null) ?? 3000;
if (this.orchestrator && instance.containerId) {
try {
const containerInfo = await this.orchestrator.inspectContainer(instance.containerId);
if (containerInfo.ip) {
return `http://${containerInfo.ip}:${containerPort}`;
}
} catch {
// Fall through to localhost
}
}
if (instance.port !== null && instance.port !== undefined) {
return `http://localhost:${instance.port}`;
}
throw new InvalidStateError(
`Cannot resolve URL for instance '${instance.id}': no container IP or host port`,
);
}
/**
* Send via streamable-http protocol with session management.
*/
private async sendStreamableHttp(
serverId: string,
url: string,
method: string,
@@ -109,14 +191,14 @@ export class McpProxyService {
// Session expired? Clear and retry once
if (response.status === 400 || response.status === 404) {
this.sessions.delete(serverId);
return this.sendStreamableHttp(serverId, url, method, params);
}
return {
jsonrpc: '2.0',
id: 1,
error: {
code: -32000,
message: `MCP server returned HTTP ${response.status}: ${response.statusText}`,
},
};
}
@@ -126,8 +208,7 @@ export class McpProxyService {
}
/**
* Initialize a streamable-http session with a server.
* Sends `initialize` and `notifications/initialized`, caches the session ID.
*/
private async initSession(serverId: string, url: string): Promise<void> {
const initBody = {
@@ -174,41 +255,4 @@ export class McpProxyService {
body: JSON.stringify({ jsonrpc: '2.0', method: 'notifications/initialized' }),
});
}
}
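The session-expiry handling in sendStreamableHttp (clear the cached session ID on HTTP 400/404, then replay the request) can be sketched as follows. `send` is a synchronous stand-in for the real fetch call, and the `retried` guard is added here only to bound the sketch's recursion — the real code retries via re-entering sendStreamableHttp, which re-initializes the session:

```typescript
// Sketch of the retry-once session logic in McpProxyService.sendStreamableHttp.
function withSessionRetry(
  sessions: Map<string, string>,
  serverId: string,
  send: (sessionId?: string) => { status: number },
  retried = false,
): { status: number } {
  const res = send(sessions.get(serverId));
  if ((res.status === 400 || res.status === 404) && !retried) {
    // Session expired: drop the cached ID so the next attempt re-initializes.
    sessions.delete(serverId);
    return withSessionRetry(sessions, serverId, send, true);
  }
  return res;
}
```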

View File

@@ -1,18 +1,24 @@
import type { McpServer } from '@prisma/client';
import type { IProjectRepository, ProjectWithRelations } from '../repositories/project.repository.js';
import type { IMcpServerRepository, ISecretRepository } from '../repositories/interfaces.js';
import { CreateProjectSchema, UpdateProjectSchema } from '../validation/project.schema.js';
import { NotFoundError, ConflictError } from './mcp-server.service.js';
import { resolveServerEnv } from './env-resolver.js';
import { generateMcpConfig } from './mcp-config-generator.js';
import type { McpConfig } from './mcp-config-generator.js';
export class ProjectService {
constructor(
private readonly projectRepo: IProjectRepository,
private readonly serverRepo: IMcpServerRepository,
private readonly secretRepo: ISecretRepository,
) {}
async list(ownerId?: string): Promise<ProjectWithRelations[]> {
return this.projectRepo.findAll(ownerId);
}
async getById(id: string): Promise<ProjectWithRelations> {
const project = await this.projectRepo.findById(id);
if (project === null) {
throw new NotFoundError(`Project not found: ${id}`);
@@ -20,7 +26,20 @@ export class ProjectService {
return project;
}
/** Resolve by ID or name. */
async resolveAndGet(idOrName: string): Promise<ProjectWithRelations> {
// Try by ID first
const byId = await this.projectRepo.findById(idOrName);
if (byId !== null) return byId;
// Fall back to name
const byName = await this.projectRepo.findByName(idOrName);
if (byName !== null) return byName;
throw new NotFoundError(`Project not found: ${idOrName}`);
}
async create(input: unknown, ownerId: string): Promise<ProjectWithRelations> {
const data = CreateProjectSchema.parse(input);
const existing = await this.projectRepo.findByName(data.name);
@@ -28,17 +47,109 @@ export class ProjectService {
throw new ConflictError(`Project already exists: ${data.name}`);
}
// Resolve server names to IDs
const serverIds = await this.resolveServerNames(data.servers);
const project = await this.projectRepo.create({
name: data.name,
description: data.description,
prompt: data.prompt,
ownerId,
proxyMode: data.proxyMode,
...(data.llmProvider !== undefined ? { llmProvider: data.llmProvider } : {}),
...(data.llmModel !== undefined ? { llmModel: data.llmModel } : {}),
});
// Link servers
if (serverIds.length > 0) {
await this.projectRepo.setServers(project.id, serverIds);
}
// Re-fetch to include relations
return this.getById(project.id);
}
async update(id: string, input: unknown): Promise<ProjectWithRelations> {
const data = UpdateProjectSchema.parse(input);
const project = await this.getById(id);
// Build update data for scalar fields
const updateData: Record<string, unknown> = {};
if (data.description !== undefined) updateData['description'] = data.description;
if (data.prompt !== undefined) updateData['prompt'] = data.prompt;
if (data.proxyMode !== undefined) updateData['proxyMode'] = data.proxyMode;
if (data.llmProvider !== undefined) updateData['llmProvider'] = data.llmProvider;
if (data.llmModel !== undefined) updateData['llmModel'] = data.llmModel;
// Update scalar fields if any changed
if (Object.keys(updateData).length > 0) {
await this.projectRepo.update(project.id, updateData);
}
// Update servers if provided
if (data.servers !== undefined) {
const serverIds = await this.resolveServerNames(data.servers);
await this.projectRepo.setServers(project.id, serverIds);
}
// Re-fetch to include updated relations
return this.getById(project.id);
}
async delete(id: string): Promise<void> {
await this.getById(id);
await this.projectRepo.delete(id);
}
async generateMcpConfig(idOrName: string): Promise<McpConfig> {
const project = await this.resolveAndGet(idOrName);
if (project.proxyMode === 'filtered') {
// Single entry pointing at mcplocal proxy
return {
mcpServers: {
[project.name]: {
url: `http://localhost:3100/api/v1/mcp/proxy/project/${project.name}`,
},
},
};
}
// Direct mode: fetch full servers and resolve env
const serverEntries: Array<{ server: McpServer; resolvedEnv: Record<string, string> }> = [];
for (const ps of project.servers) {
const server = await this.serverRepo.findById(ps.server.id);
if (server === null) continue;
const resolvedEnv = await resolveServerEnv(server, this.secretRepo);
serverEntries.push({ server, resolvedEnv });
}
return generateMcpConfig(serverEntries);
}
async addServer(idOrName: string, serverName: string): Promise<ProjectWithRelations> {
const project = await this.resolveAndGet(idOrName);
const server = await this.serverRepo.findByName(serverName);
if (server === null) throw new NotFoundError(`Server not found: ${serverName}`);
await this.projectRepo.addServer(project.id, server.id);
return this.getById(project.id);
}
async removeServer(idOrName: string, serverName: string): Promise<ProjectWithRelations> {
const project = await this.resolveAndGet(idOrName);
const server = await this.serverRepo.findByName(serverName);
if (server === null) throw new NotFoundError(`Server not found: ${serverName}`);
await this.projectRepo.removeServer(project.id, server.id);
return this.getById(project.id);
}
private async resolveServerNames(names: string[]): Promise<string[]> {
return Promise.all(names.map(async (name) => {
const server = await this.serverRepo.findByName(name);
if (server === null) throw new NotFoundError(`Server not found: ${name}`);
return server.id;
}));
}
}
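The ID-then-name lookup order of resolveAndGet can be sketched with an in-memory list standing in for IProjectRepository (the helper name below is hypothetical):

```typescript
// Sketch of ProjectService.resolveAndGet: try the argument as an ID first,
// then fall back to treating it as a project name.
interface Proj { id: string; name: string }

function resolveProject(projects: Proj[], idOrName: string): Proj {
  const byId = projects.find((p) => p.id === idOrName);
  if (byId) return byId; // IDs win over names
  const byName = projects.find((p) => p.name === idOrName);
  if (byName) return byName; // fall back to name lookup
  throw new Error(`Project not found: ${idOrName}`);
}
```

One consequence of this order: if a project's name collides with another project's ID, the ID match is returned, so CLI callers passing names rely on names never looking like IDs.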

View File

@@ -0,0 +1,137 @@
import type { Prompt, PromptRequest } from '@prisma/client';
import type { IPromptRepository } from '../repositories/prompt.repository.js';
import type { IPromptRequestRepository } from '../repositories/prompt-request.repository.js';
import type { IProjectRepository } from '../repositories/project.repository.js';
import { CreatePromptSchema, UpdatePromptSchema, CreatePromptRequestSchema } from '../validation/prompt.schema.js';
import { NotFoundError } from './mcp-server.service.js';
export class PromptService {
constructor(
private readonly promptRepo: IPromptRepository,
private readonly promptRequestRepo: IPromptRequestRepository,
private readonly projectRepo: IProjectRepository,
) {}
// ── Prompt CRUD ──
async listPrompts(projectId?: string): Promise<Prompt[]> {
return this.promptRepo.findAll(projectId);
}
async getPrompt(id: string): Promise<Prompt> {
const prompt = await this.promptRepo.findById(id);
if (prompt === null) throw new NotFoundError(`Prompt not found: ${id}`);
return prompt;
}
async createPrompt(input: unknown): Promise<Prompt> {
const data = CreatePromptSchema.parse(input);
if (data.projectId) {
const project = await this.projectRepo.findById(data.projectId);
if (project === null) throw new NotFoundError(`Project not found: ${data.projectId}`);
}
const createData: { name: string; content: string; projectId?: string } = {
name: data.name,
content: data.content,
};
if (data.projectId !== undefined) createData.projectId = data.projectId;
return this.promptRepo.create(createData);
}
async updatePrompt(id: string, input: unknown): Promise<Prompt> {
const data = UpdatePromptSchema.parse(input);
await this.getPrompt(id);
const updateData: { content?: string } = {};
if (data.content !== undefined) updateData.content = data.content;
return this.promptRepo.update(id, updateData);
}
async deletePrompt(id: string): Promise<void> {
await this.getPrompt(id);
await this.promptRepo.delete(id);
}
// ── PromptRequest CRUD ──
async listPromptRequests(projectId?: string): Promise<PromptRequest[]> {
return this.promptRequestRepo.findAll(projectId);
}
async getPromptRequest(id: string): Promise<PromptRequest> {
const req = await this.promptRequestRepo.findById(id);
if (req === null) throw new NotFoundError(`PromptRequest not found: ${id}`);
return req;
}
async deletePromptRequest(id: string): Promise<void> {
await this.getPromptRequest(id);
await this.promptRequestRepo.delete(id);
}
// ── Propose (LLM creates a PromptRequest) ──
async propose(input: unknown): Promise<PromptRequest> {
const data = CreatePromptRequestSchema.parse(input);
if (data.projectId) {
const project = await this.projectRepo.findById(data.projectId);
if (project === null) throw new NotFoundError(`Project not found: ${data.projectId}`);
}
const createData: { name: string; content: string; projectId?: string; createdBySession?: string; createdByUserId?: string } = {
name: data.name,
content: data.content,
};
if (data.projectId !== undefined) createData.projectId = data.projectId;
if (data.createdBySession !== undefined) createData.createdBySession = data.createdBySession;
if (data.createdByUserId !== undefined) createData.createdByUserId = data.createdByUserId;
return this.promptRequestRepo.create(createData);
}
// ── Approve (delete PromptRequest → create Prompt) ──
async approve(requestId: string): Promise<Prompt> {
const req = await this.getPromptRequest(requestId);
// Create the approved prompt
const createData: { name: string; content: string; projectId?: string } = {
name: req.name,
content: req.content,
};
if (req.projectId !== null) createData.projectId = req.projectId;
const prompt = await this.promptRepo.create(createData);
// Delete the request
await this.promptRequestRepo.delete(requestId);
return prompt;
}
// ── Visibility for MCP (approved prompts + session's pending requests) ──
async getVisiblePrompts(
projectId?: string,
sessionId?: string,
): Promise<Array<{ name: string; content: string; type: 'prompt' | 'promptrequest' }>> {
const results: Array<{ name: string; content: string; type: 'prompt' | 'promptrequest' }> = [];
// Approved prompts (project-scoped + global)
const prompts = await this.promptRepo.findAll(projectId);
for (const p of prompts) {
results.push({ name: p.name, content: p.content, type: 'prompt' });
}
// Session's own pending requests
if (sessionId) {
const requests = await this.promptRequestRepo.findBySession(sessionId, projectId);
for (const r of requests) {
results.push({ name: r.name, content: r.content, type: 'promptrequest' });
}
}
return results;
}
}
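The `approve()` flow above copies a pending request into prompt create data before deleting the request. A minimal standalone sketch of that mapping (shapes assumed from the surrounding service code, not the real Prisma types):

```typescript
// Sketch: how a PromptRequest row becomes Prompt create data in approve().
// A null projectId means a global prompt; the key is omitted entirely so the
// repository layer never receives an explicit undefined/null.
interface PromptRequestLike { name: string; content: string; projectId: string | null }

function toPromptCreateData(req: PromptRequestLike): { name: string; content: string; projectId?: string } {
  const data: { name: string; content: string; projectId?: string } = {
    name: req.name,
    content: req.content,
  };
  if (req.projectId !== null) data.projectId = req.projectId;
  return data;
}
```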

View File

@@ -0,0 +1,54 @@
import type { RbacDefinition } from '@prisma/client';
import type { IRbacDefinitionRepository } from '../repositories/rbac-definition.repository.js';
import { CreateRbacDefinitionSchema, UpdateRbacDefinitionSchema } from '../validation/rbac-definition.schema.js';
import { NotFoundError, ConflictError } from './mcp-server.service.js';
export class RbacDefinitionService {
constructor(private readonly repo: IRbacDefinitionRepository) {}
async list(): Promise<RbacDefinition[]> {
return this.repo.findAll();
}
async getById(id: string): Promise<RbacDefinition> {
const def = await this.repo.findById(id);
if (def === null) {
throw new NotFoundError(`RbacDefinition not found: ${id}`);
}
return def;
}
async getByName(name: string): Promise<RbacDefinition> {
const def = await this.repo.findByName(name);
if (def === null) {
throw new NotFoundError(`RbacDefinition not found: ${name}`);
}
return def;
}
async create(input: unknown): Promise<RbacDefinition> {
const data = CreateRbacDefinitionSchema.parse(input);
const existing = await this.repo.findByName(data.name);
if (existing !== null) {
throw new ConflictError(`RbacDefinition already exists: ${data.name}`);
}
return this.repo.create(data);
}
async update(id: string, input: unknown): Promise<RbacDefinition> {
const data = UpdateRbacDefinitionSchema.parse(input);
// Verify exists
await this.getById(id);
return this.repo.update(id, data);
}
async delete(id: string): Promise<void> {
// Verify exists
await this.getById(id);
await this.repo.delete(id);
}
}

View File

@@ -0,0 +1,165 @@
import type { PrismaClient } from '@prisma/client';
import type { IRbacDefinitionRepository } from '../repositories/rbac-definition.repository.js';
import {
normalizeResource,
isResourceBinding,
isOperationBinding,
type RbacSubject,
type RbacRoleBinding,
} from '../validation/rbac-definition.schema.js';
export type RbacAction = 'view' | 'create' | 'delete' | 'edit' | 'run' | 'expose';
export interface ResourcePermission {
role: string;
resource: string;
name?: string;
}
export interface OperationPermission {
role: 'run';
action: string;
}
export type Permission = ResourcePermission | OperationPermission;
export interface AllowedScope {
wildcard: boolean;
names: Set<string>;
}
/** Maps roles to the set of actions they grant. */
const ROLE_ACTIONS: Record<string, readonly RbacAction[]> = {
edit: ['view', 'create', 'delete', 'edit', 'expose'],
view: ['view'],
create: ['create'],
delete: ['delete'],
run: ['run'],
expose: ['expose', 'view'],
};
export class RbacService {
constructor(
private readonly rbacRepo: IRbacDefinitionRepository,
private readonly prisma: PrismaClient,
) {}
/**
* Check whether a user is allowed to perform an action on a resource.
* @param resourceName — optional specific resource name (e.g. 'my-ha').
* If provided, name-scoped bindings only match when their name equals this.
* If omitted (listing), name-scoped bindings still grant access.
*/
async canAccess(userId: string, action: RbacAction, resource: string, resourceName?: string, serviceAccountName?: string): Promise<boolean> {
const permissions = await this.getPermissions(userId, serviceAccountName);
const normalized = normalizeResource(resource);
for (const perm of permissions) {
if (!('resource' in perm)) continue;
const actions = ROLE_ACTIONS[perm.role];
if (actions === undefined) continue;
if (!actions.includes(action)) continue;
const permResource = normalizeResource(perm.resource);
if (permResource !== '*' && permResource !== normalized) continue;
// Name-scoped check: if binding has a name AND caller specified a resourceName, must match
if (perm.name !== undefined && resourceName !== undefined && perm.name !== resourceName) continue;
return true;
}
return false;
}
/**
* Check whether a user is allowed to perform a named operation.
* Operations require an explicit 'run' role binding with a matching action.
*/
async canRunOperation(userId: string, operation: string, serviceAccountName?: string): Promise<boolean> {
const permissions = await this.getPermissions(userId, serviceAccountName);
for (const perm of permissions) {
if ('action' in perm && perm.role === 'run' && perm.action === operation) {
return true;
}
}
return false;
}
/**
* Determine the set of resource names a user may access for a given action+resource.
* Returns wildcard:true if any matching binding is unscoped (no name constraint).
* Returns wildcard:false with a set of allowed names if all bindings are name-scoped.
*/
async getAllowedScope(userId: string, action: RbacAction, resource: string, serviceAccountName?: string): Promise<AllowedScope> {
const permissions = await this.getPermissions(userId, serviceAccountName);
const normalized = normalizeResource(resource);
const names = new Set<string>();
for (const perm of permissions) {
if (!('resource' in perm)) continue;
const actions = ROLE_ACTIONS[perm.role];
if (actions === undefined) continue;
if (!actions.includes(action)) continue;
const permResource = normalizeResource(perm.resource);
if (permResource !== '*' && permResource !== normalized) continue;
// Unscoped binding → wildcard access to this resource
if (perm.name === undefined) return { wildcard: true, names: new Set() };
names.add(perm.name);
}
return { wildcard: false, names };
}
/**
* Collect all permissions for a user across all matching RbacDefinitions.
*/
async getPermissions(userId: string, serviceAccountName?: string): Promise<Permission[]> {
// 1. Resolve user email
const user = await this.prisma.user.findUnique({
where: { id: userId },
select: { email: true },
});
if (user === null && serviceAccountName === undefined) return [];
// 2. Resolve group names the user belongs to
let groupNames: string[] = [];
if (user !== null) {
const memberships = await this.prisma.groupMember.findMany({
where: { userId },
select: { group: { select: { name: true } } },
});
groupNames = memberships.map((m) => m.group.name);
}
// 3. Load all RbacDefinitions
const definitions = await this.rbacRepo.findAll();
// 4. Find definitions where user or service account is a subject
const permissions: Permission[] = [];
for (const def of definitions) {
const subjects = def.subjects as RbacSubject[];
const matched = subjects.some((s) => {
if (s.kind === 'User') return user !== null && s.name === user.email;
if (s.kind === 'Group') return groupNames.includes(s.name);
if (s.kind === 'ServiceAccount') return serviceAccountName !== undefined && s.name === serviceAccountName;
return false;
});
if (!matched) continue;
// 5. Collect roleBindings
const bindings = def.roleBindings as RbacRoleBinding[];
for (const binding of bindings) {
if (isResourceBinding(binding)) {
const perm: ResourcePermission = { role: binding.role, resource: binding.resource };
if (binding.name !== undefined) perm.name = binding.name;
permissions.push(perm);
} else if (isOperationBinding(binding)) {
permissions.push({ role: 'run', action: binding.action });
}
}
}
return permissions;
}
}
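The scope-merging rule in `getAllowedScope` (one unscoped binding short-circuits to wildcard, otherwise named scopes accumulate) can be exercised in isolation. A sketch with the role/action filtering omitted, keeping only the resource and name matching:

```typescript
// Standalone sketch of the merge rule from getAllowedScope above:
// an unscoped binding grants wildcard access to the resource type,
// otherwise the allowed set is the union of the name-scoped bindings.
interface BindingLike { resource: string; name?: string }

function mergeScope(bindings: BindingLike[], resource: string): { wildcard: boolean; names: Set<string> } {
  const names = new Set<string>();
  for (const b of bindings) {
    if (b.resource !== '*' && b.resource !== resource) continue;
    // Unscoped binding → wildcard; any collected names become irrelevant
    if (b.name === undefined) return { wildcard: true, names: new Set() };
    names.add(b.name);
  }
  return { wildcard: false, names };
}
```

Note the early return: once a wildcard binding matches, name-scoped bindings for the same resource add nothing, which is why the real implementation returns immediately as well.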

View File

@@ -0,0 +1,2 @@
export { sendViaSse } from './sse-client.js';
export { sendViaStdio } from './stdio-client.js';

View File

@@ -0,0 +1,150 @@
import type { McpProxyResponse } from '../mcp-proxy-service.js';
/**
* SSE transport client for MCP servers using the legacy SSE protocol.
*
* Protocol: GET /sse → endpoint event with messages URL → POST to messages URL.
* Responses come back on the SSE stream, matched by JSON-RPC request ID.
*
* Each call opens a fresh SSE connection, initializes, sends the request,
* reads the response, and closes. Session caching may be added later.
*/
export async function sendViaSse(
baseUrl: string,
method: string,
params?: Record<string, unknown>,
timeoutMs = 30_000,
): Promise<McpProxyResponse> {
const controller = new AbortController();
const timer = setTimeout(() => controller.abort(), timeoutMs);
try {
// 1. GET /sse → SSE stream
const sseResp = await fetch(`${baseUrl}/sse`, {
method: 'GET',
headers: { 'Accept': 'text/event-stream' },
signal: controller.signal,
});
if (!sseResp.ok) {
return errorResponse(`SSE connect failed: HTTP ${sseResp.status}`);
}
const reader = sseResp.body?.getReader();
if (!reader) {
return errorResponse('No SSE stream body');
}
// 2. Read until we get the endpoint event with messages URL
const decoder = new TextDecoder();
let buffer = '';
let messagesUrl = '';
let currentEvent = '';
while (!messagesUrl) {
const { done, value } = await reader.read();
if (done) break;
buffer += decoder.decode(value, { stream: true });
// Keep any partial trailing line in the buffer for the next read
const lines = buffer.split('\n');
buffer = lines.pop() ?? '';
for (const line of lines) {
// Associate each data line with the most recent `event:` name, so only
// the endpoint event's payload is used as the messages URL
if (line.startsWith('event: ')) currentEvent = line.slice(7).trim();
else if (line.startsWith('data: ') && currentEvent === 'endpoint') {
const endpoint = line.slice(6).trim();
messagesUrl = endpoint.startsWith('http') ? endpoint : `${baseUrl}${endpoint}`;
}
}
}
if (!messagesUrl) {
reader.cancel();
return errorResponse('No endpoint event from SSE stream');
}
const postHeaders = { 'Content-Type': 'application/json' };
// 3. Initialize
const initResp = await fetch(messagesUrl, {
method: 'POST',
headers: postHeaders,
body: JSON.stringify({
jsonrpc: '2.0',
id: 1,
method: 'initialize',
params: {
protocolVersion: '2024-11-05',
capabilities: {},
clientInfo: { name: 'mcpctl-proxy', version: '0.1.0' },
},
}),
signal: controller.signal,
});
if (!initResp.ok) {
reader.cancel();
return errorResponse(`SSE initialize failed: HTTP ${initResp.status}`);
}
// 4. Send notifications/initialized
await fetch(messagesUrl, {
method: 'POST',
headers: postHeaders,
body: JSON.stringify({ jsonrpc: '2.0', method: 'notifications/initialized' }),
signal: controller.signal,
});
// 5. Send the actual request
const requestId = 2;
await fetch(messagesUrl, {
method: 'POST',
headers: postHeaders,
body: JSON.stringify({
jsonrpc: '2.0',
id: requestId,
method,
...(params !== undefined ? { params } : {}),
}),
signal: controller.signal,
});
// 6. Read response from SSE stream (matched by request ID)
let responseBuffer = '';
const readTimeout = setTimeout(() => reader.cancel(), 5000);
while (true) {
const { done, value } = await reader.read();
if (done) break;
responseBuffer += decoder.decode(value, { stream: true });
const respLines = responseBuffer.split('\n');
// Keep any partial trailing line for the next read; parse only complete lines
responseBuffer = respLines.pop() ?? '';
for (const line of respLines) {
if (line.startsWith('data: ')) {
try {
const parsed = JSON.parse(line.slice(6)) as McpProxyResponse;
if (parsed.id === requestId) {
clearTimeout(readTimeout);
reader.cancel();
return parsed;
}
} catch {
// Not valid JSON, skip
}
}
}
}
clearTimeout(readTimeout);
reader.cancel();
return errorResponse('No response received from SSE stream');
} finally {
clearTimeout(timer);
}
}
function errorResponse(message: string): McpProxyResponse {
return {
jsonrpc: '2.0',
id: 1,
error: { code: -32000, message },
};
}
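The handshake in step 2 hinges on parsing the endpoint event off the SSE stream. A standalone sketch of that parse, with the wire format assumed from the code above (`event: endpoint` followed by a `data:` line carrying the messages URL):

```typescript
// Sketch: extract the messages URL from a legacy-SSE handshake chunk.
// Returns null if the chunk contains no endpoint event.
function parseEndpointEvent(chunk: string): string | null {
  let event = '';
  for (const line of chunk.split('\n')) {
    if (line.startsWith('event: ')) event = line.slice(7).trim();
    else if (line.startsWith('data: ') && event === 'endpoint') return line.slice(6).trim();
  }
  return null;
}
```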

View File

@@ -0,0 +1,119 @@
import type { McpOrchestrator } from '../orchestrator.js';
import type { McpProxyResponse } from '../mcp-proxy-service.js';
/**
* STDIO transport client for MCP servers running as Docker containers.
*
* Runs `docker exec` with an inline Node.js script that spawns the MCP server
* binary, pipes JSON-RPC messages via stdin/stdout, and returns the response.
*
* Each call is self-contained: initialize → notifications/initialized → request → response.
*/
export async function sendViaStdio(
orchestrator: McpOrchestrator,
containerId: string,
packageName: string,
method: string,
params?: Record<string, unknown>,
timeoutMs = 30_000,
): Promise<McpProxyResponse> {
const initMsg = JSON.stringify({
jsonrpc: '2.0',
id: 1,
method: 'initialize',
params: {
protocolVersion: '2024-11-05',
capabilities: {},
clientInfo: { name: 'mcpctl-proxy', version: '0.1.0' },
},
});
const initializedMsg = JSON.stringify({
jsonrpc: '2.0',
method: 'notifications/initialized',
});
const requestBody: Record<string, unknown> = {
jsonrpc: '2.0',
id: 2,
method,
};
if (params !== undefined) {
requestBody.params = params;
}
const requestMsg = JSON.stringify(requestBody);
// Inline Node.js script that:
// 1. Spawns the MCP server binary via npx
// 2. Sends initialize → initialized → actual request via stdin
// 3. Reads stdout for JSON-RPC response with id: 2
// 4. Outputs the full JSON-RPC response to stdout
const probeScript = `
const { spawn } = require('child_process');
const proc = spawn('npx', ['--prefer-offline', '-y', ${JSON.stringify(packageName)}], { stdio: ['pipe', 'pipe', 'pipe'] });
let output = '';
let responded = false;
proc.stdout.on('data', d => {
output += d;
const lines = output.split('\\n');
for (const line of lines) {
if (!line.trim()) continue;
try {
const msg = JSON.parse(line);
if (msg.id === 2) {
responded = true;
process.stdout.write(JSON.stringify(msg), () => {
proc.kill();
process.exit(0);
});
}
} catch {}
}
output = lines[lines.length - 1] || '';
});
proc.stderr.on('data', () => {});
proc.on('error', e => { process.stdout.write(JSON.stringify({jsonrpc:'2.0',id:2,error:{code:-32000,message:e.message}})); process.exit(1); });
proc.on('exit', (code) => { if (!responded) { process.stdout.write(JSON.stringify({jsonrpc:'2.0',id:2,error:{code:-32000,message:'process exited '+code}})); process.exit(1); } });
setTimeout(() => { if (!responded) { process.stdout.write(JSON.stringify({jsonrpc:'2.0',id:2,error:{code:-32000,message:'timeout'}})); proc.kill(); process.exit(1); } }, ${timeoutMs - 2000});
proc.stdin.write(${JSON.stringify(initMsg)} + '\\n');
setTimeout(() => {
proc.stdin.write(${JSON.stringify(initializedMsg)} + '\\n');
setTimeout(() => {
proc.stdin.write(${JSON.stringify(requestMsg)} + '\\n');
}, 500);
}, 500);
`.trim();
try {
const result = await orchestrator.execInContainer(
containerId,
['node', '-e', probeScript],
{ timeoutMs },
);
if (result.exitCode === 0 && result.stdout.trim()) {
try {
return JSON.parse(result.stdout.trim()) as McpProxyResponse;
} catch {
return errorResponse(`Failed to parse STDIO response: ${result.stdout.slice(0, 200)}`);
}
}
// Try to parse error response from stdout
try {
return JSON.parse(result.stdout.trim()) as McpProxyResponse;
} catch {
const errorMsg = result.stderr.trim() || `docker exec exit code ${result.exitCode}`;
return errorResponse(errorMsg);
}
} catch (err) {
return errorResponse(err instanceof Error ? err.message : String(err));
}
}
function errorResponse(message: string): McpProxyResponse {
return {
jsonrpc: '2.0',
id: 2,
error: { code: -32000, message },
};
}
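The inline script writes three newline-delimited JSON-RPC frames to the server's stdin: initialize, the initialized notification (no `id`, per JSON-RPC notification semantics), then the actual request. A sketch of that framing, with message contents taken from the code above:

```typescript
// Sketch: build the three newline-delimited JSON-RPC frames sent over STDIO.
function buildStdioFrames(method: string, params?: Record<string, unknown>): string[] {
  const frames = [
    // 1. initialize (id: 1)
    JSON.stringify({
      jsonrpc: '2.0',
      id: 1,
      method: 'initialize',
      params: {
        protocolVersion: '2024-11-05',
        capabilities: {},
        clientInfo: { name: 'mcpctl-proxy', version: '0.1.0' },
      },
    }),
    // 2. notifications/initialized (a notification: no id, no response expected)
    JSON.stringify({ jsonrpc: '2.0', method: 'notifications/initialized' }),
    // 3. the actual request (id: 2) — params key omitted when undefined
    JSON.stringify({ jsonrpc: '2.0', id: 2, method, ...(params !== undefined ? { params } : {}) }),
  ];
  return frames.map((f) => f + '\n');
}
```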

View File

@@ -0,0 +1,60 @@
import bcrypt from 'bcrypt';
import type { IUserRepository, SafeUser } from '../repositories/user.repository.js';
import { CreateUserSchema } from '../validation/user.schema.js';
import { NotFoundError, ConflictError } from './mcp-server.service.js';
const SALT_ROUNDS = 10;
export class UserService {
constructor(private readonly userRepo: IUserRepository) {}
async list(): Promise<SafeUser[]> {
return this.userRepo.findAll();
}
async getById(id: string): Promise<SafeUser> {
const user = await this.userRepo.findById(id);
if (user === null) {
throw new NotFoundError(`User not found: ${id}`);
}
return user;
}
async getByEmail(email: string): Promise<SafeUser> {
const user = await this.userRepo.findByEmail(email);
if (user === null) {
throw new NotFoundError(`User not found: ${email}`);
}
return user;
}
async create(input: unknown): Promise<SafeUser> {
const data = CreateUserSchema.parse(input);
const existing = await this.userRepo.findByEmail(data.email);
if (existing !== null) {
throw new ConflictError(`User already exists: ${data.email}`);
}
const passwordHash = await bcrypt.hash(data.password, SALT_ROUNDS);
const createData: { email: string; passwordHash: string; name?: string } = {
email: data.email,
passwordHash,
};
if (data.name !== undefined) {
createData.name = data.name;
}
return this.userRepo.create(createData);
}
async delete(id: string): Promise<void> {
await this.getById(id);
await this.userRepo.delete(id);
}
async count(): Promise<number> {
return this.userRepo.count();
}
}

View File

@@ -0,0 +1,15 @@
import { z } from 'zod';
export const CreateGroupSchema = z.object({
name: z.string().min(1).max(100).regex(/^[a-z0-9-]+$/, 'Name must be lowercase alphanumeric with hyphens'),
description: z.string().max(1000).default(''),
members: z.array(z.string().email()).default([]),
});
export const UpdateGroupSchema = z.object({
description: z.string().max(1000).optional(),
members: z.array(z.string().email()).optional(),
});
export type CreateGroupInput = z.infer<typeof CreateGroupSchema>;
export type UpdateGroupInput = z.infer<typeof UpdateGroupSchema>;

View File

@@ -2,3 +2,5 @@ export { CreateMcpServerSchema, UpdateMcpServerSchema } from './mcp-server.schem
export type { CreateMcpServerInput, UpdateMcpServerInput } from './mcp-server.schema.js';
export { CreateProjectSchema, UpdateProjectSchema } from './project.schema.js';
export type { CreateProjectInput, UpdateProjectInput } from './project.schema.js';
export { CreateRbacDefinitionSchema, UpdateRbacDefinitionSchema, RbacSubjectSchema, RbacRoleBindingSchema, RBAC_ROLES, RBAC_RESOURCES } from './rbac-definition.schema.js';
export type { CreateRbacDefinitionInput, UpdateRbacDefinitionInput, RbacSubject, RbacRoleBinding } from './rbac-definition.schema.js';

View File

@@ -3,10 +3,23 @@ import { z } from 'zod';
export const CreateProjectSchema = z.object({
name: z.string().min(1).max(100).regex(/^[a-z0-9-]+$/, 'Name must be lowercase alphanumeric with hyphens'),
description: z.string().max(1000).default(''),
prompt: z.string().max(10000).default(''),
proxyMode: z.enum(['direct', 'filtered']).default('direct'),
llmProvider: z.string().max(100).optional(),
llmModel: z.string().max(100).optional(),
servers: z.array(z.string().min(1)).default([]),
}).refine(
(d) => d.proxyMode !== 'filtered' || d.llmProvider,
{ message: 'llmProvider is required when proxyMode is "filtered"' },
);
export const UpdateProjectSchema = z.object({
description: z.string().max(1000).optional(),
prompt: z.string().max(10000).optional(),
proxyMode: z.enum(['direct', 'filtered']).optional(),
llmProvider: z.string().max(100).nullable().optional(),
llmModel: z.string().max(100).nullable().optional(),
servers: z.array(z.string().min(1)).optional(),
});
export type CreateProjectInput = z.infer<typeof CreateProjectSchema>;
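The `.refine()` on `CreateProjectSchema` enforces a cross-field rule: filtered proxy mode requires an LLM provider. The predicate can be pulled out and tested as a plain function (a sketch using the field names from the schema, without zod):

```typescript
// Sketch of the cross-field rule enforced by the schema's .refine():
// filtered proxy mode is only valid when an llmProvider is set.
const filteredNeedsProvider = (d: { proxyMode: 'direct' | 'filtered'; llmProvider?: string }): boolean =>
  d.proxyMode !== 'filtered' || Boolean(d.llmProvider);
```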

View File

@@ -0,0 +1,23 @@
import { z } from 'zod';
export const CreatePromptSchema = z.object({
name: z.string().min(1).max(100).regex(/^[a-z0-9-]+$/, 'Name must be lowercase alphanumeric with hyphens'),
content: z.string().min(1).max(50000),
projectId: z.string().optional(),
});
export const UpdatePromptSchema = z.object({
content: z.string().min(1).max(50000).optional(),
});
export const CreatePromptRequestSchema = z.object({
name: z.string().min(1).max(100).regex(/^[a-z0-9-]+$/, 'Name must be lowercase alphanumeric with hyphens'),
content: z.string().min(1).max(50000),
projectId: z.string().optional(),
createdBySession: z.string().optional(),
createdByUserId: z.string().optional(),
});
export type CreatePromptInput = z.infer<typeof CreatePromptSchema>;
export type UpdatePromptInput = z.infer<typeof UpdatePromptSchema>;
export type CreatePromptRequestInput = z.infer<typeof CreatePromptRequestSchema>;

View File

@@ -0,0 +1,73 @@
import { z } from 'zod';
export const RBAC_ROLES = ['edit', 'view', 'create', 'delete', 'run', 'expose'] as const;
export const RBAC_RESOURCES = ['*', 'servers', 'instances', 'secrets', 'projects', 'templates', 'users', 'groups', 'rbac', 'prompts', 'promptrequests'] as const;
/** Singular→plural map for resource names. */
const RESOURCE_ALIASES: Record<string, string> = {
server: 'servers',
instance: 'instances',
secret: 'secrets',
project: 'projects',
template: 'templates',
user: 'users',
group: 'groups',
prompt: 'prompts',
promptrequest: 'promptrequests',
};
/** Normalize a resource name to its canonical plural form. */
export function normalizeResource(resource: string): string {
return RESOURCE_ALIASES[resource] ?? resource;
}
export const RbacSubjectSchema = z.object({
kind: z.enum(['User', 'Group', 'ServiceAccount']),
name: z.string().min(1),
});
/** Resource binding: role grants access to a resource type (optionally scoped to a named instance). */
export const ResourceBindingSchema = z.object({
role: z.enum(RBAC_ROLES),
resource: z.string().min(1).transform(normalizeResource),
name: z.string().min(1).optional(),
});
/** Operation binding: 'run' role grants access to a named operation. */
export const OperationBindingSchema = z.object({
role: z.literal('run'),
action: z.string().min(1),
});
/** Union of both binding types. */
export const RbacRoleBindingSchema = z.union([
ResourceBindingSchema,
OperationBindingSchema,
]);
export type RbacSubject = z.infer<typeof RbacSubjectSchema>;
export type ResourceBinding = z.infer<typeof ResourceBindingSchema>;
export type OperationBinding = z.infer<typeof OperationBindingSchema>;
export type RbacRoleBinding = z.infer<typeof RbacRoleBindingSchema>;
export function isResourceBinding(b: RbacRoleBinding): b is ResourceBinding {
return 'resource' in b;
}
export function isOperationBinding(b: RbacRoleBinding): b is OperationBinding {
return 'action' in b;
}
export const CreateRbacDefinitionSchema = z.object({
name: z.string().min(1).max(100).regex(/^[a-z0-9-]+$/, 'Name must be lowercase alphanumeric with hyphens'),
subjects: z.array(RbacSubjectSchema).min(1),
roleBindings: z.array(RbacRoleBindingSchema).min(1),
});
export const UpdateRbacDefinitionSchema = z.object({
subjects: z.array(RbacSubjectSchema).min(1).optional(),
roleBindings: z.array(RbacRoleBindingSchema).min(1).optional(),
});
export type CreateRbacDefinitionInput = z.infer<typeof CreateRbacDefinitionSchema>;
export type UpdateRbacDefinitionInput = z.infer<typeof UpdateRbacDefinitionSchema>;

View File

@@ -0,0 +1,15 @@
import { z } from 'zod';
export const CreateUserSchema = z.object({
email: z.string().email(),
password: z.string().min(8).max(128),
name: z.string().max(100).optional(),
});
export const UpdateUserSchema = z.object({
name: z.string().max(100).optional(),
password: z.string().min(8).max(128).optional(),
});
export type CreateUserInput = z.infer<typeof CreateUserSchema>;
export type UpdateUserInput = z.infer<typeof UpdateUserSchema>;

View File

@@ -0,0 +1,424 @@
import { describe, it, expect, vi, afterEach, beforeEach } from 'vitest';
import Fastify from 'fastify';
import type { FastifyInstance } from 'fastify';
import { registerAuthRoutes } from '../src/routes/auth.js';
import { errorHandler } from '../src/middleware/error-handler.js';
import type { AuthService, LoginResult } from '../src/services/auth.service.js';
import type { UserService } from '../src/services/user.service.js';
import type { GroupService } from '../src/services/group.service.js';
import type { RbacDefinitionService } from '../src/services/rbac-definition.service.js';
import type { RbacService, RbacAction } from '../src/services/rbac.service.js';
import type { SafeUser } from '../src/repositories/user.repository.js';
import type { RbacDefinition } from '@prisma/client';
let app: FastifyInstance;
afterEach(async () => {
if (app) await app.close();
});
function makeLoginResult(overrides?: Partial<LoginResult>): LoginResult {
return {
token: 'test-token-123',
expiresAt: new Date(Date.now() + 86400_000),
user: { id: 'user-1', email: 'admin@example.com', role: 'user' },
...overrides,
};
}
function makeSafeUser(overrides?: Partial<SafeUser>): SafeUser {
return {
id: 'user-1',
email: 'admin@example.com',
name: null,
role: 'user',
provider: 'local',
externalId: null,
version: 1,
createdAt: new Date(),
updatedAt: new Date(),
...overrides,
};
}
function makeRbacDef(overrides?: Partial<RbacDefinition>): RbacDefinition {
return {
id: 'rbac-1',
name: 'bootstrap-admin',
subjects: [{ kind: 'Group', name: 'admin' }],
roleBindings: [
{ role: 'edit', resource: '*' },
{ role: 'run', resource: '*' },
{ role: 'run', action: 'impersonate' },
{ role: 'run', action: 'logs' },
{ role: 'run', action: 'backup' },
{ role: 'run', action: 'restore' },
{ role: 'run', action: 'audit-purge' },
],
version: 1,
createdAt: new Date(),
updatedAt: new Date(),
...overrides,
};
}
interface MockDeps {
authService: {
login: ReturnType<typeof vi.fn>;
logout: ReturnType<typeof vi.fn>;
findSession: ReturnType<typeof vi.fn>;
impersonate: ReturnType<typeof vi.fn>;
};
userService: {
count: ReturnType<typeof vi.fn>;
create: ReturnType<typeof vi.fn>;
list: ReturnType<typeof vi.fn>;
getById: ReturnType<typeof vi.fn>;
getByEmail: ReturnType<typeof vi.fn>;
delete: ReturnType<typeof vi.fn>;
};
groupService: {
create: ReturnType<typeof vi.fn>;
list: ReturnType<typeof vi.fn>;
getById: ReturnType<typeof vi.fn>;
getByName: ReturnType<typeof vi.fn>;
update: ReturnType<typeof vi.fn>;
delete: ReturnType<typeof vi.fn>;
};
rbacDefinitionService: {
create: ReturnType<typeof vi.fn>;
list: ReturnType<typeof vi.fn>;
getById: ReturnType<typeof vi.fn>;
getByName: ReturnType<typeof vi.fn>;
update: ReturnType<typeof vi.fn>;
delete: ReturnType<typeof vi.fn>;
};
rbacService: {
canAccess: ReturnType<typeof vi.fn>;
canRunOperation: ReturnType<typeof vi.fn>;
getPermissions: ReturnType<typeof vi.fn>;
};
}
function createMockDeps(): MockDeps {
return {
authService: {
login: vi.fn(async () => makeLoginResult()),
logout: vi.fn(async () => {}),
findSession: vi.fn(async () => null),
impersonate: vi.fn(async () => makeLoginResult({ token: 'impersonated-token' })),
},
userService: {
count: vi.fn(async () => 0),
create: vi.fn(async () => makeSafeUser()),
list: vi.fn(async () => []),
getById: vi.fn(async () => makeSafeUser()),
getByEmail: vi.fn(async () => makeSafeUser()),
delete: vi.fn(async () => {}),
},
groupService: {
create: vi.fn(async () => ({ id: 'grp-1', name: 'admin', description: 'Bootstrap admin group', members: [] })),
list: vi.fn(async () => []),
getById: vi.fn(async () => null),
getByName: vi.fn(async () => null),
update: vi.fn(async () => null),
delete: vi.fn(async () => {}),
},
rbacDefinitionService: {
create: vi.fn(async () => makeRbacDef()),
list: vi.fn(async () => []),
getById: vi.fn(async () => makeRbacDef()),
getByName: vi.fn(async () => null),
update: vi.fn(async () => makeRbacDef()),
delete: vi.fn(async () => {}),
},
rbacService: {
canAccess: vi.fn(async () => false),
canRunOperation: vi.fn(async () => false),
getPermissions: vi.fn(async () => []),
},
};
}
function createApp(deps: MockDeps): Promise<FastifyInstance> {
app = Fastify({ logger: false });
app.setErrorHandler(errorHandler);
registerAuthRoutes(app, deps as unknown as {
authService: AuthService;
userService: UserService;
groupService: GroupService;
rbacDefinitionService: RbacDefinitionService;
rbacService: RbacService;
});
return app.ready();
}
describe('Auth Bootstrap', () => {
describe('GET /api/v1/auth/status', () => {
it('returns hasUsers: false when no users exist', async () => {
const deps = createMockDeps();
deps.userService.count.mockResolvedValue(0);
await createApp(deps);
const res = await app.inject({ method: 'GET', url: '/api/v1/auth/status' });
expect(res.statusCode).toBe(200);
expect(res.json<{ hasUsers: boolean }>().hasUsers).toBe(false);
});
it('returns hasUsers: true when users exist', async () => {
const deps = createMockDeps();
deps.userService.count.mockResolvedValue(1);
await createApp(deps);
const res = await app.inject({ method: 'GET', url: '/api/v1/auth/status' });
expect(res.statusCode).toBe(200);
expect(res.json<{ hasUsers: boolean }>().hasUsers).toBe(true);
});
});
describe('POST /api/v1/auth/bootstrap', () => {
it('creates admin user, admin group, RBAC definition targeting group, and returns session token', async () => {
const deps = createMockDeps();
deps.userService.count.mockResolvedValue(0);
await createApp(deps);
const res = await app.inject({
method: 'POST',
url: '/api/v1/auth/bootstrap',
payload: { email: 'admin@example.com', password: 'securepass123' },
});
expect(res.statusCode).toBe(201);
const body = res.json<LoginResult>();
expect(body.token).toBe('test-token-123');
expect(body.user.email).toBe('admin@example.com');
// Verify user was created
expect(deps.userService.create).toHaveBeenCalledWith({
email: 'admin@example.com',
password: 'securepass123',
});
// Verify admin group was created with the user as member
expect(deps.groupService.create).toHaveBeenCalledWith({
name: 'admin',
description: 'Bootstrap admin group',
members: ['admin@example.com'],
});
// Verify RBAC definition targets the Group, not the User
expect(deps.rbacDefinitionService.create).toHaveBeenCalledWith({
name: 'bootstrap-admin',
subjects: [{ kind: 'Group', name: 'admin' }],
roleBindings: [
{ role: 'edit', resource: '*' },
{ role: 'run', resource: '*' },
{ role: 'run', action: 'impersonate' },
{ role: 'run', action: 'logs' },
{ role: 'run', action: 'backup' },
{ role: 'run', action: 'restore' },
{ role: 'run', action: 'audit-purge' },
],
});
// Verify auto-login was called
expect(deps.authService.login).toHaveBeenCalledWith('admin@example.com', 'securepass123');
});
it('passes name when provided', async () => {
const deps = createMockDeps();
deps.userService.count.mockResolvedValue(0);
await createApp(deps);
await app.inject({
method: 'POST',
url: '/api/v1/auth/bootstrap',
payload: { email: 'admin@example.com', password: 'securepass123', name: 'Admin User' },
});
expect(deps.userService.create).toHaveBeenCalledWith({
email: 'admin@example.com',
password: 'securepass123',
name: 'Admin User',
});
});
it('returns 409 when users already exist', async () => {
const deps = createMockDeps();
deps.userService.count.mockResolvedValue(1);
await createApp(deps);
const res = await app.inject({
method: 'POST',
url: '/api/v1/auth/bootstrap',
payload: { email: 'admin@example.com', password: 'securepass123' },
});
expect(res.statusCode).toBe(409);
expect(res.json<{ error: string }>().error).toContain('Users already exist');
// Should NOT have created user, group, or RBAC
expect(deps.userService.create).not.toHaveBeenCalled();
expect(deps.groupService.create).not.toHaveBeenCalled();
expect(deps.rbacDefinitionService.create).not.toHaveBeenCalled();
});
it('validates email and password via UserService', async () => {
const deps = createMockDeps();
deps.userService.count.mockResolvedValue(0);
// Simulate Zod validation error from UserService
deps.userService.create.mockRejectedValue(
Object.assign(new Error('Validation error'), { statusCode: 400, issues: [] }),
);
await createApp(deps);
const res = await app.inject({
method: 'POST',
url: '/api/v1/auth/bootstrap',
payload: { email: 'not-an-email', password: 'short' },
});
// The error handler should handle the validation error
expect(res.statusCode).toBeGreaterThanOrEqual(400);
});
});
describe('POST /api/v1/auth/login', () => {
it('logs in successfully', async () => {
const deps = createMockDeps();
await createApp(deps);
const res = await app.inject({
method: 'POST',
url: '/api/v1/auth/login',
payload: { email: 'admin@example.com', password: 'securepass123' },
});
expect(res.statusCode).toBe(200);
expect(res.json<LoginResult>().token).toBe('test-token-123');
});
});
describe('POST /api/v1/auth/logout', () => {
it('logs out with valid token', async () => {
const deps = createMockDeps();
deps.authService.findSession.mockResolvedValue({
userId: 'user-1',
expiresAt: new Date(Date.now() + 86400_000),
});
await createApp(deps);
const res = await app.inject({
method: 'POST',
url: '/api/v1/auth/logout',
headers: { authorization: 'Bearer valid-token' },
});
expect(res.statusCode).toBe(200);
expect(res.json<{ success: boolean }>().success).toBe(true);
expect(deps.authService.logout).toHaveBeenCalledWith('valid-token');
});
it('returns 401 without auth', async () => {
const deps = createMockDeps();
await createApp(deps);
const res = await app.inject({
method: 'POST',
url: '/api/v1/auth/logout',
});
expect(res.statusCode).toBe(401);
});
});
describe('POST /api/v1/auth/impersonate', () => {
it('creates session for target user when caller is admin', async () => {
const deps = createMockDeps();
// Auth: valid session
deps.authService.findSession.mockResolvedValue({
userId: 'admin-user-id',
expiresAt: new Date(Date.now() + 86400_000),
});
// RBAC: allow impersonate operation
deps.rbacService.canRunOperation.mockResolvedValue(true);
// Impersonate returns token for target
deps.authService.impersonate.mockResolvedValue(
makeLoginResult({ token: 'impersonated-token', user: { id: 'user-2', email: 'target@example.com', role: 'user' } }),
);
await createApp(deps);
const res = await app.inject({
method: 'POST',
url: '/api/v1/auth/impersonate',
headers: { authorization: 'Bearer admin-token' },
payload: { email: 'target@example.com' },
});
expect(res.statusCode).toBe(200);
const body = res.json<LoginResult>();
expect(body.token).toBe('impersonated-token');
expect(body.user.email).toBe('target@example.com');
expect(deps.authService.impersonate).toHaveBeenCalledWith('target@example.com');
});
it('returns 401 without auth', async () => {
const deps = createMockDeps();
await createApp(deps);
const res = await app.inject({
method: 'POST',
url: '/api/v1/auth/impersonate',
payload: { email: 'target@example.com' },
});
expect(res.statusCode).toBe(401);
});
it('returns 403 when caller lacks admin permission on users', async () => {
const deps = createMockDeps();
// Auth: valid session
deps.authService.findSession.mockResolvedValue({
userId: 'non-admin-id',
expiresAt: new Date(Date.now() + 86400_000),
});
// RBAC: deny
deps.rbacService.canRunOperation.mockResolvedValue(false);
await createApp(deps);
const res = await app.inject({
method: 'POST',
url: '/api/v1/auth/impersonate',
headers: { authorization: 'Bearer regular-token' },
payload: { email: 'target@example.com' },
});
expect(res.statusCode).toBe(403);
});
it('returns 401 when impersonation target does not exist', async () => {
const deps = createMockDeps();
// Auth: valid session
deps.authService.findSession.mockResolvedValue({
userId: 'admin-user-id',
expiresAt: new Date(Date.now() + 86400_000),
});
// RBAC: allow
deps.rbacService.canRunOperation.mockResolvedValue(true);
// Impersonate fails — user not found
const authError = new Error('User not found');
(authError as Error & { statusCode: number }).statusCode = 401;
deps.authService.impersonate.mockRejectedValue(authError);
await createApp(deps);
const res = await app.inject({
method: 'POST',
url: '/api/v1/auth/impersonate',
headers: { authorization: 'Bearer admin-token' },
payload: { email: 'nonexistent@example.com' },
});
expect(res.statusCode).toBe(401);
});
});
});


@@ -6,6 +6,9 @@ import { encrypt, decrypt, isSensitiveKey } from '../src/services/backup/crypto.
 import { registerBackupRoutes } from '../src/routes/backup.js';
 import type { IMcpServerRepository, ISecretRepository } from '../src/repositories/interfaces.js';
 import type { IProjectRepository } from '../src/repositories/project.repository.js';
+import type { IUserRepository } from '../src/repositories/user.repository.js';
+import type { IGroupRepository } from '../src/repositories/group.repository.js';
+import type { IRbacDefinitionRepository } from '../src/repositories/rbac-definition.repository.js';
 // Mock data
 const mockServers = [
@@ -31,8 +34,32 @@ const mockSecrets = [
 const mockProjects = [
 {
-id: 'proj1', name: 'my-project', description: 'Test project',
+id: 'proj1', name: 'my-project', description: 'Test project', proxyMode: 'direct', llmProvider: null, llmModel: null,
 ownerId: 'user1', version: 1, createdAt: new Date(), updatedAt: new Date(),
+servers: [{ id: 'ps1', server: { id: 's1', name: 'github' } }],
+},
+];
+const mockUsers = [
+{ id: 'u1', email: 'alice@test.com', name: 'Alice', role: 'ADMIN', provider: null, externalId: null, version: 1, createdAt: new Date(), updatedAt: new Date() },
+{ id: 'u2', email: 'bob@test.com', name: null, role: 'USER', provider: 'oidc', externalId: null, version: 1, createdAt: new Date(), updatedAt: new Date() },
+];
+const mockGroups = [
+{
+id: 'g1', name: 'dev-team', description: 'Developers', version: 1, createdAt: new Date(), updatedAt: new Date(),
+members: [
+{ id: 'gm1', user: { id: 'u1', email: 'alice@test.com', name: 'Alice' } },
+{ id: 'gm2', user: { id: 'u2', email: 'bob@test.com', name: null } },
+],
+},
+];
+const mockRbacDefinitions = [
+{
+id: 'rbac1', name: 'admins', version: 1, createdAt: new Date(), updatedAt: new Date(),
+subjects: [{ kind: 'User', name: 'alice@test.com' }],
+roleBindings: [{ role: 'edit', resource: '*' }],
 },
 ];
@@ -63,9 +90,47 @@ function mockProjectRepo(): IProjectRepository {
 findAll: vi.fn(async () => [...mockProjects]),
 findById: vi.fn(async (id: string) => mockProjects.find((p) => p.id === id) ?? null),
 findByName: vi.fn(async () => null),
-create: vi.fn(async (data) => ({ id: 'new-proj', ...data, version: 1, createdAt: new Date(), updatedAt: new Date() } as typeof mockProjects[0])),
+create: vi.fn(async (data) => ({ id: 'new-proj', ...data, servers: [], version: 1, createdAt: new Date(), updatedAt: new Date() } as typeof mockProjects[0])),
 update: vi.fn(async (id, data) => ({ ...mockProjects.find((p) => p.id === id)!, ...data })),
 delete: vi.fn(async () => {}),
+setServers: vi.fn(async () => {}),
+addServer: vi.fn(async () => {}),
+removeServer: vi.fn(async () => {}),
+};
+}
+function mockUserRepo(): IUserRepository {
+return {
+findAll: vi.fn(async () => [...mockUsers]),
+findById: vi.fn(async (id: string) => mockUsers.find((u) => u.id === id) ?? null),
+findByEmail: vi.fn(async (email: string) => mockUsers.find((u) => u.email === email) ?? null),
+create: vi.fn(async (data) => ({ id: 'new-u', ...data, provider: null, externalId: null, version: 1, createdAt: new Date(), updatedAt: new Date() } as typeof mockUsers[0])),
+delete: vi.fn(async () => {}),
+count: vi.fn(async () => mockUsers.length),
+};
+}
+function mockGroupRepo(): IGroupRepository {
+return {
+findAll: vi.fn(async () => [...mockGroups]),
+findById: vi.fn(async (id: string) => mockGroups.find((g) => g.id === id) ?? null),
+findByName: vi.fn(async (name: string) => mockGroups.find((g) => g.name === name) ?? null),
+create: vi.fn(async (data) => ({ id: 'new-g', ...data, version: 1, createdAt: new Date(), updatedAt: new Date() } as typeof mockGroups[0])),
+update: vi.fn(async (id, data) => ({ ...mockGroups.find((g) => g.id === id)!, ...data })),
+delete: vi.fn(async () => {}),
+setMembers: vi.fn(async () => {}),
+findGroupsForUser: vi.fn(async () => []),
+};
+}
+function mockRbacRepo(): IRbacDefinitionRepository {
+return {
+findAll: vi.fn(async () => [...mockRbacDefinitions]),
+findById: vi.fn(async (id: string) => mockRbacDefinitions.find((r) => r.id === id) ?? null),
+findByName: vi.fn(async (name: string) => mockRbacDefinitions.find((r) => r.name === name) ?? null),
+create: vi.fn(async (data) => ({ id: 'new-rbac', ...data, version: 1, createdAt: new Date(), updatedAt: new Date() } as typeof mockRbacDefinitions[0])),
+update: vi.fn(async (id, data) => ({ ...mockRbacDefinitions.find((r) => r.id === id)!, ...data })),
+delete: vi.fn(async () => {}),
 };
 }
@@ -110,7 +175,7 @@ describe('BackupService', () => {
 let backupService: BackupService;
 beforeEach(() => {
-backupService = new BackupService(mockServerRepo(), mockProjectRepo(), mockSecretRepo());
+backupService = new BackupService(mockServerRepo(), mockProjectRepo(), mockSecretRepo(), mockUserRepo(), mockGroupRepo(), mockRbacRepo());
 });
 it('creates backup with all resources', async () => {
@@ -126,11 +191,50 @@
 expect(bundle.projects[0]!.name).toBe('my-project');
 });
+it('includes users in backup', async () => {
+const bundle = await backupService.createBackup();
+expect(bundle.users).toHaveLength(2);
+expect(bundle.users![0]!.email).toBe('alice@test.com');
+expect(bundle.users![0]!.role).toBe('ADMIN');
+expect(bundle.users![1]!.email).toBe('bob@test.com');
+expect(bundle.users![1]!.provider).toBe('oidc');
+});
+it('includes groups in backup with member emails', async () => {
+const bundle = await backupService.createBackup();
+expect(bundle.groups).toHaveLength(1);
+expect(bundle.groups![0]!.name).toBe('dev-team');
+expect(bundle.groups![0]!.memberEmails).toEqual(['alice@test.com', 'bob@test.com']);
+});
+it('includes rbac bindings in backup', async () => {
+const bundle = await backupService.createBackup();
+expect(bundle.rbacBindings).toHaveLength(1);
+expect(bundle.rbacBindings![0]!.name).toBe('admins');
+expect(bundle.rbacBindings![0]!.subjects).toEqual([{ kind: 'User', name: 'alice@test.com' }]);
+});
+it('includes enriched projects with server names', async () => {
+const bundle = await backupService.createBackup();
+const proj = bundle.projects[0]!;
+expect(proj.proxyMode).toBe('direct');
+expect(proj.serverNames).toEqual(['github']);
+});
 it('filters resources', async () => {
 const bundle = await backupService.createBackup({ resources: ['servers'] });
 expect(bundle.servers).toHaveLength(2);
 expect(bundle.secrets).toHaveLength(0);
 expect(bundle.projects).toHaveLength(0);
+expect(bundle.users).toHaveLength(0);
+expect(bundle.groups).toHaveLength(0);
+expect(bundle.rbacBindings).toHaveLength(0);
+});
+it('filters to only users', async () => {
+const bundle = await backupService.createBackup({ resources: ['users'] });
+expect(bundle.servers).toHaveLength(0);
+expect(bundle.users).toHaveLength(2);
 });
 it('encrypts sensitive secret values when password provided', async () => {
@@ -150,13 +254,22 @@
 (emptySecretRepo.findAll as ReturnType<typeof vi.fn>).mockResolvedValue([]);
 const emptyProjectRepo = mockProjectRepo();
 (emptyProjectRepo.findAll as ReturnType<typeof vi.fn>).mockResolvedValue([]);
+const emptyUserRepo = mockUserRepo();
+(emptyUserRepo.findAll as ReturnType<typeof vi.fn>).mockResolvedValue([]);
+const emptyGroupRepo = mockGroupRepo();
+(emptyGroupRepo.findAll as ReturnType<typeof vi.fn>).mockResolvedValue([]);
+const emptyRbacRepo = mockRbacRepo();
+(emptyRbacRepo.findAll as ReturnType<typeof vi.fn>).mockResolvedValue([]);
-const service = new BackupService(emptyServerRepo, emptyProjectRepo, emptySecretRepo);
+const service = new BackupService(emptyServerRepo, emptyProjectRepo, emptySecretRepo, emptyUserRepo, emptyGroupRepo, emptyRbacRepo);
 const bundle = await service.createBackup();
 expect(bundle.servers).toHaveLength(0);
 expect(bundle.secrets).toHaveLength(0);
 expect(bundle.projects).toHaveLength(0);
+expect(bundle.users).toHaveLength(0);
+expect(bundle.groups).toHaveLength(0);
+expect(bundle.rbacBindings).toHaveLength(0);
 });
 });
@@ -165,16 +278,25 @@ describe('RestoreService', () => {
 let serverRepo: IMcpServerRepository;
 let secretRepo: ISecretRepository;
 let projectRepo: IProjectRepository;
+let userRepo: IUserRepository;
+let groupRepo: IGroupRepository;
+let rbacRepo: IRbacDefinitionRepository;
 beforeEach(() => {
 serverRepo = mockServerRepo();
 secretRepo = mockSecretRepo();
 projectRepo = mockProjectRepo();
+userRepo = mockUserRepo();
+groupRepo = mockGroupRepo();
+rbacRepo = mockRbacRepo();
 // Default: nothing exists yet
 (serverRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(null);
 (secretRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(null);
 (projectRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(null);
-restoreService = new RestoreService(serverRepo, projectRepo, secretRepo);
+(userRepo.findByEmail as ReturnType<typeof vi.fn>).mockResolvedValue(null);
+(groupRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(null);
+(rbacRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(null);
+restoreService = new RestoreService(serverRepo, projectRepo, secretRepo, userRepo, groupRepo, rbacRepo);
 });
 const validBundle = {
@@ -187,6 +309,23 @@
 projects: [{ name: 'test-proj', description: 'Test' }],
 };
+const fullBundle = {
+...validBundle,
+users: [
+{ email: 'alice@test.com', name: 'Alice', role: 'ADMIN', provider: null },
+{ email: 'bob@test.com', name: null, role: 'USER', provider: 'oidc' },
+],
+groups: [
+{ name: 'dev-team', description: 'Developers', memberEmails: ['alice@test.com', 'bob@test.com'] },
+],
+rbacBindings: [
+{ name: 'admins', subjects: [{ kind: 'User', name: 'alice@test.com' }], roleBindings: [{ role: 'edit', resource: '*' }] },
+],
+projects: [
+{ name: 'test-proj', description: 'Test', proxyMode: 'filtered', llmProvider: 'openai', llmModel: 'gpt-4', serverNames: ['github'], members: ['alice@test.com'] },
+],
+};
 it('validates valid bundle', () => {
 expect(restoreService.validateBundle(validBundle)).toBe(true);
 });
@@ -197,6 +336,11 @@
 expect(restoreService.validateBundle({ version: '1' })).toBe(false);
 });
+it('validates old bundles without new fields (backwards compatibility)', () => {
+expect(restoreService.validateBundle(validBundle)).toBe(true);
+// Old bundle has no users/groups/rbacBindings — should still validate
+});
 it('restores all resources', async () => {
 const result = await restoreService.restore(validBundle);
@@ -209,6 +353,95 @@
 expect(projectRepo.create).toHaveBeenCalled();
 });
+it('restores users', async () => {
+const result = await restoreService.restore(fullBundle);
+expect(result.usersCreated).toBe(2);
+expect(userRepo.create).toHaveBeenCalledWith(expect.objectContaining({
+email: 'alice@test.com',
+name: 'Alice',
+role: 'ADMIN',
+passwordHash: '__RESTORED_MUST_RESET__',
+}));
+expect(userRepo.create).toHaveBeenCalledWith(expect.objectContaining({
+email: 'bob@test.com',
+role: 'USER',
+}));
+});
+it('restores groups with member resolution', async () => {
+// After users are created, simulate they can be found by email
+let callCount = 0;
+(userRepo.findByEmail as ReturnType<typeof vi.fn>).mockImplementation(async (email: string) => {
+// First calls during user restore return null (user doesn't exist yet)
+// Later calls during group member resolution return the created user
+callCount++;
+if (callCount > 2) {
+// After user creation phase, simulate finding created users
+if (email === 'alice@test.com') return { id: 'new-u-alice', email };
+if (email === 'bob@test.com') return { id: 'new-u-bob', email };
+}
+return null;
+});
+const result = await restoreService.restore(fullBundle);
+expect(result.groupsCreated).toBe(1);
+expect(groupRepo.create).toHaveBeenCalledWith(expect.objectContaining({
+name: 'dev-team',
+description: 'Developers',
+}));
+expect(groupRepo.setMembers).toHaveBeenCalled();
+});
+it('restores rbac bindings', async () => {
+const result = await restoreService.restore(fullBundle);
+expect(result.rbacCreated).toBe(1);
+expect(rbacRepo.create).toHaveBeenCalledWith(expect.objectContaining({
+name: 'admins',
+subjects: [{ kind: 'User', name: 'alice@test.com' }],
+roleBindings: [{ role: 'edit', resource: '*' }],
+}));
+});
+it('restores enriched projects with server linking', async () => {
+// Simulate servers exist (restored in prior step)
+(serverRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(null);
+// After server restore, we can find them
+let serverCallCount = 0;
+(serverRepo.findByName as ReturnType<typeof vi.fn>).mockImplementation(async (name: string) => {
+serverCallCount++;
+// During server restore phase, first call returns null (server doesn't exist)
+// During project restore phase, server should be found
+if (serverCallCount > 1 && name === 'github') return { id: 'restored-s1', name: 'github' };
+return null;
+});
+const result = await restoreService.restore(fullBundle);
+expect(result.projectsCreated).toBe(1);
+expect(projectRepo.create).toHaveBeenCalledWith(expect.objectContaining({
+name: 'test-proj',
+proxyMode: 'filtered',
+llmProvider: 'openai',
+llmModel: 'gpt-4',
+}));
+expect(projectRepo.setServers).toHaveBeenCalled();
+});
+it('restores old bundle without users/groups/rbac', async () => {
+const result = await restoreService.restore(validBundle);
+expect(result.serversCreated).toBe(1);
+expect(result.secretsCreated).toBe(1);
+expect(result.projectsCreated).toBe(1);
+expect(result.usersCreated).toBe(0);
+expect(result.groupsCreated).toBe(0);
+expect(result.rbacCreated).toBe(0);
+expect(result.errors).toHaveLength(0);
+});
 it('skips existing resources with skip strategy', async () => {
 (serverRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(mockServers[0]);
 const result = await restoreService.restore(validBundle, { conflictStrategy: 'skip' });
@@ -218,6 +451,33 @@
 expect(serverRepo.create).not.toHaveBeenCalled();
 });
+it('skips existing users', async () => {
+(userRepo.findByEmail as ReturnType<typeof vi.fn>).mockResolvedValue(mockUsers[0]);
+const bundle = { ...validBundle, users: [{ email: 'alice@test.com', name: 'Alice', role: 'ADMIN', provider: null }] };
+const result = await restoreService.restore(bundle, { conflictStrategy: 'skip' });
+expect(result.usersSkipped).toBe(1);
+expect(result.usersCreated).toBe(0);
+});
+it('skips existing groups', async () => {
+(groupRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(mockGroups[0]);
+const bundle = { ...validBundle, groups: [{ name: 'dev-team', description: 'Devs', memberEmails: [] }] };
+const result = await restoreService.restore(bundle, { conflictStrategy: 'skip' });
+expect(result.groupsSkipped).toBe(1);
+expect(result.groupsCreated).toBe(0);
+});
+it('skips existing rbac bindings', async () => {
+(rbacRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(mockRbacDefinitions[0]);
+const bundle = { ...validBundle, rbacBindings: [{ name: 'admins', subjects: [], roleBindings: [] }] };
+const result = await restoreService.restore(bundle, { conflictStrategy: 'skip' });
+expect(result.rbacSkipped).toBe(1);
+expect(result.rbacCreated).toBe(0);
+});
 it('aborts on conflict with fail strategy', async () => {
 (serverRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(mockServers[0]);
 const result = await restoreService.restore(validBundle, { conflictStrategy: 'fail' });
@@ -233,6 +493,18 @@
 expect(serverRepo.update).toHaveBeenCalled();
 });
+it('overwrites existing rbac bindings', async () => {
+(rbacRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(mockRbacDefinitions[0]);
+const bundle = {
+...validBundle,
+rbacBindings: [{ name: 'admins', subjects: [{ kind: 'User', name: 'new@test.com' }], roleBindings: [{ role: 'view', resource: 'servers' }] }],
+};
+const result = await restoreService.restore(bundle, { conflictStrategy: 'overwrite' });
+expect(result.rbacCreated).toBe(1);
+expect(rbacRepo.update).toHaveBeenCalled();
+});
 it('fails restore with encrypted bundle and no password', async () => {
 const encBundle = { ...validBundle, encrypted: true, encryptedSecrets: encrypt('{}', 'pw') };
 const result = await restoreService.restore(encBundle);
@@ -262,6 +534,26 @@
 const result = await restoreService.restore(encBundle, { password: 'wrong' });
 expect(result.errors[0]).toContain('Failed to decrypt');
 });
+it('restores in correct order: secrets → servers → users → groups → projects → rbac', async () => {
+const callOrder: string[] = [];
+(secretRepo.create as ReturnType<typeof vi.fn>).mockImplementation(async () => { callOrder.push('secret'); return { id: 'sec' }; });
+(serverRepo.create as ReturnType<typeof vi.fn>).mockImplementation(async () => { callOrder.push('server'); return { id: 'srv' }; });
+(userRepo.create as ReturnType<typeof vi.fn>).mockImplementation(async () => { callOrder.push('user'); return { id: 'usr' }; });
+(groupRepo.create as ReturnType<typeof vi.fn>).mockImplementation(async () => { callOrder.push('group'); return { id: 'grp' }; });
+(projectRepo.create as ReturnType<typeof vi.fn>).mockImplementation(async () => { callOrder.push('project'); return { id: 'proj', servers: [] }; });
+(rbacRepo.create as ReturnType<typeof vi.fn>).mockImplementation(async () => { callOrder.push('rbac'); return { id: 'rbac' }; });
+await restoreService.restore(fullBundle);
+expect(callOrder[0]).toBe('secret');
+expect(callOrder[1]).toBe('server');
+expect(callOrder[2]).toBe('user');
+expect(callOrder[3]).toBe('user'); // second user
+expect(callOrder[4]).toBe('group');
+expect(callOrder[5]).toBe('project');
+expect(callOrder[6]).toBe('rbac');
+});
 });
 describe('Backup Routes', () => {
@@ -272,7 +564,7 @@
 const sRepo = mockServerRepo();
 const secRepo = mockSecretRepo();
 const prRepo = mockProjectRepo();
-backupService = new BackupService(sRepo, prRepo, secRepo);
+backupService = new BackupService(sRepo, prRepo, secRepo, mockUserRepo(), mockGroupRepo(), mockRbacRepo());
 const rSRepo = mockServerRepo();
 (rSRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(null);
@@ -280,7 +572,13 @@
 (rSecRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(null);
 const rPrRepo = mockProjectRepo();
 (rPrRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(null);
-restoreService = new RestoreService(rSRepo, rPrRepo, rSecRepo);
+const rUserRepo = mockUserRepo();
+(rUserRepo.findByEmail as ReturnType<typeof vi.fn>).mockResolvedValue(null);
+const rGroupRepo = mockGroupRepo();
+(rGroupRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(null);
+const rRbacRepo = mockRbacRepo();
+(rRbacRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(null);
+restoreService = new RestoreService(rSRepo, rPrRepo, rSecRepo, rUserRepo, rGroupRepo, rRbacRepo);
 });
 async function buildApp() {
@@ -289,7 +587,7 @@
 return app;
 }
-it('POST /api/v1/backup returns bundle', async () => {
+it('POST /api/v1/backup returns bundle with new resource types', async () => {
 const app = await buildApp();
 const res = await app.inject({
 method: 'POST',
@@ -303,6 +601,9 @@
 expect(body.servers).toBeDefined();
 expect(body.secrets).toBeDefined();
 expect(body.projects).toBeDefined();
+expect(body.users).toBeDefined();
+expect(body.groups).toBeDefined();
+expect(body.rbacBindings).toBeDefined();
 });
 it('POST /api/v1/restore imports bundle', async () => {
@@ -318,6 +619,9 @@
 expect(res.statusCode).toBe(200);
 const body = res.json();
 expect(body.serversCreated).toBeDefined();
+expect(body.usersCreated).toBeDefined();
+expect(body.groupsCreated).toBeDefined();
+expect(body.rbacCreated).toBeDefined();
 });
 it('POST /api/v1/restore rejects invalid bundle', async () => {


@@ -0,0 +1,250 @@
import { describe, it, expect, vi, beforeEach } from 'vitest';
import { GroupService } from '../src/services/group.service.js';
import { NotFoundError, ConflictError } from '../src/services/mcp-server.service.js';
import type { IGroupRepository, GroupWithMembers } from '../src/repositories/group.repository.js';
import type { IUserRepository, SafeUser } from '../src/repositories/user.repository.js';
import type { Group } from '@prisma/client';
function makeGroup(overrides: Partial<Group> = {}): Group {
return {
id: 'grp-1',
name: 'developers',
description: 'Dev team',
version: 1,
createdAt: new Date(),
updatedAt: new Date(),
...overrides,
};
}
function makeGroupWithMembers(overrides: Partial<Group> = {}, members: GroupWithMembers['members'] = []): GroupWithMembers {
return {
...makeGroup(overrides),
members,
};
}
function makeUser(overrides: Partial<SafeUser> = {}): SafeUser {
return {
id: 'user-1',
email: 'alice@example.com',
name: 'Alice',
role: 'USER',
provider: null,
externalId: null,
version: 1,
createdAt: new Date(),
updatedAt: new Date(),
...overrides,
};
}
function mockGroupRepo(): IGroupRepository {
return {
findAll: vi.fn(async () => []),
findById: vi.fn(async () => null),
findByName: vi.fn(async () => null),
create: vi.fn(async (data) => makeGroup({ name: data.name, description: data.description ?? '' })),
update: vi.fn(async (id, data) => makeGroup({ id, description: data.description ?? '' })),
delete: vi.fn(async () => {}),
setMembers: vi.fn(async () => {}),
findGroupsForUser: vi.fn(async () => []),
};
}
function mockUserRepo(): IUserRepository {
return {
findAll: vi.fn(async () => []),
findById: vi.fn(async () => null),
findByEmail: vi.fn(async () => null),
create: vi.fn(async () => makeUser()),
delete: vi.fn(async () => {}),
count: vi.fn(async () => 0),
};
}
describe('GroupService', () => {
let groupRepo: ReturnType<typeof mockGroupRepo>;
let userRepo: ReturnType<typeof mockUserRepo>;
let service: GroupService;
beforeEach(() => {
groupRepo = mockGroupRepo();
userRepo = mockUserRepo();
service = new GroupService(groupRepo, userRepo);
});
describe('list', () => {
it('returns empty list', async () => {
const result = await service.list();
expect(result).toEqual([]);
expect(groupRepo.findAll).toHaveBeenCalled();
});
it('returns groups with members', async () => {
const groups = [
makeGroupWithMembers({ id: 'g1', name: 'admins' }, [
{ id: 'gm-1', user: { id: 'u1', email: 'a@b.com', name: 'A' } },
]),
];
vi.mocked(groupRepo.findAll).mockResolvedValue(groups);
const result = await service.list();
expect(result).toHaveLength(1);
expect(result[0].members).toHaveLength(1);
});
});
describe('create', () => {
it('creates a group without members', async () => {
const created = makeGroupWithMembers({ name: 'my-group', description: '' }, []);
vi.mocked(groupRepo.findById).mockResolvedValue(created);
const result = await service.create({ name: 'my-group' });
expect(result.name).toBe('my-group');
expect(groupRepo.create).toHaveBeenCalledWith({ name: 'my-group', description: '' });
expect(groupRepo.setMembers).not.toHaveBeenCalled();
});
it('creates a group with members', async () => {
const alice = makeUser({ id: 'u-alice', email: 'alice@example.com' });
const bob = makeUser({ id: 'u-bob', email: 'bob@example.com', name: 'Bob' });
vi.mocked(userRepo.findByEmail).mockImplementation(async (email: string) => {
if (email === 'alice@example.com') return alice;
if (email === 'bob@example.com') return bob;
return null;
});
const created = makeGroupWithMembers({ name: 'team' }, [
{ id: 'gm-1', user: { id: 'u-alice', email: 'alice@example.com', name: 'Alice' } },
{ id: 'gm-2', user: { id: 'u-bob', email: 'bob@example.com', name: 'Bob' } },
]);
vi.mocked(groupRepo.findById).mockResolvedValue(created);
const result = await service.create({
name: 'team',
members: ['alice@example.com', 'bob@example.com'],
});
expect(groupRepo.setMembers).toHaveBeenCalledWith('grp-1', ['u-alice', 'u-bob']);
expect(result.members).toHaveLength(2);
});
it('throws ConflictError when name exists', async () => {
vi.mocked(groupRepo.findByName).mockResolvedValue(makeGroupWithMembers({ name: 'taken' }));
await expect(service.create({ name: 'taken' })).rejects.toThrow(ConflictError);
});
it('throws NotFoundError for unknown member email', async () => {
vi.mocked(userRepo.findByEmail).mockResolvedValue(null);
await expect(
service.create({ name: 'team', members: ['unknown@example.com'] }),
).rejects.toThrow(NotFoundError);
});
it('validates input', async () => {
await expect(service.create({ name: '' })).rejects.toThrow();
await expect(service.create({ name: 'UPPERCASE' })).rejects.toThrow();
});
});
describe('getById', () => {
it('returns group when found', async () => {
const group = makeGroupWithMembers({ id: 'g1' });
vi.mocked(groupRepo.findById).mockResolvedValue(group);
const result = await service.getById('g1');
expect(result.id).toBe('g1');
});
it('throws NotFoundError when not found', async () => {
await expect(service.getById('missing')).rejects.toThrow(NotFoundError);
});
});
describe('getByName', () => {
it('returns group when found', async () => {
const group = makeGroupWithMembers({ name: 'admins' });
vi.mocked(groupRepo.findByName).mockResolvedValue(group);
const result = await service.getByName('admins');
expect(result.name).toBe('admins');
});
it('throws NotFoundError when not found', async () => {
await expect(service.getByName('missing')).rejects.toThrow(NotFoundError);
});
});
describe('update', () => {
it('updates description', async () => {
const group = makeGroupWithMembers({ id: 'g1' });
vi.mocked(groupRepo.findById).mockResolvedValue(group);
const updated = makeGroupWithMembers({ id: 'g1', description: 'new desc' });
// After update, getById is called again to return fresh data
vi.mocked(groupRepo.findById).mockResolvedValue(updated);
const result = await service.update('g1', { description: 'new desc' });
expect(groupRepo.update).toHaveBeenCalledWith('g1', { description: 'new desc' });
expect(result.description).toBe('new desc');
});
it('updates members (full replacement)', async () => {
const group = makeGroupWithMembers({ id: 'g1' }, [
{ id: 'gm-1', user: { id: 'u-old', email: 'old@example.com', name: 'Old' } },
]);
vi.mocked(groupRepo.findById).mockResolvedValue(group);
const alice = makeUser({ id: 'u-alice', email: 'alice@example.com' });
vi.mocked(userRepo.findByEmail).mockResolvedValue(alice);
const updated = makeGroupWithMembers({ id: 'g1' }, [
{ id: 'gm-2', user: { id: 'u-alice', email: 'alice@example.com', name: 'Alice' } },
]);
vi.mocked(groupRepo.findById).mockResolvedValueOnce(group).mockResolvedValue(updated);
const result = await service.update('g1', { members: ['alice@example.com'] });
expect(groupRepo.setMembers).toHaveBeenCalledWith('g1', ['u-alice']);
expect(result.members).toHaveLength(1);
});
it('throws NotFoundError when group not found', async () => {
await expect(service.update('missing', { description: 'x' })).rejects.toThrow(NotFoundError);
});
it('throws NotFoundError for unknown member email on update', async () => {
const group = makeGroupWithMembers({ id: 'g1' });
vi.mocked(groupRepo.findById).mockResolvedValue(group);
vi.mocked(userRepo.findByEmail).mockResolvedValue(null);
await expect(
service.update('g1', { members: ['unknown@example.com'] }),
).rejects.toThrow(NotFoundError);
});
});
describe('delete', () => {
it('deletes group', async () => {
const group = makeGroupWithMembers({ id: 'g1' });
vi.mocked(groupRepo.findById).mockResolvedValue(group);
await service.delete('g1');
expect(groupRepo.delete).toHaveBeenCalledWith('g1');
});
it('throws NotFoundError when group not found', async () => {
await expect(service.delete('missing')).rejects.toThrow(NotFoundError);
});
});
describe('group includes resolved member info', () => {
it('members include user id, email, and name', async () => {
const group = makeGroupWithMembers({ id: 'g1', name: 'team' }, [
{ id: 'gm-1', user: { id: 'u1', email: 'alice@example.com', name: 'Alice' } },
{ id: 'gm-2', user: { id: 'u2', email: 'bob@example.com', name: null } },
]);
vi.mocked(groupRepo.findById).mockResolvedValue(group);
const result = await service.getById('g1');
expect(result.members[0].user).toEqual({ id: 'u1', email: 'alice@example.com', name: 'Alice' });
expect(result.members[1].user).toEqual({ id: 'u2', email: 'bob@example.com', name: null });
});
});
});


@@ -11,10 +11,17 @@ function makeServer(overrides: Partial<McpServer> = {}): McpServer {
dockerImage: null,
transport: 'STDIO',
repositoryUrl: null,
externalUrl: null,
command: null,
containerPort: null,
replicas: 1,
env: [],
healthCheck: null,
version: 1,
createdAt: new Date(),
updatedAt: new Date(),
templateName: null,
templateVersion: null,
...overrides,
};
}
@@ -25,7 +32,7 @@ describe('generateMcpConfig', () => {
expect(result).toEqual({ mcpServers: {} });
});
it('generates config for a single STDIO server', () => {
const result = generateMcpConfig([
{ server: makeServer(), resolvedEnv: {} },
]);
@@ -34,7 +41,7 @@ describe('generateMcpConfig', () => {
expect(result.mcpServers['slack']?.args).toEqual(['-y', '@anthropic/slack-mcp']);
});
it('includes resolved env when present for STDIO server', () => {
const result = generateMcpConfig([
{ server: makeServer(), resolvedEnv: { SLACK_TEAM_ID: 'T123' } },
]);
@@ -67,4 +74,35 @@ describe('generateMcpConfig', () => {
]);
expect(result.mcpServers['slack']?.args).toEqual(['-y', 'slack']);
});
it('generates URL-based config for SSE servers', () => {
const server = makeServer({ name: 'sse-server', transport: 'SSE' });
const result = generateMcpConfig([
{ server, resolvedEnv: { TOKEN: 'abc' } },
]);
const config = result.mcpServers['sse-server'];
expect(config?.url).toBe('http://localhost:3100/api/v1/mcp/proxy/sse-server');
expect(config?.command).toBeUndefined();
expect(config?.args).toBeUndefined();
expect(config?.env).toBeUndefined();
});
it('generates URL-based config for STREAMABLE_HTTP servers', () => {
const server = makeServer({ name: 'stream-server', transport: 'STREAMABLE_HTTP' });
const result = generateMcpConfig([
{ server, resolvedEnv: {} },
]);
const config = result.mcpServers['stream-server'];
expect(config?.url).toBe('http://localhost:3100/api/v1/mcp/proxy/stream-server');
expect(config?.command).toBeUndefined();
});
it('mixes STDIO and SSE servers correctly', () => {
const result = generateMcpConfig([
{ server: makeServer({ name: 'stdio-srv', transport: 'STDIO' }), resolvedEnv: {} },
{ server: makeServer({ name: 'sse-srv', transport: 'SSE' }), resolvedEnv: {} },
]);
expect(result.mcpServers['stdio-srv']?.command).toBe('npx');
expect(result.mcpServers['sse-srv']?.url).toBeDefined();
});
});


@@ -0,0 +1,283 @@
import { describe, it, expect, vi, afterEach } from 'vitest';
import Fastify from 'fastify';
import type { FastifyInstance } from 'fastify';
import { registerProjectRoutes } from '../src/routes/projects.js';
import { ProjectService } from '../src/services/project.service.js';
import { errorHandler } from '../src/middleware/error-handler.js';
import type { IProjectRepository, ProjectWithRelations } from '../src/repositories/project.repository.js';
import type { IMcpServerRepository, ISecretRepository } from '../src/repositories/interfaces.js';
let app: FastifyInstance;
function makeProject(overrides: Partial<ProjectWithRelations> = {}): ProjectWithRelations {
return {
id: 'proj-1',
name: 'test-project',
description: '',
ownerId: 'user-1',
proxyMode: 'direct',
llmProvider: null,
llmModel: null,
version: 1,
createdAt: new Date(),
updatedAt: new Date(),
servers: [],
...overrides,
};
}
function mockProjectRepo(): IProjectRepository {
return {
findAll: vi.fn(async () => []),
findById: vi.fn(async () => null),
findByName: vi.fn(async () => null),
create: vi.fn(async (data) => makeProject({
name: data.name,
description: data.description,
ownerId: data.ownerId,
proxyMode: data.proxyMode,
})),
update: vi.fn(async (_id, data) => makeProject({ ...data as Partial<ProjectWithRelations> })),
delete: vi.fn(async () => {}),
setServers: vi.fn(async () => {}),
addServer: vi.fn(async () => {}),
removeServer: vi.fn(async () => {}),
};
}
function mockServerRepo(): IMcpServerRepository {
return {
findAll: vi.fn(async () => []),
findById: vi.fn(async () => null),
findByName: vi.fn(async () => null),
create: vi.fn(async () => ({} as never)),
update: vi.fn(async () => ({} as never)),
delete: vi.fn(async () => {}),
};
}
function mockSecretRepo(): ISecretRepository {
return {
findAll: vi.fn(async () => []),
findById: vi.fn(async () => null),
findByName: vi.fn(async () => null),
create: vi.fn(async () => ({} as never)),
update: vi.fn(async () => ({} as never)),
delete: vi.fn(async () => {}),
};
}
afterEach(async () => {
if (app) await app.close();
});
function createApp(projectRepo: IProjectRepository, serverRepo?: IMcpServerRepository) {
app = Fastify({ logger: false });
app.setErrorHandler(errorHandler);
const service = new ProjectService(projectRepo, serverRepo ?? mockServerRepo(), mockSecretRepo());
registerProjectRoutes(app, service);
return app.ready();
}
describe('Project Routes', () => {
describe('GET /api/v1/projects', () => {
it('returns project list', async () => {
const repo = mockProjectRepo();
vi.mocked(repo.findAll).mockResolvedValue([
makeProject({ id: 'p1', name: 'alpha', ownerId: 'user-1' }),
makeProject({ id: 'p2', name: 'beta', ownerId: 'user-2' }),
]);
await createApp(repo);
const res = await app.inject({ method: 'GET', url: '/api/v1/projects' });
expect(res.statusCode).toBe(200);
const body = res.json<Array<{ name: string }>>();
expect(body).toHaveLength(2);
});
it('lists all projects without ownerId filtering', async () => {
// This is the bug fix: the route must call list() without ownerId
// so that RBAC (preSerialization) handles access filtering, not the DB query.
const repo = mockProjectRepo();
vi.mocked(repo.findAll).mockResolvedValue([makeProject()]);
await createApp(repo);
await app.inject({ method: 'GET', url: '/api/v1/projects' });
// findAll must be called with NO arguments (undefined ownerId)
expect(repo.findAll).toHaveBeenCalledWith(undefined);
});
});
describe('GET /api/v1/projects/:id', () => {
it('returns 404 when not found', async () => {
const repo = mockProjectRepo();
await createApp(repo);
const res = await app.inject({ method: 'GET', url: '/api/v1/projects/missing' });
expect(res.statusCode).toBe(404);
});
it('returns project when found by ID', async () => {
const repo = mockProjectRepo();
vi.mocked(repo.findById).mockResolvedValue(makeProject({ id: 'p1', name: 'my-proj' }));
await createApp(repo);
const res = await app.inject({ method: 'GET', url: '/api/v1/projects/p1' });
expect(res.statusCode).toBe(200);
expect(res.json<{ name: string }>().name).toBe('my-proj');
});
it('resolves by name when ID not found', async () => {
const repo = mockProjectRepo();
vi.mocked(repo.findByName).mockResolvedValue(makeProject({ name: 'my-proj' }));
await createApp(repo);
const res = await app.inject({ method: 'GET', url: '/api/v1/projects/my-proj' });
expect(res.statusCode).toBe(200);
expect(res.json<{ name: string }>().name).toBe('my-proj');
});
});
describe('POST /api/v1/projects', () => {
it('creates a project and returns 201', async () => {
const repo = mockProjectRepo();
vi.mocked(repo.findById).mockResolvedValue(makeProject({ name: 'new-proj' }));
await createApp(repo);
const res = await app.inject({
method: 'POST',
url: '/api/v1/projects',
payload: { name: 'new-proj' },
});
expect(res.statusCode).toBe(201);
});
it('returns 400 for invalid input', async () => {
const repo = mockProjectRepo();
await createApp(repo);
const res = await app.inject({
method: 'POST',
url: '/api/v1/projects',
payload: { name: '' },
});
expect(res.statusCode).toBe(400);
});
it('returns 409 when name already exists', async () => {
const repo = mockProjectRepo();
vi.mocked(repo.findByName).mockResolvedValue(makeProject());
await createApp(repo);
const res = await app.inject({
method: 'POST',
url: '/api/v1/projects',
payload: { name: 'taken' },
});
expect(res.statusCode).toBe(409);
});
});
describe('PUT /api/v1/projects/:id', () => {
it('updates a project', async () => {
const repo = mockProjectRepo();
vi.mocked(repo.findById).mockResolvedValue(makeProject({ id: 'p1' }));
await createApp(repo);
const res = await app.inject({
method: 'PUT',
url: '/api/v1/projects/p1',
payload: { description: 'Updated' },
});
expect(res.statusCode).toBe(200);
});
it('returns 404 when not found', async () => {
const repo = mockProjectRepo();
await createApp(repo);
const res = await app.inject({
method: 'PUT',
url: '/api/v1/projects/missing',
payload: { description: 'x' },
});
expect(res.statusCode).toBe(404);
});
});
describe('DELETE /api/v1/projects/:id', () => {
it('deletes a project and returns 204', async () => {
const repo = mockProjectRepo();
vi.mocked(repo.findById).mockResolvedValue(makeProject({ id: 'p1' }));
await createApp(repo);
const res = await app.inject({ method: 'DELETE', url: '/api/v1/projects/p1' });
expect(res.statusCode).toBe(204);
});
it('returns 404 when not found', async () => {
const repo = mockProjectRepo();
await createApp(repo);
const res = await app.inject({ method: 'DELETE', url: '/api/v1/projects/missing' });
expect(res.statusCode).toBe(404);
});
});
describe('POST /api/v1/projects/:id/servers (attach)', () => {
it('attaches a server to a project', async () => {
const projectRepo = mockProjectRepo();
const serverRepo = mockServerRepo();
vi.mocked(projectRepo.findById).mockResolvedValue(makeProject({ id: 'p1' }));
vi.mocked(serverRepo.findByName).mockResolvedValue({ id: 'srv-1', name: 'my-ha' } as never);
await createApp(projectRepo, serverRepo);
const res = await app.inject({
method: 'POST',
url: '/api/v1/projects/p1/servers',
payload: { server: 'my-ha' },
});
expect(res.statusCode).toBe(200);
expect(projectRepo.addServer).toHaveBeenCalledWith('p1', 'srv-1');
});
it('returns 400 when server field is missing', async () => {
const repo = mockProjectRepo();
vi.mocked(repo.findById).mockResolvedValue(makeProject({ id: 'p1' }));
await createApp(repo);
const res = await app.inject({
method: 'POST',
url: '/api/v1/projects/p1/servers',
payload: {},
});
expect(res.statusCode).toBe(400);
});
it('returns 404 when server not found', async () => {
const projectRepo = mockProjectRepo();
vi.mocked(projectRepo.findById).mockResolvedValue(makeProject({ id: 'p1' }));
await createApp(projectRepo);
const res = await app.inject({
method: 'POST',
url: '/api/v1/projects/p1/servers',
payload: { server: 'nonexistent' },
});
expect(res.statusCode).toBe(404);
});
});
describe('DELETE /api/v1/projects/:id/servers/:serverName (detach)', () => {
it('detaches a server from a project', async () => {
const projectRepo = mockProjectRepo();
const serverRepo = mockServerRepo();
vi.mocked(projectRepo.findById).mockResolvedValue(makeProject({ id: 'p1' }));
vi.mocked(serverRepo.findByName).mockResolvedValue({ id: 'srv-1', name: 'my-ha' } as never);
await createApp(projectRepo, serverRepo);
const res = await app.inject({ method: 'DELETE', url: '/api/v1/projects/p1/servers/my-ha' });
expect(res.statusCode).toBe(204);
expect(projectRepo.removeServer).toHaveBeenCalledWith('p1', 'srv-1');
});
it('returns 404 when server not found', async () => {
const projectRepo = mockProjectRepo();
vi.mocked(projectRepo.findById).mockResolvedValue(makeProject({ id: 'p1' }));
await createApp(projectRepo);
const res = await app.inject({ method: 'DELETE', url: '/api/v1/projects/p1/servers/nonexistent' });
expect(res.statusCode).toBe(404);
});
});
});


@@ -1,66 +1,383 @@
import { describe, it, expect, vi, beforeEach } from 'vitest';
import { ProjectService } from '../src/services/project.service.js';
import { NotFoundError, ConflictError } from '../src/services/mcp-server.service.js';
import type { IProjectRepository, ProjectWithRelations } from '../src/repositories/project.repository.js';
import type { IMcpServerRepository, ISecretRepository } from '../src/repositories/interfaces.js';
import type { McpServer } from '@prisma/client';
function makeProject(overrides: Partial<ProjectWithRelations> = {}): ProjectWithRelations {
return {
id: 'proj-1',
name: 'test-project',
description: '',
ownerId: 'user-1',
proxyMode: 'direct',
llmProvider: null,
llmModel: null,
version: 1,
createdAt: new Date(),
updatedAt: new Date(),
servers: [],
...overrides,
};
}
function makeServer(overrides: Partial<McpServer> = {}): McpServer {
return {
id: 'srv-1',
name: 'test-server',
description: '',
packageName: '@mcp/test',
dockerImage: null,
transport: 'STDIO',
repositoryUrl: null,
externalUrl: null,
command: null,
containerPort: null,
replicas: 1,
env: [],
healthCheck: null,
version: 1,
createdAt: new Date(),
updatedAt: new Date(),
templateName: null,
templateVersion: null,
...overrides,
};
}
function mockProjectRepo(): IProjectRepository {
return {
findAll: vi.fn(async () => []),
findById: vi.fn(async () => null),
findByName: vi.fn(async () => null),
create: vi.fn(async (data) => makeProject({
name: data.name,
description: data.description,
ownerId: data.ownerId,
proxyMode: data.proxyMode,
llmProvider: data.llmProvider ?? null,
llmModel: data.llmModel ?? null,
})),
update: vi.fn(async (_id, data) => makeProject({ ...data as Partial<ProjectWithRelations> })),
delete: vi.fn(async () => {}),
setServers: vi.fn(async () => {}),
addServer: vi.fn(async () => {}),
removeServer: vi.fn(async () => {}),
};
}
function mockServerRepo(): IMcpServerRepository {
return {
findAll: vi.fn(async () => []),
findById: vi.fn(async () => null),
findByName: vi.fn(async () => null),
create: vi.fn(async () => makeServer()),
update: vi.fn(async () => makeServer()),
delete: vi.fn(async () => {}),
};
}
function mockSecretRepo(): ISecretRepository {
return {
findAll: vi.fn(async () => []),
findById: vi.fn(async () => null),
findByName: vi.fn(async () => null),
create: vi.fn(async () => ({ id: 'sec-1', name: 'test', data: {}, version: 1, createdAt: new Date(), updatedAt: new Date() })),
update: vi.fn(async () => ({ id: 'sec-1', name: 'test', data: {}, version: 1, createdAt: new Date(), updatedAt: new Date() })),
delete: vi.fn(async () => {}),
};
}
describe('ProjectService', () => {
let projectRepo: ReturnType<typeof mockProjectRepo>;
let serverRepo: ReturnType<typeof mockServerRepo>;
let secretRepo: ReturnType<typeof mockSecretRepo>;
let service: ProjectService;
beforeEach(() => {
projectRepo = mockProjectRepo();
serverRepo = mockServerRepo();
secretRepo = mockSecretRepo();
service = new ProjectService(projectRepo, serverRepo, secretRepo);
});
describe('create', () => {
it('creates a basic project', async () => {
// After create, getById is called to re-fetch with relations
const created = makeProject({ name: 'my-project', ownerId: 'user-1' });
vi.mocked(projectRepo.findById).mockResolvedValue(created);
const result = await service.create({ name: 'my-project' }, 'user-1');
expect(result.name).toBe('my-project');
expect(result.ownerId).toBe('user-1');
expect(projectRepo.create).toHaveBeenCalled();
});
it('throws ConflictError when name exists', async () => {
vi.mocked(projectRepo.findByName).mockResolvedValue(makeProject());
await expect(service.create({ name: 'taken' }, 'u1')).rejects.toThrow(ConflictError);
});
it('validates input', async () => {
await expect(service.create({ name: '' }, 'u1')).rejects.toThrow();
});
it('creates project with servers (resolves names)', async () => {
const srv1 = makeServer({ id: 'srv-1', name: 'github' });
const srv2 = makeServer({ id: 'srv-2', name: 'slack' });
vi.mocked(serverRepo.findByName).mockImplementation(async (name) => {
if (name === 'github') return srv1;
if (name === 'slack') return srv2;
return null;
});
const created = makeProject({ id: 'proj-new' });
vi.mocked(projectRepo.create).mockResolvedValue(created);
vi.mocked(projectRepo.findById).mockResolvedValue(makeProject({
id: 'proj-new',
servers: [
{ id: 'ps-1', server: { id: 'srv-1', name: 'github' } },
{ id: 'ps-2', server: { id: 'srv-2', name: 'slack' } },
],
}));
const result = await service.create({ name: 'my-project', servers: ['github', 'slack'] }, 'user-1');
expect(projectRepo.setServers).toHaveBeenCalledWith('proj-new', ['srv-1', 'srv-2']);
expect(result.servers).toHaveLength(2);
});
it('creates project with proxyMode and llmProvider', async () => {
const created = makeProject({ id: 'proj-filtered', proxyMode: 'filtered', llmProvider: 'openai' });
vi.mocked(projectRepo.create).mockResolvedValue(created);
vi.mocked(projectRepo.findById).mockResolvedValue(created);
const result = await service.create({
name: 'filtered-proj',
proxyMode: 'filtered',
llmProvider: 'openai',
}, 'user-1');
expect(result.proxyMode).toBe('filtered');
expect(result.llmProvider).toBe('openai');
});
it('rejects filtered project without llmProvider', async () => {
await expect(
service.create({ name: 'bad-proj', proxyMode: 'filtered' }, 'user-1'),
).rejects.toThrow();
});
it('throws NotFoundError when server name resolution fails', async () => {
vi.mocked(serverRepo.findByName).mockResolvedValue(null);
await expect(
service.create({ name: 'my-project', servers: ['nonexistent'] }, 'user-1'),
).rejects.toThrow(NotFoundError);
});
});
describe('getById', () => {
it('throws NotFoundError when not found', async () => {
await expect(service.getById('missing')).rejects.toThrow(NotFoundError);
});
it('returns project when found', async () => {
const proj = makeProject({ id: 'found' });
vi.mocked(projectRepo.findById).mockResolvedValue(proj);
const result = await service.getById('found');
expect(result.id).toBe('found');
});
});
describe('resolveAndGet', () => {
it('finds by ID first', async () => {
const proj = makeProject({ id: 'proj-id' });
vi.mocked(projectRepo.findById).mockResolvedValue(proj);
const result = await service.resolveAndGet('proj-id');
expect(result.id).toBe('proj-id');
});
it('falls back to name when ID not found', async () => {
vi.mocked(projectRepo.findById).mockResolvedValue(null);
const proj = makeProject({ name: 'my-name' });
vi.mocked(projectRepo.findByName).mockResolvedValue(proj);
const result = await service.resolveAndGet('my-name');
expect(result.name).toBe('my-name');
});
it('throws NotFoundError when neither ID nor name found', async () => {
await expect(service.resolveAndGet('nothing')).rejects.toThrow(NotFoundError);
});
});
describe('update', () => {
it('updates servers (full replacement)', async () => {
const existing = makeProject({ id: 'proj-1' });
vi.mocked(projectRepo.findById).mockResolvedValue(existing);
const srv = makeServer({ id: 'srv-new', name: 'new-srv' });
vi.mocked(serverRepo.findByName).mockResolvedValue(srv);
await service.update('proj-1', { servers: ['new-srv'] });
expect(projectRepo.setServers).toHaveBeenCalledWith('proj-1', ['srv-new']);
});
it('updates proxyMode', async () => {
const existing = makeProject({ id: 'proj-1' });
vi.mocked(projectRepo.findById).mockResolvedValue(existing);
await service.update('proj-1', { proxyMode: 'filtered', llmProvider: 'anthropic' });
expect(projectRepo.update).toHaveBeenCalledWith('proj-1', {
proxyMode: 'filtered',
llmProvider: 'anthropic',
});
});
});
describe('delete', () => {
it('deletes project', async () => {
vi.mocked(projectRepo.findById).mockResolvedValue(makeProject({ id: 'p1' }));
await service.delete('p1');
expect(projectRepo.delete).toHaveBeenCalledWith('p1');
});
it('throws NotFoundError when project does not exist', async () => {
await expect(service.delete('missing')).rejects.toThrow(NotFoundError);
});
});
describe('addServer', () => {
it('attaches a server by name', async () => {
const project = makeProject({ id: 'proj-1' });
const srv = makeServer({ id: 'srv-1', name: 'my-ha' });
vi.mocked(projectRepo.findById).mockResolvedValue(project);
vi.mocked(serverRepo.findByName).mockResolvedValue(srv);
await service.addServer('proj-1', 'my-ha');
expect(projectRepo.addServer).toHaveBeenCalledWith('proj-1', 'srv-1');
});
it('throws NotFoundError when project not found', async () => {
await expect(service.addServer('missing', 'my-ha')).rejects.toThrow(NotFoundError);
});
it('throws NotFoundError when server not found', async () => {
vi.mocked(projectRepo.findById).mockResolvedValue(makeProject({ id: 'proj-1' }));
vi.mocked(serverRepo.findByName).mockResolvedValue(null);
await expect(service.addServer('proj-1', 'nonexistent')).rejects.toThrow(NotFoundError);
});
});
describe('removeServer', () => {
it('detaches a server by name', async () => {
const project = makeProject({ id: 'proj-1' });
const srv = makeServer({ id: 'srv-1', name: 'my-ha' });
vi.mocked(projectRepo.findById).mockResolvedValue(project);
vi.mocked(serverRepo.findByName).mockResolvedValue(srv);
await service.removeServer('proj-1', 'my-ha');
expect(projectRepo.removeServer).toHaveBeenCalledWith('proj-1', 'srv-1');
});
it('throws NotFoundError when project not found', async () => {
await expect(service.removeServer('missing', 'my-ha')).rejects.toThrow(NotFoundError);
});
it('throws NotFoundError when server not found', async () => {
vi.mocked(projectRepo.findById).mockResolvedValue(makeProject({ id: 'proj-1' }));
vi.mocked(serverRepo.findByName).mockResolvedValue(null);
await expect(service.removeServer('proj-1', 'nonexistent')).rejects.toThrow(NotFoundError);
});
});
describe('generateMcpConfig', () => {
it('generates direct mode config with STDIO servers', async () => {
const srv = makeServer({ id: 'srv-1', name: 'github', packageName: '@mcp/github', transport: 'STDIO' });
const project = makeProject({
id: 'proj-1',
name: 'my-proj',
proxyMode: 'direct',
servers: [{ id: 'ps-1', server: { id: 'srv-1', name: 'github' } }],
});
vi.mocked(projectRepo.findById).mockResolvedValue(project);
vi.mocked(serverRepo.findById).mockResolvedValue(srv);
const config = await service.generateMcpConfig('proj-1');
expect(config.mcpServers['github']).toBeDefined();
expect(config.mcpServers['github']?.command).toBe('npx');
expect(config.mcpServers['github']?.args).toEqual(['-y', '@mcp/github']);
});
it('generates direct mode config with SSE servers (URL-based)', async () => {
const srv = makeServer({ id: 'srv-2', name: 'sse-server', transport: 'SSE' });
const project = makeProject({
id: 'proj-1',
proxyMode: 'direct',
servers: [{ id: 'ps-1', server: { id: 'srv-2', name: 'sse-server' } }],
});
vi.mocked(projectRepo.findById).mockResolvedValue(project);
vi.mocked(serverRepo.findById).mockResolvedValue(srv);
const config = await service.generateMcpConfig('proj-1');
expect(config.mcpServers['sse-server']?.url).toBe('http://localhost:3100/api/v1/mcp/proxy/sse-server');
expect(config.mcpServers['sse-server']?.command).toBeUndefined();
});
it('generates filtered mode config (single mcplocal entry)', async () => {
const project = makeProject({
id: 'proj-1',
name: 'filtered-proj',
proxyMode: 'filtered',
llmProvider: 'openai',
servers: [{ id: 'ps-1', server: { id: 'srv-1', name: 'github' } }],
});
vi.mocked(projectRepo.findById).mockResolvedValue(project);
const config = await service.generateMcpConfig('proj-1');
expect(Object.keys(config.mcpServers)).toHaveLength(1);
expect(config.mcpServers['filtered-proj']?.url).toBe('http://localhost:3100/api/v1/mcp/proxy/project/filtered-proj');
});
it('resolves by name for mcp-config', async () => {
const project = makeProject({
id: 'proj-1',
name: 'my-proj',
proxyMode: 'direct',
servers: [],
});
vi.mocked(projectRepo.findById).mockResolvedValue(null);
vi.mocked(projectRepo.findByName).mockResolvedValue(project);
const config = await service.generateMcpConfig('my-proj');
expect(config.mcpServers).toEqual({});
});
it('includes env for STDIO servers', async () => {
const srv = makeServer({
id: 'srv-1',
name: 'github',
transport: 'STDIO',
env: [{ name: 'GITHUB_TOKEN', value: 'tok123' }],
});
const project = makeProject({
id: 'proj-1',
proxyMode: 'direct',
servers: [{ id: 'ps-1', server: { id: 'srv-1', name: 'github' } }],
});
vi.mocked(projectRepo.findById).mockResolvedValue(project);
vi.mocked(serverRepo.findById).mockResolvedValue(srv);
const config = await service.generateMcpConfig('proj-1');
expect(config.mcpServers['github']?.env?.['GITHUB_TOKEN']).toBe('tok123');
});
});
});


@@ -0,0 +1,229 @@
import { describe, it, expect, vi, beforeEach } from 'vitest';
import { RbacDefinitionService } from '../src/services/rbac-definition.service.js';
import { NotFoundError, ConflictError } from '../src/services/mcp-server.service.js';
import type { IRbacDefinitionRepository } from '../src/repositories/rbac-definition.repository.js';
import type { RbacDefinition } from '@prisma/client';
function makeDef(overrides: Partial<RbacDefinition> = {}): RbacDefinition {
return {
id: 'def-1',
name: 'test-rbac',
subjects: [{ kind: 'User', name: 'alice@example.com' }],
roleBindings: [{ role: 'edit', resource: '*' }],
version: 1,
createdAt: new Date(),
updatedAt: new Date(),
...overrides,
};
}
function mockRepo(): IRbacDefinitionRepository {
return {
findAll: vi.fn(async () => []),
findById: vi.fn(async () => null),
findByName: vi.fn(async () => null),
create: vi.fn(async (data) => makeDef({ name: data.name, subjects: data.subjects, roleBindings: data.roleBindings })),
update: vi.fn(async (id, data) => makeDef({ id, ...data })),
delete: vi.fn(async () => {}),
};
}
describe('RbacDefinitionService', () => {
let repo: ReturnType<typeof mockRepo>;
let service: RbacDefinitionService;
beforeEach(() => {
repo = mockRepo();
service = new RbacDefinitionService(repo);
});
describe('list', () => {
it('returns all definitions', async () => {
const defs = await service.list();
expect(repo.findAll).toHaveBeenCalled();
expect(defs).toEqual([]);
});
});
describe('getById', () => {
it('returns definition when found', async () => {
const def = makeDef();
vi.mocked(repo.findById).mockResolvedValue(def);
const result = await service.getById('def-1');
expect(result.id).toBe('def-1');
});
it('throws NotFoundError when not found', async () => {
await expect(service.getById('missing')).rejects.toThrow(NotFoundError);
});
});
describe('getByName', () => {
it('returns definition when found', async () => {
const def = makeDef();
vi.mocked(repo.findByName).mockResolvedValue(def);
const result = await service.getByName('test-rbac');
expect(result.name).toBe('test-rbac');
});
it('throws NotFoundError when not found', async () => {
await expect(service.getByName('missing')).rejects.toThrow(NotFoundError);
});
});
describe('create', () => {
it('creates a definition with valid input', async () => {
const result = await service.create({
name: 'new-rbac',
subjects: [{ kind: 'User', name: 'alice@example.com' }],
roleBindings: [{ role: 'edit', resource: '*' }],
});
expect(result.name).toBe('new-rbac');
expect(repo.create).toHaveBeenCalled();
});
it('throws ConflictError when name exists', async () => {
vi.mocked(repo.findByName).mockResolvedValue(makeDef());
await expect(
service.create({
name: 'test-rbac',
subjects: [{ kind: 'User', name: 'bob@example.com' }],
roleBindings: [{ role: 'view', resource: 'servers' }],
}),
).rejects.toThrow(ConflictError);
});
it('throws on missing subjects', async () => {
await expect(
service.create({
name: 'bad-rbac',
subjects: [],
roleBindings: [{ role: 'view', resource: 'servers' }],
}),
).rejects.toThrow();
});
it('throws on missing roleBindings', async () => {
await expect(
service.create({
name: 'bad-rbac',
subjects: [{ kind: 'User', name: 'alice@example.com' }],
roleBindings: [],
}),
).rejects.toThrow();
});
it('throws on invalid role', async () => {
await expect(
service.create({
name: 'bad-rbac',
subjects: [{ kind: 'User', name: 'alice@example.com' }],
roleBindings: [{ role: 'superadmin', resource: '*' }],
}),
).rejects.toThrow();
});
it('throws on invalid subject kind', async () => {
await expect(
service.create({
name: 'bad-rbac',
subjects: [{ kind: 'Robot', name: 'bot-1' }],
roleBindings: [{ role: 'view', resource: 'servers' }],
}),
).rejects.toThrow();
});
it('throws on invalid name format', async () => {
await expect(
service.create({
name: 'Invalid Name!',
subjects: [{ kind: 'User', name: 'alice@example.com' }],
roleBindings: [{ role: 'view', resource: 'servers' }],
}),
).rejects.toThrow();
});
it('normalizes singular resource names to plural', async () => {
await service.create({
name: 'singular-rbac',
subjects: [{ kind: 'User', name: 'alice@example.com' }],
roleBindings: [
{ role: 'view', resource: 'server' },
{ role: 'edit', resource: 'secret', name: 'my-secret' },
],
});
const call = vi.mocked(repo.create).mock.calls[0]![0];
expect(call.roleBindings[0]!.resource).toBe('servers');
expect(call.roleBindings[1]!.resource).toBe('secrets');
expect(call.roleBindings[1]!.name).toBe('my-secret');
});
it('creates a definition with operation bindings', async () => {
const result = await service.create({
name: 'ops-rbac',
subjects: [{ kind: 'User', name: 'alice@example.com' }],
roleBindings: [{ role: 'run', action: 'logs' }],
});
expect(result.name).toBe('ops-rbac');
expect(repo.create).toHaveBeenCalled();
const call = vi.mocked(repo.create).mock.calls[0]![0];
expect(call.roleBindings[0]!.action).toBe('logs');
});
it('creates a definition with mixed resource and operation bindings', async () => {
const result = await service.create({
name: 'mixed-rbac',
subjects: [{ kind: 'User', name: 'alice@example.com' }],
roleBindings: [
{ role: 'view', resource: 'servers' },
{ role: 'run', action: 'logs' },
],
});
expect(result.name).toBe('mixed-rbac');
expect(repo.create).toHaveBeenCalled();
const call = vi.mocked(repo.create).mock.calls[0]![0];
expect(call.roleBindings).toHaveLength(2);
expect(call.roleBindings[0]!.resource).toBe('servers');
expect(call.roleBindings[1]!.action).toBe('logs');
});
it('creates a definition with name-scoped resource binding', async () => {
const result = await service.create({
name: 'scoped-rbac',
subjects: [{ kind: 'User', name: 'alice@example.com' }],
roleBindings: [{ role: 'view', resource: 'servers', name: 'my-ha' }],
});
expect(result.name).toBe('scoped-rbac');
expect(repo.create).toHaveBeenCalled();
const call = vi.mocked(repo.create).mock.calls[0]![0];
expect(call.roleBindings[0]!.resource).toBe('servers');
expect(call.roleBindings[0]!.name).toBe('my-ha');
});
});
describe('update', () => {
it('updates an existing definition', async () => {
vi.mocked(repo.findById).mockResolvedValue(makeDef());
await service.update('def-1', { subjects: [{ kind: 'User', name: 'bob@example.com' }] });
expect(repo.update).toHaveBeenCalledWith('def-1', {
subjects: [{ kind: 'User', name: 'bob@example.com' }],
});
});
it('throws NotFoundError when definition does not exist', async () => {
await expect(service.update('missing', {})).rejects.toThrow(NotFoundError);
});
});
describe('delete', () => {
it('deletes an existing definition', async () => {
vi.mocked(repo.findById).mockResolvedValue(makeDef());
await service.delete('def-1');
expect(repo.delete).toHaveBeenCalledWith('def-1');
});
it('throws NotFoundError when definition does not exist', async () => {
await expect(service.delete('missing')).rejects.toThrow(NotFoundError);
});
});
});


@@ -0,0 +1,444 @@
/**
* Integration tests reproducing RBAC name-scoped access bugs.
*
* Bug 1: `mcpctl get servers` shows ALL servers despite user only having
* view:servers+name:my-home-assistant
* Bug 2: `mcpctl get server my-home-assistant -o yaml` returns 403 because
* CLI resolves name→CUID, and RBAC compares CUID against binding name
*
* These tests spin up a full Fastify app with auth + RBAC hooks + server routes,
* exactly like main.ts, to catch regressions at the HTTP level.
*/
import { describe, it, expect, vi, afterEach, beforeEach } from 'vitest';
import Fastify from 'fastify';
import type { FastifyInstance } from 'fastify';
import { registerMcpServerRoutes } from '../src/routes/mcp-servers.js';
import { McpServerService } from '../src/services/mcp-server.service.js';
import { InstanceService } from '../src/services/instance.service.js';
import { RbacService } from '../src/services/rbac.service.js';
import { errorHandler } from '../src/middleware/error-handler.js';
import type { IMcpServerRepository, IMcpInstanceRepository } from '../src/repositories/interfaces.js';
import type { IRbacDefinitionRepository } from '../src/repositories/rbac-definition.repository.js';
import type { McpOrchestrator } from '../src/services/orchestrator.js';
import type { McpServer, RbacDefinition, PrismaClient } from '@prisma/client';
// ── Test data ──
const SERVERS: McpServer[] = [
{ id: 'clxyz000000001', name: 'my-home-assistant', description: 'HA server', transport: 'STDIO', packageName: null, dockerImage: null, repositoryUrl: null, externalUrl: null, command: null, containerPort: null, replicas: 1, env: [], healthCheck: null, version: 1, createdAt: new Date(), updatedAt: new Date() },
{ id: 'clxyz000000002', name: 'slack-server', description: 'Slack MCP', transport: 'STDIO', packageName: null, dockerImage: null, repositoryUrl: null, externalUrl: null, command: null, containerPort: null, replicas: 1, env: [], healthCheck: null, version: 1, createdAt: new Date(), updatedAt: new Date() },
{ id: 'clxyz000000003', name: 'github-server', description: 'GitHub MCP', transport: 'STDIO', packageName: null, dockerImage: null, repositoryUrl: null, externalUrl: null, command: null, containerPort: null, replicas: 1, env: [], healthCheck: null, version: 1, createdAt: new Date(), updatedAt: new Date() },
];
// User tokens → userId mapping
const SESSIONS: Record<string, { userId: string }> = {
'scoped-token': { userId: 'user-scoped' },
'admin-token': { userId: 'user-admin' },
'multi-scoped-token': { userId: 'user-multi' },
'secrets-only-token': { userId: 'user-secrets' },
'edit-scoped-token': { userId: 'user-edit-scoped' },
};
// User email mapping
const USERS: Record<string, { email: string }> = {
'user-scoped': { email: 'scoped@example.com' },
'user-admin': { email: 'admin@example.com' },
'user-multi': { email: 'multi@example.com' },
'user-secrets': { email: 'secrets@example.com' },
'user-edit-scoped': { email: 'editscoped@example.com' },
};
// RBAC definitions
const RBAC_DEFS: RbacDefinition[] = [
{
id: 'rbac-scoped', name: 'scoped-view', version: 1, createdAt: new Date(), updatedAt: new Date(),
subjects: [{ kind: 'User', name: 'scoped@example.com' }],
roleBindings: [{ role: 'view', resource: 'servers', name: 'my-home-assistant' }],
},
{
id: 'rbac-admin', name: 'admin-all', version: 1, createdAt: new Date(), updatedAt: new Date(),
subjects: [{ kind: 'User', name: 'admin@example.com' }],
roleBindings: [{ role: 'edit', resource: '*' }],
},
{
id: 'rbac-multi', name: 'multi-scoped', version: 1, createdAt: new Date(), updatedAt: new Date(),
subjects: [{ kind: 'User', name: 'multi@example.com' }],
roleBindings: [
{ role: 'view', resource: 'servers', name: 'my-home-assistant' },
{ role: 'view', resource: 'servers', name: 'slack-server' },
],
},
{
id: 'rbac-secrets', name: 'secrets-only', version: 1, createdAt: new Date(), updatedAt: new Date(),
subjects: [{ kind: 'User', name: 'secrets@example.com' }],
roleBindings: [{ role: 'view', resource: 'secrets' }],
},
{
id: 'rbac-edit-scoped', name: 'edit-scoped', version: 1, createdAt: new Date(), updatedAt: new Date(),
subjects: [{ kind: 'User', name: 'editscoped@example.com' }],
roleBindings: [{ role: 'edit', resource: 'servers', name: 'my-home-assistant' }],
},
];
// ── Mock factories ──
function mockServerRepo(): IMcpServerRepository {
return {
findAll: vi.fn(async () => [...SERVERS]),
findById: vi.fn(async (id: string) => SERVERS.find((s) => s.id === id) ?? null),
findByName: vi.fn(async (name: string) => SERVERS.find((s) => s.name === name) ?? null),
create: vi.fn(async () => SERVERS[0]!),
update: vi.fn(async () => SERVERS[0]!),
delete: vi.fn(async () => {}),
};
}
function mockRbacRepo(): IRbacDefinitionRepository {
return {
findAll: vi.fn(async () => [...RBAC_DEFS]),
findById: vi.fn(async () => null),
findByName: vi.fn(async () => null),
create: vi.fn(async () => RBAC_DEFS[0]!),
update: vi.fn(async () => RBAC_DEFS[0]!),
delete: vi.fn(async () => {}),
};
}
function mockPrisma(): PrismaClient {
return {
user: {
findUnique: vi.fn(async ({ where }: { where: { id: string } }) => {
const u = USERS[where.id];
return u ? { email: u.email } : null;
}),
},
groupMember: {
findMany: vi.fn(async () => []),
},
} as unknown as PrismaClient;
}
function stubInstanceRepo(): IMcpInstanceRepository {
return {
findAll: vi.fn(async () => []),
findById: vi.fn(async () => null),
findByContainerId: vi.fn(async () => null),
create: vi.fn(async (data) => ({
id: 'inst-stub', serverId: data.serverId, containerId: null,
status: data.status ?? 'STOPPED', port: null, metadata: {},
healthStatus: null, lastHealthCheck: null, events: [],
version: 1, createdAt: new Date(), updatedAt: new Date(),
}) as never),
updateStatus: vi.fn(async () => ({}) as never),
delete: vi.fn(async () => {}),
};
}
function stubOrchestrator(): McpOrchestrator {
return {
ping: vi.fn(async () => true),
pullImage: vi.fn(async () => {}),
createContainer: vi.fn(async () => ({ containerId: 'ctr', name: 'stub', state: 'running' as const, port: 3000, createdAt: new Date() })),
stopContainer: vi.fn(async () => {}),
removeContainer: vi.fn(async () => {}),
inspectContainer: vi.fn(async () => ({ containerId: 'ctr', name: 'stub', state: 'running' as const, createdAt: new Date() })),
getContainerLogs: vi.fn(async () => ({ stdout: '', stderr: '' })),
};
}
// ── App setup (replicates main.ts hooks) ──
import { normalizeResource } from '../src/validation/rbac-definition.schema.js';
import type { RbacAction } from '../src/services/rbac.service.js';
type PermissionCheck =
| { kind: 'resource'; resource: string; action: RbacAction; resourceName?: string }
| { kind: 'operation'; operation: string }
| { kind: 'skip' };
function mapUrlToPermission(method: string, url: string): PermissionCheck {
const match = url.match(/^\/api\/v1\/([a-z-]+)/);
if (!match) return { kind: 'skip' };
const segment = match[1] as string;
if (segment === 'backup') return { kind: 'operation', operation: 'backup' };
if (segment === 'restore') return { kind: 'operation', operation: 'restore' };
if (segment === 'audit-logs' && method === 'DELETE') return { kind: 'operation', operation: 'audit-purge' };
const resourceMap: Record<string, string | undefined> = {
servers: 'servers', instances: 'instances', secrets: 'secrets',
projects: 'projects', templates: 'templates', users: 'users',
groups: 'groups', rbac: 'rbac', 'audit-logs': 'rbac', mcp: 'servers',
};
const resource = resourceMap[segment];
if (resource === undefined) return { kind: 'skip' };
let action: RbacAction;
switch (method) {
case 'GET': case 'HEAD': action = 'view'; break;
case 'POST': action = 'create'; break;
case 'DELETE': action = 'delete'; break;
default: action = 'edit'; break;
}
const nameMatch = url.match(/^\/api\/v1\/[a-z-]+\/([^/?]+)/);
const resourceName = nameMatch?.[1];
const check: PermissionCheck = { kind: 'resource', resource, action };
if (resourceName !== undefined) (check as { resourceName: string }).resourceName = resourceName;
return check;
}
let app: FastifyInstance;
afterEach(async () => {
if (app) await app.close();
});
async function createTestApp() {
const serverRepo = mockServerRepo();
const rbacRepo = mockRbacRepo();
const prisma = mockPrisma();
const rbacService = new RbacService(rbacRepo, prisma);
const CUID_RE = /^c[^\s-]{8,}$/i;
const nameResolvers: Record<string, { findById(id: string): Promise<{ name: string } | null> }> = {
servers: serverRepo,
};
app = Fastify({ logger: false });
app.setErrorHandler(errorHandler);
// Auth hook (mock)
app.addHook('preHandler', async (request, reply) => {
const url = request.url;
if (url.startsWith('/api/v1/auth/') || url === '/healthz') return;
if (!url.startsWith('/api/v1/')) return;
const header = request.headers.authorization;
if (!header?.startsWith('Bearer ')) {
reply.code(401).send({ error: 'Unauthorized' });
return;
}
const token = header.slice(7);
const session = SESSIONS[token];
if (!session) {
reply.code(401).send({ error: 'Invalid token' });
return;
}
request.userId = session.userId;
});
// RBAC hook (replicates main.ts)
app.addHook('preHandler', async (request, reply) => {
if (reply.sent) return;
const url = request.url;
if (url.startsWith('/api/v1/auth/') || url === '/healthz') return;
if (!url.startsWith('/api/v1/')) return;
if (request.userId === undefined) return;
const check = mapUrlToPermission(request.method, url);
if (check.kind === 'skip') return;
let allowed: boolean;
if (check.kind === 'operation') {
allowed = await rbacService.canRunOperation(request.userId, check.operation);
} else {
// CUID→name resolution
if (check.resourceName !== undefined && CUID_RE.test(check.resourceName)) {
const resolver = nameResolvers[check.resource];
if (resolver) {
const entity = await resolver.findById(check.resourceName);
if (entity) check.resourceName = entity.name;
}
}
allowed = await rbacService.canAccess(request.userId, check.action, check.resource, check.resourceName);
// Compute scope for list filtering
if (allowed && check.resourceName === undefined) {
request.rbacScope = await rbacService.getAllowedScope(request.userId, check.action, check.resource);
}
}
if (!allowed) {
reply.code(403).send({ error: 'Forbidden' });
}
});
// Routes
const serverService = new McpServerService(serverRepo);
const instanceService = new InstanceService(stubInstanceRepo(), serverRepo, stubOrchestrator());
serverService.setInstanceService(instanceService);
registerMcpServerRoutes(app, serverService, instanceService);
// preSerialization hook (list filtering)
app.addHook('preSerialization', async (request, _reply, payload) => {
if (!request.rbacScope || request.rbacScope.wildcard) return payload;
if (!Array.isArray(payload)) return payload;
return (payload as Array<Record<string, unknown>>).filter((item) => {
const name = item['name'];
return typeof name === 'string' && request.rbacScope!.names.has(name);
});
});
await app.ready();
return app;
}
// ── Tests ──
describe('RBAC name-scoped integration (reproduces mcpctl bugs)', () => {
beforeEach(async () => {
await createTestApp();
});
describe('Bug 1: mcpctl get servers (list filtering)', () => {
it('name-scoped user sees ONLY their permitted server', async () => {
const res = await app.inject({
method: 'GET',
url: '/api/v1/servers',
headers: { authorization: 'Bearer scoped-token' },
});
expect(res.statusCode).toBe(200);
const servers = res.json<Array<{ name: string }>>();
expect(servers).toHaveLength(1);
expect(servers[0]!.name).toBe('my-home-assistant');
});
it('wildcard user sees ALL servers', async () => {
const res = await app.inject({
method: 'GET',
url: '/api/v1/servers',
headers: { authorization: 'Bearer admin-token' },
});
expect(res.statusCode).toBe(200);
const servers = res.json<Array<{ name: string }>>();
expect(servers).toHaveLength(3);
});
it('user with multiple name-scoped bindings sees only those servers', async () => {
const res = await app.inject({
method: 'GET',
url: '/api/v1/servers',
headers: { authorization: 'Bearer multi-scoped-token' },
});
expect(res.statusCode).toBe(200);
const servers = res.json<Array<{ name: string }>>();
expect(servers).toHaveLength(2);
const names = servers.map((s) => s.name);
expect(names).toContain('my-home-assistant');
expect(names).toContain('slack-server');
expect(names).not.toContain('github-server');
});
it('user with no server permissions gets 403', async () => {
const res = await app.inject({
method: 'GET',
url: '/api/v1/servers',
headers: { authorization: 'Bearer secrets-only-token' },
});
expect(res.statusCode).toBe(403);
});
});
describe('Bug 2: mcpctl get server NAME (CUID resolution)', () => {
it('allows access when URL contains CUID matching a name-scoped binding', async () => {
// CLI resolves my-home-assistant → clxyz000000001
const res = await app.inject({
method: 'GET',
url: '/api/v1/servers/clxyz000000001',
headers: { authorization: 'Bearer scoped-token' },
});
expect(res.statusCode).toBe(200);
expect(res.json<{ name: string }>().name).toBe('my-home-assistant');
});
it('denies access when CUID resolves to server NOT in binding', async () => {
const res = await app.inject({
method: 'GET',
url: '/api/v1/servers/clxyz000000002',
headers: { authorization: 'Bearer scoped-token' },
});
expect(res.statusCode).toBe(403);
});
it('passes RBAC when URL has human-readable name (route 404 is expected)', async () => {
// Human name in URL: RBAC passes (matches binding directly),
// but the route only does findById, so it 404s.
// CLI always resolves name→CUID first, so this doesn't happen in practice.
// The important thing: it does NOT return 403.
const res = await app.inject({
method: 'GET',
url: '/api/v1/servers/my-home-assistant',
headers: { authorization: 'Bearer scoped-token' },
});
expect(res.statusCode).toBe(404); // Not 403!
});
it('handles nonexistent CUID gracefully (403)', async () => {
const res = await app.inject({
method: 'GET',
url: '/api/v1/servers/cnonexistent12345678',
headers: { authorization: 'Bearer scoped-token' },
});
expect(res.statusCode).toBe(403);
});
it('wildcard user can access any server by CUID', async () => {
const res = await app.inject({
method: 'GET',
url: '/api/v1/servers/clxyz000000002',
headers: { authorization: 'Bearer admin-token' },
});
expect(res.statusCode).toBe(200);
expect(res.json<{ name: string }>().name).toBe('slack-server');
});
});
describe('name-scoped write operations', () => {
it('name-scoped edit user can DELETE their named server by CUID', async () => {
const res = await app.inject({
method: 'DELETE',
url: '/api/v1/servers/clxyz000000001',
headers: { authorization: 'Bearer edit-scoped-token' },
});
expect(res.statusCode).toBe(204);
});
it('name-scoped edit user CANNOT delete other servers', async () => {
const res = await app.inject({
method: 'DELETE',
url: '/api/v1/servers/clxyz000000002',
headers: { authorization: 'Bearer edit-scoped-token' },
});
expect(res.statusCode).toBe(403);
});
it('name-scoped view user CANNOT delete their named server', async () => {
const res = await app.inject({
method: 'DELETE',
url: '/api/v1/servers/clxyz000000001',
headers: { authorization: 'Bearer scoped-token' },
});
expect(res.statusCode).toBe(403);
});
});
describe('preSerialization edge cases', () => {
it('single-object responses pass through unmodified', async () => {
const res = await app.inject({
method: 'GET',
url: '/api/v1/servers/clxyz000000001',
headers: { authorization: 'Bearer scoped-token' },
});
expect(res.statusCode).toBe(200);
expect(res.json<{ name: string }>().name).toBe('my-home-assistant');
});
it('unauthenticated requests get 401', async () => {
const res = await app.inject({
method: 'GET',
url: '/api/v1/servers',
});
expect(res.statusCode).toBe(401);
});
});
});

src/mcpd/tests/rbac.test.ts (new file, 1012 lines): diff suppressed because it is too large


@@ -0,0 +1,302 @@
import { describe, it, expect, vi, beforeEach } from 'vitest';
import { PromptService } from '../../src/services/prompt.service.js';
import type { IPromptRepository } from '../../src/repositories/prompt.repository.js';
import type { IPromptRequestRepository } from '../../src/repositories/prompt-request.repository.js';
import type { IProjectRepository } from '../../src/repositories/project.repository.js';
import type { Prompt, PromptRequest, Project } from '@prisma/client';
function makePrompt(overrides: Partial<Prompt> = {}): Prompt {
return {
id: 'prompt-1',
name: 'test-prompt',
content: 'Hello world',
projectId: null,
version: 1,
createdAt: new Date(),
updatedAt: new Date(),
...overrides,
};
}
function makePromptRequest(overrides: Partial<PromptRequest> = {}): PromptRequest {
return {
id: 'req-1',
name: 'test-request',
content: 'Proposed content',
projectId: null,
createdBySession: 'session-abc',
createdByUserId: null,
createdAt: new Date(),
...overrides,
};
}
function makeProject(overrides: Partial<Project> = {}): Project {
return {
id: 'proj-1',
name: 'test-project',
description: '',
prompt: '',
proxyMode: 'direct',
llmProvider: null,
llmModel: null,
ownerId: 'user-1',
createdAt: new Date(),
updatedAt: new Date(),
...overrides,
} as Project;
}
function mockPromptRepo(): IPromptRepository {
return {
findAll: vi.fn(async () => []),
findById: vi.fn(async () => null),
findByNameAndProject: vi.fn(async () => null),
create: vi.fn(async (data) => makePrompt(data)),
update: vi.fn(async (id, data) => makePrompt({ id, ...data })),
delete: vi.fn(async () => {}),
};
}
function mockPromptRequestRepo(): IPromptRequestRepository {
return {
findAll: vi.fn(async () => []),
findById: vi.fn(async () => null),
findByNameAndProject: vi.fn(async () => null),
findBySession: vi.fn(async () => []),
create: vi.fn(async (data) => makePromptRequest(data)),
delete: vi.fn(async () => {}),
};
}
function mockProjectRepo(): IProjectRepository {
return {
findAll: vi.fn(async () => []),
findById: vi.fn(async () => null),
findByName: vi.fn(async () => null),
create: vi.fn(async (data) => makeProject(data)),
update: vi.fn(async (id, data) => makeProject({ id, ...data })),
delete: vi.fn(async () => {}),
};
}
describe('PromptService', () => {
let promptRepo: IPromptRepository;
let promptRequestRepo: IPromptRequestRepository;
let projectRepo: IProjectRepository;
let service: PromptService;
beforeEach(() => {
promptRepo = mockPromptRepo();
promptRequestRepo = mockPromptRequestRepo();
projectRepo = mockProjectRepo();
service = new PromptService(promptRepo, promptRequestRepo, projectRepo);
});
// ── Prompt CRUD ──
describe('listPrompts', () => {
it('should return all prompts', async () => {
const prompts = [makePrompt(), makePrompt({ id: 'prompt-2', name: 'other' })];
vi.mocked(promptRepo.findAll).mockResolvedValue(prompts);
const result = await service.listPrompts();
expect(result).toEqual(prompts);
expect(promptRepo.findAll).toHaveBeenCalledWith(undefined);
});
it('should filter by projectId', async () => {
await service.listPrompts('proj-1');
expect(promptRepo.findAll).toHaveBeenCalledWith('proj-1');
});
});
describe('getPrompt', () => {
it('should return a prompt by id', async () => {
const prompt = makePrompt();
vi.mocked(promptRepo.findById).mockResolvedValue(prompt);
const result = await service.getPrompt('prompt-1');
expect(result).toEqual(prompt);
});
it('should throw NotFoundError for missing prompt', async () => {
await expect(service.getPrompt('nope')).rejects.toThrow('Prompt not found: nope');
});
});
describe('createPrompt', () => {
it('should create a prompt', async () => {
const result = await service.createPrompt({ name: 'new-prompt', content: 'stuff' });
expect(promptRepo.create).toHaveBeenCalledWith({ name: 'new-prompt', content: 'stuff' });
expect(result.name).toBe('new-prompt');
});
it('should validate project exists when projectId given', async () => {
vi.mocked(projectRepo.findById).mockResolvedValue(makeProject());
await service.createPrompt({ name: 'scoped', content: 'x', projectId: 'proj-1' });
expect(projectRepo.findById).toHaveBeenCalledWith('proj-1');
});
it('should throw when project not found', async () => {
await expect(
service.createPrompt({ name: 'bad', content: 'x', projectId: 'nope' }),
).rejects.toThrow('Project not found: nope');
});
it('should reject invalid name format', async () => {
await expect(
service.createPrompt({ name: 'INVALID_NAME', content: 'x' }),
).rejects.toThrow();
});
});
describe('updatePrompt', () => {
it('should update prompt content', async () => {
vi.mocked(promptRepo.findById).mockResolvedValue(makePrompt());
await service.updatePrompt('prompt-1', { content: 'updated' });
expect(promptRepo.update).toHaveBeenCalledWith('prompt-1', { content: 'updated' });
});
it('should throw for missing prompt', async () => {
await expect(service.updatePrompt('nope', { content: 'x' })).rejects.toThrow('Prompt not found');
});
});
describe('deletePrompt', () => {
it('should delete an existing prompt', async () => {
vi.mocked(promptRepo.findById).mockResolvedValue(makePrompt());
await service.deletePrompt('prompt-1');
expect(promptRepo.delete).toHaveBeenCalledWith('prompt-1');
});
it('should throw for missing prompt', async () => {
await expect(service.deletePrompt('nope')).rejects.toThrow('Prompt not found');
});
});
// ── PromptRequest CRUD ──
describe('listPromptRequests', () => {
it('should return all prompt requests', async () => {
const reqs = [makePromptRequest()];
vi.mocked(promptRequestRepo.findAll).mockResolvedValue(reqs);
const result = await service.listPromptRequests();
expect(result).toEqual(reqs);
});
});
describe('getPromptRequest', () => {
it('should return a prompt request by id', async () => {
const req = makePromptRequest();
vi.mocked(promptRequestRepo.findById).mockResolvedValue(req);
const result = await service.getPromptRequest('req-1');
expect(result).toEqual(req);
});
it('should throw for missing request', async () => {
await expect(service.getPromptRequest('nope')).rejects.toThrow('PromptRequest not found');
});
});
describe('deletePromptRequest', () => {
it('should delete an existing request', async () => {
vi.mocked(promptRequestRepo.findById).mockResolvedValue(makePromptRequest());
await service.deletePromptRequest('req-1');
expect(promptRequestRepo.delete).toHaveBeenCalledWith('req-1');
});
});
// ── Propose ──
describe('propose', () => {
it('should create a prompt request', async () => {
const result = await service.propose({
name: 'my-prompt',
content: 'proposal',
createdBySession: 'sess-1',
});
expect(promptRequestRepo.create).toHaveBeenCalledWith(
expect.objectContaining({ name: 'my-prompt', content: 'proposal', createdBySession: 'sess-1' }),
);
expect(result.name).toBe('my-prompt');
});
it('should validate project exists when projectId given', async () => {
vi.mocked(projectRepo.findById).mockResolvedValue(makeProject());
await service.propose({
name: 'scoped',
content: 'x',
projectId: 'proj-1',
});
expect(projectRepo.findById).toHaveBeenCalledWith('proj-1');
});
});
// ── Approve ──
describe('approve', () => {
it('should delete request and create prompt (atomic)', async () => {
const req = makePromptRequest({ id: 'req-1', name: 'approved', content: 'good stuff', projectId: 'proj-1' });
vi.mocked(promptRequestRepo.findById).mockResolvedValue(req);
const result = await service.approve('req-1');
expect(promptRepo.create).toHaveBeenCalledWith(
expect.objectContaining({ name: 'approved', content: 'good stuff', projectId: 'proj-1' }),
);
expect(promptRequestRepo.delete).toHaveBeenCalledWith('req-1');
expect(result.name).toBe('approved');
});
it('should throw for missing request', async () => {
await expect(service.approve('nope')).rejects.toThrow('PromptRequest not found');
});
it('should handle global prompt (no projectId)', async () => {
const req = makePromptRequest({ id: 'req-2', name: 'global', content: 'stuff', projectId: null });
vi.mocked(promptRequestRepo.findById).mockResolvedValue(req);
await service.approve('req-2');
// Should NOT include projectId in the create call
const createArg = vi.mocked(promptRepo.create).mock.calls[0]![0];
expect(createArg).not.toHaveProperty('projectId');
});
});
// ── Visibility ──
describe('getVisiblePrompts', () => {
it('should return approved prompts and session requests', async () => {
vi.mocked(promptRepo.findAll).mockResolvedValue([
makePrompt({ name: 'approved-1', content: 'A' }),
]);
vi.mocked(promptRequestRepo.findBySession).mockResolvedValue([
makePromptRequest({ name: 'pending-1', content: 'B' }),
]);
const result = await service.getVisiblePrompts('proj-1', 'sess-1');
expect(result).toHaveLength(2);
expect(result[0]).toEqual({ name: 'approved-1', content: 'A', type: 'prompt' });
expect(result[1]).toEqual({ name: 'pending-1', content: 'B', type: 'promptrequest' });
});
it('should not include pending requests without sessionId', async () => {
vi.mocked(promptRepo.findAll).mockResolvedValue([makePrompt()]);
const result = await service.getVisiblePrompts('proj-1');
expect(result).toHaveLength(1);
expect(promptRequestRepo.findBySession).not.toHaveBeenCalled();
});
it('should return empty when no prompts or requests', async () => {
const result = await service.getVisiblePrompts();
expect(result).toEqual([]);
});
});
});


@@ -0,0 +1,208 @@
import { describe, it, expect, vi, beforeEach } from 'vitest';
import { UserService } from '../src/services/user.service.js';
import { NotFoundError, ConflictError } from '../src/services/mcp-server.service.js';
import type { IUserRepository, SafeUser } from '../src/repositories/user.repository.js';
function makeSafeUser(overrides: Partial<SafeUser> = {}): SafeUser {
return {
id: 'user-1',
email: 'alice@example.com',
name: 'Alice',
role: 'USER',
provider: null,
externalId: null,
version: 1,
createdAt: new Date(),
updatedAt: new Date(),
...overrides,
};
}
function mockUserRepo(): IUserRepository {
return {
findAll: vi.fn(async () => []),
findById: vi.fn(async () => null),
findByEmail: vi.fn(async () => null),
create: vi.fn(async (data) =>
makeSafeUser({ email: data.email, name: data.name ?? null }),
),
delete: vi.fn(async () => {}),
count: vi.fn(async () => 0),
};
}
describe('UserService', () => {
let repo: ReturnType<typeof mockUserRepo>;
let service: UserService;
beforeEach(() => {
repo = mockUserRepo();
service = new UserService(repo);
});
// ── list ──────────────────────────────────────────────────
describe('list', () => {
it('returns empty array when no users', async () => {
const result = await service.list();
expect(result).toEqual([]);
expect(repo.findAll).toHaveBeenCalledOnce();
});
it('returns all users', async () => {
const users = [
makeSafeUser({ id: 'u1', email: 'a@b.com' }),
makeSafeUser({ id: 'u2', email: 'c@d.com' }),
];
vi.mocked(repo.findAll).mockResolvedValue(users);
const result = await service.list();
expect(result).toHaveLength(2);
expect(result[0]!.email).toBe('a@b.com');
});
});
// ── create ────────────────────────────────────────────────
describe('create', () => {
it('creates a user and hashes password', async () => {
const result = await service.create({
email: 'alice@example.com',
password: 'securePass123',
});
expect(result.email).toBe('alice@example.com');
expect(repo.create).toHaveBeenCalledOnce();
// Verify the passwordHash was generated (not the plain password)
const createCall = vi.mocked(repo.create).mock.calls[0]![0]!;
expect(createCall.passwordHash).toBeDefined();
expect(createCall.passwordHash).not.toBe('securePass123');
expect(createCall.passwordHash.startsWith('$2b$')).toBe(true);
});
it('creates a user with optional name', async () => {
await service.create({
email: 'bob@example.com',
password: 'securePass123',
name: 'Bob',
});
const createCall = vi.mocked(repo.create).mock.calls[0]![0]!;
expect(createCall.email).toBe('bob@example.com');
expect(createCall.name).toBe('Bob');
});
it('returns user without passwordHash', async () => {
const result = await service.create({
email: 'alice@example.com',
password: 'securePass123',
});
// SafeUser type should not have passwordHash
expect(result).not.toHaveProperty('passwordHash');
});
it('throws ConflictError when email already exists', async () => {
vi.mocked(repo.findByEmail).mockResolvedValue(makeSafeUser());
await expect(
service.create({ email: 'alice@example.com', password: 'securePass123' }),
).rejects.toThrow(ConflictError);
});
it('throws ZodError for invalid email', async () => {
await expect(
service.create({ email: 'not-an-email', password: 'securePass123' }),
).rejects.toThrow();
});
it('throws ZodError for short password', async () => {
await expect(
service.create({ email: 'a@b.com', password: 'short' }),
).rejects.toThrow();
});
it('throws ZodError for missing email', async () => {
await expect(
service.create({ password: 'securePass123' }),
).rejects.toThrow();
});
it('throws ZodError for password exceeding max length', async () => {
await expect(
service.create({ email: 'a@b.com', password: 'x'.repeat(129) }),
).rejects.toThrow();
});
});
// ── getById ───────────────────────────────────────────────
describe('getById', () => {
it('returns user when found', async () => {
const user = makeSafeUser();
vi.mocked(repo.findById).mockResolvedValue(user);
const result = await service.getById('user-1');
expect(result.email).toBe('alice@example.com');
expect(repo.findById).toHaveBeenCalledWith('user-1');
});
it('throws NotFoundError when not found', async () => {
await expect(service.getById('missing')).rejects.toThrow(NotFoundError);
});
});
// ── getByEmail ────────────────────────────────────────────
describe('getByEmail', () => {
it('returns user when found', async () => {
const user = makeSafeUser();
vi.mocked(repo.findByEmail).mockResolvedValue(user);
const result = await service.getByEmail('alice@example.com');
expect(result.email).toBe('alice@example.com');
expect(repo.findByEmail).toHaveBeenCalledWith('alice@example.com');
});
it('throws NotFoundError when not found', async () => {
await expect(service.getByEmail('nobody@example.com')).rejects.toThrow(NotFoundError);
});
});
// ── delete ────────────────────────────────────────────────
describe('delete', () => {
it('deletes user by id', async () => {
vi.mocked(repo.findById).mockResolvedValue(makeSafeUser());
await service.delete('user-1');
expect(repo.delete).toHaveBeenCalledWith('user-1');
});
it('throws NotFoundError when user does not exist', async () => {
await expect(service.delete('missing')).rejects.toThrow(NotFoundError);
});
});
// ── count ─────────────────────────────────────────────────
describe('count', () => {
it('returns 0 when no users', async () => {
const result = await service.count();
expect(result).toBe(0);
});
it('returns 1 when one user exists', async () => {
vi.mocked(repo.count).mockResolvedValue(1);
const result = await service.count();
expect(result).toBe(1);
});
it('returns correct count for multiple users', async () => {
vi.mocked(repo.count).mockResolvedValue(5);
const result = await service.count();
expect(result).toBe(5);
});
});
});

View File

@@ -5,6 +5,7 @@ import { McpdUpstream } from './upstream/mcpd.js';
interface McpdServer {
id: string;
name: string;
description?: string;
transport: string;
status?: string;
}
@@ -15,6 +16,40 @@ interface McpdServer {
*/
export async function refreshUpstreams(router: McpRouter, mcpdClient: McpdClient): Promise<string[]> {
const servers = await mcpdClient.get<McpdServer[]>('/api/v1/servers');
return syncUpstreams(router, mcpdClient, servers);
}
/**
* Discovers MCP servers scoped to a project and registers them as upstreams.
* Uses the project-servers endpoint that returns only servers linked to the project.
*
* @param authToken - Optional bearer token forwarded to mcpd for RBAC checks.
*/
export async function refreshProjectUpstreams(
router: McpRouter,
mcpdClient: McpdClient,
projectName: string,
authToken?: string,
): Promise<string[]> {
const path = `/api/v1/projects/${encodeURIComponent(projectName)}/servers`;
let servers: McpdServer[];
if (authToken) {
// Forward the client's auth token to mcpd so RBAC applies
const result = await mcpdClient.forward('GET', path, '', undefined, authToken);
if (result.status >= 400) {
throw new Error(`Failed to fetch project servers: ${result.status}`);
}
servers = result.body as McpdServer[];
} else {
servers = await mcpdClient.get<McpdServer[]>(path);
}
return syncUpstreams(router, mcpdClient, servers);
}
/** Shared sync logic: reconcile a router's upstreams with a server list. */
function syncUpstreams(router: McpRouter, mcpdClient: McpdClient, servers: McpdServer[]): string[] {
const registered: string[] = [];
// Remove stale upstreams
@@ -29,7 +64,7 @@ export async function refreshUpstreams(router: McpRouter, mcpdClient: McpdClient
// Add/update upstreams for each server
for (const server of servers) {
if (!currentNames.has(server.name)) {
const upstream = new McpdUpstream(server.id, server.name, mcpdClient, server.description);
router.addUpstream(upstream);
}
registered.push(server.name);
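The shared `syncUpstreams` logic above boils down to a set reconcile: drop upstreams whose names are missing from the fresh server list, add ones not yet registered. A standalone sketch of that step (names illustrative; the real function also constructs `McpdUpstream` instances):

```typescript
// Minimal reconcile sketch: given the currently registered upstream names
// and a fresh server list, compute which names to remove and which to add.
function reconcile(current: Set<string>, fresh: string[]): { remove: string[]; add: string[] } {
  const freshSet = new Set(fresh);
  return {
    remove: [...current].filter((name) => !freshSet.has(name)),
    add: fresh.filter((name) => !current.has(name)),
  };
}

const plan = reconcile(new Set(['a', 'stale']), ['a', 'b']);
console.log(plan); // { remove: ['stale'], add: ['b'] }
```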

View File

@@ -1,3 +1,7 @@
import { existsSync, readFileSync } from 'node:fs';
import { join } from 'node:path';
import { homedir } from 'node:os';
/** Configuration for the mcplocal HTTP server. */
export interface HttpConfig {
/** Port for the HTTP server (default: 3200) */
@@ -15,9 +19,48 @@ export interface HttpConfig {
const DEFAULT_HTTP_PORT = 3200;
const DEFAULT_HTTP_HOST = '127.0.0.1';
const DEFAULT_MCPD_URL = 'http://localhost:3100';
const DEFAULT_MCPD_TOKEN = '';
const DEFAULT_LOG_LEVEL = 'info';
/**
* Read the user's mcpctl credentials from ~/.mcpctl/credentials.
* Returns the token if found, empty string otherwise.
*/
function loadUserToken(): string {
try {
const credPath = join(homedir(), '.mcpctl', 'credentials');
if (!existsSync(credPath)) return '';
const raw = readFileSync(credPath, 'utf-8');
const parsed = JSON.parse(raw) as { token?: string };
return parsed.token ?? '';
} catch {
return '';
}
}
export interface LlmFileConfig {
provider: string;
model?: string;
url?: string;
binaryPath?: string;
}
/**
* Load LLM configuration from ~/.mcpctl/config.json.
* Returns undefined if no LLM section is configured.
*/
export function loadLlmConfig(): LlmFileConfig | undefined {
try {
const configPath = join(homedir(), '.mcpctl', 'config.json');
if (!existsSync(configPath)) return undefined;
const raw = readFileSync(configPath, 'utf-8');
const parsed = JSON.parse(raw) as { llm?: LlmFileConfig };
if (!parsed.llm?.provider || parsed.llm.provider === 'none') return undefined;
return parsed.llm;
} catch {
return undefined;
}
}
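For reference, a `~/.mcpctl/config.json` that `loadLlmConfig` would accept might look like the following (values illustrative; a missing `llm` section or a provider of `'none'` yields `undefined`):

```json
{
  "llm": {
    "provider": "ollama",
    "model": "llama3.1",
    "url": "http://localhost:11434"
  }
}
```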
export function loadHttpConfig(env: Record<string, string | undefined> = process.env): HttpConfig {
const portStr = env['MCPLOCAL_HTTP_PORT'];
const port = portStr !== undefined ? parseInt(portStr, 10) : DEFAULT_HTTP_PORT;
@@ -26,7 +69,7 @@ export function loadHttpConfig(env: Record<string, string | undefined> = process
httpPort: Number.isFinite(port) ? port : DEFAULT_HTTP_PORT,
httpHost: env['MCPLOCAL_HTTP_HOST'] ?? DEFAULT_HTTP_HOST,
mcpdUrl: env['MCPLOCAL_MCPD_URL'] ?? DEFAULT_MCPD_URL,
mcpdToken: env['MCPLOCAL_MCPD_TOKEN'] ?? loadUserToken(),
logLevel: (env['MCPLOCAL_LOG_LEVEL'] as HttpConfig['logLevel'] | undefined) ?? DEFAULT_LOG_LEVEL,
};
}

View File

@@ -5,3 +5,4 @@ export type { HttpConfig } from './config.js';
export { McpdClient, AuthenticationError, ConnectionError } from './mcpd-client.js';
export { registerProxyRoutes } from './routes/proxy.js';
export { registerMcpEndpoint } from './mcp-endpoint.js';
export { registerProjectMcpEndpoint } from './project-mcp-endpoint.js';

View File

@@ -23,11 +23,21 @@ export class ConnectionError extends Error {
export class McpdClient {
private readonly baseUrl: string;
private readonly token: string;
private readonly extraHeaders: Record<string, string>;
constructor(baseUrl: string, token: string, extraHeaders?: Record<string, string>) {
// Strip trailing slash for consistent URL joining
this.baseUrl = baseUrl.replace(/\/+$/, '');
this.token = token;
this.extraHeaders = extraHeaders ?? {};
}
/**
* Create a new client with additional default headers.
* Inherits base URL and token from the current client.
*/
withHeaders(headers: Record<string, string>): McpdClient {
return new McpdClient(this.baseUrl, this.token, { ...this.extraHeaders, ...headers });
}
async get<T>(path: string): Promise<T> {
@@ -49,16 +59,21 @@ export class McpdClient {
/**
* Forward a raw request to mcpd. Returns the status code and body
* so the proxy route can relay them directly.
*
* @param authOverride - If provided, used as the Bearer token instead of the
* service token. This allows forwarding end-user tokens for RBAC enforcement.
*/
async forward(
method: string,
path: string,
query: string,
body: unknown | undefined,
authOverride?: string,
): Promise<{ status: number; body: unknown }> {
const url = `${this.baseUrl}${path}${query ? `?${query}` : ''}`;
const headers: Record<string, string> = {
...this.extraHeaders,
'Authorization': `Bearer ${authOverride ?? this.token}`,
'Accept': 'application/json',
};
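The header assembly in `forward` spreads `extraHeaders` first, so the `Authorization` entry always wins: a header injected via `withHeaders` can never clobber the token chosen by `authOverride ?? this.token`. A standalone sketch of that precedence (assumed shape, not the real client):

```typescript
// Sketch of forward()'s header precedence: extra headers are spread first,
// then Authorization and Accept, so auth cannot be overridden by withHeaders().
function buildHeaders(
  extra: Record<string, string>,
  token: string,
  authOverride?: string,
): Record<string, string> {
  return {
    ...extra,
    Authorization: `Bearer ${authOverride ?? token}`,
    Accept: 'application/json',
  };
}

const headers = buildHeaders(
  { 'X-Service-Account': 'project:demo', Authorization: 'Bearer sneaky' },
  'service-token',
  'user-token',
);
console.log(headers.Authorization); // → Bearer user-token
```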

View File

@@ -0,0 +1,163 @@
/**
* Project-scoped Streamable HTTP MCP protocol endpoint.
*
* Exposes per-project MCP endpoints at /projects/:projectName/mcp so
* Claude Code can connect to a specific project's servers only.
*
* Each project gets its own McpRouter instance (cached with TTL).
* Sessions are managed per-project.
*/
import { randomUUID } from 'node:crypto';
import type { FastifyInstance } from 'fastify';
import { StreamableHTTPServerTransport } from '@modelcontextprotocol/sdk/server/streamableHttp.js';
import type { JSONRPCMessage } from '@modelcontextprotocol/sdk/types.js';
import { McpRouter } from '../router.js';
import { ResponsePaginator } from '../llm/pagination.js';
import { refreshProjectUpstreams } from '../discovery.js';
import type { McpdClient } from './mcpd-client.js';
import type { ProviderRegistry } from '../providers/registry.js';
import type { JsonRpcRequest } from '../types.js';
interface ProjectCacheEntry {
router: McpRouter;
lastRefresh: number;
}
interface SessionEntry {
transport: StreamableHTTPServerTransport;
projectName: string;
}
const CACHE_TTL_MS = 60_000; // 60 seconds
export function registerProjectMcpEndpoint(app: FastifyInstance, mcpdClient: McpdClient, providerRegistry?: ProviderRegistry | null): void {
const projectCache = new Map<string, ProjectCacheEntry>();
const sessions = new Map<string, SessionEntry>();
async function getOrCreateRouter(projectName: string, authToken?: string): Promise<McpRouter> {
const existing = projectCache.get(projectName);
const now = Date.now();
if (existing && (now - existing.lastRefresh) < CACHE_TTL_MS) {
return existing.router;
}
// Create new router or refresh existing one
const router = existing?.router ?? new McpRouter();
await refreshProjectUpstreams(router, mcpdClient, projectName, authToken);
// Wire pagination support with LLM provider if configured
router.setPaginator(new ResponsePaginator(providerRegistry ?? null));
// Configure prompt resources with SA-scoped client for RBAC
const saClient = mcpdClient.withHeaders({ 'X-Service-Account': `project:${projectName}` });
router.setPromptConfig(saClient, projectName);
// Fetch project instructions and set on router
try {
const instructions = await mcpdClient.get<{ prompt: string; servers: Array<{ name: string; description: string }> }>(
`/api/v1/projects/${encodeURIComponent(projectName)}/instructions`,
);
const parts: string[] = [];
if (instructions.prompt) {
parts.push(instructions.prompt);
}
if (instructions.servers.length > 0) {
parts.push('Available MCP servers:');
for (const s of instructions.servers) {
parts.push(`- ${s.name}${s.description ? `: ${s.description}` : ''}`);
}
}
if (parts.length > 0) {
router.setInstructions(parts.join('\n'));
}
} catch {
// Instructions are optional — don't fail if endpoint is unavailable
}
projectCache.set(projectName, { router, lastRefresh: now });
return router;
}
// POST /projects/:projectName/mcp — JSON-RPC requests
app.post<{ Params: { projectName: string } }>('/projects/:projectName/mcp', async (request, reply) => {
const { projectName } = request.params;
const sessionId = request.headers['mcp-session-id'] as string | undefined;
const authToken = (request.headers['authorization'] as string | undefined)?.replace(/^Bearer\s+/i, '');
if (sessionId && sessions.has(sessionId)) {
const session = sessions.get(sessionId)!;
await session.transport.handleRequest(request.raw, reply.raw, request.body);
reply.hijack();
return;
}
if (sessionId && !sessions.has(sessionId)) {
reply.code(404).send({ error: 'Session not found' });
return;
}
// New session — get/create project router
let router: McpRouter;
try {
router = await getOrCreateRouter(projectName, authToken);
} catch (err) {
reply.code(502).send({ error: `Failed to load project: ${err instanceof Error ? err.message : String(err)}` });
return;
}
const transport = new StreamableHTTPServerTransport({
sessionIdGenerator: () => randomUUID(),
onsessioninitialized: (id) => {
sessions.set(id, { transport, projectName });
},
});
transport.onmessage = async (message: JSONRPCMessage) => {
if ('method' in message && 'id' in message) {
const ctx = transport.sessionId ? { sessionId: transport.sessionId } : undefined;
const response = await router.route(message as unknown as JsonRpcRequest, ctx);
await transport.send(response as unknown as JSONRPCMessage);
}
};
transport.onclose = () => {
const id = transport.sessionId;
if (id) {
sessions.delete(id);
}
};
await transport.handleRequest(request.raw, reply.raw, request.body);
reply.hijack();
});
// GET /projects/:projectName/mcp — SSE stream
app.get<{ Params: { projectName: string } }>('/projects/:projectName/mcp', async (request, reply) => {
const sessionId = request.headers['mcp-session-id'] as string | undefined;
if (!sessionId || !sessions.has(sessionId)) {
reply.code(400).send({ error: 'Invalid or missing session ID' });
return;
}
const session = sessions.get(sessionId)!;
await session.transport.handleRequest(request.raw, reply.raw);
reply.hijack();
});
// DELETE /projects/:projectName/mcp — Session cleanup
app.delete<{ Params: { projectName: string } }>('/projects/:projectName/mcp', async (request, reply) => {
const sessionId = request.headers['mcp-session-id'] as string | undefined;
if (!sessionId || !sessions.has(sessionId)) {
reply.code(400).send({ error: 'Invalid or missing session ID' });
return;
}
const session = sessions.get(sessionId)!;
await session.transport.handleRequest(request.raw, reply.raw);
sessions.delete(sessionId);
reply.hijack();
});
}
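The per-project router cache above follows a common TTL pattern: reuse an entry while it is fresh, rebuild once the TTL lapses (reusing the existing router object where possible). A minimal sketch with an injectable clock (names illustrative, not the endpoint's actual API):

```typescript
// TTL-cache sketch mirroring getOrCreateRouter: entries are reused within
// the TTL and rebuilt after it expires. The clock is injectable for testing.
interface Entry<T> { value: T; lastRefresh: number }

class TtlCache<T> {
  private entries = new Map<string, Entry<T>>();
  constructor(private ttlMs: number, private now: () => number = Date.now) {}

  getOrCreate(key: string, build: (existing?: T) => T): T {
    const hit = this.entries.get(key);
    const t = this.now();
    if (hit && t - hit.lastRefresh < this.ttlMs) return hit.value;
    const value = build(hit?.value); // a refresh may reuse the stale object
    this.entries.set(key, { value, lastRefresh: t });
    return value;
  }
}

let clock = 0;
let builds = 0;
const cache = new TtlCache<string>(60_000, () => clock);
cache.getOrCreate('proj', () => `router#${++builds}`);
clock = 30_000;
const warm = cache.getOrCreate('proj', () => `router#${++builds}`);
clock = 90_000;
const rebuilt = cache.getOrCreate('proj', () => `router#${++builds}`);
console.log(warm, rebuilt); // router#1 router#2
```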

View File

@@ -16,8 +16,13 @@ export function registerProxyRoutes(app: FastifyInstance, client: McpdClient): v
? (request.body as unknown)
: undefined;
// Forward the user's auth token to mcpd so RBAC applies per-user.
// If no user token is present, mcpd will use its auth hook to reject.
const authHeader = request.headers['authorization'] as string | undefined;
const userToken = authHeader?.startsWith('Bearer ') ? authHeader.slice(7) : undefined;
try {
const result = await client.forward(request.method, path, querystring, body, userToken);
return reply.code(result.status).send(result.body);
} catch (err: unknown) {
if (err instanceof AuthenticationError) {
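The Bearer extraction in this route is intentionally strict: only an `Authorization` header with a `Bearer ` prefix yields a user token; anything else forwards `undefined` so mcpd's auth hook decides. Sketched in isolation:

```typescript
// Isolated sketch of the proxy route's token extraction: strip the
// 'Bearer ' prefix if present, otherwise forward no user token.
function extractBearer(authHeader?: string): string | undefined {
  return authHeader?.startsWith('Bearer ') ? authHeader.slice(7) : undefined;
}

console.log(extractBearer('Bearer abc123')); // → abc123
console.log(extractBearer('Basic xyz'));     // → undefined
```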

View File

@@ -6,14 +6,17 @@ import type { HttpConfig } from './config.js';
import { McpdClient } from './mcpd-client.js';
import { registerProxyRoutes } from './routes/proxy.js';
import { registerMcpEndpoint } from './mcp-endpoint.js';
import { registerProjectMcpEndpoint } from './project-mcp-endpoint.js';
import type { McpRouter } from '../router.js';
import type { HealthMonitor } from '../health.js';
import type { TieredHealthMonitor } from '../health/tiered.js';
import type { ProviderRegistry } from '../providers/registry.js';
export interface HttpServerDeps {
router: McpRouter;
healthMonitor?: HealthMonitor | undefined;
tieredHealthMonitor?: TieredHealthMonitor | undefined;
providerRegistry?: ProviderRegistry | null | undefined;
}
export async function createHttpServer(
@@ -78,6 +81,34 @@ export async function createHttpServer(
reply.code(200).send({ status: 'ok' });
});
// LLM health check — tests the active provider with a tiny prompt
app.get('/llm/health', async (_request, reply) => {
const provider = deps.providerRegistry?.getActive() ?? null;
if (!provider) {
reply.code(200).send({ status: 'not configured' });
return;
}
try {
const result = await provider.complete({
messages: [{ role: 'user', content: 'Respond with exactly: ok' }],
maxTokens: 10,
});
const ok = result.content.trim().toLowerCase().includes('ok');
reply.code(200).send({
status: ok ? 'ok' : 'unexpected response',
provider: provider.name,
response: result.content.trim().slice(0, 100),
});
} catch (err) {
const msg = (err as Error).message ?? String(err);
reply.code(200).send({
status: 'error',
provider: provider.name,
error: msg.slice(0, 200),
});
}
});
// Proxy management routes to mcpd
const mcpdClient = new McpdClient(config.mcpdUrl, config.mcpdToken);
registerProxyRoutes(app, mcpdClient);
@@ -85,5 +116,8 @@ export async function createHttpServer(
// Streamable HTTP MCP protocol endpoint at /mcp
registerMcpEndpoint(app, deps.router);
// Project-scoped MCP endpoint at /projects/:projectName/mcp
registerProjectMcpEndpoint(app, mcpdClient, deps.providerRegistry);
return app;
}

View File

@@ -0,0 +1,97 @@
import type { SecretStore } from '@mcpctl/shared';
import type { LlmFileConfig } from './http/config.js';
import { ProviderRegistry } from './providers/registry.js';
import { GeminiAcpProvider } from './providers/gemini-acp.js';
import { OllamaProvider } from './providers/ollama.js';
import { AnthropicProvider } from './providers/anthropic.js';
import { OpenAiProvider } from './providers/openai.js';
import { DeepSeekProvider } from './providers/deepseek.js';
import type { GeminiAcpConfig } from './providers/gemini-acp.js';
import type { OllamaConfig } from './providers/ollama.js';
import type { AnthropicConfig } from './providers/anthropic.js';
import type { OpenAiConfig } from './providers/openai.js';
import type { DeepSeekConfig } from './providers/deepseek.js';
/**
* Create a ProviderRegistry from user config + secret store.
* Returns an empty registry if config is undefined or provider is 'none'.
*/
export async function createProviderFromConfig(
config: LlmFileConfig | undefined,
secretStore: SecretStore,
): Promise<ProviderRegistry> {
const registry = new ProviderRegistry();
if (!config?.provider || config.provider === 'none') return registry;
switch (config.provider) {
case 'gemini-cli': {
const cfg: GeminiAcpConfig = {};
if (config.binaryPath) cfg.binaryPath = config.binaryPath;
if (config.model) cfg.defaultModel = config.model;
registry.register(new GeminiAcpProvider(cfg));
break;
}
case 'ollama': {
const cfg: OllamaConfig = {};
if (config.url) cfg.baseUrl = config.url;
if (config.model) cfg.defaultModel = config.model;
registry.register(new OllamaProvider(cfg));
break;
}
case 'anthropic': {
const apiKey = await secretStore.get('anthropic-api-key');
if (!apiKey) {
process.stderr.write('Warning: Anthropic API key not found in secret store. Run "mcpctl config setup" to configure.\n');
return registry;
}
const cfg: AnthropicConfig = { apiKey };
if (config.model) cfg.defaultModel = config.model;
registry.register(new AnthropicProvider(cfg));
break;
}
case 'openai': {
const apiKey = await secretStore.get('openai-api-key');
if (!apiKey) {
process.stderr.write('Warning: OpenAI API key not found in secret store. Run "mcpctl config setup" to configure.\n');
return registry;
}
const cfg: OpenAiConfig = { apiKey };
if (config.url) cfg.baseUrl = config.url;
if (config.model) cfg.defaultModel = config.model;
registry.register(new OpenAiProvider(cfg));
break;
}
case 'deepseek': {
const apiKey = await secretStore.get('deepseek-api-key');
if (!apiKey) {
process.stderr.write('Warning: DeepSeek API key not found in secret store. Run "mcpctl config setup" to configure.\n');
return registry;
}
const cfg: DeepSeekConfig = { apiKey };
if (config.url) cfg.baseUrl = config.url;
if (config.model) cfg.defaultModel = config.model;
registry.register(new DeepSeekProvider(cfg));
break;
}
case 'vllm': {
// vLLM uses OpenAI-compatible API
if (!config.url) {
process.stderr.write('Warning: vLLM URL not configured. Run "mcpctl config setup" to configure.\n');
return registry;
}
registry.register(new OpenAiProvider({
apiKey: 'unused',
baseUrl: config.url,
defaultModel: config.model ?? 'default',
}));
break;
}
}
return registry;
}
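The factory above degrades gracefully: an unknown provider, a `'none'` setting, or a missing secret leaves the registry empty rather than throwing, so the server still starts without LLM features. A minimal sketch of that shape (illustrative only; the real `ProviderRegistry` has more surface):

```typescript
// Minimal registry sketch: register providers by name, expose the first
// registered one as active, stay empty on misconfiguration.
interface Provider { name: string }

class MiniRegistry {
  private providers: Provider[] = [];
  register(p: Provider): void { this.providers.push(p); }
  getActive(): Provider | null { return this.providers[0] ?? null; }
}

function fromConfig(provider: string, apiKey?: string): MiniRegistry {
  const registry = new MiniRegistry();
  if (provider === 'none') return registry;
  if (provider === 'anthropic' && !apiKey) return registry; // missing secret: stay empty
  registry.register({ name: provider });
  return registry;
}

console.log(fromConfig('anthropic').getActive()); // → null (no API key configured)
console.log(fromConfig('ollama').getActive()?.name); // → ollama
```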

View File

@@ -6,3 +6,5 @@ export { FilterCache, DEFAULT_FILTER_CACHE_CONFIG } from './filter-cache.js';
export type { FilterCacheConfig } from './filter-cache.js';
export { FilterMetrics } from './metrics.js';
export type { FilterMetricsSnapshot } from './metrics.js';
export { ResponsePaginator, DEFAULT_PAGINATION_CONFIG, PAGINATION_INDEX_SYSTEM_PROMPT } from './pagination.js';
export type { PaginationConfig, PaginationIndex, PageSummary, PaginatedToolResponse } from './pagination.js';

Some files were not shown because too many files have changed in this diff.