Compare commits

...

39 Commits

Author SHA1 Message Date
Michal
767725023e feat: --project flag scopes get servers/instances to project
Some checks failed
CI / lint (pull_request) Has been cancelled
CI / typecheck (pull_request) Has been cancelled
CI / test (pull_request) Has been cancelled
CI / build (pull_request) Has been cancelled
CI / package (pull_request) Has been cancelled
mcpctl --project NAME get servers — shows only servers attached to the project
mcpctl --project NAME get instances — shows only instances of project servers

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 19:03:07 +00:00
2bd1b55fe8 Merge pull request 'feat: add tests.sh runner and project routes tests' (#26) from feat/tests-sh-and-project-routes-tests into main
2026-02-23 18:58:06 +00:00
Michal
0f2a93f2f0 feat: add tests.sh runner and project routes integration tests
- tests.sh: run all tests with `bash tests.sh`, summary with `--short`
- tests.sh --filter mcpd/cli: run specific package
- project-routes.test.ts: 17 new route-level tests covering CRUD,
  attach/detach, and the ownerId filtering bug fix

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 18:57:46 +00:00
ce81d9d616 Merge pull request 'fix: project list uses RBAC filtering instead of ownerId' (#25) from fix/project-list-rbac into main
2026-02-23 18:52:29 +00:00
Michal
c6cc39c6f7 fix: project list should use RBAC filtering, not ownerId
The list endpoint was filtering by ownerId before RBAC could include
projects the user has view access to via name-scoped bindings.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 18:52:13 +00:00
de074d9a90 Merge pull request 'feat: remove ProjectMember, add expose RBAC role, attach/detach-server' (#24) from feat/project-improvements into main
2026-02-23 17:50:24 +00:00
Michal
783cf15179 feat: remove ProjectMember, add expose RBAC role, attach/detach-server commands
- Remove ProjectMember model entirely (RBAC manages project access)
- Add 'expose' RBAC role for /mcp-config endpoint access (edit implies expose)
- Rename CLI flags: --llm-provider → --proxy-mode-llm-provider, --llm-model → --proxy-mode-llm-model
- Add attach-server / detach-server CLI commands (mcpctl --project NAME attach-server SERVER)
- Add POST/DELETE /api/v1/projects/:id/servers endpoints for server attach/detach
- Remove members from backup/restore, apply, get, describe
- Prisma migration to drop ProjectMember table

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 17:50:01 +00:00
5844d6c73f Merge pull request 'fix: RBAC name-scoped access — CUID resolution + list filtering' (#23) from fix/rbac-name-scoped-access into main
2026-02-23 12:27:48 +00:00
Michal
604bd76d60 fix: RBAC name-scoped access — CUID resolution + list filtering
Two bugs fixed:
- GET /api/v1/servers/:cuid now resolves CUID→name before RBAC check,
  so name-scoped bindings match correctly
- List endpoints now filter responses via preSerialization hook using
  getAllowedScope(), so name-scoped users only see their resources

Also adds fulldeploy.sh orchestrator script.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 12:26:37 +00:00
da14bb8c23 Merge pull request 'fix: update shell completions for current CLI commands' (#22) from fix/update-shell-completions into main
2026-02-23 12:00:50 +00:00
Michal
9e9a2f4a54 fix: update shell completions for current CLI commands
Add users, groups, rbac, secrets, templates to resource completions.
Remove stale profiles references. Add login, logout, create, edit,
delete, logs commands. Update config subcommands.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 12:00:31 +00:00
c8cdd7f514 Merge pull request 'fix: migrate legacy admin role at startup' (#21) from fix/migrate-legacy-admin-role into main
2026-02-23 11:31:31 +00:00
Michal
ec1dfe7438 fix: migrate legacy admin role to granular roles at startup
- Add migrateAdminRole() that runs on mcpd boot
- Converts { role: 'admin', resource: X } → edit + run bindings
- Adds operation bindings for wildcard admin (impersonate, logs, etc.)
- Add tests verifying unknown/legacy roles are denied by canAccess

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 11:31:15 +00:00
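A minimal sketch of that startup migration, assuming a binding shape like `{ role, resource }` (names and the operation list here are illustrative):

```typescript
type Binding = { role: string; resource: string; resourceName?: string };

// Runs once on mcpd boot: legacy admin bindings become granular ones.
function migrateAdminRole(bindings: Binding[]): Binding[] {
  const out: Binding[] = [];
  for (const b of bindings) {
    if (b.role !== "admin") {
      out.push(b);
      continue;
    }
    // { role: 'admin', resource: X } -> edit + run bindings on X.
    out.push({ ...b, role: "edit" }, { ...b, role: "run" });
    // Wildcard admin additionally gets the privileged operations.
    if (b.resource === "*") {
      for (const op of ["impersonate", "logs", "backup"]) {
        out.push({ role: "run", resource: op });
      }
    }
  }
  return out;
}

const migrated = migrateAdminRole([{ role: "admin", resource: "servers" }]);
console.log(migrated.map((b) => b.role)); // ["edit", "run"]
```

After migration, any remaining unknown/legacy role simply matches nothing in `canAccess`, which is what the new tests verify.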
50b4112398 Merge pull request 'fix: resolve tsc --build type errors' (#20) from fix/build-type-errors into main
2026-02-23 11:08:08 +00:00
Michal
bb17a892d6 fix: resolve tsc --build type errors (exactOptionalPropertyTypes)
- Fix resourceName assignment in mapUrlToPermission for strictness
- Use RbacRoleBinding type in restore-service instead of loose cast
- Remove stale ProjectMemberInput export from validation index

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 11:07:46 +00:00
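The `resourceName` strictness issue is the standard `exactOptionalPropertyTypes` pitfall: under that flag, assigning `undefined` to an optional property is a type error, so the property has to be added conditionally. A sketch with illustrative names:

```typescript
type Permission = { resource: string; resourceName?: string };

function mapUrlToPermission(resource: string, name: string | undefined): Permission {
  // `{ resource, resourceName: name }` fails to typecheck under
  // exactOptionalPropertyTypes when `name` is undefined; spreading the
  // property in only when it is defined is the usual fix.
  return { resource, ...(name !== undefined ? { resourceName: name } : {}) };
}

const p = mapUrlToPermission("servers", undefined);
console.log("resourceName" in p); // false -- the key is absent, not undefined
```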
a8117091a1 Merge pull request 'feat: granular RBAC with resource/operation bindings, users, groups' (#19) from feat/projects-rbac-users-groups into main
2026-02-23 11:05:51 +00:00
Michal
dcda93d179 feat: granular RBAC with resource/operation bindings, users, groups
- Replace admin role with granular roles: view, create, delete, edit, run
- Two binding types: resource bindings (role+resource+optional name) and
  operation bindings (role:run + action like backup, logs, impersonate)
- Name-scoped resource bindings for per-instance access control
- Remove role from project members (all permissions via RBAC)
- Add users, groups, RBAC CRUD endpoints and CLI commands
- describe user/group shows all RBAC access (direct + inherited)
- create rbac supports --subject, --binding, --operation flags
- Backup/restore handles users, groups, RBAC definitions
- mcplocal project-based MCP endpoint discovery
- Full test coverage for all new functionality

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 11:05:19 +00:00
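The two binding types described above can be sketched as a discriminated union (illustrative shapes, not the actual Prisma schema):

```typescript
type Role = "view" | "create" | "delete" | "edit" | "run";
type ResourceBinding = { kind: "resource"; role: Role; resource: string; name?: string };
// Operation bindings pair role:run with an action like backup, logs, impersonate.
type OperationBinding = { kind: "operation"; role: "run"; action: string };
type Binding = ResourceBinding | OperationBinding;

function canAccess(bindings: Binding[], role: Role, resource: string, name?: string): boolean {
  return bindings.some(
    (b) =>
      b.kind === "resource" &&
      b.role === role &&
      b.resource === resource &&
      // An unnamed binding covers the whole resource type; a name-scoped
      // binding covers exactly one instance.
      (b.name === undefined || b.name === name)
  );
}

const bindings: Binding[] = [
  { kind: "resource", role: "view", resource: "servers", name: "grafana" },
  { kind: "operation", role: "run", action: "backup" },
];
console.log(canAccess(bindings, "view", "servers", "grafana")); // true
console.log(canAccess(bindings, "view", "servers", "github"));  // false
```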
a6b5e24a8d Merge pull request 'fix: add missing passwordHash to DB test user factory' (#18) from fix/db-tests-passwordhash into main
2026-02-23 01:03:11 +00:00
Michal
3a6e58274c fix: add missing passwordHash to DB test user factory
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 01:02:41 +00:00
Michal
c819b65175 fix: SSE health probe uses proper SSE protocol (GET /sse + POST /messages)
SSE-transport MCP servers (like ha-mcp) use a different protocol flow:
GET /sse to establish event stream, read endpoint event, then POST
JSON-RPC messages to the /messages?session_id=... URL. Previously the
probe POSTed to the server root, which returned 404.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 00:55:25 +00:00
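The probe flow above is: GET /sse, read the `endpoint` event to learn the session-specific POST URL, then send JSON-RPC there. A sketch of just the endpoint-event parsing, assuming the usual SSE wire format:

```typescript
// Extract the data of the `endpoint` event from raw SSE text.
function parseEndpointEvent(sse: string): string | undefined {
  let event = "";
  for (const line of sse.split("\n")) {
    if (line.startsWith("event:")) event = line.slice(6).trim();
    else if (line.startsWith("data:") && event === "endpoint")
      return line.slice(5).trim();
  }
  return undefined;
}

const stream = "event: endpoint\ndata: /messages?session_id=abc123\n\n";
// The probe would then POST initialize + tools/call JSON-RPC to this path.
console.log(parseEndpointEvent(stream)); // "/messages?session_id=abc123"
```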
Michal
c3ef5a664f chore: remove accidentally committed logs.sh
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 00:52:31 +00:00
Michal
4c2927a16e fix: HTTP health probes use container IP for internal network communication
mcpd and MCP containers share the mcp-servers Docker network. HTTP probes
must use the container's internal IP + containerPort instead of localhost
+ host-mapped port. Also extracts container IP from Docker inspect.

Updated home-assistant template to use ghcr.io/homeassistant-ai/ha-mcp
Docker image (SSE transport) instead of broken npm package.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 00:52:17 +00:00
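Extracting the internal IP from `docker inspect` output looks roughly like this (the type mirrors Docker's `NetworkSettings` JSON; the network name is an example):

```typescript
type InspectInfo = {
  NetworkSettings: { Networks: Record<string, { IPAddress: string }> };
};

// Returns the container's IP on the given Docker network, if attached.
function containerIp(info: InspectInfo, network: string): string | undefined {
  return info.NetworkSettings.Networks[network]?.IPAddress;
}

const inspect: InspectInfo = {
  NetworkSettings: { Networks: { "mcp-servers": { IPAddress: "172.18.0.5" } } },
};
// Probe http://<container-ip>:<containerPort> instead of
// localhost:<host-mapped port>.
console.log(containerIp(inspect, "mcp-servers")); // "172.18.0.5"
```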
79dd6e723d Merge pull request 'feat: MCP health probe runner with tool-call probes' (#17) from feat/health-probe-runner into main
2026-02-23 00:39:09 +00:00
Michal
cde1c59fd6 feat: MCP health probe runner — periodic tool-call probes for instances
Implements Kubernetes-style liveness probes that call MCP tools defined
in server healthCheck configs. For STDIO servers, uses docker exec to
spawn a disposable MCP client that sends initialize + tool call. For
HTTP/SSE servers, sends JSON-RPC directly.

- HealthProbeRunner service with configurable interval/threshold/timeout
- execInContainer added to orchestrator interface + Docker implementation
- Instance findById now includes server relation (fixes describe showing IDs)
- Events appended to instance (last 50), healthStatus tracked as
  healthy/degraded/unhealthy
- 12 unit tests covering probing, thresholds, intervals, cleanup

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 00:38:48 +00:00
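The healthy/degraded/unhealthy bookkeeping can be sketched as a pure threshold function (field names are illustrative; the real runner also tracks intervals and timeouts):

```typescript
type HealthStatus = "healthy" | "degraded" | "unhealthy";

// Each failed probe increments the counter; a success resets it to zero.
// Crossing the configured threshold flips the instance to unhealthy.
function nextStatus(consecutiveFailures: number, threshold: number): HealthStatus {
  if (consecutiveFailures === 0) return "healthy";
  return consecutiveFailures >= threshold ? "unhealthy" : "degraded";
}

console.log(nextStatus(0, 3)); // "healthy"
console.log(nextStatus(1, 3)); // "degraded"
console.log(nextStatus(3, 3)); // "unhealthy"
```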
daa5860ed2 Merge pull request 'fix: stdin open for STDIO servers + describe instance resolution' (#16) from fix/stdin-describe-instance into main
2026-02-23 00:26:49 +00:00
Michal
ecbf48dd49 fix: keep stdin open for STDIO servers + describe instance resolves server names
STDIO MCP servers read from stdin and exit on EOF. Docker containers close
stdin by default, causing all STDIO servers to crash immediately. Added
OpenStdin: true to container creation.

Describe instance now resolves server names (like logs command), preferring
RUNNING instances. Added 7 new describe tests covering server name resolution,
healthcheck display, events section, and template detail.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 00:26:28 +00:00
d38b5aac60 Merge pull request 'feat: container liveness sync + node-runner slim base' (#15) from feat/container-liveness-sync into main
2026-02-23 00:18:41 +00:00
Michal
d07d4d11dd feat: container liveness sync + node-runner slim base
- Add syncStatus() to InstanceService: detects crashed/stopped containers,
  marks them ERROR with last log line as context
- Reconcile now syncs container status first (detect dead before counting)
- Add 30s periodic sync loop in main.ts
- Switch node-runner from alpine to slim (Debian) for npm compatibility
  (fixes home-assistant-mcp-server binary not found on Alpine)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 00:18:28 +00:00
fa58c1b5ed Merge pull request 'fix: logs resolves server names + replica handling + tests' (#14) from fix/logs-resolve-and-tests into main
2026-02-23 00:12:50 +00:00
Michal
dd1dfc629d fix: logs command resolves server names, proper replica handling
- `mcpctl logs <server-name>` resolves to first RUNNING instance
- `mcpctl logs <server-name> -i <N>` selects specific replica
- Shows "instance N/M" hint when server has multiple replicas
- Added 5 proper tests: server name resolution, RUNNING preference,
  replica selection, out-of-range error, no instances error

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 00:12:39 +00:00
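The resolution rules above can be sketched as one function (illustrative types; `-i` is assumed 1-based as the "instance N/M" hint suggests):

```typescript
type Instance = { id: string; status: "RUNNING" | "STOPPED" | "ERROR" };

// Resolve a server's instance list to a single instance for `logs`:
// honour an explicit replica index, otherwise prefer RUNNING.
function resolveInstance(instances: Instance[], index?: number): Instance {
  if (instances.length === 0) throw new Error("server has no instances");
  if (index !== undefined) {
    const inst = instances[index - 1];
    if (!inst) throw new Error(`instance ${index} out of range (1-${instances.length})`);
    return inst;
  }
  return instances.find((i) => i.status === "RUNNING") ?? instances[0];
}

const replicas: Instance[] = [
  { id: "a", status: "STOPPED" },
  { id: "b", status: "RUNNING" },
];
console.log(resolveInstance(replicas).id);    // "b" -- first RUNNING wins
console.log(resolveInstance(replicas, 1).id); // "a" -- explicit replica
```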
7b3dab142e Merge pull request 'fix: show server name in instances, logs by server name' (#13) from fix/instance-ux into main
2026-02-23 00:07:57 +00:00
Michal
4c127a7dc3 fix: show server name in instances table, allow logs by server name
- Instance list now shows server NAME instead of cryptic server ID
- Include server relation in findAll query (Prisma include)
- Logs command accepts server name, server ID, or instance ID
  (resolves server name → first RUNNING instance)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 00:07:42 +00:00
c1e3e4aed6 Merge pull request 'feat: auto-pull images + registry path for node-runner' (#12) from feat/node-runner-registry-pull into main
2026-02-23 00:03:19 +00:00
Michal
e45c6079c1 feat: pull images before container creation, use registry path for node-runner
- Default node-runner image now uses mysources.co.uk registry path
- Add pullImage() call before createContainer() to auto-pull missing images
- Update stack/docker-compose.yml with MCPD_NODE_RUNNER_IMAGE and
  MCPD_MCP_NETWORK env vars, fix mcp-servers network naming

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 00:03:01 +00:00
e4aef3acf1 Merge pull request 'feat: add node-runner base image for npm-based MCP servers' (#11) from feat/node-runner-base-image into main
2026-02-22 23:41:36 +00:00
Michal
a2cda38850 feat: add node-runner base image for npm-based MCP servers
STDIO servers with packageName (e.g. @leval/mcp-grafana) need a Node.js
container that runs `npx -y <package>`. Previously, packageName was used
as a Docker image reference, causing "invalid reference format" errors.

- Add Dockerfile.node-runner: minimal node:20-alpine with npx entrypoint
- Update instance.service.ts: detect npm-based servers and use node-runner
  image with npx command instead of treating packageName as image name
- Fix NanoCPUs: only set when explicitly provided (kernel CFS not available
  on all hosts)
- Add mcp-servers network with explicit name for container isolation
- Configure MCPD_NODE_RUNNER_IMAGE and MCPD_MCP_NETWORK env vars

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-22 23:41:16 +00:00
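The image/command selection described above amounts to: `packageName` means "run under the node-runner image via npx", not "use packageName as a Docker image reference". A sketch with illustrative names and a placeholder image:

```typescript
type ServerSpec = { image?: string; packageName?: string };

// Stand-in for the MCPD_NODE_RUNNER_IMAGE env var.
const NODE_RUNNER_IMAGE = "node-runner:latest";

function containerSpec(server: ServerSpec): { image: string; cmd: string[] } {
  if (server.packageName) {
    // npm-based server: disposable Node container running the package.
    return { image: NODE_RUNNER_IMAGE, cmd: ["npx", "-y", server.packageName] };
  }
  if (server.image) return { image: server.image, cmd: [] };
  throw new Error("server needs image or packageName");
}

const spec = containerSpec({ packageName: "@leval/mcp-grafana" });
console.log(spec.image);          // "node-runner:latest"
console.log(spec.cmd.join(" ")); // "npx -y @leval/mcp-grafana"
```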
081e90de0f Merge pull request 'fix: error handling and --force flag for create commands' (#10) from fix/create-error-handling into main
2026-02-22 23:06:52 +00:00
Michal
4e3d896ef6 fix: proper error handling and --force flag for create commands
- Add global error handler: clean messages instead of stack traces
- Add --force flag to create server/secret/project: updates on 409 conflict
- Strip null values and template-only fields from --from-template payload
- Add tests: 409 handling, --force update, null-stripping from templates

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-22 23:06:33 +00:00
0823e965bf Merge pull request 'feat: MCP healthcheck probes + new templates' (#9) from feat/healthcheck-probes into main
2026-02-22 22:50:10 +00:00
91 changed files with 10198 additions and 625 deletions

.gitignore

@@ -37,3 +37,4 @@ pgdata/
# Prisma
src/db/prisma/migrations/*.sql.backup
logs.sh


@@ -2,84 +2,65 @@ _mcpctl() {
local cur prev words cword
_init_completion || return
local commands="config status get describe instance instances apply setup claude project projects backup restore help"
local global_opts="-v --version -o --output --daemon-url -h --help"
local resources="servers profiles projects instances"
local commands="status login logout config get describe delete logs create edit apply backup restore help"
local global_opts="-v --version --daemon-url --direct -h --help"
local resources="servers instances secrets templates projects users groups rbac"
case "${words[1]}" in
config)
COMPREPLY=($(compgen -W "view set path reset help" -- "$cur"))
if [[ $cword -eq 2 ]]; then
COMPREPLY=($(compgen -W "view set path reset claude-generate impersonate help" -- "$cur"))
fi
return ;;
status)
COMPREPLY=($(compgen -W "--daemon-url -h --help" -- "$cur"))
COMPREPLY=($(compgen -W "-h --help" -- "$cur"))
return ;;
login)
COMPREPLY=($(compgen -W "--url --email --password -h --help" -- "$cur"))
return ;;
logout)
return ;;
get)
if [[ $cword -eq 2 ]]; then
COMPREPLY=($(compgen -W "$resources" -- "$cur"))
else
COMPREPLY=($(compgen -W "-o --output --daemon-url -h --help" -- "$cur"))
COMPREPLY=($(compgen -W "-o --output -h --help" -- "$cur"))
fi
return ;;
describe)
if [[ $cword -eq 2 ]]; then
COMPREPLY=($(compgen -W "$resources" -- "$cur"))
else
COMPREPLY=($(compgen -W "-o --output --daemon-url -h --help" -- "$cur"))
COMPREPLY=($(compgen -W "-o --output --show-values -h --help" -- "$cur"))
fi
return ;;
instance|instances)
delete)
if [[ $cword -eq 2 ]]; then
COMPREPLY=($(compgen -W "list ls start stop restart remove rm logs inspect help" -- "$cur"))
else
case "${words[2]}" in
COMPREPLY=($(compgen -W "$resources" -- "$cur"))
fi
return ;;
edit)
if [[ $cword -eq 2 ]]; then
COMPREPLY=($(compgen -W "servers projects" -- "$cur"))
fi
return ;;
logs)
COMPREPLY=($(compgen -W "--tail --since -h --help" -- "$cur"))
;;
start)
COMPREPLY=($(compgen -W "--env --image -h --help" -- "$cur"))
;;
list|ls)
COMPREPLY=($(compgen -W "--server-id -o --output -h --help" -- "$cur"))
;;
esac
fi
COMPREPLY=($(compgen -W "--tail --since -f --follow -h --help" -- "$cur"))
return ;;
claude)
if [[ $cword -eq 2 ]]; then
COMPREPLY=($(compgen -W "generate show add remove help" -- "$cur"))
else
case "${words[2]}" in
generate|show|add|remove)
COMPREPLY=($(compgen -W "--path -p -h --help" -- "$cur"))
;;
esac
fi
return ;;
project|projects)
if [[ $cword -eq 2 ]]; then
COMPREPLY=($(compgen -W "list ls create delete rm show profiles set-profiles help" -- "$cur"))
else
case "${words[2]}" in
create)
COMPREPLY=($(compgen -W "--description -d -h --help" -- "$cur"))
;;
list|ls)
COMPREPLY=($(compgen -W "-o --output -h --help" -- "$cur"))
;;
esac
if [[ $cword -eq 2 ]]; then
COMPREPLY=($(compgen -W "server secret project user group rbac help" -- "$cur"))
fi
return ;;
apply)
COMPREPLY=($(compgen -f -- "$cur"))
return ;;
backup)
COMPREPLY=($(compgen -W "-o --output -p --password -r --resources -h --help" -- "$cur"))
COMPREPLY=($(compgen -W "-o --output -p --password -h --help" -- "$cur"))
return ;;
restore)
COMPREPLY=($(compgen -W "-i --input -p --password -c --conflict -h --help" -- "$cur"))
return ;;
setup)
return ;;
help)
COMPREPLY=($(compgen -W "$commands" -- "$cur"))
return ;;


@@ -1,73 +1,73 @@
# mcpctl fish completions
set -l commands config status get describe instance instances apply setup claude project projects backup restore help
set -l commands status login logout config get describe delete logs create edit apply backup restore help
# Disable file completions by default
complete -c mcpctl -f
# Global options
complete -c mcpctl -s v -l version -d 'Show version'
complete -c mcpctl -s o -l output -d 'Output format' -xa 'table json yaml'
complete -c mcpctl -l daemon-url -d 'mcpd daemon URL' -x
complete -c mcpctl -l daemon-url -d 'mcplocal daemon URL' -x
complete -c mcpctl -l direct -d 'Bypass mcplocal, connect directly to mcpd'
complete -c mcpctl -s h -l help -d 'Show help'
# Top-level commands
complete -c mcpctl -n "not __fish_seen_subcommand_from $commands" -a config -d 'Manage configuration'
complete -c mcpctl -n "not __fish_seen_subcommand_from $commands" -a status -d 'Show status and connectivity'
complete -c mcpctl -n "not __fish_seen_subcommand_from $commands" -a login -d 'Authenticate with mcpd'
complete -c mcpctl -n "not __fish_seen_subcommand_from $commands" -a logout -d 'Log out'
complete -c mcpctl -n "not __fish_seen_subcommand_from $commands" -a config -d 'Manage configuration'
complete -c mcpctl -n "not __fish_seen_subcommand_from $commands" -a get -d 'List resources'
complete -c mcpctl -n "not __fish_seen_subcommand_from $commands" -a describe -d 'Show resource details'
complete -c mcpctl -n "not __fish_seen_subcommand_from $commands" -a instance -d 'Manage instances'
complete -c mcpctl -n "not __fish_seen_subcommand_from $commands" -a delete -d 'Delete a resource'
complete -c mcpctl -n "not __fish_seen_subcommand_from $commands" -a logs -d 'Get instance logs'
complete -c mcpctl -n "not __fish_seen_subcommand_from $commands" -a create -d 'Create a resource'
complete -c mcpctl -n "not __fish_seen_subcommand_from $commands" -a edit -d 'Edit a resource'
complete -c mcpctl -n "not __fish_seen_subcommand_from $commands" -a apply -d 'Apply configuration from file'
complete -c mcpctl -n "not __fish_seen_subcommand_from $commands" -a setup -d 'Interactive setup wizard'
complete -c mcpctl -n "not __fish_seen_subcommand_from $commands" -a claude -d 'Manage Claude .mcp.json'
complete -c mcpctl -n "not __fish_seen_subcommand_from $commands" -a project -d 'Manage projects'
complete -c mcpctl -n "not __fish_seen_subcommand_from $commands" -a backup -d 'Backup configuration'
complete -c mcpctl -n "not __fish_seen_subcommand_from $commands" -a restore -d 'Restore from backup'
complete -c mcpctl -n "not __fish_seen_subcommand_from $commands" -a help -d 'Show help'
# get/describe resources
complete -c mcpctl -n "__fish_seen_subcommand_from get describe" -a 'servers profiles projects instances' -d 'Resource type'
# Resource types for get/describe/delete/edit
set -l resources servers instances secrets templates projects users groups rbac
complete -c mcpctl -n "__fish_seen_subcommand_from get describe delete" -a "$resources" -d 'Resource type'
complete -c mcpctl -n "__fish_seen_subcommand_from edit" -a 'servers projects' -d 'Resource type'
# get/describe/delete options
complete -c mcpctl -n "__fish_seen_subcommand_from get" -s o -l output -d 'Output format' -xa 'table json yaml'
complete -c mcpctl -n "__fish_seen_subcommand_from describe" -s o -l output -d 'Output format' -xa 'detail json yaml'
complete -c mcpctl -n "__fish_seen_subcommand_from describe" -l show-values -d 'Show secret values'
# login options
complete -c mcpctl -n "__fish_seen_subcommand_from login" -l url -d 'mcpd URL' -x
complete -c mcpctl -n "__fish_seen_subcommand_from login" -l email -d 'Email address' -x
complete -c mcpctl -n "__fish_seen_subcommand_from login" -l password -d 'Password' -x
# config subcommands
complete -c mcpctl -n "__fish_seen_subcommand_from config; and not __fish_seen_subcommand_from view set path reset" -a view -d 'Show configuration'
complete -c mcpctl -n "__fish_seen_subcommand_from config; and not __fish_seen_subcommand_from view set path reset" -a set -d 'Set a config value'
complete -c mcpctl -n "__fish_seen_subcommand_from config; and not __fish_seen_subcommand_from view set path reset" -a path -d 'Show config file path'
complete -c mcpctl -n "__fish_seen_subcommand_from config; and not __fish_seen_subcommand_from view set path reset" -a reset -d 'Reset to defaults'
set -l config_cmds view set path reset claude-generate impersonate
complete -c mcpctl -n "__fish_seen_subcommand_from config; and not __fish_seen_subcommand_from $config_cmds" -a view -d 'Show configuration'
complete -c mcpctl -n "__fish_seen_subcommand_from config; and not __fish_seen_subcommand_from $config_cmds" -a set -d 'Set a config value'
complete -c mcpctl -n "__fish_seen_subcommand_from config; and not __fish_seen_subcommand_from $config_cmds" -a path -d 'Show config file path'
complete -c mcpctl -n "__fish_seen_subcommand_from config; and not __fish_seen_subcommand_from $config_cmds" -a reset -d 'Reset to defaults'
complete -c mcpctl -n "__fish_seen_subcommand_from config; and not __fish_seen_subcommand_from $config_cmds" -a claude-generate -d 'Generate .mcp.json'
complete -c mcpctl -n "__fish_seen_subcommand_from config; and not __fish_seen_subcommand_from $config_cmds" -a impersonate -d 'Impersonate a user'
# instance subcommands
set -l instance_cmds list ls start stop restart remove rm logs inspect
complete -c mcpctl -n "__fish_seen_subcommand_from instance instances; and not __fish_seen_subcommand_from $instance_cmds" -a list -d 'List instances'
complete -c mcpctl -n "__fish_seen_subcommand_from instance instances; and not __fish_seen_subcommand_from $instance_cmds" -a start -d 'Start instance'
complete -c mcpctl -n "__fish_seen_subcommand_from instance instances; and not __fish_seen_subcommand_from $instance_cmds" -a stop -d 'Stop instance'
complete -c mcpctl -n "__fish_seen_subcommand_from instance instances; and not __fish_seen_subcommand_from $instance_cmds" -a restart -d 'Restart instance'
complete -c mcpctl -n "__fish_seen_subcommand_from instance instances; and not __fish_seen_subcommand_from $instance_cmds" -a remove -d 'Remove instance'
complete -c mcpctl -n "__fish_seen_subcommand_from instance instances; and not __fish_seen_subcommand_from $instance_cmds" -a logs -d 'Get logs'
complete -c mcpctl -n "__fish_seen_subcommand_from instance instances; and not __fish_seen_subcommand_from $instance_cmds" -a inspect -d 'Inspect container'
complete -c mcpctl -n "__fish_seen_subcommand_from instance instances; and __fish_seen_subcommand_from logs" -l tail -d 'Number of lines' -x
complete -c mcpctl -n "__fish_seen_subcommand_from instance instances; and __fish_seen_subcommand_from logs" -l since -d 'Since timestamp' -x
# create subcommands
set -l create_cmds server secret project user group rbac
complete -c mcpctl -n "__fish_seen_subcommand_from create; and not __fish_seen_subcommand_from $create_cmds" -a server -d 'Create a server'
complete -c mcpctl -n "__fish_seen_subcommand_from create; and not __fish_seen_subcommand_from $create_cmds" -a secret -d 'Create a secret'
complete -c mcpctl -n "__fish_seen_subcommand_from create; and not __fish_seen_subcommand_from $create_cmds" -a project -d 'Create a project'
complete -c mcpctl -n "__fish_seen_subcommand_from create; and not __fish_seen_subcommand_from $create_cmds" -a user -d 'Create a user'
complete -c mcpctl -n "__fish_seen_subcommand_from create; and not __fish_seen_subcommand_from $create_cmds" -a group -d 'Create a group'
complete -c mcpctl -n "__fish_seen_subcommand_from create; and not __fish_seen_subcommand_from $create_cmds" -a rbac -d 'Create an RBAC binding'
# claude subcommands
set -l claude_cmds generate show add remove
complete -c mcpctl -n "__fish_seen_subcommand_from claude; and not __fish_seen_subcommand_from $claude_cmds" -a generate -d 'Generate .mcp.json'
complete -c mcpctl -n "__fish_seen_subcommand_from claude; and not __fish_seen_subcommand_from $claude_cmds" -a show -d 'Show .mcp.json'
complete -c mcpctl -n "__fish_seen_subcommand_from claude; and not __fish_seen_subcommand_from $claude_cmds" -a add -d 'Add server entry'
complete -c mcpctl -n "__fish_seen_subcommand_from claude; and not __fish_seen_subcommand_from $claude_cmds" -a remove -d 'Remove server entry'
complete -c mcpctl -n "__fish_seen_subcommand_from claude; and __fish_seen_subcommand_from $claude_cmds" -s p -l path -d 'Path to .mcp.json' -rF
# project subcommands
set -l project_cmds list ls create delete rm show profiles set-profiles
complete -c mcpctl -n "__fish_seen_subcommand_from project projects; and not __fish_seen_subcommand_from $project_cmds" -a list -d 'List projects'
complete -c mcpctl -n "__fish_seen_subcommand_from project projects; and not __fish_seen_subcommand_from $project_cmds" -a create -d 'Create project'
complete -c mcpctl -n "__fish_seen_subcommand_from project projects; and not __fish_seen_subcommand_from $project_cmds" -a delete -d 'Delete project'
complete -c mcpctl -n "__fish_seen_subcommand_from project projects; and not __fish_seen_subcommand_from $project_cmds" -a show -d 'Show project'
complete -c mcpctl -n "__fish_seen_subcommand_from project projects; and not __fish_seen_subcommand_from $project_cmds" -a profiles -d 'List profiles'
complete -c mcpctl -n "__fish_seen_subcommand_from project projects; and not __fish_seen_subcommand_from $project_cmds" -a set-profiles -d 'Set profiles'
complete -c mcpctl -n "__fish_seen_subcommand_from project projects; and __fish_seen_subcommand_from create" -s d -l description -d 'Description' -x
# logs options
complete -c mcpctl -n "__fish_seen_subcommand_from logs" -l tail -d 'Number of lines' -x
complete -c mcpctl -n "__fish_seen_subcommand_from logs" -l since -d 'Since timestamp' -x
complete -c mcpctl -n "__fish_seen_subcommand_from logs" -s f -l follow -d 'Follow log output'
# backup options
complete -c mcpctl -n "__fish_seen_subcommand_from backup" -s o -l output -d 'Output file' -rF
complete -c mcpctl -n "__fish_seen_subcommand_from backup" -s p -l password -d 'Encryption password' -x
complete -c mcpctl -n "__fish_seen_subcommand_from backup" -s r -l resources -d 'Resources to backup' -xa 'servers profiles projects'
# restore options
complete -c mcpctl -n "__fish_seen_subcommand_from restore" -s i -l input -d 'Input file' -rF
@@ -75,6 +75,7 @@ complete -c mcpctl -n "__fish_seen_subcommand_from restore" -s p -l password -d
complete -c mcpctl -n "__fish_seen_subcommand_from restore" -s c -l conflict -d 'Conflict strategy' -xa 'skip overwrite fail'
# apply takes a file
complete -c mcpctl -n "__fish_seen_subcommand_from apply" -s f -l file -d 'Configuration file' -rF
complete -c mcpctl -n "__fish_seen_subcommand_from apply" -F
# help completions


@@ -0,0 +1,13 @@
# Base container for npm-based MCP servers (STDIO transport).
# mcpd uses this image to run `npx -y <packageName>` when a server
# has packageName but no dockerImage.
# Using slim (Debian) instead of alpine for better npm package compatibility.
FROM node:20-slim
WORKDIR /mcp
# Pre-warm npx cache directory
RUN mkdir -p /root/.npm
# Default entrypoint — overridden by mcpd via container command
ENTRYPOINT ["npx", "-y"]
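The comment above says mcpd uses this image to run `npx -y <packageName>` when a server has `packageName` but no `dockerImage`. A minimal sketch of that image/command selection (the `ServerSpec` shape and `containerConfig` helper are illustrative, not mcpd's actual API):

```typescript
// Hypothetical sketch of how mcpd could pick the image and container
// command for a server record. Names here are assumptions for illustration.
interface ServerSpec {
  dockerImage?: string;
  packageName?: string;
}

const NODE_RUNNER_IMAGE = 'mcpctl-node-runner:latest';

function containerConfig(server: ServerSpec): { image: string; cmd: string[] } {
  if (server.dockerImage) {
    // Custom image: run its own entrypoint, no extra command.
    return { image: server.dockerImage, cmd: [] };
  }
  if (server.packageName) {
    // npm-based server: the node-runner entrypoint is ["npx", "-y"],
    // so the container command is just the package name.
    return { image: NODE_RUNNER_IMAGE, cmd: [server.packageName] };
  }
  throw new Error('server needs dockerImage or packageName');
}
```

Because the entrypoint is fixed at `["npx", "-y"]`, overriding the container command with the package name is all that is needed to launch a STDIO MCP server.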


@@ -30,6 +30,8 @@ services:
MCPD_PORT: "3100"
MCPD_HOST: "0.0.0.0"
MCPD_LOG_LEVEL: info
MCPD_NODE_RUNNER_IMAGE: mcpctl-node-runner:latest
MCPD_MCP_NETWORK: mcp-servers
depends_on:
postgres:
condition: service_healthy
@@ -48,6 +50,16 @@ services:
retries: 3
start_period: 10s
# Base image for npm-based MCP servers (built once, used by mcpd)
node-runner:
build:
context: ..
dockerfile: deploy/Dockerfile.node-runner
image: mcpctl-node-runner:latest
profiles:
- build
entrypoint: ["echo", "Image built successfully"]
postgres-test:
image: postgres:16-alpine
container_name: mcpctl-postgres-test
@@ -71,8 +83,11 @@ networks:
mcpctl:
driver: bridge
mcp-servers:
name: mcp-servers
driver: bridge
internal: true
# Not internal — MCP servers need outbound access to reach external APIs
# (e.g., Grafana, Home Assistant). Isolation is enforced by not binding
# host ports on MCP server containers; only mcpd can reach them.
volumes:
mcpctl-pgdata:

fulldeploy.sh Executable file

@@ -0,0 +1,35 @@
#!/bin/bash
# Full deployment: Docker image → Portainer stack → RPM build/publish/install
set -e
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
cd "$SCRIPT_DIR"
# Load .env
if [ -f .env ]; then
set -a; source .env; set +a
fi
echo "========================================"
echo " mcpctl Full Deploy"
echo "========================================"
echo ""
echo ">>> Step 1/3: Build & push mcpd Docker image"
echo ""
bash scripts/build-mcpd.sh "$@"
echo ""
echo ">>> Step 2/3: Deploy stack to production"
echo ""
bash deploy.sh
echo ""
echo ">>> Step 3/3: Build, publish & install RPM"
echo ""
bash scripts/release.sh
echo ""
echo "========================================"
echo " Full deploy complete!"
echo "========================================"


@@ -63,17 +63,69 @@ const TemplateSpecSchema = z.object({
healthCheck: HealthCheckSchema.optional(),
});
const UserSpecSchema = z.object({
email: z.string().email(),
password: z.string().min(8),
name: z.string().optional(),
});
const GroupSpecSchema = z.object({
name: z.string().min(1),
description: z.string().default(''),
members: z.array(z.string().email()).default([]),
});
const RbacSubjectSchema = z.object({
kind: z.enum(['User', 'Group']),
name: z.string().min(1),
});
const RESOURCE_ALIASES: Record<string, string> = {
server: 'servers', instance: 'instances', secret: 'secrets',
project: 'projects', template: 'templates', user: 'users', group: 'groups',
};
const RbacRoleBindingSchema = z.union([
z.object({
role: z.enum(['edit', 'view', 'create', 'delete', 'run', 'expose']),
resource: z.string().min(1).transform((r) => RESOURCE_ALIASES[r] ?? r),
name: z.string().min(1).optional(),
}),
z.object({
role: z.literal('run'),
action: z.string().min(1),
}),
]);
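The `RESOURCE_ALIASES` map plus the `.transform()` above let a spec write either the singular (`resource: server`) or plural (`resource: servers`) form. A minimal sketch of that normalization as a plain function (standalone restatement, not the zod schema itself):

```typescript
// Mirrors the alias map in the schema above: known singular forms map to
// their plural; anything else passes through unchanged.
const RESOURCE_ALIASES: Record<string, string> = {
  server: 'servers', instance: 'instances', secret: 'secrets',
  project: 'projects', template: 'templates', user: 'users', group: 'groups',
};

function normalizeResource(r: string): string {
  return RESOURCE_ALIASES[r] ?? r;
}
```

Passing unknown values through (rather than rejecting them) keeps the schema forward-compatible with resource kinds added later.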
const RbacBindingSpecSchema = z.object({
name: z.string().min(1),
subjects: z.array(RbacSubjectSchema).default([]),
roleBindings: z.array(RbacRoleBindingSchema).default([]),
});
const ProjectSpecSchema = z.object({
name: z.string().min(1),
description: z.string().default(''),
proxyMode: z.enum(['direct', 'filtered']).default('direct'),
llmProvider: z.string().optional(),
llmModel: z.string().optional(),
servers: z.array(z.string()).default([]),
});
const ApplyConfigSchema = z.object({
servers: z.array(ServerSpecSchema).default([]),
secrets: z.array(SecretSpecSchema).default([]),
servers: z.array(ServerSpecSchema).default([]),
users: z.array(UserSpecSchema).default([]),
groups: z.array(GroupSpecSchema).default([]),
projects: z.array(ProjectSpecSchema).default([]),
templates: z.array(TemplateSpecSchema).default([]),
});
rbacBindings: z.array(RbacBindingSpecSchema).default([]),
rbac: z.array(RbacBindingSpecSchema).default([]),
}).transform((data) => ({
...data,
// Merge rbac into rbacBindings so both keys work
rbacBindings: [...data.rbacBindings, ...data.rbac],
}));
export type ApplyConfig = z.infer<typeof ApplyConfigSchema>;
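The schema's `.transform()` means a config file may use either the `rbac` or the `rbacBindings` key (or both), and downstream code only ever reads the merged `rbacBindings`. The merge step in isolation (a sketch with the arrays untyped for brevity):

```typescript
// Both keys are accepted; the parsed config exposes their concatenation,
// with rbacBindings entries first, matching the transform above.
interface RawRbacKeys {
  rbacBindings?: unknown[];
  rbac?: unknown[];
}

function mergeRbacKeys(data: RawRbacKeys): unknown[] {
  return [...(data.rbacBindings ?? []), ...(data.rbac ?? [])];
}
```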
@@ -87,17 +139,25 @@ export function createApplyCommand(deps: ApplyCommandDeps): Command {
return new Command('apply')
.description('Apply declarative configuration from a YAML or JSON file')
.argument('<file>', 'Path to config file (.yaml, .yml, or .json)')
.argument('[file]', 'Path to config file (.yaml, .yml, or .json)')
.option('-f, --file <file>', 'Path to config file (alternative to positional arg)')
.option('--dry-run', 'Validate and show changes without applying')
.action(async (file: string, opts: { dryRun?: boolean }) => {
.action(async (fileArg: string | undefined, opts: { file?: string; dryRun?: boolean }) => {
const file = fileArg ?? opts.file;
if (!file) {
throw new Error('File path required. Usage: mcpctl apply <file> or mcpctl apply -f <file>');
}
const config = loadConfigFile(file);
if (opts.dryRun) {
log('Dry run - would apply:');
if (config.servers.length > 0) log(` ${config.servers.length} server(s)`);
if (config.secrets.length > 0) log(` ${config.secrets.length} secret(s)`);
if (config.servers.length > 0) log(` ${config.servers.length} server(s)`);
if (config.users.length > 0) log(` ${config.users.length} user(s)`);
if (config.groups.length > 0) log(` ${config.groups.length} group(s)`);
if (config.projects.length > 0) log(` ${config.projects.length} project(s)`);
if (config.templates.length > 0) log(` ${config.templates.length} template(s)`);
if (config.rbacBindings.length > 0) log(` ${config.rbacBindings.length} rbacBinding(s)`);
return;
}
@@ -119,21 +179,7 @@ function loadConfigFile(path: string): ApplyConfig {
}
async function applyConfig(client: ApiClient, config: ApplyConfig, log: (...args: unknown[]) => void): Promise<void> {
// Apply servers first
for (const server of config.servers) {
try {
const existing = await findByName(client, 'servers', server.name);
if (existing) {
await client.put(`/api/v1/servers/${(existing as { id: string }).id}`, server);
log(`Updated server: ${server.name}`);
} else {
await client.post('/api/v1/servers', server);
log(`Created server: ${server.name}`);
}
} catch (err) {
log(`Error applying server '${server.name}': ${err instanceof Error ? err.message : err}`);
}
}
// Apply order: secrets, servers, users, groups, projects, templates, rbacBindings
// Apply secrets
for (const secret of config.secrets) {
@@ -151,20 +197,63 @@ async function applyConfig(client: ApiClient, config: ApplyConfig, log: (...args
}
}
// Apply projects
// Apply servers
for (const server of config.servers) {
try {
const existing = await findByName(client, 'servers', server.name);
if (existing) {
await client.put(`/api/v1/servers/${(existing as { id: string }).id}`, server);
log(`Updated server: ${server.name}`);
} else {
await client.post('/api/v1/servers', server);
log(`Created server: ${server.name}`);
}
} catch (err) {
log(`Error applying server '${server.name}': ${err instanceof Error ? err.message : err}`);
}
}
// Apply users (matched by email)
for (const user of config.users) {
try {
const existing = await findByField(client, 'users', 'email', user.email);
if (existing) {
await client.put(`/api/v1/users/${(existing as { id: string }).id}`, user);
log(`Updated user: ${user.email}`);
} else {
await client.post('/api/v1/users', user);
log(`Created user: ${user.email}`);
}
} catch (err) {
log(`Error applying user '${user.email}': ${err instanceof Error ? err.message : err}`);
}
}
// Apply groups
for (const group of config.groups) {
try {
const existing = await findByName(client, 'groups', group.name);
if (existing) {
await client.put(`/api/v1/groups/${(existing as { id: string }).id}`, group);
log(`Updated group: ${group.name}`);
} else {
await client.post('/api/v1/groups', group);
log(`Created group: ${group.name}`);
}
} catch (err) {
log(`Error applying group '${group.name}': ${err instanceof Error ? err.message : err}`);
}
}
// Apply projects (send full spec including servers)
for (const project of config.projects) {
try {
const existing = await findByName(client, 'projects', project.name);
if (existing) {
await client.put(`/api/v1/projects/${(existing as { id: string }).id}`, {
description: project.description,
});
await client.put(`/api/v1/projects/${(existing as { id: string }).id}`, project);
log(`Updated project: ${project.name}`);
} else {
await client.post('/api/v1/projects', {
name: project.name,
description: project.description,
});
await client.post('/api/v1/projects', project);
log(`Created project: ${project.name}`);
}
} catch (err) {
@@ -187,6 +276,22 @@ async function applyConfig(client: ApiClient, config: ApplyConfig, log: (...args
log(`Error applying template '${template.name}': ${err instanceof Error ? err.message : err}`);
}
}
// Apply RBAC bindings
for (const rbacBinding of config.rbacBindings) {
try {
const existing = await findByName(client, 'rbac', rbacBinding.name);
if (existing) {
await client.put(`/api/v1/rbac/${(existing as { id: string }).id}`, rbacBinding);
log(`Updated rbacBinding: ${rbacBinding.name}`);
} else {
await client.post('/api/v1/rbac', rbacBinding);
log(`Created rbacBinding: ${rbacBinding.name}`);
}
} catch (err) {
log(`Error applying rbacBinding '${rbacBinding.name}': ${err instanceof Error ? err.message : err}`);
}
}
}
async function findByName(client: ApiClient, resource: string, name: string): Promise<unknown | null> {
@@ -198,5 +303,14 @@ async function findByName(client: ApiClient, resource: string, name: string): Pr
}
}
async function findByField<T extends string>(client: ApiClient, resource: string, field: T, value: string): Promise<unknown | null> {
try {
const items = await client.get<Array<Record<string, unknown>>>(`/api/v1/${resource}`);
return items.find((item) => item[field] === value) ?? null;
} catch {
return null;
}
}
// Export for testing
export { loadConfigFile, applyConfig };
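Every resource loop in `applyConfig` follows the same lookup-then-upsert shape that `findByName`/`findByField` enable: list the collection, match on a field (`name` for most resources, `email` for users), then PUT or POST. A generic sketch of that pattern (the `MiniClient` interface is a stand-in for the real `ApiClient`):

```typescript
// Generic upsert sketch: look the resource up by one field, update if
// found, create otherwise. Assumes list responses include an `id` field.
interface MiniClient {
  get<T>(path: string): Promise<T>;
  post(path: string, body: unknown): Promise<unknown>;
  put(path: string, body: unknown): Promise<unknown>;
}

async function upsert(
  client: MiniClient,
  resource: string,
  field: string,
  body: Record<string, unknown>,
): Promise<'created' | 'updated'> {
  const items = await client.get<Array<Record<string, unknown>>>(`/api/v1/${resource}`);
  const existing = items.find((item) => item[field] === body[field]);
  if (existing) {
    await client.put(`/api/v1/${resource}/${existing['id'] as string}`, body);
    return 'updated';
  }
  await client.post(`/api/v1/${resource}`, body);
  return 'created';
}
```

This lookup-first style makes `apply` idempotent: re-running the same file converges on the same state instead of failing on duplicates.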


@@ -10,6 +10,10 @@ export interface PromptDeps {
password(message: string): Promise<string>;
}
export interface StatusResponse {
hasUsers: boolean;
}
export interface AuthCommandDeps {
configDeps: Partial<ConfigLoaderDeps>;
credentialsDeps: Partial<CredentialsDeps>;
@@ -17,6 +21,8 @@ export interface AuthCommandDeps {
log: (...args: string[]) => void;
loginRequest: (mcpdUrl: string, email: string, password: string) => Promise<LoginResponse>;
logoutRequest: (mcpdUrl: string, token: string) => Promise<void>;
statusRequest: (mcpdUrl: string) => Promise<StatusResponse>;
bootstrapRequest: (mcpdUrl: string, email: string, password: string, name?: string) => Promise<LoginResponse>;
}
interface LoginResponse {
@@ -80,6 +86,70 @@ function defaultLogoutRequest(mcpdUrl: string, token: string): Promise<void> {
});
}
function defaultStatusRequest(mcpdUrl: string): Promise<StatusResponse> {
return new Promise((resolve, reject) => {
const url = new URL('/api/v1/auth/status', mcpdUrl);
const opts: http.RequestOptions = {
hostname: url.hostname,
port: url.port,
path: url.pathname,
method: 'GET',
timeout: 10000,
headers: { 'Content-Type': 'application/json' },
};
const req = http.request(opts, (res) => {
const chunks: Buffer[] = [];
res.on('data', (chunk: Buffer) => chunks.push(chunk));
res.on('end', () => {
const raw = Buffer.concat(chunks).toString('utf-8');
if ((res.statusCode ?? 0) >= 400) {
reject(new Error(`Status check failed (${res.statusCode}): ${raw}`));
return;
}
resolve(JSON.parse(raw) as StatusResponse);
});
});
req.on('error', (err) => reject(new Error(`Cannot reach mcpd: ${err.message}`)));
req.on('timeout', () => { req.destroy(); reject(new Error('Status request timed out')); });
req.end();
});
}
function defaultBootstrapRequest(mcpdUrl: string, email: string, password: string, name?: string): Promise<LoginResponse> {
return new Promise((resolve, reject) => {
const url = new URL('/api/v1/auth/bootstrap', mcpdUrl);
const payload: Record<string, string> = { email, password };
if (name) {
payload['name'] = name;
}
const body = JSON.stringify(payload);
const opts: http.RequestOptions = {
hostname: url.hostname,
port: url.port,
path: url.pathname,
method: 'POST',
timeout: 10000,
headers: { 'Content-Type': 'application/json', 'Content-Length': Buffer.byteLength(body) },
};
const req = http.request(opts, (res) => {
const chunks: Buffer[] = [];
res.on('data', (chunk: Buffer) => chunks.push(chunk));
res.on('end', () => {
const raw = Buffer.concat(chunks).toString('utf-8');
if ((res.statusCode ?? 0) >= 400) {
reject(new Error(`Bootstrap failed (${res.statusCode}): ${raw}`));
return;
}
resolve(JSON.parse(raw) as LoginResponse);
});
});
req.on('error', (err) => reject(new Error(`Cannot reach mcpd: ${err.message}`)));
req.on('timeout', () => { req.destroy(); reject(new Error('Bootstrap request timed out')); });
req.write(body);
req.end();
});
}
async function defaultInput(message: string): Promise<string> {
const { default: inquirer } = await import('inquirer');
const { answer } = await inquirer.prompt([{ type: 'input', name: 'answer', message }]);
@@ -99,10 +169,12 @@ const defaultDeps: AuthCommandDeps = {
log: (...args) => console.log(...args),
loginRequest: defaultLoginRequest,
logoutRequest: defaultLogoutRequest,
statusRequest: defaultStatusRequest,
bootstrapRequest: defaultBootstrapRequest,
};
export function createLoginCommand(deps?: Partial<AuthCommandDeps>): Command {
const { configDeps, credentialsDeps, prompt, log, loginRequest } = { ...defaultDeps, ...deps };
const { configDeps, credentialsDeps, prompt, log, loginRequest, statusRequest, bootstrapRequest } = { ...defaultDeps, ...deps };
return new Command('login')
.description('Authenticate with mcpd')
@@ -111,10 +183,28 @@ export function createLoginCommand(deps?: Partial<AuthCommandDeps>): Command {
const config = loadConfig(configDeps);
const mcpdUrl = opts.mcpdUrl ?? config.mcpdUrl;
try {
const status = await statusRequest(mcpdUrl);
if (!status.hasUsers) {
log('No users configured. Creating first admin account.');
const email = await prompt.input('Email:');
const password = await prompt.password('Password:');
const name = await prompt.input('Name (optional):');
const result = name
? await bootstrapRequest(mcpdUrl, email, password, name)
: await bootstrapRequest(mcpdUrl, email, password);
saveCredentials({
token: result.token,
mcpdUrl,
user: result.user.email,
}, credentialsDeps);
log(`Logged in as ${result.user.email} (admin)`);
} else {
const email = await prompt.input('Email:');
const password = await prompt.password('Password:');
try {
const result = await loginRequest(mcpdUrl, email, password);
saveCredentials({
token: result.token,
@@ -122,6 +212,7 @@ export function createLoginCommand(deps?: Partial<AuthCommandDeps>): Command {
user: result.user.email,
}, credentialsDeps);
log(`Logged in as ${result.user.email}`);
}
} catch (err) {
log(`Login failed: ${(err as Error).message}`);
process.exitCode = 1;


@@ -1,155 +0,0 @@
import { Command } from 'commander';
import { writeFileSync, readFileSync, existsSync } from 'node:fs';
import { resolve } from 'node:path';
import type { ApiClient } from '../api-client.js';
interface McpConfig {
mcpServers: Record<string, { command: string; args: string[]; env?: Record<string, string> }>;
}
export interface ClaudeCommandDeps {
client: ApiClient;
log: (...args: unknown[]) => void;
}
export function createClaudeCommand(deps: ClaudeCommandDeps): Command {
const { client, log } = deps;
const cmd = new Command('claude')
.description('Manage Claude MCP configuration (.mcp.json)');
cmd
.command('generate <projectId>')
.description('Generate .mcp.json from a project configuration')
.option('-o, --output <path>', 'Output file path', '.mcp.json')
.option('--merge', 'Merge with existing .mcp.json instead of overwriting')
.option('--stdout', 'Print to stdout instead of writing a file')
.action(async (projectId: string, opts: { output: string; merge?: boolean; stdout?: boolean }) => {
const config = await client.get<McpConfig>(`/api/v1/projects/${projectId}/mcp-config`);
if (opts.stdout) {
log(JSON.stringify(config, null, 2));
return;
}
const outputPath = resolve(opts.output);
let finalConfig = config;
if (opts.merge && existsSync(outputPath)) {
try {
const existing = JSON.parse(readFileSync(outputPath, 'utf-8')) as McpConfig;
finalConfig = {
mcpServers: {
...existing.mcpServers,
...config.mcpServers,
},
};
} catch {
// If existing file is invalid, just overwrite
}
}
writeFileSync(outputPath, JSON.stringify(finalConfig, null, 2) + '\n');
const serverCount = Object.keys(finalConfig.mcpServers).length;
log(`Wrote ${outputPath} (${serverCount} server(s))`);
});
cmd
.command('show')
.description('Show current .mcp.json configuration')
.option('-p, --path <path>', 'Path to .mcp.json', '.mcp.json')
.action((opts: { path: string }) => {
const filePath = resolve(opts.path);
if (!existsSync(filePath)) {
log(`No .mcp.json found at ${filePath}`);
return;
}
const content = readFileSync(filePath, 'utf-8');
try {
const config = JSON.parse(content) as McpConfig;
const servers = Object.entries(config.mcpServers ?? {});
if (servers.length === 0) {
log('No MCP servers configured.');
return;
}
log(`MCP servers in ${filePath}:\n`);
for (const [name, server] of servers) {
log(` ${name}`);
log(` command: ${server.command} ${server.args.join(' ')}`);
if (server.env) {
const envKeys = Object.keys(server.env);
log(` env: ${envKeys.join(', ')}`);
}
}
} catch {
log(`Invalid JSON in ${filePath}`);
}
});
cmd
.command('add <name>')
.description('Add an MCP server entry to .mcp.json')
.requiredOption('-c, --command <cmd>', 'Command to run')
.option('-a, --args <args...>', 'Command arguments')
.option('-e, --env <key=value...>', 'Environment variables')
.option('-p, --path <path>', 'Path to .mcp.json', '.mcp.json')
.action((name: string, opts: { command: string; args?: string[]; env?: string[]; path: string }) => {
const filePath = resolve(opts.path);
let config: McpConfig = { mcpServers: {} };
if (existsSync(filePath)) {
try {
config = JSON.parse(readFileSync(filePath, 'utf-8')) as McpConfig;
} catch {
// Start fresh
}
}
const entry: { command: string; args: string[]; env?: Record<string, string> } = {
command: opts.command,
args: opts.args ?? [],
};
if (opts.env && opts.env.length > 0) {
const env: Record<string, string> = {};
for (const pair of opts.env) {
const eqIdx = pair.indexOf('=');
if (eqIdx > 0) {
env[pair.slice(0, eqIdx)] = pair.slice(eqIdx + 1);
}
}
entry.env = env;
}
config.mcpServers[name] = entry;
writeFileSync(filePath, JSON.stringify(config, null, 2) + '\n');
log(`Added '${name}' to ${filePath}`);
});
cmd
.command('remove <name>')
.description('Remove an MCP server entry from .mcp.json')
.option('-p, --path <path>', 'Path to .mcp.json', '.mcp.json')
.action((name: string, opts: { path: string }) => {
const filePath = resolve(opts.path);
if (!existsSync(filePath)) {
log(`No .mcp.json found at ${filePath}`);
return;
}
try {
const config = JSON.parse(readFileSync(filePath, 'utf-8')) as McpConfig;
if (!(name in config.mcpServers)) {
log(`Server '${name}' not found in ${filePath}`);
return;
}
delete config.mcpServers[name];
writeFileSync(filePath, JSON.stringify(config, null, 2) + '\n');
log(`Removed '${name}' from ${filePath}`);
} catch {
log(`Invalid JSON in ${filePath}`);
}
});
return cmd;
}


@@ -1,19 +1,35 @@
import { Command } from 'commander';
import { writeFileSync, readFileSync, existsSync } from 'node:fs';
import { resolve, join } from 'node:path';
import { homedir } from 'node:os';
import { loadConfig, saveConfig, mergeConfig, getConfigPath, DEFAULT_CONFIG } from '../config/index.js';
import type { McpctlConfig, ConfigLoaderDeps } from '../config/index.js';
import { formatJson, formatYaml } from '../formatters/index.js';
import { saveCredentials, loadCredentials } from '../auth/index.js';
import type { CredentialsDeps, StoredCredentials } from '../auth/index.js';
import type { ApiClient } from '../api-client.js';
interface McpConfig {
mcpServers: Record<string, { command: string; args: string[]; env?: Record<string, string> }>;
}
export interface ConfigCommandDeps {
configDeps: Partial<ConfigLoaderDeps>;
log: (...args: string[]) => void;
}
export interface ConfigApiDeps {
client: ApiClient;
credentialsDeps: Partial<CredentialsDeps>;
log: (...args: string[]) => void;
}
const defaultDeps: ConfigCommandDeps = {
configDeps: {},
log: (...args) => console.log(...args),
};
export function createConfigCommand(deps?: Partial<ConfigCommandDeps>): Command {
export function createConfigCommand(deps?: Partial<ConfigCommandDeps>, apiDeps?: ConfigApiDeps): Command {
const { configDeps, log } = { ...defaultDeps, ...deps };
const config = new Command('config').description('Manage mcpctl configuration');
@@ -68,5 +84,115 @@ export function createConfigCommand(deps?: Partial<ConfigCommandDeps>): Command
log('Configuration reset to defaults');
});
if (apiDeps) {
const { client, credentialsDeps, log: apiLog } = apiDeps;
config
.command('claude-generate')
.description('Generate .mcp.json from a project configuration')
.requiredOption('--project <name>', 'Project name')
.option('-o, --output <path>', 'Output file path', '.mcp.json')
.option('--merge', 'Merge with existing .mcp.json instead of overwriting')
.option('--stdout', 'Print to stdout instead of writing a file')
.action(async (opts: { project: string; output: string; merge?: boolean; stdout?: boolean }) => {
const mcpConfig = await client.get<McpConfig>(`/api/v1/projects/${opts.project}/mcp-config`);
if (opts.stdout) {
apiLog(JSON.stringify(mcpConfig, null, 2));
return;
}
const outputPath = resolve(opts.output);
let finalConfig = mcpConfig;
if (opts.merge && existsSync(outputPath)) {
try {
const existing = JSON.parse(readFileSync(outputPath, 'utf-8')) as McpConfig;
finalConfig = {
mcpServers: {
...existing.mcpServers,
...mcpConfig.mcpServers,
},
};
} catch {
// If existing file is invalid, just overwrite
}
}
writeFileSync(outputPath, JSON.stringify(finalConfig, null, 2) + '\n');
const serverCount = Object.keys(finalConfig.mcpServers).length;
apiLog(`Wrote ${outputPath} (${serverCount} server(s))`);
});
config
.command('impersonate')
.description('Impersonate another user or return to original identity')
.argument('[email]', 'Email of user to impersonate')
.option('--quit', 'Stop impersonating and return to original identity')
.action(async (email: string | undefined, opts: { quit?: boolean }) => {
const configDir = credentialsDeps?.configDir ?? join(homedir(), '.mcpctl');
const backupPath = join(configDir, 'credentials-backup');
if (opts.quit) {
if (!existsSync(backupPath)) {
apiLog('No impersonation session to quit');
process.exitCode = 1;
return;
}
const backupRaw = readFileSync(backupPath, 'utf-8');
const backup = JSON.parse(backupRaw) as StoredCredentials;
saveCredentials(backup, credentialsDeps);
// Remove backup file
const { unlinkSync } = await import('node:fs');
unlinkSync(backupPath);
apiLog(`Returned to ${backup.user}`);
return;
}
if (!email) {
apiLog('Email is required when not using --quit');
process.exitCode = 1;
return;
}
// Save current credentials as backup
const currentCreds = loadCredentials(credentialsDeps);
if (!currentCreds) {
apiLog('Not logged in. Run "mcpctl login" first.');
process.exitCode = 1;
return;
}
writeFileSync(backupPath, JSON.stringify(currentCreds, null, 2) + '\n', 'utf-8');
try {
const result = await client.post<{ token: string; user: { email: string } }>(
'/api/v1/auth/impersonate',
{ email },
);
saveCredentials({
token: result.token,
mcpdUrl: currentCreds.mcpdUrl,
user: result.user.email,
}, credentialsDeps);
apiLog(`Impersonating ${result.user.email}. Use 'mcpctl config impersonate --quit' to return.`);
} catch (err) {
// Restore backup on failure
const backup = JSON.parse(readFileSync(backupPath, 'utf-8')) as StoredCredentials;
saveCredentials(backup, credentialsDeps);
const { unlinkSync } = await import('node:fs');
unlinkSync(backupPath);
apiLog(`Impersonate failed: ${(err as Error).message}`);
process.exitCode = 1;
}
});
}
return config;
}


@@ -1,5 +1,5 @@
import { Command } from 'commander';
import type { ApiClient } from '../api-client.js';
import { type ApiClient, ApiError } from '../api-client.js';
export interface CreateCommandDeps {
client: ApiClient;
log: (...args: unknown[]) => void;
@@ -55,7 +55,7 @@ export function createCreateCommand(deps: CreateCommandDeps): Command {
const { client, log } = deps;
const cmd = new Command('create')
.description('Create a resource (server, project)');
.description('Create a resource (server, secret, project, user, group, rbac)');
// --- create server ---
cmd.command('server')
@@ -72,6 +72,7 @@ export function createCreateCommand(deps: CreateCommandDeps): Command {
.option('--replicas <count>', 'Number of replicas')
.option('--env <entry>', 'Env var: KEY=value (inline) or KEY=secretRef:SECRET:KEY (secret ref, repeat for multiple)', collect, [])
.option('--from-template <name>', 'Create from template (name or name:version)')
.option('--force', 'Update if already exists')
.action(async (name: string, opts) => {
let base: Record<string, unknown> = {};
@@ -92,9 +93,12 @@ export function createCreateCommand(deps: CreateCommandDeps): Command {
if (!template) throw new Error(`Template '${tplName}' not found`);
}
// Copy template fields as base (strip template-only fields)
const { id: _id, createdAt: _c, updatedAt: _u, ...tplFields } = template;
base = { ...tplFields };
// Copy template fields as base (strip template-only, internal, and null fields)
const { id: _id, createdAt: _c, updatedAt: _u, version: _v, name: _n, ...tplFields } = template;
base = {};
for (const [k, v] of Object.entries(tplFields)) {
if (v !== null && v !== undefined) base[k] = v;
}
// Convert template env (description/required) to server env (name/value/valueFrom)
const tplEnv = template.env as Array<{ name: string; description?: string; required?: boolean; defaultValue?: string }> | undefined;
@@ -144,8 +148,20 @@ export function createCreateCommand(deps: CreateCommandDeps): Command {
if (!body.replicas) body.replicas = 1;
}
try {
const server = await client.post<{ id: string; name: string }>('/api/v1/servers', body);
log(`server '${server.name}' created (id: ${server.id})`);
} catch (err) {
if (err instanceof ApiError && err.status === 409 && opts.force) {
const existing = (await client.get<Array<{ id: string; name: string }>>('/api/v1/servers')).find((s) => s.name === name);
if (!existing) throw err;
const { name: _n, ...updateBody } = body;
await client.put(`/api/v1/servers/${existing.id}`, updateBody);
log(`server '${name}' updated (id: ${existing.id})`);
} else {
throw err;
}
}
});
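Unlike `apply`'s lookup-first upsert, the `--force` flag here reacts to the server's 409 conflict response: create optimistically, and only on a conflict fall back to looking the resource up and updating it. The control flow in isolation (the `ApiError` class here is a stand-in for the CLI's real one):

```typescript
// Sketch of the --force flow: try create; on a 409 with --force set,
// run the update path instead; otherwise rethrow.
class ApiError extends Error {
  constructor(public status: number, message: string) {
    super(message);
  }
}

async function createOrUpdate(
  create: () => Promise<string>,
  update: () => Promise<string>,
  force: boolean,
): Promise<string> {
  try {
    return await create();
  } catch (err) {
    if (err instanceof ApiError && err.status === 409 && force) {
      return await update();
    }
    throw err;
  }
}
```

Optimistic create avoids an extra list call in the common case, at the cost of a second round-trip only when the name already exists.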
// --- create secret ---
@@ -153,13 +169,25 @@ export function createCreateCommand(deps: CreateCommandDeps): Command {
.description('Create a secret')
.argument('<name>', 'Secret name (lowercase, hyphens allowed)')
.option('--data <entry>', 'Secret data KEY=value (repeat for multiple)', collect, [])
.option('--force', 'Update if already exists')
.action(async (name: string, opts) => {
const data = parseEnvEntries(opts.data);
try {
const secret = await client.post<{ id: string; name: string }>('/api/v1/secrets', {
name,
data,
});
log(`secret '${secret.name}' created (id: ${secret.id})`);
} catch (err) {
if (err instanceof ApiError && err.status === 409 && opts.force) {
const existing = (await client.get<Array<{ id: string; name: string }>>('/api/v1/secrets')).find((s) => s.name === name);
if (!existing) throw err;
await client.put(`/api/v1/secrets/${existing.id}`, { data });
log(`secret '${name}' updated (id: ${existing.id})`);
} else {
throw err;
}
}
});
// --- create project ---
@@ -167,12 +195,156 @@ export function createCreateCommand(deps: CreateCommandDeps): Command {
.description('Create a project')
.argument('<name>', 'Project name')
.option('-d, --description <text>', 'Project description', '')
.option('--proxy-mode <mode>', 'Proxy mode (direct, filtered)')
.option('--proxy-mode-llm-provider <name>', 'LLM provider name (for filtered proxy mode)')
.option('--proxy-mode-llm-model <name>', 'LLM model name (for filtered proxy mode)')
.option('--server <name>', 'Server name (repeat for multiple)', collect, [])
.option('--force', 'Update if already exists')
.action(async (name: string, opts) => {
const body: Record<string, unknown> = {
name,
description: opts.description,
proxyMode: opts.proxyMode ?? 'direct',
};
if (opts.proxyModeLlmProvider) body.llmProvider = opts.proxyModeLlmProvider;
if (opts.proxyModeLlmModel) body.llmModel = opts.proxyModeLlmModel;
if (opts.server.length > 0) body.servers = opts.server;
try {
const project = await client.post<{ id: string; name: string }>('/api/v1/projects', body);
log(`project '${project.name}' created (id: ${project.id})`);
} catch (err) {
if (err instanceof ApiError && err.status === 409 && opts.force) {
const existing = (await client.get<Array<{ id: string; name: string }>>('/api/v1/projects')).find((p) => p.name === name);
if (!existing) throw err;
const { name: _n, ...updateBody } = body;
await client.put(`/api/v1/projects/${existing.id}`, updateBody);
log(`project '${name}' updated (id: ${existing.id})`);
} else {
throw err;
}
}
});
// --- create user ---
cmd.command('user')
.description('Create a user')
.argument('<email>', 'User email address')
.option('--password <pass>', 'User password')
.option('--name <name>', 'User display name')
.option('--force', 'Update if already exists')
.action(async (email: string, opts) => {
if (!opts.password) {
throw new Error('--password is required');
}
const body: Record<string, unknown> = {
email,
password: opts.password,
};
if (opts.name) body.name = opts.name;
try {
const user = await client.post<{ id: string; email: string }>('/api/v1/users', body);
log(`user '${user.email}' created (id: ${user.id})`);
} catch (err) {
if (err instanceof ApiError && err.status === 409 && opts.force) {
const existing = (await client.get<Array<{ id: string; email: string }>>('/api/v1/users')).find((u) => u.email === email);
if (!existing) throw err;
const { email: _e, ...updateBody } = body;
await client.put(`/api/v1/users/${existing.id}`, updateBody);
log(`user '${email}' updated (id: ${existing.id})`);
} else {
throw err;
}
}
});
// --- create group ---
cmd.command('group')
.description('Create a group')
.argument('<name>', 'Group name')
.option('--description <text>', 'Group description')
.option('--member <email>', 'Member email (repeat for multiple)', collect, [])
.option('--force', 'Update if already exists')
.action(async (name: string, opts) => {
const body: Record<string, unknown> = {
name,
members: opts.member,
};
if (opts.description) body.description = opts.description;
try {
const group = await client.post<{ id: string; name: string }>('/api/v1/groups', body);
log(`group '${group.name}' created (id: ${group.id})`);
} catch (err) {
if (err instanceof ApiError && err.status === 409 && opts.force) {
const existing = (await client.get<Array<{ id: string; name: string }>>('/api/v1/groups')).find((g) => g.name === name);
if (!existing) throw err;
const { name: _n, ...updateBody } = body;
await client.put(`/api/v1/groups/${existing.id}`, updateBody);
log(`group '${name}' updated (id: ${existing.id})`);
} else {
throw err;
}
}
});
// --- create rbac ---
cmd.command('rbac')
.description('Create an RBAC binding definition')
.argument('<name>', 'RBAC binding name')
.option('--subject <entry>', 'Subject as Kind:name (repeat for multiple)', collect, [])
.option('--binding <entry>', 'Role binding as role:resource (e.g. edit:servers, run:projects)', collect, [])
.option('--operation <action>', 'Operation binding (e.g. logs, backup)', collect, [])
.option('--force', 'Update if already exists')
.action(async (name: string, opts) => {
const subjects = (opts.subject as string[]).map((entry: string) => {
const colonIdx = entry.indexOf(':');
if (colonIdx === -1) {
throw new Error(`Invalid subject format '${entry}'. Expected Kind:name (e.g. User:alice@example.com)`);
}
return { kind: entry.slice(0, colonIdx), name: entry.slice(colonIdx + 1) };
});
const roleBindings: Array<Record<string, string>> = [];
// Resource bindings from --binding flag (role:resource or role:resource:name)
for (const entry of opts.binding as string[]) {
const parts = entry.split(':');
if (parts.length === 2) {
roleBindings.push({ role: parts[0]!, resource: parts[1]! });
} else if (parts.length === 3) {
roleBindings.push({ role: parts[0]!, resource: parts[1]!, name: parts[2]! });
} else {
throw new Error(`Invalid binding format '${entry}'. Expected role:resource or role:resource:name (e.g. edit:servers, view:servers:my-ha)`);
}
}
// Operation bindings from --operation flag
for (const action of opts.operation as string[]) {
roleBindings.push({ role: 'run', action });
}
const body: Record<string, unknown> = {
name,
subjects,
roleBindings,
};
try {
const rbac = await client.post<{ id: string; name: string }>('/api/v1/rbac', body);
log(`rbac '${rbac.name}' created (id: ${rbac.id})`);
} catch (err) {
if (err instanceof ApiError && err.status === 409 && opts.force) {
const existing = (await client.get<Array<{ id: string; name: string }>>('/api/v1/rbac')).find((r) => r.name === name);
if (!existing) throw err;
const { name: _n, ...updateBody } = body;
await client.put(`/api/v1/rbac/${existing.id}`, updateBody);
log(`rbac '${name}' updated (id: ${existing.id})`);
} else {
throw err;
}
}
});
return cmd;
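Every `create` subcommand above follows the same create-or-update shape: `POST` first, and on a 409 conflict with `--force`, look the existing resource up by name and `PUT` the body with its immutable `name` stripped. A minimal sketch of that flow — the `MiniClient` interface and the in-memory client are hypothetical stand-ins for the CLI's real `ApiClient`:

```typescript
// Hypothetical stand-in for the CLI's ApiError: carries the HTTP status.
class ApiError extends Error {
  constructor(public status: number, public body: string) { super(body); }
}

interface Resource { id: string; name: string }

// Hypothetical minimal client surface; the real ApiClient has the same verbs.
interface MiniClient {
  post(path: string, body: Record<string, unknown>): Promise<Resource>;
  get(path: string): Promise<Resource[]>;
  put(path: string, body: Record<string, unknown>): Promise<void>;
}

// The shared create-or-update flow: POST; on 409 + --force, resolve the
// existing resource by name and PUT the same body minus the name field.
async function upsert(
  client: MiniClient,
  collection: string,
  name: string,
  body: Record<string, unknown>,
  force: boolean,
): Promise<string> {
  try {
    const created = await client.post(`/api/v1/${collection}`, body);
    return `${collection.replace(/s$/, '')} '${created.name}' created (id: ${created.id})`;
  } catch (err) {
    if (err instanceof ApiError && err.status === 409 && force) {
      const existing = (await client.get(`/api/v1/${collection}`)).find((r) => r.name === name);
      if (!existing) throw err;
      const { name: _n, ...updateBody } = body;
      await client.put(`/api/v1/${collection}/${existing.id}`, updateBody);
      return `${collection.replace(/s$/, '')} '${name}' updated (id: ${existing.id})`;
    }
    throw err;
  }
}
```

Without `--force`, the 409 propagates to the caller unchanged, which keeps "already exists" an error by default.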

@@ -11,7 +11,7 @@ export function createDeleteCommand(deps: DeleteCommandDeps): Command {
const { client, log } = deps;
return new Command('delete')
.description('Delete a resource (server, instance, secret, project, user, group, rbac)')
.argument('<resource>', 'resource type')
.argument('<id>', 'resource ID or name')
.action(async (resourceArg: string, idOrName: string) => {

@@ -74,9 +74,10 @@ function formatServerDetail(server: Record<string, unknown>): string {
function formatInstanceDetail(instance: Record<string, unknown>, inspect?: Record<string, unknown>): string {
const lines: string[] = [];
const server = instance.server as { name: string } | undefined;
lines.push(`=== Instance: ${server?.name ?? instance.id} ===`);
lines.push(`${pad('Status:')}${instance.status}`);
lines.push(`${pad('Server:')}${server?.name ?? String(instance.serverId)}`);
lines.push(`${pad('Container ID:')}${instance.containerId ?? '-'}`);
lines.push(`${pad('Port:')}${instance.port ?? '-'}`);
@@ -137,11 +138,34 @@ function formatProjectDetail(project: Record<string, unknown>): string {
lines.push(`=== Project: ${project.name} ===`);
lines.push(`${pad('Name:')}${project.name}`);
if (project.description) lines.push(`${pad('Description:')}${project.description}`);
if (project.ownerId) lines.push(`${pad('Owner:')}${project.ownerId}`);
// Proxy config section
const proxyMode = project.proxyMode as string | undefined;
const llmProvider = project.llmProvider as string | undefined;
const llmModel = project.llmModel as string | undefined;
if (proxyMode || llmProvider || llmModel) {
lines.push('');
lines.push('Proxy Config:');
lines.push(` ${pad('Mode:', 18)}${proxyMode ?? 'direct'}`);
if (llmProvider) lines.push(` ${pad('LLM Provider:', 18)}${llmProvider}`);
if (llmModel) lines.push(` ${pad('LLM Model:', 18)}${llmModel}`);
}
// Servers section
const servers = project.servers as Array<{ server: { name: string } }> | undefined;
if (servers && servers.length > 0) {
lines.push('');
lines.push('Servers:');
lines.push(' NAME');
for (const s of servers) {
lines.push(` ${s.server.name}`);
}
}
lines.push('');
lines.push('Metadata:');
lines.push(` ${pad('ID:', 12)}${project.id}`);
if (project.ownerId) lines.push(` ${pad('Owner:', 12)}${project.ownerId}`);
if (project.createdAt) lines.push(` ${pad('Created:', 12)}${project.createdAt}`);
if (project.updatedAt) lines.push(` ${pad('Updated:', 12)}${project.updatedAt}`);
@@ -239,6 +263,231 @@ function formatTemplateDetail(template: Record<string, unknown>): string {
return lines.join('\n');
}
interface RbacBinding { role: string; resource?: string; action?: string; name?: string }
interface RbacDef { name: string; subjects: Array<{ kind: string; name: string }>; roleBindings: RbacBinding[] }
interface PermissionSet { source: string; bindings: RbacBinding[] }
function formatPermissionSections(sections: PermissionSet[]): string[] {
const lines: string[] = [];
for (const section of sections) {
const bindings = section.bindings;
if (bindings.length === 0) continue;
const resourceBindings = bindings.filter((b) => 'resource' in b && b.resource !== undefined);
const operationBindings = bindings.filter((b) => 'action' in b && b.action !== undefined);
if (resourceBindings.length > 0) {
lines.push('');
lines.push(`${section.source} — Resources:`);
const roleW = Math.max(6, ...resourceBindings.map((b) => b.role.length)) + 2;
const resW = Math.max(10, ...resourceBindings.map((b) => (b.resource ?? '').length)) + 2;
const hasName = resourceBindings.some((b) => b.name);
if (hasName) {
lines.push(` ${'ROLE'.padEnd(roleW)}${'RESOURCE'.padEnd(resW)}NAME`);
} else {
lines.push(` ${'ROLE'.padEnd(roleW)}RESOURCE`);
}
for (const b of resourceBindings) {
if (hasName) {
lines.push(` ${b.role.padEnd(roleW)}${(b.resource ?? '').padEnd(resW)}${b.name ?? '*'}`);
} else {
lines.push(` ${b.role.padEnd(roleW)}${b.resource}`);
}
}
}
if (operationBindings.length > 0) {
lines.push('');
lines.push(`${section.source} — Operations:`);
lines.push(` ${'ACTION'.padEnd(20)}ROLE`);
for (const b of operationBindings) {
lines.push(` ${(b.action ?? '').padEnd(20)}${b.role}`);
}
}
}
return lines;
}
function collectBindingsForSubject(
rbacDefs: RbacDef[],
kind: string,
name: string,
): { rbacName: string; bindings: RbacBinding[] }[] {
const results: { rbacName: string; bindings: RbacBinding[] }[] = [];
for (const def of rbacDefs) {
const matched = def.subjects.some((s) => s.kind === kind && s.name === name);
if (matched) {
results.push({ rbacName: def.name, bindings: def.roleBindings });
}
}
return results;
}
function formatUserDetail(
user: Record<string, unknown>,
rbacDefs?: RbacDef[],
userGroups?: string[],
): string {
const lines: string[] = [];
lines.push(`=== User: ${user.email} ===`);
lines.push(`${pad('Email:')}${user.email}`);
lines.push(`${pad('Name:')}${(user.name as string | null) ?? '-'}`);
lines.push(`${pad('Provider:')}${(user.provider as string | null) ?? 'local'}`);
if (userGroups && userGroups.length > 0) {
lines.push(`${pad('Groups:')}${userGroups.join(', ')}`);
}
if (rbacDefs) {
const email = user.email as string;
// Direct permissions (User:email subjects)
const directMatches = collectBindingsForSubject(rbacDefs, 'User', email);
const directBindings = directMatches.flatMap((m) => m.bindings);
const directSources = directMatches.map((m) => m.rbacName).join(', ');
// Inherited permissions (Group:name subjects)
const inheritedSections: PermissionSet[] = [];
if (userGroups) {
for (const groupName of userGroups) {
const groupMatches = collectBindingsForSubject(rbacDefs, 'Group', groupName);
const groupBindings = groupMatches.flatMap((m) => m.bindings);
if (groupBindings.length > 0) {
inheritedSections.push({ source: `Inherited (${groupName})`, bindings: groupBindings });
}
}
}
const sections: PermissionSet[] = [];
if (directBindings.length > 0) {
sections.push({ source: `Direct (${directSources})`, bindings: directBindings });
}
sections.push(...inheritedSections);
if (sections.length > 0) {
lines.push('');
lines.push('Access:');
lines.push(...formatPermissionSections(sections));
} else {
lines.push('');
lines.push('Access: (none)');
}
}
lines.push('');
lines.push('Metadata:');
lines.push(` ${pad('ID:', 12)}${user.id}`);
if (user.createdAt) lines.push(` ${pad('Created:', 12)}${user.createdAt}`);
if (user.updatedAt) lines.push(` ${pad('Updated:', 12)}${user.updatedAt}`);
return lines.join('\n');
}
function formatGroupDetail(group: Record<string, unknown>, rbacDefs?: RbacDef[]): string {
const lines: string[] = [];
lines.push(`=== Group: ${group.name} ===`);
lines.push(`${pad('Name:')}${group.name}`);
if (group.description) lines.push(`${pad('Description:')}${group.description}`);
const members = group.members as Array<{ user: { email: string }; createdAt?: string }> | undefined;
if (members && members.length > 0) {
lines.push('');
lines.push('Members:');
const emailW = Math.max(6, ...members.map((m) => m.user.email.length)) + 2;
lines.push(` ${'EMAIL'.padEnd(emailW)}ADDED`);
for (const m of members) {
const added = (m.createdAt as string | undefined) ?? '-';
lines.push(` ${m.user.email.padEnd(emailW)}${added}`);
}
}
if (rbacDefs) {
const groupName = group.name as string;
const matches = collectBindingsForSubject(rbacDefs, 'Group', groupName);
const allBindings = matches.flatMap((m) => m.bindings);
const sources = matches.map((m) => m.rbacName).join(', ');
if (allBindings.length > 0) {
const sections: PermissionSet[] = [{ source: `Granted (${sources})`, bindings: allBindings }];
lines.push('');
lines.push('Access:');
lines.push(...formatPermissionSections(sections));
} else {
lines.push('');
lines.push('Access: (none)');
}
}
lines.push('');
lines.push('Metadata:');
lines.push(` ${pad('ID:', 12)}${group.id}`);
if (group.createdAt) lines.push(` ${pad('Created:', 12)}${group.createdAt}`);
if (group.updatedAt) lines.push(` ${pad('Updated:', 12)}${group.updatedAt}`);
return lines.join('\n');
}
function formatRbacDetail(rbac: Record<string, unknown>): string {
const lines: string[] = [];
lines.push(`=== RBAC: ${rbac.name} ===`);
lines.push(`${pad('Name:')}${rbac.name}`);
const subjects = rbac.subjects as Array<{ kind: string; name: string }> | undefined;
if (subjects && subjects.length > 0) {
lines.push('');
lines.push('Subjects:');
const kindW = Math.max(6, ...subjects.map((s) => s.kind.length)) + 2;
lines.push(` ${'KIND'.padEnd(kindW)}NAME`);
for (const s of subjects) {
lines.push(` ${s.kind.padEnd(kindW)}${s.name}`);
}
}
const roleBindings = rbac.roleBindings as Array<{ role: string; resource?: string; action?: string; name?: string }> | undefined;
if (roleBindings && roleBindings.length > 0) {
// Separate resource bindings from operation bindings
const resourceBindings = roleBindings.filter((b) => 'resource' in b && b.resource !== undefined);
const operationBindings = roleBindings.filter((b) => 'action' in b && b.action !== undefined);
if (resourceBindings.length > 0) {
lines.push('');
lines.push('Resource Bindings:');
const roleW = Math.max(6, ...resourceBindings.map((b) => b.role.length)) + 2;
const resW = Math.max(10, ...resourceBindings.map((b) => (b.resource ?? '').length)) + 2;
const hasName = resourceBindings.some((b) => b.name);
if (hasName) {
lines.push(` ${'ROLE'.padEnd(roleW)}${'RESOURCE'.padEnd(resW)}NAME`);
} else {
lines.push(` ${'ROLE'.padEnd(roleW)}RESOURCE`);
}
for (const b of resourceBindings) {
if (hasName) {
lines.push(` ${b.role.padEnd(roleW)}${(b.resource ?? '').padEnd(resW)}${b.name ?? '*'}`);
} else {
lines.push(` ${b.role.padEnd(roleW)}${b.resource}`);
}
}
}
if (operationBindings.length > 0) {
lines.push('');
lines.push('Operations:');
lines.push(` ${'ACTION'.padEnd(20)}ROLE`);
for (const b of operationBindings) {
lines.push(` ${(b.action ?? '').padEnd(20)}${b.role}`);
}
}
}
lines.push('');
lines.push('Metadata:');
lines.push(` ${pad('ID:', 12)}${rbac.id}`);
if (rbac.createdAt) lines.push(` ${pad('Created:', 12)}${rbac.createdAt}`);
if (rbac.updatedAt) lines.push(` ${pad('Updated:', 12)}${rbac.updatedAt}`);
return lines.join('\n');
}
function formatGenericDetail(obj: Record<string, unknown>): string {
const lines: string[] = [];
for (const [key, value] of Object.entries(obj)) {
@@ -277,11 +526,33 @@ export function createDescribeCommand(deps: DescribeCommandDeps): Command {
// Resolve name → ID
let id: string;
if (resource === 'instances') {
// Instances: accept instance ID or server name (resolve to first running instance)
try {
id = await resolveNameOrId(deps.client, resource, idOrName);
} catch {
// Not an instance ID — try as server name
const servers = await deps.client.get<Array<{ id: string; name: string }>>('/api/v1/servers');
const server = servers.find((s) => s.name === idOrName || s.id === idOrName);
if (server) {
const instances = await deps.client.get<Array<{ id: string; status: string }>>(`/api/v1/instances?serverId=${server.id}`);
const running = instances.find((i) => i.status === 'RUNNING') ?? instances[0];
if (running) {
id = running.id;
} else {
throw new Error(`No instances found for server '${idOrName}'`);
}
} else {
id = idOrName;
}
}
} else {
try {
id = await resolveNameOrId(deps.client, resource, idOrName);
} catch {
id = idOrName;
}
}
const item = await deps.fetchResource(resource, id) as Record<string, unknown>;
@@ -318,6 +589,27 @@ export function createDescribeCommand(deps: DescribeCommandDeps): Command {
case 'projects':
deps.log(formatProjectDetail(item));
break;
case 'users': {
// Fetch RBAC definitions and groups to show permissions
const [rbacDefsForUser, allGroupsForUser] = await Promise.all([
deps.client.get<RbacDef[]>('/api/v1/rbac').catch(() => [] as RbacDef[]),
deps.client.get<Array<{ name: string; members?: Array<{ user: { email: string } }> }>>('/api/v1/groups').catch(() => []),
]);
const userEmail = item.email as string;
const userGroupNames = allGroupsForUser
.filter((g) => g.members?.some((m) => m.user.email === userEmail))
.map((g) => g.name);
deps.log(formatUserDetail(item, rbacDefsForUser, userGroupNames));
break;
}
case 'groups': {
const rbacDefsForGroup = await deps.client.get<RbacDef[]>('/api/v1/rbac').catch(() => [] as RbacDef[]);
deps.log(formatGroupDetail(item, rbacDefsForGroup));
break;
}
case 'rbac':
deps.log(formatRbacDetail(item));
break;
default:
deps.log(formatGenericDetail(item));
}
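The permission sections in `formatUserDetail` and `formatGroupDetail` reduce to one lookup: `collectBindingsForSubject` scans every RBAC definition for a matching `Kind:name` subject and returns the bindings with their source definition. A small usage sketch — the helper mirrors the one above, and the RBAC definitions are made-up example data:

```typescript
interface RbacBinding { role: string; resource?: string; action?: string; name?: string }
interface RbacDef { name: string; subjects: Array<{ kind: string; name: string }>; roleBindings: RbacBinding[] }

// Same shape as the CLI helper: keep every definition whose subject list
// contains the given Kind:name pair, returning bindings with provenance.
function collectBindingsForSubject(
  rbacDefs: RbacDef[],
  kind: string,
  name: string,
): { rbacName: string; bindings: RbacBinding[] }[] {
  return rbacDefs
    .filter((def) => def.subjects.some((s) => s.kind === kind && s.name === name))
    .map((def) => ({ rbacName: def.name, bindings: def.roleBindings }));
}

// Example data (hypothetical): alice holds one direct grant and inherits
// an operation binding through the "ops" group.
const defs: RbacDef[] = [
  { name: 'alice-edit', subjects: [{ kind: 'User', name: 'alice@example.com' }], roleBindings: [{ role: 'edit', resource: 'servers' }] },
  { name: 'ops-run', subjects: [{ kind: 'Group', name: 'ops' }], roleBindings: [{ role: 'run', action: 'logs' }] },
];

const direct = collectBindingsForSubject(defs, 'User', 'alice@example.com');
const inherited = collectBindingsForSubject(defs, 'Group', 'ops');
```

`describe users` runs this twice — once with `User:email`, once per group name — which is what splits the output into "Direct" and "Inherited (group)" sections.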

@@ -47,7 +47,7 @@ export function createEditCommand(deps: EditCommandDeps): Command {
return;
}
const validResources = ['servers', 'secrets', 'projects', 'groups', 'rbac'];
if (!validResources.includes(resource)) {
log(`Error: unknown resource type '${resourceArg}'`);
process.exitCode = 1;

@@ -21,7 +21,9 @@ interface ProjectRow {
id: string;
name: string;
description: string;
proxyMode: string;
ownerId: string;
servers?: Array<{ server: { name: string } }>;
}
interface SecretRow {
@@ -42,6 +44,7 @@ interface TemplateRow {
interface InstanceRow {
id: string;
serverId: string;
server?: { name: string };
status: string;
containerId: string | null;
port: number | null;
@@ -56,10 +59,60 @@ const serverColumns: Column<ServerRow>[] = [
{ header: 'ID', key: 'id' },
];
interface UserRow {
id: string;
email: string;
name: string | null;
provider: string | null;
}
interface GroupRow {
id: string;
name: string;
description: string;
members?: Array<{ user: { email: string } }>;
}
interface RbacRow {
id: string;
name: string;
subjects: Array<{ kind: string; name: string }>;
roleBindings: Array<{ role: string; resource?: string; action?: string; name?: string }>;
}
const projectColumns: Column<ProjectRow>[] = [
{ header: 'NAME', key: 'name' },
{ header: 'MODE', key: (r) => r.proxyMode ?? 'direct', width: 10 },
{ header: 'SERVERS', key: (r) => r.servers ? String(r.servers.length) : '0', width: 8 },
{ header: 'DESCRIPTION', key: 'description', width: 30 },
{ header: 'ID', key: 'id' },
];
const userColumns: Column<UserRow>[] = [
{ header: 'EMAIL', key: 'email' },
{ header: 'NAME', key: (r) => r.name ?? '-' },
{ header: 'PROVIDER', key: (r) => r.provider ?? 'local', width: 10 },
{ header: 'ID', key: 'id' },
];
const groupColumns: Column<GroupRow>[] = [
{ header: 'NAME', key: 'name' },
{ header: 'MEMBERS', key: (r) => r.members ? String(r.members.length) : '0', width: 8 },
{ header: 'DESCRIPTION', key: 'description', width: 40 },
{ header: 'OWNER', key: 'ownerId' },
{ header: 'ID', key: 'id' },
];
const rbacColumns: Column<RbacRow>[] = [
{ header: 'NAME', key: 'name' },
{ header: 'SUBJECTS', key: (r) => r.subjects.map((s) => `${s.kind}:${s.name}`).join(', '), width: 30 },
{ header: 'BINDINGS', key: (r) => r.roleBindings.map((b) => {
if ('action' in b && b.action !== undefined) return `run>${b.action}`;
if ('resource' in b && b.resource !== undefined) {
const base = `${b.role}:${b.resource}`;
return b.name ? `${base}:${b.name}` : base;
}
return b.role;
}).join(', '), width: 40 },
{ header: 'ID', key: 'id' },
];
@@ -78,9 +131,9 @@ const templateColumns: Column<TemplateRow>[] = [
];
const instanceColumns: Column<InstanceRow>[] = [
{ header: 'NAME', key: (r) => r.server?.name ?? '-', width: 20 },
{ header: 'STATUS', key: 'status', width: 10 },
{ header: 'HEALTH', key: (r) => r.healthStatus ?? '-', width: 10 },
{ header: 'SERVER ID', key: 'serverId' },
{ header: 'PORT', key: (r) => r.port != null ? String(r.port) : '-', width: 6 },
{ header: 'CONTAINER', key: (r) => r.containerId ? r.containerId.slice(0, 12) : '-', width: 14 },
{ header: 'ID', key: 'id' },
@@ -98,6 +151,12 @@ function getColumnsForResource(resource: string): Column<Record<string, unknown>
return templateColumns as unknown as Column<Record<string, unknown>>[];
case 'instances':
return instanceColumns as unknown as Column<Record<string, unknown>>[];
case 'users':
return userColumns as unknown as Column<Record<string, unknown>>[];
case 'groups':
return groupColumns as unknown as Column<Record<string, unknown>>[];
case 'rbac':
return rbacColumns as unknown as Column<Record<string, unknown>>[];
default:
return [
{ header: 'ID', key: 'id' as keyof Record<string, unknown> },
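The column definitions above all hang off the same `Column<T>` contract: `key` is either a property name or a formatter function, with an optional fixed width. The shape and the renderer below are assumptions for illustration (the real table helper is not shown in this diff):

```typescript
// Assumed shape of the table helper's Column type.
interface Column<T> {
  header: string;
  key: keyof T | ((row: T) => string);
  width?: number;
}

// A cell is either a formatted value or the raw property, '-' when absent.
function cell<T>(row: T, col: Column<T>): string {
  return typeof col.key === 'function' ? col.key(row) : String(row[col.key] ?? '-');
}

// Minimal renderer: pad each cell to the fixed width, or to the widest value.
function renderTable<T>(rows: T[], columns: Column<T>[]): string[] {
  const widths = columns.map((c) =>
    c.width ?? Math.max(c.header.length, ...rows.map((r) => cell(r, c).length)) + 2,
  );
  const line = (cells: string[]) => cells.map((v, i) => v.padEnd(widths[i]!)).join('').trimEnd();
  return [line(columns.map((c) => c.header)), ...rows.map((r) => line(columns.map((c) => cell(r, c))))];
}
```

The formatter variant is what lets `groupColumns` and `rbacColumns` derive display values (member counts, `Kind:name` pairs) without storing them on the row.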

@@ -6,15 +6,84 @@ export interface LogsCommandDeps {
log: (...args: unknown[]) => void;
}
interface InstanceInfo {
id: string;
status: string;
containerId: string | null;
}
/**
* Resolve a name/ID to an instance ID.
* Accepts: instance ID, server name, or server ID.
* For servers with multiple replicas, picks by --instance index or first RUNNING.
*/
async function resolveInstance(
client: ApiClient,
nameOrId: string,
instanceIndex?: number,
): Promise<{ instanceId: string; serverName?: string; replicaInfo?: string }> {
// Try as instance ID first
try {
await client.get(`/api/v1/instances/${nameOrId}`);
return { instanceId: nameOrId };
} catch {
// Not a valid instance ID
}
// Try as server name/ID → find its instances
const servers = await client.get<Array<{ id: string; name: string }>>('/api/v1/servers');
const server = servers.find((s) => s.name === nameOrId || s.id === nameOrId);
if (!server) {
throw new Error(`Instance or server '${nameOrId}' not found`);
}
const instances = await client.get<InstanceInfo[]>(`/api/v1/instances?serverId=${server.id}`);
if (instances.length === 0) {
throw new Error(`No instances found for server '${server.name}'`);
}
// Select by index or pick first running
let selected: InstanceInfo | undefined;
if (instanceIndex !== undefined) {
if (instanceIndex < 0 || instanceIndex >= instances.length) {
throw new Error(`Instance index ${instanceIndex} out of range (server '${server.name}' has ${instances.length} instance${instances.length > 1 ? 's' : ''})`);
}
selected = instances[instanceIndex];
} else {
selected = instances.find((i) => i.status === 'RUNNING') ?? instances[0];
}
if (!selected) {
throw new Error(`No instances found for server '${server.name}'`);
}
const result: { instanceId: string; serverName?: string; replicaInfo?: string } = {
instanceId: selected.id,
serverName: server.name,
};
if (instances.length > 1) {
result.replicaInfo = `instance ${instances.indexOf(selected) + 1}/${instances.length}`;
}
return result;
}
export function createLogsCommand(deps: LogsCommandDeps): Command {
const { client, log } = deps;
return new Command('logs')
.description('Get logs from an MCP server instance')
.argument('<name>', 'Server name, server ID, or instance ID')
.option('-t, --tail <lines>', 'Number of lines to show')
.option('-i, --instance <index>', 'Instance/replica index (0-based, for servers with multiple replicas)')
.action(async (nameOrId: string, opts: { tail?: string; instance?: string }) => {
const instanceIndex = opts.instance !== undefined ? parseInt(opts.instance, 10) : undefined;
const { instanceId, serverName, replicaInfo } = await resolveInstance(client, nameOrId, instanceIndex);
if (replicaInfo) {
process.stderr.write(`Showing logs for ${serverName} (${replicaInfo})\n`);
}
let url = `/api/v1/instances/${instanceId}/logs`;
if (opts.tail) {
url += `?tail=${opts.tail}`;
}
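The replica-selection rule inside `resolveInstance` is small enough to isolate: an explicit `--instance` index wins (with bounds checking), otherwise the first `RUNNING` instance is preferred, falling back to the first instance of any status. A self-contained sketch of just that selection step:

```typescript
interface InstanceInfo { id: string; status: string }

// Selection rule used by `mcpctl logs`: explicit index > first RUNNING > first.
function selectInstance(instances: InstanceInfo[], index?: number): InstanceInfo {
  if (instances.length === 0) throw new Error('no instances');
  if (index !== undefined) {
    if (index < 0 || index >= instances.length) {
      throw new Error(`index ${index} out of range (0..${instances.length - 1})`);
    }
    return instances[index]!;
  }
  return instances.find((i) => i.status === 'RUNNING') ?? instances[0]!;
}
```

Note the `??` fallback: a server whose replicas are all stopped still resolves, so `logs` can show why they stopped.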

@@ -0,0 +1,47 @@
import { Command } from 'commander';
import type { ApiClient } from '../api-client.js';
import { resolveNameOrId } from './shared.js';
export interface ProjectOpsDeps {
client: ApiClient;
log: (...args: string[]) => void;
getProject: () => string | undefined;
}
function requireProject(deps: ProjectOpsDeps): string {
const project = deps.getProject();
if (!project) {
deps.log('Error: --project <name> is required for this command.');
process.exitCode = 1;
throw new Error('--project required');
}
return project;
}
export function createAttachServerCommand(deps: ProjectOpsDeps): Command {
const { client, log } = deps;
return new Command('attach-server')
.description('Attach a server to a project (requires --project)')
.argument('<server-name>', 'Server name to attach')
.action(async (serverName: string) => {
const projectName = requireProject(deps);
const projectId = await resolveNameOrId(client, 'projects', projectName);
await client.post(`/api/v1/projects/${projectId}/servers`, { server: serverName });
log(`server '${serverName}' attached to project '${projectName}'`);
});
}
export function createDetachServerCommand(deps: ProjectOpsDeps): Command {
const { client, log } = deps;
return new Command('detach-server')
.description('Detach a server from a project (requires --project)')
.argument('<server-name>', 'Server name to detach')
.action(async (serverName: string) => {
const projectName = requireProject(deps);
const projectId = await resolveNameOrId(client, 'projects', projectName);
await client.delete(`/api/v1/projects/${projectId}/servers/${serverName}`);
log(`server '${serverName}' detached from project '${projectName}'`);
});
}
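`attach-server` and `detach-server` read the global `--project` flag lazily through the `getProject` closure, so the value is resolved when the action runs rather than when the command tree is built. A reduced sketch of the guard (stripped of the `process.exitCode` side effect for testability):

```typescript
// Reduced version of ProjectOpsDeps: only the lazy project accessor matters here.
interface ProjectOpsDeps { getProject: () => string | undefined }

// Fail fast when a project-scoped command runs without --project.
function requireProject(deps: ProjectOpsDeps): string {
  const project = deps.getProject();
  if (!project) throw new Error('--project required');
  return project;
}
```

Passing a closure instead of the value is what makes a single global `--project` option work for commands registered before argument parsing happens.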

@@ -1,15 +0,0 @@
import { Command } from 'commander';
import type { ApiClient } from '../api-client.js';
export interface ProjectCommandDeps {
client: ApiClient;
log: (...args: unknown[]) => void;
}
export function createProjectCommand(_deps: ProjectCommandDeps): Command {
const cmd = new Command('project')
.alias('proj')
.description('Project-specific actions (create with "create project", list with "get projects")');
return cmd;
}

@@ -11,6 +11,11 @@ export const RESOURCE_ALIASES: Record<string, string> = {
sec: 'secrets',
template: 'templates',
tpl: 'templates',
user: 'users',
group: 'groups',
rbac: 'rbac',
'rbac-definition': 'rbac',
'rbac-binding': 'rbac',
};
export function resolveResource(name: string): string {
@@ -28,9 +33,23 @@ export async function resolveNameOrId(
if (/^c[a-z0-9]{24}/.test(nameOrId)) {
return nameOrId;
}
// Users resolve by email, not name
if (resource === 'users') {
const items = await client.get<Array<{ id: string; email: string }>>(`/api/v1/${resource}`);
const match = items.find((item) => item.email === nameOrId);
if (match) return match.id;
throw new Error(`user '${nameOrId}' not found`);
}
const items = await client.get<Array<Record<string, unknown>>>(`/api/v1/${resource}`);
const match = items.find((item) => {
// Instances use server.name, other resources use name directly
if (resource === 'instances') {
const server = item.server as { name?: string } | undefined;
return server?.name === nameOrId;
}
return item.name === nameOrId;
});
if (match) return match.id as string;
throw new Error(`${resource.replace(/s$/, '')} '${nameOrId}' not found`);
}
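The resolution rule in `resolveNameOrId` condenses to: anything that looks like a cuid (`c` followed by 24 base36 characters) passes through as an ID; otherwise resources match on `name`, users on `email`, and instances on their parent server's name. A sketch of that rule over an already-fetched item list (the fetch itself is elided):

```typescript
// Per-resource match key, mirroring the branches in resolveNameOrId.
function matchKey(resource: string, item: Record<string, unknown>): string | undefined {
  if (resource === 'users') return item.email as string | undefined;
  if (resource === 'instances') return (item.server as { name?: string } | undefined)?.name;
  return item.name as string | undefined;
}

// cuid-looking inputs pass through untouched; names are resolved by lookup.
function resolve(resource: string, nameOrId: string, items: Array<Record<string, unknown>>): string {
  if (/^c[a-z0-9]{24}/.test(nameOrId)) return nameOrId;
  const match = items.find((item) => matchKey(resource, item) === nameOrId);
  if (match) return match.id as string;
  throw new Error(`${resource.replace(/s$/, '')} '${nameOrId}' not found`);
}
```

The cuid short-circuit means an ID never costs a list request, at the price of treating any cuid-shaped name as an ID.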

@@ -10,11 +10,10 @@ import { createLogsCommand } from './commands/logs.js';
import { createApplyCommand } from './commands/apply.js';
import { createCreateCommand } from './commands/create.js';
import { createEditCommand } from './commands/edit.js';
import { createClaudeCommand } from './commands/claude.js';
import { createProjectCommand } from './commands/project.js';
import { createBackupCommand, createRestoreCommand } from './commands/backup.js';
import { createLoginCommand, createLogoutCommand } from './commands/auth.js';
import { createAttachServerCommand, createDetachServerCommand } from './commands/project-ops.js';
import { ApiClient, ApiError } from './api-client.js';
import { loadConfig } from './config/index.js';
import { loadCredentials } from './auth/index.js';
import { resolveNameOrId } from './commands/shared.js';
@@ -26,9 +25,9 @@ export function createProgram(): Command {
.version(APP_VERSION, '-v, --version')
.enablePositionalOptions()
.option('--daemon-url <url>', 'mcplocal daemon URL')
.option('--direct', 'bypass mcplocal and connect directly to mcpd')
.option('--project <name>', 'Target project for project commands');
program.addCommand(createConfigCommand());
program.addCommand(createStatusCommand());
program.addCommand(createLoginCommand());
program.addCommand(createLogoutCommand());
@@ -48,7 +47,28 @@ export function createProgram(): Command {
const client = new ApiClient({ baseUrl, token: creds?.token ?? undefined });
program.addCommand(createConfigCommand(undefined, {
client,
credentialsDeps: {},
log: (...args) => console.log(...args),
}));
const fetchResource = async (resource: string, nameOrId?: string): Promise<unknown[]> => {
const projectName = program.opts().project as string | undefined;
// --project scoping for servers and instances
if (projectName && !nameOrId && (resource === 'servers' || resource === 'instances')) {
const projectId = await resolveNameOrId(client, 'projects', projectName);
if (resource === 'servers') {
return client.get<unknown[]>(`/api/v1/projects/${projectId}/servers`);
}
// instances: fetch project servers, then filter instances by serverId
const projectServers = await client.get<Array<{ id: string }>>(`/api/v1/projects/${projectId}/servers`);
const serverIds = new Set(projectServers.map((s) => s.id));
const allInstances = await client.get<Array<{ serverId: string }>>(`/api/v1/instances`);
return allInstances.filter((inst) => serverIds.has(inst.serverId));
}
if (nameOrId) {
// Glob pattern — use query param filtering
if (nameOrId.includes('*')) {
@@ -113,16 +133,6 @@ export function createProgram(): Command {
log: (...args) => console.log(...args),
}));
program.addCommand(createClaudeCommand({
client,
log: (...args) => console.log(...args),
}));
program.addCommand(createProjectCommand({
client,
log: (...args) => console.log(...args),
}));
program.addCommand(createBackupCommand({
client,
log: (...args) => console.log(...args),
@@ -133,6 +143,14 @@ export function createProgram(): Command {
log: (...args) => console.log(...args),
}));
const projectOpsDeps = {
client,
log: (...args: string[]) => console.log(...args),
getProject: () => program.opts().project as string | undefined,
};
program.addCommand(createAttachServerCommand(projectOpsDeps));
program.addCommand(createDetachServerCommand(projectOpsDeps));
return program;
}
@@ -143,5 +161,35 @@ const isDirectRun =
import.meta.url === `file://${process.argv[1]}`;
if (isDirectRun) {
createProgram().parseAsync(process.argv).catch((err: unknown) => {
if (err instanceof ApiError) {
if (err.status === 401) {
console.error("Error: you need to log in. Run 'mcpctl login' to authenticate.");
} else if (err.status === 403) {
console.error('Error: permission denied. You do not have access to this resource.');
} else {
let msg: string;
try {
const parsed = JSON.parse(err.body) as { error?: string; message?: string; details?: unknown };
msg = parsed.error ?? parsed.message ?? err.body;
if (parsed.details && Array.isArray(parsed.details)) {
const issues = parsed.details as Array<{ message?: string; path?: string[] }>;
const detail = issues.map((i) => {
const path = i.path?.join('.') ?? '';
return path ? `${path}: ${i.message}` : (i.message ?? '');
}).filter(Boolean).join('; ');
if (detail) msg += `: ${detail}`;
}
} catch {
msg = err.body;
}
console.error(`Error: ${msg}`);
}
} else if (err instanceof Error) {
console.error(`Error: ${err.message}`);
} else {
console.error(`Error: ${String(err)}`);
}
process.exit(1);
});
}
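The catch handler above condenses a structured API error body into a one-line message. Extracted as a standalone sketch (the `{ error, message, details }` shape is taken from the handler above; `formatApiErrorBody` is an illustrative name, not an exported function):

```typescript
// Condense an API error body into one line: prefer `error`, then `message`,
// then the raw body; append zod-style issue details as "path: message" pairs.
function formatApiErrorBody(body: string): string {
  let msg: string;
  try {
    const parsed = JSON.parse(body) as { error?: string; message?: string; details?: unknown };
    msg = parsed.error ?? parsed.message ?? body;
    if (Array.isArray(parsed.details)) {
      const issues = parsed.details as Array<{ message?: string; path?: string[] }>;
      const detail = issues
        .map((i) => {
          const path = i.path?.join('.') ?? '';
          return path ? `${path}: ${i.message}` : (i.message ?? '');
        })
        .filter(Boolean)
        .join('; ');
      if (detail) msg += `: ${detail}`;
    }
  } catch {
    msg = body; // body was not JSON — surface it verbatim
  }
  return msg;
}

console.log(formatApiErrorBody('{"error":"Validation failed","details":[{"path":["name"],"message":"Required"}]}'));
// → Validation failed: name: Required
```

Non-JSON bodies fall through the `catch` untouched, so plain-text server errors still print as-is.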


@@ -159,4 +159,347 @@ projects:
rmSync(tmpDir, { recursive: true, force: true });
});
it('applies users (no role field)', async () => {
const configPath = join(tmpDir, 'config.yaml');
writeFileSync(configPath, `
users:
- email: alice@test.com
password: password123
name: Alice
`);
const cmd = createApplyCommand({ client, log });
await cmd.parseAsync([configPath], { from: 'user' });
const callBody = vi.mocked(client.post).mock.calls[0]![1] as Record<string, unknown>;
expect(callBody).toEqual(expect.objectContaining({
email: 'alice@test.com',
password: 'password123',
name: 'Alice',
}));
expect(callBody).not.toHaveProperty('role');
expect(output.join('\n')).toContain('Created user: alice@test.com');
rmSync(tmpDir, { recursive: true, force: true });
});
it('updates existing users matched by email', async () => {
vi.mocked(client.get).mockImplementation(async (url: string) => {
if (url === '/api/v1/users') return [{ id: 'usr-1', email: 'alice@test.com' }];
return [];
});
const configPath = join(tmpDir, 'config.yaml');
writeFileSync(configPath, `
users:
- email: alice@test.com
password: newpassword
name: Alice Updated
`);
const cmd = createApplyCommand({ client, log });
await cmd.parseAsync([configPath], { from: 'user' });
expect(client.put).toHaveBeenCalledWith('/api/v1/users/usr-1', expect.objectContaining({
email: 'alice@test.com',
name: 'Alice Updated',
}));
expect(output.join('\n')).toContain('Updated user: alice@test.com');
rmSync(tmpDir, { recursive: true, force: true });
});
it('applies groups', async () => {
const configPath = join(tmpDir, 'config.yaml');
writeFileSync(configPath, `
groups:
- name: dev-team
description: Development team
members:
- alice@test.com
- bob@test.com
`);
const cmd = createApplyCommand({ client, log });
await cmd.parseAsync([configPath], { from: 'user' });
expect(client.post).toHaveBeenCalledWith('/api/v1/groups', expect.objectContaining({
name: 'dev-team',
description: 'Development team',
members: ['alice@test.com', 'bob@test.com'],
}));
expect(output.join('\n')).toContain('Created group: dev-team');
rmSync(tmpDir, { recursive: true, force: true });
});
it('updates existing groups', async () => {
vi.mocked(client.get).mockImplementation(async (url: string) => {
if (url === '/api/v1/groups') return [{ id: 'grp-1', name: 'dev-team' }];
return [];
});
const configPath = join(tmpDir, 'config.yaml');
writeFileSync(configPath, `
groups:
- name: dev-team
description: Updated devs
members:
- new@test.com
`);
const cmd = createApplyCommand({ client, log });
await cmd.parseAsync([configPath], { from: 'user' });
expect(client.put).toHaveBeenCalledWith('/api/v1/groups/grp-1', expect.objectContaining({
name: 'dev-team',
description: 'Updated devs',
}));
expect(output.join('\n')).toContain('Updated group: dev-team');
rmSync(tmpDir, { recursive: true, force: true });
});
it('applies rbacBindings', async () => {
const configPath = join(tmpDir, 'config.yaml');
writeFileSync(configPath, `
rbac:
- name: developers
subjects:
- kind: User
name: alice@test.com
- kind: Group
name: dev-team
roleBindings:
- role: edit
resource: servers
- role: view
resource: instances
`);
const cmd = createApplyCommand({ client, log });
await cmd.parseAsync([configPath], { from: 'user' });
expect(client.post).toHaveBeenCalledWith('/api/v1/rbac', expect.objectContaining({
name: 'developers',
subjects: [
{ kind: 'User', name: 'alice@test.com' },
{ kind: 'Group', name: 'dev-team' },
],
roleBindings: [
{ role: 'edit', resource: 'servers' },
{ role: 'view', resource: 'instances' },
],
}));
expect(output.join('\n')).toContain('Created rbacBinding: developers');
rmSync(tmpDir, { recursive: true, force: true });
});
it('updates existing rbacBindings', async () => {
vi.mocked(client.get).mockImplementation(async (url: string) => {
if (url === '/api/v1/rbac') return [{ id: 'rbac-1', name: 'developers' }];
return [];
});
const configPath = join(tmpDir, 'config.yaml');
writeFileSync(configPath, `
rbacBindings:
- name: developers
subjects:
- kind: User
name: new@test.com
roleBindings:
- role: edit
resource: "*"
`);
const cmd = createApplyCommand({ client, log });
await cmd.parseAsync([configPath], { from: 'user' });
expect(client.put).toHaveBeenCalledWith('/api/v1/rbac/rbac-1', expect.objectContaining({
name: 'developers',
}));
expect(output.join('\n')).toContain('Updated rbacBinding: developers');
rmSync(tmpDir, { recursive: true, force: true });
});
it('applies projects with servers', async () => {
const configPath = join(tmpDir, 'config.yaml');
writeFileSync(configPath, `
projects:
- name: smart-home
description: Home automation
proxyMode: filtered
llmProvider: gemini-cli
llmModel: gemini-2.0-flash
servers:
- my-grafana
- my-ha
`);
const cmd = createApplyCommand({ client, log });
await cmd.parseAsync([configPath], { from: 'user' });
expect(client.post).toHaveBeenCalledWith('/api/v1/projects', expect.objectContaining({
name: 'smart-home',
proxyMode: 'filtered',
llmProvider: 'gemini-cli',
llmModel: 'gemini-2.0-flash',
servers: ['my-grafana', 'my-ha'],
}));
expect(output.join('\n')).toContain('Created project: smart-home');
rmSync(tmpDir, { recursive: true, force: true });
});
it('dry-run shows all new resource types', async () => {
const configPath = join(tmpDir, 'config.yaml');
writeFileSync(configPath, `
secrets:
- name: creds
data:
TOKEN: abc
users:
- email: alice@test.com
password: password123
groups:
- name: dev-team
members: []
projects:
- name: my-proj
description: A project
rbacBindings:
- name: admins
subjects:
- kind: User
name: admin@test.com
roleBindings:
- role: edit
resource: "*"
`);
const cmd = createApplyCommand({ client, log });
await cmd.parseAsync([configPath, '--dry-run'], { from: 'user' });
expect(client.post).not.toHaveBeenCalled();
const text = output.join('\n');
expect(text).toContain('Dry run');
expect(text).toContain('1 secret(s)');
expect(text).toContain('1 user(s)');
expect(text).toContain('1 group(s)');
expect(text).toContain('1 project(s)');
expect(text).toContain('1 rbacBinding(s)');
rmSync(tmpDir, { recursive: true, force: true });
});
it('applies resources in correct order', async () => {
const callOrder: string[] = [];
vi.mocked(client.post).mockImplementation(async (url: string) => {
callOrder.push(url);
return { id: 'new-id', name: 'test' };
});
const configPath = join(tmpDir, 'config.yaml');
writeFileSync(configPath, `
rbacBindings:
- name: admins
subjects:
- kind: User
name: admin@test.com
roleBindings:
- role: edit
resource: "*"
users:
- email: admin@test.com
password: password123
secrets:
- name: creds
data:
KEY: val
groups:
- name: dev-team
servers:
- name: my-server
transport: STDIO
projects:
- name: my-proj
`);
const cmd = createApplyCommand({ client, log });
await cmd.parseAsync([configPath], { from: 'user' });
// Apply order: secrets → servers → users → groups → projects → templates → rbacBindings
expect(callOrder[0]).toBe('/api/v1/secrets');
expect(callOrder[1]).toBe('/api/v1/servers');
expect(callOrder[2]).toBe('/api/v1/users');
expect(callOrder[3]).toBe('/api/v1/groups');
expect(callOrder[4]).toBe('/api/v1/projects');
expect(callOrder[5]).toBe('/api/v1/rbac');
rmSync(tmpDir, { recursive: true, force: true });
});
it('applies rbac with operation bindings', async () => {
const configPath = join(tmpDir, 'config.yaml');
writeFileSync(configPath, `
rbac:
- name: ops-team
subjects:
- kind: Group
name: ops
roleBindings:
- role: edit
resource: servers
- role: run
action: backup
- role: run
action: logs
`);
const cmd = createApplyCommand({ client, log });
await cmd.parseAsync([configPath], { from: 'user' });
expect(client.post).toHaveBeenCalledWith('/api/v1/rbac', expect.objectContaining({
name: 'ops-team',
roleBindings: [
{ role: 'edit', resource: 'servers' },
{ role: 'run', action: 'backup' },
{ role: 'run', action: 'logs' },
],
}));
expect(output.join('\n')).toContain('Created rbacBinding: ops-team');
rmSync(tmpDir, { recursive: true, force: true });
});
it('applies rbac with name-scoped resource binding', async () => {
const configPath = join(tmpDir, 'config.yaml');
writeFileSync(configPath, `
rbac:
- name: ha-viewer
subjects:
- kind: User
name: alice@test.com
roleBindings:
- role: view
resource: servers
name: my-ha
`);
const cmd = createApplyCommand({ client, log });
await cmd.parseAsync([configPath], { from: 'user' });
expect(client.post).toHaveBeenCalledWith('/api/v1/rbac', expect.objectContaining({
name: 'ha-viewer',
roleBindings: [
{ role: 'view', resource: 'servers', name: 'my-ha' },
],
}));
rmSync(tmpDir, { recursive: true, force: true });
});
});
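The ordering test above pins down why `apply` walks resources in a fixed sequence: resources that others reference (secrets used by servers' secretRefs, users listed in groups and rbac subjects) must exist first. A minimal sketch of that ordering — the endpoint list matches what the tests assert; `sortByApplyOrder` is a hypothetical helper, not the command's code:

```typescript
// Fixed apply order: dependencies before dependents.
const APPLY_ORDER = [
  '/api/v1/secrets',
  '/api/v1/servers',
  '/api/v1/users',
  '/api/v1/groups',
  '/api/v1/projects',
  '/api/v1/rbac',
] as const;

// Sort a set of endpoint paths into apply order; unknown paths sort last.
function sortByApplyOrder(urls: string[]): string[] {
  const rank = (u: string): number => {
    const i = (APPLY_ORDER as readonly string[]).indexOf(u);
    return i === -1 ? APPLY_ORDER.length : i;
  };
  return [...urls].sort((a, b) => rank(a) - rank(b));
}

sortByApplyOrder(['/api/v1/rbac', '/api/v1/secrets', '/api/v1/users']);
// secrets, users, rbac — dependencies first
```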


@@ -37,6 +37,8 @@ describe('login command', () => {
user: { email },
}),
logoutRequest: async () => {},
statusRequest: async () => ({ hasUsers: true }),
bootstrapRequest: async () => ({ token: '', user: { email: '' } }),
});
await cmd.parseAsync([], { from: 'user' });
expect(output[0]).toContain('Logged in as alice@test.com');
@@ -58,6 +60,8 @@ describe('login command', () => {
log,
loginRequest: async () => { throw new Error('Invalid credentials'); },
logoutRequest: async () => {},
statusRequest: async () => ({ hasUsers: true }),
bootstrapRequest: async () => ({ token: '', user: { email: '' } }),
});
await cmd.parseAsync([], { from: 'user' });
expect(output[0]).toContain('Login failed');
@@ -83,6 +87,8 @@ describe('login command', () => {
return { token: 'tok', user: { email } };
},
logoutRequest: async () => {},
statusRequest: async () => ({ hasUsers: true }),
bootstrapRequest: async () => ({ token: '', user: { email: '' } }),
});
await cmd.parseAsync([], { from: 'user' });
expect(capturedUrl).toBe('http://custom:3100');
@@ -103,12 +109,74 @@ describe('login command', () => {
return { token: 'tok', user: { email } };
},
logoutRequest: async () => {},
statusRequest: async () => ({ hasUsers: true }),
bootstrapRequest: async () => ({ token: '', user: { email: '' } }),
});
await cmd.parseAsync(['--mcpd-url', 'http://override:3100'], { from: 'user' });
expect(capturedUrl).toBe('http://override:3100');
});
});
describe('login bootstrap flow', () => {
it('bootstraps first admin when no users exist', async () => {
let bootstrapCalled = false;
const cmd = createLoginCommand({
configDeps: { configDir: tempDir },
credentialsDeps: { configDir: tempDir },
prompt: {
input: async (msg) => {
if (msg.includes('Name')) return 'Admin User';
return 'admin@test.com';
},
password: async () => 'admin-pass',
},
log,
loginRequest: async () => ({ token: '', user: { email: '' } }),
logoutRequest: async () => {},
statusRequest: async () => ({ hasUsers: false }),
bootstrapRequest: async (_url, email, _password) => {
bootstrapCalled = true;
return { token: 'admin-token', user: { email } };
},
});
await cmd.parseAsync([], { from: 'user' });
expect(bootstrapCalled).toBe(true);
expect(output.join('\n')).toContain('No users configured');
expect(output.join('\n')).toContain('admin@test.com');
expect(output.join('\n')).toContain('admin');
const creds = loadCredentials({ configDir: tempDir });
expect(creds).not.toBeNull();
expect(creds!.token).toBe('admin-token');
expect(creds!.user).toBe('admin@test.com');
});
it('falls back to normal login when users exist', async () => {
let loginCalled = false;
const cmd = createLoginCommand({
configDeps: { configDir: tempDir },
credentialsDeps: { configDir: tempDir },
prompt: {
input: async () => 'alice@test.com',
password: async () => 'secret',
},
log,
loginRequest: async (_url, email) => {
loginCalled = true;
return { token: 'session-tok', user: { email } };
},
logoutRequest: async () => {},
statusRequest: async () => ({ hasUsers: true }),
bootstrapRequest: async () => { throw new Error('Should not be called'); },
});
await cmd.parseAsync([], { from: 'user' });
expect(loginCalled).toBe(true);
expect(output.join('\n')).not.toContain('No users configured');
});
});
describe('logout command', () => {
it('removes credentials on logout', async () => {
saveCredentials({ token: 'tok', mcpdUrl: 'http://x:3100', user: 'alice' }, { configDir: tempDir });
@@ -120,6 +188,8 @@ describe('logout command', () => {
log,
loginRequest: async () => ({ token: '', user: { email: '' } }),
logoutRequest: async () => { logoutCalled = true; },
statusRequest: async () => ({ hasUsers: true }),
bootstrapRequest: async () => ({ token: '', user: { email: '' } }),
});
await cmd.parseAsync([], { from: 'user' });
expect(output[0]).toContain('Logged out successfully');
@@ -137,6 +207,8 @@ describe('logout command', () => {
log,
loginRequest: async () => ({ token: '', user: { email: '' } }),
logoutRequest: async () => {},
statusRequest: async () => ({ hasUsers: true }),
bootstrapRequest: async () => ({ token: '', user: { email: '' } }),
});
await cmd.parseAsync([], { from: 'user' });
expect(output[0]).toContain('Not logged in');


@@ -1,9 +1,10 @@
import { describe, it, expect, vi, beforeEach, afterEach } from 'vitest';
import { writeFileSync, readFileSync, mkdtempSync, rmSync } from 'node:fs';
import { join } from 'node:path';
import { tmpdir } from 'node:os';
import { createClaudeCommand } from '../../src/commands/claude.js';
import { createConfigCommand } from '../../src/commands/config.js';
import type { ApiClient } from '../../src/api-client.js';
import { saveCredentials, loadCredentials } from '../../src/auth/index.js';
function mockClient(): ApiClient {
return {
@@ -13,44 +14,50 @@ function mockClient(): ApiClient {
'github--default': { command: 'npx', args: ['-y', '@anthropic/github-mcp'] },
},
})),
post: vi.fn(async () => ({ token: 'impersonated-tok', user: { email: 'other@test.com' } })),
put: vi.fn(async () => ({})),
delete: vi.fn(async () => {}),
} as unknown as ApiClient;
}
describe('config claude-generate', () => {
let client: ReturnType<typeof mockClient>;
let output: string[];
let tmpDir: string;
const log = (...args: string[]) => output.push(args.join(' '));
beforeEach(() => {
client = mockClient();
output = [];
tmpDir = mkdtempSync(join(tmpdir(), 'mcpctl-config-claude-'));
});
afterEach(() => {
rmSync(tmpDir, { recursive: true, force: true });
});
describe('generate', () => {
it('generates .mcp.json from project config', async () => {
const outPath = join(tmpDir, '.mcp.json');
const cmd = createConfigCommand(
{ configDeps: { configDir: tmpDir }, log },
{ client, credentialsDeps: { configDir: tmpDir }, log },
);
await cmd.parseAsync(['claude-generate', '--project', 'proj-1', '-o', outPath], { from: 'user' });
expect(client.get).toHaveBeenCalledWith('/api/v1/projects/proj-1/mcp-config');
const written = JSON.parse(readFileSync(outPath, 'utf-8'));
expect(written.mcpServers['slack--default']).toBeDefined();
expect(output.join('\n')).toContain('2 server(s)');
rmSync(tmpDir, { recursive: true, force: true });
});
it('prints to stdout with --stdout', async () => {
const cmd = createConfigCommand(
{ configDeps: { configDir: tmpDir }, log },
{ client, credentialsDeps: { configDir: tmpDir }, log },
);
await cmd.parseAsync(['claude-generate', '--project', 'proj-1', '--stdout'], { from: 'user' });
expect(output[0]).toContain('mcpServers');
rmSync(tmpDir, { recursive: true, force: true });
});
it('merges with existing .mcp.json', async () => {
@@ -59,100 +66,94 @@ describe('claude command', () => {
mcpServers: { 'existing--server': { command: 'echo', args: [] } },
}));
const cmd = createConfigCommand(
{ configDeps: { configDir: tmpDir }, log },
{ client, credentialsDeps: { configDir: tmpDir }, log },
);
await cmd.parseAsync(['claude-generate', '--project', 'proj-1', '-o', outPath, '--merge'], { from: 'user' });
const written = JSON.parse(readFileSync(outPath, 'utf-8'));
expect(written.mcpServers['existing--server']).toBeDefined();
expect(written.mcpServers['slack--default']).toBeDefined();
expect(output.join('\n')).toContain('3 server(s)');
rmSync(tmpDir, { recursive: true, force: true });
});
});
describe('show', () => {
it('shows servers in .mcp.json', () => {
const filePath = join(tmpDir, '.mcp.json');
writeFileSync(filePath, JSON.stringify({
mcpServers: {
'slack': { command: 'npx', args: ['-y', '@anthropic/slack-mcp'], env: { TOKEN: 'x' } },
},
}));
const cmd = createClaudeCommand({ client, log });
cmd.parseAsync(['show', '-p', filePath], { from: 'user' });
expect(output.join('\n')).toContain('slack');
expect(output.join('\n')).toContain('npx -y @anthropic/slack-mcp');
expect(output.join('\n')).toContain('TOKEN');
rmSync(tmpDir, { recursive: true, force: true });
});
it('handles missing file', () => {
const cmd = createClaudeCommand({ client, log });
cmd.parseAsync(['show', '-p', join(tmpDir, 'nonexistent.json')], { from: 'user' });
expect(output.join('\n')).toContain('No .mcp.json found');
rmSync(tmpDir, { recursive: true, force: true });
});
});
describe('add', () => {
it('adds a server entry', () => {
const filePath = join(tmpDir, '.mcp.json');
const cmd = createClaudeCommand({ client, log });
cmd.parseAsync(['add', 'my-server', '-c', 'npx', '-a', '-y', 'my-pkg', '-p', filePath], { from: 'user' });
const written = JSON.parse(readFileSync(filePath, 'utf-8'));
expect(written.mcpServers['my-server']).toEqual({
command: 'npx',
args: ['-y', 'my-pkg'],
});
rmSync(tmpDir, { recursive: true, force: true });
});
it('adds server with env vars', () => {
const filePath = join(tmpDir, '.mcp.json');
const cmd = createClaudeCommand({ client, log });
cmd.parseAsync(['add', 'my-server', '-c', 'node', '-e', 'KEY=val', 'SECRET=abc', '-p', filePath], { from: 'user' });
const written = JSON.parse(readFileSync(filePath, 'utf-8'));
expect(written.mcpServers['my-server'].env).toEqual({ KEY: 'val', SECRET: 'abc' });
rmSync(tmpDir, { recursive: true, force: true });
});
});
describe('remove', () => {
it('removes a server entry', () => {
const filePath = join(tmpDir, '.mcp.json');
writeFileSync(filePath, JSON.stringify({
mcpServers: { 'slack': { command: 'npx', args: [] }, 'github': { command: 'npx', args: [] } },
}));
const cmd = createClaudeCommand({ client, log });
cmd.parseAsync(['remove', 'slack', '-p', filePath], { from: 'user' });
const written = JSON.parse(readFileSync(filePath, 'utf-8'));
expect(written.mcpServers['slack']).toBeUndefined();
expect(written.mcpServers['github']).toBeDefined();
expect(output.join('\n')).toContain("Removed 'slack'");
rmSync(tmpDir, { recursive: true, force: true });
});
it('reports when server not found', () => {
const filePath = join(tmpDir, '.mcp.json');
writeFileSync(filePath, JSON.stringify({ mcpServers: {} }));
const cmd = createClaudeCommand({ client, log });
cmd.parseAsync(['remove', 'nonexistent', '-p', filePath], { from: 'user' });
expect(output.join('\n')).toContain('not found');
rmSync(tmpDir, { recursive: true, force: true });
});
});
});
describe('config impersonate', () => {
let client: ReturnType<typeof mockClient>;
let output: string[];
let tmpDir: string;
const log = (...args: string[]) => output.push(args.join(' '));
beforeEach(() => {
client = mockClient();
output = [];
tmpDir = mkdtempSync(join(tmpdir(), 'mcpctl-config-impersonate-'));
});
afterEach(() => {
rmSync(tmpDir, { recursive: true, force: true });
});
it('impersonates a user and saves backup', async () => {
saveCredentials({ token: 'admin-tok', mcpdUrl: 'http://localhost:3100', user: 'admin@test.com' }, { configDir: tmpDir });
const cmd = createConfigCommand(
{ configDeps: { configDir: tmpDir }, log },
{ client, credentialsDeps: { configDir: tmpDir }, log },
);
await cmd.parseAsync(['impersonate', 'other@test.com'], { from: 'user' });
expect(client.post).toHaveBeenCalledWith('/api/v1/auth/impersonate', { email: 'other@test.com' });
expect(output.join('\n')).toContain('Impersonating other@test.com');
const creds = loadCredentials({ configDir: tmpDir });
expect(creds!.user).toBe('other@test.com');
expect(creds!.token).toBe('impersonated-tok');
// Backup exists
const backup = JSON.parse(readFileSync(join(tmpDir, 'credentials-backup'), 'utf-8'));
expect(backup.user).toBe('admin@test.com');
});
it('quits impersonation and restores backup', async () => {
// Set up current (impersonated) credentials
saveCredentials({ token: 'impersonated-tok', mcpdUrl: 'http://localhost:3100', user: 'other@test.com' }, { configDir: tmpDir });
// Set up backup (original) credentials
writeFileSync(join(tmpDir, 'credentials-backup'), JSON.stringify({
token: 'admin-tok', mcpdUrl: 'http://localhost:3100', user: 'admin@test.com',
}));
const cmd = createConfigCommand(
{ configDeps: { configDir: tmpDir }, log },
{ client, credentialsDeps: { configDir: tmpDir }, log },
);
await cmd.parseAsync(['impersonate', '--quit'], { from: 'user' });
expect(output.join('\n')).toContain('Returned to admin@test.com');
const creds = loadCredentials({ configDir: tmpDir });
expect(creds!.user).toBe('admin@test.com');
expect(creds!.token).toBe('admin-tok');
});
it('errors when not logged in', async () => {
const cmd = createConfigCommand(
{ configDeps: { configDir: tmpDir }, log },
{ client, credentialsDeps: { configDir: tmpDir }, log },
);
await cmd.parseAsync(['impersonate', 'other@test.com'], { from: 'user' });
expect(output.join('\n')).toContain('Not logged in');
});
it('errors when quitting with no backup', async () => {
const cmd = createConfigCommand(
{ configDeps: { configDir: tmpDir }, log },
{ client, credentialsDeps: { configDir: tmpDir }, log },
);
await cmd.parseAsync(['impersonate', '--quit'], { from: 'user' });
expect(output.join('\n')).toContain('No impersonation session to quit');
});
});
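The impersonation tests above revolve around a backup/restore dance: the current (admin) credentials are written to a backup file before switching, and `--quit` restores them. A sketch of that mechanism — the `credentials-backup` file name and the `Creds` shape follow the test fixtures; the function names and the `credentials` file name are illustrative, not the command's actual implementation:

```typescript
import { readFileSync, writeFileSync, existsSync, rmSync } from 'node:fs';
import { join } from 'node:path';

type Creds = { token: string; mcpdUrl: string; user: string };

const backupPath = (configDir: string): string => join(configDir, 'credentials-backup');

// Switch to the impersonated user, keeping the admin session in a backup file.
function startImpersonation(configDir: string, current: Creds, impersonated: Creds): void {
  writeFileSync(backupPath(configDir), JSON.stringify(current));
  writeFileSync(join(configDir, 'credentials'), JSON.stringify(impersonated));
}

// Restore the original session; returns null when no backup exists
// (i.e. "No impersonation session to quit").
function quitImpersonation(configDir: string): Creds | null {
  const p = backupPath(configDir);
  if (!existsSync(p)) return null;
  const original = JSON.parse(readFileSync(p, 'utf-8')) as Creds;
  writeFileSync(join(configDir, 'credentials'), JSON.stringify(original));
  rmSync(p);
  return original;
}
```

Deleting the backup on quit is what makes a second `--quit` fail cleanly instead of silently re-restoring stale credentials.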


@@ -1,6 +1,6 @@
import { describe, it, expect, vi, beforeEach } from 'vitest';
import { createCreateCommand } from '../../src/commands/create.js';
import type { ApiClient } from '../../src/api-client.js';
import { type ApiClient, ApiError } from '../../src/api-client.js';
function mockClient(): ApiClient {
return {
@@ -73,6 +73,59 @@ describe('create command', () => {
transport: 'STDIO',
}));
});
it('strips null values from template when using --from-template', async () => {
vi.mocked(client.get).mockResolvedValueOnce([{
id: 'tpl-1',
name: 'grafana',
version: '1.0.0',
description: 'Grafana MCP',
packageName: '@leval/mcp-grafana',
dockerImage: null,
transport: 'STDIO',
repositoryUrl: 'https://github.com/test',
externalUrl: null,
command: null,
containerPort: null,
replicas: 1,
env: [{ name: 'TOKEN', required: true, description: 'A token' }],
healthCheck: { tool: 'test', arguments: {} },
createdAt: '2025-01-01',
updatedAt: '2025-01-01',
}] as never);
const cmd = createCreateCommand({ client, log });
await cmd.parseAsync([
'server', 'my-grafana', '--from-template=grafana',
'--env', 'TOKEN=secretRef:creds:TOKEN',
], { from: 'user' });
const call = vi.mocked(client.post).mock.calls[0]![1] as Record<string, unknown>;
// null fields from template should NOT be in the body
expect(call).not.toHaveProperty('dockerImage');
expect(call).not.toHaveProperty('externalUrl');
expect(call).not.toHaveProperty('command');
expect(call).not.toHaveProperty('containerPort');
// non-null fields should be present
expect(call.packageName).toBe('@leval/mcp-grafana');
expect(call.healthCheck).toEqual({ tool: 'test', arguments: {} });
expect(call.templateName).toBe('grafana');
});
it('throws on 409 without --force', async () => {
vi.mocked(client.post).mockRejectedValueOnce(new ApiError(409, '{"error":"Server already exists: my-server"}'));
const cmd = createCreateCommand({ client, log });
await expect(cmd.parseAsync(['server', 'my-server'], { from: 'user' })).rejects.toThrow('API error 409');
});
it('updates existing server on 409 with --force', async () => {
vi.mocked(client.post).mockRejectedValueOnce(new ApiError(409, '{"error":"Server already exists"}'));
vi.mocked(client.get).mockResolvedValueOnce([{ id: 'srv-1', name: 'my-server' }] as never);
const cmd = createCreateCommand({ client, log });
await cmd.parseAsync(['server', 'my-server', '--force'], { from: 'user' });
expect(client.put).toHaveBeenCalledWith('/api/v1/servers/srv-1', expect.objectContaining({
transport: 'STDIO',
}));
expect(output.join('\n')).toContain("server 'my-server' updated");
});
});
describe('create secret', () => {
@@ -98,6 +151,21 @@ describe('create command', () => {
data: {},
});
});
it('throws on 409 without --force', async () => {
vi.mocked(client.post).mockRejectedValueOnce(new ApiError(409, '{"error":"Secret already exists: my-creds"}'));
const cmd = createCreateCommand({ client, log });
await expect(cmd.parseAsync(['secret', 'my-creds', '--data', 'KEY=val'], { from: 'user' })).rejects.toThrow('API error 409');
});
it('updates existing secret on 409 with --force', async () => {
vi.mocked(client.post).mockRejectedValueOnce(new ApiError(409, '{"error":"Secret already exists"}'));
vi.mocked(client.get).mockResolvedValueOnce([{ id: 'sec-1', name: 'my-creds' }] as never);
const cmd = createCreateCommand({ client, log });
await cmd.parseAsync(['secret', 'my-creds', '--data', 'KEY=val', '--force'], { from: 'user' });
expect(client.put).toHaveBeenCalledWith('/api/v1/secrets/sec-1', { data: { KEY: 'val' } });
expect(output.join('\n')).toContain("secret 'my-creds' updated");
});
});
describe('create project', () => {
@@ -107,6 +175,7 @@ describe('create command', () => {
expect(client.post).toHaveBeenCalledWith('/api/v1/projects', {
name: 'my-project',
description: 'A test project',
proxyMode: 'direct',
});
expect(output.join('\n')).toContain("project 'test' created");
});
@@ -117,6 +186,264 @@ describe('create command', () => {
expect(client.post).toHaveBeenCalledWith('/api/v1/projects', {
name: 'minimal',
description: '',
proxyMode: 'direct',
});
});
it('updates existing project on 409 with --force', async () => {
vi.mocked(client.post).mockRejectedValueOnce(new ApiError(409, '{"error":"Project already exists"}'));
vi.mocked(client.get).mockResolvedValueOnce([{ id: 'proj-1', name: 'my-proj' }] as never);
const cmd = createCreateCommand({ client, log });
await cmd.parseAsync(['project', 'my-proj', '-d', 'updated', '--force'], { from: 'user' });
expect(client.put).toHaveBeenCalledWith('/api/v1/projects/proj-1', { description: 'updated', proxyMode: 'direct' });
expect(output.join('\n')).toContain("project 'my-proj' updated");
});
});
describe('create user', () => {
it('creates a user with password and name', async () => {
vi.mocked(client.post).mockResolvedValueOnce({ id: 'usr-1', email: 'alice@test.com' });
const cmd = createCreateCommand({ client, log });
await cmd.parseAsync([
'user', 'alice@test.com',
'--password', 'secret123',
'--name', 'Alice',
], { from: 'user' });
expect(client.post).toHaveBeenCalledWith('/api/v1/users', {
email: 'alice@test.com',
password: 'secret123',
name: 'Alice',
});
expect(output.join('\n')).toContain("user 'alice@test.com' created");
});
it('does not send role field (RBAC is the auth mechanism)', async () => {
vi.mocked(client.post).mockResolvedValueOnce({ id: 'usr-1', email: 'admin@test.com' });
const cmd = createCreateCommand({ client, log });
await cmd.parseAsync([
'user', 'admin@test.com',
'--password', 'pass123',
], { from: 'user' });
const callBody = vi.mocked(client.post).mock.calls[0]![1] as Record<string, unknown>;
expect(callBody).not.toHaveProperty('role');
});
it('requires --password', async () => {
const cmd = createCreateCommand({ client, log });
await expect(cmd.parseAsync(['user', 'alice@test.com'], { from: 'user' })).rejects.toThrow('--password is required');
});
it('throws on 409 without --force', async () => {
vi.mocked(client.post).mockRejectedValueOnce(new ApiError(409, '{"error":"User already exists"}'));
const cmd = createCreateCommand({ client, log });
await expect(
cmd.parseAsync(['user', 'alice@test.com', '--password', 'pass'], { from: 'user' }),
).rejects.toThrow('API error 409');
});
it('updates existing user on 409 with --force', async () => {
vi.mocked(client.post).mockRejectedValueOnce(new ApiError(409, '{"error":"User already exists"}'));
vi.mocked(client.get).mockResolvedValueOnce([{ id: 'usr-1', email: 'alice@test.com' }] as never);
const cmd = createCreateCommand({ client, log });
await cmd.parseAsync([
'user', 'alice@test.com', '--password', 'newpass', '--name', 'Alice New', '--force',
], { from: 'user' });
expect(client.put).toHaveBeenCalledWith('/api/v1/users/usr-1', {
password: 'newpass',
name: 'Alice New',
});
expect(output.join('\n')).toContain("user 'alice@test.com' updated");
});
});
describe('create group', () => {
it('creates a group with members', async () => {
vi.mocked(client.post).mockResolvedValueOnce({ id: 'grp-1', name: 'dev-team' });
const cmd = createCreateCommand({ client, log });
await cmd.parseAsync([
'group', 'dev-team',
'--description', 'Development team',
'--member', 'alice@test.com',
'--member', 'bob@test.com',
], { from: 'user' });
expect(client.post).toHaveBeenCalledWith('/api/v1/groups', {
name: 'dev-team',
description: 'Development team',
members: ['alice@test.com', 'bob@test.com'],
});
expect(output.join('\n')).toContain("group 'dev-team' created");
});
it('creates a group with no members', async () => {
vi.mocked(client.post).mockResolvedValueOnce({ id: 'grp-1', name: 'empty-group' });
const cmd = createCreateCommand({ client, log });
await cmd.parseAsync(['group', 'empty-group'], { from: 'user' });
expect(client.post).toHaveBeenCalledWith('/api/v1/groups', {
name: 'empty-group',
members: [],
});
});
it('throws on 409 without --force', async () => {
vi.mocked(client.post).mockRejectedValueOnce(new ApiError(409, '{"error":"Group already exists"}'));
const cmd = createCreateCommand({ client, log });
await expect(
cmd.parseAsync(['group', 'dev-team'], { from: 'user' }),
).rejects.toThrow('API error 409');
});
it('updates existing group on 409 with --force', async () => {
vi.mocked(client.post).mockRejectedValueOnce(new ApiError(409, '{"error":"Group already exists"}'));
vi.mocked(client.get).mockResolvedValueOnce([{ id: 'grp-1', name: 'dev-team' }] as never);
const cmd = createCreateCommand({ client, log });
await cmd.parseAsync([
'group', 'dev-team', '--member', 'new@test.com', '--force',
], { from: 'user' });
expect(client.put).toHaveBeenCalledWith('/api/v1/groups/grp-1', {
members: ['new@test.com'],
});
expect(output.join('\n')).toContain("group 'dev-team' updated");
});
});
describe('create rbac', () => {
it('creates an RBAC definition with subjects and bindings', async () => {
vi.mocked(client.post).mockResolvedValueOnce({ id: 'rbac-1', name: 'developers' });
const cmd = createCreateCommand({ client, log });
await cmd.parseAsync([
'rbac', 'developers',
'--subject', 'User:alice@test.com',
'--subject', 'Group:dev-team',
'--binding', 'edit:servers',
'--binding', 'view:instances',
], { from: 'user' });
expect(client.post).toHaveBeenCalledWith('/api/v1/rbac', {
name: 'developers',
subjects: [
{ kind: 'User', name: 'alice@test.com' },
{ kind: 'Group', name: 'dev-team' },
],
roleBindings: [
{ role: 'edit', resource: 'servers' },
{ role: 'view', resource: 'instances' },
],
});
expect(output.join('\n')).toContain("rbac 'developers' created");
});
it('creates an RBAC definition with wildcard resource', async () => {
vi.mocked(client.post).mockResolvedValueOnce({ id: 'rbac-1', name: 'admins' });
const cmd = createCreateCommand({ client, log });
await cmd.parseAsync([
'rbac', 'admins',
'--subject', 'User:admin@test.com',
'--binding', 'edit:*',
], { from: 'user' });
expect(client.post).toHaveBeenCalledWith('/api/v1/rbac', {
name: 'admins',
subjects: [{ kind: 'User', name: 'admin@test.com' }],
roleBindings: [{ role: 'edit', resource: '*' }],
});
});
it('creates an RBAC definition with empty subjects and bindings', async () => {
vi.mocked(client.post).mockResolvedValueOnce({ id: 'rbac-1', name: 'empty' });
const cmd = createCreateCommand({ client, log });
await cmd.parseAsync(['rbac', 'empty'], { from: 'user' });
expect(client.post).toHaveBeenCalledWith('/api/v1/rbac', {
name: 'empty',
subjects: [],
roleBindings: [],
});
});
it('throws on invalid subject format', async () => {
const cmd = createCreateCommand({ client, log });
await expect(
cmd.parseAsync(['rbac', 'bad', '--subject', 'no-colon'], { from: 'user' }),
).rejects.toThrow('Invalid subject format');
});
it('throws on invalid binding format', async () => {
const cmd = createCreateCommand({ client, log });
await expect(
cmd.parseAsync(['rbac', 'bad', '--binding', 'no-colon'], { from: 'user' }),
).rejects.toThrow('Invalid binding format');
});
it('throws on 409 without --force', async () => {
vi.mocked(client.post).mockRejectedValueOnce(new ApiError(409, '{"error":"RBAC already exists"}'));
const cmd = createCreateCommand({ client, log });
await expect(
cmd.parseAsync(['rbac', 'developers', '--subject', 'User:a@b.com', '--binding', 'edit:servers'], { from: 'user' }),
).rejects.toThrow('API error 409');
});
it('updates existing RBAC on 409 with --force', async () => {
vi.mocked(client.post).mockRejectedValueOnce(new ApiError(409, '{"error":"RBAC already exists"}'));
vi.mocked(client.get).mockResolvedValueOnce([{ id: 'rbac-1', name: 'developers' }] as never);
const cmd = createCreateCommand({ client, log });
await cmd.parseAsync([
'rbac', 'developers',
'--subject', 'User:new@test.com',
'--binding', 'edit:*',
'--force',
], { from: 'user' });
expect(client.put).toHaveBeenCalledWith('/api/v1/rbac/rbac-1', {
subjects: [{ kind: 'User', name: 'new@test.com' }],
roleBindings: [{ role: 'edit', resource: '*' }],
});
expect(output.join('\n')).toContain("rbac 'developers' updated");
});
it('creates an RBAC definition with operation bindings', async () => {
vi.mocked(client.post).mockResolvedValueOnce({ id: 'rbac-1', name: 'ops' });
const cmd = createCreateCommand({ client, log });
await cmd.parseAsync([
'rbac', 'ops',
'--subject', 'Group:ops-team',
'--binding', 'edit:servers',
'--operation', 'logs',
'--operation', 'backup',
], { from: 'user' });
expect(client.post).toHaveBeenCalledWith('/api/v1/rbac', {
name: 'ops',
subjects: [{ kind: 'Group', name: 'ops-team' }],
roleBindings: [
{ role: 'edit', resource: 'servers' },
{ role: 'run', action: 'logs' },
{ role: 'run', action: 'backup' },
],
});
expect(output.join('\n')).toContain("rbac 'ops' created");
});
it('creates an RBAC definition with name-scoped binding', async () => {
vi.mocked(client.post).mockResolvedValueOnce({ id: 'rbac-1', name: 'ha-viewer' });
const cmd = createCreateCommand({ client, log });
await cmd.parseAsync([
'rbac', 'ha-viewer',
'--subject', 'User:alice@test.com',
'--binding', 'view:servers:my-ha',
], { from: 'user' });
expect(client.post).toHaveBeenCalledWith('/api/v1/rbac', {
name: 'ha-viewer',
subjects: [{ kind: 'User', name: 'alice@test.com' }],
roleBindings: [
{ role: 'view', resource: 'servers', name: 'my-ha' },
],
});
});
});
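The `--subject Kind:name` and `--binding role:resource[:name]` flags exercised above imply simple colon-delimited parsers that reject malformed values. A minimal sketch of that parsing logic, assuming hypothetical helper names `parseSubject` and `parseBinding` (not taken from the CLI source):

```typescript
interface Subject { kind: string; name: string }
interface Binding { role: string; resource: string; name?: string }

// "User:alice@test.com" -> { kind: 'User', name: 'alice@test.com' }
// Split on the first colon only, since subject names may contain colons.
function parseSubject(value: string): Subject {
  const idx = value.indexOf(':');
  if (idx <= 0 || idx === value.length - 1) {
    throw new Error(`Invalid subject format: '${value}' (expected Kind:name)`);
  }
  return { kind: value.slice(0, idx), name: value.slice(idx + 1) };
}

// "edit:*"             -> { role: 'edit', resource: '*' }
// "view:servers:my-ha" -> { role: 'view', resource: 'servers', name: 'my-ha' }
function parseBinding(value: string): Binding {
  const [role, resource, name, ...rest] = value.split(':');
  if (!role || !resource || rest.length > 0) {
    throw new Error(`Invalid binding format: '${value}' (expected role:resource[:name])`);
  }
  return name ? { role, resource, name } : { role, resource };
}
```

This matches the test expectations: `no-colon` throws, `edit:*` produces a two-field binding, and a third segment becomes the name scope.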


@@ -139,4 +139,558 @@ describe('describe command', () => {
expect(text).toContain('RUNNING');
expect(text).toContain('abc123');
});
it('resolves server name to instance for describe instance', async () => {
const deps = makeDeps({
id: 'inst-1',
serverId: 'srv-1',
server: { name: 'my-grafana' },
status: 'RUNNING',
containerId: 'abc123',
port: 3000,
});
// resolveNameOrId will throw (not a CUID, name won't match instances)
vi.mocked(deps.client.get)
.mockResolvedValueOnce([] as never) // instances list (no name match)
.mockResolvedValueOnce([{ id: 'srv-1', name: 'my-grafana' }] as never) // servers list
.mockResolvedValueOnce([{ id: 'inst-1', status: 'RUNNING' }] as never); // instances for server
const cmd = createDescribeCommand(deps);
await cmd.parseAsync(['node', 'test', 'instance', 'my-grafana']);
expect(deps.fetchResource).toHaveBeenCalledWith('instances', 'inst-1');
});
it('resolves server name and picks running instance over stopped', async () => {
const deps = makeDeps({
id: 'inst-2',
serverId: 'srv-1',
server: { name: 'my-ha' },
status: 'RUNNING',
containerId: 'def456',
});
vi.mocked(deps.client.get)
.mockResolvedValueOnce([] as never) // instances list
.mockResolvedValueOnce([{ id: 'srv-1', name: 'my-ha' }] as never)
.mockResolvedValueOnce([
{ id: 'inst-1', status: 'ERROR' },
{ id: 'inst-2', status: 'RUNNING' },
] as never);
const cmd = createDescribeCommand(deps);
await cmd.parseAsync(['node', 'test', 'instance', 'my-ha']);
expect(deps.fetchResource).toHaveBeenCalledWith('instances', 'inst-2');
});
it('throws when no instances found for server name', async () => {
const deps = makeDeps();
vi.mocked(deps.client.get)
.mockResolvedValueOnce([] as never) // instances list
.mockResolvedValueOnce([{ id: 'srv-1', name: 'my-server' }] as never)
.mockResolvedValueOnce([] as never); // no instances
const cmd = createDescribeCommand(deps);
await expect(cmd.parseAsync(['node', 'test', 'instance', 'my-server'])).rejects.toThrow(
/No instances found/,
);
});
it('shows instance with server name in header', async () => {
const deps = makeDeps({
id: 'inst-1',
serverId: 'srv-1',
server: { name: 'my-grafana' },
status: 'RUNNING',
containerId: 'abc123',
port: 3000,
});
const cmd = createDescribeCommand(deps);
await cmd.parseAsync(['node', 'test', 'instance', 'inst-1']);
const text = deps.output.join('\n');
expect(text).toContain('=== Instance: my-grafana ===');
});
it('shows instance health and events', async () => {
const deps = makeDeps({
id: 'inst-1',
serverId: 'srv-1',
server: { name: 'my-grafana' },
status: 'RUNNING',
containerId: 'abc123',
healthStatus: 'healthy',
lastHealthCheck: '2025-01-15T10:30:00Z',
events: [
{ timestamp: '2025-01-15T10:30:00Z', type: 'Normal', message: 'Health check passed (45ms)' },
],
});
const cmd = createDescribeCommand(deps);
await cmd.parseAsync(['node', 'test', 'instance', 'inst-1']);
const text = deps.output.join('\n');
expect(text).toContain('Health:');
expect(text).toContain('healthy');
expect(text).toContain('Events:');
expect(text).toContain('Health check passed');
});
it('shows server healthCheck section', async () => {
const deps = makeDeps({
id: 'srv-1',
name: 'my-grafana',
transport: 'STDIO',
healthCheck: {
tool: 'list_datasources',
arguments: {},
intervalSeconds: 60,
timeoutSeconds: 10,
failureThreshold: 3,
},
});
const cmd = createDescribeCommand(deps);
await cmd.parseAsync(['node', 'test', 'server', 'srv-1']);
const text = deps.output.join('\n');
expect(text).toContain('Health Check:');
expect(text).toContain('list_datasources');
expect(text).toContain('60s');
expect(text).toContain('Failure Threshold:');
});
it('shows template detail with healthCheck and usage', async () => {
const deps = makeDeps({
id: 'tpl-1',
name: 'grafana',
transport: 'STDIO',
version: '1.0.0',
packageName: '@leval/mcp-grafana',
env: [
{ name: 'GRAFANA_URL', required: true, description: 'Grafana instance URL' },
],
healthCheck: {
tool: 'list_datasources',
arguments: {},
intervalSeconds: 60,
timeoutSeconds: 10,
failureThreshold: 3,
},
});
const cmd = createDescribeCommand(deps);
await cmd.parseAsync(['node', 'test', 'template', 'tpl-1']);
const text = deps.output.join('\n');
expect(text).toContain('=== Template: grafana ===');
expect(text).toContain('@leval/mcp-grafana');
expect(text).toContain('GRAFANA_URL');
expect(text).toContain('Health Check:');
expect(text).toContain('list_datasources');
expect(text).toContain('mcpctl create server my-grafana --from-template=grafana');
});
it('shows user detail (no Role field — RBAC is the auth mechanism)', async () => {
const deps = makeDeps({
id: 'usr-1',
email: 'alice@test.com',
name: 'Alice Smith',
provider: null,
createdAt: '2025-01-01',
updatedAt: '2025-01-15',
});
const cmd = createDescribeCommand(deps);
await cmd.parseAsync(['node', 'test', 'user', 'usr-1']);
expect(deps.fetchResource).toHaveBeenCalledWith('users', 'usr-1');
const text = deps.output.join('\n');
expect(text).toContain('=== User: alice@test.com ===');
expect(text).toContain('Email:');
expect(text).toContain('alice@test.com');
expect(text).toContain('Name:');
expect(text).toContain('Alice Smith');
expect(text).not.toContain('Role:');
expect(text).toContain('Provider:');
expect(text).toContain('local');
expect(text).toContain('ID:');
expect(text).toContain('usr-1');
});
it('shows user with no name as dash', async () => {
const deps = makeDeps({
id: 'usr-2',
email: 'bob@test.com',
name: null,
provider: 'oidc',
});
const cmd = createDescribeCommand(deps);
await cmd.parseAsync(['node', 'test', 'user', 'usr-2']);
const text = deps.output.join('\n');
expect(text).toContain('=== User: bob@test.com ===');
expect(text).toContain('Name:');
expect(text).toContain('-');
expect(text).not.toContain('Role:');
expect(text).toContain('oidc');
});
it('shows group detail with members', async () => {
const deps = makeDeps({
id: 'grp-1',
name: 'dev-team',
description: 'Development team',
members: [
{ user: { email: 'alice@test.com' }, createdAt: '2025-01-01' },
{ user: { email: 'bob@test.com' }, createdAt: '2025-01-02' },
],
createdAt: '2025-01-01',
updatedAt: '2025-01-15',
});
const cmd = createDescribeCommand(deps);
await cmd.parseAsync(['node', 'test', 'group', 'grp-1']);
expect(deps.fetchResource).toHaveBeenCalledWith('groups', 'grp-1');
const text = deps.output.join('\n');
expect(text).toContain('=== Group: dev-team ===');
expect(text).toContain('Name:');
expect(text).toContain('dev-team');
expect(text).toContain('Description:');
expect(text).toContain('Development team');
expect(text).toContain('Members:');
expect(text).toContain('EMAIL');
expect(text).toContain('ADDED');
expect(text).toContain('alice@test.com');
expect(text).toContain('bob@test.com');
expect(text).toContain('ID:');
expect(text).toContain('grp-1');
});
it('shows group detail with no members', async () => {
const deps = makeDeps({
id: 'grp-2',
name: 'empty-group',
description: '',
members: [],
});
const cmd = createDescribeCommand(deps);
await cmd.parseAsync(['node', 'test', 'group', 'grp-2']);
const text = deps.output.join('\n');
expect(text).toContain('=== Group: empty-group ===');
// No Members section when empty
expect(text).not.toContain('EMAIL');
});
it('shows RBAC detail with subjects and bindings', async () => {
const deps = makeDeps({
id: 'rbac-1',
name: 'developers',
subjects: [
{ kind: 'User', name: 'alice@test.com' },
{ kind: 'Group', name: 'dev-team' },
],
roleBindings: [
{ role: 'edit', resource: 'servers' },
{ role: 'view', resource: 'instances' },
{ role: 'view', resource: 'projects' },
],
createdAt: '2025-01-01',
updatedAt: '2025-01-15',
});
const cmd = createDescribeCommand(deps);
await cmd.parseAsync(['node', 'test', 'rbac', 'rbac-1']);
expect(deps.fetchResource).toHaveBeenCalledWith('rbac', 'rbac-1');
const text = deps.output.join('\n');
expect(text).toContain('=== RBAC: developers ===');
expect(text).toContain('Name:');
expect(text).toContain('developers');
// Subjects section
expect(text).toContain('Subjects:');
expect(text).toContain('KIND');
expect(text).toContain('NAME');
expect(text).toContain('User');
expect(text).toContain('alice@test.com');
expect(text).toContain('Group');
expect(text).toContain('dev-team');
// Role Bindings section
expect(text).toContain('Resource Bindings:');
expect(text).toContain('ROLE');
expect(text).toContain('RESOURCE');
expect(text).toContain('edit');
expect(text).toContain('servers');
expect(text).toContain('view');
expect(text).toContain('instances');
expect(text).toContain('projects');
expect(text).toContain('ID:');
expect(text).toContain('rbac-1');
});
it('shows RBAC detail with wildcard resource', async () => {
const deps = makeDeps({
id: 'rbac-2',
name: 'admins',
subjects: [{ kind: 'User', name: 'admin@test.com' }],
roleBindings: [{ role: 'edit', resource: '*' }],
});
const cmd = createDescribeCommand(deps);
await cmd.parseAsync(['node', 'test', 'rbac', 'rbac-2']);
const text = deps.output.join('\n');
expect(text).toContain('=== RBAC: admins ===');
expect(text).toContain('edit');
expect(text).toContain('*');
});
it('shows RBAC detail with empty subjects and bindings', async () => {
const deps = makeDeps({
id: 'rbac-3',
name: 'empty-rbac',
subjects: [],
roleBindings: [],
});
const cmd = createDescribeCommand(deps);
await cmd.parseAsync(['node', 'test', 'rbac', 'rbac-3']);
const text = deps.output.join('\n');
expect(text).toContain('=== RBAC: empty-rbac ===');
// No Subjects or Role Bindings sections when empty
expect(text).not.toContain('KIND');
expect(text).not.toContain('ROLE');
expect(text).not.toContain('RESOURCE');
});
it('shows RBAC detail with mixed resource and operation bindings', async () => {
const deps = makeDeps({
id: 'rbac-1',
name: 'admin-access',
subjects: [{ kind: 'Group', name: 'admin' }],
roleBindings: [
{ role: 'edit', resource: '*' },
{ role: 'run', resource: 'projects' },
{ role: 'run', action: 'logs' },
{ role: 'run', action: 'backup' },
],
createdAt: '2025-01-01',
});
const cmd = createDescribeCommand(deps);
await cmd.parseAsync(['node', 'test', 'rbac', 'rbac-1']);
const text = deps.output.join('\n');
expect(text).toContain('Resource Bindings:');
expect(text).toContain('edit');
expect(text).toContain('*');
expect(text).toContain('run');
expect(text).toContain('projects');
expect(text).toContain('Operations:');
expect(text).toContain('ACTION');
expect(text).toContain('logs');
expect(text).toContain('backup');
});
it('shows RBAC detail with name-scoped resource binding', async () => {
const deps = makeDeps({
id: 'rbac-1',
name: 'ha-viewer',
subjects: [{ kind: 'User', name: 'alice@test.com' }],
roleBindings: [
{ role: 'view', resource: 'servers', name: 'my-ha' },
{ role: 'edit', resource: 'secrets' },
],
});
const cmd = createDescribeCommand(deps);
await cmd.parseAsync(['node', 'test', 'rbac', 'rbac-1']);
const text = deps.output.join('\n');
expect(text).toContain('Resource Bindings:');
expect(text).toContain('NAME');
expect(text).toContain('my-ha');
expect(text).toContain('view');
expect(text).toContain('servers');
});
it('shows user with direct RBAC permissions', async () => {
const deps = makeDeps({
id: 'usr-1',
email: 'alice@test.com',
name: 'Alice',
provider: null,
});
vi.mocked(deps.client.get)
.mockResolvedValueOnce([] as never) // users list (resolveNameOrId)
.mockResolvedValueOnce([ // RBAC defs
{
name: 'dev-access',
subjects: [{ kind: 'User', name: 'alice@test.com' }],
roleBindings: [
{ role: 'edit', resource: 'servers' },
{ role: 'run', action: 'logs' },
],
},
] as never)
.mockResolvedValueOnce([] as never); // groups
const cmd = createDescribeCommand(deps);
await cmd.parseAsync(['node', 'test', 'user', 'usr-1']);
const text = deps.output.join('\n');
expect(text).toContain('=== User: alice@test.com ===');
expect(text).toContain('Access:');
expect(text).toContain('Direct (dev-access)');
expect(text).toContain('Resources:');
expect(text).toContain('edit');
expect(text).toContain('servers');
expect(text).toContain('Operations:');
expect(text).toContain('logs');
});
it('shows user with inherited group permissions', async () => {
const deps = makeDeps({
id: 'usr-1',
email: 'bob@test.com',
name: 'Bob',
provider: null,
});
vi.mocked(deps.client.get)
.mockResolvedValueOnce([] as never) // users list
.mockResolvedValueOnce([ // RBAC defs
{
name: 'team-perms',
subjects: [{ kind: 'Group', name: 'dev-team' }],
roleBindings: [
{ role: 'view', resource: '*' },
{ role: 'run', action: 'backup' },
],
},
] as never)
.mockResolvedValueOnce([ // groups
{ name: 'dev-team', members: [{ user: { email: 'bob@test.com' } }] },
] as never);
const cmd = createDescribeCommand(deps);
await cmd.parseAsync(['node', 'test', 'user', 'usr-1']);
const text = deps.output.join('\n');
expect(text).toContain('Groups:');
expect(text).toContain('dev-team');
expect(text).toContain('Access:');
expect(text).toContain('Inherited (dev-team)');
expect(text).toContain('view');
expect(text).toContain('*');
expect(text).toContain('backup');
});
it('shows user with no permissions', async () => {
const deps = makeDeps({
id: 'usr-1',
email: 'nobody@test.com',
name: null,
provider: null,
});
vi.mocked(deps.client.get)
.mockResolvedValueOnce([] as never)
.mockResolvedValueOnce([] as never)
.mockResolvedValueOnce([] as never);
const cmd = createDescribeCommand(deps);
await cmd.parseAsync(['node', 'test', 'user', 'usr-1']);
const text = deps.output.join('\n');
expect(text).toContain('Access: (none)');
});
it('shows group with RBAC permissions', async () => {
const deps = makeDeps({
id: 'grp-1',
name: 'admin',
description: 'Admin group',
members: [{ user: { email: 'alice@test.com' } }],
});
vi.mocked(deps.client.get)
.mockResolvedValueOnce([] as never) // groups list (resolveNameOrId)
.mockResolvedValueOnce([ // RBAC defs
{
name: 'admin-access',
subjects: [{ kind: 'Group', name: 'admin' }],
roleBindings: [
{ role: 'edit', resource: '*' },
{ role: 'run', action: 'backup' },
{ role: 'run', action: 'restore' },
],
},
] as never);
const cmd = createDescribeCommand(deps);
await cmd.parseAsync(['node', 'test', 'group', 'grp-1']);
const text = deps.output.join('\n');
expect(text).toContain('=== Group: admin ===');
expect(text).toContain('Access:');
expect(text).toContain('Granted (admin-access)');
expect(text).toContain('edit');
expect(text).toContain('*');
expect(text).toContain('backup');
expect(text).toContain('restore');
});
it('shows group with name-scoped permissions', async () => {
const deps = makeDeps({
id: 'grp-1',
name: 'ha-team',
description: 'HA team',
members: [],
});
vi.mocked(deps.client.get)
.mockResolvedValueOnce([] as never)
.mockResolvedValueOnce([ // RBAC defs
{
name: 'ha-access',
subjects: [{ kind: 'Group', name: 'ha-team' }],
roleBindings: [
{ role: 'edit', resource: 'servers', name: 'my-ha' },
{ role: 'view', resource: 'secrets' },
],
},
] as never);
const cmd = createDescribeCommand(deps);
await cmd.parseAsync(['node', 'test', 'group', 'grp-1']);
const text = deps.output.join('\n');
expect(text).toContain('Access:');
expect(text).toContain('Granted (ha-access)');
expect(text).toContain('my-ha');
expect(text).toContain('NAME');
});
it('outputs user detail as JSON', async () => {
const deps = makeDeps({ id: 'usr-1', email: 'alice@test.com', name: 'Alice', role: 'ADMIN' });
const cmd = createDescribeCommand(deps);
await cmd.parseAsync(['node', 'test', 'user', 'usr-1', '-o', 'json']);
const parsed = JSON.parse(deps.output[0] ?? '');
expect(parsed.email).toBe('alice@test.com');
expect(parsed.role).toBe('ADMIN');
});
it('outputs group detail as YAML', async () => {
const deps = makeDeps({ id: 'grp-1', name: 'dev-team', description: 'Devs' });
const cmd = createDescribeCommand(deps);
await cmd.parseAsync(['node', 'test', 'group', 'grp-1', '-o', 'yaml']);
expect(deps.output[0]).toContain('name: dev-team');
});
it('outputs rbac detail as JSON', async () => {
const deps = makeDeps({
id: 'rbac-1',
name: 'devs',
subjects: [{ kind: 'User', name: 'a@b.com' }],
roleBindings: [{ role: 'edit', resource: 'servers' }],
});
const cmd = createDescribeCommand(deps);
await cmd.parseAsync(['node', 'test', 'rbac', 'rbac-1', '-o', 'json']);
const parsed = JSON.parse(deps.output[0] ?? '');
expect(parsed.subjects).toHaveLength(1);
expect(parsed.roleBindings[0].role).toBe('edit');
});
});
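The name-resolution tests above encode one selection rule shared by `describe instance` and `logs`: when a server name maps to multiple instances, prefer a RUNNING one; an explicit replica index must be bounds-checked; and an empty list is an error. A sketch of that rule under an assumed helper name `pickInstance` (not the CLI's actual function):

```typescript
interface Instance { id: string; status: string }

// Resolve a server's instance list to a single target:
// - with an explicit replica index (--instance), bounds-check and return it;
// - otherwise prefer a RUNNING instance, falling back to the first;
// - throw when the server has no instances at all.
function pickInstance(instances: Instance[], serverName: string, index?: number): Instance {
  if (instances.length === 0) {
    throw new Error(`No instances found for server '${serverName}'`);
  }
  if (index !== undefined) {
    const inst = instances[index];
    if (inst === undefined) {
      throw new Error(`Instance index ${index} out of range (0-${instances.length - 1})`);
    }
    return inst;
  }
  return instances.find((i) => i.status === 'RUNNING') ?? instances[0];
}
```

With this shape, the `{ ERROR, RUNNING }` fixture resolves to the RUNNING replica, `-i 1` picks the second replica, and `-i 5` against one instance throws the "out of range" error the tests assert.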


@@ -69,11 +69,13 @@ describe('get command', () => {
it('lists instances with correct columns', async () => {
const deps = makeDeps([
-{ id: 'inst-1', serverId: 'srv-1', status: 'RUNNING', containerId: 'abc123def456', port: 3000 },
+{ id: 'inst-1', serverId: 'srv-1', server: { name: 'my-grafana' }, status: 'RUNNING', containerId: 'abc123def456', port: 3000 },
]);
const cmd = createGetCommand(deps);
await cmd.parseAsync(['node', 'test', 'instances']);
expect(deps.output[0]).toContain('NAME');
expect(deps.output[0]).toContain('STATUS');
expect(deps.output.join('\n')).toContain('my-grafana');
expect(deps.output.join('\n')).toContain('RUNNING');
});
@@ -83,4 +85,170 @@ describe('get command', () => {
await cmd.parseAsync(['node', 'test', 'servers']);
expect(deps.output[0]).toContain('No servers found');
});
it('lists users with correct columns (no ROLE column)', async () => {
const deps = makeDeps([
{ id: 'usr-1', email: 'alice@test.com', name: 'Alice', provider: null },
{ id: 'usr-2', email: 'bob@test.com', name: null, provider: 'oidc' },
]);
const cmd = createGetCommand(deps);
await cmd.parseAsync(['node', 'test', 'users']);
expect(deps.fetchResource).toHaveBeenCalledWith('users', undefined);
const text = deps.output.join('\n');
expect(text).toContain('EMAIL');
expect(text).toContain('NAME');
expect(text).not.toContain('ROLE');
expect(text).toContain('PROVIDER');
expect(text).toContain('alice@test.com');
expect(text).toContain('Alice');
expect(text).toContain('bob@test.com');
expect(text).toContain('oidc');
});
it('resolves user alias', async () => {
const deps = makeDeps([]);
const cmd = createGetCommand(deps);
await cmd.parseAsync(['node', 'test', 'user']);
expect(deps.fetchResource).toHaveBeenCalledWith('users', undefined);
});
it('lists groups with correct columns', async () => {
const deps = makeDeps([
{
id: 'grp-1',
name: 'dev-team',
description: 'Developers',
members: [{ user: { email: 'alice@test.com' } }, { user: { email: 'bob@test.com' } }],
},
{ id: 'grp-2', name: 'ops-team', description: 'Operations', members: [] },
]);
const cmd = createGetCommand(deps);
await cmd.parseAsync(['node', 'test', 'groups']);
expect(deps.fetchResource).toHaveBeenCalledWith('groups', undefined);
const text = deps.output.join('\n');
expect(text).toContain('NAME');
expect(text).toContain('MEMBERS');
expect(text).toContain('DESCRIPTION');
expect(text).toContain('dev-team');
expect(text).toContain('2');
expect(text).toContain('ops-team');
expect(text).toContain('0');
});
it('resolves group alias', async () => {
const deps = makeDeps([]);
const cmd = createGetCommand(deps);
await cmd.parseAsync(['node', 'test', 'group']);
expect(deps.fetchResource).toHaveBeenCalledWith('groups', undefined);
});
it('lists rbac definitions with correct columns', async () => {
const deps = makeDeps([
{
id: 'rbac-1',
name: 'admins',
subjects: [{ kind: 'User', name: 'admin@test.com' }],
roleBindings: [{ role: 'edit', resource: '*' }],
},
]);
const cmd = createGetCommand(deps);
await cmd.parseAsync(['node', 'test', 'rbac']);
expect(deps.fetchResource).toHaveBeenCalledWith('rbac', undefined);
const text = deps.output.join('\n');
expect(text).toContain('NAME');
expect(text).toContain('SUBJECTS');
expect(text).toContain('BINDINGS');
expect(text).toContain('admins');
expect(text).toContain('User:admin@test.com');
expect(text).toContain('edit:*');
});
it('resolves rbac-definition alias', async () => {
const deps = makeDeps([]);
const cmd = createGetCommand(deps);
await cmd.parseAsync(['node', 'test', 'rbac-definition']);
expect(deps.fetchResource).toHaveBeenCalledWith('rbac', undefined);
});
it('lists projects with new columns', async () => {
const deps = makeDeps([{
id: 'proj-1',
name: 'smart-home',
description: 'Home automation',
proxyMode: 'filtered',
ownerId: 'usr-1',
servers: [{ server: { name: 'grafana' } }],
}]);
const cmd = createGetCommand(deps);
await cmd.parseAsync(['node', 'test', 'projects']);
const text = deps.output.join('\n');
expect(text).toContain('MODE');
expect(text).toContain('SERVERS');
expect(text).toContain('smart-home');
expect(text).toContain('filtered');
expect(text).toContain('1');
});
it('displays mixed resource and operation bindings', async () => {
const deps = makeDeps([
{
id: 'rbac-1',
name: 'admin-access',
subjects: [{ kind: 'Group', name: 'admin' }],
roleBindings: [
{ role: 'edit', resource: '*' },
{ role: 'run', action: 'logs' },
{ role: 'run', action: 'backup' },
],
},
]);
const cmd = createGetCommand(deps);
await cmd.parseAsync(['node', 'test', 'rbac']);
const text = deps.output.join('\n');
expect(text).toContain('edit:*');
expect(text).toContain('run>logs');
expect(text).toContain('run>backup');
});
it('displays name-scoped resource bindings', async () => {
const deps = makeDeps([
{
id: 'rbac-1',
name: 'ha-viewer',
subjects: [{ kind: 'User', name: 'alice@test.com' }],
roleBindings: [{ role: 'view', resource: 'servers', name: 'my-ha' }],
},
]);
const cmd = createGetCommand(deps);
await cmd.parseAsync(['node', 'test', 'rbac']);
const text = deps.output.join('\n');
expect(text).toContain('view:servers:my-ha');
});
it('shows no results message for empty users list', async () => {
const deps = makeDeps([]);
const cmd = createGetCommand(deps);
await cmd.parseAsync(['node', 'test', 'users']);
expect(deps.output[0]).toContain('No users found');
});
it('shows no results message for empty groups list', async () => {
const deps = makeDeps([]);
const cmd = createGetCommand(deps);
await cmd.parseAsync(['node', 'test', 'groups']);
expect(deps.output[0]).toContain('No groups found');
});
it('shows no results message for empty rbac list', async () => {
const deps = makeDeps([]);
const cmd = createGetCommand(deps);
await cmd.parseAsync(['node', 'test', 'rbac']);
expect(deps.output[0]).toContain('No rbac found');
});
});
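The `get rbac` assertions above pin down a compact token per binding: `role:resource` for resource bindings, `role:resource:name` when name-scoped, and `role>action` for operation bindings. A sketch of such a formatter, with `formatBinding` as an assumed helper name:

```typescript
interface RoleBinding { role: string; resource?: string; name?: string; action?: string }

// Render one binding as a table cell token:
// { role: 'edit', resource: '*' }                           -> "edit:*"
// { role: 'view', resource: 'servers', name: 'my-ha' }      -> "view:servers:my-ha"
// { role: 'run', action: 'logs' }                           -> "run>logs"
function formatBinding(b: RoleBinding): string {
  if (b.action) return `${b.role}>${b.action}`;
  return b.name
    ? `${b.role}:${b.resource}:${b.name}`
    : `${b.role}:${b.resource}`;
}
```

The `>` separator keeps operation bindings visually distinct from resource bindings in the BINDINGS column, which is exactly what the mixed-bindings test checks for.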


@@ -68,16 +68,79 @@ describe('logs command', () => {
output = [];
});
-it('shows logs', async () => {
-vi.mocked(client.get).mockResolvedValue({ stdout: 'hello world\n', stderr: '' });
+it('shows logs by instance ID', async () => {
+vi.mocked(client.get)
+.mockResolvedValueOnce({ id: 'inst-1', status: 'RUNNING' } as never) // instance lookup
+.mockResolvedValueOnce({ stdout: 'hello world\n', stderr: '' } as never); // logs
const cmd = createLogsCommand({ client, log });
await cmd.parseAsync(['inst-1'], { from: 'user' });
expect(client.get).toHaveBeenCalledWith('/api/v1/instances/inst-1');
expect(client.get).toHaveBeenCalledWith('/api/v1/instances/inst-1/logs');
expect(output.join('\n')).toContain('hello world');
});
it('resolves server name to instance ID', async () => {
vi.mocked(client.get)
.mockRejectedValueOnce(new Error('not found')) // instance lookup fails
.mockResolvedValueOnce([{ id: 'srv-1', name: 'my-grafana' }] as never) // servers list
.mockResolvedValueOnce([{ id: 'inst-1', status: 'RUNNING', containerId: 'abc' }] as never) // instances for server
.mockResolvedValueOnce({ stdout: 'grafana logs\n', stderr: '' } as never); // logs
const cmd = createLogsCommand({ client, log });
await cmd.parseAsync(['my-grafana'], { from: 'user' });
expect(client.get).toHaveBeenCalledWith('/api/v1/instances/inst-1/logs');
expect(output.join('\n')).toContain('grafana logs');
});
it('picks RUNNING instance over others', async () => {
vi.mocked(client.get)
.mockRejectedValueOnce(new Error('not found'))
.mockResolvedValueOnce([{ id: 'srv-1', name: 'ha-mcp' }] as never)
.mockResolvedValueOnce([
{ id: 'inst-err', status: 'ERROR', containerId: null },
{ id: 'inst-ok', status: 'RUNNING', containerId: 'abc' },
] as never)
.mockResolvedValueOnce({ stdout: 'running instance\n', stderr: '' } as never);
const cmd = createLogsCommand({ client, log });
await cmd.parseAsync(['ha-mcp'], { from: 'user' });
expect(client.get).toHaveBeenCalledWith('/api/v1/instances/inst-ok/logs');
});
it('selects specific replica with --instance', async () => {
vi.mocked(client.get)
.mockRejectedValueOnce(new Error('not found'))
.mockResolvedValueOnce([{ id: 'srv-1', name: 'ha-mcp' }] as never)
.mockResolvedValueOnce([
{ id: 'inst-0', status: 'RUNNING', containerId: 'a' },
{ id: 'inst-1', status: 'RUNNING', containerId: 'b' },
] as never)
.mockResolvedValueOnce({ stdout: 'replica 1\n', stderr: '' } as never);
const cmd = createLogsCommand({ client, log });
await cmd.parseAsync(['ha-mcp', '-i', '1'], { from: 'user' });
expect(client.get).toHaveBeenCalledWith('/api/v1/instances/inst-1/logs');
});
it('throws on out-of-range --instance index', async () => {
vi.mocked(client.get)
.mockRejectedValueOnce(new Error('not found'))
.mockResolvedValueOnce([{ id: 'srv-1', name: 'ha-mcp' }] as never)
.mockResolvedValueOnce([{ id: 'inst-0', status: 'RUNNING' }] as never);
const cmd = createLogsCommand({ client, log });
await expect(cmd.parseAsync(['ha-mcp', '-i', '5'], { from: 'user' })).rejects.toThrow('out of range');
});
it('throws when server has no instances', async () => {
vi.mocked(client.get)
.mockRejectedValueOnce(new Error('not found'))
.mockResolvedValueOnce([{ id: 'srv-1', name: 'empty-srv' }] as never)
.mockResolvedValueOnce([] as never);
const cmd = createLogsCommand({ client, log });
await expect(cmd.parseAsync(['empty-srv'], { from: 'user' })).rejects.toThrow('No instances found');
});
it('passes tail option', async () => {
-vi.mocked(client.get).mockResolvedValue({ stdout: '', stderr: '' });
+vi.mocked(client.get)
+.mockResolvedValueOnce({ id: 'inst-1' } as never)
+.mockResolvedValueOnce({ stdout: '', stderr: '' } as never);
const cmd = createLogsCommand({ client, log });
await cmd.parseAsync(['inst-1', '-t', '50'], { from: 'user' });
expect(client.get).toHaveBeenCalledWith('/api/v1/instances/inst-1/logs?tail=50');


@@ -1,17 +1,19 @@
import { describe, it, expect, vi, beforeEach } from 'vitest';
-import { createProjectCommand } from '../../src/commands/project.js';
-import type { ApiClient } from '../../src/api-client.js';
+import { createCreateCommand } from '../../src/commands/create.js';
+import { createGetCommand } from '../../src/commands/get.js';
+import { createDescribeCommand } from '../../src/commands/describe.js';
+import { type ApiClient, ApiError } from '../../src/api-client.js';
function mockClient(): ApiClient {
return {
get: vi.fn(async () => []),
-post: vi.fn(async () => ({ id: 'proj-1', name: 'my-project' })),
+post: vi.fn(async () => ({ id: 'new-id', name: 'test' })),
put: vi.fn(async () => ({})),
delete: vi.fn(async () => {}),
} as unknown as ApiClient;
}
-describe('project command', () => {
+describe('project with new fields', () => {
let client: ReturnType<typeof mockClient>;
let output: string[];
const log = (...args: unknown[]) => output.push(args.map(String).join(' '));
@@ -21,9 +23,94 @@ describe('project command', () => {
output = [];
});
-it('creates command with alias', () => {
-const cmd = createProjectCommand({ client, log });
-expect(cmd.name()).toBe('project');
-expect(cmd.alias()).toBe('proj');
+describe('create project with enhanced options', () => {
+it('creates project with proxy mode and servers', async () => {
it('creates project with proxy mode and servers', async () => {
const cmd = createCreateCommand({ client, log });
await cmd.parseAsync([
'project', 'smart-home',
'-d', 'Smart home project',
'--proxy-mode', 'filtered',
'--proxy-mode-llm-provider', 'gemini-cli',
'--proxy-mode-llm-model', 'gemini-2.0-flash',
'--server', 'my-grafana',
'--server', 'my-ha',
], { from: 'user' });
expect(client.post).toHaveBeenCalledWith('/api/v1/projects', expect.objectContaining({
name: 'smart-home',
description: 'Smart home project',
proxyMode: 'filtered',
llmProvider: 'gemini-cli',
llmModel: 'gemini-2.0-flash',
servers: ['my-grafana', 'my-ha'],
}));
});
it('defaults proxy mode to direct', async () => {
const cmd = createCreateCommand({ client, log });
await cmd.parseAsync(['project', 'basic'], { from: 'user' });
expect(client.post).toHaveBeenCalledWith('/api/v1/projects', expect.objectContaining({
proxyMode: 'direct',
}));
});
});
describe('get projects shows new columns', () => {
it('shows MODE and SERVERS columns', async () => {
const deps = {
output: [] as string[],
fetchResource: vi.fn(async () => [{
id: 'proj-1',
name: 'smart-home',
description: 'Test',
proxyMode: 'filtered',
ownerId: 'user-1',
servers: [{ server: { name: 'grafana' } }, { server: { name: 'ha' } }],
}]),
log: (...args: string[]) => deps.output.push(args.join(' ')),
};
const cmd = createGetCommand(deps);
await cmd.parseAsync(['node', 'test', 'projects']);
const text = deps.output.join('\n');
expect(text).toContain('MODE');
expect(text).toContain('SERVERS');
expect(text).toContain('smart-home');
});
});
describe('describe project shows full detail', () => {
it('shows servers and proxy config', async () => {
const deps = {
output: [] as string[],
client: mockClient(),
fetchResource: vi.fn(async () => ({
id: 'proj-1',
name: 'smart-home',
description: 'Smart home',
proxyMode: 'filtered',
llmProvider: 'gemini-cli',
llmModel: 'gemini-2.0-flash',
ownerId: 'user-1',
servers: [
{ server: { name: 'my-grafana' } },
{ server: { name: 'my-ha' } },
],
createdAt: '2025-01-01',
updatedAt: '2025-01-01',
})),
log: (...args: string[]) => deps.output.push(args.join(' ')),
};
const cmd = createDescribeCommand(deps);
await cmd.parseAsync(['node', 'test', 'project', 'proj-1']);
const text = deps.output.join('\n');
expect(text).toContain('=== Project: smart-home ===');
expect(text).toContain('filtered');
expect(text).toContain('gemini-cli');
expect(text).toContain('my-grafana');
expect(text).toContain('my-ha');
});
});
});

@@ -21,35 +21,44 @@ describe('CLI command registration (e2e)', () => {
expect(commandNames).toContain('apply');
expect(commandNames).toContain('create');
expect(commandNames).toContain('edit');
expect(commandNames).toContain('claude');
expect(commandNames).toContain('project');
expect(commandNames).toContain('backup');
expect(commandNames).toContain('restore');
});
it('instance command is removed (use get/delete/logs instead)', () => {
it('old project and claude top-level commands are removed', () => {
const program = createProgram();
const commandNames = program.commands.map((c) => c.name());
expect(commandNames).not.toContain('claude');
expect(commandNames).not.toContain('project');
expect(commandNames).not.toContain('instance');
});
it('claude command has config management subcommands', () => {
it('config command has claude-generate and impersonate subcommands', () => {
const program = createProgram();
const claude = program.commands.find((c) => c.name() === 'claude');
expect(claude).toBeDefined();
const config = program.commands.find((c) => c.name() === 'config');
expect(config).toBeDefined();
const subcommands = claude!.commands.map((c) => c.name());
expect(subcommands).toContain('generate');
expect(subcommands).toContain('show');
expect(subcommands).toContain('add');
expect(subcommands).toContain('remove');
const subcommands = config!.commands.map((c) => c.name());
expect(subcommands).toContain('claude-generate');
expect(subcommands).toContain('impersonate');
expect(subcommands).toContain('view');
expect(subcommands).toContain('set');
expect(subcommands).toContain('path');
expect(subcommands).toContain('reset');
});
it('project command exists with alias', () => {
it('create command has user, group, rbac subcommands', () => {
const program = createProgram();
const project = program.commands.find((c) => c.name() === 'project');
expect(project).toBeDefined();
expect(project!.alias()).toBe('proj');
const create = program.commands.find((c) => c.name() === 'create');
expect(create).toBeDefined();
const subcommands = create!.commands.map((c) => c.name());
expect(subcommands).toContain('server');
expect(subcommands).toContain('secret');
expect(subcommands).toContain('project');
expect(subcommands).toContain('user');
expect(subcommands).toContain('group');
expect(subcommands).toContain('rbac');
});
it('displays version', () => {

@@ -0,0 +1,8 @@
-- DropForeignKey
ALTER TABLE "ProjectMember" DROP CONSTRAINT IF EXISTS "ProjectMember_projectId_fkey";
-- DropForeignKey
ALTER TABLE "ProjectMember" DROP CONSTRAINT IF EXISTS "ProjectMember_userId_fkey";
-- DropTable
DROP TABLE IF EXISTS "ProjectMember";

@@ -15,13 +15,16 @@ model User {
name String?
passwordHash String
role Role @default(USER)
provider String?
externalId String?
version Int @default(1)
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
sessions Session[]
auditLogs AuditLog[]
projects Project[]
ownedProjects Project[]
groupMemberships GroupMember[]
@@index([email])
}
@@ -71,6 +74,7 @@ model McpServer {
templateVersion String?
instances McpInstance[]
projects ProjectServer[]
@@index([name])
}
@@ -117,23 +121,82 @@ model Secret {
@@index([name])
}
// ── Groups ──
model Group {
id String @id @default(cuid())
name String @unique
description String @default("")
version Int @default(1)
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
members GroupMember[]
@@index([name])
}
model GroupMember {
id String @id @default(cuid())
groupId String
userId String
createdAt DateTime @default(now())
group Group @relation(fields: [groupId], references: [id], onDelete: Cascade)
user User @relation(fields: [userId], references: [id], onDelete: Cascade)
@@unique([groupId, userId])
@@index([groupId])
@@index([userId])
}
// ── RBAC Definitions ──
model RbacDefinition {
id String @id @default(cuid())
name String @unique
subjects Json @default("[]")
roleBindings Json @default("[]")
version Int @default(1)
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
@@index([name])
}
// ── Projects ──
model Project {
id String @id @default(cuid())
name String @unique
description String @default("")
proxyMode String @default("direct")
llmProvider String?
llmModel String?
ownerId String
version Int @default(1)
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
owner User @relation(fields: [ownerId], references: [id], onDelete: Cascade)
servers ProjectServer[]
@@index([name])
@@index([ownerId])
}
model ProjectServer {
id String @id @default(cuid())
projectId String
serverId String
createdAt DateTime @default(now())
project Project @relation(fields: [projectId], references: [id], onDelete: Cascade)
server McpServer @relation(fields: [serverId], references: [id], onDelete: Cascade)
@@unique([projectId, serverId])
}
// ── MCP Instances (running containers) ──
model McpInstance {

@@ -49,10 +49,15 @@ export async function clearAllTables(client: PrismaClient): Promise<void> {
// Delete in order respecting foreign keys
await client.auditLog.deleteMany();
await client.mcpInstance.deleteMany();
await client.projectServer.deleteMany();
await client.projectMember.deleteMany();
await client.secret.deleteMany();
await client.session.deleteMany();
await client.project.deleteMany();
await client.mcpServer.deleteMany();
await client.mcpTemplate.deleteMany();
await client.groupMember.deleteMany();
await client.group.deleteMany();
await client.rbacDefinition.deleteMany();
await client.user.deleteMany();
}

@@ -23,11 +23,35 @@ async function createUser(overrides: { email?: string; name?: string; role?: 'US
data: {
email: overrides.email ?? `test-${Date.now()}@example.com`,
name: overrides.name ?? 'Test User',
passwordHash: '$2b$10$test-hash-placeholder',
role: overrides.role ?? 'USER',
},
});
}
async function createGroup(overrides: { name?: string; description?: string } = {}) {
return prisma.group.create({
data: {
name: overrides.name ?? `group-${Date.now()}`,
description: overrides.description ?? 'Test group',
},
});
}
async function createProject(overrides: { name?: string; ownerId?: string } = {}) {
let ownerId = overrides.ownerId;
if (!ownerId) {
const user = await createUser();
ownerId = user.id;
}
return prisma.project.create({
data: {
name: overrides.name ?? `project-${Date.now()}`,
ownerId,
},
});
}
async function createServer(overrides: { name?: string; transport?: 'STDIO' | 'SSE' | 'STREAMABLE_HTTP' } = {}) {
return prisma.mcpServer.create({
data: {
@@ -309,3 +333,236 @@ describe('AuditLog', () => {
expect(logs).toHaveLength(0);
});
});
// ── User SSO fields ──
describe('User SSO fields', () => {
it('stores provider and externalId', async () => {
const user = await prisma.user.create({
data: {
email: 'sso@example.com',
passwordHash: 'hash',
provider: 'oidc',
externalId: 'ext-123',
},
});
expect(user.provider).toBe('oidc');
expect(user.externalId).toBe('ext-123');
});
it('defaults provider and externalId to null', async () => {
const user = await createUser();
expect(user.provider).toBeNull();
expect(user.externalId).toBeNull();
});
});
// ── Group model ──
describe('Group', () => {
it('creates a group with defaults', async () => {
const group = await createGroup();
expect(group.id).toBeDefined();
expect(group.version).toBe(1);
});
it('enforces unique name', async () => {
await createGroup({ name: 'devs' });
await expect(createGroup({ name: 'devs' })).rejects.toThrow();
});
it('creates group members', async () => {
const group = await createGroup();
const user = await createUser();
const member = await prisma.groupMember.create({
data: { groupId: group.id, userId: user.id },
});
expect(member.groupId).toBe(group.id);
expect(member.userId).toBe(user.id);
});
it('enforces unique group-user pair', async () => {
const group = await createGroup();
const user = await createUser();
await prisma.groupMember.create({ data: { groupId: group.id, userId: user.id } });
await expect(
prisma.groupMember.create({ data: { groupId: group.id, userId: user.id } }),
).rejects.toThrow();
});
it('cascades delete when group is deleted', async () => {
const group = await createGroup();
const user = await createUser();
await prisma.groupMember.create({ data: { groupId: group.id, userId: user.id } });
await prisma.group.delete({ where: { id: group.id } });
const members = await prisma.groupMember.findMany({ where: { groupId: group.id } });
expect(members).toHaveLength(0);
});
});
// ── RbacDefinition model ──
describe('RbacDefinition', () => {
it('creates with defaults', async () => {
const rbac = await prisma.rbacDefinition.create({
data: { name: 'test-rbac' },
});
expect(rbac.subjects).toEqual([]);
expect(rbac.roleBindings).toEqual([]);
expect(rbac.version).toBe(1);
});
it('enforces unique name', async () => {
await prisma.rbacDefinition.create({ data: { name: 'dup-rbac' } });
await expect(prisma.rbacDefinition.create({ data: { name: 'dup-rbac' } })).rejects.toThrow();
});
it('stores subjects as JSON', async () => {
const rbac = await prisma.rbacDefinition.create({
data: {
name: 'with-subjects',
subjects: [{ kind: 'User', name: 'alice@test.com' }, { kind: 'Group', name: 'devs' }],
},
});
const subjects = rbac.subjects as Array<{ kind: string; name: string }>;
expect(subjects).toHaveLength(2);
expect(subjects[0].kind).toBe('User');
});
it('stores roleBindings as JSON', async () => {
const rbac = await prisma.rbacDefinition.create({
data: {
name: 'with-bindings',
roleBindings: [{ role: 'editor', resource: 'servers' }],
},
});
const bindings = rbac.roleBindings as Array<{ role: string; resource: string }>;
expect(bindings).toHaveLength(1);
expect(bindings[0].role).toBe('editor');
});
it('updates subjects and roleBindings', async () => {
const rbac = await prisma.rbacDefinition.create({ data: { name: 'updatable-rbac' } });
const updated = await prisma.rbacDefinition.update({
where: { id: rbac.id },
data: {
subjects: [{ kind: 'User', name: 'bob@test.com' }],
roleBindings: [{ role: 'admin', resource: '*' }],
},
});
expect((updated.subjects as unknown[]).length).toBe(1);
expect((updated.roleBindings as unknown[]).length).toBe(1);
});
});
// ── ProjectServer model ──
describe('ProjectServer', () => {
it('links project to server', async () => {
const project = await createProject();
const server = await createServer();
const ps = await prisma.projectServer.create({
data: { projectId: project.id, serverId: server.id },
});
expect(ps.projectId).toBe(project.id);
expect(ps.serverId).toBe(server.id);
});
it('enforces unique project-server pair', async () => {
const project = await createProject();
const server = await createServer();
await prisma.projectServer.create({ data: { projectId: project.id, serverId: server.id } });
await expect(
prisma.projectServer.create({ data: { projectId: project.id, serverId: server.id } }),
).rejects.toThrow();
});
it('cascades delete when project is deleted', async () => {
const project = await createProject();
const server = await createServer();
await prisma.projectServer.create({ data: { projectId: project.id, serverId: server.id } });
await prisma.project.delete({ where: { id: project.id } });
const links = await prisma.projectServer.findMany({ where: { projectId: project.id } });
expect(links).toHaveLength(0);
});
it('cascades delete when server is deleted', async () => {
const project = await createProject();
const server = await createServer();
await prisma.projectServer.create({ data: { projectId: project.id, serverId: server.id } });
await prisma.mcpServer.delete({ where: { id: server.id } });
const links = await prisma.projectServer.findMany({ where: { serverId: server.id } });
expect(links).toHaveLength(0);
});
});
// ── ProjectMember model ──
describe('ProjectMember', () => {
it('links project to user with role', async () => {
const user = await createUser();
const project = await createProject({ ownerId: user.id });
const pm = await prisma.projectMember.create({
data: { projectId: project.id, userId: user.id, role: 'admin' },
});
expect(pm.role).toBe('admin');
});
it('defaults role to member', async () => {
const user = await createUser();
const project = await createProject({ ownerId: user.id });
const pm = await prisma.projectMember.create({
data: { projectId: project.id, userId: user.id },
});
expect(pm.role).toBe('member');
});
it('enforces unique project-user pair', async () => {
const user = await createUser();
const project = await createProject({ ownerId: user.id });
await prisma.projectMember.create({ data: { projectId: project.id, userId: user.id } });
await expect(
prisma.projectMember.create({ data: { projectId: project.id, userId: user.id } }),
).rejects.toThrow();
});
it('cascades delete when project is deleted', async () => {
const user = await createUser();
const project = await createProject({ ownerId: user.id });
await prisma.projectMember.create({ data: { projectId: project.id, userId: user.id } });
await prisma.project.delete({ where: { id: project.id } });
const members = await prisma.projectMember.findMany({ where: { projectId: project.id } });
expect(members).toHaveLength(0);
});
});
// ── Project new fields ──
describe('Project new fields', () => {
it('defaults proxyMode to direct', async () => {
const project = await createProject();
expect(project.proxyMode).toBe('direct');
});
it('stores proxyMode, llmProvider, llmModel', async () => {
const user = await createUser();
const project = await prisma.project.create({
data: {
name: 'filtered-project',
ownerId: user.id,
proxyMode: 'filtered',
llmProvider: 'gemini-cli',
llmModel: 'gemini-2.0-flash',
},
});
expect(project.proxyMode).toBe('filtered');
expect(project.llmProvider).toBe('gemini-cli');
expect(project.llmModel).toBe('gemini-2.0-flash');
});
it('defaults llmProvider and llmModel to null', async () => {
const project = await createProject();
expect(project.llmProvider).toBeNull();
expect(project.llmModel).toBeNull();
});
});

View File

@@ -14,6 +14,9 @@ import {
ProjectRepository,
AuditLogRepository,
TemplateRepository,
RbacDefinitionRepository,
UserRepository,
GroupRepository,
} from './repositories/index.js';
import {
McpServerService,
@@ -29,7 +32,15 @@ import {
AuthService,
McpProxyService,
TemplateService,
HealthProbeRunner,
RbacDefinitionService,
RbacService,
UserService,
GroupService,
} from './services/index.js';
import type { RbacAction } from './services/index.js';
import type { UpdateRbacDefinitionInput } from './validation/rbac-definition.schema.js';
import { createAuthMiddleware } from './middleware/auth.js';
import {
registerMcpServerRoutes,
registerSecretRoutes,
@@ -41,8 +52,127 @@ import {
registerAuthRoutes,
registerMcpProxyRoutes,
registerTemplateRoutes,
registerRbacRoutes,
registerUserRoutes,
registerGroupRoutes,
} from './routes/index.js';
type PermissionCheck =
| { kind: 'resource'; resource: string; action: RbacAction; resourceName?: string }
| { kind: 'operation'; operation: string }
| { kind: 'skip' };
/**
* Map an HTTP method + URL to a permission check.
* Returns 'skip' for URLs that should not be RBAC-checked.
*/
function mapUrlToPermission(method: string, url: string): PermissionCheck {
const match = url.match(/^\/api\/v1\/([a-z-]+)/);
if (!match) return { kind: 'skip' };
const segment = match[1] as string;
// Operations (non-resource endpoints)
if (segment === 'backup') return { kind: 'operation', operation: 'backup' };
if (segment === 'restore') return { kind: 'operation', operation: 'restore' };
if (segment === 'audit-logs' && method === 'DELETE') return { kind: 'operation', operation: 'audit-purge' };
const resourceMap: Record<string, string | undefined> = {
'servers': 'servers',
'instances': 'instances',
'secrets': 'secrets',
'projects': 'projects',
'templates': 'templates',
'users': 'users',
'groups': 'groups',
'rbac': 'rbac',
'audit-logs': 'rbac',
'mcp': 'servers',
};
const resource = resourceMap[segment];
if (resource === undefined) return { kind: 'skip' };
// Special case: /api/v1/projects/:id/mcp-config → requires 'expose' permission
const mcpConfigMatch = url.match(/^\/api\/v1\/projects\/([^/?]+)\/mcp-config/);
if (mcpConfigMatch?.[1]) {
return { kind: 'resource', resource: 'projects', action: 'expose', resourceName: mcpConfigMatch[1] };
}
// Special case: /api/v1/projects/:id/servers — attach/detach requires 'edit'
const projectServersMatch = url.match(/^\/api\/v1\/projects\/([^/?]+)\/servers/);
if (projectServersMatch?.[1] && method !== 'GET') {
return { kind: 'resource', resource: 'projects', action: 'edit', resourceName: projectServersMatch[1] };
}
// Map HTTP method to action
let action: RbacAction;
switch (method) {
case 'GET':
case 'HEAD':
action = 'view';
break;
case 'POST':
action = 'create';
break;
case 'DELETE':
action = 'delete';
break;
default: // PUT, PATCH
action = 'edit';
break;
}
// Extract resource name/ID from URL (3rd segment: /api/v1/servers/:nameOrId)
const nameMatch = url.match(/^\/api\/v1\/[a-z-]+\/([^/?]+)/);
const resourceName = nameMatch?.[1];
const check: PermissionCheck = { kind: 'resource', resource, action };
if (resourceName !== undefined) (check as { resourceName: string }).resourceName = resourceName;
return check;
}
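
The mapping above can be exercised in isolation. This is a simplified standalone sketch (it omits the project `/mcp-config` and `/servers` special cases and most of the resource map), just to show how method + URL collapse into a permission check:

```typescript
// Simplified sketch of mapUrlToPermission — not the full implementation.
type Check =
  | { kind: 'resource'; resource: string; action: string; resourceName?: string }
  | { kind: 'operation'; operation: string }
  | { kind: 'skip' };

function mapUrl(method: string, url: string): Check {
  const match = url.match(/^\/api\/v1\/([a-z-]+)/);
  if (!match) return { kind: 'skip' };
  const segment = match[1] ?? '';
  // Non-resource operation endpoints
  if (segment === 'backup') return { kind: 'operation', operation: 'backup' };
  const resourceMap: Record<string, string> = {
    servers: 'servers',
    instances: 'instances',
    secrets: 'secrets',
    projects: 'projects',
    'audit-logs': 'rbac',
  };
  const resource = resourceMap[segment];
  if (resource === undefined) return { kind: 'skip' };
  // HTTP verb → RBAC action
  const action =
    method === 'GET' || method === 'HEAD' ? 'view'
    : method === 'POST' ? 'create'
    : method === 'DELETE' ? 'delete'
    : 'edit';
  // Optional name/ID in the third path segment
  const name = url.match(/^\/api\/v1\/[a-z-]+\/([^/?]+)/)?.[1];
  return name !== undefined
    ? { kind: 'resource', resource, action, resourceName: name }
    : { kind: 'resource', resource, action };
}

console.log(JSON.stringify(mapUrl('GET', '/api/v1/servers/my-grafana')));
// a GET on a named server becomes a name-scoped 'view' check
```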
/**
* Migrate legacy 'admin' role bindings → granular roles.
* Old format: { role: 'admin', resource: '*' }
* New format: { role: 'edit', resource: '*' }, { role: 'run', resource: '*' },
* plus operation bindings for impersonate, logs, backup, restore, audit-purge
*/
async function migrateAdminRole(rbacRepo: InstanceType<typeof RbacDefinitionRepository>): Promise<void> {
const definitions = await rbacRepo.findAll();
for (const def of definitions) {
const bindings = def.roleBindings as Array<Record<string, unknown>>;
const hasAdminRole = bindings.some((b) => b['role'] === 'admin');
if (!hasAdminRole) continue;
// Replace admin bindings with granular equivalents
const newBindings: Array<Record<string, string>> = [];
for (const b of bindings) {
if (b['role'] === 'admin') {
const resource = b['resource'] as string;
newBindings.push({ role: 'edit', resource });
newBindings.push({ role: 'run', resource });
} else {
newBindings.push(b as Record<string, string>);
}
}
// Add operation bindings (idempotent — only for wildcard admin)
const hasWildcard = bindings.some((b) => b['role'] === 'admin' && b['resource'] === '*');
if (hasWildcard) {
const ops = ['impersonate', 'logs', 'backup', 'restore', 'audit-purge'];
for (const op of ops) {
if (!newBindings.some((b) => b['action'] === op)) {
newBindings.push({ role: 'run', action: op });
}
}
}
await rbacRepo.update(def.id, { roleBindings: newBindings as UpdateRbacDefinitionInput['roleBindings'] });
// eslint-disable-next-line no-console
console.log(`mcpd: migrated RBAC '${def.name}' from admin → granular roles`);
}
}
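
The admin → granular expansion can be illustrated on sample data. This standalone re-statement of the transform (not the repository-backed function above) shows what a single wildcard admin binding becomes:

```typescript
// Re-statement of the migrateAdminRole binding transform on plain arrays:
// each { role: 'admin' } binding expands into an 'edit' plus a 'run'
// binding; a wildcard admin additionally receives the five operation
// bindings (impersonate, logs, backup, restore, audit-purge).
type Binding = Record<string, string>;

function expandAdmin(bindings: Binding[]): Binding[] {
  const out: Binding[] = [];
  for (const b of bindings) {
    if (b['role'] === 'admin') {
      out.push({ role: 'edit', resource: b['resource'] ?? '' });
      out.push({ role: 'run', resource: b['resource'] ?? '' });
    } else {
      out.push(b); // non-admin bindings pass through unchanged
    }
  }
  const hasWildcard = bindings.some((b) => b['role'] === 'admin' && b['resource'] === '*');
  if (hasWildcard) {
    for (const op of ['impersonate', 'logs', 'backup', 'restore', 'audit-purge']) {
      if (!out.some((b) => b['action'] === op)) out.push({ role: 'run', action: op });
    }
  }
  return out;
}

const migrated = expandAdmin([{ role: 'admin', resource: '*' }]);
console.log(migrated.length); // 2 granular bindings + 5 operation bindings = 7
```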
async function main(): Promise<void> {
const config = loadConfigFromEnv();
@@ -81,6 +211,21 @@ async function main(): Promise<void> {
const projectRepo = new ProjectRepository(prisma);
const auditLogRepo = new AuditLogRepository(prisma);
const templateRepo = new TemplateRepository(prisma);
const rbacDefinitionRepo = new RbacDefinitionRepository(prisma);
const userRepo = new UserRepository(prisma);
const groupRepo = new GroupRepository(prisma);
// CUID detection for RBAC name resolution
const CUID_RE = /^c[^\s-]{8,}$/i;
const nameResolvers: Record<string, { findById(id: string): Promise<{ name: string } | null> }> = {
servers: serverRepo,
secrets: secretRepo,
projects: projectRepo,
groups: groupRepo,
};
// Migrate legacy 'admin' role → granular roles
await migrateAdminRole(rbacDefinitionRepo);
// Orchestrator
const orchestrator = new DockerContainerManager();
@@ -90,15 +235,24 @@ async function main(): Promise<void> {
const instanceService = new InstanceService(instanceRepo, serverRepo, orchestrator, secretRepo);
serverService.setInstanceService(instanceService);
const secretService = new SecretService(secretRepo);
const projectService = new ProjectService(projectRepo);
const projectService = new ProjectService(projectRepo, serverRepo, secretRepo);
const auditLogService = new AuditLogService(auditLogRepo);
const metricsCollector = new MetricsCollector();
const healthAggregator = new HealthAggregator(metricsCollector, orchestrator);
const backupService = new BackupService(serverRepo, projectRepo, secretRepo);
const restoreService = new RestoreService(serverRepo, projectRepo, secretRepo);
const backupService = new BackupService(serverRepo, projectRepo, secretRepo, userRepo, groupRepo, rbacDefinitionRepo);
const restoreService = new RestoreService(serverRepo, projectRepo, secretRepo, userRepo, groupRepo, rbacDefinitionRepo);
const authService = new AuthService(prisma);
const templateService = new TemplateService(templateRepo);
const mcpProxyService = new McpProxyService(instanceRepo, serverRepo);
const rbacDefinitionService = new RbacDefinitionService(rbacDefinitionRepo);
const rbacService = new RbacService(rbacDefinitionRepo, prisma);
const userService = new UserService(userRepo);
const groupService = new GroupService(groupRepo, userRepo);
// Auth middleware for global hooks
const authMiddleware = createAuthMiddleware({
findSession: (token) => authService.findSession(token),
});
// Server
const app = await createServer(config, {
@@ -114,6 +268,55 @@ async function main(): Promise<void> {
},
});
// ── Global auth hook ──
// Runs on all /api/v1/* routes EXCEPT auth endpoints and health checks.
// Tests that use createServer() directly are NOT affected — this hook
// is only registered here in main.ts.
app.addHook('preHandler', async (request, reply) => {
const url = request.url;
// Skip auth for health, auth, and root
if (url.startsWith('/api/v1/auth/') || url === '/healthz' || url === '/health') return;
if (!url.startsWith('/api/v1/')) return;
// Run auth middleware
await authMiddleware(request, reply);
});
// ── Global RBAC hook ──
// Runs after the auth hook. Maps URL to resource+action and checks permissions.
app.addHook('preHandler', async (request, reply) => {
if (reply.sent) return; // Auth hook already rejected
const url = request.url;
if (url.startsWith('/api/v1/auth/') || url === '/healthz' || url === '/health') return;
if (!url.startsWith('/api/v1/')) return;
if (request.userId === undefined) return; // Auth hook will handle 401
const check = mapUrlToPermission(request.method, url);
if (check.kind === 'skip') return;
let allowed: boolean;
if (check.kind === 'operation') {
allowed = await rbacService.canRunOperation(request.userId, check.operation);
} else {
// Resolve CUID → human name for name-scoped RBAC bindings
if (check.resourceName !== undefined && CUID_RE.test(check.resourceName)) {
const resolver = nameResolvers[check.resource];
if (resolver) {
const entity = await resolver.findById(check.resourceName);
if (entity) check.resourceName = entity.name;
}
}
allowed = await rbacService.canAccess(request.userId, check.action, check.resource, check.resourceName);
// Compute scope for list filtering (used by preSerialization hook)
if (allowed && check.resourceName === undefined) {
request.rbacScope = await rbacService.getAllowedScope(request.userId, check.action, check.resource);
}
}
if (!allowed) {
reply.code(403).send({ error: 'Forbidden' });
}
});
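
The CUID heuristic used in the hook above is worth seeing on sample inputs (the regex is copied from the hook; the IDs below are made-up samples): a CUID starts with `c` followed by at least eight non-space, non-hyphen characters, so human-readable names like `my-grafana` fall through without a database lookup.

```typescript
// CUID detection regex as used for RBAC name resolution.
const CUID_RE = /^c[^\s-]{8,}$/i;

console.log(CUID_RE.test('clx3k9q2h0001')); // true  — looks like a CUID, resolve it to a name
console.log(CUID_RE.test('my-grafana'));    // false — hyphenated human name, use as-is
console.log(CUID_RE.test('config'));        // false — too short after the leading 'c'
```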
// Routes
registerMcpServerRoutes(app, serverService, instanceService);
registerTemplateRoutes(app, templateService);
@@ -123,20 +326,57 @@ async function main(): Promise<void> {
registerAuditLogRoutes(app, auditLogService);
registerHealthMonitoringRoutes(app, { healthAggregator, metricsCollector });
registerBackupRoutes(app, { backupService, restoreService });
registerAuthRoutes(app, { authService });
registerAuthRoutes(app, { authService, userService, groupService, rbacDefinitionService, rbacService });
registerMcpProxyRoutes(app, {
mcpProxyService,
auditLogService,
authDeps: { findSession: (token) => authService.findSession(token) },
});
registerRbacRoutes(app, rbacDefinitionService);
registerUserRoutes(app, userService);
registerGroupRoutes(app, groupService);
// ── RBAC list filtering hook ──
// Filters array responses to only include resources the user is allowed to see.
app.addHook('preSerialization', async (request, _reply, payload) => {
if (!request.rbacScope || request.rbacScope.wildcard) return payload;
if (!Array.isArray(payload)) return payload;
return (payload as Array<Record<string, unknown>>).filter((item) => {
const name = item['name'];
return typeof name === 'string' && request.rbacScope!.names.has(name);
});
});
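
The filtering behaviour of the hook above reduces to a small pure function. A minimal sketch, with hypothetical server names:

```typescript
// Sketch of the RBAC list-filtering rule: with a wildcard scope the
// payload passes through untouched; otherwise only items whose 'name'
// is a string in the allowed set survive.
interface Scope { wildcard: boolean; names: Set<string> }

function filterByScope<T extends { name?: unknown }>(payload: T[], scope: Scope): T[] {
  if (scope.wildcard) return payload;
  return payload.filter((item) => typeof item.name === 'string' && scope.names.has(item.name));
}

const servers = [{ name: 'my-grafana' }, { name: 'my-ha' }, { name: 'hidden-srv' }];
const visible = filterByScope(servers, { wildcard: false, names: new Set(['my-grafana', 'my-ha']) });
console.log(visible.map((s) => s.name)); // [ 'my-grafana', 'my-ha' ]
```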
// Start
await app.listen({ port: config.port, host: config.host });
app.log.info(`mcpd listening on ${config.host}:${config.port}`);
// Periodic container liveness sync — detect crashed containers
const SYNC_INTERVAL_MS = 30_000; // 30s
const syncTimer = setInterval(async () => {
try {
await instanceService.syncStatus();
} catch (err) {
app.log.error({ err }, 'Container status sync failed');
}
}, SYNC_INTERVAL_MS);
// Health probe runner — periodic MCP tool-call probes (like k8s livenessProbe)
const healthProbeRunner = new HealthProbeRunner(
instanceRepo,
serverRepo,
orchestrator,
{ info: (msg) => app.log.info(msg), error: (obj, msg) => app.log.error(obj, msg) },
);
healthProbeRunner.start(15_000);
// Graceful shutdown
setupGracefulShutdown(app, {
disconnectDb: () => prisma.$disconnect(),
disconnectDb: async () => {
clearInterval(syncTimer);
healthProbeRunner.stop();
await prisma.$disconnect();
},
});
}

@@ -7,6 +7,7 @@ export interface AuthDeps {
declare module 'fastify' {
interface FastifyRequest {
userId?: string;
rbacScope?: { wildcard: boolean; names: Set<string> };
}
}

@@ -0,0 +1,36 @@
import type { FastifyRequest, FastifyReply } from 'fastify';
import type { RbacService, RbacAction } from '../services/rbac.service.js';
export function createRbacMiddleware(rbacService: RbacService) {
function requirePermission(resource: string, action: RbacAction, resourceName?: string) {
return async (request: FastifyRequest, reply: FastifyReply): Promise<void> => {
if (request.userId === undefined) {
reply.code(401).send({ error: 'Authentication required' });
return;
}
const allowed = await rbacService.canAccess(request.userId, action, resource, resourceName);
if (!allowed) {
reply.code(403).send({ error: 'Forbidden' });
return;
}
};
}
function requireOperation(operation: string) {
return async (request: FastifyRequest, reply: FastifyReply): Promise<void> => {
if (request.userId === undefined) {
reply.code(401).send({ error: 'Authentication required' });
return;
}
const allowed = await rbacService.canRunOperation(request.userId, operation);
if (!allowed) {
reply.code(403).send({ error: 'Forbidden' });
return;
}
};
}
return { requirePermission, requireOperation };
}

@@ -0,0 +1,93 @@
import type { PrismaClient, Group } from '@prisma/client';
export interface GroupWithMembers extends Group {
members: Array<{ id: string; user: { id: string; email: string; name: string | null } }>;
}
export interface IGroupRepository {
findAll(): Promise<GroupWithMembers[]>;
findById(id: string): Promise<GroupWithMembers | null>;
findByName(name: string): Promise<GroupWithMembers | null>;
create(data: { name: string; description?: string }): Promise<Group>;
update(id: string, data: { description?: string }): Promise<Group>;
delete(id: string): Promise<void>;
setMembers(groupId: string, userIds: string[]): Promise<void>;
findGroupsForUser(userId: string): Promise<Array<{ id: string; name: string }>>;
}
const MEMBERS_INCLUDE = {
members: {
select: {
id: true,
user: {
select: { id: true, email: true, name: true },
},
},
},
} as const;
export class GroupRepository implements IGroupRepository {
constructor(private readonly prisma: PrismaClient) {}
async findAll(): Promise<GroupWithMembers[]> {
return this.prisma.group.findMany({
orderBy: { name: 'asc' },
include: MEMBERS_INCLUDE,
});
}
async findById(id: string): Promise<GroupWithMembers | null> {
return this.prisma.group.findUnique({
where: { id },
include: MEMBERS_INCLUDE,
});
}
async findByName(name: string): Promise<GroupWithMembers | null> {
return this.prisma.group.findUnique({
where: { name },
include: MEMBERS_INCLUDE,
});
}
async create(data: { name: string; description?: string }): Promise<Group> {
const createData: Record<string, unknown> = { name: data.name };
if (data.description !== undefined) createData['description'] = data.description;
return this.prisma.group.create({
data: createData as Parameters<PrismaClient['group']['create']>[0]['data'],
});
}
async update(id: string, data: { description?: string }): Promise<Group> {
const updateData: Record<string, unknown> = {};
if (data.description !== undefined) updateData['description'] = data.description;
return this.prisma.group.update({ where: { id }, data: updateData });
}
async delete(id: string): Promise<void> {
await this.prisma.group.delete({ where: { id } });
}
async setMembers(groupId: string, userIds: string[]): Promise<void> {
await this.prisma.$transaction(async (tx) => {
await tx.groupMember.deleteMany({ where: { groupId } });
if (userIds.length > 0) {
await tx.groupMember.createMany({
data: userIds.map((userId) => ({ groupId, userId })),
});
}
});
}
async findGroupsForUser(userId: string): Promise<Array<{ id: string; name: string }>> {
const memberships = await this.prisma.groupMember.findMany({
where: { userId },
select: {
group: {
select: { id: true, name: true },
},
},
});
return memberships.map((m) => m.group);
}
}
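
Note the semantics of `setMembers`: it replaces the membership list wholesale (deleteMany then createMany inside one transaction), rather than merging. A hypothetical in-memory analogue, purely to illustrate the contract:

```typescript
// In-memory stand-in for GroupRepository.setMembers — NOT the Prisma
// implementation. Calling it again with a new list replaces the old
// membership set entirely; nothing is appended.
const memberships = new Map<string, Set<string>>(); // groupId → userIds

function setMembers(groupId: string, userIds: string[]): void {
  memberships.set(groupId, new Set(userIds)); // old members dropped, new set installed
}

setMembers('g1', ['u1', 'u2']);
setMembers('g1', ['u3']); // replaces, does not merge
console.log([...(memberships.get('g1') ?? [])]); // [ 'u3' ]
```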

@@ -1,9 +1,15 @@
export type { IMcpServerRepository, IMcpInstanceRepository, ISecretRepository, IAuditLogRepository, AuditLogFilter } from './interfaces.js';
export { McpServerRepository } from './mcp-server.repository.js';
export { SecretRepository } from './secret.repository.js';
export type { IProjectRepository } from './project.repository.js';
export type { IProjectRepository, ProjectWithRelations } from './project.repository.js';
export { ProjectRepository } from './project.repository.js';
export { McpInstanceRepository } from './mcp-instance.repository.js';
export { AuditLogRepository } from './audit-log.repository.js';
export type { ITemplateRepository } from './template.repository.js';
export { TemplateRepository } from './template.repository.js';
export type { IRbacDefinitionRepository } from './rbac-definition.repository.js';
export { RbacDefinitionRepository } from './rbac-definition.repository.js';
export type { IUserRepository, SafeUser } from './user.repository.js';
export { UserRepository } from './user.repository.js';
export type { IGroupRepository, GroupWithMembers } from './group.repository.js';
export { GroupRepository } from './group.repository.js';

View File

@@ -11,12 +11,16 @@ export class McpInstanceRepository implements IMcpInstanceRepository {
}
return this.prisma.mcpInstance.findMany({
where,
include: { server: { select: { name: true } } },
orderBy: { createdAt: 'desc' },
});
}
async findById(id: string): Promise<McpInstance | null> {
return this.prisma.mcpInstance.findUnique({ where: { id } });
return this.prisma.mcpInstance.findUnique({
where: { id },
include: { server: { select: { name: true } } },
});
}
async findByContainerId(containerId: string): Promise<McpInstance | null> {

View File

@@ -1,49 +1,91 @@
import type { PrismaClient, Project } from '@prisma/client';
import type { CreateProjectInput, UpdateProjectInput } from '../validation/project.schema.js';
export interface ProjectWithRelations extends Project {
servers: Array<{ id: string; server: { id: string; name: string } }>;
}
const PROJECT_INCLUDE = {
servers: { include: { server: { select: { id: true, name: true } } } },
} as const;
export interface IProjectRepository {
findAll(ownerId?: string): Promise<Project[]>;
findById(id: string): Promise<Project | null>;
findByName(name: string): Promise<Project | null>;
create(data: CreateProjectInput & { ownerId: string }): Promise<Project>;
update(id: string, data: UpdateProjectInput): Promise<Project>;
findAll(ownerId?: string): Promise<ProjectWithRelations[]>;
findById(id: string): Promise<ProjectWithRelations | null>;
findByName(name: string): Promise<ProjectWithRelations | null>;
create(data: { name: string; description: string; ownerId: string; proxyMode: string; llmProvider?: string; llmModel?: string }): Promise<ProjectWithRelations>;
update(id: string, data: Record<string, unknown>): Promise<ProjectWithRelations>;
delete(id: string): Promise<void>;
setServers(projectId: string, serverIds: string[]): Promise<void>;
addServer(projectId: string, serverId: string): Promise<void>;
removeServer(projectId: string, serverId: string): Promise<void>;
}
export class ProjectRepository implements IProjectRepository {
constructor(private readonly prisma: PrismaClient) {}
async findAll(ownerId?: string): Promise<Project[]> {
async findAll(ownerId?: string): Promise<ProjectWithRelations[]> {
const where = ownerId !== undefined ? { ownerId } : {};
return this.prisma.project.findMany({ where, orderBy: { name: 'asc' } });
return this.prisma.project.findMany({ where, orderBy: { name: 'asc' }, include: PROJECT_INCLUDE }) as unknown as Promise<ProjectWithRelations[]>;
}
async findById(id: string): Promise<Project | null> {
return this.prisma.project.findUnique({ where: { id } });
async findById(id: string): Promise<ProjectWithRelations | null> {
return this.prisma.project.findUnique({ where: { id }, include: PROJECT_INCLUDE }) as unknown as Promise<ProjectWithRelations | null>;
}
async findByName(name: string): Promise<Project | null> {
return this.prisma.project.findUnique({ where: { name } });
async findByName(name: string): Promise<ProjectWithRelations | null> {
return this.prisma.project.findUnique({ where: { name }, include: PROJECT_INCLUDE }) as unknown as Promise<ProjectWithRelations | null>;
}
async create(data: CreateProjectInput & { ownerId: string }): Promise<Project> {
return this.prisma.project.create({
data: {
async create(data: { name: string; description: string; ownerId: string; proxyMode: string; llmProvider?: string; llmModel?: string }): Promise<ProjectWithRelations> {
const createData: Record<string, unknown> = {
name: data.name,
description: data.description,
ownerId: data.ownerId,
},
});
proxyMode: data.proxyMode,
};
if (data.llmProvider !== undefined) createData['llmProvider'] = data.llmProvider;
if (data.llmModel !== undefined) createData['llmModel'] = data.llmModel;
return this.prisma.project.create({
data: createData as Parameters<PrismaClient['project']['create']>[0]['data'],
include: PROJECT_INCLUDE,
}) as unknown as Promise<ProjectWithRelations>;
}
async update(id: string, data: UpdateProjectInput): Promise<Project> {
const updateData: Record<string, unknown> = {};
if (data.description !== undefined) updateData['description'] = data.description;
return this.prisma.project.update({ where: { id }, data: updateData });
async update(id: string, data: Record<string, unknown>): Promise<ProjectWithRelations> {
return this.prisma.project.update({
where: { id },
data,
include: PROJECT_INCLUDE,
}) as unknown as Promise<ProjectWithRelations>;
}
async delete(id: string): Promise<void> {
await this.prisma.project.delete({ where: { id } });
}
async setServers(projectId: string, serverIds: string[]): Promise<void> {
await this.prisma.$transaction(async (tx) => {
await tx.projectServer.deleteMany({ where: { projectId } });
if (serverIds.length > 0) {
await tx.projectServer.createMany({
data: serverIds.map((serverId) => ({ projectId, serverId })),
});
}
});
}
async addServer(projectId: string, serverId: string): Promise<void> {
await this.prisma.projectServer.upsert({
where: { projectId_serverId: { projectId, serverId } },
create: { projectId, serverId },
update: {},
});
}
async removeServer(projectId: string, serverId: string): Promise<void> {
await this.prisma.projectServer.deleteMany({
where: { projectId, serverId },
});
}
}

View File

@@ -0,0 +1,48 @@
import type { PrismaClient, RbacDefinition } from '@prisma/client';
import type { CreateRbacDefinitionInput, UpdateRbacDefinitionInput } from '../validation/rbac-definition.schema.js';
export interface IRbacDefinitionRepository {
findAll(): Promise<RbacDefinition[]>;
findById(id: string): Promise<RbacDefinition | null>;
findByName(name: string): Promise<RbacDefinition | null>;
create(data: CreateRbacDefinitionInput): Promise<RbacDefinition>;
update(id: string, data: UpdateRbacDefinitionInput): Promise<RbacDefinition>;
delete(id: string): Promise<void>;
}
export class RbacDefinitionRepository implements IRbacDefinitionRepository {
constructor(private readonly prisma: PrismaClient) {}
async findAll(): Promise<RbacDefinition[]> {
return this.prisma.rbacDefinition.findMany({ orderBy: { name: 'asc' } });
}
async findById(id: string): Promise<RbacDefinition | null> {
return this.prisma.rbacDefinition.findUnique({ where: { id } });
}
async findByName(name: string): Promise<RbacDefinition | null> {
return this.prisma.rbacDefinition.findUnique({ where: { name } });
}
async create(data: CreateRbacDefinitionInput): Promise<RbacDefinition> {
return this.prisma.rbacDefinition.create({
data: {
name: data.name,
subjects: data.subjects,
roleBindings: data.roleBindings,
},
});
}
async update(id: string, data: UpdateRbacDefinitionInput): Promise<RbacDefinition> {
const updateData: Record<string, unknown> = {};
if (data.subjects !== undefined) updateData['subjects'] = data.subjects;
if (data.roleBindings !== undefined) updateData['roleBindings'] = data.roleBindings;
return this.prisma.rbacDefinition.update({ where: { id }, data: updateData });
}
async delete(id: string): Promise<void> {
await this.prisma.rbacDefinition.delete({ where: { id } });
}
}

View File

@@ -0,0 +1,76 @@
import type { PrismaClient, User } from '@prisma/client';
/** User without the passwordHash field — safe for API responses. */
export type SafeUser = Omit<User, 'passwordHash'>;
export interface IUserRepository {
findAll(): Promise<SafeUser[]>;
findById(id: string): Promise<SafeUser | null>;
findByEmail(email: string, includeHash?: boolean): Promise<User | SafeUser | null>;
create(data: { email: string; passwordHash: string; name?: string; role?: string }): Promise<SafeUser>;
delete(id: string): Promise<void>;
count(): Promise<number>;
}
/** Fields to select when passwordHash must be excluded. */
const safeSelect = {
id: true,
email: true,
name: true,
role: true,
provider: true,
externalId: true,
version: true,
createdAt: true,
updatedAt: true,
} as const;
export class UserRepository implements IUserRepository {
constructor(private readonly prisma: PrismaClient) {}
async findAll(): Promise<SafeUser[]> {
return this.prisma.user.findMany({
select: safeSelect,
orderBy: { email: 'asc' },
});
}
async findById(id: string): Promise<SafeUser | null> {
return this.prisma.user.findUnique({
where: { id },
select: safeSelect,
});
}
async findByEmail(email: string, includeHash?: boolean): Promise<User | SafeUser | null> {
if (includeHash === true) {
return this.prisma.user.findUnique({ where: { email } });
}
return this.prisma.user.findUnique({
where: { email },
select: safeSelect,
});
}
async create(data: { email: string; passwordHash: string; name?: string; role?: string }): Promise<SafeUser> {
const createData: Record<string, unknown> = {
email: data.email,
passwordHash: data.passwordHash,
};
if (data.name !== undefined) createData['name'] = data.name;
if (data.role !== undefined) createData['role'] = data.role;
return this.prisma.user.create({
data: createData as Parameters<PrismaClient['user']['create']>[0]['data'],
select: safeSelect,
});
}
async delete(id: string): Promise<void> {
await this.prisma.user.delete({ where: { id } });
}
async count(): Promise<number> {
return this.prisma.user.count();
}
}
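The `safeSelect` projection keeps `passwordHash` out of every read path by construction rather than by filtering after the fact. The same guarantee can be sketched with a TypeScript `Omit` plus a runtime strip (hypothetical `User` shape, not the Prisma model):

```typescript
interface User { id: string; email: string; passwordHash: string; role: string }
type SafeUser = Omit<User, "passwordHash">;

// Runtime counterpart of the Prisma `select: safeSelect` projection.
function toSafeUser(user: User): SafeUser {
  const { passwordHash: _omit, ...safe } = user;
  return safe;
}

const safe = toSafeUser({ id: "u1", email: "a@b.c", passwordHash: "secret", role: "admin" });
```

Because the select list is shared by `findAll`, `findById`, and `create`, there is no code path that can accidentally serialize the hash into an API response.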

View File

@@ -1,15 +1,76 @@
import type { FastifyInstance } from 'fastify';
import type { AuthService } from '../services/auth.service.js';
import type { UserService } from '../services/user.service.js';
import type { GroupService } from '../services/group.service.js';
import type { RbacDefinitionService } from '../services/rbac-definition.service.js';
import type { RbacService } from '../services/rbac.service.js';
import { createAuthMiddleware } from '../middleware/auth.js';
import { createRbacMiddleware } from '../middleware/rbac.js';
export interface AuthRouteDeps {
authService: AuthService;
userService: UserService;
groupService: GroupService;
rbacDefinitionService: RbacDefinitionService;
rbacService: RbacService;
}
export function registerAuthRoutes(app: FastifyInstance, deps: AuthRouteDeps): void {
const authMiddleware = createAuthMiddleware({
findSession: (token) => deps.authService.findSession(token),
});
const { requireOperation } = createRbacMiddleware(deps.rbacService);
// GET /api/v1/auth/status — unauthenticated, returns whether any users exist
app.get('/api/v1/auth/status', async () => {
const count = await deps.userService.count();
return { hasUsers: count > 0 };
});
// POST /api/v1/auth/bootstrap — only works when no users exist (first-run setup)
app.post('/api/v1/auth/bootstrap', async (request, reply) => {
const count = await deps.userService.count();
if (count > 0) {
reply.code(409).send({ error: 'Users already exist. Use login instead.' });
return;
}
const { email, password, name } = request.body as { email: string; password: string; name?: string };
// Create the first admin user
await deps.userService.create({
email,
password,
...(name !== undefined ? { name } : {}),
});
// Create "admin" group and add the first user to it
await deps.groupService.create({
name: 'admin',
description: 'Bootstrap admin group',
members: [email],
});
// Create bootstrap RBAC: full resource access + all operations
await deps.rbacDefinitionService.create({
name: 'bootstrap-admin',
subjects: [{ kind: 'Group', name: 'admin' }],
roleBindings: [
{ role: 'edit', resource: '*' },
{ role: 'run', resource: '*' },
{ role: 'run', action: 'impersonate' },
{ role: 'run', action: 'logs' },
{ role: 'run', action: 'backup' },
{ role: 'run', action: 'restore' },
{ role: 'run', action: 'audit-purge' },
],
});
// Auto-login so the caller gets a token immediately
const session = await deps.authService.login(email, password);
reply.code(201);
return session;
});
// POST /api/v1/auth/login — no auth required
app.post<{
@@ -28,4 +89,15 @@ export function registerAuthRoutes(app: FastifyInstance, deps: AuthRouteDeps): v
await deps.authService.logout(token);
return { success: true };
});
// POST /api/v1/auth/impersonate — requires auth + run:impersonate operation
app.post(
'/api/v1/auth/impersonate',
{ preHandler: [authMiddleware, requireOperation('impersonate')] },
async (request) => {
const { email } = request.body as { email: string };
const result = await deps.authService.impersonate(email);
return result;
},
);
}
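The bootstrap route is guarded solely by the user count: it succeeds exactly once on a fresh install, then returns 409 forever after. A sketch of that first-run gate (in-memory counter standing in for `userService.count()`):

```typescript
let userCount = 0;

// Mirrors POST /api/v1/auth/bootstrap: only the very first call creates the admin.
function bootstrap(): { status: number } {
  if (userCount > 0) return { status: 409 }; // "Users already exist. Use login instead."
  userCount += 1;                            // first admin + admin group + bootstrap RBAC
  return { status: 201 };
}

const first = bootstrap();
const second = bootstrap();
```

This is why the route needs no auth middleware: once any user exists, the endpoint is inert.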

View File

@@ -13,7 +13,7 @@ export function registerBackupRoutes(app: FastifyInstance, deps: BackupDeps): vo
app.post<{
Body: {
password?: string;
resources?: Array<'servers' | 'secrets' | 'projects'>;
resources?: Array<'servers' | 'secrets' | 'projects' | 'users' | 'groups' | 'rbac'>;
};
}>('/api/v1/backup', async (request) => {
const opts: BackupOptions = {};
@@ -51,7 +51,7 @@ export function registerBackupRoutes(app: FastifyInstance, deps: BackupDeps): vo
const result = await deps.restoreService.restore(bundle, restoreOpts);
if (result.errors.length > 0 && result.serversCreated === 0 && result.secretsCreated === 0 && result.projectsCreated === 0) {
if (result.errors.length > 0 && result.serversCreated === 0 && result.secretsCreated === 0 && result.projectsCreated === 0 && result.usersCreated === 0 && result.groupsCreated === 0 && result.rbacCreated === 0) {
reply.code(422);
}

View File

@@ -0,0 +1,35 @@
import type { FastifyInstance } from 'fastify';
import type { GroupService } from '../services/group.service.js';
export function registerGroupRoutes(
app: FastifyInstance,
service: GroupService,
): void {
app.get('/api/v1/groups', async () => {
return service.list();
});
app.get<{ Params: { id: string } }>('/api/v1/groups/:id', async (request) => {
// Try by ID first, fall back to name lookup
try {
return await service.getById(request.params.id);
} catch {
return service.getByName(request.params.id);
}
});
app.post('/api/v1/groups', async (request, reply) => {
const group = await service.create(request.body);
reply.code(201);
return group;
});
app.put<{ Params: { id: string } }>('/api/v1/groups/:id', async (request) => {
return service.update(request.params.id, request.body);
});
app.delete<{ Params: { id: string } }>('/api/v1/groups/:id', async (request, reply) => {
await service.delete(request.params.id);
reply.code(204);
});
}

View File

@@ -14,3 +14,6 @@ export type { AuthRouteDeps } from './auth.js';
export { registerMcpProxyRoutes } from './mcp-proxy.js';
export type { McpProxyRouteDeps } from './mcp-proxy.js';
export { registerTemplateRoutes } from './templates.js';
export { registerRbacRoutes } from './rbac-definitions.js';
export { registerUserRoutes } from './users.js';
export { registerGroupRoutes } from './groups.js';

View File

@@ -2,13 +2,13 @@ import type { FastifyInstance } from 'fastify';
import type { ProjectService } from '../services/project.service.js';
export function registerProjectRoutes(app: FastifyInstance, service: ProjectService): void {
app.get('/api/v1/projects', async (request) => {
// If authenticated, filter by owner; otherwise list all
return service.list(request.userId);
app.get('/api/v1/projects', async () => {
// RBAC preSerialization hook handles access filtering
return service.list();
});
app.get<{ Params: { id: string } }>('/api/v1/projects/:id', async (request) => {
return service.getById(request.params.id);
return service.resolveAndGet(request.params.id);
});
app.post('/api/v1/projects', async (request, reply) => {
@@ -19,11 +19,39 @@ export function registerProjectRoutes(app: FastifyInstance, service: ProjectServ
});
app.put<{ Params: { id: string } }>('/api/v1/projects/:id', async (request) => {
return service.update(request.params.id, request.body);
const project = await service.resolveAndGet(request.params.id);
return service.update(project.id, request.body);
});
app.delete<{ Params: { id: string } }>('/api/v1/projects/:id', async (request, reply) => {
await service.delete(request.params.id);
const project = await service.resolveAndGet(request.params.id);
await service.delete(project.id);
reply.code(204);
});
// Generate .mcp.json for a project
app.get<{ Params: { id: string } }>('/api/v1/projects/:id/mcp-config', async (request) => {
return service.generateMcpConfig(request.params.id);
});
// Attach a server to a project
app.post<{ Params: { id: string }; Body: { server: string } }>('/api/v1/projects/:id/servers', async (request) => {
const body = request.body as { server?: string };
if (!body.server) {
throw Object.assign(new Error('Missing "server" in request body'), { statusCode: 400 });
}
return service.addServer(request.params.id, body.server);
});
// Detach a server from a project
app.delete<{ Params: { id: string; serverName: string } }>('/api/v1/projects/:id/servers/:serverName', async (request, reply) => {
await service.removeServer(request.params.id, request.params.serverName);
reply.code(204);
});
// List servers in a project (for mcplocal discovery)
app.get<{ Params: { id: string } }>('/api/v1/projects/:id/servers', async (request) => {
const project = await service.resolveAndGet(request.params.id);
return project.servers.map((ps) => ps.server);
});
}
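The routes now funnel `:id` through `resolveAndGet` before mutating, which (going by its usage here, an assumption) accepts either a project ID or a project name. A sketch of that lookup order against an in-memory list:

```typescript
interface Project { id: string; name: string }

const projects: Project[] = [{ id: "cuid-1", name: "billing" }];

// Try ID first, fall back to name — the resolution the routes rely on.
function resolveAndGet(idOrName: string): Project {
  const byId = projects.find((p) => p.id === idOrName);
  if (byId) return byId;
  const byName = projects.find((p) => p.name === idOrName);
  if (byName) return byName;
  throw Object.assign(new Error(`Project "${idOrName}" not found`), { statusCode: 404 });
}

const a = resolveAndGet("cuid-1");
const b = resolveAndGet("billing");
```

Resolving first and then passing `project.id` to `update`/`delete` keeps the service layer working purely with IDs while the HTTP surface stays name-friendly.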

View File

@@ -0,0 +1,30 @@
import type { FastifyInstance } from 'fastify';
import type { RbacDefinitionService } from '../services/rbac-definition.service.js';
export function registerRbacRoutes(
app: FastifyInstance,
service: RbacDefinitionService,
): void {
app.get('/api/v1/rbac', async () => {
return service.list();
});
app.get<{ Params: { id: string } }>('/api/v1/rbac/:id', async (request) => {
return service.getById(request.params.id);
});
app.post('/api/v1/rbac', async (request, reply) => {
const def = await service.create(request.body);
reply.code(201);
return def;
});
app.put<{ Params: { id: string } }>('/api/v1/rbac/:id', async (request) => {
return service.update(request.params.id, request.body);
});
app.delete<{ Params: { id: string } }>('/api/v1/rbac/:id', async (request, reply) => {
await service.delete(request.params.id);
reply.code(204);
});
}

View File

@@ -0,0 +1,31 @@
import type { FastifyInstance } from 'fastify';
import type { UserService } from '../services/user.service.js';
export function registerUserRoutes(
app: FastifyInstance,
service: UserService,
): void {
app.get('/api/v1/users', async () => {
return service.list();
});
app.get<{ Params: { id: string } }>('/api/v1/users/:id', async (request) => {
// Support lookup by email (contains @) or by id
const idOrEmail = request.params.id;
if (idOrEmail.includes('@')) {
return service.getByEmail(idOrEmail);
}
return service.getById(idOrEmail);
});
app.post('/api/v1/users', async (request, reply) => {
const user = await service.create(request.body);
reply.code(201);
return user;
});
app.delete<{ Params: { id: string } }>('/api/v1/users/:id', async (request, reply) => {
await service.delete(request.params.id);
reply.code(204);
});
}


View File

@@ -63,4 +63,32 @@ export class AuthService {
}
return { userId: session.userId, expiresAt: session.expiresAt };
}
/**
* Create a session for a user by email without requiring their password.
* Used for admin impersonation.
*/
async impersonate(email: string): Promise<LoginResult> {
const user = await this.prisma.user.findUnique({ where: { email } });
if (user === null) {
throw new AuthenticationError('User not found');
}
const token = randomUUID();
const expiresAt = new Date(Date.now() + SESSION_TTL_MS);
await this.prisma.session.create({
data: {
token,
userId: user.id,
expiresAt,
},
});
return {
token,
expiresAt,
user: { id: user.id, email: user.email, role: user.role },
};
}
}
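`impersonate` mints the same shape of session as `login`: a random token plus an absolute expiry, skipping only the password check. A self-contained sketch (the 24h `SESSION_TTL_MS` value is an assumption; the constant lives elsewhere in the service):

```typescript
import { randomUUID } from "node:crypto";

const SESSION_TTL_MS = 24 * 60 * 60 * 1000; // assumed TTL, matching ordinary login sessions

// Token + expiry pair, as persisted for both login and impersonation.
function mintSession(userId: string): { token: string; userId: string; expiresAt: Date } {
  return { token: randomUUID(), userId, expiresAt: new Date(Date.now() + SESSION_TTL_MS) };
}

const session = mintSession("u1");
```

Because impersonation sessions are indistinguishable from login sessions once stored, the existing auth middleware and logout path handle them with no extra code.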

View File

@@ -1,5 +1,8 @@
import type { IMcpServerRepository, ISecretRepository } from '../../repositories/interfaces.js';
import type { IProjectRepository } from '../../repositories/project.repository.js';
import type { IUserRepository } from '../../repositories/user.repository.js';
import type { IGroupRepository } from '../../repositories/group.repository.js';
import type { IRbacDefinitionRepository } from '../../repositories/rbac-definition.repository.js';
import { encrypt, isSensitiveKey } from './crypto.js';
import type { EncryptedPayload } from './crypto.js';
import { APP_VERSION } from '@mcpctl/shared';
@@ -12,6 +15,9 @@ export interface BackupBundle {
servers: BackupServer[];
secrets: BackupSecret[];
projects: BackupProject[];
users?: BackupUser[];
groups?: BackupGroup[];
rbacBindings?: BackupRbacBinding[];
encryptedSecrets?: EncryptedPayload;
}
@@ -33,11 +39,34 @@ export interface BackupSecret {
export interface BackupProject {
name: string;
description: string;
proxyMode?: string;
llmProvider?: string | null;
llmModel?: string | null;
serverNames?: string[];
}
export interface BackupUser {
email: string;
name: string | null;
role: string;
provider: string | null;
}
export interface BackupGroup {
name: string;
description: string;
memberEmails: string[];
}
export interface BackupRbacBinding {
name: string;
subjects: unknown;
roleBindings: unknown;
}
export interface BackupOptions {
password?: string;
resources?: Array<'servers' | 'secrets' | 'projects'>;
resources?: Array<'servers' | 'secrets' | 'projects' | 'users' | 'groups' | 'rbac'>;
}
export class BackupService {
@@ -45,14 +74,20 @@ export class BackupService {
private serverRepo: IMcpServerRepository,
private projectRepo: IProjectRepository,
private secretRepo: ISecretRepository,
private userRepo?: IUserRepository,
private groupRepo?: IGroupRepository,
private rbacRepo?: IRbacDefinitionRepository,
) {}
async createBackup(options?: BackupOptions): Promise<BackupBundle> {
const resources = options?.resources ?? ['servers', 'secrets', 'projects'];
const resources = options?.resources ?? ['servers', 'secrets', 'projects', 'users', 'groups', 'rbac'];
let servers: BackupServer[] = [];
let secrets: BackupSecret[] = [];
let projects: BackupProject[] = [];
let users: BackupUser[] = [];
let groups: BackupGroup[] = [];
let rbacBindings: BackupRbacBinding[] = [];
if (resources.includes('servers')) {
const allServers = await this.serverRepo.findAll();
@@ -80,6 +115,38 @@ export class BackupService {
projects = allProjects.map((proj) => ({
name: proj.name,
description: proj.description,
proxyMode: proj.proxyMode,
llmProvider: proj.llmProvider,
llmModel: proj.llmModel,
serverNames: proj.servers.map((ps) => ps.server.name),
}));
}
if (resources.includes('users') && this.userRepo) {
const allUsers = await this.userRepo.findAll();
users = allUsers.map((u) => ({
email: u.email,
name: u.name,
role: u.role,
provider: u.provider,
}));
}
if (resources.includes('groups') && this.groupRepo) {
const allGroups = await this.groupRepo.findAll();
groups = allGroups.map((g) => ({
name: g.name,
description: g.description,
memberEmails: g.members.map((m) => m.user.email),
}));
}
if (resources.includes('rbac') && this.rbacRepo) {
const allRbac = await this.rbacRepo.findAll();
rbacBindings = allRbac.map((r) => ({
name: r.name,
subjects: r.subjects,
roleBindings: r.roleBindings,
}));
}
@@ -91,6 +158,9 @@ export class BackupService {
servers,
secrets,
projects,
users,
groups,
rbacBindings,
};
if (options?.password && secrets.length > 0) {

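With this change, omitting `resources` means "back up everything", including the three new resource types, while an explicit list narrows the bundle. A sketch of that defaulting:

```typescript
type Resource = "servers" | "secrets" | "projects" | "users" | "groups" | "rbac";

const ALL_RESOURCES: Resource[] = ["servers", "secrets", "projects", "users", "groups", "rbac"];

// Mirrors createBackup's `options?.resources ?? [...]` default.
function selectResources(requested?: Resource[]): Resource[] {
  return requested ?? ALL_RESOURCES;
}

const full = selectResources();            // all six resource types
const narrow = selectResources(["users"]); // users only
```

Old clients that always passed the original three-item list keep getting exactly those three resources; only callers relying on the default pick up users, groups, and rbac.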
View File

@@ -1,5 +1,9 @@
import type { IMcpServerRepository, ISecretRepository } from '../../repositories/interfaces.js';
import type { IProjectRepository } from '../../repositories/project.repository.js';
import type { IUserRepository } from '../../repositories/user.repository.js';
import type { IGroupRepository } from '../../repositories/group.repository.js';
import type { IRbacDefinitionRepository } from '../../repositories/rbac-definition.repository.js';
import type { RbacRoleBinding } from '../../validation/rbac-definition.schema.js';
import { decrypt } from './crypto.js';
import type { BackupBundle } from './backup-service.js';
@@ -17,6 +21,12 @@ export interface RestoreResult {
secretsSkipped: number;
projectsCreated: number;
projectsSkipped: number;
usersCreated: number;
usersSkipped: number;
groupsCreated: number;
groupsSkipped: number;
rbacCreated: number;
rbacSkipped: number;
errors: string[];
}
@@ -25,6 +35,9 @@ export class RestoreService {
private serverRepo: IMcpServerRepository,
private projectRepo: IProjectRepository,
private secretRepo: ISecretRepository,
private userRepo?: IUserRepository,
private groupRepo?: IGroupRepository,
private rbacRepo?: IRbacDefinitionRepository,
) {}
validateBundle(bundle: unknown): bundle is BackupBundle {
@@ -36,6 +49,7 @@ export class RestoreService {
Array.isArray(b['secrets']) &&
Array.isArray(b['projects'])
);
// users, groups, rbacBindings are optional for backwards compatibility
}
async restore(bundle: BackupBundle, options?: RestoreOptions): Promise<RestoreResult> {
@@ -47,6 +61,12 @@ export class RestoreService {
secretsSkipped: 0,
projectsCreated: 0,
projectsSkipped: 0,
usersCreated: 0,
usersSkipped: 0,
groupsCreated: 0,
groupsSkipped: 0,
rbacCreated: 0,
rbacSkipped: 0,
errors: [],
};
@@ -78,6 +98,37 @@ export class RestoreService {
}
}
// Restore order: secrets → servers → users → groups → projects → rbacBindings
// Restore secrets
for (const secret of bundle.secrets) {
try {
const existing = await this.secretRepo.findByName(secret.name);
if (existing) {
if (strategy === 'fail') {
result.errors.push(`Secret "${secret.name}" already exists`);
return result;
}
if (strategy === 'skip') {
result.secretsSkipped++;
continue;
}
// overwrite
await this.secretRepo.update(existing.id, { data: secret.data });
result.secretsCreated++;
continue;
}
await this.secretRepo.create({
name: secret.name,
data: secret.data,
});
result.secretsCreated++;
} catch (err) {
result.errors.push(`Failed to restore secret "${secret.name}": ${err instanceof Error ? err.message : String(err)}`);
}
}
// Restore servers
for (const server of bundle.servers) {
try {
@@ -121,36 +172,75 @@ export class RestoreService {
}
}
// Restore secrets
for (const secret of bundle.secrets) {
// Restore users
if (bundle.users && this.userRepo) {
for (const user of bundle.users) {
try {
const existing = await this.secretRepo.findByName(secret.name);
const existing = await this.userRepo.findByEmail(user.email);
if (existing) {
if (strategy === 'fail') {
result.errors.push(`Secret "${secret.name}" already exists`);
result.errors.push(`User "${user.email}" already exists`);
return result;
}
result.usersSkipped++;
continue;
}
// Create with placeholder passwordHash — user must reset password
const createData: { email: string; passwordHash: string; name?: string; role?: string } = {
email: user.email,
passwordHash: '__RESTORED_MUST_RESET__',
role: user.role,
};
if (user.name !== null) createData.name = user.name;
await this.userRepo.create(createData);
result.usersCreated++;
} catch (err) {
result.errors.push(`Failed to restore user "${user.email}": ${err instanceof Error ? err.message : String(err)}`);
}
}
}
// Restore groups
if (bundle.groups && this.groupRepo && this.userRepo) {
for (const group of bundle.groups) {
try {
const existing = await this.groupRepo.findByName(group.name);
if (existing) {
if (strategy === 'fail') {
result.errors.push(`Group "${group.name}" already exists`);
return result;
}
if (strategy === 'skip') {
result.secretsSkipped++;
result.groupsSkipped++;
continue;
}
// overwrite
await this.secretRepo.update(existing.id, { data: secret.data });
result.secretsCreated++;
// overwrite: update description and re-set members
await this.groupRepo.update(existing.id, { description: group.description });
if (group.memberEmails.length > 0) {
const memberIds = await this.resolveUserEmails(group.memberEmails);
await this.groupRepo.setMembers(existing.id, memberIds);
}
result.groupsCreated++;
continue;
}
await this.secretRepo.create({
name: secret.name,
data: secret.data,
const created = await this.groupRepo.create({
name: group.name,
description: group.description,
});
result.secretsCreated++;
if (group.memberEmails.length > 0) {
const memberIds = await this.resolveUserEmails(group.memberEmails);
await this.groupRepo.setMembers(created.id, memberIds);
}
result.groupsCreated++;
} catch (err) {
result.errors.push(`Failed to restore secret "${secret.name}": ${err instanceof Error ? err.message : String(err)}`);
result.errors.push(`Failed to restore group "${group.name}": ${err instanceof Error ? err.message : String(err)}`);
}
}
}
// Restore projects
// Restore projects (enriched)
for (const project of bundle.projects) {
try {
const existing = await this.projectRepo.findByName(project.name);
@@ -164,22 +254,100 @@ export class RestoreService {
continue;
}
// overwrite
await this.projectRepo.update(existing.id, { description: project.description });
const updateData: Record<string, unknown> = { description: project.description };
if (project.proxyMode) updateData['proxyMode'] = project.proxyMode;
if (project.llmProvider !== undefined) updateData['llmProvider'] = project.llmProvider;
if (project.llmModel !== undefined) updateData['llmModel'] = project.llmModel;
await this.projectRepo.update(existing.id, updateData);
// Re-link servers
if (project.serverNames && project.serverNames.length > 0) {
const serverIds = await this.resolveServerNames(project.serverNames);
await this.projectRepo.setServers(existing.id, serverIds);
}
result.projectsCreated++;
continue;
}
await this.projectRepo.create({
const projectCreateData: { name: string; description: string; ownerId: string; proxyMode: string; llmProvider?: string; llmModel?: string } = {
name: project.name,
description: project.description,
ownerId: 'system',
});
proxyMode: project.proxyMode ?? 'direct',
};
if (project.llmProvider != null) projectCreateData.llmProvider = project.llmProvider;
if (project.llmModel != null) projectCreateData.llmModel = project.llmModel;
const created = await this.projectRepo.create(projectCreateData);
// Link servers
if (project.serverNames && project.serverNames.length > 0) {
const serverIds = await this.resolveServerNames(project.serverNames);
await this.projectRepo.setServers(created.id, serverIds);
}
result.projectsCreated++;
} catch (err) {
result.errors.push(`Failed to restore project "${project.name}": ${err instanceof Error ? err.message : String(err)}`);
}
}
// Restore RBAC bindings
if (bundle.rbacBindings && this.rbacRepo) {
for (const rbac of bundle.rbacBindings) {
try {
const existing = await this.rbacRepo.findByName(rbac.name);
if (existing) {
if (strategy === 'fail') {
result.errors.push(`RBAC binding "${rbac.name}" already exists`);
return result;
}
if (strategy === 'skip') {
result.rbacSkipped++;
continue;
}
// overwrite
await this.rbacRepo.update(existing.id, {
subjects: rbac.subjects as Array<{ kind: 'User' | 'Group'; name: string }>,
roleBindings: rbac.roleBindings as RbacRoleBinding[],
});
result.rbacCreated++;
continue;
}
await this.rbacRepo.create({
name: rbac.name,
subjects: rbac.subjects as Array<{ kind: 'User' | 'Group'; name: string }>,
roleBindings: rbac.roleBindings as RbacRoleBinding[],
});
result.rbacCreated++;
} catch (err) {
result.errors.push(`Failed to restore RBAC binding "${rbac.name}": ${err instanceof Error ? err.message : String(err)}`);
}
}
}
return result;
}
/** Resolve email addresses to user IDs via the user repository. */
private async resolveUserEmails(emails: string[]): Promise<string[]> {
const ids: string[] = [];
for (const email of emails) {
const user = await this.userRepo!.findByEmail(email);
if (user) ids.push(user.id);
}
return ids;
}
/** Resolve server names to server IDs via the server repository. */
private async resolveServerNames(names: string[]): Promise<string[]> {
const ids: string[] = [];
for (const name of names) {
const server = await this.serverRepo.findByName(name);
if (server) ids.push(server.id);
}
return ids;
}
}
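Most resource types in the restore loop apply the same three-way conflict policy on a name collision (users are the exception: an existing user is always skipped rather than overwritten). That shared decision can be sketched as a pure function:

```typescript
type Strategy = "fail" | "skip" | "overwrite";
type Action = "create" | "skip" | "overwrite" | "abort";

// The decision point repeated for secrets, servers, groups, projects, and rbac.
function decide(exists: boolean, strategy: Strategy): Action {
  if (!exists) return "create";
  if (strategy === "fail") return "abort"; // push an error and return early
  if (strategy === "skip") return "skip";  // bump the *Skipped counter
  return "overwrite";                      // update in place, bump *Created
}
```

Keeping the branch order identical across resource types is what lets each loop report per-resource created/skipped counts with the same meaning.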

View File

@@ -1,11 +1,13 @@
import Docker from 'dockerode';
import { PassThrough } from 'node:stream';
import type {
McpOrchestrator,
ContainerSpec,
ContainerInfo,
ContainerLogs,
ExecResult,
} from '../orchestrator.js';
import { DEFAULT_MEMORY_LIMIT, DEFAULT_NANO_CPUS } from '../orchestrator.js';
import { DEFAULT_MEMORY_LIMIT } from '../orchestrator.js';
const MCPCTL_LABEL = 'mcpctl.managed';
@@ -54,7 +56,7 @@ export class DockerContainerManager implements McpOrchestrator {
async createContainer(spec: ContainerSpec): Promise<ContainerInfo> {
const memoryLimit = spec.memoryLimit ?? DEFAULT_MEMORY_LIMIT;
const nanoCpus = spec.nanoCpus ?? DEFAULT_NANO_CPUS;
const nanoCpus = spec.nanoCpus;
const portBindings: Record<string, Array<{ HostPort: string }>> = {};
const exposedPorts: Record<string, Record<string, never>> = {};
@@ -80,10 +82,13 @@ export class DockerContainerManager implements McpOrchestrator {
Env: envArr,
ExposedPorts: exposedPorts,
Labels: labels,
// Keep stdin open for STDIO MCP servers (they read from stdin)
OpenStdin: true,
StdinOnce: false,
HostConfig: {
PortBindings: portBindings,
Memory: memoryLimit,
NanoCpus: nanoCpus,
...(nanoCpus ? { NanoCpus: nanoCpus } : {}),
NetworkMode: spec.network ?? 'bridge',
},
};
@@ -133,6 +138,19 @@ export class DockerContainerManager implements McpOrchestrator {
if (port !== undefined) {
result.port = port;
}
// Extract container IP from first non-default network
const networks = info.NetworkSettings?.Networks;
if (networks) {
for (const [, net] of Object.entries(networks)) {
const netInfo = net as { IPAddress?: string };
if (netInfo.IPAddress) {
result.ip = netInfo.IPAddress;
break;
}
}
}
return result;
}
@@ -158,4 +176,67 @@ export class DockerContainerManager implements McpOrchestrator {
// For simplicity we return everything as stdout.
return { stdout: raw, stderr: '' };
}
async execInContainer(
containerId: string,
cmd: string[],
opts?: { stdin?: string; timeoutMs?: number },
): Promise<ExecResult> {
const container = this.docker.getContainer(containerId);
const hasStdin = opts?.stdin !== undefined;
const exec = await container.exec({
Cmd: cmd,
AttachStdin: hasStdin,
AttachStdout: true,
AttachStderr: true,
});
const stream = await exec.start({ hijack: hasStdin, stdin: hasStdin });
const timeoutMs = opts?.timeoutMs ?? 30_000;
return new Promise<ExecResult>((resolve, reject) => {
const stdout = new PassThrough();
const stderr = new PassThrough();
const stdoutChunks: Buffer[] = [];
const stderrChunks: Buffer[] = [];
stdout.on('data', (chunk: Buffer) => stdoutChunks.push(chunk));
stderr.on('data', (chunk: Buffer) => stderrChunks.push(chunk));
this.docker.modem.demuxStream(stream, stdout, stderr);
if (hasStdin) {
stream.write(opts!.stdin);
stream.end();
}
const timer = setTimeout(() => {
stream.destroy();
reject(new Error(`Exec timed out after ${timeoutMs}ms`));
}, timeoutMs);
stream.on('end', () => {
clearTimeout(timer);
exec.inspect().then((info) => {
resolve({
exitCode: (info as { ExitCode: number }).ExitCode,
stdout: Buffer.concat(stdoutChunks).toString('utf-8'),
stderr: Buffer.concat(stderrChunks).toString('utf-8'),
});
}).catch((err) => {
resolve({
exitCode: -1,
stdout: Buffer.concat(stdoutChunks).toString('utf-8'),
stderr: err instanceof Error ? err.message : String(err),
});
});
});
stream.on('error', (err: Error) => {
clearTimeout(timer);
reject(err);
});
});
}
}


@@ -0,0 +1,89 @@
import type { GroupWithMembers, IGroupRepository } from '../repositories/group.repository.js';
import type { IUserRepository } from '../repositories/user.repository.js';
import { CreateGroupSchema, UpdateGroupSchema } from '../validation/group.schema.js';
import { NotFoundError, ConflictError } from './mcp-server.service.js';
export class GroupService {
constructor(
private readonly groupRepo: IGroupRepository,
private readonly userRepo: IUserRepository,
) {}
async list(): Promise<GroupWithMembers[]> {
return this.groupRepo.findAll();
}
async getById(id: string): Promise<GroupWithMembers> {
const group = await this.groupRepo.findById(id);
if (group === null) {
throw new NotFoundError(`Group not found: ${id}`);
}
return group;
}
async getByName(name: string): Promise<GroupWithMembers> {
const group = await this.groupRepo.findByName(name);
if (group === null) {
throw new NotFoundError(`Group not found: ${name}`);
}
return group;
}
async create(input: unknown): Promise<GroupWithMembers> {
const data = CreateGroupSchema.parse(input);
const existing = await this.groupRepo.findByName(data.name);
if (existing !== null) {
throw new ConflictError(`Group already exists: ${data.name}`);
}
const group = await this.groupRepo.create({
name: data.name,
description: data.description,
});
if (data.members.length > 0) {
const userIds = await this.resolveEmails(data.members);
await this.groupRepo.setMembers(group.id, userIds);
}
const result = await this.groupRepo.findById(group.id);
// Should always exist since we just created it
return result!;
}
async update(id: string, input: unknown): Promise<GroupWithMembers> {
const data = UpdateGroupSchema.parse(input);
// Verify exists
await this.getById(id);
if (data.description !== undefined) {
await this.groupRepo.update(id, { description: data.description });
}
if (data.members !== undefined) {
const userIds = await this.resolveEmails(data.members);
await this.groupRepo.setMembers(id, userIds);
}
return this.getById(id);
}
async delete(id: string): Promise<void> {
await this.getById(id);
await this.groupRepo.delete(id);
}
private async resolveEmails(emails: string[]): Promise<string[]> {
const userIds: string[] = [];
for (const email of emails) {
const user = await this.userRepo.findByEmail(email);
if (user === null) {
throw new NotFoundError(`User not found: ${email}`);
}
userIds.push(user.id);
}
return userIds;
}
}


@@ -0,0 +1,520 @@
import type { McpServer, McpInstance } from '@prisma/client';
import type { IMcpInstanceRepository, IMcpServerRepository } from '../repositories/interfaces.js';
import type { McpOrchestrator } from './orchestrator.js';
export interface HealthCheckSpec {
tool: string;
arguments?: Record<string, unknown>;
intervalSeconds?: number;
timeoutSeconds?: number;
failureThreshold?: number;
}
export interface ProbeResult {
healthy: boolean;
latencyMs: number;
message: string;
}
interface ProbeState {
consecutiveFailures: number;
lastProbeAt: number;
}
/**
* Periodic health probe runner — calls MCP tools on running instances to verify
* they are alive and responsive. Mirrors Kubernetes liveness probe semantics.
*
* For STDIO servers: runs `docker exec` with a disposable MCP client script
* that sends initialize + tool/call via the package binary.
*
* For SSE/HTTP servers: sends HTTP JSON-RPC directly to the container port.
*/
export class HealthProbeRunner {
private probeStates = new Map<string, ProbeState>();
private timer: ReturnType<typeof setInterval> | null = null;
constructor(
private instanceRepo: IMcpInstanceRepository,
private serverRepo: IMcpServerRepository,
private orchestrator: McpOrchestrator,
private logger?: { info: (msg: string) => void; error: (obj: unknown, msg: string) => void },
) {}
/** Start the periodic probe loop. Runs every `tickIntervalMs` (default 15s). */
start(tickIntervalMs = 15_000): void {
if (this.timer) return;
this.timer = setInterval(() => {
this.tick().catch((err) => {
this.logger?.error({ err }, 'Health probe tick failed');
});
}, tickIntervalMs);
}
stop(): void {
if (this.timer) {
clearInterval(this.timer);
this.timer = null;
}
}
/** Single tick: probe all RUNNING instances that have healthCheck configs and are due. */
async tick(): Promise<void> {
const instances = await this.instanceRepo.findAll();
const running = instances.filter((i) => i.status === 'RUNNING' && i.containerId);
// Cache servers by ID to avoid repeated lookups
const serverCache = new Map<string, McpServer>();
for (const inst of running) {
let server = serverCache.get(inst.serverId);
if (!server) {
const s = await this.serverRepo.findById(inst.serverId);
if (!s) continue;
serverCache.set(inst.serverId, s);
server = s;
}
const healthCheck = server.healthCheck as HealthCheckSpec | null;
if (!healthCheck) continue;
const intervalMs = (healthCheck.intervalSeconds ?? 60) * 1000;
const state = this.probeStates.get(inst.id);
const now = Date.now();
// Skip if not due yet
if (state && (now - state.lastProbeAt) < intervalMs) continue;
await this.probeInstance(inst, server, healthCheck);
}
// Clean up states for instances that no longer exist
const activeIds = new Set(running.map((i) => i.id));
for (const key of this.probeStates.keys()) {
if (!activeIds.has(key)) {
this.probeStates.delete(key);
}
}
}
/** Probe a single instance and update its health status. */
async probeInstance(
instance: McpInstance,
server: McpServer,
healthCheck: HealthCheckSpec,
): Promise<ProbeResult> {
const timeoutMs = (healthCheck.timeoutSeconds ?? 10) * 1000;
const failureThreshold = healthCheck.failureThreshold ?? 3;
const now = new Date();
const start = Date.now();
let result: ProbeResult;
try {
if (server.transport === 'SSE' || server.transport === 'STREAMABLE_HTTP') {
result = await this.probeHttp(instance, server, healthCheck, timeoutMs);
} else {
result = await this.probeStdio(instance, server, healthCheck, timeoutMs);
}
} catch (err) {
result = {
healthy: false,
latencyMs: Date.now() - start,
message: err instanceof Error ? err.message : String(err),
};
}
// Update probe state
const state = this.probeStates.get(instance.id) ?? { consecutiveFailures: 0, lastProbeAt: 0 };
state.lastProbeAt = Date.now();
if (result.healthy) {
state.consecutiveFailures = 0;
} else {
state.consecutiveFailures++;
}
this.probeStates.set(instance.id, state);
// Determine health status
const healthStatus = result.healthy
? 'healthy'
: state.consecutiveFailures >= failureThreshold
? 'unhealthy'
: 'degraded';
// Build event
const eventType = result.healthy ? 'Normal' : 'Warning';
const eventMessage = result.healthy
? `Health check passed (${result.latencyMs}ms)`
: `Health check failed: ${result.message}`;
const existingEvents = (instance.events as Array<{ timestamp: string; type: string; message: string }>) ?? [];
// Keep last 50 events
const events = [
...existingEvents.slice(-49),
{ timestamp: now.toISOString(), type: eventType, message: eventMessage },
];
// Update instance
await this.instanceRepo.updateStatus(instance.id, instance.status as 'RUNNING', {
healthStatus,
lastHealthCheck: now,
events,
});
this.logger?.info(
`[health] ${(instance as unknown as { server?: { name: string } }).server?.name ?? instance.serverId}: ${healthStatus} (${result.latencyMs}ms) - ${eventMessage}`,
);
return result;
}
/** Probe an HTTP/SSE MCP server by sending a JSON-RPC tool call. */
private async probeHttp(
instance: McpInstance,
server: McpServer,
healthCheck: HealthCheckSpec,
timeoutMs: number,
): Promise<ProbeResult> {
if (!instance.containerId) {
return { healthy: false, latencyMs: 0, message: 'No container ID' };
}
// Get container IP for internal network communication
// (mcpd and MCP containers share the mcp-servers network)
const containerInfo = await this.orchestrator.inspectContainer(instance.containerId);
const containerPort = (server.containerPort as number | null) ?? 3000;
let baseUrl: string;
if (containerInfo.ip) {
baseUrl = `http://${containerInfo.ip}:${containerPort}`;
} else if (instance.port) {
baseUrl = `http://localhost:${instance.port}`;
} else {
return { healthy: false, latencyMs: 0, message: 'No container IP or port' };
}
if (server.transport === 'SSE') {
return this.probeSse(baseUrl, healthCheck, timeoutMs);
}
return this.probeStreamableHttp(baseUrl, healthCheck, timeoutMs);
}
/**
* Probe a streamable-http MCP server (POST to root endpoint).
*/
private async probeStreamableHttp(
baseUrl: string,
healthCheck: HealthCheckSpec,
timeoutMs: number,
): Promise<ProbeResult> {
const start = Date.now();
const controller = new AbortController();
const timer = setTimeout(() => controller.abort(), timeoutMs);
try {
const initResp = await fetch(baseUrl, {
method: 'POST',
headers: { 'Content-Type': 'application/json', 'Accept': 'application/json, text/event-stream' },
body: JSON.stringify({
jsonrpc: '2.0', id: 1, method: 'initialize',
params: { protocolVersion: '2024-11-05', capabilities: {}, clientInfo: { name: 'mcpctl-health', version: '0.1.0' } },
}),
signal: controller.signal,
});
if (!initResp.ok) {
return { healthy: false, latencyMs: Date.now() - start, message: `Initialize HTTP ${initResp.status}` };
}
const sessionId = initResp.headers.get('mcp-session-id');
const headers: Record<string, string> = { 'Content-Type': 'application/json', 'Accept': 'application/json, text/event-stream' };
if (sessionId) headers['Mcp-Session-Id'] = sessionId;
await fetch(baseUrl, {
method: 'POST', headers,
body: JSON.stringify({ jsonrpc: '2.0', method: 'notifications/initialized' }),
signal: controller.signal,
});
const toolResp = await fetch(baseUrl, {
method: 'POST', headers,
body: JSON.stringify({
jsonrpc: '2.0', id: 2, method: 'tools/call',
params: { name: healthCheck.tool, arguments: healthCheck.arguments ?? {} },
}),
signal: controller.signal,
});
const latencyMs = Date.now() - start;
if (!toolResp.ok) {
return { healthy: false, latencyMs, message: `Tool call HTTP ${toolResp.status}` };
}
const body = await toolResp.text();
try {
const parsed = JSON.parse(body.includes('data: ') ? body.split('data: ')[1]!.split('\n')[0]! : body);
if (parsed.error) {
return { healthy: false, latencyMs, message: parsed.error.message ?? 'Tool call error' };
}
} catch {
// If parsing fails but HTTP was ok, consider it healthy
}
return { healthy: true, latencyMs, message: 'ok' };
} finally {
clearTimeout(timer);
}
}
/**
* Probe an SSE-transport MCP server.
* SSE protocol: GET /sse → endpoint event → POST /messages?session_id=...
*/
private async probeSse(
baseUrl: string,
healthCheck: HealthCheckSpec,
timeoutMs: number,
): Promise<ProbeResult> {
const start = Date.now();
const controller = new AbortController();
const timer = setTimeout(() => controller.abort(), timeoutMs);
try {
// 1. Connect to SSE endpoint to get the message URL
const sseResp = await fetch(`${baseUrl}/sse`, {
method: 'GET',
headers: { 'Accept': 'text/event-stream' },
signal: controller.signal,
});
if (!sseResp.ok) {
return { healthy: false, latencyMs: Date.now() - start, message: `SSE connect HTTP ${sseResp.status}` };
}
// 2. Read the SSE stream to find the endpoint event
const reader = sseResp.body?.getReader();
if (!reader) {
return { healthy: false, latencyMs: Date.now() - start, message: 'No SSE stream body' };
}
const decoder = new TextDecoder();
let buffer = '';
let messagesUrl = '';
// Read until we get the endpoint event
while (!messagesUrl) {
const { done, value } = await reader.read();
if (done) break;
buffer += decoder.decode(value, { stream: true });
for (const line of buffer.split('\n')) {
if (line.startsWith('data: ') && buffer.includes('event: endpoint')) {
const endpoint = line.slice(6).trim();
// Endpoint may be relative (e.g., /messages?session_id=...) or absolute
messagesUrl = endpoint.startsWith('http') ? endpoint : `${baseUrl}${endpoint}`;
}
}
// Keep only the last incomplete line
const lines = buffer.split('\n');
buffer = lines[lines.length - 1] ?? '';
}
if (!messagesUrl) {
reader.cancel();
return { healthy: false, latencyMs: Date.now() - start, message: 'No endpoint event from SSE' };
}
// 3. Initialize via the messages endpoint
const postHeaders = { 'Content-Type': 'application/json' };
const initResp = await fetch(messagesUrl, {
method: 'POST', headers: postHeaders,
body: JSON.stringify({
jsonrpc: '2.0', id: 1, method: 'initialize',
params: { protocolVersion: '2024-11-05', capabilities: {}, clientInfo: { name: 'mcpctl-health', version: '0.1.0' } },
}),
signal: controller.signal,
});
if (!initResp.ok) {
reader.cancel();
return { healthy: false, latencyMs: Date.now() - start, message: `Initialize HTTP ${initResp.status}` };
}
// 4. Send initialized notification
await fetch(messagesUrl, {
method: 'POST', headers: postHeaders,
body: JSON.stringify({ jsonrpc: '2.0', method: 'notifications/initialized' }),
signal: controller.signal,
});
// 5. Call health check tool
const toolResp = await fetch(messagesUrl, {
method: 'POST', headers: postHeaders,
body: JSON.stringify({
jsonrpc: '2.0', id: 2, method: 'tools/call',
params: { name: healthCheck.tool, arguments: healthCheck.arguments ?? {} },
}),
signal: controller.signal,
});
const latencyMs = Date.now() - start;
// 6. Read tool response from SSE stream
// The response comes back on the SSE stream, not the POST response
let responseBuffer = '';
const readTimeout = setTimeout(() => reader.cancel(), 5000);
while (true) {
const { done, value } = await reader.read();
if (done) break;
responseBuffer += decoder.decode(value, { stream: true });
// Look for data lines containing our response (id: 2)
for (const line of responseBuffer.split('\n')) {
if (line.startsWith('data: ')) {
try {
const parsed = JSON.parse(line.slice(6));
if (parsed.id === 2) {
clearTimeout(readTimeout);
reader.cancel();
if (parsed.error) {
return { healthy: false, latencyMs, message: parsed.error.message ?? 'Tool call error' };
}
return { healthy: true, latencyMs, message: 'ok' };
}
} catch {
// Not valid JSON, skip
}
}
}
const respLines = responseBuffer.split('\n');
responseBuffer = respLines[respLines.length - 1] ?? '';
}
clearTimeout(readTimeout);
reader.cancel();
// If POST response itself was ok (202 for SSE), consider it healthy
if (toolResp.ok) {
return { healthy: true, latencyMs, message: 'ok' };
}
return { healthy: false, latencyMs, message: `Tool call HTTP ${toolResp.status}` };
} finally {
clearTimeout(timer);
}
}
/**
* Probe a STDIO MCP server by running `docker exec` with a disposable Node.js
* script that pipes JSON-RPC messages into the package binary.
*/
private async probeStdio(
instance: McpInstance,
server: McpServer,
healthCheck: HealthCheckSpec,
timeoutMs: number,
): Promise<ProbeResult> {
if (!instance.containerId) {
return { healthy: false, latencyMs: 0, message: 'No container ID' };
}
const start = Date.now();
const packageName = server.packageName as string | null;
if (!packageName) {
return { healthy: false, latencyMs: 0, message: 'No package name for STDIO server' };
}
// Build JSON-RPC messages for the health probe
const initMsg = JSON.stringify({
jsonrpc: '2.0', id: 1, method: 'initialize',
params: {
protocolVersion: '2024-11-05',
capabilities: {},
clientInfo: { name: 'mcpctl-health', version: '0.1.0' },
},
});
const initializedMsg = JSON.stringify({
jsonrpc: '2.0', method: 'notifications/initialized',
});
const toolCallMsg = JSON.stringify({
jsonrpc: '2.0', id: 2, method: 'tools/call',
params: { name: healthCheck.tool, arguments: healthCheck.arguments ?? {} },
});
// Use a Node.js inline script that:
// 1. Spawns the MCP server binary via npx
// 2. Sends initialize + initialized + tool call via stdin
// 3. Reads responses from stdout
// 4. Exits with 0 if tool call succeeds, 1 if it fails
const probeScript = `
const { spawn } = require('child_process');
const proc = spawn('npx', ['--prefer-offline', '-y', ${JSON.stringify(packageName)}], { stdio: ['pipe', 'pipe', 'pipe'] });
let output = '';
let responded = false;
proc.stdout.on('data', d => {
output += d;
const lines = output.split('\\n');
for (const line of lines) {
if (!line.trim()) continue;
try {
const msg = JSON.parse(line);
if (msg.id === 2) {
responded = true;
if (msg.error) {
process.stdout.write('ERROR:' + (msg.error.message || 'unknown'));
proc.kill();
process.exit(1);
} else {
process.stdout.write('OK');
proc.kill();
process.exit(0);
}
}
} catch {}
}
output = lines[lines.length - 1] || '';
});
proc.stderr.on('data', () => {});
proc.on('error', e => { process.stdout.write('ERROR:' + e.message); process.exit(1); });
proc.on('exit', (code) => { if (!responded) { process.stdout.write('ERROR:process exited ' + code); process.exit(1); } });
setTimeout(() => { if (!responded) { process.stdout.write('ERROR:timeout'); proc.kill(); process.exit(1); } }, ${timeoutMs - 2000});
proc.stdin.write(${JSON.stringify(initMsg)} + '\\n');
setTimeout(() => {
proc.stdin.write(${JSON.stringify(initializedMsg)} + '\\n');
setTimeout(() => {
proc.stdin.write(${JSON.stringify(toolCallMsg)} + '\\n');
}, 500);
}, 500);
`.trim();
try {
const result = await this.orchestrator.execInContainer(
instance.containerId,
['node', '-e', probeScript],
{ timeoutMs },
);
const latencyMs = Date.now() - start;
if (result.exitCode === 0 && result.stdout.includes('OK')) {
return { healthy: true, latencyMs, message: 'ok' };
}
// Extract error message
const errorMatch = result.stdout.match(/ERROR:(.*)/);
const errorMsg = errorMatch?.[1] ?? (result.stderr.trim() || `exit code ${result.exitCode}`);
return { healthy: false, latencyMs, message: errorMsg };
} catch (err) {
return {
healthy: false,
latencyMs: Date.now() - start,
message: err instanceof Error ? err.message : String(err),
};
}
}
}


@@ -5,7 +5,7 @@ export { ProjectService } from './project.service.js';
export { InstanceService, InvalidStateError } from './instance.service.js';
export { generateMcpConfig } from './mcp-config-generator.js';
export type { McpConfig, McpConfigServer } from './mcp-config-generator.js';
export type { McpOrchestrator, ContainerSpec, ContainerInfo, ContainerLogs } from './orchestrator.js';
export type { McpOrchestrator, ContainerSpec, ContainerInfo, ContainerLogs, ExecResult } from './orchestrator.js';
export { DEFAULT_MEMORY_LIMIT, DEFAULT_NANO_CPUS } from './orchestrator.js';
export { DockerContainerManager } from './docker/container-manager.js';
export { AuditLogService } from './audit-log.service.js';
@@ -25,3 +25,10 @@ export type { LoginResult } from './auth.service.js';
export { McpProxyService } from './mcp-proxy-service.js';
export type { McpProxyRequest, McpProxyResponse } from './mcp-proxy-service.js';
export { TemplateService } from './template.service.js';
export { HealthProbeRunner } from './health-probe.service.js';
export type { HealthCheckSpec, ProbeResult } from './health-probe.service.js';
export { RbacDefinitionService } from './rbac-definition.service.js';
export { RbacService } from './rbac.service.js';
export type { RbacAction, Permission, AllowedScope } from './rbac.service.js';
export { UserService } from './user.service.js';
export { GroupService } from './group.service.js';


@@ -4,6 +4,12 @@ import type { McpOrchestrator, ContainerSpec, ContainerInfo } from './orchestrat
import { NotFoundError } from './mcp-server.service.js';
import { resolveServerEnv } from './env-resolver.js';
/** Default image for npm-based MCP servers (STDIO with packageName, no dockerImage). */
const DEFAULT_NODE_RUNNER_IMAGE = process.env['MCPD_NODE_RUNNER_IMAGE'] ?? 'mysources.co.uk/michal/mcpctl-node-runner:latest';
/** Network for MCP server containers (matches docker-compose mcp-servers network). */
const MCP_SERVERS_NETWORK = process.env['MCPD_MCP_NETWORK'] ?? 'mcp-servers';
export class InvalidStateError extends Error {
readonly statusCode = 409;
constructor(message: string) {
@@ -30,8 +36,41 @@ export class InstanceService {
return instance;
}
/**
* Sync instance statuses with actual container state.
* Detects crashed/stopped containers and marks them ERROR.
*/
async syncStatus(): Promise<void> {
const instances = await this.instanceRepo.findAll();
for (const inst of instances) {
if ((inst.status === 'RUNNING' || inst.status === 'STARTING') && inst.containerId) {
try {
const info = await this.orchestrator.inspectContainer(inst.containerId);
if (info.state === 'stopped' || info.state === 'error') {
// Container died — get last logs for error context
let errorMsg = `Container ${info.state}`;
try {
const logs = await this.orchestrator.getContainerLogs(inst.containerId, { tail: 5 });
const lastLog = (logs.stdout || logs.stderr).trim().split('\n').pop();
if (lastLog) errorMsg = lastLog;
} catch { /* best-effort */ }
await this.instanceRepo.updateStatus(inst.id, 'ERROR', {
metadata: { error: errorMsg },
});
}
} catch {
// Container gone entirely
await this.instanceRepo.updateStatus(inst.id, 'ERROR', {
metadata: { error: 'Container not found' },
});
}
}
}
}
/**
* Reconcile instances for a server to match desired replica count.
* - Syncs container statuses first (detect crashed containers)
* - If fewer running instances than replicas: start new ones
* - If more running instances than replicas: remove excess (oldest first)
*/
@@ -39,6 +78,9 @@ export class InstanceService {
const server = await this.serverRepo.findById(serverId);
if (!server) throw new NotFoundError(`McpServer '${serverId}' not found`);
// Sync container statuses before counting active instances
await this.syncStatus();
const instances = await this.instanceRepo.findAll(serverId);
const active = instances.filter((i) => i.status === 'RUNNING' || i.status === 'STARTING');
const desired = server.replicas;
@@ -139,7 +181,23 @@ export class InstanceService {
});
}
const image = server.dockerImage ?? server.packageName ?? server.name;
// Determine image + command based on server config:
// 1. Explicit dockerImage → use as-is
// 2. packageName (npm) → use node-runner image + npx command
// 3. Fallback → server name (legacy)
let image: string;
let npmCommand: string[] | undefined;
if (server.dockerImage) {
image = server.dockerImage;
} else if (server.packageName) {
image = DEFAULT_NODE_RUNNER_IMAGE;
// Build npx command: entrypoint is ["npx", "-y"], so CMD = [packageName, ...args]
const serverCommand = server.command as string[] | null;
npmCommand = [server.packageName, ...(serverCommand ?? [])];
} else {
image = server.name;
}
let instance = await this.instanceRepo.create({
serverId,
@@ -151,6 +209,7 @@ export class InstanceService {
image,
name: `mcpctl-${server.name}-${instance.id}`,
hostPort: null,
network: MCP_SERVERS_NETWORK,
labels: {
'mcpctl.server-id': serverId,
'mcpctl.instance-id': instance.id,
@@ -159,10 +218,16 @@ export class InstanceService {
if (server.transport === 'SSE' || server.transport === 'STREAMABLE_HTTP') {
spec.containerPort = server.containerPort ?? 3000;
}
// npm-based servers: command = [packageName, ...args] (entrypoint handles npx -y)
// Docker-image servers: use explicit command if provided
if (npmCommand) {
spec.command = npmCommand;
} else {
const command = server.command as string[] | null;
if (command) {
spec.command = command;
}
}
// Resolve env vars from inline values and secret refs
if (this.secretRepo) {
@@ -177,6 +242,13 @@ export class InstanceService {
}
}
// Pull image if not available locally
try {
await this.orchestrator.pullImage(image);
} catch {
// Image may already be available locally
}
const containerInfo = await this.orchestrator.createContainer(spec);
const updateFields: { containerId: string; port?: number } = {


@@ -3,6 +3,7 @@ import type {
ContainerSpec,
ContainerInfo,
ContainerLogs,
ExecResult,
} from '../orchestrator.js';
import { K8sClient } from './k8s-client.js';
import type { K8sClientConfig } from './k8s-client.js';
@@ -164,6 +165,15 @@ export class KubernetesOrchestrator implements McpOrchestrator {
return { stdout, stderr: '' };
}
async execInContainer(
_containerId: string,
_cmd: string[],
_opts?: { stdin?: string; timeoutMs?: number },
): Promise<ExecResult> {
// K8s exec via API — future implementation
throw new Error('execInContainer not yet implemented for Kubernetes');
}
async listContainers(namespace?: string): Promise<ContainerInfo[]> {
const ns = namespace ?? this.namespace;
const res = await this.client.get<K8sPodList>(


@@ -1,8 +1,10 @@
import type { McpServer } from '@prisma/client';
export interface McpConfigServer {
command: string;
args: string[];
command?: string;
args?: string[];
url?: string;
headers?: Record<string, string>;
env?: Record<string, string>;
}
@@ -19,6 +21,13 @@ export function generateMcpConfig(
const mcpServers: Record<string, McpConfigServer> = {};
for (const { server, resolvedEnv } of servers) {
if (server.transport === 'SSE' || server.transport === 'STREAMABLE_HTTP') {
// Point at mcpd proxy URL for non-STDIO transports
mcpServers[server.name] = {
url: `http://localhost:3100/api/v1/mcp/proxy/${server.name}`,
};
} else {
// STDIO — npx command approach
const config: McpConfigServer = {
command: 'npx',
args: ['-y', server.packageName ?? server.name],
@@ -30,6 +39,7 @@ export function generateMcpConfig(
mcpServers[server.name] = config;
}
}
return { mcpServers };
}


@@ -30,6 +30,8 @@ export interface ContainerInfo {
name: string;
state: 'running' | 'stopped' | 'starting' | 'error' | 'unknown';
port?: number;
/** Container IP on the first non-default network (for internal communication) */
ip?: string;
createdAt: Date;
}
@@ -38,6 +40,12 @@ export interface ContainerLogs {
stderr: string;
}
export interface ExecResult {
exitCode: number;
stdout: string;
stderr: string;
}
export interface McpOrchestrator {
/** Pull an image if not present locally */
pullImage(image: string): Promise<void>;
@@ -57,6 +65,9 @@ export interface McpOrchestrator {
/** Get container logs */
getContainerLogs(containerId: string, opts?: { tail?: number; since?: number }): Promise<ContainerLogs>;
/** Execute a command inside a running container with optional stdin */
execInContainer(containerId: string, cmd: string[], opts?: { stdin?: string; timeoutMs?: number }): Promise<ExecResult>;
/** Check if the orchestrator runtime is available */
ping(): Promise<boolean>;
}


@@ -1,18 +1,24 @@
import type { Project } from '@prisma/client';
import type { IProjectRepository } from '../repositories/project.repository.js';
import type { McpServer } from '@prisma/client';
import type { IProjectRepository, ProjectWithRelations } from '../repositories/project.repository.js';
import type { IMcpServerRepository, ISecretRepository } from '../repositories/interfaces.js';
import { CreateProjectSchema, UpdateProjectSchema } from '../validation/project.schema.js';
import { NotFoundError, ConflictError } from './mcp-server.service.js';
import { resolveServerEnv } from './env-resolver.js';
import { generateMcpConfig } from './mcp-config-generator.js';
import type { McpConfig } from './mcp-config-generator.js';
export class ProjectService {
constructor(
private readonly projectRepo: IProjectRepository,
private readonly serverRepo: IMcpServerRepository,
private readonly secretRepo: ISecretRepository,
) {}
async list(ownerId?: string): Promise<Project[]> {
async list(ownerId?: string): Promise<ProjectWithRelations[]> {
return this.projectRepo.findAll(ownerId);
}
async getById(id: string): Promise<Project> {
async getById(id: string): Promise<ProjectWithRelations> {
const project = await this.projectRepo.findById(id);
if (project === null) {
throw new NotFoundError(`Project not found: ${id}`);
@@ -20,7 +26,20 @@ export class ProjectService {
return project;
}
async create(input: unknown, ownerId: string): Promise<Project> {
/** Resolve by ID or name. */
async resolveAndGet(idOrName: string): Promise<ProjectWithRelations> {
// Try by ID first
const byId = await this.projectRepo.findById(idOrName);
if (byId !== null) return byId;
// Fall back to name
const byName = await this.projectRepo.findByName(idOrName);
if (byName !== null) return byName;
throw new NotFoundError(`Project not found: ${idOrName}`);
}
async create(input: unknown, ownerId: string): Promise<ProjectWithRelations> {
const data = CreateProjectSchema.parse(input);
const existing = await this.projectRepo.findByName(data.name);
@@ -28,17 +47,107 @@ export class ProjectService {
throw new ConflictError(`Project already exists: ${data.name}`);
}
return this.projectRepo.create({ ...data, ownerId });
// Resolve server names to IDs
const serverIds = await this.resolveServerNames(data.servers);
const project = await this.projectRepo.create({
name: data.name,
description: data.description,
ownerId,
proxyMode: data.proxyMode,
...(data.llmProvider !== undefined ? { llmProvider: data.llmProvider } : {}),
...(data.llmModel !== undefined ? { llmModel: data.llmModel } : {}),
});
// Link servers
if (serverIds.length > 0) {
await this.projectRepo.setServers(project.id, serverIds);
}
async update(id: string, input: unknown): Promise<Project> {
// Re-fetch to include relations
return this.getById(project.id);
}
async update(id: string, input: unknown): Promise<ProjectWithRelations> {
const data = UpdateProjectSchema.parse(input);
const project = await this.getById(id);
// Build update data for scalar fields
const updateData: Record<string, unknown> = {};
if (data.description !== undefined) updateData['description'] = data.description;
if (data.proxyMode !== undefined) updateData['proxyMode'] = data.proxyMode;
if (data.llmProvider !== undefined) updateData['llmProvider'] = data.llmProvider;
if (data.llmModel !== undefined) updateData['llmModel'] = data.llmModel;
// Update scalar fields if any changed
if (Object.keys(updateData).length > 0) {
await this.projectRepo.update(project.id, updateData);
}
// Update servers if provided
if (data.servers !== undefined) {
const serverIds = await this.resolveServerNames(data.servers);
await this.projectRepo.setServers(project.id, serverIds);
}
// Re-fetch to include updated relations
return this.getById(project.id);
}
async delete(id: string): Promise<void> {
await this.getById(id);
await this.projectRepo.delete(id);
}
async generateMcpConfig(idOrName: string): Promise<McpConfig> {
const project = await this.resolveAndGet(idOrName);
if (project.proxyMode === 'filtered') {
// Single entry pointing at mcplocal proxy
return {
mcpServers: {
[project.name]: {
url: `http://localhost:3100/api/v1/mcp/proxy/project/${project.name}`,
},
},
};
}
// Direct mode: fetch full servers and resolve env
const serverEntries: Array<{ server: McpServer; resolvedEnv: Record<string, string> }> = [];
for (const ps of project.servers) {
const server = await this.serverRepo.findById(ps.server.id);
if (server === null) continue;
const resolvedEnv = await resolveServerEnv(server, this.secretRepo);
serverEntries.push({ server, resolvedEnv });
}
return generateMcpConfig(serverEntries);
}
async addServer(idOrName: string, serverName: string): Promise<ProjectWithRelations> {
const project = await this.resolveAndGet(idOrName);
const server = await this.serverRepo.findByName(serverName);
if (server === null) throw new NotFoundError(`Server not found: ${serverName}`);
await this.projectRepo.addServer(project.id, server.id);
return this.getById(project.id);
}
async removeServer(idOrName: string, serverName: string): Promise<ProjectWithRelations> {
const project = await this.resolveAndGet(idOrName);
const server = await this.serverRepo.findByName(serverName);
if (server === null) throw new NotFoundError(`Server not found: ${serverName}`);
await this.projectRepo.removeServer(project.id, server.id);
return this.getById(project.id);
}
private async resolveServerNames(names: string[]): Promise<string[]> {
return Promise.all(names.map(async (name) => {
const server = await this.serverRepo.findByName(name);
if (server === null) throw new NotFoundError(`Server not found: ${name}`);
return server.id;
}));
}
}
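The ID-then-name fallback in `resolveAndGet` can be sketched as a pure function; the Map-based store below is a hypothetical stand-in for the repository's `findById`/`findByName`:

```typescript
// Hypothetical in-memory stand-in for the project repository's lookups.
interface ProjectRecord { id: string; name: string; }

const byId = new Map<string, ProjectRecord>([['p-1', { id: 'p-1', name: 'alpha' }]]);
const byName = new Map<string, ProjectRecord>([['alpha', { id: 'p-1', name: 'alpha' }]]);

// Mirrors resolveAndGet: try the argument as an ID first, then fall back to name.
function resolve(idOrName: string): ProjectRecord {
  const viaId = byId.get(idOrName);
  if (viaId !== undefined) return viaId;
  const viaName = byName.get(idOrName);
  if (viaName !== undefined) return viaName;
  throw new Error(`Project not found: ${idOrName}`);
}

console.log(resolve('p-1').name);  // resolved by ID
console.log(resolve('alpha').id);  // resolved by name fallback
```

Because IDs are tried first, a project whose name collides with another project's ID would be shadowed; the cuid-style IDs used here make that collision practically impossible.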


@@ -0,0 +1,54 @@
import type { RbacDefinition } from '@prisma/client';
import type { IRbacDefinitionRepository } from '../repositories/rbac-definition.repository.js';
import { CreateRbacDefinitionSchema, UpdateRbacDefinitionSchema } from '../validation/rbac-definition.schema.js';
import { NotFoundError, ConflictError } from './mcp-server.service.js';
export class RbacDefinitionService {
constructor(private readonly repo: IRbacDefinitionRepository) {}
async list(): Promise<RbacDefinition[]> {
return this.repo.findAll();
}
async getById(id: string): Promise<RbacDefinition> {
const def = await this.repo.findById(id);
if (def === null) {
throw new NotFoundError(`RbacDefinition not found: ${id}`);
}
return def;
}
async getByName(name: string): Promise<RbacDefinition> {
const def = await this.repo.findByName(name);
if (def === null) {
throw new NotFoundError(`RbacDefinition not found: ${name}`);
}
return def;
}
async create(input: unknown): Promise<RbacDefinition> {
const data = CreateRbacDefinitionSchema.parse(input);
const existing = await this.repo.findByName(data.name);
if (existing !== null) {
throw new ConflictError(`RbacDefinition already exists: ${data.name}`);
}
return this.repo.create(data);
}
async update(id: string, input: unknown): Promise<RbacDefinition> {
const data = UpdateRbacDefinitionSchema.parse(input);
// Verify exists
await this.getById(id);
return this.repo.update(id, data);
}
async delete(id: string): Promise<void> {
// Verify exists
await this.getById(id);
await this.repo.delete(id);
}
}


@@ -0,0 +1,161 @@
import type { PrismaClient } from '@prisma/client';
import type { IRbacDefinitionRepository } from '../repositories/rbac-definition.repository.js';
import {
normalizeResource,
isResourceBinding,
isOperationBinding,
type RbacSubject,
type RbacRoleBinding,
} from '../validation/rbac-definition.schema.js';
export type RbacAction = 'view' | 'create' | 'delete' | 'edit' | 'run' | 'expose';
export interface ResourcePermission {
role: string;
resource: string;
name?: string;
}
export interface OperationPermission {
role: 'run';
action: string;
}
export type Permission = ResourcePermission | OperationPermission;
export interface AllowedScope {
wildcard: boolean;
names: Set<string>;
}
/** Maps roles to the set of actions they grant. */
const ROLE_ACTIONS: Record<string, readonly RbacAction[]> = {
edit: ['view', 'create', 'delete', 'edit', 'expose'],
view: ['view'],
create: ['create'],
delete: ['delete'],
run: ['run'],
expose: ['expose', 'view'],
};
export class RbacService {
constructor(
private readonly rbacRepo: IRbacDefinitionRepository,
private readonly prisma: PrismaClient,
) {}
/**
* Check whether a user is allowed to perform an action on a resource.
* @param resourceName — optional specific resource name (e.g. 'my-ha').
* If provided, name-scoped bindings only match when their name equals this.
* If omitted (listing), name-scoped bindings still grant access.
*/
async canAccess(userId: string, action: RbacAction, resource: string, resourceName?: string): Promise<boolean> {
const permissions = await this.getPermissions(userId);
const normalized = normalizeResource(resource);
for (const perm of permissions) {
if (!('resource' in perm)) continue;
const actions = ROLE_ACTIONS[perm.role];
if (actions === undefined) continue;
if (!actions.includes(action)) continue;
const permResource = normalizeResource(perm.resource);
if (permResource !== '*' && permResource !== normalized) continue;
// Name-scoped check: if binding has a name AND caller specified a resourceName, must match
if (perm.name !== undefined && resourceName !== undefined && perm.name !== resourceName) continue;
return true;
}
return false;
}
/**
* Check whether a user is allowed to perform a named operation.
* Operations require an explicit 'run' role binding with a matching action.
*/
async canRunOperation(userId: string, operation: string): Promise<boolean> {
const permissions = await this.getPermissions(userId);
for (const perm of permissions) {
if ('action' in perm && perm.role === 'run' && perm.action === operation) {
return true;
}
}
return false;
}
/**
* Determine the set of resource names a user may access for a given action+resource.
* Returns wildcard:true if any matching binding is unscoped (no name constraint).
* Returns wildcard:false with a set of allowed names if all bindings are name-scoped.
*/
async getAllowedScope(userId: string, action: RbacAction, resource: string): Promise<AllowedScope> {
const permissions = await this.getPermissions(userId);
const normalized = normalizeResource(resource);
const names = new Set<string>();
for (const perm of permissions) {
if (!('resource' in perm)) continue;
const actions = ROLE_ACTIONS[perm.role];
if (actions === undefined) continue;
if (!actions.includes(action)) continue;
const permResource = normalizeResource(perm.resource);
if (permResource !== '*' && permResource !== normalized) continue;
// Unscoped binding → wildcard access to this resource
if (perm.name === undefined) return { wildcard: true, names: new Set() };
names.add(perm.name);
}
return { wildcard: false, names };
}
/**
* Collect all permissions for a user across all matching RbacDefinitions.
*/
async getPermissions(userId: string): Promise<Permission[]> {
// 1. Resolve user email
const user = await this.prisma.user.findUnique({
where: { id: userId },
select: { email: true },
});
if (user === null) return [];
// 2. Resolve group names the user belongs to
const memberships = await this.prisma.groupMember.findMany({
where: { userId },
select: { group: { select: { name: true } } },
});
const groupNames = memberships.map((m) => m.group.name);
// 3. Load all RbacDefinitions
const definitions = await this.rbacRepo.findAll();
// 4. Find definitions where user is a subject
const permissions: Permission[] = [];
for (const def of definitions) {
const subjects = def.subjects as RbacSubject[];
const matched = subjects.some((s) => {
if (s.kind === 'User') return s.name === user.email;
if (s.kind === 'Group') return groupNames.includes(s.name);
return false;
});
if (!matched) continue;
// 5. Collect roleBindings
const bindings = def.roleBindings as RbacRoleBinding[];
for (const binding of bindings) {
if (isResourceBinding(binding)) {
const perm: ResourcePermission = { role: binding.role, resource: binding.resource };
if (binding.name !== undefined) perm.name = binding.name;
permissions.push(perm);
} else if (isOperationBinding(binding)) {
permissions.push({ role: 'run', action: binding.action });
}
}
}
return permissions;
}
}
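The matching rules in `canAccess` reduce to a pure predicate over the collected permissions. A self-contained sketch, with types simplified from the service above (only a subset of `ROLE_ACTIONS` is reproduced):

```typescript
type Action = 'view' | 'create' | 'delete' | 'edit' | 'run' | 'expose';
interface Perm { role: string; resource: string; name?: string; }

// Subset of the role→actions expansion table.
const ROLE_ACTIONS: Record<string, readonly Action[]> = {
  edit: ['view', 'create', 'delete', 'edit', 'expose'],
  view: ['view'],
  expose: ['expose', 'view'],
};

// Mirrors the loop in canAccess: the role must grant the action, the resource
// must be '*' or an exact match, and a name-scoped binding only blocks when a
// target name was supplied and differs.
function canAccess(perms: Perm[], action: Action, resource: string, name?: string): boolean {
  return perms.some((p) => {
    const actions = ROLE_ACTIONS[p.role];
    if (actions === undefined || !actions.includes(action)) return false;
    if (p.resource !== '*' && p.resource !== resource) return false;
    if (p.name !== undefined && name !== undefined && p.name !== name) return false;
    return true;
  });
}

const perms: Perm[] = [{ role: 'edit', resource: 'servers', name: 'my-ha' }];
console.log(canAccess(perms, 'view', 'servers', 'my-ha')); // true: edit grants view
console.log(canAccess(perms, 'view', 'servers', 'other')); // false: name scope blocks
console.log(canAccess(perms, 'view', 'servers'));          // true: listing, no name given
```

The third call shows the documented listing behaviour: a name-scoped binding still grants access when no `resourceName` is supplied.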


@@ -0,0 +1,60 @@
import bcrypt from 'bcrypt';
import type { IUserRepository, SafeUser } from '../repositories/user.repository.js';
import { CreateUserSchema } from '../validation/user.schema.js';
import { NotFoundError, ConflictError } from './mcp-server.service.js';
const SALT_ROUNDS = 10;
export class UserService {
constructor(private readonly userRepo: IUserRepository) {}
async list(): Promise<SafeUser[]> {
return this.userRepo.findAll();
}
async getById(id: string): Promise<SafeUser> {
const user = await this.userRepo.findById(id);
if (user === null) {
throw new NotFoundError(`User not found: ${id}`);
}
return user;
}
async getByEmail(email: string): Promise<SafeUser> {
const user = await this.userRepo.findByEmail(email);
if (user === null) {
throw new NotFoundError(`User not found: ${email}`);
}
return user;
}
async create(input: unknown): Promise<SafeUser> {
const data = CreateUserSchema.parse(input);
const existing = await this.userRepo.findByEmail(data.email);
if (existing !== null) {
throw new ConflictError(`User already exists: ${data.email}`);
}
const passwordHash = await bcrypt.hash(data.password, SALT_ROUNDS);
const createData: { email: string; passwordHash: string; name?: string } = {
email: data.email,
passwordHash,
};
if (data.name !== undefined) {
createData.name = data.name;
}
return this.userRepo.create(createData);
}
async delete(id: string): Promise<void> {
await this.getById(id);
await this.userRepo.delete(id);
}
async count(): Promise<number> {
return this.userRepo.count();
}
}


@@ -0,0 +1,15 @@
import { z } from 'zod';
export const CreateGroupSchema = z.object({
name: z.string().min(1).max(100).regex(/^[a-z0-9-]+$/, 'Name must be lowercase alphanumeric with hyphens'),
description: z.string().max(1000).default(''),
members: z.array(z.string().email()).default([]),
});
export const UpdateGroupSchema = z.object({
description: z.string().max(1000).optional(),
members: z.array(z.string().email()).optional(),
});
export type CreateGroupInput = z.infer<typeof CreateGroupSchema>;
export type UpdateGroupInput = z.infer<typeof UpdateGroupSchema>;


@@ -2,3 +2,5 @@ export { CreateMcpServerSchema, UpdateMcpServerSchema } from './mcp-server.schem
export type { CreateMcpServerInput, UpdateMcpServerInput } from './mcp-server.schema.js';
export { CreateProjectSchema, UpdateProjectSchema } from './project.schema.js';
export type { CreateProjectInput, UpdateProjectInput } from './project.schema.js';
export { CreateRbacDefinitionSchema, UpdateRbacDefinitionSchema, RbacSubjectSchema, RbacRoleBindingSchema, RBAC_ROLES, RBAC_RESOURCES } from './rbac-definition.schema.js';
export type { CreateRbacDefinitionInput, UpdateRbacDefinitionInput, RbacSubject, RbacRoleBinding } from './rbac-definition.schema.js';


@@ -3,10 +3,21 @@ import { z } from 'zod';
export const CreateProjectSchema = z.object({
name: z.string().min(1).max(100).regex(/^[a-z0-9-]+$/, 'Name must be lowercase alphanumeric with hyphens'),
description: z.string().max(1000).default(''),
proxyMode: z.enum(['direct', 'filtered']).default('direct'),
llmProvider: z.string().max(100).optional(),
llmModel: z.string().max(100).optional(),
servers: z.array(z.string().min(1)).default([]),
}).refine(
(d) => d.proxyMode !== 'filtered' || d.llmProvider,
{ message: 'llmProvider is required when proxyMode is "filtered"' },
);
export const UpdateProjectSchema = z.object({
description: z.string().max(1000).optional(),
proxyMode: z.enum(['direct', 'filtered']).optional(),
llmProvider: z.string().max(100).nullable().optional(),
llmModel: z.string().max(100).nullable().optional(),
servers: z.array(z.string().min(1)).optional(),
});
export type CreateProjectInput = z.infer<typeof CreateProjectSchema>;
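The `refine` clause above is equivalent to this plain predicate (a restatement for illustration, not the zod runtime itself; the provider value is a hypothetical example):

```typescript
interface ProjectInput { proxyMode: 'direct' | 'filtered'; llmProvider?: string; }

// 'filtered' proxy mode requires an llmProvider; 'direct' never does.
function filteredModeOk(d: ProjectInput): boolean {
  return d.proxyMode !== 'filtered' || Boolean(d.llmProvider);
}

console.log(filteredModeOk({ proxyMode: 'direct' }));                          // true
console.log(filteredModeOk({ proxyMode: 'filtered' }));                        // false
console.log(filteredModeOk({ proxyMode: 'filtered', llmProvider: 'openai' })); // true
```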


@@ -0,0 +1,71 @@
import { z } from 'zod';
export const RBAC_ROLES = ['edit', 'view', 'create', 'delete', 'run', 'expose'] as const;
export const RBAC_RESOURCES = ['*', 'servers', 'instances', 'secrets', 'projects', 'templates', 'users', 'groups', 'rbac'] as const;
/** Singular→plural map for resource names. */
const RESOURCE_ALIASES: Record<string, string> = {
server: 'servers',
instance: 'instances',
secret: 'secrets',
project: 'projects',
template: 'templates',
user: 'users',
group: 'groups',
};
/** Normalize a resource name to its canonical plural form. */
export function normalizeResource(resource: string): string {
return RESOURCE_ALIASES[resource] ?? resource;
}
export const RbacSubjectSchema = z.object({
kind: z.enum(['User', 'Group']),
name: z.string().min(1),
});
/** Resource binding: role grants access to a resource type (optionally scoped to a named instance). */
export const ResourceBindingSchema = z.object({
role: z.enum(RBAC_ROLES),
resource: z.string().min(1).transform(normalizeResource),
name: z.string().min(1).optional(),
});
/** Operation binding: 'run' role grants access to a named operation. */
export const OperationBindingSchema = z.object({
role: z.literal('run'),
action: z.string().min(1),
});
/** Union of both binding types. */
export const RbacRoleBindingSchema = z.union([
ResourceBindingSchema,
OperationBindingSchema,
]);
export type RbacSubject = z.infer<typeof RbacSubjectSchema>;
export type ResourceBinding = z.infer<typeof ResourceBindingSchema>;
export type OperationBinding = z.infer<typeof OperationBindingSchema>;
export type RbacRoleBinding = z.infer<typeof RbacRoleBindingSchema>;
export function isResourceBinding(b: RbacRoleBinding): b is ResourceBinding {
return 'resource' in b;
}
export function isOperationBinding(b: RbacRoleBinding): b is OperationBinding {
return 'action' in b;
}
export const CreateRbacDefinitionSchema = z.object({
name: z.string().min(1).max(100).regex(/^[a-z0-9-]+$/, 'Name must be lowercase alphanumeric with hyphens'),
subjects: z.array(RbacSubjectSchema).min(1),
roleBindings: z.array(RbacRoleBindingSchema).min(1),
});
export const UpdateRbacDefinitionSchema = z.object({
subjects: z.array(RbacSubjectSchema).min(1).optional(),
roleBindings: z.array(RbacRoleBindingSchema).min(1).optional(),
});
export type CreateRbacDefinitionInput = z.infer<typeof CreateRbacDefinitionSchema>;
export type UpdateRbacDefinitionInput = z.infer<typeof UpdateRbacDefinitionSchema>;


@@ -0,0 +1,15 @@
import { z } from 'zod';
export const CreateUserSchema = z.object({
email: z.string().email(),
password: z.string().min(8).max(128),
name: z.string().max(100).optional(),
});
export const UpdateUserSchema = z.object({
name: z.string().max(100).optional(),
password: z.string().min(8).max(128).optional(),
});
export type CreateUserInput = z.infer<typeof CreateUserSchema>;
export type UpdateUserInput = z.infer<typeof UpdateUserSchema>;


@@ -0,0 +1,424 @@
import { describe, it, expect, vi, afterEach, beforeEach } from 'vitest';
import Fastify from 'fastify';
import type { FastifyInstance } from 'fastify';
import { registerAuthRoutes } from '../src/routes/auth.js';
import { errorHandler } from '../src/middleware/error-handler.js';
import type { AuthService, LoginResult } from '../src/services/auth.service.js';
import type { UserService } from '../src/services/user.service.js';
import type { GroupService } from '../src/services/group.service.js';
import type { RbacDefinitionService } from '../src/services/rbac-definition.service.js';
import type { RbacService, RbacAction } from '../src/services/rbac.service.js';
import type { SafeUser } from '../src/repositories/user.repository.js';
import type { RbacDefinition } from '@prisma/client';
let app: FastifyInstance;
afterEach(async () => {
if (app) await app.close();
});
function makeLoginResult(overrides?: Partial<LoginResult>): LoginResult {
return {
token: 'test-token-123',
expiresAt: new Date(Date.now() + 86400_000),
user: { id: 'user-1', email: 'admin@example.com', role: 'user' },
...overrides,
};
}
function makeSafeUser(overrides?: Partial<SafeUser>): SafeUser {
return {
id: 'user-1',
email: 'admin@example.com',
name: null,
role: 'user',
provider: 'local',
externalId: null,
version: 1,
createdAt: new Date(),
updatedAt: new Date(),
...overrides,
};
}
function makeRbacDef(overrides?: Partial<RbacDefinition>): RbacDefinition {
return {
id: 'rbac-1',
name: 'bootstrap-admin',
subjects: [{ kind: 'Group', name: 'admin' }],
roleBindings: [
{ role: 'edit', resource: '*' },
{ role: 'run', resource: '*' },
{ role: 'run', action: 'impersonate' },
{ role: 'run', action: 'logs' },
{ role: 'run', action: 'backup' },
{ role: 'run', action: 'restore' },
{ role: 'run', action: 'audit-purge' },
],
version: 1,
createdAt: new Date(),
updatedAt: new Date(),
...overrides,
};
}
interface MockDeps {
authService: {
login: ReturnType<typeof vi.fn>;
logout: ReturnType<typeof vi.fn>;
findSession: ReturnType<typeof vi.fn>;
impersonate: ReturnType<typeof vi.fn>;
};
userService: {
count: ReturnType<typeof vi.fn>;
create: ReturnType<typeof vi.fn>;
list: ReturnType<typeof vi.fn>;
getById: ReturnType<typeof vi.fn>;
getByEmail: ReturnType<typeof vi.fn>;
delete: ReturnType<typeof vi.fn>;
};
groupService: {
create: ReturnType<typeof vi.fn>;
list: ReturnType<typeof vi.fn>;
getById: ReturnType<typeof vi.fn>;
getByName: ReturnType<typeof vi.fn>;
update: ReturnType<typeof vi.fn>;
delete: ReturnType<typeof vi.fn>;
};
rbacDefinitionService: {
create: ReturnType<typeof vi.fn>;
list: ReturnType<typeof vi.fn>;
getById: ReturnType<typeof vi.fn>;
getByName: ReturnType<typeof vi.fn>;
update: ReturnType<typeof vi.fn>;
delete: ReturnType<typeof vi.fn>;
};
rbacService: {
canAccess: ReturnType<typeof vi.fn>;
canRunOperation: ReturnType<typeof vi.fn>;
getPermissions: ReturnType<typeof vi.fn>;
};
}
function createMockDeps(): MockDeps {
return {
authService: {
login: vi.fn(async () => makeLoginResult()),
logout: vi.fn(async () => {}),
findSession: vi.fn(async () => null),
impersonate: vi.fn(async () => makeLoginResult({ token: 'impersonated-token' })),
},
userService: {
count: vi.fn(async () => 0),
create: vi.fn(async () => makeSafeUser()),
list: vi.fn(async () => []),
getById: vi.fn(async () => makeSafeUser()),
getByEmail: vi.fn(async () => makeSafeUser()),
delete: vi.fn(async () => {}),
},
groupService: {
create: vi.fn(async () => ({ id: 'grp-1', name: 'admin', description: 'Bootstrap admin group', members: [] })),
list: vi.fn(async () => []),
getById: vi.fn(async () => null),
getByName: vi.fn(async () => null),
update: vi.fn(async () => null),
delete: vi.fn(async () => {}),
},
rbacDefinitionService: {
create: vi.fn(async () => makeRbacDef()),
list: vi.fn(async () => []),
getById: vi.fn(async () => makeRbacDef()),
getByName: vi.fn(async () => null),
update: vi.fn(async () => makeRbacDef()),
delete: vi.fn(async () => {}),
},
rbacService: {
canAccess: vi.fn(async () => false),
canRunOperation: vi.fn(async () => false),
getPermissions: vi.fn(async () => []),
},
};
}
function createApp(deps: MockDeps): Promise<FastifyInstance> {
app = Fastify({ logger: false });
app.setErrorHandler(errorHandler);
registerAuthRoutes(app, deps as unknown as {
authService: AuthService;
userService: UserService;
groupService: GroupService;
rbacDefinitionService: RbacDefinitionService;
rbacService: RbacService;
});
return app.ready();
}
describe('Auth Bootstrap', () => {
describe('GET /api/v1/auth/status', () => {
it('returns hasUsers: false when no users exist', async () => {
const deps = createMockDeps();
deps.userService.count.mockResolvedValue(0);
await createApp(deps);
const res = await app.inject({ method: 'GET', url: '/api/v1/auth/status' });
expect(res.statusCode).toBe(200);
expect(res.json<{ hasUsers: boolean }>().hasUsers).toBe(false);
});
it('returns hasUsers: true when users exist', async () => {
const deps = createMockDeps();
deps.userService.count.mockResolvedValue(1);
await createApp(deps);
const res = await app.inject({ method: 'GET', url: '/api/v1/auth/status' });
expect(res.statusCode).toBe(200);
expect(res.json<{ hasUsers: boolean }>().hasUsers).toBe(true);
});
});
describe('POST /api/v1/auth/bootstrap', () => {
it('creates admin user, admin group, RBAC definition targeting group, and returns session token', async () => {
const deps = createMockDeps();
deps.userService.count.mockResolvedValue(0);
await createApp(deps);
const res = await app.inject({
method: 'POST',
url: '/api/v1/auth/bootstrap',
payload: { email: 'admin@example.com', password: 'securepass123' },
});
expect(res.statusCode).toBe(201);
const body = res.json<LoginResult>();
expect(body.token).toBe('test-token-123');
expect(body.user.email).toBe('admin@example.com');
// Verify user was created
expect(deps.userService.create).toHaveBeenCalledWith({
email: 'admin@example.com',
password: 'securepass123',
});
// Verify admin group was created with the user as member
expect(deps.groupService.create).toHaveBeenCalledWith({
name: 'admin',
description: 'Bootstrap admin group',
members: ['admin@example.com'],
});
// Verify RBAC definition targets the Group, not the User
expect(deps.rbacDefinitionService.create).toHaveBeenCalledWith({
name: 'bootstrap-admin',
subjects: [{ kind: 'Group', name: 'admin' }],
roleBindings: [
{ role: 'edit', resource: '*' },
{ role: 'run', resource: '*' },
{ role: 'run', action: 'impersonate' },
{ role: 'run', action: 'logs' },
{ role: 'run', action: 'backup' },
{ role: 'run', action: 'restore' },
{ role: 'run', action: 'audit-purge' },
],
});
// Verify auto-login was called
expect(deps.authService.login).toHaveBeenCalledWith('admin@example.com', 'securepass123');
});
it('passes name when provided', async () => {
const deps = createMockDeps();
deps.userService.count.mockResolvedValue(0);
await createApp(deps);
await app.inject({
method: 'POST',
url: '/api/v1/auth/bootstrap',
payload: { email: 'admin@example.com', password: 'securepass123', name: 'Admin User' },
});
expect(deps.userService.create).toHaveBeenCalledWith({
email: 'admin@example.com',
password: 'securepass123',
name: 'Admin User',
});
});
it('returns 409 when users already exist', async () => {
const deps = createMockDeps();
deps.userService.count.mockResolvedValue(1);
await createApp(deps);
const res = await app.inject({
method: 'POST',
url: '/api/v1/auth/bootstrap',
payload: { email: 'admin@example.com', password: 'securepass123' },
});
expect(res.statusCode).toBe(409);
expect(res.json<{ error: string }>().error).toContain('Users already exist');
// Should NOT have created user, group, or RBAC
expect(deps.userService.create).not.toHaveBeenCalled();
expect(deps.groupService.create).not.toHaveBeenCalled();
expect(deps.rbacDefinitionService.create).not.toHaveBeenCalled();
});
it('validates email and password via UserService', async () => {
const deps = createMockDeps();
deps.userService.count.mockResolvedValue(0);
// Simulate Zod validation error from UserService
deps.userService.create.mockRejectedValue(
Object.assign(new Error('Validation error'), { statusCode: 400, issues: [] }),
);
await createApp(deps);
const res = await app.inject({
method: 'POST',
url: '/api/v1/auth/bootstrap',
payload: { email: 'not-an-email', password: 'short' },
});
// The error handler should handle the validation error
expect(res.statusCode).toBeGreaterThanOrEqual(400);
});
});
describe('POST /api/v1/auth/login', () => {
it('logs in successfully', async () => {
const deps = createMockDeps();
await createApp(deps);
const res = await app.inject({
method: 'POST',
url: '/api/v1/auth/login',
payload: { email: 'admin@example.com', password: 'securepass123' },
});
expect(res.statusCode).toBe(200);
expect(res.json<LoginResult>().token).toBe('test-token-123');
});
});
describe('POST /api/v1/auth/logout', () => {
it('logs out with valid token', async () => {
const deps = createMockDeps();
deps.authService.findSession.mockResolvedValue({
userId: 'user-1',
expiresAt: new Date(Date.now() + 86400_000),
});
await createApp(deps);
const res = await app.inject({
method: 'POST',
url: '/api/v1/auth/logout',
headers: { authorization: 'Bearer valid-token' },
});
expect(res.statusCode).toBe(200);
expect(res.json<{ success: boolean }>().success).toBe(true);
expect(deps.authService.logout).toHaveBeenCalledWith('valid-token');
});
it('returns 401 without auth', async () => {
const deps = createMockDeps();
await createApp(deps);
const res = await app.inject({
method: 'POST',
url: '/api/v1/auth/logout',
});
expect(res.statusCode).toBe(401);
});
});
describe('POST /api/v1/auth/impersonate', () => {
it('creates session for target user when caller is admin', async () => {
const deps = createMockDeps();
// Auth: valid session
deps.authService.findSession.mockResolvedValue({
userId: 'admin-user-id',
expiresAt: new Date(Date.now() + 86400_000),
});
// RBAC: allow impersonate operation
deps.rbacService.canRunOperation.mockResolvedValue(true);
// Impersonate returns token for target
deps.authService.impersonate.mockResolvedValue(
makeLoginResult({ token: 'impersonated-token', user: { id: 'user-2', email: 'target@example.com', role: 'user' } }),
);
await createApp(deps);
const res = await app.inject({
method: 'POST',
url: '/api/v1/auth/impersonate',
headers: { authorization: 'Bearer admin-token' },
payload: { email: 'target@example.com' },
});
expect(res.statusCode).toBe(200);
const body = res.json<LoginResult>();
expect(body.token).toBe('impersonated-token');
expect(body.user.email).toBe('target@example.com');
expect(deps.authService.impersonate).toHaveBeenCalledWith('target@example.com');
});
it('returns 401 without auth', async () => {
const deps = createMockDeps();
await createApp(deps);
const res = await app.inject({
method: 'POST',
url: '/api/v1/auth/impersonate',
payload: { email: 'target@example.com' },
});
expect(res.statusCode).toBe(401);
});
it('returns 403 when caller lacks admin permission on users', async () => {
const deps = createMockDeps();
// Auth: valid session
deps.authService.findSession.mockResolvedValue({
userId: 'non-admin-id',
expiresAt: new Date(Date.now() + 86400_000),
});
// RBAC: deny
deps.rbacService.canRunOperation.mockResolvedValue(false);
await createApp(deps);
const res = await app.inject({
method: 'POST',
url: '/api/v1/auth/impersonate',
headers: { authorization: 'Bearer regular-token' },
payload: { email: 'target@example.com' },
});
expect(res.statusCode).toBe(403);
});
it('returns 401 when impersonation target does not exist', async () => {
const deps = createMockDeps();
// Auth: valid session
deps.authService.findSession.mockResolvedValue({
userId: 'admin-user-id',
expiresAt: new Date(Date.now() + 86400_000),
});
// RBAC: allow
deps.rbacService.canRunOperation.mockResolvedValue(true);
// Impersonate fails — user not found
const authError = new Error('User not found');
(authError as Error & { statusCode: number }).statusCode = 401;
deps.authService.impersonate.mockRejectedValue(authError);
await createApp(deps);
const res = await app.inject({
method: 'POST',
url: '/api/v1/auth/impersonate',
headers: { authorization: 'Bearer admin-token' },
payload: { email: 'nonexistent@example.com' },
});
expect(res.statusCode).toBe(401);
});
});
});


@@ -6,6 +6,9 @@ import { encrypt, decrypt, isSensitiveKey } from '../src/services/backup/crypto.
import { registerBackupRoutes } from '../src/routes/backup.js';
import type { IMcpServerRepository, ISecretRepository } from '../src/repositories/interfaces.js';
import type { IProjectRepository } from '../src/repositories/project.repository.js';
import type { IUserRepository } from '../src/repositories/user.repository.js';
import type { IGroupRepository } from '../src/repositories/group.repository.js';
import type { IRbacDefinitionRepository } from '../src/repositories/rbac-definition.repository.js';
// Mock data
const mockServers = [
@@ -31,8 +34,32 @@ const mockSecrets = [
const mockProjects = [
{
id: 'proj1', name: 'my-project', description: 'Test project', proxyMode: 'direct', llmProvider: null, llmModel: null,
ownerId: 'user1', version: 1, createdAt: new Date(), updatedAt: new Date(),
servers: [{ id: 'ps1', server: { id: 's1', name: 'github' } }],
},
];
const mockUsers = [
{ id: 'u1', email: 'alice@test.com', name: 'Alice', role: 'ADMIN', provider: null, externalId: null, version: 1, createdAt: new Date(), updatedAt: new Date() },
{ id: 'u2', email: 'bob@test.com', name: null, role: 'USER', provider: 'oidc', externalId: null, version: 1, createdAt: new Date(), updatedAt: new Date() },
];
const mockGroups = [
{
id: 'g1', name: 'dev-team', description: 'Developers', version: 1, createdAt: new Date(), updatedAt: new Date(),
members: [
{ id: 'gm1', user: { id: 'u1', email: 'alice@test.com', name: 'Alice' } },
{ id: 'gm2', user: { id: 'u2', email: 'bob@test.com', name: null } },
],
},
];
const mockRbacDefinitions = [
{
id: 'rbac1', name: 'admins', version: 1, createdAt: new Date(), updatedAt: new Date(),
subjects: [{ kind: 'User', name: 'alice@test.com' }],
roleBindings: [{ role: 'edit', resource: '*' }],
},
];
@@ -63,9 +90,47 @@ function mockProjectRepo(): IProjectRepository {
findAll: vi.fn(async () => [...mockProjects]),
findById: vi.fn(async (id: string) => mockProjects.find((p) => p.id === id) ?? null),
findByName: vi.fn(async () => null),
create: vi.fn(async (data) => ({ id: 'new-proj', ...data, servers: [], version: 1, createdAt: new Date(), updatedAt: new Date() } as typeof mockProjects[0])),
update: vi.fn(async (id, data) => ({ ...mockProjects.find((p) => p.id === id)!, ...data })),
delete: vi.fn(async () => {}),
setServers: vi.fn(async () => {}),
addServer: vi.fn(async () => {}),
removeServer: vi.fn(async () => {}),
};
}
function mockUserRepo(): IUserRepository {
return {
findAll: vi.fn(async () => [...mockUsers]),
findById: vi.fn(async (id: string) => mockUsers.find((u) => u.id === id) ?? null),
findByEmail: vi.fn(async (email: string) => mockUsers.find((u) => u.email === email) ?? null),
create: vi.fn(async (data) => ({ id: 'new-u', ...data, provider: null, externalId: null, version: 1, createdAt: new Date(), updatedAt: new Date() } as typeof mockUsers[0])),
delete: vi.fn(async () => {}),
count: vi.fn(async () => mockUsers.length),
};
}
function mockGroupRepo(): IGroupRepository {
return {
findAll: vi.fn(async () => [...mockGroups]),
findById: vi.fn(async (id: string) => mockGroups.find((g) => g.id === id) ?? null),
findByName: vi.fn(async (name: string) => mockGroups.find((g) => g.name === name) ?? null),
create: vi.fn(async (data) => ({ id: 'new-g', ...data, version: 1, createdAt: new Date(), updatedAt: new Date() } as typeof mockGroups[0])),
update: vi.fn(async (id, data) => ({ ...mockGroups.find((g) => g.id === id)!, ...data })),
delete: vi.fn(async () => {}),
setMembers: vi.fn(async () => {}),
findGroupsForUser: vi.fn(async () => []),
};
}
function mockRbacRepo(): IRbacDefinitionRepository {
return {
findAll: vi.fn(async () => [...mockRbacDefinitions]),
findById: vi.fn(async (id: string) => mockRbacDefinitions.find((r) => r.id === id) ?? null),
findByName: vi.fn(async (name: string) => mockRbacDefinitions.find((r) => r.name === name) ?? null),
create: vi.fn(async (data) => ({ id: 'new-rbac', ...data, version: 1, createdAt: new Date(), updatedAt: new Date() } as typeof mockRbacDefinitions[0])),
update: vi.fn(async (id, data) => ({ ...mockRbacDefinitions.find((r) => r.id === id)!, ...data })),
delete: vi.fn(async () => {}),
};
}
@@ -110,7 +175,7 @@ describe('BackupService', () => {
let backupService: BackupService;
beforeEach(() => {
backupService = new BackupService(mockServerRepo(), mockProjectRepo(), mockSecretRepo(), mockUserRepo(), mockGroupRepo(), mockRbacRepo());
});
it('creates backup with all resources', async () => {
@@ -126,11 +191,50 @@ describe('BackupService', () => {
expect(bundle.projects[0]!.name).toBe('my-project');
});
it('includes users in backup', async () => {
const bundle = await backupService.createBackup();
expect(bundle.users).toHaveLength(2);
expect(bundle.users![0]!.email).toBe('alice@test.com');
expect(bundle.users![0]!.role).toBe('ADMIN');
expect(bundle.users![1]!.email).toBe('bob@test.com');
expect(bundle.users![1]!.provider).toBe('oidc');
});
it('includes groups in backup with member emails', async () => {
const bundle = await backupService.createBackup();
expect(bundle.groups).toHaveLength(1);
expect(bundle.groups![0]!.name).toBe('dev-team');
expect(bundle.groups![0]!.memberEmails).toEqual(['alice@test.com', 'bob@test.com']);
});
it('includes rbac bindings in backup', async () => {
const bundle = await backupService.createBackup();
expect(bundle.rbacBindings).toHaveLength(1);
expect(bundle.rbacBindings![0]!.name).toBe('admins');
expect(bundle.rbacBindings![0]!.subjects).toEqual([{ kind: 'User', name: 'alice@test.com' }]);
});
it('includes enriched projects with server names', async () => {
const bundle = await backupService.createBackup();
const proj = bundle.projects[0]!;
expect(proj.proxyMode).toBe('direct');
expect(proj.serverNames).toEqual(['github']);
});
it('filters resources', async () => {
const bundle = await backupService.createBackup({ resources: ['servers'] });
expect(bundle.servers).toHaveLength(2);
expect(bundle.secrets).toHaveLength(0);
expect(bundle.projects).toHaveLength(0);
expect(bundle.users).toHaveLength(0);
expect(bundle.groups).toHaveLength(0);
expect(bundle.rbacBindings).toHaveLength(0);
});
it('filters to only users', async () => {
const bundle = await backupService.createBackup({ resources: ['users'] });
expect(bundle.servers).toHaveLength(0);
expect(bundle.users).toHaveLength(2);
});
it('encrypts sensitive secret values when password provided', async () => {
@@ -150,13 +254,22 @@ describe('BackupService', () => {
(emptySecretRepo.findAll as ReturnType<typeof vi.fn>).mockResolvedValue([]);
const emptyProjectRepo = mockProjectRepo();
(emptyProjectRepo.findAll as ReturnType<typeof vi.fn>).mockResolvedValue([]);
const emptyUserRepo = mockUserRepo();
(emptyUserRepo.findAll as ReturnType<typeof vi.fn>).mockResolvedValue([]);
const emptyGroupRepo = mockGroupRepo();
(emptyGroupRepo.findAll as ReturnType<typeof vi.fn>).mockResolvedValue([]);
const emptyRbacRepo = mockRbacRepo();
(emptyRbacRepo.findAll as ReturnType<typeof vi.fn>).mockResolvedValue([]);
const service = new BackupService(emptyServerRepo, emptyProjectRepo, emptySecretRepo, emptyUserRepo, emptyGroupRepo, emptyRbacRepo);
const bundle = await service.createBackup();
expect(bundle.servers).toHaveLength(0);
expect(bundle.secrets).toHaveLength(0);
expect(bundle.projects).toHaveLength(0);
expect(bundle.users).toHaveLength(0);
expect(bundle.groups).toHaveLength(0);
expect(bundle.rbacBindings).toHaveLength(0);
});
});
@@ -165,16 +278,25 @@ describe('RestoreService', () => {
let serverRepo: IMcpServerRepository;
let secretRepo: ISecretRepository;
let projectRepo: IProjectRepository;
let userRepo: IUserRepository;
let groupRepo: IGroupRepository;
let rbacRepo: IRbacDefinitionRepository;
beforeEach(() => {
serverRepo = mockServerRepo();
secretRepo = mockSecretRepo();
projectRepo = mockProjectRepo();
userRepo = mockUserRepo();
groupRepo = mockGroupRepo();
rbacRepo = mockRbacRepo();
// Default: nothing exists yet
(serverRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(null);
(secretRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(null);
(projectRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(null);
(userRepo.findByEmail as ReturnType<typeof vi.fn>).mockResolvedValue(null);
(groupRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(null);
(rbacRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(null);
restoreService = new RestoreService(serverRepo, projectRepo, secretRepo, userRepo, groupRepo, rbacRepo);
});
const validBundle = {
@@ -187,6 +309,23 @@ describe('RestoreService', () => {
projects: [{ name: 'test-proj', description: 'Test' }],
};
const fullBundle = {
...validBundle,
users: [
{ email: 'alice@test.com', name: 'Alice', role: 'ADMIN', provider: null },
{ email: 'bob@test.com', name: null, role: 'USER', provider: 'oidc' },
],
groups: [
{ name: 'dev-team', description: 'Developers', memberEmails: ['alice@test.com', 'bob@test.com'] },
],
rbacBindings: [
{ name: 'admins', subjects: [{ kind: 'User', name: 'alice@test.com' }], roleBindings: [{ role: 'edit', resource: '*' }] },
],
projects: [
{ name: 'test-proj', description: 'Test', proxyMode: 'filtered', llmProvider: 'openai', llmModel: 'gpt-4', serverNames: ['github'], members: ['alice@test.com'] },
],
};
it('validates valid bundle', () => {
expect(restoreService.validateBundle(validBundle)).toBe(true);
});
@@ -197,6 +336,11 @@ describe('RestoreService', () => {
expect(restoreService.validateBundle({ version: '1' })).toBe(false);
});
it('validates old bundles without new fields (backwards compatibility)', () => {
expect(restoreService.validateBundle(validBundle)).toBe(true);
// Old bundle has no users/groups/rbacBindings — should still validate
});
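The backwards-compatibility test above hinges on the newer `users`/`groups`/`rbacBindings` fields being optional in the bundle schema. A minimal sketch of such a guard (the name `isBackupBundle` and the exact shape are illustrative assumptions, not the service's actual `validateBundle` implementation):

```typescript
// Hypothetical shape check: v1 fields are required, the newer
// users/groups/rbacBindings fields are optional so old bundles still pass.
interface BackupBundle {
  version: string;
  servers: unknown[];
  secrets: unknown[];
  projects: unknown[];
  users?: unknown[];
  groups?: unknown[];
  rbacBindings?: unknown[];
}

function isBackupBundle(value: unknown): value is BackupBundle {
  if (typeof value !== 'object' || value === null) return false;
  const b = value as Record<string, unknown>;
  // Required v1 fields
  if (typeof b.version !== 'string') return false;
  if (!Array.isArray(b.servers) || !Array.isArray(b.secrets) || !Array.isArray(b.projects)) return false;
  // New fields: absent is fine, but if present they must be arrays
  for (const key of ['users', 'groups', 'rbacBindings'] as const) {
    if (b[key] !== undefined && !Array.isArray(b[key])) return false;
  }
  return true;
}
```

This mirrors the assertions above: `{ version: '1' }` alone fails, while a full v1 bundle without any of the new fields still validates.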
it('restores all resources', async () => {
const result = await restoreService.restore(validBundle);
@@ -209,6 +353,95 @@ describe('RestoreService', () => {
expect(projectRepo.create).toHaveBeenCalled();
});
it('restores users', async () => {
const result = await restoreService.restore(fullBundle);
expect(result.usersCreated).toBe(2);
expect(userRepo.create).toHaveBeenCalledWith(expect.objectContaining({
email: 'alice@test.com',
name: 'Alice',
role: 'ADMIN',
passwordHash: '__RESTORED_MUST_RESET__',
}));
expect(userRepo.create).toHaveBeenCalledWith(expect.objectContaining({
email: 'bob@test.com',
role: 'USER',
}));
});
it('restores groups with member resolution', async () => {
// After users are created, simulate they can be found by email
let callCount = 0;
(userRepo.findByEmail as ReturnType<typeof vi.fn>).mockImplementation(async (email: string) => {
// First calls during user restore return null (user doesn't exist yet)
// Later calls during group member resolution return the created user
callCount++;
if (callCount > 2) {
// After user creation phase, simulate finding created users
if (email === 'alice@test.com') return { id: 'new-u-alice', email };
if (email === 'bob@test.com') return { id: 'new-u-bob', email };
}
return null;
});
const result = await restoreService.restore(fullBundle);
expect(result.groupsCreated).toBe(1);
expect(groupRepo.create).toHaveBeenCalledWith(expect.objectContaining({
name: 'dev-team',
description: 'Developers',
}));
expect(groupRepo.setMembers).toHaveBeenCalled();
});
it('restores rbac bindings', async () => {
const result = await restoreService.restore(fullBundle);
expect(result.rbacCreated).toBe(1);
expect(rbacRepo.create).toHaveBeenCalledWith(expect.objectContaining({
name: 'admins',
subjects: [{ kind: 'User', name: 'alice@test.com' }],
roleBindings: [{ role: 'edit', resource: '*' }],
}));
});
it('restores enriched projects with server linking', async () => {
// First lookup (during server restore) returns null so the server is created;
// later lookups (during project restore) find the restored server.
let serverCallCount = 0;
(serverRepo.findByName as ReturnType<typeof vi.fn>).mockImplementation(async (name: string) => {
serverCallCount++;
// During server restore phase, first call returns null (server doesn't exist)
// During project restore phase, server should be found
if (serverCallCount > 1 && name === 'github') return { id: 'restored-s1', name: 'github' };
return null;
});
const result = await restoreService.restore(fullBundle);
expect(result.projectsCreated).toBe(1);
expect(projectRepo.create).toHaveBeenCalledWith(expect.objectContaining({
name: 'test-proj',
proxyMode: 'filtered',
llmProvider: 'openai',
llmModel: 'gpt-4',
}));
expect(projectRepo.setServers).toHaveBeenCalled();
});
it('restores old bundle without users/groups/rbac', async () => {
const result = await restoreService.restore(validBundle);
expect(result.serversCreated).toBe(1);
expect(result.secretsCreated).toBe(1);
expect(result.projectsCreated).toBe(1);
expect(result.usersCreated).toBe(0);
expect(result.groupsCreated).toBe(0);
expect(result.rbacCreated).toBe(0);
expect(result.errors).toHaveLength(0);
});
it('skips existing resources with skip strategy', async () => {
(serverRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(mockServers[0]);
const result = await restoreService.restore(validBundle, { conflictStrategy: 'skip' });
@@ -218,6 +451,33 @@ describe('RestoreService', () => {
expect(serverRepo.create).not.toHaveBeenCalled();
});
it('skips existing users', async () => {
(userRepo.findByEmail as ReturnType<typeof vi.fn>).mockResolvedValue(mockUsers[0]);
const bundle = { ...validBundle, users: [{ email: 'alice@test.com', name: 'Alice', role: 'ADMIN', provider: null }] };
const result = await restoreService.restore(bundle, { conflictStrategy: 'skip' });
expect(result.usersSkipped).toBe(1);
expect(result.usersCreated).toBe(0);
});
it('skips existing groups', async () => {
(groupRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(mockGroups[0]);
const bundle = { ...validBundle, groups: [{ name: 'dev-team', description: 'Devs', memberEmails: [] }] };
const result = await restoreService.restore(bundle, { conflictStrategy: 'skip' });
expect(result.groupsSkipped).toBe(1);
expect(result.groupsCreated).toBe(0);
});
it('skips existing rbac bindings', async () => {
(rbacRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(mockRbacDefinitions[0]);
const bundle = { ...validBundle, rbacBindings: [{ name: 'admins', subjects: [], roleBindings: [] }] };
const result = await restoreService.restore(bundle, { conflictStrategy: 'skip' });
expect(result.rbacSkipped).toBe(1);
expect(result.rbacCreated).toBe(0);
});
it('aborts on conflict with fail strategy', async () => {
(serverRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(mockServers[0]);
const result = await restoreService.restore(validBundle, { conflictStrategy: 'fail' });
@@ -233,6 +493,18 @@ describe('RestoreService', () => {
expect(serverRepo.update).toHaveBeenCalled();
});
it('overwrites existing rbac bindings', async () => {
(rbacRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(mockRbacDefinitions[0]);
const bundle = {
...validBundle,
rbacBindings: [{ name: 'admins', subjects: [{ kind: 'User', name: 'new@test.com' }], roleBindings: [{ role: 'view', resource: 'servers' }] }],
};
const result = await restoreService.restore(bundle, { conflictStrategy: 'overwrite' });
expect(result.rbacCreated).toBe(1);
expect(rbacRepo.update).toHaveBeenCalled();
});
it('fails restore with encrypted bundle and no password', async () => {
const encBundle = { ...validBundle, encrypted: true, encryptedSecrets: encrypt('{}', 'pw') };
const result = await restoreService.restore(encBundle);
@@ -262,6 +534,26 @@ describe('RestoreService', () => {
const result = await restoreService.restore(encBundle, { password: 'wrong' });
expect(result.errors[0]).toContain('Failed to decrypt');
});
it('restores in correct order: secrets → servers → users → groups → projects → rbac', async () => {
const callOrder: string[] = [];
(secretRepo.create as ReturnType<typeof vi.fn>).mockImplementation(async () => { callOrder.push('secret'); return { id: 'sec' }; });
(serverRepo.create as ReturnType<typeof vi.fn>).mockImplementation(async () => { callOrder.push('server'); return { id: 'srv' }; });
(userRepo.create as ReturnType<typeof vi.fn>).mockImplementation(async () => { callOrder.push('user'); return { id: 'usr' }; });
(groupRepo.create as ReturnType<typeof vi.fn>).mockImplementation(async () => { callOrder.push('group'); return { id: 'grp' }; });
(projectRepo.create as ReturnType<typeof vi.fn>).mockImplementation(async () => { callOrder.push('project'); return { id: 'proj', servers: [] }; });
(rbacRepo.create as ReturnType<typeof vi.fn>).mockImplementation(async () => { callOrder.push('rbac'); return { id: 'rbac' }; });
await restoreService.restore(fullBundle);
expect(callOrder[0]).toBe('secret');
expect(callOrder[1]).toBe('server');
expect(callOrder[2]).toBe('user');
expect(callOrder[3]).toBe('user'); // second user
expect(callOrder[4]).toBe('group');
expect(callOrder[5]).toBe('project');
expect(callOrder[6]).toBe('rbac');
});
});
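The ordering test above pins a dependency chain: groups resolve member emails to users, projects link server names, and RBAC subjects reference users, so each phase must complete before its dependents run. A sketch of that phase ordering (the `runRestore` helper and its signature are illustrative, not the actual RestoreService internals):

```typescript
// Illustrative phase runner: each phase maps to an async restore step and
// runs strictly in the dependency order asserted by the test above.
type Phase = 'secrets' | 'servers' | 'users' | 'groups' | 'projects' | 'rbac';

const RESTORE_ORDER: Phase[] = ['secrets', 'servers', 'users', 'groups', 'projects', 'rbac'];

async function runRestore(steps: Partial<Record<Phase, () => Promise<void>>>): Promise<Phase[]> {
  const executed: Phase[] = [];
  for (const phase of RESTORE_ORDER) {
    const step = steps[phase];
    if (step) {
      await step(); // earlier phases finish before dependents start
      executed.push(phase);
    }
  }
  return executed;
}
```

Sequential `await` (rather than `Promise.all`) is what makes the `callOrder` assertions in the test deterministic.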
describe('Backup Routes', () => {
@@ -272,7 +564,7 @@ describe('Backup Routes', () => {
const sRepo = mockServerRepo();
const secRepo = mockSecretRepo();
const prRepo = mockProjectRepo();
backupService = new BackupService(sRepo, prRepo, secRepo, mockUserRepo(), mockGroupRepo(), mockRbacRepo());
const rSRepo = mockServerRepo();
(rSRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(null);
@@ -280,7 +572,13 @@ describe('Backup Routes', () => {
(rSecRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(null);
const rPrRepo = mockProjectRepo();
(rPrRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(null);
const rUserRepo = mockUserRepo();
(rUserRepo.findByEmail as ReturnType<typeof vi.fn>).mockResolvedValue(null);
const rGroupRepo = mockGroupRepo();
(rGroupRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(null);
const rRbacRepo = mockRbacRepo();
(rRbacRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(null);
restoreService = new RestoreService(rSRepo, rPrRepo, rSecRepo, rUserRepo, rGroupRepo, rRbacRepo);
});
async function buildApp() {
@@ -289,7 +587,7 @@ describe('Backup Routes', () => {
return app;
}
it('POST /api/v1/backup returns bundle with new resource types', async () => {
const app = await buildApp();
const res = await app.inject({
method: 'POST',
@@ -303,6 +601,9 @@ describe('Backup Routes', () => {
expect(body.servers).toBeDefined();
expect(body.secrets).toBeDefined();
expect(body.projects).toBeDefined();
expect(body.users).toBeDefined();
expect(body.groups).toBeDefined();
expect(body.rbacBindings).toBeDefined();
});
it('POST /api/v1/restore imports bundle', async () => {
@@ -318,6 +619,9 @@ describe('Backup Routes', () => {
expect(res.statusCode).toBe(200);
const body = res.json();
expect(body.serversCreated).toBeDefined();
expect(body.usersCreated).toBeDefined();
expect(body.groupsCreated).toBeDefined();
expect(body.rbacCreated).toBeDefined();
});
it('POST /api/v1/restore rejects invalid bundle', async () => {


@@ -0,0 +1,250 @@
import { describe, it, expect, vi, beforeEach } from 'vitest';
import { GroupService } from '../src/services/group.service.js';
import { NotFoundError, ConflictError } from '../src/services/mcp-server.service.js';
import type { IGroupRepository, GroupWithMembers } from '../src/repositories/group.repository.js';
import type { IUserRepository, SafeUser } from '../src/repositories/user.repository.js';
import type { Group } from '@prisma/client';
function makeGroup(overrides: Partial<Group> = {}): Group {
return {
id: 'grp-1',
name: 'developers',
description: 'Dev team',
version: 1,
createdAt: new Date(),
updatedAt: new Date(),
...overrides,
};
}
function makeGroupWithMembers(overrides: Partial<Group> = {}, members: GroupWithMembers['members'] = []): GroupWithMembers {
return {
...makeGroup(overrides),
members,
};
}
function makeUser(overrides: Partial<SafeUser> = {}): SafeUser {
return {
id: 'user-1',
email: 'alice@example.com',
name: 'Alice',
role: 'USER',
provider: null,
externalId: null,
version: 1,
createdAt: new Date(),
updatedAt: new Date(),
...overrides,
};
}
function mockGroupRepo(): IGroupRepository {
return {
findAll: vi.fn(async () => []),
findById: vi.fn(async () => null),
findByName: vi.fn(async () => null),
create: vi.fn(async (data) => makeGroup({ name: data.name, description: data.description ?? '' })),
update: vi.fn(async (id, data) => makeGroup({ id, description: data.description ?? '' })),
delete: vi.fn(async () => {}),
setMembers: vi.fn(async () => {}),
findGroupsForUser: vi.fn(async () => []),
};
}
function mockUserRepo(): IUserRepository {
return {
findAll: vi.fn(async () => []),
findById: vi.fn(async () => null),
findByEmail: vi.fn(async () => null),
create: vi.fn(async () => makeUser()),
delete: vi.fn(async () => {}),
count: vi.fn(async () => 0),
};
}
describe('GroupService', () => {
let groupRepo: ReturnType<typeof mockGroupRepo>;
let userRepo: ReturnType<typeof mockUserRepo>;
let service: GroupService;
beforeEach(() => {
groupRepo = mockGroupRepo();
userRepo = mockUserRepo();
service = new GroupService(groupRepo, userRepo);
});
describe('list', () => {
it('returns empty list', async () => {
const result = await service.list();
expect(result).toEqual([]);
expect(groupRepo.findAll).toHaveBeenCalled();
});
it('returns groups with members', async () => {
const groups = [
makeGroupWithMembers({ id: 'g1', name: 'admins' }, [
{ id: 'gm-1', user: { id: 'u1', email: 'a@b.com', name: 'A' } },
]),
];
vi.mocked(groupRepo.findAll).mockResolvedValue(groups);
const result = await service.list();
expect(result).toHaveLength(1);
expect(result[0].members).toHaveLength(1);
});
});
describe('create', () => {
it('creates a group without members', async () => {
const created = makeGroupWithMembers({ name: 'my-group', description: '' }, []);
vi.mocked(groupRepo.findById).mockResolvedValue(created);
const result = await service.create({ name: 'my-group' });
expect(result.name).toBe('my-group');
expect(groupRepo.create).toHaveBeenCalledWith({ name: 'my-group', description: '' });
expect(groupRepo.setMembers).not.toHaveBeenCalled();
});
it('creates a group with members', async () => {
const alice = makeUser({ id: 'u-alice', email: 'alice@example.com' });
const bob = makeUser({ id: 'u-bob', email: 'bob@example.com', name: 'Bob' });
vi.mocked(userRepo.findByEmail).mockImplementation(async (email: string) => {
if (email === 'alice@example.com') return alice;
if (email === 'bob@example.com') return bob;
return null;
});
const created = makeGroupWithMembers({ name: 'team' }, [
{ id: 'gm-1', user: { id: 'u-alice', email: 'alice@example.com', name: 'Alice' } },
{ id: 'gm-2', user: { id: 'u-bob', email: 'bob@example.com', name: 'Bob' } },
]);
vi.mocked(groupRepo.findById).mockResolvedValue(created);
const result = await service.create({
name: 'team',
members: ['alice@example.com', 'bob@example.com'],
});
expect(groupRepo.setMembers).toHaveBeenCalledWith('grp-1', ['u-alice', 'u-bob']);
expect(result.members).toHaveLength(2);
});
it('throws ConflictError when name exists', async () => {
vi.mocked(groupRepo.findByName).mockResolvedValue(makeGroupWithMembers({ name: 'taken' }));
await expect(service.create({ name: 'taken' })).rejects.toThrow(ConflictError);
});
it('throws NotFoundError for unknown member email', async () => {
vi.mocked(userRepo.findByEmail).mockResolvedValue(null);
await expect(
service.create({ name: 'team', members: ['unknown@example.com'] }),
).rejects.toThrow(NotFoundError);
});
it('validates input', async () => {
await expect(service.create({ name: '' })).rejects.toThrow();
await expect(service.create({ name: 'UPPERCASE' })).rejects.toThrow();
});
});
describe('getById', () => {
it('returns group when found', async () => {
const group = makeGroupWithMembers({ id: 'g1' });
vi.mocked(groupRepo.findById).mockResolvedValue(group);
const result = await service.getById('g1');
expect(result.id).toBe('g1');
});
it('throws NotFoundError when not found', async () => {
await expect(service.getById('missing')).rejects.toThrow(NotFoundError);
});
});
describe('getByName', () => {
it('returns group when found', async () => {
const group = makeGroupWithMembers({ name: 'admins' });
vi.mocked(groupRepo.findByName).mockResolvedValue(group);
const result = await service.getByName('admins');
expect(result.name).toBe('admins');
});
it('throws NotFoundError when not found', async () => {
await expect(service.getByName('missing')).rejects.toThrow(NotFoundError);
});
});
describe('update', () => {
it('updates description', async () => {
const group = makeGroupWithMembers({ id: 'g1' });
vi.mocked(groupRepo.findById).mockResolvedValue(group);
const updated = makeGroupWithMembers({ id: 'g1', description: 'new desc' });
// After update, getById is called again to return fresh data
vi.mocked(groupRepo.findById).mockResolvedValue(updated);
const result = await service.update('g1', { description: 'new desc' });
expect(groupRepo.update).toHaveBeenCalledWith('g1', { description: 'new desc' });
expect(result.description).toBe('new desc');
});
it('updates members (full replacement)', async () => {
const group = makeGroupWithMembers({ id: 'g1' }, [
{ id: 'gm-1', user: { id: 'u-old', email: 'old@example.com', name: 'Old' } },
]);
vi.mocked(groupRepo.findById).mockResolvedValue(group);
const alice = makeUser({ id: 'u-alice', email: 'alice@example.com' });
vi.mocked(userRepo.findByEmail).mockResolvedValue(alice);
const updated = makeGroupWithMembers({ id: 'g1' }, [
{ id: 'gm-2', user: { id: 'u-alice', email: 'alice@example.com', name: 'Alice' } },
]);
vi.mocked(groupRepo.findById).mockResolvedValueOnce(group).mockResolvedValue(updated);
const result = await service.update('g1', { members: ['alice@example.com'] });
expect(groupRepo.setMembers).toHaveBeenCalledWith('g1', ['u-alice']);
expect(result.members).toHaveLength(1);
});
it('throws NotFoundError when group not found', async () => {
await expect(service.update('missing', { description: 'x' })).rejects.toThrow(NotFoundError);
});
it('throws NotFoundError for unknown member email on update', async () => {
const group = makeGroupWithMembers({ id: 'g1' });
vi.mocked(groupRepo.findById).mockResolvedValue(group);
vi.mocked(userRepo.findByEmail).mockResolvedValue(null);
await expect(
service.update('g1', { members: ['unknown@example.com'] }),
).rejects.toThrow(NotFoundError);
});
});
describe('delete', () => {
it('deletes group', async () => {
const group = makeGroupWithMembers({ id: 'g1' });
vi.mocked(groupRepo.findById).mockResolvedValue(group);
await service.delete('g1');
expect(groupRepo.delete).toHaveBeenCalledWith('g1');
});
it('throws NotFoundError when group not found', async () => {
await expect(service.delete('missing')).rejects.toThrow(NotFoundError);
});
});
describe('group includes resolved member info', () => {
it('members include user id, email, and name', async () => {
const group = makeGroupWithMembers({ id: 'g1', name: 'team' }, [
{ id: 'gm-1', user: { id: 'u1', email: 'alice@example.com', name: 'Alice' } },
{ id: 'gm-2', user: { id: 'u2', email: 'bob@example.com', name: null } },
]);
vi.mocked(groupRepo.findById).mockResolvedValue(group);
const result = await service.getById('g1');
expect(result.members[0].user).toEqual({ id: 'u1', email: 'alice@example.com', name: 'Alice' });
expect(result.members[1].user).toEqual({ id: 'u2', email: 'bob@example.com', name: null });
});
});
});


@@ -11,10 +11,17 @@ function makeServer(overrides: Partial<McpServer> = {}): McpServer {
dockerImage: null,
transport: 'STDIO',
repositoryUrl: null,
externalUrl: null,
command: null,
containerPort: null,
replicas: 1,
env: [],
healthCheck: null,
version: 1,
createdAt: new Date(),
updatedAt: new Date(),
templateName: null,
templateVersion: null,
...overrides,
};
}
@@ -25,7 +32,7 @@ describe('generateMcpConfig', () => {
expect(result).toEqual({ mcpServers: {} });
});
it('generates config for a single STDIO server', () => {
const result = generateMcpConfig([
{ server: makeServer(), resolvedEnv: {} },
]);
@@ -34,7 +41,7 @@ describe('generateMcpConfig', () => {
expect(result.mcpServers['slack']?.args).toEqual(['-y', '@anthropic/slack-mcp']);
});
it('includes resolved env when present for STDIO server', () => {
const result = generateMcpConfig([
{ server: makeServer(), resolvedEnv: { SLACK_TEAM_ID: 'T123' } },
]);
@@ -67,4 +74,35 @@ describe('generateMcpConfig', () => {
]);
expect(result.mcpServers['slack']?.args).toEqual(['-y', 'slack']);
});
it('generates URL-based config for SSE servers', () => {
const server = makeServer({ name: 'sse-server', transport: 'SSE' });
const result = generateMcpConfig([
{ server, resolvedEnv: { TOKEN: 'abc' } },
]);
const config = result.mcpServers['sse-server'];
expect(config?.url).toBe('http://localhost:3100/api/v1/mcp/proxy/sse-server');
expect(config?.command).toBeUndefined();
expect(config?.args).toBeUndefined();
expect(config?.env).toBeUndefined();
});
it('generates URL-based config for STREAMABLE_HTTP servers', () => {
const server = makeServer({ name: 'stream-server', transport: 'STREAMABLE_HTTP' });
const result = generateMcpConfig([
{ server, resolvedEnv: {} },
]);
const config = result.mcpServers['stream-server'];
expect(config?.url).toBe('http://localhost:3100/api/v1/mcp/proxy/stream-server');
expect(config?.command).toBeUndefined();
});
it('mixes STDIO and SSE servers correctly', () => {
const result = generateMcpConfig([
{ server: makeServer({ name: 'stdio-srv', transport: 'STDIO' }), resolvedEnv: {} },
{ server: makeServer({ name: 'sse-srv', transport: 'SSE' }), resolvedEnv: {} },
]);
expect(result.mcpServers['stdio-srv']?.command).toBe('npx');
expect(result.mcpServers['sse-srv']?.url).toBeDefined();
});
});


@@ -0,0 +1,283 @@
import { describe, it, expect, vi, afterEach } from 'vitest';
import Fastify from 'fastify';
import type { FastifyInstance } from 'fastify';
import { registerProjectRoutes } from '../src/routes/projects.js';
import { ProjectService } from '../src/services/project.service.js';
import { errorHandler } from '../src/middleware/error-handler.js';
import type { IProjectRepository, ProjectWithRelations } from '../src/repositories/project.repository.js';
import type { IMcpServerRepository, ISecretRepository } from '../src/repositories/interfaces.js';
let app: FastifyInstance;
function makeProject(overrides: Partial<ProjectWithRelations> = {}): ProjectWithRelations {
return {
id: 'proj-1',
name: 'test-project',
description: '',
ownerId: 'user-1',
proxyMode: 'direct',
llmProvider: null,
llmModel: null,
version: 1,
createdAt: new Date(),
updatedAt: new Date(),
servers: [],
...overrides,
};
}
function mockProjectRepo(): IProjectRepository {
return {
findAll: vi.fn(async () => []),
findById: vi.fn(async () => null),
findByName: vi.fn(async () => null),
create: vi.fn(async (data) => makeProject({
name: data.name,
description: data.description,
ownerId: data.ownerId,
proxyMode: data.proxyMode,
})),
update: vi.fn(async (_id, data) => makeProject({ ...data as Partial<ProjectWithRelations> })),
delete: vi.fn(async () => {}),
setServers: vi.fn(async () => {}),
addServer: vi.fn(async () => {}),
removeServer: vi.fn(async () => {}),
};
}
function mockServerRepo(): IMcpServerRepository {
return {
findAll: vi.fn(async () => []),
findById: vi.fn(async () => null),
findByName: vi.fn(async () => null),
create: vi.fn(async () => ({} as never)),
update: vi.fn(async () => ({} as never)),
delete: vi.fn(async () => {}),
};
}
function mockSecretRepo(): ISecretRepository {
return {
findAll: vi.fn(async () => []),
findById: vi.fn(async () => null),
findByName: vi.fn(async () => null),
create: vi.fn(async () => ({} as never)),
update: vi.fn(async () => ({} as never)),
delete: vi.fn(async () => {}),
};
}
afterEach(async () => {
if (app) await app.close();
});
function createApp(projectRepo: IProjectRepository, serverRepo?: IMcpServerRepository) {
app = Fastify({ logger: false });
app.setErrorHandler(errorHandler);
const service = new ProjectService(projectRepo, serverRepo ?? mockServerRepo(), mockSecretRepo());
registerProjectRoutes(app, service);
return app.ready();
}
describe('Project Routes', () => {
describe('GET /api/v1/projects', () => {
it('returns project list', async () => {
const repo = mockProjectRepo();
vi.mocked(repo.findAll).mockResolvedValue([
makeProject({ id: 'p1', name: 'alpha', ownerId: 'user-1' }),
makeProject({ id: 'p2', name: 'beta', ownerId: 'user-2' }),
]);
await createApp(repo);
const res = await app.inject({ method: 'GET', url: '/api/v1/projects' });
expect(res.statusCode).toBe(200);
const body = res.json<Array<{ name: string }>>();
expect(body).toHaveLength(2);
});
it('lists all projects without ownerId filtering', async () => {
// This is the bug fix: the route must call list() without ownerId
// so that RBAC (preSerialization) handles access filtering, not the DB query.
const repo = mockProjectRepo();
vi.mocked(repo.findAll).mockResolvedValue([makeProject()]);
await createApp(repo);
await app.inject({ method: 'GET', url: '/api/v1/projects' });
// findAll must be called with NO arguments (undefined ownerId)
expect(repo.findAll).toHaveBeenCalledWith(undefined);
});
});
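The bug fix exercised above separates two responsibilities: the route lists everything, and RBAC decides visibility later. A sketch of that split (the names `listProjects` and `rbacFilter` are illustrative; in the real app the filtering happens in a Fastify preSerialization hook):

```typescript
// Illustrative split of the fixed behavior: the list path never narrows by
// ownerId; an RBAC layer filters what each caller may see.
interface Project { id: string; ownerId: string }

function listProjects(all: Project[]): Project[] {
  // Fixed behavior: return everything; do NOT filter by the requester's id here.
  return all;
}

function rbacFilter(projects: Project[], canView: (p: Project) => boolean): Project[] {
  // RBAC layer (preSerialization in the real app) decides visibility.
  return projects.filter(canView);
}
```

Filtering in the DB query by `ownerId` would hide projects a user can access via role bindings; filtering at serialization time keeps access decisions in one place.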
describe('GET /api/v1/projects/:id', () => {
it('returns 404 when not found', async () => {
const repo = mockProjectRepo();
await createApp(repo);
const res = await app.inject({ method: 'GET', url: '/api/v1/projects/missing' });
expect(res.statusCode).toBe(404);
});
it('returns project when found by ID', async () => {
const repo = mockProjectRepo();
vi.mocked(repo.findById).mockResolvedValue(makeProject({ id: 'p1', name: 'my-proj' }));
await createApp(repo);
const res = await app.inject({ method: 'GET', url: '/api/v1/projects/p1' });
expect(res.statusCode).toBe(200);
expect(res.json<{ name: string }>().name).toBe('my-proj');
});
it('resolves by name when ID not found', async () => {
const repo = mockProjectRepo();
vi.mocked(repo.findByName).mockResolvedValue(makeProject({ name: 'my-proj' }));
await createApp(repo);
const res = await app.inject({ method: 'GET', url: '/api/v1/projects/my-proj' });
expect(res.statusCode).toBe(200);
expect(res.json<{ name: string }>().name).toBe('my-proj');
});
});
describe('POST /api/v1/projects', () => {
it('creates a project and returns 201', async () => {
const repo = mockProjectRepo();
vi.mocked(repo.findById).mockResolvedValue(makeProject({ name: 'new-proj' }));
await createApp(repo);
const res = await app.inject({
method: 'POST',
url: '/api/v1/projects',
payload: { name: 'new-proj' },
});
expect(res.statusCode).toBe(201);
});
it('returns 400 for invalid input', async () => {
const repo = mockProjectRepo();
await createApp(repo);
const res = await app.inject({
method: 'POST',
url: '/api/v1/projects',
payload: { name: '' },
});
expect(res.statusCode).toBe(400);
});
it('returns 409 when name already exists', async () => {
const repo = mockProjectRepo();
vi.mocked(repo.findByName).mockResolvedValue(makeProject());
await createApp(repo);
const res = await app.inject({
method: 'POST',
url: '/api/v1/projects',
payload: { name: 'taken' },
});
expect(res.statusCode).toBe(409);
});
});
describe('PUT /api/v1/projects/:id', () => {
it('updates a project', async () => {
const repo = mockProjectRepo();
vi.mocked(repo.findById).mockResolvedValue(makeProject({ id: 'p1' }));
await createApp(repo);
const res = await app.inject({
method: 'PUT',
url: '/api/v1/projects/p1',
payload: { description: 'Updated' },
});
expect(res.statusCode).toBe(200);
});
it('returns 404 when not found', async () => {
const repo = mockProjectRepo();
await createApp(repo);
const res = await app.inject({
method: 'PUT',
url: '/api/v1/projects/missing',
payload: { description: 'x' },
});
expect(res.statusCode).toBe(404);
});
});
describe('DELETE /api/v1/projects/:id', () => {
it('deletes a project and returns 204', async () => {
const repo = mockProjectRepo();
vi.mocked(repo.findById).mockResolvedValue(makeProject({ id: 'p1' }));
await createApp(repo);
const res = await app.inject({ method: 'DELETE', url: '/api/v1/projects/p1' });
expect(res.statusCode).toBe(204);
});
it('returns 404 when not found', async () => {
const repo = mockProjectRepo();
await createApp(repo);
const res = await app.inject({ method: 'DELETE', url: '/api/v1/projects/missing' });
expect(res.statusCode).toBe(404);
});
});
describe('POST /api/v1/projects/:id/servers (attach)', () => {
it('attaches a server to a project', async () => {
const projectRepo = mockProjectRepo();
const serverRepo = mockServerRepo();
vi.mocked(projectRepo.findById).mockResolvedValue(makeProject({ id: 'p1' }));
vi.mocked(serverRepo.findByName).mockResolvedValue({ id: 'srv-1', name: 'my-ha' } as never);
await createApp(projectRepo, serverRepo);
const res = await app.inject({
method: 'POST',
url: '/api/v1/projects/p1/servers',
payload: { server: 'my-ha' },
});
expect(res.statusCode).toBe(200);
expect(projectRepo.addServer).toHaveBeenCalledWith('p1', 'srv-1');
});
it('returns 400 when server field is missing', async () => {
const repo = mockProjectRepo();
vi.mocked(repo.findById).mockResolvedValue(makeProject({ id: 'p1' }));
await createApp(repo);
const res = await app.inject({
method: 'POST',
url: '/api/v1/projects/p1/servers',
payload: {},
});
expect(res.statusCode).toBe(400);
});
it('returns 404 when server not found', async () => {
const projectRepo = mockProjectRepo();
vi.mocked(projectRepo.findById).mockResolvedValue(makeProject({ id: 'p1' }));
await createApp(projectRepo);
const res = await app.inject({
method: 'POST',
url: '/api/v1/projects/p1/servers',
payload: { server: 'nonexistent' },
});
expect(res.statusCode).toBe(404);
});
});
describe('DELETE /api/v1/projects/:id/servers/:serverName (detach)', () => {
it('detaches a server from a project', async () => {
const projectRepo = mockProjectRepo();
const serverRepo = mockServerRepo();
vi.mocked(projectRepo.findById).mockResolvedValue(makeProject({ id: 'p1' }));
vi.mocked(serverRepo.findByName).mockResolvedValue({ id: 'srv-1', name: 'my-ha' } as never);
await createApp(projectRepo, serverRepo);
const res = await app.inject({ method: 'DELETE', url: '/api/v1/projects/p1/servers/my-ha' });
expect(res.statusCode).toBe(204);
expect(projectRepo.removeServer).toHaveBeenCalledWith('p1', 'srv-1');
});
it('returns 404 when server not found', async () => {
const projectRepo = mockProjectRepo();
vi.mocked(projectRepo.findById).mockResolvedValue(makeProject({ id: 'p1' }));
await createApp(projectRepo);
const res = await app.inject({ method: 'DELETE', url: '/api/v1/projects/p1/servers/nonexistent' });
expect(res.statusCode).toBe(404);
});
});
});
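The route tests above pin down the fix from PR #25: project listing must filter by RBAC scope rather than by ownerId. A minimal standalone sketch of that filtering idea, with a hypothetical `filterByScope` helper and an assumed scope shape (`'*'` for unrestricted, otherwise a list of allowed names):

```typescript
// Sketch only — the scope shape and helper name are assumptions, not the service's real API.
type Scope = '*' | string[];

// Return every item when the scope is unrestricted, otherwise only items whose
// name appears in the allowed list.
function filterByScope<T extends { name: string }>(items: T[], scope: Scope): T[] {
  return scope === '*' ? items : items.filter((item) => scope.includes(item.name));
}

const projects = [{ name: 'p1' }, { name: 'p2' }];
console.log(filterByScope(projects, ['p2']).length); // → 1
```

Filtering on a computed scope keeps the repository's `findAll` free of caller-specific arguments, which is exactly what the `toHaveBeenCalledWith(undefined)` assertion above guards.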


@@ -1,66 +1,383 @@
import { describe, it, expect, vi, beforeEach } from 'vitest';
import { ProjectService } from '../src/services/project.service.js';
import { NotFoundError, ConflictError } from '../src/services/mcp-server.service.js';
import type { IProjectRepository, ProjectWithRelations } from '../src/repositories/project.repository.js';
import type { IMcpServerRepository, ISecretRepository } from '../src/repositories/interfaces.js';
import type { McpServer } from '@prisma/client';
function makeProject(overrides: Partial<ProjectWithRelations> = {}): ProjectWithRelations {
return {
id: 'proj-1',
name: 'test-project',
description: '',
ownerId: 'user-1',
proxyMode: 'direct',
llmProvider: null,
llmModel: null,
version: 1,
createdAt: new Date(),
updatedAt: new Date(),
servers: [],
...overrides,
};
}
function makeServer(overrides: Partial<McpServer> = {}): McpServer {
return {
id: 'srv-1',
name: 'test-server',
description: '',
packageName: '@mcp/test',
dockerImage: null,
transport: 'STDIO',
repositoryUrl: null,
externalUrl: null,
command: null,
containerPort: null,
replicas: 1,
env: [],
healthCheck: null,
version: 1,
createdAt: new Date(),
updatedAt: new Date(),
templateName: null,
templateVersion: null,
...overrides,
};
}
function mockProjectRepo(): IProjectRepository {
return {
findAll: vi.fn(async () => []),
findById: vi.fn(async () => null),
findByName: vi.fn(async () => null),
create: vi.fn(async (data) => makeProject({
name: data.name,
description: data.description,
ownerId: data.ownerId,
proxyMode: data.proxyMode,
llmProvider: data.llmProvider ?? null,
llmModel: data.llmModel ?? null,
})),
update: vi.fn(async (_id, data) => makeProject({ ...data as Partial<ProjectWithRelations> })),
delete: vi.fn(async () => {}),
setServers: vi.fn(async () => {}),
addServer: vi.fn(async () => {}),
removeServer: vi.fn(async () => {}),
};
}
function mockServerRepo(): IMcpServerRepository {
return {
findAll: vi.fn(async () => []),
findById: vi.fn(async () => null),
findByName: vi.fn(async () => null),
create: vi.fn(async () => makeServer()),
update: vi.fn(async () => makeServer()),
delete: vi.fn(async () => {}),
};
}
function mockSecretRepo(): ISecretRepository {
return {
findAll: vi.fn(async () => []),
findById: vi.fn(async () => null),
findByName: vi.fn(async () => null),
create: vi.fn(async () => ({ id: 'sec-1', name: 'test', data: {}, version: 1, createdAt: new Date(), updatedAt: new Date() })),
update: vi.fn(async () => ({ id: 'sec-1', name: 'test', data: {}, version: 1, createdAt: new Date(), updatedAt: new Date() })),
delete: vi.fn(async () => {}),
};
}
describe('ProjectService', () => {
let projectRepo: ReturnType<typeof mockProjectRepo>;
let serverRepo: ReturnType<typeof mockServerRepo>;
let secretRepo: ReturnType<typeof mockSecretRepo>;
let service: ProjectService;
beforeEach(() => {
projectRepo = mockProjectRepo();
serverRepo = mockServerRepo();
secretRepo = mockSecretRepo();
service = new ProjectService(projectRepo, serverRepo, secretRepo);
});
describe('create', () => {
it('creates a basic project', async () => {
// After create, getById is called to re-fetch with relations
const created = makeProject({ name: 'my-project', ownerId: 'user-1' });
vi.mocked(projectRepo.findById).mockResolvedValue(created);
const result = await service.create({ name: 'my-project' }, 'user-1');
expect(result.name).toBe('my-project');
expect(result.ownerId).toBe('user-1');
expect(projectRepo.create).toHaveBeenCalled();
});
it('throws ConflictError when name exists', async () => {
vi.mocked(projectRepo.findByName).mockResolvedValue(makeProject());
await expect(service.create({ name: 'taken' }, 'u1')).rejects.toThrow(ConflictError);
});
it('validates input', async () => {
await expect(service.create({ name: '' }, 'u1')).rejects.toThrow();
});
it('creates project with servers (resolves names)', async () => {
const srv1 = makeServer({ id: 'srv-1', name: 'github' });
const srv2 = makeServer({ id: 'srv-2', name: 'slack' });
vi.mocked(serverRepo.findByName).mockImplementation(async (name) => {
if (name === 'github') return srv1;
if (name === 'slack') return srv2;
return null;
});
const created = makeProject({ id: 'proj-new' });
vi.mocked(projectRepo.create).mockResolvedValue(created);
vi.mocked(projectRepo.findById).mockResolvedValue(makeProject({
id: 'proj-new',
servers: [
{ id: 'ps-1', server: { id: 'srv-1', name: 'github' } },
{ id: 'ps-2', server: { id: 'srv-2', name: 'slack' } },
],
}));
const result = await service.create({ name: 'my-project', servers: ['github', 'slack'] }, 'user-1');
expect(projectRepo.setServers).toHaveBeenCalledWith('proj-new', ['srv-1', 'srv-2']);
expect(result.servers).toHaveLength(2);
});
it('creates project with proxyMode and llmProvider', async () => {
const created = makeProject({ id: 'proj-filtered', proxyMode: 'filtered', llmProvider: 'openai' });
vi.mocked(projectRepo.create).mockResolvedValue(created);
vi.mocked(projectRepo.findById).mockResolvedValue(created);
const result = await service.create({
name: 'filtered-proj',
proxyMode: 'filtered',
llmProvider: 'openai',
}, 'user-1');
expect(result.proxyMode).toBe('filtered');
expect(result.llmProvider).toBe('openai');
});
it('rejects filtered project without llmProvider', async () => {
await expect(
service.create({ name: 'bad-proj', proxyMode: 'filtered' }, 'user-1'),
).rejects.toThrow();
});
it('throws NotFoundError when server name resolution fails', async () => {
vi.mocked(serverRepo.findByName).mockResolvedValue(null);
await expect(
service.create({ name: 'my-project', servers: ['nonexistent'] }, 'user-1'),
).rejects.toThrow(NotFoundError);
});
});
describe('getById', () => {
it('throws NotFoundError when not found', async () => {
await expect(service.getById('missing')).rejects.toThrow(NotFoundError);
});
it('returns project when found', async () => {
const proj = makeProject({ id: 'found' });
vi.mocked(projectRepo.findById).mockResolvedValue(proj);
const result = await service.getById('found');
expect(result.id).toBe('found');
});
});
describe('resolveAndGet', () => {
it('finds by ID first', async () => {
const proj = makeProject({ id: 'proj-id' });
vi.mocked(projectRepo.findById).mockResolvedValue(proj);
const result = await service.resolveAndGet('proj-id');
expect(result.id).toBe('proj-id');
});
it('falls back to name when ID not found', async () => {
vi.mocked(projectRepo.findById).mockResolvedValue(null);
const proj = makeProject({ name: 'my-name' });
vi.mocked(projectRepo.findByName).mockResolvedValue(proj);
const result = await service.resolveAndGet('my-name');
expect(result.name).toBe('my-name');
});
it('throws NotFoundError when neither ID nor name found', async () => {
await expect(service.resolveAndGet('nothing')).rejects.toThrow(NotFoundError);
});
});
describe('update', () => {
it('updates servers (full replacement)', async () => {
const existing = makeProject({ id: 'proj-1' });
vi.mocked(projectRepo.findById).mockResolvedValue(existing);
const srv = makeServer({ id: 'srv-new', name: 'new-srv' });
vi.mocked(serverRepo.findByName).mockResolvedValue(srv);
await service.update('proj-1', { servers: ['new-srv'] });
expect(projectRepo.setServers).toHaveBeenCalledWith('proj-1', ['srv-new']);
});
it('updates proxyMode', async () => {
const existing = makeProject({ id: 'proj-1' });
vi.mocked(projectRepo.findById).mockResolvedValue(existing);
await service.update('proj-1', { proxyMode: 'filtered', llmProvider: 'anthropic' });
expect(projectRepo.update).toHaveBeenCalledWith('proj-1', {
proxyMode: 'filtered',
llmProvider: 'anthropic',
});
});
});
describe('delete', () => {
it('deletes project', async () => {
vi.mocked(projectRepo.findById).mockResolvedValue(makeProject({ id: 'p1' }));
await service.delete('p1');
expect(projectRepo.delete).toHaveBeenCalledWith('p1');
});
it('throws NotFoundError when project does not exist', async () => {
await expect(service.delete('missing')).rejects.toThrow(NotFoundError);
});
});
describe('addServer', () => {
it('attaches a server by name', async () => {
const project = makeProject({ id: 'proj-1' });
const srv = makeServer({ id: 'srv-1', name: 'my-ha' });
vi.mocked(projectRepo.findById).mockResolvedValue(project);
vi.mocked(serverRepo.findByName).mockResolvedValue(srv);
await service.addServer('proj-1', 'my-ha');
expect(projectRepo.addServer).toHaveBeenCalledWith('proj-1', 'srv-1');
});
it('throws NotFoundError when project not found', async () => {
await expect(service.addServer('missing', 'my-ha')).rejects.toThrow(NotFoundError);
});
it('throws NotFoundError when server not found', async () => {
vi.mocked(projectRepo.findById).mockResolvedValue(makeProject({ id: 'proj-1' }));
vi.mocked(serverRepo.findByName).mockResolvedValue(null);
await expect(service.addServer('proj-1', 'nonexistent')).rejects.toThrow(NotFoundError);
});
});
describe('removeServer', () => {
it('detaches a server by name', async () => {
const project = makeProject({ id: 'proj-1' });
const srv = makeServer({ id: 'srv-1', name: 'my-ha' });
vi.mocked(projectRepo.findById).mockResolvedValue(project);
vi.mocked(serverRepo.findByName).mockResolvedValue(srv);
await service.removeServer('proj-1', 'my-ha');
expect(projectRepo.removeServer).toHaveBeenCalledWith('proj-1', 'srv-1');
});
it('throws NotFoundError when project not found', async () => {
await expect(service.removeServer('missing', 'my-ha')).rejects.toThrow(NotFoundError);
});
it('throws NotFoundError when server not found', async () => {
vi.mocked(projectRepo.findById).mockResolvedValue(makeProject({ id: 'proj-1' }));
vi.mocked(serverRepo.findByName).mockResolvedValue(null);
await expect(service.removeServer('proj-1', 'nonexistent')).rejects.toThrow(NotFoundError);
});
});
describe('generateMcpConfig', () => {
it('generates direct mode config with STDIO servers', async () => {
const srv = makeServer({ id: 'srv-1', name: 'github', packageName: '@mcp/github', transport: 'STDIO' });
const project = makeProject({
id: 'proj-1',
name: 'my-proj',
proxyMode: 'direct',
servers: [{ id: 'ps-1', server: { id: 'srv-1', name: 'github' } }],
});
vi.mocked(projectRepo.findById).mockResolvedValue(project);
vi.mocked(serverRepo.findById).mockResolvedValue(srv);
const config = await service.generateMcpConfig('proj-1');
expect(config.mcpServers['github']).toBeDefined();
expect(config.mcpServers['github']?.command).toBe('npx');
expect(config.mcpServers['github']?.args).toEqual(['-y', '@mcp/github']);
});
it('generates direct mode config with SSE servers (URL-based)', async () => {
const srv = makeServer({ id: 'srv-2', name: 'sse-server', transport: 'SSE' });
const project = makeProject({
id: 'proj-1',
proxyMode: 'direct',
servers: [{ id: 'ps-1', server: { id: 'srv-2', name: 'sse-server' } }],
});
vi.mocked(projectRepo.findById).mockResolvedValue(project);
vi.mocked(serverRepo.findById).mockResolvedValue(srv);
const config = await service.generateMcpConfig('proj-1');
expect(config.mcpServers['sse-server']?.url).toBe('http://localhost:3100/api/v1/mcp/proxy/sse-server');
expect(config.mcpServers['sse-server']?.command).toBeUndefined();
});
it('generates filtered mode config (single mcplocal entry)', async () => {
const project = makeProject({
id: 'proj-1',
name: 'filtered-proj',
proxyMode: 'filtered',
llmProvider: 'openai',
servers: [{ id: 'ps-1', server: { id: 'srv-1', name: 'github' } }],
});
vi.mocked(projectRepo.findById).mockResolvedValue(project);
const config = await service.generateMcpConfig('proj-1');
expect(Object.keys(config.mcpServers)).toHaveLength(1);
expect(config.mcpServers['filtered-proj']?.url).toBe('http://localhost:3100/api/v1/mcp/proxy/project/filtered-proj');
});
it('resolves by name for mcp-config', async () => {
const project = makeProject({
id: 'proj-1',
name: 'my-proj',
proxyMode: 'direct',
servers: [],
});
vi.mocked(projectRepo.findById).mockResolvedValue(null);
vi.mocked(projectRepo.findByName).mockResolvedValue(project);
const config = await service.generateMcpConfig('my-proj');
expect(config.mcpServers).toEqual({});
});
it('includes env for STDIO servers', async () => {
const srv = makeServer({
id: 'srv-1',
name: 'github',
transport: 'STDIO',
env: [{ name: 'GITHUB_TOKEN', value: 'tok123' }],
});
const project = makeProject({
id: 'proj-1',
proxyMode: 'direct',
servers: [{ id: 'ps-1', server: { id: 'srv-1', name: 'github' } }],
});
vi.mocked(projectRepo.findById).mockResolvedValue(project);
vi.mocked(serverRepo.findById).mockResolvedValue(srv);
const config = await service.generateMcpConfig('proj-1');
expect(config.mcpServers['github']?.env?.['GITHUB_TOKEN']).toBe('tok123');
});
});
});
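The `generateMcpConfig` tests above fix the config contract for direct mode: STDIO servers become `npx -y <package>` entries with their env inlined. A hand-built sketch of that output shape, inferred from the assertions (the interfaces and `directStdioEntry` helper are illustrative, not the service's exports):

```typescript
// Shapes inferred from the test expectations above; names here are assumptions.
interface McpServerEntry {
  command?: string;
  args?: string[];
  url?: string;
  env?: Record<string, string>;
}
interface McpConfig {
  mcpServers: Record<string, McpServerEntry>;
}

// Build a direct-mode entry for a STDIO server: launch via npx with the package name.
function directStdioEntry(packageName: string, env: Record<string, string> = {}): McpServerEntry {
  return { command: 'npx', args: ['-y', packageName], env };
}

const config: McpConfig = {
  mcpServers: { github: directStdioEntry('@mcp/github', { GITHUB_TOKEN: 'tok123' }) },
};
console.log(config.mcpServers['github']?.args?.join(' ')); // → -y @mcp/github
```

SSE servers instead get a `url` pointing at the proxy and no `command`, and filtered mode collapses everything into a single project-level proxy entry, as the other cases above assert.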


@@ -0,0 +1,229 @@
import { describe, it, expect, vi, beforeEach } from 'vitest';
import { RbacDefinitionService } from '../src/services/rbac-definition.service.js';
import { NotFoundError, ConflictError } from '../src/services/mcp-server.service.js';
import type { IRbacDefinitionRepository } from '../src/repositories/rbac-definition.repository.js';
import type { RbacDefinition } from '@prisma/client';
function makeDef(overrides: Partial<RbacDefinition> = {}): RbacDefinition {
return {
id: 'def-1',
name: 'test-rbac',
subjects: [{ kind: 'User', name: 'alice@example.com' }],
roleBindings: [{ role: 'edit', resource: '*' }],
version: 1,
createdAt: new Date(),
updatedAt: new Date(),
...overrides,
};
}
function mockRepo(): IRbacDefinitionRepository {
return {
findAll: vi.fn(async () => []),
findById: vi.fn(async () => null),
findByName: vi.fn(async () => null),
create: vi.fn(async (data) => makeDef({ name: data.name, subjects: data.subjects, roleBindings: data.roleBindings })),
update: vi.fn(async (id, data) => makeDef({ id, ...data })),
delete: vi.fn(async () => {}),
};
}
describe('RbacDefinitionService', () => {
let repo: ReturnType<typeof mockRepo>;
let service: RbacDefinitionService;
beforeEach(() => {
repo = mockRepo();
service = new RbacDefinitionService(repo);
});
describe('list', () => {
it('returns all definitions', async () => {
const defs = await service.list();
expect(repo.findAll).toHaveBeenCalled();
expect(defs).toEqual([]);
});
});
describe('getById', () => {
it('returns definition when found', async () => {
const def = makeDef();
vi.mocked(repo.findById).mockResolvedValue(def);
const result = await service.getById('def-1');
expect(result.id).toBe('def-1');
});
it('throws NotFoundError when not found', async () => {
await expect(service.getById('missing')).rejects.toThrow(NotFoundError);
});
});
describe('getByName', () => {
it('returns definition when found', async () => {
const def = makeDef();
vi.mocked(repo.findByName).mockResolvedValue(def);
const result = await service.getByName('test-rbac');
expect(result.name).toBe('test-rbac');
});
it('throws NotFoundError when not found', async () => {
await expect(service.getByName('missing')).rejects.toThrow(NotFoundError);
});
});
describe('create', () => {
it('creates a definition with valid input', async () => {
const result = await service.create({
name: 'new-rbac',
subjects: [{ kind: 'User', name: 'alice@example.com' }],
roleBindings: [{ role: 'edit', resource: '*' }],
});
expect(result.name).toBe('new-rbac');
expect(repo.create).toHaveBeenCalled();
});
it('throws ConflictError when name exists', async () => {
vi.mocked(repo.findByName).mockResolvedValue(makeDef());
await expect(
service.create({
name: 'test-rbac',
subjects: [{ kind: 'User', name: 'bob@example.com' }],
roleBindings: [{ role: 'view', resource: 'servers' }],
}),
).rejects.toThrow(ConflictError);
});
it('throws on missing subjects', async () => {
await expect(
service.create({
name: 'bad-rbac',
subjects: [],
roleBindings: [{ role: 'view', resource: 'servers' }],
}),
).rejects.toThrow();
});
it('throws on missing roleBindings', async () => {
await expect(
service.create({
name: 'bad-rbac',
subjects: [{ kind: 'User', name: 'alice@example.com' }],
roleBindings: [],
}),
).rejects.toThrow();
});
it('throws on invalid role', async () => {
await expect(
service.create({
name: 'bad-rbac',
subjects: [{ kind: 'User', name: 'alice@example.com' }],
roleBindings: [{ role: 'superadmin', resource: '*' }],
}),
).rejects.toThrow();
});
it('throws on invalid subject kind', async () => {
await expect(
service.create({
name: 'bad-rbac',
subjects: [{ kind: 'Robot', name: 'bot-1' }],
roleBindings: [{ role: 'view', resource: 'servers' }],
}),
).rejects.toThrow();
});
it('throws on invalid name format', async () => {
await expect(
service.create({
name: 'Invalid Name!',
subjects: [{ kind: 'User', name: 'alice@example.com' }],
roleBindings: [{ role: 'view', resource: 'servers' }],
}),
).rejects.toThrow();
});
it('normalizes singular resource names to plural', async () => {
await service.create({
name: 'singular-rbac',
subjects: [{ kind: 'User', name: 'alice@example.com' }],
roleBindings: [
{ role: 'view', resource: 'server' },
{ role: 'edit', resource: 'secret', name: 'my-secret' },
],
});
const call = vi.mocked(repo.create).mock.calls[0]![0];
expect(call.roleBindings[0]!.resource).toBe('servers');
expect(call.roleBindings[1]!.resource).toBe('secrets');
expect(call.roleBindings[1]!.name).toBe('my-secret');
});
it('creates a definition with operation bindings', async () => {
const result = await service.create({
name: 'ops-rbac',
subjects: [{ kind: 'User', name: 'alice@example.com' }],
roleBindings: [{ role: 'run', action: 'logs' }],
});
expect(result.name).toBe('ops-rbac');
expect(repo.create).toHaveBeenCalled();
const call = vi.mocked(repo.create).mock.calls[0]![0];
expect(call.roleBindings[0]!.action).toBe('logs');
});
it('creates a definition with mixed resource and operation bindings', async () => {
const result = await service.create({
name: 'mixed-rbac',
subjects: [{ kind: 'User', name: 'alice@example.com' }],
roleBindings: [
{ role: 'view', resource: 'servers' },
{ role: 'run', action: 'logs' },
],
});
expect(result.name).toBe('mixed-rbac');
expect(repo.create).toHaveBeenCalled();
const call = vi.mocked(repo.create).mock.calls[0]![0];
expect(call.roleBindings).toHaveLength(2);
expect(call.roleBindings[0]!.resource).toBe('servers');
expect(call.roleBindings[1]!.action).toBe('logs');
});
it('creates a definition with name-scoped resource binding', async () => {
const result = await service.create({
name: 'scoped-rbac',
subjects: [{ kind: 'User', name: 'alice@example.com' }],
roleBindings: [{ role: 'view', resource: 'servers', name: 'my-ha' }],
});
expect(result.name).toBe('scoped-rbac');
expect(repo.create).toHaveBeenCalled();
const call = vi.mocked(repo.create).mock.calls[0]![0];
expect(call.roleBindings[0]!.resource).toBe('servers');
expect(call.roleBindings[0]!.name).toBe('my-ha');
});
});
describe('update', () => {
it('updates an existing definition', async () => {
vi.mocked(repo.findById).mockResolvedValue(makeDef());
await service.update('def-1', { subjects: [{ kind: 'User', name: 'bob@example.com' }] });
expect(repo.update).toHaveBeenCalledWith('def-1', {
subjects: [{ kind: 'User', name: 'bob@example.com' }],
});
});
it('throws NotFoundError when definition does not exist', async () => {
await expect(service.update('missing', {})).rejects.toThrow(NotFoundError);
});
});
describe('delete', () => {
it('deletes an existing definition', async () => {
vi.mocked(repo.findById).mockResolvedValue(makeDef());
await service.delete('def-1');
expect(repo.delete).toHaveBeenCalledWith('def-1');
});
it('throws NotFoundError when definition does not exist', async () => {
await expect(service.delete('missing')).rejects.toThrow(NotFoundError);
});
});
});
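The normalization test above expects `server` → `servers` and `secret` → `secrets` before persistence. A minimal sketch of that singular-to-plural normalization, assuming simple `s`-suffix pluralization (the real `normalizeResource` in `rbac-definition.schema.ts` may handle more cases):

```typescript
// Illustrative re-implementation, not the module's actual code.
function normalizeResourceSketch(resource: string): string {
  if (resource === '*') return resource; // wildcard bindings pass through untouched
  return resource.endsWith('s') ? resource : `${resource}s`;
}

console.log(normalizeResourceSketch('server'), normalizeResourceSketch('secrets')); // → servers secrets
```

Normalizing at create time means the RBAC evaluator can compare resources with plain string equality instead of matching both singular and plural forms.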


@@ -0,0 +1,444 @@
/**
* Integration tests reproducing RBAC name-scoped access bugs.
*
* Bug 1: `mcpctl get servers` shows ALL servers despite user only having
* view:servers+name:my-home-assistant
* Bug 2: `mcpctl get server my-home-assistant -o yaml` returns 403 because
* CLI resolves name→CUID, and RBAC compares CUID against binding name
*
* These tests spin up a full Fastify app with auth + RBAC hooks + server routes,
* exactly like main.ts, to catch regressions at the HTTP level.
*/
import { describe, it, expect, vi, afterEach, beforeEach } from 'vitest';
import Fastify from 'fastify';
import type { FastifyInstance } from 'fastify';
import { registerMcpServerRoutes } from '../src/routes/mcp-servers.js';
import { McpServerService } from '../src/services/mcp-server.service.js';
import { InstanceService } from '../src/services/instance.service.js';
import { RbacService } from '../src/services/rbac.service.js';
import { errorHandler } from '../src/middleware/error-handler.js';
import type { IMcpServerRepository, IMcpInstanceRepository } from '../src/repositories/interfaces.js';
import type { IRbacDefinitionRepository } from '../src/repositories/rbac-definition.repository.js';
import type { McpOrchestrator } from '../src/services/orchestrator.js';
import type { McpServer, RbacDefinition, PrismaClient } from '@prisma/client';
// ── Test data ──
const SERVERS: McpServer[] = [
{ id: 'clxyz000000001', name: 'my-home-assistant', description: 'HA server', transport: 'STDIO', packageName: null, dockerImage: null, repositoryUrl: null, externalUrl: null, command: null, containerPort: null, replicas: 1, env: [], healthCheck: null, version: 1, createdAt: new Date(), updatedAt: new Date() },
{ id: 'clxyz000000002', name: 'slack-server', description: 'Slack MCP', transport: 'STDIO', packageName: null, dockerImage: null, repositoryUrl: null, externalUrl: null, command: null, containerPort: null, replicas: 1, env: [], healthCheck: null, version: 1, createdAt: new Date(), updatedAt: new Date() },
{ id: 'clxyz000000003', name: 'github-server', description: 'GitHub MCP', transport: 'STDIO', packageName: null, dockerImage: null, repositoryUrl: null, externalUrl: null, command: null, containerPort: null, replicas: 1, env: [], healthCheck: null, version: 1, createdAt: new Date(), updatedAt: new Date() },
];
// User tokens → userId mapping
const SESSIONS: Record<string, { userId: string }> = {
'scoped-token': { userId: 'user-scoped' },
'admin-token': { userId: 'user-admin' },
'multi-scoped-token': { userId: 'user-multi' },
'secrets-only-token': { userId: 'user-secrets' },
'edit-scoped-token': { userId: 'user-edit-scoped' },
};
// User email mapping
const USERS: Record<string, { email: string }> = {
'user-scoped': { email: 'scoped@example.com' },
'user-admin': { email: 'admin@example.com' },
'user-multi': { email: 'multi@example.com' },
'user-secrets': { email: 'secrets@example.com' },
'user-edit-scoped': { email: 'editscoped@example.com' },
};
// RBAC definitions
const RBAC_DEFS: RbacDefinition[] = [
{
id: 'rbac-scoped', name: 'scoped-view', version: 1, createdAt: new Date(), updatedAt: new Date(),
subjects: [{ kind: 'User', name: 'scoped@example.com' }],
roleBindings: [{ role: 'view', resource: 'servers', name: 'my-home-assistant' }],
},
{
id: 'rbac-admin', name: 'admin-all', version: 1, createdAt: new Date(), updatedAt: new Date(),
subjects: [{ kind: 'User', name: 'admin@example.com' }],
roleBindings: [{ role: 'edit', resource: '*' }],
},
{
id: 'rbac-multi', name: 'multi-scoped', version: 1, createdAt: new Date(), updatedAt: new Date(),
subjects: [{ kind: 'User', name: 'multi@example.com' }],
roleBindings: [
{ role: 'view', resource: 'servers', name: 'my-home-assistant' },
{ role: 'view', resource: 'servers', name: 'slack-server' },
],
},
{
id: 'rbac-secrets', name: 'secrets-only', version: 1, createdAt: new Date(), updatedAt: new Date(),
subjects: [{ kind: 'User', name: 'secrets@example.com' }],
roleBindings: [{ role: 'view', resource: 'secrets' }],
},
{
id: 'rbac-edit-scoped', name: 'edit-scoped', version: 1, createdAt: new Date(), updatedAt: new Date(),
subjects: [{ kind: 'User', name: 'editscoped@example.com' }],
roleBindings: [{ role: 'edit', resource: 'servers', name: 'my-home-assistant' }],
},
];
// ── Mock factories ──
function mockServerRepo(): IMcpServerRepository {
return {
findAll: vi.fn(async () => [...SERVERS]),
findById: vi.fn(async (id: string) => SERVERS.find((s) => s.id === id) ?? null),
findByName: vi.fn(async (name: string) => SERVERS.find((s) => s.name === name) ?? null),
create: vi.fn(async () => SERVERS[0]!),
update: vi.fn(async () => SERVERS[0]!),
delete: vi.fn(async () => {}),
};
}
function mockRbacRepo(): IRbacDefinitionRepository {
return {
findAll: vi.fn(async () => [...RBAC_DEFS]),
findById: vi.fn(async () => null),
findByName: vi.fn(async () => null),
create: vi.fn(async () => RBAC_DEFS[0]!),
update: vi.fn(async () => RBAC_DEFS[0]!),
delete: vi.fn(async () => {}),
};
}
function mockPrisma(): PrismaClient {
return {
user: {
findUnique: vi.fn(async ({ where }: { where: { id: string } }) => {
const u = USERS[where.id];
return u ? { email: u.email } : null;
}),
},
groupMember: {
findMany: vi.fn(async () => []),
},
} as unknown as PrismaClient;
}
function stubInstanceRepo(): IMcpInstanceRepository {
return {
findAll: vi.fn(async () => []),
findById: vi.fn(async () => null),
findByContainerId: vi.fn(async () => null),
create: vi.fn(async (data) => ({
id: 'inst-stub', serverId: data.serverId, containerId: null,
status: data.status ?? 'STOPPED', port: null, metadata: {},
healthStatus: null, lastHealthCheck: null, events: [],
version: 1, createdAt: new Date(), updatedAt: new Date(),
}) as never),
updateStatus: vi.fn(async () => ({}) as never),
delete: vi.fn(async () => {}),
};
}
function stubOrchestrator(): McpOrchestrator {
return {
ping: vi.fn(async () => true),
pullImage: vi.fn(async () => {}),
createContainer: vi.fn(async () => ({ containerId: 'ctr', name: 'stub', state: 'running' as const, port: 3000, createdAt: new Date() })),
stopContainer: vi.fn(async () => {}),
removeContainer: vi.fn(async () => {}),
inspectContainer: vi.fn(async () => ({ containerId: 'ctr', name: 'stub', state: 'running' as const, createdAt: new Date() })),
getContainerLogs: vi.fn(async () => ({ stdout: '', stderr: '' })),
};
}
// ── App setup (replicates main.ts hooks) ──
import { normalizeResource } from '../src/validation/rbac-definition.schema.js';
import type { RbacAction } from '../src/services/rbac.service.js';
type PermissionCheck =
| { kind: 'resource'; resource: string; action: RbacAction; resourceName?: string }
| { kind: 'operation'; operation: string }
| { kind: 'skip' };
function mapUrlToPermission(method: string, url: string): PermissionCheck {
const match = url.match(/^\/api\/v1\/([a-z-]+)/);
if (!match) return { kind: 'skip' };
const segment = match[1] as string;
if (segment === 'backup') return { kind: 'operation', operation: 'backup' };
if (segment === 'restore') return { kind: 'operation', operation: 'restore' };
if (segment === 'audit-logs' && method === 'DELETE') return { kind: 'operation', operation: 'audit-purge' };
const resourceMap: Record<string, string | undefined> = {
servers: 'servers', instances: 'instances', secrets: 'secrets',
projects: 'projects', templates: 'templates', users: 'users',
groups: 'groups', rbac: 'rbac', 'audit-logs': 'rbac', mcp: 'servers',
};
const resource = resourceMap[segment];
if (resource === undefined) return { kind: 'skip' };
let action: RbacAction;
switch (method) {
case 'GET': case 'HEAD': action = 'view'; break;
case 'POST': action = 'create'; break;
case 'DELETE': action = 'delete'; break;
default: action = 'edit'; break;
}
const nameMatch = url.match(/^\/api\/v1\/[a-z-]+\/([^/?]+)/);
const resourceName = nameMatch?.[1];
const check: PermissionCheck = { kind: 'resource', resource, action };
if (resourceName !== undefined) (check as { resourceName: string }).resourceName = resourceName;
return check;
}
let app: FastifyInstance;
afterEach(async () => {
if (app) await app.close();
});
async function createTestApp() {
const serverRepo = mockServerRepo();
const rbacRepo = mockRbacRepo();
const prisma = mockPrisma();
const rbacService = new RbacService(rbacRepo, prisma);
const CUID_RE = /^c[^\s-]{8,}$/i;
const nameResolvers: Record<string, { findById(id: string): Promise<{ name: string } | null> }> = {
servers: serverRepo,
};
app = Fastify({ logger: false });
app.setErrorHandler(errorHandler);
// Auth hook (mock)
app.addHook('preHandler', async (request, reply) => {
const url = request.url;
if (url.startsWith('/api/v1/auth/') || url === '/healthz') return;
if (!url.startsWith('/api/v1/')) return;
const header = request.headers.authorization;
if (!header?.startsWith('Bearer ')) {
reply.code(401).send({ error: 'Unauthorized' });
return;
}
const token = header.slice(7);
const session = SESSIONS[token];
if (!session) {
reply.code(401).send({ error: 'Invalid token' });
return;
}
request.userId = session.userId;
});
// RBAC hook (replicates main.ts)
app.addHook('preHandler', async (request, reply) => {
if (reply.sent) return;
const url = request.url;
if (url.startsWith('/api/v1/auth/') || url === '/healthz') return;
if (!url.startsWith('/api/v1/')) return;
if (request.userId === undefined) return;
const check = mapUrlToPermission(request.method, url);
if (check.kind === 'skip') return;
let allowed: boolean;
if (check.kind === 'operation') {
allowed = await rbacService.canRunOperation(request.userId, check.operation);
} else {
// CUID→name resolution
if (check.resourceName !== undefined && CUID_RE.test(check.resourceName)) {
const resolver = nameResolvers[check.resource];
if (resolver) {
const entity = await resolver.findById(check.resourceName);
if (entity) check.resourceName = entity.name;
}
}
allowed = await rbacService.canAccess(request.userId, check.action, check.resource, check.resourceName);
// Compute scope for list filtering
if (allowed && check.resourceName === undefined) {
request.rbacScope = await rbacService.getAllowedScope(request.userId, check.action, check.resource);
}
}
if (!allowed) {
reply.code(403).send({ error: 'Forbidden' });
}
});
// Routes
const serverService = new McpServerService(serverRepo);
const instanceService = new InstanceService(stubInstanceRepo(), serverRepo, stubOrchestrator());
serverService.setInstanceService(instanceService);
registerMcpServerRoutes(app, serverService, instanceService);
// preSerialization hook (list filtering)
app.addHook('preSerialization', async (request, _reply, payload) => {
if (!request.rbacScope || request.rbacScope.wildcard) return payload;
if (!Array.isArray(payload)) return payload;
return (payload as Array<Record<string, unknown>>).filter((item) => {
const name = item['name'];
return typeof name === 'string' && request.rbacScope!.names.has(name);
});
});
await app.ready();
return app;
}
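The list-filtering behavior the preSerialization hook implements can be sketched in isolation (illustrative helper; `RbacScope` here is an assumed shape matching what `request.rbacScope` carries in the hook above):

```typescript
// Wildcard scopes pass everything through; name scopes keep only items whose
// name is in the permitted set. Items without a string name are dropped.
interface RbacScope {
  wildcard: boolean;
  names: Set<string>;
}

function filterByScope<T extends { name?: unknown }>(items: T[], scope?: RbacScope): T[] {
  if (!scope || scope.wildcard) return items;
  return items.filter((item) => typeof item.name === 'string' && scope.names.has(item.name));
}
```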
// ── Tests ──
describe('RBAC name-scoped integration (reproduces mcpctl bugs)', () => {
beforeEach(async () => {
await createTestApp();
});
describe('Bug 1: mcpctl get servers (list filtering)', () => {
it('name-scoped user sees ONLY their permitted server', async () => {
const res = await app.inject({
method: 'GET',
url: '/api/v1/servers',
headers: { authorization: 'Bearer scoped-token' },
});
expect(res.statusCode).toBe(200);
const servers = res.json<Array<{ name: string }>>();
expect(servers).toHaveLength(1);
expect(servers[0]!.name).toBe('my-home-assistant');
});
it('wildcard user sees ALL servers', async () => {
const res = await app.inject({
method: 'GET',
url: '/api/v1/servers',
headers: { authorization: 'Bearer admin-token' },
});
expect(res.statusCode).toBe(200);
const servers = res.json<Array<{ name: string }>>();
expect(servers).toHaveLength(3);
});
it('user with multiple name-scoped bindings sees only those servers', async () => {
const res = await app.inject({
method: 'GET',
url: '/api/v1/servers',
headers: { authorization: 'Bearer multi-scoped-token' },
});
expect(res.statusCode).toBe(200);
const servers = res.json<Array<{ name: string }>>();
expect(servers).toHaveLength(2);
const names = servers.map((s) => s.name);
expect(names).toContain('my-home-assistant');
expect(names).toContain('slack-server');
expect(names).not.toContain('github-server');
});
it('user with no server permissions gets 403', async () => {
const res = await app.inject({
method: 'GET',
url: '/api/v1/servers',
headers: { authorization: 'Bearer secrets-only-token' },
});
expect(res.statusCode).toBe(403);
});
});
describe('Bug 2: mcpctl get server NAME (CUID resolution)', () => {
it('allows access when URL contains CUID matching a name-scoped binding', async () => {
// CLI resolves my-home-assistant → clxyz000000001
const res = await app.inject({
method: 'GET',
url: '/api/v1/servers/clxyz000000001',
headers: { authorization: 'Bearer scoped-token' },
});
expect(res.statusCode).toBe(200);
expect(res.json<{ name: string }>().name).toBe('my-home-assistant');
});
it('denies access when CUID resolves to server NOT in binding', async () => {
const res = await app.inject({
method: 'GET',
url: '/api/v1/servers/clxyz000000002',
headers: { authorization: 'Bearer scoped-token' },
});
expect(res.statusCode).toBe(403);
});
it('passes RBAC when URL has human-readable name (route 404 is expected)', async () => {
// Human name in URL: RBAC passes (matches binding directly),
// but the route only does findById, so it 404s.
// CLI always resolves name→CUID first, so this doesn't happen in practice.
// The important thing: it does NOT return 403.
const res = await app.inject({
method: 'GET',
url: '/api/v1/servers/my-home-assistant',
headers: { authorization: 'Bearer scoped-token' },
});
expect(res.statusCode).toBe(404); // Not 403!
});
it('handles nonexistent CUID gracefully (403)', async () => {
const res = await app.inject({
method: 'GET',
url: '/api/v1/servers/cnonexistent12345678',
headers: { authorization: 'Bearer scoped-token' },
});
expect(res.statusCode).toBe(403);
});
it('wildcard user can access any server by CUID', async () => {
const res = await app.inject({
method: 'GET',
url: '/api/v1/servers/clxyz000000002',
headers: { authorization: 'Bearer admin-token' },
});
expect(res.statusCode).toBe(200);
expect(res.json<{ name: string }>().name).toBe('slack-server');
});
});
describe('name-scoped write operations', () => {
it('name-scoped edit user can DELETE their named server by CUID', async () => {
const res = await app.inject({
method: 'DELETE',
url: '/api/v1/servers/clxyz000000001',
headers: { authorization: 'Bearer edit-scoped-token' },
});
expect(res.statusCode).toBe(204);
});
it('name-scoped edit user CANNOT delete other servers', async () => {
const res = await app.inject({
method: 'DELETE',
url: '/api/v1/servers/clxyz000000002',
headers: { authorization: 'Bearer edit-scoped-token' },
});
expect(res.statusCode).toBe(403);
});
it('name-scoped view user CANNOT delete their named server', async () => {
const res = await app.inject({
method: 'DELETE',
url: '/api/v1/servers/clxyz000000001',
headers: { authorization: 'Bearer scoped-token' },
});
expect(res.statusCode).toBe(403);
});
});
describe('preSerialization edge cases', () => {
it('single-object responses pass through unmodified', async () => {
const res = await app.inject({
method: 'GET',
url: '/api/v1/servers/clxyz000000001',
headers: { authorization: 'Bearer scoped-token' },
});
expect(res.statusCode).toBe(200);
expect(res.json<{ name: string }>().name).toBe('my-home-assistant');
});
it('unauthenticated requests get 401', async () => {
const res = await app.inject({
method: 'GET',
url: '/api/v1/servers',
});
expect(res.statusCode).toBe(401);
});
});
});
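The CUID-vs-name heuristic the RBAC hook relies on can be shown in isolation (this mirrors the `CUID_RE` defined in the test app above; the regex itself is an assumption about what counts as a CUID):

```typescript
// CUIDs start with "c" followed by 8+ characters that are neither whitespace
// nor hyphens, so hyphenated human-readable names like "my-home-assistant"
// never match and are passed to RBAC as-is.
const CUID_RE = /^c[^\s-]{8,}$/i;

function looksLikeCuid(segment: string): boolean {
  return CUID_RE.test(segment);
}
```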

src/mcpd/tests/rbac.test.ts (new file, 1012 lines): diff suppressed because it is too large

@@ -0,0 +1,355 @@
import { describe, it, expect, vi, beforeEach } from 'vitest';
import { HealthProbeRunner } from '../../src/services/health-probe.service.js';
import type { HealthCheckSpec } from '../../src/services/health-probe.service.js';
import type { IMcpInstanceRepository, IMcpServerRepository } from '../../src/repositories/interfaces.js';
import type { McpOrchestrator, ExecResult } from '../../src/services/orchestrator.js';
import type { McpInstance, McpServer } from '@prisma/client';
function makeInstance(overrides: Partial<McpInstance> = {}): McpInstance {
return {
id: 'inst-1',
serverId: 'srv-1',
status: 'RUNNING',
containerId: 'container-abc',
port: null,
healthStatus: null,
lastHealthCheck: null,
events: [],
metadata: {},
version: 1,
createdAt: new Date(),
updatedAt: new Date(),
...overrides,
} as McpInstance;
}
function makeServer(overrides: Partial<McpServer> = {}): McpServer {
return {
id: 'srv-1',
name: 'my-grafana',
transport: 'STDIO',
packageName: '@leval/mcp-grafana',
dockerImage: null,
externalUrl: null,
containerPort: null,
repositoryUrl: null,
description: null,
command: null,
env: [],
replicas: 1,
projectId: null,
healthCheck: {
tool: 'list_datasources',
arguments: {},
intervalSeconds: 60,
timeoutSeconds: 10,
failureThreshold: 3,
},
version: 1,
createdAt: new Date(),
updatedAt: new Date(),
...overrides,
} as McpServer;
}
function mockInstanceRepo(): IMcpInstanceRepository {
return {
findAll: vi.fn(async () => []),
findById: vi.fn(async () => null),
findByContainerId: vi.fn(async () => null),
create: vi.fn(async (data) => makeInstance(data)),
updateStatus: vi.fn(async (id, status, fields) => makeInstance({ id, status, ...fields })),
delete: vi.fn(async () => {}),
};
}
function mockServerRepo(): IMcpServerRepository {
return {
findAll: vi.fn(async () => []),
findById: vi.fn(async () => null),
findByName: vi.fn(async () => null),
create: vi.fn(async () => makeServer()),
update: vi.fn(async () => makeServer()),
delete: vi.fn(async () => {}),
};
}
function mockOrchestrator(): McpOrchestrator {
return {
pullImage: vi.fn(async () => {}),
createContainer: vi.fn(async () => ({ containerId: 'c1', name: 'test', state: 'running' as const, createdAt: new Date() })),
stopContainer: vi.fn(async () => {}),
removeContainer: vi.fn(async () => {}),
inspectContainer: vi.fn(async () => ({ containerId: 'c1', name: 'test', state: 'running' as const, createdAt: new Date() })),
getContainerLogs: vi.fn(async () => ({ stdout: '', stderr: '' })),
execInContainer: vi.fn(async () => ({ exitCode: 0, stdout: 'OK', stderr: '' })),
ping: vi.fn(async () => true),
};
}
describe('HealthProbeRunner', () => {
let instanceRepo: IMcpInstanceRepository;
let serverRepo: IMcpServerRepository;
let orchestrator: McpOrchestrator;
let runner: HealthProbeRunner;
beforeEach(() => {
instanceRepo = mockInstanceRepo();
serverRepo = mockServerRepo();
orchestrator = mockOrchestrator();
runner = new HealthProbeRunner(instanceRepo, serverRepo, orchestrator);
});
it('skips instances without healthCheck config', async () => {
const instance = makeInstance();
const server = makeServer({ healthCheck: null });
vi.mocked(instanceRepo.findAll).mockResolvedValue([instance]);
vi.mocked(serverRepo.findById).mockResolvedValue(server);
await runner.tick();
expect(orchestrator.execInContainer).not.toHaveBeenCalled();
expect(instanceRepo.updateStatus).not.toHaveBeenCalled();
});
it('skips non-RUNNING instances', async () => {
const instance = makeInstance({ status: 'ERROR' });
vi.mocked(instanceRepo.findAll).mockResolvedValue([instance]);
await runner.tick();
expect(serverRepo.findById).not.toHaveBeenCalled();
});
it('probes STDIO instance with exec and marks healthy on success', async () => {
const instance = makeInstance();
const server = makeServer();
vi.mocked(instanceRepo.findAll).mockResolvedValue([instance]);
vi.mocked(serverRepo.findById).mockResolvedValue(server);
vi.mocked(orchestrator.execInContainer).mockResolvedValue({
exitCode: 0,
stdout: 'OK',
stderr: '',
});
await runner.tick();
expect(orchestrator.execInContainer).toHaveBeenCalledWith(
'container-abc',
expect.arrayContaining(['node', '-e']),
expect.objectContaining({ timeoutMs: 10000 }),
);
expect(instanceRepo.updateStatus).toHaveBeenCalledWith(
'inst-1',
'RUNNING',
expect.objectContaining({
healthStatus: 'healthy',
lastHealthCheck: expect.any(Date),
events: expect.arrayContaining([
expect.objectContaining({ type: 'Normal', message: expect.stringContaining('passed') }),
]),
}),
);
});
it('marks unhealthy after failureThreshold consecutive failures', async () => {
const instance = makeInstance();
const healthCheck: HealthCheckSpec = {
tool: 'list_datasources',
arguments: {},
intervalSeconds: 0, // always due
failureThreshold: 2,
};
const server = makeServer({ healthCheck: healthCheck as McpServer['healthCheck'] });
vi.mocked(instanceRepo.findAll).mockResolvedValue([instance]);
vi.mocked(serverRepo.findById).mockResolvedValue(server);
vi.mocked(orchestrator.execInContainer).mockResolvedValue({
exitCode: 1,
stdout: 'ERROR:connection refused',
stderr: '',
});
// First failure → degraded
await runner.tick();
expect(instanceRepo.updateStatus).toHaveBeenCalledWith(
'inst-1',
'RUNNING',
expect.objectContaining({ healthStatus: 'degraded' }),
);
// Second failure → unhealthy (threshold = 2)
await runner.tick();
expect(instanceRepo.updateStatus).toHaveBeenCalledWith(
'inst-1',
'RUNNING',
expect.objectContaining({ healthStatus: 'unhealthy' }),
);
});
it('resets failure count on success', async () => {
const instance = makeInstance();
const healthCheck: HealthCheckSpec = {
tool: 'list_datasources',
arguments: {},
intervalSeconds: 0,
failureThreshold: 3,
};
const server = makeServer({ healthCheck: healthCheck as McpServer['healthCheck'] });
vi.mocked(instanceRepo.findAll).mockResolvedValue([instance]);
vi.mocked(serverRepo.findById).mockResolvedValue(server);
// Two failures
vi.mocked(orchestrator.execInContainer).mockResolvedValue({
exitCode: 1, stdout: 'ERROR:fail', stderr: '',
});
await runner.tick();
await runner.tick();
// Then success — should reset to healthy
vi.mocked(orchestrator.execInContainer).mockResolvedValue({
exitCode: 0, stdout: 'OK', stderr: '',
});
await runner.tick();
const lastCall = vi.mocked(instanceRepo.updateStatus).mock.calls.at(-1);
expect(lastCall?.[2]).toEqual(expect.objectContaining({ healthStatus: 'healthy' }));
});
it('handles exec timeout as failure', async () => {
const instance = makeInstance();
const server = makeServer();
vi.mocked(instanceRepo.findAll).mockResolvedValue([instance]);
vi.mocked(serverRepo.findById).mockResolvedValue(server);
vi.mocked(orchestrator.execInContainer).mockRejectedValue(new Error('Exec timed out after 10000ms'));
await runner.tick();
expect(instanceRepo.updateStatus).toHaveBeenCalledWith(
'inst-1',
'RUNNING',
expect.objectContaining({
healthStatus: 'degraded',
events: expect.arrayContaining([
expect.objectContaining({ type: 'Warning', message: expect.stringContaining('timed out') }),
]),
}),
);
});
it('appends events without losing history', async () => {
const existingEvents = [
{ timestamp: '2025-01-01T00:00:00Z', type: 'Normal', message: 'old event' },
];
const instance = makeInstance({ events: existingEvents });
const server = makeServer({
healthCheck: { tool: 'test', intervalSeconds: 0 } as McpServer['healthCheck'],
});
vi.mocked(instanceRepo.findAll).mockResolvedValue([instance]);
vi.mocked(serverRepo.findById).mockResolvedValue(server);
vi.mocked(orchestrator.execInContainer).mockResolvedValue({
exitCode: 0, stdout: 'OK', stderr: '',
});
await runner.tick();
const events = vi.mocked(instanceRepo.updateStatus).mock.calls[0]?.[2]?.events as unknown[];
expect(events).toHaveLength(2);
expect((events[0] as { message: string }).message).toBe('old event');
expect((events[1] as { message: string }).message).toContain('passed');
});
it('respects interval — skips probing if not due', async () => {
const instance = makeInstance();
const server = makeServer({
healthCheck: { tool: 'test', intervalSeconds: 300 } as McpServer['healthCheck'],
});
vi.mocked(instanceRepo.findAll).mockResolvedValue([instance]);
vi.mocked(serverRepo.findById).mockResolvedValue(server);
vi.mocked(orchestrator.execInContainer).mockResolvedValue({
exitCode: 0, stdout: 'OK', stderr: '',
});
// First tick: should probe
await runner.tick();
expect(orchestrator.execInContainer).toHaveBeenCalledTimes(1);
// Second tick immediately: should skip (300s interval not elapsed)
await runner.tick();
expect(orchestrator.execInContainer).toHaveBeenCalledTimes(1);
});
it('cleans up probe states for removed instances', async () => {
const instance = makeInstance();
const server = makeServer({
healthCheck: { tool: 'test', intervalSeconds: 0 } as McpServer['healthCheck'],
});
vi.mocked(instanceRepo.findAll).mockResolvedValue([instance]);
vi.mocked(serverRepo.findById).mockResolvedValue(server);
await runner.tick();
expect(orchestrator.execInContainer).toHaveBeenCalledTimes(1);
// Instance removed
vi.mocked(instanceRepo.findAll).mockResolvedValue([]);
await runner.tick();
// Re-add same instance — should probe again (state was cleaned)
vi.mocked(instanceRepo.findAll).mockResolvedValue([instance]);
await runner.tick();
expect(orchestrator.execInContainer).toHaveBeenCalledTimes(2);
});
it('skips STDIO instances without containerId', async () => {
const instance = makeInstance({ containerId: null });
const server = makeServer();
// containerId is null, but status is RUNNING — shouldn't be probed
vi.mocked(instanceRepo.findAll).mockResolvedValue([instance]);
await runner.tick();
expect(serverRepo.findById).not.toHaveBeenCalled();
});
it('probeInstance returns result directly', async () => {
const instance = makeInstance();
const server = makeServer();
const healthCheck: HealthCheckSpec = {
tool: 'list_datasources',
arguments: {},
};
vi.mocked(orchestrator.execInContainer).mockResolvedValue({
exitCode: 0, stdout: 'OK', stderr: '',
});
const result = await runner.probeInstance(instance, server, healthCheck);
expect(result.healthy).toBe(true);
expect(result.latencyMs).toBeGreaterThanOrEqual(0);
expect(result.message).toBe('ok');
});
it('handles STDIO exec failure with error message', async () => {
const instance = makeInstance();
const server = makeServer();
const healthCheck: HealthCheckSpec = { tool: 'list_datasources', arguments: {} };
vi.mocked(orchestrator.execInContainer).mockResolvedValue({
exitCode: 1,
stdout: 'ERROR:ECONNREFUSED 10.0.0.1:3000',
stderr: '',
});
const result = await runner.probeInstance(instance, server, healthCheck);
expect(result.healthy).toBe(false);
expect(result.message).toBe('ECONNREFUSED 10.0.0.1:3000');
});
});
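The degraded-to-unhealthy escalation these tests exercise can be sketched as a tiny state tracker (an illustrative model of the behavior, not HealthProbeRunner's actual internals):

```typescript
type HealthStatus = 'healthy' | 'degraded' | 'unhealthy';

// Consecutive failures escalate the status; a single success resets the
// counter, matching the "resets failure count on success" test above.
class ProbeState {
  private failures = 0;

  record(ok: boolean, failureThreshold: number): HealthStatus {
    if (ok) {
      this.failures = 0;
      return 'healthy';
    }
    this.failures += 1;
    return this.failures >= failureThreshold ? 'unhealthy' : 'degraded';
  }
}
```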


@@ -0,0 +1,208 @@
import { describe, it, expect, vi, beforeEach } from 'vitest';
import { UserService } from '../src/services/user.service.js';
import { NotFoundError, ConflictError } from '../src/services/mcp-server.service.js';
import type { IUserRepository, SafeUser } from '../src/repositories/user.repository.js';
function makeSafeUser(overrides: Partial<SafeUser> = {}): SafeUser {
return {
id: 'user-1',
email: 'alice@example.com',
name: 'Alice',
role: 'USER',
provider: null,
externalId: null,
version: 1,
createdAt: new Date(),
updatedAt: new Date(),
...overrides,
};
}
function mockUserRepo(): IUserRepository {
return {
findAll: vi.fn(async () => []),
findById: vi.fn(async () => null),
findByEmail: vi.fn(async () => null),
create: vi.fn(async (data) =>
makeSafeUser({ email: data.email, name: data.name ?? null }),
),
delete: vi.fn(async () => {}),
count: vi.fn(async () => 0),
};
}
describe('UserService', () => {
let repo: ReturnType<typeof mockUserRepo>;
let service: UserService;
beforeEach(() => {
repo = mockUserRepo();
service = new UserService(repo);
});
// ── list ──────────────────────────────────────────────────
describe('list', () => {
it('returns empty array when no users', async () => {
const result = await service.list();
expect(result).toEqual([]);
expect(repo.findAll).toHaveBeenCalledOnce();
});
it('returns all users', async () => {
const users = [
makeSafeUser({ id: 'u1', email: 'a@b.com' }),
makeSafeUser({ id: 'u2', email: 'c@d.com' }),
];
vi.mocked(repo.findAll).mockResolvedValue(users);
const result = await service.list();
expect(result).toHaveLength(2);
expect(result[0]!.email).toBe('a@b.com');
});
});
// ── create ────────────────────────────────────────────────
describe('create', () => {
it('creates a user and hashes password', async () => {
const result = await service.create({
email: 'alice@example.com',
password: 'securePass123',
});
expect(result.email).toBe('alice@example.com');
expect(repo.create).toHaveBeenCalledOnce();
// Verify the passwordHash was generated (not the plain password)
const createCall = vi.mocked(repo.create).mock.calls[0]![0]!;
expect(createCall.passwordHash).toBeDefined();
expect(createCall.passwordHash).not.toBe('securePass123');
expect(createCall.passwordHash.startsWith('$2b$')).toBe(true);
});
it('creates a user with optional name', async () => {
await service.create({
email: 'bob@example.com',
password: 'securePass123',
name: 'Bob',
});
const createCall = vi.mocked(repo.create).mock.calls[0]![0]!;
expect(createCall.email).toBe('bob@example.com');
expect(createCall.name).toBe('Bob');
});
it('returns user without passwordHash', async () => {
const result = await service.create({
email: 'alice@example.com',
password: 'securePass123',
});
// SafeUser type should not have passwordHash
expect(result).not.toHaveProperty('passwordHash');
});
it('throws ConflictError when email already exists', async () => {
vi.mocked(repo.findByEmail).mockResolvedValue(makeSafeUser());
await expect(
service.create({ email: 'alice@example.com', password: 'securePass123' }),
).rejects.toThrow(ConflictError);
});
it('throws ZodError for invalid email', async () => {
await expect(
service.create({ email: 'not-an-email', password: 'securePass123' }),
).rejects.toThrow();
});
it('throws ZodError for short password', async () => {
await expect(
service.create({ email: 'a@b.com', password: 'short' }),
).rejects.toThrow();
});
it('throws ZodError for missing email', async () => {
await expect(
service.create({ password: 'securePass123' }),
).rejects.toThrow();
});
it('throws ZodError for password exceeding max length', async () => {
await expect(
service.create({ email: 'a@b.com', password: 'x'.repeat(129) }),
).rejects.toThrow();
});
});
// ── getById ───────────────────────────────────────────────
describe('getById', () => {
it('returns user when found', async () => {
const user = makeSafeUser();
vi.mocked(repo.findById).mockResolvedValue(user);
const result = await service.getById('user-1');
expect(result.email).toBe('alice@example.com');
expect(repo.findById).toHaveBeenCalledWith('user-1');
});
it('throws NotFoundError when not found', async () => {
await expect(service.getById('missing')).rejects.toThrow(NotFoundError);
});
});
// ── getByEmail ────────────────────────────────────────────
describe('getByEmail', () => {
it('returns user when found', async () => {
const user = makeSafeUser();
vi.mocked(repo.findByEmail).mockResolvedValue(user);
const result = await service.getByEmail('alice@example.com');
expect(result.email).toBe('alice@example.com');
expect(repo.findByEmail).toHaveBeenCalledWith('alice@example.com');
});
it('throws NotFoundError when not found', async () => {
await expect(service.getByEmail('nobody@example.com')).rejects.toThrow(NotFoundError);
});
});
// ── delete ────────────────────────────────────────────────
describe('delete', () => {
it('deletes user by id', async () => {
vi.mocked(repo.findById).mockResolvedValue(makeSafeUser());
await service.delete('user-1');
expect(repo.delete).toHaveBeenCalledWith('user-1');
});
it('throws NotFoundError when user does not exist', async () => {
await expect(service.delete('missing')).rejects.toThrow(NotFoundError);
});
});
// ── count ─────────────────────────────────────────────────
describe('count', () => {
it('returns 0 when no users', async () => {
const result = await service.count();
expect(result).toBe(0);
});
it('returns 1 when one user exists', async () => {
vi.mocked(repo.count).mockResolvedValue(1);
const result = await service.count();
expect(result).toBe(1);
});
it('returns correct count for multiple users', async () => {
vi.mocked(repo.count).mockResolvedValue(5);
const result = await service.count();
expect(result).toBe(5);
});
});
});


@@ -15,6 +15,40 @@ interface McpdServer {
*/
export async function refreshUpstreams(router: McpRouter, mcpdClient: McpdClient): Promise<string[]> {
const servers = await mcpdClient.get<McpdServer[]>('/api/v1/servers');
return syncUpstreams(router, mcpdClient, servers);
}
/**
* Discovers MCP servers scoped to a project and registers them as upstreams.
* Uses the project-servers endpoint that returns only servers linked to the project.
*
* @param authToken - Optional bearer token forwarded to mcpd for RBAC checks.
*/
export async function refreshProjectUpstreams(
router: McpRouter,
mcpdClient: McpdClient,
projectName: string,
authToken?: string,
): Promise<string[]> {
const path = `/api/v1/projects/${encodeURIComponent(projectName)}/servers`;
let servers: McpdServer[];
if (authToken) {
// Forward the client's auth token to mcpd so RBAC applies
const result = await mcpdClient.forward('GET', path, '', undefined, authToken);
if (result.status >= 400) {
throw new Error(`Failed to fetch project servers: ${result.status}`);
}
servers = result.body as McpdServer[];
} else {
servers = await mcpdClient.get<McpdServer[]>(path);
}
return syncUpstreams(router, mcpdClient, servers);
}
/** Shared sync logic: reconcile a router's upstreams with a server list. */
function syncUpstreams(router: McpRouter, mcpdClient: McpdClient, servers: McpdServer[]): string[] {
const registered: string[] = [];
// Remove stale upstreams
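The reconciliation `syncUpstreams` performs can be sketched generically (a minimal illustration assuming upstreams are tracked by name; the real McpRouter API differs):

```typescript
// Reconcile a current set of upstream names against a desired list:
// drop names no longer desired, then register everything in the list.
function reconcile(current: Set<string>, desired: string[]): Set<string> {
  const want = new Set(desired);
  for (const name of current) {
    if (!want.has(name)) current.delete(name); // stale upstream
  }
  for (const name of desired) current.add(name);
  return current;
}
```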


@@ -5,3 +5,4 @@ export type { HttpConfig } from './config.js';
export { McpdClient, AuthenticationError, ConnectionError } from './mcpd-client.js';
export { registerProxyRoutes } from './routes/proxy.js';
export { registerMcpEndpoint } from './mcp-endpoint.js';
export { registerProjectMcpEndpoint } from './project-mcp-endpoint.js';


@@ -49,16 +49,20 @@ export class McpdClient {
/**
* Forward a raw request to mcpd. Returns the status code and body
* so the proxy route can relay them directly.
*
* @param authOverride - If provided, used as the Bearer token instead of the
* service token. This allows forwarding end-user tokens for RBAC enforcement.
*/
async forward(
method: string,
path: string,
query: string,
body: unknown | undefined,
authOverride?: string,
): Promise<{ status: number; body: unknown }> {
const url = `${this.baseUrl}${path}${query ? `?${query}` : ''}`;
const headers: Record<string, string> = {
-      'Authorization': `Bearer ${this.token}`,
+      'Authorization': `Bearer ${authOverride ?? this.token}`,
'Accept': 'application/json',
};


@@ -0,0 +1,131 @@
/**
* Project-scoped Streamable HTTP MCP protocol endpoint.
*
* Exposes per-project MCP endpoints at /projects/:projectName/mcp so
* Claude Code can connect to a specific project's servers only.
*
* Each project gets its own McpRouter instance (cached with TTL).
* Sessions are managed per-project.
*/
import { randomUUID } from 'node:crypto';
import type { FastifyInstance } from 'fastify';
import { StreamableHTTPServerTransport } from '@modelcontextprotocol/sdk/server/streamableHttp.js';
import type { JSONRPCMessage } from '@modelcontextprotocol/sdk/types.js';
import { McpRouter } from '../router.js';
import { refreshProjectUpstreams } from '../discovery.js';
import type { McpdClient } from './mcpd-client.js';
import type { JsonRpcRequest } from '../types.js';
interface ProjectCacheEntry {
router: McpRouter;
lastRefresh: number;
}
interface SessionEntry {
transport: StreamableHTTPServerTransport;
projectName: string;
}
const CACHE_TTL_MS = 60_000; // 60 seconds
export function registerProjectMcpEndpoint(app: FastifyInstance, mcpdClient: McpdClient): void {
const projectCache = new Map<string, ProjectCacheEntry>();
const sessions = new Map<string, SessionEntry>();
async function getOrCreateRouter(projectName: string, authToken?: string): Promise<McpRouter> {
const existing = projectCache.get(projectName);
const now = Date.now();
if (existing && (now - existing.lastRefresh) < CACHE_TTL_MS) {
return existing.router;
}
// Create new router or refresh existing one
const router = existing?.router ?? new McpRouter();
await refreshProjectUpstreams(router, mcpdClient, projectName, authToken);
projectCache.set(projectName, { router, lastRefresh: now });
return router;
}
// POST /projects/:projectName/mcp — JSON-RPC requests
app.post<{ Params: { projectName: string } }>('/projects/:projectName/mcp', async (request, reply) => {
const { projectName } = request.params;
const sessionId = request.headers['mcp-session-id'] as string | undefined;
const authToken = (request.headers['authorization'] as string | undefined)?.replace(/^Bearer\s+/i, '');
if (sessionId && sessions.has(sessionId)) {
const session = sessions.get(sessionId)!;
await session.transport.handleRequest(request.raw, reply.raw, request.body);
reply.hijack();
return;
}
if (sessionId && !sessions.has(sessionId)) {
reply.code(404).send({ error: 'Session not found' });
return;
}
// New session — get/create project router
let router: McpRouter;
try {
router = await getOrCreateRouter(projectName, authToken);
} catch (err) {
reply.code(502).send({ error: `Failed to load project: ${err instanceof Error ? err.message : String(err)}` });
return;
}
const transport = new StreamableHTTPServerTransport({
sessionIdGenerator: () => randomUUID(),
onsessioninitialized: (id) => {
sessions.set(id, { transport, projectName });
},
});
transport.onmessage = async (message: JSONRPCMessage) => {
if ('method' in message && 'id' in message) {
const response = await router.route(message as unknown as JsonRpcRequest);
await transport.send(response as unknown as JSONRPCMessage);
}
};
transport.onclose = () => {
const id = transport.sessionId;
if (id) {
sessions.delete(id);
}
};
await transport.handleRequest(request.raw, reply.raw, request.body);
reply.hijack();
});
// GET /projects/:projectName/mcp — SSE stream
app.get<{ Params: { projectName: string } }>('/projects/:projectName/mcp', async (request, reply) => {
const sessionId = request.headers['mcp-session-id'] as string | undefined;
if (!sessionId || !sessions.has(sessionId)) {
reply.code(400).send({ error: 'Invalid or missing session ID' });
return;
}
const session = sessions.get(sessionId)!;
await session.transport.handleRequest(request.raw, reply.raw);
reply.hijack();
});
// DELETE /projects/:projectName/mcp — Session cleanup
app.delete<{ Params: { projectName: string } }>('/projects/:projectName/mcp', async (request, reply) => {
const sessionId = request.headers['mcp-session-id'] as string | undefined;
if (!sessionId || !sessions.has(sessionId)) {
reply.code(400).send({ error: 'Invalid or missing session ID' });
return;
}
const session = sessions.get(sessionId)!;
await session.transport.handleRequest(request.raw, reply.raw);
sessions.delete(sessionId);
reply.hijack();
});
}
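The per-project TTL caching in `getOrCreateRouter` can be sketched generically (an illustrative helper; the `now` parameter exists here only to make the sketch testable):

```typescript
// Entries younger than ttlMs are reused; stale entries are refreshed in
// place, passing the old value so the caller can reuse the instance.
interface CacheEntry<T> {
  value: T;
  lastRefresh: number;
}

function getOrRefresh<T>(
  cache: Map<string, CacheEntry<T>>,
  key: string,
  ttlMs: number,
  refresh: (existing: T | undefined) => T,
  now: number = Date.now(),
): T {
  const entry = cache.get(key);
  if (entry && now - entry.lastRefresh < ttlMs) return entry.value;
  const value = refresh(entry?.value);
  cache.set(key, { value, lastRefresh: now });
  return value;
}
```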


@@ -16,8 +16,13 @@ export function registerProxyRoutes(app: FastifyInstance, client: McpdClient): v
? (request.body as unknown)
: undefined;
// Forward the user's auth token to mcpd so RBAC applies per-user.
// If no user token is present, mcpd will use its auth hook to reject.
const authHeader = request.headers['authorization'] as string | undefined;
const userToken = authHeader?.startsWith('Bearer ') ? authHeader.slice(7) : undefined;
try {
-      const result = await client.forward(request.method, path, querystring, body);
+      const result = await client.forward(request.method, path, querystring, body, userToken);
return reply.code(result.status).send(result.body);
} catch (err: unknown) {
if (err instanceof AuthenticationError) {


@@ -6,6 +6,7 @@ import type { HttpConfig } from './config.js';
 import { McpdClient } from './mcpd-client.js';
 import { registerProxyRoutes } from './routes/proxy.js';
 import { registerMcpEndpoint } from './mcp-endpoint.js';
+import { registerProjectMcpEndpoint } from './project-mcp-endpoint.js';
 import type { McpRouter } from '../router.js';
 import type { HealthMonitor } from '../health.js';
 import type { TieredHealthMonitor } from '../health/tiered.js';
@@ -85,5 +86,8 @@ export async function createHttpServer(
   // Streamable HTTP MCP protocol endpoint at /mcp
   registerMcpEndpoint(app, deps.router);
+  // Project-scoped MCP endpoint at /projects/:projectName/mcp
+  registerProjectMcpEndpoint(app, mcpdClient);
   return app;
 }

View File

@@ -0,0 +1,87 @@
import { describe, it, expect, vi } from 'vitest';
import { refreshProjectUpstreams } from '../src/discovery.js';
import { McpRouter } from '../src/router.js';

function mockMcpdClient(servers: Array<{ id: string; name: string; transport: string }>) {
  return {
    baseUrl: 'http://test:3100',
    token: 'test-token',
    get: vi.fn(async () => servers),
    post: vi.fn(async () => ({ result: {} })),
    put: vi.fn(),
    delete: vi.fn(),
    forward: vi.fn(async () => ({ status: 200, body: servers })),
  };
}

describe('refreshProjectUpstreams', () => {
  it('registers project-scoped servers as upstreams', async () => {
    const router = new McpRouter();
    const client = mockMcpdClient([
      { id: 'srv-1', name: 'grafana', transport: 'stdio' },
      { id: 'srv-2', name: 'ha', transport: 'stdio' },
    ]);
    const registered = await refreshProjectUpstreams(router, client as any, 'smart-home');
    expect(registered).toEqual(['grafana', 'ha']);
    expect(router.getUpstreamNames()).toContain('grafana');
    expect(router.getUpstreamNames()).toContain('ha');
    expect(client.get).toHaveBeenCalledWith('/api/v1/projects/smart-home/servers');
  });

  it('removes stale upstreams on refresh', async () => {
    const router = new McpRouter();
    // First refresh: 2 servers
    const client1 = mockMcpdClient([
      { id: 'srv-1', name: 'grafana', transport: 'stdio' },
      { id: 'srv-2', name: 'ha', transport: 'stdio' },
    ]);
    await refreshProjectUpstreams(router, client1 as any, 'smart-home');
    expect(router.getUpstreamNames()).toHaveLength(2);
    // Second refresh: only 1 server
    const client2 = mockMcpdClient([
      { id: 'srv-1', name: 'grafana', transport: 'stdio' },
    ]);
    await refreshProjectUpstreams(router, client2 as any, 'smart-home');
    expect(router.getUpstreamNames()).toEqual(['grafana']);
  });

  it('forwards auth token via forward() method', async () => {
    const router = new McpRouter();
    const servers = [{ id: 'srv-1', name: 'grafana', transport: 'stdio' }];
    const client = mockMcpdClient(servers);
    await refreshProjectUpstreams(router, client as any, 'smart-home', 'user-token-123');
    expect(client.forward).toHaveBeenCalledWith('GET', '/api/v1/projects/smart-home/servers', '', undefined);
    expect(router.getUpstreamNames()).toContain('grafana');
  });

  it('throws on failed project fetch', async () => {
    const router = new McpRouter();
    const client = mockMcpdClient([]);
    client.forward.mockResolvedValue({ status: 403, body: { error: 'Forbidden' } });
    await expect(
      refreshProjectUpstreams(router, client as any, 'secret-project', 'bad-token'),
    ).rejects.toThrow('Failed to fetch project servers: 403');
  });

  it('URL-encodes project name', async () => {
    const router = new McpRouter();
    const client = mockMcpdClient([]);
    await refreshProjectUpstreams(router, client as any, 'my project');
    expect(client.get).toHaveBeenCalledWith('/api/v1/projects/my%20project/servers');
  });

  it('handles empty project server list', async () => {
    const router = new McpRouter();
    const client = mockMcpdClient([]);
    const registered = await refreshProjectUpstreams(router, client as any, 'empty-project');
    expect(registered).toEqual([]);
    expect(router.getUpstreamNames()).toHaveLength(0);
  });
});
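These assertions pin down enough behavior to sketch the function they exercise. What follows is a hypothetical reconstruction inferred purely from the tests: a minimal stand-in `Router` and `Client` are defined here for self-containment, and the real `McpRouter` and `McpdClient` certainly differ in shape.

```typescript
// Hypothetical reconstruction of refreshProjectUpstreams, inferred from the
// tests above. All type and class names here are illustrative stand-ins.
interface ServerRecord { id: string; name: string; transport: string; }
interface ForwardResult { status: number; body: unknown; }
interface Client {
  get(path: string): Promise<ServerRecord[]>;
  forward(method: string, path: string, query: string, body: unknown): Promise<ForwardResult>;
}

// Minimal stand-in for McpRouter: tracks upstreams by name.
class Router {
  private upstreams = new Map<string, ServerRecord>();
  register(name: string, server: ServerRecord): void { this.upstreams.set(name, server); }
  remove(name: string): void { this.upstreams.delete(name); }
  getUpstreamNames(): string[] { return [...this.upstreams.keys()]; }
}

async function refreshProjectUpstreams(
  router: Router,
  client: Client,
  projectName: string,
  userToken?: string,
): Promise<string[]> {
  const path = `/api/v1/projects/${encodeURIComponent(projectName)}/servers`;
  let servers: ServerRecord[];
  if (userToken !== undefined) {
    // With a user token, go through forward() so mcpd applies per-user RBAC.
    const res = await client.forward('GET', path, '', undefined);
    if (res.status !== 200) {
      throw new Error(`Failed to fetch project servers: ${res.status}`);
    }
    servers = res.body as ServerRecord[];
  } else {
    servers = await client.get(path);
  }
  // Drop upstreams no longer attached to the project, then (re)register the rest.
  const current = new Set(servers.map((s) => s.name));
  for (const name of router.getUpstreamNames()) {
    if (!current.has(name)) router.remove(name);
  }
  for (const s of servers) router.register(s.name, s);
  return servers.map((s) => s.name);
}
```

The stale-removal pass is what makes a second refresh converge on exactly the currently attached servers, matching the `removes stale upstreams on refresh` test.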

View File

@@ -0,0 +1,172 @@
import { describe, it, expect, vi, beforeEach } from 'vitest';
import Fastify from 'fastify';
import type { FastifyInstance } from 'fastify';
import { registerProjectMcpEndpoint } from '../src/http/project-mcp-endpoint.js';

// Mock discovery module — we don't want real HTTP calls
vi.mock('../src/discovery.js', () => ({
  refreshProjectUpstreams: vi.fn(async () => ['mock-server']),
}));
import { refreshProjectUpstreams } from '../src/discovery.js';

function mockMcpdClient() {
  return {
    baseUrl: 'http://test:3100',
    token: 'test-token',
    get: vi.fn(async () => []),
    post: vi.fn(async () => ({})),
    put: vi.fn(),
    delete: vi.fn(),
    forward: vi.fn(async () => ({ status: 200, body: [] })),
  };
}

describe('registerProjectMcpEndpoint', () => {
  let app: FastifyInstance;

  beforeEach(async () => {
    vi.clearAllMocks();
    app = Fastify();
    registerProjectMcpEndpoint(app, mockMcpdClient() as any);
    await app.ready();
  });

  it('registers POST /projects/:projectName/mcp route', async () => {
    // The endpoint should exist and attempt to handle MCP protocol
    const res = await app.inject({
      method: 'POST',
      url: '/projects/smart-home/mcp',
      payload: { jsonrpc: '2.0', id: 1, method: 'initialize', params: {} },
      headers: { 'content-type': 'application/json' },
    });
    // The StreamableHTTPServerTransport hijacks the response,
    // so we may get a 200 or the transport handles it directly
    expect(res.statusCode).not.toBe(404);
  });

  it('calls refreshProjectUpstreams with project name', async () => {
    await app.inject({
      method: 'POST',
      url: '/projects/smart-home/mcp',
      payload: { jsonrpc: '2.0', id: 1, method: 'initialize', params: {} },
      headers: { 'content-type': 'application/json' },
    });
    expect(refreshProjectUpstreams).toHaveBeenCalledWith(
      expect.any(Object), // McpRouter instance
      expect.any(Object), // McpdClient
      'smart-home',
      undefined, // no auth token
    );
  });

  it('forwards auth token from Authorization header', async () => {
    await app.inject({
      method: 'POST',
      url: '/projects/secure-project/mcp',
      payload: { jsonrpc: '2.0', id: 1, method: 'initialize', params: {} },
      headers: {
        'content-type': 'application/json',
        'authorization': 'Bearer my-token-123',
      },
    });
    expect(refreshProjectUpstreams).toHaveBeenCalledWith(
      expect.any(Object),
      expect.any(Object),
      'secure-project',
      'my-token-123',
    );
  });

  it('returns 502 when project discovery fails', async () => {
    vi.mocked(refreshProjectUpstreams).mockRejectedValueOnce(new Error('Forbidden'));
    const res = await app.inject({
      method: 'POST',
      url: '/projects/bad-project/mcp',
      payload: { jsonrpc: '2.0', id: 1, method: 'initialize', params: {} },
      headers: { 'content-type': 'application/json' },
    });
    expect(res.statusCode).toBe(502);
    expect(res.json().error).toContain('Failed to load project');
  });

  it('returns 404 for unknown session ID', async () => {
    const res = await app.inject({
      method: 'POST',
      url: '/projects/smart-home/mcp',
      payload: { jsonrpc: '2.0', id: 1, method: 'tools/list', params: {} },
      headers: {
        'content-type': 'application/json',
        'mcp-session-id': 'nonexistent-session',
      },
    });
    expect(res.statusCode).toBe(404);
  });

  it('returns 400 for GET without session', async () => {
    const res = await app.inject({
      method: 'GET',
      url: '/projects/smart-home/mcp',
    });
    expect(res.statusCode).toBe(400);
    expect(res.json().error).toContain('session');
  });

  it('returns 400 for DELETE without session', async () => {
    const res = await app.inject({
      method: 'DELETE',
      url: '/projects/smart-home/mcp',
    });
    expect(res.statusCode).toBe(400);
    expect(res.json().error).toContain('session');
  });

  it('caches project router across requests', async () => {
    // Two requests to the same project should reuse the router
    await app.inject({
      method: 'POST',
      url: '/projects/smart-home/mcp',
      payload: { jsonrpc: '2.0', id: 1, method: 'initialize', params: {} },
      headers: { 'content-type': 'application/json' },
    });
    await app.inject({
      method: 'POST',
      url: '/projects/smart-home/mcp',
      payload: { jsonrpc: '2.0', id: 2, method: 'initialize', params: {} },
      headers: { 'content-type': 'application/json' },
    });
    // refreshProjectUpstreams should only be called once (cached)
    expect(refreshProjectUpstreams).toHaveBeenCalledTimes(1);
  });

  it('creates separate routers for different projects', async () => {
    await app.inject({
      method: 'POST',
      url: '/projects/project-a/mcp',
      payload: { jsonrpc: '2.0', id: 1, method: 'initialize', params: {} },
      headers: { 'content-type': 'application/json' },
    });
    await app.inject({
      method: 'POST',
      url: '/projects/project-b/mcp',
      payload: { jsonrpc: '2.0', id: 2, method: 'initialize', params: {} },
      headers: { 'content-type': 'application/json' },
    });
    // Two different projects should trigger two refreshes
    expect(refreshProjectUpstreams).toHaveBeenCalledTimes(2);
    expect(refreshProjectUpstreams).toHaveBeenCalledWith(expect.any(Object), expect.any(Object), 'project-a', undefined);
    expect(refreshProjectUpstreams).toHaveBeenCalledWith(expect.any(Object), expect.any(Object), 'project-b', undefined);
  });
});
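The two caching tests imply a per-project router cache keyed by project name. A hypothetical sketch of that shape, assuming discovery runs once per project and failures are not cached (so the endpoint can answer 502 and retry on the next request); `ProjectRouterCache` is an invented name, not the real endpoint code:

```typescript
// Hypothetical sketch of the per-project router cache the tests above imply.
class ProjectRouterCache<R> {
  private routers = new Map<string, R>();

  constructor(
    private makeRouter: () => R,
    private refresh: (projectName: string, router: R) => Promise<void>,
  ) {}

  // First request for a project creates a router and runs discovery once;
  // later requests reuse the cached router. If refresh throws, nothing is
  // cached, so the next request retries discovery from scratch.
  async get(projectName: string): Promise<R> {
    const cached = this.routers.get(projectName);
    if (cached) return cached;
    const router = this.makeRouter();
    await this.refresh(projectName, router); // may throw -> surfaced as 502
    this.routers.set(projectName, router);
    return router;
  }
}
```

Caching only on success is what keeps the `returns 502 when project discovery fails` and `caches project router across requests` behaviors compatible.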

View File

@@ -28,6 +28,8 @@ services:
       MCPD_PORT: "3100"
       MCPD_HOST: "0.0.0.0"
       MCPD_LOG_LEVEL: ${MCPD_LOG_LEVEL:-info}
+      MCPD_NODE_RUNNER_IMAGE: mysources.co.uk/michal/mcpctl-node-runner:latest
+      MCPD_MCP_NETWORK: mcp-servers
     depends_on:
       postgres:
         condition: service_healthy
@@ -47,8 +49,10 @@ networks:
   mcpctl:
     driver: bridge
   mcp-servers:
+    name: mcp-servers
     driver: bridge
-    internal: true
+    # Not internal — MCP servers need outbound access for external APIs.
+    # Isolation enforced by not binding host ports on MCP containers.
 volumes:
   mcpctl-pgdata:

View File

@@ -1,16 +1,22 @@
 name: home-assistant
 version: "1.0.0"
 description: Home Assistant MCP server for smart home control and entity management
-packageName: "home-assistant-mcp-server"
-transport: STDIO
-repositoryUrl: https://github.com/tevonsb/homeassistant-mcp
+dockerImage: "ghcr.io/homeassistant-ai/ha-mcp:latest"
+transport: SSE
+containerPort: 8086
+repositoryUrl: https://github.com/homeassistant-ai/ha-mcp
+command:
+  - python
+  - -c
+  - "from ha_mcp.server import HomeAssistantSmartMCPServer; s = HomeAssistantSmartMCPServer(); s.mcp.run(transport='sse', host='0.0.0.0', port=8086)"
 healthCheck:
-  tool: get_entities
-  arguments: {}
+  tool: ha_search_entities
+  arguments:
+    query: "light"
 env:
-  - name: HASS_URL
+  - name: HOMEASSISTANT_URL
     description: Home Assistant instance URL (e.g. http://homeassistant.local:8123)
     required: true
-  - name: HASS_TOKEN
+  - name: HOMEASSISTANT_TOKEN
     description: Home Assistant long-lived access token
     required: true

tests.sh Executable file
View File

@@ -0,0 +1,82 @@
#!/usr/bin/env bash
set -euo pipefail

PATH="$HOME/.npm-global/bin:$PATH"

SHORT=false
FILTER=""
while [[ $# -gt 0 ]]; do
  case "$1" in
    --short|-s) SHORT=true; shift ;;
    --filter|-f) FILTER="$2"; shift 2 ;;
    *) echo "Usage: tests.sh [--short|-s] [--filter|-f <package>]"; exit 1 ;;
  esac
done

strip_ansi() {
  sed $'s/\033\[[0-9;]*m//g'
}

run_tests() {
  local pkg="$1"
  local label="$2"
  if $SHORT; then
    local tmpfile
    tmpfile=$(mktemp)
    trap "rm -f $tmpfile" RETURN
    local exit_code=0
    pnpm --filter "$pkg" test:run >"$tmpfile" 2>&1 || exit_code=$?
    # Parse from cleaned output
    local clean
    clean=$(strip_ansi < "$tmpfile")
    local tests_line files_line duration_line
    tests_line=$(echo "$clean" | grep -oP 'Tests\s+\K.*' | tail -1 | xargs)
    files_line=$(echo "$clean" | grep -oP 'Test Files\s+\K.*' | tail -1 | xargs)
    duration_line=$(echo "$clean" | grep -oP 'Duration\s+\K[0-9.]+s' | tail -1)
    if [[ $exit_code -eq 0 ]]; then
      printf " \033[32mPASS\033[0m %-6s %s | %s | %s\n" "$label" "$files_line" "$tests_line" "$duration_line"
    else
      printf " \033[31mFAIL\033[0m %-6s %s | %s | %s\n" "$label" "$files_line" "$tests_line" "$duration_line"
      echo "$clean" | grep -E 'FAIL |AssertionError|expected .* to' | head -10 | sed 's/^/ /'
    fi
    rm -f "$tmpfile"
    return $exit_code
  else
    echo "=== $label ==="
    pnpm --filter "$pkg" test:run
    echo ""
  fi
}

if $SHORT; then
  echo "Running tests..."
  echo ""
fi

failed=0
if [[ -z "$FILTER" || "$FILTER" == "mcpd" ]]; then
  run_tests mcpd "mcpd" || failed=1
fi
if [[ -z "$FILTER" || "$FILTER" == "cli" ]]; then
  run_tests cli "cli" || failed=1
fi

if $SHORT; then
  echo ""
  if [[ $failed -eq 0 ]]; then
    echo "All tests passed."
  else
    echo "Some tests FAILED."
  fi
fi
exit $failed
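The `--short` path leans on `strip_ansi` plus GNU grep's `-oP` with `\K` to pull the vitest footer apart. A standalone sketch of that parsing against a canned footer; the sample text below is fabricated for illustration, and `grep -P` assumes GNU grep:

```shell
#!/usr/bin/env bash
# Demo of the tests.sh summary parsing, run against a fabricated vitest
# footer instead of real `pnpm test:run` output.
set -euo pipefail

strip_ansi() { sed $'s/\033\[[0-9;]*m//g'; }

# Canned footer with ANSI color codes, roughly as vitest would emit it.
sample=$' \033[32mTest Files\033[0m  2 passed (2)\n \033[32mTests\033[0m  17 passed (17)\n Duration  1.42s'

clean=$(printf '%s\n' "$sample" | strip_ansi)
# \K discards everything matched so far, so only the value after the label is kept.
tests_line=$(echo "$clean" | grep -oP 'Tests\s+\K.*' | tail -1 | xargs)
files_line=$(echo "$clean" | grep -oP 'Test Files\s+\K.*' | tail -1 | xargs)
duration_line=$(echo "$clean" | grep -oP 'Duration\s+\K[0-9.]+s' | tail -1)
echo "files=$files_line tests=$tests_line duration=$duration_line"
```

Stripping the color codes first matters: the escape sequences would otherwise sit between the label and its value and break the `\K` patterns.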