Compare commits

..

39 Commits

Author SHA1 Message Date
Michal
e06db9afba feat: smart response pagination for large MCP tool results
Some checks failed
CI / lint (pull_request) Has been cancelled
CI / typecheck (pull_request) Has been cancelled
CI / test (pull_request) Has been cancelled
CI / build (pull_request) Has been cancelled
CI / package (pull_request) Has been cancelled
Intercepts oversized tool responses (>80K chars), caches them, and returns
a page index. LLM can fetch specific pages via _resultId/_page params.
Supports LLM-generated smart summaries with simple fallback.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-24 21:40:33 +00:00
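The paging arithmetic behind this commit can be sketched in shell — a minimal illustration only, using a hypothetical 80-character page size in place of the real 80K-character threshold; variable names are illustrative, not taken from the implementation:

```shell
# Build a 200-character payload and split it into fixed-size pages.
text=$(printf 'x%.0s' $(seq 1 200))
page_size=80                                        # stand-in for the 80K limit
pages=$(( (${#text} + page_size - 1) / page_size )) # ceiling division: 3 pages
page_index=1                                        # zero-based, like a _page param
page=${text:$((page_index * page_size)):$page_size}
echo "pages=$pages len=${#page}"
```

A client asking for page 1 of this payload would get the middle 80 characters; the last page holds the 40-character remainder.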
Michal
a25809b84a fix: auto-read user credentials for mcpd auth
mcplocal now reads ~/.mcpctl/credentials automatically when
MCPLOCAL_MCPD_TOKEN env var is not set, matching CLI behavior.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-24 19:14:56 +00:00
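The env-var-then-file fallback described above can be sketched as follows — a temp file stands in for ~/.mcpctl/credentials so the sketch is self-contained:

```shell
# Fallback order: MCPLOCAL_MCPD_TOKEN first, then the credentials file.
cred_file=$(mktemp)
printf 'tok-from-file' > "$cred_file"
unset MCPLOCAL_MCPD_TOKEN
token="${MCPLOCAL_MCPD_TOKEN:-}"
if [ -z "$token" ] && [ -f "$cred_file" ]; then
  token=$(cat "$cred_file")
fi
echo "$token"   # → tok-from-file
rm -f "$cred_file"
```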
f5a902d3e0 Merge pull request 'fix: STDIO transport stdout flush and MCP notification handling' (#37) from fix/stdio-flush-and-notifications into main
2026-02-24 19:10:03 +00:00
Michal
9cb0c5ce24 fix: STDIO transport stdout flush and MCP notification handling
- Wait for stdout.write callback before process.exit in STDIO transport
  to prevent truncation of large responses (e.g. grafana tools/list)
- Handle MCP notification methods (notifications/initialized, etc.) in
  router instead of returning "Method not found" error
- Use -p shorthand in config claude output

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-24 19:09:47 +00:00
06230ec034 Merge pull request 'feat: prompt resources, proxy transport fix, enriched descriptions' (#36) from feat/prompt-resources-and-proxy-transport into main
2026-02-24 14:53:24 +00:00
Michal
079c7b3dfa feat: add prompt resources, fix MCP proxy transport, enrich tool descriptions
- Fix MCP proxy to support SSE and STDIO transports (not just HTTP POST)
- Enrich tool descriptions with server context for LLM clarity
- Add Prompt and PromptRequest resources with two-resource RBAC model
- Add propose_prompt MCP tool for LLM to create pending prompt requests
- Add prompt resources visible in MCP resources/list (approved + session's pending)
- Add project-level prompt/instructions in MCP initialize response
- Add ServiceAccount subject type for RBAC (SA identity from X-Service-Account header)
- Add CLI commands: create prompt, get prompts/promptrequests, approve promptrequest
- Add prompts to apply config schema
- 956 tests passing across all packages

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-24 14:53:00 +00:00
Michal
7829f4fb92 fix: handle SSE responses in MCP bridge and add Commander-level tests
The bridge now parses SSE text/event-stream responses (extracting data:
lines) in addition to plain JSON. Also sends correct Accept header
per MCP streamable HTTP spec. Added tests for SSE handling and
command option parsing (-p/--project).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-24 10:17:45 +00:00
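The data:-line extraction the bridge performs can be sketched in shell; this is only the core idea (a real SSE parser also handles multi-line data fields and event boundaries), and the sample payload is made up:

```shell
# A text/event-stream body; the JSON-RPC payload lives on the data: line.
sse='event: message
data: {"jsonrpc":"2.0","id":1,"result":{}}
'
payload=$(printf '%s' "$sse" | sed -n 's/^data: //p')
echo "$payload"
```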
Michal
fa6240107f fix: mcp command accepts --project directly for Claude spawned processes
The mcp subcommand now has its own -p/--project option with
passThroughOptions(), so `mcpctl mcp --project NAME` works when Claude
spawns the process. Updated config claude to generate
args: ['mcp', '--project', project] and added Commander-level tests.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-24 10:14:16 +00:00
b34ea63d3d Merge pull request 'feat: add mcpctl mcp STDIO bridge, rework config claude' (#35) from feat/mcp-stdio-bridge into main
2026-02-24 00:52:21 +00:00
Michal
e17a2282e8 feat: add mcpctl mcp STDIO bridge, rework config claude
- New `mcpctl mcp -p PROJECT` command: STDIO-to-StreamableHTTP bridge
  that reads JSON-RPC from stdin and forwards to mcplocal project endpoint
- Rework `config claude` to write mcpctl mcp entry instead of fetching
  server configs from API (no secrets in .mcp.json)
- Keep `config claude-generate` as backward-compat alias
- Fix discovery.ts auth token not being forwarded to mcpd (RBAC bypass)
- Update fish/bash completions for new commands
- 10 new MCP bridge tests, updated claude tests, fixed project-discovery test

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-24 00:52:05 +00:00
01d3c4e02d Merge pull request 'fix: don't send Content-Type on bodyless DELETE, include full server data in project queries' (#34) from fix/delete-content-type-and-project-servers into main
2026-02-23 19:55:35 +00:00
Michal
e4affe5962 fix: don't send Content-Type on bodyless DELETE, include full server data in project queries
- Only set Content-Type: application/json when request body is present (fixes
  Fastify rejecting empty DELETE with "Body cannot be empty" 400 error)
- Changed PROJECT_INCLUDE to return full server objects instead of just {id, name}
  so project server listings show transport, package, image columns

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 19:54:34 +00:00
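The header logic in the first fix amounts to: attach Content-Type only when a body is actually sent. A hedged shell sketch with curl-style arguments (the endpoint and helper name are hypothetical):

```shell
# Assemble request args; Content-Type accompanies an actual body only.
build_args() {
  local body="$1"
  local args=(-X DELETE)
  if [ -n "$body" ]; then
    args+=(-H 'Content-Type: application/json' --data "$body")
  fi
  printf '%s\n' "${args[@]}"
}
without_body=$(build_args "")
with_body=$(build_args '{"force":true}')
echo "$without_body"
```

A bodyless DELETE built this way carries no Content-Type header, so a server that rejects "application/json with empty body" is never triggered.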
c75e7cdf4d Merge pull request 'fix: prevent attach/detach-server from repeating server arg on tab' (#33) from fix/completion-no-repeat-server-arg into main
2026-02-23 19:36:53 +00:00
Michal
65c340a03c fix: prevent attach/detach-server from repeating server arg on tab
Added __mcpctl_needs_server_arg guard in fish and position check in
bash so completions stop after one server name is selected.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 19:36:45 +00:00
677d34b868 Merge pull request 'fix: instance completions use server.name, smart attach/detach' (#32) from fix/completion-instances-attach-detach into main
2026-02-23 19:32:34 +00:00
Michal
c5b8cb60b7 fix: instance completions use server.name, smart attach/detach
- Instances have no name field — use server.name for completions
- attach-server: show only servers NOT in the project
- detach-server: show only servers IN the project
- Add helper functions for project-aware server completion
- 5 new tests covering all three fixes

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 19:32:18 +00:00
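The attach-server behavior (offer only servers NOT yet in the project) is a set difference, which the bash completion computes with comm -23. A standalone sketch with made-up server names:

```shell
# All known servers vs. servers already attached to the project.
all_servers=$'grafana\ngithub\npostgres'
attached=$'github'
# comm -23 keeps lines unique to the first (sorted) input: attachable servers.
available=$(comm -23 <(printf '%s\n' "$all_servers" | sort) \
                     <(printf '%s\n' "$attached" | sort))
echo "$available"
```

Here only grafana and postgres remain as attach candidates; detach-server is the simpler case and just lists the attached set.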
9a5deffb8f Merge pull request 'fix: use .[][].name in jq for wrapped JSON response' (#31) from fix/completion-jq-wrapped-json into main
2026-02-23 19:27:02 +00:00
Michal
ec7ada5383 fix: use .[][].name in jq for wrapped JSON response
API returns { "resources": [...] } not bare arrays, so .[].name
produced no output. Use .[][].name to unwrap the outer object first.
Also auto-load .env in pr.sh.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 19:26:47 +00:00
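The wrapper problem can be reproduced in isolation (sample payload below is illustrative; jq must be installed, as the earlier RPM-dependency commit notes):

```shell
# API responses wrap the array in an object keyed by resource type.
wrapped='{"resources":[{"name":"grafana"},{"name":"github"}]}'
# .[].name stops at the wrapper; .[][].name descends through it first.
names=$(printf '%s' "$wrapped" | jq -r '.[][].name')
echo "$names"
```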
b81d3be2d5 Merge pull request 'fix: use jq for completion name extraction to avoid nested matches' (#30) from fix/completion-nested-names into main
2026-02-23 19:23:48 +00:00
Michal
e2c54bfc5c fix: use jq for completion name extraction to avoid nested matches
The regex "name":\s*"..." on JSON matched nested server names inside
project objects, mixing resource types in completions. Switch to
jq -r '.[].name' for proper top-level extraction. Add jq as RPM
dependency. Add pr.sh for PR creation via Gitea API.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 19:23:21 +00:00
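The nested-match bug is easy to demonstrate: a regex scans the whole JSON text, while jq can be restricted to one level. A minimal sketch with hypothetical names (the exact regex in the old completions may have differed):

```shell
# A project object embeds server objects, so a global "name" regex
# matches both levels; jq applied to the top-level array does not.
json='[{"name":"proj-a","servers":[{"name":"grafana"}]}]'
regex_names=$(printf '%s' "$json" | grep -o '"name":"[^"]*"')
jq_names=$(printf '%s' "$json" | jq -r '.[].name')
echo "$regex_names"
echo "$jq_names"
```

The grep pass yields two matches (proj-a and the nested grafana), while jq returns only proj-a — exactly the mixed-resource-type symptom the commit fixes.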
7b7854b007 Merge pull request 'feat: erase stale fish completions and add completion tests' (#29) from feat/completions-stale-erase-and-tests into main
2026-02-23 19:17:00 +00:00
Michal
f23dd99662 feat: erase stale fish completions and add completion tests
Fish completions are additive — sourcing a new file doesn't remove old
rules. Add `complete -c mcpctl -e` at the top to clear stale entries.
Also add 12 structural tests to prevent completion regressions.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 19:16:36 +00:00
43af85cb58 Merge pull request 'feat: context-aware completions with dynamic resource names' (#28) from feat/completions-project-scope-dynamic into main
2026-02-23 19:08:45 +00:00
Michal
6d2e3c2eb3 feat: context-aware completions with dynamic resource names
- Hide attach-server/detach-server from --help (only relevant with --project)
- --project shows only project-scoped commands in tab completion
- Tab after resource type fetches live resource names from API
- --project value auto-completes from existing project names
- Stop offering resource types after one is already selected

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 19:08:29 +00:00
ce21db3853 Merge pull request 'feat: --project scopes get servers/instances' (#27) from feat/project-scoped-get into main
2026-02-23 19:03:23 +00:00
Michal
767725023e feat: --project flag scopes get servers/instances to project
mcpctl --project NAME get servers — shows only servers attached to the project
mcpctl --project NAME get instances — shows only instances of project servers

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 19:03:07 +00:00
2bd1b55fe8 Merge pull request 'feat: add tests.sh runner and project routes tests' (#26) from feat/tests-sh-and-project-routes-tests into main
2026-02-23 18:58:06 +00:00
Michal
0f2a93f2f0 feat: add tests.sh runner and project routes integration tests
- tests.sh: run all tests with `bash tests.sh`, summary with `--short`
- tests.sh --filter mcpd/cli: run specific package
- project-routes.test.ts: 17 new route-level tests covering CRUD,
  attach/detach, and the ownerId filtering bug fix

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 18:57:46 +00:00
ce81d9d616 Merge pull request 'fix: project list uses RBAC filtering instead of ownerId' (#25) from fix/project-list-rbac into main
2026-02-23 18:52:29 +00:00
Michal
c6cc39c6f7 fix: project list should use RBAC filtering, not ownerId
The list endpoint was filtering by ownerId before RBAC could include
projects the user has view access to via name-scoped bindings.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 18:52:13 +00:00
de074d9a90 Merge pull request 'feat: remove ProjectMember, add expose RBAC role, attach/detach-server' (#24) from feat/project-improvements into main
2026-02-23 17:50:24 +00:00
Michal
783cf15179 feat: remove ProjectMember, add expose RBAC role, attach/detach-server commands
- Remove ProjectMember model entirely (RBAC manages project access)
- Add 'expose' RBAC role for /mcp-config endpoint access (edit implies expose)
- Rename CLI flags: --llm-provider → --proxy-mode-llm-provider, --llm-model → --proxy-mode-llm-model
- Add attach-server / detach-server CLI commands (mcpctl --project NAME attach-server SERVER)
- Add POST/DELETE /api/v1/projects/:id/servers endpoints for server attach/detach
- Remove members from backup/restore, apply, get, describe
- Prisma migration to drop ProjectMember table

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 17:50:01 +00:00
5844d6c73f Merge pull request 'fix: RBAC name-scoped access — CUID resolution + list filtering' (#23) from fix/rbac-name-scoped-access into main
2026-02-23 12:27:48 +00:00
Michal
604bd76d60 fix: RBAC name-scoped access — CUID resolution + list filtering
Two bugs fixed:
- GET /api/v1/servers/:cuid now resolves CUID→name before RBAC check,
  so name-scoped bindings match correctly
- List endpoints now filter responses via preSerialization hook using
  getAllowedScope(), so name-scoped users only see their resources

Also adds fulldeploy.sh orchestrator script.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 12:26:37 +00:00
da14bb8c23 Merge pull request 'fix: update shell completions for current CLI commands' (#22) from fix/update-shell-completions into main
2026-02-23 12:00:50 +00:00
Michal
9e9a2f4a54 fix: update shell completions for current CLI commands
Add users, groups, rbac, secrets, templates to resource completions.
Remove stale profiles references. Add login, logout, create, edit,
delete, logs commands. Update config subcommands.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 12:00:31 +00:00
c8cdd7f514 Merge pull request 'fix: migrate legacy admin role at startup' (#21) from fix/migrate-legacy-admin-role into main
2026-02-23 11:31:31 +00:00
Michal
ec1dfe7438 fix: migrate legacy admin role to granular roles at startup
- Add migrateAdminRole() that runs on mcpd boot
- Converts { role: 'admin', resource: X } → edit + run bindings
- Adds operation bindings for wildcard admin (impersonate, logs, etc.)
- Add tests verifying unknown/legacy roles are denied by canAccess

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 11:31:15 +00:00
50b4112398 Merge pull request 'fix: resolve tsc --build type errors' (#20) from fix/build-type-errors into main
2026-02-23 11:08:08 +00:00
65 changed files with 5460 additions and 458 deletions


@@ -2,91 +2,166 @@ _mcpctl() {
local cur prev words cword
_init_completion || return
local commands="config status get describe instance instances apply setup claude project projects backup restore help"
local global_opts="-v --version -o --output --daemon-url -h --help"
local resources="servers profiles projects instances"
local commands="status login logout config get describe delete logs create edit apply backup restore mcp help"
local project_commands="attach-server detach-server get describe delete logs create edit help"
local global_opts="-v --version --daemon-url --direct --project -h --help"
local resources="servers instances secrets templates projects users groups rbac"
case "${words[1]}" in
# Check if --project was given
local has_project=false
local i
for ((i=1; i < cword; i++)); do
if [[ "${words[i]}" == "--project" ]]; then
has_project=true
break
fi
done
# Find the first subcommand (skip --project and its argument, skip flags)
local subcmd=""
local subcmd_pos=0
for ((i=1; i < cword; i++)); do
if [[ "${words[i]}" == "--project" || "${words[i]}" == "--daemon-url" ]]; then
((i++)) # skip the argument
continue
fi
if [[ "${words[i]}" != -* ]]; then
subcmd="${words[i]}"
subcmd_pos=$i
break
fi
done
# Find the resource type after get/describe/delete/edit
local resource_type=""
if [[ -n "$subcmd_pos" ]] && [[ $subcmd_pos -gt 0 ]]; then
for ((i=subcmd_pos+1; i < cword; i++)); do
if [[ "${words[i]}" != -* ]] && [[ " $resources " == *" ${words[i]} "* ]]; then
resource_type="${words[i]}"
break
fi
done
fi
# If completing the --project value
if [[ "$prev" == "--project" ]]; then
local names
names=$(mcpctl get projects -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null)
COMPREPLY=($(compgen -W "$names" -- "$cur"))
return
fi
# Fetch resource names dynamically (jq extracts only top-level names)
_mcpctl_resource_names() {
local rt="$1"
if [[ -n "$rt" ]]; then
# Instances don't have a name field — use server.name instead
if [[ "$rt" == "instances" ]]; then
mcpctl get instances -o json 2>/dev/null | jq -r '.[][].server.name' 2>/dev/null
else
mcpctl get "$rt" -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null
fi
fi
}
# Get the --project value from the command line
_mcpctl_get_project_value() {
local i
for ((i=1; i < cword; i++)); do
if [[ "${words[i]}" == "--project" ]] && (( i+1 < cword )); then
echo "${words[i+1]}"
return
fi
done
}
case "$subcmd" in
config)
COMPREPLY=($(compgen -W "view set path reset help" -- "$cur"))
if [[ $((cword - subcmd_pos)) -eq 1 ]]; then
COMPREPLY=($(compgen -W "view set path reset claude impersonate help" -- "$cur"))
fi
return ;;
status)
COMPREPLY=($(compgen -W "--daemon-url -h --help" -- "$cur"))
COMPREPLY=($(compgen -W "-h --help" -- "$cur"))
return ;;
get)
if [[ $cword -eq 2 ]]; then
login)
COMPREPLY=($(compgen -W "--url --email --password -h --help" -- "$cur"))
return ;;
logout)
return ;;
mcp)
return ;;
get|describe|delete)
if [[ -z "$resource_type" ]]; then
COMPREPLY=($(compgen -W "$resources" -- "$cur"))
else
COMPREPLY=($(compgen -W "-o --output --daemon-url -h --help" -- "$cur"))
local names
names=$(_mcpctl_resource_names "$resource_type")
COMPREPLY=($(compgen -W "$names -o --output -h --help" -- "$cur"))
fi
return ;;
describe)
if [[ $cword -eq 2 ]]; then
COMPREPLY=($(compgen -W "$resources" -- "$cur"))
edit)
if [[ -z "$resource_type" ]]; then
COMPREPLY=($(compgen -W "servers projects" -- "$cur"))
else
COMPREPLY=($(compgen -W "-o --output --daemon-url -h --help" -- "$cur"))
local names
names=$(_mcpctl_resource_names "$resource_type")
COMPREPLY=($(compgen -W "$names -h --help" -- "$cur"))
fi
return ;;
instance|instances)
if [[ $cword -eq 2 ]]; then
COMPREPLY=($(compgen -W "list ls start stop restart remove rm logs inspect help" -- "$cur"))
else
case "${words[2]}" in
logs)
COMPREPLY=($(compgen -W "--tail --since -h --help" -- "$cur"))
;;
start)
COMPREPLY=($(compgen -W "--env --image -h --help" -- "$cur"))
;;
list|ls)
COMPREPLY=($(compgen -W "--server-id -o --output -h --help" -- "$cur"))
;;
esac
fi
logs)
COMPREPLY=($(compgen -W "--tail --since -f --follow -h --help" -- "$cur"))
return ;;
claude)
if [[ $cword -eq 2 ]]; then
COMPREPLY=($(compgen -W "generate show add remove help" -- "$cur"))
else
case "${words[2]}" in
generate|show|add|remove)
COMPREPLY=($(compgen -W "--path -p -h --help" -- "$cur"))
;;
esac
fi
return ;;
project|projects)
if [[ $cword -eq 2 ]]; then
COMPREPLY=($(compgen -W "list ls create delete rm show profiles set-profiles help" -- "$cur"))
else
case "${words[2]}" in
create)
COMPREPLY=($(compgen -W "--description -d -h --help" -- "$cur"))
;;
list|ls)
COMPREPLY=($(compgen -W "-o --output -h --help" -- "$cur"))
;;
esac
create)
if [[ $((cword - subcmd_pos)) -eq 1 ]]; then
COMPREPLY=($(compgen -W "server secret project user group rbac help" -- "$cur"))
fi
return ;;
apply)
COMPREPLY=($(compgen -f -- "$cur"))
return ;;
backup)
COMPREPLY=($(compgen -W "-o --output -p --password -r --resources -h --help" -- "$cur"))
COMPREPLY=($(compgen -W "-o --output -p --password -h --help" -- "$cur"))
return ;;
restore)
COMPREPLY=($(compgen -W "-i --input -p --password -c --conflict -h --help" -- "$cur"))
return ;;
setup)
attach-server)
# Only complete if no server arg given yet (first arg after subcmd)
if [[ $((cword - subcmd_pos)) -ne 1 ]]; then return; fi
local proj names all_servers proj_servers
proj=$(_mcpctl_get_project_value)
if [[ -n "$proj" ]]; then
all_servers=$(mcpctl get servers -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null)
proj_servers=$(mcpctl --project "$proj" get servers -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null)
names=$(comm -23 <(echo "$all_servers" | sort) <(echo "$proj_servers" | sort))
else
names=$(_mcpctl_resource_names "servers")
fi
COMPREPLY=($(compgen -W "$names" -- "$cur"))
return ;;
detach-server)
# Only complete if no server arg given yet (first arg after subcmd)
if [[ $((cword - subcmd_pos)) -ne 1 ]]; then return; fi
local proj names
proj=$(_mcpctl_get_project_value)
if [[ -n "$proj" ]]; then
names=$(mcpctl --project "$proj" get servers -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null)
fi
COMPREPLY=($(compgen -W "$names" -- "$cur"))
return ;;
help)
COMPREPLY=($(compgen -W "$commands" -- "$cur"))
return ;;
esac
if [[ $cword -eq 1 ]]; then
COMPREPLY=($(compgen -W "$commands $global_opts" -- "$cur"))
# No subcommand yet — offer commands based on context
if [[ -z "$subcmd" ]]; then
if $has_project; then
COMPREPLY=($(compgen -W "$project_commands $global_opts" -- "$cur"))
else
COMPREPLY=($(compgen -W "$commands $global_opts" -- "$cur"))
fi
fi
}


@@ -1,73 +1,226 @@
# mcpctl fish completions
set -l commands config status get describe instance instances apply setup claude project projects backup restore help
# Erase any stale completions from previous versions
complete -c mcpctl -e
set -l commands status login logout config get describe delete logs create edit apply backup restore mcp help
set -l project_commands attach-server detach-server get describe delete logs create edit help
# Disable file completions by default
complete -c mcpctl -f
# Global options
complete -c mcpctl -s v -l version -d 'Show version'
complete -c mcpctl -s o -l output -d 'Output format' -xa 'table json yaml'
complete -c mcpctl -l daemon-url -d 'mcpd daemon URL' -x
complete -c mcpctl -l daemon-url -d 'mcplocal daemon URL' -x
complete -c mcpctl -l direct -d 'Bypass mcplocal, connect directly to mcpd'
complete -c mcpctl -l project -d 'Target project context' -x
complete -c mcpctl -s h -l help -d 'Show help'
# Top-level commands
complete -c mcpctl -n "not __fish_seen_subcommand_from $commands" -a config -d 'Manage configuration'
complete -c mcpctl -n "not __fish_seen_subcommand_from $commands" -a status -d 'Show status and connectivity'
complete -c mcpctl -n "not __fish_seen_subcommand_from $commands" -a get -d 'List resources'
complete -c mcpctl -n "not __fish_seen_subcommand_from $commands" -a describe -d 'Show resource details'
complete -c mcpctl -n "not __fish_seen_subcommand_from $commands" -a instance -d 'Manage instances'
complete -c mcpctl -n "not __fish_seen_subcommand_from $commands" -a apply -d 'Apply configuration from file'
complete -c mcpctl -n "not __fish_seen_subcommand_from $commands" -a setup -d 'Interactive setup wizard'
complete -c mcpctl -n "not __fish_seen_subcommand_from $commands" -a claude -d 'Manage Claude .mcp.json'
complete -c mcpctl -n "not __fish_seen_subcommand_from $commands" -a project -d 'Manage projects'
complete -c mcpctl -n "not __fish_seen_subcommand_from $commands" -a backup -d 'Backup configuration'
complete -c mcpctl -n "not __fish_seen_subcommand_from $commands" -a restore -d 'Restore from backup'
complete -c mcpctl -n "not __fish_seen_subcommand_from $commands" -a help -d 'Show help'
# Helper: check if --project was given
function __mcpctl_has_project
set -l tokens (commandline -opc)
for i in (seq (count $tokens))
if test "$tokens[$i]" = "--project"
return 0
end
end
return 1
end
# get/describe resources
complete -c mcpctl -n "__fish_seen_subcommand_from get describe" -a 'servers profiles projects instances' -d 'Resource type'
# Helper: check if a resource type has been selected after get/describe/delete/edit
set -l resources servers instances secrets templates projects users groups rbac
function __mcpctl_needs_resource_type
set -l tokens (commandline -opc)
set -l found_cmd false
for tok in $tokens
if $found_cmd
# Check if next token after get/describe/delete/edit is a resource type
if contains -- $tok servers instances secrets templates projects users groups rbac
return 1 # resource type already present
end
end
if contains -- $tok get describe delete edit
set found_cmd true
end
end
if $found_cmd
return 0 # command found but no resource type yet
end
return 1
end
function __mcpctl_get_resource_type
set -l tokens (commandline -opc)
set -l found_cmd false
for tok in $tokens
if $found_cmd
if contains -- $tok servers instances secrets templates projects users groups rbac
echo $tok
return
end
end
if contains -- $tok get describe delete edit
set found_cmd true
end
end
end
# Fetch resource names dynamically from the API (jq extracts only top-level names)
function __mcpctl_resource_names
set -l resource (__mcpctl_get_resource_type)
if test -z "$resource"
return
end
# Instances don't have a name field — use server.name instead
if test "$resource" = "instances"
mcpctl get instances -o json 2>/dev/null | jq -r '.[][].server.name' 2>/dev/null
else
mcpctl get $resource -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null
end
end
# Fetch project names for --project value
function __mcpctl_project_names
    mcpctl get projects -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null
end
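The `.[][].name` filter implies the API wraps resources in an envelope object whose values are arrays; a hypothetical payload (the real response shape is not shown in this diff) makes the two iteration levels visible:

```shell
# Made-up payload shaped the way '.[][].name' expects: an object whose
# values are arrays of named resources. First .[] yields the arrays,
# second .[] yields each resource, then .name extracts the string.
payload='{"items":[{"name":"grafana"},{"name":"github"}]}'
echo "$payload" | jq -r '.[][].name'
```

With `-r`, jq prints the bare names one per line, which is exactly what fish expects from a completion command.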
# Helper: get the --project value from the command line
function __mcpctl_get_project_value
    set -l tokens (commandline -opc)
    for i in (seq (count $tokens))
        if test "$tokens[$i]" = "--project"; and test $i -lt (count $tokens)
            echo $tokens[(math $i + 1)]
            return
        end
    end
end
# Servers currently attached to the project (for detach-server)
function __mcpctl_project_servers
    set -l proj (__mcpctl_get_project_value)
    if test -z "$proj"
        return
    end
    mcpctl --project $proj get servers -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null
end
# Servers NOT attached to the project (for attach-server)
function __mcpctl_available_servers
    set -l proj (__mcpctl_get_project_value)
    if test -z "$proj"
        # No project — show all servers
        mcpctl get servers -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null
        return
    end
    set -l all (mcpctl get servers -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null)
    set -l attached (mcpctl --project $proj get servers -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null)
    for s in $all
        if not contains -- $s $attached
            echo $s
        end
    end
end
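The set difference in `__mcpctl_available_servers` relies on fish's `contains`; the same idea in plain POSIX shell is a padded-string match. Server names below are illustrative, not real data:

```shell
# Illustrative lists: emit every server in $all that is not in $attached.
all="github grafana postgres"
attached="grafana"
available=$(
    for s in $all; do
        case " $attached " in
            *" $s "*) ;;        # already attached: skip
            *) echo "$s" ;;
        esac
    done
)
echo "$available"
```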
# --project value completion
complete -c mcpctl -l project -xa '(__mcpctl_project_names)'
# Top-level commands (without --project)
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a status -d 'Show status and connectivity'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a login -d 'Authenticate with mcpd'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a logout -d 'Log out'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a config -d 'Manage configuration'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a get -d 'List resources'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a describe -d 'Show resource details'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a delete -d 'Delete a resource'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a logs -d 'Get instance logs'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a create -d 'Create a resource'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a edit -d 'Edit a resource'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a apply -d 'Apply configuration from file'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a backup -d 'Backup configuration'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a restore -d 'Restore from backup'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a help -d 'Show help'
# Project-scoped commands (with --project)
complete -c mcpctl -n "__mcpctl_has_project; and not __fish_seen_subcommand_from $project_commands" -a attach-server -d 'Attach a server to the project'
complete -c mcpctl -n "__mcpctl_has_project; and not __fish_seen_subcommand_from $project_commands" -a detach-server -d 'Detach a server from the project'
complete -c mcpctl -n "__mcpctl_has_project; and not __fish_seen_subcommand_from $project_commands" -a get -d 'List resources (scoped to project)'
complete -c mcpctl -n "__mcpctl_has_project; and not __fish_seen_subcommand_from $project_commands" -a describe -d 'Show resource details'
complete -c mcpctl -n "__mcpctl_has_project; and not __fish_seen_subcommand_from $project_commands" -a delete -d 'Delete a resource'
complete -c mcpctl -n "__mcpctl_has_project; and not __fish_seen_subcommand_from $project_commands" -a logs -d 'Get instance logs'
complete -c mcpctl -n "__mcpctl_has_project; and not __fish_seen_subcommand_from $project_commands" -a create -d 'Create a resource'
complete -c mcpctl -n "__mcpctl_has_project; and not __fish_seen_subcommand_from $project_commands" -a edit -d 'Edit a resource'
complete -c mcpctl -n "__mcpctl_has_project; and not __fish_seen_subcommand_from $project_commands" -a help -d 'Show help'
# Resource types — only when resource type not yet selected
complete -c mcpctl -n "__fish_seen_subcommand_from get describe delete; and __mcpctl_needs_resource_type" -a "$resources" -d 'Resource type'
complete -c mcpctl -n "__fish_seen_subcommand_from edit; and __mcpctl_needs_resource_type" -a 'servers projects' -d 'Resource type'
# Resource names — after resource type is selected
complete -c mcpctl -n "__fish_seen_subcommand_from get describe delete edit; and not __mcpctl_needs_resource_type" -a '(__mcpctl_resource_names)' -d 'Resource name'
# Helper: check if attach-server/detach-server already has a server argument
function __mcpctl_needs_server_arg
    set -l tokens (commandline -opc)
    set -l found_cmd false
    for tok in $tokens
        if $found_cmd
            if not string match -q -- '-*' $tok
                return 1 # server arg already present
            end
        end
        if contains -- $tok attach-server detach-server
            set found_cmd true
        end
    end
    if $found_cmd
        return 0 # command found but no server arg yet
    end
    return 1
end
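The scan pattern used by `__mcpctl_needs_server_arg` — find the subcommand, then treat the first non-flag token after it as the positional argument — can be sketched in plain shell (the command line here is made up):

```shell
# Hypothetical command line; flags after attach-server are skipped and
# the first non-flag token counts as the server argument.
tokens="mcpctl --project demo attach-server --force github"
found=0
has_arg=0
for tok in $tokens; do
    if [ "$found" -eq 1 ]; then
        case $tok in
            -*) ;;            # a flag, keep scanning
            *) has_arg=1 ;;   # positional server argument found
        esac
    fi
    case $tok in
        attach-server|detach-server) found=1 ;;
    esac
done
echo "$has_arg"
```

Note the flag check runs before `found` is set, so the subcommand token itself is never mistaken for the argument — the fish version orders its two checks the same way.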
# attach-server: show servers NOT in the project (only if no server arg yet)
complete -c mcpctl -n "__fish_seen_subcommand_from attach-server; and __mcpctl_needs_server_arg" -a '(__mcpctl_available_servers)' -d 'Server'
# detach-server: show servers IN the project (only if no server arg yet)
complete -c mcpctl -n "__fish_seen_subcommand_from detach-server; and __mcpctl_needs_server_arg" -a '(__mcpctl_project_servers)' -d 'Server'
# get/describe options
complete -c mcpctl -n "__fish_seen_subcommand_from get" -s o -l output -d 'Output format' -xa 'table json yaml'
complete -c mcpctl -n "__fish_seen_subcommand_from describe" -s o -l output -d 'Output format' -xa 'detail json yaml'
complete -c mcpctl -n "__fish_seen_subcommand_from describe" -l show-values -d 'Show secret values'
# login options
complete -c mcpctl -n "__fish_seen_subcommand_from login" -l url -d 'mcpd URL' -x
complete -c mcpctl -n "__fish_seen_subcommand_from login" -l email -d 'Email address' -x
complete -c mcpctl -n "__fish_seen_subcommand_from login" -l password -d 'Password' -x
# config subcommands
complete -c mcpctl -n "__fish_seen_subcommand_from config; and not __fish_seen_subcommand_from view set path reset" -a view -d 'Show configuration'
complete -c mcpctl -n "__fish_seen_subcommand_from config; and not __fish_seen_subcommand_from view set path reset" -a set -d 'Set a config value'
complete -c mcpctl -n "__fish_seen_subcommand_from config; and not __fish_seen_subcommand_from view set path reset" -a path -d 'Show config file path'
complete -c mcpctl -n "__fish_seen_subcommand_from config; and not __fish_seen_subcommand_from view set path reset" -a reset -d 'Reset to defaults'
set -l config_cmds view set path reset claude claude-generate impersonate
complete -c mcpctl -n "__fish_seen_subcommand_from config; and not __fish_seen_subcommand_from $config_cmds" -a view -d 'Show configuration'
complete -c mcpctl -n "__fish_seen_subcommand_from config; and not __fish_seen_subcommand_from $config_cmds" -a set -d 'Set a config value'
complete -c mcpctl -n "__fish_seen_subcommand_from config; and not __fish_seen_subcommand_from $config_cmds" -a path -d 'Show config file path'
complete -c mcpctl -n "__fish_seen_subcommand_from config; and not __fish_seen_subcommand_from $config_cmds" -a reset -d 'Reset to defaults'
complete -c mcpctl -n "__fish_seen_subcommand_from config; and not __fish_seen_subcommand_from $config_cmds" -a claude -d 'Generate .mcp.json for project'
complete -c mcpctl -n "__fish_seen_subcommand_from config; and not __fish_seen_subcommand_from $config_cmds" -a impersonate -d 'Impersonate a user'
# instance subcommands
set -l instance_cmds list ls start stop restart remove rm logs inspect
complete -c mcpctl -n "__fish_seen_subcommand_from instance instances; and not __fish_seen_subcommand_from $instance_cmds" -a list -d 'List instances'
complete -c mcpctl -n "__fish_seen_subcommand_from instance instances; and not __fish_seen_subcommand_from $instance_cmds" -a start -d 'Start instance'
complete -c mcpctl -n "__fish_seen_subcommand_from instance instances; and not __fish_seen_subcommand_from $instance_cmds" -a stop -d 'Stop instance'
complete -c mcpctl -n "__fish_seen_subcommand_from instance instances; and not __fish_seen_subcommand_from $instance_cmds" -a restart -d 'Restart instance'
complete -c mcpctl -n "__fish_seen_subcommand_from instance instances; and not __fish_seen_subcommand_from $instance_cmds" -a remove -d 'Remove instance'
complete -c mcpctl -n "__fish_seen_subcommand_from instance instances; and not __fish_seen_subcommand_from $instance_cmds" -a logs -d 'Get logs'
complete -c mcpctl -n "__fish_seen_subcommand_from instance instances; and not __fish_seen_subcommand_from $instance_cmds" -a inspect -d 'Inspect container'
complete -c mcpctl -n "__fish_seen_subcommand_from instance instances; and __fish_seen_subcommand_from logs" -l tail -d 'Number of lines' -x
complete -c mcpctl -n "__fish_seen_subcommand_from instance instances; and __fish_seen_subcommand_from logs" -l since -d 'Since timestamp' -x
# create subcommands
set -l create_cmds server secret project user group rbac
complete -c mcpctl -n "__fish_seen_subcommand_from create; and not __fish_seen_subcommand_from $create_cmds" -a server -d 'Create a server'
complete -c mcpctl -n "__fish_seen_subcommand_from create; and not __fish_seen_subcommand_from $create_cmds" -a secret -d 'Create a secret'
complete -c mcpctl -n "__fish_seen_subcommand_from create; and not __fish_seen_subcommand_from $create_cmds" -a project -d 'Create a project'
complete -c mcpctl -n "__fish_seen_subcommand_from create; and not __fish_seen_subcommand_from $create_cmds" -a user -d 'Create a user'
complete -c mcpctl -n "__fish_seen_subcommand_from create; and not __fish_seen_subcommand_from $create_cmds" -a group -d 'Create a group'
complete -c mcpctl -n "__fish_seen_subcommand_from create; and not __fish_seen_subcommand_from $create_cmds" -a rbac -d 'Create an RBAC binding'
# claude subcommands
set -l claude_cmds generate show add remove
complete -c mcpctl -n "__fish_seen_subcommand_from claude; and not __fish_seen_subcommand_from $claude_cmds" -a generate -d 'Generate .mcp.json'
complete -c mcpctl -n "__fish_seen_subcommand_from claude; and not __fish_seen_subcommand_from $claude_cmds" -a show -d 'Show .mcp.json'
complete -c mcpctl -n "__fish_seen_subcommand_from claude; and not __fish_seen_subcommand_from $claude_cmds" -a add -d 'Add server entry'
complete -c mcpctl -n "__fish_seen_subcommand_from claude; and not __fish_seen_subcommand_from $claude_cmds" -a remove -d 'Remove server entry'
complete -c mcpctl -n "__fish_seen_subcommand_from claude; and __fish_seen_subcommand_from $claude_cmds" -s p -l path -d 'Path to .mcp.json' -rF
# project subcommands
set -l project_cmds list ls create delete rm show profiles set-profiles
complete -c mcpctl -n "__fish_seen_subcommand_from project projects; and not __fish_seen_subcommand_from $project_cmds" -a list -d 'List projects'
complete -c mcpctl -n "__fish_seen_subcommand_from project projects; and not __fish_seen_subcommand_from $project_cmds" -a create -d 'Create project'
complete -c mcpctl -n "__fish_seen_subcommand_from project projects; and not __fish_seen_subcommand_from $project_cmds" -a delete -d 'Delete project'
complete -c mcpctl -n "__fish_seen_subcommand_from project projects; and not __fish_seen_subcommand_from $project_cmds" -a show -d 'Show project'
complete -c mcpctl -n "__fish_seen_subcommand_from project projects; and not __fish_seen_subcommand_from $project_cmds" -a profiles -d 'List profiles'
complete -c mcpctl -n "__fish_seen_subcommand_from project projects; and not __fish_seen_subcommand_from $project_cmds" -a set-profiles -d 'Set profiles'
complete -c mcpctl -n "__fish_seen_subcommand_from project projects; and __fish_seen_subcommand_from create" -s d -l description -d 'Description' -x
# logs options
complete -c mcpctl -n "__fish_seen_subcommand_from logs" -l tail -d 'Number of lines' -x
complete -c mcpctl -n "__fish_seen_subcommand_from logs" -l since -d 'Since timestamp' -x
complete -c mcpctl -n "__fish_seen_subcommand_from logs" -s f -l follow -d 'Follow log output'
# backup options
complete -c mcpctl -n "__fish_seen_subcommand_from backup" -s o -l output -d 'Output file' -rF
complete -c mcpctl -n "__fish_seen_subcommand_from backup" -s p -l password -d 'Encryption password' -x
complete -c mcpctl -n "__fish_seen_subcommand_from backup" -s r -l resources -d 'Resources to backup' -xa 'servers profiles projects'
# restore options
complete -c mcpctl -n "__fish_seen_subcommand_from restore" -s i -l input -d 'Input file' -rF
@@ -75,6 +228,7 @@ complete -c mcpctl -n "__fish_seen_subcommand_from restore" -s p -l password -d
complete -c mcpctl -n "__fish_seen_subcommand_from restore" -s c -l conflict -d 'Conflict strategy' -xa 'skip overwrite fail'
# apply takes a file
complete -c mcpctl -n "__fish_seen_subcommand_from apply" -s f -l file -d 'Configuration file' -rF
complete -c mcpctl -n "__fish_seen_subcommand_from apply" -F
# help completions

fulldeploy.sh Executable file

@@ -0,0 +1,35 @@
#!/bin/bash
# Full deployment: Docker image → Portainer stack → RPM build/publish/install
set -e
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
cd "$SCRIPT_DIR"
# Load .env
if [ -f .env ]; then
    set -a; source .env; set +a
fi
echo "========================================"
echo " mcpctl Full Deploy"
echo "========================================"
echo ""
echo ">>> Step 1/3: Build & push mcpd Docker image"
echo ""
bash scripts/build-mcpd.sh "$@"
echo ""
echo ">>> Step 2/3: Deploy stack to production"
echo ""
bash deploy.sh
echo ""
echo ">>> Step 3/3: Build, publish & install RPM"
echo ""
bash scripts/release.sh
echo ""
echo "========================================"
echo " Full deploy complete!"
echo "========================================"


@@ -5,6 +5,8 @@ release: "1"
maintainer: michal
description: kubectl-like CLI for managing MCP servers
license: MIT
depends:
- jq
contents:
- src: ./dist/mcpctl
dst: /usr/bin/mcpctl

pr.sh Executable file

@@ -0,0 +1,55 @@
#!/usr/bin/env bash
# Usage: bash pr.sh "PR title" "PR body"
# Loads GITEA_TOKEN from .env automatically
set -euo pipefail
# Load .env if GITEA_TOKEN not already exported
if [ -z "${GITEA_TOKEN:-}" ] && [ -f .env ]; then
    set -a
    source .env
    set +a
fi
GITEA_URL="${GITEA_URL:-http://10.0.0.194:3012}"
REPO="${GITEA_OWNER:-michal}/mcpctl"
TITLE="${1:?Usage: pr.sh <title> [body]}"
BODY="${2:-}"
BASE="${3:-main}"
HEAD=$(git rev-parse --abbrev-ref HEAD)
if [ "$HEAD" = "$BASE" ]; then
    echo "Error: already on $BASE, switch to a feature branch first" >&2
    exit 1
fi
if [ -z "${GITEA_TOKEN:-}" ]; then
    echo "Error: GITEA_TOKEN not set and .env not found" >&2
    exit 1
fi
# Push if needed
if ! git rev-parse --verify "origin/$HEAD" &>/dev/null; then
    git push -u origin "$HEAD"
else
    git push
fi
# Create PR
RESPONSE=$(curl -s -X POST "$GITEA_URL/api/v1/repos/$REPO/pulls" \
    -H "Authorization: token $GITEA_TOKEN" \
    -H "Content-Type: application/json" \
    -d "$(jq -n --arg t "$TITLE" --arg b "$BODY" --arg h "$HEAD" --arg base "$BASE" \
        '{title: $t, body: $b, head: $h, base: $base}')")
PR_NUM=$(echo "$RESPONSE" | jq -r '.number // empty')
PR_URL=$(echo "$RESPONSE" | jq -r '.html_url // empty')
if [ -z "$PR_NUM" ]; then
    echo "Error creating PR:" >&2
    echo "$RESPONSE" | jq . 2>/dev/null || echo "$RESPONSE" >&2
    exit 1
fi
echo "PR #$PR_NUM: ${PR_URL:-$GITEA_URL/$REPO/pulls/$PR_NUM}"
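The `jq -n --arg` pattern in pr.sh is what keeps the request body valid JSON no matter what the title contains; a minimal sketch (title text is illustrative):

```shell
# jq --arg JSON-escapes the value, so quotes or newlines in the PR
# title cannot break or inject into the payload.
title='fix: handle "quoted" input'
payload=$(jq -cn --arg t "$title" --arg base main '{title: $t, base: $base}')
echo "$payload"
```

String interpolation into the `-d` argument would break on the first embedded quote; delegating serialization to jq avoids that entire class of bug.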


@@ -24,7 +24,10 @@ export class ApiError extends Error {
function request<T>(method: string, url: string, timeout: number, body?: unknown, token?: string): Promise<ApiResponse<T>> {
return new Promise((resolve, reject) => {
const parsed = new URL(url);
const headers: Record<string, string> = { 'Content-Type': 'application/json' };
const headers: Record<string, string> = {};
if (body !== undefined) {
headers['Content-Type'] = 'application/json';
}
if (token) {
headers['Authorization'] = `Bearer ${token}`;
}


@@ -76,18 +76,19 @@ const GroupSpecSchema = z.object({
});
const RbacSubjectSchema = z.object({
kind: z.enum(['User', 'Group']),
kind: z.enum(['User', 'Group', 'ServiceAccount']),
name: z.string().min(1),
});
const RESOURCE_ALIASES: Record<string, string> = {
server: 'servers', instance: 'instances', secret: 'secrets',
project: 'projects', template: 'templates', user: 'users', group: 'groups',
prompt: 'prompts', promptrequest: 'promptrequests',
};
const RbacRoleBindingSchema = z.union([
z.object({
role: z.enum(['edit', 'view', 'create', 'delete', 'run']),
role: z.enum(['edit', 'view', 'create', 'delete', 'run', 'expose']),
resource: z.string().min(1).transform((r) => RESOURCE_ALIASES[r] ?? r),
name: z.string().min(1).optional(),
}),
@@ -103,14 +104,20 @@ const RbacBindingSpecSchema = z.object({
roleBindings: z.array(RbacRoleBindingSchema).default([]),
});
const PromptSpecSchema = z.object({
name: z.string().min(1).max(100).regex(/^[a-z0-9-]+$/),
content: z.string().min(1).max(50000),
projectId: z.string().optional(),
});
const ProjectSpecSchema = z.object({
name: z.string().min(1),
description: z.string().default(''),
prompt: z.string().max(10000).default(''),
proxyMode: z.enum(['direct', 'filtered']).default('direct'),
llmProvider: z.string().optional(),
llmModel: z.string().optional(),
servers: z.array(z.string()).default([]),
members: z.array(z.string().email()).default([]),
});
const ApplyConfigSchema = z.object({
@@ -122,6 +129,7 @@ const ApplyConfigSchema = z.object({
templates: z.array(TemplateSpecSchema).default([]),
rbacBindings: z.array(RbacBindingSpecSchema).default([]),
rbac: z.array(RbacBindingSpecSchema).default([]),
prompts: z.array(PromptSpecSchema).default([]),
}).transform((data) => ({
...data,
// Merge rbac into rbacBindings so both keys work
@@ -159,6 +167,7 @@ export function createApplyCommand(deps: ApplyCommandDeps): Command {
if (config.projects.length > 0) log(` ${config.projects.length} project(s)`);
if (config.templates.length > 0) log(` ${config.templates.length} template(s)`);
if (config.rbacBindings.length > 0) log(` ${config.rbacBindings.length} rbacBinding(s)`);
if (config.prompts.length > 0) log(` ${config.prompts.length} prompt(s)`);
return;
}
@@ -246,7 +255,7 @@ async function applyConfig(client: ApiClient, config: ApplyConfig, log: (...args
}
}
// Apply projects (send full spec including servers/members)
// Apply projects (send full spec including servers)
for (const project of config.projects) {
try {
const existing = await findByName(client, 'projects', project.name);
@@ -293,6 +302,22 @@ async function applyConfig(client: ApiClient, config: ApplyConfig, log: (...args
log(`Error applying rbacBinding '${rbacBinding.name}': ${err instanceof Error ? err.message : err}`);
}
}
// Apply prompts
for (const prompt of config.prompts) {
try {
const existing = await findByName(client, 'prompts', prompt.name);
if (existing) {
await client.put(`/api/v1/prompts/${(existing as { id: string }).id}`, { content: prompt.content });
log(`Updated prompt: ${prompt.name}`);
} else {
await client.post('/api/v1/prompts', prompt);
log(`Created prompt: ${prompt.name}`);
}
} catch (err) {
log(`Error applying prompt '${prompt.name}': ${err instanceof Error ? err.message : err}`);
}
}
}
async function findByName(client: ApiClient, resource: string, name: string): Promise<unknown | null> {


@@ -10,7 +10,7 @@ import type { CredentialsDeps, StoredCredentials } from '../auth/index.js';
import type { ApiClient } from '../api-client.js';
interface McpConfig {
mcpServers: Record<string, { command: string; args: string[]; env?: Record<string, string> }>;
mcpServers: Record<string, { command?: string; args?: string[]; url?: string; env?: Record<string, string> }>;
}
export interface ConfigCommandDeps {
@@ -84,21 +84,27 @@ export function createConfigCommand(deps?: Partial<ConfigCommandDeps>, apiDeps?:
log('Configuration reset to defaults');
});
if (apiDeps) {
const { client, credentialsDeps, log: apiLog } = apiDeps;
config
.command('claude-generate')
.description('Generate .mcp.json from a project configuration')
// claude/claude-generate: generate .mcp.json pointing at mcpctl mcp bridge
function registerClaudeCommand(name: string, hidden: boolean): void {
const cmd = config
.command(name)
.description(hidden ? '' : 'Generate .mcp.json that connects a project via mcpctl mcp bridge')
.requiredOption('--project <name>', 'Project name')
.option('-o, --output <path>', 'Output file path', '.mcp.json')
.option('--merge', 'Merge with existing .mcp.json instead of overwriting')
.option('--stdout', 'Print to stdout instead of writing a file')
.action(async (opts: { project: string; output: string; merge?: boolean; stdout?: boolean }) => {
const mcpConfig = await client.get<McpConfig>(`/api/v1/projects/${opts.project}/mcp-config`);
.action((opts: { project: string; output: string; merge?: boolean; stdout?: boolean }) => {
const mcpConfig: McpConfig = {
mcpServers: {
[opts.project]: {
command: 'mcpctl',
args: ['mcp', '-p', opts.project],
},
},
};
if (opts.stdout) {
apiLog(JSON.stringify(mcpConfig, null, 2));
log(JSON.stringify(mcpConfig, null, 2));
return;
}
@@ -121,8 +127,19 @@ export function createConfigCommand(deps?: Partial<ConfigCommandDeps>, apiDeps?:
writeFileSync(outputPath, JSON.stringify(finalConfig, null, 2) + '\n');
const serverCount = Object.keys(finalConfig.mcpServers).length;
apiLog(`Wrote ${outputPath} (${serverCount} server(s))`);
log(`Wrote ${outputPath} (${serverCount} server(s))`);
});
if (hidden) {
// Commander shows empty-description commands but they won't clutter help output
void cmd; // suppress unused lint
}
}
registerClaudeCommand('claude', false);
registerClaudeCommand('claude-generate', true); // backward compat
if (apiDeps) {
const { client, credentialsDeps, log: apiLog } = apiDeps;
config
.command('impersonate')


@@ -196,10 +196,10 @@ export function createCreateCommand(deps: CreateCommandDeps): Command {
.argument('<name>', 'Project name')
.option('-d, --description <text>', 'Project description', '')
.option('--proxy-mode <mode>', 'Proxy mode (direct, filtered)')
.option('--llm-provider <name>', 'LLM provider name')
.option('--llm-model <name>', 'LLM model name')
.option('--proxy-mode-llm-provider <name>', 'LLM provider name (for filtered proxy mode)')
.option('--proxy-mode-llm-model <name>', 'LLM model name (for filtered proxy mode)')
.option('--prompt <text>', 'Project-level prompt / instructions for the LLM')
.option('--server <name>', 'Server name (repeat for multiple)', collect, [])
.option('--member <email>', 'Member email (repeat for multiple)', collect, [])
.option('--force', 'Update if already exists')
.action(async (name: string, opts) => {
const body: Record<string, unknown> = {
@@ -207,10 +207,10 @@ export function createCreateCommand(deps: CreateCommandDeps): Command {
description: opts.description,
proxyMode: opts.proxyMode ?? 'direct',
};
if (opts.llmProvider) body.llmProvider = opts.llmProvider;
if (opts.llmModel) body.llmModel = opts.llmModel;
if (opts.prompt) body.prompt = opts.prompt;
if (opts.proxyModeLlmProvider) body.llmProvider = opts.proxyModeLlmProvider;
if (opts.proxyModeLlmModel) body.llmModel = opts.proxyModeLlmModel;
if (opts.server.length > 0) body.servers = opts.server;
if (opts.member.length > 0) body.members = opts.member;
try {
const project = await client.post<{ id: string; name: string }>('/api/v1/projects', body);
@@ -349,5 +349,35 @@ export function createCreateCommand(deps: CreateCommandDeps): Command {
}
});
// --- create prompt ---
cmd.command('prompt')
.description('Create an approved prompt')
.argument('<name>', 'Prompt name (lowercase alphanumeric with hyphens)')
.option('--project <name>', 'Project name to scope the prompt to')
.option('--content <text>', 'Prompt content text')
.option('--content-file <path>', 'Read prompt content from file')
.action(async (name: string, opts) => {
let content = opts.content as string | undefined;
if (opts.contentFile) {
const fs = await import('node:fs/promises');
content = await fs.readFile(opts.contentFile as string, 'utf-8');
}
if (!content) {
throw new Error('--content or --content-file is required');
}
const body: Record<string, unknown> = { name, content };
if (opts.project) {
// Resolve project name to ID
const projects = await client.get<Array<{ id: string; name: string }>>('/api/v1/projects');
const project = projects.find((p) => p.name === opts.project);
if (!project) throw new Error(`Project '${opts.project as string}' not found`);
body.projectId = project.id;
}
const prompt = await client.post<{ id: string; name: string }>('/api/v1/prompts', body);
log(`prompt '${prompt.name}' created (id: ${prompt.id})`);
});
return cmd;
}


@@ -11,7 +11,7 @@ export function createDeleteCommand(deps: DeleteCommandDeps): Command {
const { client, log } = deps;
return new Command('delete')
.description('Delete a resource (server, instance, profile, project)')
.description('Delete a resource (server, instance, secret, project, user, group, rbac)')
.argument('<resource>', 'resource type')
.argument('<id>', 'resource ID or name')
.action(async (resourceArg: string, idOrName: string) => {


@@ -162,17 +162,6 @@ function formatProjectDetail(project: Record<string, unknown>): string {
}
}
// Members section (no role — all permissions are in RBAC)
const members = project.members as Array<{ user: { email: string } }> | undefined;
if (members && members.length > 0) {
lines.push('');
lines.push('Members:');
lines.push(' EMAIL');
for (const m of members) {
lines.push(` ${m.user.email}`);
}
}
lines.push('');
lines.push('Metadata:');
lines.push(` ${pad('ID:', 12)}${project.id}`);


@@ -24,7 +24,6 @@ interface ProjectRow {
proxyMode: string;
ownerId: string;
servers?: Array<{ server: { name: string } }>;
members?: Array<{ user: { email: string }; role: string }>;
}
interface SecretRow {
@@ -85,7 +84,6 @@ const projectColumns: Column<ProjectRow>[] = [
{ header: 'NAME', key: 'name' },
{ header: 'MODE', key: (r) => r.proxyMode ?? 'direct', width: 10 },
{ header: 'SERVERS', key: (r) => r.servers ? String(r.servers.length) : '0', width: 8 },
{ header: 'MEMBERS', key: (r) => r.members ? String(r.members.length) : '0', width: 8 },
{ header: 'DESCRIPTION', key: 'description', width: 30 },
{ header: 'ID', key: 'id' },
];
@@ -132,6 +130,36 @@ const templateColumns: Column<TemplateRow>[] = [
{ header: 'DESCRIPTION', key: 'description', width: 50 },
];
interface PromptRow {
id: string;
name: string;
projectId: string | null;
createdAt: string;
}
interface PromptRequestRow {
id: string;
name: string;
projectId: string | null;
createdBySession: string | null;
createdAt: string;
}
const promptColumns: Column<PromptRow>[] = [
{ header: 'NAME', key: 'name' },
{ header: 'PROJECT', key: (r) => r.projectId ?? '-', width: 20 },
{ header: 'CREATED', key: (r) => new Date(r.createdAt).toLocaleString(), width: 20 },
{ header: 'ID', key: 'id' },
];
const promptRequestColumns: Column<PromptRequestRow>[] = [
{ header: 'NAME', key: 'name' },
{ header: 'PROJECT', key: (r) => r.projectId ?? '-', width: 20 },
{ header: 'SESSION', key: (r) => r.createdBySession ? r.createdBySession.slice(0, 12) : '-', width: 14 },
{ header: 'CREATED', key: (r) => new Date(r.createdAt).toLocaleString(), width: 20 },
{ header: 'ID', key: 'id' },
];
const instanceColumns: Column<InstanceRow>[] = [
{ header: 'NAME', key: (r) => r.server?.name ?? '-', width: 20 },
{ header: 'STATUS', key: 'status', width: 10 },
@@ -159,6 +187,10 @@ function getColumnsForResource(resource: string): Column<Record<string, unknown>
return groupColumns as unknown as Column<Record<string, unknown>>[];
case 'rbac':
return rbacColumns as unknown as Column<Record<string, unknown>>[];
case 'prompts':
return promptColumns as unknown as Column<Record<string, unknown>>[];
case 'promptrequests':
return promptRequestColumns as unknown as Column<Record<string, unknown>>[];
default:
return [
{ header: 'ID', key: 'id' as keyof Record<string, unknown> },

224
src/cli/src/commands/mcp.ts Normal file

@@ -0,0 +1,224 @@
import { Command } from 'commander';
import http from 'node:http';
import { createInterface } from 'node:readline';
export interface McpBridgeOptions {
    projectName: string;
    mcplocalUrl: string;
    token?: string | undefined;
    stdin: NodeJS.ReadableStream;
    stdout: NodeJS.WritableStream;
    stderr: NodeJS.WritableStream;
}
function postJsonRpc(
    url: string,
    body: string,
    sessionId: string | undefined,
    token: string | undefined,
): Promise<{ status: number; headers: http.IncomingHttpHeaders; body: string }> {
    return new Promise((resolve, reject) => {
        const parsed = new URL(url);
        const headers: Record<string, string> = {
            'Content-Type': 'application/json',
            'Accept': 'application/json, text/event-stream',
        };
        if (sessionId) {
            headers['mcp-session-id'] = sessionId;
        }
        if (token) {
            headers['Authorization'] = `Bearer ${token}`;
        }
        const req = http.request(
            {
                hostname: parsed.hostname,
                port: parsed.port,
                path: parsed.pathname,
                method: 'POST',
                headers,
                timeout: 30_000,
            },
            (res) => {
                const chunks: Buffer[] = [];
                res.on('data', (chunk: Buffer) => chunks.push(chunk));
                res.on('end', () => {
                    resolve({
                        status: res.statusCode ?? 0,
                        headers: res.headers,
                        body: Buffer.concat(chunks).toString('utf-8'),
                    });
                });
            },
        );
        req.on('error', reject);
        req.on('timeout', () => {
            req.destroy();
            reject(new Error('Request timed out'));
        });
        req.write(body);
        req.end();
    });
}
function sendDelete(
    url: string,
    sessionId: string,
    token: string | undefined,
): Promise<void> {
    return new Promise((resolve) => {
        const parsed = new URL(url);
        const headers: Record<string, string> = {
            'mcp-session-id': sessionId,
        };
        if (token) {
            headers['Authorization'] = `Bearer ${token}`;
        }
        const req = http.request(
            {
                hostname: parsed.hostname,
                port: parsed.port,
                path: parsed.pathname,
                method: 'DELETE',
                headers,
                timeout: 5_000,
            },
            () => resolve(),
        );
        req.on('error', () => resolve()); // Best effort cleanup
        req.on('timeout', () => {
            req.destroy();
            resolve();
        });
        req.end();
    });
}
/**
 * Extract JSON-RPC messages from an HTTP response body.
 * Handles both plain JSON and SSE (text/event-stream) formats.
 */
function extractJsonRpcMessages(contentType: string | undefined, body: string): string[] {
    if (contentType?.includes('text/event-stream')) {
        // Parse SSE: extract data: lines
        const messages: string[] = [];
        for (const line of body.split('\n')) {
            if (line.startsWith('data: ')) {
                messages.push(line.slice(6));
            }
        }
        return messages;
    }
    // Plain JSON response
    return [body];
}
/**
 * STDIO-to-Streamable-HTTP MCP bridge.
 *
 * Reads JSON-RPC messages line-by-line from stdin, POSTs them to
 * mcplocal's project endpoint, and writes responses to stdout.
 */
export async function runMcpBridge(opts: McpBridgeOptions): Promise<void> {
    const { projectName, mcplocalUrl, token, stdin, stdout, stderr } = opts;
    const endpointUrl = `${mcplocalUrl.replace(/\/$/, '')}/projects/${encodeURIComponent(projectName)}/mcp`;
    let sessionId: string | undefined;
    const rl = createInterface({ input: stdin, crlfDelay: Infinity });
    for await (const line of rl) {
        const trimmed = line.trim();
        if (!trimmed) continue;
        try {
            const result = await postJsonRpc(endpointUrl, trimmed, sessionId, token);
            // Capture session ID from first response
            if (!sessionId) {
                const sid = result.headers['mcp-session-id'];
                if (typeof sid === 'string') {
                    sessionId = sid;
                }
            }
            if (result.status >= 400) {
                stderr.write(`MCP bridge error: HTTP ${result.status}: ${result.body}\n`);
            }
            // Handle both plain JSON and SSE responses
            const messages = extractJsonRpcMessages(result.headers['content-type'], result.body);
            for (const msg of messages) {
                const trimmedMsg = msg.trim();
                if (trimmedMsg) {
                    stdout.write(trimmedMsg + '\n');
                }
            }
        } catch (err) {
            stderr.write(`MCP bridge error: ${err instanceof Error ? err.message : String(err)}\n`);
        }
    }
    // stdin closed — cleanup session
    if (sessionId) {
        await sendDelete(endpointUrl, sessionId, token);
    }
}
export interface McpCommandDeps {
getProject: () => string | undefined;
configLoader?: () => { mcplocalUrl: string };
credentialsLoader?: () => { token: string } | null;
}
export function createMcpCommand(deps: McpCommandDeps): Command {
const cmd = new Command('mcp')
.description('MCP STDIO transport bridge — connects stdin/stdout to a project MCP endpoint')
.passThroughOptions()
.option('-p, --project <name>', 'Project name')
.action(async (opts: { project?: string }) => {
// Accept -p/--project on the command itself, or fall back to global --project
const projectName = opts.project ?? deps.getProject();
if (!projectName) {
process.stderr.write('Error: --project is required for the mcp command\n');
process.exitCode = 1;
return;
}
let mcplocalUrl = 'http://localhost:3200';
if (deps.configLoader) {
mcplocalUrl = deps.configLoader().mcplocalUrl;
} else {
try {
const { loadConfig } = await import('../config/index.js');
mcplocalUrl = loadConfig().mcplocalUrl;
} catch {
// Use default
}
}
let token: string | undefined;
if (deps.credentialsLoader) {
token = deps.credentialsLoader()?.token;
} else {
try {
const { loadCredentials } = await import('../auth/index.js');
token = loadCredentials()?.token;
} catch {
// No credentials
}
}
await runMcpBridge({
projectName,
mcplocalUrl,
token,
stdin: process.stdin,
stdout: process.stdout,
stderr: process.stderr,
});
});
return cmd;
}
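For reference, the client-side `.mcp.json` entry that launches this bridge (per the `config claude` tests later in this diff) takes roughly this shape — `homeautomation` is an example project name:

```json
{
  "mcpServers": {
    "homeautomation": {
      "command": "mcpctl",
      "args": ["mcp", "-p", "homeautomation"]
    }
  }
}
```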

@@ -0,0 +1,66 @@
import { Command } from 'commander';
import type { ApiClient } from '../api-client.js';
import { resolveNameOrId, resolveResource } from './shared.js';
export interface ProjectOpsDeps {
client: ApiClient;
log: (...args: string[]) => void;
getProject: () => string | undefined;
}
function requireProject(deps: ProjectOpsDeps): string {
const project = deps.getProject();
if (!project) {
deps.log('Error: --project <name> is required for this command.');
process.exitCode = 1;
throw new Error('--project required');
}
return project;
}
export function createAttachServerCommand(deps: ProjectOpsDeps): Command {
const { client, log } = deps;
return new Command('attach-server')
.description('Attach a server to a project (requires --project)')
.argument('<server-name>', 'Server name to attach')
.action(async (serverName: string) => {
const projectName = requireProject(deps);
const projectId = await resolveNameOrId(client, 'projects', projectName);
await client.post(`/api/v1/projects/${projectId}/servers`, { server: serverName });
log(`server '${serverName}' attached to project '${projectName}'`);
});
}
export function createDetachServerCommand(deps: ProjectOpsDeps): Command {
const { client, log } = deps;
return new Command('detach-server')
.description('Detach a server from a project (requires --project)')
.argument('<server-name>', 'Server name to detach')
.action(async (serverName: string) => {
const projectName = requireProject(deps);
const projectId = await resolveNameOrId(client, 'projects', projectName);
await client.delete(`/api/v1/projects/${projectId}/servers/${serverName}`);
log(`server '${serverName}' detached from project '${projectName}'`);
});
}
export function createApproveCommand(deps: ProjectOpsDeps): Command {
const { client, log } = deps;
return new Command('approve')
.description('Approve a pending prompt request (atomic: delete request, create prompt)')
.argument('<resource>', 'Resource type (promptrequest)')
.argument('<name>', 'Prompt request name or ID')
.action(async (resourceArg: string, nameOrId: string) => {
const resource = resolveResource(resourceArg);
if (resource !== 'promptrequests') {
throw new Error(`approve is only supported for 'promptrequest', got '${resourceArg}'`);
}
const id = await resolveNameOrId(client, 'promptrequests', nameOrId);
const prompt = await client.post<{ id: string; name: string }>(`/api/v1/promptrequests/${id}/approve`, {});
log(`prompt request approved → prompt '${prompt.name}' created (id: ${prompt.id})`);
});
}

@@ -16,6 +16,11 @@ export const RESOURCE_ALIASES: Record<string, string> = {
rbac: 'rbac',
'rbac-definition': 'rbac',
'rbac-binding': 'rbac',
prompt: 'prompts',
prompts: 'prompts',
promptrequest: 'promptrequests',
promptrequests: 'promptrequests',
pr: 'promptrequests',
};
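`resolveResource` is truncated in this hunk; assuming it lower-cases the input and falls back to the raw name when no alias matches (an assumption — the body is not shown in the diff), the lookup behaves like:

```typescript
// Hypothetical mirror of the alias table and lookup above; the
// lower-casing and raw-name fallback are assumptions, since
// resolveResource's body is cut off in this hunk.
const ALIASES: Record<string, string> = {
  prompt: 'prompts',
  prompts: 'prompts',
  promptrequest: 'promptrequests',
  promptrequests: 'promptrequests',
  pr: 'promptrequests',
};

function resolveAlias(name: string): string {
  return ALIASES[name.toLowerCase()] ?? name;
}

console.log(resolveAlias('pr')); // → promptrequests
```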
export function resolveResource(name: string): string {

@@ -12,6 +12,8 @@ import { createCreateCommand } from './commands/create.js';
import { createEditCommand } from './commands/edit.js';
import { createBackupCommand, createRestoreCommand } from './commands/backup.js';
import { createLoginCommand, createLogoutCommand } from './commands/auth.js';
import { createAttachServerCommand, createDetachServerCommand, createApproveCommand } from './commands/project-ops.js';
import { createMcpCommand } from './commands/mcp.js';
import { ApiClient, ApiError } from './api-client.js';
import { loadConfig } from './config/index.js';
import { loadCredentials } from './auth/index.js';
@@ -24,7 +26,8 @@ export function createProgram(): Command {
.version(APP_VERSION, '-v, --version')
.enablePositionalOptions()
.option('--daemon-url <url>', 'mcplocal daemon URL')
.option('--direct', 'bypass mcplocal and connect directly to mcpd');
.option('--direct', 'bypass mcplocal and connect directly to mcpd')
.option('--project <name>', 'Target project for project commands');
program.addCommand(createStatusCommand());
program.addCommand(createLoginCommand());
@@ -52,6 +55,21 @@ export function createProgram(): Command {
}));
const fetchResource = async (resource: string, nameOrId?: string): Promise<unknown[]> => {
const projectName = program.opts().project as string | undefined;
// --project scoping for servers and instances
if (projectName && !nameOrId && (resource === 'servers' || resource === 'instances')) {
const projectId = await resolveNameOrId(client, 'projects', projectName);
if (resource === 'servers') {
return client.get<unknown[]>(`/api/v1/projects/${projectId}/servers`);
}
// instances: fetch project servers, then filter instances by serverId
const projectServers = await client.get<Array<{ id: string }>>(`/api/v1/projects/${projectId}/servers`);
const serverIds = new Set(projectServers.map((s) => s.id));
const allInstances = await client.get<Array<{ serverId: string }>>(`/api/v1/instances`);
return allInstances.filter((inst) => serverIds.has(inst.serverId));
}
if (nameOrId) {
// Glob pattern — use query param filtering
if (nameOrId.includes('*')) {
@@ -126,6 +144,18 @@ export function createProgram(): Command {
log: (...args) => console.log(...args),
}));
const projectOpsDeps = {
client,
log: (...args: string[]) => console.log(...args),
getProject: () => program.opts().project as string | undefined,
};
program.addCommand(createAttachServerCommand(projectOpsDeps), { hidden: true });
program.addCommand(createDetachServerCommand(projectOpsDeps), { hidden: true });
program.addCommand(createApproveCommand(projectOpsDeps));
program.addCommand(createMcpCommand({
getProject: () => program.opts().project as string | undefined,
}), { hidden: true });
return program;
}

@@ -21,6 +21,16 @@ beforeAll(async () => {
res.writeHead(201, { 'Content-Type': 'application/json' });
res.end(JSON.stringify({ id: 'srv-new', ...body }));
});
} else if (req.url === '/api/v1/servers/srv-1' && req.method === 'DELETE') {
// Fastify rejects empty body with Content-Type: application/json
const ct = req.headers['content-type'] ?? '';
if (ct.includes('application/json')) {
res.writeHead(400, { 'Content-Type': 'application/json' });
res.end(JSON.stringify({ error: "Body cannot be empty when content-type is set to 'application/json'" }));
} else {
res.writeHead(204);
res.end();
}
} else if (req.url === '/api/v1/missing' && req.method === 'GET') {
res.writeHead(404, { 'Content-Type': 'application/json' });
res.end(JSON.stringify({ error: 'Not found' }));
@@ -75,6 +85,12 @@ describe('ApiClient', () => {
await expect(client.get('/anything')).rejects.toThrow();
});
it('performs DELETE without Content-Type header', async () => {
const client = new ApiClient({ baseUrl: `http://localhost:${port}` });
// Should succeed (204) because no Content-Type is sent on bodyless DELETE
await expect(client.delete('/api/v1/servers/srv-1')).resolves.toBeUndefined();
});
it('sends Authorization header when token provided', async () => {
// We need a separate server to check the header
let receivedAuth = '';

@@ -326,7 +326,7 @@ rbacBindings:
rmSync(tmpDir, { recursive: true, force: true });
});
it('applies projects with servers and members', async () => {
it('applies projects with servers', async () => {
const configPath = join(tmpDir, 'config.yaml');
writeFileSync(configPath, `
projects:
@@ -338,9 +338,6 @@ projects:
servers:
- my-grafana
- my-ha
members:
- alice@test.com
- bob@test.com
`);
const cmd = createApplyCommand({ client, log });
@@ -352,7 +349,6 @@ projects:
llmProvider: 'gemini-cli',
llmModel: 'gemini-2.0-flash',
servers: ['my-grafana', 'my-ha'],
members: ['alice@test.com', 'bob@test.com'],
}));
expect(output.join('\n')).toContain('Created project: smart-home');

@@ -8,19 +8,14 @@ import { saveCredentials, loadCredentials } from '../../src/auth/index.js';
function mockClient(): ApiClient {
return {
get: vi.fn(async () => ({
mcpServers: {
'slack--default': { command: 'npx', args: ['-y', '@anthropic/slack-mcp'], env: { WORKSPACE: 'test' } },
'github--default': { command: 'npx', args: ['-y', '@anthropic/github-mcp'] },
},
})),
get: vi.fn(async () => ({})),
post: vi.fn(async () => ({ token: 'impersonated-tok', user: { email: 'other@test.com' } })),
put: vi.fn(async () => ({})),
delete: vi.fn(async () => {}),
} as unknown as ApiClient;
}
describe('config claude-generate', () => {
describe('config claude', () => {
let client: ReturnType<typeof mockClient>;
let output: string[];
let tmpDir: string;
@@ -36,18 +31,23 @@ describe('config claude-generate', () => {
rmSync(tmpDir, { recursive: true, force: true });
});
it('generates .mcp.json from project config', async () => {
it('generates .mcp.json with mcpctl mcp bridge entry', async () => {
const outPath = join(tmpDir, '.mcp.json');
const cmd = createConfigCommand(
{ configDeps: { configDir: tmpDir }, log },
{ client, credentialsDeps: { configDir: tmpDir }, log },
);
await cmd.parseAsync(['claude-generate', '--project', 'proj-1', '-o', outPath], { from: 'user' });
await cmd.parseAsync(['claude', '--project', 'homeautomation', '-o', outPath], { from: 'user' });
// No API call should be made
expect(client.get).not.toHaveBeenCalled();
expect(client.get).toHaveBeenCalledWith('/api/v1/projects/proj-1/mcp-config');
const written = JSON.parse(readFileSync(outPath, 'utf-8'));
expect(written.mcpServers['slack--default']).toBeDefined();
expect(output.join('\n')).toContain('2 server(s)');
expect(written.mcpServers['homeautomation']).toEqual({
command: 'mcpctl',
args: ['mcp', '-p', 'homeautomation'],
});
expect(output.join('\n')).toContain('1 server(s)');
});
it('prints to stdout with --stdout', async () => {
@@ -55,9 +55,13 @@ describe('config claude-generate', () => {
{ configDeps: { configDir: tmpDir }, log },
{ client, credentialsDeps: { configDir: tmpDir }, log },
);
await cmd.parseAsync(['claude-generate', '--project', 'proj-1', '--stdout'], { from: 'user' });
await cmd.parseAsync(['claude', '--project', 'myproj', '--stdout'], { from: 'user' });
expect(output[0]).toContain('mcpServers');
const parsed = JSON.parse(output[0]);
expect(parsed.mcpServers['myproj']).toEqual({
command: 'mcpctl',
args: ['mcp', '-p', 'myproj'],
});
});
it('merges with existing .mcp.json', async () => {
@@ -70,12 +74,41 @@ describe('config claude-generate', () => {
{ configDeps: { configDir: tmpDir }, log },
{ client, credentialsDeps: { configDir: tmpDir }, log },
);
await cmd.parseAsync(['claude-generate', '--project', 'proj-1', '-o', outPath, '--merge'], { from: 'user' });
await cmd.parseAsync(['claude', '--project', 'proj-1', '-o', outPath, '--merge'], { from: 'user' });
const written = JSON.parse(readFileSync(outPath, 'utf-8'));
expect(written.mcpServers['existing--server']).toBeDefined();
expect(written.mcpServers['slack--default']).toBeDefined();
expect(output.join('\n')).toContain('3 server(s)');
expect(written.mcpServers['proj-1']).toEqual({
command: 'mcpctl',
args: ['mcp', '-p', 'proj-1'],
});
expect(output.join('\n')).toContain('2 server(s)');
});
it('backward compat: claude-generate still works', async () => {
const outPath = join(tmpDir, '.mcp.json');
const cmd = createConfigCommand(
{ configDeps: { configDir: tmpDir }, log },
{ client, credentialsDeps: { configDir: tmpDir }, log },
);
await cmd.parseAsync(['claude-generate', '--project', 'proj-1', '-o', outPath], { from: 'user' });
const written = JSON.parse(readFileSync(outPath, 'utf-8'));
expect(written.mcpServers['proj-1']).toEqual({
command: 'mcpctl',
args: ['mcp', '-p', 'proj-1'],
});
});
it('uses project name as the server key', async () => {
const outPath = join(tmpDir, '.mcp.json');
const cmd = createConfigCommand(
{ configDeps: { configDir: tmpDir }, log },
);
await cmd.parseAsync(['claude', '--project', 'my-fancy-project', '-o', outPath], { from: 'user' });
const written = JSON.parse(readFileSync(outPath, 'utf-8'));
expect(Object.keys(written.mcpServers)).toEqual(['my-fancy-project']);
});
});

@@ -181,7 +181,6 @@ describe('get command', () => {
proxyMode: 'filtered',
ownerId: 'usr-1',
servers: [{ server: { name: 'grafana' } }],
members: [{ user: { email: 'a@b.com' }, role: 'admin' }, { user: { email: 'c@d.com' }, role: 'member' }],
}]);
const cmd = createGetCommand(deps);
await cmd.parseAsync(['node', 'test', 'projects']);
@@ -189,11 +188,9 @@ describe('get command', () => {
const text = deps.output.join('\n');
expect(text).toContain('MODE');
expect(text).toContain('SERVERS');
expect(text).toContain('MEMBERS');
expect(text).toContain('smart-home');
expect(text).toContain('filtered');
expect(text).toContain('1');
expect(text).toContain('2');
});
it('displays mixed resource and operation bindings', async () => {

@@ -0,0 +1,481 @@
import { describe, it, expect, beforeAll, afterAll } from 'vitest';
import http from 'node:http';
import { Readable, Writable } from 'node:stream';
import { runMcpBridge, createMcpCommand } from '../../src/commands/mcp.js';
// ---- Mock MCP server (simulates mcplocal project endpoint) ----
interface RecordedRequest {
method: string;
url: string;
headers: http.IncomingHttpHeaders;
body: string;
}
let mockServer: http.Server;
let mockPort: number;
const recorded: RecordedRequest[] = [];
let sessionCounter = 0;
function makeInitializeResponse(id: number | string) {
return JSON.stringify({
jsonrpc: '2.0',
id,
result: {
protocolVersion: '2024-11-05',
capabilities: { tools: {} },
serverInfo: { name: 'test-server', version: '1.0.0' },
},
});
}
function makeToolsListResponse(id: number | string) {
return JSON.stringify({
jsonrpc: '2.0',
id,
result: {
tools: [
{ name: 'grafana/query', description: 'Query Grafana', inputSchema: { type: 'object', properties: {} } },
],
},
});
}
function makeToolCallResponse(id: number | string) {
return JSON.stringify({
jsonrpc: '2.0',
id,
result: {
content: [{ type: 'text', text: 'tool result' }],
},
});
}
beforeAll(async () => {
mockServer = http.createServer((req, res) => {
const chunks: Buffer[] = [];
req.on('data', (c: Buffer) => chunks.push(c));
req.on('end', () => {
const body = Buffer.concat(chunks).toString('utf-8');
recorded.push({ method: req.method ?? '', url: req.url ?? '', headers: req.headers, body });
if (req.method === 'DELETE') {
res.writeHead(200);
res.end();
return;
}
if (req.method === 'POST' && req.url?.startsWith('/projects/')) {
let sessionId = req.headers['mcp-session-id'] as string | undefined;
// Assign session ID on first request
if (!sessionId) {
sessionCounter++;
sessionId = `session-${sessionCounter}`;
}
res.setHeader('mcp-session-id', sessionId);
// Parse JSON-RPC and respond based on method
try {
const rpc = JSON.parse(body) as { id: number | string; method: string };
let responseBody: string;
switch (rpc.method) {
case 'initialize':
responseBody = makeInitializeResponse(rpc.id);
break;
case 'tools/list':
responseBody = makeToolsListResponse(rpc.id);
break;
case 'tools/call':
responseBody = makeToolCallResponse(rpc.id);
break;
default:
responseBody = JSON.stringify({ jsonrpc: '2.0', id: rpc.id, error: { code: -32601, message: 'Method not found' } });
}
// Respond in SSE format for /projects/sse-project/mcp
if (req.url?.includes('sse-project')) {
res.writeHead(200, { 'Content-Type': 'text/event-stream' });
res.end(`event: message\ndata: ${responseBody}\n\n`);
} else {
res.writeHead(200, { 'Content-Type': 'application/json' });
res.end(responseBody);
}
} catch {
res.writeHead(400, { 'Content-Type': 'application/json' });
res.end(JSON.stringify({ error: 'Invalid JSON' }));
}
return;
}
res.writeHead(404);
res.end();
});
});
await new Promise<void>((resolve) => {
mockServer.listen(0, () => {
const addr = mockServer.address();
if (addr && typeof addr === 'object') {
mockPort = addr.port;
}
resolve();
});
});
});
afterAll(() => {
mockServer.close();
});
// ---- Helper to run bridge with mock streams ----
function createMockStreams() {
const stdoutChunks: string[] = [];
const stderrChunks: string[] = [];
const stdout = new Writable({
write(chunk: Buffer, _encoding, callback) {
stdoutChunks.push(chunk.toString());
callback();
},
});
const stderr = new Writable({
write(chunk: Buffer, _encoding, callback) {
stderrChunks.push(chunk.toString());
callback();
},
});
return { stdout, stderr, stdoutChunks, stderrChunks };
}
function pushAndEnd(stdin: Readable, lines: string[]) {
for (const line of lines) {
stdin.push(line + '\n');
}
stdin.push(null); // EOF
}
// ---- Tests ----
describe('MCP STDIO Bridge', () => {
beforeAll(() => {
recorded.length = 0;
sessionCounter = 0;
});
it('forwards initialize request and returns response', async () => {
recorded.length = 0;
const stdin = new Readable({ read() {} });
const { stdout, stdoutChunks } = createMockStreams();
const initMsg = JSON.stringify({
jsonrpc: '2.0', id: 1, method: 'initialize',
params: { protocolVersion: '2024-11-05', capabilities: {}, clientInfo: { name: 'test', version: '1.0' } },
});
pushAndEnd(stdin, [initMsg]);
await runMcpBridge({
projectName: 'test-project',
mcplocalUrl: `http://localhost:${mockPort}`,
stdin, stdout, stderr: new Writable({ write(_, __, cb) { cb(); } }),
});
// Verify request was made to correct URL
expect(recorded.some((r) => r.url === '/projects/test-project/mcp' && r.method === 'POST')).toBe(true);
// Verify response on stdout
const output = stdoutChunks.join('');
const parsed = JSON.parse(output.trim());
expect(parsed.result.serverInfo.name).toBe('test-server');
expect(parsed.result.protocolVersion).toBe('2024-11-05');
});
it('sends session ID on subsequent requests', async () => {
recorded.length = 0;
const stdin = new Readable({ read() {} });
const { stdout, stdoutChunks } = createMockStreams();
const initMsg = JSON.stringify({
jsonrpc: '2.0', id: 1, method: 'initialize',
params: { protocolVersion: '2024-11-05', capabilities: {}, clientInfo: { name: 'test', version: '1.0' } },
});
const toolsListMsg = JSON.stringify({ jsonrpc: '2.0', id: 2, method: 'tools/list', params: {} });
pushAndEnd(stdin, [initMsg, toolsListMsg]);
await runMcpBridge({
projectName: 'test-project',
mcplocalUrl: `http://localhost:${mockPort}`,
stdin, stdout, stderr: new Writable({ write(_, __, cb) { cb(); } }),
});
// First POST should NOT have mcp-session-id header
const firstPost = recorded.find((r) => r.method === 'POST' && r.body.includes('initialize'));
expect(firstPost).toBeDefined();
expect(firstPost!.headers['mcp-session-id']).toBeUndefined();
// Second POST SHOULD have mcp-session-id header
const secondPost = recorded.find((r) => r.method === 'POST' && r.body.includes('tools/list'));
expect(secondPost).toBeDefined();
expect(secondPost!.headers['mcp-session-id']).toMatch(/^session-/);
// Verify tools/list response
const lines = stdoutChunks.join('').trim().split('\n');
expect(lines.length).toBe(2);
const toolsResponse = JSON.parse(lines[1]);
expect(toolsResponse.result.tools[0].name).toBe('grafana/query');
});
it('forwards tools/call and returns result', async () => {
recorded.length = 0;
const stdin = new Readable({ read() {} });
const { stdout, stdoutChunks } = createMockStreams();
const initMsg = JSON.stringify({
jsonrpc: '2.0', id: 1, method: 'initialize',
params: { protocolVersion: '2024-11-05', capabilities: {}, clientInfo: { name: 'test', version: '1.0' } },
});
const callMsg = JSON.stringify({
jsonrpc: '2.0', id: 2, method: 'tools/call',
params: { name: 'grafana/query', arguments: { query: 'test' } },
});
pushAndEnd(stdin, [initMsg, callMsg]);
await runMcpBridge({
projectName: 'test-project',
mcplocalUrl: `http://localhost:${mockPort}`,
stdin, stdout, stderr: new Writable({ write(_, __, cb) { cb(); } }),
});
const lines = stdoutChunks.join('').trim().split('\n');
expect(lines.length).toBe(2);
const callResponse = JSON.parse(lines[1]);
expect(callResponse.result.content[0].text).toBe('tool result');
});
it('forwards Authorization header when token provided', async () => {
recorded.length = 0;
const stdin = new Readable({ read() {} });
const { stdout } = createMockStreams();
const initMsg = JSON.stringify({
jsonrpc: '2.0', id: 1, method: 'initialize',
params: { protocolVersion: '2024-11-05', capabilities: {}, clientInfo: { name: 'test', version: '1.0' } },
});
pushAndEnd(stdin, [initMsg]);
await runMcpBridge({
projectName: 'test-project',
mcplocalUrl: `http://localhost:${mockPort}`,
token: 'my-secret-token',
stdin, stdout, stderr: new Writable({ write(_, __, cb) { cb(); } }),
});
const post = recorded.find((r) => r.method === 'POST');
expect(post).toBeDefined();
expect(post!.headers['authorization']).toBe('Bearer my-secret-token');
});
it('does not send Authorization header when no token', async () => {
recorded.length = 0;
const stdin = new Readable({ read() {} });
const { stdout } = createMockStreams();
const initMsg = JSON.stringify({
jsonrpc: '2.0', id: 1, method: 'initialize',
params: { protocolVersion: '2024-11-05', capabilities: {}, clientInfo: { name: 'test', version: '1.0' } },
});
pushAndEnd(stdin, [initMsg]);
await runMcpBridge({
projectName: 'test-project',
mcplocalUrl: `http://localhost:${mockPort}`,
stdin, stdout, stderr: new Writable({ write(_, __, cb) { cb(); } }),
});
const post = recorded.find((r) => r.method === 'POST');
expect(post).toBeDefined();
expect(post!.headers['authorization']).toBeUndefined();
});
it('sends DELETE to clean up session on stdin EOF', async () => {
recorded.length = 0;
const stdin = new Readable({ read() {} });
const { stdout } = createMockStreams();
const initMsg = JSON.stringify({
jsonrpc: '2.0', id: 1, method: 'initialize',
params: { protocolVersion: '2024-11-05', capabilities: {}, clientInfo: { name: 'test', version: '1.0' } },
});
pushAndEnd(stdin, [initMsg]);
await runMcpBridge({
projectName: 'test-project',
mcplocalUrl: `http://localhost:${mockPort}`,
stdin, stdout, stderr: new Writable({ write(_, __, cb) { cb(); } }),
});
// Should have a DELETE request for session cleanup
const deleteReq = recorded.find((r) => r.method === 'DELETE');
expect(deleteReq).toBeDefined();
expect(deleteReq!.headers['mcp-session-id']).toMatch(/^session-/);
});
it('does not send DELETE if no session was established', async () => {
recorded.length = 0;
const stdin = new Readable({ read() {} });
const { stdout } = createMockStreams();
// Push EOF immediately with no messages
stdin.push(null);
await runMcpBridge({
projectName: 'test-project',
mcplocalUrl: `http://localhost:${mockPort}`,
stdin, stdout, stderr: new Writable({ write(_, __, cb) { cb(); } }),
});
expect(recorded.filter((r) => r.method === 'DELETE')).toHaveLength(0);
});
it('writes errors to stderr, not stdout', async () => {
recorded.length = 0;
const stdin = new Readable({ read() {} });
const { stdout, stdoutChunks, stderr, stderrChunks } = createMockStreams();
// Send to a non-existent port to trigger connection error
const badMsg = JSON.stringify({ jsonrpc: '2.0', id: 1, method: 'initialize', params: {} });
pushAndEnd(stdin, [badMsg]);
await runMcpBridge({
projectName: 'test-project',
mcplocalUrl: 'http://localhost:1', // will fail to connect
stdin, stdout, stderr,
});
// Error should be on stderr
expect(stderrChunks.join('')).toContain('MCP bridge error');
// stdout should be empty (no corrupted output)
expect(stdoutChunks.join('')).toBe('');
});
it('skips blank lines in stdin', async () => {
recorded.length = 0;
const stdin = new Readable({ read() {} });
const { stdout, stdoutChunks } = createMockStreams();
const initMsg = JSON.stringify({
jsonrpc: '2.0', id: 1, method: 'initialize',
params: { protocolVersion: '2024-11-05', capabilities: {}, clientInfo: { name: 'test', version: '1.0' } },
});
pushAndEnd(stdin, ['', ' ', initMsg, '']);
await runMcpBridge({
projectName: 'test-project',
mcplocalUrl: `http://localhost:${mockPort}`,
stdin, stdout, stderr: new Writable({ write(_, __, cb) { cb(); } }),
});
// Only one POST (for the actual message)
const posts = recorded.filter((r) => r.method === 'POST');
expect(posts).toHaveLength(1);
// One response line
const lines = stdoutChunks.join('').trim().split('\n');
expect(lines).toHaveLength(1);
});
it('handles SSE (text/event-stream) responses', async () => {
recorded.length = 0;
const stdin = new Readable({ read() {} });
const { stdout, stdoutChunks } = createMockStreams();
const initMsg = JSON.stringify({
jsonrpc: '2.0', id: 1, method: 'initialize',
params: { protocolVersion: '2024-11-05', capabilities: {}, clientInfo: { name: 'test', version: '1.0' } },
});
pushAndEnd(stdin, [initMsg]);
await runMcpBridge({
projectName: 'sse-project', // triggers SSE response from mock server
mcplocalUrl: `http://localhost:${mockPort}`,
stdin, stdout, stderr: new Writable({ write(_, __, cb) { cb(); } }),
});
// Should extract JSON from SSE data: lines
const output = stdoutChunks.join('').trim();
const parsed = JSON.parse(output);
expect(parsed.result.serverInfo.name).toBe('test-server');
});
it('URL-encodes project name', async () => {
recorded.length = 0;
const stdin = new Readable({ read() {} });
const { stdout, stderr } = createMockStreams();
const initMsg = JSON.stringify({
jsonrpc: '2.0', id: 1, method: 'initialize',
params: { protocolVersion: '2024-11-05', capabilities: {}, clientInfo: { name: 'test', version: '1.0' } },
});
pushAndEnd(stdin, [initMsg]);
await runMcpBridge({
projectName: 'my project',
mcplocalUrl: `http://localhost:${mockPort}`,
stdin, stdout, stderr,
});
const post = recorded.find((r) => r.method === 'POST');
expect(post?.url).toBe('/projects/my%20project/mcp');
});
});
describe('createMcpCommand', () => {
it('accepts --project option directly', () => {
const cmd = createMcpCommand({
getProject: () => undefined,
configLoader: () => ({ mcplocalUrl: 'http://localhost:3200' }),
credentialsLoader: () => null,
});
const opt = cmd.options.find((o) => o.long === '--project');
expect(opt).toBeDefined();
expect(opt!.short).toBe('-p');
});
it('parses --project from command args', async () => {
let capturedProject: string | undefined;
const cmd = createMcpCommand({
getProject: () => undefined,
configLoader: () => ({ mcplocalUrl: `http://localhost:${mockPort}` }),
credentialsLoader: () => null,
});
// We only verify option parsing here; the action itself (which would
// run the full bridge against process.stdin) is not awaited.
const parsed = cmd.parse(['--project', 'test-proj'], { from: 'user' });
capturedProject = parsed.opts().project;
expect(capturedProject).toBe('test-proj');
});
it('parses -p shorthand from command args', () => {
const cmd = createMcpCommand({
getProject: () => undefined,
configLoader: () => ({ mcplocalUrl: `http://localhost:${mockPort}` }),
credentialsLoader: () => null,
});
const parsed = cmd.parse(['-p', 'my-project'], { from: 'user' });
expect(parsed.opts().project).toBe('my-project');
});
});

@@ -30,8 +30,8 @@ describe('project with new fields', () => {
'project', 'smart-home',
'-d', 'Smart home project',
'--proxy-mode', 'filtered',
'--llm-provider', 'gemini-cli',
'--llm-model', 'gemini-2.0-flash',
'--proxy-mode-llm-provider', 'gemini-cli',
'--proxy-mode-llm-model', 'gemini-2.0-flash',
'--server', 'my-grafana',
'--server', 'my-ha',
], { from: 'user' });
@@ -46,20 +46,6 @@ describe('project with new fields', () => {
}));
});
it('creates project with members', async () => {
const cmd = createCreateCommand({ client, log });
await cmd.parseAsync([
'project', 'team-project',
'--member', 'alice@test.com',
'--member', 'bob@test.com',
], { from: 'user' });
expect(client.post).toHaveBeenCalledWith('/api/v1/projects', expect.objectContaining({
name: 'team-project',
members: ['alice@test.com', 'bob@test.com'],
}));
});
it('defaults proxy mode to direct', async () => {
const cmd = createCreateCommand({ client, log });
await cmd.parseAsync(['project', 'basic'], { from: 'user' });
@@ -71,7 +57,7 @@ describe('project with new fields', () => {
});
describe('get projects shows new columns', () => {
it('shows MODE, SERVERS, MEMBERS columns', async () => {
it('shows MODE and SERVERS columns', async () => {
const deps = {
output: [] as string[],
fetchResource: vi.fn(async () => [{
@@ -81,7 +67,6 @@ describe('project with new fields', () => {
proxyMode: 'filtered',
ownerId: 'user-1',
servers: [{ server: { name: 'grafana' } }, { server: { name: 'ha' } }],
members: [{ user: { email: 'alice@test.com' } }],
}]),
log: (...args: string[]) => deps.output.push(args.join(' ')),
};
@@ -91,13 +76,12 @@ describe('project with new fields', () => {
const text = deps.output.join('\n');
expect(text).toContain('MODE');
expect(text).toContain('SERVERS');
expect(text).toContain('MEMBERS');
expect(text).toContain('smart-home');
});
});
describe('describe project shows full detail', () => {
it('shows servers and members', async () => {
it('shows servers and proxy config', async () => {
const deps = {
output: [] as string[],
client: mockClient(),
@@ -113,10 +97,6 @@ describe('project with new fields', () => {
{ server: { name: 'my-grafana' } },
{ server: { name: 'my-ha' } },
],
members: [
{ user: { email: 'alice@test.com' } },
{ user: { email: 'bob@test.com' } },
],
createdAt: '2025-01-01',
updatedAt: '2025-01-01',
})),
@@ -131,8 +111,6 @@ describe('project with new fields', () => {
expect(text).toContain('gemini-cli');
expect(text).toContain('my-grafana');
expect(text).toContain('my-ha');
expect(text).toContain('alice@test.com');
expect(text).toContain('bob@test.com');
});
});
});

@@ -0,0 +1,176 @@
import { describe, it, expect } from 'vitest';
import { readFileSync } from 'node:fs';
import { join, dirname } from 'node:path';
import { fileURLToPath } from 'node:url';
const root = join(dirname(fileURLToPath(import.meta.url)), '..', '..', '..');
const fishFile = readFileSync(join(root, 'completions', 'mcpctl.fish'), 'utf-8');
const bashFile = readFileSync(join(root, 'completions', 'mcpctl.bash'), 'utf-8');
describe('fish completions', () => {
it('erases stale completions at the top', () => {
const lines = fishFile.split('\n');
const firstComplete = lines.findIndex((l) => l.startsWith('complete '));
expect(lines[firstComplete]).toContain('-e');
});
it('does not offer resource types without __mcpctl_needs_resource_type guard', () => {
const resourceTypes = ['servers', 'instances', 'secrets', 'templates', 'projects', 'users', 'groups', 'rbac'];
const lines = fishFile.split('\n').filter((l) => l.startsWith('complete '));
for (const line of lines) {
// Find lines that offer resource types as positional args
const offersResourceType = resourceTypes.some((r) => {
// Match `-a "...servers..."` or `-a 'servers projects'`
const aMatch = line.match(/-a\s+['"]([^'"]+)['"]/);
if (!aMatch) return false;
return aMatch[1].split(/\s+/).includes(r);
});
if (!offersResourceType) continue;
// Skip the help completions line and the -e line
if (line.includes('__fish_seen_subcommand_from help')) continue;
// Skip project-scoped command offerings (those offer commands, not resource types)
if (line.includes('attach-server') || line.includes('detach-server')) continue;
// Skip lines that offer commands (not resource types)
if (line.includes("-d 'Show") || line.includes("-d 'Manage") || line.includes("-d 'Authenticate") ||
line.includes("-d 'Log out'") || line.includes("-d 'Get instance") || line.includes("-d 'Create a resource'") ||
line.includes("-d 'Edit a resource'") || line.includes("-d 'Apply") || line.includes("-d 'Backup") ||
line.includes("-d 'Restore") || line.includes("-d 'List resources") || line.includes("-d 'Delete a resource'")) continue;
// Lines offering resource types MUST have __mcpctl_needs_resource_type in their condition
expect(line, `Resource type completion missing guard: ${line}`).toContain('__mcpctl_needs_resource_type');
}
});
it('resource name completions require resource type to be selected', () => {
const lines = fishFile.split('\n').filter((l) => l.startsWith('complete') && l.includes('__mcpctl_resource_names'));
expect(lines.length).toBeGreaterThan(0);
for (const line of lines) {
expect(line).toContain('not __mcpctl_needs_resource_type');
}
});
it('defines --project option', () => {
expect(fishFile).toContain("complete -c mcpctl -l project");
});
it('attach-server command only shows with --project', () => {
// Only check lines that OFFER attach-server as a command (via -a attach-server), not argument completions
const lines = fishFile.split('\n').filter((l) =>
l.startsWith('complete') && l.includes("-a attach-server"));
expect(lines.length).toBeGreaterThan(0);
for (const line of lines) {
expect(line).toContain('__mcpctl_has_project');
}
});
it('detach-server command only shows with --project', () => {
const lines = fishFile.split('\n').filter((l) =>
l.startsWith('complete') && l.includes("-a detach-server"));
expect(lines.length).toBeGreaterThan(0);
for (const line of lines) {
expect(line).toContain('__mcpctl_has_project');
}
});
it('resource name functions use jq .[][].name to unwrap wrapped JSON and avoid nested matches', () => {
// API returns { "resources": [...] } not [...], so .[].name fails silently.
// Must use .[][].name to unwrap the outer object then iterate the array.
// Also must not use string match regex which matches nested name fields.
const resourceNamesFn = fishFile.match(/function __mcpctl_resource_names[\s\S]*?^end/m)?.[0] ?? '';
const projectNamesFn = fishFile.match(/function __mcpctl_project_names[\s\S]*?^end/m)?.[0] ?? '';
expect(resourceNamesFn, '__mcpctl_resource_names must use jq .[][].name').toContain("jq -r '.[][].name'");
expect(resourceNamesFn, '__mcpctl_resource_names must not use string match on name').not.toMatch(/string match.*"name"/);
expect(projectNamesFn, '__mcpctl_project_names must use jq .[][].name').toContain("jq -r '.[][].name'");
expect(projectNamesFn, '__mcpctl_project_names must not use string match on name').not.toMatch(/string match.*"name"/);
});
it('instances use server.name instead of name', () => {
const resourceNamesFn = fishFile.match(/function __mcpctl_resource_names[\s\S]*?^end/m)?.[0] ?? '';
expect(resourceNamesFn, 'must handle instances via server.name').toContain('.server.name');
});
it('attach-server completes with available (unattached) servers and guards against repeat', () => {
const attachLine = fishFile.split('\n').find((l) =>
l.startsWith('complete') && l.includes('__fish_seen_subcommand_from attach-server'));
expect(attachLine, 'attach-server argument completion must exist').toBeDefined();
expect(attachLine, 'attach-server must use __mcpctl_available_servers').toContain('__mcpctl_available_servers');
expect(attachLine, 'attach-server must guard with __mcpctl_needs_server_arg').toContain('__mcpctl_needs_server_arg');
});
it('detach-server completes with project servers and guards against repeat', () => {
const detachLine = fishFile.split('\n').find((l) =>
l.startsWith('complete') && l.includes('__fish_seen_subcommand_from detach-server'));
expect(detachLine, 'detach-server argument completion must exist').toBeDefined();
expect(detachLine, 'detach-server must use __mcpctl_project_servers').toContain('__mcpctl_project_servers');
expect(detachLine, 'detach-server must guard with __mcpctl_needs_server_arg').toContain('__mcpctl_needs_server_arg');
});
it('non-project commands do not show with --project', () => {
const nonProjectCmds = ['status', 'login', 'logout', 'config', 'apply', 'backup', 'restore'];
const lines = fishFile.split('\n').filter((l) => l.startsWith('complete') && l.includes('-a '));
for (const cmd of nonProjectCmds) {
const cmdLines = lines.filter((l) => {
const aMatch = l.match(/-a\s+(\S+)/);
return aMatch && aMatch[1].replace(/['"]/g, '') === cmd;
});
for (const line of cmdLines) {
expect(line, `${cmd} should require 'not __mcpctl_has_project'`).toContain('not __mcpctl_has_project');
}
}
});
});
describe('bash completions', () => {
it('separates project commands from regular commands', () => {
expect(bashFile).toContain('project_commands=');
expect(bashFile).toContain('attach-server detach-server');
});
it('checks has_project before offering project commands', () => {
expect(bashFile).toContain('if $has_project');
expect(bashFile).toContain('$project_commands');
});
it('fetches resource names dynamically after resource type', () => {
expect(bashFile).toContain('_mcpctl_resource_names');
// get/describe/delete should use resource_names when resource_type is set
expect(bashFile).toMatch(/get\|describe\|delete\)[\s\S]*?_mcpctl_resource_names/);
});
it('attach-server filters out already-attached servers and guards against repeat', () => {
const attachBlock = bashFile.match(/attach-server\)[\s\S]*?return ;;/)?.[0] ?? '';
expect(attachBlock, 'attach-server must use _mcpctl_get_project_value').toContain('_mcpctl_get_project_value');
expect(attachBlock, 'attach-server must query project servers to exclude').toContain('--project');
expect(attachBlock, 'attach-server must check position to prevent repeat').toContain('cword - subcmd_pos');
});
it('detach-server shows only project servers and guards against repeat', () => {
const detachBlock = bashFile.match(/detach-server\)[\s\S]*?return ;;/)?.[0] ?? '';
expect(detachBlock, 'detach-server must use _mcpctl_get_project_value').toContain('_mcpctl_get_project_value');
expect(detachBlock, 'detach-server must query project servers').toContain('--project');
expect(detachBlock, 'detach-server must check position to prevent repeat').toContain('cword - subcmd_pos');
});
it('instances use server.name instead of name', () => {
const fnMatch = bashFile.match(/_mcpctl_resource_names\(\)[\s\S]*?\n\s*\}/)?.[0] ?? '';
expect(fnMatch, 'must handle instances via .server.name').toContain('.server.name');
});
it('defines --project option', () => {
expect(bashFile).toContain('--project');
});
it('resource name function uses jq .[][].name to unwrap wrapped JSON and avoid nested matches', () => {
const fnMatch = bashFile.match(/_mcpctl_resource_names\(\)[\s\S]*?\n\s*\}/)?.[0] ?? '';
expect(fnMatch, '_mcpctl_resource_names must use jq .[][].name').toContain("jq -r '.[][].name'");
expect(fnMatch, '_mcpctl_resource_names must not use grep on name').not.toMatch(/grep.*"name"/);
// Guard against .[].name (single bracket) which fails on wrapped JSON
expect(fnMatch, '_mcpctl_resource_names must not use .[].name (needs .[][].name)').not.toMatch(/jq.*'\.\[\]\.name'/);
});
});

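The jq assertions in the completion tests above can be mirrored in a small TypeScript sketch (illustrative only; the wrapped `{ "servers": [...] }` response shape is taken from the test comments):

```typescript
// jq's `.[][]` first iterates the wrapper object's values, then the inner
// array — the TypeScript analogue is Object.values(...).flat().
// A plain `.[].name` stops at the wrapper object and fails silently.

interface Named { name: string }

export function resourceNames(payload: Record<string, Named[]>): string[] {
  return Object.values(payload).flat().map((item) => item.name);
}

// Wrapped shape the API returns, per the test comments above.
export const wrapped: Record<string, Named[]> = {
  servers: [{ name: 'my-grafana' }, { name: 'my-ha' }],
};
```

Here `resourceNames(wrapped)` yields `['my-grafana', 'my-ha']`, the flat list the completion functions pipe into `complete -a`.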

@@ -0,0 +1,8 @@
-- DropForeignKey
ALTER TABLE "ProjectMember" DROP CONSTRAINT IF EXISTS "ProjectMember_projectId_fkey";
-- DropForeignKey
ALTER TABLE "ProjectMember" DROP CONSTRAINT IF EXISTS "ProjectMember_userId_fkey";
-- DropTable
DROP TABLE IF EXISTS "ProjectMember";


@@ -24,7 +24,6 @@ model User {
sessions Session[]
auditLogs AuditLog[]
ownedProjects Project[]
projectMemberships ProjectMember[]
groupMemberships GroupMember[]
@@index([email])
@@ -171,6 +170,7 @@ model Project {
id String @id @default(cuid())
name String @unique
description String @default("")
prompt String @default("")
proxyMode String @default("direct")
llmProvider String?
llmModel String?
@@ -179,9 +179,10 @@ model Project {
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
owner User @relation(fields: [ownerId], references: [id], onDelete: Cascade)
servers ProjectServer[]
members ProjectMember[]
owner User @relation(fields: [ownerId], references: [id], onDelete: Cascade)
servers ProjectServer[]
prompts Prompt[]
promptRequests PromptRequest[]
@@index([name])
@@index([ownerId])
@@ -199,18 +200,6 @@ model ProjectServer {
@@unique([projectId, serverId])
}
model ProjectMember {
id String @id @default(cuid())
projectId String
userId String
createdAt DateTime @default(now())
project Project @relation(fields: [projectId], references: [id], onDelete: Cascade)
user User @relation(fields: [userId], references: [id], onDelete: Cascade)
@@unique([projectId, userId])
}
// ── MCP Instances (running containers) ──
model McpInstance {
@@ -241,6 +230,41 @@ enum InstanceStatus {
ERROR
}
// ── Prompts (approved content resources) ──
model Prompt {
id String @id @default(cuid())
name String
content String @db.Text
projectId String?
version Int @default(1)
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
project Project? @relation(fields: [projectId], references: [id], onDelete: Cascade)
@@unique([name, projectId])
@@index([projectId])
}
// ── Prompt Requests (pending proposals from LLM sessions) ──
model PromptRequest {
id String @id @default(cuid())
name String
content String @db.Text
projectId String?
createdBySession String?
createdByUserId String?
createdAt DateTime @default(now())
project Project? @relation(fields: [projectId], references: [id], onDelete: Cascade)
@@unique([name, projectId])
@@index([projectId])
@@index([createdBySession])
}
// ── Audit Logs ──
model AuditLog {


@@ -18,6 +18,8 @@ import {
UserRepository,
GroupRepository,
} from './repositories/index.js';
import { PromptRepository } from './repositories/prompt.repository.js';
import { PromptRequestRepository } from './repositories/prompt-request.repository.js';
import {
McpServerService,
SecretService,
@@ -39,6 +41,7 @@ import {
GroupService,
} from './services/index.js';
import type { RbacAction } from './services/index.js';
import type { UpdateRbacDefinitionInput } from './validation/rbac-definition.schema.js';
import { createAuthMiddleware } from './middleware/auth.js';
import {
registerMcpServerRoutes,
@@ -55,6 +58,8 @@ import {
registerUserRoutes,
registerGroupRoutes,
} from './routes/index.js';
import { registerPromptRoutes } from './routes/prompts.js';
import { PromptService } from './services/prompt.service.js';
type PermissionCheck =
| { kind: 'resource'; resource: string; action: RbacAction; resourceName?: string }
@@ -87,11 +92,50 @@ function mapUrlToPermission(method: string, url: string): PermissionCheck {
'rbac': 'rbac',
'audit-logs': 'rbac',
'mcp': 'servers',
'prompts': 'prompts',
'promptrequests': 'promptrequests',
};
const resource = resourceMap[segment];
if (resource === undefined) return { kind: 'skip' };
// Special case: /api/v1/promptrequests/:id/approve → needs both delete+promptrequests and create+prompts
// We check delete on promptrequests (the harder permission); create on prompts is checked in the service layer
const approveMatch = url.match(/^\/api\/v1\/promptrequests\/([^/?]+)\/approve/);
if (approveMatch?.[1]) {
return { kind: 'resource', resource: 'promptrequests', action: 'delete', resourceName: approveMatch[1] };
}
// Special case: /api/v1/projects/:name/prompts/visible → view prompts
const visiblePromptsMatch = url.match(/^\/api\/v1\/projects\/([^/?]+)\/prompts\/visible/);
if (visiblePromptsMatch?.[1]) {
return { kind: 'resource', resource: 'prompts', action: 'view' };
}
// Special case: /api/v1/projects/:name/promptrequests → create promptrequests
const projectPromptrequestsMatch = url.match(/^\/api\/v1\/projects\/([^/?]+)\/promptrequests/);
if (projectPromptrequestsMatch?.[1] && method === 'POST') {
return { kind: 'resource', resource: 'promptrequests', action: 'create' };
}
// Special case: /api/v1/projects/:id/instructions → view projects
const instructionsMatch = url.match(/^\/api\/v1\/projects\/([^/?]+)\/instructions/);
if (instructionsMatch?.[1]) {
return { kind: 'resource', resource: 'projects', action: 'view', resourceName: instructionsMatch[1] };
}
// Special case: /api/v1/projects/:id/mcp-config → requires 'expose' permission
const mcpConfigMatch = url.match(/^\/api\/v1\/projects\/([^/?]+)\/mcp-config/);
if (mcpConfigMatch?.[1]) {
return { kind: 'resource', resource: 'projects', action: 'expose', resourceName: mcpConfigMatch[1] };
}
// Special case: /api/v1/projects/:id/servers — attach/detach requires 'edit'
const projectServersMatch = url.match(/^\/api\/v1\/projects\/([^/?]+)\/servers/);
if (projectServersMatch?.[1] && method !== 'GET') {
return { kind: 'resource', resource: 'projects', action: 'edit', resourceName: projectServersMatch[1] };
}
// Map HTTP method to action
let action: RbacAction;
switch (method) {
@@ -119,6 +163,47 @@ function mapUrlToPermission(method: string, url: string): PermissionCheck {
return check;
}
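Each special case above boils down to an anchored regex whose capture group extracts the resource id for name-scoped RBAC. A minimal sketch of the approve case (hypothetical helper name; the regex is copied from the code above):

```typescript
// Anchored regex from the approve special case; the capture group
// ([^/?]+) grabs the request id up to the next slash or query string.
const APPROVE_RE = /^\/api\/v1\/promptrequests\/([^/?]+)\/approve/;

export function matchApprove(url: string): string | undefined {
  return url.match(APPROVE_RE)?.[1];
}
```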
/**
* Migrate legacy 'admin' role bindings → granular roles.
* Old format: { role: 'admin', resource: '*' }
* New format: { role: 'edit', resource: '*' }, { role: 'run', resource: '*' },
* plus operation bindings for impersonate, logs, backup, restore, audit-purge
*/
async function migrateAdminRole(rbacRepo: InstanceType<typeof RbacDefinitionRepository>): Promise<void> {
const definitions = await rbacRepo.findAll();
for (const def of definitions) {
const bindings = def.roleBindings as Array<Record<string, unknown>>;
const hasAdminRole = bindings.some((b) => b['role'] === 'admin');
if (!hasAdminRole) continue;
// Replace admin bindings with granular equivalents
const newBindings: Array<Record<string, string>> = [];
for (const b of bindings) {
if (b['role'] === 'admin') {
const resource = b['resource'] as string;
newBindings.push({ role: 'edit', resource });
newBindings.push({ role: 'run', resource });
} else {
newBindings.push(b as Record<string, string>);
}
}
// Add operation bindings (idempotent — only for wildcard admin)
const hasWildcard = bindings.some((b) => b['role'] === 'admin' && b['resource'] === '*');
if (hasWildcard) {
const ops = ['impersonate', 'logs', 'backup', 'restore', 'audit-purge'];
for (const op of ops) {
if (!newBindings.some((b) => b['action'] === op)) {
newBindings.push({ role: 'run', action: op });
}
}
}
await rbacRepo.update(def.id, { roleBindings: newBindings as UpdateRbacDefinitionInput['roleBindings'] });
// eslint-disable-next-line no-console
console.log(`mcpd: migrated RBAC '${def.name}' from admin → granular roles`);
}
}
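At its core, the migration above is a pure transformation over the bindings array; a sketch under the same shapes (hypothetical function name, separated from the repository I/O):

```typescript
// Pure sketch of the admin → granular expansion performed by migrateAdminRole.
type Binding = Record<string, string>;

export function expandAdminBindings(bindings: Binding[]): Binding[] {
  const out: Binding[] = [];
  for (const b of bindings) {
    if (b['role'] === 'admin') {
      // One admin binding becomes an edit and a run binding on the same resource.
      out.push({ role: 'edit', resource: b['resource'] });
      out.push({ role: 'run', resource: b['resource'] });
    } else {
      out.push(b);
    }
  }
  // Wildcard admins also gain operation bindings (idempotent check on 'action').
  if (bindings.some((b) => b['role'] === 'admin' && b['resource'] === '*')) {
    for (const op of ['impersonate', 'logs', 'backup', 'restore', 'audit-purge']) {
      if (!out.some((b) => b['action'] === op)) out.push({ role: 'run', action: op });
    }
  }
  return out;
}
```

A single `{ role: 'admin', resource: '*' }` binding thus expands to seven bindings: edit and run on `*`, plus the five operation grants.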
async function main(): Promise<void> {
const config = loadConfigFromEnv();
@@ -161,6 +246,18 @@ async function main(): Promise<void> {
const userRepo = new UserRepository(prisma);
const groupRepo = new GroupRepository(prisma);
// CUID detection for RBAC name resolution
const CUID_RE = /^c[^\s-]{8,}$/i;
const nameResolvers: Record<string, { findById(id: string): Promise<{ name: string } | null> }> = {
servers: serverRepo,
secrets: secretRepo,
projects: projectRepo,
groups: groupRepo,
};
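The CUID heuristic is worth a note: cuid() ids begin with `c` and contain no whitespace or hyphens, so human-chosen names like `my-grafana` fall through to the name path. A sketch (regex copied from above):

```typescript
// cuid() ids start with 'c' followed by 8+ characters with no
// whitespace or hyphens; typical resource names fail this test.
const CUID_RE = /^c[^\s-]{8,}$/i;

export function looksLikeCuid(id: string): boolean {
  return CUID_RE.test(id);
}
```

Names that happen to start with `c` and avoid hyphens for 8+ characters would be misclassified, which is why the resolver falls back gracefully when no entity is found by id.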
// Migrate legacy 'admin' role → granular roles
await migrateAdminRole(rbacDefinitionRepo);
// Orchestrator
const orchestrator = new DockerContainerManager();
@@ -169,7 +266,7 @@ async function main(): Promise<void> {
const instanceService = new InstanceService(instanceRepo, serverRepo, orchestrator, secretRepo);
serverService.setInstanceService(instanceService);
const secretService = new SecretService(secretRepo);
const projectService = new ProjectService(projectRepo, serverRepo, secretRepo, userRepo);
const projectService = new ProjectService(projectRepo, serverRepo, secretRepo);
const auditLogService = new AuditLogService(auditLogRepo);
const metricsCollector = new MetricsCollector();
const healthAggregator = new HealthAggregator(metricsCollector, orchestrator);
@@ -177,11 +274,14 @@ async function main(): Promise<void> {
const restoreService = new RestoreService(serverRepo, projectRepo, secretRepo, userRepo, groupRepo, rbacDefinitionRepo);
const authService = new AuthService(prisma);
const templateService = new TemplateService(templateRepo);
const mcpProxyService = new McpProxyService(instanceRepo, serverRepo);
const mcpProxyService = new McpProxyService(instanceRepo, serverRepo, orchestrator);
const rbacDefinitionService = new RbacDefinitionService(rbacDefinitionRepo);
const rbacService = new RbacService(rbacDefinitionRepo, prisma);
const userService = new UserService(userRepo);
const groupService = new GroupService(groupRepo, userRepo);
const promptRepo = new PromptRepository(prisma);
const promptRequestRepo = new PromptRequestRepository(prisma);
const promptService = new PromptService(promptRepo, promptRequestRepo, projectRepo);
// Auth middleware for global hooks
const authMiddleware = createAuthMiddleware({
@@ -228,11 +328,27 @@ async function main(): Promise<void> {
const check = mapUrlToPermission(request.method, url);
if (check.kind === 'skip') return;
// Extract service account identity from header (sent by mcplocal)
const saHeader = request.headers['x-service-account'];
const serviceAccountName = typeof saHeader === 'string' ? saHeader : undefined;
let allowed: boolean;
if (check.kind === 'operation') {
allowed = await rbacService.canRunOperation(request.userId, check.operation);
allowed = await rbacService.canRunOperation(request.userId, check.operation, serviceAccountName);
} else {
allowed = await rbacService.canAccess(request.userId, check.action, check.resource, check.resourceName);
// Resolve CUID → human name for name-scoped RBAC bindings
if (check.resourceName !== undefined && CUID_RE.test(check.resourceName)) {
const resolver = nameResolvers[check.resource];
if (resolver) {
const entity = await resolver.findById(check.resourceName);
if (entity) check.resourceName = entity.name;
}
}
allowed = await rbacService.canAccess(request.userId, check.action, check.resource, check.resourceName, serviceAccountName);
// Compute scope for list filtering (used by preSerialization hook)
if (allowed && check.resourceName === undefined) {
request.rbacScope = await rbacService.getAllowedScope(request.userId, check.action, check.resource, serviceAccountName);
}
}
if (!allowed) {
reply.code(403).send({ error: 'Forbidden' });
@@ -257,6 +373,18 @@ async function main(): Promise<void> {
registerRbacRoutes(app, rbacDefinitionService);
registerUserRoutes(app, userService);
registerGroupRoutes(app, groupService);
registerPromptRoutes(app, promptService, projectRepo);
// ── RBAC list filtering hook ──
// Filters array responses to only include resources the user is allowed to see.
app.addHook('preSerialization', async (request, _reply, payload) => {
if (!request.rbacScope || request.rbacScope.wildcard) return payload;
if (!Array.isArray(payload)) return payload;
return (payload as Array<Record<string, unknown>>).filter((item) => {
const name = item['name'];
return typeof name === 'string' && request.rbacScope!.names.has(name);
});
});
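The hook's filtering logic can be isolated as a pure function — a sketch assuming the same scope shape (`{ wildcard, names }`) declared in the fastify module augmentation:

```typescript
// Wildcard scopes pass everything through; otherwise only items whose
// string `name` is in the allowed set survive serialization.
interface Scope { wildcard: boolean; names: Set<string> }

export function filterByScope<T extends { name?: unknown }>(payload: T[], scope: Scope): T[] {
  if (scope.wildcard) return payload;
  return payload.filter((item) => typeof item.name === 'string' && scope.names.has(item.name));
}
```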
// Start
await app.listen({ port: config.port, host: config.host });


@@ -7,6 +7,7 @@ export interface AuthDeps {
declare module 'fastify' {
interface FastifyRequest {
userId?: string;
rbacScope?: { wildcard: boolean; names: Set<string> };
}
}


@@ -1,24 +1,23 @@
import type { PrismaClient, Project } from '@prisma/client';
export interface ProjectWithRelations extends Project {
servers: Array<{ id: string; server: { id: string; name: string } }>;
members: Array<{ id: string; user: { id: string; email: string; name: string | null } }>;
servers: Array<{ id: string; projectId: string; serverId: string; server: Record<string, unknown> & { id: string; name: string } }>;
}
const PROJECT_INCLUDE = {
servers: { include: { server: { select: { id: true, name: true } } } },
members: { include: { user: { select: { id: true, email: true, name: true } } } },
servers: { include: { server: true } },
} as const;
export interface IProjectRepository {
findAll(ownerId?: string): Promise<ProjectWithRelations[]>;
findById(id: string): Promise<ProjectWithRelations | null>;
findByName(name: string): Promise<ProjectWithRelations | null>;
create(data: { name: string; description: string; ownerId: string; proxyMode: string; llmProvider?: string; llmModel?: string }): Promise<ProjectWithRelations>;
create(data: { name: string; description: string; prompt?: string; ownerId: string; proxyMode: string; llmProvider?: string; llmModel?: string }): Promise<ProjectWithRelations>;
update(id: string, data: Record<string, unknown>): Promise<ProjectWithRelations>;
delete(id: string): Promise<void>;
setServers(projectId: string, serverIds: string[]): Promise<void>;
setMembers(projectId: string, userIds: string[]): Promise<void>;
addServer(projectId: string, serverId: string): Promise<void>;
removeServer(projectId: string, serverId: string): Promise<void>;
}
export class ProjectRepository implements IProjectRepository {
@@ -37,13 +36,14 @@ export class ProjectRepository implements IProjectRepository {
return this.prisma.project.findUnique({ where: { name }, include: PROJECT_INCLUDE }) as unknown as Promise<ProjectWithRelations | null>;
}
async create(data: { name: string; description: string; ownerId: string; proxyMode: string; llmProvider?: string; llmModel?: string }): Promise<ProjectWithRelations> {
async create(data: { name: string; description: string; prompt?: string; ownerId: string; proxyMode: string; llmProvider?: string; llmModel?: string }): Promise<ProjectWithRelations> {
const createData: Record<string, unknown> = {
name: data.name,
description: data.description,
ownerId: data.ownerId,
proxyMode: data.proxyMode,
};
if (data.prompt !== undefined) createData['prompt'] = data.prompt;
if (data.llmProvider !== undefined) createData['llmProvider'] = data.llmProvider;
if (data.llmModel !== undefined) createData['llmModel'] = data.llmModel;
@@ -76,14 +76,17 @@ export class ProjectRepository implements IProjectRepository {
});
}
async setMembers(projectId: string, userIds: string[]): Promise<void> {
await this.prisma.$transaction(async (tx) => {
await tx.projectMember.deleteMany({ where: { projectId } });
if (userIds.length > 0) {
await tx.projectMember.createMany({
data: userIds.map((userId) => ({ projectId, userId })),
});
}
async addServer(projectId: string, serverId: string): Promise<void> {
await this.prisma.projectServer.upsert({
where: { projectId_serverId: { projectId, serverId } },
create: { projectId, serverId },
update: {},
});
}
async removeServer(projectId: string, serverId: string): Promise<void> {
await this.prisma.projectServer.deleteMany({
where: { projectId, serverId },
});
}
}


@@ -0,0 +1,53 @@
import type { PrismaClient, PromptRequest } from '@prisma/client';
export interface IPromptRequestRepository {
findAll(projectId?: string): Promise<PromptRequest[]>;
findById(id: string): Promise<PromptRequest | null>;
findByNameAndProject(name: string, projectId: string | null): Promise<PromptRequest | null>;
findBySession(sessionId: string, projectId?: string): Promise<PromptRequest[]>;
create(data: { name: string; content: string; projectId?: string; createdBySession?: string; createdByUserId?: string }): Promise<PromptRequest>;
delete(id: string): Promise<void>;
}
export class PromptRequestRepository implements IPromptRequestRepository {
constructor(private readonly prisma: PrismaClient) {}
async findAll(projectId?: string): Promise<PromptRequest[]> {
if (projectId !== undefined) {
return this.prisma.promptRequest.findMany({
where: { OR: [{ projectId }, { projectId: null }] },
orderBy: { createdAt: 'desc' },
});
}
return this.prisma.promptRequest.findMany({ orderBy: { createdAt: 'desc' } });
}
async findById(id: string): Promise<PromptRequest | null> {
return this.prisma.promptRequest.findUnique({ where: { id } });
}
async findByNameAndProject(name: string, projectId: string | null): Promise<PromptRequest | null> {
return this.prisma.promptRequest.findUnique({
where: { name_projectId: { name, projectId: projectId ?? '' } },
});
}
async findBySession(sessionId: string, projectId?: string): Promise<PromptRequest[]> {
const where: Record<string, unknown> = { createdBySession: sessionId };
if (projectId !== undefined) {
where['OR'] = [{ projectId }, { projectId: null }];
}
return this.prisma.promptRequest.findMany({
where,
orderBy: { createdAt: 'desc' },
});
}
async create(data: { name: string; content: string; projectId?: string; createdBySession?: string; createdByUserId?: string }): Promise<PromptRequest> {
return this.prisma.promptRequest.create({ data });
}
async delete(id: string): Promise<void> {
await this.prisma.promptRequest.delete({ where: { id } });
}
}


@@ -0,0 +1,47 @@
import type { PrismaClient, Prompt } from '@prisma/client';
export interface IPromptRepository {
findAll(projectId?: string): Promise<Prompt[]>;
findById(id: string): Promise<Prompt | null>;
findByNameAndProject(name: string, projectId: string | null): Promise<Prompt | null>;
create(data: { name: string; content: string; projectId?: string }): Promise<Prompt>;
update(id: string, data: { content?: string }): Promise<Prompt>;
delete(id: string): Promise<void>;
}
export class PromptRepository implements IPromptRepository {
constructor(private readonly prisma: PrismaClient) {}
async findAll(projectId?: string): Promise<Prompt[]> {
if (projectId !== undefined) {
// Project-scoped + global prompts
return this.prisma.prompt.findMany({
where: { OR: [{ projectId }, { projectId: null }] },
orderBy: { name: 'asc' },
});
}
return this.prisma.prompt.findMany({ orderBy: { name: 'asc' } });
}
async findById(id: string): Promise<Prompt | null> {
return this.prisma.prompt.findUnique({ where: { id } });
}
async findByNameAndProject(name: string, projectId: string | null): Promise<Prompt | null> {
return this.prisma.prompt.findUnique({
where: { name_projectId: { name, projectId: projectId ?? '' } },
});
}
async create(data: { name: string; content: string; projectId?: string }): Promise<Prompt> {
return this.prisma.prompt.create({ data });
}
async update(id: string, data: { content?: string }): Promise<Prompt> {
return this.prisma.prompt.update({ where: { id }, data });
}
async delete(id: string): Promise<void> {
await this.prisma.prompt.delete({ where: { id } });
}
}


@@ -2,9 +2,9 @@ import type { FastifyInstance } from 'fastify';
import type { ProjectService } from '../services/project.service.js';
export function registerProjectRoutes(app: FastifyInstance, service: ProjectService): void {
app.get('/api/v1/projects', async (request) => {
// If authenticated, filter by owner; otherwise list all
return service.list(request.userId);
app.get('/api/v1/projects', async () => {
// RBAC preSerialization hook handles access filtering
return service.list();
});
app.get<{ Params: { id: string } }>('/api/v1/projects/:id', async (request) => {
@@ -34,9 +34,36 @@ export function registerProjectRoutes(app: FastifyInstance, service: ProjectServ
return service.generateMcpConfig(request.params.id);
});
// Attach a server to a project
app.post<{ Params: { id: string }; Body: { server: string } }>('/api/v1/projects/:id/servers', async (request) => {
const body = request.body as { server?: string };
if (!body.server) {
throw Object.assign(new Error('Missing "server" in request body'), { statusCode: 400 });
}
return service.addServer(request.params.id, body.server);
});
// Detach a server from a project
app.delete<{ Params: { id: string; serverName: string } }>('/api/v1/projects/:id/servers/:serverName', async (request, reply) => {
await service.removeServer(request.params.id, request.params.serverName);
reply.code(204);
});
// List servers in a project (for mcplocal discovery)
app.get<{ Params: { id: string } }>('/api/v1/projects/:id/servers', async (request) => {
const project = await service.resolveAndGet(request.params.id);
return project.servers.map((ps) => ps.server);
});
// Get project instructions for LLM (prompt + server list)
app.get<{ Params: { id: string } }>('/api/v1/projects/:id/instructions', async (request) => {
const project = await service.resolveAndGet(request.params.id);
return {
prompt: project.prompt,
servers: project.servers.map((ps) => ({
name: (ps.server as Record<string, unknown>).name as string,
description: (ps.server as Record<string, unknown>).description as string,
})),
};
});
}


@@ -0,0 +1,86 @@
import type { FastifyInstance } from 'fastify';
import type { PromptService } from '../services/prompt.service.js';
import type { IProjectRepository } from '../repositories/project.repository.js';
export function registerPromptRoutes(
app: FastifyInstance,
service: PromptService,
projectRepo: IProjectRepository,
): void {
// ── Prompts (approved) ──
app.get('/api/v1/prompts', async () => {
return service.listPrompts();
});
app.get<{ Params: { id: string } }>('/api/v1/prompts/:id', async (request) => {
return service.getPrompt(request.params.id);
});
app.post('/api/v1/prompts', async (request, reply) => {
const prompt = await service.createPrompt(request.body);
reply.code(201);
return prompt;
});
app.put<{ Params: { id: string } }>('/api/v1/prompts/:id', async (request) => {
return service.updatePrompt(request.params.id, request.body);
});
app.delete<{ Params: { id: string } }>('/api/v1/prompts/:id', async (request, reply) => {
await service.deletePrompt(request.params.id);
reply.code(204);
});
// ── Prompt Requests (pending proposals) ──
app.get('/api/v1/promptrequests', async () => {
return service.listPromptRequests();
});
app.get<{ Params: { id: string } }>('/api/v1/promptrequests/:id', async (request) => {
return service.getPromptRequest(request.params.id);
});
app.delete<{ Params: { id: string } }>('/api/v1/promptrequests/:id', async (request, reply) => {
await service.deletePromptRequest(request.params.id);
reply.code(204);
});
// Approve: atomic delete request → create prompt
app.post<{ Params: { id: string } }>('/api/v1/promptrequests/:id/approve', async (request) => {
return service.approve(request.params.id);
});
// ── Project-scoped endpoints (for mcplocal) ──
// Visible prompts: approved + session's pending requests
app.get<{ Params: { name: string }; Querystring: { session?: string } }>(
'/api/v1/projects/:name/prompts/visible',
async (request) => {
const project = await projectRepo.findByName(request.params.name);
if (!project) {
throw Object.assign(new Error(`Project not found: ${request.params.name}`), { statusCode: 404 });
}
return service.getVisiblePrompts(project.id, request.query.session);
},
);
// LLM propose: create a PromptRequest for a project
app.post<{ Params: { name: string } }>(
'/api/v1/projects/:name/promptrequests',
async (request, reply) => {
const project = await projectRepo.findByName(request.params.name);
if (!project) {
throw Object.assign(new Error(`Project not found: ${request.params.name}`), { statusCode: 404 });
}
const body = request.body as Record<string, unknown>;
const req = await service.propose({
...body,
projectId: project.id,
});
reply.code(201);
return req;
},
);
}


@@ -43,7 +43,6 @@ export interface BackupProject {
llmProvider?: string | null;
llmModel?: string | null;
serverNames?: string[];
members?: string[];
}
export interface BackupUser {
@@ -120,7 +119,6 @@ export class BackupService {
llmProvider: proj.llmProvider,
llmModel: proj.llmModel,
serverNames: proj.servers.map((ps) => ps.server.name),
members: proj.members.map((pm) => pm.user.email),
}));
}


@@ -260,15 +260,11 @@ export class RestoreService {
if (project.llmModel !== undefined) updateData['llmModel'] = project.llmModel;
await this.projectRepo.update(existing.id, updateData);
// Re-link servers and members
// Re-link servers
if (project.serverNames && project.serverNames.length > 0) {
const serverIds = await this.resolveServerNames(project.serverNames);
await this.projectRepo.setServers(existing.id, serverIds);
}
if (project.members && project.members.length > 0 && this.userRepo) {
const memberData = await this.resolveProjectMembers(project.members);
await this.projectRepo.setMembers(existing.id, memberData);
}
result.projectsCreated++;
continue;
@@ -289,11 +285,6 @@ export class RestoreService {
const serverIds = await this.resolveServerNames(project.serverNames);
await this.projectRepo.setServers(created.id, serverIds);
}
// Link members
if (project.members && project.members.length > 0 && this.userRepo) {
const memberData = await this.resolveProjectMembers(project.members);
await this.projectRepo.setMembers(created.id, memberData);
}
result.projectsCreated++;
} catch (err) {
@@ -359,15 +350,4 @@ export class RestoreService {
return ids;
}
/** Resolve project member emails to user IDs. */
private async resolveProjectMembers(
members: string[],
): Promise<string[]> {
const resolved: string[] = [];
for (const email of members) {
const user = await this.userRepo!.findByEmail(email);
if (user) resolved.push(user.id);
}
return resolved;
}
}

View File

@@ -29,6 +29,6 @@ export { HealthProbeRunner } from './health-probe.service.js';
export type { HealthCheckSpec, ProbeResult } from './health-probe.service.js';
export { RbacDefinitionService } from './rbac-definition.service.js';
export { RbacService } from './rbac.service.js';
export type { RbacAction, Permission } from './rbac.service.js';
export type { RbacAction, Permission, AllowedScope } from './rbac.service.js';
export { UserService } from './user.service.js';
export { GroupService } from './group.service.js';

View File

@@ -1,7 +1,10 @@
import type { McpInstance } from '@prisma/client';
import type { McpInstance, McpServer } from '@prisma/client';
import type { IMcpInstanceRepository, IMcpServerRepository } from '../repositories/interfaces.js';
import type { McpOrchestrator } from './orchestrator.js';
import { NotFoundError } from './mcp-server.service.js';
import { InvalidStateError } from './instance.service.js';
import { sendViaSse } from './transport/sse-client.js';
import { sendViaStdio } from './transport/stdio-client.js';
export interface McpProxyRequest {
serverId: string;
@@ -38,17 +41,21 @@ export class McpProxyService {
constructor(
private readonly instanceRepo: IMcpInstanceRepository,
private readonly serverRepo: IMcpServerRepository,
private readonly orchestrator?: McpOrchestrator,
) {}
async execute(request: McpProxyRequest): Promise<McpProxyResponse> {
const server = await this.serverRepo.findById(request.serverId);
// External server: proxy directly to externalUrl
if (server?.externalUrl) {
return this.sendToExternal(server.id, server.externalUrl, request.method, request.params);
if (!server) {
throw new NotFoundError(`Server '${request.serverId}' not found`);
}
// Managed server: find running instance
// External server: proxy directly to externalUrl
if (server.externalUrl) {
return this.sendToExternal(server, request.method, request.params);
}
// Managed server: find running instance and dispatch by transport
const instances = await this.instanceRepo.findAll(request.serverId);
const running = instances.find((i) => i.status === 'RUNNING');
@@ -56,20 +63,95 @@ export class McpProxyService {
throw new NotFoundError(`No running instance found for server '${request.serverId}'`);
}
if (running.port === null || running.port === undefined) {
throw new InvalidStateError(
`Running instance '${running.id}' for server '${request.serverId}' has no port assigned`,
);
}
return this.sendJsonRpc(running, request.method, request.params);
return this.sendToManaged(server, running, request.method, request.params);
}
/**
* Send a JSON-RPC request to an external MCP server.
* Handles streamable-http protocol (session management + SSE response parsing).
* Send to an external MCP server. Dispatches based on transport type.
*/
private async sendToExternal(
server: McpServer,
method: string,
params?: Record<string, unknown>,
): Promise<McpProxyResponse> {
const url = server.externalUrl as string;
if (server.transport === 'SSE') {
return sendViaSse(url, method, params);
}
// STREAMABLE_HTTP (default for external)
return this.sendStreamableHttp(server.id, url, method, params);
}
/**
* Send to a managed (containerized) MCP server. Dispatches based on transport type.
*/
private async sendToManaged(
server: McpServer,
instance: McpInstance,
method: string,
params?: Record<string, unknown>,
): Promise<McpProxyResponse> {
const transport = server.transport as string;
// STDIO: use docker exec
if (transport === 'STDIO') {
if (!this.orchestrator) {
throw new InvalidStateError('Orchestrator required for STDIO transport');
}
if (!instance.containerId) {
throw new InvalidStateError(`Instance '${instance.id}' has no container ID`);
}
const packageName = server.packageName as string | null;
if (!packageName) {
throw new InvalidStateError(`Server '${server.id}' has no package name for STDIO transport`);
}
return sendViaStdio(this.orchestrator, instance.containerId, packageName, method, params);
}
// SSE or STREAMABLE_HTTP: need a base URL
const baseUrl = await this.resolveBaseUrl(instance, server);
if (transport === 'SSE') {
return sendViaSse(baseUrl, method, params);
}
// STREAMABLE_HTTP (default)
return this.sendStreamableHttp(server.id, baseUrl, method, params);
}
/**
* Resolve the base URL for an HTTP-based managed server.
* Prefers container internal IP on Docker network, falls back to localhost:port.
*/
private async resolveBaseUrl(instance: McpInstance, server: McpServer): Promise<string> {
const containerPort = (server.containerPort as number | null) ?? 3000;
if (this.orchestrator && instance.containerId) {
try {
const containerInfo = await this.orchestrator.inspectContainer(instance.containerId);
if (containerInfo.ip) {
return `http://${containerInfo.ip}:${containerPort}`;
}
} catch {
// Fall through to localhost
}
}
if (instance.port !== null && instance.port !== undefined) {
return `http://localhost:${instance.port}`;
}
throw new InvalidStateError(
`Cannot resolve URL for instance '${instance.id}': no container IP or host port`,
);
}
/**
* Send via streamable-http protocol with session management.
*/
private async sendStreamableHttp(
serverId: string,
url: string,
method: string,
@@ -109,14 +191,14 @@ export class McpProxyService {
// Session expired? Clear and retry once
if (response.status === 400 || response.status === 404) {
this.sessions.delete(serverId);
return this.sendToExternal(serverId, url, method, params);
return this.sendStreamableHttp(serverId, url, method, params);
}
return {
jsonrpc: '2.0',
id: 1,
error: {
code: -32000,
message: `External MCP server returned HTTP ${response.status}: ${response.statusText}`,
message: `MCP server returned HTTP ${response.status}: ${response.statusText}`,
},
};
}
@@ -126,8 +208,7 @@ export class McpProxyService {
}
/**
* Initialize a streamable-http session with an external server.
* Sends `initialize` and `notifications/initialized`, caches the session ID.
* Initialize a streamable-http session with a server.
*/
private async initSession(serverId: string, url: string): Promise<void> {
const initBody = {
@@ -174,41 +255,4 @@ export class McpProxyService {
body: JSON.stringify({ jsonrpc: '2.0', method: 'notifications/initialized' }),
});
}
private async sendJsonRpc(
instance: McpInstance,
method: string,
params?: Record<string, unknown>,
): Promise<McpProxyResponse> {
const url = `http://localhost:${instance.port}`;
const body: Record<string, unknown> = {
jsonrpc: '2.0',
id: 1,
method,
};
if (params !== undefined) {
body.params = params;
}
const response = await fetch(url, {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify(body),
});
if (!response.ok) {
return {
jsonrpc: '2.0',
id: 1,
error: {
code: -32000,
message: `MCP server returned HTTP ${response.status}: ${response.statusText}`,
},
};
}
const result = (await response.json()) as McpProxyResponse;
return result;
}
}
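The `sendToManaged` / `sendToExternal` split above dispatches purely on the server's transport field: `STDIO` goes through `docker exec`, `SSE` through the SSE client, and anything else falls back to streamable-http. A pure sketch of that decision table (function and return labels are hypothetical, for illustration only):

```typescript
type Transport = 'STDIO' | 'SSE' | 'STREAMABLE_HTTP';

// Hypothetical pure version of the dispatch order in sendToManaged:
// STDIO → docker exec client, SSE → SSE client, default → streamable-http.
function pickManagedRoute(transport: Transport): 'stdio' | 'sse' | 'streamable-http' {
  if (transport === 'STDIO') return 'stdio';
  if (transport === 'SSE') return 'sse';
  return 'streamable-http'; // STREAMABLE_HTTP is the default for managed servers
}
```

Keeping the branch order identical to the service means the streamable-http path never needs an orchestrator, which is why only the STDIO branch checks for one.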

View File

@@ -1,7 +1,6 @@
import type { McpServer } from '@prisma/client';
import type { IProjectRepository, ProjectWithRelations } from '../repositories/project.repository.js';
import type { IMcpServerRepository, ISecretRepository } from '../repositories/interfaces.js';
import type { IUserRepository } from '../repositories/user.repository.js';
import { CreateProjectSchema, UpdateProjectSchema } from '../validation/project.schema.js';
import { NotFoundError, ConflictError } from './mcp-server.service.js';
import { resolveServerEnv } from './env-resolver.js';
@@ -13,7 +12,6 @@ export class ProjectService {
private readonly projectRepo: IProjectRepository,
private readonly serverRepo: IMcpServerRepository,
private readonly secretRepo: ISecretRepository,
private readonly userRepo: IUserRepository,
) {}
async list(ownerId?: string): Promise<ProjectWithRelations[]> {
@@ -52,25 +50,20 @@ export class ProjectService {
// Resolve server names to IDs
const serverIds = await this.resolveServerNames(data.servers);
// Resolve member emails to user IDs
const resolvedMembers = await this.resolveMemberEmails(data.members);
const project = await this.projectRepo.create({
name: data.name,
description: data.description,
prompt: data.prompt,
ownerId,
proxyMode: data.proxyMode,
...(data.llmProvider !== undefined ? { llmProvider: data.llmProvider } : {}),
...(data.llmModel !== undefined ? { llmModel: data.llmModel } : {}),
});
// Link servers and members
// Link servers
if (serverIds.length > 0) {
await this.projectRepo.setServers(project.id, serverIds);
}
if (resolvedMembers.length > 0) {
await this.projectRepo.setMembers(project.id, resolvedMembers);
}
// Re-fetch to include relations
return this.getById(project.id);
@@ -83,6 +76,7 @@ export class ProjectService {
// Build update data for scalar fields
const updateData: Record<string, unknown> = {};
if (data.description !== undefined) updateData['description'] = data.description;
if (data.prompt !== undefined) updateData['prompt'] = data.prompt;
if (data.proxyMode !== undefined) updateData['proxyMode'] = data.proxyMode;
if (data.llmProvider !== undefined) updateData['llmProvider'] = data.llmProvider;
if (data.llmModel !== undefined) updateData['llmModel'] = data.llmModel;
@@ -98,12 +92,6 @@ export class ProjectService {
await this.projectRepo.setServers(project.id, serverIds);
}
// Update members if provided
if (data.members !== undefined) {
const resolvedMembers = await this.resolveMemberEmails(data.members);
await this.projectRepo.setMembers(project.id, resolvedMembers);
}
// Re-fetch to include updated relations
return this.getById(project.id);
}
@@ -141,6 +129,22 @@ export class ProjectService {
return generateMcpConfig(serverEntries);
}
async addServer(idOrName: string, serverName: string): Promise<ProjectWithRelations> {
const project = await this.resolveAndGet(idOrName);
const server = await this.serverRepo.findByName(serverName);
if (server === null) throw new NotFoundError(`Server not found: ${serverName}`);
await this.projectRepo.addServer(project.id, server.id);
return this.getById(project.id);
}
async removeServer(idOrName: string, serverName: string): Promise<ProjectWithRelations> {
const project = await this.resolveAndGet(idOrName);
const server = await this.serverRepo.findByName(serverName);
if (server === null) throw new NotFoundError(`Server not found: ${serverName}`);
await this.projectRepo.removeServer(project.id, server.id);
return this.getById(project.id);
}
private async resolveServerNames(names: string[]): Promise<string[]> {
return Promise.all(names.map(async (name) => {
const server = await this.serverRepo.findByName(name);
@@ -148,12 +152,4 @@ export class ProjectService {
return server.id;
}));
}
private async resolveMemberEmails(emails: string[]): Promise<string[]> {
return Promise.all(emails.map(async (email) => {
const user = await this.userRepo.findByEmail(email);
if (user === null) throw new NotFoundError(`User not found: ${email}`);
return user.id;
}));
}
}

View File

@@ -0,0 +1,137 @@
import type { Prompt, PromptRequest } from '@prisma/client';
import type { IPromptRepository } from '../repositories/prompt.repository.js';
import type { IPromptRequestRepository } from '../repositories/prompt-request.repository.js';
import type { IProjectRepository } from '../repositories/project.repository.js';
import { CreatePromptSchema, UpdatePromptSchema, CreatePromptRequestSchema } from '../validation/prompt.schema.js';
import { NotFoundError } from './mcp-server.service.js';
export class PromptService {
constructor(
private readonly promptRepo: IPromptRepository,
private readonly promptRequestRepo: IPromptRequestRepository,
private readonly projectRepo: IProjectRepository,
) {}
// ── Prompt CRUD ──
async listPrompts(projectId?: string): Promise<Prompt[]> {
return this.promptRepo.findAll(projectId);
}
async getPrompt(id: string): Promise<Prompt> {
const prompt = await this.promptRepo.findById(id);
if (prompt === null) throw new NotFoundError(`Prompt not found: ${id}`);
return prompt;
}
async createPrompt(input: unknown): Promise<Prompt> {
const data = CreatePromptSchema.parse(input);
if (data.projectId) {
const project = await this.projectRepo.findById(data.projectId);
if (project === null) throw new NotFoundError(`Project not found: ${data.projectId}`);
}
const createData: { name: string; content: string; projectId?: string } = {
name: data.name,
content: data.content,
};
if (data.projectId !== undefined) createData.projectId = data.projectId;
return this.promptRepo.create(createData);
}
async updatePrompt(id: string, input: unknown): Promise<Prompt> {
const data = UpdatePromptSchema.parse(input);
await this.getPrompt(id);
const updateData: { content?: string } = {};
if (data.content !== undefined) updateData.content = data.content;
return this.promptRepo.update(id, updateData);
}
async deletePrompt(id: string): Promise<void> {
await this.getPrompt(id);
await this.promptRepo.delete(id);
}
// ── PromptRequest CRUD ──
async listPromptRequests(projectId?: string): Promise<PromptRequest[]> {
return this.promptRequestRepo.findAll(projectId);
}
async getPromptRequest(id: string): Promise<PromptRequest> {
const req = await this.promptRequestRepo.findById(id);
if (req === null) throw new NotFoundError(`PromptRequest not found: ${id}`);
return req;
}
async deletePromptRequest(id: string): Promise<void> {
await this.getPromptRequest(id);
await this.promptRequestRepo.delete(id);
}
// ── Propose (LLM creates a PromptRequest) ──
async propose(input: unknown): Promise<PromptRequest> {
const data = CreatePromptRequestSchema.parse(input);
if (data.projectId) {
const project = await this.projectRepo.findById(data.projectId);
if (project === null) throw new NotFoundError(`Project not found: ${data.projectId}`);
}
const createData: { name: string; content: string; projectId?: string; createdBySession?: string; createdByUserId?: string } = {
name: data.name,
content: data.content,
};
if (data.projectId !== undefined) createData.projectId = data.projectId;
if (data.createdBySession !== undefined) createData.createdBySession = data.createdBySession;
if (data.createdByUserId !== undefined) createData.createdByUserId = data.createdByUserId;
return this.promptRequestRepo.create(createData);
}
// ── Approve (delete PromptRequest → create Prompt) ──
async approve(requestId: string): Promise<Prompt> {
const req = await this.getPromptRequest(requestId);
// Create the approved prompt
const createData: { name: string; content: string; projectId?: string } = {
name: req.name,
content: req.content,
};
if (req.projectId !== null) createData.projectId = req.projectId;
const prompt = await this.promptRepo.create(createData);
// Delete the request
await this.promptRequestRepo.delete(requestId);
return prompt;
}
// ── Visibility for MCP (approved prompts + session's pending requests) ──
async getVisiblePrompts(
projectId?: string,
sessionId?: string,
): Promise<Array<{ name: string; content: string; type: 'prompt' | 'promptrequest' }>> {
const results: Array<{ name: string; content: string; type: 'prompt' | 'promptrequest' }> = [];
// Approved prompts (project-scoped + global)
const prompts = await this.promptRepo.findAll(projectId);
for (const p of prompts) {
results.push({ name: p.name, content: p.content, type: 'prompt' });
}
// Session's own pending requests
if (sessionId) {
const requests = await this.promptRequestRepo.findBySession(sessionId, projectId);
for (const r of requests) {
results.push({ name: r.name, content: r.content, type: 'promptrequest' });
}
}
return results;
}
}
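`getVisiblePrompts` merges two sources: approved prompts (always visible) and pending requests, which only the session that created them should see. The service delegates the session filter to `findBySession`; the sketch below inlines it as a pure function so the merge rule is testable in isolation (all names here are hypothetical):

```typescript
interface VisiblePrompt { name: string; content: string; type: 'prompt' | 'promptrequest'; }

// Hypothetical pure version of the visibility merge: approved prompts first,
// then only those pending requests owned by the calling session.
function mergeVisible(
  prompts: Array<{ name: string; content: string }>,
  requests: Array<{ name: string; content: string; createdBySession?: string }>,
  sessionId?: string,
): VisiblePrompt[] {
  const out: VisiblePrompt[] = prompts.map((p) => ({ name: p.name, content: p.content, type: 'prompt' as const }));
  if (sessionId) {
    for (const r of requests) {
      // A request is visible only to the session that proposed it.
      if (r.createdBySession === sessionId) {
        out.push({ name: r.name, content: r.content, type: 'promptrequest' });
      }
    }
  }
  return out;
}
```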

View File

@@ -8,7 +8,7 @@ import {
type RbacRoleBinding,
} from '../validation/rbac-definition.schema.js';
export type RbacAction = 'view' | 'create' | 'delete' | 'edit' | 'run';
export type RbacAction = 'view' | 'create' | 'delete' | 'edit' | 'run' | 'expose';
export interface ResourcePermission {
role: string;
@@ -23,13 +23,19 @@ export interface OperationPermission {
export type Permission = ResourcePermission | OperationPermission;
export interface AllowedScope {
wildcard: boolean;
names: Set<string>;
}
/** Maps roles to the set of actions they grant. */
const ROLE_ACTIONS: Record<string, readonly RbacAction[]> = {
edit: ['view', 'create', 'delete', 'edit'],
edit: ['view', 'create', 'delete', 'edit', 'expose'],
view: ['view'],
create: ['create'],
delete: ['delete'],
run: ['run'],
expose: ['expose', 'view'],
};
export class RbacService {
@@ -44,8 +50,8 @@ export class RbacService {
* If provided, name-scoped bindings only match when their name equals this.
* If omitted (listing), name-scoped bindings still grant access.
*/
async canAccess(userId: string, action: RbacAction, resource: string, resourceName?: string): Promise<boolean> {
const permissions = await this.getPermissions(userId);
async canAccess(userId: string, action: RbacAction, resource: string, resourceName?: string, serviceAccountName?: string): Promise<boolean> {
const permissions = await this.getPermissions(userId, serviceAccountName);
const normalized = normalizeResource(resource);
for (const perm of permissions) {
@@ -67,8 +73,8 @@ export class RbacService {
* Check whether a user is allowed to perform a named operation.
* Operations require an explicit 'run' role binding with a matching action.
*/
async canRunOperation(userId: string, operation: string): Promise<boolean> {
const permissions = await this.getPermissions(userId);
async canRunOperation(userId: string, operation: string, serviceAccountName?: string): Promise<boolean> {
const permissions = await this.getPermissions(userId, serviceAccountName);
for (const perm of permissions) {
if ('action' in perm && perm.role === 'run' && perm.action === operation) {
@@ -79,34 +85,63 @@ export class RbacService {
return false;
}
/**
* Determine the set of resource names a user may access for a given action+resource.
* Returns wildcard:true if any matching binding is unscoped (no name constraint).
* Returns wildcard:false with a set of allowed names if all bindings are name-scoped.
*/
async getAllowedScope(userId: string, action: RbacAction, resource: string, serviceAccountName?: string): Promise<AllowedScope> {
const permissions = await this.getPermissions(userId, serviceAccountName);
const normalized = normalizeResource(resource);
const names = new Set<string>();
for (const perm of permissions) {
if (!('resource' in perm)) continue;
const actions = ROLE_ACTIONS[perm.role];
if (actions === undefined) continue;
if (!actions.includes(action)) continue;
const permResource = normalizeResource(perm.resource);
if (permResource !== '*' && permResource !== normalized) continue;
// Unscoped binding → wildcard access to this resource
if (perm.name === undefined) return { wildcard: true, names: new Set() };
names.add(perm.name);
}
return { wildcard: false, names };
}
/**
* Collect all permissions for a user across all matching RbacDefinitions.
*/
async getPermissions(userId: string): Promise<Permission[]> {
async getPermissions(userId: string, serviceAccountName?: string): Promise<Permission[]> {
// 1. Resolve user email
const user = await this.prisma.user.findUnique({
where: { id: userId },
select: { email: true },
});
if (user === null) return [];
if (user === null && serviceAccountName === undefined) return [];
// 2. Resolve group names the user belongs to
const memberships = await this.prisma.groupMember.findMany({
where: { userId },
select: { group: { select: { name: true } } },
});
const groupNames = memberships.map((m) => m.group.name);
let groupNames: string[] = [];
if (user !== null) {
const memberships = await this.prisma.groupMember.findMany({
where: { userId },
select: { group: { select: { name: true } } },
});
groupNames = memberships.map((m) => m.group.name);
}
// 3. Load all RbacDefinitions
const definitions = await this.rbacRepo.findAll();
// 4. Find definitions where user is a subject
// 4. Find definitions where user or service account is a subject
const permissions: Permission[] = [];
for (const def of definitions) {
const subjects = def.subjects as RbacSubject[];
const matched = subjects.some((s) => {
if (s.kind === 'User') return s.name === user.email;
if (s.kind === 'User') return user !== null && s.name === user.email;
if (s.kind === 'Group') return groupNames.includes(s.name);
if (s.kind === 'ServiceAccount') return serviceAccountName !== undefined && s.name === serviceAccountName;
return false;
});
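The new `getAllowedScope` collapses a user's bindings into either wildcard access or an explicit name set: one matching unscoped binding short-circuits to wildcard, otherwise the scoped names accumulate. A pure sketch of that loop over an in-memory permission list (the `ROLE_GRANTS` subset and function names are hypothetical; the real service resolves permissions from the database first):

```typescript
type Perm = { role: string; resource: string; name?: string };

// Hypothetical subset of the role→actions table from the service above.
const ROLE_GRANTS: Record<string, readonly string[]> = {
  edit: ['view', 'create', 'delete', 'edit', 'expose'],
  view: ['view'],
  expose: ['expose', 'view'],
};

// Hypothetical pure version of getAllowedScope: an unscoped matching binding
// grants wildcard access; otherwise collect the explicitly bound names.
function allowedScope(perms: Perm[], action: string, resource: string) {
  const names = new Set<string>();
  for (const p of perms) {
    const grants = ROLE_GRANTS[p.role];
    if (!grants || !grants.includes(action)) continue;
    if (p.resource !== '*' && p.resource !== resource) continue;
    if (p.name === undefined) return { wildcard: true, names: new Set<string>() };
    names.add(p.name);
  }
  return { wildcard: false, names };
}
```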

View File

@@ -0,0 +1,2 @@
export { sendViaSse } from './sse-client.js';
export { sendViaStdio } from './stdio-client.js';

View File

@@ -0,0 +1,150 @@
import type { McpProxyResponse } from '../mcp-proxy-service.js';
/**
* SSE transport client for MCP servers using the legacy SSE protocol.
*
* Protocol: GET /sse → endpoint event with messages URL → POST to messages URL.
* Responses come back on the SSE stream, matched by JSON-RPC request ID.
*
* Each call opens a fresh SSE connection, initializes, sends the request,
* reads the response, and closes. Session caching may be added later.
*/
export async function sendViaSse(
baseUrl: string,
method: string,
params?: Record<string, unknown>,
timeoutMs = 30_000,
): Promise<McpProxyResponse> {
const controller = new AbortController();
const timer = setTimeout(() => controller.abort(), timeoutMs);
try {
// 1. GET /sse → SSE stream
const sseResp = await fetch(`${baseUrl}/sse`, {
method: 'GET',
headers: { 'Accept': 'text/event-stream' },
signal: controller.signal,
});
if (!sseResp.ok) {
return errorResponse(`SSE connect failed: HTTP ${sseResp.status}`);
}
const reader = sseResp.body?.getReader();
if (!reader) {
return errorResponse('No SSE stream body');
}
// 2. Read until we get the endpoint event with messages URL
const decoder = new TextDecoder();
let buffer = '';
let messagesUrl = '';
while (!messagesUrl) {
const { done, value } = await reader.read();
if (done) break;
buffer += decoder.decode(value, { stream: true });
for (const line of buffer.split('\n')) {
if (line.startsWith('data: ') && buffer.includes('event: endpoint')) {
const endpoint = line.slice(6).trim();
messagesUrl = endpoint.startsWith('http') ? endpoint : `${baseUrl}${endpoint}`;
}
}
const lines = buffer.split('\n');
buffer = lines[lines.length - 1] ?? '';
}
if (!messagesUrl) {
reader.cancel();
return errorResponse('No endpoint event from SSE stream');
}
const postHeaders = { 'Content-Type': 'application/json' };
// 3. Initialize
const initResp = await fetch(messagesUrl, {
method: 'POST',
headers: postHeaders,
body: JSON.stringify({
jsonrpc: '2.0',
id: 1,
method: 'initialize',
params: {
protocolVersion: '2024-11-05',
capabilities: {},
clientInfo: { name: 'mcpctl-proxy', version: '0.1.0' },
},
}),
signal: controller.signal,
});
if (!initResp.ok) {
reader.cancel();
return errorResponse(`SSE initialize failed: HTTP ${initResp.status}`);
}
// 4. Send notifications/initialized
await fetch(messagesUrl, {
method: 'POST',
headers: postHeaders,
body: JSON.stringify({ jsonrpc: '2.0', method: 'notifications/initialized' }),
signal: controller.signal,
});
// 5. Send the actual request
const requestId = 2;
await fetch(messagesUrl, {
method: 'POST',
headers: postHeaders,
body: JSON.stringify({
jsonrpc: '2.0',
id: requestId,
method,
...(params !== undefined ? { params } : {}),
}),
signal: controller.signal,
});
// 6. Read response from SSE stream (matched by request ID)
let responseBuffer = '';
const readTimeout = setTimeout(() => reader.cancel(), 5000);
while (true) {
const { done, value } = await reader.read();
if (done) break;
responseBuffer += decoder.decode(value, { stream: true });
for (const line of responseBuffer.split('\n')) {
if (line.startsWith('data: ')) {
try {
const parsed = JSON.parse(line.slice(6)) as McpProxyResponse;
if (parsed.id === requestId) {
clearTimeout(readTimeout);
reader.cancel();
return parsed;
}
} catch {
// Not valid JSON, skip
}
}
}
const respLines = responseBuffer.split('\n');
responseBuffer = respLines[respLines.length - 1] ?? '';
}
clearTimeout(readTimeout);
reader.cancel();
return errorResponse('No response received from SSE stream');
} finally {
clearTimeout(timer);
}
}
function errorResponse(message: string): McpProxyResponse {
return {
jsonrpc: '2.0',
id: 1,
error: { code: -32000, message },
};
}
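The response-matching loop in `sendViaSse` scans buffered SSE lines for a `data: ` payload whose JSON-RPC `id` equals the request ID, skipping anything that fails to parse. That scan can be isolated as a pure function (a hypothetical sketch, not the exported API):

```typescript
// Hypothetical extraction of the SSE response-matching scan above: look through
// complete lines for a `data: ` payload whose JSON-RPC id matches the request.
function findSseResponse(buffer: string, requestId: number): { id?: number } | null {
  for (const line of buffer.split('\n')) {
    if (!line.startsWith('data: ')) continue;
    try {
      const msg = JSON.parse(line.slice(6)) as { id?: number };
      if (msg.id === requestId) return msg;
    } catch {
      // Partial or non-JSON data line; keep scanning.
    }
  }
  return null;
}
```

Keeping the trailing partial line in the buffer between reads (as the client does) is what makes silently skipping parse failures safe: a truncated JSON line simply fails this round and completes on the next chunk.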

View File

@@ -0,0 +1,119 @@
import type { McpOrchestrator } from '../orchestrator.js';
import type { McpProxyResponse } from '../mcp-proxy-service.js';
/**
* STDIO transport client for MCP servers running as Docker containers.
*
* Runs `docker exec` with an inline Node.js script that spawns the MCP server
* binary, pipes JSON-RPC messages via stdin/stdout, and returns the response.
*
* Each call is self-contained: initialize → notifications/initialized → request → response.
*/
export async function sendViaStdio(
orchestrator: McpOrchestrator,
containerId: string,
packageName: string,
method: string,
params?: Record<string, unknown>,
timeoutMs = 30_000,
): Promise<McpProxyResponse> {
const initMsg = JSON.stringify({
jsonrpc: '2.0',
id: 1,
method: 'initialize',
params: {
protocolVersion: '2024-11-05',
capabilities: {},
clientInfo: { name: 'mcpctl-proxy', version: '0.1.0' },
},
});
const initializedMsg = JSON.stringify({
jsonrpc: '2.0',
method: 'notifications/initialized',
});
const requestBody: Record<string, unknown> = {
jsonrpc: '2.0',
id: 2,
method,
};
if (params !== undefined) {
requestBody.params = params;
}
const requestMsg = JSON.stringify(requestBody);
// Inline Node.js script that:
// 1. Spawns the MCP server binary via npx
// 2. Sends initialize → initialized → actual request via stdin
// 3. Reads stdout for JSON-RPC response with id: 2
// 4. Outputs the full JSON-RPC response to stdout
const probeScript = `
const { spawn } = require('child_process');
const proc = spawn('npx', ['--prefer-offline', '-y', ${JSON.stringify(packageName)}], { stdio: ['pipe', 'pipe', 'pipe'] });
let output = '';
let responded = false;
proc.stdout.on('data', d => {
output += d;
const lines = output.split('\\n');
for (const line of lines) {
if (!line.trim()) continue;
try {
const msg = JSON.parse(line);
if (msg.id === 2) {
responded = true;
process.stdout.write(JSON.stringify(msg), () => {
proc.kill();
process.exit(0);
});
}
} catch {}
}
output = lines[lines.length - 1] || '';
});
proc.stderr.on('data', () => {});
proc.on('error', e => { process.stdout.write(JSON.stringify({jsonrpc:'2.0',id:2,error:{code:-32000,message:e.message}})); process.exit(1); });
proc.on('exit', (code) => { if (!responded) { process.stdout.write(JSON.stringify({jsonrpc:'2.0',id:2,error:{code:-32000,message:'process exited '+code}})); process.exit(1); } });
setTimeout(() => { if (!responded) { process.stdout.write(JSON.stringify({jsonrpc:'2.0',id:2,error:{code:-32000,message:'timeout'}})); proc.kill(); process.exit(1); } }, ${timeoutMs - 2000});
proc.stdin.write(${JSON.stringify(initMsg)} + '\\n');
setTimeout(() => {
proc.stdin.write(${JSON.stringify(initializedMsg)} + '\\n');
setTimeout(() => {
proc.stdin.write(${JSON.stringify(requestMsg)} + '\\n');
}, 500);
}, 500);
`.trim();
try {
const result = await orchestrator.execInContainer(
containerId,
['node', '-e', probeScript],
{ timeoutMs },
);
if (result.exitCode === 0 && result.stdout.trim()) {
try {
return JSON.parse(result.stdout.trim()) as McpProxyResponse;
} catch {
return errorResponse(`Failed to parse STDIO response: ${result.stdout.slice(0, 200)}`);
}
}
// Try to parse error response from stdout
try {
return JSON.parse(result.stdout.trim()) as McpProxyResponse;
} catch {
const errorMsg = result.stderr.trim() || `docker exec exit code ${result.exitCode}`;
return errorResponse(errorMsg);
}
} catch (err) {
return errorResponse(err instanceof Error ? err.message : String(err));
}
}
function errorResponse(message: string): McpProxyResponse {
return {
jsonrpc: '2.0',
id: 2,
error: { code: -32000, message },
};
}
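The inline probe script frames JSON-RPC over stdout as newline-delimited JSON: split on newlines, parse each complete line, and carry the trailing partial line into the next `data` event. That framing step, pulled out as a standalone sketch (the helper name is hypothetical):

```typescript
// Hypothetical framing helper matching the probe script's stdout loop:
// parse every complete newline-terminated line, keep the trailing partial.
function drainJsonLines(buffer: string): { messages: unknown[]; rest: string } {
  const lines = buffer.split('\n');
  const rest = lines.pop() ?? ''; // last element is incomplete until a newline arrives
  const messages: unknown[] = [];
  for (const line of lines) {
    if (!line.trim()) continue;
    try {
      messages.push(JSON.parse(line));
    } catch {
      // Non-JSON noise on stdout (e.g. npx banners); ignore it.
    }
  }
  return { messages, rest };
}
```

Note the probe script itself reassigns `output` to only the last line, which is the same invariant: everything before the final newline has been consumed.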

View File

@@ -3,11 +3,11 @@ import { z } from 'zod';
export const CreateProjectSchema = z.object({
name: z.string().min(1).max(100).regex(/^[a-z0-9-]+$/, 'Name must be lowercase alphanumeric with hyphens'),
description: z.string().max(1000).default(''),
prompt: z.string().max(10000).default(''),
proxyMode: z.enum(['direct', 'filtered']).default('direct'),
llmProvider: z.string().max(100).optional(),
llmModel: z.string().max(100).optional(),
servers: z.array(z.string().min(1)).default([]),
members: z.array(z.string().email()).default([]),
}).refine(
(d) => d.proxyMode !== 'filtered' || d.llmProvider,
{ message: 'llmProvider is required when proxyMode is "filtered"' },
@@ -15,11 +15,11 @@ export const CreateProjectSchema = z.object({
export const UpdateProjectSchema = z.object({
description: z.string().max(1000).optional(),
prompt: z.string().max(10000).optional(),
proxyMode: z.enum(['direct', 'filtered']).optional(),
llmProvider: z.string().max(100).nullable().optional(),
llmModel: z.string().max(100).nullable().optional(),
servers: z.array(z.string().min(1)).optional(),
members: z.array(z.string().email()).optional(),
});
export type CreateProjectInput = z.infer<typeof CreateProjectSchema>;
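The `.refine()` on `CreateProjectSchema` enforces a cross-field rule: `llmProvider` becomes mandatory once `proxyMode` is `'filtered'`. The same invariant, restated as a plain predicate without the zod machinery (a hypothetical standalone check, for illustration):

```typescript
// Hypothetical standalone version of the .refine() rule above:
// 'filtered' proxy mode requires an llmProvider; 'direct' does not.
function proxyModeValid(proxyMode: 'direct' | 'filtered', llmProvider?: string): boolean {
  return proxyMode !== 'filtered' || Boolean(llmProvider);
}
```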

View File

@@ -0,0 +1,23 @@
import { z } from 'zod';
export const CreatePromptSchema = z.object({
name: z.string().min(1).max(100).regex(/^[a-z0-9-]+$/, 'Name must be lowercase alphanumeric with hyphens'),
content: z.string().min(1).max(50000),
projectId: z.string().optional(),
});
export const UpdatePromptSchema = z.object({
content: z.string().min(1).max(50000).optional(),
});
export const CreatePromptRequestSchema = z.object({
name: z.string().min(1).max(100).regex(/^[a-z0-9-]+$/, 'Name must be lowercase alphanumeric with hyphens'),
content: z.string().min(1).max(50000),
projectId: z.string().optional(),
createdBySession: z.string().optional(),
createdByUserId: z.string().optional(),
});
export type CreatePromptInput = z.infer<typeof CreatePromptSchema>;
export type UpdatePromptInput = z.infer<typeof UpdatePromptSchema>;
export type CreatePromptRequestInput = z.infer<typeof CreatePromptRequestSchema>;

View File

@@ -1,7 +1,7 @@
import { z } from 'zod';
export const RBAC_ROLES = ['edit', 'view', 'create', 'delete', 'run'] as const;
export const RBAC_RESOURCES = ['*', 'servers', 'instances', 'secrets', 'projects', 'templates', 'users', 'groups', 'rbac'] as const;
export const RBAC_ROLES = ['edit', 'view', 'create', 'delete', 'run', 'expose'] as const;
export const RBAC_RESOURCES = ['*', 'servers', 'instances', 'secrets', 'projects', 'templates', 'users', 'groups', 'rbac', 'prompts', 'promptrequests'] as const;
/** Singular→plural map for resource names. */
const RESOURCE_ALIASES: Record<string, string> = {
@@ -12,6 +12,8 @@ const RESOURCE_ALIASES: Record<string, string> = {
template: 'templates',
user: 'users',
group: 'groups',
prompt: 'prompts',
promptrequest: 'promptrequests',
};
/** Normalize a resource name to its canonical plural form. */
@@ -20,7 +22,7 @@ export function normalizeResource(resource: string): string {
}
export const RbacSubjectSchema = z.object({
-kind: z.enum(['User', 'Group']),
+kind: z.enum(['User', 'Group', 'ServiceAccount']),
name: z.string().min(1),
});
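The body of `normalizeResource` falls outside this hunk, so here is a minimal sketch consistent with the alias map and the signature above. The alias map is also truncated in the diff, so entries other than those shown are assumptions:

```typescript
// Assumed shape of the singular→plural alias map; only the entries visible
// in the hunk above are certain.
const RESOURCE_ALIASES: Record<string, string> = {
  template: 'templates',
  user: 'users',
  group: 'groups',
  prompt: 'prompts',
  promptrequest: 'promptrequests',
};

// Sketch of normalizeResource: lowercase the input, then map singular → plural.
// (Case folding is an assumption; the real implementation may differ.)
function normalizeResource(resource: string): string {
  const lower = resource.toLowerCase();
  return RESOURCE_ALIASES[lower] ?? lower;
}
```

Under these assumptions, `normalizeResource('prompt')` yields `'prompts'`, and already-plural names pass through unchanged.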

View File

@@ -37,7 +37,6 @@ const mockProjects = [
id: 'proj1', name: 'my-project', description: 'Test project', proxyMode: 'direct', llmProvider: null, llmModel: null,
ownerId: 'user1', version: 1, createdAt: new Date(), updatedAt: new Date(),
servers: [{ id: 'ps1', server: { id: 's1', name: 'github' } }],
-members: [{ id: 'pm1', user: { id: 'u1', email: 'alice@test.com', name: 'Alice' } }],
},
];
@@ -91,11 +90,12 @@ function mockProjectRepo(): IProjectRepository {
findAll: vi.fn(async () => [...mockProjects]),
findById: vi.fn(async (id: string) => mockProjects.find((p) => p.id === id) ?? null),
findByName: vi.fn(async () => null),
-create: vi.fn(async (data) => ({ id: 'new-proj', ...data, servers: [], members: [], version: 1, createdAt: new Date(), updatedAt: new Date() } as typeof mockProjects[0])),
+create: vi.fn(async (data) => ({ id: 'new-proj', ...data, servers: [], version: 1, createdAt: new Date(), updatedAt: new Date() } as typeof mockProjects[0])),
update: vi.fn(async (id, data) => ({ ...mockProjects.find((p) => p.id === id)!, ...data })),
delete: vi.fn(async () => {}),
setServers: vi.fn(async () => {}),
-setMembers: vi.fn(async () => {}),
+addServer: vi.fn(async () => {}),
+removeServer: vi.fn(async () => {}),
};
}
@@ -214,12 +214,11 @@ describe('BackupService', () => {
expect(bundle.rbacBindings![0]!.subjects).toEqual([{ kind: 'User', name: 'alice@test.com' }]);
});
-it('includes enriched projects with server names and members', async () => {
+it('includes enriched projects with server names', async () => {
const bundle = await backupService.createBackup();
const proj = bundle.projects[0]!;
expect(proj.proxyMode).toBe('direct');
expect(proj.serverNames).toEqual(['github']);
-expect(proj.members).toEqual(['alice@test.com']);
});
it('filters resources', async () => {
@@ -406,7 +405,7 @@ describe('RestoreService', () => {
}));
});
-it('restores enriched projects with server and member linking', async () => {
+it('restores enriched projects with server linking', async () => {
// Simulate servers exist (restored in prior step)
(serverRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(null);
// After server restore, we can find them
@@ -419,14 +418,6 @@ describe('RestoreService', () => {
return null;
});
-// Simulate users exist for member resolution
-let userCallCount = 0;
-(userRepo.findByEmail as ReturnType<typeof vi.fn>).mockImplementation(async (email: string) => {
-userCallCount++;
-if (userCallCount > 2 && email === 'alice@test.com') return { id: 'restored-u1', email };
-return null;
-});
const result = await restoreService.restore(fullBundle);
expect(result.projectsCreated).toBe(1);
@@ -437,7 +428,6 @@ describe('RestoreService', () => {
llmModel: 'gpt-4',
}));
expect(projectRepo.setServers).toHaveBeenCalled();
-expect(projectRepo.setMembers).toHaveBeenCalled();
});
it('restores old bundle without users/groups/rbac', async () => {
@@ -551,7 +541,7 @@ describe('RestoreService', () => {
(serverRepo.create as ReturnType<typeof vi.fn>).mockImplementation(async () => { callOrder.push('server'); return { id: 'srv' }; });
(userRepo.create as ReturnType<typeof vi.fn>).mockImplementation(async () => { callOrder.push('user'); return { id: 'usr' }; });
(groupRepo.create as ReturnType<typeof vi.fn>).mockImplementation(async () => { callOrder.push('group'); return { id: 'grp' }; });
-(projectRepo.create as ReturnType<typeof vi.fn>).mockImplementation(async () => { callOrder.push('project'); return { id: 'proj', servers: [], members: [] }; });
+(projectRepo.create as ReturnType<typeof vi.fn>).mockImplementation(async () => { callOrder.push('project'); return { id: 'proj', servers: [] }; });
(rbacRepo.create as ReturnType<typeof vi.fn>).mockImplementation(async () => { callOrder.push('rbac'); return { id: 'rbac' }; });
await restoreService.restore(fullBundle);

View File

@@ -0,0 +1,283 @@
import { describe, it, expect, vi, afterEach } from 'vitest';
import Fastify from 'fastify';
import type { FastifyInstance } from 'fastify';
import { registerProjectRoutes } from '../src/routes/projects.js';
import { ProjectService } from '../src/services/project.service.js';
import { errorHandler } from '../src/middleware/error-handler.js';
import type { IProjectRepository, ProjectWithRelations } from '../src/repositories/project.repository.js';
import type { IMcpServerRepository, ISecretRepository } from '../src/repositories/interfaces.js';
let app: FastifyInstance;
function makeProject(overrides: Partial<ProjectWithRelations> = {}): ProjectWithRelations {
return {
id: 'proj-1',
name: 'test-project',
description: '',
ownerId: 'user-1',
proxyMode: 'direct',
llmProvider: null,
llmModel: null,
version: 1,
createdAt: new Date(),
updatedAt: new Date(),
servers: [],
...overrides,
};
}
function mockProjectRepo(): IProjectRepository {
return {
findAll: vi.fn(async () => []),
findById: vi.fn(async () => null),
findByName: vi.fn(async () => null),
create: vi.fn(async (data) => makeProject({
name: data.name,
description: data.description,
ownerId: data.ownerId,
proxyMode: data.proxyMode,
})),
update: vi.fn(async (_id, data) => makeProject({ ...data as Partial<ProjectWithRelations> })),
delete: vi.fn(async () => {}),
setServers: vi.fn(async () => {}),
addServer: vi.fn(async () => {}),
removeServer: vi.fn(async () => {}),
};
}
function mockServerRepo(): IMcpServerRepository {
return {
findAll: vi.fn(async () => []),
findById: vi.fn(async () => null),
findByName: vi.fn(async () => null),
create: vi.fn(async () => ({} as never)),
update: vi.fn(async () => ({} as never)),
delete: vi.fn(async () => {}),
};
}
function mockSecretRepo(): ISecretRepository {
return {
findAll: vi.fn(async () => []),
findById: vi.fn(async () => null),
findByName: vi.fn(async () => null),
create: vi.fn(async () => ({} as never)),
update: vi.fn(async () => ({} as never)),
delete: vi.fn(async () => {}),
};
}
afterEach(async () => {
if (app) await app.close();
});
function createApp(projectRepo: IProjectRepository, serverRepo?: IMcpServerRepository) {
app = Fastify({ logger: false });
app.setErrorHandler(errorHandler);
const service = new ProjectService(projectRepo, serverRepo ?? mockServerRepo(), mockSecretRepo());
registerProjectRoutes(app, service);
return app.ready();
}
describe('Project Routes', () => {
describe('GET /api/v1/projects', () => {
it('returns project list', async () => {
const repo = mockProjectRepo();
vi.mocked(repo.findAll).mockResolvedValue([
makeProject({ id: 'p1', name: 'alpha', ownerId: 'user-1' }),
makeProject({ id: 'p2', name: 'beta', ownerId: 'user-2' }),
]);
await createApp(repo);
const res = await app.inject({ method: 'GET', url: '/api/v1/projects' });
expect(res.statusCode).toBe(200);
const body = res.json<Array<{ name: string }>>();
expect(body).toHaveLength(2);
});
it('lists all projects without ownerId filtering', async () => {
// This is the bug fix: the route must call list() without ownerId
// so that RBAC (preSerialization) handles access filtering, not the DB query.
const repo = mockProjectRepo();
vi.mocked(repo.findAll).mockResolvedValue([makeProject()]);
await createApp(repo);
await app.inject({ method: 'GET', url: '/api/v1/projects' });
// findAll must be called with NO arguments (undefined ownerId)
expect(repo.findAll).toHaveBeenCalledWith(undefined);
});
});
describe('GET /api/v1/projects/:id', () => {
it('returns 404 when not found', async () => {
const repo = mockProjectRepo();
await createApp(repo);
const res = await app.inject({ method: 'GET', url: '/api/v1/projects/missing' });
expect(res.statusCode).toBe(404);
});
it('returns project when found by ID', async () => {
const repo = mockProjectRepo();
vi.mocked(repo.findById).mockResolvedValue(makeProject({ id: 'p1', name: 'my-proj' }));
await createApp(repo);
const res = await app.inject({ method: 'GET', url: '/api/v1/projects/p1' });
expect(res.statusCode).toBe(200);
expect(res.json<{ name: string }>().name).toBe('my-proj');
});
it('resolves by name when ID not found', async () => {
const repo = mockProjectRepo();
vi.mocked(repo.findByName).mockResolvedValue(makeProject({ name: 'my-proj' }));
await createApp(repo);
const res = await app.inject({ method: 'GET', url: '/api/v1/projects/my-proj' });
expect(res.statusCode).toBe(200);
expect(res.json<{ name: string }>().name).toBe('my-proj');
});
});
describe('POST /api/v1/projects', () => {
it('creates a project and returns 201', async () => {
const repo = mockProjectRepo();
vi.mocked(repo.findById).mockResolvedValue(makeProject({ name: 'new-proj' }));
await createApp(repo);
const res = await app.inject({
method: 'POST',
url: '/api/v1/projects',
payload: { name: 'new-proj' },
});
expect(res.statusCode).toBe(201);
});
it('returns 400 for invalid input', async () => {
const repo = mockProjectRepo();
await createApp(repo);
const res = await app.inject({
method: 'POST',
url: '/api/v1/projects',
payload: { name: '' },
});
expect(res.statusCode).toBe(400);
});
it('returns 409 when name already exists', async () => {
const repo = mockProjectRepo();
vi.mocked(repo.findByName).mockResolvedValue(makeProject());
await createApp(repo);
const res = await app.inject({
method: 'POST',
url: '/api/v1/projects',
payload: { name: 'taken' },
});
expect(res.statusCode).toBe(409);
});
});
describe('PUT /api/v1/projects/:id', () => {
it('updates a project', async () => {
const repo = mockProjectRepo();
vi.mocked(repo.findById).mockResolvedValue(makeProject({ id: 'p1' }));
await createApp(repo);
const res = await app.inject({
method: 'PUT',
url: '/api/v1/projects/p1',
payload: { description: 'Updated' },
});
expect(res.statusCode).toBe(200);
});
it('returns 404 when not found', async () => {
const repo = mockProjectRepo();
await createApp(repo);
const res = await app.inject({
method: 'PUT',
url: '/api/v1/projects/missing',
payload: { description: 'x' },
});
expect(res.statusCode).toBe(404);
});
});
describe('DELETE /api/v1/projects/:id', () => {
it('deletes a project and returns 204', async () => {
const repo = mockProjectRepo();
vi.mocked(repo.findById).mockResolvedValue(makeProject({ id: 'p1' }));
await createApp(repo);
const res = await app.inject({ method: 'DELETE', url: '/api/v1/projects/p1' });
expect(res.statusCode).toBe(204);
});
it('returns 404 when not found', async () => {
const repo = mockProjectRepo();
await createApp(repo);
const res = await app.inject({ method: 'DELETE', url: '/api/v1/projects/missing' });
expect(res.statusCode).toBe(404);
});
});
describe('POST /api/v1/projects/:id/servers (attach)', () => {
it('attaches a server to a project', async () => {
const projectRepo = mockProjectRepo();
const serverRepo = mockServerRepo();
vi.mocked(projectRepo.findById).mockResolvedValue(makeProject({ id: 'p1' }));
vi.mocked(serverRepo.findByName).mockResolvedValue({ id: 'srv-1', name: 'my-ha' } as never);
await createApp(projectRepo, serverRepo);
const res = await app.inject({
method: 'POST',
url: '/api/v1/projects/p1/servers',
payload: { server: 'my-ha' },
});
expect(res.statusCode).toBe(200);
expect(projectRepo.addServer).toHaveBeenCalledWith('p1', 'srv-1');
});
it('returns 400 when server field is missing', async () => {
const repo = mockProjectRepo();
vi.mocked(repo.findById).mockResolvedValue(makeProject({ id: 'p1' }));
await createApp(repo);
const res = await app.inject({
method: 'POST',
url: '/api/v1/projects/p1/servers',
payload: {},
});
expect(res.statusCode).toBe(400);
});
it('returns 404 when server not found', async () => {
const projectRepo = mockProjectRepo();
vi.mocked(projectRepo.findById).mockResolvedValue(makeProject({ id: 'p1' }));
await createApp(projectRepo);
const res = await app.inject({
method: 'POST',
url: '/api/v1/projects/p1/servers',
payload: { server: 'nonexistent' },
});
expect(res.statusCode).toBe(404);
});
});
describe('DELETE /api/v1/projects/:id/servers/:serverName (detach)', () => {
it('detaches a server from a project', async () => {
const projectRepo = mockProjectRepo();
const serverRepo = mockServerRepo();
vi.mocked(projectRepo.findById).mockResolvedValue(makeProject({ id: 'p1' }));
vi.mocked(serverRepo.findByName).mockResolvedValue({ id: 'srv-1', name: 'my-ha' } as never);
await createApp(projectRepo, serverRepo);
const res = await app.inject({ method: 'DELETE', url: '/api/v1/projects/p1/servers/my-ha' });
expect(res.statusCode).toBe(204);
expect(projectRepo.removeServer).toHaveBeenCalledWith('p1', 'srv-1');
});
it('returns 404 when server not found', async () => {
const projectRepo = mockProjectRepo();
vi.mocked(projectRepo.findById).mockResolvedValue(makeProject({ id: 'p1' }));
await createApp(projectRepo);
const res = await app.inject({ method: 'DELETE', url: '/api/v1/projects/p1/servers/nonexistent' });
expect(res.statusCode).toBe(404);
});
});
});

View File

@@ -3,7 +3,6 @@ import { ProjectService } from '../src/services/project.service.js';
import { NotFoundError, ConflictError } from '../src/services/mcp-server.service.js';
import type { IProjectRepository, ProjectWithRelations } from '../src/repositories/project.repository.js';
import type { IMcpServerRepository, ISecretRepository } from '../src/repositories/interfaces.js';
-import type { IUserRepository } from '../src/repositories/user.repository.js';
import type { McpServer } from '@prisma/client';
function makeProject(overrides: Partial<ProjectWithRelations> = {}): ProjectWithRelations {
@@ -19,7 +18,6 @@ function makeProject(overrides: Partial<ProjectWithRelations> = {}): ProjectWith
createdAt: new Date(),
updatedAt: new Date(),
servers: [],
-members: [],
...overrides,
};
}
@@ -64,7 +62,8 @@ function mockProjectRepo(): IProjectRepository {
update: vi.fn(async (_id, data) => makeProject({ ...data as Partial<ProjectWithRelations> })),
delete: vi.fn(async () => {}),
setServers: vi.fn(async () => {}),
-setMembers: vi.fn(async () => {}),
+addServer: vi.fn(async () => {}),
+removeServer: vi.fn(async () => {}),
};
}
@@ -90,33 +89,17 @@ function mockSecretRepo(): ISecretRepository {
};
}
-function mockUserRepo(): IUserRepository {
-return {
-findAll: vi.fn(async () => []),
-findById: vi.fn(async () => null),
-findByEmail: vi.fn(async () => null),
-create: vi.fn(async () => ({
-id: 'u-1', email: 'test@example.com', name: null, role: 'user',
-provider: null, externalId: null, version: 1, createdAt: new Date(), updatedAt: new Date(),
-})),
-delete: vi.fn(async () => {}),
-count: vi.fn(async () => 0),
-};
-}
describe('ProjectService', () => {
let projectRepo: ReturnType<typeof mockProjectRepo>;
let serverRepo: ReturnType<typeof mockServerRepo>;
let secretRepo: ReturnType<typeof mockSecretRepo>;
-let userRepo: ReturnType<typeof mockUserRepo>;
let service: ProjectService;
beforeEach(() => {
projectRepo = mockProjectRepo();
serverRepo = mockServerRepo();
secretRepo = mockSecretRepo();
-userRepo = mockUserRepo();
-service = new ProjectService(projectRepo, serverRepo, secretRepo, userRepo);
+service = new ProjectService(projectRepo, serverRepo, secretRepo);
});
describe('create', () => {
@@ -164,32 +147,6 @@ describe('ProjectService', () => {
expect(result.servers).toHaveLength(2);
});
-it('creates project with members (resolves emails)', async () => {
-vi.mocked(userRepo.findByEmail).mockImplementation(async (email) => {
-if (email === 'alice@test.com') {
-return { id: 'u-alice', email: 'alice@test.com', name: 'Alice', role: 'user', provider: null, externalId: null, version: 1, createdAt: new Date(), updatedAt: new Date() };
-}
-return null;
-});
-const created = makeProject({ id: 'proj-new' });
-vi.mocked(projectRepo.create).mockResolvedValue(created);
-vi.mocked(projectRepo.findById).mockResolvedValue(makeProject({
-id: 'proj-new',
-members: [
-{ id: 'pm-1', user: { id: 'u-alice', email: 'alice@test.com', name: 'Alice' } },
-],
-}));
-const result = await service.create({
-name: 'my-project',
-members: ['alice@test.com'],
-}, 'user-1');
-expect(projectRepo.setMembers).toHaveBeenCalledWith('proj-new', ['u-alice']);
-expect(result.members).toHaveLength(1);
-});
it('creates project with proxyMode and llmProvider', async () => {
const created = makeProject({ id: 'proj-filtered', proxyMode: 'filtered', llmProvider: 'openai' });
vi.mocked(projectRepo.create).mockResolvedValue(created);
@@ -219,16 +176,6 @@ describe('ProjectService', () => {
).rejects.toThrow(NotFoundError);
});
-it('throws NotFoundError when member email resolution fails', async () => {
-vi.mocked(userRepo.findByEmail).mockResolvedValue(null);
-await expect(
-service.create({
-name: 'my-project',
-members: ['nobody@test.com'],
-}, 'user-1'),
-).rejects.toThrow(NotFoundError);
-});
});
describe('getById', () => {
@@ -277,19 +224,6 @@ describe('ProjectService', () => {
expect(projectRepo.setServers).toHaveBeenCalledWith('proj-1', ['srv-new']);
});
-it('updates members (full replacement)', async () => {
-const existing = makeProject({ id: 'proj-1' });
-vi.mocked(projectRepo.findById).mockResolvedValue(existing);
-vi.mocked(userRepo.findByEmail).mockResolvedValue({
-id: 'u-bob', email: 'bob@test.com', name: 'Bob', role: 'user',
-provider: null, externalId: null, version: 1, createdAt: new Date(), updatedAt: new Date(),
-});
-await service.update('proj-1', { members: ['bob@test.com'] });
-expect(projectRepo.setMembers).toHaveBeenCalledWith('proj-1', ['u-bob']);
-});
it('updates proxyMode', async () => {
const existing = makeProject({ id: 'proj-1' });
vi.mocked(projectRepo.findById).mockResolvedValue(existing);
@@ -314,6 +248,52 @@ describe('ProjectService', () => {
});
});
describe('addServer', () => {
it('attaches a server by name', async () => {
const project = makeProject({ id: 'proj-1' });
const srv = makeServer({ id: 'srv-1', name: 'my-ha' });
vi.mocked(projectRepo.findById).mockResolvedValue(project);
vi.mocked(serverRepo.findByName).mockResolvedValue(srv);
await service.addServer('proj-1', 'my-ha');
expect(projectRepo.addServer).toHaveBeenCalledWith('proj-1', 'srv-1');
});
it('throws NotFoundError when project not found', async () => {
await expect(service.addServer('missing', 'my-ha')).rejects.toThrow(NotFoundError);
});
it('throws NotFoundError when server not found', async () => {
vi.mocked(projectRepo.findById).mockResolvedValue(makeProject({ id: 'proj-1' }));
vi.mocked(serverRepo.findByName).mockResolvedValue(null);
await expect(service.addServer('proj-1', 'nonexistent')).rejects.toThrow(NotFoundError);
});
});
describe('removeServer', () => {
it('detaches a server by name', async () => {
const project = makeProject({ id: 'proj-1' });
const srv = makeServer({ id: 'srv-1', name: 'my-ha' });
vi.mocked(projectRepo.findById).mockResolvedValue(project);
vi.mocked(serverRepo.findByName).mockResolvedValue(srv);
await service.removeServer('proj-1', 'my-ha');
expect(projectRepo.removeServer).toHaveBeenCalledWith('proj-1', 'srv-1');
});
it('throws NotFoundError when project not found', async () => {
await expect(service.removeServer('missing', 'my-ha')).rejects.toThrow(NotFoundError);
});
it('throws NotFoundError when server not found', async () => {
vi.mocked(projectRepo.findById).mockResolvedValue(makeProject({ id: 'proj-1' }));
vi.mocked(serverRepo.findByName).mockResolvedValue(null);
await expect(service.removeServer('proj-1', 'nonexistent')).rejects.toThrow(NotFoundError);
});
});
describe('generateMcpConfig', () => {
it('generates direct mode config with STDIO servers', async () => {
const srv = makeServer({ id: 'srv-1', name: 'github', packageName: '@mcp/github', transport: 'STDIO' });

View File

@@ -0,0 +1,444 @@
/**
* Integration tests reproducing RBAC name-scoped access bugs.
*
* Bug 1: `mcpctl get servers` shows ALL servers despite user only having
* view:servers+name:my-home-assistant
* Bug 2: `mcpctl get server my-home-assistant -o yaml` returns 403 because
* CLI resolves name→CUID, and RBAC compares CUID against binding name
*
* These tests spin up a full Fastify app with auth + RBAC hooks + server routes,
* exactly like main.ts, to catch regressions at the HTTP level.
*/
import { describe, it, expect, vi, afterEach, beforeEach } from 'vitest';
import Fastify from 'fastify';
import type { FastifyInstance } from 'fastify';
import { registerMcpServerRoutes } from '../src/routes/mcp-servers.js';
import { McpServerService } from '../src/services/mcp-server.service.js';
import { InstanceService } from '../src/services/instance.service.js';
import { RbacService } from '../src/services/rbac.service.js';
import { errorHandler } from '../src/middleware/error-handler.js';
import type { IMcpServerRepository, IMcpInstanceRepository } from '../src/repositories/interfaces.js';
import type { IRbacDefinitionRepository } from '../src/repositories/rbac-definition.repository.js';
import type { McpOrchestrator } from '../src/services/orchestrator.js';
import type { McpServer, RbacDefinition, PrismaClient } from '@prisma/client';
// ── Test data ──
const SERVERS: McpServer[] = [
{ id: 'clxyz000000001', name: 'my-home-assistant', description: 'HA server', transport: 'STDIO', packageName: null, dockerImage: null, repositoryUrl: null, externalUrl: null, command: null, containerPort: null, replicas: 1, env: [], healthCheck: null, version: 1, createdAt: new Date(), updatedAt: new Date() },
{ id: 'clxyz000000002', name: 'slack-server', description: 'Slack MCP', transport: 'STDIO', packageName: null, dockerImage: null, repositoryUrl: null, externalUrl: null, command: null, containerPort: null, replicas: 1, env: [], healthCheck: null, version: 1, createdAt: new Date(), updatedAt: new Date() },
{ id: 'clxyz000000003', name: 'github-server', description: 'GitHub MCP', transport: 'STDIO', packageName: null, dockerImage: null, repositoryUrl: null, externalUrl: null, command: null, containerPort: null, replicas: 1, env: [], healthCheck: null, version: 1, createdAt: new Date(), updatedAt: new Date() },
];
// User tokens → userId mapping
const SESSIONS: Record<string, { userId: string }> = {
'scoped-token': { userId: 'user-scoped' },
'admin-token': { userId: 'user-admin' },
'multi-scoped-token': { userId: 'user-multi' },
'secrets-only-token': { userId: 'user-secrets' },
'edit-scoped-token': { userId: 'user-edit-scoped' },
};
// User email mapping
const USERS: Record<string, { email: string }> = {
'user-scoped': { email: 'scoped@example.com' },
'user-admin': { email: 'admin@example.com' },
'user-multi': { email: 'multi@example.com' },
'user-secrets': { email: 'secrets@example.com' },
'user-edit-scoped': { email: 'editscoped@example.com' },
};
// RBAC definitions
const RBAC_DEFS: RbacDefinition[] = [
{
id: 'rbac-scoped', name: 'scoped-view', version: 1, createdAt: new Date(), updatedAt: new Date(),
subjects: [{ kind: 'User', name: 'scoped@example.com' }],
roleBindings: [{ role: 'view', resource: 'servers', name: 'my-home-assistant' }],
},
{
id: 'rbac-admin', name: 'admin-all', version: 1, createdAt: new Date(), updatedAt: new Date(),
subjects: [{ kind: 'User', name: 'admin@example.com' }],
roleBindings: [{ role: 'edit', resource: '*' }],
},
{
id: 'rbac-multi', name: 'multi-scoped', version: 1, createdAt: new Date(), updatedAt: new Date(),
subjects: [{ kind: 'User', name: 'multi@example.com' }],
roleBindings: [
{ role: 'view', resource: 'servers', name: 'my-home-assistant' },
{ role: 'view', resource: 'servers', name: 'slack-server' },
],
},
{
id: 'rbac-secrets', name: 'secrets-only', version: 1, createdAt: new Date(), updatedAt: new Date(),
subjects: [{ kind: 'User', name: 'secrets@example.com' }],
roleBindings: [{ role: 'view', resource: 'secrets' }],
},
{
id: 'rbac-edit-scoped', name: 'edit-scoped', version: 1, createdAt: new Date(), updatedAt: new Date(),
subjects: [{ kind: 'User', name: 'editscoped@example.com' }],
roleBindings: [{ role: 'edit', resource: 'servers', name: 'my-home-assistant' }],
},
];
// ── Mock factories ──
function mockServerRepo(): IMcpServerRepository {
return {
findAll: vi.fn(async () => [...SERVERS]),
findById: vi.fn(async (id: string) => SERVERS.find((s) => s.id === id) ?? null),
findByName: vi.fn(async (name: string) => SERVERS.find((s) => s.name === name) ?? null),
create: vi.fn(async () => SERVERS[0]!),
update: vi.fn(async () => SERVERS[0]!),
delete: vi.fn(async () => {}),
};
}
function mockRbacRepo(): IRbacDefinitionRepository {
return {
findAll: vi.fn(async () => [...RBAC_DEFS]),
findById: vi.fn(async () => null),
findByName: vi.fn(async () => null),
create: vi.fn(async () => RBAC_DEFS[0]!),
update: vi.fn(async () => RBAC_DEFS[0]!),
delete: vi.fn(async () => {}),
};
}
function mockPrisma(): PrismaClient {
return {
user: {
findUnique: vi.fn(async ({ where }: { where: { id: string } }) => {
const u = USERS[where.id];
return u ? { email: u.email } : null;
}),
},
groupMember: {
findMany: vi.fn(async () => []),
},
} as unknown as PrismaClient;
}
function stubInstanceRepo(): IMcpInstanceRepository {
return {
findAll: vi.fn(async () => []),
findById: vi.fn(async () => null),
findByContainerId: vi.fn(async () => null),
create: vi.fn(async (data) => ({
id: 'inst-stub', serverId: data.serverId, containerId: null,
status: data.status ?? 'STOPPED', port: null, metadata: {},
healthStatus: null, lastHealthCheck: null, events: [],
version: 1, createdAt: new Date(), updatedAt: new Date(),
}) as never),
updateStatus: vi.fn(async () => ({}) as never),
delete: vi.fn(async () => {}),
};
}
function stubOrchestrator(): McpOrchestrator {
return {
ping: vi.fn(async () => true),
pullImage: vi.fn(async () => {}),
createContainer: vi.fn(async () => ({ containerId: 'ctr', name: 'stub', state: 'running' as const, port: 3000, createdAt: new Date() })),
stopContainer: vi.fn(async () => {}),
removeContainer: vi.fn(async () => {}),
inspectContainer: vi.fn(async () => ({ containerId: 'ctr', name: 'stub', state: 'running' as const, createdAt: new Date() })),
getContainerLogs: vi.fn(async () => ({ stdout: '', stderr: '' })),
};
}
// ── App setup (replicates main.ts hooks) ──
import { normalizeResource } from '../src/validation/rbac-definition.schema.js';
import type { RbacAction } from '../src/services/rbac.service.js';
type PermissionCheck =
| { kind: 'resource'; resource: string; action: RbacAction; resourceName?: string }
| { kind: 'operation'; operation: string }
| { kind: 'skip' };
function mapUrlToPermission(method: string, url: string): PermissionCheck {
const match = url.match(/^\/api\/v1\/([a-z-]+)/);
if (!match) return { kind: 'skip' };
const segment = match[1] as string;
if (segment === 'backup') return { kind: 'operation', operation: 'backup' };
if (segment === 'restore') return { kind: 'operation', operation: 'restore' };
if (segment === 'audit-logs' && method === 'DELETE') return { kind: 'operation', operation: 'audit-purge' };
const resourceMap: Record<string, string | undefined> = {
servers: 'servers', instances: 'instances', secrets: 'secrets',
projects: 'projects', templates: 'templates', users: 'users',
groups: 'groups', rbac: 'rbac', 'audit-logs': 'rbac', mcp: 'servers',
};
const resource = resourceMap[segment];
if (resource === undefined) return { kind: 'skip' };
let action: RbacAction;
switch (method) {
case 'GET': case 'HEAD': action = 'view'; break;
case 'POST': action = 'create'; break;
case 'DELETE': action = 'delete'; break;
default: action = 'edit'; break;
}
const nameMatch = url.match(/^\/api\/v1\/[a-z-]+\/([^/?]+)/);
const resourceName = nameMatch?.[1];
const check: PermissionCheck = { kind: 'resource', resource, action };
if (resourceName !== undefined) (check as { resourceName: string }).resourceName = resourceName;
return check;
}
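// Worked examples, traced by hand through the branches above (illustrative):
//   mapUrlToPermission('GET', '/api/v1/servers')
//     → { kind: 'resource', resource: 'servers', action: 'view' }
//   mapUrlToPermission('GET', '/api/v1/servers/clxyz000000001')
//     → { kind: 'resource', resource: 'servers', action: 'view', resourceName: 'clxyz000000001' }
//   mapUrlToPermission('DELETE', '/api/v1/audit-logs')
//     → { kind: 'operation', operation: 'audit-purge' }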
let app: FastifyInstance;
afterEach(async () => {
if (app) await app.close();
});
async function createTestApp() {
const serverRepo = mockServerRepo();
const rbacRepo = mockRbacRepo();
const prisma = mockPrisma();
const rbacService = new RbacService(rbacRepo, prisma);
const CUID_RE = /^c[^\s-]{8,}$/i;
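// Heuristic: 'c' followed by 8+ non-space, non-hyphen chars reads as a CUID,
// so 'clxyz000000001' is resolved to a name while 'my-home-assistant' is not.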
const nameResolvers: Record<string, { findById(id: string): Promise<{ name: string } | null> }> = {
servers: serverRepo,
};
app = Fastify({ logger: false });
app.setErrorHandler(errorHandler);
// Auth hook (mock)
app.addHook('preHandler', async (request, reply) => {
const url = request.url;
if (url.startsWith('/api/v1/auth/') || url === '/healthz') return;
if (!url.startsWith('/api/v1/')) return;
const header = request.headers.authorization;
if (!header?.startsWith('Bearer ')) {
reply.code(401).send({ error: 'Unauthorized' });
return;
}
const token = header.slice(7);
const session = SESSIONS[token];
if (!session) {
reply.code(401).send({ error: 'Invalid token' });
return;
}
request.userId = session.userId;
});
// RBAC hook (replicates main.ts)
app.addHook('preHandler', async (request, reply) => {
if (reply.sent) return;
const url = request.url;
if (url.startsWith('/api/v1/auth/') || url === '/healthz') return;
if (!url.startsWith('/api/v1/')) return;
if (request.userId === undefined) return;
const check = mapUrlToPermission(request.method, url);
if (check.kind === 'skip') return;
let allowed: boolean;
if (check.kind === 'operation') {
allowed = await rbacService.canRunOperation(request.userId, check.operation);
} else {
// CUID→name resolution
if (check.resourceName !== undefined && CUID_RE.test(check.resourceName)) {
const resolver = nameResolvers[check.resource];
if (resolver) {
const entity = await resolver.findById(check.resourceName);
if (entity) check.resourceName = entity.name;
}
}
allowed = await rbacService.canAccess(request.userId, check.action, check.resource, check.resourceName);
// Compute scope for list filtering
if (allowed && check.resourceName === undefined) {
request.rbacScope = await rbacService.getAllowedScope(request.userId, check.action, check.resource);
}
}
if (!allowed) {
reply.code(403).send({ error: 'Forbidden' });
}
});
// Routes
const serverService = new McpServerService(serverRepo);
const instanceService = new InstanceService(stubInstanceRepo(), serverRepo, stubOrchestrator());
serverService.setInstanceService(instanceService);
registerMcpServerRoutes(app, serverService, instanceService);
// preSerialization hook (list filtering)
app.addHook('preSerialization', async (request, _reply, payload) => {
if (!request.rbacScope || request.rbacScope.wildcard) return payload;
if (!Array.isArray(payload)) return payload;
return (payload as Array<Record<string, unknown>>).filter((item) => {
const name = item['name'];
return typeof name === 'string' && request.rbacScope!.names.has(name);
});
});
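// E.g. with rbacScope = { wildcard: false, names: new Set(['my-home-assistant']) },
// a list payload of all three SERVERS serializes as a one-element array
// containing only my-home-assistant (illustrative trace, not extra coverage).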
await app.ready();
return app;
}
// ── Tests ──
describe('RBAC name-scoped integration (reproduces mcpctl bugs)', () => {
beforeEach(async () => {
await createTestApp();
});
describe('Bug 1: mcpctl get servers (list filtering)', () => {
it('name-scoped user sees ONLY their permitted server', async () => {
const res = await app.inject({
method: 'GET',
url: '/api/v1/servers',
headers: { authorization: 'Bearer scoped-token' },
});
expect(res.statusCode).toBe(200);
const servers = res.json<Array<{ name: string }>>();
expect(servers).toHaveLength(1);
expect(servers[0]!.name).toBe('my-home-assistant');
});
it('wildcard user sees ALL servers', async () => {
const res = await app.inject({
method: 'GET',
url: '/api/v1/servers',
headers: { authorization: 'Bearer admin-token' },
});
expect(res.statusCode).toBe(200);
const servers = res.json<Array<{ name: string }>>();
expect(servers).toHaveLength(3);
});
it('user with multiple name-scoped bindings sees only those servers', async () => {
const res = await app.inject({
method: 'GET',
url: '/api/v1/servers',
headers: { authorization: 'Bearer multi-scoped-token' },
});
expect(res.statusCode).toBe(200);
const servers = res.json<Array<{ name: string }>>();
expect(servers).toHaveLength(2);
const names = servers.map((s) => s.name);
expect(names).toContain('my-home-assistant');
expect(names).toContain('slack-server');
expect(names).not.toContain('github-server');
});
it('user with no server permissions gets 403', async () => {
const res = await app.inject({
method: 'GET',
url: '/api/v1/servers',
headers: { authorization: 'Bearer secrets-only-token' },
});
expect(res.statusCode).toBe(403);
});
});
describe('Bug 2: mcpctl get server NAME (CUID resolution)', () => {
it('allows access when URL contains CUID matching a name-scoped binding', async () => {
// CLI resolves my-home-assistant → clxyz000000001
const res = await app.inject({
method: 'GET',
url: '/api/v1/servers/clxyz000000001',
headers: { authorization: 'Bearer scoped-token' },
});
expect(res.statusCode).toBe(200);
expect(res.json<{ name: string }>().name).toBe('my-home-assistant');
});
it('denies access when CUID resolves to server NOT in binding', async () => {
const res = await app.inject({
method: 'GET',
url: '/api/v1/servers/clxyz000000002',
headers: { authorization: 'Bearer scoped-token' },
});
expect(res.statusCode).toBe(403);
});
it('passes RBAC when URL has human-readable name (route 404 is expected)', async () => {
// Human name in URL: RBAC passes (matches binding directly),
// but the route only does findById, so it 404s.
// CLI always resolves name→CUID first, so this doesn't happen in practice.
// The important thing: it does NOT return 403.
const res = await app.inject({
method: 'GET',
url: '/api/v1/servers/my-home-assistant',
headers: { authorization: 'Bearer scoped-token' },
});
expect(res.statusCode).toBe(404); // Not 403!
});
it('handles nonexistent CUID gracefully (403)', async () => {
const res = await app.inject({
method: 'GET',
url: '/api/v1/servers/cnonexistent12345678',
headers: { authorization: 'Bearer scoped-token' },
});
expect(res.statusCode).toBe(403);
});
it('wildcard user can access any server by CUID', async () => {
const res = await app.inject({
method: 'GET',
url: '/api/v1/servers/clxyz000000002',
headers: { authorization: 'Bearer admin-token' },
});
expect(res.statusCode).toBe(200);
expect(res.json<{ name: string }>().name).toBe('slack-server');
});
});
describe('name-scoped write operations', () => {
it('name-scoped edit user can DELETE their named server by CUID', async () => {
const res = await app.inject({
method: 'DELETE',
url: '/api/v1/servers/clxyz000000001',
headers: { authorization: 'Bearer edit-scoped-token' },
});
expect(res.statusCode).toBe(204);
});
it('name-scoped edit user CANNOT delete other servers', async () => {
const res = await app.inject({
method: 'DELETE',
url: '/api/v1/servers/clxyz000000002',
headers: { authorization: 'Bearer edit-scoped-token' },
});
expect(res.statusCode).toBe(403);
});
it('name-scoped view user CANNOT delete their named server', async () => {
const res = await app.inject({
method: 'DELETE',
url: '/api/v1/servers/clxyz000000001',
headers: { authorization: 'Bearer scoped-token' },
});
expect(res.statusCode).toBe(403);
});
});
describe('preSerialization edge cases', () => {
it('single-object responses pass through unmodified', async () => {
const res = await app.inject({
method: 'GET',
url: '/api/v1/servers/clxyz000000001',
headers: { authorization: 'Bearer scoped-token' },
});
expect(res.statusCode).toBe(200);
expect(res.json<{ name: string }>().name).toBe('my-home-assistant');
});
it('unauthenticated requests get 401', async () => {
const res = await app.inject({
method: 'GET',
url: '/api/v1/servers',
});
expect(res.statusCode).toBe(401);
});
});
});
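The tests above exercise an `rbacScope` object attached to the request. A minimal sketch of the shape they assume (inferred from the test and hook code in this diff, not taken from the actual `RbacService` source):

```typescript
// Inferred scope shape: wildcard bindings bypass list filtering entirely;
// name-scoped bindings restrict visibility to an explicit set of names.
interface RbacScope {
  wildcard: boolean;
  names: Set<string>;
}

// Mirrors the preSerialization hook: keep only items whose name is permitted.
function filterByScope<T extends { name: string }>(items: T[], scope: RbacScope): T[] {
  if (scope.wildcard) return items;
  return items.filter((item) => scope.names.has(item.name));
}
```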

@@ -680,4 +680,333 @@ describe('RbacService', () => {
expect(perms).toEqual([]);
});
});
describe('getAllowedScope', () => {
describe('unscoped binding → wildcard', () => {
it('returns wildcard:true for matching resource', async () => {
const repo = mockRepo([
makeDef({
subjects: [{ kind: 'User', name: 'alice@example.com' }],
roleBindings: [{ role: 'view', resource: 'servers' }],
}),
]);
const prisma = mockPrisma({
user: { findUnique: vi.fn(async () => ({ email: 'alice@example.com' })) },
groupMember: { findMany: vi.fn(async () => []) },
});
const service = new RbacService(repo, prisma);
const scope = await service.getAllowedScope('user-1', 'view', 'servers');
expect(scope.wildcard).toBe(true);
expect(scope.names.size).toBe(0);
});
it('returns wildcard:true with wildcard resource binding', async () => {
const repo = mockRepo([
makeDef({
subjects: [{ kind: 'User', name: 'alice@example.com' }],
roleBindings: [{ role: 'edit', resource: '*' }],
}),
]);
const prisma = mockPrisma({
user: { findUnique: vi.fn(async () => ({ email: 'alice@example.com' })) },
groupMember: { findMany: vi.fn(async () => []) },
});
const service = new RbacService(repo, prisma);
const scope = await service.getAllowedScope('user-1', 'view', 'servers');
expect(scope.wildcard).toBe(true);
});
});
describe('name-scoped binding → restricted', () => {
let service: RbacService;
beforeEach(() => {
const repo = mockRepo([
makeDef({
subjects: [{ kind: 'User', name: 'alice@example.com' }],
roleBindings: [{ role: 'view', resource: 'servers', name: 'my-ha' }],
}),
]);
const prisma = mockPrisma({
user: { findUnique: vi.fn(async () => ({ email: 'alice@example.com' })) },
groupMember: { findMany: vi.fn(async () => []) },
});
service = new RbacService(repo, prisma);
});
it('returns names containing the scoped name', async () => {
const scope = await service.getAllowedScope('user-1', 'view', 'servers');
expect(scope.wildcard).toBe(false);
expect(scope.names).toEqual(new Set(['my-ha']));
});
it('returns empty names for wrong resource', async () => {
const scope = await service.getAllowedScope('user-1', 'view', 'secrets');
expect(scope.wildcard).toBe(false);
expect(scope.names.size).toBe(0);
});
it('returns empty names for wrong action', async () => {
const scope = await service.getAllowedScope('user-1', 'edit', 'servers');
expect(scope.wildcard).toBe(false);
expect(scope.names.size).toBe(0);
});
});
describe('multiple name-scoped bindings → union of names', () => {
it('collects names from multiple bindings', async () => {
const repo = mockRepo([
makeDef({
id: 'def-1',
name: 'rbac-a',
subjects: [{ kind: 'User', name: 'alice@example.com' }],
roleBindings: [{ role: 'view', resource: 'servers', name: 'server-a' }],
}),
makeDef({
id: 'def-2',
name: 'rbac-b',
subjects: [{ kind: 'User', name: 'alice@example.com' }],
roleBindings: [{ role: 'view', resource: 'servers', name: 'server-b' }],
}),
]);
const prisma = mockPrisma({
user: { findUnique: vi.fn(async () => ({ email: 'alice@example.com' })) },
groupMember: { findMany: vi.fn(async () => []) },
});
const service = new RbacService(repo, prisma);
const scope = await service.getAllowedScope('user-1', 'view', 'servers');
expect(scope.wildcard).toBe(false);
expect(scope.names).toEqual(new Set(['server-a', 'server-b']));
});
});
describe('mixed scoped + unscoped → wildcard wins', () => {
it('returns wildcard:true when any binding is unscoped', async () => {
const repo = mockRepo([
makeDef({
subjects: [{ kind: 'User', name: 'alice@example.com' }],
roleBindings: [
{ role: 'view', resource: 'servers', name: 'my-ha' },
{ role: 'view', resource: 'servers' },
],
}),
]);
const prisma = mockPrisma({
user: { findUnique: vi.fn(async () => ({ email: 'alice@example.com' })) },
groupMember: { findMany: vi.fn(async () => []) },
});
const service = new RbacService(repo, prisma);
const scope = await service.getAllowedScope('user-1', 'view', 'servers');
expect(scope.wildcard).toBe(true);
});
});
describe('no matching permissions → empty', () => {
it('returns wildcard:false with empty names', async () => {
const repo = mockRepo([]);
const prisma = mockPrisma();
const service = new RbacService(repo, prisma);
const scope = await service.getAllowedScope('unknown', 'view', 'servers');
expect(scope.wildcard).toBe(false);
expect(scope.names.size).toBe(0);
});
});
describe('edit role grants view scope', () => {
let service: RbacService;
beforeEach(() => {
const repo = mockRepo([
makeDef({
subjects: [{ kind: 'User', name: 'alice@example.com' }],
roleBindings: [{ role: 'edit', resource: 'servers', name: 'my-ha' }],
}),
]);
const prisma = mockPrisma({
user: { findUnique: vi.fn(async () => ({ email: 'alice@example.com' })) },
groupMember: { findMany: vi.fn(async () => []) },
});
service = new RbacService(repo, prisma);
});
it('returns names for view action', async () => {
const scope = await service.getAllowedScope('user-1', 'view', 'servers');
expect(scope.wildcard).toBe(false);
expect(scope.names).toEqual(new Set(['my-ha']));
});
it('returns names for create action', async () => {
const scope = await service.getAllowedScope('user-1', 'create', 'servers');
expect(scope.wildcard).toBe(false);
expect(scope.names).toEqual(new Set(['my-ha']));
});
it('returns names for delete action', async () => {
const scope = await service.getAllowedScope('user-1', 'delete', 'servers');
expect(scope.wildcard).toBe(false);
expect(scope.names).toEqual(new Set(['my-ha']));
});
});
describe('operation bindings are ignored', () => {
it('returns empty names when only operation bindings exist', async () => {
const repo = mockRepo([
makeDef({
subjects: [{ kind: 'User', name: 'alice@example.com' }],
roleBindings: [{ role: 'run', action: 'logs' }],
}),
]);
const prisma = mockPrisma({
user: { findUnique: vi.fn(async () => ({ email: 'alice@example.com' })) },
groupMember: { findMany: vi.fn(async () => []) },
});
const service = new RbacService(repo, prisma);
const scope = await service.getAllowedScope('user-1', 'view', 'servers');
expect(scope.wildcard).toBe(false);
expect(scope.names.size).toBe(0);
});
});
});
describe('unknown/legacy roles are denied', () => {
let service: RbacService;
beforeEach(() => {
const repo = mockRepo([
makeDef({
roleBindings: [{ role: 'admin', resource: '*' }],
}),
]);
const prisma = mockPrisma({
user: { findUnique: vi.fn(async () => ({ email: 'alice@example.com' })) },
groupMember: { findMany: vi.fn(async () => []) },
});
service = new RbacService(repo, prisma);
});
it('denies view when only legacy admin role exists', async () => {
expect(await service.canAccess('user-1', 'view', 'servers')).toBe(false);
});
it('denies create when only legacy admin role exists', async () => {
expect(await service.canAccess('user-1', 'create', 'servers')).toBe(false);
});
it('denies edit when only legacy admin role exists', async () => {
expect(await service.canAccess('user-1', 'edit', 'servers')).toBe(false);
});
it('denies delete when only legacy admin role exists', async () => {
expect(await service.canAccess('user-1', 'delete', 'servers')).toBe(false);
});
it('denies any made-up role', async () => {
const repo = mockRepo([
makeDef({
roleBindings: [{ role: 'superuser', resource: 'servers' }],
}),
]);
const prisma = mockPrisma({
user: { findUnique: vi.fn(async () => ({ email: 'alice@example.com' })) },
groupMember: { findMany: vi.fn(async () => []) },
});
const svc = new RbacService(repo, prisma);
expect(await svc.canAccess('user-1', 'view', 'servers')).toBe(false);
expect(await svc.canAccess('user-1', 'edit', 'servers')).toBe(false);
});
});
describe('expose role', () => {
it('grants expose access with expose role binding', async () => {
const repo = mockRepo([
makeDef({
roleBindings: [{ role: 'expose', resource: 'projects' }],
}),
]);
const prisma = mockPrisma({
user: { findUnique: vi.fn(async () => ({ email: 'alice@example.com' })) },
groupMember: { findMany: vi.fn(async () => []) },
});
const service = new RbacService(repo, prisma);
expect(await service.canAccess('user-1', 'expose', 'projects')).toBe(true);
});
it('grants expose access with edit role binding (edit includes expose)', async () => {
const repo = mockRepo([
makeDef({
roleBindings: [{ role: 'edit', resource: 'projects' }],
}),
]);
const prisma = mockPrisma({
user: { findUnique: vi.fn(async () => ({ email: 'alice@example.com' })) },
groupMember: { findMany: vi.fn(async () => []) },
});
const service = new RbacService(repo, prisma);
expect(await service.canAccess('user-1', 'expose', 'projects')).toBe(true);
});
it('denies expose access with view role binding', async () => {
const repo = mockRepo([
makeDef({
roleBindings: [{ role: 'view', resource: 'projects' }],
}),
]);
const prisma = mockPrisma({
user: { findUnique: vi.fn(async () => ({ email: 'alice@example.com' })) },
groupMember: { findMany: vi.fn(async () => []) },
});
const service = new RbacService(repo, prisma);
expect(await service.canAccess('user-1', 'expose', 'projects')).toBe(false);
});
it('expose role also grants view access', async () => {
const repo = mockRepo([
makeDef({
roleBindings: [{ role: 'expose', resource: 'projects' }],
}),
]);
const prisma = mockPrisma({
user: { findUnique: vi.fn(async () => ({ email: 'alice@example.com' })) },
groupMember: { findMany: vi.fn(async () => []) },
});
const service = new RbacService(repo, prisma);
expect(await service.canAccess('user-1', 'view', 'projects')).toBe(true);
});
it('expose role with name-scoped binding', async () => {
const repo = mockRepo([
makeDef({
roleBindings: [{ role: 'expose', resource: 'projects', name: 'my-project' }],
}),
]);
const prisma = mockPrisma({
user: { findUnique: vi.fn(async () => ({ email: 'alice@example.com' })) },
groupMember: { findMany: vi.fn(async () => []) },
});
const service = new RbacService(repo, prisma);
expect(await service.canAccess('user-1', 'expose', 'projects', 'my-project')).toBe(true);
expect(await service.canAccess('user-1', 'expose', 'projects', 'other-project')).toBe(false);
});
it('getAllowedScope with expose role grants view scope', async () => {
const repo = mockRepo([
makeDef({
roleBindings: [{ role: 'expose', resource: 'projects' }],
}),
]);
const prisma = mockPrisma({
user: { findUnique: vi.fn(async () => ({ email: 'alice@example.com' })) },
groupMember: { findMany: vi.fn(async () => []) },
});
const service = new RbacService(repo, prisma);
const scope = await service.getAllowedScope('user-1', 'view', 'projects');
expect(scope.wildcard).toBe(true);
});
});
});
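The role semantics these tests pin down can be summarized as a small lookup table. This is an illustrative reconstruction from the expectations above, not the actual `RbacService` implementation:

```typescript
type Action = 'view' | 'create' | 'edit' | 'delete' | 'expose';

// Reconstructed from the tests: edit implies every action, expose also implies
// view, and unknown/legacy roles ('admin', 'superuser') grant nothing at all.
const ROLE_ACTIONS: Record<string, ReadonlySet<Action>> = {
  view: new Set<Action>(['view']),
  expose: new Set<Action>(['expose', 'view']),
  edit: new Set<Action>(['view', 'create', 'edit', 'delete', 'expose']),
};

function roleGrants(role: string, action: Action): boolean {
  return ROLE_ACTIONS[role]?.has(action) ?? false;
}
```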

@@ -0,0 +1,302 @@
import { describe, it, expect, vi, beforeEach } from 'vitest';
import { PromptService } from '../../src/services/prompt.service.js';
import type { IPromptRepository } from '../../src/repositories/prompt.repository.js';
import type { IPromptRequestRepository } from '../../src/repositories/prompt-request.repository.js';
import type { IProjectRepository } from '../../src/repositories/project.repository.js';
import type { Prompt, PromptRequest, Project } from '@prisma/client';
function makePrompt(overrides: Partial<Prompt> = {}): Prompt {
return {
id: 'prompt-1',
name: 'test-prompt',
content: 'Hello world',
projectId: null,
version: 1,
createdAt: new Date(),
updatedAt: new Date(),
...overrides,
};
}
function makePromptRequest(overrides: Partial<PromptRequest> = {}): PromptRequest {
return {
id: 'req-1',
name: 'test-request',
content: 'Proposed content',
projectId: null,
createdBySession: 'session-abc',
createdByUserId: null,
createdAt: new Date(),
...overrides,
};
}
function makeProject(overrides: Partial<Project> = {}): Project {
return {
id: 'proj-1',
name: 'test-project',
description: '',
prompt: '',
proxyMode: 'direct',
llmProvider: null,
llmModel: null,
ownerId: 'user-1',
createdAt: new Date(),
updatedAt: new Date(),
...overrides,
} as Project;
}
function mockPromptRepo(): IPromptRepository {
return {
findAll: vi.fn(async () => []),
findById: vi.fn(async () => null),
findByNameAndProject: vi.fn(async () => null),
create: vi.fn(async (data) => makePrompt(data)),
update: vi.fn(async (id, data) => makePrompt({ id, ...data })),
delete: vi.fn(async () => {}),
};
}
function mockPromptRequestRepo(): IPromptRequestRepository {
return {
findAll: vi.fn(async () => []),
findById: vi.fn(async () => null),
findByNameAndProject: vi.fn(async () => null),
findBySession: vi.fn(async () => []),
create: vi.fn(async (data) => makePromptRequest(data)),
delete: vi.fn(async () => {}),
};
}
function mockProjectRepo(): IProjectRepository {
return {
findAll: vi.fn(async () => []),
findById: vi.fn(async () => null),
findByName: vi.fn(async () => null),
create: vi.fn(async (data) => makeProject(data)),
update: vi.fn(async (id, data) => makeProject({ id, ...data })),
delete: vi.fn(async () => {}),
};
}
describe('PromptService', () => {
let promptRepo: IPromptRepository;
let promptRequestRepo: IPromptRequestRepository;
let projectRepo: IProjectRepository;
let service: PromptService;
beforeEach(() => {
promptRepo = mockPromptRepo();
promptRequestRepo = mockPromptRequestRepo();
projectRepo = mockProjectRepo();
service = new PromptService(promptRepo, promptRequestRepo, projectRepo);
});
// ── Prompt CRUD ──
describe('listPrompts', () => {
it('should return all prompts', async () => {
const prompts = [makePrompt(), makePrompt({ id: 'prompt-2', name: 'other' })];
vi.mocked(promptRepo.findAll).mockResolvedValue(prompts);
const result = await service.listPrompts();
expect(result).toEqual(prompts);
expect(promptRepo.findAll).toHaveBeenCalledWith(undefined);
});
it('should filter by projectId', async () => {
await service.listPrompts('proj-1');
expect(promptRepo.findAll).toHaveBeenCalledWith('proj-1');
});
});
describe('getPrompt', () => {
it('should return a prompt by id', async () => {
const prompt = makePrompt();
vi.mocked(promptRepo.findById).mockResolvedValue(prompt);
const result = await service.getPrompt('prompt-1');
expect(result).toEqual(prompt);
});
it('should throw NotFoundError for missing prompt', async () => {
await expect(service.getPrompt('nope')).rejects.toThrow('Prompt not found: nope');
});
});
describe('createPrompt', () => {
it('should create a prompt', async () => {
const result = await service.createPrompt({ name: 'new-prompt', content: 'stuff' });
expect(promptRepo.create).toHaveBeenCalledWith({ name: 'new-prompt', content: 'stuff' });
expect(result.name).toBe('new-prompt');
});
it('should validate project exists when projectId given', async () => {
vi.mocked(projectRepo.findById).mockResolvedValue(makeProject());
await service.createPrompt({ name: 'scoped', content: 'x', projectId: 'proj-1' });
expect(projectRepo.findById).toHaveBeenCalledWith('proj-1');
});
it('should throw when project not found', async () => {
await expect(
service.createPrompt({ name: 'bad', content: 'x', projectId: 'nope' }),
).rejects.toThrow('Project not found: nope');
});
it('should reject invalid name format', async () => {
await expect(
service.createPrompt({ name: 'INVALID_NAME', content: 'x' }),
).rejects.toThrow();
});
});
describe('updatePrompt', () => {
it('should update prompt content', async () => {
vi.mocked(promptRepo.findById).mockResolvedValue(makePrompt());
await service.updatePrompt('prompt-1', { content: 'updated' });
expect(promptRepo.update).toHaveBeenCalledWith('prompt-1', { content: 'updated' });
});
it('should throw for missing prompt', async () => {
await expect(service.updatePrompt('nope', { content: 'x' })).rejects.toThrow('Prompt not found');
});
});
describe('deletePrompt', () => {
it('should delete an existing prompt', async () => {
vi.mocked(promptRepo.findById).mockResolvedValue(makePrompt());
await service.deletePrompt('prompt-1');
expect(promptRepo.delete).toHaveBeenCalledWith('prompt-1');
});
it('should throw for missing prompt', async () => {
await expect(service.deletePrompt('nope')).rejects.toThrow('Prompt not found');
});
});
// ── PromptRequest CRUD ──
describe('listPromptRequests', () => {
it('should return all prompt requests', async () => {
const reqs = [makePromptRequest()];
vi.mocked(promptRequestRepo.findAll).mockResolvedValue(reqs);
const result = await service.listPromptRequests();
expect(result).toEqual(reqs);
});
});
describe('getPromptRequest', () => {
it('should return a prompt request by id', async () => {
const req = makePromptRequest();
vi.mocked(promptRequestRepo.findById).mockResolvedValue(req);
const result = await service.getPromptRequest('req-1');
expect(result).toEqual(req);
});
it('should throw for missing request', async () => {
await expect(service.getPromptRequest('nope')).rejects.toThrow('PromptRequest not found');
});
});
describe('deletePromptRequest', () => {
it('should delete an existing request', async () => {
vi.mocked(promptRequestRepo.findById).mockResolvedValue(makePromptRequest());
await service.deletePromptRequest('req-1');
expect(promptRequestRepo.delete).toHaveBeenCalledWith('req-1');
});
});
// ── Propose ──
describe('propose', () => {
it('should create a prompt request', async () => {
const result = await service.propose({
name: 'my-prompt',
content: 'proposal',
createdBySession: 'sess-1',
});
expect(promptRequestRepo.create).toHaveBeenCalledWith(
expect.objectContaining({ name: 'my-prompt', content: 'proposal', createdBySession: 'sess-1' }),
);
expect(result.name).toBe('my-prompt');
});
it('should validate project exists when projectId given', async () => {
vi.mocked(projectRepo.findById).mockResolvedValue(makeProject());
await service.propose({
name: 'scoped',
content: 'x',
projectId: 'proj-1',
});
expect(projectRepo.findById).toHaveBeenCalledWith('proj-1');
});
});
// ── Approve ──
describe('approve', () => {
it('should delete request and create prompt (atomic)', async () => {
const req = makePromptRequest({ id: 'req-1', name: 'approved', content: 'good stuff', projectId: 'proj-1' });
vi.mocked(promptRequestRepo.findById).mockResolvedValue(req);
const result = await service.approve('req-1');
expect(promptRepo.create).toHaveBeenCalledWith(
expect.objectContaining({ name: 'approved', content: 'good stuff', projectId: 'proj-1' }),
);
expect(promptRequestRepo.delete).toHaveBeenCalledWith('req-1');
expect(result.name).toBe('approved');
});
it('should throw for missing request', async () => {
await expect(service.approve('nope')).rejects.toThrow('PromptRequest not found');
});
it('should handle global prompt (no projectId)', async () => {
const req = makePromptRequest({ id: 'req-2', name: 'global', content: 'stuff', projectId: null });
vi.mocked(promptRequestRepo.findById).mockResolvedValue(req);
await service.approve('req-2');
// Should NOT include projectId in the create call
const createArg = vi.mocked(promptRepo.create).mock.calls[0]![0];
expect(createArg).not.toHaveProperty('projectId');
});
});
// ── Visibility ──
describe('getVisiblePrompts', () => {
it('should return approved prompts and session requests', async () => {
vi.mocked(promptRepo.findAll).mockResolvedValue([
makePrompt({ name: 'approved-1', content: 'A' }),
]);
vi.mocked(promptRequestRepo.findBySession).mockResolvedValue([
makePromptRequest({ name: 'pending-1', content: 'B' }),
]);
const result = await service.getVisiblePrompts('proj-1', 'sess-1');
expect(result).toHaveLength(2);
expect(result[0]).toEqual({ name: 'approved-1', content: 'A', type: 'prompt' });
expect(result[1]).toEqual({ name: 'pending-1', content: 'B', type: 'promptrequest' });
});
it('should not include pending requests without sessionId', async () => {
vi.mocked(promptRepo.findAll).mockResolvedValue([makePrompt()]);
const result = await service.getVisiblePrompts('proj-1');
expect(result).toHaveLength(1);
expect(promptRequestRepo.findBySession).not.toHaveBeenCalled();
});
it('should return empty when no prompts or requests', async () => {
const result = await service.getVisiblePrompts();
expect(result).toEqual([]);
});
});
});
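A sketch of the merge behavior the `getVisiblePrompts` tests describe (illustrative only; the real service also filters by project and session):

```typescript
interface Named { name: string; content: string }

// Approved prompts come first, then the caller's own pending requests,
// each entry tagged so the client can distinguish the two kinds.
function mergeVisible(prompts: Named[], sessionRequests: Named[]) {
  return [
    ...prompts.map((p) => ({ name: p.name, content: p.content, type: 'prompt' as const })),
    ...sessionRequests.map((r) => ({ name: r.name, content: r.content, type: 'promptrequest' as const })),
  ];
}
```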

@@ -5,6 +5,7 @@ import { McpdUpstream } from './upstream/mcpd.js';
interface McpdServer {
id: string;
name: string;
description?: string;
transport: string;
status?: string;
}
@@ -35,7 +36,7 @@ export async function refreshProjectUpstreams(
let servers: McpdServer[];
if (authToken) {
// Forward the client's auth token to mcpd so RBAC applies
const result = await mcpdClient.forward('GET', path, '', undefined);
const result = await mcpdClient.forward('GET', path, '', undefined, authToken);
if (result.status >= 400) {
throw new Error(`Failed to fetch project servers: ${result.status}`);
}
@@ -63,7 +64,7 @@ function syncUpstreams(router: McpRouter, mcpdClient: McpdClient, servers: McpdS
// Add/update upstreams for each server
for (const server of servers) {
if (!currentNames.has(server.name)) {
const upstream = new McpdUpstream(server.id, server.name, mcpdClient);
const upstream = new McpdUpstream(server.id, server.name, mcpdClient, server.description);
router.addUpstream(upstream);
}
registered.push(server.name);

@@ -1,3 +1,7 @@
import { existsSync, readFileSync } from 'node:fs';
import { join } from 'node:path';
import { homedir } from 'node:os';
/** Configuration for the mcplocal HTTP server. */
export interface HttpConfig {
/** Port for the HTTP server (default: 3200) */
@@ -15,9 +19,24 @@ export interface HttpConfig {
const DEFAULT_HTTP_PORT = 3200;
const DEFAULT_HTTP_HOST = '127.0.0.1';
const DEFAULT_MCPD_URL = 'http://localhost:3100';
const DEFAULT_MCPD_TOKEN = '';
const DEFAULT_LOG_LEVEL = 'info';
/**
* Read the user's mcpctl credentials from ~/.mcpctl/credentials.
* Returns the token if found, empty string otherwise.
*/
function loadUserToken(): string {
try {
const credPath = join(homedir(), '.mcpctl', 'credentials');
if (!existsSync(credPath)) return '';
const raw = readFileSync(credPath, 'utf-8');
const parsed = JSON.parse(raw) as { token?: string };
return parsed.token ?? '';
} catch {
return '';
}
}
export function loadHttpConfig(env: Record<string, string | undefined> = process.env): HttpConfig {
const portStr = env['MCPLOCAL_HTTP_PORT'];
const port = portStr !== undefined ? parseInt(portStr, 10) : DEFAULT_HTTP_PORT;
@@ -26,7 +45,7 @@ export function loadHttpConfig(env: Record<string, string | undefined> = process
httpPort: Number.isFinite(port) ? port : DEFAULT_HTTP_PORT,
httpHost: env['MCPLOCAL_HTTP_HOST'] ?? DEFAULT_HTTP_HOST,
mcpdUrl: env['MCPLOCAL_MCPD_URL'] ?? DEFAULT_MCPD_URL,
mcpdToken: env['MCPLOCAL_MCPD_TOKEN'] ?? DEFAULT_MCPD_TOKEN,
mcpdToken: env['MCPLOCAL_MCPD_TOKEN'] ?? loadUserToken(),
logLevel: (env['MCPLOCAL_LOG_LEVEL'] as HttpConfig['logLevel'] | undefined) ?? DEFAULT_LOG_LEVEL,
};
}
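The token-resolution order in `loadHttpConfig` can be isolated as a tiny standalone sketch (`resolveToken` is not a real function in the codebase, just an illustration of the precedence in the diff above):

```typescript
// An explicitly set MCPLOCAL_MCPD_TOKEN always wins; only when the env var is
// absent does the loader fall back to the ~/.mcpctl/credentials file.
function resolveToken(
  env: Record<string, string | undefined>,
  readCredentialsToken: () => string,
): string {
  return env['MCPLOCAL_MCPD_TOKEN'] ?? readCredentialsToken();
}
```

Note that `??` (not `||`) is used, so an env var explicitly set to the empty string still suppresses the file fallback, matching the diff above.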

@@ -23,11 +23,21 @@ export class ConnectionError extends Error {
export class McpdClient {
private readonly baseUrl: string;
private readonly token: string;
private readonly extraHeaders: Record<string, string>;
constructor(baseUrl: string, token: string) {
constructor(baseUrl: string, token: string, extraHeaders?: Record<string, string>) {
// Strip trailing slash for consistent URL joining
this.baseUrl = baseUrl.replace(/\/+$/, '');
this.token = token;
this.extraHeaders = extraHeaders ?? {};
}
/**
* Create a new client with additional default headers.
* Inherits base URL and token from the current client.
*/
withHeaders(headers: Record<string, string>): McpdClient {
return new McpdClient(this.baseUrl, this.token, { ...this.extraHeaders, ...headers });
}
async get<T>(path: string): Promise<T> {
@@ -62,6 +72,7 @@ export class McpdClient {
): Promise<{ status: number; body: unknown }> {
const url = `${this.baseUrl}${path}${query ? `?${query}` : ''}`;
const headers: Record<string, string> = {
...this.extraHeaders,
'Authorization': `Bearer ${authOverride ?? this.token}`,
'Accept': 'application/json',
};

@@ -12,6 +12,7 @@ import type { FastifyInstance } from 'fastify';
import { StreamableHTTPServerTransport } from '@modelcontextprotocol/sdk/server/streamableHttp.js';
import type { JSONRPCMessage } from '@modelcontextprotocol/sdk/types.js';
import { McpRouter } from '../router.js';
import { ResponsePaginator } from '../llm/pagination.js';
import { refreshProjectUpstreams } from '../discovery.js';
import type { McpdClient } from './mcpd-client.js';
import type { JsonRpcRequest } from '../types.js';
@@ -44,6 +45,35 @@ export function registerProjectMcpEndpoint(app: FastifyInstance, mcpdClient: Mcp
const router = existing?.router ?? new McpRouter();
await refreshProjectUpstreams(router, mcpdClient, projectName, authToken);
// Wire pagination support (no LLM provider for now — simple index fallback)
router.setPaginator(new ResponsePaginator(null));
// Configure prompt resources with SA-scoped client for RBAC
const saClient = mcpdClient.withHeaders({ 'X-Service-Account': `project:${projectName}` });
router.setPromptConfig(saClient, projectName);
// Fetch project instructions and set on router
try {
const instructions = await mcpdClient.get<{ prompt: string; servers: Array<{ name: string; description: string }> }>(
`/api/v1/projects/${encodeURIComponent(projectName)}/instructions`,
);
const parts: string[] = [];
if (instructions.prompt) {
parts.push(instructions.prompt);
}
if (instructions.servers.length > 0) {
parts.push('Available MCP servers:');
for (const s of instructions.servers) {
parts.push(`- ${s.name}${s.description ? `: ${s.description}` : ''}`);
}
}
if (parts.length > 0) {
router.setInstructions(parts.join('\n'));
}
} catch {
// Instructions are optional — don't fail if endpoint is unavailable
}
projectCache.set(projectName, { router, lastRefresh: now });
return router;
}
@@ -84,7 +114,8 @@ export function registerProjectMcpEndpoint(app: FastifyInstance, mcpdClient: Mcp
transport.onmessage = async (message: JSONRPCMessage) => {
if ('method' in message && 'id' in message) {
const response = await router.route(message as unknown as JsonRpcRequest);
const ctx = transport.sessionId ? { sessionId: transport.sessionId } : undefined;
const response = await router.route(message as unknown as JsonRpcRequest, ctx);
await transport.send(response as unknown as JSONRPCMessage);
}
};

@@ -6,3 +6,5 @@ export { FilterCache, DEFAULT_FILTER_CACHE_CONFIG } from './filter-cache.js';
export type { FilterCacheConfig } from './filter-cache.js';
export { FilterMetrics } from './metrics.js';
export type { FilterMetricsSnapshot } from './metrics.js';
export { ResponsePaginator, DEFAULT_PAGINATION_CONFIG, PAGINATION_INDEX_SYSTEM_PROMPT } from './pagination.js';
export type { PaginationConfig, PaginationIndex, PageSummary, PaginatedToolResponse } from './pagination.js';

@@ -0,0 +1,354 @@
import { randomUUID } from 'node:crypto';
import type { ProviderRegistry } from '../providers/registry.js';
import { estimateTokens } from './token-counter.js';
// --- Configuration ---
export interface PaginationConfig {
/** Character threshold above which responses get paginated (default 80_000) */
sizeThreshold: number;
/** Characters per page (default 40_000) */
pageSize: number;
/** Max cached results (LRU eviction) (default 64) */
maxCachedResults: number;
/** TTL for cached results in ms (default 300_000 = 5 min) */
ttlMs: number;
/** Max tokens for the LLM index generation call (default 2048) */
indexMaxTokens: number;
}
export const DEFAULT_PAGINATION_CONFIG: PaginationConfig = {
sizeThreshold: 80_000,
pageSize: 40_000,
maxCachedResults: 64,
ttlMs: 300_000,
indexMaxTokens: 2048,
};
// --- Cache Entry ---
interface PageInfo {
/** 0-based page index */
index: number;
/** Start character offset in the raw string */
startChar: number;
/** End character offset (exclusive) */
endChar: number;
/** Approximate token count */
estimatedTokens: number;
}
interface CachedResult {
resultId: string;
toolName: string;
raw: string;
pages: PageInfo[];
index: PaginationIndex;
createdAt: number;
}
// --- Index Types ---
export interface PageSummary {
page: number;
startChar: number;
endChar: number;
estimatedTokens: number;
summary: string;
}
export interface PaginationIndex {
resultId: string;
toolName: string;
totalSize: number;
totalTokens: number;
totalPages: number;
pageSummaries: PageSummary[];
indexType: 'smart' | 'simple';
}
// --- The MCP response format ---
export interface PaginatedToolResponse {
content: Array<{
type: 'text';
text: string;
}>;
}
// --- LLM Prompt ---
export const PAGINATION_INDEX_SYSTEM_PROMPT = `You are a document indexing assistant. Given a large tool response split into pages, generate a concise summary for each page describing what data it contains.
Rules:
- For each page, write 1-2 sentences describing the key content
- Be specific: mention entity names, IDs, counts, or key fields visible on that page
- If it's JSON, describe the structure and notable entries
- If it's text, describe the topics covered
- Output valid JSON only: an array of objects with "page" (1-based number) and "summary" (string)
- Example output: [{"page": 1, "summary": "Configuration nodes and global settings (inject, debug, function nodes 1-15)"}, {"page": 2, "summary": "HTTP request nodes and API integrations (nodes 16-40)"}]`;
/**
* Handles transparent pagination of large MCP tool responses.
*
* When a tool response exceeds the size threshold, it is cached and an
* index is returned instead. The LLM can then request specific pages
* via _page/_resultId parameters on subsequent tool calls.
*
* If an LLM provider is available, the index includes AI-generated
* per-page summaries. Otherwise, simple character-range descriptions are used.
*/
export class ResponsePaginator {
private cache = new Map<string, CachedResult>();
private readonly config: PaginationConfig;
constructor(
private providers: ProviderRegistry | null,
config: Partial<PaginationConfig> = {},
) {
this.config = { ...DEFAULT_PAGINATION_CONFIG, ...config };
}
/**
* Check if a raw response string should be paginated.
*/
shouldPaginate(raw: string): boolean {
return raw.length >= this.config.sizeThreshold;
}
/**
* Paginate a large response: cache it and return the index.
* Returns null if the response is below threshold.
*/
async paginate(toolName: string, raw: string): Promise<PaginatedToolResponse | null> {
if (!this.shouldPaginate(raw)) return null;
const resultId = randomUUID();
const pages = this.splitPages(raw);
let index: PaginationIndex;
try {
index = await this.generateSmartIndex(resultId, toolName, raw, pages);
} catch {
index = this.generateSimpleIndex(resultId, toolName, raw, pages);
}
// Store in cache
this.evictExpired();
this.evictLRU();
this.cache.set(resultId, {
resultId,
toolName,
raw,
pages,
index,
createdAt: Date.now(),
});
return this.formatIndexResponse(index);
}
/**
* Serve a specific page from cache.
* Returns null if the resultId is not found (cache miss / expired).
*/
getPage(resultId: string, page: number | 'all'): PaginatedToolResponse | null {
this.evictExpired();
const entry = this.cache.get(resultId);
if (!entry) return null;
if (page === 'all') {
return {
content: [{ type: 'text', text: entry.raw }],
};
}
// Pages are 1-based in the API
const pageInfo = entry.pages[page - 1];
if (!pageInfo) {
return {
content: [{
type: 'text',
text: `Error: page ${String(page)} is out of range. This result has ${String(entry.pages.length)} pages (1-${String(entry.pages.length)}).`,
}],
};
}
const pageContent = entry.raw.slice(pageInfo.startChar, pageInfo.endChar);
return {
content: [{
type: 'text',
text: `[Page ${String(page)}/${String(entry.pages.length)} of result ${resultId}]\n\n${pageContent}`,
}],
};
}
/**
* Check if a tool call has pagination parameters (_page / _resultId).
* Returns the parsed pagination request, or null if not a pagination request.
*/
static extractPaginationParams(
args: Record<string, unknown>,
): { resultId: string; page: number | 'all' } | null {
const resultId = args['_resultId'];
const pageParam = args['_page'];
if (typeof resultId !== 'string' || pageParam === undefined) return null;
if (pageParam === 'all') return { resultId, page: 'all' };
const page = Number(pageParam);
if (!Number.isInteger(page) || page < 1) return null;
return { resultId, page };
}
// --- Private methods ---
private splitPages(raw: string): PageInfo[] {
const pages: PageInfo[] = [];
let offset = 0;
let pageIndex = 0;
while (offset < raw.length) {
const end = Math.min(offset + this.config.pageSize, raw.length);
// Try to break at a newline boundary if we're not at the end
let breakAt = end;
if (end < raw.length) {
const lastNewline = raw.lastIndexOf('\n', end);
if (lastNewline > offset) {
breakAt = lastNewline + 1;
}
}
pages.push({
index: pageIndex,
startChar: offset,
endChar: breakAt,
estimatedTokens: estimateTokens(raw.slice(offset, breakAt)),
});
offset = breakAt;
pageIndex++;
}
return pages;
}
private async generateSmartIndex(
resultId: string,
toolName: string,
raw: string,
pages: PageInfo[],
): Promise<PaginationIndex> {
const provider = this.providers?.getActive();
if (!provider) {
return this.generateSimpleIndex(resultId, toolName, raw, pages);
}
// Build a prompt with page previews (first ~500 chars of each page)
const previews = pages.map((p, i) => {
const preview = raw.slice(p.startChar, Math.min(p.startChar + 500, p.endChar));
const truncated = p.endChar - p.startChar > 500 ? '\n[...]' : '';
return `--- Page ${String(i + 1)} (chars ${String(p.startChar)}-${String(p.endChar)}, ~${String(p.estimatedTokens)} tokens) ---\n${preview}${truncated}`;
}).join('\n\n');
const result = await provider.complete({
messages: [
{ role: 'system', content: PAGINATION_INDEX_SYSTEM_PROMPT },
{ role: 'user', content: `Tool: ${toolName}\nTotal size: ${String(raw.length)} chars, ${String(pages.length)} pages\n\n${previews}` },
],
maxTokens: this.config.indexMaxTokens,
temperature: 0,
});
const summaries = JSON.parse(result.content) as Array<{ page: number; summary: string }>;
return {
resultId,
toolName,
totalSize: raw.length,
totalTokens: estimateTokens(raw),
totalPages: pages.length,
indexType: 'smart',
pageSummaries: pages.map((p, i) => ({
page: i + 1,
startChar: p.startChar,
endChar: p.endChar,
estimatedTokens: p.estimatedTokens,
summary: summaries.find((s) => s.page === i + 1)?.summary ?? `Page ${String(i + 1)}`,
})),
};
}
private generateSimpleIndex(
resultId: string,
toolName: string,
raw: string,
pages: PageInfo[],
): PaginationIndex {
return {
resultId,
toolName,
totalSize: raw.length,
totalTokens: estimateTokens(raw),
totalPages: pages.length,
indexType: 'simple',
pageSummaries: pages.map((p, i) => ({
page: i + 1,
startChar: p.startChar,
endChar: p.endChar,
estimatedTokens: p.estimatedTokens,
summary: `Page ${String(i + 1)}: characters ${String(p.startChar)}-${String(p.endChar)} (~${String(p.estimatedTokens)} tokens)`,
})),
};
}
private formatIndexResponse(index: PaginationIndex): PaginatedToolResponse {
const lines = [
`This response is too large to return directly (${String(index.totalSize)} chars, ~${String(index.totalTokens)} tokens).`,
`It has been split into ${String(index.totalPages)} pages.`,
'',
'To retrieve a specific page, call this same tool again with additional arguments:',
` "_resultId": "${index.resultId}"`,
` "_page": <page_number> (1-${String(index.totalPages)})`,
' "_page": "all" (returns the full response)',
'',
`--- Page Index${index.indexType === 'smart' ? ' (AI-generated summaries)' : ''} ---`,
];
for (const page of index.pageSummaries) {
lines.push(` Page ${String(page.page)}: ${page.summary}`);
}
return {
content: [{ type: 'text', text: lines.join('\n') }],
};
}
private evictExpired(): void {
const now = Date.now();
for (const [id, entry] of this.cache) {
if (now - entry.createdAt > this.config.ttlMs) {
this.cache.delete(id);
}
}
}
/** Evict oldest entries until the cache is below capacity. A Map iterates in insertion order, so the first key is the oldest; reads do not refresh recency, making this FIFO rather than strict LRU. */
private evictLRU(): void {
while (this.cache.size >= this.config.maxCachedResults) {
const oldest = this.cache.keys().next();
if (oldest.done) break;
this.cache.delete(oldest.value);
}
}
/** Exposed for testing. */
get cacheSize(): number {
return this.cache.size;
}
/** Clear all cached results. */
clearCache(): void {
this.cache.clear();
}
}
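The newline-aware page splitting above can be reduced to a standalone sketch. This is an illustration, not the module itself: `estimateTokens` here is a stand-in (chars / 4), and the helper reuses only the boundary logic from `splitPages`.

```typescript
// Sketch of splitPages: cut the raw string into pageSize windows, but prefer
// to break just after the last newline inside each window so lines stay whole.
interface Page {
  index: number;
  startChar: number;
  endChar: number; // exclusive
  estimatedTokens: number;
}

// Stand-in token estimate (~4 chars per token); the real helper may differ.
function estimateTokens(s: string): number {
  return Math.ceil(s.length / 4);
}

function splitPages(raw: string, pageSize: number): Page[] {
  const pages: Page[] = [];
  let offset = 0;
  while (offset < raw.length) {
    const end = Math.min(offset + pageSize, raw.length);
    let breakAt = end;
    if (end < raw.length) {
      // Search backward from the window end for a newline to break after.
      const lastNewline = raw.lastIndexOf('\n', end);
      if (lastNewline > offset) breakAt = lastNewline + 1;
    }
    pages.push({
      index: pages.length,
      startChar: offset,
      endChar: breakAt,
      estimatedTokens: estimateTokens(raw.slice(offset, breakAt)),
    });
    offset = breakAt;
  }
  return pages;
}

// Six 26-char rows joined by newlines (161 chars), split into 60-char pages.
const raw = Array.from({ length: 6 }, (_, i) => `row-${i} ${'x'.repeat(20)}`).join('\n');
const pages = splitPages(raw, 60);
console.log(pages.length, pages.every((p, i) => i === pages.length - 1 || raw[p.endChar - 1] === '\n'));
// → 3 true
```

Every non-final page ends immediately after a newline; only a page with no interior newline falls back to a hard cut at `pageSize`.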

View File

@@ -1,5 +1,11 @@
import type { UpstreamConnection, JsonRpcRequest, JsonRpcResponse, JsonRpcNotification } from './types.js';
import type { LlmProcessor } from './llm/processor.js';
import { ResponsePaginator } from './llm/pagination.js';
import type { McpdClient } from './http/mcpd-client.js';
export interface RouteContext {
sessionId?: string;
}
/**
* Routes MCP requests to the appropriate upstream server.
@@ -17,11 +23,29 @@ export class McpRouter {
private promptToServer = new Map<string, string>();
private notificationHandler: ((notification: JsonRpcNotification) => void) | null = null;
private llmProcessor: LlmProcessor | null = null;
private instructions: string | null = null;
private mcpdClient: McpdClient | null = null;
private projectName: string | null = null;
private mcpctlResourceContents = new Map<string, string>();
private paginator: ResponsePaginator | null = null;
setPaginator(paginator: ResponsePaginator): void {
this.paginator = paginator;
}
setLlmProcessor(processor: LlmProcessor): void {
this.llmProcessor = processor;
}
setInstructions(instructions: string): void {
this.instructions = instructions;
}
setPromptConfig(mcpdClient: McpdClient, projectName: string): void {
this.mcpdClient = mcpdClient;
this.projectName = projectName;
}
addUpstream(connection: UpstreamConnection): void {
this.upstreams.set(connection.name, connection);
if (this.notificationHandler && connection.onNotification) {
@@ -87,10 +111,18 @@ export class McpRouter {
for (const tool of tools) {
const namespacedName = `${serverName}/${tool.name}`;
this.toolToServer.set(namespacedName, serverName);
allTools.push({
// Enrich description with server context if available
const entry: { name: string; description?: string; inputSchema?: unknown } = {
...tool,
name: namespacedName,
});
};
if (upstream.description && tool.description) {
entry.description = `[${upstream.description}] ${tool.description}`;
} else if (upstream.description) {
entry.description = `[${upstream.description}]`;
}
// Without upstream.description, the spread already carried over tool.description (possibly undefined), so there is nothing to set
allTools.push(entry);
}
}
} catch {
@@ -223,7 +255,7 @@ export class McpRouter {
* Route a generic request. Handles protocol-level methods locally,
* delegates tool/resource/prompt calls to upstreams.
*/
async route(request: JsonRpcRequest): Promise<JsonRpcResponse> {
async route(request: JsonRpcRequest, context?: RouteContext): Promise<JsonRpcResponse> {
switch (request.method) {
case 'initialize':
return {
@@ -240,11 +272,27 @@ export class McpRouter {
resources: {},
prompts: {},
},
...(this.instructions ? { instructions: this.instructions } : {}),
},
};
case 'tools/list': {
const tools = await this.discoverTools();
// Append propose_prompt tool if prompt config is set
if (this.mcpdClient && this.projectName) {
tools.push({
name: 'propose_prompt',
description: 'Propose a new prompt for this project. Creates a pending request that must be approved by a user before becoming active.',
inputSchema: {
type: 'object',
properties: {
name: { type: 'string', description: 'Prompt name (lowercase alphanumeric with hyphens, e.g. "debug-guide")' },
content: { type: 'string', description: 'Prompt content text' },
},
required: ['name', 'content'],
},
});
}
return {
jsonrpc: '2.0',
id: request.id,
@@ -253,10 +301,32 @@ export class McpRouter {
}
case 'tools/call':
return this.routeToolCall(request);
return this.routeToolCall(request, context);
case 'resources/list': {
const resources = await this.discoverResources();
// Append mcpctl prompt resources
if (this.mcpdClient && this.projectName) {
try {
const sessionParam = context?.sessionId ? `?session=${encodeURIComponent(context.sessionId)}` : '';
const visible = await this.mcpdClient.get<Array<{ name: string; content: string; type: string }>>(
`/api/v1/projects/${encodeURIComponent(this.projectName)}/prompts/visible${sessionParam}`,
);
this.mcpctlResourceContents.clear();
for (const p of visible) {
const uri = `mcpctl://prompts/${p.name}`;
resources.push({
uri,
name: p.name,
description: p.type === 'promptrequest' ? `[Pending proposal] ${p.name}` : `[Approved prompt] ${p.name}`,
mimeType: 'text/plain',
});
this.mcpctlResourceContents.set(uri, p.content);
}
} catch {
// Prompt resources are optional — don't fail discovery
}
}
return {
jsonrpc: '2.0',
id: request.id,
@@ -264,8 +334,28 @@ export class McpRouter {
};
}
case 'resources/read':
case 'resources/read': {
const params = request.params as Record<string, unknown> | undefined;
const uri = params?.['uri'] as string | undefined;
if (uri?.startsWith('mcpctl://')) {
const content = this.mcpctlResourceContents.get(uri);
if (content !== undefined) {
return {
jsonrpc: '2.0',
id: request.id,
result: {
contents: [{ uri, mimeType: 'text/plain', text: content }],
},
};
}
return {
jsonrpc: '2.0',
id: request.id,
error: { code: -32602, message: `Resource not found: ${uri}` },
};
}
return this.routeNamespacedCall(request, 'uri', this.resourceToServer);
}
case 'resources/subscribe':
case 'resources/unsubscribe':
@@ -283,6 +373,17 @@ export class McpRouter {
case 'prompts/get':
return this.routeNamespacedCall(request, 'name', this.promptToServer);
// Handle MCP notifications (no response expected, but return empty result if called as request)
case 'notifications/initialized':
case 'notifications/cancelled':
case 'notifications/progress':
case 'notifications/roots/list_changed':
return {
jsonrpc: '2.0',
id: request.id,
result: {},
};
default:
return {
jsonrpc: '2.0',
@@ -295,18 +396,45 @@ export class McpRouter {
/**
* Route a tools/call request, optionally applying LLM pre/post-processing.
*/
private async routeToolCall(request: JsonRpcRequest): Promise<JsonRpcResponse> {
private async routeToolCall(request: JsonRpcRequest, context?: RouteContext): Promise<JsonRpcResponse> {
const params = request.params as Record<string, unknown> | undefined;
const toolName = params?.['name'] as string | undefined;
// Handle built-in propose_prompt tool
if (toolName === 'propose_prompt') {
return this.handleProposePrompt(request, context);
}
// Intercept pagination page requests before routing to upstream
const toolArgs = (params?.['arguments'] ?? {}) as Record<string, unknown>;
if (this.paginator) {
const paginationReq = ResponsePaginator.extractPaginationParams(toolArgs);
if (paginationReq) {
const pageResult = this.paginator.getPage(paginationReq.resultId, paginationReq.page);
if (pageResult) {
return { jsonrpc: '2.0', id: request.id, result: pageResult };
}
return {
jsonrpc: '2.0',
id: request.id,
result: {
content: [{
type: 'text',
text: 'Cached result not found (expired or invalid _resultId). Please re-call the tool without _resultId/_page to get a fresh result.',
}],
},
};
}
}
// If no processor or tool shouldn't be processed, route directly
if (!this.llmProcessor || !toolName || !this.llmProcessor.shouldProcess('tools/call', toolName)) {
return this.routeNamespacedCall(request, 'name', this.toolToServer);
const response = await this.routeNamespacedCall(request, 'name', this.toolToServer);
return this.maybePaginate(toolName, response);
}
// Preprocess request params
const toolParams = (params?.['arguments'] ?? {}) as Record<string, unknown>;
const processed = await this.llmProcessor.preprocessRequest(toolName, toolParams);
const processed = await this.llmProcessor.preprocessRequest(toolName, toolArgs);
const processedRequest: JsonRpcRequest = processed.optimized
? { ...request, params: { ...params, arguments: processed.params } }
: request;
@@ -314,6 +442,10 @@ export class McpRouter {
// Route to upstream
const response = await this.routeNamespacedCall(processedRequest, 'name', this.toolToServer);
// Paginate if response is large (skip LLM filtering for paginated responses)
const paginated = await this.maybePaginate(toolName, response);
if (paginated !== response) return paginated;
// Filter response
if (response.error) return response;
const filtered = await this.llmProcessor.filterResponse(toolName, response);
@@ -323,6 +455,76 @@ export class McpRouter {
return response;
}
/**
* If the response is large enough, paginate it and return the index instead.
*/
private async maybePaginate(toolName: string | undefined, response: JsonRpcResponse): Promise<JsonRpcResponse> {
if (!this.paginator || !toolName || response.error) return response;
const raw = JSON.stringify(response.result);
if (!this.paginator.shouldPaginate(raw)) return response;
const paginated = await this.paginator.paginate(toolName, raw);
if (!paginated) return response;
return { jsonrpc: '2.0', id: response.id, result: paginated };
}
private async handleProposePrompt(request: JsonRpcRequest, context?: RouteContext): Promise<JsonRpcResponse> {
if (!this.mcpdClient || !this.projectName) {
return {
jsonrpc: '2.0',
id: request.id,
error: { code: -32603, message: 'Prompt config not set — propose_prompt unavailable' },
};
}
const params = request.params as Record<string, unknown> | undefined;
const args = (params?.['arguments'] ?? {}) as Record<string, unknown>;
const name = args['name'] as string | undefined;
const content = args['content'] as string | undefined;
if (!name || !content) {
return {
jsonrpc: '2.0',
id: request.id,
error: { code: -32602, message: 'Missing required arguments: name and content' },
};
}
try {
const body: Record<string, unknown> = { name, content };
if (context?.sessionId) {
body['createdBySession'] = context.sessionId;
}
await this.mcpdClient.post(
`/api/v1/projects/${encodeURIComponent(this.projectName)}/promptrequests`,
body,
);
return {
jsonrpc: '2.0',
id: request.id,
result: {
content: [
{
type: 'text',
text: `Prompt request "${name}" created successfully. It will be visible to you as a resource at mcpctl://prompts/${name}. A user must approve it before it becomes permanent.`,
},
],
},
};
} catch (err) {
return {
jsonrpc: '2.0',
id: request.id,
error: {
code: -32603,
message: `Failed to propose prompt: ${err instanceof Error ? err.message : String(err)}`,
},
};
}
}
getUpstreamNames(): string[] {
return [...this.upstreams.keys()];
}
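From the client side, the page-fetch round trip re-calls the same tool with the two extra arguments. The sketch below mirrors `ResponsePaginator.extractPaginationParams`; the `_resultId` value and query are illustrative.

```typescript
// Guard the router applies before forwarding tools/call upstream: if the
// arguments carry _resultId and a valid _page, serve from the page cache
// instead of hitting the upstream server again.
type PageRequest = { resultId: string; page: number | 'all' };

function extractPaginationParams(args: Record<string, unknown>): PageRequest | null {
  const resultId = args['_resultId'];
  const pageParam = args['_page'];
  if (typeof resultId !== 'string' || pageParam === undefined) return null;
  if (pageParam === 'all') return { resultId, page: 'all' };
  const page = Number(pageParam);
  if (!Number.isInteger(page) || page < 1) return null;
  return { resultId, page };
}

// Arguments of a follow-up tools/call issued after the model reads the index:
const followUpArgs: Record<string, unknown> = { _resultId: 'abc-123', _page: 2 };
// Arguments of an ordinary call pass through to the upstream untouched:
const normalArgs: Record<string, unknown> = { query: 'cpu usage' };

console.log(JSON.stringify(extractPaginationParams(followUpArgs)));
// → {"resultId":"abc-123","page":2}
console.log(extractPaginationParams(normalArgs));
// → null
```

Because the guard runs before namespaced routing, a cache hit never reaches the upstream; a miss returns the "re-call without _resultId/_page" hint instead of an error.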

View File

@@ -63,6 +63,8 @@ export interface ProxyConfig {
export interface UpstreamConnection {
/** Server name */
name: string;
/** Human-readable description of the server's purpose */
description?: string;
/** Send a JSON-RPC request and get a response */
send(request: JsonRpcRequest): Promise<JsonRpcResponse>;
/** Disconnect from the upstream */

View File

@@ -18,14 +18,17 @@ interface McpdProxyResponse {
*/
export class McpdUpstream implements UpstreamConnection {
readonly name: string;
readonly description?: string;
private alive = true;
constructor(
private serverId: string,
serverName: string,
private mcpdClient: McpdClient,
serverDescription?: string,
) {
this.name = serverName;
if (serverDescription !== undefined) this.description = serverDescription;
}
async send(request: JsonRpcRequest): Promise<JsonRpcResponse> {

View File

@@ -0,0 +1,433 @@
import { describe, it, expect, vi, afterEach } from 'vitest';
import { ResponsePaginator, DEFAULT_PAGINATION_CONFIG } from '../src/llm/pagination.js';
import type { ProviderRegistry } from '../src/providers/registry.js';
import type { LlmProvider } from '../src/providers/types.js';
function makeProvider(response: string): ProviderRegistry {
const provider: LlmProvider = {
name: 'test',
isAvailable: () => true,
complete: vi.fn().mockResolvedValue({ content: response }),
};
return {
getActive: () => provider,
register: vi.fn(),
setActive: vi.fn(),
listProviders: () => [{ name: 'test', available: true, active: true }],
} as unknown as ProviderRegistry;
}
function makeLargeString(size: number, pattern = 'x'): string {
return pattern.repeat(size);
}
function makeLargeStringWithNewlines(size: number, lineLen = 100): string {
const lines: string[] = [];
let total = 0;
let lineNum = 0;
while (total < size) {
const line = `line-${String(lineNum).padStart(5, '0')} ${'x'.repeat(lineLen - 15)}`;
lines.push(line);
total += line.length + 1; // +1 for newline
lineNum++;
}
return lines.join('\n');
}
describe('ResponsePaginator', () => {
afterEach(() => {
vi.restoreAllMocks();
});
// --- shouldPaginate ---
describe('shouldPaginate', () => {
it('returns false for strings below threshold', () => {
const paginator = new ResponsePaginator(null);
expect(paginator.shouldPaginate('short string')).toBe(false);
});
it('returns false for strings just below threshold', () => {
const paginator = new ResponsePaginator(null);
const str = makeLargeString(DEFAULT_PAGINATION_CONFIG.sizeThreshold - 1);
expect(paginator.shouldPaginate(str)).toBe(false);
});
it('returns true for strings at threshold', () => {
const paginator = new ResponsePaginator(null);
const str = makeLargeString(DEFAULT_PAGINATION_CONFIG.sizeThreshold);
expect(paginator.shouldPaginate(str)).toBe(true);
});
it('returns true for strings above threshold', () => {
const paginator = new ResponsePaginator(null);
const str = makeLargeString(DEFAULT_PAGINATION_CONFIG.sizeThreshold + 1000);
expect(paginator.shouldPaginate(str)).toBe(true);
});
it('respects custom threshold', () => {
const paginator = new ResponsePaginator(null, { sizeThreshold: 100 });
expect(paginator.shouldPaginate('x'.repeat(99))).toBe(false);
expect(paginator.shouldPaginate('x'.repeat(100))).toBe(true);
});
});
// --- paginate (no LLM) ---
describe('paginate without LLM', () => {
it('returns null for small responses', async () => {
const paginator = new ResponsePaginator(null);
const result = await paginator.paginate('test/tool', 'small response');
expect(result).toBeNull();
});
it('paginates large responses with simple index', async () => {
const paginator = new ResponsePaginator(null, { sizeThreshold: 100, pageSize: 50 });
const raw = makeLargeStringWithNewlines(200);
const result = await paginator.paginate('test/tool', raw);
expect(result).not.toBeNull();
expect(result!.content).toHaveLength(1);
expect(result!.content[0]!.type).toBe('text');
const text = result!.content[0]!.text;
expect(text).toContain('too large to return directly');
expect(text).toContain('_resultId');
expect(text).toContain('_page');
expect(text).not.toContain('AI-generated summaries');
});
it('includes correct page count in index', async () => {
const paginator = new ResponsePaginator(null, { sizeThreshold: 100, pageSize: 50 });
const raw = 'x'.repeat(200);
const result = await paginator.paginate('test/tool', raw);
expect(result).not.toBeNull();
const text = result!.content[0]!.text;
// 200 chars / 50 per page = 4 pages
expect(text).toContain('4 pages');
expect(text).toContain('Page 1:');
expect(text).toContain('Page 4:');
});
it('caches the result for later page retrieval', async () => {
const paginator = new ResponsePaginator(null, { sizeThreshold: 100, pageSize: 50 });
const raw = 'x'.repeat(200);
await paginator.paginate('test/tool', raw);
expect(paginator.cacheSize).toBe(1);
});
it('includes page instructions with _resultId and _page', async () => {
const paginator = new ResponsePaginator(null, { sizeThreshold: 100, pageSize: 50 });
const raw = 'x'.repeat(200);
const result = await paginator.paginate('test/tool', raw);
const text = result!.content[0]!.text;
expect(text).toContain('"_resultId"');
expect(text).toContain('"_page"');
expect(text).toContain('"all"');
});
});
// --- paginate (with LLM) ---
describe('paginate with LLM', () => {
it('generates smart index when provider available', async () => {
const summaries = JSON.stringify([
{ page: 1, summary: 'Configuration nodes and global settings' },
{ page: 2, summary: 'HTTP request nodes and API integrations' },
]);
const registry = makeProvider(summaries);
const paginator = new ResponsePaginator(registry, { sizeThreshold: 100, pageSize: 60 });
const raw = makeLargeStringWithNewlines(150);
const result = await paginator.paginate('node-red/get_flows', raw);
expect(result).not.toBeNull();
const text = result!.content[0]!.text;
expect(text).toContain('AI-generated summaries');
expect(text).toContain('Configuration nodes and global settings');
expect(text).toContain('HTTP request nodes and API integrations');
});
it('falls back to simple index on LLM failure', async () => {
const provider: LlmProvider = {
name: 'test',
isAvailable: () => true,
complete: vi.fn().mockRejectedValue(new Error('LLM unavailable')),
};
const registry = {
getActive: () => provider,
register: vi.fn(),
setActive: vi.fn(),
listProviders: () => [{ name: 'test', available: true, active: true }],
} as unknown as ProviderRegistry;
const paginator = new ResponsePaginator(registry, { sizeThreshold: 100, pageSize: 50 });
const raw = 'x'.repeat(200);
const result = await paginator.paginate('test/tool', raw);
expect(result).not.toBeNull();
const text = result!.content[0]!.text;
// Should NOT contain AI-generated label
expect(text).not.toContain('AI-generated summaries');
expect(text).toContain('Page 1:');
});
it('sends page previews to LLM, not full content', async () => {
const completeFn = vi.fn().mockResolvedValue({
content: JSON.stringify([
{ page: 1, summary: 'test' },
{ page: 2, summary: 'test2' },
{ page: 3, summary: 'test3' },
]),
});
const provider: LlmProvider = {
name: 'test',
isAvailable: () => true,
complete: completeFn,
};
const registry = {
getActive: () => provider,
register: vi.fn(),
setActive: vi.fn(),
listProviders: () => [{ name: 'test', available: true, active: true }],
} as unknown as ProviderRegistry;
// Use a large enough string (3000 chars, pages of 1000) so previews (500 per page) are smaller than raw
const paginator = new ResponsePaginator(registry, { sizeThreshold: 2000, pageSize: 1000 });
const raw = makeLargeStringWithNewlines(3000);
await paginator.paginate('test/tool', raw);
expect(completeFn).toHaveBeenCalledOnce();
const call = completeFn.mock.calls[0]![0]!;
const userMsg = call.messages.find((m: { role: string }) => m.role === 'user');
// Should contain page preview markers
expect(userMsg.content).toContain('Page 1');
// The LLM prompt should be significantly smaller than the full content
// (each page sends ~500 chars preview, not full 1000 chars)
expect(userMsg.content.length).toBeLessThan(raw.length);
});
it('falls back to simple when no active provider', async () => {
const registry = {
getActive: () => null,
register: vi.fn(),
setActive: vi.fn(),
listProviders: () => [],
} as unknown as ProviderRegistry;
const paginator = new ResponsePaginator(registry, { sizeThreshold: 100, pageSize: 50 });
const raw = 'x'.repeat(200);
const result = await paginator.paginate('test/tool', raw);
expect(result).not.toBeNull();
const text = result!.content[0]!.text;
expect(text).not.toContain('AI-generated summaries');
});
});
// --- getPage ---
describe('getPage', () => {
it('returns specific page content', async () => {
const paginator = new ResponsePaginator(null, { sizeThreshold: 100, pageSize: 50 });
const raw = 'AAAA'.repeat(25) + 'BBBB'.repeat(25); // 200 chars total
await paginator.paginate('test/tool', raw);
// First result is cached, but its resultId is not directly exposed
expect(paginator.cacheSize).toBe(1);
// Paginate a second result and read its resultId from the index text
const indexResult = await paginator.paginate('test/tool2', 'C'.repeat(200));
const text = indexResult!.content[0]!.text;
const match = /"_resultId": "([^"]+)"/.exec(text);
expect(match).not.toBeNull();
const resultId = match![1]!;
const page1 = paginator.getPage(resultId, 1);
expect(page1).not.toBeNull();
expect(page1!.content[0]!.text).toContain('Page 1/');
expect(page1!.content[0]!.text).toContain('C');
});
it('returns full content with _page=all', async () => {
const paginator = new ResponsePaginator(null, { sizeThreshold: 100, pageSize: 50 });
const raw = 'D'.repeat(200);
const indexResult = await paginator.paginate('test/tool', raw);
const match = /"_resultId": "([^"]+)"/.exec(indexResult!.content[0]!.text);
const resultId = match![1]!;
const allPages = paginator.getPage(resultId, 'all');
expect(allPages).not.toBeNull();
expect(allPages!.content[0]!.text).toBe(raw);
});
it('returns null for unknown resultId (cache miss)', () => {
const paginator = new ResponsePaginator(null);
const result = paginator.getPage('nonexistent-id', 1);
expect(result).toBeNull();
});
it('returns error for out-of-range page', async () => {
const paginator = new ResponsePaginator(null, { sizeThreshold: 100, pageSize: 50 });
const raw = 'x'.repeat(200);
const indexResult = await paginator.paginate('test/tool', raw);
const match = /"_resultId": "([^"]+)"/.exec(indexResult!.content[0]!.text);
const resultId = match![1]!;
const page999 = paginator.getPage(resultId, 999);
expect(page999).not.toBeNull();
expect(page999!.content[0]!.text).toContain('out of range');
});
it('returns null after TTL expiry', async () => {
const now = Date.now();
vi.spyOn(Date, 'now').mockReturnValue(now);
const paginator = new ResponsePaginator(null, { sizeThreshold: 100, pageSize: 50, ttlMs: 1000 });
const raw = 'x'.repeat(200);
const indexResult = await paginator.paginate('test/tool', raw);
const match = /"_resultId": "([^"]+)"/.exec(indexResult!.content[0]!.text);
const resultId = match![1]!;
// Within TTL — should work
expect(paginator.getPage(resultId, 1)).not.toBeNull();
// Past TTL — should be null
vi.spyOn(Date, 'now').mockReturnValue(now + 1001);
expect(paginator.getPage(resultId, 1)).toBeNull();
});
});
// --- extractPaginationParams ---
describe('extractPaginationParams', () => {
it('returns null when no pagination params', () => {
expect(ResponsePaginator.extractPaginationParams({ query: 'test' })).toBeNull();
});
it('returns null when only _resultId (no _page)', () => {
expect(ResponsePaginator.extractPaginationParams({ _resultId: 'abc' })).toBeNull();
});
it('returns null when only _page (no _resultId)', () => {
expect(ResponsePaginator.extractPaginationParams({ _page: 1 })).toBeNull();
});
it('extracts numeric page', () => {
const result = ResponsePaginator.extractPaginationParams({ _resultId: 'abc-123', _page: 2 });
expect(result).toEqual({ resultId: 'abc-123', page: 2 });
});
it('extracts _page=all', () => {
const result = ResponsePaginator.extractPaginationParams({ _resultId: 'abc-123', _page: 'all' });
expect(result).toEqual({ resultId: 'abc-123', page: 'all' });
});
it('rejects negative page numbers', () => {
expect(ResponsePaginator.extractPaginationParams({ _resultId: 'abc', _page: -1 })).toBeNull();
});
it('rejects zero page number', () => {
expect(ResponsePaginator.extractPaginationParams({ _resultId: 'abc', _page: 0 })).toBeNull();
});
it('rejects non-integer page numbers', () => {
expect(ResponsePaginator.extractPaginationParams({ _resultId: 'abc', _page: 1.5 })).toBeNull();
});
it('requires string resultId', () => {
expect(ResponsePaginator.extractPaginationParams({ _resultId: 123, _page: 1 })).toBeNull();
});
});
// --- Cache management ---
describe('cache management', () => {
it('evicts expired entries on paginate', async () => {
const now = Date.now();
vi.spyOn(Date, 'now').mockReturnValue(now);
const paginator = new ResponsePaginator(null, { sizeThreshold: 100, pageSize: 50, ttlMs: 1000 });
await paginator.paginate('test/tool1', 'x'.repeat(200));
expect(paginator.cacheSize).toBe(1);
// Advance past TTL and paginate again
vi.spyOn(Date, 'now').mockReturnValue(now + 1001);
await paginator.paginate('test/tool2', 'y'.repeat(200));
// Old entry evicted, new one added
expect(paginator.cacheSize).toBe(1);
});
it('evicts LRU at capacity', async () => {
const paginator = new ResponsePaginator(null, { sizeThreshold: 100, pageSize: 50, maxCachedResults: 2 });
await paginator.paginate('test/tool1', 'A'.repeat(200));
await paginator.paginate('test/tool2', 'B'.repeat(200));
expect(paginator.cacheSize).toBe(2);
// Third entry should evict the first
await paginator.paginate('test/tool3', 'C'.repeat(200));
expect(paginator.cacheSize).toBe(2);
});
it('clearCache removes all entries', async () => {
const paginator = new ResponsePaginator(null, { sizeThreshold: 100, pageSize: 50 });
await paginator.paginate('test/tool1', 'x'.repeat(200));
await paginator.paginate('test/tool2', 'y'.repeat(200));
expect(paginator.cacheSize).toBe(2);
paginator.clearCache();
expect(paginator.cacheSize).toBe(0);
});
});
// --- Page splitting ---
describe('page splitting', () => {
it('breaks at newline boundaries', async () => {
// Create content where a newline falls within the page boundary
const paginator = new ResponsePaginator(null, { sizeThreshold: 100, pageSize: 60 });
const lines = Array.from({ length: 10 }, (_, i) => `line${String(i).padStart(3, '0')} ${'x'.repeat(20)}`);
const raw = lines.join('\n');
// raw is ~289 chars (10 lines of 28 chars plus 9 newlines)
const result = await paginator.paginate('test/tool', raw);
expect(result).not.toBeNull();
// Pages should break at newline boundaries, not mid-line
const text = result!.content[0]!.text;
const match = /"_resultId": "([^"]+)"/.exec(text);
const resultId = match![1]!;
const page1 = paginator.getPage(resultId, 1);
expect(page1).not.toBeNull();
// Page content should end at a newline boundary (no partial lines)
const pageText = page1!.content[0]!.text;
// Remove the header line
const contentStart = pageText.indexOf('\n\n') + 2;
const pageContent = pageText.slice(contentStart);
// Content should contain complete lines
expect(pageContent).toMatch(/line\d{3}/);
});
it('handles content without newlines', async () => {
const paginator = new ResponsePaginator(null, { sizeThreshold: 100, pageSize: 50 });
const raw = 'x'.repeat(200); // No newlines at all
const result = await paginator.paginate('test/tool', raw);
expect(result).not.toBeNull();
const text = result!.content[0]!.text;
expect(text).toContain('4 pages'); // 200/50 = 4
});
it('handles content that fits exactly in one page at threshold', async () => {
const paginator = new ResponsePaginator(null, { sizeThreshold: 100, pageSize: 100 });
const raw = 'x'.repeat(100); // Exactly at threshold and page size
const result = await paginator.paginate('test/tool', raw);
expect(result).not.toBeNull();
const text = result!.content[0]!.text;
expect(text).toContain('1 pages');
});
});
});
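The newline-boundary splitting these tests assert can be sketched as follows; `splitPages` is our name for illustration, not the paginator's actual API:

```typescript
// Sketch of newline-aware page splitting: each page is at most pageSize
// chars, and when a newline exists inside the window the break moves back
// to it so no line is split across pages.
function splitPages(raw: string, pageSize: number): string[] {
  const pages: string[] = [];
  let pos = 0;
  while (pos < raw.length) {
    let end = Math.min(pos + pageSize, raw.length);
    if (end < raw.length) {
      // Prefer breaking just after the last newline within the window.
      const nl = raw.lastIndexOf('\n', end);
      if (nl > pos) end = nl + 1;
    }
    pages.push(raw.slice(pos, end));
    pos = end;
  }
  return pages;
}
```

Content without newlines degrades to fixed-size chunks, which matches the `'4 pages'` expectation for 200 chars at a page size of 50.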


@@ -54,7 +54,7 @@ describe('refreshProjectUpstreams', () => {
const client = mockMcpdClient(servers);
await refreshProjectUpstreams(router, client as any, 'smart-home', 'user-token-123');
-expect(client.forward).toHaveBeenCalledWith('GET', '/api/v1/projects/smart-home/servers', '', undefined);
+expect(client.forward).toHaveBeenCalledWith('GET', '/api/v1/projects/smart-home/servers', '', undefined, 'user-token-123');
expect(router.getUpstreamNames()).toContain('grafana');
});


@@ -11,7 +11,7 @@ vi.mock('../src/discovery.js', () => ({
import { refreshProjectUpstreams } from '../src/discovery.js';
function mockMcpdClient() {
-return {
+const client: Record<string, unknown> = {
baseUrl: 'http://test:3100',
token: 'test-token',
get: vi.fn(async () => []),
@@ -19,7 +19,11 @@ function mockMcpdClient() {
put: vi.fn(),
delete: vi.fn(),
forward: vi.fn(async () => ({ status: 200, body: [] })),
+withHeaders: vi.fn(),
 };
+// withHeaders returns a new client-like object (returns self for simplicity)
+(client.withHeaders as ReturnType<typeof vi.fn>).mockReturnValue(client);
+return client;
}
describe('registerProjectMcpEndpoint', () => {


@@ -0,0 +1,248 @@
import { describe, it, expect, vi, beforeEach } from 'vitest';
import { McpRouter } from '../src/router.js';
import type { UpstreamConnection, JsonRpcRequest, JsonRpcResponse, JsonRpcNotification } from '../src/types.js';
import type { McpdClient } from '../src/http/mcpd-client.js';
function mockUpstream(name: string, opts?: {
tools?: Array<{ name: string; description?: string; inputSchema?: unknown }>;
}): UpstreamConnection {
return {
name,
isAlive: vi.fn(() => true),
close: vi.fn(async () => {}),
onNotification: vi.fn(),
send: vi.fn(async (req: JsonRpcRequest): Promise<JsonRpcResponse> => {
if (req.method === 'tools/list') {
return { jsonrpc: '2.0', id: req.id, result: { tools: opts?.tools ?? [] } };
}
if (req.method === 'resources/list') {
return { jsonrpc: '2.0', id: req.id, result: { resources: [] } };
}
return { jsonrpc: '2.0', id: req.id, result: {} };
}),
};
}
function mockMcpdClient(): McpdClient {
return {
get: vi.fn(async () => []),
post: vi.fn(async () => ({})),
put: vi.fn(async () => ({})),
delete: vi.fn(async () => {}),
forward: vi.fn(async () => ({ status: 200, body: {} })),
withHeaders: vi.fn(function (this: McpdClient) { return this; }),
} as unknown as McpdClient;
}
describe('McpRouter - Prompt Integration', () => {
let router: McpRouter;
let mcpdClient: McpdClient;
beforeEach(() => {
router = new McpRouter();
mcpdClient = mockMcpdClient();
});
describe('propose_prompt tool', () => {
it('should include propose_prompt in tools/list when prompt config is set', async () => {
router.setPromptConfig(mcpdClient, 'test-project');
router.addUpstream(mockUpstream('server1'));
const response = await router.route({
jsonrpc: '2.0',
id: 1,
method: 'tools/list',
});
const tools = (response.result as { tools: Array<{ name: string }> }).tools;
expect(tools.some((t) => t.name === 'propose_prompt')).toBe(true);
});
it('should NOT include propose_prompt when no prompt config', async () => {
router.addUpstream(mockUpstream('server1'));
const response = await router.route({
jsonrpc: '2.0',
id: 1,
method: 'tools/list',
});
const tools = (response.result as { tools: Array<{ name: string }> }).tools;
expect(tools.some((t) => t.name === 'propose_prompt')).toBe(false);
});
it('should call mcpd to create a prompt request', async () => {
router.setPromptConfig(mcpdClient, 'my-project');
const response = await router.route(
{
jsonrpc: '2.0',
id: 2,
method: 'tools/call',
params: {
name: 'propose_prompt',
arguments: { name: 'my-prompt', content: 'Hello world' },
},
},
{ sessionId: 'sess-123' },
);
expect(response.error).toBeUndefined();
expect(mcpdClient.post).toHaveBeenCalledWith(
'/api/v1/projects/my-project/promptrequests',
{ name: 'my-prompt', content: 'Hello world', createdBySession: 'sess-123' },
);
});
it('should return error when name or content missing', async () => {
router.setPromptConfig(mcpdClient, 'proj');
const response = await router.route({
jsonrpc: '2.0',
id: 3,
method: 'tools/call',
params: {
name: 'propose_prompt',
arguments: { name: 'only-name' },
},
});
expect(response.error?.code).toBe(-32602);
expect(response.error?.message).toContain('Missing required arguments');
});
it('should return error when mcpd call fails', async () => {
router.setPromptConfig(mcpdClient, 'proj');
vi.mocked(mcpdClient.post).mockRejectedValue(new Error('mcpd returned 409'));
const response = await router.route({
jsonrpc: '2.0',
id: 4,
method: 'tools/call',
params: {
name: 'propose_prompt',
arguments: { name: 'dup', content: 'x' },
},
});
expect(response.error?.code).toBe(-32603);
expect(response.error?.message).toContain('mcpd returned 409');
});
});
describe('prompt resources', () => {
it('should include prompt resources in resources/list', async () => {
router.setPromptConfig(mcpdClient, 'test-project');
vi.mocked(mcpdClient.get).mockResolvedValue([
{ name: 'approved-prompt', content: 'Content A', type: 'prompt' },
{ name: 'pending-req', content: 'Content B', type: 'promptrequest' },
]);
const response = await router.route(
{ jsonrpc: '2.0', id: 1, method: 'resources/list' },
{ sessionId: 'sess-1' },
);
const resources = (response.result as { resources: Array<{ uri: string; description?: string }> }).resources;
expect(resources).toHaveLength(2);
expect(resources[0]!.uri).toBe('mcpctl://prompts/approved-prompt');
expect(resources[0]!.description).toContain('Approved');
expect(resources[1]!.uri).toBe('mcpctl://prompts/pending-req');
expect(resources[1]!.description).toContain('Pending');
});
it('should pass session ID when fetching visible prompts', async () => {
router.setPromptConfig(mcpdClient, 'proj');
vi.mocked(mcpdClient.get).mockResolvedValue([]);
await router.route(
{ jsonrpc: '2.0', id: 1, method: 'resources/list' },
{ sessionId: 'my-session' },
);
expect(mcpdClient.get).toHaveBeenCalledWith(
'/api/v1/projects/proj/prompts/visible?session=my-session',
);
});
it('should read mcpctl resource content', async () => {
router.setPromptConfig(mcpdClient, 'proj');
vi.mocked(mcpdClient.get).mockResolvedValue([
{ name: 'my-prompt', content: 'The content here', type: 'prompt' },
]);
// First list to populate cache
await router.route({ jsonrpc: '2.0', id: 1, method: 'resources/list' });
// Then read
const response = await router.route({
jsonrpc: '2.0',
id: 2,
method: 'resources/read',
params: { uri: 'mcpctl://prompts/my-prompt' },
});
expect(response.error).toBeUndefined();
const contents = (response.result as { contents: Array<{ text: string }> }).contents;
expect(contents[0]!.text).toBe('The content here');
});
it('should return error for unknown mcpctl resource', async () => {
router.setPromptConfig(mcpdClient, 'proj');
const response = await router.route({
jsonrpc: '2.0',
id: 3,
method: 'resources/read',
params: { uri: 'mcpctl://prompts/nonexistent' },
});
expect(response.error?.code).toBe(-32602);
expect(response.error?.message).toContain('Resource not found');
});
it('should not fail when mcpd is unavailable', async () => {
router.setPromptConfig(mcpdClient, 'proj');
vi.mocked(mcpdClient.get).mockRejectedValue(new Error('Connection refused'));
const response = await router.route({ jsonrpc: '2.0', id: 1, method: 'resources/list' });
// Should succeed with empty resources (upstream errors are swallowed)
expect(response.error).toBeUndefined();
const resources = (response.result as { resources: unknown[] }).resources;
expect(resources).toEqual([]);
});
});
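The listing/reading behaviour above can be sketched as a small URI map; `PromptResources` and its methods are our illustrative names, not the router's real implementation:

```typescript
// Sketch: prompts fetched from mcpd are listed as mcpctl:// URIs, reads
// resolve against the last listed set, and unknown URIs return null.
type Prompt = { name: string; content: string; type: 'prompt' | 'promptrequest' };

class PromptResources {
  private byUri = new Map<string, Prompt>();

  list(prompts: Prompt[]): Array<{ uri: string; description: string }> {
    this.byUri.clear();
    return prompts.map((p) => {
      const uri = `mcpctl://prompts/${p.name}`;
      this.byUri.set(uri, p); // cache for subsequent resources/read calls
      return {
        uri,
        description: p.type === 'prompt' ? 'Approved prompt' : 'Pending prompt request',
      };
    });
  }

  read(uri: string): string | null {
    return this.byUri.get(uri)?.content ?? null;
  }
}
```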
describe('session isolation', () => {
it('should not include session parameter when no sessionId in context', async () => {
router.setPromptConfig(mcpdClient, 'proj');
vi.mocked(mcpdClient.get).mockResolvedValue([]);
await router.route({ jsonrpc: '2.0', id: 1, method: 'resources/list' });
expect(mcpdClient.get).toHaveBeenCalledWith(
'/api/v1/projects/proj/prompts/visible',
);
});
it('should not include session in propose when no context', async () => {
router.setPromptConfig(mcpdClient, 'proj');
await router.route({
jsonrpc: '2.0',
id: 2,
method: 'tools/call',
params: {
name: 'propose_prompt',
arguments: { name: 'test', content: 'stuff' },
},
});
expect(mcpdClient.post).toHaveBeenCalledWith(
'/api/v1/projects/proj/promptrequests',
{ name: 'test', content: 'stuff' },
);
});
});
});
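The propose_prompt flow asserted in these tests can be summarized as: validate arguments, attach the session ID when present, and translate mcpd failures into JSON-RPC errors. `proposePrompt` and its `post` parameter are hypothetical stand-ins for the router's wiring:

```typescript
// Sketch of the propose_prompt handling the tests describe.
async function proposePrompt(
  post: (path: string, body: Record<string, unknown>) => Promise<unknown>,
  project: string,
  args: { name?: string; content?: string },
  sessionId?: string,
): Promise<{ error?: { code: number; message: string } }> {
  if (!args.name || !args.content) {
    // -32602: invalid params, matching the "Missing required arguments" test.
    return { error: { code: -32602, message: 'Missing required arguments: name, content' } };
  }
  const body: Record<string, unknown> = { name: args.name, content: args.content };
  if (sessionId) body.createdBySession = sessionId; // session isolation tests
  try {
    await post(`/api/v1/projects/${project}/promptrequests`, body);
    return {};
  } catch (err) {
    // -32603: internal error, surfacing the mcpd message to the caller.
    return { error: { code: -32603, message: (err as Error).message } };
  }
}
```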

tests.sh Executable file

@@ -0,0 +1,82 @@
#!/usr/bin/env bash
set -euo pipefail
PATH="$HOME/.npm-global/bin:$PATH"
SHORT=false
FILTER=""
while [[ $# -gt 0 ]]; do
case "$1" in
--short|-s) SHORT=true; shift ;;
--filter|-f) FILTER="$2"; shift 2 ;;
*) echo "Usage: tests.sh [--short|-s] [--filter|-f <package>]"; exit 1 ;;
esac
done
strip_ansi() {
sed $'s/\033\[[0-9;]*m//g'
}
run_tests() {
local pkg="$1"
local label="$2"
if $SHORT; then
local tmpfile
tmpfile=$(mktemp)
trap 'rm -f "$tmpfile"' RETURN
local exit_code=0
pnpm --filter "$pkg" test:run >"$tmpfile" 2>&1 || exit_code=$?
# Parse from cleaned output
local clean
clean=$(strip_ansi < "$tmpfile")
local tests_line files_line duration_line
tests_line=$(echo "$clean" | grep -oP 'Tests\s+\K.*' | tail -1 | xargs)
files_line=$(echo "$clean" | grep -oP 'Test Files\s+\K.*' | tail -1 | xargs)
duration_line=$(echo "$clean" | grep -oP 'Duration\s+\K[0-9.]+s' | tail -1)
if [[ $exit_code -eq 0 ]]; then
printf " \033[32mPASS\033[0m %-6s %s | %s | %s\n" "$label" "$files_line" "$tests_line" "$duration_line"
else
printf " \033[31mFAIL\033[0m %-6s %s | %s | %s\n" "$label" "$files_line" "$tests_line" "$duration_line"
echo "$clean" | grep -E 'FAIL |AssertionError|expected .* to' | head -10 | sed 's/^/ /'
fi
rm -f "$tmpfile"
return $exit_code
else
echo "=== $label ==="
pnpm --filter "$pkg" test:run
echo ""
fi
}
if $SHORT; then
echo "Running tests..."
echo ""
fi
failed=0
if [[ -z "$FILTER" || "$FILTER" == "mcpd" ]]; then
run_tests mcpd "mcpd" || failed=1
fi
if [[ -z "$FILTER" || "$FILTER" == "cli" ]]; then
run_tests cli "cli" || failed=1
fi
if $SHORT; then
echo ""
if [[ $failed -eq 0 ]]; then
echo "All tests passed."
else
echo "Some tests FAILED."
fi
fi
exit $failed