Compare commits: feat/healt...feat/gated (67 commits)
.gitignore (vendored, 1 line added)

```diff
@@ -37,3 +37,4 @@ pgdata/
 # Prisma
 src/db/prisma/migrations/*.sql.backup
+logs.sh
```
.taskmaster/docs/prd-gated-prompts.md (new file, +392 lines)
# PRD: Gated Project Experience & Prompt Intelligence

## Overview

When 300 developers connect their LLM clients (Claude Code, Cursor, etc.) to mcpctl projects, they need relevant context — security policies, architecture decisions, operational runbooks — without flooding the context window. This feature introduces a gated session flow where the client LLM drives its own context retrieval through keyword-based matching, with the proxy providing a prompt index and encouraging ongoing discovery.

## Problem

- Injecting all prompts into instructions doesn't scale (hundreds of pages of policies)
- Exposing prompts only as MCP resources means LLMs never read them
- An index-only approach works for small numbers but breaks down at scale
- No mechanism to link external knowledge (Notion, Docmost) as prompts
- LLMs tend to work with whatever they have rather than proactively seek more context
## Core Concepts

### Gated Experience

A project-level flag (`gated: boolean`, default: `true`) that controls whether sessions go through a keyword-driven prompt retrieval flow before accessing project tools and resources.

**Flow (A + C):**

1. On `initialize`, instructions include the **prompt index** (names + summaries for all prompts, up to a reasonable cap) and tell the client LLM: "Call `begin_session` with 5 keywords describing your task"
2. **If the client obeys**: `begin_session({ tags: ["zigbee", "lights", "mqtt", "pairing", "automation"] })` → prompt selection (see below) → returns matched prompt content + full prompt index + encouragement to retrieve more → session ungated
3. **If the client ignores it**: First `tools/call` is intercepted → keywords extracted from tool name + arguments → same prompt selection → briefing injected alongside the tool result → session ungated
4. **Ongoing retrieval**: The client can call `read_prompts({ tags: ["security", "vpn"] })` at any point to retrieve more prompts. The prompt index is always visible so the client LLM can see what's available.

**Prompt selection — tiered approach:**

- **Primary (heavy LLM available)**: Tags + the full prompt index (names, priorities, summaries, chapters) are sent to the heavy LLM (e.g. Gemini). The LLM understands synonyms, context, and intent — it knows "zigbee" relates to "Z2M" and "Zigbee2MQTT", and that someone working on "lights" probably needs the "common-mistakes" prompt about pairing. The LLM returns a ranked list of relevant prompt names with brief explanations of why each is relevant. The heavy LLM may use the fast LLM for preprocessing if needed (e.g. generating missing summaries on the fly).
- **Fallback (no LLM, or `llmProvider=none`)**: Deterministic keyword-based tag matching against summaries/chapters with byte-budget allocation (see "Tag Matching Algorithm" below). Same approach as ResponsePaginator's byte-based fallback. Triggered when: no LLM providers are configured, the project has `llmProvider: "none"`, or a local override sets `provider: "none"`.
- **Hybrid (both paths always available)**: Even when the heavy LLM does the initial selection, the `read_prompts({ tags: [...] })` tool always uses keyword matching. This way the client LLM can retrieve specific prompts by keyword that the heavy LLM may have missed. The LLM is smart about context, keywords are precise about names — together they cover both fuzzy and exact retrieval.

**LLM availability resolution** (same chain as existing LLM features):

- Project `llmProvider: "none"` → no LLM, keyword fallback only
- Project `llmProvider: null` → inherit from global config
- Local override `provider: "none"` → no LLM, keyword fallback only
- No providers configured → keyword fallback only
- Otherwise → use heavy LLM for `begin_session`, fast LLM for summary generation
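The resolution chain above can be sketched as code. This is a minimal illustration, not the shipped implementation; the `ProjectCfg`, `LocalOverride`, and `GlobalCfg` shapes are assumptions made for the example:

```typescript
// Sketch of the LLM availability chain. Config shapes are illustrative assumptions.
type Provider = string | null; // e.g. "gemini-cli"; null = inherit

interface ProjectCfg { llmProvider: Provider | "none"; }
interface LocalOverride { provider?: Provider | "none"; }
interface GlobalCfg { providers: string[]; }

// Returns the provider to use, or null for keyword-fallback-only.
function resolveLlm(project: ProjectCfg, local: LocalOverride, global_: GlobalCfg): string | null {
  if (local.provider === "none") return null;          // local override opts out
  if (project.llmProvider === "none") return null;     // project opts out
  const wanted = local.provider ?? project.llmProvider; // null → inherit downward
  if (wanted) return wanted;
  return global_.providers[0] ?? null;                 // no providers configured → null
}
```

Every branch maps one-to-one onto a bullet in the list above, which keeps the fallback-only decision auditable in one place.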
### Encouraging Retrieval

LLMs tend to proceed with incomplete information rather than seek more context. The system must actively counter this at multiple points:

**In `initialize` instructions:**

```
You have access to project knowledge containing policies, architecture decisions,
and guidelines. Some may contain critical rules about what you're doing. After your
initial briefing, if you're unsure about conventions, security requirements, or
best practices — request more context using read_prompts. It's always better to
check than to guess wrong. The project may have specific rules you don't know about yet.
```

**In `begin_session` response (after matched prompts):**

```
Other prompts available that may become relevant as your work progresses:
- security-policies: Network segmentation, firewall rules, VPN access
- naming-conventions: Service and resource naming standards
- ...
If any of these seem related to what you're doing now or later, request them
with read_prompts({ tags: [...] }) or resources/read. Don't assume you have
all the context — check when in doubt.
```

**In `read_prompts` response:**

```
Remember: you can request more prompts at any time with read_prompts({ tags: [...] }).
The project may have additional guidelines relevant to your current approach.
```

The tone is not "here's optional reading" but "there are rules you might not know about, and violating them costs more than reading them."
### Prompt Priority (1-10)

Every prompt has a priority level that influences selection order and byte-budget allocation:

| Range | Meaning | Behavior |
|-------|---------|----------|
| 1-3 | Reference | Low priority, included only on strong keyword match |
| 4-6 | Standard | Default priority, included on moderate keyword match |
| 7-9 | Important | High priority, lower match threshold |
| 10 | Critical | Always included in full, regardless of keyword match (guardrails, common mistakes) |

Default priority for new prompts: `5`.
### Prompt Summaries & Chapters (Auto-generated)

Each prompt gets auto-generated metadata used for the prompt index and tag matching:

- `summary` (string, ~20 words) — one-line description of what the prompt covers
- `chapters` (string[]) — key sections/topics extracted from content

Generation pipeline:

- **Fast LLM available**: Summarize content, extract key topics
- **No fast LLM**: First sentence of content + markdown headings via regex
- Regenerated on prompt create/update
- Cached on the prompt record
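The no-LLM branch of the pipeline can be sketched as below. This is an assumed implementation of the described regex fallback; the function name and exact sentence-splitting rule are illustrative:

```typescript
// No-fast-LLM fallback: first sentence (capped at ~20 words) as summary,
// markdown headings as chapters. A sketch; names and splitting rules are assumptions.
function fallbackSummary(content: string): { summary: string; chapters: string[] } {
  // Drop heading lines so the summary comes from body text.
  const body = content.replace(/^#+\s+.*$/gm, "").trim();
  // First sentence, then cap at 20 words.
  const firstSentence = body.split(/(?<=[.!?])\s+/)[0] ?? "";
  const summary = firstSentence.split(/\s+/).slice(0, 20).join(" ");
  // Extract markdown headings via regex, per the PRD.
  const chapters = [...content.matchAll(/^#+\s+(.+)$/gm)].map((m) => m[1].trim());
  return { summary, chapters };
}
```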
### Tag Matching Algorithm (No-LLM Fallback)

When no local LLM is available, the system falls back to a deterministic retrieval algorithm:

1. Client provides tags (5 keywords from `begin_session`, or extracted from the tool call)
2. For each prompt, compute a match score:
   - Check tags against the prompt `summary` and `chapters` (case-insensitive substring match)
   - Score = `number_of_matching_tags * base_priority`
   - Priority 10 prompts: score = infinity (always included)
3. Sort by score descending
4. Fill a byte budget (configurable, default ~8KB) from the top down:
   - Include full content until the budget is exhausted
   - Remaining matched prompts: include as index entries (name + summary)
   - Non-matched prompts: listed as names only in the "other prompts available" section

**When `begin_session` is skipped (intercept path):**

- Extract keywords from tool name + arguments (e.g., `home-assistant/get_entities({ domain: "light" })` → tags: `["home-assistant", "entities", "light"]`)
- Run the same matching algorithm
- Inject the briefing alongside the real tool result
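The scoring and budget-fill steps above can be sketched as follows. This is a minimal illustration under stated assumptions: the `PromptMeta` shape, the 8 KB default, and UTF-8 byte accounting are choices made for the example, not the shipped code:

```typescript
// Deterministic fallback: score = matching_tags * priority, priority 10 = always in,
// then fill a byte budget top-down. PromptMeta is an illustrative assumption.
interface PromptMeta {
  name: string;
  priority: number; // 1-10
  summary: string;
  chapters: string[];
  content: string;
}

interface Selection {
  full: PromptMeta[];    // full content, within budget (or priority 10)
  indexed: PromptMeta[]; // matched but over budget: name + summary only
  namesOnly: string[];   // non-matched: names only
}

function scorePrompt(tags: string[], p: PromptMeta): number {
  if (p.priority === 10) return Infinity; // critical: always included
  const haystack = [p.summary, ...p.chapters].join(" ").toLowerCase();
  const hits = tags.filter((t) => haystack.includes(t.toLowerCase())).length;
  return hits * p.priority; // number_of_matching_tags * base_priority
}

function selectPrompts(tags: string[], prompts: PromptMeta[], budgetBytes = 8192): Selection {
  const scored = prompts
    .map((p) => ({ p, score: scorePrompt(tags, p) }))
    .sort((a, b) => (a.score === b.score ? 0 : b.score - a.score)); // avoid Infinity - Infinity

  const sel: Selection = { full: [], indexed: [], namesOnly: [] };
  let used = 0;
  for (const { p, score } of scored) {
    if (score === Infinity) { sel.full.push(p); continue; } // budget-exempt
    if (score <= 0) { sel.namesOnly.push(p.name); continue; }
    const size = new TextEncoder().encode(p.content).length; // UTF-8 bytes
    if (used + size <= budgetBytes) {
      sel.full.push(p);
      used += size;
    } else {
      sel.indexed.push(p); // matched but over budget → index entry
    }
  }
  return sel;
}
```

Matched-but-over-budget prompts degrade to index entries rather than disappearing, which is what keeps the "other prompts available" section honest.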
### `read_prompts` Tool (Ongoing Retrieval)

Available after the session is ungated. Allows the client LLM to request more context at any point:

```json
{
  "name": "read_prompts",
  "description": "Request additional project context by keywords. Use this whenever you need guidelines, policies, or conventions related to your current work. It's better to check than to guess.",
  "inputSchema": {
    "type": "object",
    "properties": {
      "tags": {
        "type": "array",
        "items": { "type": "string" },
        "description": "Keywords describing what context you need (e.g. [\"security\", \"vpn\", \"firewall\"])"
      }
    },
    "required": ["tags"]
  }
}
```

Returns matched prompt content + the prompt index reminder.
### Prompt Links

A prompt can be a **link** to an MCP resource in another project's server. The linked content is fetched server-side (by the proxy, not the client), enforcing RBAC.

Format: `project/server:resource-uri`
Example: `system-public/docmost-mcp:docmost://pages/architecture-overview`

Properties:

- The proxy fetches linked content using the source project's service account
- The client LLM never gets direct access to the source MCP server
- Dead links are detected and marked (health check on link resolution)
- Dead links generate error log entries

RBAC for links:

- Creating a link requires `edit` permission on RBAC in the target project
- A service account permission is created on the source project for the linked resource
- Default: admin group members can manage links
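The `project/server:resource-uri` format splits cleanly on the first `/` and the first `:` after the server segment. A parsing sketch, using the PRD's own example; the function name, the returned shape, and the exact allowed character classes are assumptions:

```typescript
// Parse a prompt link target of the form `project/server:resource-uri`.
// A sketch; character rules and error handling are illustrative assumptions.
interface LinkTarget {
  project: string;
  server: string;
  resourceUri: string;
}

function parseLinkTarget(raw: string): LinkTarget | null {
  // project and server: lowercase slug; resource URI: any non-whitespace remainder.
  const m = /^([a-z0-9-]+)\/([a-z0-9-]+):(\S+)$/.exec(raw);
  if (!m) return null;
  return { project: m[1], server: m[2], resourceUri: m[3] };
}
```

Because the server slug cannot contain `:`, URIs with their own scheme separator (like `docmost://pages/...`) survive intact in `resourceUri`.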
## Schema Changes

### Project

Add field:

- `gated: boolean` (default: `true`)

### Prompt

Add fields:

- `priority: integer` (1-10, default: 5)
- `summary: string | null` (auto-generated)
- `chapters: string[] | null` (auto-generated, stored as JSON)
- `linkTarget: string | null` (format: `project/server:resource-uri`, null for regular prompts)

### PromptRequest

Add field:

- `priority: integer` (1-10, default: 5)
## API Changes

### Modified Endpoints

- `POST /api/v1/prompts` — accept `priority`, `linkTarget`
- `PUT /api/v1/prompts/:id` — accept `priority` (not `linkTarget`; links are immutable, delete and recreate)
- `POST /api/v1/promptrequests` — accept `priority`
- `GET /api/v1/prompts` — return `priority`, `summary`, `linkTarget`, `linkStatus` (alive/dead/unknown)
- `GET /api/v1/projects/:name/prompts/visible` — return `priority`, `summary`, `chapters`

### New Endpoints

- `POST /api/v1/prompts/:id/regenerate-summary` — force re-generation of summary/chapters
- `GET /api/v1/projects/:name/prompt-index` — returns a compact index (name, priority, summary, chapters)
## MCP Protocol Changes (mcplocal router)

### Session State

The router tracks per-session state:

- `gated: boolean` — starts `true` if the project is gated
- `tags: string[]` — accumulated tags from `begin_session` + `read_prompts` calls
- `retrievedPrompts: Set<string>` — prompts already sent to the client (avoid re-sending)

### Gated Session Flow

1. On `initialize`: instructions include the prompt index + gate message + retrieval encouragement
2. `tools/list` while gated: only `begin_session` visible (progressive tool exposure)
3. `begin_session({ tags })`: match tags → return briefing + prompt index + encouragement → ungate → send `notifications/tools/list_changed`
4. On first `tools/call` while still gated: extract keywords → match → inject briefing alongside the result → ungate
5. After ungating: all tools work normally, `read_prompts` available for ongoing retrieval
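The two ungating paths (steps 3 and 4 above) share one state transition. A sketch of the session state and the intercept path; the keyword-extraction heuristic shown here is a hypothetical stand-in for whatever the router actually uses:

```typescript
// Per-session gate state: begin_session or the first intercepted tools/call
// flips gated → false. Extraction heuristic below is an illustrative assumption.
interface SessionState {
  gated: boolean;
  tags: string[];
  retrievedPrompts: Set<string>;
}

function newSession(projectGated: boolean): SessionState {
  return { gated: projectGated, tags: [], retrievedPrompts: new Set() };
}

// Hypothetical keyword extraction for the intercept path:
// tool name segments plus string-valued arguments become tags.
function extractKeywords(toolName: string, args: Record<string, unknown>): string[] {
  const fromName = toolName.split(/[\/_.-]+/).filter((s) => s.length > 2);
  const fromArgs = Object.values(args).filter((v): v is string => typeof v === "string");
  return [...new Set([...fromName, ...fromArgs])];
}

// Returns tags to brief on if this call triggered the intercept, null otherwise.
function onToolCall(s: SessionState, toolName: string, args: Record<string, unknown>): string[] | null {
  if (!s.gated) return null;                 // already ungated: pass through
  s.tags = extractKeywords(toolName, args);  // step 4: extract keywords
  s.gated = false;                           // intercept ungates the session
  return s.tags;                             // caller injects the briefing
}
```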
### `begin_session` Tool

```json
{
  "name": "begin_session",
  "description": "Start your session by providing 5 keywords that describe your current task. You'll receive relevant project context, policies, and guidelines. Required before using other tools.",
  "inputSchema": {
    "type": "object",
    "properties": {
      "tags": {
        "type": "array",
        "items": { "type": "string" },
        "maxItems": 10,
        "description": "5 keywords describing your current task (e.g. [\"zigbee\", \"automation\", \"lights\", \"mqtt\", \"pairing\"])"
      }
    },
    "required": ["tags"]
  }
}
```

Response structure:

```
[Priority 10 prompts — always, full content]

[Tag-matched prompts — full content, byte-budget-capped, priority-ordered]

Other prompts available that may become relevant as your work progresses:
- <name>: <summary>
- <name>: <summary>
- ...
If any of these seem related to what you're doing, request them with
read_prompts({ tags: [...] }). Don't assume you have all the context — check.
```

### Prompt Index in Instructions

The `initialize` instructions include a compact prompt index so the client LLM can see what knowledge exists. Format per prompt: `- <name>: <summary>` (~100 chars max per entry).

Cap: if there are more than 50 prompts, include only priority 7+ in the instructions index. The full index is always available via `resources/list`.
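The cap rule can be sketched in a few lines; the function name and entry truncation are illustrative, while the 50-prompt cap, the priority-7+ filter, and the `- <name>: <summary>` format come from the PRD:

```typescript
// Build the instructions index: all prompts, unless there are more than 50,
// in which case only priority 7+ are listed. A sketch, not the shipped code.
interface IndexEntry {
  name: string;
  priority: number;
  summary: string;
}

function instructionsIndex(prompts: IndexEntry[], cap = 50): string[] {
  const shown = prompts.length > cap ? prompts.filter((p) => p.priority >= 7) : prompts;
  // One line per prompt, capped at ~100 chars per the PRD.
  return shown.map((p) => `- ${p.name}: ${p.summary}`.slice(0, 100));
}
```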
## CLI Changes

### New/Modified Commands

- `mcpctl create prompt <name> --priority <1-10>` — create with priority
- `mcpctl create prompt <name> --link <project/server:uri>` — create linked prompt
- `mcpctl get prompt -A` — show all prompts across all projects, with link targets
- `mcpctl describe project <name>` — show gated status, session greeting, prompt table
- `mcpctl edit project <name>` — `gated` field editable

### Prompt Link Display

```
$ mcpctl get prompt -A
PROJECT          NAME                PRIORITY  LINK                                          STATUS
homeautomation   security-policies   8         -                                             -
homeautomation   architecture-adr    6         system-public/docmost-mcp:docmost://pages/a1  alive
homeautomation   common-mistakes     10        -                                             -
system-public    onboarding          4         -                                             -
```
## Describe Project Output

```
$ mcpctl describe project homeautomation
Name:          homeautomation
Gated:         true
LLM Provider:  gemini-cli
...

Session greeting:
  You have access to project knowledge containing policies, architecture decisions,
  and guidelines. Call begin_session with 5 keywords describing your task to receive
  relevant context. Some prompts contain critical rules — it's better to check than guess.

Prompts:
  NAME               PRIORITY  TYPE   LINK
  common-mistakes    10        local  -
  security-policies  8         local  -
  architecture-adr   6         link   system-public/docmost-mcp:docmost://pages/a1
  stack              5         local  -
```
## Testing Strategy

**Full test coverage is required.** Every new module, service, route, and algorithm must have comprehensive tests. No feature ships without tests.

### Unit Tests (mcpd)

- Prompt priority CRUD: create/update/get with priority field, default value, validation (1-10 range)
- Prompt link CRUD: create with linkTarget, immutability (can't update linkTarget), delete
- Prompt summary generation: auto-generation on create/update, regex fallback when no LLM
- `GET /api/v1/prompts` with priority, linkTarget, linkStatus fields
- `GET /api/v1/projects/:name/prompt-index` returns compact index
- `POST /api/v1/prompts/:id/regenerate-summary` triggers re-generation
- Project `gated` field: CRUD, default value

### Unit Tests (mcplocal — gating flow)

- State machine: gated → `begin_session` → ungated (happy path)
- State machine: gated → `tools/call` intercepted → ungated (fallback path)
- State machine: non-gated project skips the gate entirely
- LLM selection path: tags + prompt index sent to heavy LLM, ranked results returned, priority 10 always included
- LLM selection path: heavy LLM uses fast LLM for missing summary generation
- No-LLM fallback: tag matching score calculation, priority weighting, substring matching
- No-LLM fallback: byte-budget exhaustion, priority ordering, index fallback, edge cases
- Keyword extraction from tool calls: tool name parsing, argument extraction
- `begin_session` response: matched content + index + encouragement text (both LLM and fallback paths)
- `read_prompts` response: additional matches, deduplication against already-sent prompts (both paths)
- Tools blocked while gated: return an error directing to `begin_session`
- `tools/list` while gated: only `begin_session` visible
- `tools/list` after ungating: `begin_session` replaced by `read_prompts` + all upstream tools
- Priority 10 always included regardless of tag match or budget
- Prompt index in instructions: cap at 50, priority 7+ when over cap
- Notifications: `tools/list_changed` sent after ungating

### Unit Tests (mcplocal — prompt links)

- Link resolution: fetch content from the source project's MCP server via service account
- Dead link detection: source server unavailable, resource not found, permission denied
- Dead link marking: status field updated, error logged
- RBAC enforcement: link creation requires edit permission on target project RBAC
- Service account permission: auto-created on the source project for the linked resource
- Content isolation: client LLM cannot access the source server directly

### Unit Tests (CLI)

- `create prompt` with `--priority` flag, validation
- `create prompt` with `--link` flag, format validation
- `get prompt -A` output: all projects, link targets, status columns
- `describe project` output: gated status, session greeting, prompt table
- `edit project` with gated field
- Shell completions for new flags and resources

### Integration Tests

- End-to-end gated session: connect → begin_session with tags → tools available → correct prompts returned
- End-to-end intercept: connect → skip begin_session → call tool → keywords extracted → briefing injected
- End-to-end read_prompts: after ungating → request more context → additional prompts returned → no duplicates
- Prompt link resolution: create link → fetch content → verify content matches source
- Dead link lifecycle: create link → kill source → verify dead detection → restore → verify recovery
- Priority ordering: create prompts at various priorities → verify selection order and budget allocation
- Encouragement text: verify retrieval encouragement present in begin_session, read_prompts, and instructions
## System Prompts (mcpctl-system project)

All gate messages, encouragement text, and briefing templates are stored as prompts in a special `mcpctl-system` project. This makes them editable at runtime via `mcpctl edit prompt` without code changes or redeployment.

### Required System Prompts

| Name | Priority | Purpose |
|------|----------|---------|
| `gate-instructions` | 10 | Text injected into `initialize` instructions for gated projects. Tells the client to call `begin_session` with 5 keywords. |
| `gate-encouragement` | 10 | Appended after the `begin_session` response. Lists remaining prompts and encourages further retrieval. |
| `read-prompts-reminder` | 10 | Appended after the `read_prompts` response. Reminds the client that more context is available. |
| `gate-intercept-preamble` | 10 | Prepended to the briefing when injected via tool call intercept (Option C fallback). |
| `session-greeting` | 10 | Shown in `mcpctl describe project` as the "hello prompt" — what client LLMs see on connect. |

### Bootstrap

The `mcpctl-system` project and its system prompts are created automatically on first startup (seed migration). They can be edited afterward but not deleted — delete attempts return an error.

### How mcplocal Uses Them

On router initialization, mcplocal fetches system prompts from mcpd via:

```
GET /api/v1/projects/mcpctl-system/prompts/visible
```

These are cached with the same 60s TTL as project routers. The prompt content supports template variables:

- `{{prompt_index}}` — replaced with the current project's prompt index
- `{{project_name}}` — replaced with the current project name
- `{{matched_prompts}}` — replaced with tag-matched prompt content
- `{{remaining_prompts}}` — replaced with the list of non-matched prompts

This way the encouragement text, tone, and structure can be tuned by editing prompts — no code changes needed.
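Substituting those variables is a one-liner worth pinning down: unknown placeholders should pass through untouched so a typo in a system prompt fails visibly rather than silently dropping text. A sketch; the renderer name is an assumption:

```typescript
// Render a system-prompt template, substituting {{variable}} placeholders.
// Unknown placeholders are left as-is so typos surface in the output. A sketch.
function renderTemplate(tpl: string, vars: Record<string, string>): string {
  return tpl.replace(/\{\{(\w+)\}\}/g, (whole, key: string) => vars[key] ?? whole);
}
```

For example, rendering `"Welcome to {{project_name}}.\n{{prompt_index}}"` with `{ project_name: "homeautomation", prompt_index: "- security-policies: ..." }` fills both placeholders in one pass.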
## Security Considerations

- Prompt links: content fetched server-side; the client never gets direct access to the source MCP server
- RBAC: link creation requires edit permission on the target project's RBAC
- Service account: the source project grants read access to the linked resource only
- Dead links: logged as errors, marked in listings; source server errors are never exposed to the client
- Tag extraction: sanitize tool call arguments before using them as keywords (prevent injection)
@@ -1408,13 +1408,497 @@
|
||||
"status": "done",
|
||||
"subtasks": [],
|
||||
"updatedAt": "2026-02-21T18:52:29.084Z"
|
||||
},
|
||||
{
|
||||
"id": "37",
|
||||
"title": "Add priority, summary, chapters, and linkTarget fields to Prompt schema",
|
||||
"description": "Extend the Prisma schema for the Prompt model to include priority (integer 1-10, default 5), summary (nullable string), chapters (nullable JSON array), and linkTarget (nullable string for prompt links).",
|
||||
"details": "1. Update `/src/db/prisma/schema.prisma` to add fields to the Prompt model:\n - `priority Int @default(5)` with check constraint 1-10\n - `summary String? @db.Text`\n - `chapters Json?` (stored as JSON array of strings)\n - `linkTarget String?` (format: `project/server:resource-uri`)\n\n2. Create Prisma migration:\n ```bash\n pnpm --filter db exec prisma migrate dev --name add-prompt-priority-summary-chapters-link\n ```\n\n3. Update TypeScript types in shared package to reflect new fields\n\n4. Add validation for priority range (1-10) at the database level if possible, otherwise enforce in application layer",
|
||||
"testStrategy": "- Unit test: Verify migration creates columns with correct types and defaults\n- Unit test: Verify priority default is 5\n- Unit test: Verify nullable fields accept null\n- Unit test: Verify chapters stores/retrieves JSON arrays correctly\n- Integration test: Create prompt with all new fields, retrieve and verify values",
|
||||
"priority": "high",
|
||||
"dependencies": [],
|
||||
"status": "done",
|
||||
"subtasks": [],
|
||||
"updatedAt": "2026-02-25T22:35:08.154Z"
|
||||
},
|
||||
{
|
||||
"id": "38",
|
||||
"title": "Add priority field to PromptRequest schema",
|
||||
"description": "Extend the Prisma schema for the PromptRequest model to include the priority field (integer 1-10, default 5) to match the Prompt model.",
|
||||
"details": "1. Update `/src/db/prisma/schema.prisma` to add to PromptRequest:\n - `priority Int @default(5)`\n\n2. Create Prisma migration:\n ```bash\n pnpm --filter db exec prisma migrate dev --name add-promptrequest-priority\n ```\n\n3. Update the `CreatePromptRequestSchema` in `/src/mcpd/src/validation/prompt.schema.ts` to include priority validation:\n ```typescript\n priority: z.number().int().min(1).max(10).default(5).optional(),\n ```\n\n4. Update TypeScript types in shared package",
|
||||
"testStrategy": "- Unit test: Migration creates priority column with default 5\n- Unit test: PromptRequest creation with explicit priority\n- Unit test: PromptRequest creation uses default priority when not specified\n- Unit test: Validation rejects priority outside 1-10 range",
|
||||
"priority": "high",
|
||||
"dependencies": [
|
||||
"37"
|
||||
],
|
||||
"status": "done",
|
||||
"subtasks": [],
|
||||
"updatedAt": "2026-02-25T22:35:08.160Z"
|
||||
},
|
||||
{
|
||||
"id": "39",
|
||||
"title": "Add gated field to Project schema",
|
||||
"description": "Extend the Prisma schema for the Project model to include the gated boolean field (default true) that controls whether sessions go through the keyword-driven prompt retrieval flow.",
|
||||
"details": "1. Update `/src/db/prisma/schema.prisma` to add to Project:\n - `gated Boolean @default(true)`\n\n2. Create Prisma migration:\n ```bash\n pnpm --filter db exec prisma migrate dev --name add-project-gated\n ```\n\n3. Update project-related TypeScript types\n\n4. Update project validation schemas to include gated field:\n ```typescript\n gated: z.boolean().default(true).optional(),\n ```\n\n5. Update project API routes to accept and return the gated field",
|
||||
"testStrategy": "- Unit test: Migration creates gated column with default true\n- Unit test: Project creation with gated=false\n- Unit test: Project creation uses default gated=true when not specified\n- Unit test: Project update can toggle gated field\n- Integration test: GET /api/v1/projects/:name returns gated field",
|
||||
"priority": "high",
|
||||
"dependencies": [],
|
||||
"status": "done",
|
||||
"subtasks": [],
|
||||
"updatedAt": "2026-02-25T22:35:08.165Z"
|
||||
},
|
||||
{
|
||||
"id": "40",
|
||||
"title": "Update Prompt CRUD API to handle priority and linkTarget",
|
||||
"description": "Modify prompt API endpoints to accept, validate, and return the priority and linkTarget fields. LinkTarget should be immutable after creation.",
|
||||
"details": "1. Update `/src/mcpd/src/validation/prompt.schema.ts`:\n ```typescript\n export const CreatePromptSchema = z.object({\n name: z.string().min(1).max(100).regex(/^[a-z0-9-]+$/),\n content: z.string().min(1).max(50000),\n projectId: z.string().optional(),\n priority: z.number().int().min(1).max(10).default(5).optional(),\n linkTarget: z.string().regex(/^[a-z0-9-]+\\/[a-z0-9-]+:[\\S]+$/).optional(),\n });\n \n export const UpdatePromptSchema = z.object({\n content: z.string().min(1).max(50000).optional(),\n priority: z.number().int().min(1).max(10).optional(),\n // Note: linkTarget is NOT included - links are immutable\n });\n ```\n\n2. Update `/src/mcpd/src/routes/prompts.ts`:\n - POST /api/v1/prompts: Accept priority, linkTarget\n - PUT /api/v1/prompts/:id: Accept priority only (not linkTarget)\n - GET endpoints: Return priority, linkTarget in response\n\n3. Update repository layer to handle new fields\n\n4. Add linkTarget format validation: `project/server:resource-uri`",
|
||||
"testStrategy": "- Unit test: POST /api/v1/prompts with priority creates prompt with correct priority\n- Unit test: POST /api/v1/prompts with linkTarget creates linked prompt\n- Unit test: PUT /api/v1/prompts/:id with priority updates priority\n- Unit test: PUT /api/v1/prompts/:id rejects linkTarget (immutable)\n- Unit test: GET /api/v1/prompts returns priority and linkTarget fields\n- Unit test: Invalid linkTarget format rejected (validation error)\n- Unit test: Priority outside 1-10 range rejected",
|
||||
"priority": "high",
|
||||
"dependencies": [
|
||||
"37"
|
||||
],
|
||||
"status": "done",
|
||||
"subtasks": [],
|
||||
"updatedAt": "2026-02-25T22:37:17.506Z"
|
||||
},
|
||||
{
|
||||
"id": "41",
|
||||
"title": "Update PromptRequest API to handle priority",
|
||||
"description": "Modify prompt request API endpoints to accept, validate, and return the priority field for proposed prompts.",
|
||||
"details": "1. Update validation in `/src/mcpd/src/validation/prompt.schema.ts`:\n ```typescript\n export const CreatePromptRequestSchema = z.object({\n name: z.string().min(1).max(100).regex(/^[a-z0-9-]+$/),\n content: z.string().min(1).max(50000),\n projectId: z.string().optional(),\n createdBySession: z.string().optional(),\n createdByUserId: z.string().optional(),\n priority: z.number().int().min(1).max(10).default(5).optional(),\n });\n ```\n\n2. Update `/src/mcpd/src/routes/prompts.ts` for PromptRequest endpoints:\n - POST /api/v1/promptrequests: Accept priority\n - GET /api/v1/promptrequests: Return priority\n - POST /api/v1/promptrequests/:id/approve: Preserve priority when creating Prompt\n\n3. Update PromptService.approve() to copy priority from request to prompt\n\n4. Update repository layer",
|
||||
"testStrategy": "- Unit test: POST /api/v1/promptrequests with priority creates request with correct priority\n- Unit test: POST /api/v1/promptrequests uses default priority 5 when not specified\n- Unit test: GET /api/v1/promptrequests returns priority field\n- Unit test: Approve preserves priority from request to created prompt\n- Unit test: Priority validation (1-10 range)",
|
||||
"priority": "high",
|
||||
"dependencies": [
|
||||
"38"
|
||||
],
|
||||
"status": "done",
|
||||
"subtasks": [],
|
||||
"updatedAt": "2026-02-25T22:37:17.511Z"
|
||||
},
|
||||
{
|
||||
"id": "42",
|
||||
"title": "Implement prompt summary generation service",
|
||||
"description": "Create a service that auto-generates summary (20 words) and chapters (key sections) for prompts, using fast LLM when available or regex fallback.",
|
||||
"details": "1. Create `/src/mcpd/src/services/prompt-summary.service.ts`:\n ```typescript\n export class PromptSummaryService {\n constructor(\n private llmClient: LlmClient | null,\n private promptRepo: IPromptRepository\n ) {}\n \n async generateSummary(content: string): Promise<{ summary: string; chapters: string[] }> {\n if (this.llmClient) {\n return this.generateWithLlm(content);\n }\n return this.generateWithRegex(content);\n }\n \n private async generateWithLlm(content: string): Promise<...> {\n // Send content to fast LLM with prompt:\n // \"Generate a 20-word summary and extract key section topics...\"\n }\n \n private generateWithRegex(content: string): { summary: string; chapters: string[] } {\n // summary: first sentence of content (truncated to ~20 words)\n // chapters: extract markdown headings via regex /^#+\\s+(.+)$/gm\n }\n }\n ```\n\n2. Integrate with PromptService:\n - Call generateSummary on prompt create\n - Call generateSummary on prompt update (when content changes)\n - Cache results on the prompt record\n\n3. Handle LLM availability check via existing LlmConfig patterns",
"testStrategy": "- Unit test: generateWithRegex extracts first sentence as summary\n- Unit test: generateWithRegex extracts markdown headings as chapters\n- Unit test: generateWithLlm calls LLM with correct prompt (mock LLM)\n- Unit test: generateSummary uses LLM when available\n- Unit test: generateSummary falls back to regex when no LLM\n- Unit test: Empty content handled gracefully\n- Unit test: Content without headings returns empty chapters array\n- Integration test: Creating prompt triggers summary generation",
"priority": "high",
"dependencies": [
"37"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T22:39:28.196Z"
},
{
"id": "43",
"title": "Add regenerate-summary API endpoint",
"description": "Create POST /api/v1/prompts/:id/regenerate-summary endpoint to force re-generation of summary and chapters for a prompt.",
"details": "1. Add route in `/src/mcpd/src/routes/prompts.ts`:\n ```typescript\n fastify.post('/api/v1/prompts/:id/regenerate-summary', async (request, reply) => {\n const { id } = request.params as { id: string };\n const prompt = await promptService.findById(id);\n if (!prompt) {\n return reply.status(404).send({ error: 'Prompt not found' });\n }\n \n const { summary, chapters } = await summaryService.generateSummary(prompt.content);\n const updated = await promptService.updateSummary(id, summary, chapters);\n \n return reply.send(updated);\n });\n ```\n\n2. Add `updateSummary(id, summary, chapters)` method to PromptRepository and PromptService\n\n3. Return the updated prompt with new summary/chapters in response",
"testStrategy": "- Unit test: POST to valid prompt ID regenerates summary\n- Unit test: Returns updated prompt with new summary/chapters\n- Unit test: 404 for non-existent prompt ID\n- Unit test: Uses LLM when available, regex fallback otherwise\n- Integration test: End-to-end regeneration updates database",
"priority": "medium",
"dependencies": [
"42"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T22:39:28.201Z"
},
{
"id": "44",
"title": "Create prompt-index API endpoint",
"description": "Create GET /api/v1/projects/:name/prompt-index endpoint that returns a compact index of prompts (name, priority, summary, chapters) for a project.",
"details": "1. Add route in `/src/mcpd/src/routes/prompts.ts`:\n ```typescript\n fastify.get('/api/v1/projects/:name/prompt-index', async (request, reply) => {\n const { name } = request.params as { name: string };\n const project = await projectService.findByName(name);\n if (!project) {\n return reply.status(404).send({ error: 'Project not found' });\n }\n \n const prompts = await promptService.findByProject(project.id);\n const index = prompts.map(p => ({\n name: p.name,\n priority: p.priority,\n summary: p.summary,\n chapters: p.chapters,\n linkTarget: p.linkTarget,\n }));\n \n return reply.send({ prompts: index });\n });\n ```\n\n2. Consider adding global prompts to the index (inherited by all projects)\n\n3. Sort by priority descending in response",
"testStrategy": "- Unit test: Returns compact index for valid project\n- Unit test: Index contains name, priority, summary, chapters, linkTarget\n- Unit test: 404 for non-existent project\n- Unit test: Empty array for project with no prompts\n- Unit test: Results sorted by priority descending\n- Integration test: End-to-end retrieval matches database state",
"priority": "medium",
"dependencies": [
"42"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T22:39:28.208Z"
},
{
"id": "45",
"title": "Implement tag-matching algorithm for prompt selection",
"description": "Create a deterministic keyword-based tag matching algorithm as the no-LLM fallback for prompt selection, with byte-budget allocation and priority weighting.",
"details": "1. Create `/src/mcplocal/src/services/tag-matcher.service.ts`:\n ```typescript\n interface MatchedPrompt {\n prompt: PromptIndex;\n score: number;\n matchedTags: string[];\n }\n \n export class TagMatcherService {\n constructor(private byteBudget: number = 8192) {}\n \n matchPrompts(tags: string[], promptIndex: PromptIndex[]): {\n fullContent: PromptIndex[]; // Prompts to include in full\n indexOnly: PromptIndex[]; // Prompts to include as index entries\n remaining: PromptIndex[]; // Non-matched prompts (names only)\n } {\n // 1. Priority 10 prompts: always included (score = Infinity)\n // 2. For each prompt, compute score:\n // - Check tags against summary + chapters (case-insensitive substring)\n // - score = matching_tags_count * priority\n // 3. Sort by score descending\n // 4. Fill byte budget from top:\n // - Include full content until budget exhausted\n // - Remaining matched: include as index entries\n // - Non-matched: names only\n }\n \n private computeScore(tags: string[], prompt: PromptIndex): number {\n if (prompt.priority === 10) return Infinity;\n const matchingTags = tags.filter(tag => \n this.matchesPrompt(tag.toLowerCase(), prompt)\n );\n return matchingTags.length * prompt.priority;\n }\n \n private matchesPrompt(tag: string, prompt: PromptIndex): boolean {\n const searchText = [\n prompt.summary || '',\n ...(prompt.chapters || [])\n ].join(' ').toLowerCase();\n return searchText.includes(tag);\n }\n }\n ```\n\n2. Handle edge cases: empty tags, no prompts, all priority 10, etc.",
"testStrategy": "- Unit test: Priority 10 prompts always included regardless of tags\n- Unit test: Score calculation: matching_tags * priority\n- Unit test: Case-insensitive matching\n- Unit test: Substring matching in summary and chapters\n- Unit test: Byte budget exhaustion stops full content inclusion\n- Unit test: Matched prompts beyond budget become index entries\n- Unit test: Non-matched prompts listed as names only\n- Unit test: Sorting by score descending\n- Unit test: Empty tags returns priority 10 only\n- Unit test: No prompts returns empty result",
"priority": "high",
"dependencies": [
"44"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T22:40:47.570Z"
},
{
"id": "46",
"title": "Implement LLM-based prompt selection service",
"description": "Create a service that uses the heavy LLM to intelligently select relevant prompts based on tags and the full prompt index, understanding synonyms and context.",
"details": "1. Create `/src/mcplocal/src/services/llm-prompt-selector.service.ts`:\n ```typescript\n export class LlmPromptSelectorService {\n constructor(\n private llmClient: LlmClient,\n private fastLlmClient: LlmClient | null,\n private tagMatcher: TagMatcherService // fallback\n ) {}\n \n async selectPrompts(tags: string[], promptIndex: PromptIndex[]): Promise<{\n selected: Array<{ name: string; reason: string }>;\n priority10: PromptIndex[]; // Always included\n }> {\n // 1. Extract priority 10 prompts (always included)\n // 2. Generate missing summaries using fast LLM if needed\n // 3. Send to heavy LLM:\n const prompt = `\n Given these keywords: ${tags.join(', ')}\n And this prompt index:\n ${promptIndex.map(p => `- ${p.name}: ${p.summary}`).join('\\n')}\n \n Select the most relevant prompts for someone working on tasks\n related to these keywords. Consider synonyms and related concepts.\n Return a ranked JSON array: [{name: string, reason: string}]\n `;\n // 4. Parse LLM response\n // 5. On LLM error, fall back to tag matcher\n }\n }\n ```\n\n2. Handle LLM timeouts and errors gracefully with fallback\n\n3. Validate LLM response format",
"testStrategy": "- Unit test: Priority 10 prompts always returned regardless of LLM selection\n- Unit test: LLM called with correct prompt format (mock)\n- Unit test: LLM response parsed correctly\n- Unit test: Invalid LLM response falls back to tag matcher\n- Unit test: LLM timeout falls back to tag matcher\n- Unit test: Missing summaries trigger fast LLM generation\n- Unit test: No LLM available uses tag matcher directly\n- Integration test: End-to-end selection with mock LLM",
"priority": "high",
"dependencies": [
"45"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T22:45:57.158Z"
},
{
"id": "47",
"title": "Implement session state management for gating",
"description": "Extend the McpRouter to track per-session gating state including gated status, accumulated tags, and retrieved prompts set.",
"details": "1. Update `/src/mcplocal/src/router.ts` to add session state:\n ```typescript\n interface SessionState {\n gated: boolean; // starts true if project is gated\n tags: string[]; // accumulated from begin_session + read_prompts\n retrievedPrompts: Set<string>; // prompts already sent (avoid duplicates)\n }\n \n export class McpRouter {\n private sessionStates: Map<string, SessionState> = new Map();\n \n getSessionState(sessionId: string): SessionState {\n if (!this.sessionStates.has(sessionId)) {\n this.sessionStates.set(sessionId, {\n gated: this.projectConfig?.gated ?? true,\n tags: [],\n retrievedPrompts: new Set(),\n });\n }\n return this.sessionStates.get(sessionId)!;\n }\n \n ungateSession(sessionId: string): void {\n const state = this.getSessionState(sessionId);\n state.gated = false;\n }\n \n addRetrievedPrompts(sessionId: string, names: string[]): void {\n const state = this.getSessionState(sessionId);\n names.forEach(n => state.retrievedPrompts.add(n));\n }\n }\n ```\n\n2. Clean up session state when session closes\n\n3. Handle session state for non-gated projects (gated=false from start)",
"testStrategy": "- Unit test: New session starts with gated=true for gated project\n- Unit test: New session starts with gated=false for non-gated project\n- Unit test: ungateSession changes gated to false\n- Unit test: addRetrievedPrompts adds to set\n- Unit test: retrievedPrompts prevents duplicates\n- Unit test: Session state isolated per sessionId\n- Unit test: Session cleanup removes state",
"priority": "high",
"dependencies": [
"39"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T22:45:57.164Z"
},
{
"id": "48",
"title": "Implement begin_session tool for gated sessions",
"description": "Create the begin_session MCP tool that accepts 5 keywords, triggers prompt selection, returns matched content with encouragement, and ungates the session.",
"details": "1. Add begin_session tool definition in `/src/mcplocal/src/router.ts`:\n ```typescript\n private getBeginSessionTool(): Tool {\n return {\n name: 'begin_session',\n description: 'Start your session by providing 5 keywords that describe your current task. You\\'ll receive relevant project context, policies, and guidelines. Required before using other tools.',\n inputSchema: {\n type: 'object',\n properties: {\n tags: {\n type: 'array',\n items: { type: 'string' },\n maxItems: 10,\n description: '5 keywords describing your current task'\n }\n },\n required: ['tags']\n }\n };\n }\n ```\n\n2. Implement begin_session handler:\n - Validate tags array (1-10 items)\n - Call LlmPromptSelector or TagMatcher based on LLM availability\n - Fetch full content for selected prompts\n - Build response with matched content + index + encouragement\n - Ungate session\n - Send `notifications/tools/list_changed`\n\n3. Response format:\n ```\n [Priority 10 prompts - full content]\n \n [Tag-matched prompts - full content, priority-ordered]\n \n Other prompts available that may become relevant...\n - name: summary\n ...\n If any seem related, request them with read_prompts({ tags: [...] }).\n ```",
"testStrategy": "- Unit test: begin_session with valid tags returns matched prompts\n- Unit test: begin_session includes priority 10 prompts always\n- Unit test: begin_session response includes encouragement text\n- Unit test: begin_session response includes prompt index\n- Unit test: Session ungated after successful begin_session\n- Unit test: notifications/tools/list_changed sent after ungating\n- Unit test: Empty tags handled (returns priority 10 only)\n- Unit test: Invalid tags rejected with error\n- Unit test: begin_session while already ungated returns error",
"priority": "high",
"dependencies": [
"46",
"47"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T22:50:39.111Z"
},
{
"id": "49",
"title": "Implement read_prompts tool for ongoing retrieval",
"description": "Create the read_prompts MCP tool that allows clients to request additional context by keywords after the session is ungated.",
"details": "1. Add read_prompts tool definition:\n ```typescript\n private getReadPromptsTool(): Tool {\n return {\n name: 'read_prompts',\n description: 'Request additional project context by keywords. Use this whenever you need guidelines, policies, or conventions related to your current work.',\n inputSchema: {\n type: 'object',\n properties: {\n tags: {\n type: 'array',\n items: { type: 'string' },\n description: 'Keywords describing what context you need'\n }\n },\n required: ['tags']\n }\n };\n }\n ```\n\n2. Implement read_prompts handler:\n - Always use keyword matching (not LLM) for precision\n - Exclude already-retrieved prompts from response\n - Add newly retrieved prompts to session state\n - Include reminder about more prompts available\n\n3. Response format:\n ```\n [Matched prompt content - deduplicated]\n \n Remember: you can request more prompts at any time with read_prompts({ tags: [...] }).\n The project may have additional guidelines relevant to your current approach.\n ```",
"testStrategy": "- Unit test: read_prompts returns matched prompts by keyword\n- Unit test: Already retrieved prompts excluded from response\n- Unit test: Newly retrieved prompts added to session state\n- Unit test: Response includes reminder text\n- Unit test: read_prompts while gated returns error\n- Unit test: Empty tags returns empty response\n- Unit test: Uses keyword matching not LLM",
"priority": "high",
"dependencies": [
"48"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T22:50:39.115Z"
},
{
"id": "50",
"title": "Implement progressive tool exposure for gated sessions",
"description": "Modify tools/list behavior to only expose begin_session while gated, and expose all tools plus read_prompts after ungating.",
"details": "1. Update tools/list handling in `/src/mcplocal/src/router.ts`:\n ```typescript\n async handleToolsList(sessionId: string): Promise<Tool[]> {\n const state = this.getSessionState(sessionId);\n \n if (state.gated) {\n // Only show begin_session while gated\n return [this.getBeginSessionTool()];\n }\n \n // After ungating: all upstream tools + read_prompts\n const upstreamTools = await this.discoverTools();\n return [...upstreamTools, this.getReadPromptsTool()];\n }\n ```\n\n2. Block direct tool calls while gated:\n ```typescript\n async handleToolCall(sessionId: string, toolName: string, args: any): Promise<any> {\n const state = this.getSessionState(sessionId);\n \n if (state.gated && toolName !== 'begin_session') {\n // Intercept: extract keywords, match prompts, inject briefing\n return this.handleInterceptedCall(sessionId, toolName, args);\n }\n \n // Normal routing\n return this.routeToolCall(toolName, args);\n }\n ```\n\n3. Ensure notifications/tools/list_changed is sent after ungating",
"testStrategy": "- Unit test: tools/list while gated returns only begin_session\n- Unit test: tools/list after ungating returns all tools + read_prompts\n- Unit test: begin_session not visible after ungating\n- Unit test: Tool call while gated (not begin_session) triggers intercept\n- Unit test: Tool call after ungating routes normally\n- Unit test: notifications/tools/list_changed sent on ungate",
"priority": "high",
"dependencies": [
"48",
"49"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T22:50:39.120Z"
},
{
"id": "51",
"title": "Implement keyword extraction from tool calls",
"description": "Create a service that extracts keywords from tool names and arguments for the intercept fallback path when clients skip begin_session.",
"details": "1. Create `/src/mcplocal/src/services/keyword-extractor.service.ts`:\n ```typescript\n export class KeywordExtractorService {\n extractKeywords(toolName: string, args: Record<string, any>): string[] {\n const keywords: string[] = [];\n \n // Extract from tool name (split on / and -)\n // e.g., \"home-assistant/get_entities\" -> [\"home\", \"assistant\", \"get\", \"entities\"]\n keywords.push(...this.extractFromName(toolName));\n \n // Extract from argument values\n // e.g., { domain: \"light\", entity_id: \"light.kitchen\" } -> [\"light\", \"kitchen\"]\n keywords.push(...this.extractFromArgs(args));\n \n // Deduplicate and sanitize\n return [...new Set(keywords.map(k => this.sanitize(k)))];\n }\n \n private sanitize(keyword: string): string {\n // Remove special characters, lowercase, limit length\n return keyword.toLowerCase().replace(/[^a-z0-9]/g, '').slice(0, 50);\n }\n }\n ```\n\n2. Handle various argument types: strings, arrays, nested objects\n\n3. Prevent injection by sanitizing extracted keywords",
"testStrategy": "- Unit test: Extracts keywords from tool name with /\n- Unit test: Extracts keywords from tool name with -\n- Unit test: Extracts keywords from string argument values\n- Unit test: Extracts keywords from array argument values\n- Unit test: Handles nested object arguments\n- Unit test: Sanitizes special characters\n- Unit test: Deduplicates keywords\n- Unit test: Handles empty arguments\n- Unit test: Limits keyword length to prevent abuse",
"priority": "medium",
"dependencies": [],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T22:40:47.575Z"
},
{
"id": "52",
"title": "Implement tool call intercept with briefing injection",
"description": "When a gated session calls a tool without first calling begin_session, intercept the call, extract keywords, match prompts, and inject the briefing alongside the real tool result.",
"details": "1. Implement handleInterceptedCall in `/src/mcplocal/src/router.ts`:\n ```typescript\n async handleInterceptedCall(\n sessionId: string,\n toolName: string,\n args: any\n ): Promise<ToolResult> {\n // 1. Extract keywords from tool call\n const keywords = this.keywordExtractor.extractKeywords(toolName, args);\n \n // 2. Match prompts using keywords\n const { fullContent, indexOnly, remaining } = \n await this.promptSelector.selectPrompts(keywords, this.promptIndex);\n \n // 3. Execute the actual tool call\n const actualResult = await this.routeToolCall(toolName, args);\n \n // 4. Build briefing with intercept preamble\n const briefing = this.buildBriefing(fullContent, indexOnly, remaining, 'intercept');\n \n // 5. Ungate session\n this.ungateSession(sessionId);\n \n // 6. Send notifications/tools/list_changed\n await this.sendToolsListChanged();\n \n // 7. Return combined result\n return {\n content: [{\n type: 'text',\n text: `${briefing}\\n\\n---\\n\\n${actualResult.content[0].text}`\n }]\n };\n }\n ```\n\n2. Use gate-intercept-preamble system prompt for the briefing prefix",
"testStrategy": "- Unit test: Tool call while gated triggers intercept\n- Unit test: Keywords extracted from tool name and args\n- Unit test: Prompts matched using extracted keywords\n- Unit test: Actual tool still executes and returns result\n- Unit test: Briefing prepended to tool result\n- Unit test: Session ungated after intercept\n- Unit test: notifications/tools/list_changed sent\n- Unit test: Intercept preamble included in briefing\n- Integration test: End-to-end intercept flow",
"priority": "high",
"dependencies": [
"50",
"51"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T22:51:03.822Z"
},
{
"id": "53",
"title": "Add prompt index to initialize instructions",
"description": "Modify the initialize handler to include the compact prompt index and gate message in instructions for gated projects.",
"details": "1. Update initialize handling in `/src/mcplocal/src/router.ts`:\n ```typescript\n async handleInitialize(sessionId: string): Promise<InitializeResult> {\n const state = this.getSessionState(sessionId);\n \n let instructions = this.projectConfig.prompt || '';\n \n if (state.gated) {\n // Add gate instructions\n const gateInstructions = await this.getSystemPrompt('gate-instructions');\n \n // Build prompt index (cap at 50, priority 7+ if over)\n const index = this.buildPromptIndex();\n \n instructions += `\\n\\n${gateInstructions.replace('{{prompt_index}}', index)}`;\n }\n \n return {\n protocolVersion: '2024-11-05',\n capabilities: { ... },\n serverInfo: { ... },\n instructions,\n };\n }\n ```\n\n2. Build prompt index with cap:\n - If <= 50 prompts: include all\n - If > 50 prompts: include only priority 7+\n - Format: `- <name>: <summary>` (~100 chars per entry)",
"testStrategy": "- Unit test: Gated project includes gate instructions in initialize\n- Unit test: Prompt index included in instructions\n- Unit test: Index capped at 50 entries\n- Unit test: Over 50 prompts shows priority 7+ only\n- Unit test: Non-gated project skips gate instructions\n- Unit test: {{prompt_index}} template replaced\n- Integration test: End-to-end initialize with gated project",
"priority": "high",
"dependencies": [
"47",
"44"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T22:52:13.697Z"
},
{
"id": "54",
"title": "Create mcpctl-system project with system prompts",
"description": "Implement bootstrap logic to create the mcpctl-system project and its required system prompts on first startup, with protection against deletion.",
"details": "1. Create seed migration or startup hook:\n ```typescript\n async function bootstrapSystemProject() {\n const systemProject = await projectRepo.findByName('mcpctl-system');\n if (systemProject) return; // Already exists\n \n // Create mcpctl-system project\n const project = await projectRepo.create({\n name: 'mcpctl-system',\n description: 'System prompts for mcpctl gating and encouragement',\n gated: false, // System project is not gated\n ownerId: SYSTEM_USER_ID,\n });\n \n // Create required system prompts\n const systemPrompts = [\n { name: 'gate-instructions', priority: 10, content: GATE_INSTRUCTIONS },\n { name: 'gate-encouragement', priority: 10, content: GATE_ENCOURAGEMENT },\n { name: 'read-prompts-reminder', priority: 10, content: READ_PROMPTS_REMINDER },\n { name: 'gate-intercept-preamble', priority: 10, content: GATE_INTERCEPT_PREAMBLE },\n { name: 'session-greeting', priority: 10, content: SESSION_GREETING },\n ];\n \n for (const p of systemPrompts) {\n await promptRepo.create({ ...p, projectId: project.id });\n }\n }\n ```\n\n2. Add delete protection in prompt delete endpoint:\n - Check if prompt belongs to mcpctl-system\n - Return 403 error if attempting to delete system prompt\n\n3. Define default content for each system prompt per PRD",
"testStrategy": "- Unit test: System project created on first startup\n- Unit test: All 5 system prompts created\n- Unit test: Subsequent startups don't duplicate\n- Unit test: Delete system prompt returns 403\n- Unit test: System prompts have priority 10\n- Unit test: mcpctl-system project has gated=false\n- Integration test: End-to-end bootstrap flow",
"priority": "high",
"dependencies": [
"40",
"39"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T22:56:12.064Z"
},
{
"id": "55",
"title": "Implement system prompt fetching and caching in mcplocal",
"description": "Add functionality to mcplocal router to fetch system prompts from mcpd and cache them with 60s TTL, supporting template variable replacement.",
"details": "1. Add system prompt fetching in `/src/mcplocal/src/router.ts`:\n ```typescript\n private systemPromptCache: Map<string, { content: string; expiresAt: number }> = new Map();\n \n async getSystemPrompt(name: string): Promise<string> {\n const cached = this.systemPromptCache.get(name);\n if (cached && cached.expiresAt > Date.now()) {\n return cached.content;\n }\n \n const prompts = await this.mcpdClient.fetch(\n '/api/v1/projects/mcpctl-system/prompts/visible'\n );\n const prompt = prompts.find(p => p.name === name);\n if (!prompt) {\n throw new Error(`System prompt not found: ${name}`);\n }\n \n this.systemPromptCache.set(name, {\n content: prompt.content,\n expiresAt: Date.now() + 60000, // 60s TTL\n });\n \n return prompt.content;\n }\n ```\n\n2. Add template variable replacement:\n ```typescript\n replaceTemplateVariables(content: string, vars: Record<string, string>): string {\n return content\n .replace(/\\{\\{prompt_index\\}\\}/g, vars.prompt_index || '')\n .replace(/\\{\\{project_name\\}\\}/g, vars.project_name || '')\n .replace(/\\{\\{matched_prompts\\}\\}/g, vars.matched_prompts || '')\n .replace(/\\{\\{remaining_prompts\\}\\}/g, vars.remaining_prompts || '');\n }\n ```",
"testStrategy": "- Unit test: System prompt fetched from mcpd\n- Unit test: Cached prompt returned within TTL\n- Unit test: Cache miss triggers fresh fetch\n- Unit test: Missing system prompt throws error\n- Unit test: Template variables replaced correctly\n- Unit test: Unknown template variables left as-is\n- Integration test: End-to-end fetch and cache",
"priority": "high",
"dependencies": [
"54"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T22:57:28.917Z"
},
{
"id": "56",
"title": "Implement prompt link resolution service",
"description": "Create a service that fetches linked prompt content from source MCP servers using the project's service account, with dead link detection.",
"details": "1. Create `/src/mcplocal/src/services/link-resolver.service.ts`:\n ```typescript\n export class LinkResolverService {\n constructor(private mcpdClient: McpdClient) {}\n \n async resolveLink(linkTarget: string): Promise<{\n content: string | null;\n status: 'alive' | 'dead' | 'unknown';\n error?: string;\n }> {\n // Parse linkTarget: project/server:resource-uri\n const { project, server, uri } = this.parseLink(linkTarget);\n \n try {\n // Use service account for source project\n const content = await this.fetchResource(project, server, uri);\n return { content, status: 'alive' };\n } catch (error) {\n this.logDeadLink(linkTarget, error);\n return { \n content: null, \n status: 'dead',\n error: error.message \n };\n }\n }\n \n private parseLink(linkTarget: string): { project: string; server: string; uri: string } {\n const match = linkTarget.match(/^([^/]+)\\/([^:]+):(.+)$/);\n if (!match) throw new Error('Invalid link format');\n return { project: match[1], server: match[2], uri: match[3] };\n }\n \n private async fetchResource(project: string, server: string, uri: string): Promise<string> {\n // Call mcpd to fetch resource via service account\n // mcpd routes to the source project's MCP server\n }\n }\n ```\n\n2. Log dead links as errors\n\n3. Cache resolution results",
"testStrategy": "- Unit test: Valid link parsed correctly\n- Unit test: Invalid link format throws error\n- Unit test: Successful resolution returns content and status='alive'\n- Unit test: Failed resolution returns status='dead' with error\n- Unit test: Dead link logged as error\n- Unit test: Service account header included in request\n- Integration test: End-to-end link resolution",
"priority": "medium",
"dependencies": [
"40"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T23:07:29.026Z"
},
{
"id": "57",
"title": "Add linkStatus to prompt GET responses",
"description": "Modify the GET /api/v1/prompts endpoint to include linkStatus (alive/dead/unknown) for linked prompts by checking link health.",
"details": "1. Update `/src/mcpd/src/routes/prompts.ts` GET endpoint:\n ```typescript\n fastify.get('/api/v1/prompts', async (request, reply) => {\n const prompts = await promptService.findAll(filter);\n \n // Check link status for linked prompts\n const promptsWithStatus = await Promise.all(\n prompts.map(async (p) => {\n if (!p.linkTarget) {\n return { ...p, linkStatus: null };\n }\n const status = await linkResolver.checkLinkHealth(p.linkTarget);\n return { ...p, linkStatus: status };\n })\n );\n \n return reply.send(promptsWithStatus);\n });\n ```\n\n2. Consider caching link health to avoid repeated checks\n\n3. Add `linkStatus` field to prompt response schema:\n - `null` for non-linked prompts\n - `'alive'` for working links\n - `'dead'` for broken links\n - `'unknown'` for unchecked links",
"testStrategy": "- Unit test: Non-linked prompt has linkStatus=null\n- Unit test: Linked prompt with working link has linkStatus='alive'\n- Unit test: Linked prompt with broken link has linkStatus='dead'\n- Unit test: Link health cached to avoid repeated checks\n- Unit test: All prompts in response have linkStatus field\n- Integration test: End-to-end GET with linked prompts",
"priority": "medium",
"dependencies": [
"56"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T23:09:07.078Z"
},
{
"id": "58",
"title": "Add RBAC for prompt link creation",
"description": "Implement RBAC checks requiring edit permission on the target project to create prompt links, and auto-create service account permission on the source project.",
"details": "1. Update prompt creation in `/src/mcpd/src/services/prompt.service.ts`:\n ```typescript\n async createPrompt(data: CreatePromptInput, userId: string): Promise<Prompt> {\n if (data.linkTarget) {\n // Verify user has edit permission on target project RBAC\n const hasPermission = await this.rbacService.checkPermission(\n userId, data.projectId, 'edit'\n );\n if (!hasPermission) {\n throw new ForbiddenError('Edit permission required to create prompt links');\n }\n \n // Parse link target\n const { project: sourceProject, server, uri } = this.parseLink(data.linkTarget);\n \n // Create service account permission on source project\n await this.rbacService.createServiceAccountPermission(\n data.projectId, // target project\n sourceProject, // source project\n server,\n uri,\n 'read'\n );\n }\n \n return this.promptRepo.create(data);\n }\n ```\n\n2. Clean up service account permission when link is deleted\n\n3. Handle permission denied from source project",
"testStrategy": "- Unit test: Link creation requires edit permission\n- Unit test: Link creation without permission throws 403\n- Unit test: Service account permission created on source project\n- Unit test: Service account permission deleted when link deleted\n- Unit test: Non-link prompts skip RBAC checks\n- Integration test: End-to-end link creation with RBAC",
"priority": "medium",
"dependencies": [
"56"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T23:09:07.081Z"
},
{
"id": "59",
"title": "Update CLI create prompt command for priority and link",
"description": "Extend the mcpctl create prompt command to accept --priority (1-10) and --link (project/server:uri) flags.",
"details": "1. Update `/src/cli/src/commands/create.ts` for prompt:\n ```typescript\n .command('prompt <name>')\n .description('Create a new prompt')\n .option('-p, --project <name>', 'Project to create prompt in')\n .option('--priority <number>', 'Priority level (1-10, default: 5)', '5')\n .option('--link <target>', 'Link to MCP resource (project/server:uri)')\n .option('-f, --file <path>', 'Read content from file')\n .action(async (name, options) => {\n const priority = parseInt(options.priority, 10);\n if (priority < 1 || priority > 10) {\n console.error('Priority must be between 1 and 10');\n process.exit(1);\n }\n \n let content = '';\n if (options.link) {\n // Linked prompts don't need content (fetched from source)\n content = `[Link: ${options.link}]`;\n } else if (options.file) {\n content = await fs.readFile(options.file, 'utf-8');\n } else {\n content = await promptForContent();\n }\n \n const body = {\n name,\n content,\n projectId: options.project,\n priority,\n linkTarget: options.link,\n };\n \n await api.post('/api/v1/prompts', body);\n });\n ```\n\n2. Validate link format: `project/server:resource-uri`\n\n3. Add shell completions for new flags",
|
||||
"testStrategy": "- Unit test: --priority flag sets prompt priority\n- Unit test: --priority validation (1-10 range)\n- Unit test: --link flag sets linkTarget\n- Unit test: --link validation (format check)\n- Unit test: Linked prompt skips content prompt\n- Unit test: Default priority is 5\n- Integration test: End-to-end create with flags",
|
||||
"priority": "medium",
|
||||
"dependencies": [
|
||||
"40"
|
||||
],
|
||||
"status": "done",
|
||||
"subtasks": [],
|
||||
"updatedAt": "2026-02-25T23:03:45.972Z"
|
||||
},
|
||||
{
"id": "60",
"title": "Update CLI get prompt command for -A flag and link columns",
"description": "Extend the mcpctl get prompt command with -A (all projects) flag and add link target and status columns to output.",
"details": "1. Update `/src/cli/src/commands/get.ts` for prompt:\n ```typescript\n .command('prompt [name]')\n .option('-A, --all-projects', 'Show prompts from all projects')\n .option('-p, --project <name>', 'Filter by project')\n .action(async (name, options) => {\n let url = '/api/v1/prompts';\n if (options.allProjects) {\n url += '?all=true';\n } else if (options.project) {\n url += `?project=${options.project}`;\n }\n \n const prompts = await api.get(url);\n \n // Format table with new columns\n formatPromptsTable(prompts, {\n columns: ['PROJECT', 'NAME', 'PRIORITY', 'LINK', 'STATUS']\n });\n });\n ```\n\n2. Update table formatter to handle link columns:\n ```\n PROJECT NAME PRIORITY LINK STATUS\n homeautomation security-policies 8 - -\n homeautomation architecture-adr 6 system-public/docmost-mcp:docmost://pages/a1 alive\n ```\n\n3. Add shell completions for -A flag",
"testStrategy": "- Unit test: -A flag shows all projects\n- Unit test: --project flag filters by project\n- Unit test: PRIORITY column displayed\n- Unit test: LINK column shows linkTarget or -\n- Unit test: STATUS column shows linkStatus or -\n- Unit test: Table formatted correctly\n- Integration test: End-to-end get with flags",
"priority": "medium",
"dependencies": [
"57",
"59"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T23:09:31.501Z"
},
{
"id": "61",
"title": "Update CLI describe project command for gated status",
"description": "Extend mcpctl describe project to show gated status, session greeting, and prompt table with priority and link information.",
"details": "1. Update `/src/cli/src/commands/get.ts` describe project:\n ```typescript\n async function describeProject(name: string) {\n const project = await api.get(`/api/v1/projects/${name}`);\n const prompts = await api.get(`/api/v1/projects/${name}/prompt-index`);\n const greeting = await getSessionGreeting(name);\n \n console.log(`Name: ${project.name}`);\n console.log(`Gated: ${project.gated}`);\n console.log(`LLM Provider: ${project.llmProvider || '-'}`);\n console.log(`...`);\n console.log();\n console.log(`Session greeting:`);\n console.log(` ${greeting}`);\n console.log();\n console.log(`Prompts:`);\n console.log(` NAME PRIORITY TYPE LINK`);\n for (const p of prompts) {\n const type = p.linkTarget ? 'link' : 'local';\n const link = p.linkTarget || '-';\n console.log(` ${p.name.padEnd(20)} ${p.priority.toString().padEnd(9)} ${type.padEnd(7)} ${link}`);\n }\n }\n ```\n\n2. Fetch session greeting from system prompts or project config",
"testStrategy": "- Unit test: Gated status displayed\n- Unit test: Session greeting displayed\n- Unit test: Prompt table with PRIORITY, TYPE, LINK columns\n- Unit test: TYPE shows 'local' or 'link'\n- Unit test: LINK shows target or -\n- Integration test: End-to-end describe project",
"priority": "medium",
"dependencies": [
"44",
"54"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T23:04:56.320Z"
},
{
"id": "62",
"title": "Update CLI edit project command for gated field",
"description": "Extend mcpctl edit project to allow editing the gated boolean field.",
"details": "1. Update `/src/cli/src/commands/edit.ts` for project:\n ```typescript\n async function editProject(name: string) {\n const project = await api.get(`/api/v1/projects/${name}`);\n \n // Add gated to editable fields\n const yaml = `\n name: ${project.name}\n description: ${project.description}\n gated: ${project.gated}\n llmProvider: ${project.llmProvider || ''}\n ...`;\n \n const edited = await openEditor(yaml);\n const parsed = YAML.parse(edited);\n \n // Validate gated is boolean\n if (typeof parsed.gated !== 'boolean') {\n console.error('gated must be true or false');\n process.exit(1);\n }\n \n await api.put(`/api/v1/projects/${name}`, parsed);\n }\n ```\n\n2. Update project validation schema to accept gated\n\n3. Handle conversion from string 'true'/'false' to boolean",
"testStrategy": "- Unit test: Gated field appears in editor YAML\n- Unit test: Gated field saved on edit\n- Unit test: Boolean validation (true/false only)\n- Unit test: String 'true'/'false' converted to boolean\n- Integration test: End-to-end edit project gated",
"priority": "medium",
"dependencies": [
"39"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T23:03:46.657Z"
},
{
"id": "63",
"title": "Add unit tests for prompt priority and link CRUD",
"description": "Create comprehensive unit tests for all prompt CRUD operations with the new priority and linkTarget fields.",
"details": "1. Add tests in `/src/mcpd/tests/services/prompt-service.test.ts`:\n ```typescript\n describe('Prompt Priority', () => {\n it('creates prompt with explicit priority', async () => {\n const prompt = await service.createPrompt({ ...data, priority: 8 });\n expect(prompt.priority).toBe(8);\n });\n \n it('uses default priority 5 when not specified', async () => {\n const prompt = await service.createPrompt(data);\n expect(prompt.priority).toBe(5);\n });\n \n it('validates priority range 1-10', async () => {\n await expect(service.createPrompt({ ...data, priority: 11 }))\n .rejects.toThrow();\n });\n \n it('updates priority', async () => {\n const updated = await service.updatePrompt(id, { priority: 3 });\n expect(updated.priority).toBe(3);\n });\n });\n \n describe('Prompt Links', () => {\n it('creates linked prompt', async () => {\n const prompt = await service.createPrompt({\n ...data,\n linkTarget: 'project/server:uri'\n });\n expect(prompt.linkTarget).toBe('project/server:uri');\n });\n \n it('rejects invalid link format', async () => {\n await expect(service.createPrompt({\n ...data,\n linkTarget: 'invalid'\n })).rejects.toThrow();\n });\n \n it('linkTarget is immutable on update', async () => {\n // linkTarget not accepted in update schema\n });\n });\n ```",
"testStrategy": "This task IS the test implementation. Verify:\n- All priority CRUD tests pass\n- All link CRUD tests pass\n- Validation tests cover edge cases\n- Tests use proper mocking patterns\n- Coverage meets project standards",
"priority": "high",
"dependencies": [
"40",
"41"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T22:52:53.091Z"
},
{
"id": "64",
"title": "Add unit tests for tag matching algorithm",
"description": "Create comprehensive unit tests for the deterministic tag matching algorithm covering score calculation, byte budget, and priority handling.",
"details": "1. Add tests in `/src/mcplocal/tests/services/tag-matcher.test.ts`:\n ```typescript\n describe('TagMatcherService', () => {\n describe('score calculation', () => {\n it('priority 10 prompts have infinite score', () => {\n const score = matcher.computeScore(['any'], { priority: 10, ... });\n expect(score).toBe(Infinity);\n });\n \n it('score = matching_tags * priority', () => {\n const score = matcher.computeScore(\n ['tag1', 'tag2'],\n { priority: 5, summary: 'tag1 tag2', chapters: [] }\n );\n expect(score).toBe(10); // 2 tags * 5 priority\n });\n });\n \n describe('matching', () => {\n it('matches case-insensitively', () => {\n const matches = matcher.matchesPrompt('ZIGBEE', { summary: 'zigbee setup' });\n expect(matches).toBe(true);\n });\n \n it('matches substring in summary', () => { ... });\n it('matches substring in chapters', () => { ... });\n });\n \n describe('byte budget', () => {\n it('includes full content until budget exhausted', () => { ... });\n it('matched prompts beyond budget become index entries', () => { ... });\n it('non-matched prompts listed as names only', () => { ... });\n });\n });\n ```",
"testStrategy": "This task IS the test implementation. Verify:\n- Score calculation tests pass\n- Matching tests cover all cases\n- Byte budget tests verify allocation\n- Edge cases handled (empty tags, no prompts, etc.)\n- Tests are deterministic",
"priority": "high",
"dependencies": [
"45"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T22:51:03.827Z"
},
{
"id": "65",
"title": "Add unit tests for gating state machine",
"description": "Create comprehensive unit tests for the session gating state machine covering all transitions and edge cases.",
"details": "1. Add tests in `/src/mcplocal/tests/router-gating.test.ts`:\n ```typescript\n describe('Gating State Machine', () => {\n describe('initial state', () => {\n it('starts gated for gated project', () => {\n const router = createRouter({ gated: true });\n const state = router.getSessionState('session1');\n expect(state.gated).toBe(true);\n });\n \n it('starts ungated for non-gated project', () => {\n const router = createRouter({ gated: false });\n const state = router.getSessionState('session1');\n expect(state.gated).toBe(false);\n });\n });\n \n describe('begin_session transition', () => {\n it('ungates session on successful begin_session', async () => {\n const router = createGatedRouter();\n await router.handleBeginSession('session1', { tags: ['test'] });\n expect(router.getSessionState('session1').gated).toBe(false);\n });\n \n it('returns matched prompts', async () => { ... });\n it('sends notifications/tools/list_changed', async () => { ... });\n });\n \n describe('intercept transition', () => {\n it('ungates session on tool call intercept', async () => { ... });\n it('extracts keywords from tool call', async () => { ... });\n it('injects briefing with tool result', async () => { ... });\n });\n \n describe('tools/list behavior', () => {\n it('returns only begin_session while gated', async () => { ... });\n it('returns all tools + read_prompts after ungating', async () => { ... });\n });\n });\n ```",
"testStrategy": "This task IS the test implementation. Verify:\n- Initial state tests pass\n- Transition tests cover happy paths\n- Edge case tests (already ungated, etc.)\n- Notification tests verify signals sent\n- Tests use proper mocking",
"priority": "high",
"dependencies": [
"50",
"52"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T22:51:03.832Z"
},
{
"id": "66",
"title": "Add unit tests for LLM prompt selection",
"description": "Create unit tests for the LLM-based prompt selection service covering LLM interactions, fallback behavior, and priority 10 handling.",
"details": "1. Add tests in `/src/mcplocal/tests/services/llm-prompt-selector.test.ts`:\n ```typescript\n describe('LlmPromptSelectorService', () => {\n describe('priority 10 handling', () => {\n it('always includes priority 10 prompts', async () => {\n const result = await selector.selectPrompts(['unrelated'], promptIndex);\n expect(result.priority10).toContain(priority10Prompt);\n });\n });\n \n describe('LLM selection', () => {\n it('sends tags and index to heavy LLM', async () => {\n await selector.selectPrompts(['zigbee', 'mqtt'], promptIndex);\n expect(mockLlm.complete).toHaveBeenCalledWith(\n expect.stringContaining('zigbee')\n );\n });\n \n it('parses LLM response correctly', async () => {\n mockLlm.complete.mockResolvedValue(\n '[{\"name\": \"prompt1\", \"reason\": \"relevant\"}]'\n );\n const result = await selector.selectPrompts(['test'], promptIndex);\n expect(result.selected[0].name).toBe('prompt1');\n });\n });\n \n describe('fallback behavior', () => {\n it('falls back to tag matcher on LLM error', async () => { ... });\n it('falls back on LLM timeout', async () => { ... });\n it('falls back when no LLM available', async () => { ... });\n });\n \n describe('summary generation', () => {\n it('generates missing summaries with fast LLM', async () => { ... });\n });\n });\n ```",
"testStrategy": "This task IS the test implementation. Verify:\n- Priority 10 tests pass\n- LLM interaction tests use proper mocks\n- Fallback tests cover all error scenarios\n- Summary generation tests pass\n- Response parsing handles edge cases",
"priority": "high",
"dependencies": [
"46"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T22:51:03.836Z"
},
{
"id": "67",
"title": "Add integration tests for gated session flow",
"description": "Create end-to-end integration tests for the complete gated session flow including connect, begin_session, tool calls, and read_prompts.",
"details": "1. Add tests in `/src/mcplocal/tests/integration/gated-flow.test.ts`:\n ```typescript\n describe('Gated Session Flow Integration', () => {\n let app: FastifyInstance;\n let mcpClient: McpClient;\n \n beforeAll(async () => {\n app = await createTestApp();\n // Seed test project with gated=true and test prompts\n });\n \n describe('end-to-end gated flow', () => {\n it('connect → begin_session with tags → tools available → correct prompts', async () => {\n // 1. Connect to MCP endpoint\n const session = await mcpClient.connect(app, 'test-project');\n \n // 2. Verify only begin_session available\n const toolsBefore = await session.listTools();\n expect(toolsBefore.map(t => t.name)).toEqual(['begin_session']);\n \n // 3. Call begin_session\n const briefing = await session.callTool('begin_session', {\n tags: ['test', 'integration']\n });\n expect(briefing).toContain('matched prompt content');\n \n // 4. Verify all tools now available\n const toolsAfter = await session.listTools();\n expect(toolsAfter.map(t => t.name)).toContain('read_prompts');\n });\n });\n \n describe('end-to-end intercept flow', () => {\n it('connect → skip begin_session → call tool → keywords extracted → briefing injected', async () => { ... });\n });\n \n describe('end-to-end read_prompts', () => {\n it('after ungating → request more context → additional prompts → no duplicates', async () => { ... });\n });\n });\n ```",
"testStrategy": "This task IS the test implementation. Verify:\n- Happy path tests pass\n- Intercept path tests pass\n- read_prompts deduplication works\n- Tests use realistic data\n- Tests clean up properly",
"priority": "high",
"dependencies": [
"65"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T22:51:03.840Z"
},
{
"id": "68",
"title": "Add integration tests for prompt links",
"description": "Create end-to-end integration tests for prompt link creation, resolution, and dead link detection.",
"details": "1. Add tests in `/src/mcplocal/tests/integration/prompt-links.test.ts`:\n ```typescript\n describe('Prompt Links Integration', () => {\n describe('link creation', () => {\n it('creates link with RBAC permission', async () => {\n // Setup: user with edit permission on target project\n const prompt = await api.post('/api/v1/prompts', {\n name: 'linked-prompt',\n content: '[Link]',\n projectId: targetProject.id,\n linkTarget: 'source-project/server:uri'\n });\n expect(prompt.linkTarget).toBe('source-project/server:uri');\n });\n \n it('rejects link creation without RBAC permission', async () => { ... });\n });\n \n describe('link resolution', () => {\n it('fetches content from source server', async () => { ... });\n it('uses service account for RBAC', async () => { ... });\n });\n \n describe('dead link lifecycle', () => {\n it('detects dead link when source unavailable', async () => {\n // Kill source server\n const prompts = await api.get('/api/v1/prompts');\n const linked = prompts.find(p => p.linkTarget);\n expect(linked.linkStatus).toBe('dead');\n });\n \n it('recovers when source restored', async () => { ... });\n });\n });\n ```",
"testStrategy": "This task IS the test implementation. Verify:\n- RBAC tests cover permission scenarios\n- Resolution tests verify content fetched\n- Dead link tests cover full lifecycle\n- Tests properly mock/control source servers\n- Tests clean up resources",
"priority": "medium",
"dependencies": [
"57",
"58"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T23:12:22.348Z"
},
{
"id": "69",
"title": "Add CLI unit tests for new prompt and project flags",
"description": "Create unit tests for the new CLI flags: --priority, --link for prompts, -A for get, and gated field for projects.",
"details": "1. Add tests in `/src/cli/tests/commands/prompt.test.ts`:\n ```typescript\n describe('create prompt command', () => {\n it('--priority sets prompt priority', async () => {\n await cli('create prompt test --priority 8');\n expect(mockApi.post).toHaveBeenCalledWith(\n '/api/v1/prompts',\n expect.objectContaining({ priority: 8 })\n );\n });\n \n it('--priority validates range 1-10', async () => {\n await expect(cli('create prompt test --priority 15'))\n .rejects.toThrow('Priority must be between 1 and 10');\n });\n \n it('--link sets linkTarget', async () => {\n await cli('create prompt test --link proj/srv:uri');\n expect(mockApi.post).toHaveBeenCalledWith(\n '/api/v1/prompts',\n expect.objectContaining({ linkTarget: 'proj/srv:uri' })\n );\n });\n });\n \n describe('get prompt command', () => {\n it('-A shows all projects', async () => {\n await cli('get prompt -A');\n expect(mockApi.get).toHaveBeenCalledWith('/api/v1/prompts?all=true');\n });\n });\n ```\n\n2. Add tests for project gated field editing\n\n3. Add tests for describe project output",
"testStrategy": "This task IS the test implementation. Verify:\n- Flag parsing tests pass\n- Validation tests cover edge cases\n- API call tests verify correct parameters\n- Output formatting tests verify columns\n- Tests mock API properly",
"priority": "medium",
"dependencies": [
"59",
"60",
"61",
"62"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T23:12:22.352Z"
},
{
"id": "70",
"title": "Add shell completions for new CLI flags",
"description": "Update shell completion scripts (bash, zsh, fish) to include completions for new flags: --priority, --link, -A, and gated values.",
"details": "1. Update `/completions/mcpctl.fish`:\n ```fish\n # create prompt completions\n complete -c mcpctl -n '__fish_seen_subcommand_from create; and __fish_seen_subcommand_from prompt' -l priority -d 'Priority level (1-10)' -a '(seq 1 10)'\n complete -c mcpctl -n '__fish_seen_subcommand_from create; and __fish_seen_subcommand_from prompt' -l link -d 'Link to MCP resource (project/server:uri)'\n \n # get prompt completions \n complete -c mcpctl -n '__fish_seen_subcommand_from get; and __fish_seen_subcommand_from prompt' -s A -l all-projects -d 'Show prompts from all projects'\n ```\n\n2. Update bash completions similarly\n\n3. Update zsh completions similarly\n\n4. Add dynamic completion for priority values (1-10)",
"testStrategy": "- Manual test: Fish completions suggest --priority with values 1-10\n- Manual test: Fish completions suggest --link flag\n- Manual test: Fish completions suggest -A/--all-projects\n- Manual test: Bash completions work similarly\n- Manual test: Zsh completions work similarly",
"priority": "low",
"dependencies": [
"59",
"60"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T23:12:22.363Z"
}
],
"metadata": {
"version": "1.0.0",
"lastModified": "2026-02-21T18:52:29.084Z",
"taskCount": 36,
"completedCount": 33,
"lastModified": "2026-02-25T23:12:22.364Z",
"taskCount": 70,
"completedCount": 67,
"tags": [
"master"
]

@@ -2,91 +2,175 @@ _mcpctl() {
local cur prev words cword
_init_completion || return

local commands="config status get describe instance instances apply setup claude project projects backup restore help"
local global_opts="-v --version -o --output --daemon-url -h --help"
local resources="servers profiles projects instances"
local commands="status login logout config get describe delete logs create edit apply backup restore mcp approve help"
local project_commands="attach-server detach-server get describe delete logs create edit help"
local global_opts="-v --version --daemon-url --direct --project -h --help"
local resources="servers instances secrets templates projects users groups rbac prompts promptrequests"

case "${words[1]}" in
# Check if --project was given
local has_project=false
local i
for ((i=1; i < cword; i++)); do
if [[ "${words[i]}" == "--project" ]]; then
has_project=true
break
fi
done

# Find the first subcommand (skip --project and its argument, skip flags)
local subcmd=""
local subcmd_pos=0
for ((i=1; i < cword; i++)); do
if [[ "${words[i]}" == "--project" || "${words[i]}" == "--daemon-url" ]]; then
((i++)) # skip the argument
continue
fi
if [[ "${words[i]}" != -* ]]; then
subcmd="${words[i]}"
subcmd_pos=$i
break
fi
done

# Find the resource type after get/describe/delete/edit
local resource_type=""
if [[ -n "$subcmd_pos" ]] && [[ $subcmd_pos -gt 0 ]]; then
for ((i=subcmd_pos+1; i < cword; i++)); do
if [[ "${words[i]}" != -* ]] && [[ " $resources " == *" ${words[i]} "* ]]; then
resource_type="${words[i]}"
break
fi
done
fi

# If completing the --project value
if [[ "$prev" == "--project" ]]; then
local names
names=$(mcpctl get projects -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null)
COMPREPLY=($(compgen -W "$names" -- "$cur"))
return
fi

# Fetch resource names dynamically (jq extracts only top-level names)
_mcpctl_resource_names() {
local rt="$1"
if [[ -n "$rt" ]]; then
# Instances don't have a name field — use server.name instead
if [[ "$rt" == "instances" ]]; then
mcpctl get instances -o json 2>/dev/null | jq -r '.[][].server.name' 2>/dev/null
else
mcpctl get "$rt" -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null
fi
fi
}

# Get the --project value from the command line
_mcpctl_get_project_value() {
local i
for ((i=1; i < cword; i++)); do
if [[ "${words[i]}" == "--project" ]] && (( i+1 < cword )); then
echo "${words[i+1]}"
return
fi
done
}

case "$subcmd" in
config)
COMPREPLY=($(compgen -W "view set path reset help" -- "$cur"))
if [[ $((cword - subcmd_pos)) -eq 1 ]]; then
COMPREPLY=($(compgen -W "view set path reset claude claude-generate setup impersonate help" -- "$cur"))
fi
return ;;
status)
COMPREPLY=($(compgen -W "--daemon-url -h --help" -- "$cur"))
COMPREPLY=($(compgen -W "-h --help" -- "$cur"))
return ;;
get)
if [[ $cword -eq 2 ]]; then
login)
COMPREPLY=($(compgen -W "--url --email --password -h --help" -- "$cur"))
return ;;
logout)
return ;;
mcp)
return ;;
get|describe|delete)
if [[ -z "$resource_type" ]]; then
COMPREPLY=($(compgen -W "$resources" -- "$cur"))
else
COMPREPLY=($(compgen -W "-o --output --daemon-url -h --help" -- "$cur"))
local names
names=$(_mcpctl_resource_names "$resource_type")
COMPREPLY=($(compgen -W "$names -o --output -h --help" -- "$cur"))
fi
return ;;
describe)
if [[ $cword -eq 2 ]]; then
COMPREPLY=($(compgen -W "$resources" -- "$cur"))
edit)
if [[ -z "$resource_type" ]]; then
COMPREPLY=($(compgen -W "servers projects" -- "$cur"))
else
COMPREPLY=($(compgen -W "-o --output --daemon-url -h --help" -- "$cur"))
local names
names=$(_mcpctl_resource_names "$resource_type")
COMPREPLY=($(compgen -W "$names -h --help" -- "$cur"))
fi
return ;;
instance|instances)
if [[ $cword -eq 2 ]]; then
COMPREPLY=($(compgen -W "list ls start stop restart remove rm logs inspect help" -- "$cur"))
else
case "${words[2]}" in
logs)
COMPREPLY=($(compgen -W "--tail --since -h --help" -- "$cur"))
;;
start)
COMPREPLY=($(compgen -W "--env --image -h --help" -- "$cur"))
;;
list|ls)
COMPREPLY=($(compgen -W "--server-id -o --output -h --help" -- "$cur"))
;;
esac
fi
logs)
COMPREPLY=($(compgen -W "--tail --since -f --follow -h --help" -- "$cur"))
return ;;
claude)
if [[ $cword -eq 2 ]]; then
COMPREPLY=($(compgen -W "generate show add remove help" -- "$cur"))
else
case "${words[2]}" in
generate|show|add|remove)
COMPREPLY=($(compgen -W "--path -p -h --help" -- "$cur"))
;;
esac
fi
return ;;
project|projects)
if [[ $cword -eq 2 ]]; then
COMPREPLY=($(compgen -W "list ls create delete rm show profiles set-profiles help" -- "$cur"))
else
case "${words[2]}" in
create)
COMPREPLY=($(compgen -W "--description -d -h --help" -- "$cur"))
;;
list|ls)
COMPREPLY=($(compgen -W "-o --output -h --help" -- "$cur"))
;;
esac
create)
if [[ $((cword - subcmd_pos)) -eq 1 ]]; then
COMPREPLY=($(compgen -W "server secret project user group rbac prompt promptrequest help" -- "$cur"))
fi
return ;;
apply)
COMPREPLY=($(compgen -f -- "$cur"))
return ;;
backup)
COMPREPLY=($(compgen -W "-o --output -p --password -r --resources -h --help" -- "$cur"))
COMPREPLY=($(compgen -W "-o --output -p --password -h --help" -- "$cur"))
return ;;
restore)
COMPREPLY=($(compgen -W "-i --input -p --password -c --conflict -h --help" -- "$cur"))
return ;;
setup)
attach-server)
# Only complete if no server arg given yet (first arg after subcmd)
if [[ $((cword - subcmd_pos)) -ne 1 ]]; then return; fi
local proj names all_servers proj_servers
proj=$(_mcpctl_get_project_value)
if [[ -n "$proj" ]]; then
all_servers=$(mcpctl get servers -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null)
proj_servers=$(mcpctl --project "$proj" get servers -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null)
names=$(comm -23 <(echo "$all_servers" | sort) <(echo "$proj_servers" | sort))
else
names=$(_mcpctl_resource_names "servers")
fi
COMPREPLY=($(compgen -W "$names" -- "$cur"))
return ;;
detach-server)
# Only complete if no server arg given yet (first arg after subcmd)
if [[ $((cword - subcmd_pos)) -ne 1 ]]; then return; fi
local proj names
proj=$(_mcpctl_get_project_value)
if [[ -n "$proj" ]]; then
names=$(mcpctl --project "$proj" get servers -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null)
fi
COMPREPLY=($(compgen -W "$names" -- "$cur"))
return ;;
approve)
if [[ -z "$resource_type" ]]; then
COMPREPLY=($(compgen -W "promptrequest" -- "$cur"))
else
local names
names=$(_mcpctl_resource_names "$resource_type")
COMPREPLY=($(compgen -W "$names" -- "$cur"))
fi
return ;;
help)
COMPREPLY=($(compgen -W "$commands" -- "$cur"))
return ;;
esac

if [[ $cword -eq 1 ]]; then
COMPREPLY=($(compgen -W "$commands $global_opts" -- "$cur"))
# No subcommand yet — offer commands based on context
if [[ -z "$subcmd" ]]; then
if $has_project; then
COMPREPLY=($(compgen -W "$project_commands $global_opts" -- "$cur"))
else
COMPREPLY=($(compgen -W "$commands $global_opts" -- "$cur"))
fi
fi
}
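The subcommand scan inside `_mcpctl` above (skip `--project` and `--daemon-url` together with their arguments, then take the first non-flag token) can be exercised outside a completion context. A minimal sketch, assuming only plain bash; `find_subcmd` is an illustrative name introduced here, not part of the script:

```shell
# Standalone sketch of the subcommand-detection loop from _mcpctl.
find_subcmd() {
  local -a words=("$@")
  local i subcmd=""
  for ((i = 1; i < ${#words[@]}; i++)); do
    # --project and --daemon-url take a value: skip the flag and its argument
    if [[ "${words[i]}" == "--project" || "${words[i]}" == "--daemon-url" ]]; then
      ((i++))
      continue
    fi
    # The first remaining non-flag token is the subcommand
    if [[ "${words[i]}" != -* ]]; then
      subcmd="${words[i]}"
      break
    fi
  done
  echo "$subcmd"
}

find_subcmd mcpctl --project homeautomation get prompts   # prints "get"
```

This mirrors why `subcmd_pos` (rather than a fixed `$cword -eq 2` check, as in the old code) is needed: with `--project <name>` in front, the subcommand can sit at any word index.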
@@ -1,80 +1,325 @@
# mcpctl fish completions

set -l commands config status get describe instance instances apply setup claude project projects backup restore help
# Erase any stale completions from previous versions
complete -c mcpctl -e

set -l commands status login logout config get describe delete logs create edit apply patch backup restore mcp approve help
set -l project_commands attach-server detach-server get describe delete logs create edit help

# Disable file completions by default
complete -c mcpctl -f

# Global options
complete -c mcpctl -s v -l version -d 'Show version'
complete -c mcpctl -s o -l output -d 'Output format' -xa 'table json yaml'
complete -c mcpctl -l daemon-url -d 'mcpd daemon URL' -x
complete -c mcpctl -l daemon-url -d 'mcplocal daemon URL' -x
complete -c mcpctl -l direct -d 'Bypass mcplocal, connect directly to mcpd'
complete -c mcpctl -l project -d 'Target project context' -x
complete -c mcpctl -s h -l help -d 'Show help'

# Top-level commands
complete -c mcpctl -n "not __fish_seen_subcommand_from $commands" -a config -d 'Manage configuration'
complete -c mcpctl -n "not __fish_seen_subcommand_from $commands" -a status -d 'Show status and connectivity'
complete -c mcpctl -n "not __fish_seen_subcommand_from $commands" -a get -d 'List resources'
complete -c mcpctl -n "not __fish_seen_subcommand_from $commands" -a describe -d 'Show resource details'
complete -c mcpctl -n "not __fish_seen_subcommand_from $commands" -a instance -d 'Manage instances'
complete -c mcpctl -n "not __fish_seen_subcommand_from $commands" -a apply -d 'Apply configuration from file'
complete -c mcpctl -n "not __fish_seen_subcommand_from $commands" -a setup -d 'Interactive setup wizard'
complete -c mcpctl -n "not __fish_seen_subcommand_from $commands" -a claude -d 'Manage Claude .mcp.json'
complete -c mcpctl -n "not __fish_seen_subcommand_from $commands" -a project -d 'Manage projects'
complete -c mcpctl -n "not __fish_seen_subcommand_from $commands" -a backup -d 'Backup configuration'
complete -c mcpctl -n "not __fish_seen_subcommand_from $commands" -a restore -d 'Restore from backup'
complete -c mcpctl -n "not __fish_seen_subcommand_from $commands" -a help -d 'Show help'
# Helper: check if --project was given
function __mcpctl_has_project
    set -l tokens (commandline -opc)
    for i in (seq (count $tokens))
        if test "$tokens[$i]" = "--project"
            return 0
        end
    end
    return 1
end

# get/describe resources
complete -c mcpctl -n "__fish_seen_subcommand_from get describe" -a 'servers profiles projects instances' -d 'Resource type'
# Helper: check if a resource type has been selected after get/describe/delete/edit
set -l resources servers instances secrets templates projects users groups rbac prompts promptrequests
# All accepted resource aliases (plural + singular + short forms)
set -l resource_aliases servers server srv instances instance inst secrets secret sec templates template tpl projects project proj users user groups group rbac rbac-definition rbac-binding prompts prompt promptrequests promptrequest pr

function __mcpctl_needs_resource_type
    set -l tokens (commandline -opc)
    set -l found_cmd false
    for tok in $tokens
        if $found_cmd
            # Check if next token after get/describe/delete/edit is a resource type or alias
            if contains -- $tok $resource_aliases
                return 1 # resource type already present
            end
        end
        if contains -- $tok get describe delete edit patch
            set found_cmd true
        end
    end
    if $found_cmd
        return 0 # command found but no resource type yet
    end
    return 1
end

# Map any resource alias to the canonical plural form for API calls
function __mcpctl_resolve_resource
    switch $argv[1]
        case server srv servers; echo servers
        case instance inst instances; echo instances
        case secret sec secrets; echo secrets
        case template tpl templates; echo templates
        case project proj projects; echo projects
        case user users; echo users
        case group groups; echo groups
        case rbac rbac-definition rbac-binding; echo rbac
        case prompt prompts; echo prompts
        case promptrequest promptrequests pr; echo promptrequests
        case '*'; echo $argv[1]
    end
end

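The alias-to-canonical mapping above can be sketched in plain bash for readers who don't use fish (hypothetical helper mirroring `__mcpctl_resolve_resource`; not part of the diff):

```shell
# Map an mcpctl resource alias to its canonical plural form.
# Bash sketch of the fish __mcpctl_resolve_resource function above.
resolve_resource() {
  case "$1" in
    server|srv|servers) echo servers ;;
    instance|inst|instances) echo instances ;;
    secret|sec|secrets) echo secrets ;;
    template|tpl|templates) echo templates ;;
    project|proj|projects) echo projects ;;
    user|users) echo users ;;
    group|groups) echo groups ;;
    rbac|rbac-definition|rbac-binding) echo rbac ;;
    prompt|prompts) echo prompts ;;
    promptrequest|promptrequests|pr) echo promptrequests ;;
    *) echo "$1" ;;   # unknown aliases pass through unchanged
  esac
}

resolve_resource srv       # servers
resolve_resource pr        # promptrequests
resolve_resource unknown   # unknown
```

The pass-through default matches the fish `case '*'` branch, so unrecognized names still reach the API and fail with a server-side error rather than silently completing nothing.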
function __mcpctl_get_resource_type
    set -l tokens (commandline -opc)
    set -l found_cmd false
    for tok in $tokens
        if $found_cmd
            if contains -- $tok $resource_aliases
                __mcpctl_resolve_resource $tok
                return
            end
        end
        if contains -- $tok get describe delete edit patch
            set found_cmd true
        end
    end
end

# Fetch resource names dynamically from the API (jq extracts only top-level names)
function __mcpctl_resource_names
    set -l resource (__mcpctl_get_resource_type)
    if test -z "$resource"
        return
    end
    # Instances don't have a name field — use server.name instead
    if test "$resource" = "instances"
        mcpctl get instances -o json 2>/dev/null | jq -r '.[][].server.name' 2>/dev/null
    else if test "$resource" = "prompts" -o "$resource" = "promptrequests"
        # Use -A to include all projects, not just global
        mcpctl get $resource -A -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null
    else
        mcpctl get $resource -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null
    end
end

# Fetch project names for --project value
function __mcpctl_project_names
    mcpctl get projects -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null
end

# Helper: get the --project value from the command line
function __mcpctl_get_project_value
    set -l tokens (commandline -opc)
    for i in (seq (count $tokens))
        if test "$tokens[$i]" = "--project"; and test $i -lt (count $tokens)
            echo $tokens[(math $i + 1)]
            return
        end
    end
end

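The `.[][].name` jq filter implies the API returns resource objects nested two levels deep. A standalone sketch with invented sample data (the JSON shape here is an assumption for illustration, not the documented mcpd response):

```shell
# .[][]. iterates the outer array, then each inner array, yielding
# the resource objects; .name then pulls out just the names.
json='[[{"name":"github"},{"name":"jira"}],[{"name":"postgres"}]]'
echo "$json" | jq -r '.[][].name'
# github
# jira
# postgres
```

Both `2>/dev/null` redirects in the completion functions matter: completions must stay silent when the daemon is down or jq is missing, otherwise error text is spliced into the candidate list.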
# Servers currently attached to the project (for detach-server)
function __mcpctl_project_servers
    set -l proj (__mcpctl_get_project_value)
    if test -z "$proj"
        return
    end
    mcpctl --project $proj get servers -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null
end

# Servers NOT attached to the project (for attach-server)
function __mcpctl_available_servers
    set -l proj (__mcpctl_get_project_value)
    if test -z "$proj"
        # No project — show all servers
        mcpctl get servers -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null
        return
    end
    set -l all (mcpctl get servers -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null)
    set -l attached (mcpctl --project $proj get servers -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null)
    for s in $all
        if not contains -- $s $attached
            echo $s
        end
    end
end

# --project value completion
complete -c mcpctl -l project -xa '(__mcpctl_project_names)'

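The attach-server candidate list is a set difference: all servers minus those already attached. The same computation in plain bash, with hypothetical server names (sketch only, not part of the diff):

```shell
# Set difference: emit each word of $1 that does not appear in $2.
# Mirrors the loop in __mcpctl_available_servers above.
available_servers() {
  local all="$1" attached="$2" s
  for s in $all; do
    case " $attached " in
      *" $s "*) ;;       # already attached — skip
      *) echo "$s" ;;    # still available to attach
    esac
  done
}

available_servers "github jira postgres" "jira"
# github
# postgres
```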
# Top-level commands (without --project)
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a status -d 'Show status and connectivity'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a login -d 'Authenticate with mcpd'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a logout -d 'Log out'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a config -d 'Manage configuration'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a get -d 'List resources'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a describe -d 'Show resource details'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a delete -d 'Delete a resource'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a logs -d 'Get instance logs'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a create -d 'Create a resource'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a edit -d 'Edit a resource'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a apply -d 'Apply configuration from file'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a backup -d 'Backup configuration'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a restore -d 'Restore from backup'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a patch -d 'Patch a resource field'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a approve -d 'Approve a prompt request'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a help -d 'Show help'

# Project-scoped commands (with --project)
complete -c mcpctl -n "__mcpctl_has_project; and not __fish_seen_subcommand_from $project_commands" -a attach-server -d 'Attach a server to the project'
complete -c mcpctl -n "__mcpctl_has_project; and not __fish_seen_subcommand_from $project_commands" -a detach-server -d 'Detach a server from the project'
complete -c mcpctl -n "__mcpctl_has_project; and not __fish_seen_subcommand_from $project_commands" -a get -d 'List resources (scoped to project)'
complete -c mcpctl -n "__mcpctl_has_project; and not __fish_seen_subcommand_from $project_commands" -a describe -d 'Show resource details'
complete -c mcpctl -n "__mcpctl_has_project; and not __fish_seen_subcommand_from $project_commands" -a delete -d 'Delete a resource'
complete -c mcpctl -n "__mcpctl_has_project; and not __fish_seen_subcommand_from $project_commands" -a logs -d 'Get instance logs'
complete -c mcpctl -n "__mcpctl_has_project; and not __fish_seen_subcommand_from $project_commands" -a create -d 'Create a resource'
complete -c mcpctl -n "__mcpctl_has_project; and not __fish_seen_subcommand_from $project_commands" -a edit -d 'Edit a resource'
complete -c mcpctl -n "__mcpctl_has_project; and not __fish_seen_subcommand_from $project_commands" -a help -d 'Show help'

# Resource types — only when resource type not yet selected
complete -c mcpctl -n "__fish_seen_subcommand_from get describe delete patch; and __mcpctl_needs_resource_type" -a "$resources" -d 'Resource type'
complete -c mcpctl -n "__fish_seen_subcommand_from edit; and __mcpctl_needs_resource_type" -a 'servers secrets projects groups rbac prompts promptrequests' -d 'Resource type'

# Resource names — after resource type is selected
complete -c mcpctl -n "__fish_seen_subcommand_from get describe delete edit patch; and not __mcpctl_needs_resource_type" -a '(__mcpctl_resource_names)' -d 'Resource name'

# Helper: check if attach-server/detach-server already has a server argument
function __mcpctl_needs_server_arg
    set -l tokens (commandline -opc)
    set -l found_cmd false
    for tok in $tokens
        if $found_cmd
            if not string match -q -- '-*' $tok
                return 1 # server arg already present
            end
        end
        if contains -- $tok attach-server detach-server
            set found_cmd true
        end
    end
    if $found_cmd
        return 0 # command found but no server arg yet
    end
    return 1
end

# attach-server: show servers NOT in the project (only if no server arg yet)
complete -c mcpctl -n "__fish_seen_subcommand_from attach-server; and __mcpctl_needs_server_arg" -a '(__mcpctl_available_servers)' -d 'Server'

# detach-server: show servers IN the project (only if no server arg yet)
complete -c mcpctl -n "__fish_seen_subcommand_from detach-server; and __mcpctl_needs_server_arg" -a '(__mcpctl_project_servers)' -d 'Server'

# get/describe options
complete -c mcpctl -n "__fish_seen_subcommand_from get" -s o -l output -d 'Output format' -xa 'table json yaml'
complete -c mcpctl -n "__fish_seen_subcommand_from get" -l project -d 'Filter by project' -xa '(__mcpctl_project_names)'
complete -c mcpctl -n "__fish_seen_subcommand_from get" -s A -l all -d 'Show all resources across projects'
complete -c mcpctl -n "__fish_seen_subcommand_from describe" -s o -l output -d 'Output format' -xa 'detail json yaml'
complete -c mcpctl -n "__fish_seen_subcommand_from describe" -l show-values -d 'Show secret values'

# login options
complete -c mcpctl -n "__fish_seen_subcommand_from login" -l url -d 'mcpd URL' -x
complete -c mcpctl -n "__fish_seen_subcommand_from login" -l email -d 'Email address' -x
complete -c mcpctl -n "__fish_seen_subcommand_from login" -l password -d 'Password' -x

# config subcommands
complete -c mcpctl -n "__fish_seen_subcommand_from config; and not __fish_seen_subcommand_from view set path reset" -a view -d 'Show configuration'
complete -c mcpctl -n "__fish_seen_subcommand_from config; and not __fish_seen_subcommand_from view set path reset" -a set -d 'Set a config value'
complete -c mcpctl -n "__fish_seen_subcommand_from config; and not __fish_seen_subcommand_from view set path reset" -a path -d 'Show config file path'
complete -c mcpctl -n "__fish_seen_subcommand_from config; and not __fish_seen_subcommand_from view set path reset" -a reset -d 'Reset to defaults'
set -l config_cmds view set path reset claude claude-generate setup impersonate
complete -c mcpctl -n "__fish_seen_subcommand_from config; and not __fish_seen_subcommand_from $config_cmds" -a view -d 'Show configuration'
complete -c mcpctl -n "__fish_seen_subcommand_from config; and not __fish_seen_subcommand_from $config_cmds" -a set -d 'Set a config value'
complete -c mcpctl -n "__fish_seen_subcommand_from config; and not __fish_seen_subcommand_from $config_cmds" -a path -d 'Show config file path'
complete -c mcpctl -n "__fish_seen_subcommand_from config; and not __fish_seen_subcommand_from $config_cmds" -a reset -d 'Reset to defaults'
complete -c mcpctl -n "__fish_seen_subcommand_from config; and not __fish_seen_subcommand_from $config_cmds" -a claude -d 'Generate .mcp.json for project'
complete -c mcpctl -n "__fish_seen_subcommand_from config; and not __fish_seen_subcommand_from $config_cmds" -a setup -d 'Configure LLM provider'
complete -c mcpctl -n "__fish_seen_subcommand_from config; and not __fish_seen_subcommand_from $config_cmds" -a impersonate -d 'Impersonate a user'

# instance subcommands
set -l instance_cmds list ls start stop restart remove rm logs inspect
complete -c mcpctl -n "__fish_seen_subcommand_from instance instances; and not __fish_seen_subcommand_from $instance_cmds" -a list -d 'List instances'
complete -c mcpctl -n "__fish_seen_subcommand_from instance instances; and not __fish_seen_subcommand_from $instance_cmds" -a start -d 'Start instance'
complete -c mcpctl -n "__fish_seen_subcommand_from instance instances; and not __fish_seen_subcommand_from $instance_cmds" -a stop -d 'Stop instance'
complete -c mcpctl -n "__fish_seen_subcommand_from instance instances; and not __fish_seen_subcommand_from $instance_cmds" -a restart -d 'Restart instance'
complete -c mcpctl -n "__fish_seen_subcommand_from instance instances; and not __fish_seen_subcommand_from $instance_cmds" -a remove -d 'Remove instance'
complete -c mcpctl -n "__fish_seen_subcommand_from instance instances; and not __fish_seen_subcommand_from $instance_cmds" -a logs -d 'Get logs'
complete -c mcpctl -n "__fish_seen_subcommand_from instance instances; and not __fish_seen_subcommand_from $instance_cmds" -a inspect -d 'Inspect container'
complete -c mcpctl -n "__fish_seen_subcommand_from instance instances; and __fish_seen_subcommand_from logs" -l tail -d 'Number of lines' -x
complete -c mcpctl -n "__fish_seen_subcommand_from instance instances; and __fish_seen_subcommand_from logs" -l since -d 'Since timestamp' -x
# create subcommands
set -l create_cmds server secret project user group rbac prompt promptrequest
complete -c mcpctl -n "__fish_seen_subcommand_from create; and not __fish_seen_subcommand_from $create_cmds" -a server -d 'Create a server'
complete -c mcpctl -n "__fish_seen_subcommand_from create; and not __fish_seen_subcommand_from $create_cmds" -a secret -d 'Create a secret'
complete -c mcpctl -n "__fish_seen_subcommand_from create; and not __fish_seen_subcommand_from $create_cmds" -a project -d 'Create a project'
complete -c mcpctl -n "__fish_seen_subcommand_from create; and not __fish_seen_subcommand_from $create_cmds" -a user -d 'Create a user'
complete -c mcpctl -n "__fish_seen_subcommand_from create; and not __fish_seen_subcommand_from $create_cmds" -a group -d 'Create a group'
complete -c mcpctl -n "__fish_seen_subcommand_from create; and not __fish_seen_subcommand_from $create_cmds" -a rbac -d 'Create an RBAC binding'
complete -c mcpctl -n "__fish_seen_subcommand_from create; and not __fish_seen_subcommand_from $create_cmds" -a prompt -d 'Create an approved prompt'
complete -c mcpctl -n "__fish_seen_subcommand_from create; and not __fish_seen_subcommand_from $create_cmds" -a promptrequest -d 'Create a prompt request'

# claude subcommands
set -l claude_cmds generate show add remove
complete -c mcpctl -n "__fish_seen_subcommand_from claude; and not __fish_seen_subcommand_from $claude_cmds" -a generate -d 'Generate .mcp.json'
complete -c mcpctl -n "__fish_seen_subcommand_from claude; and not __fish_seen_subcommand_from $claude_cmds" -a show -d 'Show .mcp.json'
complete -c mcpctl -n "__fish_seen_subcommand_from claude; and not __fish_seen_subcommand_from $claude_cmds" -a add -d 'Add server entry'
complete -c mcpctl -n "__fish_seen_subcommand_from claude; and not __fish_seen_subcommand_from $claude_cmds" -a remove -d 'Remove server entry'
complete -c mcpctl -n "__fish_seen_subcommand_from claude; and __fish_seen_subcommand_from $claude_cmds" -s p -l path -d 'Path to .mcp.json' -rF
# create prompt/promptrequest options
complete -c mcpctl -n "__fish_seen_subcommand_from create; and __fish_seen_subcommand_from prompt promptrequest" -l project -d 'Project name' -xa '(__mcpctl_project_names)'
complete -c mcpctl -n "__fish_seen_subcommand_from create; and __fish_seen_subcommand_from prompt promptrequest" -l content -d 'Prompt content text' -x
complete -c mcpctl -n "__fish_seen_subcommand_from create; and __fish_seen_subcommand_from prompt promptrequest" -l content-file -d 'Read content from file' -rF
complete -c mcpctl -n "__fish_seen_subcommand_from create; and __fish_seen_subcommand_from prompt promptrequest" -l priority -d 'Priority 1-10' -xa '(seq 1 10)'
complete -c mcpctl -n "__fish_seen_subcommand_from create; and __fish_seen_subcommand_from prompt" -l link -d 'Link to MCP resource (project/server:uri)' -x

# project subcommands
set -l project_cmds list ls create delete rm show profiles set-profiles
complete -c mcpctl -n "__fish_seen_subcommand_from project projects; and not __fish_seen_subcommand_from $project_cmds" -a list -d 'List projects'
complete -c mcpctl -n "__fish_seen_subcommand_from project projects; and not __fish_seen_subcommand_from $project_cmds" -a create -d 'Create project'
complete -c mcpctl -n "__fish_seen_subcommand_from project projects; and not __fish_seen_subcommand_from $project_cmds" -a delete -d 'Delete project'
complete -c mcpctl -n "__fish_seen_subcommand_from project projects; and not __fish_seen_subcommand_from $project_cmds" -a show -d 'Show project'
complete -c mcpctl -n "__fish_seen_subcommand_from project projects; and not __fish_seen_subcommand_from $project_cmds" -a profiles -d 'List profiles'
complete -c mcpctl -n "__fish_seen_subcommand_from project projects; and not __fish_seen_subcommand_from $project_cmds" -a set-profiles -d 'Set profiles'
complete -c mcpctl -n "__fish_seen_subcommand_from project projects; and __fish_seen_subcommand_from create" -s d -l description -d 'Description' -x
# create project --gated/--no-gated
complete -c mcpctl -n "__fish_seen_subcommand_from create; and __fish_seen_subcommand_from project" -l gated -d 'Enable gated sessions'
complete -c mcpctl -n "__fish_seen_subcommand_from create; and __fish_seen_subcommand_from project" -l no-gated -d 'Disable gated sessions'

# logs: takes a server/instance name, then options
function __mcpctl_instance_names
    mcpctl get instances -o json 2>/dev/null | jq -r '.[][].server.name' 2>/dev/null
end
complete -c mcpctl -n "__fish_seen_subcommand_from logs" -a '(__mcpctl_instance_names)' -d 'Server name'
complete -c mcpctl -n "__fish_seen_subcommand_from logs" -l tail -d 'Number of lines' -x
complete -c mcpctl -n "__fish_seen_subcommand_from logs" -l since -d 'Since timestamp' -x
complete -c mcpctl -n "__fish_seen_subcommand_from logs" -s f -l follow -d 'Follow log output'

# backup options
complete -c mcpctl -n "__fish_seen_subcommand_from backup" -s o -l output -d 'Output file' -rF
complete -c mcpctl -n "__fish_seen_subcommand_from backup" -s p -l password -d 'Encryption password' -x
complete -c mcpctl -n "__fish_seen_subcommand_from backup" -s r -l resources -d 'Resources to backup' -xa 'servers profiles projects'

# restore options
complete -c mcpctl -n "__fish_seen_subcommand_from restore" -s i -l input -d 'Input file' -rF
complete -c mcpctl -n "__fish_seen_subcommand_from restore" -s p -l password -d 'Decryption password' -x
complete -c mcpctl -n "__fish_seen_subcommand_from restore" -s c -l conflict -d 'Conflict strategy' -xa 'skip overwrite fail'

# approve: first arg is resource type, second is name
function __mcpctl_approve_needs_type
    set -l tokens (commandline -opc)
    set -l found false
    for tok in $tokens
        if $found
            if contains -- $tok promptrequest promptrequests
                return 1 # type already given
            end
        end
        if test "$tok" = "approve"
            set found true
        end
    end
    if $found
        return 0 # approve found but no type yet
    end
    return 1
end

function __mcpctl_approve_needs_name
    set -l tokens (commandline -opc)
    set -l found_type false
    for tok in $tokens
        if $found_type
            # next non-flag token after type is the name
            if not string match -q -- '-*' $tok
                return 1 # name already given
            end
        end
        if contains -- $tok promptrequest promptrequests
            set found_type true
        end
    end
    if $found_type
        return 0 # type given but no name yet
    end
    return 1
end

function __mcpctl_promptrequest_names
    mcpctl get promptrequests -A -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null
end

complete -c mcpctl -n "__fish_seen_subcommand_from approve; and __mcpctl_approve_needs_type" -a 'promptrequest' -d 'Resource type'
complete -c mcpctl -n "__fish_seen_subcommand_from approve; and __mcpctl_approve_needs_name" -a '(__mcpctl_promptrequest_names)' -d 'Prompt request name'

# apply takes a file
complete -c mcpctl -n "__fish_seen_subcommand_from apply" -s f -l file -d 'Configuration file' -rF
complete -c mcpctl -n "__fish_seen_subcommand_from apply" -F

# help completions

fulldeploy.sh (new executable file, 35 lines)
@@ -0,0 +1,35 @@
#!/bin/bash
# Full deployment: Docker image → Portainer stack → RPM build/publish/install
set -e

SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
cd "$SCRIPT_DIR"

# Load .env
if [ -f .env ]; then
    set -a; source .env; set +a
fi

echo "========================================"
echo " mcpctl Full Deploy"
echo "========================================"

echo ""
echo ">>> Step 1/3: Build & push mcpd Docker image"
echo ""
bash scripts/build-mcpd.sh "$@"

echo ""
echo ">>> Step 2/3: Deploy stack to production"
echo ""
bash deploy.sh

echo ""
echo ">>> Step 3/3: Build, publish & install RPM"
echo ""
bash scripts/release.sh

echo ""
echo "========================================"
echo " Full deploy complete!"
echo "========================================"
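The `set -a; source .env; set +a` idiom used by both deploy scripts turns on allexport while the file is sourced, so every variable it assigns is exported to child processes. A quick demonstration (the file path and variable are invented):

```shell
# allexport demo: variables assigned between "set -a" and "set +a"
# are exported, so child processes (build scripts, docker, etc.) see them.
printf 'FOO=bar\n' > /tmp/example.env
set -a; source /tmp/example.env; set +a
bash -c 'echo "$FOO"'
# bar
```

Without `set -a`, a plain `source .env` would set the variables only in the current shell, and the child build scripts would not inherit them.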
@@ -5,6 +5,8 @@ release: "1"
maintainer: michal
description: kubectl-like CLI for managing MCP servers
license: MIT
depends:
  - jq
contents:
  - src: ./dist/mcpctl
    dst: /usr/bin/mcpctl

pr.sh (new executable file, 55 lines)
@@ -0,0 +1,55 @@
#!/usr/bin/env bash
# Usage: bash pr.sh "PR title" "PR body"
# Loads GITEA_TOKEN from .env automatically

set -euo pipefail

# Load .env if GITEA_TOKEN not already exported
if [ -z "${GITEA_TOKEN:-}" ] && [ -f .env ]; then
    set -a
    source .env
    set +a
fi

GITEA_URL="${GITEA_URL:-http://10.0.0.194:3012}"
REPO="${GITEA_OWNER:-michal}/mcpctl"

TITLE="${1:?Usage: pr.sh <title> [body]}"
BODY="${2:-}"
BASE="${3:-main}"
HEAD=$(git rev-parse --abbrev-ref HEAD)

if [ "$HEAD" = "$BASE" ]; then
    echo "Error: already on $BASE, switch to a feature branch first" >&2
    exit 1
fi

if [ -z "${GITEA_TOKEN:-}" ]; then
    echo "Error: GITEA_TOKEN not set and .env not found" >&2
    exit 1
fi

# Push if needed
if ! git rev-parse --verify "origin/$HEAD" &>/dev/null; then
    git push -u origin "$HEAD"
else
    git push
fi

# Create PR
RESPONSE=$(curl -s -X POST "$GITEA_URL/api/v1/repos/$REPO/pulls" \
    -H "Authorization: token $GITEA_TOKEN" \
    -H "Content-Type: application/json" \
    -d "$(jq -n --arg t "$TITLE" --arg b "$BODY" --arg h "$HEAD" --arg base "$BASE" \
        '{title: $t, body: $b, head: $h, base: $base}')")

PR_NUM=$(echo "$RESPONSE" | jq -r '.number // empty')
PR_URL=$(echo "$RESPONSE" | jq -r '.html_url // empty')

if [ -z "$PR_NUM" ]; then
    echo "Error creating PR:" >&2
    echo "$RESPONSE" | jq . 2>/dev/null || echo "$RESPONSE" >&2
    exit 1
fi

echo "PR #$PR_NUM: https://mysources.co.uk/$REPO/pulls/$PR_NUM"
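The `jq -n --arg` pattern in pr.sh builds the request payload without any manual quoting: jq escapes quotes, newlines, and other special characters in each argument. A minimal standalone sketch (branch names here are illustrative):

```shell
# Build a PR payload with jq -n; --arg safely escapes arbitrary strings,
# including embedded quotes and newlines.
TITLE='Add "gated" sessions'
BODY=$'Line one\nLine two'
payload=$(jq -n --arg t "$TITLE" --arg b "$BODY" --arg h feat/gated --arg base main \
    '{title: $t, body: $b, head: $h, base: $base}')
echo "$payload" | jq -r '.head'
# feat/gated
```

Compared with interpolating shell variables straight into a JSON string, this cannot produce malformed JSON no matter what the PR title contains.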
@@ -24,7 +24,10 @@ export class ApiError extends Error {
function request<T>(method: string, url: string, timeout: number, body?: unknown, token?: string): Promise<ApiResponse<T>> {
  return new Promise((resolve, reject) => {
    const parsed = new URL(url);
    const headers: Record<string, string> = { 'Content-Type': 'application/json' };
    const headers: Record<string, string> = {};
    if (body !== undefined) {
      headers['Content-Type'] = 'application/json';
    }
    if (token) {
      headers['Authorization'] = `Bearer ${token}`;
    }

@@ -1,5 +1,5 @@
import { Command } from 'commander';
import { readFileSync } from 'node:fs';
import { readFileSync, readSync } from 'node:fs';
import yaml from 'js-yaml';
import { z } from 'zod';
import type { ApiClient } from '../api-client.js';
@@ -63,17 +63,81 @@ const TemplateSpecSchema = z.object({
  healthCheck: HealthCheckSchema.optional(),
});

const UserSpecSchema = z.object({
  email: z.string().email(),
  password: z.string().min(8),
  name: z.string().optional(),
});

const GroupSpecSchema = z.object({
  name: z.string().min(1),
  description: z.string().default(''),
  members: z.array(z.string().email()).default([]),
});

const RbacSubjectSchema = z.object({
  kind: z.enum(['User', 'Group', 'ServiceAccount']),
  name: z.string().min(1),
});

const RESOURCE_ALIASES: Record<string, string> = {
  server: 'servers', instance: 'instances', secret: 'secrets',
  project: 'projects', template: 'templates', user: 'users', group: 'groups',
  prompt: 'prompts', promptrequest: 'promptrequests',
};

const RbacRoleBindingSchema = z.union([
  z.object({
    role: z.enum(['edit', 'view', 'create', 'delete', 'run', 'expose']),
    resource: z.string().min(1).transform((r) => RESOURCE_ALIASES[r] ?? r),
    name: z.string().min(1).optional(),
  }),
  z.object({
    role: z.literal('run'),
    action: z.string().min(1),
  }),
]);

const RbacBindingSpecSchema = z.object({
  name: z.string().min(1),
  subjects: z.array(RbacSubjectSchema).default([]),
  roleBindings: z.array(RbacRoleBindingSchema).default([]),
});

const PromptSpecSchema = z.object({
  name: z.string().min(1).max(100).regex(/^[a-z0-9-]+$/),
  content: z.string().min(1).max(50000),
  projectId: z.string().optional(),
  priority: z.number().int().min(1).max(10).optional(),
  linkTarget: z.string().optional(),
});

const ProjectSpecSchema = z.object({
  name: z.string().min(1),
  description: z.string().default(''),
  prompt: z.string().max(10000).default(''),
  proxyMode: z.enum(['direct', 'filtered']).default('direct'),
  gated: z.boolean().default(true),
  llmProvider: z.string().optional(),
  llmModel: z.string().optional(),
  servers: z.array(z.string()).default([]),
});

const ApplyConfigSchema = z.object({
  servers: z.array(ServerSpecSchema).default([]),
  secrets: z.array(SecretSpecSchema).default([]),
  servers: z.array(ServerSpecSchema).default([]),
  users: z.array(UserSpecSchema).default([]),
  groups: z.array(GroupSpecSchema).default([]),
  projects: z.array(ProjectSpecSchema).default([]),
  templates: z.array(TemplateSpecSchema).default([]),
});
  rbacBindings: z.array(RbacBindingSpecSchema).default([]),
  rbac: z.array(RbacBindingSpecSchema).default([]),
  prompts: z.array(PromptSpecSchema).default([]),
}).transform((data) => ({
  ...data,
  // Merge rbac into rbacBindings so both keys work
  rbacBindings: [...data.rbacBindings, ...data.rbac],
}));

export type ApplyConfig = z.infer<typeof ApplyConfigSchema>;

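An apply file exercising the new schema fields might look like the following (all names, servers, and prompt text here are invented for illustration; only the keys come from the schemas above):

```shell
# Write a hypothetical apply file using the new project/prompt/rbac keys.
cat > /tmp/example-apply.yaml <<'EOF'
projects:
  - name: payments
    description: Payments team workspace
    proxyMode: filtered
    gated: true
    servers: [github, postgres]
prompts:
  - name: security-policy
    content: "Never commit secrets; route DB access through the postgres server."
    priority: 9
rbac:
  - name: payments-devs
    subjects:
      - { kind: Group, name: payments }
    roleBindings:
      - { role: view, resource: servers }
EOF
grep -c 'gated: true' /tmp/example-apply.yaml
# 1
```

Note `rbac:` works as a synonym for `rbacBindings:` thanks to the `.transform()` merge, and `gated` defaults to `true` when omitted.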
@@ -87,17 +151,26 @@ export function createApplyCommand(deps: ApplyCommandDeps): Command {

  return new Command('apply')
    .description('Apply declarative configuration from a YAML or JSON file')
    .argument('<file>', 'Path to config file (.yaml, .yml, or .json)')
    .argument('[file]', 'Path to config file (.yaml, .yml, or .json)')
    .option('-f, --file <file>', 'Path to config file (alternative to positional arg)')
    .option('--dry-run', 'Validate and show changes without applying')
    .action(async (file: string, opts: { dryRun?: boolean }) => {
    .action(async (fileArg: string | undefined, opts: { file?: string; dryRun?: boolean }) => {
      const file = fileArg ?? opts.file;
      if (!file) {
        throw new Error('File path required. Usage: mcpctl apply <file> or mcpctl apply -f <file>');
      }
      const config = loadConfigFile(file);

      if (opts.dryRun) {
        log('Dry run - would apply:');
        if (config.servers.length > 0) log(` ${config.servers.length} server(s)`);
        if (config.secrets.length > 0) log(` ${config.secrets.length} secret(s)`);
        if (config.servers.length > 0) log(` ${config.servers.length} server(s)`);
        if (config.users.length > 0) log(` ${config.users.length} user(s)`);
        if (config.groups.length > 0) log(` ${config.groups.length} group(s)`);
        if (config.projects.length > 0) log(` ${config.projects.length} project(s)`);
        if (config.templates.length > 0) log(` ${config.templates.length} template(s)`);
        if (config.rbacBindings.length > 0) log(` ${config.rbacBindings.length} rbacBinding(s)`);
        if (config.prompts.length > 0) log(` ${config.prompts.length} prompt(s)`);
        return;
      }

@@ -105,11 +178,27 @@ export function createApplyCommand(deps: ApplyCommandDeps): Command {
|
||||
});
|
||||
}
|
||||
|
||||
function readStdin(): string {
|
||||
const chunks: Buffer[] = [];
|
||||
const buf = Buffer.alloc(4096);
|
||||
try {
|
||||
// eslint-disable-next-line no-constant-condition
|
||||
while (true) {
|
||||
const bytesRead = readSync(0, buf, 0, buf.length, null);
|
||||
if (bytesRead === 0) break;
|
||||
chunks.push(buf.subarray(0, bytesRead));
|
||||
}
|
||||
} catch {
|
||||
// EOF or closed pipe
|
||||
}
|
||||
return Buffer.concat(chunks).toString('utf-8');
|
||||
}
|
||||
|
||||
function loadConfigFile(path: string): ApplyConfig {
|
||||
const raw = readFileSync(path, 'utf-8');
|
||||
const raw = path === '-' ? readStdin() : readFileSync(path, 'utf-8');
|
||||
let parsed: unknown;
|
||||
|
||||
if (path.endsWith('.json')) {
|
||||
if (path === '-' ? raw.trimStart().startsWith('{') : path.endsWith('.json')) {
|
||||
parsed = JSON.parse(raw);
|
||||
} else {
|
||||
parsed = yaml.load(raw);
|
||||
@@ -119,21 +208,7 @@ function loadConfigFile(path: string): ApplyConfig {
|
||||
}
|
||||
|
||||
async function applyConfig(client: ApiClient, config: ApplyConfig, log: (...args: unknown[]) => void): Promise<void> {
|
||||
// Apply servers first
|
||||
for (const server of config.servers) {
|
||||
try {
|
||||
const existing = await findByName(client, 'servers', server.name);
|
||||
if (existing) {
|
||||
await client.put(`/api/v1/servers/${(existing as { id: string }).id}`, server);
|
||||
log(`Updated server: ${server.name}`);
|
||||
} else {
|
||||
await client.post('/api/v1/servers', server);
|
||||
log(`Created server: ${server.name}`);
|
||||
}
|
||||
} catch (err) {
|
||||
log(`Error applying server '${server.name}': ${err instanceof Error ? err.message : err}`);
|
||||
}
|
||||
}
|
||||
// Apply order: secrets, servers, users, groups, projects, templates, rbacBindings
|
||||
|
||||
// Apply secrets
|
||||
for (const secret of config.secrets) {
|
||||
@@ -151,20 +226,63 @@ async function applyConfig(client: ApiClient, config: ApplyConfig, log: (...args
|
||||
}
|
||||
}
|
||||
|
||||
// Apply projects
|
||||
// Apply servers
|
||||
for (const server of config.servers) {
|
||||
try {
|
||||
const existing = await findByName(client, 'servers', server.name);
|
||||
if (existing) {
|
||||
await client.put(`/api/v1/servers/${(existing as { id: string }).id}`, server);
|
||||
log(`Updated server: ${server.name}`);
|
||||
} else {
|
||||
await client.post('/api/v1/servers', server);
|
||||
log(`Created server: ${server.name}`);
|
||||
}
|
||||
} catch (err) {
|
||||
log(`Error applying server '${server.name}': ${err instanceof Error ? err.message : err}`);
|
||||
}
|
||||
}
|
||||
|
||||
// Apply users (matched by email)
|
||||
for (const user of config.users) {
|
||||
try {
|
||||
const existing = await findByField(client, 'users', 'email', user.email);
|
||||
if (existing) {
|
||||
await client.put(`/api/v1/users/${(existing as { id: string }).id}`, user);
|
||||
log(`Updated user: ${user.email}`);
|
||||
} else {
|
||||
await client.post('/api/v1/users', user);
|
||||
log(`Created user: ${user.email}`);
|
||||
}
|
||||
} catch (err) {
|
||||
log(`Error applying user '${user.email}': ${err instanceof Error ? err.message : err}`);
|
||||
}
|
||||
}
|
||||
|
||||
// Apply groups
|
||||
for (const group of config.groups) {
|
||||
try {
|
||||
const existing = await findByName(client, 'groups', group.name);
|
||||
if (existing) {
|
||||
await client.put(`/api/v1/groups/${(existing as { id: string }).id}`, group);
|
||||
log(`Updated group: ${group.name}`);
|
||||
} else {
|
||||
await client.post('/api/v1/groups', group);
|
||||
log(`Created group: ${group.name}`);
|
||||
}
|
||||
} catch (err) {
|
||||
log(`Error applying group '${group.name}': ${err instanceof Error ? err.message : err}`);
|
||||
}
|
||||
}
|
||||
|
||||
// Apply projects (send full spec including servers)
|
||||
for (const project of config.projects) {
|
||||
try {
|
||||
const existing = await findByName(client, 'projects', project.name);
|
||||
if (existing) {
|
||||
await client.put(`/api/v1/projects/${(existing as { id: string }).id}`, {
|
||||
description: project.description,
|
||||
});
|
||||
await client.put(`/api/v1/projects/${(existing as { id: string }).id}`, project);
|
||||
log(`Updated project: ${project.name}`);
|
||||
} else {
|
||||
await client.post('/api/v1/projects', {
|
||||
name: project.name,
|
||||
description: project.description,
|
||||
});
|
||||
await client.post('/api/v1/projects', project);
|
||||
log(`Created project: ${project.name}`);
|
||||
}
|
||||
} catch (err) {
|
||||
@@ -187,6 +305,40 @@ async function applyConfig(client: ApiClient, config: ApplyConfig, log: (...args
|
||||
log(`Error applying template '${template.name}': ${err instanceof Error ? err.message : err}`);
|
||||
}
|
||||
}
|
||||
|
||||
// Apply RBAC bindings
|
||||
for (const rbacBinding of config.rbacBindings) {
|
||||
try {
|
||||
const existing = await findByName(client, 'rbac', rbacBinding.name);
|
||||
if (existing) {
|
||||
await client.put(`/api/v1/rbac/${(existing as { id: string }).id}`, rbacBinding);
|
||||
log(`Updated rbacBinding: ${rbacBinding.name}`);
|
||||
} else {
|
||||
await client.post('/api/v1/rbac', rbacBinding);
|
||||
log(`Created rbacBinding: ${rbacBinding.name}`);
|
||||
}
|
||||
} catch (err) {
|
||||
log(`Error applying rbacBinding '${rbacBinding.name}': ${err instanceof Error ? err.message : err}`);
|
||||
}
|
||||
}
|
||||
|
||||
// Apply prompts
|
||||
for (const prompt of config.prompts) {
|
||||
try {
|
||||
const existing = await findByName(client, 'prompts', prompt.name);
|
||||
if (existing) {
|
||||
const updateData: Record<string, unknown> = { content: prompt.content };
|
||||
if (prompt.priority !== undefined) updateData.priority = prompt.priority;
|
||||
await client.put(`/api/v1/prompts/${(existing as { id: string }).id}`, updateData);
|
||||
log(`Updated prompt: ${prompt.name}`);
|
||||
} else {
|
||||
await client.post('/api/v1/prompts', prompt);
|
||||
log(`Created prompt: ${prompt.name}`);
|
||||
}
|
||||
} catch (err) {
|
||||
log(`Error applying prompt '${prompt.name}': ${err instanceof Error ? err.message : err}`);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
async function findByName(client: ApiClient, resource: string, name: string): Promise<unknown | null> {
|
||||
@@ -198,5 +350,14 @@ async function findByName(client: ApiClient, resource: string, name: string): Pr
|
||||
}
|
||||
}
|
||||
|
||||
async function findByField<T extends string>(client: ApiClient, resource: string, field: T, value: string): Promise<unknown | null> {
|
||||
try {
|
||||
const items = await client.get<Array<Record<string, unknown>>>(`/api/v1/${resource}`);
|
||||
return items.find((item) => item[field] === value) ?? null;
|
||||
} catch {
|
||||
return null;
|
||||
}
|
||||
}
|
||||
|
||||
// Export for testing
|
||||
export { loadConfigFile, applyConfig };
|
||||
|
||||
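The stdin path in `loadConfigFile` decides between JSON and YAML by sniffing the first non-whitespace character, since `-` carries no file extension. A minimal standalone sketch of that detection (the helper name `looksLikeJson` is ours, not part of the diff):

```typescript
// Hypothetical standalone version of loadConfigFile's format check:
// for stdin ("-") sniff the payload itself, otherwise trust the extension.
function looksLikeJson(path: string, raw: string): boolean {
  return path === '-' ? raw.trimStart().startsWith('{') : path.endsWith('.json');
}

console.log(looksLikeJson('-', '  {"servers": []}')); // true  -> parse as JSON
console.log(looksLikeJson('-', 'servers: []'));       // false -> parse as YAML
console.log(looksLikeJson('config.yaml', '{}'));      // false -> extension wins
```

Note the asymmetry: a file named `config.yaml` is parsed as YAML even if its content is JSON, while stdin is judged purely by content.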
@@ -10,6 +10,10 @@ export interface PromptDeps {
   password(message: string): Promise<string>;
 }

+export interface StatusResponse {
+  hasUsers: boolean;
+}
+
 export interface AuthCommandDeps {
   configDeps: Partial<ConfigLoaderDeps>;
   credentialsDeps: Partial<CredentialsDeps>;
@@ -17,6 +21,8 @@ export interface AuthCommandDeps {
   log: (...args: string[]) => void;
   loginRequest: (mcpdUrl: string, email: string, password: string) => Promise<LoginResponse>;
   logoutRequest: (mcpdUrl: string, token: string) => Promise<void>;
+  statusRequest: (mcpdUrl: string) => Promise<StatusResponse>;
+  bootstrapRequest: (mcpdUrl: string, email: string, password: string, name?: string) => Promise<LoginResponse>;
 }

 interface LoginResponse {
@@ -80,6 +86,70 @@ function defaultLogoutRequest(mcpdUrl: string, token: string): Promise<void> {
   });
 }

+function defaultStatusRequest(mcpdUrl: string): Promise<StatusResponse> {
+  return new Promise((resolve, reject) => {
+    const url = new URL('/api/v1/auth/status', mcpdUrl);
+    const opts: http.RequestOptions = {
+      hostname: url.hostname,
+      port: url.port,
+      path: url.pathname,
+      method: 'GET',
+      timeout: 10000,
+      headers: { 'Content-Type': 'application/json' },
+    };
+    const req = http.request(opts, (res) => {
+      const chunks: Buffer[] = [];
+      res.on('data', (chunk: Buffer) => chunks.push(chunk));
+      res.on('end', () => {
+        const raw = Buffer.concat(chunks).toString('utf-8');
+        if ((res.statusCode ?? 0) >= 400) {
+          reject(new Error(`Status check failed (${res.statusCode}): ${raw}`));
+          return;
+        }
+        resolve(JSON.parse(raw) as StatusResponse);
+      });
+    });
+    req.on('error', (err) => reject(new Error(`Cannot reach mcpd: ${err.message}`)));
+    req.on('timeout', () => { req.destroy(); reject(new Error('Status request timed out')); });
+    req.end();
+  });
+}
+
+function defaultBootstrapRequest(mcpdUrl: string, email: string, password: string, name?: string): Promise<LoginResponse> {
+  return new Promise((resolve, reject) => {
+    const url = new URL('/api/v1/auth/bootstrap', mcpdUrl);
+    const payload: Record<string, string> = { email, password };
+    if (name) {
+      payload['name'] = name;
+    }
+    const body = JSON.stringify(payload);
+    const opts: http.RequestOptions = {
+      hostname: url.hostname,
+      port: url.port,
+      path: url.pathname,
+      method: 'POST',
+      timeout: 10000,
+      headers: { 'Content-Type': 'application/json', 'Content-Length': Buffer.byteLength(body) },
+    };
+    const req = http.request(opts, (res) => {
+      const chunks: Buffer[] = [];
+      res.on('data', (chunk: Buffer) => chunks.push(chunk));
+      res.on('end', () => {
+        const raw = Buffer.concat(chunks).toString('utf-8');
+        if ((res.statusCode ?? 0) >= 400) {
+          reject(new Error(`Bootstrap failed (${res.statusCode}): ${raw}`));
+          return;
+        }
+        resolve(JSON.parse(raw) as LoginResponse);
+      });
+    });
+    req.on('error', (err) => reject(new Error(`Cannot reach mcpd: ${err.message}`)));
+    req.on('timeout', () => { req.destroy(); reject(new Error('Bootstrap request timed out')); });
+    req.write(body);
+    req.end();
+  });
+}
+
 async function defaultInput(message: string): Promise<string> {
   const { default: inquirer } = await import('inquirer');
   const { answer } = await inquirer.prompt([{ type: 'input', name: 'answer', message }]);
@@ -99,10 +169,12 @@ const defaultDeps: AuthCommandDeps = {
   log: (...args) => console.log(...args),
   loginRequest: defaultLoginRequest,
   logoutRequest: defaultLogoutRequest,
+  statusRequest: defaultStatusRequest,
+  bootstrapRequest: defaultBootstrapRequest,
 };

 export function createLoginCommand(deps?: Partial<AuthCommandDeps>): Command {
-  const { configDeps, credentialsDeps, prompt, log, loginRequest } = { ...defaultDeps, ...deps };
+  const { configDeps, credentialsDeps, prompt, log, loginRequest, statusRequest, bootstrapRequest } = { ...defaultDeps, ...deps };

   return new Command('login')
     .description('Authenticate with mcpd')
@@ -111,17 +183,36 @@ export function createLoginCommand(deps?: Partial<AuthCommandDeps>): Command {
       const config = loadConfig(configDeps);
       const mcpdUrl = opts.mcpdUrl ?? config.mcpdUrl;

-      const email = await prompt.input('Email:');
-      const password = await prompt.password('Password:');
-
       try {
-        const result = await loginRequest(mcpdUrl, email, password);
-        saveCredentials({
-          token: result.token,
-          mcpdUrl,
-          user: result.user.email,
-        }, credentialsDeps);
-        log(`Logged in as ${result.user.email}`);
+        const status = await statusRequest(mcpdUrl);
+
+        if (!status.hasUsers) {
+          log('No users configured. Creating first admin account.');
+          const email = await prompt.input('Email:');
+          const password = await prompt.password('Password:');
+          const name = await prompt.input('Name (optional):');
+
+          const result = name
+            ? await bootstrapRequest(mcpdUrl, email, password, name)
+            : await bootstrapRequest(mcpdUrl, email, password);
+          saveCredentials({
+            token: result.token,
+            mcpdUrl,
+            user: result.user.email,
+          }, credentialsDeps);
+          log(`Logged in as ${result.user.email} (admin)`);
+        } else {
+          const email = await prompt.input('Email:');
+          const password = await prompt.password('Password:');
+
+          const result = await loginRequest(mcpdUrl, email, password);
+          saveCredentials({
+            token: result.token,
+            mcpdUrl,
+            user: result.user.email,
+          }, credentialsDeps);
+          log(`Logged in as ${result.user.email}`);
+        }
       } catch (err) {
         log(`Login failed: ${(err as Error).message}`);
         process.exitCode = 1;
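The reworked login flow gates on `/api/v1/auth/status` before prompting: an uninitialized instance bootstraps the first admin, an initialized one falls through to the normal login. A minimal sketch of just that branch, with a fake request function standing in for the injected HTTP dep (the name `chooseFlow` is illustrative, not from the diff):

```typescript
// Stand-in for the status-gated branch in createLoginCommand. The real
// statusRequest does an HTTP GET; this sketch only models the decision.
interface Status { hasUsers: boolean; }

async function chooseFlow(statusRequest: () => Promise<Status>): Promise<'bootstrap' | 'login'> {
  const status = await statusRequest();
  // No users yet: the first login creates the initial admin via /auth/bootstrap.
  return status.hasUsers ? 'login' : 'bootstrap';
}

chooseFlow(async () => ({ hasUsers: false })).then((flow) => console.log(flow)); // prints "bootstrap"
```

Because `statusRequest` is a constructor dependency, tests can drive either branch without a server.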
@@ -1,155 +0,0 @@
-import { Command } from 'commander';
-import { writeFileSync, readFileSync, existsSync } from 'node:fs';
-import { resolve } from 'node:path';
-import type { ApiClient } from '../api-client.js';
-
-interface McpConfig {
-  mcpServers: Record<string, { command: string; args: string[]; env?: Record<string, string> }>;
-}
-
-export interface ClaudeCommandDeps {
-  client: ApiClient;
-  log: (...args: unknown[]) => void;
-}
-
-export function createClaudeCommand(deps: ClaudeCommandDeps): Command {
-  const { client, log } = deps;
-
-  const cmd = new Command('claude')
-    .description('Manage Claude MCP configuration (.mcp.json)');
-
-  cmd
-    .command('generate <projectId>')
-    .description('Generate .mcp.json from a project configuration')
-    .option('-o, --output <path>', 'Output file path', '.mcp.json')
-    .option('--merge', 'Merge with existing .mcp.json instead of overwriting')
-    .option('--stdout', 'Print to stdout instead of writing a file')
-    .action(async (projectId: string, opts: { output: string; merge?: boolean; stdout?: boolean }) => {
-      const config = await client.get<McpConfig>(`/api/v1/projects/${projectId}/mcp-config`);
-
-      if (opts.stdout) {
-        log(JSON.stringify(config, null, 2));
-        return;
-      }
-
-      const outputPath = resolve(opts.output);
-      let finalConfig = config;
-
-      if (opts.merge && existsSync(outputPath)) {
-        try {
-          const existing = JSON.parse(readFileSync(outputPath, 'utf-8')) as McpConfig;
-          finalConfig = {
-            mcpServers: {
-              ...existing.mcpServers,
-              ...config.mcpServers,
-            },
-          };
-        } catch {
-          // If existing file is invalid, just overwrite
-        }
-      }
-
-      writeFileSync(outputPath, JSON.stringify(finalConfig, null, 2) + '\n');
-      const serverCount = Object.keys(finalConfig.mcpServers).length;
-      log(`Wrote ${outputPath} (${serverCount} server(s))`);
-    });
-
-  cmd
-    .command('show')
-    .description('Show current .mcp.json configuration')
-    .option('-p, --path <path>', 'Path to .mcp.json', '.mcp.json')
-    .action((opts: { path: string }) => {
-      const filePath = resolve(opts.path);
-      if (!existsSync(filePath)) {
-        log(`No .mcp.json found at ${filePath}`);
-        return;
-      }
-      const content = readFileSync(filePath, 'utf-8');
-      try {
-        const config = JSON.parse(content) as McpConfig;
-        const servers = Object.entries(config.mcpServers ?? {});
-        if (servers.length === 0) {
-          log('No MCP servers configured.');
-          return;
-        }
-        log(`MCP servers in ${filePath}:\n`);
-        for (const [name, server] of servers) {
-          log(`  ${name}`);
-          log(`    command: ${server.command} ${server.args.join(' ')}`);
-          if (server.env) {
-            const envKeys = Object.keys(server.env);
-            log(`    env: ${envKeys.join(', ')}`);
-          }
-        }
-      } catch {
-        log(`Invalid JSON in ${filePath}`);
-      }
-    });
-
-  cmd
-    .command('add <name>')
-    .description('Add an MCP server entry to .mcp.json')
-    .requiredOption('-c, --command <cmd>', 'Command to run')
-    .option('-a, --args <args...>', 'Command arguments')
-    .option('-e, --env <key=value...>', 'Environment variables')
-    .option('-p, --path <path>', 'Path to .mcp.json', '.mcp.json')
-    .action((name: string, opts: { command: string; args?: string[]; env?: string[]; path: string }) => {
-      const filePath = resolve(opts.path);
-      let config: McpConfig = { mcpServers: {} };
-
-      if (existsSync(filePath)) {
-        try {
-          config = JSON.parse(readFileSync(filePath, 'utf-8')) as McpConfig;
-        } catch {
-          // Start fresh
-        }
-      }
-
-      const entry: { command: string; args: string[]; env?: Record<string, string> } = {
-        command: opts.command,
-        args: opts.args ?? [],
-      };
-
-      if (opts.env && opts.env.length > 0) {
-        const env: Record<string, string> = {};
-        for (const pair of opts.env) {
-          const eqIdx = pair.indexOf('=');
-          if (eqIdx > 0) {
-            env[pair.slice(0, eqIdx)] = pair.slice(eqIdx + 1);
-          }
-        }
-        entry.env = env;
-      }
-
-      config.mcpServers[name] = entry;
-      writeFileSync(filePath, JSON.stringify(config, null, 2) + '\n');
-      log(`Added '${name}' to ${filePath}`);
-    });
-
-  cmd
-    .command('remove <name>')
-    .description('Remove an MCP server entry from .mcp.json')
-    .option('-p, --path <path>', 'Path to .mcp.json', '.mcp.json')
-    .action((name: string, opts: { path: string }) => {
-      const filePath = resolve(opts.path);
-      if (!existsSync(filePath)) {
-        log(`No .mcp.json found at ${filePath}`);
-        return;
-      }
-
-      try {
-        const config = JSON.parse(readFileSync(filePath, 'utf-8')) as McpConfig;
-        if (!(name in config.mcpServers)) {
-          log(`Server '${name}' not found in ${filePath}`);
-          return;
-        }
-        delete config.mcpServers[name];
-        writeFileSync(filePath, JSON.stringify(config, null, 2) + '\n');
-        log(`Removed '${name}' from ${filePath}`);
-      } catch {
-        log(`Invalid JSON in ${filePath}`);
-      }
-    });
-
-  return cmd;
-}
src/cli/src/commands/config-setup.ts (new file, 464 lines)
@@ -0,0 +1,464 @@
+import { Command } from 'commander';
+import http from 'node:http';
+import https from 'node:https';
+import { execFile } from 'node:child_process';
+import { promisify } from 'node:util';
+import { loadConfig, saveConfig } from '../config/index.js';
+import type { ConfigLoaderDeps, McpctlConfig, LlmConfig, LlmProviderName, LlmProviderEntry, LlmTier } from '../config/index.js';
+import type { SecretStore } from '@mcpctl/shared';
+import { createSecretStore } from '@mcpctl/shared';
+
+const execFileAsync = promisify(execFile);
+
+export interface ConfigSetupPrompt {
+  select<T>(message: string, choices: Array<{ name: string; value: T; description?: string }>): Promise<T>;
+  input(message: string, defaultValue?: string): Promise<string>;
+  password(message: string): Promise<string>;
+  confirm(message: string, defaultValue?: boolean): Promise<boolean>;
+}
+
+export interface ConfigSetupDeps {
+  configDeps: Partial<ConfigLoaderDeps>;
+  secretStore: SecretStore;
+  log: (...args: string[]) => void;
+  prompt: ConfigSetupPrompt;
+  fetchModels: (url: string, path: string) => Promise<string[]>;
+  whichBinary: (name: string) => Promise<string | null>;
+}
+
+interface ProviderChoice {
+  name: string;
+  value: LlmProviderName;
+  description: string;
+}
+
+/** Provider config fields returned by per-provider setup functions. */
+interface ProviderFields {
+  model?: string;
+  url?: string;
+  binaryPath?: string;
+}
+
+const FAST_PROVIDER_CHOICES: ProviderChoice[] = [
+  { name: 'vLLM', value: 'vllm', description: 'Self-hosted vLLM (OpenAI-compatible)' },
+  { name: 'Ollama', value: 'ollama', description: 'Local models via Ollama' },
+];
+
+const HEAVY_PROVIDER_CHOICES: ProviderChoice[] = [
+  { name: 'Gemini CLI', value: 'gemini-cli', description: 'Google Gemini via local CLI (free, no API key)' },
+  { name: 'Anthropic (Claude)', value: 'anthropic', description: 'Claude API (requires API key)' },
+  { name: 'OpenAI', value: 'openai', description: 'OpenAI API (requires API key)' },
+  { name: 'DeepSeek', value: 'deepseek', description: 'DeepSeek API (requires API key)' },
+];
+
+const ALL_PROVIDER_CHOICES: ProviderChoice[] = [
+  ...FAST_PROVIDER_CHOICES,
+  ...HEAVY_PROVIDER_CHOICES,
+  { name: 'None (disable)', value: 'none', description: 'Disable LLM features' },
+];
+
+const GEMINI_MODELS = ['gemini-2.5-flash', 'gemini-2.5-pro', 'gemini-2.0-flash'];
+const ANTHROPIC_MODELS = ['claude-3-5-haiku-20241022', 'claude-sonnet-4-20250514', 'claude-opus-4-20250514'];
+const DEEPSEEK_MODELS = ['deepseek-chat', 'deepseek-reasoner'];
+
+function defaultFetchModels(baseUrl: string, path: string): Promise<string[]> {
+  return new Promise((resolve) => {
+    const url = new URL(path, baseUrl);
+    const isHttps = url.protocol === 'https:';
+    const transport = isHttps ? https : http;
+
+    const req = transport.get({
+      hostname: url.hostname,
+      port: url.port || (isHttps ? 443 : 80),
+      path: url.pathname,
+      timeout: 5000,
+    }, (res) => {
+      const chunks: Buffer[] = [];
+      res.on('data', (chunk: Buffer) => chunks.push(chunk));
+      res.on('end', () => {
+        try {
+          const raw = Buffer.concat(chunks).toString('utf-8');
+          const data = JSON.parse(raw) as { models?: Array<{ name: string }>; data?: Array<{ id: string }> };
+          // Ollama format: { models: [{ name }] }
+          if (data.models) {
+            resolve(data.models.map((m) => m.name));
+            return;
+          }
+          // OpenAI/vLLM format: { data: [{ id }] }
+          if (data.data) {
+            resolve(data.data.map((m) => m.id));
+            return;
+          }
+          resolve([]);
+        } catch {
+          resolve([]);
+        }
+      });
+    });
+    req.on('error', () => resolve([]));
+    req.on('timeout', () => { req.destroy(); resolve([]); });
+  });
+}
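`defaultFetchModels` above accepts two wire formats without knowing which endpoint it hit. The response handling, restated as a pure function so the two shapes sit side by side (the name `parseModelList` is ours, not in the diff):

```typescript
// Pure re-statement of defaultFetchModels' body parsing.
function parseModelList(raw: string): string[] {
  try {
    const data = JSON.parse(raw) as { models?: Array<{ name: string }>; data?: Array<{ id: string }> };
    if (data.models) return data.models.map((m) => m.name); // Ollama: { models: [{ name }] }
    if (data.data) return data.data.map((m) => m.id);       // OpenAI/vLLM: { data: [{ id }] }
    return [];
  } catch {
    return []; // unparseable body is treated the same as "no models"
  }
}
```

Collapsing network errors, timeouts, and bad JSON all into `[]` keeps the setup wizard interactive: the caller falls back to a free-text model prompt instead of aborting.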
+
+async function defaultSelect<T>(message: string, choices: Array<{ name: string; value: T; description?: string }>): Promise<T> {
+  const { default: inquirer } = await import('inquirer');
+  const { answer } = await inquirer.prompt([{
+    type: 'list',
+    name: 'answer',
+    message,
+    choices: choices.map((c) => ({
+      name: c.description ? `${c.name} — ${c.description}` : c.name,
+      value: c.value,
+      short: c.name,
+    })),
+  }]);
+  return answer as T;
+}
+
+async function defaultInput(message: string, defaultValue?: string): Promise<string> {
+  const { default: inquirer } = await import('inquirer');
+  const { answer } = await inquirer.prompt([{
+    type: 'input',
+    name: 'answer',
+    message,
+    default: defaultValue,
+  }]);
+  return answer as string;
+}
+
+async function defaultPassword(message: string): Promise<string> {
+  const { default: inquirer } = await import('inquirer');
+  const { answer } = await inquirer.prompt([{ type: 'password', name: 'answer', message }]);
+  return answer as string;
+}
+
+async function defaultConfirm(message: string, defaultValue?: boolean): Promise<boolean> {
+  const { default: inquirer } = await import('inquirer');
+  const { answer } = await inquirer.prompt([{
+    type: 'confirm',
+    name: 'answer',
+    message,
+    default: defaultValue ?? true,
+  }]);
+  return answer as boolean;
+}
+
+const defaultPrompt: ConfigSetupPrompt = {
+  select: defaultSelect,
+  input: defaultInput,
+  password: defaultPassword,
+  confirm: defaultConfirm,
+};
+
+async function defaultWhichBinary(name: string): Promise<string | null> {
+  try {
+    const { stdout } = await execFileAsync('which', [name], { timeout: 3000 });
+    const path = stdout.trim();
+    return path || null;
+  } catch {
+    return null;
+  }
+}
+
+// --- Per-provider setup functions (return ProviderFields for reuse in both modes) ---
+
+async function setupGeminiCliFields(
+  prompt: ConfigSetupPrompt,
+  log: (...args: string[]) => void,
+  whichBinary: (name: string) => Promise<string | null>,
+  currentModel?: string,
+): Promise<ProviderFields> {
+  const model = await prompt.select<string>('Select model:', [
+    ...GEMINI_MODELS.map((m) => ({
+      name: m === currentModel ? `${m} (current)` : m,
+      value: m,
+    })),
+    { name: 'Custom...', value: '__custom__' },
+  ]);
+
+  const finalModel = model === '__custom__'
+    ? await prompt.input('Model name:', currentModel)
+    : model;
+
+  let binaryPath: string | undefined;
+  const detected = await whichBinary('gemini');
+  if (detected) {
+    log(`Found gemini at: ${detected}`);
+    binaryPath = detected;
+  } else {
+    log('Warning: gemini binary not found in PATH');
+    const manualPath = await prompt.input('Binary path (or install with: npm i -g @google/gemini-cli):');
+    if (manualPath) binaryPath = manualPath;
+  }
+
+  const result: ProviderFields = { model: finalModel };
+  if (binaryPath) result.binaryPath = binaryPath;
+  return result;
+}
+
+async function setupOllamaFields(
+  prompt: ConfigSetupPrompt,
+  fetchModels: ConfigSetupDeps['fetchModels'],
+  currentUrl?: string,
+  currentModel?: string,
+): Promise<ProviderFields> {
+  const url = await prompt.input('Ollama URL:', currentUrl ?? 'http://localhost:11434');
+  const models = await fetchModels(url, '/api/tags');
+  let model: string;
+
+  if (models.length > 0) {
+    const choices = models.map((m) => ({
+      name: m === currentModel ? `${m} (current)` : m,
+      value: m,
+    }));
+    choices.push({ name: 'Custom...', value: '__custom__' });
+    model = await prompt.select<string>('Select model:', choices);
+    if (model === '__custom__') {
+      model = await prompt.input('Model name:', currentModel);
+    }
+  } else {
+    model = await prompt.input('Model name (could not fetch models):', currentModel ?? 'llama3.2');
+  }
+
+  const result: ProviderFields = { model };
+  if (url) result.url = url;
+  return result;
+}
+
+async function setupVllmFields(
+  prompt: ConfigSetupPrompt,
+  fetchModels: ConfigSetupDeps['fetchModels'],
+  currentUrl?: string,
+  currentModel?: string,
+): Promise<ProviderFields> {
+  const url = await prompt.input('vLLM URL:', currentUrl ?? 'http://localhost:8000');
+  const models = await fetchModels(url, '/v1/models');
+  let model: string;
+
+  if (models.length > 0) {
+    const choices = models.map((m) => ({
+      name: m === currentModel ? `${m} (current)` : m,
+      value: m,
+    }));
+    choices.push({ name: 'Custom...', value: '__custom__' });
+    model = await prompt.select<string>('Select model:', choices);
+    if (model === '__custom__') {
+      model = await prompt.input('Model name:', currentModel);
+    }
+  } else {
+    model = await prompt.input('Model name (could not fetch models):', currentModel ?? 'default');
+  }
+
+  const result: ProviderFields = { model };
+  if (url) result.url = url;
+  return result;
+}
+
+async function setupApiKeyFields(
+  prompt: ConfigSetupPrompt,
+  secretStore: SecretStore,
+  provider: LlmProviderName,
+  secretKey: string,
+  hardcodedModels: string[],
+  currentModel?: string,
+  currentUrl?: string,
+): Promise<ProviderFields> {
+  const existingKey = await secretStore.get(secretKey);
+  let apiKey: string;
+
+  if (existingKey) {
+    const masked = `****${existingKey.slice(-4)}`;
+    const changeKey = await prompt.confirm(`API key stored (${masked}). Change it?`, false);
+    apiKey = changeKey ? await prompt.password('API key:') : existingKey;
+  } else {
+    apiKey = await prompt.password('API key:');
+  }
+
+  if (apiKey !== existingKey) {
+    await secretStore.set(secretKey, apiKey);
+  }
+
+  let model: string;
+  if (hardcodedModels.length > 0) {
+    const choices = hardcodedModels.map((m) => ({
+      name: m === currentModel ? `${m} (current)` : m,
+      value: m,
+    }));
+    choices.push({ name: 'Custom...', value: '__custom__' });
+    model = await prompt.select<string>('Select model:', choices);
+    if (model === '__custom__') {
+      model = await prompt.input('Model name:', currentModel);
+    }
+  } else {
+    model = await prompt.input('Model name:', currentModel ?? 'gpt-4o');
+  }
+
+  let url: string | undefined;
+  if (provider === 'openai') {
+    const customUrl = await prompt.confirm('Use custom API endpoint?', false);
+    if (customUrl) {
+      url = await prompt.input('API URL:', currentUrl ?? 'https://api.openai.com');
+    }
+  }
+
+  const result: ProviderFields = { model };
+  if (url) result.url = url;
+  return result;
+}
+
+/** Configure a single provider type and return its fields. */
+async function setupProviderFields(
+  providerType: LlmProviderName,
+  prompt: ConfigSetupPrompt,
+  log: (...args: string[]) => void,
+  fetchModels: ConfigSetupDeps['fetchModels'],
+  whichBinary: (name: string) => Promise<string | null>,
+  secretStore: SecretStore,
+): Promise<ProviderFields> {
+  switch (providerType) {
+    case 'gemini-cli':
+      return setupGeminiCliFields(prompt, log, whichBinary);
+    case 'ollama':
+      return setupOllamaFields(prompt, fetchModels);
+    case 'vllm':
+      return setupVllmFields(prompt, fetchModels);
+    case 'anthropic':
+      return setupApiKeyFields(prompt, secretStore, 'anthropic', 'anthropic-api-key', ANTHROPIC_MODELS);
+    case 'openai':
+      return setupApiKeyFields(prompt, secretStore, 'openai', 'openai-api-key', []);
+    case 'deepseek':
+      return setupApiKeyFields(prompt, secretStore, 'deepseek', 'deepseek-api-key', DEEPSEEK_MODELS);
+    default:
+      return {};
+  }
+}
+
+/** Build a LlmProviderEntry from type, name, and fields. */
+function buildEntry(providerType: LlmProviderName, name: string, fields: ProviderFields, tier?: LlmTier): LlmProviderEntry {
+  const entry: LlmProviderEntry = { name, type: providerType };
+  if (fields.model) entry.model = fields.model;
+  if (fields.url) entry.url = fields.url;
+  if (fields.binaryPath) entry.binaryPath = fields.binaryPath;
+  if (tier) entry.tier = tier;
+  return entry;
+}
|
||||
|
||||
/** Simple mode: single provider (legacy format). */
|
||||
async function simpleSetup(
|
||||
config: McpctlConfig,
|
||||
configDeps: Partial<ConfigLoaderDeps>,
|
||||
prompt: ConfigSetupPrompt,
|
||||
log: (...args: string[]) => void,
|
||||
fetchModels: ConfigSetupDeps['fetchModels'],
|
||||
whichBinary: (name: string) => Promise<string | null>,
|
||||
secretStore: SecretStore,
|
||||
): Promise<void> {
|
||||
const currentLlm = config.llm && 'provider' in config.llm ? config.llm : undefined;
|
||||
|
||||
const choices = ALL_PROVIDER_CHOICES.map((c) => {
|
||||
if (currentLlm?.provider === c.value) {
|
||||
return { ...c, name: `${c.name} (current)` };
|
||||
}
|
||||
return c;
|
||||
});
|
||||
|
||||
const provider = await prompt.select<LlmProviderName>('Select LLM provider:', choices);
|
||||
|
||||
if (provider === 'none') {
|
||||
const updated: McpctlConfig = { ...config, llm: { provider: 'none' } };
|
||||
saveConfig(updated, configDeps);
|
||||
log('LLM disabled. Restart mcplocal: systemctl --user restart mcplocal');
|
||||
return;
|
||||
}
|
||||
|
||||
const fields = await setupProviderFields(provider, prompt, log, fetchModels, whichBinary, secretStore);
|
||||
const llmConfig: LlmConfig = { provider, ...fields };
|
||||
const updated: McpctlConfig = { ...config, llm: llmConfig };
|
||||
saveConfig(updated, configDeps);
|
||||
log(`\nLLM configured: ${llmConfig.provider}${llmConfig.model ? ` / ${llmConfig.model}` : ''}`);
|
||||
log('Restart mcplocal: systemctl --user restart mcplocal');
|
||||
}
|
||||
|
||||
/** Advanced mode: multiple providers with tier assignments. */
|
||||
async function advancedSetup(
|
||||
config: McpctlConfig,
|
||||
configDeps: Partial<ConfigLoaderDeps>,
|
||||
prompt: ConfigSetupPrompt,
|
||||
log: (...args: string[]) => void,
|
||||
fetchModels: ConfigSetupDeps['fetchModels'],
|
||||
whichBinary: (name: string) => Promise<string | null>,
|
||||
secretStore: SecretStore,
|
||||
): Promise<void> {
|
||||
const entries: LlmProviderEntry[] = [];
|
||||
|
||||
// Fast providers
|
||||
const addFast = await prompt.confirm('Add a FAST provider? (vLLM, Ollama — local, cheap, fast)', true);
|
||||
if (addFast) {
|
||||
let addMore = true;
|
||||
while (addMore) {
|
||||
const providerType = await prompt.select<LlmProviderName>('Fast provider type:', FAST_PROVIDER_CHOICES);
|
||||
const defaultName = providerType === 'vllm' ? 'vllm-local' : providerType;
|
||||
const name = await prompt.input('Provider name:', defaultName);
|
||||
const fields = await setupProviderFields(providerType, prompt, log, fetchModels, whichBinary, secretStore);
|
||||
entries.push(buildEntry(providerType, name, fields, 'fast'));
|
||||
log(` Added: ${name} (${providerType}) → fast tier`);
|
||||
addMore = await prompt.confirm('Add another fast provider?', false);
|
||||
}
|
||||
}
|
||||
|
||||
// Heavy providers
|
||||
const addHeavy = await prompt.confirm('Add a HEAVY provider? (Gemini, Anthropic, OpenAI — cloud, smart)', true);
|
||||
if (addHeavy) {
|
||||
let addMore = true;
|
||||
while (addMore) {
|
||||
const providerType = await prompt.select<LlmProviderName>('Heavy provider type:', HEAVY_PROVIDER_CHOICES);
|
||||
const defaultName = providerType;
|
||||
const name = await prompt.input('Provider name:', defaultName);
|
||||
const fields = await setupProviderFields(providerType, prompt, log, fetchModels, whichBinary, secretStore);
|
||||
entries.push(buildEntry(providerType, name, fields, 'heavy'));
|
||||
log(` Added: ${name} (${providerType}) → heavy tier`);
|
||||
addMore = await prompt.confirm('Add another heavy provider?', false);
|
||||
}
|
||||
}
|
||||
|
||||
if (entries.length === 0) {
|
||||
log('No providers configured.');
|
||||
return;
|
||||
}
|
||||
|
||||
// Summary
|
||||
log('\nProvider configuration:');
|
||||
for (const e of entries) {
|
||||
log(` ${e.tier ?? 'unassigned'}: ${e.name} (${e.type})${e.model ? ` / ${e.model}` : ''}`);
|
||||
}
|
||||
|
||||
const updated: McpctlConfig = { ...config, llm: { providers: entries } };
|
||||
saveConfig(updated, configDeps);
|
||||
log('\nRestart mcplocal: systemctl --user restart mcplocal');
|
||||
}
|
||||
|
||||
export function createConfigSetupCommand(deps?: Partial<ConfigSetupDeps>): Command {
|
||||
return new Command('setup')
|
||||
.description('Interactive LLM provider setup wizard')
|
||||
.action(async () => {
|
||||
const configDeps = deps?.configDeps ?? {};
|
||||
const log = deps?.log ?? ((...args: string[]) => console.log(...args));
|
||||
const prompt = deps?.prompt ?? defaultPrompt;
|
||||
const fetchModels = deps?.fetchModels ?? defaultFetchModels;
|
||||
const whichBinary = deps?.whichBinary ?? defaultWhichBinary;
|
||||
const secretStore = deps?.secretStore ?? await createSecretStore();
|
||||
|
||||
const config = loadConfig(configDeps);
|
||||
|
||||
const mode = await prompt.select<'simple' | 'advanced'>('Setup mode:', [
|
||||
{ name: 'Simple', value: 'simple', description: 'One provider for everything' },
|
||||
{ name: 'Advanced', value: 'advanced', description: 'Multiple providers with fast/heavy tiers' },
|
||||
]);
|
||||
|
||||
if (mode === 'simple') {
|
||||
await simpleSetup(config, configDeps, prompt, log, fetchModels, whichBinary, secretStore);
|
||||
} else {
|
||||
await advancedSetup(config, configDeps, prompt, log, fetchModels, whichBinary, secretStore);
|
||||
}
|
||||
});
|
||||
}
|
||||
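For orientation, `advancedSetup` above persists the tiered providers under `llm.providers`, while `simpleSetup` keeps the legacy single-provider shape. A sketch of the resulting config, assuming JSON serialization; the provider names, models, and URL below are placeholders, and only the field names (`name`, `type`, `model`, `url`, `tier`) come from the code above:

```json
{
  "llm": {
    "providers": [
      { "name": "vllm-local", "type": "vllm", "model": "example-model", "url": "http://localhost:8000", "tier": "fast" },
      { "name": "anthropic", "type": "anthropic", "model": "example-model", "tier": "heavy" }
    ]
  }
}
```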
@@ -1,19 +1,36 @@
import { Command } from 'commander';
import { writeFileSync, readFileSync, existsSync } from 'node:fs';
import { resolve, join } from 'node:path';
import { homedir } from 'node:os';
import { loadConfig, saveConfig, mergeConfig, getConfigPath, DEFAULT_CONFIG } from '../config/index.js';
import type { McpctlConfig, ConfigLoaderDeps } from '../config/index.js';
import { formatJson, formatYaml } from '../formatters/index.js';
import { saveCredentials, loadCredentials } from '../auth/index.js';
import { createConfigSetupCommand } from './config-setup.js';
import type { CredentialsDeps, StoredCredentials } from '../auth/index.js';
import type { ApiClient } from '../api-client.js';

interface McpConfig {
  mcpServers: Record<string, { command?: string; args?: string[]; url?: string; env?: Record<string, string> }>;
}

export interface ConfigCommandDeps {
  configDeps: Partial<ConfigLoaderDeps>;
  log: (...args: string[]) => void;
}

export interface ConfigApiDeps {
  client: ApiClient;
  credentialsDeps: Partial<CredentialsDeps>;
  log: (...args: string[]) => void;
}

const defaultDeps: ConfigCommandDeps = {
  configDeps: {},
  log: (...args) => console.log(...args),
};

-export function createConfigCommand(deps?: Partial<ConfigCommandDeps>): Command {
+export function createConfigCommand(deps?: Partial<ConfigCommandDeps>, apiDeps?: ConfigApiDeps): Command {
  const { configDeps, log } = { ...defaultDeps, ...deps };

  const config = new Command('config').description('Manage mcpctl configuration');
@@ -68,5 +85,134 @@ export function createConfigCommand(deps?: Partial<ConfigCommandDeps>): Command
      log('Configuration reset to defaults');
    });

  // claude/claude-generate: generate .mcp.json pointing at mcpctl mcp bridge
  function registerClaudeCommand(name: string, hidden: boolean): void {
    const cmd = config
      .command(name)
      .description(hidden ? '' : 'Generate .mcp.json that connects a project via mcpctl mcp bridge')
      .requiredOption('--project <name>', 'Project name')
      .option('-o, --output <path>', 'Output file path', '.mcp.json')
      .option('--merge', 'Merge with existing .mcp.json instead of overwriting')
      .option('--stdout', 'Print to stdout instead of writing a file')
      .action((opts: { project: string; output: string; merge?: boolean; stdout?: boolean }) => {
        const mcpConfig: McpConfig = {
          mcpServers: {
            [opts.project]: {
              command: 'mcpctl',
              args: ['mcp', '-p', opts.project],
            },
          },
        };

        if (opts.stdout) {
          log(JSON.stringify(mcpConfig, null, 2));
          return;
        }

        const outputPath = resolve(opts.output);
        let finalConfig = mcpConfig;

        if (opts.merge && existsSync(outputPath)) {
          try {
            const existing = JSON.parse(readFileSync(outputPath, 'utf-8')) as McpConfig;
            finalConfig = {
              mcpServers: {
                ...existing.mcpServers,
                ...mcpConfig.mcpServers,
              },
            };
          } catch {
            // If existing file is invalid, just overwrite
          }
        }

        writeFileSync(outputPath, JSON.stringify(finalConfig, null, 2) + '\n');
        const serverCount = Object.keys(finalConfig.mcpServers).length;
        log(`Wrote ${outputPath} (${serverCount} server(s))`);
      });
    if (hidden) {
      // Commander shows empty-description commands but they won't clutter help output
      void cmd; // suppress unused lint
    }
  }

  registerClaudeCommand('claude', false);
  registerClaudeCommand('claude-generate', true); // backward compat

  config.addCommand(createConfigSetupCommand({ configDeps }));

  if (apiDeps) {
    const { client, credentialsDeps, log: apiLog } = apiDeps;

    config
      .command('impersonate')
      .description('Impersonate another user or return to original identity')
      .argument('[email]', 'Email of user to impersonate')
      .option('--quit', 'Stop impersonating and return to original identity')
      .action(async (email: string | undefined, opts: { quit?: boolean }) => {
        const configDir = credentialsDeps?.configDir ?? join(homedir(), '.mcpctl');
        const backupPath = join(configDir, 'credentials-backup');

        if (opts.quit) {
          if (!existsSync(backupPath)) {
            apiLog('No impersonation session to quit');
            process.exitCode = 1;
            return;
          }

          const backupRaw = readFileSync(backupPath, 'utf-8');
          const backup = JSON.parse(backupRaw) as StoredCredentials;
          saveCredentials(backup, credentialsDeps);

          // Remove backup file
          const { unlinkSync } = await import('node:fs');
          unlinkSync(backupPath);

          apiLog(`Returned to ${backup.user}`);
          return;
        }

        if (!email) {
          apiLog('Email is required when not using --quit');
          process.exitCode = 1;
          return;
        }

        // Save current credentials as backup
        const currentCreds = loadCredentials(credentialsDeps);
        if (!currentCreds) {
          apiLog('Not logged in. Run "mcpctl login" first.');
          process.exitCode = 1;
          return;
        }

        writeFileSync(backupPath, JSON.stringify(currentCreds, null, 2) + '\n', 'utf-8');

        try {
          const result = await client.post<{ token: string; user: { email: string } }>(
            '/api/v1/auth/impersonate',
            { email },
          );

          saveCredentials({
            token: result.token,
            mcpdUrl: currentCreds.mcpdUrl,
            user: result.user.email,
          }, credentialsDeps);

          apiLog(`Impersonating ${result.user.email}. Use 'mcpctl config impersonate --quit' to return.`);
        } catch (err) {
          // Restore backup on failure
          const backup = JSON.parse(readFileSync(backupPath, 'utf-8')) as StoredCredentials;
          saveCredentials(backup, credentialsDeps);
          const { unlinkSync } = await import('node:fs');
          unlinkSync(backupPath);

          apiLog(`Impersonate failed: ${(err as Error).message}`);
          process.exitCode = 1;
        }
      });
  }

  return config;
}
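The `claude` command above builds the config object and serializes it with `JSON.stringify(..., null, 2)`, so a run like `mcpctl config claude --project payments --stdout` should print JSON of the following shape (the project name `payments` is illustrative):

```json
{
  "mcpServers": {
    "payments": {
      "command": "mcpctl",
      "args": ["mcp", "-p", "payments"]
    }
  }
}
```

With `--merge`, entries from an existing `.mcp.json` are kept and the new project key wins on collision, since the new `mcpServers` is spread last.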
@@ -55,7 +55,7 @@ export function createCreateCommand(deps: CreateCommandDeps): Command {
  const { client, log } = deps;

  const cmd = new Command('create')
-    .description('Create a resource (server, project)');
+    .description('Create a resource (server, secret, project, user, group, rbac)');

  // --- create server ---
  cmd.command('server')
@@ -195,19 +195,31 @@ export function createCreateCommand(deps: CreateCommandDeps): Command {
    .description('Create a project')
    .argument('<name>', 'Project name')
    .option('-d, --description <text>', 'Project description', '')
    .option('--proxy-mode <mode>', 'Proxy mode (direct, filtered)')
    .option('--prompt <text>', 'Project-level prompt / instructions for the LLM')
    .option('--gated', 'Enable gated sessions (default: true)')
    .option('--no-gated', 'Disable gated sessions')
    .option('--server <name>', 'Server name (repeat for multiple)', collect, [])
    .option('--force', 'Update if already exists')
    .action(async (name: string, opts) => {
      const body: Record<string, unknown> = {
        name,
        description: opts.description,
        proxyMode: opts.proxyMode ?? 'direct',
      };
      if (opts.prompt) body.prompt = opts.prompt;
      if (opts.gated !== undefined) body.gated = opts.gated as boolean;
      if (opts.server.length > 0) body.servers = opts.server;

      try {
-        const project = await client.post<{ id: string; name: string }>('/api/v1/projects', {
-          name,
-          description: opts.description,
-        });
+        const project = await client.post<{ id: string; name: string }>('/api/v1/projects', body);
        log(`project '${project.name}' created (id: ${project.id})`);
      } catch (err) {
        if (err instanceof ApiError && err.status === 409 && opts.force) {
          const existing = (await client.get<Array<{ id: string; name: string }>>('/api/v1/projects')).find((p) => p.name === name);
          if (!existing) throw err;
-          await client.put(`/api/v1/projects/${existing.id}`, { description: opts.description });
+          const { name: _n, ...updateBody } = body;
+          await client.put(`/api/v1/projects/${existing.id}`, updateBody);
          log(`project '${name}' updated (id: ${existing.id})`);
        } else {
          throw err;
@@ -215,5 +227,206 @@ export function createCreateCommand(deps: CreateCommandDeps): Command {
        }
      });

  // --- create user ---
  cmd.command('user')
    .description('Create a user')
    .argument('<email>', 'User email address')
    .option('--password <pass>', 'User password')
    .option('--name <name>', 'User display name')
    .option('--force', 'Update if already exists')
    .action(async (email: string, opts) => {
      if (!opts.password) {
        throw new Error('--password is required');
      }
      const body: Record<string, unknown> = {
        email,
        password: opts.password,
      };
      if (opts.name) body.name = opts.name;

      try {
        const user = await client.post<{ id: string; email: string }>('/api/v1/users', body);
        log(`user '${user.email}' created (id: ${user.id})`);
      } catch (err) {
        if (err instanceof ApiError && err.status === 409 && opts.force) {
          const existing = (await client.get<Array<{ id: string; email: string }>>('/api/v1/users')).find((u) => u.email === email);
          if (!existing) throw err;
          const { email: _e, ...updateBody } = body;
          await client.put(`/api/v1/users/${existing.id}`, updateBody);
          log(`user '${email}' updated (id: ${existing.id})`);
        } else {
          throw err;
        }
      }
    });

  // --- create group ---
  cmd.command('group')
    .description('Create a group')
    .argument('<name>', 'Group name')
    .option('--description <text>', 'Group description')
    .option('--member <email>', 'Member email (repeat for multiple)', collect, [])
    .option('--force', 'Update if already exists')
    .action(async (name: string, opts) => {
      const body: Record<string, unknown> = {
        name,
        members: opts.member,
      };
      if (opts.description) body.description = opts.description;

      try {
        const group = await client.post<{ id: string; name: string }>('/api/v1/groups', body);
        log(`group '${group.name}' created (id: ${group.id})`);
      } catch (err) {
        if (err instanceof ApiError && err.status === 409 && opts.force) {
          const existing = (await client.get<Array<{ id: string; name: string }>>('/api/v1/groups')).find((g) => g.name === name);
          if (!existing) throw err;
          const { name: _n, ...updateBody } = body;
          await client.put(`/api/v1/groups/${existing.id}`, updateBody);
          log(`group '${name}' updated (id: ${existing.id})`);
        } else {
          throw err;
        }
      }
    });

  // --- create rbac ---
  cmd.command('rbac')
    .description('Create an RBAC binding definition')
    .argument('<name>', 'RBAC binding name')
    .option('--subject <entry>', 'Subject as Kind:name (repeat for multiple)', collect, [])
    .option('--binding <entry>', 'Role binding as role:resource (e.g. edit:servers, run:projects)', collect, [])
    .option('--operation <action>', 'Operation binding (e.g. logs, backup)', collect, [])
    .option('--force', 'Update if already exists')
    .action(async (name: string, opts) => {
      const subjects = (opts.subject as string[]).map((entry: string) => {
        const colonIdx = entry.indexOf(':');
        if (colonIdx === -1) {
          throw new Error(`Invalid subject format '${entry}'. Expected Kind:name (e.g. User:alice@example.com)`);
        }
        return { kind: entry.slice(0, colonIdx), name: entry.slice(colonIdx + 1) };
      });

      const roleBindings: Array<Record<string, string>> = [];

      // Resource bindings from --binding flag (role:resource or role:resource:name)
      for (const entry of opts.binding as string[]) {
        const parts = entry.split(':');
        if (parts.length === 2) {
          roleBindings.push({ role: parts[0]!, resource: parts[1]! });
        } else if (parts.length === 3) {
          roleBindings.push({ role: parts[0]!, resource: parts[1]!, name: parts[2]! });
        } else {
          throw new Error(`Invalid binding format '${entry}'. Expected role:resource or role:resource:name (e.g. edit:servers, view:servers:my-ha)`);
        }
      }

      // Operation bindings from --operation flag
      for (const action of opts.operation as string[]) {
        roleBindings.push({ role: 'run', action });
      }

      const body: Record<string, unknown> = {
        name,
        subjects,
        roleBindings,
      };

      try {
        const rbac = await client.post<{ id: string; name: string }>('/api/v1/rbac', body);
        log(`rbac '${rbac.name}' created (id: ${rbac.id})`);
      } catch (err) {
        if (err instanceof ApiError && err.status === 409 && opts.force) {
          const existing = (await client.get<Array<{ id: string; name: string }>>('/api/v1/rbac')).find((r) => r.name === name);
          if (!existing) throw err;
          const { name: _n, ...updateBody } = body;
          await client.put(`/api/v1/rbac/${existing.id}`, updateBody);
          log(`rbac '${name}' updated (id: ${existing.id})`);
        } else {
          throw err;
        }
      }
    });

  // --- create prompt ---
  cmd.command('prompt')
    .description('Create an approved prompt')
    .argument('<name>', 'Prompt name (lowercase alphanumeric with hyphens)')
    .option('--project <name>', 'Project name to scope the prompt to')
    .option('--content <text>', 'Prompt content text')
    .option('--content-file <path>', 'Read prompt content from file')
    .option('--priority <number>', 'Priority 1-10 (default: 5, higher = more important)')
    .option('--link <target>', 'Link to MCP resource (format: project/server:uri)')
    .action(async (name: string, opts) => {
      let content = opts.content as string | undefined;
      if (opts.contentFile) {
        const fs = await import('node:fs/promises');
        content = await fs.readFile(opts.contentFile as string, 'utf-8');
      }
      if (!content) {
        throw new Error('--content or --content-file is required');
      }

      const body: Record<string, unknown> = { name, content };
      if (opts.project) {
        // Resolve project name to ID
        const projects = await client.get<Array<{ id: string; name: string }>>('/api/v1/projects');
        const project = projects.find((p) => p.name === opts.project);
        if (!project) throw new Error(`Project '${opts.project as string}' not found`);
        body.projectId = project.id;
      }
      if (opts.priority) {
        const priority = Number(opts.priority);
        if (isNaN(priority) || priority < 1 || priority > 10) {
          throw new Error('--priority must be a number between 1 and 10');
        }
        body.priority = priority;
      }
      if (opts.link) {
        body.linkTarget = opts.link;
      }

      const prompt = await client.post<{ id: string; name: string }>('/api/v1/prompts', body);
      log(`prompt '${prompt.name}' created (id: ${prompt.id})`);
    });

  // --- create promptrequest ---
  cmd.command('promptrequest')
    .description('Create a prompt request (pending proposal that needs approval)')
    .argument('<name>', 'Prompt request name (lowercase alphanumeric with hyphens)')
    .option('--project <name>', 'Project name to scope the prompt request to')
    .option('--content <text>', 'Prompt content text')
    .option('--content-file <path>', 'Read prompt content from file')
    .option('--priority <number>', 'Priority 1-10 (default: 5, higher = more important)')
    .action(async (name: string, opts) => {
      let content = opts.content as string | undefined;
      if (opts.contentFile) {
        const fs = await import('node:fs/promises');
        content = await fs.readFile(opts.contentFile as string, 'utf-8');
      }
      if (!content) {
        throw new Error('--content or --content-file is required');
      }

      const body: Record<string, unknown> = { name, content };
      if (opts.project) {
        body.project = opts.project;
      }
      if (opts.priority) {
        const priority = Number(opts.priority);
        if (isNaN(priority) || priority < 1 || priority > 10) {
          throw new Error('--priority must be a number between 1 and 10');
        }
        body.priority = priority;
      }

      const pr = await client.post<{ id: string; name: string }>(
        '/api/v1/promptrequests',
        body,
      );
      log(`prompt request '${pr.name}' created (id: ${pr.id})`);
      log(`  approve with: mcpctl approve promptrequest ${pr.name}`);
    });

  return cmd;
}
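The `create rbac` action above splits subjects on the first colon only (so emails survive) and accepts two- or three-part role bindings. A minimal standalone sketch of that parsing, extracted for illustration and not the actual CLI module:

```typescript
// Sketch of the --subject / --binding parsing used by `mcpctl create rbac`.
type Subject = { kind: string; name: string };
type RoleBinding = { role: string; resource?: string; name?: string };

function parseSubject(entry: string): Subject {
  // Split on the FIRST colon only, so 'User:alice@example.com' keeps the full email.
  const colonIdx = entry.indexOf(':');
  if (colonIdx === -1) throw new Error(`Invalid subject format '${entry}'`);
  return { kind: entry.slice(0, colonIdx), name: entry.slice(colonIdx + 1) };
}

function parseBinding(entry: string): RoleBinding {
  // role:resource grants the role on every resource of that type;
  // role:resource:name narrows it to one named resource.
  const parts = entry.split(':');
  if (parts.length === 2) return { role: parts[0]!, resource: parts[1]! };
  if (parts.length === 3) return { role: parts[0]!, resource: parts[1]!, name: parts[2]! };
  throw new Error(`Invalid binding format '${entry}'`);
}

console.log(parseSubject('User:alice@example.com')); // { kind: 'User', name: 'alice@example.com' }
console.log(parseBinding('view:servers:my-ha'));     // { role: 'view', resource: 'servers', name: 'my-ha' }
```

So a call like `mcpctl create rbac ops --subject Group:dev-team --binding view:servers:my-ha` would post one subject and one narrowed resource binding.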
@@ -11,7 +11,7 @@ export function createDeleteCommand(deps: DeleteCommandDeps): Command {
  const { client, log } = deps;

  return new Command('delete')
-    .description('Delete a resource (server, instance, profile, project)')
+    .description('Delete a resource (server, instance, secret, project, user, group, rbac)')
    .argument('<resource>', 'resource type')
    .argument('<id>', 'resource ID or name')
    .action(async (resourceArg: string, idOrName: string) => {
@@ -133,16 +133,55 @@ function formatInstanceDetail(instance: Record<string, unknown>, inspect?: Recor
|
||||
return lines.join('\n');
|
||||
}
|
||||
|
||||
function formatProjectDetail(project: Record<string, unknown>): string {
|
||||
function formatProjectDetail(
|
||||
project: Record<string, unknown>,
|
||||
prompts: Array<{ name: string; priority: number; linkTarget: string | null }> = [],
|
||||
): string {
|
||||
const lines: string[] = [];
|
||||
lines.push(`=== Project: ${project.name} ===`);
|
||||
lines.push(`${pad('Name:')}${project.name}`);
|
||||
if (project.description) lines.push(`${pad('Description:')}${project.description}`);
|
||||
if (project.ownerId) lines.push(`${pad('Owner:')}${project.ownerId}`);
|
||||
lines.push(`${pad('Gated:')}${project.gated ? 'yes' : 'no'}`);
|
||||
|
||||
// Proxy config section
|
||||
const proxyMode = project.proxyMode as string | undefined;
|
||||
const llmProvider = project.llmProvider as string | undefined;
|
||||
const llmModel = project.llmModel as string | undefined;
|
||||
if (proxyMode || llmProvider || llmModel) {
|
||||
lines.push('');
|
||||
lines.push('Proxy Config:');
|
||||
lines.push(` ${pad('Mode:', 18)}${proxyMode ?? 'direct'}`);
|
||||
if (llmProvider) lines.push(` ${pad('LLM Provider:', 18)}${llmProvider}`);
|
||||
if (llmModel) lines.push(` ${pad('LLM Model:', 18)}${llmModel}`);
|
||||
}
|
||||
|
||||
// Servers section
|
||||
const servers = project.servers as Array<{ server: { name: string } }> | undefined;
|
||||
if (servers && servers.length > 0) {
|
||||
lines.push('');
|
||||
lines.push('Servers:');
|
||||
lines.push(' NAME');
|
||||
for (const s of servers) {
|
||||
lines.push(` ${s.server.name}`);
|
||||
}
|
||||
}
|
||||
|
||||
// Prompts section
|
||||
if (prompts.length > 0) {
|
||||
lines.push('');
|
||||
lines.push('Prompts:');
|
||||
const nameW = Math.max(4, ...prompts.map((p) => p.name.length)) + 2;
|
||||
lines.push(` ${'NAME'.padEnd(nameW)}${'PRI'.padEnd(6)}TYPE`);
|
||||
for (const p of prompts) {
|
||||
const type = p.linkTarget ? 'link' : 'local';
|
||||
lines.push(` ${p.name.padEnd(nameW)}${String(p.priority).padEnd(6)}${type}`);
|
||||
}
|
||||
}
|
||||
|
||||
lines.push('');
|
||||
lines.push('Metadata:');
|
||||
lines.push(` ${pad('ID:', 12)}${project.id}`);
|
||||
if (project.ownerId) lines.push(` ${pad('Owner:', 12)}${project.ownerId}`);
|
||||
if (project.createdAt) lines.push(` ${pad('Created:', 12)}${project.createdAt}`);
|
||||
if (project.updatedAt) lines.push(` ${pad('Updated:', 12)}${project.updatedAt}`);
|
||||
|
||||
@@ -240,6 +279,231 @@ function formatTemplateDetail(template: Record<string, unknown>): string {
|
||||
return lines.join('\n');
|
||||
}
|
||||
|
||||
interface RbacBinding { role: string; resource?: string; action?: string; name?: string }
|
||||
interface RbacDef { name: string; subjects: Array<{ kind: string; name: string }>; roleBindings: RbacBinding[] }
|
||||
interface PermissionSet { source: string; bindings: RbacBinding[] }
|
||||
|
||||
function formatPermissionSections(sections: PermissionSet[]): string[] {
|
||||
const lines: string[] = [];
|
||||
for (const section of sections) {
|
||||
const bindings = section.bindings;
|
||||
if (bindings.length === 0) continue;
|
||||
|
||||
const resourceBindings = bindings.filter((b) => 'resource' in b && b.resource !== undefined);
|
||||
const operationBindings = bindings.filter((b) => 'action' in b && b.action !== undefined);
|
||||
|
||||
if (resourceBindings.length > 0) {
|
||||
lines.push('');
|
||||
lines.push(`${section.source} — Resources:`);
|
||||
const roleW = Math.max(6, ...resourceBindings.map((b) => b.role.length)) + 2;
|
||||
const resW = Math.max(10, ...resourceBindings.map((b) => (b.resource ?? '').length)) + 2;
|
||||
const hasName = resourceBindings.some((b) => b.name);
|
||||
if (hasName) {
|
||||
lines.push(` ${'ROLE'.padEnd(roleW)}${'RESOURCE'.padEnd(resW)}NAME`);
|
||||
} else {
|
||||
lines.push(` ${'ROLE'.padEnd(roleW)}RESOURCE`);
|
||||
}
|
||||
for (const b of resourceBindings) {
|
||||
if (hasName) {
|
||||
lines.push(` ${b.role.padEnd(roleW)}${(b.resource ?? '').padEnd(resW)}${b.name ?? '*'}`);
|
||||
} else {
|
||||
lines.push(` ${b.role.padEnd(roleW)}${b.resource}`);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
if (operationBindings.length > 0) {
|
||||
lines.push('');
|
||||
lines.push(`${section.source} — Operations:`);
|
||||
lines.push(` ${'ACTION'.padEnd(20)}ROLE`);
|
||||
for (const b of operationBindings) {
|
||||
lines.push(` ${(b.action ?? '').padEnd(20)}${b.role}`);
|
||||
}
|
||||
}
|
||||
}
|
||||
return lines;
|
||||
}
|
||||
|
||||
function collectBindingsForSubject(
|
||||
rbacDefs: RbacDef[],
|
||||
kind: string,
|
||||
name: string,
|
||||
): { rbacName: string; bindings: RbacBinding[] }[] {
|
||||
const results: { rbacName: string; bindings: RbacBinding[] }[] = [];
|
||||
for (const def of rbacDefs) {
|
||||
const matched = def.subjects.some((s) => s.kind === kind && s.name === name);
|
||||
if (matched) {
|
||||
results.push({ rbacName: def.name, bindings: def.roleBindings });
|
||||
}
|
||||
}
|
||||
return results;
|
||||
}
|
||||
|
||||
function formatUserDetail(
|
||||
user: Record<string, unknown>,
|
||||
rbacDefs?: RbacDef[],
|
||||
userGroups?: string[],
|
||||
): string {
|
||||
const lines: string[] = [];
|
||||
lines.push(`=== User: ${user.email} ===`);
|
||||
lines.push(`${pad('Email:')}${user.email}`);
|
||||
lines.push(`${pad('Name:')}${(user.name as string | null) ?? '-'}`);
|
||||
lines.push(`${pad('Provider:')}${(user.provider as string | null) ?? 'local'}`);
|
||||
|
||||
if (userGroups && userGroups.length > 0) {
|
||||
lines.push(`${pad('Groups:')}${userGroups.join(', ')}`);
|
||||
}
|
||||
|
||||
if (rbacDefs) {
|
||||
const email = user.email as string;
|
||||
|
||||
// Direct permissions (User:email subjects)
|
||||
const directMatches = collectBindingsForSubject(rbacDefs, 'User', email);
|
||||
const directBindings = directMatches.flatMap((m) => m.bindings);
|
||||
const directSources = directMatches.map((m) => m.rbacName).join(', ');
|
||||
|
||||
// Inherited permissions (Group:name subjects)
|
||||
const inheritedSections: PermissionSet[] = [];
|
||||
if (userGroups) {
|
||||
for (const groupName of userGroups) {
|
||||
const groupMatches = collectBindingsForSubject(rbacDefs, 'Group', groupName);
|
||||
const groupBindings = groupMatches.flatMap((m) => m.bindings);
|
||||
if (groupBindings.length > 0) {
|
||||
        inheritedSections.push({ source: `Inherited (${groupName})`, bindings: groupBindings });
      }
    }
  }

  const sections: PermissionSet[] = [];
  if (directBindings.length > 0) {
    sections.push({ source: `Direct (${directSources})`, bindings: directBindings });
  }
  sections.push(...inheritedSections);

  if (sections.length > 0) {
    lines.push('');
    lines.push('Access:');
    lines.push(...formatPermissionSections(sections));
  } else {
    lines.push('');
    lines.push('Access: (none)');
  }
}

lines.push('');
lines.push('Metadata:');
lines.push(` ${pad('ID:', 12)}${user.id}`);
if (user.createdAt) lines.push(` ${pad('Created:', 12)}${user.createdAt}`);
if (user.updatedAt) lines.push(` ${pad('Updated:', 12)}${user.updatedAt}`);

return lines.join('\n');
}

function formatGroupDetail(group: Record<string, unknown>, rbacDefs?: RbacDef[]): string {
  const lines: string[] = [];
  lines.push(`=== Group: ${group.name} ===`);
  lines.push(`${pad('Name:')}${group.name}`);
  if (group.description) lines.push(`${pad('Description:')}${group.description}`);

  const members = group.members as Array<{ user: { email: string }; createdAt?: string }> | undefined;
  if (members && members.length > 0) {
    lines.push('');
    lines.push('Members:');
    const emailW = Math.max(6, ...members.map((m) => m.user.email.length)) + 2;
    lines.push(`  ${'EMAIL'.padEnd(emailW)}ADDED`);
    for (const m of members) {
      const added = (m.createdAt as string | undefined) ?? '-';
      lines.push(`  ${m.user.email.padEnd(emailW)}${added}`);
    }
  }

  if (rbacDefs) {
    const groupName = group.name as string;
    const matches = collectBindingsForSubject(rbacDefs, 'Group', groupName);
    const allBindings = matches.flatMap((m) => m.bindings);
    const sources = matches.map((m) => m.rbacName).join(', ');

    if (allBindings.length > 0) {
      const sections: PermissionSet[] = [{ source: `Granted (${sources})`, bindings: allBindings }];
      lines.push('');
      lines.push('Access:');
      lines.push(...formatPermissionSections(sections));
    } else {
      lines.push('');
      lines.push('Access: (none)');
    }
  }

  lines.push('');
  lines.push('Metadata:');
  lines.push(`  ${pad('ID:', 12)}${group.id}`);
  if (group.createdAt) lines.push(`  ${pad('Created:', 12)}${group.createdAt}`);
  if (group.updatedAt) lines.push(`  ${pad('Updated:', 12)}${group.updatedAt}`);

  return lines.join('\n');
}

function formatRbacDetail(rbac: Record<string, unknown>): string {
  const lines: string[] = [];
  lines.push(`=== RBAC: ${rbac.name} ===`);
  lines.push(`${pad('Name:')}${rbac.name}`);

  const subjects = rbac.subjects as Array<{ kind: string; name: string }> | undefined;
  if (subjects && subjects.length > 0) {
    lines.push('');
    lines.push('Subjects:');
    const kindW = Math.max(6, ...subjects.map((s) => s.kind.length)) + 2;
    lines.push(`  ${'KIND'.padEnd(kindW)}NAME`);
    for (const s of subjects) {
      lines.push(`  ${s.kind.padEnd(kindW)}${s.name}`);
    }
  }

  const roleBindings = rbac.roleBindings as Array<{ role: string; resource?: string; action?: string; name?: string }> | undefined;
  if (roleBindings && roleBindings.length > 0) {
    // Separate resource bindings from operation bindings
    const resourceBindings = roleBindings.filter((b) => 'resource' in b && b.resource !== undefined);
    const operationBindings = roleBindings.filter((b) => 'action' in b && b.action !== undefined);

    if (resourceBindings.length > 0) {
      lines.push('');
      lines.push('Resource Bindings:');
      const roleW = Math.max(6, ...resourceBindings.map((b) => b.role.length)) + 2;
      const resW = Math.max(10, ...resourceBindings.map((b) => (b.resource ?? '').length)) + 2;
      const hasName = resourceBindings.some((b) => b.name);
      if (hasName) {
        lines.push(`  ${'ROLE'.padEnd(roleW)}${'RESOURCE'.padEnd(resW)}NAME`);
      } else {
        lines.push(`  ${'ROLE'.padEnd(roleW)}RESOURCE`);
      }
      for (const b of resourceBindings) {
        if (hasName) {
          lines.push(`  ${b.role.padEnd(roleW)}${(b.resource ?? '').padEnd(resW)}${b.name ?? '*'}`);
        } else {
          lines.push(`  ${b.role.padEnd(roleW)}${b.resource}`);
        }
      }
    }

    if (operationBindings.length > 0) {
      lines.push('');
      lines.push('Operations:');
      lines.push(`  ${'ACTION'.padEnd(20)}ROLE`);
      for (const b of operationBindings) {
        lines.push(`  ${(b.action ?? '').padEnd(20)}${b.role}`);
      }
    }
  }

  lines.push('');
  lines.push('Metadata:');
  lines.push(`  ${pad('ID:', 12)}${rbac.id}`);
  if (rbac.createdAt) lines.push(`  ${pad('Created:', 12)}${rbac.createdAt}`);
  if (rbac.updatedAt) lines.push(`  ${pad('Updated:', 12)}${rbac.updatedAt}`);

  return lines.join('\n');
}

function formatGenericDetail(obj: Record<string, unknown>): string {
  const lines: string[] = [];
  for (const [key, value] of Object.entries(obj)) {
@@ -338,8 +602,33 @@ export function createDescribeCommand(deps: DescribeCommandDeps): Command {
      case 'templates':
        deps.log(formatTemplateDetail(item));
        break;
      case 'projects':
        deps.log(formatProjectDetail(item));
      case 'projects': {
        const projectPrompts = await deps.client
          .get<Array<{ name: string; priority: number; linkTarget: string | null }>>(`/api/v1/prompts?projectId=${item.id as string}`)
          .catch(() => []);
        deps.log(formatProjectDetail(item, projectPrompts));
        break;
      }
      case 'users': {
        // Fetch RBAC definitions and groups to show permissions
        const [rbacDefsForUser, allGroupsForUser] = await Promise.all([
          deps.client.get<RbacDef[]>('/api/v1/rbac').catch(() => [] as RbacDef[]),
          deps.client.get<Array<{ name: string; members?: Array<{ user: { email: string } }> }>>('/api/v1/groups').catch(() => []),
        ]);
        const userEmail = item.email as string;
        const userGroupNames = allGroupsForUser
          .filter((g) => g.members?.some((m) => m.user.email === userEmail))
          .map((g) => g.name);
        deps.log(formatUserDetail(item, rbacDefsForUser, userGroupNames));
        break;
      }
      case 'groups': {
        const rbacDefsForGroup = await deps.client.get<RbacDef[]>('/api/v1/rbac').catch(() => [] as RbacDef[]);
        deps.log(formatGroupDetail(item, rbacDefsForGroup));
        break;
      }
      case 'rbac':
        deps.log(formatRbacDetail(item));
        break;
      default:
        deps.log(formatGenericDetail(item));

@@ -6,6 +6,7 @@ import { execSync } from 'node:child_process';
import yaml from 'js-yaml';
import type { ApiClient } from '../api-client.js';
import { resolveResource, resolveNameOrId, stripInternalFields } from './shared.js';
import { reorderKeys } from '../formatters/output.js';

export interface EditCommandDeps {
  client: ApiClient;
@@ -47,7 +48,7 @@ export function createEditCommand(deps: EditCommandDeps): Command {
        return;
      }

      const validResources = ['servers', 'secrets', 'projects'];
      const validResources = ['servers', 'secrets', 'projects', 'groups', 'rbac', 'prompts', 'promptrequests'];
      if (!validResources.includes(resource)) {
        log(`Error: unknown resource type '${resourceArg}'`);
        process.exitCode = 1;
@@ -61,7 +62,7 @@ export function createEditCommand(deps: EditCommandDeps): Command {
      const current = await client.get<Record<string, unknown>>(`/api/v1/${resource}/${id}`);

      // Strip read-only fields for editor
      const editable = stripInternalFields(current);
      const editable = reorderKeys(stripInternalFields(current)) as Record<string, unknown>;

      // Serialize to YAML
      const singular = resource.replace(/s$/, '');

@@ -5,7 +5,7 @@ import type { Column } from '../formatters/table.js';
import { resolveResource, stripInternalFields } from './shared.js';

export interface GetCommandDeps {
  fetchResource: (resource: string, id?: string) => Promise<unknown[]>;
  fetchResource: (resource: string, id?: string, opts?: { project?: string; all?: boolean }) => Promise<unknown[]>;
  log: (...args: string[]) => void;
}

@@ -21,7 +21,10 @@ interface ProjectRow {
  id: string;
  name: string;
  description: string;
  proxyMode: string;
  gated: boolean;
  ownerId: string;
  servers?: Array<{ server: { name: string } }>;
}

interface SecretRow {
@@ -57,10 +60,61 @@ const serverColumns: Column<ServerRow>[] = [
  { header: 'ID', key: 'id' },
];

interface UserRow {
  id: string;
  email: string;
  name: string | null;
  provider: string | null;
}

interface GroupRow {
  id: string;
  name: string;
  description: string;
  members?: Array<{ user: { email: string } }>;
}

interface RbacRow {
  id: string;
  name: string;
  subjects: Array<{ kind: string; name: string }>;
  roleBindings: Array<{ role: string; resource?: string; action?: string; name?: string }>;
}

const projectColumns: Column<ProjectRow>[] = [
  { header: 'NAME', key: 'name' },
  { header: 'MODE', key: (r) => r.proxyMode ?? 'direct', width: 10 },
  { header: 'GATED', key: (r) => r.gated ? 'yes' : 'no', width: 6 },
  { header: 'SERVERS', key: (r) => r.servers ? String(r.servers.length) : '0', width: 8 },
  { header: 'DESCRIPTION', key: 'description', width: 30 },
  { header: 'ID', key: 'id' },
];

const userColumns: Column<UserRow>[] = [
  { header: 'EMAIL', key: 'email' },
  { header: 'NAME', key: (r) => r.name ?? '-' },
  { header: 'PROVIDER', key: (r) => r.provider ?? 'local', width: 10 },
  { header: 'ID', key: 'id' },
];

const groupColumns: Column<GroupRow>[] = [
  { header: 'NAME', key: 'name' },
  { header: 'MEMBERS', key: (r) => r.members ? String(r.members.length) : '0', width: 8 },
  { header: 'DESCRIPTION', key: 'description', width: 40 },
  { header: 'OWNER', key: 'ownerId' },
  { header: 'ID', key: 'id' },
];

const rbacColumns: Column<RbacRow>[] = [
  { header: 'NAME', key: 'name' },
  { header: 'SUBJECTS', key: (r) => r.subjects.map((s) => `${s.kind}:${s.name}`).join(', '), width: 30 },
  { header: 'BINDINGS', key: (r) => r.roleBindings.map((b) => {
    if ('action' in b && b.action !== undefined) return `run>${b.action}`;
    if ('resource' in b && b.resource !== undefined) {
      const base = `${b.role}:${b.resource}`;
      return b.name ? `${base}:${b.name}` : base;
    }
    return b.role;
  }).join(', '), width: 40 },
  { header: 'ID', key: 'id' },
];

@@ -78,6 +132,44 @@ const templateColumns: Column<TemplateRow>[] = [
  { header: 'DESCRIPTION', key: 'description', width: 50 },
];

interface PromptRow {
  id: string;
  name: string;
  projectId: string | null;
  project?: { name: string } | null;
  priority: number;
  linkTarget: string | null;
  linkStatus: 'alive' | 'dead' | null;
  createdAt: string;
}

interface PromptRequestRow {
  id: string;
  name: string;
  projectId: string | null;
  project?: { name: string } | null;
  createdBySession: string | null;
  createdAt: string;
}

const promptColumns: Column<PromptRow>[] = [
  { header: 'NAME', key: 'name' },
  { header: 'PROJECT', key: (r) => r.project?.name ?? (r.projectId ? r.projectId : '(global)'), width: 20 },
  { header: 'PRI', key: (r) => String(r.priority), width: 4 },
  { header: 'LINK', key: (r) => r.linkTarget ? r.linkTarget.split(':')[0]! : '-', width: 20 },
  { header: 'STATUS', key: (r) => r.linkStatus ?? '-', width: 6 },
  { header: 'CREATED', key: (r) => new Date(r.createdAt).toLocaleString(), width: 20 },
  { header: 'ID', key: 'id' },
];

const promptRequestColumns: Column<PromptRequestRow>[] = [
  { header: 'NAME', key: 'name' },
  { header: 'PROJECT', key: (r) => r.project?.name ?? (r.projectId ? r.projectId : '(global)'), width: 20 },
  { header: 'SESSION', key: (r) => r.createdBySession ? r.createdBySession.slice(0, 12) : '-', width: 14 },
  { header: 'CREATED', key: (r) => new Date(r.createdAt).toLocaleString(), width: 20 },
  { header: 'ID', key: 'id' },
];

const instanceColumns: Column<InstanceRow>[] = [
  { header: 'NAME', key: (r) => r.server?.name ?? '-', width: 20 },
  { header: 'STATUS', key: 'status', width: 10 },
@@ -99,6 +191,16 @@ function getColumnsForResource(resource: string): Column<Record<string, unknown>
      return templateColumns as unknown as Column<Record<string, unknown>>[];
    case 'instances':
      return instanceColumns as unknown as Column<Record<string, unknown>>[];
    case 'users':
      return userColumns as unknown as Column<Record<string, unknown>>[];
    case 'groups':
      return groupColumns as unknown as Column<Record<string, unknown>>[];
    case 'rbac':
      return rbacColumns as unknown as Column<Record<string, unknown>>[];
    case 'prompts':
      return promptColumns as unknown as Column<Record<string, unknown>>[];
    case 'promptrequests':
      return promptRequestColumns as unknown as Column<Record<string, unknown>>[];
    default:
      return [
        { header: 'ID', key: 'id' as keyof Record<string, unknown> },
@@ -124,9 +226,14 @@ export function createGetCommand(deps: GetCommandDeps): Command {
    .argument('<resource>', 'resource type (servers, projects, instances)')
    .argument('[id]', 'specific resource ID or name')
    .option('-o, --output <format>', 'output format (table, json, yaml)', 'table')
    .action(async (resourceArg: string, id: string | undefined, opts: { output: string }) => {
    .option('--project <name>', 'Filter by project')
    .option('-A, --all', 'Show all (including project-scoped) resources')
    .action(async (resourceArg: string, id: string | undefined, opts: { output: string; project?: string; all?: true }) => {
      const resource = resolveResource(resourceArg);
      const items = await deps.fetchResource(resource, id);
      const fetchOpts: { project?: string; all?: boolean } = {};
      if (opts.project) fetchOpts.project = opts.project;
      if (opts.all) fetchOpts.all = true;
      const items = await deps.fetchResource(resource, id, Object.keys(fetchOpts).length > 0 ? fetchOpts : undefined);

      if (opts.output === 'json') {
        // Apply-compatible JSON wrapped in resource key

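The Column definitions above pair a fixed header with either a property key or a derivation function, plus an optional width. The actual renderer lives in '../formatters/table.js' and is not part of this compare view, so the following is only a minimal padEnd-based sketch of how such definitions could drive a table; the `Column`, `cell`, and `renderTable` names here are re-declared for illustration:

```typescript
// Hypothetical Column shape mirroring how the diff's column arrays are used.
interface Column<T> {
  header: string;
  key: keyof T | ((row: T) => string);
  width?: number;
}

// Resolve one cell: call the derivation function or read the property, then pad.
function cell<T>(row: T, col: Column<T>): string {
  const raw = typeof col.key === 'function' ? col.key(row) : String(row[col.key]);
  return col.width ? raw.padEnd(col.width) : raw;
}

// Join padded headers and cells into fixed-width rows.
function renderTable<T>(rows: T[], cols: Column<T>[]): string {
  const header = cols.map((c) => (c.width ? c.header.padEnd(c.width) : c.header)).join('');
  const body = rows.map((r) => cols.map((c) => cell(r, c)).join(''));
  return [header, ...body].join('\n');
}

interface Row { name: string; gated: boolean }
const cols: Column<Row>[] = [
  { header: 'NAME', key: 'name', width: 10 },
  { header: 'GATED', key: (r) => (r.gated ? 'yes' : 'no'), width: 6 },
];
console.log(renderTable([{ name: 'demo', gated: true }], cols));
```

This mirrors the `gated ? 'yes' : 'no'` derivation used by `projectColumns`, though the real formatter may truncate or align differently.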
224
src/cli/src/commands/mcp.ts
Normal file
@@ -0,0 +1,224 @@
import { Command } from 'commander';
import http from 'node:http';
import { createInterface } from 'node:readline';

export interface McpBridgeOptions {
  projectName: string;
  mcplocalUrl: string;
  token?: string | undefined;
  stdin: NodeJS.ReadableStream;
  stdout: NodeJS.WritableStream;
  stderr: NodeJS.WritableStream;
}

function postJsonRpc(
  url: string,
  body: string,
  sessionId: string | undefined,
  token: string | undefined,
): Promise<{ status: number; headers: http.IncomingHttpHeaders; body: string }> {
  return new Promise((resolve, reject) => {
    const parsed = new URL(url);
    const headers: Record<string, string> = {
      'Content-Type': 'application/json',
      'Accept': 'application/json, text/event-stream',
    };
    if (sessionId) {
      headers['mcp-session-id'] = sessionId;
    }
    if (token) {
      headers['Authorization'] = `Bearer ${token}`;
    }

    const req = http.request(
      {
        hostname: parsed.hostname,
        port: parsed.port,
        path: parsed.pathname,
        method: 'POST',
        headers,
        timeout: 30_000,
      },
      (res) => {
        const chunks: Buffer[] = [];
        res.on('data', (chunk: Buffer) => chunks.push(chunk));
        res.on('end', () => {
          resolve({
            status: res.statusCode ?? 0,
            headers: res.headers,
            body: Buffer.concat(chunks).toString('utf-8'),
          });
        });
      },
    );
    req.on('error', reject);
    req.on('timeout', () => {
      req.destroy();
      reject(new Error('Request timed out'));
    });
    req.write(body);
    req.end();
  });
}

function sendDelete(
  url: string,
  sessionId: string,
  token: string | undefined,
): Promise<void> {
  return new Promise((resolve) => {
    const parsed = new URL(url);
    const headers: Record<string, string> = {
      'mcp-session-id': sessionId,
    };
    if (token) {
      headers['Authorization'] = `Bearer ${token}`;
    }

    const req = http.request(
      {
        hostname: parsed.hostname,
        port: parsed.port,
        path: parsed.pathname,
        method: 'DELETE',
        headers,
        timeout: 5_000,
      },
      () => resolve(),
    );
    req.on('error', () => resolve()); // Best effort cleanup
    req.on('timeout', () => {
      req.destroy();
      resolve();
    });
    req.end();
  });
}

/**
 * Extract JSON-RPC messages from an HTTP response body.
 * Handles both plain JSON and SSE (text/event-stream) formats.
 */
function extractJsonRpcMessages(contentType: string | undefined, body: string): string[] {
  if (contentType?.includes('text/event-stream')) {
    // Parse SSE: extract data: lines
    const messages: string[] = [];
    for (const line of body.split('\n')) {
      if (line.startsWith('data: ')) {
        messages.push(line.slice(6));
      }
    }
    return messages;
  }
  // Plain JSON response
  return [body];
}

/**
 * STDIO-to-Streamable-HTTP MCP bridge.
 *
 * Reads JSON-RPC messages line-by-line from stdin, POSTs them to
 * mcplocal's project endpoint, and writes responses to stdout.
 */
export async function runMcpBridge(opts: McpBridgeOptions): Promise<void> {
  const { projectName, mcplocalUrl, token, stdin, stdout, stderr } = opts;
  const endpointUrl = `${mcplocalUrl.replace(/\/$/, '')}/projects/${encodeURIComponent(projectName)}/mcp`;

  let sessionId: string | undefined;

  const rl = createInterface({ input: stdin, crlfDelay: Infinity });

  for await (const line of rl) {
    const trimmed = line.trim();
    if (!trimmed) continue;

    try {
      const result = await postJsonRpc(endpointUrl, trimmed, sessionId, token);

      // Capture session ID from first response
      if (!sessionId) {
        const sid = result.headers['mcp-session-id'];
        if (typeof sid === 'string') {
          sessionId = sid;
        }
      }

      if (result.status >= 400) {
        stderr.write(`MCP bridge error: HTTP ${result.status}: ${result.body}\n`);
      }

      // Handle both plain JSON and SSE responses
      const messages = extractJsonRpcMessages(result.headers['content-type'], result.body);
      for (const msg of messages) {
        const trimmedMsg = msg.trim();
        if (trimmedMsg) {
          stdout.write(trimmedMsg + '\n');
        }
      }
    } catch (err) {
      stderr.write(`MCP bridge error: ${err instanceof Error ? err.message : String(err)}\n`);
    }
  }

  // stdin closed — cleanup session
  if (sessionId) {
    await sendDelete(endpointUrl, sessionId, token);
  }
}

export interface McpCommandDeps {
  getProject: () => string | undefined;
  configLoader?: () => { mcplocalUrl: string };
  credentialsLoader?: () => { token: string } | null;
}

export function createMcpCommand(deps: McpCommandDeps): Command {
  const cmd = new Command('mcp')
    .description('MCP STDIO transport bridge — connects stdin/stdout to a project MCP endpoint')
    .passThroughOptions()
    .option('-p, --project <name>', 'Project name')
    .action(async (opts: { project?: string }) => {
      // Accept -p/--project on the command itself, or fall back to global --project
      const projectName = opts.project ?? deps.getProject();
      if (!projectName) {
        process.stderr.write('Error: --project is required for the mcp command\n');
        process.exitCode = 1;
        return;
      }

      let mcplocalUrl = 'http://localhost:3200';
      if (deps.configLoader) {
        mcplocalUrl = deps.configLoader().mcplocalUrl;
      } else {
        try {
          const { loadConfig } = await import('../config/index.js');
          mcplocalUrl = loadConfig().mcplocalUrl;
        } catch {
          // Use default
        }
      }

      let token: string | undefined;
      if (deps.credentialsLoader) {
        token = deps.credentialsLoader()?.token;
      } else {
        try {
          const { loadCredentials } = await import('../auth/index.js');
          token = loadCredentials()?.token;
        } catch {
          // No credentials
        }
      }

      await runMcpBridge({
        projectName,
        mcplocalUrl,
        token,
        stdin: process.stdin,
        stdout: process.stdout,
        stderr: process.stderr,
      });
    });

  return cmd;
}
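The SSE branch of the bridge's extractJsonRpcMessages keeps only `data:` payload lines, while any other content type passes through as a single message. As a standalone sketch (the helper is re-declared here, since a compare view cannot be imported):

```typescript
// Re-declaration of extractJsonRpcMessages from the mcp.ts diff above.
function extractJsonRpcMessages(contentType: string | undefined, body: string): string[] {
  if (contentType?.includes('text/event-stream')) {
    // SSE: keep only the payload after each "data: " prefix.
    const messages: string[] = [];
    for (const line of body.split('\n')) {
      if (line.startsWith('data: ')) messages.push(line.slice(6));
    }
    return messages;
  }
  // Anything else is treated as one plain JSON message.
  return [body];
}

const sse = 'event: message\ndata: {"jsonrpc":"2.0","id":1,"result":{}}\n\n';
console.log(extractJsonRpcMessages('text/event-stream', sse));
console.log(extractJsonRpcMessages('application/json', '{"jsonrpc":"2.0"}'));
```

Note this line-based parse ignores SSE `event:` and `id:` fields and multi-line `data:` continuation, which is sufficient for single-payload JSON-RPC responses.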
65
src/cli/src/commands/project-ops.ts
Normal file
@@ -0,0 +1,65 @@
import { Command } from 'commander';
import type { ApiClient } from '../api-client.js';
import { resolveNameOrId, resolveResource } from './shared.js';

export interface ProjectOpsDeps {
  client: ApiClient;
  log: (...args: string[]) => void;
  getProject: () => string | undefined;
}

function requireProject(deps: ProjectOpsDeps): string {
  const project = deps.getProject();
  if (!project) {
    deps.log('Error: --project <name> is required for this command.');
    process.exitCode = 1;
    throw new Error('--project required');
  }
  return project;
}

export function createAttachServerCommand(deps: ProjectOpsDeps): Command {
  const { client, log } = deps;

  return new Command('attach-server')
    .description('Attach a server to a project (requires --project)')
    .argument('<server-name>', 'Server name to attach')
    .action(async (serverName: string) => {
      const projectName = requireProject(deps);
      const projectId = await resolveNameOrId(client, 'projects', projectName);
      await client.post(`/api/v1/projects/${projectId}/servers`, { server: serverName });
      log(`server '${serverName}' attached to project '${projectName}'`);
    });
}

export function createDetachServerCommand(deps: ProjectOpsDeps): Command {
  const { client, log } = deps;

  return new Command('detach-server')
    .description('Detach a server from a project (requires --project)')
    .argument('<server-name>', 'Server name to detach')
    .action(async (serverName: string) => {
      const projectName = requireProject(deps);
      const projectId = await resolveNameOrId(client, 'projects', projectName);
      await client.delete(`/api/v1/projects/${projectId}/servers/${serverName}`);
      log(`server '${serverName}' detached from project '${projectName}'`);
    });
}

export function createApproveCommand(deps: ProjectOpsDeps): Command {
  const { client, log } = deps;

  return new Command('approve')
    .description('Approve a pending prompt request (atomic: delete request, create prompt)')
    .argument('<resource>', 'Resource type (promptrequest)')
    .argument('<name>', 'Resource name or ID')
    .action(async (resourceArg: string, nameOrId: string) => {
      const resource = resolveResource(resourceArg);
      if (resource !== 'promptrequests') {
        throw new Error(`approve is only supported for 'promptrequest', got '${resourceArg}'`);
      }
      const id = await resolveNameOrId(client, 'promptrequests', nameOrId);
      const prompt = await client.post<{ id: string; name: string }>(`/api/v1/promptrequests/${id}/approve`, {});
      log(`prompt request approved → prompt '${prompt.name}' created (id: ${prompt.id})`);
    });
}
@@ -1,15 +0,0 @@
import { Command } from 'commander';
import type { ApiClient } from '../api-client.js';

export interface ProjectCommandDeps {
  client: ApiClient;
  log: (...args: unknown[]) => void;
}

export function createProjectCommand(_deps: ProjectCommandDeps): Command {
  const cmd = new Command('project')
    .alias('proj')
    .description('Project-specific actions (create with "create project", list with "get projects")');

  return cmd;
}
@@ -11,6 +11,16 @@ export const RESOURCE_ALIASES: Record<string, string> = {
  sec: 'secrets',
  template: 'templates',
  tpl: 'templates',
  user: 'users',
  group: 'groups',
  rbac: 'rbac',
  'rbac-definition': 'rbac',
  'rbac-binding': 'rbac',
  prompt: 'prompts',
  prompts: 'prompts',
  promptrequest: 'promptrequests',
  promptrequests: 'promptrequests',
  pr: 'promptrequests',
};

export function resolveResource(name: string): string {
@@ -28,17 +38,44 @@ export async function resolveNameOrId(
  if (/^c[a-z0-9]{24}/.test(nameOrId)) {
    return nameOrId;
  }
  const items = await client.get<Array<{ id: string; name: string }>>(`/api/v1/${resource}`);
  const match = items.find((item) => item.name === nameOrId);
  if (match) return match.id;
  // Users resolve by email, not name
  if (resource === 'users') {
    const items = await client.get<Array<{ id: string; email: string }>>(`/api/v1/${resource}`);
    const match = items.find((item) => item.email === nameOrId);
    if (match) return match.id;
    throw new Error(`user '${nameOrId}' not found`);
  }
  const items = await client.get<Array<Record<string, unknown>>>(`/api/v1/${resource}`);
  const match = items.find((item) => {
    // Instances use server.name, other resources use name directly
    if (resource === 'instances') {
      const server = item.server as { name?: string } | undefined;
      return server?.name === nameOrId;
    }
    return item.name === nameOrId;
  });
  if (match) return match.id as string;
  throw new Error(`${resource.replace(/s$/, '')} '${nameOrId}' not found`);
}

/** Strip internal/read-only fields from an API response to make it apply-compatible. */
export function stripInternalFields(obj: Record<string, unknown>): Record<string, unknown> {
  const result = { ...obj };
  for (const key of ['id', 'createdAt', 'updatedAt', 'version', 'ownerId']) {
  for (const key of ['id', 'createdAt', 'updatedAt', 'version', 'ownerId', 'summary', 'chapters']) {
    delete result[key];
  }
  // Strip relationship joins that aren't part of the resource spec (like k8s namespaces don't list deployments)
  if ('servers' in result && Array.isArray(result.servers)) {
    delete result.servers;
  }
  if ('owner' in result && typeof result.owner === 'object') {
    delete result.owner;
  }
  if ('members' in result && Array.isArray(result.members)) {
    delete result.members;
  }
  if ('project' in result && typeof result.project === 'object' && result.project !== null) {
    delete result.project;
  }
  return result;
}

@@ -7,11 +7,32 @@ import type { CredentialsDeps } from '../auth/index.js';
|
||||
import { formatJson, formatYaml } from '../formatters/index.js';
|
||||
import { APP_VERSION } from '@mcpctl/shared';
|
||||
|
||||
// ANSI helpers
|
||||
const GREEN = '\x1b[32m';
|
||||
const RED = '\x1b[31m';
|
||||
const DIM = '\x1b[2m';
|
||||
const RESET = '\x1b[0m';
|
||||
const CLEAR_LINE = '\x1b[2K\r';
|
||||
|
||||
interface ProvidersInfo {
|
||||
providers: string[];
|
||||
tiers: { fast: string[]; heavy: string[] };
|
||||
health: Record<string, boolean>;
|
||||
}
|
||||
|
||||
export interface StatusCommandDeps {
|
||||
configDeps: Partial<ConfigLoaderDeps>;
|
||||
credentialsDeps: Partial<CredentialsDeps>;
|
||||
log: (...args: string[]) => void;
|
||||
write: (text: string) => void;
|
||||
checkHealth: (url: string) => Promise<boolean>;
|
||||
/** Check LLM health via mcplocal's /llm/health endpoint */
|
||||
checkLlm: (mcplocalUrl: string) => Promise<string>;
|
||||
/** Fetch available models from mcplocal's /llm/models endpoint */
|
||||
fetchModels: (mcplocalUrl: string) => Promise<string[]>;
|
||||
/** Fetch provider tier info from mcplocal's /llm/providers endpoint */
|
||||
fetchProviders: (mcplocalUrl: string) => Promise<ProvidersInfo | null>;
|
||||
isTTY: boolean;
|
||||
}
|
||||
|
||||
function defaultCheckHealth(url: string): Promise<boolean> {
|
||||
@@ -28,15 +49,114 @@ function defaultCheckHealth(url: string): Promise<boolean> {
|
||||
});
|
||||
}
|
||||
|
||||
/**
|
||||
* Check LLM health by querying mcplocal's /llm/health endpoint.
|
||||
* This tests the actual provider running inside the daemon (uses persistent ACP for gemini, etc.)
|
||||
*/
|
||||
function defaultCheckLlm(mcplocalUrl: string): Promise<string> {
|
||||
return new Promise((resolve) => {
|
||||
const req = http.get(`${mcplocalUrl}/llm/health`, { timeout: 45000 }, (res) => {
|
||||
const chunks: Buffer[] = [];
|
||||
res.on('data', (chunk: Buffer) => chunks.push(chunk));
|
||||
res.on('end', () => {
|
||||
try {
|
||||
const body = JSON.parse(Buffer.concat(chunks).toString('utf-8')) as { status: string; error?: string };
|
||||
if (body.status === 'ok') {
|
||||
resolve('ok');
|
||||
} else if (body.status === 'not configured') {
|
||||
resolve('not configured');
|
||||
} else if (body.error) {
|
||||
resolve(body.error.slice(0, 80));
|
||||
} else {
|
||||
resolve(body.status);
|
||||
}
|
||||
} catch {
|
||||
resolve('invalid response');
|
||||
}
|
||||
});
|
||||
});
|
||||
req.on('error', () => resolve('mcplocal unreachable'));
|
||||
req.on('timeout', () => { req.destroy(); resolve('timeout'); });
|
||||
});
|
||||
}
|
||||
|
||||
function defaultFetchModels(mcplocalUrl: string): Promise<string[]> {
|
||||
return new Promise((resolve) => {
|
||||
const req = http.get(`${mcplocalUrl}/llm/models`, { timeout: 5000 }, (res) => {
|
||||
const chunks: Buffer[] = [];
|
||||
res.on('data', (chunk: Buffer) => chunks.push(chunk));
|
||||
res.on('end', () => {
|
||||
try {
|
||||
const body = JSON.parse(Buffer.concat(chunks).toString('utf-8')) as { models?: string[] };
|
||||
resolve(body.models ?? []);
|
||||
} catch {
|
||||
resolve([]);
|
||||
}
|
||||
});
|
||||
});
|
||||
req.on('error', () => resolve([]));
|
||||
req.on('timeout', () => { req.destroy(); resolve([]); });
|
||||
});
|
||||
}
|
||||
|
||||
function defaultFetchProviders(mcplocalUrl: string): Promise<ProvidersInfo | null> {
|
||||
return new Promise((resolve) => {
|
||||
const req = http.get(`${mcplocalUrl}/llm/providers`, { timeout: 5000 }, (res) => {
|
||||
const chunks: Buffer[] = [];
|
||||
res.on('data', (chunk: Buffer) => chunks.push(chunk));
|
||||
res.on('end', () => {
|
||||
try {
|
||||
const body = JSON.parse(Buffer.concat(chunks).toString('utf-8')) as ProvidersInfo;
|
||||
resolve(body);
|
||||
} catch {
|
||||
resolve(null);
|
||||
}
|
||||
});
|
||||
});
|
||||
req.on('error', () => resolve(null));
|
||||
req.on('timeout', () => { req.destroy(); resolve(null); });
|
||||
});
|
||||
}
|
||||
|
||||
const SPINNER_FRAMES = ['⠋', '⠙', '⠹', '⠸', '⠼', '⠴', '⠦', '⠧', '⠇', '⠏'];
|
||||
|
||||
const defaultDeps: StatusCommandDeps = {
|
||||
configDeps: {},
|
||||
credentialsDeps: {},
|
||||
log: (...args) => console.log(...args),
|
||||
write: (text) => process.stdout.write(text),
|
||||
checkHealth: defaultCheckHealth,
|
||||
checkLlm: defaultCheckLlm,
|
||||
fetchModels: defaultFetchModels,
|
||||
fetchProviders: defaultFetchProviders,
|
||||
isTTY: process.stdout.isTTY ?? false,
|
||||
};
|
||||
|
||||
/** Determine LLM label from config (handles both legacy and multi-provider formats). */
function getLlmLabel(llm: unknown): string | null {
  if (!llm || typeof llm !== 'object') return null;
  // Legacy format: { provider, model }
  if ('provider' in llm) {
    const legacy = llm as { provider: string; model?: string };
    if (legacy.provider === 'none') return null;
    return `${legacy.provider}${legacy.model ? ` / ${legacy.model}` : ''}`;
  }
  // Multi-provider format: { providers: [...] }
  if ('providers' in llm) {
    const multi = llm as { providers: Array<{ name: string; type: string; tier?: string }> };
    if (multi.providers.length === 0) return null;
    return multi.providers.map((p) => `${p.name}${p.tier ? ` (${p.tier})` : ''}`).join(', ');
  }
  return null;
}

/** Check if config uses multi-provider format. */
function isMultiProvider(llm: unknown): boolean {
  return !!llm && typeof llm === 'object' && 'providers' in llm;
}

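The label logic above can be exercised in isolation. The sketch below restates it as a standalone function (hypothetical name `llmLabel`, duplicating the `getLlmLabel` logic from the diff) so the two config shapes can be compared side by side:

```typescript
// Standalone restatement of the label logic above (hypothetical name: llmLabel).
type LegacyLlm = { provider: string; model?: string };
type MultiLlm = { providers: Array<{ name: string; type: string; tier?: string }> };

function llmLabel(llm: unknown): string | null {
  if (!llm || typeof llm !== 'object') return null;
  if ('provider' in llm) {
    const legacy = llm as LegacyLlm;
    if (legacy.provider === 'none') return null;
    return `${legacy.provider}${legacy.model ? ` / ${legacy.model}` : ''}`;
  }
  if ('providers' in llm) {
    const multi = llm as MultiLlm;
    if (multi.providers.length === 0) return null;
    return multi.providers.map((p) => `${p.name}${p.tier ? ` (${p.tier})` : ''}`).join(', ');
  }
  return null;
}

// Legacy config renders as "provider / model"; multi-provider as "name (tier), ..."
console.log(llmLabel({ provider: 'ollama', model: 'llama3' }));
console.log(llmLabel({ providers: [{ name: 'vllm-local', type: 'vllm', tier: 'fast' }] }));
```

Treating `provider: 'none'` and an empty `providers` array as "no label" is what lets the status command fall through to the "not configured" hint.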
export function createStatusCommand(deps?: Partial<StatusCommandDeps>): Command {
  const { configDeps, credentialsDeps, log, checkHealth } = { ...defaultDeps, ...deps };
  const { configDeps, credentialsDeps, log, write, checkHealth, checkLlm, fetchModels, fetchProviders, isTTY } = { ...defaultDeps, ...deps };

  return new Command('status')
    .description('Show mcpctl status and connectivity')
@@ -45,33 +165,124 @@ export function createStatusCommand(deps?: Partial<StatusCommandDeps>): Command
      const config = loadConfig(configDeps);
      const creds = loadCredentials(credentialsDeps);

      const llmLabel = getLlmLabel(config.llm);
      const multiProvider = isMultiProvider(config.llm);

      if (opts.output !== 'table') {
        // JSON/YAML: run everything in parallel, wait, output at once
        const [mcplocalReachable, mcpdReachable, llmStatus, providersInfo] = await Promise.all([
          checkHealth(config.mcplocalUrl),
          checkHealth(config.mcpdUrl),
          llmLabel ? checkLlm(config.mcplocalUrl) : Promise.resolve(null),
          multiProvider ? fetchProviders(config.mcplocalUrl) : Promise.resolve(null),
        ]);

        const llm = llmLabel
          ? llmStatus === 'ok' ? llmLabel : `${llmLabel} (${llmStatus})`
          : null;

        const status = {
          version: APP_VERSION,
          mcplocalUrl: config.mcplocalUrl,
          mcplocalReachable,
          mcpdUrl: config.mcpdUrl,
          mcpdReachable,
          auth: creds ? { user: creds.user } : null,
          registries: config.registries,
          outputFormat: config.outputFormat,
          llm,
          llmStatus,
          ...(providersInfo ? { providers: providersInfo } : {}),
        };

        log(opts.output === 'json' ? formatJson(status) : formatYaml(status));
        return;
      }

      // Table format: print lines progressively, LLM last with spinner

      // Fast health checks first
      const [mcplocalReachable, mcpdReachable] = await Promise.all([
        checkHealth(config.mcplocalUrl),
        checkHealth(config.mcpdUrl),
      ]);

      const status = {
        version: APP_VERSION,
        mcplocalUrl: config.mcplocalUrl,
        mcplocalReachable,
        mcpdUrl: config.mcpdUrl,
        mcpdReachable,
        auth: creds ? { user: creds.user } : null,
        registries: config.registries,
        outputFormat: config.outputFormat,
      };
      log(`mcpctl v${APP_VERSION}`);
      log(`mcplocal: ${config.mcplocalUrl} (${mcplocalReachable ? 'connected' : 'unreachable'})`);
      log(`mcpd: ${config.mcpdUrl} (${mcpdReachable ? 'connected' : 'unreachable'})`);
      log(`Auth: ${creds ? `logged in as ${creds.user}` : 'not logged in'}`);
      log(`Registries: ${config.registries.join(', ')}`);
      log(`Output: ${config.outputFormat}`);

      if (opts.output === 'json') {
        log(formatJson(status));
      } else if (opts.output === 'yaml') {
        log(formatYaml(status));
      if (!llmLabel) {
        log(`LLM: not configured (run 'mcpctl config setup')`);
        return;
      }

      // LLM check + models + providers fetch in parallel
      const llmPromise = checkLlm(config.mcplocalUrl);
      const modelsPromise = fetchModels(config.mcplocalUrl);
      const providersPromise = multiProvider ? fetchProviders(config.mcplocalUrl) : Promise.resolve(null);

      if (isTTY) {
        let frame = 0;
        const interval = setInterval(() => {
          write(`${CLEAR_LINE}LLM: ${DIM}${SPINNER_FRAMES[frame % SPINNER_FRAMES.length]} checking...${RESET}`);
          frame++;
        }, 80);

        const [llmStatus, models, providersInfo] = await Promise.all([llmPromise, modelsPromise, providersPromise]);
        clearInterval(interval);

        if (providersInfo && (providersInfo.tiers.fast.length > 0 || providersInfo.tiers.heavy.length > 0)) {
          // Tiered display with per-provider health
          write(`${CLEAR_LINE}`);
          for (const tier of ['fast', 'heavy'] as const) {
            const names = providersInfo.tiers[tier];
            if (names.length === 0) continue;
            const label = tier === 'fast' ? 'LLM (fast): ' : 'LLM (heavy):';
            const parts = names.map((n) => {
              const ok = providersInfo.health[n];
              return ok ? `${n} ${GREEN}✓${RESET}` : `${n} ${RED}✗${RESET}`;
            });
            log(`${label} ${parts.join(', ')}`);
          }
        } else {
          // Legacy single provider display
          if (llmStatus === 'ok' || llmStatus === 'ok (key stored)') {
            write(`${CLEAR_LINE}LLM: ${llmLabel} ${GREEN}✓ ${llmStatus}${RESET}\n`);
          } else {
            write(`${CLEAR_LINE}LLM: ${llmLabel} ${RED}✗ ${llmStatus}${RESET}\n`);
          }
        }
        if (models.length > 0) {
          log(`${DIM} Available: ${models.join(', ')}${RESET}`);
        }
      } else {
        log(`mcpctl v${status.version}`);
        log(`mcplocal: ${status.mcplocalUrl} (${mcplocalReachable ? 'connected' : 'unreachable'})`);
        log(`mcpd: ${status.mcpdUrl} (${mcpdReachable ? 'connected' : 'unreachable'})`);
        log(`Auth: ${creds ? `logged in as ${creds.user}` : 'not logged in'}`);
        log(`Registries: ${status.registries.join(', ')}`);
        log(`Output: ${status.outputFormat}`);
        // Non-TTY: no spinner, just wait and print
        const [llmStatus, models, providersInfo] = await Promise.all([llmPromise, modelsPromise, providersPromise]);

        if (providersInfo && (providersInfo.tiers.fast.length > 0 || providersInfo.tiers.heavy.length > 0)) {
          for (const tier of ['fast', 'heavy'] as const) {
            const names = providersInfo.tiers[tier];
            if (names.length === 0) continue;
            const label = tier === 'fast' ? 'LLM (fast): ' : 'LLM (heavy):';
            const parts = names.map((n) => {
              const ok = providersInfo.health[n];
              return ok ? `${n} ✓` : `${n} ✗`;
            });
            log(`${label} ${parts.join(', ')}`);
          }
        } else {
          if (llmStatus === 'ok' || llmStatus === 'ok (key stored)') {
            log(`LLM: ${llmLabel} ✓ ${llmStatus}`);
          } else {
            log(`LLM: ${llmLabel} ✗ ${llmStatus}`);
          }
        }
        if (models.length > 0) {
          log(`${DIM} Available: ${models.join(', ')}${RESET}`);
        }
      }
    });
}

@@ -1,4 +1,4 @@
export { McpctlConfigSchema, DEFAULT_CONFIG } from './schema.js';
export type { McpctlConfig } from './schema.js';
export { McpctlConfigSchema, LlmConfigSchema, LlmProviderEntrySchema, LlmMultiConfigSchema, LLM_PROVIDERS, LLM_TIERS, DEFAULT_CONFIG } from './schema.js';
export type { McpctlConfig, LlmConfig, LlmProviderEntry, LlmMultiConfig, LlmProviderName, LlmTier } from './schema.js';
export { loadConfig, saveConfig, mergeConfig, getConfigPath } from './loader.js';
export type { ConfigLoaderDeps } from './loader.js';

@@ -1,5 +1,50 @@
import { z } from 'zod';

export const LLM_PROVIDERS = ['gemini-cli', 'ollama', 'anthropic', 'openai', 'deepseek', 'vllm', 'none'] as const;
export type LlmProviderName = typeof LLM_PROVIDERS[number];

export const LLM_TIERS = ['fast', 'heavy'] as const;
export type LlmTier = typeof LLM_TIERS[number];

/** Legacy single-provider format. */
export const LlmConfigSchema = z.object({
  /** LLM provider name */
  provider: z.enum(LLM_PROVIDERS),
  /** Model name */
  model: z.string().optional(),
  /** Provider URL (for ollama, vllm, openai with custom endpoint) */
  url: z.string().optional(),
  /** Binary path override (for gemini-cli) */
  binaryPath: z.string().optional(),
}).strict();

export type LlmConfig = z.infer<typeof LlmConfigSchema>;

/** Multi-provider entry (advanced mode). */
export const LlmProviderEntrySchema = z.object({
  /** User-chosen name for this provider instance (e.g. "vllm-local") */
  name: z.string(),
  /** Provider type */
  type: z.enum(LLM_PROVIDERS),
  /** Model name */
  model: z.string().optional(),
  /** Provider URL (for ollama, vllm, openai with custom endpoint) */
  url: z.string().optional(),
  /** Binary path override (for gemini-cli) */
  binaryPath: z.string().optional(),
  /** Tier assignment */
  tier: z.enum(LLM_TIERS).optional(),
}).strict();

export type LlmProviderEntry = z.infer<typeof LlmProviderEntrySchema>;

/** Multi-provider format with providers array. */
export const LlmMultiConfigSchema = z.object({
  providers: z.array(LlmProviderEntrySchema).min(1),
}).strict();

export type LlmMultiConfig = z.infer<typeof LlmMultiConfigSchema>;

export const McpctlConfigSchema = z.object({
  /** mcplocal daemon endpoint (local LLM pre-processing proxy) */
  mcplocalUrl: z.string().default('http://localhost:3200'),
@@ -19,6 +64,8 @@ export const McpctlConfigSchema = z.object({
  outputFormat: z.enum(['table', 'json', 'yaml']).default('table'),
  /** Smithery API key */
  smitheryApiKey: z.string().optional(),
  /** LLM provider configuration — accepts legacy single-provider or multi-provider format */
  llm: z.union([LlmConfigSchema, LlmMultiConfigSchema]).optional(),
}).transform((cfg) => {
  // Backward compatibility: if old daemonUrl is set but mcplocalUrl wasn't explicitly changed,
  // use daemonUrl as mcplocalUrl

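Under the `z.union` above, both config shapes validate. The YAML below sketches the two alternatives (values are illustrative, not taken from a real config; the two documents are mutually exclusive forms of the same `llm` key):

```yaml
# Legacy single-provider form (LlmConfigSchema)
llm:
  provider: ollama
  model: llama3
---
# Multi-provider form (LlmMultiConfigSchema)
llm:
  providers:
    - name: vllm-local
      type: vllm
      url: http://localhost:8000
      tier: fast
    - name: claude
      type: anthropic
      tier: heavy
```

Because both object schemas are `.strict()`, a typo'd key fails validation instead of being silently dropped, which keeps the union unambiguous.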
@@ -6,6 +6,29 @@ export function formatJson(data: unknown): string {
  return JSON.stringify(data, null, 2);
}

export function formatYaml(data: unknown): string {
  return yaml.dump(data, { lineWidth: 120, noRefs: true }).trimEnd();
/**
 * Reorder object keys so that long text fields (like `content`, `prompt`)
 * come last. This makes YAML output more readable when content spans
 * multiple lines.
 */
export function reorderKeys(obj: unknown): unknown {
  if (Array.isArray(obj)) return obj.map(reorderKeys);
  if (obj !== null && typeof obj === 'object') {
    const rec = obj as Record<string, unknown>;
    const lastKeys = ['content', 'prompt'];
    const ordered: Record<string, unknown> = {};
    for (const key of Object.keys(rec)) {
      if (!lastKeys.includes(key)) ordered[key] = reorderKeys(rec[key]);
    }
    for (const key of lastKeys) {
      if (key in rec) ordered[key] = rec[key];
    }
    return ordered;
  }
  return obj;
}

export function formatYaml(data: unknown): string {
  const reordered = reorderKeys(data);
  return yaml.dump(reordered, { lineWidth: 120, noRefs: true }).trimEnd();
}

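The reordering effect is easiest to see on a small object. This sketch duplicates the `reorderKeys` logic from the diff above (no `js-yaml` dependency needed) to show the resulting key order:

```typescript
// Duplicates the reorderKeys logic from the diff above so its effect is visible.
function reorderKeys(obj: unknown): unknown {
  if (Array.isArray(obj)) return obj.map(reorderKeys);
  if (obj !== null && typeof obj === 'object') {
    const rec = obj as Record<string, unknown>;
    const lastKeys = ['content', 'prompt'];
    const ordered: Record<string, unknown> = {};
    // Copy non-long-text keys first, recursing into nested values
    for (const key of Object.keys(rec)) {
      if (!lastKeys.includes(key)) ordered[key] = reorderKeys(rec[key]);
    }
    // Then append the long text fields so they dump last in YAML
    for (const key of lastKeys) {
      if (key in rec) ordered[key] = rec[key];
    }
    return ordered;
  }
  return obj;
}

const out = reorderKeys({ content: 'a very long policy text...', id: 'p-1', name: 'policy' }) as Record<string, unknown>;
console.log(Object.keys(out)); // short metadata keys first, `content` last
```

This relies on JavaScript objects preserving string-key insertion order, which `yaml.dump` then follows when serializing.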
@@ -10,10 +10,11 @@ import { createLogsCommand } from './commands/logs.js';
import { createApplyCommand } from './commands/apply.js';
import { createCreateCommand } from './commands/create.js';
import { createEditCommand } from './commands/edit.js';
import { createClaudeCommand } from './commands/claude.js';
import { createProjectCommand } from './commands/project.js';
import { createBackupCommand, createRestoreCommand } from './commands/backup.js';
import { createLoginCommand, createLogoutCommand } from './commands/auth.js';
import { createAttachServerCommand, createDetachServerCommand, createApproveCommand } from './commands/project-ops.js';
import { createMcpCommand } from './commands/mcp.js';
import { createPatchCommand } from './commands/patch.js';
import { ApiClient, ApiError } from './api-client.js';
import { loadConfig } from './config/index.js';
import { loadCredentials } from './auth/index.js';
@@ -26,9 +27,9 @@ export function createProgram(): Command {
    .version(APP_VERSION, '-v, --version')
    .enablePositionalOptions()
    .option('--daemon-url <url>', 'mcplocal daemon URL')
    .option('--direct', 'bypass mcplocal and connect directly to mcpd');
    .option('--direct', 'bypass mcplocal and connect directly to mcpd')
    .option('--project <name>', 'Target project for project commands');

  program.addCommand(createConfigCommand());
  program.addCommand(createStatusCommand());
  program.addCommand(createLoginCommand());
  program.addCommand(createLogoutCommand());
@@ -48,7 +49,39 @@ export function createProgram(): Command {

  const client = new ApiClient({ baseUrl, token: creds?.token ?? undefined });

  const fetchResource = async (resource: string, nameOrId?: string): Promise<unknown[]> => {
  program.addCommand(createConfigCommand(undefined, {
    client,
    credentialsDeps: {},
    log: (...args) => console.log(...args),
  }));

  const fetchResource = async (resource: string, nameOrId?: string, opts?: { project?: string; all?: boolean }): Promise<unknown[]> => {
    const projectName = opts?.project ?? program.opts().project as string | undefined;

    // --project scoping for servers and instances
    if (projectName && !nameOrId && (resource === 'servers' || resource === 'instances')) {
      const projectId = await resolveNameOrId(client, 'projects', projectName);
      if (resource === 'servers') {
        return client.get<unknown[]>(`/api/v1/projects/${projectId}/servers`);
      }
      // instances: fetch project servers, then filter instances by serverId
      const projectServers = await client.get<Array<{ id: string }>>(`/api/v1/projects/${projectId}/servers`);
      const serverIds = new Set(projectServers.map((s) => s.id));
      const allInstances = await client.get<Array<{ serverId: string }>>(`/api/v1/instances`);
      return allInstances.filter((inst) => serverIds.has(inst.serverId));
    }

    // --project scoping for prompts and promptrequests
    if (!nameOrId && (resource === 'prompts' || resource === 'promptrequests')) {
      if (projectName) {
        return client.get<unknown[]>(`/api/v1/${resource}?project=${encodeURIComponent(projectName)}`);
      }
      // Default: global-only. --all (-A) shows everything.
      if (!opts?.all) {
        return client.get<unknown[]>(`/api/v1/${resource}?scope=global`);
      }
    }

    if (nameOrId) {
      // Glob pattern — use query param filtering
      if (nameOrId.includes('*')) {
@@ -113,12 +146,7 @@ export function createProgram(): Command {
    log: (...args) => console.log(...args),
  }));

  program.addCommand(createClaudeCommand({
    client,
    log: (...args) => console.log(...args),
  }));

  program.addCommand(createProjectCommand({
  program.addCommand(createPatchCommand({
    client,
    log: (...args) => console.log(...args),
  }));
@@ -133,6 +161,18 @@ export function createProgram(): Command {
    log: (...args) => console.log(...args),
  }));

  const projectOpsDeps = {
    client,
    log: (...args: string[]) => console.log(...args),
    getProject: () => program.opts().project as string | undefined,
  };
  program.addCommand(createAttachServerCommand(projectOpsDeps), { hidden: true });
  program.addCommand(createDetachServerCommand(projectOpsDeps), { hidden: true });
  program.addCommand(createApproveCommand(projectOpsDeps));
  program.addCommand(createMcpCommand({
    getProject: () => program.opts().project as string | undefined,
  }), { hidden: true });

  return program;
}

@@ -145,14 +185,28 @@ const isDirectRun =
if (isDirectRun) {
  createProgram().parseAsync(process.argv).catch((err: unknown) => {
    if (err instanceof ApiError) {
      let msg: string;
      try {
        const parsed = JSON.parse(err.body) as { error?: string; message?: string };
        msg = parsed.error ?? parsed.message ?? err.body;
      } catch {
        msg = err.body;
      if (err.status === 401) {
        console.error("Error: you need to log in. Run 'mcpctl login' to authenticate.");
      } else if (err.status === 403) {
        console.error('Error: permission denied. You do not have access to this resource.');
      } else {
        let msg: string;
        try {
          const parsed = JSON.parse(err.body) as { error?: string; message?: string; details?: unknown };
          msg = parsed.error ?? parsed.message ?? err.body;
          if (parsed.details && Array.isArray(parsed.details)) {
            const issues = parsed.details as Array<{ message?: string; path?: string[] }>;
            const detail = issues.map((i) => {
              const path = i.path?.join('.') ?? '';
              return path ? `${path}: ${i.message}` : (i.message ?? '');
            }).filter(Boolean).join('; ');
            if (detail) msg += `: ${detail}`;
          }
        } catch {
          msg = err.body;
        }
        console.error(`Error: ${msg}`);
      }
      console.error(`Error: ${msg}`);
    } else if (err instanceof Error) {
      console.error(`Error: ${err.message}`);
    } else {

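The `details`-flattening step in the new error handler can be checked on its own. The sketch below extracts just that step into a hypothetical helper `formatDetails`, assuming a parsed error body of the shape shown in the diff above:

```typescript
// Sketch of the details-flattening step from the error handler above
// (hypothetical helper name: formatDetails).
type Issue = { message?: string; path?: string[] };

function formatDetails(base: string, details: Issue[]): string {
  const detail = details
    .map((i) => {
      const path = i.path?.join('.') ?? '';
      return path ? `${path}: ${i.message}` : (i.message ?? '');
    })
    .filter(Boolean)   // drop issues with neither path nor message
    .join('; ');
  return detail ? `${base}: ${detail}` : base;
}

console.log(formatDetails('Validation failed', [
  { path: ['llm', 'provider'], message: 'Invalid enum value' },
  { message: 'Unrecognized key' },
]));
```

Joining the issue paths with `.` mirrors how Zod-style validation issues address nested config keys, so the user sees which field failed rather than a bare 400.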
@@ -21,6 +21,16 @@ beforeAll(async () => {
      res.writeHead(201, { 'Content-Type': 'application/json' });
      res.end(JSON.stringify({ id: 'srv-new', ...body }));
    });
  } else if (req.url === '/api/v1/servers/srv-1' && req.method === 'DELETE') {
    // Fastify rejects empty body with Content-Type: application/json
    const ct = req.headers['content-type'] ?? '';
    if (ct.includes('application/json')) {
      res.writeHead(400, { 'Content-Type': 'application/json' });
      res.end(JSON.stringify({ error: "Body cannot be empty when content-type is set to 'application/json'" }));
    } else {
      res.writeHead(204);
      res.end();
    }
  } else if (req.url === '/api/v1/missing' && req.method === 'GET') {
    res.writeHead(404, { 'Content-Type': 'application/json' });
    res.end(JSON.stringify({ error: 'Not found' }));
@@ -75,6 +85,12 @@ describe('ApiClient', () => {
    await expect(client.get('/anything')).rejects.toThrow();
  });

  it('performs DELETE without Content-Type header', async () => {
    const client = new ApiClient({ baseUrl: `http://localhost:${port}` });
    // Should succeed (204) because no Content-Type is sent on bodyless DELETE
    await expect(client.delete('/api/v1/servers/srv-1')).resolves.toBeUndefined();
  });

  it('sends Authorization header when token provided', async () => {
    // We need a separate server to check the header
    let receivedAuth = '';

@@ -159,4 +159,347 @@ projects:

    rmSync(tmpDir, { recursive: true, force: true });
  });

  it('applies users (no role field)', async () => {
    const configPath = join(tmpDir, 'config.yaml');
    writeFileSync(configPath, `
users:
  - email: alice@test.com
    password: password123
    name: Alice
`);

    const cmd = createApplyCommand({ client, log });
    await cmd.parseAsync([configPath], { from: 'user' });

    const callBody = vi.mocked(client.post).mock.calls[0]![1] as Record<string, unknown>;
    expect(callBody).toEqual(expect.objectContaining({
      email: 'alice@test.com',
      password: 'password123',
      name: 'Alice',
    }));
    expect(callBody).not.toHaveProperty('role');
    expect(output.join('\n')).toContain('Created user: alice@test.com');

    rmSync(tmpDir, { recursive: true, force: true });
  });

  it('updates existing users matched by email', async () => {
    vi.mocked(client.get).mockImplementation(async (url: string) => {
      if (url === '/api/v1/users') return [{ id: 'usr-1', email: 'alice@test.com' }];
      return [];
    });

    const configPath = join(tmpDir, 'config.yaml');
    writeFileSync(configPath, `
users:
  - email: alice@test.com
    password: newpassword
    name: Alice Updated
`);

    const cmd = createApplyCommand({ client, log });
    await cmd.parseAsync([configPath], { from: 'user' });

    expect(client.put).toHaveBeenCalledWith('/api/v1/users/usr-1', expect.objectContaining({
      email: 'alice@test.com',
      name: 'Alice Updated',
    }));
    expect(output.join('\n')).toContain('Updated user: alice@test.com');

    rmSync(tmpDir, { recursive: true, force: true });
  });

  it('applies groups', async () => {
    const configPath = join(tmpDir, 'config.yaml');
    writeFileSync(configPath, `
groups:
  - name: dev-team
    description: Development team
    members:
      - alice@test.com
      - bob@test.com
`);

    const cmd = createApplyCommand({ client, log });
    await cmd.parseAsync([configPath], { from: 'user' });

    expect(client.post).toHaveBeenCalledWith('/api/v1/groups', expect.objectContaining({
      name: 'dev-team',
      description: 'Development team',
      members: ['alice@test.com', 'bob@test.com'],
    }));
    expect(output.join('\n')).toContain('Created group: dev-team');

    rmSync(tmpDir, { recursive: true, force: true });
  });

  it('updates existing groups', async () => {
    vi.mocked(client.get).mockImplementation(async (url: string) => {
      if (url === '/api/v1/groups') return [{ id: 'grp-1', name: 'dev-team' }];
      return [];
    });

    const configPath = join(tmpDir, 'config.yaml');
    writeFileSync(configPath, `
groups:
  - name: dev-team
    description: Updated devs
    members:
      - new@test.com
`);

    const cmd = createApplyCommand({ client, log });
    await cmd.parseAsync([configPath], { from: 'user' });

    expect(client.put).toHaveBeenCalledWith('/api/v1/groups/grp-1', expect.objectContaining({
      name: 'dev-team',
      description: 'Updated devs',
    }));
    expect(output.join('\n')).toContain('Updated group: dev-team');

    rmSync(tmpDir, { recursive: true, force: true });
  });

  it('applies rbacBindings', async () => {
    const configPath = join(tmpDir, 'config.yaml');
    writeFileSync(configPath, `
rbac:
  - name: developers
    subjects:
      - kind: User
        name: alice@test.com
      - kind: Group
        name: dev-team
    roleBindings:
      - role: edit
        resource: servers
      - role: view
        resource: instances
`);

    const cmd = createApplyCommand({ client, log });
    await cmd.parseAsync([configPath], { from: 'user' });

    expect(client.post).toHaveBeenCalledWith('/api/v1/rbac', expect.objectContaining({
      name: 'developers',
      subjects: [
        { kind: 'User', name: 'alice@test.com' },
        { kind: 'Group', name: 'dev-team' },
      ],
      roleBindings: [
        { role: 'edit', resource: 'servers' },
        { role: 'view', resource: 'instances' },
      ],
    }));
    expect(output.join('\n')).toContain('Created rbacBinding: developers');

    rmSync(tmpDir, { recursive: true, force: true });
  });

  it('updates existing rbacBindings', async () => {
    vi.mocked(client.get).mockImplementation(async (url: string) => {
      if (url === '/api/v1/rbac') return [{ id: 'rbac-1', name: 'developers' }];
      return [];
    });

    const configPath = join(tmpDir, 'config.yaml');
    writeFileSync(configPath, `
rbacBindings:
  - name: developers
    subjects:
      - kind: User
        name: new@test.com
    roleBindings:
      - role: edit
        resource: "*"
`);

    const cmd = createApplyCommand({ client, log });
    await cmd.parseAsync([configPath], { from: 'user' });

    expect(client.put).toHaveBeenCalledWith('/api/v1/rbac/rbac-1', expect.objectContaining({
      name: 'developers',
    }));
    expect(output.join('\n')).toContain('Updated rbacBinding: developers');

    rmSync(tmpDir, { recursive: true, force: true });
  });

  it('applies projects with servers', async () => {
    const configPath = join(tmpDir, 'config.yaml');
    writeFileSync(configPath, `
projects:
  - name: smart-home
    description: Home automation
    proxyMode: filtered
    llmProvider: gemini-cli
    llmModel: gemini-2.0-flash
    servers:
      - my-grafana
      - my-ha
`);

    const cmd = createApplyCommand({ client, log });
    await cmd.parseAsync([configPath], { from: 'user' });

    expect(client.post).toHaveBeenCalledWith('/api/v1/projects', expect.objectContaining({
      name: 'smart-home',
      proxyMode: 'filtered',
      llmProvider: 'gemini-cli',
      llmModel: 'gemini-2.0-flash',
      servers: ['my-grafana', 'my-ha'],
    }));
    expect(output.join('\n')).toContain('Created project: smart-home');

    rmSync(tmpDir, { recursive: true, force: true });
  });

  it('dry-run shows all new resource types', async () => {
    const configPath = join(tmpDir, 'config.yaml');
    writeFileSync(configPath, `
secrets:
  - name: creds
    data:
      TOKEN: abc
users:
  - email: alice@test.com
    password: password123
groups:
  - name: dev-team
    members: []
projects:
  - name: my-proj
    description: A project
rbacBindings:
  - name: admins
    subjects:
      - kind: User
        name: admin@test.com
    roleBindings:
      - role: edit
        resource: "*"
`);

    const cmd = createApplyCommand({ client, log });
    await cmd.parseAsync([configPath, '--dry-run'], { from: 'user' });

    expect(client.post).not.toHaveBeenCalled();
    const text = output.join('\n');
    expect(text).toContain('Dry run');
    expect(text).toContain('1 secret(s)');
    expect(text).toContain('1 user(s)');
    expect(text).toContain('1 group(s)');
    expect(text).toContain('1 project(s)');
    expect(text).toContain('1 rbacBinding(s)');

    rmSync(tmpDir, { recursive: true, force: true });
  });

  it('applies resources in correct order', async () => {
    const callOrder: string[] = [];
    vi.mocked(client.post).mockImplementation(async (url: string) => {
      callOrder.push(url);
      return { id: 'new-id', name: 'test' };
    });

    const configPath = join(tmpDir, 'config.yaml');
    writeFileSync(configPath, `
rbacBindings:
  - name: admins
    subjects:
      - kind: User
        name: admin@test.com
    roleBindings:
      - role: edit
        resource: "*"
users:
  - email: admin@test.com
    password: password123
secrets:
  - name: creds
    data:
      KEY: val
groups:
  - name: dev-team
servers:
  - name: my-server
    transport: STDIO
projects:
  - name: my-proj
`);

    const cmd = createApplyCommand({ client, log });
    await cmd.parseAsync([configPath], { from: 'user' });

    // Apply order: secrets → servers → users → groups → projects → templates → rbacBindings
    expect(callOrder[0]).toBe('/api/v1/secrets');
    expect(callOrder[1]).toBe('/api/v1/servers');
    expect(callOrder[2]).toBe('/api/v1/users');
    expect(callOrder[3]).toBe('/api/v1/groups');
    expect(callOrder[4]).toBe('/api/v1/projects');
    expect(callOrder[5]).toBe('/api/v1/rbac');

    rmSync(tmpDir, { recursive: true, force: true });
  });

  it('applies rbac with operation bindings', async () => {
    const configPath = join(tmpDir, 'config.yaml');
    writeFileSync(configPath, `
rbac:
  - name: ops-team
    subjects:
      - kind: Group
        name: ops
    roleBindings:
      - role: edit
        resource: servers
      - role: run
        action: backup
      - role: run
        action: logs
`);

    const cmd = createApplyCommand({ client, log });
    await cmd.parseAsync([configPath], { from: 'user' });

    expect(client.post).toHaveBeenCalledWith('/api/v1/rbac', expect.objectContaining({
      name: 'ops-team',
      roleBindings: [
        { role: 'edit', resource: 'servers' },
        { role: 'run', action: 'backup' },
        { role: 'run', action: 'logs' },
      ],
    }));
    expect(output.join('\n')).toContain('Created rbacBinding: ops-team');

    rmSync(tmpDir, { recursive: true, force: true });
  });

  it('applies rbac with name-scoped resource binding', async () => {
    const configPath = join(tmpDir, 'config.yaml');
    writeFileSync(configPath, `
rbac:
  - name: ha-viewer
    subjects:
      - kind: User
        name: alice@test.com
    roleBindings:
      - role: view
        resource: servers
        name: my-ha
`);

    const cmd = createApplyCommand({ client, log });
    await cmd.parseAsync([configPath], { from: 'user' });

    expect(client.post).toHaveBeenCalledWith('/api/v1/rbac', expect.objectContaining({
      name: 'ha-viewer',
      roleBindings: [
        { role: 'view', resource: 'servers', name: 'my-ha' },
      ],
    }));

    rmSync(tmpDir, { recursive: true, force: true });
  });
});

@@ -37,6 +37,8 @@ describe('login command', () => {
        user: { email },
      }),
      logoutRequest: async () => {},
      statusRequest: async () => ({ hasUsers: true }),
      bootstrapRequest: async () => ({ token: '', user: { email: '' } }),
    });
    await cmd.parseAsync([], { from: 'user' });
    expect(output[0]).toContain('Logged in as alice@test.com');
@@ -58,6 +60,8 @@ describe('login command', () => {
      log,
      loginRequest: async () => { throw new Error('Invalid credentials'); },
      logoutRequest: async () => {},
      statusRequest: async () => ({ hasUsers: true }),
      bootstrapRequest: async () => ({ token: '', user: { email: '' } }),
    });
    await cmd.parseAsync([], { from: 'user' });
    expect(output[0]).toContain('Login failed');
@@ -83,6 +87,8 @@ describe('login command', () => {
        return { token: 'tok', user: { email } };
      },
      logoutRequest: async () => {},
      statusRequest: async () => ({ hasUsers: true }),
      bootstrapRequest: async () => ({ token: '', user: { email: '' } }),
    });
    await cmd.parseAsync([], { from: 'user' });
    expect(capturedUrl).toBe('http://custom:3100');
@@ -103,12 +109,74 @@ describe('login command', () => {
        return { token: 'tok', user: { email } };
      },
      logoutRequest: async () => {},
      statusRequest: async () => ({ hasUsers: true }),
      bootstrapRequest: async () => ({ token: '', user: { email: '' } }),
    });
    await cmd.parseAsync(['--mcpd-url', 'http://override:3100'], { from: 'user' });
    expect(capturedUrl).toBe('http://override:3100');
  });
});

describe('login bootstrap flow', () => {
|
||||
it('bootstraps first admin when no users exist', async () => {
|
||||
let bootstrapCalled = false;
|
||||
const cmd = createLoginCommand({
|
||||
configDeps: { configDir: tempDir },
|
||||
credentialsDeps: { configDir: tempDir },
|
||||
prompt: {
|
||||
input: async (msg) => {
|
||||
if (msg.includes('Name')) return 'Admin User';
|
||||
return 'admin@test.com';
|
||||
},
|
||||
password: async () => 'admin-pass',
|
||||
},
|
||||
log,
|
||||
loginRequest: async () => ({ token: '', user: { email: '' } }),
|
||||
logoutRequest: async () => {},
|
||||
statusRequest: async () => ({ hasUsers: false }),
|
||||
bootstrapRequest: async (_url, email, _password) => {
|
||||
bootstrapCalled = true;
|
||||
return { token: 'admin-token', user: { email } };
|
||||
},
|
||||
});
|
||||
await cmd.parseAsync([], { from: 'user' });
|
||||
|
||||
expect(bootstrapCalled).toBe(true);
|
||||
expect(output.join('\n')).toContain('No users configured');
|
||||
expect(output.join('\n')).toContain('admin@test.com');
|
||||
expect(output.join('\n')).toContain('admin');
|
||||
|
||||
const creds = loadCredentials({ configDir: tempDir });
|
||||
expect(creds).not.toBeNull();
|
||||
expect(creds!.token).toBe('admin-token');
|
||||
expect(creds!.user).toBe('admin@test.com');
|
||||
});
|
||||
|
||||
it('falls back to normal login when users exist', async () => {
|
||||
let loginCalled = false;
|
||||
const cmd = createLoginCommand({
|
||||
configDeps: { configDir: tempDir },
|
||||
credentialsDeps: { configDir: tempDir },
|
||||
prompt: {
|
||||
input: async () => 'alice@test.com',
|
||||
password: async () => 'secret',
|
||||
},
|
||||
log,
|
||||
loginRequest: async (_url, email) => {
|
||||
loginCalled = true;
|
||||
return { token: 'session-tok', user: { email } };
|
||||
},
|
||||
logoutRequest: async () => {},
|
||||
statusRequest: async () => ({ hasUsers: true }),
|
||||
bootstrapRequest: async () => { throw new Error('Should not be called'); },
|
||||
});
|
||||
await cmd.parseAsync([], { from: 'user' });
|
||||
|
||||
expect(loginCalled).toBe(true);
|
||||
expect(output.join('\n')).not.toContain('No users configured');
|
||||
});
|
||||
});
|
||||
|
||||
describe('logout command', () => {
|
||||
it('removes credentials on logout', async () => {
|
||||
saveCredentials({ token: 'tok', mcpdUrl: 'http://x:3100', user: 'alice' }, { configDir: tempDir });
|
||||
@@ -120,6 +188,8 @@ describe('logout command', () => {
|
||||
log,
|
||||
loginRequest: async () => ({ token: '', user: { email: '' } }),
|
||||
logoutRequest: async () => { logoutCalled = true; },
|
||||
statusRequest: async () => ({ hasUsers: true }),
|
||||
bootstrapRequest: async () => ({ token: '', user: { email: '' } }),
|
||||
});
|
||||
await cmd.parseAsync([], { from: 'user' });
|
||||
expect(output[0]).toContain('Logged out successfully');
|
||||
@@ -137,6 +207,8 @@ describe('logout command', () => {
|
||||
log,
|
||||
loginRequest: async () => ({ token: '', user: { email: '' } }),
|
||||
logoutRequest: async () => {},
|
||||
statusRequest: async () => ({ hasUsers: true }),
|
||||
bootstrapRequest: async () => ({ token: '', user: { email: '' } }),
|
||||
});
|
||||
await cmd.parseAsync([], { from: 'user' });
|
||||
expect(output[0]).toContain('Not logged in');
|
||||
|
||||
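The login tests above inject `statusRequest`, `bootstrapRequest`, and `loginRequest` as test doubles. A minimal sketch of the branching they exercise might look like the following; the names mirror the injected dependencies, but `authenticate` and its exact signatures are assumptions for illustration, not the real `createLoginCommand` internals.

```typescript
type Session = { token: string; user: { email: string } };

interface AuthDeps {
  statusRequest: () => Promise<{ hasUsers: boolean }>;
  loginRequest: (email: string, password: string) => Promise<Session>;
  bootstrapRequest: (email: string, password: string) => Promise<Session>;
}

// If the daemon reports that no users exist yet (fresh install), create the
// first admin account via bootstrap; otherwise perform a normal login.
async function authenticate(deps: AuthDeps, email: string, password: string): Promise<Session> {
  const { hasUsers } = await deps.statusRequest();
  if (!hasUsers) {
    return deps.bootstrapRequest(email, password);
  }
  return deps.loginRequest(email, password);
}

// Fresh install: statusRequest reports no users, so the bootstrap path runs.
authenticate(
  {
    statusRequest: async () => ({ hasUsers: false }),
    loginRequest: async () => { throw new Error('should not be called'); },
    bootstrapRequest: async (email) => ({ token: 'admin-token', user: { email } }),
  },
  'admin@test.com',
  'admin-pass',
).then((s) => console.log(`${s.user.email} -> ${s.token}`));
```

Injecting all three requests as plain async functions is what lets the tests assert on `bootstrapCalled`/`loginCalled` without any HTTP layer.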
@@ -1,158 +1,192 @@
import { describe, it, expect, vi, beforeEach } from 'vitest';
import { describe, it, expect, vi, beforeEach, afterEach } from 'vitest';
import { writeFileSync, readFileSync, mkdtempSync, rmSync } from 'node:fs';
import { join } from 'node:path';
import { tmpdir } from 'node:os';
import { createClaudeCommand } from '../../src/commands/claude.js';
import { createConfigCommand } from '../../src/commands/config.js';
import type { ApiClient } from '../../src/api-client.js';
import { saveCredentials, loadCredentials } from '../../src/auth/index.js';

function mockClient(): ApiClient {
return {
get: vi.fn(async () => ({
mcpServers: {
'slack--default': { command: 'npx', args: ['-y', '@anthropic/slack-mcp'], env: { WORKSPACE: 'test' } },
'github--default': { command: 'npx', args: ['-y', '@anthropic/github-mcp'] },
},
})),
post: vi.fn(async () => ({})),
get: vi.fn(async () => ({})),
post: vi.fn(async () => ({ token: 'impersonated-tok', user: { email: 'other@test.com' } })),
put: vi.fn(async () => ({})),
delete: vi.fn(async () => {}),
} as unknown as ApiClient;
}

describe('claude command', () => {
describe('config claude', () => {
let client: ReturnType<typeof mockClient>;
let output: string[];
let tmpDir: string;
const log = (...args: unknown[]) => output.push(args.map(String).join(' '));
const log = (...args: string[]) => output.push(args.join(' '));

beforeEach(() => {
client = mockClient();
output = [];
tmpDir = mkdtempSync(join(tmpdir(), 'mcpctl-claude-'));
tmpDir = mkdtempSync(join(tmpdir(), 'mcpctl-config-claude-'));
});

describe('generate', () => {
it('generates .mcp.json from project config', async () => {
const outPath = join(tmpDir, '.mcp.json');
const cmd = createClaudeCommand({ client, log });
await cmd.parseAsync(['generate', 'proj-1', '-o', outPath], { from: 'user' });
afterEach(() => {
rmSync(tmpDir, { recursive: true, force: true });
});

expect(client.get).toHaveBeenCalledWith('/api/v1/projects/proj-1/mcp-config');
const written = JSON.parse(readFileSync(outPath, 'utf-8'));
expect(written.mcpServers['slack--default']).toBeDefined();
expect(output.join('\n')).toContain('2 server(s)');
it('generates .mcp.json with mcpctl mcp bridge entry', async () => {
const outPath = join(tmpDir, '.mcp.json');
const cmd = createConfigCommand(
{ configDeps: { configDir: tmpDir }, log },
{ client, credentialsDeps: { configDir: tmpDir }, log },
);
await cmd.parseAsync(['claude', '--project', 'homeautomation', '-o', outPath], { from: 'user' });

rmSync(tmpDir, { recursive: true, force: true });
// No API call should be made
expect(client.get).not.toHaveBeenCalled();

const written = JSON.parse(readFileSync(outPath, 'utf-8'));
expect(written.mcpServers['homeautomation']).toEqual({
command: 'mcpctl',
args: ['mcp', '-p', 'homeautomation'],
});
expect(output.join('\n')).toContain('1 server(s)');
});

it('prints to stdout with --stdout', async () => {
const cmd = createClaudeCommand({ client, log });
await cmd.parseAsync(['generate', 'proj-1', '--stdout'], { from: 'user' });
it('prints to stdout with --stdout', async () => {
const cmd = createConfigCommand(
{ configDeps: { configDir: tmpDir }, log },
{ client, credentialsDeps: { configDir: tmpDir }, log },
);
await cmd.parseAsync(['claude', '--project', 'myproj', '--stdout'], { from: 'user' });

expect(output[0]).toContain('mcpServers');
rmSync(tmpDir, { recursive: true, force: true });
});

it('merges with existing .mcp.json', async () => {
const outPath = join(tmpDir, '.mcp.json');
writeFileSync(outPath, JSON.stringify({
mcpServers: { 'existing--server': { command: 'echo', args: [] } },
}));

const cmd = createClaudeCommand({ client, log });
await cmd.parseAsync(['generate', 'proj-1', '-o', outPath, '--merge'], { from: 'user' });

const written = JSON.parse(readFileSync(outPath, 'utf-8'));
expect(written.mcpServers['existing--server']).toBeDefined();
expect(written.mcpServers['slack--default']).toBeDefined();
expect(output.join('\n')).toContain('3 server(s)');

rmSync(tmpDir, { recursive: true, force: true });
const parsed = JSON.parse(output[0]);
expect(parsed.mcpServers['myproj']).toEqual({
command: 'mcpctl',
args: ['mcp', '-p', 'myproj'],
});
});

describe('show', () => {
it('shows servers in .mcp.json', () => {
const filePath = join(tmpDir, '.mcp.json');
writeFileSync(filePath, JSON.stringify({
mcpServers: {
'slack': { command: 'npx', args: ['-y', '@anthropic/slack-mcp'], env: { TOKEN: 'x' } },
},
}));
it('merges with existing .mcp.json', async () => {
const outPath = join(tmpDir, '.mcp.json');
writeFileSync(outPath, JSON.stringify({
mcpServers: { 'existing--server': { command: 'echo', args: [] } },
}));

const cmd = createClaudeCommand({ client, log });
cmd.parseAsync(['show', '-p', filePath], { from: 'user' });
const cmd = createConfigCommand(
{ configDeps: { configDir: tmpDir }, log },
{ client, credentialsDeps: { configDir: tmpDir }, log },
);
await cmd.parseAsync(['claude', '--project', 'proj-1', '-o', outPath, '--merge'], { from: 'user' });

expect(output.join('\n')).toContain('slack');
expect(output.join('\n')).toContain('npx -y @anthropic/slack-mcp');
expect(output.join('\n')).toContain('TOKEN');

rmSync(tmpDir, { recursive: true, force: true });
const written = JSON.parse(readFileSync(outPath, 'utf-8'));
expect(written.mcpServers['existing--server']).toBeDefined();
expect(written.mcpServers['proj-1']).toEqual({
command: 'mcpctl',
args: ['mcp', '-p', 'proj-1'],
});
expect(output.join('\n')).toContain('2 server(s)');
});

it('handles missing file', () => {
const cmd = createClaudeCommand({ client, log });
cmd.parseAsync(['show', '-p', join(tmpDir, 'nonexistent.json')], { from: 'user' });
it('backward compat: claude-generate still works', async () => {
const outPath = join(tmpDir, '.mcp.json');
const cmd = createConfigCommand(
{ configDeps: { configDir: tmpDir }, log },
{ client, credentialsDeps: { configDir: tmpDir }, log },
);
await cmd.parseAsync(['claude-generate', '--project', 'proj-1', '-o', outPath], { from: 'user' });

expect(output.join('\n')).toContain('No .mcp.json found');
rmSync(tmpDir, { recursive: true, force: true });
const written = JSON.parse(readFileSync(outPath, 'utf-8'));
expect(written.mcpServers['proj-1']).toEqual({
command: 'mcpctl',
args: ['mcp', '-p', 'proj-1'],
});
});

describe('add', () => {
it('adds a server entry', () => {
const filePath = join(tmpDir, '.mcp.json');
const cmd = createClaudeCommand({ client, log });
cmd.parseAsync(['add', 'my-server', '-c', 'npx', '-a', '-y', 'my-pkg', '-p', filePath], { from: 'user' });
it('uses project name as the server key', async () => {
const outPath = join(tmpDir, '.mcp.json');
const cmd = createConfigCommand(
{ configDeps: { configDir: tmpDir }, log },
);
await cmd.parseAsync(['claude', '--project', 'my-fancy-project', '-o', outPath], { from: 'user' });

const written = JSON.parse(readFileSync(filePath, 'utf-8'));
expect(written.mcpServers['my-server']).toEqual({
command: 'npx',
args: ['-y', 'my-pkg'],
});

rmSync(tmpDir, { recursive: true, force: true });
});

it('adds server with env vars', () => {
const filePath = join(tmpDir, '.mcp.json');
const cmd = createClaudeCommand({ client, log });
cmd.parseAsync(['add', 'my-server', '-c', 'node', '-e', 'KEY=val', 'SECRET=abc', '-p', filePath], { from: 'user' });

const written = JSON.parse(readFileSync(filePath, 'utf-8'));
expect(written.mcpServers['my-server'].env).toEqual({ KEY: 'val', SECRET: 'abc' });

rmSync(tmpDir, { recursive: true, force: true });
});
});

describe('remove', () => {
it('removes a server entry', () => {
const filePath = join(tmpDir, '.mcp.json');
writeFileSync(filePath, JSON.stringify({
mcpServers: { 'slack': { command: 'npx', args: [] }, 'github': { command: 'npx', args: [] } },
}));

const cmd = createClaudeCommand({ client, log });
cmd.parseAsync(['remove', 'slack', '-p', filePath], { from: 'user' });

const written = JSON.parse(readFileSync(filePath, 'utf-8'));
expect(written.mcpServers['slack']).toBeUndefined();
expect(written.mcpServers['github']).toBeDefined();
expect(output.join('\n')).toContain("Removed 'slack'");

rmSync(tmpDir, { recursive: true, force: true });
});

it('reports when server not found', () => {
const filePath = join(tmpDir, '.mcp.json');
writeFileSync(filePath, JSON.stringify({ mcpServers: {} }));

const cmd = createClaudeCommand({ client, log });
cmd.parseAsync(['remove', 'nonexistent', '-p', filePath], { from: 'user' });

expect(output.join('\n')).toContain('not found');
rmSync(tmpDir, { recursive: true, force: true });
});
const written = JSON.parse(readFileSync(outPath, 'utf-8'));
expect(Object.keys(written.mcpServers)).toEqual(['my-fancy-project']);
});
});

describe('config impersonate', () => {
let client: ReturnType<typeof mockClient>;
let output: string[];
let tmpDir: string;
const log = (...args: string[]) => output.push(args.join(' '));

beforeEach(() => {
client = mockClient();
output = [];
tmpDir = mkdtempSync(join(tmpdir(), 'mcpctl-config-impersonate-'));
});

afterEach(() => {
rmSync(tmpDir, { recursive: true, force: true });
});

it('impersonates a user and saves backup', async () => {
saveCredentials({ token: 'admin-tok', mcpdUrl: 'http://localhost:3100', user: 'admin@test.com' }, { configDir: tmpDir });

const cmd = createConfigCommand(
{ configDeps: { configDir: tmpDir }, log },
{ client, credentialsDeps: { configDir: tmpDir }, log },
);
await cmd.parseAsync(['impersonate', 'other@test.com'], { from: 'user' });

expect(client.post).toHaveBeenCalledWith('/api/v1/auth/impersonate', { email: 'other@test.com' });
expect(output.join('\n')).toContain('Impersonating other@test.com');

const creds = loadCredentials({ configDir: tmpDir });
expect(creds!.user).toBe('other@test.com');
expect(creds!.token).toBe('impersonated-tok');

// Backup exists
const backup = JSON.parse(readFileSync(join(tmpDir, 'credentials-backup'), 'utf-8'));
expect(backup.user).toBe('admin@test.com');
});

it('quits impersonation and restores backup', async () => {
// Set up current (impersonated) credentials
saveCredentials({ token: 'impersonated-tok', mcpdUrl: 'http://localhost:3100', user: 'other@test.com' }, { configDir: tmpDir });
// Set up backup (original) credentials
writeFileSync(join(tmpDir, 'credentials-backup'), JSON.stringify({
token: 'admin-tok', mcpdUrl: 'http://localhost:3100', user: 'admin@test.com',
}));

const cmd = createConfigCommand(
{ configDeps: { configDir: tmpDir }, log },
{ client, credentialsDeps: { configDir: tmpDir }, log },
);
await cmd.parseAsync(['impersonate', '--quit'], { from: 'user' });

expect(output.join('\n')).toContain('Returned to admin@test.com');

const creds = loadCredentials({ configDir: tmpDir });
expect(creds!.user).toBe('admin@test.com');
expect(creds!.token).toBe('admin-tok');
});

it('errors when not logged in', async () => {
const cmd = createConfigCommand(
{ configDeps: { configDir: tmpDir }, log },
{ client, credentialsDeps: { configDir: tmpDir }, log },
);
await cmd.parseAsync(['impersonate', 'other@test.com'], { from: 'user' });

expect(output.join('\n')).toContain('Not logged in');
});

it('errors when quitting with no backup', async () => {
const cmd = createConfigCommand(
{ configDeps: { configDir: tmpDir }, log },
{ client, credentialsDeps: { configDir: tmpDir }, log },
);
await cmd.parseAsync(['impersonate', '--quit'], { from: 'user' });

expect(output.join('\n')).toContain('No impersonation session to quit');
});
});
src/cli/tests/commands/config-setup.test.ts (new file, 293 lines)
@@ -0,0 +1,293 @@
import { describe, it, expect, vi, beforeEach } from 'vitest';
import { createConfigSetupCommand } from '../../src/commands/config-setup.js';
import type { ConfigSetupDeps, ConfigSetupPrompt } from '../../src/commands/config-setup.js';
import type { SecretStore } from '@mcpctl/shared';
import { mkdtempSync, rmSync, readFileSync } from 'node:fs';
import { join } from 'node:path';
import { tmpdir } from 'node:os';

let tempDir: string;
let logs: string[];

beforeEach(() => {
tempDir = mkdtempSync(join(tmpdir(), 'mcpctl-config-setup-test-'));
logs = [];
});

function cleanup() {
rmSync(tempDir, { recursive: true, force: true });
}

function mockSecretStore(secrets: Record<string, string> = {}): SecretStore {
const store: Record<string, string> = { ...secrets };
return {
get: vi.fn(async (key: string) => store[key] ?? null),
set: vi.fn(async (key: string, value: string) => { store[key] = value; }),
delete: vi.fn(async () => true),
backend: () => 'mock',
};
}

function mockPrompt(answers: unknown[]): ConfigSetupPrompt {
let callIndex = 0;
return {
select: vi.fn(async () => answers[callIndex++]),
input: vi.fn(async () => answers[callIndex++] as string),
password: vi.fn(async () => answers[callIndex++] as string),
confirm: vi.fn(async () => answers[callIndex++] as boolean),
};
}

function buildDeps(overrides: {
secrets?: Record<string, string>;
answers?: unknown[];
fetchModels?: ConfigSetupDeps['fetchModels'];
whichBinary?: ConfigSetupDeps['whichBinary'];
} = {}): ConfigSetupDeps {
return {
configDeps: { configDir: tempDir },
secretStore: mockSecretStore(overrides.secrets),
log: (...args: string[]) => logs.push(args.join(' ')),
prompt: mockPrompt(overrides.answers ?? []),
fetchModels: overrides.fetchModels ?? vi.fn(async () => []),
whichBinary: overrides.whichBinary ?? vi.fn(async () => '/usr/bin/gemini'),
};
}

function readConfig(): Record<string, unknown> {
const raw = readFileSync(join(tempDir, 'config.json'), 'utf-8');
return JSON.parse(raw) as Record<string, unknown>;
}

async function runSetup(deps: ConfigSetupDeps): Promise<void> {
const cmd = createConfigSetupCommand(deps);
await cmd.parseAsync([], { from: 'user' });
}

describe('config setup wizard', () => {
describe('provider: none', () => {
it('disables LLM and saves config', async () => {
const deps = buildDeps({ answers: ['simple', 'none'] });
await runSetup(deps);

const config = readConfig();
expect(config.llm).toEqual({ provider: 'none' });
expect(logs.some((l) => l.includes('LLM disabled'))).toBe(true);
cleanup();
});
});

describe('provider: gemini-cli', () => {
it('auto-detects binary path and saves config', async () => {
// Answers: select provider, select model (no binary prompt — auto-detected)
const deps = buildDeps({
answers: ['simple', 'gemini-cli', 'gemini-2.5-flash'],
whichBinary: vi.fn(async () => '/home/user/.npm-global/bin/gemini'),
});
await runSetup(deps);

const config = readConfig();
const llm = config.llm as Record<string, unknown>;
expect(llm.provider).toBe('gemini-cli');
expect(llm.model).toBe('gemini-2.5-flash');
expect(llm.binaryPath).toBe('/home/user/.npm-global/bin/gemini');
expect(logs.some((l) => l.includes('Found gemini at'))).toBe(true);
cleanup();
});

it('prompts for manual path when binary not found', async () => {
// Answers: select provider, select model, enter manual path
const deps = buildDeps({
answers: ['simple', 'gemini-cli', 'gemini-2.5-flash', '/opt/gemini'],
whichBinary: vi.fn(async () => null),
});
await runSetup(deps);

const config = readConfig();
const llm = config.llm as Record<string, unknown>;
expect(llm.binaryPath).toBe('/opt/gemini');
expect(logs.some((l) => l.includes('not found'))).toBe(true);
cleanup();
});

it('saves gemini-cli with custom model', async () => {
// Answers: select provider, select custom, enter model name
const deps = buildDeps({
answers: ['simple', 'gemini-cli', '__custom__', 'gemini-3.0-flash'],
whichBinary: vi.fn(async () => '/usr/bin/gemini'),
});
await runSetup(deps);

const config = readConfig();
const llm = config.llm as Record<string, unknown>;
expect(llm.model).toBe('gemini-3.0-flash');
cleanup();
});
});

describe('provider: ollama', () => {
it('fetches models and allows selection', async () => {
const fetchModels = vi.fn(async () => ['llama3.2', 'codellama', 'mistral']);
// Answers: select provider, enter URL, select model
const deps = buildDeps({
answers: ['simple', 'ollama', 'http://localhost:11434', 'codellama'],
fetchModels,
});
await runSetup(deps);

expect(fetchModels).toHaveBeenCalledWith('http://localhost:11434', '/api/tags');
const config = readConfig();
const llm = config.llm as Record<string, unknown>;
expect(llm.provider).toBe('ollama');
expect(llm.model).toBe('codellama');
expect(llm.url).toBe('http://localhost:11434');
cleanup();
});

it('falls back to manual input when fetch fails', async () => {
const fetchModels = vi.fn(async () => []);
// Answers: select provider, enter URL, enter model manually
const deps = buildDeps({
answers: ['simple', 'ollama', 'http://localhost:11434', 'llama3.2'],
fetchModels,
});
await runSetup(deps);

const config = readConfig();
expect((config.llm as Record<string, unknown>).model).toBe('llama3.2');
cleanup();
});
});

describe('provider: anthropic', () => {
it('prompts for API key and saves to secret store', async () => {
// Answers: select provider, enter API key, select model
const deps = buildDeps({
answers: ['simple', 'anthropic', 'sk-ant-new-key', 'claude-haiku-3-5-20241022'],
});
await runSetup(deps);

expect(deps.secretStore.set).toHaveBeenCalledWith('anthropic-api-key', 'sk-ant-new-key');
const config = readConfig();
const llm = config.llm as Record<string, unknown>;
expect(llm.provider).toBe('anthropic');
expect(llm.model).toBe('claude-haiku-3-5-20241022');
// API key should NOT be in config file
expect(llm).not.toHaveProperty('apiKey');
cleanup();
});

it('shows existing key masked and allows keeping it', async () => {
// Answers: select provider, confirm change=false, select model
const deps = buildDeps({
secrets: { 'anthropic-api-key': 'sk-ant-existing-key-1234' },
answers: ['simple', 'anthropic', false, 'claude-sonnet-4-20250514'],
});
await runSetup(deps);

// Should NOT have called set (kept existing key)
expect(deps.secretStore.set).not.toHaveBeenCalled();
const config = readConfig();
expect((config.llm as Record<string, unknown>).model).toBe('claude-sonnet-4-20250514');
cleanup();
});

it('allows replacing existing key', async () => {
// Answers: select provider, confirm change=true, enter new key, select model
const deps = buildDeps({
secrets: { 'anthropic-api-key': 'sk-ant-old' },
answers: ['simple', 'anthropic', true, 'sk-ant-new', 'claude-haiku-3-5-20241022'],
});
await runSetup(deps);

expect(deps.secretStore.set).toHaveBeenCalledWith('anthropic-api-key', 'sk-ant-new');
cleanup();
});
});

describe('provider: vllm', () => {
it('fetches models from vLLM and allows selection', async () => {
const fetchModels = vi.fn(async () => ['my-model', 'llama-70b']);
// Answers: select provider, enter URL, select model
const deps = buildDeps({
answers: ['simple', 'vllm', 'http://gpu:8000', 'llama-70b'],
fetchModels,
});
await runSetup(deps);

expect(fetchModels).toHaveBeenCalledWith('http://gpu:8000', '/v1/models');
const config = readConfig();
const llm = config.llm as Record<string, unknown>;
expect(llm.provider).toBe('vllm');
expect(llm.url).toBe('http://gpu:8000');
expect(llm.model).toBe('llama-70b');
cleanup();
});
});

describe('provider: openai', () => {
it('prompts for key, model, and optional custom endpoint', async () => {
// Answers: select provider, enter key, enter model, confirm custom URL=true, enter URL
const deps = buildDeps({
answers: ['simple', 'openai', 'sk-openai-key', 'gpt-4o', true, 'https://custom.api.com'],
});
await runSetup(deps);

expect(deps.secretStore.set).toHaveBeenCalledWith('openai-api-key', 'sk-openai-key');
const config = readConfig();
const llm = config.llm as Record<string, unknown>;
expect(llm.provider).toBe('openai');
expect(llm.model).toBe('gpt-4o');
expect(llm.url).toBe('https://custom.api.com');
cleanup();
});

it('skips custom URL when not requested', async () => {
// Answers: select provider, enter key, enter model, confirm custom URL=false
const deps = buildDeps({
answers: ['simple', 'openai', 'sk-openai-key', 'gpt-4o-mini', false],
});
await runSetup(deps);

const config = readConfig();
const llm = config.llm as Record<string, unknown>;
expect(llm.url).toBeUndefined();
cleanup();
});
});

describe('provider: deepseek', () => {
it('prompts for key and model', async () => {
// Answers: select provider, enter key, select model
const deps = buildDeps({
answers: ['simple', 'deepseek', 'sk-ds-key', 'deepseek-chat'],
});
await runSetup(deps);

expect(deps.secretStore.set).toHaveBeenCalledWith('deepseek-api-key', 'sk-ds-key');
const config = readConfig();
const llm = config.llm as Record<string, unknown>;
expect(llm.provider).toBe('deepseek');
expect(llm.model).toBe('deepseek-chat');
cleanup();
});
});

describe('output messages', () => {
it('shows restart instruction', async () => {
const deps = buildDeps({ answers: ['simple', 'gemini-cli', 'gemini-2.5-flash'] });
await runSetup(deps);

expect(logs.some((l) => l.includes('systemctl --user restart mcplocal'))).toBe(true);
cleanup();
});

it('shows configured provider and model', async () => {
const deps = buildDeps({ answers: ['simple', 'gemini-cli', 'gemini-2.5-flash'] });
await runSetup(deps);

expect(logs.some((l) => l.includes('gemini-cli') && l.includes('gemini-2.5-flash'))).toBe(true);
cleanup();
});
});
});
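The `mockPrompt` helper in the setup-wizard tests scripts an entire interactive run as one flat answer array: every prompt call, whatever its kind, consumes the next queued answer. A minimal standalone sketch of that pattern follows; `WizardPrompt` and `queuedPrompt` are hypothetical stand-ins for the project's `ConfigSetupPrompt` and `mockPrompt`.

```typescript
interface WizardPrompt {
  select: (message: string) => Promise<unknown>;
  input: (message: string) => Promise<string>;
  confirm: (message: string) => Promise<boolean>;
}

// A single shared cursor walks the answers array, so the test's answer order
// must match the wizard's question order exactly.
function queuedPrompt(answers: unknown[]): WizardPrompt {
  let callIndex = 0;
  const next = () => answers[callIndex++];
  return {
    select: async () => next(),
    input: async () => next() as string,
    confirm: async () => next() as boolean,
  };
}

// For an ollama run the queue reads: mode, provider, URL, model.
const prompt = queuedPrompt(['simple', 'ollama', 'http://localhost:11434', 'codellama']);
prompt.select('Mode?').then((mode) => console.log(mode));
```

The trade-off is fragility by design: if the wizard adds or reorders a question, every affected test's answer array shifts, which makes prompt-flow changes visible in review.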
@@ -175,6 +175,7 @@ describe('create command', () => {
|
||||
expect(client.post).toHaveBeenCalledWith('/api/v1/projects', {
|
||||
name: 'my-project',
|
||||
description: 'A test project',
|
||||
proxyMode: 'direct',
|
||||
});
|
||||
expect(output.join('\n')).toContain("project 'test' created");
|
||||
});
|
||||
@@ -185,6 +186,7 @@ describe('create command', () => {
|
||||
expect(client.post).toHaveBeenCalledWith('/api/v1/projects', {
|
||||
name: 'minimal',
|
||||
description: '',
|
||||
proxyMode: 'direct',
|
||||
});
|
||||
});
|
||||
|
||||
@@ -193,8 +195,366 @@ describe('create command', () => {
|
||||
vi.mocked(client.get).mockResolvedValueOnce([{ id: 'proj-1', name: 'my-proj' }] as never);
|
||||
const cmd = createCreateCommand({ client, log });
|
||||
await cmd.parseAsync(['project', 'my-proj', '-d', 'updated', '--force'], { from: 'user' });
|
||||
expect(client.put).toHaveBeenCalledWith('/api/v1/projects/proj-1', { description: 'updated' });
|
||||
expect(client.put).toHaveBeenCalledWith('/api/v1/projects/proj-1', { description: 'updated', proxyMode: 'direct' });
|
||||
expect(output.join('\n')).toContain("project 'my-proj' updated");
|
||||
});
|
||||
});
|
||||
|
||||
describe('create user', () => {
|
||||
it('creates a user with password and name', async () => {
|
||||
      vi.mocked(client.post).mockResolvedValueOnce({ id: 'usr-1', email: 'alice@test.com' });
      const cmd = createCreateCommand({ client, log });
      await cmd.parseAsync([
        'user', 'alice@test.com',
        '--password', 'secret123',
        '--name', 'Alice',
      ], { from: 'user' });

      expect(client.post).toHaveBeenCalledWith('/api/v1/users', {
        email: 'alice@test.com',
        password: 'secret123',
        name: 'Alice',
      });
      expect(output.join('\n')).toContain("user 'alice@test.com' created");
    });

    it('does not send role field (RBAC is the auth mechanism)', async () => {
      vi.mocked(client.post).mockResolvedValueOnce({ id: 'usr-1', email: 'admin@test.com' });
      const cmd = createCreateCommand({ client, log });
      await cmd.parseAsync([
        'user', 'admin@test.com',
        '--password', 'pass123',
      ], { from: 'user' });

      const callBody = vi.mocked(client.post).mock.calls[0]![1] as Record<string, unknown>;
      expect(callBody).not.toHaveProperty('role');
    });

    it('requires --password', async () => {
      const cmd = createCreateCommand({ client, log });
      await expect(cmd.parseAsync(['user', 'alice@test.com'], { from: 'user' })).rejects.toThrow('--password is required');
    });

    it('throws on 409 without --force', async () => {
      vi.mocked(client.post).mockRejectedValueOnce(new ApiError(409, '{"error":"User already exists"}'));
      const cmd = createCreateCommand({ client, log });
      await expect(
        cmd.parseAsync(['user', 'alice@test.com', '--password', 'pass'], { from: 'user' }),
      ).rejects.toThrow('API error 409');
    });

    it('updates existing user on 409 with --force', async () => {
      vi.mocked(client.post).mockRejectedValueOnce(new ApiError(409, '{"error":"User already exists"}'));
      vi.mocked(client.get).mockResolvedValueOnce([{ id: 'usr-1', email: 'alice@test.com' }] as never);
      const cmd = createCreateCommand({ client, log });
      await cmd.parseAsync([
        'user', 'alice@test.com', '--password', 'newpass', '--name', 'Alice New', '--force',
      ], { from: 'user' });

      expect(client.put).toHaveBeenCalledWith('/api/v1/users/usr-1', {
        password: 'newpass',
        name: 'Alice New',
      });
      expect(output.join('\n')).toContain("user 'alice@test.com' updated");
    });
  });

  describe('create group', () => {
    it('creates a group with members', async () => {
      vi.mocked(client.post).mockResolvedValueOnce({ id: 'grp-1', name: 'dev-team' });
      const cmd = createCreateCommand({ client, log });
      await cmd.parseAsync([
        'group', 'dev-team',
        '--description', 'Development team',
        '--member', 'alice@test.com',
        '--member', 'bob@test.com',
      ], { from: 'user' });

      expect(client.post).toHaveBeenCalledWith('/api/v1/groups', {
        name: 'dev-team',
        description: 'Development team',
        members: ['alice@test.com', 'bob@test.com'],
      });
      expect(output.join('\n')).toContain("group 'dev-team' created");
    });

    it('creates a group with no members', async () => {
      vi.mocked(client.post).mockResolvedValueOnce({ id: 'grp-1', name: 'empty-group' });
      const cmd = createCreateCommand({ client, log });
      await cmd.parseAsync(['group', 'empty-group'], { from: 'user' });

      expect(client.post).toHaveBeenCalledWith('/api/v1/groups', {
        name: 'empty-group',
        members: [],
      });
    });

    it('throws on 409 without --force', async () => {
      vi.mocked(client.post).mockRejectedValueOnce(new ApiError(409, '{"error":"Group already exists"}'));
      const cmd = createCreateCommand({ client, log });
      await expect(
        cmd.parseAsync(['group', 'dev-team'], { from: 'user' }),
      ).rejects.toThrow('API error 409');
    });

    it('updates existing group on 409 with --force', async () => {
      vi.mocked(client.post).mockRejectedValueOnce(new ApiError(409, '{"error":"Group already exists"}'));
      vi.mocked(client.get).mockResolvedValueOnce([{ id: 'grp-1', name: 'dev-team' }] as never);
      const cmd = createCreateCommand({ client, log });
      await cmd.parseAsync([
        'group', 'dev-team', '--member', 'new@test.com', '--force',
      ], { from: 'user' });

      expect(client.put).toHaveBeenCalledWith('/api/v1/groups/grp-1', {
        members: ['new@test.com'],
      });
      expect(output.join('\n')).toContain("group 'dev-team' updated");
    });
  });

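The `--force` tests above all exercise the same create-or-update flow: POST first, and on a 409 conflict with `--force` set, resolve the existing resource's ID by name and PUT the new fields to it. A hedged sketch of that flow, not the actual mcpctl source — the `Client` interface, `ApiError` shape, and `upsert` helper name are assumptions for illustration:

```typescript
// Minimal ApiError matching the tests' "API error 409" message shape (assumed).
class ApiError extends Error {
  constructor(public status: number, body: string) {
    super(`API error ${status}: ${body}`);
  }
}

// Assumed subset of the HTTP client the commands depend on.
interface Client {
  post(path: string, body: unknown): Promise<{ id: string }>;
  get(path: string): Promise<Array<{ id: string; name?: string; email?: string }>>;
  put(path: string, body: unknown): Promise<unknown>;
}

// Create-or-update: POST, then on 409 + --force look the resource up by its
// unique name (or email, for users) and PUT the update body to its ID.
async function upsert(
  client: Client,
  collection: string,        // e.g. 'users', 'groups', 'rbac'
  name: string,              // unique name/email used for the 409 lookup
  createBody: unknown,
  updateBody: unknown,
  force: boolean,
): Promise<'created' | 'updated'> {
  try {
    await client.post(`/api/v1/${collection}`, createBody);
    return 'created';
  } catch (err) {
    if (!(err instanceof ApiError) || err.status !== 409 || !force) throw err;
    const existing = await client.get(`/api/v1/${collection}`);
    const match = existing.find((r) => r.name === name || r.email === name);
    if (!match) throw err;
    await client.put(`/api/v1/${collection}/${match.id}`, updateBody);
    return 'updated';
  }
}
```

This matches the observable contract the tests pin down: without `--force` the 409 propagates; with it, the PUT targets `/api/v1/<collection>/<resolved-id>`.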
  describe('create rbac', () => {
    it('creates an RBAC definition with subjects and bindings', async () => {
      vi.mocked(client.post).mockResolvedValueOnce({ id: 'rbac-1', name: 'developers' });
      const cmd = createCreateCommand({ client, log });
      await cmd.parseAsync([
        'rbac', 'developers',
        '--subject', 'User:alice@test.com',
        '--subject', 'Group:dev-team',
        '--binding', 'edit:servers',
        '--binding', 'view:instances',
      ], { from: 'user' });

      expect(client.post).toHaveBeenCalledWith('/api/v1/rbac', {
        name: 'developers',
        subjects: [
          { kind: 'User', name: 'alice@test.com' },
          { kind: 'Group', name: 'dev-team' },
        ],
        roleBindings: [
          { role: 'edit', resource: 'servers' },
          { role: 'view', resource: 'instances' },
        ],
      });
      expect(output.join('\n')).toContain("rbac 'developers' created");
    });

    it('creates an RBAC definition with wildcard resource', async () => {
      vi.mocked(client.post).mockResolvedValueOnce({ id: 'rbac-1', name: 'admins' });
      const cmd = createCreateCommand({ client, log });
      await cmd.parseAsync([
        'rbac', 'admins',
        '--subject', 'User:admin@test.com',
        '--binding', 'edit:*',
      ], { from: 'user' });

      expect(client.post).toHaveBeenCalledWith('/api/v1/rbac', {
        name: 'admins',
        subjects: [{ kind: 'User', name: 'admin@test.com' }],
        roleBindings: [{ role: 'edit', resource: '*' }],
      });
    });

    it('creates an RBAC definition with empty subjects and bindings', async () => {
      vi.mocked(client.post).mockResolvedValueOnce({ id: 'rbac-1', name: 'empty' });
      const cmd = createCreateCommand({ client, log });
      await cmd.parseAsync(['rbac', 'empty'], { from: 'user' });

      expect(client.post).toHaveBeenCalledWith('/api/v1/rbac', {
        name: 'empty',
        subjects: [],
        roleBindings: [],
      });
    });

    it('throws on invalid subject format', async () => {
      const cmd = createCreateCommand({ client, log });
      await expect(
        cmd.parseAsync(['rbac', 'bad', '--subject', 'no-colon'], { from: 'user' }),
      ).rejects.toThrow('Invalid subject format');
    });

    it('throws on invalid binding format', async () => {
      const cmd = createCreateCommand({ client, log });
      await expect(
        cmd.parseAsync(['rbac', 'bad', '--binding', 'no-colon'], { from: 'user' }),
      ).rejects.toThrow('Invalid binding format');
    });

    it('throws on 409 without --force', async () => {
      vi.mocked(client.post).mockRejectedValueOnce(new ApiError(409, '{"error":"RBAC already exists"}'));
      const cmd = createCreateCommand({ client, log });
      await expect(
        cmd.parseAsync(['rbac', 'developers', '--subject', 'User:a@b.com', '--binding', 'edit:servers'], { from: 'user' }),
      ).rejects.toThrow('API error 409');
    });

    it('updates existing RBAC on 409 with --force', async () => {
      vi.mocked(client.post).mockRejectedValueOnce(new ApiError(409, '{"error":"RBAC already exists"}'));
      vi.mocked(client.get).mockResolvedValueOnce([{ id: 'rbac-1', name: 'developers' }] as never);
      const cmd = createCreateCommand({ client, log });
      await cmd.parseAsync([
        'rbac', 'developers',
        '--subject', 'User:new@test.com',
        '--binding', 'edit:*',
        '--force',
      ], { from: 'user' });

      expect(client.put).toHaveBeenCalledWith('/api/v1/rbac/rbac-1', {
        subjects: [{ kind: 'User', name: 'new@test.com' }],
        roleBindings: [{ role: 'edit', resource: '*' }],
      });
      expect(output.join('\n')).toContain("rbac 'developers' updated");
    });

    it('creates an RBAC definition with operation bindings', async () => {
      vi.mocked(client.post).mockResolvedValueOnce({ id: 'rbac-1', name: 'ops' });
      const cmd = createCreateCommand({ client, log });
      await cmd.parseAsync([
        'rbac', 'ops',
        '--subject', 'Group:ops-team',
        '--binding', 'edit:servers',
        '--operation', 'logs',
        '--operation', 'backup',
      ], { from: 'user' });

      expect(client.post).toHaveBeenCalledWith('/api/v1/rbac', {
        name: 'ops',
        subjects: [{ kind: 'Group', name: 'ops-team' }],
        roleBindings: [
          { role: 'edit', resource: 'servers' },
          { role: 'run', action: 'logs' },
          { role: 'run', action: 'backup' },
        ],
      });
      expect(output.join('\n')).toContain("rbac 'ops' created");
    });

    it('creates an RBAC definition with name-scoped binding', async () => {
      vi.mocked(client.post).mockResolvedValueOnce({ id: 'rbac-1', name: 'ha-viewer' });
      const cmd = createCreateCommand({ client, log });
      await cmd.parseAsync([
        'rbac', 'ha-viewer',
        '--subject', 'User:alice@test.com',
        '--binding', 'view:servers:my-ha',
      ], { from: 'user' });

      expect(client.post).toHaveBeenCalledWith('/api/v1/rbac', {
        name: 'ha-viewer',
        subjects: [{ kind: 'User', name: 'alice@test.com' }],
        roleBindings: [
          { role: 'view', resource: 'servers', name: 'my-ha' },
        ],
      });
    });
  });

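The rbac tests pin down a small textual grammar for the flags: `--subject` takes `Kind:name`, and `--binding` takes `role:resource` with an optional `:name` scope (operation bindings use the separate `--operation` flag and map to `{ role: 'run', action }`). A sketch of parsers satisfying those tests — the function and type names are hypothetical, not the actual mcpctl implementation:

```typescript
type Subject = { kind: string; name: string };
type Binding = { role: string; resource: string; name?: string };

// "User:alice@test.com" -> { kind: 'User', name: 'alice@test.com' }.
// Split on the FIRST colon only, since emails and names may not contain one
// but splitting greedily keeps the rule simple and matches the test inputs.
function parseSubject(raw: string): Subject {
  const idx = raw.indexOf(':');
  if (idx === -1) throw new Error(`Invalid subject format: '${raw}' (expected Kind:name)`);
  return { kind: raw.slice(0, idx), name: raw.slice(idx + 1) };
}

// "view:servers:my-ha" -> { role: 'view', resource: 'servers', name: 'my-ha' };
// the third segment (instance name scope) is optional.
function parseBinding(raw: string): Binding {
  const [role, resource, name] = raw.split(':');
  if (!role || !resource) throw new Error(`Invalid binding format: '${raw}' (expected role:resource[:name])`);
  return name ? { role, resource, name } : { role, resource };
}
```

Inputs without a colon (`'no-colon'`) fall through to the thrown errors, which is what the two "invalid format" tests assert via `rejects.toThrow`.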
  describe('create prompt', () => {
    it('creates a prompt with content', async () => {
      vi.mocked(client.post).mockResolvedValueOnce({ id: 'p-1', name: 'test-prompt' });
      const cmd = createCreateCommand({ client, log });
      await cmd.parseAsync(['prompt', 'test-prompt', '--content', 'Hello world'], { from: 'user' });

      expect(client.post).toHaveBeenCalledWith('/api/v1/prompts', {
        name: 'test-prompt',
        content: 'Hello world',
      });
      expect(output.join('\n')).toContain("prompt 'test-prompt' created");
    });

    it('requires content or content-file', async () => {
      const cmd = createCreateCommand({ client, log });
      await expect(
        cmd.parseAsync(['prompt', 'no-content'], { from: 'user' }),
      ).rejects.toThrow('--content or --content-file is required');
    });

    it('--priority sets prompt priority', async () => {
      vi.mocked(client.post).mockResolvedValueOnce({ id: 'p-1', name: 'pri-prompt' });
      const cmd = createCreateCommand({ client, log });
      await cmd.parseAsync(['prompt', 'pri-prompt', '--content', 'x', '--priority', '8'], { from: 'user' });

      expect(client.post).toHaveBeenCalledWith('/api/v1/prompts', expect.objectContaining({
        priority: 8,
      }));
    });

    it('--priority validates range 1-10', async () => {
      const cmd = createCreateCommand({ client, log });
      await expect(
        cmd.parseAsync(['prompt', 'bad', '--content', 'x', '--priority', '15'], { from: 'user' }),
      ).rejects.toThrow('--priority must be a number between 1 and 10');
    });

    it('--priority rejects zero', async () => {
      const cmd = createCreateCommand({ client, log });
      await expect(
        cmd.parseAsync(['prompt', 'bad', '--content', 'x', '--priority', '0'], { from: 'user' }),
      ).rejects.toThrow('--priority must be a number between 1 and 10');
    });

    it('--link sets linkTarget', async () => {
      vi.mocked(client.post).mockResolvedValueOnce({ id: 'p-1', name: 'linked' });
      const cmd = createCreateCommand({ client, log });
      await cmd.parseAsync(['prompt', 'linked', '--content', 'x', '--link', 'proj/srv:docmost://pages/abc'], { from: 'user' });

      expect(client.post).toHaveBeenCalledWith('/api/v1/prompts', expect.objectContaining({
        linkTarget: 'proj/srv:docmost://pages/abc',
      }));
    });

    it('--project resolves project name to ID', async () => {
      vi.mocked(client.get).mockResolvedValueOnce([{ id: 'proj-1', name: 'my-project' }] as never);
      vi.mocked(client.post).mockResolvedValueOnce({ id: 'p-1', name: 'scoped' });
      const cmd = createCreateCommand({ client, log });
      await cmd.parseAsync(['prompt', 'scoped', '--content', 'x', '--project', 'my-project'], { from: 'user' });

      expect(client.post).toHaveBeenCalledWith('/api/v1/prompts', expect.objectContaining({
        projectId: 'proj-1',
      }));
    });

    it('--project throws when project not found', async () => {
      vi.mocked(client.get).mockResolvedValueOnce([] as never);
      const cmd = createCreateCommand({ client, log });
      await expect(
        cmd.parseAsync(['prompt', 'bad', '--content', 'x', '--project', 'nope'], { from: 'user' }),
      ).rejects.toThrow("Project 'nope' not found");
    });
  });

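The three priority tests together define the accepted range: an integer from 1 to 10 inclusive, with `0`, `15`, and non-numbers rejected using the same message. A one-function sketch of a validator meeting that contract (the name `parsePriority` and the string-in/number-out shape are assumptions, matching how Commander-style CLIs hand option values to a parser):

```typescript
// Validate a raw --priority option value: integer in [1, 10], else throw
// the exact message the tests assert on.
function parsePriority(raw: string): number {
  const n = Number(raw);
  if (!Number.isInteger(n) || n < 1 || n > 10) {
    throw new Error('--priority must be a number between 1 and 10');
  }
  return n;
}
```

Using `Number.isInteger` rather than `parseInt` also rejects values like `'2.5'` and `'abc'` instead of silently truncating them.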
  describe('create promptrequest', () => {
    it('creates a prompt request with priority', async () => {
      vi.mocked(client.post).mockResolvedValueOnce({ id: 'r-1', name: 'req' });
      const cmd = createCreateCommand({ client, log });
      await cmd.parseAsync(['promptrequest', 'req', '--content', 'proposal', '--priority', '7'], { from: 'user' });

      expect(client.post).toHaveBeenCalledWith('/api/v1/promptrequests', expect.objectContaining({
        name: 'req',
        content: 'proposal',
        priority: 7,
      }));
    });
  });

  describe('create project', () => {
    it('creates a project with --gated', async () => {
      vi.mocked(client.post).mockResolvedValueOnce({ id: 'proj-1', name: 'gated-proj' });
      const cmd = createCreateCommand({ client, log });
      await cmd.parseAsync(['project', 'gated-proj', '--gated'], { from: 'user' });

      expect(client.post).toHaveBeenCalledWith('/api/v1/projects', expect.objectContaining({
        gated: true,
      }));
    });

    it('creates a project with --no-gated', async () => {
      vi.mocked(client.post).mockResolvedValueOnce({ id: 'proj-1', name: 'open-proj' });
      const cmd = createCreateCommand({ client, log });
      await cmd.parseAsync(['project', 'open-proj', '--no-gated'], { from: 'user' });

      expect(client.post).toHaveBeenCalledWith('/api/v1/projects', expect.objectContaining({
        gated: false,
      }));
    });
  });
});

@@ -287,4 +287,410 @@ describe('describe command', () => {
    expect(text).toContain('list_datasources');
    expect(text).toContain('mcpctl create server my-grafana --from-template=grafana');
  });

  it('shows user detail (no Role field — RBAC is the auth mechanism)', async () => {
    const deps = makeDeps({
      id: 'usr-1',
      email: 'alice@test.com',
      name: 'Alice Smith',
      provider: null,
      createdAt: '2025-01-01',
      updatedAt: '2025-01-15',
    });
    const cmd = createDescribeCommand(deps);
    await cmd.parseAsync(['node', 'test', 'user', 'usr-1']);

    expect(deps.fetchResource).toHaveBeenCalledWith('users', 'usr-1');
    const text = deps.output.join('\n');
    expect(text).toContain('=== User: alice@test.com ===');
    expect(text).toContain('Email:');
    expect(text).toContain('alice@test.com');
    expect(text).toContain('Name:');
    expect(text).toContain('Alice Smith');
    expect(text).not.toContain('Role:');
    expect(text).toContain('Provider:');
    expect(text).toContain('local');
    expect(text).toContain('ID:');
    expect(text).toContain('usr-1');
  });

  it('shows user with no name as dash', async () => {
    const deps = makeDeps({
      id: 'usr-2',
      email: 'bob@test.com',
      name: null,
      provider: 'oidc',
    });
    const cmd = createDescribeCommand(deps);
    await cmd.parseAsync(['node', 'test', 'user', 'usr-2']);

    const text = deps.output.join('\n');
    expect(text).toContain('=== User: bob@test.com ===');
    expect(text).toContain('Name:');
    expect(text).toContain('-');
    expect(text).not.toContain('Role:');
    expect(text).toContain('oidc');
  });

  it('shows group detail with members', async () => {
    const deps = makeDeps({
      id: 'grp-1',
      name: 'dev-team',
      description: 'Development team',
      members: [
        { user: { email: 'alice@test.com' }, createdAt: '2025-01-01' },
        { user: { email: 'bob@test.com' }, createdAt: '2025-01-02' },
      ],
      createdAt: '2025-01-01',
      updatedAt: '2025-01-15',
    });
    const cmd = createDescribeCommand(deps);
    await cmd.parseAsync(['node', 'test', 'group', 'grp-1']);

    expect(deps.fetchResource).toHaveBeenCalledWith('groups', 'grp-1');
    const text = deps.output.join('\n');
    expect(text).toContain('=== Group: dev-team ===');
    expect(text).toContain('Name:');
    expect(text).toContain('dev-team');
    expect(text).toContain('Description:');
    expect(text).toContain('Development team');
    expect(text).toContain('Members:');
    expect(text).toContain('EMAIL');
    expect(text).toContain('ADDED');
    expect(text).toContain('alice@test.com');
    expect(text).toContain('bob@test.com');
    expect(text).toContain('ID:');
    expect(text).toContain('grp-1');
  });

  it('shows group detail with no members', async () => {
    const deps = makeDeps({
      id: 'grp-2',
      name: 'empty-group',
      description: '',
      members: [],
    });
    const cmd = createDescribeCommand(deps);
    await cmd.parseAsync(['node', 'test', 'group', 'grp-2']);

    const text = deps.output.join('\n');
    expect(text).toContain('=== Group: empty-group ===');
    // No Members section when empty
    expect(text).not.toContain('EMAIL');
  });

  it('shows RBAC detail with subjects and bindings', async () => {
    const deps = makeDeps({
      id: 'rbac-1',
      name: 'developers',
      subjects: [
        { kind: 'User', name: 'alice@test.com' },
        { kind: 'Group', name: 'dev-team' },
      ],
      roleBindings: [
        { role: 'edit', resource: 'servers' },
        { role: 'view', resource: 'instances' },
        { role: 'view', resource: 'projects' },
      ],
      createdAt: '2025-01-01',
      updatedAt: '2025-01-15',
    });
    const cmd = createDescribeCommand(deps);
    await cmd.parseAsync(['node', 'test', 'rbac', 'rbac-1']);

    expect(deps.fetchResource).toHaveBeenCalledWith('rbac', 'rbac-1');
    const text = deps.output.join('\n');
    expect(text).toContain('=== RBAC: developers ===');
    expect(text).toContain('Name:');
    expect(text).toContain('developers');
    // Subjects section
    expect(text).toContain('Subjects:');
    expect(text).toContain('KIND');
    expect(text).toContain('NAME');
    expect(text).toContain('User');
    expect(text).toContain('alice@test.com');
    expect(text).toContain('Group');
    expect(text).toContain('dev-team');
    // Role Bindings section
    expect(text).toContain('Resource Bindings:');
    expect(text).toContain('ROLE');
    expect(text).toContain('RESOURCE');
    expect(text).toContain('edit');
    expect(text).toContain('servers');
    expect(text).toContain('view');
    expect(text).toContain('instances');
    expect(text).toContain('projects');
    expect(text).toContain('ID:');
    expect(text).toContain('rbac-1');
  });

  it('shows RBAC detail with wildcard resource', async () => {
    const deps = makeDeps({
      id: 'rbac-2',
      name: 'admins',
      subjects: [{ kind: 'User', name: 'admin@test.com' }],
      roleBindings: [{ role: 'edit', resource: '*' }],
    });
    const cmd = createDescribeCommand(deps);
    await cmd.parseAsync(['node', 'test', 'rbac', 'rbac-2']);

    const text = deps.output.join('\n');
    expect(text).toContain('=== RBAC: admins ===');
    expect(text).toContain('edit');
    expect(text).toContain('*');
  });

  it('shows RBAC detail with empty subjects and bindings', async () => {
    const deps = makeDeps({
      id: 'rbac-3',
      name: 'empty-rbac',
      subjects: [],
      roleBindings: [],
    });
    const cmd = createDescribeCommand(deps);
    await cmd.parseAsync(['node', 'test', 'rbac', 'rbac-3']);

    const text = deps.output.join('\n');
    expect(text).toContain('=== RBAC: empty-rbac ===');
    // No Subjects or Role Bindings sections when empty
    expect(text).not.toContain('KIND');
    expect(text).not.toContain('ROLE');
    expect(text).not.toContain('RESOURCE');
  });

  it('shows RBAC detail with mixed resource and operation bindings', async () => {
    const deps = makeDeps({
      id: 'rbac-1',
      name: 'admin-access',
      subjects: [{ kind: 'Group', name: 'admin' }],
      roleBindings: [
        { role: 'edit', resource: '*' },
        { role: 'run', resource: 'projects' },
        { role: 'run', action: 'logs' },
        { role: 'run', action: 'backup' },
      ],
      createdAt: '2025-01-01',
    });
    const cmd = createDescribeCommand(deps);
    await cmd.parseAsync(['node', 'test', 'rbac', 'rbac-1']);

    const text = deps.output.join('\n');
    expect(text).toContain('Resource Bindings:');
    expect(text).toContain('edit');
    expect(text).toContain('*');
    expect(text).toContain('run');
    expect(text).toContain('projects');
    expect(text).toContain('Operations:');
    expect(text).toContain('ACTION');
    expect(text).toContain('logs');
    expect(text).toContain('backup');
  });

  it('shows RBAC detail with name-scoped resource binding', async () => {
    const deps = makeDeps({
      id: 'rbac-1',
      name: 'ha-viewer',
      subjects: [{ kind: 'User', name: 'alice@test.com' }],
      roleBindings: [
        { role: 'view', resource: 'servers', name: 'my-ha' },
        { role: 'edit', resource: 'secrets' },
      ],
    });
    const cmd = createDescribeCommand(deps);
    await cmd.parseAsync(['node', 'test', 'rbac', 'rbac-1']);

    const text = deps.output.join('\n');
    expect(text).toContain('Resource Bindings:');
    expect(text).toContain('NAME');
    expect(text).toContain('my-ha');
    expect(text).toContain('view');
    expect(text).toContain('servers');
  });

  it('shows user with direct RBAC permissions', async () => {
    const deps = makeDeps({
      id: 'usr-1',
      email: 'alice@test.com',
      name: 'Alice',
      provider: null,
    });
    vi.mocked(deps.client.get)
      .mockResolvedValueOnce([] as never) // users list (resolveNameOrId)
      .mockResolvedValueOnce([ // RBAC defs
        {
          name: 'dev-access',
          subjects: [{ kind: 'User', name: 'alice@test.com' }],
          roleBindings: [
            { role: 'edit', resource: 'servers' },
            { role: 'run', action: 'logs' },
          ],
        },
      ] as never)
      .mockResolvedValueOnce([] as never); // groups

    const cmd = createDescribeCommand(deps);
    await cmd.parseAsync(['node', 'test', 'user', 'usr-1']);

    const text = deps.output.join('\n');
    expect(text).toContain('=== User: alice@test.com ===');
    expect(text).toContain('Access:');
    expect(text).toContain('Direct (dev-access)');
    expect(text).toContain('Resources:');
    expect(text).toContain('edit');
    expect(text).toContain('servers');
    expect(text).toContain('Operations:');
    expect(text).toContain('logs');
  });

  it('shows user with inherited group permissions', async () => {
    const deps = makeDeps({
      id: 'usr-1',
      email: 'bob@test.com',
      name: 'Bob',
      provider: null,
    });
    vi.mocked(deps.client.get)
      .mockResolvedValueOnce([] as never) // users list
      .mockResolvedValueOnce([ // RBAC defs
        {
          name: 'team-perms',
          subjects: [{ kind: 'Group', name: 'dev-team' }],
          roleBindings: [
            { role: 'view', resource: '*' },
            { role: 'run', action: 'backup' },
          ],
        },
      ] as never)
      .mockResolvedValueOnce([ // groups
        { name: 'dev-team', members: [{ user: { email: 'bob@test.com' } }] },
      ] as never);

    const cmd = createDescribeCommand(deps);
    await cmd.parseAsync(['node', 'test', 'user', 'usr-1']);

    const text = deps.output.join('\n');
    expect(text).toContain('Groups:');
    expect(text).toContain('dev-team');
    expect(text).toContain('Access:');
    expect(text).toContain('Inherited (dev-team)');
    expect(text).toContain('view');
    expect(text).toContain('*');
    expect(text).toContain('backup');
  });

  it('shows user with no permissions', async () => {
    const deps = makeDeps({
      id: 'usr-1',
      email: 'nobody@test.com',
      name: null,
      provider: null,
    });
    vi.mocked(deps.client.get)
      .mockResolvedValueOnce([] as never)
      .mockResolvedValueOnce([] as never)
      .mockResolvedValueOnce([] as never);

    const cmd = createDescribeCommand(deps);
    await cmd.parseAsync(['node', 'test', 'user', 'usr-1']);

    const text = deps.output.join('\n');
    expect(text).toContain('Access: (none)');
  });

  it('shows group with RBAC permissions', async () => {
    const deps = makeDeps({
      id: 'grp-1',
      name: 'admin',
      description: 'Admin group',
      members: [{ user: { email: 'alice@test.com' } }],
    });
    vi.mocked(deps.client.get)
      .mockResolvedValueOnce([] as never) // groups list (resolveNameOrId)
      .mockResolvedValueOnce([ // RBAC defs
        {
          name: 'admin-access',
          subjects: [{ kind: 'Group', name: 'admin' }],
          roleBindings: [
            { role: 'edit', resource: '*' },
            { role: 'run', action: 'backup' },
            { role: 'run', action: 'restore' },
          ],
        },
      ] as never);

    const cmd = createDescribeCommand(deps);
    await cmd.parseAsync(['node', 'test', 'group', 'grp-1']);

    const text = deps.output.join('\n');
    expect(text).toContain('=== Group: admin ===');
    expect(text).toContain('Access:');
    expect(text).toContain('Granted (admin-access)');
    expect(text).toContain('edit');
    expect(text).toContain('*');
    expect(text).toContain('backup');
    expect(text).toContain('restore');
  });

  it('shows group with name-scoped permissions', async () => {
    const deps = makeDeps({
      id: 'grp-1',
      name: 'ha-team',
      description: 'HA team',
      members: [],
    });
    vi.mocked(deps.client.get)
      .mockResolvedValueOnce([] as never)
      .mockResolvedValueOnce([ // RBAC defs
        {
          name: 'ha-access',
          subjects: [{ kind: 'Group', name: 'ha-team' }],
          roleBindings: [
            { role: 'edit', resource: 'servers', name: 'my-ha' },
            { role: 'view', resource: 'secrets' },
          ],
        },
      ] as never);

    const cmd = createDescribeCommand(deps);
    await cmd.parseAsync(['node', 'test', 'group', 'grp-1']);

    const text = deps.output.join('\n');
    expect(text).toContain('Access:');
    expect(text).toContain('Granted (ha-access)');
    expect(text).toContain('my-ha');
    expect(text).toContain('NAME');
  });

  it('outputs user detail as JSON', async () => {
    const deps = makeDeps({ id: 'usr-1', email: 'alice@test.com', name: 'Alice', role: 'ADMIN' });
    const cmd = createDescribeCommand(deps);
    await cmd.parseAsync(['node', 'test', 'user', 'usr-1', '-o', 'json']);

    const parsed = JSON.parse(deps.output[0] ?? '');
    expect(parsed.email).toBe('alice@test.com');
    expect(parsed.role).toBe('ADMIN');
  });

  it('outputs group detail as YAML', async () => {
    const deps = makeDeps({ id: 'grp-1', name: 'dev-team', description: 'Devs' });
    const cmd = createDescribeCommand(deps);
    await cmd.parseAsync(['node', 'test', 'group', 'grp-1', '-o', 'yaml']);

    expect(deps.output[0]).toContain('name: dev-team');
  });

  it('outputs rbac detail as JSON', async () => {
    const deps = makeDeps({
      id: 'rbac-1',
      name: 'devs',
      subjects: [{ kind: 'User', name: 'a@b.com' }],
      roleBindings: [{ role: 'edit', resource: 'servers' }],
    });
    const cmd = createDescribeCommand(deps);
    await cmd.parseAsync(['node', 'test', 'rbac', 'rbac-1', '-o', 'json']);

    const parsed = JSON.parse(deps.output[0] ?? '');
    expect(parsed.subjects).toHaveLength(1);
    expect(parsed.roleBindings[0].role).toBe('edit');
  });
});

@@ -20,7 +20,7 @@ describe('get command', () => {
     const cmd = createGetCommand(deps);
     await cmd.parseAsync(['node', 'test', 'servers']);

-    expect(deps.fetchResource).toHaveBeenCalledWith('servers', undefined);
+    expect(deps.fetchResource).toHaveBeenCalledWith('servers', undefined, undefined);
     expect(deps.output[0]).toContain('NAME');
     expect(deps.output[0]).toContain('TRANSPORT');
     expect(deps.output.join('\n')).toContain('slack');
@@ -31,14 +31,14 @@ describe('get command', () => {
     const deps = makeDeps([]);
     const cmd = createGetCommand(deps);
     await cmd.parseAsync(['node', 'test', 'srv']);
-    expect(deps.fetchResource).toHaveBeenCalledWith('servers', undefined);
+    expect(deps.fetchResource).toHaveBeenCalledWith('servers', undefined, undefined);
   });

   it('passes ID when provided', async () => {
     const deps = makeDeps([{ id: 'srv-1', name: 'slack' }]);
     const cmd = createGetCommand(deps);
     await cmd.parseAsync(['node', 'test', 'servers', 'srv-1']);
-    expect(deps.fetchResource).toHaveBeenCalledWith('servers', 'srv-1');
+    expect(deps.fetchResource).toHaveBeenCalledWith('servers', 'srv-1', undefined);
   });

   it('outputs apply-compatible JSON format', async () => {
@@ -85,4 +85,253 @@ describe('get command', () => {
    await cmd.parseAsync(['node', 'test', 'servers']);
    expect(deps.output[0]).toContain('No servers found');
  });

  it('lists users with correct columns (no ROLE column)', async () => {
    const deps = makeDeps([
      { id: 'usr-1', email: 'alice@test.com', name: 'Alice', provider: null },
      { id: 'usr-2', email: 'bob@test.com', name: null, provider: 'oidc' },
    ]);
    const cmd = createGetCommand(deps);
    await cmd.parseAsync(['node', 'test', 'users']);

    expect(deps.fetchResource).toHaveBeenCalledWith('users', undefined, undefined);
    const text = deps.output.join('\n');
    expect(text).toContain('EMAIL');
    expect(text).toContain('NAME');
    expect(text).not.toContain('ROLE');
    expect(text).toContain('PROVIDER');
    expect(text).toContain('alice@test.com');
    expect(text).toContain('Alice');
    expect(text).toContain('bob@test.com');
    expect(text).toContain('oidc');
  });

  it('resolves user alias', async () => {
    const deps = makeDeps([]);
    const cmd = createGetCommand(deps);
    await cmd.parseAsync(['node', 'test', 'user']);
    expect(deps.fetchResource).toHaveBeenCalledWith('users', undefined, undefined);
  });

  it('lists groups with correct columns', async () => {
    const deps = makeDeps([
      {
        id: 'grp-1',
        name: 'dev-team',
        description: 'Developers',
        members: [{ user: { email: 'alice@test.com' } }, { user: { email: 'bob@test.com' } }],
      },
      { id: 'grp-2', name: 'ops-team', description: 'Operations', members: [] },
    ]);
    const cmd = createGetCommand(deps);
    await cmd.parseAsync(['node', 'test', 'groups']);

    expect(deps.fetchResource).toHaveBeenCalledWith('groups', undefined, undefined);
    const text = deps.output.join('\n');
    expect(text).toContain('NAME');
    expect(text).toContain('MEMBERS');
    expect(text).toContain('DESCRIPTION');
    expect(text).toContain('dev-team');
    expect(text).toContain('2');
    expect(text).toContain('ops-team');
    expect(text).toContain('0');
  });

  it('resolves group alias', async () => {
    const deps = makeDeps([]);
    const cmd = createGetCommand(deps);
    await cmd.parseAsync(['node', 'test', 'group']);
    expect(deps.fetchResource).toHaveBeenCalledWith('groups', undefined, undefined);
  });

  it('lists rbac definitions with correct columns', async () => {
    const deps = makeDeps([
      {
        id: 'rbac-1',
        name: 'admins',
        subjects: [{ kind: 'User', name: 'admin@test.com' }],
        roleBindings: [{ role: 'edit', resource: '*' }],
      },
    ]);
    const cmd = createGetCommand(deps);
    await cmd.parseAsync(['node', 'test', 'rbac']);

    expect(deps.fetchResource).toHaveBeenCalledWith('rbac', undefined, undefined);
    const text = deps.output.join('\n');
    expect(text).toContain('NAME');
    expect(text).toContain('SUBJECTS');
    expect(text).toContain('BINDINGS');
    expect(text).toContain('admins');
    expect(text).toContain('User:admin@test.com');
    expect(text).toContain('edit:*');
  });

  it('resolves rbac-definition alias', async () => {
    const deps = makeDeps([]);
    const cmd = createGetCommand(deps);
    await cmd.parseAsync(['node', 'test', 'rbac-definition']);
    expect(deps.fetchResource).toHaveBeenCalledWith('rbac', undefined, undefined);
  });

  it('lists projects with new columns', async () => {
    const deps = makeDeps([{
      id: 'proj-1',
      name: 'smart-home',
      description: 'Home automation',
      proxyMode: 'filtered',
      ownerId: 'usr-1',
      servers: [{ server: { name: 'grafana' } }],
    }]);
    const cmd = createGetCommand(deps);
    await cmd.parseAsync(['node', 'test', 'projects']);

    const text = deps.output.join('\n');
    expect(text).toContain('MODE');
    expect(text).toContain('SERVERS');
    expect(text).toContain('smart-home');
    expect(text).toContain('filtered');
    expect(text).toContain('1');
  });

  it('displays mixed resource and operation bindings', async () => {
    const deps = makeDeps([
      {
        id: 'rbac-1',
        name: 'admin-access',
        subjects: [{ kind: 'Group', name: 'admin' }],
        roleBindings: [
          { role: 'edit', resource: '*' },
          { role: 'run', action: 'logs' },
          { role: 'run', action: 'backup' },
        ],
      },
    ]);
    const cmd = createGetCommand(deps);
    await cmd.parseAsync(['node', 'test', 'rbac']);

    const text = deps.output.join('\n');
|
||||
expect(text).toContain('edit:*');
|
||||
expect(text).toContain('run>logs');
|
||||
expect(text).toContain('run>backup');
|
||||
});
|
||||
|
||||
it('displays name-scoped resource bindings', async () => {
|
||||
const deps = makeDeps([
|
||||
{
|
||||
id: 'rbac-1',
|
||||
name: 'ha-viewer',
|
||||
subjects: [{ kind: 'User', name: 'alice@test.com' }],
|
||||
roleBindings: [{ role: 'view', resource: 'servers', name: 'my-ha' }],
|
||||
},
|
||||
]);
|
||||
const cmd = createGetCommand(deps);
|
||||
await cmd.parseAsync(['node', 'test', 'rbac']);
|
||||
|
||||
const text = deps.output.join('\n');
|
||||
expect(text).toContain('view:servers:my-ha');
|
||||
});
|
||||
|
||||
it('shows no results message for empty users list', async () => {
|
||||
const deps = makeDeps([]);
|
||||
const cmd = createGetCommand(deps);
|
||||
await cmd.parseAsync(['node', 'test', 'users']);
|
||||
expect(deps.output[0]).toContain('No users found');
|
||||
});
|
||||
|
||||
it('shows no results message for empty groups list', async () => {
|
||||
const deps = makeDeps([]);
|
||||
const cmd = createGetCommand(deps);
|
||||
await cmd.parseAsync(['node', 'test', 'groups']);
|
||||
expect(deps.output[0]).toContain('No groups found');
|
||||
});
|
||||
|
||||
it('shows no results message for empty rbac list', async () => {
|
||||
const deps = makeDeps([]);
|
||||
const cmd = createGetCommand(deps);
|
||||
await cmd.parseAsync(['node', 'test', 'rbac']);
|
||||
expect(deps.output[0]).toContain('No rbac found');
|
||||
});
|
||||
|
||||
it('lists prompts with project name column', async () => {
|
||||
const deps = makeDeps([
|
||||
{ id: 'p-1', name: 'debug-guide', projectId: 'proj-1', project: { name: 'smart-home' }, createdAt: '2025-01-01T00:00:00Z' },
|
||||
{ id: 'p-2', name: 'global-rules', projectId: null, project: null, createdAt: '2025-01-01T00:00:00Z' },
|
||||
]);
|
||||
const cmd = createGetCommand(deps);
|
||||
await cmd.parseAsync(['node', 'test', 'prompts']);
|
||||
|
||||
const text = deps.output.join('\n');
|
||||
expect(text).toContain('NAME');
|
||||
expect(text).toContain('PROJECT');
|
||||
expect(text).toContain('debug-guide');
|
||||
expect(text).toContain('smart-home');
|
||||
expect(text).toContain('global-rules');
|
||||
expect(text).toContain('(global)');
|
||||
});
|
||||
|
||||
it('lists promptrequests with project name column', async () => {
|
||||
const deps = makeDeps([
|
||||
{ id: 'pr-1', name: 'new-rule', projectId: 'proj-1', project: { name: 'my-project' }, createdBySession: 'sess-abc123def456', createdAt: '2025-01-01T00:00:00Z' },
|
||||
]);
|
||||
const cmd = createGetCommand(deps);
|
||||
await cmd.parseAsync(['node', 'test', 'promptrequests']);
|
||||
|
||||
const text = deps.output.join('\n');
|
||||
expect(text).toContain('new-rule');
|
||||
expect(text).toContain('my-project');
|
||||
expect(text).toContain('sess-abc123d');
|
||||
});
|
||||
|
||||
it('passes --project option to fetchResource', async () => {
|
||||
const deps = makeDeps([]);
|
||||
const cmd = createGetCommand(deps);
|
||||
await cmd.parseAsync(['node', 'test', 'prompts', '--project', 'smart-home']);
|
||||
|
||||
expect(deps.fetchResource).toHaveBeenCalledWith('prompts', undefined, { project: 'smart-home' });
|
||||
});
|
||||
|
||||
it('does not pass project when --project is not specified', async () => {
|
||||
const deps = makeDeps([]);
|
||||
const cmd = createGetCommand(deps);
|
||||
await cmd.parseAsync(['node', 'test', 'prompts']);
|
||||
|
||||
expect(deps.fetchResource).toHaveBeenCalledWith('prompts', undefined, undefined);
|
||||
});
|
||||
|
||||
it('passes --all flag to fetchResource', async () => {
|
||||
const deps = makeDeps([]);
|
||||
const cmd = createGetCommand(deps);
|
||||
await cmd.parseAsync(['node', 'test', 'prompts', '-A']);
|
||||
|
||||
expect(deps.fetchResource).toHaveBeenCalledWith('prompts', undefined, { all: true });
|
||||
});
|
||||
|
||||
it('passes both --project and --all when both given', async () => {
|
||||
const deps = makeDeps([]);
|
||||
const cmd = createGetCommand(deps);
|
||||
await cmd.parseAsync(['node', 'test', 'prompts', '--project', 'my-proj', '-A']);
|
||||
|
||||
expect(deps.fetchResource).toHaveBeenCalledWith('prompts', undefined, { project: 'my-proj', all: true });
|
||||
});
|
||||
|
||||
it('resolves prompt alias', async () => {
|
||||
const deps = makeDeps([]);
|
||||
const cmd = createGetCommand(deps);
|
||||
await cmd.parseAsync(['node', 'test', 'prompt']);
|
||||
expect(deps.fetchResource).toHaveBeenCalledWith('prompts', undefined, undefined);
|
||||
});
|
||||
|
||||
it('resolves pr alias to promptrequests', async () => {
|
||||
const deps = makeDeps([]);
|
||||
const cmd = createGetCommand(deps);
|
||||
await cmd.parseAsync(['node', 'test', 'pr']);
|
||||
expect(deps.fetchResource).toHaveBeenCalledWith('promptrequests', undefined, undefined);
|
||||
});
|
||||
|
||||
it('shows no results message for empty prompts list', async () => {
|
||||
const deps = makeDeps([]);
|
||||
const cmd = createGetCommand(deps);
|
||||
await cmd.parseAsync(['node', 'test', 'prompts']);
|
||||
expect(deps.output[0]).toContain('No prompts found');
|
||||
});
|
||||
});
|
||||
|
||||
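The tests above lean on a `makeDeps` helper that is defined earlier in the file and falls outside this hunk. A minimal sketch of what it plausibly looks like, inferred purely from how the tests use it (the type name `GetCommandDeps` is an assumption; in the real file `fetchResource` would be a `vi.fn()` so `toHaveBeenCalledWith` works):

```typescript
// Hypothetical reconstruction of the makeDeps test helper (not shown in this hunk).
// fetchResource resolves canned rows; log captures output so assertions can grep
// the rendered table text.
type GetCommandDeps = {
  output: string[];
  fetchResource: (resource: string, name?: string, opts?: Record<string, unknown>) => Promise<unknown>;
  log: (...args: string[]) => void;
};

function makeDeps(rows: unknown[]): GetCommandDeps {
  const deps: GetCommandDeps = {
    output: [],
    // Real file presumably wraps this in vi.fn(...) for spy assertions.
    fetchResource: async () => rows,
    log: (...args: string[]) => deps.output.push(args.join(' ')),
  };
  return deps;
}

const deps = makeDeps([{ name: 'dev-team', members: [] }]);
deps.log('NAME', 'MEMBERS');
console.log(deps.output.join('\n')); // → NAME MEMBERS
```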
src/cli/tests/commands/mcp.test.ts (new file, 481 lines)
@@ -0,0 +1,481 @@
import { describe, it, expect, beforeAll, afterAll } from 'vitest';
import http from 'node:http';
import { Readable, Writable } from 'node:stream';
import { runMcpBridge, createMcpCommand } from '../../src/commands/mcp.js';

// ---- Mock MCP server (simulates mcplocal project endpoint) ----

interface RecordedRequest {
  method: string;
  url: string;
  headers: http.IncomingHttpHeaders;
  body: string;
}

let mockServer: http.Server;
let mockPort: number;
const recorded: RecordedRequest[] = [];
let sessionCounter = 0;

function makeInitializeResponse(id: number | string) {
  return JSON.stringify({
    jsonrpc: '2.0',
    id,
    result: {
      protocolVersion: '2024-11-05',
      capabilities: { tools: {} },
      serverInfo: { name: 'test-server', version: '1.0.0' },
    },
  });
}

function makeToolsListResponse(id: number | string) {
  return JSON.stringify({
    jsonrpc: '2.0',
    id,
    result: {
      tools: [
        { name: 'grafana/query', description: 'Query Grafana', inputSchema: { type: 'object', properties: {} } },
      ],
    },
  });
}

function makeToolCallResponse(id: number | string) {
  return JSON.stringify({
    jsonrpc: '2.0',
    id,
    result: {
      content: [{ type: 'text', text: 'tool result' }],
    },
  });
}

beforeAll(async () => {
  mockServer = http.createServer((req, res) => {
    const chunks: Buffer[] = [];
    req.on('data', (c: Buffer) => chunks.push(c));
    req.on('end', () => {
      const body = Buffer.concat(chunks).toString('utf-8');
      recorded.push({ method: req.method ?? '', url: req.url ?? '', headers: req.headers, body });

      if (req.method === 'DELETE') {
        res.writeHead(200);
        res.end();
        return;
      }

      if (req.method === 'POST' && req.url?.startsWith('/projects/')) {
        let sessionId = req.headers['mcp-session-id'] as string | undefined;

        // Assign session ID on first request
        if (!sessionId) {
          sessionCounter++;
          sessionId = `session-${sessionCounter}`;
        }
        res.setHeader('mcp-session-id', sessionId);

        // Parse JSON-RPC and respond based on method
        try {
          const rpc = JSON.parse(body) as { id: number | string; method: string };
          let responseBody: string;

          switch (rpc.method) {
            case 'initialize':
              responseBody = makeInitializeResponse(rpc.id);
              break;
            case 'tools/list':
              responseBody = makeToolsListResponse(rpc.id);
              break;
            case 'tools/call':
              responseBody = makeToolCallResponse(rpc.id);
              break;
            default:
              responseBody = JSON.stringify({ jsonrpc: '2.0', id: rpc.id, error: { code: -32601, message: 'Method not found' } });
          }

          // Respond in SSE format for /projects/sse-project/mcp
          if (req.url?.includes('sse-project')) {
            res.writeHead(200, { 'Content-Type': 'text/event-stream' });
            res.end(`event: message\ndata: ${responseBody}\n\n`);
          } else {
            res.writeHead(200, { 'Content-Type': 'application/json' });
            res.end(responseBody);
          }
        } catch {
          res.writeHead(400, { 'Content-Type': 'application/json' });
          res.end(JSON.stringify({ error: 'Invalid JSON' }));
        }
        return;
      }

      res.writeHead(404);
      res.end();
    });
  });

  await new Promise<void>((resolve) => {
    mockServer.listen(0, () => {
      const addr = mockServer.address();
      if (addr && typeof addr === 'object') {
        mockPort = addr.port;
      }
      resolve();
    });
  });
});

afterAll(() => {
  mockServer.close();
});

// ---- Helper to run bridge with mock streams ----

function createMockStreams() {
  const stdoutChunks: string[] = [];
  const stderrChunks: string[] = [];

  const stdout = new Writable({
    write(chunk: Buffer, _encoding, callback) {
      stdoutChunks.push(chunk.toString());
      callback();
    },
  });

  const stderr = new Writable({
    write(chunk: Buffer, _encoding, callback) {
      stderrChunks.push(chunk.toString());
      callback();
    },
  });

  return { stdout, stderr, stdoutChunks, stderrChunks };
}

function pushAndEnd(stdin: Readable, lines: string[]) {
  for (const line of lines) {
    stdin.push(line + '\n');
  }
  stdin.push(null); // EOF
}

// ---- Tests ----

describe('MCP STDIO Bridge', () => {
  beforeAll(() => {
    recorded.length = 0;
    sessionCounter = 0;
  });

  it('forwards initialize request and returns response', async () => {
    recorded.length = 0;
    const stdin = new Readable({ read() {} });
    const { stdout, stdoutChunks } = createMockStreams();

    const initMsg = JSON.stringify({
      jsonrpc: '2.0', id: 1, method: 'initialize',
      params: { protocolVersion: '2024-11-05', capabilities: {}, clientInfo: { name: 'test', version: '1.0' } },
    });

    pushAndEnd(stdin, [initMsg]);

    await runMcpBridge({
      projectName: 'test-project',
      mcplocalUrl: `http://localhost:${mockPort}`,
      stdin, stdout, stderr: new Writable({ write(_, __, cb) { cb(); } }),
    });

    // Verify request was made to correct URL
    expect(recorded.some((r) => r.url === '/projects/test-project/mcp' && r.method === 'POST')).toBe(true);

    // Verify response on stdout
    const output = stdoutChunks.join('');
    const parsed = JSON.parse(output.trim());
    expect(parsed.result.serverInfo.name).toBe('test-server');
    expect(parsed.result.protocolVersion).toBe('2024-11-05');
  });

  it('sends session ID on subsequent requests', async () => {
    recorded.length = 0;
    const stdin = new Readable({ read() {} });
    const { stdout, stdoutChunks } = createMockStreams();

    const initMsg = JSON.stringify({
      jsonrpc: '2.0', id: 1, method: 'initialize',
      params: { protocolVersion: '2024-11-05', capabilities: {}, clientInfo: { name: 'test', version: '1.0' } },
    });
    const toolsListMsg = JSON.stringify({ jsonrpc: '2.0', id: 2, method: 'tools/list', params: {} });

    pushAndEnd(stdin, [initMsg, toolsListMsg]);

    await runMcpBridge({
      projectName: 'test-project',
      mcplocalUrl: `http://localhost:${mockPort}`,
      stdin, stdout, stderr: new Writable({ write(_, __, cb) { cb(); } }),
    });

    // First POST should NOT have mcp-session-id header
    const firstPost = recorded.find((r) => r.method === 'POST' && r.body.includes('initialize'));
    expect(firstPost).toBeDefined();
    expect(firstPost!.headers['mcp-session-id']).toBeUndefined();

    // Second POST SHOULD have mcp-session-id header
    const secondPost = recorded.find((r) => r.method === 'POST' && r.body.includes('tools/list'));
    expect(secondPost).toBeDefined();
    expect(secondPost!.headers['mcp-session-id']).toMatch(/^session-/);

    // Verify tools/list response
    const lines = stdoutChunks.join('').trim().split('\n');
    expect(lines.length).toBe(2);
    const toolsResponse = JSON.parse(lines[1]);
    expect(toolsResponse.result.tools[0].name).toBe('grafana/query');
  });

  it('forwards tools/call and returns result', async () => {
    recorded.length = 0;
    const stdin = new Readable({ read() {} });
    const { stdout, stdoutChunks } = createMockStreams();

    const initMsg = JSON.stringify({
      jsonrpc: '2.0', id: 1, method: 'initialize',
      params: { protocolVersion: '2024-11-05', capabilities: {}, clientInfo: { name: 'test', version: '1.0' } },
    });
    const callMsg = JSON.stringify({
      jsonrpc: '2.0', id: 2, method: 'tools/call',
      params: { name: 'grafana/query', arguments: { query: 'test' } },
    });

    pushAndEnd(stdin, [initMsg, callMsg]);

    await runMcpBridge({
      projectName: 'test-project',
      mcplocalUrl: `http://localhost:${mockPort}`,
      stdin, stdout, stderr: new Writable({ write(_, __, cb) { cb(); } }),
    });

    const lines = stdoutChunks.join('').trim().split('\n');
    expect(lines.length).toBe(2);
    const callResponse = JSON.parse(lines[1]);
    expect(callResponse.result.content[0].text).toBe('tool result');
  });

  it('forwards Authorization header when token provided', async () => {
    recorded.length = 0;
    const stdin = new Readable({ read() {} });
    const { stdout } = createMockStreams();

    const initMsg = JSON.stringify({
      jsonrpc: '2.0', id: 1, method: 'initialize',
      params: { protocolVersion: '2024-11-05', capabilities: {}, clientInfo: { name: 'test', version: '1.0' } },
    });

    pushAndEnd(stdin, [initMsg]);

    await runMcpBridge({
      projectName: 'test-project',
      mcplocalUrl: `http://localhost:${mockPort}`,
      token: 'my-secret-token',
      stdin, stdout, stderr: new Writable({ write(_, __, cb) { cb(); } }),
    });

    const post = recorded.find((r) => r.method === 'POST');
    expect(post).toBeDefined();
    expect(post!.headers['authorization']).toBe('Bearer my-secret-token');
  });

  it('does not send Authorization header when no token', async () => {
    recorded.length = 0;
    const stdin = new Readable({ read() {} });
    const { stdout } = createMockStreams();

    const initMsg = JSON.stringify({
      jsonrpc: '2.0', id: 1, method: 'initialize',
      params: { protocolVersion: '2024-11-05', capabilities: {}, clientInfo: { name: 'test', version: '1.0' } },
    });

    pushAndEnd(stdin, [initMsg]);

    await runMcpBridge({
      projectName: 'test-project',
      mcplocalUrl: `http://localhost:${mockPort}`,
      stdin, stdout, stderr: new Writable({ write(_, __, cb) { cb(); } }),
    });

    const post = recorded.find((r) => r.method === 'POST');
    expect(post).toBeDefined();
    expect(post!.headers['authorization']).toBeUndefined();
  });

  it('sends DELETE to clean up session on stdin EOF', async () => {
    recorded.length = 0;
    const stdin = new Readable({ read() {} });
    const { stdout } = createMockStreams();

    const initMsg = JSON.stringify({
      jsonrpc: '2.0', id: 1, method: 'initialize',
      params: { protocolVersion: '2024-11-05', capabilities: {}, clientInfo: { name: 'test', version: '1.0' } },
    });

    pushAndEnd(stdin, [initMsg]);

    await runMcpBridge({
      projectName: 'test-project',
      mcplocalUrl: `http://localhost:${mockPort}`,
      stdin, stdout, stderr: new Writable({ write(_, __, cb) { cb(); } }),
    });

    // Should have a DELETE request for session cleanup
    const deleteReq = recorded.find((r) => r.method === 'DELETE');
    expect(deleteReq).toBeDefined();
    expect(deleteReq!.headers['mcp-session-id']).toMatch(/^session-/);
  });

  it('does not send DELETE if no session was established', async () => {
    recorded.length = 0;
    const stdin = new Readable({ read() {} });
    const { stdout } = createMockStreams();

    // Push EOF immediately with no messages
    stdin.push(null);

    await runMcpBridge({
      projectName: 'test-project',
      mcplocalUrl: `http://localhost:${mockPort}`,
      stdin, stdout, stderr: new Writable({ write(_, __, cb) { cb(); } }),
    });

    expect(recorded.filter((r) => r.method === 'DELETE')).toHaveLength(0);
  });

  it('writes errors to stderr, not stdout', async () => {
    recorded.length = 0;
    const stdin = new Readable({ read() {} });
    const { stdout, stdoutChunks, stderr, stderrChunks } = createMockStreams();

    // Send to a non-existent port to trigger connection error
    const badMsg = JSON.stringify({ jsonrpc: '2.0', id: 1, method: 'initialize', params: {} });
    pushAndEnd(stdin, [badMsg]);

    await runMcpBridge({
      projectName: 'test-project',
      mcplocalUrl: 'http://localhost:1', // will fail to connect
      stdin, stdout, stderr,
    });

    // Error should be on stderr
    expect(stderrChunks.join('')).toContain('MCP bridge error');
    // stdout should be empty (no corrupted output)
    expect(stdoutChunks.join('')).toBe('');
  });

  it('skips blank lines in stdin', async () => {
    recorded.length = 0;
    const stdin = new Readable({ read() {} });
    const { stdout, stdoutChunks } = createMockStreams();

    const initMsg = JSON.stringify({
      jsonrpc: '2.0', id: 1, method: 'initialize',
      params: { protocolVersion: '2024-11-05', capabilities: {}, clientInfo: { name: 'test', version: '1.0' } },
    });

    pushAndEnd(stdin, ['', '  ', initMsg, '']);

    await runMcpBridge({
      projectName: 'test-project',
      mcplocalUrl: `http://localhost:${mockPort}`,
      stdin, stdout, stderr: new Writable({ write(_, __, cb) { cb(); } }),
    });

    // Only one POST (for the actual message)
    const posts = recorded.filter((r) => r.method === 'POST');
    expect(posts).toHaveLength(1);

    // One response line
    const lines = stdoutChunks.join('').trim().split('\n');
    expect(lines).toHaveLength(1);
  });

  it('handles SSE (text/event-stream) responses', async () => {
    recorded.length = 0;
    const stdin = new Readable({ read() {} });
    const { stdout, stdoutChunks } = createMockStreams();

    const initMsg = JSON.stringify({
      jsonrpc: '2.0', id: 1, method: 'initialize',
      params: { protocolVersion: '2024-11-05', capabilities: {}, clientInfo: { name: 'test', version: '1.0' } },
    });

    pushAndEnd(stdin, [initMsg]);

    await runMcpBridge({
      projectName: 'sse-project', // triggers SSE response from mock server
      mcplocalUrl: `http://localhost:${mockPort}`,
      stdin, stdout, stderr: new Writable({ write(_, __, cb) { cb(); } }),
    });

    // Should extract JSON from SSE data: lines
    const output = stdoutChunks.join('').trim();
    const parsed = JSON.parse(output);
    expect(parsed.result.serverInfo.name).toBe('test-server');
  });

  it('URL-encodes project name', async () => {
    recorded.length = 0;
    const stdin = new Readable({ read() {} });
    const { stdout } = createMockStreams();
    const { stderr } = createMockStreams();

    const initMsg = JSON.stringify({
      jsonrpc: '2.0', id: 1, method: 'initialize',
      params: { protocolVersion: '2024-11-05', capabilities: {}, clientInfo: { name: 'test', version: '1.0' } },
    });

    pushAndEnd(stdin, [initMsg]);

    await runMcpBridge({
      projectName: 'my project',
      mcplocalUrl: `http://localhost:${mockPort}`,
      stdin, stdout, stderr,
    });

    const post = recorded.find((r) => r.method === 'POST');
    expect(post?.url).toBe('/projects/my%20project/mcp');
  });
});

describe('createMcpCommand', () => {
  it('accepts --project option directly', () => {
    const cmd = createMcpCommand({
      getProject: () => undefined,
      configLoader: () => ({ mcplocalUrl: 'http://localhost:3200' }),
      credentialsLoader: () => null,
    });
    const opt = cmd.options.find((o) => o.long === '--project');
    expect(opt).toBeDefined();
    expect(opt!.short).toBe('-p');
  });

  it('parses --project from command args', async () => {
    let capturedProject: string | undefined;
    const cmd = createMcpCommand({
      getProject: () => undefined,
      configLoader: () => ({ mcplocalUrl: `http://localhost:${mockPort}` }),
      credentialsLoader: () => null,
    });
    // We test by checking the option parsing works, not by running the full bridge
    const parsed = cmd.parse(['--project', 'test-proj'], { from: 'user' });
    capturedProject = parsed.opts().project;
    expect(capturedProject).toBe('test-proj');
  });

  it('parses -p shorthand from command args', () => {
    const cmd = createMcpCommand({
      getProject: () => undefined,
      configLoader: () => ({ mcplocalUrl: `http://localhost:${mockPort}` }),
      credentialsLoader: () => null,
    });
    const parsed = cmd.parse(['-p', 'my-project'], { from: 'user' });
    expect(parsed.opts().project).toBe('my-project');
  });
});
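The SSE test above relies on the bridge pulling the JSON-RPC payload out of `data:` lines when the proxy answers in `text/event-stream` format. A minimal sketch of that extraction, under the assumption that the bridge does something along these lines — the function name `extractSseJson` is illustrative, not the actual mcpctl implementation:

```typescript
// Illustrative only: recover a JSON-RPC payload from a single-event SSE body
// such as "event: message\ndata: {...}\n\n". Not the real bridge code.
function extractSseJson(body: string): unknown {
  for (const line of body.split('\n')) {
    if (line.startsWith('data:')) {
      return JSON.parse(line.slice('data:'.length).trim());
    }
  }
  throw new Error('no data: line found in SSE body');
}

const sseBody = 'event: message\ndata: {"jsonrpc":"2.0","id":1,"result":{"serverInfo":{"name":"test-server"}}}\n\n';
const msg = extractSseJson(sseBody) as { result: { serverInfo: { name: string } } };
console.log(msg.result.serverInfo.name); // → test-server
```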
@@ -1,17 +1,19 @@
 import { describe, it, expect, vi, beforeEach } from 'vitest';
-import { createProjectCommand } from '../../src/commands/project.js';
-import type { ApiClient } from '../../src/api-client.js';
+import { createCreateCommand } from '../../src/commands/create.js';
+import { createGetCommand } from '../../src/commands/get.js';
+import { createDescribeCommand } from '../../src/commands/describe.js';
+import { type ApiClient, ApiError } from '../../src/api-client.js';
 
 function mockClient(): ApiClient {
   return {
     get: vi.fn(async () => []),
-    post: vi.fn(async () => ({ id: 'proj-1', name: 'my-project' })),
+    post: vi.fn(async () => ({ id: 'new-id', name: 'test' })),
     put: vi.fn(async () => ({})),
     delete: vi.fn(async () => {}),
   } as unknown as ApiClient;
 }
 
-describe('project command', () => {
+describe('project with new fields', () => {
   let client: ReturnType<typeof mockClient>;
   let output: string[];
   const log = (...args: unknown[]) => output.push(args.map(String).join(' '));
@@ -21,9 +23,90 @@ describe('project command', () => {
     output = [];
   });
 
-  it('creates command with alias', () => {
-    const cmd = createProjectCommand({ client, log });
-    expect(cmd.name()).toBe('project');
-    expect(cmd.alias()).toBe('proj');
+  describe('create project with enhanced options', () => {
+    it('creates project with proxy mode and servers', async () => {
+      const cmd = createCreateCommand({ client, log });
+      await cmd.parseAsync([
+        'project', 'smart-home',
+        '-d', 'Smart home project',
+        '--proxy-mode', 'filtered',
+        '--server', 'my-grafana',
+        '--server', 'my-ha',
+      ], { from: 'user' });
+
+      expect(client.post).toHaveBeenCalledWith('/api/v1/projects', expect.objectContaining({
+        name: 'smart-home',
+        description: 'Smart home project',
+        proxyMode: 'filtered',
+        servers: ['my-grafana', 'my-ha'],
+      }));
+    });
+
+    it('defaults proxy mode to direct', async () => {
+      const cmd = createCreateCommand({ client, log });
+      await cmd.parseAsync(['project', 'basic'], { from: 'user' });
+
+      expect(client.post).toHaveBeenCalledWith('/api/v1/projects', expect.objectContaining({
+        proxyMode: 'direct',
+      }));
+    });
+  });
+
+  describe('get projects shows new columns', () => {
+    it('shows MODE and SERVERS columns', async () => {
+      const deps = {
+        output: [] as string[],
+        fetchResource: vi.fn(async () => [{
+          id: 'proj-1',
+          name: 'smart-home',
+          description: 'Test',
+          proxyMode: 'filtered',
+          ownerId: 'user-1',
+          servers: [{ server: { name: 'grafana' } }, { server: { name: 'ha' } }],
+        }]),
+        log: (...args: string[]) => deps.output.push(args.join(' ')),
+      };
+      const cmd = createGetCommand(deps);
+      await cmd.parseAsync(['node', 'test', 'projects']);
+
+      const text = deps.output.join('\n');
+      expect(text).toContain('MODE');
+      expect(text).toContain('SERVERS');
+      expect(text).toContain('smart-home');
+    });
+  });
+
+  describe('describe project shows full detail', () => {
+    it('shows servers and proxy config', async () => {
+      const deps = {
+        output: [] as string[],
+        client: mockClient(),
+        fetchResource: vi.fn(async () => ({
+          id: 'proj-1',
+          name: 'smart-home',
+          description: 'Smart home',
+          proxyMode: 'filtered',
+          llmProvider: 'gemini-cli',
+          llmModel: 'gemini-2.0-flash',
+          ownerId: 'user-1',
+          servers: [
+            { server: { name: 'my-grafana' } },
+            { server: { name: 'my-ha' } },
+          ],
+          createdAt: '2025-01-01',
+          updatedAt: '2025-01-01',
+        })),
+        log: (...args: string[]) => deps.output.push(args.join(' ')),
+      };
+      const cmd = createDescribeCommand(deps);
+      await cmd.parseAsync(['node', 'test', 'project', 'proj-1']);
+
+      const text = deps.output.join('\n');
+      expect(text).toContain('=== Project: smart-home ===');
+      expect(text).toContain('filtered');
+      expect(text).toContain('gemini-cli');
+      expect(text).toContain('my-grafana');
+      expect(text).toContain('my-ha');
+    });
   });
 });
@@ -3,19 +3,39 @@ import { mkdtempSync, rmSync } from 'node:fs';
|
||||
import { join } from 'node:path';
|
||||
import { tmpdir } from 'node:os';
|
||||
import { createStatusCommand } from '../../src/commands/status.js';
|
||||
import type { StatusCommandDeps } from '../../src/commands/status.js';
|
||||
import { saveConfig, DEFAULT_CONFIG } from '../../src/config/index.js';
|
||||
import { saveCredentials } from '../../src/auth/index.js';
|
||||
|
||||
let tempDir: string;
|
||||
let output: string[];
|
||||
let written: string[];
|
||||
|
||||
function log(...args: string[]) {
|
||||
output.push(args.join(' '));
|
||||
}
|
||||
|
||||
function write(text: string) {
|
||||
written.push(text);
|
||||
}
|
||||
|
||||
function baseDeps(overrides?: Partial<StatusCommandDeps>): Partial<StatusCommandDeps> {
|
||||
return {
|
||||
configDeps: { configDir: tempDir },
|
||||
credentialsDeps: { configDir: tempDir },
|
||||
log,
|
||||
write,
|
||||
checkHealth: async () => true,
|
||||
fetchProviders: async () => null,
|
||||
isTTY: false,
|
||||
...overrides,
|
||||
};
|
||||
}
|
||||
|
||||
beforeEach(() => {
|
||||
tempDir = mkdtempSync(join(tmpdir(), 'mcpctl-status-test-'));
|
||||
output = [];
|
||||
written = [];
|
||||
});
|
||||
|
||||
afterEach(() => {
|
||||
@@ -24,12 +44,7 @@ afterEach(() => {
|
||||
|
||||
describe('status command', () => {
|
||||
it('shows status in table format', async () => {
|
||||
const cmd = createStatusCommand({
|
||||
configDeps: { configDir: tempDir },
|
||||
credentialsDeps: { configDir: tempDir },
|
||||
log,
|
||||
checkHealth: async () => true,
|
||||
});
|
||||
const cmd = createStatusCommand(baseDeps());
|
||||
await cmd.parseAsync([], { from: 'user' });
|
||||
const out = output.join('\n');
|
||||
expect(out).toContain('mcpctl v');
|
||||
@@ -39,46 +54,26 @@ describe('status command', () => {
|
||||
});
|
||||
|
||||
it('shows unreachable when daemons are down', async () => {
|
||||
const cmd = createStatusCommand({
|
||||
configDeps: { configDir: tempDir },
|
||||
credentialsDeps: { configDir: tempDir },
|
||||
log,
|
||||
checkHealth: async () => false,
|
||||
});
|
||||
const cmd = createStatusCommand(baseDeps({ checkHealth: async () => false }));
|
||||
await cmd.parseAsync([], { from: 'user' });
|
||||
expect(output.join('\n')).toContain('unreachable');
|
||||
});
|
||||
|
||||
it('shows not logged in when no credentials', async () => {
|
||||
const cmd = createStatusCommand({
|
||||
configDeps: { configDir: tempDir },
|
||||
credentialsDeps: { configDir: tempDir },
|
||||
log,
|
||||
checkHealth: async () => true,
|
||||
});
|
||||
const cmd = createStatusCommand(baseDeps());
|
||||
await cmd.parseAsync([], { from: 'user' });
|
||||
expect(output.join('\n')).toContain('not logged in');
|
||||
});
|
||||
|
||||
  it('shows logged in user when credentials exist', async () => {
    saveCredentials({ token: 'tok', mcpdUrl: 'http://x:3100', user: 'alice@example.com' }, { configDir: tempDir });
    const cmd = createStatusCommand({
      configDeps: { configDir: tempDir },
      credentialsDeps: { configDir: tempDir },
      log,
      checkHealth: async () => true,
    });
    const cmd = createStatusCommand(baseDeps());
    await cmd.parseAsync([], { from: 'user' });
    expect(output.join('\n')).toContain('logged in as alice@example.com');
  });

  it('shows status in JSON format', async () => {
    const cmd = createStatusCommand({
      configDeps: { configDir: tempDir },
      credentialsDeps: { configDir: tempDir },
      log,
      checkHealth: async () => true,
    });
    const cmd = createStatusCommand(baseDeps());
    await cmd.parseAsync(['-o', 'json'], { from: 'user' });
    const parsed = JSON.parse(output[0]) as Record<string, unknown>;
    expect(parsed['version']).toBe('0.1.0');
@@ -87,12 +82,7 @@ describe('status command', () => {
  });

  it('shows status in YAML format', async () => {
    const cmd = createStatusCommand({
      configDeps: { configDir: tempDir },
      credentialsDeps: { configDir: tempDir },
      log,
      checkHealth: async () => false,
    });
    const cmd = createStatusCommand(baseDeps({ checkHealth: async () => false }));
    await cmd.parseAsync(['-o', 'yaml'], { from: 'user' });
    expect(output[0]).toContain('mcplocalReachable: false');
  });
@@ -100,15 +90,12 @@ describe('status command', () => {
  it('checks correct URLs from config', async () => {
    saveConfig({ ...DEFAULT_CONFIG, mcplocalUrl: 'http://local:3200', mcpdUrl: 'http://remote:3100' }, { configDir: tempDir });
    const checkedUrls: string[] = [];
    const cmd = createStatusCommand({
      configDeps: { configDir: tempDir },
      credentialsDeps: { configDir: tempDir },
      log,
    const cmd = createStatusCommand(baseDeps({
      checkHealth: async (url) => {
        checkedUrls.push(url);
        return false;
      },
    });
    }));
    await cmd.parseAsync([], { from: 'user' });
    expect(checkedUrls).toContain('http://local:3200');
    expect(checkedUrls).toContain('http://remote:3100');
@@ -116,14 +103,100 @@ describe('status command', () => {

  it('shows registries from config', async () => {
    saveConfig({ ...DEFAULT_CONFIG, registries: ['official'] }, { configDir: tempDir });
    const cmd = createStatusCommand({
      configDeps: { configDir: tempDir },
      credentialsDeps: { configDir: tempDir },
      log,
      checkHealth: async () => true,
    });
    const cmd = createStatusCommand(baseDeps());
    await cmd.parseAsync([], { from: 'user' });
    expect(output.join('\n')).toContain('official');
    expect(output.join('\n')).not.toContain('glama');
  });

  it('shows LLM not configured hint when no LLM is set', async () => {
    const cmd = createStatusCommand(baseDeps());
    await cmd.parseAsync([], { from: 'user' });
    const out = output.join('\n');
    expect(out).toContain('LLM:');
    expect(out).toContain('not configured');
    expect(out).toContain('mcpctl config setup');
  });

  it('shows green check when LLM is healthy (non-TTY)', async () => {
    saveConfig({ ...DEFAULT_CONFIG, llm: { provider: 'anthropic', model: 'claude-haiku-3-5-20241022' } }, { configDir: tempDir });
    const cmd = createStatusCommand(baseDeps({ checkLlm: async () => 'ok' }));
    await cmd.parseAsync([], { from: 'user' });
    const out = output.join('\n');
    expect(out).toContain('anthropic / claude-haiku-3-5-20241022');
    expect(out).toContain('✓ ok');
  });

  it('shows red cross when LLM check fails (non-TTY)', async () => {
    saveConfig({ ...DEFAULT_CONFIG, llm: { provider: 'gemini-cli', model: 'gemini-2.5-flash' } }, { configDir: tempDir });
    const cmd = createStatusCommand(baseDeps({ checkLlm: async () => 'not authenticated' }));
    await cmd.parseAsync([], { from: 'user' });
    const out = output.join('\n');
    expect(out).toContain('✗ not authenticated');
  });

  it('shows error message from mcplocal', async () => {
    saveConfig({ ...DEFAULT_CONFIG, llm: { provider: 'gemini-cli', model: 'gemini-2.5-flash' } }, { configDir: tempDir });
    const cmd = createStatusCommand(baseDeps({ checkLlm: async () => 'binary not found' }));
    await cmd.parseAsync([], { from: 'user' });
    expect(output.join('\n')).toContain('✗ binary not found');
  });

  it('queries mcplocal URL for LLM health', async () => {
    saveConfig({ ...DEFAULT_CONFIG, mcplocalUrl: 'http://custom:9999', llm: { provider: 'gemini-cli', model: 'gemini-2.5-flash' } }, { configDir: tempDir });
    let queriedUrl = '';
    const cmd = createStatusCommand(baseDeps({
      checkLlm: async (url) => { queriedUrl = url; return 'ok'; },
    }));
    await cmd.parseAsync([], { from: 'user' });
    expect(queriedUrl).toBe('http://custom:9999');
  });

  it('uses spinner on TTY and writes final result', async () => {
    saveConfig({ ...DEFAULT_CONFIG, llm: { provider: 'gemini-cli', model: 'gemini-2.5-flash' } }, { configDir: tempDir });
    const cmd = createStatusCommand(baseDeps({
      isTTY: true,
      checkLlm: async () => 'ok',
    }));
    await cmd.parseAsync([], { from: 'user' });
    // On TTY, the final LLM line goes through write(), not log()
    const finalWrite = written[written.length - 1];
    expect(finalWrite).toContain('gemini-cli / gemini-2.5-flash');
    expect(finalWrite).toContain('✓ ok');
  });

  it('uses spinner on TTY and shows failure', async () => {
    saveConfig({ ...DEFAULT_CONFIG, llm: { provider: 'gemini-cli', model: 'gemini-2.5-flash' } }, { configDir: tempDir });
    const cmd = createStatusCommand(baseDeps({
      isTTY: true,
      checkLlm: async () => 'not authenticated',
    }));
    await cmd.parseAsync([], { from: 'user' });
    const finalWrite = written[written.length - 1];
    expect(finalWrite).toContain('✗ not authenticated');
  });

  it('shows not configured when LLM provider is none', async () => {
    saveConfig({ ...DEFAULT_CONFIG, llm: { provider: 'none' } }, { configDir: tempDir });
    const cmd = createStatusCommand(baseDeps());
    await cmd.parseAsync([], { from: 'user' });
    expect(output.join('\n')).toContain('not configured');
  });

  it('includes llm and llmStatus in JSON output', async () => {
    saveConfig({ ...DEFAULT_CONFIG, llm: { provider: 'gemini-cli', model: 'gemini-2.5-flash' } }, { configDir: tempDir });
    const cmd = createStatusCommand(baseDeps({ checkLlm: async () => 'ok' }));
    await cmd.parseAsync(['-o', 'json'], { from: 'user' });
    const parsed = JSON.parse(output[0]) as Record<string, unknown>;
    expect(parsed['llm']).toBe('gemini-cli / gemini-2.5-flash');
    expect(parsed['llmStatus']).toBe('ok');
  });

  it('includes null llm in JSON output when not configured', async () => {
    const cmd = createStatusCommand(baseDeps());
    await cmd.parseAsync(['-o', 'json'], { from: 'user' });
    const parsed = JSON.parse(output[0]) as Record<string, unknown>;
    expect(parsed['llm']).toBeNull();
    expect(parsed['llmStatus']).toBeNull();
  });
});

176	src/cli/tests/completions.test.ts	Normal file
@@ -0,0 +1,176 @@
import { describe, it, expect } from 'vitest';
import { readFileSync } from 'node:fs';
import { join, dirname } from 'node:path';
import { fileURLToPath } from 'node:url';

const root = join(dirname(fileURLToPath(import.meta.url)), '..', '..', '..');
const fishFile = readFileSync(join(root, 'completions', 'mcpctl.fish'), 'utf-8');
const bashFile = readFileSync(join(root, 'completions', 'mcpctl.bash'), 'utf-8');

describe('fish completions', () => {
  it('erases stale completions at the top', () => {
    const lines = fishFile.split('\n');
    const firstComplete = lines.findIndex((l) => l.startsWith('complete '));
    expect(lines[firstComplete]).toContain('-e');
  });

  it('does not offer resource types without __mcpctl_needs_resource_type guard', () => {
    const resourceTypes = ['servers', 'instances', 'secrets', 'templates', 'projects', 'users', 'groups', 'rbac', 'prompts', 'promptrequests'];
    const lines = fishFile.split('\n').filter((l) => l.startsWith('complete '));

    for (const line of lines) {
      // Find lines that offer resource types as positional args
      const offersResourceType = resourceTypes.some((r) => {
        // Match `-a "...servers..."` or `-a 'servers projects'`
        const aMatch = line.match(/-a\s+['"]([^'"]+)['"]/);
        if (!aMatch) return false;
        return aMatch[1].split(/\s+/).includes(r);
      });

      if (!offersResourceType) continue;

      // Skip the help completions line and the -e line
      if (line.includes('__fish_seen_subcommand_from help')) continue;
      // Skip project-scoped command offerings (those offer commands, not resource types)
      if (line.includes('attach-server') || line.includes('detach-server')) continue;
      // Skip lines that offer commands (not resource types)
      if (line.includes("-d 'Show") || line.includes("-d 'Manage") || line.includes("-d 'Authenticate") ||
          line.includes("-d 'Log out'") || line.includes("-d 'Get instance") || line.includes("-d 'Create a resource'") ||
          line.includes("-d 'Edit a resource'") || line.includes("-d 'Apply") || line.includes("-d 'Backup") ||
          line.includes("-d 'Restore") || line.includes("-d 'List resources") || line.includes("-d 'Delete a resource'")) continue;

      // Lines offering resource types MUST have __mcpctl_needs_resource_type in their condition
      expect(line, `Resource type completion missing guard: ${line}`).toContain('__mcpctl_needs_resource_type');
    }
  });

  it('resource name completions require resource type to be selected', () => {
    const lines = fishFile.split('\n').filter((l) => l.startsWith('complete') && l.includes('__mcpctl_resource_names'));
    expect(lines.length).toBeGreaterThan(0);
    for (const line of lines) {
      expect(line).toContain('not __mcpctl_needs_resource_type');
    }
  });

  it('defines --project option', () => {
    expect(fishFile).toContain("complete -c mcpctl -l project");
  });

  it('attach-server command only shows with --project', () => {
    // Only check lines that OFFER attach-server as a command (via -a attach-server), not argument completions
    const lines = fishFile.split('\n').filter((l) =>
      l.startsWith('complete') && l.includes("-a attach-server"));
    expect(lines.length).toBeGreaterThan(0);
    for (const line of lines) {
      expect(line).toContain('__mcpctl_has_project');
    }
  });

  it('detach-server command only shows with --project', () => {
    const lines = fishFile.split('\n').filter((l) =>
      l.startsWith('complete') && l.includes("-a detach-server"));
    expect(lines.length).toBeGreaterThan(0);
    for (const line of lines) {
      expect(line).toContain('__mcpctl_has_project');
    }
  });

  it('resource name functions use jq .[][].name to unwrap wrapped JSON and avoid nested matches', () => {
    // API returns { "resources": [...] } not [...], so .[].name fails silently.
    // Must use .[][].name to unwrap the outer object then iterate the array.
    // Also must not use string match regex which matches nested name fields.
    const resourceNamesFn = fishFile.match(/function __mcpctl_resource_names[\s\S]*?^end/m)?.[0] ?? '';
    const projectNamesFn = fishFile.match(/function __mcpctl_project_names[\s\S]*?^end/m)?.[0] ?? '';

    expect(resourceNamesFn, '__mcpctl_resource_names must use jq .[][].name').toContain("jq -r '.[][].name'");
    expect(resourceNamesFn, '__mcpctl_resource_names must not use string match on name').not.toMatch(/string match.*"name"/);

    expect(projectNamesFn, '__mcpctl_project_names must use jq .[][].name').toContain("jq -r '.[][].name'");
    expect(projectNamesFn, '__mcpctl_project_names must not use string match on name').not.toMatch(/string match.*"name"/);
  });

  it('instances use server.name instead of name', () => {
    const resourceNamesFn = fishFile.match(/function __mcpctl_resource_names[\s\S]*?^end/m)?.[0] ?? '';
    expect(resourceNamesFn, 'must handle instances via server.name').toContain('.server.name');
  });

  it('attach-server completes with available (unattached) servers and guards against repeat', () => {
    const attachLine = fishFile.split('\n').find((l) =>
      l.startsWith('complete') && l.includes('__fish_seen_subcommand_from attach-server'));
    expect(attachLine, 'attach-server argument completion must exist').toBeDefined();
    expect(attachLine, 'attach-server must use __mcpctl_available_servers').toContain('__mcpctl_available_servers');
    expect(attachLine, 'attach-server must guard with __mcpctl_needs_server_arg').toContain('__mcpctl_needs_server_arg');
  });

  it('detach-server completes with project servers and guards against repeat', () => {
    const detachLine = fishFile.split('\n').find((l) =>
      l.startsWith('complete') && l.includes('__fish_seen_subcommand_from detach-server'));
    expect(detachLine, 'detach-server argument completion must exist').toBeDefined();
    expect(detachLine, 'detach-server must use __mcpctl_project_servers').toContain('__mcpctl_project_servers');
    expect(detachLine, 'detach-server must guard with __mcpctl_needs_server_arg').toContain('__mcpctl_needs_server_arg');
  });

  it('non-project commands do not show with --project', () => {
    const nonProjectCmds = ['status', 'login', 'logout', 'config', 'apply', 'backup', 'restore'];
    const lines = fishFile.split('\n').filter((l) => l.startsWith('complete') && l.includes('-a '));

    for (const cmd of nonProjectCmds) {
      const cmdLines = lines.filter((l) => {
        const aMatch = l.match(/-a\s+(\S+)/);
        return aMatch && aMatch[1].replace(/['"]/g, '') === cmd;
      });
      for (const line of cmdLines) {
        expect(line, `${cmd} should require 'not __mcpctl_has_project'`).toContain('not __mcpctl_has_project');
      }
    }
  });
});

describe('bash completions', () => {
  it('separates project commands from regular commands', () => {
    expect(bashFile).toContain('project_commands=');
    expect(bashFile).toContain('attach-server detach-server');
  });

  it('checks has_project before offering project commands', () => {
    expect(bashFile).toContain('if $has_project');
    expect(bashFile).toContain('$project_commands');
  });

  it('fetches resource names dynamically after resource type', () => {
    expect(bashFile).toContain('_mcpctl_resource_names');
    // get/describe/delete should use resource_names when resource_type is set
    expect(bashFile).toMatch(/get\|describe\|delete\)[\s\S]*?_mcpctl_resource_names/);
  });

  it('attach-server filters out already-attached servers and guards against repeat', () => {
    const attachBlock = bashFile.match(/attach-server\)[\s\S]*?return ;;/)?.[0] ?? '';
    expect(attachBlock, 'attach-server must use _mcpctl_get_project_value').toContain('_mcpctl_get_project_value');
    expect(attachBlock, 'attach-server must query project servers to exclude').toContain('--project');
    expect(attachBlock, 'attach-server must check position to prevent repeat').toContain('cword - subcmd_pos');
  });

  it('detach-server shows only project servers and guards against repeat', () => {
    const detachBlock = bashFile.match(/detach-server\)[\s\S]*?return ;;/)?.[0] ?? '';
    expect(detachBlock, 'detach-server must use _mcpctl_get_project_value').toContain('_mcpctl_get_project_value');
    expect(detachBlock, 'detach-server must query project servers').toContain('--project');
    expect(detachBlock, 'detach-server must check position to prevent repeat').toContain('cword - subcmd_pos');
  });

  it('instances use server.name instead of name', () => {
    const fnMatch = bashFile.match(/_mcpctl_resource_names\(\)[\s\S]*?\n\s*\}/)?.[0] ?? '';
    expect(fnMatch, 'must handle instances via .server.name').toContain('.server.name');
  });

  it('defines --project option', () => {
    expect(bashFile).toContain('--project');
  });

  it('resource name function uses jq .[][].name to unwrap wrapped JSON and avoid nested matches', () => {
    const fnMatch = bashFile.match(/_mcpctl_resource_names\(\)[\s\S]*?\n\s*\}/)?.[0] ?? '';
    expect(fnMatch, '_mcpctl_resource_names must use jq .[][].name').toContain("jq -r '.[][].name'");
    expect(fnMatch, '_mcpctl_resource_names must not use grep on name').not.toMatch(/grep.*"name"/);
    // Guard against .[].name (single bracket) which fails on wrapped JSON
    expect(fnMatch, '_mcpctl_resource_names must not use .[].name (needs .[][].name)').not.toMatch(/jq.*'\.\[\]\.name'/);
  });
});

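The jq behavior these assertions pin down can be reproduced directly in a shell session. A minimal sketch, assuming a sample wrapped payload (illustrative, not the actual API response):

```shell
# The list endpoints return a wrapped object, e.g. {"servers":[...]},
# not a bare array. The first '[]' on an object yields its values (the
# inner array); the second '[]' iterates that array's elements.
echo '{"servers":[{"name":"github"},{"name":"postgres"}]}' \
  | jq -r '.[][].name'
# github
# postgres

# The single-bracket form '.[].name' applies .name to the inner array
# itself, which is a jq type error rather than the list of names.
echo '{"servers":[{"name":"github"}]}' \
  | jq -r '.[].name' 2>&1 | head -1
```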
@@ -21,35 +21,62 @@ describe('CLI command registration (e2e)', () => {
    expect(commandNames).toContain('apply');
    expect(commandNames).toContain('create');
    expect(commandNames).toContain('edit');
    expect(commandNames).toContain('claude');
    expect(commandNames).toContain('project');
    expect(commandNames).toContain('backup');
    expect(commandNames).toContain('restore');
  });

  it('instance command is removed (use get/delete/logs instead)', () => {
  it('old project and claude top-level commands are removed', () => {
    const program = createProgram();
    const commandNames = program.commands.map((c) => c.name());
    expect(commandNames).not.toContain('claude');
    expect(commandNames).not.toContain('project');
    expect(commandNames).not.toContain('instance');
  });

  it('claude command has config management subcommands', () => {
  it('config command has claude-generate and impersonate subcommands', () => {
    const program = createProgram();
    const claude = program.commands.find((c) => c.name() === 'claude');
    expect(claude).toBeDefined();
    const config = program.commands.find((c) => c.name() === 'config');
    expect(config).toBeDefined();

    const subcommands = claude!.commands.map((c) => c.name());
    expect(subcommands).toContain('generate');
    expect(subcommands).toContain('show');
    expect(subcommands).toContain('add');
    expect(subcommands).toContain('remove');
    const subcommands = config!.commands.map((c) => c.name());
    expect(subcommands).toContain('claude-generate');
    expect(subcommands).toContain('impersonate');
    expect(subcommands).toContain('view');
    expect(subcommands).toContain('set');
    expect(subcommands).toContain('path');
    expect(subcommands).toContain('reset');
  });

  it('project command exists with alias', () => {
  it('create command has user, group, rbac, prompt, promptrequest subcommands', () => {
    const program = createProgram();
    const project = program.commands.find((c) => c.name() === 'project');
    expect(project).toBeDefined();
    expect(project!.alias()).toBe('proj');
    const create = program.commands.find((c) => c.name() === 'create');
    expect(create).toBeDefined();

    const subcommands = create!.commands.map((c) => c.name());
    expect(subcommands).toContain('server');
    expect(subcommands).toContain('secret');
    expect(subcommands).toContain('project');
    expect(subcommands).toContain('user');
    expect(subcommands).toContain('group');
    expect(subcommands).toContain('rbac');
    expect(subcommands).toContain('prompt');
    expect(subcommands).toContain('promptrequest');
  });

  it('get command accepts --project option', () => {
    const program = createProgram();
    const get = program.commands.find((c) => c.name() === 'get');
    expect(get).toBeDefined();

    const projectOpt = get!.options.find((o) => o.long === '--project');
    expect(projectOpt).toBeDefined();
    expect(projectOpt!.description).toContain('project');
  });

  it('program-level --project option is defined', () => {
    const program = createProgram();
    const projectOpt = program.options.find((o) => o.long === '--project');
    expect(projectOpt).toBeDefined();
  });

  it('displays version', () => {

@@ -0,0 +1,8 @@
-- DropForeignKey
ALTER TABLE "ProjectMember" DROP CONSTRAINT IF EXISTS "ProjectMember_projectId_fkey";

-- DropForeignKey
ALTER TABLE "ProjectMember" DROP CONSTRAINT IF EXISTS "ProjectMember_userId_fkey";

-- DropTable
DROP TABLE IF EXISTS "ProjectMember";
@@ -0,0 +1,11 @@
-- AlterTable: Add gated flag to Project
ALTER TABLE "Project" ADD COLUMN "gated" BOOLEAN NOT NULL DEFAULT true;

-- AlterTable: Add priority, summary, chapters, linkTarget to Prompt
ALTER TABLE "Prompt" ADD COLUMN "priority" INTEGER NOT NULL DEFAULT 5;
ALTER TABLE "Prompt" ADD COLUMN "summary" TEXT;
ALTER TABLE "Prompt" ADD COLUMN "chapters" JSONB;
ALTER TABLE "Prompt" ADD COLUMN "linkTarget" TEXT;

-- AlterTable: Add priority to PromptRequest
ALTER TABLE "PromptRequest" ADD COLUMN "priority" INTEGER NOT NULL DEFAULT 5;
@@ -15,13 +15,16 @@ model User {
  name         String?
  passwordHash String
  role         Role     @default(USER)
  provider     String?
  externalId   String?
  version      Int      @default(1)
  createdAt    DateTime @default(now())
  updatedAt    DateTime @updatedAt

  sessions  Session[]
  auditLogs AuditLog[]
  projects  Project[]
  sessions         Session[]
  auditLogs        AuditLog[]
  ownedProjects    Project[]
  groupMemberships GroupMember[]

  @@index([email])
}
@@ -71,6 +74,7 @@ model McpServer {
  templateVersion String?

  instances McpInstance[]
  projects  ProjectServer[]

  @@index([name])
}
@@ -117,23 +121,86 @@ model Secret {
  @@index([name])
}

// ── Groups ──

model Group {
  id          String   @id @default(cuid())
  name        String   @unique
  description String   @default("")
  version     Int      @default(1)
  createdAt   DateTime @default(now())
  updatedAt   DateTime @updatedAt

  members GroupMember[]

  @@index([name])
}

model GroupMember {
  id        String   @id @default(cuid())
  groupId   String
  userId    String
  createdAt DateTime @default(now())

  group Group @relation(fields: [groupId], references: [id], onDelete: Cascade)
  user  User  @relation(fields: [userId], references: [id], onDelete: Cascade)

  @@unique([groupId, userId])
  @@index([groupId])
  @@index([userId])
}

// ── RBAC Definitions ──

model RbacDefinition {
  id           String   @id @default(cuid())
  name         String   @unique
  subjects     Json     @default("[]")
  roleBindings Json     @default("[]")
  version      Int      @default(1)
  createdAt    DateTime @default(now())
  updatedAt    DateTime @updatedAt

  @@index([name])
}

// ── Projects ──

model Project {
  id          String   @id @default(cuid())
  name        String   @unique
  description String   @default("")
  prompt      String   @default("")
  proxyMode   String   @default("direct")
  gated       Boolean  @default(true)
  llmProvider String?
  llmModel    String?
  ownerId     String
  version     Int      @default(1)
  createdAt   DateTime @default(now())
  updatedAt   DateTime @updatedAt

  owner User @relation(fields: [ownerId], references: [id], onDelete: Cascade)
  owner          User            @relation(fields: [ownerId], references: [id], onDelete: Cascade)
  servers        ProjectServer[]
  prompts        Prompt[]
  promptRequests PromptRequest[]

  @@index([name])
  @@index([ownerId])
}

model ProjectServer {
  id        String   @id @default(cuid())
  projectId String
  serverId  String
  createdAt DateTime @default(now())

  project Project   @relation(fields: [projectId], references: [id], onDelete: Cascade)
  server  McpServer @relation(fields: [serverId], references: [id], onDelete: Cascade)

  @@unique([projectId, serverId])
}

// ── MCP Instances (running containers) ──

model McpInstance {
@@ -164,6 +231,46 @@ enum InstanceStatus {
  ERROR
}

// ── Prompts (approved content resources) ──

model Prompt {
  id         String   @id @default(cuid())
  name       String
  content    String   @db.Text
  projectId  String?
  priority   Int      @default(5)
  summary    String?  @db.Text
  chapters   Json?
  linkTarget String?
  version    Int      @default(1)
  createdAt  DateTime @default(now())
  updatedAt  DateTime @updatedAt

  project Project? @relation(fields: [projectId], references: [id], onDelete: Cascade)

  @@unique([name, projectId])
  @@index([projectId])
}

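The `priority` and `summary` columns exist so the proxy can hand the client LLM a compact index of available prompts instead of full bodies. A minimal sketch of building such an index; the `PromptRow` shape, the `buildPromptIndex` helper, and the "lower number = more important" ordering are assumptions for illustration, not part of the schema or codebase:

```typescript
// Hypothetical row shape mirroring the Prompt model's index-relevant fields.
interface PromptRow {
  name: string;
  priority: number;      // assumption: lower number = more important
  summary: string | null;
}

// Build a compact, priority-ordered index suitable for a context window.
function buildPromptIndex(rows: PromptRow[]): string[] {
  return [...rows]
    .sort((a, b) => a.priority - b.priority)
    .map((r) => `[P${r.priority}] ${r.name}${r.summary ? ': ' + r.summary : ''}`);
}

const index = buildPromptIndex([
  { name: 'arch-decisions', priority: 5, summary: null },
  { name: 'security-policy', priority: 1, summary: 'Auth and secrets rules' },
]);
console.log(index.join('\n'));
// [P1] security-policy: Auth and secrets rules
// [P5] arch-decisions
```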
// ── Prompt Requests (pending proposals from LLM sessions) ──

model PromptRequest {
  id               String   @id @default(cuid())
  name             String
  content          String   @db.Text
  projectId        String?
  priority         Int      @default(5)
  createdBySession String?
  createdByUserId  String?
  createdAt        DateTime @default(now())

  project Project? @relation(fields: [projectId], references: [id], onDelete: Cascade)

  @@unique([name, projectId])
  @@index([projectId])
  @@index([createdBySession])
}

// ── Audit Logs ──

model AuditLog {

@@ -49,10 +49,15 @@ export async function clearAllTables(client: PrismaClient): Promise<void> {
  // Delete in order respecting foreign keys
  await client.auditLog.deleteMany();
  await client.mcpInstance.deleteMany();
  await client.projectServer.deleteMany();
  await client.projectMember.deleteMany();
  await client.secret.deleteMany();
  await client.session.deleteMany();
  await client.project.deleteMany();
  await client.mcpServer.deleteMany();
  await client.mcpTemplate.deleteMany();
  await client.groupMember.deleteMany();
  await client.group.deleteMany();
  await client.rbacDefinition.deleteMany();
  await client.user.deleteMany();
}

@@ -23,11 +23,35 @@ async function createUser(overrides: { email?: string; name?: string; role?: 'US
    data: {
      email: overrides.email ?? `test-${Date.now()}@example.com`,
      name: overrides.name ?? 'Test User',
      passwordHash: '$2b$10$test-hash-placeholder',
      role: overrides.role ?? 'USER',
    },
  });
}

async function createGroup(overrides: { name?: string; description?: string } = {}) {
  return prisma.group.create({
    data: {
      name: overrides.name ?? `group-${Date.now()}`,
      description: overrides.description ?? 'Test group',
    },
  });
}

async function createProject(overrides: { name?: string; ownerId?: string } = {}) {
  let ownerId = overrides.ownerId;
  if (!ownerId) {
    const user = await createUser();
    ownerId = user.id;
  }
  return prisma.project.create({
    data: {
      name: overrides.name ?? `project-${Date.now()}`,
      ownerId,
    },
  });
}

async function createServer(overrides: { name?: string; transport?: 'STDIO' | 'SSE' | 'STREAMABLE_HTTP' } = {}) {
  return prisma.mcpServer.create({
    data: {
@@ -309,3 +333,236 @@ describe('AuditLog', () => {
    expect(logs).toHaveLength(0);
  });
});

// ── User SSO fields ──

describe('User SSO fields', () => {
  it('stores provider and externalId', async () => {
    const user = await prisma.user.create({
      data: {
        email: 'sso@example.com',
        passwordHash: 'hash',
        provider: 'oidc',
        externalId: 'ext-123',
      },
    });
    expect(user.provider).toBe('oidc');
    expect(user.externalId).toBe('ext-123');
  });

  it('defaults provider and externalId to null', async () => {
    const user = await createUser();
    expect(user.provider).toBeNull();
    expect(user.externalId).toBeNull();
  });
});

// ── Group model ──

describe('Group', () => {
  it('creates a group with defaults', async () => {
    const group = await createGroup();
    expect(group.id).toBeDefined();
    expect(group.version).toBe(1);
  });

  it('enforces unique name', async () => {
    await createGroup({ name: 'devs' });
    await expect(createGroup({ name: 'devs' })).rejects.toThrow();
  });

  it('creates group members', async () => {
    const group = await createGroup();
    const user = await createUser();
    const member = await prisma.groupMember.create({
      data: { groupId: group.id, userId: user.id },
    });
    expect(member.groupId).toBe(group.id);
    expect(member.userId).toBe(user.id);
  });

  it('enforces unique group-user pair', async () => {
    const group = await createGroup();
    const user = await createUser();
    await prisma.groupMember.create({ data: { groupId: group.id, userId: user.id } });
    await expect(
      prisma.groupMember.create({ data: { groupId: group.id, userId: user.id } }),
    ).rejects.toThrow();
  });

  it('cascades delete when group is deleted', async () => {
    const group = await createGroup();
    const user = await createUser();
    await prisma.groupMember.create({ data: { groupId: group.id, userId: user.id } });
    await prisma.group.delete({ where: { id: group.id } });
    const members = await prisma.groupMember.findMany({ where: { groupId: group.id } });
    expect(members).toHaveLength(0);
  });
});

// ── RbacDefinition model ──

describe('RbacDefinition', () => {
  it('creates with defaults', async () => {
    const rbac = await prisma.rbacDefinition.create({
      data: { name: 'test-rbac' },
    });
    expect(rbac.subjects).toEqual([]);
    expect(rbac.roleBindings).toEqual([]);
    expect(rbac.version).toBe(1);
  });

  it('enforces unique name', async () => {
    await prisma.rbacDefinition.create({ data: { name: 'dup-rbac' } });
    await expect(prisma.rbacDefinition.create({ data: { name: 'dup-rbac' } })).rejects.toThrow();
  });

  it('stores subjects as JSON', async () => {
    const rbac = await prisma.rbacDefinition.create({
      data: {
        name: 'with-subjects',
        subjects: [{ kind: 'User', name: 'alice@test.com' }, { kind: 'Group', name: 'devs' }],
      },
    });
    const subjects = rbac.subjects as Array<{ kind: string; name: string }>;
    expect(subjects).toHaveLength(2);
    expect(subjects[0].kind).toBe('User');
  });

  it('stores roleBindings as JSON', async () => {
    const rbac = await prisma.rbacDefinition.create({
      data: {
        name: 'with-bindings',
        roleBindings: [{ role: 'editor', resource: 'servers' }],
      },
    });
    const bindings = rbac.roleBindings as Array<{ role: string; resource: string }>;
    expect(bindings).toHaveLength(1);
    expect(bindings[0].role).toBe('editor');
  });

  it('updates subjects and roleBindings', async () => {
    const rbac = await prisma.rbacDefinition.create({ data: { name: 'updatable-rbac' } });
    const updated = await prisma.rbacDefinition.update({
      where: { id: rbac.id },
      data: {
        subjects: [{ kind: 'User', name: 'bob@test.com' }],
        roleBindings: [{ role: 'admin', resource: '*' }],
      },
    });
    expect((updated.subjects as unknown[]).length).toBe(1);
    expect((updated.roleBindings as unknown[]).length).toBe(1);
  });
});

// ── ProjectServer model ──
|
||||
|
||||
describe('ProjectServer', () => {
|
||||
it('links project to server', async () => {
|
||||
const project = await createProject();
|
||||
const server = await createServer();
|
||||
const ps = await prisma.projectServer.create({
|
||||
data: { projectId: project.id, serverId: server.id },
|
||||
});
|
||||
expect(ps.projectId).toBe(project.id);
|
||||
expect(ps.serverId).toBe(server.id);
|
||||
});
|
||||
|
||||
it('enforces unique project-server pair', async () => {
|
||||
const project = await createProject();
|
||||
const server = await createServer();
|
||||
await prisma.projectServer.create({ data: { projectId: project.id, serverId: server.id } });
|
||||
await expect(
|
||||
prisma.projectServer.create({ data: { projectId: project.id, serverId: server.id } }),
|
||||
).rejects.toThrow();
|
||||
});
|
||||
|
||||
it('cascades delete when project is deleted', async () => {
|
||||
const project = await createProject();
|
||||
const server = await createServer();
|
||||
await prisma.projectServer.create({ data: { projectId: project.id, serverId: server.id } });
|
||||
await prisma.project.delete({ where: { id: project.id } });
|
||||
const links = await prisma.projectServer.findMany({ where: { projectId: project.id } });
|
||||
expect(links).toHaveLength(0);
|
||||
});
|
||||
|
||||
it('cascades delete when server is deleted', async () => {
|
||||
const project = await createProject();
|
||||
const server = await createServer();
|
||||
await prisma.projectServer.create({ data: { projectId: project.id, serverId: server.id } });
|
||||
await prisma.mcpServer.delete({ where: { id: server.id } });
|
||||
const links = await prisma.projectServer.findMany({ where: { serverId: server.id } });
|
||||
expect(links).toHaveLength(0);
|
||||
});
|
||||
});
|
||||
|
||||
// ── ProjectMember model ──
|
||||
|
||||
describe('ProjectMember', () => {
|
||||
it('links project to user with role', async () => {
|
||||
const user = await createUser();
|
||||
const project = await createProject({ ownerId: user.id });
|
||||
const pm = await prisma.projectMember.create({
|
||||
data: { projectId: project.id, userId: user.id, role: 'admin' },
|
||||
});
|
||||
expect(pm.role).toBe('admin');
|
||||
});
|
||||
|
||||
it('defaults role to member', async () => {
|
||||
const user = await createUser();
|
||||
const project = await createProject({ ownerId: user.id });
|
||||
const pm = await prisma.projectMember.create({
|
||||
data: { projectId: project.id, userId: user.id },
|
||||
});
|
||||
expect(pm.role).toBe('member');
|
||||
});
|
||||
|
||||
it('enforces unique project-user pair', async () => {
|
||||
const user = await createUser();
|
||||
const project = await createProject({ ownerId: user.id });
|
||||
await prisma.projectMember.create({ data: { projectId: project.id, userId: user.id } });
|
||||
await expect(
|
||||
prisma.projectMember.create({ data: { projectId: project.id, userId: user.id } }),
|
||||
).rejects.toThrow();
|
||||
});
|
||||
|
||||
it('cascades delete when project is deleted', async () => {
|
||||
const user = await createUser();
|
||||
const project = await createProject({ ownerId: user.id });
|
||||
await prisma.projectMember.create({ data: { projectId: project.id, userId: user.id } });
|
||||
await prisma.project.delete({ where: { id: project.id } });
|
||||
const members = await prisma.projectMember.findMany({ where: { projectId: project.id } });
|
||||
expect(members).toHaveLength(0);
|
||||
});
|
||||
});
|
||||
|
||||
// ── Project new fields ──
|
||||
|
||||
describe('Project new fields', () => {
|
||||
it('defaults proxyMode to direct', async () => {
|
||||
const project = await createProject();
|
||||
expect(project.proxyMode).toBe('direct');
|
||||
});
|
||||
|
||||
it('stores proxyMode, llmProvider, llmModel', async () => {
|
||||
const user = await createUser();
|
||||
const project = await prisma.project.create({
|
||||
data: {
|
||||
name: 'filtered-project',
|
||||
ownerId: user.id,
|
||||
proxyMode: 'filtered',
|
||||
llmProvider: 'gemini-cli',
|
||||
llmModel: 'gemini-2.0-flash',
|
||||
},
|
||||
});
|
||||
expect(project.proxyMode).toBe('filtered');
|
||||
expect(project.llmProvider).toBe('gemini-cli');
|
||||
expect(project.llmModel).toBe('gemini-2.0-flash');
|
||||
});
|
||||
|
||||
it('defaults llmProvider and llmModel to null', async () => {
|
||||
const project = await createProject();
|
||||
expect(project.llmProvider).toBeNull();
|
||||
expect(project.llmModel).toBeNull();
|
||||
});
|
||||
});
|
||||
|
||||
102
src/mcpd/src/bootstrap/system-project.ts
Normal file
@@ -0,0 +1,102 @@
/**
 * Bootstrap the mcpctl-system project and its system prompts.
 *
 * This runs on every mcpd startup and uses upserts to be idempotent.
 * System prompts are editable by users but will be re-created if deleted.
 */

import type { PrismaClient } from '@prisma/client';

/** Well-known owner ID for system-managed resources. */
export const SYSTEM_OWNER_ID = 'system';

/** Well-known project name for system prompts. */
export const SYSTEM_PROJECT_NAME = 'mcpctl-system';

interface SystemPromptDef {
  name: string;
  priority: number;
  content: string;
}

const SYSTEM_PROMPTS: SystemPromptDef[] = [
  {
    name: 'gate-instructions',
    priority: 10,
    content: `This project uses a gated session. Before you can access tools, you must describe your current task by calling begin_session with 3-7 keywords.

After calling begin_session, you will receive:
1. Relevant project prompts matched to your keywords
2. A list of other available prompts
3. Full access to all project tools

Choose your keywords carefully — they determine which context you receive.`,
  },
  {
    name: 'gate-encouragement',
    priority: 10,
    content: `If any of the listed prompts seem relevant to your work, or if you encounter unfamiliar patterns, conventions, or constraints during implementation, use read_prompts({ tags: [...] }) to retrieve them.

It is better to check and not need it than to proceed without important context. The project maintainers have documented common pitfalls, architecture decisions, and required patterns — taking 10 seconds to retrieve a prompt can save hours of rework.`,
  },
  {
    name: 'gate-intercept-preamble',
    priority: 10,
    content: `The following project context was automatically retrieved based on your tool call. You bypassed the begin_session step, so this context was matched using keywords extracted from your tool invocation.

Review this context carefully — it may contain important guidelines, constraints, or patterns relevant to your work. If you need more context, use read_prompts({ tags: [...] }) at any time.`,
  },
  {
    name: 'session-greeting',
    priority: 10,
    content: `Welcome to this project. To get started, call begin_session with keywords describing your task.

Example: begin_session({ tags: ["zigbee", "pairing", "mqtt"] })

This will load relevant project context, policies, and guidelines tailored to your work.`,
  },
];

/**
 * Ensure the mcpctl-system project and its system prompts exist.
 * Uses upserts so this is safe to call on every startup.
 */
export async function bootstrapSystemProject(prisma: PrismaClient): Promise<void> {
  // Upsert the system project
  const project = await prisma.project.upsert({
    where: { name: SYSTEM_PROJECT_NAME },
    create: {
      name: SYSTEM_PROJECT_NAME,
      description: 'System prompts for mcpctl gating and session management',
      prompt: '',
      proxyMode: 'direct',
      gated: false,
      ownerId: SYSTEM_OWNER_ID,
    },
    update: {}, // Don't overwrite user edits to the project itself
  });

  // Upsert each system prompt (re-create if deleted, don't overwrite content if edited)
  for (const def of SYSTEM_PROMPTS) {
    const existing = await prisma.prompt.findFirst({
      where: { name: def.name, projectId: project.id },
    });

    if (!existing) {
      await prisma.prompt.create({
        data: {
          name: def.name,
          content: def.content,
          priority: def.priority,
          projectId: project.id,
        },
      });
    }
    // If the prompt exists, don't overwrite — user may have edited it
  }
}

/** Get the names of all system prompts (for delete protection). */
export function getSystemPromptNames(): string[] {
  return SYSTEM_PROMPTS.map((p) => p.name);
}
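The re-create-if-missing logic above can be exercised against a small in-memory stand-in. The sketch below is illustrative only: the stubbed `upsertProject`/`bootstrapOnce` helpers are hypothetical simplifications of the Prisma-backed function, used to show that a second startup neither duplicates the project nor overwrites a prompt the user has edited.

```typescript
// Illustrative in-memory stand-in for the Prisma calls used above (not the real PrismaClient).
interface PromptRow { name: string; content: string; projectId: string }

const prompts: PromptRow[] = [];
const projects = new Map<string, { id: string; name: string }>();

function upsertProject(name: string): { id: string; name: string } {
  const existing = projects.get(name);
  if (existing) return existing; // update: {} means an existing project is left untouched
  const row = { id: `proj-${projects.size + 1}`, name };
  projects.set(name, row);
  return row;
}

function bootstrapOnce(): void {
  const project = upsertProject('mcpctl-system');
  for (const def of [{ name: 'gate-instructions', content: 'v1' }]) {
    const existing = prompts.find((p) => p.name === def.name && p.projectId === project.id);
    // Re-create only if missing; an edited prompt is never overwritten
    if (!existing) prompts.push({ ...def, projectId: project.id });
  }
}

bootstrapOnce();
prompts[0]!.content = 'edited by a user'; // simulate a user edit between restarts
bootstrapOnce(); // second startup: no duplicates, edit preserved

console.log(projects.size);       // 1
console.log(prompts.length);      // 1
console.log(prompts[0]!.content); // edited by a user
```

The same invariant is what makes calling `bootstrapSystemProject` unconditionally in `main()` safe.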
@@ -14,7 +14,13 @@ import {
  ProjectRepository,
  AuditLogRepository,
  TemplateRepository,
  RbacDefinitionRepository,
  UserRepository,
  GroupRepository,
} from './repositories/index.js';
import { PromptRepository } from './repositories/prompt.repository.js';
import { PromptRequestRepository } from './repositories/prompt-request.repository.js';
import { bootstrapSystemProject } from './bootstrap/system-project.js';
import {
  McpServerService,
  SecretService,
@@ -30,7 +36,14 @@ import {
  McpProxyService,
  TemplateService,
  HealthProbeRunner,
  RbacDefinitionService,
  RbacService,
  UserService,
  GroupService,
} from './services/index.js';
import type { RbacAction } from './services/index.js';
import type { UpdateRbacDefinitionInput } from './validation/rbac-definition.schema.js';
import { createAuthMiddleware } from './middleware/auth.js';
import {
  registerMcpServerRoutes,
  registerSecretRoutes,
@@ -42,7 +55,155 @@ import {
  registerAuthRoutes,
  registerMcpProxyRoutes,
  registerTemplateRoutes,
  registerRbacRoutes,
  registerUserRoutes,
  registerGroupRoutes,
} from './routes/index.js';
import { registerPromptRoutes } from './routes/prompts.js';
import { PromptService } from './services/prompt.service.js';

type PermissionCheck =
  | { kind: 'resource'; resource: string; action: RbacAction; resourceName?: string }
  | { kind: 'operation'; operation: string }
  | { kind: 'skip' };

/**
 * Map an HTTP method + URL to a permission check.
 * Returns 'skip' for URLs that should not be RBAC-checked.
 */
function mapUrlToPermission(method: string, url: string): PermissionCheck {
  const match = url.match(/^\/api\/v1\/([a-z-]+)/);
  if (!match) return { kind: 'skip' };

  const segment = match[1] as string;

  // Operations (non-resource endpoints)
  if (segment === 'backup') return { kind: 'operation', operation: 'backup' };
  if (segment === 'restore') return { kind: 'operation', operation: 'restore' };
  if (segment === 'audit-logs' && method === 'DELETE') return { kind: 'operation', operation: 'audit-purge' };

  const resourceMap: Record<string, string | undefined> = {
    'servers': 'servers',
    'instances': 'instances',
    'secrets': 'secrets',
    'projects': 'projects',
    'templates': 'templates',
    'users': 'users',
    'groups': 'groups',
    'rbac': 'rbac',
    'audit-logs': 'rbac',
    'mcp': 'servers',
    'prompts': 'prompts',
    'promptrequests': 'promptrequests',
  };

  const resource = resourceMap[segment];
  if (resource === undefined) return { kind: 'skip' };

  // Special case: /api/v1/promptrequests/:id/approve → needs both delete+promptrequests and create+prompts
  // We check delete on promptrequests (the harder permission); create on prompts is checked in the service layer
  const approveMatch = url.match(/^\/api\/v1\/promptrequests\/([^/?]+)\/approve/);
  if (approveMatch?.[1]) {
    return { kind: 'resource', resource: 'promptrequests', action: 'delete', resourceName: approveMatch[1] };
  }

  // Special case: /api/v1/projects/:name/prompts/visible → view prompts
  const visiblePromptsMatch = url.match(/^\/api\/v1\/projects\/([^/?]+)\/prompts\/visible/);
  if (visiblePromptsMatch?.[1]) {
    return { kind: 'resource', resource: 'prompts', action: 'view' };
  }

  // Special case: /api/v1/projects/:name/promptrequests → create promptrequests
  const projectPromptrequestsMatch = url.match(/^\/api\/v1\/projects\/([^/?]+)\/promptrequests/);
  if (projectPromptrequestsMatch?.[1] && method === 'POST') {
    return { kind: 'resource', resource: 'promptrequests', action: 'create' };
  }

  // Special case: /api/v1/projects/:id/instructions → view projects
  const instructionsMatch = url.match(/^\/api\/v1\/projects\/([^/?]+)\/instructions/);
  if (instructionsMatch?.[1]) {
    return { kind: 'resource', resource: 'projects', action: 'view', resourceName: instructionsMatch[1] };
  }

  // Special case: /api/v1/projects/:id/mcp-config → requires 'expose' permission
  const mcpConfigMatch = url.match(/^\/api\/v1\/projects\/([^/?]+)\/mcp-config/);
  if (mcpConfigMatch?.[1]) {
    return { kind: 'resource', resource: 'projects', action: 'expose', resourceName: mcpConfigMatch[1] };
  }

  // Special case: /api/v1/projects/:id/servers — attach/detach requires 'edit'
  const projectServersMatch = url.match(/^\/api\/v1\/projects\/([^/?]+)\/servers/);
  if (projectServersMatch?.[1] && method !== 'GET') {
    return { kind: 'resource', resource: 'projects', action: 'edit', resourceName: projectServersMatch[1] };
  }

  // Map HTTP method to action
  let action: RbacAction;
  switch (method) {
    case 'GET':
    case 'HEAD':
      action = 'view';
      break;
    case 'POST':
      action = 'create';
      break;
    case 'DELETE':
      action = 'delete';
      break;
    default: // PUT, PATCH
      action = 'edit';
      break;
  }

  // Extract resource name/ID from URL (3rd segment: /api/v1/servers/:nameOrId)
  const nameMatch = url.match(/^\/api\/v1\/[a-z-]+\/([^/?]+)/);
  const resourceName = nameMatch?.[1];

  const check: PermissionCheck = { kind: 'resource', resource, action };
  if (resourceName !== undefined) (check as { resourceName: string }).resourceName = resourceName;
  return check;
}

/**
 * Migrate legacy 'admin' role bindings → granular roles.
 * Old format: { role: 'admin', resource: '*' }
 * New format: { role: 'edit', resource: '*' }, { role: 'run', resource: '*' },
 * plus operation bindings for impersonate, logs, backup, restore, audit-purge
 */
async function migrateAdminRole(rbacRepo: InstanceType<typeof RbacDefinitionRepository>): Promise<void> {
  const definitions = await rbacRepo.findAll();
  for (const def of definitions) {
    const bindings = def.roleBindings as Array<Record<string, unknown>>;
    const hasAdminRole = bindings.some((b) => b['role'] === 'admin');
    if (!hasAdminRole) continue;

    // Replace admin bindings with granular equivalents
    const newBindings: Array<Record<string, string>> = [];
    for (const b of bindings) {
      if (b['role'] === 'admin') {
        const resource = b['resource'] as string;
        newBindings.push({ role: 'edit', resource });
        newBindings.push({ role: 'run', resource });
      } else {
        newBindings.push(b as Record<string, string>);
      }
    }
    // Add operation bindings (idempotent — only for wildcard admin)
    const hasWildcard = bindings.some((b) => b['role'] === 'admin' && b['resource'] === '*');
    if (hasWildcard) {
      const ops = ['impersonate', 'logs', 'backup', 'restore', 'audit-purge'];
      for (const op of ops) {
        if (!newBindings.some((b) => b['action'] === op)) {
          newBindings.push({ role: 'run', action: op });
        }
      }
    }

    await rbacRepo.update(def.id, { roleBindings: newBindings as UpdateRbacDefinitionInput['roleBindings'] });
    // eslint-disable-next-line no-console
    console.log(`mcpd: migrated RBAC '${def.name}' from admin → granular roles`);
  }
}

async function main(): Promise<void> {
  const config = loadConfigFromEnv();
@@ -75,6 +236,9 @@ async function main(): Promise<void> {
  });
  await seedTemplates(prisma, templates);

  // Bootstrap system project and prompts
  await bootstrapSystemProject(prisma);

  // Repositories
  const serverRepo = new McpServerRepository(prisma);
  const secretRepo = new SecretRepository(prisma);
@@ -82,6 +246,21 @@ async function main(): Promise<void> {
  const projectRepo = new ProjectRepository(prisma);
  const auditLogRepo = new AuditLogRepository(prisma);
  const templateRepo = new TemplateRepository(prisma);
  const rbacDefinitionRepo = new RbacDefinitionRepository(prisma);
  const userRepo = new UserRepository(prisma);
  const groupRepo = new GroupRepository(prisma);

  // CUID detection for RBAC name resolution
  const CUID_RE = /^c[^\s-]{8,}$/i;
  const nameResolvers: Record<string, { findById(id: string): Promise<{ name: string } | null> }> = {
    servers: serverRepo,
    secrets: secretRepo,
    projects: projectRepo,
    groups: groupRepo,
  };

  // Migrate legacy 'admin' role → granular roles
  await migrateAdminRole(rbacDefinitionRepo);

  // Orchestrator
  const orchestrator = new DockerContainerManager();
@@ -91,15 +270,27 @@ async function main(): Promise<void> {
  const instanceService = new InstanceService(instanceRepo, serverRepo, orchestrator, secretRepo);
  serverService.setInstanceService(instanceService);
  const secretService = new SecretService(secretRepo);
  const projectService = new ProjectService(projectRepo);
  const projectService = new ProjectService(projectRepo, serverRepo, secretRepo);
  const auditLogService = new AuditLogService(auditLogRepo);
  const metricsCollector = new MetricsCollector();
  const healthAggregator = new HealthAggregator(metricsCollector, orchestrator);
  const backupService = new BackupService(serverRepo, projectRepo, secretRepo);
  const restoreService = new RestoreService(serverRepo, projectRepo, secretRepo);
  const backupService = new BackupService(serverRepo, projectRepo, secretRepo, userRepo, groupRepo, rbacDefinitionRepo);
  const restoreService = new RestoreService(serverRepo, projectRepo, secretRepo, userRepo, groupRepo, rbacDefinitionRepo);
  const authService = new AuthService(prisma);
  const templateService = new TemplateService(templateRepo);
  const mcpProxyService = new McpProxyService(instanceRepo, serverRepo);
  const mcpProxyService = new McpProxyService(instanceRepo, serverRepo, orchestrator);
  const rbacDefinitionService = new RbacDefinitionService(rbacDefinitionRepo);
  const rbacService = new RbacService(rbacDefinitionRepo, prisma);
  const userService = new UserService(userRepo);
  const groupService = new GroupService(groupRepo, userRepo);
  const promptRepo = new PromptRepository(prisma);
  const promptRequestRepo = new PromptRequestRepository(prisma);
  const promptService = new PromptService(promptRepo, promptRequestRepo, projectRepo);

  // Auth middleware for global hooks
  const authMiddleware = createAuthMiddleware({
    findSession: (token) => authService.findSession(token),
  });

  // Server
  const app = await createServer(config, {
@@ -115,6 +306,59 @@ async function main(): Promise<void> {
    },
  });

  // ── Global auth hook ──
  // Runs on all /api/v1/* routes EXCEPT auth endpoints and health checks.
  // Tests that use createServer() directly are NOT affected — this hook
  // is only registered here in main.ts.
  app.addHook('preHandler', async (request, reply) => {
    const url = request.url;
    // Skip auth for health, auth, and root
    if (url.startsWith('/api/v1/auth/') || url === '/healthz' || url === '/health') return;
    if (!url.startsWith('/api/v1/')) return;

    // Run auth middleware
    await authMiddleware(request, reply);
  });

  // ── Global RBAC hook ──
  // Runs after the auth hook. Maps URL to resource+action and checks permissions.
  app.addHook('preHandler', async (request, reply) => {
    if (reply.sent) return; // Auth hook already rejected
    const url = request.url;
    if (url.startsWith('/api/v1/auth/') || url === '/healthz' || url === '/health') return;
    if (!url.startsWith('/api/v1/')) return;
    if (request.userId === undefined) return; // Auth hook will handle 401

    const check = mapUrlToPermission(request.method, url);
    if (check.kind === 'skip') return;

    // Extract service account identity from header (sent by mcplocal)
    const saHeader = request.headers['x-service-account'];
    const serviceAccountName = typeof saHeader === 'string' ? saHeader : undefined;

    let allowed: boolean;
    if (check.kind === 'operation') {
      allowed = await rbacService.canRunOperation(request.userId, check.operation, serviceAccountName);
    } else {
      // Resolve CUID → human name for name-scoped RBAC bindings
      if (check.resourceName !== undefined && CUID_RE.test(check.resourceName)) {
        const resolver = nameResolvers[check.resource];
        if (resolver) {
          const entity = await resolver.findById(check.resourceName);
          if (entity) check.resourceName = entity.name;
        }
      }
      allowed = await rbacService.canAccess(request.userId, check.action, check.resource, check.resourceName, serviceAccountName);
      // Compute scope for list filtering (used by preSerialization hook)
      if (allowed && check.resourceName === undefined) {
        request.rbacScope = await rbacService.getAllowedScope(request.userId, check.action, check.resource, serviceAccountName);
      }
    }
    if (!allowed) {
      reply.code(403).send({ error: 'Forbidden' });
    }
  });

  // Routes
  registerMcpServerRoutes(app, serverService, instanceService);
  registerTemplateRoutes(app, templateService);
@@ -124,12 +368,27 @@ async function main(): Promise<void> {
  registerAuditLogRoutes(app, auditLogService);
  registerHealthMonitoringRoutes(app, { healthAggregator, metricsCollector });
  registerBackupRoutes(app, { backupService, restoreService });
  registerAuthRoutes(app, { authService });
  registerAuthRoutes(app, { authService, userService, groupService, rbacDefinitionService, rbacService });
  registerMcpProxyRoutes(app, {
    mcpProxyService,
    auditLogService,
    authDeps: { findSession: (token) => authService.findSession(token) },
  });
  registerRbacRoutes(app, rbacDefinitionService);
  registerUserRoutes(app, userService);
  registerGroupRoutes(app, groupService);
  registerPromptRoutes(app, promptService, projectRepo);

  // ── RBAC list filtering hook ──
  // Filters array responses to only include resources the user is allowed to see.
  app.addHook('preSerialization', async (request, _reply, payload) => {
    if (!request.rbacScope || request.rbacScope.wildcard) return payload;
    if (!Array.isArray(payload)) return payload;
    return (payload as Array<Record<string, unknown>>).filter((item) => {
      const name = item['name'];
      return typeof name === 'string' && request.rbacScope!.names.has(name);
    });
  });

  // Start
  await app.listen({ port: config.port, host: config.host });
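The binding rewrite performed by migrateAdminRole can be checked in isolation. The snippet below is a sketch that reimplements only the transformation step on a plain roleBindings array, separate from the repository-backed function above; `rewriteAdminBindings` is a hypothetical helper name introduced for illustration.

```typescript
// Reimplementation of the admin → granular rewrite, operating on a plain array.
type Binding = Record<string, string>;

function rewriteAdminBindings(bindings: Binding[]): Binding[] {
  const out: Binding[] = [];
  for (const b of bindings) {
    if (b['role'] === 'admin') {
      // One legacy admin binding becomes an edit binding plus a run binding
      out.push({ role: 'edit', resource: b['resource']! });
      out.push({ role: 'run', resource: b['resource']! });
    } else {
      out.push(b);
    }
  }
  // Wildcard admins additionally receive the five operation bindings
  if (bindings.some((b) => b['role'] === 'admin' && b['resource'] === '*')) {
    for (const op of ['impersonate', 'logs', 'backup', 'restore', 'audit-purge']) {
      if (!out.some((b) => b['action'] === op)) out.push({ role: 'run', action: op });
    }
  }
  return out;
}

const migrated = rewriteAdminBindings([{ role: 'admin', resource: '*' }]);
console.log(migrated.length); // 7: edit/*, run/*, plus five operation bindings
```

A single wildcard admin binding thus fans out into seven granular bindings, which is why the migration logs each rewritten definition.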
@@ -7,6 +7,7 @@ export interface AuthDeps {
declare module 'fastify' {
  interface FastifyRequest {
    userId?: string;
    rbacScope?: { wildcard: boolean; names: Set<string> };
  }
}
36
src/mcpd/src/middleware/rbac.ts
Normal file
@@ -0,0 +1,36 @@
import type { FastifyRequest, FastifyReply } from 'fastify';
import type { RbacService, RbacAction } from '../services/rbac.service.js';

export function createRbacMiddleware(rbacService: RbacService) {
  function requirePermission(resource: string, action: RbacAction, resourceName?: string) {
    return async (request: FastifyRequest, reply: FastifyReply): Promise<void> => {
      if (request.userId === undefined) {
        reply.code(401).send({ error: 'Authentication required' });
        return;
      }

      const allowed = await rbacService.canAccess(request.userId, action, resource, resourceName);
      if (!allowed) {
        reply.code(403).send({ error: 'Forbidden' });
        return;
      }
    };
  }

  function requireOperation(operation: string) {
    return async (request: FastifyRequest, reply: FastifyReply): Promise<void> => {
      if (request.userId === undefined) {
        reply.code(401).send({ error: 'Authentication required' });
        return;
      }

      const allowed = await rbacService.canRunOperation(request.userId, operation);
      if (!allowed) {
        reply.code(403).send({ error: 'Forbidden' });
        return;
      }
    };
  }

  return { requirePermission, requireOperation };
}
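The middleware above only touches `userId` on the request and `code(...).send(...)` on the reply, so its control flow can be sketched with plain stub objects. The stub RbacService and the synchronous, trimmed-down copy of requirePermission below are hypothetical simplifications for illustration, not the Fastify-typed implementation.

```typescript
// Hypothetical stub RbacService: only 'view' on 'servers' is allowed.
const rbacService = {
  canAccess(_userId: string, action: string, resource: string): boolean {
    return action === 'view' && resource === 'servers';
  },
};

// Synchronous sketch of requirePermission's control flow.
function requirePermission(resource: string, action: string) {
  return (request: { userId?: string }, reply: { code(n: number): { send(b: unknown): void } }): void => {
    if (request.userId === undefined) {
      reply.code(401).send({ error: 'Authentication required' });
      return;
    }
    if (!rbacService.canAccess(request.userId, action, resource)) {
      reply.code(403).send({ error: 'Forbidden' });
    }
  };
}

const statuses: number[] = [];
const reply = {
  code(n: number) {
    statuses.push(n); // record every rejection status
    return { send(_b: unknown): void {} };
  },
};

requirePermission('servers', 'view')({ userId: 'u1' }, reply);   // allowed → no reply sent
requirePermission('servers', 'delete')({ userId: 'u1' }, reply); // 403
requirePermission('servers', 'view')({}, reply);                 // 401

console.log(statuses); // [ 403, 401 ]
```

Note the ordering: the missing-user check yields 401 before any permission lookup, matching the middleware above.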
93
src/mcpd/src/repositories/group.repository.ts
Normal file
@@ -0,0 +1,93 @@
import type { PrismaClient, Group } from '@prisma/client';

export interface GroupWithMembers extends Group {
  members: Array<{ id: string; user: { id: string; email: string; name: string | null } }>;
}

export interface IGroupRepository {
  findAll(): Promise<GroupWithMembers[]>;
  findById(id: string): Promise<GroupWithMembers | null>;
  findByName(name: string): Promise<GroupWithMembers | null>;
  create(data: { name: string; description?: string }): Promise<Group>;
  update(id: string, data: { description?: string }): Promise<Group>;
  delete(id: string): Promise<void>;
  setMembers(groupId: string, userIds: string[]): Promise<void>;
  findGroupsForUser(userId: string): Promise<Array<{ id: string; name: string }>>;
}

const MEMBERS_INCLUDE = {
  members: {
    select: {
      id: true,
      user: {
        select: { id: true, email: true, name: true },
      },
    },
  },
} as const;

export class GroupRepository implements IGroupRepository {
  constructor(private readonly prisma: PrismaClient) {}

  async findAll(): Promise<GroupWithMembers[]> {
    return this.prisma.group.findMany({
      orderBy: { name: 'asc' },
      include: MEMBERS_INCLUDE,
    });
  }

  async findById(id: string): Promise<GroupWithMembers | null> {
    return this.prisma.group.findUnique({
      where: { id },
      include: MEMBERS_INCLUDE,
    });
  }

  async findByName(name: string): Promise<GroupWithMembers | null> {
    return this.prisma.group.findUnique({
      where: { name },
      include: MEMBERS_INCLUDE,
    });
  }

  async create(data: { name: string; description?: string }): Promise<Group> {
    const createData: Record<string, unknown> = { name: data.name };
    if (data.description !== undefined) createData['description'] = data.description;
    return this.prisma.group.create({
      data: createData as Parameters<PrismaClient['group']['create']>[0]['data'],
    });
  }

  async update(id: string, data: { description?: string }): Promise<Group> {
    const updateData: Record<string, unknown> = {};
    if (data.description !== undefined) updateData['description'] = data.description;
    return this.prisma.group.update({ where: { id }, data: updateData });
  }

  async delete(id: string): Promise<void> {
    await this.prisma.group.delete({ where: { id } });
  }

  async setMembers(groupId: string, userIds: string[]): Promise<void> {
    await this.prisma.$transaction(async (tx) => {
      await tx.groupMember.deleteMany({ where: { groupId } });
      if (userIds.length > 0) {
        await tx.groupMember.createMany({
          data: userIds.map((userId) => ({ groupId, userId })),
        });
      }
    });
  }

  async findGroupsForUser(userId: string): Promise<Array<{ id: string; name: string }>> {
    const memberships = await this.prisma.groupMember.findMany({
      where: { userId },
      select: {
        group: {
          select: { id: true, name: true },
        },
      },
    });
    return memberships.map((m) => m.group);
  }
}
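setMembers above replaces the full membership list inside one transaction (deleteMany, then createMany). Its replace-all semantics can be illustrated with a plain in-memory table; the sketch below is a stand-in for the Prisma-backed method, not the real implementation.

```typescript
// In-memory stand-in for the groupMember table.
interface Member { groupId: string; userId: string }
let members: Member[] = [{ groupId: 'g1', userId: 'u1' }, { groupId: 'g1', userId: 'u2' }];

// Same two steps as the transaction above: delete all rows for the group, then bulk-insert.
function setMembers(groupId: string, userIds: string[]): void {
  members = members.filter((m) => m.groupId !== groupId);          // deleteMany
  members.push(...userIds.map((userId) => ({ groupId, userId }))); // createMany
}

setMembers('g1', ['u2', 'u3']);
console.log(members.map((m) => m.userId)); // [ 'u2', 'u3' ]

setMembers('g1', []); // an empty list clears the membership entirely
console.log(members.length); // 0
```

Because membership is replaced wholesale rather than patched, callers never need to diff the existing member list.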
@@ -1,9 +1,15 @@
export type { IMcpServerRepository, IMcpInstanceRepository, ISecretRepository, IAuditLogRepository, AuditLogFilter } from './interfaces.js';
export { McpServerRepository } from './mcp-server.repository.js';
export { SecretRepository } from './secret.repository.js';
export type { IProjectRepository } from './project.repository.js';
export type { IProjectRepository, ProjectWithRelations } from './project.repository.js';
export { ProjectRepository } from './project.repository.js';
export { McpInstanceRepository } from './mcp-instance.repository.js';
export { AuditLogRepository } from './audit-log.repository.js';
export type { ITemplateRepository } from './template.repository.js';
export { TemplateRepository } from './template.repository.js';
export type { IRbacDefinitionRepository } from './rbac-definition.repository.js';
export { RbacDefinitionRepository } from './rbac-definition.repository.js';
export type { IUserRepository, SafeUser } from './user.repository.js';
export { UserRepository } from './user.repository.js';
export type { IGroupRepository, GroupWithMembers } from './group.repository.js';
export { GroupRepository } from './group.repository.js';
@@ -1,49 +1,92 @@
import type { PrismaClient, Project } from '@prisma/client';
import type { CreateProjectInput, UpdateProjectInput } from '../validation/project.schema.js';

export interface ProjectWithRelations extends Project {
  servers: Array<{ id: string; projectId: string; serverId: string; server: Record<string, unknown> & { id: string; name: string } }>;
}

const PROJECT_INCLUDE = {
  servers: { include: { server: true } },
} as const;

export interface IProjectRepository {
  findAll(ownerId?: string): Promise<Project[]>;
  findById(id: string): Promise<Project | null>;
  findByName(name: string): Promise<Project | null>;
  create(data: CreateProjectInput & { ownerId: string }): Promise<Project>;
  update(id: string, data: UpdateProjectInput): Promise<Project>;
  findAll(ownerId?: string): Promise<ProjectWithRelations[]>;
  findById(id: string): Promise<ProjectWithRelations | null>;
  findByName(name: string): Promise<ProjectWithRelations | null>;
  create(data: { name: string; description: string; prompt?: string; ownerId: string; proxyMode: string; llmProvider?: string; llmModel?: string }): Promise<ProjectWithRelations>;
  update(id: string, data: Record<string, unknown>): Promise<ProjectWithRelations>;
  delete(id: string): Promise<void>;
  setServers(projectId: string, serverIds: string[]): Promise<void>;
  addServer(projectId: string, serverId: string): Promise<void>;
  removeServer(projectId: string, serverId: string): Promise<void>;
}

export class ProjectRepository implements IProjectRepository {
  constructor(private readonly prisma: PrismaClient) {}

  async findAll(ownerId?: string): Promise<Project[]> {
  async findAll(ownerId?: string): Promise<ProjectWithRelations[]> {
    const where = ownerId !== undefined ? { ownerId } : {};
    return this.prisma.project.findMany({ where, orderBy: { name: 'asc' } });
    return this.prisma.project.findMany({ where, orderBy: { name: 'asc' }, include: PROJECT_INCLUDE }) as unknown as Promise<ProjectWithRelations[]>;
  }

  async findById(id: string): Promise<Project | null> {
    return this.prisma.project.findUnique({ where: { id } });
  async findById(id: string): Promise<ProjectWithRelations | null> {
    return this.prisma.project.findUnique({ where: { id }, include: PROJECT_INCLUDE }) as unknown as Promise<ProjectWithRelations | null>;
  }

  async findByName(name: string): Promise<Project | null> {
    return this.prisma.project.findUnique({ where: { name } });
  async findByName(name: string): Promise<ProjectWithRelations | null> {
    return this.prisma.project.findUnique({ where: { name }, include: PROJECT_INCLUDE }) as unknown as Promise<ProjectWithRelations | null>;
  }

  async create(data: CreateProjectInput & { ownerId: string }): Promise<Project> {
  async create(data: { name: string; description: string; prompt?: string; ownerId: string; proxyMode: string; llmProvider?: string; llmModel?: string }): Promise<ProjectWithRelations> {
    const createData: Record<string, unknown> = {
      name: data.name,
      description: data.description,
      ownerId: data.ownerId,
      proxyMode: data.proxyMode,
    };
    if (data.prompt !== undefined) createData['prompt'] = data.prompt;
    if (data.llmProvider !== undefined) createData['llmProvider'] = data.llmProvider;
    if (data.llmModel !== undefined) createData['llmModel'] = data.llmModel;

    return this.prisma.project.create({
      data: {
        name: data.name,
        description: data.description,
        ownerId: data.ownerId,
      },
    });
      data: createData as Parameters<PrismaClient['project']['create']>[0]['data'],
      include: PROJECT_INCLUDE,
    }) as unknown as Promise<ProjectWithRelations>;
  }

  async update(id: string, data: UpdateProjectInput): Promise<Project> {
    const updateData: Record<string, unknown> = {};
    if (data.description !== undefined) updateData['description'] = data.description;
    return this.prisma.project.update({ where: { id }, data: updateData });
  async update(id: string, data: Record<string, unknown>): Promise<ProjectWithRelations> {
    return this.prisma.project.update({
      where: { id },
      data,
      include: PROJECT_INCLUDE,
    }) as unknown as Promise<ProjectWithRelations>;
  }

  async delete(id: string): Promise<void> {
    await this.prisma.project.delete({ where: { id } });
  }

  async setServers(projectId: string, serverIds: string[]): Promise<void> {
    await this.prisma.$transaction(async (tx) => {
      await tx.projectServer.deleteMany({ where: { projectId } });
      if (serverIds.length > 0) {
        await tx.projectServer.createMany({
          data: serverIds.map((serverId) => ({ projectId, serverId })),
        });
      }
    });
  }

  async addServer(projectId: string, serverId: string): Promise<void> {
    await this.prisma.projectServer.upsert({
      where: { projectId_serverId: { projectId, serverId } },
      create: { projectId, serverId },
      update: {},
    });
  }

  async removeServer(projectId: string, serverId: string): Promise<void> {
    await this.prisma.projectServer.deleteMany({
      where: { projectId, serverId },
    });
  }
}
src/mcpd/src/repositories/prompt-request.repository.ts (new file, 69 lines)
@@ -0,0 +1,69 @@
import type { PrismaClient, PromptRequest } from '@prisma/client';

export interface IPromptRequestRepository {
  findAll(projectId?: string): Promise<PromptRequest[]>;
  findGlobal(): Promise<PromptRequest[]>;
  findById(id: string): Promise<PromptRequest | null>;
  findByNameAndProject(name: string, projectId: string | null): Promise<PromptRequest | null>;
  findBySession(sessionId: string, projectId?: string): Promise<PromptRequest[]>;
  create(data: { name: string; content: string; projectId?: string; priority?: number; createdBySession?: string; createdByUserId?: string }): Promise<PromptRequest>;
  update(id: string, data: { content?: string; priority?: number }): Promise<PromptRequest>;
  delete(id: string): Promise<void>;
}

export class PromptRequestRepository implements IPromptRequestRepository {
  constructor(private readonly prisma: PrismaClient) {}

  async findAll(projectId?: string): Promise<PromptRequest[]> {
    const include = { project: { select: { name: true } } };
    if (projectId !== undefined) {
      return this.prisma.promptRequest.findMany({
        where: { OR: [{ projectId }, { projectId: null }] },
        include,
        orderBy: { createdAt: 'desc' },
      });
    }
    return this.prisma.promptRequest.findMany({ include, orderBy: { createdAt: 'desc' } });
  }

  async findGlobal(): Promise<PromptRequest[]> {
    return this.prisma.promptRequest.findMany({
      where: { projectId: null },
      include: { project: { select: { name: true } } },
      orderBy: { createdAt: 'desc' },
    });
  }

  async findById(id: string): Promise<PromptRequest | null> {
    return this.prisma.promptRequest.findUnique({ where: { id } });
  }

  async findByNameAndProject(name: string, projectId: string | null): Promise<PromptRequest | null> {
    return this.prisma.promptRequest.findUnique({
      where: { name_projectId: { name, projectId: projectId ?? '' } },
    });
  }

  async findBySession(sessionId: string, projectId?: string): Promise<PromptRequest[]> {
    const where: Record<string, unknown> = { createdBySession: sessionId };
    if (projectId !== undefined) {
      where['OR'] = [{ projectId }, { projectId: null }];
    }
    return this.prisma.promptRequest.findMany({
      where,
      orderBy: { createdAt: 'desc' },
    });
  }

  async create(data: { name: string; content: string; projectId?: string; priority?: number; createdBySession?: string; createdByUserId?: string }): Promise<PromptRequest> {
    return this.prisma.promptRequest.create({ data });
  }

  async update(id: string, data: { content?: string; priority?: number }): Promise<PromptRequest> {
    return this.prisma.promptRequest.update({ where: { id }, data });
  }

  async delete(id: string): Promise<void> {
    await this.prisma.promptRequest.delete({ where: { id } });
  }
}
src/mcpd/src/repositories/prompt.repository.ts (new file, 58 lines)
@@ -0,0 +1,58 @@
import type { PrismaClient, Prompt } from '@prisma/client';

export interface IPromptRepository {
  findAll(projectId?: string): Promise<Prompt[]>;
  findGlobal(): Promise<Prompt[]>;
  findById(id: string): Promise<Prompt | null>;
  findByNameAndProject(name: string, projectId: string | null): Promise<Prompt | null>;
  create(data: { name: string; content: string; projectId?: string; priority?: number; linkTarget?: string }): Promise<Prompt>;
  update(id: string, data: { content?: string; priority?: number; summary?: string; chapters?: string[] }): Promise<Prompt>;
  delete(id: string): Promise<void>;
}

export class PromptRepository implements IPromptRepository {
  constructor(private readonly prisma: PrismaClient) {}

  async findAll(projectId?: string): Promise<Prompt[]> {
    const include = { project: { select: { name: true } } };
    if (projectId !== undefined) {
      // Project-scoped + global prompts
      return this.prisma.prompt.findMany({
        where: { OR: [{ projectId }, { projectId: null }] },
        include,
        orderBy: { name: 'asc' },
      });
    }
    return this.prisma.prompt.findMany({ include, orderBy: { name: 'asc' } });
  }

  async findGlobal(): Promise<Prompt[]> {
    return this.prisma.prompt.findMany({
      where: { projectId: null },
      include: { project: { select: { name: true } } },
      orderBy: { name: 'asc' },
    });
  }

  async findById(id: string): Promise<Prompt | null> {
    return this.prisma.prompt.findUnique({ where: { id } });
  }

  async findByNameAndProject(name: string, projectId: string | null): Promise<Prompt | null> {
    return this.prisma.prompt.findUnique({
      where: { name_projectId: { name, projectId: projectId ?? '' } },
    });
  }

  async create(data: { name: string; content: string; projectId?: string; priority?: number; linkTarget?: string }): Promise<Prompt> {
    return this.prisma.prompt.create({ data });
  }

  async update(id: string, data: { content?: string; priority?: number; summary?: string; chapters?: string[] }): Promise<Prompt> {
    return this.prisma.prompt.update({ where: { id }, data });
  }

  async delete(id: string): Promise<void> {
    await this.prisma.prompt.delete({ where: { id } });
  }
}
src/mcpd/src/repositories/rbac-definition.repository.ts (new file, 48 lines)
@@ -0,0 +1,48 @@
import type { PrismaClient, RbacDefinition } from '@prisma/client';
import type { CreateRbacDefinitionInput, UpdateRbacDefinitionInput } from '../validation/rbac-definition.schema.js';

export interface IRbacDefinitionRepository {
  findAll(): Promise<RbacDefinition[]>;
  findById(id: string): Promise<RbacDefinition | null>;
  findByName(name: string): Promise<RbacDefinition | null>;
  create(data: CreateRbacDefinitionInput): Promise<RbacDefinition>;
  update(id: string, data: UpdateRbacDefinitionInput): Promise<RbacDefinition>;
  delete(id: string): Promise<void>;
}

export class RbacDefinitionRepository implements IRbacDefinitionRepository {
  constructor(private readonly prisma: PrismaClient) {}

  async findAll(): Promise<RbacDefinition[]> {
    return this.prisma.rbacDefinition.findMany({ orderBy: { name: 'asc' } });
  }

  async findById(id: string): Promise<RbacDefinition | null> {
    return this.prisma.rbacDefinition.findUnique({ where: { id } });
  }

  async findByName(name: string): Promise<RbacDefinition | null> {
    return this.prisma.rbacDefinition.findUnique({ where: { name } });
  }

  async create(data: CreateRbacDefinitionInput): Promise<RbacDefinition> {
    return this.prisma.rbacDefinition.create({
      data: {
        name: data.name,
        subjects: data.subjects,
        roleBindings: data.roleBindings,
      },
    });
  }

  async update(id: string, data: UpdateRbacDefinitionInput): Promise<RbacDefinition> {
    const updateData: Record<string, unknown> = {};
    if (data.subjects !== undefined) updateData['subjects'] = data.subjects;
    if (data.roleBindings !== undefined) updateData['roleBindings'] = data.roleBindings;
    return this.prisma.rbacDefinition.update({ where: { id }, data: updateData });
  }

  async delete(id: string): Promise<void> {
    await this.prisma.rbacDefinition.delete({ where: { id } });
  }
}
src/mcpd/src/repositories/user.repository.ts (new file, 76 lines)
@@ -0,0 +1,76 @@
import type { PrismaClient, User } from '@prisma/client';

/** User without the passwordHash field — safe for API responses. */
export type SafeUser = Omit<User, 'passwordHash'>;

export interface IUserRepository {
  findAll(): Promise<SafeUser[]>;
  findById(id: string): Promise<SafeUser | null>;
  findByEmail(email: string, includeHash?: boolean): Promise<SafeUser | null> | Promise<User | null>;
  create(data: { email: string; passwordHash: string; name?: string; role?: string }): Promise<SafeUser>;
  delete(id: string): Promise<void>;
  count(): Promise<number>;
}

/** Fields to select when passwordHash must be excluded. */
const safeSelect = {
  id: true,
  email: true,
  name: true,
  role: true,
  provider: true,
  externalId: true,
  version: true,
  createdAt: true,
  updatedAt: true,
} as const;

export class UserRepository implements IUserRepository {
  constructor(private readonly prisma: PrismaClient) {}

  async findAll(): Promise<SafeUser[]> {
    return this.prisma.user.findMany({
      select: safeSelect,
      orderBy: { email: 'asc' },
    });
  }

  async findById(id: string): Promise<SafeUser | null> {
    return this.prisma.user.findUnique({
      where: { id },
      select: safeSelect,
    });
  }

  async findByEmail(email: string, includeHash?: boolean): Promise<User | SafeUser | null> {
    if (includeHash === true) {
      return this.prisma.user.findUnique({ where: { email } });
    }
    return this.prisma.user.findUnique({
      where: { email },
      select: safeSelect,
    });
  }

  async create(data: { email: string; passwordHash: string; name?: string; role?: string }): Promise<SafeUser> {
    const createData: Record<string, unknown> = {
      email: data.email,
      passwordHash: data.passwordHash,
    };
    if (data.name !== undefined) createData['name'] = data.name;
    if (data.role !== undefined) createData['role'] = data.role;

    return this.prisma.user.create({
      data: createData as Parameters<PrismaClient['user']['create']>[0]['data'],
      select: safeSelect,
    });
  }

  async delete(id: string): Promise<void> {
    await this.prisma.user.delete({ where: { id } });
  }

  async count(): Promise<number> {
    return this.prisma.user.count();
  }
}
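The safeSelect pattern above keeps passwordHash out of query results at the data layer. The same invariant can be sketched in isolation with a plain object (names and types here are illustrative, not part of the diff):

```typescript
// Illustrative sketch of the "safe user" shape used above: strip the
// sensitive passwordHash field before a record leaves the data layer.
interface User {
  id: string;
  email: string;
  passwordHash: string;
}

type SafeUser = Omit<User, 'passwordHash'>;

function toSafeUser(user: User): SafeUser {
  // Destructure the sensitive field away; `rest` keeps everything else.
  const { passwordHash: _ignored, ...rest } = user;
  return rest;
}

const safe = toSafeUser({ id: 'u1', email: 'a@example.com', passwordHash: 'hash' });
console.log('passwordHash' in safe); // false
```

The repository variant does the same thing one layer earlier, via Prisma's `select`, so the hash never leaves the database query at all.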
@@ -1,15 +1,76 @@
import type { FastifyInstance } from 'fastify';
import type { AuthService } from '../services/auth.service.js';
import type { UserService } from '../services/user.service.js';
import type { GroupService } from '../services/group.service.js';
import type { RbacDefinitionService } from '../services/rbac-definition.service.js';
import type { RbacService } from '../services/rbac.service.js';
import { createAuthMiddleware } from '../middleware/auth.js';
import { createRbacMiddleware } from '../middleware/rbac.js';

export interface AuthRouteDeps {
  authService: AuthService;
  userService: UserService;
  groupService: GroupService;
  rbacDefinitionService: RbacDefinitionService;
  rbacService: RbacService;
}

export function registerAuthRoutes(app: FastifyInstance, deps: AuthRouteDeps): void {
  const authMiddleware = createAuthMiddleware({
    findSession: (token) => deps.authService.findSession(token),
  });
  const { requireOperation } = createRbacMiddleware(deps.rbacService);

  // GET /api/v1/auth/status — unauthenticated, returns whether any users exist
  app.get('/api/v1/auth/status', async () => {
    const count = await deps.userService.count();
    return { hasUsers: count > 0 };
  });

  // POST /api/v1/auth/bootstrap — only works when no users exist (first-run setup)
  app.post('/api/v1/auth/bootstrap', async (request, reply) => {
    const count = await deps.userService.count();
    if (count > 0) {
      reply.code(409).send({ error: 'Users already exist. Use login instead.' });
      return;
    }

    const { email, password, name } = request.body as { email: string; password: string; name?: string };

    // Create the first admin user
    await deps.userService.create({
      email,
      password,
      ...(name !== undefined ? { name } : {}),
    });

    // Create "admin" group and add the first user to it
    await deps.groupService.create({
      name: 'admin',
      description: 'Bootstrap admin group',
      members: [email],
    });

    // Create bootstrap RBAC: full resource access + all operations
    await deps.rbacDefinitionService.create({
      name: 'bootstrap-admin',
      subjects: [{ kind: 'Group', name: 'admin' }],
      roleBindings: [
        { role: 'edit', resource: '*' },
        { role: 'run', resource: '*' },
        { role: 'run', action: 'impersonate' },
        { role: 'run', action: 'logs' },
        { role: 'run', action: 'backup' },
        { role: 'run', action: 'restore' },
        { role: 'run', action: 'audit-purge' },
      ],
    });

    // Auto-login so the caller gets a token immediately
    const session = await deps.authService.login(email, password);
    reply.code(201);
    return session;
  });

  // POST /api/v1/auth/login — no auth required
  app.post<{
@@ -28,4 +89,15 @@ export function registerAuthRoutes(app: FastifyInstance, deps: AuthRouteDeps): v
    await deps.authService.logout(token);
    return { success: true };
  });

  // POST /api/v1/auth/impersonate — requires auth + run:impersonate operation
  app.post(
    '/api/v1/auth/impersonate',
    { preHandler: [authMiddleware, requireOperation('impersonate')] },
    async (request) => {
      const { email } = request.body as { email: string };
      const result = await deps.authService.impersonate(email);
      return result;
    },
  );
}
@@ -13,7 +13,7 @@ export function registerBackupRoutes(app: FastifyInstance, deps: BackupDeps): vo
  app.post<{
    Body: {
      password?: string;
      resources?: Array<'servers' | 'secrets' | 'projects'>;
      resources?: Array<'servers' | 'secrets' | 'projects' | 'users' | 'groups' | 'rbac'>;
    };
  }>('/api/v1/backup', async (request) => {
    const opts: BackupOptions = {};
@@ -51,7 +51,7 @@ export function registerBackupRoutes(app: FastifyInstance, deps: BackupDeps): vo

    const result = await deps.restoreService.restore(bundle, restoreOpts);

    if (result.errors.length > 0 && result.serversCreated === 0 && result.secretsCreated === 0 && result.projectsCreated === 0) {
    if (result.errors.length > 0 && result.serversCreated === 0 && result.secretsCreated === 0 && result.projectsCreated === 0 && result.usersCreated === 0 && result.groupsCreated === 0 && result.rbacCreated === 0) {
      reply.code(422);
    }
src/mcpd/src/routes/groups.ts (new file, 35 lines)
@@ -0,0 +1,35 @@
import type { FastifyInstance } from 'fastify';
import type { GroupService } from '../services/group.service.js';

export function registerGroupRoutes(
  app: FastifyInstance,
  service: GroupService,
): void {
  app.get('/api/v1/groups', async () => {
    return service.list();
  });

  app.get<{ Params: { id: string } }>('/api/v1/groups/:id', async (request) => {
    // Try by ID first, fall back to name lookup
    try {
      return await service.getById(request.params.id);
    } catch {
      return service.getByName(request.params.id);
    }
  });

  app.post('/api/v1/groups', async (request, reply) => {
    const group = await service.create(request.body);
    reply.code(201);
    return group;
  });

  app.put<{ Params: { id: string } }>('/api/v1/groups/:id', async (request) => {
    return service.update(request.params.id, request.body);
  });

  app.delete<{ Params: { id: string } }>('/api/v1/groups/:id', async (request, reply) => {
    await service.delete(request.params.id);
    reply.code(204);
  });
}
@@ -14,3 +14,6 @@ export type { AuthRouteDeps } from './auth.js';
export { registerMcpProxyRoutes } from './mcp-proxy.js';
export type { McpProxyRouteDeps } from './mcp-proxy.js';
export { registerTemplateRoutes } from './templates.js';
export { registerRbacRoutes } from './rbac-definitions.js';
export { registerUserRoutes } from './users.js';
export { registerGroupRoutes } from './groups.js';
@@ -2,13 +2,13 @@ import type { FastifyInstance } from 'fastify';
import type { ProjectService } from '../services/project.service.js';

export function registerProjectRoutes(app: FastifyInstance, service: ProjectService): void {
  app.get('/api/v1/projects', async (request) => {
    // If authenticated, filter by owner; otherwise list all
    return service.list(request.userId);
  app.get('/api/v1/projects', async () => {
    // RBAC preSerialization hook handles access filtering
    return service.list();
  });

  app.get<{ Params: { id: string } }>('/api/v1/projects/:id', async (request) => {
    return service.getById(request.params.id);
    return service.resolveAndGet(request.params.id);
  });

  app.post('/api/v1/projects', async (request, reply) => {
@@ -19,11 +19,51 @@ export function registerProjectRoutes(app: FastifyInstance, service: ProjectServ
  });

  app.put<{ Params: { id: string } }>('/api/v1/projects/:id', async (request) => {
    return service.update(request.params.id, request.body);
    const project = await service.resolveAndGet(request.params.id);
    return service.update(project.id, request.body);
  });

  app.delete<{ Params: { id: string } }>('/api/v1/projects/:id', async (request, reply) => {
    await service.delete(request.params.id);
    const project = await service.resolveAndGet(request.params.id);
    await service.delete(project.id);
    reply.code(204);
  });

  // Generate .mcp.json for a project
  app.get<{ Params: { id: string } }>('/api/v1/projects/:id/mcp-config', async (request) => {
    return service.generateMcpConfig(request.params.id);
  });

  // Attach a server to a project
  app.post<{ Params: { id: string }; Body: { server: string } }>('/api/v1/projects/:id/servers', async (request) => {
    const body = request.body as { server?: string };
    if (!body.server) {
      throw Object.assign(new Error('Missing "server" in request body'), { statusCode: 400 });
    }
    return service.addServer(request.params.id, body.server);
  });

  // Detach a server from a project
  app.delete<{ Params: { id: string; serverName: string } }>('/api/v1/projects/:id/servers/:serverName', async (request, reply) => {
    await service.removeServer(request.params.id, request.params.serverName);
    reply.code(204);
  });

  // List servers in a project (for mcplocal discovery)
  app.get<{ Params: { id: string } }>('/api/v1/projects/:id/servers', async (request) => {
    const project = await service.resolveAndGet(request.params.id);
    return project.servers.map((ps) => ps.server);
  });

  // Get project instructions for LLM (prompt + server list)
  app.get<{ Params: { id: string } }>('/api/v1/projects/:id/instructions', async (request) => {
    const project = await service.resolveAndGet(request.params.id);
    return {
      prompt: project.prompt,
      servers: project.servers.map((ps) => ({
        name: (ps.server as Record<string, unknown>).name as string,
        description: (ps.server as Record<string, unknown>).description as string,
      })),
    };
  });
}
src/mcpd/src/routes/prompts.ts (new file, 207 lines)
@@ -0,0 +1,207 @@
import type { FastifyInstance } from 'fastify';
import type { Prompt } from '@prisma/client';
import type { PromptService } from '../services/prompt.service.js';
import type { IProjectRepository, ProjectWithRelations } from '../repositories/project.repository.js';

type PromptWithLinkStatus = Prompt & { linkStatus: 'alive' | 'dead' | null };

/**
 * Enrich prompts with linkStatus by checking if the target project/server exists.
 * This is a structural check (does the target exist?) — not a runtime probe.
 */
async function enrichWithLinkStatus(
  prompts: Prompt[],
  projectRepo: IProjectRepository,
): Promise<PromptWithLinkStatus[]> {
  // Cache project lookups to avoid repeated DB queries
  const projectCache = new Map<string, ProjectWithRelations | null>();

  const results: PromptWithLinkStatus[] = [];

  for (const p of prompts) {
    if (!p.linkTarget) {
      results.push({ ...p, linkStatus: null } as PromptWithLinkStatus);
      continue;
    }

    try {
      // Parse: project/server:uri
      const slashIdx = p.linkTarget.indexOf('/');
      if (slashIdx < 1) { results.push({ ...p, linkStatus: 'dead' as const }); continue; }
      const projectName = p.linkTarget.slice(0, slashIdx);
      const rest = p.linkTarget.slice(slashIdx + 1);
      const colonIdx = rest.indexOf(':');
      if (colonIdx < 1) { results.push({ ...p, linkStatus: 'dead' as const }); continue; }
      const serverName = rest.slice(0, colonIdx);

      // Check if project exists (cached)
      if (!projectCache.has(projectName)) {
        projectCache.set(projectName, await projectRepo.findByName(projectName));
      }
      const project = projectCache.get(projectName);
      if (!project) { results.push({ ...p, linkStatus: 'dead' as const }); continue; }

      // Check if server is linked to that project
      const hasServer = project.servers.some((s) => s.server.name === serverName);
      results.push({ ...p, linkStatus: hasServer ? 'alive' as const : 'dead' as const });
    } catch {
      results.push({ ...p, linkStatus: 'dead' as const });
    }
  }

  return results;
}

export function registerPromptRoutes(
  app: FastifyInstance,
  service: PromptService,
  projectRepo: IProjectRepository,
): void {
  // ── Prompts (approved) ──

  app.get<{ Querystring: { project?: string; scope?: string; projectId?: string } }>('/api/v1/prompts', async (request) => {
    let prompts: Prompt[];
    const projectName = request.query.project;
    if (projectName) {
      const project = await projectRepo.findByName(projectName);
      if (!project) {
        throw Object.assign(new Error(`Project not found: ${projectName}`), { statusCode: 404 });
      }
      prompts = await service.listPrompts(project.id);
    } else if (request.query.projectId) {
      prompts = await service.listPrompts(request.query.projectId);
    } else if (request.query.scope === 'global') {
      prompts = await service.listGlobalPrompts();
    } else {
      prompts = await service.listPrompts();
    }
    return enrichWithLinkStatus(prompts, projectRepo);
  });

  app.get<{ Params: { id: string } }>('/api/v1/prompts/:id', async (request) => {
    const prompt = await service.getPrompt(request.params.id);
    const [enriched] = await enrichWithLinkStatus([prompt], projectRepo);
    return enriched;
  });

  app.post('/api/v1/prompts', async (request, reply) => {
    const prompt = await service.createPrompt(request.body);
    reply.code(201);
    return prompt;
  });

  app.put<{ Params: { id: string } }>('/api/v1/prompts/:id', async (request) => {
    return service.updatePrompt(request.params.id, request.body);
  });

  app.delete<{ Params: { id: string } }>('/api/v1/prompts/:id', async (request, reply) => {
    await service.deletePrompt(request.params.id);
    reply.code(204);
  });

  // ── Prompt Requests (pending proposals) ──

  app.get<{ Querystring: { project?: string; scope?: string } }>('/api/v1/promptrequests', async (request) => {
    const projectName = request.query.project;
    if (projectName) {
      const project = await projectRepo.findByName(projectName);
      if (!project) {
        throw Object.assign(new Error(`Project not found: ${projectName}`), { statusCode: 404 });
      }
      return service.listPromptRequests(project.id);
    }
    if (request.query.scope === 'global') {
      return service.listGlobalPromptRequests();
    }
    return service.listPromptRequests();
  });

  app.get<{ Params: { id: string } }>('/api/v1/promptrequests/:id', async (request) => {
    return service.getPromptRequest(request.params.id);
  });

  app.put<{ Params: { id: string } }>('/api/v1/promptrequests/:id', async (request) => {
    return service.updatePromptRequest(request.params.id, request.body);
  });

  app.delete<{ Params: { id: string } }>('/api/v1/promptrequests/:id', async (request, reply) => {
    await service.deletePromptRequest(request.params.id);
    reply.code(204);
  });

  app.post('/api/v1/promptrequests', async (request, reply) => {
    const body = request.body as Record<string, unknown>;
    // Resolve project name → ID if provided
    if (body.project && typeof body.project === 'string') {
      const project = await projectRepo.findByName(body.project);
      if (!project) {
        throw Object.assign(new Error(`Project not found: ${body.project}`), { statusCode: 404 });
      }
      const { project: _, ...rest } = body;
      const req = await service.propose({ ...rest, projectId: project.id });
      reply.code(201);
      return req;
    }
    const req = await service.propose(body);
    reply.code(201);
    return req;
  });

  // Approve: atomic delete request → create prompt
  app.post<{ Params: { id: string } }>('/api/v1/promptrequests/:id/approve', async (request) => {
    return service.approve(request.params.id);
  });

  // Regenerate summary/chapters for a prompt
  app.post<{ Params: { id: string } }>('/api/v1/prompts/:id/regenerate-summary', async (request) => {
    return service.regenerateSummary(request.params.id);
  });

  // Compact prompt index for gating LLM (name, priority, summary, chapters)
  app.get<{ Params: { name: string } }>('/api/v1/projects/:name/prompt-index', async (request) => {
    const project = await projectRepo.findByName(request.params.name);
    if (!project) {
      throw Object.assign(new Error(`Project not found: ${request.params.name}`), { statusCode: 404 });
    }
    const prompts = await service.listPrompts(project.id);
    return prompts.map((p) => ({
      name: p.name,
      priority: p.priority,
      summary: p.summary,
      chapters: p.chapters,
      linkTarget: p.linkTarget,
    }));
  });
|
||||
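A gating client consuming the prompt-index endpoint above would match index entries against keywords from the current task before pulling full prompt bodies. The following is a minimal illustrative sketch of such a matcher, not the shipped implementation; `PromptIndexEntry` mirrors the fields the endpoint returns, and the priority-weighted scoring heuristic is an assumption.

```typescript
// Hypothetical sketch of keyword-based gating over the prompt index.
// The scoring rule (keyword hits × priority) is an illustration only.
interface PromptIndexEntry {
  name: string;
  priority: number;
  summary: string;
  chapters: string[];
}

function rankPrompts(index: PromptIndexEntry[], keywords: string[]): string[] {
  const scored = index.map((entry) => {
    const haystack = `${entry.summary} ${entry.chapters.join(' ')}`.toLowerCase();
    // One point per keyword hit, weighted by the prompt's priority.
    const hits = keywords.filter((k) => haystack.includes(k.toLowerCase())).length;
    return { name: entry.name, score: hits * entry.priority };
  });
  return scored
    .filter((s) => s.score > 0)
    .sort((a, b) => b.score - a.score)
    .map((s) => s.name);
}
```

The client would then fetch only the top-ranked prompts, keeping the context window small.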
  // ── Project-scoped endpoints (for mcplocal) ──

  // Visible prompts: approved + session's pending requests
  app.get<{ Params: { name: string }; Querystring: { session?: string } }>(
    '/api/v1/projects/:name/prompts/visible',
    async (request) => {
      const project = await projectRepo.findByName(request.params.name);
      if (!project) {
        throw Object.assign(new Error(`Project not found: ${request.params.name}`), { statusCode: 404 });
      }
      return service.getVisiblePrompts(project.id, request.query.session);
    },
  );

  // LLM propose: create a PromptRequest for a project
  app.post<{ Params: { name: string } }>(
    '/api/v1/projects/:name/promptrequests',
    async (request, reply) => {
      const project = await projectRepo.findByName(request.params.name);
      if (!project) {
        throw Object.assign(new Error(`Project not found: ${request.params.name}`), { statusCode: 404 });
      }
      const body = request.body as Record<string, unknown>;
      const req = await service.propose({
        ...body,
        projectId: project.id,
      });
      reply.code(201);
      return req;
    },
  );
}
src/mcpd/src/routes/rbac-definitions.ts (new file, 30 lines)
@@ -0,0 +1,30 @@
|
||||
import type { FastifyInstance } from 'fastify';
|
||||
import type { RbacDefinitionService } from '../services/rbac-definition.service.js';
|
||||
|
||||
export function registerRbacRoutes(
|
||||
app: FastifyInstance,
|
||||
service: RbacDefinitionService,
|
||||
): void {
|
||||
app.get('/api/v1/rbac', async () => {
|
||||
return service.list();
|
||||
});
|
||||
|
||||
app.get<{ Params: { id: string } }>('/api/v1/rbac/:id', async (request) => {
|
||||
return service.getById(request.params.id);
|
||||
});
|
||||
|
||||
app.post('/api/v1/rbac', async (request, reply) => {
|
||||
const def = await service.create(request.body);
|
||||
reply.code(201);
|
||||
return def;
|
||||
});
|
||||
|
||||
app.put<{ Params: { id: string } }>('/api/v1/rbac/:id', async (request) => {
|
||||
return service.update(request.params.id, request.body);
|
||||
});
|
||||
|
||||
app.delete<{ Params: { id: string } }>('/api/v1/rbac/:id', async (request, reply) => {
|
||||
await service.delete(request.params.id);
|
||||
reply.code(204);
|
||||
});
|
||||
}
|
||||
src/mcpd/src/routes/users.ts (new file, 31 lines)
@@ -0,0 +1,31 @@
|
||||
import type { FastifyInstance } from 'fastify';
|
||||
import type { UserService } from '../services/user.service.js';
|
||||
|
||||
export function registerUserRoutes(
|
||||
app: FastifyInstance,
|
||||
service: UserService,
|
||||
): void {
|
||||
app.get('/api/v1/users', async () => {
|
||||
return service.list();
|
||||
});
|
||||
|
||||
app.get<{ Params: { id: string } }>('/api/v1/users/:id', async (request) => {
|
||||
// Support lookup by email (contains @) or by id
|
||||
const idOrEmail = request.params.id;
|
||||
if (idOrEmail.includes('@')) {
|
||||
return service.getByEmail(idOrEmail);
|
||||
}
|
||||
return service.getById(idOrEmail);
|
||||
});
|
||||
|
||||
app.post('/api/v1/users', async (request, reply) => {
|
||||
const user = await service.create(request.body);
|
||||
reply.code(201);
|
||||
return user;
|
||||
});
|
||||
|
||||
app.delete<{ Params: { id: string } }>('/api/v1/users/:id', async (_request, reply) => {
|
||||
await service.delete(_request.params.id);
|
||||
reply.code(204);
|
||||
});
|
||||
}
|
||||
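The GET route above dispatches on a single convention: a path parameter containing `@` is treated as an email, anything else as an id. That rule can be isolated as a tiny pure helper; the name `classifyUserLookup` is illustrative, not part of the codebase.

```typescript
// Sketch of the id-vs-email dispatch rule used by GET /api/v1/users/:id.
type LookupKind = 'email' | 'id';

function classifyUserLookup(idOrEmail: string): LookupKind {
  // Emails always contain "@"; database ids never do.
  return idOrEmail.includes('@') ? 'email' : 'id';
}
```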
@@ -63,4 +63,32 @@ export class AuthService {
    }
    return { userId: session.userId, expiresAt: session.expiresAt };
  }

  /**
   * Create a session for a user by email without requiring their password.
   * Used for admin impersonation.
   */
  async impersonate(email: string): Promise<LoginResult> {
    const user = await this.prisma.user.findUnique({ where: { email } });
    if (user === null) {
      throw new AuthenticationError('User not found');
    }

    const token = randomUUID();
    const expiresAt = new Date(Date.now() + SESSION_TTL_MS);

    await this.prisma.session.create({
      data: {
        token,
        userId: user.id,
        expiresAt,
      },
    });

    return {
      token,
      expiresAt,
      user: { id: user.id, email: user.email, role: user.role },
    };
  }
}
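The impersonation session above is minted with the same TTL arithmetic as a normal login: valid from `Date.now()` until `Date.now() + SESSION_TTL_MS`. A standalone sketch of that lifetime check follows; the 24-hour TTL value is an assumption for illustration, not the project's actual constant.

```typescript
// Sketch of the session-lifetime arithmetic used above.
// SESSION_TTL_MS here is an assumed 24h value, not the real constant.
const SESSION_TTL_MS = 24 * 60 * 60 * 1000;

function isSessionExpired(expiresAt: Date, now: Date = new Date()): boolean {
  // A session is expired once its expiry instant is not in the future.
  return expiresAt.getTime() <= now.getTime();
}
```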
@@ -1,5 +1,8 @@
import type { IMcpServerRepository, ISecretRepository } from '../../repositories/interfaces.js';
import type { IProjectRepository } from '../../repositories/project.repository.js';
import type { IUserRepository } from '../../repositories/user.repository.js';
import type { IGroupRepository } from '../../repositories/group.repository.js';
import type { IRbacDefinitionRepository } from '../../repositories/rbac-definition.repository.js';
import { encrypt, isSensitiveKey } from './crypto.js';
import type { EncryptedPayload } from './crypto.js';
import { APP_VERSION } from '@mcpctl/shared';

@@ -12,6 +15,9 @@ export interface BackupBundle {
  servers: BackupServer[];
  secrets: BackupSecret[];
  projects: BackupProject[];
  users?: BackupUser[];
  groups?: BackupGroup[];
  rbacBindings?: BackupRbacBinding[];
  encryptedSecrets?: EncryptedPayload;
}

@@ -33,11 +39,34 @@ export interface BackupSecret {
export interface BackupProject {
  name: string;
  description: string;
  proxyMode?: string;
  llmProvider?: string | null;
  llmModel?: string | null;
  serverNames?: string[];
}

export interface BackupUser {
  email: string;
  name: string | null;
  role: string;
  provider: string | null;
}

export interface BackupGroup {
  name: string;
  description: string;
  memberEmails: string[];
}

export interface BackupRbacBinding {
  name: string;
  subjects: unknown;
  roleBindings: unknown;
}

export interface BackupOptions {
  password?: string;
  resources?: Array<'servers' | 'secrets' | 'projects'>;
  resources?: Array<'servers' | 'secrets' | 'projects' | 'users' | 'groups' | 'rbac'>;
}

export class BackupService {
@@ -45,14 +74,20 @@ export class BackupService {
    private serverRepo: IMcpServerRepository,
    private projectRepo: IProjectRepository,
    private secretRepo: ISecretRepository,
    private userRepo?: IUserRepository,
    private groupRepo?: IGroupRepository,
    private rbacRepo?: IRbacDefinitionRepository,
  ) {}

  async createBackup(options?: BackupOptions): Promise<BackupBundle> {
    const resources = options?.resources ?? ['servers', 'secrets', 'projects'];
    const resources = options?.resources ?? ['servers', 'secrets', 'projects', 'users', 'groups', 'rbac'];

    let servers: BackupServer[] = [];
    let secrets: BackupSecret[] = [];
    let projects: BackupProject[] = [];
    let users: BackupUser[] = [];
    let groups: BackupGroup[] = [];
    let rbacBindings: BackupRbacBinding[] = [];

    if (resources.includes('servers')) {
      const allServers = await this.serverRepo.findAll();
@@ -80,6 +115,38 @@ export class BackupService {
      projects = allProjects.map((proj) => ({
        name: proj.name,
        description: proj.description,
        proxyMode: proj.proxyMode,
        llmProvider: proj.llmProvider,
        llmModel: proj.llmModel,
        serverNames: proj.servers.map((ps) => ps.server.name),
      }));
    }

    if (resources.includes('users') && this.userRepo) {
      const allUsers = await this.userRepo.findAll();
      users = allUsers.map((u) => ({
        email: u.email,
        name: u.name,
        role: u.role,
        provider: u.provider,
      }));
    }

    if (resources.includes('groups') && this.groupRepo) {
      const allGroups = await this.groupRepo.findAll();
      groups = allGroups.map((g) => ({
        name: g.name,
        description: g.description,
        memberEmails: g.members.map((m) => m.user.email),
      }));
    }

    if (resources.includes('rbac') && this.rbacRepo) {
      const allRbac = await this.rbacRepo.findAll();
      rbacBindings = allRbac.map((r) => ({
        name: r.name,
        subjects: r.subjects,
        roleBindings: r.roleBindings,
      }));
    }

@@ -91,6 +158,9 @@ export class BackupService {
      servers,
      secrets,
      projects,
      users,
      groups,
      rbacBindings,
    };

    if (options?.password && secrets.length > 0) {
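The resource-selection behaviour in `createBackup` above is simple: when no `resources` list is supplied, every kind is backed up; otherwise only the named kinds are included. A standalone sketch of that default rule, using a local type alias for this illustration:

```typescript
// Sketch of the backup resource-selection default shown above.
type BackupResource = 'servers' | 'secrets' | 'projects' | 'users' | 'groups' | 'rbac';

const ALL_RESOURCES: BackupResource[] = ['servers', 'secrets', 'projects', 'users', 'groups', 'rbac'];

function selectResources(requested?: BackupResource[]): BackupResource[] {
  // undefined means "back up everything"; an explicit list passes through.
  return requested ?? ALL_RESOURCES;
}
```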
@@ -1,5 +1,9 @@
import type { IMcpServerRepository, ISecretRepository } from '../../repositories/interfaces.js';
import type { IProjectRepository } from '../../repositories/project.repository.js';
import type { IUserRepository } from '../../repositories/user.repository.js';
import type { IGroupRepository } from '../../repositories/group.repository.js';
import type { IRbacDefinitionRepository } from '../../repositories/rbac-definition.repository.js';
import type { RbacRoleBinding } from '../../validation/rbac-definition.schema.js';
import { decrypt } from './crypto.js';
import type { BackupBundle } from './backup-service.js';

@@ -17,6 +21,12 @@ export interface RestoreResult {
  secretsSkipped: number;
  projectsCreated: number;
  projectsSkipped: number;
  usersCreated: number;
  usersSkipped: number;
  groupsCreated: number;
  groupsSkipped: number;
  rbacCreated: number;
  rbacSkipped: number;
  errors: string[];
}

@@ -25,6 +35,9 @@ export class RestoreService {
    private serverRepo: IMcpServerRepository,
    private projectRepo: IProjectRepository,
    private secretRepo: ISecretRepository,
    private userRepo?: IUserRepository,
    private groupRepo?: IGroupRepository,
    private rbacRepo?: IRbacDefinitionRepository,
  ) {}

  validateBundle(bundle: unknown): bundle is BackupBundle {
@@ -36,6 +49,7 @@ export class RestoreService {
      Array.isArray(b['secrets']) &&
      Array.isArray(b['projects'])
    );
    // users, groups, rbacBindings are optional for backwards compatibility
  }

  async restore(bundle: BackupBundle, options?: RestoreOptions): Promise<RestoreResult> {
@@ -47,6 +61,12 @@ export class RestoreService {
      secretsSkipped: 0,
      projectsCreated: 0,
      projectsSkipped: 0,
      usersCreated: 0,
      usersSkipped: 0,
      groupsCreated: 0,
      groupsSkipped: 0,
      rbacCreated: 0,
      rbacSkipped: 0,
      errors: [],
    };
@@ -78,6 +98,37 @@ export class RestoreService {
      }
    }

    // Restore order: secrets → servers → users → groups → projects → rbacBindings

    // Restore secrets
    for (const secret of bundle.secrets) {
      try {
        const existing = await this.secretRepo.findByName(secret.name);
        if (existing) {
          if (strategy === 'fail') {
            result.errors.push(`Secret "${secret.name}" already exists`);
            return result;
          }
          if (strategy === 'skip') {
            result.secretsSkipped++;
            continue;
          }
          // overwrite
          await this.secretRepo.update(existing.id, { data: secret.data });
          result.secretsCreated++;
          continue;
        }

        await this.secretRepo.create({
          name: secret.name,
          data: secret.data,
        });
        result.secretsCreated++;
      } catch (err) {
        result.errors.push(`Failed to restore secret "${secret.name}": ${err instanceof Error ? err.message : String(err)}`);
      }
    }

    // Restore servers
    for (const server of bundle.servers) {
      try {
@@ -121,36 +172,75 @@ export class RestoreService {
      }
    }

    // (removed here: the old secrets-restore loop, moved above before servers)
    // Restore secrets
    for (const secret of bundle.secrets) {
      try {
        const existing = await this.secretRepo.findByName(secret.name);
        if (existing) {
          if (strategy === 'fail') {
            result.errors.push(`Secret "${secret.name}" already exists`);
            return result;
          }
          if (strategy === 'skip') {
            result.secretsSkipped++;
          // overwrite
          await this.secretRepo.update(existing.id, { data: secret.data });
          result.secretsCreated++;
          continue;
        }

        await this.secretRepo.create({
          name: secret.name,
          data: secret.data,
        });
        result.secretsCreated++;
      } catch (err) {
        result.errors.push(`Failed to restore secret "${secret.name}": ${err instanceof Error ? err.message : String(err)}`);

    // Restore users
    if (bundle.users && this.userRepo) {
      for (const user of bundle.users) {
        try {
          const existing = await this.userRepo.findByEmail(user.email);
          if (existing) {
            if (strategy === 'fail') {
              result.errors.push(`User "${user.email}" already exists`);
              return result;
            }
            result.usersSkipped++;
            continue;
          }
          // Create with placeholder passwordHash — user must reset password
          const createData: { email: string; passwordHash: string; name?: string; role?: string } = {
            email: user.email,
            passwordHash: '__RESTORED_MUST_RESET__',
            role: user.role,
          };
          if (user.name !== null) createData.name = user.name;
          await this.userRepo.create(createData);
          result.usersCreated++;
        } catch (err) {
          result.errors.push(`Failed to restore user "${user.email}": ${err instanceof Error ? err.message : String(err)}`);
        }
      }
    }

    // Restore projects
    // Restore groups
    if (bundle.groups && this.groupRepo && this.userRepo) {
      for (const group of bundle.groups) {
        try {
          const existing = await this.groupRepo.findByName(group.name);
          if (existing) {
            if (strategy === 'fail') {
              result.errors.push(`Group "${group.name}" already exists`);
              return result;
            }
            if (strategy === 'skip') {
              result.groupsSkipped++;
              continue;
            }
            // overwrite: update description and re-set members
            await this.groupRepo.update(existing.id, { description: group.description });
            if (group.memberEmails.length > 0) {
              const memberIds = await this.resolveUserEmails(group.memberEmails);
              await this.groupRepo.setMembers(existing.id, memberIds);
            }
            result.groupsCreated++;
            continue;
          }

          const created = await this.groupRepo.create({
            name: group.name,
            description: group.description,
          });
          if (group.memberEmails.length > 0) {
            const memberIds = await this.resolveUserEmails(group.memberEmails);
            await this.groupRepo.setMembers(created.id, memberIds);
          }
          result.groupsCreated++;
        } catch (err) {
          result.errors.push(`Failed to restore group "${group.name}": ${err instanceof Error ? err.message : String(err)}`);
        }
      }
    }

    // Restore projects (enriched)
    for (const project of bundle.projects) {
      try {
        const existing = await this.projectRepo.findByName(project.name);
@@ -164,22 +254,100 @@ export class RestoreService {
          continue;
        }
        // overwrite
        await this.projectRepo.update(existing.id, { description: project.description });
        const updateData: Record<string, unknown> = { description: project.description };
        if (project.proxyMode) updateData['proxyMode'] = project.proxyMode;
        if (project.llmProvider !== undefined) updateData['llmProvider'] = project.llmProvider;
        if (project.llmModel !== undefined) updateData['llmModel'] = project.llmModel;
        await this.projectRepo.update(existing.id, updateData);

        // Re-link servers
        if (project.serverNames && project.serverNames.length > 0) {
          const serverIds = await this.resolveServerNames(project.serverNames);
          await this.projectRepo.setServers(existing.id, serverIds);
        }

        result.projectsCreated++;
        continue;
      }

      await this.projectRepo.create({
      const projectCreateData: { name: string; description: string; ownerId: string; proxyMode: string; llmProvider?: string; llmModel?: string } = {
        name: project.name,
        description: project.description,
        ownerId: 'system',
      });
        proxyMode: project.proxyMode ?? 'direct',
      };
      if (project.llmProvider != null) projectCreateData.llmProvider = project.llmProvider;
      if (project.llmModel != null) projectCreateData.llmModel = project.llmModel;
      const created = await this.projectRepo.create(projectCreateData);

      // Link servers
      if (project.serverNames && project.serverNames.length > 0) {
        const serverIds = await this.resolveServerNames(project.serverNames);
        await this.projectRepo.setServers(created.id, serverIds);
      }

      result.projectsCreated++;
    } catch (err) {
      result.errors.push(`Failed to restore project "${project.name}": ${err instanceof Error ? err.message : String(err)}`);
    }
  }

    // Restore RBAC bindings
    if (bundle.rbacBindings && this.rbacRepo) {
      for (const rbac of bundle.rbacBindings) {
        try {
          const existing = await this.rbacRepo.findByName(rbac.name);
          if (existing) {
            if (strategy === 'fail') {
              result.errors.push(`RBAC binding "${rbac.name}" already exists`);
              return result;
            }
            if (strategy === 'skip') {
              result.rbacSkipped++;
              continue;
            }
            // overwrite
            await this.rbacRepo.update(existing.id, {
              subjects: rbac.subjects as Array<{ kind: 'User' | 'Group'; name: string }>,
              roleBindings: rbac.roleBindings as RbacRoleBinding[],
            });
            result.rbacCreated++;
            continue;
          }

          await this.rbacRepo.create({
            name: rbac.name,
            subjects: rbac.subjects as Array<{ kind: 'User' | 'Group'; name: string }>,
            roleBindings: rbac.roleBindings as RbacRoleBinding[],
          });
          result.rbacCreated++;
        } catch (err) {
          result.errors.push(`Failed to restore RBAC binding "${rbac.name}": ${err instanceof Error ? err.message : String(err)}`);
        }
      }
    }

    return result;
  }

  /** Resolve email addresses to user IDs via the user repository. */
  private async resolveUserEmails(emails: string[]): Promise<string[]> {
    const ids: string[] = [];
    for (const email of emails) {
      const user = await this.userRepo!.findByEmail(email);
      if (user) ids.push(user.id);
    }
    return ids;
  }

  /** Resolve server names to server IDs via the server repository. */
  private async resolveServerNames(names: string[]): Promise<string[]> {
    const ids: string[] = [];
    for (const name of names) {
      const server = await this.serverRepo.findByName(name);
      if (server) ids.push(server.id);
    }
    return ids;
  }
}
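The two resolver helpers in the restore service share one pattern: translate human-readable names (emails, server names) to IDs via a repository lookup, silently dropping names that no longer resolve, since restore is best-effort. That pattern can be sketched as a pure function over a lookup map (the map stands in for the async repository, which is an assumption of this sketch):

```typescript
// Pure sketch of the name → ID resolver pattern used by
// resolveUserEmails / resolveServerNames above.
function resolveIds(names: string[], byName: Map<string, string>): string[] {
  const ids: string[] = [];
  for (const name of names) {
    const id = byName.get(name);
    if (id !== undefined) ids.push(id); // unknown names are skipped, not errors
  }
  return ids;
}
```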
@@ -138,6 +138,19 @@ export class DockerContainerManager implements McpOrchestrator {
    if (port !== undefined) {
      result.port = port;
    }

    // Extract container IP from the first attached network that reports one
    const networks = info.NetworkSettings?.Networks;
    if (networks) {
      for (const [, net] of Object.entries(networks)) {
        const netInfo = net as { IPAddress?: string };
        if (netInfo.IPAddress) {
          result.ip = netInfo.IPAddress;
          break;
        }
      }
    }

    return result;
  }
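The IP-extraction loop above walks the `NetworkSettings.Networks` object from `docker inspect` and takes the first non-empty `IPAddress`. Isolated as a helper, the same logic looks like this (the function name is illustrative):

```typescript
// Standalone sketch of the container-IP extraction shown above.
// Input shape mirrors NetworkSettings.Networks from `docker inspect`.
function firstNetworkIp(
  networks: Record<string, { IPAddress?: string }> | undefined,
): string | undefined {
  if (!networks) return undefined;
  for (const net of Object.values(networks)) {
    // Skip networks whose IPAddress is missing or an empty string.
    if (net.IPAddress) return net.IPAddress;
  }
  return undefined;
}
```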
src/mcpd/src/services/group.service.ts (new file, 89 lines)
@@ -0,0 +1,89 @@
|
||||
import type { GroupWithMembers, IGroupRepository } from '../repositories/group.repository.js';
|
||||
import type { IUserRepository } from '../repositories/user.repository.js';
|
||||
import { CreateGroupSchema, UpdateGroupSchema } from '../validation/group.schema.js';
|
||||
import { NotFoundError, ConflictError } from './mcp-server.service.js';
|
||||
|
||||
export class GroupService {
|
||||
constructor(
|
||||
private readonly groupRepo: IGroupRepository,
|
||||
private readonly userRepo: IUserRepository,
|
||||
) {}
|
||||
|
||||
async list(): Promise<GroupWithMembers[]> {
|
||||
return this.groupRepo.findAll();
|
||||
}
|
||||
|
||||
async getById(id: string): Promise<GroupWithMembers> {
|
||||
const group = await this.groupRepo.findById(id);
|
||||
if (group === null) {
|
||||
throw new NotFoundError(`Group not found: ${id}`);
|
||||
}
|
||||
return group;
|
||||
}
|
||||
|
||||
async getByName(name: string): Promise<GroupWithMembers> {
|
||||
const group = await this.groupRepo.findByName(name);
|
||||
if (group === null) {
|
||||
throw new NotFoundError(`Group not found: ${name}`);
|
||||
}
|
||||
return group;
|
||||
}
|
||||
|
||||
async create(input: unknown): Promise<GroupWithMembers> {
|
||||
const data = CreateGroupSchema.parse(input);
|
||||
|
||||
const existing = await this.groupRepo.findByName(data.name);
|
||||
if (existing !== null) {
|
||||
throw new ConflictError(`Group already exists: ${data.name}`);
|
||||
}
|
||||
|
||||
const group = await this.groupRepo.create({
|
||||
name: data.name,
|
||||
description: data.description,
|
||||
});
|
||||
|
||||
if (data.members.length > 0) {
|
||||
const userIds = await this.resolveEmails(data.members);
|
||||
await this.groupRepo.setMembers(group.id, userIds);
|
||||
}
|
||||
|
||||
const result = await this.groupRepo.findById(group.id);
|
||||
// Should always exist since we just created it
|
||||
return result!;
|
||||
}
|
||||
|
||||
async update(id: string, input: unknown): Promise<GroupWithMembers> {
|
||||
const data = UpdateGroupSchema.parse(input);
|
||||
|
||||
// Verify exists
|
||||
await this.getById(id);
|
||||
|
||||
if (data.description !== undefined) {
|
||||
await this.groupRepo.update(id, { description: data.description });
|
||||
}
|
||||
|
||||
if (data.members !== undefined) {
|
||||
const userIds = await this.resolveEmails(data.members);
|
||||
await this.groupRepo.setMembers(id, userIds);
|
||||
}
|
||||
|
||||
return this.getById(id);
|
||||
}
|
||||
|
||||
async delete(id: string): Promise<void> {
|
||||
await this.getById(id);
|
||||
await this.groupRepo.delete(id);
|
||||
}
|
||||
|
||||
private async resolveEmails(emails: string[]): Promise<string[]> {
|
||||
const userIds: string[] = [];
|
||||
for (const email of emails) {
|
||||
const user = await this.userRepo.findByEmail(email);
|
||||
if (user === null) {
|
||||
throw new NotFoundError(`User not found: ${email}`);
|
||||
}
|
||||
userIds.push(user.id);
|
||||
}
|
||||
return userIds;
|
||||
}
|
||||
}
|
||||
@@ -112,7 +112,7 @@ export class HealthProbeRunner {

    try {
      if (server.transport === 'SSE' || server.transport === 'STREAMABLE_HTTP') {
        result = await this.probeHttp(instance, healthCheck, timeoutMs);
        result = await this.probeHttp(instance, server, healthCheck, timeoutMs);
      } else {
        result = await this.probeStdio(instance, server, healthCheck, timeoutMs);
      }
@@ -172,22 +172,48 @@ export class HealthProbeRunner {
  /** Probe an HTTP/SSE MCP server by sending a JSON-RPC tool call. */
  private async probeHttp(
    instance: McpInstance,
    server: McpServer,
    healthCheck: HealthCheckSpec,
    timeoutMs: number,
  ): Promise<ProbeResult> {
    if (!instance.port) {
      return { healthy: false, latencyMs: 0, message: 'No port assigned' };
    if (!instance.containerId) {
      return { healthy: false, latencyMs: 0, message: 'No container ID' };
    }

    const start = Date.now();
    // Get container IP for internal network communication
    // (mcpd and MCP containers share the mcp-servers network)
    const containerInfo = await this.orchestrator.inspectContainer(instance.containerId);
    const containerPort = (server.containerPort as number | null) ?? 3000;

    // For HTTP servers, we need to initialize a session first, then call the tool
    let baseUrl: string;
    if (containerInfo.ip) {
      baseUrl = `http://${containerInfo.ip}:${containerPort}`;
    } else if (instance.port) {
      baseUrl = `http://localhost:${instance.port}`;
    } else {
      return { healthy: false, latencyMs: 0, message: 'No container IP or port' };
    }

    if (server.transport === 'SSE') {
      return this.probeSse(baseUrl, healthCheck, timeoutMs);
    }
    return this.probeStreamableHttp(baseUrl, healthCheck, timeoutMs);
  }

  /**
   * Probe a streamable-http MCP server (POST to root endpoint).
   */
  private async probeStreamableHttp(
    baseUrl: string,
    healthCheck: HealthCheckSpec,
    timeoutMs: number,
  ): Promise<ProbeResult> {
    const start = Date.now();
    const controller = new AbortController();
    const timer = setTimeout(() => controller.abort(), timeoutMs);

    try {
      // Initialize
      const initResp = await fetch(`http://localhost:${instance.port}`, {
      const initResp = await fetch(baseUrl, {
        method: 'POST',
        headers: { 'Content-Type': 'application/json', 'Accept': 'application/json, text/event-stream' },
        body: JSON.stringify({
@@ -205,15 +231,13 @@ export class HealthProbeRunner {
      const headers: Record<string, string> = { 'Content-Type': 'application/json', 'Accept': 'application/json, text/event-stream' };
      if (sessionId) headers['Mcp-Session-Id'] = sessionId;

      // Send initialized notification
      await fetch(`http://localhost:${instance.port}`, {
      await fetch(baseUrl, {
        method: 'POST', headers,
        body: JSON.stringify({ jsonrpc: '2.0', method: 'notifications/initialized' }),
        signal: controller.signal,
      });

      // Call health check tool
      const toolResp = await fetch(`http://localhost:${instance.port}`, {
      const toolResp = await fetch(baseUrl, {
        method: 'POST', headers,
        body: JSON.stringify({
          jsonrpc: '2.0', id: 2, method: 'tools/call',
@@ -229,7 +253,6 @@ export class HealthProbeRunner {
      }

      const body = await toolResp.text();
      // Check for JSON-RPC error in response
      try {
        const parsed = JSON.parse(body.includes('data: ') ? body.split('data: ')[1]!.split('\n')[0]! : body);
        if (parsed.error) {
@@ -245,6 +268,146 @@ export class HealthProbeRunner {
    }
  }

  /**
   * Probe an SSE-transport MCP server.
   * SSE protocol: GET /sse → endpoint event → POST /messages?session_id=...
   */
  private async probeSse(
    baseUrl: string,
    healthCheck: HealthCheckSpec,
    timeoutMs: number,
  ): Promise<ProbeResult> {
    const start = Date.now();
    const controller = new AbortController();
    const timer = setTimeout(() => controller.abort(), timeoutMs);

    try {
      // 1. Connect to SSE endpoint to get the message URL
      const sseResp = await fetch(`${baseUrl}/sse`, {
        method: 'GET',
        headers: { 'Accept': 'text/event-stream' },
        signal: controller.signal,
      });

      if (!sseResp.ok) {
        return { healthy: false, latencyMs: Date.now() - start, message: `SSE connect HTTP ${sseResp.status}` };
      }

      // 2. Read the SSE stream to find the endpoint event
      const reader = sseResp.body?.getReader();
      if (!reader) {
        return { healthy: false, latencyMs: Date.now() - start, message: 'No SSE stream body' };
      }

      const decoder = new TextDecoder();
      let buffer = '';
      let messagesUrl = '';

      // Read until we get the endpoint event
      while (!messagesUrl) {
        const { done, value } = await reader.read();
        if (done) break;
        buffer += decoder.decode(value, { stream: true });

        for (const line of buffer.split('\n')) {
          if (line.startsWith('data: ') && buffer.includes('event: endpoint')) {
            const endpoint = line.slice(6).trim();
            // Endpoint may be relative (e.g., /messages?session_id=...) or absolute
            messagesUrl = endpoint.startsWith('http') ? endpoint : `${baseUrl}${endpoint}`;
          }
        }
        // Keep only the last incomplete line
        const lines = buffer.split('\n');
        buffer = lines[lines.length - 1] ?? '';
      }

      if (!messagesUrl) {
        reader.cancel();
        return { healthy: false, latencyMs: Date.now() - start, message: 'No endpoint event from SSE' };
      }

      // 3. Initialize via the messages endpoint
      const postHeaders = { 'Content-Type': 'application/json' };

      const initResp = await fetch(messagesUrl, {
        method: 'POST', headers: postHeaders,
        body: JSON.stringify({
          jsonrpc: '2.0', id: 1, method: 'initialize',
          params: { protocolVersion: '2024-11-05', capabilities: {}, clientInfo: { name: 'mcpctl-health', version: '0.1.0' } },
        }),
        signal: controller.signal,
      });

      if (!initResp.ok) {
        reader.cancel();
        return { healthy: false, latencyMs: Date.now() - start, message: `Initialize HTTP ${initResp.status}` };
      }

      // 4. Send initialized notification
      await fetch(messagesUrl, {
        method: 'POST', headers: postHeaders,
        body: JSON.stringify({ jsonrpc: '2.0', method: 'notifications/initialized' }),
        signal: controller.signal,
      });

      // 5. Call health check tool
      const toolResp = await fetch(messagesUrl, {
        method: 'POST', headers: postHeaders,
        body: JSON.stringify({
          jsonrpc: '2.0', id: 2, method: 'tools/call',
          params: { name: healthCheck.tool, arguments: healthCheck.arguments ?? {} },
        }),
        signal: controller.signal,
      });

      const latencyMs = Date.now() - start;

      // 6. Read tool response from SSE stream
      // The response comes back on the SSE stream, not the POST response
      let responseBuffer = '';
      const readTimeout = setTimeout(() => reader.cancel(), 5000);

      while (true) {
        const { done, value } = await reader.read();
        if (done) break;
        responseBuffer += decoder.decode(value, { stream: true });

        // Look for data lines containing our response (id: 2)
        for (const line of responseBuffer.split('\n')) {
          if (line.startsWith('data: ')) {
            try {
              const parsed = JSON.parse(line.slice(6));
              if (parsed.id === 2) {
                clearTimeout(readTimeout);
                reader.cancel();
                if (parsed.error) {
                  return { healthy: false, latencyMs, message: parsed.error.message ?? 'Tool call error' };
                }
                return { healthy: true, latencyMs, message: 'ok' };
              }
            } catch {
              // Not valid JSON, skip
            }
          }
        }
        const respLines = responseBuffer.split('\n');
        responseBuffer = respLines[respLines.length - 1] ?? '';
      }

      clearTimeout(readTimeout);
      reader.cancel();

      // If POST response itself was ok (202 for SSE), consider it healthy
      if (toolResp.ok) {
        return { healthy: true, latencyMs, message: 'ok' };
      }

      return { healthy: false, latencyMs, message: `Tool call HTTP ${toolResp.status}` };
|
||||
} finally {
|
||||
clearTimeout(timer);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Probe a STDIO MCP server by running `docker exec` with a disposable Node.js
|
||||
* script that pipes JSON-RPC messages into the package binary.
|
||||
|
||||
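The endpoint-discovery loop in step 2 above reduces to scanning SSE text for an `event: endpoint` frame and resolving its `data:` payload against the base URL. A minimal standalone sketch of that parsing (the helper name and inputs are hypothetical, not part of the diff):

```typescript
// Hypothetical helper mirroring step 2 of the probe: find the `data:` line
// of an `event: endpoint` frame and resolve relative endpoints.
function parseEndpointEvent(sseText: string, baseUrl: string): string | null {
  let currentEvent = '';
  for (const line of sseText.split('\n')) {
    if (line.startsWith('event: ')) {
      currentEvent = line.slice(7).trim();
    } else if (line.startsWith('data: ') && currentEvent === 'endpoint') {
      const endpoint = line.slice(6).trim();
      // Endpoint may be relative (/messages?session_id=...) or absolute
      return endpoint.startsWith('http') ? endpoint : `${baseUrl}${endpoint}`;
    }
  }
  return null;
}

console.log(parseEndpointEvent('event: endpoint\ndata: /messages?session_id=abc\n\n', 'http://localhost:3000'));
// → http://localhost:3000/messages?session_id=abc
```

Tracking the preceding `event:` field avoids the probe's looser `buffer.includes('event: endpoint')` check, which could match a frame elsewhere in the buffer.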
@@ -27,3 +27,8 @@ export type { McpProxyRequest, McpProxyResponse } from './mcp-proxy-service.js';
export { TemplateService } from './template.service.js';
export { HealthProbeRunner } from './health-probe.service.js';
export type { HealthCheckSpec, ProbeResult } from './health-probe.service.js';
+export { RbacDefinitionService } from './rbac-definition.service.js';
+export { RbacService } from './rbac.service.js';
+export type { RbacAction, Permission, AllowedScope } from './rbac.service.js';
+export { UserService } from './user.service.js';
+export { GroupService } from './group.service.js';
@@ -1,8 +1,10 @@
import type { McpServer } from '@prisma/client';

export interface McpConfigServer {
-  command: string;
-  args: string[];
+  command?: string;
+  args?: string[];
+  url?: string;
+  headers?: Record<string, string>;
  env?: Record<string, string>;
}
@@ -19,16 +21,24 @@ export function generateMcpConfig(
  const mcpServers: Record<string, McpConfigServer> = {};

  for (const { server, resolvedEnv } of servers) {
-    const config: McpConfigServer = {
-      command: 'npx',
-      args: ['-y', server.packageName ?? server.name],
-    };
+    if (server.transport === 'SSE' || server.transport === 'STREAMABLE_HTTP') {
+      // Point at mcpd proxy URL for non-STDIO transports
+      mcpServers[server.name] = {
+        url: `http://localhost:3100/api/v1/mcp/proxy/${server.name}`,
+      };
+    } else {
+      // STDIO — npx command approach
+      const config: McpConfigServer = {
+        command: 'npx',
+        args: ['-y', server.packageName ?? server.name],
+      };

-    if (Object.keys(resolvedEnv).length > 0) {
-      config.env = resolvedEnv;
-    }
+      if (Object.keys(resolvedEnv).length > 0) {
+        config.env = resolvedEnv;
+      }

-    mcpServers[server.name] = config;
+      mcpServers[server.name] = config;
+    }
  }

  return { mcpServers };
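The branch above emits one of two entry shapes per server. An illustrative result for one proxied and one STDIO server (server names, package, and env values are hypothetical):

```typescript
// Hypothetical output of generateMcpConfig for two servers.
interface McpConfigServer {
  command?: string;
  args?: string[];
  url?: string;
  env?: Record<string, string>;
}

const mcpServers: Record<string, McpConfigServer> = {
  // SSE / STREAMABLE_HTTP → a single URL entry pointing at the mcpd proxy
  'github-mcp': { url: 'http://localhost:3100/api/v1/mcp/proxy/github-mcp' },
  // STDIO → npx command entry; env is attached only when non-empty
  'fs-mcp': { command: 'npx', args: ['-y', '@example/fs-mcp'], env: { FS_ROOT: '/data' } },
};

console.log(Object.keys(mcpServers).length);
// → 2
```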
@@ -1,7 +1,10 @@
-import type { McpInstance } from '@prisma/client';
+import type { McpInstance, McpServer } from '@prisma/client';
import type { IMcpInstanceRepository, IMcpServerRepository } from '../repositories/interfaces.js';
import type { McpOrchestrator } from './orchestrator.js';
import { NotFoundError } from './mcp-server.service.js';
+import { InvalidStateError } from './instance.service.js';
+import { sendViaSse } from './transport/sse-client.js';
+import { sendViaStdio } from './transport/stdio-client.js';

export interface McpProxyRequest {
  serverId: string;
@@ -38,17 +41,21 @@ export class McpProxyService {
  constructor(
    private readonly instanceRepo: IMcpInstanceRepository,
    private readonly serverRepo: IMcpServerRepository,
    private readonly orchestrator?: McpOrchestrator,
  ) {}

  async execute(request: McpProxyRequest): Promise<McpProxyResponse> {
    const server = await this.serverRepo.findById(request.serverId);

-    // External server: proxy directly to externalUrl
-    if (server?.externalUrl) {
-      return this.sendToExternal(server.id, server.externalUrl, request.method, request.params);
+    if (!server) {
+      throw new NotFoundError(`Server '${request.serverId}' not found`);
    }

-    // Managed server: find running instance
+    // External server: proxy directly to externalUrl
+    if (server.externalUrl) {
+      return this.sendToExternal(server, request.method, request.params);
+    }
+
+    // Managed server: find running instance and dispatch by transport
    const instances = await this.instanceRepo.findAll(request.serverId);
    const running = instances.find((i) => i.status === 'RUNNING');
@@ -56,20 +63,95 @@ export class McpProxyService {
      throw new NotFoundError(`No running instance found for server '${request.serverId}'`);
    }

    if (running.port === null || running.port === undefined) {
      throw new InvalidStateError(
        `Running instance '${running.id}' for server '${request.serverId}' has no port assigned`,
      );
    }

-    return this.sendJsonRpc(running, request.method, request.params);
+    return this.sendToManaged(server, running, request.method, request.params);
  }

  /**
-   * Send a JSON-RPC request to an external MCP server.
-   * Handles streamable-http protocol (session management + SSE response parsing).
+   * Send to an external MCP server. Dispatches based on transport type.
   */
  private async sendToExternal(
    server: McpServer,
    method: string,
    params?: Record<string, unknown>,
  ): Promise<McpProxyResponse> {
    const url = server.externalUrl as string;

    if (server.transport === 'SSE') {
      return sendViaSse(url, method, params);
    }

    // STREAMABLE_HTTP (default for external)
    return this.sendStreamableHttp(server.id, url, method, params);
  }

  /**
   * Send to a managed (containerized) MCP server. Dispatches based on transport type.
   */
  private async sendToManaged(
    server: McpServer,
    instance: McpInstance,
    method: string,
    params?: Record<string, unknown>,
  ): Promise<McpProxyResponse> {
    const transport = server.transport as string;

    // STDIO: use docker exec
    if (transport === 'STDIO') {
      if (!this.orchestrator) {
        throw new InvalidStateError('Orchestrator required for STDIO transport');
      }
      if (!instance.containerId) {
        throw new InvalidStateError(`Instance '${instance.id}' has no container ID`);
      }
      const packageName = server.packageName as string | null;
      if (!packageName) {
        throw new InvalidStateError(`Server '${server.id}' has no package name for STDIO transport`);
      }
      return sendViaStdio(this.orchestrator, instance.containerId, packageName, method, params);
    }

    // SSE or STREAMABLE_HTTP: need a base URL
    const baseUrl = await this.resolveBaseUrl(instance, server);

    if (transport === 'SSE') {
      return sendViaSse(baseUrl, method, params);
    }

    // STREAMABLE_HTTP (default)
    return this.sendStreamableHttp(server.id, baseUrl, method, params);
  }

  /**
   * Resolve the base URL for an HTTP-based managed server.
   * Prefers container internal IP on Docker network, falls back to localhost:port.
   */
  private async resolveBaseUrl(instance: McpInstance, server: McpServer): Promise<string> {
    const containerPort = (server.containerPort as number | null) ?? 3000;

    if (this.orchestrator && instance.containerId) {
      try {
        const containerInfo = await this.orchestrator.inspectContainer(instance.containerId);
        if (containerInfo.ip) {
          return `http://${containerInfo.ip}:${containerPort}`;
        }
      } catch {
        // Fall through to localhost
      }
    }

    if (instance.port !== null && instance.port !== undefined) {
      return `http://localhost:${instance.port}`;
    }

    throw new InvalidStateError(
      `Cannot resolve URL for instance '${instance.id}': no container IP or host port`,
    );
  }

  /**
   * Send via streamable-http protocol with session management.
   */
  private async sendStreamableHttp(
    serverId: string,
    url: string,
    method: string,
@@ -109,14 +191,14 @@ export class McpProxyService {
      // Session expired? Clear and retry once
      if (response.status === 400 || response.status === 404) {
        this.sessions.delete(serverId);
-        return this.sendToExternal(serverId, url, method, params);
+        return this.sendStreamableHttp(serverId, url, method, params);
      }
      return {
        jsonrpc: '2.0',
        id: 1,
        error: {
          code: -32000,
-          message: `External MCP server returned HTTP ${response.status}: ${response.statusText}`,
+          message: `MCP server returned HTTP ${response.status}: ${response.statusText}`,
        },
      };
    }
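The 400/404 handling above drops the cached session and retries. A generic sketch of that expire-and-retry pattern (types and names are hypothetical; the real code retries by re-entering `sendStreamableHttp`, which re-initializes the session):

```typescript
// Hypothetical sketch of the expire-and-retry-once pattern.
async function callWithSessionRetry(
  send: () => Promise<{ status: number; body: string }>,
  resetSession: () => void,
): Promise<{ status: number; body: string }> {
  let resp = await send();
  if (resp.status === 400 || resp.status === 404) {
    resetSession();       // drop the cached session ID
    resp = await send();  // retry exactly once with a fresh session
  }
  return resp;
}

// Usage: a fake transport whose first call fails with 404.
let calls = 0;
const fakeSend = async () => (++calls === 1 ? { status: 404, body: '' } : { status: 200, body: 'ok' });
callWithSessionRetry(fakeSend, () => {}).then((r) => console.log(r.status, calls));
// → 200 2
```

Bounding the retry to a single attempt avoids looping when a server rejects even freshly initialized sessions.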
@@ -126,8 +208,7 @@ export class McpProxyService {
  }

  /**
-   * Initialize a streamable-http session with an external server.
-   * Sends `initialize` and `notifications/initialized`, caches the session ID.
+   * Initialize a streamable-http session with a server.
   */
  private async initSession(serverId: string, url: string): Promise<void> {
    const initBody = {
@@ -174,41 +255,4 @@ export class McpProxyService {
      body: JSON.stringify({ jsonrpc: '2.0', method: 'notifications/initialized' }),
    });
  }
-
-  private async sendJsonRpc(
-    instance: McpInstance,
-    method: string,
-    params?: Record<string, unknown>,
-  ): Promise<McpProxyResponse> {
-    const url = `http://localhost:${instance.port}`;
-
-    const body: Record<string, unknown> = {
-      jsonrpc: '2.0',
-      id: 1,
-      method,
-    };
-    if (params !== undefined) {
-      body.params = params;
-    }
-
-    const response = await fetch(url, {
-      method: 'POST',
-      headers: { 'Content-Type': 'application/json' },
-      body: JSON.stringify(body),
-    });
-
-    if (!response.ok) {
-      return {
-        jsonrpc: '2.0',
-        id: 1,
-        error: {
-          code: -32000,
-          message: `MCP server returned HTTP ${response.status}: ${response.statusText}`,
-        },
-      };
-    }
-
-    const result = (await response.json()) as McpProxyResponse;
-    return result;
-  }
}
@@ -30,6 +30,8 @@ export interface ContainerInfo {
  name: string;
  state: 'running' | 'stopped' | 'starting' | 'error' | 'unknown';
  port?: number;
+  /** Container IP on the first non-default network (for internal communication) */
+  ip?: string;
  createdAt: Date;
}
@@ -1,18 +1,24 @@
-import type { Project } from '@prisma/client';
-import type { IProjectRepository } from '../repositories/project.repository.js';
+import type { McpServer } from '@prisma/client';
+import type { IProjectRepository, ProjectWithRelations } from '../repositories/project.repository.js';
+import type { IMcpServerRepository, ISecretRepository } from '../repositories/interfaces.js';
import { CreateProjectSchema, UpdateProjectSchema } from '../validation/project.schema.js';
import { NotFoundError, ConflictError } from './mcp-server.service.js';
+import { resolveServerEnv } from './env-resolver.js';
+import { generateMcpConfig } from './mcp-config-generator.js';
+import type { McpConfig } from './mcp-config-generator.js';

export class ProjectService {
  constructor(
    private readonly projectRepo: IProjectRepository,
+    private readonly serverRepo: IMcpServerRepository,
+    private readonly secretRepo: ISecretRepository,
  ) {}

-  async list(ownerId?: string): Promise<Project[]> {
+  async list(ownerId?: string): Promise<ProjectWithRelations[]> {
    return this.projectRepo.findAll(ownerId);
  }

-  async getById(id: string): Promise<Project> {
+  async getById(id: string): Promise<ProjectWithRelations> {
    const project = await this.projectRepo.findById(id);
    if (project === null) {
      throw new NotFoundError(`Project not found: ${id}`);
@@ -20,7 +26,20 @@ export class ProjectService {
    return project;
  }

-  async create(input: unknown, ownerId: string): Promise<Project> {
+  /** Resolve by ID or name. */
+  async resolveAndGet(idOrName: string): Promise<ProjectWithRelations> {
+    // Try by ID first
+    const byId = await this.projectRepo.findById(idOrName);
+    if (byId !== null) return byId;
+
+    // Fall back to name
+    const byName = await this.projectRepo.findByName(idOrName);
+    if (byName !== null) return byName;
+
+    throw new NotFoundError(`Project not found: ${idOrName}`);
+  }
+
+  async create(input: unknown, ownerId: string): Promise<ProjectWithRelations> {
    const data = CreateProjectSchema.parse(input);

    const existing = await this.projectRepo.findByName(data.name);
@@ -28,17 +47,111 @@ export class ProjectService {
      throw new ConflictError(`Project already exists: ${data.name}`);
    }

-    return this.projectRepo.create({ ...data, ownerId });
+    // Resolve server names to IDs
+    const serverIds = await this.resolveServerNames(data.servers);
+
+    const project = await this.projectRepo.create({
+      name: data.name,
+      description: data.description,
+      prompt: data.prompt,
+      ownerId,
+      proxyMode: data.proxyMode,
+      gated: data.gated,
+      ...(data.llmProvider !== undefined ? { llmProvider: data.llmProvider } : {}),
+      ...(data.llmModel !== undefined ? { llmModel: data.llmModel } : {}),
+    });
+
+    // Link servers
+    if (serverIds.length > 0) {
+      await this.projectRepo.setServers(project.id, serverIds);
+    }
+
+    // Re-fetch to include relations
+    return this.getById(project.id);
  }

-  async update(id: string, input: unknown): Promise<Project> {
+  async update(id: string, input: unknown): Promise<ProjectWithRelations> {
    const data = UpdateProjectSchema.parse(input);
-    await this.getById(id);
-    return this.projectRepo.update(id, data);
+    const project = await this.getById(id);
+
+    // Build update data for scalar fields
+    const updateData: Record<string, unknown> = {};
+    if (data.description !== undefined) updateData['description'] = data.description;
+    if (data.prompt !== undefined) updateData['prompt'] = data.prompt;
+    if (data.proxyMode !== undefined) updateData['proxyMode'] = data.proxyMode;
+    if (data.llmProvider !== undefined) updateData['llmProvider'] = data.llmProvider;
+    if (data.llmModel !== undefined) updateData['llmModel'] = data.llmModel;
+    if (data.gated !== undefined) updateData['gated'] = data.gated;
+
+    // Update scalar fields if any changed
+    if (Object.keys(updateData).length > 0) {
+      await this.projectRepo.update(project.id, updateData);
+    }
+
+    // Update servers if provided
+    if (data.servers !== undefined) {
+      const serverIds = await this.resolveServerNames(data.servers);
+      await this.projectRepo.setServers(project.id, serverIds);
+    }
+
+    // Re-fetch to include updated relations
+    return this.getById(project.id);
  }

  async delete(id: string): Promise<void> {
    await this.getById(id);
    await this.projectRepo.delete(id);
  }

+  async generateMcpConfig(idOrName: string): Promise<McpConfig> {
+    const project = await this.resolveAndGet(idOrName);
+
+    if (project.proxyMode === 'filtered') {
+      // Single entry pointing at mcplocal proxy
+      return {
+        mcpServers: {
+          [project.name]: {
+            url: `http://localhost:3100/api/v1/mcp/proxy/project/${project.name}`,
+          },
+        },
+      };
+    }
+
+    // Direct mode: fetch full servers and resolve env
+    const serverEntries: Array<{ server: McpServer; resolvedEnv: Record<string, string> }> = [];
+
+    for (const ps of project.servers) {
+      const server = await this.serverRepo.findById(ps.server.id);
+      if (server === null) continue;
+
+      const resolvedEnv = await resolveServerEnv(server, this.secretRepo);
+      serverEntries.push({ server, resolvedEnv });
+    }
+
+    return generateMcpConfig(serverEntries);
+  }
+
+  async addServer(idOrName: string, serverName: string): Promise<ProjectWithRelations> {
+    const project = await this.resolveAndGet(idOrName);
+    const server = await this.serverRepo.findByName(serverName);
+    if (server === null) throw new NotFoundError(`Server not found: ${serverName}`);
+    await this.projectRepo.addServer(project.id, server.id);
+    return this.getById(project.id);
+  }
+
+  async removeServer(idOrName: string, serverName: string): Promise<ProjectWithRelations> {
+    const project = await this.resolveAndGet(idOrName);
+    const server = await this.serverRepo.findByName(serverName);
+    if (server === null) throw new NotFoundError(`Server not found: ${serverName}`);
+    await this.projectRepo.removeServer(project.id, server.id);
+    return this.getById(project.id);
+  }
+
+  private async resolveServerNames(names: string[]): Promise<string[]> {
+    return Promise.all(names.map(async (name) => {
+      const server = await this.serverRepo.findByName(name);
+      if (server === null) throw new NotFoundError(`Server not found: ${name}`);
+      return server.id;
+    }));
+  }
}
96
src/mcpd/src/services/prompt-summary.service.ts
Normal file
@@ -0,0 +1,96 @@
/**
 * Generates summary and chapters for prompt content.
 *
 * Uses regex-based extraction by default (first sentence + markdown headings).
 * An optional LLM generator can be injected for higher-quality summaries.
 */

const MAX_SUMMARY_WORDS = 20;
const HEADING_RE = /^#{1,6}\s+(.+)$/gm;

export interface LlmSummaryGenerator {
  generate(content: string): Promise<{ summary: string; chapters: string[] }>;
}

export class PromptSummaryService {
  constructor(private readonly llmGenerator: LlmSummaryGenerator | null = null) {}

  async generateSummary(content: string): Promise<{ summary: string; chapters: string[] }> {
    if (this.llmGenerator) {
      try {
        return await this.llmGenerator.generate(content);
      } catch {
        // Fall back to regex on LLM failure
      }
    }
    return this.generateWithRegex(content);
  }

  generateWithRegex(content: string): { summary: string; chapters: string[] } {
    return {
      summary: extractFirstSentence(content, MAX_SUMMARY_WORDS),
      chapters: extractHeadings(content),
    };
  }
}

/**
 * Extract the first sentence, truncated to maxWords.
 * Strips markdown formatting.
 */
export function extractFirstSentence(content: string, maxWords: number): string {
  if (!content.trim()) return '';

  // Skip leading headings and blank lines to find first content line
  const lines = content.split('\n');
  let firstContent = '';
  for (const line of lines) {
    const trimmed = line.trim();
    if (!trimmed) continue;
    if (trimmed.startsWith('#')) continue;
    firstContent = trimmed;
    break;
  }

  if (!firstContent) {
    // All lines are headings or empty — use first heading text
    for (const line of lines) {
      const trimmed = line.trim();
      if (trimmed.startsWith('#')) {
        firstContent = trimmed.replace(/^#+\s*/, '');
        break;
      }
    }
  }

  if (!firstContent) return '';

  // Strip basic markdown formatting
  firstContent = firstContent
    .replace(/\*\*(.+?)\*\*/g, '$1')
    .replace(/\*(.+?)\*/g, '$1')
    .replace(/`(.+?)`/g, '$1')
    .replace(/\[(.+?)\]\(.+?\)/g, '$1');

  // Split on sentence boundaries
  const sentenceEnd = firstContent.search(/[.!?]\s|[.!?]$/);
  const sentence = sentenceEnd >= 0 ? firstContent.slice(0, sentenceEnd + 1) : firstContent;

  // Truncate to maxWords
  const words = sentence.split(/\s+/);
  if (words.length <= maxWords) return sentence;
  return words.slice(0, maxWords).join(' ') + '...';
}

/**
 * Extract markdown headings as chapter titles.
 */
export function extractHeadings(content: string): string[] {
  const headings: string[] = [];
  let match: RegExpExecArray | null;
  while ((match = HEADING_RE.exec(content)) !== null) {
    const heading = match[1]!.trim();
    if (heading) headings.push(heading);
  }
  return headings;
}
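The heading extraction above can be exercised standalone. This sketch reuses the file's `HEADING_RE` pattern on a sample prompt (the sample document is hypothetical, and the capture-group handling is lightly adapted so the snippet is self-contained):

```typescript
// Standalone re-run of the chapter extraction: same regex as the service.
const HEADING_RE = /^#{1,6}\s+(.+)$/gm;

function extractHeadings(content: string): string[] {
  const headings: string[] = [];
  let match: RegExpExecArray | null;
  while ((match = HEADING_RE.exec(content)) !== null) {
    const heading = (match[1] ?? '').trim();
    if (heading) headings.push(heading);
  }
  return headings;
}

const doc = '# Security Policy\n\nAll tokens rotate every 90 days.\n\n## Scope\n## Enforcement\n';
console.log(extractHeadings(doc));
// → [ 'Security Policy', 'Scope', 'Enforcement' ]
```

Because `HEADING_RE` is global, `exec` advances `lastIndex` on each match; running the loop until it returns `null` resets the regex for the next call.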
198
src/mcpd/src/services/prompt.service.ts
Normal file
@@ -0,0 +1,198 @@
import type { Prompt, PromptRequest } from '@prisma/client';
import type { IPromptRepository } from '../repositories/prompt.repository.js';
import type { IPromptRequestRepository } from '../repositories/prompt-request.repository.js';
import type { IProjectRepository } from '../repositories/project.repository.js';
import { CreatePromptSchema, UpdatePromptSchema, CreatePromptRequestSchema, UpdatePromptRequestSchema } from '../validation/prompt.schema.js';
import { NotFoundError } from './mcp-server.service.js';
import type { PromptSummaryService } from './prompt-summary.service.js';
import { SYSTEM_PROJECT_NAME } from '../bootstrap/system-project.js';

export class PromptService {
  private summaryService: PromptSummaryService | null = null;

  constructor(
    private readonly promptRepo: IPromptRepository,
    private readonly promptRequestRepo: IPromptRequestRepository,
    private readonly projectRepo: IProjectRepository,
  ) {}

  setSummaryService(service: PromptSummaryService): void {
    this.summaryService = service;
  }

  // ── Prompt CRUD ──

  async listPrompts(projectId?: string): Promise<Prompt[]> {
    return this.promptRepo.findAll(projectId);
  }

  async listGlobalPrompts(): Promise<Prompt[]> {
    return this.promptRepo.findGlobal();
  }

  async getPrompt(id: string): Promise<Prompt> {
    const prompt = await this.promptRepo.findById(id);
    if (prompt === null) throw new NotFoundError(`Prompt not found: ${id}`);
    return prompt;
  }

  async createPrompt(input: unknown): Promise<Prompt> {
    const data = CreatePromptSchema.parse(input);

    if (data.projectId) {
      const project = await this.projectRepo.findById(data.projectId);
      if (project === null) throw new NotFoundError(`Project not found: ${data.projectId}`);
    }

    const createData: { name: string; content: string; projectId?: string; priority?: number; linkTarget?: string } = {
      name: data.name,
      content: data.content,
    };
    if (data.projectId !== undefined) createData.projectId = data.projectId;
    if (data.priority !== undefined) createData.priority = data.priority;
    if (data.linkTarget !== undefined) createData.linkTarget = data.linkTarget;
    const prompt = await this.promptRepo.create(createData);
    // Auto-generate summary/chapters (non-blocking — don't fail create if summary fails)
    if (this.summaryService && !data.linkTarget) {
      this.generateAndStoreSummary(prompt.id, data.content).catch(() => {});
    }
    return prompt;
  }

  async updatePrompt(id: string, input: unknown): Promise<Prompt> {
    const data = UpdatePromptSchema.parse(input);
    await this.getPrompt(id);
    const updateData: { content?: string; priority?: number } = {};
    if (data.content !== undefined) updateData.content = data.content;
    if (data.priority !== undefined) updateData.priority = data.priority;
    const prompt = await this.promptRepo.update(id, updateData);
    // Regenerate summary when content changes
    if (this.summaryService && data.content !== undefined && !prompt.linkTarget) {
      this.generateAndStoreSummary(prompt.id, data.content).catch(() => {});
    }
    return prompt;
  }

  async regenerateSummary(id: string): Promise<Prompt> {
    const prompt = await this.getPrompt(id);
    if (!this.summaryService) {
      throw new Error('Summary generation not available');
    }
    return this.generateAndStoreSummary(prompt.id, prompt.content);
  }

  private async generateAndStoreSummary(id: string, content: string): Promise<Prompt> {
    if (!this.summaryService) throw new Error('No summary service');
    const { summary, chapters } = await this.summaryService.generateSummary(content);
    return this.promptRepo.update(id, { summary, chapters });
  }

  async deletePrompt(id: string): Promise<void> {
    const prompt = await this.getPrompt(id);
    // Protect system prompts from deletion
    if (prompt.projectId) {
      const project = await this.projectRepo.findById(prompt.projectId);
      if (project?.name === SYSTEM_PROJECT_NAME) {
        throw Object.assign(new Error('Cannot delete system prompts'), { statusCode: 403 });
      }
    }
    await this.promptRepo.delete(id);
  }

  // ── PromptRequest CRUD ──

  async listPromptRequests(projectId?: string): Promise<PromptRequest[]> {
    return this.promptRequestRepo.findAll(projectId);
  }

  async listGlobalPromptRequests(): Promise<PromptRequest[]> {
    return this.promptRequestRepo.findGlobal();
  }

  async getPromptRequest(id: string): Promise<PromptRequest> {
    const req = await this.promptRequestRepo.findById(id);
    if (req === null) throw new NotFoundError(`PromptRequest not found: ${id}`);
    return req;
  }

  async updatePromptRequest(id: string, input: unknown): Promise<PromptRequest> {
    await this.getPromptRequest(id);
    const data = UpdatePromptRequestSchema.parse(input);
    const updateData: { content?: string; priority?: number } = {};
    if (data.content !== undefined) updateData.content = data.content;
    if (data.priority !== undefined) updateData.priority = data.priority;
    return this.promptRequestRepo.update(id, updateData);
  }

  async deletePromptRequest(id: string): Promise<void> {
    await this.getPromptRequest(id);
    await this.promptRequestRepo.delete(id);
  }

  // ── Propose (LLM creates a PromptRequest) ──

  async propose(input: unknown): Promise<PromptRequest> {
    const data = CreatePromptRequestSchema.parse(input);

    if (data.projectId) {
      const project = await this.projectRepo.findById(data.projectId);
      if (project === null) throw new NotFoundError(`Project not found: ${data.projectId}`);
    }

    const createData: { name: string; content: string; projectId?: string; priority?: number; createdBySession?: string; createdByUserId?: string } = {
      name: data.name,
      content: data.content,
    };
    if (data.projectId !== undefined) createData.projectId = data.projectId;
    if (data.priority !== undefined) createData.priority = data.priority;
    if (data.createdBySession !== undefined) createData.createdBySession = data.createdBySession;
    if (data.createdByUserId !== undefined) createData.createdByUserId = data.createdByUserId;
    return this.promptRequestRepo.create(createData);
  }

  // ── Approve (delete PromptRequest → create Prompt) ──

  async approve(requestId: string): Promise<Prompt> {
    const req = await this.getPromptRequest(requestId);

    // Create the approved prompt (carry priority from request)
    const createData: { name: string; content: string; projectId?: string; priority?: number } = {
      name: req.name,
      content: req.content,
    };
    if (req.projectId !== null) createData.projectId = req.projectId;
    if (req.priority !== 5) createData.priority = req.priority;

    const prompt = await this.promptRepo.create(createData);

    // Delete the request
    await this.promptRequestRepo.delete(requestId);

    return prompt;
  }

  // ── Visibility for MCP (approved prompts + session's pending requests) ──

  async getVisiblePrompts(
    projectId?: string,
    sessionId?: string,
  ): Promise<Array<{ name: string; content: string; type: 'prompt' | 'promptrequest' }>> {
    const results: Array<{ name: string; content: string; type: 'prompt' | 'promptrequest' }> = [];

    // Approved prompts (project-scoped + global)
    const prompts = await this.promptRepo.findAll(projectId);
    for (const p of prompts) {
      results.push({ name: p.name, content: p.content, type: 'prompt' });
    }

    // Session's own pending requests
    if (sessionId) {
      const requests = await this.promptRequestRepo.findBySession(sessionId, projectId);
      for (const r of requests) {
        results.push({ name: r.name, content: r.content, type: 'promptrequest' });
      }
    }

    return results;
  }
}
54
src/mcpd/src/services/rbac-definition.service.ts
Normal file
@@ -0,0 +1,54 @@
import type { RbacDefinition } from '@prisma/client';
import type { IRbacDefinitionRepository } from '../repositories/rbac-definition.repository.js';
import { CreateRbacDefinitionSchema, UpdateRbacDefinitionSchema } from '../validation/rbac-definition.schema.js';
import { NotFoundError, ConflictError } from './mcp-server.service.js';

export class RbacDefinitionService {
  constructor(private readonly repo: IRbacDefinitionRepository) {}

  async list(): Promise<RbacDefinition[]> {
    return this.repo.findAll();
  }

  async getById(id: string): Promise<RbacDefinition> {
    const def = await this.repo.findById(id);
    if (def === null) {
      throw new NotFoundError(`RbacDefinition not found: ${id}`);
    }
    return def;
  }

  async getByName(name: string): Promise<RbacDefinition> {
    const def = await this.repo.findByName(name);
    if (def === null) {
      throw new NotFoundError(`RbacDefinition not found: ${name}`);
    }
    return def;
  }

  async create(input: unknown): Promise<RbacDefinition> {
    const data = CreateRbacDefinitionSchema.parse(input);

    const existing = await this.repo.findByName(data.name);
    if (existing !== null) {
      throw new ConflictError(`RbacDefinition already exists: ${data.name}`);
    }

    return this.repo.create(data);
  }

  async update(id: string, input: unknown): Promise<RbacDefinition> {
    const data = UpdateRbacDefinitionSchema.parse(input);

    // Verify exists
    await this.getById(id);

    return this.repo.update(id, data);
  }

  async delete(id: string): Promise<void> {
    // Verify exists
    await this.getById(id);
    await this.repo.delete(id);
  }
}
src/mcpd/src/services/rbac.service.ts — 165 lines (new file)
@@ -0,0 +1,165 @@
import type { PrismaClient } from '@prisma/client';
import type { IRbacDefinitionRepository } from '../repositories/rbac-definition.repository.js';
import {
  normalizeResource,
  isResourceBinding,
  isOperationBinding,
  type RbacSubject,
  type RbacRoleBinding,
} from '../validation/rbac-definition.schema.js';

export type RbacAction = 'view' | 'create' | 'delete' | 'edit' | 'run' | 'expose';

export interface ResourcePermission {
  role: string;
  resource: string;
  name?: string;
}

export interface OperationPermission {
  role: 'run';
  action: string;
}

export type Permission = ResourcePermission | OperationPermission;

export interface AllowedScope {
  wildcard: boolean;
  names: Set<string>;
}

/** Maps roles to the set of actions they grant. */
const ROLE_ACTIONS: Record<string, readonly RbacAction[]> = {
  edit: ['view', 'create', 'delete', 'edit', 'expose'],
  view: ['view'],
  create: ['create'],
  delete: ['delete'],
  run: ['run'],
  expose: ['expose', 'view'],
};

export class RbacService {
  constructor(
    private readonly rbacRepo: IRbacDefinitionRepository,
    private readonly prisma: PrismaClient,
  ) {}

  /**
   * Check whether a user is allowed to perform an action on a resource.
   * @param resourceName — optional specific resource name (e.g. 'my-ha').
   *   If provided, name-scoped bindings only match when their name equals this.
   *   If omitted (listing), name-scoped bindings still grant access.
   */
  async canAccess(userId: string, action: RbacAction, resource: string, resourceName?: string, serviceAccountName?: string): Promise<boolean> {
    const permissions = await this.getPermissions(userId, serviceAccountName);
    const normalized = normalizeResource(resource);

    for (const perm of permissions) {
      if (!('resource' in perm)) continue;
      const actions = ROLE_ACTIONS[perm.role];
      if (actions === undefined) continue;
      if (!actions.includes(action)) continue;
      const permResource = normalizeResource(perm.resource);
      if (permResource !== '*' && permResource !== normalized) continue;
      // Name-scoped check: if binding has a name AND caller specified a resourceName, must match
      if (perm.name !== undefined && resourceName !== undefined && perm.name !== resourceName) continue;
      return true;
    }

    return false;
  }

  /**
   * Check whether a user is allowed to perform a named operation.
   * Operations require an explicit 'run' role binding with a matching action.
   */
  async canRunOperation(userId: string, operation: string, serviceAccountName?: string): Promise<boolean> {
    const permissions = await this.getPermissions(userId, serviceAccountName);

    for (const perm of permissions) {
      if ('action' in perm && perm.role === 'run' && perm.action === operation) {
        return true;
      }
    }

    return false;
  }

  /**
   * Determine the set of resource names a user may access for a given action+resource.
   * Returns wildcard:true if any matching binding is unscoped (no name constraint).
   * Returns wildcard:false with a set of allowed names if all bindings are name-scoped.
   */
  async getAllowedScope(userId: string, action: RbacAction, resource: string, serviceAccountName?: string): Promise<AllowedScope> {
    const permissions = await this.getPermissions(userId, serviceAccountName);
    const normalized = normalizeResource(resource);
    const names = new Set<string>();

    for (const perm of permissions) {
      if (!('resource' in perm)) continue;
      const actions = ROLE_ACTIONS[perm.role];
      if (actions === undefined) continue;
      if (!actions.includes(action)) continue;
      const permResource = normalizeResource(perm.resource);
      if (permResource !== '*' && permResource !== normalized) continue;
      // Unscoped binding → wildcard access to this resource
      if (perm.name === undefined) return { wildcard: true, names: new Set() };
      names.add(perm.name);
    }

    return { wildcard: false, names };
  }

  /**
   * Collect all permissions for a user across all matching RbacDefinitions.
   */
  async getPermissions(userId: string, serviceAccountName?: string): Promise<Permission[]> {
    // 1. Resolve user email
    const user = await this.prisma.user.findUnique({
      where: { id: userId },
      select: { email: true },
    });
    if (user === null && serviceAccountName === undefined) return [];

    // 2. Resolve group names the user belongs to
    let groupNames: string[] = [];
    if (user !== null) {
      const memberships = await this.prisma.groupMember.findMany({
        where: { userId },
        select: { group: { select: { name: true } } },
      });
      groupNames = memberships.map((m) => m.group.name);
    }

    // 3. Load all RbacDefinitions
    const definitions = await this.rbacRepo.findAll();

    // 4. Find definitions where user or service account is a subject
    const permissions: Permission[] = [];
    for (const def of definitions) {
      const subjects = def.subjects as RbacSubject[];
      const matched = subjects.some((s) => {
        if (s.kind === 'User') return user !== null && s.name === user.email;
        if (s.kind === 'Group') return groupNames.includes(s.name);
        if (s.kind === 'ServiceAccount') return serviceAccountName !== undefined && s.name === serviceAccountName;
        return false;
      });

      if (!matched) continue;

      // 5. Collect roleBindings
      const bindings = def.roleBindings as RbacRoleBinding[];
      for (const binding of bindings) {
        if (isResourceBinding(binding)) {
          const perm: ResourcePermission = { role: binding.role, resource: binding.resource };
          if (binding.name !== undefined) perm.name = binding.name;
          permissions.push(perm);
        } else if (isOperationBinding(binding)) {
          permissions.push({ role: 'run', action: binding.action });
        }
      }
    }

    return permissions;
  }
}
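The core of the canAccess check is the role→action expansion: a role name maps to the set of actions it implies. A standalone sketch of that lookup, with the same table as ROLE_ACTIONS in rbac.service.ts (re-declared here purely for illustration):

```typescript
// Role→action expansion as used by canAccess: 'edit' implies 'view', but not
// vice versa, and an unknown role grants nothing.
type Action = 'view' | 'create' | 'delete' | 'edit' | 'run' | 'expose';

const ROLE_ACTIONS: Record<string, readonly Action[]> = {
  edit: ['view', 'create', 'delete', 'edit', 'expose'],
  view: ['view'],
  create: ['create'],
  delete: ['delete'],
  run: ['run'],
  expose: ['expose', 'view'],
};

function roleGrants(role: string, action: Action): boolean {
  // Missing table entry → role grants no actions at all.
  return ROLE_ACTIONS[role]?.includes(action) ?? false;
}
```

Note the asymmetry this encodes: granting `expose` also grants `view` (you can see what you can expose), while `view` alone never escalates.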
src/mcpd/src/services/transport/index.ts — 2 lines (new file)
@@ -0,0 +1,2 @@
export { sendViaSse } from './sse-client.js';
export { sendViaStdio } from './stdio-client.js';
src/mcpd/src/services/transport/sse-client.ts — 150 lines (new file)
@@ -0,0 +1,150 @@
import type { McpProxyResponse } from '../mcp-proxy-service.js';

/**
 * SSE transport client for MCP servers using the legacy SSE protocol.
 *
 * Protocol: GET /sse → endpoint event with messages URL → POST to messages URL.
 * Responses come back on the SSE stream, matched by JSON-RPC request ID.
 *
 * Each call opens a fresh SSE connection, initializes, sends the request,
 * reads the response, and closes. Session caching may be added later.
 */
export async function sendViaSse(
  baseUrl: string,
  method: string,
  params?: Record<string, unknown>,
  timeoutMs = 30_000,
): Promise<McpProxyResponse> {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), timeoutMs);

  try {
    // 1. GET /sse → SSE stream
    const sseResp = await fetch(`${baseUrl}/sse`, {
      method: 'GET',
      headers: { 'Accept': 'text/event-stream' },
      signal: controller.signal,
    });

    if (!sseResp.ok) {
      return errorResponse(`SSE connect failed: HTTP ${sseResp.status}`);
    }

    const reader = sseResp.body?.getReader();
    if (!reader) {
      return errorResponse('No SSE stream body');
    }

    // 2. Read until we get the endpoint event with messages URL
    const decoder = new TextDecoder();
    let buffer = '';
    let messagesUrl = '';

    while (!messagesUrl) {
      const { done, value } = await reader.read();
      if (done) break;
      buffer += decoder.decode(value, { stream: true });

      for (const line of buffer.split('\n')) {
        if (line.startsWith('data: ') && buffer.includes('event: endpoint')) {
          const endpoint = line.slice(6).trim();
          messagesUrl = endpoint.startsWith('http') ? endpoint : `${baseUrl}${endpoint}`;
        }
      }
      const lines = buffer.split('\n');
      buffer = lines[lines.length - 1] ?? '';
    }

    if (!messagesUrl) {
      reader.cancel();
      return errorResponse('No endpoint event from SSE stream');
    }

    const postHeaders = { 'Content-Type': 'application/json' };

    // 3. Initialize
    const initResp = await fetch(messagesUrl, {
      method: 'POST',
      headers: postHeaders,
      body: JSON.stringify({
        jsonrpc: '2.0',
        id: 1,
        method: 'initialize',
        params: {
          protocolVersion: '2024-11-05',
          capabilities: {},
          clientInfo: { name: 'mcpctl-proxy', version: '0.1.0' },
        },
      }),
      signal: controller.signal,
    });

    if (!initResp.ok) {
      reader.cancel();
      return errorResponse(`SSE initialize failed: HTTP ${initResp.status}`);
    }

    // 4. Send notifications/initialized
    await fetch(messagesUrl, {
      method: 'POST',
      headers: postHeaders,
      body: JSON.stringify({ jsonrpc: '2.0', method: 'notifications/initialized' }),
      signal: controller.signal,
    });

    // 5. Send the actual request
    const requestId = 2;
    await fetch(messagesUrl, {
      method: 'POST',
      headers: postHeaders,
      body: JSON.stringify({
        jsonrpc: '2.0',
        id: requestId,
        method,
        ...(params !== undefined ? { params } : {}),
      }),
      signal: controller.signal,
    });

    // 6. Read response from SSE stream (matched by request ID)
    let responseBuffer = '';
    const readTimeout = setTimeout(() => reader.cancel(), 5000);

    while (true) {
      const { done, value } = await reader.read();
      if (done) break;
      responseBuffer += decoder.decode(value, { stream: true });

      for (const line of responseBuffer.split('\n')) {
        if (line.startsWith('data: ')) {
          try {
            const parsed = JSON.parse(line.slice(6)) as McpProxyResponse;
            if (parsed.id === requestId) {
              clearTimeout(readTimeout);
              reader.cancel();
              return parsed;
            }
          } catch {
            // Not valid JSON, skip
          }
        }
      }
      const respLines = responseBuffer.split('\n');
      responseBuffer = respLines[respLines.length - 1] ?? '';
    }

    clearTimeout(readTimeout);
    reader.cancel();
    return errorResponse('No response received from SSE stream');
  } finally {
    clearTimeout(timer);
  }
}

function errorResponse(message: string): McpProxyResponse {
  return {
    jsonrpc: '2.0',
    id: 1,
    error: { code: -32000, message },
  };
}
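Both read loops in the SSE client scan buffered stream text for `data:` lines. The framing step can be isolated as a small helper; `extractDataLines` below is a hypothetical illustration, not part of the committed file:

```typescript
// Given buffered SSE text, return the payload of every complete `data:` line.
// SSE frames are newline-delimited; the payload follows the 'data: ' prefix.
function extractDataLines(buffer: string): string[] {
  const payloads: string[] = [];
  for (const line of buffer.split('\n')) {
    if (line.startsWith('data: ')) payloads.push(line.slice(6).trim());
  }
  return payloads;
}
```

The committed client additionally keeps the last (possibly partial) line in the buffer between reads, so a frame split across two chunks is parsed once it completes.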
src/mcpd/src/services/transport/stdio-client.ts — 119 lines (new file)
@@ -0,0 +1,119 @@
import type { McpOrchestrator } from '../orchestrator.js';
import type { McpProxyResponse } from '../mcp-proxy-service.js';

/**
 * STDIO transport client for MCP servers running as Docker containers.
 *
 * Runs `docker exec` with an inline Node.js script that spawns the MCP server
 * binary, pipes JSON-RPC messages via stdin/stdout, and returns the response.
 *
 * Each call is self-contained: initialize → notifications/initialized → request → response.
 */
export async function sendViaStdio(
  orchestrator: McpOrchestrator,
  containerId: string,
  packageName: string,
  method: string,
  params?: Record<string, unknown>,
  timeoutMs = 30_000,
): Promise<McpProxyResponse> {
  const initMsg = JSON.stringify({
    jsonrpc: '2.0',
    id: 1,
    method: 'initialize',
    params: {
      protocolVersion: '2024-11-05',
      capabilities: {},
      clientInfo: { name: 'mcpctl-proxy', version: '0.1.0' },
    },
  });
  const initializedMsg = JSON.stringify({
    jsonrpc: '2.0',
    method: 'notifications/initialized',
  });

  const requestBody: Record<string, unknown> = {
    jsonrpc: '2.0',
    id: 2,
    method,
  };
  if (params !== undefined) {
    requestBody.params = params;
  }
  const requestMsg = JSON.stringify(requestBody);

  // Inline Node.js script that:
  // 1. Spawns the MCP server binary via npx
  // 2. Sends initialize → initialized → actual request via stdin
  // 3. Reads stdout for JSON-RPC response with id: 2
  // 4. Outputs the full JSON-RPC response to stdout
  const probeScript = `
const { spawn } = require('child_process');
const proc = spawn('npx', ['--prefer-offline', '-y', ${JSON.stringify(packageName)}], { stdio: ['pipe', 'pipe', 'pipe'] });
let output = '';
let responded = false;
proc.stdout.on('data', d => {
  output += d;
  const lines = output.split('\\n');
  for (const line of lines) {
    if (!line.trim()) continue;
    try {
      const msg = JSON.parse(line);
      if (msg.id === 2) {
        responded = true;
        process.stdout.write(JSON.stringify(msg), () => {
          proc.kill();
          process.exit(0);
        });
      }
    } catch {}
  }
  output = lines[lines.length - 1] || '';
});
proc.stderr.on('data', () => {});
proc.on('error', e => { process.stdout.write(JSON.stringify({jsonrpc:'2.0',id:2,error:{code:-32000,message:e.message}})); process.exit(1); });
proc.on('exit', (code) => { if (!responded) { process.stdout.write(JSON.stringify({jsonrpc:'2.0',id:2,error:{code:-32000,message:'process exited '+code}})); process.exit(1); } });
setTimeout(() => { if (!responded) { process.stdout.write(JSON.stringify({jsonrpc:'2.0',id:2,error:{code:-32000,message:'timeout'}})); proc.kill(); process.exit(1); } }, ${timeoutMs - 2000});
proc.stdin.write(${JSON.stringify(initMsg)} + '\\n');
setTimeout(() => {
  proc.stdin.write(${JSON.stringify(initializedMsg)} + '\\n');
  setTimeout(() => {
    proc.stdin.write(${JSON.stringify(requestMsg)} + '\\n');
  }, 500);
}, 500);
`.trim();

  try {
    const result = await orchestrator.execInContainer(
      containerId,
      ['node', '-e', probeScript],
      { timeoutMs },
    );

    if (result.exitCode === 0 && result.stdout.trim()) {
      try {
        return JSON.parse(result.stdout.trim()) as McpProxyResponse;
      } catch {
        return errorResponse(`Failed to parse STDIO response: ${result.stdout.slice(0, 200)}`);
      }
    }

    // Try to parse error response from stdout
    try {
      return JSON.parse(result.stdout.trim()) as McpProxyResponse;
    } catch {
      const errorMsg = result.stderr.trim() || `docker exec exit code ${result.exitCode}`;
      return errorResponse(errorMsg);
    }
  } catch (err) {
    return errorResponse(err instanceof Error ? err.message : String(err));
  }
}

function errorResponse(message: string): McpProxyResponse {
  return {
    jsonrpc: '2.0',
    id: 2,
    error: { code: -32000, message },
  };
}
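The inline probe script treats the child's stdout as newline-delimited JSON-RPC and scans for the message whose `id` matches the request, skipping banners and partial lines. That matching step in isolation (`findResponse` is a hypothetical helper, not part of the committed file):

```typescript
// Scan newline-delimited JSON-RPC output for the response with a given id.
// Non-JSON lines (server banners, partial chunks) are silently skipped,
// mirroring the try/catch in the probe script.
interface RpcMessage { jsonrpc: string; id?: number; result?: unknown; error?: unknown }

function findResponse(stdout: string, id: number): RpcMessage | null {
  for (const line of stdout.split('\n')) {
    if (!line.trim()) continue;
    try {
      const msg = JSON.parse(line) as RpcMessage;
      if (msg.id === id) return msg;
    } catch {
      // Partial or non-JSON line — skip it.
    }
  }
  return null;
}
```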
src/mcpd/src/services/user.service.ts — 60 lines (new file)
@@ -0,0 +1,60 @@
import bcrypt from 'bcrypt';
import type { IUserRepository, SafeUser } from '../repositories/user.repository.js';
import { CreateUserSchema } from '../validation/user.schema.js';
import { NotFoundError, ConflictError } from './mcp-server.service.js';

const SALT_ROUNDS = 10;

export class UserService {
  constructor(private readonly userRepo: IUserRepository) {}

  async list(): Promise<SafeUser[]> {
    return this.userRepo.findAll();
  }

  async getById(id: string): Promise<SafeUser> {
    const user = await this.userRepo.findById(id);
    if (user === null) {
      throw new NotFoundError(`User not found: ${id}`);
    }
    return user;
  }

  async getByEmail(email: string): Promise<SafeUser> {
    const user = await this.userRepo.findByEmail(email);
    if (user === null) {
      throw new NotFoundError(`User not found: ${email}`);
    }
    return user;
  }

  async create(input: unknown): Promise<SafeUser> {
    const data = CreateUserSchema.parse(input);

    const existing = await this.userRepo.findByEmail(data.email);
    if (existing !== null) {
      throw new ConflictError(`User already exists: ${data.email}`);
    }

    const passwordHash = await bcrypt.hash(data.password, SALT_ROUNDS);

    const createData: { email: string; passwordHash: string; name?: string } = {
      email: data.email,
      passwordHash,
    };
    if (data.name !== undefined) {
      createData.name = data.name;
    }

    return this.userRepo.create(createData);
  }

  async delete(id: string): Promise<void> {
    await this.getById(id);
    await this.userRepo.delete(id);
  }

  async count(): Promise<number> {
    return this.userRepo.count();
  }
}
src/mcpd/src/validation/group.schema.ts — 15 lines (new file)
@@ -0,0 +1,15 @@
import { z } from 'zod';

export const CreateGroupSchema = z.object({
  name: z.string().min(1).max(100).regex(/^[a-z0-9-]+$/, 'Name must be lowercase alphanumeric with hyphens'),
  description: z.string().max(1000).default(''),
  members: z.array(z.string().email()).default([]),
});

export const UpdateGroupSchema = z.object({
  description: z.string().max(1000).optional(),
  members: z.array(z.string().email()).optional(),
});

export type CreateGroupInput = z.infer<typeof CreateGroupSchema>;
export type UpdateGroupInput = z.infer<typeof UpdateGroupSchema>;
@@ -2,3 +2,5 @@ export { CreateMcpServerSchema, UpdateMcpServerSchema } from './mcp-server.schem
export type { CreateMcpServerInput, UpdateMcpServerInput } from './mcp-server.schema.js';
export { CreateProjectSchema, UpdateProjectSchema } from './project.schema.js';
export type { CreateProjectInput, UpdateProjectInput } from './project.schema.js';
export { CreateRbacDefinitionSchema, UpdateRbacDefinitionSchema, RbacSubjectSchema, RbacRoleBindingSchema, RBAC_ROLES, RBAC_RESOURCES } from './rbac-definition.schema.js';
export type { CreateRbacDefinitionInput, UpdateRbacDefinitionInput, RbacSubject, RbacRoleBinding } from './rbac-definition.schema.js';
@@ -3,10 +3,25 @@ import { z } from 'zod';
export const CreateProjectSchema = z.object({
  name: z.string().min(1).max(100).regex(/^[a-z0-9-]+$/, 'Name must be lowercase alphanumeric with hyphens'),
  description: z.string().max(1000).default(''),
  prompt: z.string().max(10000).default(''),
  proxyMode: z.enum(['direct', 'filtered']).default('direct'),
  gated: z.boolean().default(true),
  llmProvider: z.string().max(100).optional(),
  llmModel: z.string().max(100).optional(),
  servers: z.array(z.string().min(1)).default([]),
}).refine(
  (d) => d.proxyMode !== 'filtered' || d.llmProvider,
  { message: 'llmProvider is required when proxyMode is "filtered"' },
);

export const UpdateProjectSchema = z.object({
  description: z.string().max(1000).optional(),
  prompt: z.string().max(10000).optional(),
  proxyMode: z.enum(['direct', 'filtered']).optional(),
  gated: z.boolean().optional(),
  llmProvider: z.string().max(100).nullable().optional(),
  llmModel: z.string().max(100).nullable().optional(),
  servers: z.array(z.string().min(1)).optional(),
});

export type CreateProjectInput = z.infer<typeof CreateProjectSchema>;
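The `.refine()` on CreateProjectSchema enforces a cross-field rule: filtered proxy mode needs an LLM provider. The same predicate in plain TypeScript, without zod (`proxyModeValid` and the 'openai' value in the test are hypothetical, for illustration only):

```typescript
// Cross-field rule from CreateProjectSchema's refine(): 'filtered' mode routes
// traffic through an LLM, so a provider must be set; 'direct' mode needs none.
function proxyModeValid(proxyMode: 'direct' | 'filtered', llmProvider?: string): boolean {
  return proxyMode !== 'filtered' || Boolean(llmProvider);
}
```

zod runs such refinements only after the per-field checks pass, so the error message fires exactly when a structurally valid payload picks `filtered` without a provider.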
src/mcpd/src/validation/prompt.schema.ts — 36 lines (new file)
@@ -0,0 +1,36 @@
import { z } from 'zod';

const LINK_TARGET_RE = /^[a-z0-9-]+\/[a-z0-9-]+:\S+$/;

export const CreatePromptSchema = z.object({
  name: z.string().min(1).max(100).regex(/^[a-z0-9-]+$/, 'Name must be lowercase alphanumeric with hyphens'),
  content: z.string().min(1).max(50000),
  projectId: z.string().optional(),
  priority: z.number().int().min(1).max(10).default(5).optional(),
  linkTarget: z.string().regex(LINK_TARGET_RE, 'Link target must be project/server:resource-uri').optional(),
});

export const UpdatePromptSchema = z.object({
  content: z.string().min(1).max(50000).optional(),
  priority: z.number().int().min(1).max(10).optional(),
  // linkTarget intentionally excluded — links are immutable
});

export const CreatePromptRequestSchema = z.object({
  name: z.string().min(1).max(100).regex(/^[a-z0-9-]+$/, 'Name must be lowercase alphanumeric with hyphens'),
  content: z.string().min(1).max(50000),
  projectId: z.string().optional(),
  priority: z.number().int().min(1).max(10).default(5).optional(),
  createdBySession: z.string().optional(),
  createdByUserId: z.string().optional(),
});

export const UpdatePromptRequestSchema = z.object({
  content: z.string().min(1).max(50000).optional(),
  priority: z.number().int().min(1).max(10).optional(),
});

export type CreatePromptInput = z.infer<typeof CreatePromptSchema>;
export type UpdatePromptInput = z.infer<typeof UpdatePromptSchema>;
export type CreatePromptRequestInput = z.infer<typeof CreatePromptRequestSchema>;
export type UpdatePromptRequestInput = z.infer<typeof UpdatePromptRequestSchema>;
src/mcpd/src/validation/rbac-definition.schema.ts — 73 lines (new file)
@@ -0,0 +1,73 @@
import { z } from 'zod';

export const RBAC_ROLES = ['edit', 'view', 'create', 'delete', 'run', 'expose'] as const;
export const RBAC_RESOURCES = ['*', 'servers', 'instances', 'secrets', 'projects', 'templates', 'users', 'groups', 'rbac', 'prompts', 'promptrequests'] as const;

/** Singular→plural map for resource names. */
const RESOURCE_ALIASES: Record<string, string> = {
  server: 'servers',
  instance: 'instances',
  secret: 'secrets',
  project: 'projects',
  template: 'templates',
  user: 'users',
  group: 'groups',
  prompt: 'prompts',
  promptrequest: 'promptrequests',
};

/** Normalize a resource name to its canonical plural form. */
export function normalizeResource(resource: string): string {
  return RESOURCE_ALIASES[resource] ?? resource;
}

export const RbacSubjectSchema = z.object({
  kind: z.enum(['User', 'Group', 'ServiceAccount']),
  name: z.string().min(1),
});

/** Resource binding: role grants access to a resource type (optionally scoped to a named instance). */
export const ResourceBindingSchema = z.object({
  role: z.enum(RBAC_ROLES),
  resource: z.string().min(1).transform(normalizeResource),
  name: z.string().min(1).optional(),
});

/** Operation binding: 'run' role grants access to a named operation. */
export const OperationBindingSchema = z.object({
  role: z.literal('run'),
  action: z.string().min(1),
});

/** Union of both binding types. */
export const RbacRoleBindingSchema = z.union([
  ResourceBindingSchema,
  OperationBindingSchema,
]);

export type RbacSubject = z.infer<typeof RbacSubjectSchema>;
export type ResourceBinding = z.infer<typeof ResourceBindingSchema>;
export type OperationBinding = z.infer<typeof OperationBindingSchema>;
export type RbacRoleBinding = z.infer<typeof RbacRoleBindingSchema>;

export function isResourceBinding(b: RbacRoleBinding): b is ResourceBinding {
  return 'resource' in b;
}

export function isOperationBinding(b: RbacRoleBinding): b is OperationBinding {
  return 'action' in b;
}

export const CreateRbacDefinitionSchema = z.object({
  name: z.string().min(1).max(100).regex(/^[a-z0-9-]+$/, 'Name must be lowercase alphanumeric with hyphens'),
  subjects: z.array(RbacSubjectSchema).min(1),
  roleBindings: z.array(RbacRoleBindingSchema).min(1),
});

export const UpdateRbacDefinitionSchema = z.object({
  subjects: z.array(RbacSubjectSchema).min(1).optional(),
  roleBindings: z.array(RbacRoleBindingSchema).min(1).optional(),
});

export type CreateRbacDefinitionInput = z.infer<typeof CreateRbacDefinitionSchema>;
export type UpdateRbacDefinitionInput = z.infer<typeof UpdateRbacDefinitionSchema>;
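`normalizeResource` is the canonicalization both the schema transform and RbacService depend on: singular aliases collapse to plurals, while already-plural names and the `*` wildcard pass through untouched. A standalone re-declaration of the same table, for illustration:

```typescript
// Singular→plural canonicalization, mirroring rbac-definition.schema.ts.
const RESOURCE_ALIASES: Record<string, string> = {
  server: 'servers',
  instance: 'instances',
  secret: 'secrets',
  project: 'projects',
  template: 'templates',
  user: 'users',
  group: 'groups',
  prompt: 'prompts',
  promptrequest: 'promptrequests',
};

function normalizeResource(resource: string): string {
  // Unknown names (including plurals and '*') fall through unchanged.
  return RESOURCE_ALIASES[resource] ?? resource;
}
```

Because both sides of every comparison in canAccess are normalized, a binding written as `resource: 'server'` matches a check against `'servers'` and vice versa.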
src/mcpd/src/validation/user.schema.ts — 15 lines (new file)
@@ -0,0 +1,15 @@
import { z } from 'zod';

export const CreateUserSchema = z.object({
  email: z.string().email(),
  password: z.string().min(8).max(128),
  name: z.string().max(100).optional(),
});

export const UpdateUserSchema = z.object({
  name: z.string().max(100).optional(),
  password: z.string().min(8).max(128).optional(),
});

export type CreateUserInput = z.infer<typeof CreateUserSchema>;
export type UpdateUserInput = z.infer<typeof UpdateUserSchema>;
src/mcpd/tests/auth-bootstrap.test.ts — 424 lines (new file)
@@ -0,0 +1,424 @@
import { describe, it, expect, vi, afterEach, beforeEach } from 'vitest';
import Fastify from 'fastify';
import type { FastifyInstance } from 'fastify';
import { registerAuthRoutes } from '../src/routes/auth.js';
import { errorHandler } from '../src/middleware/error-handler.js';
import type { AuthService, LoginResult } from '../src/services/auth.service.js';
import type { UserService } from '../src/services/user.service.js';
import type { GroupService } from '../src/services/group.service.js';
import type { RbacDefinitionService } from '../src/services/rbac-definition.service.js';
import type { RbacService, RbacAction } from '../src/services/rbac.service.js';
import type { SafeUser } from '../src/repositories/user.repository.js';
import type { RbacDefinition } from '@prisma/client';

let app: FastifyInstance;

afterEach(async () => {
  if (app) await app.close();
});

function makeLoginResult(overrides?: Partial<LoginResult>): LoginResult {
  return {
    token: 'test-token-123',
    expiresAt: new Date(Date.now() + 86400_000),
    user: { id: 'user-1', email: 'admin@example.com', role: 'user' },
    ...overrides,
  };
}

function makeSafeUser(overrides?: Partial<SafeUser>): SafeUser {
  return {
    id: 'user-1',
    email: 'admin@example.com',
    name: null,
    role: 'user',
    provider: 'local',
    externalId: null,
    version: 1,
    createdAt: new Date(),
    updatedAt: new Date(),
    ...overrides,
  };
}

function makeRbacDef(overrides?: Partial<RbacDefinition>): RbacDefinition {
  return {
    id: 'rbac-1',
    name: 'bootstrap-admin',
    subjects: [{ kind: 'Group', name: 'admin' }],
    roleBindings: [
      { role: 'edit', resource: '*' },
      { role: 'run', resource: '*' },
      { role: 'run', action: 'impersonate' },
      { role: 'run', action: 'logs' },
      { role: 'run', action: 'backup' },
      { role: 'run', action: 'restore' },
      { role: 'run', action: 'audit-purge' },
    ],
    version: 1,
    createdAt: new Date(),
    updatedAt: new Date(),
    ...overrides,
  };
}

interface MockDeps {
  authService: {
    login: ReturnType<typeof vi.fn>;
    logout: ReturnType<typeof vi.fn>;
    findSession: ReturnType<typeof vi.fn>;
    impersonate: ReturnType<typeof vi.fn>;
  };
  userService: {
    count: ReturnType<typeof vi.fn>;
    create: ReturnType<typeof vi.fn>;
    list: ReturnType<typeof vi.fn>;
    getById: ReturnType<typeof vi.fn>;
    getByEmail: ReturnType<typeof vi.fn>;
    delete: ReturnType<typeof vi.fn>;
  };
  groupService: {
    create: ReturnType<typeof vi.fn>;
    list: ReturnType<typeof vi.fn>;
    getById: ReturnType<typeof vi.fn>;
    getByName: ReturnType<typeof vi.fn>;
    update: ReturnType<typeof vi.fn>;
    delete: ReturnType<typeof vi.fn>;
  };
  rbacDefinitionService: {
    create: ReturnType<typeof vi.fn>;
    list: ReturnType<typeof vi.fn>;
    getById: ReturnType<typeof vi.fn>;
    getByName: ReturnType<typeof vi.fn>;
    update: ReturnType<typeof vi.fn>;
    delete: ReturnType<typeof vi.fn>;
  };
  rbacService: {
    canAccess: ReturnType<typeof vi.fn>;
    canRunOperation: ReturnType<typeof vi.fn>;
    getPermissions: ReturnType<typeof vi.fn>;
  };
}

function createMockDeps(): MockDeps {
  return {
    authService: {
      login: vi.fn(async () => makeLoginResult()),
      logout: vi.fn(async () => {}),
      findSession: vi.fn(async () => null),
      impersonate: vi.fn(async () => makeLoginResult({ token: 'impersonated-token' })),
    },
    userService: {
      count: vi.fn(async () => 0),
      create: vi.fn(async () => makeSafeUser()),
      list: vi.fn(async () => []),
      getById: vi.fn(async () => makeSafeUser()),
      getByEmail: vi.fn(async () => makeSafeUser()),
      delete: vi.fn(async () => {}),
    },
    groupService: {
      create: vi.fn(async () => ({ id: 'grp-1', name: 'admin', description: 'Bootstrap admin group', members: [] })),
      list: vi.fn(async () => []),
      getById: vi.fn(async () => null),
      getByName: vi.fn(async () => null),
      update: vi.fn(async () => null),
      delete: vi.fn(async () => {}),
    },
    rbacDefinitionService: {
      create: vi.fn(async () => makeRbacDef()),
      list: vi.fn(async () => []),
      getById: vi.fn(async () => makeRbacDef()),
      getByName: vi.fn(async () => null),
      update: vi.fn(async () => makeRbacDef()),
      delete: vi.fn(async () => {}),
    },
    rbacService: {
      canAccess: vi.fn(async () => false),
      canRunOperation: vi.fn(async () => false),
      getPermissions: vi.fn(async () => []),
    },
  };
}
|
||||
function createApp(deps: MockDeps): Promise<FastifyInstance> {
|
||||
app = Fastify({ logger: false });
|
||||
app.setErrorHandler(errorHandler);
|
||||
registerAuthRoutes(app, deps as unknown as {
|
||||
authService: AuthService;
|
||||
userService: UserService;
|
||||
groupService: GroupService;
|
||||
rbacDefinitionService: RbacDefinitionService;
|
||||
rbacService: RbacService;
|
||||
});
|
||||
return app.ready();
|
||||
}
|
||||
|
||||
describe('Auth Bootstrap', () => {
|
||||
describe('GET /api/v1/auth/status', () => {
|
||||
it('returns hasUsers: false when no users exist', async () => {
|
||||
const deps = createMockDeps();
|
||||
deps.userService.count.mockResolvedValue(0);
|
||||
await createApp(deps);
|
||||
|
||||
const res = await app.inject({ method: 'GET', url: '/api/v1/auth/status' });
|
||||
expect(res.statusCode).toBe(200);
|
||||
expect(res.json<{ hasUsers: boolean }>().hasUsers).toBe(false);
|
||||
});
|
||||
|
||||
it('returns hasUsers: true when users exist', async () => {
|
||||
const deps = createMockDeps();
|
||||
deps.userService.count.mockResolvedValue(1);
|
||||
await createApp(deps);
|
||||
|
||||
const res = await app.inject({ method: 'GET', url: '/api/v1/auth/status' });
|
||||
expect(res.statusCode).toBe(200);
|
||||
expect(res.json<{ hasUsers: boolean }>().hasUsers).toBe(true);
|
||||
});
|
||||
});
|
||||
|
||||
describe('POST /api/v1/auth/bootstrap', () => {
|
||||
it('creates admin user, admin group, RBAC definition targeting group, and returns session token', async () => {
|
||||
const deps = createMockDeps();
|
||||
deps.userService.count.mockResolvedValue(0);
|
||||
await createApp(deps);
|
||||
|
||||
const res = await app.inject({
|
||||
method: 'POST',
|
||||
url: '/api/v1/auth/bootstrap',
|
||||
payload: { email: 'admin@example.com', password: 'securepass123' },
|
||||
});
|
||||
|
||||
expect(res.statusCode).toBe(201);
|
||||
const body = res.json<LoginResult>();
|
||||
expect(body.token).toBe('test-token-123');
|
||||
expect(body.user.email).toBe('admin@example.com');
|
||||
|
||||
// Verify user was created
|
||||
expect(deps.userService.create).toHaveBeenCalledWith({
|
||||
email: 'admin@example.com',
|
||||
password: 'securepass123',
|
||||
});
|
||||
|
||||
// Verify admin group was created with the user as member
|
||||
expect(deps.groupService.create).toHaveBeenCalledWith({
|
||||
name: 'admin',
|
||||
description: 'Bootstrap admin group',
|
||||
members: ['admin@example.com'],
|
||||
});
|
||||
|
||||
// Verify RBAC definition targets the Group, not the User
|
||||
expect(deps.rbacDefinitionService.create).toHaveBeenCalledWith({
|
||||
name: 'bootstrap-admin',
|
||||
subjects: [{ kind: 'Group', name: 'admin' }],
|
||||
roleBindings: [
|
||||
{ role: 'edit', resource: '*' },
|
||||
{ role: 'run', resource: '*' },
|
||||
{ role: 'run', action: 'impersonate' },
|
||||
{ role: 'run', action: 'logs' },
|
||||
{ role: 'run', action: 'backup' },
|
||||
{ role: 'run', action: 'restore' },
|
||||
{ role: 'run', action: 'audit-purge' },
|
||||
],
|
||||
});
|
||||
|
||||
// Verify auto-login was called
|
||||
expect(deps.authService.login).toHaveBeenCalledWith('admin@example.com', 'securepass123');
|
||||
});
|
||||
|
||||
it('passes name when provided', async () => {
|
||||
const deps = createMockDeps();
|
||||
deps.userService.count.mockResolvedValue(0);
|
||||
await createApp(deps);
|
||||
|
||||
await app.inject({
|
||||
method: 'POST',
|
||||
url: '/api/v1/auth/bootstrap',
|
||||
payload: { email: 'admin@example.com', password: 'securepass123', name: 'Admin User' },
|
||||
});
|
||||
|
||||
expect(deps.userService.create).toHaveBeenCalledWith({
|
||||
email: 'admin@example.com',
|
||||
password: 'securepass123',
|
||||
name: 'Admin User',
|
||||
});
|
||||
});
|
||||
|
||||
it('returns 409 when users already exist', async () => {
|
||||
const deps = createMockDeps();
|
||||
deps.userService.count.mockResolvedValue(1);
|
||||
await createApp(deps);
|
||||
|
||||
const res = await app.inject({
|
||||
method: 'POST',
|
||||
url: '/api/v1/auth/bootstrap',
|
||||
payload: { email: 'admin@example.com', password: 'securepass123' },
|
||||
});
|
||||
|
||||
expect(res.statusCode).toBe(409);
|
||||
expect(res.json<{ error: string }>().error).toContain('Users already exist');
|
||||
|
||||
// Should NOT have created user, group, or RBAC
|
||||
expect(deps.userService.create).not.toHaveBeenCalled();
|
||||
expect(deps.groupService.create).not.toHaveBeenCalled();
|
||||
expect(deps.rbacDefinitionService.create).not.toHaveBeenCalled();
|
||||
});
|
||||
|
||||
it('validates email and password via UserService', async () => {
|
||||
const deps = createMockDeps();
|
||||
deps.userService.count.mockResolvedValue(0);
|
||||
// Simulate Zod validation error from UserService
|
||||
deps.userService.create.mockRejectedValue(
|
||||
Object.assign(new Error('Validation error'), { statusCode: 400, issues: [] }),
|
||||
);
|
||||
await createApp(deps);
|
||||
|
||||
const res = await app.inject({
|
||||
method: 'POST',
|
||||
url: '/api/v1/auth/bootstrap',
|
||||
payload: { email: 'not-an-email', password: 'short' },
|
||||
});
|
||||
|
||||
// The error handler should handle the validation error
|
||||
expect(res.statusCode).toBeGreaterThanOrEqual(400);
|
||||
});
|
||||
});
|
||||
|
||||
describe('POST /api/v1/auth/login', () => {
|
||||
it('logs in successfully', async () => {
|
||||
const deps = createMockDeps();
|
||||
await createApp(deps);
|
||||
|
||||
const res = await app.inject({
|
||||
method: 'POST',
|
||||
url: '/api/v1/auth/login',
|
||||
payload: { email: 'admin@example.com', password: 'securepass123' },
|
||||
});
|
||||
|
||||
expect(res.statusCode).toBe(200);
|
||||
expect(res.json<LoginResult>().token).toBe('test-token-123');
|
||||
});
|
||||
});
|
||||
|
||||
describe('POST /api/v1/auth/logout', () => {
|
||||
it('logs out with valid token', async () => {
|
||||
const deps = createMockDeps();
|
||||
deps.authService.findSession.mockResolvedValue({
|
||||
userId: 'user-1',
|
||||
expiresAt: new Date(Date.now() + 86400_000),
|
||||
});
|
||||
await createApp(deps);
|
||||
|
||||
const res = await app.inject({
|
||||
method: 'POST',
|
||||
url: '/api/v1/auth/logout',
|
||||
headers: { authorization: 'Bearer valid-token' },
|
||||
});
|
||||
|
||||
expect(res.statusCode).toBe(200);
|
||||
expect(res.json<{ success: boolean }>().success).toBe(true);
|
||||
expect(deps.authService.logout).toHaveBeenCalledWith('valid-token');
|
||||
});
|
||||
|
||||
it('returns 401 without auth', async () => {
|
||||
const deps = createMockDeps();
|
||||
await createApp(deps);
|
||||
|
||||
const res = await app.inject({
|
||||
method: 'POST',
|
||||
url: '/api/v1/auth/logout',
|
||||
});
|
||||
|
||||
expect(res.statusCode).toBe(401);
|
||||
});
|
||||
});
|
||||
|
||||
describe('POST /api/v1/auth/impersonate', () => {
|
||||
it('creates session for target user when caller is admin', async () => {
|
||||
const deps = createMockDeps();
|
||||
// Auth: valid session
|
||||
deps.authService.findSession.mockResolvedValue({
|
||||
userId: 'admin-user-id',
|
||||
expiresAt: new Date(Date.now() + 86400_000),
|
||||
});
|
||||
// RBAC: allow impersonate operation
|
||||
deps.rbacService.canRunOperation.mockResolvedValue(true);
|
||||
// Impersonate returns token for target
|
||||
deps.authService.impersonate.mockResolvedValue(
|
||||
makeLoginResult({ token: 'impersonated-token', user: { id: 'user-2', email: 'target@example.com', role: 'user' } }),
|
||||
);
|
||||
await createApp(deps);
|
||||
|
||||
const res = await app.inject({
|
||||
method: 'POST',
|
||||
url: '/api/v1/auth/impersonate',
|
||||
headers: { authorization: 'Bearer admin-token' },
|
||||
payload: { email: 'target@example.com' },
|
||||
});
|
||||
|
||||
expect(res.statusCode).toBe(200);
|
||||
const body = res.json<LoginResult>();
|
||||
expect(body.token).toBe('impersonated-token');
|
||||
expect(body.user.email).toBe('target@example.com');
|
||||
expect(deps.authService.impersonate).toHaveBeenCalledWith('target@example.com');
|
||||
});
|
||||
|
||||
it('returns 401 without auth', async () => {
|
||||
const deps = createMockDeps();
|
||||
await createApp(deps);
|
||||
|
||||
const res = await app.inject({
|
||||
method: 'POST',
|
||||
url: '/api/v1/auth/impersonate',
|
||||
payload: { email: 'target@example.com' },
|
||||
});
|
||||
|
||||
expect(res.statusCode).toBe(401);
|
||||
});
|
||||
|
||||
it('returns 403 when caller lacks admin permission on users', async () => {
|
||||
const deps = createMockDeps();
|
||||
// Auth: valid session
|
||||
deps.authService.findSession.mockResolvedValue({
|
||||
userId: 'non-admin-id',
|
||||
expiresAt: new Date(Date.now() + 86400_000),
|
||||
});
|
||||
// RBAC: deny
|
||||
deps.rbacService.canRunOperation.mockResolvedValue(false);
|
||||
await createApp(deps);
|
||||
|
||||
const res = await app.inject({
|
||||
method: 'POST',
|
||||
url: '/api/v1/auth/impersonate',
|
||||
headers: { authorization: 'Bearer regular-token' },
|
||||
payload: { email: 'target@example.com' },
|
||||
});
|
||||
|
||||
expect(res.statusCode).toBe(403);
|
||||
});
|
||||
|
||||
it('returns 401 when impersonation target does not exist', async () => {
|
||||
const deps = createMockDeps();
|
||||
// Auth: valid session
|
||||
deps.authService.findSession.mockResolvedValue({
|
||||
userId: 'admin-user-id',
|
||||
expiresAt: new Date(Date.now() + 86400_000),
|
||||
});
|
||||
// RBAC: allow
|
||||
deps.rbacService.canRunOperation.mockResolvedValue(true);
|
||||
// Impersonate fails — user not found
|
||||
const authError = new Error('User not found');
|
||||
(authError as Error & { statusCode: number }).statusCode = 401;
|
||||
deps.authService.impersonate.mockRejectedValue(authError);
|
||||
await createApp(deps);
|
||||
|
||||
const res = await app.inject({
|
||||
method: 'POST',
|
||||
url: '/api/v1/auth/impersonate',
|
||||
headers: { authorization: 'Bearer admin-token' },
|
||||
payload: { email: 'nonexistent@example.com' },
|
||||
});
|
||||
|
||||
expect(res.statusCode).toBe(401);
|
||||
});
|
||||
});
|
||||
});
|
||||
@@ -6,6 +6,9 @@ import { encrypt, decrypt, isSensitiveKey } from '../src/services/backup/crypto.
|
||||
import { registerBackupRoutes } from '../src/routes/backup.js';
|
||||
import type { IMcpServerRepository, ISecretRepository } from '../src/repositories/interfaces.js';
|
||||
import type { IProjectRepository } from '../src/repositories/project.repository.js';
|
||||
import type { IUserRepository } from '../src/repositories/user.repository.js';
|
||||
import type { IGroupRepository } from '../src/repositories/group.repository.js';
|
||||
import type { IRbacDefinitionRepository } from '../src/repositories/rbac-definition.repository.js';
|
||||
|
||||
// Mock data
|
||||
const mockServers = [
|
||||
@@ -31,8 +34,32 @@ const mockSecrets = [
|
||||
|
||||
const mockProjects = [
|
||||
{
|
||||
id: 'proj1', name: 'my-project', description: 'Test project',
|
||||
id: 'proj1', name: 'my-project', description: 'Test project', proxyMode: 'direct', llmProvider: null, llmModel: null,
|
||||
ownerId: 'user1', version: 1, createdAt: new Date(), updatedAt: new Date(),
|
||||
servers: [{ id: 'ps1', server: { id: 's1', name: 'github' } }],
|
||||
},
|
||||
];
|
||||
|
||||
const mockUsers = [
|
||||
{ id: 'u1', email: 'alice@test.com', name: 'Alice', role: 'ADMIN', provider: null, externalId: null, version: 1, createdAt: new Date(), updatedAt: new Date() },
|
||||
{ id: 'u2', email: 'bob@test.com', name: null, role: 'USER', provider: 'oidc', externalId: null, version: 1, createdAt: new Date(), updatedAt: new Date() },
|
||||
];
|
||||
|
||||
const mockGroups = [
|
||||
{
|
||||
id: 'g1', name: 'dev-team', description: 'Developers', version: 1, createdAt: new Date(), updatedAt: new Date(),
|
||||
members: [
|
||||
{ id: 'gm1', user: { id: 'u1', email: 'alice@test.com', name: 'Alice' } },
|
||||
{ id: 'gm2', user: { id: 'u2', email: 'bob@test.com', name: null } },
|
||||
],
|
||||
},
|
||||
];
|
||||
|
||||
const mockRbacDefinitions = [
|
||||
{
|
||||
id: 'rbac1', name: 'admins', version: 1, createdAt: new Date(), updatedAt: new Date(),
|
||||
subjects: [{ kind: 'User', name: 'alice@test.com' }],
|
||||
roleBindings: [{ role: 'edit', resource: '*' }],
|
||||
},
|
||||
];
|
||||
|
||||
@@ -63,9 +90,47 @@ function mockProjectRepo(): IProjectRepository {
|
||||
findAll: vi.fn(async () => [...mockProjects]),
|
||||
findById: vi.fn(async (id: string) => mockProjects.find((p) => p.id === id) ?? null),
|
||||
findByName: vi.fn(async () => null),
|
||||
create: vi.fn(async (data) => ({ id: 'new-proj', ...data, version: 1, createdAt: new Date(), updatedAt: new Date() } as typeof mockProjects[0])),
|
||||
create: vi.fn(async (data) => ({ id: 'new-proj', ...data, servers: [], version: 1, createdAt: new Date(), updatedAt: new Date() } as typeof mockProjects[0])),
|
||||
update: vi.fn(async (id, data) => ({ ...mockProjects.find((p) => p.id === id)!, ...data })),
|
||||
delete: vi.fn(async () => {}),
|
||||
setServers: vi.fn(async () => {}),
|
||||
addServer: vi.fn(async () => {}),
|
||||
removeServer: vi.fn(async () => {}),
|
||||
};
|
||||
}
|
||||
|
||||
function mockUserRepo(): IUserRepository {
|
||||
return {
|
||||
findAll: vi.fn(async () => [...mockUsers]),
|
||||
findById: vi.fn(async (id: string) => mockUsers.find((u) => u.id === id) ?? null),
|
||||
findByEmail: vi.fn(async (email: string) => mockUsers.find((u) => u.email === email) ?? null),
|
||||
create: vi.fn(async (data) => ({ id: 'new-u', ...data, provider: null, externalId: null, version: 1, createdAt: new Date(), updatedAt: new Date() } as typeof mockUsers[0])),
|
||||
delete: vi.fn(async () => {}),
|
||||
count: vi.fn(async () => mockUsers.length),
|
||||
};
|
||||
}
|
||||
|
||||
function mockGroupRepo(): IGroupRepository {
|
||||
return {
|
||||
findAll: vi.fn(async () => [...mockGroups]),
|
||||
findById: vi.fn(async (id: string) => mockGroups.find((g) => g.id === id) ?? null),
|
||||
findByName: vi.fn(async (name: string) => mockGroups.find((g) => g.name === name) ?? null),
|
||||
create: vi.fn(async (data) => ({ id: 'new-g', ...data, version: 1, createdAt: new Date(), updatedAt: new Date() } as typeof mockGroups[0])),
|
||||
update: vi.fn(async (id, data) => ({ ...mockGroups.find((g) => g.id === id)!, ...data })),
|
||||
delete: vi.fn(async () => {}),
|
||||
setMembers: vi.fn(async () => {}),
|
||||
findGroupsForUser: vi.fn(async () => []),
|
||||
};
|
||||
}
|
||||
|
||||
function mockRbacRepo(): IRbacDefinitionRepository {
|
||||
return {
|
||||
findAll: vi.fn(async () => [...mockRbacDefinitions]),
|
||||
findById: vi.fn(async (id: string) => mockRbacDefinitions.find((r) => r.id === id) ?? null),
|
||||
findByName: vi.fn(async (name: string) => mockRbacDefinitions.find((r) => r.name === name) ?? null),
|
||||
create: vi.fn(async (data) => ({ id: 'new-rbac', ...data, version: 1, createdAt: new Date(), updatedAt: new Date() } as typeof mockRbacDefinitions[0])),
|
||||
update: vi.fn(async (id, data) => ({ ...mockRbacDefinitions.find((r) => r.id === id)!, ...data })),
|
||||
delete: vi.fn(async () => {}),
|
||||
};
|
||||
}
|
||||
|
||||
@@ -110,7 +175,7 @@ describe('BackupService', () => {
|
||||
let backupService: BackupService;
|
||||
|
||||
beforeEach(() => {
|
||||
backupService = new BackupService(mockServerRepo(), mockProjectRepo(), mockSecretRepo());
|
||||
backupService = new BackupService(mockServerRepo(), mockProjectRepo(), mockSecretRepo(), mockUserRepo(), mockGroupRepo(), mockRbacRepo());
|
||||
});
|
||||
|
||||
it('creates backup with all resources', async () => {
|
||||
@@ -126,11 +191,50 @@ describe('BackupService', () => {
|
||||
expect(bundle.projects[0]!.name).toBe('my-project');
|
||||
});
|
||||
|
||||
it('includes users in backup', async () => {
|
||||
const bundle = await backupService.createBackup();
|
||||
expect(bundle.users).toHaveLength(2);
|
||||
expect(bundle.users![0]!.email).toBe('alice@test.com');
|
||||
expect(bundle.users![0]!.role).toBe('ADMIN');
|
||||
expect(bundle.users![1]!.email).toBe('bob@test.com');
|
||||
expect(bundle.users![1]!.provider).toBe('oidc');
|
||||
});
|
||||
|
||||
it('includes groups in backup with member emails', async () => {
|
||||
const bundle = await backupService.createBackup();
|
||||
expect(bundle.groups).toHaveLength(1);
|
||||
expect(bundle.groups![0]!.name).toBe('dev-team');
|
||||
expect(bundle.groups![0]!.memberEmails).toEqual(['alice@test.com', 'bob@test.com']);
|
||||
});
|
||||
|
||||
it('includes rbac bindings in backup', async () => {
|
||||
const bundle = await backupService.createBackup();
|
||||
expect(bundle.rbacBindings).toHaveLength(1);
|
||||
expect(bundle.rbacBindings![0]!.name).toBe('admins');
|
||||
expect(bundle.rbacBindings![0]!.subjects).toEqual([{ kind: 'User', name: 'alice@test.com' }]);
|
||||
});
|
||||
|
||||
it('includes enriched projects with server names', async () => {
|
||||
const bundle = await backupService.createBackup();
|
||||
const proj = bundle.projects[0]!;
|
||||
expect(proj.proxyMode).toBe('direct');
|
||||
expect(proj.serverNames).toEqual(['github']);
|
||||
});
|
||||
|
||||
it('filters resources', async () => {
|
||||
const bundle = await backupService.createBackup({ resources: ['servers'] });
|
||||
expect(bundle.servers).toHaveLength(2);
|
||||
expect(bundle.secrets).toHaveLength(0);
|
||||
expect(bundle.projects).toHaveLength(0);
|
||||
expect(bundle.users).toHaveLength(0);
|
||||
expect(bundle.groups).toHaveLength(0);
|
||||
expect(bundle.rbacBindings).toHaveLength(0);
|
||||
});
|
||||
|
||||
it('filters to only users', async () => {
|
||||
const bundle = await backupService.createBackup({ resources: ['users'] });
|
||||
expect(bundle.servers).toHaveLength(0);
|
||||
expect(bundle.users).toHaveLength(2);
|
||||
});
|
||||
|
||||
it('encrypts sensitive secret values when password provided', async () => {
|
||||
@@ -150,13 +254,22 @@ describe('BackupService', () => {
|
||||
(emptySecretRepo.findAll as ReturnType<typeof vi.fn>).mockResolvedValue([]);
|
||||
const emptyProjectRepo = mockProjectRepo();
|
||||
(emptyProjectRepo.findAll as ReturnType<typeof vi.fn>).mockResolvedValue([]);
|
||||
const emptyUserRepo = mockUserRepo();
|
||||
(emptyUserRepo.findAll as ReturnType<typeof vi.fn>).mockResolvedValue([]);
|
||||
const emptyGroupRepo = mockGroupRepo();
|
||||
(emptyGroupRepo.findAll as ReturnType<typeof vi.fn>).mockResolvedValue([]);
|
||||
const emptyRbacRepo = mockRbacRepo();
|
||||
(emptyRbacRepo.findAll as ReturnType<typeof vi.fn>).mockResolvedValue([]);
|
||||
|
||||
const service = new BackupService(emptyServerRepo, emptyProjectRepo, emptySecretRepo);
|
||||
const service = new BackupService(emptyServerRepo, emptyProjectRepo, emptySecretRepo, emptyUserRepo, emptyGroupRepo, emptyRbacRepo);
|
||||
const bundle = await service.createBackup();
|
||||
|
||||
expect(bundle.servers).toHaveLength(0);
|
||||
expect(bundle.secrets).toHaveLength(0);
|
||||
expect(bundle.projects).toHaveLength(0);
|
||||
expect(bundle.users).toHaveLength(0);
|
||||
expect(bundle.groups).toHaveLength(0);
|
||||
expect(bundle.rbacBindings).toHaveLength(0);
|
||||
});
|
||||
});
|
||||
|
||||
@@ -165,16 +278,25 @@ describe('RestoreService', () => {
|
||||
let serverRepo: IMcpServerRepository;
|
||||
let secretRepo: ISecretRepository;
|
||||
let projectRepo: IProjectRepository;
|
||||
let userRepo: IUserRepository;
|
||||
let groupRepo: IGroupRepository;
|
||||
let rbacRepo: IRbacDefinitionRepository;
|
||||
|
||||
beforeEach(() => {
|
||||
serverRepo = mockServerRepo();
|
||||
secretRepo = mockSecretRepo();
|
||||
projectRepo = mockProjectRepo();
|
||||
userRepo = mockUserRepo();
|
||||
groupRepo = mockGroupRepo();
|
||||
rbacRepo = mockRbacRepo();
|
||||
// Default: nothing exists yet
|
||||
(serverRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(null);
|
||||
(secretRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(null);
|
||||
(projectRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(null);
|
||||
restoreService = new RestoreService(serverRepo, projectRepo, secretRepo);
|
||||
(userRepo.findByEmail as ReturnType<typeof vi.fn>).mockResolvedValue(null);
|
||||
(groupRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(null);
|
||||
(rbacRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(null);
|
||||
restoreService = new RestoreService(serverRepo, projectRepo, secretRepo, userRepo, groupRepo, rbacRepo);
|
||||
});
|
||||
|
||||
const validBundle = {
|
||||
@@ -187,6 +309,23 @@ describe('RestoreService', () => {
|
||||
projects: [{ name: 'test-proj', description: 'Test' }],
|
||||
};
|
||||
|
||||
const fullBundle = {
|
||||
...validBundle,
|
||||
users: [
|
||||
{ email: 'alice@test.com', name: 'Alice', role: 'ADMIN', provider: null },
|
||||
{ email: 'bob@test.com', name: null, role: 'USER', provider: 'oidc' },
|
||||
],
|
||||
groups: [
|
||||
{ name: 'dev-team', description: 'Developers', memberEmails: ['alice@test.com', 'bob@test.com'] },
|
||||
],
|
||||
rbacBindings: [
|
||||
{ name: 'admins', subjects: [{ kind: 'User', name: 'alice@test.com' }], roleBindings: [{ role: 'edit', resource: '*' }] },
|
||||
],
|
||||
projects: [
|
||||
{ name: 'test-proj', description: 'Test', proxyMode: 'filtered', llmProvider: 'openai', llmModel: 'gpt-4', serverNames: ['github'], members: ['alice@test.com'] },
|
||||
],
|
||||
};
|
||||
|
||||
it('validates valid bundle', () => {
|
||||
expect(restoreService.validateBundle(validBundle)).toBe(true);
|
||||
});
|
||||
@@ -197,6 +336,11 @@ describe('RestoreService', () => {
|
||||
expect(restoreService.validateBundle({ version: '1' })).toBe(false);
|
||||
});
|
||||
|
||||
it('validates old bundles without new fields (backwards compatibility)', () => {
|
||||
expect(restoreService.validateBundle(validBundle)).toBe(true);
|
||||
// Old bundle has no users/groups/rbacBindings — should still validate
|
||||
});
|
||||
|
||||
it('restores all resources', async () => {
|
||||
const result = await restoreService.restore(validBundle);
|
||||
|
||||
@@ -209,6 +353,95 @@ describe('RestoreService', () => {
|
||||
expect(projectRepo.create).toHaveBeenCalled();
|
||||
});
|
||||
|
||||
it('restores users', async () => {
|
||||
const result = await restoreService.restore(fullBundle);
|
||||
|
||||
expect(result.usersCreated).toBe(2);
|
||||
expect(userRepo.create).toHaveBeenCalledWith(expect.objectContaining({
|
||||
email: 'alice@test.com',
|
||||
name: 'Alice',
|
||||
role: 'ADMIN',
|
||||
passwordHash: '__RESTORED_MUST_RESET__',
|
||||
}));
|
||||
expect(userRepo.create).toHaveBeenCalledWith(expect.objectContaining({
|
||||
email: 'bob@test.com',
|
||||
role: 'USER',
|
||||
}));
|
||||
});
|
||||
|
||||
it('restores groups with member resolution', async () => {
|
||||
// After users are created, simulate they can be found by email
|
||||
let callCount = 0;
|
||||
(userRepo.findByEmail as ReturnType<typeof vi.fn>).mockImplementation(async (email: string) => {
|
||||
// First calls during user restore return null (user doesn't exist yet)
|
||||
// Later calls during group member resolution return the created user
|
||||
callCount++;
|
||||
if (callCount > 2) {
|
||||
// After user creation phase, simulate finding created users
|
||||
if (email === 'alice@test.com') return { id: 'new-u-alice', email };
|
||||
if (email === 'bob@test.com') return { id: 'new-u-bob', email };
|
||||
}
|
||||
return null;
|
||||
});
|
||||
|
||||
const result = await restoreService.restore(fullBundle);
|
||||
|
||||
expect(result.groupsCreated).toBe(1);
|
||||
expect(groupRepo.create).toHaveBeenCalledWith(expect.objectContaining({
|
||||
name: 'dev-team',
|
||||
description: 'Developers',
|
||||
}));
|
||||
expect(groupRepo.setMembers).toHaveBeenCalled();
|
||||
});
|
||||
|
||||
it('restores rbac bindings', async () => {
|
||||
const result = await restoreService.restore(fullBundle);
|
||||
|
||||
expect(result.rbacCreated).toBe(1);
|
||||
expect(rbacRepo.create).toHaveBeenCalledWith(expect.objectContaining({
|
||||
name: 'admins',
|
||||
subjects: [{ kind: 'User', name: 'alice@test.com' }],
|
||||
roleBindings: [{ role: 'edit', resource: '*' }],
|
||||
}));
|
||||
});
|
||||
|
||||
it('restores enriched projects with server linking', async () => {
|
||||
// Simulate servers exist (restored in prior step)
|
||||
(serverRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(null);
|
||||
// After server restore, we can find them
|
||||
let serverCallCount = 0;
|
||||
(serverRepo.findByName as ReturnType<typeof vi.fn>).mockImplementation(async (name: string) => {
|
||||
serverCallCount++;
|
||||
// During server restore phase, first call returns null (server doesn't exist)
|
||||
// During project restore phase, server should be found
|
||||
if (serverCallCount > 1 && name === 'github') return { id: 'restored-s1', name: 'github' };
|
||||
return null;
|
||||
});
|
||||
|
||||
const result = await restoreService.restore(fullBundle);
|
||||
|
||||
expect(result.projectsCreated).toBe(1);
|
||||
expect(projectRepo.create).toHaveBeenCalledWith(expect.objectContaining({
|
||||
name: 'test-proj',
|
||||
proxyMode: 'filtered',
|
||||
llmProvider: 'openai',
|
||||
llmModel: 'gpt-4',
|
||||
}));
|
||||
expect(projectRepo.setServers).toHaveBeenCalled();
|
||||
});
|
||||
|
||||
it('restores old bundle without users/groups/rbac', async () => {
|
||||
const result = await restoreService.restore(validBundle);
|
||||
|
||||
expect(result.serversCreated).toBe(1);
|
||||
expect(result.secretsCreated).toBe(1);
|
||||
expect(result.projectsCreated).toBe(1);
|
||||
expect(result.usersCreated).toBe(0);
|
||||
expect(result.groupsCreated).toBe(0);
|
||||
expect(result.rbacCreated).toBe(0);
|
||||
expect(result.errors).toHaveLength(0);
|
||||
});
|
||||
|
||||
it('skips existing resources with skip strategy', async () => {
|
||||
(serverRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(mockServers[0]);
|
||||
const result = await restoreService.restore(validBundle, { conflictStrategy: 'skip' });
|
||||
@@ -218,6 +451,33 @@ describe('RestoreService', () => {
|
||||
expect(serverRepo.create).not.toHaveBeenCalled();
|
||||
});
|
||||
|
||||
it('skips existing users', async () => {
|
||||
(userRepo.findByEmail as ReturnType<typeof vi.fn>).mockResolvedValue(mockUsers[0]);
|
||||
const bundle = { ...validBundle, users: [{ email: 'alice@test.com', name: 'Alice', role: 'ADMIN', provider: null }] };
|
||||
const result = await restoreService.restore(bundle, { conflictStrategy: 'skip' });
|
||||
|
||||
expect(result.usersSkipped).toBe(1);
|
||||
expect(result.usersCreated).toBe(0);
|
||||
});
|
||||
|
||||
it('skips existing groups', async () => {
|
||||
(groupRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(mockGroups[0]);
|
||||
const bundle = { ...validBundle, groups: [{ name: 'dev-team', description: 'Devs', memberEmails: [] }] };
|
||||
const result = await restoreService.restore(bundle, { conflictStrategy: 'skip' });
|
||||
|
||||
expect(result.groupsSkipped).toBe(1);
|
||||
expect(result.groupsCreated).toBe(0);
|
||||
});
|
||||
|
||||
it('skips existing rbac bindings', async () => {
|
||||
(rbacRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(mockRbacDefinitions[0]);
|
||||
const bundle = { ...validBundle, rbacBindings: [{ name: 'admins', subjects: [], roleBindings: [] }] };
|
||||
const result = await restoreService.restore(bundle, { conflictStrategy: 'skip' });
|
||||
|
||||
expect(result.rbacSkipped).toBe(1);
|
||||
expect(result.rbacCreated).toBe(0);
|
||||
});
|
||||
|
||||
it('aborts on conflict with fail strategy', async () => {
|
||||
(serverRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(mockServers[0]);
|
||||
const result = await restoreService.restore(validBundle, { conflictStrategy: 'fail' });
|
||||
@@ -233,6 +493,18 @@ describe('RestoreService', () => {
|
||||
expect(serverRepo.update).toHaveBeenCalled();
|
||||
});
|
||||
|
||||
  it('overwrites existing rbac bindings', async () => {
    (rbacRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(mockRbacDefinitions[0]);
    const bundle = {
      ...validBundle,
      rbacBindings: [{ name: 'admins', subjects: [{ kind: 'User', name: 'new@test.com' }], roleBindings: [{ role: 'view', resource: 'servers' }] }],
    };
    const result = await restoreService.restore(bundle, { conflictStrategy: 'overwrite' });

    expect(result.rbacCreated).toBe(1);
    expect(rbacRepo.update).toHaveBeenCalled();
  });

  it('fails restore with encrypted bundle and no password', async () => {
    const encBundle = { ...validBundle, encrypted: true, encryptedSecrets: encrypt('{}', 'pw') };
    const result = await restoreService.restore(encBundle);
@@ -262,6 +534,26 @@ describe('RestoreService', () => {
    const result = await restoreService.restore(encBundle, { password: 'wrong' });
    expect(result.errors[0]).toContain('Failed to decrypt');
  });

  it('restores in correct order: secrets → servers → users → groups → projects → rbac', async () => {
    const callOrder: string[] = [];
    (secretRepo.create as ReturnType<typeof vi.fn>).mockImplementation(async () => { callOrder.push('secret'); return { id: 'sec' }; });
    (serverRepo.create as ReturnType<typeof vi.fn>).mockImplementation(async () => { callOrder.push('server'); return { id: 'srv' }; });
    (userRepo.create as ReturnType<typeof vi.fn>).mockImplementation(async () => { callOrder.push('user'); return { id: 'usr' }; });
    (groupRepo.create as ReturnType<typeof vi.fn>).mockImplementation(async () => { callOrder.push('group'); return { id: 'grp' }; });
    (projectRepo.create as ReturnType<typeof vi.fn>).mockImplementation(async () => { callOrder.push('project'); return { id: 'proj', servers: [] }; });
    (rbacRepo.create as ReturnType<typeof vi.fn>).mockImplementation(async () => { callOrder.push('rbac'); return { id: 'rbac' }; });

    await restoreService.restore(fullBundle);

    expect(callOrder[0]).toBe('secret');
    expect(callOrder[1]).toBe('server');
    expect(callOrder[2]).toBe('user');
    expect(callOrder[3]).toBe('user'); // second user
    expect(callOrder[4]).toBe('group');
    expect(callOrder[5]).toBe('project');
    expect(callOrder[6]).toBe('rbac');
  });
});
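The ordering contract verified by the test above (secrets before servers, servers before users, and so on) can be sketched as a single loop over a fixed type order. This is a hedged illustration only: the `RESTORE_ORDER` constant, the synchronous `apply` callback, and the function name are assumptions for clarity, not the real `RestoreService` internals, which work asynchronously through repositories.

```typescript
// Hedged sketch of dependency-ordered restore; names are assumptions.
const RESTORE_ORDER = ['secrets', 'servers', 'users', 'groups', 'projects', 'rbacBindings'] as const;

type BundleSlice = Partial<Record<(typeof RESTORE_ORDER)[number], unknown[]>>;

function restoreInOrder(bundle: BundleSlice, apply: (kind: string, item: unknown) => void): string[] {
  const applied: string[] = [];
  for (const kind of RESTORE_ORDER) {
    // Each resource type is restored only after every type it may reference.
    for (const item of bundle[kind] ?? []) {
      apply(kind, item);
      applied.push(kind);
    }
  }
  return applied;
}
```

Fixing the iteration order in one constant keeps the dependency rule in a single place, which is what makes an ordering test like the one above cheap to maintain.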
describe('Backup Routes', () => {
@@ -272,7 +564,7 @@ describe('Backup Routes', () => {
    const sRepo = mockServerRepo();
    const secRepo = mockSecretRepo();
    const prRepo = mockProjectRepo();
-    backupService = new BackupService(sRepo, prRepo, secRepo);
+    backupService = new BackupService(sRepo, prRepo, secRepo, mockUserRepo(), mockGroupRepo(), mockRbacRepo());

    const rSRepo = mockServerRepo();
    (rSRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(null);
@@ -280,7 +572,13 @@ describe('Backup Routes', () => {
    (rSecRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(null);
    const rPrRepo = mockProjectRepo();
    (rPrRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(null);
-    restoreService = new RestoreService(rSRepo, rPrRepo, rSecRepo);
+    const rUserRepo = mockUserRepo();
+    (rUserRepo.findByEmail as ReturnType<typeof vi.fn>).mockResolvedValue(null);
+    const rGroupRepo = mockGroupRepo();
+    (rGroupRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(null);
+    const rRbacRepo = mockRbacRepo();
+    (rRbacRepo.findByName as ReturnType<typeof vi.fn>).mockResolvedValue(null);
+    restoreService = new RestoreService(rSRepo, rPrRepo, rSecRepo, rUserRepo, rGroupRepo, rRbacRepo);
  });

  async function buildApp() {
@@ -289,7 +587,7 @@ describe('Backup Routes', () => {
    return app;
  }

-  it('POST /api/v1/backup returns bundle', async () => {
+  it('POST /api/v1/backup returns bundle with new resource types', async () => {
    const app = await buildApp();
    const res = await app.inject({
      method: 'POST',
@@ -303,6 +601,9 @@ describe('Backup Routes', () => {
    expect(body.servers).toBeDefined();
    expect(body.secrets).toBeDefined();
    expect(body.projects).toBeDefined();
+    expect(body.users).toBeDefined();
+    expect(body.groups).toBeDefined();
+    expect(body.rbacBindings).toBeDefined();
  });

  it('POST /api/v1/restore imports bundle', async () => {
@@ -318,6 +619,9 @@ describe('Backup Routes', () => {
    expect(res.statusCode).toBe(200);
    const body = res.json();
    expect(body.serversCreated).toBeDefined();
+    expect(body.usersCreated).toBeDefined();
+    expect(body.groupsCreated).toBeDefined();
+    expect(body.rbacCreated).toBeDefined();
  });

  it('POST /api/v1/restore rejects invalid bundle', async () => {
124  src/mcpd/tests/bootstrap-system-project.test.ts  Normal file
@@ -0,0 +1,124 @@
import { describe, it, expect, vi, beforeEach } from 'vitest';
import { bootstrapSystemProject, SYSTEM_PROJECT_NAME, SYSTEM_OWNER_ID, getSystemPromptNames } from '../src/bootstrap/system-project.js';
import type { PrismaClient } from '@prisma/client';

function mockPrisma(): PrismaClient {
  const prompts = new Map<string, { id: string; name: string; projectId: string }>();
  let promptIdCounter = 1;

  return {
    project: {
      upsert: vi.fn(async (args: { where: { name: string }; create: Record<string, unknown>; update: Record<string, unknown> }) => ({
        id: 'sys-proj-id',
        name: args.where.name,
        ...args.create,
      })),
    },
    prompt: {
      findFirst: vi.fn(async (args: { where: { name: string; projectId: string } }) => {
        return prompts.get(`${args.where.projectId}:${args.where.name}`) ?? null;
      }),
      create: vi.fn(async (args: { data: { name: string; content: string; priority: number; projectId: string } }) => {
        const id = `prompt-${promptIdCounter++}`;
        const prompt = { id, ...args.data };
        prompts.set(`${args.data.projectId}:${args.data.name}`, prompt);
        return prompt;
      }),
    },
  } as unknown as PrismaClient;
}

describe('bootstrapSystemProject', () => {
  let prisma: PrismaClient;

  beforeEach(() => {
    prisma = mockPrisma();
  });

  it('creates the mcpctl-system project via upsert', async () => {
    await bootstrapSystemProject(prisma);

    expect(prisma.project.upsert).toHaveBeenCalledWith(
      expect.objectContaining({
        where: { name: SYSTEM_PROJECT_NAME },
        create: expect.objectContaining({
          name: SYSTEM_PROJECT_NAME,
          ownerId: SYSTEM_OWNER_ID,
          gated: false,
        }),
        update: {},
      }),
    );
  });

  it('creates all system prompts', async () => {
    await bootstrapSystemProject(prisma);

    const expectedNames = getSystemPromptNames();
    expect(expectedNames.length).toBeGreaterThanOrEqual(4);

    for (const name of expectedNames) {
      expect(prisma.prompt.findFirst).toHaveBeenCalledWith(
        expect.objectContaining({
          where: { name, projectId: 'sys-proj-id' },
        }),
      );
    }

    expect(prisma.prompt.create).toHaveBeenCalledTimes(expectedNames.length);
  });

  it('creates system prompts with priority 10', async () => {
    await bootstrapSystemProject(prisma);

    const createCalls = vi.mocked(prisma.prompt.create).mock.calls;
    for (const call of createCalls) {
      const data = (call[0] as { data: { priority: number } }).data;
      expect(data.priority).toBe(10);
    }
  });

  it('does not re-create existing prompts (idempotent)', async () => {
    // First call creates everything
    await bootstrapSystemProject(prisma);
    const firstCallCount = vi.mocked(prisma.prompt.create).mock.calls.length;

    // Second call — prompts already exist in mock, should not create again
    await bootstrapSystemProject(prisma);

    // create should not have been called additional times
    expect(vi.mocked(prisma.prompt.create).mock.calls.length).toBe(firstCallCount);
  });

  it('re-creates deleted prompts on subsequent startup', async () => {
    // First run creates everything
    await bootstrapSystemProject(prisma);

    // Simulate deletion: clear the map so findFirst returns null
    vi.mocked(prisma.prompt.findFirst).mockResolvedValue(null);
    vi.mocked(prisma.prompt.create).mockClear();

    // Second run should recreate
    await bootstrapSystemProject(prisma);

    const expectedNames = getSystemPromptNames();
    expect(vi.mocked(prisma.prompt.create).mock.calls.length).toBe(expectedNames.length);
  });

  it('system project has gated=false', async () => {
    await bootstrapSystemProject(prisma);

    const upsertCall = vi.mocked(prisma.project.upsert).mock.calls[0]![0];
    expect((upsertCall as { create: { gated: boolean } }).create.gated).toBe(false);
  });
});

describe('getSystemPromptNames', () => {
  it('returns all system prompt names', () => {
    const names = getSystemPromptNames();
    expect(names).toContain('gate-instructions');
    expect(names).toContain('gate-encouragement');
    expect(names).toContain('gate-intercept-preamble');
    expect(names).toContain('session-greeting');
  });
});
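The idempotency and re-creation behavior exercised by these tests reduces to a create-if-missing loop. The sketch below is an assumption drawn from the test expectations (the four prompt names, priority 10, find-then-create per prompt), not the actual `bootstrapSystemProject` source; the `PromptStore` interface is a hypothetical stand-in for the Prisma prompt model.

```typescript
// Minimal create-if-missing sketch matching the tested behavior.
interface PromptStore {
  findFirst(name: string): { name: string } | null;
  create(data: { name: string; priority: number }): void;
}

const SYSTEM_PROMPTS = ['gate-instructions', 'gate-encouragement', 'gate-intercept-preamble', 'session-greeting'];

function ensureSystemPrompts(store: PromptStore): number {
  let created = 0;
  for (const name of SYSTEM_PROMPTS) {
    if (!store.findFirst(name)) {
      store.create({ name, priority: 10 }); // system prompts pinned at priority 10
      created += 1;
    }
  }
  // Number of prompts (re)created; 0 on an untouched second run.
  return created;
}
```

Because the function only fills gaps, running it on every startup is safe and also repairs prompts that were deleted between runs, which is exactly what the last two tests check.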
250  src/mcpd/tests/group-service.test.ts  Normal file
@@ -0,0 +1,250 @@
import { describe, it, expect, vi, beforeEach } from 'vitest';
import { GroupService } from '../src/services/group.service.js';
import { NotFoundError, ConflictError } from '../src/services/mcp-server.service.js';
import type { IGroupRepository, GroupWithMembers } from '../src/repositories/group.repository.js';
import type { IUserRepository, SafeUser } from '../src/repositories/user.repository.js';
import type { Group } from '@prisma/client';

function makeGroup(overrides: Partial<Group> = {}): Group {
  return {
    id: 'grp-1',
    name: 'developers',
    description: 'Dev team',
    version: 1,
    createdAt: new Date(),
    updatedAt: new Date(),
    ...overrides,
  };
}

function makeGroupWithMembers(overrides: Partial<Group> = {}, members: GroupWithMembers['members'] = []): GroupWithMembers {
  return {
    ...makeGroup(overrides),
    members,
  };
}

function makeUser(overrides: Partial<SafeUser> = {}): SafeUser {
  return {
    id: 'user-1',
    email: 'alice@example.com',
    name: 'Alice',
    role: 'USER',
    provider: null,
    externalId: null,
    version: 1,
    createdAt: new Date(),
    updatedAt: new Date(),
    ...overrides,
  };
}

function mockGroupRepo(): IGroupRepository {
  return {
    findAll: vi.fn(async () => []),
    findById: vi.fn(async () => null),
    findByName: vi.fn(async () => null),
    create: vi.fn(async (data) => makeGroup({ name: data.name, description: data.description ?? '' })),
    update: vi.fn(async (id, data) => makeGroup({ id, description: data.description ?? '' })),
    delete: vi.fn(async () => {}),
    setMembers: vi.fn(async () => {}),
    findGroupsForUser: vi.fn(async () => []),
  };
}

function mockUserRepo(): IUserRepository {
  return {
    findAll: vi.fn(async () => []),
    findById: vi.fn(async () => null),
    findByEmail: vi.fn(async () => null),
    create: vi.fn(async () => makeUser()),
    delete: vi.fn(async () => {}),
    count: vi.fn(async () => 0),
  };
}
describe('GroupService', () => {
  let groupRepo: ReturnType<typeof mockGroupRepo>;
  let userRepo: ReturnType<typeof mockUserRepo>;
  let service: GroupService;

  beforeEach(() => {
    groupRepo = mockGroupRepo();
    userRepo = mockUserRepo();
    service = new GroupService(groupRepo, userRepo);
  });

  describe('list', () => {
    it('returns empty list', async () => {
      const result = await service.list();
      expect(result).toEqual([]);
      expect(groupRepo.findAll).toHaveBeenCalled();
    });

    it('returns groups with members', async () => {
      const groups = [
        makeGroupWithMembers({ id: 'g1', name: 'admins' }, [
          { id: 'gm-1', user: { id: 'u1', email: 'a@b.com', name: 'A' } },
        ]),
      ];
      vi.mocked(groupRepo.findAll).mockResolvedValue(groups);
      const result = await service.list();
      expect(result).toHaveLength(1);
      expect(result[0].members).toHaveLength(1);
    });
  });

  describe('create', () => {
    it('creates a group without members', async () => {
      const created = makeGroupWithMembers({ name: 'my-group', description: '' }, []);
      vi.mocked(groupRepo.findById).mockResolvedValue(created);

      const result = await service.create({ name: 'my-group' });
      expect(result.name).toBe('my-group');
      expect(groupRepo.create).toHaveBeenCalledWith({ name: 'my-group', description: '' });
      expect(groupRepo.setMembers).not.toHaveBeenCalled();
    });

    it('creates a group with members', async () => {
      const alice = makeUser({ id: 'u-alice', email: 'alice@example.com' });
      const bob = makeUser({ id: 'u-bob', email: 'bob@example.com', name: 'Bob' });
      vi.mocked(userRepo.findByEmail).mockImplementation(async (email: string) => {
        if (email === 'alice@example.com') return alice;
        if (email === 'bob@example.com') return bob;
        return null;
      });

      const created = makeGroupWithMembers({ name: 'team' }, [
        { id: 'gm-1', user: { id: 'u-alice', email: 'alice@example.com', name: 'Alice' } },
        { id: 'gm-2', user: { id: 'u-bob', email: 'bob@example.com', name: 'Bob' } },
      ]);
      vi.mocked(groupRepo.findById).mockResolvedValue(created);

      const result = await service.create({
        name: 'team',
        members: ['alice@example.com', 'bob@example.com'],
      });

      expect(groupRepo.setMembers).toHaveBeenCalledWith('grp-1', ['u-alice', 'u-bob']);
      expect(result.members).toHaveLength(2);
    });

    it('throws ConflictError when name exists', async () => {
      vi.mocked(groupRepo.findByName).mockResolvedValue(makeGroupWithMembers({ name: 'taken' }));
      await expect(service.create({ name: 'taken' })).rejects.toThrow(ConflictError);
    });

    it('throws NotFoundError for unknown member email', async () => {
      vi.mocked(userRepo.findByEmail).mockResolvedValue(null);
      await expect(
        service.create({ name: 'team', members: ['unknown@example.com'] }),
      ).rejects.toThrow(NotFoundError);
    });

    it('validates input', async () => {
      await expect(service.create({ name: '' })).rejects.toThrow();
      await expect(service.create({ name: 'UPPERCASE' })).rejects.toThrow();
    });
  });
  describe('getById', () => {
    it('returns group when found', async () => {
      const group = makeGroupWithMembers({ id: 'g1' });
      vi.mocked(groupRepo.findById).mockResolvedValue(group);
      const result = await service.getById('g1');
      expect(result.id).toBe('g1');
    });

    it('throws NotFoundError when not found', async () => {
      await expect(service.getById('missing')).rejects.toThrow(NotFoundError);
    });
  });

  describe('getByName', () => {
    it('returns group when found', async () => {
      const group = makeGroupWithMembers({ name: 'admins' });
      vi.mocked(groupRepo.findByName).mockResolvedValue(group);
      const result = await service.getByName('admins');
      expect(result.name).toBe('admins');
    });

    it('throws NotFoundError when not found', async () => {
      await expect(service.getByName('missing')).rejects.toThrow(NotFoundError);
    });
  });

  describe('update', () => {
    it('updates description', async () => {
      const group = makeGroupWithMembers({ id: 'g1' });
      vi.mocked(groupRepo.findById).mockResolvedValue(group);

      const updated = makeGroupWithMembers({ id: 'g1', description: 'new desc' });
      // After update, getById is called again to return fresh data
      vi.mocked(groupRepo.findById).mockResolvedValue(updated);

      const result = await service.update('g1', { description: 'new desc' });
      expect(groupRepo.update).toHaveBeenCalledWith('g1', { description: 'new desc' });
      expect(result.description).toBe('new desc');
    });

    it('updates members (full replacement)', async () => {
      const group = makeGroupWithMembers({ id: 'g1' }, [
        { id: 'gm-1', user: { id: 'u-old', email: 'old@example.com', name: 'Old' } },
      ]);
      vi.mocked(groupRepo.findById).mockResolvedValue(group);

      const alice = makeUser({ id: 'u-alice', email: 'alice@example.com' });
      vi.mocked(userRepo.findByEmail).mockResolvedValue(alice);

      const updated = makeGroupWithMembers({ id: 'g1' }, [
        { id: 'gm-2', user: { id: 'u-alice', email: 'alice@example.com', name: 'Alice' } },
      ]);
      vi.mocked(groupRepo.findById).mockResolvedValueOnce(group).mockResolvedValue(updated);

      const result = await service.update('g1', { members: ['alice@example.com'] });
      expect(groupRepo.setMembers).toHaveBeenCalledWith('g1', ['u-alice']);
      expect(result.members).toHaveLength(1);
    });

    it('throws NotFoundError when group not found', async () => {
      await expect(service.update('missing', { description: 'x' })).rejects.toThrow(NotFoundError);
    });

    it('throws NotFoundError for unknown member email on update', async () => {
      const group = makeGroupWithMembers({ id: 'g1' });
      vi.mocked(groupRepo.findById).mockResolvedValue(group);
      vi.mocked(userRepo.findByEmail).mockResolvedValue(null);

      await expect(
        service.update('g1', { members: ['unknown@example.com'] }),
      ).rejects.toThrow(NotFoundError);
    });
  });

  describe('delete', () => {
    it('deletes group', async () => {
      const group = makeGroupWithMembers({ id: 'g1' });
      vi.mocked(groupRepo.findById).mockResolvedValue(group);
      await service.delete('g1');
      expect(groupRepo.delete).toHaveBeenCalledWith('g1');
    });

    it('throws NotFoundError when group not found', async () => {
      await expect(service.delete('missing')).rejects.toThrow(NotFoundError);
    });
  });

  describe('group includes resolved member info', () => {
    it('members include user id, email, and name', async () => {
      const group = makeGroupWithMembers({ id: 'g1', name: 'team' }, [
        { id: 'gm-1', user: { id: 'u1', email: 'alice@example.com', name: 'Alice' } },
        { id: 'gm-2', user: { id: 'u2', email: 'bob@example.com', name: null } },
      ]);
      vi.mocked(groupRepo.findById).mockResolvedValue(group);

      const result = await service.getById('g1');
      expect(result.members[0].user).toEqual({ id: 'u1', email: 'alice@example.com', name: 'Alice' });
      expect(result.members[1].user).toEqual({ id: 'u2', email: 'bob@example.com', name: null });
    });
  });
});
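The email-to-id resolution that both the `create` and `update` tests rely on can be sketched as below. The helper name and plain `Error` are assumptions for illustration; the fail-fast behavior on an unknown email mirrors the `NotFoundError` expectations in the tests above, and the returned id list is what a full-replacement `setMembers` call would receive.

```typescript
// Hypothetical helper illustrating full-replacement member resolution:
// every email must map to a known user id, otherwise the whole call fails.
function resolveMemberIds(
  emails: string[],
  findByEmail: (email: string) => { id: string } | null,
): string[] {
  return emails.map((email) => {
    const user = findByEmail(email);
    if (!user) {
      // Mirrors the NotFoundError the service tests expect for unknown emails.
      throw new Error(`User not found: ${email}`);
    }
    return user.id;
  });
}
```

Resolving all emails before touching the membership table means a typo in one address aborts the update instead of silently shrinking the group.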
@@ -11,10 +11,17 @@ function makeServer(overrides: Partial<McpServer> = {}): McpServer {
    dockerImage: null,
    transport: 'STDIO',
    repositoryUrl: null,
    externalUrl: null,
    command: null,
    containerPort: null,
    replicas: 1,
    env: [],
    healthCheck: null,
    version: 1,
    createdAt: new Date(),
    updatedAt: new Date(),
    templateName: null,
    templateVersion: null,
    ...overrides,
  };
}
@@ -25,7 +32,7 @@ describe('generateMcpConfig', () => {
    expect(result).toEqual({ mcpServers: {} });
  });

-  it('generates config for a single server', () => {
+  it('generates config for a single STDIO server', () => {
    const result = generateMcpConfig([
      { server: makeServer(), resolvedEnv: {} },
    ]);
@@ -34,7 +41,7 @@ describe('generateMcpConfig', () => {
    expect(result.mcpServers['slack']?.args).toEqual(['-y', '@anthropic/slack-mcp']);
  });

-  it('includes resolved env when present', () => {
+  it('includes resolved env when present for STDIO server', () => {
    const result = generateMcpConfig([
      { server: makeServer(), resolvedEnv: { SLACK_TEAM_ID: 'T123' } },
    ]);
@@ -67,4 +74,35 @@ describe('generateMcpConfig', () => {
    ]);
    expect(result.mcpServers['slack']?.args).toEqual(['-y', 'slack']);
  });

  it('generates URL-based config for SSE servers', () => {
    const server = makeServer({ name: 'sse-server', transport: 'SSE' });
    const result = generateMcpConfig([
      { server, resolvedEnv: { TOKEN: 'abc' } },
    ]);
    const config = result.mcpServers['sse-server'];
    expect(config?.url).toBe('http://localhost:3100/api/v1/mcp/proxy/sse-server');
    expect(config?.command).toBeUndefined();
    expect(config?.args).toBeUndefined();
    expect(config?.env).toBeUndefined();
  });

  it('generates URL-based config for STREAMABLE_HTTP servers', () => {
    const server = makeServer({ name: 'stream-server', transport: 'STREAMABLE_HTTP' });
    const result = generateMcpConfig([
      { server, resolvedEnv: {} },
    ]);
    const config = result.mcpServers['stream-server'];
    expect(config?.url).toBe('http://localhost:3100/api/v1/mcp/proxy/stream-server');
    expect(config?.command).toBeUndefined();
  });

  it('mixes STDIO and SSE servers correctly', () => {
    const result = generateMcpConfig([
      { server: makeServer({ name: 'stdio-srv', transport: 'STDIO' }), resolvedEnv: {} },
      { server: makeServer({ name: 'sse-srv', transport: 'SSE' }), resolvedEnv: {} },
    ]);
    expect(result.mcpServers['stdio-srv']?.command).toBe('npx');
    expect(result.mcpServers['sse-srv']?.url).toBeDefined();
  });
});
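The transport branching these tests pin down can be sketched as a single conditional. The entry types, function name, and base URL below are assumptions for illustration (the tests suggest the proxy listens at `http://localhost:3100`); this is not the real `generateMcpConfig` code, just the shape of the rule it implements.

```typescript
// Hedged sketch: STDIO servers get a command-based entry, while SSE and
// STREAMABLE_HTTP servers are pointed at the proxy URL instead.
interface ServerLike { name: string; transport: 'STDIO' | 'SSE' | 'STREAMABLE_HTTP'; packageName?: string }
type ClientEntry = { command?: string; args?: string[]; env?: Record<string, string>; url?: string };

function entryFor(server: ServerLike, env: Record<string, string>, baseUrl: string): ClientEntry {
  if (server.transport === 'STDIO') {
    return { command: 'npx', args: ['-y', server.packageName ?? server.name], env };
  }
  // URL-based transports carry no command/args/env; the client just connects
  // through the proxy endpoint.
  return { url: `${baseUrl}/api/v1/mcp/proxy/${server.name}` };
}
```

Keeping the two entry shapes mutually exclusive is what the `toBeUndefined()` assertions above are guarding: a URL entry must never leak `command`, `args`, or `env`.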
284  src/mcpd/tests/project-routes.test.ts  Normal file
@@ -0,0 +1,284 @@
import { describe, it, expect, vi, afterEach } from 'vitest';
import Fastify from 'fastify';
import type { FastifyInstance } from 'fastify';
import { registerProjectRoutes } from '../src/routes/projects.js';
import { ProjectService } from '../src/services/project.service.js';
import { errorHandler } from '../src/middleware/error-handler.js';
import type { IProjectRepository, ProjectWithRelations } from '../src/repositories/project.repository.js';
import type { IMcpServerRepository, ISecretRepository } from '../src/repositories/interfaces.js';

let app: FastifyInstance;

function makeProject(overrides: Partial<ProjectWithRelations> = {}): ProjectWithRelations {
  return {
    id: 'proj-1',
    name: 'test-project',
    description: '',
    ownerId: 'user-1',
    proxyMode: 'direct',
    gated: true,
    llmProvider: null,
    llmModel: null,
    version: 1,
    createdAt: new Date(),
    updatedAt: new Date(),
    servers: [],
    ...overrides,
  };
}

function mockProjectRepo(): IProjectRepository {
  return {
    findAll: vi.fn(async () => []),
    findById: vi.fn(async () => null),
    findByName: vi.fn(async () => null),
    create: vi.fn(async (data) => makeProject({
      name: data.name,
      description: data.description,
      ownerId: data.ownerId,
      proxyMode: data.proxyMode,
    })),
    update: vi.fn(async (_id, data) => makeProject({ ...data as Partial<ProjectWithRelations> })),
    delete: vi.fn(async () => {}),
    setServers: vi.fn(async () => {}),
    addServer: vi.fn(async () => {}),
    removeServer: vi.fn(async () => {}),
  };
}

function mockServerRepo(): IMcpServerRepository {
  return {
    findAll: vi.fn(async () => []),
    findById: vi.fn(async () => null),
    findByName: vi.fn(async () => null),
    create: vi.fn(async () => ({} as never)),
    update: vi.fn(async () => ({} as never)),
    delete: vi.fn(async () => {}),
  };
}

function mockSecretRepo(): ISecretRepository {
  return {
    findAll: vi.fn(async () => []),
    findById: vi.fn(async () => null),
    findByName: vi.fn(async () => null),
    create: vi.fn(async () => ({} as never)),
    update: vi.fn(async () => ({} as never)),
    delete: vi.fn(async () => {}),
  };
}

afterEach(async () => {
  if (app) await app.close();
});

function createApp(projectRepo: IProjectRepository, serverRepo?: IMcpServerRepository) {
  app = Fastify({ logger: false });
  app.setErrorHandler(errorHandler);
  const service = new ProjectService(projectRepo, serverRepo ?? mockServerRepo(), mockSecretRepo());
  registerProjectRoutes(app, service);
  return app.ready();
}
describe('Project Routes', () => {
  describe('GET /api/v1/projects', () => {
    it('returns project list', async () => {
      const repo = mockProjectRepo();
      vi.mocked(repo.findAll).mockResolvedValue([
        makeProject({ id: 'p1', name: 'alpha', ownerId: 'user-1' }),
        makeProject({ id: 'p2', name: 'beta', ownerId: 'user-2' }),
      ]);
      await createApp(repo);

      const res = await app.inject({ method: 'GET', url: '/api/v1/projects' });
      expect(res.statusCode).toBe(200);
      const body = res.json<Array<{ name: string }>>();
      expect(body).toHaveLength(2);
    });

    it('lists all projects without ownerId filtering', async () => {
      // This is the bug fix: the route must call list() without ownerId
      // so that RBAC (preSerialization) handles access filtering, not the DB query.
      const repo = mockProjectRepo();
      vi.mocked(repo.findAll).mockResolvedValue([makeProject()]);
      await createApp(repo);

      await app.inject({ method: 'GET', url: '/api/v1/projects' });
      // findAll must be called with NO arguments (undefined ownerId)
      expect(repo.findAll).toHaveBeenCalledWith(undefined);
    });
  });

  describe('GET /api/v1/projects/:id', () => {
    it('returns 404 when not found', async () => {
      const repo = mockProjectRepo();
      await createApp(repo);
      const res = await app.inject({ method: 'GET', url: '/api/v1/projects/missing' });
      expect(res.statusCode).toBe(404);
    });

    it('returns project when found by ID', async () => {
      const repo = mockProjectRepo();
      vi.mocked(repo.findById).mockResolvedValue(makeProject({ id: 'p1', name: 'my-proj' }));
      await createApp(repo);
      const res = await app.inject({ method: 'GET', url: '/api/v1/projects/p1' });
      expect(res.statusCode).toBe(200);
      expect(res.json<{ name: string }>().name).toBe('my-proj');
    });

    it('resolves by name when ID not found', async () => {
      const repo = mockProjectRepo();
      vi.mocked(repo.findByName).mockResolvedValue(makeProject({ name: 'my-proj' }));
      await createApp(repo);
      const res = await app.inject({ method: 'GET', url: '/api/v1/projects/my-proj' });
      expect(res.statusCode).toBe(200);
      expect(res.json<{ name: string }>().name).toBe('my-proj');
    });
  });

  describe('POST /api/v1/projects', () => {
    it('creates a project and returns 201', async () => {
      const repo = mockProjectRepo();
      vi.mocked(repo.findById).mockResolvedValue(makeProject({ name: 'new-proj' }));
      await createApp(repo);
      const res = await app.inject({
        method: 'POST',
        url: '/api/v1/projects',
        payload: { name: 'new-proj' },
      });
      expect(res.statusCode).toBe(201);
    });

    it('returns 400 for invalid input', async () => {
      const repo = mockProjectRepo();
      await createApp(repo);
      const res = await app.inject({
        method: 'POST',
        url: '/api/v1/projects',
        payload: { name: '' },
      });
      expect(res.statusCode).toBe(400);
    });

    it('returns 409 when name already exists', async () => {
      const repo = mockProjectRepo();
      vi.mocked(repo.findByName).mockResolvedValue(makeProject());
      await createApp(repo);
      const res = await app.inject({
        method: 'POST',
        url: '/api/v1/projects',
        payload: { name: 'taken' },
      });
      expect(res.statusCode).toBe(409);
    });
  });
  describe('PUT /api/v1/projects/:id', () => {
    it('updates a project', async () => {
      const repo = mockProjectRepo();
      vi.mocked(repo.findById).mockResolvedValue(makeProject({ id: 'p1' }));
      await createApp(repo);
      const res = await app.inject({
        method: 'PUT',
        url: '/api/v1/projects/p1',
        payload: { description: 'Updated' },
      });
      expect(res.statusCode).toBe(200);
    });

    it('returns 404 when not found', async () => {
      const repo = mockProjectRepo();
      await createApp(repo);
      const res = await app.inject({
        method: 'PUT',
        url: '/api/v1/projects/missing',
        payload: { description: 'x' },
      });
      expect(res.statusCode).toBe(404);
    });
  });

  describe('DELETE /api/v1/projects/:id', () => {
    it('deletes a project and returns 204', async () => {
      const repo = mockProjectRepo();
      vi.mocked(repo.findById).mockResolvedValue(makeProject({ id: 'p1' }));
      await createApp(repo);
      const res = await app.inject({ method: 'DELETE', url: '/api/v1/projects/p1' });
      expect(res.statusCode).toBe(204);
    });

    it('returns 404 when not found', async () => {
      const repo = mockProjectRepo();
      await createApp(repo);
      const res = await app.inject({ method: 'DELETE', url: '/api/v1/projects/missing' });
      expect(res.statusCode).toBe(404);
    });
  });

  describe('POST /api/v1/projects/:id/servers (attach)', () => {
    it('attaches a server to a project', async () => {
      const projectRepo = mockProjectRepo();
      const serverRepo = mockServerRepo();
      vi.mocked(projectRepo.findById).mockResolvedValue(makeProject({ id: 'p1' }));
      vi.mocked(serverRepo.findByName).mockResolvedValue({ id: 'srv-1', name: 'my-ha' } as never);
      await createApp(projectRepo, serverRepo);

      const res = await app.inject({
        method: 'POST',
        url: '/api/v1/projects/p1/servers',
        payload: { server: 'my-ha' },
      });
      expect(res.statusCode).toBe(200);
      expect(projectRepo.addServer).toHaveBeenCalledWith('p1', 'srv-1');
    });

    it('returns 400 when server field is missing', async () => {
      const repo = mockProjectRepo();
      vi.mocked(repo.findById).mockResolvedValue(makeProject({ id: 'p1' }));
      await createApp(repo);

      const res = await app.inject({
        method: 'POST',
        url: '/api/v1/projects/p1/servers',
        payload: {},
      });
      expect(res.statusCode).toBe(400);
    });

    it('returns 404 when server not found', async () => {
      const projectRepo = mockProjectRepo();
      vi.mocked(projectRepo.findById).mockResolvedValue(makeProject({ id: 'p1' }));
      await createApp(projectRepo);

      const res = await app.inject({
        method: 'POST',
        url: '/api/v1/projects/p1/servers',
        payload: { server: 'nonexistent' },
      });
      expect(res.statusCode).toBe(404);
    });
  });

  describe('DELETE /api/v1/projects/:id/servers/:serverName (detach)', () => {
    it('detaches a server from a project', async () => {
      const projectRepo = mockProjectRepo();
      const serverRepo = mockServerRepo();
      vi.mocked(projectRepo.findById).mockResolvedValue(makeProject({ id: 'p1' }));
      vi.mocked(serverRepo.findByName).mockResolvedValue({ id: 'srv-1', name: 'my-ha' } as never);
      await createApp(projectRepo, serverRepo);

      const res = await app.inject({ method: 'DELETE', url: '/api/v1/projects/p1/servers/my-ha' });
      expect(res.statusCode).toBe(204);
      expect(projectRepo.removeServer).toHaveBeenCalledWith('p1', 'srv-1');
    });

    it('returns 404 when server not found', async () => {
      const projectRepo = mockProjectRepo();
      vi.mocked(projectRepo.findById).mockResolvedValue(makeProject({ id: 'p1' }));
      await createApp(projectRepo);

      const res = await app.inject({ method: 'DELETE', url: '/api/v1/projects/p1/servers/nonexistent' });
      expect(res.statusCode).toBe(404);
    });
  });
});
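The id-then-name fallback exercised by the `GET /api/v1/projects/:id` tests can be sketched as below. This is a synchronous stand-in with assumed names; the real route calls async repository methods, but the lookup rule is the same: try the primary key first, then the unique project name.

```typescript
// Hypothetical stand-in for the route's id-or-name lookup.
type ProjectRef = { id: string; name: string };

function resolveProject(
  idOrName: string,
  findById: (key: string) => ProjectRef | null,
  findByName: (key: string) => ProjectRef | null,
): ProjectRef | null {
  // Primary-key hit wins; otherwise fall back to the unique name.
  return findById(idOrName) ?? findByName(idOrName);
}
```

Trying the id first keeps the common case to one lookup, and a `null` from both paths maps naturally to the 404 the tests expect.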
@@ -1,66 +1,384 @@
import { describe, it, expect, vi, beforeEach } from 'vitest';
import { ProjectService } from '../src/services/project.service.js';
import { NotFoundError, ConflictError } from '../src/services/mcp-server.service.js';
import type { IProjectRepository } from '../src/repositories/project.repository.js';
import type { IProjectRepository, ProjectWithRelations } from '../src/repositories/project.repository.js';
import type { IMcpServerRepository, ISecretRepository } from '../src/repositories/interfaces.js';
import type { McpServer } from '@prisma/client';

function makeProject(overrides: Partial<ProjectWithRelations> = {}): ProjectWithRelations {
  return {
    id: 'proj-1',
    name: 'test-project',
    description: '',
    ownerId: 'user-1',
    proxyMode: 'direct',
    gated: true,
    llmProvider: null,
    llmModel: null,
    version: 1,
    createdAt: new Date(),
    updatedAt: new Date(),
    servers: [],
    ...overrides,
  };
}

function makeServer(overrides: Partial<McpServer> = {}): McpServer {
  return {
    id: 'srv-1',
    name: 'test-server',
    description: '',
    packageName: '@mcp/test',
    dockerImage: null,
    transport: 'STDIO',
    repositoryUrl: null,
    externalUrl: null,
    command: null,
    containerPort: null,
    replicas: 1,
    env: [],
    healthCheck: null,
    version: 1,
    createdAt: new Date(),
    updatedAt: new Date(),
    templateName: null,
    templateVersion: null,
    ...overrides,
  };
}
function mockProjectRepo(): IProjectRepository {
  return {
    findAll: vi.fn(async () => []),
    findById: vi.fn(async () => null),
    findByName: vi.fn(async () => null),
    create: vi.fn(async (data) => ({
      id: 'proj-1',
    create: vi.fn(async (data) => makeProject({
      name: data.name,
      description: data.description ?? '',
      description: data.description,
      ownerId: data.ownerId,
      version: 1,
      createdAt: new Date(),
      updatedAt: new Date(),
    })),
    update: vi.fn(async (id) => ({
      id, name: 'test', description: '', ownerId: 'u1', version: 2,
      createdAt: new Date(), updatedAt: new Date(),
      proxyMode: data.proxyMode,
      llmProvider: data.llmProvider ?? null,
      llmModel: data.llmModel ?? null,
    })),
    update: vi.fn(async (_id, data) => makeProject({ ...data as Partial<ProjectWithRelations> })),
    delete: vi.fn(async () => {}),
    setServers: vi.fn(async () => {}),
    addServer: vi.fn(async () => {}),
    removeServer: vi.fn(async () => {}),
  };
}

function mockServerRepo(): IMcpServerRepository {
  return {
    findAll: vi.fn(async () => []),
    findById: vi.fn(async () => null),
    findByName: vi.fn(async () => null),
    create: vi.fn(async () => makeServer()),
    update: vi.fn(async () => makeServer()),
    delete: vi.fn(async () => {}),
  };
}

function mockSecretRepo(): ISecretRepository {
  return {
    findAll: vi.fn(async () => []),
    findById: vi.fn(async () => null),
    findByName: vi.fn(async () => null),
    create: vi.fn(async () => ({ id: 'sec-1', name: 'test', data: {}, version: 1, createdAt: new Date(), updatedAt: new Date() })),
    update: vi.fn(async () => ({ id: 'sec-1', name: 'test', data: {}, version: 1, createdAt: new Date(), updatedAt: new Date() })),
    delete: vi.fn(async () => {}),
  };
}
describe('ProjectService', () => {
  let projectRepo: ReturnType<typeof mockProjectRepo>;
  let serverRepo: ReturnType<typeof mockServerRepo>;
  let secretRepo: ReturnType<typeof mockSecretRepo>;
  let service: ProjectService;

  beforeEach(() => {
    projectRepo = mockProjectRepo();
    service = new ProjectService(projectRepo);
    serverRepo = mockServerRepo();
    secretRepo = mockSecretRepo();
    service = new ProjectService(projectRepo, serverRepo, secretRepo);
  });

  describe('create', () => {
    it('creates a project', async () => {
    it('creates a basic project', async () => {
      // After create, getById is called to re-fetch with relations
      const created = makeProject({ name: 'my-project', ownerId: 'user-1' });
      vi.mocked(projectRepo.findById).mockResolvedValue(created);

      const result = await service.create({ name: 'my-project' }, 'user-1');
      expect(result.name).toBe('my-project');
      expect(result.ownerId).toBe('user-1');
      expect(projectRepo.create).toHaveBeenCalled();
    });

    it('throws ConflictError when name exists', async () => {
      vi.mocked(projectRepo.findByName).mockResolvedValue({ id: '1' } as never);
      vi.mocked(projectRepo.findByName).mockResolvedValue(makeProject());
      await expect(service.create({ name: 'taken' }, 'u1')).rejects.toThrow(ConflictError);
    });

    it('validates input', async () => {
      await expect(service.create({ name: '' }, 'u1')).rejects.toThrow();
    });

    it('creates project with servers (resolves names)', async () => {
      const srv1 = makeServer({ id: 'srv-1', name: 'github' });
      const srv2 = makeServer({ id: 'srv-2', name: 'slack' });
      vi.mocked(serverRepo.findByName).mockImplementation(async (name) => {
        if (name === 'github') return srv1;
        if (name === 'slack') return srv2;
        return null;
      });

      const created = makeProject({ id: 'proj-new' });
      vi.mocked(projectRepo.create).mockResolvedValue(created);
      vi.mocked(projectRepo.findById).mockResolvedValue(makeProject({
        id: 'proj-new',
        servers: [
          { id: 'ps-1', server: { id: 'srv-1', name: 'github' } },
          { id: 'ps-2', server: { id: 'srv-2', name: 'slack' } },
        ],
      }));

      const result = await service.create({ name: 'my-project', servers: ['github', 'slack'] }, 'user-1');
      expect(projectRepo.setServers).toHaveBeenCalledWith('proj-new', ['srv-1', 'srv-2']);
      expect(result.servers).toHaveLength(2);
    });

    it('creates project with proxyMode and llmProvider', async () => {
      const created = makeProject({ id: 'proj-filtered', proxyMode: 'filtered', llmProvider: 'openai' });
      vi.mocked(projectRepo.create).mockResolvedValue(created);
      vi.mocked(projectRepo.findById).mockResolvedValue(created);

      const result = await service.create({
        name: 'filtered-proj',
        proxyMode: 'filtered',
        llmProvider: 'openai',
      }, 'user-1');

      expect(result.proxyMode).toBe('filtered');
      expect(result.llmProvider).toBe('openai');
    });

    it('rejects filtered project without llmProvider', async () => {
      await expect(
        service.create({ name: 'bad-proj', proxyMode: 'filtered' }, 'user-1'),
      ).rejects.toThrow();
    });

    it('throws NotFoundError when server name resolution fails', async () => {
      vi.mocked(serverRepo.findByName).mockResolvedValue(null);

      await expect(
        service.create({ name: 'my-project', servers: ['nonexistent'] }, 'user-1'),
      ).rejects.toThrow(NotFoundError);
    });

  });

  describe('getById', () => {
    it('throws NotFoundError when not found', async () => {
      await expect(service.getById('missing')).rejects.toThrow(NotFoundError);
    });

    it('returns project when found', async () => {
      const proj = makeProject({ id: 'found' });
      vi.mocked(projectRepo.findById).mockResolvedValue(proj);
      const result = await service.getById('found');
      expect(result.id).toBe('found');
    });
  });

  describe('resolveAndGet', () => {
    it('finds by ID first', async () => {
      const proj = makeProject({ id: 'proj-id' });
      vi.mocked(projectRepo.findById).mockResolvedValue(proj);
      const result = await service.resolveAndGet('proj-id');
      expect(result.id).toBe('proj-id');
    });

    it('falls back to name when ID not found', async () => {
      vi.mocked(projectRepo.findById).mockResolvedValue(null);
      const proj = makeProject({ name: 'my-name' });
      vi.mocked(projectRepo.findByName).mockResolvedValue(proj);
      const result = await service.resolveAndGet('my-name');
      expect(result.name).toBe('my-name');
    });

    it('throws NotFoundError when neither ID nor name found', async () => {
      await expect(service.resolveAndGet('nothing')).rejects.toThrow(NotFoundError);
    });
  });

  describe('update', () => {
    it('updates servers (full replacement)', async () => {
      const existing = makeProject({ id: 'proj-1' });
      vi.mocked(projectRepo.findById).mockResolvedValue(existing);

      const srv = makeServer({ id: 'srv-new', name: 'new-srv' });
      vi.mocked(serverRepo.findByName).mockResolvedValue(srv);

      await service.update('proj-1', { servers: ['new-srv'] });
      expect(projectRepo.setServers).toHaveBeenCalledWith('proj-1', ['srv-new']);
    });

    it('updates proxyMode', async () => {
      const existing = makeProject({ id: 'proj-1' });
      vi.mocked(projectRepo.findById).mockResolvedValue(existing);

      await service.update('proj-1', { proxyMode: 'filtered', llmProvider: 'anthropic' });
      expect(projectRepo.update).toHaveBeenCalledWith('proj-1', {
        proxyMode: 'filtered',
        llmProvider: 'anthropic',
      });
    });
  });

  describe('delete', () => {
    it('deletes project', async () => {
      vi.mocked(projectRepo.findById).mockResolvedValue({ id: 'p1' } as never);
      vi.mocked(projectRepo.findById).mockResolvedValue(makeProject({ id: 'p1' }));
      await service.delete('p1');
      expect(projectRepo.delete).toHaveBeenCalledWith('p1');
    });

    it('throws NotFoundError when project does not exist', async () => {
      await expect(service.delete('missing')).rejects.toThrow(NotFoundError);
    });
  });

  describe('addServer', () => {
    it('attaches a server by name', async () => {
      const project = makeProject({ id: 'proj-1' });
      const srv = makeServer({ id: 'srv-1', name: 'my-ha' });
      vi.mocked(projectRepo.findById).mockResolvedValue(project);
      vi.mocked(serverRepo.findByName).mockResolvedValue(srv);

      await service.addServer('proj-1', 'my-ha');
      expect(projectRepo.addServer).toHaveBeenCalledWith('proj-1', 'srv-1');
    });

    it('throws NotFoundError when project not found', async () => {
      await expect(service.addServer('missing', 'my-ha')).rejects.toThrow(NotFoundError);
    });

    it('throws NotFoundError when server not found', async () => {
      vi.mocked(projectRepo.findById).mockResolvedValue(makeProject({ id: 'proj-1' }));
      vi.mocked(serverRepo.findByName).mockResolvedValue(null);

      await expect(service.addServer('proj-1', 'nonexistent')).rejects.toThrow(NotFoundError);
    });
  });

  describe('removeServer', () => {
    it('detaches a server by name', async () => {
      const project = makeProject({ id: 'proj-1' });
      const srv = makeServer({ id: 'srv-1', name: 'my-ha' });
      vi.mocked(projectRepo.findById).mockResolvedValue(project);
      vi.mocked(serverRepo.findByName).mockResolvedValue(srv);

      await service.removeServer('proj-1', 'my-ha');
      expect(projectRepo.removeServer).toHaveBeenCalledWith('proj-1', 'srv-1');
    });

    it('throws NotFoundError when project not found', async () => {
      await expect(service.removeServer('missing', 'my-ha')).rejects.toThrow(NotFoundError);
    });

    it('throws NotFoundError when server not found', async () => {
      vi.mocked(projectRepo.findById).mockResolvedValue(makeProject({ id: 'proj-1' }));
      vi.mocked(serverRepo.findByName).mockResolvedValue(null);

      await expect(service.removeServer('proj-1', 'nonexistent')).rejects.toThrow(NotFoundError);
    });
  });

  describe('generateMcpConfig', () => {
    it('generates direct mode config with STDIO servers', async () => {
      const srv = makeServer({ id: 'srv-1', name: 'github', packageName: '@mcp/github', transport: 'STDIO' });
      const project = makeProject({
        id: 'proj-1',
        name: 'my-proj',
        proxyMode: 'direct',
        servers: [{ id: 'ps-1', server: { id: 'srv-1', name: 'github' } }],
      });

      vi.mocked(projectRepo.findById).mockResolvedValue(project);
      vi.mocked(serverRepo.findById).mockResolvedValue(srv);

      const config = await service.generateMcpConfig('proj-1');
      expect(config.mcpServers['github']).toBeDefined();
      expect(config.mcpServers['github']?.command).toBe('npx');
      expect(config.mcpServers['github']?.args).toEqual(['-y', '@mcp/github']);
    });

    it('generates direct mode config with SSE servers (URL-based)', async () => {
      const srv = makeServer({ id: 'srv-2', name: 'sse-server', transport: 'SSE' });
      const project = makeProject({
        id: 'proj-1',
        proxyMode: 'direct',
        servers: [{ id: 'ps-1', server: { id: 'srv-2', name: 'sse-server' } }],
      });

      vi.mocked(projectRepo.findById).mockResolvedValue(project);
      vi.mocked(serverRepo.findById).mockResolvedValue(srv);

      const config = await service.generateMcpConfig('proj-1');
      expect(config.mcpServers['sse-server']?.url).toBe('http://localhost:3100/api/v1/mcp/proxy/sse-server');
      expect(config.mcpServers['sse-server']?.command).toBeUndefined();
    });

    it('generates filtered mode config (single mcplocal entry)', async () => {
      const project = makeProject({
        id: 'proj-1',
        name: 'filtered-proj',
        proxyMode: 'filtered',
        llmProvider: 'openai',
        servers: [{ id: 'ps-1', server: { id: 'srv-1', name: 'github' } }],
      });

      vi.mocked(projectRepo.findById).mockResolvedValue(project);

      const config = await service.generateMcpConfig('proj-1');
      expect(Object.keys(config.mcpServers)).toHaveLength(1);
      expect(config.mcpServers['filtered-proj']?.url).toBe('http://localhost:3100/api/v1/mcp/proxy/project/filtered-proj');
    });

    it('resolves by name for mcp-config', async () => {
      const project = makeProject({
        id: 'proj-1',
        name: 'my-proj',
        proxyMode: 'direct',
        servers: [],
      });

      vi.mocked(projectRepo.findById).mockResolvedValue(null);
      vi.mocked(projectRepo.findByName).mockResolvedValue(project);

      const config = await service.generateMcpConfig('my-proj');
      expect(config.mcpServers).toEqual({});
    });

    it('includes env for STDIO servers', async () => {
      const srv = makeServer({
        id: 'srv-1',
        name: 'github',
        transport: 'STDIO',
        env: [{ name: 'GITHUB_TOKEN', value: 'tok123' }],
      });
      const project = makeProject({
        id: 'proj-1',
        proxyMode: 'direct',
        servers: [{ id: 'ps-1', server: { id: 'srv-1', name: 'github' } }],
      });

      vi.mocked(projectRepo.findById).mockResolvedValue(project);
      vi.mocked(serverRepo.findById).mockResolvedValue(srv);

      const config = await service.generateMcpConfig('proj-1');
      expect(config.mcpServers['github']?.env?.['GITHUB_TOKEN']).toBe('tok123');
    });
  });
});

508
src/mcpd/tests/prompt-routes.test.ts
Normal file
@@ -0,0 +1,508 @@
import { describe, it, expect, vi, afterEach } from 'vitest';
import Fastify from 'fastify';
import type { FastifyInstance } from 'fastify';
import { registerPromptRoutes } from '../src/routes/prompts.js';
import { PromptService } from '../src/services/prompt.service.js';
import { errorHandler } from '../src/middleware/error-handler.js';
import type { IPromptRepository } from '../src/repositories/prompt.repository.js';
import type { IPromptRequestRepository } from '../src/repositories/prompt-request.repository.js';
import type { IProjectRepository } from '../src/repositories/project.repository.js';
import type { Prompt, PromptRequest, Project } from '@prisma/client';

let app: FastifyInstance;

function makePrompt(overrides: Partial<Prompt> = {}): Prompt {
  return {
    id: 'prompt-1',
    name: 'test-prompt',
    content: 'Hello world',
    projectId: null,
    priority: 5,
    summary: null,
    chapters: null,
    linkTarget: null,
    version: 1,
    createdAt: new Date(),
    updatedAt: new Date(),
    ...overrides,
  };
}

function makePromptRequest(overrides: Partial<PromptRequest> = {}): PromptRequest {
  return {
    id: 'req-1',
    name: 'test-request',
    content: 'Proposed content',
    projectId: null,
    priority: 5,
    createdBySession: 'session-abc',
    createdByUserId: null,
    createdAt: new Date(),
    ...overrides,
  };
}

function makeProject(overrides: Partial<Project> = {}): Project {
  return {
    id: 'proj-1',
    name: 'homeautomation',
    description: '',
    prompt: '',
    proxyMode: 'direct',
    gated: true,
    llmProvider: null,
    llmModel: null,
    ownerId: 'user-1',
    createdAt: new Date(),
    updatedAt: new Date(),
    ...overrides,
  } as Project;
}

function mockPromptRepo(): IPromptRepository {
  return {
    findAll: vi.fn(async () => []),
    findGlobal: vi.fn(async () => []),
    findById: vi.fn(async () => null),
    findByNameAndProject: vi.fn(async () => null),
    create: vi.fn(async (data) => makePrompt(data)),
    update: vi.fn(async (id, data) => makePrompt({ id, ...data })),
    delete: vi.fn(async () => {}),
  };
}

function mockPromptRequestRepo(): IPromptRequestRepository {
  return {
    findAll: vi.fn(async () => []),
    findGlobal: vi.fn(async () => []),
    findById: vi.fn(async () => null),
    findByNameAndProject: vi.fn(async () => null),
    findBySession: vi.fn(async () => []),
    create: vi.fn(async (data) => makePromptRequest(data)),
    update: vi.fn(async (id, data) => makePromptRequest({ id, ...data })),
    delete: vi.fn(async () => {}),
  };
}

function makeProjectWithServers(
  overrides: Partial<Project> = {},
  serverNames: string[] = [],
) {
  return {
    ...makeProject(overrides),
    servers: serverNames.map((name, i) => ({
      id: `ps-${i}`,
      projectId: overrides.id ?? 'proj-1',
      serverId: `srv-${i}`,
      server: { id: `srv-${i}`, name },
    })),
  };
}

function mockProjectRepo(): IProjectRepository {
  return {
    findAll: vi.fn(async () => []),
    findById: vi.fn(async () => null),
    findByName: vi.fn(async () => null),
    create: vi.fn(async (data) => makeProject(data)),
    update: vi.fn(async (_id, data) => makeProject({ ...data as Partial<Project> })),
    delete: vi.fn(async () => {}),
  };
}

afterEach(async () => {
  if (app) await app.close();
});

function buildApp(opts?: {
  promptRepo?: IPromptRepository;
  promptRequestRepo?: IPromptRequestRepository;
  projectRepo?: IProjectRepository;
}) {
  const promptRepo = opts?.promptRepo ?? mockPromptRepo();
  const promptRequestRepo = opts?.promptRequestRepo ?? mockPromptRequestRepo();
  const projectRepo = opts?.projectRepo ?? mockProjectRepo();
  const service = new PromptService(promptRepo, promptRequestRepo, projectRepo);

  app = Fastify();
  app.setErrorHandler(errorHandler);
  registerPromptRoutes(app, service, projectRepo);
  return { app, promptRepo, promptRequestRepo, projectRepo, service };
}

describe('Prompt routes', () => {
  describe('GET /api/v1/prompts', () => {
    it('returns all prompts without project filter', async () => {
      const promptRepo = mockPromptRepo();
      const globalPrompt = makePrompt({ id: 'p-1', name: 'global-rule', projectId: null });
      const scopedPrompt = makePrompt({ id: 'p-2', name: 'scoped-rule', projectId: 'proj-1' });
      vi.mocked(promptRepo.findAll).mockResolvedValue([globalPrompt, scopedPrompt]);

      const { app: a } = buildApp({ promptRepo });
      const res = await a.inject({ method: 'GET', url: '/api/v1/prompts' });

      expect(res.statusCode).toBe(200);
      const body = res.json() as Prompt[];
      expect(body).toHaveLength(2);
      expect(promptRepo.findAll).toHaveBeenCalledWith(undefined);
    });

    it('filters by project name when ?project= is given', async () => {
      const promptRepo = mockPromptRepo();
      const projectRepo = mockProjectRepo();
      vi.mocked(projectRepo.findByName).mockResolvedValue(makeProject({ id: 'proj-1', name: 'homeautomation' }));
      vi.mocked(promptRepo.findAll).mockResolvedValue([
        makePrompt({ id: 'p-1', name: 'ha-rule', projectId: 'proj-1' }),
        makePrompt({ id: 'p-2', name: 'global-rule', projectId: null }),
      ]);

      const { app: a } = buildApp({ promptRepo, projectRepo });
      const res = await a.inject({ method: 'GET', url: '/api/v1/prompts?project=homeautomation' });

      expect(res.statusCode).toBe(200);
      expect(projectRepo.findByName).toHaveBeenCalledWith('homeautomation');
      expect(promptRepo.findAll).toHaveBeenCalledWith('proj-1');
    });

    it('returns only global prompts when ?scope=global', async () => {
      const promptRepo = mockPromptRepo();
      const globalOnly = [makePrompt({ id: 'p-g', name: 'global-rule', projectId: null })];
      vi.mocked(promptRepo.findGlobal).mockResolvedValue(globalOnly);

      const { app: a } = buildApp({ promptRepo });
      const res = await a.inject({ method: 'GET', url: '/api/v1/prompts?scope=global' });

      expect(res.statusCode).toBe(200);
      const body = res.json() as Prompt[];
      expect(body).toHaveLength(1);
      expect(promptRepo.findGlobal).toHaveBeenCalled();
      expect(promptRepo.findAll).not.toHaveBeenCalled();
    });

    it('returns 404 when ?project= references unknown project', async () => {
      const { app: a } = buildApp();
      const res = await a.inject({ method: 'GET', url: '/api/v1/prompts?project=nonexistent' });

      expect(res.statusCode).toBe(404);
      const body = res.json() as { error: string };
      expect(body.error).toContain('Project not found');
    });
  });

  describe('GET /api/v1/promptrequests', () => {
    it('returns all prompt requests without project filter', async () => {
      const promptRequestRepo = mockPromptRequestRepo();
      vi.mocked(promptRequestRepo.findAll).mockResolvedValue([
        makePromptRequest({ id: 'r-1', name: 'req-a' }),
      ]);

      const { app: a } = buildApp({ promptRequestRepo });
      const res = await a.inject({ method: 'GET', url: '/api/v1/promptrequests' });

      expect(res.statusCode).toBe(200);
      expect(promptRequestRepo.findAll).toHaveBeenCalledWith(undefined);
    });

    it('returns only global prompt requests when ?scope=global', async () => {
      const promptRequestRepo = mockPromptRequestRepo();
      vi.mocked(promptRequestRepo.findGlobal).mockResolvedValue([]);

      const { app: a } = buildApp({ promptRequestRepo });
      const res = await a.inject({ method: 'GET', url: '/api/v1/promptrequests?scope=global' });

      expect(res.statusCode).toBe(200);
      expect(promptRequestRepo.findGlobal).toHaveBeenCalled();
      expect(promptRequestRepo.findAll).not.toHaveBeenCalled();
    });

    it('filters by project name when ?project= is given', async () => {
      const promptRequestRepo = mockPromptRequestRepo();
      const projectRepo = mockProjectRepo();
      vi.mocked(projectRepo.findByName).mockResolvedValue(makeProject({ id: 'proj-1' }));

      const { app: a } = buildApp({ promptRequestRepo, projectRepo });
      const res = await a.inject({ method: 'GET', url: '/api/v1/promptrequests?project=homeautomation' });

      expect(res.statusCode).toBe(200);
      expect(promptRequestRepo.findAll).toHaveBeenCalledWith('proj-1');
    });

    it('returns 404 for unknown project on promptrequests', async () => {
      const { app: a } = buildApp();
      const res = await a.inject({ method: 'GET', url: '/api/v1/promptrequests?project=nope' });

      expect(res.statusCode).toBe(404);
    });
  });

  describe('POST /api/v1/promptrequests', () => {
    it('creates a global prompt request (no project)', async () => {
      const promptRequestRepo = mockPromptRequestRepo();
      const { app: a } = buildApp({ promptRequestRepo });
      const res = await a.inject({
        method: 'POST',
        url: '/api/v1/promptrequests',
        payload: { name: 'global-req', content: 'some content' },
      });

      expect(res.statusCode).toBe(201);
      expect(promptRequestRepo.create).toHaveBeenCalledWith(
        expect.objectContaining({ name: 'global-req', content: 'some content' }),
      );
    });

    it('resolves project name to ID when project given', async () => {
      const promptRequestRepo = mockPromptRequestRepo();
      const projectRepo = mockProjectRepo();
      const proj = makeProject({ id: 'proj-1', name: 'myproj' });
      vi.mocked(projectRepo.findByName).mockResolvedValue(proj);
      vi.mocked(projectRepo.findById).mockResolvedValue(proj);

      const { app: a } = buildApp({ promptRequestRepo, projectRepo });
      const res = await a.inject({
        method: 'POST',
        url: '/api/v1/promptrequests',
        payload: { name: 'scoped-req', content: 'text', project: 'myproj' },
      });

      expect(res.statusCode).toBe(201);
      expect(projectRepo.findByName).toHaveBeenCalledWith('myproj');
      expect(promptRequestRepo.create).toHaveBeenCalledWith(
        expect.objectContaining({ name: 'scoped-req', projectId: 'proj-1' }),
      );
    });

    it('returns 404 for unknown project name', async () => {
      const { app: a } = buildApp();
      const res = await a.inject({
        method: 'POST',
        url: '/api/v1/promptrequests',
        payload: { name: 'bad-req', content: 'x', project: 'nope' },
      });

      expect(res.statusCode).toBe(404);
    });
  });

  describe('POST /api/v1/promptrequests/:id/approve', () => {
    it('atomically approves a prompt request', async () => {
      const promptRequestRepo = mockPromptRequestRepo();
      const promptRepo = mockPromptRepo();
      const req = makePromptRequest({ id: 'req-1', name: 'my-rule', projectId: 'proj-1' });
      vi.mocked(promptRequestRepo.findById).mockResolvedValue(req);

      const { app: a } = buildApp({ promptRepo, promptRequestRepo });
      const res = await a.inject({ method: 'POST', url: '/api/v1/promptrequests/req-1/approve' });

      expect(res.statusCode).toBe(200);
      expect(promptRepo.create).toHaveBeenCalledWith({
        name: 'my-rule',
        content: 'Proposed content',
        projectId: 'proj-1',
      });
      expect(promptRequestRepo.delete).toHaveBeenCalledWith('req-1');
    });
  });

  describe('Security: projectId tampering', () => {
    it('rejects projectId in prompt update payload', async () => {
      const promptRepo = mockPromptRepo();
      vi.mocked(promptRepo.findById).mockResolvedValue(makePrompt({ id: 'p-1', projectId: 'proj-a' }));

      const { app: a } = buildApp({ promptRepo });
      const res = await a.inject({
        method: 'PUT',
        url: '/api/v1/prompts/p-1',
        payload: { content: 'new content', projectId: 'proj-evil' },
      });

      // Should succeed but ignore projectId — UpdatePromptSchema strips it
      expect(res.statusCode).toBe(200);
      expect(promptRepo.update).toHaveBeenCalledWith('p-1', { content: 'new content' });
      // projectId must NOT be in the update call
      const updateArg = vi.mocked(promptRepo.update).mock.calls[0]![1];
      expect(updateArg).not.toHaveProperty('projectId');
    });

    it('rejects projectId in promptrequest update payload', async () => {
      const promptRequestRepo = mockPromptRequestRepo();
      vi.mocked(promptRequestRepo.findById).mockResolvedValue(makePromptRequest({ id: 'r-1', projectId: 'proj-a' }));

      const { app: a } = buildApp({ promptRequestRepo });
      const res = await a.inject({
        method: 'PUT',
        url: '/api/v1/promptrequests/r-1',
        payload: { content: 'new content', projectId: 'proj-evil' },
      });

      expect(res.statusCode).toBe(200);
      expect(promptRequestRepo.update).toHaveBeenCalledWith('r-1', { content: 'new content' });
      const updateArg = vi.mocked(promptRequestRepo.update).mock.calls[0]![1];
      expect(updateArg).not.toHaveProperty('projectId');
    });
  });

  describe('linkStatus enrichment', () => {
    it('returns linkStatus=null for non-linked prompts', async () => {
      const promptRepo = mockPromptRepo();
      vi.mocked(promptRepo.findAll).mockResolvedValue([
        makePrompt({ id: 'p-1', name: 'plain', linkTarget: null }),
      ]);

      const { app: a } = buildApp({ promptRepo });
      const res = await a.inject({ method: 'GET', url: '/api/v1/prompts' });

      expect(res.statusCode).toBe(200);
      const body = res.json() as Array<{ linkStatus: string | null }>;
      expect(body[0]!.linkStatus).toBeNull();
    });

    it('returns linkStatus=alive when project and server exist', async () => {
      const promptRepo = mockPromptRepo();
      const projectRepo = mockProjectRepo();
      vi.mocked(promptRepo.findAll).mockResolvedValue([
        makePrompt({ id: 'p-1', name: 'linked', linkTarget: 'source-proj/docmost-mcp:docmost://pages/abc' }),
      ]);
      vi.mocked(projectRepo.findByName).mockImplementation(async (name) => {
        if (name === 'source-proj') {
          return makeProjectWithServers({ id: 'sp-1', name: 'source-proj' }, ['docmost-mcp']) as never;
        }
        return null;
      });

      const { app: a } = buildApp({ promptRepo, projectRepo });
      const res = await a.inject({ method: 'GET', url: '/api/v1/prompts' });

      expect(res.statusCode).toBe(200);
      const body = res.json() as Array<{ linkStatus: string }>;
      expect(body[0]!.linkStatus).toBe('alive');
    });

    it('returns linkStatus=dead when source project not found', async () => {
      const promptRepo = mockPromptRepo();
      vi.mocked(promptRepo.findAll).mockResolvedValue([
        makePrompt({ id: 'p-1', name: 'broken', linkTarget: 'missing-proj/srv:some://uri' }),
      ]);

      const { app: a } = buildApp({ promptRepo });
      const res = await a.inject({ method: 'GET', url: '/api/v1/prompts' });

      expect(res.statusCode).toBe(200);
      const body = res.json() as Array<{ linkStatus: string }>;
      expect(body[0]!.linkStatus).toBe('dead');
    });

    it('returns linkStatus=dead when server not in project', async () => {
      const promptRepo = mockPromptRepo();
      const projectRepo = mockProjectRepo();
      vi.mocked(promptRepo.findAll).mockResolvedValue([
        makePrompt({ id: 'p-1', name: 'wrong-srv', linkTarget: 'proj/wrong-server:some://uri' }),
      ]);
      vi.mocked(projectRepo.findByName).mockResolvedValue(
        makeProjectWithServers({ id: 'sp-1', name: 'proj' }, ['other-server']) as never,
      );

      const { app: a } = buildApp({ promptRepo, projectRepo });
      const res = await a.inject({ method: 'GET', url: '/api/v1/prompts' });

      expect(res.statusCode).toBe(200);
      const body = res.json() as Array<{ linkStatus: string }>;
      expect(body[0]!.linkStatus).toBe('dead');
    });

    it('enriches single prompt GET with linkStatus', async () => {
      const promptRepo = mockPromptRepo();
      const projectRepo = mockProjectRepo();
      vi.mocked(promptRepo.findById).mockResolvedValue(
        makePrompt({ id: 'p-1', name: 'linked', linkTarget: 'proj/srv:some://uri' }),
      );
      vi.mocked(projectRepo.findByName).mockResolvedValue(
        makeProjectWithServers({ id: 'sp-1', name: 'proj' }, ['srv']) as never,
      );

      const { app: a } = buildApp({ promptRepo, projectRepo });
      const res = await a.inject({ method: 'GET', url: '/api/v1/prompts/p-1' });

      expect(res.statusCode).toBe(200);
      const body = res.json() as { linkStatus: string };
      expect(body.linkStatus).toBe('alive');
    });

    it('caches project lookup for multiple linked prompts', async () => {
      const promptRepo = mockPromptRepo();
      const projectRepo = mockProjectRepo();
      vi.mocked(promptRepo.findAll).mockResolvedValue([
|
||||
makePrompt({ id: 'p-1', name: 'link-a', linkTarget: 'proj/srv:uri-a' }),
|
||||
makePrompt({ id: 'p-2', name: 'link-b', linkTarget: 'proj/srv:uri-b' }),
|
||||
]);
|
||||
vi.mocked(projectRepo.findByName).mockResolvedValue(
|
||||
makeProjectWithServers({ id: 'sp-1', name: 'proj' }, ['srv']) as never,
|
||||
);
|
||||
|
||||
const { app: a } = buildApp({ promptRepo, projectRepo });
|
||||
const res = await a.inject({ method: 'GET', url: '/api/v1/prompts' });
|
||||
|
||||
expect(res.statusCode).toBe(200);
|
||||
const body = res.json() as Array<{ linkStatus: string }>;
|
||||
expect(body).toHaveLength(2);
|
||||
expect(body[0]!.linkStatus).toBe('alive');
|
||||
expect(body[1]!.linkStatus).toBe('alive');
|
||||
// Should only call findByName once (cached)
|
||||
expect(projectRepo.findByName).toHaveBeenCalledTimes(1);
|
||||
});
|
||||
|
||||
it('supports ?projectId= query parameter', async () => {
|
||||
const promptRepo = mockPromptRepo();
|
||||
vi.mocked(promptRepo.findAll).mockResolvedValue([
|
||||
makePrompt({ id: 'p-1', name: 'scoped', projectId: 'proj-1' }),
|
||||
]);
|
||||
|
||||
const { app: a } = buildApp({ promptRepo });
|
||||
const res = await a.inject({ method: 'GET', url: '/api/v1/prompts?projectId=proj-1' });
|
||||
|
||||
expect(res.statusCode).toBe(200);
|
||||
expect(promptRepo.findAll).toHaveBeenCalledWith('proj-1');
|
||||
});
|
||||
});
|
||||
|
||||
describe('GET /api/v1/projects/:name/prompts/visible', () => {
|
||||
it('returns approved prompts + session pending requests', async () => {
|
||||
const promptRepo = mockPromptRepo();
|
||||
const promptRequestRepo = mockPromptRequestRepo();
|
||||
const projectRepo = mockProjectRepo();
|
||||
vi.mocked(projectRepo.findByName).mockResolvedValue(makeProject({ id: 'proj-1' }));
|
||||
vi.mocked(promptRepo.findAll).mockResolvedValue([
|
||||
makePrompt({ name: 'approved-one', projectId: 'proj-1' }),
|
||||
makePrompt({ name: 'global-one', projectId: null }),
|
||||
]);
|
||||
vi.mocked(promptRequestRepo.findBySession).mockResolvedValue([
|
||||
makePromptRequest({ name: 'pending-one', projectId: 'proj-1' }),
|
||||
]);
|
||||
|
||||
const { app: a } = buildApp({ promptRepo, promptRequestRepo, projectRepo });
|
||||
const res = await a.inject({
|
||||
method: 'GET',
|
||||
url: '/api/v1/projects/homeautomation/prompts/visible?session=sess-123',
|
||||
});
|
||||
|
||||
expect(res.statusCode).toBe(200);
|
||||
const body = res.json() as Array<{ name: string; type: string }>;
|
||||
expect(body).toHaveLength(3);
|
||||
expect(body.map((b) => b.name)).toContain('approved-one');
|
||||
expect(body.map((b) => b.name)).toContain('global-one');
|
||||
expect(body.map((b) => b.name)).toContain('pending-one');
|
||||
const pending = body.find((b) => b.name === 'pending-one');
|
||||
expect(pending?.type).toBe('promptrequest');
|
||||
});
|
||||
|
||||
it('returns 404 for unknown project', async () => {
|
||||
const { app: a } = buildApp();
|
||||
const res = await a.inject({
|
||||
method: 'GET',
|
||||
url: '/api/v1/projects/nonexistent/prompts/visible',
|
||||
});
|
||||
|
||||
expect(res.statusCode).toBe(404);
|
||||
});
|
||||
});
|
||||
});
|
||||
229 src/mcpd/tests/rbac-definition-service.test.ts Normal file
@@ -0,0 +1,229 @@
import { describe, it, expect, vi, beforeEach } from 'vitest';
import { RbacDefinitionService } from '../src/services/rbac-definition.service.js';
import { NotFoundError, ConflictError } from '../src/services/mcp-server.service.js';
import type { IRbacDefinitionRepository } from '../src/repositories/rbac-definition.repository.js';
import type { RbacDefinition } from '@prisma/client';

function makeDef(overrides: Partial<RbacDefinition> = {}): RbacDefinition {
  return {
    id: 'def-1',
    name: 'test-rbac',
    subjects: [{ kind: 'User', name: 'alice@example.com' }],
    roleBindings: [{ role: 'edit', resource: '*' }],
    version: 1,
    createdAt: new Date(),
    updatedAt: new Date(),
    ...overrides,
  };
}

function mockRepo(): IRbacDefinitionRepository {
  return {
    findAll: vi.fn(async () => []),
    findById: vi.fn(async () => null),
    findByName: vi.fn(async () => null),
    create: vi.fn(async (data) => makeDef({ name: data.name, subjects: data.subjects, roleBindings: data.roleBindings })),
    update: vi.fn(async (id, data) => makeDef({ id, ...data })),
    delete: vi.fn(async () => {}),
  };
}

describe('RbacDefinitionService', () => {
  let repo: ReturnType<typeof mockRepo>;
  let service: RbacDefinitionService;

  beforeEach(() => {
    repo = mockRepo();
    service = new RbacDefinitionService(repo);
  });

  describe('list', () => {
    it('returns all definitions', async () => {
      const defs = await service.list();
      expect(repo.findAll).toHaveBeenCalled();
      expect(defs).toEqual([]);
    });
  });

  describe('getById', () => {
    it('returns definition when found', async () => {
      const def = makeDef();
      vi.mocked(repo.findById).mockResolvedValue(def);
      const result = await service.getById('def-1');
      expect(result.id).toBe('def-1');
    });

    it('throws NotFoundError when not found', async () => {
      await expect(service.getById('missing')).rejects.toThrow(NotFoundError);
    });
  });

  describe('getByName', () => {
    it('returns definition when found', async () => {
      const def = makeDef();
      vi.mocked(repo.findByName).mockResolvedValue(def);
      const result = await service.getByName('test-rbac');
      expect(result.name).toBe('test-rbac');
    });

    it('throws NotFoundError when not found', async () => {
      await expect(service.getByName('missing')).rejects.toThrow(NotFoundError);
    });
  });

  describe('create', () => {
    it('creates a definition with valid input', async () => {
      const result = await service.create({
        name: 'new-rbac',
        subjects: [{ kind: 'User', name: 'alice@example.com' }],
        roleBindings: [{ role: 'edit', resource: '*' }],
      });
      expect(result.name).toBe('new-rbac');
      expect(repo.create).toHaveBeenCalled();
    });

    it('throws ConflictError when name exists', async () => {
      vi.mocked(repo.findByName).mockResolvedValue(makeDef());
      await expect(
        service.create({
          name: 'test-rbac',
          subjects: [{ kind: 'User', name: 'bob@example.com' }],
          roleBindings: [{ role: 'view', resource: 'servers' }],
        }),
      ).rejects.toThrow(ConflictError);
    });

    it('throws on missing subjects', async () => {
      await expect(
        service.create({
          name: 'bad-rbac',
          subjects: [],
          roleBindings: [{ role: 'view', resource: 'servers' }],
        }),
      ).rejects.toThrow();
    });

    it('throws on missing roleBindings', async () => {
      await expect(
        service.create({
          name: 'bad-rbac',
          subjects: [{ kind: 'User', name: 'alice@example.com' }],
          roleBindings: [],
        }),
      ).rejects.toThrow();
    });

    it('throws on invalid role', async () => {
      await expect(
        service.create({
          name: 'bad-rbac',
          subjects: [{ kind: 'User', name: 'alice@example.com' }],
          roleBindings: [{ role: 'superadmin', resource: '*' }],
        }),
      ).rejects.toThrow();
    });

    it('throws on invalid subject kind', async () => {
      await expect(
        service.create({
          name: 'bad-rbac',
          subjects: [{ kind: 'Robot', name: 'bot-1' }],
          roleBindings: [{ role: 'view', resource: 'servers' }],
        }),
      ).rejects.toThrow();
    });

    it('throws on invalid name format', async () => {
      await expect(
        service.create({
          name: 'Invalid Name!',
          subjects: [{ kind: 'User', name: 'alice@example.com' }],
          roleBindings: [{ role: 'view', resource: 'servers' }],
        }),
      ).rejects.toThrow();
    });

    it('normalizes singular resource names to plural', async () => {
      await service.create({
        name: 'singular-rbac',
        subjects: [{ kind: 'User', name: 'alice@example.com' }],
        roleBindings: [
          { role: 'view', resource: 'server' },
          { role: 'edit', resource: 'secret', name: 'my-secret' },
        ],
      });
      const call = vi.mocked(repo.create).mock.calls[0]![0];
      expect(call.roleBindings[0]!.resource).toBe('servers');
      expect(call.roleBindings[1]!.resource).toBe('secrets');
      expect(call.roleBindings[1]!.name).toBe('my-secret');
    });

    it('creates a definition with operation bindings', async () => {
      const result = await service.create({
        name: 'ops-rbac',
        subjects: [{ kind: 'User', name: 'alice@example.com' }],
        roleBindings: [{ role: 'run', action: 'logs' }],
      });
      expect(result.name).toBe('ops-rbac');
      expect(repo.create).toHaveBeenCalled();
      const call = vi.mocked(repo.create).mock.calls[0]![0];
      expect(call.roleBindings[0]!.action).toBe('logs');
    });

    it('creates a definition with mixed resource and operation bindings', async () => {
      const result = await service.create({
        name: 'mixed-rbac',
        subjects: [{ kind: 'User', name: 'alice@example.com' }],
        roleBindings: [
          { role: 'view', resource: 'servers' },
          { role: 'run', action: 'logs' },
        ],
      });
      expect(result.name).toBe('mixed-rbac');
      expect(repo.create).toHaveBeenCalled();
      const call = vi.mocked(repo.create).mock.calls[0]![0];
      expect(call.roleBindings).toHaveLength(2);
      expect(call.roleBindings[0]!.resource).toBe('servers');
      expect(call.roleBindings[1]!.action).toBe('logs');
    });

    it('creates a definition with name-scoped resource binding', async () => {
      const result = await service.create({
        name: 'scoped-rbac',
        subjects: [{ kind: 'User', name: 'alice@example.com' }],
        roleBindings: [{ role: 'view', resource: 'servers', name: 'my-ha' }],
      });
      expect(result.name).toBe('scoped-rbac');
      expect(repo.create).toHaveBeenCalled();
      const call = vi.mocked(repo.create).mock.calls[0]![0];
      expect(call.roleBindings[0]!.resource).toBe('servers');
      expect(call.roleBindings[0]!.name).toBe('my-ha');
    });
  });

  describe('update', () => {
    it('updates an existing definition', async () => {
      vi.mocked(repo.findById).mockResolvedValue(makeDef());
      await service.update('def-1', { subjects: [{ kind: 'User', name: 'bob@example.com' }] });
      expect(repo.update).toHaveBeenCalledWith('def-1', {
        subjects: [{ kind: 'User', name: 'bob@example.com' }],
      });
    });

    it('throws NotFoundError when definition does not exist', async () => {
      await expect(service.update('missing', {})).rejects.toThrow(NotFoundError);
    });
  });

  describe('delete', () => {
    it('deletes an existing definition', async () => {
      vi.mocked(repo.findById).mockResolvedValue(makeDef());
      await service.delete('def-1');
      expect(repo.delete).toHaveBeenCalledWith('def-1');
    });

    it('throws NotFoundError when definition does not exist', async () => {
      await expect(service.delete('missing')).rejects.toThrow(NotFoundError);
    });
  });
});
444 src/mcpd/tests/rbac-name-scope-integration.test.ts Normal file
@@ -0,0 +1,444 @@
/**
 * Integration tests reproducing RBAC name-scoped access bugs.
 *
 * Bug 1: `mcpctl get servers` shows ALL servers despite user only having
 *        view:servers+name:my-home-assistant
 * Bug 2: `mcpctl get server my-home-assistant -o yaml` returns 403 because
 *        CLI resolves name→CUID, and RBAC compares CUID against binding name
 *
 * These tests spin up a full Fastify app with auth + RBAC hooks + server routes,
 * exactly like main.ts, to catch regressions at the HTTP level.
 */
import { describe, it, expect, vi, afterEach, beforeEach } from 'vitest';
import Fastify from 'fastify';
import type { FastifyInstance } from 'fastify';
import { registerMcpServerRoutes } from '../src/routes/mcp-servers.js';
import { McpServerService } from '../src/services/mcp-server.service.js';
import { InstanceService } from '../src/services/instance.service.js';
import { RbacService } from '../src/services/rbac.service.js';
import { errorHandler } from '../src/middleware/error-handler.js';
import type { IMcpServerRepository, IMcpInstanceRepository } from '../src/repositories/interfaces.js';
import type { IRbacDefinitionRepository } from '../src/repositories/rbac-definition.repository.js';
import type { McpOrchestrator } from '../src/services/orchestrator.js';
import type { McpServer, RbacDefinition, PrismaClient } from '@prisma/client';

// ── Test data ──

const SERVERS: McpServer[] = [
  { id: 'clxyz000000001', name: 'my-home-assistant', description: 'HA server', transport: 'STDIO', packageName: null, dockerImage: null, repositoryUrl: null, externalUrl: null, command: null, containerPort: null, replicas: 1, env: [], healthCheck: null, version: 1, createdAt: new Date(), updatedAt: new Date() },
  { id: 'clxyz000000002', name: 'slack-server', description: 'Slack MCP', transport: 'STDIO', packageName: null, dockerImage: null, repositoryUrl: null, externalUrl: null, command: null, containerPort: null, replicas: 1, env: [], healthCheck: null, version: 1, createdAt: new Date(), updatedAt: new Date() },
  { id: 'clxyz000000003', name: 'github-server', description: 'GitHub MCP', transport: 'STDIO', packageName: null, dockerImage: null, repositoryUrl: null, externalUrl: null, command: null, containerPort: null, replicas: 1, env: [], healthCheck: null, version: 1, createdAt: new Date(), updatedAt: new Date() },
];

// User tokens → userId mapping
const SESSIONS: Record<string, { userId: string }> = {
  'scoped-token': { userId: 'user-scoped' },
  'admin-token': { userId: 'user-admin' },
  'multi-scoped-token': { userId: 'user-multi' },
  'secrets-only-token': { userId: 'user-secrets' },
  'edit-scoped-token': { userId: 'user-edit-scoped' },
};

// User email mapping
const USERS: Record<string, { email: string }> = {
  'user-scoped': { email: 'scoped@example.com' },
  'user-admin': { email: 'admin@example.com' },
  'user-multi': { email: 'multi@example.com' },
  'user-secrets': { email: 'secrets@example.com' },
  'user-edit-scoped': { email: 'editscoped@example.com' },
};

// RBAC definitions
const RBAC_DEFS: RbacDefinition[] = [
  {
    id: 'rbac-scoped', name: 'scoped-view', version: 1, createdAt: new Date(), updatedAt: new Date(),
    subjects: [{ kind: 'User', name: 'scoped@example.com' }],
    roleBindings: [{ role: 'view', resource: 'servers', name: 'my-home-assistant' }],
  },
  {
    id: 'rbac-admin', name: 'admin-all', version: 1, createdAt: new Date(), updatedAt: new Date(),
    subjects: [{ kind: 'User', name: 'admin@example.com' }],
    roleBindings: [{ role: 'edit', resource: '*' }],
  },
  {
    id: 'rbac-multi', name: 'multi-scoped', version: 1, createdAt: new Date(), updatedAt: new Date(),
    subjects: [{ kind: 'User', name: 'multi@example.com' }],
    roleBindings: [
      { role: 'view', resource: 'servers', name: 'my-home-assistant' },
      { role: 'view', resource: 'servers', name: 'slack-server' },
    ],
  },
  {
    id: 'rbac-secrets', name: 'secrets-only', version: 1, createdAt: new Date(), updatedAt: new Date(),
    subjects: [{ kind: 'User', name: 'secrets@example.com' }],
    roleBindings: [{ role: 'view', resource: 'secrets' }],
  },
  {
    id: 'rbac-edit-scoped', name: 'edit-scoped', version: 1, createdAt: new Date(), updatedAt: new Date(),
    subjects: [{ kind: 'User', name: 'editscoped@example.com' }],
    roleBindings: [{ role: 'edit', resource: 'servers', name: 'my-home-assistant' }],
  },
];

// ── Mock factories ──

function mockServerRepo(): IMcpServerRepository {
  return {
    findAll: vi.fn(async () => [...SERVERS]),
    findById: vi.fn(async (id: string) => SERVERS.find((s) => s.id === id) ?? null),
    findByName: vi.fn(async (name: string) => SERVERS.find((s) => s.name === name) ?? null),
    create: vi.fn(async () => SERVERS[0]!),
    update: vi.fn(async () => SERVERS[0]!),
    delete: vi.fn(async () => {}),
  };
}

function mockRbacRepo(): IRbacDefinitionRepository {
  return {
    findAll: vi.fn(async () => [...RBAC_DEFS]),
    findById: vi.fn(async () => null),
    findByName: vi.fn(async () => null),
    create: vi.fn(async () => RBAC_DEFS[0]!),
    update: vi.fn(async () => RBAC_DEFS[0]!),
    delete: vi.fn(async () => {}),
  };
}

function mockPrisma(): PrismaClient {
  return {
    user: {
      findUnique: vi.fn(async ({ where }: { where: { id: string } }) => {
        const u = USERS[where.id];
        return u ? { email: u.email } : null;
      }),
    },
    groupMember: {
      findMany: vi.fn(async () => []),
    },
  } as unknown as PrismaClient;
}

function stubInstanceRepo(): IMcpInstanceRepository {
  return {
    findAll: vi.fn(async () => []),
    findById: vi.fn(async () => null),
    findByContainerId: vi.fn(async () => null),
    create: vi.fn(async (data) => ({
      id: 'inst-stub', serverId: data.serverId, containerId: null,
      status: data.status ?? 'STOPPED', port: null, metadata: {},
      healthStatus: null, lastHealthCheck: null, events: [],
      version: 1, createdAt: new Date(), updatedAt: new Date(),
    }) as never),
    updateStatus: vi.fn(async () => ({}) as never),
    delete: vi.fn(async () => {}),
  };
}

function stubOrchestrator(): McpOrchestrator {
  return {
    ping: vi.fn(async () => true),
    pullImage: vi.fn(async () => {}),
    createContainer: vi.fn(async () => ({ containerId: 'ctr', name: 'stub', state: 'running' as const, port: 3000, createdAt: new Date() })),
    stopContainer: vi.fn(async () => {}),
    removeContainer: vi.fn(async () => {}),
    inspectContainer: vi.fn(async () => ({ containerId: 'ctr', name: 'stub', state: 'running' as const, createdAt: new Date() })),
    getContainerLogs: vi.fn(async () => ({ stdout: '', stderr: '' })),
  };
}

// ── App setup (replicates main.ts hooks) ──

import { normalizeResource } from '../src/validation/rbac-definition.schema.js';
import type { RbacAction } from '../src/services/rbac.service.js';

type PermissionCheck =
  | { kind: 'resource'; resource: string; action: RbacAction; resourceName?: string }
  | { kind: 'operation'; operation: string }
  | { kind: 'skip' };

function mapUrlToPermission(method: string, url: string): PermissionCheck {
  const match = url.match(/^\/api\/v1\/([a-z-]+)/);
  if (!match) return { kind: 'skip' };
  const segment = match[1] as string;

  if (segment === 'backup') return { kind: 'operation', operation: 'backup' };
  if (segment === 'restore') return { kind: 'operation', operation: 'restore' };
  if (segment === 'audit-logs' && method === 'DELETE') return { kind: 'operation', operation: 'audit-purge' };

  const resourceMap: Record<string, string | undefined> = {
    servers: 'servers', instances: 'instances', secrets: 'secrets',
    projects: 'projects', templates: 'templates', users: 'users',
    groups: 'groups', rbac: 'rbac', 'audit-logs': 'rbac', mcp: 'servers',
  };

  const resource = resourceMap[segment];
  if (resource === undefined) return { kind: 'skip' };

  let action: RbacAction;
  switch (method) {
    case 'GET': case 'HEAD': action = 'view'; break;
    case 'POST': action = 'create'; break;
    case 'DELETE': action = 'delete'; break;
    default: action = 'edit'; break;
  }

  const nameMatch = url.match(/^\/api\/v1\/[a-z-]+\/([^/?]+)/);
  const resourceName = nameMatch?.[1];
  const check: PermissionCheck = { kind: 'resource', resource, action };
  if (resourceName !== undefined) (check as { resourceName: string }).resourceName = resourceName;
  return check;
}

let app: FastifyInstance;

afterEach(async () => {
  if (app) await app.close();
});

async function createTestApp() {
  const serverRepo = mockServerRepo();
  const rbacRepo = mockRbacRepo();
  const prisma = mockPrisma();
  const rbacService = new RbacService(rbacRepo, prisma);

  const CUID_RE = /^c[^\s-]{8,}$/i;
  const nameResolvers: Record<string, { findById(id: string): Promise<{ name: string } | null> }> = {
    servers: serverRepo,
  };

  app = Fastify({ logger: false });
  app.setErrorHandler(errorHandler);

  // Auth hook (mock)
  app.addHook('preHandler', async (request, reply) => {
    const url = request.url;
    if (url.startsWith('/api/v1/auth/') || url === '/healthz') return;
    if (!url.startsWith('/api/v1/')) return;

    const header = request.headers.authorization;
    if (!header?.startsWith('Bearer ')) {
      reply.code(401).send({ error: 'Unauthorized' });
      return;
    }
    const token = header.slice(7);
    const session = SESSIONS[token];
    if (!session) {
      reply.code(401).send({ error: 'Invalid token' });
      return;
    }
    request.userId = session.userId;
  });

  // RBAC hook (replicates main.ts)
  app.addHook('preHandler', async (request, reply) => {
    if (reply.sent) return;
    const url = request.url;
    if (url.startsWith('/api/v1/auth/') || url === '/healthz') return;
    if (!url.startsWith('/api/v1/')) return;
    if (request.userId === undefined) return;

    const check = mapUrlToPermission(request.method, url);
    if (check.kind === 'skip') return;

    let allowed: boolean;
    if (check.kind === 'operation') {
      allowed = await rbacService.canRunOperation(request.userId, check.operation);
    } else {
      // CUID→name resolution
      if (check.resourceName !== undefined && CUID_RE.test(check.resourceName)) {
        const resolver = nameResolvers[check.resource];
        if (resolver) {
          const entity = await resolver.findById(check.resourceName);
          if (entity) check.resourceName = entity.name;
        }
      }
      allowed = await rbacService.canAccess(request.userId, check.action, check.resource, check.resourceName);
      // Compute scope for list filtering
      if (allowed && check.resourceName === undefined) {
        request.rbacScope = await rbacService.getAllowedScope(request.userId, check.action, check.resource);
      }
    }
    if (!allowed) {
      reply.code(403).send({ error: 'Forbidden' });
    }
  });

  // Routes
  const serverService = new McpServerService(serverRepo);
  const instanceService = new InstanceService(stubInstanceRepo(), serverRepo, stubOrchestrator());
  serverService.setInstanceService(instanceService);
  registerMcpServerRoutes(app, serverService, instanceService);

  // preSerialization hook (list filtering)
  app.addHook('preSerialization', async (request, _reply, payload) => {
    if (!request.rbacScope || request.rbacScope.wildcard) return payload;
    if (!Array.isArray(payload)) return payload;
    return (payload as Array<Record<string, unknown>>).filter((item) => {
      const name = item['name'];
      return typeof name === 'string' && request.rbacScope!.names.has(name);
    });
  });

  await app.ready();
  return app;
}

// ── Tests ──

describe('RBAC name-scoped integration (reproduces mcpctl bugs)', () => {
  beforeEach(async () => {
    await createTestApp();
  });

  describe('Bug 1: mcpctl get servers (list filtering)', () => {
    it('name-scoped user sees ONLY their permitted server', async () => {
      const res = await app.inject({
        method: 'GET',
        url: '/api/v1/servers',
        headers: { authorization: 'Bearer scoped-token' },
      });
      expect(res.statusCode).toBe(200);
      const servers = res.json<Array<{ name: string }>>();
      expect(servers).toHaveLength(1);
      expect(servers[0]!.name).toBe('my-home-assistant');
    });

    it('wildcard user sees ALL servers', async () => {
      const res = await app.inject({
        method: 'GET',
        url: '/api/v1/servers',
        headers: { authorization: 'Bearer admin-token' },
      });
      expect(res.statusCode).toBe(200);
      const servers = res.json<Array<{ name: string }>>();
      expect(servers).toHaveLength(3);
    });

    it('user with multiple name-scoped bindings sees only those servers', async () => {
      const res = await app.inject({
        method: 'GET',
        url: '/api/v1/servers',
        headers: { authorization: 'Bearer multi-scoped-token' },
      });
      expect(res.statusCode).toBe(200);
      const servers = res.json<Array<{ name: string }>>();
      expect(servers).toHaveLength(2);
      const names = servers.map((s) => s.name);
      expect(names).toContain('my-home-assistant');
      expect(names).toContain('slack-server');
      expect(names).not.toContain('github-server');
    });

    it('user with no server permissions gets 403', async () => {
      const res = await app.inject({
        method: 'GET',
        url: '/api/v1/servers',
        headers: { authorization: 'Bearer secrets-only-token' },
      });
      expect(res.statusCode).toBe(403);
    });
  });

  describe('Bug 2: mcpctl get server NAME (CUID resolution)', () => {
    it('allows access when URL contains CUID matching a name-scoped binding', async () => {
      // CLI resolves my-home-assistant → clxyz000000001
      const res = await app.inject({
        method: 'GET',
        url: '/api/v1/servers/clxyz000000001',
        headers: { authorization: 'Bearer scoped-token' },
      });
      expect(res.statusCode).toBe(200);
      expect(res.json<{ name: string }>().name).toBe('my-home-assistant');
    });

    it('denies access when CUID resolves to server NOT in binding', async () => {
      const res = await app.inject({
        method: 'GET',
        url: '/api/v1/servers/clxyz000000002',
        headers: { authorization: 'Bearer scoped-token' },
      });
      expect(res.statusCode).toBe(403);
    });

    it('passes RBAC when URL has human-readable name (route 404 is expected)', async () => {
      // Human name in URL: RBAC passes (matches binding directly),
      // but the route only does findById, so it 404s.
      // CLI always resolves name→CUID first, so this doesn't happen in practice.
      // The important thing: it does NOT return 403.
      const res = await app.inject({
        method: 'GET',
        url: '/api/v1/servers/my-home-assistant',
        headers: { authorization: 'Bearer scoped-token' },
      });
      expect(res.statusCode).toBe(404); // Not 403!
    });

    it('handles nonexistent CUID gracefully (403)', async () => {
      const res = await app.inject({
        method: 'GET',
        url: '/api/v1/servers/cnonexistent12345678',
        headers: { authorization: 'Bearer scoped-token' },
      });
      expect(res.statusCode).toBe(403);
    });

    it('wildcard user can access any server by CUID', async () => {
      const res = await app.inject({
        method: 'GET',
        url: '/api/v1/servers/clxyz000000002',
        headers: { authorization: 'Bearer admin-token' },
      });
      expect(res.statusCode).toBe(200);
      expect(res.json<{ name: string }>().name).toBe('slack-server');
    });
  });

  describe('name-scoped write operations', () => {
    it('name-scoped edit user can DELETE their named server by CUID', async () => {
      const res = await app.inject({
        method: 'DELETE',
        url: '/api/v1/servers/clxyz000000001',
        headers: { authorization: 'Bearer edit-scoped-token' },
      });
      expect(res.statusCode).toBe(204);
    });

    it('name-scoped edit user CANNOT delete other servers', async () => {
      const res = await app.inject({
        method: 'DELETE',
        url: '/api/v1/servers/clxyz000000002',
        headers: { authorization: 'Bearer edit-scoped-token' },
      });
      expect(res.statusCode).toBe(403);
    });

    it('name-scoped view user CANNOT delete their named server', async () => {
      const res = await app.inject({
        method: 'DELETE',
        url: '/api/v1/servers/clxyz000000001',
        headers: { authorization: 'Bearer scoped-token' },
      });
      expect(res.statusCode).toBe(403);
    });
  });

  describe('preSerialization edge cases', () => {
    it('single-object responses pass through unmodified', async () => {
      const res = await app.inject({
        method: 'GET',
        url: '/api/v1/servers/clxyz000000001',
        headers: { authorization: 'Bearer scoped-token' },
      });
      expect(res.statusCode).toBe(200);
      expect(res.json<{ name: string }>().name).toBe('my-home-assistant');
    });

    it('unauthenticated requests get 401', async () => {
      const res = await app.inject({
        method: 'GET',
        url: '/api/v1/servers',
      });
      expect(res.statusCode).toBe(401);
    });
  });
});
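The Bug 2 tests above hinge on one translation step: a path segment that looks like a CUID must be resolved back to the entity's human-readable name before it is compared against name-scoped role bindings. A minimal standalone sketch of that step, assuming an in-memory id→name map stands in for the repository's `findById` lookup (`idToName` and `resolveResourceName` are illustrative names, not part of the codebase):

```typescript
// Same CUID shape check used in the RBAC hook above.
const CUID_RE = /^c[^\s-]{8,}$/i;

// Hypothetical stand-in for a repository lookup (serverRepo.findById).
const idToName: Record<string, string> = {
  clxyz000000001: 'my-home-assistant',
};

function resolveResourceName(raw: string): string {
  // CUID in the URL → translate to the entity name so it can match
  // a binding like { role: 'view', resource: 'servers', name: '...' }.
  // Human-readable names and unknown CUIDs pass through unchanged.
  if (CUID_RE.test(raw)) return idToName[raw] ?? raw;
  return raw;
}

console.log(resolveResourceName('clxyz000000001'));      // resolved name
console.log(resolveResourceName('my-home-assistant'));   // unchanged
console.log(resolveResourceName('cnonexistent12345678')); // unknown CUID passes through
```

Passing an unknown CUID through unchanged is what makes the "nonexistent CUID → 403" test hold: the unresolved id can never match a binding name, so access is denied rather than erroring.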