Compare commits

101 commits: `main...feat/gated`

| SHA1 |
|---|
| 705df06996 |
| 62647a7f90 |
| 39ca134201 |
| 78a1dc9c8a |
| 9ce705608b |
| 0824f8e635 |
| 9bd3127519 |
| e8ac500ae9 |
| bed725b387 |
| 17a456d835 |
| 9481d394a1 |
| bc769c4eeb |
| 6f534c8ba9 |
| 11da8b1fbf |
| 848868d45f |
| 869217a07a |
| 04d115933b |
| 7c23da10c6 |
| 32b4de4343 |
| e06db9afba |
| a25809b84a |
| f5a902d3e0 |
| 9cb0c5ce24 |
| 06230ec034 |
| 079c7b3dfa |
| 7829f4fb92 |
| fa6240107f |
| b34ea63d3d |
| e17a2282e8 |
| 01d3c4e02d |
| e4affe5962 |
| c75e7cdf4d |
| 65c340a03c |
| 677d34b868 |
| c5b8cb60b7 |
| 9a5deffb8f |
| ec7ada5383 |
| b81d3be2d5 |
| e2c54bfc5c |
| 7b7854b007 |
| f23dd99662 |
| 43af85cb58 |
| 6d2e3c2eb3 |
| ce21db3853 |
| 767725023e |
| 2bd1b55fe8 |
| 0f2a93f2f0 |
| ce81d9d616 |
| c6cc39c6f7 |
| de074d9a90 |
| 783cf15179 |
| 5844d6c73f |
| 604bd76d60 |
| da14bb8c23 |
| 9e9a2f4a54 |
| c8cdd7f514 |
| ec1dfe7438 |
| 50b4112398 |
| bb17a892d6 |
| a8117091a1 |
| dcda93d179 |
| a6b5e24a8d |
| 3a6e58274c |
| c819b65175 |
| c3ef5a664f |
| 4c2927a16e |
| 79dd6e723d |
| cde1c59fd6 |
| daa5860ed2 |
| ecbf48dd49 |
| d38b5aac60 |
| d07d4d11dd |
| fa58c1b5ed |
| dd1dfc629d |
| 7b3dab142e |
| 4c127a7dc3 |
| c1e3e4aed6 |
| e45c6079c1 |
| e4aef3acf1 |
| a2cda38850 |
| 081e90de0f |
| 4e3d896ef6 |
| 0823e965bf |
| c97219f85e |
| 93adcd4be7 |
| d58e6e153f |
| 1e8847bb63 |
| 2a0deaa225 |
| 4eef6e38a2 |
| ca02340a4c |
| 02254f2aac |
| 540dd6fd63 |
| a05a4c4816 |
| 97ade470df |
| b25ff98374 |
| 22fe9c3435 |
| 72643fceda |
| 467357c2c6 |
| d6a80fc03d |
| c07da826a0 |
| 0482944056 |
.gitignore (vendored, 3 changes)
```diff
@@ -9,6 +9,8 @@ dist/
.env
.env.local
.env.*.local
stack/.env
.portainer_password

# Logs
logs/
@@ -35,3 +37,4 @@ pgdata/

# Prisma
src/db/prisma/migrations/*.sql.backup
logs.sh
```
.taskmaster/docs/prd-gated-prompts.md (new file, 392 lines)

@@ -0,0 +1,392 @@
# PRD: Gated Project Experience & Prompt Intelligence

## Overview

When 300 developers connect their LLM clients (Claude Code, Cursor, etc.) to mcpctl projects, they need relevant context — security policies, architecture decisions, operational runbooks — without flooding the context window. This feature introduces a gated session flow where the client LLM drives its own context retrieval through keyword-based matching, with the proxy providing a prompt index and encouraging ongoing discovery.

## Problem

- Injecting all prompts into instructions doesn't scale (hundreds of pages of policies)
- Exposing prompts only as MCP resources means LLMs never read them
- An index-only approach works for a small number of prompts but breaks down at scale
- No mechanism to link external knowledge (Notion, Docmost) as prompts
- LLMs tend to work with whatever they have rather than proactively seek more context
## Core Concepts

### Gated Experience

A project-level flag (`gated: boolean`, default: `true`) that controls whether sessions go through a keyword-driven prompt retrieval flow before accessing project tools and resources.

**Flow (A + C):**

1. On `initialize`, instructions include the **prompt index** (names + summaries for all prompts, up to a reasonable cap) and tell the client LLM: "Call `begin_session` with 5 keywords describing your task"
2. **If the client obeys**: `begin_session({ tags: ["zigbee", "lights", "mqtt", "pairing", "automation"] })` → prompt selection (see below) → returns matched prompt content + full prompt index + encouragement to retrieve more → session ungated
3. **If the client ignores it**: the first `tools/call` is intercepted → keywords extracted from tool name + arguments → same prompt selection → briefing injected alongside the tool result → session ungated
4. **Ongoing retrieval**: the client can call `read_prompts({ tags: ["security", "vpn"] })` at any point to retrieve more prompts. The prompt index is always visible so the client LLM can see what's available.

**Prompt selection — tiered approach:**

- **Primary (heavy LLM available)**: Tags + the full prompt index (names, priorities, summaries, chapters) are sent to the heavy LLM (e.g. Gemini). The LLM understands synonyms, context, and intent — it knows "zigbee" relates to "Z2M" and "Zigbee2MQTT", and that someone working on "lights" probably needs the "common-mistakes" prompt about pairing. The LLM returns a ranked list of relevant prompt names with brief explanations of why each is relevant. The heavy LLM may use the fast LLM for preprocessing if needed (e.g. generating missing summaries on the fly).
- **Fallback (no LLM, or `llmProvider=none`)**: Deterministic keyword-based tag matching against summaries/chapters with byte-budget allocation (see "Tag Matching Algorithm" below). Same approach as ResponsePaginator's byte-based fallback. Triggered when no LLM providers are configured, the project has `llmProvider: "none"`, or a local override sets `provider: "none"`.
- **Hybrid (both paths always available)**: Even when the heavy LLM does the initial selection, the `read_prompts({ tags: [...] })` tool always uses keyword matching. This way the client LLM can retrieve specific prompts by keyword that the heavy LLM may have missed. The LLM is smart about context, keywords are precise about names — together they cover both fuzzy and exact retrieval.

**LLM availability resolution** (same chain as existing LLM features):

- Project `llmProvider: "none"` → no LLM, keyword fallback only
- Project `llmProvider: null` → inherit from global config
- Local override `provider: "none"` → no LLM, keyword fallback only
- No providers configured → keyword fallback only
- Otherwise → use the heavy LLM for `begin_session`, the fast LLM for summary generation
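The resolution chain above can be sketched as a single function. The names here (`LlmConfig`, `resolveLlm`) are illustrative, not the actual mcpctl API:

```typescript
interface LlmConfig {
  projectProvider: string | null;   // "none", a provider name, or null = inherit
  localOverride?: string | null;    // per-user override; "none" disables LLM use
  globalProviders: string[];        // globally configured providers; [] = none
}

// Resolves which provider (if any) to use; null means keyword-fallback only.
function resolveLlm(cfg: LlmConfig): string | null {
  if (cfg.projectProvider === "none") return null;   // project opts out
  if (cfg.localOverride === "none") return null;     // local override opts out
  if (cfg.globalProviders.length === 0) return null; // nothing configured
  // null inherits from global config; take the first configured provider
  return cfg.projectProvider ?? cfg.globalProviders[0];
}
```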
### Encouraging Retrieval

LLMs tend to proceed with incomplete information rather than seek more context. The system must actively counter this at multiple points:

**In `initialize` instructions:**

```
You have access to project knowledge containing policies, architecture decisions,
and guidelines. Some may contain critical rules about what you're doing. After your
initial briefing, if you're unsure about conventions, security requirements, or
best practices — request more context using read_prompts. It's always better to
check than to guess wrong. The project may have specific rules you don't know about yet.
```

**In `begin_session` response (after matched prompts):**

```
Other prompts available that may become relevant as your work progresses:
- security-policies: Network segmentation, firewall rules, VPN access
- naming-conventions: Service and resource naming standards
- ...
If any of these seem related to what you're doing now or later, request them
with read_prompts({ tags: [...] }) or resources/read. Don't assume you have
all the context — check when in doubt.
```

**In `read_prompts` response:**

```
Remember: you can request more prompts at any time with read_prompts({ tags: [...] }).
The project may have additional guidelines relevant to your current approach.
```

The tone is not "here's optional reading" but "there are rules you might not know about, and violating them costs more than reading them."
### Prompt Priority (1-10)

Every prompt has a priority level that influences selection order and byte-budget allocation:

| Range | Meaning | Behavior |
|-------|---------|----------|
| 1-3 | Reference | Low priority, included only on strong keyword match |
| 4-6 | Standard | Default priority, included on moderate keyword match |
| 7-9 | Important | High priority, lower match threshold |
| 10 | Critical | Always included in full, regardless of keyword match (guardrails, common mistakes) |

Default priority for new prompts: `5`.
### Prompt Summaries & Chapters (Auto-generated)

Each prompt gets auto-generated metadata used for the prompt index and tag matching:

- `summary` (string, ~20 words) — one-line description of what the prompt covers
- `chapters` (string[]) — key sections/topics extracted from content

Generation pipeline:

- **Fast LLM available**: Summarize content, extract key topics
- **No fast LLM**: First sentence of content + markdown headings via regex
- Regenerated on prompt create/update
- Cached on the prompt record
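The no-LLM branch can be sketched roughly as follows. The function name mirrors the `generateWithRegex` fallback described above; the sentence-splitting details are assumptions, not the shipped implementation:

```typescript
// Fallback metadata generation when no fast LLM is configured:
// first sentence of the body text as the summary (~20-word cap),
// markdown headings as chapters.
function generateWithRegex(content: string): { summary: string; chapters: string[] } {
  // chapters: every markdown heading in the content
  const chapters = [...content.matchAll(/^#{1,6}\s+(.+)$/gm)].map((m) => m[1].trim());
  // summary: first sentence of the first non-heading text, capped at ~20 words
  const body = content
    .split("\n")
    .filter((line) => !/^#{1,6}\s/.test(line) && line.trim() !== "")
    .join(" ");
  const firstSentence = body.split(/(?<=[.!?])\s+/)[0] ?? "";
  const summary = firstSentence.split(/\s+/).slice(0, 20).join(" ");
  return { summary, chapters };
}
```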
### Tag Matching Algorithm (No-LLM Fallback)

When no local LLM is available, the system falls back to a deterministic retrieval algorithm:

1. Client provides tags (5 keywords from `begin_session`, or extracted from the tool call)
2. For each prompt, compute a match score:
   - Check tags against prompt `summary` and `chapters` (case-insensitive substring match)
   - Score = `number_of_matching_tags * base_priority`
   - Priority 10 prompts: score = infinity (always included)
3. Sort by score descending
4. Fill a byte budget (configurable, default ~8KB) from top down:
   - Include full content until budget exhausted
   - Remaining matched prompts: include as index entries (name + summary)
   - Non-matched prompts: listed as names only in the "other prompts available" section

**When `begin_session` is skipped (intercept path):**

- Extract keywords from tool name + arguments (e.g., `home-assistant/get_entities({ domain: "light" })` → tags: `["home-assistant", "entities", "light"]`)
- Run same matching algorithm
- Inject briefing alongside the real tool result
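Steps 2-4 can be sketched as below. The type and function names are hypothetical; the real implementation lives in the mcplocal router:

```typescript
interface PromptMeta {
  name: string;
  priority: number; // 1-10
  summary: string;
  chapters: string[];
  content: string;
}

// Step 2: score = matching tags x base priority; priority 10 => always included.
function matchScore(tags: string[], p: PromptMeta): number {
  if (p.priority === 10) return Number.POSITIVE_INFINITY;
  const haystack = `${p.summary} ${p.chapters.join(" ")}`.toLowerCase();
  const hits = tags.filter((t) => haystack.includes(t.toLowerCase())).length;
  return hits * p.priority;
}

// Steps 3-4: sort by score, fill the byte budget, demote overflow to index entries.
function selectPrompts(tags: string[], prompts: PromptMeta[], budget = 8 * 1024) {
  const ranked = prompts
    .map((p) => ({ p, score: matchScore(tags, p) }))
    .filter((r) => r.score > 0)
    .sort((a, b) => (a.score === b.score ? 0 : b.score > a.score ? 1 : -1));
  const full: string[] = [];
  const indexOnly: string[] = []; // rendered as "name + summary" entries
  let used = 0;
  for (const { p } of ranked) {
    const size = new TextEncoder().encode(p.content).length;
    if (p.priority === 10 || used + size <= budget) {
      full.push(p.name); // full content, critical prompts bypass the budget
      used += size;
    } else {
      indexOnly.push(p.name);
    }
  }
  return { full, indexOnly };
}
```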
### `read_prompts` Tool (Ongoing Retrieval)

Available after the session is ungated. Allows the client LLM to request more context at any point:

```json
{
  "name": "read_prompts",
  "description": "Request additional project context by keywords. Use this whenever you need guidelines, policies, or conventions related to your current work. It's better to check than to guess.",
  "inputSchema": {
    "type": "object",
    "properties": {
      "tags": {
        "type": "array",
        "items": { "type": "string" },
        "description": "Keywords describing what context you need (e.g. [\"security\", \"vpn\", \"firewall\"])"
      }
    },
    "required": ["tags"]
  }
}
```

Returns matched prompt content + the prompt index reminder.
### Prompt Links

A prompt can be a **link** to an MCP resource in another project's server. The linked content is fetched server-side (by the proxy, not the client), enforcing RBAC.

Format: `project/server:resource-uri`
Example: `system-public/docmost-mcp:docmost://pages/architecture-overview`

Properties:

- The proxy fetches linked content using the source project's service account
- Client LLM never gets direct access to the source MCP server
- Dead links are detected and marked (health check on link resolution)
- Dead links generate error log entries

RBAC for links:

- Creating a link requires `edit` permission on RBAC in the target project
- A service account permission is created on the source project for the linked resource
- Default: admin group members can manage links
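The link format above can be parsed with a regex of the same shape as the `linkTarget` validation pattern that appears later in this diff (task 40); the helper name is illustrative:

```typescript
interface PromptLink {
  project: string;
  server: string;
  resourceUri: string;
}

// Splits "project/server:resource-uri". The resource URI may itself contain
// ":" and "/" (e.g. docmost://pages/...), so only the first "/" and the
// first ":" after it are structural.
function parseLinkTarget(target: string): PromptLink | null {
  const m = /^([a-z0-9-]+)\/([a-z0-9-]+):(\S+)$/.exec(target);
  return m ? { project: m[1], server: m[2], resourceUri: m[3] } : null;
}
```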
## Schema Changes

### Project

Add field:

- `gated: boolean` (default: `true`)

### Prompt

Add fields:

- `priority: integer` (1-10, default: 5)
- `summary: string | null` (auto-generated)
- `chapters: string[] | null` (auto-generated, stored as JSON)
- `linkTarget: string | null` (format: `project/server:resource-uri`, null for regular prompts)

### PromptRequest

Add field:

- `priority: integer` (1-10, default: 5)
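In Prisma terms this is roughly the following sketch; model bodies are abbreviated, and the field shapes are taken from the task details later in this diff:

```prisma
model Project {
  // ...existing fields
  gated Boolean @default(true)
}

model Prompt {
  // ...existing fields
  priority   Int     @default(5) // 1-10, range enforced in the app layer
  summary    String? @db.Text
  chapters   Json?   // string[] stored as JSON
  linkTarget String? // "project/server:resource-uri"; null for regular prompts
}

model PromptRequest {
  // ...existing fields
  priority Int @default(5)
}
```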
## API Changes

### Modified Endpoints

- `POST /api/v1/prompts` — accept `priority`, `linkTarget`
- `PUT /api/v1/prompts/:id` — accept `priority` (not `linkTarget` — links are immutable, delete and recreate)
- `POST /api/v1/promptrequests` — accept `priority`
- `GET /api/v1/prompts` — return `priority`, `summary`, `linkTarget`, `linkStatus` (alive/dead/unknown)
- `GET /api/v1/projects/:name/prompts/visible` — return `priority`, `summary`, `chapters`

### New Endpoints

- `POST /api/v1/prompts/:id/regenerate-summary` — force re-generation of summary/chapters
- `GET /api/v1/projects/:name/prompt-index` — returns compact index (name, priority, summary, chapters)
## MCP Protocol Changes (mcplocal router)

### Session State

Router tracks per-session state:

- `gated: boolean` — starts `true` if the project is gated
- `tags: string[]` — accumulated tags from `begin_session` + `read_prompts` calls
- `retrievedPrompts: Set<string>` — prompts already sent to the client (avoid re-sending)
### Gated Session Flow

1. On `initialize`: instructions include prompt index + gate message + retrieval encouragement
2. `tools/list` while gated: only `begin_session` visible (progressive tool exposure)
3. `begin_session({ tags })`: match tags → return briefing + prompt index + encouragement → ungate → send `notifications/tools/list_changed`
4. On first `tools/call` while still gated: extract keywords → match → inject briefing alongside result → ungate
5. After ungating: all tools work normally, `read_prompts` available for ongoing retrieval
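The transitions above can be sketched as a small state handler. Names are hypothetical, and `matchPrompts` stands in for either selection path (heavy LLM or keyword fallback):

```typescript
interface SessionState {
  gated: boolean;
  tags: string[];
  retrievedPrompts: Set<string>;
}

// Handles begin_session (step 3): accumulate tags, select prompts,
// dedupe against what the client already has, then ungate.
function beginSession(
  state: SessionState,
  tags: string[],
  matchPrompts: (tags: string[]) => string[],
): string[] {
  state.tags.push(...tags);
  const fresh = matchPrompts(state.tags).filter((n) => !state.retrievedPrompts.has(n));
  fresh.forEach((n) => state.retrievedPrompts.add(n));
  state.gated = false; // router then sends notifications/tools/list_changed
  return fresh; // prompt names to render into the briefing
}
```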
### `begin_session` Tool

```json
{
  "name": "begin_session",
  "description": "Start your session by providing 5 keywords that describe your current task. You'll receive relevant project context, policies, and guidelines. Required before using other tools.",
  "inputSchema": {
    "type": "object",
    "properties": {
      "tags": {
        "type": "array",
        "items": { "type": "string" },
        "maxItems": 10,
        "description": "5 keywords describing your current task (e.g. [\"zigbee\", \"automation\", \"lights\", \"mqtt\", \"pairing\"])"
      }
    },
    "required": ["tags"]
  }
}
```
Response structure:

```
[Priority 10 prompts — always, full content]

[Tag-matched prompts — full content, byte-budget-capped, priority-ordered]

Other prompts available that may become relevant as your work progresses:
- <name>: <summary>
- <name>: <summary>
- ...
If any of these seem related to what you're doing, request them with
read_prompts({ tags: [...] }). Don't assume you have all the context — check.
```
### Prompt Index in Instructions

The `initialize` instructions include a compact prompt index so the client LLM can see what knowledge exists. Format per prompt: `- <name>: <summary>` (~100 chars max per entry).

Cap: if more than 50 prompts, include only priority 7+ in the instructions index. Full index always available via `resources/list`.
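A sketch of the cap rule (the entry shape is assumed):

```typescript
interface IndexEntry {
  name: string;
  priority: number;
  summary: string;
}

// Builds the instructions-level index: one "- <name>: <summary>" line per
// prompt, truncated to ~100 chars; above 50 prompts only priority 7+ is listed.
function buildInstructionsIndex(prompts: IndexEntry[]): string {
  const pool = prompts.length > 50 ? prompts.filter((p) => p.priority >= 7) : prompts;
  return pool.map((p) => `- ${p.name}: ${p.summary}`.slice(0, 100)).join("\n");
}
```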
## CLI Changes

### New/Modified Commands

- `mcpctl create prompt <name> --priority <1-10>` — create with priority
- `mcpctl create prompt <name> --link <project/server:uri>` — create linked prompt
- `mcpctl get prompt -A` — show all prompts across all projects, with link targets
- `mcpctl describe project <name>` — show gated status, session greeting, prompt table
- `mcpctl edit project <name>` — `gated` field editable
### Prompt Link Display

```
$ mcpctl get prompt -A
PROJECT          NAME                PRIORITY  LINK                                          STATUS
homeautomation   security-policies   8         -                                             -
homeautomation   architecture-adr    6         system-public/docmost-mcp:docmost://pages/a1  alive
homeautomation   common-mistakes     10        -                                             -
system-public    onboarding          4         -                                             -
```
## Describe Project Output

```
$ mcpctl describe project homeautomation
Name:          homeautomation
Gated:         true
LLM Provider:  gemini-cli
...

Session greeting:
  You have access to project knowledge containing policies, architecture decisions,
  and guidelines. Call begin_session with 5 keywords describing your task to receive
  relevant context. Some prompts contain critical rules — it's better to check than guess.

Prompts:
  NAME               PRIORITY  TYPE   LINK
  common-mistakes    10        local  -
  security-policies  8         local  -
  architecture-adr   6         link   system-public/docmost-mcp:docmost://pages/a1
  stack              5         local  -
```
## Testing Strategy

**Full test coverage is required.** Every new module, service, route, and algorithm must have comprehensive tests. No feature ships without tests.

### Unit Tests (mcpd)

- Prompt priority CRUD: create/update/get with priority field, default value, validation (1-10 range)
- Prompt link CRUD: create with linkTarget, immutability (can't update linkTarget), delete
- Prompt summary generation: auto-generation on create/update, regex fallback when no LLM
- `GET /api/v1/prompts` with priority, linkTarget, linkStatus fields
- `GET /api/v1/projects/:name/prompt-index` returns compact index
- `POST /api/v1/prompts/:id/regenerate-summary` triggers re-generation
- Project `gated` field: CRUD, default value

### Unit Tests (mcplocal — gating flow)

- State machine: gated → `begin_session` → ungated (happy path)
- State machine: gated → `tools/call` intercepted → ungated (fallback path)
- State machine: non-gated project skips gate entirely
- LLM selection path: tags + prompt index sent to heavy LLM, ranked results returned, priority 10 always included
- LLM selection path: heavy LLM uses fast LLM for missing summary generation
- No-LLM fallback: tag matching score calculation, priority weighting, substring matching
- No-LLM fallback: byte-budget exhaustion, priority ordering, index fallback, edge cases
- Keyword extraction from tool calls: tool name parsing, argument extraction
- `begin_session` response: matched content + index + encouragement text (both LLM and fallback paths)
- `read_prompts` response: additional matches, deduplication against already-sent prompts (both paths)
- Tools blocked while gated: return error directing to `begin_session`
- `tools/list` while gated: only `begin_session` visible
- `tools/list` after ungating: `begin_session` replaced by `read_prompts` + all upstream tools
- Priority 10 always included regardless of tag match or budget
- Prompt index in instructions: cap at 50, priority 7+ when over cap
- Notifications: `tools/list_changed` sent after ungating

### Unit Tests (mcplocal — prompt links)

- Link resolution: fetch content from source project's MCP server via service account
- Dead link detection: source server unavailable, resource not found, permission denied
- Dead link marking: status field updated, error logged
- RBAC enforcement: link creation requires edit permission on target project RBAC
- Service account permission: auto-created on source project for linked resource
- Content isolation: client LLM cannot access source server directly

### Unit Tests (CLI)

- `create prompt` with `--priority` flag, validation
- `create prompt` with `--link` flag, format validation
- `get prompt -A` output: all projects, link targets, status columns
- `describe project` output: gated status, session greeting, prompt table
- `edit project` with gated field
- Shell completions for new flags and resources

### Integration Tests

- End-to-end gated session: connect → begin_session with tags → tools available → correct prompts returned
- End-to-end intercept: connect → skip begin_session → call tool → keywords extracted → briefing injected
- End-to-end read_prompts: after ungating → request more context → additional prompts returned → no duplicates
- Prompt link resolution: create link → fetch content → verify content matches source
- Dead link lifecycle: create link → kill source → verify dead detection → restore → verify recovery
- Priority ordering: create prompts at various priorities → verify selection order and budget allocation
- Encouragement text: verify retrieval encouragement present in begin_session, read_prompts, and instructions
## System Prompts (mcpctl-system project)

All gate messages, encouragement text, and briefing templates are stored as prompts in a special `mcpctl-system` project. This makes them editable at runtime via `mcpctl edit prompt` without code changes or redeployment.

### Required System Prompts

| Name | Priority | Purpose |
|------|----------|---------|
| `gate-instructions` | 10 | Text injected into `initialize` instructions for gated projects. Tells client to call `begin_session` with 5 keywords. |
| `gate-encouragement` | 10 | Appended after `begin_session` response. Lists remaining prompts and encourages further retrieval. |
| `read-prompts-reminder` | 10 | Appended after `read_prompts` response. Reminds client that more context is available. |
| `gate-intercept-preamble` | 10 | Prepended to briefing when injected via tool call intercept (Option C fallback). |
| `session-greeting` | 10 | Shown in `mcpctl describe project` as the "hello prompt" — what client LLMs see on connect. |

### Bootstrap

The `mcpctl-system` project and its system prompts are created automatically on first startup (seed migration). They can be edited afterward but not deleted — delete attempts return an error.

### How mcplocal Uses Them

On router initialization, mcplocal fetches system prompts from mcpd via:

```
GET /api/v1/projects/mcpctl-system/prompts/visible
```

These are cached with the same 60s TTL as project routers. The prompt content supports template variables:

- `{{prompt_index}}` — replaced with the current project's prompt index
- `{{project_name}}` — replaced with the current project name
- `{{matched_prompts}}` — replaced with tag-matched prompt content
- `{{remaining_prompts}}` — replaced with the list of non-matched prompts

This way the encouragement text, tone, and structure can be tuned by editing prompts — no code changes needed.
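Template expansion can be as simple as the following sketch; the variable names match the list above, and leaving unknown variables intact (rather than erasing them) is an assumption that makes typos visible:

```typescript
// Replaces {{variable}} placeholders with values from the provided map.
// Variables with no value are left as-is so a typo is visible, not silent.
function renderSystemPrompt(template: string, vars: Record<string, string>): string {
  return template.replace(/\{\{(\w+)\}\}/g, (whole, key: string) => vars[key] ?? whole);
}
```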
## Security Considerations

- Prompt links: content fetched server-side, client never gets direct access to source MCP server
- RBAC: link creation requires edit permission on target project's RBAC
- Service account: source project grants read access to linked resource only
- Dead links: logged as errors, marked in listings, never expose source server errors to client
- Tag extraction: sanitize tool call arguments before using as keywords (prevent injection)
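For the last point, a sanitization sketch; the length and count limits here are assumptions, not the shipped values:

```typescript
// Normalizes candidate keywords before they reach the matcher or an LLM:
// strings only, lowercased, restricted to alphanumeric/dash/underscore, with
// bounded length and count, so tool arguments can't smuggle arbitrary text
// into the injected briefing.
function sanitizeTags(raw: unknown[]): string[] {
  return raw
    .filter((t): t is string => typeof t === "string")
    .map((t) => t.toLowerCase().replace(/[^a-z0-9_-]/g, "").slice(0, 32))
    .filter((t) => t.length > 0)
    .slice(0, 10); // begin_session caps tags at 10
}
```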
@@ -1408,13 +1408,497 @@
|
||||
"status": "done",
|
||||
"subtasks": [],
|
||||
"updatedAt": "2026-02-21T18:52:29.084Z"
|
||||
},
|
||||
{
|
||||
"id": "37",
|
||||
"title": "Add priority, summary, chapters, and linkTarget fields to Prompt schema",
|
||||
"description": "Extend the Prisma schema for the Prompt model to include priority (integer 1-10, default 5), summary (nullable string), chapters (nullable JSON array), and linkTarget (nullable string for prompt links).",
|
||||
"details": "1. Update `/src/db/prisma/schema.prisma` to add fields to the Prompt model:\n - `priority Int @default(5)` with check constraint 1-10\n - `summary String? @db.Text`\n - `chapters Json?` (stored as JSON array of strings)\n - `linkTarget String?` (format: `project/server:resource-uri`)\n\n2. Create Prisma migration:\n ```bash\n pnpm --filter db exec prisma migrate dev --name add-prompt-priority-summary-chapters-link\n ```\n\n3. Update TypeScript types in shared package to reflect new fields\n\n4. Add validation for priority range (1-10) at the database level if possible, otherwise enforce in application layer",
|
||||
"testStrategy": "- Unit test: Verify migration creates columns with correct types and defaults\n- Unit test: Verify priority default is 5\n- Unit test: Verify nullable fields accept null\n- Unit test: Verify chapters stores/retrieves JSON arrays correctly\n- Integration test: Create prompt with all new fields, retrieve and verify values",
|
||||
"priority": "high",
|
||||
"dependencies": [],
|
||||
"status": "done",
|
||||
"subtasks": [],
|
||||
"updatedAt": "2026-02-25T22:35:08.154Z"
|
||||
},
|
||||
{
|
||||
"id": "38",
|
||||
"title": "Add priority field to PromptRequest schema",
|
||||
"description": "Extend the Prisma schema for the PromptRequest model to include the priority field (integer 1-10, default 5) to match the Prompt model.",
|
||||
"details": "1. Update `/src/db/prisma/schema.prisma` to add to PromptRequest:\n - `priority Int @default(5)`\n\n2. Create Prisma migration:\n ```bash\n pnpm --filter db exec prisma migrate dev --name add-promptrequest-priority\n ```\n\n3. Update the `CreatePromptRequestSchema` in `/src/mcpd/src/validation/prompt.schema.ts` to include priority validation:\n ```typescript\n priority: z.number().int().min(1).max(10).default(5).optional(),\n ```\n\n4. Update TypeScript types in shared package",
|
||||
"testStrategy": "- Unit test: Migration creates priority column with default 5\n- Unit test: PromptRequest creation with explicit priority\n- Unit test: PromptRequest creation uses default priority when not specified\n- Unit test: Validation rejects priority outside 1-10 range",
|
||||
"priority": "high",
|
||||
"dependencies": [
|
||||
"37"
|
||||
],
|
||||
"status": "done",
|
||||
"subtasks": [],
|
||||
"updatedAt": "2026-02-25T22:35:08.160Z"
|
||||
},
|
||||
{
|
||||
"id": "39",
|
||||
"title": "Add gated field to Project schema",
|
||||
"description": "Extend the Prisma schema for the Project model to include the gated boolean field (default true) that controls whether sessions go through the keyword-driven prompt retrieval flow.",
|
||||
"details": "1. Update `/src/db/prisma/schema.prisma` to add to Project:\n - `gated Boolean @default(true)`\n\n2. Create Prisma migration:\n ```bash\n pnpm --filter db exec prisma migrate dev --name add-project-gated\n ```\n\n3. Update project-related TypeScript types\n\n4. Update project validation schemas to include gated field:\n ```typescript\n gated: z.boolean().default(true).optional(),\n ```\n\n5. Update project API routes to accept and return the gated field",
|
||||
"testStrategy": "- Unit test: Migration creates gated column with default true\n- Unit test: Project creation with gated=false\n- Unit test: Project creation uses default gated=true when not specified\n- Unit test: Project update can toggle gated field\n- Integration test: GET /api/v1/projects/:name returns gated field",
|
||||
"priority": "high",
|
||||
"dependencies": [],
|
||||
"status": "done",
|
||||
"subtasks": [],
|
||||
"updatedAt": "2026-02-25T22:35:08.165Z"
|
||||
},
|
||||
{
|
||||
"id": "40",
|
||||
"title": "Update Prompt CRUD API to handle priority and linkTarget",
|
||||
"description": "Modify prompt API endpoints to accept, validate, and return the priority and linkTarget fields. LinkTarget should be immutable after creation.",
|
||||
"details": "1. Update `/src/mcpd/src/validation/prompt.schema.ts`:\n ```typescript\n export const CreatePromptSchema = z.object({\n name: z.string().min(1).max(100).regex(/^[a-z0-9-]+$/),\n content: z.string().min(1).max(50000),\n projectId: z.string().optional(),\n priority: z.number().int().min(1).max(10).default(5).optional(),\n linkTarget: z.string().regex(/^[a-z0-9-]+\\/[a-z0-9-]+:[\\S]+$/).optional(),\n });\n \n export const UpdatePromptSchema = z.object({\n content: z.string().min(1).max(50000).optional(),\n priority: z.number().int().min(1).max(10).optional(),\n // Note: linkTarget is NOT included - links are immutable\n });\n ```\n\n2. Update `/src/mcpd/src/routes/prompts.ts`:\n - POST /api/v1/prompts: Accept priority, linkTarget\n - PUT /api/v1/prompts/:id: Accept priority only (not linkTarget)\n - GET endpoints: Return priority, linkTarget in response\n\n3. Update repository layer to handle new fields\n\n4. Add linkTarget format validation: `project/server:resource-uri`",
|
||||
"testStrategy": "- Unit test: POST /api/v1/prompts with priority creates prompt with correct priority\n- Unit test: POST /api/v1/prompts with linkTarget creates linked prompt\n- Unit test: PUT /api/v1/prompts/:id with priority updates priority\n- Unit test: PUT /api/v1/prompts/:id rejects linkTarget (immutable)\n- Unit test: GET /api/v1/prompts returns priority and linkTarget fields\n- Unit test: Invalid linkTarget format rejected (validation error)\n- Unit test: Priority outside 1-10 range rejected",
|
||||
"priority": "high",
|
||||
"dependencies": [
|
||||
"37"
|
||||
],
|
||||
"status": "done",
|
||||
"subtasks": [],
|
||||
"updatedAt": "2026-02-25T22:37:17.506Z"
|
||||
},
|
||||
{
|
||||
"id": "41",
|
||||
"title": "Update PromptRequest API to handle priority",
|
||||
"description": "Modify prompt request API endpoints to accept, validate, and return the priority field for proposed prompts.",
|
||||
"details": "1. Update validation in `/src/mcpd/src/validation/prompt.schema.ts`:\n ```typescript\n export const CreatePromptRequestSchema = z.object({\n name: z.string().min(1).max(100).regex(/^[a-z0-9-]+$/),\n content: z.string().min(1).max(50000),\n projectId: z.string().optional(),\n createdBySession: z.string().optional(),\n createdByUserId: z.string().optional(),\n priority: z.number().int().min(1).max(10).default(5).optional(),\n });\n ```\n\n2. Update `/src/mcpd/src/routes/prompts.ts` for PromptRequest endpoints:\n - POST /api/v1/promptrequests: Accept priority\n - GET /api/v1/promptrequests: Return priority\n - POST /api/v1/promptrequests/:id/approve: Preserve priority when creating Prompt\n\n3. Update PromptService.approve() to copy priority from request to prompt\n\n4. Update repository layer",
|
||||
"testStrategy": "- Unit test: POST /api/v1/promptrequests with priority creates request with correct priority\n- Unit test: POST /api/v1/promptrequests uses default priority 5 when not specified\n- Unit test: GET /api/v1/promptrequests returns priority field\n- Unit test: Approve preserves priority from request to created prompt\n- Unit test: Priority validation (1-10 range)",
"priority": "high",
"dependencies": [
"38"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T22:37:17.511Z"
},
{
"id": "42",
"title": "Implement prompt summary generation service",
"description": "Create a service that auto-generates summary (20 words) and chapters (key sections) for prompts, using fast LLM when available or regex fallback.",
"details": "1. Create `/src/mcpd/src/services/prompt-summary.service.ts`:\n ```typescript\n export class PromptSummaryService {\n constructor(\n private llmClient: LlmClient | null,\n private promptRepo: IPromptRepository\n ) {}\n \n async generateSummary(content: string): Promise<{ summary: string; chapters: string[] }> {\n if (this.llmClient) {\n return this.generateWithLlm(content);\n }\n return this.generateWithRegex(content);\n }\n \n private async generateWithLlm(content: string): Promise<...> {\n // Send content to fast LLM with prompt:\n // \"Generate a 20-word summary and extract key section topics...\"\n }\n \n private generateWithRegex(content: string): { summary: string; chapters: string[] } {\n // summary: first sentence of content (truncated to ~20 words)\n // chapters: extract markdown headings via regex /^#+\\s+(.+)$/gm\n }\n }\n ```\n\n2. Integrate with PromptService:\n - Call generateSummary on prompt create\n - Call generateSummary on prompt update (when content changes)\n - Cache results on the prompt record\n\n3. Handle LLM availability check via existing LlmConfig patterns",
"testStrategy": "- Unit test: generateWithRegex extracts first sentence as summary\n- Unit test: generateWithRegex extracts markdown headings as chapters\n- Unit test: generateWithLlm calls LLM with correct prompt (mock LLM)\n- Unit test: generateSummary uses LLM when available\n- Unit test: generateSummary falls back to regex when no LLM\n- Unit test: Empty content handled gracefully\n- Unit test: Content without headings returns empty chapters array\n- Integration test: Creating prompt triggers summary generation",
"priority": "high",
"dependencies": [
"37"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T22:39:28.196Z"
},
{
"id": "43",
"title": "Add regenerate-summary API endpoint",
"description": "Create POST /api/v1/prompts/:id/regenerate-summary endpoint to force re-generation of summary and chapters for a prompt.",
"details": "1. Add route in `/src/mcpd/src/routes/prompts.ts`:\n ```typescript\n fastify.post('/api/v1/prompts/:id/regenerate-summary', async (request, reply) => {\n const { id } = request.params as { id: string };\n const prompt = await promptService.findById(id);\n if (!prompt) {\n return reply.status(404).send({ error: 'Prompt not found' });\n }\n \n const { summary, chapters } = await summaryService.generateSummary(prompt.content);\n const updated = await promptService.updateSummary(id, summary, chapters);\n \n return reply.send(updated);\n });\n ```\n\n2. Add `updateSummary(id, summary, chapters)` method to PromptRepository and PromptService\n\n3. Return the updated prompt with new summary/chapters in response",
"testStrategy": "- Unit test: POST to valid prompt ID regenerates summary\n- Unit test: Returns updated prompt with new summary/chapters\n- Unit test: 404 for non-existent prompt ID\n- Unit test: Uses LLM when available, regex fallback otherwise\n- Integration test: End-to-end regeneration updates database",
"priority": "medium",
"dependencies": [
"42"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T22:39:28.201Z"
},
{
"id": "44",
"title": "Create prompt-index API endpoint",
"description": "Create GET /api/v1/projects/:name/prompt-index endpoint that returns a compact index of prompts (name, priority, summary, chapters) for a project.",
"details": "1. Add route in `/src/mcpd/src/routes/prompts.ts`:\n ```typescript\n fastify.get('/api/v1/projects/:name/prompt-index', async (request, reply) => {\n const { name } = request.params as { name: string };\n const project = await projectService.findByName(name);\n if (!project) {\n return reply.status(404).send({ error: 'Project not found' });\n }\n \n const prompts = await promptService.findByProject(project.id);\n const index = prompts.map(p => ({\n name: p.name,\n priority: p.priority,\n summary: p.summary,\n chapters: p.chapters,\n linkTarget: p.linkTarget,\n }));\n \n return reply.send({ prompts: index });\n });\n ```\n\n2. Consider adding global prompts to the index (inherited by all projects)\n\n3. Sort by priority descending in response",
"testStrategy": "- Unit test: Returns compact index for valid project\n- Unit test: Index contains name, priority, summary, chapters, linkTarget\n- Unit test: 404 for non-existent project\n- Unit test: Empty array for project with no prompts\n- Unit test: Results sorted by priority descending\n- Integration test: End-to-end retrieval matches database state",
"priority": "medium",
"dependencies": [
"42"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T22:39:28.208Z"
},
{
"id": "45",
"title": "Implement tag-matching algorithm for prompt selection",
"description": "Create a deterministic keyword-based tag matching algorithm as the no-LLM fallback for prompt selection, with byte-budget allocation and priority weighting.",
"details": "1. Create `/src/mcplocal/src/services/tag-matcher.service.ts`:\n ```typescript\n interface MatchedPrompt {\n prompt: PromptIndex;\n score: number;\n matchedTags: string[];\n }\n \n export class TagMatcherService {\n constructor(private byteBudget: number = 8192) {}\n \n matchPrompts(tags: string[], promptIndex: PromptIndex[]): {\n fullContent: PromptIndex[]; // Prompts to include in full\n indexOnly: PromptIndex[]; // Prompts to include as index entries\n remaining: PromptIndex[]; // Non-matched prompts (names only)\n } {\n // 1. Priority 10 prompts: always included (score = Infinity)\n // 2. For each prompt, compute score:\n // - Check tags against summary + chapters (case-insensitive substring)\n // - score = matching_tags_count * priority\n // 3. Sort by score descending\n // 4. Fill byte budget from top:\n // - Include full content until budget exhausted\n // - Remaining matched: include as index entries\n // - Non-matched: names only\n }\n \n private computeScore(tags: string[], prompt: PromptIndex): number {\n if (prompt.priority === 10) return Infinity;\n const matchingTags = tags.filter(tag => \n this.matchesPrompt(tag.toLowerCase(), prompt)\n );\n return matchingTags.length * prompt.priority;\n }\n \n private matchesPrompt(tag: string, prompt: PromptIndex): boolean {\n const searchText = [\n prompt.summary || '',\n ...(prompt.chapters || [])\n ].join(' ').toLowerCase();\n return searchText.includes(tag);\n }\n }\n ```\n\n2. Handle edge cases: empty tags, no prompts, all priority 10, etc.",
"testStrategy": "- Unit test: Priority 10 prompts always included regardless of tags\n- Unit test: Score calculation: matching_tags * priority\n- Unit test: Case-insensitive matching\n- Unit test: Substring matching in summary and chapters\n- Unit test: Byte budget exhaustion stops full content inclusion\n- Unit test: Matched prompts beyond budget become index entries\n- Unit test: Non-matched prompts listed as names only\n- Unit test: Sorting by score descending\n- Unit test: Empty tags returns priority 10 only\n- Unit test: No prompts returns empty result",
"priority": "high",
"dependencies": [
"44"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T22:40:47.570Z"
},
{
"id": "46",
"title": "Implement LLM-based prompt selection service",
"description": "Create a service that uses the heavy LLM to intelligently select relevant prompts based on tags and the full prompt index, understanding synonyms and context.",
"details": "1. Create `/src/mcplocal/src/services/llm-prompt-selector.service.ts`:\n ```typescript\n export class LlmPromptSelectorService {\n constructor(\n private llmClient: LlmClient,\n private fastLlmClient: LlmClient | null,\n private tagMatcher: TagMatcherService // fallback\n ) {}\n \n async selectPrompts(tags: string[], promptIndex: PromptIndex[]): Promise<{\n selected: Array<{ name: string; reason: string }>;\n priority10: PromptIndex[]; // Always included\n }> {\n // 1. Extract priority 10 prompts (always included)\n // 2. Generate missing summaries using fast LLM if needed\n // 3. Send to heavy LLM:\n const prompt = `\n Given these keywords: ${tags.join(', ')}\n And this prompt index:\n ${promptIndex.map(p => `- ${p.name}: ${p.summary}`).join('\\n')}\n \n Select the most relevant prompts for someone working on tasks\n related to these keywords. Consider synonyms and related concepts.\n Return a ranked JSON array: [{name: string, reason: string}]\n `;\n // 4. Parse LLM response\n // 5. On LLM error, fall back to tag matcher\n }\n }\n ```\n\n2. Handle LLM timeouts and errors gracefully with fallback\n\n3. Validate LLM response format",
"testStrategy": "- Unit test: Priority 10 prompts always returned regardless of LLM selection\n- Unit test: LLM called with correct prompt format (mock)\n- Unit test: LLM response parsed correctly\n- Unit test: Invalid LLM response falls back to tag matcher\n- Unit test: LLM timeout falls back to tag matcher\n- Unit test: Missing summaries trigger fast LLM generation\n- Unit test: No LLM available uses tag matcher directly\n- Integration test: End-to-end selection with mock LLM",
"priority": "high",
"dependencies": [
"45"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T22:45:57.158Z"
},
{
"id": "47",
"title": "Implement session state management for gating",
"description": "Extend the McpRouter to track per-session gating state including gated status, accumulated tags, and retrieved prompts set.",
"details": "1. Update `/src/mcplocal/src/router.ts` to add session state:\n ```typescript\n interface SessionState {\n gated: boolean; // starts true if project is gated\n tags: string[]; // accumulated from begin_session + read_prompts\n retrievedPrompts: Set<string>; // prompts already sent (avoid duplicates)\n }\n \n export class McpRouter {\n private sessionStates: Map<string, SessionState> = new Map();\n \n getSessionState(sessionId: string): SessionState {\n if (!this.sessionStates.has(sessionId)) {\n this.sessionStates.set(sessionId, {\n gated: this.projectConfig?.gated ?? true,\n tags: [],\n retrievedPrompts: new Set(),\n });\n }\n return this.sessionStates.get(sessionId)!;\n }\n \n ungateSession(sessionId: string): void {\n const state = this.getSessionState(sessionId);\n state.gated = false;\n }\n \n addRetrievedPrompts(sessionId: string, names: string[]): void {\n const state = this.getSessionState(sessionId);\n names.forEach(n => state.retrievedPrompts.add(n));\n }\n }\n ```\n\n2. Clean up session state when session closes\n\n3. Handle session state for non-gated projects (gated=false from start)",
"testStrategy": "- Unit test: New session starts with gated=true for gated project\n- Unit test: New session starts with gated=false for non-gated project\n- Unit test: ungateSession changes gated to false\n- Unit test: addRetrievedPrompts adds to set\n- Unit test: retrievedPrompts prevents duplicates\n- Unit test: Session state isolated per sessionId\n- Unit test: Session cleanup removes state",
"priority": "high",
"dependencies": [
"39"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T22:45:57.164Z"
},
{
"id": "48",
"title": "Implement begin_session tool for gated sessions",
"description": "Create the begin_session MCP tool that accepts 5 keywords, triggers prompt selection, returns matched content with encouragement, and ungates the session.",
"details": "1. Add begin_session tool definition in `/src/mcplocal/src/router.ts`:\n ```typescript\n private getBeginSessionTool(): Tool {\n return {\n name: 'begin_session',\n description: 'Start your session by providing 5 keywords that describe your current task. You\\'ll receive relevant project context, policies, and guidelines. Required before using other tools.',\n inputSchema: {\n type: 'object',\n properties: {\n tags: {\n type: 'array',\n items: { type: 'string' },\n maxItems: 10,\n description: '5 keywords describing your current task'\n }\n },\n required: ['tags']\n }\n };\n }\n ```\n\n2. Implement begin_session handler:\n - Validate tags array (1-10 items)\n - Call LlmPromptSelector or TagMatcher based on LLM availability\n - Fetch full content for selected prompts\n - Build response with matched content + index + encouragement\n - Ungate session\n - Send `notifications/tools/list_changed`\n\n3. Response format:\n ```\n [Priority 10 prompts - full content]\n \n [Tag-matched prompts - full content, priority-ordered]\n \n Other prompts available that may become relevant...\n - name: summary\n ...\n If any seem related, request them with read_prompts({ tags: [...] }).\n ```",
"testStrategy": "- Unit test: begin_session with valid tags returns matched prompts\n- Unit test: begin_session includes priority 10 prompts always\n- Unit test: begin_session response includes encouragement text\n- Unit test: begin_session response includes prompt index\n- Unit test: Session ungated after successful begin_session\n- Unit test: notifications/tools/list_changed sent after ungating\n- Unit test: Empty tags handled (returns priority 10 only)\n- Unit test: Invalid tags rejected with error\n- Unit test: begin_session while already ungated returns error",
"priority": "high",
"dependencies": [
"46",
"47"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T22:50:39.111Z"
},
{
"id": "49",
"title": "Implement read_prompts tool for ongoing retrieval",
"description": "Create the read_prompts MCP tool that allows clients to request additional context by keywords after the session is ungated.",
"details": "1. Add read_prompts tool definition:\n ```typescript\n private getReadPromptsTool(): Tool {\n return {\n name: 'read_prompts',\n description: 'Request additional project context by keywords. Use this whenever you need guidelines, policies, or conventions related to your current work.',\n inputSchema: {\n type: 'object',\n properties: {\n tags: {\n type: 'array',\n items: { type: 'string' },\n description: 'Keywords describing what context you need'\n }\n },\n required: ['tags']\n }\n };\n }\n ```\n\n2. Implement read_prompts handler:\n - Always use keyword matching (not LLM) for precision\n - Exclude already-retrieved prompts from response\n - Add newly retrieved prompts to session state\n - Include reminder about more prompts available\n\n3. Response format:\n ```\n [Matched prompt content - deduplicated]\n \n Remember: you can request more prompts at any time with read_prompts({ tags: [...] }).\n The project may have additional guidelines relevant to your current approach.\n ```",
"testStrategy": "- Unit test: read_prompts returns matched prompts by keyword\n- Unit test: Already retrieved prompts excluded from response\n- Unit test: Newly retrieved prompts added to session state\n- Unit test: Response includes reminder text\n- Unit test: read_prompts while gated returns error\n- Unit test: Empty tags returns empty response\n- Unit test: Uses keyword matching not LLM",
"priority": "high",
"dependencies": [
"48"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T22:50:39.115Z"
},
{
"id": "50",
"title": "Implement progressive tool exposure for gated sessions",
"description": "Modify tools/list behavior to only expose begin_session while gated, and expose all tools plus read_prompts after ungating.",
"details": "1. Update tools/list handling in `/src/mcplocal/src/router.ts`:\n ```typescript\n async handleToolsList(sessionId: string): Promise<Tool[]> {\n const state = this.getSessionState(sessionId);\n \n if (state.gated) {\n // Only show begin_session while gated\n return [this.getBeginSessionTool()];\n }\n \n // After ungating: all upstream tools + read_prompts\n const upstreamTools = await this.discoverTools();\n return [...upstreamTools, this.getReadPromptsTool()];\n }\n ```\n\n2. Block direct tool calls while gated:\n ```typescript\n async handleToolCall(sessionId: string, toolName: string, args: any): Promise<any> {\n const state = this.getSessionState(sessionId);\n \n if (state.gated && toolName !== 'begin_session') {\n // Intercept: extract keywords, match prompts, inject briefing\n return this.handleInterceptedCall(sessionId, toolName, args);\n }\n \n // Normal routing\n return this.routeToolCall(toolName, args);\n }\n ```\n\n3. Ensure notifications/tools/list_changed is sent after ungating",
"testStrategy": "- Unit test: tools/list while gated returns only begin_session\n- Unit test: tools/list after ungating returns all tools + read_prompts\n- Unit test: begin_session not visible after ungating\n- Unit test: Tool call while gated (not begin_session) triggers intercept\n- Unit test: Tool call after ungating routes normally\n- Unit test: notifications/tools/list_changed sent on ungate",
"priority": "high",
"dependencies": [
"48",
"49"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T22:50:39.120Z"
},
{
"id": "51",
"title": "Implement keyword extraction from tool calls",
"description": "Create a service that extracts keywords from tool names and arguments for the intercept fallback path when clients skip begin_session.",
"details": "1. Create `/src/mcplocal/src/services/keyword-extractor.service.ts`:\n ```typescript\n export class KeywordExtractorService {\n extractKeywords(toolName: string, args: Record<string, any>): string[] {\n const keywords: string[] = [];\n \n // Extract from tool name (split on / and -)\n // e.g., \"home-assistant/get_entities\" -> [\"home\", \"assistant\", \"get\", \"entities\"]\n keywords.push(...this.extractFromName(toolName));\n \n // Extract from argument values\n // e.g., { domain: \"light\", entity_id: \"light.kitchen\" } -> [\"light\", \"kitchen\"]\n keywords.push(...this.extractFromArgs(args));\n \n // Deduplicate and sanitize\n return [...new Set(keywords.map(k => this.sanitize(k)))];\n }\n \n private sanitize(keyword: string): string {\n // Remove special characters, lowercase, limit length\n return keyword.toLowerCase().replace(/[^a-z0-9]/g, '').slice(0, 50);\n }\n }\n ```\n\n2. Handle various argument types: strings, arrays, nested objects\n\n3. Prevent injection by sanitizing extracted keywords",
"testStrategy": "- Unit test: Extracts keywords from tool name with /\n- Unit test: Extracts keywords from tool name with -\n- Unit test: Extracts keywords from string argument values\n- Unit test: Extracts keywords from array argument values\n- Unit test: Handles nested object arguments\n- Unit test: Sanitizes special characters\n- Unit test: Deduplicates keywords\n- Unit test: Handles empty arguments\n- Unit test: Limits keyword length to prevent abuse",
"priority": "medium",
"dependencies": [],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T22:40:47.575Z"
},
{
"id": "52",
"title": "Implement tool call intercept with briefing injection",
"description": "When a gated session calls a tool without first calling begin_session, intercept the call, extract keywords, match prompts, and inject the briefing alongside the real tool result.",
"details": "1. Implement handleInterceptedCall in `/src/mcplocal/src/router.ts`:\n ```typescript\n async handleInterceptedCall(\n sessionId: string,\n toolName: string,\n args: any\n ): Promise<ToolResult> {\n // 1. Extract keywords from tool call\n const keywords = this.keywordExtractor.extractKeywords(toolName, args);\n \n // 2. Match prompts using keywords\n const { fullContent, indexOnly, remaining } = \n await this.promptSelector.selectPrompts(keywords, this.promptIndex);\n \n // 3. Execute the actual tool call\n const actualResult = await this.routeToolCall(toolName, args);\n \n // 4. Build briefing with intercept preamble\n const briefing = this.buildBriefing(fullContent, indexOnly, remaining, 'intercept');\n \n // 5. Ungate session\n this.ungateSession(sessionId);\n \n // 6. Send notifications/tools/list_changed\n await this.sendToolsListChanged();\n \n // 7. Return combined result\n return {\n content: [{\n type: 'text',\n text: `${briefing}\\n\\n---\\n\\n${actualResult.content[0].text}`\n }]\n };\n }\n ```\n\n2. Use gate-intercept-preamble system prompt for the briefing prefix",
"testStrategy": "- Unit test: Tool call while gated triggers intercept\n- Unit test: Keywords extracted from tool name and args\n- Unit test: Prompts matched using extracted keywords\n- Unit test: Actual tool still executes and returns result\n- Unit test: Briefing prepended to tool result\n- Unit test: Session ungated after intercept\n- Unit test: notifications/tools/list_changed sent\n- Unit test: Intercept preamble included in briefing\n- Integration test: End-to-end intercept flow",
"priority": "high",
"dependencies": [
"50",
"51"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T22:51:03.822Z"
},
{
"id": "53",
"title": "Add prompt index to initialize instructions",
"description": "Modify the initialize handler to include the compact prompt index and gate message in instructions for gated projects.",
"details": "1. Update initialize handling in `/src/mcplocal/src/router.ts`:\n ```typescript\n async handleInitialize(sessionId: string): Promise<InitializeResult> {\n const state = this.getSessionState(sessionId);\n \n let instructions = this.projectConfig.prompt || '';\n \n if (state.gated) {\n // Add gate instructions\n const gateInstructions = await this.getSystemPrompt('gate-instructions');\n \n // Build prompt index (cap at 50, priority 7+ if over)\n const index = this.buildPromptIndex();\n \n instructions += `\\n\\n${gateInstructions.replace('{{prompt_index}}', index)}`;\n }\n \n return {\n protocolVersion: '2024-11-05',\n capabilities: { ... },\n serverInfo: { ... },\n instructions,\n };\n }\n ```\n\n2. Build prompt index with cap:\n - If <= 50 prompts: include all\n - If > 50 prompts: include only priority 7+\n - Format: `- <name>: <summary>` (~100 chars per entry)",
"testStrategy": "- Unit test: Gated project includes gate instructions in initialize\n- Unit test: Prompt index included in instructions\n- Unit test: Index capped at 50 entries\n- Unit test: Over 50 prompts shows priority 7+ only\n- Unit test: Non-gated project skips gate instructions\n- Unit test: {{prompt_index}} template replaced\n- Integration test: End-to-end initialize with gated project",
"priority": "high",
"dependencies": [
"47",
"44"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T22:52:13.697Z"
},
{
"id": "54",
"title": "Create mcpctl-system project with system prompts",
"description": "Implement bootstrap logic to create the mcpctl-system project and its required system prompts on first startup, with protection against deletion.",
"details": "1. Create seed migration or startup hook:\n ```typescript\n async function bootstrapSystemProject() {\n const systemProject = await projectRepo.findByName('mcpctl-system');\n if (systemProject) return; // Already exists\n \n // Create mcpctl-system project\n const project = await projectRepo.create({\n name: 'mcpctl-system',\n description: 'System prompts for mcpctl gating and encouragement',\n gated: false, // System project is not gated\n ownerId: SYSTEM_USER_ID,\n });\n \n // Create required system prompts\n const systemPrompts = [\n { name: 'gate-instructions', priority: 10, content: GATE_INSTRUCTIONS },\n { name: 'gate-encouragement', priority: 10, content: GATE_ENCOURAGEMENT },\n { name: 'read-prompts-reminder', priority: 10, content: READ_PROMPTS_REMINDER },\n { name: 'gate-intercept-preamble', priority: 10, content: GATE_INTERCEPT_PREAMBLE },\n { name: 'session-greeting', priority: 10, content: SESSION_GREETING },\n ];\n \n for (const p of systemPrompts) {\n await promptRepo.create({ ...p, projectId: project.id });\n }\n }\n ```\n\n2. Add delete protection in prompt delete endpoint:\n - Check if prompt belongs to mcpctl-system\n - Return 403 error if attempting to delete system prompt\n\n3. Define default content for each system prompt per PRD",
"testStrategy": "- Unit test: System project created on first startup\n- Unit test: All 5 system prompts created\n- Unit test: Subsequent startups don't duplicate\n- Unit test: Delete system prompt returns 403\n- Unit test: System prompts have priority 10\n- Unit test: mcpctl-system project has gated=false\n- Integration test: End-to-end bootstrap flow",
"priority": "high",
"dependencies": [
"40",
"39"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T22:56:12.064Z"
},
{
"id": "55",
"title": "Implement system prompt fetching and caching in mcplocal",
"description": "Add functionality to mcplocal router to fetch system prompts from mcpd and cache them with 60s TTL, supporting template variable replacement.",
"details": "1. Add system prompt fetching in `/src/mcplocal/src/router.ts`:\n ```typescript\n private systemPromptCache: Map<string, { content: string; expiresAt: number }> = new Map();\n \n async getSystemPrompt(name: string): Promise<string> {\n const cached = this.systemPromptCache.get(name);\n if (cached && cached.expiresAt > Date.now()) {\n return cached.content;\n }\n \n const prompts = await this.mcpdClient.fetch(\n '/api/v1/projects/mcpctl-system/prompts/visible'\n );\n const prompt = prompts.find(p => p.name === name);\n if (!prompt) {\n throw new Error(`System prompt not found: ${name}`);\n }\n \n this.systemPromptCache.set(name, {\n content: prompt.content,\n expiresAt: Date.now() + 60000, // 60s TTL\n });\n \n return prompt.content;\n }\n ```\n\n2. Add template variable replacement:\n ```typescript\n replaceTemplateVariables(content: string, vars: Record<string, string>): string {\n return content\n .replace(/\\{\\{prompt_index\\}\\}/g, vars.prompt_index || '')\n .replace(/\\{\\{project_name\\}\\}/g, vars.project_name || '')\n .replace(/\\{\\{matched_prompts\\}\\}/g, vars.matched_prompts || '')\n .replace(/\\{\\{remaining_prompts\\}\\}/g, vars.remaining_prompts || '');\n }\n ```",
"testStrategy": "- Unit test: System prompt fetched from mcpd\n- Unit test: Cached prompt returned within TTL\n- Unit test: Cache miss triggers fresh fetch\n- Unit test: Missing system prompt throws error\n- Unit test: Template variables replaced correctly\n- Unit test: Unknown template variables left as-is\n- Integration test: End-to-end fetch and cache",
"priority": "high",
"dependencies": [
"54"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T22:57:28.917Z"
},
{
"id": "56",
"title": "Implement prompt link resolution service",
"description": "Create a service that fetches linked prompt content from source MCP servers using the project's service account, with dead link detection.",
"details": "1. Create `/src/mcplocal/src/services/link-resolver.service.ts`:\n ```typescript\n export class LinkResolverService {\n constructor(private mcpdClient: McpdClient) {}\n \n async resolveLink(linkTarget: string): Promise<{\n content: string | null;\n status: 'alive' | 'dead' | 'unknown';\n error?: string;\n }> {\n // Parse linkTarget: project/server:resource-uri\n const { project, server, uri } = this.parseLink(linkTarget);\n \n try {\n // Use service account for source project\n const content = await this.fetchResource(project, server, uri);\n return { content, status: 'alive' };\n } catch (error) {\n this.logDeadLink(linkTarget, error);\n return { \n content: null, \n status: 'dead',\n error: error.message \n };\n }\n }\n \n private parseLink(linkTarget: string): { project: string; server: string; uri: string } {\n const match = linkTarget.match(/^([^/]+)\\/([^:]+):(.+)$/);\n if (!match) throw new Error('Invalid link format');\n return { project: match[1], server: match[2], uri: match[3] };\n }\n \n private async fetchResource(project: string, server: string, uri: string): Promise<string> {\n // Call mcpd to fetch resource via service account\n // mcpd routes to the source project's MCP server\n }\n }\n ```\n\n2. Log dead links as errors\n\n3. Cache resolution results",
"testStrategy": "- Unit test: Valid link parsed correctly\n- Unit test: Invalid link format throws error\n- Unit test: Successful resolution returns content and status='alive'\n- Unit test: Failed resolution returns status='dead' with error\n- Unit test: Dead link logged as error\n- Unit test: Service account header included in request\n- Integration test: End-to-end link resolution",
"priority": "medium",
"dependencies": [
"40"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T23:07:29.026Z"
},
{
"id": "57",
"title": "Add linkStatus to prompt GET responses",
"description": "Modify the GET /api/v1/prompts endpoint to include linkStatus (alive/dead/unknown) for linked prompts by checking link health.",
"details": "1. Update `/src/mcpd/src/routes/prompts.ts` GET endpoint:\n ```typescript\n fastify.get('/api/v1/prompts', async (request, reply) => {\n const prompts = await promptService.findAll(filter);\n \n // Check link status for linked prompts\n const promptsWithStatus = await Promise.all(\n prompts.map(async (p) => {\n if (!p.linkTarget) {\n return { ...p, linkStatus: null };\n }\n const status = await linkResolver.checkLinkHealth(p.linkTarget);\n return { ...p, linkStatus: status };\n })\n );\n \n return reply.send(promptsWithStatus);\n });\n ```\n\n2. Consider caching link health to avoid repeated checks\n\n3. Add `linkStatus` field to prompt response schema:\n - `null` for non-linked prompts\n - `'alive'` for working links\n - `'dead'` for broken links\n - `'unknown'` for unchecked links",
"testStrategy": "- Unit test: Non-linked prompt has linkStatus=null\n- Unit test: Linked prompt with working link has linkStatus='alive'\n- Unit test: Linked prompt with broken link has linkStatus='dead'\n- Unit test: Link health cached to avoid repeated checks\n- Unit test: All prompts in response have linkStatus field\n- Integration test: End-to-end GET with linked prompts",
"priority": "medium",
"dependencies": [
"56"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T23:09:07.078Z"
},
{
"id": "58",
"title": "Add RBAC for prompt link creation",
"description": "Implement RBAC checks requiring edit permission on the target project to create prompt links, and auto-create service account permission on the source project.",
"details": "1. Update prompt creation in `/src/mcpd/src/services/prompt.service.ts`:\n ```typescript\n async createPrompt(data: CreatePromptInput, userId: string): Promise<Prompt> {\n if (data.linkTarget) {\n // Verify user has edit permission on target project RBAC\n const hasPermission = await this.rbacService.checkPermission(\n userId, data.projectId, 'edit'\n );\n if (!hasPermission) {\n throw new ForbiddenError('Edit permission required to create prompt links');\n }\n \n // Parse link target\n const { project: sourceProject, server, uri } = this.parseLink(data.linkTarget);\n \n // Create service account permission on source project\n await this.rbacService.createServiceAccountPermission(\n data.projectId, // target project\n sourceProject, // source project\n server,\n uri,\n 'read'\n );\n }\n \n return this.promptRepo.create(data);\n }\n ```\n\n2. Clean up service account permission when link is deleted\n\n3. Handle permission denied from source project",
"testStrategy": "- Unit test: Link creation requires edit permission\n- Unit test: Link creation without permission throws 403\n- Unit test: Service account permission created on source project\n- Unit test: Service account permission deleted when link deleted\n- Unit test: Non-link prompts skip RBAC checks\n- Integration test: End-to-end link creation with RBAC",
"priority": "medium",
"dependencies": [
"56"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T23:09:07.081Z"
},
{
"id": "59",
"title": "Update CLI create prompt command for priority and link",
"description": "Extend the mcpctl create prompt command to accept --priority (1-10) and --link (project/server:uri) flags.",
"details": "1. Update `/src/cli/src/commands/create.ts` for prompt:\n ```typescript\n .command('prompt <name>')\n .description('Create a new prompt')\n .option('-p, --project <name>', 'Project to create prompt in')\n .option('--priority <number>', 'Priority level (1-10, default: 5)', '5')\n .option('--link <target>', 'Link to MCP resource (project/server:uri)')\n .option('-f, --file <path>', 'Read content from file')\n .action(async (name, options) => {\n const priority = parseInt(options.priority, 10);\n if (priority < 1 || priority > 10) {\n console.error('Priority must be between 1 and 10');\n process.exit(1);\n }\n \n let content = '';\n if (options.link) {\n // Linked prompts don't need content (fetched from source)\n content = `[Link: ${options.link}]`;\n } else if (options.file) {\n content = await fs.readFile(options.file, 'utf-8');\n } else {\n content = await promptForContent();\n }\n \n const body = {\n name,\n content,\n projectId: options.project,\n priority,\n linkTarget: options.link,\n };\n \n await api.post('/api/v1/prompts', body);\n });\n ```\n\n2. Validate link format: `project/server:resource-uri`\n\n3. Add shell completions for new flags",
"testStrategy": "- Unit test: --priority flag sets prompt priority\n- Unit test: --priority validation (1-10 range)\n- Unit test: --link flag sets linkTarget\n- Unit test: --link validation (format check)\n- Unit test: Linked prompt skips content prompt\n- Unit test: Default priority is 5\n- Integration test: End-to-end create with flags",
"priority": "medium",
"dependencies": [
"40"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T23:03:45.972Z"
},
{
"id": "60",
"title": "Update CLI get prompt command for -A flag and link columns",
"description": "Extend the mcpctl get prompt command with -A (all projects) flag and add link target and status columns to output.",
"details": "1. Update `/src/cli/src/commands/get.ts` for prompt:\n ```typescript\n .command('prompt [name]')\n .option('-A, --all-projects', 'Show prompts from all projects')\n .option('-p, --project <name>', 'Filter by project')\n .action(async (name, options) => {\n let url = '/api/v1/prompts';\n if (options.allProjects) {\n url += '?all=true';\n } else if (options.project) {\n url += `?project=${options.project}`;\n }\n \n const prompts = await api.get(url);\n \n // Format table with new columns\n formatPromptsTable(prompts, {\n columns: ['PROJECT', 'NAME', 'PRIORITY', 'LINK', 'STATUS']\n });\n });\n ```\n\n2. Update table formatter to handle link columns:\n ```\n PROJECT NAME PRIORITY LINK STATUS\n homeautomation security-policies 8 - -\n homeautomation architecture-adr 6 system-public/docmost-mcp:docmost://pages/a1 alive\n ```\n\n3. Add shell completions for -A flag",
"testStrategy": "- Unit test: -A flag shows all projects\n- Unit test: --project flag filters by project\n- Unit test: PRIORITY column displayed\n- Unit test: LINK column shows linkTarget or -\n- Unit test: STATUS column shows linkStatus or -\n- Unit test: Table formatted correctly\n- Integration test: End-to-end get with flags",
"priority": "medium",
"dependencies": [
"57",
"59"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T23:09:31.501Z"
},
{
"id": "61",
"title": "Update CLI describe project command for gated status",
"description": "Extend mcpctl describe project to show gated status, session greeting, and prompt table with priority and link information.",
"details": "1. Update `/src/cli/src/commands/get.ts` describe project:\n ```typescript\n async function describeProject(name: string) {\n const project = await api.get(`/api/v1/projects/${name}`);\n const prompts = await api.get(`/api/v1/projects/${name}/prompt-index`);\n const greeting = await getSessionGreeting(name);\n \n console.log(`Name: ${project.name}`);\n console.log(`Gated: ${project.gated}`);\n console.log(`LLM Provider: ${project.llmProvider || '-'}`);\n console.log(`...`);\n console.log();\n console.log(`Session greeting:`);\n console.log(` ${greeting}`);\n console.log();\n console.log(`Prompts:`);\n console.log(` NAME PRIORITY TYPE LINK`);\n for (const p of prompts) {\n const type = p.linkTarget ? 'link' : 'local';\n const link = p.linkTarget || '-';\n console.log(` ${p.name.padEnd(20)} ${p.priority.toString().padEnd(9)} ${type.padEnd(7)} ${link}`);\n }\n }\n ```\n\n2. Fetch session greeting from system prompts or project config",
"testStrategy": "- Unit test: Gated status displayed\n- Unit test: Session greeting displayed\n- Unit test: Prompt table with PRIORITY, TYPE, LINK columns\n- Unit test: TYPE shows 'local' or 'link'\n- Unit test: LINK shows target or -\n- Integration test: End-to-end describe project",
"priority": "medium",
"dependencies": [
"44",
"54"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T23:04:56.320Z"
},
{
"id": "62",
"title": "Update CLI edit project command for gated field",
"description": "Extend mcpctl edit project to allow editing the gated boolean field.",
"details": "1. Update `/src/cli/src/commands/edit.ts` for project:\n ```typescript\n async function editProject(name: string) {\n const project = await api.get(`/api/v1/projects/${name}`);\n \n // Add gated to editable fields\n const yaml = `\n name: ${project.name}\n description: ${project.description}\n gated: ${project.gated}\n llmProvider: ${project.llmProvider || ''}\n ...`;\n \n const edited = await openEditor(yaml);\n const parsed = YAML.parse(edited);\n \n // Validate gated is boolean\n if (typeof parsed.gated !== 'boolean') {\n console.error('gated must be true or false');\n process.exit(1);\n }\n \n await api.put(`/api/v1/projects/${name}`, parsed);\n }\n ```\n\n2. Update project validation schema to accept gated\n\n3. Handle conversion from string 'true'/'false' to boolean",
"testStrategy": "- Unit test: Gated field appears in editor YAML\n- Unit test: Gated field saved on edit\n- Unit test: Boolean validation (true/false only)\n- Unit test: String 'true'/'false' converted to boolean\n- Integration test: End-to-end edit project gated",
"priority": "medium",
"dependencies": [
"39"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T23:03:46.657Z"
},
{
"id": "63",
"title": "Add unit tests for prompt priority and link CRUD",
"description": "Create comprehensive unit tests for all prompt CRUD operations with the new priority and linkTarget fields.",
"details": "1. Add tests in `/src/mcpd/tests/services/prompt-service.test.ts`:\n ```typescript\n describe('Prompt Priority', () => {\n it('creates prompt with explicit priority', async () => {\n const prompt = await service.createPrompt({ ...data, priority: 8 });\n expect(prompt.priority).toBe(8);\n });\n \n it('uses default priority 5 when not specified', async () => {\n const prompt = await service.createPrompt(data);\n expect(prompt.priority).toBe(5);\n });\n \n it('validates priority range 1-10', async () => {\n await expect(service.createPrompt({ ...data, priority: 11 }))\n .rejects.toThrow();\n });\n \n it('updates priority', async () => {\n const updated = await service.updatePrompt(id, { priority: 3 });\n expect(updated.priority).toBe(3);\n });\n });\n \n describe('Prompt Links', () => {\n it('creates linked prompt', async () => {\n const prompt = await service.createPrompt({\n ...data,\n linkTarget: 'project/server:uri'\n });\n expect(prompt.linkTarget).toBe('project/server:uri');\n });\n \n it('rejects invalid link format', async () => {\n await expect(service.createPrompt({\n ...data,\n linkTarget: 'invalid'\n })).rejects.toThrow();\n });\n \n it('linkTarget is immutable on update', async () => {\n // linkTarget not accepted in update schema\n });\n });\n ```",
"testStrategy": "This task IS the test implementation. Verify:\n- All priority CRUD tests pass\n- All link CRUD tests pass\n- Validation tests cover edge cases\n- Tests use proper mocking patterns\n- Coverage meets project standards",
"priority": "high",
"dependencies": [
"40",
"41"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T22:52:53.091Z"
},
{
"id": "64",
"title": "Add unit tests for tag matching algorithm",
"description": "Create comprehensive unit tests for the deterministic tag matching algorithm covering score calculation, byte budget, and priority handling.",
"details": "1. Add tests in `/src/mcplocal/tests/services/tag-matcher.test.ts`:\n ```typescript\n describe('TagMatcherService', () => {\n describe('score calculation', () => {\n it('priority 10 prompts have infinite score', () => {\n const score = matcher.computeScore(['any'], { priority: 10, ... });\n expect(score).toBe(Infinity);\n });\n \n it('score = matching_tags * priority', () => {\n const score = matcher.computeScore(\n ['tag1', 'tag2'],\n { priority: 5, summary: 'tag1 tag2', chapters: [] }\n );\n expect(score).toBe(10); // 2 tags * 5 priority\n });\n });\n \n describe('matching', () => {\n it('matches case-insensitively', () => {\n const matches = matcher.matchesPrompt('ZIGBEE', { summary: 'zigbee setup' });\n expect(matches).toBe(true);\n });\n \n it('matches substring in summary', () => { ... });\n it('matches substring in chapters', () => { ... });\n });\n \n describe('byte budget', () => {\n it('includes full content until budget exhausted', () => { ... });\n it('matched prompts beyond budget become index entries', () => { ... });\n it('non-matched prompts listed as names only', () => { ... });\n });\n });\n ```",
"testStrategy": "This task IS the test implementation. Verify:\n- Score calculation tests pass\n- Matching tests cover all cases\n- Byte budget tests verify allocation\n- Edge cases handled (empty tags, no prompts, etc.)\n- Tests are deterministic",
"priority": "high",
"dependencies": [
"45"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T22:51:03.827Z"
},
{
"id": "65",
"title": "Add unit tests for gating state machine",
"description": "Create comprehensive unit tests for the session gating state machine covering all transitions and edge cases.",
"details": "1. Add tests in `/src/mcplocal/tests/router-gating.test.ts`:\n ```typescript\n describe('Gating State Machine', () => {\n describe('initial state', () => {\n it('starts gated for gated project', () => {\n const router = createRouter({ gated: true });\n const state = router.getSessionState('session1');\n expect(state.gated).toBe(true);\n });\n \n it('starts ungated for non-gated project', () => {\n const router = createRouter({ gated: false });\n const state = router.getSessionState('session1');\n expect(state.gated).toBe(false);\n });\n });\n \n describe('begin_session transition', () => {\n it('ungates session on successful begin_session', async () => {\n const router = createGatedRouter();\n await router.handleBeginSession('session1', { tags: ['test'] });\n expect(router.getSessionState('session1').gated).toBe(false);\n });\n \n it('returns matched prompts', async () => { ... });\n it('sends notifications/tools/list_changed', async () => { ... });\n });\n \n describe('intercept transition', () => {\n it('ungates session on tool call intercept', async () => { ... });\n it('extracts keywords from tool call', async () => { ... });\n it('injects briefing with tool result', async () => { ... });\n });\n \n describe('tools/list behavior', () => {\n it('returns only begin_session while gated', async () => { ... });\n it('returns all tools + read_prompts after ungating', async () => { ... });\n });\n });\n ```",
"testStrategy": "This task IS the test implementation. Verify:\n- Initial state tests pass\n- Transition tests cover happy paths\n- Edge case tests (already ungated, etc.)\n- Notification tests verify signals sent\n- Tests use proper mocking",
"priority": "high",
"dependencies": [
"50",
"52"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T22:51:03.832Z"
},
{
"id": "66",
"title": "Add unit tests for LLM prompt selection",
"description": "Create unit tests for the LLM-based prompt selection service covering LLM interactions, fallback behavior, and priority 10 handling.",
"details": "1. Add tests in `/src/mcplocal/tests/services/llm-prompt-selector.test.ts`:\n ```typescript\n describe('LlmPromptSelectorService', () => {\n describe('priority 10 handling', () => {\n it('always includes priority 10 prompts', async () => {\n const result = await selector.selectPrompts(['unrelated'], promptIndex);\n expect(result.priority10).toContain(priority10Prompt);\n });\n });\n \n describe('LLM selection', () => {\n it('sends tags and index to heavy LLM', async () => {\n await selector.selectPrompts(['zigbee', 'mqtt'], promptIndex);\n expect(mockLlm.complete).toHaveBeenCalledWith(\n expect.stringContaining('zigbee')\n );\n });\n \n it('parses LLM response correctly', async () => {\n mockLlm.complete.mockResolvedValue(\n '[{\"name\": \"prompt1\", \"reason\": \"relevant\"}]'\n );\n const result = await selector.selectPrompts(['test'], promptIndex);\n expect(result.selected[0].name).toBe('prompt1');\n });\n });\n \n describe('fallback behavior', () => {\n it('falls back to tag matcher on LLM error', async () => { ... });\n it('falls back on LLM timeout', async () => { ... });\n it('falls back when no LLM available', async () => { ... });\n });\n \n describe('summary generation', () => {\n it('generates missing summaries with fast LLM', async () => { ... });\n });\n });\n ```",
"testStrategy": "This task IS the test implementation. Verify:\n- Priority 10 tests pass\n- LLM interaction tests use proper mocks\n- Fallback tests cover all error scenarios\n- Summary generation tests pass\n- Response parsing handles edge cases",
"priority": "high",
"dependencies": [
"46"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T22:51:03.836Z"
},
{
"id": "67",
"title": "Add integration tests for gated session flow",
"description": "Create end-to-end integration tests for the complete gated session flow including connect, begin_session, tool calls, and read_prompts.",
"details": "1. Add tests in `/src/mcplocal/tests/integration/gated-flow.test.ts`:\n ```typescript\n describe('Gated Session Flow Integration', () => {\n let app: FastifyInstance;\n let mcpClient: McpClient;\n \n beforeAll(async () => {\n app = await createTestApp();\n // Seed test project with gated=true and test prompts\n });\n \n describe('end-to-end gated flow', () => {\n it('connect → begin_session with tags → tools available → correct prompts', async () => {\n // 1. Connect to MCP endpoint\n const session = await mcpClient.connect(app, 'test-project');\n \n // 2. Verify only begin_session available\n const toolsBefore = await session.listTools();\n expect(toolsBefore.map(t => t.name)).toEqual(['begin_session']);\n \n // 3. Call begin_session\n const briefing = await session.callTool('begin_session', {\n tags: ['test', 'integration']\n });\n expect(briefing).toContain('matched prompt content');\n \n // 4. Verify all tools now available\n const toolsAfter = await session.listTools();\n expect(toolsAfter.map(t => t.name)).toContain('read_prompts');\n });\n });\n \n describe('end-to-end intercept flow', () => {\n it('connect → skip begin_session → call tool → keywords extracted → briefing injected', async () => { ... });\n });\n \n describe('end-to-end read_prompts', () => {\n it('after ungating → request more context → additional prompts → no duplicates', async () => { ... });\n });\n });\n ```",
"testStrategy": "This task IS the test implementation. Verify:\n- Happy path tests pass\n- Intercept path tests pass\n- read_prompts deduplication works\n- Tests use realistic data\n- Tests clean up properly",
"priority": "high",
"dependencies": [
"65"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T22:51:03.840Z"
},
{
"id": "68",
"title": "Add integration tests for prompt links",
"description": "Create end-to-end integration tests for prompt link creation, resolution, and dead link detection.",
"details": "1. Add tests in `/src/mcplocal/tests/integration/prompt-links.test.ts`:\n ```typescript\n describe('Prompt Links Integration', () => {\n describe('link creation', () => {\n it('creates link with RBAC permission', async () => {\n // Setup: user with edit permission on target project\n const prompt = await api.post('/api/v1/prompts', {\n name: 'linked-prompt',\n content: '[Link]',\n projectId: targetProject.id,\n linkTarget: 'source-project/server:uri'\n });\n expect(prompt.linkTarget).toBe('source-project/server:uri');\n });\n \n it('rejects link creation without RBAC permission', async () => { ... });\n });\n \n describe('link resolution', () => {\n it('fetches content from source server', async () => { ... });\n it('uses service account for RBAC', async () => { ... });\n });\n \n describe('dead link lifecycle', () => {\n it('detects dead link when source unavailable', async () => {\n // Kill source server\n const prompts = await api.get('/api/v1/prompts');\n const linked = prompts.find(p => p.linkTarget);\n expect(linked.linkStatus).toBe('dead');\n });\n \n it('recovers when source restored', async () => { ... });\n });\n });\n ```",
"testStrategy": "This task IS the test implementation. Verify:\n- RBAC tests cover permission scenarios\n- Resolution tests verify content fetched\n- Dead link tests cover full lifecycle\n- Tests properly mock/control source servers\n- Tests clean up resources",
"priority": "medium",
"dependencies": [
"57",
"58"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T23:12:22.348Z"
},
{
"id": "69",
"title": "Add CLI unit tests for new prompt and project flags",
"description": "Create unit tests for the new CLI flags: --priority, --link for prompts, -A for get, and gated field for projects.",
"details": "1. Add tests in `/src/cli/tests/commands/prompt.test.ts`:\n ```typescript\n describe('create prompt command', () => {\n it('--priority sets prompt priority', async () => {\n await cli('create prompt test --priority 8');\n expect(mockApi.post).toHaveBeenCalledWith(\n '/api/v1/prompts',\n expect.objectContaining({ priority: 8 })\n );\n });\n \n it('--priority validates range 1-10', async () => {\n await expect(cli('create prompt test --priority 15'))\n .rejects.toThrow('Priority must be between 1 and 10');\n });\n \n it('--link sets linkTarget', async () => {\n await cli('create prompt test --link proj/srv:uri');\n expect(mockApi.post).toHaveBeenCalledWith(\n '/api/v1/prompts',\n expect.objectContaining({ linkTarget: 'proj/srv:uri' })\n );\n });\n });\n \n describe('get prompt command', () => {\n it('-A shows all projects', async () => {\n await cli('get prompt -A');\n expect(mockApi.get).toHaveBeenCalledWith('/api/v1/prompts?all=true');\n });\n });\n ```\n\n2. Add tests for project gated field editing\n\n3. Add tests for describe project output",
"testStrategy": "This task IS the test implementation. Verify:\n- Flag parsing tests pass\n- Validation tests cover edge cases\n- API call tests verify correct parameters\n- Output formatting tests verify columns\n- Tests mock API properly",
"priority": "medium",
"dependencies": [
"59",
"60",
"61",
"62"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T23:12:22.352Z"
},
{
"id": "70",
"title": "Add shell completions for new CLI flags",
"description": "Update shell completion scripts (bash, zsh, fish) to include completions for new flags: --priority, --link, -A, and gated values.",
"details": "1. Update `/completions/mcpctl.fish`:\n ```fish\n # create prompt completions\n complete -c mcpctl -n '__fish_seen_subcommand_from create; and __fish_seen_subcommand_from prompt' -l priority -d 'Priority level (1-10)' -a '(seq 1 10)'\n complete -c mcpctl -n '__fish_seen_subcommand_from create; and __fish_seen_subcommand_from prompt' -l link -d 'Link to MCP resource (project/server:uri)'\n \n # get prompt completions \n complete -c mcpctl -n '__fish_seen_subcommand_from get; and __fish_seen_subcommand_from prompt' -s A -l all-projects -d 'Show prompts from all projects'\n ```\n\n2. Update bash completions similarly\n\n3. Update zsh completions similarly\n\n4. Add dynamic completion for priority values (1-10)",
"testStrategy": "- Manual test: Fish completions suggest --priority with values 1-10\n- Manual test: Fish completions suggest --link flag\n- Manual test: Fish completions suggest -A/--all-projects\n- Manual test: Bash completions work similarly\n- Manual test: Zsh completions work similarly",
"priority": "low",
"dependencies": [
"59",
"60"
],
"status": "done",
"subtasks": [],
"updatedAt": "2026-02-25T23:12:22.363Z"
}
],
"metadata": {
"version": "1.0.0",
"lastModified": "2026-02-21T18:52:29.084Z",
"taskCount": 36,
"completedCount": 33,
"lastModified": "2026-02-25T23:12:22.364Z",
"taskCount": 70,
"completedCount": 67,
"tags": [
"master"
]

@@ -2,91 +2,175 @@ _mcpctl() {
local cur prev words cword
_init_completion || return

local commands="config status get describe instance instances apply setup claude project projects backup restore help"
local global_opts="-v --version -o --output --daemon-url -h --help"
local resources="servers profiles projects instances"
local commands="status login logout config get describe delete logs create edit apply backup restore mcp approve help"
local project_commands="attach-server detach-server get describe delete logs create edit help"
local global_opts="-v --version --daemon-url --direct --project -h --help"
local resources="servers instances secrets templates projects users groups rbac prompts promptrequests"

case "${words[1]}" in
# Check if --project was given
local has_project=false
local i
for ((i=1; i < cword; i++)); do
if [[ "${words[i]}" == "--project" ]]; then
has_project=true
break
fi
done

# Find the first subcommand (skip --project and its argument, skip flags)
local subcmd=""
local subcmd_pos=0
for ((i=1; i < cword; i++)); do
if [[ "${words[i]}" == "--project" || "${words[i]}" == "--daemon-url" ]]; then
((i++)) # skip the argument
continue
fi
if [[ "${words[i]}" != -* ]]; then
subcmd="${words[i]}"
subcmd_pos=$i
break
fi
done

# Find the resource type after get/describe/delete/edit
local resource_type=""
if [[ -n "$subcmd_pos" ]] && [[ $subcmd_pos -gt 0 ]]; then
for ((i=subcmd_pos+1; i < cword; i++)); do
if [[ "${words[i]}" != -* ]] && [[ " $resources " == *" ${words[i]} "* ]]; then
resource_type="${words[i]}"
break
fi
done
fi

# If completing the --project value
if [[ "$prev" == "--project" ]]; then
local names
names=$(mcpctl get projects -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null)
COMPREPLY=($(compgen -W "$names" -- "$cur"))
return
fi

# Fetch resource names dynamically (jq extracts only top-level names)
_mcpctl_resource_names() {
local rt="$1"
if [[ -n "$rt" ]]; then
# Instances don't have a name field — use server.name instead
if [[ "$rt" == "instances" ]]; then
mcpctl get instances -o json 2>/dev/null | jq -r '.[][].server.name' 2>/dev/null
else
mcpctl get "$rt" -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null
fi
fi
}

# Get the --project value from the command line
_mcpctl_get_project_value() {
local i
for ((i=1; i < cword; i++)); do
if [[ "${words[i]}" == "--project" ]] && (( i+1 < cword )); then
echo "${words[i+1]}"
return
fi
done
}

case "$subcmd" in
config)
COMPREPLY=($(compgen -W "view set path reset help" -- "$cur"))
if [[ $((cword - subcmd_pos)) -eq 1 ]]; then
COMPREPLY=($(compgen -W "view set path reset claude claude-generate setup impersonate help" -- "$cur"))
fi
return ;;
status)
COMPREPLY=($(compgen -W "--daemon-url -h --help" -- "$cur"))
COMPREPLY=($(compgen -W "-h --help" -- "$cur"))
return ;;
get)
if [[ $cword -eq 2 ]]; then
login)
COMPREPLY=($(compgen -W "--url --email --password -h --help" -- "$cur"))
return ;;
logout)
return ;;
mcp)
return ;;
get|describe|delete)
if [[ -z "$resource_type" ]]; then
COMPREPLY=($(compgen -W "$resources" -- "$cur"))
else
COMPREPLY=($(compgen -W "-o --output --daemon-url -h --help" -- "$cur"))
local names
names=$(_mcpctl_resource_names "$resource_type")
COMPREPLY=($(compgen -W "$names -o --output -h --help" -- "$cur"))
fi
return ;;
describe)
if [[ $cword -eq 2 ]]; then
COMPREPLY=($(compgen -W "$resources" -- "$cur"))
edit)
if [[ -z "$resource_type" ]]; then
COMPREPLY=($(compgen -W "servers projects" -- "$cur"))
else
COMPREPLY=($(compgen -W "-o --output --daemon-url -h --help" -- "$cur"))
local names
names=$(_mcpctl_resource_names "$resource_type")
COMPREPLY=($(compgen -W "$names -h --help" -- "$cur"))
fi
return ;;
instance|instances)
if [[ $cword -eq 2 ]]; then
COMPREPLY=($(compgen -W "list ls start stop restart remove rm logs inspect help" -- "$cur"))
else
case "${words[2]}" in
logs)
COMPREPLY=($(compgen -W "--tail --since -h --help" -- "$cur"))
;;
start)
COMPREPLY=($(compgen -W "--env --image -h --help" -- "$cur"))
;;
list|ls)
COMPREPLY=($(compgen -W "--server-id -o --output -h --help" -- "$cur"))
;;
esac
fi
logs)
COMPREPLY=($(compgen -W "--tail --since -f --follow -h --help" -- "$cur"))
return ;;
claude)
if [[ $cword -eq 2 ]]; then
COMPREPLY=($(compgen -W "generate show add remove help" -- "$cur"))
else
case "${words[2]}" in
generate|show|add|remove)
COMPREPLY=($(compgen -W "--path -p -h --help" -- "$cur"))
;;
esac
fi
return ;;
project|projects)
if [[ $cword -eq 2 ]]; then
COMPREPLY=($(compgen -W "list ls create delete rm show profiles set-profiles help" -- "$cur"))
else
case "${words[2]}" in
create)
COMPREPLY=($(compgen -W "--description -d -h --help" -- "$cur"))
;;
list|ls)
COMPREPLY=($(compgen -W "-o --output -h --help" -- "$cur"))
;;
esac
create)
if [[ $((cword - subcmd_pos)) -eq 1 ]]; then
COMPREPLY=($(compgen -W "server secret project user group rbac prompt promptrequest help" -- "$cur"))
fi
return ;;
apply)
COMPREPLY=($(compgen -f -- "$cur"))
return ;;
backup)
COMPREPLY=($(compgen -W "-o --output -p --password -r --resources -h --help" -- "$cur"))
COMPREPLY=($(compgen -W "-o --output -p --password -h --help" -- "$cur"))
return ;;
restore)
COMPREPLY=($(compgen -W "-i --input -p --password -c --conflict -h --help" -- "$cur"))
return ;;
setup)
attach-server)
# Only complete if no server arg given yet (first arg after subcmd)
if [[ $((cword - subcmd_pos)) -ne 1 ]]; then return; fi
local proj names all_servers proj_servers
proj=$(_mcpctl_get_project_value)
if [[ -n "$proj" ]]; then
all_servers=$(mcpctl get servers -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null)
proj_servers=$(mcpctl --project "$proj" get servers -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null)
names=$(comm -23 <(echo "$all_servers" | sort) <(echo "$proj_servers" | sort))
else
names=$(_mcpctl_resource_names "servers")
fi
COMPREPLY=($(compgen -W "$names" -- "$cur"))
return ;;
detach-server)
# Only complete if no server arg given yet (first arg after subcmd)
if [[ $((cword - subcmd_pos)) -ne 1 ]]; then return; fi
local proj names
proj=$(_mcpctl_get_project_value)
if [[ -n "$proj" ]]; then
names=$(mcpctl --project "$proj" get servers -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null)
fi
COMPREPLY=($(compgen -W "$names" -- "$cur"))
return ;;
approve)
if [[ -z "$resource_type" ]]; then
COMPREPLY=($(compgen -W "promptrequest" -- "$cur"))
else
local names
names=$(_mcpctl_resource_names "$resource_type")
COMPREPLY=($(compgen -W "$names" -- "$cur"))
fi
return ;;
help)
COMPREPLY=($(compgen -W "$commands" -- "$cur"))
return ;;
esac

if [[ $cword -eq 1 ]]; then
COMPREPLY=($(compgen -W "$commands $global_opts" -- "$cur"))
# No subcommand yet — offer commands based on context
if [[ -z "$subcmd" ]]; then
if $has_project; then
COMPREPLY=($(compgen -W "$project_commands $global_opts" -- "$cur"))
else
COMPREPLY=($(compgen -W "$commands $global_opts" -- "$cur"))
fi
fi
}

@@ -1,80 +1,325 @@
# mcpctl fish completions

# Erase any stale completions from previous versions
complete -c mcpctl -e

set -l commands status login logout config get describe delete logs create edit apply patch backup restore mcp approve help
set -l project_commands attach-server detach-server get describe delete logs create edit help

# Disable file completions by default
complete -c mcpctl -f

# Global options
complete -c mcpctl -s v -l version -d 'Show version'
complete -c mcpctl -s o -l output -d 'Output format' -xa 'table json yaml'
complete -c mcpctl -l daemon-url -d 'mcplocal daemon URL' -x
complete -c mcpctl -l direct -d 'Bypass mcplocal, connect directly to mcpd'
complete -c mcpctl -l project -d 'Target project context' -x
complete -c mcpctl -s h -l help -d 'Show help'

# Helper: check if --project was given
function __mcpctl_has_project
    set -l tokens (commandline -opc)
    for i in (seq (count $tokens))
        if test "$tokens[$i]" = "--project"
            return 0
        end
    end
    return 1
end

# Helper: check if a resource type has been selected after get/describe/delete/edit
set -l resources servers instances secrets templates projects users groups rbac prompts promptrequests
# All accepted resource aliases (plural + singular + short forms)
set -l resource_aliases servers server srv instances instance inst secrets secret sec templates template tpl projects project proj users user groups group rbac rbac-definition rbac-binding prompts prompt promptrequests promptrequest pr

function __mcpctl_needs_resource_type
    # File-scope set -l variables are gone by completion time, so redefine the alias list here
    set -l resource_aliases servers server srv instances instance inst secrets secret sec templates template tpl projects project proj users user groups group rbac rbac-definition rbac-binding prompts prompt promptrequests promptrequest pr
    set -l tokens (commandline -opc)
    set -l found_cmd false
    for tok in $tokens
        if $found_cmd
            # Check if next token after get/describe/delete/edit is a resource type or alias
            if contains -- $tok $resource_aliases
                return 1 # resource type already present
            end
        end
        if contains -- $tok get describe delete edit patch
            set found_cmd true
        end
    end
    if $found_cmd
        return 0 # command found but no resource type yet
    end
    return 1
end

# Map any resource alias to the canonical plural form for API calls
function __mcpctl_resolve_resource
    switch $argv[1]
        case server srv servers; echo servers
        case instance inst instances; echo instances
        case secret sec secrets; echo secrets
        case template tpl templates; echo templates
        case project proj projects; echo projects
        case user users; echo users
        case group groups; echo groups
        case rbac rbac-definition rbac-binding; echo rbac
        case prompt prompts; echo prompts
        case promptrequest promptrequests pr; echo promptrequests
        case '*'; echo $argv[1]
    end
end

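For readers porting the completions to bash, the alias canonicalization above maps one-to-one onto a shell `case` statement. A sketch (not part of the shipped completions, and abbreviated to a few arms):

```shell
# Map a resource alias to its canonical plural form (mirrors the fish switch)
resolve_resource() {
    case "$1" in
        server|srv|servers) echo servers ;;
        instance|inst|instances) echo instances ;;
        rbac|rbac-definition|rbac-binding) echo rbac ;;
        promptrequest|promptrequests|pr) echo promptrequests ;;
        *) echo "$1" ;;  # unknown aliases pass through unchanged
    esac
}

resolve_resource pr
```

The pass-through default matters: it lets the API call fail with a useful message instead of the completion silently swallowing an unknown type.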
function __mcpctl_get_resource_type
    # Redefine here as well: file-scope locals are not visible at completion time
    set -l resource_aliases servers server srv instances instance inst secrets secret sec templates template tpl projects project proj users user groups group rbac rbac-definition rbac-binding prompts prompt promptrequests promptrequest pr
    set -l tokens (commandline -opc)
    set -l found_cmd false
    for tok in $tokens
        if $found_cmd
            if contains -- $tok $resource_aliases
                __mcpctl_resolve_resource $tok
                return
            end
        end
        if contains -- $tok get describe delete edit patch
            set found_cmd true
        end
    end
end

# Fetch resource names dynamically from the API (jq extracts only top-level names)
function __mcpctl_resource_names
    set -l resource (__mcpctl_get_resource_type)
    if test -z "$resource"
        return
    end
    # Instances don't have a name field — use server.name instead
    if test "$resource" = "instances"
        mcpctl get instances -o json 2>/dev/null | jq -r '.[][].server.name' 2>/dev/null
    else if test "$resource" = "prompts" -o "$resource" = "promptrequests"
        # Use -A to include all projects, not just global
        mcpctl get $resource -A -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null
    else
        mcpctl get $resource -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null
    end
end

# Fetch project names for --project value
function __mcpctl_project_names
    mcpctl get projects -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null
end

# Helper: get the --project value from the command line
function __mcpctl_get_project_value
    set -l tokens (commandline -opc)
    for i in (seq (count $tokens))
        if test "$tokens[$i]" = "--project"; and test $i -lt (count $tokens)
            echo $tokens[(math $i + 1)]
            return
        end
    end
end

# Servers currently attached to the project (for detach-server)
function __mcpctl_project_servers
    set -l proj (__mcpctl_get_project_value)
    if test -z "$proj"
        return
    end
    mcpctl --project $proj get servers -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null
end

# Servers NOT attached to the project (for attach-server)
function __mcpctl_available_servers
    set -l proj (__mcpctl_get_project_value)
    if test -z "$proj"
        # No project — show all servers
        mcpctl get servers -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null
        return
    end
    set -l all (mcpctl get servers -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null)
    set -l attached (mcpctl --project $proj get servers -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null)
    for s in $all
        if not contains -- $s $attached
            echo $s
        end
    end
end

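`__mcpctl_available_servers` is a set difference: all servers minus the attached ones. Outside fish, the same filter can be expressed in bash with `comm` (server names here are made up for illustration):

```shell
all=$'fetch\ngithub\npostgres'
attached=$'github'

# comm -23 keeps lines unique to the first input; both inputs must be sorted
comm -23 <(sort <<<"$all") <(sort <<<"$attached")
```

The fish loop with `contains` trades the sort requirement for O(n*m) scanning, which is fine at completion-list sizes.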
# --project value completion
complete -c mcpctl -l project -xa '(__mcpctl_project_names)'

# Top-level commands (without --project)
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a status -d 'Show status and connectivity'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a login -d 'Authenticate with mcpd'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a logout -d 'Log out'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a config -d 'Manage configuration'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a get -d 'List resources'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a describe -d 'Show resource details'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a delete -d 'Delete a resource'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a logs -d 'Get instance logs'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a create -d 'Create a resource'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a edit -d 'Edit a resource'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a apply -d 'Apply configuration from file'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a backup -d 'Backup configuration'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a restore -d 'Restore from backup'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a patch -d 'Patch a resource field'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a approve -d 'Approve a prompt request'
complete -c mcpctl -n "not __mcpctl_has_project; and not __fish_seen_subcommand_from $commands" -a help -d 'Show help'

# Project-scoped commands (with --project)
complete -c mcpctl -n "__mcpctl_has_project; and not __fish_seen_subcommand_from $project_commands" -a attach-server -d 'Attach a server to the project'
complete -c mcpctl -n "__mcpctl_has_project; and not __fish_seen_subcommand_from $project_commands" -a detach-server -d 'Detach a server from the project'
complete -c mcpctl -n "__mcpctl_has_project; and not __fish_seen_subcommand_from $project_commands" -a get -d 'List resources (scoped to project)'
complete -c mcpctl -n "__mcpctl_has_project; and not __fish_seen_subcommand_from $project_commands" -a describe -d 'Show resource details'
complete -c mcpctl -n "__mcpctl_has_project; and not __fish_seen_subcommand_from $project_commands" -a delete -d 'Delete a resource'
complete -c mcpctl -n "__mcpctl_has_project; and not __fish_seen_subcommand_from $project_commands" -a logs -d 'Get instance logs'
complete -c mcpctl -n "__mcpctl_has_project; and not __fish_seen_subcommand_from $project_commands" -a create -d 'Create a resource'
complete -c mcpctl -n "__mcpctl_has_project; and not __fish_seen_subcommand_from $project_commands" -a edit -d 'Edit a resource'
complete -c mcpctl -n "__mcpctl_has_project; and not __fish_seen_subcommand_from $project_commands" -a help -d 'Show help'

# Resource types — only when resource type not yet selected
complete -c mcpctl -n "__fish_seen_subcommand_from get describe delete patch; and __mcpctl_needs_resource_type" -a "$resources" -d 'Resource type'
complete -c mcpctl -n "__fish_seen_subcommand_from edit; and __mcpctl_needs_resource_type" -a 'servers secrets projects groups rbac prompts promptrequests' -d 'Resource type'

# Resource names — after resource type is selected
complete -c mcpctl -n "__fish_seen_subcommand_from get describe delete edit patch; and not __mcpctl_needs_resource_type" -a '(__mcpctl_resource_names)' -d 'Resource name'

# Helper: check if attach-server/detach-server already has a server argument
function __mcpctl_needs_server_arg
    set -l tokens (commandline -opc)
    set -l found_cmd false
    for tok in $tokens
        if $found_cmd
            if not string match -q -- '-*' $tok
                return 1 # server arg already present
            end
        end
        if contains -- $tok attach-server detach-server
            set found_cmd true
        end
    end
    if $found_cmd
        return 0 # command found but no server arg yet
    end
    return 1
end

# attach-server: show servers NOT in the project (only if no server arg yet)
complete -c mcpctl -n "__fish_seen_subcommand_from attach-server; and __mcpctl_needs_server_arg" -a '(__mcpctl_available_servers)' -d 'Server'

# detach-server: show servers IN the project (only if no server arg yet)
complete -c mcpctl -n "__fish_seen_subcommand_from detach-server; and __mcpctl_needs_server_arg" -a '(__mcpctl_project_servers)' -d 'Server'

# get/describe options
complete -c mcpctl -n "__fish_seen_subcommand_from get" -s o -l output -d 'Output format' -xa 'table json yaml'
complete -c mcpctl -n "__fish_seen_subcommand_from get" -l project -d 'Filter by project' -xa '(__mcpctl_project_names)'
complete -c mcpctl -n "__fish_seen_subcommand_from get" -s A -l all -d 'Show all resources across projects'
complete -c mcpctl -n "__fish_seen_subcommand_from describe" -s o -l output -d 'Output format' -xa 'detail json yaml'
complete -c mcpctl -n "__fish_seen_subcommand_from describe" -l show-values -d 'Show secret values'

# login options
complete -c mcpctl -n "__fish_seen_subcommand_from login" -l url -d 'mcpd URL' -x
complete -c mcpctl -n "__fish_seen_subcommand_from login" -l email -d 'Email address' -x
complete -c mcpctl -n "__fish_seen_subcommand_from login" -l password -d 'Password' -x

# config subcommands
set -l config_cmds view set path reset claude claude-generate setup impersonate
complete -c mcpctl -n "__fish_seen_subcommand_from config; and not __fish_seen_subcommand_from $config_cmds" -a view -d 'Show configuration'
complete -c mcpctl -n "__fish_seen_subcommand_from config; and not __fish_seen_subcommand_from $config_cmds" -a set -d 'Set a config value'
complete -c mcpctl -n "__fish_seen_subcommand_from config; and not __fish_seen_subcommand_from $config_cmds" -a path -d 'Show config file path'
complete -c mcpctl -n "__fish_seen_subcommand_from config; and not __fish_seen_subcommand_from $config_cmds" -a reset -d 'Reset to defaults'
complete -c mcpctl -n "__fish_seen_subcommand_from config; and not __fish_seen_subcommand_from $config_cmds" -a claude -d 'Generate .mcp.json for project'
complete -c mcpctl -n "__fish_seen_subcommand_from config; and not __fish_seen_subcommand_from $config_cmds" -a setup -d 'Configure LLM provider'
complete -c mcpctl -n "__fish_seen_subcommand_from config; and not __fish_seen_subcommand_from $config_cmds" -a impersonate -d 'Impersonate a user'

# create subcommands
set -l create_cmds server secret project user group rbac prompt promptrequest
complete -c mcpctl -n "__fish_seen_subcommand_from create; and not __fish_seen_subcommand_from $create_cmds" -a server -d 'Create a server'
complete -c mcpctl -n "__fish_seen_subcommand_from create; and not __fish_seen_subcommand_from $create_cmds" -a secret -d 'Create a secret'
complete -c mcpctl -n "__fish_seen_subcommand_from create; and not __fish_seen_subcommand_from $create_cmds" -a project -d 'Create a project'
complete -c mcpctl -n "__fish_seen_subcommand_from create; and not __fish_seen_subcommand_from $create_cmds" -a user -d 'Create a user'
complete -c mcpctl -n "__fish_seen_subcommand_from create; and not __fish_seen_subcommand_from $create_cmds" -a group -d 'Create a group'
complete -c mcpctl -n "__fish_seen_subcommand_from create; and not __fish_seen_subcommand_from $create_cmds" -a rbac -d 'Create an RBAC binding'
complete -c mcpctl -n "__fish_seen_subcommand_from create; and not __fish_seen_subcommand_from $create_cmds" -a prompt -d 'Create an approved prompt'
complete -c mcpctl -n "__fish_seen_subcommand_from create; and not __fish_seen_subcommand_from $create_cmds" -a promptrequest -d 'Create a prompt request'

# create prompt/promptrequest options
complete -c mcpctl -n "__fish_seen_subcommand_from create; and __fish_seen_subcommand_from prompt promptrequest" -l project -d 'Project name' -xa '(__mcpctl_project_names)'
complete -c mcpctl -n "__fish_seen_subcommand_from create; and __fish_seen_subcommand_from prompt promptrequest" -l content -d 'Prompt content text' -x
complete -c mcpctl -n "__fish_seen_subcommand_from create; and __fish_seen_subcommand_from prompt promptrequest" -l content-file -d 'Read content from file' -rF
complete -c mcpctl -n "__fish_seen_subcommand_from create; and __fish_seen_subcommand_from prompt promptrequest" -l priority -d 'Priority 1-10' -xa '(seq 1 10)'
complete -c mcpctl -n "__fish_seen_subcommand_from create; and __fish_seen_subcommand_from prompt" -l link -d 'Link to MCP resource (project/server:uri)' -x

# create project --gated/--no-gated
complete -c mcpctl -n "__fish_seen_subcommand_from create; and __fish_seen_subcommand_from project" -l gated -d 'Enable gated sessions'
complete -c mcpctl -n "__fish_seen_subcommand_from create; and __fish_seen_subcommand_from project" -l no-gated -d 'Disable gated sessions'

# logs: takes a server/instance name, then options
function __mcpctl_instance_names
    mcpctl get instances -o json 2>/dev/null | jq -r '.[][].server.name' 2>/dev/null
end
complete -c mcpctl -n "__fish_seen_subcommand_from logs" -a '(__mcpctl_instance_names)' -d 'Server name'
complete -c mcpctl -n "__fish_seen_subcommand_from logs" -l tail -d 'Number of lines' -x
complete -c mcpctl -n "__fish_seen_subcommand_from logs" -l since -d 'Since timestamp' -x
complete -c mcpctl -n "__fish_seen_subcommand_from logs" -s f -l follow -d 'Follow log output'

# backup options
complete -c mcpctl -n "__fish_seen_subcommand_from backup" -s o -l output -d 'Output file' -rF
complete -c mcpctl -n "__fish_seen_subcommand_from backup" -s p -l password -d 'Encryption password' -x
complete -c mcpctl -n "__fish_seen_subcommand_from backup" -s r -l resources -d 'Resources to backup' -xa 'servers profiles projects'

# restore options
complete -c mcpctl -n "__fish_seen_subcommand_from restore" -s i -l input -d 'Input file' -rF
complete -c mcpctl -n "__fish_seen_subcommand_from restore" -s p -l password -d 'Decryption password' -x
complete -c mcpctl -n "__fish_seen_subcommand_from restore" -s c -l conflict -d 'Conflict strategy' -xa 'skip overwrite fail'

# approve: first arg is resource type, second is name
function __mcpctl_approve_needs_type
    set -l tokens (commandline -opc)
    set -l found false
    for tok in $tokens
        if $found
            if contains -- $tok promptrequest promptrequests
                return 1 # type already given
            end
        end
        if test "$tok" = "approve"
            set found true
        end
    end
    if $found
        return 0 # approve found but no type yet
    end
    return 1
end

function __mcpctl_approve_needs_name
    set -l tokens (commandline -opc)
    set -l found_type false
    for tok in $tokens
        if $found_type
            # next non-flag token after type is the name
            if not string match -q -- '-*' $tok
                return 1 # name already given
            end
        end
        if contains -- $tok promptrequest promptrequests
            set found_type true
        end
    end
    if $found_type
        return 0 # type given but no name yet
    end
    return 1
end

function __mcpctl_promptrequest_names
    mcpctl get promptrequests -A -o json 2>/dev/null | jq -r '.[][].name' 2>/dev/null
end

complete -c mcpctl -n "__fish_seen_subcommand_from approve; and __mcpctl_approve_needs_type" -a 'promptrequest' -d 'Resource type'
complete -c mcpctl -n "__fish_seen_subcommand_from approve; and __mcpctl_approve_needs_name" -a '(__mcpctl_promptrequest_names)' -d 'Prompt request name'

# apply takes a file
complete -c mcpctl -n "__fish_seen_subcommand_from apply" -s f -l file -d 'Configuration file' -rF
complete -c mcpctl -n "__fish_seen_subcommand_from apply" -F

# help completions
complete -c mcpctl -n "__fish_seen_subcommand_from help" -a "$commands" -d 'Command'

398
deploy.sh
Executable file
@@ -0,0 +1,398 @@
#!/bin/bash
# Deploy mcpctl stack to Portainer
# Usage: ./deploy.sh [--dry-run]

set -e

SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
STACK_DIR="$SCRIPT_DIR/stack"
COMPOSE_FILE="$STACK_DIR/docker-compose.yml"
ENV_FILE="$STACK_DIR/.env"

# Portainer configuration
PORTAINER_URL="${PORTAINER_URL:-http://10.0.0.194:9000}"
PORTAINER_USER="${PORTAINER_USER:-michal}"
STACK_NAME="mcpctl"
ENDPOINT_ID="2"

# Colors for output
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
NC='\033[0m'

log_info() { echo -e "${GREEN}[INFO]${NC} $1" >&2; }
log_warn() { echo -e "${YELLOW}[WARN]${NC} $1" >&2; }
log_error() { echo -e "${RED}[ERROR]${NC} $1" >&2; }

check_files() {
    if [[ ! -f "$COMPOSE_FILE" ]]; then
        log_error "Compose file not found: $COMPOSE_FILE"
        exit 1
    fi
    if [[ ! -f "$ENV_FILE" ]]; then
        log_error "Environment file not found: $ENV_FILE"
        exit 1
    fi
    log_info "Found compose file: $COMPOSE_FILE"
    log_info "Found env file: $ENV_FILE"
}

get_password() {
    if [[ -n "$PORTAINER_PASSWORD" ]]; then
        echo "$PORTAINER_PASSWORD"
        return
    fi
    if [[ -f "$SCRIPT_DIR/.portainer_password" ]]; then
        cat "$SCRIPT_DIR/.portainer_password"
        return
    fi
    if [[ -f "$HOME/.portainer_password" ]]; then
        cat "$HOME/.portainer_password"
        return
    fi
    read -s -p "Enter Portainer password for $PORTAINER_USER: " password
    echo >&2
    echo "$password"
}

get_jwt_token() {
    local password="$1"
    log_info "Authenticating to Portainer..."

    local escaped_password
    escaped_password=$(printf '%s' "$password" | jq -Rs .)

    local response
    response=$(curl -s -X POST "$PORTAINER_URL/api/auth" \
        -H "Content-Type: application/json" \
        -d "{\"Username\":\"$PORTAINER_USER\",\"Password\":$escaped_password}")

    local token
    token=$(echo "$response" | jq -r '.jwt // empty')

    if [[ -z "$token" ]]; then
        log_error "Authentication failed: $(echo "$response" | jq -r '.message // "Unknown error"')"
        exit 1
    fi
    echo "$token"
}

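`jq -Rs .` is what keeps an arbitrary password JSON-safe in the auth payload: it reads raw stdin as one string and emits it as a quoted, escaped JSON literal. Standalone (the password is a made-up example):

```shell
# Quotes and backslashes come back escaped, ready to splice into a JSON body
printf '%s' 'p@ss"word\1' | jq -Rs .
```

This is why the script can interpolate `$escaped_password` directly into the `-d` body without quoting it again.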
parse_env_to_json() {
    local env_file="$1"
    local json_array="["
    local first=true

    while IFS= read -r line || [[ -n "$line" ]]; do
        [[ "$line" =~ ^#.*$ ]] && continue
        [[ -z "$line" ]] && continue

        local name="${line%%=*}"
        local value="${line#*=}"
        [[ "$name" == "$line" ]] && continue

        if [[ "$first" == "true" ]]; then
            first=false
        else
            json_array+=","
        fi

        value=$(echo "$value" | sed 's/\\/\\\\/g; s/"/\\"/g')
        json_array+="{\"name\":\"$name\",\"value\":\"$value\"}"
    done < "$env_file"

    json_array+="]"
    echo "$json_array"
}

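The hand-rolled loop in `parse_env_to_json` can be cross-checked against a pure-jq equivalent. This is a sketch (assumes jq 1.6+; `env_to_json` is a hypothetical helper, not part of deploy.sh) that skips comments and blank lines and splits on the first `=`:

```shell
# Convert KEY=VALUE lines into Portainer's [{"name":...,"value":...}] shape
env_to_json() {
    jq -cRn '[inputs
        | select(test("^\\s*#") | not)   # drop comment lines
        | select(length > 0)             # drop blank lines
        | capture("^(?<name>[^=]+)=(?<value>.*)$")]' < "$1"
}

env_file=$(mktemp)
printf 'FOO=bar\n# a comment\nBAZ=a=b\n' > "$env_file"
env_to_json "$env_file"
rm -f "$env_file"
```

Letting jq do the encoding also covers characters the script's `sed` pass does not escape, such as tabs.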
# Find existing stack by name
find_stack_id() {
    local token="$1"
    local response
    response=$(curl -s -X GET "$PORTAINER_URL/api/stacks" \
        -H "Authorization: Bearer $token")

    echo "$response" | jq -r --arg name "$STACK_NAME" \
        '.[] | select(.Name == $name) | .Id // empty'
}

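`find_stack_id` relies on jq's `select` over the stack list. Against a canned response (the IDs and second stack are invented) it behaves like this:

```shell
response='[{"Id":3,"Name":"mcpctl"},{"Id":7,"Name":"other"}]'
# Prints the Id of the matching stack, or nothing if the name is absent
echo "$response" | jq -r --arg name "mcpctl" '.[] | select(.Name == $name) | .Id // empty'
```

The `// empty` makes the no-match case print nothing, which is what lets the caller test `[[ -n "$stack_id" ]]` to decide between create and update.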
get_stack_info() {
    local token="$1"
    local stack_id="$2"
    curl -s -X GET "$PORTAINER_URL/api/stacks/$stack_id" \
        -H "Authorization: Bearer $token" \
        -H "Content-Type: application/json"
}

get_stack_file() {
    local token="$1"
    local stack_id="$2"
    local response
    response=$(curl -s -X GET "$PORTAINER_URL/api/stacks/$stack_id/file" \
        -H "Authorization: Bearer $token" \
        -H "Content-Type: application/json")

    if echo "$response" | jq -e '.StackFileContent' > /dev/null 2>&1; then
        echo "$response" | jq -r '.StackFileContent'
    else
        echo "# Could not retrieve current compose file"
    fi
}

# Colorize a unified diff file, line by line
print_colored_diff() {
    local diff_file="$1"
    while IFS= read -r line; do
        if [[ "$line" == ---* ]] || [[ "$line" == +++* ]] || [[ "$line" == @@* ]]; then
            echo -e "${YELLOW}$line${NC}"
        elif [[ "$line" == -* ]]; then
            echo -e "${RED}$line${NC}"
        elif [[ "$line" == +* ]]; then
            echo -e "${GREEN}$line${NC}"
        else
            echo "$line"
        fi
    done < "$diff_file"
}

show_diff() {
    local token="$1"
    local stack_id="$2"
    local env_json="$3"

    log_info "Fetching current state from Portainer..."

    local current_compose
    current_compose=$(get_stack_file "$token" "$stack_id")

    local current_env
    local stack_info
    stack_info=$(get_stack_info "$token" "$stack_id")
    current_env=$(echo "$stack_info" | jq -r 'if .Env then .Env[] | "\(.name)=\(.value)" else empty end' 2>/dev/null | sort)

    local new_env
    new_env=$(echo "$env_json" | jq -r '.[] | "\(.name)=\(.value)"' | sort)

    local tmp_dir
    tmp_dir=$(mktemp -d)

    echo "$current_compose" > "$tmp_dir/current_compose.yml"
    cat "$COMPOSE_FILE" > "$tmp_dir/new_compose.yml"
    echo "$current_env" > "$tmp_dir/current_env.txt"
    echo "$new_env" > "$tmp_dir/new_env.txt"

    echo ""
    echo "=== ENVIRONMENT VARIABLES DIFF ==="
    echo ""

    if diff -u "$tmp_dir/current_env.txt" "$tmp_dir/new_env.txt" > "$tmp_dir/env_diff.txt" 2>&1; then
        echo -e "${GREEN}No changes in environment variables${NC}"
    else
        print_colored_diff "$tmp_dir/env_diff.txt"
    fi

    echo ""
    echo "=== COMPOSE FILE DIFF ==="
    echo ""

    if diff -u "$tmp_dir/current_compose.yml" "$tmp_dir/new_compose.yml" > "$tmp_dir/compose_diff.txt" 2>&1; then
        echo -e "${GREEN}No changes in compose file${NC}"
    else
        print_colored_diff "$tmp_dir/compose_diff.txt"
    fi

    rm -rf "$tmp_dir"
}

create_stack() {
    local token="$1"
    local env_json="$2"

    local compose_content
    compose_content=$(cat "$COMPOSE_FILE")

    local compose_escaped
    compose_escaped=$(echo "$compose_content" | jq -Rs .)

    log_info "Creating new stack '$STACK_NAME'..."

    local payload
    payload=$(jq -n \
        --arg name "$STACK_NAME" \
        --argjson env "$env_json" \
        --argjson stackFileContent "$compose_escaped" \
        '{
            "name": $name,
            "env": $env,
            "stackFileContent": $stackFileContent
        }')

    local response
    response=$(curl -s -X POST "$PORTAINER_URL/api/stacks?type=2&method=string&endpointId=$ENDPOINT_ID" \
        -H "Authorization: Bearer $token" \
        -H "Content-Type: application/json" \
        -d "$payload")

    local error_msg
    error_msg=$(echo "$response" | jq -r '.message // empty')

    if [[ -n "$error_msg" ]]; then
        log_error "Stack creation failed: $error_msg"
        echo "$response" | jq .
        exit 1
    fi

    local new_id
    new_id=$(echo "$response" | jq -r '.Id')
    log_info "Stack created successfully! (ID: $new_id)"
    echo "$response" | jq '{Id, Name, Status, CreationDate}'
}

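Both `create_stack` and `update_stack` assemble the request body with `jq -n`, which sidesteps manual quoting entirely: `--arg` passes strings, `--argjson` passes already-valid JSON. The pattern in isolation (values invented for illustration):

```shell
env_json='[{"name":"FOO","value":"bar"}]'
# jq -Rs turns the compose text into a JSON string literal
compose=$(printf 'services: {}\n' | jq -Rs .)

jq -nc --arg name mcpctl \
       --argjson env "$env_json" \
       --argjson stackFileContent "$compose" \
       '{name: $name, env: $env, stackFileContent: $stackFileContent}'
```

Note that `--argjson` with the `jq -Rs` output works because the escaped compose content is itself a complete JSON value (a string).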
update_stack() {
    local token="$1"
    local stack_id="$2"
    local dry_run="$3"

    local compose_content
    compose_content=$(cat "$COMPOSE_FILE")

    local env_json
    env_json=$(parse_env_to_json "$ENV_FILE")

    if [[ "$dry_run" == "true" ]]; then
        log_warn "DRY RUN - Not actually deploying"
        show_diff "$token" "$stack_id" "$env_json"
        echo ""
        log_warn "DRY RUN complete - no changes made"
        log_info "Run without --dry-run to apply these changes"
        return 0
    fi

    local env_count
    env_count=$(echo "$env_json" | jq 'length')
    log_info "Deploying $env_count environment variables"
    log_info "Updating stack '$STACK_NAME' (ID: $stack_id)..."

    local compose_escaped
    compose_escaped=$(echo "$compose_content" | jq -Rs .)

    local payload
    payload=$(jq -n \
        --argjson env "$env_json" \
        --argjson stackFileContent "$compose_escaped" \
        '{
            "env": $env,
            "stackFileContent": $stackFileContent,
            "prune": true,
            "pullImage": true
        }')

    local response
    response=$(curl -s -X PUT "$PORTAINER_URL/api/stacks/$stack_id?endpointId=$ENDPOINT_ID" \
        -H "Authorization: Bearer $token" \
        -H "Content-Type: application/json" \
        -d "$payload")

    local error_msg
    error_msg=$(echo "$response" | jq -r '.message // empty')

    if [[ -n "$error_msg" ]]; then
        log_error "Deployment failed: $error_msg"
        echo "$response" | jq .
        exit 1
    fi

    log_info "Stack updated successfully!"
    echo "$response" | jq '{Id, Name, Status, CreationDate, UpdateDate}'
}

main() {
|
||||
local dry_run=false
|
||||
|
||||
while [[ $# -gt 0 ]]; do
|
||||
case $1 in
|
||||
--dry-run)
|
||||
dry_run=true
|
||||
shift
|
||||
;;
|
||||
--help|-h)
|
||||
echo "Usage: $0 [--dry-run]"
|
||||
echo ""
|
||||
echo "Deploy mcpctl stack to Portainer"
|
||||
echo ""
|
||||
echo "Options:"
|
||||
echo " --dry-run Show what would be deployed without actually deploying"
|
||||
echo " --help Show this help message"
|
||||
echo ""
|
||||
echo "Environment variables:"
|
||||
echo " PORTAINER_URL Portainer URL (default: http://10.0.0.194:9000)"
|
||||
echo " PORTAINER_USER Portainer username (default: michal)"
|
||||
echo " PORTAINER_PASSWORD Portainer password (or store in ~/.portainer_password)"
|
||||
exit 0
|
||||
;;
|
||||
*)
|
||||
log_error "Unknown option: $1"
|
||||
exit 1
|
||||
;;
|
||||
esac
|
||||
done
|
||||
|
||||
echo "========================================"
|
||||
echo " mcpctl Stack Deployment"
|
||||
echo "========================================"
|
||||
echo ""
|
||||
|
||||
check_files
|
||||
|
||||
local password
|
||||
password=$(get_password)
|
||||
|
||||
local token
|
||||
token=$(get_jwt_token "$password")
|
||||
log_info "Authentication successful"
|
||||
|
||||
# Find or create stack
|
||||
local stack_id
|
||||
stack_id=$(find_stack_id "$token")
|
||||
|
||||
if [[ -z "$stack_id" ]]; then
|
||||
if [[ "$dry_run" == "true" ]]; then
|
||||
log_warn "Stack '$STACK_NAME' does not exist yet"
|
||||
log_info "A real deploy would create it"
|
||||
return 0
|
||||
fi
|
||||
|
||||
log_info "Stack '$STACK_NAME' not found, creating..."
|
||||
local env_json
|
||||
env_json=$(parse_env_to_json "$ENV_FILE")
|
||||
create_stack "$token" "$env_json"
|
||||
else
|
||||
local stack_info
|
||||
stack_info=$(get_stack_info "$token" "$stack_id")
|
||||
local status_code
|
||||
status_code=$(echo "$stack_info" | jq -r '.Status // 0')
|
||||
local status_text="Unknown"
|
||||
case "$status_code" in
|
||||
1) status_text="Active" ;;
|
||||
2) status_text="Inactive" ;;
|
||||
esac
|
||||
log_info "Current stack status: $status_text (ID: $stack_id, Env vars: $(echo "$stack_info" | jq '.Env | length'))"
|
||||
|
||||
echo ""
|
||||
update_stack "$token" "$stack_id" "$dry_run"
|
||||
fi
|
||||
|
||||
echo ""
|
||||
log_info "Done!"
|
||||
|
||||
if [[ "$dry_run" == "false" ]]; then
|
||||
log_info "Check Portainer UI to verify containers are running"
|
||||
log_info "URL: $PORTAINER_URL/#!/$ENDPOINT_ID/docker/stacks/$STACK_NAME"
|
||||
fi
|
||||
}
|
||||
|
||||
main "$@"
|
||||
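The create and update paths above both consume `$env_json` from a `parse_env_to_json` helper defined earlier in the script (not shown in this diff). A minimal sketch of what such a helper could look like, assuming it maps `KEY=VALUE` lines to the `[{name, value}, ...]` array the Portainer stack API expects; the name and edge-case handling are assumptions:

```shell
# Hypothetical sketch of parse_env_to_json: .env lines -> Portainer env JSON.
parse_env_to_json() {
    local env_file="$1"
    # Drop comment and blank lines, then split each remaining line on the
    # first '=' into {name, value} objects collected into one JSON array.
    grep -v -e '^#' -e '^[[:space:]]*$' "$env_file" \
        | jq -Rn '[inputs | capture("^(?<name>[^=]+)=(?<value>.*)$")]'
}

printf 'FOO=bar\n# comment\nBAZ=1\n' > /tmp/demo.env
parse_env_to_json /tmp/demo.env
# → [{"name":"FOO","value":"bar"},{"name":"BAZ","value":"1"}]
```

Because the output is already valid JSON, it can be passed straight to `jq -n --argjson env "$env_json"` as the script does.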
@@ -49,6 +49,9 @@ COPY --from=builder /app/src/shared/dist/ src/shared/dist/
COPY --from=builder /app/src/db/dist/ src/db/dist/
COPY --from=builder /app/src/mcpd/dist/ src/mcpd/dist/

# Copy templates for seeding
COPY templates/ templates/

# Copy entrypoint
COPY deploy/entrypoint.sh /entrypoint.sh
RUN chmod +x /entrypoint.sh
deploy/Dockerfile.node-runner (new file, 13 lines)
@@ -0,0 +1,13 @@
# Base container for npm-based MCP servers (STDIO transport).
# mcpd uses this image to run `npx -y <packageName>` when a server
# has packageName but no dockerImage.
# Using slim (Debian) instead of alpine for better npm package compatibility.
FROM node:20-slim

WORKDIR /mcp

# Pre-warm npx cache directory
RUN mkdir -p /root/.npm

# Default entrypoint — overridden by mcpd via container command
ENTRYPOINT ["npx", "-y"]
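Per the comments above, mcpd runs `npx -y <packageName>` in this image whenever a server spec has `packageName` but no `dockerImage`. The actual selection happens inside mcpd (in TypeScript); the following shell sketch of that dispatch rule is purely illustrative, with the helper name and error message invented for the example:

```shell
# Hypothetical dispatch rule: prefer an explicit dockerImage; otherwise fall
# back to the shared node-runner image for npm-packaged servers.
resolve_runner_image() {
    local docker_image="$1" package_name="$2"
    if [ -n "$docker_image" ]; then
        echo "$docker_image"
    elif [ -n "$package_name" ]; then
        echo "${MCPD_NODE_RUNNER_IMAGE:-mcpctl-node-runner:latest}"
    else
        echo "error: server needs dockerImage or packageName" >&2
        return 1
    fi
}

resolve_runner_image "ghcr.io/homeassistant-ai/ha-mcp:2.4" ""
resolve_runner_image "" "@anthropic/slack-mcp"
```

The `MCPD_NODE_RUNNER_IMAGE` default mirrors the environment variable set on the mcpd service in the compose file below.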
@@ -30,6 +30,8 @@ services:
      MCPD_PORT: "3100"
      MCPD_HOST: "0.0.0.0"
      MCPD_LOG_LEVEL: info
      MCPD_NODE_RUNNER_IMAGE: mcpctl-node-runner:latest
      MCPD_MCP_NETWORK: mcp-servers
    depends_on:
      postgres:
        condition: service_healthy

@@ -48,6 +50,16 @@ services:
      retries: 3
      start_period: 10s

  # Base image for npm-based MCP servers (built once, used by mcpd)
  node-runner:
    build:
      context: ..
      dockerfile: deploy/Dockerfile.node-runner
    image: mcpctl-node-runner:latest
    profiles:
      - build
    entrypoint: ["echo", "Image built successfully"]

  postgres-test:
    image: postgres:16-alpine
    container_name: mcpctl-postgres-test

@@ -71,8 +83,11 @@ networks:
  mcpctl:
    driver: bridge
  mcp-servers:
    name: mcp-servers
    driver: bridge
    internal: true
    # Not internal — MCP servers need outbound access to reach external APIs
    # (e.g., Grafana, Home Assistant). Isolation is enforced by not binding
    # host ports on MCP server containers; only mcpd can reach them.

volumes:
  mcpctl-pgdata:
@@ -4,8 +4,8 @@ set -e
echo "mcpd: pushing database schema..."
pnpm -F @mcpctl/db exec prisma db push --schema=prisma/schema.prisma --accept-data-loss 2>&1

echo "mcpd: seeding default data..."
node src/mcpd/dist/seed-runner.js
echo "mcpd: seeding templates..."
TEMPLATES_DIR=templates node src/mcpd/dist/seed-runner.js

echo "mcpd: starting server..."
exec node src/mcpd/dist/main.js
deploy/mcplocal.service (new file, 15 lines)
@@ -0,0 +1,15 @@
[Unit]
Description=mcpctl local MCP proxy
After=network.target

[Service]
Type=simple
ExecStart=/usr/bin/mcpctl-local
Restart=on-failure
RestartSec=5
Environment=MCPLOCAL_MCPD_URL=http://10.0.0.194:3100
Environment=MCPLOCAL_HTTP_PORT=3200
Environment=MCPLOCAL_HTTP_HOST=127.0.0.1

[Install]
WantedBy=default.target
@@ -96,10 +96,12 @@ servers:
    description: Slack MCP server
    transport: STDIO
    packageName: "@anthropic/slack-mcp"
    envTemplate:
    env:
      - name: SLACK_TOKEN
        description: Slack bot token
        isSecret: true
        valueFrom:
          secretRef:
            name: slack-secrets
            key: token

  - name: github
    description: GitHub MCP server
examples/ha-mcp.yaml (new file, 28 lines)
@@ -0,0 +1,28 @@
servers:
  - name: ha-mcp
    description: "Home Assistant MCP - smart home control via MCP"
    dockerImage: "ghcr.io/homeassistant-ai/ha-mcp:2.4"
    transport: STREAMABLE_HTTP
    containerPort: 3000
    # For mcpd-managed containers:
    command:
      - python
      - "-c"
      - "from ha_mcp.server import HomeAssistantSmartMCPServer; s = HomeAssistantSmartMCPServer(); s.mcp.run(transport='sse', host='0.0.0.0', port=3000)"
    # For connecting to an already-running instance (host.containers.internal for container-to-host):
    externalUrl: "http://host.containers.internal:8086/mcp"
    env:
      - name: HOMEASSISTANT_URL
        value: ""
      - name: HOMEASSISTANT_TOKEN
        valueFrom:
          secretRef:
            name: ha-secrets
            key: token

profiles:
  - name: production
    server: ha-mcp
    envOverrides:
      HOMEASSISTANT_URL: "https://ha.itaz.eu"
      HOMEASSISTANT_TOKEN: "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiIyNjFlZTRhOWI2MGM0YTllOGJkNTIxN2Q3YmVmZDkzNSIsImlhdCI6MTc3MDA3NjYzOCwiZXhwIjoyMDg1NDM2NjM4fQ.17mAQxIrCBrQx3ogqAUetwEt-cngRmJiH-e7sLt-3FY"
fulldeploy.sh (new executable file, 35 lines)
@@ -0,0 +1,35 @@
#!/bin/bash
# Full deployment: Docker image → Portainer stack → RPM build/publish/install
set -e

SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
cd "$SCRIPT_DIR"

# Load .env
if [ -f .env ]; then
    set -a; source .env; set +a
fi

echo "========================================"
echo "  mcpctl Full Deploy"
echo "========================================"

echo ""
echo ">>> Step 1/3: Build & push mcpd Docker image"
echo ""
bash scripts/build-mcpd.sh "$@"

echo ""
echo ">>> Step 2/3: Deploy stack to production"
echo ""
bash deploy.sh

echo ""
echo ">>> Step 3/3: Build, publish & install RPM"
echo ""
bash scripts/release.sh

echo ""
echo "========================================"
echo "  Full deploy complete!"
echo "========================================"
installlocal.sh (new executable file, 26 lines)
@@ -0,0 +1,26 @@
#!/bin/bash
# Build (if needed) and install mcpctl RPM locally
set -e

SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
cd "$SCRIPT_DIR"

RPM_FILE=$(ls dist/mcpctl-*.rpm 2>/dev/null | head -1)

# Build if no RPM exists or if source is newer than the RPM
if [[ -z "$RPM_FILE" ]] || [[ $(find src/ -name '*.ts' -newer "$RPM_FILE" 2>/dev/null | head -1) ]]; then
    echo "==> Building RPM..."
    bash scripts/build-rpm.sh
    RPM_FILE=$(ls dist/mcpctl-*.rpm 2>/dev/null | head -1)
else
    echo "==> RPM is up to date: $RPM_FILE"
fi

echo "==> Installing $RPM_FILE..."
sudo rpm -Uvh --force "$RPM_FILE"

echo "==> Reloading systemd user units..."
systemctl --user daemon-reload

echo "==> Done!"
echo "   Enable mcplocal: systemctl --user enable --now mcplocal"
nfpm.yaml (10 changed lines)
@@ -5,11 +5,21 @@ release: "1"
maintainer: michal
description: kubectl-like CLI for managing MCP servers
license: MIT
depends:
  - jq
contents:
  - src: ./dist/mcpctl
    dst: /usr/bin/mcpctl
    file_info:
      mode: 0755
  - src: ./dist/mcpctl-local
    dst: /usr/bin/mcpctl-local
    file_info:
      mode: 0755
  - src: ./deploy/mcplocal.service
    dst: /usr/lib/systemd/user/mcplocal.service
    file_info:
      mode: 0644
  - src: ./completions/mcpctl.bash
    dst: /usr/share/bash-completion/completions/mcpctl
    file_info:
@@ -18,7 +18,11 @@
    "typecheck": "tsc --build",
    "rpm:build": "bash scripts/build-rpm.sh",
    "rpm:publish": "bash scripts/publish-rpm.sh",
    "release": "bash scripts/release.sh"
    "release": "bash scripts/release.sh",
    "mcpd:build": "bash scripts/build-mcpd.sh",
    "mcpd:deploy": "bash deploy.sh",
    "mcpd:deploy-dry": "bash deploy.sh --dry-run",
    "mcpd:logs": "bash logs.sh"
  },
  "engines": {
    "node": ">=20.0.0",
pnpm-lock.yaml (generated, 6 changed lines)
@@ -112,6 +112,9 @@ importers:
      fastify:
        specifier: ^5.0.0
        version: 5.7.4
      js-yaml:
        specifier: ^4.1.0
        version: 4.1.1
      zod:
        specifier: ^3.24.0
        version: 3.25.76

@@ -122,6 +125,9 @@ importers:
      '@types/dockerode':
        specifier: ^4.0.1
        version: 4.0.1
      '@types/js-yaml':
        specifier: ^4.0.9
        version: 4.0.9
      '@types/node':
        specifier: ^25.3.0
        version: 25.3.0
pr.sh (new executable file, 55 lines)
@@ -0,0 +1,55 @@
#!/usr/bin/env bash
# Usage: bash pr.sh "PR title" "PR body"
# Loads GITEA_TOKEN from .env automatically

set -euo pipefail

# Load .env if GITEA_TOKEN not already exported
if [ -z "${GITEA_TOKEN:-}" ] && [ -f .env ]; then
    set -a
    source .env
    set +a
fi

GITEA_URL="${GITEA_URL:-http://10.0.0.194:3012}"
REPO="${GITEA_OWNER:-michal}/mcpctl"

TITLE="${1:?Usage: pr.sh <title> [body]}"
BODY="${2:-}"
BASE="${3:-main}"
HEAD=$(git rev-parse --abbrev-ref HEAD)

if [ "$HEAD" = "$BASE" ]; then
    echo "Error: already on $BASE, switch to a feature branch first" >&2
    exit 1
fi

if [ -z "${GITEA_TOKEN:-}" ]; then
    echo "Error: GITEA_TOKEN not set and .env not found" >&2
    exit 1
fi

# Push if needed
if ! git rev-parse --verify "origin/$HEAD" &>/dev/null; then
    git push -u origin "$HEAD"
else
    git push
fi

# Create PR
RESPONSE=$(curl -s -X POST "$GITEA_URL/api/v1/repos/$REPO/pulls" \
    -H "Authorization: token $GITEA_TOKEN" \
    -H "Content-Type: application/json" \
    -d "$(jq -n --arg t "$TITLE" --arg b "$BODY" --arg h "$HEAD" --arg base "$BASE" \
        '{title: $t, body: $b, head: $h, base: $base}')")

PR_NUM=$(echo "$RESPONSE" | jq -r '.number // empty')
PR_URL=$(echo "$RESPONSE" | jq -r '.html_url // empty')

if [ -z "$PR_NUM" ]; then
    echo "Error creating PR:" >&2
    echo "$RESPONSE" | jq . 2>/dev/null || echo "$RESPONSE" >&2
    exit 1
fi

echo "PR #$PR_NUM: https://mysources.co.uk/$REPO/pulls/$PR_NUM"
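The script builds the request body with `jq -n --arg`, which JSON-escapes each value, so a title containing quotes or a multi-line body cannot break the payload the way naive string interpolation would. A standalone demonstration of the same idiom (sample title and branch names are illustrative):

```shell
# jq -n --arg safely encodes arbitrary strings into a JSON request body.
TITLE='Fix: handle "quoted" titles'
BODY=$'line one\nline two'
payload=$(jq -n --arg t "$TITLE" --arg b "$BODY" --arg h feat/gated --arg base main \
    '{title: $t, body: $b, head: $h, base: $base}')

# The values round-trip intact, quotes and newlines included.
echo "$payload" | jq -r '.title'   # → Fix: handle "quoted" titles
```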
scripts/build-mcpd.sh (new executable file, 32 lines)
@@ -0,0 +1,32 @@
#!/bin/bash
# Build mcpd Docker image and push to Gitea container registry
set -e

SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"
PROJECT_ROOT="$(dirname "$SCRIPT_DIR")"
cd "$PROJECT_ROOT"

# Load .env for GITEA_TOKEN
if [ -f .env ]; then
    set -a; source .env; set +a
fi

# Push directly to internal address (external proxy has body size limit)
REGISTRY="10.0.0.194:3012"
IMAGE="mcpd"
TAG="${1:-latest}"

echo "==> Building mcpd image..."
podman build -t "$IMAGE:$TAG" -f deploy/Dockerfile.mcpd .

echo "==> Tagging as $REGISTRY/michal/$IMAGE:$TAG..."
podman tag "$IMAGE:$TAG" "$REGISTRY/michal/$IMAGE:$TAG"

echo "==> Logging in to $REGISTRY..."
podman login --tls-verify=false -u michal -p "$GITEA_TOKEN" "$REGISTRY"

echo "==> Pushing to $REGISTRY/michal/$IMAGE:$TAG..."
podman push --tls-verify=false "$REGISTRY/michal/$IMAGE:$TAG"

echo "==> Done!"
echo "   Image: $REGISTRY/michal/$IMAGE:$TAG"
@@ -16,10 +16,11 @@ export PATH="$HOME/.npm-global/bin:$HOME/.bun/bin:$HOME/.local/bin:$PATH"
echo "==> Building TypeScript..."
pnpm build

echo "==> Bundling standalone binary..."
echo "==> Bundling standalone binaries..."
mkdir -p dist
rm -f dist/mcpctl dist/mcpctl-*.rpm
rm -f dist/mcpctl dist/mcpctl-local dist/mcpctl-*.rpm
bun build src/cli/src/index.ts --compile --outfile dist/mcpctl
bun build src/mcplocal/src/main.ts --compile --outfile dist/mcpctl-local

echo "==> Packaging RPM..."
nfpm pkg --packager rpm --target dist/
@@ -24,7 +24,10 @@ export class ApiError extends Error {
function request<T>(method: string, url: string, timeout: number, body?: unknown, token?: string): Promise<ApiResponse<T>> {
  return new Promise((resolve, reject) => {
    const parsed = new URL(url);
    const headers: Record<string, string> = { 'Content-Type': 'application/json' };
    const headers: Record<string, string> = {};
    if (body !== undefined) {
      headers['Content-Type'] = 'application/json';
    }
    if (token) {
      headers['Authorization'] = `Bearer ${token}`;
    }
@@ -1,9 +1,25 @@
import { Command } from 'commander';
import { readFileSync } from 'node:fs';
import { readFileSync, readSync } from 'node:fs';
import yaml from 'js-yaml';
import { z } from 'zod';
import type { ApiClient } from '../api-client.js';

const HealthCheckSchema = z.object({
  tool: z.string().min(1),
  arguments: z.record(z.unknown()).default({}),
  intervalSeconds: z.number().int().min(5).max(3600).default(60),
  timeoutSeconds: z.number().int().min(1).max(120).default(10),
  failureThreshold: z.number().int().min(1).max(20).default(3),
});

const ServerEnvEntrySchema = z.object({
  name: z.string().min(1),
  value: z.string().optional(),
  valueFrom: z.object({
    secretRef: z.object({ name: z.string(), key: z.string() }),
  }).optional(),
});

const ServerSpecSchema = z.object({
  name: z.string().min(1),
  description: z.string().default(''),
@@ -11,31 +27,117 @@ const ServerSpecSchema = z.object({
  dockerImage: z.string().optional(),
  transport: z.enum(['STDIO', 'SSE', 'STREAMABLE_HTTP']).default('STDIO'),
  repositoryUrl: z.string().url().optional(),
  envTemplate: z.array(z.object({
    name: z.string(),
    description: z.string().default(''),
    isSecret: z.boolean().default(false),
  })).default([]),
  externalUrl: z.string().url().optional(),
  command: z.array(z.string()).optional(),
  containerPort: z.number().int().min(1).max(65535).optional(),
  replicas: z.number().int().min(0).max(10).default(1),
  env: z.array(ServerEnvEntrySchema).default([]),
  healthCheck: HealthCheckSchema.optional(),
});

const ProfileSpecSchema = z.object({
const SecretSpecSchema = z.object({
  name: z.string().min(1),
  server: z.string().min(1),
  permissions: z.array(z.string()).default([]),
  envOverrides: z.record(z.string()).default({}),
  data: z.record(z.string()).default({}),
});

const TemplateEnvEntrySchema = z.object({
  name: z.string().min(1),
  description: z.string().optional(),
  required: z.boolean().optional(),
  defaultValue: z.string().optional(),
});

const TemplateSpecSchema = z.object({
  name: z.string().min(1),
  version: z.string().default('1.0.0'),
  description: z.string().default(''),
  packageName: z.string().optional(),
  dockerImage: z.string().optional(),
  transport: z.enum(['STDIO', 'SSE', 'STREAMABLE_HTTP']).default('STDIO'),
  repositoryUrl: z.string().optional(),
  externalUrl: z.string().optional(),
  command: z.array(z.string()).optional(),
  containerPort: z.number().int().min(1).max(65535).optional(),
  replicas: z.number().int().min(0).max(10).default(1),
  env: z.array(TemplateEnvEntrySchema).default([]),
  healthCheck: HealthCheckSchema.optional(),
});

const UserSpecSchema = z.object({
  email: z.string().email(),
  password: z.string().min(8),
  name: z.string().optional(),
});

const GroupSpecSchema = z.object({
  name: z.string().min(1),
  description: z.string().default(''),
  members: z.array(z.string().email()).default([]),
});

const RbacSubjectSchema = z.object({
  kind: z.enum(['User', 'Group', 'ServiceAccount']),
  name: z.string().min(1),
});

const RESOURCE_ALIASES: Record<string, string> = {
  server: 'servers', instance: 'instances', secret: 'secrets',
  project: 'projects', template: 'templates', user: 'users', group: 'groups',
  prompt: 'prompts', promptrequest: 'promptrequests',
};

const RbacRoleBindingSchema = z.union([
  z.object({
    role: z.enum(['edit', 'view', 'create', 'delete', 'run', 'expose']),
    resource: z.string().min(1).transform((r) => RESOURCE_ALIASES[r] ?? r),
    name: z.string().min(1).optional(),
  }),
  z.object({
    role: z.literal('run'),
    action: z.string().min(1),
  }),
]);

const RbacBindingSpecSchema = z.object({
  name: z.string().min(1),
  subjects: z.array(RbacSubjectSchema).default([]),
  roleBindings: z.array(RbacRoleBindingSchema).default([]),
});

const PromptSpecSchema = z.object({
  name: z.string().min(1).max(100).regex(/^[a-z0-9-]+$/),
  content: z.string().min(1).max(50000),
  projectId: z.string().optional(),
  priority: z.number().int().min(1).max(10).optional(),
  linkTarget: z.string().optional(),
});

const ProjectSpecSchema = z.object({
  name: z.string().min(1),
  description: z.string().default(''),
  profiles: z.array(z.string()).default([]),
  prompt: z.string().max(10000).default(''),
  proxyMode: z.enum(['direct', 'filtered']).default('direct'),
  gated: z.boolean().default(true),
  llmProvider: z.string().optional(),
  llmModel: z.string().optional(),
  servers: z.array(z.string()).default([]),
});

const ApplyConfigSchema = z.object({
  secrets: z.array(SecretSpecSchema).default([]),
  servers: z.array(ServerSpecSchema).default([]),
  profiles: z.array(ProfileSpecSchema).default([]),
  users: z.array(UserSpecSchema).default([]),
  groups: z.array(GroupSpecSchema).default([]),
  projects: z.array(ProjectSpecSchema).default([]),
});
  templates: z.array(TemplateSpecSchema).default([]),
  rbacBindings: z.array(RbacBindingSpecSchema).default([]),
  rbac: z.array(RbacBindingSpecSchema).default([]),
  prompts: z.array(PromptSpecSchema).default([]),
}).transform((data) => ({
  ...data,
  // Merge rbac into rbacBindings so both keys work
  rbacBindings: [...data.rbacBindings, ...data.rbac],
}));

export type ApplyConfig = z.infer<typeof ApplyConfigSchema>;
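The `.transform()` above lets a config file use either `rbac:` or `rbacBindings:` (or both); whatever is under `rbac` is appended to `rbacBindings` before the rest of the command sees the config. The same merge can be expressed as a jq one-liner for illustration:

```shell
# Accept both keys, fold rbac into rbacBindings, drop the alias key.
config='{"rbacBindings":[{"name":"admins"}],"rbac":[{"name":"viewers"}]}'
echo "$config" | jq -c '.rbacBindings = .rbacBindings + .rbac | del(.rbac)'
# → {"rbacBindings":[{"name":"admins"},{"name":"viewers"}]}
```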
@@ -49,16 +151,26 @@ export function createApplyCommand(deps: ApplyCommandDeps): Command {

  return new Command('apply')
    .description('Apply declarative configuration from a YAML or JSON file')
    .argument('<file>', 'Path to config file (.yaml, .yml, or .json)')
    .argument('[file]', 'Path to config file (.yaml, .yml, or .json)')
    .option('-f, --file <file>', 'Path to config file (alternative to positional arg)')
    .option('--dry-run', 'Validate and show changes without applying')
    .action(async (file: string, opts: { dryRun?: boolean }) => {
    .action(async (fileArg: string | undefined, opts: { file?: string; dryRun?: boolean }) => {
      const file = fileArg ?? opts.file;
      if (!file) {
        throw new Error('File path required. Usage: mcpctl apply <file> or mcpctl apply -f <file>');
      }
      const config = loadConfigFile(file);

      if (opts.dryRun) {
        log('Dry run - would apply:');
        if (config.secrets.length > 0) log(`  ${config.secrets.length} secret(s)`);
        if (config.servers.length > 0) log(`  ${config.servers.length} server(s)`);
        if (config.profiles.length > 0) log(`  ${config.profiles.length} profile(s)`);
        if (config.users.length > 0) log(`  ${config.users.length} user(s)`);
        if (config.groups.length > 0) log(`  ${config.groups.length} group(s)`);
        if (config.projects.length > 0) log(`  ${config.projects.length} project(s)`);
        if (config.templates.length > 0) log(`  ${config.templates.length} template(s)`);
        if (config.rbacBindings.length > 0) log(`  ${config.rbacBindings.length} rbacBinding(s)`);
        if (config.prompts.length > 0) log(`  ${config.prompts.length} prompt(s)`);
        return;
      }
@@ -66,11 +178,27 @@ export function createApplyCommand(deps: ApplyCommandDeps): Command {
  });
}

function readStdin(): string {
  const chunks: Buffer[] = [];
  const buf = Buffer.alloc(4096);
  try {
    // eslint-disable-next-line no-constant-condition
    while (true) {
      const bytesRead = readSync(0, buf, 0, buf.length, null);
      if (bytesRead === 0) break;
      chunks.push(buf.subarray(0, bytesRead));
    }
  } catch {
    // EOF or closed pipe
  }
  return Buffer.concat(chunks).toString('utf-8');
}

function loadConfigFile(path: string): ApplyConfig {
  const raw = readFileSync(path, 'utf-8');
  const raw = path === '-' ? readStdin() : readFileSync(path, 'utf-8');
  let parsed: unknown;

  if (path.endsWith('.json')) {
  if (path === '-' ? raw.trimStart().startsWith('{') : path.endsWith('.json')) {
    parsed = JSON.parse(raw);
  } else {
    parsed = yaml.load(raw);
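With this change, `mcpctl apply -` reads the config from stdin, and the format is sniffed by checking whether the trimmed input starts with `{` (for real files the `.json` extension still decides). The sniffing rule, re-expressed as a small shell sketch for illustration; the function name is invented:

```shell
# Illustrative re-statement of the stdin format sniff: '-' means read stdin,
# and JSON is assumed when the first non-whitespace character is '{'.
detect_format() {
    local path="$1" raw trimmed
    if [ "$path" = "-" ]; then raw=$(cat); else raw=$(cat "$path"); fi
    # Strip leading whitespace via parameter expansion, then sniff.
    trimmed="${raw#"${raw%%[![:space:]]*}"}"
    case "$trimmed" in
        "{"*) echo "json" ;;
        *)    echo "yaml" ;;
    esac
}

echo '  {"servers": []}' | detect_format -   # → json
echo 'servers: []'       | detect_format -   # → yaml
```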
@@ -80,7 +208,25 @@ function loadConfigFile(path: string): ApplyConfig {
}

async function applyConfig(client: ApiClient, config: ApplyConfig, log: (...args: unknown[]) => void): Promise<void> {
  // Apply servers first (profiles depend on servers)
  // Apply order: secrets, servers, users, groups, projects, templates, rbacBindings

  // Apply secrets
  for (const secret of config.secrets) {
    try {
      const existing = await findByName(client, 'secrets', secret.name);
      if (existing) {
        await client.put(`/api/v1/secrets/${(existing as { id: string }).id}`, { data: secret.data });
        log(`Updated secret: ${secret.name}`);
      } else {
        await client.post('/api/v1/secrets', secret);
        log(`Created secret: ${secret.name}`);
      }
    } catch (err) {
      log(`Error applying secret '${secret.name}': ${err instanceof Error ? err.message : err}`);
    }
  }

  // Apply servers
  for (const server of config.servers) {
    try {
      const existing = await findByName(client, 'servers', server.name);
@@ -96,57 +242,103 @@ async function applyConfig(client: ApiClient, config: ApplyConfig, log: (...args
|
||||
}
|
||||
}
|
||||
|
||||
// Apply profiles (need server IDs)
|
||||
for (const profile of config.profiles) {
|
||||
// Apply users (matched by email)
|
||||
for (const user of config.users) {
|
||||
try {
|
||||
const server = await findByName(client, 'servers', profile.server);
|
||||
if (!server) {
|
||||
log(`Skipping profile '${profile.name}': server '${profile.server}' not found`);
|
||||
continue;
|
||||
}
|
||||
const serverId = (server as { id: string }).id;
|
||||
|
||||
const existing = await findProfile(client, serverId, profile.name);
|
||||
const existing = await findByField(client, 'users', 'email', user.email);
|
||||
if (existing) {
|
||||
await client.put(`/api/v1/profiles/${(existing as { id: string }).id}`, {
|
||||
permissions: profile.permissions,
|
||||
envOverrides: profile.envOverrides,
|
||||
});
|
||||
log(`Updated profile: ${profile.name} (server: ${profile.server})`);
|
||||
await client.put(`/api/v1/users/${(existing as { id: string }).id}`, user);
|
||||
log(`Updated user: ${user.email}`);
|
||||
} else {
|
||||
await client.post('/api/v1/profiles', {
|
||||
name: profile.name,
|
||||
serverId,
|
||||
permissions: profile.permissions,
|
||||
envOverrides: profile.envOverrides,
|
||||
});
|
||||
log(`Created profile: ${profile.name} (server: ${profile.server})`);
|
||||
await client.post('/api/v1/users', user);
|
||||
log(`Created user: ${user.email}`);
|
||||
}
|
||||
} catch (err) {
|
||||
log(`Error applying profile '${profile.name}': ${err instanceof Error ? err.message : err}`);
|
||||
log(`Error applying user '${user.email}': ${err instanceof Error ? err.message : err}`);
|
||||
}
|
||||
}
|
||||
|
||||
// Apply projects
|
||||
// Apply groups
|
||||
for (const group of config.groups) {
|
||||
try {
|
||||
const existing = await findByName(client, 'groups', group.name);
|
||||
if (existing) {
|
||||
await client.put(`/api/v1/groups/${(existing as { id: string }).id}`, group);
|
||||
log(`Updated group: ${group.name}`);
|
||||
} else {
|
||||
await client.post('/api/v1/groups', group);
|
||||
log(`Created group: ${group.name}`);
|
||||
}
|
||||
} catch (err) {
|
||||
log(`Error applying group '${group.name}': ${err instanceof Error ? err.message : err}`);
|
||||
}
|
||||
}
|
||||
|
||||
// Apply projects (send full spec including servers)
|
||||
for (const project of config.projects) {
|
||||
try {
|
||||
const existing = await findByName(client, 'projects', project.name);
|
||||
if (existing) {
|
||||
await client.put(`/api/v1/projects/${(existing as { id: string }).id}`, {
|
||||
description: project.description,
|
||||
});
|
||||
        await client.put(`/api/v1/projects/${(existing as { id: string }).id}`, project);
        log(`Updated project: ${project.name}`);
      } else {
        await client.post('/api/v1/projects', {
          name: project.name,
          description: project.description,
        });
        await client.post('/api/v1/projects', project);
        log(`Created project: ${project.name}`);
      }
    } catch (err) {
      log(`Error applying project '${project.name}': ${err instanceof Error ? err.message : err}`);
    }
  }

  // Apply templates
  for (const template of config.templates) {
    try {
      const existing = await findByName(client, 'templates', template.name);
      if (existing) {
        await client.put(`/api/v1/templates/${(existing as { id: string }).id}`, template);
        log(`Updated template: ${template.name}`);
      } else {
        await client.post('/api/v1/templates', template);
        log(`Created template: ${template.name}`);
      }
    } catch (err) {
      log(`Error applying template '${template.name}': ${err instanceof Error ? err.message : err}`);
    }
  }

  // Apply RBAC bindings
  for (const rbacBinding of config.rbacBindings) {
    try {
      const existing = await findByName(client, 'rbac', rbacBinding.name);
      if (existing) {
        await client.put(`/api/v1/rbac/${(existing as { id: string }).id}`, rbacBinding);
        log(`Updated rbacBinding: ${rbacBinding.name}`);
      } else {
        await client.post('/api/v1/rbac', rbacBinding);
        log(`Created rbacBinding: ${rbacBinding.name}`);
      }
    } catch (err) {
      log(`Error applying rbacBinding '${rbacBinding.name}': ${err instanceof Error ? err.message : err}`);
    }
  }

  // Apply prompts
  for (const prompt of config.prompts) {
    try {
      const existing = await findByName(client, 'prompts', prompt.name);
      if (existing) {
        const updateData: Record<string, unknown> = { content: prompt.content };
        if (prompt.priority !== undefined) updateData.priority = prompt.priority;
        await client.put(`/api/v1/prompts/${(existing as { id: string }).id}`, updateData);
        log(`Updated prompt: ${prompt.name}`);
      } else {
        await client.post('/api/v1/prompts', prompt);
        log(`Created prompt: ${prompt.name}`);
      }
    } catch (err) {
      log(`Error applying prompt '${prompt.name}': ${err instanceof Error ? err.message : err}`);
    }
  }
}

async function findByName(client: ApiClient, resource: string, name: string): Promise<unknown | null> {
@@ -158,12 +350,10 @@ async function findByName(client: ApiClient, resource: string, name: string): Pr
  }
}

async function findProfile(client: ApiClient, serverId: string, name: string): Promise<unknown | null> {
async function findByField<T extends string>(client: ApiClient, resource: string, field: T, value: string): Promise<unknown | null> {
  try {
    const profiles = await client.get<Array<{ name: string; serverId: string }>>(
      `/api/v1/profiles?serverId=${serverId}`,
    );
    return profiles.find((p) => p.name === name) ?? null;
    const items = await client.get<Array<Record<string, unknown>>>(`/api/v1/${resource}`);
    return items.find((item) => item[field] === value) ?? null;
  } catch {
    return null;
  }
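The hunk above replaces the profile-specific `findProfile` with a generic `findByField` that lists a resource and matches on an arbitrary field, swallowing transport errors as "not found". A minimal standalone sketch of the same lookup logic (`fetchAll` is an illustrative stand-in for the real `ApiClient.get`):

```typescript
// Illustrative sketch of the generic field lookup in findByField.
// `fetchAll` stands in for ApiClient.get; it is not part of the patch.
async function findByField(
  fetchAll: (resource: string) => Promise<Array<Record<string, unknown>>>,
  resource: string,
  field: string,
  value: string,
): Promise<Record<string, unknown> | null> {
  try {
    const items = await fetchAll(resource);
    return items.find((item) => item[field] === value) ?? null;
  } catch {
    return null; // treat transport errors as "not found", like the CLI does
  }
}

// Example with an in-memory stub:
const stub = async (_resource: string) => [
  { name: 'alpha', id: '1' },
  { name: 'beta', id: '2' },
];
findByField(stub, 'projects', 'name', 'beta').then((hit) => {
  console.log(JSON.stringify(hit));
});
```

Matching on `name` reproduces the old `findByName` behavior; any other unique field works the same way.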

@@ -10,6 +10,10 @@ export interface PromptDeps {
  password(message: string): Promise<string>;
}

export interface StatusResponse {
  hasUsers: boolean;
}

export interface AuthCommandDeps {
  configDeps: Partial<ConfigLoaderDeps>;
  credentialsDeps: Partial<CredentialsDeps>;
@@ -17,6 +21,8 @@ export interface AuthCommandDeps {
  log: (...args: string[]) => void;
  loginRequest: (mcpdUrl: string, email: string, password: string) => Promise<LoginResponse>;
  logoutRequest: (mcpdUrl: string, token: string) => Promise<void>;
  statusRequest: (mcpdUrl: string) => Promise<StatusResponse>;
  bootstrapRequest: (mcpdUrl: string, email: string, password: string, name?: string) => Promise<LoginResponse>;
}

interface LoginResponse {
@@ -80,6 +86,70 @@ function defaultLogoutRequest(mcpdUrl: string, token: string): Promise<void> {
  });
}

function defaultStatusRequest(mcpdUrl: string): Promise<StatusResponse> {
  return new Promise((resolve, reject) => {
    const url = new URL('/api/v1/auth/status', mcpdUrl);
    const opts: http.RequestOptions = {
      hostname: url.hostname,
      port: url.port,
      path: url.pathname,
      method: 'GET',
      timeout: 10000,
      headers: { 'Content-Type': 'application/json' },
    };
    const req = http.request(opts, (res) => {
      const chunks: Buffer[] = [];
      res.on('data', (chunk: Buffer) => chunks.push(chunk));
      res.on('end', () => {
        const raw = Buffer.concat(chunks).toString('utf-8');
        if ((res.statusCode ?? 0) >= 400) {
          reject(new Error(`Status check failed (${res.statusCode}): ${raw}`));
          return;
        }
        resolve(JSON.parse(raw) as StatusResponse);
      });
    });
    req.on('error', (err) => reject(new Error(`Cannot reach mcpd: ${err.message}`)));
    req.on('timeout', () => { req.destroy(); reject(new Error('Status request timed out')); });
    req.end();
  });
}

function defaultBootstrapRequest(mcpdUrl: string, email: string, password: string, name?: string): Promise<LoginResponse> {
  return new Promise((resolve, reject) => {
    const url = new URL('/api/v1/auth/bootstrap', mcpdUrl);
    const payload: Record<string, string> = { email, password };
    if (name) {
      payload['name'] = name;
    }
    const body = JSON.stringify(payload);
    const opts: http.RequestOptions = {
      hostname: url.hostname,
      port: url.port,
      path: url.pathname,
      method: 'POST',
      timeout: 10000,
      headers: { 'Content-Type': 'application/json', 'Content-Length': Buffer.byteLength(body) },
    };
    const req = http.request(opts, (res) => {
      const chunks: Buffer[] = [];
      res.on('data', (chunk: Buffer) => chunks.push(chunk));
      res.on('end', () => {
        const raw = Buffer.concat(chunks).toString('utf-8');
        if ((res.statusCode ?? 0) >= 400) {
          reject(new Error(`Bootstrap failed (${res.statusCode}): ${raw}`));
          return;
        }
        resolve(JSON.parse(raw) as LoginResponse);
      });
    });
    req.on('error', (err) => reject(new Error(`Cannot reach mcpd: ${err.message}`)));
    req.on('timeout', () => { req.destroy(); reject(new Error('Bootstrap request timed out')); });
    req.write(body);
    req.end();
  });
}
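`defaultStatusRequest` and `defaultBootstrapRequest` share the same response handling: accumulate chunks, reject with the status code and raw body on HTTP >= 400, otherwise JSON-parse. That decision, isolated as a pure function (the helper name is illustrative, not from the patch):

```typescript
// Illustrative helper mirroring the shared response handling in
// defaultStatusRequest / defaultBootstrapRequest; not part of the patch.
function parseJsonResponse<T>(statusCode: number | undefined, raw: string, what: string): T {
  if ((statusCode ?? 0) >= 400) {
    // Same error shape as the CLI: include the status code and the raw body.
    throw new Error(`${what} failed (${statusCode}): ${raw}`);
  }
  return JSON.parse(raw) as T;
}

const status = parseJsonResponse<{ hasUsers: boolean }>(200, '{"hasUsers":false}', 'Status check');
console.log(JSON.stringify(status));
```

Keeping the raw body in the error message means a 4xx with a JSON error payload still surfaces something actionable to the user.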

async function defaultInput(message: string): Promise<string> {
  const { default: inquirer } = await import('inquirer');
  const { answer } = await inquirer.prompt([{ type: 'input', name: 'answer', message }]);
@@ -99,10 +169,12 @@ const defaultDeps: AuthCommandDeps = {
  log: (...args) => console.log(...args),
  loginRequest: defaultLoginRequest,
  logoutRequest: defaultLogoutRequest,
  statusRequest: defaultStatusRequest,
  bootstrapRequest: defaultBootstrapRequest,
};

export function createLoginCommand(deps?: Partial<AuthCommandDeps>): Command {
  const { configDeps, credentialsDeps, prompt, log, loginRequest } = { ...defaultDeps, ...deps };
  const { configDeps, credentialsDeps, prompt, log, loginRequest, statusRequest, bootstrapRequest } = { ...defaultDeps, ...deps };

  return new Command('login')
    .description('Authenticate with mcpd')
@@ -111,17 +183,36 @@ export function createLoginCommand(deps?: Partial<AuthCommandDeps>): Command {
      const config = loadConfig(configDeps);
      const mcpdUrl = opts.mcpdUrl ?? config.mcpdUrl;

      const email = await prompt.input('Email:');
      const password = await prompt.password('Password:');

      try {
        const result = await loginRequest(mcpdUrl, email, password);
        saveCredentials({
          token: result.token,
          mcpdUrl,
          user: result.user.email,
        }, credentialsDeps);
        log(`Logged in as ${result.user.email}`);
        const status = await statusRequest(mcpdUrl);

        if (!status.hasUsers) {
          log('No users configured. Creating first admin account.');
          const email = await prompt.input('Email:');
          const password = await prompt.password('Password:');
          const name = await prompt.input('Name (optional):');

          const result = name
            ? await bootstrapRequest(mcpdUrl, email, password, name)
            : await bootstrapRequest(mcpdUrl, email, password);
          saveCredentials({
            token: result.token,
            mcpdUrl,
            user: result.user.email,
          }, credentialsDeps);
          log(`Logged in as ${result.user.email} (admin)`);
        } else {
          const email = await prompt.input('Email:');
          const password = await prompt.password('Password:');

          const result = await loginRequest(mcpdUrl, email, password);
          saveCredentials({
            token: result.token,
            mcpdUrl,
            user: result.user.email,
          }, credentialsDeps);
          log(`Logged in as ${result.user.email}`);
        }
      } catch (err) {
        log(`Login failed: ${(err as Error).message}`);
        process.exitCode = 1;

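The reworked login flow first asks `/api/v1/auth/status` and only falls back to a normal login when users already exist; on a fresh instance it bootstraps the first admin instead. A condensed sketch of that branch, with prompting and HTTP stubbed out (all names here are illustrative stand-ins, not the real implementations):

```typescript
// Condensed, illustrative sketch of the first-run branch in `mcpctl login`.
type LoginResult = { token: string; user: { email: string } };

async function loginFlow(
  statusRequest: () => Promise<{ hasUsers: boolean }>,
  bootstrapRequest: (email: string, password: string) => Promise<LoginResult>,
  loginRequest: (email: string, password: string) => Promise<LoginResult>,
  creds: { email: string; password: string },
): Promise<string> {
  const status = await statusRequest();
  const result = status.hasUsers
    ? await loginRequest(creds.email, creds.password)      // normal login
    : await bootstrapRequest(creds.email, creds.password); // first admin account
  return `Logged in as ${result.user.email}${status.hasUsers ? '' : ' (admin)'}`;
}

const fake = async (_e: string, _p: string): Promise<LoginResult> =>
  ({ token: 't', user: { email: 'a@b.c' } });

loginFlow(async () => ({ hasUsers: false }), fake, fake, { email: 'a@b.c', password: 'x' })
  .then((msg) => console.log(msg));
```

Checking status before prompting also lets the CLI ask for the optional admin name only on the bootstrap path.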
@@ -1,155 +0,0 @@
import { Command } from 'commander';
import { writeFileSync, readFileSync, existsSync } from 'node:fs';
import { resolve } from 'node:path';
import type { ApiClient } from '../api-client.js';

interface McpConfig {
  mcpServers: Record<string, { command: string; args: string[]; env?: Record<string, string> }>;
}

export interface ClaudeCommandDeps {
  client: ApiClient;
  log: (...args: unknown[]) => void;
}

export function createClaudeCommand(deps: ClaudeCommandDeps): Command {
  const { client, log } = deps;

  const cmd = new Command('claude')
    .description('Manage Claude MCP configuration (.mcp.json)');

  cmd
    .command('generate <projectId>')
    .description('Generate .mcp.json from a project configuration')
    .option('-o, --output <path>', 'Output file path', '.mcp.json')
    .option('--merge', 'Merge with existing .mcp.json instead of overwriting')
    .option('--stdout', 'Print to stdout instead of writing a file')
    .action(async (projectId: string, opts: { output: string; merge?: boolean; stdout?: boolean }) => {
      const config = await client.get<McpConfig>(`/api/v1/projects/${projectId}/mcp-config`);

      if (opts.stdout) {
        log(JSON.stringify(config, null, 2));
        return;
      }

      const outputPath = resolve(opts.output);
      let finalConfig = config;

      if (opts.merge && existsSync(outputPath)) {
        try {
          const existing = JSON.parse(readFileSync(outputPath, 'utf-8')) as McpConfig;
          finalConfig = {
            mcpServers: {
              ...existing.mcpServers,
              ...config.mcpServers,
            },
          };
        } catch {
          // If existing file is invalid, just overwrite
        }
      }

      writeFileSync(outputPath, JSON.stringify(finalConfig, null, 2) + '\n');
      const serverCount = Object.keys(finalConfig.mcpServers).length;
      log(`Wrote ${outputPath} (${serverCount} server(s))`);
    });
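With `--merge`, the generated servers are folded into an existing `.mcp.json`, and the generated entries win on name collisions because they are spread last. A tiny sketch of that precedence (the helper name is illustrative; the deleted command inlined this):

```typescript
// Illustrative sketch of the --merge precedence: generated entries win.
interface McpConfig {
  mcpServers: Record<string, { command: string; args: string[] }>;
}

function mergeMcpConfigs(existing: McpConfig, generated: McpConfig): McpConfig {
  // Later spread overrides earlier keys, so `generated` takes precedence.
  return { mcpServers: { ...existing.mcpServers, ...generated.mcpServers } };
}

const existing: McpConfig = {
  mcpServers: {
    a: { command: 'old', args: [] },
    b: { command: 'keep', args: [] },
  },
};
const generated: McpConfig = { mcpServers: { a: { command: 'new', args: [] } } };

console.log(mergeMcpConfigs(existing, generated).mcpServers.a.command); // 'new'
```

Entries only present in the existing file (like `b` above) survive the merge untouched.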

  cmd
    .command('show')
    .description('Show current .mcp.json configuration')
    .option('-p, --path <path>', 'Path to .mcp.json', '.mcp.json')
    .action((opts: { path: string }) => {
      const filePath = resolve(opts.path);
      if (!existsSync(filePath)) {
        log(`No .mcp.json found at ${filePath}`);
        return;
      }
      const content = readFileSync(filePath, 'utf-8');
      try {
        const config = JSON.parse(content) as McpConfig;
        const servers = Object.entries(config.mcpServers ?? {});
        if (servers.length === 0) {
          log('No MCP servers configured.');
          return;
        }
        log(`MCP servers in ${filePath}:\n`);
        for (const [name, server] of servers) {
          log(` ${name}`);
          log(` command: ${server.command} ${server.args.join(' ')}`);
          if (server.env) {
            const envKeys = Object.keys(server.env);
            log(` env: ${envKeys.join(', ')}`);
          }
        }
      } catch {
        log(`Invalid JSON in ${filePath}`);
      }
    });

  cmd
    .command('add <name>')
    .description('Add an MCP server entry to .mcp.json')
    .requiredOption('-c, --command <cmd>', 'Command to run')
    .option('-a, --args <args...>', 'Command arguments')
    .option('-e, --env <key=value...>', 'Environment variables')
    .option('-p, --path <path>', 'Path to .mcp.json', '.mcp.json')
    .action((name: string, opts: { command: string; args?: string[]; env?: string[]; path: string }) => {
      const filePath = resolve(opts.path);
      let config: McpConfig = { mcpServers: {} };

      if (existsSync(filePath)) {
        try {
          config = JSON.parse(readFileSync(filePath, 'utf-8')) as McpConfig;
        } catch {
          // Start fresh
        }
      }

      const entry: { command: string; args: string[]; env?: Record<string, string> } = {
        command: opts.command,
        args: opts.args ?? [],
      };

      if (opts.env && opts.env.length > 0) {
        const env: Record<string, string> = {};
        for (const pair of opts.env) {
          const eqIdx = pair.indexOf('=');
          if (eqIdx > 0) {
            env[pair.slice(0, eqIdx)] = pair.slice(eqIdx + 1);
          }
        }
        entry.env = env;
      }

      config.mcpServers[name] = entry;
      writeFileSync(filePath, JSON.stringify(config, null, 2) + '\n');
      log(`Added '${name}' to ${filePath}`);
    });
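The `add` subcommand splits each `--env` pair at the first `=` only, so values may themselves contain `=`; pairs with no key before the `=` (or no `=` at all) are silently skipped. The parsing loop, extracted as a standalone sketch (the function name is illustrative):

```typescript
// Standalone sketch of the --env parsing in `claude add` (split at first '=').
function parseEnvPairs(pairs: string[]): Record<string, string> {
  const env: Record<string, string> = {};
  for (const pair of pairs) {
    const eqIdx = pair.indexOf('=');
    if (eqIdx > 0) {
      // Everything after the first '=' is the value, '=' included.
      env[pair.slice(0, eqIdx)] = pair.slice(eqIdx + 1);
    }
  }
  return env;
}

console.log(JSON.stringify(parseEnvPairs(['TOKEN=abc', 'URL=http://x?a=b', '=bad', 'noeq'])));
```

`eqIdx > 0` (rather than `>= 0`) is what rejects a pair that starts with `=` and has no key.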

  cmd
    .command('remove <name>')
    .description('Remove an MCP server entry from .mcp.json')
    .option('-p, --path <path>', 'Path to .mcp.json', '.mcp.json')
    .action((name: string, opts: { path: string }) => {
      const filePath = resolve(opts.path);
      if (!existsSync(filePath)) {
        log(`No .mcp.json found at ${filePath}`);
        return;
      }

      try {
        const config = JSON.parse(readFileSync(filePath, 'utf-8')) as McpConfig;
        if (!(name in config.mcpServers)) {
          log(`Server '${name}' not found in ${filePath}`);
          return;
        }
        delete config.mcpServers[name];
        writeFileSync(filePath, JSON.stringify(config, null, 2) + '\n');
        log(`Removed '${name}' from ${filePath}`);
      } catch {
        log(`Invalid JSON in ${filePath}`);
      }
    });

  return cmd;
}
src/cli/src/commands/config-setup.ts (new file, 464 lines)
@@ -0,0 +1,464 @@
import { Command } from 'commander';
import http from 'node:http';
import https from 'node:https';
import { execFile } from 'node:child_process';
import { promisify } from 'node:util';
import { loadConfig, saveConfig } from '../config/index.js';
import type { ConfigLoaderDeps, McpctlConfig, LlmConfig, LlmProviderName, LlmProviderEntry, LlmTier } from '../config/index.js';
import type { SecretStore } from '@mcpctl/shared';
import { createSecretStore } from '@mcpctl/shared';

const execFileAsync = promisify(execFile);

export interface ConfigSetupPrompt {
  select<T>(message: string, choices: Array<{ name: string; value: T; description?: string }>): Promise<T>;
  input(message: string, defaultValue?: string): Promise<string>;
  password(message: string): Promise<string>;
  confirm(message: string, defaultValue?: boolean): Promise<boolean>;
}

export interface ConfigSetupDeps {
  configDeps: Partial<ConfigLoaderDeps>;
  secretStore: SecretStore;
  log: (...args: string[]) => void;
  prompt: ConfigSetupPrompt;
  fetchModels: (url: string, path: string) => Promise<string[]>;
  whichBinary: (name: string) => Promise<string | null>;
}

interface ProviderChoice {
  name: string;
  value: LlmProviderName;
  description: string;
}

/** Provider config fields returned by per-provider setup functions. */
interface ProviderFields {
  model?: string;
  url?: string;
  binaryPath?: string;
}

const FAST_PROVIDER_CHOICES: ProviderChoice[] = [
  { name: 'vLLM', value: 'vllm', description: 'Self-hosted vLLM (OpenAI-compatible)' },
  { name: 'Ollama', value: 'ollama', description: 'Local models via Ollama' },
];

const HEAVY_PROVIDER_CHOICES: ProviderChoice[] = [
  { name: 'Gemini CLI', value: 'gemini-cli', description: 'Google Gemini via local CLI (free, no API key)' },
  { name: 'Anthropic (Claude)', value: 'anthropic', description: 'Claude API (requires API key)' },
  { name: 'OpenAI', value: 'openai', description: 'OpenAI API (requires API key)' },
  { name: 'DeepSeek', value: 'deepseek', description: 'DeepSeek API (requires API key)' },
];

const ALL_PROVIDER_CHOICES: ProviderChoice[] = [
  ...FAST_PROVIDER_CHOICES,
  ...HEAVY_PROVIDER_CHOICES,
  { name: 'None (disable)', value: 'none', description: 'Disable LLM features' },
];

const GEMINI_MODELS = ['gemini-2.5-flash', 'gemini-2.5-pro', 'gemini-2.0-flash'];
const ANTHROPIC_MODELS = ['claude-3-5-haiku-20241022', 'claude-sonnet-4-20250514', 'claude-opus-4-20250514'];
const DEEPSEEK_MODELS = ['deepseek-chat', 'deepseek-reasoner'];

function defaultFetchModels(baseUrl: string, path: string): Promise<string[]> {
  return new Promise((resolve) => {
    const url = new URL(path, baseUrl);
    const isHttps = url.protocol === 'https:';
    const transport = isHttps ? https : http;

    const req = transport.get({
      hostname: url.hostname,
      port: url.port || (isHttps ? 443 : 80),
      path: url.pathname,
      timeout: 5000,
    }, (res) => {
      const chunks: Buffer[] = [];
      res.on('data', (chunk: Buffer) => chunks.push(chunk));
      res.on('end', () => {
        try {
          const raw = Buffer.concat(chunks).toString('utf-8');
          const data = JSON.parse(raw) as { models?: Array<{ name: string }>; data?: Array<{ id: string }> };
          // Ollama format: { models: [{ name }] }
          if (data.models) {
            resolve(data.models.map((m) => m.name));
            return;
          }
          // OpenAI/vLLM format: { data: [{ id }] }
          if (data.data) {
            resolve(data.data.map((m) => m.id));
            return;
          }
          resolve([]);
        } catch {
          resolve([]);
        }
      });
    });
    req.on('error', () => resolve([]));
    req.on('timeout', () => { req.destroy(); resolve([]); });
  });
}
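`defaultFetchModels` accepts both response shapes, Ollama's `{ models: [{ name }] }` from `/api/tags` and the OpenAI/vLLM `{ data: [{ id }] }` from `/v1/models`, and degrades to an empty list on anything else. The normalization step in isolation (the function name is illustrative; the patch inlines this logic):

```typescript
// Isolated sketch of the response normalization inside defaultFetchModels.
function normalizeModels(
  data: { models?: Array<{ name: string }>; data?: Array<{ id: string }> },
): string[] {
  if (data.models) return data.models.map((m) => m.name); // Ollama: /api/tags
  if (data.data) return data.data.map((m) => m.id);       // OpenAI/vLLM: /v1/models
  return [];                                              // unknown shape → no models
}

console.log(normalizeModels({ models: [{ name: 'llama3.2' }] }).join(','));
console.log(normalizeModels({ data: [{ id: 'Qwen2.5' }] }).join(','));
```

Returning `[]` instead of throwing lets the wizard fall back to a free-text model prompt when listing fails.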

async function defaultSelect<T>(message: string, choices: Array<{ name: string; value: T; description?: string }>): Promise<T> {
  const { default: inquirer } = await import('inquirer');
  const { answer } = await inquirer.prompt([{
    type: 'list',
    name: 'answer',
    message,
    choices: choices.map((c) => ({
      name: c.description ? `${c.name} — ${c.description}` : c.name,
      value: c.value,
      short: c.name,
    })),
  }]);
  return answer as T;
}

async function defaultInput(message: string, defaultValue?: string): Promise<string> {
  const { default: inquirer } = await import('inquirer');
  const { answer } = await inquirer.prompt([{
    type: 'input',
    name: 'answer',
    message,
    default: defaultValue,
  }]);
  return answer as string;
}

async function defaultPassword(message: string): Promise<string> {
  const { default: inquirer } = await import('inquirer');
  const { answer } = await inquirer.prompt([{ type: 'password', name: 'answer', message }]);
  return answer as string;
}

async function defaultConfirm(message: string, defaultValue?: boolean): Promise<boolean> {
  const { default: inquirer } = await import('inquirer');
  const { answer } = await inquirer.prompt([{
    type: 'confirm',
    name: 'answer',
    message,
    default: defaultValue ?? true,
  }]);
  return answer as boolean;
}

const defaultPrompt: ConfigSetupPrompt = {
  select: defaultSelect,
  input: defaultInput,
  password: defaultPassword,
  confirm: defaultConfirm,
};

async function defaultWhichBinary(name: string): Promise<string | null> {
  try {
    const { stdout } = await execFileAsync('which', [name], { timeout: 3000 });
    const path = stdout.trim();
    return path || null;
  } catch {
    return null;
  }
}

// --- Per-provider setup functions (return ProviderFields for reuse in both modes) ---

async function setupGeminiCliFields(
  prompt: ConfigSetupPrompt,
  log: (...args: string[]) => void,
  whichBinary: (name: string) => Promise<string | null>,
  currentModel?: string,
): Promise<ProviderFields> {
  const model = await prompt.select<string>('Select model:', [
    ...GEMINI_MODELS.map((m) => ({
      name: m === currentModel ? `${m} (current)` : m,
      value: m,
    })),
    { name: 'Custom...', value: '__custom__' },
  ]);

  const finalModel = model === '__custom__'
    ? await prompt.input('Model name:', currentModel)
    : model;

  let binaryPath: string | undefined;
  const detected = await whichBinary('gemini');
  if (detected) {
    log(`Found gemini at: ${detected}`);
    binaryPath = detected;
  } else {
    log('Warning: gemini binary not found in PATH');
    const manualPath = await prompt.input('Binary path (or install with: npm i -g @google/gemini-cli):');
    if (manualPath) binaryPath = manualPath;
  }

  const result: ProviderFields = { model: finalModel };
  if (binaryPath) result.binaryPath = binaryPath;
  return result;
}

async function setupOllamaFields(
  prompt: ConfigSetupPrompt,
  fetchModels: ConfigSetupDeps['fetchModels'],
  currentUrl?: string,
  currentModel?: string,
): Promise<ProviderFields> {
  const url = await prompt.input('Ollama URL:', currentUrl ?? 'http://localhost:11434');
  const models = await fetchModels(url, '/api/tags');
  let model: string;

  if (models.length > 0) {
    const choices = models.map((m) => ({
      name: m === currentModel ? `${m} (current)` : m,
      value: m,
    }));
    choices.push({ name: 'Custom...', value: '__custom__' });
    model = await prompt.select<string>('Select model:', choices);
    if (model === '__custom__') {
      model = await prompt.input('Model name:', currentModel);
    }
  } else {
    model = await prompt.input('Model name (could not fetch models):', currentModel ?? 'llama3.2');
  }

  const result: ProviderFields = { model };
  if (url) result.url = url;
  return result;
}

async function setupVllmFields(
  prompt: ConfigSetupPrompt,
  fetchModels: ConfigSetupDeps['fetchModels'],
  currentUrl?: string,
  currentModel?: string,
): Promise<ProviderFields> {
  const url = await prompt.input('vLLM URL:', currentUrl ?? 'http://localhost:8000');
  const models = await fetchModels(url, '/v1/models');
  let model: string;

  if (models.length > 0) {
    const choices = models.map((m) => ({
      name: m === currentModel ? `${m} (current)` : m,
      value: m,
    }));
    choices.push({ name: 'Custom...', value: '__custom__' });
    model = await prompt.select<string>('Select model:', choices);
    if (model === '__custom__') {
      model = await prompt.input('Model name:', currentModel);
    }
  } else {
    model = await prompt.input('Model name (could not fetch models):', currentModel ?? 'default');
  }

  const result: ProviderFields = { model };
  if (url) result.url = url;
  return result;
}

async function setupApiKeyFields(
  prompt: ConfigSetupPrompt,
  secretStore: SecretStore,
  provider: LlmProviderName,
  secretKey: string,
  hardcodedModels: string[],
  currentModel?: string,
  currentUrl?: string,
): Promise<ProviderFields> {
  const existingKey = await secretStore.get(secretKey);
  let apiKey: string;

  if (existingKey) {
    const masked = `****${existingKey.slice(-4)}`;
    const changeKey = await prompt.confirm(`API key stored (${masked}). Change it?`, false);
    apiKey = changeKey ? await prompt.password('API key:') : existingKey;
  } else {
    apiKey = await prompt.password('API key:');
  }

  if (apiKey !== existingKey) {
    await secretStore.set(secretKey, apiKey);
  }

  let model: string;
  if (hardcodedModels.length > 0) {
    const choices = hardcodedModels.map((m) => ({
      name: m === currentModel ? `${m} (current)` : m,
      value: m,
    }));
    choices.push({ name: 'Custom...', value: '__custom__' });
    model = await prompt.select<string>('Select model:', choices);
    if (model === '__custom__') {
      model = await prompt.input('Model name:', currentModel);
    }
  } else {
    model = await prompt.input('Model name:', currentModel ?? 'gpt-4o');
  }

  let url: string | undefined;
  if (provider === 'openai') {
    const customUrl = await prompt.confirm('Use custom API endpoint?', false);
    if (customUrl) {
      url = await prompt.input('API URL:', currentUrl ?? 'https://api.openai.com');
    }
  }

  const result: ProviderFields = { model };
  if (url) result.url = url;
  return result;
}
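When a key is already stored, `setupApiKeyFields` never echoes it in full; the confirm prompt shows only asterisks plus the last four characters. The masking expression as a one-liner (the helper name is illustrative):

```typescript
// Sketch of the key masking used in setupApiKeyFields.
const maskKey = (key: string): string => `****${key.slice(-4)}`;

console.log(maskKey('sk-ant-abc123wxyz')); // ****wxyz
```

Four trailing characters are enough for the user to recognize which key is stored without leaking it to the terminal or shell history.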

/** Configure a single provider type and return its fields. */
async function setupProviderFields(
  providerType: LlmProviderName,
  prompt: ConfigSetupPrompt,
  log: (...args: string[]) => void,
  fetchModels: ConfigSetupDeps['fetchModels'],
  whichBinary: (name: string) => Promise<string | null>,
  secretStore: SecretStore,
): Promise<ProviderFields> {
  switch (providerType) {
    case 'gemini-cli':
      return setupGeminiCliFields(prompt, log, whichBinary);
    case 'ollama':
      return setupOllamaFields(prompt, fetchModels);
    case 'vllm':
      return setupVllmFields(prompt, fetchModels);
    case 'anthropic':
      return setupApiKeyFields(prompt, secretStore, 'anthropic', 'anthropic-api-key', ANTHROPIC_MODELS);
    case 'openai':
      return setupApiKeyFields(prompt, secretStore, 'openai', 'openai-api-key', []);
    case 'deepseek':
      return setupApiKeyFields(prompt, secretStore, 'deepseek', 'deepseek-api-key', DEEPSEEK_MODELS);
    default:
      return {};
  }
}

/** Build a LlmProviderEntry from type, name, and fields. */
function buildEntry(providerType: LlmProviderName, name: string, fields: ProviderFields, tier?: LlmTier): LlmProviderEntry {
  const entry: LlmProviderEntry = { name, type: providerType };
  if (fields.model) entry.model = fields.model;
  if (fields.url) entry.url = fields.url;
  if (fields.binaryPath) entry.binaryPath = fields.binaryPath;
  if (tier) entry.tier = tier;
  return entry;
}

/** Simple mode: single provider (legacy format). */
async function simpleSetup(
  config: McpctlConfig,
  configDeps: Partial<ConfigLoaderDeps>,
  prompt: ConfigSetupPrompt,
  log: (...args: string[]) => void,
  fetchModels: ConfigSetupDeps['fetchModels'],
  whichBinary: (name: string) => Promise<string | null>,
  secretStore: SecretStore,
): Promise<void> {
  const currentLlm = config.llm && 'provider' in config.llm ? config.llm : undefined;

  const choices = ALL_PROVIDER_CHOICES.map((c) => {
    if (currentLlm?.provider === c.value) {
      return { ...c, name: `${c.name} (current)` };
    }
    return c;
  });

  const provider = await prompt.select<LlmProviderName>('Select LLM provider:', choices);

  if (provider === 'none') {
    const updated: McpctlConfig = { ...config, llm: { provider: 'none' } };
    saveConfig(updated, configDeps);
    log('LLM disabled. Restart mcplocal: systemctl --user restart mcplocal');
    return;
  }

  const fields = await setupProviderFields(provider, prompt, log, fetchModels, whichBinary, secretStore);
  const llmConfig: LlmConfig = { provider, ...fields };
  const updated: McpctlConfig = { ...config, llm: llmConfig };
  saveConfig(updated, configDeps);
  log(`\nLLM configured: ${llmConfig.provider}${llmConfig.model ? ` / ${llmConfig.model}` : ''}`);
  log('Restart mcplocal: systemctl --user restart mcplocal');
}

/** Advanced mode: multiple providers with tier assignments. */
async function advancedSetup(
  config: McpctlConfig,
  configDeps: Partial<ConfigLoaderDeps>,
  prompt: ConfigSetupPrompt,
  log: (...args: string[]) => void,
  fetchModels: ConfigSetupDeps['fetchModels'],
  whichBinary: (name: string) => Promise<string | null>,
  secretStore: SecretStore,
): Promise<void> {
  const entries: LlmProviderEntry[] = [];

  // Fast providers
  const addFast = await prompt.confirm('Add a FAST provider? (vLLM, Ollama — local, cheap, fast)', true);
  if (addFast) {
    let addMore = true;
    while (addMore) {
      const providerType = await prompt.select<LlmProviderName>('Fast provider type:', FAST_PROVIDER_CHOICES);
      const defaultName = providerType === 'vllm' ? 'vllm-local' : providerType;
      const name = await prompt.input('Provider name:', defaultName);
      const fields = await setupProviderFields(providerType, prompt, log, fetchModels, whichBinary, secretStore);
      entries.push(buildEntry(providerType, name, fields, 'fast'));
      log(` Added: ${name} (${providerType}) → fast tier`);
      addMore = await prompt.confirm('Add another fast provider?', false);
    }
  }

  // Heavy providers
  const addHeavy = await prompt.confirm('Add a HEAVY provider? (Gemini, Anthropic, OpenAI — cloud, smart)', true);
  if (addHeavy) {
    let addMore = true;
    while (addMore) {
      const providerType = await prompt.select<LlmProviderName>('Heavy provider type:', HEAVY_PROVIDER_CHOICES);
      const defaultName = providerType;
      const name = await prompt.input('Provider name:', defaultName);
      const fields = await setupProviderFields(providerType, prompt, log, fetchModels, whichBinary, secretStore);
      entries.push(buildEntry(providerType, name, fields, 'heavy'));
      log(` Added: ${name} (${providerType}) → heavy tier`);
      addMore = await prompt.confirm('Add another heavy provider?', false);
    }
  }

  if (entries.length === 0) {
    log('No providers configured.');
    return;
  }

  // Summary
  log('\nProvider configuration:');
  for (const e of entries) {
    log(` ${e.tier ?? 'unassigned'}: ${e.name} (${e.type})${e.model ? ` / ${e.model}` : ''}`);
  }

  const updated: McpctlConfig = { ...config, llm: { providers: entries } };
  saveConfig(updated, configDeps);
  log('\nRestart mcplocal: systemctl --user restart mcplocal');
}
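Advanced mode persists `config.llm` as `{ providers: [...] }`, each entry tagged `fast` or `heavy`. A consumer-side sketch of resolving a tier back to a provider; only the entry shape comes from the patch, the resolution function itself is hypothetical:

```typescript
// Entry shape mirrors LlmProviderEntry from the patch; pickProvider is hypothetical.
interface ProviderEntry {
  name: string;
  type: string;
  tier?: 'fast' | 'heavy';
  model?: string;
}

function pickProvider(entries: ProviderEntry[], tier: 'fast' | 'heavy'): ProviderEntry | undefined {
  // First entry matching the tier wins; fall back to any entry when no tier matches.
  return entries.find((e) => e.tier === tier) ?? entries[0];
}

const entries: ProviderEntry[] = [
  { name: 'ollama', type: 'ollama', tier: 'fast', model: 'llama3.2' },
  { name: 'anthropic', type: 'anthropic', tier: 'heavy', model: 'claude-sonnet-4-20250514' },
];
console.log(pickProvider(entries, 'heavy')?.name); // anthropic
```

The fallback keeps a one-provider config usable even if the user never assigned the requested tier.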
|
||||
export function createConfigSetupCommand(deps?: Partial<ConfigSetupDeps>): Command {
|
||||
return new Command('setup')
|
||||
.description('Interactive LLM provider setup wizard')
|
||||
.action(async () => {
|
||||
const configDeps = deps?.configDeps ?? {};
|
||||
const log = deps?.log ?? ((...args: string[]) => console.log(...args));
|
||||
const prompt = deps?.prompt ?? defaultPrompt;
|
||||
const fetchModels = deps?.fetchModels ?? defaultFetchModels;
|
||||
const whichBinary = deps?.whichBinary ?? defaultWhichBinary;
|
||||
const secretStore = deps?.secretStore ?? await createSecretStore();
|
||||
|
||||
const config = loadConfig(configDeps);
|
||||
|
||||
const mode = await prompt.select<'simple' | 'advanced'>('Setup mode:', [
|
||||
{ name: 'Simple', value: 'simple', description: 'One provider for everything' },
|
||||
{ name: 'Advanced', value: 'advanced', description: 'Multiple providers with fast/heavy tiers' },
|
||||
]);
|
||||
|
||||
if (mode === 'simple') {
|
||||
await simpleSetup(config, configDeps, prompt, log, fetchModels, whichBinary, secretStore);
|
||||
} else {
|
||||
await advancedSetup(config, configDeps, prompt, log, fetchModels, whichBinary, secretStore);
|
||||
}
|
||||
});
|
||||
}
|
||||
@@ -1,19 +1,36 @@
import { Command } from 'commander';
import { writeFileSync, readFileSync, existsSync } from 'node:fs';
import { resolve, join } from 'node:path';
import { homedir } from 'node:os';
import { loadConfig, saveConfig, mergeConfig, getConfigPath, DEFAULT_CONFIG } from '../config/index.js';
import type { McpctlConfig, ConfigLoaderDeps } from '../config/index.js';
import { formatJson, formatYaml } from '../formatters/index.js';
import { saveCredentials, loadCredentials } from '../auth/index.js';
import { createConfigSetupCommand } from './config-setup.js';
import type { CredentialsDeps, StoredCredentials } from '../auth/index.js';
import type { ApiClient } from '../api-client.js';

interface McpConfig {
  mcpServers: Record<string, { command?: string; args?: string[]; url?: string; env?: Record<string, string> }>;
}

export interface ConfigCommandDeps {
  configDeps: Partial<ConfigLoaderDeps>;
  log: (...args: string[]) => void;
}

export interface ConfigApiDeps {
  client: ApiClient;
  credentialsDeps: Partial<CredentialsDeps>;
  log: (...args: string[]) => void;
}

const defaultDeps: ConfigCommandDeps = {
  configDeps: {},
  log: (...args) => console.log(...args),
};

-export function createConfigCommand(deps?: Partial<ConfigCommandDeps>): Command {
export function createConfigCommand(deps?: Partial<ConfigCommandDeps>, apiDeps?: ConfigApiDeps): Command {
  const { configDeps, log } = { ...defaultDeps, ...deps };

  const config = new Command('config').description('Manage mcpctl configuration');
@@ -68,5 +85,134 @@ export function createConfigCommand(deps?: Partial<ConfigCommandDeps>): Command
      log('Configuration reset to defaults');
    });

  // claude/claude-generate: generate .mcp.json pointing at mcpctl mcp bridge
  function registerClaudeCommand(name: string, hidden: boolean): void {
    const cmd = config
      .command(name)
      .description(hidden ? '' : 'Generate .mcp.json that connects a project via mcpctl mcp bridge')
      .requiredOption('--project <name>', 'Project name')
      .option('-o, --output <path>', 'Output file path', '.mcp.json')
      .option('--merge', 'Merge with existing .mcp.json instead of overwriting')
      .option('--stdout', 'Print to stdout instead of writing a file')
      .action((opts: { project: string; output: string; merge?: boolean; stdout?: boolean }) => {
        const mcpConfig: McpConfig = {
          mcpServers: {
            [opts.project]: {
              command: 'mcpctl',
              args: ['mcp', '-p', opts.project],
            },
          },
        };

        if (opts.stdout) {
          log(JSON.stringify(mcpConfig, null, 2));
          return;
        }

        const outputPath = resolve(opts.output);
        let finalConfig = mcpConfig;

        if (opts.merge && existsSync(outputPath)) {
          try {
            const existing = JSON.parse(readFileSync(outputPath, 'utf-8')) as McpConfig;
            finalConfig = {
              mcpServers: {
                ...existing.mcpServers,
                ...mcpConfig.mcpServers,
              },
            };
          } catch {
            // If existing file is invalid, just overwrite
          }
        }

        writeFileSync(outputPath, JSON.stringify(finalConfig, null, 2) + '\n');
        const serverCount = Object.keys(finalConfig.mcpServers).length;
        log(`Wrote ${outputPath} (${serverCount} server(s))`);
      });
    if (hidden) {
      // Commander shows empty-description commands but they won't clutter help output
      void cmd; // suppress unused lint
    }
  }

  registerClaudeCommand('claude', false);
  registerClaudeCommand('claude-generate', true); // backward compat
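For illustration, with a hypothetical project named `demo`, the command above writes a .mcp.json of this shape:

```json
{
  "mcpServers": {
    "demo": {
      "command": "mcpctl",
      "args": ["mcp", "-p", "demo"]
    }
  }
}
```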
  config.addCommand(createConfigSetupCommand({ configDeps }));

  if (apiDeps) {
    const { client, credentialsDeps, log: apiLog } = apiDeps;

    config
      .command('impersonate')
      .description('Impersonate another user or return to original identity')
      .argument('[email]', 'Email of user to impersonate')
      .option('--quit', 'Stop impersonating and return to original identity')
      .action(async (email: string | undefined, opts: { quit?: boolean }) => {
        const configDir = credentialsDeps?.configDir ?? join(homedir(), '.mcpctl');
        const backupPath = join(configDir, 'credentials-backup');

        if (opts.quit) {
          if (!existsSync(backupPath)) {
            apiLog('No impersonation session to quit');
            process.exitCode = 1;
            return;
          }

          const backupRaw = readFileSync(backupPath, 'utf-8');
          const backup = JSON.parse(backupRaw) as StoredCredentials;
          saveCredentials(backup, credentialsDeps);

          // Remove backup file
          const { unlinkSync } = await import('node:fs');
          unlinkSync(backupPath);

          apiLog(`Returned to ${backup.user}`);
          return;
        }

        if (!email) {
          apiLog('Email is required when not using --quit');
          process.exitCode = 1;
          return;
        }

        // Save current credentials as backup
        const currentCreds = loadCredentials(credentialsDeps);
        if (!currentCreds) {
          apiLog('Not logged in. Run "mcpctl login" first.');
          process.exitCode = 1;
          return;
        }

        writeFileSync(backupPath, JSON.stringify(currentCreds, null, 2) + '\n', 'utf-8');

        try {
          const result = await client.post<{ token: string; user: { email: string } }>(
            '/api/v1/auth/impersonate',
            { email },
          );

          saveCredentials({
            token: result.token,
            mcpdUrl: currentCreds.mcpdUrl,
            user: result.user.email,
          }, credentialsDeps);

          apiLog(`Impersonating ${result.user.email}. Use 'mcpctl config impersonate --quit' to return.`);
        } catch (err) {
          // Restore backup on failure
          const backup = JSON.parse(readFileSync(backupPath, 'utf-8')) as StoredCredentials;
          saveCredentials(backup, credentialsDeps);
          const { unlinkSync } = await import('node:fs');
          unlinkSync(backupPath);

          apiLog(`Impersonate failed: ${(err as Error).message}`);
          process.exitCode = 1;
        }
      });
  }

  return config;
}
432
src/cli/src/commands/create.ts
Normal file
@@ -0,0 +1,432 @@
import { Command } from 'commander';
import { type ApiClient, ApiError } from '../api-client.js';

export interface CreateCommandDeps {
  client: ApiClient;
  log: (...args: unknown[]) => void;
}

function collect(value: string, prev: string[]): string[] {
  return [...prev, value];
}

interface ServerEnvEntry {
  name: string;
  value?: string;
  valueFrom?: { secretRef: { name: string; key: string } };
}

function parseServerEnv(entries: string[]): ServerEnvEntry[] {
  return entries.map((entry) => {
    const eqIdx = entry.indexOf('=');
    if (eqIdx === -1) {
      throw new Error(`Invalid env format '${entry}'. Expected KEY=value or KEY=secretRef:SECRET:KEY`);
    }
    const envName = entry.slice(0, eqIdx);
    const rhs = entry.slice(eqIdx + 1);

    if (rhs.startsWith('secretRef:')) {
      const parts = rhs.split(':');
      if (parts.length !== 3) {
        throw new Error(`Invalid secretRef format '${entry}'. Expected KEY=secretRef:SECRET_NAME:SECRET_KEY`);
      }
      return {
        name: envName,
        valueFrom: { secretRef: { name: parts[1]!, key: parts[2]! } },
      };
    }

    return { name: envName, value: rhs };
  });
}
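As an aside, a standalone sketch (not part of the diff) of the `--env` entry grammar handled above; `parseEntry` and the sample values are hypothetical, but the two accepted forms match the parser:

```typescript
// --env entry grammar (sketch):
//   KEY=value              -> inline value
//   KEY=secretRef:S:K      -> reference to key K of secret S
type EnvEntry =
  | { name: string; value: string }
  | { name: string; valueFrom: { secretRef: { name: string; key: string } } };

function parseEntry(entry: string): EnvEntry {
  const eq = entry.indexOf('=');
  if (eq === -1) throw new Error(`Invalid env format '${entry}'`);
  const name = entry.slice(0, eq);
  const rhs = entry.slice(eq + 1);
  if (rhs.startsWith('secretRef:')) {
    const [, secret, key] = rhs.split(':');
    return { name, valueFrom: { secretRef: { name: secret!, key: key! } } };
  }
  return { name, value: rhs };
}

console.log(JSON.stringify(parseEntry('PORT=8080')));
// → {"name":"PORT","value":"8080"}
console.log(JSON.stringify(parseEntry('TOKEN=secretRef:github-creds:token')));
// → {"name":"TOKEN","valueFrom":{"secretRef":{"name":"github-creds","key":"token"}}}
```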
function parseEnvEntries(entries: string[]): Record<string, string> {
  const result: Record<string, string> = {};
  for (const entry of entries) {
    const eqIdx = entry.indexOf('=');
    if (eqIdx === -1) {
      throw new Error(`Invalid env format '${entry}'. Expected KEY=value`);
    }
    result[entry.slice(0, eqIdx)] = entry.slice(eqIdx + 1);
  }
  return result;
}

export function createCreateCommand(deps: CreateCommandDeps): Command {
  const { client, log } = deps;

  const cmd = new Command('create')
    .description('Create a resource (server, secret, project, user, group, rbac)');

  // --- create server ---
  cmd.command('server')
    .description('Create an MCP server definition')
    .argument('<name>', 'Server name (lowercase, hyphens allowed)')
    .option('-d, --description <text>', 'Server description')
    .option('--package-name <name>', 'NPM package name')
    .option('--docker-image <image>', 'Docker image')
    .option('--transport <type>', 'Transport type (STDIO, SSE, STREAMABLE_HTTP)')
    .option('--repository-url <url>', 'Source repository URL')
    .option('--external-url <url>', 'External endpoint URL')
    .option('--command <arg>', 'Command argument (repeat for multiple)', collect, [])
    .option('--container-port <port>', 'Container port number')
    .option('--replicas <count>', 'Number of replicas')
    .option('--env <entry>', 'Env var: KEY=value (inline) or KEY=secretRef:SECRET:KEY (secret ref, repeat for multiple)', collect, [])
    .option('--from-template <name>', 'Create from template (name or name:version)')
    .option('--force', 'Update if already exists')
    .action(async (name: string, opts) => {
      let base: Record<string, unknown> = {};

      // If --from-template, fetch template and use as base
      if (opts.fromTemplate) {
        const tplRef = opts.fromTemplate as string;
        const [tplName, tplVersion] = tplRef.includes(':')
          ? [tplRef.slice(0, tplRef.indexOf(':')), tplRef.slice(tplRef.indexOf(':') + 1)]
          : [tplRef, undefined];

        const templates = await client.get<Array<Record<string, unknown>>>(`/api/v1/templates?name=${encodeURIComponent(tplName)}`);
        let template: Record<string, unknown> | undefined;
        if (tplVersion) {
          template = templates.find((t) => t.name === tplName && t.version === tplVersion);
          if (!template) throw new Error(`Template '${tplName}' version '${tplVersion}' not found`);
        } else {
          template = templates.find((t) => t.name === tplName);
          if (!template) throw new Error(`Template '${tplName}' not found`);
        }

        // Copy template fields as base (strip template-only, internal, and null fields)
        const { id: _id, createdAt: _c, updatedAt: _u, version: _v, name: _n, ...tplFields } = template;
        base = {};
        for (const [k, v] of Object.entries(tplFields)) {
          if (v !== null && v !== undefined) base[k] = v;
        }

        // Convert template env (description/required) to server env (name/value/valueFrom)
        const tplEnv = template.env as Array<{ name: string; description?: string; required?: boolean; defaultValue?: string }> | undefined;
        if (tplEnv && tplEnv.length > 0) {
          base.env = tplEnv.map((e) => ({ name: e.name, value: e.defaultValue ?? '' }));
        }

        // Track template origin
        base.templateName = tplName;
        base.templateVersion = (template.version as string) ?? '1.0.0';
      }

      // Build body: template base → CLI overrides (last wins)
      const body: Record<string, unknown> = {
        ...base,
        name,
      };
      if (opts.description !== undefined) body.description = opts.description;
      if (opts.transport) body.transport = opts.transport;
      if (opts.replicas) body.replicas = parseInt(opts.replicas, 10);
      if (opts.packageName) body.packageName = opts.packageName;
      if (opts.dockerImage) body.dockerImage = opts.dockerImage;
      if (opts.repositoryUrl) body.repositoryUrl = opts.repositoryUrl;
      if (opts.externalUrl) body.externalUrl = opts.externalUrl;
      if (opts.command.length > 0) body.command = opts.command;
      if (opts.containerPort) body.containerPort = parseInt(opts.containerPort, 10);
      if (opts.env.length > 0) {
        // Merge: CLI env entries override template env entries by name
        const cliEnv = parseServerEnv(opts.env);
        const existing = (body.env as ServerEnvEntry[] | undefined) ?? [];
        const merged = [...existing];
        for (const entry of cliEnv) {
          const idx = merged.findIndex((e) => e.name === entry.name);
          if (idx >= 0) {
            merged[idx] = entry;
          } else {
            merged.push(entry);
          }
        }
        body.env = merged;
      }

      // Defaults when no template
      if (!opts.fromTemplate) {
        if (body.description === undefined) body.description = '';
        if (!body.transport) body.transport = 'STDIO';
        if (!body.replicas) body.replicas = 1;
      }

      try {
        const server = await client.post<{ id: string; name: string }>('/api/v1/servers', body);
        log(`server '${server.name}' created (id: ${server.id})`);
      } catch (err) {
        if (err instanceof ApiError && err.status === 409 && opts.force) {
          const existing = (await client.get<Array<{ id: string; name: string }>>('/api/v1/servers')).find((s) => s.name === name);
          if (!existing) throw err;
          const { name: _n, ...updateBody } = body;
          await client.put(`/api/v1/servers/${existing.id}`, updateBody);
          log(`server '${name}' updated (id: ${existing.id})`);
        } else {
          throw err;
        }
      }
    });
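The create / 409 / `--force` fallback above recurs in every `create` subcommand. A standalone sketch of the pattern with a stubbed client (the `ApiError` class and stub responses here are stand-ins, not the real '../api-client.js'):

```typescript
// Stand-in for the real ApiError from '../api-client.js'.
class ApiError extends Error {
  constructor(public status: number) {
    super(`HTTP ${status}`);
  }
}

// Stubbed client: POST always conflicts, GET lists one existing server, PUT echoes its path.
const client = {
  async post(_path: string, _body: unknown): Promise<string> {
    throw new ApiError(409);
  },
  async get(_path: string): Promise<Array<{ id: string; name: string }>> {
    return [{ id: 's-1', name: 'my-server' }];
  },
  async put(path: string, _body: unknown): Promise<string> {
    return path;
  },
};

async function upsert(name: string, body: Record<string, unknown>, force: boolean): Promise<string> {
  try {
    return await client.post('/api/v1/servers', { ...body, name });
  } catch (err) {
    // On a name conflict with --force, resolve the existing ID and update instead.
    if (err instanceof ApiError && err.status === 409 && force) {
      const existing = (await client.get('/api/v1/servers')).find((s) => s.name === name);
      if (!existing) throw err;
      return client.put(`/api/v1/servers/${existing.id}`, body);
    }
    throw err;
  }
}

upsert('my-server', { transport: 'STDIO' }, true).then((r) => console.log(r)); // → /api/v1/servers/s-1
```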
  // --- create secret ---
  cmd.command('secret')
    .description('Create a secret')
    .argument('<name>', 'Secret name (lowercase, hyphens allowed)')
    .option('--data <entry>', 'Secret data KEY=value (repeat for multiple)', collect, [])
    .option('--force', 'Update if already exists')
    .action(async (name: string, opts) => {
      const data = parseEnvEntries(opts.data);
      try {
        const secret = await client.post<{ id: string; name: string }>('/api/v1/secrets', {
          name,
          data,
        });
        log(`secret '${secret.name}' created (id: ${secret.id})`);
      } catch (err) {
        if (err instanceof ApiError && err.status === 409 && opts.force) {
          const existing = (await client.get<Array<{ id: string; name: string }>>('/api/v1/secrets')).find((s) => s.name === name);
          if (!existing) throw err;
          await client.put(`/api/v1/secrets/${existing.id}`, { data });
          log(`secret '${name}' updated (id: ${existing.id})`);
        } else {
          throw err;
        }
      }
    });

  // --- create project ---
  cmd.command('project')
    .description('Create a project')
    .argument('<name>', 'Project name')
    .option('-d, --description <text>', 'Project description', '')
    .option('--proxy-mode <mode>', 'Proxy mode (direct, filtered)')
    .option('--prompt <text>', 'Project-level prompt / instructions for the LLM')
    .option('--gated', 'Enable gated sessions (default: true)')
    .option('--no-gated', 'Disable gated sessions')
    .option('--server <name>', 'Server name (repeat for multiple)', collect, [])
    .option('--force', 'Update if already exists')
    .action(async (name: string, opts) => {
      const body: Record<string, unknown> = {
        name,
        description: opts.description,
        proxyMode: opts.proxyMode ?? 'direct',
      };
      if (opts.prompt) body.prompt = opts.prompt;
      if (opts.gated !== undefined) body.gated = opts.gated as boolean;
      if (opts.server.length > 0) body.servers = opts.server;

      try {
        const project = await client.post<{ id: string; name: string }>('/api/v1/projects', body);
        log(`project '${project.name}' created (id: ${project.id})`);
      } catch (err) {
        if (err instanceof ApiError && err.status === 409 && opts.force) {
          const existing = (await client.get<Array<{ id: string; name: string }>>('/api/v1/projects')).find((p) => p.name === name);
          if (!existing) throw err;
          const { name: _n, ...updateBody } = body;
          await client.put(`/api/v1/projects/${existing.id}`, updateBody);
          log(`project '${name}' updated (id: ${existing.id})`);
        } else {
          throw err;
        }
      }
    });

  // --- create user ---
  cmd.command('user')
    .description('Create a user')
    .argument('<email>', 'User email address')
    .option('--password <pass>', 'User password')
    .option('--name <name>', 'User display name')
    .option('--force', 'Update if already exists')
    .action(async (email: string, opts) => {
      if (!opts.password) {
        throw new Error('--password is required');
      }
      const body: Record<string, unknown> = {
        email,
        password: opts.password,
      };
      if (opts.name) body.name = opts.name;

      try {
        const user = await client.post<{ id: string; email: string }>('/api/v1/users', body);
        log(`user '${user.email}' created (id: ${user.id})`);
      } catch (err) {
        if (err instanceof ApiError && err.status === 409 && opts.force) {
          const existing = (await client.get<Array<{ id: string; email: string }>>('/api/v1/users')).find((u) => u.email === email);
          if (!existing) throw err;
          const { email: _e, ...updateBody } = body;
          await client.put(`/api/v1/users/${existing.id}`, updateBody);
          log(`user '${email}' updated (id: ${existing.id})`);
        } else {
          throw err;
        }
      }
    });

  // --- create group ---
  cmd.command('group')
    .description('Create a group')
    .argument('<name>', 'Group name')
    .option('--description <text>', 'Group description')
    .option('--member <email>', 'Member email (repeat for multiple)', collect, [])
    .option('--force', 'Update if already exists')
    .action(async (name: string, opts) => {
      const body: Record<string, unknown> = {
        name,
        members: opts.member,
      };
      if (opts.description) body.description = opts.description;

      try {
        const group = await client.post<{ id: string; name: string }>('/api/v1/groups', body);
        log(`group '${group.name}' created (id: ${group.id})`);
      } catch (err) {
        if (err instanceof ApiError && err.status === 409 && opts.force) {
          const existing = (await client.get<Array<{ id: string; name: string }>>('/api/v1/groups')).find((g) => g.name === name);
          if (!existing) throw err;
          const { name: _n, ...updateBody } = body;
          await client.put(`/api/v1/groups/${existing.id}`, updateBody);
          log(`group '${name}' updated (id: ${existing.id})`);
        } else {
          throw err;
        }
      }
    });

  // --- create rbac ---
  cmd.command('rbac')
    .description('Create an RBAC binding definition')
    .argument('<name>', 'RBAC binding name')
    .option('--subject <entry>', 'Subject as Kind:name (repeat for multiple)', collect, [])
    .option('--binding <entry>', 'Role binding as role:resource (e.g. edit:servers, run:projects)', collect, [])
    .option('--operation <action>', 'Operation binding (e.g. logs, backup)', collect, [])
    .option('--force', 'Update if already exists')
    .action(async (name: string, opts) => {
      const subjects = (opts.subject as string[]).map((entry: string) => {
        const colonIdx = entry.indexOf(':');
        if (colonIdx === -1) {
          throw new Error(`Invalid subject format '${entry}'. Expected Kind:name (e.g. User:alice@example.com)`);
        }
        return { kind: entry.slice(0, colonIdx), name: entry.slice(colonIdx + 1) };
      });

      const roleBindings: Array<Record<string, string>> = [];

      // Resource bindings from --binding flag (role:resource or role:resource:name)
      for (const entry of opts.binding as string[]) {
        const parts = entry.split(':');
        if (parts.length === 2) {
          roleBindings.push({ role: parts[0]!, resource: parts[1]! });
        } else if (parts.length === 3) {
          roleBindings.push({ role: parts[0]!, resource: parts[1]!, name: parts[2]! });
        } else {
          throw new Error(`Invalid binding format '${entry}'. Expected role:resource or role:resource:name (e.g. edit:servers, view:servers:my-ha)`);
        }
      }

      // Operation bindings from --operation flag
      for (const action of opts.operation as string[]) {
        roleBindings.push({ role: 'run', action });
      }

      const body: Record<string, unknown> = {
        name,
        subjects,
        roleBindings,
      };

      try {
        const rbac = await client.post<{ id: string; name: string }>('/api/v1/rbac', body);
        log(`rbac '${rbac.name}' created (id: ${rbac.id})`);
      } catch (err) {
        if (err instanceof ApiError && err.status === 409 && opts.force) {
          const existing = (await client.get<Array<{ id: string; name: string }>>('/api/v1/rbac')).find((r) => r.name === name);
          if (!existing) throw err;
          const { name: _n, ...updateBody } = body;
          await client.put(`/api/v1/rbac/${existing.id}`, updateBody);
          log(`rbac '${name}' updated (id: ${existing.id})`);
        } else {
          throw err;
        }
      }
    });
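A standalone sketch (not part of the diff) of the `--subject` and `--binding` grammars parsed above; `parseSubject`/`parseBinding` are illustrative names, but the rules mirror the command's parsing:

```typescript
// --subject: Kind:name, split at the FIRST colon only, so the name may contain colons.
function parseSubject(entry: string): { kind: string; name: string } {
  const i = entry.indexOf(':');
  if (i === -1) throw new Error(`Invalid subject format '${entry}'`);
  return { kind: entry.slice(0, i), name: entry.slice(i + 1) };
}

// --binding: role:resource, optionally role:resource:name for a single named resource.
function parseBinding(entry: string): Record<string, string> {
  const parts = entry.split(':');
  if (parts.length === 2) return { role: parts[0]!, resource: parts[1]! };
  if (parts.length === 3) return { role: parts[0]!, resource: parts[1]!, name: parts[2]! };
  throw new Error(`Invalid binding format '${entry}'`);
}

console.log(JSON.stringify(parseSubject('User:alice@example.com')));
// → {"kind":"User","name":"alice@example.com"}
console.log(JSON.stringify(parseBinding('view:servers:my-ha')));
// → {"role":"view","resource":"servers","name":"my-ha"}
```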
  // --- create prompt ---
  cmd.command('prompt')
    .description('Create an approved prompt')
    .argument('<name>', 'Prompt name (lowercase alphanumeric with hyphens)')
    .option('--project <name>', 'Project name to scope the prompt to')
    .option('--content <text>', 'Prompt content text')
    .option('--content-file <path>', 'Read prompt content from file')
    .option('--priority <number>', 'Priority 1-10 (default: 5, higher = more important)')
    .option('--link <target>', 'Link to MCP resource (format: project/server:uri)')
    .action(async (name: string, opts) => {
      let content = opts.content as string | undefined;
      if (opts.contentFile) {
        const fs = await import('node:fs/promises');
        content = await fs.readFile(opts.contentFile as string, 'utf-8');
      }
      if (!content) {
        throw new Error('--content or --content-file is required');
      }

      const body: Record<string, unknown> = { name, content };
      if (opts.project) {
        // Resolve project name to ID
        const projects = await client.get<Array<{ id: string; name: string }>>('/api/v1/projects');
        const project = projects.find((p) => p.name === opts.project);
        if (!project) throw new Error(`Project '${opts.project as string}' not found`);
        body.projectId = project.id;
      }
      if (opts.priority) {
        const priority = Number(opts.priority);
        if (isNaN(priority) || priority < 1 || priority > 10) {
          throw new Error('--priority must be a number between 1 and 10');
        }
        body.priority = priority;
      }
      if (opts.link) {
        body.linkTarget = opts.link;
      }

      const prompt = await client.post<{ id: string; name: string }>('/api/v1/prompts', body);
      log(`prompt '${prompt.name}' created (id: ${prompt.id})`);
    });

  // --- create promptrequest ---
  cmd.command('promptrequest')
    .description('Create a prompt request (pending proposal that needs approval)')
    .argument('<name>', 'Prompt request name (lowercase alphanumeric with hyphens)')
    .option('--project <name>', 'Project name to scope the prompt request to')
    .option('--content <text>', 'Prompt content text')
    .option('--content-file <path>', 'Read prompt content from file')
    .option('--priority <number>', 'Priority 1-10 (default: 5, higher = more important)')
    .action(async (name: string, opts) => {
      let content = opts.content as string | undefined;
      if (opts.contentFile) {
        const fs = await import('node:fs/promises');
        content = await fs.readFile(opts.contentFile as string, 'utf-8');
      }
      if (!content) {
        throw new Error('--content or --content-file is required');
      }

      const body: Record<string, unknown> = { name, content };
      if (opts.project) {
        body.project = opts.project;
      }
      if (opts.priority) {
        const priority = Number(opts.priority);
        if (isNaN(priority) || priority < 1 || priority > 10) {
          throw new Error('--priority must be a number between 1 and 10');
        }
        body.priority = priority;
      }

      const pr = await client.post<{ id: string; name: string }>(
        '/api/v1/promptrequests',
        body,
      );
      log(`prompt request '${pr.name}' created (id: ${pr.id})`);
      log(` approve with: mcpctl approve promptrequest ${pr.name}`);
    });

  return cmd;
}
33
src/cli/src/commands/delete.ts
Normal file
@@ -0,0 +1,33 @@
import { Command } from 'commander';
import type { ApiClient } from '../api-client.js';
import { resolveResource, resolveNameOrId } from './shared.js';

export interface DeleteCommandDeps {
  client: ApiClient;
  log: (...args: unknown[]) => void;
}

export function createDeleteCommand(deps: DeleteCommandDeps): Command {
  const { client, log } = deps;

  return new Command('delete')
    .description('Delete a resource (server, instance, secret, project, user, group, rbac)')
    .argument('<resource>', 'resource type')
    .argument('<id>', 'resource ID or name')
    .action(async (resourceArg: string, idOrName: string) => {
      const resource = resolveResource(resourceArg);

      // Resolve name → ID for any resource type
      let id: string;
      try {
        id = await resolveNameOrId(client, resource, idOrName);
      } catch {
        id = idOrName; // Fall through with original
      }

      await client.delete(`/api/v1/${resource}/${id}`);

      const singular = resource.replace(/s$/, '');
      log(`${singular} '${idOrName}' deleted.`);
    });
}
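`resolveNameOrId` comes from './shared.js', which is not shown in this diff. A hypothetical sketch of what it plausibly does, inferred only from the usage above (list the collection, match by ID or name, throw when nothing matches); the helper body and stub data here are assumptions:

```typescript
// Minimal client shape needed for the sketch (the real ApiClient has more).
interface ListClient {
  get(path: string): Promise<Array<{ id: string; name?: string }>>;
}

// Assumed behavior: return the ID of the entry whose ID or name matches, else throw
// (the delete command above catches the throw and falls back to the raw argument).
async function resolveNameOrId(client: ListClient, resource: string, idOrName: string): Promise<string> {
  const items = await client.get(`/api/v1/${resource}`);
  const match = items.find((i) => i.id === idOrName || i.name === idOrName);
  if (!match) throw new Error(`${resource}: '${idOrName}' not found`);
  return match.id;
}

// Stubbed client holding one server:
const stub: ListClient = { get: async () => [{ id: 'abc-123', name: 'my-ha' }] };
resolveNameOrId(stub, 'servers', 'my-ha').then((id) => console.log(id)); // → abc-123
```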
|
||||
@@ -1,74 +1,638 @@
|
||||
import { Command } from 'commander';
|
||||
import { formatJson, formatYaml } from '../formatters/output.js';
|
||||
import { resolveResource, resolveNameOrId } from './shared.js';
|
||||
import type { ApiClient } from '../api-client.js';
|
||||
|
||||
export interface DescribeCommandDeps {
|
||||
client: ApiClient;
|
||||
fetchResource: (resource: string, id: string) => Promise<unknown>;
|
||||
fetchInspect?: (id: string) => Promise<unknown>;
|
||||
log: (...args: string[]) => void;
|
||||
}
|
||||
|
||||
const RESOURCE_ALIASES: Record<string, string> = {
|
||||
server: 'servers',
|
||||
srv: 'servers',
|
||||
profile: 'profiles',
|
||||
prof: 'profiles',
|
||||
project: 'projects',
|
||||
proj: 'projects',
|
||||
instance: 'instances',
|
||||
inst: 'instances',
|
||||
};
|
||||
|
||||
function resolveResource(name: string): string {
|
||||
const lower = name.toLowerCase();
|
||||
return RESOURCE_ALIASES[lower] ?? lower;
|
||||
function pad(label: string, width = 18): string {
|
||||
return label.padEnd(width);
|
||||
}
|
||||
|
||||
function formatDetail(obj: Record<string, unknown>, indent = 0): string {
|
||||
const pad = ' '.repeat(indent);
|
||||
function formatServerDetail(server: Record<string, unknown>): string {
|
||||
const lines: string[] = [];
|
||||
lines.push(`=== Server: ${server.name} ===`);
|
||||
lines.push(`${pad('Name:')}${server.name}`);
|
||||
lines.push(`${pad('Transport:')}${server.transport ?? '-'}`);
|
||||
lines.push(`${pad('Replicas:')}${server.replicas ?? 1}`);
|
||||
if (server.dockerImage) lines.push(`${pad('Docker Image:')}${server.dockerImage}`);
|
||||
if (server.packageName) lines.push(`${pad('Package:')}${server.packageName}`);
|
||||
if (server.externalUrl) lines.push(`${pad('External URL:')}${server.externalUrl}`);
|
||||
if (server.repositoryUrl) lines.push(`${pad('Repository:')}${server.repositoryUrl}`);
|
||||
if (server.containerPort) lines.push(`${pad('Container Port:')}${server.containerPort}`);
|
||||
if (server.description) lines.push(`${pad('Description:')}${server.description}`);
|
||||
|
||||
for (const [key, value] of Object.entries(obj)) {
|
||||
if (value === null || value === undefined) {
|
||||
lines.push(`${pad}${key}: -`);
|
||||
} else if (Array.isArray(value)) {
|
||||
if (value.length === 0) {
|
||||
lines.push(`${pad}${key}: []`);
|
||||
} else if (typeof value[0] === 'object') {
|
||||
lines.push(`${pad}${key}:`);
|
||||
for (const item of value) {
|
||||
lines.push(`${pad} - ${JSON.stringify(item)}`);
|
||||
}
|
||||
} else {
|
||||
lines.push(`${pad}${key}: ${value.join(', ')}`);
|
||||
const command = server.command as string[] | null;
|
||||
if (command && command.length > 0) {
|
||||
lines.push('');
|
||||
lines.push('Command:');
|
||||
lines.push(` ${command.join(' ')}`);
|
||||
}
|
||||
|
||||
const env = server.env as Array<{ name: string; value?: string; valueFrom?: { secretRef: { name: string; key: string } } }> | undefined;
|
||||
if (env && env.length > 0) {
|
||||
lines.push('');
|
||||
lines.push('Environment:');
|
||||
const nameW = Math.max(6, ...env.map((e) => e.name.length)) + 2;
|
||||
lines.push(` ${'NAME'.padEnd(nameW)}SOURCE`);
|
||||
for (const e of env) {
|
||||
if (e.value !== undefined) {
|
||||
lines.push(` ${e.name.padEnd(nameW)}${e.value}`);
|
||||
} else if (e.valueFrom?.secretRef) {
|
||||
const ref = e.valueFrom.secretRef;
|
||||
lines.push(` ${e.name.padEnd(nameW)}secret:${ref.name}/${ref.key}`);
|
||||
}
|
||||
} else if (typeof value === 'object') {
|
||||
lines.push(`${pad}${key}:`);
|
||||
lines.push(formatDetail(value as Record<string, unknown>, indent + 1));
|
||||
} else {
|
||||
lines.push(`${pad}${key}: ${String(value)}`);
|
||||
}
|
||||
}
|
||||
|
||||
const hc = server.healthCheck as { tool: string; arguments?: Record<string, unknown>; intervalSeconds?: number; timeoutSeconds?: number; failureThreshold?: number } | null;
|
||||
if (hc) {
|
||||
lines.push('');
|
||||
lines.push('Health Check:');
|
||||
lines.push(` ${pad('Tool:', 22)}${hc.tool}`);
|
||||
if (hc.arguments && Object.keys(hc.arguments).length > 0) {
|
||||
lines.push(` ${pad('Arguments:', 22)}${JSON.stringify(hc.arguments)}`);
|
||||
}
|
||||
lines.push(` ${pad('Interval:', 22)}${hc.intervalSeconds ?? 60}s`);
|
||||
lines.push(` ${pad('Timeout:', 22)}${hc.timeoutSeconds ?? 10}s`);
|
||||
lines.push(` ${pad('Failure Threshold:', 22)}${hc.failureThreshold ?? 3}`);
|
||||
}
|
||||
|
||||
lines.push('');
|
||||
lines.push('Metadata:');
|
||||
lines.push(` ${pad('ID:', 12)}${server.id}`);
|
||||
if (server.createdAt) lines.push(` ${pad('Created:', 12)}${server.createdAt}`);
|
||||
if (server.updatedAt) lines.push(` ${pad('Updated:', 12)}${server.updatedAt}`);
|
||||
|
||||
return lines.join('\n');
|
||||
}
|
||||
|
||||
function formatInstanceDetail(instance: Record<string, unknown>, inspect?: Record<string, unknown>): string {
  const lines: string[] = [];
  const server = instance.server as { name: string } | undefined;
  lines.push(`=== Instance: ${server?.name ?? instance.id} ===`);
  lines.push(`${pad('Status:')}${instance.status}`);
  lines.push(`${pad('Server:')}${server?.name ?? String(instance.serverId)}`);
  lines.push(`${pad('Container ID:')}${instance.containerId ?? '-'}`);
  lines.push(`${pad('Port:')}${instance.port ?? '-'}`);

  // Health section
  const healthStatus = instance.healthStatus as string | null;
  const lastHealthCheck = instance.lastHealthCheck as string | null;
  if (healthStatus || lastHealthCheck) {
    lines.push('');
    lines.push('Health:');
    lines.push(` ${pad('Status:', 16)}${healthStatus ?? 'unknown'}`);
    if (lastHealthCheck) lines.push(` ${pad('Last Check:', 16)}${lastHealthCheck}`);
  }

  const metadata = instance.metadata as Record<string, unknown> | undefined;
  if (metadata && Object.keys(metadata).length > 0) {
    lines.push('');
    lines.push('Metadata:');
    for (const [key, value] of Object.entries(metadata)) {
      lines.push(` ${pad(key + ':', 16)}${String(value)}`);
    }
  }

  if (inspect) {
    lines.push('');
    lines.push('Container:');
    for (const [key, value] of Object.entries(inspect)) {
      if (typeof value === 'object' && value !== null) {
        lines.push(` ${key}: ${JSON.stringify(value)}`);
      } else {
        lines.push(` ${pad(key + ':', 16)}${String(value)}`);
      }
    }
  }

  // Events section (k8s-style)
  const events = instance.events as Array<{ timestamp: string; type: string; message: string }> | undefined;
  if (events && events.length > 0) {
    lines.push('');
    lines.push('Events:');
    const tsW = 26;
    const typeW = 10;
    lines.push(` ${'TIMESTAMP'.padEnd(tsW)}${'TYPE'.padEnd(typeW)}MESSAGE`);
    for (const ev of events) {
      lines.push(` ${(ev.timestamp ?? '').padEnd(tsW)}${(ev.type ?? '').padEnd(typeW)}${ev.message ?? ''}`);
    }
  }

  lines.push('');
  lines.push(` ${pad('ID:', 12)}${instance.id}`);
  if (instance.createdAt) lines.push(` ${pad('Created:', 12)}${instance.createdAt}`);
  if (instance.updatedAt) lines.push(` ${pad('Updated:', 12)}${instance.updatedAt}`);

  return lines.join('\n');
}

function formatProjectDetail(
  project: Record<string, unknown>,
  prompts: Array<{ name: string; priority: number; linkTarget: string | null }> = [],
): string {
  const lines: string[] = [];
  lines.push(`=== Project: ${project.name} ===`);
  lines.push(`${pad('Name:')}${project.name}`);
  if (project.description) lines.push(`${pad('Description:')}${project.description}`);
  lines.push(`${pad('Gated:')}${project.gated ? 'yes' : 'no'}`);

  // Proxy config section
  const proxyMode = project.proxyMode as string | undefined;
  const llmProvider = project.llmProvider as string | undefined;
  const llmModel = project.llmModel as string | undefined;
  if (proxyMode || llmProvider || llmModel) {
    lines.push('');
    lines.push('Proxy Config:');
    lines.push(` ${pad('Mode:', 18)}${proxyMode ?? 'direct'}`);
    if (llmProvider) lines.push(` ${pad('LLM Provider:', 18)}${llmProvider}`);
    if (llmModel) lines.push(` ${pad('LLM Model:', 18)}${llmModel}`);
  }

  // Servers section
  const servers = project.servers as Array<{ server: { name: string } }> | undefined;
  if (servers && servers.length > 0) {
    lines.push('');
    lines.push('Servers:');
    lines.push(' NAME');
    for (const s of servers) {
      lines.push(` ${s.server.name}`);
    }
  }

  // Prompts section
  if (prompts.length > 0) {
    lines.push('');
    lines.push('Prompts:');
    const nameW = Math.max(4, ...prompts.map((p) => p.name.length)) + 2;
    lines.push(` ${'NAME'.padEnd(nameW)}${'PRI'.padEnd(6)}TYPE`);
    for (const p of prompts) {
      const type = p.linkTarget ? 'link' : 'local';
      lines.push(` ${p.name.padEnd(nameW)}${String(p.priority).padEnd(6)}${type}`);
    }
  }

  lines.push('');
  lines.push('Metadata:');
  lines.push(` ${pad('ID:', 12)}${project.id}`);
  if (project.ownerId) lines.push(` ${pad('Owner:', 12)}${project.ownerId}`);
  if (project.createdAt) lines.push(` ${pad('Created:', 12)}${project.createdAt}`);
  if (project.updatedAt) lines.push(` ${pad('Updated:', 12)}${project.updatedAt}`);

  return lines.join('\n');
}

||||
function formatSecretDetail(secret: Record<string, unknown>, showValues: boolean): string {
  const lines: string[] = [];
  lines.push(`=== Secret: ${secret.name} ===`);
  lines.push(`${pad('Name:')}${secret.name}`);

  const data = secret.data as Record<string, string> | undefined;
  if (data && Object.keys(data).length > 0) {
    lines.push('');
    lines.push('Data:');
    const keyW = Math.max(4, ...Object.keys(data).map((k) => k.length)) + 2;
    for (const [key, value] of Object.entries(data)) {
      const display = showValues ? value : '***';
      lines.push(` ${key.padEnd(keyW)}${display}`);
    }
    if (!showValues) {
      lines.push('');
      lines.push(' (use --show-values to reveal)');
    }
  } else {
    lines.push(`${pad('Data:')}(empty)`);
  }

  lines.push('');
  lines.push('Metadata:');
  lines.push(` ${pad('ID:', 12)}${secret.id}`);
  if (secret.createdAt) lines.push(` ${pad('Created:', 12)}${secret.createdAt}`);
  if (secret.updatedAt) lines.push(` ${pad('Updated:', 12)}${secret.updatedAt}`);

  return lines.join('\n');
}

||||
function formatTemplateDetail(template: Record<string, unknown>): string {
  const lines: string[] = [];
  lines.push(`=== Template: ${template.name} ===`);
  lines.push(`${pad('Name:')}${template.name}`);
  lines.push(`${pad('Version:')}${template.version ?? '1.0.0'}`);
  lines.push(`${pad('Transport:')}${template.transport ?? 'STDIO'}`);
  lines.push(`${pad('Replicas:')}${template.replicas ?? 1}`);
  if (template.dockerImage) lines.push(`${pad('Docker Image:')}${template.dockerImage}`);
  if (template.packageName) lines.push(`${pad('Package:')}${template.packageName}`);
  if (template.externalUrl) lines.push(`${pad('External URL:')}${template.externalUrl}`);
  if (template.repositoryUrl) lines.push(`${pad('Repository:')}${template.repositoryUrl}`);
  if (template.containerPort) lines.push(`${pad('Container Port:')}${template.containerPort}`);
  if (template.description) lines.push(`${pad('Description:')}${template.description}`);

  const command = template.command as string[] | null;
  if (command && command.length > 0) {
    lines.push('');
    lines.push('Command:');
    lines.push(` ${command.join(' ')}`);
  }

  const env = template.env as Array<{ name: string; description?: string; required?: boolean; defaultValue?: string }> | undefined;
  if (env && env.length > 0) {
    lines.push('');
    lines.push('Environment Variables:');
    const nameW = Math.max(6, ...env.map((e) => e.name.length)) + 2;
    lines.push(` ${'NAME'.padEnd(nameW)}${'REQUIRED'.padEnd(10)}DESCRIPTION`);
    for (const e of env) {
      const req = e.required ? 'yes' : 'no';
      const desc = e.description ?? '';
      lines.push(` ${e.name.padEnd(nameW)}${req.padEnd(10)}${desc}`);
    }
  }

  const hc = template.healthCheck as { tool: string; arguments?: Record<string, unknown>; intervalSeconds?: number; timeoutSeconds?: number; failureThreshold?: number } | null;
  if (hc) {
    lines.push('');
    lines.push('Health Check:');
    lines.push(` ${pad('Tool:', 22)}${hc.tool}`);
    if (hc.arguments && Object.keys(hc.arguments).length > 0) {
      lines.push(` ${pad('Arguments:', 22)}${JSON.stringify(hc.arguments)}`);
    }
    lines.push(` ${pad('Interval:', 22)}${hc.intervalSeconds ?? 60}s`);
    lines.push(` ${pad('Timeout:', 22)}${hc.timeoutSeconds ?? 10}s`);
    lines.push(` ${pad('Failure Threshold:', 22)}${hc.failureThreshold ?? 3}`);
  }

  lines.push('');
  lines.push('Usage:');
  lines.push(` mcpctl create server my-${template.name} --from-template=${template.name}`);

  lines.push('');
  lines.push('Metadata:');
  lines.push(` ${pad('ID:', 12)}${template.id}`);
  if (template.createdAt) lines.push(` ${pad('Created:', 12)}${template.createdAt}`);
  if (template.updatedAt) lines.push(` ${pad('Updated:', 12)}${template.updatedAt}`);

  return lines.join('\n');
}

||||
interface RbacBinding { role: string; resource?: string; action?: string; name?: string }
interface RbacDef { name: string; subjects: Array<{ kind: string; name: string }>; roleBindings: RbacBinding[] }
interface PermissionSet { source: string; bindings: RbacBinding[] }

function formatPermissionSections(sections: PermissionSet[]): string[] {
  const lines: string[] = [];
  for (const section of sections) {
    const bindings = section.bindings;
    if (bindings.length === 0) continue;

    const resourceBindings = bindings.filter((b) => 'resource' in b && b.resource !== undefined);
    const operationBindings = bindings.filter((b) => 'action' in b && b.action !== undefined);

    if (resourceBindings.length > 0) {
      lines.push('');
      lines.push(`${section.source} — Resources:`);
      const roleW = Math.max(6, ...resourceBindings.map((b) => b.role.length)) + 2;
      const resW = Math.max(10, ...resourceBindings.map((b) => (b.resource ?? '').length)) + 2;
      const hasName = resourceBindings.some((b) => b.name);
      if (hasName) {
        lines.push(` ${'ROLE'.padEnd(roleW)}${'RESOURCE'.padEnd(resW)}NAME`);
      } else {
        lines.push(` ${'ROLE'.padEnd(roleW)}RESOURCE`);
      }
      for (const b of resourceBindings) {
        if (hasName) {
          lines.push(` ${b.role.padEnd(roleW)}${(b.resource ?? '').padEnd(resW)}${b.name ?? '*'}`);
        } else {
          lines.push(` ${b.role.padEnd(roleW)}${b.resource}`);
        }
      }
    }

    if (operationBindings.length > 0) {
      lines.push('');
      lines.push(`${section.source} — Operations:`);
      lines.push(` ${'ACTION'.padEnd(20)}ROLE`);
      for (const b of operationBindings) {
        lines.push(` ${(b.action ?? '').padEnd(20)}${b.role}`);
      }
    }
  }
  return lines;
}

function collectBindingsForSubject(
  rbacDefs: RbacDef[],
  kind: string,
  name: string,
): { rbacName: string; bindings: RbacBinding[] }[] {
  const results: { rbacName: string; bindings: RbacBinding[] }[] = [];
  for (const def of rbacDefs) {
    const matched = def.subjects.some((s) => s.kind === kind && s.name === name);
    if (matched) {
      results.push({ rbacName: def.name, bindings: def.roleBindings });
    }
  }
  return results;
}

function formatUserDetail(
  user: Record<string, unknown>,
  rbacDefs?: RbacDef[],
  userGroups?: string[],
): string {
  const lines: string[] = [];
  lines.push(`=== User: ${user.email} ===`);
  lines.push(`${pad('Email:')}${user.email}`);
  lines.push(`${pad('Name:')}${(user.name as string | null) ?? '-'}`);
  lines.push(`${pad('Provider:')}${(user.provider as string | null) ?? 'local'}`);

  if (userGroups && userGroups.length > 0) {
    lines.push(`${pad('Groups:')}${userGroups.join(', ')}`);
  }

  if (rbacDefs) {
    const email = user.email as string;

    // Direct permissions (User:email subjects)
    const directMatches = collectBindingsForSubject(rbacDefs, 'User', email);
    const directBindings = directMatches.flatMap((m) => m.bindings);
    const directSources = directMatches.map((m) => m.rbacName).join(', ');

    // Inherited permissions (Group:name subjects)
    const inheritedSections: PermissionSet[] = [];
    if (userGroups) {
      for (const groupName of userGroups) {
        const groupMatches = collectBindingsForSubject(rbacDefs, 'Group', groupName);
        const groupBindings = groupMatches.flatMap((m) => m.bindings);
        if (groupBindings.length > 0) {
          inheritedSections.push({ source: `Inherited (${groupName})`, bindings: groupBindings });
        }
      }
    }

    const sections: PermissionSet[] = [];
    if (directBindings.length > 0) {
      sections.push({ source: `Direct (${directSources})`, bindings: directBindings });
    }
    sections.push(...inheritedSections);

    if (sections.length > 0) {
      lines.push('');
      lines.push('Access:');
      lines.push(...formatPermissionSections(sections));
    } else {
      lines.push('');
      lines.push('Access: (none)');
    }
  }

  lines.push('');
  lines.push('Metadata:');
  lines.push(` ${pad('ID:', 12)}${user.id}`);
  if (user.createdAt) lines.push(` ${pad('Created:', 12)}${user.createdAt}`);
  if (user.updatedAt) lines.push(` ${pad('Updated:', 12)}${user.updatedAt}`);

  return lines.join('\n');
}

function formatGroupDetail(group: Record<string, unknown>, rbacDefs?: RbacDef[]): string {
  const lines: string[] = [];
  lines.push(`=== Group: ${group.name} ===`);
  lines.push(`${pad('Name:')}${group.name}`);
  if (group.description) lines.push(`${pad('Description:')}${group.description}`);

  const members = group.members as Array<{ user: { email: string }; createdAt?: string }> | undefined;
  if (members && members.length > 0) {
    lines.push('');
    lines.push('Members:');
    const emailW = Math.max(6, ...members.map((m) => m.user.email.length)) + 2;
    lines.push(` ${'EMAIL'.padEnd(emailW)}ADDED`);
    for (const m of members) {
      const added = (m.createdAt as string | undefined) ?? '-';
      lines.push(` ${m.user.email.padEnd(emailW)}${added}`);
    }
  }

  if (rbacDefs) {
    const groupName = group.name as string;
    const matches = collectBindingsForSubject(rbacDefs, 'Group', groupName);
    const allBindings = matches.flatMap((m) => m.bindings);
    const sources = matches.map((m) => m.rbacName).join(', ');

    if (allBindings.length > 0) {
      const sections: PermissionSet[] = [{ source: `Granted (${sources})`, bindings: allBindings }];
      lines.push('');
      lines.push('Access:');
      lines.push(...formatPermissionSections(sections));
    } else {
      lines.push('');
      lines.push('Access: (none)');
    }
  }

  lines.push('');
  lines.push('Metadata:');
  lines.push(` ${pad('ID:', 12)}${group.id}`);
  if (group.createdAt) lines.push(` ${pad('Created:', 12)}${group.createdAt}`);
  if (group.updatedAt) lines.push(` ${pad('Updated:', 12)}${group.updatedAt}`);

  return lines.join('\n');
}

function formatRbacDetail(rbac: Record<string, unknown>): string {
  const lines: string[] = [];
  lines.push(`=== RBAC: ${rbac.name} ===`);
  lines.push(`${pad('Name:')}${rbac.name}`);

  const subjects = rbac.subjects as Array<{ kind: string; name: string }> | undefined;
  if (subjects && subjects.length > 0) {
    lines.push('');
    lines.push('Subjects:');
    const kindW = Math.max(6, ...subjects.map((s) => s.kind.length)) + 2;
    lines.push(` ${'KIND'.padEnd(kindW)}NAME`);
    for (const s of subjects) {
      lines.push(` ${s.kind.padEnd(kindW)}${s.name}`);
    }
  }

  const roleBindings = rbac.roleBindings as Array<{ role: string; resource?: string; action?: string; name?: string }> | undefined;
  if (roleBindings && roleBindings.length > 0) {
    // Separate resource bindings from operation bindings
    const resourceBindings = roleBindings.filter((b) => 'resource' in b && b.resource !== undefined);
    const operationBindings = roleBindings.filter((b) => 'action' in b && b.action !== undefined);

    if (resourceBindings.length > 0) {
      lines.push('');
      lines.push('Resource Bindings:');
      const roleW = Math.max(6, ...resourceBindings.map((b) => b.role.length)) + 2;
      const resW = Math.max(10, ...resourceBindings.map((b) => (b.resource ?? '').length)) + 2;
      const hasName = resourceBindings.some((b) => b.name);
      if (hasName) {
        lines.push(` ${'ROLE'.padEnd(roleW)}${'RESOURCE'.padEnd(resW)}NAME`);
      } else {
        lines.push(` ${'ROLE'.padEnd(roleW)}RESOURCE`);
      }
      for (const b of resourceBindings) {
        if (hasName) {
          lines.push(` ${b.role.padEnd(roleW)}${(b.resource ?? '').padEnd(resW)}${b.name ?? '*'}`);
        } else {
          lines.push(` ${b.role.padEnd(roleW)}${b.resource}`);
        }
      }
    }

    if (operationBindings.length > 0) {
      lines.push('');
      lines.push('Operations:');
      lines.push(` ${'ACTION'.padEnd(20)}ROLE`);
      for (const b of operationBindings) {
        lines.push(` ${(b.action ?? '').padEnd(20)}${b.role}`);
      }
    }
  }

  lines.push('');
  lines.push('Metadata:');
  lines.push(` ${pad('ID:', 12)}${rbac.id}`);
  if (rbac.createdAt) lines.push(` ${pad('Created:', 12)}${rbac.createdAt}`);
  if (rbac.updatedAt) lines.push(` ${pad('Updated:', 12)}${rbac.updatedAt}`);

  return lines.join('\n');
}

function formatGenericDetail(obj: Record<string, unknown>): string {
  const lines: string[] = [];
  for (const [key, value] of Object.entries(obj)) {
    if (value === null || value === undefined) {
      lines.push(`${pad(key + ':')} -`);
    } else if (Array.isArray(value)) {
      if (value.length === 0) {
        lines.push(`${pad(key + ':')} []`);
      } else {
        lines.push(`${key}:`);
        for (const item of value) {
          lines.push(` - ${typeof item === 'object' ? JSON.stringify(item) : String(item)}`);
        }
      }
    } else if (typeof value === 'object') {
      lines.push(`${key}:`);
      for (const [k, v] of Object.entries(value as Record<string, unknown>)) {
        lines.push(` ${pad(k + ':')}${String(v)}`);
      }
    } else {
      lines.push(`${pad(key + ':')}${String(value)}`);
    }
  }
  return lines.join('\n');
}

export function createDescribeCommand(deps: DescribeCommandDeps): Command {
  return new Command('describe')
    .description('Show detailed information about a resource')
    .argument('<resource>', 'resource type (server, project, instance)')
    .argument('<id>', 'resource ID or name')
    .option('-o, --output <format>', 'output format (detail, json, yaml)', 'detail')
    .option('--show-values', 'Show secret values (default: masked)')
    .action(async (resourceArg: string, idOrName: string, opts: { output: string; showValues?: boolean }) => {
      const resource = resolveResource(resourceArg);

      // Resolve name → ID
      let id: string;
      if (resource === 'instances') {
        // Instances: accept instance ID or server name (resolve to first running instance)
        try {
          id = await resolveNameOrId(deps.client, resource, idOrName);
        } catch {
          // Not an instance ID — try as server name
          const servers = await deps.client.get<Array<{ id: string; name: string }>>('/api/v1/servers');
          const server = servers.find((s) => s.name === idOrName || s.id === idOrName);
          if (server) {
            const instances = await deps.client.get<Array<{ id: string; status: string }>>(`/api/v1/instances?serverId=${server.id}`);
            const running = instances.find((i) => i.status === 'RUNNING') ?? instances[0];
            if (running) {
              id = running.id;
            } else {
              throw new Error(`No instances found for server '${idOrName}'`);
            }
          } else {
            id = idOrName;
          }
        }
      } else {
        try {
          id = await resolveNameOrId(deps.client, resource, idOrName);
        } catch {
          id = idOrName;
        }
      }

      const item = await deps.fetchResource(resource, id) as Record<string, unknown>;

      // Enrich instances with container inspect data
      let inspect: Record<string, unknown> | undefined;
      if (resource === 'instances' && deps.fetchInspect && item.containerId) {
        try {
          inspect = await deps.fetchInspect(id) as Record<string, unknown>;
          item.containerInspect = inspect;
        } catch {
          // Container may not be available
        }
      }

      if (opts.output === 'json') {
        deps.log(formatJson(item));
      } else if (opts.output === 'yaml') {
        deps.log(formatYaml(item));
      } else {
        // Visually clean sectioned output
        switch (resource) {
          case 'servers':
            deps.log(formatServerDetail(item));
            break;
          case 'instances':
            deps.log(formatInstanceDetail(item, inspect));
            break;
          case 'secrets':
            deps.log(formatSecretDetail(item, opts.showValues === true));
            break;
          case 'templates':
            deps.log(formatTemplateDetail(item));
            break;
          case 'projects': {
            const projectPrompts = await deps.client
              .get<Array<{ name: string; priority: number; linkTarget: string | null }>>(`/api/v1/prompts?projectId=${item.id as string}`)
              .catch(() => []);
            deps.log(formatProjectDetail(item, projectPrompts));
            break;
          }
          case 'users': {
            // Fetch RBAC definitions and groups to show permissions
            const [rbacDefsForUser, allGroupsForUser] = await Promise.all([
              deps.client.get<RbacDef[]>('/api/v1/rbac').catch(() => [] as RbacDef[]),
              deps.client.get<Array<{ name: string; members?: Array<{ user: { email: string } }> }>>('/api/v1/groups').catch(() => []),
            ]);
            const userEmail = item.email as string;
            const userGroupNames = allGroupsForUser
              .filter((g) => g.members?.some((m) => m.user.email === userEmail))
              .map((g) => g.name);
            deps.log(formatUserDetail(item, rbacDefsForUser, userGroupNames));
            break;
          }
          case 'groups': {
            const rbacDefsForGroup = await deps.client.get<RbacDef[]>('/api/v1/rbac').catch(() => [] as RbacDef[]);
            deps.log(formatGroupDetail(item, rbacDefsForGroup));
            break;
          }
          case 'rbac':
            deps.log(formatRbacDetail(item));
            break;
          default:
            deps.log(formatGenericDetail(item));
        }
      }
    });
}

115
src/cli/src/commands/edit.ts
Normal file
@@ -0,0 +1,115 @@
import { Command } from 'commander';
import { writeFileSync, readFileSync, unlinkSync, mkdtempSync } from 'node:fs';
import { join } from 'node:path';
import { tmpdir } from 'node:os';
import { execSync } from 'node:child_process';
import yaml from 'js-yaml';
import type { ApiClient } from '../api-client.js';
import { resolveResource, resolveNameOrId, stripInternalFields } from './shared.js';
import { reorderKeys } from '../formatters/output.js';

export interface EditCommandDeps {
  client: ApiClient;
  log: (...args: unknown[]) => void;
  /** Override for testing — return editor binary name. */
  getEditor?: () => string;
  /** Override for testing — simulate opening the editor. */
  openEditor?: (filePath: string, editor: string) => void;
}

function getEditor(deps: EditCommandDeps): string {
  if (deps.getEditor) return deps.getEditor();
  return process.env.VISUAL ?? process.env.EDITOR ?? 'vi';
}

function openEditor(filePath: string, editor: string, deps: EditCommandDeps): void {
  if (deps.openEditor) {
    deps.openEditor(filePath, editor);
    return;
  }
  execSync(`${editor} "${filePath}"`, { stdio: 'inherit' });
}

export function createEditCommand(deps: EditCommandDeps): Command {
  const { client, log } = deps;

  return new Command('edit')
    .description('Edit a resource in your default editor (server, project)')
    .argument('<resource>', 'Resource type (server, project)')
    .argument('<name-or-id>', 'Resource name or ID')
    .action(async (resourceArg: string, nameOrId: string) => {
      const resource = resolveResource(resourceArg);

      // Instances are immutable
      if (resource === 'instances') {
        log('Error: instances are immutable and cannot be edited.');
        log('To change an instance, update the server definition and let reconciliation handle it.');
        process.exitCode = 1;
        return;
      }

      const validResources = ['servers', 'secrets', 'projects', 'groups', 'rbac', 'prompts', 'promptrequests'];
      if (!validResources.includes(resource)) {
        log(`Error: unknown resource type '${resourceArg}'`);
        process.exitCode = 1;
        return;
      }

      // Resolve name → ID
      const id = await resolveNameOrId(client, resource, nameOrId);

      // Fetch current state
      const current = await client.get<Record<string, unknown>>(`/api/v1/${resource}/${id}`);

      // Strip read-only fields for editor
      const editable = reorderKeys(stripInternalFields(current)) as Record<string, unknown>;

      // Serialize to YAML
      const singular = resource.replace(/s$/, '');
      const header = `# Editing ${singular}: ${nameOrId}\n# Save and close to apply changes. Clear the file to cancel.\n`;
      const originalYaml = yaml.dump(editable, { lineWidth: 120, noRefs: true });
      const content = header + originalYaml;

      // Write to temp file
      const tmpDir = mkdtempSync(join(tmpdir(), 'mcpctl-edit-'));
      const tmpFile = join(tmpDir, `${singular}-${nameOrId}.yaml`);
      writeFileSync(tmpFile, content, 'utf-8');

      try {
        // Open editor
        const editor = getEditor(deps);
        openEditor(tmpFile, editor, deps);

        // Read back
        const modified = readFileSync(tmpFile, 'utf-8');

        // Strip comments for comparison
        const modifiedClean = modified
          .split('\n')
          .filter((line) => !line.startsWith('#'))
          .join('\n')
          .trim();

        if (!modifiedClean) {
          log('Edit cancelled (empty file).');
          return;
        }

        if (modifiedClean === originalYaml.trim()) {
          log(`${singular} '${nameOrId}' unchanged.`);
          return;
        }

        // Parse and apply
        const updates = yaml.load(modifiedClean) as Record<string, unknown>;
        await client.put(`/api/v1/${resource}/${id}`, updates);
        log(`${singular} '${nameOrId}' updated.`);
      } finally {
        try {
          unlinkSync(tmpFile);
        } catch {
          // Ignore cleanup errors
        }
      }
    });
}
@@ -2,9 +2,10 @@ import { Command } from 'commander';
import { formatTable } from '../formatters/table.js';
import { formatJson, formatYaml } from '../formatters/output.js';
import type { Column } from '../formatters/table.js';
import { resolveResource, stripInternalFields } from './shared.js';

export interface GetCommandDeps {
  fetchResource: (resource: string, id?: string, opts?: { project?: string; all?: boolean }) => Promise<unknown[]>;
  log: (...args: string[]) => void;
}

@@ -16,41 +17,39 @@ interface ServerRow {
  dockerImage: string | null;
}

interface ProfileRow {
  id: string;
  name: string;
  serverId: string;
}

interface ProjectRow {
  id: string;
  name: string;
  description: string;
  proxyMode: string;
  gated: boolean;
  ownerId: string;
  servers?: Array<{ server: { name: string } }>;
}

interface SecretRow {
  id: string;
  name: string;
  data: Record<string, string>;
}

interface TemplateRow {
  id: string;
  name: string;
  version: string;
  transport: string;
  packageName: string | null;
  description: string;
}

interface InstanceRow {
  id: string;
  serverId: string;
  server?: { name: string };
  status: string;
  containerId: string | null;
  port: number | null;
  healthStatus: string | null;
}

const serverColumns: Column<ServerRow>[] = [
@@ -61,22 +60,120 @@ const serverColumns: Column<ServerRow>[] = [
  { header: 'ID', key: 'id' },
];

const profileColumns: Column<ProfileRow>[] = [
  { header: 'NAME', key: 'name' },
  { header: 'SERVER ID', key: 'serverId' },
  { header: 'ID', key: 'id' },
];

interface UserRow {
  id: string;
  email: string;
  name: string | null;
  provider: string | null;
}

interface GroupRow {
  id: string;
  name: string;
  description: string;
  members?: Array<{ user: { email: string } }>;
}

interface RbacRow {
  id: string;
  name: string;
  subjects: Array<{ kind: string; name: string }>;
  roleBindings: Array<{ role: string; resource?: string; action?: string; name?: string }>;
}

const projectColumns: Column<ProjectRow>[] = [
  { header: 'NAME', key: 'name' },
  { header: 'MODE', key: (r) => r.proxyMode ?? 'direct', width: 10 },
  { header: 'GATED', key: (r) => r.gated ? 'yes' : 'no', width: 6 },
  { header: 'SERVERS', key: (r) => r.servers ? String(r.servers.length) : '0', width: 8 },
  { header: 'DESCRIPTION', key: 'description', width: 30 },
  { header: 'ID', key: 'id' },
];

const userColumns: Column<UserRow>[] = [
  { header: 'EMAIL', key: 'email' },
  { header: 'NAME', key: (r) => r.name ?? '-' },
  { header: 'PROVIDER', key: (r) => r.provider ?? 'local', width: 10 },
  { header: 'ID', key: 'id' },
];

const groupColumns: Column<GroupRow>[] = [
  { header: 'NAME', key: 'name' },
  { header: 'MEMBERS', key: (r) => r.members ? String(r.members.length) : '0', width: 8 },
  { header: 'DESCRIPTION', key: 'description', width: 40 },
  { header: 'ID', key: 'id' },
];

const rbacColumns: Column<RbacRow>[] = [
  { header: 'NAME', key: 'name' },
  { header: 'SUBJECTS', key: (r) => r.subjects.map((s) => `${s.kind}:${s.name}`).join(', '), width: 30 },
  { header: 'BINDINGS', key: (r) => r.roleBindings.map((b) => {
    if ('action' in b && b.action !== undefined) return `run>${b.action}`;
    if ('resource' in b && b.resource !== undefined) {
      const base = `${b.role}:${b.resource}`;
      return b.name ? `${base}:${b.name}` : base;
    }
    return b.role;
  }).join(', '), width: 40 },
  { header: 'ID', key: 'id' },
];

const secretColumns: Column<SecretRow>[] = [
  { header: 'NAME', key: 'name' },
  { header: 'KEYS', key: (r) => Object.keys(r.data).join(', ') || '-', width: 40 },
  { header: 'ID', key: 'id' },
];

const templateColumns: Column<TemplateRow>[] = [
  { header: 'NAME', key: 'name' },
  { header: 'VERSION', key: 'version', width: 10 },
  { header: 'TRANSPORT', key: 'transport', width: 16 },
  { header: 'PACKAGE', key: (r) => r.packageName ?? '-' },
  { header: 'DESCRIPTION', key: 'description', width: 50 },
];

interface PromptRow {
  id: string;
  name: string;
  projectId: string | null;
  project?: { name: string } | null;
  priority: number;
  linkTarget: string | null;
  linkStatus: 'alive' | 'dead' | null;
  createdAt: string;
}

interface PromptRequestRow {
  id: string;
  name: string;
  projectId: string | null;
  project?: { name: string } | null;
  createdBySession: string | null;
  createdAt: string;
}

const promptColumns: Column<PromptRow>[] = [
  { header: 'NAME', key: 'name' },
  { header: 'PROJECT', key: (r) => r.project?.name ?? (r.projectId ? r.projectId : '(global)'), width: 20 },
  { header: 'PRI', key: (r) => String(r.priority), width: 4 },
  { header: 'LINK', key: (r) => r.linkTarget ? r.linkTarget.split(':')[0]! : '-', width: 20 },
  { header: 'STATUS', key: (r) => r.linkStatus ?? '-', width: 6 },
  { header: 'CREATED', key: (r) => new Date(r.createdAt).toLocaleString(), width: 20 },
  { header: 'ID', key: 'id' },
];

const promptRequestColumns: Column<PromptRequestRow>[] = [
  { header: 'NAME', key: 'name' },
  { header: 'PROJECT', key: (r) => r.project?.name ?? (r.projectId ? r.projectId : '(global)'), width: 20 },
  { header: 'SESSION', key: (r) => r.createdBySession ? r.createdBySession.slice(0, 12) : '-', width: 14 },
|
||||
{ header: 'CREATED', key: (r) => new Date(r.createdAt).toLocaleString(), width: 20 },
|
||||
{ header: 'ID', key: 'id' },
|
||||
];
|
||||
|
||||
const instanceColumns: Column<InstanceRow>[] = [
|
||||
{ header: 'NAME', key: (r) => r.server?.name ?? '-', width: 20 },
|
||||
{ header: 'STATUS', key: 'status', width: 10 },
|
||||
{ header: 'SERVER ID', key: 'serverId' },
|
||||
{ header: 'HEALTH', key: (r) => r.healthStatus ?? '-', width: 10 },
|
||||
{ header: 'PORT', key: (r) => r.port != null ? String(r.port) : '-', width: 6 },
|
||||
{ header: 'CONTAINER', key: (r) => r.containerId ? r.containerId.slice(0, 12) : '-', width: 14 },
|
||||
{ header: 'ID', key: 'id' },
|
||||
@@ -86,12 +183,24 @@ function getColumnsForResource(resource: string): Column<Record<string, unknown>
  switch (resource) {
    case 'servers':
      return serverColumns as unknown as Column<Record<string, unknown>>[];
    case 'profiles':
      return profileColumns as unknown as Column<Record<string, unknown>>[];
    case 'projects':
      return projectColumns as unknown as Column<Record<string, unknown>>[];
    case 'secrets':
      return secretColumns as unknown as Column<Record<string, unknown>>[];
    case 'templates':
      return templateColumns as unknown as Column<Record<string, unknown>>[];
    case 'instances':
      return instanceColumns as unknown as Column<Record<string, unknown>>[];
    case 'users':
      return userColumns as unknown as Column<Record<string, unknown>>[];
    case 'groups':
      return groupColumns as unknown as Column<Record<string, unknown>>[];
    case 'rbac':
      return rbacColumns as unknown as Column<Record<string, unknown>>[];
    case 'prompts':
      return promptColumns as unknown as Column<Record<string, unknown>>[];
    case 'promptrequests':
      return promptRequestColumns as unknown as Column<Record<string, unknown>>[];
    default:
      return [
        { header: 'ID', key: 'id' as keyof Record<string, unknown> },
@@ -100,21 +209,43 @@ function getColumnsForResource(resource: string): Column<Record<string, unknown>
  }
}

/**
 * Transform API response items into apply-compatible format.
 * Strips internal fields and wraps in the resource key.
 */
function toApplyFormat(resource: string, items: unknown[]): Record<string, unknown[]> {
  const cleaned = items.map((item) => {
    return stripInternalFields(item as Record<string, unknown>);
  });
  return { [resource]: cleaned };
}
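The helper above is compact enough to sketch standalone. The real `stripInternalFields` lives elsewhere in the CLI; the stand-in below assumes the internal fields are `createdAt`, `updatedAt`, and `ownerId`, which is an illustrative guess rather than the branch's actual list:

```typescript
// Minimal standalone sketch of the apply-format transform. The key list below
// is an assumption for illustration only.
const INTERNAL_KEYS = new Set(['createdAt', 'updatedAt', 'ownerId']);

function stripInternalFields(item: Record<string, unknown>): Record<string, unknown> {
  return Object.fromEntries(
    Object.entries(item).filter(([key]) => !INTERNAL_KEYS.has(key)),
  );
}

function toApplyFormat(resource: string, items: unknown[]): Record<string, unknown[]> {
  const cleaned = items.map((item) => stripInternalFields(item as Record<string, unknown>));
  return { [resource]: cleaned };
}

// A `get servers -o json` payload becomes an apply-compatible document:
const doc = toApplyFormat('servers', [
  { id: 's1', name: 'fetch', createdAt: '2024-01-01T00:00:00Z' },
]);
// doc => { servers: [{ id: 's1', name: 'fetch' }] }
```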
export function createGetCommand(deps: GetCommandDeps): Command {
  return new Command('get')
    .description('List resources (servers, profiles, projects, instances)')
    .argument('<resource>', 'resource type (servers, profiles, projects, instances)')
    .argument('[id]', 'specific resource ID')
    .description('List resources (servers, projects, instances)')
    .argument('<resource>', 'resource type (servers, projects, instances)')
    .argument('[id]', 'specific resource ID or name')
    .option('-o, --output <format>', 'output format (table, json, yaml)', 'table')
    .action(async (resourceArg: string, id: string | undefined, opts: { output: string }) => {
    .option('--project <name>', 'Filter by project')
    .option('-A, --all', 'Show all (including project-scoped) resources')
    .action(async (resourceArg: string, id: string | undefined, opts: { output: string; project?: string; all?: true }) => {
      const resource = resolveResource(resourceArg);
      const items = await deps.fetchResource(resource, id);
      const fetchOpts: { project?: string; all?: boolean } = {};
      if (opts.project) fetchOpts.project = opts.project;
      if (opts.all) fetchOpts.all = true;
      const items = await deps.fetchResource(resource, id, Object.keys(fetchOpts).length > 0 ? fetchOpts : undefined);

      if (opts.output === 'json') {
        deps.log(formatJson(items.length === 1 ? items[0] : items));
        // Apply-compatible JSON wrapped in resource key
        deps.log(formatJson(toApplyFormat(resource, items)));
      } else if (opts.output === 'yaml') {
        deps.log(formatYaml(items.length === 1 ? items[0] : items));
        // Apply-compatible YAML wrapped in resource key
        deps.log(formatYaml(toApplyFormat(resource, items)));
      } else {
        if (items.length === 0) {
          deps.log(`No ${resource} found.`);
          return;
        }
        const columns = getColumnsForResource(resource);
        deps.log(formatTable(items as Record<string, unknown>[], columns));
      }
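The changed action only forwards a filter object to `fetchResource` when at least one of `--project` or `-A/--all` was given. Pulled out as a helper (hypothetical, for illustration), the rule is:

```typescript
// Hypothetical extraction of the filter-forwarding logic in the `get` action:
// build a fetch-options object, but return undefined when no filter is set,
// so fetchResource sees no third argument in the common case.
function buildFetchOpts(opts: { project?: string; all?: true }): { project?: string; all?: boolean } | undefined {
  const fetchOpts: { project?: string; all?: boolean } = {};
  if (opts.project) fetchOpts.project = opts.project;
  if (opts.all) fetchOpts.all = true;
  return Object.keys(fetchOpts).length > 0 ? fetchOpts : undefined;
}

// buildFetchOpts({})                  -> undefined
// buildFetchOpts({ project: 'demo' }) -> { project: 'demo' }
```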
@@ -1,123 +0,0 @@
import { Command } from 'commander';
import type { ApiClient } from '../api-client.js';

interface Instance {
  id: string;
  serverId: string;
  status: string;
  containerId: string | null;
  port: number | null;
  createdAt: string;
}

export interface InstanceCommandDeps {
  client: ApiClient;
  log: (...args: unknown[]) => void;
}

export function createInstanceCommands(deps: InstanceCommandDeps): Command {
  const { client, log } = deps;
  const cmd = new Command('instance')
    .alias('instances')
    .alias('inst')
    .description('Manage MCP server instances');

  cmd
    .command('list')
    .alias('ls')
    .description('List running instances')
    .option('-s, --server <id>', 'Filter by server ID')
    .option('-o, --output <format>', 'Output format (table, json)', 'table')
    .action(async (opts: { server?: string; output: string }) => {
      let url = '/api/v1/instances';
      if (opts.server) {
        url += `?serverId=${encodeURIComponent(opts.server)}`;
      }
      const instances = await client.get<Instance[]>(url);
      if (opts.output === 'json') {
        log(JSON.stringify(instances, null, 2));
        return;
      }
      if (instances.length === 0) {
        log('No instances found.');
        return;
      }
      log('ID\tSERVER\tSTATUS\tPORT\tCONTAINER');
      for (const inst of instances) {
        const cid = inst.containerId ? inst.containerId.slice(0, 12) : '-';
        const port = inst.port ?? '-';
        log(`${inst.id}\t${inst.serverId}\t${inst.status}\t${port}\t${cid}`);
      }
    });

  cmd
    .command('start <serverId>')
    .description('Start a new MCP server instance')
    .option('-p, --port <port>', 'Host port to bind')
    .option('-o, --output <format>', 'Output format (table, json)', 'table')
    .action(async (serverId: string, opts: { port?: string; output: string }) => {
      const body: Record<string, unknown> = { serverId };
      if (opts.port !== undefined) {
        body.hostPort = parseInt(opts.port, 10);
      }
      const instance = await client.post<Instance>('/api/v1/instances', body);
      if (opts.output === 'json') {
        log(JSON.stringify(instance, null, 2));
        return;
      }
      log(`Instance ${instance.id} started (status: ${instance.status})`);
    });

  cmd
    .command('stop <id>')
    .description('Stop a running instance')
    .action(async (id: string) => {
      const instance = await client.post<Instance>(`/api/v1/instances/${id}/stop`);
      log(`Instance ${id} stopped (status: ${instance.status})`);
    });

  cmd
    .command('restart <id>')
    .description('Restart an instance (stop, remove, start fresh)')
    .action(async (id: string) => {
      const instance = await client.post<Instance>(`/api/v1/instances/${id}/restart`);
      log(`Instance restarted as ${instance.id} (status: ${instance.status})`);
    });

  cmd
    .command('remove <id>')
    .alias('rm')
    .description('Remove an instance and its container')
    .action(async (id: string) => {
      await client.delete(`/api/v1/instances/${id}`);
      log(`Instance ${id} removed.`);
    });

  cmd
    .command('logs <id>')
    .description('Get logs from an instance')
    .option('-t, --tail <lines>', 'Number of lines to show')
    .action(async (id: string, opts: { tail?: string }) => {
      let url = `/api/v1/instances/${id}/logs`;
      if (opts.tail) {
        url += `?tail=${opts.tail}`;
      }
      const logs = await client.get<{ stdout: string; stderr: string }>(url);
      if (logs.stdout) {
        log(logs.stdout);
      }
      if (logs.stderr) {
        process.stderr.write(logs.stderr);
      }
    });

  cmd
    .command('inspect <id>')
    .description('Get detailed container info for an instance')
    .action(async (id: string) => {
      const info = await client.get(`/api/v1/instances/${id}/inspect`);
      log(JSON.stringify(info, null, 2));
    });

  return cmd;
}
98
src/cli/src/commands/logs.ts
Normal file
@@ -0,0 +1,98 @@
import { Command } from 'commander';
import type { ApiClient } from '../api-client.js';

export interface LogsCommandDeps {
  client: ApiClient;
  log: (...args: unknown[]) => void;
}

interface InstanceInfo {
  id: string;
  status: string;
  containerId: string | null;
}

/**
 * Resolve a name/ID to an instance ID.
 * Accepts: instance ID, server name, or server ID.
 * For servers with multiple replicas, picks by --instance index or first RUNNING.
 */
async function resolveInstance(
  client: ApiClient,
  nameOrId: string,
  instanceIndex?: number,
): Promise<{ instanceId: string; serverName?: string; replicaInfo?: string }> {
  // Try as instance ID first
  try {
    await client.get(`/api/v1/instances/${nameOrId}`);
    return { instanceId: nameOrId };
  } catch {
    // Not a valid instance ID
  }

  // Try as server name/ID → find its instances
  const servers = await client.get<Array<{ id: string; name: string }>>('/api/v1/servers');
  const server = servers.find((s) => s.name === nameOrId || s.id === nameOrId);
  if (!server) {
    throw new Error(`Instance or server '${nameOrId}' not found`);
  }

  const instances = await client.get<InstanceInfo[]>(`/api/v1/instances?serverId=${server.id}`);
  if (instances.length === 0) {
    throw new Error(`No instances found for server '${server.name}'`);
  }

  // Select by index or pick first running
  let selected: InstanceInfo | undefined;
  if (instanceIndex !== undefined) {
    if (instanceIndex < 0 || instanceIndex >= instances.length) {
      throw new Error(`Instance index ${instanceIndex} out of range (server '${server.name}' has ${instances.length} instance${instances.length > 1 ? 's' : ''})`);
    }
    selected = instances[instanceIndex];
  } else {
    selected = instances.find((i) => i.status === 'RUNNING') ?? instances[0];
  }

  if (!selected) {
    throw new Error(`No instances found for server '${server.name}'`);
  }

  const result: { instanceId: string; serverName?: string; replicaInfo?: string } = {
    instanceId: selected.id,
    serverName: server.name,
  };
  if (instances.length > 1) {
    result.replicaInfo = `instance ${instances.indexOf(selected) + 1}/${instances.length}`;
  }
  return result;
}

export function createLogsCommand(deps: LogsCommandDeps): Command {
  const { client, log } = deps;

  return new Command('logs')
    .description('Get logs from an MCP server instance')
    .argument('<name>', 'Server name, server ID, or instance ID')
    .option('-t, --tail <lines>', 'Number of lines to show')
    .option('-i, --instance <index>', 'Instance/replica index (0-based, for servers with multiple replicas)')
    .action(async (nameOrId: string, opts: { tail?: string; instance?: string }) => {
      const instanceIndex = opts.instance !== undefined ? parseInt(opts.instance, 10) : undefined;
      const { instanceId, serverName, replicaInfo } = await resolveInstance(client, nameOrId, instanceIndex);

      if (replicaInfo) {
        process.stderr.write(`Showing logs for ${serverName} (${replicaInfo})\n`);
      }

      let url = `/api/v1/instances/${instanceId}/logs`;
      if (opts.tail) {
        url += `?tail=${opts.tail}`;
      }
      const logs = await client.get<{ stdout: string; stderr: string }>(url);
      if (logs.stdout) {
        log(logs.stdout);
      }
      if (logs.stderr) {
        process.stderr.write(logs.stderr);
      }
    });
}
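The replica-selection rule inside `resolveInstance` (an explicit index wins; otherwise the first RUNNING replica; otherwise the first instance) can be pulled out and exercised without an API client. This is a sketch, not code from the branch:

```typescript
interface ReplicaInfo {
  id: string;
  status: string;
}

// Sketch of the selection rule used by resolveInstance: an explicit index
// takes precedence (and must be in range); otherwise prefer the first
// RUNNING replica, falling back to the first instance in the list.
function selectReplica(instances: ReplicaInfo[], index?: number): ReplicaInfo {
  if (index !== undefined) {
    if (index < 0 || index >= instances.length) {
      throw new Error(`Instance index ${index} out of range`);
    }
    return instances[index]!;
  }
  const selected = instances.find((i) => i.status === 'RUNNING') ?? instances[0];
  if (!selected) {
    throw new Error('No instances found');
  }
  return selected;
}
```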
224
src/cli/src/commands/mcp.ts
Normal file
@@ -0,0 +1,224 @@
import { Command } from 'commander';
import http from 'node:http';
import { createInterface } from 'node:readline';

export interface McpBridgeOptions {
  projectName: string;
  mcplocalUrl: string;
  token?: string | undefined;
  stdin: NodeJS.ReadableStream;
  stdout: NodeJS.WritableStream;
  stderr: NodeJS.WritableStream;
}

function postJsonRpc(
  url: string,
  body: string,
  sessionId: string | undefined,
  token: string | undefined,
): Promise<{ status: number; headers: http.IncomingHttpHeaders; body: string }> {
  return new Promise((resolve, reject) => {
    const parsed = new URL(url);
    const headers: Record<string, string> = {
      'Content-Type': 'application/json',
      'Accept': 'application/json, text/event-stream',
    };
    if (sessionId) {
      headers['mcp-session-id'] = sessionId;
    }
    if (token) {
      headers['Authorization'] = `Bearer ${token}`;
    }

    const req = http.request(
      {
        hostname: parsed.hostname,
        port: parsed.port,
        path: parsed.pathname,
        method: 'POST',
        headers,
        timeout: 30_000,
      },
      (res) => {
        const chunks: Buffer[] = [];
        res.on('data', (chunk: Buffer) => chunks.push(chunk));
        res.on('end', () => {
          resolve({
            status: res.statusCode ?? 0,
            headers: res.headers,
            body: Buffer.concat(chunks).toString('utf-8'),
          });
        });
      },
    );
    req.on('error', reject);
    req.on('timeout', () => {
      req.destroy();
      reject(new Error('Request timed out'));
    });
    req.write(body);
    req.end();
  });
}

function sendDelete(
  url: string,
  sessionId: string,
  token: string | undefined,
): Promise<void> {
  return new Promise((resolve) => {
    const parsed = new URL(url);
    const headers: Record<string, string> = {
      'mcp-session-id': sessionId,
    };
    if (token) {
      headers['Authorization'] = `Bearer ${token}`;
    }

    const req = http.request(
      {
        hostname: parsed.hostname,
        port: parsed.port,
        path: parsed.pathname,
        method: 'DELETE',
        headers,
        timeout: 5_000,
      },
      () => resolve(),
    );
    req.on('error', () => resolve()); // Best effort cleanup
    req.on('timeout', () => {
      req.destroy();
      resolve();
    });
    req.end();
  });
}

/**
 * Extract JSON-RPC messages from an HTTP response body.
 * Handles both plain JSON and SSE (text/event-stream) formats.
 */
function extractJsonRpcMessages(contentType: string | undefined, body: string): string[] {
  if (contentType?.includes('text/event-stream')) {
    // Parse SSE: extract data: lines
    const messages: string[] = [];
    for (const line of body.split('\n')) {
      if (line.startsWith('data: ')) {
        messages.push(line.slice(6));
      }
    }
    return messages;
  }
  // Plain JSON response
  return [body];
}

/**
 * STDIO-to-Streamable-HTTP MCP bridge.
 *
 * Reads JSON-RPC messages line-by-line from stdin, POSTs them to
 * mcplocal's project endpoint, and writes responses to stdout.
 */
export async function runMcpBridge(opts: McpBridgeOptions): Promise<void> {
  const { projectName, mcplocalUrl, token, stdin, stdout, stderr } = opts;
  const endpointUrl = `${mcplocalUrl.replace(/\/$/, '')}/projects/${encodeURIComponent(projectName)}/mcp`;

  let sessionId: string | undefined;

  const rl = createInterface({ input: stdin, crlfDelay: Infinity });

  for await (const line of rl) {
    const trimmed = line.trim();
    if (!trimmed) continue;

    try {
      const result = await postJsonRpc(endpointUrl, trimmed, sessionId, token);

      // Capture session ID from first response
      if (!sessionId) {
        const sid = result.headers['mcp-session-id'];
        if (typeof sid === 'string') {
          sessionId = sid;
        }
      }

      if (result.status >= 400) {
        stderr.write(`MCP bridge error: HTTP ${result.status}: ${result.body}\n`);
      }

      // Handle both plain JSON and SSE responses
      const messages = extractJsonRpcMessages(result.headers['content-type'], result.body);
      for (const msg of messages) {
        const trimmedMsg = msg.trim();
        if (trimmedMsg) {
          stdout.write(trimmedMsg + '\n');
        }
      }
    } catch (err) {
      stderr.write(`MCP bridge error: ${err instanceof Error ? err.message : String(err)}\n`);
    }
  }

  // stdin closed — cleanup session
  if (sessionId) {
    await sendDelete(endpointUrl, sessionId, token);
  }
}

export interface McpCommandDeps {
  getProject: () => string | undefined;
  configLoader?: () => { mcplocalUrl: string };
  credentialsLoader?: () => { token: string } | null;
}

export function createMcpCommand(deps: McpCommandDeps): Command {
  const cmd = new Command('mcp')
    .description('MCP STDIO transport bridge — connects stdin/stdout to a project MCP endpoint')
    .passThroughOptions()
    .option('-p, --project <name>', 'Project name')
    .action(async (opts: { project?: string }) => {
      // Accept -p/--project on the command itself, or fall back to global --project
      const projectName = opts.project ?? deps.getProject();
      if (!projectName) {
        process.stderr.write('Error: --project is required for the mcp command\n');
        process.exitCode = 1;
        return;
      }

      let mcplocalUrl = 'http://localhost:3200';
      if (deps.configLoader) {
        mcplocalUrl = deps.configLoader().mcplocalUrl;
      } else {
        try {
          const { loadConfig } = await import('../config/index.js');
          mcplocalUrl = loadConfig().mcplocalUrl;
        } catch {
          // Use default
        }
      }

      let token: string | undefined;
      if (deps.credentialsLoader) {
        token = deps.credentialsLoader()?.token;
      } else {
        try {
          const { loadCredentials } = await import('../auth/index.js');
          token = loadCredentials()?.token;
        } catch {
          // No credentials
        }
      }

      await runMcpBridge({
        projectName,
        mcplocalUrl,
        token,
        stdin: process.stdin,
        stdout: process.stdout,
        stderr: process.stderr,
      });
    });

  return cmd;
}
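The SSE branch of `extractJsonRpcMessages` is easy to exercise in isolation. The function is restated here so the example runs standalone:

```typescript
// Restated from mcp.ts above so the example is self-contained.
function extractJsonRpcMessages(contentType: string | undefined, body: string): string[] {
  if (contentType?.includes('text/event-stream')) {
    // SSE: keep only the payload of each "data: " line.
    const messages: string[] = [];
    for (const line of body.split('\n')) {
      if (line.startsWith('data: ')) {
        messages.push(line.slice(6));
      }
    }
    return messages;
  }
  // Anything else is treated as a single plain JSON body.
  return [body];
}

// An SSE body yields only its data payloads; a plain JSON body passes through.
const sseBody = 'event: message\ndata: {"jsonrpc":"2.0","id":1,"result":{}}\n\n';
const fromSse = extractJsonRpcMessages('text/event-stream', sseBody);
// fromSse => ['{"jsonrpc":"2.0","id":1,"result":{}}']
const fromJson = extractJsonRpcMessages('application/json', '{"jsonrpc":"2.0","id":2,"result":{}}');
// fromJson => ['{"jsonrpc":"2.0","id":2,"result":{}}']
```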
65
src/cli/src/commands/project-ops.ts
Normal file
@@ -0,0 +1,65 @@
import { Command } from 'commander';
import type { ApiClient } from '../api-client.js';
import { resolveNameOrId, resolveResource } from './shared.js';

export interface ProjectOpsDeps {
  client: ApiClient;
  log: (...args: string[]) => void;
  getProject: () => string | undefined;
}

function requireProject(deps: ProjectOpsDeps): string {
  const project = deps.getProject();
  if (!project) {
    deps.log('Error: --project <name> is required for this command.');
    process.exitCode = 1;
    throw new Error('--project required');
  }
  return project;
}

export function createAttachServerCommand(deps: ProjectOpsDeps): Command {
  const { client, log } = deps;

  return new Command('attach-server')
    .description('Attach a server to a project (requires --project)')
    .argument('<server-name>', 'Server name to attach')
    .action(async (serverName: string) => {
      const projectName = requireProject(deps);
      const projectId = await resolveNameOrId(client, 'projects', projectName);
      await client.post(`/api/v1/projects/${projectId}/servers`, { server: serverName });
      log(`server '${serverName}' attached to project '${projectName}'`);
    });
}

export function createDetachServerCommand(deps: ProjectOpsDeps): Command {
  const { client, log } = deps;

  return new Command('detach-server')
    .description('Detach a server from a project (requires --project)')
    .argument('<server-name>', 'Server name to detach')
    .action(async (serverName: string) => {
      const projectName = requireProject(deps);
      const projectId = await resolveNameOrId(client, 'projects', projectName);
      await client.delete(`/api/v1/projects/${projectId}/servers/${serverName}`);
      log(`server '${serverName}' detached from project '${projectName}'`);
    });
}

export function createApproveCommand(deps: ProjectOpsDeps): Command {
  const { client, log } = deps;

  return new Command('approve')
    .description('Approve a pending prompt request (atomic: delete request, create prompt)')
    .argument('<resource>', 'Resource type (promptrequest)')
    .argument('<name>', 'Resource name or ID')
    .action(async (resourceArg: string, nameOrId: string) => {
      const resource = resolveResource(resourceArg);
      if (resource !== 'promptrequests') {
        throw new Error(`approve is only supported for 'promptrequest', got '${resourceArg}'`);
      }
      const id = await resolveNameOrId(client, 'promptrequests', nameOrId);
      const prompt = await client.post<{ id: string; name: string }>(`/api/v1/promptrequests/${id}/approve`, {});
      log(`prompt request approved → prompt '${prompt.name}' created (id: ${prompt.id})`);
    });
}
@@ -1,129 +0,0 @@
import { Command } from 'commander';
import type { ApiClient } from '../api-client.js';

interface Project {
  id: string;
  name: string;
  description: string;
  ownerId: string;
  createdAt: string;
}

interface Profile {
  id: string;
  name: string;
  serverId: string;
}

export interface ProjectCommandDeps {
  client: ApiClient;
  log: (...args: unknown[]) => void;
}

export function createProjectCommand(deps: ProjectCommandDeps): Command {
  const { client, log } = deps;

  const cmd = new Command('project')
    .alias('projects')
    .alias('proj')
    .description('Manage mcpctl projects');

  cmd
    .command('list')
    .alias('ls')
    .description('List all projects')
    .option('-o, --output <format>', 'Output format (table, json)', 'table')
    .action(async (opts: { output: string }) => {
      const projects = await client.get<Project[]>('/api/v1/projects');
      if (opts.output === 'json') {
        log(JSON.stringify(projects, null, 2));
        return;
      }
      if (projects.length === 0) {
        log('No projects found.');
        return;
      }
      log('ID\tNAME\tDESCRIPTION');
      for (const p of projects) {
        log(`${p.id}\t${p.name}\t${p.description || '-'}`);
      }
    });

  cmd
    .command('create <name>')
    .description('Create a new project')
    .option('-d, --description <text>', 'Project description', '')
    .action(async (name: string, opts: { description: string }) => {
      const project = await client.post<Project>('/api/v1/projects', {
        name,
        description: opts.description,
      });
      log(`Project '${project.name}' created (id: ${project.id})`);
    });

  cmd
    .command('delete <id>')
    .alias('rm')
    .description('Delete a project')
    .action(async (id: string) => {
      await client.delete(`/api/v1/projects/${id}`);
      log(`Project '${id}' deleted.`);
    });

  cmd
    .command('show <id>')
    .description('Show project details')
    .action(async (id: string) => {
      const project = await client.get<Project>(`/api/v1/projects/${id}`);
      log(`Name: ${project.name}`);
      log(`ID: ${project.id}`);
      log(`Description: ${project.description || '-'}`);
      log(`Owner: ${project.ownerId}`);
      log(`Created: ${project.createdAt}`);

      try {
        const profiles = await client.get<Profile[]>(`/api/v1/projects/${id}/profiles`);
        if (profiles.length > 0) {
          log('\nProfiles:');
          for (const p of profiles) {
            log(`  - ${p.name} (id: ${p.id})`);
          }
        } else {
          log('\nNo profiles assigned.');
        }
      } catch {
        // Profiles endpoint may not be available
      }
    });

  cmd
    .command('profiles <id>')
    .description('List profiles assigned to a project')
    .option('-o, --output <format>', 'Output format (table, json)', 'table')
    .action(async (id: string, opts: { output: string }) => {
      const profiles = await client.get<Profile[]>(`/api/v1/projects/${id}/profiles`);
      if (opts.output === 'json') {
        log(JSON.stringify(profiles, null, 2));
        return;
      }
      if (profiles.length === 0) {
        log('No profiles assigned.');
        return;
      }
      log('ID\tNAME\tSERVER');
      for (const p of profiles) {
        log(`${p.id}\t${p.name}\t${p.serverId}`);
      }
    });

  cmd
    .command('set-profiles <id>')
    .description('Set the profiles assigned to a project')
    .argument('<profileIds...>', 'Profile IDs to assign')
    .action(async (id: string, profileIds: string[]) => {
      await client.put(`/api/v1/projects/${id}/profiles`, { profileIds });
      log(`Set ${profileIds.length} profile(s) for project '${id}'.`);
    });

  return cmd;
}
@@ -1,103 +0,0 @@
|
||||
import { Command } from 'commander';
|
||||
import type { ApiClient } from '../api-client.js';
|
||||
|
||||
export interface SetupPromptDeps {
|
||||
input: (message: string) => Promise<string>;
|
||||
password: (message: string) => Promise<string>;
|
||||
select: <T extends string>(message: string, choices: Array<{ name: string; value: T }>) => Promise<T>;
|
||||
confirm: (message: string) => Promise<boolean>;
|
||||
}
|
||||
|
||||
export interface SetupCommandDeps {
|
||||
client: ApiClient;
|
||||
prompt: SetupPromptDeps;
|
||||
log: (...args: unknown[]) => void;
|
||||
}
|
||||
|
||||
export function createSetupCommand(deps: SetupCommandDeps): Command {
|
||||
const { client, prompt, log } = deps;
|
||||
|
||||
return new Command('setup')
|
||||
.description('Interactive wizard for configuring an MCP server')
|
||||
    .argument('[server-name]', 'Server name to set up (will prompt if not given)')
    .action(async (serverName?: string) => {
      log('MCP Server Setup Wizard\n');

      // Step 1: Server name
      const name = serverName ?? await prompt.input('Server name (lowercase, hyphens allowed):');
      if (!name) {
        log('Setup cancelled.');
        return;
      }

      // Step 2: Transport
      const transport = await prompt.select('Transport type:', [
        { name: 'STDIO (command-line process)', value: 'STDIO' as const },
        { name: 'SSE (Server-Sent Events over HTTP)', value: 'SSE' as const },
        { name: 'Streamable HTTP', value: 'STREAMABLE_HTTP' as const },
      ]);

      // Step 3: Package or image
      const packageName = await prompt.input('NPM package name (or leave empty):');
      const dockerImage = await prompt.input('Docker image (or leave empty):');

      // Step 4: Description
      const description = await prompt.input('Description:');

      // Step 5: Create the server
      const serverData: Record<string, unknown> = {
        name,
        transport,
        description,
      };
      if (packageName) serverData.packageName = packageName;
      if (dockerImage) serverData.dockerImage = dockerImage;

      let server: { id: string; name: string };
      try {
        server = await client.post<{ id: string; name: string }>('/api/v1/servers', serverData);
        log(`\nServer '${server.name}' created.`);
      } catch (err) {
        log(`\nFailed to create server: ${err instanceof Error ? err.message : err}`);
        return;
      }

      // Step 6: Create a profile with env vars
      const createProfile = await prompt.confirm('Create a profile with environment variables?');
      if (!createProfile) {
        log('\nSetup complete!');
        return;
      }

      const profileName = await prompt.input('Profile name:') || 'default';

      // Collect env vars
      const envOverrides: Record<string, string> = {};
      let addMore = true;
      while (addMore) {
        const envName = await prompt.input('Environment variable name (empty to finish):');
        if (!envName) break;

        const isSecret = await prompt.confirm(`Is '${envName}' a secret (e.g., API key)?`);
        const envValue = isSecret
          ? await prompt.password(`Value for ${envName}:`)
          : await prompt.input(`Value for ${envName}:`);

        envOverrides[envName] = envValue;
        addMore = await prompt.confirm('Add another environment variable?');
      }

      try {
        await client.post('/api/v1/profiles', {
          name: profileName,
          serverId: server.id,
          envOverrides,
        });
        log(`Profile '${profileName}' created for server '${name}'.`);
      } catch (err) {
        log(`Failed to create profile: ${err instanceof Error ? err.message : err}`);
      }

      log('\nSetup complete!');
    });
}

81
src/cli/src/commands/shared.ts
Normal file
@@ -0,0 +1,81 @@
import type { ApiClient } from '../api-client.js';

export const RESOURCE_ALIASES: Record<string, string> = {
  server: 'servers',
  srv: 'servers',
  project: 'projects',
  proj: 'projects',
  instance: 'instances',
  inst: 'instances',
  secret: 'secrets',
  sec: 'secrets',
  template: 'templates',
  tpl: 'templates',
  user: 'users',
  group: 'groups',
  rbac: 'rbac',
  'rbac-definition': 'rbac',
  'rbac-binding': 'rbac',
  prompt: 'prompts',
  prompts: 'prompts',
  promptrequest: 'promptrequests',
  promptrequests: 'promptrequests',
  pr: 'promptrequests',
};

export function resolveResource(name: string): string {
  const lower = name.toLowerCase();
  return RESOURCE_ALIASES[lower] ?? lower;
}
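For illustration, the alias lookup behaves like this minimal, self-contained sketch (only a subset of the map is reproduced here):

```typescript
// Subset of RESOURCE_ALIASES, inlined so the snippet runs on its own
const aliases: Record<string, string> = {
  srv: 'servers',
  proj: 'projects',
  pr: 'promptrequests',
};

function resolve(name: string): string {
  const lower = name.toLowerCase();
  // Unknown names pass through unchanged, so canonical plurals keep working
  return aliases[lower] ?? lower;
}

console.log(resolve('SRV'));     // servers
console.log(resolve('widgets')); // widgets
```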

/** Resolve a name-or-ID to an ID. CUIDs pass through; names are looked up. */
export async function resolveNameOrId(
  client: ApiClient,
  resource: string,
  nameOrId: string,
): Promise<string> {
  // CUIDs start with 'c' followed by 24+ alphanumeric chars
  if (/^c[a-z0-9]{24}/.test(nameOrId)) {
    return nameOrId;
  }
  // Users resolve by email, not name
  if (resource === 'users') {
    const items = await client.get<Array<{ id: string; email: string }>>(`/api/v1/${resource}`);
    const match = items.find((item) => item.email === nameOrId);
    if (match) return match.id;
    throw new Error(`user '${nameOrId}' not found`);
  }
  const items = await client.get<Array<Record<string, unknown>>>(`/api/v1/${resource}`);
  const match = items.find((item) => {
    // Instances use server.name, other resources use name directly
    if (resource === 'instances') {
      const server = item.server as { name?: string } | undefined;
      return server?.name === nameOrId;
    }
    return item.name === nameOrId;
  });
  if (match) return match.id as string;
  throw new Error(`${resource.replace(/s$/, '')} '${nameOrId}' not found`);
}
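The CUID short-circuit can be exercised in isolation; the sample values below are invented for illustration:

```typescript
// Same heuristic as resolveNameOrId: a CUID is 'c' followed by 24+ [a-z0-9] chars
const looksLikeCuid = (s: string): boolean => /^c[a-z0-9]{24}/.test(s);

console.log(looksLikeCuid('clx9k2m4p0000abcdefghijkl')); // true — passed through as an ID
console.log(looksLikeCuid('my-server'));                 // false — falls back to a name lookup
```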

/** Strip internal/read-only fields from an API response to make it apply-compatible. */
export function stripInternalFields(obj: Record<string, unknown>): Record<string, unknown> {
  const result = { ...obj };
  for (const key of ['id', 'createdAt', 'updatedAt', 'version', 'ownerId', 'summary', 'chapters']) {
    delete result[key];
  }
  // Strip relationship joins that aren't part of the resource spec
  // (just as a k8s namespace spec doesn't list its deployments)
  if ('servers' in result && Array.isArray(result.servers)) {
    delete result.servers;
  }
  if ('owner' in result && typeof result.owner === 'object') {
    delete result.owner;
  }
  if ('members' in result && Array.isArray(result.members)) {
    delete result.members;
  }
  if ('project' in result && typeof result.project === 'object' && result.project !== null) {
    delete result.project;
  }
  return result;
}
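A trimmed sketch of the behavior, covering only the scalar key list (the relation-stripping branches are omitted; input values are invented):

```typescript
function stripInternalFields(obj: Record<string, unknown>): Record<string, unknown> {
  const result = { ...obj };
  // Same read-only key list as the full implementation
  for (const key of ['id', 'createdAt', 'updatedAt', 'version', 'ownerId', 'summary', 'chapters']) {
    delete result[key];
  }
  return result;
}

const cleaned = stripInternalFields({ id: 'c123', name: 'ha-server', createdAt: '2024-01-01' });
console.log(Object.keys(cleaned).join(',')); // name
```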
@@ -7,11 +7,32 @@ import type { CredentialsDeps } from '../auth/index.js';
|
||||
import { formatJson, formatYaml } from '../formatters/index.js';
|
||||
import { APP_VERSION } from '@mcpctl/shared';
|
||||
|
||||
// ANSI helpers
|
||||
const GREEN = '\x1b[32m';
|
||||
const RED = '\x1b[31m';
|
||||
const DIM = '\x1b[2m';
|
||||
const RESET = '\x1b[0m';
|
||||
const CLEAR_LINE = '\x1b[2K\r';
|
||||
|
||||
interface ProvidersInfo {
|
||||
providers: string[];
|
||||
tiers: { fast: string[]; heavy: string[] };
|
||||
health: Record<string, boolean>;
|
||||
}
|
||||
|
||||
export interface StatusCommandDeps {
|
||||
configDeps: Partial<ConfigLoaderDeps>;
|
||||
credentialsDeps: Partial<CredentialsDeps>;
|
||||
log: (...args: string[]) => void;
|
||||
write: (text: string) => void;
|
||||
checkHealth: (url: string) => Promise<boolean>;
|
||||
/** Check LLM health via mcplocal's /llm/health endpoint */
|
||||
checkLlm: (mcplocalUrl: string) => Promise<string>;
|
||||
/** Fetch available models from mcplocal's /llm/models endpoint */
|
||||
fetchModels: (mcplocalUrl: string) => Promise<string[]>;
|
||||
/** Fetch provider tier info from mcplocal's /llm/providers endpoint */
|
||||
fetchProviders: (mcplocalUrl: string) => Promise<ProvidersInfo | null>;
|
||||
isTTY: boolean;
|
||||
}
|
||||
|
||||
function defaultCheckHealth(url: string): Promise<boolean> {
|
||||
@@ -28,15 +49,114 @@ function defaultCheckHealth(url: string): Promise<boolean> {
  });
}

/**
 * Check LLM health by querying mcplocal's /llm/health endpoint.
 * This tests the actual provider running inside the daemon (uses persistent ACP for gemini, etc.)
 */
function defaultCheckLlm(mcplocalUrl: string): Promise<string> {
  return new Promise((resolve) => {
    const req = http.get(`${mcplocalUrl}/llm/health`, { timeout: 45000 }, (res) => {
      const chunks: Buffer[] = [];
      res.on('data', (chunk: Buffer) => chunks.push(chunk));
      res.on('end', () => {
        try {
          const body = JSON.parse(Buffer.concat(chunks).toString('utf-8')) as { status: string; error?: string };
          if (body.status === 'ok') {
            resolve('ok');
          } else if (body.status === 'not configured') {
            resolve('not configured');
          } else if (body.error) {
            resolve(body.error.slice(0, 80));
          } else {
            resolve(body.status);
          }
        } catch {
          resolve('invalid response');
        }
      });
    });
    req.on('error', () => resolve('mcplocal unreachable'));
    req.on('timeout', () => { req.destroy(); resolve('timeout'); });
  });
}

function defaultFetchModels(mcplocalUrl: string): Promise<string[]> {
  return new Promise((resolve) => {
    const req = http.get(`${mcplocalUrl}/llm/models`, { timeout: 5000 }, (res) => {
      const chunks: Buffer[] = [];
      res.on('data', (chunk: Buffer) => chunks.push(chunk));
      res.on('end', () => {
        try {
          const body = JSON.parse(Buffer.concat(chunks).toString('utf-8')) as { models?: string[] };
          resolve(body.models ?? []);
        } catch {
          resolve([]);
        }
      });
    });
    req.on('error', () => resolve([]));
    req.on('timeout', () => { req.destroy(); resolve([]); });
  });
}

function defaultFetchProviders(mcplocalUrl: string): Promise<ProvidersInfo | null> {
  return new Promise((resolve) => {
    const req = http.get(`${mcplocalUrl}/llm/providers`, { timeout: 5000 }, (res) => {
      const chunks: Buffer[] = [];
      res.on('data', (chunk: Buffer) => chunks.push(chunk));
      res.on('end', () => {
        try {
          const body = JSON.parse(Buffer.concat(chunks).toString('utf-8')) as ProvidersInfo;
          resolve(body);
        } catch {
          resolve(null);
        }
      });
    });
    req.on('error', () => resolve(null));
    req.on('timeout', () => { req.destroy(); resolve(null); });
  });
}

const SPINNER_FRAMES = ['⠋', '⠙', '⠹', '⠸', '⠼', '⠴', '⠦', '⠧', '⠇', '⠏'];

const defaultDeps: StatusCommandDeps = {
  configDeps: {},
  credentialsDeps: {},
  log: (...args) => console.log(...args),
  write: (text) => process.stdout.write(text),
  checkHealth: defaultCheckHealth,
  checkLlm: defaultCheckLlm,
  fetchModels: defaultFetchModels,
  fetchProviders: defaultFetchProviders,
  isTTY: process.stdout.isTTY ?? false,
};

/** Determine LLM label from config (handles both legacy and multi-provider formats). */
function getLlmLabel(llm: unknown): string | null {
  if (!llm || typeof llm !== 'object') return null;
  // Legacy format: { provider, model }
  if ('provider' in llm) {
    const legacy = llm as { provider: string; model?: string };
    if (legacy.provider === 'none') return null;
    return `${legacy.provider}${legacy.model ? ` / ${legacy.model}` : ''}`;
  }
  // Multi-provider format: { providers: [...] }
  if ('providers' in llm) {
    const multi = llm as { providers: Array<{ name: string; type: string; tier?: string }> };
    if (multi.providers.length === 0) return null;
    return multi.providers.map((p) => `${p.name}${p.tier ? ` (${p.tier})` : ''}`).join(', ');
  }
  return null;
}

/** Check if config uses multi-provider format. */
function isMultiProvider(llm: unknown): boolean {
  return !!llm && typeof llm === 'object' && 'providers' in llm;
}
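To make the two accepted config shapes concrete, the labeling logic is copied into a standalone snippet below (the sample provider names are invented):

```typescript
function getLlmLabel(llm: unknown): string | null {
  if (!llm || typeof llm !== 'object') return null;
  if ('provider' in llm) {
    // Legacy single-provider shape: { provider, model }
    const legacy = llm as { provider: string; model?: string };
    if (legacy.provider === 'none') return null;
    return `${legacy.provider}${legacy.model ? ` / ${legacy.model}` : ''}`;
  }
  if ('providers' in llm) {
    // Multi-provider shape: { providers: [...] }
    const multi = llm as { providers: Array<{ name: string; tier?: string }> };
    if (multi.providers.length === 0) return null;
    return multi.providers.map((p) => `${p.name}${p.tier ? ` (${p.tier})` : ''}`).join(', ');
  }
  return null;
}

console.log(getLlmLabel({ provider: 'ollama', model: 'llama3' }));
// ollama / llama3
console.log(getLlmLabel({ providers: [{ name: 'vllm-local', tier: 'fast' }] }));
// vllm-local (fast)
```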

export function createStatusCommand(deps?: Partial<StatusCommandDeps>): Command {
  const { configDeps, credentialsDeps, log, checkHealth } = { ...defaultDeps, ...deps };
  const { configDeps, credentialsDeps, log, write, checkHealth, checkLlm, fetchModels, fetchProviders, isTTY } = { ...defaultDeps, ...deps };

  return new Command('status')
    .description('Show mcpctl status and connectivity')
@@ -45,33 +165,124 @@ export function createStatusCommand(deps?: Partial<StatusCommandDeps>): Command
      const config = loadConfig(configDeps);
      const creds = loadCredentials(credentialsDeps);

      const llmLabel = getLlmLabel(config.llm);
      const multiProvider = isMultiProvider(config.llm);

      if (opts.output !== 'table') {
        // JSON/YAML: run everything in parallel, wait, output at once
        const [mcplocalReachable, mcpdReachable, llmStatus, providersInfo] = await Promise.all([
          checkHealth(config.mcplocalUrl),
          checkHealth(config.mcpdUrl),
          llmLabel ? checkLlm(config.mcplocalUrl) : Promise.resolve(null),
          multiProvider ? fetchProviders(config.mcplocalUrl) : Promise.resolve(null),
        ]);

        const llm = llmLabel
          ? llmStatus === 'ok' ? llmLabel : `${llmLabel} (${llmStatus})`
          : null;

        const status = {
          version: APP_VERSION,
          mcplocalUrl: config.mcplocalUrl,
          mcplocalReachable,
          mcpdUrl: config.mcpdUrl,
          mcpdReachable,
          auth: creds ? { user: creds.user } : null,
          registries: config.registries,
          outputFormat: config.outputFormat,
          llm,
          llmStatus,
          ...(providersInfo ? { providers: providersInfo } : {}),
        };

        log(opts.output === 'json' ? formatJson(status) : formatYaml(status));
        return;
      }

      // Table format: print lines progressively, LLM last with spinner

      // Fast health checks first
      const [mcplocalReachable, mcpdReachable] = await Promise.all([
        checkHealth(config.mcplocalUrl),
        checkHealth(config.mcpdUrl),
      ]);

      const status = {
        version: APP_VERSION,
        mcplocalUrl: config.mcplocalUrl,
        mcplocalReachable,
        mcpdUrl: config.mcpdUrl,
        mcpdReachable,
        auth: creds ? { user: creds.user } : null,
        registries: config.registries,
        outputFormat: config.outputFormat,
      };
      log(`mcpctl v${APP_VERSION}`);
      log(`mcplocal: ${config.mcplocalUrl} (${mcplocalReachable ? 'connected' : 'unreachable'})`);
      log(`mcpd: ${config.mcpdUrl} (${mcpdReachable ? 'connected' : 'unreachable'})`);
      log(`Auth: ${creds ? `logged in as ${creds.user}` : 'not logged in'}`);
      log(`Registries: ${config.registries.join(', ')}`);
      log(`Output: ${config.outputFormat}`);

      if (opts.output === 'json') {
        log(formatJson(status));
      } else if (opts.output === 'yaml') {
        log(formatYaml(status));
      if (!llmLabel) {
        log(`LLM: not configured (run 'mcpctl config setup')`);
        return;
      }

      // LLM check + models + providers fetch in parallel
      const llmPromise = checkLlm(config.mcplocalUrl);
      const modelsPromise = fetchModels(config.mcplocalUrl);
      const providersPromise = multiProvider ? fetchProviders(config.mcplocalUrl) : Promise.resolve(null);

      if (isTTY) {
        let frame = 0;
        const interval = setInterval(() => {
          write(`${CLEAR_LINE}LLM: ${DIM}${SPINNER_FRAMES[frame % SPINNER_FRAMES.length]} checking...${RESET}`);
          frame++;
        }, 80);

        const [llmStatus, models, providersInfo] = await Promise.all([llmPromise, modelsPromise, providersPromise]);
        clearInterval(interval);

        if (providersInfo && (providersInfo.tiers.fast.length > 0 || providersInfo.tiers.heavy.length > 0)) {
          // Tiered display with per-provider health
          write(`${CLEAR_LINE}`);
          for (const tier of ['fast', 'heavy'] as const) {
            const names = providersInfo.tiers[tier];
            if (names.length === 0) continue;
            const label = tier === 'fast' ? 'LLM (fast): ' : 'LLM (heavy):';
            const parts = names.map((n) => {
              const ok = providersInfo.health[n];
              return ok ? `${n} ${GREEN}✓${RESET}` : `${n} ${RED}✗${RESET}`;
            });
            log(`${label} ${parts.join(', ')}`);
          }
        } else {
          // Legacy single provider display
          if (llmStatus === 'ok' || llmStatus === 'ok (key stored)') {
            write(`${CLEAR_LINE}LLM: ${llmLabel} ${GREEN}✓ ${llmStatus}${RESET}\n`);
          } else {
            write(`${CLEAR_LINE}LLM: ${llmLabel} ${RED}✗ ${llmStatus}${RESET}\n`);
          }
        }
        if (models.length > 0) {
          log(`${DIM} Available: ${models.join(', ')}${RESET}`);
        }
      } else {
        log(`mcpctl v${status.version}`);
        log(`mcplocal: ${status.mcplocalUrl} (${mcplocalReachable ? 'connected' : 'unreachable'})`);
        log(`mcpd: ${status.mcpdUrl} (${mcpdReachable ? 'connected' : 'unreachable'})`);
        log(`Auth: ${creds ? `logged in as ${creds.user}` : 'not logged in'}`);
        log(`Registries: ${status.registries.join(', ')}`);
        log(`Output: ${status.outputFormat}`);
        // Non-TTY: no spinner, just wait and print
        const [llmStatus, models, providersInfo] = await Promise.all([llmPromise, modelsPromise, providersPromise]);

        if (providersInfo && (providersInfo.tiers.fast.length > 0 || providersInfo.tiers.heavy.length > 0)) {
          for (const tier of ['fast', 'heavy'] as const) {
            const names = providersInfo.tiers[tier];
            if (names.length === 0) continue;
            const label = tier === 'fast' ? 'LLM (fast): ' : 'LLM (heavy):';
            const parts = names.map((n) => {
              const ok = providersInfo.health[n];
              return ok ? `${n} ✓` : `${n} ✗`;
            });
            log(`${label} ${parts.join(', ')}`);
          }
        } else {
          if (llmStatus === 'ok' || llmStatus === 'ok (key stored)') {
            log(`LLM: ${llmLabel} ✓ ${llmStatus}`);
          } else {
            log(`LLM: ${llmLabel} ✗ ${llmStatus}`);
          }
        }
        if (models.length > 0) {
          log(`${DIM} Available: ${models.join(', ')}${RESET}`);
        }
      }
    });
}

@@ -1,4 +1,4 @@
export { McpctlConfigSchema, DEFAULT_CONFIG } from './schema.js';
export type { McpctlConfig } from './schema.js';
export { McpctlConfigSchema, LlmConfigSchema, LlmProviderEntrySchema, LlmMultiConfigSchema, LLM_PROVIDERS, LLM_TIERS, DEFAULT_CONFIG } from './schema.js';
export type { McpctlConfig, LlmConfig, LlmProviderEntry, LlmMultiConfig, LlmProviderName, LlmTier } from './schema.js';
export { loadConfig, saveConfig, mergeConfig, getConfigPath } from './loader.js';
export type { ConfigLoaderDeps } from './loader.js';

@@ -1,5 +1,50 @@
import { z } from 'zod';

export const LLM_PROVIDERS = ['gemini-cli', 'ollama', 'anthropic', 'openai', 'deepseek', 'vllm', 'none'] as const;
export type LlmProviderName = typeof LLM_PROVIDERS[number];

export const LLM_TIERS = ['fast', 'heavy'] as const;
export type LlmTier = typeof LLM_TIERS[number];

/** Legacy single-provider format. */
export const LlmConfigSchema = z.object({
  /** LLM provider name */
  provider: z.enum(LLM_PROVIDERS),
  /** Model name */
  model: z.string().optional(),
  /** Provider URL (for ollama, vllm, openai with custom endpoint) */
  url: z.string().optional(),
  /** Binary path override (for gemini-cli) */
  binaryPath: z.string().optional(),
}).strict();

export type LlmConfig = z.infer<typeof LlmConfigSchema>;

/** Multi-provider entry (advanced mode). */
export const LlmProviderEntrySchema = z.object({
  /** User-chosen name for this provider instance (e.g. "vllm-local") */
  name: z.string(),
  /** Provider type */
  type: z.enum(LLM_PROVIDERS),
  /** Model name */
  model: z.string().optional(),
  /** Provider URL (for ollama, vllm, openai with custom endpoint) */
  url: z.string().optional(),
  /** Binary path override (for gemini-cli) */
  binaryPath: z.string().optional(),
  /** Tier assignment */
  tier: z.enum(LLM_TIERS).optional(),
}).strict();

export type LlmProviderEntry = z.infer<typeof LlmProviderEntrySchema>;

/** Multi-provider format with providers array. */
export const LlmMultiConfigSchema = z.object({
  providers: z.array(LlmProviderEntrySchema).min(1),
}).strict();

export type LlmMultiConfig = z.infer<typeof LlmMultiConfigSchema>;

export const McpctlConfigSchema = z.object({
  /** mcplocal daemon endpoint (local LLM pre-processing proxy) */
  mcplocalUrl: z.string().default('http://localhost:3200'),
@@ -19,6 +64,8 @@ export const McpctlConfigSchema = z.object({
  outputFormat: z.enum(['table', 'json', 'yaml']).default('table'),
  /** Smithery API key */
  smitheryApiKey: z.string().optional(),
  /** LLM provider configuration — accepts legacy single-provider or multi-provider format */
  llm: z.union([LlmConfigSchema, LlmMultiConfigSchema]).optional(),
}).transform((cfg) => {
  // Backward compatibility: if old daemonUrl is set but mcplocalUrl wasn't explicitly changed,
  // use daemonUrl as mcplocalUrl

@@ -6,6 +6,29 @@ export function formatJson(data: unknown): string {
  return JSON.stringify(data, null, 2);
}

export function formatYaml(data: unknown): string {
  return yaml.dump(data, { lineWidth: 120, noRefs: true }).trimEnd();
/**
 * Reorder object keys so that long text fields (like `content`, `prompt`)
 * come last. This makes YAML output more readable when content spans
 * multiple lines.
 */
export function reorderKeys(obj: unknown): unknown {
  if (Array.isArray(obj)) return obj.map(reorderKeys);
  if (obj !== null && typeof obj === 'object') {
    const rec = obj as Record<string, unknown>;
    const lastKeys = ['content', 'prompt'];
    const ordered: Record<string, unknown> = {};
    for (const key of Object.keys(rec)) {
      if (!lastKeys.includes(key)) ordered[key] = reorderKeys(rec[key]);
    }
    for (const key of lastKeys) {
      if (key in rec) ordered[key] = rec[key];
    }
    return ordered;
  }
  return obj;
}

export function formatYaml(data: unknown): string {
  const reordered = reorderKeys(data);
  return yaml.dump(reordered, { lineWidth: 120, noRefs: true }).trimEnd();
}
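The effect of `reorderKeys` on YAML readability can be seen with a shallow sketch (the real function also recurses into nested objects and arrays; input values are invented):

```typescript
function reorderShallow(rec: Record<string, unknown>): Record<string, unknown> {
  const lastKeys = ['content', 'prompt']; // long text fields pushed to the end
  const ordered: Record<string, unknown> = {};
  for (const key of Object.keys(rec)) {
    if (!lastKeys.includes(key)) ordered[key] = rec[key];
  }
  for (const key of lastKeys) {
    if (key in rec) ordered[key] = rec[key];
  }
  return ordered;
}

const out = reorderShallow({ content: 'a very long body...', name: 'greeting', id: 'p1' });
console.log(Object.keys(out).join(',')); // name,id,content
```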

@@ -5,27 +5,31 @@ import { createConfigCommand } from './commands/config.js';
import { createStatusCommand } from './commands/status.js';
import { createGetCommand } from './commands/get.js';
import { createDescribeCommand } from './commands/describe.js';
import { createInstanceCommands } from './commands/instances.js';
import { createDeleteCommand } from './commands/delete.js';
import { createLogsCommand } from './commands/logs.js';
import { createApplyCommand } from './commands/apply.js';
import { createSetupCommand } from './commands/setup.js';
import { createClaudeCommand } from './commands/claude.js';
import { createProjectCommand } from './commands/project.js';
import { createCreateCommand } from './commands/create.js';
import { createEditCommand } from './commands/edit.js';
import { createBackupCommand, createRestoreCommand } from './commands/backup.js';
import { createLoginCommand, createLogoutCommand } from './commands/auth.js';
import { ApiClient } from './api-client.js';
import { createAttachServerCommand, createDetachServerCommand, createApproveCommand } from './commands/project-ops.js';
import { createMcpCommand } from './commands/mcp.js';
import { createPatchCommand } from './commands/patch.js';
import { ApiClient, ApiError } from './api-client.js';
import { loadConfig } from './config/index.js';
import { loadCredentials } from './auth/index.js';
import { resolveNameOrId } from './commands/shared.js';

export function createProgram(): Command {
  const program = new Command()
    .name(APP_NAME)
    .description('Manage MCP servers like kubectl manages containers')
    .version(APP_VERSION, '-v, --version')
    .option('-o, --output <format>', 'output format (table, json, yaml)', 'table')
    .enablePositionalOptions()
    .option('--daemon-url <url>', 'mcplocal daemon URL')
    .option('--direct', 'bypass mcplocal and connect directly to mcpd');
    .option('--direct', 'bypass mcplocal and connect directly to mcpd')
    .option('--project <name>', 'Target project for project commands');

  program.addCommand(createConfigCommand());
  program.addCommand(createStatusCommand());
  program.addCommand(createLoginCommand());
  program.addCommand(createLogoutCommand());
@@ -45,15 +49,63 @@ export function createProgram(): Command {

  const client = new ApiClient({ baseUrl, token: creds?.token ?? undefined });

  const fetchResource = async (resource: string, id?: string): Promise<unknown[]> => {
    if (id) {
  program.addCommand(createConfigCommand(undefined, {
    client,
    credentialsDeps: {},
    log: (...args) => console.log(...args),
  }));

  const fetchResource = async (resource: string, nameOrId?: string, opts?: { project?: string; all?: boolean }): Promise<unknown[]> => {
    const projectName = opts?.project ?? program.opts().project as string | undefined;

    // --project scoping for servers and instances
    if (projectName && !nameOrId && (resource === 'servers' || resource === 'instances')) {
      const projectId = await resolveNameOrId(client, 'projects', projectName);
      if (resource === 'servers') {
        return client.get<unknown[]>(`/api/v1/projects/${projectId}/servers`);
      }
      // instances: fetch project servers, then filter instances by serverId
      const projectServers = await client.get<Array<{ id: string }>>(`/api/v1/projects/${projectId}/servers`);
      const serverIds = new Set(projectServers.map((s) => s.id));
      const allInstances = await client.get<Array<{ serverId: string }>>(`/api/v1/instances`);
      return allInstances.filter((inst) => serverIds.has(inst.serverId));
    }

    // --project scoping for prompts and promptrequests
    if (!nameOrId && (resource === 'prompts' || resource === 'promptrequests')) {
      if (projectName) {
        return client.get<unknown[]>(`/api/v1/${resource}?project=${encodeURIComponent(projectName)}`);
      }
      // Default: global-only. --all (-A) shows everything.
      if (!opts?.all) {
        return client.get<unknown[]>(`/api/v1/${resource}?scope=global`);
      }
    }

    if (nameOrId) {
      // Glob pattern — use query param filtering
      if (nameOrId.includes('*')) {
        return client.get<unknown[]>(`/api/v1/${resource}?name=${encodeURIComponent(nameOrId)}`);
      }
      let id: string;
      try {
        id = await resolveNameOrId(client, resource, nameOrId);
      } catch {
        id = nameOrId;
      }
      const item = await client.get(`/api/v1/${resource}/${id}`);
      return [item];
    }
    return client.get<unknown[]>(`/api/v1/${resource}`);
  };

  const fetchSingleResource = async (resource: string, id: string): Promise<unknown> => {
  const fetchSingleResource = async (resource: string, nameOrId: string): Promise<unknown> => {
    let id: string;
    try {
      id = await resolveNameOrId(client, resource, nameOrId);
    } catch {
      id = nameOrId;
    }
    return client.get(`/api/v1/${resource}/${id}`);
  };

@@ -63,11 +115,28 @@ export function createProgram(): Command {
  }));

  program.addCommand(createDescribeCommand({
    client,
    fetchResource: fetchSingleResource,
    fetchInspect: async (id: string) => client.get(`/api/v1/instances/${id}/inspect`),
    log: (...args) => console.log(...args),
  }));

  program.addCommand(createInstanceCommands({
  program.addCommand(createDeleteCommand({
    client,
    log: (...args) => console.log(...args),
  }));

  program.addCommand(createLogsCommand({
    client,
    log: (...args) => console.log(...args),
  }));

  program.addCommand(createCreateCommand({
    client,
    log: (...args) => console.log(...args),
  }));

  program.addCommand(createEditCommand({
    client,
    log: (...args) => console.log(...args),
  }));
@@ -77,39 +146,7 @@ export function createProgram(): Command {
    log: (...args) => console.log(...args),
  }));

  program.addCommand(createSetupCommand({
    client,
    prompt: {
      async input(message) {
        const { default: inquirer } = await import('inquirer');
        const { answer } = await inquirer.prompt([{ type: 'input', name: 'answer', message }]);
        return answer as string;
      },
      async password(message) {
        const { default: inquirer } = await import('inquirer');
        const { answer } = await inquirer.prompt([{ type: 'password', name: 'answer', message }]);
        return answer as string;
      },
      async select(message, choices) {
        const { default: inquirer } = await import('inquirer');
        const { answer } = await inquirer.prompt([{ type: 'list', name: 'answer', message, choices }]);
        return answer;
      },
      async confirm(message) {
        const { default: inquirer } = await import('inquirer');
        const { answer } = await inquirer.prompt([{ type: 'confirm', name: 'answer', message }]);
        return answer as boolean;
      },
    },
    log: (...args) => console.log(...args),
  }));

  program.addCommand(createClaudeCommand({
    client,
    log: (...args) => console.log(...args),
  }));

  program.addCommand(createProjectCommand({
  program.addCommand(createPatchCommand({
    client,
    log: (...args) => console.log(...args),
  }));
@@ -124,6 +161,18 @@ export function createProgram(): Command {
    log: (...args) => console.log(...args),
  }));

  const projectOpsDeps = {
    client,
    log: (...args: string[]) => console.log(...args),
    getProject: () => program.opts().project as string | undefined,
  };
  program.addCommand(createAttachServerCommand(projectOpsDeps), { hidden: true });
  program.addCommand(createDetachServerCommand(projectOpsDeps), { hidden: true });
  program.addCommand(createApproveCommand(projectOpsDeps));
  program.addCommand(createMcpCommand({
    getProject: () => program.opts().project as string | undefined,
  }), { hidden: true });

  return program;
}

@@ -134,5 +183,35 @@ const isDirectRun =
  import.meta.url === `file://${process.argv[1]}`;

if (isDirectRun) {
  createProgram().parseAsync(process.argv);
  createProgram().parseAsync(process.argv).catch((err: unknown) => {
    if (err instanceof ApiError) {
      if (err.status === 401) {
        console.error("Error: you need to log in. Run 'mcpctl login' to authenticate.");
      } else if (err.status === 403) {
        console.error('Error: permission denied. You do not have access to this resource.');
      } else {
        let msg: string;
        try {
          const parsed = JSON.parse(err.body) as { error?: string; message?: string; details?: unknown };
          msg = parsed.error ?? parsed.message ?? err.body;
          if (parsed.details && Array.isArray(parsed.details)) {
            const issues = parsed.details as Array<{ message?: string; path?: string[] }>;
            const detail = issues.map((i) => {
              const path = i.path?.join('.') ?? '';
              return path ? `${path}: ${i.message}` : (i.message ?? '');
            }).filter(Boolean).join('; ');
            if (detail) msg += `: ${detail}`;
          }
        } catch {
          msg = err.body;
        }
        console.error(`Error: ${msg}`);
      }
    } else if (err instanceof Error) {
      console.error(`Error: ${err.message}`);
    } else {
      console.error(`Error: ${String(err)}`);
    }
    process.exit(1);
  });
}
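The validation-`details` flattening in the catch block can be factored into a small standalone helper; the sample payload below is invented for illustration:

```typescript
function formatApiErrorBody(body: string): string {
  try {
    const parsed = JSON.parse(body) as { error?: string; message?: string; details?: unknown };
    let msg = parsed.error ?? parsed.message ?? body;
    if (Array.isArray(parsed.details)) {
      // Flatten validation issues into "path: message" pairs
      const detail = (parsed.details as Array<{ message?: string; path?: string[] }>)
        .map((i) => (i.path?.length ? `${i.path.join('.')}: ${i.message}` : i.message ?? ''))
        .filter(Boolean)
        .join('; ');
      if (detail) msg += `: ${detail}`;
    }
    return msg;
  } catch {
    return body; // not JSON, show the raw body
  }
}

console.log(formatApiErrorBody('{"error":"Validation failed","details":[{"path":["name"],"message":"Required"}]}'));
// Validation failed: name: Required
```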
|
||||
@@ -21,6 +21,16 @@ beforeAll(async () => {
        res.writeHead(201, { 'Content-Type': 'application/json' });
        res.end(JSON.stringify({ id: 'srv-new', ...body }));
      });
    } else if (req.url === '/api/v1/servers/srv-1' && req.method === 'DELETE') {
      // Fastify rejects empty body with Content-Type: application/json
      const ct = req.headers['content-type'] ?? '';
      if (ct.includes('application/json')) {
        res.writeHead(400, { 'Content-Type': 'application/json' });
        res.end(JSON.stringify({ error: "Body cannot be empty when content-type is set to 'application/json'" }));
      } else {
        res.writeHead(204);
        res.end();
      }
    } else if (req.url === '/api/v1/missing' && req.method === 'GET') {
      res.writeHead(404, { 'Content-Type': 'application/json' });
      res.end(JSON.stringify({ error: 'Not found' }));
@@ -75,6 +85,12 @@ describe('ApiClient', () => {
    await expect(client.get('/anything')).rejects.toThrow();
  });

  it('performs DELETE without Content-Type header', async () => {
    const client = new ApiClient({ baseUrl: `http://localhost:${port}` });
    // Should succeed (204) because no Content-Type is sent on bodyless DELETE
    await expect(client.delete('/api/v1/servers/srv-1')).resolves.toBeUndefined();
  });

  it('sends Authorization header when token provided', async () => {
    // We need a separate server to check the header
    let receivedAuth = '';
@@ -24,9 +24,10 @@ describe('createProgram', () => {
    expect(status).toBeDefined();
  });

  it('has output option', () => {
  it('subcommands have output option', () => {
    const program = createProgram();
    const opt = program.options.find((o) => o.long === '--output');
    const get = program.commands.find((c) => c.name() === 'get');
    const opt = get?.options.find((o) => o.long === '--output');
    expect(opt).toBeDefined();
  });

@@ -86,9 +86,6 @@ servers:
servers:
  - name: test
    transport: STDIO
profiles:
  - name: default
    server: test
`);

    const cmd = createApplyCommand({ client, log });
@@ -97,52 +94,51 @@ profiles:
    expect(client.post).not.toHaveBeenCalled();
    expect(output.join('\n')).toContain('Dry run');
    expect(output.join('\n')).toContain('1 server(s)');
    expect(output.join('\n')).toContain('1 profile(s)');

    rmSync(tmpDir, { recursive: true, force: true });
  });

  it('applies profiles with server lookup', async () => {
  it('applies secrets', async () => {
    const configPath = join(tmpDir, 'config.yaml');
    writeFileSync(configPath, `
secrets:
  - name: ha-creds
    data:
      TOKEN: abc123
      URL: https://ha.local
`);

    const cmd = createApplyCommand({ client, log });
    await cmd.parseAsync([configPath], { from: 'user' });

    expect(client.post).toHaveBeenCalledWith('/api/v1/secrets', expect.objectContaining({
      name: 'ha-creds',
      data: { TOKEN: 'abc123', URL: 'https://ha.local' },
    }));
    expect(output.join('\n')).toContain('Created secret: ha-creds');

    rmSync(tmpDir, { recursive: true, force: true });
  });

  it('updates existing secrets', async () => {
    vi.mocked(client.get).mockImplementation(async (url: string) => {
      if (url === '/api/v1/servers') return [{ id: 'srv-1', name: 'slack' }];
      if (url === '/api/v1/secrets') return [{ id: 'sec-1', name: 'ha-creds' }];
      return [];
    });

    const configPath = join(tmpDir, 'config.yaml');
    writeFileSync(configPath, `
profiles:
  - name: default
    server: slack
    envOverrides:
      SLACK_TOKEN: "xoxb-test"
secrets:
  - name: ha-creds
    data:
      TOKEN: new-token
`);

    const cmd = createApplyCommand({ client, log });
    await cmd.parseAsync([configPath], { from: 'user' });

    expect(client.post).toHaveBeenCalledWith('/api/v1/profiles', expect.objectContaining({
      name: 'default',
      serverId: 'srv-1',
      envOverrides: { SLACK_TOKEN: 'xoxb-test' },
    }));
    expect(output.join('\n')).toContain('Created profile: default');

    rmSync(tmpDir, { recursive: true, force: true });
  });

  it('skips profiles when server not found', async () => {
    const configPath = join(tmpDir, 'config.yaml');
    writeFileSync(configPath, `
profiles:
  - name: default
    server: nonexistent
`);

    const cmd = createApplyCommand({ client, log });
    await cmd.parseAsync([configPath], { from: 'user' });

    expect(client.post).not.toHaveBeenCalled();
    expect(output.join('\n')).toContain("Skipping profile 'default'");
    expect(client.put).toHaveBeenCalledWith('/api/v1/secrets/sec-1', { data: { TOKEN: 'new-token' } });
    expect(output.join('\n')).toContain('Updated secret: ha-creds');

    rmSync(tmpDir, { recursive: true, force: true });
  });
@@ -163,4 +159,347 @@ projects:

    rmSync(tmpDir, { recursive: true, force: true });
  });

  it('applies users (no role field)', async () => {
    const configPath = join(tmpDir, 'config.yaml');
    writeFileSync(configPath, `
users:
  - email: alice@test.com
    password: password123
    name: Alice
`);

    const cmd = createApplyCommand({ client, log });
    await cmd.parseAsync([configPath], { from: 'user' });

    const callBody = vi.mocked(client.post).mock.calls[0]![1] as Record<string, unknown>;
    expect(callBody).toEqual(expect.objectContaining({
      email: 'alice@test.com',
      password: 'password123',
      name: 'Alice',
    }));
    expect(callBody).not.toHaveProperty('role');
    expect(output.join('\n')).toContain('Created user: alice@test.com');

    rmSync(tmpDir, { recursive: true, force: true });
  });

  it('updates existing users matched by email', async () => {
    vi.mocked(client.get).mockImplementation(async (url: string) => {
      if (url === '/api/v1/users') return [{ id: 'usr-1', email: 'alice@test.com' }];
      return [];
    });

    const configPath = join(tmpDir, 'config.yaml');
    writeFileSync(configPath, `
users:
  - email: alice@test.com
    password: newpassword
    name: Alice Updated
`);

    const cmd = createApplyCommand({ client, log });
    await cmd.parseAsync([configPath], { from: 'user' });

    expect(client.put).toHaveBeenCalledWith('/api/v1/users/usr-1', expect.objectContaining({
      email: 'alice@test.com',
      name: 'Alice Updated',
    }));
    expect(output.join('\n')).toContain('Updated user: alice@test.com');

    rmSync(tmpDir, { recursive: true, force: true });
  });

  it('applies groups', async () => {
    const configPath = join(tmpDir, 'config.yaml');
    writeFileSync(configPath, `
groups:
  - name: dev-team
    description: Development team
    members:
      - alice@test.com
      - bob@test.com
`);

    const cmd = createApplyCommand({ client, log });
    await cmd.parseAsync([configPath], { from: 'user' });

    expect(client.post).toHaveBeenCalledWith('/api/v1/groups', expect.objectContaining({
      name: 'dev-team',
      description: 'Development team',
      members: ['alice@test.com', 'bob@test.com'],
    }));
    expect(output.join('\n')).toContain('Created group: dev-team');

    rmSync(tmpDir, { recursive: true, force: true });
  });

  it('updates existing groups', async () => {
    vi.mocked(client.get).mockImplementation(async (url: string) => {
      if (url === '/api/v1/groups') return [{ id: 'grp-1', name: 'dev-team' }];
      return [];
    });

    const configPath = join(tmpDir, 'config.yaml');
    writeFileSync(configPath, `
groups:
  - name: dev-team
    description: Updated devs
    members:
      - new@test.com
`);

    const cmd = createApplyCommand({ client, log });
    await cmd.parseAsync([configPath], { from: 'user' });

    expect(client.put).toHaveBeenCalledWith('/api/v1/groups/grp-1', expect.objectContaining({
      name: 'dev-team',
      description: 'Updated devs',
    }));
    expect(output.join('\n')).toContain('Updated group: dev-team');

    rmSync(tmpDir, { recursive: true, force: true });
  });

  it('applies rbacBindings', async () => {
    const configPath = join(tmpDir, 'config.yaml');
    writeFileSync(configPath, `
rbac:
  - name: developers
    subjects:
      - kind: User
        name: alice@test.com
      - kind: Group
        name: dev-team
    roleBindings:
      - role: edit
        resource: servers
      - role: view
        resource: instances
`);

    const cmd = createApplyCommand({ client, log });
    await cmd.parseAsync([configPath], { from: 'user' });

    expect(client.post).toHaveBeenCalledWith('/api/v1/rbac', expect.objectContaining({
      name: 'developers',
      subjects: [
        { kind: 'User', name: 'alice@test.com' },
        { kind: 'Group', name: 'dev-team' },
      ],
      roleBindings: [
        { role: 'edit', resource: 'servers' },
        { role: 'view', resource: 'instances' },
      ],
    }));
    expect(output.join('\n')).toContain('Created rbacBinding: developers');

    rmSync(tmpDir, { recursive: true, force: true });
  });

  it('updates existing rbacBindings', async () => {
    vi.mocked(client.get).mockImplementation(async (url: string) => {
      if (url === '/api/v1/rbac') return [{ id: 'rbac-1', name: 'developers' }];
      return [];
    });

    const configPath = join(tmpDir, 'config.yaml');
    writeFileSync(configPath, `
rbacBindings:
  - name: developers
    subjects:
      - kind: User
        name: new@test.com
    roleBindings:
      - role: edit
        resource: "*"
`);

    const cmd = createApplyCommand({ client, log });
    await cmd.parseAsync([configPath], { from: 'user' });

    expect(client.put).toHaveBeenCalledWith('/api/v1/rbac/rbac-1', expect.objectContaining({
      name: 'developers',
    }));
    expect(output.join('\n')).toContain('Updated rbacBinding: developers');

    rmSync(tmpDir, { recursive: true, force: true });
  });

  it('applies projects with servers', async () => {
    const configPath = join(tmpDir, 'config.yaml');
    writeFileSync(configPath, `
projects:
  - name: smart-home
    description: Home automation
    proxyMode: filtered
    llmProvider: gemini-cli
    llmModel: gemini-2.0-flash
    servers:
      - my-grafana
      - my-ha
`);

    const cmd = createApplyCommand({ client, log });
    await cmd.parseAsync([configPath], { from: 'user' });

    expect(client.post).toHaveBeenCalledWith('/api/v1/projects', expect.objectContaining({
      name: 'smart-home',
      proxyMode: 'filtered',
      llmProvider: 'gemini-cli',
      llmModel: 'gemini-2.0-flash',
      servers: ['my-grafana', 'my-ha'],
    }));
    expect(output.join('\n')).toContain('Created project: smart-home');

    rmSync(tmpDir, { recursive: true, force: true });
  });

  it('dry-run shows all new resource types', async () => {
    const configPath = join(tmpDir, 'config.yaml');
    writeFileSync(configPath, `
secrets:
  - name: creds
    data:
      TOKEN: abc
users:
  - email: alice@test.com
    password: password123
groups:
  - name: dev-team
    members: []
projects:
  - name: my-proj
    description: A project
rbacBindings:
  - name: admins
    subjects:
      - kind: User
        name: admin@test.com
    roleBindings:
      - role: edit
        resource: "*"
`);

    const cmd = createApplyCommand({ client, log });
    await cmd.parseAsync([configPath, '--dry-run'], { from: 'user' });

    expect(client.post).not.toHaveBeenCalled();
    const text = output.join('\n');
    expect(text).toContain('Dry run');
    expect(text).toContain('1 secret(s)');
    expect(text).toContain('1 user(s)');
    expect(text).toContain('1 group(s)');
    expect(text).toContain('1 project(s)');
    expect(text).toContain('1 rbacBinding(s)');

    rmSync(tmpDir, { recursive: true, force: true });
  });

  it('applies resources in correct order', async () => {
    const callOrder: string[] = [];
    vi.mocked(client.post).mockImplementation(async (url: string) => {
      callOrder.push(url);
      return { id: 'new-id', name: 'test' };
    });

    const configPath = join(tmpDir, 'config.yaml');
    writeFileSync(configPath, `
rbacBindings:
  - name: admins
    subjects:
      - kind: User
        name: admin@test.com
    roleBindings:
      - role: edit
        resource: "*"
users:
  - email: admin@test.com
    password: password123
secrets:
  - name: creds
    data:
      KEY: val
groups:
  - name: dev-team
servers:
  - name: my-server
    transport: STDIO
projects:
  - name: my-proj
`);

    const cmd = createApplyCommand({ client, log });
    await cmd.parseAsync([configPath], { from: 'user' });

    // Apply order: secrets → servers → users → groups → projects → templates → rbacBindings
    expect(callOrder[0]).toBe('/api/v1/secrets');
    expect(callOrder[1]).toBe('/api/v1/servers');
    expect(callOrder[2]).toBe('/api/v1/users');
    expect(callOrder[3]).toBe('/api/v1/groups');
    expect(callOrder[4]).toBe('/api/v1/projects');
    expect(callOrder[5]).toBe('/api/v1/rbac');

    rmSync(tmpDir, { recursive: true, force: true });
  });

  it('applies rbac with operation bindings', async () => {
    const configPath = join(tmpDir, 'config.yaml');
    writeFileSync(configPath, `
rbac:
  - name: ops-team
    subjects:
      - kind: Group
        name: ops
    roleBindings:
      - role: edit
        resource: servers
      - role: run
        action: backup
      - role: run
        action: logs
`);

    const cmd = createApplyCommand({ client, log });
    await cmd.parseAsync([configPath], { from: 'user' });

    expect(client.post).toHaveBeenCalledWith('/api/v1/rbac', expect.objectContaining({
      name: 'ops-team',
      roleBindings: [
        { role: 'edit', resource: 'servers' },
        { role: 'run', action: 'backup' },
        { role: 'run', action: 'logs' },
      ],
    }));
    expect(output.join('\n')).toContain('Created rbacBinding: ops-team');

    rmSync(tmpDir, { recursive: true, force: true });
  });

  it('applies rbac with name-scoped resource binding', async () => {
    const configPath = join(tmpDir, 'config.yaml');
    writeFileSync(configPath, `
rbac:
  - name: ha-viewer
    subjects:
      - kind: User
        name: alice@test.com
    roleBindings:
      - role: view
        resource: servers
        name: my-ha
`);

    const cmd = createApplyCommand({ client, log });
    await cmd.parseAsync([configPath], { from: 'user' });

    expect(client.post).toHaveBeenCalledWith('/api/v1/rbac', expect.objectContaining({
      name: 'ha-viewer',
      roleBindings: [
        { role: 'view', resource: 'servers', name: 'my-ha' },
      ],
    }));

    rmSync(tmpDir, { recursive: true, force: true });
  });
});

@@ -37,6 +37,8 @@ describe('login command', () => {
        user: { email },
      }),
      logoutRequest: async () => {},
      statusRequest: async () => ({ hasUsers: true }),
      bootstrapRequest: async () => ({ token: '', user: { email: '' } }),
    });
    await cmd.parseAsync([], { from: 'user' });
    expect(output[0]).toContain('Logged in as alice@test.com');
@@ -58,6 +60,8 @@ describe('login command', () => {
      log,
      loginRequest: async () => { throw new Error('Invalid credentials'); },
      logoutRequest: async () => {},
      statusRequest: async () => ({ hasUsers: true }),
      bootstrapRequest: async () => ({ token: '', user: { email: '' } }),
    });
    await cmd.parseAsync([], { from: 'user' });
    expect(output[0]).toContain('Login failed');
@@ -83,6 +87,8 @@ describe('login command', () => {
        return { token: 'tok', user: { email } };
      },
      logoutRequest: async () => {},
      statusRequest: async () => ({ hasUsers: true }),
      bootstrapRequest: async () => ({ token: '', user: { email: '' } }),
    });
    await cmd.parseAsync([], { from: 'user' });
    expect(capturedUrl).toBe('http://custom:3100');
@@ -103,12 +109,74 @@ describe('login command', () => {
        return { token: 'tok', user: { email } };
      },
      logoutRequest: async () => {},
      statusRequest: async () => ({ hasUsers: true }),
      bootstrapRequest: async () => ({ token: '', user: { email: '' } }),
    });
    await cmd.parseAsync(['--mcpd-url', 'http://override:3100'], { from: 'user' });
    expect(capturedUrl).toBe('http://override:3100');
  });
});

describe('login bootstrap flow', () => {
  it('bootstraps first admin when no users exist', async () => {
    let bootstrapCalled = false;
    const cmd = createLoginCommand({
      configDeps: { configDir: tempDir },
      credentialsDeps: { configDir: tempDir },
      prompt: {
        input: async (msg) => {
          if (msg.includes('Name')) return 'Admin User';
          return 'admin@test.com';
        },
        password: async () => 'admin-pass',
      },
      log,
      loginRequest: async () => ({ token: '', user: { email: '' } }),
      logoutRequest: async () => {},
      statusRequest: async () => ({ hasUsers: false }),
      bootstrapRequest: async (_url, email, _password) => {
        bootstrapCalled = true;
        return { token: 'admin-token', user: { email } };
      },
    });
    await cmd.parseAsync([], { from: 'user' });

    expect(bootstrapCalled).toBe(true);
    expect(output.join('\n')).toContain('No users configured');
    expect(output.join('\n')).toContain('admin@test.com');
    expect(output.join('\n')).toContain('admin');

    const creds = loadCredentials({ configDir: tempDir });
    expect(creds).not.toBeNull();
    expect(creds!.token).toBe('admin-token');
    expect(creds!.user).toBe('admin@test.com');
  });

  it('falls back to normal login when users exist', async () => {
    let loginCalled = false;
    const cmd = createLoginCommand({
      configDeps: { configDir: tempDir },
      credentialsDeps: { configDir: tempDir },
      prompt: {
        input: async () => 'alice@test.com',
        password: async () => 'secret',
      },
      log,
      loginRequest: async (_url, email) => {
        loginCalled = true;
        return { token: 'session-tok', user: { email } };
      },
      logoutRequest: async () => {},
      statusRequest: async () => ({ hasUsers: true }),
      bootstrapRequest: async () => { throw new Error('Should not be called'); },
    });
    await cmd.parseAsync([], { from: 'user' });

    expect(loginCalled).toBe(true);
    expect(output.join('\n')).not.toContain('No users configured');
  });
});

describe('logout command', () => {
  it('removes credentials on logout', async () => {
    saveCredentials({ token: 'tok', mcpdUrl: 'http://x:3100', user: 'alice' }, { configDir: tempDir });
@@ -120,6 +188,8 @@ describe('logout command', () => {
      log,
      loginRequest: async () => ({ token: '', user: { email: '' } }),
      logoutRequest: async () => { logoutCalled = true; },
      statusRequest: async () => ({ hasUsers: true }),
      bootstrapRequest: async () => ({ token: '', user: { email: '' } }),
    });
    await cmd.parseAsync([], { from: 'user' });
    expect(output[0]).toContain('Logged out successfully');
@@ -137,6 +207,8 @@ describe('logout command', () => {
      log,
      loginRequest: async () => ({ token: '', user: { email: '' } }),
      logoutRequest: async () => {},
      statusRequest: async () => ({ hasUsers: true }),
      bootstrapRequest: async () => ({ token: '', user: { email: '' } }),
    });
    await cmd.parseAsync([], { from: 'user' });
    expect(output[0]).toContain('Not logged in');

@@ -1,158 +1,192 @@
import { describe, it, expect, vi, beforeEach } from 'vitest';
import { describe, it, expect, vi, beforeEach, afterEach } from 'vitest';
import { writeFileSync, readFileSync, mkdtempSync, rmSync } from 'node:fs';
import { join } from 'node:path';
import { tmpdir } from 'node:os';
import { createClaudeCommand } from '../../src/commands/claude.js';
import { createConfigCommand } from '../../src/commands/config.js';
import type { ApiClient } from '../../src/api-client.js';
import { saveCredentials, loadCredentials } from '../../src/auth/index.js';

function mockClient(): ApiClient {
  return {
    get: vi.fn(async () => ({
      mcpServers: {
        'slack--default': { command: 'npx', args: ['-y', '@anthropic/slack-mcp'], env: { WORKSPACE: 'test' } },
        'github--default': { command: 'npx', args: ['-y', '@anthropic/github-mcp'] },
      },
    })),
    post: vi.fn(async () => ({})),
    get: vi.fn(async () => ({})),
    post: vi.fn(async () => ({ token: 'impersonated-tok', user: { email: 'other@test.com' } })),
    put: vi.fn(async () => ({})),
    delete: vi.fn(async () => {}),
  } as unknown as ApiClient;
}

describe('claude command', () => {
describe('config claude', () => {
  let client: ReturnType<typeof mockClient>;
  let output: string[];
  let tmpDir: string;
  const log = (...args: unknown[]) => output.push(args.map(String).join(' '));
  const log = (...args: string[]) => output.push(args.join(' '));

  beforeEach(() => {
    client = mockClient();
    output = [];
    tmpDir = mkdtempSync(join(tmpdir(), 'mcpctl-claude-'));
    tmpDir = mkdtempSync(join(tmpdir(), 'mcpctl-config-claude-'));
  });

  describe('generate', () => {
    it('generates .mcp.json from project config', async () => {
      const outPath = join(tmpDir, '.mcp.json');
      const cmd = createClaudeCommand({ client, log });
      await cmd.parseAsync(['generate', 'proj-1', '-o', outPath], { from: 'user' });
  afterEach(() => {
    rmSync(tmpDir, { recursive: true, force: true });
  });

      expect(client.get).toHaveBeenCalledWith('/api/v1/projects/proj-1/mcp-config');
      const written = JSON.parse(readFileSync(outPath, 'utf-8'));
      expect(written.mcpServers['slack--default']).toBeDefined();
      expect(output.join('\n')).toContain('2 server(s)');
  it('generates .mcp.json with mcpctl mcp bridge entry', async () => {
    const outPath = join(tmpDir, '.mcp.json');
    const cmd = createConfigCommand(
      { configDeps: { configDir: tmpDir }, log },
      { client, credentialsDeps: { configDir: tmpDir }, log },
    );
    await cmd.parseAsync(['claude', '--project', 'homeautomation', '-o', outPath], { from: 'user' });

      rmSync(tmpDir, { recursive: true, force: true });
    // No API call should be made
    expect(client.get).not.toHaveBeenCalled();

    const written = JSON.parse(readFileSync(outPath, 'utf-8'));
    expect(written.mcpServers['homeautomation']).toEqual({
      command: 'mcpctl',
      args: ['mcp', '-p', 'homeautomation'],
    });
    expect(output.join('\n')).toContain('1 server(s)');
  });

    it('prints to stdout with --stdout', async () => {
      const cmd = createClaudeCommand({ client, log });
      await cmd.parseAsync(['generate', 'proj-1', '--stdout'], { from: 'user' });
  it('prints to stdout with --stdout', async () => {
    const cmd = createConfigCommand(
      { configDeps: { configDir: tmpDir }, log },
      { client, credentialsDeps: { configDir: tmpDir }, log },
    );
    await cmd.parseAsync(['claude', '--project', 'myproj', '--stdout'], { from: 'user' });

    expect(output[0]).toContain('mcpServers');
      rmSync(tmpDir, { recursive: true, force: true });
    });

    it('merges with existing .mcp.json', async () => {
      const outPath = join(tmpDir, '.mcp.json');
      writeFileSync(outPath, JSON.stringify({
        mcpServers: { 'existing--server': { command: 'echo', args: [] } },
      }));

      const cmd = createClaudeCommand({ client, log });
      await cmd.parseAsync(['generate', 'proj-1', '-o', outPath, '--merge'], { from: 'user' });

      const written = JSON.parse(readFileSync(outPath, 'utf-8'));
      expect(written.mcpServers['existing--server']).toBeDefined();
      expect(written.mcpServers['slack--default']).toBeDefined();
      expect(output.join('\n')).toContain('3 server(s)');

      rmSync(tmpDir, { recursive: true, force: true });
    const parsed = JSON.parse(output[0]);
    expect(parsed.mcpServers['myproj']).toEqual({
      command: 'mcpctl',
      args: ['mcp', '-p', 'myproj'],
    });
  });

  describe('show', () => {
    it('shows servers in .mcp.json', () => {
      const filePath = join(tmpDir, '.mcp.json');
      writeFileSync(filePath, JSON.stringify({
        mcpServers: {
          'slack': { command: 'npx', args: ['-y', '@anthropic/slack-mcp'], env: { TOKEN: 'x' } },
        },
      }));
  it('merges with existing .mcp.json', async () => {
    const outPath = join(tmpDir, '.mcp.json');
    writeFileSync(outPath, JSON.stringify({
      mcpServers: { 'existing--server': { command: 'echo', args: [] } },
    }));

      const cmd = createClaudeCommand({ client, log });
      cmd.parseAsync(['show', '-p', filePath], { from: 'user' });
    const cmd = createConfigCommand(
      { configDeps: { configDir: tmpDir }, log },
      { client, credentialsDeps: { configDir: tmpDir }, log },
    );
    await cmd.parseAsync(['claude', '--project', 'proj-1', '-o', outPath, '--merge'], { from: 'user' });

      expect(output.join('\n')).toContain('slack');
      expect(output.join('\n')).toContain('npx -y @anthropic/slack-mcp');
      expect(output.join('\n')).toContain('TOKEN');

      rmSync(tmpDir, { recursive: true, force: true });
    const written = JSON.parse(readFileSync(outPath, 'utf-8'));
    expect(written.mcpServers['existing--server']).toBeDefined();
    expect(written.mcpServers['proj-1']).toEqual({
      command: 'mcpctl',
      args: ['mcp', '-p', 'proj-1'],
    });
    expect(output.join('\n')).toContain('2 server(s)');
  });

    it('handles missing file', () => {
      const cmd = createClaudeCommand({ client, log });
      cmd.parseAsync(['show', '-p', join(tmpDir, 'nonexistent.json')], { from: 'user' });
  it('backward compat: claude-generate still works', async () => {
    const outPath = join(tmpDir, '.mcp.json');
    const cmd = createConfigCommand(
      { configDeps: { configDir: tmpDir }, log },
      { client, credentialsDeps: { configDir: tmpDir }, log },
    );
    await cmd.parseAsync(['claude-generate', '--project', 'proj-1', '-o', outPath], { from: 'user' });

      expect(output.join('\n')).toContain('No .mcp.json found');
      rmSync(tmpDir, { recursive: true, force: true });
    const written = JSON.parse(readFileSync(outPath, 'utf-8'));
    expect(written.mcpServers['proj-1']).toEqual({
      command: 'mcpctl',
      args: ['mcp', '-p', 'proj-1'],
    });
  });

  describe('add', () => {
    it('adds a server entry', () => {
      const filePath = join(tmpDir, '.mcp.json');
      const cmd = createClaudeCommand({ client, log });
      cmd.parseAsync(['add', 'my-server', '-c', 'npx', '-a', '-y', 'my-pkg', '-p', filePath], { from: 'user' });
  it('uses project name as the server key', async () => {
    const outPath = join(tmpDir, '.mcp.json');
    const cmd = createConfigCommand(
      { configDeps: { configDir: tmpDir }, log },
    );
    await cmd.parseAsync(['claude', '--project', 'my-fancy-project', '-o', outPath], { from: 'user' });

      const written = JSON.parse(readFileSync(filePath, 'utf-8'));
      expect(written.mcpServers['my-server']).toEqual({
        command: 'npx',
        args: ['-y', 'my-pkg'],
      });

      rmSync(tmpDir, { recursive: true, force: true });
    });

    it('adds server with env vars', () => {
      const filePath = join(tmpDir, '.mcp.json');
      const cmd = createClaudeCommand({ client, log });
      cmd.parseAsync(['add', 'my-server', '-c', 'node', '-e', 'KEY=val', 'SECRET=abc', '-p', filePath], { from: 'user' });

      const written = JSON.parse(readFileSync(filePath, 'utf-8'));
      expect(written.mcpServers['my-server'].env).toEqual({ KEY: 'val', SECRET: 'abc' });

      rmSync(tmpDir, { recursive: true, force: true });
    });
  });

  describe('remove', () => {
    it('removes a server entry', () => {
      const filePath = join(tmpDir, '.mcp.json');
      writeFileSync(filePath, JSON.stringify({
        mcpServers: { 'slack': { command: 'npx', args: [] }, 'github': { command: 'npx', args: [] } },
      }));

      const cmd = createClaudeCommand({ client, log });
      cmd.parseAsync(['remove', 'slack', '-p', filePath], { from: 'user' });

      const written = JSON.parse(readFileSync(filePath, 'utf-8'));
      expect(written.mcpServers['slack']).toBeUndefined();
      expect(written.mcpServers['github']).toBeDefined();
      expect(output.join('\n')).toContain("Removed 'slack'");

      rmSync(tmpDir, { recursive: true, force: true });
    });

    it('reports when server not found', () => {
      const filePath = join(tmpDir, '.mcp.json');
      writeFileSync(filePath, JSON.stringify({ mcpServers: {} }));

      const cmd = createClaudeCommand({ client, log });
      cmd.parseAsync(['remove', 'nonexistent', '-p', filePath], { from: 'user' });

      expect(output.join('\n')).toContain('not found');
      rmSync(tmpDir, { recursive: true, force: true });
    });
    const written = JSON.parse(readFileSync(outPath, 'utf-8'));
    expect(Object.keys(written.mcpServers)).toEqual(['my-fancy-project']);
  });
});

describe('config impersonate', () => {
  let client: ReturnType<typeof mockClient>;
  let output: string[];
  let tmpDir: string;
  const log = (...args: string[]) => output.push(args.join(' '));

  beforeEach(() => {
    client = mockClient();
    output = [];
    tmpDir = mkdtempSync(join(tmpdir(), 'mcpctl-config-impersonate-'));
  });

  afterEach(() => {
    rmSync(tmpDir, { recursive: true, force: true });
  });

  it('impersonates a user and saves backup', async () => {
    saveCredentials({ token: 'admin-tok', mcpdUrl: 'http://localhost:3100', user: 'admin@test.com' }, { configDir: tmpDir });

    const cmd = createConfigCommand(
      { configDeps: { configDir: tmpDir }, log },
      { client, credentialsDeps: { configDir: tmpDir }, log },
    );
    await cmd.parseAsync(['impersonate', 'other@test.com'], { from: 'user' });

    expect(client.post).toHaveBeenCalledWith('/api/v1/auth/impersonate', { email: 'other@test.com' });
    expect(output.join('\n')).toContain('Impersonating other@test.com');

    const creds = loadCredentials({ configDir: tmpDir });
    expect(creds!.user).toBe('other@test.com');
    expect(creds!.token).toBe('impersonated-tok');

    // Backup exists
    const backup = JSON.parse(readFileSync(join(tmpDir, 'credentials-backup'), 'utf-8'));
    expect(backup.user).toBe('admin@test.com');
  });

  it('quits impersonation and restores backup', async () => {
    // Set up current (impersonated) credentials
    saveCredentials({ token: 'impersonated-tok', mcpdUrl: 'http://localhost:3100', user: 'other@test.com' }, { configDir: tmpDir });
    // Set up backup (original) credentials
    writeFileSync(join(tmpDir, 'credentials-backup'), JSON.stringify({
      token: 'admin-tok', mcpdUrl: 'http://localhost:3100', user: 'admin@test.com',
    }));

    const cmd = createConfigCommand(
      { configDeps: { configDir: tmpDir }, log },
      { client, credentialsDeps: { configDir: tmpDir }, log },
    );
    await cmd.parseAsync(['impersonate', '--quit'], { from: 'user' });

    expect(output.join('\n')).toContain('Returned to admin@test.com');

    const creds = loadCredentials({ configDir: tmpDir });
    expect(creds!.user).toBe('admin@test.com');
    expect(creds!.token).toBe('admin-tok');
  });

  it('errors when not logged in', async () => {
    const cmd = createConfigCommand(
      { configDeps: { configDir: tmpDir }, log },
      { client, credentialsDeps: { configDir: tmpDir }, log },
    );
    await cmd.parseAsync(['impersonate', 'other@test.com'], { from: 'user' });

    expect(output.join('\n')).toContain('Not logged in');
  });

  it('errors when quitting with no backup', async () => {
    const cmd = createConfigCommand(
      { configDeps: { configDir: tmpDir }, log },
      { client, credentialsDeps: { configDir: tmpDir }, log },
    );
    await cmd.parseAsync(['impersonate', '--quit'], { from: 'user' });

    expect(output.join('\n')).toContain('No impersonation session to quit');
  });
});

293
src/cli/tests/commands/config-setup.test.ts
Normal file
@@ -0,0 +1,293 @@
import { describe, it, expect, vi, beforeEach } from 'vitest';
import { createConfigSetupCommand } from '../../src/commands/config-setup.js';
import type { ConfigSetupDeps, ConfigSetupPrompt } from '../../src/commands/config-setup.js';
import type { SecretStore } from '@mcpctl/shared';
import { mkdtempSync, rmSync, readFileSync } from 'node:fs';
import { join } from 'node:path';
import { tmpdir } from 'node:os';

let tempDir: string;
let logs: string[];

beforeEach(() => {
  tempDir = mkdtempSync(join(tmpdir(), 'mcpctl-config-setup-test-'));
  logs = [];
});

function cleanup() {
  rmSync(tempDir, { recursive: true, force: true });
}

function mockSecretStore(secrets: Record<string, string> = {}): SecretStore {
  const store: Record<string, string> = { ...secrets };
  return {
    get: vi.fn(async (key: string) => store[key] ?? null),
    set: vi.fn(async (key: string, value: string) => { store[key] = value; }),
    delete: vi.fn(async () => true),
    backend: () => 'mock',
  };
}

function mockPrompt(answers: unknown[]): ConfigSetupPrompt {
  let callIndex = 0;
  return {
    select: vi.fn(async () => answers[callIndex++]),
    input: vi.fn(async () => answers[callIndex++] as string),
    password: vi.fn(async () => answers[callIndex++] as string),
    confirm: vi.fn(async () => answers[callIndex++] as boolean),
  };
}

function buildDeps(overrides: {
  secrets?: Record<string, string>;
  answers?: unknown[];
  fetchModels?: ConfigSetupDeps['fetchModels'];
  whichBinary?: ConfigSetupDeps['whichBinary'];
} = {}): ConfigSetupDeps {
  return {
    configDeps: { configDir: tempDir },
    secretStore: mockSecretStore(overrides.secrets),
    log: (...args: string[]) => logs.push(args.join(' ')),
    prompt: mockPrompt(overrides.answers ?? []),
    fetchModels: overrides.fetchModels ?? vi.fn(async () => []),
    whichBinary: overrides.whichBinary ?? vi.fn(async () => '/usr/bin/gemini'),
  };
}

function readConfig(): Record<string, unknown> {
  const raw = readFileSync(join(tempDir, 'config.json'), 'utf-8');
  return JSON.parse(raw) as Record<string, unknown>;
}

async function runSetup(deps: ConfigSetupDeps): Promise<void> {
  const cmd = createConfigSetupCommand(deps);
  await cmd.parseAsync([], { from: 'user' });
}

describe('config setup wizard', () => {
  describe('provider: none', () => {
    it('disables LLM and saves config', async () => {
      const deps = buildDeps({ answers: ['simple', 'none'] });
      await runSetup(deps);

      const config = readConfig();
      expect(config.llm).toEqual({ provider: 'none' });
      expect(logs.some((l) => l.includes('LLM disabled'))).toBe(true);
      cleanup();
    });
  });

  describe('provider: gemini-cli', () => {
    it('auto-detects binary path and saves config', async () => {
      // Answers: select provider, select model (no binary prompt — auto-detected)
      const deps = buildDeps({
        answers: ['simple', 'gemini-cli', 'gemini-2.5-flash'],
        whichBinary: vi.fn(async () => '/home/user/.npm-global/bin/gemini'),
      });
      await runSetup(deps);

      const config = readConfig();
      const llm = config.llm as Record<string, unknown>;
      expect(llm.provider).toBe('gemini-cli');
      expect(llm.model).toBe('gemini-2.5-flash');
      expect(llm.binaryPath).toBe('/home/user/.npm-global/bin/gemini');
      expect(logs.some((l) => l.includes('Found gemini at'))).toBe(true);
      cleanup();
    });

    it('prompts for manual path when binary not found', async () => {
      // Answers: select provider, select model, enter manual path
      const deps = buildDeps({
        answers: ['simple', 'gemini-cli', 'gemini-2.5-flash', '/opt/gemini'],
        whichBinary: vi.fn(async () => null),
      });
      await runSetup(deps);

      const config = readConfig();
      const llm = config.llm as Record<string, unknown>;
      expect(llm.binaryPath).toBe('/opt/gemini');
      expect(logs.some((l) => l.includes('not found'))).toBe(true);
      cleanup();
    });

    it('saves gemini-cli with custom model', async () => {
      // Answers: select provider, select custom, enter model name
      const deps = buildDeps({
        answers: ['simple', 'gemini-cli', '__custom__', 'gemini-3.0-flash'],
        whichBinary: vi.fn(async () => '/usr/bin/gemini'),
      });
      await runSetup(deps);

      const config = readConfig();
      const llm = config.llm as Record<string, unknown>;
      expect(llm.model).toBe('gemini-3.0-flash');
      cleanup();
    });
  });

  describe('provider: ollama', () => {
    it('fetches models and allows selection', async () => {
      const fetchModels = vi.fn(async () => ['llama3.2', 'codellama', 'mistral']);
      // Answers: select provider, enter URL, select model
      const deps = buildDeps({
        answers: ['simple', 'ollama', 'http://localhost:11434', 'codellama'],
        fetchModels,
      });
      await runSetup(deps);

      expect(fetchModels).toHaveBeenCalledWith('http://localhost:11434', '/api/tags');
      const config = readConfig();
      const llm = config.llm as Record<string, unknown>;
      expect(llm.provider).toBe('ollama');
      expect(llm.model).toBe('codellama');
      expect(llm.url).toBe('http://localhost:11434');
      cleanup();
    });

    it('falls back to manual input when fetch fails', async () => {
      const fetchModels = vi.fn(async () => []);
      // Answers: select provider, enter URL, enter model manually
      const deps = buildDeps({
        answers: ['simple', 'ollama', 'http://localhost:11434', 'llama3.2'],
        fetchModels,
      });
      await runSetup(deps);

      const config = readConfig();
      expect((config.llm as Record<string, unknown>).model).toBe('llama3.2');
      cleanup();
    });
  });

  describe('provider: anthropic', () => {
    it('prompts for API key and saves to secret store', async () => {
      // Answers: select provider, enter API key, select model
      const deps = buildDeps({
        answers: ['simple', 'anthropic', 'sk-ant-new-key', 'claude-haiku-3-5-20241022'],
      });
      await runSetup(deps);

      expect(deps.secretStore.set).toHaveBeenCalledWith('anthropic-api-key', 'sk-ant-new-key');
      const config = readConfig();
      const llm = config.llm as Record<string, unknown>;
      expect(llm.provider).toBe('anthropic');
      expect(llm.model).toBe('claude-haiku-3-5-20241022');
      // API key should NOT be in config file
      expect(llm).not.toHaveProperty('apiKey');
      cleanup();
    });

    it('shows existing key masked and allows keeping it', async () => {
      // Answers: select provider, confirm change=false, select model
      const deps = buildDeps({
        secrets: { 'anthropic-api-key': 'sk-ant-existing-key-1234' },
        answers: ['simple', 'anthropic', false, 'claude-sonnet-4-20250514'],
      });
      await runSetup(deps);

      // Should NOT have called set (kept existing key)
      expect(deps.secretStore.set).not.toHaveBeenCalled();
      const config = readConfig();
      expect((config.llm as Record<string, unknown>).model).toBe('claude-sonnet-4-20250514');
      cleanup();
    });

    it('allows replacing existing key', async () => {
      // Answers: select provider, confirm change=true, enter new key, select model
      const deps = buildDeps({
        secrets: { 'anthropic-api-key': 'sk-ant-old' },
        answers: ['simple', 'anthropic', true, 'sk-ant-new', 'claude-haiku-3-5-20241022'],
      });
      await runSetup(deps);

      expect(deps.secretStore.set).toHaveBeenCalledWith('anthropic-api-key', 'sk-ant-new');
      cleanup();
    });
  });

  describe('provider: vllm', () => {
    it('fetches models from vLLM and allows selection', async () => {
      const fetchModels = vi.fn(async () => ['my-model', 'llama-70b']);
      // Answers: select provider, enter URL, select model
      const deps = buildDeps({
        answers: ['simple', 'vllm', 'http://gpu:8000', 'llama-70b'],
        fetchModels,
      });
      await runSetup(deps);

      expect(fetchModels).toHaveBeenCalledWith('http://gpu:8000', '/v1/models');
      const config = readConfig();
      const llm = config.llm as Record<string, unknown>;
      expect(llm.provider).toBe('vllm');
      expect(llm.url).toBe('http://gpu:8000');
      expect(llm.model).toBe('llama-70b');
      cleanup();
    });
  });

  describe('provider: openai', () => {
    it('prompts for key, model, and optional custom endpoint', async () => {
      // Answers: select provider, enter key, enter model, confirm custom URL=true, enter URL
      const deps = buildDeps({
        answers: ['simple', 'openai', 'sk-openai-key', 'gpt-4o', true, 'https://custom.api.com'],
      });
      await runSetup(deps);

      expect(deps.secretStore.set).toHaveBeenCalledWith('openai-api-key', 'sk-openai-key');
      const config = readConfig();
      const llm = config.llm as Record<string, unknown>;
      expect(llm.provider).toBe('openai');
      expect(llm.model).toBe('gpt-4o');
      expect(llm.url).toBe('https://custom.api.com');
      cleanup();
    });

    it('skips custom URL when not requested', async () => {
      // Answers: select provider, enter key, enter model, confirm custom URL=false
      const deps = buildDeps({
        answers: ['simple', 'openai', 'sk-openai-key', 'gpt-4o-mini', false],
      });
      await runSetup(deps);

      const config = readConfig();
      const llm = config.llm as Record<string, unknown>;
      expect(llm.url).toBeUndefined();
      cleanup();
    });
  });

  describe('provider: deepseek', () => {
    it('prompts for key and model', async () => {
      // Answers: select provider, enter key, select model
      const deps = buildDeps({
        answers: ['simple', 'deepseek', 'sk-ds-key', 'deepseek-chat'],
      });
      await runSetup(deps);

      expect(deps.secretStore.set).toHaveBeenCalledWith('deepseek-api-key', 'sk-ds-key');
      const config = readConfig();
      const llm = config.llm as Record<string, unknown>;
      expect(llm.provider).toBe('deepseek');
      expect(llm.model).toBe('deepseek-chat');
      cleanup();
    });
  });

  describe('output messages', () => {
    it('shows restart instruction', async () => {
      const deps = buildDeps({ answers: ['simple', 'gemini-cli', 'gemini-2.5-flash'] });
      await runSetup(deps);

      expect(logs.some((l) => l.includes('systemctl --user restart mcplocal'))).toBe(true);
      cleanup();
    });

    it('shows configured provider and model', async () => {
      const deps = buildDeps({ answers: ['simple', 'gemini-cli', 'gemini-2.5-flash'] });
      await runSetup(deps);

      expect(logs.some((l) => l.includes('gemini-cli') && l.includes('gemini-2.5-flash'))).toBe(true);
      cleanup();
    });
  });
});
560
src/cli/tests/commands/create.test.ts
Normal file
@@ -0,0 +1,560 @@
import { describe, it, expect, vi, beforeEach } from 'vitest';
|
||||
import { createCreateCommand } from '../../src/commands/create.js';
|
||||
import { type ApiClient, ApiError } from '../../src/api-client.js';
|
||||
|
||||
function mockClient(): ApiClient {
|
||||
return {
|
||||
get: vi.fn(async () => []),
|
||||
post: vi.fn(async () => ({ id: 'new-id', name: 'test' })),
|
||||
put: vi.fn(async () => ({})),
|
||||
delete: vi.fn(async () => {}),
|
||||
} as unknown as ApiClient;
|
||||
}
|
||||
|
||||
describe('create command', () => {
|
||||
let client: ReturnType<typeof mockClient>;
|
||||
let output: string[];
|
||||
const log = (...args: unknown[]) => output.push(args.map(String).join(' '));
|
||||
|
||||
beforeEach(() => {
|
||||
client = mockClient();
|
||||
output = [];
|
||||
});
|
||||
|
||||
describe('create server', () => {
|
||||
it('creates a server with minimal flags', async () => {
|
||||
const cmd = createCreateCommand({ client, log });
|
||||
await cmd.parseAsync(['server', 'my-server'], { from: 'user' });
|
||||
expect(client.post).toHaveBeenCalledWith('/api/v1/servers', expect.objectContaining({
|
||||
name: 'my-server',
|
||||
transport: 'STDIO',
|
||||
replicas: 1,
|
||||
}));
|
||||
expect(output.join('\n')).toContain("server 'test' created");
|
||||
});
|
||||
|
||||
it('creates a server with all flags', async () => {
|
||||
const cmd = createCreateCommand({ client, log });
|
||||
await cmd.parseAsync([
|
||||
'server', 'ha-mcp',
|
||||
'-d', 'Home Assistant MCP',
|
||||
'--docker-image', 'ghcr.io/ha-mcp:latest',
|
||||
'--transport', 'STREAMABLE_HTTP',
|
||||
'--external-url', 'http://localhost:8086/mcp',
|
||||
'--container-port', '3000',
|
||||
'--replicas', '2',
|
||||
'--command', 'python',
|
||||
'--command', '-c',
|
||||
'--command', 'print("hello")',
|
||||
'--env', 'API_KEY=secretRef:creds:API_KEY',
|
||||
'--env', 'BASE_URL=http://localhost',
|
||||
], { from: 'user' });
|
||||
|
||||
expect(client.post).toHaveBeenCalledWith('/api/v1/servers', {
|
||||
name: 'ha-mcp',
|
||||
description: 'Home Assistant MCP',
|
||||
dockerImage: 'ghcr.io/ha-mcp:latest',
|
||||
transport: 'STREAMABLE_HTTP',
|
||||
externalUrl: 'http://localhost:8086/mcp',
|
||||
containerPort: 3000,
|
||||
replicas: 2,
|
||||
command: ['python', '-c', 'print("hello")'],
|
||||
env: [
|
||||
{ name: 'API_KEY', valueFrom: { secretRef: { name: 'creds', key: 'API_KEY' } } },
|
||||
{ name: 'BASE_URL', value: 'http://localhost' },
|
||||
],
|
||||
});
|
||||
});
|
||||
|
||||
it('defaults transport to STDIO', async () => {
|
||||
const cmd = createCreateCommand({ client, log });
|
||||
await cmd.parseAsync(['server', 'test'], { from: 'user' });
|
||||
expect(client.post).toHaveBeenCalledWith('/api/v1/servers', expect.objectContaining({
|
||||
transport: 'STDIO',
|
||||
}));
|
||||
});
|
||||
|
||||
it('strips null values from template when using --from-template', async () => {
|
||||
vi.mocked(client.get).mockResolvedValueOnce([{
|
||||
id: 'tpl-1',
|
||||
name: 'grafana',
|
||||
version: '1.0.0',
|
||||
description: 'Grafana MCP',
|
||||
packageName: '@leval/mcp-grafana',
|
||||
dockerImage: null,
|
||||
transport: 'STDIO',
|
||||
repositoryUrl: 'https://github.com/test',
|
||||
externalUrl: null,
|
||||
command: null,
|
||||
containerPort: null,
|
||||
replicas: 1,
|
||||
env: [{ name: 'TOKEN', required: true, description: 'A token' }],
|
||||
healthCheck: { tool: 'test', arguments: {} },
|
||||
createdAt: '2025-01-01',
|
||||
updatedAt: '2025-01-01',
|
||||
}] as never);
|
||||
const cmd = createCreateCommand({ client, log });
|
||||
await cmd.parseAsync([
|
||||
'server', 'my-grafana', '--from-template=grafana',
|
||||
'--env', 'TOKEN=secretRef:creds:TOKEN',
|
||||
], { from: 'user' });
|
||||
const call = vi.mocked(client.post).mock.calls[0]![1] as Record<string, unknown>;
|
||||
// null fields from template should NOT be in the body
|
||||
expect(call).not.toHaveProperty('dockerImage');
|
||||
expect(call).not.toHaveProperty('externalUrl');
|
||||
expect(call).not.toHaveProperty('command');
|
||||
expect(call).not.toHaveProperty('containerPort');
|
||||
// non-null fields should be present
|
||||
expect(call.packageName).toBe('@leval/mcp-grafana');
|
||||
expect(call.healthCheck).toEqual({ tool: 'test', arguments: {} });
|
||||
expect(call.templateName).toBe('grafana');
|
||||
});
|
||||
|
||||
it('throws on 409 without --force', async () => {
|
||||
vi.mocked(client.post).mockRejectedValueOnce(new ApiError(409, '{"error":"Server already exists: my-server"}'));
|
||||
const cmd = createCreateCommand({ client, log });
|
||||
await expect(cmd.parseAsync(['server', 'my-server'], { from: 'user' })).rejects.toThrow('API error 409');
|
||||
});
|
||||
|
||||
it('updates existing server on 409 with --force', async () => {
|
||||
vi.mocked(client.post).mockRejectedValueOnce(new ApiError(409, '{"error":"Server already exists"}'));
|
||||
vi.mocked(client.get).mockResolvedValueOnce([{ id: 'srv-1', name: 'my-server' }] as never);
|
||||
const cmd = createCreateCommand({ client, log });
|
||||
await cmd.parseAsync(['server', 'my-server', '--force'], { from: 'user' });
|
||||
expect(client.put).toHaveBeenCalledWith('/api/v1/servers/srv-1', expect.objectContaining({
|
||||
transport: 'STDIO',
|
||||
}));
|
||||
expect(output.join('\n')).toContain("server 'my-server' updated");
|
||||
});
|
||||
});
|
||||
|
||||
describe('create secret', () => {
|
||||
it('creates a secret with --data flags', async () => {
|
||||
const cmd = createCreateCommand({ client, log });
|
||||
await cmd.parseAsync([
|
||||
'secret', 'ha-creds',
|
||||
'--data', 'TOKEN=abc123',
|
||||
'--data', 'URL=https://ha.local',
|
||||
], { from: 'user' });
|
||||
expect(client.post).toHaveBeenCalledWith('/api/v1/secrets', {
|
||||
name: 'ha-creds',
|
||||
data: { TOKEN: 'abc123', URL: 'https://ha.local' },
|
||||
});
|
||||
expect(output.join('\n')).toContain("secret 'test' created");
|
||||
});
|
||||
|
||||
it('creates a secret with empty data', async () => {
|
||||
const cmd = createCreateCommand({ client, log });
|
||||
await cmd.parseAsync(['secret', 'empty-secret'], { from: 'user' });
|
||||
expect(client.post).toHaveBeenCalledWith('/api/v1/secrets', {
|
||||
name: 'empty-secret',
|
||||
data: {},
|
||||
});
|
||||
});
|
||||
|
||||
it('throws on 409 without --force', async () => {
|
||||
vi.mocked(client.post).mockRejectedValueOnce(new ApiError(409, '{"error":"Secret already exists: my-creds"}'));
|
||||
const cmd = createCreateCommand({ client, log });
|
||||
await expect(cmd.parseAsync(['secret', 'my-creds', '--data', 'KEY=val'], { from: 'user' })).rejects.toThrow('API error 409');
|
||||
});
|
||||
|
||||
it('updates existing secret on 409 with --force', async () => {
|
||||
vi.mocked(client.post).mockRejectedValueOnce(new ApiError(409, '{"error":"Secret already exists"}'));
|
||||
vi.mocked(client.get).mockResolvedValueOnce([{ id: 'sec-1', name: 'my-creds' }] as never);
|
||||
const cmd = createCreateCommand({ client, log });
|
||||
await cmd.parseAsync(['secret', 'my-creds', '--data', 'KEY=val', '--force'], { from: 'user' });
|
||||
expect(client.put).toHaveBeenCalledWith('/api/v1/secrets/sec-1', { data: { KEY: 'val' } });
|
||||
expect(output.join('\n')).toContain("secret 'my-creds' updated");
|
||||
});
|
||||
});
|
||||
|
||||
describe('create project', () => {
|
||||
it('creates a project', async () => {
|
||||
const cmd = createCreateCommand({ client, log });
|
||||
await cmd.parseAsync(['project', 'my-project', '-d', 'A test project'], { from: 'user' });
|
||||
expect(client.post).toHaveBeenCalledWith('/api/v1/projects', {
|
||||
name: 'my-project',
|
||||
description: 'A test project',
|
||||
proxyMode: 'direct',
|
||||
});
|
||||
expect(output.join('\n')).toContain("project 'test' created");
|
||||
});
|
||||
|
||||
it('creates a project with no description', async () => {
|
||||
const cmd = createCreateCommand({ client, log });
|
||||
await cmd.parseAsync(['project', 'minimal'], { from: 'user' });
|
||||
expect(client.post).toHaveBeenCalledWith('/api/v1/projects', {
|
||||
name: 'minimal',
|
||||
description: '',
|
||||
proxyMode: 'direct',
|
||||
});
|
||||
});
|
||||
|
||||
it('updates existing project on 409 with --force', async () => {
|
||||
vi.mocked(client.post).mockRejectedValueOnce(new ApiError(409, '{"error":"Project already exists"}'));
|
||||
vi.mocked(client.get).mockResolvedValueOnce([{ id: 'proj-1', name: 'my-proj' }] as never);
|
||||
const cmd = createCreateCommand({ client, log });
|
||||
await cmd.parseAsync(['project', 'my-proj', '-d', 'updated', '--force'], { from: 'user' });
|
||||
expect(client.put).toHaveBeenCalledWith('/api/v1/projects/proj-1', { description: 'updated', proxyMode: 'direct' });
|
||||
expect(output.join('\n')).toContain("project 'my-proj' updated");
|
||||
});
|
||||
});
|
||||
|
||||
describe('create user', () => {
|
||||
it('creates a user with password and name', async () => {
|
||||
vi.mocked(client.post).mockResolvedValueOnce({ id: 'usr-1', email: 'alice@test.com' });
|
||||
const cmd = createCreateCommand({ client, log });
|
||||
await cmd.parseAsync([
|
||||
'user', 'alice@test.com',
|
||||
'--password', 'secret123',
|
||||
'--name', 'Alice',
|
||||
], { from: 'user' });
|
||||
|
||||
expect(client.post).toHaveBeenCalledWith('/api/v1/users', {
|
||||
email: 'alice@test.com',
|
||||
password: 'secret123',
|
||||
name: 'Alice',
|
||||
});
|
||||
expect(output.join('\n')).toContain("user 'alice@test.com' created");
|
||||
});
|
||||
|
||||
it('does not send role field (RBAC is the auth mechanism)', async () => {
|
||||
vi.mocked(client.post).mockResolvedValueOnce({ id: 'usr-1', email: 'admin@test.com' });
|
||||
const cmd = createCreateCommand({ client, log });
|
||||
await cmd.parseAsync([
|
||||
'user', 'admin@test.com',
|
||||
'--password', 'pass123',
|
||||
], { from: 'user' });
|
||||
|
||||
const callBody = vi.mocked(client.post).mock.calls[0]![1] as Record<string, unknown>;
|
||||
expect(callBody).not.toHaveProperty('role');
|
||||
});
|
||||
|
||||
it('requires --password', async () => {
|
||||
const cmd = createCreateCommand({ client, log });
|
||||
await expect(cmd.parseAsync(['user', 'alice@test.com'], { from: 'user' })).rejects.toThrow('--password is required');
|
||||
});
|
||||
|
||||
it('throws on 409 without --force', async () => {
|
||||
vi.mocked(client.post).mockRejectedValueOnce(new ApiError(409, '{"error":"User already exists"}'));
|
||||
const cmd = createCreateCommand({ client, log });
|
||||
await expect(
|
||||
cmd.parseAsync(['user', 'alice@test.com', '--password', 'pass'], { from: 'user' }),
|
||||
).rejects.toThrow('API error 409');
|
||||
});
|
||||
|
||||
it('updates existing user on 409 with --force', async () => {
|
||||
vi.mocked(client.post).mockRejectedValueOnce(new ApiError(409, '{"error":"User already exists"}'));
|
||||
vi.mocked(client.get).mockResolvedValueOnce([{ id: 'usr-1', email: 'alice@test.com' }] as never);
|
||||
const cmd = createCreateCommand({ client, log });
|
||||
await cmd.parseAsync([
|
||||
'user', 'alice@test.com', '--password', 'newpass', '--name', 'Alice New', '--force',
|
||||
], { from: 'user' });
|
||||
|
||||
expect(client.put).toHaveBeenCalledWith('/api/v1/users/usr-1', {
|
||||
password: 'newpass',
|
||||
name: 'Alice New',
|
||||
});
|
||||
expect(output.join('\n')).toContain("user 'alice@test.com' updated");
|
||||
});
|
||||
});
|
||||
|
||||
describe('create group', () => {
|
||||
it('creates a group with members', async () => {
|
||||
vi.mocked(client.post).mockResolvedValueOnce({ id: 'grp-1', name: 'dev-team' });
|
||||
const cmd = createCreateCommand({ client, log });
|
||||
await cmd.parseAsync([
|
||||
'group', 'dev-team',
|
||||
'--description', 'Development team',
|
||||
'--member', 'alice@test.com',
|
||||
'--member', 'bob@test.com',
|
||||
], { from: 'user' });
|
||||
|
||||
expect(client.post).toHaveBeenCalledWith('/api/v1/groups', {
|
||||
name: 'dev-team',
|
||||
description: 'Development team',
|
||||
members: ['alice@test.com', 'bob@test.com'],
|
||||
});
|
||||
expect(output.join('\n')).toContain("group 'dev-team' created");
|
||||
});
|
||||
|
||||
it('creates a group with no members', async () => {
|
||||
vi.mocked(client.post).mockResolvedValueOnce({ id: 'grp-1', name: 'empty-group' });
|
||||
const cmd = createCreateCommand({ client, log });
|
||||
await cmd.parseAsync(['group', 'empty-group'], { from: 'user' });
|
||||
|
||||
expect(client.post).toHaveBeenCalledWith('/api/v1/groups', {
|
||||
name: 'empty-group',
|
||||
members: [],
|
||||
});
|
||||
});
|
||||
|
||||
it('throws on 409 without --force', async () => {
|
||||
vi.mocked(client.post).mockRejectedValueOnce(new ApiError(409, '{"error":"Group already exists"}'));
|
||||
const cmd = createCreateCommand({ client, log });
|
||||
await expect(
|
||||
cmd.parseAsync(['group', 'dev-team'], { from: 'user' }),
|
||||
).rejects.toThrow('API error 409');
|
||||
});
|
||||
|
||||
it('updates existing group on 409 with --force', async () => {
|
||||
vi.mocked(client.post).mockRejectedValueOnce(new ApiError(409, '{"error":"Group already exists"}'));
|
||||
vi.mocked(client.get).mockResolvedValueOnce([{ id: 'grp-1', name: 'dev-team' }] as never);
|
||||
const cmd = createCreateCommand({ client, log });
|
||||
await cmd.parseAsync([
|
||||
'group', 'dev-team', '--member', 'new@test.com', '--force',
|
||||
], { from: 'user' });
|
||||
|
||||
expect(client.put).toHaveBeenCalledWith('/api/v1/groups/grp-1', {
|
||||
members: ['new@test.com'],
|
||||
});
|
||||
expect(output.join('\n')).toContain("group 'dev-team' updated");
|
||||
});
|
||||
});
|
||||
|
||||
describe('create rbac', () => {
|
||||
it('creates an RBAC definition with subjects and bindings', async () => {
|
||||
vi.mocked(client.post).mockResolvedValueOnce({ id: 'rbac-1', name: 'developers' });
|
||||
const cmd = createCreateCommand({ client, log });
|
||||
await cmd.parseAsync([
|
||||
'rbac', 'developers',
|
||||
'--subject', 'User:alice@test.com',
|
||||
'--subject', 'Group:dev-team',
|
||||
'--binding', 'edit:servers',
|
||||
'--binding', 'view:instances',
|
||||
], { from: 'user' });
|
||||
|
||||
expect(client.post).toHaveBeenCalledWith('/api/v1/rbac', {
|
||||
name: 'developers',
|
||||
subjects: [
|
||||
{ kind: 'User', name: 'alice@test.com' },
|
||||
{ kind: 'Group', name: 'dev-team' },
|
||||
],
|
||||
roleBindings: [
|
||||
{ role: 'edit', resource: 'servers' },
|
||||
{ role: 'view', resource: 'instances' },
|
||||
],
|
||||
});
|
||||
expect(output.join('\n')).toContain("rbac 'developers' created");
|
||||
});
|
||||
|
||||
it('creates an RBAC definition with wildcard resource', async () => {
|
||||
vi.mocked(client.post).mockResolvedValueOnce({ id: 'rbac-1', name: 'admins' });
|
||||
const cmd = createCreateCommand({ client, log });
|
||||
await cmd.parseAsync([
|
||||
'rbac', 'admins',
|
||||
'--subject', 'User:admin@test.com',
|
||||
'--binding', 'edit:*',
|
||||
], { from: 'user' });
|
||||
|
||||
expect(client.post).toHaveBeenCalledWith('/api/v1/rbac', {
|
||||
name: 'admins',
|
||||
subjects: [{ kind: 'User', name: 'admin@test.com' }],
|
||||
roleBindings: [{ role: 'edit', resource: '*' }],
|
||||
});
|
||||
});
|
||||
|
||||
it('creates an RBAC definition with empty subjects and bindings', async () => {
|
||||
vi.mocked(client.post).mockResolvedValueOnce({ id: 'rbac-1', name: 'empty' });
|
||||
const cmd = createCreateCommand({ client, log });
|
||||
await cmd.parseAsync(['rbac', 'empty'], { from: 'user' });
|
||||
|
||||
expect(client.post).toHaveBeenCalledWith('/api/v1/rbac', {
|
||||
name: 'empty',
|
||||
subjects: [],
|
||||
roleBindings: [],
|
||||
});
|
||||
});
|
||||
|
||||
it('throws on invalid subject format', async () => {
|
||||
const cmd = createCreateCommand({ client, log });
|
||||
await expect(
|
||||
cmd.parseAsync(['rbac', 'bad', '--subject', 'no-colon'], { from: 'user' }),
|
||||
).rejects.toThrow('Invalid subject format');
|
||||
});
|
||||
|
||||
it('throws on invalid binding format', async () => {
|
||||
const cmd = createCreateCommand({ client, log });
|
||||
await expect(
|
||||
cmd.parseAsync(['rbac', 'bad', '--binding', 'no-colon'], { from: 'user' }),
|
||||
).rejects.toThrow('Invalid binding format');
|
||||
});
|
||||
|
||||
it('throws on 409 without --force', async () => {
|
||||
vi.mocked(client.post).mockRejectedValueOnce(new ApiError(409, '{"error":"RBAC already exists"}'));
|
||||
const cmd = createCreateCommand({ client, log });
|
||||
await expect(
|
||||
cmd.parseAsync(['rbac', 'developers', '--subject', 'User:a@b.com', '--binding', 'edit:servers'], { from: 'user' }),
|
||||
).rejects.toThrow('API error 409');
|
||||
});
|
||||
|
||||
    it('updates existing RBAC on 409 with --force', async () => {
      vi.mocked(client.post).mockRejectedValueOnce(new ApiError(409, '{"error":"RBAC already exists"}'));
      vi.mocked(client.get).mockResolvedValueOnce([{ id: 'rbac-1', name: 'developers' }] as never);
      const cmd = createCreateCommand({ client, log });
      await cmd.parseAsync([
        'rbac', 'developers',
        '--subject', 'User:new@test.com',
        '--binding', 'edit:*',
        '--force',
      ], { from: 'user' });

      expect(client.put).toHaveBeenCalledWith('/api/v1/rbac/rbac-1', {
        subjects: [{ kind: 'User', name: 'new@test.com' }],
        roleBindings: [{ role: 'edit', resource: '*' }],
      });
      expect(output.join('\n')).toContain("rbac 'developers' updated");
    });

    it('creates an RBAC definition with operation bindings', async () => {
      vi.mocked(client.post).mockResolvedValueOnce({ id: 'rbac-1', name: 'ops' });
      const cmd = createCreateCommand({ client, log });
      await cmd.parseAsync([
        'rbac', 'ops',
        '--subject', 'Group:ops-team',
        '--binding', 'edit:servers',
        '--operation', 'logs',
        '--operation', 'backup',
      ], { from: 'user' });

      expect(client.post).toHaveBeenCalledWith('/api/v1/rbac', {
        name: 'ops',
        subjects: [{ kind: 'Group', name: 'ops-team' }],
        roleBindings: [
          { role: 'edit', resource: 'servers' },
          { role: 'run', action: 'logs' },
          { role: 'run', action: 'backup' },
        ],
      });
      expect(output.join('\n')).toContain("rbac 'ops' created");
    });

    it('creates an RBAC definition with name-scoped binding', async () => {
      vi.mocked(client.post).mockResolvedValueOnce({ id: 'rbac-1', name: 'ha-viewer' });
      const cmd = createCreateCommand({ client, log });
      await cmd.parseAsync([
        'rbac', 'ha-viewer',
        '--subject', 'User:alice@test.com',
        '--binding', 'view:servers:my-ha',
      ], { from: 'user' });

      expect(client.post).toHaveBeenCalledWith('/api/v1/rbac', {
        name: 'ha-viewer',
        subjects: [{ kind: 'User', name: 'alice@test.com' }],
        roleBindings: [
          { role: 'view', resource: 'servers', name: 'my-ha' },
        ],
      });
    });
  });

  describe('create prompt', () => {
    it('creates a prompt with content', async () => {
      vi.mocked(client.post).mockResolvedValueOnce({ id: 'p-1', name: 'test-prompt' });
      const cmd = createCreateCommand({ client, log });
      await cmd.parseAsync(['prompt', 'test-prompt', '--content', 'Hello world'], { from: 'user' });

      expect(client.post).toHaveBeenCalledWith('/api/v1/prompts', {
        name: 'test-prompt',
        content: 'Hello world',
      });
      expect(output.join('\n')).toContain("prompt 'test-prompt' created");
    });

    it('requires content or content-file', async () => {
      const cmd = createCreateCommand({ client, log });
      await expect(
        cmd.parseAsync(['prompt', 'no-content'], { from: 'user' }),
      ).rejects.toThrow('--content or --content-file is required');
    });

    it('--priority sets prompt priority', async () => {
      vi.mocked(client.post).mockResolvedValueOnce({ id: 'p-1', name: 'pri-prompt' });
      const cmd = createCreateCommand({ client, log });
      await cmd.parseAsync(['prompt', 'pri-prompt', '--content', 'x', '--priority', '8'], { from: 'user' });

      expect(client.post).toHaveBeenCalledWith('/api/v1/prompts', expect.objectContaining({
        priority: 8,
      }));
    });

    it('--priority validates range 1-10', async () => {
      const cmd = createCreateCommand({ client, log });
      await expect(
        cmd.parseAsync(['prompt', 'bad', '--content', 'x', '--priority', '15'], { from: 'user' }),
      ).rejects.toThrow('--priority must be a number between 1 and 10');
    });

    it('--priority rejects zero', async () => {
      const cmd = createCreateCommand({ client, log });
      await expect(
        cmd.parseAsync(['prompt', 'bad', '--content', 'x', '--priority', '0'], { from: 'user' }),
      ).rejects.toThrow('--priority must be a number between 1 and 10');
    });

    it('--link sets linkTarget', async () => {
      vi.mocked(client.post).mockResolvedValueOnce({ id: 'p-1', name: 'linked' });
      const cmd = createCreateCommand({ client, log });
      await cmd.parseAsync(['prompt', 'linked', '--content', 'x', '--link', 'proj/srv:docmost://pages/abc'], { from: 'user' });

      expect(client.post).toHaveBeenCalledWith('/api/v1/prompts', expect.objectContaining({
        linkTarget: 'proj/srv:docmost://pages/abc',
      }));
    });

    it('--project resolves project name to ID', async () => {
      vi.mocked(client.get).mockResolvedValueOnce([{ id: 'proj-1', name: 'my-project' }] as never);
      vi.mocked(client.post).mockResolvedValueOnce({ id: 'p-1', name: 'scoped' });
      const cmd = createCreateCommand({ client, log });
      await cmd.parseAsync(['prompt', 'scoped', '--content', 'x', '--project', 'my-project'], { from: 'user' });

      expect(client.post).toHaveBeenCalledWith('/api/v1/prompts', expect.objectContaining({
        projectId: 'proj-1',
      }));
    });

    it('--project throws when project not found', async () => {
      vi.mocked(client.get).mockResolvedValueOnce([] as never);
      const cmd = createCreateCommand({ client, log });
      await expect(
        cmd.parseAsync(['prompt', 'bad', '--content', 'x', '--project', 'nope'], { from: 'user' }),
      ).rejects.toThrow("Project 'nope' not found");
    });
  });

  describe('create promptrequest', () => {
    it('creates a prompt request with priority', async () => {
      vi.mocked(client.post).mockResolvedValueOnce({ id: 'r-1', name: 'req' });
      const cmd = createCreateCommand({ client, log });
      await cmd.parseAsync(['promptrequest', 'req', '--content', 'proposal', '--priority', '7'], { from: 'user' });

      expect(client.post).toHaveBeenCalledWith('/api/v1/promptrequests', expect.objectContaining({
        name: 'req',
        content: 'proposal',
        priority: 7,
      }));
    });
  });

  describe('create project', () => {
    it('creates a project with --gated', async () => {
      vi.mocked(client.post).mockResolvedValueOnce({ id: 'proj-1', name: 'gated-proj' });
      const cmd = createCreateCommand({ client, log });
      await cmd.parseAsync(['project', 'gated-proj', '--gated'], { from: 'user' });

      expect(client.post).toHaveBeenCalledWith('/api/v1/projects', expect.objectContaining({
        gated: true,
      }));
    });

    it('creates a project with --no-gated', async () => {
      vi.mocked(client.post).mockResolvedValueOnce({ id: 'proj-1', name: 'open-proj' });
      const cmd = createCreateCommand({ client, log });
      await cmd.parseAsync(['project', 'open-proj', '--no-gated'], { from: 'user' });

      expect(client.post).toHaveBeenCalledWith('/api/v1/projects', expect.objectContaining({
        gated: false,
      }));
    });
  });
});
@@ -1,42 +1,59 @@
import { describe, it, expect, vi } from 'vitest';
import { createDescribeCommand } from '../../src/commands/describe.js';
import type { DescribeCommandDeps } from '../../src/commands/describe.js';
import type { ApiClient } from '../../src/api-client.js';

function mockClient(): ApiClient {
  return {
    get: vi.fn(async () => []),
    post: vi.fn(async () => ({})),
    put: vi.fn(async () => ({})),
    delete: vi.fn(async () => {}),
  } as unknown as ApiClient;
}

function makeDeps(item: unknown = {}): DescribeCommandDeps & { output: string[] } {
  const output: string[] = [];
  return {
    output,
    client: mockClient(),
    fetchResource: vi.fn(async () => item),
    log: (...args: string[]) => output.push(args.join(' ')),
  };
}

describe('describe command', () => {
  it('shows detailed server info', async () => {
  it('shows detailed server info with sections', async () => {
    const deps = makeDeps({
      id: 'srv-1',
      name: 'slack',
      transport: 'STDIO',
      packageName: '@slack/mcp',
      dockerImage: null,
      envTemplate: [],
      env: [],
      createdAt: '2025-01-01',
    });
    const cmd = createDescribeCommand(deps);
    await cmd.parseAsync(['node', 'test', 'server', 'srv-1']);

    expect(deps.fetchResource).toHaveBeenCalledWith('servers', 'srv-1');
    const text = deps.output.join('\n');
    expect(text).toContain('--- Server ---');
    expect(text).toContain('name: slack');
    expect(text).toContain('transport: STDIO');
    expect(text).toContain('dockerImage: -');
    expect(text).toContain('=== Server: slack ===');
    expect(text).toContain('Name:');
    expect(text).toContain('slack');
    expect(text).toContain('Transport:');
    expect(text).toContain('STDIO');
    expect(text).toContain('Package:');
    expect(text).toContain('@slack/mcp');
    expect(text).toContain('Metadata:');
    expect(text).toContain('ID:');
  });

  it('resolves resource aliases', async () => {
    const deps = makeDeps({ id: 'p1' });
    const deps = makeDeps({ id: 's1' });
    const cmd = createDescribeCommand(deps);
    await cmd.parseAsync(['node', 'test', 'prof', 'p1']);
    expect(deps.fetchResource).toHaveBeenCalledWith('profiles', 'p1');
    await cmd.parseAsync(['node', 'test', 'sec', 's1']);
    expect(deps.fetchResource).toHaveBeenCalledWith('secrets', 's1');
  });

  it('outputs JSON format', async () => {
@@ -55,31 +72,625 @@ describe('describe command', () => {
    expect(deps.output[0]).toContain('name: slack');
  });

  it('formats nested objects', async () => {
  it('shows project detail', async () => {
    const deps = makeDeps({
      id: 'srv-1',
      name: 'slack',
      metadata: { version: '1.0', nested: { deep: true } },
      id: 'proj-1',
      name: 'my-project',
      description: 'A test project',
      ownerId: 'user-1',
      createdAt: '2025-01-01',
    });
    const cmd = createDescribeCommand(deps);
    await cmd.parseAsync(['node', 'test', 'server', 'srv-1']);
    await cmd.parseAsync(['node', 'test', 'project', 'proj-1']);

    const text = deps.output.join('\n');
    expect(text).toContain('metadata:');
    expect(text).toContain('version: 1.0');
    expect(text).toContain('=== Project: my-project ===');
    expect(text).toContain('A test project');
    expect(text).toContain('user-1');
  });

  it('formats arrays correctly', async () => {
  it('shows secret detail with masked values', async () => {
    const deps = makeDeps({
      id: 'sec-1',
      name: 'ha-creds',
      data: { TOKEN: 'abc123', URL: 'https://ha.local' },
      createdAt: '2025-01-01',
    });
    const cmd = createDescribeCommand(deps);
    await cmd.parseAsync(['node', 'test', 'secret', 'sec-1']);

    const text = deps.output.join('\n');
    expect(text).toContain('=== Secret: ha-creds ===');
    expect(text).toContain('TOKEN');
    expect(text).toContain('***');
    expect(text).not.toContain('abc123');
    expect(text).toContain('use --show-values to reveal');
  });

  it('shows secret detail with revealed values when --show-values', async () => {
    const deps = makeDeps({
      id: 'sec-1',
      name: 'ha-creds',
      data: { TOKEN: 'abc123' },
      createdAt: '2025-01-01',
    });
    const cmd = createDescribeCommand(deps);
    await cmd.parseAsync(['node', 'test', 'secret', 'sec-1', '--show-values']);

    const text = deps.output.join('\n');
    expect(text).toContain('abc123');
    expect(text).not.toContain('***');
  });

  it('shows instance detail with container info', async () => {
    const deps = makeDeps({
      id: 'inst-1',
      serverId: 'srv-1',
      status: 'RUNNING',
      containerId: 'abc123',
      port: 3000,
      createdAt: '2025-01-01',
    });
    const cmd = createDescribeCommand(deps);
    await cmd.parseAsync(['node', 'test', 'instance', 'inst-1']);

    const text = deps.output.join('\n');
    expect(text).toContain('=== Instance: inst-1 ===');
    expect(text).toContain('RUNNING');
    expect(text).toContain('abc123');
  });

  it('resolves server name to instance for describe instance', async () => {
    const deps = makeDeps({
      id: 'inst-1',
      serverId: 'srv-1',
      server: { name: 'my-grafana' },
      status: 'RUNNING',
      containerId: 'abc123',
      port: 3000,
    });
    // resolveNameOrId will throw (not a CUID, name won't match instances)
    vi.mocked(deps.client.get)
      .mockResolvedValueOnce([] as never) // instances list (no name match)
      .mockResolvedValueOnce([{ id: 'srv-1', name: 'my-grafana' }] as never) // servers list
      .mockResolvedValueOnce([{ id: 'inst-1', status: 'RUNNING' }] as never); // instances for server

    const cmd = createDescribeCommand(deps);
    await cmd.parseAsync(['node', 'test', 'instance', 'my-grafana']);

    expect(deps.fetchResource).toHaveBeenCalledWith('instances', 'inst-1');
  });

  it('resolves server name and picks running instance over stopped', async () => {
    const deps = makeDeps({
      id: 'inst-2',
      serverId: 'srv-1',
      server: { name: 'my-ha' },
      status: 'RUNNING',
      containerId: 'def456',
    });
    vi.mocked(deps.client.get)
      .mockResolvedValueOnce([] as never) // instances list
      .mockResolvedValueOnce([{ id: 'srv-1', name: 'my-ha' }] as never)
      .mockResolvedValueOnce([
        { id: 'inst-1', status: 'ERROR' },
        { id: 'inst-2', status: 'RUNNING' },
      ] as never);

    const cmd = createDescribeCommand(deps);
    await cmd.parseAsync(['node', 'test', 'instance', 'my-ha']);

    expect(deps.fetchResource).toHaveBeenCalledWith('instances', 'inst-2');
  });

  it('throws when no instances found for server name', async () => {
    const deps = makeDeps();
    vi.mocked(deps.client.get)
      .mockResolvedValueOnce([] as never) // instances list
      .mockResolvedValueOnce([{ id: 'srv-1', name: 'my-server' }] as never)
      .mockResolvedValueOnce([] as never); // no instances

    const cmd = createDescribeCommand(deps);
    await expect(cmd.parseAsync(['node', 'test', 'instance', 'my-server'])).rejects.toThrow(
      /No instances found/,
    );
  });

  it('shows instance with server name in header', async () => {
    const deps = makeDeps({
      id: 'inst-1',
      serverId: 'srv-1',
      server: { name: 'my-grafana' },
      status: 'RUNNING',
      containerId: 'abc123',
      port: 3000,
    });
    const cmd = createDescribeCommand(deps);
    await cmd.parseAsync(['node', 'test', 'instance', 'inst-1']);

    const text = deps.output.join('\n');
    expect(text).toContain('=== Instance: my-grafana ===');
  });

  it('shows instance health and events', async () => {
    const deps = makeDeps({
      id: 'inst-1',
      serverId: 'srv-1',
      server: { name: 'my-grafana' },
      status: 'RUNNING',
      containerId: 'abc123',
      healthStatus: 'healthy',
      lastHealthCheck: '2025-01-15T10:30:00Z',
      events: [
        { timestamp: '2025-01-15T10:30:00Z', type: 'Normal', message: 'Health check passed (45ms)' },
      ],
    });
    const cmd = createDescribeCommand(deps);
    await cmd.parseAsync(['node', 'test', 'instance', 'inst-1']);

    const text = deps.output.join('\n');
    expect(text).toContain('Health:');
    expect(text).toContain('healthy');
    expect(text).toContain('Events:');
    expect(text).toContain('Health check passed');
  });

  it('shows server healthCheck section', async () => {
    const deps = makeDeps({
      id: 'srv-1',
      permissions: ['read', 'write'],
      envTemplate: [],
      name: 'my-grafana',
      transport: 'STDIO',
      healthCheck: {
        tool: 'list_datasources',
        arguments: {},
        intervalSeconds: 60,
        timeoutSeconds: 10,
        failureThreshold: 3,
      },
    });
    const cmd = createDescribeCommand(deps);
    await cmd.parseAsync(['node', 'test', 'server', 'srv-1']);

    const text = deps.output.join('\n');
    expect(text).toContain('permissions: read, write');
    expect(text).toContain('envTemplate: []');
    expect(text).toContain('Health Check:');
    expect(text).toContain('list_datasources');
    expect(text).toContain('60s');
    expect(text).toContain('Failure Threshold:');
  });

  it('shows template detail with healthCheck and usage', async () => {
    const deps = makeDeps({
      id: 'tpl-1',
      name: 'grafana',
      transport: 'STDIO',
      version: '1.0.0',
      packageName: '@leval/mcp-grafana',
      env: [
        { name: 'GRAFANA_URL', required: true, description: 'Grafana instance URL' },
      ],
      healthCheck: {
        tool: 'list_datasources',
        arguments: {},
        intervalSeconds: 60,
        timeoutSeconds: 10,
        failureThreshold: 3,
      },
    });
    const cmd = createDescribeCommand(deps);
    await cmd.parseAsync(['node', 'test', 'template', 'tpl-1']);

    const text = deps.output.join('\n');
    expect(text).toContain('=== Template: grafana ===');
    expect(text).toContain('@leval/mcp-grafana');
    expect(text).toContain('GRAFANA_URL');
    expect(text).toContain('Health Check:');
    expect(text).toContain('list_datasources');
    expect(text).toContain('mcpctl create server my-grafana --from-template=grafana');
  });

  it('shows user detail (no Role field — RBAC is the auth mechanism)', async () => {
    const deps = makeDeps({
      id: 'usr-1',
      email: 'alice@test.com',
      name: 'Alice Smith',
      provider: null,
      createdAt: '2025-01-01',
      updatedAt: '2025-01-15',
    });
    const cmd = createDescribeCommand(deps);
    await cmd.parseAsync(['node', 'test', 'user', 'usr-1']);

    expect(deps.fetchResource).toHaveBeenCalledWith('users', 'usr-1');
    const text = deps.output.join('\n');
    expect(text).toContain('=== User: alice@test.com ===');
    expect(text).toContain('Email:');
    expect(text).toContain('alice@test.com');
    expect(text).toContain('Name:');
    expect(text).toContain('Alice Smith');
    expect(text).not.toContain('Role:');
    expect(text).toContain('Provider:');
    expect(text).toContain('local');
    expect(text).toContain('ID:');
    expect(text).toContain('usr-1');
  });

  it('shows user with no name as dash', async () => {
    const deps = makeDeps({
      id: 'usr-2',
      email: 'bob@test.com',
      name: null,
      provider: 'oidc',
    });
    const cmd = createDescribeCommand(deps);
    await cmd.parseAsync(['node', 'test', 'user', 'usr-2']);

    const text = deps.output.join('\n');
    expect(text).toContain('=== User: bob@test.com ===');
    expect(text).toContain('Name:');
    expect(text).toContain('-');
    expect(text).not.toContain('Role:');
    expect(text).toContain('oidc');
  });

  it('shows group detail with members', async () => {
    const deps = makeDeps({
      id: 'grp-1',
      name: 'dev-team',
      description: 'Development team',
      members: [
        { user: { email: 'alice@test.com' }, createdAt: '2025-01-01' },
        { user: { email: 'bob@test.com' }, createdAt: '2025-01-02' },
      ],
      createdAt: '2025-01-01',
      updatedAt: '2025-01-15',
    });
    const cmd = createDescribeCommand(deps);
    await cmd.parseAsync(['node', 'test', 'group', 'grp-1']);

    expect(deps.fetchResource).toHaveBeenCalledWith('groups', 'grp-1');
    const text = deps.output.join('\n');
    expect(text).toContain('=== Group: dev-team ===');
    expect(text).toContain('Name:');
    expect(text).toContain('dev-team');
    expect(text).toContain('Description:');
    expect(text).toContain('Development team');
    expect(text).toContain('Members:');
    expect(text).toContain('EMAIL');
    expect(text).toContain('ADDED');
    expect(text).toContain('alice@test.com');
    expect(text).toContain('bob@test.com');
    expect(text).toContain('ID:');
    expect(text).toContain('grp-1');
  });

  it('shows group detail with no members', async () => {
    const deps = makeDeps({
      id: 'grp-2',
      name: 'empty-group',
      description: '',
      members: [],
    });
    const cmd = createDescribeCommand(deps);
    await cmd.parseAsync(['node', 'test', 'group', 'grp-2']);

    const text = deps.output.join('\n');
    expect(text).toContain('=== Group: empty-group ===');
    // No Members section when empty
    expect(text).not.toContain('EMAIL');
  });

  it('shows RBAC detail with subjects and bindings', async () => {
    const deps = makeDeps({
      id: 'rbac-1',
      name: 'developers',
      subjects: [
        { kind: 'User', name: 'alice@test.com' },
        { kind: 'Group', name: 'dev-team' },
      ],
      roleBindings: [
        { role: 'edit', resource: 'servers' },
        { role: 'view', resource: 'instances' },
        { role: 'view', resource: 'projects' },
      ],
      createdAt: '2025-01-01',
      updatedAt: '2025-01-15',
    });
    const cmd = createDescribeCommand(deps);
    await cmd.parseAsync(['node', 'test', 'rbac', 'rbac-1']);

    expect(deps.fetchResource).toHaveBeenCalledWith('rbac', 'rbac-1');
    const text = deps.output.join('\n');
    expect(text).toContain('=== RBAC: developers ===');
    expect(text).toContain('Name:');
    expect(text).toContain('developers');
    // Subjects section
    expect(text).toContain('Subjects:');
    expect(text).toContain('KIND');
    expect(text).toContain('NAME');
    expect(text).toContain('User');
    expect(text).toContain('alice@test.com');
    expect(text).toContain('Group');
    expect(text).toContain('dev-team');
    // Role Bindings section
    expect(text).toContain('Resource Bindings:');
    expect(text).toContain('ROLE');
    expect(text).toContain('RESOURCE');
    expect(text).toContain('edit');
    expect(text).toContain('servers');
    expect(text).toContain('view');
    expect(text).toContain('instances');
    expect(text).toContain('projects');
    expect(text).toContain('ID:');
    expect(text).toContain('rbac-1');
  });

  it('shows RBAC detail with wildcard resource', async () => {
    const deps = makeDeps({
      id: 'rbac-2',
      name: 'admins',
      subjects: [{ kind: 'User', name: 'admin@test.com' }],
      roleBindings: [{ role: 'edit', resource: '*' }],
    });
    const cmd = createDescribeCommand(deps);
    await cmd.parseAsync(['node', 'test', 'rbac', 'rbac-2']);

    const text = deps.output.join('\n');
    expect(text).toContain('=== RBAC: admins ===');
    expect(text).toContain('edit');
    expect(text).toContain('*');
  });

  it('shows RBAC detail with empty subjects and bindings', async () => {
    const deps = makeDeps({
      id: 'rbac-3',
      name: 'empty-rbac',
      subjects: [],
      roleBindings: [],
    });
    const cmd = createDescribeCommand(deps);
    await cmd.parseAsync(['node', 'test', 'rbac', 'rbac-3']);

    const text = deps.output.join('\n');
    expect(text).toContain('=== RBAC: empty-rbac ===');
    // No Subjects or Role Bindings sections when empty
    expect(text).not.toContain('KIND');
    expect(text).not.toContain('ROLE');
    expect(text).not.toContain('RESOURCE');
  });

  it('shows RBAC detail with mixed resource and operation bindings', async () => {
    const deps = makeDeps({
      id: 'rbac-1',
      name: 'admin-access',
      subjects: [{ kind: 'Group', name: 'admin' }],
      roleBindings: [
        { role: 'edit', resource: '*' },
        { role: 'run', resource: 'projects' },
        { role: 'run', action: 'logs' },
        { role: 'run', action: 'backup' },
      ],
      createdAt: '2025-01-01',
    });
    const cmd = createDescribeCommand(deps);
    await cmd.parseAsync(['node', 'test', 'rbac', 'rbac-1']);

    const text = deps.output.join('\n');
    expect(text).toContain('Resource Bindings:');
    expect(text).toContain('edit');
    expect(text).toContain('*');
    expect(text).toContain('run');
    expect(text).toContain('projects');
    expect(text).toContain('Operations:');
    expect(text).toContain('ACTION');
    expect(text).toContain('logs');
    expect(text).toContain('backup');
  });

  it('shows RBAC detail with name-scoped resource binding', async () => {
    const deps = makeDeps({
      id: 'rbac-1',
      name: 'ha-viewer',
      subjects: [{ kind: 'User', name: 'alice@test.com' }],
      roleBindings: [
        { role: 'view', resource: 'servers', name: 'my-ha' },
        { role: 'edit', resource: 'secrets' },
      ],
    });
    const cmd = createDescribeCommand(deps);
    await cmd.parseAsync(['node', 'test', 'rbac', 'rbac-1']);

    const text = deps.output.join('\n');
    expect(text).toContain('Resource Bindings:');
    expect(text).toContain('NAME');
    expect(text).toContain('my-ha');
    expect(text).toContain('view');
    expect(text).toContain('servers');
  });

  it('shows user with direct RBAC permissions', async () => {
    const deps = makeDeps({
      id: 'usr-1',
      email: 'alice@test.com',
      name: 'Alice',
      provider: null,
    });
    vi.mocked(deps.client.get)
      .mockResolvedValueOnce([] as never) // users list (resolveNameOrId)
      .mockResolvedValueOnce([ // RBAC defs
        {
          name: 'dev-access',
          subjects: [{ kind: 'User', name: 'alice@test.com' }],
          roleBindings: [
            { role: 'edit', resource: 'servers' },
            { role: 'run', action: 'logs' },
          ],
        },
      ] as never)
      .mockResolvedValueOnce([] as never); // groups

    const cmd = createDescribeCommand(deps);
    await cmd.parseAsync(['node', 'test', 'user', 'usr-1']);

    const text = deps.output.join('\n');
    expect(text).toContain('=== User: alice@test.com ===');
    expect(text).toContain('Access:');
    expect(text).toContain('Direct (dev-access)');
    expect(text).toContain('Resources:');
    expect(text).toContain('edit');
    expect(text).toContain('servers');
    expect(text).toContain('Operations:');
    expect(text).toContain('logs');
  });

  it('shows user with inherited group permissions', async () => {
    const deps = makeDeps({
      id: 'usr-1',
      email: 'bob@test.com',
      name: 'Bob',
      provider: null,
    });
    vi.mocked(deps.client.get)
      .mockResolvedValueOnce([] as never) // users list
      .mockResolvedValueOnce([ // RBAC defs
        {
          name: 'team-perms',
          subjects: [{ kind: 'Group', name: 'dev-team' }],
          roleBindings: [
            { role: 'view', resource: '*' },
            { role: 'run', action: 'backup' },
          ],
        },
      ] as never)
      .mockResolvedValueOnce([ // groups
        { name: 'dev-team', members: [{ user: { email: 'bob@test.com' } }] },
      ] as never);

    const cmd = createDescribeCommand(deps);
    await cmd.parseAsync(['node', 'test', 'user', 'usr-1']);

    const text = deps.output.join('\n');
    expect(text).toContain('Groups:');
    expect(text).toContain('dev-team');
    expect(text).toContain('Access:');
    expect(text).toContain('Inherited (dev-team)');
    expect(text).toContain('view');
    expect(text).toContain('*');
    expect(text).toContain('backup');
  });

  it('shows user with no permissions', async () => {
    const deps = makeDeps({
      id: 'usr-1',
      email: 'nobody@test.com',
      name: null,
      provider: null,
    });
    vi.mocked(deps.client.get)
      .mockResolvedValueOnce([] as never)
      .mockResolvedValueOnce([] as never)
      .mockResolvedValueOnce([] as never);

    const cmd = createDescribeCommand(deps);
    await cmd.parseAsync(['node', 'test', 'user', 'usr-1']);

    const text = deps.output.join('\n');
    expect(text).toContain('Access: (none)');
  });

  it('shows group with RBAC permissions', async () => {
    const deps = makeDeps({
      id: 'grp-1',
      name: 'admin',
      description: 'Admin group',
      members: [{ user: { email: 'alice@test.com' } }],
    });
    vi.mocked(deps.client.get)
      .mockResolvedValueOnce([] as never) // groups list (resolveNameOrId)
      .mockResolvedValueOnce([ // RBAC defs
        {
          name: 'admin-access',
          subjects: [{ kind: 'Group', name: 'admin' }],
          roleBindings: [
            { role: 'edit', resource: '*' },
            { role: 'run', action: 'backup' },
            { role: 'run', action: 'restore' },
          ],
        },
      ] as never);

    const cmd = createDescribeCommand(deps);
    await cmd.parseAsync(['node', 'test', 'group', 'grp-1']);

    const text = deps.output.join('\n');
    expect(text).toContain('=== Group: admin ===');
    expect(text).toContain('Access:');
    expect(text).toContain('Granted (admin-access)');
    expect(text).toContain('edit');
    expect(text).toContain('*');
    expect(text).toContain('backup');
    expect(text).toContain('restore');
  });

  it('shows group with name-scoped permissions', async () => {
    const deps = makeDeps({
      id: 'grp-1',
      name: 'ha-team',
      description: 'HA team',
      members: [],
    });
    vi.mocked(deps.client.get)
      .mockResolvedValueOnce([] as never)
      .mockResolvedValueOnce([ // RBAC defs
        {
          name: 'ha-access',
          subjects: [{ kind: 'Group', name: 'ha-team' }],
          roleBindings: [
            { role: 'edit', resource: 'servers', name: 'my-ha' },
            { role: 'view', resource: 'secrets' },
          ],
        },
      ] as never);

    const cmd = createDescribeCommand(deps);
    await cmd.parseAsync(['node', 'test', 'group', 'grp-1']);

    const text = deps.output.join('\n');
    expect(text).toContain('Access:');
    expect(text).toContain('Granted (ha-access)');
    expect(text).toContain('my-ha');
    expect(text).toContain('NAME');
  });

  it('outputs user detail as JSON', async () => {
    const deps = makeDeps({ id: 'usr-1', email: 'alice@test.com', name: 'Alice', role: 'ADMIN' });
    const cmd = createDescribeCommand(deps);
    await cmd.parseAsync(['node', 'test', 'user', 'usr-1', '-o', 'json']);

    const parsed = JSON.parse(deps.output[0] ?? '');
    expect(parsed.email).toBe('alice@test.com');
    expect(parsed.role).toBe('ADMIN');
  });

  it('outputs group detail as YAML', async () => {
    const deps = makeDeps({ id: 'grp-1', name: 'dev-team', description: 'Devs' });
    const cmd = createDescribeCommand(deps);
    await cmd.parseAsync(['node', 'test', 'group', 'grp-1', '-o', 'yaml']);

    expect(deps.output[0]).toContain('name: dev-team');
  });

  it('outputs rbac detail as JSON', async () => {
    const deps = makeDeps({
      id: 'rbac-1',
      name: 'devs',
      subjects: [{ kind: 'User', name: 'a@b.com' }],
      roleBindings: [{ role: 'edit', resource: 'servers' }],
    });
    const cmd = createDescribeCommand(deps);
    await cmd.parseAsync(['node', 'test', 'rbac', 'rbac-1', '-o', 'json']);

    const parsed = JSON.parse(deps.output[0] ?? '');
    expect(parsed.subjects).toHaveLength(1);
    expect(parsed.roleBindings[0].role).toBe('edit');
  });
});

153 src/cli/tests/commands/edit.test.ts Normal file
@@ -0,0 +1,153 @@
import { describe, it, expect, vi, beforeEach } from 'vitest';
import { readFileSync, writeFileSync } from 'node:fs';
import yaml from 'js-yaml';
import { createEditCommand } from '../../src/commands/edit.js';
import type { ApiClient } from '../../src/api-client.js';

function mockClient(): ApiClient {
  return {
    get: vi.fn(async () => ({})),
    post: vi.fn(async () => ({})),
    put: vi.fn(async () => ({})),
    delete: vi.fn(async () => {}),
  } as unknown as ApiClient;
}

describe('edit command', () => {
  let client: ReturnType<typeof mockClient>;
  let output: string[];
  const log = (...args: unknown[]) => output.push(args.map(String).join(' '));

  beforeEach(() => {
    client = mockClient();
    output = [];
  });

  it('fetches server, opens editor, applies changes on save', async () => {
    // GET /api/v1/servers returns list for resolveNameOrId
    vi.mocked(client.get).mockImplementation(async (path: string) => {
      if (path === '/api/v1/servers') {
        return [{ id: 'srv-1', name: 'ha-mcp' }];
      }
      // GET /api/v1/servers/srv-1 returns full server
      return {
        id: 'srv-1',
        name: 'ha-mcp',
        description: 'Old desc',
        transport: 'STDIO',
        replicas: 1,
        createdAt: '2025-01-01',
        updatedAt: '2025-01-01',
        version: 1,
      };
    });

    const cmd = createEditCommand({
      client,
      log,
      getEditor: () => 'vi',
      openEditor: (filePath) => {
        // Simulate user editing the file
        const content = readFileSync(filePath, 'utf-8');
        const modified = content
          .replace('Old desc', 'New desc')
          .replace('replicas: 1', 'replicas: 3');
        writeFileSync(filePath, modified, 'utf-8');
      },
    });

    await cmd.parseAsync(['server', 'ha-mcp'], { from: 'user' });

    expect(client.put).toHaveBeenCalledWith('/api/v1/servers/srv-1', expect.objectContaining({
      description: 'New desc',
      replicas: 3,
    }));
    expect(output.join('\n')).toContain("server 'ha-mcp' updated");
  });

  it('detects no changes and skips PUT', async () => {
    vi.mocked(client.get).mockImplementation(async (path: string) => {
      if (path === '/api/v1/servers') return [{ id: 'srv-1', name: 'test' }];
      return {
        id: 'srv-1', name: 'test', description: '', transport: 'STDIO',
        createdAt: '2025-01-01', updatedAt: '2025-01-01', version: 1,
      };
    });

    const cmd = createEditCommand({
      client,
      log,
      getEditor: () => 'vi',
      openEditor: () => {
        // Don't modify the file
      },
    });

    await cmd.parseAsync(['server', 'test'], { from: 'user' });

    expect(client.put).not.toHaveBeenCalled();
    expect(output.join('\n')).toContain("unchanged");
  });

  it('handles empty file as cancel', async () => {
    vi.mocked(client.get).mockImplementation(async (path: string) => {
      if (path === '/api/v1/servers') return [{ id: 'srv-1', name: 'test' }];
      return { id: 'srv-1', name: 'test', createdAt: '2025-01-01', updatedAt: '2025-01-01', version: 1 };
    });

    const cmd = createEditCommand({
      client,
      log,
      getEditor: () => 'vi',
      openEditor: (filePath) => {
        writeFileSync(filePath, '', 'utf-8');
      },
    });

    await cmd.parseAsync(['server', 'test'], { from: 'user' });

    expect(client.put).not.toHaveBeenCalled();
    expect(output.join('\n')).toContain('cancelled');
  });

  it('strips read-only fields from editor content', async () => {
    vi.mocked(client.get).mockImplementation(async (path: string) => {
      if (path === '/api/v1/servers') return [{ id: 'srv-1', name: 'test' }];
      return {
        id: 'srv-1', name: 'test', description: '', transport: 'STDIO',
        createdAt: '2025-01-01', updatedAt: '2025-01-01', version: 1,
      };
    });

    let editorContent = '';
    const cmd = createEditCommand({
      client,
      log,
      getEditor: () => 'vi',
      openEditor: (filePath) => {
        editorContent = readFileSync(filePath, 'utf-8');
      },
    });

    await cmd.parseAsync(['server', 'test'], { from: 'user' });

    // The editor content should NOT contain read-only fields
    expect(editorContent).not.toContain('id:');
    expect(editorContent).not.toContain('createdAt');
    expect(editorContent).not.toContain('updatedAt');
    expect(editorContent).not.toContain('version');
    // But should contain editable fields
    expect(editorContent).toContain('name:');
  });

  it('rejects edit instance with error message', async () => {
    const cmd = createEditCommand({ client, log });

    await cmd.parseAsync(['instance', 'inst-1'], { from: 'user' });

    expect(client.get).not.toHaveBeenCalled();
    expect(client.put).not.toHaveBeenCalled();
    expect(output.join('\n')).toContain('immutable');
  });

});
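The read-only-field stripping these tests assert can be sketched roughly as follows. This is a hypothetical helper, not the actual `edit.js` implementation; the field list is an assumption inferred from the assertions above.

```typescript
// Hypothetical sketch: drop server-managed fields before the resource is
// serialized to YAML and handed to the user's editor.
// Field names are assumptions taken from the test assertions above.
const READ_ONLY_FIELDS = new Set(['id', 'createdAt', 'updatedAt', 'version']);

function toEditable(resource: Record<string, unknown>): Record<string, unknown> {
  return Object.fromEntries(
    Object.entries(resource).filter(([key]) => !READ_ONLY_FIELDS.has(key)),
  );
}
```

On save, the edited YAML would be parsed and diffed against the editable view of the original, so an unchanged or empty buffer can short-circuit the PUT.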
@@ -20,7 +20,7 @@ describe('get command', () => {
    const cmd = createGetCommand(deps);
    await cmd.parseAsync(['node', 'test', 'servers']);

    expect(deps.fetchResource).toHaveBeenCalledWith('servers', undefined);
    expect(deps.fetchResource).toHaveBeenCalledWith('servers', undefined, undefined);
    expect(deps.output[0]).toContain('NAME');
    expect(deps.output[0]).toContain('TRANSPORT');
    expect(deps.output.join('\n')).toContain('slack');
@@ -31,49 +31,51 @@ describe('get command', () => {
    const deps = makeDeps([]);
    const cmd = createGetCommand(deps);
    await cmd.parseAsync(['node', 'test', 'srv']);
    expect(deps.fetchResource).toHaveBeenCalledWith('servers', undefined);
    expect(deps.fetchResource).toHaveBeenCalledWith('servers', undefined, undefined);
  });

  it('passes ID when provided', async () => {
    const deps = makeDeps([{ id: 'srv-1', name: 'slack' }]);
    const cmd = createGetCommand(deps);
    await cmd.parseAsync(['node', 'test', 'servers', 'srv-1']);
    expect(deps.fetchResource).toHaveBeenCalledWith('servers', 'srv-1');
    expect(deps.fetchResource).toHaveBeenCalledWith('servers', 'srv-1', undefined);
  });

  it('outputs JSON format', async () => {
    const deps = makeDeps([{ id: 'srv-1', name: 'slack' }]);
  it('outputs apply-compatible JSON format', async () => {
    const deps = makeDeps([{ id: 'srv-1', name: 'slack', createdAt: '2025-01-01', updatedAt: '2025-01-01', version: 1 }]);
    const cmd = createGetCommand(deps);
    await cmd.parseAsync(['node', 'test', 'servers', '-o', 'json']);

    const parsed = JSON.parse(deps.output[0] ?? '');
    expect(parsed).toEqual({ id: 'srv-1', name: 'slack' });
    // Wrapped in resource key, internal fields stripped
    expect(parsed).toHaveProperty('servers');
    expect(parsed.servers[0].name).toBe('slack');
    expect(parsed.servers[0]).not.toHaveProperty('id');
    expect(parsed.servers[0]).not.toHaveProperty('createdAt');
    expect(parsed.servers[0]).not.toHaveProperty('updatedAt');
    expect(parsed.servers[0]).not.toHaveProperty('version');
  });

  it('outputs YAML format', async () => {
    const deps = makeDeps([{ id: 'srv-1', name: 'slack' }]);
  it('outputs apply-compatible YAML format', async () => {
    const deps = makeDeps([{ id: 'srv-1', name: 'slack', createdAt: '2025-01-01' }]);
    const cmd = createGetCommand(deps);
    await cmd.parseAsync(['node', 'test', 'servers', '-o', 'yaml']);
    expect(deps.output[0]).toContain('name: slack');
  });

  it('lists profiles with correct columns', async () => {
    const deps = makeDeps([
      { id: 'p1', name: 'default', serverId: 'srv-1' },
    ]);
    const cmd = createGetCommand(deps);
    await cmd.parseAsync(['node', 'test', 'profiles']);
    expect(deps.output[0]).toContain('NAME');
    expect(deps.output[0]).toContain('SERVER ID');
    const text = deps.output[0];
    expect(text).toContain('servers:');
    expect(text).toContain('name: slack');
    expect(text).not.toContain('id:');
    expect(text).not.toContain('createdAt:');
  });

  it('lists instances with correct columns', async () => {
    const deps = makeDeps([
      { id: 'inst-1', serverId: 'srv-1', status: 'RUNNING', containerId: 'abc123def456', port: 3000 },
      { id: 'inst-1', serverId: 'srv-1', server: { name: 'my-grafana' }, status: 'RUNNING', containerId: 'abc123def456', port: 3000 },
    ]);
    const cmd = createGetCommand(deps);
    await cmd.parseAsync(['node', 'test', 'instances']);
    expect(deps.output[0]).toContain('NAME');
    expect(deps.output[0]).toContain('STATUS');
    expect(deps.output.join('\n')).toContain('my-grafana');
    expect(deps.output.join('\n')).toContain('RUNNING');
  });

@@ -81,6 +83,255 @@ describe('get command', () => {
    const deps = makeDeps([]);
    const cmd = createGetCommand(deps);
    await cmd.parseAsync(['node', 'test', 'servers']);
    expect(deps.output[0]).toContain('No results');
    expect(deps.output[0]).toContain('No servers found');
  });

  it('lists users with correct columns (no ROLE column)', async () => {
    const deps = makeDeps([
      { id: 'usr-1', email: 'alice@test.com', name: 'Alice', provider: null },
      { id: 'usr-2', email: 'bob@test.com', name: null, provider: 'oidc' },
    ]);
    const cmd = createGetCommand(deps);
    await cmd.parseAsync(['node', 'test', 'users']);

    expect(deps.fetchResource).toHaveBeenCalledWith('users', undefined, undefined);
    const text = deps.output.join('\n');
    expect(text).toContain('EMAIL');
    expect(text).toContain('NAME');
    expect(text).not.toContain('ROLE');
    expect(text).toContain('PROVIDER');
    expect(text).toContain('alice@test.com');
    expect(text).toContain('Alice');
    expect(text).toContain('bob@test.com');
    expect(text).toContain('oidc');
  });

  it('resolves user alias', async () => {
    const deps = makeDeps([]);
    const cmd = createGetCommand(deps);
    await cmd.parseAsync(['node', 'test', 'user']);
    expect(deps.fetchResource).toHaveBeenCalledWith('users', undefined, undefined);
  });

  it('lists groups with correct columns', async () => {
    const deps = makeDeps([
      {
        id: 'grp-1',
        name: 'dev-team',
        description: 'Developers',
        members: [{ user: { email: 'alice@test.com' } }, { user: { email: 'bob@test.com' } }],
      },
      { id: 'grp-2', name: 'ops-team', description: 'Operations', members: [] },
    ]);
    const cmd = createGetCommand(deps);
    await cmd.parseAsync(['node', 'test', 'groups']);

    expect(deps.fetchResource).toHaveBeenCalledWith('groups', undefined, undefined);
    const text = deps.output.join('\n');
    expect(text).toContain('NAME');
    expect(text).toContain('MEMBERS');
    expect(text).toContain('DESCRIPTION');
    expect(text).toContain('dev-team');
    expect(text).toContain('2');
    expect(text).toContain('ops-team');
    expect(text).toContain('0');
  });

  it('resolves group alias', async () => {
    const deps = makeDeps([]);
    const cmd = createGetCommand(deps);
    await cmd.parseAsync(['node', 'test', 'group']);
    expect(deps.fetchResource).toHaveBeenCalledWith('groups', undefined, undefined);
  });

  it('lists rbac definitions with correct columns', async () => {
    const deps = makeDeps([
      {
        id: 'rbac-1',
        name: 'admins',
        subjects: [{ kind: 'User', name: 'admin@test.com' }],
        roleBindings: [{ role: 'edit', resource: '*' }],
      },
    ]);
    const cmd = createGetCommand(deps);
    await cmd.parseAsync(['node', 'test', 'rbac']);

    expect(deps.fetchResource).toHaveBeenCalledWith('rbac', undefined, undefined);
    const text = deps.output.join('\n');
    expect(text).toContain('NAME');
    expect(text).toContain('SUBJECTS');
    expect(text).toContain('BINDINGS');
    expect(text).toContain('admins');
    expect(text).toContain('User:admin@test.com');
    expect(text).toContain('edit:*');
  });

  it('resolves rbac-definition alias', async () => {
    const deps = makeDeps([]);
    const cmd = createGetCommand(deps);
    await cmd.parseAsync(['node', 'test', 'rbac-definition']);
    expect(deps.fetchResource).toHaveBeenCalledWith('rbac', undefined, undefined);
  });

  it('lists projects with new columns', async () => {
    const deps = makeDeps([{
      id: 'proj-1',
      name: 'smart-home',
      description: 'Home automation',
      proxyMode: 'filtered',
      ownerId: 'usr-1',
      servers: [{ server: { name: 'grafana' } }],
    }]);
    const cmd = createGetCommand(deps);
    await cmd.parseAsync(['node', 'test', 'projects']);

    const text = deps.output.join('\n');
    expect(text).toContain('MODE');
    expect(text).toContain('SERVERS');
    expect(text).toContain('smart-home');
    expect(text).toContain('filtered');
    expect(text).toContain('1');
  });

  it('displays mixed resource and operation bindings', async () => {
    const deps = makeDeps([
      {
        id: 'rbac-1',
        name: 'admin-access',
        subjects: [{ kind: 'Group', name: 'admin' }],
        roleBindings: [
          { role: 'edit', resource: '*' },
          { role: 'run', action: 'logs' },
          { role: 'run', action: 'backup' },
        ],
      },
    ]);
    const cmd = createGetCommand(deps);
    await cmd.parseAsync(['node', 'test', 'rbac']);

    const text = deps.output.join('\n');
    expect(text).toContain('edit:*');
    expect(text).toContain('run>logs');
    expect(text).toContain('run>backup');
  });

  it('displays name-scoped resource bindings', async () => {
    const deps = makeDeps([
      {
        id: 'rbac-1',
        name: 'ha-viewer',
        subjects: [{ kind: 'User', name: 'alice@test.com' }],
        roleBindings: [{ role: 'view', resource: 'servers', name: 'my-ha' }],
      },
    ]);
    const cmd = createGetCommand(deps);
    await cmd.parseAsync(['node', 'test', 'rbac']);

    const text = deps.output.join('\n');
    expect(text).toContain('view:servers:my-ha');
  });

  it('shows no results message for empty users list', async () => {
    const deps = makeDeps([]);
    const cmd = createGetCommand(deps);
    await cmd.parseAsync(['node', 'test', 'users']);
    expect(deps.output[0]).toContain('No users found');
  });

  it('shows no results message for empty groups list', async () => {
    const deps = makeDeps([]);
    const cmd = createGetCommand(deps);
    await cmd.parseAsync(['node', 'test', 'groups']);
    expect(deps.output[0]).toContain('No groups found');
  });

  it('shows no results message for empty rbac list', async () => {
    const deps = makeDeps([]);
    const cmd = createGetCommand(deps);
    await cmd.parseAsync(['node', 'test', 'rbac']);
    expect(deps.output[0]).toContain('No rbac found');
  });

  it('lists prompts with project name column', async () => {
    const deps = makeDeps([
      { id: 'p-1', name: 'debug-guide', projectId: 'proj-1', project: { name: 'smart-home' }, createdAt: '2025-01-01T00:00:00Z' },
      { id: 'p-2', name: 'global-rules', projectId: null, project: null, createdAt: '2025-01-01T00:00:00Z' },
    ]);
    const cmd = createGetCommand(deps);
    await cmd.parseAsync(['node', 'test', 'prompts']);

    const text = deps.output.join('\n');
    expect(text).toContain('NAME');
    expect(text).toContain('PROJECT');
    expect(text).toContain('debug-guide');
    expect(text).toContain('smart-home');
    expect(text).toContain('global-rules');
    expect(text).toContain('(global)');
  });

  it('lists promptrequests with project name column', async () => {
    const deps = makeDeps([
      { id: 'pr-1', name: 'new-rule', projectId: 'proj-1', project: { name: 'my-project' }, createdBySession: 'sess-abc123def456', createdAt: '2025-01-01T00:00:00Z' },
    ]);
    const cmd = createGetCommand(deps);
    await cmd.parseAsync(['node', 'test', 'promptrequests']);

    const text = deps.output.join('\n');
    expect(text).toContain('new-rule');
    expect(text).toContain('my-project');
    expect(text).toContain('sess-abc123d');
  });

  it('passes --project option to fetchResource', async () => {
    const deps = makeDeps([]);
    const cmd = createGetCommand(deps);
    await cmd.parseAsync(['node', 'test', 'prompts', '--project', 'smart-home']);

    expect(deps.fetchResource).toHaveBeenCalledWith('prompts', undefined, { project: 'smart-home' });
  });

  it('does not pass project when --project is not specified', async () => {
    const deps = makeDeps([]);
    const cmd = createGetCommand(deps);
    await cmd.parseAsync(['node', 'test', 'prompts']);

    expect(deps.fetchResource).toHaveBeenCalledWith('prompts', undefined, undefined);
  });

  it('passes --all flag to fetchResource', async () => {
    const deps = makeDeps([]);
    const cmd = createGetCommand(deps);
    await cmd.parseAsync(['node', 'test', 'prompts', '-A']);

    expect(deps.fetchResource).toHaveBeenCalledWith('prompts', undefined, { all: true });
  });

  it('passes both --project and --all when both given', async () => {
    const deps = makeDeps([]);
    const cmd = createGetCommand(deps);
    await cmd.parseAsync(['node', 'test', 'prompts', '--project', 'my-proj', '-A']);

    expect(deps.fetchResource).toHaveBeenCalledWith('prompts', undefined, { project: 'my-proj', all: true });
  });

  it('resolves prompt alias', async () => {
    const deps = makeDeps([]);
    const cmd = createGetCommand(deps);
    await cmd.parseAsync(['node', 'test', 'prompt']);
    expect(deps.fetchResource).toHaveBeenCalledWith('prompts', undefined, undefined);
  });

  it('resolves pr alias to promptrequests', async () => {
    const deps = makeDeps([]);
    const cmd = createGetCommand(deps);
    await cmd.parseAsync(['node', 'test', 'pr']);
    expect(deps.fetchResource).toHaveBeenCalledWith('promptrequests', undefined, undefined);
  });

  it('shows no results message for empty prompts list', async () => {
    const deps = makeDeps([]);
    const cmd = createGetCommand(deps);
    await cmd.parseAsync(['node', 'test', 'prompts']);
    expect(deps.output[0]).toContain('No prompts found');
  });
});
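The "apply-compatible" shape these tests assert (internal fields stripped, items wrapped under a plural resource key) can be sketched roughly as below. The helper name and exact field list are assumptions inferred from the assertions, not the real `get.js` code.

```typescript
// Hypothetical sketch: shape a fetched list for `-o json` / `-o yaml` so the
// output can be fed back to an `apply`-style command.
// Stripped field names are assumptions taken from the test assertions above.
const INTERNAL_FIELDS = new Set(['id', 'createdAt', 'updatedAt', 'version']);

function toApplyFormat(
  resource: string,
  items: Array<Record<string, unknown>>,
): Record<string, Array<Record<string, unknown>>> {
  const cleaned = items.map((item) =>
    Object.fromEntries(
      Object.entries(item).filter(([key]) => !INTERNAL_FIELDS.has(key)),
    ),
  );
  // Wrap under the resource key, e.g. { servers: [...] }
  return { [resource]: cleaned };
}
```

Serializing this object with `JSON.stringify` or `js-yaml` would then produce exactly the strings the JSON and YAML tests look for.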

@@ -1,5 +1,6 @@
import { describe, it, expect, vi, beforeEach } from 'vitest';
import { createInstanceCommands } from '../../src/commands/instances.js';
import { createDeleteCommand } from '../../src/commands/delete.js';
import { createLogsCommand } from '../../src/commands/logs.js';
import type { ApiClient } from '../../src/api-client.js';

function mockClient(): ApiClient {
@@ -11,7 +12,7 @@ function mockClient(): ApiClient {
  } as unknown as ApiClient;
}

describe('instance commands', () => {
describe('delete command', () => {
  let client: ReturnType<typeof mockClient>;
  let output: string[];
  const log = (...args: unknown[]) => output.push(args.map(String).join(' '));
@@ -21,107 +22,127 @@ describe('instance commands', () => {
    output = [];
  });

  describe('list', () => {
    it('shows no instances message when empty', async () => {
      const cmd = createInstanceCommands({ client, log });
      await cmd.parseAsync(['list'], { from: 'user' });
      expect(output.join('\n')).toContain('No instances found');
    });

    it('shows instance table', async () => {
      vi.mocked(client.get).mockResolvedValue([
        { id: 'inst-1', serverId: 'srv-1', status: 'RUNNING', containerId: 'ctr-abc123def', port: 3000, createdAt: '2025-01-01' },
      ]);
      const cmd = createInstanceCommands({ client, log });
      await cmd.parseAsync(['list'], { from: 'user' });
      expect(output.join('\n')).toContain('inst-1');
      expect(output.join('\n')).toContain('RUNNING');
    });

    it('filters by server', async () => {
      const cmd = createInstanceCommands({ client, log });
      await cmd.parseAsync(['list', '-s', 'srv-1'], { from: 'user' });
      expect(client.get).toHaveBeenCalledWith(expect.stringContaining('serverId=srv-1'));
    });

    it('outputs json', async () => {
      vi.mocked(client.get).mockResolvedValue([{ id: 'inst-1' }]);
      const cmd = createInstanceCommands({ client, log });
      await cmd.parseAsync(['list', '-o', 'json'], { from: 'user' });
      expect(output[0]).toContain('"id"');
    });
  it('deletes an instance by ID', async () => {
    const cmd = createDeleteCommand({ client, log });
    await cmd.parseAsync(['instance', 'inst-1'], { from: 'user' });
    expect(client.delete).toHaveBeenCalledWith('/api/v1/instances/inst-1');
    expect(output.join('\n')).toContain('deleted');
  });

  describe('start', () => {
    it('starts an instance', async () => {
      vi.mocked(client.post).mockResolvedValue({ id: 'inst-new', status: 'RUNNING' });
      const cmd = createInstanceCommands({ client, log });
      await cmd.parseAsync(['start', 'srv-1'], { from: 'user' });
      expect(client.post).toHaveBeenCalledWith('/api/v1/instances', { serverId: 'srv-1' });
      expect(output.join('\n')).toContain('started');
    });

    it('passes host port', async () => {
      vi.mocked(client.post).mockResolvedValue({ id: 'inst-new', status: 'RUNNING' });
      const cmd = createInstanceCommands({ client, log });
      await cmd.parseAsync(['start', 'srv-1', '-p', '8080'], { from: 'user' });
      expect(client.post).toHaveBeenCalledWith('/api/v1/instances', { serverId: 'srv-1', hostPort: 8080 });
    });
  it('deletes a server by ID', async () => {
    const cmd = createDeleteCommand({ client, log });
    await cmd.parseAsync(['server', 'srv-1'], { from: 'user' });
    expect(client.delete).toHaveBeenCalledWith('/api/v1/servers/srv-1');
    expect(output.join('\n')).toContain('deleted');
  });

  describe('stop', () => {
    it('stops an instance', async () => {
      vi.mocked(client.post).mockResolvedValue({ id: 'inst-1', status: 'STOPPED' });
      const cmd = createInstanceCommands({ client, log });
      await cmd.parseAsync(['stop', 'inst-1'], { from: 'user' });
      expect(client.post).toHaveBeenCalledWith('/api/v1/instances/inst-1/stop');
      expect(output.join('\n')).toContain('stopped');
    });
  it('resolves server name to ID', async () => {
    vi.mocked(client.get).mockResolvedValue([
      { id: 'srv-abc', name: 'ha-mcp' },
    ]);
    const cmd = createDeleteCommand({ client, log });
    await cmd.parseAsync(['server', 'ha-mcp'], { from: 'user' });
    expect(client.delete).toHaveBeenCalledWith('/api/v1/servers/srv-abc');
  });

  describe('restart', () => {
    it('restarts an instance', async () => {
      vi.mocked(client.post).mockResolvedValue({ id: 'inst-2', status: 'RUNNING' });
      const cmd = createInstanceCommands({ client, log });
      await cmd.parseAsync(['restart', 'inst-1'], { from: 'user' });
      expect(client.post).toHaveBeenCalledWith('/api/v1/instances/inst-1/restart');
      expect(output.join('\n')).toContain('restarted');
    });
  it('deletes a project', async () => {
    const cmd = createDeleteCommand({ client, log });
    await cmd.parseAsync(['project', 'proj-1'], { from: 'user' });
    expect(client.delete).toHaveBeenCalledWith('/api/v1/projects/proj-1');
  });

  describe('remove', () => {
    it('removes an instance', async () => {
      const cmd = createInstanceCommands({ client, log });
      await cmd.parseAsync(['remove', 'inst-1'], { from: 'user' });
      expect(client.delete).toHaveBeenCalledWith('/api/v1/instances/inst-1');
      expect(output.join('\n')).toContain('removed');
    });
  });

  describe('logs', () => {
    it('shows logs', async () => {
      vi.mocked(client.get).mockResolvedValue({ stdout: 'hello world\n', stderr: '' });
      const cmd = createInstanceCommands({ client, log });
      await cmd.parseAsync(['logs', 'inst-1'], { from: 'user' });
      expect(client.get).toHaveBeenCalledWith('/api/v1/instances/inst-1/logs');
      expect(output.join('\n')).toContain('hello world');
    });

    it('passes tail option', async () => {
      vi.mocked(client.get).mockResolvedValue({ stdout: '', stderr: '' });
      const cmd = createInstanceCommands({ client, log });
      await cmd.parseAsync(['logs', 'inst-1', '-t', '50'], { from: 'user' });
      expect(client.get).toHaveBeenCalledWith('/api/v1/instances/inst-1/logs?tail=50');
    });
  });

  describe('inspect', () => {
    it('shows container info as json', async () => {
      vi.mocked(client.get).mockResolvedValue({ containerId: 'ctr-abc', state: 'running' });
      const cmd = createInstanceCommands({ client, log });
      await cmd.parseAsync(['inspect', 'inst-1'], { from: 'user' });
      expect(client.get).toHaveBeenCalledWith('/api/v1/instances/inst-1/inspect');
      expect(output[0]).toContain('ctr-abc');
    });
  it('accepts resource aliases', async () => {
    const cmd = createDeleteCommand({ client, log });
    await cmd.parseAsync(['srv', 'srv-1'], { from: 'user' });
    expect(client.delete).toHaveBeenCalledWith('/api/v1/servers/srv-1');
  });
});

describe('logs command', () => {
  let client: ReturnType<typeof mockClient>;
  let output: string[];
  const log = (...args: unknown[]) => output.push(args.map(String).join(' '));

  beforeEach(() => {
    client = mockClient();
    output = [];
  });

  it('shows logs by instance ID', async () => {
    vi.mocked(client.get)
      .mockResolvedValueOnce({ id: 'inst-1', status: 'RUNNING' } as never) // instance lookup
      .mockResolvedValueOnce({ stdout: 'hello world\n', stderr: '' } as never); // logs
    const cmd = createLogsCommand({ client, log });
    await cmd.parseAsync(['inst-1'], { from: 'user' });
    expect(client.get).toHaveBeenCalledWith('/api/v1/instances/inst-1');
    expect(client.get).toHaveBeenCalledWith('/api/v1/instances/inst-1/logs');
    expect(output.join('\n')).toContain('hello world');
  });

  it('resolves server name to instance ID', async () => {
    vi.mocked(client.get)
      .mockRejectedValueOnce(new Error('not found')) // instance lookup fails
      .mockResolvedValueOnce([{ id: 'srv-1', name: 'my-grafana' }] as never) // servers list
      .mockResolvedValueOnce([{ id: 'inst-1', status: 'RUNNING', containerId: 'abc' }] as never) // instances for server
      .mockResolvedValueOnce({ stdout: 'grafana logs\n', stderr: '' } as never); // logs
    const cmd = createLogsCommand({ client, log });
    await cmd.parseAsync(['my-grafana'], { from: 'user' });
    expect(client.get).toHaveBeenCalledWith('/api/v1/instances/inst-1/logs');
    expect(output.join('\n')).toContain('grafana logs');
  });

  it('picks RUNNING instance over others', async () => {
    vi.mocked(client.get)
      .mockRejectedValueOnce(new Error('not found'))
      .mockResolvedValueOnce([{ id: 'srv-1', name: 'ha-mcp' }] as never)
      .mockResolvedValueOnce([
        { id: 'inst-err', status: 'ERROR', containerId: null },
        { id: 'inst-ok', status: 'RUNNING', containerId: 'abc' },
      ] as never)
      .mockResolvedValueOnce({ stdout: 'running instance\n', stderr: '' } as never);
    const cmd = createLogsCommand({ client, log });
    await cmd.parseAsync(['ha-mcp'], { from: 'user' });
    expect(client.get).toHaveBeenCalledWith('/api/v1/instances/inst-ok/logs');
  });

  it('selects specific replica with --instance', async () => {
    vi.mocked(client.get)
      .mockRejectedValueOnce(new Error('not found'))
      .mockResolvedValueOnce([{ id: 'srv-1', name: 'ha-mcp' }] as never)
      .mockResolvedValueOnce([
        { id: 'inst-0', status: 'RUNNING', containerId: 'a' },
        { id: 'inst-1', status: 'RUNNING', containerId: 'b' },
      ] as never)
      .mockResolvedValueOnce({ stdout: 'replica 1\n', stderr: '' } as never);
    const cmd = createLogsCommand({ client, log });
    await cmd.parseAsync(['ha-mcp', '-i', '1'], { from: 'user' });
    expect(client.get).toHaveBeenCalledWith('/api/v1/instances/inst-1/logs');
  });

  it('throws on out-of-range --instance index', async () => {
    vi.mocked(client.get)
      .mockRejectedValueOnce(new Error('not found'))
      .mockResolvedValueOnce([{ id: 'srv-1', name: 'ha-mcp' }] as never)
      .mockResolvedValueOnce([{ id: 'inst-0', status: 'RUNNING' }] as never);
    const cmd = createLogsCommand({ client, log });
    await expect(cmd.parseAsync(['ha-mcp', '-i', '5'], { from: 'user' })).rejects.toThrow('out of range');
  });

  it('throws when server has no instances', async () => {
    vi.mocked(client.get)
      .mockRejectedValueOnce(new Error('not found'))
      .mockResolvedValueOnce([{ id: 'srv-1', name: 'empty-srv' }] as never)
      .mockResolvedValueOnce([] as never);
    const cmd = createLogsCommand({ client, log });
    await expect(cmd.parseAsync(['empty-srv'], { from: 'user' })).rejects.toThrow('No instances found');
  });

  it('passes tail option', async () => {
    vi.mocked(client.get)
      .mockResolvedValueOnce({ id: 'inst-1' } as never)
      .mockResolvedValueOnce({ stdout: '', stderr: '' } as never);
    const cmd = createLogsCommand({ client, log });
    await cmd.parseAsync(['inst-1', '-t', '50'], { from: 'user' });
    expect(client.get).toHaveBeenCalledWith('/api/v1/instances/inst-1/logs?tail=50');
  });
});
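The replica-selection behaviour the last few tests exercise (prefer a RUNNING instance, honour an explicit `--instance` index, fail on an empty list or an out-of-range index) could look roughly like this. The helper name, types, and the fall-back-to-first rule are assumptions, not the actual `logs.js` implementation; only the error strings mirror the test assertions.

```typescript
// Hypothetical sketch of instance selection for the logs command.
interface Instance {
  id: string;
  status: string;
}

function pickInstance(instances: Instance[], index?: number): Instance {
  if (instances.length === 0) {
    // Matches the 'No instances found' rejection asserted above.
    throw new Error('No instances found');
  }
  if (index !== undefined) {
    if (index < 0 || index >= instances.length) {
      // Matches the 'out of range' rejection asserted above.
      throw new Error(`instance index ${index} out of range`);
    }
    return instances[index];
  }
  // Prefer a RUNNING replica; falling back to the first entry is an assumption.
  return instances.find((i) => i.status === 'RUNNING') ?? instances[0];
}
```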

481 src/cli/tests/commands/mcp.test.ts Normal file
@@ -0,0 +1,481 @@
import { describe, it, expect, beforeAll, afterAll } from 'vitest';
import http from 'node:http';
import { Readable, Writable } from 'node:stream';
import { runMcpBridge, createMcpCommand } from '../../src/commands/mcp.js';

// ---- Mock MCP server (simulates mcplocal project endpoint) ----

interface RecordedRequest {
  method: string;
  url: string;
  headers: http.IncomingHttpHeaders;
  body: string;
}

let mockServer: http.Server;
let mockPort: number;
const recorded: RecordedRequest[] = [];
let sessionCounter = 0;

function makeInitializeResponse(id: number | string) {
  return JSON.stringify({
    jsonrpc: '2.0',
    id,
    result: {
      protocolVersion: '2024-11-05',
      capabilities: { tools: {} },
      serverInfo: { name: 'test-server', version: '1.0.0' },
    },
  });
}

function makeToolsListResponse(id: number | string) {
  return JSON.stringify({
    jsonrpc: '2.0',
    id,
    result: {
      tools: [
        { name: 'grafana/query', description: 'Query Grafana', inputSchema: { type: 'object', properties: {} } },
      ],
    },
  });
}

function makeToolCallResponse(id: number | string) {
  return JSON.stringify({
    jsonrpc: '2.0',
    id,
    result: {
      content: [{ type: 'text', text: 'tool result' }],
    },
  });
}

beforeAll(async () => {
  mockServer = http.createServer((req, res) => {
    const chunks: Buffer[] = [];
    req.on('data', (c: Buffer) => chunks.push(c));
    req.on('end', () => {
      const body = Buffer.concat(chunks).toString('utf-8');
      recorded.push({ method: req.method ?? '', url: req.url ?? '', headers: req.headers, body });

      if (req.method === 'DELETE') {
        res.writeHead(200);
        res.end();
        return;
      }

      if (req.method === 'POST' && req.url?.startsWith('/projects/')) {
        let sessionId = req.headers['mcp-session-id'] as string | undefined;

        // Assign session ID on first request
        if (!sessionId) {
          sessionCounter++;
          sessionId = `session-${sessionCounter}`;
        }
        res.setHeader('mcp-session-id', sessionId);

        // Parse JSON-RPC and respond based on method
        try {
          const rpc = JSON.parse(body) as { id: number | string; method: string };
          let responseBody: string;

          switch (rpc.method) {
            case 'initialize':
              responseBody = makeInitializeResponse(rpc.id);
              break;
            case 'tools/list':
              responseBody = makeToolsListResponse(rpc.id);
              break;
            case 'tools/call':
              responseBody = makeToolCallResponse(rpc.id);
              break;
            default:
              responseBody = JSON.stringify({ jsonrpc: '2.0', id: rpc.id, error: { code: -32601, message: 'Method not found' } });
          }

          // Respond in SSE format for /projects/sse-project/mcp
          if (req.url?.includes('sse-project')) {
            res.writeHead(200, { 'Content-Type': 'text/event-stream' });
            res.end(`event: message\ndata: ${responseBody}\n\n`);
          } else {
            res.writeHead(200, { 'Content-Type': 'application/json' });
            res.end(responseBody);
          }
        } catch {
          res.writeHead(400, { 'Content-Type': 'application/json' });
          res.end(JSON.stringify({ error: 'Invalid JSON' }));
        }
        return;
      }

      res.writeHead(404);
      res.end();
    });
  });

  await new Promise<void>((resolve) => {
    mockServer.listen(0, () => {
      const addr = mockServer.address();
      if (addr && typeof addr === 'object') {
        mockPort = addr.port;
      }
      resolve();
    });
  });
});

afterAll(() => {
  mockServer.close();
});

// ---- Helper to run bridge with mock streams ----

function createMockStreams() {
  const stdoutChunks: string[] = [];
  const stderrChunks: string[] = [];

  const stdout = new Writable({
    write(chunk: Buffer, _encoding, callback) {
      stdoutChunks.push(chunk.toString());
      callback();
    },
  });

  const stderr = new Writable({
    write(chunk: Buffer, _encoding, callback) {
      stderrChunks.push(chunk.toString());
      callback();
    },
  });

  return { stdout, stderr, stdoutChunks, stderrChunks };
}

function pushAndEnd(stdin: Readable, lines: string[]) {
  for (const line of lines) {
    stdin.push(line + '\n');
  }
  stdin.push(null); // EOF
}

// ---- Tests ----

describe('MCP STDIO Bridge', () => {
  beforeAll(() => {
    recorded.length = 0;
    sessionCounter = 0;
  });

  it('forwards initialize request and returns response', async () => {
    recorded.length = 0;
    const stdin = new Readable({ read() {} });
    const { stdout, stdoutChunks } = createMockStreams();

    const initMsg = JSON.stringify({
      jsonrpc: '2.0', id: 1, method: 'initialize',
      params: { protocolVersion: '2024-11-05', capabilities: {}, clientInfo: { name: 'test', version: '1.0' } },
    });

    pushAndEnd(stdin, [initMsg]);

    await runMcpBridge({
      projectName: 'test-project',
      mcplocalUrl: `http://localhost:${mockPort}`,
      stdin, stdout, stderr: new Writable({ write(_, __, cb) { cb(); } }),
    });

    // Verify request was made to correct URL
    expect(recorded.some((r) => r.url === '/projects/test-project/mcp' && r.method === 'POST')).toBe(true);

    // Verify response on stdout
    const output = stdoutChunks.join('');
    const parsed = JSON.parse(output.trim());
    expect(parsed.result.serverInfo.name).toBe('test-server');
    expect(parsed.result.protocolVersion).toBe('2024-11-05');
  });

  it('sends session ID on subsequent requests', async () => {
    recorded.length = 0;
    const stdin = new Readable({ read() {} });
    const { stdout, stdoutChunks } = createMockStreams();

    const initMsg = JSON.stringify({
      jsonrpc: '2.0', id: 1, method: 'initialize',
      params: { protocolVersion: '2024-11-05', capabilities: {}, clientInfo: { name: 'test', version: '1.0' } },
    });
    const toolsListMsg = JSON.stringify({ jsonrpc: '2.0', id: 2, method: 'tools/list', params: {} });

    pushAndEnd(stdin, [initMsg, toolsListMsg]);

    await runMcpBridge({
      projectName: 'test-project',
      mcplocalUrl: `http://localhost:${mockPort}`,
      stdin, stdout, stderr: new Writable({ write(_, __, cb) { cb(); } }),
    });

    // First POST should NOT have mcp-session-id header
    const firstPost = recorded.find((r) => r.method === 'POST' && r.body.includes('initialize'));
    expect(firstPost).toBeDefined();
    expect(firstPost!.headers['mcp-session-id']).toBeUndefined();

    // Second POST SHOULD have mcp-session-id header
    const secondPost = recorded.find((r) => r.method === 'POST' && r.body.includes('tools/list'));
    expect(secondPost).toBeDefined();
    expect(secondPost!.headers['mcp-session-id']).toMatch(/^session-/);

    // Verify tools/list response
    const lines = stdoutChunks.join('').trim().split('\n');
    expect(lines.length).toBe(2);
    const toolsResponse = JSON.parse(lines[1]);
    expect(toolsResponse.result.tools[0].name).toBe('grafana/query');
  });

  it('forwards tools/call and returns result', async () => {
    recorded.length = 0;
    const stdin = new Readable({ read() {} });
    const { stdout, stdoutChunks } = createMockStreams();

    const initMsg = JSON.stringify({
      jsonrpc: '2.0', id: 1, method: 'initialize',
      params: { protocolVersion: '2024-11-05', capabilities: {}, clientInfo: { name: 'test', version: '1.0' } },
    });
    const callMsg = JSON.stringify({
      jsonrpc: '2.0', id: 2, method: 'tools/call',
      params: { name: 'grafana/query', arguments: { query: 'test' } },
    });

    pushAndEnd(stdin, [initMsg, callMsg]);

    await runMcpBridge({
      projectName: 'test-project',
      mcplocalUrl: `http://localhost:${mockPort}`,
      stdin, stdout, stderr: new Writable({ write(_, __, cb) { cb(); } }),
    });

    const lines = stdoutChunks.join('').trim().split('\n');
    expect(lines.length).toBe(2);
    const callResponse = JSON.parse(lines[1]);
    expect(callResponse.result.content[0].text).toBe('tool result');
  });

  it('forwards Authorization header when token provided', async () => {
    recorded.length = 0;
    const stdin = new Readable({ read() {} });
    const { stdout } = createMockStreams();

    const initMsg = JSON.stringify({
      jsonrpc: '2.0', id: 1, method: 'initialize',
      params: { protocolVersion: '2024-11-05', capabilities: {}, clientInfo: { name: 'test', version: '1.0' } },
    });

    pushAndEnd(stdin, [initMsg]);

    await runMcpBridge({
      projectName: 'test-project',
      mcplocalUrl: `http://localhost:${mockPort}`,
      token: 'my-secret-token',
      stdin, stdout, stderr: new Writable({ write(_, __, cb) { cb(); } }),
    });

    const post = recorded.find((r) => r.method === 'POST');
    expect(post).toBeDefined();
    expect(post!.headers['authorization']).toBe('Bearer my-secret-token');
  });

  it('does not send Authorization header when no token', async () => {
    recorded.length = 0;
    const stdin = new Readable({ read() {} });
    const { stdout } = createMockStreams();

    const initMsg = JSON.stringify({
      jsonrpc: '2.0', id: 1, method: 'initialize',
      params: { protocolVersion: '2024-11-05', capabilities: {}, clientInfo: { name: 'test', version: '1.0' } },
    });

    pushAndEnd(stdin, [initMsg]);

    await runMcpBridge({
      projectName: 'test-project',
      mcplocalUrl: `http://localhost:${mockPort}`,
      stdin, stdout, stderr: new Writable({ write(_, __, cb) { cb(); } }),
    });

    const post = recorded.find((r) => r.method === 'POST');
    expect(post).toBeDefined();
    expect(post!.headers['authorization']).toBeUndefined();
  });

  it('sends DELETE to clean up session on stdin EOF', async () => {
    recorded.length = 0;
    const stdin = new Readable({ read() {} });
    const { stdout } = createMockStreams();

    const initMsg = JSON.stringify({
      jsonrpc: '2.0', id: 1, method: 'initialize',
      params: { protocolVersion: '2024-11-05', capabilities: {}, clientInfo: { name: 'test', version: '1.0' } },
    });

    pushAndEnd(stdin, [initMsg]);

    await runMcpBridge({
      projectName: 'test-project',
      mcplocalUrl: `http://localhost:${mockPort}`,
      stdin, stdout, stderr: new Writable({ write(_, __, cb) { cb(); } }),
    });

    // Should have a DELETE request for session cleanup
    const deleteReq = recorded.find((r) => r.method === 'DELETE');
    expect(deleteReq).toBeDefined();
    expect(deleteReq!.headers['mcp-session-id']).toMatch(/^session-/);
  });

  it('does not send DELETE if no session was established', async () => {
    recorded.length = 0;
    const stdin = new Readable({ read() {} });
    const { stdout } = createMockStreams();

    // Push EOF immediately with no messages
    stdin.push(null);

    await runMcpBridge({
      projectName: 'test-project',
      mcplocalUrl: `http://localhost:${mockPort}`,
      stdin, stdout, stderr: new Writable({ write(_, __, cb) { cb(); } }),
    });

    expect(recorded.filter((r) => r.method === 'DELETE')).toHaveLength(0);
  });

  it('writes errors to stderr, not stdout', async () => {
    recorded.length = 0;
    const stdin = new Readable({ read() {} });
    const { stdout, stdoutChunks, stderr, stderrChunks } = createMockStreams();

    // Send to a non-existent port to trigger connection error
    const badMsg = JSON.stringify({ jsonrpc: '2.0', id: 1, method: 'initialize', params: {} });
    pushAndEnd(stdin, [badMsg]);

    await runMcpBridge({
      projectName: 'test-project',
      mcplocalUrl: 'http://localhost:1', // will fail to connect
      stdin, stdout, stderr,
    });

    // Error should be on stderr
    expect(stderrChunks.join('')).toContain('MCP bridge error');
    // stdout should be empty (no corrupted output)
    expect(stdoutChunks.join('')).toBe('');
  });

  it('skips blank lines in stdin', async () => {
    recorded.length = 0;
    const stdin = new Readable({ read() {} });
    const { stdout, stdoutChunks } = createMockStreams();

    const initMsg = JSON.stringify({
      jsonrpc: '2.0', id: 1, method: 'initialize',
      params: { protocolVersion: '2024-11-05', capabilities: {}, clientInfo: { name: 'test', version: '1.0' } },
    });

    pushAndEnd(stdin, ['', '  ', initMsg, '']);

    await runMcpBridge({
      projectName: 'test-project',
      mcplocalUrl: `http://localhost:${mockPort}`,
      stdin, stdout, stderr: new Writable({ write(_, __, cb) { cb(); } }),
    });

    // Only one POST (for the actual message)
    const posts = recorded.filter((r) => r.method === 'POST');
    expect(posts).toHaveLength(1);

    // One response line
    const lines = stdoutChunks.join('').trim().split('\n');
    expect(lines).toHaveLength(1);
  });

  it('handles SSE (text/event-stream) responses', async () => {
    recorded.length = 0;
    const stdin = new Readable({ read() {} });
    const { stdout, stdoutChunks } = createMockStreams();

    const initMsg = JSON.stringify({
      jsonrpc: '2.0', id: 1, method: 'initialize',
      params: { protocolVersion: '2024-11-05', capabilities: {}, clientInfo: { name: 'test', version: '1.0' } },
    });

    pushAndEnd(stdin, [initMsg]);

    await runMcpBridge({
      projectName: 'sse-project', // triggers SSE response from mock server
      mcplocalUrl: `http://localhost:${mockPort}`,
      stdin, stdout, stderr: new Writable({ write(_, __, cb) { cb(); } }),
    });

    // Should extract JSON from SSE data: lines
    const output = stdoutChunks.join('').trim();
    const parsed = JSON.parse(output);
    expect(parsed.result.serverInfo.name).toBe('test-server');
  });

  it('URL-encodes project name', async () => {
    recorded.length = 0;
    const stdin = new Readable({ read() {} });
    const { stdout } = createMockStreams();
    const { stderr } = createMockStreams();

    const initMsg = JSON.stringify({
      jsonrpc: '2.0', id: 1, method: 'initialize',
      params: { protocolVersion: '2024-11-05', capabilities: {}, clientInfo: { name: 'test', version: '1.0' } },
    });

    pushAndEnd(stdin, [initMsg]);

    await runMcpBridge({
      projectName: 'my project',
      mcplocalUrl: `http://localhost:${mockPort}`,
      stdin, stdout, stderr,
    });

    const post = recorded.find((r) => r.method === 'POST');
    expect(post?.url).toBe('/projects/my%20project/mcp');
  });
});

describe('createMcpCommand', () => {
  it('accepts --project option directly', () => {
    const cmd = createMcpCommand({
      getProject: () => undefined,
      configLoader: () => ({ mcplocalUrl: 'http://localhost:3200' }),
      credentialsLoader: () => null,
    });
    const opt = cmd.options.find((o) => o.long === '--project');
    expect(opt).toBeDefined();
    expect(opt!.short).toBe('-p');
  });

  it('parses --project from command args', async () => {
    let capturedProject: string | undefined;
    const cmd = createMcpCommand({
      getProject: () => undefined,
      configLoader: () => ({ mcplocalUrl: `http://localhost:${mockPort}` }),
      credentialsLoader: () => null,
    });
    // Override the action to capture what project was parsed
    // We test by checking the option parsing works, not by running the full bridge
    const parsed = cmd.parse(['--project', 'test-proj'], { from: 'user' });
    capturedProject = parsed.opts().project;
    expect(capturedProject).toBe('test-proj');
  });

  it('parses -p shorthand from command args', () => {
    const cmd = createMcpCommand({
      getProject: () => undefined,
      configLoader: () => ({ mcplocalUrl: `http://localhost:${mockPort}` }),
      credentialsLoader: () => null,
    });
    const parsed = cmd.parse(['-p', 'my-project'], { from: 'user' });
    expect(parsed.opts().project).toBe('my-project');
  });
});
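The SSE case tested above relies on the bridge pulling JSON-RPC payloads out of `data:` lines in a `text/event-stream` body (the mock server emits `event: message\ndata: {...}\n\n`). A minimal sketch of such an extractor — a hypothetical helper, not the actual bridge implementation:

```typescript
// Hypothetical sketch: extract JSON-RPC payloads from an SSE body of the
// form "event: message\ndata: {...}\n\n" (the format the mock server emits).
function extractSseData(body: string): string[] {
  const payloads: string[] = [];
  for (const line of body.split('\n')) {
    if (line.startsWith('data: ')) {
      payloads.push(line.slice('data: '.length));
    }
  }
  return payloads;
}

const sse = 'event: message\ndata: {"jsonrpc":"2.0","id":1,"result":{}}\n\n';
console.log(extractSseData(sse)[0]); // the raw JSON-RPC string
```

Each extracted payload can then be written to stdout as one newline-delimited JSON line, which is what the `stdoutChunks` assertions above parse.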
@@ -1,17 +1,19 @@
import { describe, it, expect, vi, beforeEach } from 'vitest';
import { createProjectCommand } from '../../src/commands/project.js';
import type { ApiClient } from '../../src/api-client.js';
import { createCreateCommand } from '../../src/commands/create.js';
import { createGetCommand } from '../../src/commands/get.js';
import { createDescribeCommand } from '../../src/commands/describe.js';
import { type ApiClient, ApiError } from '../../src/api-client.js';

function mockClient(): ApiClient {
  return {
    get: vi.fn(async () => []),
    post: vi.fn(async () => ({ id: 'proj-1', name: 'my-project' })),
    post: vi.fn(async () => ({ id: 'new-id', name: 'test' })),
    put: vi.fn(async () => ({})),
    delete: vi.fn(async () => {}),
  } as unknown as ApiClient;
}

describe('project command', () => {
describe('project with new fields', () => {
  let client: ReturnType<typeof mockClient>;
  let output: string[];
  const log = (...args: unknown[]) => output.push(args.map(String).join(' '));
@@ -21,91 +23,90 @@ describe('project command', () => {
    output = [];
  });

  describe('list', () => {
    it('shows no projects message when empty', async () => {
      const cmd = createProjectCommand({ client, log });
      await cmd.parseAsync(['list'], { from: 'user' });
      expect(output.join('\n')).toContain('No projects found');
  describe('create project with enhanced options', () => {
    it('creates project with proxy mode and servers', async () => {
      const cmd = createCreateCommand({ client, log });
      await cmd.parseAsync([
        'project', 'smart-home',
        '-d', 'Smart home project',
        '--proxy-mode', 'filtered',
        '--server', 'my-grafana',
        '--server', 'my-ha',
      ], { from: 'user' });

      expect(client.post).toHaveBeenCalledWith('/api/v1/projects', expect.objectContaining({
        name: 'smart-home',
        description: 'Smart home project',
        proxyMode: 'filtered',
        servers: ['my-grafana', 'my-ha'],
      }));
    });

    it('shows project table', async () => {
      vi.mocked(client.get).mockResolvedValue([
        { id: 'proj-1', name: 'dev', description: 'Dev project', ownerId: 'user-1', createdAt: '2025-01-01' },
      ]);
      const cmd = createProjectCommand({ client, log });
      await cmd.parseAsync(['list'], { from: 'user' });
      expect(output.join('\n')).toContain('proj-1');
      expect(output.join('\n')).toContain('dev');
    });
    it('defaults proxy mode to direct', async () => {
      const cmd = createCreateCommand({ client, log });
      await cmd.parseAsync(['project', 'basic'], { from: 'user' });

    it('outputs json', async () => {
      vi.mocked(client.get).mockResolvedValue([{ id: 'proj-1', name: 'dev' }]);
      const cmd = createProjectCommand({ client, log });
      await cmd.parseAsync(['list', '-o', 'json'], { from: 'user' });
      expect(output[0]).toContain('"id"');
      expect(client.post).toHaveBeenCalledWith('/api/v1/projects', expect.objectContaining({
        proxyMode: 'direct',
      }));
    });
  });

  describe('create', () => {
    it('creates a project', async () => {
      const cmd = createProjectCommand({ client, log });
      await cmd.parseAsync(['create', 'my-project', '-d', 'A test project'], { from: 'user' });
      expect(client.post).toHaveBeenCalledWith('/api/v1/projects', {
        name: 'my-project',
        description: 'A test project',
      });
      expect(output.join('\n')).toContain("Project 'my-project' created");
  describe('get projects shows new columns', () => {
    it('shows MODE and SERVERS columns', async () => {
      const deps = {
        output: [] as string[],
        fetchResource: vi.fn(async () => [{
          id: 'proj-1',
          name: 'smart-home',
          description: 'Test',
          proxyMode: 'filtered',
          ownerId: 'user-1',
          servers: [{ server: { name: 'grafana' } }, { server: { name: 'ha' } }],
        }]),
        log: (...args: string[]) => deps.output.push(args.join(' ')),
      };
      const cmd = createGetCommand(deps);
      await cmd.parseAsync(['node', 'test', 'projects']);

      const text = deps.output.join('\n');
      expect(text).toContain('MODE');
      expect(text).toContain('SERVERS');
      expect(text).toContain('smart-home');
    });
  });

  describe('delete', () => {
    it('deletes a project', async () => {
      const cmd = createProjectCommand({ client, log });
      await cmd.parseAsync(['delete', 'proj-1'], { from: 'user' });
      expect(client.delete).toHaveBeenCalledWith('/api/v1/projects/proj-1');
      expect(output.join('\n')).toContain('deleted');
    });
  });
  describe('describe project shows full detail', () => {
    it('shows servers and proxy config', async () => {
      const deps = {
        output: [] as string[],
        client: mockClient(),
        fetchResource: vi.fn(async () => ({
          id: 'proj-1',
          name: 'smart-home',
          description: 'Smart home',
          proxyMode: 'filtered',
          llmProvider: 'gemini-cli',
          llmModel: 'gemini-2.0-flash',
          ownerId: 'user-1',
          servers: [
            { server: { name: 'my-grafana' } },
            { server: { name: 'my-ha' } },
          ],
          createdAt: '2025-01-01',
          updatedAt: '2025-01-01',
        })),
        log: (...args: string[]) => deps.output.push(args.join(' ')),
      };
      const cmd = createDescribeCommand(deps);
      await cmd.parseAsync(['node', 'test', 'project', 'proj-1']);

  describe('show', () => {
    it('shows project details', async () => {
      vi.mocked(client.get).mockImplementation(async (url: string) => {
        if (url.endsWith('/profiles')) return [];
        return { id: 'proj-1', name: 'dev', description: 'Dev project', ownerId: 'user-1', createdAt: '2025-01-01' };
      });
      const cmd = createProjectCommand({ client, log });
      await cmd.parseAsync(['show', 'proj-1'], { from: 'user' });
      expect(output.join('\n')).toContain('Name: dev');
      expect(output.join('\n')).toContain('ID: proj-1');
    });
  });

  describe('profiles', () => {
    it('lists profiles for a project', async () => {
      vi.mocked(client.get).mockResolvedValue([
        { id: 'prof-1', name: 'default', serverId: 'srv-1' },
      ]);
      const cmd = createProjectCommand({ client, log });
      await cmd.parseAsync(['profiles', 'proj-1'], { from: 'user' });
      expect(client.get).toHaveBeenCalledWith('/api/v1/projects/proj-1/profiles');
      expect(output.join('\n')).toContain('default');
    });

    it('shows empty message when no profiles', async () => {
      const cmd = createProjectCommand({ client, log });
      await cmd.parseAsync(['profiles', 'proj-1'], { from: 'user' });
      expect(output.join('\n')).toContain('No profiles assigned');
    });
  });

  describe('set-profiles', () => {
    it('sets profiles for a project', async () => {
      const cmd = createProjectCommand({ client, log });
      await cmd.parseAsync(['set-profiles', 'proj-1', 'prof-1', 'prof-2'], { from: 'user' });
      expect(client.put).toHaveBeenCalledWith('/api/v1/projects/proj-1/profiles', {
        profileIds: ['prof-1', 'prof-2'],
      });
      expect(output.join('\n')).toContain('2 profile(s)');
      const text = deps.output.join('\n');
      expect(text).toContain('=== Project: smart-home ===');
      expect(text).toContain('filtered');
      expect(text).toContain('gemini-cli');
      expect(text).toContain('my-grafana');
      expect(text).toContain('my-ha');
    });
  });
});

@@ -1,141 +0,0 @@
import { describe, it, expect, vi, beforeEach } from 'vitest';
import { createSetupCommand } from '../../src/commands/setup.js';
import type { ApiClient } from '../../src/api-client.js';
import type { SetupPromptDeps } from '../../src/commands/setup.js';

function mockClient(): ApiClient {
  return {
    get: vi.fn(async () => []),
    post: vi.fn(async () => ({ id: 'new-id', name: 'test' })),
    put: vi.fn(async () => ({})),
    delete: vi.fn(async () => {}),
  } as unknown as ApiClient;
}

function mockPrompt(answers: Record<string, string | boolean>): SetupPromptDeps {
  const answersQueue = { ...answers };
  return {
    input: vi.fn(async (message: string) => {
      for (const [key, val] of Object.entries(answersQueue)) {
        if (message.toLowerCase().includes(key.toLowerCase()) && typeof val === 'string') {
          delete answersQueue[key];
          return val;
        }
      }
      return '';
    }),
    password: vi.fn(async () => 'secret-value'),
    select: vi.fn(async () => 'STDIO') as SetupPromptDeps['select'],
    confirm: vi.fn(async (message: string) => {
      if (message.includes('profile')) return true;
      if (message.includes('secret')) return false;
      if (message.includes('another')) return false;
      return false;
    }),
  };
}

describe('setup command', () => {
  let client: ReturnType<typeof mockClient>;
  let output: string[];
  const log = (...args: unknown[]) => output.push(args.map(String).join(' '));

  beforeEach(() => {
    client = mockClient();
    output = [];
  });

  it('creates server with prompted values', async () => {
    const prompt = mockPrompt({
      'transport': 'STDIO',
      'npm package': '@anthropic/slack-mcp',
      'docker image': '',
      'description': 'Slack server',
      'profile name': 'default',
      'environment variable name': '',
    });

    const cmd = createSetupCommand({ client, prompt, log });
    await cmd.parseAsync(['slack'], { from: 'user' });

    expect(client.post).toHaveBeenCalledWith('/api/v1/servers', expect.objectContaining({
      name: 'slack',
      transport: 'STDIO',
    }));
    expect(output.join('\n')).toContain("Server 'test' created");
  });

  it('creates profile with env vars', async () => {
    vi.mocked(client.post)
      .mockResolvedValueOnce({ id: 'srv-1', name: 'slack' }) // server create
      .mockResolvedValueOnce({ id: 'prof-1', name: 'default' }); // profile create

    const prompt = mockPrompt({
      'transport': 'STDIO',
      'npm package': '',
      'docker image': '',
      'description': '',
      'profile name': 'default',
    });
    // Override confirm to create profile and add one env var
    let confirmCallCount = 0;
    vi.mocked(prompt.confirm).mockImplementation(async (msg: string) => {
      confirmCallCount++;
      if (msg.includes('profile')) return true;
      if (msg.includes('secret')) return true;
      if (msg.includes('another')) return false;
      return false;
    });
    // Override input to provide env var name then empty to stop
    let inputCallCount = 0;
    vi.mocked(prompt.input).mockImplementation(async (msg: string) => {
      inputCallCount++;
      if (msg.includes('Profile name')) return 'default';
      if (msg.includes('variable name') && inputCallCount <= 8) return 'API_KEY';
      if (msg.includes('variable name')) return '';
      return '';
    });

    const cmd = createSetupCommand({ client, prompt, log });
    await cmd.parseAsync(['slack'], { from: 'user' });

    expect(client.post).toHaveBeenCalledTimes(2);
    const profileCall = vi.mocked(client.post).mock.calls[1];
    expect(profileCall?.[0]).toBe('/api/v1/profiles');
    expect(profileCall?.[1]).toEqual(expect.objectContaining({
      name: 'default',
      serverId: 'srv-1',
    }));
  });

  it('exits if server creation fails', async () => {
    vi.mocked(client.post).mockRejectedValue(new Error('conflict'));

    const prompt = mockPrompt({
      'npm package': '',
      'docker image': '',
      'description': '',
    });

    const cmd = createSetupCommand({ client, prompt, log });
    await cmd.parseAsync(['slack'], { from: 'user' });

    expect(output.join('\n')).toContain('Failed to create server');
    expect(client.post).toHaveBeenCalledTimes(1); // Only server create, no profile
  });

  it('skips profile creation when declined', async () => {
    const prompt = mockPrompt({
      'npm package': '',
      'docker image': '',
      'description': '',
    });
    vi.mocked(prompt.confirm).mockResolvedValue(false);

    const cmd = createSetupCommand({ client, prompt, log });
    await cmd.parseAsync(['test-server'], { from: 'user' });

    expect(client.post).toHaveBeenCalledTimes(1); // Only server create
    expect(output.join('\n')).toContain('Setup complete');
  });
});
@@ -3,19 +3,39 @@ import { mkdtempSync, rmSync } from 'node:fs';
|
||||
import { join } from 'node:path';
|
||||
import { tmpdir } from 'node:os';
|
||||
import { createStatusCommand } from '../../src/commands/status.js';
|
||||
import type { StatusCommandDeps } from '../../src/commands/status.js';
|
||||
import { saveConfig, DEFAULT_CONFIG } from '../../src/config/index.js';
|
||||
import { saveCredentials } from '../../src/auth/index.js';
|
||||
|
||||
let tempDir: string;
|
||||
let output: string[];
|
||||
let written: string[];
|
||||
|
||||
function log(...args: string[]) {
|
||||
output.push(args.join(' '));
|
||||
}
|
||||
|
||||
function write(text: string) {
|
||||
written.push(text);
|
||||
}
|
||||
|
||||
function baseDeps(overrides?: Partial<StatusCommandDeps>): Partial<StatusCommandDeps> {
|
||||
return {
|
||||
configDeps: { configDir: tempDir },
|
||||
credentialsDeps: { configDir: tempDir },
|
||||
log,
|
||||
write,
|
||||
checkHealth: async () => true,
|
||||
fetchProviders: async () => null,
|
||||
isTTY: false,
|
||||
...overrides,
|
||||
};
|
||||
}
|
||||
|
||||
beforeEach(() => {
|
||||
tempDir = mkdtempSync(join(tmpdir(), 'mcpctl-status-test-'));
|
||||
output = [];
|
||||
written = [];
|
||||
});
|
||||
|
||||
afterEach(() => {
|
||||
@@ -24,12 +44,7 @@ afterEach(() => {
|
||||
|
||||
describe('status command', () => {
|
||||
it('shows status in table format', async () => {
|
||||
const cmd = createStatusCommand({
|
||||
configDeps: { configDir: tempDir },
|
||||
credentialsDeps: { configDir: tempDir },
|
||||
log,
|
||||
checkHealth: async () => true,
|
||||
});
|
||||
const cmd = createStatusCommand(baseDeps());
|
||||
await cmd.parseAsync([], { from: 'user' });
|
||||
const out = output.join('\n');
|
||||
expect(out).toContain('mcpctl v');
|
||||
@@ -39,46 +54,26 @@ describe('status command', () => {
|
||||
});

  it('shows unreachable when daemons are down', async () => {
-    const cmd = createStatusCommand({
-      configDeps: { configDir: tempDir },
-      credentialsDeps: { configDir: tempDir },
-      log,
-      checkHealth: async () => false,
-    });
+    const cmd = createStatusCommand(baseDeps({ checkHealth: async () => false }));
    await cmd.parseAsync([], { from: 'user' });
    expect(output.join('\n')).toContain('unreachable');
  });

  it('shows not logged in when no credentials', async () => {
-    const cmd = createStatusCommand({
-      configDeps: { configDir: tempDir },
-      credentialsDeps: { configDir: tempDir },
-      log,
-      checkHealth: async () => true,
-    });
+    const cmd = createStatusCommand(baseDeps());
    await cmd.parseAsync([], { from: 'user' });
    expect(output.join('\n')).toContain('not logged in');
  });

  it('shows logged in user when credentials exist', async () => {
    saveCredentials({ token: 'tok', mcpdUrl: 'http://x:3100', user: 'alice@example.com' }, { configDir: tempDir });
-    const cmd = createStatusCommand({
-      configDeps: { configDir: tempDir },
-      credentialsDeps: { configDir: tempDir },
-      log,
-      checkHealth: async () => true,
-    });
+    const cmd = createStatusCommand(baseDeps());
    await cmd.parseAsync([], { from: 'user' });
    expect(output.join('\n')).toContain('logged in as alice@example.com');
  });

  it('shows status in JSON format', async () => {
-    const cmd = createStatusCommand({
-      configDeps: { configDir: tempDir },
-      credentialsDeps: { configDir: tempDir },
-      log,
-      checkHealth: async () => true,
-    });
+    const cmd = createStatusCommand(baseDeps());
    await cmd.parseAsync(['-o', 'json'], { from: 'user' });
    const parsed = JSON.parse(output[0]) as Record<string, unknown>;
    expect(parsed['version']).toBe('0.1.0');
@@ -87,12 +82,7 @@ describe('status command', () => {
  });

  it('shows status in YAML format', async () => {
-    const cmd = createStatusCommand({
-      configDeps: { configDir: tempDir },
-      credentialsDeps: { configDir: tempDir },
-      log,
-      checkHealth: async () => false,
-    });
+    const cmd = createStatusCommand(baseDeps({ checkHealth: async () => false }));
    await cmd.parseAsync(['-o', 'yaml'], { from: 'user' });
    expect(output[0]).toContain('mcplocalReachable: false');
  });
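The JSON and YAML tests above parse the serialized status payload. A minimal sketch of that payload and its two serializations — field names are taken from the assertions (`version`, `mcplocalReachable`, `llm`, `llmStatus`); any other fields in the real payload are unknown here, and the per-key YAML emitter is an illustration, not the CLI's actual serializer:

```typescript
// Status object shaped after what the -o json / -o yaml assertions check.
const status = {
  version: '0.1.0',
  mcplocalReachable: false,
  llm: null as string | null,       // null when no LLM is configured
  llmStatus: null as string | null,
};

const json = JSON.stringify(status);

// A flat object serializes to one `key: value` YAML line per entry,
// which is what expect(output[0]).toContain('mcplocalReachable: false') relies on.
const yaml = Object.entries(status)
  .map(([k, v]) => `${k}: ${v === null ? 'null' : String(v)}`)
  .join('\n');
```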
@@ -100,15 +90,12 @@ describe('status command', () => {
  it('checks correct URLs from config', async () => {
    saveConfig({ ...DEFAULT_CONFIG, mcplocalUrl: 'http://local:3200', mcpdUrl: 'http://remote:3100' }, { configDir: tempDir });
    const checkedUrls: string[] = [];
-    const cmd = createStatusCommand({
-      configDeps: { configDir: tempDir },
-      credentialsDeps: { configDir: tempDir },
-      log,
+    const cmd = createStatusCommand(baseDeps({
      checkHealth: async (url) => {
        checkedUrls.push(url);
        return false;
      },
-    });
+    }));
    await cmd.parseAsync([], { from: 'user' });
    expect(checkedUrls).toContain('http://local:3200');
    expect(checkedUrls).toContain('http://remote:3100');
@@ -116,14 +103,100 @@ describe('status command', () => {

  it('shows registries from config', async () => {
    saveConfig({ ...DEFAULT_CONFIG, registries: ['official'] }, { configDir: tempDir });
-    const cmd = createStatusCommand({
-      configDeps: { configDir: tempDir },
-      credentialsDeps: { configDir: tempDir },
-      log,
-      checkHealth: async () => true,
-    });
+    const cmd = createStatusCommand(baseDeps());
    await cmd.parseAsync([], { from: 'user' });
    expect(output.join('\n')).toContain('official');
    expect(output.join('\n')).not.toContain('glama');
  });

  it('shows LLM not configured hint when no LLM is set', async () => {
    const cmd = createStatusCommand(baseDeps());
    await cmd.parseAsync([], { from: 'user' });
    const out = output.join('\n');
    expect(out).toContain('LLM:');
    expect(out).toContain('not configured');
    expect(out).toContain('mcpctl config setup');
  });

  it('shows green check when LLM is healthy (non-TTY)', async () => {
    saveConfig({ ...DEFAULT_CONFIG, llm: { provider: 'anthropic', model: 'claude-haiku-3-5-20241022' } }, { configDir: tempDir });
    const cmd = createStatusCommand(baseDeps({ checkLlm: async () => 'ok' }));
    await cmd.parseAsync([], { from: 'user' });
    const out = output.join('\n');
    expect(out).toContain('anthropic / claude-haiku-3-5-20241022');
    expect(out).toContain('✓ ok');
  });

  it('shows red cross when LLM check fails (non-TTY)', async () => {
    saveConfig({ ...DEFAULT_CONFIG, llm: { provider: 'gemini-cli', model: 'gemini-2.5-flash' } }, { configDir: tempDir });
    const cmd = createStatusCommand(baseDeps({ checkLlm: async () => 'not authenticated' }));
    await cmd.parseAsync([], { from: 'user' });
    const out = output.join('\n');
    expect(out).toContain('✗ not authenticated');
  });

  it('shows error message from mcplocal', async () => {
    saveConfig({ ...DEFAULT_CONFIG, llm: { provider: 'gemini-cli', model: 'gemini-2.5-flash' } }, { configDir: tempDir });
    const cmd = createStatusCommand(baseDeps({ checkLlm: async () => 'binary not found' }));
    await cmd.parseAsync([], { from: 'user' });
    expect(output.join('\n')).toContain('✗ binary not found');
  });

  it('queries mcplocal URL for LLM health', async () => {
    saveConfig({ ...DEFAULT_CONFIG, mcplocalUrl: 'http://custom:9999', llm: { provider: 'gemini-cli', model: 'gemini-2.5-flash' } }, { configDir: tempDir });
    let queriedUrl = '';
    const cmd = createStatusCommand(baseDeps({
      checkLlm: async (url) => { queriedUrl = url; return 'ok'; },
    }));
    await cmd.parseAsync([], { from: 'user' });
    expect(queriedUrl).toBe('http://custom:9999');
  });

  it('uses spinner on TTY and writes final result', async () => {
    saveConfig({ ...DEFAULT_CONFIG, llm: { provider: 'gemini-cli', model: 'gemini-2.5-flash' } }, { configDir: tempDir });
    const cmd = createStatusCommand(baseDeps({
      isTTY: true,
      checkLlm: async () => 'ok',
    }));
    await cmd.parseAsync([], { from: 'user' });
    // On TTY, the final LLM line goes through write(), not log()
    const finalWrite = written[written.length - 1];
    expect(finalWrite).toContain('gemini-cli / gemini-2.5-flash');
    expect(finalWrite).toContain('✓ ok');
  });

  it('uses spinner on TTY and shows failure', async () => {
    saveConfig({ ...DEFAULT_CONFIG, llm: { provider: 'gemini-cli', model: 'gemini-2.5-flash' } }, { configDir: tempDir });
    const cmd = createStatusCommand(baseDeps({
      isTTY: true,
      checkLlm: async () => 'not authenticated',
    }));
    await cmd.parseAsync([], { from: 'user' });
    const finalWrite = written[written.length - 1];
    expect(finalWrite).toContain('✗ not authenticated');
  });

  it('shows not configured when LLM provider is none', async () => {
    saveConfig({ ...DEFAULT_CONFIG, llm: { provider: 'none' } }, { configDir: tempDir });
    const cmd = createStatusCommand(baseDeps());
    await cmd.parseAsync([], { from: 'user' });
    expect(output.join('\n')).toContain('not configured');
  });

  it('includes llm and llmStatus in JSON output', async () => {
    saveConfig({ ...DEFAULT_CONFIG, llm: { provider: 'gemini-cli', model: 'gemini-2.5-flash' } }, { configDir: tempDir });
    const cmd = createStatusCommand(baseDeps({ checkLlm: async () => 'ok' }));
    await cmd.parseAsync(['-o', 'json'], { from: 'user' });
    const parsed = JSON.parse(output[0]) as Record<string, unknown>;
    expect(parsed['llm']).toBe('gemini-cli / gemini-2.5-flash');
    expect(parsed['llmStatus']).toBe('ok');
  });

  it('includes null llm in JSON output when not configured', async () => {
    const cmd = createStatusCommand(baseDeps());
    await cmd.parseAsync(['-o', 'json'], { from: 'user' });
    const parsed = JSON.parse(output[0]) as Record<string, unknown>;
    expect(parsed['llm']).toBeNull();
    expect(parsed['llmStatus']).toBeNull();
  });
});
src/cli/tests/completions.test.ts (new file, 176 lines)
@@ -0,0 +1,176 @@
import { describe, it, expect } from 'vitest';
import { readFileSync } from 'node:fs';
import { join, dirname } from 'node:path';
import { fileURLToPath } from 'node:url';

const root = join(dirname(fileURLToPath(import.meta.url)), '..', '..', '..');
const fishFile = readFileSync(join(root, 'completions', 'mcpctl.fish'), 'utf-8');
const bashFile = readFileSync(join(root, 'completions', 'mcpctl.bash'), 'utf-8');

describe('fish completions', () => {
  it('erases stale completions at the top', () => {
    const lines = fishFile.split('\n');
    const firstComplete = lines.findIndex((l) => l.startsWith('complete '));
    expect(lines[firstComplete]).toContain('-e');
  });

  it('does not offer resource types without __mcpctl_needs_resource_type guard', () => {
    const resourceTypes = ['servers', 'instances', 'secrets', 'templates', 'projects', 'users', 'groups', 'rbac', 'prompts', 'promptrequests'];
    const lines = fishFile.split('\n').filter((l) => l.startsWith('complete '));

    for (const line of lines) {
      // Find lines that offer resource types as positional args
      const offersResourceType = resourceTypes.some((r) => {
        // Match `-a "...servers..."` or `-a 'servers projects'`
        const aMatch = line.match(/-a\s+['"]([^'"]+)['"]/);
        if (!aMatch) return false;
        return aMatch[1].split(/\s+/).includes(r);
      });

      if (!offersResourceType) continue;

      // Skip the help completions line and the -e line
      if (line.includes('__fish_seen_subcommand_from help')) continue;
      // Skip project-scoped command offerings (those offer commands, not resource types)
      if (line.includes('attach-server') || line.includes('detach-server')) continue;
      // Skip lines that offer commands (not resource types)
      if (line.includes("-d 'Show") || line.includes("-d 'Manage") || line.includes("-d 'Authenticate") ||
          line.includes("-d 'Log out'") || line.includes("-d 'Get instance") || line.includes("-d 'Create a resource'") ||
          line.includes("-d 'Edit a resource'") || line.includes("-d 'Apply") || line.includes("-d 'Backup") ||
          line.includes("-d 'Restore") || line.includes("-d 'List resources") || line.includes("-d 'Delete a resource'")) continue;

      // Lines offering resource types MUST have __mcpctl_needs_resource_type in their condition
      expect(line, `Resource type completion missing guard: ${line}`).toContain('__mcpctl_needs_resource_type');
    }
  });

  it('resource name completions require resource type to be selected', () => {
    const lines = fishFile.split('\n').filter((l) => l.startsWith('complete') && l.includes('__mcpctl_resource_names'));
    expect(lines.length).toBeGreaterThan(0);
    for (const line of lines) {
      expect(line).toContain('not __mcpctl_needs_resource_type');
    }
  });

  it('defines --project option', () => {
    expect(fishFile).toContain("complete -c mcpctl -l project");
  });

  it('attach-server command only shows with --project', () => {
    // Only check lines that OFFER attach-server as a command (via -a attach-server), not argument completions
    const lines = fishFile.split('\n').filter((l) =>
      l.startsWith('complete') && l.includes("-a attach-server"));
    expect(lines.length).toBeGreaterThan(0);
    for (const line of lines) {
      expect(line).toContain('__mcpctl_has_project');
    }
  });

  it('detach-server command only shows with --project', () => {
    const lines = fishFile.split('\n').filter((l) =>
      l.startsWith('complete') && l.includes("-a detach-server"));
    expect(lines.length).toBeGreaterThan(0);
    for (const line of lines) {
      expect(line).toContain('__mcpctl_has_project');
    }
  });

  it('resource name functions use jq .[][].name to unwrap wrapped JSON and avoid nested matches', () => {
    // API returns { "resources": [...] } not [...], so .[].name fails silently.
    // Must use .[][].name to unwrap the outer object then iterate the array.
    // Also must not use string match regex which matches nested name fields.
    const resourceNamesFn = fishFile.match(/function __mcpctl_resource_names[\s\S]*?^end/m)?.[0] ?? '';
    const projectNamesFn = fishFile.match(/function __mcpctl_project_names[\s\S]*?^end/m)?.[0] ?? '';

    expect(resourceNamesFn, '__mcpctl_resource_names must use jq .[][].name').toContain("jq -r '.[][].name'");
    expect(resourceNamesFn, '__mcpctl_resource_names must not use string match on name').not.toMatch(/string match.*"name"/);

    expect(projectNamesFn, '__mcpctl_project_names must use jq .[][].name').toContain("jq -r '.[][].name'");
    expect(projectNamesFn, '__mcpctl_project_names must not use string match on name').not.toMatch(/string match.*"name"/);
  });
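The `.[][].name` requirement in the test above is easier to see with the two shapes side by side. The following TypeScript snippet emulates jq's `.[]` semantics (iterate an object's values, or an array's items) on the wrapped payload shape the test's comment describes; the `resources` key and example names are illustrative:

```typescript
// Wrapped payload as described in the test: { "resources": [...] }, not a bare array.
const payload: Record<string, { name: string }[]> = {
  resources: [{ name: 'fetch' }, { name: 'git' }],
};

// jq `.[]` iterates object values; applied twice it first unwraps the outer
// object (yielding the array) and then the array (yielding each item), so
// `.[][].name` produces every item's name:
const names = Object.values(payload).flatMap((items) => items.map((i) => i.name));

// jq `.[].name` instead takes `.name` of each object *value* directly. Here
// that value is an array, which has no `name`, so the filter yields nothing:
const broken = Object.values(payload).map((v) => (v as unknown as { name?: string }).name);
```

This is why the completions must use `jq -r '.[][].name'`: the single-bracket form fails silently on the wrapped response.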

  it('instances use server.name instead of name', () => {
    const resourceNamesFn = fishFile.match(/function __mcpctl_resource_names[\s\S]*?^end/m)?.[0] ?? '';
    expect(resourceNamesFn, 'must handle instances via server.name').toContain('.server.name');
  });

  it('attach-server completes with available (unattached) servers and guards against repeat', () => {
    const attachLine = fishFile.split('\n').find((l) =>
      l.startsWith('complete') && l.includes('__fish_seen_subcommand_from attach-server'));
    expect(attachLine, 'attach-server argument completion must exist').toBeDefined();
    expect(attachLine, 'attach-server must use __mcpctl_available_servers').toContain('__mcpctl_available_servers');
    expect(attachLine, 'attach-server must guard with __mcpctl_needs_server_arg').toContain('__mcpctl_needs_server_arg');
  });

  it('detach-server completes with project servers and guards against repeat', () => {
    const detachLine = fishFile.split('\n').find((l) =>
      l.startsWith('complete') && l.includes('__fish_seen_subcommand_from detach-server'));
    expect(detachLine, 'detach-server argument completion must exist').toBeDefined();
    expect(detachLine, 'detach-server must use __mcpctl_project_servers').toContain('__mcpctl_project_servers');
    expect(detachLine, 'detach-server must guard with __mcpctl_needs_server_arg').toContain('__mcpctl_needs_server_arg');
  });

  it('non-project commands do not show with --project', () => {
    const nonProjectCmds = ['status', 'login', 'logout', 'config', 'apply', 'backup', 'restore'];
    const lines = fishFile.split('\n').filter((l) => l.startsWith('complete') && l.includes('-a '));

    for (const cmd of nonProjectCmds) {
      const cmdLines = lines.filter((l) => {
        const aMatch = l.match(/-a\s+(\S+)/);
        return aMatch && aMatch[1].replace(/['"]/g, '') === cmd;
      });
      for (const line of cmdLines) {
        expect(line, `${cmd} should require 'not __mcpctl_has_project'`).toContain('not __mcpctl_has_project');
      }
    }
  });
});

describe('bash completions', () => {
  it('separates project commands from regular commands', () => {
    expect(bashFile).toContain('project_commands=');
    expect(bashFile).toContain('attach-server detach-server');
  });

  it('checks has_project before offering project commands', () => {
    expect(bashFile).toContain('if $has_project');
    expect(bashFile).toContain('$project_commands');
  });

  it('fetches resource names dynamically after resource type', () => {
    expect(bashFile).toContain('_mcpctl_resource_names');
    // get/describe/delete should use resource_names when resource_type is set
    expect(bashFile).toMatch(/get\|describe\|delete\)[\s\S]*?_mcpctl_resource_names/);
  });

  it('attach-server filters out already-attached servers and guards against repeat', () => {
    const attachBlock = bashFile.match(/attach-server\)[\s\S]*?return ;;/)?.[0] ?? '';
    expect(attachBlock, 'attach-server must use _mcpctl_get_project_value').toContain('_mcpctl_get_project_value');
    expect(attachBlock, 'attach-server must query project servers to exclude').toContain('--project');
    expect(attachBlock, 'attach-server must check position to prevent repeat').toContain('cword - subcmd_pos');
  });

  it('detach-server shows only project servers and guards against repeat', () => {
    const detachBlock = bashFile.match(/detach-server\)[\s\S]*?return ;;/)?.[0] ?? '';
    expect(detachBlock, 'detach-server must use _mcpctl_get_project_value').toContain('_mcpctl_get_project_value');
    expect(detachBlock, 'detach-server must query project servers').toContain('--project');
    expect(detachBlock, 'detach-server must check position to prevent repeat').toContain('cword - subcmd_pos');
  });
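The "guards against repeat" assertions above look for a `cword - subcmd_pos` check in the bash completion script. The idea, sketched here in TypeScript (the variable names follow the test's expectations; the exact bash arithmetic is an assumption), is to offer a server name only when the cursor sits in the first argument slot after the subcommand:

```typescript
// Only complete a server name when the current word is exactly one position
// past the subcommand, i.e. cword - subcmdPos === 1. Once an argument has been
// typed, the cursor is two or more positions past, and nothing is offered.
function shouldOfferServerName(cword: number, subcmdPos: number): boolean {
  return cword - subcmdPos === 1;
}

// mcpctl --project demo attach-server <TAB>        -> offer server names
// mcpctl --project demo attach-server fetch <TAB>  -> offer nothing (no repeat)
```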

  it('instances use server.name instead of name', () => {
    const fnMatch = bashFile.match(/_mcpctl_resource_names\(\)[\s\S]*?\n\s*\}/)?.[0] ?? '';
    expect(fnMatch, 'must handle instances via .server.name').toContain('.server.name');
  });

  it('defines --project option', () => {
    expect(bashFile).toContain('--project');
  });

  it('resource name function uses jq .[][].name to unwrap wrapped JSON and avoid nested matches', () => {
    const fnMatch = bashFile.match(/_mcpctl_resource_names\(\)[\s\S]*?\n\s*\}/)?.[0] ?? '';
    expect(fnMatch, '_mcpctl_resource_names must use jq .[][].name').toContain("jq -r '.[][].name'");
    expect(fnMatch, '_mcpctl_resource_names must not use grep on name').not.toMatch(/grep.*"name"/);
    // Guard against .[].name (single bracket) which fails on wrapped JSON
    expect(fnMatch, '_mcpctl_resource_names must not use .[].name (needs .[][].name)').not.toMatch(/jq.*'\.\[\]\.name'/);
  });
});
@@ -16,52 +16,67 @@ describe('CLI command registration (e2e)', () => {
    expect(commandNames).toContain('logout');
    expect(commandNames).toContain('get');
    expect(commandNames).toContain('describe');
-    expect(commandNames).toContain('instance');
    expect(commandNames).toContain('delete');
    expect(commandNames).toContain('logs');
    expect(commandNames).toContain('apply');
    expect(commandNames).toContain('setup');
-    expect(commandNames).toContain('claude');
-    expect(commandNames).toContain('project');
+    expect(commandNames).toContain('create');
+    expect(commandNames).toContain('edit');
+    expect(commandNames).toContain('backup');
+    expect(commandNames).toContain('restore');
  });

-  it('instance command has lifecycle subcommands', () => {
+  it('old project and claude top-level commands are removed', () => {
    const program = createProgram();
-    const instance = program.commands.find((c) => c.name() === 'instance');
-    expect(instance).toBeDefined();
-
-    const subcommands = instance!.commands.map((c) => c.name());
-    expect(subcommands).toContain('list');
-    expect(subcommands).toContain('start');
-    expect(subcommands).toContain('stop');
-    expect(subcommands).toContain('restart');
-    expect(subcommands).toContain('remove');
-    expect(subcommands).toContain('logs');
-    expect(subcommands).toContain('inspect');
+    const commandNames = program.commands.map((c) => c.name());
+    expect(commandNames).not.toContain('claude');
+    expect(commandNames).not.toContain('project');
+    expect(commandNames).not.toContain('instance');
  });

-  it('claude command has config management subcommands', () => {
+  it('config command has claude-generate and impersonate subcommands', () => {
    const program = createProgram();
-    const claude = program.commands.find((c) => c.name() === 'claude');
-    expect(claude).toBeDefined();
+    const config = program.commands.find((c) => c.name() === 'config');
+    expect(config).toBeDefined();

-    const subcommands = claude!.commands.map((c) => c.name());
-    expect(subcommands).toContain('generate');
-    expect(subcommands).toContain('show');
-    expect(subcommands).toContain('add');
-    expect(subcommands).toContain('remove');
+    const subcommands = config!.commands.map((c) => c.name());
+    expect(subcommands).toContain('claude-generate');
+    expect(subcommands).toContain('impersonate');
+    expect(subcommands).toContain('view');
+    expect(subcommands).toContain('set');
+    expect(subcommands).toContain('path');
+    expect(subcommands).toContain('reset');
  });

-  it('project command has CRUD subcommands', () => {
+  it('create command has user, group, rbac, prompt, promptrequest subcommands', () => {
    const program = createProgram();
-    const project = program.commands.find((c) => c.name() === 'project');
-    expect(project).toBeDefined();
+    const create = program.commands.find((c) => c.name() === 'create');
+    expect(create).toBeDefined();

-    const subcommands = project!.commands.map((c) => c.name());
-    expect(subcommands).toContain('list');
-    expect(subcommands).toContain('create');
-    expect(subcommands).toContain('delete');
-    expect(subcommands).toContain('show');
-    expect(subcommands).toContain('profiles');
-    expect(subcommands).toContain('set-profiles');
+    const subcommands = create!.commands.map((c) => c.name());
+    expect(subcommands).toContain('server');
+    expect(subcommands).toContain('secret');
+    expect(subcommands).toContain('project');
+    expect(subcommands).toContain('user');
+    expect(subcommands).toContain('group');
+    expect(subcommands).toContain('rbac');
+    expect(subcommands).toContain('prompt');
+    expect(subcommands).toContain('promptrequest');
  });

  it('get command accepts --project option', () => {
    const program = createProgram();
    const get = program.commands.find((c) => c.name() === 'get');
    expect(get).toBeDefined();

    const projectOpt = get!.options.find((o) => o.long === '--project');
    expect(projectOpt).toBeDefined();
    expect(projectOpt!.description).toContain('project');
  });

  it('program-level --project option is defined', () => {
    const program = createProgram();
    const projectOpt = program.options.find((o) => o.long === '--project');
    expect(projectOpt).toBeDefined();
  });

  it('displays version', async () => {

@@ -0,0 +1,204 @@
-- CreateEnum
CREATE TYPE "Role" AS ENUM ('USER', 'ADMIN');

-- CreateEnum
CREATE TYPE "Transport" AS ENUM ('STDIO', 'SSE', 'STREAMABLE_HTTP');

-- CreateEnum
CREATE TYPE "InstanceStatus" AS ENUM ('STARTING', 'RUNNING', 'STOPPING', 'STOPPED', 'ERROR');

-- CreateTable
CREATE TABLE "User" (
    "id" TEXT NOT NULL,
    "email" TEXT NOT NULL,
    "name" TEXT,
    "passwordHash" TEXT NOT NULL,
    "role" "Role" NOT NULL DEFAULT 'USER',
    "version" INTEGER NOT NULL DEFAULT 1,
    "createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
    "updatedAt" TIMESTAMP(3) NOT NULL,

    CONSTRAINT "User_pkey" PRIMARY KEY ("id")
);

-- CreateTable
CREATE TABLE "Session" (
    "id" TEXT NOT NULL,
    "token" TEXT NOT NULL,
    "userId" TEXT NOT NULL,
    "expiresAt" TIMESTAMP(3) NOT NULL,
    "createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,

    CONSTRAINT "Session_pkey" PRIMARY KEY ("id")
);

-- CreateTable
CREATE TABLE "McpServer" (
    "id" TEXT NOT NULL,
    "name" TEXT NOT NULL,
    "description" TEXT NOT NULL DEFAULT '',
    "packageName" TEXT,
    "dockerImage" TEXT,
    "transport" "Transport" NOT NULL DEFAULT 'STDIO',
    "repositoryUrl" TEXT,
    "externalUrl" TEXT,
    "command" JSONB,
    "containerPort" INTEGER,
    "envTemplate" JSONB NOT NULL DEFAULT '[]',
    "version" INTEGER NOT NULL DEFAULT 1,
    "createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
    "updatedAt" TIMESTAMP(3) NOT NULL,

    CONSTRAINT "McpServer_pkey" PRIMARY KEY ("id")
);

-- CreateTable
CREATE TABLE "McpProfile" (
    "id" TEXT NOT NULL,
    "name" TEXT NOT NULL,
    "serverId" TEXT NOT NULL,
    "permissions" JSONB NOT NULL DEFAULT '[]',
    "envOverrides" JSONB NOT NULL DEFAULT '{}',
    "version" INTEGER NOT NULL DEFAULT 1,
    "createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
    "updatedAt" TIMESTAMP(3) NOT NULL,

    CONSTRAINT "McpProfile_pkey" PRIMARY KEY ("id")
);

-- CreateTable
CREATE TABLE "Project" (
    "id" TEXT NOT NULL,
    "name" TEXT NOT NULL,
    "description" TEXT NOT NULL DEFAULT '',
    "ownerId" TEXT NOT NULL,
    "version" INTEGER NOT NULL DEFAULT 1,
    "createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
    "updatedAt" TIMESTAMP(3) NOT NULL,

    CONSTRAINT "Project_pkey" PRIMARY KEY ("id")
);

-- CreateTable
CREATE TABLE "ProjectMcpProfile" (
    "id" TEXT NOT NULL,
    "projectId" TEXT NOT NULL,
    "profileId" TEXT NOT NULL,

    CONSTRAINT "ProjectMcpProfile_pkey" PRIMARY KEY ("id")
);

-- CreateTable
CREATE TABLE "McpInstance" (
    "id" TEXT NOT NULL,
    "serverId" TEXT NOT NULL,
    "containerId" TEXT,
    "status" "InstanceStatus" NOT NULL DEFAULT 'STOPPED',
    "port" INTEGER,
    "metadata" JSONB NOT NULL DEFAULT '{}',
    "version" INTEGER NOT NULL DEFAULT 1,
    "createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
    "updatedAt" TIMESTAMP(3) NOT NULL,

    CONSTRAINT "McpInstance_pkey" PRIMARY KEY ("id")
);

-- CreateTable
CREATE TABLE "AuditLog" (
    "id" TEXT NOT NULL,
    "userId" TEXT NOT NULL,
    "action" TEXT NOT NULL,
    "resource" TEXT NOT NULL,
    "resourceId" TEXT,
    "details" JSONB NOT NULL DEFAULT '{}',
    "createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,

    CONSTRAINT "AuditLog_pkey" PRIMARY KEY ("id")
);

-- CreateIndex
CREATE UNIQUE INDEX "User_email_key" ON "User"("email");

-- CreateIndex
CREATE INDEX "User_email_idx" ON "User"("email");

-- CreateIndex
CREATE UNIQUE INDEX "Session_token_key" ON "Session"("token");

-- CreateIndex
CREATE INDEX "Session_token_idx" ON "Session"("token");

-- CreateIndex
CREATE INDEX "Session_userId_idx" ON "Session"("userId");

-- CreateIndex
CREATE INDEX "Session_expiresAt_idx" ON "Session"("expiresAt");

-- CreateIndex
CREATE UNIQUE INDEX "McpServer_name_key" ON "McpServer"("name");

-- CreateIndex
CREATE INDEX "McpServer_name_idx" ON "McpServer"("name");

-- CreateIndex
CREATE INDEX "McpProfile_serverId_idx" ON "McpProfile"("serverId");

-- CreateIndex
CREATE UNIQUE INDEX "McpProfile_name_serverId_key" ON "McpProfile"("name", "serverId");

-- CreateIndex
CREATE UNIQUE INDEX "Project_name_key" ON "Project"("name");

-- CreateIndex
CREATE INDEX "Project_name_idx" ON "Project"("name");

-- CreateIndex
CREATE INDEX "Project_ownerId_idx" ON "Project"("ownerId");

-- CreateIndex
CREATE INDEX "ProjectMcpProfile_projectId_idx" ON "ProjectMcpProfile"("projectId");

-- CreateIndex
CREATE INDEX "ProjectMcpProfile_profileId_idx" ON "ProjectMcpProfile"("profileId");

-- CreateIndex
CREATE UNIQUE INDEX "ProjectMcpProfile_projectId_profileId_key" ON "ProjectMcpProfile"("projectId", "profileId");

-- CreateIndex
CREATE INDEX "McpInstance_serverId_idx" ON "McpInstance"("serverId");

-- CreateIndex
CREATE INDEX "McpInstance_status_idx" ON "McpInstance"("status");

-- CreateIndex
CREATE INDEX "AuditLog_userId_idx" ON "AuditLog"("userId");

-- CreateIndex
CREATE INDEX "AuditLog_action_idx" ON "AuditLog"("action");

-- CreateIndex
CREATE INDEX "AuditLog_resource_idx" ON "AuditLog"("resource");

-- CreateIndex
CREATE INDEX "AuditLog_createdAt_idx" ON "AuditLog"("createdAt");

-- AddForeignKey
ALTER TABLE "Session" ADD CONSTRAINT "Session_userId_fkey" FOREIGN KEY ("userId") REFERENCES "User"("id") ON DELETE CASCADE ON UPDATE CASCADE;

-- AddForeignKey
ALTER TABLE "McpProfile" ADD CONSTRAINT "McpProfile_serverId_fkey" FOREIGN KEY ("serverId") REFERENCES "McpServer"("id") ON DELETE CASCADE ON UPDATE CASCADE;

-- AddForeignKey
ALTER TABLE "Project" ADD CONSTRAINT "Project_ownerId_fkey" FOREIGN KEY ("ownerId") REFERENCES "User"("id") ON DELETE CASCADE ON UPDATE CASCADE;

-- AddForeignKey
ALTER TABLE "ProjectMcpProfile" ADD CONSTRAINT "ProjectMcpProfile_projectId_fkey" FOREIGN KEY ("projectId") REFERENCES "Project"("id") ON DELETE CASCADE ON UPDATE CASCADE;

-- AddForeignKey
ALTER TABLE "ProjectMcpProfile" ADD CONSTRAINT "ProjectMcpProfile_profileId_fkey" FOREIGN KEY ("profileId") REFERENCES "McpProfile"("id") ON DELETE CASCADE ON UPDATE CASCADE;

-- AddForeignKey
ALTER TABLE "McpInstance" ADD CONSTRAINT "McpInstance_serverId_fkey" FOREIGN KEY ("serverId") REFERENCES "McpServer"("id") ON DELETE CASCADE ON UPDATE CASCADE;

-- AddForeignKey
ALTER TABLE "AuditLog" ADD CONSTRAINT "AuditLog_userId_fkey" FOREIGN KEY ("userId") REFERENCES "User"("id") ON DELETE CASCADE ON UPDATE CASCADE;
@@ -0,0 +1,8 @@
-- DropForeignKey
ALTER TABLE "ProjectMember" DROP CONSTRAINT IF EXISTS "ProjectMember_projectId_fkey";

-- DropForeignKey
ALTER TABLE "ProjectMember" DROP CONSTRAINT IF EXISTS "ProjectMember_userId_fkey";

-- DropTable
DROP TABLE IF EXISTS "ProjectMember";
@@ -0,0 +1,11 @@
-- AlterTable: Add gated flag to Project
ALTER TABLE "Project" ADD COLUMN "gated" BOOLEAN NOT NULL DEFAULT true;

-- AlterTable: Add priority, summary, chapters, linkTarget to Prompt
ALTER TABLE "Prompt" ADD COLUMN "priority" INTEGER NOT NULL DEFAULT 5;
ALTER TABLE "Prompt" ADD COLUMN "summary" TEXT;
ALTER TABLE "Prompt" ADD COLUMN "chapters" JSONB;
ALTER TABLE "Prompt" ADD COLUMN "linkTarget" TEXT;

-- AlterTable: Add priority to PromptRequest
ALTER TABLE "PromptRequest" ADD COLUMN "priority" INTEGER NOT NULL DEFAULT 5;
src/db/prisma/migrations/migration_lock.toml (new file, 3 lines)
@@ -0,0 +1,3 @@
# Please do not edit this file manually
# It should be added in your version-control system (e.g., Git)
provider = "postgresql"
@@ -15,13 +15,16 @@ model User {
name String?
passwordHash String
role Role @default(USER)
provider String?
externalId String?
version Int @default(1)
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt

sessions Session[]
auditLogs AuditLog[]
projects Project[]
sessions Session[]
auditLogs AuditLog[]
ownedProjects Project[]
groupMemberships GroupMember[]

@@index([email])
}
@@ -57,13 +60,21 @@ model McpServer {
dockerImage String?
transport Transport @default(STDIO)
repositoryUrl String?
envTemplate Json @default("[]")
externalUrl String?
command Json?
containerPort Int?
replicas Int @default(1)
env Json @default("[]")
healthCheck Json?
version Int @default(1)
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt

profiles McpProfile[]
templateName String?
templateVersion String?

instances McpInstance[]
projects ProjectServer[]

@@index([name])
}
@@ -74,23 +85,83 @@ enum Transport {
STREAMABLE_HTTP
}

// ── MCP Profiles ──
// ── MCP Templates ──

model McpProfile {
model McpTemplate {
id String @id @default(cuid())
name String @unique
version String @default("1.0.0")
description String @default("")
packageName String?
dockerImage String?
transport Transport @default(STDIO)
repositoryUrl String?
externalUrl String?
command Json?
containerPort Int?
replicas Int @default(1)
env Json @default("[]")
healthCheck Json?
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt

@@index([name])
}

// ── Secrets ──

model Secret {
id String @id @default(cuid())
name String @unique
data Json @default("{}")
version Int @default(1)
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt

@@index([name])
}

// ── Groups ──

model Group {
id String @id @default(cuid())
name String
serverId String
permissions Json @default("[]")
envOverrides Json @default("{}")
name String @unique
description String @default("")
version Int @default(1)
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt

server McpServer @relation(fields: [serverId], references: [id], onDelete: Cascade)
projects ProjectMcpProfile[]
members GroupMember[]

@@unique([name, serverId])
@@index([serverId])
@@index([name])
}

model GroupMember {
id String @id @default(cuid())
groupId String
userId String
createdAt DateTime @default(now())

group Group @relation(fields: [groupId], references: [id], onDelete: Cascade)
user User @relation(fields: [userId], references: [id], onDelete: Cascade)

@@unique([groupId, userId])
@@index([groupId])
@@index([userId])
}

// ── RBAC Definitions ──

model RbacDefinition {
id String @id @default(cuid())
name String @unique
subjects Json @default("[]")
roleBindings Json @default("[]")
version Int @default(1)
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt

@@index([name])
}

// ── Projects ──
@@ -99,31 +170,35 @@ model Project {
id String @id @default(cuid())
name String @unique
description String @default("")
prompt String @default("")
proxyMode String @default("direct")
gated Boolean @default(true)
llmProvider String?
llmModel String?
ownerId String
version Int @default(1)
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt

owner User @relation(fields: [ownerId], references: [id], onDelete: Cascade)
profiles ProjectMcpProfile[]
owner User @relation(fields: [ownerId], references: [id], onDelete: Cascade)
servers ProjectServer[]
prompts Prompt[]
promptRequests PromptRequest[]

@@index([name])
@@index([ownerId])
}

// ── Project <-> Profile join table ──

model ProjectMcpProfile {
id String @id @default(cuid())
model ProjectServer {
id String @id @default(cuid())
projectId String
profileId String
serverId String
createdAt DateTime @default(now())

project Project @relation(fields: [projectId], references: [id], onDelete: Cascade)
profile McpProfile @relation(fields: [profileId], references: [id], onDelete: Cascade)
project Project @relation(fields: [projectId], references: [id], onDelete: Cascade)
server McpServer @relation(fields: [serverId], references: [id], onDelete: Cascade)

@@unique([projectId, profileId])
@@index([projectId])
@@index([profileId])
@@unique([projectId, serverId])
}

// ── MCP Instances (running containers) ──
@@ -134,10 +209,13 @@ model McpInstance {
containerId String?
status InstanceStatus @default(STOPPED)
port Int?
metadata Json @default("{}")
version Int @default(1)
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
metadata Json @default("{}")
healthStatus String?
lastHealthCheck DateTime?
events Json @default("[]")
version Int @default(1)
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt

server McpServer @relation(fields: [serverId], references: [id], onDelete: Cascade)

@@ -153,6 +231,46 @@ enum InstanceStatus {
ERROR
}

// ── Prompts (approved content resources) ──

model Prompt {
id String @id @default(cuid())
name String
content String @db.Text
projectId String?
priority Int @default(5)
summary String? @db.Text
chapters Json?
linkTarget String?
version Int @default(1)
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt

project Project? @relation(fields: [projectId], references: [id], onDelete: Cascade)

@@unique([name, projectId])
@@index([projectId])
}

// ── Prompt Requests (pending proposals from LLM sessions) ──

model PromptRequest {
id String @id @default(cuid())
name String
content String @db.Text
projectId String?
priority Int @default(5)
createdBySession String?
createdByUserId String?
createdAt DateTime @default(now())

project Project? @relation(fields: [projectId], references: [id], onDelete: Cascade)

@@unique([name, projectId])
@@index([projectId])
@@index([createdBySession])
}

// ── Audit Logs ──

model AuditLog {
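The `Prompt`/`PromptRequest` pair suggests a propose-then-approve flow: sessions file a `PromptRequest`, which on approval becomes an approved `Prompt`. A sketch of that promotion as a pure function — the helper name and the choice to start the optional enrichment fields as `null` are assumptions, not taken from this diff:

```typescript
// Hypothetical promotion step: an approved PromptRequest becomes Prompt data.
// Field names mirror the schema above; the function itself is illustrative.
interface PromptRequestData {
  name: string;
  content: string;
  projectId: string | null;
  priority: number;
}

interface PromptData extends PromptRequestData {
  summary: string | null;
  chapters: unknown | null;
  linkTarget: string | null;
}

function approveRequest(req: PromptRequestData): PromptData {
  return {
    ...req, // carry name, content, projectId, and the new priority field
    // Optional enrichment fields start empty; the schema allows null here.
    summary: null,
    chapters: null,
    linkTarget: null,
  };
}

const prompt = approveRequest({
  name: 'deploy-checklist',
  content: 'Steps: ...',
  projectId: 'proj_1',
  priority: 2,
});
```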
@@ -4,9 +4,9 @@ export type {
User,
Session,
McpServer,
McpProfile,
McpTemplate,
Secret,
Project,
ProjectMcpProfile,
McpInstance,
AuditLog,
Role,
@@ -14,5 +14,5 @@ export type {
InstanceStatus,
} from '@prisma/client';

export { seedMcpServers, defaultServers } from './seed/index.js';
export type { SeedServer } from './seed/index.js';
export { seedTemplates } from './seed/index.js';
export type { SeedTemplate, TemplateEnvEntry, HealthCheckSpec } from './seed/index.js';
@@ -1,131 +1,77 @@
import { PrismaClient } from '@prisma/client';
import { PrismaClient, Prisma } from '@prisma/client';

export interface SeedServer {
export interface TemplateEnvEntry {
name: string;
description: string;
packageName: string;
transport: 'STDIO' | 'SSE' | 'STREAMABLE_HTTP';
repositoryUrl: string;
envTemplate: Array<{
name: string;
description: string;
isSecret: boolean;
setupUrl?: string;
}>;
description?: string;
required?: boolean;
defaultValue?: string;
}

export const defaultServers: SeedServer[] = [
{
name: 'slack',
description: 'Slack MCP server for reading channels, messages, and user info',
packageName: '@anthropic/slack-mcp',
transport: 'STDIO',
repositoryUrl: 'https://github.com/modelcontextprotocol/servers/tree/main/src/slack',
envTemplate: [
{
name: 'SLACK_BOT_TOKEN',
description: 'Slack Bot User OAuth Token (xoxb-...)',
isSecret: true,
setupUrl: 'https://api.slack.com/apps',
},
{
name: 'SLACK_TEAM_ID',
description: 'Slack Workspace Team ID',
isSecret: false,
},
],
},
{
name: 'jira',
description: 'Jira MCP server for issues, projects, and boards',
packageName: '@anthropic/jira-mcp',
transport: 'STDIO',
repositoryUrl: 'https://github.com/modelcontextprotocol/servers/tree/main/src/jira',
envTemplate: [
{
name: 'JIRA_URL',
description: 'Jira instance URL (e.g., https://company.atlassian.net)',
isSecret: false,
},
{
name: 'JIRA_EMAIL',
description: 'Jira account email',
isSecret: false,
},
{
name: 'JIRA_API_TOKEN',
description: 'Jira API token',
isSecret: true,
setupUrl: 'https://id.atlassian.com/manage-profile/security/api-tokens',
},
],
},
{
name: 'github',
description: 'GitHub MCP server for repos, issues, PRs, and code search',
packageName: '@anthropic/github-mcp',
transport: 'STDIO',
repositoryUrl: 'https://github.com/modelcontextprotocol/servers/tree/main/src/github',
envTemplate: [
{
name: 'GITHUB_TOKEN',
description: 'GitHub Personal Access Token',
isSecret: true,
setupUrl: 'https://github.com/settings/tokens',
},
],
},
{
name: 'terraform',
description: 'Terraform MCP server for infrastructure documentation and state',
packageName: '@anthropic/terraform-mcp',
transport: 'STDIO',
repositoryUrl: 'https://github.com/modelcontextprotocol/servers/tree/main/src/terraform',
envTemplate: [],
},
];
export interface HealthCheckSpec {
tool: string;
arguments?: Record<string, unknown>;
intervalSeconds?: number;
timeoutSeconds?: number;
failureThreshold?: number;
}

export async function seedMcpServers(
export interface SeedTemplate {
name: string;
version: string;
description: string;
packageName?: string;
dockerImage?: string;
transport: 'STDIO' | 'SSE' | 'STREAMABLE_HTTP';
repositoryUrl?: string;
externalUrl?: string;
command?: string[];
containerPort?: number;
replicas?: number;
env?: TemplateEnvEntry[];
healthCheck?: HealthCheckSpec;
}

export async function seedTemplates(
prisma: PrismaClient,
servers: SeedServer[] = defaultServers,
templates: SeedTemplate[],
): Promise<number> {
let created = 0;
let upserted = 0;

for (const server of servers) {
await prisma.mcpServer.upsert({
where: { name: server.name },
for (const tpl of templates) {
await prisma.mcpTemplate.upsert({
where: { name: tpl.name },
update: {
description: server.description,
packageName: server.packageName,
transport: server.transport,
repositoryUrl: server.repositoryUrl,
envTemplate: server.envTemplate,
version: tpl.version,
description: tpl.description,
packageName: tpl.packageName ?? null,
dockerImage: tpl.dockerImage ?? null,
transport: tpl.transport,
repositoryUrl: tpl.repositoryUrl ?? null,
externalUrl: tpl.externalUrl ?? null,
command: (tpl.command ?? Prisma.JsonNull) as Prisma.InputJsonValue,
containerPort: tpl.containerPort ?? null,
replicas: tpl.replicas ?? 1,
env: (tpl.env ?? []) as unknown as Prisma.InputJsonValue,
healthCheck: (tpl.healthCheck ?? Prisma.JsonNull) as unknown as Prisma.InputJsonValue,
},
create: {
name: server.name,
description: server.description,
packageName: server.packageName,
transport: server.transport,
repositoryUrl: server.repositoryUrl,
envTemplate: server.envTemplate,
name: tpl.name,
version: tpl.version,
description: tpl.description,
packageName: tpl.packageName ?? null,
dockerImage: tpl.dockerImage ?? null,
transport: tpl.transport,
repositoryUrl: tpl.repositoryUrl ?? null,
externalUrl: tpl.externalUrl ?? null,
command: (tpl.command ?? Prisma.JsonNull) as Prisma.InputJsonValue,
containerPort: tpl.containerPort ?? null,
replicas: tpl.replicas ?? 1,
env: (tpl.env ?? []) as unknown as Prisma.InputJsonValue,
healthCheck: (tpl.healthCheck ?? Prisma.JsonNull) as unknown as Prisma.InputJsonValue,
},
});
created++;
upserted++;
}

return created;
}

// CLI entry point
if (import.meta.url === `file://${process.argv[1]}`) {
const prisma = new PrismaClient();
seedMcpServers(prisma)
.then((count) => {
console.log(`Seeded ${count} MCP servers`);
return prisma.$disconnect();
})
.catch((e) => {
console.error(e);
return prisma.$disconnect().then(() => process.exit(1));
});
return upserted;
}
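`TemplateEnvEntry` replaces the old `isSecret`/`setupUrl` shape with `required` and `defaultValue`. A sketch of how a caller might resolve a template's env entries against user-supplied values — the resolver function and its failure-reporting behavior are assumptions, not code from this branch:

```typescript
interface TemplateEnvEntry {
  name: string;
  description?: string;
  required?: boolean;
  defaultValue?: string;
}

// Hypothetical resolver: user-provided values win, then template defaults;
// required entries with no value are reported rather than silently dropped.
function resolveEnv(
  entries: TemplateEnvEntry[],
  provided: Record<string, string>,
): { env: Record<string, string>; missing: string[] } {
  const env: Record<string, string> = {};
  const missing: string[] = [];
  for (const entry of entries) {
    const value = provided[entry.name] ?? entry.defaultValue;
    if (value !== undefined) {
      env[entry.name] = value;
    } else if (entry.required) {
      missing.push(entry.name);
    }
  }
  return { env, missing };
}

const result = resolveEnv(
  [
    { name: 'GITHUB_TOKEN', required: true },
    { name: 'LOG_LEVEL', defaultValue: 'info' },
  ],
  {},
);
```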
@@ -48,11 +48,16 @@ export async function cleanupTestDb(): Promise<void> {
export async function clearAllTables(client: PrismaClient): Promise<void> {
// Delete in order respecting foreign keys
await client.auditLog.deleteMany();
await client.projectMcpProfile.deleteMany();
await client.mcpInstance.deleteMany();
await client.mcpProfile.deleteMany();
await client.projectServer.deleteMany();
await client.projectMember.deleteMany();
await client.secret.deleteMany();
await client.session.deleteMany();
await client.project.deleteMany();
await client.mcpServer.deleteMany();
await client.mcpTemplate.deleteMany();
await client.groupMember.deleteMany();
await client.group.deleteMany();
await client.rbacDefinition.deleteMany();
await client.user.deleteMany();
}
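`clearAllTables` relies on a hand-maintained child-before-parent delete order. A sketch of checking such an order against a foreign-key map — the map below is a simplified, hypothetical subset, not derived from the actual schema:

```typescript
// Hypothetical FK map: each child table lists the parent tables it references.
const references: Record<string, string[]> = {
  auditLog: ['user'],
  session: ['user'],
  groupMember: ['group', 'user'],
  projectServer: ['project', 'mcpServer'],
  mcpInstance: ['mcpServer'],
  project: ['user'],
};

// An order is FK-safe if every child is deleted before any parent it references.
function isDeleteOrderSafe(order: string[]): boolean {
  const position = new Map(order.map((table, i) => [table, i]));
  return Object.entries(references).every(([child, parents]) =>
    parents.every(
      (parent) =>
        !position.has(child) ||
        !position.has(parent) ||
        position.get(child)! < position.get(parent)!,
    ),
  );
}

const ok = isDeleteOrderSafe([
  'auditLog', 'mcpInstance', 'projectServer', 'session',
  'groupMember', 'project', 'mcpServer', 'group', 'user',
]);
```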
@@ -23,11 +23,35 @@ async function createUser(overrides: { email?: string; name?: string; role?: 'US
data: {
email: overrides.email ?? `test-${Date.now()}@example.com`,
name: overrides.name ?? 'Test User',
passwordHash: '$2b$10$test-hash-placeholder',
role: overrides.role ?? 'USER',
},
});
}

async function createGroup(overrides: { name?: string; description?: string } = {}) {
return prisma.group.create({
data: {
name: overrides.name ?? `group-${Date.now()}`,
description: overrides.description ?? 'Test group',
},
});
}

async function createProject(overrides: { name?: string; ownerId?: string } = {}) {
let ownerId = overrides.ownerId;
if (!ownerId) {
const user = await createUser();
ownerId = user.id;
}
return prisma.project.create({
data: {
name: overrides.name ?? `project-${Date.now()}`,
ownerId,
},
});
}

async function createServer(overrides: { name?: string; transport?: 'STDIO' | 'SSE' | 'STREAMABLE_HTTP' } = {}) {
return prisma.mcpServer.create({
data: {
@@ -123,7 +147,7 @@ describe('McpServer', () => {
const server = await createServer();
expect(server.transport).toBe('STDIO');
expect(server.version).toBe(1);
expect(server.envTemplate).toEqual([]);
expect(server.env).toEqual([]);
});

it('enforces unique name', async () => {
@@ -131,18 +155,18 @@ describe('McpServer', () => {
await expect(createServer({ name: 'slack' })).rejects.toThrow();
});

it('stores envTemplate as JSON', async () => {
it('stores env as JSON', async () => {
const server = await prisma.mcpServer.create({
data: {
name: 'with-env',
envTemplate: [
{ name: 'API_KEY', description: 'Key', isSecret: true },
env: [
{ name: 'API_KEY', value: 'test-key' },
],
},
});
const envTemplate = server.envTemplate as Array<{ name: string }>;
expect(envTemplate).toHaveLength(1);
expect(envTemplate[0].name).toBe('API_KEY');
const env = server.env as Array<{ name: string }>;
expect(env).toHaveLength(1);
expect(env[0].name).toBe('API_KEY');
});

it('supports SSE transport', async () => {
@@ -151,43 +175,46 @@ describe('McpServer', () => {
});
});

// ── McpProfile model ──
// ── Secret model ──

describe('McpProfile', () => {
it('creates a profile linked to server', async () => {
const server = await createServer();
const profile = await prisma.mcpProfile.create({
describe('Secret', () => {
it('creates a secret with defaults', async () => {
const secret = await prisma.secret.create({
data: { name: 'my-secret' },
});
expect(secret.name).toBe('my-secret');
expect(secret.data).toEqual({});
expect(secret.version).toBe(1);
});

it('stores key-value data as JSON', async () => {
const secret = await prisma.secret.create({
data: {
name: 'readonly',
serverId: server.id,
permissions: ['read'],
name: 'api-keys',
data: { API_KEY: 'test-key', API_SECRET: 'test-secret' },
},
});
expect(profile.name).toBe('readonly');
expect(profile.serverId).toBe(server.id);
const data = secret.data as Record<string, string>;
expect(data['API_KEY']).toBe('test-key');
expect(data['API_SECRET']).toBe('test-secret');
});

it('enforces unique name per server', async () => {
const server = await createServer();
const data = { name: 'default', serverId: server.id };
await prisma.mcpProfile.create({ data });
await expect(prisma.mcpProfile.create({ data })).rejects.toThrow();
it('enforces unique name', async () => {
await prisma.secret.create({ data: { name: 'dup-secret' } });
await expect(prisma.secret.create({ data: { name: 'dup-secret' } })).rejects.toThrow();
});

it('allows same profile name on different servers', async () => {
const server1 = await createServer({ name: 'server-1' });
const server2 = await createServer({ name: 'server-2' });
await prisma.mcpProfile.create({ data: { name: 'default', serverId: server1.id } });
const profile2 = await prisma.mcpProfile.create({ data: { name: 'default', serverId: server2.id } });
expect(profile2.name).toBe('default');
});

it('cascades delete when server is deleted', async () => {
const server = await createServer();
await prisma.mcpProfile.create({ data: { name: 'test', serverId: server.id } });
await prisma.mcpServer.delete({ where: { id: server.id } });
const profiles = await prisma.mcpProfile.findMany({ where: { serverId: server.id } });
expect(profiles).toHaveLength(0);
it('updates data', async () => {
const secret = await prisma.secret.create({
data: { name: 'updatable', data: { KEY: 'old' } },
});
const updated = await prisma.secret.update({
where: { id: secret.id },
data: { data: { KEY: 'new', EXTRA: 'added' } },
});
const data = updated.data as Record<string, string>;
expect(data['KEY']).toBe('new');
expect(data['EXTRA']).toBe('added');
});
});
@@ -220,62 +247,6 @@ describe('Project', () => {
});
});

// ── ProjectMcpProfile (join table) ──

describe('ProjectMcpProfile', () => {
it('links project to profile', async () => {
const user = await createUser();
const server = await createServer();
const profile = await prisma.mcpProfile.create({
data: { name: 'default', serverId: server.id },
});
const project = await prisma.project.create({
data: { name: 'test-project', ownerId: user.id },
});

const link = await prisma.projectMcpProfile.create({
data: { projectId: project.id, profileId: profile.id },
});
expect(link.projectId).toBe(project.id);
expect(link.profileId).toBe(profile.id);
});

it('enforces unique project+profile combination', async () => {
const user = await createUser();
const server = await createServer();
const profile = await prisma.mcpProfile.create({
data: { name: 'default', serverId: server.id },
});
const project = await prisma.project.create({
data: { name: 'test-project', ownerId: user.id },
});

const data = { projectId: project.id, profileId: profile.id };
await prisma.projectMcpProfile.create({ data });
await expect(prisma.projectMcpProfile.create({ data })).rejects.toThrow();
});

it('loads profiles through project include', async () => {
const user = await createUser();
const server = await createServer();
const profile = await prisma.mcpProfile.create({
data: { name: 'slack-ro', serverId: server.id },
});
const project = await prisma.project.create({
data: { name: 'reports', ownerId: user.id },
});
await prisma.projectMcpProfile.create({
data: { projectId: project.id, profileId: profile.id },
});

const loaded = await prisma.project.findUnique({
where: { id: project.id },
include: { profiles: { include: { profile: true } } },
});
expect(loaded!.profiles).toHaveLength(1);
expect(loaded!.profiles[0].profile.name).toBe('slack-ro');
});
});

// ── McpInstance model ──

@@ -362,3 +333,236 @@ describe('AuditLog', () => {
expect(logs).toHaveLength(0);
});
});

// ── User SSO fields ──

describe('User SSO fields', () => {
it('stores provider and externalId', async () => {
const user = await prisma.user.create({
data: {
email: 'sso@example.com',
passwordHash: 'hash',
provider: 'oidc',
externalId: 'ext-123',
},
});
expect(user.provider).toBe('oidc');
expect(user.externalId).toBe('ext-123');
});

it('defaults provider and externalId to null', async () => {
const user = await createUser();
expect(user.provider).toBeNull();
expect(user.externalId).toBeNull();
});
});

// ── Group model ──

describe('Group', () => {
it('creates a group with defaults', async () => {
const group = await createGroup();
expect(group.id).toBeDefined();
expect(group.version).toBe(1);
});

it('enforces unique name', async () => {
await createGroup({ name: 'devs' });
await expect(createGroup({ name: 'devs' })).rejects.toThrow();
});

it('creates group members', async () => {
const group = await createGroup();
const user = await createUser();
const member = await prisma.groupMember.create({
data: { groupId: group.id, userId: user.id },
});
expect(member.groupId).toBe(group.id);
expect(member.userId).toBe(user.id);
});

it('enforces unique group-user pair', async () => {
const group = await createGroup();
const user = await createUser();
await prisma.groupMember.create({ data: { groupId: group.id, userId: user.id } });
await expect(
prisma.groupMember.create({ data: { groupId: group.id, userId: user.id } }),
).rejects.toThrow();
});

it('cascades delete when group is deleted', async () => {
const group = await createGroup();
const user = await createUser();
await prisma.groupMember.create({ data: { groupId: group.id, userId: user.id } });
await prisma.group.delete({ where: { id: group.id } });
const members = await prisma.groupMember.findMany({ where: { groupId: group.id } });
expect(members).toHaveLength(0);
});
});

// ── RbacDefinition model ──

describe('RbacDefinition', () => {
it('creates with defaults', async () => {
const rbac = await prisma.rbacDefinition.create({
data: { name: 'test-rbac' },
});
expect(rbac.subjects).toEqual([]);
expect(rbac.roleBindings).toEqual([]);
expect(rbac.version).toBe(1);
});

it('enforces unique name', async () => {
await prisma.rbacDefinition.create({ data: { name: 'dup-rbac' } });
await expect(prisma.rbacDefinition.create({ data: { name: 'dup-rbac' } })).rejects.toThrow();
});

it('stores subjects as JSON', async () => {
const rbac = await prisma.rbacDefinition.create({
data: {
name: 'with-subjects',
subjects: [{ kind: 'User', name: 'alice@test.com' }, { kind: 'Group', name: 'devs' }],
},
});
const subjects = rbac.subjects as Array<{ kind: string; name: string }>;
expect(subjects).toHaveLength(2);
expect(subjects[0].kind).toBe('User');
});

it('stores roleBindings as JSON', async () => {
const rbac = await prisma.rbacDefinition.create({
data: {
name: 'with-bindings',
roleBindings: [{ role: 'editor', resource: 'servers' }],
},
});
const bindings = rbac.roleBindings as Array<{ role: string; resource: string }>;
expect(bindings).toHaveLength(1);
expect(bindings[0].role).toBe('editor');
});

it('updates subjects and roleBindings', async () => {
const rbac = await prisma.rbacDefinition.create({ data: { name: 'updatable-rbac' } });
const updated = await prisma.rbacDefinition.update({
where: { id: rbac.id },
data: {
subjects: [{ kind: 'User', name: 'bob@test.com' }],
roleBindings: [{ role: 'admin', resource: '*' }],
},
});
expect((updated.subjects as unknown[]).length).toBe(1);
expect((updated.roleBindings as unknown[]).length).toBe(1);
});
});

// ── ProjectServer model ──

describe('ProjectServer', () => {
it('links project to server', async () => {
const project = await createProject();
const server = await createServer();
const ps = await prisma.projectServer.create({
data: { projectId: project.id, serverId: server.id },
});
expect(ps.projectId).toBe(project.id);
expect(ps.serverId).toBe(server.id);
});

it('enforces unique project-server pair', async () => {
const project = await createProject();
const server = await createServer();
await prisma.projectServer.create({ data: { projectId: project.id, serverId: server.id } });
await expect(
prisma.projectServer.create({ data: { projectId: project.id, serverId: server.id } }),
).rejects.toThrow();
});

it('cascades delete when project is deleted', async () => {
const project = await createProject();
const server = await createServer();
await prisma.projectServer.create({ data: { projectId: project.id, serverId: server.id } });
await prisma.project.delete({ where: { id: project.id } });
const links = await prisma.projectServer.findMany({ where: { projectId: project.id } });
expect(links).toHaveLength(0);
});

it('cascades delete when server is deleted', async () => {
const project = await createProject();
const server = await createServer();
await prisma.projectServer.create({ data: { projectId: project.id, serverId: server.id } });
await prisma.mcpServer.delete({ where: { id: server.id } });
const links = await prisma.projectServer.findMany({ where: { serverId: server.id } });
expect(links).toHaveLength(0);
});
});

// ── ProjectMember model ──

describe('ProjectMember', () => {
it('links project to user with role', async () => {
const user = await createUser();
const project = await createProject({ ownerId: user.id });
const pm = await prisma.projectMember.create({
data: { projectId: project.id, userId: user.id, role: 'admin' },
});
expect(pm.role).toBe('admin');
});

it('defaults role to member', async () => {
const user = await createUser();
const project = await createProject({ ownerId: user.id });
const pm = await prisma.projectMember.create({
data: { projectId: project.id, userId: user.id },
});
expect(pm.role).toBe('member');
});

it('enforces unique project-user pair', async () => {
const user = await createUser();
const project = await createProject({ ownerId: user.id });
await prisma.projectMember.create({ data: { projectId: project.id, userId: user.id } });
await expect(
prisma.projectMember.create({ data: { projectId: project.id, userId: user.id } }),
).rejects.toThrow();
});

it('cascades delete when project is deleted', async () => {
const user = await createUser();
const project = await createProject({ ownerId: user.id });
await prisma.projectMember.create({ data: { projectId: project.id, userId: user.id } });
await prisma.project.delete({ where: { id: project.id } });
const members = await prisma.projectMember.findMany({ where: { projectId: project.id } });
expect(members).toHaveLength(0);
});
});

// ── Project new fields ──

describe('Project new fields', () => {
it('defaults proxyMode to direct', async () => {
const project = await createProject();
expect(project.proxyMode).toBe('direct');
});

it('stores proxyMode, llmProvider, llmModel', async () => {
const user = await createUser();
const project = await prisma.project.create({
data: {
name: 'filtered-project',
ownerId: user.id,
proxyMode: 'filtered',
llmProvider: 'gemini-cli',
llmModel: 'gemini-2.0-flash',
},
});
expect(project.proxyMode).toBe('filtered');
expect(project.llmProvider).toBe('gemini-cli');
expect(project.llmModel).toBe('gemini-2.0-flash');
});

it('defaults llmProvider and llmModel to null', async () => {
const project = await createProject();
expect(project.llmProvider).toBeNull();
expect(project.llmModel).toBeNull();
});
});
@@ -1,7 +1,8 @@
import { describe, it, expect, beforeAll, afterAll, beforeEach } from 'vitest';
import type { PrismaClient } from '@prisma/client';
import { setupTestDb, cleanupTestDb, clearAllTables } from './helpers.js';
import { seedMcpServers, defaultServers } from '../src/seed/index.js';
import { seedTemplates } from '../src/seed/index.js';
import type { SeedTemplate } from '../src/seed/index.js';

let prisma: PrismaClient;

@@ -17,55 +18,69 @@ beforeEach(async () => {
  await clearAllTables(prisma);
});

describe('seedMcpServers', () => {
  it('seeds all default servers', async () => {
    const count = await seedMcpServers(prisma);
    expect(count).toBe(defaultServers.length);
const testTemplates: SeedTemplate[] = [
  {
    name: 'github',
    version: '1.0.0',
    description: 'GitHub MCP server',
    packageName: '@anthropic/github-mcp',
    transport: 'STDIO',
    env: [{ name: 'GITHUB_TOKEN', description: 'Personal access token', required: true }],
  },
  {
    name: 'slack',
    version: '1.0.0',
    description: 'Slack MCP server',
    packageName: '@anthropic/slack-mcp',
    transport: 'STDIO',
    env: [],
  },
];

    const servers = await prisma.mcpServer.findMany({ orderBy: { name: 'asc' } });
    expect(servers).toHaveLength(defaultServers.length);
describe('seedTemplates', () => {
  it('seeds templates', async () => {
    const count = await seedTemplates(prisma, testTemplates);
    expect(count).toBe(2);

    const names = servers.map((s) => s.name);
    expect(names).toContain('slack');
    expect(names).toContain('github');
    expect(names).toContain('jira');
    expect(names).toContain('terraform');
    const templates = await prisma.mcpTemplate.findMany({ orderBy: { name: 'asc' } });
    expect(templates).toHaveLength(2);
    expect(templates.map((t) => t.name)).toEqual(['github', 'slack']);
  });

  it('is idempotent (upsert)', async () => {
    await seedMcpServers(prisma);
    const count = await seedMcpServers(prisma);
    expect(count).toBe(defaultServers.length);
    await seedTemplates(prisma, testTemplates);
    const count = await seedTemplates(prisma, testTemplates);
    expect(count).toBe(2);

    const servers = await prisma.mcpServer.findMany();
    expect(servers).toHaveLength(defaultServers.length);
    const templates = await prisma.mcpTemplate.findMany();
    expect(templates).toHaveLength(2);
  });

  it('seeds envTemplate correctly', async () => {
    await seedMcpServers(prisma);
    const slack = await prisma.mcpServer.findUnique({ where: { name: 'slack' } });
    const envTemplate = slack!.envTemplate as Array<{ name: string; isSecret: boolean }>;
    expect(envTemplate).toHaveLength(2);
    expect(envTemplate[0].name).toBe('SLACK_BOT_TOKEN');
    expect(envTemplate[0].isSecret).toBe(true);
  it('seeds env correctly', async () => {
    await seedTemplates(prisma, testTemplates);
    const github = await prisma.mcpTemplate.findUnique({ where: { name: 'github' } });
    const env = github!.env as Array<{ name: string; description?: string; required?: boolean }>;
    expect(env).toHaveLength(1);
    expect(env[0].name).toBe('GITHUB_TOKEN');
    expect(env[0].required).toBe(true);
  });

  it('accepts custom server list', async () => {
    const custom = [
  it('accepts custom template list', async () => {
    const custom: SeedTemplate[] = [
      {
        name: 'custom-server',
        description: 'Custom test server',
        name: 'custom-template',
        version: '2.0.0',
        description: 'Custom test template',
        packageName: '@test/custom',
        transport: 'STDIO' as const,
        repositoryUrl: 'https://example.com',
        envTemplate: [],
        transport: 'STDIO',
        env: [],
      },
    ];
    const count = await seedMcpServers(prisma, custom);
    const count = await seedTemplates(prisma, custom);
    expect(count).toBe(1);

    const servers = await prisma.mcpServer.findMany();
    expect(servers).toHaveLength(1);
    expect(servers[0].name).toBe('custom-server');
    const templates = await prisma.mcpTemplate.findMany();
    expect(templates).toHaveLength(1);
    expect(templates[0].name).toBe('custom-template');
  });
});

@@ -23,11 +23,13 @@
    "bcrypt": "^5.1.1",
    "dockerode": "^4.0.9",
    "fastify": "^5.0.0",
    "js-yaml": "^4.1.0",
    "zod": "^3.24.0"
  },
  "devDependencies": {
    "@types/bcrypt": "^5.0.2",
    "@types/dockerode": "^4.0.1",
    "@types/js-yaml": "^4.0.9",
    "@types/node": "^25.3.0"
  }
}

102
src/mcpd/src/bootstrap/system-project.ts
Normal file
@@ -0,0 +1,102 @@
/**
 * Bootstrap the mcpctl-system project and its system prompts.
 *
 * This runs on every mcpd startup and uses upserts to be idempotent.
 * System prompts are editable by users but will be re-created if deleted.
 */

import type { PrismaClient } from '@prisma/client';

/** Well-known owner ID for system-managed resources. */
export const SYSTEM_OWNER_ID = 'system';

/** Well-known project name for system prompts. */
export const SYSTEM_PROJECT_NAME = 'mcpctl-system';

interface SystemPromptDef {
  name: string;
  priority: number;
  content: string;
}

const SYSTEM_PROMPTS: SystemPromptDef[] = [
  {
    name: 'gate-instructions',
    priority: 10,
    content: `This project uses a gated session. Before you can access tools, you must describe your current task by calling begin_session with 3-7 keywords.

After calling begin_session, you will receive:
1. Relevant project prompts matched to your keywords
2. A list of other available prompts
3. Full access to all project tools

Choose your keywords carefully — they determine which context you receive.`,
  },
  {
    name: 'gate-encouragement',
    priority: 10,
    content: `If any of the listed prompts seem relevant to your work, or if you encounter unfamiliar patterns, conventions, or constraints during implementation, use read_prompts({ tags: [...] }) to retrieve them.

It is better to check and not need it than to proceed without important context. The project maintainers have documented common pitfalls, architecture decisions, and required patterns — taking 10 seconds to retrieve a prompt can save hours of rework.`,
  },
  {
    name: 'gate-intercept-preamble',
    priority: 10,
    content: `The following project context was automatically retrieved based on your tool call. You bypassed the begin_session step, so this context was matched using keywords extracted from your tool invocation.

Review this context carefully — it may contain important guidelines, constraints, or patterns relevant to your work. If you need more context, use read_prompts({ tags: [...] }) at any time.`,
  },
  {
    name: 'session-greeting',
    priority: 10,
    content: `Welcome to this project. To get started, call begin_session with keywords describing your task.

Example: begin_session({ tags: ["zigbee", "pairing", "mqtt"] })

This will load relevant project context, policies, and guidelines tailored to your work.`,
  },
];

/**
 * Ensure the mcpctl-system project and its system prompts exist.
 * Uses upserts so this is safe to call on every startup.
 */
export async function bootstrapSystemProject(prisma: PrismaClient): Promise<void> {
  // Upsert the system project
  const project = await prisma.project.upsert({
    where: { name: SYSTEM_PROJECT_NAME },
    create: {
      name: SYSTEM_PROJECT_NAME,
      description: 'System prompts for mcpctl gating and session management',
      prompt: '',
      proxyMode: 'direct',
      gated: false,
      ownerId: SYSTEM_OWNER_ID,
    },
    update: {}, // Don't overwrite user edits to the project itself
  });

  // Upsert each system prompt (re-create if deleted, don't overwrite content if edited)
  for (const def of SYSTEM_PROMPTS) {
    const existing = await prisma.prompt.findFirst({
      where: { name: def.name, projectId: project.id },
    });

    if (!existing) {
      await prisma.prompt.create({
        data: {
          name: def.name,
          content: def.content,
          priority: def.priority,
          projectId: project.id,
        },
      });
    }
    // If the prompt exists, don't overwrite — user may have edited it
  }
}

/** Get the names of all system prompts (for delete protection). */
export function getSystemPromptNames(): string[] {
  return SYSTEM_PROMPTS.map((p) => p.name);
}
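The bootstrap code above follows an upsert-if-missing policy: a system prompt is re-created when it has been deleted, but an existing (possibly user-edited) prompt is never overwritten. A minimal dependency-free sketch of that policy, where the `Prompt` interface, `ensurePrompts`, and the in-memory `Map` are illustrative stand-ins for the Prisma calls:

```typescript
// Hypothetical in-memory model of the bootstrap idempotency rule.
interface Prompt { name: string; content: string }

function ensurePrompts(store: Map<string, Prompt>, defs: Prompt[]): void {
  for (const def of defs) {
    if (!store.has(def.name)) {
      store.set(def.name, { ...def }); // re-create if deleted
    }
    // Existing prompt: do nothing, so user edits survive restarts.
  }
}

const store = new Map<string, Prompt>();
const defs = [{ name: 'gate-instructions', content: 'default text' }];

ensurePrompts(store, defs); // first startup: prompt is created
store.get('gate-instructions')!.content = 'edited by user';
ensurePrompts(store, defs); // restart: the edit is preserved
console.log(store.get('gate-instructions')!.content); // edited by user
```

Running `ensurePrompts` any number of times converges to the same state, which is what makes calling it on every startup safe.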
@@ -1,18 +1,29 @@
import { readdirSync, readFileSync } from 'node:fs';
import { join } from 'node:path';
import { PrismaClient } from '@prisma/client';
import { seedMcpServers } from '@mcpctl/db';
import yaml from 'js-yaml';
import { seedTemplates } from '@mcpctl/db';
import type { SeedTemplate } from '@mcpctl/db';
import { loadConfigFromEnv } from './config/index.js';
import { createServer } from './server.js';
import { setupGracefulShutdown } from './utils/index.js';
import {
  McpServerRepository,
  McpProfileRepository,
  SecretRepository,
  McpInstanceRepository,
  ProjectRepository,
  AuditLogRepository,
  TemplateRepository,
  RbacDefinitionRepository,
  UserRepository,
  GroupRepository,
} from './repositories/index.js';
import { PromptRepository } from './repositories/prompt.repository.js';
import { PromptRequestRepository } from './repositories/prompt-request.repository.js';
import { bootstrapSystemProject } from './bootstrap/system-project.js';
import {
  McpServerService,
  McpProfileService,
  SecretService,
  InstanceService,
  ProjectService,
  AuditLogService,
@@ -23,10 +34,19 @@ import {
  RestoreService,
  AuthService,
  McpProxyService,
  TemplateService,
  HealthProbeRunner,
  RbacDefinitionService,
  RbacService,
  UserService,
  GroupService,
} from './services/index.js';
import type { RbacAction } from './services/index.js';
import type { UpdateRbacDefinitionInput } from './validation/rbac-definition.schema.js';
import { createAuthMiddleware } from './middleware/auth.js';
import {
  registerMcpServerRoutes,
  registerMcpProfileRoutes,
  registerSecretRoutes,
  registerInstanceRoutes,
  registerProjectRoutes,
  registerAuditLogRoutes,
@@ -34,7 +54,156 @@ import {
  registerBackupRoutes,
  registerAuthRoutes,
  registerMcpProxyRoutes,
  registerTemplateRoutes,
  registerRbacRoutes,
  registerUserRoutes,
  registerGroupRoutes,
} from './routes/index.js';
import { registerPromptRoutes } from './routes/prompts.js';
import { PromptService } from './services/prompt.service.js';

type PermissionCheck =
  | { kind: 'resource'; resource: string; action: RbacAction; resourceName?: string }
  | { kind: 'operation'; operation: string }
  | { kind: 'skip' };

/**
 * Map an HTTP method + URL to a permission check.
 * Returns 'skip' for URLs that should not be RBAC-checked.
 */
function mapUrlToPermission(method: string, url: string): PermissionCheck {
  const match = url.match(/^\/api\/v1\/([a-z-]+)/);
  if (!match) return { kind: 'skip' };

  const segment = match[1] as string;

  // Operations (non-resource endpoints)
  if (segment === 'backup') return { kind: 'operation', operation: 'backup' };
  if (segment === 'restore') return { kind: 'operation', operation: 'restore' };
  if (segment === 'audit-logs' && method === 'DELETE') return { kind: 'operation', operation: 'audit-purge' };

  const resourceMap: Record<string, string | undefined> = {
    'servers': 'servers',
    'instances': 'instances',
    'secrets': 'secrets',
    'projects': 'projects',
    'templates': 'templates',
    'users': 'users',
    'groups': 'groups',
    'rbac': 'rbac',
    'audit-logs': 'rbac',
    'mcp': 'servers',
    'prompts': 'prompts',
    'promptrequests': 'promptrequests',
  };

  const resource = resourceMap[segment];
  if (resource === undefined) return { kind: 'skip' };

  // Special case: /api/v1/promptrequests/:id/approve → needs both delete+promptrequests and create+prompts
  // We check delete on promptrequests (the harder permission); create on prompts is checked in the service layer
  const approveMatch = url.match(/^\/api\/v1\/promptrequests\/([^/?]+)\/approve/);
  if (approveMatch?.[1]) {
    return { kind: 'resource', resource: 'promptrequests', action: 'delete', resourceName: approveMatch[1] };
  }

  // Special case: /api/v1/projects/:name/prompts/visible → view prompts
  const visiblePromptsMatch = url.match(/^\/api\/v1\/projects\/([^/?]+)\/prompts\/visible/);
  if (visiblePromptsMatch?.[1]) {
    return { kind: 'resource', resource: 'prompts', action: 'view' };
  }

  // Special case: /api/v1/projects/:name/promptrequests → create promptrequests
  const projectPromptrequestsMatch = url.match(/^\/api\/v1\/projects\/([^/?]+)\/promptrequests/);
  if (projectPromptrequestsMatch?.[1] && method === 'POST') {
    return { kind: 'resource', resource: 'promptrequests', action: 'create' };
  }

  // Special case: /api/v1/projects/:id/instructions → view projects
  const instructionsMatch = url.match(/^\/api\/v1\/projects\/([^/?]+)\/instructions/);
  if (instructionsMatch?.[1]) {
    return { kind: 'resource', resource: 'projects', action: 'view', resourceName: instructionsMatch[1] };
  }

  // Special case: /api/v1/projects/:id/mcp-config → requires 'expose' permission
  const mcpConfigMatch = url.match(/^\/api\/v1\/projects\/([^/?]+)\/mcp-config/);
  if (mcpConfigMatch?.[1]) {
    return { kind: 'resource', resource: 'projects', action: 'expose', resourceName: mcpConfigMatch[1] };
  }

  // Special case: /api/v1/projects/:id/servers — attach/detach requires 'edit'
  const projectServersMatch = url.match(/^\/api\/v1\/projects\/([^/?]+)\/servers/);
  if (projectServersMatch?.[1] && method !== 'GET') {
    return { kind: 'resource', resource: 'projects', action: 'edit', resourceName: projectServersMatch[1] };
  }

  // Map HTTP method to action
  let action: RbacAction;
  switch (method) {
    case 'GET':
    case 'HEAD':
      action = 'view';
      break;
    case 'POST':
      action = 'create';
      break;
    case 'DELETE':
      action = 'delete';
      break;
    default: // PUT, PATCH
      action = 'edit';
      break;
  }

  // Extract resource name/ID from URL (3rd segment: /api/v1/servers/:nameOrId)
  const nameMatch = url.match(/^\/api\/v1\/[a-z-]+\/([^/?]+)/);
  const resourceName = nameMatch?.[1];

  const check: PermissionCheck = { kind: 'resource', resource, action };
  if (resourceName !== undefined) (check as { resourceName: string }).resourceName = resourceName;
  return check;
}
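The method-to-action fallback at the end of `mapUrlToPermission` can be exercised in isolation. This sketch restates just that switch; the helper name `mapMethodToAction` is hypothetical, not part of the codebase:

```typescript
// Standalone sketch of the HTTP-method → RBAC-action fallback used when no
// special-case URL pattern matched.
type RbacAction = 'view' | 'create' | 'edit' | 'delete' | 'expose';

function mapMethodToAction(method: string): RbacAction {
  switch (method) {
    case 'GET':
    case 'HEAD':
      return 'view';   // reads are 'view'
    case 'POST':
      return 'create';
    case 'DELETE':
      return 'delete';
    default:           // PUT, PATCH
      return 'edit';
  }
}

console.log(mapMethodToAction('GET'));   // view
console.log(mapMethodToAction('PATCH')); // edit
```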

/**
 * Migrate legacy 'admin' role bindings → granular roles.
 * Old format: { role: 'admin', resource: '*' }
 * New format: { role: 'edit', resource: '*' }, { role: 'run', resource: '*' },
 * plus operation bindings for impersonate, logs, backup, restore, audit-purge
 */
async function migrateAdminRole(rbacRepo: InstanceType<typeof RbacDefinitionRepository>): Promise<void> {
  const definitions = await rbacRepo.findAll();
  for (const def of definitions) {
    const bindings = def.roleBindings as Array<Record<string, unknown>>;
    const hasAdminRole = bindings.some((b) => b['role'] === 'admin');
    if (!hasAdminRole) continue;

    // Replace admin bindings with granular equivalents
    const newBindings: Array<Record<string, string>> = [];
    for (const b of bindings) {
      if (b['role'] === 'admin') {
        const resource = b['resource'] as string;
        newBindings.push({ role: 'edit', resource });
        newBindings.push({ role: 'run', resource });
      } else {
        newBindings.push(b as Record<string, string>);
      }
    }
    // Add operation bindings (idempotent — only for wildcard admin)
    const hasWildcard = bindings.some((b) => b['role'] === 'admin' && b['resource'] === '*');
    if (hasWildcard) {
      const ops = ['impersonate', 'logs', 'backup', 'restore', 'audit-purge'];
      for (const op of ops) {
        if (!newBindings.some((b) => b['action'] === op)) {
          newBindings.push({ role: 'run', action: op });
        }
      }
    }

    await rbacRepo.update(def.id, { roleBindings: newBindings as UpdateRbacDefinitionInput['roleBindings'] });
    // eslint-disable-next-line no-console
    console.log(`mcpd: migrated RBAC '${def.name}' from admin → granular roles`);
  }
}
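The expansion rule documented above (each `admin` binding becomes an `edit` plus a `run` binding, and wildcard admins additionally gain the privileged operation bindings) can be sketched on plain objects without the repository. The helper `expandAdmin` is hypothetical:

```typescript
// Sketch of the admin → granular binding expansion, free of Prisma.
type Binding = Record<string, string>;

function expandAdmin(bindings: Binding[]): Binding[] {
  const out: Binding[] = [];
  for (const b of bindings) {
    if (b['role'] === 'admin') {
      // One admin binding fans out into edit + run on the same resource.
      out.push({ role: 'edit', resource: b['resource'] });
      out.push({ role: 'run', resource: b['resource'] });
    } else {
      out.push(b);
    }
  }
  // Wildcard admins also receive the privileged operation bindings.
  if (bindings.some((b) => b['role'] === 'admin' && b['resource'] === '*')) {
    for (const op of ['impersonate', 'logs', 'backup', 'restore', 'audit-purge']) {
      if (!out.some((b) => b['action'] === op)) out.push({ role: 'run', action: op });
    }
  }
  return out;
}

const migrated = expandAdmin([{ role: 'admin', resource: '*' }]);
console.log(migrated.length); // 7: edit + run on '*' plus five operation bindings
```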

async function main(): Promise<void> {
  const config = loadConfigFromEnv();
@@ -45,31 +214,83 @@ async function main(): Promise<void> {
  });
  await prisma.$connect();

  // Seed default servers (upsert, safe to repeat)
  await seedMcpServers(prisma);
  // Seed templates from YAML files
  const templatesDir = process.env.TEMPLATES_DIR ?? 'templates';
  const templateFiles = (() => {
    try {
      return readdirSync(templatesDir).filter((f) => f.endsWith('.yaml') || f.endsWith('.yml'));
    } catch {
      return [];
    }
  })();
  const templates: SeedTemplate[] = templateFiles.map((f) => {
    const content = readFileSync(join(templatesDir, f), 'utf-8');
    const parsed = yaml.load(content) as SeedTemplate;
    return {
      ...parsed,
      transport: parsed.transport ?? 'STDIO',
      version: parsed.version ?? '1.0.0',
      description: parsed.description ?? '',
      ...(parsed.healthCheck ? { healthCheck: parsed.healthCheck } : {}),
    };
  });
  await seedTemplates(prisma, templates);

  // Bootstrap system project and prompts
  await bootstrapSystemProject(prisma);

  // Repositories
  const serverRepo = new McpServerRepository(prisma);
  const profileRepo = new McpProfileRepository(prisma);
  const secretRepo = new SecretRepository(prisma);
  const instanceRepo = new McpInstanceRepository(prisma);
  const projectRepo = new ProjectRepository(prisma);
  const auditLogRepo = new AuditLogRepository(prisma);
  const templateRepo = new TemplateRepository(prisma);
  const rbacDefinitionRepo = new RbacDefinitionRepository(prisma);
  const userRepo = new UserRepository(prisma);
  const groupRepo = new GroupRepository(prisma);

  // CUID detection for RBAC name resolution
  const CUID_RE = /^c[^\s-]{8,}$/i;
  const nameResolvers: Record<string, { findById(id: string): Promise<{ name: string } | null> }> = {
    servers: serverRepo,
    secrets: secretRepo,
    projects: projectRepo,
    groups: groupRepo,
  };
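The `CUID_RE` heuristic treats anything that starts with `c` followed by at least eight characters containing no whitespace or hyphens as an opaque ID, so that name-scoped RBAC bindings can be resolved from CUIDs. A quick standalone check:

```typescript
// The same regex as in main(); the sample inputs are illustrative.
const CUID_RE = /^c[^\s-]{8,}$/i;

console.log(CUID_RE.test('clxyz1234567890')); // true: looks like a CUID
console.log(CUID_RE.test('my-project'));      // false: hyphenated human name
console.log(CUID_RE.test('web'));             // false: too short
```

Note the heuristic is intentionally loose; a long hyphen-free name starting with `c` would also match, which only costs an extra `findById` lookup.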

  // Migrate legacy 'admin' role → granular roles
  await migrateAdminRole(rbacDefinitionRepo);

  // Orchestrator
  const orchestrator = new DockerContainerManager();

  // Services
  const serverService = new McpServerService(serverRepo);
  const profileService = new McpProfileService(profileRepo, serverRepo);
  const instanceService = new InstanceService(instanceRepo, serverRepo, orchestrator);
  const projectService = new ProjectService(projectRepo, profileRepo, serverRepo);
  const instanceService = new InstanceService(instanceRepo, serverRepo, orchestrator, secretRepo);
  serverService.setInstanceService(instanceService);
  const secretService = new SecretService(secretRepo);
  const projectService = new ProjectService(projectRepo, serverRepo, secretRepo);
  const auditLogService = new AuditLogService(auditLogRepo);
  const metricsCollector = new MetricsCollector();
  const healthAggregator = new HealthAggregator(metricsCollector, orchestrator);
  const backupService = new BackupService(serverRepo, profileRepo, projectRepo);
  const restoreService = new RestoreService(serverRepo, profileRepo, projectRepo);
  const backupService = new BackupService(serverRepo, projectRepo, secretRepo, userRepo, groupRepo, rbacDefinitionRepo);
  const restoreService = new RestoreService(serverRepo, projectRepo, secretRepo, userRepo, groupRepo, rbacDefinitionRepo);
  const authService = new AuthService(prisma);
  const mcpProxyService = new McpProxyService(instanceRepo);
  const templateService = new TemplateService(templateRepo);
  const mcpProxyService = new McpProxyService(instanceRepo, serverRepo, orchestrator);
  const rbacDefinitionService = new RbacDefinitionService(rbacDefinitionRepo);
  const rbacService = new RbacService(rbacDefinitionRepo, prisma);
  const userService = new UserService(userRepo);
  const groupService = new GroupService(groupRepo, userRepo);
  const promptRepo = new PromptRepository(prisma);
  const promptRequestRepo = new PromptRequestRepository(prisma);
  const promptService = new PromptService(promptRepo, promptRequestRepo, projectRepo);

  // Auth middleware for global hooks
  const authMiddleware = createAuthMiddleware({
    findSession: (token) => authService.findSession(token),
  });

  // Server
  const app = await createServer(config, {
@@ -85,28 +306,120 @@ async function main(): Promise<void> {
    },
  });

  // ── Global auth hook ──
  // Runs on all /api/v1/* routes EXCEPT auth endpoints and health checks.
  // Tests that use createServer() directly are NOT affected — this hook
  // is only registered here in main.ts.
  app.addHook('preHandler', async (request, reply) => {
    const url = request.url;
    // Skip auth for health, auth, and root
    if (url.startsWith('/api/v1/auth/') || url === '/healthz' || url === '/health') return;
    if (!url.startsWith('/api/v1/')) return;

    // Run auth middleware
    await authMiddleware(request, reply);
  });

  // ── Global RBAC hook ──
  // Runs after the auth hook. Maps URL to resource+action and checks permissions.
  app.addHook('preHandler', async (request, reply) => {
    if (reply.sent) return; // Auth hook already rejected
    const url = request.url;
    if (url.startsWith('/api/v1/auth/') || url === '/healthz' || url === '/health') return;
    if (!url.startsWith('/api/v1/')) return;
    if (request.userId === undefined) return; // Auth hook will handle 401

    const check = mapUrlToPermission(request.method, url);
    if (check.kind === 'skip') return;

    // Extract service account identity from header (sent by mcplocal)
    const saHeader = request.headers['x-service-account'];
    const serviceAccountName = typeof saHeader === 'string' ? saHeader : undefined;

    let allowed: boolean;
    if (check.kind === 'operation') {
      allowed = await rbacService.canRunOperation(request.userId, check.operation, serviceAccountName);
    } else {
      // Resolve CUID → human name for name-scoped RBAC bindings
      if (check.resourceName !== undefined && CUID_RE.test(check.resourceName)) {
        const resolver = nameResolvers[check.resource];
        if (resolver) {
          const entity = await resolver.findById(check.resourceName);
          if (entity) check.resourceName = entity.name;
        }
      }
      allowed = await rbacService.canAccess(request.userId, check.action, check.resource, check.resourceName, serviceAccountName);
      // Compute scope for list filtering (used by preSerialization hook)
      if (allowed && check.resourceName === undefined) {
        request.rbacScope = await rbacService.getAllowedScope(request.userId, check.action, check.resource, serviceAccountName);
      }
    }
    if (!allowed) {
      reply.code(403).send({ error: 'Forbidden' });
    }
  });

  // Routes
  registerMcpServerRoutes(app, serverService);
  registerMcpProfileRoutes(app, profileService);
  registerMcpServerRoutes(app, serverService, instanceService);
  registerTemplateRoutes(app, templateService);
  registerSecretRoutes(app, secretService);
  registerInstanceRoutes(app, instanceService);
  registerProjectRoutes(app, projectService);
  registerAuditLogRoutes(app, auditLogService);
  registerHealthMonitoringRoutes(app, { healthAggregator, metricsCollector });
  registerBackupRoutes(app, { backupService, restoreService });
  registerAuthRoutes(app, { authService });
  registerAuthRoutes(app, { authService, userService, groupService, rbacDefinitionService, rbacService });
  registerMcpProxyRoutes(app, {
    mcpProxyService,
    auditLogService,
    authDeps: { findSession: (token) => authService.findSession(token) },
  });
  registerRbacRoutes(app, rbacDefinitionService);
  registerUserRoutes(app, userService);
  registerGroupRoutes(app, groupService);
  registerPromptRoutes(app, promptService, projectRepo);

  // ── RBAC list filtering hook ──
  // Filters array responses to only include resources the user is allowed to see.
  app.addHook('preSerialization', async (request, _reply, payload) => {
    if (!request.rbacScope || request.rbacScope.wildcard) return payload;
    if (!Array.isArray(payload)) return payload;
    return (payload as Array<Record<string, unknown>>).filter((item) => {
      const name = item['name'];
      return typeof name === 'string' && request.rbacScope!.names.has(name);
    });
  });
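The `preSerialization` hook above narrows array payloads to the names in the caller's allowed scope, passing everything through when the scope is a wildcard. The same filter as a standalone sketch, where `filterByScope` and the sample names are illustrative and only the scope shape matches the real hook:

```typescript
// Standalone sketch of the RBAC list-filtering rule.
interface RbacScope { wildcard: boolean; names: Set<string> }

function filterByScope<T extends { name?: unknown }>(payload: T[], scope: RbacScope): T[] {
  if (scope.wildcard) return payload; // wildcard scope sees everything
  return payload.filter((item) => typeof item.name === 'string' && scope.names.has(item.name));
}

const scope: RbacScope = { wildcard: false, names: new Set(['api', 'web']) };
const visible = filterByScope([{ name: 'api' }, { name: 'db' }, { name: 'web' }], scope);
console.log(visible.map((i) => i.name)); // [ 'api', 'web' ]
```

Items without a string `name` are dropped under a non-wildcard scope, which fails closed for malformed payload entries.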

  // Start
  await app.listen({ port: config.port, host: config.host });
  app.log.info(`mcpd listening on ${config.host}:${config.port}`);

  // Periodic container liveness sync — detect crashed containers
  const SYNC_INTERVAL_MS = 30_000; // 30s
  const syncTimer = setInterval(async () => {
    try {
      await instanceService.syncStatus();
    } catch (err) {
      app.log.error({ err }, 'Container status sync failed');
    }
  }, SYNC_INTERVAL_MS);

  // Health probe runner — periodic MCP tool-call probes (like k8s livenessProbe)
  const healthProbeRunner = new HealthProbeRunner(
    instanceRepo,
    serverRepo,
    orchestrator,
    { info: (msg) => app.log.info(msg), error: (obj, msg) => app.log.error(obj, msg) },
  );
  healthProbeRunner.start(15_000);

  // Graceful shutdown
  setupGracefulShutdown(app, {
    disconnectDb: () => prisma.$disconnect(),
    disconnectDb: async () => {
      clearInterval(syncTimer);
      healthProbeRunner.stop();
      await prisma.$disconnect();
    },
  });
}

@@ -7,6 +7,7 @@ export interface AuthDeps {
declare module 'fastify' {
  interface FastifyRequest {
    userId?: string;
    rbacScope?: { wildcard: boolean; names: Set<string> };
  }
}

36
src/mcpd/src/middleware/rbac.ts
Normal file
@@ -0,0 +1,36 @@
import type { FastifyRequest, FastifyReply } from 'fastify';
import type { RbacService, RbacAction } from '../services/rbac.service.js';

export function createRbacMiddleware(rbacService: RbacService) {
  function requirePermission(resource: string, action: RbacAction, resourceName?: string) {
    return async (request: FastifyRequest, reply: FastifyReply): Promise<void> => {
      if (request.userId === undefined) {
        reply.code(401).send({ error: 'Authentication required' });
        return;
      }

      const allowed = await rbacService.canAccess(request.userId, action, resource, resourceName);
      if (!allowed) {
        reply.code(403).send({ error: 'Forbidden' });
        return;
      }
    };
  }

  function requireOperation(operation: string) {
    return async (request: FastifyRequest, reply: FastifyReply): Promise<void> => {
      if (request.userId === undefined) {
        reply.code(401).send({ error: 'Authentication required' });
        return;
      }

      const allowed = await rbacService.canRunOperation(request.userId, operation);
      if (!allowed) {
        reply.code(403).send({ error: 'Forbidden' });
        return;
      }
    };
  }

  return { requirePermission, requireOperation };
}
93
src/mcpd/src/repositories/group.repository.ts
Normal file
@@ -0,0 +1,93 @@
|
||||
import type { PrismaClient, Group } from '@prisma/client';
|
||||
|
||||
export interface GroupWithMembers extends Group {
|
||||
members: Array<{ id: string; user: { id: string; email: string; name: string | null } }>;
|
||||
}
|
||||
|
||||
export interface IGroupRepository {
|
||||
findAll(): Promise<GroupWithMembers[]>;
|
||||
findById(id: string): Promise<GroupWithMembers | null>;
|
||||
findByName(name: string): Promise<GroupWithMembers | null>;
|
||||
create(data: { name: string; description?: string }): Promise<Group>;
|
||||
update(id: string, data: { description?: string }): Promise<Group>;
|
||||
delete(id: string): Promise<void>;
|
||||
  setMembers(groupId: string, userIds: string[]): Promise<void>;
  findGroupsForUser(userId: string): Promise<Array<{ id: string; name: string }>>;
}

const MEMBERS_INCLUDE = {
  members: {
    select: {
      id: true,
      user: {
        select: { id: true, email: true, name: true },
      },
    },
  },
} as const;

export class GroupRepository implements IGroupRepository {
  constructor(private readonly prisma: PrismaClient) {}

  async findAll(): Promise<GroupWithMembers[]> {
    return this.prisma.group.findMany({
      orderBy: { name: 'asc' },
      include: MEMBERS_INCLUDE,
    });
  }

  async findById(id: string): Promise<GroupWithMembers | null> {
    return this.prisma.group.findUnique({
      where: { id },
      include: MEMBERS_INCLUDE,
    });
  }

  async findByName(name: string): Promise<GroupWithMembers | null> {
    return this.prisma.group.findUnique({
      where: { name },
      include: MEMBERS_INCLUDE,
    });
  }

  async create(data: { name: string; description?: string }): Promise<Group> {
    const createData: Record<string, unknown> = { name: data.name };
    if (data.description !== undefined) createData['description'] = data.description;
    return this.prisma.group.create({
      data: createData as Parameters<PrismaClient['group']['create']>[0]['data'],
    });
  }

  async update(id: string, data: { description?: string }): Promise<Group> {
    const updateData: Record<string, unknown> = {};
    if (data.description !== undefined) updateData['description'] = data.description;
    return this.prisma.group.update({ where: { id }, data: updateData });
  }

  async delete(id: string): Promise<void> {
    await this.prisma.group.delete({ where: { id } });
  }

  async setMembers(groupId: string, userIds: string[]): Promise<void> {
    await this.prisma.$transaction(async (tx) => {
      await tx.groupMember.deleteMany({ where: { groupId } });
      if (userIds.length > 0) {
        await tx.groupMember.createMany({
          data: userIds.map((userId) => ({ groupId, userId })),
        });
      }
    });
  }

  async findGroupsForUser(userId: string): Promise<Array<{ id: string; name: string }>> {
    const memberships = await this.prisma.groupMember.findMany({
      where: { userId },
      select: {
        group: {
          select: { id: true, name: true },
        },
      },
    });
    return memberships.map((m) => m.group);
  }
}
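The `setMembers` method above replaces a group's entire membership by deleting all rows and bulk-recreating them inside one transaction. A minimal sketch of that delete-then-recreate sync against a plain in-memory store; `MemoryStore` and `syncMembers` are illustrative names, not part of the repository:

```typescript
// Membership row, mirroring the shape used by groupMember.createMany.
type Membership = { groupId: string; userId: string };

// Stands in for the groupMember table; names here are hypothetical.
class MemoryStore {
  rows: Membership[] = [];
  deleteMany(groupId: string): void {
    this.rows = this.rows.filter((r) => r.groupId !== groupId);
  }
  createMany(data: Membership[]): void {
    this.rows.push(...data);
  }
}

// Replace a group's member set: clear all rows for the group, then bulk insert.
function syncMembers(store: MemoryStore, groupId: string, userIds: string[]): void {
  store.deleteMany(groupId);
  if (userIds.length > 0) {
    store.createMany(userIds.map((userId) => ({ groupId, userId })));
  }
}

const store = new MemoryStore();
syncMembers(store, 'g1', ['u1', 'u2']);
syncMembers(store, 'g1', ['u3']); // second sync fully replaces the first
const members = store.rows.filter((r) => r.groupId === 'g1').map((r) => r.userId);
```

The upside of this approach is that the caller never has to diff old against new membership; the cost is that every sync rewrites all rows, which the transaction keeps invisible to concurrent readers.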
@@ -1,7 +1,15 @@
export type { IMcpServerRepository, IMcpProfileRepository, IMcpInstanceRepository, IAuditLogRepository, AuditLogFilter } from './interfaces.js';
export type { IMcpServerRepository, IMcpInstanceRepository, ISecretRepository, IAuditLogRepository, AuditLogFilter } from './interfaces.js';
export { McpServerRepository } from './mcp-server.repository.js';
export { McpProfileRepository } from './mcp-profile.repository.js';
export type { IProjectRepository } from './project.repository.js';
export { SecretRepository } from './secret.repository.js';
export type { IProjectRepository, ProjectWithRelations } from './project.repository.js';
export { ProjectRepository } from './project.repository.js';
export { McpInstanceRepository } from './mcp-instance.repository.js';
export { AuditLogRepository } from './audit-log.repository.js';
export type { ITemplateRepository } from './template.repository.js';
export { TemplateRepository } from './template.repository.js';
export type { IRbacDefinitionRepository } from './rbac-definition.repository.js';
export { RbacDefinitionRepository } from './rbac-definition.repository.js';
export type { IUserRepository, SafeUser } from './user.repository.js';
export { UserRepository } from './user.repository.js';
export type { IGroupRepository, GroupWithMembers } from './group.repository.js';
export { GroupRepository } from './group.repository.js';
@@ -1,6 +1,6 @@
import type { McpServer, McpProfile, McpInstance, AuditLog, InstanceStatus } from '@prisma/client';
import type { McpServer, McpInstance, AuditLog, Secret, InstanceStatus } from '@prisma/client';
import type { CreateMcpServerInput, UpdateMcpServerInput } from '../validation/mcp-server.schema.js';
import type { CreateMcpProfileInput, UpdateMcpProfileInput } from '../validation/mcp-profile.schema.js';
import type { CreateSecretInput, UpdateSecretInput } from '../validation/secret.schema.js';

export interface IMcpServerRepository {
  findAll(): Promise<McpServer[]>;
@@ -16,16 +16,16 @@ export interface IMcpInstanceRepository {
  findById(id: string): Promise<McpInstance | null>;
  findByContainerId(containerId: string): Promise<McpInstance | null>;
  create(data: { serverId: string; containerId?: string; status?: InstanceStatus; port?: number; metadata?: Record<string, unknown> }): Promise<McpInstance>;
  updateStatus(id: string, status: InstanceStatus, fields?: { containerId?: string; port?: number; metadata?: Record<string, unknown> }): Promise<McpInstance>;
  updateStatus(id: string, status: InstanceStatus, fields?: { containerId?: string; port?: number; metadata?: Record<string, unknown>; healthStatus?: string; lastHealthCheck?: Date; events?: unknown[] }): Promise<McpInstance>;
  delete(id: string): Promise<void>;
}

export interface IMcpProfileRepository {
  findAll(serverId?: string): Promise<McpProfile[]>;
  findById(id: string): Promise<McpProfile | null>;
  findByServerAndName(serverId: string, name: string): Promise<McpProfile | null>;
  create(data: CreateMcpProfileInput): Promise<McpProfile>;
  update(id: string, data: UpdateMcpProfileInput): Promise<McpProfile>;
export interface ISecretRepository {
  findAll(): Promise<Secret[]>;
  findById(id: string): Promise<Secret | null>;
  findByName(name: string): Promise<Secret | null>;
  create(data: CreateSecretInput): Promise<Secret>;
  update(id: string, data: UpdateSecretInput): Promise<Secret>;
  delete(id: string): Promise<void>;
}
@@ -11,12 +11,16 @@ export class McpInstanceRepository implements IMcpInstanceRepository {
    }
    return this.prisma.mcpInstance.findMany({
      where,
      include: { server: { select: { name: true } } },
      orderBy: { createdAt: 'desc' },
    });
  }

  async findById(id: string): Promise<McpInstance | null> {
    return this.prisma.mcpInstance.findUnique({ where: { id } });
    return this.prisma.mcpInstance.findUnique({
      where: { id },
      include: { server: { select: { name: true } } },
    });
  }

  async findByContainerId(containerId: string): Promise<McpInstance | null> {
@@ -44,7 +48,7 @@ export class McpInstanceRepository implements IMcpInstanceRepository {
  async updateStatus(
    id: string,
    status: InstanceStatus,
    fields?: { containerId?: string; port?: number; metadata?: Record<string, unknown> },
    fields?: { containerId?: string; port?: number; metadata?: Record<string, unknown>; healthStatus?: string; lastHealthCheck?: Date; events?: unknown[] },
  ): Promise<McpInstance> {
    const updateData: Prisma.McpInstanceUpdateInput = {
      status,
@@ -59,6 +63,15 @@ export class McpInstanceRepository implements IMcpInstanceRepository {
    if (fields?.metadata !== undefined) {
      updateData.metadata = fields.metadata as Prisma.InputJsonValue;
    }
    if (fields?.healthStatus !== undefined) {
      updateData.healthStatus = fields.healthStatus;
    }
    if (fields?.lastHealthCheck !== undefined) {
      updateData.lastHealthCheck = fields.lastHealthCheck;
    }
    if (fields?.events !== undefined) {
      updateData.events = fields.events as unknown as Prisma.InputJsonValue;
    }
    return this.prisma.mcpInstance.update({
      where: { id },
      data: updateData,
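The chain of `fields?.x !== undefined` guards in `updateStatus` builds a partial update payload that only touches columns the caller actually supplied; `undefined` means "leave the column alone". A standalone sketch of that pattern with hypothetical names:

```typescript
// Optional health-related fields, mirroring the updateStatus signature.
type HealthFields = { healthStatus?: string; lastHealthCheck?: Date; events?: unknown[] };

// Build an update payload containing only the keys the caller provided.
function buildUpdate(status: string, fields?: HealthFields): Record<string, unknown> {
  const updateData: Record<string, unknown> = { status };
  if (fields?.healthStatus !== undefined) updateData['healthStatus'] = fields.healthStatus;
  if (fields?.lastHealthCheck !== undefined) updateData['lastHealthCheck'] = fields.lastHealthCheck;
  if (fields?.events !== undefined) updateData['events'] = fields.events;
  return updateData;
}

// Only status and healthStatus appear in the payload; the other columns are untouched.
const patch = buildUpdate('running', { healthStatus: 'healthy' });
```

The `undefined` check (rather than truthiness) matters: it lets callers pass falsy values such as `0` or an empty array and still have them written.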
@@ -1,46 +0,0 @@
import type { PrismaClient, McpProfile } from '@prisma/client';
import type { IMcpProfileRepository } from './interfaces.js';
import type { CreateMcpProfileInput, UpdateMcpProfileInput } from '../validation/mcp-profile.schema.js';

export class McpProfileRepository implements IMcpProfileRepository {
  constructor(private readonly prisma: PrismaClient) {}

  async findAll(serverId?: string): Promise<McpProfile[]> {
    const where = serverId !== undefined ? { serverId } : {};
    return this.prisma.mcpProfile.findMany({ where, orderBy: { name: 'asc' } });
  }

  async findById(id: string): Promise<McpProfile | null> {
    return this.prisma.mcpProfile.findUnique({ where: { id } });
  }

  async findByServerAndName(serverId: string, name: string): Promise<McpProfile | null> {
    return this.prisma.mcpProfile.findUnique({
      where: { name_serverId: { name, serverId } },
    });
  }

  async create(data: CreateMcpProfileInput): Promise<McpProfile> {
    return this.prisma.mcpProfile.create({
      data: {
        name: data.name,
        serverId: data.serverId,
        permissions: data.permissions,
        envOverrides: data.envOverrides,
      },
    });
  }

  async update(id: string, data: UpdateMcpProfileInput): Promise<McpProfile> {
    const updateData: Record<string, unknown> = {};
    if (data.name !== undefined) updateData['name'] = data.name;
    if (data.permissions !== undefined) updateData['permissions'] = data.permissions;
    if (data.envOverrides !== undefined) updateData['envOverrides'] = data.envOverrides;

    return this.prisma.mcpProfile.update({ where: { id }, data: updateData });
  }

  async delete(id: string): Promise<void> {
    await this.prisma.mcpProfile.delete({ where: { id } });
  }
}
@@ -1,4 +1,4 @@
import type { PrismaClient, McpServer } from '@prisma/client';
import { type PrismaClient, type McpServer, Prisma } from '@prisma/client';
import type { IMcpServerRepository } from './interfaces.js';
import type { CreateMcpServerInput, UpdateMcpServerInput } from '../validation/mcp-server.schema.js';

@@ -26,7 +26,12 @@ export class McpServerRepository implements IMcpServerRepository {
        dockerImage: data.dockerImage ?? null,
        transport: data.transport,
        repositoryUrl: data.repositoryUrl ?? null,
        envTemplate: data.envTemplate,
        externalUrl: data.externalUrl ?? null,
        command: data.command ?? Prisma.DbNull,
        containerPort: data.containerPort ?? null,
        replicas: data.replicas,
        env: data.env,
        healthCheck: (data.healthCheck ?? Prisma.JsonNull) as Prisma.InputJsonValue,
      },
    });
  }
@@ -38,7 +43,12 @@ export class McpServerRepository implements IMcpServerRepository {
    if (data.dockerImage !== undefined) updateData['dockerImage'] = data.dockerImage;
    if (data.transport !== undefined) updateData['transport'] = data.transport;
    if (data.repositoryUrl !== undefined) updateData['repositoryUrl'] = data.repositoryUrl;
    if (data.envTemplate !== undefined) updateData['envTemplate'] = data.envTemplate;
    if (data.externalUrl !== undefined) updateData['externalUrl'] = data.externalUrl;
    if (data.command !== undefined) updateData['command'] = data.command;
    if (data.containerPort !== undefined) updateData['containerPort'] = data.containerPort;
    if (data.replicas !== undefined) updateData['replicas'] = data.replicas;
    if (data.env !== undefined) updateData['env'] = data.env;
    if (data.healthCheck !== undefined) updateData['healthCheck'] = (data.healthCheck ?? Prisma.JsonNull) as Prisma.InputJsonValue;

    return this.prisma.mcpServer.update({ where: { id }, data: updateData });
  }
@@ -1,69 +1,92 @@
import type { PrismaClient, Project } from '@prisma/client';
import type { CreateProjectInput, UpdateProjectInput } from '../validation/project.schema.js';

export interface ProjectWithRelations extends Project {
  servers: Array<{ id: string; projectId: string; serverId: string; server: Record<string, unknown> & { id: string; name: string } }>;
}

const PROJECT_INCLUDE = {
  servers: { include: { server: true } },
} as const;

export interface IProjectRepository {
  findAll(ownerId?: string): Promise<Project[]>;
  findById(id: string): Promise<Project | null>;
  findByName(name: string): Promise<Project | null>;
  create(data: CreateProjectInput & { ownerId: string }): Promise<Project>;
  update(id: string, data: UpdateProjectInput): Promise<Project>;
  findAll(ownerId?: string): Promise<ProjectWithRelations[]>;
  findById(id: string): Promise<ProjectWithRelations | null>;
  findByName(name: string): Promise<ProjectWithRelations | null>;
  create(data: { name: string; description: string; prompt?: string; ownerId: string; proxyMode: string; llmProvider?: string; llmModel?: string }): Promise<ProjectWithRelations>;
  update(id: string, data: Record<string, unknown>): Promise<ProjectWithRelations>;
  delete(id: string): Promise<void>;
  setProfiles(projectId: string, profileIds: string[]): Promise<void>;
  getProfileIds(projectId: string): Promise<string[]>;
  setServers(projectId: string, serverIds: string[]): Promise<void>;
  addServer(projectId: string, serverId: string): Promise<void>;
  removeServer(projectId: string, serverId: string): Promise<void>;
}

export class ProjectRepository implements IProjectRepository {
  constructor(private readonly prisma: PrismaClient) {}

  async findAll(ownerId?: string): Promise<Project[]> {
  async findAll(ownerId?: string): Promise<ProjectWithRelations[]> {
    const where = ownerId !== undefined ? { ownerId } : {};
    return this.prisma.project.findMany({ where, orderBy: { name: 'asc' } });
    return this.prisma.project.findMany({ where, orderBy: { name: 'asc' }, include: PROJECT_INCLUDE }) as unknown as Promise<ProjectWithRelations[]>;
  }

  async findById(id: string): Promise<Project | null> {
    return this.prisma.project.findUnique({ where: { id } });
  async findById(id: string): Promise<ProjectWithRelations | null> {
    return this.prisma.project.findUnique({ where: { id }, include: PROJECT_INCLUDE }) as unknown as Promise<ProjectWithRelations | null>;
  }

  async findByName(name: string): Promise<Project | null> {
    return this.prisma.project.findUnique({ where: { name } });
  async findByName(name: string): Promise<ProjectWithRelations | null> {
    return this.prisma.project.findUnique({ where: { name }, include: PROJECT_INCLUDE }) as unknown as Promise<ProjectWithRelations | null>;
  }

  async create(data: CreateProjectInput & { ownerId: string }): Promise<Project> {
  async create(data: { name: string; description: string; prompt?: string; ownerId: string; proxyMode: string; llmProvider?: string; llmModel?: string }): Promise<ProjectWithRelations> {
    const createData: Record<string, unknown> = {
      name: data.name,
      description: data.description,
      ownerId: data.ownerId,
      proxyMode: data.proxyMode,
    };
    if (data.prompt !== undefined) createData['prompt'] = data.prompt;
    if (data.llmProvider !== undefined) createData['llmProvider'] = data.llmProvider;
    if (data.llmModel !== undefined) createData['llmModel'] = data.llmModel;

    return this.prisma.project.create({
      data: {
        name: data.name,
        description: data.description,
        ownerId: data.ownerId,
      },
    });
      data: createData as Parameters<PrismaClient['project']['create']>[0]['data'],
      include: PROJECT_INCLUDE,
    }) as unknown as Promise<ProjectWithRelations>;
  }

  async update(id: string, data: UpdateProjectInput): Promise<Project> {
    const updateData: Record<string, unknown> = {};
    if (data.description !== undefined) updateData['description'] = data.description;
    return this.prisma.project.update({ where: { id }, data: updateData });
  async update(id: string, data: Record<string, unknown>): Promise<ProjectWithRelations> {
    return this.prisma.project.update({
      where: { id },
      data,
      include: PROJECT_INCLUDE,
    }) as unknown as Promise<ProjectWithRelations>;
  }

  async delete(id: string): Promise<void> {
    await this.prisma.project.delete({ where: { id } });
  }

  async setProfiles(projectId: string, profileIds: string[]): Promise<void> {
    await this.prisma.$transaction([
      this.prisma.projectMcpProfile.deleteMany({ where: { projectId } }),
      ...profileIds.map((profileId) =>
        this.prisma.projectMcpProfile.create({
          data: { projectId, profileId },
        }),
      ),
    ]);
  async setServers(projectId: string, serverIds: string[]): Promise<void> {
    await this.prisma.$transaction(async (tx) => {
      await tx.projectServer.deleteMany({ where: { projectId } });
      if (serverIds.length > 0) {
        await tx.projectServer.createMany({
          data: serverIds.map((serverId) => ({ projectId, serverId })),
        });
      }
    });
  }

  async getProfileIds(projectId: string): Promise<string[]> {
    const links = await this.prisma.projectMcpProfile.findMany({
      where: { projectId },
      select: { profileId: true },
  async addServer(projectId: string, serverId: string): Promise<void> {
    await this.prisma.projectServer.upsert({
      where: { projectId_serverId: { projectId, serverId } },
      create: { projectId, serverId },
      update: {},
    });
  }

  async removeServer(projectId: string, serverId: string): Promise<void> {
    await this.prisma.projectServer.deleteMany({
      where: { projectId, serverId },
    });
    return links.map((l) => l.profileId);
  }
}
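`addServer` above makes linking idempotent by using `upsert` with an empty `update` clause: insert if missing, no-op if present. The same behavior sketched against an in-memory store (names are hypothetical, not repository API):

```typescript
// Project-to-server link row, mirroring the projectServer join table.
type Link = { projectId: string; serverId: string };

class LinkStore {
  rows: Link[] = [];
  // Idempotent link creation: the "create" branch inserts, the "update" branch is {}.
  upsert(projectId: string, serverId: string): void {
    const exists = this.rows.some((r) => r.projectId === projectId && r.serverId === serverId);
    if (!exists) this.rows.push({ projectId, serverId });
  }
}

const links = new LinkStore();
links.upsert('p1', 's1');
links.upsert('p1', 's1'); // calling twice leaves a single row
```

With a real compound unique key (`projectId_serverId`), the database enforces the "exists" check atomically, which the in-memory version only approximates.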
69 src/mcpd/src/repositories/prompt-request.repository.ts Normal file
@@ -0,0 +1,69 @@
import type { PrismaClient, PromptRequest } from '@prisma/client';

export interface IPromptRequestRepository {
  findAll(projectId?: string): Promise<PromptRequest[]>;
  findGlobal(): Promise<PromptRequest[]>;
  findById(id: string): Promise<PromptRequest | null>;
  findByNameAndProject(name: string, projectId: string | null): Promise<PromptRequest | null>;
  findBySession(sessionId: string, projectId?: string): Promise<PromptRequest[]>;
  create(data: { name: string; content: string; projectId?: string; priority?: number; createdBySession?: string; createdByUserId?: string }): Promise<PromptRequest>;
  update(id: string, data: { content?: string; priority?: number }): Promise<PromptRequest>;
  delete(id: string): Promise<void>;
}

export class PromptRequestRepository implements IPromptRequestRepository {
  constructor(private readonly prisma: PrismaClient) {}

  async findAll(projectId?: string): Promise<PromptRequest[]> {
    const include = { project: { select: { name: true } } };
    if (projectId !== undefined) {
      return this.prisma.promptRequest.findMany({
        where: { OR: [{ projectId }, { projectId: null }] },
        include,
        orderBy: { createdAt: 'desc' },
      });
    }
    return this.prisma.promptRequest.findMany({ include, orderBy: { createdAt: 'desc' } });
  }

  async findGlobal(): Promise<PromptRequest[]> {
    return this.prisma.promptRequest.findMany({
      where: { projectId: null },
      include: { project: { select: { name: true } } },
      orderBy: { createdAt: 'desc' },
    });
  }

  async findById(id: string): Promise<PromptRequest | null> {
    return this.prisma.promptRequest.findUnique({ where: { id } });
  }

  async findByNameAndProject(name: string, projectId: string | null): Promise<PromptRequest | null> {
    return this.prisma.promptRequest.findUnique({
      where: { name_projectId: { name, projectId: projectId ?? '' } },
    });
  }

  async findBySession(sessionId: string, projectId?: string): Promise<PromptRequest[]> {
    const where: Record<string, unknown> = { createdBySession: sessionId };
    if (projectId !== undefined) {
      where['OR'] = [{ projectId }, { projectId: null }];
    }
    return this.prisma.promptRequest.findMany({
      where,
      orderBy: { createdAt: 'desc' },
    });
  }

  async create(data: { name: string; content: string; projectId?: string; priority?: number; createdBySession?: string; createdByUserId?: string }): Promise<PromptRequest> {
    return this.prisma.promptRequest.create({ data });
  }

  async update(id: string, data: { content?: string; priority?: number }): Promise<PromptRequest> {
    return this.prisma.promptRequest.update({ where: { id }, data });
  }

  async delete(id: string): Promise<void> {
    await this.prisma.promptRequest.delete({ where: { id } });
  }
}
58 src/mcpd/src/repositories/prompt.repository.ts Normal file
@@ -0,0 +1,58 @@
import type { PrismaClient, Prompt } from '@prisma/client';

export interface IPromptRepository {
  findAll(projectId?: string): Promise<Prompt[]>;
  findGlobal(): Promise<Prompt[]>;
  findById(id: string): Promise<Prompt | null>;
  findByNameAndProject(name: string, projectId: string | null): Promise<Prompt | null>;
  create(data: { name: string; content: string; projectId?: string; priority?: number; linkTarget?: string }): Promise<Prompt>;
  update(id: string, data: { content?: string; priority?: number; summary?: string; chapters?: string[] }): Promise<Prompt>;
  delete(id: string): Promise<void>;
}

export class PromptRepository implements IPromptRepository {
  constructor(private readonly prisma: PrismaClient) {}

  async findAll(projectId?: string): Promise<Prompt[]> {
    const include = { project: { select: { name: true } } };
    if (projectId !== undefined) {
      // Project-scoped + global prompts
      return this.prisma.prompt.findMany({
        where: { OR: [{ projectId }, { projectId: null }] },
        include,
        orderBy: { name: 'asc' },
      });
    }
    return this.prisma.prompt.findMany({ include, orderBy: { name: 'asc' } });
  }

  async findGlobal(): Promise<Prompt[]> {
    return this.prisma.prompt.findMany({
      where: { projectId: null },
      include: { project: { select: { name: true } } },
      orderBy: { name: 'asc' },
    });
  }

  async findById(id: string): Promise<Prompt | null> {
    return this.prisma.prompt.findUnique({ where: { id } });
  }

  async findByNameAndProject(name: string, projectId: string | null): Promise<Prompt | null> {
    return this.prisma.prompt.findUnique({
      where: { name_projectId: { name, projectId: projectId ?? '' } },
    });
  }

  async create(data: { name: string; content: string; projectId?: string; priority?: number; linkTarget?: string }): Promise<Prompt> {
    return this.prisma.prompt.create({ data });
  }

  async update(id: string, data: { content?: string; priority?: number; summary?: string; chapters?: string[] }): Promise<Prompt> {
    return this.prisma.prompt.update({ where: { id }, data });
  }

  async delete(id: string): Promise<void> {
    await this.prisma.prompt.delete({ where: { id } });
  }
}
48 src/mcpd/src/repositories/rbac-definition.repository.ts Normal file
@@ -0,0 +1,48 @@
import type { PrismaClient, RbacDefinition } from '@prisma/client';
import type { CreateRbacDefinitionInput, UpdateRbacDefinitionInput } from '../validation/rbac-definition.schema.js';

export interface IRbacDefinitionRepository {
  findAll(): Promise<RbacDefinition[]>;
  findById(id: string): Promise<RbacDefinition | null>;
  findByName(name: string): Promise<RbacDefinition | null>;
  create(data: CreateRbacDefinitionInput): Promise<RbacDefinition>;
  update(id: string, data: UpdateRbacDefinitionInput): Promise<RbacDefinition>;
  delete(id: string): Promise<void>;
}

export class RbacDefinitionRepository implements IRbacDefinitionRepository {
  constructor(private readonly prisma: PrismaClient) {}

  async findAll(): Promise<RbacDefinition[]> {
    return this.prisma.rbacDefinition.findMany({ orderBy: { name: 'asc' } });
  }

  async findById(id: string): Promise<RbacDefinition | null> {
    return this.prisma.rbacDefinition.findUnique({ where: { id } });
  }

  async findByName(name: string): Promise<RbacDefinition | null> {
    return this.prisma.rbacDefinition.findUnique({ where: { name } });
  }

  async create(data: CreateRbacDefinitionInput): Promise<RbacDefinition> {
    return this.prisma.rbacDefinition.create({
      data: {
        name: data.name,
        subjects: data.subjects,
        roleBindings: data.roleBindings,
      },
    });
  }

  async update(id: string, data: UpdateRbacDefinitionInput): Promise<RbacDefinition> {
    const updateData: Record<string, unknown> = {};
    if (data.subjects !== undefined) updateData['subjects'] = data.subjects;
    if (data.roleBindings !== undefined) updateData['roleBindings'] = data.roleBindings;
    return this.prisma.rbacDefinition.update({ where: { id }, data: updateData });
  }

  async delete(id: string): Promise<void> {
    await this.prisma.rbacDefinition.delete({ where: { id } });
  }
}
39 src/mcpd/src/repositories/secret.repository.ts Normal file
@@ -0,0 +1,39 @@
import { type PrismaClient, type Secret } from '@prisma/client';
import type { ISecretRepository } from './interfaces.js';
import type { CreateSecretInput, UpdateSecretInput } from '../validation/secret.schema.js';

export class SecretRepository implements ISecretRepository {
  constructor(private readonly prisma: PrismaClient) {}

  async findAll(): Promise<Secret[]> {
    return this.prisma.secret.findMany({ orderBy: { name: 'asc' } });
  }

  async findById(id: string): Promise<Secret | null> {
    return this.prisma.secret.findUnique({ where: { id } });
  }

  async findByName(name: string): Promise<Secret | null> {
    return this.prisma.secret.findUnique({ where: { name } });
  }

  async create(data: CreateSecretInput): Promise<Secret> {
    return this.prisma.secret.create({
      data: {
        name: data.name,
        data: data.data,
      },
    });
  }

  async update(id: string, data: UpdateSecretInput): Promise<Secret> {
    return this.prisma.secret.update({
      where: { id },
      data: { data: data.data },
    });
  }

  async delete(id: string): Promise<void> {
    await this.prisma.secret.delete({ where: { id } });
  }
}
82 src/mcpd/src/repositories/template.repository.ts Normal file
@@ -0,0 +1,82 @@
import { type PrismaClient, type McpTemplate, Prisma } from '@prisma/client';
import type { CreateTemplateInput, UpdateTemplateInput } from '../validation/template.schema.js';

export interface ITemplateRepository {
  findAll(): Promise<McpTemplate[]>;
  findById(id: string): Promise<McpTemplate | null>;
  findByName(name: string): Promise<McpTemplate | null>;
  search(pattern: string): Promise<McpTemplate[]>;
  create(data: CreateTemplateInput): Promise<McpTemplate>;
  update(id: string, data: UpdateTemplateInput): Promise<McpTemplate>;
  delete(id: string): Promise<void>;
}

export class TemplateRepository implements ITemplateRepository {
  constructor(private readonly prisma: PrismaClient) {}

  async findAll(): Promise<McpTemplate[]> {
    return this.prisma.mcpTemplate.findMany({ orderBy: { name: 'asc' } });
  }

  async findById(id: string): Promise<McpTemplate | null> {
    return this.prisma.mcpTemplate.findUnique({ where: { id } });
  }

  async findByName(name: string): Promise<McpTemplate | null> {
    return this.prisma.mcpTemplate.findUnique({ where: { name } });
  }

  async search(pattern: string): Promise<McpTemplate[]> {
    // Convert glob * to SQL %
    const sqlPattern = pattern.replace(/\*/g, '%');
    return this.prisma.mcpTemplate.findMany({
      where: { name: { contains: sqlPattern.replace(/%/g, ''), mode: 'insensitive' } },
      orderBy: { name: 'asc' },
    });
  }

  async create(data: CreateTemplateInput): Promise<McpTemplate> {
    return this.prisma.mcpTemplate.create({
      data: {
        name: data.name,
        version: data.version,
        description: data.description,
        packageName: data.packageName ?? null,
        dockerImage: data.dockerImage ?? null,
        transport: data.transport,
        repositoryUrl: data.repositoryUrl ?? null,
        externalUrl: data.externalUrl ?? null,
        command: (data.command ?? Prisma.JsonNull) as Prisma.InputJsonValue,
        containerPort: data.containerPort ?? null,
        replicas: data.replicas,
        env: (data.env ?? []) as unknown as Prisma.InputJsonValue,
        healthCheck: (data.healthCheck ?? Prisma.JsonNull) as Prisma.InputJsonValue,
      },
    });
  }

  async update(id: string, data: UpdateTemplateInput): Promise<McpTemplate> {
    const updateData: Record<string, unknown> = {};
    if (data.version !== undefined) updateData.version = data.version;
    if (data.description !== undefined) updateData.description = data.description;
    if (data.packageName !== undefined) updateData.packageName = data.packageName;
    if (data.dockerImage !== undefined) updateData.dockerImage = data.dockerImage;
    if (data.transport !== undefined) updateData.transport = data.transport;
    if (data.repositoryUrl !== undefined) updateData.repositoryUrl = data.repositoryUrl;
    if (data.externalUrl !== undefined) updateData.externalUrl = data.externalUrl;
    if (data.command !== undefined) updateData.command = (data.command ?? Prisma.JsonNull) as Prisma.InputJsonValue;
    if (data.containerPort !== undefined) updateData.containerPort = data.containerPort;
    if (data.replicas !== undefined) updateData.replicas = data.replicas;
    if (data.env !== undefined) updateData.env = (data.env ?? []) as Prisma.InputJsonValue;
    if (data.healthCheck !== undefined) updateData.healthCheck = (data.healthCheck ?? Prisma.JsonNull) as Prisma.InputJsonValue;

    return this.prisma.mcpTemplate.update({
      where: { id },
      data: updateData,
    });
  }

  async delete(id: string): Promise<void> {
    await this.prisma.mcpTemplate.delete({ where: { id } });
  }
}
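In `TemplateRepository.search`, a glob `*` is first mapped to SQL `%`, and the wildcards are then stripped so the remainder feeds a case-insensitive `contains` filter. A sketch of that normalization as a standalone matcher (function names are illustrative):

```typescript
// Normalize a glob pattern into a bare substring for a contains-style match.
function globToContains(pattern: string): string {
  const sqlPattern = pattern.replace(/\*/g, '%'); // glob * -> SQL %
  return sqlPattern.replace(/%/g, '');            // contains() needs the wildcard-free substring
}

// Case-insensitive substring match, mirroring { contains, mode: 'insensitive' }.
function matches(name: string, pattern: string): boolean {
  return name.toLowerCase().includes(globToContains(pattern).toLowerCase());
}

const hit = matches('github-mcp', 'git*');
const miss = matches('filesystem', 'git*');
```

Note the trade-off: stripping wildcards turns anchored globs like `git*` into unanchored substring matches, so `mygit` would also match; a position-aware glob translation would be stricter.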
76
src/mcpd/src/repositories/user.repository.ts
Normal file
76
src/mcpd/src/repositories/user.repository.ts
Normal file
@@ -0,0 +1,76 @@
import type { PrismaClient, User } from '@prisma/client';

/** User without the passwordHash field — safe for API responses. */
export type SafeUser = Omit<User, 'passwordHash'>;

export interface IUserRepository {
  findAll(): Promise<SafeUser[]>;
  findById(id: string): Promise<SafeUser | null>;
  findByEmail(email: string, includeHash?: boolean): Promise<User | SafeUser | null>;
  create(data: { email: string; passwordHash: string; name?: string; role?: string }): Promise<SafeUser>;
  delete(id: string): Promise<void>;
  count(): Promise<number>;
}

/** Fields to select when passwordHash must be excluded. */
const safeSelect = {
  id: true,
  email: true,
  name: true,
  role: true,
  provider: true,
  externalId: true,
  version: true,
  createdAt: true,
  updatedAt: true,
} as const;

export class UserRepository implements IUserRepository {
  constructor(private readonly prisma: PrismaClient) {}

  async findAll(): Promise<SafeUser[]> {
    return this.prisma.user.findMany({
      select: safeSelect,
      orderBy: { email: 'asc' },
    });
  }

  async findById(id: string): Promise<SafeUser | null> {
    return this.prisma.user.findUnique({
      where: { id },
      select: safeSelect,
    });
  }

  async findByEmail(email: string, includeHash?: boolean): Promise<User | SafeUser | null> {
    if (includeHash === true) {
      return this.prisma.user.findUnique({ where: { email } });
    }
    return this.prisma.user.findUnique({
      where: { email },
      select: safeSelect,
    });
  }

  async create(data: { email: string; passwordHash: string; name?: string; role?: string }): Promise<SafeUser> {
    const createData: Record<string, unknown> = {
      email: data.email,
      passwordHash: data.passwordHash,
    };
    if (data.name !== undefined) createData['name'] = data.name;
    if (data.role !== undefined) createData['role'] = data.role;

    return this.prisma.user.create({
      data: createData as Parameters<PrismaClient['user']['create']>[0]['data'],
      select: safeSelect,
    });
  }

  async delete(id: string): Promise<void> {
    await this.prisma.user.delete({ where: { id } });
  }

  async count(): Promise<number> {
    return this.prisma.user.count();
  }
}
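The `safeSelect` projection above keeps `passwordHash` out of query results at the database level. The same invariant can be sketched in plain TypeScript as an object-level omit; `toSafeUser` and the local `User` shape below are hypothetical illustrations, not part of this diff:

```typescript
// Hypothetical helper mirroring the SafeUser = Omit<User, 'passwordHash'> type:
// strip the hash from an already-loaded record before returning it to callers.
type User = {
  id: string;
  email: string;
  passwordHash: string;
  name: string | null;
  role: string;
};

type SafeUser = Omit<User, 'passwordHash'>;

function toSafeUser(user: User): SafeUser {
  // Destructure the sensitive field away; `rest` keeps everything else.
  const { passwordHash: _ignored, ...rest } = user;
  return rest;
}

const safe = toSafeUser({
  id: 'u1',
  email: 'a@example.com',
  passwordHash: 'argon2id$...',
  name: null,
  role: 'admin',
});
console.log('passwordHash' in safe); // false
```

The repository's `select: safeSelect` approach is stronger in practice: the hash never leaves the database driver, rather than being loaded and then dropped.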
@@ -1,15 +1,76 @@
import type { FastifyInstance } from 'fastify';
import type { AuthService } from '../services/auth.service.js';
import type { UserService } from '../services/user.service.js';
import type { GroupService } from '../services/group.service.js';
import type { RbacDefinitionService } from '../services/rbac-definition.service.js';
import type { RbacService } from '../services/rbac.service.js';
import { createAuthMiddleware } from '../middleware/auth.js';
import { createRbacMiddleware } from '../middleware/rbac.js';

export interface AuthRouteDeps {
  authService: AuthService;
  userService: UserService;
  groupService: GroupService;
  rbacDefinitionService: RbacDefinitionService;
  rbacService: RbacService;
}

export function registerAuthRoutes(app: FastifyInstance, deps: AuthRouteDeps): void {
  const authMiddleware = createAuthMiddleware({
    findSession: (token) => deps.authService.findSession(token),
  });
  const { requireOperation } = createRbacMiddleware(deps.rbacService);

  // GET /api/v1/auth/status — unauthenticated, returns whether any users exist
  app.get('/api/v1/auth/status', async () => {
    const count = await deps.userService.count();
    return { hasUsers: count > 0 };
  });

  // POST /api/v1/auth/bootstrap — only works when no users exist (first-run setup)
  app.post('/api/v1/auth/bootstrap', async (request, reply) => {
    const count = await deps.userService.count();
    if (count > 0) {
      reply.code(409).send({ error: 'Users already exist. Use login instead.' });
      return;
    }

    const { email, password, name } = request.body as { email: string; password: string; name?: string };

    // Create the first admin user
    await deps.userService.create({
      email,
      password,
      ...(name !== undefined ? { name } : {}),
    });

    // Create "admin" group and add the first user to it
    await deps.groupService.create({
      name: 'admin',
      description: 'Bootstrap admin group',
      members: [email],
    });

    // Create bootstrap RBAC: full resource access + all operations
    await deps.rbacDefinitionService.create({
      name: 'bootstrap-admin',
      subjects: [{ kind: 'Group', name: 'admin' }],
      roleBindings: [
        { role: 'edit', resource: '*' },
        { role: 'run', resource: '*' },
        { role: 'run', action: 'impersonate' },
        { role: 'run', action: 'logs' },
        { role: 'run', action: 'backup' },
        { role: 'run', action: 'restore' },
        { role: 'run', action: 'audit-purge' },
      ],
    });

    // Auto-login so the caller gets a token immediately
    const session = await deps.authService.login(email, password);
    reply.code(201);
    return session;
  });

  // POST /api/v1/auth/login — no auth required
  app.post<{
@@ -28,4 +89,15 @@ export function registerAuthRoutes(app: FastifyInstance, deps: AuthRouteDeps): v
    await deps.authService.logout(token);
    return { success: true };
  });

  // POST /api/v1/auth/impersonate — requires auth + run:impersonate operation
  app.post(
    '/api/v1/auth/impersonate',
    { preHandler: [authMiddleware, requireOperation('impersonate')] },
    async (request) => {
      const { email } = request.body as { email: string };
      const result = await deps.authService.impersonate(email);
      return result;
    },
  );
}
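The bootstrap route above is self-disabling: it checks the user count and refuses with 409 once any user exists. A minimal sketch of that guard with the count passed in directly (the function name and return shape are illustrative, not from the diff):

```typescript
// Sketch of the first-run gate used by POST /api/v1/auth/bootstrap:
// allowed only while the user table is empty, 409 Conflict afterwards.
function bootstrapDecision(userCount: number): { allowed: boolean; status: number } {
  if (userCount > 0) {
    return { allowed: false, status: 409 }; // users exist: use login instead
  }
  return { allowed: true, status: 201 };    // first run: create admin + auto-login
}

console.log(bootstrapDecision(0)); // { allowed: true, status: 201 }
console.log(bootstrapDecision(3)); // { allowed: false, status: 409 }
```

Because the count check and the user creation are not atomic here, two concurrent bootstrap calls could in principle race; a unique constraint on the first admin's email would make the window harmless.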
@@ -13,7 +13,7 @@ export function registerBackupRoutes(app: FastifyInstance, deps: BackupDeps): vo
   app.post<{
     Body: {
       password?: string;
-      resources?: Array<'servers' | 'profiles' | 'projects'>;
+      resources?: Array<'servers' | 'secrets' | 'projects' | 'users' | 'groups' | 'rbac'>;
     };
   }>('/api/v1/backup', async (request) => {
     const opts: BackupOptions = {};
@@ -51,7 +51,7 @@ export function registerBackupRoutes(app: FastifyInstance, deps: BackupDeps): vo

     const result = await deps.restoreService.restore(bundle, restoreOpts);

-    if (result.errors.length > 0 && result.serversCreated === 0 && result.profilesCreated === 0 && result.projectsCreated === 0) {
+    if (result.errors.length > 0 && result.serversCreated === 0 && result.secretsCreated === 0 && result.projectsCreated === 0 && result.usersCreated === 0 && result.groupsCreated === 0 && result.rbacCreated === 0) {
       reply.code(422);
     }

35  src/mcpd/src/routes/groups.ts  Normal file
@@ -0,0 +1,35 @@
import type { FastifyInstance } from 'fastify';
import type { GroupService } from '../services/group.service.js';

export function registerGroupRoutes(
  app: FastifyInstance,
  service: GroupService,
): void {
  app.get('/api/v1/groups', async () => {
    return service.list();
  });

  app.get<{ Params: { id: string } }>('/api/v1/groups/:id', async (request) => {
    // Try by ID first, fall back to name lookup
    try {
      return await service.getById(request.params.id);
    } catch {
      return service.getByName(request.params.id);
    }
  });

  app.post('/api/v1/groups', async (request, reply) => {
    const group = await service.create(request.body);
    reply.code(201);
    return group;
  });

  app.put<{ Params: { id: string } }>('/api/v1/groups/:id', async (request) => {
    return service.update(request.params.id, request.body);
  });

  app.delete<{ Params: { id: string } }>('/api/v1/groups/:id', async (request, reply) => {
    await service.delete(request.params.id);
    reply.code(204);
  });
}
@@ -1,7 +1,7 @@
 export { registerHealthRoutes } from './health.js';
 export type { HealthDeps } from './health.js';
 export { registerMcpServerRoutes } from './mcp-servers.js';
-export { registerMcpProfileRoutes } from './mcp-profiles.js';
+export { registerSecretRoutes } from './secrets.js';
 export { registerProjectRoutes } from './projects.js';
 export { registerInstanceRoutes } from './instances.js';
 export { registerAuditLogRoutes } from './audit-logs.js';
@@ -13,3 +13,7 @@ export { registerAuthRoutes } from './auth.js';
 export type { AuthRouteDeps } from './auth.js';
 export { registerMcpProxyRoutes } from './mcp-proxy.js';
 export type { McpProxyRouteDeps } from './mcp-proxy.js';
+export { registerTemplateRoutes } from './templates.js';
+export { registerRbacRoutes } from './rbac-definitions.js';
+export { registerUserRoutes } from './users.js';
+export { registerGroupRoutes } from './groups.js';
@@ -10,40 +10,17 @@ export function registerInstanceRoutes(app: FastifyInstance, service: InstanceSe
     return service.getById(request.params.id);
   });

-  app.post<{ Body: { serverId: string; env?: Record<string, string>; hostPort?: number } }>(
-    '/api/v1/instances',
-    async (request, reply) => {
-      const { serverId } = request.body;
-      const opts: { env?: Record<string, string>; hostPort?: number } = {};
-      if (request.body.env) {
-        opts.env = request.body.env;
-      }
-      if (request.body.hostPort !== undefined) {
-        opts.hostPort = request.body.hostPort;
-      }
-      const instance = await service.start(serverId, opts);
-      reply.code(201);
-      return instance;
-    },
-  );
-
-  app.post<{ Params: { id: string } }>('/api/v1/instances/:id/stop', async (request) => {
-    return service.stop(request.params.id);
-  });
-
-  app.post<{ Params: { id: string } }>('/api/v1/instances/:id/restart', async (request) => {
-    return service.restart(request.params.id);
+  app.delete<{ Params: { id: string } }>('/api/v1/instances/:id', async (request, reply) => {
+    const { serverId } = await service.remove(request.params.id);
+    // Reconcile: server will auto-create a replacement if replicas > 0
+    await service.reconcile(serverId);
+    reply.code(204);
   });

   app.get<{ Params: { id: string } }>('/api/v1/instances/:id/inspect', async (request) => {
     return service.inspect(request.params.id);
   });

-  app.delete<{ Params: { id: string } }>('/api/v1/instances/:id', async (request, reply) => {
-    await service.remove(request.params.id);
-    reply.code(204);
-  });
-
   app.get<{ Params: { id: string }; Querystring: { tail?: string } }>(
     '/api/v1/instances/:id/logs',
     async (request) => {
@@ -1,27 +0,0 @@
import type { FastifyInstance } from 'fastify';
import type { McpProfileService } from '../services/mcp-profile.service.js';

export function registerMcpProfileRoutes(app: FastifyInstance, service: McpProfileService): void {
  app.get<{ Querystring: { serverId?: string } }>('/api/v1/profiles', async (request) => {
    return service.list(request.query.serverId);
  });

  app.get<{ Params: { id: string } }>('/api/v1/profiles/:id', async (request) => {
    return service.getById(request.params.id);
  });

  app.post('/api/v1/profiles', async (request, reply) => {
    const profile = await service.create(request.body);
    reply.code(201);
    return profile;
  });

  app.put<{ Params: { id: string } }>('/api/v1/profiles/:id', async (request) => {
    return service.update(request.params.id, request.body);
  });

  app.delete<{ Params: { id: string } }>('/api/v1/profiles/:id', async (request, reply) => {
    await service.delete(request.params.id);
    reply.code(204);
  });
}
@@ -1,7 +1,12 @@
 import type { FastifyInstance } from 'fastify';
 import type { McpServerService } from '../services/mcp-server.service.js';
+import type { InstanceService } from '../services/instance.service.js';

-export function registerMcpServerRoutes(app: FastifyInstance, service: McpServerService): void {
+export function registerMcpServerRoutes(
+  app: FastifyInstance,
+  service: McpServerService,
+  instanceService: InstanceService,
+): void {
   app.get('/api/v1/servers', async () => {
     return service.list();
   });
@@ -12,12 +17,17 @@ export function registerMcpServerRoutes(app: FastifyInstance, service: McpServer

   app.post('/api/v1/servers', async (request, reply) => {
     const server = await service.create(request.body);
+    // Auto-reconcile: create instances to match replicas
+    await instanceService.reconcile(server.id);
     reply.code(201);
     return server;
   });

   app.put<{ Params: { id: string } }>('/api/v1/servers/:id', async (request) => {
-    return service.update(request.params.id, request.body);
+    const server = await service.update(request.params.id, request.body);
+    // Re-reconcile after update (replicas may have changed)
+    await instanceService.reconcile(server.id);
+    return server;
   });

   app.delete<{ Params: { id: string } }>('/api/v1/servers/:id', async (request, reply) => {
@@ -2,13 +2,13 @@ import type { FastifyInstance } from 'fastify';
 import type { ProjectService } from '../services/project.service.js';

 export function registerProjectRoutes(app: FastifyInstance, service: ProjectService): void {
-  app.get('/api/v1/projects', async (request) => {
-    // If authenticated, filter by owner; otherwise list all
-    return service.list(request.userId);
+  app.get('/api/v1/projects', async () => {
+    // RBAC preSerialization hook handles access filtering
+    return service.list();
   });

   app.get<{ Params: { id: string } }>('/api/v1/projects/:id', async (request) => {
-    return service.getById(request.params.id);
+    return service.resolveAndGet(request.params.id);
   });

   app.post('/api/v1/projects', async (request, reply) => {
@@ -19,25 +19,51 @@ export function registerProjectRoutes(app: FastifyInstance, service: ProjectServ
   });

   app.put<{ Params: { id: string } }>('/api/v1/projects/:id', async (request) => {
-    return service.update(request.params.id, request.body);
+    const project = await service.resolveAndGet(request.params.id);
+    return service.update(project.id, request.body);
   });

   app.delete<{ Params: { id: string } }>('/api/v1/projects/:id', async (request, reply) => {
-    await service.delete(request.params.id);
+    const project = await service.resolveAndGet(request.params.id);
+    await service.delete(project.id);
     reply.code(204);
   });

   // Profile associations
   app.get<{ Params: { id: string } }>('/api/v1/projects/:id/profiles', async (request) => {
     return service.getProfiles(request.params.id);
   });

   app.put<{ Params: { id: string } }>('/api/v1/projects/:id/profiles', async (request) => {
     return service.setProfiles(request.params.id, request.body);
   });

-  // MCP config generation
+  // Generate .mcp.json for a project
   app.get<{ Params: { id: string } }>('/api/v1/projects/:id/mcp-config', async (request) => {
-    return service.getMcpConfig(request.params.id);
+    return service.generateMcpConfig(request.params.id);
   });

+  // Attach a server to a project
+  app.post<{ Params: { id: string }; Body: { server: string } }>('/api/v1/projects/:id/servers', async (request) => {
+    const body = request.body as { server?: string };
+    if (!body.server) {
+      throw Object.assign(new Error('Missing "server" in request body'), { statusCode: 400 });
+    }
+    return service.addServer(request.params.id, body.server);
+  });
+
+  // Detach a server from a project
+  app.delete<{ Params: { id: string; serverName: string } }>('/api/v1/projects/:id/servers/:serverName', async (request, reply) => {
+    await service.removeServer(request.params.id, request.params.serverName);
+    reply.code(204);
+  });
+
+  // List servers in a project (for mcplocal discovery)
+  app.get<{ Params: { id: string } }>('/api/v1/projects/:id/servers', async (request) => {
+    const project = await service.resolveAndGet(request.params.id);
+    return project.servers.map((ps) => ps.server);
+  });
+
+  // Get project instructions for LLM (prompt + server list)
+  app.get<{ Params: { id: string } }>('/api/v1/projects/:id/instructions', async (request) => {
+    const project = await service.resolveAndGet(request.params.id);
+    return {
+      prompt: project.prompt,
+      servers: project.servers.map((ps) => ({
+        name: (ps.server as Record<string, unknown>).name as string,
+        description: (ps.server as Record<string, unknown>).description as string,
+      })),
+    };
+  });
 }
207  src/mcpd/src/routes/prompts.ts  Normal file
@@ -0,0 +1,207 @@
import type { FastifyInstance } from 'fastify';
import type { Prompt } from '@prisma/client';
import type { PromptService } from '../services/prompt.service.js';
import type { IProjectRepository, ProjectWithRelations } from '../repositories/project.repository.js';

type PromptWithLinkStatus = Prompt & { linkStatus: 'alive' | 'dead' | null };

/**
 * Enrich prompts with linkStatus by checking if the target project/server exists.
 * This is a structural check (does the target exist?) — not a runtime probe.
 */
async function enrichWithLinkStatus(
  prompts: Prompt[],
  projectRepo: IProjectRepository,
): Promise<PromptWithLinkStatus[]> {
  // Cache project lookups to avoid repeated DB queries
  const projectCache = new Map<string, ProjectWithRelations | null>();

  const results: PromptWithLinkStatus[] = [];

  for (const p of prompts) {
    if (!p.linkTarget) {
      results.push({ ...p, linkStatus: null } as PromptWithLinkStatus);
      continue;
    }

    try {
      // Parse: project/server:uri
      const slashIdx = p.linkTarget.indexOf('/');
      if (slashIdx < 1) { results.push({ ...p, linkStatus: 'dead' as const }); continue; }
      const projectName = p.linkTarget.slice(0, slashIdx);
      const rest = p.linkTarget.slice(slashIdx + 1);
      const colonIdx = rest.indexOf(':');
      if (colonIdx < 1) { results.push({ ...p, linkStatus: 'dead' as const }); continue; }
      const serverName = rest.slice(0, colonIdx);

      // Check if project exists (cached)
      if (!projectCache.has(projectName)) {
        projectCache.set(projectName, await projectRepo.findByName(projectName));
      }
      const project = projectCache.get(projectName);
      if (!project) { results.push({ ...p, linkStatus: 'dead' as const }); continue; }

      // Check if server is linked to that project
      const hasServer = project.servers.some((s) => s.server.name === serverName);
      results.push({ ...p, linkStatus: hasServer ? 'alive' as const : 'dead' as const });
    } catch {
      results.push({ ...p, linkStatus: 'dead' as const });
    }
  }

  return results;
}

export function registerPromptRoutes(
  app: FastifyInstance,
  service: PromptService,
  projectRepo: IProjectRepository,
): void {
  // ── Prompts (approved) ──

  app.get<{ Querystring: { project?: string; scope?: string; projectId?: string } }>('/api/v1/prompts', async (request) => {
    let prompts: Prompt[];
    const projectName = request.query.project;
    if (projectName) {
      const project = await projectRepo.findByName(projectName);
      if (!project) {
        throw Object.assign(new Error(`Project not found: ${projectName}`), { statusCode: 404 });
      }
      prompts = await service.listPrompts(project.id);
    } else if (request.query.projectId) {
      prompts = await service.listPrompts(request.query.projectId);
    } else if (request.query.scope === 'global') {
      prompts = await service.listGlobalPrompts();
    } else {
      prompts = await service.listPrompts();
    }
    return enrichWithLinkStatus(prompts, projectRepo);
  });

  app.get<{ Params: { id: string } }>('/api/v1/prompts/:id', async (request) => {
    const prompt = await service.getPrompt(request.params.id);
    const [enriched] = await enrichWithLinkStatus([prompt], projectRepo);
    return enriched;
  });

  app.post('/api/v1/prompts', async (request, reply) => {
    const prompt = await service.createPrompt(request.body);
    reply.code(201);
    return prompt;
  });

  app.put<{ Params: { id: string } }>('/api/v1/prompts/:id', async (request) => {
    return service.updatePrompt(request.params.id, request.body);
  });

  app.delete<{ Params: { id: string } }>('/api/v1/prompts/:id', async (request, reply) => {
    await service.deletePrompt(request.params.id);
    reply.code(204);
  });

  // ── Prompt Requests (pending proposals) ──

  app.get<{ Querystring: { project?: string; scope?: string } }>('/api/v1/promptrequests', async (request) => {
    const projectName = request.query.project;
    if (projectName) {
      const project = await projectRepo.findByName(projectName);
      if (!project) {
        throw Object.assign(new Error(`Project not found: ${projectName}`), { statusCode: 404 });
      }
      return service.listPromptRequests(project.id);
    }
    if (request.query.scope === 'global') {
      return service.listGlobalPromptRequests();
    }
    return service.listPromptRequests();
  });

  app.get<{ Params: { id: string } }>('/api/v1/promptrequests/:id', async (request) => {
    return service.getPromptRequest(request.params.id);
  });

  app.put<{ Params: { id: string } }>('/api/v1/promptrequests/:id', async (request) => {
    return service.updatePromptRequest(request.params.id, request.body);
  });

  app.delete<{ Params: { id: string } }>('/api/v1/promptrequests/:id', async (request, reply) => {
    await service.deletePromptRequest(request.params.id);
    reply.code(204);
  });

  app.post('/api/v1/promptrequests', async (request, reply) => {
    const body = request.body as Record<string, unknown>;
    // Resolve project name → ID if provided
    if (body.project && typeof body.project === 'string') {
      const project = await projectRepo.findByName(body.project);
      if (!project) {
        throw Object.assign(new Error(`Project not found: ${body.project}`), { statusCode: 404 });
      }
      const { project: _, ...rest } = body;
      const req = await service.propose({ ...rest, projectId: project.id });
      reply.code(201);
      return req;
    }
    const req = await service.propose(body);
    reply.code(201);
    return req;
  });

  // Approve: atomic delete request → create prompt
  app.post<{ Params: { id: string } }>('/api/v1/promptrequests/:id/approve', async (request) => {
    return service.approve(request.params.id);
  });

  // Regenerate summary/chapters for a prompt
  app.post<{ Params: { id: string } }>('/api/v1/prompts/:id/regenerate-summary', async (request) => {
    return service.regenerateSummary(request.params.id);
  });

  // Compact prompt index for gating LLM (name, priority, summary, chapters)
  app.get<{ Params: { name: string } }>('/api/v1/projects/:name/prompt-index', async (request) => {
    const project = await projectRepo.findByName(request.params.name);
    if (!project) {
      throw Object.assign(new Error(`Project not found: ${request.params.name}`), { statusCode: 404 });
    }
    const prompts = await service.listPrompts(project.id);
    return prompts.map((p) => ({
      name: p.name,
      priority: p.priority,
      summary: p.summary,
      chapters: p.chapters,
      linkTarget: p.linkTarget,
    }));
  });

  // ── Project-scoped endpoints (for mcplocal) ──

  // Visible prompts: approved + session's pending requests
  app.get<{ Params: { name: string }; Querystring: { session?: string } }>(
    '/api/v1/projects/:name/prompts/visible',
    async (request) => {
      const project = await projectRepo.findByName(request.params.name);
      if (!project) {
        throw Object.assign(new Error(`Project not found: ${request.params.name}`), { statusCode: 404 });
      }
      return service.getVisiblePrompts(project.id, request.query.session);
    },
  );

  // LLM propose: create a PromptRequest for a project
  app.post<{ Params: { name: string } }>(
    '/api/v1/projects/:name/promptrequests',
    async (request, reply) => {
      const project = await projectRepo.findByName(request.params.name);
      if (!project) {
        throw Object.assign(new Error(`Project not found: ${request.params.name}`), { statusCode: 404 });
      }
      const body = request.body as Record<string, unknown>;
      const req = await service.propose({
        ...body,
        projectId: project.id,
      });
      reply.code(201);
      return req;
    },
  );
}
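`enrichWithLinkStatus` in prompts.ts above parses a `linkTarget` of the form `project/server:uri` inline. The same parse, isolated as a pure function for illustration (`parseLinkTarget` is a hypothetical helper, not part of the diff):

```typescript
// Parse a prompt linkTarget of the form "project/server:uri".
// Returns null for anything malformed — the route code maps that to linkStatus 'dead'.
function parseLinkTarget(target: string): { project: string; server: string; uri: string } | null {
  const slashIdx = target.indexOf('/');
  if (slashIdx < 1) return null; // no project segment before '/'
  const project = target.slice(0, slashIdx);
  const rest = target.slice(slashIdx + 1);
  const colonIdx = rest.indexOf(':');
  if (colonIdx < 1) return null; // no server segment before ':'
  return {
    project,
    server: rest.slice(0, colonIdx),
    uri: rest.slice(colonIdx + 1),
  };
}

console.log(parseLinkTarget('docs/files:readme.md'));
// { project: 'docs', server: 'files', uri: 'readme.md' }
console.log(parseLinkTarget('no-slash'));
// null
```

Note the `< 1` checks (rather than `=== -1`) also reject targets with an empty project or server segment, matching the route's behavior of marking them dead.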
30  src/mcpd/src/routes/rbac-definitions.ts  Normal file
@@ -0,0 +1,30 @@
import type { FastifyInstance } from 'fastify';
import type { RbacDefinitionService } from '../services/rbac-definition.service.js';

export function registerRbacRoutes(
  app: FastifyInstance,
  service: RbacDefinitionService,
): void {
  app.get('/api/v1/rbac', async () => {
    return service.list();
  });

  app.get<{ Params: { id: string } }>('/api/v1/rbac/:id', async (request) => {
    return service.getById(request.params.id);
  });

  app.post('/api/v1/rbac', async (request, reply) => {
    const def = await service.create(request.body);
    reply.code(201);
    return def;
  });

  app.put<{ Params: { id: string } }>('/api/v1/rbac/:id', async (request) => {
    return service.update(request.params.id, request.body);
  });

  app.delete<{ Params: { id: string } }>('/api/v1/rbac/:id', async (request, reply) => {
    await service.delete(request.params.id);
    reply.code(204);
  });
}
30  src/mcpd/src/routes/secrets.ts  Normal file
@@ -0,0 +1,30 @@
import type { FastifyInstance } from 'fastify';
import type { SecretService } from '../services/secret.service.js';

export function registerSecretRoutes(
  app: FastifyInstance,
  service: SecretService,
): void {
  app.get('/api/v1/secrets', async () => {
    return service.list();
  });

  app.get<{ Params: { id: string } }>('/api/v1/secrets/:id', async (request) => {
    return service.getById(request.params.id);
  });

  app.post('/api/v1/secrets', async (request, reply) => {
    const secret = await service.create(request.body);
    reply.code(201);
    return secret;
  });

  app.put<{ Params: { id: string } }>('/api/v1/secrets/:id', async (request) => {
    return service.update(request.params.id, request.body);
  });

  app.delete<{ Params: { id: string } }>('/api/v1/secrets/:id', async (request, reply) => {
    await service.delete(request.params.id);
    reply.code(204);
  });
}
Some files were not shown because too many files have changed in this diff.