feat: eager vLLM warmup and smart page titles in paginate stage

- Add warmup() to LlmProvider interface for eager subprocess startup
- ManagedVllmProvider.warmup() starts vLLM in background on project load
- ProviderRegistry.warmupAll() triggers all managed providers
- NamedProvider proxies warmup() to inner provider
- paginate stage generates LLM-powered descriptive page titles when a
  provider is available; titles are cached by content hash, with a generic
  "Page N" fallback
- project-mcp-endpoint calls warmupAll() on router creation so vLLM
  is loading while the session initializes
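A minimal sketch of the warmup wiring described above. The interface and class shapes here are assumptions for illustration (the real `ManagedVllmProvider` spawns a vLLM subprocess; this sketch only models the idempotent, non-blocking contract):

```typescript
// Sketch only: shapes assumed from the commit message, not the repo's code.
interface LlmProvider {
  complete(prompt: string): Promise<string>;
  // warmup() kicks off expensive startup (e.g. a vLLM subprocess) without
  // blocking the caller; repeat calls must be no-ops.
  warmup(): void;
}

class ManagedVllmProvider implements LlmProvider {
  private starting = false;
  get started(): boolean { return this.starting; }
  warmup(): void {
    if (this.starting) return;   // idempotent: only start once
    this.starting = true;
    // The real provider would spawn the vLLM subprocess here, in background.
  }
  async complete(prompt: string): Promise<string> {
    this.warmup();               // lazy fallback if nothing warmed us up
    return `echo:${prompt}`;     // stand-in for a real completion
  }
}

// NamedProvider proxies warmup() straight through to the wrapped provider.
class NamedProvider implements LlmProvider {
  constructor(public name: string, private inner: LlmProvider) {}
  warmup(): void { this.inner.warmup(); }
  complete(p: string): Promise<string> { return this.inner.complete(p); }
}

class ProviderRegistry {
  private providers: LlmProvider[] = [];
  register(p: LlmProvider): void { this.providers.push(p); }
  // warmupAll() is fire-and-forget: the MCP session keeps initializing
  // while every managed provider starts up.
  warmupAll(): void { for (const p of this.providers) p.warmup(); }
}
```

Calling `warmupAll()` at router creation means the vLLM startup cost overlaps with session initialization instead of landing on the first completion request.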
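The title-caching behavior in the paginate stage can be sketched as follows. All names here (`pageTitle`, `TitleFn`, the SHA-256 key) are illustrative assumptions; the commit only states that titles are cached by content hash and fall back to "Page N":

```typescript
// Sketch: LLM page titles cached by content hash, generic fallback otherwise.
import { createHash } from "crypto";

type TitleFn = (content: string) => string | null;

const titleCache = new Map<string, string>();

function pageTitle(content: string, pageNo: number, llm: TitleFn | null): string {
  // Key on the page content, so identical content reuses its title.
  const key = createHash("sha256").update(content).digest("hex");
  const cached = titleCache.get(key);
  if (cached !== undefined) return cached;
  const title = llm ? llm(content) : null;
  if (title === null) return `Page ${pageNo}`; // provider missing or failed
  titleCache.set(key, title);
  return title;
}
```

Keying on a content hash rather than the page number keeps titles stable across re-pagination, since the same content maps to the same cache entry regardless of where it lands.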

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Author: Michal
Date:   2026-03-03 19:07:39 +00:00
Parent: 0427d7dc1a
Commit: 03827f11e4
147 changed files with 17561 additions and 2093 deletions


@@ -186,7 +186,7 @@ async function extractTree(): Promise<CmdInfo> {
 const CANONICAL_RESOURCES = [
   'servers', 'instances', 'secrets', 'templates', 'projects',
   'users', 'groups', 'rbac', 'prompts', 'promptrequests',
-  'serverattachments', 'all',
+  'serverattachments', 'proxymodels', 'all',
 ];
 const ALIAS_ENTRIES: [string, string][] = [
@@ -201,6 +201,7 @@ const ALIAS_ENTRIES: [string, string][] = [
   ['prompt', 'prompts'], ['prompts', 'prompts'],
   ['promptrequest', 'promptrequests'], ['promptrequests', 'promptrequests'], ['pr', 'promptrequests'],
   ['serverattachment', 'serverattachments'], ['serverattachments', 'serverattachments'], ['sa', 'serverattachments'],
+  ['proxymodel', 'proxymodels'], ['proxymodels', 'proxymodels'], ['pm', 'proxymodels'],
   ['all', 'all'],
 ];
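An alias table shaped like `ALIAS_ENTRIES` is typically consumed by canonicalizing user input through a `Map` before dispatch. The resolver name below is an assumption, not code from this commit:

```typescript
// Sketch: resolving a CLI resource alias to its canonical name. The entries
// mirror the table in the diff; resolveResource() itself is illustrative.
const ALIAS_ENTRIES: [string, string][] = [
  ['serverattachment', 'serverattachments'],
  ['serverattachments', 'serverattachments'],
  ['sa', 'serverattachments'],
  ['proxymodel', 'proxymodels'],
  ['proxymodels', 'proxymodels'],
  ['pm', 'proxymodels'],
  ['all', 'all'],
];
const ALIASES = new Map(ALIAS_ENTRIES);

function resolveResource(input: string): string {
  const canonical = ALIASES.get(input.toLowerCase());
  if (canonical === undefined) {
    throw new Error(`unknown resource: ${input}`);
  }
  return canonical;
}
```

Listing the canonical name as its own alias (e.g. `['proxymodels', 'proxymodels']`) lets the same lookup path handle both short and full forms without a special case.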