feat(db+mcpd): Agent lifecycle + chat.service kind=virtual branch (v3 Stage 1)

Two pieces of v3 plumbing: the schema changes, plus a fix for the latent v1 chat.service bug.

Schema (db):
- Agent gains kind/providerSessionId/lastHeartbeatAt/status/inactiveSince
  mirroring Llm's v1 lifecycle. Reuses LlmKind / LlmStatus enums; no
  new types. Existing rows backfill kind=public/status=active so v1
  CRUD is unaffected.
- @@index([kind, status]) for the GC sweep, @@index([providerSessionId])
  for disconnect-cascade lookups.
- 4 new prisma-level tests cover defaults, persisting virtual fields,
  the (kind, status) GC index, and providerSessionId lookups.
  Total agent-schema tests: 20/20.
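The schema bullets above correspond roughly to this Prisma fragment (a sketch reconstructed from the message; defaults and field types are assumptions, with enum names reused from Llm's v1 lifecycle as stated):

```prisma
model Agent {
  // ...existing v1 fields (name, llmId, ownerId, ...) unchanged

  kind              LlmKind   @default(public)  // 'public' | 'virtual'; reuses Llm's enum
  status            LlmStatus @default(active)  // lifecycle state; reuses Llm's enum
  providerSessionId String?                     // set on kind=virtual rows
  lastHeartbeatAt   DateTime?
  inactiveSince     DateTime?

  @@index([kind, status])        // GC sweep: find inactive virtual agents cheaply
  @@index([providerSessionId])   // disconnect-cascade lookup
}
```

Because `kind` and `status` default to the v1-equivalent values, existing rows need no data migration beyond the backfill.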

chat.service (mcpd) — fixes the latent v1 bug:
- LlmView's kind is now plumbed through prepareContext as ctx.llmKind.
- Two new private helpers, runOneInference / streamInference, branch
  on ctx.llmKind: 'public' goes through the existing adapter
  registry, 'virtual' relays through VirtualLlmService.enqueueInferTask
  (mirrors the route-handler branch from v1 Stage 3).
- Streaming bridges VirtualLlmService's onChunk callback API to an
  async iterator via a small queue + wake pattern.
- ChatService gains an optional virtualLlms constructor parameter;
  main.ts wires it in. Older test wirings without it raise a clear
  "virtualLlms dispatcher not wired" error when the row is virtual,
  rather than silently falling through to the public path against an
  empty URL.
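The queue + wake pattern behind the streaming bridge can be sketched like this (a minimal standalone version; the name `chunksToAsyncIterable` and the `Chunk` shape are hypothetical — the real helper lives inside `streamInference` and feeds off `VirtualLlmService.onChunk`):

```typescript
type Chunk = { text: string };

// Bridge a callback-style producer (onChunk/onDone) to an async iterable.
// Chunks that arrive while the consumer is busy are buffered in `queue`;
// when the consumer catches up it parks on a promise whose resolver is
// stashed in `wake`, and the next callback wakes it.
function chunksToAsyncIterable(
  start: (onChunk: (c: Chunk) => void, onDone: () => void) => void,
): AsyncIterable<Chunk> {
  const queue: Chunk[] = [];
  let done = false;
  let wake: (() => void) | null = null;

  start(
    (c) => { queue.push(c); wake?.(); wake = null; },
    () => { done = true; wake?.(); wake = null; },
  );

  return {
    async *[Symbol.asyncIterator]() {
      for (;;) {
        while (queue.length > 0) yield queue.shift()!; // drain buffered chunks
        if (done) return;                              // producer finished
        await new Promise<void>((resolve) => { wake = resolve; });
      }
    },
  };
}
```

A single `wake` resolver suffices because one consumer awaits at most one gap at a time; each callback either appends to the buffer or unparks the waiting iterator.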

This unblocks any Agent (public OR future v3-virtual) pinned to a
kind=virtual Llm. Before this stage, such agents 502'd against the
empty url field.

Tests: 4 new chat-service-virtual-llm.test.ts cover the relay path
non-streaming, streaming, missing-dispatcher error, and rejection
surfacing. mcpd suite: 841/841 (was 833, +8 across stages 1+v3-Stage-1).
Workspace: 2054/2054 across 153 files.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Michal
2026-04-27 16:55:02 +01:00
parent 9374a2652b
commit 9afd24a3aa
7 changed files with 490 additions and 26 deletions


@@ -317,6 +317,78 @@ describe('agent / chat-thread / chat-message schema', () => {
  expect(reloaded?.defaultPersonalityId).toBeNull();
});

// ── v3: Agent.kind virtual + lifecycle fields ──

it('defaults a freshly inserted Agent to kind=public, status=active', async () => {
  const user = await makeUser();
  const llm = await makeLlm('llm-default-kind');
  const agent = await makeAgent({ name: 'fresh', llmId: llm.id, ownerId: user.id });
  expect(agent.kind).toBe('public');
  expect(agent.status).toBe('active');
  expect(agent.providerSessionId).toBeNull();
  expect(agent.lastHeartbeatAt).toBeNull();
  expect(agent.inactiveSince).toBeNull();
});

it('persists kind=virtual + lifecycle fields together', async () => {
  const user = await makeUser();
  const llm = await makeLlm('llm-pub-virtual');
  const now = new Date();
  const agent = await prisma.agent.create({
    data: {
      name: 'local-coder',
      llmId: llm.id,
      ownerId: user.id,
      kind: 'virtual',
      providerSessionId: 'sess-abc',
      lastHeartbeatAt: now,
      status: 'active',
    },
  });
  expect(agent.kind).toBe('virtual');
  expect(agent.providerSessionId).toBe('sess-abc');
  expect(agent.lastHeartbeatAt?.getTime()).toBe(now.getTime());
});

it('finds virtual agents by (kind, status) cheaply (GC sweep query)', async () => {
  const user = await makeUser();
  const llm = await makeLlm('llm-gc-agent');
  await prisma.agent.create({ data: { name: 'pub-1', llmId: llm.id, ownerId: user.id } });
  await prisma.agent.create({
    data: { name: 'v-active', llmId: llm.id, ownerId: user.id, kind: 'virtual', providerSessionId: 's1' },
  });
  await prisma.agent.create({
    data: {
      name: 'v-inactive',
      llmId: llm.id,
      ownerId: user.id,
      kind: 'virtual',
      providerSessionId: 's2',
      status: 'inactive',
      inactiveSince: new Date(),
    },
  });
  const stale = await prisma.agent.findMany({
    where: { kind: 'virtual', status: 'inactive' },
    select: { name: true },
  });
  expect(stale.map((a) => a.name)).toEqual(['v-inactive']);
});

it('finds agents by providerSessionId (used on mcplocal disconnect cascade)', async () => {
  const user = await makeUser();
  const llm = await makeLlm('llm-sess-cascade');
  await prisma.agent.create({
    data: { name: 'a', llmId: llm.id, ownerId: user.id, kind: 'virtual', providerSessionId: 'shared' },
  });
  await prisma.agent.create({
    data: { name: 'b', llmId: llm.id, ownerId: user.id, kind: 'virtual', providerSessionId: 'shared' },
  });
  await prisma.agent.create({
    data: { name: 'c', llmId: llm.id, ownerId: user.id, kind: 'virtual', providerSessionId: 'other' },
  });
  const owned = await prisma.agent.findMany({
    where: { providerSessionId: 'shared' },
    select: { name: true },
    orderBy: { name: 'asc' },
  });
  expect(owned.map((a) => a.name)).toEqual(['a', 'b']);
});

it('binds the same prompt to multiple personalities of an agent', async () => {
  const user = await makeUser();
  const llm = await makeLlm('llm-shared-prompt');