Gripr currently lacks a way for users to quickly get an overview of their data using natural language. "Ask Gripr" is a new page with a chat interface where users can ask questions such as "Give me a summary of project ABC" and receive structured, AI-generated responses based on the data they have access to. The feature must comply with existing RBAC/ABAC access controls.
Decisions made:
- The feature lives in the core API (not a separate service)

New file: services/core-api/migrations/20260307120000-create-chat-tables.js
Creates two tables in a single transaction (follows the outbox migration pattern):
chat_conversation
| Column | Type | Details |
| -- | -- | -- |
| id | UUID PK | UUIDV4 |
| domain_id | UUID NOT NULL | FK → domain(id) |
| user_id | UUID NOT NULL | FK → "user"(id) |
| title | VARCHAR(255) NULL | Auto-generated from first message |
| created_at | TIMESTAMPTZ | NOT NULL DEFAULT NOW() |
| updated_at | TIMESTAMPTZ | NOT NULL DEFAULT NOW() |
| deleted_at | TIMESTAMPTZ NULL | Soft delete |
Indexes: (domain_id), (user_id), (deleted_at)
chat_message
| Column | Type | Details |
| -- | -- | -- |
| id | UUID PK | UUIDV4 |
| domain_id | UUID NOT NULL | FK → domain(id) |
| conversation_id | UUID NOT NULL | FK → chat_conversation(id) ON DELETE CASCADE |
| role | VARCHAR(16) NOT NULL | 'user' or 'assistant' |
| content | TEXT NOT NULL | Message content |
| context_summary | JSONB NULL | Which entities were looked up (for audit) |
| token_count | INTEGER NULL | Token usage |
| model | VARCHAR(64) NULL | Which LLM model was used |
| created_at | TIMESTAMPTZ | NOT NULL DEFAULT NOW() |
| updated_at | TIMESTAMPTZ | NOT NULL DEFAULT NOW() |
Indexes: (conversation_id), (domain_id)
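A minimal sketch of the migration's `up` step covering the two tables and five indexes above. To keep the sketch self-contained, the `queryInterface` surface is narrowed to the calls used and the column definitions are placeholder strings; the real file would use Sequelize's `QueryInterface` and `DataTypes`:

```typescript
type Tx = object;

// Narrowed stand-in for Sequelize's QueryInterface (an assumption for
// illustration; only the calls the migration needs are declared).
export interface QueryInterfaceLike {
  createTable(
    name: string,
    columns: Record<string, unknown>,
    opts: { transaction: Tx },
  ): Promise<void>;
  addIndex(table: string, fields: string[], opts: { transaction: Tx }): Promise<void>;
  sequelize: { transaction<T>(fn: (t: Tx) => Promise<T>): Promise<T> };
}

// Both tables and all indexes are created inside a single transaction,
// following the outbox migration pattern mentioned above.
export async function up(queryInterface: QueryInterfaceLike): Promise<void> {
  await queryInterface.sequelize.transaction(async (transaction) => {
    await queryInterface.createTable(
      "chat_conversation",
      {
        // Placeholder column specs; real code uses DataTypes.UUID etc.
        id: "UUID PK DEFAULT UUIDV4",
        domain_id: "UUID NOT NULL REFERENCES domain(id)",
        user_id: 'UUID NOT NULL REFERENCES "user"(id)',
        title: "VARCHAR(255) NULL",
        created_at: "TIMESTAMPTZ NOT NULL DEFAULT NOW()",
        updated_at: "TIMESTAMPTZ NOT NULL DEFAULT NOW()",
        deleted_at: "TIMESTAMPTZ NULL",
      },
      { transaction },
    );
    await queryInterface.createTable(
      "chat_message",
      {
        id: "UUID PK DEFAULT UUIDV4",
        domain_id: "UUID NOT NULL REFERENCES domain(id)",
        conversation_id: "UUID NOT NULL REFERENCES chat_conversation(id) ON DELETE CASCADE",
        role: "VARCHAR(16) NOT NULL",
        content: "TEXT NOT NULL",
        context_summary: "JSONB NULL",
        token_count: "INTEGER NULL",
        model: "VARCHAR(64) NULL",
        created_at: "TIMESTAMPTZ NOT NULL DEFAULT NOW()",
        updated_at: "TIMESTAMPTZ NOT NULL DEFAULT NOW()",
      },
      { transaction },
    );
    const indexes: Array<[string, string[]]> = [
      ["chat_conversation", ["domain_id"]],
      ["chat_conversation", ["user_id"]],
      ["chat_conversation", ["deleted_at"]],
      ["chat_message", ["conversation_id"]],
      ["chat_message", ["domain_id"]],
    ];
    for (const [table, fields] of indexes) {
      await queryInterface.addIndex(table, fields, { transaction });
    }
  });
}
```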
New files:
packages/sequel-models/src/models/chat-conversation.ts
packages/sequel-models/src/models/chat-message.ts

Modify: packages/sequel-models/src/models/index.ts — export both
Follows the pattern of existing models with @Table, @Column, @ForeignKey, @BelongsTo, and @HasMany decorators. ChatConversation is paranoid: true (soft delete).
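A sketch of what chat-conversation.ts could look like in the decorator style described above. Exact base-class generics and import paths should follow the existing models; the `Domain`, `User`, and `ChatMessage` references are assumptions here:

```typescript
import {
  BelongsTo,
  Column,
  DataType,
  ForeignKey,
  HasMany,
  Model,
  Table,
} from 'sequelize-typescript';

// Import paths for Domain, User, and ChatMessage are assumptions.
@Table({ tableName: 'chat_conversation', underscored: true, paranoid: true })
export class ChatConversation extends Model {
  @ForeignKey(() => Domain)
  @Column({ type: DataType.UUID, allowNull: false })
  domainId!: string;

  @ForeignKey(() => User)
  @Column({ type: DataType.UUID, allowNull: false })
  userId!: string;

  // Auto-generated from the first message.
  @Column({ type: DataType.STRING(255), allowNull: true })
  title!: string | null;

  @BelongsTo(() => Domain)
  domain?: Domain;

  @BelongsTo(() => User)
  user?: User;

  @HasMany(() => ChatMessage)
  messages?: ChatMessage[];
}
```

With `paranoid: true`, destroy calls set `deleted_at` instead of deleting the row, matching the soft-delete column in the migration.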
```
ask-gripr.module.ts
controllers/
  ask-gripr.controller.ts
  index.ts
services/
  ask-gripr.service.ts          ← Orchestrator: conversations, messages, flow
  ask-gripr-context.service.ts  ← Retrieves and builds data context
  ask-gripr-llm.service.ts      ← OpenAI abstraction
  index.ts
dto/
  create-conversation.dto.ts
  send-message.dto.ts
  get-conversations-query.dto.ts
  index.ts
interfaces/
  llm-provider.interface.ts
  context.interface.ts
  index.ts
enum/
  chat-role.enum.ts
  index.ts
config/
  system-prompts.ts             ← System prompts for LLM
```
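The module wiring could be sketched as follows, assuming the LLM service is bound behind its interface via a NestJS injection token (the `'ILLMProvider'` token name is an assumption):

```typescript
import { Module } from '@nestjs/common';

import { AskGriprController } from './controllers';
import {
  AskGriprContextService,
  AskGriprLlmService,
  AskGriprService,
} from './services';

@Module({
  controllers: [AskGriprController],
  providers: [
    AskGriprService,
    AskGriprContextService,
    // Bind the OpenAI wrapper behind the provider interface so it can be
    // swapped out (e.g. a mock in tests, or another vendor later).
    { provide: 'ILLMProvider', useClass: AskGriprLlmService },
  ],
})
export class AskGriprModule {}
```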
LLM service (ask-gripr-llm.service.ts)

Wrapper around the OpenAI npm package that maps to an interface:
```typescript
interface ILLMProvider {
  chatCompletion(params: LLMParams): Promise<LLMResponse>;
  chatCompletionStream(params: LLMParams): AsyncIterable<{ chunk: string; done: boolean }>;
}
```
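Putting the interface first makes the OpenAI dependency swappable. A self-contained sketch (the `LLMParams`/`LLMResponse` shapes and `MockLLMProvider` are assumptions for illustration; the real service wraps the OpenAI SDK behind the same interface):

```typescript
interface LLMParams {
  messages: { role: 'system' | 'user' | 'assistant'; content: string }[];
  model?: string;
  maxTokens?: number;
}

interface LLMResponse {
  content: string;
  tokenCount: number;
  model: string;
}

// Repeated from above so the sketch compiles on its own.
interface ILLMProvider {
  chatCompletion(params: LLMParams): Promise<LLMResponse>;
  chatCompletionStream(params: LLMParams): AsyncIterable<{ chunk: string; done: boolean }>;
}

// Deterministic stand-in for tests: echoes the last user message.
class MockLLMProvider implements ILLMProvider {
  async chatCompletion(params: LLMParams): Promise<LLMResponse> {
    const last = params.messages[params.messages.length - 1];
    return {
      content: `echo: ${last.content}`,
      tokenCount: last.content.split(/\s+/).length,
      model: params.model ?? 'mock',
    };
  }

  // Streams the completion word by word, then emits a final done marker,
  // mirroring the chunked shape the controller would forward to the client.
  async *chatCompletionStream(params: LLMParams) {
    const { content } = await this.chatCompletion(params);
    for (const word of content.split(' ')) {
      yield { chunk: word + ' ', done: false };
    }
    yield { chunk: '', done: true };
  }
}
```

Consumers depend only on `ILLMProvider`, so the orchestrator service can be tested against the mock without network calls.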
Uses the OpenAI SDK with streaming (stream: true). Configuration via @nestjs/config:

- OPENAI_API_KEY
- OPENAI_MODEL (default gpt-4o)
- OPENAI_MAX_TOKENS (default 4096)

Context service (ask-gripr-context.service.ts)

Builds a structured context document for the LLM based on the user's question. Reuses existing access control patterns.
Flow:
Relevant entities are looked up via the existing SearchService.
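The context build step can be sketched as a pure function: it turns the entities the user is allowed to see into a plain-text document for the prompt, plus the audit record stored in chat_message.context_summary. The entity shape and function name are assumptions:

```typescript
interface ContextEntity {
  type: string; // e.g. "project", "task"
  id: string;
  name: string;
  fields: Record<string, string>;
}

interface BuiltContext {
  document: string;                        // injected into the LLM prompt
  summary: { type: string; id: string }[]; // persisted to context_summary (audit)
}

// Callers must pass only entities the user can access, so existing
// RBAC/ABAC checks are enforced before anything reaches the LLM.
export function buildContext(entities: ContextEntity[]): BuiltContext {
  const sections = entities.map((e) => {
    const lines = Object.entries(e.fields).map(([k, v]) => `  ${k}: ${v}`);
    return [`## ${e.type}: ${e.name} (${e.id})`, ...lines].join('\n');
  });
  return {
    document: sections.join('\n\n'),
    summary: entities.map(({ type, id }) => ({ type, id })),
  };
}
```

Keeping the builder pure makes the access-control boundary explicit and the prompt assembly unit-testable.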
In progress · Feature Request · 16 days ago · Linear