Ask Gripr - AI Chat Feature Implementation Plan

Context

Gripr currently lacks a way for users to quickly get an overview of their data using natural language. "Ask Gripr" is a new page with a chat interface where users can ask questions such as "Give me a summary of project ABC" and receive structured, AI-generated responses based on the data they have access to. The feature must comply with existing RBAC/ABAC access controls.

Decisions made:

  • New module in the core API (not a separate service)
  • SSE streaming for real-time token delivery
  • V1 scope: Projects + tasks + hours
  • Conversation history is stored in the DB
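Since SSE was chosen for token delivery, each streamed token ends up wrapped in a Server-Sent Events frame on the wire. A minimal encoder sketch (the event names `token` and `done` are illustrative, not part of the plan):

```typescript
// Encode a single Server-Sent Events frame. Per the SSE format, a frame is
// an optional "event:" line, a "data:" line, and a blank-line terminator.
// The event name 'token' below is a hypothetical example.
function sseFrame(event: string, data: unknown): string {
  return `event: ${event}\ndata: ${JSON.stringify(data)}\n\n`;
}

// Example: streaming one token chunk to the client
const frame = sseFrame("token", { chunk: "Hello" });
```

NestJS has first-class SSE support (`@Sse()` returning an observable of message events), so the controller would not need to hand-roll frames like this; the sketch only shows what reaches the browser.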

Phase 1: Database & Models

1.1 Migration

New file: services/core-api/migrations/20260307120000-create-chat-tables.js

Creates two tables in a single transaction (follows the outbox migration pattern):

chat_conversation

| Column | Type | Details |
| -- | -- | -- |
| id | UUID PK | UUIDV4 |
| domain_id | UUID NOT NULL | FK → domain(id) |
| user_id | UUID NOT NULL | FK → "user"(id) |
| title | VARCHAR(255) NULL | Auto-generated from first message |
| created_at | TIMESTAMPTZ | NOT NULL DEFAULT NOW() |
| updated_at | TIMESTAMPTZ | NOT NULL DEFAULT NOW() |
| deleted_at | TIMESTAMPTZ NULL | Soft delete |

Indexes: (domain_id), (user_id), (deleted_at)

chat_message

| Column | Type | Details |
| -- | -- | -- |
| id | UUID PK | UUIDV4 |
| domain_id | UUID NOT NULL | FK → domain(id) |
| conversation_id | UUID NOT NULL | FK → chat_conversation(id) ON DELETE CASCADE |
| role | VARCHAR(16) NOT NULL | 'user' or 'assistant' |
| content | TEXT NOT NULL | Message content |
| context_summary | JSONB NULL | Which entities were looked up (for audit) |
| token_count | INTEGER NULL | Token usage |
| model | VARCHAR(64) NULL | Which LLM model was used |
| created_at | TIMESTAMPTZ | NOT NULL DEFAULT NOW() |
| updated_at | TIMESTAMPTZ | NOT NULL DEFAULT NOW() |

Indexes: (conversation_id), (domain_id)

1.2 Sequelize Models

New files:

  • packages/sequel-models/src/models/chat-conversation.ts
  • packages/sequel-models/src/models/chat-message.ts

Modify: packages/sequel-models/src/models/index.ts — export both

Follows the pattern of existing models with @Table, @Column, @ForeignKey, @BelongsTo, and @HasMany decorators. ChatConversation is configured with paranoid: true (soft delete).
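As a sketch of the decorator pattern (column options and table options here are assumptions based on the schema above; the @ForeignKey/@BelongsTo wiring to the existing Domain and User models, and the @HasMany to ChatMessage, are omitted and would follow the current model conventions):

```typescript
import { Table, Column, Model, DataType } from 'sequelize-typescript';

// Hedged sketch of chat-conversation.ts; the exact base class and
// shared options mixin should match the existing models in sequel-models.
@Table({ tableName: 'chat_conversation', underscored: true, paranoid: true })
export class ChatConversation extends Model {
  @Column({ type: DataType.UUID, primaryKey: true, defaultValue: DataType.UUIDV4 })
  declare id: string;

  @Column({ type: DataType.UUID, allowNull: false })
  declare domainId: string;

  @Column({ type: DataType.UUID, allowNull: false })
  declare userId: string;

  @Column({ type: DataType.STRING(255), allowNull: true })
  declare title: string | null;
}
```

With `paranoid: true`, Sequelize maps destroys to the `deleted_at` timestamp instead of issuing a hard DELETE, matching the soft-delete column in the migration.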


Phase 2: Backend Module

2.1 Module Structure

services/core-api/src/modules/ask-gripr/
  ask-gripr.module.ts
  controllers/
    ask-gripr.controller.ts
    index.ts
  services/
    ask-gripr.service.ts              ← Orchestrator: conversations, messages, flow
    ask-gripr-context.service.ts      ← Retrieves and builds data context
    ask-gripr-llm.service.ts          ← OpenAI abstraction
    index.ts
  dto/
    create-conversation.dto.ts
    send-message.dto.ts
    get-conversations-query.dto.ts
    index.ts
  interfaces/
    llm-provider.interface.ts
    context.interface.ts
    index.ts
  enum/
    chat-role.enum.ts
    index.ts
  config/
    system-prompts.ts                 ← System prompts for LLM
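The NestJS wiring for this structure could look as follows (a sketch: the injection-token name and barrel imports are assumptions, not decided details):

```typescript
// ask-gripr.module.ts — hedged sketch of the module wiring
import { Module } from '@nestjs/common';
import { AskGriprController } from './controllers';
import {
  AskGriprService,
  AskGriprContextService,
  AskGriprLlmService,
} from './services';

@Module({
  controllers: [AskGriprController],
  providers: [
    AskGriprService,
    AskGriprContextService,
    // Bound behind a token so the LLM provider can be swapped later
    // (token name 'ILLMProvider' is illustrative)
    { provide: 'ILLMProvider', useClass: AskGriprLlmService },
  ],
})
export class AskGriprModule {}
```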

2.2 LLM Service (ask-gripr-llm.service.ts)

Wrapper around the OpenAI npm package that maps to an interface:

interface ILLMProvider {
  chatCompletion(params: LLMParams): Promise<LLMResponse>;
  chatCompletionStream(params: LLMParams): AsyncIterable<{ chunk: string; done: boolean }>;
}

  • Uses the OpenAI SDK with streaming (stream: true)
  • Configuration via @nestjs/config: OPENAI_API_KEY, OPENAI_MODEL (default gpt-4o), OPENAI_MAX_TOKENS (default 4096)
  • Abstracted behind an interface so that the provider can be swapped (Azure OpenAI, Anthropic, etc.)
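One way to implement chatCompletionStream is a small adapter from the SDK's raw delta stream to the `{ chunk, done }` shape of ILLMProvider. The adapter below is deliberately decoupled from the SDK (it accepts any async iterable of delta chunks), so the RawChunk shape is an assumption mirroring the OpenAI SDK's streaming response:

```typescript
// Assumed shape of one streamed SDK chunk (mirrors the OpenAI SDK's
// chat.completions streaming deltas).
type RawChunk = { choices: { delta: { content?: string } }[] };

// Adapt a raw delta stream into the ILLMProvider streaming contract:
// one { chunk, done: false } per non-empty token, then a final done marker.
async function* toProviderStream(
  raw: AsyncIterable<RawChunk>,
): AsyncIterable<{ chunk: string; done: boolean }> {
  for await (const part of raw) {
    const text = part.choices[0]?.delta?.content ?? '';
    if (text) yield { chunk: text, done: false };
  }
  yield { chunk: '', done: true };
}

// Usage with a fake stream; the real one would come from
// client.chat.completions.create({ ..., stream: true }).
async function demo(): Promise<string> {
  async function* fake(): AsyncIterable<RawChunk> {
    yield { choices: [{ delta: { content: 'Hel' } }] };
    yield { choices: [{ delta: { content: 'lo' } }] };
    yield { choices: [{ delta: {} }] }; // empty delta, e.g. finish chunk
  }
  let out = '';
  for await (const { chunk } of toProviderStream(fake())) out += chunk;
  return out;
}
```

Keeping the adapter SDK-agnostic is what makes the provider swap (Azure OpenAI, Anthropic, etc.) cheap: only the code that produces the raw stream changes.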

2.3 Context Service (ask-gripr-context.service.ts)

Builds a structured context document for the LLM based on the user’s question. Reuses existing access control patterns.

Flow:

  1. Parses the user's question to identify entities (regex + search via existing SearchService)
  2. For identified projects: fetches data the user has access to...

Status: In progress
Board: Feature Request
Date: 16 days ago
Author: Linear