
# Data Flow Architecture

How requests flow through the ChatAI Plugin system.

## Request Flow Overview

## Message Processing

### 1. Message Reception

```javascript
// apps/chat.js
async accept(e) {
  // Check if the message should trigger the AI
  if (!this.shouldTrigger(e)) return false

  // Process the message and reply
  await this.processMessage(e)
  return true
}
```

### 2. Trigger Check
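The trigger check decides whether the bot should respond at all. A minimal sketch of such a predicate — the condition names and config fields here are illustrative assumptions, not the plugin's actual API:

```javascript
// Hypothetical trigger check: respond when the bot is @-mentioned,
// when the message starts with a configured prefix, or in private chat.
function shouldTrigger(e, config = { prefix: '#chat', replyPrivate: true }) {
  if (e.atBot) return true                                   // bot was @-mentioned
  if (e.msg && e.msg.startsWith(config.prefix)) return true  // command prefix
  if (e.isPrivate && config.replyPrivate) return true        // private message
  return false
}
```

If none of the conditions match, `accept` returns early and the message never reaches the LLM.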

### 3. Context Building

```javascript
// Build conversation context
const context = await contextService.getContext(userId, groupId)

// Add system prompt
const messages = [
  { role: 'system', content: preset.systemPrompt },
  ...context.messages,
  { role: 'user', content: userMessage }
]
```

### 4. LLM Request

```javascript
// Send to LLM with tools
const response = await llmClient.sendMessage(messages, {
  tools: availableTools,
  temperature: config.temperature,
  maxTokens: config.maxTokens
})
```

### 5. Tool Execution
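When the LLM response contains tool calls, each one is dispatched to the matching tool implementation. A sketch of what a dispatcher might look like — the registry shape and message format are assumptions for illustration:

```javascript
// Hypothetical dispatcher: look up the tool by name and run it with
// the JSON arguments supplied by the model.
async function executeToolCall(toolCall, toolRegistry) {
  const tool = toolRegistry[toolCall.name]
  if (!tool) {
    // Report unknown tools back to the model instead of crashing
    return { role: 'tool', name: toolCall.name, content: `Unknown tool: ${toolCall.name}` }
  }
  const args = JSON.parse(toolCall.arguments || '{}')
  const result = await tool(args)
  // Tool results go back into the conversation as 'tool' messages
  return { role: 'tool', name: toolCall.name, content: JSON.stringify(result) }
}
```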

### 6. Response Delivery

```javascript
// Format and send response
const reply = formatResponse(response)
await e.reply(reply)

// Save to context
await contextService.addMessage(userId, groupId, {
  role: 'assistant',
  content: response.text
})
```

## Tool Call Flow
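A tool call is not a single round trip: the model may request tools, receive their results, and be called again until it produces a plain text answer. A sketch of that loop, under the assumption that `sendMessage` returns a `toolCalls` array (names here are illustrative):

```javascript
// Hypothetical request loop: keep calling the LLM until it stops
// asking for tools, feeding each tool result back into the context.
async function runWithTools(llmClient, messages, tools, maxRounds = 5) {
  for (let round = 0; round < maxRounds; round++) {
    const response = await llmClient.sendMessage(messages, { tools })
    if (!response.toolCalls || response.toolCalls.length === 0) {
      return response // plain text answer, we're done
    }
    for (const call of response.toolCalls) {
      const result = await tools[call.name](JSON.parse(call.arguments))
      messages.push({ role: 'tool', name: call.name, content: JSON.stringify(result) })
    }
  }
  throw new Error('Too many tool rounds')
}
```

Capping the number of rounds prevents a misbehaving model from looping on tool requests forever.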

## Memory Flow
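The context service referenced in steps 3 and 6 reads and writes per-conversation history. One plausible implementation is a sliding window keyed by user and group — the class below is a self-contained sketch, not the plugin's actual storage layer (which may well persist to Redis or disk):

```javascript
// Hypothetical sliding-window context store: keeps only the most
// recent N messages per (userId, groupId) conversation.
class ContextStore {
  constructor(maxMessages = 20) {
    this.maxMessages = maxMessages
    this.conversations = new Map()
  }
  key(userId, groupId) { return `${groupId ?? 'private'}:${userId}` }
  async addMessage(userId, groupId, message) {
    const k = this.key(userId, groupId)
    const history = this.conversations.get(k) ?? []
    history.push(message)
    // Drop the oldest messages once the window is full
    while (history.length > this.maxMessages) history.shift()
    this.conversations.set(k, history)
  }
  async getContext(userId, groupId) {
    return { messages: this.conversations.get(this.key(userId, groupId)) ?? [] }
  }
}
```

Bounding the window keeps the prompt within the model's token budget at the cost of forgetting older turns.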

## Error Handling
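LLM calls can fail transiently (timeouts, rate limits), so the request step typically sits behind a retry wrapper. A generic sketch with exponential backoff — the parameter names and defaults are assumptions, not the plugin's actual settings:

```javascript
// Hypothetical retry wrapper: retry transient failures with
// exponential backoff before surfacing the error to the user.
async function withRetry(fn, { retries = 3, baseDelayMs = 500 } = {}) {
  let lastError
  for (let attempt = 0; attempt < retries; attempt++) {
    try {
      return await fn()
    } catch (err) {
      lastError = err
      // Wait 500ms, 1000ms, 2000ms, ... before the next attempt
      await new Promise(r => setTimeout(r, baseDelayMs * 2 ** attempt))
    }
  }
  throw lastError
}
```

Usage would wrap the LLM request from step 4, e.g. `withRetry(() => llmClient.sendMessage(messages, options))`, so a single flaky call does not abort the whole message flow.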

## Next Steps

Released under the MIT License