
Layered Architecture Design

ChatAI Plugin adopts a three-layer architecture: Application, Service, and Core layers.

Architecture Overview

Application Layer

Location: apps/

Handles Yunzai-Bot message events and command routing.

| Module | Responsibility |
| --- | --- |
| chat.js | Main chat handler |
| Commands.js | Command processing |
| Management.js | Admin commands |
| ChatListener.js | Event listener |
| GroupEvents.js | Group event handling |

Processing Flow

A message event enters through ChatListener.accept(): the listener first checks whether the trigger conditions are met, and only then dispatches the event to the chat handler.

Code Example

```javascript
// apps/ChatListener.js
export class ChatListener extends plugin {
  constructor() {
    super({
      name: 'ChatAI-Listener',
      event: 'message',
      priority: 100
    })
  }

  async accept(e) {
    // Trigger condition check
    if (!this.shouldTrigger(e)) return false

    // Call chat handler
    return await handleChat(e)
  }
}
```
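The trigger check itself is not shown above. A minimal sketch of what `shouldTrigger` might look like follows; the `TriggerCheck` class, the command prefix, and the at-bot rule are illustrative assumptions, not the plugin's actual trigger logic:

```javascript
// Illustrative sketch only: the real trigger rules come from the plugin's config.
class TriggerCheck {
  constructor({ prefix = '#chat', respondToAt = true } = {}) {
    this.prefix = prefix            // assumed command prefix
    this.respondToAt = respondToAt  // whether @-mentions trigger a reply
  }

  // e is a simplified stand-in for a Yunzai message event
  shouldTrigger(e) {
    if (this.respondToAt && e.atBot) return true
    return typeof e.msg === 'string' && e.msg.startsWith(this.prefix)
  }
}
```

With this shape, `new TriggerCheck().shouldTrigger({ msg: '#chat hello', atBot: false })` is true, while a plain `'hello'` without an @-mention is ignored.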

Service Layer

Location: src/services/

Contains business logic and API services.

| Service | Directory | Responsibility |
| --- | --- | --- |
| LLM Service | services/llm/ | AI model calls |
| Agent Service | services/agent/ | Skills orchestration |
| Storage Service | services/storage/ | Data persistence |
| Web Service | services/webServer.js | HTTP API |
| Routes | services/routes/ | API route definitions |

ChatService

```javascript
// services/llm/ChatService.js
export class ChatService {
  async chat(options) {
    const { messages, model, tools } = options

    // 1. Get adapter
    const adapter = this.getAdapter(model)

    // 2. Build request
    const request = this.buildRequest(messages, tools)

    // 3. Send request
    const response = await adapter.chat(request)

    // 4. Handle tool calls
    if (response.toolCalls) {
      return await this.handleToolCalls(response)
    }

    return response
  }
}
```
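The four steps above can be exercised with a stub adapter. Everything in this sketch (the `EchoAdapter`, the `MiniChatService` name, the adapter-registry shape) is an illustrative assumption rather than the plugin's real wiring:

```javascript
// Stand-in adapter: pretends to be an LLM client and echoes the last user message.
class EchoAdapter {
  async chat(request) {
    const last = request.messages[request.messages.length - 1]
    return { content: `echo: ${last.content}`, toolCalls: null }
  }
}

class MiniChatService {
  constructor() {
    this.adapters = { echo: new EchoAdapter() } // assumed registry shape
  }

  getAdapter(model) {
    const adapter = this.adapters[model]
    if (!adapter) throw new Error(`No adapter registered for model: ${model}`)
    return adapter
  }

  buildRequest(messages, tools) {
    return { messages, tools: tools ?? [] }
  }

  async chat({ messages, model, tools }) {
    const adapter = this.getAdapter(model)          // 1. Get adapter
    const request = this.buildRequest(messages, tools) // 2. Build request
    const response = await adapter.chat(request)    // 3. Send request
    if (response.toolCalls) {                       // 4. Tool calls (not handled here)
      throw new Error('tool calls are out of scope for this sketch')
    }
    return response
  }
}
```

Calling `chat({ messages: [{ role: 'user', content: 'hi' }], model: 'echo' })` resolves with the stubbed echo response; an unknown model name fails fast at step 1.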

SkillsAgent

```javascript
// services/agent/SkillsAgent.js
export class SkillsAgent {
  // Get executable skills
  getExecutableSkills() {
    // Apply permission filtering
    return this.filterByPermission(this.allTools)
  }

  // Execute skill
  async execute(skillName, args) {
    // Permission check
    this.checkPermission(skillName)

    // Auto-fill parameters
    args = this.fillAutoParams(args)

    // Call MCP
    return await McpManager.callTool(skillName, args)
  }
}
```
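Permission filtering can be pictured as an allow-list intersection over the tool list. The shape of `allTools` and the allow-list in this sketch are assumptions for illustration, not the plugin's actual permission model:

```javascript
// Illustrative permission filter; real checks likely consult user/group config.
class MiniSkillsAgent {
  constructor(allTools, allowed) {
    this.allTools = allTools         // e.g. [{ name: 'search' }, { name: 'shell' }]
    this.allowed = new Set(allowed)  // skill names the current caller may run
  }

  filterByPermission(tools) {
    return tools.filter(t => this.allowed.has(t.name))
  }

  getExecutableSkills() {
    return this.filterByPermission(this.allTools)
  }

  checkPermission(skillName) {
    if (!this.allowed.has(skillName)) {
      throw new Error(`Permission denied for skill: ${skillName}`)
    }
  }
}
```

The same allow-list backs both paths: `getExecutableSkills()` hides forbidden tools from the model, and `checkPermission()` rejects a forbidden call even if the model requests it anyway.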

Core Layer

Location: src/core/, src/mcp/

Infrastructure and protocol implementations.

| Module | Responsibility |
| --- | --- |
| adapters/ | LLM client adapters |
| McpManager | Tool management |
| McpClient | MCP protocol |
| BuiltinMcpServer | Built-in tools |
| cache/ | Caching layer |
AbstractClient

```javascript
// core/adapters/AbstractClient.js
class AbstractClient {
  async sendMessage(messages, options) {
    // Abstract method: concrete adapters must override this
    throw new Error('sendMessage() must be implemented by a subclass')
  }

  parseToolCalls(response) {
    // Parse tool calls from the provider-specific response
  }
}
```
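A concrete adapter subclasses the base and fills in `sendMessage`. This self-contained sketch re-declares a minimal base so it runs on its own; the `StubClient` name and the OpenAI-style response shape are assumptions, not the plugin's actual adapter API:

```javascript
// Minimal base, mirroring the abstract class above so the snippet is standalone.
class AbstractClient {
  async sendMessage(messages, options) {
    throw new Error('sendMessage() must be implemented by a subclass')
  }

  parseToolCalls(response) {
    // Assumed OpenAI-style shape; a real adapter normalizes per provider.
    return response?.choices?.[0]?.message?.tool_calls ?? []
  }
}

// Hypothetical concrete adapter: a real one would call the provider's HTTP API.
class StubClient extends AbstractClient {
  async sendMessage(messages, options = {}) {
    return {
      choices: [{
        message: {
          content: null,
          tool_calls: [{ id: 'call_1', function: { name: 'search', arguments: '{"q":"hi"}' } }]
        }
      }]
    }
  }
}
```

Keeping `parseToolCalls` on the base lets every adapter that returns this shape share one normalization path, while providers with other shapes override it.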

Layer Dependencies

Rules:

  • Upper layers depend on lower layers
  • Lower layers never depend on upper layers
  • Cross-layer calls should go through interfaces

Module Dependencies

Extension Points

| Extension Point | Location | Description |
| --- | --- | --- |
| LLM Adapters | core/adapters/ | Support new AI models |
| Built-in Tools | src/mcp/tools/ | Add new tool categories |
| Custom Tools | data/tools/ | User-defined tools |
| API Routes | services/routes/ | Extend API endpoints |

Benefits

| Benefit | Description |
| --- | --- |
| Separation | Clear responsibility boundaries |
| Testability | Layers can be tested independently |
| Flexibility | Easy to replace implementations |
| Maintainability | Changes isolated to specific layers |

Released under the MIT License