# LLM Adapters Core

A unified interface for multiple AI model providers.
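The point of a unified interface is that calling code does not change when the provider does. A minimal sketch of that idea (the `FakeClient` and `ask` names are illustrative, not the project's actual API):

```javascript
// Sketch of provider-agnostic calling code. `FakeClient` stands in for any
// adapter that implements sendMessage(messages, options); the names here
// are illustrative, not the project's confirmed API.
class FakeClient {
  async sendMessage(messages, options = {}) {
    // A real adapter would call the provider's HTTP API here.
    return { role: 'assistant', content: `echo: ${messages.at(-1).content}` }
  }
}

// The caller depends only on the shared sendMessage(messages, options) shape,
// never on a provider SDK.
async function ask(client, text) {
  const reply = await client.sendMessage([{ role: 'user', content: text }])
  return reply.content
}

ask(new FakeClient(), 'hello').then(console.log) // → "echo: hello"
```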
## Architecture

### AbstractClient
The base class lives in `src/core/adapters/AbstractClient.js`:

```javascript
class AbstractClient {
  // Send a message to the LLM
  async sendMessage(messages, options) { }

  // Parse tool calls from the response
  parseToolCalls(response) { }

  // Convert messages to the provider's format
  convertMessages(messages) { }
}
```

## Supported Providers
| Provider | Client | Features |
|---|---|---|
| OpenAI | OpenAIClient | Tool calling, streaming, vision |
| Claude | ClaudeClient | Tool use, streaming |
| Gemini | GeminiClient | Function calling |
| DeepSeek | Uses OpenAI client | OpenAI-compatible |
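The table's last row hints at how provider selection is usually centralized: a factory maps provider names to client constructors, and an OpenAI-compatible provider such as DeepSeek reuses `OpenAIClient` with a different base URL. The factory name and registry shape below are assumptions, not the project's confirmed code:

```javascript
// Illustrative adapter factory; the registry shape and createClient name
// are assumptions, not the project's confirmed API. Stub client classes
// stand in for the real ones.
class OpenAIClient {
  constructor(opts = {}) {
    this.baseURL = opts.baseURL ?? 'https://api.openai.com/v1'
  }
}
class ClaudeClient {}
class GeminiClient {}

const registry = {
  openai: (opts) => new OpenAIClient(opts),
  claude: (opts) => new ClaudeClient(opts),
  gemini: (opts) => new GeminiClient(opts),
  // DeepSeek is OpenAI-compatible, so it reuses OpenAIClient with its own
  // base URL instead of getting a dedicated client class.
  deepseek: (opts) => new OpenAIClient({ baseURL: 'https://api.deepseek.com', ...opts }),
}

function createClient(provider, opts = {}) {
  const make = registry[provider]
  if (!make) throw new Error(`Unknown provider: ${provider}`)
  return make(opts)
}

console.log(createClient('deepseek') instanceof OpenAIClient) // → true
```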
## Message Flow
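One request cycle through an adapter combines the three `AbstractClient` methods: convert the app's messages to the provider format, send them, then parse any tool calls out of the response. The orchestration below is a hedged sketch using a stub client, not the project's actual chat service:

```javascript
// Stub adapter tracing one message-flow cycle; the provider call is faked.
class StubClient {
  // Convert the app's neutral message format to the provider's wire format.
  convertMessages(messages) {
    return messages.map(m => ({ role: m.role, content: m.content }))
  }

  // Send converted messages; a real client would call the provider API here.
  async sendMessage(messages, options = {}) {
    const wire = this.convertMessages(messages)
    return { content: `echo: ${wire.at(-1).content}`, tool_calls: [] }
  }

  // Extract any tool invocations the model requested.
  parseToolCalls(response) {
    return response.tool_calls ?? []
  }
}

async function run() {
  const client = new StubClient()
  const response = await client.sendMessage([{ role: 'user', content: 'hi' }])
  const toolCalls = client.parseToolCalls(response)
  // Empty tool calls means the response is final; otherwise the caller would
  // execute the requested tools and send the results back to the model.
  return { text: response.content, toolCalls }
}

run().then(r => console.log(r.text)) // → "echo: hi"
```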
## Creating a Custom Adapter

1. Extend `AbstractClient`
2. Implement the required methods
3. Register the client in the adapter factory
```javascript
import { AbstractClient } from './AbstractClient.js'

export class MyClient extends AbstractClient {
  async sendMessage(messages, options) {
    // Implementation
  }
}
```

## Next Steps
- Chat Service - Message handling
- MCP System - Tool integration