# Agent System
Backflow includes an AI agent system for workflows, embeddings, and LLM interactions.
## Enable Agent System

```json
{
  "agent": {
    "enabled": true,
    "embedding": {
      "provider": "openai",
      "model": "text-embedding-3-small",
      "apiKey": "{{env.OPENAI_API_KEY}}"
    },
    "vectorStore": {
      "provider": "qdrant",
      "url": "{{env.QDRANT_URL}}",
      "apiKey": "{{env.QDRANT_API_KEY}}",
      "collection": "documents"
    },
    "llm": {
      "provider": "anthropic",
      "model": "claude-sonnet-4-20250514",
      "apiKey": "{{env.ANTHROPIC_API_KEY}}"
    }
  }
}
```

## LLM Providers
### OpenAI

```json
{
  "llm": {
    "provider": "openai",
    "model": "gpt-4o",
    "apiKey": "{{env.OPENAI_API_KEY}}"
  }
}
```

### Anthropic
```json
{
  "llm": {
    "provider": "anthropic",
    "model": "claude-sonnet-4-20250514",
    "apiKey": "{{env.ANTHROPIC_API_KEY}}"
  }
}
```

### LMStudio (Local)
```json
{
  "llm": {
    "provider": "lmstudio",
    "baseUrl": "http://localhost:1234/v1",
    "model": "local-model"
  }
}
```

### Google Gemini
```json
{
  "llm": {
    "provider": "google",
    "model": "gemini-2.0-flash",
    "apiKey": "{{env.GOOGLE_API_KEY}}"
  }
}
```

## Embeddings
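Search results in this section are ranked by vector similarity. Assuming the `threshold` parameter of the search endpoint below is a cosine-similarity cutoff (an assumption — this page does not define the metric), the underlying comparison is a short sketch:

```typescript
// Cosine similarity between two embedding vectors.
// NOTE: treating the search "threshold" as a cosine cutoff is an
// assumption about Backflow's behavior, not confirmed by this page.
function cosineSimilarity(a: number[], b: number[]): number {
  if (a.length !== b.length) throw new Error("dimension mismatch");
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  // 1 for identical directions, 0 for orthogonal vectors.
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}
```

Under that reading, a `threshold` of `0.7` keeps only documents whose embeddings point in roughly the same direction as the query embedding.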
### Generate Embeddings

```bash
POST /agent/embeddings
Content-Type: application/json

{
  "texts": ["Hello world", "Another text"]
}
```

### Store Document
```bash
POST /agent/embeddings/document
Content-Type: application/json

{
  "document": "Full document text...",
  "contentType": "article",
  "metadata": {
    "source": "web",
    "url": "https://example.com"
  }
}
```

### Search Documents
```bash
POST /agent/embeddings/search
Content-Type: application/json

{
  "query": "search query",
  "limit": 10,
  "threshold": 0.7
}
```

## LLM Streaming
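Streaming chat endpoints of this kind usually deliver tokens as server-sent events. Assuming Backflow follows the common `data:` line convention with a `[DONE]` terminator (both are assumptions — the wire format is not specified on this page), a client-side sketch for pulling payloads out of a received chunk:

```typescript
// Extract "data:" payloads from a server-sent-events text chunk.
// ASSUMPTION: the stream uses the conventional SSE framing
// ("data: <payload>\n\n") and "[DONE]" as an end marker; Backflow's
// actual format should be checked against its source or responses.
function parseSSEChunk(chunk: string): string[] {
  return chunk
    .split("\n")
    .filter((line) => line.startsWith("data:"))
    .map((line) => line.slice("data:".length).trim())
    .filter((payload) => payload.length > 0 && payload !== "[DONE]");
}
```

Each extracted payload would then be JSON-parsed and its text delta appended to the visible response.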
### Stream Chat

```bash
POST /llm/stream
Content-Type: application/json

{
  "messages": [
    {"role": "user", "content": "Hello!"}
  ],
  "stream": true
}
```

### With System Prompt
```bash
POST /llm/stream
Content-Type: application/json

{
  "messages": [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What is TypeScript?"}
  ]
}
```

## Agent Endpoints
| Method | Endpoint | Description |
|---|---|---|
| GET | /agent/health | Agent system health |
| POST | /agent/embeddings | Generate embeddings |
| POST | /agent/embeddings/document | Store document |
| POST | /agent/embeddings/search | Search documents |
| POST | /agent/embeddings/batch | Batch operations |
| POST | /llm/stream | LLM streaming chat |
| GET | /llm/models | List available models |
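As a sketch of how the embedding endpoints in the table might be called from TypeScript — the `baseUrl`, helper names, and use of `fetch` are illustrative assumptions, while the paths and payload fields mirror the examples above:

```typescript
// Minimal request builders for two of the agent endpoints listed above.
// Paths and body fields come from this page; everything else (the
// AgentRequest shape, helper names, fetch-based sender) is illustrative.
type AgentRequest = { method: "GET" | "POST"; url: string; body?: unknown };

function embeddingsRequest(baseUrl: string, texts: string[]): AgentRequest {
  return { method: "POST", url: `${baseUrl}/agent/embeddings`, body: { texts } };
}

function searchRequest(
  baseUrl: string,
  query: string,
  limit = 10,
  threshold = 0.7,
): AgentRequest {
  return {
    method: "POST",
    url: `${baseUrl}/agent/embeddings/search`,
    body: { query, limit, threshold },
  };
}

// Sending a built request is then a one-liner with fetch (not executed here).
async function send(req: AgentRequest): Promise<unknown> {
  const res = await fetch(req.url, {
    method: req.method,
    headers: { "Content-Type": "application/json" },
    body: req.body === undefined ? undefined : JSON.stringify(req.body),
  });
  return res.json();
}
```

Separating request construction from transport keeps the payloads easy to unit-test without a running server.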
## MCP Tools
Connect to MCP (Model Context Protocol) servers:
```json
{
  "agent": {
    "mcpServers": [
      {
        "name": "filesystem",
        "command": "npx",
        "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path"]
      }
    ]
  }
}
```

### List Tools
```bash
GET /agent/mcp/tools
```

### Execute Tool
```bash
POST /agent/mcp/execute
Content-Type: application/json

{
  "server": "filesystem",
  "tool": "read_file",
  "args": {
    "path": "/path/to/file.txt"
  }
}
```

## Vector Store Providers
### Qdrant

```json
{
  "vectorStore": {
    "provider": "qdrant",
    "url": "https://your-cluster.qdrant.io:6333",
    "apiKey": "{{env.QDRANT_API_KEY}}",
    "collection": "documents"
  }
}
```

### Qdrant (Local)
```bash
# Start Qdrant
npm run qdrant
```

```json
{
  "vectorStore": {
    "provider": "qdrant",
    "url": "http://localhost:6333",
    "collection": "documents"
  }
}
```
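Configuration values throughout this page reference secrets as `{{env.NAME}}` placeholders. A sketch of how such substitution could be implemented — the regex, the uppercase-name convention, and the fail-fast error handling are assumptions, not Backflow's actual resolver:

```typescript
// Substitute {{env.NAME}} placeholders in a config string value.
// ASSUMPTION: names match /[A-Z0-9_]+/ and a missing variable is an
// error; Backflow's real resolution rules may differ.
function resolvePlaceholders(
  value: string,
  env: Record<string, string | undefined>,
): string {
  return value.replace(/\{\{env\.([A-Z0-9_]+)\}\}/g, (_match, name: string) => {
    const resolved = env[name];
    if (resolved === undefined) {
      throw new Error(`missing environment variable: ${name}`);
    }
    return resolved;
  });
}
```

Failing fast on a missing variable surfaces misconfiguration at startup instead of sending a literal `{{env.NAME}}` string to a provider as an API key.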