Overview

Fabi.ai provides an MCP (Model Context Protocol) server that lets you integrate Fabi’s AI data analysis capabilities directly into your development workflow or your client/interface of choice. The MCP server enables AI assistants and development tools to interact with Fabi.ai programmatically: creating threads, submitting chat requests, and saving Smartbooks. The Fabi MCP server is the fastest way to put an AI assistant in direct conversation with your data.

Authentication

The Fabi MCP server supports two authentication methods:

Token authentication

Token authentication is the recommended method for programmatic access. You can generate MCP tokens from your Fabi.ai settings:
  1. Navigate to https://app.fabi.ai/settings/mcp
  2. Generate a new MCP token
  3. Copy the token and store it securely - it is shown only once
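One way to keep the token out of files you might commit is to hold it in an environment variable and build the header value at runtime. This is a sketch; `FABI_MCP_TOKEN` is an arbitrary variable name we chose for illustration, not something Fabi requires:

```shell
# Export the token once (e.g. in your shell profile or a secret manager hook)
# instead of hard-coding it into configuration files.
export FABI_MCP_TOKEN="<your-fabi-mcp-token>"

# Build the Authorization header value from the environment at runtime.
AUTH_HEADER="Authorization: Bearer ${FABI_MCP_TOKEN}"
echo "$AUTH_HEADER"
```

Anything that later needs the header (scripts, wrappers around your MCP client) can read `$AUTH_HEADER` rather than a literal token.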

OAuth authentication

OAuth authentication is also supported for user-based integrations. Complete the OAuth flow for your application against the server URL: https://app.fabi.ai/mcp

Configuration

To connect to the Fabi MCP server, add the following configuration to your MCP client settings:
{
  "mcpServers": {
    "fabi": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "https://app.fabi.ai/mcp",
        "--header",
        "Authorization: Bearer <your-fabi-mcp-token>"
      ]
    }
  }
}
Replace <your-fabi-mcp-token> with the token you generated from the settings page.
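If your MCP client supports passing environment variables to the spawned command (many do via an `env` block), you can keep the token out of the inline arguments. Whether `${FABI_MCP_TOKEN}`-style expansion works inside `args` varies by client, so treat this as a sketch and check your client’s documentation:

```json
{
  "mcpServers": {
    "fabi": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "https://app.fabi.ai/mcp",
        "--header",
        "Authorization: Bearer ${FABI_MCP_TOKEN}"
      ],
      "env": {
        "FABI_MCP_TOKEN": "<your-fabi-mcp-token>"
      }
    }
  }
}
```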

Available tools

The Fabi MCP server provides the following tools for interacting with Fabi.ai:

Create a thread

Creates a new data analysis session (backed by a Smartbook) for SQL/Python queries. Use it to start analyzing database tables, run queries, or explore data. The thread persists your analysis history and generated code. Parameters:
  • title (optional): Title for the analysis session
Returns: Thread UUID for subsequent operations

Submit chat

Delegates a data analysis task to the Fabi autonomous agent. Pass a natural-language description of what you want; Fabi independently handles the rest: discovering data sources, using RAG to find relevant table/column schemas and semantics, generating SQL/Python code, validating queries with dry runs, executing them, and formatting results. Parameters:
  • thread_uuid: UUID of the thread from create_thread
  • message: Natural language data analysis request (e.g., ‘show top 10 customers by revenue’)
  • context_cell_uuids (optional): Previous cell UUIDs to reference in this analysis
  • context_dataframes (optional): Variable names of dataframes to use as context
Returns: Request UUID and initial status

Get chat result

Polls for the result of a long-running chat analysis request. Use this when submit_chat returns early due to timeout (after 45 seconds); the chat continues processing in the background, so call this periodically to check whether results are ready. Parameters:
  • request_uuid: UUID of the chat request (returned by submit_chat)
Returns: Processing status or completed results with data preview
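Because submit_chat can return before the analysis finishes, a small polling loop around get_chat_result is the usual pattern. Below is a minimal sketch; `callTool` is a stand-in for however your MCP client invokes tools, and the `status` values (`'processing'` vs. anything else) are assumptions based on the description above:

```javascript
// Poll get_chat_result until the request leaves the 'processing' state,
// waiting `intervalMs` between attempts and giving up after `maxAttempts`.
async function waitForChatResult(callTool, requestUuid, { intervalMs = 5000, maxAttempts = 24 } = {}) {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const result = await callTool('get_chat_result', { request_uuid: requestUuid });
    if (result.status !== 'processing') {
      return result; // completed (or failed) - hand it back to the caller
    }
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error(`Chat request ${requestUuid} still processing after ${maxAttempts} polls`);
}
```

A fixed interval is fine for a 45-second-scale task; for longer jobs you may want exponential backoff instead.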

Save to Smartbook

Saves AI-generated cells from chat history to the Smartbook for dashboard publishing or collaboration. Use it to persist, publish, or share an analysis as a dashboard: it accepts pending chat results and converts them into executable Smartbook cells. Regular analyses are already viewable in chat history and don’t need saving. Parameters:
  • thread_uuid: UUID of the thread to save
Returns: Confirmation of saved Smartbook cells

Usage examples

Basic workflow

  1. Create a thread to start a new analysis session
  2. Submit chat requests with natural language queries
  3. Get chat results to retrieve analysis outputs
  4. Save to Smartbook (optional) to persist results for sharing

Example: Analyzing sales data

// 1. Create a new thread
const thread = await mcp.callTool('fabi', 'create_thread', {
  title: 'Sales Analysis Q4 2024'
});

// 2. Submit an analysis request
const request = await mcp.callTool('fabi', 'submit_chat', {
  thread_uuid: thread.uuid,
  message: 'Show top 10 customers by revenue in Q4 2024'
});

// 3. Get the results
const results = await mcp.callTool('fabi', 'get_chat_result', {
  request_uuid: request.uuid
});

// 4. Save to Smartbook for sharing
await mcp.callTool('fabi', 'save_to_smartbook', {
  thread_uuid: thread.uuid
});

Troubleshooting

  • First, try using the AI Analyst Agent in the Fabi UI to confirm it works as expected. If it does, the problem is likely in your MCP client configuration rather than in Fabi itself
  • If using token-based authentication, make sure you’re using a valid token
  • If using a local agent, restart the agent after configuring the tools

Best practices

Token security

  • Store MCP tokens securely and never commit them to version control
  • Use environment variables or secure secret management
  • Rotate tokens regularly for enhanced security
  • Each token should only be used by one application or user

Error handling

  • Implement retry logic for get_chat_result when status is “processing”
  • Handle timeout scenarios gracefully
  • Validate thread UUIDs before making subsequent calls
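A cheap way to catch bad thread or request identifiers before they hit the server is a client-side format check. This is a sketch of our own making, not part of the Fabi API:

```javascript
// Standard UUID shape: 8-4-4-4-12 hex digits, case-insensitive.
const UUID_RE = /^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$/i;

// Throw early with a descriptive message instead of sending a malformed
// identifier to submit_chat, get_chat_result, or save_to_smartbook.
function assertUuid(value, name = 'uuid') {
  if (typeof value !== 'string' || !UUID_RE.test(value)) {
    throw new Error(`${name} is not a valid UUID: ${String(value)}`);
  }
  return value;
}
```

Calling `assertUuid(thread.uuid, 'thread_uuid')` before each tool call turns a confusing server-side failure into an immediate, local one.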

Performance

  • Use context_cell_uuids and context_dataframes to build on previous analyses
  • Only call save_to_smartbook when you need to persist results
  • Batch related queries in the same thread for better context
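Batching related queries in one thread can be sketched as follows; `callTool` is again a stand-in for your MCP client's invocation method, and the returned `uuid` fields are assumptions based on the tool descriptions above:

```javascript
// Run a series of related questions in a single thread so each request can
// build on the context accumulated by the previous ones.
async function runInThread(callTool, questions) {
  const thread = await callTool('create_thread', { title: 'Batched analysis' });
  const requests = [];
  for (const message of questions) {
    // Submit sequentially: later requests can then reference earlier results.
    requests.push(await callTool('submit_chat', { thread_uuid: thread.uuid, message }));
  }
  return { thread, requests };
}
```

Starting a fresh thread for every question would discard that shared context and force Fabi to rediscover schemas each time.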