CentralMind Gateway: AI-First Data Gateway
Introduction
Interactive Demo via GitHub Codespaces
AI agents and LLM-powered applications need fast, secure access to data, but traditional APIs and databases aren't built for this purpose. We're building an API layer that automatically generates secure, LLM-optimized APIs for your structured data.
Our solution:
- Filters out PII and sensitive data to ensure compliance with GDPR, CPRA, SOC 2, and other regulations
- Adds traceability and auditing capabilities, ensuring AI applications aren't black boxes and security teams maintain control
- Optimizes for AI workloads, supporting Model Context Protocol (MCP) with enhanced meta information to help AI agents understand APIs, along with built-in caching and security features
Our primary users are companies deploying AI agents for customer support and analytics, where models need access to data without direct SQL access to databases, eliminating security, compliance, and performance risks.
Features
- Automatic API Generation – Creates APIs automatically using an LLM, based on table schema and sampled data
- Structured Database Support – Supports PostgreSQL, MySQL, ClickHouse, and Snowflake
- Multiple Protocol Support – Provides APIs as REST or as an MCP server, including SSE mode
- API Documentation – Auto-generated Swagger documentation and OpenAPI 3.1.0 specification
- PII Protection – Implements a regex plugin or the Microsoft Presidio plugin for PII and sensitive data redaction
- Flexible Configuration – Easily extensible via YAML configuration and a plugin system
- Deployment Options – Run as a binary or Docker container, with a ready-to-use Helm chart
- Local & On-Premises – Support for self-hosted LLMs through configurable AI endpoints and models
- Row-Level Security (RLS) – Fine-grained data access control using Lua scripts
- Authentication Options – Built-in support for API keys and OAuth
- Comprehensive Monitoring – Integration with OpenTelemetry (OTel) for request tracking and audit trails
- Performance Optimization – Implements time-based and LRU caching strategies
How it Works
1. Connect & Discover
Gateway connects to your structured databases, such as PostgreSQL, and automatically analyzes the schema and data samples to generate an optimized API structure based on your prompt. The LLM is used only during the discovery stage to produce the API configuration: the tool calls an AI service (OpenAI or a compatible provider) to generate the configuration while ensuring security through PII detection.
2. Deploy
Gateway supports multiple deployment options: standalone binary, Docker container, or Kubernetes. Check our launching guide for detailed instructions. The system uses YAML configuration and plugins for easy customization.
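For a quick Docker-based trial, a command along these lines is a reasonable starting point. The image name and published port are assumptions for illustration; check the project's releases and your generated configuration for the actual values. The `start --config ... rest` arguments match the binary-mode command shown later in this document.

```bash
# Mount the generated config and start the REST server.
# Image name and port are illustrative assumptions, not confirmed values;
# this assumes the image's entrypoint is the gateway binary.
docker run --rm \
  -v "$(pwd)/gateway.yaml:/app/gateway.yaml" \
  -p 9090:9090 \
  ghcr.io/centralmind/gateway:latest \
  start --config /app/gateway.yaml rest
```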
3. Use & Integrate
Access your data through REST APIs or the Model Context Protocol (MCP) with built-in security features. Gateway integrates seamlessly with AI models and applications such as LangChain, OpenAI, and Claude Desktop using function calling, or with Cursor through MCP. You can also send telemetry to a local or remote destination in OpenTelemetry (OTel) format.
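For example, once a gateway instance is running, any HTTP client or AI agent can call the generated REST endpoints directly. The host, port, path, and query parameter below are placeholders for illustration; use the values from your own generated configuration:

```bash
# Hypothetical endpoint; substitute the path and port from your generated gateway.yaml.
curl "http://localhost:9090/some_path?limit=10"
```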
Documentation
Getting Started
Additional Resources
How to Build
```bash
# Clone the repository
git clone https://github.com/centralmind/gateway.git

# Navigate to the project directory
cd gateway

# Install dependencies
go mod download

# Build the project
go build .
```
API Generation
Gateway uses LLM models to generate your API configuration. Follow these steps:
- Create a database connection configuration file (connection.yaml):

```yaml
hosts:
  - localhost
user: "your-database-user"
password: "your-database-password"
database: "your-database-name"
port: 5432
```
- Run the discovery command:

```bash
./gateway discover \
  --config connection.yaml \
  --db-type postgres \
  --ai-api-key $OPENAI_API_KEY \
  --prompt "Generate for me awesome readonly api"
```
- Monitor the generation process:

```
INFO API Discovery Process
INFO Step 1: Read configs
INFO Step 1 completed. Done.

INFO Step 2: Discover data
INFO Discovered Tables:
INFO   - payment_dim: 3 columns, 39 rows
INFO   - fact_table: 9 columns, 1000000 rows
INFO Step 2 completed. Done.

# Additional steps and output...

INFO All steps completed. Done.

INFO --- Execution Statistics ---
INFO Total time taken: 1m10s
INFO Tokens used: 16543 (Estimated cost: $0.0616)
INFO Tables processed: 6
INFO API methods created: 18
INFO Total number of columns with PII data: 2
```
- Review the generated configuration in gateway.yaml (an illustrative filled-in endpoint follows the listing below):

```yaml
api:
  name: Awesome Readonly API
  description: ''
  version: '1.0'
database:
  type: postgres
  connection: YOUR_CONNECTION_INFO
  tables:
    - name: payment_dim
      columns: # Table columns
      endpoints:
        - http_method: GET
          http_path: /some_path
          mcp_method: some_method
          summary: Some readable summary
          description: 'Some description'
          query: SQL Query with params
          params: # Query parameters
```
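As a concrete illustration of how an endpoint maps an HTTP parameter onto a SQL query, here is a hypothetical filled-in entry. The table, path, method names, parameter fields, and the placeholder syntax inside the query are assumptions for illustration and may differ from what Gateway actually generates:

```yaml
endpoints:
  - http_method: GET
    http_path: /payments                 # hypothetical path
    mcp_method: list_payments            # hypothetical MCP method name
    summary: List payments by type
    description: 'Returns rows from payment_dim, optionally filtered by payment type'
    query: 'SELECT payment_key, payment_type FROM payment_dim WHERE payment_type = :payment_type'
    params:                              # parameter fields shown here are illustrative assumptions
      - name: payment_type
        type: string
        required: false
```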
Running the API
Binary Mode
```bash
./gateway start --config gateway.yaml rest
```
Docker Compose
```bash
docker compose -f ./example/simple/docker-compose.yml up
```
MCP Protocol Integration
Gateway implements the MCP protocol for seamless integration with Claude and other tools. For detailed setup instructions, see our Claude integration guide.
- Build the gateway binary:

```bash
go build .
```
- Add the gateway to your Claude Desktop tool configuration:

```json
{
  "mcpServers": {
    "gateway": {
      "command": "PATH_TO_GATEWAY_BINARY",
      "args": ["start", "--config", "PATH_TO_GATEWAY_YAML_CONFIG", "mcp-stdio"]
    }
  }
}
```
Roadmap
The roadmap is always subject to change and depends heavily on user feedback. At the moment, we are planning the following features:
Database and Connectivity
- Extended Database Integrations - Redshift, S3 (Iceberg and Parquet), Oracle DB, Microsoft SQL Server, Elasticsearch
- SSH Tunneling - ability to use a jump host or SSH bastion to tunnel connections
Enhanced Functionality
- Advanced Query Capabilities - complex filtering syntax and aggregation functions as parameters
- Enhanced MCP Security - API key and OAuth authentication
Platform Improvements
- Schema Management - automated schema evolution and API versioning
- Advanced Traffic Management - intelligent rate limiting and request throttling
- Write Operations Support - insert and update operations