
Commit 4405418

milos85vasic and claude committed
feat: Add messaging abstraction layer (Phase 1 - RabbitMQ/Kafka integration)
Implement comprehensive messaging abstraction layer as foundation for RabbitMQ and Kafka integration.

Core Interfaces:
- MessageBroker interface for unified broker operations
- TaskQueueBroker interface for RabbitMQ-style task queuing
- EventStreamBroker interface for Kafka-style event streaming
- Message, Task, and Event types with full functionality

Features:
- Publish/Subscribe patterns with flexible options
- Priority queues for task prioritization
- Dead letter queue support for failed messages
- Message middleware (logging, retry, timeout, circuit breaker)
- Comprehensive metrics collection
- Rate limiting and deduplication support
- In-memory fallback broker for testing/development

Files added:
- internal/messaging/broker.go - Core interfaces and Message type
- internal/messaging/options.go - Publish/Subscribe options
- internal/messaging/metrics.go - Broker metrics collection
- internal/messaging/errors.go - Error types and handling
- internal/messaging/middleware.go - Message middleware
- internal/messaging/task_queue.go - Task queue interface
- internal/messaging/event_stream.go - Event stream interface
- internal/messaging/hub.go - Unified messaging hub
- internal/messaging/inmemory/ - In-memory broker implementation

Test coverage: 54.3% (messaging), 34.5% (inmemory)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
1 parent 597251e commit 4405418

21 files changed: +8824 −0 lines changed

docs/requests/GraphQL_And_Toon.md

Lines changed: 41 additions & 0 deletions
Yes, you can combine GraphQL and TOON, and they work together through a shared framework: the Model Context Protocol (MCP). They are highly complementary but serve different purposes in the data pipeline for AI systems.

Think of it this way:

· GraphQL acts as a smart data query layer, asking for exactly what you need from various backend systems.

· TOON serves as a highly efficient data transport layer, optimizing the payload sent to the AI model to save tokens and cost.
🔗 How They Connect via MCP
The magic happens in the MCP server, which sits between your data sources (like GraphQL APIs) and the AI application (like Claude or Gemini).
Integration Pattern 1: Building an MCP Server with a GraphQL Backend
You build a custom MCP server that uses GraphQL to fetch data and TOON to format the output.

· Step 1: Build the Server: Create an MCP server (using TypeScript, Python, etc.) that defines "tools".

· Step 2: Use a GraphQL Client: Inside each tool's function, use a GraphQL client to fetch structured data from your backend API.

· Step 3: Encode to TOON: Before returning the data to the LLM, pass the JSON result through a TOON encoder (like @toon-format/toon) to create a token-efficient payload.
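The three steps can be sketched end to end. The toy encoder below implements only TOON's tabular form for a uniform array of flat objects (the real @toon-format/toon package covers the full format, including nesting and quoting rules), and the GraphQL fetch is stubbed with a literal result:

```python
import json

def toon_encode_table(key: str, rows: list[dict]) -> str:
    """Minimal TOON-style tabular encoding for a uniform list of flat objects.
    The full TOON format (nested objects, quoting, delimiters) is more involved."""
    fields = list(rows[0].keys())
    lines = [f"{key}[{len(rows)}]{{{','.join(fields)}}}:"]
    for row in rows:
        lines.append("  " + ",".join(str(row[f]) for f in fields))
    return "\n".join(lines)

# Step 2 stand-in: pretend this JSON came back from a GraphQL query.
graphql_result = {
    "users": [
        {"id": 1, "name": "Alice"},
        {"id": 2, "name": "Bob"},
    ]
}

# Step 3: encode the result before handing it to the LLM.
payload = toon_encode_table("users", graphql_result["users"])
print(payload)
# users[2]{id,name}:
#   1,Alice
#   2,Bob

# The field names appear once in the header instead of once per row,
# so the TOON text is shorter than the equivalent JSON.
print(len(payload), len(json.dumps(graphql_result)))
```

The savings grow with row count, since each additional row repeats only values, not keys.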
Integration Pattern 2: Using Existing MCP Servers
You can chain existing servers for a quick start. For example, use an Apollo MCP Server to expose GraphQL tools, and a TOON MCP Server to handle the optimization.

· TOON MCP Server: A ready-made server provides encode_toon and decode_toon tools. You can configure it in AI applications like LobeChat or the Gemini CLI to compress any JSON data, including responses from a GraphQL API.
📝 Implementation Steps
Here is a practical path to combine them:

1. Expose Your Data with GraphQL: Ensure your backend data is accessible via a GraphQL API with introspection enabled.
2. Build or Adopt an MCP Server:
   · Quick Path: Use a GraphQL-focused MCP server (like Apollo's) to expose your API as tools.
   · Custom Path: Build your own server. In its tools, call your GraphQL endpoint, then encode the result to TOON using a library.
3. Connect to an AI Client: Configure your MCP server in an AI application (e.g., Claude Desktop, Gemini CLI) that supports MCP.
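Step 3's client configuration is typically a small JSON entry. As a rough sketch, a Claude Desktop configuration registering an MCP server looks like the following (the server name "graphql-toon" is a placeholder, and the exact arguments depend on the server you run, so consult its documentation for real flags):

```json
{
  "mcpServers": {
    "graphql-toon": {
      "command": "npx",
      "args": ["@apollo/mcp-server"]
    }
  }
}
```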
⚖️ Key Considerations
· When it's best: Ideal for AI agents that frequently process large, uniform datasets (like user lists or product catalogs) from a GraphQL backend.

· Potential complexity: For highly nested, irregular GraphQL responses, TOON's token savings may be less dramatic, and the encoding step adds slight processing overhead.
In summary, you combine GraphQL and TOON by using an MCP server as the orchestrator: GraphQL fetches precise data, TOON optimizes it for the LLM, and MCP provides the standard protocol to connect everything.
If you have a specific use case or are deciding between building a custom server or using existing tools, I can offer more targeted guidance.
Lines changed: 87 additions & 0 deletions
Yes, it is possible to use GraphQL with LLM APIs, and this combination is actively being adopted as a modern pattern for building more efficient and intelligent AI applications. The benefits are significant, especially around data efficiency and developer experience, and major industry players are building and standardizing this integration today.
Here are the key benefits of this approach:

· Dramatically Reduces Token Overhead and Cost: GraphQL's ability to request only the exact data fields needed (e.g., just name and id) prevents over-fetching, reducing the amount of data processed by the LLM and lowering API costs.

· Provides Deterministic, Orchestrated Execution: A single GraphQL query can efficiently combine data from multiple backend APIs (like fetching a user's orders, loyalty status, and shipping info in one go), letting the LLM focus on reasoning instead of managing multiple API calls.

· Offers a Self-Documenting, Machine-Reasonable Interface: LLMs can use GraphQL's built-in introspection to explore the API schema autonomously, understanding available data and relationships without extensive custom documentation.

· Enhances Developer Experience: Tools like the Apollo MCP Server let you expose existing GraphQL operations as ready-to-use AI tools without writing new code, significantly speeding up development.

· Enables Natural Language to API Queries: LLMs can translate user questions like "Show me my recent orders" into precise GraphQL queries, creating intuitive, conversational interfaces for complex data.
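The first and last points can be made concrete together: a question like "Show me my recent orders" might translate into a query that selects only the fields the answer needs (the type and field names here are illustrative, not from a real schema):

```graphql
query RecentOrders {
  customer {
    name
    orders(last: 5) {
      id
      total
      placedAt
    }
  }
}
```

Everything the schema exposes but the question does not need (addresses, payment details, etc.) never enters the LLM's context.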

📈 Current Compatibility and Adoption

This integration is not a niche concept; it is a clear trend with growing enterprise and tooling support.

· Enterprise Adoption Forecast: Gartner predicted that over 50% of enterprises would use GraphQL in production by 2025, a massive increase from less than 10% in 2021. Other forecasts suggest this momentum continues, with over 60% adoption expected by 2027.

· Standardized Protocols (MCP): The Model Context Protocol (MCP), designed to standardize how LLMs connect to tools and data, aligns well with GraphQL's strengths. Major platforms like Apollo and Hygraph have released MCP servers to connect AI systems directly to GraphQL APIs.

· Industry-Wide Validation: Companies like IBM, Netflix, Meta, and Salesforce are actively using and speaking about GraphQL at industry conferences. Its evolution is now explicitly tied to supporting AI agents and workflows.
🛠️ How to Get Started Today
If you have an existing GraphQL API and want to make it accessible to LLMs, the path is straightforward:

· Expose Your Schema: Ensure your GraphQL API has introspection enabled. This allows LLMs to discover and understand your data model.

· Consider an MCP Server: For a production-ready setup, tools like the Apollo MCP Server can automatically expose your GraphQL operations as tools for AI platforms like Claude, turning your API into a suite of capabilities an LLM can use.

· Design for AI Consumption:
  · Use clear naming for types and fields (e.g., customerOrders instead of getData).
  · Provide structured, machine-readable error messages.
  · Implement query cost analysis and depth limiting to prevent overly complex queries from AI agents.

For a deeper dive, the Apollo MCP Server documentation and the detailed guide on Nordic APIs are excellent practical resources.

Would you like a more detailed look at a specific aspect, such as setting up an MCP server or designing a schema optimized for AI agents?
I will explain both setting up an MCP server and designing an AI-optimized GraphQL schema, drawing on current industry practice.

🛠️ Setting Up an MCP Server for Your GraphQL API

The Model Context Protocol (MCP) is a standard that allows AI systems like Claude to safely discover and use tools (like your APIs). Setting one up for your GraphQL API is straightforward.

Here is a comparison of the two main approaches, with a clear path to get started:

Approach 1: Use a Dedicated MCP Server (Recommended)

· Best for: Any existing GraphQL API.

· How it works: Tools like the Apollo MCP Server connect to your GraphQL endpoint, introspect its schema, and automatically expose defined operations (queries/mutations) as ready-to-use AI tools.

· Key Benefit: Zero code required for the tools themselves. You define the tools by writing standard GraphQL operations in a file, and the server handles the rest.

· Setup Path:
  1. Ensure your GraphQL API has introspection enabled.
  2. Use the Apollo MCP Server (npx @apollo/mcp-server) to point to your API.
  3. Define your tools by creating .graphql files with specific queries or mutations. The server's --introspection feature allows LLMs to explore the schema dynamically.
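Such an operation file might look like the following sketch (the operation, type, and field names are illustrative and would need to match your actual schema):

```graphql
# recent_orders.graphql — exposed by the MCP server as a tool
query GetRecentOrders($customerId: ID!) {
  customer(id: $customerId) {
    name
    orders(last: 5) {
      id
      total
    }
  }
}
```

Each named operation becomes one tool; its variables become the tool's input parameters.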

Approach 2: Build a Custom MCP Server

· Best for: Unique use cases needing custom logic or connecting non-GraphQL data sources.

· How it works: You use the MCP SDKs (in Python, TypeScript) to manually build a server. Each "tool" is a function you code that calls your backend systems.

· Key Drawback: This can require significant custom code for data fetching and response shaping, reintroducing the complexity GraphQL aims to solve.

· Setup Path: Start with the MCP SDKs and define each tool's logic manually, which involves more development overhead.
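A custom tool's core logic is just a function that runs a GraphQL operation and shapes the response. The sketch below is plain Python, deliberately independent of any MCP SDK; the endpoint, query, and field names are hypothetical, and the HTTP transport is injected so the tool can be exercised without a live server:

```python
import json
from typing import Any, Callable

# The transport is injected so the tool logic stays testable without a network.
GraphQLTransport = Callable[[str, dict], dict]

def http_transport_factory(endpoint: str) -> GraphQLTransport:
    """Build a transport that POSTs a GraphQL request to `endpoint` (stdlib only)."""
    import urllib.request

    def transport(query: str, variables: dict) -> dict:
        payload = json.dumps({"query": query, "variables": variables}).encode()
        req = urllib.request.Request(
            endpoint, data=payload, headers={"Content-Type": "application/json"}
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())

    return transport

def recent_orders_tool(customer_id: str, transport: GraphQLTransport) -> list[dict[str, Any]]:
    """Tool body: fetch a customer's recent orders and return a flat list."""
    query = """
      query RecentOrders($id: ID!) {
        customer(id: $id) { orders(last: 5) { id total } }
      }
    """
    result = transport(query, {"id": customer_id})
    if "errors" in result:
        raise RuntimeError(f"GraphQL errors: {result['errors']}")
    return result["data"]["customer"]["orders"]

# Usage with a stub transport (what a unit test would do):
def fake_transport(query: str, variables: dict) -> dict:
    return {"data": {"customer": {"orders": [{"id": "o1", "total": 42.0}]}}}

orders = recent_orders_tool("c1", fake_transport)
print(orders)  # [{'id': 'o1', 'total': 42.0}]
```

Registering this function with an MCP SDK is then a thin wrapper around it; the data fetching and response shaping above is exactly the custom code the "Key Drawback" bullet warns about.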

🧠 Designing a GraphQL Schema for AI Agents

When LLMs interact with your API, they act as a "developer" that must understand and correctly use your data graph. Your schema is their primary guide.

Core Design Principles

· Clarity Over Cleverness: Use intuitive, descriptive names for types (CustomerOrder) and fields (lastPurchaseDate). Avoid abbreviations.

· Document Generously: Use GraphQL's built-in description fields. Explain what each field and type represents; this text is fed directly to the LLM.

· Structure as a Meaningful Graph: Explicitly define relationships between types (e.g., a Customer has a list of Orders). This allows an LLM to traverse your data logically in a single query.

· Keep It Simple for Machines: AI agents work best with predictable, strongly typed structures. Avoid polymorphic types or overly complex unions unless necessary.
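The first three principles can be sketched in SDL; the types and descriptions below are illustrative, not from a real schema:

```graphql
"""A customer account and its purchase history."""
type Customer {
  id: ID!
  """Full display name shown in the storefront."""
  name: String!
  """Date of the customer's most recent purchase, ISO 8601."""
  lastPurchaseDate: String
  """Most recent orders first."""
  orders(last: Int = 10): [CustomerOrder!]!
}

"""A single completed or pending order."""
type CustomerOrder {
  id: ID!
  """Order total in the store's base currency."""
  total: Float!
}
```

The `"""…"""` descriptions are what introspection surfaces to the LLM, so they do double duty as human docs and agent prompts.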

Essential Technical Safeguards

AI agents can generate unexpected queries, so these guards are critical:

· Implement Query Depth Limiting: Prevent queries that drill too deeply into the graph (e.g., user.posts.comments.user.posts...), which can overload your backend.

· Use Query Cost Analysis: Assign complexity points to fields and limit the maximum cost per query. This protects expensive database operations.

· Enable Introspection Selectively: While introspection is vital for AI discovery, consider disabling it on public production endpoints, or gate it behind authentication, to prevent schema leakage.
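Depth limiting can be approximated without a full parser by counting brace nesting in the query text. This is a rough sketch; production servers walk the parsed AST instead, which also handles braces inside string literals correctly:

```python
def query_depth(query: str) -> int:
    """Rough selection-set depth: maximum `{` nesting in the query text."""
    depth = max_depth = 0
    for ch in query:
        if ch == "{":
            depth += 1
            max_depth = max(max_depth, depth)
        elif ch == "}":
            depth -= 1
    return max_depth

MAX_DEPTH = 4  # hypothetical limit for this sketch

shallow = "{ customer { name } }"
deep = "{ user { posts { comments { user { posts { id } } } } } }"

print(query_depth(shallow))  # 2
print(query_depth(deep))     # 6
assert query_depth(deep) > MAX_DEPTH  # the deep query would be rejected
```

A real middleware would run this check before execution and return a structured error when the limit is exceeded, so the agent can retry with a simpler query.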
📈 The Big Picture: Adoption & Strategy
This integration is now a major trend.

· Enterprise Momentum: Forecasts suggest over 60% of enterprises will use GraphQL in production by 2027, driven partly by AI needs.

· Strategic Alignment: Analysts note that GraphQL's structured graph is uniquely suited for "machine reasoning," making it a foundational layer for AI agents and the MCP ecosystem.

· Performance Proven: Empirical studies show that GraphQL for LLM-based applications significantly reduces latency and token consumption compared to traditional REST APIs, leading to lower costs and faster responses.

For implementation, the most effective path is to start with your existing GraphQL API, enable introspection, and experiment with the Apollo MCP Server to see how your operations become AI tools.

Would you like to dive deeper into a specific aspect, such as the performance data from the comparative studies or more detailed steps for writing your first GraphQL operations for MCP tools?
