One of the core design goals of @betterdata/commerce-gateway is portability. This guide explains how to build a single commerce engine that serves multiple LLMs simultaneously.

The Strategy

Instead of building separate integrations for Claude, ChatGPT, and Grok, you define your Backends and Tools once, then spin up multiple “Adapters” that translate each provider’s protocol into calls against the same gateway core.

Implementation

You can run multiple adapters within the same application:
import { MCPServer } from '@betterdata/commerce-gateway/mcp';
import { OpenAIAdapter } from '@betterdata/commerce-gateway/openai';

const backends = { products: myProductBackend };

// 1. Start the MCP server for internal use (Claude and other MCP clients)
const mcp = new MCPServer({ backends });
mcp.start();

// 2. Start an HTTP server with the OpenAI adapter for public use.
// serveHttp stands in for your app's own HTTP helper
// (e.g. Bun.serve or a thin wrapper around node:http).
const openai = new OpenAIAdapter({ backends });
serveHttp(async (req) => {
  return await openai.handleRequest(req);
});

Benefits of Multi-LLM

1. Consistency

Regardless of which AI the user talks to, the product data, pricing, and availability will be identical.

2. Lower Maintenance

When you add a new checkout feature or a reward program, you only implement it once in your OrderBackend. All connected LLMs automatically get the new capability.
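To make this concrete, here is a minimal sketch of the idea. The `OrderBackend` interface and `applyRewards` method below are hypothetical stand-ins, not the library's actual interface; the point is that the logic lives in one place and every adapter calls the same object:

```typescript
// Hypothetical backend shape -- the real interface ships with the library.
interface OrderBackend {
  applyRewards(orderTotal: number, points: number): number;
}

// Implemented once; the MCP adapter and the OpenAI adapter both call it.
const orderBackend: OrderBackend = {
  applyRewards(orderTotal, points) {
    // Example reward rule: 1 cent per point, capped at 20% of the order.
    const discount = Math.min(points * 0.01, orderTotal * 0.2);
    return Math.round((orderTotal - discount) * 100) / 100;
  },
};

console.log(orderBackend.applyRewards(100, 500)); // 5.00 discount -> 95
console.log(orderBackend.applyRewards(100, 5000)); // capped at 20 -> 80
```

Changing the reward rule here changes it for every connected LLM at once, with no adapter-side work.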

3. Future Proofing

As new LLMs (like Google’s Gemini) release tool-calling features, Better Data will add new adapters, allowing your store to expand to new ecosystems with zero code changes to your backends.

Best Practices

  • Shared Sessions: Use Redis to share the sessionId and cart across providers. This allows a user to “Start in Claude, finish in ChatGPT”.
  • Capability Discovery: Use the getCapabilities API to check whether a specific LLM supports a feature (such as streaming or image generation) before attempting to use it.
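The shared-session pattern can be sketched as follows. A `Map` stands in for Redis here so the snippet is self-contained; in production you would use a Redis client with the same `cart:{sessionId}` key scheme, and the key names and `Cart` shape below are illustrative assumptions:

```typescript
type Cart = { items: string[] };

// Stand-in for Redis. Swap for a Redis client's get/set in production.
const store = new Map<string, string>();

function saveCart(sessionId: string, cart: Cart): void {
  store.set(`cart:${sessionId}`, JSON.stringify(cart));
}

function loadCart(sessionId: string): Cart {
  const raw = store.get(`cart:${sessionId}`);
  return raw ? (JSON.parse(raw) as Cart) : { items: [] };
}

// The user adds an item while talking to Claude (MCP adapter)...
saveCart('sess-123', { items: ['sku-42'] });

// ...and the same cart is there when they continue in ChatGPT
// (OpenAI adapter), because both adapters share one session store.
const cart = loadCart('sess-123');
console.log(cart.items); // ['sku-42']
```

Because both adapters key the store by the same sessionId, the cart follows the user across providers rather than living inside any one LLM's conversation state.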