
LLM Providers

Configure which Large Language Model providers your agents use. Moxxy supports built-in providers with first-class integration and any OpenAI-compatible provider via custom registration.

Supported Providers

| Provider | Type | Models | Auth |
|---|---|---|---|
| Anthropic | Built-in (first-class) | Claude Opus, Claude Sonnet, Claude Haiku | API key or OAuth session |
| OpenAI | Built-in | GPT-4.1, o3, o4-mini | API key |
| Ollama | Built-in (local) | Any locally hosted model | None required |
| OpenAI-compatible | Custom registration | xAI/Grok, Google Gemini, DeepSeek, etc. | Base URL + API key |

Anthropic (First-Class)

Anthropic is the primary, first-class provider in Moxxy with full streaming support.

Available Models

| Model | Best For |
|---|---|
| Claude Opus | Complex reasoning, long-form analysis |
| Claude Sonnet | Balanced performance, general use |
| Claude Haiku | Fast responses, lightweight tasks |

Authentication

Anthropic supports two authentication methods:

API Key:

Set the ANTHROPIC_API_KEY environment variable:

```bash
export ANTHROPIC_API_KEY="sk-ant-xxx"
```

OAuth Session:

Moxxy supports Anthropic OAuth session tokens with automatic console token refresh. This is useful for development and testing without a static API key.
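
When both methods are configured, a client typically prefers the static key and falls back to the session token. As a rough sketch of that precedence (the `ANTHROPIC_OAUTH_TOKEN` variable name is illustrative, not a documented Moxxy setting):

```bash
# Resolve Anthropic credentials: prefer a static API key, fall back to an
# OAuth session token. ANTHROPIC_OAUTH_TOKEN is a hypothetical variable
# name used only for illustration.
anthropic_credential() {
  if [ -n "${ANTHROPIC_API_KEY:-}" ]; then
    echo "api_key:${ANTHROPIC_API_KEY}"
  elif [ -n "${ANTHROPIC_OAUTH_TOKEN:-}" ]; then
    echo "oauth:${ANTHROPIC_OAUTH_TOKEN}"
  else
    echo "none"
  fi
}

anthropic_credential   # prints which credential source would be used
```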

OpenAI

Available Models

| Model | Best For |
|---|---|
| GPT-4.1 | General use, complex tasks |
| o3 | Deep reasoning |
| o4-mini | Balanced reasoning, cost-effective |

Authentication

Set the OPENAI_API_KEY environment variable:

```bash
export OPENAI_API_KEY="sk-xxx"
```

Ollama (Local)

Run models locally with no API key required. Moxxy auto-discovers a running Ollama instance on the local machine.

Setup

1. Install Ollama from [ollama.com](https://ollama.com)
2. Pull a model:

   ```bash
   ollama pull llama3
   ```

3. Start Ollama (if not running as a service):

   ```bash
   ollama serve
   ```

Moxxy will automatically detect and connect to the local Ollama instance. No API key or additional configuration is needed.
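
Before relying on auto-discovery, you can confirm the instance is reachable yourself; `/api/tags` is Ollama's model-listing endpoint and 11434 its default port:

```bash
# Check whether a local Ollama instance is reachable on its default port
# (11434). /api/tags is Ollama's model-listing endpoint; if it answers,
# auto-discovery should find the instance too.
ollama_reachable() {
  curl -fsS http://localhost:11434/api/tags > /dev/null 2>&1
}

if ollama_reachable; then
  echo "Ollama is reachable"
else
  echo "Ollama is not running on localhost:11434"
fi
```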

OpenAI-Compatible Providers

Any provider that exposes an OpenAI-compatible API can be registered as a custom provider. This includes xAI/Grok, Google Gemini, DeepSeek, and others.

Register a Custom Provider

```bash
curl -X POST http://localhost:3000/v1/providers \
  -H "Authorization: Bearer mox_YOUR_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "provider_name",
    "base_url": "https://api.provider.com/v1",
    "api_key": "xxx",
    "models": ["model-a", "model-b"]
  }'
```
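
The same request body can be assembled from shell variables first, which makes it easier to source the key from the environment instead of typing it into your command history. All values below are placeholders; only the field names follow the registration example above:

```bash
# Build the registration payload from variables. Values are placeholders;
# field names follow the registration example above.
NAME="provider_name"
BASE_URL="https://api.provider.com/v1"
API_KEY="xxx"
MODELS='["model-a", "model-b"]'

payload=$(printf '{"name":"%s","base_url":"%s","api_key":"%s","models":%s}' \
  "$NAME" "$BASE_URL" "$API_KEY" "$MODELS")

echo "$payload"
# Then POST it:
# curl -X POST http://localhost:3000/v1/providers \
#   -H "Authorization: Bearer mox_YOUR_TOKEN" \
#   -H "Content-Type: application/json" -d "$payload"
```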

Examples

xAI / Grok:

```bash
curl -X POST http://localhost:3000/v1/providers \
  -H "Authorization: Bearer mox_YOUR_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "xai",
    "base_url": "https://api.x.ai/v1",
    "api_key": "xai-xxx",
    "models": ["grok-3"]
  }'
```

Google Gemini:

```bash
curl -X POST http://localhost:3000/v1/providers \
  -H "Authorization: Bearer mox_YOUR_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "gemini",
    "base_url": "https://generativelanguage.googleapis.com/v1beta/openai",
    "api_key": "xxx",
    "models": ["gemini-2.5-pro", "gemini-2.0-flash"]
  }'
```

DeepSeek:

```bash
curl -X POST http://localhost:3000/v1/providers \
  -H "Authorization: Bearer mox_YOUR_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "deepseek",
    "base_url": "https://api.deepseek.com/v1",
    "api_key": "xxx",
    "models": ["deepseek-chat", "deepseek-reasoner"]
  }'
```

Provider Management API

All provider operations are available through the REST API at port 3000 with Bearer token authentication (mox_ prefix).

List Installed Providers

```bash
curl http://localhost:3000/v1/providers \
  -H "Authorization: Bearer mox_YOUR_TOKEN"
```

Install / Register a Provider

```bash
curl -X POST http://localhost:3000/v1/providers \
  -H "Authorization: Bearer mox_YOUR_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "provider_name",
    "base_url": "https://api.provider.com/v1",
    "api_key": "xxx",
    "models": ["model-a", "model-b"]
  }'
```

List Available Models for a Provider

```bash
curl http://localhost:3000/v1/providers/{id}/models \
  -H "Authorization: Bearer mox_YOUR_TOKEN"
```

Environment Variables

| Variable | Description |
|---|---|
| `ANTHROPIC_API_KEY` | Anthropic API key |
| `OPENAI_API_KEY` | OpenAI API key |

Security Best Practices

API Key Storage

  • Store provider API keys using environment variables or the vault system
  • Never hardcode keys in agent personas or configuration files
  • Never log or print API keys

Key Rotation

  1. Generate a new key in the provider dashboard
  2. Update the environment variable or vault entry
  3. Restart the gateway
  4. Revoke the old key

Troubleshooting

Invalid API Key

Error: Authentication failed - check your API key
  1. Verify the key is correct and not expired
  2. Ensure the key has proper permissions
  3. Check that the correct environment variable is set

Rate Limiting

Error: Rate limit exceeded
  1. Wait and retry (Moxxy handles backoff automatically)
  2. Consider upgrading your provider plan
  3. Use a lighter model for high-volume tasks
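
Moxxy's exact backoff parameters aren't documented here, but automatic retry with exponential backoff generally behaves like this sketch (a 1-second base and 60-second cap are assumed values, not Moxxy's):

```bash
# Exponential backoff with a cap: the delay doubles per attempt
# (1, 2, 4, 8, ... seconds) up to a ceiling. The base (1 s) and cap
# (60 s) are assumed values for illustration.
backoff_delay() {
  attempt=$1
  delay=$(( 1 << attempt ))
  max=60
  if [ "$delay" -gt "$max" ]; then
    delay="$max"
  fi
  echo "$delay"
}

backoff_delay 0   # → 1
backoff_delay 3   # → 8
backoff_delay 10  # → 60 (capped)
```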

Model Not Available

Error: Model 'xxx' not found
  1. Check model name spelling
  2. Verify model is available for your account and provider
  3. For OpenAI-compatible providers, ensure the model is listed in the registration

Ollama Connection Failed

  1. Verify Ollama is running: `ollama list`
  2. Check that Ollama is listening on its default port (11434)
  3. Ensure the desired model is pulled locally
