Mistral (La Plateforme): Open and Proprietary Mistral models

Released Jun 17, 2025 · 256K context · Free

Open and Proprietary Mistral models — a free model from Mistral (La Plateforme).

Try Open and Proprietary Mistral models

Test this model directly in the playground.


One-Click Config

Optimized configs for your favorite AI tools.

Claude Code

# Claude Code works via OpenRouter's Anthropic-compatible API.
# Note: Only paid Anthropic Claude models are supported (e.g. claude-sonnet-4.6, claude-opus-4).
# Browse available Claude models at: https://openrouter.ai/models?q=anthropic

# Add to ~/.zshrc or ~/.bashrc
export OPENROUTER_API_KEY="<your-openrouter-api-key>"  # Get at https://openrouter.ai/settings/keys
export ANTHROPIC_BASE_URL="https://openrouter.ai/api"
export ANTHROPIC_AUTH_TOKEN="$OPENROUTER_API_KEY"
export ANTHROPIC_API_KEY=""  # Must be explicitly empty to avoid conflicts

# Optional: pin specific models for each role
# export ANTHROPIC_DEFAULT_SONNET_MODEL="anthropic/claude-sonnet-4.6"
# export ANTHROPIC_DEFAULT_HAIKU_MODEL="anthropic/claude-haiku-4.5"

# Then simply run: claude

Cursor

# Cursor → Settings (⚙️) → Models → Add Model
# Enter the model name exactly as shown, then fill in:
#   Override OpenAI Base URL: https://api.mistral.ai/v1
#   OpenAI API Key: <your-api-key>   # Get at https://console.mistral.ai/
# Click "Verify" to confirm the connection, then enable the model.
#
# Model name to add: Open and Proprietary Mistral models

Codex

# Add to ~/.zshrc or ~/.bashrc
export OPENAI_BASE_URL="https://api.mistral.ai/v1"
export OPENAI_API_KEY="<your-api-key>"  # Get at https://console.mistral.ai/

# Then run:
codex --model "Open and Proprietary Mistral models"

Gemini CLI

# ~/.gemini/settings.json
{
  "apiKey": "<your-api-key>",
  "model": "Open and Proprietary Mistral models"
}
# Get API key at https://console.mistral.ai/

OpenCode

// ~/.config/opencode/opencode.json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "free-llm": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Free LLM",
      "options": {
        "baseURL": "https://api.mistral.ai/v1",
        "apiKey": "<your-api-key>"
      },
      "models": {
        "Open and Proprietary Mistral models": { "name": "Open and Proprietary Mistral models" }
      }
    }
  }
}
// Get API key at https://console.mistral.ai/

Hermes

# Step 1 — Edit config.yaml
# Windows: C:\Users\<you>\AppData\Local\hermes\config.yaml
# macOS/Linux: ~/.config/hermes/config.yaml

model:
  default: Open and Proprietary Mistral models
  provider: custom
  base_url: ${CUSTOM_BASE_URL}
  api_key: ${CUSTOM_API_KEY}
  model_aliases:
    Open and Proprietary Mistral models:
      model: "Open and Proprietary Mistral models"
      provider: "custom"

# Step 2 — Edit .env (same directory as config.yaml)
# Windows: C:\Users\<you>\AppData\Local\hermes\.env
# macOS/Linux: ~/.config/hermes/.env

# ========================
# Custom API (OpenAI-compatible)
# ========================
# Get your key at https://console.mistral.ai/
# (Keep comments on their own lines — some .env parsers treat inline "#" as part of the value.)
CUSTOM_API_KEY=<your-api-key>
CUSTOM_BASE_URL=https://api.mistral.ai/v1

OpenClaw

// ~/.openclaw/openclaw.json  (JSON5 format)
{
  "agents": {
    "defaults": {
      "model": {
        "primary": "Open and Proprietary Mistral models",
      },
    },
  },
  "models": {
    "providers": {
      // Option A — Built-in provider (OpenAI, Anthropic, Google…)
      // Just add apiKey; OpenClaw handles the baseUrl automatically
      // "openai": { "apiKey": "<your-api-key>" },

      // Option B — Custom OpenAI-compatible base URL (e.g. OpenRouter, NVIDIA)
      "free-llm": {
        "baseUrl": "https://api.mistral.ai/v1",
        "apiKey": "<your-api-key>",  // Get at https://console.mistral.ai/
        "api": "openai-completions", // openai-completions | anthropic-messages | …
        "models": [
          { "id": "Open and Proprietary Mistral models", "name": "Open and Proprietary Mistral models" },
        ],
      },
    },
  },
}
// Apply: openclaw gateway restart
// Verify: openclaw doctor --fix

FAQ

Is it really free?

Yes. Open and Proprietary Mistral models is provided as part of the Mistral (La Plateforme) free tier, and no credit card is required to sign up.
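As a quick sanity check of your free-tier key, you can call the OpenAI-compatible chat completions endpoint directly. This is a minimal sketch using only the Python standard library; it assumes the `https://api.mistral.ai/v1/chat/completions` endpoint, a `MISTRAL_API_KEY` environment variable, and a `<model-id>` placeholder you should replace with the model ID shown in your Mistral console.

```python
import json
import os
import urllib.request

BASE_URL = "https://api.mistral.ai/v1"

def build_chat_request(model: str, prompt: str, api_key: str):
    """Assemble an OpenAI-compatible chat completion request.

    Returns (url, headers, body_bytes) ready for urllib.
    """
    url = f"{BASE_URL}/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, headers, json.dumps(body).encode("utf-8")

if __name__ == "__main__":
    key = os.environ.get("MISTRAL_API_KEY")
    if not key:
        print("Set MISTRAL_API_KEY first (get a key at https://console.mistral.ai/)")
    else:
        # Replace <model-id> with the model ID listed in your console.
        url, headers, data = build_chat_request("<model-id>", "Say hello", key)
        req = urllib.request.Request(url, data=data, headers=headers)
        with urllib.request.urlopen(req) as resp:
            reply = json.loads(resp.read())
            print(reply["choices"][0]["message"]["content"])
```

A non-2xx response here (e.g. 401) usually means the key is wrong; a 404 or 400 usually means the model ID placeholder was not replaced.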

How do I use it with Cursor?

Go to Cursor Settings > Models, add a custom model named "Open and Proprietary Mistral models", set the Override OpenAI Base URL to "https://api.mistral.ai/v1", and enter your Mistral API key, then click "Verify" to enable the model.