Expanding AI Model Access for WordPress Sites with OpenRouter: 400+ Models in One Provider


A few days ago, I shared my experiments with local, open-source AI models using Ollama and the WP AI Client. While Ollama offers a fantastic solution for running local models and accessing cloud-based open-source options, I found myself wanting easier access to a broader range of commercial and proprietary models as well.

That’s when I discovered OpenRouter, a service that provides unified access to over 400 AI models from various providers through a single API. Over the weekend, Matt commented on the proposal to merge the WP AI Client into Core, and mentioned OpenRouter. Naturally, I had to check it out. And then, because I can’t help myself, I fired up my AI coding agent and built another WordPress plugin to integrate it with the WP AI Client.

Introducing WP OpenRouter Provider

The WP OpenRouter Provider is a WordPress plugin that brings OpenRouter’s extensive model library to the WordPress AI Client. This means you can access models from:

  • Anthropic (Claude 3.5 Sonnet, Claude Opus, etc.)
  • OpenAI (GPT-4 Turbo, GPT-4o, etc.)
  • Meta (Llama 3.1, Llama 3.2, etc.)
  • Google (Gemini Pro, Gemini Flash, etc.)
  • Mistral (Mistral Large, Mistral Medium, etc.)
  • And many more providers

All through a single API key and unified interface.

Why OpenRouter After Ollama?

You might wonder why I built this when the Ollama provider already works great. Here’s my thinking:

Ollama excels at:

  • Running models completely locally (no API calls, full privacy)
  • Free use of open-source models
  • Offline functionality
  • Experimenting with local AI without ongoing costs

OpenRouter shines for:

  • Access to cutting-edge proprietary models (Claude 3.5, GPT-4, etc.)
  • Models that are too large to run locally
  • Production applications that need reliability and scale
  • Comparing different models without managing multiple API keys
  • Pay-as-you-go pricing with competitive rates

They’re complementary tools. Use Ollama for local experiments and privacy-sensitive work. Use OpenRouter when you need access to the latest commercial models or want to compare different providers.

Key Features

Setting up the OpenRouter provider is straightforward. Once installed and activated, the plugin offers:

Searchable Model Selection

The settings page includes a dropdown that makes it easy to find models among the 400+ options. You can:

  • Search by model name
  • Filter by provider (Anthropic, OpenAI, Meta, etc.)
  • Toggle to show only free models
  • View context length and pricing information

OpenAI-Compatible API

OpenRouter uses an OpenAI-compatible endpoint, which means the integration with the WP AI Client is seamless. If you’ve worked with OpenAI’s API before, the experience will be familiar.
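
To illustrate what "OpenAI-compatible" means in practice, here's a minimal sketch of a chat completion request against OpenRouter's endpoint using WordPress's HTTP API. The option name used to fetch the API key is hypothetical; the endpoint, request shape, and response shape follow the OpenAI chat completions format that OpenRouter implements.

```php
// A minimal sketch, not the plugin's actual code: call OpenRouter's
// OpenAI-compatible chat completions endpoint via the WordPress HTTP API.
$response = wp_remote_post(
	'https://openrouter.ai/api/v1/chat/completions',
	array(
		'timeout' => 30,
		'headers' => array(
			// 'my_openrouter_api_key' is a hypothetical option name.
			'Authorization' => 'Bearer ' . get_option( 'my_openrouter_api_key' ),
			'Content-Type'  => 'application/json',
		),
		'body'    => wp_json_encode(
			array(
				'model'    => 'anthropic/claude-3.5-sonnet',
				'messages' => array(
					array(
						'role'    => 'user',
						'content' => 'Summarize this post in one sentence.',
					),
				),
			)
		),
	)
);

if ( ! is_wp_error( $response ) ) {
	$data = json_decode( wp_remote_retrieve_body( $response ), true );
	echo esc_html( $data['choices'][0]['message']['content'] );
}
```

Swapping models is just a matter of changing the `model` string, which is exactly what makes a single unified provider so convenient.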

Attribution Headers

The plugin supports optional HTTP-Referer and X-Title headers for usage tracking. These are enabled by default and auto-populated from your WordPress settings, making it easy to track which site is generating your API usage.
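
The header names come from OpenRouter's documentation; the surrounding array below is an illustrative sketch of how they can be auto-populated from standard WordPress settings, not the plugin's literal implementation.

```php
// Sketch: OpenRouter's optional attribution headers, filled in from
// built-in WordPress settings so usage can be traced back to the site.
$headers = array(
	'Authorization' => 'Bearer ' . $api_key,
	'Content-Type'  => 'application/json',
	// Identifies the requesting site in OpenRouter's usage stats.
	'HTTP-Referer'  => home_url(),
	// Human-readable app/site name shown alongside usage.
	'X-Title'       => get_bloginfo( 'name' ),
);
```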

Smart Caching

To reduce unnecessary API calls, the plugin caches the model list for 10 minutes using WordPress transients. This makes the settings page load faster while ensuring you see updated model information.
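
The caching pattern looks roughly like this. The transient key and function name are illustrative, not the plugin's actual identifiers, but the `get_transient`/`set_transient` flow and the `/api/v1/models` endpoint are standard.

```php
// Sketch of the transient-based caching described above.
function my_get_openrouter_models() {
	$models = get_transient( 'my_openrouter_models' );

	if ( false === $models ) {
		$response = wp_remote_get( 'https://openrouter.ai/api/v1/models' );

		if ( is_wp_error( $response ) ) {
			return array();
		}

		$models = json_decode( wp_remote_retrieve_body( $response ), true );

		// Cache for 10 minutes so repeat visits to the settings page
		// don't re-fetch the full 400+ model list.
		set_transient( 'my_openrouter_models', $models, 10 * MINUTE_IN_SECONDS );
	}

	return $models;
}
```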

Getting Started

Here’s the quick setup:

  1. Clone the repo, install the dependencies, and then activate the plugin
  2. Get your API key from openrouter.ai/settings/keys
  3. Enter your API key at Settings > OpenRouter AI
  4. Select your preferred model from the dropdown
  5. Start using it with the WP AI Client

The plugin includes several WordPress filters for customization, including options to modify the base URL, filter the models list, adjust request timeouts, and customize cache duration.
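
As a sketch of how such filters are typically consumed, the snippet below uses placeholder hook names; the plugin's real filter names are in its documentation on GitHub.

```php
// Illustrative only: hook names here are hypothetical placeholders
// for the customization filters the plugin provides.
add_filter( 'wp_openrouter_request_timeout', function ( $timeout ) {
	return 60; // seconds
} );

add_filter( 'wp_openrouter_models', function ( $models ) {
	// Example: limit the settings dropdown to Anthropic models.
	return array_filter( $models, function ( $model ) {
		return 0 === strpos( $model['id'], 'anthropic/' );
	} );
} );
```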

What’s Next?

Like the Ollama provider, this is still experimental code. I’ve tested it with my WP AI Client Demo in a local WordPress installation, but I’d like to expand testing to:

  • Standalone PHP AI Client implementations
  • The AI Experiments plugin
  • More real-world use cases
  • And probably a bit of cleanup of AI-generated code patterns

I’m also curious about building tools that can intelligently route requests between local Ollama models and cloud OpenRouter models based on the task complexity. Imagine a WordPress plugin that uses a local Llama model for simple content generation but switches to Claude 3.5 Sonnet for complex reasoning tasks.
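
As a thought experiment, the routing decision could be as simple as a heuristic like this. Both the function and the complexity check are entirely hypothetical, just to make the idea concrete.

```php
// Hypothetical sketch: pick a local model for simple prompts and a
// cloud model for complex ones. The heuristic is deliberately naive.
function my_pick_model( string $prompt ): string {
	$is_complex = str_word_count( $prompt ) > 200
		|| false !== stripos( $prompt, 'analyze' );

	return $is_complex
		? 'anthropic/claude-3.5-sonnet' // via OpenRouter
		: 'llama3.1';                   // via local Ollama
}
```

A real implementation would need smarter task classification, but the provider abstraction in the WP AI Client is what makes this kind of switching possible at all.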

The Bigger Picture

Both the Ollama and OpenRouter providers demonstrate the flexibility of the WordPress AI Client’s architecture. By creating custom providers, we can connect WordPress to virtually any AI service, whether it’s running on our local machine or in the cloud.

As the WP AI Client moves toward potential inclusion in WordPress core, having a diverse ecosystem of model providers will be crucial. Developers should be able to choose the right model for their specific use case, whether that’s privacy-focused local models, cost-effective open-source options, or cutting-edge commercial models.

You can find the code and full documentation on GitHub: github.com/jonathanbossenger/wp-openrouter-provider

If you’re experimenting with AI in WordPress, I’d love to hear what you’re building and which models you’re finding most useful. Feel free to open an issue on GitHub or reach out on social media.

