Experimenting with local open source AI models using Ollama and the WP AI Client

I’ve been spending quite a bit of time digging into the WordPress AI Client and PHP AI Client as part of my research for a blog post on the WordPress Developer blog.
While working on my demo code for the post, I wondered whether it would be possible to use these clients with local AI models. At the time of this writing, Google is the only model provider that offers an API key with a free tier, and I suspect that won't last much longer.
I then remembered discovering, during one of my live streams last year, that it's possible to use local open-source models through services like Ollama. So, a couple of PHP AI Client DeepWiki searches and a few hours of AI-assisted coding later, I had created a plugin that enables both local and cloud open-source models through Ollama:
https://github.com/jonathanbossenger/wp-ollama-model-provider
It essentially creates a custom Ollama provider for the WP AI Client (which is built on top of the PHP AI Client), allowing anyone to use local open-source models through Ollama on a WordPress site running the WP AI Client plugin. Since Ollama also offers cloud-hosted open-source models, I made sure the plugin supports the Ollama Cloud service as well.
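At its core, a provider like this just has to translate requests into calls against Ollama's local HTTP API, which listens on `http://localhost:11434` by default. Here's a minimal sketch of that idea using WordPress's `wp_remote_post` — the endpoint and the `model`/`prompt`/`stream` fields come from Ollama's documented REST API, but the function names and surrounding structure are illustrative, not the plugin's actual code:

```php
<?php
/**
 * Build the request arguments for Ollama's /api/generate endpoint.
 * Setting "stream" to false makes Ollama return a single JSON object
 * instead of a stream of partial responses.
 */
function ollama_build_request( string $model, string $prompt ): array {
	return array(
		'headers' => array( 'Content-Type' => 'application/json' ),
		'timeout' => 60,
		'body'    => json_encode(
			array(
				'model'  => $model,
				'prompt' => $prompt,
				'stream' => false,
			)
		),
	);
}

/**
 * Send a prompt to a locally running Ollama instance and return the
 * generated text, or null on failure. Ollama puts the generated text
 * in the "response" field of its (non-streaming) reply.
 */
function ollama_generate( string $model, string $prompt ): ?string {
	$response = wp_remote_post(
		'http://localhost:11434/api/generate',
		ollama_build_request( $model, $prompt )
	);

	if ( is_wp_error( $response ) ) {
		return null;
	}

	$data = json_decode( wp_remote_retrieve_body( $response ), true );

	return $data['response'] ?? null;
}
```

The actual plugin, of course, has to do more than this — registering the provider with the WP AI Client, mapping its request and response formats, and handling model discovery — but the HTTP round trip above is the essential piece.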
The code needs some manual cleaning up. I’ve also only tested it using my WP AI Client Demo in a local WordPress installation. I’d like to make it work for a standalone PHP AI Client implementation and the AI Experiments plugin.
For now, though, it’s an interesting glimpse into what could be possible if the WP AI Client is merged into core.