Ollama provider for the PHP and WP AI Client packages.
Ollama provider for the PHP AI Client SDK. It works both as a Composer package with the php-ai-client package and as a WordPress plugin with the AI Client that is bundled with WordPress 7.0+.
Ollama lets you run large language models locally or remotely. It exposes an OpenAI-compatible API, which this provider uses to communicate with any model you have pulled into Ollama (Llama, Mistral, Gemma, Phi, and many more) or with any available Ollama Cloud model.
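For local use, a model must be pulled into Ollama before the provider can reach it. A minimal sketch using the standard Ollama CLI (the model name is illustrative; substitute any model from the Ollama library):

```shell
# Pull a model into the local Ollama instance (model name is illustrative).
ollama pull llama3.2

# List the models this Ollama instance has available.
ollama list
```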
- PHP 7.4+
- php-ai-client `^1.3` or WordPress 7.0+
- Ollama running locally or remotely (like Ollama Cloud)
- Upload the plugin files to `/wp-content/plugins/ai-provider-for-ollama/`.
- Activate the plugin through the 'Plugins' menu in WordPress.
- Go to Settings > Ollama to configure the host URL and see available models.
```shell
composer require fueled/ai-provider-for-ollama
```

By default, the provider connects to `http://localhost:11434`. You can change this in two ways:
- Environment variable (takes precedence): set the `OLLAMA_HOST` environment variable.
- WordPress admin: go to Settings > Ollama and enter your Ollama host URL.
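For example, the environment variable can be exported before starting PHP or your web server. A minimal sketch; the URL shown is Ollama's documented default:

```shell
# OLLAMA_HOST takes precedence over the URL saved in the WordPress admin.
export OLLAMA_HOST="http://localhost:11434"

# The provider reads this value at runtime.
echo "$OLLAMA_HOST"
```

How the variable reaches PHP depends on your setup (e.g., your web server's or php-fpm's environment configuration), so set it wherever your PHP process inherits its environment from.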
For local Ollama instances, no API key is needed. The plugin automatically registers an empty API key as a fallback.
For remote Ollama instances that require authentication (e.g., Ollama Cloud), enter the API key in the Settings > Connectors screen. If using Ollama Cloud, you also need to set your Ollama host URL in the Settings > Ollama screen to https://ollama.com.
```php
$result = wp_ai_client_prompt( 'Hello, how are you?' )
	->using_provider( 'ollama' )
	->using_system_instruction( 'You are a helpful assistant.' )
	->generate_text();
```

```php
use Fueled\AiProviderForOllama\Provider\OllamaProvider;
use WordPress\AiClient\AiClient;
use WordPress\AiClient\Providers\Http\DTO\ApiKeyRequestAuthentication;

require_once 'vendor/autoload.php';

$registry = AiClient::defaultRegistry();
$registry->registerProvider(OllamaProvider::class);
$registry->setProviderRequestAuthentication('ollama', new ApiKeyRequestAuthentication(''));

$result = AiClient::prompt('Hello!')
    ->usingProvider('ollama')
    ->generateText();
```

Active: Fueled is actively working on this project, and we expect to continue for the foreseeable future, including keeping it tested up to the most recent version of WordPress. Bug reports, feature requests, questions, and pull requests are welcome.
A complete listing of all notable changes to AI Provider for Ollama is documented in CHANGELOG.md.
Please read CODE_OF_CONDUCT.md for details on our code of conduct, CONTRIBUTING.md for details on the process for submitting pull requests to us, and CREDITS.md for a listing of maintainers, contributors, and libraries for AI Provider for Ollama.


