OpenAI compatibility · Ollama Blog
Ollama now has built-in compatibility with the OpenAI Chat Completions API, making it possible to use more tooling and applications with Ollama locally.
Setup
Start by downloading Ollama, and then pull a model such as Llama 2 or Mistral:
ollama pull llama2
Usage
cURL
To invoke Ollama’s OpenAI-compatible API endpoint, use the same OpenAI request format and change the hostname to http://localhost:11434:
curl http://localhost:11434/v1/chat/completions \
-H "Content-Type: application/json" \
-d '{
"model": "ll...
Read more at ollama.ai