Ollama now has built-in compatibility with the OpenAI Chat Completions API, making it possible to use more tooling and applications with Ollama locally.
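A minimal sketch of what this enables, using only the Python standard library: Ollama exposes an OpenAI-compatible endpoint at `http://localhost:11434/v1`, so the same Chat Completions request shape works locally. The model name `llama3` is just an example here; it assumes you have a local Ollama server running with that model pulled.

```python
import json
import urllib.request

OLLAMA_BASE = "http://localhost:11434/v1"  # Ollama's OpenAI-compatible endpoint

def build_chat_request(model: str, prompt: str) -> dict:
    # Same request body shape the OpenAI Chat Completions API uses.
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def chat(model: str, prompt: str) -> str:
    # POST to the local Ollama server; requires it to be running.
    body = json.dumps(build_chat_request(model, prompt)).encode()
    req = urllib.request.Request(
        f"{OLLAMA_BASE}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    return data["choices"][0]["message"]["content"]

# Example usage (with a running server): chat("llama3", "Say hello in one word.")
```

Because the request and response shapes match OpenAI's, existing OpenAI client libraries can typically be pointed at this local base URL instead.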
Great news, given that more and more powerful local LLMs are coming out. Still waiting on embeddings compatibility for my use case.