# Ollama

Run models locally with Ollama and connect to Nexus, no API key required.

Ollama runs models locally and exposes an OpenAI-compatible API, so no API key is needed.
## Installation

```go
import "github.com/xraph/nexus/providers/ollama"
```

## Quick Start
```go
provider := ollama.New()

gw := nexus.New(
	nexus.WithProvider(provider),
)
```

No API key is needed. Ollama connects to `http://localhost:11434/v1` by default.
## Options

| Option | Description |
|---|---|
| `ollama.WithBaseURL(url)` | Override the API base URL (default: `http://localhost:11434/v1`) |
## Capabilities
| Capability | Supported |
|---|---|
| Chat | Yes |
| Streaming | Yes |
| Embeddings | Yes |
| Vision | Yes |
| Tools | Yes |
| Thinking | No |
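When streaming is enabled, the OpenAI-compatible endpoint emits server-sent events: each line is `data: <json chunk>` carrying a content delta, and the stream ends with `data: [DONE]`. A minimal parser for that framing, using a hardcoded sample stream for illustration (the sample is fabricated, not captured from a real Ollama response):

```go
package main

import (
	"bufio"
	"encoding/json"
	"fmt"
	"strings"
)

// chunk holds the only field this sketch cares about: the
// incremental content delta in each streamed event.
type chunk struct {
	Choices []struct {
		Delta struct {
			Content string `json:"content"`
		} `json:"delta"`
	} `json:"choices"`
}

// collectDeltas scans an SSE body line by line, concatenating
// content deltas until the [DONE] sentinel.
func collectDeltas(stream string) string {
	var out strings.Builder
	sc := bufio.NewScanner(strings.NewReader(stream))
	for sc.Scan() {
		line := strings.TrimSpace(sc.Text())
		if !strings.HasPrefix(line, "data: ") {
			continue
		}
		payload := strings.TrimPrefix(line, "data: ")
		if payload == "[DONE]" {
			break
		}
		var c chunk
		if err := json.Unmarshal([]byte(payload), &c); err != nil {
			continue // skip malformed events
		}
		for _, ch := range c.Choices {
			out.WriteString(ch.Delta.Content)
		}
	}
	return out.String()
}

func main() {
	sample := "data: {\"choices\":[{\"delta\":{\"content\":\"Hel\"}}]}\n" +
		"data: {\"choices\":[{\"delta\":{\"content\":\"lo\"}}]}\n" +
		"data: [DONE]\n"
	fmt.Println(collectDeltas(sample)) // prints "Hello"
}
```

In practice the gateway handles this parsing for you; the sketch only shows what crosses the wire.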
## Models

| Model | Context | Max Output | Price |
|---|---|---|---|
| `llama3.1:8b` | 131K | 4,096 | Free (local) |
| `llama3.1:70b` | 131K | 4,096 | Free (local) |
| `mistral:7b` | 32,768 | 4,096 | Free (local) |
| `nomic-embed-text` | 8,192 | — | Free (local) |