Ollama

Run models locally with Ollama and connect to Nexus — no API key required.

Ollama serves models on your own machine through an OpenAI-compatible API, so Nexus can talk to it like any hosted provider.

Installation

import "github.com/xraph/nexus/providers/ollama"
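If the provider package is not yet in your module, fetch it first (standard `go get`, assuming a Go modules project):

```shell
go get github.com/xraph/nexus/providers/ollama
```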

Quick Start

provider := ollama.New()

gw := nexus.New(
    nexus.WithProvider(provider),
)

No API key is needed. Ollama connects to http://localhost:11434/v1 by default.

Options

| Option | Description |
| --- | --- |
| `ollama.WithBaseURL(url)` | Override the API base URL (default: `http://localhost:11434/v1`) |
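For example, to point the provider at an Ollama instance running on another machine (the host address below is a placeholder):

```go
// Replace with your Ollama server's address — shown here as a placeholder.
provider := ollama.New(
    ollama.WithBaseURL("http://192.168.1.50:11434/v1"),
)
```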

Capabilities

| Capability | Supported |
| --- | --- |
| Chat | Yes |
| Streaming | Yes |
| Embeddings | Yes |
| Vision | Yes |
| Tools | Yes |
| Thinking | No |

Models

| Model | Context | Max Output | Price |
| --- | --- | --- | --- |
| llama3.1:8b | 131K | 4,096 | Free (local) |
| llama3.1:70b | 131K | 4,096 | Free (local) |
| mistral:7b | 32,768 | 4,096 | Free (local) |
| nomic-embed-text | 8,192 | | Free (local) |
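A model must be pulled locally before Ollama can serve it. This uses the standard Ollama CLI, for example:

```shell
ollama pull llama3.1:8b
```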
