ollama.client

Module providing a D language binding for the Ollama REST API.

This module defines the OllamaClient class, which facilitates interaction with an Ollama server for tasks such as text generation, chat interactions, model management, embeddings, and tool calling. It supports both native Ollama endpoints and OpenAI-compatible endpoints, using std.net.curl for HTTP requests and std.json for JSON processing.

Members

Aliases

StreamCallback
alias StreamCallback = void delegate(JSONValue chunk) @safe;

Callback type used by the streaming methods. Receives one fully-parsed NDJSON chunk per call; chunk["done"] is true on the final chunk.
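As a sketch of how such a callback might be consumed (the streaming method name `generateStream` and its signature are assumptions for illustration, not part of the documented API above):

```d
import ollama.client;
import std.json;
import std.stdio;

void main() {
    auto client = new OllamaClient();

    // One fully-parsed NDJSON chunk arrives per invocation.
    StreamCallback onChunk = (JSONValue chunk) @safe {
        if ("response" in chunk)
            write(chunk["response"].str);       // append partial text
        if ("done" in chunk && chunk["done"].boolean)
            writeln();                          // final chunk: chunk["done"] is true
    };

    // Hypothetical streaming call; exact method name is an assumption.
    client.generateStream("llama3", "Tell me a joke.", onChunk);
}
```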

Classes

OllamaClient
class OllamaClient

A client class for interacting with the Ollama REST API.
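A minimal construction sketch. The no-argument constructor is shown in the module example below; the host-accepting overload is an assumption based on the presence of DEFAULT_HOST:

```d
import ollama.client;

void main() {
    // Connect to the default local server (DEFAULT_HOST)...
    auto client = new OllamaClient();

    // ...or, assuming the constructor accepts a host URL, to a remote one.
    auto remote = new OllamaClient("http://192.168.1.10:11434");
}
```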

Manifest constants

DEFAULT_HOST
enum DEFAULT_HOST;

Default host URL for the Ollama server; a stock Ollama install listens on `http://localhost:11434`.

Structs

Message
struct Message

Represents a single message in a chat interaction.
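A hedged sketch of building a conversation and passing it to chat(). The (role, content) constructor order and the shape of the returned JSON are assumptions drawn from the Ollama REST API's message format:

```d
import ollama.client;
import std.stdio;

void main() {
    auto client = new OllamaClient();

    // Field order (role, content) is assumed here.
    Message[] messages = [
        Message("system", "You are a concise assistant."),
        Message("user", "Explain NDJSON in one sentence."),
    ];

    auto reply = client.chat("llama3", messages);
    // The Ollama chat endpoint nests the text under message.content.
    writeln(reply["message"]["content"].str);
}
```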

OllamaOptions
struct OllamaOptions

Typed options for controlling model generation behavior.
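A sketch of supplying options to a generation call. The field names mirror Ollama's generation parameters (`temperature`, `num_predict`), and the generate() overload taking options is an assumption:

```d
import ollama.client;
import std.stdio;

void main() {
    auto client = new OllamaClient();

    // Field names are assumed to follow Ollama's option names.
    OllamaOptions opts;
    opts.temperature = 0.2;   // low temperature: more deterministic output
    opts.num_predict = 128;   // cap the number of generated tokens

    auto response = client.generate("llama3", "Summarize D's slices.", opts);
    writeln(response["response"].str);
}
```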

Tool
struct Tool

A tool (function) definition passed to chat() to enable tool/function calling.

ToolCall
struct ToolCall

Represents a tool/function call made by the model in a chat response.

ToolFunction
struct ToolFunction

Function schema for tool/function calling definitions.
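Putting the three tool structs together, a hedged sketch of declaring a function tool and passing it to chat(). The field names, the ToolFunction constructor, and the chat() overload accepting tools are all assumptions modeled on the Ollama tool-calling API shape:

```d
import ollama.client;
import std.json;
import std.stdio;

void main() {
    auto client = new OllamaClient();

    // Hypothetical field names; the JSON Schema string follows the
    // parameters format used by the Ollama tool-calling API.
    Tool weatherTool;
    weatherTool.type = "function";
    weatherTool.func = ToolFunction(
        "get_weather",
        "Get the current weather for a city",
        parseJSON(`{"type":"object","properties":{"city":{"type":"string"}},"required":["city"]}`)
    );

    auto reply = client.chat("llama3",
        [Message("user", "What's the weather in Paris?")],
        [weatherTool]);

    // Instead of text, the model may answer with one or more tool calls.
    if ("tool_calls" in reply["message"])
        writeln(reply["message"]["tool_calls"]);
}
```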

Examples

import ollama.client;
import std.stdio;

void main() {
    auto client = new OllamaClient();   // connects to DEFAULT_HOST
    auto response = client.generate("llama3", "What is the weather like?");
    writeln(response["response"].str);  // print the generated text
}
