ballerinax/ai.ollama Ballerina library

1.0.1

Overview

This module offers APIs for connecting with Ollama Large Language Models (LLMs).

Prerequisites

Ensure that your Ollama server is running locally (for example, by executing `ollama serve`) and that the model you intend to use has been pulled (for example, with `ollama pull <model-name>`) before using this module in your Ballerina application.

Quickstart

To use the ai.ollama module in your Ballerina application, update the .bal file as follows:

Step 1: Import the module

Import the ai.ollama module.

import ballerinax/ai.ollama;

Step 2: Initialize the Model Provider

Here's how to initialize the Model Provider:

import ballerina/ai;
import ballerinax/ai.ollama;

final ai:ModelProvider ollamaModel = check new ollama:ModelProvider("ollamaModelName");
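If your Ollama server is not listening on the default endpoint, the provider can be pointed at a custom URL. Note that the parameter name `serviceUrl`, the default endpoint `http://localhost:11434`, and the model name `llama3.2` below are illustrative assumptions, not confirmed by this page:

```ballerina
import ballerina/ai;
import ballerinax/ai.ollama;

// Assumed: the provider accepts a service URL for the local Ollama server.
// Parameter name and default endpoint shown here are assumptions.
final ai:ModelProvider ollamaModel = check new ollama:ModelProvider(
    "llama3.2",
    serviceUrl = "http://localhost:11434"
);
```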

Step 3: Invoke chat completion

ai:ChatMessage[] chatMessages = [{role: "user", content: "hi"}];
ai:ChatAssistantMessage response = check ollamaModel->chat(chatMessages, tools = []);

chatMessages.push(response);
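Putting the steps together, a minimal complete program might look like the following sketch. The model name `llama3.2` and the use of `ballerina/io` for printing the reply are illustrative assumptions:

```ballerina
import ballerina/ai;
import ballerina/io;
import ballerinax/ai.ollama;

// Model name is illustrative; use any model pulled into your Ollama server.
final ai:ModelProvider ollamaModel = check new ollama:ModelProvider("llama3.2");

public function main() returns error? {
    ai:ChatMessage[] chatMessages = [{role: "user", content: "hi"}];
    ai:ChatAssistantMessage response = check ollamaModel->chat(chatMessages, tools = []);

    // Append the assistant reply so the conversation history is preserved
    // for any follow-up chat calls.
    chatMessages.push(response);
    io:println(response?.content);
}
```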

Metadata

Released date: 4 days ago

Version: 1.0.1

License: Apache-2.0


Compatibility

Platform: java21

Ballerina version: 2201.12.0

GraalVM compatible: Yes


Pull count

Total: 230

Current version: 106


Keywords

AI

Agent

Ollama

Model

Provider
