Module ai.mistral

ballerinax/ai.mistral Ballerina library

1.0.0

Overview

This module offers APIs for connecting with Mistral AI Large Language Models (LLMs).

Prerequisites

Before using this module in your Ballerina application, you must first obtain the necessary configuration, such as a Mistral AI API key, to access the LLM.
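
As an illustration, the API key can be supplied to the program as a Ballerina configurable value rather than being hardcoded. This is a minimal sketch; the variable name mistralApiKey is only a placeholder.

// Supplied at runtime, for example via Config.toml.
// The variable name is illustrative.
configurable string mistralApiKey = ?;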

Quickstart

To use the ai.mistral module in your Ballerina application, update the .bal file as follows:

Step 1: Import the module

Import the ai.mistral module.

import ballerinax/ai.mistral;

Step 2: Initialize the Model Provider

Here's how to initialize the Model Provider:

import ballerina/ai;
import ballerinax/ai.mistral;

final ai:ModelProvider mistralModel = check new mistral:ModelProvider("mistralApiKey", mistral:MINISTRAL_3B_2410);

Step 3: Invoke chat completion

ai:ChatMessage[] chatMessages = [{role: "user", content: "hi"}];
ai:ChatAssistantMessage response = check mistralModel->chat(chatMessages, tools = []);

chatMessages.push(response);
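
For reference, the steps above can be combined into a single runnable program. This is a minimal end-to-end sketch; the second user message and the two-turn flow are illustrative additions, and the API key value is a placeholder.

import ballerina/ai;
import ballerinax/ai.mistral;

// Initialize the model provider as in Step 2; replace the key with your own.
final ai:ModelProvider mistralModel = check new mistral:ModelProvider("mistralApiKey", mistral:MINISTRAL_3B_2410);

public function main() returns error? {
    // Start the conversation with a user message.
    ai:ChatMessage[] chatMessages = [{role: "user", content: "hi"}];

    // First turn: request a chat completion.
    ai:ChatAssistantMessage response = check mistralModel->chat(chatMessages, tools = []);

    // Keep the assistant reply in the history so later calls see the full context.
    chatMessages.push(response);

    // Second turn: send a follow-up question with the accumulated history.
    chatMessages.push({role: "user", content: "What can you help me with?"});
    ai:ChatAssistantMessage followUp = check mistralModel->chat(chatMessages, tools = []);
    chatMessages.push(followUp);
}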



Metadata

Released date: 19 days ago

Version: 1.0.0

License: Apache-2.0


Compatibility

Platform: java21

Ballerina version: 2201.12.0

GraalVM compatible: Yes


Pull count

Total: 70

Current version: 34


Keywords

AI

Agent

Mistral

Model

Provider

