Module mistral

ballerinax/mistral Ballerina library

1.0.2

Overview

Mistral AI is a research lab focused on developing high-performance, open-weights large language models (LLMs). It provides developers and businesses with powerful APIs and tools to build innovative applications using both free and commercial models.

The Mistral connector offers APIs to connect and interact with the endpoints of the Mistral AI API, enabling seamless integration with Mistral's language models.

Key Features

  • Support for chat completions and text generation
  • Integration with high-performance Mistral LLMs
  • Secure communication with API key authentication
  • Flexible model parameters and prompt management

Setup guide

To use the Mistral AI connector, you must have access to the Mistral AI API through a Mistral AI account and an active API key. If you do not have a Mistral AI account, you can sign up for one on the Mistral AI platform.

Create a Mistral AI API key

  1. Visit the Mistral AI platform, head to the Mistral AI console dashboard, and sign up to get started.

  2. Navigate to the API Keys panel.

  3. Choose a plan based on your requirements.

  4. Proceed to create a new API key.

  5. Enter the necessary details as prompted and click Create new key.

  6. Copy the API key and store it securely.

Quickstart

To use the Mistral connector in your Ballerina application, update the .bal file as follows:

Step 1: Import the module

Import the ballerinax/mistral module.

import ballerinax/mistral;

Step 2: Create a new connector instance

Create a mistral:Client with the obtained API key and initialize the connector.

configurable string token = ?;

mistral:Client mistralClient = check new (
    config = {auth: {token: token}}
);
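Since token is declared as a configurable variable, its value can be supplied through a Config.toml file in the package root; Ballerina reads configurable values from this file at startup. A minimal sketch (the placeholder key is yours to fill in):

```toml
token = "<YOUR_MISTRAL_API_KEY>"
```

Keeping the key in Config.toml (and out of source control) avoids hard-coding credentials in the .bal file.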

Step 3: Invoke the connector operation

Now, you can utilize available connector operations.

Generate a response for a given message:

mistral:ChatCompletionRequest request = {
    model: "mistral-small-latest",
    messages: [
        {
            role: "user",
            content: "What is the capital of France?"
        }
    ]
};

mistral:ChatCompletionResponse response = check mistralClient->/chat/completions.post(request);
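The generated text can then be read from the response. A minimal sketch, assuming the response record mirrors the Mistral chat completion schema (a `choices` array whose first entry carries an assistant `message` with `content`; the exact record shape and field optionality may differ between connector versions):

```ballerina
import ballerina/io;

// After obtaining `response` as shown above:
// `choices[0].message.content` follows the Mistral chat completion
// schema; the field names here are assumptions about the generated
// record types, not a guaranteed connector API.
io:println(response.choices[0].message?.content);
```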

Step 4: Run the Ballerina application

Execute the command below to run the Ballerina application:

bal run

Metadata

Version: 1.0.2

License: Apache-2.0


Compatibility

Platform: any

Ballerina version: 2201.12.0

GraalVM compatible: Yes


Keywords

AI/LLM

Cost/Freemium

Vendor/Mistral

Area/AI & Machine Learning

Type/Connector
