sumudunissanka/ai.googleapis.vertex

1.5.1

Overview

This module offers APIs for connecting with models hosted on Google Vertex AI, including Google Gemini models and partner models from Anthropic, Mistral, Meta, DeepSeek, Qwen, Kimi, and MiniMax, available through the Vertex AI Model Garden.

Prerequisites

Before using this module in your Ballerina application, you must have a Google Cloud project with Vertex AI enabled.
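The Google Cloud CLI commands below sketch a typical setup for the prerequisites above. This is a minimal sketch assuming the gcloud CLI is installed; your-gcp-project-id is a placeholder for your own project:

```shell
# Enable the Vertex AI API for the project.
gcloud services enable aiplatform.googleapis.com --project=your-gcp-project-id

# Create application-default credentials for local development
# (written to ~/.config/gcloud/application_default_credentials.json,
# which is where the OAuth2 refresh token values in Option 1 come from).
gcloud auth application-default login
```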

Quickstart

To use the ai.googleapis.vertex module in your Ballerina application, update the .bal file as follows:

Step 1: Import the module

Import the ai.googleapis.vertex module.

import ballerinax/ai.googleapis.vertex;

Step 2: Initialize the Model Provider

Three authentication options are supported:

Option 1 — OAuth2 refresh token (recommended for development; credentials from ~/.config/gcloud/application_default_credentials.json):

import ballerina/ai;
import ballerinax/ai.googleapis.vertex;

final ai:ModelProvider vertexModel = check new vertex:ModelProvider(
    auth = {
        clientId: "your-client-id",
        clientSecret: "your-client-secret",
        refreshToken: "your-refresh-token"
    },
    projectId = "your-gcp-project-id",
    location = "us-central1",
    model = "google/gemini-2.0-flash"
);

Option 2 — Service account JSON key file (recommended for production):

final ai:ModelProvider vertexModel = check new vertex:ModelProvider(
    auth = "/path/to/service-account-key.json",
    projectId = "your-gcp-project-id",
    location = "us-central1",
    model = "google/gemini-2.0-flash"
);

Option 3 — Service account inline credentials (use when you need to override scopes):

final ai:ModelProvider vertexModel = check new vertex:ModelProvider(
    auth = {
        clientEmail: "your-sa@your-project.iam.gserviceaccount.com",
        privateKey: "-----BEGIN RSA PRIVATE KEY-----\n..."
    },
    projectId = "your-gcp-project-id",
    location = "us-central1",
    model = "google/gemini-2.0-flash"
);

Step 3: Invoke chat completion

ai:ChatMessage[] chatMessages = [{role: "user", content: "hi"}];
ai:ChatAssistantMessage response = check vertexModel->chat(chatMessages, tools = []);

chatMessages.push(response);

Step 4: Generate typed output

type Sentiment record {|
    string label;
    decimal score;
|};

@ai:JsonSchema {
    "type": "object",
    "required": ["label", "score"],
    "properties": {
        "label": {"type": "string", "enum": ["positive", "neutral", "negative"]},
        "score": {"type": "number"}
    }
}
type SentimentType Sentiment;

Sentiment|error result = vertexModel->generate(
    `Analyze the sentiment of: "I love this product!"`
);

Step 5: Use an embedding provider

import ballerina/ai;
import ballerinax/ai.googleapis.vertex;

final ai:EmbeddingProvider vertexEmbedding = check new vertex:EmbeddingProvider(
    auth = "/path/to/service-account-key.json",
    projectId = "your-gcp-project-id",
    location = "us-central1"
);

ai:Embedding embedding = check vertexEmbedding->embed(<ai:TextChunk>{content: "Hello, world!"});

Import

import sumudunissanka/ai.googleapis.vertex;


Metadata

Released: 14 days ago

Version: 1.5.1

License: Apache-2.0


Compatibility

Platform: java21

Ballerina version: 2201.12.6

GraalVM compatible: Yes


Keywords

Agent

Model Provider

Vendor/Google

Area/AI

