
The Chat API simplifies how businesses connect to various AI models across different providers. It acts as a single point of access, eliminating the need to manage multiple SDKs or APIs. This means you can interact with any supported LLM using just one SDK. In addition, the API centralizes the use of any Assistant in a single entry point.

Check the generic variables needed to use the API.

Endpoint

The general endpoint is as follows:

Method Path
POST /chat
POST /chat/completions


Both the /chat and /chat/completions endpoints are supported; you can choose either. Both expose the same Request/Response interface as the OpenAI Chat Completion API, so they can be called using the OpenAI SDKs.

POST /chat

Request

  • Method: POST
  • Path: $BASE_URL/chat
  • Headers:
    • Content-Type: application/json
    • Authorization: Bearer $SAIA_PROJECT_APITOKEN
    • saia-conversation-id: $CONVERSATION_ID
      This header applies only to Flows. It is required to maintain the context of the conversation; if omitted, each message is treated as a new conversation without context. The value can be any alphanumeric string, and changing it starts a new conversation. Conversations expire after 10 minutes of inactivity, after which they are automatically deleted.
      Currently, if the response takes more than 60 seconds to generate, it is not returned. This endpoint is therefore suitable only for simple Flows that do not require many interaction steps.
  • Request Body

The payload will vary depending on the selected Assistant. The general pattern is as follows:

{
    "model": "string", // mandatory
    "messages": [ ... ], // at least one message
    "stream": boolean // optional
}

The model must identify the assistant type together with the assistant name or the bot_id, depending on the Type. The remaining parameters also vary by type. Its format is as follows:

"model": "saia:<assistant_type>:<assistant_name>|<bot_id>"

If the type is flow, use the bot_id, which can be found in the OVERVIEW section of the Flow Builder Side Navigation Menu.

Type Description
assistant Identifies an Assistant API, Data Analyst Assistant, or API Assistant
search Identifies a RAG Assistant
flow Identifies a Flow
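Given the format and table above, a small helper can assemble the model identifier. This is an illustrative sketch; the saia_model function name is ours, not part of the API:

```python
VALID_TYPES = {"assistant", "search", "flow"}

def saia_model(assistant_type, identifier):
    """Build a model identifier in the form saia:<assistant_type>:<name_or_bot_id>."""
    if assistant_type not in VALID_TYPES:
        raise ValueError("unknown assistant type: " + assistant_type)
    return "saia:{}:{}".format(assistant_type, identifier)

print(saia_model("assistant", "translate-to-spanish"))
# saia:assistant:translate-to-spanish
print(saia_model("flow", "87507723-3e3b-47f6-a1d0-aa53370c71d2"))
# saia:flow:87507723-3e3b-47f6-a1d0-aa53370c71d2
```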

The messages element defines the messages to be sent. At minimum it must contain the following, where content holds the user input.

{
    "role": "string", /* user, system and may support others depending on the selected model */
    "content": "string"
}
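The general pattern and the message shape can be combined programmatically. The helper below is a sketch (chat_payload is our own name); the extra keyword arguments stand in for the optional parameters shown in the samples that follow:

```python
def chat_payload(model, user_content, stream=False, **extra):
    """Build a minimal Chat API request body. Extra keyword arguments
    (e.g. temperature, max_tokens) are merged into the body as-is."""
    body = {
        "model": model,
        "messages": [{"role": "user", "content": user_content}],
        "stream": stream,
    }
    body.update(extra)
    return body

payload = chat_payload("saia:assistant:translate-to-spanish",
                       "Hi, welcome to Globant Enterprise AI!!",
                       stream=True)
print(payload["model"])
# saia:assistant:translate-to-spanish
```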

You can add additional parameters. Below are possible body samples.

Assistant Sample

{
    "model": "saia:assistant:translate-to-spanish", /* Using a Standard Assistant named 'translate-to-spanish' */
    "messages": [
        {
            "role": "user",
            "content": "Hi, welcome to Globant Enterprise AI!!"
        }
    ],
    "stream": true
}

The expected result is to stream the translated content depending on the Prompt defined.

cURL Sample

curl -X POST "$BASE_URL/chat" \
 -H "Authorization: Bearer $SAIA_PROJECT_APITOKEN" \
 -H "Content-Type: application/json" \
 -d '{
    "model": "saia:assistant:translate-to-spanish",
    "messages": [
        {
            "role": "user",
            "content": "Hi, welcome to Globant Enterprise AI!!"
        }
    ],
    "stream": true
}'
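Since the endpoints share the OpenAI Chat Completion interface, a streamed reply arrives as Server-Sent Events: data: lines carrying JSON chunks whose choices[0].delta.content holds each text fragment, terminated by data: [DONE]. The sketch below parses a hypothetical stream body offline; in practice the lines come from the HTTP response when "stream": true:

```python
import json

# Hypothetical sample of an OpenAI-style SSE stream body (illustration only).
sample_stream = (
    'data: {"choices":[{"delta":{"content":"Hola"}}]}\n\n'
    'data: {"choices":[{"delta":{"content":", bienvenido"}}]}\n\n'
    "data: [DONE]\n\n"
)

def collect_stream(body):
    """Concatenate the delta.content fragments of an OpenAI-style SSE stream."""
    text = []
    for line in body.splitlines():
        if not line.startswith("data: "):
            continue
        data = line[len("data: "):]
        if data == "[DONE]":
            break
        delta = json.loads(data)["choices"][0]["delta"]
        text.append(delta.get("content", ""))
    return "".join(text)

print(collect_stream(sample_stream))
# Hola, bienvenido
```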

Data Analyst Sample

{
   "model": "saia:assistant:myDataAnalystAssistant", /* Using a DataAnalyst Assistant named 'myDataAnalystAssistant'. It can also be referenced by its "assistantId" value (uuid) instead of the name. */
   "messages": [
       {
           "role": "user",
           "content": "What was the month with the highest Ancillaries sales?"
       }
   ],
   "stream": true
}

cURL Sample

curl -X POST "$BASE_URL/chat" \
 -H "Authorization: Bearer $SAIA_PROJECT_APITOKEN" \
 -H "Content-Type: application/json" \
 --data '{
    "model": "saia:assistant:myDataAnalystAssistant",
    "messages": [
        {
           "role": "user",
           "content": "What was the month with the highest Ancillaries sales?"
        }
    ],
   "stream": true
}'

API Assistant Sample

{
    "model": "saia:assistant:test-openapi-weather-assistant",
    "stream": false,
    "messages": [
        {
            "role": "user",
            "content": "weather in madrid"
        }
    ]
}

cURL Sample

curl -X POST "$BASE_URL/chat" \
 -H "Authorization: Bearer $SAIA_PROJECT_APITOKEN" \
 -H "Content-Type: application/json" \
 --data '{
    "model": "saia:assistant:test-openapi-weather-assistant",
    "messages": [
       {
            "role": "user",
            "content": "weather in madrid"
        }
    ],
   "stream": false
}'

RAG Sample

{
    "threadId": "uuid_as_string", /* conversation identifier (optional) */
    "model": "saia:search:Default", /* Using the Default RAG Assistant */
    "messages": [
        {
            "role": "user",
            "content": "Summarize the features of Globant Enterprise AI"
        }
    ],
    "stream": true,
    "temperature": 0,
    "max_tokens": 500
}

The expected result is to query the Default RAG Assistant and stream a reply once the sources are obtained.

cURL Sample

curl -X POST "$BASE_URL/chat" \
 -H "Authorization: Bearer $SAIA_PROJECT_APITOKEN" \
 -H "Content-Type: application/json" \
 --data '{
    "model": "saia:search:Default",
    "messages": [
        {
            "role": "user",
            "content": "Summarize the features of Globant Enterprise AI"
        }
    ],
    "stream": true
}'

Flow Sample

{
    "model": "saia:flow:87507723-3e3b-47f6-a1d0-aa53370c71d2", /* where 87507723-3e3b-47f6-a1d0-aa53370c71d2 is the bot_id */
    "messages": [
        {
            "role": "user",
            "content": "Question for flow"
        }
    ]
}

cURL Sample

curl --location "$BASE_URL/chat" \
-H "Content-Type: application/json;charset=utf-8" \
-H "Authorization: Bearer $SAIA_PROJECT_APITOKEN" \
-H "saia-conversation-id: $CONVERSATION_ID" \
--data '{
    "model": "saia:flow:87507723-3e3b-47f6-a1d0-aa53370c71d2",
    "messages": [
        {
            "role": "user",
            "content": "Question for flow"
        }
    ]
}'
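For Flows, the saia-conversation-id header ties messages to one conversation. The sketch below assembles the headers and body (flow_request and the placeholder token are ours); reusing the same id preserves context, while a new id starts a fresh conversation:

```python
import uuid

def flow_request(bot_id, content, conversation_id=None):
    """Build the headers and body for a Flow call to POST $BASE_URL/chat.

    Contexts expire after 10 minutes of inactivity, so a stored id
    only helps within that window.
    """
    conversation_id = conversation_id or uuid.uuid4().hex
    headers = {
        "Content-Type": "application/json",
        "Authorization": "Bearer $SAIA_PROJECT_APITOKEN",  # placeholder token
        "saia-conversation-id": conversation_id,
    }
    body = {
        "model": "saia:flow:" + bot_id,
        "messages": [{"role": "user", "content": content}],
    }
    return headers, body

headers, body = flow_request("87507723-3e3b-47f6-a1d0-aa53370c71d2",
                             "Question for flow")
print(body["model"])
# saia:flow:87507723-3e3b-47f6-a1d0-aa53370c71d2
```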

How to integrate Globant Enterprise AI with third-party SDKs

cURL

curl -X POST "$BASE_URL/chat" \
-H 'Authorization: Bearer $SAIA_PROJECT_APITOKEN' \
-H 'X-Saia-Cache-Enabled: false' \
-H 'Content-Type: application/json' \
--data '{
    "model": "openai/gpt-4o",
    "messages": [
         {
            "role": "system",
            "content": "You are a professional Translator. Translate the user text to English. Just output one word. "
        },
        {
            "role": "user",
            "content": "Hola"
        }
    ],
    "stream": false
}'

Note that in the model parameter, you must specify the model in the provider/modelName format (for example, openai/gpt-4o).

OpenAI SDK for Python

from openai import OpenAI

api_key = "$(SAIA_API_KEY)"
api_base = "https://api.saia.ai/chat"

openai = OpenAI(api_key=api_key, base_url=api_base)

completion = openai.chat.completions.create(model="gpt-3.5-turbo", messages=[{"role": "user", "content": "Hello world"}])

print(completion.choices[0].message.content)

OpenAI SDK for TypeScript

import OpenAI from "openai";

const openai = new OpenAI({
  apiKey: process.env.SAIA_APITOKEN, // your project API token
  baseURL: "https://api.saia.ai/chat",
});

async function main() {
  const chatCompletion = await openai.chat.completions.create({
    model: "gpt-3.5-turbo",
    messages: [{ role: "user", content: "Hello world" }],
  });

  console.log(chatCompletion.choices[0].message.content);
}

main();

How to call Gemini 1.5 Pro

cURL

curl -X POST "$BASE_URL/chat" \
-H 'Authorization: Bearer $SAIA_PROJECT_APITOKEN' \
-H 'X-Saia-Cache-Enabled: false' \
-H 'Content-Type: application/json' \
--data '{
    "model": "vertex_ai/gemini-1.5-pro-preview-0409",
    "messages": [
         {
            "role": "system",
            "content": "You are a professional Translator. Translate the user text to English. Just output one word. "
        },
        {
            "role": "user",
            "content": "Hola"
        }
    ],
    "stream": false
}'

Creating and Using an Assistant with Variables

First, you need to create an assistant with the variables, and then pass the variables when you use it.

cURL - Creating the Assistant

curl --location 'https://api.saia.ai/v1/assistant' \
--header 'Content-Type: application/json' \
--header 'Authorization: Bearer $SAIA_APITOKEN' \
--data '{
    "type": "chat",
    "name": "Test-variables",
    "prompt": "You are a translator. Translate to {language}."
}'
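The {language} placeholder in the prompt is resolved at call time from the variables array. The substitution happens server-side; the snippet below only illustrates the mapping (render_prompt is our own helper, not part of the API):

```python
def render_prompt(prompt_template, variables):
    """Illustrate how the variables array fills {key} placeholders in a prompt."""
    for var in variables:
        prompt_template = prompt_template.replace("{" + var["key"] + "}", var["value"])
    return prompt_template

print(render_prompt("You are a translator. Translate to {language}.",
                    [{"key": "language", "value": "French"}]))
# You are a translator. Translate to French.
```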

cURL - Using the Assistant

curl --location 'https://api.saia.ai/chat' \
--header 'Saia-Auth: $SAIA_APITOKEN' \
--header 'X-Saia-Cache-Enabled: false' \
--header 'Content-Type: application/json' \
--data '{
    "model": "saia:assistant:Test-variables",
    "messages": [
        {
            "role": "user",
            "content": "Hello"
        }
    ],
    "variables": [
        {
            "key": "language",
            "value": "French"
        }
  ],
    "stream": false
}'
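When building the request in code, the variables array can be attached to an otherwise standard body. A sketch, with with_variables as our own helper name:

```python
def with_variables(payload, **variables):
    """Return a copy of a request body with a variables array attached."""
    out = dict(payload)
    out["variables"] = [{"key": k, "value": v} for k, v in variables.items()]
    return out

body = with_variables(
    {
        "model": "saia:assistant:Test-variables",
        "messages": [{"role": "user", "content": "Hello"}],
        "stream": False,
    },
    language="French",
)
print(body["variables"])
# [{'key': 'language', 'value': 'French'}]
```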

 

Last update: October 2024 | © GeneXus. All rights reserved. GeneXus Powered by Globant