Below you can find known issues when working with Supported LLMs.

Invalid 'max_tokens': integer below minimum value

The following error appears when executing an assistant where the max_tokens parameter is set to -1:

Error code: 400
Invalid 'max_tokens': integer below minimum value. Expected a value >= 1, but got -1 instead.
type: invalid_request_error
param: max_tokens
code: integer_below_min_value

The case was reproduced using the OpenAI provider. Assign a maximum value appropriate for the selected model, since -1 is not documented as a supported value.
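
As a reference, the sketch below shows a direct call with the OpenAI Python SDK where max_tokens is set to a positive integer. The model name and token value are only illustrative and do not correspond to any specific SAIA assistant configuration.

# Minimal sketch (OpenAI Python SDK >= 1.0); model and value are illustrative.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # example model
    messages=[{"role": "user", "content": "Summarize this ticket."}],
    max_tokens=512,       # must be an integer >= 1; -1 triggers the 400 error above
)
print(response.choices[0].message.content)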

max_tokens is too large

The following error appears when executing an assistant:

Error connecting to the SAIA service cause: 400
max_tokens is too large: 12000. This model supports at most 4096 completion tokens, whereas you provided 12000

Check the max_tokens value supported by the model configured for your assistant; the selected max_tokens value is greater than the maximum supported.
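
As an illustration, the hypothetical helper below clamps a requested max_tokens value to a per-model completion-token limit before sending the request. The limit table is an assumption made for this example; take the real limits from the provider documentation for your model.

# Hypothetical helper: clamp the requested max_tokens to the model's completion limit.
# The limits below are example values; verify them in the provider documentation.
COMPLETION_TOKEN_LIMITS = {
    "gpt-3.5-turbo": 4096,   # matches the 4096 limit reported in the error above
    "gpt-4o-mini": 16384,
}

def clamp_max_tokens(model: str, requested: int, default_limit: int = 4096) -> int:
    """Return a max_tokens value that the selected model can accept."""
    limit = COMPLETION_TOKEN_LIMITS.get(model, default_limit)
    return min(requested, limit)

print(clamp_max_tokens("gpt-3.5-turbo", 12000))  # -> 4096 instead of the rejected 12000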

The response was filtered due to the prompt triggering Azure OpenAI's content management policy

The following error appears when executing an assistant with a complex query using Azure OpenAI endpoints:

The response was filtered due to the prompt triggering Azure OpenAI's content management policy.
Please modify your prompt and retry.
To learn more about our content filtering policies please read our documentation
https://go.microsoft.com/fwlink/?linkid=2198766

Check the deployment made for the associated endpoint, and make sure the content filter is set to the empty (default) value; do not use the Microsoft.Default.v2 configuration.
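
A minimal sketch of detecting this rejection on the client side is shown below, assuming the Azure OpenAI deployment is called directly with the OpenAI Python SDK; the endpoint, key, API version, and deployment name are placeholders.

# Sketch only: the endpoint, key, API version, and deployment name are placeholders.
from openai import AzureOpenAI, BadRequestError

client = AzureOpenAI(
    azure_endpoint="https://YOUR-RESOURCE.openai.azure.com",
    api_key="YOUR-KEY",
    api_version="2024-02-01",
)

try:
    client.chat.completions.create(
        model="YOUR-DEPLOYMENT-NAME",  # Azure deployment name
        messages=[{"role": "user", "content": "complex query that may be filtered"}],
    )
except BadRequestError as err:
    # Azure returns HTTP 400 when the prompt trips the content management policy;
    # adjust the prompt or review the deployment's content filter configuration.
    print("Request rejected:", err)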

Last update: September 2024