Azure OpenAI Embedding Models

Azure OpenAI supports the development of Retrieval-Augmented Generation (RAG) applications by offering embedding models that transform text into high-dimensional vector representations. These embeddings enable similarity search, semantic retrieval, and other vector-based operations.

Prerequisites

Azure OpenAI Account and API Key

To use Azure OpenAI models in your Quarkus application, configure your Azure credentials and endpoint.

  1. Obtain your Azure OpenAI endpoint, resource name, deployment name, and either an api-key or an Azure AD access token from the Azure Portal.

  2. Configure your application.properties with the necessary details:

quarkus.langchain4j.azure-openai.resource-name=
quarkus.langchain4j.azure-openai.deployment-name=

# And one of the below depending on your scenario
quarkus.langchain4j.azure-openai.api-key=
quarkus.langchain4j.azure-openai.ad-token=
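
If you opt for Azure AD authentication, one way to obtain an access token (assuming the Azure CLI is installed and you are logged in) is:

az account get-access-token --resource https://cognitiveservices.azure.com --query accessToken --output tsv

Such tokens are short-lived, so treat this as a sketch for local development rather than a production setup.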

Azure OpenAI Quarkus Extension

To use Azure OpenAI embedding models in your Quarkus application, add the quarkus-langchain4j-azure-openai extension:

<dependency>
    <groupId>io.quarkiverse.langchain4j</groupId>
    <artifactId>quarkus-langchain4j-azure-openai</artifactId>
    <version>1.0.2</version>
</dependency>
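
If you build with Gradle instead of Maven, the equivalent dependency declaration is:

implementation("io.quarkiverse.langchain4j:quarkus-langchain4j-azure-openai:1.0.2")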

If no other LLM extension is present, AI Services will automatically select the configured Azure OpenAI embedding model.

This extension also provides support for Azure OpenAI chat and image models. See the corresponding sections for details.
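
Once the extension is installed and configured, the embedding model can be injected and used directly. The following is a minimal sketch; the class and method names are illustrative, not part of the extension:

import dev.langchain4j.data.embedding.Embedding;
import dev.langchain4j.model.embedding.EmbeddingModel;
import jakarta.enterprise.context.ApplicationScoped;
import jakarta.inject.Inject;

@ApplicationScoped
public class DocumentEmbedder {

    // Backed by the configured Azure OpenAI embedding deployment
    @Inject
    EmbeddingModel embeddingModel;

    public float[] embed(String text) {
        // Transform the text into its high-dimensional vector representation
        Embedding embedding = embeddingModel.embed(text).content();
        return embedding.vector();
    }
}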

Configuration

Configuration property fixed at build time - All other configuration properties are overridable at runtime

Each entry below lists the property description, its environment variable, its type, and its default value (if any).

Whether the chat model should be enabled

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_CHAT_MODEL_ENABLED

boolean

true

Whether the embedding model should be enabled

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_EMBEDDING_MODEL_ENABLED

boolean

true

Whether the moderation model should be enabled

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_MODERATION_MODEL_ENABLED

boolean

true

Whether the image model should be enabled

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_IMAGE_MODEL_ENABLED

boolean

true

The name of your Azure OpenAI Resource. You’re required to first deploy a model before you can make calls.

This and quarkus.langchain4j.azure-openai.deployment-name are required if quarkus.langchain4j.azure-openai.endpoint is not set. If quarkus.langchain4j.azure-openai.endpoint is set then this is never read.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_RESOURCE_NAME

string

The domain name of your Azure OpenAI Resource. You’re required to first deploy a model before you can make calls.

This is used together with quarkus.langchain4j.azure-openai.resource-name and quarkus.langchain4j.azure-openai.deployment-name when quarkus.langchain4j.azure-openai.endpoint is not set. If quarkus.langchain4j.azure-openai.endpoint is set then this is never read.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_DOMAIN_NAME

string

openai.azure.com

The name of your model deployment. You’re required to first deploy a model before you can make calls.

This and quarkus.langchain4j.azure-openai.resource-name are required if quarkus.langchain4j.azure-openai.endpoint is not set. If quarkus.langchain4j.azure-openai.endpoint is set then this is never read.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_DEPLOYMENT_NAME

string

The endpoint for the Azure OpenAI resource.

If not specified, then quarkus.langchain4j.azure-openai.resource-name, quarkus.langchain4j.azure-openai.domain-name (defaults to "openai.azure.com") and quarkus.langchain4j.azure-openai.deployment-name are required. In this case the endpoint will be set to https://${quarkus.langchain4j.azure-openai.resource-name}.${quarkus.langchain4j.azure-openai.domain-name}/openai/deployments/${quarkus.langchain4j.azure-openai.deployment-name}

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_ENDPOINT

string

The Azure AD token to use for this operation. If present, then the requests towards OpenAI will include this in the Authorization header. Note that this property overrides the functionality of quarkus.langchain4j.azure-openai.api-key.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_AD_TOKEN

string

The API version to use for this operation. This follows the YYYY-MM-DD format

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_API_VERSION

string

2023-05-15

Azure OpenAI API key

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_API_KEY

string

Timeout for OpenAI calls

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_TIMEOUT

Duration 

10s

The maximum number of times to retry. 1 means exactly one attempt, with retrying disabled.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_MAX_RETRIES

int

1

Whether the OpenAI client should log requests

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_LOG_REQUESTS

boolean

false

Whether the OpenAI client should log responses

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_LOG_RESPONSES

boolean

false

Whether to enable the integration. Defaults to true, which means requests are made to the OpenAI provider. Set to false to disable all requests.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_ENABLE_INTEGRATION

boolean

true

This property will override the quarkus.langchain4j.azure-openai.resource-name specifically for chat models if it is set.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_CHAT_MODEL_RESOURCE_NAME

string

This property will override the quarkus.langchain4j.azure-openai.domain-name specifically for chat models if it is set.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_CHAT_MODEL_DOMAIN_NAME

string

This property will override the quarkus.langchain4j.azure-openai.deployment-name specifically for chat models if it is set.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_CHAT_MODEL_DEPLOYMENT_NAME

string

This property will override the quarkus.langchain4j.azure-openai.endpoint specifically for chat models if it is set.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_CHAT_MODEL_ENDPOINT

string

The Azure AD token to use for this operation. If present, then the requests towards OpenAI will include this in the Authorization header. Note that this property overrides the functionality of quarkus.langchain4j.azure-openai.chat-model.api-key.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_CHAT_MODEL_AD_TOKEN

string

The API version to use for this operation. This follows the YYYY-MM-DD format

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_CHAT_MODEL_API_VERSION

string

Azure OpenAI API key

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_CHAT_MODEL_API_KEY

string

What sampling temperature to use, with values between 0 and 2. Higher values mean the model will take more risks. A value of 0.9 is good for more creative applications, while 0 (argmax sampling) is good for ones with a well-defined answer. It is recommended to alter this or topP, but not both.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_CHAT_MODEL_TEMPERATURE

double

${quarkus.langchain4j.temperature:1.0}

An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with topP probability mass. 0.1 means only the tokens comprising the top 10% probability mass are considered. It is recommended to alter this or temperature, but not both.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_CHAT_MODEL_TOP_P

double

1.0

The maximum number of tokens to generate in the completion. The token count of your prompt plus max_tokens can’t exceed the model’s context length. Most models have a context length of 2048 tokens (except for the newest models, which support 4096).

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_CHAT_MODEL_MAX_TOKENS

int

Number between -2.0 and 2.0. Positive values penalize new tokens based on whether they appear in the text so far, increasing the model’s likelihood to talk about new topics.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_CHAT_MODEL_PRESENCE_PENALTY

double

0

Number between -2.0 and 2.0. Positive values penalize new tokens based on their existing frequency in the text so far, decreasing the model’s likelihood to repeat the same line verbatim.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_CHAT_MODEL_FREQUENCY_PENALTY

double

0

Whether chat model requests should be logged

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_CHAT_MODEL_LOG_REQUESTS

boolean

false

Whether chat model responses should be logged

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_CHAT_MODEL_LOG_RESPONSES

boolean

false

The response format the model should use. Some models are not compatible with some response formats; make sure to review the OpenAI documentation.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_CHAT_MODEL_RESPONSE_FORMAT

string

This property will override the quarkus.langchain4j.azure-openai.resource-name specifically for embedding models if it is set.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_EMBEDDING_MODEL_RESOURCE_NAME

string

This property will override the quarkus.langchain4j.azure-openai.domain-name specifically for embedding models if it is set.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_EMBEDDING_MODEL_DOMAIN_NAME

string

This property will override the quarkus.langchain4j.azure-openai.deployment-name specifically for embedding models if it is set.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_EMBEDDING_MODEL_DEPLOYMENT_NAME

string

This property will override the quarkus.langchain4j.azure-openai.endpoint specifically for embedding models if it is set.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_EMBEDDING_MODEL_ENDPOINT

string

The Azure AD token to use for this operation. If present, then the requests towards OpenAI will include this in the Authorization header. Note that this property overrides the functionality of quarkus.langchain4j.azure-openai.embedding-model.api-key.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_EMBEDDING_MODEL_AD_TOKEN

string

The API version to use for this operation. This follows the YYYY-MM-DD format

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_EMBEDDING_MODEL_API_VERSION

string

Azure OpenAI API key

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_EMBEDDING_MODEL_API_KEY

string

Whether embedding model requests should be logged

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_EMBEDDING_MODEL_LOG_REQUESTS

boolean

false

Whether embedding model responses should be logged

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_EMBEDDING_MODEL_LOG_RESPONSES

boolean

false

This property will override the quarkus.langchain4j.azure-openai.resource-name specifically for image models if it is set.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_IMAGE_MODEL_RESOURCE_NAME

string

This property will override the quarkus.langchain4j.azure-openai.domain-name specifically for image models if it is set.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_IMAGE_MODEL_DOMAIN_NAME

string

This property will override the quarkus.langchain4j.azure-openai.deployment-name specifically for image models if it is set.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_IMAGE_MODEL_DEPLOYMENT_NAME

string

This property will override the quarkus.langchain4j.azure-openai.endpoint specifically for image models if it is set.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_IMAGE_MODEL_ENDPOINT

string

The Azure AD token to use for this operation. If present, then the requests towards OpenAI will include this in the Authorization header. Note that this property overrides the functionality of quarkus.langchain4j.azure-openai.image-model.api-key.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_IMAGE_MODEL_AD_TOKEN

string

The API version to use for this operation. This follows the YYYY-MM-DD format

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_IMAGE_MODEL_API_VERSION

string

Azure OpenAI API key

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_IMAGE_MODEL_API_KEY

string

Model name to use

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_IMAGE_MODEL_MODEL_NAME

string

dall-e-3

Configure whether the generated images will be saved to disk. By default, persisting is disabled, but it is implicitly enabled when quarkus.langchain4j.azure-openai.image-model.persist-directory is set and this property is not set to false.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_IMAGE_MODEL_PERSIST

boolean

false

The path where the generated images will be persisted to disk. This only applies if quarkus.langchain4j.azure-openai.image-model.persist is not set to false.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_IMAGE_MODEL_PERSIST_DIRECTORY

path

${java.io.tmpdir}/dall-e-images

The format in which the generated images are returned.

Must be one of url or b64_json

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_IMAGE_MODEL_RESPONSE_FORMAT

string

url

The size of the generated images.

Must be one of 1024x1024, 1792x1024, or 1024x1792 when the model is dall-e-3.

Must be one of 256x256, 512x512, or 1024x1024 when the model is dall-e-2.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_IMAGE_MODEL_SIZE

string

1024x1024

The quality of the image that will be generated.

hd creates images with finer details and greater consistency across the image.

This parameter is only supported when the model is dall-e-3.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_IMAGE_MODEL_QUALITY

string

standard

The number of images to generate.

Must be between 1 and 10.

When the model is dall-e-3, only n=1 is supported.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_IMAGE_MODEL_NUMBER

int

1

The style of the generated images.

Must be one of vivid or natural. Vivid causes the model to lean towards generating hyper-real and dramatic images. Natural causes the model to produce more natural, less hyper-real looking images.

This parameter is only supported when the model is dall-e-3.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_IMAGE_MODEL_STYLE

string

vivid

A unique identifier representing your end-user, which can help OpenAI to monitor and detect abuse.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_IMAGE_MODEL_USER

string

Whether image model requests should be logged

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_IMAGE_MODEL_LOG_REQUESTS

boolean

false

Whether image model responses should be logged

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI_IMAGE_MODEL_LOG_RESPONSES

boolean

false

Named model config

The properties below apply to named model configurations; in the environment variable names, __MODEL_NAME__ is a placeholder for the configuration name (see the named configuration example at the end of this page).

The name of your Azure OpenAI Resource. You’re required to first deploy a model before you can make calls.

This and quarkus.langchain4j.azure-openai.deployment-name are required if quarkus.langchain4j.azure-openai.endpoint is not set. If quarkus.langchain4j.azure-openai.endpoint is set then this is never read.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__RESOURCE_NAME

string

The domain name of your Azure OpenAI Resource. You’re required to first deploy a model before you can make calls.

This is used together with quarkus.langchain4j.azure-openai.resource-name and quarkus.langchain4j.azure-openai.deployment-name when quarkus.langchain4j.azure-openai.endpoint is not set. If quarkus.langchain4j.azure-openai.endpoint is set then this is never read.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__DOMAIN_NAME

string

openai.azure.com

The name of your model deployment. You’re required to first deploy a model before you can make calls.

This and quarkus.langchain4j.azure-openai.resource-name are required if quarkus.langchain4j.azure-openai.endpoint is not set. If quarkus.langchain4j.azure-openai.endpoint is set then this is never read.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__DEPLOYMENT_NAME

string

The endpoint for the Azure OpenAI resource.

If not specified, then quarkus.langchain4j.azure-openai.resource-name, quarkus.langchain4j.azure-openai.domain-name (defaults to "openai.azure.com") and quarkus.langchain4j.azure-openai.deployment-name are required. In this case the endpoint will be set to https://${quarkus.langchain4j.azure-openai.resource-name}.${quarkus.langchain4j.azure-openai.domain-name}/openai/deployments/${quarkus.langchain4j.azure-openai.deployment-name}

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__ENDPOINT

string

The Azure AD token to use for this operation. If present, then the requests towards OpenAI will include this in the Authorization header. Note that this property overrides the functionality of quarkus.langchain4j.azure-openai.api-key.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__AD_TOKEN

string

The API version to use for this operation. This follows the YYYY-MM-DD format

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__API_VERSION

string

2023-05-15

Azure OpenAI API key

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__API_KEY

string

Timeout for OpenAI calls

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__TIMEOUT

Duration 

10s

The maximum number of times to retry. 1 means exactly one attempt, with retrying disabled.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__MAX_RETRIES

int

1

Whether the OpenAI client should log requests

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__LOG_REQUESTS

boolean

false

Whether the OpenAI client should log responses

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__LOG_RESPONSES

boolean

false

Whether to enable the integration. Defaults to true, which means requests are made to the OpenAI provider. Set to false to disable all requests.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__ENABLE_INTEGRATION

boolean

true

This property will override the quarkus.langchain4j.azure-openai.resource-name specifically for chat models if it is set.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__CHAT_MODEL_RESOURCE_NAME

string

This property will override the quarkus.langchain4j.azure-openai.domain-name specifically for chat models if it is set.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__CHAT_MODEL_DOMAIN_NAME

string

This property will override the quarkus.langchain4j.azure-openai.deployment-name specifically for chat models if it is set.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__CHAT_MODEL_DEPLOYMENT_NAME

string

This property will override the quarkus.langchain4j.azure-openai.endpoint specifically for chat models if it is set.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__CHAT_MODEL_ENDPOINT

string

The Azure AD token to use for this operation. If present, then the requests towards OpenAI will include this in the Authorization header. Note that this property overrides the functionality of quarkus.langchain4j.azure-openai.chat-model.api-key.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__CHAT_MODEL_AD_TOKEN

string

The API version to use for this operation. This follows the YYYY-MM-DD format

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__CHAT_MODEL_API_VERSION

string

Azure OpenAI API key

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__CHAT_MODEL_API_KEY

string

What sampling temperature to use, with values between 0 and 2. Higher values mean the model will take more risks. A value of 0.9 is good for more creative applications, while 0 (argmax sampling) is good for ones with a well-defined answer. It is recommended to alter this or topP, but not both.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__CHAT_MODEL_TEMPERATURE

double

${quarkus.langchain4j.temperature:1.0}

An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with topP probability mass. 0.1 means only the tokens comprising the top 10% probability mass are considered. It is recommended to alter this or temperature, but not both.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__CHAT_MODEL_TOP_P

double

1.0

The maximum number of tokens to generate in the completion. The token count of your prompt plus max_tokens can’t exceed the model’s context length. Most models have a context length of 2048 tokens (except for the newest models, which support 4096).

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__CHAT_MODEL_MAX_TOKENS

int

Number between -2.0 and 2.0. Positive values penalize new tokens based on whether they appear in the text so far, increasing the model’s likelihood to talk about new topics.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__CHAT_MODEL_PRESENCE_PENALTY

double

0

Number between -2.0 and 2.0. Positive values penalize new tokens based on their existing frequency in the text so far, decreasing the model’s likelihood to repeat the same line verbatim.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__CHAT_MODEL_FREQUENCY_PENALTY

double

0

Whether chat model requests should be logged

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__CHAT_MODEL_LOG_REQUESTS

boolean

false

Whether chat model responses should be logged

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__CHAT_MODEL_LOG_RESPONSES

boolean

false

The response format the model should use. Some models are not compatible with some response formats; make sure to review the OpenAI documentation.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__CHAT_MODEL_RESPONSE_FORMAT

string

This property will override the quarkus.langchain4j.azure-openai.resource-name specifically for embedding models if it is set.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__EMBEDDING_MODEL_RESOURCE_NAME

string

This property will override the quarkus.langchain4j.azure-openai.domain-name specifically for embedding models if it is set.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__EMBEDDING_MODEL_DOMAIN_NAME

string

This property will override the quarkus.langchain4j.azure-openai.deployment-name specifically for embedding models if it is set.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__EMBEDDING_MODEL_DEPLOYMENT_NAME

string

This property will override the quarkus.langchain4j.azure-openai.endpoint specifically for embedding models if it is set.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__EMBEDDING_MODEL_ENDPOINT

string

The Azure AD token to use for this operation. If present, then the requests towards OpenAI will include this in the Authorization header. Note that this property overrides the functionality of quarkus.langchain4j.azure-openai.embedding-model.api-key.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__EMBEDDING_MODEL_AD_TOKEN

string

The API version to use for this operation. This follows the YYYY-MM-DD format

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__EMBEDDING_MODEL_API_VERSION

string

Azure OpenAI API key

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__EMBEDDING_MODEL_API_KEY

string

Whether embedding model requests should be logged

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__EMBEDDING_MODEL_LOG_REQUESTS

boolean

false

Whether embedding model responses should be logged

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__EMBEDDING_MODEL_LOG_RESPONSES

boolean

false

This property will override the quarkus.langchain4j.azure-openai.resource-name specifically for image models if it is set.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__IMAGE_MODEL_RESOURCE_NAME

string

This property will override the quarkus.langchain4j.azure-openai.domain-name specifically for image models if it is set.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__IMAGE_MODEL_DOMAIN_NAME

string

This property will override the quarkus.langchain4j.azure-openai.deployment-name specifically for image models if it is set.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__IMAGE_MODEL_DEPLOYMENT_NAME

string

This property will override the quarkus.langchain4j.azure-openai.endpoint specifically for image models if it is set.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__IMAGE_MODEL_ENDPOINT

string

The Azure AD token to use for this operation. If present, then the requests towards OpenAI will include this in the Authorization header. Note that this property overrides the functionality of quarkus.langchain4j.azure-openai.image-model.api-key.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__IMAGE_MODEL_AD_TOKEN

string

The API version to use for this operation. This follows the YYYY-MM-DD format

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__IMAGE_MODEL_API_VERSION

string

Azure OpenAI API key

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__IMAGE_MODEL_API_KEY

string

Model name to use

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__IMAGE_MODEL_MODEL_NAME

string

dall-e-3

Configure whether the generated images will be saved to disk. By default, persisting is disabled, but it is implicitly enabled when quarkus.langchain4j.azure-openai.image-model.persist-directory is set and this property is not set to false.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__IMAGE_MODEL_PERSIST

boolean

false

The path where the generated images will be persisted to disk. This only applies if quarkus.langchain4j.azure-openai.image-model.persist is not set to false.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__IMAGE_MODEL_PERSIST_DIRECTORY

path

${java.io.tmpdir}/dall-e-images

The format in which the generated images are returned.

Must be one of url or b64_json

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__IMAGE_MODEL_RESPONSE_FORMAT

string

url

The size of the generated images.

Must be one of 1024x1024, 1792x1024, or 1024x1792 when the model is dall-e-3.

Must be one of 256x256, 512x512, or 1024x1024 when the model is dall-e-2.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__IMAGE_MODEL_SIZE

string

1024x1024

The quality of the image that will be generated.

hd creates images with finer details and greater consistency across the image.

This parameter is only supported when the model is dall-e-3.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__IMAGE_MODEL_QUALITY

string

standard

The number of images to generate.

Must be between 1 and 10.

When the model is dall-e-3, only n=1 is supported.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__IMAGE_MODEL_NUMBER

int

1

The style of the generated images.

Must be one of vivid or natural. Vivid causes the model to lean towards generating hyper-real and dramatic images. Natural causes the model to produce more natural, less hyper-real looking images.

This parameter is only supported when the model is dall-e-3.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__IMAGE_MODEL_STYLE

string

vivid

A unique identifier representing your end-user, which can help OpenAI to monitor and detect abuse.

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__IMAGE_MODEL_USER

string

Whether image model requests should be logged

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__IMAGE_MODEL_LOG_REQUESTS

boolean

false

Whether image model responses should be logged

Environment variable: QUARKUS_LANGCHAIN4J_AZURE_OPENAI__MODEL_NAME__IMAGE_MODEL_LOG_RESPONSES

boolean

false

About the Duration format

To write duration values, use the standard java.time.Duration format. See the Duration#parse() Java API documentation for more information.

You can also use a simplified format, starting with a number:

  • If the value is only a number, it represents time in seconds.

  • If the value is a number followed by ms, it represents time in milliseconds.

In other cases, the simplified format is translated to the java.time.Duration format for parsing:

  • If the value is a number followed by h, m, or s, it is prefixed with PT.

  • If the value is a number followed by d, it is prefixed with P.
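
For example, the timeout property described above can be set to 30 seconds in either of these equivalent ways:

# Simplified format
quarkus.langchain4j.azure-openai.timeout=30s
# Full java.time.Duration format
quarkus.langchain4j.azure-openai.timeout=PT30S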

You can configure multiple Azure OpenAI embedding models in your application using named configurations:

# Default configuration
quarkus.langchain4j.azure-openai.embedding-model.deployment-name=text-embedding-3-large

# Custom configuration (under "my-retriever")
quarkus.langchain4j.azure-openai.my-retriever.embedding-model.deployment-name=text-embedding-ada-002

In your RAG implementation, select a model using the @ModelName annotation:

import io.quarkiverse.langchain4j.ModelName;
import dev.langchain4j.model.embedding.EmbeddingModel;
import jakarta.inject.Inject;

// Uses the default Azure OpenAI embedding model configuration
@Inject EmbeddingModel defaultEmbeddingModel;

// Uses the named 'my-retriever' configuration
@Inject @ModelName("my-retriever") EmbeddingModel namedEmbeddingModel;
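
As a sketch of how the similarity search mentioned in the introduction can be built on top of these embeddings, the helper below (a hypothetical class, not part of the extension) computes plain cosine similarity between two embedded texts instead of using a vector store:

import dev.langchain4j.data.embedding.Embedding;
import dev.langchain4j.model.embedding.EmbeddingModel;

public class SimilarityExample {

    // Cosine similarity between two embedding vectors
    static double cosineSimilarity(Embedding a, Embedding b) {
        float[] v1 = a.vector();
        float[] v2 = b.vector();
        double dot = 0, norm1 = 0, norm2 = 0;
        for (int i = 0; i < v1.length; i++) {
            dot += v1[i] * v2[i];
            norm1 += v1[i] * v1[i];
            norm2 += v2[i] * v2[i];
        }
        return dot / (Math.sqrt(norm1) * Math.sqrt(norm2));
    }

    // Embed both texts with the injected model and compare them;
    // higher scores indicate semantically closer texts
    static double score(EmbeddingModel model, String query, String document) {
        return cosineSimilarity(model.embed(query).content(), model.embed(document).content());
    }
}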