Getting Started with Quarkus LangChain4j

This guide shows you how to get started with Quarkus LangChain4j by building a simple CLI application that generates poems using an AI model.

Install the Extension

To use the extension, add the dependency for your preferred model provider.

For example, to use OpenAI:

<dependency>
  <groupId>io.quarkiverse.langchain4j</groupId>
  <artifactId>quarkus-langchain4j-openai</artifactId>
  <version>1.0.2</version>
</dependency>

Want to use a different model provider?

See the Models serving page for the other available providers. This guide uses OpenAI as an example.
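
For instance, to target a local Ollama server instead, you would swap the dependency; a minimal sketch (the quarkus-langchain4j-ollama artifact and version shown here are assumptions, so check the Models serving page for the exact coordinates):

<dependency>
  <groupId>io.quarkiverse.langchain4j</groupId>
  <!-- artifact name and version assumed; see the Models serving page -->
  <artifactId>quarkus-langchain4j-ollama</artifactId>
  <version>1.0.2</version>
</dependency>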

Configure the Model Provider

Set your OpenAI API key in application.properties:

quarkus.langchain4j.openai.api-key=sk-...
You can also use the QUARKUS_LANGCHAIN4J_OPENAI_API_KEY environment variable.
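
For example, in a Unix shell:

export QUARKUS_LANGCHAIN4J_OPENAI_API_KEY=sk-...
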
Enable request/response logging

You can log the interactions between your application and the model:

quarkus.langchain4j.log-requests=true
quarkus.langchain4j.log-responses=true

Define an AI Service

An AI service is a Java interface annotated to describe how your application interacts with the model.

It follows the Ambassador pattern, where the interface acts as a contract, and the implementation is generated automatically.

package io.quarkiverse.langchain4j.samples;

import jakarta.enterprise.context.ApplicationScoped;

import dev.langchain4j.service.SystemMessage;
import dev.langchain4j.service.UserMessage;
import io.quarkiverse.langchain4j.RegisterAiService;

@RegisterAiService (1)
@SystemMessage("You are a professional poet") (2)
@ApplicationScoped (3)
public interface MyAiService {

    @UserMessage("""
                Write a poem about {topic}.
                The poem should be {lines} lines long. (4)
            """)
    String writeAPoem(String topic, int lines); (5)
}
1 Declares the AI service.
2 Sets the system message to define the model’s behavior.
3 Declares it as an application-scoped CDI bean.
4 Defines the prompt template; the {topic} and {lines} placeholders are filled from the method parameters.
5 This method invokes the LLM and returns the result.

Use the AI Service from a CLI App

Let’s build a simple CLI application that asks the AI service to generate a poem.

Add the quarkus-picocli extension:

<dependency>
  <groupId>io.quarkus</groupId>
  <artifactId>quarkus-picocli</artifactId>
</dependency>

Then create a class named PoemCommand:

package io.quarkiverse.langchain4j.samples;

import jakarta.enterprise.context.control.ActivateRequestContext;
import jakarta.inject.Inject;
import picocli.CommandLine;
import picocli.CommandLine.Command;
import picocli.CommandLine.Parameters;

@Command(name = "poem", mixinStandardHelpOptions = true)
public class PoemCommand implements Runnable {

    @Parameters(paramLabel = "<topic>", defaultValue = "quarkus",
            description = "The topic.")
    String topic;

    @CommandLine.Option(names = "--lines", defaultValue = "4",
            description = "The number of lines in the poem.")
    int lines;

    @Inject
    MyAiService myAiService;

    @Override
    @ActivateRequestContext
    public void run() {
        System.out.println(myAiService.writeAPoem(topic, lines));
    }
}

Run the Application in Dev Mode

Start the application in dev mode, passing the command-line arguments through the quarkus.args property:

./mvnw quarkus:dev -Dquarkus.args='--lines=5 "AI with Quarkus"'

This will output a poem like:

In the realm of code and light,
Quarkus dances, swift and bright,
AI whispers in the breeze,
Crafting dreams with agile ease,
Future's song in bytes takes flight.

Exit the app with Ctrl+C.

Package the Application

Build the application:

./mvnw package

Run the packaged JAR:

java -jar target/quarkus-app/quarkus-run.jar --lines=5 "AI with Quarkus"

Compile to Native Image

To compile to native, run:

./mvnw package -Dnative

Then run it with:

./target/*-runner --lines=5 "Quarkus poetry"