Lab 4 – Agentic workflows with LangChain4j

In this lab, you will build an Investment Memo Agent. This demonstrates the "Agent + Tool" pattern: the workflow calls a deterministic HTTP API to fetch real-time market data, then passes that data to an AI Agent to produce a structured, professional investment recommendation.

1. Add LangChain4j and a Model Provider

First, add the LangChain4j extension for your preferred LLM provider. Quarkus Flow works seamlessly with any provider supported by Quarkus LangChain4j.

Ollama (Local models - Recommended for this lab)
<dependency>
  <groupId>io.quarkiverse.langchain4j</groupId>
  <artifactId>quarkus-langchain4j-ollama</artifactId>
</dependency>
OpenAI (Cloud-based)
<dependency>
  <groupId>io.quarkiverse.langchain4j</groupId>
  <artifactId>quarkus-langchain4j-openai</artifactId>
</dependency>
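You will also need a little configuration. The snippet below is an illustrative sketch for the Ollama setup: the model id is a placeholder (use whatever model you have pulled locally), and `org.acme.agentic.market-data.url` is the custom property read by the `InvestmentMemoFlow` you will build in step 4 — exactly how the ticker path segment is resolved depends on how you template the URL, so treat the value as a placeholder too.

```properties
# Local model served by Ollama (placeholder - use a model you have pulled)
quarkus.langchain4j.ollama.chat-model.model-id=llama3.1

# Base URL of the market-data tool built in step 3, read by InvestmentMemoFlow
org.acme.agentic.market-data.url=http://localhost:8080/market-data/CSU.TO
```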

2. Define the AI Agent

Our Agent is a simple Java interface. We use @RegisterAiService to tell Quarkus to create a CDI bean that talks to the LLM.

2.1 The Data Models

We want the AI to return structured JSON, not just a raw string. We define Java Records to act as the "Schema" for the AI’s response and the "Prompt" for its input.

package org.acme.agentic;

import java.util.List;

/**
 * Output returned by the {@link InvestmentAnalystAgent}.
 */
public record InvestmentMemo(String summary, String stance, List<String> keyRisks) {
}
package org.acme.agentic;

/**
 * Input sent to the {@link InvestmentAnalystAgent}.
 * <p>
 * It combines the user's intent (objective / horizon) with the raw JSON market data returned by the HTTP task.
 */
public record InvestmentPrompt(String ticker, String objective, String horizon, String marketDataJson) {
}

2.2 The AI Service Interface

The @SystemMessage defines the Agent’s persona and strict output requirements (JSON fields like summary, stance, and keyRisks).

package org.acme.agentic;

import dev.langchain4j.service.MemoryId;
import dev.langchain4j.service.SystemMessage;
import dev.langchain4j.service.UserMessage;
import dev.langchain4j.service.V;
import io.quarkiverse.langchain4j.RegisterAiService;
import jakarta.enterprise.context.ApplicationScoped;

/**
 * Simple investment analyst agent.
 * <p>
 * It receives an {@link InvestmentPrompt} (ticker + JSON market snapshot) and returns an {@link InvestmentMemo} with a
 * short recommendation.
 */
@ApplicationScoped
@RegisterAiService
@SystemMessage("""
        You are a careful, conservative investment analyst.

        Given:
        - a stock ticker
        - a description of the investment objective
        - an investment horizon
        - and a compact JSON snapshot of market data,

        you MUST respond with a short JSON document that can be mapped to:
          InvestmentMemo {
            String summary;
            String stance;      // BUY, HOLD or AVOID
            List<String> keyRisks;
          }

        Be concise and avoid marketing language.
        """)
public interface InvestmentAnalystAgent {

    /**
     * Analyze the prompt and produce an investment memo.
     *
     * @param memoryId
     *        Conversation / workflow memory id (provided by Quarkus Flow).
     * @param prompt
     *        Ticker, objective, horizon and raw market-data JSON.
     */
    @UserMessage("""
            Ticker: {prompt.ticker}
            Objective: {prompt.objective}
            Horizon: {prompt.horizon}

            Here is the JSON market-data snapshot you should analyze:

            {prompt.marketDataJson}

            Produce an InvestmentMemo JSON as specified above.
            """)
    InvestmentMemo analyse(@MemoryId String memoryId, @V("prompt") InvestmentPrompt prompt);
}
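At runtime, LangChain4j renders the `@UserMessage` template by substituting the `{prompt.*}` placeholders with the record's accessor values. For the sample input used later in this lab, the rendered message sent to the LLM would look roughly like this (the market-data line is whatever raw JSON the HTTP task returned):

```
Ticker: CSU.TO
Objective: High growth, low risk
Horizon: 10 years

Here is the JSON market-data snapshot you should analyze:

{"ticker":"CSU.TO","price":3170.25,"pe":28.4,"dividendYield":0.003,"currency":"CAD"}

Produce an InvestmentMemo JSON as specified above.
```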

3. Create the Downstream "Market Data" Tool

An Agent is only as good as the data it receives. We’ll create a simple JAX-RS resource to simulate a financial data API that provides the "ground truth" (prices, ratios) to the AI.

package org.acme.agentic;

import java.math.BigDecimal;

/**
 * Simple DTO representing the market data returned by MarketDataResource.
 */
public record MarketDataSnapshot(String ticker, BigDecimal price, double pe, double dividendYield, String currency) {
}
package org.acme.agentic;

import java.math.BigDecimal;

import jakarta.ws.rs.GET;
import jakarta.ws.rs.Path;
import jakarta.ws.rs.PathParam;
import jakarta.ws.rs.Produces;
import jakarta.ws.rs.core.MediaType;

/**
 * Simple in-memory market data endpoint used by the InvestmentMemoFlow.
 * <p>
 * GET /market-data/{ticker}
 */
@Path("/market-data")
@Produces(MediaType.APPLICATION_JSON)
public class MarketDataResource {

    @GET
    @Path("/{ticker}")
    public MarketDataSnapshot marketData(@PathParam("ticker") String ticker) {
        String normalized = ticker.toUpperCase();

        BigDecimal price;
        double pe;
        double dividendYield;
        String currency = "USD";

        switch (normalized) {
            case "CSU.TO" -> {
                price = new BigDecimal("3170.25");
                pe = 28.4;
                dividendYield = 0.003;
                currency = "CAD";
            }
            case "RY.TO" -> {
                price = new BigDecimal("130.10");
                pe = 12.5;
                dividendYield = 0.043;
                currency = "CAD";
            }
            default -> {
                price = new BigDecimal("100.00");
                pe = 15.0;
                dividendYield = 0.0;
            }
        }

        return new MarketDataSnapshot(normalized, price, pe, dividendYield, currency);
    }
}
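With this resource in place, requesting `http://localhost:8080/market-data/CSU.TO` (for example with `curl`) returns the snapshot defined by the matching `switch` branch above:

```json
{
  "ticker": "CSU.TO",
  "price": 3170.25,
  "pe": 28.4,
  "dividendYield": 0.003,
  "currency": "CAD"
}
```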

4. Orchestrate the Flow (The "Agent + Tool" Loop)

Now we wire them together. This is the "Aha!" moment where the workflow coordinates the tools. We use the outputAs filter to take the raw HTTP response and "shape" it into the InvestmentPrompt the Agent expects.

package org.acme.agentic;

import static io.serverlessworkflow.fluent.func.FuncWorkflowBuilder.workflow;
import static io.serverlessworkflow.fluent.func.dsl.FuncDSL.agent;
import static io.serverlessworkflow.fluent.func.dsl.FuncDSL.get;

import java.util.Map;

import jakarta.enterprise.context.ApplicationScoped;
import jakarta.inject.Inject;

import org.eclipse.microprofile.config.inject.ConfigProperty;

import io.quarkiverse.flow.Flow;
import io.serverlessworkflow.api.types.Workflow;

/**
 * Workflow that:
 * <ol>
 * <li>calls an HTTP endpoint to fetch market data for a ticker,</li>
 * <li>pipes the result into a LangChain4j agent which writes an {@link InvestmentMemo}.</li>
 * </ol>
 */
@ApplicationScoped
public class InvestmentMemoFlow extends Flow {

    private final InvestmentAnalystAgent analyst;

    @ConfigProperty(name = "org.acme.agentic.market-data.url")
    String marketDataUrl;

    @Inject
    public InvestmentMemoFlow(InvestmentAnalystAgent analyst) {
        this.analyst = analyst;
    }

    @Override
    public Workflow descriptor() {
        // tag::workflow[]
        return workflow("investment-memo").tasks(
                // 1) Fetch market data via HTTP and turn it into an InvestmentPrompt
                // tag::http-step[]
                get("fetchMarketData", marketDataUrl).outputAs((result, wf, tf) -> {
                    // This is the original task input, as sent by the workflow user
                    // It has the user's objective and horizon
                    // It could be a record, but we use a Map here to show how to handle this
                    // type of object in the example.
                    final Map<String, Object> input = tf.input().asMap().orElseThrow();
                    // This is the task output before the outputAs filter
                    final String response = tf.rawOutput().asText().orElseThrow();
                    return new InvestmentPrompt(result.ticker(), input.get("objective").toString(),
                            input.get("horizon").toString(), response);
                }, MarketDataSnapshot.class),
                // end::http-step[]

                // 2) Call the LLM-backed investment analyst agent
                agent("investmentAnalyst", analyst::analyse, InvestmentPrompt.class)).build();
        // end::workflow[]
    }
}

Understanding the Data Bridge (outputAs)

Inside the fetchMarketData task, we use a lambda to bridge the tool and the agent:

                get("fetchMarketData", marketDataUrl).outputAs((result, wf, tf) -> {
                    // This is the original task input, as sent by the workflow user
                    // It has the user's objective and horizon
                    // It could be a record, but we use a Map here to show how to handle this
                    // type of object in the example.
                    final Map<String, Object> input = tf.input().asMap().orElseThrow();
                    // This is the task output before the outputAs filter
                    final String response = tf.rawOutput().asText().orElseThrow();
                    return new InvestmentPrompt(result.ticker(), input.get("objective").toString(),
                            input.get("horizon").toString(), response);
                }, MarketDataSnapshot.class),
  • result: The Java object deserialized from the HTTP response (a MarketDataSnapshot).

  • tf.input(): The original data the user sent (e.g., the objective and horizon).

  • tf.rawOutput(): The raw JSON body of the HTTP response, before any deserialization.

  • The return value: We combine them into a single InvestmentPrompt object. The engine automatically passes this object as the input to the next task: the AI Agent.
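To make the data shapes concrete, here is a self-contained sketch of what the lambda does, using local stand-ins for the lab's records and plain values in place of the Flow `tf`/`result` objects (no Quarkus Flow APIs involved — `bridge` is a hypothetical helper for illustration):

```java
import java.util.Map;

public class OutputAsSketch {

    // Local stand-ins for the lab's records
    record MarketDataSnapshot(String ticker, double price) {}

    record InvestmentPrompt(String ticker, String objective, String horizon, String marketDataJson) {}

    // Same combination the outputAs lambda performs: user intent + tool output
    static InvestmentPrompt bridge(Map<String, Object> input, MarketDataSnapshot result, String rawJson) {
        return new InvestmentPrompt(result.ticker(),
                input.get("objective").toString(),
                input.get("horizon").toString(),
                rawJson);
    }

    public static void main(String[] args) {
        // Plays the role of tf.input(): the original workflow input as a Map
        Map<String, Object> input = Map.of(
                "ticker", "CSU.TO",
                "objective", "High growth, low risk",
                "horizon", "10 years");
        // Plays the role of result (deserialized response) and tf.rawOutput() (raw body)
        MarketDataSnapshot result = new MarketDataSnapshot("CSU.TO", 3170.25);
        String rawJson = "{\"ticker\":\"CSU.TO\",\"price\":3170.25}";

        InvestmentPrompt prompt = bridge(input, result, rawJson);
        System.out.println(prompt);
    }
}
```

The key point: the lambda's return type (InvestmentPrompt) becomes the input type of the next task, which is why the agent signature in section 2.2 accepts exactly that record.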

5. Run and Visualize

  1. Start Quarkus in dev mode: ./mvnw quarkus:dev.

  2. Open the Quarkus Dev UI (http://localhost:8080/q/dev).

  3. Find Quarkus Flow → Workflows and select investment-memo.

  4. Paste a sample request into the Input pane:

    {
      "ticker": "CSU.TO",
      "objective": "High growth, low risk",
      "horizon": "10 years"
    }
  5. Click Start workflow.

In the Output tab, you will see a structured InvestmentMemo JSON object. If you check your terminal logs, you’ll see the HTTP call firing first, followed by the LLM prompt.
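The exact wording depends on the model you configured, but the memo will have this shape (illustrative output, not a real model response):

```json
{
  "summary": "CSU.TO trades at a premium multiple with a negligible dividend; suitable for growth-oriented holders comfortable with valuation risk.",
  "stance": "HOLD",
  "keyRisks": [
    "High valuation (P/E of 28.4)",
    "Very low dividend yield offers little downside cushion"
  ]
}
```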

6. Summary

You have successfully completed Lab 4! You learned:

  • How to use Quarkus LangChain4j agents as first-class workflow tasks.

  • How to use the "Agent + Tool" pattern to provide real-time data to an LLM.

  • How to use outputAs as a data shaper to bridge deterministic APIs and non-deterministic AI.

  • How to achieve Type Safety from the HTTP response all the way to the AI’s structured DTO.

Next up: Let’s see how Quarkus Flow can automate this even further. Proceed to Lab 5 – Agentic workflows via Annotations.