Agentic AI Topology and Internals
While you can orchestrate AI agents manually using the Java DSL, Quarkus Flow also provides deep, native integration with the LangChain4j Agentic Workflow API (e.g., @SequenceAgent, @ParallelAgent).
This page explains the internal architecture of that integration: how Quarkus Flow compiles these external annotations into standard workflows, and how the engine bridges the gap between AI memory and workflow state.
Note: For practical instructions on building these workflows, see Orchestrate LangChain4j Agents and Patterns.
1. The Build-Time Compiler
When you add the quarkus-flow-langchain4j extension to your project, Quarkus Flow acts as an Ahead-of-Time (AoT) compiler for your AI patterns.
It does not run a separate "AI Engine" at runtime. Instead, during the Quarkus build phase, it translates LangChain4j semantics directly into the CNCF Serverless Workflow vocabulary.
The Compilation Process
1. Scanning: The extension scans your `@RegisterAiService` interfaces for methods annotated with Agentic routing annotations (such as `@SequenceAgent` or `@ParallelAgent`).
2. Translation: For each annotated method, it programmatically generates a standard `WorkflowDefinition` bean:
   - A `@SequenceAgent` is translated into a linear sequence of standard `call` tasks.
   - A `@ParallelAgent` is translated into a `fork`/`join` state topology.
3. Registration: The generated definitions are injected into the Workflow Registry.
Because of this build-time translation, the generated AI workflows benefit from the same runtime execution speed, tracing, and Dev UI visualizers as a workflow you wrote manually using the Java DSL.
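The translation step above can be sketched conceptually in plain Java. This is a simplified model of the idea only; the method and task names are invented for illustration and do not reflect the real Quarkus Flow compiler internals:

```java
import java.util.ArrayList;
import java.util.List;

// Conceptual model of the build-time translation (names are illustrative,
// not the real Quarkus Flow compiler API).
public class AgenticCompilerSketch {

    // A @SequenceAgent over N sub-agents compiles to N linear "call" tasks.
    static List<String> compileSequence(List<String> subAgents) {
        List<String> tasks = new ArrayList<>();
        for (String agent : subAgents) {
            tasks.add("call:" + agent);
        }
        return tasks;
    }

    // A @ParallelAgent compiles to a fork whose branches run concurrently
    // and join before the next task.
    static List<String> compileParallel(List<String> subAgents) {
        return List.of("fork:" + String.join("|", subAgents), "join");
    }

    public static void main(String[] args) {
        System.out.println(compileSequence(List.of("draft", "review")));
        // [call:draft, call:review]
        System.out.println(compileParallel(List.of("draft", "review")));
        // [fork:draft|review, join]
    }
}
```

The key point is that the output on the right-hand side is plain CNCF Serverless Workflow vocabulary: once the translation has run at build time, nothing AI-specific remains in the execution plan.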
2. The Memory Bridge (AgenticScope vs. Workflow Context)
The biggest architectural challenge when integrating an external AI framework with a workflow engine is state management.
- LangChain4j relies on an `AgenticScope` object to hold input parameters, intermediate LLM generations, and conversation history.
- Quarkus Flow relies on a structured JSON document (the Global Context) that is passed between tasks.
How the Bridge Works
To prevent data duplication and ensure state consistency, Quarkus Flow implements an AgenticScope-aware workflow data model.
When a generated AI workflow executes:
1. The engine intercepts the underlying `AgenticScope` created by LangChain4j.
2. It dynamically maps `AgenticScope.state()` directly to the Workflow Data Context.
3. As a result, standard tasks (and `jq` expressions) can seamlessly read and write variables directly in the AI's memory scope, and the AI agents can read data injected by standard HTTP or Messaging tasks, all without manual data marshaling.
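The essence of the bridge is that both sides share one backing store rather than copying state back and forth. The sketch below illustrates that idea with plain maps; the field names are invented stand-ins, not the actual Quarkus Flow data model:

```java
import java.util.HashMap;
import java.util.Map;

// Simplified sketch of the memory bridge: the workflow data context is
// backed directly by the agent's state map, so writes from either side
// are immediately visible to the other (illustrative, not the real
// implementation).
public class MemoryBridgeSketch {
    // Stands in for AgenticScope.state().
    final Map<String, Object> agenticState = new HashMap<>();
    // The bridge: the workflow context is the SAME backing map, not a copy.
    final Map<String, Object> workflowContext = agenticState;

    public static void main(String[] args) {
        MemoryBridgeSketch bridge = new MemoryBridgeSketch();

        bridge.workflowContext.put("customerId", 42);     // written by a standard HTTP task
        bridge.agenticState.put("summary", "draft text"); // written by an AI agent

        // Each side observes the other's writes without marshaling.
        System.out.println(bridge.agenticState.get("customerId"));  // 42
        System.out.println(bridge.workflowContext.get("summary"));  // draft text
    }
}
```

Because there is only one store, there is no synchronization step that could drift: a `jq` expression reading the context and an agent reading its scope are, by construction, reading the same data.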
3. Runtime Execution Semantics
When a generated AI workflow is triggered (either via the Dev UI or by an incoming event), Quarkus Flow must preserve the specific execution semantics required by LangChain4j (such as triggering specific output mappers or respecting AgenticScope lifecycles).
To achieve this, Quarkus Flow uses a Bean Invoker strategy. Instead of interpreting the compiled `WorkflowDefinition` natively, the engine executes the workflow through the CDI proxy of the annotated agentic interface method.
This ensures that LangChain4j remains the ultimate authority on how the prompt is formed and how the LLM is called, while Quarkus Flow remains the authority on observability, retries, and task transitions.
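This division of responsibility can be sketched as a simple delegation pattern. The names below are illustrative (there is no `AgentInvoker` type in the source); the point is only that the engine wraps the proxied agent method rather than reimplementing its behavior:

```java
import java.util.function.Function;

// Illustrative sketch of the Bean Invoker strategy: the engine delegates a
// task to the (proxied) agent method instead of forming the prompt or
// calling the LLM itself. Names are invented for illustration.
public class BeanInvokerSketch {
    interface AgentInvoker {
        String invoke(String input);
    }

    // Stands in for the CDI client proxy around the annotated agent method.
    static AgentInvoker proxyFor(Function<String, String> agentMethod) {
        return input -> {
            // Engine-side concerns (tracing, retries, task transitions)
            // would wrap the call here; prompt formation and the LLM call
            // stay inside agentMethod, i.e. with LangChain4j.
            return agentMethod.apply(input);
        };
    }

    public static void main(String[] args) {
        AgentInvoker invoker = proxyFor(prompt -> "summary of " + prompt);
        System.out.println(invoker.invoke("report")); // summary of report
    }
}
```

The design choice this models: by invoking through the proxy, any LangChain4j behavior attached to the method (output mappers, scope lifecycle) runs untouched, while the engine still observes every invocation boundary.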
See also
- Orchestrate LangChain4j Agents and Patterns — how to use these features in practice.
- Engine Architecture — the broader build-time paradigm of Quarkus Flow.