Java DSL cheatsheet
Quarkus Flow uses the CNCF Workflow Java Fluent DSL to define workflows in code. This page shows:
- how to define a workflow class,
- the complete set of chainable task providers from the DSL (the ones you pass directly into `workflow(…).tasks(…)`), and
- how to shape data with transformations: `inputFrom`, `outputAs`, and `exportAs`.
Source of truth for the DSL classes: the `io.serverlessworkflow.fluent.func` packages of the CNCF Serverless Workflow Java SDK.
Define a workflow class (required)
Workflows are discovered at build time from CDI beans that extend io.quarkiverse.flow.Flow and override descriptor().
package org.acme;

import static io.serverlessworkflow.fluent.func.dsl.FuncDSL.set;

import java.util.Map;

import jakarta.enterprise.context.ApplicationScoped;

import io.quarkiverse.flow.Flow;
import io.serverlessworkflow.api.types.Workflow;
import io.serverlessworkflow.fluent.func.FuncWorkflowBuilder;

@ApplicationScoped
public class HelloWorkflow extends Flow {

    @Override
    public Workflow descriptor() {
        return FuncWorkflowBuilder.workflow("hello")
            // set the workflow context from a map carrying a message;
            // it serializes to JSON as { "message": "hello world!" }
            .tasks(set(Map.of("message", "hello world!")))
            .build();
    }
}
Setup (imports)
import io.serverlessworkflow.api.types.Workflow;
import io.serverlessworkflow.fluent.func.spec.FuncWorkflowBuilder;
// Static imports recommended for brevity:
import static io.serverlessworkflow.fluent.func.dsl.FuncDSL.*;
import static io.serverlessworkflow.fluent.func.spec.FuncWorkflowBuilder.workflow;
Complete task providers (chainable)
All helpers below return a chainable provider/step (e.g., FuncTaskConfigurer, FuncCallStep, EmitStep, ListenStep, ConsumeStep, FuncCallHttpStep, FuncCallOpenAPIStep) so you can pass them straight into .tasks(…):
Set data:
- set/merge into the data context using a jq-style JSON expression, or
- set/merge from a map (useful in Java without inline JSON).

Java function calls (each with an unnamed form and a named form that sets an explicit task name):
- call a Java function with the input type inferred, or with an explicit input type;
- call with access to the workflow context;
- call with access to the workflow instance id;
- call that can see both the workflow context and the task context;
- call with a stable unique id.

Agent calls:
- sugar for “agent-style” calls that need a memory id (the unique id described above), plus a named variant.

Side effects:
- fire-and-forget side effect: no data is exported, only effects (plus a named variant).

Emit events (each with a named variant):
- low-level emit using the CloudEvent builder;
- emit an event whose body is produced by a custom encoder;
- emit a custom bytes payload;
- emit a JSON CloudEvent for a POJO input (content type `application/json`).

Listen:
- listen using an event-consumption spec such as `toOne(…)` (plus a named variant).

Switch:
- low-level switch built from a builder consumer (plus a named variant);
- switch composed from `cases(…)` (plus a named variant);
- typed single-case jump, or a jq-expression single-case jump;
- typed single case plus a default task or a default flow directive;
- jq single case plus a default task or a directive.

For-each:
- iterate over a collection computed from the current input;
- iterate over a constant collection, or a constant list (convenience).

Grouping:
- group multiple steps/configurers into a single provider.

HTTP calls:
- start a fluent HTTP call spec (method, headers, body, etc.) and pass it directly to `.tasks(…)` (plus a named variant);
- HTTP spec preconfigured with an endpoint expression plus auth, or with a concrete URI plus auth;
- convenience helpers for common HTTP verbs such as `get(…)` and `post(…)` (each with a named variant);
- attach an HTTP spec as a task, unnamed (using its internal name) or named (overriding the internal name);
- low-level HTTP call based on a builder configurer (plus a named variant).

OpenAPI calls:
- start a fluent OpenAPI call spec (document, operation, params, auth, etc.), plus a named variant;
- attach an OpenAPI spec as a task, unnamed or named (a name overrides the internal one);
- low-level OpenAPI call based on a builder configurer (plus a named variant).
Helpers that build specs for emit/listen/switch/HTTP/OpenAPI but are not task providers themselves:

- Events & listen: `event(…)`, `eventJson(…)`, `eventBytes(…)`, `to()`, `toOne(…)`, `toAny(…)`, `toAll(…)`
- Switch helpers: `cases(…)`, `caseOf(…)`, `caseDefault(…)`

You pass the resulting steps/specs (e.g. `listen(…)`, `emitJson(…)`, `openapi()`, `http()`, `get(…)`, `post(…)`) into `.tasks(…)`.
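To make the list concrete, here is a minimal sketch chaining a few of these providers in one `.tasks(…)` call. The `greet` function, the `Person` record, and the workflow name are illustrative; the DSL entry points (`set`, `function`, `consume`) are the ones documented above:

```java
import static io.serverlessworkflow.fluent.func.dsl.FuncDSL.consume;
import static io.serverlessworkflow.fluent.func.dsl.FuncDSL.function;
import static io.serverlessworkflow.fluent.func.dsl.FuncDSL.set;

import java.util.Map;

import io.serverlessworkflow.api.types.Workflow;
import io.serverlessworkflow.fluent.func.FuncWorkflowBuilder;

public class ProvidersSketch {

    // illustrative domain type; Jackson maps the JSON context onto it
    record Person(String name) {}

    // illustrative Java function called by the workflow
    static String greet(Person p) {
        return "hello " + p.name();
    }

    public static Workflow build() {
        return FuncWorkflowBuilder.workflow("providers-sketch")
            .tasks(
                // seed the workflow context: { "name": "Ada" }
                set(Map.of("name", "Ada")),
                // call a Java function with an explicit input type
                function(ProvidersSketch::greet, Person.class),
                // fire-and-forget side effect: log, export nothing
                consume("logIt", (String s) -> System.out.println(s), String.class)
            )
            .build();
    }
}
```

The same chain can be extended with `emitJson`, `listen`, or `switchWhenOrElse` steps exactly as in the end-to-end example later on this page.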
Data flow and transformations
Quarkus Flow follows the Serverless Workflow Specification data-flow model: every step can shape what it extracts as input, how it transforms its direct result, and what it merges back into the global context.
This section is only a quick reference. For a full explanation, context-aware Java functions, and more examples, see Data flow and context management.
inputFrom(…) — what the task extracts from the context
Choose which slice of the global workflow context a task receives as input.
- jq: `function(pricing::quote, QuoteRequest.class).inputFrom("$.cart.quoteRequest")`
- Java (with domain type): `function(pricing::quote, QuoteRequest.class).inputFrom((MyCheckout ctx) -> ctx.quoteRequest(), MyCheckout.class)`
outputAs(…) — transforming the direct task result
Shape the direct result returned by the task before it is merged back into the workflow context.
- jq: `function(nlp::classify, Text.class).outputAs("{ sentiment: ., reviewed: false }")`
- Java: `agent("investmentAnalyst", analyst::analyse, InvestmentMemo.class).outputAs(memo -> Map.of("memo", memo), InvestmentMemo.class)`
exportAs(…) — merging back into the global context
Dictate how the task’s (optionally transformed) output is merged into the global workflow context for downstream tasks to use.
- Java: `agent("draftNewsletter", drafter::draft, Draft.class).exportAs((draft, wfContext) -> { var current = (NewsletterContext) wfContext.currentData(); return current.withDraft(draft); }, Draft.class)`
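All three hooks can be combined on a single task. A sketch with a local `quote` function standing in for the `pricing::quote` service used in the fragments above (the record types and jq paths are illustrative):

```java
import static io.serverlessworkflow.fluent.func.dsl.FuncDSL.function;

import io.serverlessworkflow.api.types.Workflow;
import io.serverlessworkflow.fluent.func.FuncWorkflowBuilder;

public class TransformSketch {

    record QuoteRequest(String sku, int qty) {}
    record Quote(double total) {}

    // stands in for the pricing::quote service in the fragments above
    static Quote quote(QuoteRequest req) {
        return new Quote(req.qty() * 9.99);
    }

    public static Workflow build() {
        return FuncWorkflowBuilder.workflow("quote-flow")
            .tasks(
                function(TransformSketch::quote, QuoteRequest.class)
                    .inputFrom("$.cart.quoteRequest") // slice of the context the task sees
                    .outputAs("{ quote: . }")         // reshape the direct result
                    .exportAs("{ pricing: . }")       // merge it back under $.pricing
            )
            .build();
    }
}
```

Reading order matters: `inputFrom` runs before the task, `outputAs` on its direct result, and `exportAs` when merging into the global context.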
End-to-end example (agentic + HITL + transforms)
Workflow w = workflow("intelligent-newsletter")
.tasks(
// 1) Draft Agent: Extracts the prompt, generates draft, merges back to context
agent("draftAgent", drafterAgent::draft, String.class)
.inputFrom("$.seedPrompt")
.exportAs("{ draft: . }"),
// 2) Critique Agent: Extracts only the draft, returns a composite status
agent("criticAgent", criticAgent::critique, String.class)
.inputFrom("$.draft")
.exportAs(r -> Map.of("review", r, "status", r.needsRevision() ? "REVISION" : "OK")),
// 3) Emit review request for human oversight
emitJson("org.acme.newsletter.review.required", CriticAgentReview.class),
// 4) Wait for human review; transform the raw event collection into a single result
listen("waitHumanReview", toOne("org.acme.newsletter.review.done"))
.outputAs((java.util.Collection<Object> c) -> c.iterator().next()),
// 5) Branch: revision loop or final send based on human input
switchWhenOrElse(
(HumanReview h) -> ReviewStatus.NEEDS_REVISION.equals(h.status()),
"draftAgent",
"sendNewsletter",
HumanReview.class
),
// 6) Final side-effect
consume("sendNewsletter",
(HumanReview reviewedDraft) ->
mailService.send("subscribers@acme.finance.org", "Weekly Newsletter", reviewedDraft.draft()),
HumanReview.class
)
)
.build();
Tips
- Prefer static imports: `workflow`, `set`, `function`, `withContext`, `withFilter`, `withUniqueId`, `agent`, `emitJson`, `listen`, `switchWhenOrElse`, `consume`, `to`, `event`, `http`, `get`, `post`, `openapi`, `call`.
- Name tasks you branch to: stable task names make `switchWhen*` targets explicit and work well with `withUniqueId`/`agent` memory ids.
- Keep transformations close to the step that needs them (`inputFrom`, `exportAs`, `outputAs`) for readability.
- HTTP/OpenAPI: you can either:
  - use fluent specs directly as steps: `tasks(http().GET().endpoint("…"))`, or
  - wrap them with `call("name", http().GET()…)` / `call("name", openapi().operation("…"))` to force an explicit task name.
- Remember: the workflow context is often JSON; your domain objects are (de)serialized via Jackson, so you can keep typed payloads in your steps while still using jq for quick projections.
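The two HTTP styles from the tips can be sketched side by side. The URL and the `fetchQuotes` task name are placeholders; `http()` and `call(…)` are the DSL entry points named above:

```java
import static io.serverlessworkflow.fluent.func.dsl.FuncDSL.call;
import static io.serverlessworkflow.fluent.func.dsl.FuncDSL.http;

import io.serverlessworkflow.api.types.Workflow;
import io.serverlessworkflow.fluent.func.FuncWorkflowBuilder;

public class HttpTipSketch {

    public static Workflow build() {
        return FuncWorkflowBuilder.workflow("http-tip")
            .tasks(
                // fluent spec used directly as a step (implicit task name)
                http().GET().endpoint("https://api.example.com/quotes"),
                // same call wrapped with call(…) to force an explicit task name
                call("fetchQuotes",
                     http().GET().endpoint("https://api.example.com/quotes"))
            )
            .build();
    }
}
```

Use the wrapped form whenever a `switchWhen*` target or a memory id needs to reference the task by name.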