Lab 3 – Events with YAML workflows
In this lab, you will move away from the Java DSL and build a fully event-driven orchestration using YAML. You will configure Quarkus Flow to listen for incoming Kafka messages and trigger a transformation process automatically.
You will:
- Define a Serverless Workflow in YAML using the 1.0.0 specification.
- Wire the workflow to SmallRye Reactive Messaging.
- Start the workflow automatically via an incoming CloudEvent.
- Manually trigger the same workflow from the Flow Dev UI to see the dual-start capability.
1. Add Messaging to the classpath
To enable event-driven capabilities, you need a messaging connector. For this lab, we will use the Kafka connector. Thanks to Quarkus Dev Services, you don’t need a local Kafka instance; Quarkus will spin one up for you in a container automatically.
Add the following to your pom.xml:
```xml
<dependency>
    <groupId>io.quarkus</groupId>
    <artifactId>quarkus-messaging-kafka</artifactId>
</dependency>
```
Next, enable the default Flow messaging bridge in your application.properties. This creates the default flow-in and flow-out channels.
```properties
quarkus.flow.messaging.defaults-enabled=true
```
2. Create the YAML workflow
Create a new file at src/main/flow/uppercase-message.yaml. This workflow acts as a "Transformer" that listens for a specific event type, modifies the data using a jq expression, and emits a result event.
```yaml
document:
  dsl: '1.0.0'
  namespace: demo
  name: uppercase-message
  version: '0.1.0'
# This section triggers the workflow automatically when a matching event arrives
schedule:
  on:
    one:
      with:
        type: demo.message.incoming
do:
  - setUpper:
      set:
        message: "${ .message | ascii_upcase }"
  - emitResult:
      emit:
        event:
          with:
            type: demo.message.uppercased
            source: demo/workshop
            data:
              message: '${ .message }'
```
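The expression in the setUpper task is plain jq, so you can preview its effect from a terminal before running the workflow (a quick sketch, assuming jq is installed locally):

```shell
# Apply the same jq program the setUpper task uses to a sample payload
echo '{"message": "hello from events"}' | jq -c '{message: (.message | ascii_upcase)}'
# → {"message":"HELLO FROM EVENTS"}
```

This is handy for iterating on the expression without restarting the application.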
3. Map channels to Kafka topics
Now, connect the engine’s internal channels to actual Kafka topics. Add these standard MicroProfile Reactive Messaging properties to your application.properties:
```properties
# Inbound (Listen for 'demo.message.incoming')
mp.messaging.incoming.flow-in.connector=smallrye-kafka
mp.messaging.incoming.flow-in.topic=flow-in
mp.messaging.incoming.flow-in.value.deserializer=org.apache.kafka.common.serialization.ByteArrayDeserializer
mp.messaging.incoming.flow-in.key.deserializer=org.apache.kafka.common.serialization.StringDeserializer

# Outbound (Emit 'demo.message.uppercased')
mp.messaging.outgoing.flow-out.connector=smallrye-kafka
mp.messaging.outgoing.flow-out.topic=flow-out
mp.messaging.outgoing.flow-out.value.serializer=org.apache.kafka.common.serialization.ByteArraySerializer
mp.messaging.outgoing.flow-out.key.serializer=org.apache.kafka.common.serialization.StringSerializer
```
4. Drive the workflow
Run your application in dev mode: ./mvnw quarkus:dev. You can now trigger the workflow in two different ways.
4.1 Trigger via an Event (Asynchronous)
If you have a Kafka producer, you can send a structured CloudEvent to the flow-in topic.
```json
{
  "specversion": "1.0",
  "type": "demo.message.incoming",
  "source": "demo/workshop",
  "id": "1",
  "datacontenttype": "application/json",
  "data": {
    "message": "hello from events"
  }
}
```
The engine will see the matching type, start the workflow, uppercase the message, and push a new event to flow-out.
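If you don't have a producer wired up yet, one lightweight option is to build the payload with jq and publish it from a terminal. This is only a sketch: it assumes jq and kcat are installed, and the broker address is the one Quarkus Dev Services prints at startup (the `localhost:9092` below is a placeholder).

```shell
# Build the CloudEvent payload shown above with jq
payload=$(jq -nc '{specversion:"1.0", type:"demo.message.incoming", source:"demo/workshop", id:"1", datacontenttype:"application/json", data:{message:"hello from events"}}')
echo "$payload"
# Publish it to the flow-in topic (uncomment and adjust the broker address):
# echo "$payload" | kcat -P -b localhost:9092 -t flow-in
```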
4.2 Trigger via the Flow Dev UI (Synchronous)
Even if you don’t have a Kafka producer ready, you can test the logic directly:
- Open the Quarkus Dev UI (http://localhost:8080/q/dev).
- Navigate to Flow → Workflows and select uppercase-message.
- In the Input pane, paste the following JSON and click Start workflow:

```json
{ "message": "hello from devui" }
```
The workflow runs exactly the same way. It executes the do sequence, and because of the emit task, it will still send a CloudEvent to the flow-out Kafka topic.
5. Observe the results
With dev mode running, you can verify the output in the Kafka Dev UI:
- In the Quarkus Dev UI, find the Apache Kafka card.
- Click on Topics and select the flow-out topic.
- You should see a message with the demo.message.uppercased type and the data value "HELLO FROM DEVUI".
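You can also inspect the topic from a terminal. The sketch below assumes kcat and jq are installed and that the bridge emits structured-mode JSON CloudEvents (the inline record is a sample of what a consumed message should resemble, not captured output):

```shell
# Tail one record from flow-out (uncomment; use the broker address printed by Dev Services):
# kcat -C -b localhost:9092 -t flow-out -c 1 -e
# A consumed record should resemble this CloudEvent; jq extracts the interesting fields:
echo '{"specversion":"1.0","type":"demo.message.uppercased","source":"demo/workshop","data":{"message":"HELLO FROM DEVUI"}}' \
  | jq -r '.type, .data.message'
# → demo.message.uppercased
#   HELLO FROM DEVUI
```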
6. Summary
You have successfully completed Lab 3! You learned:
- How to define polyglot workflows in YAML that the engine compiles at build time.
- How to use the schedule.on trigger to build "Event-to-Workflow" reactive patterns.
- How to map internal Flow channels to Kafka topics using MicroProfile configuration.
- That the Flow Dev UI allows you to test event-emitting logic without needing an external event producer.
Next up: Let’s add some intelligence to our orchestration. Proceed to Lab 4 – Agentic workflows.