Orchestrating AI Workflows
Design, connect, and control multi-step AI pipelines using Flows—Timbal's flexible workflow engine.
What Are Flows?
A Flow is a programmable pipeline that lets you chain together steps—functions, LLMs, or even other flows—while controlling how data moves between them. Flows enable you to build complex, intelligent workflows with clear logic, memory, and branching.
Building Blocks of a Flow
Steps
Steps are the core units of work.
Each step can be:
- a function
- a `BaseStep`
- another flow
Steps process data, perform actions, and pass results onward.
Links
Links define the order and dependencies between steps.
They control how data and execution flow from one step to another, and can be used for tool calls, tool results, and conditional branching.
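As a rough sketch of how steps and links might be declared (the `Flow` import path and the `add_step` method name are assumptions not confirmed by this page; `add_link` and its flags are documented further down):

```python
from timbal import Flow  # import path is an assumption

flow = (
    Flow()
    .add_step(get_temp)        # a plain function used as a step
    .add_step(to_fahrenheit)
    .add_link("get_temp", "to_fahrenheit")  # run to_fahrenheit after get_temp
)
```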

Controlling Step Inputs
When building flows, you often need to control how each step receives its inputs. Timbal provides two powerful methods for this:
- Data Maps (
set_data_map
): Dynamically connect a step's input to the output of another step or a flow input. - Data Values (
set_data_value
): Set a static value or template for a step's input.
Data Maps
Purpose: Connect a step's input parameter to the output of another step, or to a flow input.
Syntax: `set_data_map("step_name.parameter", "source")`
- `step_name.parameter`: The input parameter of a step (e.g., `check.fahrenheit`).
- `source`: The data key to use as the value. This can be:
  - The output of another step (e.g., `to_fahrenheit.return`)
  - A flow input (e.g., `input_x`)
Example:
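The example code likely looked something like this sketch (`Flow` and `add_step` are assumed names; the `set_data_map` calls use only keys described in this section):

```python
flow = (
    Flow()
    .add_step(get_temp)
    .add_step(to_fahrenheit)
    .add_step(check)
    .set_data_map("to_fahrenheit.celsius", "get_temp.return")
    .set_data_map("check.fahrenheit", "to_fahrenheit.return")
)
```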
This means:
- The `celsius` parameter of the `to_fahrenheit` step receives the output of `get_temp`.
- The `fahrenheit` parameter of the `check` step receives the output of `to_fahrenheit`.
Data Values
Purpose: Set a static value or template for a step's input.
Syntax: `set_data_value("step_name.parameter", value)`
- `step_name.parameter`: The input parameter of a step (e.g., `check.threshold`).
- `value`: A constant (e.g., `86`), or a template string (e.g., `"{{step_1.return}} and {{step_2.return}}"`).
Example:
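A minimal sketch, using the `check.threshold` key from the syntax description above (assuming a `flow` object built as in the previous section):

```python
flow = flow.set_data_value("check.threshold", 86)  # constant value
# A template string could combine step outputs instead:
# flow.set_data_value("report.text", "{{step_1.return}} and {{step_2.return}}")
```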
This means:
- The `threshold` parameter of the `check` step will always be set to `86`.
Inputs and Outputs
Inputs and outputs in a flow are special cases of data mapping:
- Inputs: Use `.set_input("step.parameter", "input_name")` to specify that a step should receive its value from a flow input.
- Outputs: Use `.set_output("step.return", "result_name")` to specify which step's output is returned by the flow.
The output key will always be `.return` (e.g., `to_fahrenheit.return`), since it refers to the return value of the step.
Example:
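A sketch of these two calls, reusing the temperature step names from the earlier examples (the exact chaining style is an assumption):

```python
flow = (
    flow
    .set_input("to_fahrenheit.celsius", "input_celsius")
    .set_output("to_fahrenheit.return", "fahrenheit")
)
```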
This means:
- The flow expects an input called `input_celsius`.
- The output of `to_fahrenheit` will be available as `fahrenheit` in the flow's result.
Example: Temperature Alert Flow
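A plausible end-to-end sketch, assembled only from the methods introduced above (`Flow`, its import path, and `add_step` are assumed names, not confirmed by this page):

```python
from timbal import Flow  # import path is an assumption

def to_fahrenheit(celsius: float) -> float:
    return celsius * 9 / 5 + 32

def check(fahrenheit: float, threshold: float) -> str:
    return "ALERT: too hot!" if fahrenheit > threshold else "OK"

flow = (
    Flow()
    .add_step(to_fahrenheit)
    .add_step(check)
    .add_link("to_fahrenheit", "check")
    .set_data_map("check.fahrenheit", "to_fahrenheit.return")
    .set_data_value("check.threshold", 86)
    .set_input("to_fahrenheit.celsius", "input_celsius")
    .set_output("check.return", "alert_status")
)
```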
Dynamic Data with String Interpolation
Template strings let you combine and transform outputs from multiple steps.
This is especially useful for LLM prompts or merging results.
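The `{{...}}` placeholder mechanic can be illustrated in plain Python. This is an illustration of the idea only, not Timbal's internal implementation:

```python
import re

def interpolate(template: str, data: dict) -> str:
    """Replace each {{key}} placeholder with the matching value from data."""
    return re.sub(r"\{\{(.*?)\}\}", lambda m: str(data[m.group(1).strip()]), template)

data = {"step_1.return": "72F", "step_2.return": "sunny"}
prompt = interpolate("It is {{step_1.return}} and {{step_2.return}}.", data)
print(prompt)  # It is 72F and sunny.
```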
Integrating LLMs
You can add LLMs (Large Language Models) as steps in your flow using `.add_llm()`. LLMs can use memory, call tools, and be chained with other steps for advanced reasoning.
- Memory: Use the `memory_id` parameter to enable persistent context across runs.
- Tool Use: Connect LLMs to tools or functions using `.add_link(..., is_tool=True)` and `.add_link(..., is_tool_result=True)` for advanced workflows.
- Prompt Construction: Use string interpolation to dynamically build prompts from previous step outputs.
Suppose you want to fetch an email, then have an LLM summarize it:
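A hedged sketch of such a flow (the exact `add_llm` signature and the `prompt` parameter name are assumptions; `memory_id` and template interpolation are described above):

```python
flow = (
    Flow()
    .add_step(fetch_email)
    .add_llm("summarizer", memory_id="email_thread")  # signature is an assumption
    .set_data_value(
        "summarizer.prompt",
        "Summarize this email:\n\n{{fetch_email.return}}",
    )
    .set_output("summarizer.return", "summary")
)
```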
What's happening here?
- `fetch_email` retrieves the email text.
- The LLM step receives a prompt that includes the email content.
- The LLM generates a summary, which is returned as the flow output.
You can chain multiple steps, use memory for context, and connect LLMs to external tools for even more powerful workflows.
For more, see the Advanced documentation.
Enabling Memory and Finalizing Your Flow
To enable advanced features like persistent memory, you need to finalize your flow using the .compile() method.
Compiling your flow validates its structure and (optionally) attaches a state saver for memory.
Why compile?
Compiling ensures your flow is ready for production, with all steps, data maps, and memory configured correctly.
How to enable memory?
Pass a state saver (like `InMemorySaver`) to `.compile()` to persist context across runs.
See [State Savers] for more information.
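A sketch of compiling with memory enabled (`InMemorySaver` and `.compile()` are named above, but the import path and the `state_saver` parameter name are assumptions):

```python
from timbal.state.savers import InMemorySaver  # import path is an assumption

flow = flow.compile(state_saver=InMemorySaver())
```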
How to Run Your Flow
Once your flow is defined and compiled, you can execute it in two main ways:
Get the final output:
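For example (the method name `complete` and its `await` semantics are assumptions, not confirmed by this page):

```python
result = await flow.complete(input_celsius=25)
print(result["fahrenheit"])
```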
Stream events as they happen:
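For example (the method name `run` and the async-iterator pattern are assumptions, not confirmed by this page):

```python
async for event in flow.run(input_celsius=25):
    print(event)
```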
For more, see the Flows documentation, Advanced Flow Concepts, and Examples.