Workflow #

Event-driven orchestrator to define and run application flows using typed steps.

A Workflow is composed of @step-decorated callables that accept and emit typed Events. Steps can be declared as instance methods or as free functions registered via the decorator.

Key features:

  • Validation of step signatures and event graph before running
  • Typed start/stop events
  • Streaming of intermediate events
  • Optional human-in-the-loop events
  • Retry policies per step
  • Resource injection

Examples:

Basic usage:

```python
from workflows import Workflow, step
from workflows.events import StartEvent, StopEvent

class MyFlow(Workflow):
    @step
    async def start(self, ev: StartEvent) -> StopEvent:
        return StopEvent(result="done")

result = await MyFlow(timeout=60).run(topic="Pirates")
```

Custom start/stop events and streaming:

```python
handler = MyFlow().run()
async for ev in handler.stream_events():
    ...
result = await handler
```
See Also

  • step
  • Event
  • Context
  • WorkflowHandler
  • RetryPolicy
Source code in workflows/workflow.py
class Workflow(metaclass=WorkflowMeta):
    """
    Event-driven orchestrator to define and run application flows using typed steps.

    A `Workflow` is composed of `@step`-decorated callables that accept and emit
    typed [Event][workflows.events.Event]s. Steps can be declared as instance
    methods or as free functions registered via the decorator.

    Key features:
    - Validation of step signatures and event graph before running
    - Typed start/stop events
    - Streaming of intermediate events
    - Optional human-in-the-loop events
    - Retry policies per step
    - Resource injection

    Examples:
        Basic usage:

        ```python
        from workflows import Workflow, step
        from workflows.events import StartEvent, StopEvent

        class MyFlow(Workflow):
            @step
            async def start(self, ev: StartEvent) -> StopEvent:
                return StopEvent(result="done")

        result = await MyFlow(timeout=60).run(topic="Pirates")
        ```

        Custom start/stop events and streaming:

        ```python
        handler = MyFlow().run()
        async for ev in handler.stream_events():
            ...
        result = await handler
        ```

    See Also:
        - [step][workflows.decorators.step]
        - [Event][workflows.events.Event]
        - [Context][workflows.context.context.Context]
        - [WorkflowHandler][workflows.handler.WorkflowHandler]
        - [RetryPolicy][workflows.retry_policy.RetryPolicy]
    """

    # Populated by the metaclass; declared here for type checkers.
    _step_functions: dict[str, StepFunction]
    _step_functions_version: int = 0

    _runtime: Runtime
    _workflow_name: str | None

    def __init__(
        self,
        timeout: float | None = 45.0,
        disable_validation: bool = False,
        verbose: bool = False,
        resource_manager: ResourceManager | None = None,
        num_concurrent_runs: int | None = None,
        runtime: Runtime | None = None,
        workflow_name: str | None = None,
        skip_graph_checks: set[WorkflowGraphCheck] | None = None,
    ) -> None:
        """
        Initialize a workflow instance.

        Args:
            timeout (float | None): Max seconds to wait for completion. `None`
                disables the timeout.
            disable_validation (bool): Skip pre-run validation of the event graph
                (not recommended).
            verbose (bool): If True, print step activity.
            resource_manager (ResourceManager | None): Custom resource manager
                for dependency injection.
            num_concurrent_runs (int | None): Limit on concurrent `run()` calls.
            runtime (Runtime | None): Optional runtime to use for this workflow.
                If not provided, uses the current context-scoped runtime or
                falls back to basic_runtime.
            workflow_name (str | None): Optional explicit name for this workflow.
                If not provided, a module-qualified name is computed from
                the class's `__module__` and `__qualname__` attributes.
            skip_graph_checks (set[str] | None): Optional set of graph validation
                checks to skip (e.g. "reachability", "terminal_event"). Use to
                allow intentional patterns that would otherwise fail validation.
        """
        # Inline imports: every module below imports ``Workflow`` transitively,
        # so deferring to call time breaks the cycle.
        from workflows.plugins._context import get_current_runtime
        from workflows.runtime.verbose import VerboseDecorator

        from .representation.validate import (
            _collect_events,
            _ensure_start_event_class,
            _ensure_stop_event_class,
        )

        # Configuration
        self._timeout = timeout
        self._verbose = verbose
        self._disable_validation = disable_validation
        self._num_concurrent_runs = num_concurrent_runs
        # Store explicit name (None means use computed name)
        self._workflow_name = workflow_name

        step_configs = self._step_configs()
        cls_name = self.__class__.__name__
        # Detect StartEvent issues before StopEvent for clearer guidance
        self._start_event_class = _ensure_start_event_class(step_configs, cls_name)
        self._stop_event_class = _ensure_stop_event_class(step_configs, cls_name)
        # Populated by _validate(); empty until a successful validation runs.
        self._catch_error_handlers: dict[str, CatchErrorHandler] = {}
        self._handler_for_step: dict[str, str] = {}
        self._events = _collect_events(step_configs)
        # Resource management
        self._resource_manager = resource_manager or ResourceManager()
        # Instrumentation
        self._dispatcher = dispatcher
        self._runtime_locked = False
        # Validation cache: set after first successful _validate(); skip re-validation on run() until invalidated.
        # _validated_version tracks which _step_functions_version was validated so add_step() invalidates the cache.
        self._validation_result: bool | None = None
        self._validated_version: int = -1
        checks = skip_graph_checks or set()
        valid_checks = set(get_args(WorkflowGraphCheck))
        unknown = checks - valid_checks
        if unknown:
            raise WorkflowValidationError(
                f"Unknown graph check names: {', '.join(sorted(unknown))}. "
                f"Valid names are: {', '.join(sorted(valid_checks))}"
            )
        self._skip_graph_checks: set[WorkflowGraphCheck] = checks

        # Runtime registration: explicit > context-scoped > basic_runtime
        self._runtime = runtime if runtime is not None else get_current_runtime()
        if self._verbose:
            self._runtime = VerboseDecorator(self._runtime)
        # Register with runtime for tracking (no-op for BasicRuntime)
        self._runtime.track_workflow(self)

    def _validate_valid_step_message(self, step: str, message: Event) -> None:
        """Validate that the step exists and accepts the given event's type."""
        if step not in self._get_steps():
            raise WorkflowRuntimeError(f"Step {step} does not exist")

        step_func = self._get_steps()[step]
        step_config = step_func._step_config
        if type(message) not in step_config.accepted_events:
            raise WorkflowRuntimeError(
                f"Step {step} does not accept event of type {type(message)}"
            )

    @property
    def runtime(self) -> Runtime:
        """The runtime this workflow is registered with."""
        return self._runtime

    def _switch_runtime(self, new_runtime: Runtime) -> None:
        if new_runtime is self._runtime:
            return
        if self._runtime_locked:
            raise RuntimeError(
                "Cannot reassign runtime after workflow has been launched"
            )
        old = self._runtime
        old.untrack_workflow(self)
        self._runtime = new_runtime
        new_runtime.track_workflow(self)

    @property
    def workflow_name(self) -> str:
        """
        The workflow name.

        If an explicit name was provided at construction, returns that.
        Otherwise, returns a module-qualified name based on the class's
        __module__ and __qualname__ attributes.

        Examples:
            - Explicit: `Workflow(workflow_name="my-workflow")` → `"my-workflow"`
            - Module-level class: `"mymodule.MyWorkflow"`
            - Nested class: `"mymodule.Outer.Inner"`
            - Function-scoped: `"mymodule.func.<locals>.LocalWorkflow"`
        """
        if self._workflow_name is not None:
            return self._workflow_name
        cls = self.__class__
        return f"{cls.__module__}.{cls.__qualname__}"

    def _switch_workflow_name(self, name: str) -> None:
        if self._runtime_locked and name != self._workflow_name:
            raise RuntimeError(
                "Cannot change workflow_name after workflow has been launched"
            )
        self._workflow_name = name

    def _step_configs(self) -> dict[str, StepConfig]:
        """Return ``{step_name: StepConfig}`` for every registered step."""
        return {name: func._step_config for name, func in self._get_steps().items()}

    @property
    def start_event_class(self) -> type[StartEvent]:
        """The `StartEvent` subclass accepted by this workflow.

        Determined by inspecting step input types.
        """
        return self._start_event_class

    @property
    def events(self) -> list[type[Event]]:
        """Returns all known events emitted by this workflow.

        Determined by inspecting step input/output types.
        """
        return self._events

    @property
    def stop_event_class(self) -> type[RunResultT]:
        """The `StopEvent` subclass produced by this workflow.

        Determined by inspecting step return annotations.
        """
        return self._stop_event_class

    @classmethod
    def _get_steps_from_class(cls) -> dict[str, StepFunction]:
        """Returns all the steps, whether defined as methods or free functions."""
        return {**get_steps_from_class(cls), **cls._step_functions}

    @classmethod
    def add_step(cls, func: StepFunction) -> None:
        """
        Adds a free function as a step for this workflow class.

        It raises an exception if a step with the same name was already added to the workflow.
        """
        step_config: StepConfig | None = getattr(func, "_step_config", None)
        if not step_config:
            msg = f"Step function {func.__name__} is missing the `@step` decorator."
            raise WorkflowValidationError(msg)

        if func.__name__ in cls._get_steps_from_class():
            msg = f"A step {func.__name__} is already part of this workflow, please choose another name."
            raise WorkflowValidationError(msg)

        cls._step_functions[func.__name__] = func
        cls._step_functions_version += 1

    def _get_steps(self) -> dict[str, StepFunction]:
        """Returns all the steps, whether defined as methods or free functions."""
        return {**get_steps_from_instance(self), **self.__class__._step_functions}

    def _get_start_event_instance(
        self, start_event: StartEvent | None, **kwargs: Any
    ) -> StartEvent:
        if start_event is not None:
            # start_event was used wrong
            if not isinstance(start_event, StartEvent):
                msg = "The 'start_event' argument must be an instance of 'StartEvent'."
                raise ValueError(msg)

            # start_event is ok but point out that additional kwargs will be ignored in this case
            if kwargs:
                msg = (
                    "Keyword arguments are not supported when 'run()' is invoked with the 'start_event' parameter."
                    f" These keyword arguments will be ignored: {kwargs}"
                )
                logger.warning(msg)
            return start_event

        # Old style start event creation, with kwargs used to create an instance of self._start_event_class
        try:
            return self._start_event_class(**kwargs)
        except ValidationError as e:
            ev_name = self._start_event_class.__name__
            msg = f"Failed creating a start event of type '{ev_name}' with the keyword arguments: {kwargs}"
            logger.debug(e)
            raise WorkflowRuntimeError(msg) from e

    def run(
        self,
        ctx: Context | None = None,
        start_event: StartEvent | None = None,
        **kwargs: Any,
    ) -> WorkflowHandler:
        """Run the workflow and return a handler for results and streaming.

        This schedules the workflow execution in the background and returns a
        [WorkflowHandler][workflows.handler.WorkflowHandler] that can be awaited
        for the final result or used to stream intermediate events.

        You may pass either a concrete `start_event` instance or keyword
        arguments that will be used to construct the inferred
        [StartEvent][workflows.events.StartEvent] subclass.

        Args:
            ctx (Context | None): Optional context to resume or share state
                across runs. If omitted, a fresh context is created.
            start_event (StartEvent | None): Optional explicit start event.
            **kwargs (Any): Keyword args to initialize the start event when
                `start_event` is not provided.

        Returns:
            WorkflowHandler: A future-like object to await the final result and
            stream events.

        Raises:
            WorkflowValidationError: If validation fails and validation is
                enabled.
            WorkflowRuntimeError: If the start event cannot be created from kwargs.
            WorkflowTimeoutError: If execution exceeds the configured timeout.

        Examples:
            ```python
            # Create and run with kwargs
            handler = MyFlow().run(topic="Pirates")

            # Stream events
            async for ev in handler.stream_events():
                ...

            # Await final result
            result = await handler
            ```

            If you subclassed the start event, you can also directly pass it in:

            ```python
            result = await my_workflow.run(start_event=MyStartEvent(topic="Pirates"))
            ```
        """
        from workflows.context import Context

        if not self._runtime_locked:
            # don't allow switching runtime after a workflow has been launched
            self._runtime_locked = True

        # Validate the workflow
        self._validate()

        # Extract run_id before passing remaining kwargs to start event
        run_id = kwargs.pop("run_id", None)

        # If a previous context is provided, pass its serialized form
        ctx = ctx if ctx is not None else Context(self)
        # TODO(v3) - remove dependency on is running for choosing whether to send a StartEvent.
        # Is not an easily synchronously queryable property.
        start_event_instance: StartEvent | None = (
            None
            if ctx.is_running
            else self._get_start_event_instance(start_event, **kwargs)
        )
        return ctx._workflow_run(
            workflow=self, start_event=start_event_instance, run_id=run_id
        )

    def validate(
        self,
        *,
        validate_resource_configs: bool = True,
        validate_resources: bool = False,
    ) -> bool:
        """
        Validate the workflow to ensure it's well-formed.

        This method validates the event graph and optionally validates resources:
        - Event production/consumption (set-based checks)
        - Graph structure: all steps reachable from an input event (StartEvent or HumanResponseEvent),
          and only output events (StopEvent, InputRequiredEvent) may be terminal
        - Resource configs (JSON files with Pydantic validation) are validated by default
        - Resource factories are not validated by default (may require env vars)
        - Circular resource dependencies are caught when validate_resources=True

        Validation result is cached after the first successful run(); subsequent run() calls
        skip re-validation. Calling validate() explicitly always re-runs all checks.

        Args:
            validate_resource_configs: If True (default), validate that resource
                config files exist and contain valid data for their Pydantic models.
            validate_resources: If False (default), skip resolving resource factories
                during validation. Set to True to also validate that resource
                factories can be resolved and detect circular dependencies
                (may require environment variables or external connections).

        Returns:
            True if the workflow uses human-in-the-loop, False otherwise.
        """
        return self._validate(
            validate_resource_configs=validate_resource_configs,
            validate_resources=validate_resources,
            force=True,  # Explicit validate() call should always run
        )

    def _validate(
        self,
        *,
        validate_resource_configs: bool = True,
        validate_resources: bool = False,
        force: bool = False,
    ) -> bool:
        if self._disable_validation and not force:
            return False
        stale = self._validated_version != self.__class__._step_functions_version
        if not force and not stale and self._validation_result is not None:
            return self._validation_result

        # Inline import: ``representation`` transitively imports ``Workflow``.
        from .representation.validate import (
            _validate_resource_configs,
            _validate_resources,
            _validate_workflow,
        )

        step_configs = self._step_configs()
        result = _validate_workflow(
            step_configs, self.__class__.__name__, self._skip_graph_checks
        )
        self._start_event_class = result.start_event_class
        self._stop_event_class = result.stop_event_class
        self._catch_error_handlers = result.catch_error_handlers
        self._handler_for_step = result.handler_for_step

        if validate_resource_configs:
            if errors := _validate_resource_configs(step_configs):
                raise WorkflowValidationError(
                    "Resource config validation failed:\n"
                    + "\n".join(f"  - {e}" for e in errors)
                )

        if validate_resources:
            errors = asyncio.run(
                _validate_resources(step_configs, self._resource_manager)
            )
            if errors:
                raise WorkflowValidationError(
                    "Resource validation failed:\n"
                    + "\n".join(f"  - {e}" for e in errors)
                )

        self._validation_result = result.uses_hitl
        self._validated_version = self.__class__._step_functions_version
        return self._validation_result

runtime property #

runtime: Runtime

The runtime this workflow is registered with.

workflow_name property #

workflow_name: str

The workflow name.

If an explicit name was provided at construction, returns that. Otherwise, returns a module-qualified name based on the class's __module__ and __qualname__ attributes.

Examples:

  • Explicit: Workflow(workflow_name="my-workflow") → "my-workflow"
  • Module-level class: "mymodule.MyWorkflow"
  • Nested class: "mymodule.Outer.Inner"
  • Function-scoped: "mymodule.func.<locals>.LocalWorkflow"
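The fallback naming can be sketched in plain Python. This is an illustrative mimic of the property above, not the library's code:

```python
# Illustrative mimic of workflow_name resolution: an explicit name wins,
# otherwise the name is module-qualified from __module__ and __qualname__.
class Outer:
    class Inner:
        pass

def workflow_name(cls, explicit=None):
    if explicit is not None:
        return explicit
    return f"{cls.__module__}.{cls.__qualname__}"

print(workflow_name(Outer.Inner))                 # e.g. "mymodule.Outer.Inner"
print(workflow_name(Outer.Inner, "my-workflow"))  # "my-workflow"
```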

start_event_class property #

start_event_class: type[StartEvent]

The StartEvent subclass accepted by this workflow.

Determined by inspecting step input types.

events property #

events: list[type[Event]]

Returns all known events emitted by this workflow.

Determined by inspecting step input/output types.

stop_event_class property #

stop_event_class: type[RunResultT]

The StopEvent subclass produced by this workflow.

Determined by inspecting step return annotations.
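All three properties above are derived from step annotations. A self-contained sketch of the underlying idea, using only the standard library rather than the library's actual inference machinery:

```python
import inspect

# Stand-in event classes for illustration.
class StartEvent: pass
class StopEvent: pass

async def start(ev: StartEvent) -> StopEvent:
    return StopEvent()

# Accepted event types come from parameter annotations,
# emitted event types from the return annotation.
sig = inspect.signature(start)
accepted = [p.annotation for p in sig.parameters.values()]
emitted = sig.return_annotation
```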

__init__ #

__init__(timeout: float | None = 45.0, disable_validation: bool = False, verbose: bool = False, resource_manager: ResourceManager | None = None, num_concurrent_runs: int | None = None, runtime: Runtime | None = None, workflow_name: str | None = None, skip_graph_checks: set[WorkflowGraphCheck] | None = None) -> None

Parameters:

  • timeout (float | None, default 45.0): Max seconds to wait for completion. None disables the timeout.
  • disable_validation (bool, default False): Skip pre-run validation of the event graph (not recommended).
  • verbose (bool, default False): If True, print step activity.
  • resource_manager (ResourceManager | None, default None): Custom resource manager for dependency injection.
  • num_concurrent_runs (int | None, default None): Limit on concurrent run() calls.
  • runtime (Runtime | None, default None): Optional runtime to use for this workflow. If not provided, uses the current context-scoped runtime or falls back to basic_runtime.
  • workflow_name (str | None, default None): Optional explicit name for this workflow. If not provided, a module-qualified name is computed from the class's __module__ and __qualname__ attributes.
  • skip_graph_checks (set[WorkflowGraphCheck] | None, default None): Optional set of graph validation checks to skip (e.g. "reachability", "terminal_event"). Use to allow intentional patterns that would otherwise fail validation.
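The skip_graph_checks guard can be mimicked standalone. The valid-name set below is an assumption taken from the docstring's examples, not the real WorkflowGraphCheck values:

```python
# Illustrative mimic of the __init__ guard: unknown check names are
# rejected up front. VALID_CHECKS is assumed for illustration only.
VALID_CHECKS = {"reachability", "terminal_event"}

def normalize_skip_checks(skip_graph_checks=None):
    checks = skip_graph_checks or set()
    unknown = checks - VALID_CHECKS
    if unknown:
        raise ValueError(
            f"Unknown graph check names: {', '.join(sorted(unknown))}. "
            f"Valid names are: {', '.join(sorted(VALID_CHECKS))}"
        )
    return checks

normalize_skip_checks({"reachability"})  # accepted
```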
add_step classmethod #

add_step(func: StepFunction) -> None

Adds a free function as a step for this workflow class.

It raises an exception if a step with the same name was already added to the workflow.


run #

run(ctx: Context | None = None, start_event: StartEvent | None = None, **kwargs: Any) -> WorkflowHandler

Run the workflow and return a handler for results and streaming.

This schedules the workflow execution in the background and returns a WorkflowHandler that can be awaited for the final result or used to stream intermediate events.

You may pass either a concrete start_event instance or keyword arguments that will be used to construct the inferred StartEvent subclass.

Parameters:

  • ctx (Context | None, default None): Optional context to resume or share state across runs. If omitted, a fresh context is created.
  • start_event (StartEvent | None, default None): Optional explicit start event.
  • **kwargs (Any): Keyword args to initialize the start event when start_event is not provided.

Returns:

  • WorkflowHandler: A future-like object to await the final result and stream events.

Raises:

  • WorkflowValidationError: If validation fails and validation is enabled.
  • WorkflowRuntimeError: If the start event cannot be created from kwargs.
  • WorkflowTimeoutError: If execution exceeds the configured timeout.

Examples:

```python
# Create and run with kwargs
handler = MyFlow().run(topic="Pirates")

# Stream events
async for ev in handler.stream_events():
    ...

# Await final result
result = await handler
```

If you subclassed the start event, you can also directly pass it in:

```python
result = await my_workflow.run(start_event=MyStartEvent(topic="Pirates"))
```
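The precedence between an explicit start_event and kwargs can be mimicked standalone. A sketch of the idea behind _get_start_event_instance, not the real implementation:

```python
import warnings

class StartEvent:
    def __init__(self, **kwargs):
        self.__dict__.update(kwargs)

def get_start_event(start_event=None, start_event_class=StartEvent, **kwargs):
    # An explicit start_event wins; kwargs passed alongside it are ignored
    # with a warning. Otherwise kwargs construct the inferred event class.
    if start_event is not None:
        if not isinstance(start_event, StartEvent):
            raise ValueError("The 'start_event' argument must be an instance of 'StartEvent'.")
        if kwargs:
            warnings.warn(f"Keyword arguments ignored: {kwargs}")
        return start_event
    return start_event_class(**kwargs)

ev = get_start_event(topic="Pirates")
```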

validate #

validate(*, validate_resource_configs: bool = True, validate_resources: bool = False) -> bool

Validate the workflow to ensure it's well-formed.

This method validates the event graph and optionally validates resources:

- Event production/consumption (set-based checks)
- Graph structure: all steps reachable from an input event (StartEvent or HumanResponseEvent), and only output events (StopEvent, InputRequiredEvent) may be terminal
- Resource configs (JSON files with Pydantic validation) are validated by default
- Resource factories are not validated by default (may require env vars)
- Circular resource dependencies are caught when validate_resources=True

Validation result is cached after the first successful run(); subsequent run() calls skip re-validation. Calling validate() explicitly always re-runs all checks.

Parameters:

    validate_resource_configs (bool, default True): If True (default), validate
        that resource config files exist and contain valid data for their
        Pydantic models.
    validate_resources (bool, default False): If False (default), skip resolving
        resource factories during validation. Set to True to also validate that
        resource factories can be resolved and detect circular dependencies
        (may require environment variables or external connections).

Returns:

    bool: True if the workflow uses human-in-the-loop, False otherwise.

Source code in workflows/workflow.py
def validate(
    self,
    *,
    validate_resource_configs: bool = True,
    validate_resources: bool = False,
) -> bool:
    """
    Validate the workflow to ensure it's well-formed.

    This method validates the event graph and optionally validates resources:
    - Event production/consumption (set-based checks)
    - Graph structure: all steps reachable from an input event (StartEvent or HumanResponseEvent),
      and only output events (StopEvent, InputRequiredEvent) may be terminal
    - Resource configs (JSON files with Pydantic validation) are validated by default
    - Resource factories are not validated by default (may require env vars)
    - Circular resource dependencies are caught when validate_resources=True

    Validation result is cached after the first successful run(); subsequent run() calls
    skip re-validation. Calling validate() explicitly always re-runs all checks.

    Args:
        validate_resource_configs: If True (default), validate that resource
            config files exist and contain valid data for their Pydantic models.
        validate_resources: If False (default), skip resolving resource factories
            during validation. Set to True to also validate that resource
            factories can be resolved and detect circular dependencies
            (may require environment variables or external connections).

    Returns:
        True if the workflow uses human-in-the-loop, False otherwise.
    """
    return self._validate(
        validate_resource_configs=validate_resource_configs,
        validate_resources=validate_resources,
        force=True,  # Explicit validate() call should always run
    )