Context
Context #
Bases: Generic[MODEL_T]
Global, per-run context for a Workflow. Provides an interface into the
underlying broker run, for both external consumption (workflow run observers) and
internal consumption by workflow steps.
The Context coordinates event delivery between steps, tracks in-flight work,
exposes a global state store, and provides utilities for streaming and
synchronization. It is created by a Workflow at run time and can be
persisted and restored.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `workflow` | `Workflow` | The owning workflow instance. Used to infer step configuration and instrumentation. | *required* |
| `previous_context` | `dict[str, Any] \| None` | A previous context snapshot to resume from. | `None` |
| `serializer` | `BaseSerializer \| None` | A serializer to use for serializing and deserializing the current and previous context snapshots. | `None` |
Attributes:

| Name | Type | Description |
|---|---|---|
| `is_running` | `bool` | Whether the workflow is currently running. |
| `store` | `InMemoryStateStore[MODEL_T]` | Type-safe, async state store shared across steps. See also InMemoryStateStore. |
Examples:
Basic usage inside a step:

```python
from workflows import step
from workflows.events import StartEvent, StopEvent

@step
async def start(self, ctx: Context, ev: StartEvent) -> StopEvent:
    await ctx.store.set("query", ev.topic)
    ctx.write_event_to_stream(ev)  # surface progress to UI
    return StopEvent(result="ok")
```
Persisting the state of a workflow across runs:

```python
from workflows import Context

# Create a context and run the workflow with the same context
ctx = Context(my_workflow)
result_1 = await my_workflow.run(..., ctx=ctx)
result_2 = await my_workflow.run(..., ctx=ctx)

# Serialize the context and restore it
ctx_dict = ctx.to_dict()
restored_ctx = Context.from_dict(my_workflow, ctx_dict)
result_3 = await my_workflow.run(..., ctx=restored_ctx)
```
Source code in workflows/context/context.py
store
property
#
store: InMemoryStateStore[MODEL_T]
Typed, process-local state store shared across steps.
If no state was initialized yet, a default DictState store is created.
Returns:

| Type | Description |
|---|---|
| `InMemoryStateStore[MODEL_T]` | The state store instance. |
collect_events #
collect_events(ev: Event, expected: list[Type[Event]], buffer_id: str | None = None) -> list[Event] | None
Buffer events until all expected types are available, then return them.
This utility is helpful when a step can receive multiple event types
and needs to proceed only when it has a full set. The returned list is
ordered according to expected.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `ev` | `Event` | The incoming event to add to the buffer. | *required* |
| `expected` | `list[Type[Event]]` | Event types to collect, in order. | *required* |
| `buffer_id` | `str \| None` | Optional stable key to isolate buffers across steps or workers. Defaults to an internal key derived from the task name or expected types. | `None` |
Returns:

| Type | Description |
|---|---|
| `list[Event] \| None` | The events in the requested order when complete, otherwise `None`. |
Examples:

```python
@step
async def synthesize(
    self, ctx: Context, ev: QueryEvent | RetrieveEvent
) -> StopEvent | None:
    events = ctx.collect_events(ev, [QueryEvent, RetrieveEvent])
    if events is None:
        return None
    query_ev, retrieve_ev = events
    # ... proceed with both inputs present ...
```
Source code in workflows/context/context.py
from_dict
classmethod
#
from_dict(workflow: 'Workflow', data: dict[str, Any], serializer: BaseSerializer | None = None) -> 'Context[MODEL_T]'
Reconstruct a Context from a serialized payload.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `workflow` | `Workflow` | The workflow instance that will own this context. | *required* |
| `data` | `dict[str, Any]` | Payload produced by to_dict. | *required* |
| `serializer` | `BaseSerializer \| None` | Serializer used to decode state and events. Defaults to JSON. | `None` |
Returns:

| Type | Description |
|---|---|
| `Context[MODEL_T]` | A context instance initialized with the persisted state and queues. |
Raises:

| Type | Description |
|---|---|
| `ContextSerdeError` | If the payload is missing required fields or is in an incompatible format. |
Examples:

```python
ctx_dict = ctx.to_dict()
my_db.set("key", json.dumps(ctx_dict))

ctx_dict = my_db.get("key")
restored_ctx = Context.from_dict(my_workflow, json.loads(ctx_dict))
result = await my_workflow.run(..., ctx=restored_ctx)
```
Source code in workflows/context/context.py
get_result #
get_result() -> RunResultT
Return the final result of the workflow run.
Deprecated
This method is deprecated and will be removed in a future release.
Prefer awaiting the handler returned by Workflow.run, e.g.:
result = await workflow.run(..., ctx=ctx).
Examples:

```python
# Preferred
result = await my_workflow.run(..., ctx=ctx)

# Deprecated
result_agent = ctx.get_result()
```
Returns:

| Name | Type | Description |
|---|---|---|
| `RunResultT` | `RunResultT` | The value provided via a `StopEvent`. |
Source code in workflows/context/context.py
send_event #
send_event(message: Event, step: str | None = None) -> None
Dispatch an event to one or all workflow steps.
If step is omitted, the event is broadcast to all step queues and
non-matching steps will ignore it. When step is provided, the target
step must accept the event type or a
WorkflowRuntimeError is raised.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `message` | `Event` | The event to enqueue. | *required* |
| `step` | `str \| None` | Optional step name to target. | `None` |
Raises:

| Type | Description |
|---|---|
| `WorkflowRuntimeError` | If the target step does not exist or does not accept the event type. |
Examples:

It's common to use this method to fan out events:

```python
@step
async def my_step(self, ctx: Context, ev: StartEvent) -> WorkerEvent | GatherEvent:
    for i in range(10):
        ctx.send_event(WorkerEvent(msg=i))
    return GatherEvent()
```

This method is also used from the caller side to send events into the workflow:

```python
handler = my_workflow.run(...)
async for ev in handler.stream_events():
    if isinstance(ev, SomeEvent):
        handler.ctx.send_event(SomeOtherEvent(msg="Hello!"))
result = await handler
```
Source code in workflows/context/context.py
to_dict #
to_dict(serializer: BaseSerializer | None = None) -> dict[str, Any]
Serialize the context to a JSON-serializable dict.
Persists the global state store, event queues, buffers, accepted events, broker log, and running flag. This payload can be fed to from_dict to resume a run or carry state across runs.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `serializer` | `BaseSerializer \| None` | Value serializer used for state and event payloads. Defaults to JsonSerializer. | `None` |
Returns:

| Type | Description |
|---|---|
| `dict[str, Any]` | A dict suitable for JSON encoding and later restoration via `from_dict`. |
Examples:

```python
ctx_dict = ctx.to_dict()
my_db.set("key", json.dumps(ctx_dict))

ctx_dict = my_db.get("key")
restored_ctx = Context.from_dict(my_workflow, json.loads(ctx_dict))
result = await my_workflow.run(..., ctx=restored_ctx)
```
Source code in workflows/context/context.py
wait_for_event
async
#
wait_for_event(event_type: Type[T], waiter_event: Event | None = None, waiter_id: str | None = None, requirements: dict[str, Any] | None = None, timeout: float | None = 2000) -> T
Wait for the next matching event of type event_type.
Optionally emits a waiter_event to the event stream once per waiter_id to
inform callers that the workflow is waiting for external input.
This helps to prevent duplicate waiter events from being sent to the event stream.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `event_type` | `type[T]` | Concrete event class to wait for. | *required* |
| `waiter_event` | `Event \| None` | Optional event to write to the stream once when the wait begins. | `None` |
| `waiter_id` | `str \| None` | Stable identifier to avoid emitting multiple waiter events for the same logical wait. | `None` |
| `requirements` | `dict[str, Any] \| None` | Key/value filters that the received event must satisfy. | `None` |
| `timeout` | `float \| None` | Maximum seconds to wait. | `2000` |
Returns:

| Name | Type | Description |
|---|---|---|
| `T` | `T` | The received event instance of the requested type. |
Raises:

| Type | Description |
|---|---|
| `TimeoutError` | If the timeout elapses. |
Examples:

```python
@step
async def my_step(self, ctx: Context, ev: StartEvent) -> StopEvent:
    response = await ctx.wait_for_event(
        HumanResponseEvent,
        waiter_event=InputRequiredEvent(msg="What's your name?"),
        waiter_id="user_name",
        timeout=60,
    )
    return StopEvent(result=response.response)
```
Source code in workflows/context/context.py
write_event_to_stream #
write_event_to_stream(ev: Event | None) -> None
Enqueue an event for streaming to the `WorkflowHandler`.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `ev` | `Event \| None` | The event to stream. | *required* |
Examples:

```python
@step
async def my_step(self, ctx: Context, ev: StartEvent) -> StopEvent:
    ctx.write_event_to_stream(ev)
    return StopEvent(result="ok")
```
Source code in workflows/context/context.py
DictState #
Bases: DictLikeModel
Dynamic, dict-like Pydantic model for workflow state.
Used as the default state model when no typed state is provided. Behaves like a mapping while retaining Pydantic validation and serialization.
Examples:

```python
from workflows.context.state_store import DictState

state = DictState()
state["foo"] = 1
state.bar = 2  # attribute-style access works for nested structures
```
Source code in workflows/context/state_store.py
InMemoryStateStore #
Bases: Generic[MODEL_T]
Async, in-memory, type-safe state manager for workflows.
This store holds a single Pydantic model instance representing global workflow state. When the generic parameter is omitted, it defaults to DictState for flexible, dictionary-like usage.
Thread-safety is ensured with an internal asyncio.Lock. Consumers can
either perform atomic reads/writes via get_state and set_state, or make
in-place, transactional edits via the edit_state context manager.
Examples:

Typed state model:

```python
from pydantic import BaseModel
from workflows.context.state_store import InMemoryStateStore

class MyState(BaseModel):
    count: int = 0

store = InMemoryStateStore(MyState())
async with store.edit_state() as state:
    state.count += 1
```

Dynamic state with DictState:

```python
from workflows.context.state_store import InMemoryStateStore, DictState

store = InMemoryStateStore(DictState())
await store.set("user.profile.name", "Ada")
name = await store.get("user.profile.name")
```
Source code in workflows/context/state_store.py
get_state
async
#
get_state() -> MODEL_T
Return a shallow copy of the current state model.
Returns:

| Name | Type | Description |
|---|---|---|
| `MODEL_T` | `MODEL_T` | A shallow copy of the current state model. |
Source code in workflows/context/state_store.py
set_state
async
#
set_state(state: MODEL_T) -> None
Replace the current state model.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `state` | `MODEL_T` | New state of the same type as the existing model. | *required* |
Raises:

| Type | Description |
|---|---|
| `ValueError` | If the type differs from the existing state type. |
Source code in workflows/context/state_store.py
to_dict #
to_dict(serializer: BaseSerializer) -> dict[str, Any]
Serialize the state and model metadata for persistence.
For DictState, each individual item is serialized using the provided
serializer since values can be arbitrary Python objects. For other
Pydantic models, defers to the serializer (e.g. JSON) which can leverage
model-aware encoding.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `serializer` | `BaseSerializer` | Strategy used to encode values. | *required* |
Returns:

| Type | Description |
|---|---|
| `dict[str, Any]` | A payload suitable for `from_dict`. |
Source code in workflows/context/state_store.py
from_dict
classmethod
#
from_dict(serialized_state: dict[str, Any], serializer: BaseSerializer) -> InMemoryStateStore[MODEL_T]
Restore a state store from a serialized payload.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `serialized_state` | `dict[str, Any]` | The payload produced by to_dict. | *required* |
| `serializer` | `BaseSerializer` | Strategy to decode stored values. | *required* |
Returns:

| Type | Description |
|---|---|
| `InMemoryStateStore[MODEL_T]` | A store with the reconstructed model. |
Source code in workflows/context/state_store.py
edit_state
async
#
edit_state() -> AsyncGenerator[MODEL_T, None]
Edit state transactionally under a lock.
Yields the mutable model and writes it back on exit. This pattern avoids read-modify-write races and keeps updates atomic.
Yields:

| Name | Type | Description |
|---|---|---|
| `MODEL_T` | `AsyncGenerator[MODEL_T, None]` | The current state model for in-place mutation. |
Source code in workflows/context/state_store.py
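The lock-guarded read-modify-write pattern that edit_state describes can be sketched in plain asyncio. `TinyStore` and `CounterState` below are illustrative stand-ins, not the library's actual implementation:

```python
import asyncio
from contextlib import asynccontextmanager
from dataclasses import dataclass


@dataclass
class CounterState:
    count: int = 0


class TinyStore:
    """Illustrative stand-in for the edit_state locking pattern."""

    def __init__(self, state: CounterState) -> None:
        self.state = state
        self._lock = asyncio.Lock()

    @asynccontextmanager
    async def edit_state(self):
        # Hold the lock across the whole read-modify-write cycle so
        # concurrent editors cannot interleave.
        async with self._lock:
            yield self.state


async def main() -> int:
    store = TinyStore(CounterState())

    async def bump() -> None:
        async with store.edit_state() as state:
            before = state.count
            await asyncio.sleep(0)  # yield control mid-transaction
            state.count = before + 1

    await asyncio.gather(*(bump() for _ in range(100)))
    return store.state.count


assert asyncio.run(main()) == 100  # no lost updates under the lock
```

Without the lock, the `await` inside `bump` would let other tasks read a stale `count` and overwrite each other's increments.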
get
async
#
get(path: str, default: Optional[Any] = Ellipsis) -> Any
Get a nested value using dot-separated paths.
Supports dict keys, list indices, and attribute access transparently at each segment.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `path` | `str` | Dot-separated path, e.g. `"user.profile.name"`. | *required* |
| `default` | `Any` | If provided, return this when the path does not exist; otherwise, raise `ValueError`. | `Ellipsis` |
Returns:

| Name | Type | Description |
|---|---|---|
| `Any` | `Any` | The resolved value. |
Raises:

| Type | Description |
|---|---|
| `ValueError` | If the path is invalid and no default is provided, or if the path depth exceeds limits. |
Source code in workflows/context/state_store.py
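The documented path semantics (dict keys, list indices, and attributes resolved segment by segment, with `Ellipsis` as the no-default sentinel) can be sketched as a plain helper. `get_path` is illustrative, not the library's code:

```python
from typing import Any, Optional


def get_path(obj: Any, path: str, default: Optional[Any] = ...) -> Any:
    """Resolve a dot-separated path over dicts, lists/tuples, and attributes."""
    current = obj
    for segment in path.split("."):
        try:
            if isinstance(current, dict):
                current = current[segment]
            elif isinstance(current, (list, tuple)):
                current = current[int(segment)]
            else:
                current = getattr(current, segment)
        except (KeyError, IndexError, ValueError, AttributeError, TypeError):
            if default is ...:
                raise ValueError(f"invalid path: {path!r}")
            return default
    return current


data = {"user": {"profile": {"name": "Ada"}, "tags": ["x", "y"]}}
assert get_path(data, "user.profile.name") == "Ada"
assert get_path(data, "user.tags.1") == "y"          # list index segment
assert get_path(data, "user.missing", default=None) is None
```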
set
async
#
set(path: str, value: Any) -> None
Set a nested value using dot-separated paths.
Intermediate containers are created as needed. Dicts, lists, tuples, and Pydantic models are supported where appropriate.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `path` | `str` | Dot-separated path to write. | *required* |
| `value` | `Any` | Value to assign. | *required* |
Raises:

| Type | Description |
|---|---|
| `ValueError` | If the path is empty or exceeds the maximum depth. |
Source code in workflows/context/state_store.py
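The "intermediate containers are created as needed" behavior can be sketched over plain dicts and lists. `set_path` below is an illustrative helper mirroring the documented semantics, not the library's implementation:

```python
from typing import Any


def set_path(obj: dict, path: str, value: Any) -> None:
    """Assign a nested value, creating intermediate dicts as needed."""
    if not path:
        raise ValueError("empty path")
    segments = path.split(".")
    current: Any = obj
    for segment in segments[:-1]:
        if isinstance(current, list):
            current = current[int(segment)]
        else:
            # Create a nested dict for any missing intermediate segment.
            current = current.setdefault(segment, {})
    leaf = segments[-1]
    if isinstance(current, list):
        current[int(leaf)] = value
    else:
        current[leaf] = value


data: dict = {}
set_path(data, "user.profile.name", "Ada")
assert data == {"user": {"profile": {"name": "Ada"}}}
```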
clear
async
#
clear() -> None
Reset the state to its type defaults.
Raises:

| Type | Description |
|---|---|
| `ValueError` | If the model type cannot be instantiated from defaults (i.e., fields missing default values). |
Source code in workflows/context/state_store.py
BaseSerializer #
Bases: ABC
Interface for value serialization used by the workflow context and state store.
Implementations must encode arbitrary Python values into a string and be able to reconstruct the original values from that string.
Source code in workflows/context/serializers.py
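A conforming implementation only needs to encode values to a string and decode them back. The sketch below mirrors the documented interface; the `MiniSerializer`/`MiniJsonSerializer` names are hypothetical stand-ins for illustration:

```python
import json
from abc import ABC, abstractmethod
from typing import Any


class MiniSerializer(ABC):
    """Illustrative stand-in for the BaseSerializer interface."""

    @abstractmethod
    def serialize(self, value: Any) -> str:
        """Encode an arbitrary value into a string."""

    @abstractmethod
    def deserialize(self, value: str) -> Any:
        """Reconstruct the original value from its string form."""


class MiniJsonSerializer(MiniSerializer):
    def serialize(self, value: Any) -> str:
        return json.dumps(value)

    def deserialize(self, value: str) -> Any:
        return json.loads(value)


s = MiniJsonSerializer()
assert s.deserialize(s.serialize({"a": 1})) == {"a": 1}
```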
JsonSerializer #
Bases: BaseSerializer
JSON-first serializer that understands Pydantic models and LlamaIndex components.
Behavior:
- Pydantic models are encoded as JSON with their qualified class name so they
can be faithfully reconstructed.
- LlamaIndex components (objects exposing class_name and to_dict) are
serialized to their dict form alongside the qualified class name.
- Dicts and lists are handled recursively.
Fallback for unsupported objects is to attempt JSON encoding directly; if it
fails, a ValueError is raised.
Examples:

```python
s = JsonSerializer()
payload = s.serialize({"x": 1, "y": [2, 3]})
data = s.deserialize(payload)
assert data == {"x": 1, "y": [2, 3]}
```
Source code in workflows/context/serializers.py
serialize_value #
serialize_value(value: Any) -> Any
Wraps Events in a wrapper type that includes type metadata, so that they can be deserialized back into the original Event type. Traverses dicts and lists recursively.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `value` | `Any` | The value to serialize. | *required* |
Returns:

| Name | Type | Description |
|---|---|---|
| `Any` | `Any` | The serialized value. A dict, list, string, number, or boolean. |
Source code in workflows/context/serializers.py
serialize #
serialize(value: Any) -> str
Serialize an arbitrary value to a JSON string.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `value` | `Any` | The value to encode. | *required* |
Returns:

| Name | Type | Description |
|---|---|---|
| `str` | `str` | JSON string. |
Raises:

| Type | Description |
|---|---|
| `ValueError` | If the value cannot be encoded to JSON. |
Source code in workflows/context/serializers.py
deserialize_value #
deserialize_value(data: Any) -> Any
Helper to deserialize a single dict or other JSON value from its discriminator fields back into a Python class.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `data` | `Any` | A dict, list, string, number, or boolean. | *required* |
Returns:

| Name | Type | Description |
|---|---|---|
| `Any` | `Any` | The deserialized value. |
Source code in workflows/context/serializers.py
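The wrap/unwrap round trip that serialize_value and deserialize_value describe can be sketched with a toy event class. `ToyEvent` and the `__is_event` discriminator key are illustrative assumptions, not the library's actual field names:

```python
from typing import Any


class ToyEvent:
    """Stand-in for a workflow Event (illustrative only)."""

    def __init__(self, msg: str) -> None:
        self.msg = msg


def serialize_value(value: Any) -> Any:
    # Wrap "events" with type metadata; recurse into dicts and lists.
    if isinstance(value, ToyEvent):
        return {"__is_event": True, "type": type(value).__name__, "data": {"msg": value.msg}}
    if isinstance(value, dict):
        return {k: serialize_value(v) for k, v in value.items()}
    if isinstance(value, list):
        return [serialize_value(v) for v in value]
    return value


def deserialize_value(data: Any) -> Any:
    # Use the discriminator field to rebuild the original class.
    if isinstance(data, dict) and data.get("__is_event"):
        return ToyEvent(**data["data"])
    if isinstance(data, dict):
        return {k: deserialize_value(v) for k, v in data.items()}
    if isinstance(data, list):
        return [deserialize_value(v) for v in data]
    return data


restored = deserialize_value(serialize_value({"ev": ToyEvent("hi"), "nums": [1, 2]}))
assert isinstance(restored["ev"], ToyEvent) and restored["ev"].msg == "hi"
```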
deserialize #
deserialize(value: str) -> Any
Deserialize a JSON string into Python objects.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `value` | `str` | JSON string. | *required* |
Returns:

| Name | Type | Description |
|---|---|---|
| `Any` | `Any` | The reconstructed value. |
Source code in workflows/context/serializers.py
PickleSerializer #
Bases: JsonSerializer
Hybrid serializer: JSON when possible, Pickle as a safe fallback.
This serializer attempts JSON first for readability and portability, and transparently falls back to Pickle for objects that cannot be represented in JSON. Deserialization prioritizes Pickle and falls back to JSON.
Warning
Pickle can execute arbitrary code during deserialization. Only deserialize trusted payloads.
Note: this class was previously named JsonPickleSerializer and has been renamed to PickleSerializer.
Examples:

```python
s = PickleSerializer()

class Foo:
    def __init__(self, x):
        self.x = x

payload = s.serialize(Foo(1))  # will likely use Pickle
obj = s.deserialize(payload)
assert isinstance(obj, Foo)
```
Source code in workflows/context/serializers.py
serialize #
serialize(value: Any) -> str
Serialize with JSON preference and Pickle fallback.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `value` | `Any` | The value to encode. | *required* |
Returns:

| Name | Type | Description |
|---|---|---|
| `str` | `str` | Encoded string (JSON or base64-encoded Pickle bytes). |
Source code in workflows/context/serializers.py
deserialize #
deserialize(value: str) -> Any
Deserialize with Pickle preference and JSON fallback.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `value` | `str` | Encoded string. | *required* |
Returns:

| Name | Type | Description |
|---|---|---|
| `Any` | `Any` | The reconstructed value. |
Notes
Use only with trusted payloads due to Pickle security implications.
Source code in workflows/context/serializers.py
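The JSON-first serialize path and Pickle-first deserialize path described above can be sketched as a pair of standalone functions. This is an assumption-level illustration of the documented strategy, not the library's exact code, and it carries the same caveat: only deserialize trusted payloads.

```python
import base64
import json
import pickle
from typing import Any


def hybrid_serialize(value: Any) -> str:
    """JSON when possible; base64-encoded Pickle otherwise."""
    try:
        return json.dumps(value)
    except (TypeError, ValueError):
        return base64.b64encode(pickle.dumps(value)).decode("utf-8")


def hybrid_deserialize(payload: str) -> Any:
    """Try Pickle first (matching the documented order), then fall back to JSON."""
    try:
        return pickle.loads(base64.b64decode(payload.encode("utf-8")))
    except Exception:
        return json.loads(payload)


assert hybrid_deserialize(hybrid_serialize({"x": 1})) == {"x": 1}    # JSON path
assert hybrid_deserialize(hybrid_serialize({1, 2, 3})) == {1, 2, 3}  # Pickle path (sets aren't JSON)
```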