Overview
LlamaAgents at a Glance
LlamaAgents is the most advanced way to build multi-step document workflows. Stitch together Parse, Extract, Classify, and arbitrary custom operations into pipelines that perform knowledge tasks on your documents—without needing to wire up infrastructure, persistence, or deployment yourself.
Get from zero to a working pipeline quickly: start from a template, then configure and deploy. When you need customization, it’s real Python underneath: fork and extend without a rewrite. All of this is powered by Agent Workflows, our event-driven orchestration framework with built-in support for branching, parallelism, human-in-the-loop review, durability, and observability.
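To make the event-driven model concrete, here is a rough sketch using only the standard library. This is not the Agent Workflows API, just an illustration of the idea: steps consume upstream results and emit new ones, and steps with no dependency on each other run in parallel.

```python
import asyncio

# Illustrative stand-ins for Parse, Extract, and Classify steps.
async def parse(doc: str) -> str:
    return f"parsed:{doc}"

async def extract(parsed: str) -> dict:
    return {"source": parsed, "fields": ["total", "date"]}

async def classify(parsed: str) -> str:
    return "invoice" if "invoice" in parsed else "other"

async def pipeline(doc: str) -> dict:
    parsed = await parse(doc)
    # Extract and Classify do not depend on each other, so run them in parallel.
    fields, label = await asyncio.gather(extract(parsed), classify(parsed))
    return {"label": label, **fields}

result = asyncio.run(pipeline("invoice-001.pdf"))
print(result)
```

The real framework adds what this sketch lacks: typed events, durable state between steps, human-in-the-loop pauses, and observability.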
Get Started
Start fast: Click-to-deploy a starter template directly in LlamaCloud. Choose a pre-built workflow like SEC Insights or Invoice Matching, configure and deploy.
Customize: When you need more control, fork to GitHub and edit the Python code directly. Use the llamactl CLI to develop locally, then deploy to LlamaCloud or self-host.
Go deeper: Use Agent Workflows directly in your own applications. Run workflows as async processes, or mount them as endpoints in your existing server.
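The “mount them as endpoints” step above can be sketched with a plain WSGI callable, assuming an existing synchronous server. Everything here (the `run_workflow` coroutine, the query-string input) is illustrative, not part of the llamactl or Agent Workflows API:

```python
import asyncio

# Conceptual sketch: exposing an async workflow from inside an existing
# (synchronous) server. run_workflow stands in for a real Agent Workflow.
async def run_workflow(doc: str) -> str:
    # A real workflow would parse, extract, and classify here.
    return f"summary of {doc}"

def app(environ, start_response):
    doc = environ.get("QUERY_STRING") or "unknown"
    # Bridge the synchronous server into the async workflow.
    result = asyncio.run(run_workflow(doc))
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [result.encode()]
```

In practice llamactl serves workflows for you; a hand-rolled bridge like this is only needed when embedding workflows into a server you already run.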
Components
llamactl CLI: The development and deployment tool. Initialize from starter templates, serve locally, and deploy to LlamaCloud or export for self-hosting.
Agent Workflows: The event-driven orchestration framework underneath it all. Use it standalone as an async library, or let llamactl serve your workflows. Built-in durability and observability.
llama-cloud-services: LlamaCloud’s document primitives (Parse, Extract, Classify), Agent Data for structured storage, and vector indexes for retrieval. llamactl handles authentication automatically.
@llamaindex/ui: React hooks for workflow-powered frontends. Deploy alongside your backend with llamactl.
Workflows Client: Call deployed workflows via REST API or typed Python client.
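As a sketch of what the REST side of the Workflows Client might look like, the snippet below builds (but does not send) an HTTP request to a deployed workflow. The route shape and the local port are assumptions for illustration; check your deployment's actual base URL and API schema.

```python
import json
import urllib.request

# Hypothetical request builder for running a deployed workflow over REST.
# The /deployments/{name}/tasks/run route and port 4501 are assumptions.
def build_run_request(base_url: str, deployment: str, inputs: dict) -> urllib.request.Request:
    return urllib.request.Request(
        url=f"{base_url}/deployments/{deployment}/tasks/run",
        data=json.dumps(inputs).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_run_request("http://localhost:4501", "invoice-matching", {"doc": "invoice.pdf"})
print(req.get_method(), req.full_url)
```

The typed Python client wraps calls like this for you, so hand-built requests are only needed from languages without a client library.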