Overview
LlamaAgents at a Glance
LlamaAgents is the most advanced way to build agent workflows. Author and run multi-step document agents from scratch locally using our open-source Agent Workflows, or build and deploy them in the cloud with our vibe-coding Agent Builder in LlamaCloud—without wiring up infrastructure, persistence, or deployment yourself.
Stitch together Parse, Extract, Split, Classify, and custom operations into Workflows that perform knowledge tasks on your documents. When you need full control, it’s real Python underneath: fork and extend without a rewrite. Agent Workflows give you event-driven orchestration with branching, parallelism, human-in-the-loop review, durability, and observability.
- Build locally: Use the llamactl CLI to create projects from starter templates, develop and serve workflows on your machine, then deploy to LlamaCloud or self-host. You can also use Agent Workflows directly in your own Python applications—run them as async processes or mount them as endpoints in your existing server.
- Build in the cloud: Use Agent Builder in LlamaCloud (Agents → Builder) to describe your workflow in plain language; an AI coding agent generates a complete, deployable workflow. The code is yours—customize it in GitHub or run it on your own infrastructure. For a one-click path, click-to-deploy a starter template like SEC Insights or Invoice Matching.
- Go deeper: Combine local development with cloud services. Use Agent Workflows for orchestration and WorkflowClient to call deployed workflows via REST or the typed Python client.
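Calling a deployed workflow over REST is, at bottom, an authenticated HTTP request. A stdlib-only sketch of building such a request—note that the route and body shape below are illustrative assumptions, not the documented Workflows Client API:

```python
import json
from urllib import request


def build_run_request(base_url: str, workflow: str, payload: dict) -> request.Request:
    # NOTE: this endpoint path and payload envelope are assumptions for
    # illustration; consult the Workflows Client reference for the real routes.
    url = f"{base_url.rstrip('/')}/workflows/{workflow}/run"
    body = json.dumps({"input": payload}).encode("utf-8")
    return request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


req = build_run_request("http://localhost:4501", "invoice-matching", {"file": "inv.pdf"})
print(req.full_url, req.get_method())
```

In practice the typed Python client wraps this plumbing for you; the sketch only shows what a raw REST integration would involve.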
Components
- llamactl CLI: Development and deployment for local workflow apps. Initialize from starter templates, serve locally, and deploy to LlamaCloud or export for self-hosting.
- Agent Workflows: The event-driven orchestration framework at the core. Use it as an async library in your own code, or let llamactl serve it. Built-in durability and observability.
- Agent Builder: In LlamaCloud → Agents → Builder. Natural-language, vibe-coding interface to create document workflows; the agent generates real Python you can deploy or take to GitHub.
- llama-cloud-services: LlamaCloud document primitives (Parse, Extract, Classify), Agent Data for structured storage, and vector indexes. llamactl handles authentication when deploying to the cloud.
- @llamaindex/ui: React hooks for workflow-powered frontends. Deploy alongside your backend with llamactl.
- Workflows Client: Call deployed workflows via REST API or typed Python client.