Deployment Config Reference

LlamaDeploy reads configuration from your repository to run your app. The configuration is defined in your project’s pyproject.toml.

[tool.llamadeploy]
name = "my-app"
env_files = [".env"]

[tool.llamadeploy.workflows]
workflow-one = "my_app.workflows:some_workflow"
workflow-two = "my_app.workflows:another_workflow"

[tool.llamadeploy.ui]
directory = "ui"
build_output_dir = "ui/static"

Deployments can be configured to automatically inject authentication for LlamaCloud.

[tool.llamadeploy]
llama_cloud = true

When this is set:

  • During development, llamactl prompts you to log in to LlamaCloud if you’re not already. After that, it injects LLAMA_CLOUD_API_KEY, LLAMA_CLOUD_PROJECT_ID, and LLAMA_CLOUD_BASE_URL into your Python server process and JavaScript build.
  • When deployed, LlamaCloud automatically injects a dedicated API key into the Python process. The frontend instead receives a short-lived session cookie specific to each user visiting the application, so configure the project ID on the frontend API client to keep frontend and backend LlamaCloud requests scoped to the same project (see the sketch below).
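
To illustrate that last point, here is a minimal TypeScript sketch of a frontend request scoped to the shared project. The /api/v1/pipelines path, the project_id query parameter, and the availability of the env vars in client code are assumptions made for the example, not guarantees of the LlamaDeploy API:

// Illustrative sketch only. Assumes your bundler substitutes these
// variables into client code at build time.
const baseUrl = process.env.LLAMA_CLOUD_BASE_URL ?? "https://api.cloud.llamaindex.ai";
const projectId = process.env.LLAMA_CLOUD_PROJECT_ID;

async function listPipelines(): Promise<unknown> {
  // credentials: "include" forwards the per-user session cookie that
  // LlamaCloud sets for the frontend when deployed.
  const res = await fetch(`${baseUrl}/api/v1/pipelines?project_id=${projectId}`, {
    credentials: "include",
  });
  if (!res.ok) throw new Error(`LlamaCloud request failed: ${res.status}`);
  return res.json();
}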

Most apps need API keys (e.g., OpenAI). You can specify them via a .env file and reference it in your config:

[tool.llamadeploy]
env_files = [".env"]

Then set your secrets:

.env
OPENAI_API_KEY=sk-xxxx

If you prefer to keep your pyproject.toml simple, you can write the same configuration in a llama_deploy.yaml or llama_deploy.toml file. All fields use the same structure and types; omit the tool.llamadeploy prefix.
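
For example, the pyproject.toml configuration at the top of this page should be equivalent to the following llama_deploy.yaml:

llama_deploy.yaml
name: my-app
env_files:
  - .env
workflows:
  workflow-one: my_app.workflows:some_workflow
  workflow-two: my_app.workflows:another_workflow
ui:
  directory: ui
  build_output_dir: ui/static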

The full set of top-level configuration fields:

| Field | Type | Default | Description |
| --- | --- | --- | --- |
| name | string | "default" | URL-safe deployment name. In pyproject.toml, if omitted it falls back to project.name. |
| workflows | map<string, string> | — | Map of workflowName -> "module.path:workflow". |
| env_files | list<string> | [".env"] | Paths to env files to load, relative to the config file. Duplicate entries are removed. |
| env | map<string, string> | {} | Environment variables injected at runtime. |
| llama_cloud | boolean | false | Indicates that a deployment connects to LlamaCloud. Set to true to automatically inject a LlamaCloud API key. |
| ui | UIConfig | null | Optional UI configuration. directory is required if ui is present. |
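
For example, you can inject fixed runtime variables through the env map (the variable name here is purely illustrative):

[tool.llamadeploy]
env = { LOG_LEVEL = "debug" }  # hypothetical variable, injected at runtime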

Fields available under ui (UIConfig):

| Field | Type | Default | Description |
| --- | --- | --- | --- |
| directory | string | — | Path to UI source, relative to the config directory. Required when ui is set. |
| build_output_dir | string | ${directory}/dist | Built UI output directory. If set in TOML/pyproject.toml, the path is relative to the config file. If set via package.json (llamadeploy.build_output_dir), it is resolved as ${directory}/${build_output_dir}. |
| package_manager | string | "npm" (or inferred) | Package manager used to build the UI. If not set, inferred from package.json packageManager (e.g., pnpm@9.0.0 → pnpm). |
| build_command | string | "build" | NPM script name used to build. |
| serve_command | string | "dev" | NPM script name used to serve in development. |
| proxy_port | integer | 4502 | Port the app server proxies to in development. |
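
Putting the UI fields together, a TOML configuration that spells out the defaults from the table above would look like this:

[tool.llamadeploy.ui]
directory = "ui"              # required; the remaining values are the defaults
build_output_dir = "ui/dist"  # ${directory}/dist
package_manager = "npm"
build_command = "build"
serve_command = "dev"
proxy_port = 4502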

Note: once ui.directory is set so that your package.json can be found, you can configure the UI in package.json instead.

For example:

{
  "name": "my-ui",
  "packageManager": "pnpm@9.7.0",
  "scripts": { "build": "vite build", "dev": "vite" },
  "llamadeploy": {
    "build_output_dir": "dist",
    "package_manager": "pnpm",
    "build_command": "build",
    "serve_command": "dev",
    "proxy_port": 5173
  }
}
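
Here proxy_port is set to 5173 rather than the 4502 default, matching Vite's standard dev server port so that in development the app server proxies to where vite actually serves the UI.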