---
title: Full-Stack Projects | Developer Documentation
---

We’ve created both tooling and a variety of example projects (all open-source) to help you get started building a full-stack LLM application.

## create-llama

`create-llama` is a command-line tool that generates a full-stack application template for you. It supports FastAPI, Vercel, and Node backends. This is one of the easiest ways to get started!

Resources:

- [create-llama Blog](https://blog.llamaindex.ai/create-llama-a-command-line-tool-to-generate-llamaindex-apps-8f7683021191)
- [create-llama Repo](https://github.com/run-llama/LlamaIndexTS/tree/main/packages/create-llama)
- [create-llama Additional Templates](https://github.com/jerryjliu/create_llama_projects)
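As a quick sketch, scaffolding a new app typically looks like the following (assumes Node.js and npm are installed; the tool walks you through the remaining choices interactively):

```shell
# Run the latest version of the generator without installing it globally
npx create-llama@latest

# Then follow the prompts to pick a backend (e.g. FastAPI or Node),
# configure your model provider, and name the project.
```

See the repo linked above for the full set of flags and template options.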

## Full-Stack Applications

The LlamaIndex team has also built several in-house projects - all open-sourced under the MIT license - that you can use out of the box, or use as templates to kickstart your own project.

Check them out below.

### SEC Insights

- [SEC Insights App](https://secinsights.ai/)
- [SEC Insights Repo](https://github.com/run-llama/sec-insights)

### Chat LlamaIndex

- [Chat LlamaIndex App](https://chat-llamaindex.vercel.app/)
- [Chat LlamaIndex Repo](https://github.com/run-llama/chat-llamaindex)

### RAGs

- [RAGs Repo](https://github.com/run-llama/rags)

### RAG CLI

- [RAG CLI](/python/framework/getting_started/starter_tools/rag_cli/index.md)
