---
title: Large Language Models (LLMs) | Developer Documentation
---

The LLM is responsible for reading text and generating natural language responses to queries. By default, LlamaIndex.TS uses `gpt-4o`.

The LLM can be explicitly updated through `Settings`.

## Installation

```shell
npm i llamaindex @llamaindex/openai
```

```ts
import { OpenAI } from "@llamaindex/openai";
import { Settings } from "llamaindex";

Settings.llm = new OpenAI({ model: "gpt-3.5-turbo", temperature: 0 });
```
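Once set, the configured LLM can also be used directly, for example through its `complete` method. A minimal sketch, assuming the standard `complete({ prompt })` call and a `text` field on the response (the exact response shape may vary by version), with `OPENAI_API_KEY` set in the environment:

```ts
import { OpenAI } from "@llamaindex/openai";
import { Settings } from "llamaindex";

Settings.llm = new OpenAI({ model: "gpt-4o", temperature: 0 });

// Query the globally configured LLM directly
// (requires OPENAI_API_KEY in the environment)
const response = await Settings.llm.complete({
  prompt: "Write a one-sentence summary of what an LLM is.",
});
console.log(response.text);
```

The same `Settings.llm` instance is picked up automatically by higher-level components such as query engines, so setting it once configures the whole pipeline.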

## Azure OpenAI

To use Azure OpenAI, you only need to set a few environment variables.

For example:

```shell
export AZURE_OPENAI_KEY="<YOUR KEY HERE>"
export AZURE_OPENAI_ENDPOINT="<YOUR ENDPOINT, see https://learn.microsoft.com/en-us/azure/ai-services/openai/quickstart?tabs=command-line%2Cpython&pivots=rest-api>"
export AZURE_OPENAI_DEPLOYMENT="gpt-4" # or some other deployment name
```
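With these variables exported, constructing the `OpenAI` class as usual should route requests to your Azure deployment. A sketch under the assumption that the `AZURE_OPENAI_*` variables are picked up from the environment (behavior may differ across versions):

```ts
import { OpenAI } from "@llamaindex/openai";
import { Settings } from "llamaindex";

// The Azure endpoint, key, and deployment name are read from the
// AZURE_OPENAI_* environment variables set above; no extra code is needed.
Settings.llm = new OpenAI({ temperature: 0 });
```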

## Local LLM

For local LLMs, we currently recommend using the [Ollama](/typescript/framework/modules/models/llms/ollama/index.md) LLM.
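An Ollama model can be plugged into `Settings` the same way as a remote one. A minimal sketch, assuming the `@llamaindex/ollama` package is installed and an Ollama server is running locally with the named model pulled (the model name here is illustrative):

```ts
import { Ollama } from "@llamaindex/ollama";
import { Settings } from "llamaindex";

// Assumes `ollama serve` is running locally and the model has been
// pulled, e.g. via `ollama pull llama3.2`
Settings.llm = new Ollama({ model: "llama3.2" });
```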

## Available LLMs

Most available LLMs are listed in the sidebar on the left. Additionally, the following integrations exist without separate documentation:

- [HuggingFaceLLM](/typescript/framework-api-reference/classes/huggingfacellm/index.md) and [HuggingFaceInferenceAPI](/typescript/framework-api-reference/classes/huggingfaceinferenceapi/index.md).
- [ReplicateLLM](/typescript/framework-api-reference/classes/replicatellm/index.md); see [replicate.com](https://replicate.com/)

Check the [LlamaIndexTS GitHub repository](https://github.com/run-llama/LlamaIndexTS) for the most up-to-date overview of integrations.

## API Reference

- [OpenAI](/typescript/framework-api-reference/classes/openai/index.md)
