DeepSeek LLM
```typescript
import { Settings } from "llamaindex";
import { DeepSeekLLM } from "@llamaindex/deepseek";

Settings.llm = new DeepSeekLLM({
  apiKey: "<YOUR_API_KEY>",
  model: "deepseek-coder", // or "deepseek-chat"
});
```

Example
```typescript
import { DeepSeekLLM } from "@llamaindex/deepseek";

const deepseekLlm = new DeepSeekLLM({
  apiKey: "<YOUR_API_KEY>",
  model: "deepseek-coder", // or "deepseek-chat"
});

async function main() {
  const response = await deepseekLlm.chat({
    messages: [
      {
        role: "system",
        content: "You are an AI assistant",
      },
      {
        role: "user",
        content: "Tell me about San Francisco",
      },
    ],
    stream: false,
  });
  console.log(response);
}

main().catch(console.error);
```

Limitations
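When called with `stream: true` instead, `chat()` resolves to an async iterable of chunks whose `delta` field carries the incremental text (per LlamaIndex's `ChatResponseChunk`). A minimal sketch of consuming such a stream; the mock generator below stands in for the real network call and is not part of the library:

```typescript
// A mock stream standing in for:
//   const stream = await deepseekLlm.chat({ messages, stream: true });
async function* mockStream(): AsyncGenerator<{ delta: string }> {
  for (const delta of ["San ", "Francisco ", "is ", "a ", "city."]) {
    yield { delta };
  }
}

// Accumulate the incremental `delta` strings into the full reply.
async function collectStream(
  stream: AsyncIterable<{ delta: string }>,
): Promise<string> {
  let text = "";
  for await (const chunk of stream) {
    text += chunk.delta; // append each incremental piece of text
  }
  return text;
}

collectStream(mockStream()).then((text) => console.log(text));
```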
Function calling is not currently supported.
The json-output parameter is not currently supported, although the models are still very good at generating JSON.
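Since the json-output parameter is unsupported, a common workaround is to ask for JSON in the prompt and parse the reply yourself. A sketch of a hypothetical helper (`extractJson` is not part of the library) that tolerates a Markdown code fence around the payload, which models often emit:

```typescript
// Hypothetical helper: extract and parse JSON from a model reply,
// stripping an optional ```json ... ``` fence around the payload.
function extractJson(reply: string): unknown {
  const fenced = reply.match(/```(?:json)?\s*([\s\S]*?)\s*```/);
  const payload = fenced ? fenced[1] : reply.trim();
  return JSON.parse(payload);
}

// In real use, `reply` would come from the chat response content.
const reply = '```json\n{ "city": "San Francisco" }\n```';
console.log(extractJson(reply));
```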