Prompts
Prompts are the fundamental input that gives LLMs their expressive power. LlamaIndex uses prompts to build the index, perform insertions, traverse the index during querying, and synthesize the final answer.
You can also provide your own prompt templates to further customize the framework's behavior. The best way to do this is to copy the default prompt from the link above and use it as the base for any modifications.
Usage Pattern
Currently, there are two ways to customize prompts in LlamaIndex: overriding the default prompt when a module is initialized, or updating the prompts of submodules after the fact. For both methods, you will need to define a function that overrides the default prompt.
// Define a custom prompt
const newTextQaPrompt: TextQaPrompt = ({ context, query }) => {
  return `Context information is below.
---------------------
${context}
---------------------
Given the context information and not prior knowledge, answer the query.
Answer the query in the style of a Sherlock Holmes detective novel.
Query: ${query}
Answer:`;
};
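Because a prompt template is just a function that returns a string, you can preview the rendered prompt before wiring it into anything; the sample values below are purely illustrative.

// Sanity-check the template by rendering it with placeholder values (illustrative only)
console.log(
  newTextQaPrompt({
    context: "The author studied philosophy and wrote essays.",
    query: "What did the author do in college?",
  }),
);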
1. Customizing the default prompt on initialization
The first method is to create a new instance of the response synthesizer (or whichever module's prompt you want to update) using the getResponseSynthesizer function. Instead of passing the custom prompt to the deprecated responseBuilder parameter, pass the mode as the first argument to getResponseSynthesizer and supply the new prompt via its options parameter.
// Create an instance of a response synthesizer

// Deprecated usage:
const responseSynthesizer = new ResponseSynthesizer({
  responseBuilder: new CompactAndRefine(undefined, newTextQaPrompt),
});

// Current usage:
const responseSynthesizer = getResponseSynthesizer("compact", {
  textQATemplate: newTextQaPrompt,
});
// Create index
const index = await VectorStoreIndex.fromDocuments([document]);

// Query the index
const queryEngine = index.asQueryEngine({ responseSynthesizer });

const response = await queryEngine.query({
  query: "What did the author do in college?",
});
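Putting the pieces together, a minimal end-to-end sketch of method 1 could look like the following. It assumes that Document, VectorStoreIndex, getResponseSynthesizer, and the TextQaPrompt type are all exported by the llamaindex package in your installed version, and that an LLM is already configured in your environment; adjust the imports to match your setup.

// Minimal sketch: wire the custom text QA prompt into a query engine.
// Assumes these symbols are exported by the "llamaindex" package in your version.
import {
  Document,
  VectorStoreIndex,
  getResponseSynthesizer,
  type TextQaPrompt,
} from "llamaindex";

const newTextQaPrompt: TextQaPrompt = ({ context, query }) => {
  return `Context information is below.
---------------------
${context}
---------------------
Given the context information and not prior knowledge, answer the query.
Answer the query in the style of a Sherlock Holmes detective novel.
Query: ${query}
Answer:`;
};

// Replace the placeholder text with your own document content.
const document = new Document({ text: "..." });

const responseSynthesizer = getResponseSynthesizer("compact", {
  textQATemplate: newTextQaPrompt,
});

const index = await VectorStoreIndex.fromDocuments([document]);
const queryEngine = index.asQueryEngine({ responseSynthesizer });

const response = await queryEngine.query({
  query: "What did the author do in college?",
});
console.log(response.toString());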
2. Customizing submodule prompts
The second method relies on the fact that most modules in LlamaIndex expose a getPrompts and an updatePrompt method, which let you inspect and override the default prompts. This is useful when you want to change a prompt on the fly, or target submodules at a more granular level.
// Create index
const index = await VectorStoreIndex.fromDocuments([document]);

// Query the index
const queryEngine = index.asQueryEngine();

// Get a list of prompts for the query engine
const prompts = queryEngine.getPrompts();
// output: { "responseSynthesizer:textQATemplate": defaultTextQaPrompt, "responseSynthesizer:refineTemplate": defaultRefineTemplatePrompt }

// Now, we can override the default prompt
queryEngine.updatePrompt({
  "responseSynthesizer:textQATemplate": newTextQaPrompt,
});

const response = await queryEngine.query({
  query: "What did the author do in college?",
});
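The getPrompts output above also lists a responseSynthesizer:refineTemplate key, so the refine prompt can be overridden the same way. The newRefinePrompt below is a hypothetical example: its parameter names (query, existingAnswer, context) are an assumption, so copy the default refine prompt from your installed version and use that as the base.

// Hypothetical custom refine prompt; the parameter names are an assumption,
// mirror the default refine prompt shipped with your version of LlamaIndex.
const newRefinePrompt = ({
  query,
  existingAnswer,
  context,
}: {
  query: string;
  existingAnswer: string;
  context: string;
}) => {
  return `The original query is as follows: ${query}
We have provided an existing answer: ${existingAnswer}
We have the opportunity to refine the existing answer with more context below.
---------------------
${context}
---------------------
Refine the answer so it still reads like a Sherlock Holmes detective novel. If the context is not useful, return the existing answer.
Refined Answer:`;
};

queryEngine.updatePrompt({
  "responseSynthesizer:refineTemplate": newRefinePrompt,
});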