Memory in Bedrock AgentCore

Bedrock AgentCore Memory is a managed AWS service that allows you to create and manage memory resources that store conversation contexts for agents.

To get started, install LlamaIndex and the AgentCore Memory integration:

%pip install llama-index llama-index-memory-bedrock-agentcore

Additionally, install the following packages to follow along with the provided example: the Yahoo Finance tool supplies the agent's tools, and the Bedrock Converse integration serves as the LLM for the FunctionAgent.

%pip install llama-index-tools-yahoo-finance llama-index-llms-bedrock-converse

Initialize the AgentCoreMemoryContext and AgentCoreMemory classes:

from llama_index.memory.bedrock_agentcore import (
    AgentCoreMemory,
    AgentCoreMemoryContext,
)

context = AgentCoreMemoryContext(
    actor_id="<REQUIRED>",
    memory_id="<REQUIRED>",
    session_id="<REQUIRED>",
    memory_strategy_id="<OPTIONAL>",
    namespace="<OPTIONAL>",
)

agentcore_memory = AgentCoreMemory(context=context)
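
The three required fields identify the AgentCore memory resource (memory_id), the end user (actor_id), and the current conversation (session_id); memory_strategy_id and namespace optionally scope which long-term memory records are retrieved. As a minimal sketch, assuming the IDs are supplied through environment variables (the variable names are illustrative, not part of the integration):

import os

context = AgentCoreMemoryContext(
    actor_id=os.environ["AGENTCORE_ACTOR_ID"],      # the end user this conversation belongs to
    memory_id=os.environ["AGENTCORE_MEMORY_ID"],    # the AgentCore memory resource
    session_id=os.environ["AGENTCORE_SESSION_ID"],  # groups turns into one conversation
)

agentcore_memory = AgentCoreMemory(context=context)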

Initialize the FunctionAgent or ReActAgent with any tools and an LLM.

from llama_index.llms.bedrock_converse import BedrockConverse
from llama_index.core.agent.workflow import FunctionAgent
from llama_index.tools.yahoo_finance import YahooFinanceToolSpec

llm = BedrockConverse(model="us.anthropic.claude-sonnet-4-20250514-v1:0")

finance_tool_spec = YahooFinanceToolSpec()

agent = FunctionAgent(
    tools=finance_tool_spec.to_tool_list(),
    llm=llm,
)
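
The same tools and LLM also work with a ReActAgent, if you prefer a ReAct-style reasoning loop over native tool calling. A sketch:

from llama_index.core.agent.workflow import ReActAgent

# Same tool list and LLM as above; only the agent class changes.
react_agent = ReActAgent(
    tools=finance_tool_spec.to_tool_list(),
    llm=llm,
)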

Run the agent with the memory instance; the exchange, including the tool call's result, is stored in AgentCore Memory.

response = await agent.run(
    "What is the stock price for Amazon?", memory=agentcore_memory
)
print(str(response))
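
Because the conversation is persisted in the managed AgentCore Memory service rather than in the agent object, a fresh agent constructed with the same context can still see the earlier turns. A quick sketch:

# A new agent instance, same memory: prior turns remain available.
fresh_agent = FunctionAgent(
    tools=finance_tool_spec.to_tool_list(),
    llm=llm,
)
response = await fresh_agent.run(
    "Which company did I just ask about?", memory=agentcore_memory
)
print(str(response))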

Query the agent again; it consults the memory to answer.

response = await agent.run(
    "What stock prices have I asked for?", memory=agentcore_memory
)
print(str(response))
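
Each call to agent.run with the same memory appends to the same session, so multi-turn follow-ups can reference earlier questions. An illustrative loop:

# Every turn runs against the same memory, so the second question
# can refer back to the first.
questions = [
    "What is the stock price for Apple?",
    "How does that compare to the first stock I asked about?",
]
for question in questions:
    response = await agent.run(question, memory=agentcore_memory)
    print(str(response))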