Many users ask: “Do I need OpenAI or Gemini API keys for OpenClaw memory to work?”
No. OpenClaw memory itself works without OpenAI/Gemini.
What is optional is semantic memory search (memory_search).
That feature needs an embeddings provider (remote or local).
OpenClaw memory is stored in Markdown files in your workspace, such as:
- `memory/YYYY-MM-DD.md`
- `MEMORY.md` (optional)

These files are the source of truth.
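For illustration, a workspace with a couple of days of memory might look like this (the dated file names are examples following the pattern above):

```
workspace/
├── MEMORY.md            # optional long-term memory file
└── memory/
    ├── 2026-01-14.md    # daily memory file
    └── 2026-01-15.md
```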
You need an API key only when using a remote embeddings provider (openai, gemini, or voyage) for memory search.
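As a sketch, selecting a remote provider might look like the following (assuming the same `memorySearch` config shape shown in the local-embeddings example below; the key itself comes from your environment or provider configuration, not this block):

```json5
agents: {
  defaults: {
    memorySearch: {
      // Remote embeddings: requires the provider's API key
      // (e.g. an OpenAI key) to be configured separately.
      provider: "openai",
      fallback: "none"
    }
  }
}
```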
Also important: Codex OAuth for chat/completions does not automatically cover embeddings for memory search.
If you do not want to call any external embeddings API, configure local embeddings:
```json5
agents: {
  defaults: {
    memorySearch: {
      provider: "local",
      local: {
        modelPath: "hf:your-embedding-model.gguf"
      },
      fallback: "none"
    }
  }
}
```
If `memorySearch.provider` is not set, OpenClaw auto-selects a provider in this order:

1. `local` (if a local model path is configured and available)
2. `openai` (if a key is available)
3. `gemini` (if a key is available)
4. `voyage` (if a key is available)

`fallback: "none"` means "do not fall back to another provider."
By itself, it does not always mean “disable memory search entirely.” Whether search works still depends on your selected provider and whether that provider is configured correctly.
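For example, with the settings below and no OpenAI key configured, memory_search would fail rather than silently switching to another provider, because `fallback: "none"` disables provider fallback (a hypothetical misconfiguration, shown only to illustrate the behavior described above):

```json5
agents: {
  defaults: {
    memorySearch: {
      provider: "openai",  // explicitly selected provider
      fallback: "none"     // no fallback: a missing key surfaces as an error
    }
  }
}
```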
If you want a chronological “what happened” timeline, see:
It complements memory files/search, but does not replace embedding-based semantic recall.