Easymemory
A 100% local memory layer for chatbots and agents, with an MCP server for Claude, GPT, Gemini, and local models.
About
Your Memory, Any LLM - Auto-saving conversational memory with MCP support.
┌─────────────────────────────────────────────────────────────┐
│                         EASYMEMORY                          │
├─────────────────────────────────────────────────────────────┤
│                                                             │
│  ┌─────────┐  ┌─────────┐  ┌─────────┐  ┌─────────┐         │
│  │ Claude  │  │   GPT   │  │ Gemini  │  │  Local  │         │
│  └────┬────┘  └────┬────┘  └────┬────┘  └────┬────┘         │
│       │            │            │            │              │
│       └────────────┴─────┬──────┴────────────┘              │
│                          │                                  │
│                   ┌──────▼──────┐                           │
│                   │ MCP Server  │                           │
│                   └──────┬──────┘                           │
│                          │                                  │
│                   ┌──────▼──────┐                           │
│                   │   Memory    │                           │
│                   │    Store    │                           │
│                   └─────────────┘                           │
│                                                             │
└─────────────────────────────────────────────────────────────┘
✨ Features
- 🔄 Auto-Save: Every conversation automatically saved
- 🔍 Smart Search: Semantic search across all memories
- 🧩 Hybrid Retrieval+: Graph + Vector + Keyword + Built-in Local Knowledge Index (no external libs)
- 📄 Document Support: PDF, DOCX, TXT, Markdown (see the sketch after this list)
- 🔌 MCP Server: Works with Claude, GPT, any MCP-compatible LLM
- 💾 100% Local: Your data stays on your machine
- 🏢 Enterprise Security: OAuth2 (Client Credentials), API Keys, rate limit, audit log
- 🔗 Integrations: Slack JSON import + Notion/GDrive folder indexing
- 🚀 Easy Setup: One command to start
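The Document Support and Hybrid Retrieval+ bullets suggest an ingest-then-search flow along the lines sketched below. Treat it as a speculative illustration: `add_document` and `search` are hypothetical method names that this page does not document; only the `EasyMemoryAgent` constructor arguments come from the Quick Start further down.
import asyncio
from easymemory.agent import EasyMemoryAgent

async def main():
    async with EasyMemoryAgent(
        llm_provider="ollama",
        model="llama3.1:8b"
    ) as agent:
        # Index a local file (PDF, DOCX, TXT, and Markdown are listed above)
        await agent.add_document("notes/meeting.md")  # hypothetical call
        # Hybrid (graph + vector + keyword) retrieval over stored memories
        hits = await agent.search("what did we decide about pricing?")  # hypothetical call
        for hit in hits:
            print(hit)

asyncio.run(main())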
📦 Installation
# Clone the repo
git clone https://github.com/yourusername/easymemory.git
cd easymemory
# Install in development mode
pip install -e .
🚀 Quick Start
Option 1: MCP Server (for Claude Desktop, GPT, etc.)
# Start the MCP server
easymemory-server --port 8100
Then configure your LLM client to connect to http://localhost:8100/mcp
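To sanity-check the connection from code rather than a desktop client, here is a minimal sketch using the official MCP Python SDK (pip install mcp). It assumes the /mcp endpoint speaks the streamable HTTP transport; the tools that come back are whatever Easymemory registers.
import asyncio
from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

async def main():
    # Connect to the locally running Easymemory MCP endpoint
    async with streamablehttp_client("http://localhost:8100/mcp") as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # List the memory tools the server exposes
            tools = await session.list_tools()
            for tool in tools.tools:
                print(tool.name, "-", tool.description or "")

asyncio.run(main())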
Health checks:
http://localhost:8100/healthz
http://localhost:8100/readyz
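For scripts that need to wait for the server before talking to it, a stdlib-only poll of the readiness endpoint might look like this (a 200 response once ready is an assumption; the page does not spell out the status codes):
import time
import urllib.request

def wait_until_ready(url="http://localhost:8100/readyz", timeout=30.0):
    # Poll /readyz until it answers 200 or the timeout expires.
    deadline = time.time() + timeout
    while time.time() < deadline:
        try:
            with urllib.request.urlopen(url, timeout=2) as resp:
                if resp.status == 200:
                    return True
        except OSError:
            pass  # server not up yet; retry
        time.sleep(1)
    return False

if __name__ == "__main__":
    print("ready" if wait_until_ready() else "timed out waiting for server")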
Option 2: Interactive Agent
# With Ollama
easymemory-agent --provider ollama --model llama3.1:8b
# With OpenAI
easymemory-agent --provider openai --model gpt-4
Option 3: Use in Python
import asyncio
from easymemory.agent import EasyMemoryAgent

async def main():
    async with EasyMemoryAgent(
        llm_provider="ollama",
        model="llama3.1:8b"
    ) as agent:
        # Chat - automatically saves everything!
        # NOTE: `chat` is an assumed method name; check the package
        # for the exact call.
        response = await agent.chat("Remember that my favorite editor is Vim.")
        print(response)

asyncio.run(main())
Don't lose this
Three weeks from now, you'll want Easymemory again. Will you remember where to find it?
Save it to your library and the next time you need Easymemory, it's one tap away in any AI app you use. Group it into a bench with the rest of your toolkit for that kind of task and you can pull the whole stack at once.
⚡ Pro tip for geeks: add a-gnt 🤵🏻‍♂️ as a custom connector in Claude or as a custom GPT in ChatGPT; one click and your library is right there in the chat. Or, if you're in an editor, install the a-gnt MCP server and say “use my [bench name]” in Claude Code, Cursor, VS Code, or Windsurf.
a-gnt's Take
Our honest review
This plugs directly into your AI and gives it new abilities it didn't have before: a 100% local memory layer for chatbots and agents, with an MCP server for Claude, GPT, Gemini, and local models. Once connected, just ask your AI to use it. It's completely free and works across most major AI apps. It just landed in the catalog, so it's worth trying while it's fresh.
Tips for getting started
Tap "Get" above, pick your AI app, and follow the steps. Most installs take under 30 seconds.
Heads up: this needs an API key to work. You'll get one from the service's website (usually free). The setup guide tells you exactly where.
What's New
Imported from GitHub
Ratings & Reviews
0.0 out of 5 (0 ratings)
No reviews yet. Be the first to share your experience.