by iikarus · MCP Server · ★ 40
Dragon Brain — persistent long-term memory for AI agents via MCP (Model Context Protocol). Knowledge graph (FalkorDB) + vector search (Qdrant) + CUDA GPU embeddings. Works with Claude, Gemini CLI, Cursor, Windsurf, VS Code Copilot. 31 tools, 1116 tests.
| Field | Value |
| --- | --- |
| Stars | 40 |
| Forks | 7 |
| Language | Python |
| Category | MCP Server |
| License | MIT |
| Quality Score | 40.3/100 |
| Open Issues | 2 |
| Last Updated | 2026-03-26 |
| Created | 2026-02-23 |
| Platforms | claude-code, cli, codex, gemini, mcp, python |
| Est. Tokens | ~99k |
Dragon-Brain provides persistent long-term memory for AI agents via MCP (Model Context Protocol), combining a knowledge graph (FalkorDB), vector search (Qdrant), and CUDA GPU embeddings, and it works with Claude, Gemini CLI, Cursor, Windsurf, and VS Code Copilot. It is categorized as an MCP Server with 40 GitHub stars.
Dragon-Brain is primarily written in Python. It covers topics such as ai-memory, claude, codex-cli.
You can find installation instructions and usage details in the Dragon-Brain GitHub repository at github.com/iikarus/Dragon-Brain. The project has 40 stars and 7 forks.
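Since Dragon-Brain is an MCP server, a typical setup step is registering it with an MCP client. The snippet below follows the standard `mcpServers` configuration shape used by MCP clients such as Claude Desktop; the launch command and module name are assumptions, not taken from the repository, so check the project's README for the actual invocation.

```json
{
  "mcpServers": {
    "dragon-brain": {
      "command": "python",
      "args": ["-m", "dragon_brain.server"]
    }
  }
}
```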
Dragon-Brain is released under the MIT license, making it free to use and modify according to the license terms.