Txtai MCP Server
Txtai MCP servers enable AI models to interact with txtai, an AI-powered search engine that builds vector indexes (also known as embeddings) to perform similarity searches.
Overview
The Txtai MCP Server enables AI models to interact with txtai, an AI-powered search engine that builds vector indexes (also known as embeddings) to perform similarity searches. Txtai is widely used for semantic search, retrieval-augmented generation (RAG), and building intelligent applications.
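Under the hood, txtai's Python library exposes an Embeddings object that converts text into vectors, indexes them, and answers similarity queries. The snippet below is a minimal sketch of that idea using txtai directly; the model path and sample documents are illustrative and are not part of this server's configuration.
from txtai.embeddings import Embeddings

# Create an embeddings index; the model path here is only an example
embeddings = Embeddings({"path": "sentence-transformers/all-MiniLM-L6-v2", "content": True})

documents = [
    "Meeting notes from the March planning session",
    "Grocery list for the weekend",
]

# txtai accepts (id, text, tags) tuples when indexing
embeddings.index([(uid, text, None) for uid, text in enumerate(documents)])

# Returns the stored document closest in meaning to the query
print(embeddings.search("what did we plan in March?", 1))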
Community Server:
Developed and maintained by rmtech1
Key Features
Semantic Search
Perform semantic search across stored memories
Persistent Storage
Persistent storage with file-based backend
Tag-based Memory
Tag-based memory organization and retrieval
Claude and Cline AI Integration
Integration with Claude and Cline AI
Available Tools
Quick Reference
| Tool | Purpose | Category |
|---|---|---|
| store_memory | Store new memory content | Write |
| retrieve_memory | Retrieve memories based on semantic search | Read |
| search_by_tag | Search memories by tags | Read |
| delete_memory | Delete a specific memory | Write |
| get_stats | Get database statistics | Discovery |
| check_health | Check database and embedding model health | Discovery |
Detailed Usage
store_memory
Store new memory content with metadata and tags.
use_mcp_tool({
  server_name: "txtai-assistant",
  tool_name: "store_memory",
  arguments: {
    content: "Important information to remember",
    tags: ["important"]
  }
});
Returns a confirmation message.
retrieve_memory
Retrieve memories based on semantic search.
use_mcp_tool({
  server_name: "txtai-assistant",
  tool_name: "retrieve_memory",
  arguments: {
    query: "what was the important information?",
    n_results: 5
  }
});
Returns a list of matching memories.
search_by_tag
Search memories by tags.
use_mcp_tool({
  server_name: "txtai-assistant",
  tool_name: "search_by_tag",
  arguments: {
    tags: ["important", "context"]
  }
});
Returns a list of matching memories.
delete_memory
Delete a specific memory by content hash.
use_mcp_tool({
  server_name: "txtai-assistant",
  tool_name: "delete_memory",
  arguments: {
    content_hash: "hash_value"
  }
});
Returns a confirmation message.
get_stats
Get database statistics.
use_mcp_tool({
  server_name: "txtai-assistant",
  tool_name: "get_stats",
  arguments: {}
});
Returns database statistics.
check_health
Check database and embedding model health.
use_mcp_tool({
  server_name: "txtai-assistant",
  tool_name: "check_health",
  arguments: {}
});
Returns the health status of the server.
Installation
{
  "mcpServers": {
    "txtai-assistant": {
      "command": "path/to/txtai-assistant-mcp/scripts/start.sh",
      "env": {}
    }
  }
}
Custom Connection:
The server can be configured using environment variables in the .env file.
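As an illustration only, such a .env file could look like the example below; the variable names shown are hypothetical placeholders, so check the project's own .env.example for the names the server actually reads.
# Hypothetical variable names for illustration; see the project's .env.example for the real ones
STORAGE_PATH=./data
MODEL_NAME=sentence-transformers/all-MiniLM-L6-v2
LOG_LEVEL=INFO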
Common Use Cases
1. Store and retrieve memories
Store and retrieve memories based on semantic search:
// Store a memory
use_mcp_tool({
  server_name: "txtai-assistant",
  tool_name: "store_memory",
  arguments: {
    content: "Important information to remember",
    tags: ["important"]
  }
});
// Retrieve memories
use_mcp_tool({
  server_name: "txtai-assistant",
  tool_name: "retrieve_memory",
  arguments: {
    query: "what was the important information?",
    n_results: 5
  }
});
2. Search by tags
Search memories by tags:
use_mcp_tool({
  server_name: "txtai-assistant",
  tool_name: "search_by_tag",
  arguments: {
    tags: ["important", "context"]
  }
});