Txtai MCP Server
Txtai MCP servers enable AI models to interact with txtai, an AI-powered search engine that builds vector indexes (also known as embeddings) to perform similarity searches.
Overview
txtai is widely used for semantic search, retrieval-augmented generation (RAG), and building intelligent applications. This server exposes txtai-backed memory storage and retrieval to AI models over the Model Context Protocol (MCP).
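The idea behind a vector index can be pictured in miniature: content is mapped to vectors by an embedding model, and a query is answered by ranking stored vectors by cosine similarity. The sketch below is a conceptual illustration only, with tiny hand-made vectors standing in for real embeddings; it is not txtai's actual API.

```javascript
// Cosine similarity between two vectors of equal length
function cosine(a, b) {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// A toy "index": each memory pairs content with a (hand-made) vector
const index = [
  { content: "Meeting notes from Tuesday", vector: [0.9, 0.1, 0.0] },
  { content: "Grocery list", vector: [0.0, 0.2, 0.9] },
];

// Rank memories by similarity to the query vector, return the top n
function search(queryVector, n) {
  return index
    .map((m) => ({ content: m.content, score: cosine(queryVector, m.vector) }))
    .sort((x, y) => y.score - x.score)
    .slice(0, n);
}

console.log(search([1.0, 0.0, 0.1], 1)[0].content);
```

A real embedding model produces vectors with hundreds of dimensions, but the ranking step works exactly as above.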
Community Server: developed and maintained by rmtech1.
Key Features
- Semantic Search: perform semantic search across stored memories
- Persistent Storage: file-based storage backend that persists memories
- Tag-based Memory: organize and retrieve memories by tag
- Claude and Cline AI Integration: works with Claude and Cline AI assistants
Available Tools
Quick Reference
| Tool | Purpose | Category |
|---|---|---|
| `store_memory` | Store new memory content | Write |
| `retrieve_memory` | Retrieve memories based on semantic search | Read |
| `search_by_tag` | Search memories by tags | Read |
| `delete_memory` | Delete a specific memory | Write |
| `get_stats` | Get database statistics | Discovery |
| `check_health` | Check database and embedding model health | Discovery |
Detailed Usage
store_memory

Store new memory content with metadata and tags.

```javascript
use_mcp_tool({
  server_name: "txtai-assistant",
  tool_name: "store_memory",
  arguments: {
    content: "Important information to remember",
    tags: ["important"]
  }
});
```

Returns a confirmation message.
retrieve_memory

Retrieve memories based on semantic search.

```javascript
use_mcp_tool({
  server_name: "txtai-assistant",
  tool_name: "retrieve_memory",
  arguments: {
    query: "what was the important information?",
    n_results: 5
  }
});
```

Returns a list of matching memories.
search_by_tag

Search memories by tags.

```javascript
use_mcp_tool({
  server_name: "txtai-assistant",
  tool_name: "search_by_tag",
  arguments: {
    tags: ["important", "context"]
  }
});
```

Returns a list of matching memories.
delete_memory

Delete a specific memory by content hash.

```javascript
use_mcp_tool({
  server_name: "txtai-assistant",
  tool_name: "delete_memory",
  arguments: {
    content_hash: "hash_value"
  }
});
```

Returns a confirmation message.
get_stats

Get database statistics.

```javascript
use_mcp_tool({
  server_name: "txtai-assistant",
  tool_name: "get_stats",
  arguments: {}
});
```

Returns database statistics.
check_health

Check database and embedding model health.

```javascript
use_mcp_tool({
  server_name: "txtai-assistant",
  tool_name: "check_health",
  arguments: {}
});
```

Returns the health status of the server.
Installation

Add the server to your MCP client's settings file:

```json
{
  "mcpServers": {
    "txtai-assistant": {
      "command": "path/to/txtai-assistant-mcp/scripts/start.sh",
      "env": {}
    }
  }
}
```
Custom Connection:
The server can also be configured through environment variables defined in a `.env` file.
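As a hedged sketch of what such a `.env` file might look like, the variable names below are hypothetical placeholders, not documented settings; consult the server's own documentation or example `.env` file for the real names.

```ini
# Hypothetical variable names -- check the server's docs for the actual ones
STORAGE_PATH=./data
EMBEDDING_MODEL=sentence-transformers/all-MiniLM-L6-v2
```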
Common Use Cases
1. Store and retrieve memories
Store and retrieve memories based on semantic search:
```javascript
// Store a memory
use_mcp_tool({
  server_name: "txtai-assistant",
  tool_name: "store_memory",
  arguments: {
    content: "Important information to remember",
    tags: ["important"]
  }
});

// Retrieve memories
use_mcp_tool({
  server_name: "txtai-assistant",
  tool_name: "retrieve_memory",
  arguments: {
    query: "what was the important information?",
    n_results: 5
  }
});
```
2. Search by tags
Search memories by tags:
```javascript
use_mcp_tool({
  server_name: "txtai-assistant",
  tool_name: "search_by_tag",
  arguments: {
    tags: ["important", "context"]
  }
});
```
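The tag search above can be pictured as a simple set-intersection filter over stored memories. The sketch below is a client-side illustration of that idea, assuming a memory matches when it carries at least one of the requested tags; the server's actual matching rules may differ.

```javascript
// Toy in-memory store standing in for the server's database
const memories = [
  { content: "Project deadline is Friday", tags: ["important", "work"] },
  { content: "Buy milk", tags: ["errand"] },
  { content: "API key rotation procedure", tags: ["important", "context"] },
];

// Keep memories whose tag set intersects the requested tags
function searchByTag(tags) {
  return memories.filter((m) => m.tags.some((t) => tags.includes(t)));
}

console.log(searchByTag(["important", "context"]).map((m) => m.content));
```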