MCP-NLP is a FastMCP application designed to provide NLP (Natural Language Processing) capabilities using the Model Context Protocol (MCP).
- FastMCP Framework v2: A modern framework that provides a fast, Pythonic way to build MCP servers.
- Model Context Protocol (MCP): A protocol that allows for the management and control of LLM contexts.
- NLP modules:
  - `textdistance`: A module for calculating text distance metrics.
Before you begin, ensure you have Python and `uv` installed.

- Clone the repository:

  ```bash
  git clone https://github.com/tivaliy/mcp-nlp.git
  cd mcp-nlp
  ```

- Install dependencies (using `uv`):

  ```bash
  uv sync
  ```
The MCP-NLP server supports two authentication modes:
- Unauthenticated Mode (default):
  - No API key required to access the server
  - Set environment variable `API_KEY_ENABLED=False`
- API Key Authentication:
  - Requires a valid API key in the request header
  - Set environment variables `API_KEY_ENABLED=True` and `API_KEY=your-secret-key`
  - By default, the header name is `X-API-Key` (can be customized with `API_KEY_NAME`)
Example `.env` file:

```
# Authentication configuration
API_KEY_ENABLED=True
API_KEY=not-a-secret
# API_KEY_NAME=X-API-Key  # Optional: customize header name
```
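When API key authentication is enabled, clients must send the key with every request. The snippet below is a minimal client-side sketch, assuming the FastMCP Python client is used and that its `StreamableHttpTransport` accepts custom request headers; the header name and key value must match your `API_KEY_NAME` and `API_KEY` settings.

```python
import asyncio

from fastmcp import Client
from fastmcp.client.transports import StreamableHttpTransport


async def main() -> None:
    # Assumption: StreamableHttpTransport forwards custom headers with each request.
    # The header name must match API_KEY_NAME (X-API-Key by default).
    transport = StreamableHttpTransport(
        "http://127.0.0.1:8000/mcp",
        headers={"X-API-Key": "not-a-secret"},
    )
    async with Client(transport) as client:
        tools = await client.list_tools()
        print([tool.name for tool in tools])


asyncio.run(main())
```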
The MCP-NLP server currently provides the following MCP tools:
These tools calculate similarity/distance between text sequences using various algorithms:
- `textdistance_measure`:
  - Purpose: Measures text distance between two sequences of strings
  - Parameters:
    - `source` (required): The source text string
    - `reference` (required): The reference text string to compare against
    - `algorithm` (optional): The algorithm to use (default: `levenshtein`)
    - `metric` (optional): The metric to use (default: `normalized_similarity`)
  - Returns: A float value representing the calculated distance/similarity
- `textdistance_list_metrics`:
  - Purpose: Lists all supported metrics for text distance algorithms
  - Parameters: None
  - Returns: A list of available metrics: `distance`, `similarity`, `normalized_distance`, `normalized_similarity`, `maximum`
- Supported Metrics:
  - `distance`: Raw distance score
  - `similarity`: Raw similarity score
  - `normalized_distance`: Distance normalized to a 0-1 scale
  - `normalized_similarity`: Similarity normalized to a 0-1 scale (default)
  - `maximum`: Maximum possible value for the algorithm
- Default Algorithm: Levenshtein
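To illustrate how these tools map onto the `textdistance` library, here is a minimal, hypothetical server-side sketch (not the project's actual implementation). It assumes that algorithm names such as `levenshtein` resolve to `textdistance` algorithm objects and that the metric names above correspond to their method names.

```python
from fastmcp import FastMCP
import textdistance

mcp = FastMCP("mcp-nlp-sketch")


@mcp.tool()
def textdistance_measure(
    source: str,
    reference: str,
    algorithm: str = "levenshtein",
    metric: str = "normalized_similarity",
) -> float:
    """Measure text distance/similarity between two strings."""
    # e.g. textdistance.levenshtein.normalized_similarity("kitten", "sitting")
    algo = getattr(textdistance, algorithm)
    return float(getattr(algo, metric)(source, reference))


@mcp.tool()
def textdistance_list_metrics() -> list[str]:
    """List the supported metric names."""
    return ["distance", "similarity", "normalized_distance", "normalized_similarity", "maximum"]


if __name__ == "__main__":
    mcp.run(transport="streamable-http")
```

With the defaults above, calling the tool with only `source` and `reference` yields a normalized Levenshtein similarity in the 0-1 range.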
To run the application locally:
- Start the FastMCP application:

  ```bash
  mcp-nlp --transport streamable-http
  ```

- Access the MCP server endpoint at `http://127.0.0.1:8000/mcp` (when using the `streamable-http` transport)
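Once the server is up, you can exercise the tools from any MCP client. Below is a small sketch using the FastMCP Python client against the default endpoint; the tool arguments are illustrative only.

```python
import asyncio

from fastmcp import Client


async def main() -> None:
    # Assumes the server is running with the streamable-http transport
    # at the default endpoint shown above (unauthenticated mode).
    async with Client("http://127.0.0.1:8000/mcp") as client:
        result = await client.call_tool(
            "textdistance_measure",
            {"source": "kitten", "reference": "sitting"},
        )
        print(result)  # normalized Levenshtein similarity between the two strings


asyncio.run(main())
```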
To run the MCP server in a Docker container:
- Build the Docker image:

  ```bash
  docker build -t mcp-nlp .
  ```

- Run the Docker container:

  ```bash
  docker run --rm -e TRANSPORT=streamable-http -p 8000:8000 mcp-nlp
  ```

- Access the MCP server endpoint at `http://127.0.0.1:8000/mcp` (when using the `streamable-http` transport)

Note: Make sure to set the `TRANSPORT` environment variable to `streamable-http` or `sse` when running the Docker container.
To use the MCP-NLP server with VS Code:
- Make sure your MCP-NLP server is running

- Add the server configuration to your VS Code `settings.json` (using `stdio` transport):

  ```json
  {
    "servers": {
      "mcp-nlp": {
        "type": "stdio",
        "command": "${workspaceFolder}/.venv/bin/mcp-nlp",
        "env": {
          "API_KEY_ENABLED": "false"
        }
      }
    }
  }
  ```

- Enable MCP in VS Code:

  ```json
  "chat.mcp.enabled": true,
  "github.copilot.advanced": {
    "mcp.enabled": true
  }
  ```

- You can now use the MCP-NLP tools directly in VS Code through GitHub Copilot
MCP | Model Context Protocol | FastMCP | NLP
This project is licensed under the MIT License. See the LICENSE file for details.