A Model Context Protocol (MCP) server that retrieves information from Wikipedia to provide context to Large Language Models (LLMs). This tool helps AI assistants access factual information from Wikipedia to ground their responses in reliable sources.
The Wikipedia MCP server provides real-time access to Wikipedia information through a standardized Model Context Protocol interface. This allows LLMs to retrieve accurate and up-to-date information directly from Wikipedia to enhance their responses.
- Search Wikipedia: Find articles matching specific queries
- Retrieve Article Content: Get full article text with all information
- Article Summaries: Get concise summaries of articles
- Section Extraction: Retrieve specific sections from articles
- Link Discovery: Find links within articles to related topics
- Related Topics: Discover topics related to a specific article
- Revision History: Access complete edit history of Wikipedia pages with user details and changes
- User Contributions: Track all contributions made by specific Wikipedia users
- Revision Comparison: Compare different versions of articles to see what changed
- Page Creator Discovery: Find out who originally created a Wikipedia page
- Talk Page Access: Retrieve Wikipedia talk pages and discussion content
- Edit Spike Detection: Statistically detect periods of unusual editing activity using z-scores
- Significance Analysis: Identify the most important edits using weighted scoring algorithms
- Controversy Detection: Discover edit wars, reverts, and contentious periods in article history
- Multi-language Support: Access Wikipedia in different languages by specifying the `--language` or `-l` argument when running the server (e.g., `wikipedia-mcp --language ta` for Tamil)
- Country/Locale Support: Use intuitive country codes like `--country US`, `--country China`, or `--country TW` instead of language codes; these are automatically mapped to the appropriate Wikipedia language variants
- Language Variant Support: Support for language variants such as Chinese traditional/simplified (e.g., `zh-hans` for Simplified Chinese, `zh-tw` for Traditional Chinese), Serbian scripts (`sr-latn`, `sr-cyrl`), and other regional variants
- Optional Caching: Cache API responses for improved performance using `--enable-cache`
- Google ADK Compatibility: Fully compatible with Google ADK agents and other AI frameworks that use strict function calling schemas
The best way to install for Claude Desktop usage is with pipx, which installs the command globally:
# Install pipx if you don't have it
pip install pipx
pipx ensurepath
# Install the Wikipedia MCP server
pipx install wikipedia-mcp
This ensures the `wikipedia-mcp` command is available in Claude Desktop's PATH.
To install wikipedia-mcp for Claude Desktop automatically via Smithery:
npx -y @smithery/cli install @Rudra-ravi/wikipedia-mcp --client claude
You can also install directly from PyPI:
pip install wikipedia-mcp
Note: If you use this method and encounter connection issues with Claude Desktop, you may need to use the full path to the command in your configuration. See the Configuration section for details.
# Create a virtual environment
python3 -m venv venv
# Activate the virtual environment
source venv/bin/activate
# Install the package
pip install git+https://github.com/rudra-ravi/wikipedia-mcp.git
# Clone the repository
git clone https://github.com/rudra-ravi/wikipedia-mcp.git
cd wikipedia-mcp
# Create a virtual environment
python3 -m venv wikipedia-mcp-env
source wikipedia-mcp-env/bin/activate
# Install in development mode
pip install -e .
# If installed with pipx
wikipedia-mcp
# If installed in a virtual environment
source venv/bin/activate
wikipedia-mcp
# Specify transport protocol (default: stdio)
wikipedia-mcp --transport stdio # For Claude Desktop
wikipedia-mcp --transport sse # For HTTP streaming
# Specify language (default: en for English)
wikipedia-mcp --language ja # Example for Japanese
wikipedia-mcp --language zh-hans # Example for Simplified Chinese
wikipedia-mcp --language zh-tw # Example for Traditional Chinese (Taiwan)
wikipedia-mcp --language sr-latn # Example for Serbian Latin script
# Specify country/locale (alternative to language codes)
wikipedia-mcp --country US # English (United States)
wikipedia-mcp --country China # Chinese Simplified
wikipedia-mcp --country Taiwan # Chinese Traditional (Taiwan)
wikipedia-mcp --country Japan # Japanese
wikipedia-mcp --country Germany # German
wikipedia-mcp --country france # French (case insensitive)
# List all supported countries
wikipedia-mcp --list-countries
# Optional: Specify port for SSE (default 8000)
wikipedia-mcp --transport sse --port 8080
# Optional: Enable caching
wikipedia-mcp --enable-cache
# Combine options
wikipedia-mcp --country Taiwan --enable-cache --transport sse --port 8080
Add the following to your Claude Desktop configuration file:
Option 1: Using command name (requires wikipedia-mcp to be in PATH)
{
"mcpServers": {
"wikipedia": {
"command": "wikipedia-mcp"
}
}
}
Option 2: Using full path (recommended if you get connection errors)
{
"mcpServers": {
"wikipedia": {
"command": "/full/path/to/wikipedia-mcp"
}
}
}
Option 3: With country/language specification
{
"mcpServers": {
"wikipedia-us": {
"command": "wikipedia-mcp",
"args": ["--country", "US"]
},
"wikipedia-taiwan": {
"command": "wikipedia-mcp",
"args": ["--country", "TW"]
},
"wikipedia-japan": {
"command": "wikipedia-mcp",
"args": ["--country", "Japan"]
}
}
}
To find the full path, run: `which wikipedia-mcp`
Configuration file locations:
- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%/Claude/claude_desktop_config.json`
- Linux: `~/.config/Claude/claude_desktop_config.json`
Note: If you encounter connection errors, see the Troubleshooting section for solutions.
The Wikipedia MCP server provides the following tools for LLMs to interact with Wikipedia:
Search Wikipedia for articles matching a query.
Parameters:
- `query` (string): The search term
- `limit` (integer, optional): Maximum number of results to return (default: 10)
Returns:
- A list of search results with titles, snippets, and metadata
Get the full content of a Wikipedia article.
Parameters:
- `title` (string): The title of the Wikipedia article
Returns:
- Article content including text, summary, sections, links, and categories
Get a concise summary of a Wikipedia article.
Parameters:
- `title` (string): The title of the Wikipedia article
Returns:
- A text summary of the article
Get the sections of a Wikipedia article.
Parameters:
- `title` (string): The title of the Wikipedia article
Returns:
- A structured list of article sections with their content
Get the links contained within a Wikipedia article.
Parameters:
- `title` (string): The title of the Wikipedia article
Returns:
- A list of links to other Wikipedia articles
Get topics related to a Wikipedia article based on links and categories.
Parameters:
- `title` (string): The title of the Wikipedia article
- `limit` (integer, optional): Maximum number of related topics (default: 10)
Returns:
- A list of related topics with relevance information
Get a summary of a Wikipedia article tailored to a specific query.
Parameters:
- `title` (string): The title of the Wikipedia article
- `query` (string): The query to focus the summary on
- `max_length` (integer, optional): Maximum length of the summary (default: 250)
Returns:
- A dictionary containing the title, query, and the focused summary
Get a summary of a specific section of a Wikipedia article.
Parameters:
- `title` (string): The title of the Wikipedia article
- `section_title` (string): The title of the section to summarize
- `max_length` (integer, optional): Maximum length of the summary (default: 150)
Returns:
- A dictionary containing the title, section title, and the section summary
Extract key facts from a Wikipedia article, optionally focused on a specific topic within the article.
Parameters:
- `title` (string): The title of the Wikipedia article
- `topic_within_article` (string, optional): A specific topic within the article to focus fact extraction on
- `count` (integer, optional): Number of key facts to extract (default: 5)
Returns:
- A dictionary containing the title, topic, and a list of extracted facts
Get the complete revision history for a Wikipedia page.
Parameters:
- `title` (string): The title of the Wikipedia article
- `limit` (integer, optional): Maximum number of revisions to return (default: 50)
Returns:
- A dictionary containing revision history with user information, timestamps, comments, and size changes
Get all contributions made by a specific Wikipedia user.
Parameters:
- `username` (string): The username to get contributions for
- `limit` (integer, optional): Maximum number of contributions to return (default: 50)
Returns:
- A dictionary containing user contributions across different pages
Get detailed information and statistics about a Wikipedia user.
Parameters:
- `username` (string): The username to get information for
Returns:
- A dictionary containing user registration date, edit count, user groups, and block status
Compare two specific revisions of a Wikipedia page.
Parameters:
- `from_rev` (integer): The older revision ID
- `to_rev` (integer): The newer revision ID
Returns:
- A dictionary containing the comparison with differences between versions
Find who originally created a Wikipedia page.
Parameters:
- `title` (string): The title of the Wikipedia article
Returns:
- A dictionary containing information about the page creator and first revision
Get detailed information about a specific revision.
Parameters:
- `revid` (integer): The revision ID to get details for
Returns:
- A dictionary containing full content and metadata for the revision
Get the content and metadata of a Wikipedia talk page.
Parameters:
- `title` (string): The title of the Wikipedia article (the talk page is derived automatically)
Returns:
- A dictionary containing talk page content, metadata, and discussion structure
Analyze edit activity patterns and detect spikes using statistical methods.
Parameters:
- `title` (string): The title of the Wikipedia article
- `start_datetime` (string, optional): Start datetime in ISO format (e.g., "2024-01-15T14:30:00Z")
- `end_datetime` (string, optional): End datetime in ISO format; defaults to now if not specified
- `window_size` (string, optional): Time window for grouping ("day", "week", "month"); default: "day"
- `z_threshold` (float, optional): Z-score threshold for spike detection (default: 2.0, flagging roughly the top 2.5%)
Returns:
- A dictionary containing activity analysis, statistical measures, and detected spikes
Get the most significant revisions based on a weighted scoring algorithm.
Parameters:
- `title` (string): The title of the Wikipedia article
- `start_datetime` (string, optional): Start datetime in ISO format
- `end_datetime` (string, optional): End datetime in ISO format
- `limit` (integer, optional): Maximum number of significant revisions to return (default: 50)
- `min_significance` (float, optional): Minimum significance score (0.0-1.0) to include (default: 0.5)
Returns:
- A dictionary containing ranked significant revisions with detailed scoring factors
The Wikipedia MCP server supports intuitive country and region codes as an alternative to language codes. This makes it easier to access region-specific Wikipedia content without needing to know language codes.
Use --list-countries to see all supported countries:
wikipedia-mcp --list-countries
This will display countries organized by language, for example:
Supported Country/Locale Codes:
========================================
en: US, USA, United States, UK, GB, Canada, Australia, ...
zh-hans: CN, China
zh-tw: TW, Taiwan
ja: JP, Japan
de: DE, Germany
fr: FR, France
es: ES, Spain, MX, Mexico, AR, Argentina, ...
pt: PT, Portugal, BR, Brazil
ru: RU, Russia
ar: SA, Saudi Arabia, AE, UAE, EG, Egypt, ...
# Major countries by code
wikipedia-mcp --country US # United States (English)
wikipedia-mcp --country CN # China (Simplified Chinese)
wikipedia-mcp --country TW # Taiwan (Traditional Chinese)
wikipedia-mcp --country JP # Japan (Japanese)
wikipedia-mcp --country DE # Germany (German)
wikipedia-mcp --country FR # France (French)
wikipedia-mcp --country BR # Brazil (Portuguese)
wikipedia-mcp --country RU # Russia (Russian)
# Countries by full name (case insensitive)
wikipedia-mcp --country "United States"
wikipedia-mcp --country China
wikipedia-mcp --country Taiwan
wikipedia-mcp --country Japan
wikipedia-mcp --country Germany
wikipedia-mcp --country france # Case insensitive
# Regional variants
wikipedia-mcp --country HK # Hong Kong (Traditional Chinese)
wikipedia-mcp --country SG # Singapore (Simplified Chinese)
wikipedia-mcp --country "Saudi Arabia" # Arabic
wikipedia-mcp --country Mexico # Spanish
The server automatically maps country codes to appropriate Wikipedia language editions:
- English-speaking: US, UK, Canada, Australia, New Zealand, Ireland, South Africa → `en`
- Chinese regions:
  - CN, China → `zh-hans` (Simplified Chinese)
  - TW, Taiwan → `zh-tw` (Traditional Chinese - Taiwan)
  - HK, Hong Kong → `zh-hk` (Traditional Chinese - Hong Kong)
  - SG, Singapore → `zh-sg` (Simplified Chinese - Singapore)
- Major languages: JP → `ja`, DE → `de`, FR → `fr`, ES → `es`, IT → `it`, RU → `ru`, etc.
- Regional variants: Supports 140+ countries and regions
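This country-to-language mapping can be sketched as a case-insensitive lookup table. The snippet below is an illustrative subset under assumed names (`COUNTRY_TO_LANGUAGE`, `resolve_country`), not the server's actual internals — the real table covers 140+ countries:

```python
# Illustrative subset of the country-to-language mapping.
# Lookups are case-insensitive, matching the CLI's behavior.
COUNTRY_TO_LANGUAGE = {
    "us": "en", "usa": "en", "united states": "en", "uk": "en", "gb": "en",
    "cn": "zh-hans", "china": "zh-hans",
    "tw": "zh-tw", "taiwan": "zh-tw",
    "hk": "zh-hk", "hong kong": "zh-hk",
    "jp": "ja", "japan": "ja",
    "de": "de", "germany": "de",
    "fr": "fr", "france": "fr",
}

def resolve_country(country: str) -> str:
    """Map a country code or name to a Wikipedia language code."""
    try:
        return COUNTRY_TO_LANGUAGE[country.strip().lower()]
    except KeyError:
        raise ValueError(f"Unsupported country/locale: '{country}'")
```

Unsupported inputs raise an error, mirroring the CLI's error message shown below.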
If you specify an unsupported country, you'll get a helpful error message:
$ wikipedia-mcp --country INVALID
Error: Unsupported country/locale: 'INVALID'.
Supported country codes include: US, USA, UK, GB, CA, AU, NZ, IE, ZA, CN.
Use --language parameter for direct language codes instead.
Use --list-countries to see supported country codes.
The Wikipedia MCP server supports language variants for languages that have multiple writing systems or regional variations. This feature is particularly useful for Chinese, Serbian, Kurdish, and other languages with multiple scripts or regional differences.
- `zh-hans` - Simplified Chinese
- `zh-hant` - Traditional Chinese
- `zh-tw` - Traditional Chinese (Taiwan)
- `zh-hk` - Traditional Chinese (Hong Kong)
- `zh-mo` - Traditional Chinese (Macau)
- `zh-cn` - Simplified Chinese (China)
- `zh-sg` - Simplified Chinese (Singapore)
- `zh-my` - Simplified Chinese (Malaysia)
- `sr-latn` - Serbian Latin script
- `sr-cyrl` - Serbian Cyrillic script
- `ku-latn` - Kurdish Latin script
- `ku-arab` - Kurdish Arabic script
- `no` - Norwegian (automatically mapped to Bokmål)
# Access Simplified Chinese Wikipedia
wikipedia-mcp --language zh-hans
# Access Traditional Chinese Wikipedia (Taiwan)
wikipedia-mcp --language zh-tw
# Access Serbian Wikipedia in Latin script
wikipedia-mcp --language sr-latn
# Access Serbian Wikipedia in Cyrillic script
wikipedia-mcp --language sr-cyrl
When you specify a language variant like `zh-hans`, the server:
- Maps the variant to the base Wikipedia language (e.g., `zh` for Chinese variants)
- Uses the base language for API connections to the Wikipedia servers
- Includes the variant parameter in API requests to get content in the specific variant
- Returns content formatted according to the specified variant's conventions
This approach ensures optimal compatibility with Wikipedia's API while providing access to variant-specific content and formatting.
Once the server is running and configured with Claude Desktop, you can use prompts like:
- "Tell me about quantum computing using the Wikipedia information."
- "Summarize the history of artificial intelligence based on Wikipedia."
- "What does Wikipedia say about climate change?"
- "Find Wikipedia articles related to machine learning."
- "Get me the introduction section of the article on neural networks from Wikipedia."
- "Search Wikipedia China for information about the Great Wall." (uses Chinese Wikipedia)
- "Tell me about Tokyo from Japanese Wikipedia sources."
- "What does German Wikipedia say about the Berlin Wall?"
- "Find information about the Eiffel Tower from French Wikipedia."
- "Get Taiwan Wikipedia's article about Taiwanese cuisine."
- "Search Traditional Chinese Wikipedia for information about Taiwan."
- "Find Simplified Chinese articles about modern China."
- "Get information from Serbian Latin Wikipedia about Belgrade."
The server also provides MCP resources (similar to HTTP endpoints but for MCP):
- `search/{query}`: Search Wikipedia for articles matching the query
- `article/{title}`: Get the full content of a Wikipedia article
- `summary/{title}`: Get a summary of a Wikipedia article
- `sections/{title}`: Get the sections of a Wikipedia article
- `links/{title}`: Get the links in a Wikipedia article
- `summary/{title}/query/{query}/length/{max_length}`: Get a query-focused summary of an article
- `summary/{title}/section/{section_title}/length/{max_length}`: Get a summary of a specific article section
- `facts/{title}/topic/{topic_within_article}/count/{count}`: Extract key facts from an article
# Clone the repository
git clone https://github.com/rudra-ravi/wikipedia-mcp.git
cd wikipedia-mcp
# Create a virtual environment
python3 -m venv venv
source venv/bin/activate
# Install the package in development mode
pip install -e .
# Install development and test dependencies
pip install -r requirements-dev.txt
# Run the server
wikipedia-mcp
- `wikipedia_mcp/`: Main package
  - `__main__.py`: Entry point for the package
  - `server.py`: MCP server implementation
  - `wikipedia_client.py`: Wikipedia API client
  - `api/`: API implementation
  - `core/`: Core functionality
  - `utils/`: Utility functions
- `tests/`: Test suite
  - `test_basic.py`: Basic package tests
  - `test_cli.py`: Command-line interface tests
  - `test_server_tools.py`: Comprehensive server and tool tests
To understand how a Wikipedia article evolved over time:
1. Get revision history: Use `get_page_revisions` to see recent edits
   - Shows who made changes, when, and what they changed
   - Includes edit summaries and size changes
2. Find the original creator: Use `get_page_creator` to discover who started the article
   - Returns the first revision and author information
3. Compare versions: Use `compare_revisions` with revision IDs to see specific changes
   - Shows detailed differences between any two versions
To research a Wikipedia editor's work:
1. Get user statistics: Use `get_user_info` to see their profile
   - Shows edit count, registration date, and user groups
   - Indicates if the user is blocked
2. Review contributions: Use `get_user_contributions` to see their edits
   - Lists all pages they've edited, with timestamps
   - Shows the size and nature of their changes
3. Examine specific edits: Use `get_revision_details` with revision IDs from their contributions
   - Retrieves the full content of their edits
Research a controversial topic's edit history:
1. search_wikipedia("controversial topic")
2. get_page_revisions("Article Title", limit=100)
3. compare_revisions(older_revid, newer_revid) for major changes
4. get_user_info("username") for frequent editors
Verify article reliability:
1. get_page_creator("Article Title")
2. get_user_info("creator_username")
3. get_page_revisions("Article Title", limit=20)
4. Check edit patterns and contributor diversity
Detect and investigate controversies/edit wars:
1. analyze_edit_activity("Controversial Topic", z_threshold=2.0)
2. For detected spikes, use get_significant_revisions() to find major edits
3. get_talk_page("Controversial Topic") to see discussions
4. compare_revisions() for specific conflicts
5. get_user_info() for key editors involved
The new analysis tools enable sophisticated detection of Wikipedia controversies:
Spike Detection Algorithm:
- Uses statistical z-scores (standard deviations) to identify unusual activity
- Detects both edit count spikes and editor count spikes
- Configurable sensitivity (z_threshold parameter)
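The z-score approach can be illustrated with a minimal sketch (an assumption-laden simplification, not the server's actual implementation):

```python
from statistics import mean, stdev

def detect_spikes(edit_counts: list[int], z_threshold: float = 2.0) -> list[int]:
    """Return indices of time windows whose edit count is an outlier,
    i.e. more than z_threshold standard deviations above the mean."""
    if len(edit_counts) < 2:
        return []  # too little data to compute a spread
    mu = mean(edit_counts)
    sigma = stdev(edit_counts)
    if sigma == 0:
        return []  # perfectly uniform activity has no spikes
    return [i for i, count in enumerate(edit_counts)
            if (count - mu) / sigma > z_threshold]
```

With `z_threshold=2.0`, only windows well above normal activity (roughly the top 2.5% under a normal distribution) are flagged; raising the threshold makes detection stricter.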
Significance Scoring:
- 30% - Normalized byte changes (size of edit relative to article size)
- 25% - Revert proximity (how quickly edit was reversed)
- 20% - Editor experience (new vs. established editors)
- 15% - Discussion references (mentions of talk pages, disputes)
- 10% - Edit war patterns (rapid back-and-forth edits)
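The weighted combination above can be sketched as a simple linear score. The factor names below are hypothetical labels for illustration; each factor is assumed to be pre-normalized to [0, 1]:

```python
# Weights mirror the percentages described above.
WEIGHTS = {
    "byte_change":       0.30,  # normalized size of the edit
    "revert_proximity":  0.25,  # how quickly the edit was reversed
    "editor_experience": 0.20,  # new vs. established editors
    "discussion_refs":   0.15,  # mentions of talk pages / disputes
    "edit_war":          0.10,  # rapid back-and-forth edit patterns
}

def significance_score(factors: dict[str, float]) -> float:
    """Combine normalized factors (each in [0, 1]) into a score in [0, 1].
    Missing factors contribute zero."""
    return sum(WEIGHTS[name] * factors.get(name, 0.0) for name in WEIGHTS)
```

A revision scoring 1.0 on every factor yields a significance of 1.0; revisions below the `min_significance` cutoff (default 0.5) would be filtered out.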
Use Cases:
- Identify controversial topics and periods
- Analyze edit wars and vandalism patterns
- Research significant historical events in Wikipedia
- Track how articles evolve during breaking news
- Study editor behavior and conflict resolution
The project includes a comprehensive test suite to ensure reliability and functionality.
The test suite is organized in the tests/ directory with the following test files:
- `test_basic.py`: Basic package functionality tests
- `test_cli.py`: Command-line interface and transport tests
- `test_server_tools.py`: Comprehensive tests for all MCP tools and Wikipedia client functionality
# Install test dependencies
pip install -r requirements-dev.txt
# Run all tests
python -m pytest tests/ -v
# Run tests with coverage
python -m pytest tests/ --cov=wikipedia_mcp --cov-report=html
# Run only unit tests (excludes integration tests)
python -m pytest tests/ -v -m "not integration"
# Run only integration tests (requires internet connection)
python -m pytest tests/ -v -m "integration"
# Run specific test file
python -m pytest tests/test_server_tools.py -v
- WikipediaClient Tests: Mock-based tests for all client methods
- Search functionality
- Article retrieval
- Summary extraction
- Section parsing
- Link extraction
- Related topics discovery
- Server Tests: MCP server creation and tool registration
- CLI Tests: Command-line interface functionality
- Real API Tests: Tests that make actual calls to Wikipedia API
- End-to-End Tests: Complete workflow testing
The project uses pytest.ini for test configuration:
[pytest]
markers =
integration: marks tests as integration tests (may require network access)
slow: marks tests as slow running
testpaths = tests
addopts = -v --tb=short
All tests are designed to:
- Run reliably in CI/CD environments
- Handle network failures gracefully
- Provide clear error messages
- Cover edge cases and error conditions
When contributing new features:
- Add unit tests for new functionality
- Include both success and failure scenarios
- Mock external dependencies (Wikipedia API)
- Add integration tests for end-to-end validation
- Follow existing test patterns and naming conventions
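As a starting point, a mock-based unit test might look like the sketch below. The `search` method name and its return shape are assumptions for illustration; adapt them to the actual `WikipediaClient` API:

```python
from unittest.mock import MagicMock

def test_search_returns_titles():
    # Stand-in for WikipediaClient; the 'search' signature is assumed.
    client = MagicMock()
    client.search.return_value = [{"title": "Python (programming language)"}]

    results = client.search("python", limit=1)

    # Verify both the call and the shape of the mocked response.
    client.search.assert_called_once_with("python", limit=1)
    assert results[0]["title"].startswith("Python")
```

Mocking the client this way keeps unit tests fast and offline; real Wikipedia calls belong in tests marked `integration`.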
Problem: Claude Desktop shows errors like spawn wikipedia-mcp ENOENT or cannot find the command.
Cause: This occurs when the wikipedia-mcp command is installed in a user-specific location (like ~/.local/bin/) that's not in Claude Desktop's PATH.
Solutions:
1. Use the full path to the command (recommended):
   {
     "mcpServers": {
       "wikipedia": {
         "command": "/home/username/.local/bin/wikipedia-mcp"
       }
     }
   }
   To find your exact path, run: `which wikipedia-mcp`
2. Install with pipx for global access:
   pipx install wikipedia-mcp
   Then use the standard configuration:
   {
     "mcpServers": {
       "wikipedia": {
         "command": "wikipedia-mcp"
       }
     }
   }
3. Create a symlink to a global location:
   sudo ln -s ~/.local/bin/wikipedia-mcp /usr/local/bin/wikipedia-mcp
- Article Not Found: Check the exact spelling of article titles
- Rate Limiting: Wikipedia API has rate limits; consider adding delays between requests
- Large Articles: Some Wikipedia articles are very large and may exceed token limits
The Model Context Protocol (MCP) is not a traditional HTTP API but a specialized protocol for communication between LLMs and external tools. Key characteristics:
- Uses stdio (standard input/output) or SSE (Server-Sent Events) for communication
- Designed specifically for AI model interaction
- Provides standardized formats for tools, resources, and prompts
- Integrates directly with Claude and other MCP-compatible AI systems
Claude Desktop acts as the MCP client, while this server provides the tools and resources that Claude can use to access Wikipedia information.
Contributions are welcome! Please see CONTRIBUTING.md for guidelines.
This project is licensed under the MIT License - see the LICENSE file for details.
- π Portfolio: ravikumar-dev.me
- π Blog: Medium
- πΌ LinkedIn: in/ravi-kumar-e
- π¦ Twitter: @Ravikumar_d3v