CyberFeeds is a lightweight Python script that aggregates the latest cybersecurity news from top industry RSS feeds. It fetches, filters, and saves links to recent articles, so you stay up to date with new threats, vulnerabilities, and security trends without duplicate links.
- Multi-Source Aggregation: Fetches news from trusted sources such as The Hacker News, BleepingComputer, and SecurityWeek.
- Smart Filtering: Only retrieves articles published in the last 7 days (see the sketch after this list).
- Deduplication: Checks against previously saved files to prevent duplicate links.
- Automatic Cleanup: Automatically manages storage by keeping only the most recent weekly files (configurable).
- Simple Output: Generates a clean, plain text file containing a list of URLs for easy reading or integration with other tools.
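
The fetch-and-filter step can be pictured with a short sketch. This is not the script's exact code: the function name `fetch_recent_links` and its parameters are illustrative, and it assumes each feed entry carries a `published_parsed` timestamp (entries without one are skipped).

```python
# Illustrative sketch of the fetch-and-filter step (not the script's actual code).
import time
from datetime import datetime, timedelta

import feedparser

FEEDS = [
    "https://feeds.feedburner.com/TheHackersNews",
    "https://www.bleepingcomputer.com/feed/",
    "https://www.securityweek.com/feed",
]

def fetch_recent_links(feeds, days=7, max_items=10):
    """Collect links to entries published within the last `days` days."""
    cutoff = datetime.now() - timedelta(days=days)
    links = []
    for url in feeds:
        parsed = feedparser.parse(url)
        for entry in parsed.entries[:max_items]:
            # feedparser exposes the publish date as a time.struct_time when the feed provides one.
            published = entry.get("published_parsed")
            if published and datetime.fromtimestamp(time.mktime(published)) >= cutoff:
                links.append(entry.link)
    return links
```

The `max_items` cap mirrors the `MAX_ITEMS_PER_FEED` setting described in the configuration below.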
- Python 3.x
- feedparser library

- Clone the repository:

  ```bash
  git clone https://github.com/yourusername/cyberfeeds.git
  cd cyberfeeds
  ```

- Install dependencies:

  ```bash
  pip install feedparser
  ```
Run the script directly using Python:

```bash
python cyberfeeds.py
```

The script will:
- Fetch the latest RSS feeds.
- Filter for articles from the last week.
- Check for duplicates against existing history.
- Save new links to a file named cybernews_links_week_YYYY-MM-DD.txt (see the dedup-and-save sketch after this list).
- Clean up old files if the limit is exceeded.
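
The deduplication and save steps can be sketched roughly as follows. The helper names `load_seen_links` and `save_new_links` are illustrative, not the script's actual functions, and the sketch assumes the weekly file is stamped with today's date.

```python
# Illustrative sketch of deduplication and the weekly save (helper names are assumptions).
import glob
import os
from datetime import date

OUTPUT_DIR = "/mnt/e/daily_news"  # Matches the default config value below

def load_seen_links(output_dir=OUTPUT_DIR):
    """Gather every URL already saved in earlier weekly files."""
    seen = set()
    for path in glob.glob(os.path.join(output_dir, "cybernews_links_week_*.txt")):
        with open(path, encoding="utf-8") as f:
            seen.update(line.strip() for line in f if line.strip())
    return seen

def save_new_links(links, output_dir=OUTPUT_DIR):
    """Append links that have not been seen before to this week's file."""
    seen = load_seen_links(output_dir)
    new_links = [url for url in links if url not in seen]
    filename = f"cybernews_links_week_{date.today().isoformat()}.txt"
    with open(os.path.join(output_dir, filename), "a", encoding="utf-8") as f:
        f.writelines(url + "\n" for url in new_links)
    return new_links
```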
You can customize the script by modifying the configuration variables at the top of cyberfeeds.py:
```python
# --- CONFIG ---
FEEDS = [
    "https://feeds.feedburner.com/TheHackersNews",
    "https://www.bleepingcomputer.com/feed/",
    "https://www.securityweek.com/feed"
]

OUTPUT_DIR = "/mnt/e/daily_news"  # Directory to save output files
MAX_ITEMS_PER_FEED = 10           # Maximum links to fetch per feed
KEEP_WEEKS = 1                    # Number of weekly files to keep before deleting old ones
```

The output is a simple text file (e.g., cybernews_links_week_2025-11-10.txt) containing one URL per line:
```text
https://thehackernews.com/2025/11/example-article.html
https://www.bleepingcomputer.com/news/security/example-story/
...
```
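
The automatic cleanup driven by `KEEP_WEEKS` can be sketched as follows. The function name `cleanup_old_files` is illustrative, and the sketch assumes output files follow the `cybernews_links_week_YYYY-MM-DD.txt` pattern shown above, which sorts chronologically by filename.

```python
# Illustrative sketch of the weekly cleanup (assumes the filename pattern above).
import glob
import os

OUTPUT_DIR = "/mnt/e/daily_news"
KEEP_WEEKS = 1

def cleanup_old_files(output_dir=OUTPUT_DIR, keep=KEEP_WEEKS):
    """Delete all but the `keep` most recent weekly files."""
    files = sorted(glob.glob(os.path.join(output_dir, "cybernews_links_week_*.txt")))
    # The YYYY-MM-DD suffix sorts lexicographically, so the oldest files come first.
    old_files = files[:-keep] if keep > 0 else files
    for path in old_files:
        os.remove(path)
```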
This project is open-source and available for personal and educational use.