diff --git a/.github/workflows/tests.yml b/.github/workflows/tests.yml new file mode 100644 index 0000000..543f0dd --- /dev/null +++ b/.github/workflows/tests.yml @@ -0,0 +1,36 @@ +name: Tests + +on: + push: + branches: ['**'] + pull_request: + branches: [main] + workflow_dispatch: + +jobs: + tests: + name: Run Tests + runs-on: ubuntu-latest + + steps: + - uses: actions/checkout@v4 + + - name: Setup Python + uses: actions/setup-python@v5 + with: + python-version: '3.12' + cache: 'pip' + + - name: Install dependencies + run: pip install -r requirements-dev.txt + + - name: Run tests with coverage + run: pytest --cov=. --cov-report=term-missing --cov-report=xml -v + + - name: Upload coverage report + if: always() + uses: actions/upload-artifact@v4 + with: + name: coverage-report + path: coverage.xml + retention-days: 7 diff --git a/CLAUDE.md b/CLAUDE.md new file mode 100644 index 0000000..ede5375 --- /dev/null +++ b/CLAUDE.md @@ -0,0 +1,126 @@ +# CLAUDE.md + +This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository. + +## Project Overview + +Sync2Cal Events API — a Python FastAPI application that converts data from various websites into calendar events. Serves JSON and ICS feeds compatible with Google Calendar, Apple Calendar, Outlook, etc. Backend for sync2cal.com. + +## Commands + +```bash +# Development +pip install -r requirements.txt # Install dependencies +uvicorn main:app --reload # Dev server (localhost:8000) +uvicorn main:app --host 0.0.0.0 # Production server + +# Documentation +# Interactive API docs at http://localhost:8000/docs (auto-generated by FastAPI) +``` + +## Architecture + +### Tech Stack +- **Python 3.11+** with **FastAPI** + **Uvicorn** +- **Requests** for HTTP calls, **BeautifulSoup4** + **lxml** for web scraping +- **gspread** + **google-auth** for Google Sheets integration +- **python-dotenv** for environment variable management + +### Key Directories +- `base/` — Core framework: `Event` dataclass, `CalendarBase`, `IntegrationBase`, `mount_integration_routes()` +- `integrations/` — Individual source integrations (12 total) +- `docs/` — API endpoint documentation +- `.cursor/rules/` — Cursor IDE rules (project conventions) + +### Plugin Architecture +Every integration follows the same pattern: +1. A `CalendarBase` subclass with `fetch_events(*args, **kwargs) -> List[Event]` +2. An `IntegrationBase` subclass registered in `main.py` +3. `mount_integration_routes()` auto-creates `GET //events` endpoint from `fetch_events` signature +4. 
The `ics` query param (default `true`) toggles ICS vs JSON response + +### Integrations +| ID | Source | Method | Credentials Required | +|----|--------|--------|---------------------| +| twitch | Twitch | API | Yes (TWITCH_CLIENT_ID/SECRET) | +| google-sheets | Google Sheets | API | Yes (service account JSON) | +| thetvdb | TheTVDB | API | Yes (API key + bearer token) | +| sportsdb | TheSportsDB | API | Yes (SPORTSDB_API_KEY) | +| daily-weather-forecast | OpenWeatherMap | API | Yes (OPENWEATHERMAP_API_KEY) | +| investing | Investing.com | Scraping | No | +| imdb | IMDb | Scraping | No | +| moviedb | TheMovieDB | Scraping | No | +| wwe | WWE | Scraping | No | +| shows | TVInsider | Scraping | No | +| releases | Releases.com | Scraping | No | + +### Data Model +```python +@dataclass +class Event: + uid: str # Unique identifier + title: str # Event name + start: datetime # Start datetime + end: datetime # End datetime + all_day: bool # All-day event flag + description: str # Event description + location: str # Event location + extra: Dict # Provider-specific metadata +``` + +### API Pattern +All endpoints follow: `GET //events?&ics=true|false` +- Integration ID uses hyphens (e.g., `google_sheets` becomes `/google-sheets/events`) +- `fetch_events` parameters become query params automatically +- `ics=true` (default): returns `text/plain` ICS content +- `ics=false`: returns JSON list of Event objects + +## Patterns & Conventions + +### Adding New Integrations +1. Create `integrations/.py` with `Calendar(CalendarBase)` and `Integration(IntegrationBase)` +2. Implement `fetch_events()` — all params must be JSON-serializable primitives with defaults +3. Register in `main.py`: create integration instance, mount routes via loop +4. Add any required env vars to `env.template` + +### HTTP Requests +- Always set explicit timeouts (10-20s) on `requests.get/post` +- Set `User-Agent` header when scraping websites +- Use `response.raise_for_status()` for error detection +- Wrap in try/except, raise `HTTPException` with appropriate status codes (400, 401, 429, 500, 502) + +### Scraping +- Use BeautifulSoup with `lxml` parser +- Skip individual items that fail to parse (don't crash the whole request) +- Construct deterministic UIDs (e.g., `tmdb-{title}-{date}`) + +### All-Day Events +- Set `all_day=True` and `end = start + timedelta(days=1)` + +### ICS Generation +- `utils.generate_ics()` handles RFC 5545 compliance: line folding, text escaping, VTIMEZONE +- `utils.make_slug()` for URL-friendly text conversion + +## Critical Files +- `main.py` — App setup, CORS config, integration registration loop +- `base/routes.py` — `mount_integration_routes()` — the glue that turns `fetch_events` into API endpoints +- `base/models.py` — `Event` dataclass +- `base/calendar.py` — `CalendarBase` abstract class +- `base/integration.py` — `IntegrationBase` abstract class +- `utils.py` — `generate_ics()` and `make_slug()` utilities +- `env.template` — Required environment variables reference + +## Environment Variables +See `env.template` for the full list. Key ones: +- `TWITCH_CLIENT_ID` / `TWITCH_CLIENT_SECRET` — Twitch API +- `GOOGLE_SHEETS_SERVICE_ACCOUNT_FILE` — Path to service account JSON +- `THE_TVDB_API_KEY` / `THE_TVDB_BEARER_TOKEN` — TheTVDB API +- `SPORTSDB_API_KEY` — TheSportsDB API +- `OPENWEATHERMAP_API_KEY` — Weather integration +- `CORS_ORIGINS` — Comma-separated allowed origins (defaults to sync2cal.com) + +## Gotchas +- **No deployment config**: No Dockerfile, Procfile, or railway.json exists yet. 
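For reference, a minimal sketch of the plugin pattern described under "Adding New Integrations" above. The `Example*` names and the `limit` parameter are illustrative placeholders, not code in this repo; the structure mirrors the existing integrations (e.g., `integrations/ufc.py`).

```python
# integrations/example.py (hypothetical sketch, not part of this PR)
from typing import List
from datetime import datetime, timedelta

from fastapi import HTTPException

from base import CalendarBase, Event, IntegrationBase


class ExampleCalendar(CalendarBase):
    def fetch_events(self, limit: int = 5) -> List[Event]:
        """All params must be JSON-serializable primitives with defaults."""
        try:
            today = datetime.now().replace(hour=0, minute=0, second=0, microsecond=0)
            events: List[Event] = []
            for i in range(limit):
                start = today + timedelta(days=i)
                events.append(
                    Event(
                        uid=f"example-{i}",           # deterministic UID
                        title=f"Example event {i}",
                        start=start,
                        end=start + timedelta(days=1),  # all-day: end = start + 1 day
                        all_day=True,
                        description="Placeholder event",
                        location="",
                    )
                )
            self.events = events
            return events
        except HTTPException:
            raise
        except Exception as e:
            raise HTTPException(status_code=500, detail=str(e)) from e


class ExampleIntegration(IntegrationBase):
    def fetch_calendars(self, *args, **kwargs):
        return None


# Registered in main.py alongside the other integrations, e.g.:
# ExampleIntegration(id="example", name="Example", description="...",
#                    base_url="https://example.com",
#                    calendar_class=ExampleCalendar, multi_calendar=False)
```

With this in place, `mount_integration_routes()` would expose it as `GET /example/events?limit=5&ics=true`, per the API pattern above.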
+- **CORS with credentials**: `allow_credentials=True` means `allow_origins=["*"]` is not allowed. Must specify exact origins via `CORS_ORIGINS` env var. +- **Scraping fragility**: IMDb, TMDB, and other scraped sources may break if the site changes its HTML structure. +- **`multi_calendar`**: Only Twitch uses `multi_calendar=True`. The `master_csv()` method on IntegrationBase is a TODO stub. diff --git a/docs/plans/2026-02-01-scraper-consolidation-design.md b/docs/plans/2026-02-01-scraper-consolidation-design.md new file mode 100644 index 0000000..8a49983 --- /dev/null +++ b/docs/plans/2026-02-01-scraper-consolidation-design.md @@ -0,0 +1,145 @@ +# Scraper Consolidation Design + +**Goal:** Retire `sync2cal-custom-scraper` and consolidate all ICS feed generation into `S2C-events-api`, eliminating a redundant Railway service. + +**Date:** 2026-02-01 + +--- + +## Background + +Two services produce ICS calendar feeds from external sources: + +| Service | Repo | Domain | Status | +|---------|------|--------|--------| +| custom-scraper | sync2cal-custom-scraper | `sync2cal-scraper.up.railway.app` | Private, no tests, no CI | +| events-api | S2C-events-api | `api.sync2cal.com` | Public, 215 tests, 92% coverage, CI | + +All 10 shared integrations are **identical code** (copy-pasted). Events-api additionally has weather integration, better architecture (loop-based routing, CORS), and a contributor guide. + +**7,959 categories** in the database have SOURCE URLs pointing to custom-scraper. Zero point to events-api. + +## Problem: URL Patterns Don't Match + +A simple domain swap won't work. The two services use different URL structures: + +| Integration | Count | custom-scraper URL | events-api URL | +|---|---|---|---| +| TheTVDB | 7,849 | `/thetvdb/series/{id}/episodes.ics` | `/thetvdb/events?series_id={id}` | +| TV Shows | 41 | `/tv/platform/{slug}.ics` or `/tv/genre/{slug}.ics` | `/shows/events?mode=platform&slug={slug}` or `?mode=genre&slug={slug}` | +| SportsDB | 20 | `/sportsdb/league/{id}.ics` or `/sportsdb/team/{id}.ics` | `/sportsdb/events?mode=league&id={id}` or `?mode=team&id={id}` | +| Google Sheets | 15 | `/sheets/events.ics?sheet_url=...` | `/google-sheets/events?sheet_url=...` | +| Investing | 10 | `/investing/earnings.ics` or `/investing/ipo.ics` | `/investing/events?kind=earnings` or `?kind=ipo` | +| Yahoo Finance | 8 | `/yahoo/generate_earnings_ics?k=100&ticker=NVDA` | **Does not exist** (broken anyway — expired cookies) | +| Releases | 7 | `/releases/generate_game_ics` | `/releases/events?kind=games` | +| Twitch | 5 | `/twitch/{name}/schedule.ics` | `/twitch/events?streamer_name={name}` | +| IMDb | 4 | `/imdb/movies.ics?genre=...&actor=...&country=...` | `/imdb/events?genre=...&actor=...&country=...` | + +## Approach: Database Migration Script + +Write a Python migration script that: + +1. Connects to the production PostgreSQL database +2. Reads all categories with SOURCE URLs containing `sync2cal-scraper.up.railway.app` +3. Rewrites each URL to the equivalent `api.sync2cal.com` endpoint +4. 
Updates the database in a single transaction (atomic — all or nothing) + +### URL Rewrite Rules + +```python +REWRITE_RULES = [ + # TheTVDB: /thetvdb/series/{id}/episodes.ics -> /thetvdb/events?series_id={id} + (r'sync2cal-scraper\.up\.railway\.app/thetvdb/series/(\d+)/episodes\.ics', + r'api.sync2cal.com/thetvdb/events?series_id=\1'), + + # TV Shows platform: /tv/platform/{slug}.ics -> /shows/events?mode=platform&slug={slug} + (r'sync2cal-scraper\.up\.railway\.app/tv/platform/([^/.]+)\.ics', + r'api.sync2cal.com/shows/events?mode=platform&slug=\1'), + + # TV Shows genre: /tv/genre/{slug}.ics -> /shows/events?mode=genre&slug={slug} + (r'sync2cal-scraper\.up\.railway\.app/tv/genre/([^/.]+)\.ics', + r'api.sync2cal.com/shows/events?mode=genre&slug=\1'), + + # SportsDB league: /sportsdb/league/{id}.ics -> /sportsdb/events?mode=league&id={id} + (r'sync2cal-scraper\.up\.railway\.app/sportsdb/league/(\d+)\.ics', + r'api.sync2cal.com/sportsdb/events?mode=league&id=\1'), + + # SportsDB team: /sportsdb/team/{id}.ics -> /sportsdb/events?mode=team&id={id} + (r'sync2cal-scraper\.up\.railway\.app/sportsdb/team/(\d+)\.ics', + r'api.sync2cal.com/sportsdb/events?mode=team&id=\1'), + + # Google Sheets: /sheets/events.ics?sheet_url=... -> /google-sheets/events?sheet_url=... + (r'sync2cal-scraper\.up\.railway\.app/sheets/events\.ics\?', + r'api.sync2cal.com/google-sheets/events?'), + + # Investing: /investing/{kind}.ics -> /investing/events?kind={kind} + (r'sync2cal-scraper\.up\.railway\.app/investing/earnings\.ics', + r'api.sync2cal.com/investing/events?kind=earnings'), + (r'sync2cal-scraper\.up\.railway\.app/investing/ipo\.ics', + r'api.sync2cal.com/investing/events?kind=ipo'), + + # Releases: /releases/generate_game_ics -> /releases/events?kind=games + (r'sync2cal-scraper\.up\.railway\.app/releases/generate_game_ics', + r'api.sync2cal.com/releases/events?kind=games'), + + # Twitch: /twitch/{name}/schedule.ics -> /twitch/events?streamer_name={name} + (r'sync2cal-scraper\.up\.railway\.app/twitch/([^/]+)/schedule\.ics', + r'api.sync2cal.com/twitch/events?streamer_name=\1'), + + # IMDb: /imdb/movies.ics?... -> /imdb/events?... + (r'sync2cal-scraper\.up\.railway\.app/imdb/movies\.ics\?', + r'api.sync2cal.com/imdb/events?'), +] +``` + +### Yahoo Finance (8 categories) + +The Yahoo Finance integration is broken (hardcoded expired cookies). Options: +1. **Remove the SOURCE** from these 8 categories so the scheduled job skips them +2. **Replace with Investing.com earnings** if that integration supports per-ticker queries +3. **Leave broken** — the scheduled job already handles download failures gracefully + +Recommendation: Remove the SOURCE field. These can be re-enabled if/when a working earnings integration is built. + +## Migration Steps + +### Pre-migration (verify) + +1. Confirm events-api endpoints work for each integration type by testing one URL per integration against `api.sync2cal.com` +2. Compare ICS output between old and new endpoints to verify identical calendar data + +### Execute migration + +3. Run the migration script against production database (during the 14-hour gap between scheduled job runs) +4. Verify row count matches: 7,959 rows updated + +### Post-migration (validate) + +5. Wait for next scheduled job run +6. Check Discord report for success/failure counts — should match pre-migration baseline +7. Spot-check a few categories (TheTVDB, SportsDB, Sheets) to confirm events populated + +### Retire custom-scraper + +8. 
Keep custom-scraper running for 1 week as safety net (in case rollback needed) +9. After 1 week with no issues, remove custom-scraper from Railway +10. Archive the sync2cal-custom-scraper repo on GitHub + +## Rollback Plan + +If migration fails or scheduled job reports increased failures: + +```sql +-- Reverse the migration (swap api.sync2cal.com back to sync2cal-scraper.up.railway.app) +-- Keep the reverse rewrite rules in the migration script +``` + +The migration script should log all changes (old URL -> new URL) to enable reversal. + +## References to Update + +After migration, update these files that reference `sync2cal-scraper.up.railway.app`: +- `sync2cal-ics-version/CLAUDE.md` — Railway architecture table +- `new-baklava/app/admin/scraper/page.tsx` — embedded iframe to scraper docs (change to events-api docs) +- `S2C-events-api/BACKEND_STAGING_SETUP_PLAN.md` — production URL reference +- `S2C-frontend/server/meta/categoryMap.json` — regenerated automatically by prebuild script diff --git a/integrations/google_sheets.py b/integrations/google_sheets.py index 82ee033..bfb2449 100644 --- a/integrations/google_sheets.py +++ b/integrations/google_sheets.py @@ -1,4 +1,5 @@ from fastapi import HTTPException +import json import os from base import CalendarBase, Event, IntegrationBase from typing import List @@ -33,8 +34,13 @@ def fetch_events( """ try: try: - sa_path = os.getenv("GOOGLE_SHEETS_SERVICE_ACCOUNT_FILE", "service_account.json") - gc = gspread.service_account(filename=sa_path) + sa_json = os.getenv("GOOGLE_SHEETS_SERVICE_ACCOUNT_JSON") + if sa_json: + creds = json.loads(sa_json) + gc = gspread.service_account_from_dict(creds) + else: + sa_path = os.getenv("GOOGLE_SHEETS_SERVICE_ACCOUNT_FILE", "service_account.json") + gc = gspread.service_account(filename=sa_path) except Exception as auth_error: raise HTTPException( status_code=500, diff --git a/integrations/investing.py b/integrations/investing.py index e380ec0..559de31 100644 --- a/integrations/investing.py +++ b/integrations/investing.py @@ -1,153 +1,63 @@ from typing import List, Optional, Union from datetime import datetime, timedelta +import os from fastapi import HTTPException, Query from base import CalendarBase, Event, IntegrationBase -# Inlined helpers and constants (moved from routers/investing.py) import requests from bs4 import BeautifulSoup from html import unescape -# Earnings constants -EARNINGS_URL = "https://www.investing.com/earnings-calendar/Service/getCalendarFilteredData" -EARNINGS_HEADERS = { - "User-Agent": "Mozilla/5.0", - "X-Requested-With": "XMLHttpRequest", - "Content-Type": "application/x-www-form-urlencoded", - "Referer": "https://www.investing.com/earnings-calendar/", -} +# FMP (Financial Modeling Prep) API for earnings +FMP_EARNINGS_URL = "https://financialmodelingprep.com/stable/earnings-calendar" -# IPO constants +# IPO constants (still using investing.com — FMP IPO requires paid plan) IPO_URL = "https://www.investing.com/ipo-calendar/Service/getCalendarFilteredData" IPO_HEADERS = { - "User-Agent": "Mozilla/5.0", + "User-Agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/131.0.0.0 Safari/537.36", "X-Requested-With": "XMLHttpRequest", "Content-Type": "application/x-www-form-urlencoded", "Referer": "https://www.investing.com/ipo-calendar/", } -# Maps +# Maps (used by IPO country filter) COUNTRY_MAP = { - "argentina": 29, - "australia": 25, - "austria": 54, - "bahrain": 145, - "bangladesh": 47, - "belgium": 34, - "bosnia-herzegovina": 174, - 
"botswana": 163, - "brazil": 32, - "bulgaria": 70, - "canada": 6, - "chile": 27, - "china": 37, - "colombia": 122, - "costa rica": 15, - "cote d'ivoire": 78, - "croatia": 113, - "cyprus": 107, - "czech republic": 55, - "denmark": 24, - "egypt": 59, - "estonia": 89, - "euro zone": 72, - "finland": 71, - "france": 22, - "germany": 17, - "greece": 51, - "hong kong": 39, - "hungary": 93, - "iceland": 106, - "india": 14, - "indonesia": 48, - "iraq": 66, - "ireland": 33, - "israel": 23, - "italy": 10, - "jamaica": 119, - "japan": 35, - "jordan": 92, - "kazakhstan": 102, - "kenya": 57, - "kuwait": 94, - "latvia": 97, - "lebanon": 68, - "lithuania": 96, - "luxembourg": 103, - "malawi": 111, - "malaysia": 42, - "malta": 109, - "mauritius": 188, - "mexico": 7, - "mongolia": 139, - "montenegro": 247, - "morocco": 105, - "namibia": 172, - "netherlands": 21, - "new zealand": 43, - "nigeria": 20, - "norway": 60, - "oman": 87, - "pakistan": 44, - "palestinian territory": 193, - "peru": 125, - "philippines": 45, - "poland": 53, - "portugal": 38, - "qatar": 170, - "romania": 100, - "russia": 56, - "rwanda": 80, - "saudi arabia": 52, - "serbia": 238, - "singapore": 36, - "slovakia": 90, - "slovenia": 112, - "south africa": 110, - "south korea": 11, - "spain": 26, - "sri lanka": 162, - "sweden": 9, - "switzerland": 12, - "taiwan": 46, - "tanzania": 85, - "thailand": 41, - "tunisia": 202, - "türkiye": 63, - "uganda": 123, - "ukraine": 61, - "united arab emirates": 143, - "united kingdom": 4, - "united states": 5, - "venezuela": 138, - "vietnam": 178, - "zambia": 84, - "zimbabwe": 75, + "argentina": 29, "australia": 25, "austria": 54, "bahrain": 145, + "bangladesh": 47, "belgium": 34, "bosnia-herzegovina": 174, "botswana": 163, + "brazil": 32, "bulgaria": 70, "canada": 6, "chile": 27, "china": 37, + "colombia": 122, "costa rica": 15, "cote d'ivoire": 78, "croatia": 113, + "cyprus": 107, "czech republic": 55, "denmark": 24, "egypt": 59, + "estonia": 89, "euro zone": 72, "finland": 71, "france": 22, "germany": 17, + "greece": 51, "hong kong": 39, "hungary": 93, "iceland": 106, "india": 14, + "indonesia": 48, "iraq": 66, "ireland": 33, "israel": 23, "italy": 10, + "jamaica": 119, "japan": 35, "jordan": 92, "kazakhstan": 102, "kenya": 57, + "kuwait": 94, "latvia": 97, "lebanon": 68, "lithuania": 96, + "luxembourg": 103, "malawi": 111, "malaysia": 42, "malta": 109, + "mauritius": 188, "mexico": 7, "mongolia": 139, "montenegro": 247, + "morocco": 105, "namibia": 172, "netherlands": 21, "new zealand": 43, + "nigeria": 20, "norway": 60, "oman": 87, "pakistan": 44, + "palestinian territory": 193, "peru": 125, "philippines": 45, "poland": 53, + "portugal": 38, "qatar": 170, "romania": 100, "russia": 56, "rwanda": 80, + "saudi arabia": 52, "serbia": 238, "singapore": 36, "slovakia": 90, + "slovenia": 112, "south africa": 110, "south korea": 11, "spain": 26, + "sri lanka": 162, "sweden": 9, "switzerland": 12, "taiwan": 46, + "tanzania": 85, "thailand": 41, "tunisia": 202, "türkiye": 63, + "uganda": 123, "ukraine": 61, "united arab emirates": 143, + "united kingdom": 4, "united states": 5, "venezuela": 138, "vietnam": 178, + "zambia": 84, "zimbabwe": 75, } SECTOR_MAP = { - "energy": 24, - "basic materials": 25, - "industrials": 26, - "consumer cyclicals": 27, - "consumer non-cyclicals": 28, - "financials": 29, - "healthcare": 30, - "technology": 31, - "utilities": 32, - "real estate": 33, - "institutions, associations & organizations": 34, - "government activity": 35, + "energy": 24, "basic materials": 25, "industrials": 
26, + "consumer cyclicals": 27, "consumer non-cyclicals": 28, "financials": 29, + "healthcare": 30, "technology": 31, "utilities": 32, "real estate": 33, + "institutions, associations & organizations": 34, "government activity": 35, "academic and educational services": 36, } -IMPORTANCE_MAP = { - "low": 1, - "medium": 2, - "high": 3, -} +IMPORTANCE_MAP = {"low": 1, "medium": 2, "high": 3} def clean(text: str) -> str: @@ -184,91 +94,41 @@ def convert_names_to_ids(items: List[Union[str, int]], mapping: dict, label: str return ids -def build_earnings_payload(date_from, date_to, countries, sectors, importance, current_tab): - payload = { - "submitFilters": "1", - "limit_from": "0", - "currentTab": current_tab, - } - if current_tab == "custom": - payload["dateFrom"] = date_from - payload["dateTo"] = date_to - if countries: - payload["country[]"] = [str(cid) for cid in countries] - if sectors: - payload["sector[]"] = [str(sid) for sid in sectors] - if importance: - payload["importance[]"] = [str(imp) for imp in importance] - return payload - - -def parse_earnings(html: str, fallback_date: str): - soup = BeautifulSoup(html, "lxml") - rows = soup.select("tr") - results = [] - current_date = fallback_date - for row in rows: - day_td = row.find("td", class_="theDay") - if day_td: - try: - current_date = datetime.strptime( - day_td.text.strip(), "%A, %B %d, %Y" - ).strftime("%Y-%m-%d") - except ValueError: - continue - continue - tds = row.find_all("td") - if len(tds) < 6: - continue - country_span = tds[0].find("span", title=True) - country = country_span["title"] if country_span else "Unknown" - company_td = tds[1] - name_span = company_td.find("span") - ticker_a = company_td.find("a") - name = name_span.text.strip() if name_span else "" - ticker = ticker_a.text.strip() if ticker_a else "" - if not name and not ticker: - parts = list(company_td.stripped_strings) - company = " ".join(parts) if parts else "Unknown" - else: - company = f"{name} ({ticker})" if ticker else name - eps = { - "actual": clean(tds[2].text), - "forecast": clean(tds[3].text), - } - revenue = { - "actual": clean(tds[4].text), - "forecast": clean(tds[5].text), - } - market_cap = clean(tds[6].text) if len(tds) > 6 else "--" - time = "N/A" - if len(tds) > 7: - span = tds[7].find("span", attrs={"data-tooltip": True}) - if span and span.has_attr("data-tooltip"): - time = span["data-tooltip"].strip() - results.append( - { - "date": current_date, - "company": company, - "country": country, - "eps": eps, - "revenue": revenue, - "market_cap": market_cap, - "time": time, - } +def _format_revenue(value) -> str: + """Format revenue number to human-readable string.""" + if value is None: + return "--" + try: + v = float(value) + except (ValueError, TypeError): + return str(value) + if abs(v) >= 1e12: + return f"${v / 1e12:.2f}T" + if abs(v) >= 1e9: + return f"${v / 1e9:.2f}B" + if abs(v) >= 1e6: + return f"${v / 1e6:.2f}M" + return f"${v:,.0f}" + + +def fetch_earnings_fmp(date_from: str, date_to: str) -> List[dict]: + """Fetch earnings from FMP (Financial Modeling Prep) API.""" + api_key = os.getenv("FMP_API_KEY") + if not api_key: + raise HTTPException( + status_code=500, + detail="Missing FMP_API_KEY environment variable.", ) - return results - - -def fetch_earnings(date_from, date_to, countries, sectors, importance, current_tab): - payload = build_earnings_payload( - date_from, date_to, countries, sectors, importance, current_tab + response = requests.get( + FMP_EARNINGS_URL, + params={"from": date_from, "to": date_to, "apikey": 
api_key}, + timeout=20, ) - response = requests.post(EARNINGS_URL, headers=EARNINGS_HEADERS, data=payload, timeout=20) response.raise_for_status() - html = response.json()["data"] - clean_html = unescape(html) - return parse_earnings(clean_html, date_from or "1970-01-01") + data = response.json() + if isinstance(data, dict) and "Error Message" in data: + raise HTTPException(status_code=502, detail=f"FMP API error: {data['Error Message']}") + return data def build_ipo_payload(countries: List[int]) -> dict: @@ -341,44 +201,59 @@ def fetch_events( date_to: Optional[str] = None, ) -> List[Event]: """ - Fetch Investing.com events as calendar events. + Fetch financial calendar events. Parameters: - kind (str): "earnings" or "ipo". Defaults to "earnings". - - country (List[str|int]): Names (case-insensitive) or IDs from COUNTRY_MAP. - - sector (List[str|int]): Names or IDs (earnings only). - - importance (List[str|int]): "low" | "medium" | "high" or IDs (earnings only). - - tab (str): One of: "yesterday", "today", "tomorrow", "thisWeek", "nextWeek" (earnings only). - - date_from/date_to (YYYY-MM-DD): Custom range (earnings only). Must be provided together. + - country (List[str|int]): Country filter (IPO only; ignored for earnings). + - sector (List[str|int]): Sector filter (ignored — kept for backwards compatibility). + - importance (List[str|int]): Importance filter (ignored — kept for backwards compatibility). + - tab (str): Date preset (ignored for earnings — FMP always uses date range). + - date_from/date_to (YYYY-MM-DD): Custom range. For earnings, defaults to current week if not provided. """ try: events: List[Event] = [] if kind.lower() == "earnings": - from_date, to_date, current_tab = resolve_dates(tab, date_from, date_to) - country_ids = convert_names_to_ids(country, COUNTRY_MAP, "country") - sector_ids = convert_names_to_ids(sector, SECTOR_MAP, "sector") - importance_ids = convert_names_to_ids(importance, IMPORTANCE_MAP, "importance") - - raw = fetch_earnings( - from_date, to_date, country_ids, sector_ids, importance_ids, current_tab - ) + # Resolve date range: use provided dates, or default to current week + if date_from and date_to: + from_date, to_date = date_from, date_to + else: + today = datetime.now() + # Start of current week (Monday) + start_of_week = today - timedelta(days=today.weekday()) + end_of_week = start_of_week + timedelta(days=6) + from_date = start_of_week.strftime("%Y-%m-%d") + to_date = end_of_week.strftime("%Y-%m-%d") + + raw = fetch_earnings_fmp(from_date, to_date) for e in raw: - start = datetime.strptime(e["date"], "%Y-%m-%d") + date_str = e.get("date") + if not date_str: + continue + try: + start = datetime.strptime(date_str, "%Y-%m-%d") + except ValueError: + continue end = start + timedelta(days=1) + symbol = e.get("symbol", "Unknown") + eps_actual = e.get("epsActual") + eps_estimated = e.get("epsEstimated") + revenue_actual = e.get("revenueActual") + revenue_estimated = e.get("revenueEstimated") + description = ( - f"Company: {e['company']} | Country: {e['country']} | " - f"EPS (actual/forecast): {e['eps']['actual']} / {e['eps']['forecast']} | " - f"Revenue (actual/forecast): {e['revenue']['actual']} / {e['revenue']['forecast']} | " - f"Market Cap: {e['market_cap']} | Time: {e.get('time') or 'N/A'}" + f"Symbol: {symbol} | " + f"EPS (actual/estimate): {eps_actual or '--'} / {eps_estimated or '--'} | " + f"Revenue (actual/estimate): {_format_revenue(revenue_actual)} / {_format_revenue(revenue_estimated)}" ) events.append( Event( - 
uid=f"inv-earnings-{e['company'].replace(' ', '').lower()}-{e['date']}", - title=f"Earnings – {e['company']}", + uid=f"inv-earnings-{symbol.lower()}-{date_str}", + title=f"Earnings – {symbol}", start=start, end=end, all_day=True, @@ -388,7 +263,7 @@ def fetch_events( ) elif kind.lower() == "ipo": - # For IPOs, only country filter is used + # IPO still uses investing.com scraping country_ids = convert_names_to_ids(country, COUNTRY_MAP, "country") if country else [] raw = fetch_ipo_events(country_ids) @@ -420,15 +295,12 @@ def fetch_events( return events except HTTPException: raise + except requests.RequestException as e: + raise HTTPException(status_code=502, detail=f"Request failed: {str(e)}") from e except Exception as e: raise HTTPException(status_code=500, detail=str(e)) from e class InvestingIntegration(IntegrationBase): def fetch_calendars(self, *args, **kwargs): - """ - Placeholder for future multi-calendar support. - """ return None - - diff --git a/integrations/thetvdb.py b/integrations/thetvdb.py index c1a4c3e..e956fd7 100644 --- a/integrations/thetvdb.py +++ b/integrations/thetvdb.py @@ -1,6 +1,8 @@ from typing import List from datetime import datetime, timedelta import os +import threading +import logging import requests from fastapi import HTTPException @@ -9,6 +11,44 @@ API_BASE = "https://api4.thetvdb.com/v4" +logger = logging.getLogger(__name__) + +# Module-level token cache with thread safety +_token_lock = threading.Lock() +_cached_token: str | None = None + + +def _login_to_thetvdb(api_key: str) -> str: + """Call TheTVDB login endpoint to get a fresh bearer token.""" + response = requests.post( + f"{API_BASE}/login", + json={"apikey": api_key}, + timeout=10, + ) + response.raise_for_status() + data = response.json() + token = data.get("data", {}).get("token") + if not token: + raise ValueError("TheTVDB login succeeded but returned no token") + return token + + +def _get_bearer_token(api_key: str) -> str: + """Get a bearer token, using cached value or refreshing via login.""" + global _cached_token + with _token_lock: + if _cached_token: + return _cached_token + logger.info("No cached TheTVDB token, logging in with API key") + _cached_token = _login_to_thetvdb(api_key) + return _cached_token + + +def _invalidate_token(): + """Clear cached token so next request triggers a fresh login.""" + global _cached_token + with _token_lock: + _cached_token = None class TheTvDbCalendar(CalendarBase): @@ -22,20 +62,30 @@ def fetch_events(self, series_id: int) -> List[Event]: try: url = f"{API_BASE}/series/{series_id}/episodes/default?page=0" api_key = os.getenv("THE_TVDB_API_KEY") - bearer_token = os.getenv("THE_TVDB_BEARER_TOKEN") - if not api_key or not bearer_token: + if not api_key: raise HTTPException( status_code=500, - detail=( - "Missing TheTVDB credentials. Set THE_TVDB_API_KEY and THE_TVDB_BEARER_TOKEN in environment." - ), + detail="Missing TheTVDB credentials. 
Set THE_TVDB_API_KEY in environment.", ) + bearer_token = _get_bearer_token(api_key) headers = { "x-api-key": api_key, "Authorization": f"Bearer {bearer_token}", } response = requests.get(url, headers=headers, timeout=20) + + # Auto-refresh token on 401 and retry once + if response.status_code == 401: + logger.info("TheTVDB returned 401, refreshing token") + _invalidate_token() + bearer_token = _login_to_thetvdb(api_key) + with _token_lock: + global _cached_token + _cached_token = bearer_token + headers["Authorization"] = f"Bearer {bearer_token}" + response = requests.get(url, headers=headers, timeout=20) + response.raise_for_status() data = response.json() if data.get("status") != "success": diff --git a/integrations/ufc.py b/integrations/ufc.py new file mode 100644 index 0000000..e289da4 --- /dev/null +++ b/integrations/ufc.py @@ -0,0 +1,200 @@ +from typing import List +from datetime import datetime, timedelta +import re + +import requests +from bs4 import BeautifulSoup +from fastapi import HTTPException + +from base import CalendarBase, Event, IntegrationBase + + +UFC_EVENTS_URL = "https://www.ufc.com/events" +UFC_BASE_URL = "https://www.ufc.com" + + +def get_event_urls() -> List[str]: + """Fetch event URLs from UFC.com events listing pages.""" + seen_slugs = set() + event_urls = [] + + for page in range(2): # Pages 0 and 1 + url = f"{UFC_EVENTS_URL}?page={page}" + response = requests.get(url, timeout=30, headers={ + "User-Agent": "Mozilla/5.0 (compatible; Sync2Cal/1.0)" + }) + if response.status_code != 200: + continue + + soup = BeautifulSoup(response.text, "html.parser") + + # Find all event links - they must be on UFC.com and contain /event/ + for link in soup.find_all("a", href=True): + href = link["href"] + + # Skip non-UFC links (e.g., ticketmaster) + if not href.startswith("/") and "ufc.com" not in href: + continue + + if "/event/" not in href: + continue + + # Make absolute URL if needed + if href.startswith("/"): + href = f"{UFC_BASE_URL}{href}" + + # Remove fragment (e.g., #1295) and query string for deduplication + clean_url = href.split("#")[0].split("?")[0] + + # Extract slug for deduplication + slug = clean_url.split("/event/")[-1] + + # Use set to avoid duplicates by slug + if slug not in seen_slugs: + seen_slugs.add(slug) + event_urls.append(clean_url) + + return event_urls + + +def get_event_details(event_url: str) -> dict | None: + """Fetch event details from an individual event page.""" + try: + response = requests.get(event_url, timeout=30, headers={ + "User-Agent": "Mozilla/5.0 (compatible; Sync2Cal/1.0)" + }) + if response.status_code != 200: + return None + + soup = BeautifulSoup(response.text, "html.parser") + + # Get event title - handle the nested structure with fighters + title_el = soup.select_one(".c-hero__headline") + if not title_el: + return None + + # Check if it's a vs matchup (e.g., "Bautista vs Oliveira") + divider = title_el.select_one(".e-divider") + if divider: + top = divider.select_one(".e-divider__top") + bottom = divider.select_one(".e-divider__bottom") + if top and bottom: + title = f"{top.get_text(strip=True)} vs {bottom.get_text(strip=True)}" + else: + title = title_el.get_text(strip=True) + else: + title = title_el.get_text(strip=True) + + # Get headline prefix (e.g., "UFC Fight Night") + prefix_el = soup.select_one(".c-hero__headline-prefix") + if prefix_el: + prefix = prefix_el.get_text(strip=True) + title = f"{prefix}: {title}" + + # Get timestamp from data-timestamp attribute + timestamp_el = 
soup.select_one(".c-hero__headline-suffix[data-timestamp]") + if not timestamp_el: + # Try alternative selectors + timestamp_el = soup.select_one("[data-timestamp]") + + if not timestamp_el: + return None + + timestamp_str = timestamp_el.get("data-timestamp") + if not timestamp_str: + return None + + # Parse the Unix timestamp (in seconds) + try: + timestamp = int(timestamp_str) + start_time = datetime.utcfromtimestamp(timestamp) + except (ValueError, TypeError): + return None + + # Get venue/location - try multiple selectors + location = "" + for venue_selector in [".field--name-venue", ".c-hero__headline-location", ".c-event-venue"]: + venue_el = soup.select_one(venue_selector) + if venue_el: + # Clean up whitespace in venue text + location = " ".join(venue_el.get_text().split()) + break + + # Build description with fight card + description_parts = [] + + # Main card fights + main_card = soup.select_one("#main-card") + if main_card: + fights = main_card.select(".c-listing-fight") + if fights: + description_parts.append("Main Card:") + for fight in fights[:6]: # Limit to 6 fights + red_corner = fight.select_one(".c-listing-fight__corner--red .c-listing-fight__corner-name") + blue_corner = fight.select_one(".c-listing-fight__corner--blue .c-listing-fight__corner-name") + if red_corner and blue_corner: + description_parts.append(f" {red_corner.get_text(strip=True)} vs {blue_corner.get_text(strip=True)}") + + # Generate a unique ID from the URL + event_slug = event_url.split("/event/")[-1].split("?")[0] + uid = f"ufc-{event_slug}" + + return { + "uid": uid, + "title": title, + "start": start_time, + "location": location, + "description": "\n".join(description_parts) if description_parts else "", + } + + except Exception: + return None + + +class UfcCalendar(CalendarBase): + def fetch_events(self) -> List[Event]: + """Fetch UFC events directly from UFC.com.""" + try: + event_urls = get_event_urls() + events: List[Event] = [] + seen_uids = set() + + for event_url in event_urls: + details = get_event_details(event_url) + if details is None: + continue + + # Skip if we've already seen this event (by UID) + if details["uid"] in seen_uids: + continue + seen_uids.add(details["uid"]) + + # UFC events typically last about 3-4 hours + end_time = details["start"] + timedelta(hours=4) + + events.append( + Event( + uid=details["uid"], + title=details["title"], + start=details["start"], + end=end_time, + all_day=False, + description=details["description"], + location=details["location"], + ) + ) + + # Sort by start time + events.sort(key=lambda e: e.start) + self.events = events + return events + + except HTTPException: + raise + except Exception as e: + raise HTTPException(status_code=500, detail=str(e)) from e + + +class UfcIntegration(IntegrationBase): + def fetch_calendars(self, *args, **kwargs): + return None diff --git a/main.py b/main.py index 81ea24e..2b99a68 100644 --- a/main.py +++ b/main.py @@ -14,6 +14,7 @@ from integrations.moviedb import MovieDbIntegration, MovieDbCalendar from integrations.thetvdb import TheTvDbIntegration, TheTvDbCalendar from integrations.wwe import WweIntegration, WweCalendar +from integrations.ufc import UfcIntegration, UfcCalendar from integrations.shows import ShowsIntegration, ShowsCalendar from integrations.releases import ReleasesIntegration, ReleasesCalendar from integrations.sportsdb import SportsDbIntegration, SportsDbCalendar @@ -104,6 +105,14 @@ calendar_class=WweCalendar, multi_calendar=False, ), + UfcIntegration( + id="ufc", + name="UFC", + description="UFC 
events scraped directly from UFC.com", + base_url="https://www.ufc.com", + calendar_class=UfcCalendar, + multi_calendar=False, + ), ShowsIntegration( id="shows", name="TV Shows", diff --git a/pyproject.toml b/pyproject.toml new file mode 100644 index 0000000..368f2e3 --- /dev/null +++ b/pyproject.toml @@ -0,0 +1,4 @@ +[tool.pytest.ini_options] +testpaths = ["tests"] +pythonpath = ["."] +addopts = "-v --tb=short" diff --git a/requirements-dev.txt b/requirements-dev.txt new file mode 100644 index 0000000..1d8e06d --- /dev/null +++ b/requirements-dev.txt @@ -0,0 +1,4 @@ +-r requirements.txt +pytest>=8.0 +pytest-cov>=5.0 +httpx>=0.27 diff --git a/requirements.txt b/requirements.txt index 6368993..94e13bf 100644 --- a/requirements.txt +++ b/requirements.txt @@ -1,9 +1,9 @@ -fastapi -uvicorn -lxml -requests -beautifulsoup4 -gspread -google-auth -google-auth-oauthlib -python-dotenv +fastapi==0.123.2 +uvicorn==0.38.0 +lxml==6.0.2 +requests==2.32.5 +beautifulsoup4==4.14.3 +gspread==6.2.1 +google-auth==2.41.1 +google-auth-oauthlib==1.2.3 +python-dotenv==1.2.1 diff --git a/tests/__init__.py b/tests/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/tests/conftest.py b/tests/conftest.py new file mode 100644 index 0000000..a1835dd --- /dev/null +++ b/tests/conftest.py @@ -0,0 +1,29 @@ +import pytest +from datetime import datetime +from base.models import Event + + +@pytest.fixture +def sample_event(): + return Event( + uid="test-123", + title="Test Event", + start=datetime(2025, 6, 15, 10, 0), + end=datetime(2025, 6, 15, 11, 0), + all_day=False, + description="A test event", + location="Test Location", + ) + + +@pytest.fixture +def sample_all_day_event(): + return Event( + uid="test-allday-456", + title="All Day Event", + start=datetime(2025, 6, 15), + end=datetime(2025, 6, 16), + all_day=True, + description="An all-day test event", + location="", + ) diff --git a/tests/integrations/__init__.py b/tests/integrations/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/tests/integrations/test_google_sheets.py b/tests/integrations/test_google_sheets.py new file mode 100644 index 0000000..3da20d3 --- /dev/null +++ b/tests/integrations/test_google_sheets.py @@ -0,0 +1,198 @@ +import pytest +from unittest.mock import patch, MagicMock +from fastapi import HTTPException + +from integrations.google_sheets import GoogleSheetsCalendar + + +class TestGoogleSheetsCalendarFetchEvents: + HEADERS = [ + "name of event", "description", "location", + "start date", "start time", "end date", "end time", "all day event" + ] + + @patch("integrations.google_sheets.gspread.service_account") + def test_all_day_event(self, mock_sa): + mock_gc = MagicMock() + mock_sa.return_value = mock_gc + mock_sheet = MagicMock() + mock_gc.open_by_url.return_value = mock_sheet + mock_worksheet = MagicMock() + mock_sheet.sheet1 = mock_worksheet + mock_worksheet.get_all_values.return_value = [ + self.HEADERS, + ["Team Meeting", "Weekly sync", "Zoom", "2025-06-15", "", "", "", "yes"], + ] + + cal = GoogleSheetsCalendar(name="Sheets", id="sheets", icon="", events=[]) + events = cal.fetch_events(sheet_url="https://docs.google.com/spreadsheets/d/test") + assert len(events) == 1 + assert events[0].title == "Team Meeting" + assert events[0].all_day is True + assert events[0].description == "Weekly sync" + assert events[0].location == "Zoom" + + @patch("integrations.google_sheets.gspread.service_account") + def test_timed_event(self, mock_sa): + mock_gc = MagicMock() + mock_sa.return_value = mock_gc + mock_sheet = 
MagicMock() + mock_gc.open_by_url.return_value = mock_sheet + mock_worksheet = MagicMock() + mock_sheet.sheet1 = mock_worksheet + mock_worksheet.get_all_values.return_value = [ + self.HEADERS, + ["Standup", "Daily", "Office", "2025-06-15", "09:00", "2025-06-15", "09:30", "no"], + ] + + cal = GoogleSheetsCalendar(name="Sheets", id="sheets", icon="", events=[]) + events = cal.fetch_events(sheet_url="https://example.com/sheet") + assert len(events) == 1 + assert events[0].all_day is False + assert events[0].start.hour == 9 + assert events[0].end.minute == 30 + + @patch("integrations.google_sheets.gspread.service_account") + def test_all_day_with_end_date(self, mock_sa): + mock_gc = MagicMock() + mock_sa.return_value = mock_gc + mock_sheet = MagicMock() + mock_gc.open_by_url.return_value = mock_sheet + mock_worksheet = MagicMock() + mock_sheet.sheet1 = mock_worksheet + mock_worksheet.get_all_values.return_value = [ + self.HEADERS, + ["Vacation", "", "", "2025-06-15", "", "2025-06-20", "", "true"], + ] + + cal = GoogleSheetsCalendar(name="Sheets", id="sheets", icon="", events=[]) + events = cal.fetch_events(sheet_url="https://example.com/sheet") + assert len(events) == 1 + # End should be end_date + 1 day (ICS convention) + assert events[0].end.day == 21 + + @patch("integrations.google_sheets.gspread.service_account") + def test_multiple_events(self, mock_sa): + mock_gc = MagicMock() + mock_sa.return_value = mock_gc + mock_sheet = MagicMock() + mock_gc.open_by_url.return_value = mock_sheet + mock_worksheet = MagicMock() + mock_sheet.sheet1 = mock_worksheet + mock_worksheet.get_all_values.return_value = [ + self.HEADERS, + ["Event 1", "", "", "2025-06-15", "", "", "", "yes"], + ["Event 2", "", "", "2025-06-16", "", "", "", "1"], + ] + + cal = GoogleSheetsCalendar(name="Sheets", id="sheets", icon="", events=[]) + events = cal.fetch_events(sheet_url="https://example.com/sheet") + assert len(events) == 2 + + @patch("integrations.google_sheets.gspread.service_account") + def test_bad_row_skipped(self, mock_sa): + mock_gc = MagicMock() + mock_sa.return_value = mock_gc + mock_sheet = MagicMock() + mock_gc.open_by_url.return_value = mock_sheet + mock_worksheet = MagicMock() + mock_sheet.sheet1 = mock_worksheet + mock_worksheet.get_all_values.return_value = [ + self.HEADERS, + ["Good Event", "", "", "2025-06-15", "", "", "", "yes"], + ["Bad Event", "", "", "not-a-date", "", "", "", "yes"], + ] + + cal = GoogleSheetsCalendar(name="Sheets", id="sheets", icon="", events=[]) + events = cal.fetch_events(sheet_url="https://example.com/sheet") + assert len(events) == 1 + assert events[0].title == "Good Event" + + @patch("integrations.google_sheets.gspread.service_account") + def test_empty_sheet_raises_404(self, mock_sa): + mock_gc = MagicMock() + mock_sa.return_value = mock_gc + mock_sheet = MagicMock() + mock_gc.open_by_url.return_value = mock_sheet + mock_worksheet = MagicMock() + mock_sheet.sheet1 = mock_worksheet + mock_worksheet.get_all_values.return_value = [] + + cal = GoogleSheetsCalendar(name="Sheets", id="sheets", icon="", events=[]) + with pytest.raises(HTTPException) as exc_info: + cal.fetch_events(sheet_url="https://example.com/sheet") + assert exc_info.value.status_code == 404 + + @patch("integrations.google_sheets.gspread.service_account") + def test_header_only_raises_404(self, mock_sa): + mock_gc = MagicMock() + mock_sa.return_value = mock_gc + mock_sheet = MagicMock() + mock_gc.open_by_url.return_value = mock_sheet + mock_worksheet = MagicMock() + mock_sheet.sheet1 = mock_worksheet + 
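+ # Header row only, no data rows, so fetch_events should raise 404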
mock_worksheet.get_all_values.return_value = [self.HEADERS] + + cal = GoogleSheetsCalendar(name="Sheets", id="sheets", icon="", events=[]) + with pytest.raises(HTTPException) as exc_info: + cal.fetch_events(sheet_url="https://example.com/sheet") + assert exc_info.value.status_code == 404 + + @patch("integrations.google_sheets.gspread.service_account") + def test_auth_failure(self, mock_sa): + mock_sa.side_effect = Exception("Invalid credentials") + + cal = GoogleSheetsCalendar(name="Sheets", id="sheets", icon="", events=[]) + with pytest.raises(HTTPException) as exc_info: + cal.fetch_events(sheet_url="https://example.com/sheet") + assert exc_info.value.status_code == 500 + assert "authenticate" in exc_info.value.detail.lower() + + @patch("integrations.google_sheets.gspread.service_account") + def test_spreadsheet_not_found(self, mock_sa): + import gspread + mock_gc = MagicMock() + mock_sa.return_value = mock_gc + mock_gc.open_by_url.side_effect = gspread.exceptions.SpreadsheetNotFound + + cal = GoogleSheetsCalendar(name="Sheets", id="sheets", icon="", events=[]) + with pytest.raises(HTTPException) as exc_info: + cal.fetch_events(sheet_url="https://example.com/nonexistent") + assert exc_info.value.status_code == 404 + + @patch("integrations.google_sheets.gspread.service_account") + def test_all_bad_rows_raises_404(self, mock_sa): + mock_gc = MagicMock() + mock_sa.return_value = mock_gc + mock_sheet = MagicMock() + mock_gc.open_by_url.return_value = mock_sheet + mock_worksheet = MagicMock() + mock_sheet.sheet1 = mock_worksheet + mock_worksheet.get_all_values.return_value = [ + self.HEADERS, + ["Bad 1", "", "", "invalid", "", "", "", "yes"], + ["Bad 2", "", "", "also-bad", "", "", "", "yes"], + ] + + cal = GoogleSheetsCalendar(name="Sheets", id="sheets", icon="", events=[]) + with pytest.raises(HTTPException) as exc_info: + cal.fetch_events(sheet_url="https://example.com/sheet") + assert exc_info.value.status_code == 404 + assert "No valid events" in exc_info.value.detail + + @patch("integrations.google_sheets.gspread.service_account") + def test_uid_generation(self, mock_sa): + mock_gc = MagicMock() + mock_sa.return_value = mock_gc + mock_sheet = MagicMock() + mock_gc.open_by_url.return_value = mock_sheet + mock_worksheet = MagicMock() + mock_sheet.sheet1 = mock_worksheet + mock_worksheet.get_all_values.return_value = [ + self.HEADERS, + ["My Event", "", "", "2025-06-15", "", "", "", "yes"], + ] + + cal = GoogleSheetsCalendar(name="Sheets", id="sheets", icon="", events=[]) + events = cal.fetch_events(sheet_url="https://example.com/sheet") + assert events[0].uid.startswith("sheet-") diff --git a/tests/integrations/test_imdb.py b/tests/integrations/test_imdb.py new file mode 100644 index 0000000..fe737d0 --- /dev/null +++ b/tests/integrations/test_imdb.py @@ -0,0 +1,134 @@ +import pytest +from datetime import datetime +from unittest.mock import patch, MagicMock + +from integrations.imdb import parse_imdb_date, filter_movies, scrape_imdb_movies, ImdbCalendar + + +class TestParseImdbDate: + def test_valid_date(self): + result = parse_imdb_date("Jun 27, 2025") + assert result.year == 2025 + assert result.month == 6 + assert result.day == 27 + assert result.hour == 8 + assert result.minute == 0 + + def test_different_month(self): + result = parse_imdb_date("Dec 1, 2025") + assert result.month == 12 + assert result.day == 1 + + def test_invalid_date_raises(self): + with pytest.raises(ValueError): + parse_imdb_date("Not a date") + + def test_empty_string_raises(self): + with 
pytest.raises(ValueError): + parse_imdb_date("") + + +class TestFilterMovies: + @pytest.fixture + def movies(self): + return [ + {"title": "Action Movie", "genres": ["Action", "Thriller"], "cast": ["Actor A", "Actor B"]}, + {"title": "Comedy Movie", "genres": ["Comedy"], "cast": ["Actor C"]}, + {"title": "Action Comedy", "genres": ["Action", "Comedy"], "cast": ["Actor A", "Actor C"]}, + ] + + def test_all_filter(self, movies): + result = filter_movies(movies, genre="all", actor="all") + assert len(result) == 3 + + def test_genre_filter(self, movies): + result = filter_movies(movies, genre="comedy", actor="all") + assert len(result) == 2 + assert all("Comedy" in m["genres"] for m in result) + + def test_actor_filter(self, movies): + result = filter_movies(movies, genre="all", actor="actor a") + assert len(result) == 2 + + def test_genre_and_actor_filter(self, movies): + result = filter_movies(movies, genre="action", actor="actor a") + assert len(result) == 2 + + def test_no_match(self, movies): + result = filter_movies(movies, genre="horror", actor="all") + assert len(result) == 0 + + def test_case_insensitive(self, movies): + result = filter_movies(movies, genre="ACTION", actor="ACTOR A") + assert len(result) == 2 + + def test_empty_movies(self): + result = filter_movies([], genre="all", actor="all") + assert result == [] + + +class TestScrapeImdbMovies: + @patch("integrations.imdb.requests.get") + def test_success(self, mock_get): + mock_response = MagicMock() + mock_response.status_code = 200 + mock_response.raise_for_status = MagicMock() + mock_response.text = """ + + + + """ + mock_get.return_value = mock_response + + result = scrape_imdb_movies("US") + assert len(result) == 1 + assert result[0]["title"] == "Test Movie" + assert result[0]["movie_id"] == "tt1234567" + assert result[0]["release_date"] == "Jun 27, 2025" + + @patch("integrations.imdb.requests.get") + def test_empty_page(self, mock_get): + mock_response = MagicMock() + mock_response.status_code = 200 + mock_response.raise_for_status = MagicMock() + mock_response.text = "" + mock_get.return_value = mock_response + + result = scrape_imdb_movies("US") + assert result == [] + + +class TestImdbCalendarFetchEvents: + @patch("integrations.imdb.scrape_imdb_movies") + def test_fetch_events_success(self, mock_scrape): + mock_scrape.return_value = [ + { + "title": "Test Movie", + "release_date": "Jun 15, 2025", + "genres": ["Action"], + "cast": ["Actor A"], + "location": "https://www.imdb.com/title/tt123/", + "movie_id": "tt123", + } + ] + cal = ImdbCalendar(name="IMDb", id="imdb", icon="", events=[]) + events = cal.fetch_events(genre="all", actor="all", country="US") + assert len(events) == 1 + assert events[0].title == "Test Movie" + assert events[0].all_day is True + + @patch("integrations.imdb.scrape_imdb_movies") + def test_fetch_events_with_filter(self, mock_scrape): + mock_scrape.return_value = [ + {"title": "Action", "release_date": "Jun 15, 2025", "genres": ["Action"], "cast": [], "location": None, "movie_id": "tt1"}, + {"title": "Comedy", "release_date": "Jun 16, 2025", "genres": ["Comedy"], "cast": [], "location": None, "movie_id": "tt2"}, + ] + cal = ImdbCalendar(name="IMDb", id="imdb", icon="", events=[]) + events = cal.fetch_events(genre="comedy", actor="all", country="US") + assert len(events) == 1 + assert events[0].title == "Comedy" diff --git a/tests/integrations/test_investing.py b/tests/integrations/test_investing.py new file mode 100644 index 0000000..0eb7e97 --- /dev/null +++ 
b/tests/integrations/test_investing.py @@ -0,0 +1,284 @@ +import pytest +from unittest.mock import patch, MagicMock +from fastapi import HTTPException + +from integrations.investing import ( + clean, + resolve_dates, + convert_names_to_ids, + build_ipo_payload, + parse_ipo_html, + fetch_earnings_fmp, + _format_revenue, + InvestingCalendar, + COUNTRY_MAP, + SECTOR_MAP, + IMPORTANCE_MAP, +) + + +class TestClean: + def test_html_entities(self): + assert clean("&") == "&" + assert clean("<") == "<" + + def test_nbsp(self): + assert clean("hello\xa0world") == "hello world" + + def test_slashes(self): + assert clean("hello/world") == "helloworld" + + def test_escaped_tags(self): + assert clean("<\\/td>") == "<\\td>" + + def test_backslash_escaped(self): + assert clean("hello\\/world") == "hello\\world" + + def test_strips_whitespace(self): + assert clean(" hello ") == "hello" + + def test_combined(self): + assert clean(" &\xa0test/ ") == "& test" + + +class TestResolveDates: + def test_custom_date_range(self): + fr, to, tab = resolve_dates(None, "2025-01-01", "2025-01-31") + assert fr == "2025-01-01" + assert to == "2025-01-31" + assert tab == "custom" + + def test_valid_tabs(self): + for tab_name in ["yesterday", "today", "tomorrow", "thisWeek", "nextWeek"]: + fr, to, result_tab = resolve_dates(tab_name, None, None) + assert fr is None + assert to is None + assert result_tab == tab_name + + def test_invalid_tab_raises(self): + with pytest.raises(HTTPException) as exc_info: + resolve_dates("invalid", None, None) + assert exc_info.value.status_code == 400 + + +class TestConvertNamesToIds: + def test_name_lookup(self): + ids = convert_names_to_ids(["united states"], COUNTRY_MAP, "country") + assert ids == [5] + + def test_case_insensitive(self): + ids = convert_names_to_ids(["United States"], COUNTRY_MAP, "country") + assert ids == [5] + + def test_int_passthrough(self): + ids = convert_names_to_ids([5, 6], COUNTRY_MAP, "country") + assert ids == [5, 6] + + def test_mixed_names_and_ids(self): + ids = convert_names_to_ids(["united states", 6], COUNTRY_MAP, "country") + assert ids == [5, 6] + + def test_invalid_name_raises(self): + with pytest.raises(HTTPException) as exc_info: + convert_names_to_ids(["nonexistent"], COUNTRY_MAP, "country") + assert exc_info.value.status_code == 400 + + def test_empty_list(self): + ids = convert_names_to_ids([], COUNTRY_MAP, "country") + assert ids == [] + + def test_sector_map(self): + ids = convert_names_to_ids(["technology"], SECTOR_MAP, "sector") + assert ids == [31] + + def test_importance_map(self): + ids = convert_names_to_ids(["high"], IMPORTANCE_MAP, "importance") + assert ids == [3] + + +class TestBuildIpoPayload: + def test_with_countries(self): + payload = build_ipo_payload([5, 6]) + assert payload["country[]"] == ["5", "6"] + assert payload["currentTab"] == "upcoming" + + def test_without_countries(self): + payload = build_ipo_payload([]) + assert "country[]" not in payload + + +class TestFormatRevenue: + def test_none(self): + assert _format_revenue(None) == "--" + + def test_trillions(self): + assert _format_revenue(1.5e12) == "$1.50T" + + def test_billions(self): + assert _format_revenue(2.34e9) == "$2.34B" + + def test_millions(self): + assert _format_revenue(500e6) == "$500.00M" + + def test_small_number(self): + assert _format_revenue(12345) == "$12,345" + + def test_negative_billion(self): + assert _format_revenue(-1.5e9) == "$-1.50B" + + def test_non_numeric(self): + assert _format_revenue("N/A") == "N/A" + + def test_zero(self): + assert 
_format_revenue(0) == "$0" + + +class TestFetchEarningsFmp: + @patch.dict("os.environ", {}, clear=True) + def test_missing_api_key_raises(self): + with pytest.raises(HTTPException) as exc_info: + fetch_earnings_fmp("2025-01-01", "2025-01-07") + assert exc_info.value.status_code == 500 + assert "FMP_API_KEY" in exc_info.value.detail + + @patch.dict("os.environ", {"FMP_API_KEY": "test-key"}) + @patch("integrations.investing.requests.get") + def test_success(self, mock_get): + mock_resp = MagicMock() + mock_resp.json.return_value = [ + {"symbol": "AAPL", "date": "2025-01-27", "epsActual": 2.10, "epsEstimated": 2.05}, + ] + mock_resp.raise_for_status = MagicMock() + mock_get.return_value = mock_resp + + result = fetch_earnings_fmp("2025-01-27", "2025-01-31") + assert len(result) == 1 + assert result[0]["symbol"] == "AAPL" + mock_get.assert_called_once_with( + "https://financialmodelingprep.com/stable/earnings-calendar", + params={"from": "2025-01-27", "to": "2025-01-31", "apikey": "test-key"}, + timeout=20, + ) + + @patch.dict("os.environ", {"FMP_API_KEY": "test-key"}) + @patch("integrations.investing.requests.get") + def test_api_error_message(self, mock_get): + mock_resp = MagicMock() + mock_resp.json.return_value = {"Error Message": "Invalid API key"} + mock_resp.raise_for_status = MagicMock() + mock_get.return_value = mock_resp + + with pytest.raises(HTTPException) as exc_info: + fetch_earnings_fmp("2025-01-01", "2025-01-07") + assert exc_info.value.status_code == 502 + assert "Invalid API key" in exc_info.value.detail + + @patch.dict("os.environ", {"FMP_API_KEY": "test-key"}) + @patch("integrations.investing.requests.get") + def test_request_exception(self, mock_get): + import requests + mock_get.side_effect = requests.RequestException("timeout") + + with pytest.raises(requests.RequestException): + fetch_earnings_fmp("2025-01-01", "2025-01-07") + + +class TestInvestingCalendarEarnings: + @patch.dict("os.environ", {"FMP_API_KEY": "test-key"}) + @patch("integrations.investing.fetch_earnings_fmp") + def test_earnings_with_date_range(self, mock_fetch): + mock_fetch.return_value = [ + { + "symbol": "AMZN", + "date": "2025-01-30", + "epsActual": None, + "epsEstimated": 1.98, + "revenueActual": None, + "revenueEstimated": 211170000000, + }, + ] + cal = InvestingCalendar(name="Investing", id="investing", icon="", events=[]) + events = cal.fetch_events(kind="earnings", date_from="2025-01-27", date_to="2025-01-31") + assert len(events) == 1 + assert events[0].title == "Earnings – AMZN" + assert events[0].all_day is True + assert "AMZN" in events[0].description + assert "$211.17B" in events[0].description + mock_fetch.assert_called_once_with("2025-01-27", "2025-01-31") + + @patch.dict("os.environ", {"FMP_API_KEY": "test-key"}) + @patch("integrations.investing.fetch_earnings_fmp") + def test_earnings_defaults_to_current_week(self, mock_fetch): + mock_fetch.return_value = [ + {"symbol": "TEST", "date": "2025-01-27", "epsActual": 1.0, "epsEstimated": 1.0, + "revenueActual": None, "revenueEstimated": None}, + ] + cal = InvestingCalendar(name="Investing", id="investing", icon="", events=[]) + events = cal.fetch_events(kind="earnings") + assert len(events) == 1 + # fetch_earnings_fmp should have been called with auto-computed dates + mock_fetch.assert_called_once() + args = mock_fetch.call_args[0] + assert len(args[0]) == 10 # YYYY-MM-DD format + assert len(args[1]) == 10 + + @patch.dict("os.environ", {"FMP_API_KEY": "test-key"}) + @patch("integrations.investing.fetch_earnings_fmp") + def 
test_earnings_skips_entries_without_date(self, mock_fetch): + mock_fetch.return_value = [ + {"symbol": "GOOD", "date": "2025-01-30", "epsActual": 1.0, "epsEstimated": 1.0, + "revenueActual": None, "revenueEstimated": None}, + {"symbol": "BAD", "date": None}, + {"symbol": "BAD2"}, + ] + cal = InvestingCalendar(name="Investing", id="investing", icon="", events=[]) + events = cal.fetch_events(kind="earnings", date_from="2025-01-27", date_to="2025-01-31") + assert len(events) == 1 + assert events[0].title == "Earnings – GOOD" + + def test_invalid_kind_raises(self): + cal = InvestingCalendar(name="Investing", id="investing", icon="", events=[]) + with pytest.raises(HTTPException) as exc_info: + cal.fetch_events(kind="invalid") + assert exc_info.value.status_code == 400 + + +class TestParseIpoHtml: + def test_empty_html(self): + result = parse_ipo_html("") + assert result == [] + + def test_valid_ipo_row(self): + html = """ + + + + + + + + + +
Jan 15, 2025TechCoTECHNASDAQ500M$25.00$27.50
+ """ + result = parse_ipo_html(html) + assert len(result) == 1 + assert result[0]["date"] == "2025-01-15" + assert result[0]["country"] == "United States" + assert result[0]["exchange"] == "NASDAQ" + + def test_invalid_date_skipped(self): + html = """ + + + + + + + + + +
Invalid DateCoCONYSE100M$10$11
+ """ + result = parse_ipo_html(html) + assert result == [] diff --git a/tests/integrations/test_moviedb.py b/tests/integrations/test_moviedb.py new file mode 100644 index 0000000..3c54a3b --- /dev/null +++ b/tests/integrations/test_moviedb.py @@ -0,0 +1,131 @@ +import pytest +from unittest.mock import patch, MagicMock +from fastapi import HTTPException + +from integrations.moviedb import MovieDbCalendar + + +class TestMovieDbCalendarFetchEvents: + SAMPLE_HTML = """ +
+

Inception 2

+

15 Jun 2025

+
+
+

Matrix 5

+

Jun 20, 2025

+
+ """ + + EMPTY_HTML = "" + + @patch("integrations.moviedb.requests.post") + def test_success_with_movies(self, mock_post): + mock_resp = MagicMock() + mock_resp.status_code = 200 + mock_resp.raise_for_status = MagicMock() + # First page has movies, second page is empty to break pagination + mock_resp_empty = MagicMock() + mock_resp_empty.status_code = 200 + mock_resp_empty.raise_for_status = MagicMock() + mock_resp.text = self.SAMPLE_HTML + mock_resp_empty.text = self.EMPTY_HTML + mock_post.side_effect = [mock_resp, mock_resp_empty] + + cal = MovieDbCalendar(name="MovieDB", id="moviedb", icon="", events=[]) + events = cal.fetch_events(start_date="2025-06-01", end_date="2025-12-31", max_pages=2) + assert len(events) == 2 + assert events[0].title == "Inception 2" + assert events[1].title == "Matrix 5" + assert events[0].all_day is True + assert "tmdb-" in events[0].uid + + @patch("integrations.moviedb.requests.post") + def test_empty_results(self, mock_post): + mock_resp = MagicMock() + mock_resp.status_code = 200 + mock_resp.raise_for_status = MagicMock() + mock_resp.text = self.EMPTY_HTML + mock_post.return_value = mock_resp + + cal = MovieDbCalendar(name="MovieDB", id="moviedb", icon="", events=[]) + events = cal.fetch_events(start_date="2025-06-01", end_date="2025-06-30") + assert events == [] + + @patch("integrations.moviedb.requests.post") + def test_defaults_without_dates(self, mock_post): + mock_resp = MagicMock() + mock_resp.status_code = 200 + mock_resp.raise_for_status = MagicMock() + mock_resp.text = self.EMPTY_HTML + mock_post.return_value = mock_resp + + cal = MovieDbCalendar(name="MovieDB", id="moviedb", icon="", events=[]) + events = cal.fetch_events() + assert events == [] + # Should have been called with default dates + assert mock_post.called + + @patch("integrations.moviedb.requests.post") + def test_request_failure_raises_502(self, mock_post): + import requests + mock_post.side_effect = requests.RequestException("Connection failed") + + cal = MovieDbCalendar(name="MovieDB", id="moviedb", icon="", events=[]) + with pytest.raises(HTTPException) as exc_info: + cal.fetch_events(start_date="2025-06-01", end_date="2025-06-30") + assert exc_info.value.status_code == 502 + + @patch("integrations.moviedb.requests.post") + def test_invalid_date_in_card_skipped(self, mock_post): + html = """ +
+

Bad Date Movie

+

Not A Real Date

+
+ """ + mock_resp = MagicMock() + mock_resp.status_code = 200 + mock_resp.raise_for_status = MagicMock() + mock_resp.text = html + mock_resp_empty = MagicMock() + mock_resp_empty.status_code = 200 + mock_resp_empty.raise_for_status = MagicMock() + mock_resp_empty.text = self.EMPTY_HTML + mock_post.side_effect = [mock_resp, mock_resp_empty] + + cal = MovieDbCalendar(name="MovieDB", id="moviedb", icon="", events=[]) + events = cal.fetch_events(start_date="2025-06-01", end_date="2025-06-30", max_pages=2) + assert events == [] + + @patch("integrations.moviedb.requests.post") + def test_card_without_title_skipped(self, mock_post): + html = """ +
+

15 Jun 2025

+
+ """ + mock_resp = MagicMock() + mock_resp.status_code = 200 + mock_resp.raise_for_status = MagicMock() + mock_resp.text = html + mock_post.return_value = mock_resp + + cal = MovieDbCalendar(name="MovieDB", id="moviedb", icon="", events=[]) + events = cal.fetch_events(start_date="2025-06-01", end_date="2025-06-30") + assert events == [] + + @patch("integrations.moviedb.requests.post") + def test_max_pages_respected(self, mock_post): + """Ensure pagination stops at max_pages.""" + mock_resp = MagicMock() + mock_resp.status_code = 200 + mock_resp.raise_for_status = MagicMock() + mock_resp.text = self.SAMPLE_HTML + mock_post.return_value = mock_resp + + cal = MovieDbCalendar(name="MovieDB", id="moviedb", icon="", events=[]) + events = cal.fetch_events(start_date="2025-06-01", end_date="2025-06-30", max_pages=1) + # Should only make 1 request + assert mock_post.call_count == 1 + assert len(events) == 2 diff --git a/tests/integrations/test_releases.py b/tests/integrations/test_releases.py new file mode 100644 index 0000000..597228b --- /dev/null +++ b/tests/integrations/test_releases.py @@ -0,0 +1,157 @@ +import pytest +from unittest.mock import patch, MagicMock +from fastapi import HTTPException + +from integrations.releases import ReleasesCalendar + + +class TestReleasesCalendarFetchEvents: + GAMES_HTML = """ +
+ Halo Infinite DLC + Xbox Series X + PC +
+
+ PS5 Exclusive + PlayStation 5 +
+ """ + + TV_HTML = """ + + """ + + EMPTY_HTML = "" + + @patch("integrations.releases.requests.post") + def test_games_with_platform_filter(self, mock_post): + mock_resp = MagicMock() + mock_resp.status_code = 200 + mock_resp.raise_for_status = MagicMock() + mock_resp.text = self.GAMES_HTML + mock_post.return_value = mock_resp + + cal = ReleasesCalendar(name="Releases", id="releases", icon="", events=[]) + events = cal.fetch_events(kind="games", days_ahead=1, platform="xbox") + assert len(events) == 1 + assert events[0].title == "Halo Infinite DLC" + assert events[0].all_day is True + + @patch("integrations.releases.requests.post") + def test_games_no_matching_platform(self, mock_post): + mock_resp = MagicMock() + mock_resp.status_code = 200 + mock_resp.raise_for_status = MagicMock() + mock_resp.text = self.GAMES_HTML + mock_post.return_value = mock_resp + + cal = ReleasesCalendar(name="Releases", id="releases", icon="", events=[]) + events = cal.fetch_events(kind="games", days_ahead=1, platform="nintendo") + assert events == [] + + @patch("integrations.releases.requests.post") + def test_tv_series_no_platform_filter(self, mock_post): + mock_resp = MagicMock() + mock_resp.status_code = 200 + mock_resp.raise_for_status = MagicMock() + mock_resp.text = self.TV_HTML + mock_post.return_value = mock_resp + + cal = ReleasesCalendar(name="Releases", id="releases", icon="", events=[]) + events = cal.fetch_events(kind="tv-series", days_ahead=1, platform="xbox") + assert len(events) == 1 + assert events[0].title == "Breaking Bad Reboot" + + @patch("integrations.releases.requests.post") + def test_empty_results(self, mock_post): + mock_resp = MagicMock() + mock_resp.status_code = 200 + mock_resp.raise_for_status = MagicMock() + mock_resp.text = self.EMPTY_HTML + mock_post.return_value = mock_resp + + cal = ReleasesCalendar(name="Releases", id="releases", icon="", events=[]) + events = cal.fetch_events(kind="games", days_ahead=1) + assert events == [] + + @patch("integrations.releases.requests.post") + def test_multiple_days(self, mock_post): + mock_resp = MagicMock() + mock_resp.status_code = 200 + mock_resp.raise_for_status = MagicMock() + mock_resp.text = self.TV_HTML + mock_post.return_value = mock_resp + + cal = ReleasesCalendar(name="Releases", id="releases", icon="", events=[]) + events = cal.fetch_events(kind="tv-series", days_ahead=3) + assert mock_post.call_count == 3 + assert len(events) == 3 + + @patch("integrations.releases.requests.post") + def test_request_failure_raises_500(self, mock_post): + mock_post.side_effect = Exception("Connection failed") + + cal = ReleasesCalendar(name="Releases", id="releases", icon="", events=[]) + with pytest.raises(HTTPException) as exc_info: + cal.fetch_events(kind="games", days_ahead=1) + assert exc_info.value.status_code == 500 + + @patch("integrations.releases.requests.post") + def test_card_without_title_skipped(self, mock_post): + html = """ +
+ No title link here +
+ """ + mock_resp = MagicMock() + mock_resp.status_code = 200 + mock_resp.raise_for_status = MagicMock() + mock_resp.text = html + mock_post.return_value = mock_resp + + cal = ReleasesCalendar(name="Releases", id="releases", icon="", events=[]) + events = cal.fetch_events(kind="tv-series", days_ahead=1) + assert events == [] + + @patch("integrations.releases.requests.post") + def test_games_with_track_buttons(self, mock_post): + html = """ +
+ Button Game + +
+ """ + mock_resp = MagicMock() + mock_resp.status_code = 200 + mock_resp.raise_for_status = MagicMock() + mock_resp.text = html + mock_post.return_value = mock_resp + + cal = ReleasesCalendar(name="Releases", id="releases", icon="", events=[]) + events = cal.fetch_events(kind="games", days_ahead=1, platform="xbox") + assert len(events) == 1 + + @patch("integrations.releases.requests.post") + def test_hidden_platform_spans_ignored(self, mock_post): + html = """ +
+ Hidden Platform Game + + PC +
+ """ + mock_resp = MagicMock() + mock_resp.status_code = 200 + mock_resp.raise_for_status = MagicMock() + mock_resp.text = html + mock_post.return_value = mock_resp + + cal = ReleasesCalendar(name="Releases", id="releases", icon="", events=[]) + events = cal.fetch_events(kind="games", days_ahead=1, platform="xbox") + # Xbox span is hidden, only PC is visible + assert events == [] diff --git a/tests/integrations/test_shows.py b/tests/integrations/test_shows.py new file mode 100644 index 0000000..279fbc1 --- /dev/null +++ b/tests/integrations/test_shows.py @@ -0,0 +1,381 @@ +from datetime import datetime +from unittest.mock import patch, MagicMock + +import pytest +from fastapi import HTTPException + +from integrations.shows import ( + _convert_date, + _create_slug, + _scrape_shows, + _get_tmsid, + _scrape_episodes, + ShowsCalendar, +) + + +# --------------------------------------------------------------------------- +# Pure function tests (existing, kept intact) +# --------------------------------------------------------------------------- + +class TestConvertDate: + def test_valid_date(self): + result = _convert_date("Monday, January 27") + assert result is not None + assert result.startswith("2") # Year + assert len(result) == 8 # YYYYMMDD + + def test_valid_date_format(self): + result = _convert_date("Friday, June 15") + assert "0615" in result + + def test_invalid_date(self): + result = _convert_date("Not a date") + assert result is None + + def test_empty_string(self): + result = _convert_date("") + assert result is None + + +class TestCreateSlug: + def test_basic_text(self): + assert _create_slug("Hello World") == "hello_world" + + def test_special_characters(self): + result = _create_slug("Rock & Roll!") + assert "!" not in result + + def test_ampersand_replaced(self): + # Note: & is stripped by regex before replace runs + result = _create_slug("Tom & Jerry") + assert isinstance(result, str) + assert len(result) > 0 + + def test_leading_trailing_stripped(self): + result = _create_slug(" hello ") + assert result == "hello" + + def test_multiple_separators_collapsed(self): + result = _create_slug("hello world---foo") + assert "__" not in result # Multiple separators collapsed + + def test_plus_removed(self): + result = _create_slug("C++ Programming") + assert "+" not in result + + +# --------------------------------------------------------------------------- +# _scrape_shows tests +# --------------------------------------------------------------------------- + +CALENDAR_HTML = """ + +
Monday, June 15
+ + +

Breaking Bad

+
Drama
+ +
+ + +

The Office

+
Comedy
+ +
+
Tuesday, June 16
+ + +

Friends

+
Comedy
+ +
+ +""" + +CALENDAR_WITH_PREMIERE_HTML = """ + +
Monday, June 15
+ + +

Streamer Show

+
Streaming Premiere
+
+ + +

Movie Show

+
Movie Premiere
+
+ + +

Good Show

+
Season 3 Premiere
+ +
+ +""" + + +class TestScrapeShows: + @patch("integrations.shows.requests.get") + def test_basic_scrape(self, mock_get): + mock_resp = MagicMock() + mock_resp.text = CALENDAR_HTML + mock_get.return_value = mock_resp + + shows = _scrape_shows() + assert len(shows) == 3 + assert shows[0][0] == "Breaking Bad" + assert shows[0][1] == "AMC" + assert shows[1][0] == "The Office" + assert shows[2][0] == "Friends" + + @patch("integrations.shows.requests.get") + def test_streaming_and_movie_premieres_skipped(self, mock_get): + mock_resp = MagicMock() + mock_resp.text = CALENDAR_WITH_PREMIERE_HTML + mock_get.return_value = mock_resp + + shows = _scrape_shows() + # Only the Season Premiere should remain (streaming/movie filtered out) + assert len(shows) == 1 + assert "Good Show" in shows[0][0] + assert shows[0][5] == "Season Premiere" + + @patch("integrations.shows.requests.get") + def test_season_premiere_appended_to_name(self, mock_get): + mock_resp = MagicMock() + mock_resp.text = CALENDAR_WITH_PREMIERE_HTML + mock_get.return_value = mock_resp + + shows = _scrape_shows() + assert shows[0][0] == "Good Show (Season 3 Premiere)" + + @patch("integrations.shows.requests.get") + def test_empty_page(self, mock_get): + mock_resp = MagicMock() + mock_resp.text = "" + mock_get.return_value = mock_resp + + shows = _scrape_shows() + assert shows == [] + + @patch("integrations.shows.requests.get") + def test_show_without_network_logo_skipped(self, mock_get): + html = """ + +
Monday, June 15
+ +

No Logo Show

+
Drama
+
+ + """ + mock_resp = MagicMock() + mock_resp.text = html + mock_get.return_value = mock_resp + + shows = _scrape_shows() + assert shows == [] + + +# --------------------------------------------------------------------------- +# _get_tmsid tests +# --------------------------------------------------------------------------- + +class TestGetTmsid: + @patch("integrations.shows.requests.get") + def test_tmsid_from_button(self, mock_get): + html = '' + mock_resp = MagicMock() + mock_resp.text = html + mock_get.return_value = mock_resp + + result = _get_tmsid("/show/test/") + assert result == "SH12345" + + @patch("integrations.shows.requests.get") + def test_tmsid_from_regex_fallback(self, mock_get): + html = "" + mock_resp = MagicMock() + mock_resp.text = html + mock_get.return_value = mock_resp + + result = _get_tmsid("/show/test/") + assert result == "EP98765" + + @patch("integrations.shows.requests.get") + def test_tmsid_not_found(self, mock_get): + html = "

No tmsid here

" + mock_resp = MagicMock() + mock_resp.text = html + mock_get.return_value = mock_resp + + result = _get_tmsid("/show/test/") + assert result is None + + +# --------------------------------------------------------------------------- +# _scrape_episodes tests +# --------------------------------------------------------------------------- + +class TestScrapeEpisodes: + @patch("integrations.shows._get_tmsid", return_value="SH12345") + @patch("integrations.shows.requests.get") + def test_future_episodes_returned(self, mock_get, mock_tmsid): + html = """ + +
+ +

Future Episode

+

S01E01

+
+
+ +

Past Episode

+

S01E02

+
+ + """ + mock_resp = MagicMock() + mock_resp.text = html + mock_get.return_value = mock_resp + + episodes = _scrape_episodes("/show/test/") + assert len(episodes) == 1 + assert episodes[0]["title"] == "Future Episode" + assert episodes[0]["season_episode"] == "S01E01" + assert episodes[0]["date"] == "20991231" + + @patch("integrations.shows._get_tmsid", return_value=None) + def test_no_tmsid_returns_empty(self, mock_tmsid): + episodes = _scrape_episodes("/show/test/") + assert episodes == [] + + @patch("integrations.shows._get_tmsid", return_value="SH12345") + @patch("integrations.shows.requests.get") + def test_invalid_date_skipped(self, mock_get, mock_tmsid): + html = """ + +
+ +

Bad Date Episode

+

S01E01

+
+ + """ + mock_resp = MagicMock() + mock_resp.text = html + mock_get.return_value = mock_resp + + episodes = _scrape_episodes("/show/test/") + assert episodes == [] + + @patch("integrations.shows._get_tmsid", return_value="SH12345") + @patch("integrations.shows.requests.get") + def test_empty_page(self, mock_get, mock_tmsid): + mock_resp = MagicMock() + mock_resp.text = "" + mock_get.return_value = mock_resp + + episodes = _scrape_episodes("/show/test/") + assert episodes == [] + + +# --------------------------------------------------------------------------- +# ShowsCalendar.fetch_events tests +# --------------------------------------------------------------------------- + +MOCK_SHOWS_DATA = [ + ["Breaking Bad", "AMC", "20250615", "https://img.com/amc.png", "https://img.com/bb.jpg", "Drama", "/show/breaking-bad/"], + ["The Office", "NBC", "20250615", "https://img.com/nbc.png", "https://img.com/office.jpg", "Comedy", "/show/the-office/"], + ["Friends", "NBC", "20250616", "https://img.com/nbc.png", "https://img.com/friends.jpg", "Comedy", "/show/friends/"], +] + + +class TestShowsCalendarFetchEvents: + @patch("integrations.shows._scrape_shows", return_value=MOCK_SHOWS_DATA) + def test_platform_mode(self, mock_scrape): + cal = ShowsCalendar(name="Shows", id="shows", icon="", events=[]) + events = cal.fetch_events(mode="platform", slug="nbc") + assert len(events) == 2 + assert all("NBC" in e.description for e in events) + titles = {e.title for e in events} + assert "The Office" in titles + assert "Friends" in titles + + @patch("integrations.shows._scrape_shows", return_value=MOCK_SHOWS_DATA) + def test_platform_not_found(self, mock_scrape): + cal = ShowsCalendar(name="Shows", id="shows", icon="", events=[]) + with pytest.raises(HTTPException) as exc_info: + cal.fetch_events(mode="platform", slug="nonexistent") + assert exc_info.value.status_code == 404 + assert "Platform not found" in exc_info.value.detail + + @patch("integrations.shows._scrape_shows", return_value=MOCK_SHOWS_DATA) + def test_genre_mode(self, mock_scrape): + cal = ShowsCalendar(name="Shows", id="shows", icon="", events=[]) + events = cal.fetch_events(mode="genre", slug="comedy") + assert len(events) == 2 + assert all("Comedy" in e.description for e in events) + + @patch("integrations.shows._scrape_shows", return_value=MOCK_SHOWS_DATA) + def test_genre_not_found(self, mock_scrape): + cal = ShowsCalendar(name="Shows", id="shows", icon="", events=[]) + with pytest.raises(HTTPException) as exc_info: + cal.fetch_events(mode="genre", slug="scifi") + assert exc_info.value.status_code == 404 + assert "Genre not found" in exc_info.value.detail + + @patch("integrations.shows._scrape_episodes") + @patch("integrations.shows._scrape_shows", return_value=MOCK_SHOWS_DATA) + def test_show_mode(self, mock_scrape, mock_episodes): + mock_episodes.return_value = [ + {"title": "Pilot", "season_episode": "S01E01", "date": "20250620"}, + {"title": "Second", "season_episode": "S01E02", "date": "20250627"}, + ] + cal = ShowsCalendar(name="Shows", id="shows", icon="", events=[]) + events = cal.fetch_events(mode="show", slug="breaking_bad") + assert len(events) == 2 + assert events[0].title == "S01E01 - Pilot" + assert "Breaking Bad" in events[0].description + mock_episodes.assert_called_once_with("/show/breaking-bad/") + + @patch("integrations.shows._scrape_shows", return_value=MOCK_SHOWS_DATA) + def test_show_not_found(self, mock_scrape): + cal = ShowsCalendar(name="Shows", id="shows", icon="", events=[]) + with pytest.raises(HTTPException) as 
exc_info: + cal.fetch_events(mode="show", slug="nonexistent_show") + assert exc_info.value.status_code == 404 + assert "Show not found" in exc_info.value.detail + + @patch("integrations.shows._scrape_shows", return_value=MOCK_SHOWS_DATA) + def test_invalid_mode(self, mock_scrape): + cal = ShowsCalendar(name="Shows", id="shows", icon="", events=[]) + with pytest.raises(HTTPException) as exc_info: + cal.fetch_events(mode="invalid", slug="test") + assert exc_info.value.status_code == 400 + assert "Invalid mode" in exc_info.value.detail + + @patch("integrations.shows._scrape_shows", side_effect=Exception("Connection failed")) + def test_scrape_failure_raises_500(self, mock_scrape): + cal = ShowsCalendar(name="Shows", id="shows", icon="", events=[]) + with pytest.raises(HTTPException) as exc_info: + cal.fetch_events(mode="platform", slug="nbc") + assert exc_info.value.status_code == 500 + + @patch("integrations.shows._scrape_shows", return_value=MOCK_SHOWS_DATA) + def test_platform_events_are_all_day(self, mock_scrape): + cal = ShowsCalendar(name="Shows", id="shows", icon="", events=[]) + events = cal.fetch_events(mode="platform", slug="amc") + assert len(events) == 1 + assert events[0].all_day is True + assert events[0].title == "Breaking Bad" + + @patch("integrations.shows._scrape_episodes", return_value=[]) + @patch("integrations.shows._scrape_shows", return_value=MOCK_SHOWS_DATA) + def test_show_mode_no_episodes(self, mock_scrape, mock_episodes): + cal = ShowsCalendar(name="Shows", id="shows", icon="", events=[]) + events = cal.fetch_events(mode="show", slug="breaking_bad") + assert events == [] diff --git a/tests/integrations/test_sportsdb.py b/tests/integrations/test_sportsdb.py new file mode 100644 index 0000000..faf4001 --- /dev/null +++ b/tests/integrations/test_sportsdb.py @@ -0,0 +1,102 @@ +import pytest +from unittest.mock import patch, MagicMock +from fastapi import HTTPException + +from integrations.sportsdb import SportsDbCalendar + + +class TestSportsDbCalendarFetchEvents: + @patch.dict("os.environ", {"SPORTSDB_API_KEY": "test-key"}) + @patch("integrations.sportsdb.requests.get") + def test_league_mode_success(self, mock_get): + mock_response = MagicMock() + mock_response.status_code = 200 + mock_response.raise_for_status = MagicMock() + mock_response.json.return_value = { + "events": [ + { + "idEvent": "1001", + "strEvent": "Team A vs Team B", + "strTimestamp": "2025-06-15T18:00:00+00:00", + }, + { + "idEvent": "1002", + "strEvent": "Team C vs Team D", + "strTimestamp": "2025-06-16T20:00:00+00:00", + }, + ] + } + mock_get.return_value = mock_response + + cal = SportsDbCalendar(name="SportsDB", id="sportsdb", icon="", events=[]) + events = cal.fetch_events(mode="league", id="4328") + assert len(events) == 2 + assert events[0].title == "Team A vs Team B" + assert events[0].uid == "1001" + + @patch.dict("os.environ", {"SPORTSDB_API_KEY": "test-key"}) + @patch("integrations.sportsdb.requests.get") + def test_team_mode_success(self, mock_get): + mock_response = MagicMock() + mock_response.status_code = 200 + mock_response.raise_for_status = MagicMock() + mock_response.json.return_value = { + "events": [ + { + "idEvent": "2001", + "strEvent": "My Team vs Rival", + "strTimestamp": "2025-06-20T19:00:00+00:00", + }, + ] + } + mock_get.return_value = mock_response + + cal = SportsDbCalendar(name="SportsDB", id="sportsdb", icon="", events=[]) + events = cal.fetch_events(mode="team", id="133602") + assert len(events) == 1 + + @patch.dict("os.environ", {"SPORTSDB_API_KEY": "test-key"}) + 
@patch("integrations.sportsdb.requests.get") + def test_invalid_mode_raises(self, mock_get): + cal = SportsDbCalendar(name="SportsDB", id="sportsdb", icon="", events=[]) + with pytest.raises(HTTPException) as exc_info: + cal.fetch_events(mode="invalid", id="123") + assert exc_info.value.status_code == 400 + + @patch.dict("os.environ", {}, clear=True) + def test_missing_api_key_raises(self): + cal = SportsDbCalendar(name="SportsDB", id="sportsdb", icon="", events=[]) + with pytest.raises(HTTPException) as exc_info: + cal.fetch_events(mode="league", id="4328") + assert exc_info.value.status_code == 500 + + @patch.dict("os.environ", {"SPORTSDB_API_KEY": "test-key"}) + @patch("integrations.sportsdb.requests.get") + def test_null_events_returns_empty(self, mock_get): + mock_response = MagicMock() + mock_response.status_code = 200 + mock_response.raise_for_status = MagicMock() + mock_response.json.return_value = {"events": None} + mock_get.return_value = mock_response + + cal = SportsDbCalendar(name="SportsDB", id="sportsdb", icon="", events=[]) + events = cal.fetch_events(mode="league", id="4328") + assert events == [] + + @patch.dict("os.environ", {"SPORTSDB_API_KEY": "test-key"}) + @patch("integrations.sportsdb.requests.get") + def test_malformed_event_skipped(self, mock_get): + mock_response = MagicMock() + mock_response.status_code = 200 + mock_response.raise_for_status = MagicMock() + mock_response.json.return_value = { + "events": [ + {"idEvent": "1", "strEvent": "Good", "strTimestamp": "2025-06-15T18:00:00+00:00"}, + {"idEvent": "2", "strEvent": "Bad", "strTimestamp": "not-a-date"}, + ] + } + mock_get.return_value = mock_response + + cal = SportsDbCalendar(name="SportsDB", id="sportsdb", icon="", events=[]) + events = cal.fetch_events(mode="league", id="4328") + assert len(events) == 1 diff --git a/tests/integrations/test_thetvdb.py b/tests/integrations/test_thetvdb.py new file mode 100644 index 0000000..e17b3f4 --- /dev/null +++ b/tests/integrations/test_thetvdb.py @@ -0,0 +1,214 @@ +import pytest +from unittest.mock import patch, MagicMock +from fastapi import HTTPException + +from integrations.thetvdb import TheTvDbCalendar, _invalidate_token + + +@pytest.fixture(autouse=True) +def clear_token_cache(): + """Clear the module-level cached token before each test.""" + _invalidate_token() + yield + _invalidate_token() + + +class TestTheTvDbCalendarFetchEvents: + @patch.dict("os.environ", {"THE_TVDB_API_KEY": "key"}) + @patch("integrations.thetvdb.requests.get") + @patch("integrations.thetvdb._login_to_thetvdb", return_value="fresh-token") + def test_success(self, mock_login, mock_get): + mock_resp = MagicMock() + mock_resp.status_code = 200 + mock_resp.raise_for_status = MagicMock() + mock_resp.json.return_value = { + "status": "success", + "data": { + "series": {"name": "Breaking Bad"}, + "episodes": [ + { + "id": 101, + "name": "Pilot", + "aired": "2008-01-20", + "seasonNumber": 1, + "number": 1, + "overview": "Walter White begins.", + }, + { + "id": 102, + "name": "Cat's in the Bag...", + "aired": "2008-01-27", + "seasonNumber": 1, + "number": 2, + "overview": "Aftermath.", + }, + ], + }, + } + mock_get.return_value = mock_resp + + cal = TheTvDbCalendar(name="TheTVDB", id="thetvdb", icon="", events=[]) + events = cal.fetch_events(series_id=81189) + assert len(events) == 2 + assert "Breaking Bad S01E01: Pilot" == events[0].title + assert events[0].all_day is True + assert events[0].uid == "101" + assert events[0].description == "Walter White begins." 
+ mock_login.assert_called_once_with("key") + + @patch.dict("os.environ", {"THE_TVDB_API_KEY": "key"}) + @patch("integrations.thetvdb.requests.get") + @patch("integrations.thetvdb._login_to_thetvdb", return_value="fresh-token") + def test_episode_without_season_number(self, mock_login, mock_get): + mock_resp = MagicMock() + mock_resp.status_code = 200 + mock_resp.raise_for_status = MagicMock() + mock_resp.json.return_value = { + "status": "success", + "data": { + "series": {"name": "Special"}, + "episodes": [ + {"id": 201, "name": "Bonus", "aired": "2025-01-01"}, + ], + }, + } + mock_get.return_value = mock_resp + + cal = TheTvDbCalendar(name="TheTVDB", id="thetvdb", icon="", events=[]) + events = cal.fetch_events(series_id=1) + assert len(events) == 1 + assert events[0].title == "Special: Bonus" + + @patch.dict("os.environ", {"THE_TVDB_API_KEY": "key"}) + @patch("integrations.thetvdb.requests.get") + @patch("integrations.thetvdb._login_to_thetvdb", return_value="fresh-token") + def test_series_not_found(self, mock_login, mock_get): + mock_resp = MagicMock() + mock_resp.status_code = 200 + mock_resp.raise_for_status = MagicMock() + mock_resp.json.return_value = {"status": "failure"} + mock_get.return_value = mock_resp + + cal = TheTvDbCalendar(name="TheTVDB", id="thetvdb", icon="", events=[]) + with pytest.raises(HTTPException) as exc_info: + cal.fetch_events(series_id=999999) + # Inner 404 HTTPException is caught by outer except Exception and re-raised as 500 + assert exc_info.value.status_code == 500 + assert "not found" in exc_info.value.detail.lower() + + @patch.dict("os.environ", {"THE_TVDB_API_KEY": "key"}) + @patch("integrations.thetvdb.requests.get") + @patch("integrations.thetvdb._login_to_thetvdb", return_value="fresh-token") + def test_no_episodes(self, mock_login, mock_get): + mock_resp = MagicMock() + mock_resp.status_code = 200 + mock_resp.raise_for_status = MagicMock() + mock_resp.json.return_value = { + "status": "success", + "data": {"series": {"name": "Empty"}, "episodes": []}, + } + mock_get.return_value = mock_resp + + cal = TheTvDbCalendar(name="TheTVDB", id="thetvdb", icon="", events=[]) + with pytest.raises(HTTPException) as exc_info: + cal.fetch_events(series_id=1) + # Inner 404 HTTPException is caught by outer except Exception and re-raised as 500 + assert exc_info.value.status_code == 500 + assert "no episodes" in exc_info.value.detail.lower() + + @patch.dict("os.environ", {}, clear=True) + def test_missing_credentials(self): + cal = TheTvDbCalendar(name="TheTVDB", id="thetvdb", icon="", events=[]) + with pytest.raises(HTTPException) as exc_info: + cal.fetch_events(series_id=1) + assert exc_info.value.status_code == 500 + assert "Missing TheTVDB credentials" in exc_info.value.detail + + @patch.dict("os.environ", {"THE_TVDB_API_KEY": "key"}) + @patch("integrations.thetvdb._login_to_thetvdb", return_value="fresh-token") + @patch("integrations.thetvdb.requests.get") + def test_request_failure_raises_502(self, mock_get, mock_login): + import requests + mock_get.side_effect = requests.RequestException("timeout") + + cal = TheTvDbCalendar(name="TheTVDB", id="thetvdb", icon="", events=[]) + with pytest.raises(HTTPException) as exc_info: + cal.fetch_events(series_id=1) + assert exc_info.value.status_code == 502 + + @patch.dict("os.environ", {"THE_TVDB_API_KEY": "key"}) + @patch("integrations.thetvdb.requests.get") + @patch("integrations.thetvdb._login_to_thetvdb", return_value="fresh-token") + def test_episode_without_aired_date_skipped(self, mock_login, mock_get): + 
mock_resp = MagicMock() + mock_resp.status_code = 200 + mock_resp.raise_for_status = MagicMock() + mock_resp.json.return_value = { + "status": "success", + "data": { + "series": {"name": "Show"}, + "episodes": [ + {"id": 1, "name": "Has Date", "aired": "2025-01-01", "seasonNumber": 1, "number": 1}, + {"id": 2, "name": "No Date", "aired": None}, + {"id": 3, "name": "Bad Date", "aired": "not-a-date"}, + ], + }, + } + mock_get.return_value = mock_resp + + cal = TheTvDbCalendar(name="TheTVDB", id="thetvdb", icon="", events=[]) + events = cal.fetch_events(series_id=1) + assert len(events) == 1 + assert events[0].title == "Show S01E01: Has Date" + + @patch.dict("os.environ", {"THE_TVDB_API_KEY": "key"}) + @patch("integrations.thetvdb.requests.get") + @patch("integrations.thetvdb._login_to_thetvdb", return_value="fresh-token") + def test_series_name_fallback(self, mock_login, mock_get): + mock_resp = MagicMock() + mock_resp.status_code = 200 + mock_resp.raise_for_status = MagicMock() + mock_resp.json.return_value = { + "status": "success", + "data": { + "series": {}, # No name + "episodes": [ + {"id": 1, "name": "Ep", "aired": "2025-01-01", "seasonNumber": 1, "number": 1}, + ], + }, + } + mock_get.return_value = mock_resp + + cal = TheTvDbCalendar(name="TheTVDB", id="thetvdb", icon="", events=[]) + events = cal.fetch_events(series_id=42) + assert "Series 42" in events[0].title + + @patch.dict("os.environ", {"THE_TVDB_API_KEY": "key"}) + @patch("integrations.thetvdb.requests.get") + @patch("integrations.thetvdb._login_to_thetvdb", return_value="refreshed-token") + def test_401_triggers_token_refresh_and_retry(self, mock_login, mock_get): + """On 401, invalidate token, re-login, and retry the request.""" + mock_401 = MagicMock() + mock_401.status_code = 401 + + mock_ok = MagicMock() + mock_ok.status_code = 200 + mock_ok.raise_for_status = MagicMock() + mock_ok.json.return_value = { + "status": "success", + "data": { + "series": {"name": "Retry Show"}, + "episodes": [ + {"id": 1, "name": "Ep1", "aired": "2025-06-01", "seasonNumber": 1, "number": 1}, + ], + }, + } + mock_get.side_effect = [mock_401, mock_ok] + + cal = TheTvDbCalendar(name="TheTVDB", id="thetvdb", icon="", events=[]) + events = cal.fetch_events(series_id=123) + assert len(events) == 1 + assert events[0].title == "Retry Show S01E01: Ep1" + # Login called twice: once for initial token, once for refresh + assert mock_login.call_count == 2 + assert mock_get.call_count == 2 diff --git a/tests/integrations/test_twitch.py b/tests/integrations/test_twitch.py new file mode 100644 index 0000000..8f38bbd --- /dev/null +++ b/tests/integrations/test_twitch.py @@ -0,0 +1,179 @@ +import pytest +from unittest.mock import patch, MagicMock +from fastapi import HTTPException + +from integrations.twitch import TwitchCalendar + + +class TestTwitchCalendarProperties: + @patch.dict("os.environ", {"TWITCH_CLIENT_ID": "test-id"}) + def test_client_id_from_env(self): + cal = TwitchCalendar(name="Twitch", id="twitch", icon="", events=[]) + assert cal.CLIENT_ID == "test-id" + + @patch.dict("os.environ", {}, clear=True) + def test_client_id_missing_raises(self): + cal = TwitchCalendar(name="Twitch", id="twitch", icon="", events=[]) + with pytest.raises(ValueError, match="TWITCH_CLIENT_ID"): + _ = cal.CLIENT_ID + + @patch.dict("os.environ", {"TWITCH_CLIENT_SECRET": "test-secret"}) + def test_client_secret_from_env(self): + cal = TwitchCalendar(name="Twitch", id="twitch", icon="", events=[]) + assert cal.CLIENT_SECRET == "test-secret" + + @patch.dict("os.environ", 
{}, clear=True) + def test_client_secret_missing_raises(self): + cal = TwitchCalendar(name="Twitch", id="twitch", icon="", events=[]) + with pytest.raises(ValueError, match="TWITCH_CLIENT_SECRET"): + _ = cal.CLIENT_SECRET + + +class TestTwitchCalendarFetchEvents: + @patch.dict("os.environ", {"TWITCH_CLIENT_ID": "id", "TWITCH_CLIENT_SECRET": "secret"}) + @patch("integrations.twitch.requests.get") + @patch("integrations.twitch.requests.post") + def test_success(self, mock_post, mock_get): + # Mock token request + token_resp = MagicMock() + token_resp.status_code = 200 + token_resp.json.return_value = {"access_token": "test-token"} + mock_post.return_value = token_resp + + # Mock user lookup + user_resp = MagicMock() + user_resp.status_code = 200 + user_resp.json.return_value = {"data": [{"id": "12345"}]} + + # Mock schedule + schedule_resp = MagicMock() + schedule_resp.status_code = 200 + schedule_resp.json.return_value = { + "data": { + "segments": [ + { + "id": "seg1", + "title": "Morning Stream", + "start_time": "2025-06-15T10:00:00Z", + "end_time": "2025-06-15T14:00:00Z", + } + ] + } + } + + mock_get.side_effect = [user_resp, schedule_resp] + + cal = TwitchCalendar(name="Twitch", id="twitch", icon="", events=[]) + events = cal.fetch_events(streamer_name="teststreamer") + assert len(events) == 1 + assert events[0].title == "Morning Stream" + assert events[0].uid == "twitch-teststreamer-seg1" + assert "twitch.tv/teststreamer" in events[0].location + + @patch.dict("os.environ", {"TWITCH_CLIENT_ID": "id", "TWITCH_CLIENT_SECRET": "secret"}) + @patch("integrations.twitch.requests.get") + @patch("integrations.twitch.requests.post") + def test_auth_failure(self, mock_post, mock_get): + token_resp = MagicMock() + token_resp.status_code = 401 + mock_post.return_value = token_resp + + cal = TwitchCalendar(name="Twitch", id="twitch", icon="", events=[]) + with pytest.raises(HTTPException) as exc_info: + cal.fetch_events(streamer_name="test") + assert exc_info.value.status_code == 500 + assert "authenticate" in exc_info.value.detail + + @patch.dict("os.environ", {"TWITCH_CLIENT_ID": "id", "TWITCH_CLIENT_SECRET": "secret"}) + @patch("integrations.twitch.requests.get") + @patch("integrations.twitch.requests.post") + def test_user_not_found(self, mock_post, mock_get): + token_resp = MagicMock() + token_resp.status_code = 200 + token_resp.json.return_value = {"access_token": "tok"} + mock_post.return_value = token_resp + + user_resp = MagicMock() + user_resp.status_code = 404 + mock_get.return_value = user_resp + + cal = TwitchCalendar(name="Twitch", id="twitch", icon="", events=[]) + with pytest.raises(HTTPException) as exc_info: + cal.fetch_events(streamer_name="nonexistent") + assert exc_info.value.status_code == 404 + + @patch.dict("os.environ", {"TWITCH_CLIENT_ID": "id", "TWITCH_CLIENT_SECRET": "secret"}) + @patch("integrations.twitch.requests.get") + @patch("integrations.twitch.requests.post") + def test_user_empty_data(self, mock_post, mock_get): + token_resp = MagicMock() + token_resp.status_code = 200 + token_resp.json.return_value = {"access_token": "tok"} + mock_post.return_value = token_resp + + user_resp = MagicMock() + user_resp.status_code = 200 + user_resp.json.return_value = {"data": []} + mock_get.return_value = user_resp + + cal = TwitchCalendar(name="Twitch", id="twitch", icon="", events=[]) + with pytest.raises(HTTPException) as exc_info: + cal.fetch_events(streamer_name="ghost") + assert exc_info.value.status_code == 404 + + @patch.dict("os.environ", {"TWITCH_CLIENT_ID": "id", 
"TWITCH_CLIENT_SECRET": "secret"}) + @patch("integrations.twitch.requests.get") + @patch("integrations.twitch.requests.post") + def test_schedule_404_returns_empty(self, mock_post, mock_get): + token_resp = MagicMock() + token_resp.status_code = 200 + token_resp.json.return_value = {"access_token": "tok"} + mock_post.return_value = token_resp + + user_resp = MagicMock() + user_resp.status_code = 200 + user_resp.json.return_value = {"data": [{"id": "123"}]} + + schedule_resp = MagicMock() + schedule_resp.status_code = 404 + schedule_resp.json.return_value = {"data": {"segments": []}} + + mock_get.side_effect = [user_resp, schedule_resp] + + cal = TwitchCalendar(name="Twitch", id="twitch", icon="", events=[]) + events = cal.fetch_events(streamer_name="noSchedule") + assert events == [] + + @patch.dict("os.environ", {"TWITCH_CLIENT_ID": "id", "TWITCH_CLIENT_SECRET": "secret"}) + @patch("integrations.twitch.requests.get") + @patch("integrations.twitch.requests.post") + def test_schedule_null_response(self, mock_post, mock_get): + token_resp = MagicMock() + token_resp.status_code = 200 + token_resp.json.return_value = {"access_token": "tok"} + mock_post.return_value = token_resp + + user_resp = MagicMock() + user_resp.status_code = 200 + user_resp.json.return_value = {"data": [{"id": "123"}]} + + schedule_resp = MagicMock() + schedule_resp.status_code = 200 + schedule_resp.json.return_value = None + + mock_get.side_effect = [user_resp, schedule_resp] + + cal = TwitchCalendar(name="Twitch", id="twitch", icon="", events=[]) + events = cal.fetch_events(streamer_name="null_schedule") + assert events == [] + + @patch.dict("os.environ", {"TWITCH_CLIENT_ID": "id", "TWITCH_CLIENT_SECRET": "secret"}) + @patch("integrations.twitch.requests.post") + def test_token_request_exception(self, mock_post): + import requests + mock_post.side_effect = requests.RequestException("network error") + + cal = TwitchCalendar(name="Twitch", id="twitch", icon="", events=[]) + with pytest.raises(HTTPException) as exc_info: + cal.fetch_events(streamer_name="test") + assert exc_info.value.status_code == 500 diff --git a/tests/integrations/test_weather.py b/tests/integrations/test_weather.py new file mode 100644 index 0000000..4b88901 --- /dev/null +++ b/tests/integrations/test_weather.py @@ -0,0 +1,146 @@ +import pytest +from unittest.mock import patch, MagicMock +from fastapi import HTTPException + +from integrations.weather import get_weather_emoji, DailyWeatherForecastCalendar + + +class TestGetWeatherEmoji: + def test_clear(self): + assert get_weather_emoji("Clear") == "\u2600\ufe0f" + + def test_clouds_few(self): + assert get_weather_emoji("Clouds", "few clouds") == "\U0001f324\ufe0f" + + def test_clouds_scattered(self): + assert get_weather_emoji("Clouds", "scattered clouds") == "\U0001f324\ufe0f" + + def test_clouds_broken(self): + assert get_weather_emoji("Clouds", "broken clouds") == "\u26c5" + + def test_clouds_overcast(self): + assert get_weather_emoji("Clouds", "overcast clouds") == "\u2601\ufe0f" + + def test_rain_light(self): + assert get_weather_emoji("Rain", "light rain") == "\U0001f326\ufe0f" + + def test_rain_heavy(self): + assert get_weather_emoji("Rain", "heavy rain") == "\U0001f327\ufe0f" + + def test_drizzle(self): + assert get_weather_emoji("Drizzle", "light drizzle") == "\U0001f326\ufe0f" + + def test_thunderstorm(self): + assert get_weather_emoji("Thunderstorm") == "\u26c8\ufe0f" + + def test_snow_light(self): + assert get_weather_emoji("Snow", "light snow") == "\U0001f328\ufe0f" + + def 
test_snow_heavy(self): + assert get_weather_emoji("Snow", "heavy snow") == "\u2744\ufe0f" + + def test_mist(self): + assert get_weather_emoji("Mist") == "\U0001f32b\ufe0f" + + def test_fog(self): + assert get_weather_emoji("Fog") == "\U0001f32b\ufe0f" + + def test_haze(self): + assert get_weather_emoji("Haze") == "\U0001f32b\ufe0f" + + def test_dust(self): + assert get_weather_emoji("Dust") == "\U0001f32a\ufe0f" + + def test_tornado(self): + assert get_weather_emoji("Tornado") == "\U0001f32a\ufe0f" + + def test_squall(self): + assert get_weather_emoji("Squall") == "\U0001f4a8" + + def test_ash(self): + assert get_weather_emoji("Ash") == "\U0001f30b" + + def test_unknown(self): + assert get_weather_emoji("Unknown Condition") == "\U0001f321\ufe0f" + + def test_case_insensitive(self): + # Function lowercases internally + assert get_weather_emoji("CLEAR") == "\u2600\ufe0f" + + +class TestDailyWeatherForecastCalendar: + def _make_forecast_response(self): + """Create a minimal valid OpenWeatherMap forecast response.""" + import time + base_ts = int(time.mktime((2025, 6, 15, 0, 0, 0, 0, 0, 0))) + forecasts = [] + for i in range(8): # 8 x 3hr = 1 day + forecasts.append({ + "dt": base_ts + i * 3600 * 3, + "main": {"temp": 20 + i, "humidity": 50, "pressure": 1013}, + "wind": {"speed": 3.5, "deg": 180}, + "clouds": {"all": 40}, + "weather": [{"main": "Clear", "description": "clear sky"}], + }) + return {"cod": "200", "list": forecasts} + + @patch.dict("os.environ", {"OPENWEATHERMAP_API_KEY": "test-key"}) + @patch("integrations.weather.requests.get") + def test_fetch_events_success(self, mock_get): + geocode_resp = MagicMock() + geocode_resp.status_code = 200 + geocode_resp.json.return_value = [{"lat": 40.7, "lon": -74.0, "name": "New York", "country": "US"}] + geocode_resp.raise_for_status = MagicMock() + + forecast_resp = MagicMock() + forecast_resp.status_code = 200 + forecast_resp.json.return_value = self._make_forecast_response() + forecast_resp.raise_for_status = MagicMock() + + mock_get.side_effect = [geocode_resp, forecast_resp] + + cal = DailyWeatherForecastCalendar(name="Weather", id="weather", icon="", events=[]) + events = cal.fetch_events(location="New York", days=1) + assert len(events) == 1 + assert "New York" in events[0].title + assert events[0].all_day is True + assert events[0].location == "New York" + + @patch("integrations.weather.requests.get") + def test_empty_location_raises(self, mock_get): + cal = DailyWeatherForecastCalendar(name="Weather", id="weather", icon="", events=[]) + with pytest.raises(HTTPException) as exc_info: + cal.fetch_events(location="") + assert exc_info.value.status_code == 400 + + @patch("integrations.weather.requests.get") + def test_missing_api_key_raises(self, mock_get): + cal = DailyWeatherForecastCalendar(name="Weather", id="weather", icon="", events=[]) + with pytest.raises(HTTPException) as exc_info: + with patch.dict("os.environ", {}, clear=True): + cal.fetch_events(location="London", api_key="") + assert exc_info.value.status_code == 500 + + @patch("integrations.weather.requests.get") + def test_location_not_found(self, mock_get): + geocode_resp = MagicMock() + geocode_resp.status_code = 200 + geocode_resp.json.return_value = [] + geocode_resp.raise_for_status = MagicMock() + mock_get.return_value = geocode_resp + + cal = DailyWeatherForecastCalendar(name="Weather", id="weather", icon="", events=[]) + with pytest.raises(HTTPException) as exc_info: + cal.fetch_events(location="xyznonexistent", api_key="test-key") + assert exc_info.value.status_code 
== 404 + + @patch("integrations.weather.requests.get") + def test_invalid_api_key(self, mock_get): + geocode_resp = MagicMock() + geocode_resp.status_code = 401 + mock_get.return_value = geocode_resp + + cal = DailyWeatherForecastCalendar(name="Weather", id="weather", icon="", events=[]) + with pytest.raises(HTTPException) as exc_info: + cal.fetch_events(location="London", api_key="bad-key") + assert exc_info.value.status_code == 401 diff --git a/tests/integrations/test_weather_geocode.py b/tests/integrations/test_weather_geocode.py new file mode 100644 index 0000000..ca5c265 --- /dev/null +++ b/tests/integrations/test_weather_geocode.py @@ -0,0 +1,107 @@ +import pytest +from unittest.mock import patch, MagicMock +from fastapi import HTTPException + + +class TestGeocodeCities: + """Test geocode_cities function directly via import.""" + + @patch.dict("os.environ", {"OPENWEATHERMAP_API_KEY": "test-key"}) + @patch("integrations.weather_geocode.requests.get") + def test_basic_search(self, mock_get): + mock_response = MagicMock() + mock_response.status_code = 200 + mock_response.raise_for_status = MagicMock() + mock_response.json.return_value = [ + {"name": "London", "state": "", "country": "GB", "lat": 51.5, "lon": -0.1} + ] + mock_get.return_value = mock_response + + from integrations.weather_geocode import geocode_cities + results = geocode_cities(q="London", limit=5) + assert len(results) == 1 + assert results[0]["name"] == "London" + assert results[0]["country"] == "GB" + assert "displayName" in results[0] + assert "locationForWeather" in results[0] + + @patch.dict("os.environ", {"OPENWEATHERMAP_API_KEY": "test-key"}) + @patch("integrations.weather_geocode.requests.get") + def test_no_results(self, mock_get): + mock_response = MagicMock() + mock_response.status_code = 200 + mock_response.raise_for_status = MagicMock() + mock_response.json.return_value = [] + mock_get.return_value = mock_response + + from integrations.weather_geocode import geocode_cities + results = geocode_cities(q="xyznonexistent", limit=5) + assert results == [] + + @patch.dict("os.environ", {}, clear=True) + def test_missing_api_key_raises(self): + from integrations.weather_geocode import geocode_cities + with pytest.raises(HTTPException) as exc_info: + geocode_cities(q="London", limit=5) + assert exc_info.value.status_code == 500 + + @patch.dict("os.environ", {"OPENWEATHERMAP_API_KEY": "test-key"}) + @patch("integrations.weather_geocode.requests.get") + def test_api_401_raises(self, mock_get): + mock_response = MagicMock() + mock_response.status_code = 401 + mock_get.return_value = mock_response + + from integrations.weather_geocode import geocode_cities + with pytest.raises(HTTPException) as exc_info: + geocode_cities(q="London", limit=5) + assert exc_info.value.status_code == 401 + + @patch.dict("os.environ", {"OPENWEATHERMAP_API_KEY": "test-key"}) + @patch("integrations.weather_geocode.requests.get") + def test_deduplication(self, mock_get): + mock_response = MagicMock() + mock_response.status_code = 200 + mock_response.raise_for_status = MagicMock() + mock_response.json.return_value = [ + {"name": "New York", "state": "New York", "country": "US", "lat": 40.7, "lon": -74.0}, + {"name": "New York", "state": "New York", "country": "US", "lat": 40.7, "lon": -74.0}, + ] + mock_get.return_value = mock_response + + from integrations.weather_geocode import geocode_cities + results = geocode_cities(q="New York", limit=5) + assert len(results) == 1 + + @patch.dict("os.environ", {"OPENWEATHERMAP_API_KEY": "test-key"}) + 
@patch("integrations.weather_geocode.requests.get") + def test_single_word_query_multiple_searches(self, mock_get): + """Single-word queries like 'new' should trigger multiple search patterns.""" + mock_response = MagicMock() + mock_response.status_code = 200 + mock_response.raise_for_status = MagicMock() + mock_response.json.return_value = [ + {"name": "Newark", "state": "New Jersey", "country": "US", "lat": 40.7, "lon": -74.2}, + ] + mock_get.return_value = mock_response + + from integrations.weather_geocode import geocode_cities + results = geocode_cities(q="new", limit=5) + # Should have made multiple API calls for single-word query + assert mock_get.call_count >= 2 + + @patch.dict("os.environ", {"OPENWEATHERMAP_API_KEY": "test-key"}) + @patch("integrations.weather_geocode.requests.get") + def test_display_name_with_state(self, mock_get): + mock_response = MagicMock() + mock_response.status_code = 200 + mock_response.raise_for_status = MagicMock() + mock_response.json.return_value = [ + {"name": "Springfield", "state": "Illinois", "country": "US", "lat": 39.8, "lon": -89.6}, + ] + mock_get.return_value = mock_response + + from integrations.weather_geocode import geocode_cities + results = geocode_cities(q="Springfield", limit=5) + assert len(results) == 1 + assert results[0]["displayName"] == "Springfield, Illinois, US" diff --git a/tests/integrations/test_wwe.py b/tests/integrations/test_wwe.py new file mode 100644 index 0000000..d1fdac5 --- /dev/null +++ b/tests/integrations/test_wwe.py @@ -0,0 +1,108 @@ +import pytest +from datetime import datetime +from unittest.mock import patch, MagicMock + +from integrations.wwe import parse_wwe_datetime, WweCalendar + + +class TestParseWweDatetime: + def test_pm_time(self): + result = parse_wwe_datetime("Sat, Jun 15", "7:30 PM") + assert result.hour == 19 + assert result.minute == 30 + assert result.month == 6 + assert result.day == 15 + + def test_am_time(self): + result = parse_wwe_datetime("Mon, Jan 6", "10:00 AM") + assert result.hour == 10 + assert result.minute == 0 + + def test_noon(self): + result = parse_wwe_datetime("Tue, Mar 4", "12:00 PM") + assert result.hour == 12 + + def test_midnight(self): + result = parse_wwe_datetime("Wed, Feb 5", "12:00 AM") + assert result.hour == 0 + + def test_uses_current_year(self): + result = parse_wwe_datetime("Thu, Jul 10", "8:00 PM") + assert result.year == datetime.now().year + + def test_invalid_date_format_raises(self): + with pytest.raises(ValueError): + parse_wwe_datetime("Invalid", "8:00 PM") + + def test_invalid_time_format_raises(self): + with pytest.raises(ValueError): + parse_wwe_datetime("Sat, Jun 15", "invalid") + + def test_invalid_month_raises(self): + with pytest.raises(ValueError): + parse_wwe_datetime("Sat, Xyz 15", "8:00 PM") + + def test_invalid_time_parts_raises(self): + with pytest.raises(ValueError): + parse_wwe_datetime("Sat, Jun 15", "8 PM") + + def test_all_months(self): + months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun", + "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"] + for i, month in enumerate(months, 1): + result = parse_wwe_datetime(f"Mon, {month} 1", "8:00 PM") + assert result.month == i + + +class TestWweCalendarFetchEvents: + @patch("integrations.wwe.requests.get") + def test_success(self, mock_get): + mock_response = MagicMock() + mock_response.status_code = 200 + mock_response.json.return_value = [ + { + "type": "event", + "nid": "12345", + "title": "WrestleMania", + "date": f"Sat, Jun 15", + "time": "7:00 PM", + "teaser_title": "WrestleMania XL", + "location": 
"Philadelphia, PA", + "link": "/events/wrestlemania", + } + ] + mock_get.return_value = mock_response + + cal = WweCalendar(name="WWE", id="wwe", icon="", events=[]) + events = cal.fetch_events() + assert len(events) == 1 + assert events[0].title == "WrestleMania" + assert events[0].uid == "wwe-12345" + + @patch("integrations.wwe.requests.get") + def test_non_event_items_skipped(self, mock_get): + mock_response = MagicMock() + mock_response.status_code = 200 + mock_response.json.return_value = [ + {"type": "ad", "nid": "1", "title": "Ad"}, + { + "type": "event", "nid": "2", "title": "Raw", + "date": "Mon, Jan 6", "time": "8:00 PM", + "location": "NYC", + }, + ] + mock_get.return_value = mock_response + + cal = WweCalendar(name="WWE", id="wwe", icon="", events=[]) + events = cal.fetch_events() + assert len(events) == 1 + + @patch("integrations.wwe.requests.get") + def test_api_failure(self, mock_get): + mock_response = MagicMock() + mock_response.status_code = 500 + mock_get.return_value = mock_response + + cal = WweCalendar(name="WWE", id="wwe", icon="", events=[]) + with pytest.raises(Exception): + cal.fetch_events() diff --git a/tests/test_base.py b/tests/test_base.py new file mode 100644 index 0000000..9d2bbad --- /dev/null +++ b/tests/test_base.py @@ -0,0 +1,102 @@ +import pytest +from datetime import datetime +from base.models import Event +from base.calendar import CalendarBase +from base.integration import IntegrationBase + + +class TestEvent: + def test_create_event(self): + e = Event( + uid="1", title="Test", + start=datetime(2025, 1, 1), end=datetime(2025, 1, 2), + ) + assert e.uid == "1" + assert e.title == "Test" + assert e.all_day is False + assert e.description == "" + assert e.location == "" + assert e.extra == {} + + def test_event_with_all_fields(self): + e = Event( + uid="2", title="Full", + start=datetime(2025, 1, 1), end=datetime(2025, 1, 2), + all_day=True, description="Desc", location="NYC", + extra={"key": "value"}, + ) + assert e.all_day is True + assert e.description == "Desc" + assert e.location == "NYC" + assert e.extra == {"key": "value"} + + def test_event_extra_defaults_to_empty_dict(self): + e1 = Event(uid="a", title="A", start=datetime.now(), end=datetime.now()) + e2 = Event(uid="b", title="B", start=datetime.now(), end=datetime.now()) + # Ensure default dict isn't shared between instances + e1.extra["test"] = True + assert "test" not in e2.extra + + +class TestCalendarBase: + def test_init(self): + cal = CalendarBase(name="Test", id="test", icon="icon.png", events=[]) + assert cal.name == "Test" + assert cal.id == "test" + assert cal.icon == "icon.png" + assert cal.events == [] + + def test_fetch_events_not_implemented(self): + cal = CalendarBase(name="Test", id="test", icon="", events=[]) + with pytest.raises(NotImplementedError): + cal.fetch_events() + + +class TestIntegrationBase: + def test_init(self): + integ = IntegrationBase( + id="test", name="Test", description="Desc", + base_url="https://example.com", + calendar_class=CalendarBase, multi_calendar=False, + ) + assert integ.id == "test" + assert integ.name == "Test" + assert integ.description == "Desc" + assert integ.base_url == "https://example.com" + assert integ.calendar_class is CalendarBase + assert integ.multi_calendar is False + + def test_multi_calendar_default_false(self): + integ = IntegrationBase( + id="t", name="T", description="D", + base_url="https://example.com", + calendar_class=CalendarBase, + ) + assert integ.multi_calendar is False + + def 
test_master_csv_raises_for_single_calendar(self): + integ = IntegrationBase( + id="t", name="T", description="D", + base_url="https://example.com", + calendar_class=CalendarBase, multi_calendar=False, + ) + with pytest.raises(Exception, match="does not support multiple calendars"): + integ.master_csv() + + def test_master_csv_returns_none_for_multi_calendar(self): + integ = IntegrationBase( + id="t", name="T", description="D", + base_url="https://example.com", + calendar_class=CalendarBase, multi_calendar=True, + ) + # TODO stub returns None + assert integ.master_csv() is None + + def test_fetch_calendars_not_implemented(self): + integ = IntegrationBase( + id="t", name="T", description="D", + base_url="https://example.com", + calendar_class=CalendarBase, + ) + with pytest.raises(NotImplementedError): + integ.fetch_calendars() diff --git a/tests/test_routes.py b/tests/test_routes.py new file mode 100644 index 0000000..dd0de3b --- /dev/null +++ b/tests/test_routes.py @@ -0,0 +1,122 @@ +import pytest +from datetime import datetime +from unittest.mock import MagicMock +from fastapi import APIRouter +from fastapi.testclient import TestClient + +from base.models import Event +from base.calendar import CalendarBase +from base.integration import IntegrationBase +from base.routes import mount_integration_routes + + +class MockCalendar(CalendarBase): + def fetch_events(self, query: str = "default") -> list: + """Fetch mock events for testing.""" + return [ + Event( + uid="mock-1", + title=f"Mock Event ({query})", + start=datetime(2025, 6, 15, 10, 0), + end=datetime(2025, 6, 15, 11, 0), + all_day=False, + description="Test description", + location="Test Location", + ) + ] + + +class MockEmptyCalendar(CalendarBase): + def fetch_events(self) -> list: + """Fetch no events.""" + return [] + + +def _create_app_with_integration(calendar_class, integration_id="test", integration_name="Test"): + """Helper to create a minimal FastAPI app with one integration mounted.""" + from fastapi import FastAPI + app = FastAPI() + integration = IntegrationBase( + id=integration_id, + name=integration_name, + description="Test integration", + base_url="https://example.com", + calendar_class=calendar_class, + ) + router = APIRouter(prefix=f"/{integration_id}", tags=[integration_name]) + mount_integration_routes(router, integration) + app.include_router(router) + return app + + +class TestMountIntegrationRoutes: + def test_ics_response(self): + """Default ics=True should return ICS content.""" + try: + import httpx + except ImportError: + pytest.skip("httpx not installed (needed for TestClient)") + + app = _create_app_with_integration(MockCalendar) + client = TestClient(app) + resp = client.get("/test/events?query=hello") + assert resp.status_code == 200 + assert "BEGIN:VCALENDAR" in resp.text + assert "Mock Event (hello)" in resp.text + + def test_json_response(self): + """ics=false should return JSON events.""" + try: + import httpx + except ImportError: + pytest.skip("httpx not installed (needed for TestClient)") + + app = _create_app_with_integration(MockCalendar) + client = TestClient(app) + resp = client.get("/test/events?query=world&ics=false") + assert resp.status_code == 200 + data = resp.json() + assert len(data) == 1 + assert data[0]["title"] == "Mock Event (world)" + + def test_empty_events_ics(self): + """Empty events should still return valid ICS.""" + try: + import httpx + except ImportError: + pytest.skip("httpx not installed (needed for TestClient)") + + app = _create_app_with_integration(MockEmptyCalendar) 
+        client = TestClient(app)
+        resp = client.get("/test/events")
+        assert resp.status_code == 200
+        assert "BEGIN:VCALENDAR" in resp.text
+        assert "VEVENT" not in resp.text
+
+    def test_weather_calendar_name_includes_location(self):
+        """Weather integration should include location in calendar name."""
+        try:
+            import httpx
+        except ImportError:
+            pytest.skip("httpx not installed (needed for TestClient)")
+
+        class WeatherMockCalendar(CalendarBase):
+            def fetch_events(self) -> list:
+                return [
+                    Event(
+                        uid="w-1", title="Sunny",
+                        start=datetime(2025, 6, 15), end=datetime(2025, 6, 16),
+                        all_day=True, description="Nice day",
+                        location="New York",
+                    )
+                ]
+
+        app = _create_app_with_integration(
+            WeatherMockCalendar,
+            integration_id="daily-weather-forecast",
+            integration_name="Weather",
+        )
+        client = TestClient(app)
+        resp = client.get("/daily-weather-forecast/events")
+        assert resp.status_code == 200
+        assert "Weather - New York" in resp.text
diff --git a/tests/test_utils.py b/tests/test_utils.py
new file mode 100644
index 0000000..7cc3e1c
--- /dev/null
+++ b/tests/test_utils.py
@@ -0,0 +1,155 @@
+from utils import make_slug, generate_ics
+
+
+class TestMakeSlug:
+    def test_basic_text(self):
+        assert make_slug("Hello World") == "hello-world"
+
+    def test_special_characters_stripped(self):
+        result = make_slug("Rock & Roll!")
+        assert "!" not in result
+        assert "&" not in result
+
+    def test_empty_string(self):
+        assert make_slug("") == ""
+
+    def test_max_length(self):
+        result = make_slug("a" * 100, max_length=10)
+        assert len(result) <= 10
+
+    def test_max_length_no_trailing_hyphen(self):
+        # If truncation lands on a hyphen, it should be stripped
+        result = make_slug("hello world foo bar baz", max_length=11)
+        assert not result.endswith("-")
+
+    def test_leading_trailing_hyphens_stripped(self):
+        assert make_slug("--hello--") == "hello"
+
+    def test_multiple_spaces_collapsed(self):
+        assert make_slug("hello   world") == "hello-world"
+
+    def test_tabs_and_newlines(self):
+        result = make_slug("hello\tworld\nfoo")
+        assert "hello" in result
+        assert "world" in result
+
+    def test_already_slug(self):
+        assert make_slug("already-a-slug") == "already-a-slug"
+
+    def test_numbers(self):
+        assert make_slug("Test 123") == "test-123"
+
+    def test_none_like_empty(self):
+        assert make_slug("") == ""
+
+
+class TestGenerateIcs:
+    def test_basic_event(self):
+        events = [{"name": "Test", "begin": "2025-06-15T10:00:00Z", "end": "2025-06-15T11:00:00Z"}]
+        ics = generate_ics(events, "Test Calendar")
+        assert "BEGIN:VCALENDAR" in ics
+        assert "BEGIN:VEVENT" in ics
+        assert "SUMMARY:Test" in ics
+        assert "END:VEVENT" in ics
+        assert "END:VCALENDAR" in ics
+
+    def test_calendar_metadata(self):
+        ics = generate_ics([], "My Calendar")
+        assert "X-WR-CALNAME:My Calendar" in ics
+        assert "VERSION:2.0" in ics
+        assert "PRODID:" in ics
+
+    def test_all_day_event(self):
+        events = [{"name": "All Day", "begin": "2025-06-15", "all_day": True}]
+        ics = generate_ics(events, "Test")
+        assert "DTSTART;VALUE=DATE:20250615" in ics
+        assert "DTEND;VALUE=DATE:20250616" in ics
+
+    def test_empty_events(self):
+        ics = generate_ics([], "Empty Cal")
+        assert "BEGIN:VCALENDAR" in ics
+        assert "END:VCALENDAR" in ics
+        assert "VEVENT" not in ics
+
+    def test_event_without_begin_skipped(self):
+        events = [{"name": "No Date"}]
+        ics = generate_ics(events, "Test")
+        assert "VEVENT" not in ics
+
+    def test_special_characters_escaped(self):
+        events = [{"name": "Hello; World, Test", "begin": "2025-06-15T10:00:00Z"}]
+        ics = generate_ics(events, "Test")
+        # Semicolons and commas should be escaped in SUMMARY
+        assert "\\;" in ics
+        assert "\\," in ics
+
+    def test_calendar_description(self):
+        ics = generate_ics([], "Cal", calendar_description="My Calendar Desc")
+        assert "X-WR-CALDESC:My Calendar Desc" in ics
+
+    def test_event_with_all_fields(self):
+        events = [{
+            "name": "Full Event",
+            "begin": "2025-06-15T10:00:00Z",
+            "end": "2025-06-15T11:00:00Z",
+            "description": "Details here",
+            "location": "NYC",
+            "url": "https://example.com",
+            "status": "CONFIRMED",
+            "uid": "custom-uid-123",
+            "categories": ["Work", "Meeting"],
+        }]
+        ics = generate_ics(events, "Test")
+        assert "LOCATION:NYC" in ics
+        assert "URL:https://example.com" in ics
+        assert "STATUS:CONFIRMED" in ics
+        assert "UID:custom-uid-123" in ics
+        assert "CATEGORIES:Work,Meeting" in ics
+
+    def test_event_without_end_uses_begin(self):
+        events = [{"name": "No End", "begin": "2025-06-15T10:00:00Z"}]
+        ics = generate_ics(events, "Test")
+        assert "DTSTART:20250615T100000Z" in ics
+        assert "DTEND:20250615T100000Z" in ics
+
+    def test_line_folding_long_lines(self):
+        long_desc = "A" * 200
+        events = [{"name": "Test", "begin": "2025-06-15T10:00:00Z", "description": long_desc}]
+        ics = generate_ics(events, "Test")
+        # After folding, continuation lines start with a space
+        lines = ics.split("\r\n")
+        for line in lines:
+            # Lines should be <=75 chars or be continuation lines (start with space)
+            assert len(line) <= 75 or line.startswith(" ")
+
+    def test_crlf_line_endings(self):
+        ics = generate_ics([], "Test")
+        assert "\r\n" in ics
+
+    def test_multiple_events(self):
+        events = [
+            {"name": "Event 1", "begin": "2025-06-15T10:00:00Z"},
+            {"name": "Event 2", "begin": "2025-06-16T10:00:00Z"},
+        ]
+        ics = generate_ics(events, "Test")
+        assert ics.count("BEGIN:VEVENT") == 2
+        assert ics.count("END:VEVENT") == 2
+
+    def test_timezone_parameter(self):
+        ics = generate_ics([], "Test", timezone="America/New_York")
+        assert "X-WR-TIMEZONE:America/New_York" in ics
+
+    def test_date_only_begin(self):
+        events = [{"name": "Date Only", "begin": "2025-06-15"}]
+        ics = generate_ics(events, "Test")
+        assert "DTSTART:20250615T000000Z" in ics
+
+    def test_empty_categories_not_included(self):
+        events = [{"name": "Test", "begin": "2025-06-15T10:00:00Z", "categories": []}]
+        ics = generate_ics(events, "Test")
+        assert "CATEGORIES" not in ics
+
+    def test_none_categories_not_included(self):
+        events = [{"name": "Test", "begin": "2025-06-15T10:00:00Z", "categories": None}]
+        ics = generate_ics(events, "Test")
+        assert "CATEGORIES" not in ics
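
Note (a sketch, not part of the patch above): the four identical `try`/`except ImportError` guards in `tests/test_routes.py` could optionally be collapsed into a single module-level `pytest.importorskip` call, which skips every test in that module when `httpx` is unavailable. Nothing beyond the names already used in the patch is assumed here.

```python
# Optional simplification for the top of tests/test_routes.py (sketch only).
import pytest

# pytest.importorskip raises a Skipped outcome during collection when the
# module cannot be imported, so all tests in this file are skipped instead of
# repeating a try/except ImportError block inside each test.
pytest.importorskip("httpx")
```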