3 changes: 3 additions & 0 deletions wing-command/.eslintrc.json
@@ -0,0 +1,3 @@
{
"extends": "next/core-web-vitals"
}
51 changes: 51 additions & 0 deletions wing-command/.gitignore
@@ -0,0 +1,51 @@
# Dependencies
node_modules/
.pnp
.pnp.js

# Testing
coverage/

# Next.js
.next/
out/

# Production
build/

# Misc
.DS_Store
*.pem

# Debug
npm-debug.log*
yarn-debug.log*
yarn-error.log*

# Local env files
.env*.local
.env

# Vercel
.vercel

# TypeScript
*.tsbuildinfo
next-env.d.ts

# Python
__pycache__/
*.py[cod]
*$py.class
.Python
venv/
ENV/
.venv/

# IDE
.idea/
.vscode/

*.swp
*.swo
.claude/
137 changes: 137 additions & 0 deletions wing-command/README.md
@@ -0,0 +1,137 @@
# Wing Command

[**Live App**](https://wingscommand.up.railway.app/)

A hyper-local chicken wing price comparison tool for Super Bowl watch parties. Wing Command uses the TinyFish API to dispatch parallel web agents across DoorDash, UberEats, Grubhub, and Google, finding the best wing deals near any US zip code in real time.

## TinyFish API Usage

Wing Command fires four TinyFish agents simultaneously, using `Promise.allSettled` so a failure on one platform doesn't block results from the others:

```typescript
// lib/agentql.ts — Core TinyFish API call
async function runMinoScrape(url: string, goal: string, timeoutMs: number) {
  const response = await fetch(MINO_API_URL, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'x-api-key': MINO_API_KEY,
    },
    body: JSON.stringify({ url, goal }),
    signal: AbortSignal.timeout(timeoutMs),
  });
  if (!response.ok) {
    // Surface HTTP failures instead of reporting success unconditionally
    return { success: false, error: `Mino API returned ${response.status}` };
  }
  const data = await response.json();
  return { success: true, data: data.result };
}

// Parallel scraping across 4 platforms
export async function scrapeAllSources(
  zipCode: string,
  lat: number,
  lng: number,
  flavor: string,
  city: string,
  state: string
) {
  const results = await Promise.allSettled([
    withTimeout(scrapeGoogle(zipCode, city, state), 120000, []),
    withTimeout(scrapeDoorDash(zipCode, city, state), 120000, []),
    withTimeout(scrapeGrubhub(zipCode, city, state), 120000, []),
    withTimeout(scrapeUberEats(zipCode, city, state), 120000, []),
  ]);

  // Merge results — if one platform fails, others still return data
  const allRestaurants: Restaurant[] = [];
  results.forEach((result) => {
    if (result.status === 'fulfilled') {
      allRestaurants.push(...result.value);
    }
  });
  return deduplicateAndProcess(allRestaurants);
}
```
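
The `withTimeout` wrapper used above isn't part of this excerpt. A minimal sketch of its assumed semantics: it resolves to the supplied fallback value (here, an empty array) instead of rejecting when a scrape outlives its deadline, so `Promise.allSettled` always sees a fulfilled result.

```typescript
// Sketch of the withTimeout helper referenced above (not shown in this diff).
// Assumption: on timeout it resolves to the provided fallback rather than
// rejecting, which keeps one slow platform from poisoning the merged results.
function withTimeout<T>(promise: Promise<T>, ms: number, fallback: T): Promise<T> {
  const timer = new Promise<T>((resolve) => setTimeout(() => resolve(fallback), ms));
  return Promise.race([promise, timer]);
}
```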

Each platform scraper uses a natural language goal to extract structured JSON:

```typescript
// Example: DoorDash scraper goal
const goal = `Find chicken wings restaurants that deliver to zip code ${zipCode}.
Extract a JSON array of restaurants with: name, address, delivery_time,
rating, image_url, is_open, store_url. Return as JSON array called "restaurants".`;

const result = await runMinoScrape(searchUrl, goal, 120000);
```
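
The agent's output in `data.result` isn't strictly typed, so scrapers have to parse it defensively. A sketch of one way to do that; the `Restaurant` shape mirrors the fields named in the goal, but `parseRestaurants` itself is a hypothetical helper, not code from the repo:

```typescript
// Fields taken from the goal string above; all but name/address are optional
// because agents don't always recover every field.
interface Restaurant {
  name: string;
  address: string;
  delivery_time?: string;
  rating?: number;
  image_url?: string;
  is_open?: boolean;
  store_url?: string;
}

// Hypothetical helper: the agent may return the "restaurants" array directly,
// wrapped in an object, or as a JSON-encoded string; handle all three shapes.
function parseRestaurants(raw: unknown): Restaurant[] {
  try {
    const data = typeof raw === 'string' ? JSON.parse(raw) : raw;
    if (Array.isArray(data)) return data as Restaurant[];
    const list = (data as { restaurants?: unknown })?.restaurants;
    return Array.isArray(list) ? (list as Restaurant[]) : [];
  } catch {
    return []; // malformed agent output: treat as zero results
  }
}
```

A scraper can then merge safely: `const restaurants = result.success ? parseRestaurants(result.data) : [];`.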

## How to Run

### Prerequisites

- Node.js >= 18
- TinyFish API key ([sign up at tinyfish.ai](https://tinyfish.ai))
- Supabase project (free tier works)
- Upstash Redis (optional, recommended for caching)

### Setup

1. Install dependencies:
```bash
npm install
```

2. Create `.env.local`:
```env
# Required
AGENTQL_API_KEY=your_tinyfish_api_key
SUPABASE_SERVICE_ROLE_KEY=your_service_role_key
NEXT_PUBLIC_SUPABASE_URL=https://your-project.supabase.co
NEXT_PUBLIC_SUPABASE_ANON_KEY=your_anon_key

# Optional (caching)
UPSTASH_REDIS_REST_URL=https://your-redis.upstash.io
UPSTASH_REDIS_REST_TOKEN=your_redis_token
```

3. Apply the database schema: open your Supabase SQL editor and run `supabase/schema.sql`.

4. Start the app:
```bash
npm run dev
```

5. Open http://localhost:3000

## Architecture

```
User Browser
     |
     v
Next.js 14 (App Router)
     |
     |-- GET /api/scout?zip=94306 ---------> TinyFish API (parallel agents)
     |                                          |-- DoorDash search
     |                                          |-- UberEats search
     |                                          |-- Grubhub search
     |                                          |-- Google search
     |                                          v
     |                                       Merge + Deduplicate + Score
     |
     |-- GET /api/menu?spot_id=xxx --------> TinyFish API (per-restaurant)
     |                                          |-- Extract menu items + prices
     |                                          v
     |                                       Calculate $/wing
     |
     |-- GET /api/deals?spot_id=xxx -------> TinyFish API
     |                                          |-- Scan deal roundups (KCL, TODAY.com)
     |                                          v
     |                                       Match deals to restaurants
     |
     |-- Supabase (PostgreSQL) -----------> Persistence (wing_spots, menus)
     |-- Upstash Redis -------------------> Cache (15-min TTL, scouting locks)
```
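
The cache line at the bottom of the diagram corresponds to helpers in `lib/cache.ts`, which this diff doesn't include. A minimal sketch using `@upstash/redis`; the 15-minute TTL comes from the diagram, while the key names and the 5-minute lock TTL are assumptions:

```typescript
// Sketch of lib/cache.ts-style helpers (assumed names and key layout).
import { Redis } from '@upstash/redis';

// Reads UPSTASH_REDIS_REST_URL / UPSTASH_REDIS_REST_TOKEN from the env
const redis = Redis.fromEnv();

export async function getCachedScout(zip: string) {
  return redis.get<unknown>(`scout:${zip}`); // null on cache miss
}

export async function cacheScout(zip: string, data: unknown) {
  await redis.set(`scout:${zip}`, data, { ex: 15 * 60 }); // 15-min TTL
}

// Scouting lock: SET with NX succeeds only if the key doesn't exist yet, so
// concurrent requests launch at most one background scrape.
export async function setScoutingLock(zip: string): Promise<boolean> {
  const ok = await redis.set(`scout:lock:${zip}`, '1', { nx: true, ex: 5 * 60 });
  return ok === 'OK';
}
```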

## Tech Stack

- **Framework:** Next.js 14 (App Router), TypeScript, Tailwind CSS
- **Animations:** Framer Motion
- **Database:** Supabase (PostgreSQL)
- **Cache:** Upstash Redis
- **Web Agents:** TinyFish API (parallel scraping)
- **Geocoding:** Nominatim (OpenStreetMap, no API key needed)
- **Deployment:** Railway
166 changes: 166 additions & 0 deletions wing-command/app/api/deals/route.ts
@@ -0,0 +1,166 @@
// ===========================================
// Wing Scout — Super Bowl Deals API Endpoint
// Aggregator-first: check global deals cache → fuzzy match → fallback
// ===========================================

import { NextRequest, NextResponse } from 'next/server';
import { createServerClient } from '@/lib/supabase';
import {
  getCachedDeals,
  cacheDeals,
  getCachedAggregatorDeals,
  setAggregatorScoutingLock,
  isAggregatorScoutingInProgress,
  setDealsScoutingLock,
  isDealsScoutingInProgress,
} from '@/lib/cache';
import {
  startBackgroundAggregatorScrape,
  startBackgroundDealsScrape,
  matchDealsToSpot,
} from '@/lib/deals';
import { DealsResponse } from '@/lib/types';

export const runtime = 'nodejs';
export const maxDuration = 300; // 5 minutes — Railway has no limit, but set generous max

export async function GET(request: NextRequest) {
  const searchParams = request.nextUrl.searchParams;
  const spotId = searchParams.get('spot_id');
  const isPoll = searchParams.get('poll') === 'true';

  if (!spotId) {
    return NextResponse.json<DealsResponse>(
      { success: false, deals: [], cached: false, message: 'spot_id is required' },
      { status: 400 }
    );
  }

  try {
    // ===========================================
    // Stage 1: Check per-spot Redis cache (30-min TTL)
    // ===========================================
    const cachedDeals = await getCachedDeals(spotId);
    if (cachedDeals) {
      console.log(`Deals cache hit for ${spotId}: ${cachedDeals.length} deals`);
      return NextResponse.json<DealsResponse>({
        success: true,
        deals: cachedDeals,
        cached: true,
        message: cachedDeals.length > 0
          ? `${cachedDeals.length} Super Bowl deal(s) (cached)`
          : 'No Super Bowl specials found (cached)',
      });
    }

    // ===========================================
    // Stage 2: Look up spot details from Supabase
    // ===========================================
    const supabase = createServerClient();
    const { data: spot, error: spotError } = await supabase
      .from('wing_spots')
      .select('name, address, platform_ids')
      .eq('id', spotId)
      .single();

    if (!spot || spotError) {
      console.log(`Deals: spot not found: ${spotId}`);
      return NextResponse.json<DealsResponse>(
        { success: false, deals: [], cached: false, message: 'Spot not found' },
        { status: 404 }
      );
    }

    // ===========================================
    // Stage 3: Check global aggregator cache → fuzzy match
    // ===========================================
    const aggregatorDeals = await getCachedAggregatorDeals();
    if (aggregatorDeals && aggregatorDeals.length > 0) {
      // Aggregator data exists — try to match this spot
      const matchedDeals = matchDealsToSpot(spot.name, aggregatorDeals);

      if (matchedDeals.length > 0) {
        // Chain match found — cache per-spot and return
        console.log(`Aggregator match for ${spotId} (${spot.name}): ${matchedDeals.length} deals`);
        await cacheDeals(spotId, matchedDeals);
        return NextResponse.json<DealsResponse>({
          success: true,
          deals: matchedDeals,
          cached: false,
          message: `${matchedDeals.length} Super Bowl deal(s) found`,
        });
      }

      // No aggregator match — this is likely a local restaurant.
      // Fall through to Stage 5 (website-only fallback) below.
      console.log(`No aggregator match for ${spotId} (${spot.name}) — trying website fallback`);
    }

    // ===========================================
    // Stage 4: Poll handling
    // ===========================================
    if (isPoll) {
      // Check if either aggregator or per-spot scouting is in progress
      const aggScouting = await isAggregatorScoutingInProgress();
      const spotScouting = await isDealsScoutingInProgress(spotId);
      const anyScouting = aggScouting || spotScouting;

      return NextResponse.json<DealsResponse>({
        success: false,
        deals: [],
        cached: false,
        scouting: anyScouting,
        message: anyScouting
          ? 'Still scouting Super Bowl deals...'
          : 'No Super Bowl specials found',
      });
    }

    // ===========================================
    // Stage 5: Trigger background scrapes
    // ===========================================

    // If no aggregator cache at all → trigger global aggregator scrape
    if (!aggregatorDeals) {
      const gotAggLock = await setAggregatorScoutingLock();
      if (gotAggLock) {
        console.log('Launching background aggregator scrape (first request)');
        startBackgroundAggregatorScrape();
      } else {
        console.log('Aggregator scrape already in progress');
      }

      return NextResponse.json<DealsResponse>({
        success: false,
        deals: [],
        cached: false,
        scouting: true,
        message: 'Scouting Super Bowl deals...',
      });
    }

    // Aggregator cache exists but no match (local restaurant)
    // → trigger website-only fallback for this specific spot
    const gotSpotLock = await setDealsScoutingLock(spotId);
    if (gotSpotLock) {
      console.log(`Launching website-only fallback for ${spotId}: ${spot.name}`);
      startBackgroundDealsScrape(spotId, spot.name, spot.address, spot.platform_ids);
    } else {
      console.log(`Website fallback already in progress for ${spotId}`);
    }

    return NextResponse.json<DealsResponse>({
      success: false,
      deals: [],
      cached: false,
      scouting: true,
      message: 'Scouting website for deals...',
    });
  } catch (error) {
    console.error('Deals API error:', error);
    return NextResponse.json<DealsResponse>(
      { success: false, deals: [], cached: false, message: 'Failed to fetch deals' },
      { status: 500 }
    );
  }
}
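
Stage 3 above depends on `matchDealsToSpot` from `lib/deals.ts`, which this diff doesn't include. A minimal sketch of one plausible implementation; the `AggregatorDeal` shape and the normalized-substring strategy are assumptions, not the repo's actual matching logic:

```typescript
// Hypothetical deal shape: aggregator roundups (KCL, TODAY.com) are assumed
// to yield a chain name plus a free-text deal description.
interface AggregatorDeal {
  restaurant_name: string;
  description: string;
  source: string;
}

// Strip punctuation and case so "Buffalo Wild Wings #1234" can match
// an aggregator entry for "Buffalo Wild Wings".
function normalize(name: string): string {
  return name.toLowerCase().replace(/[^a-z0-9 ]/g, ' ').replace(/\s+/g, ' ').trim();
}

export function matchDealsToSpot(
  spotName: string,
  deals: AggregatorDeal[]
): AggregatorDeal[] {
  const spot = normalize(spotName);
  return deals.filter((deal) => {
    const chain = normalize(deal.restaurant_name);
    return chain.length > 0 && (spot.includes(chain) || chain.includes(spot));
  });
}
```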