
WaveGuard Python SDK

Anomaly detection powered by wave physics. Not machine learning.
One API call. Fully stateless. Works on any data type. Designed to minimize false alarms.

Benchmarks • Quickstart • Use Cases • Examples • MCP / Claude • API Reference


What is WaveGuard?

WaveGuard is a general-purpose anomaly detection API. Send it any data — server metrics, financial transactions, log files, sensor readings, time series — and get back anomaly scores, confidence levels, and explanations of which features triggered the alert.

No training pipelines. No model management. No state. One API call.

Your data  →  WaveGuard API (GPU)  →  Anomaly scores + explanations

Under the hood, it uses GPU-accelerated wave physics instead of machine learning. You don't need to know or care about the physics — it's all server-side.

How does it actually work?

Your data is encoded onto a 64³ lattice and run through coupled wave equation simulations on GPU. Normal data produces stable wave patterns; anomalies produce divergent ones. A 52-dimensional statistical fingerprint is compared between training and test data. Everything is torn down after each call — nothing is stored.
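The fingerprint-comparison idea can be illustrated with a toy sketch. The real 52-dimensional fingerprint and the wave simulation are server-side and not public, so the statistics and distance below are illustrative assumptions, not the actual algorithm:

```python
import statistics

def fingerprint(samples):
    """Toy statistical fingerprint: a few summary stats per dataset.
    (Illustrative only -- the real API computes 52 dimensions on GPU.)"""
    return (
        statistics.mean(samples),
        statistics.pstdev(samples),
        min(samples),
        max(samples),
    )

def distance(a, b):
    """Euclidean distance between two fingerprints."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

normal = [45, 48, 42, 46, 44]    # training: normal CPU readings
suspect = [99, 97, 95, 98, 96]   # test: a compromised server

baseline = fingerprint(normal)
print(distance(baseline, fingerprint([46, 44, 47, 45, 43])))  # small -> looks normal
print(distance(baseline, fingerprint(suspect)))               # large -> anomalous
```

Same shape of decision: a small distance from the training fingerprint means "normal," a large one means "anomalous" — with confidence derived from how far apart they are.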

The key advantages over ML: minimal training data (2 samples are enough), no model drift, no retraining, no hyperparameter tuning. The same API call works on structured data, text, numbers, and time series.

Benchmarks (v2.2)

WaveGuard v2.2 vs scikit-learn across 6 real-world scenarios (10 training + 10 test samples each).

TL;DR: WaveGuard v2.2 wins or ties 4 of 6 scenarios and averages 0.76 F1 — competitive with sklearn methods while requiring zero ML expertise.

F1 Score (balanced precision-recall)

| Scenario | WaveGuard | IsolationForest | LOF | OneClassSVM |
|---|---|---|---|---|
| Server Metrics (IT Ops) | 0.87 | 0.71 | 0.87 | 0.62 |
| Financial Fraud | 0.83 | 0.74 | 0.77 | 0.77 |
| IoT Sensors (Industrial) | 0.87 | 0.69 | 0.69 | 0.65 |
| Network Traffic (Security) | 0.82 | 0.61 | 0.77 | 0.61 |
| Time-Series (Monitoring) | 0.46 | 0.77 | 0.80 | 0.67 |
| Sparse Features (Logs) | 0.72 | 0.90 | 0.82 | 0.78 |
| **Average** | **0.76** | **0.74** | **0.79** | **0.68** |
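F1 is the harmonic mean of precision and recall. The table's scores come from comparing predicted anomaly labels against ground truth; a generic sketch of the metric (not the benchmark script itself):

```python
def f1_score(y_true, y_pred):
    """F1 = harmonic mean of precision and recall over binary labels
    (1 = anomaly, 0 = normal)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# 10 test samples: 5 true anomalies, detector catches 4 with 1 false positive
truth = [1, 1, 1, 1, 1, 0, 0, 0, 0, 0]
preds = [1, 1, 1, 1, 0, 1, 0, 0, 0, 0]
print(round(f1_score(truth, preds), 2))  # -> 0.8
```

Because F1 penalizes both misses and false alarms, it is a fair single number for comparing detectors with different precision/recall trade-offs.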

What's new in v2.2

Multi-resolution scoring tracks each feature's local lattice energy in addition to global fingerprint distance. This catches subtle per-feature anomalies (like 3 of 10 IoT sensors drifting) that v2.1's global averaging missed. IoT F1 improved from 0.30 → 0.87.
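The intuition behind per-feature scoring versus global averaging can be shown with a toy per-column z-score sketch; the actual lattice-energy computation is server-side, and these numbers are illustrative:

```python
import statistics

# 10 sensors per sample; both training rows are nominal (~20.0 each).
train = [
    [20.0, 21.0, 19.5, 20.5, 20.0, 19.8, 20.2, 20.1, 19.9, 20.3],
    [20.2, 20.8, 19.7, 20.4, 20.1, 19.9, 20.0, 20.2, 20.0, 20.1],
]
# Test sample: sensors 0-2 have drifted upward; the other 7 are normal.
test = [23.0, 23.5, 22.8, 20.4, 20.0, 19.9, 20.1, 20.2, 20.0, 20.2]

columns = list(zip(*train))
per_feature = [
    abs(x - statistics.mean(col)) / (statistics.pstdev(col) or 1.0)
    for x, col in zip(test, columns)
]
global_score = sum(per_feature) / len(per_feature)

print(f"max per-feature score: {max(per_feature):.1f}")  # drifted sensors stand out
print(f"global average score:  {global_score:.1f}")      # diluted by the 7 normal sensors
```

Scoring each feature separately preserves the strong signal from the three drifted sensors, while a single global average dilutes it across the seven normal ones — the failure mode v2.2's multi-resolution scoring addresses.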

When to choose WaveGuard over sklearn

| Choose WaveGuard when... | Choose sklearn when... |
|---|---|
| False alarms are expensive (alert fatigue, SRE pages) | You need to catch every possible anomaly |
| You have no ML expertise on the team | You have data scientists who can tune models |
| You need a zero-config API call | You can manage model lifecycle (train/save/load) |
| Data schema changes frequently | Feature engineering is stable |
| Your AI agent needs anomaly detection (MCP) | Everything runs locally, no API calls |
Reproduce these benchmarks:

```
pip install WaveGuardClient scikit-learn
python benchmarks/benchmark_vs_sklearn.py
```

Results saved to benchmarks/benchmark_results.json. Benchmarks use deterministic random seeds for reproducibility.

Install

```
pip install WaveGuardClient
```

That's it. The only dependency is requests. All physics runs server-side on GPU.

Quickstart

The same scan() call works on any data type. Here are three different industries — same API:

Detect a compromised server

```python
from waveguard import WaveGuard

wg = WaveGuard(api_key="YOUR_KEY")

result = wg.scan(
    training=[
        {"cpu": 45, "memory": 62, "disk_io": 120, "errors": 0},
        {"cpu": 48, "memory": 63, "disk_io": 115, "errors": 0},
        {"cpu": 42, "memory": 61, "disk_io": 125, "errors": 1},
    ],
    test=[
        {"cpu": 46, "memory": 62, "disk_io": 119, "errors": 0},    # ✅ normal
        {"cpu": 99, "memory": 95, "disk_io": 800, "errors": 150},  # 🚨 anomaly
    ],
)

for r in result.results:
    print(f"{'🚨' if r.is_anomaly else '✅'}  score={r.score:.1f}  confidence={r.confidence:.0%}")
```

Flag a fraudulent transaction

```python
result = wg.scan(
    training=[
        {"amount": 74.50, "items": 3, "session_sec": 340, "returning": 1},
        {"amount": 52.00, "items": 2, "session_sec": 280, "returning": 1},
        {"amount": 89.99, "items": 4, "session_sec": 410, "returning": 0},
    ],
    test=[
        {"amount": 68.00, "items": 2, "session_sec": 300, "returning": 1},   # ✅ normal
        {"amount": 4200.00, "items": 25, "session_sec": 8, "returning": 0},  # 🚨 fraud
    ],
)
```

Catch a security event in logs

```python
result = wg.scan(
    training=[
        "2026-02-24 10:15:03 INFO  Request processed in 45ms [200 OK]",
        "2026-02-24 10:15:04 INFO  Request processed in 52ms [200 OK]",
        "2026-02-24 10:15:05 INFO  Cache hit ratio=0.94 ttl=300s",
    ],
    test=[
        "2026-02-24 10:20:03 INFO  Request processed in 48ms [200 OK]",                  # ✅ normal
        "2026-02-24 10:20:04 CRIT  xmrig consuming 98% CPU, port 45678 open",            # 🚨 crypto miner
        "2026-02-24 10:20:05 WARN  GET /api/users?id=1;DROP TABLE users-- from 185.x.x", # 🚨 SQL injection
    ],
    encoder_type="text",
)
```

Same client. Same scan() call. Any data.

Use Cases

WaveGuard works on any structured, numeric, or text data. If you can describe "normal," it can detect deviations.

| Industry | What You Scan | What It Catches |
|---|---|---|
| DevOps | Server metrics (CPU, memory, latency) | Memory leaks, DDoS attacks, runaway processes |
| Fintech | Transactions (amount, velocity, location) | Fraud, money laundering, account takeover |
| Security | Log files, access events | SQL injection, crypto miners, privilege escalation |
| IoT / Manufacturing | Sensor readings (temp, pressure, vibration) | Equipment failure, calibration drift |
| E-commerce | User behavior (session time, cart, clicks) | Bot traffic, bulk purchase fraud, scraping |
| Healthcare | Lab results, vitals, biomarkers | Abnormal readings, data entry errors |
| Time Series | Metric windows (latency, throughput) | Spikes, flatlines, seasonal breaks |

The API doesn't know your domain. It just knows what "normal" looks like (your training data) and flags anything that deviates. This makes it general — you bring the context, it brings the detection.

Supported Data Types

All auto-detected from data shape. No configuration needed:

| Type | Example | Use When |
|---|---|---|
| JSON objects | `{"cpu": 45, "memory": 62}` | Structured records with named fields |
| Numeric arrays | `[1.0, 1.2, 5.8, 1.1]` | Feature vectors, embeddings |
| Text strings | `"ERROR segfault at 0x0"` | Logs, messages, free text |
| Time series | `[100, 102, 98, 105, 99]` | Metric windows, sequential readings |
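A hypothetical sketch of how shape-based auto-detection might work: the function below and its rules are assumptions for illustration; the real detection happens inside the API:

```python
def guess_encoder_type(samples):
    """Guess an encoder type from the shape of the first sample.
    Hypothetical logic -- the real WaveGuard API does its own detection."""
    first = samples[0]
    if isinstance(first, dict):
        return "json"          # structured records with named fields
    if isinstance(first, str):
        return "text"          # logs, messages, free text
    if isinstance(first, (int, float)):
        return "timeseries"    # flat sequence of scalars = metric window
    if isinstance(first, (list, tuple)):
        return "numeric"       # feature vectors / embeddings
    raise TypeError(f"unsupported sample type: {type(first).__name__}")

print(guess_encoder_type([{"cpu": 45, "memory": 62}]))  # -> json
print(guess_encoder_type(["ERROR segfault at 0x0"]))    # -> text
print(guess_encoder_type([100, 102, 98, 105, 99]))      # -> timeseries
print(guess_encoder_type([[1.0, 1.2, 5.8, 1.1]]))       # -> numeric
```

The flat-list cases are inherently ambiguous (a metric window and a feature vector look alike), which is exactly when the `encoder_type` parameter is worth passing explicitly.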

Examples

Every example is a runnable Python script that hits the live API:

| Example | Industry | What It Shows |
|---|---|---|
| 🏭 IoT Predictive Maintenance | Manufacturing | Detect bearing failure, leaks, overloads from sensor data |
| 🔒 Network Intrusion Detection | Cybersecurity | Catch port scans, C2 beacons, DDoS, data exfiltration |
| 🤖 MCP Agent Demo | AI/Agents | Claude calls WaveGuard via MCP — zero ML knowledge |
| 01 Quickstart | General | Minimal scan in 10 lines |
| 02 Server Monitoring | DevOps | Memory leak + DDoS detection |
| 03 Log Analysis | Security | SQL injection, crypto miner detection |
| 04 Time Series | Monitoring | Latency spikes, flatline detection |
| 06 Batch Scanning | E-commerce | 20 transactions, fraud flagging |
| 07 Error Handling | Production | Retry logic, exponential backoff |

```
pip install WaveGuardClient
python examples/iot_predictive_maintenance.py
```

MCP Server (Claude Desktop)

The first physics-based anomaly detector available as an MCP tool. Give any AI agent the ability to detect anomalies — zero ML knowledge required.

Quick setup

```json
{
  "mcpServers": {
    "waveguard": {
      "command": "uvx",
      "args": ["--from", "WaveGuardClient", "waveguard-mcp"]
    }
  }
}
```

Then ask Claude: "Are any of these sensor readings anomalous?" — it calls waveguard_scan automatically.

Available MCP tools

| Tool | Description |
|---|---|
| `waveguard_scan` | Detect anomalies in any structured data |
| `waveguard_scan_timeseries` | Auto-window time-series and detect anomalous segments |
| `waveguard_health` | Check API status and GPU availability |

See the MCP Agent Demo for a working example, or the MCP Integration Guide for full setup.

Azure Migration

Azure Anomaly Detector retires October 2026. WaveGuard is a drop-in replacement:

```python
# Before (Azure) — 3+ API calls, stateful, time-series only
client = AnomalyDetectorClient(endpoint, credential)
model = client.train_multivariate_model(request)   # minutes
result = client.detect_multivariate_batch_anomaly(model_id, data)
client.delete_multivariate_model(model_id)

# After (WaveGuard) — 1 API call, stateless, any data type
wg = WaveGuard(api_key="YOUR_KEY")
result = wg.scan(training=normal_data, test=new_data)  # seconds
```

See Azure Migration Guide for details.

API Reference

wg.scan(training, test, encoder_type=None, sensitivity=None)

| Parameter | Type | Description |
|---|---|---|
| `training` | list | 2+ examples of normal data |
| `test` | list | 1+ samples to check |
| `encoder_type` | str | Force: `"json"`, `"numeric"`, `"text"`, `"timeseries"` (default: auto) |
| `sensitivity` | float | 0.5–3.0, lower = more sensitive (default: 1.0) |

Returns ScanResult with .results (per-sample) and .summary (aggregate).

wg.health() / wg.tier()

Health check (no auth) and subscription tier info.

Error Handling

```python
from waveguard import WaveGuard, AuthenticationError, RateLimitError

wg = WaveGuard(api_key="YOUR_KEY")

try:
    result = wg.scan(training=data, test=new_data)
except AuthenticationError:
    print("Bad API key")
except RateLimitError:
    print("Too many requests — back off and retry")
```
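Example 07 mentions retry logic with exponential backoff; one generic way to wrap a rate-limited call looks like this (the `retry_with_backoff` helper is illustrative, not part of the SDK):

```python
import random
import time

def retry_with_backoff(call, retryable, max_attempts=5, base_delay=1.0):
    """Retry call() on retryable exceptions with exponential backoff + jitter.
    Generic sketch -- this helper is NOT part of the WaveGuard SDK."""
    for attempt in range(max_attempts):
        try:
            return call()
        except retryable:
            if attempt == max_attempts - 1:
                raise  # out of attempts: surface the last error
            # delays of ~1x, 2x, 4x base_delay, with jitter to avoid thundering herd
            time.sleep(base_delay * (2 ** attempt + random.random()))

# Intended use with the SDK (requires a live API key):
#   result = retry_with_backoff(
#       lambda: wg.scan(training=normal_data, test=new_data),
#       retryable=RateLimitError,
#   )
```

Catching only the retryable exception type matters: an `AuthenticationError` will never succeed on retry and should fail immediately.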

Full API reference: docs/api-reference.md

Project Structure

```
WaveGuardClient/
├── waveguard/              # Python SDK package
│   ├── __init__.py         # Public API exports
│   ├── client.py           # WaveGuard client class
│   └── exceptions.py       # Exception hierarchy
├── mcp_server/             # MCP server for Claude Desktop
│   └── server.py           # stdio + HTTP transport
├── benchmarks/             # Reproducible benchmarks vs sklearn
│   ├── benchmark_vs_sklearn.py
│   └── benchmark_results.json
├── examples/               # 9 runnable examples
├── docs/                   # Documentation
│   ├── getting-started.md
│   ├── api-reference.md
│   ├── mcp-integration.md
│   └── azure-migration.md
├── tests/                  # Test suite
├── pyproject.toml          # Package config (pip install -e .)
└── CHANGELOG.md
```

Development

```
git clone https://github.com/gpartin/WaveGuardClient.git
cd WaveGuardClient
pip install -e ".[dev]"
pytest
```

Links

License

MIT — see LICENSE.
