P-008: Multi-target support for testing multiple Matomo instances #8

Overview

Add support for distributing load across multiple Matomo instances, so a single test run can exercise several tracking endpoints concurrently.

Motivation

  • Scalability testing: Validate that multiple Matomo instances can handle distributed loads
  • Multi-tenant scenarios: Test SaaS deployments with separate instances per customer
  • High availability: Verify failover and load balancing configurations
  • Realistic workloads: Mirror production architectures with multiple tracking endpoints

Acceptance Criteria

Backend (Loader)

  • Loader accepts array of target configurations (URL, Site ID, Token Auth)
  • Support distribution strategies (selector sketch after this list):
    • Round-robin (default)
    • Weighted distribution (custom ratios)
    • Random selection
  • Concurrent request execution using asyncio + httpx.AsyncClient
  • Per-target error handling and retry logic
  • Per-target metrics tracking (visits sent, success rate, latency)
  • Unit tests covering multi-target routing and distribution
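
A minimal selector sketch for the three strategies; make_selector is a hypothetical helper name, and targets are assumed to be dicts carrying at least url and weight (matching the Target model under Technical Approach):

import itertools
import random

def make_selector(targets: list, strategy: str = "round-robin"):
    """Return a zero-argument callable that picks the next target."""
    if strategy == "round-robin":
        cycle = itertools.cycle(targets)          # deterministic rotation
        return lambda: next(cycle)
    if strategy == "weighted":
        weights = [t["weight"] for t in targets]  # honor per-target ratios
        return lambda: random.choices(targets, weights=weights, k=1)[0]
    if strategy == "random":
        return lambda: random.choice(targets)     # uniform pick
    raise ValueError(f"unknown distribution strategy: {strategy}")

Each generated visit then calls the selector once to decide which instance receives it, e.g. pick = make_selector(targets, "weighted"); target = pick().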

Configuration Model

  • Extend Pydantic models to support target arrays
  • Validate each target independently (URL format, connectivity)
  • SQLite schema updates for multi-target presets (schema sketch after this list)
  • Backward compatibility with single-target configs (see the validator sketch under Data Model Extension)
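
One possible schema shape, as a sketch only: the preset_targets table, its columns, and the referenced presets table are assumptions about the existing schema, not confirmed names. A child table keyed to presets keeps old single-target presets valid (they simply own one row):

import sqlite3

SCHEMA = """
CREATE TABLE IF NOT EXISTS preset_targets (
    id         INTEGER PRIMARY KEY AUTOINCREMENT,
    preset_id  INTEGER NOT NULL REFERENCES presets(id) ON DELETE CASCADE,
    url        TEXT    NOT NULL,
    site_id    INTEGER NOT NULL,
    token_auth TEXT,
    weight     INTEGER NOT NULL DEFAULT 1,
    enabled    INTEGER NOT NULL DEFAULT 1
);
"""

def migrate(db_path: str) -> None:
    """Apply the multi-target schema; safe to run repeatedly."""
    with sqlite3.connect(db_path) as conn:
        conn.executescript(SCHEMA)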

Control UI - Backend

  • Update /api/validate to test all target connections (endpoint sketch after this list)
  • Modify /api/status to return per-target statistics
  • Extend /api/presets to persist multi-target configurations
  • Add /api/targets CRUD endpoints (optional)
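
A sketch of the validation endpoint, assuming the control UI backend is FastAPI (the framework, route shape, and check_target helper are assumptions); it probes every target concurrently with the same httpx pattern the loader uses:

import asyncio

import httpx
from fastapi import FastAPI

app = FastAPI()

async def check_target(client: httpx.AsyncClient, target: dict) -> dict:
    """Probe one Matomo endpoint and report reachability."""
    try:
        response = await client.get(f"{target['url']}/matomo.php", timeout=5.0)
        return {"url": target["url"], "reachable": response.status_code < 500}
    except httpx.HTTPError as exc:
        return {"url": target["url"], "reachable": False, "error": str(exc)}

@app.post("/api/validate")
async def validate_targets(targets: list[dict]) -> list[dict]:
    """Check all configured targets concurrently; return per-target results."""
    async with httpx.AsyncClient() as client:
        return await asyncio.gather(*(check_target(client, t) for t in targets))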

Control UI - Frontend

  • Dynamic target list editor (add/remove targets)
  • Per-target configuration fields (URL, Site ID, Token)
  • Distribution strategy selector (round-robin/weighted/random)
  • Status dashboard shows per-target metrics
  • Visual indicators for target health (green/yellow/red)
  • Test connectivity for all targets before starting

Documentation & Testing

  • Update README.md with multi-target examples
  • Extend WEB_UI_GUIDE.md with target management instructions
  • Integration tests for multi-target scenarios
  • ADR documenting design decisions
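
A sketch of one routing unit test, assuming pytest as the runner and the hypothetical make_selector helper sketched under the loader criteria above:

# assumes: from loader.distribution import make_selector  (hypothetical module path)

def test_round_robin_cycles_through_targets():
    targets = [{"url": "https://a.example", "weight": 1},
               {"url": "https://b.example", "weight": 1}]
    pick = make_selector(targets, "round-robin")
    urls = [pick()["url"] for _ in range(4)]
    assert urls == ["https://a.example", "https://b.example",
                    "https://a.example", "https://b.example"]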

Technical Approach

Async Concurrency Pattern

Based on research from httpx and docker-py documentation:

import httpx
import asyncio

async def send_to_target(client: httpx.AsyncClient, target: dict, tracking_data: dict):
    """Send tracking request to a single target"""
    try:
        response = await client.post(
            f"{target['url']}/matomo.php",
            data=tracking_data,
            timeout=10.0
        )
        return {"target": target["url"], "status": response.status_code}
    except Exception as e:
        return {"target": target["url"], "error": str(e)}

async def distribute_requests(targets: list, tracking_data: dict):
    """Concurrently send requests to all targets"""
    # http2=True requires the optional extra: pip install "httpx[http2]"
    async with httpx.AsyncClient(http2=True) as client:
        tasks = [send_to_target(client, target, tracking_data) for target in targets]
        # send_to_target already catches errors; return_exceptions=True also
        # guards against anything that slips through
        results = await asyncio.gather(*tasks, return_exceptions=True)
        return results
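
A possible driver for the helpers above that also feeds the per-target metrics called for in the acceptance criteria; run_batch and the stats shape are assumptions, and note that distribute_requests broadcasts each payload to every target:

import asyncio
from collections import defaultdict

async def run_batch(targets: list, payloads: list):
    """Send each payload to every target and accumulate per-target stats."""
    stats = defaultdict(lambda: {"sent": 0, "ok": 0})
    for tracking_data in payloads:
        for result in await distribute_requests(targets, tracking_data):
            if not isinstance(result, dict):  # surfaced by return_exceptions=True
                continue
            entry = stats[result["target"]]
            entry["sent"] += 1
            if result.get("status") in (200, 204):  # Matomo answers 200 (gif) or 204
                entry["ok"] += 1
    return {url: {**s, "success_rate": s["ok"] / s["sent"]}
            for url, s in stats.items()}

# e.g. asyncio.run(run_batch(targets, payloads))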

Data Model Extension

from typing import List, Literal, Optional

from pydantic import BaseModel, HttpUrl

class Target(BaseModel):
    url: HttpUrl
    site_id: int
    token_auth: Optional[str] = None
    weight: int = 1  # For weighted distribution
    enabled: bool = True

class MultiTargetConfig(BaseModel):
    targets: List[Target]
    distribution_strategy: Literal["round-robin", "weighted", "random"] = "round-robin"
    # ... existing config fields
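
To keep existing single-target configs loading unchanged, a before-validator can lift legacy top-level fields into a one-element targets list. A minimal sketch assuming Pydantic v2; the legacy field names (url, site_id, token_auth at the top level) are assumptions about the current config shape:

from pydantic import model_validator

class MultiTargetConfig(BaseModel):
    targets: List[Target]
    distribution_strategy: Literal["round-robin", "weighted", "random"] = "round-robin"

    @model_validator(mode="before")
    @classmethod
    def lift_single_target(cls, data):
        """Wrap a legacy single-target payload into a one-item targets list."""
        if isinstance(data, dict) and "targets" not in data and "url" in data:
            data = dict(data)  # avoid mutating the caller's dict
            data["targets"] = [{
                "url": data.pop("url"),
                "site_id": data.pop("site_id"),
                "token_auth": data.pop("token_auth", None),
            }]
        return data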

Implementation Plan

  1. Research & Design (Step 1 - CURRENT)

    • Research async patterns (Context7 MCP) ✅
    • Create GitHub issue ✅
    • Draft ADR for design decisions
  2. Backend Data Models (Step 2)

    • Extend Pydantic models
    • Update SQLite schema
    • Add migration if needed
  3. Loader Multi-Target Support (Step 3)

    • Implement async distribution logic
    • Add per-target metrics
    • Create unit tests
  4. Control UI Updates (Step 4)

    • Backend API updates
    • Frontend target editor
    • Status dashboard enhancements
  5. Documentation & Testing (Step 5)

    • Integration tests
    • User documentation
    • Create PR

Dependencies

  • None (can start immediately)

Estimated Effort

  • Backend: 4-6 hours
  • Frontend: 3-4 hours
  • Testing & Docs: 2-3 hours
  • Total: 9-13 hours
