feat: add real-time gaze ingestion and WebSocket streaming per session #45
Avinash-Alapati wants to merge 3 commits into ruxailab:main
Conversation
Hi @KarinePistili @jvJUCA, I’ve opened a PR adding session-based real-time gaze ingestion and WebSocket streaming (non-breaking, backend only).
Added a follow-up commit to integrate live gaze streaming with the calibration flow. The frontend now periodically posts gaze points to POST /api/session/gaze during training/validation, and the backend logs them and broadcasts them via Socket.IO rooms, one per session. This validates the full REST → WebSocket pipeline end-to-end with the actual eye-tracking flow (not just a mock client).
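For reviewers, here is a minimal sketch of that ingestion + broadcast path, assuming a Flask backend with Flask-SocketIO. The route and per-session rooms are from this PR; the event name `gaze_point` and the payload fields (`session_id`, `x`, `y`, `timestamp`) are illustrative, not the exact schema:

```python
# Minimal sketch: REST ingestion endpoint that fans gaze points out to a
# per-session Socket.IO room. Assumes Flask + Flask-SocketIO; event/field
# names are illustrative.
from flask import Flask, request, jsonify
from flask_socketio import SocketIO, join_room

app = Flask(__name__)
socketio = SocketIO(app, cors_allowed_origins="*")


@socketio.on("join")
def on_join(data):
    # Observers subscribe to a per-session room to receive that session's stream.
    join_room(data["session_id"])


@app.route("/api/session/gaze", methods=["POST"])
def ingest_gaze():
    point = request.get_json(force=True)
    session_id = point.get("session_id")
    if not session_id:
        return jsonify({"error": "session_id is required"}), 400
    # Log the point, then broadcast it to everyone watching this session's room.
    app.logger.info("gaze point for session %s: %s", session_id, point)
    socketio.emit("gaze_point", point, to=session_id)
    return jsonify({"status": "ok"}), 200


if __name__ == "__main__":
    socketio.run(app, port=5000)
```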
What this adds
POST /api/session/gaze to ingest live gaze points
Why this is needed
Currently the API supports calibration and batch prediction, but there is no support for streaming gaze data during live usability sessions.
This change adds the backend infrastructure required for real-time gaze visualization and future session replay, which is a core part of the GSoC project scope.
Scope
Demo
Short demo showing REST → WebSocket live streaming using a test observer client:
Real-time.Gaze.Streaming.Demo.mp4
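For anyone who wants to reproduce the demo locally, here is a sketch of the observer flow under the same assumptions as the backend sketch above: a python-socketio client joins a session room, then a few gaze points are POSTed to the new endpoint and arrive back over the WebSocket. The base URL, session id, and payload fields are illustrative:

```python
# Minimal sketch of a test observer client: join a session room over
# Socket.IO, then simulate the frontend POSTing gaze points to the new
# endpoint. Assumes the Flask-SocketIO sketch above is running locally.
import time

import requests
import socketio

BASE_URL = "http://localhost:5000"  # assumed local dev server
SESSION_ID = "demo-session"         # illustrative session id

sio = socketio.Client()


@sio.on("gaze_point")
def on_gaze_point(point):
    print("observer received:", point)


sio.connect(BASE_URL)
sio.emit("join", {"session_id": SESSION_ID})
time.sleep(0.2)  # let the server process the join before posting

# Simulate the frontend posting gaze points during training/validation.
for i in range(3):
    requests.post(
        f"{BASE_URL}/api/session/gaze",
        json={"session_id": SESSION_ID, "x": 0.1 * i, "y": 0.2 * i,
              "timestamp": time.time()},
    )
    time.sleep(0.1)

sio.disconnect()
```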
Next steps (planned)