Backend schema, dataset ingestion, and frontend integration #49
ahadsiddiqui wants to merge 9 commits into DataBytes-Organisation:main
Conversation
- Frontend connected with backend successfully.
- Dataset can be stored inside PostgreSQL.
Hi Ahad,
Please update the README with instructions on how to import the data into the database.
@ahadsiddiqui sensor 2's data format caused the import to fail; sensors 1 and 3 worked.
These scripts work for me:
```bash
node scripts/ingest.js --file ../datasets/2881821-sensor1.csv --map mappings/sensor1.json
node scripts/ingest.js --file ../datasets/1321079-sensor2.csv --map mappings/sensor2.json
node scripts/ingest.js --file ../datasets/518150-sensor3.csv --map mappings/sensor3.json
```
Please resolve the conflict.
- Added 2 more datasets for Sensor 2 and 3.
- Updated README.
- Multi-stream visualisation sorted out.
This pull request delivers the first working end-to-end flow for the MVP.
Set up Express.js backend with health endpoint and routing structure.
Implemented controllers and routes for:
/api/datasets → list datasets
/api/datasets/:id/meta → dataset fields + time bounds
/api/series → return raw time series (with quality_flag) or aggregated intervals.
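A minimal sketch of how the server and routing structure fit together (port, health-endpoint path, and wiring details are assumptions; the real code lives in newBackend/server.js and src/routes/):

```js
// newBackend/server.js -- sketch, not the exact implementation
const express = require('express');
require('dotenv').config();

const datasetRoutes = require('./src/routes/datasetRoutes');
const seriesRoutes = require('./src/routes/seriesRoutes');

const app = express();
app.use(express.json());

// Health endpoint for liveness checks (exact path is an assumption)
app.get('/api/health', (req, res) => res.json({ ok: true }));

// /api/datasets and /api/datasets/:id/meta
app.use('/api/datasets', datasetRoutes);
// /api/series with dataset, streams, interval, start/end query params
app.use('/api/series', seriesRoutes);

const port = process.env.PORT || 3000;
app.listen(port, () => console.log(`API listening on ${port}`));
```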
Designed and applied PostgreSQL schema (datasets, timeseries_long) with quality_flag support.
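The authoritative DDL is in sql/schema.sql; what follows is a sketch of the likely shape (column names are assumptions inferred from the long format and quality_flag described here), applied from Node:

```js
// scripts/apply-schema.js -- sketch; table columns are assumptions
const { Pool } = require('pg');
const pool = new Pool(); // picks up PG* environment variables

const ddl = `
  CREATE TABLE IF NOT EXISTS datasets (
    id   SERIAL PRIMARY KEY,
    name TEXT NOT NULL UNIQUE
  );
  CREATE TABLE IF NOT EXISTS timeseries_long (
    dataset_id   INTEGER REFERENCES datasets(id),
    ts           TIMESTAMPTZ NOT NULL,
    stream       TEXT NOT NULL,
    value        DOUBLE PRECISION,
    quality_flag TEXT DEFAULT 'normal' -- e.g. 'normal' | 'anomaly'
  );
`;

pool.query(ddl).catch(console.error).finally(() => pool.end());
```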
Added .env support and connection pooling.
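The pool is configured once from .env and shared across controllers; a minimal sketch, assuming a DATABASE_URL variable:

```js
// newBackend/src/db.js -- sketch; the env variable name is an assumption
require('dotenv').config();
const { Pool } = require('pg');

// A single pool for the whole app; pg reuses connections across requests
const pool = new Pool({
  connectionString: process.env.DATABASE_URL, // e.g. postgres://user:pass@localhost:5432/appdb
});

module.exports = pool;
```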
Added ingest.js script to load CSV datasets into PostgreSQL in long format.
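A condensed sketch of what the ingest step does (the real ingest.js takes --file and --map arguments as shown earlier in the thread; the hardcoded timestamp column and stream names below are purely illustrative):

```js
// Sketch of CSV -> long-format ingestion; naive CSV parsing, illustration only
const fs = require('fs');
const { Pool } = require('pg');

async function ingest(file, datasetId, streams) {
  const pool = new Pool();
  const [header, ...rows] = fs.readFileSync(file, 'utf8').trim().split('\n');
  const cols = header.split(',');

  for (const line of rows) {
    const rec = Object.fromEntries(line.split(',').map((v, i) => [cols[i], v]));
    // Pivot one wide CSV row into one long row per stream column
    for (const stream of streams) {
      await pool.query(
        `INSERT INTO timeseries_long (dataset_id, ts, stream, value, quality_flag)
         VALUES ($1, $2, $3, $4, 'normal')`,
        [datasetId, rec.timestamp, stream, Number(rec[stream])]
      );
    }
  }
  await pool.end();
}

// Hypothetical stream names; the real mapping comes from mappings/sensor1.json
ingest('../datasets/2881821-sensor1.csv', 1, ['temperature', 'humidity']).catch(console.error);
```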
Verified dataset ingestion (e.g. sensor1) and confirmed rows appear in the DB and via the API.
Documented commands in README for installing PostgreSQL, creating appdb, and running ingest.
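Those README commands are along these lines (database name appdb from above; exact flags may vary by platform):

```bash
# create the database and apply the schema
createdb appdb
psql -d appdb -f sql/schema.sql

# ingest a dataset (see the sensor1-3 examples earlier in the thread)
node scripts/ingest.js --file ../datasets/2881821-sensor1.csv --map mappings/sensor1.json
```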
Connected React dashboard to backend via Vite proxy.
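The proxy lets the dashboard call /api/* without any CORS setup; a sketch of the vite.config.js entry (backend port is an assumption):

```js
// frontend/vite.config.js -- sketch
import { defineConfig } from 'vite';
import react from '@vitejs/plugin-react';

export default defineConfig({
  plugins: [react()],
  server: {
    proxy: {
      // Forward /api/* to the Express backend during development
      '/api': 'http://localhost:3000',
    },
  },
});
```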
Updated Dashboard.jsx to send requests with dataset, streams, interval, start/end times.
Logs and visualises data from the backend (with merged chart data).
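A sketch of the request Dashboard.jsx sends (query parameter names from above; the helper name and response shape are assumptions):

```js
// Hypothetical helper mirroring what Dashboard.jsx does
async function fetchSeries({ dataset, streams, interval, start, end }) {
  const params = new URLSearchParams({
    dataset,
    streams: streams.join(','), // multiple streams in one request
    interval,
    start,
    end,
  });
  const res = await fetch(`/api/series?${params}`); // proxied to the backend by Vite
  if (!res.ok) throw new Error(`series request failed: ${res.status}`);
  return res.json(); // rows with ts, stream, value, quality_flag
}
```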
Updated Chart.jsx to plot multiple streams and color points by quality_flag:
Green = normal
Red = anomaly (once the DS team updates flags).
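The colouring rule reduces to something like the following (the flag strings are assumptions until the DS team finalises them):

```js
// Colour a chart point by its quality_flag; the 'anomaly' value is an assumption
function pointColor(point) {
  return point.quality_flag === 'anomaly' ? 'red' : 'green';
}
```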
Endpoints tested with curl and frontend Analyse button.
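Example of the curl checks (host/port, dataset id, stream name, and time range are placeholders):

```bash
curl http://localhost:3000/api/datasets
curl http://localhost:3000/api/datasets/1/meta
curl "http://localhost:3000/api/series?dataset=1&streams=temperature&interval=1h&start=2025-01-01T00:00:00Z&end=2025-01-02T00:00:00Z"
```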
Confirmed full path: CSV → PostgreSQL → API → React chart.
Screenshots/logs available showing loaded server data and merged chart data.
As backend lead, coordinated alignment between frontend and backend teams.
Proposed and finalised system architecture (React → Node.js → PostgreSQL).
Documented setup steps in README to ensure teammates can run the repo without errors.
Provided guidance during syncs on connection flow, schema design, and integration strategy.
Files touched
Backend
newBackend/server.js (setup + routes)
newBackend/src/controllers/seriesController.js
newBackend/src/controllers/datasetController.js
newBackend/src/routes/seriesRoutes.js
newBackend/src/routes/datasetRoutes.js
newBackend/src/ingest/ingest.js
sql/schema.sql
Frontend
frontend/src/components/Dashboard.jsx
frontend/src/components/Chart.jsx
frontend/vite.config.js (minor config)
README.md (setup instructions + workflow)
Next steps
The Data Science team can extend ingestion/anomaly detection and update quality_flag.
More datasets can be ingested and visualised without schema changes.
Future enhancements: dedicated anomaly endpoints and richer flags (e.g. bad data, anomaly, correlated).