Releases: Stock-XAI/backend
snapshot-0611
Snapshot (Week 15 – 2025.06.11)
1. Concurrency Handling in Prediction & Explanation APIs (#16)
- Fixed potential race conditions during concurrent API calls to `/stock-info/pred` and `/stock-info/exp`.
- Both `Prediction` and `Explanation` tables are uniquely constrained by `(ticker_id, predicted_date, horizon_days)`, which could raise `IntegrityError` under concurrent inserts.
- Wrapped insertion logic in `try/except` blocks:
  - On `IntegrityError`, the transaction is rolled back and a fresh query is issued to retrieve the existing cached row.
- Ensures consistent behavior and avoids 500 errors even under high request concurrency.
- Improved robustness for real-time multi-user scenarios and ngrok-based external inference pipelines.
- Also normalized API URL handling to avoid trailing-slash issues.
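The rollback-then-refetch pattern described above can be sketched as a race-safe "insert, else fetch" helper. The table layout and helper name below are minimal stand-ins mirroring the release notes, not the project's exact code:

```python
# Sketch of the race-safe "insert, else fetch" pattern: on a concurrent
# duplicate insert, roll back and return the row that won the race
# instead of surfacing a 500.
import datetime

from sqlalchemy import Column, Date, Float, Integer, UniqueConstraint, create_engine
from sqlalchemy.exc import IntegrityError
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class Prediction(Base):
    __tablename__ = "prediction"
    id = Column(Integer, primary_key=True)
    ticker_id = Column(Integer, nullable=False)
    predicted_date = Column(Date, nullable=False)
    horizon_days = Column(Integer, nullable=False)
    result = Column(Float)
    __table_args__ = (
        UniqueConstraint("ticker_id", "predicted_date", "horizon_days"),
    )

def get_or_create_prediction(session: Session, ticker_id, predicted_date,
                             horizon_days, result):
    """Insert a prediction; on IntegrityError, return the existing row."""
    row = Prediction(ticker_id=ticker_id, predicted_date=predicted_date,
                     horizon_days=horizon_days, result=result)
    session.add(row)
    try:
        session.commit()
        return row
    except IntegrityError:
        session.rollback()  # discard the failed insert
        return (
            session.query(Prediction)
            .filter_by(ticker_id=ticker_id, predicted_date=predicted_date,
                       horizon_days=horizon_days)
            .one()
        )
```

The same shape applies to the `Explanation` table; only the model and filter columns change.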
Related PRs
- [#16] Prevent duplicate insertions by handling race conditions in prediction and explanation
snapshot-0608
Snapshot (Week 14 - 2025.06.08)
1. XAI Explanation API Integration (#14)
- Replaced the mock `generate_explanation()` function with actual inference.
- Integrated with the ngrok-hosted Colab server to fetch real explanation results (`tokens`, `token_scores`).
- Added a full SQLAlchemy model and CRUD for the `Explanation` table.
- Explanation results are now stored as JSON-serialized fields (`Text` columns).
- Defined the `ExplainationData` schema and connected it to the `/stock-info/exp` API response.
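The JSON-serialized `Text` storage described above can be sketched as follows; the column names and pack/unpack helpers are illustrative assumptions, not the project's exact model:

```python
# Sketch of storing explanation results (tokens, token_scores) as
# JSON-serialized Text columns, with helpers to round-trip them.
import json

from sqlalchemy import Column, Integer, Text
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class Explanation(Base):
    __tablename__ = "explanation"
    id = Column(Integer, primary_key=True)
    ticker_id = Column(Integer, nullable=False)
    tokens = Column(Text, nullable=False)        # JSON-encoded list[str]
    token_scores = Column(Text, nullable=False)  # JSON-encoded list[float]

def pack(tokens, token_scores):
    """Serialize model output for the Text columns."""
    return json.dumps(tokens), json.dumps(token_scores)

def unpack(row):
    """Deserialize a row back into Python lists for the API response."""
    return json.loads(row.tokens), json.loads(row.token_scores)
```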
2. Modularization & Schema Refactor for Regression (#15)
- Split the original `/stock-info` endpoint into three focused APIs:
  - `/stock-info/basic`: Returns chart and news data
  - `/stock-info/pred`: Returns the numerical regression prediction
  - `/stock-info/exp`: Returns the model explanation
- Refactored the `PredictionData` schema from classification to regression (`result: float`).
- Removed `StockInfoWrapper` and unified the response format to simple JSON payloads.
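The regression-shaped `PredictionData` schema could look like the following Pydantic sketch; only `result: float` is stated in the notes, the other field names are assumptions:

```python
# Sketch of the regression response schema after the refactor:
# `result` is a float prediction, replacing the old class label.
from pydantic import BaseModel

class PredictionData(BaseModel):
    ticker: str          # assumed field
    horizon_days: int    # assumed field
    result: float        # regression output (from the notes)
```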
Related PRs
snapshot-0516
Bi-Weekly Snapshot (Week 11 - 2025.05.16)
1. Model Integration and Deployment
- Connected to the Colab-hosted `capston-team-5/finma-7b-4bit-quantized` model API.
- Implemented the `run_prediction` logic and deployed the prediction endpoint.
- Verified the end-to-end inference pipeline and deployed with Docker Compose.
2. Database Migration and ORM Setup
- Migrated backend database from MongoDB to MySQL (Amazon RDS).
- Defined core relational models using SQLAlchemy: `Ticker`, `ChartData`, `Prediction`, `News`.
- Updated the database connection and query structure to match the new relational schema.
- DB schema documentation: db/README.md
3. CRUD Refactoring and DB Caching
- Refactored the backend into modular CRUD files: `chart.py`, `news.py`, `prediction.py`, `explanation.py`.
- Introduced a `get_session()` helper for unified session management.
- Added DB caching logic to reduce redundant external API calls:
  - Chart: Fetches and stores only the missing date ranges.
  - News: Compares the latest `pubDate` with the DB before fetching.
  - Prediction: Caches based on `(ticker, horizon, predicted_date)`.
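The "fetch only missing date ranges" decision for chart caching can be sketched in pure Python; the real version would query the `ChartData` table, but the interval arithmetic is the same, and the function name is an assumption:

```python
# Sketch: given a requested date window and the daily dates already cached,
# compute the (start, end) gaps that still need an external API fetch.
import datetime

def missing_ranges(requested_start, requested_end, cached_dates):
    """Return (start, end) spans inside the window not covered by cache."""
    cached = set(cached_dates)
    gaps, gap_start = [], None
    day, one = requested_start, datetime.timedelta(days=1)
    while day <= requested_end:
        if day not in cached:
            gap_start = gap_start or day   # open a gap at the first miss
        elif gap_start:
            gaps.append((gap_start, day - one))  # close the gap
            gap_start = None
        day += one
    if gap_start:
        gaps.append((gap_start, requested_end))
    return gaps
```

Only the returned spans are fetched externally; everything else is served from the DB.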
4. Multi-Market & Horizon-Aware Charting
- `/chart` and `/stock-info` endpoints now support:
  - Weekly data for `horizon = 7`
  - Monthly data for `horizon = 30`
- Previously these always returned 30 days of daily data → now the chart granularity aligns with the prediction scale.
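The horizon-to-granularity mapping above amounts to a small selection helper; the function name is an assumption:

```python
# Sketch of horizon-aware chart granularity: the mapping
# (7 -> weekly, 30 -> monthly) follows the release notes.
def chart_window_for(horizon_days: int) -> str:
    """Pick the chart granularity that matches the prediction horizon."""
    if horizon_days == 7:
        return "weekly"
    if horizon_days == 30:
        return "monthly"
    raise ValueError(f"unsupported horizon: {horizon_days}")
```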
Related PRs
Bi-Weekly Snapshot - 04/14
Bi-Weekly Snapshot (Week 7 - 2025.04.14)
- Completed API Documentation
  - Finalized API specifications for core endpoints including `/stock-info` and `/search`.
  - Defined supported query parameters and example responses for frontend integration.
- Initial Backend Setup and Structure
  - Set up the backend using FastAPI with a modular router structure (`routers/stock.py`, `routers/search.py`).
  - Configured Docker (`Dockerfile`, `docker-compose.yml`) and connected to MongoDB Atlas using environment variables in `.env`.
- Implemented Core API Functionality
  - `/stock-info`: Integrated with yfinance to fetch actual stock price data (1-month period), and included placeholders for news, prediction, and XAI interpretation.
  - `/search`: Built an autocomplete API using MongoDB. If a keyword is provided, it performs a regex search over tickers and company names; if omitted, it returns a curated list of 50 popular U.S. stocks.
  - Collected and inserted ticker data (ticker + name) for 518 companies from the NASDAQ-100 and S&P 500 lists into MongoDB for real-time search functionality.
  - Returned data is now connected to the real database and live price feed.
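The `/search` keyword branch above can be sketched as a MongoDB filter builder; the collection field names (`ticker`, `name`) and the sample popular list are assumptions:

```python
# Sketch of the autocomplete query: a case-insensitive regex over ticker
# and company name, falling back to the curated popular list when no
# keyword is given. The returned dict would be passed to collection.find().
import re

POPULAR_TICKERS = ["AAPL", "MSFT", "NVDA"]  # stand-in for the curated 50

def build_search_filter(keyword):
    """Return a MongoDB filter dict, or None to serve the popular list."""
    if not keyword:
        return None
    pattern = re.escape(keyword)  # treat user input literally, not as regex
    regex = {"$regex": pattern, "$options": "i"}
    return {"$or": [{"ticker": regex}, {"name": regex}]}
```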
- CORS Configuration for Development
  - Added CORS settings (`CORSMiddleware`) to allow frontend requests from development ports (e.g., `localhost:5173`), ensuring smooth API communication.