Jarvis017/Blood_Glucose_Prediction
Blood Glucose Prediction with Mixture of Experts Architecture

Overview

This project predicts blood glucose levels in patients using a Mixture of Experts (MoE) architecture. The model combines multiple expert networks, each built on a different neural network architecture, to leverage their complementary strengths and produce more accurate, robust blood glucose predictions.

Preprocessing

Data preprocessing is a crucial step to ensure the quality and reliability of the predictions. The preprocessing steps include:

  1. Extracting continuous glucose monitoring (CGM) data for each patient
  2. Detecting and removing outlier readings
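The outlier-removal step can be sketched as follows. This is a minimal illustration using the interquartile-range (IQR) rule; the actual notebook may use a different detection method and threshold, and the sample readings are invented for demonstration:

```python
import numpy as np

def remove_outliers_iqr(cgm, k=1.5):
    """Keep readings inside [Q1 - k*IQR, Q3 + k*IQR]; drop the rest."""
    cgm = np.asarray(cgm, dtype=float)
    q1, q3 = np.percentile(cgm, [25, 75])
    iqr = q3 - q1
    mask = (cgm >= q1 - k * iqr) & (cgm <= q3 + k * iqr)
    return cgm[mask]

# Hypothetical CGM trace (mg/dL) with one implausible spike.
readings = np.array([110, 115, 120, 118, 400, 117, 113])
cleaned = remove_outliers_iqr(readings)  # the 400 mg/dL spike is dropped
```

The IQR rule is often preferred over a z-score cutoff for CGM data because a single large spike inflates the standard deviation enough to mask itself.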

Architecture

The Mixture of Experts architecture employed in this project includes the following components:

  • GRU Expert: Utilizes Gated Recurrent Units (GRU) to capture temporal dependencies in blood glucose data.
  • LSTM Expert: Employs Long Short-Term Memory (LSTM) networks, known for their ability to learn long-term dependencies, which is crucial for time-series data.
  • Dense Expert: Uses a stack of fully connected (Dense) layers to model the non-linear relationships in the data.

The mixture model combines the outputs of these experts to produce a final prediction. A gating mechanism is used to assign weights to each expert's output based on the input data, allowing the model to dynamically select the most relevant expert(s) for each prediction.
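A minimal NumPy sketch of this gating mechanism follows. The lambda functions are toy stand-ins for the trained GRU, LSTM, and Dense experts, and the gate weights are random rather than learned; only the combination logic mirrors the description above:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Toy stand-ins for the three experts: each maps a window of recent
# glucose readings to a single predicted value.
experts = [
    lambda x: x[:, -1],                 # persistence-style expert
    lambda x: x.mean(axis=1),           # mean expert
    lambda x: 2 * x[:, -1] - x[:, -2],  # linear-trend expert
]

def moe_predict(x, gate_w, gate_b):
    """Weight each expert's output by input-dependent softmax gate scores."""
    weights = softmax(x @ gate_w + gate_b)          # (batch, n_experts)
    outputs = np.stack([e(x) for e in experts], 1)  # (batch, n_experts)
    return (weights * outputs).sum(axis=1)          # (batch,)

batch = rng.normal(120, 15, size=(4, 6))  # 4 windows of 6 CGM readings
gate_w = rng.normal(size=(6, 3)) * 0.01
gate_b = np.zeros(3)
preds = moe_predict(batch, gate_w, gate_b)
```

Because the softmax weights are non-negative and sum to one, each final prediction is a convex combination of the expert outputs, with the gate deciding per input which experts dominate.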
