Awesome Mixture of Experts (MoE): A Curated List of Mixture of Experts (MoE) and Mixture of Multimodal Experts (MoME)
Updated Sep 25, 2024
This is the official source code of our IEA/AIE 2021 paper. It implements a gating network that fuses features from different modalities for object detection.
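A gating network of this kind typically scores each modality from a joint view of the inputs and combines the per-modality features as a softmax-weighted sum. The sketch below is a minimal, framework-free illustration of that idea, not the paper's implementation; the function names, parameter shapes, and the two-modality setup are all assumptions for the example.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def gated_fusion(features, gate_w, gate_b):
    """Fuse per-modality feature vectors with a softmax gate.

    features: list of equal-length feature vectors, one per modality.
    gate_w, gate_b: hypothetical gate parameters mapping the
    concatenated features to one scalar score per modality.
    """
    concat = np.concatenate(features)      # joint view of all modalities
    scores = gate_w @ concat + gate_b      # one score per modality
    weights = softmax(scores)              # convex combination weights
    fused = sum(w * f for w, f in zip(weights, features))
    return fused, weights

# Toy usage with two 4-dimensional modalities (e.g. RGB and depth).
rng = np.random.default_rng(0)
rgb, depth = rng.normal(size=4), rng.normal(size=4)
W, b = rng.normal(size=(2, 8)), np.zeros(2)
fused, weights = gated_fusion([rgb, depth], W, b)
```

Because the gate outputs a convex combination, the fused vector keeps the same dimensionality as each modality's features, and the weights expose which modality the network relied on for a given input.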