Awesome Mixture of Experts (MoE): A Curated List of Mixture of Experts (MoE) and Mixture of Multimodal Experts (MoME)
-
Updated Sep 19, 2024
- A curated list of foundation models for vision and language tasks
- [ICML 2024] Probabilistic Conceptual Explainers: Trustworthy Conceptual Explanations for Vision Foundation Models
- The Python package for Helical
- [MICCAI 2024] Official code for "BAPLe: Backdoor Attacks on Medical Foundation Models using Prompt Learning"
- A curated list of the latest advances on foundation models with federated learning
- From images to inference with no labeling: use foundation models to train supervised models
- SaprotHub: Making Protein Modeling Accessible to All Biologists
- A paper list on data contamination in large language model evaluation
- [ECCV 2024] Video foundation models and data for multimodal understanding
- Making large AI models cheaper, faster, and more accessible
- Generative AI: use Watsonx to answer natural language questions with retrieval-augmented generation (RAG)
- The codebase for the book "AI-Powered Search" (Manning Publications, 2024)
- Implementation of the Aurora model for atmospheric forecasting
- Papers, code, datasets, applications, and tutorials
- ChromBERT: A pre-trained foundation model for context-specific transcription regulatory networks
- World-model-based autonomous driving platform in CARLA 🚗
- Foundation model benchmarking tool: run any model on any AWS platform and benchmark performance across instance types and serving-stack options
- A toolkit for developing foundation models from Electronic Health Record (EHR) data
- Official implementation of the ICLR 2024 paper "Contrastive Learning Is Spectral Clustering On Similarity Graph" (https://arxiv.org/abs/2303.15103)
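Since this list centers on Mixture of Experts, here is a minimal sketch of the core idea behind most of the projects above: a gating network routes each token to its top-k experts and mixes their outputs. All dimensions, names, and the use of plain linear experts are illustrative assumptions for this sketch, not taken from any repository in the list.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy sizes -- real MoE layers use far larger values.
d_model, n_experts, top_k = 8, 4, 2

# Each "expert" is a simple linear map here; real experts are usually MLPs.
experts = [rng.standard_normal((d_model, d_model)) * 0.1 for _ in range(n_experts)]
gate_w = rng.standard_normal((d_model, n_experts)) * 0.1

def moe_forward(x):
    """Route one token vector x through its top-k experts (sparse MoE)."""
    logits = x @ gate_w                       # gating scores, one per expert
    top = np.argsort(logits)[-top_k:]         # indices of the k highest-scoring experts
    weights = np.exp(logits[top] - logits[top].max())
    weights /= weights.sum()                  # softmax renormalized over the top-k only
    # Only the selected experts run -- the source of MoE's compute savings.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

x = rng.standard_normal(d_model)
y = moe_forward(x)
print(y.shape)  # same shape as the input token vector
```

The gate's softmax is taken over only the selected experts, so the unselected ones contribute nothing and need not be evaluated at all.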