Scaling Data-Constrained Language Models
Reproducible scaling laws for contrastive language-image learning (https://arxiv.org/abs/2212.07143)
🔥🔥🔥 Latest Advances on Large Recommendation Models
[NeurIPS'24 Spotlight] Observational Scaling Laws
A toolkit for scaling law research ⚖
Dimensionless learning code for the paper "Data-driven discovery of dimensionless numbers and governing laws from scarce measurements".
Code for reproducing the experiments on large-scale pre-training and transfer learning for the paper "Effect of large-scale pre-training on full and few-shot transfer learning for natural and medical images" (https://arxiv.org/abs/2106.00116)
Official code for the paper, "Scaling Offline Model-Based RL via Jointly-Optimized World-Action Model Pretraining"
[ICML 2023] "Data Efficient Neural Scaling Law via Model Reusing" by Peihao Wang, Rameswar Panda, Zhangyang Wang
[NeurIPS 2023] Multi-fidelity hyperparameter optimization with deep power laws that achieves state-of-the-art results across diverse benchmarks.
Code for "Scaling Laws for Language Transfer Learning".
A method for calculating scaling laws for LLMs from publicly available models (see the curve-fitting sketch after this list).
First temporal graph foundation model dataset and benchmark
Code for the CoNLL BabyLM workshop paper "Mini Minds: Exploring Bebeshka and Zlata Baby Models".
🌹 [ICML 2024] Selecting Large Language Model to Fine-tune via Rectified Scaling Law
A scaling-laws web calculator for estimating a model's training compute (FLOPs), cost, and energy use (see the compute-estimate sketch after this list).
A repository containing the code, data, and visualizations I made as part of my senior capstone project.
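Several of the entries above (the scaling-law research toolkit and the method for calculating scaling laws from publicly available models) come down to fitting a power law to observed (model size, loss) points. The sketch below shows that basic step with SciPy under illustrative assumptions: the functional form L(N) = a * N^(-alpha) + c and the data points are placeholders, not code or results from any of the linked repositories.

```python
# Minimal sketch: fit a saturating power law L(N) = a * N**(-alpha) + c
# to (parameter count, loss) pairs. Data points below are synthetic placeholders.
import numpy as np
from scipy.optimize import curve_fit

def power_law(n, a, alpha, c):
    # Loss as a function of model size N, with an irreducible term c.
    return a * np.power(n, -alpha) + c

# Hypothetical (model size in parameters, eval loss) observations.
sizes = np.array([1.2e8, 3.5e8, 1.3e9, 6.7e9, 1.3e10])
losses = np.array([3.95, 3.62, 3.28, 2.95, 2.82])

params, _ = curve_fit(power_law, sizes, losses, p0=[10.0, 0.1, 2.0], maxfev=10000)
a, alpha, c = params
print(f"fitted: L(N) = {a:.2f} * N^(-{alpha:.3f}) + {c:.2f}")
print(f"extrapolated loss at 70B params: {power_law(7e10, *params):.2f}")
```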
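For the web-calculator entry, a rough sense of what such a tool computes: the widely used C ≈ 6·N·D approximation for transformer training FLOPs, converted into GPU-hours, dollar cost, and energy. This is a minimal sketch, not the calculator's actual implementation; the hardware throughput, utilization, price, and power figures are illustrative assumptions.

```python
# Minimal sketch of a training-compute estimate using C ~= 6 * N * D FLOPs.
# Hardware and pricing defaults below are assumptions, not values from the repo.
def training_estimate(n_params: float, n_tokens: float,
                      gpu_flops_per_s: float = 312e12,   # assumed A100 BF16 peak
                      utilization: float = 0.4,          # assumed model FLOPs utilization
                      usd_per_gpu_hour: float = 2.0,     # assumed cloud price
                      gpu_power_kw: float = 0.4):        # assumed ~400 W per GPU
    flops = 6.0 * n_params * n_tokens                    # forward + backward estimate
    gpu_hours = flops / (gpu_flops_per_s * utilization) / 3600.0
    return {
        "flops": flops,
        "gpu_hours": gpu_hours,
        "cost_usd": gpu_hours * usd_per_gpu_hour,
        "energy_kwh": gpu_hours * gpu_power_kw,
    }

# Example: a 7B-parameter model trained on 1T tokens (~4.2e22 FLOPs).
print(training_estimate(7e9, 1e12))
```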