A description of this project is at https://www.isaacleonard.com/ml/distributed/.

Install Rust via rustup: https://rustup.rs/
You will need nightly Rust:
rustup default nightly
and make sure it is up to date:
rustup update
Now you can build the project:
cargo build --release
See how fast different word sizes are on your machine. (Replace "zen+" with the appropriate arch name and the quoted string with your CPU model; the results are written to the JSON file named in the last argument.)
./target/release/count_bits_benchmark 20 zen+ "Threadripper 2950X 16-Core" word_perf_zen+.json
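For intuition, the work being timed is presumably counting set bits over machine words of different widths. A minimal Rust sketch of that kind of measurement follows; this is illustrative only, not the repo's actual benchmark code, and the array size and fill constant are arbitrary:

```rust
use std::time::Instant;

/// Sum of set bits across a slice of 64-bit words.
fn popcount_sum_u64(words: &[u64]) -> u32 {
    words.iter().map(|w| w.count_ones()).sum()
}

/// Same bits, but stored as 8-bit words.
fn popcount_sum_u8(bytes: &[u8]) -> u32 {
    bytes.iter().map(|b| b.count_ones()).sum()
}

fn main() {
    let n: u64 = 1 << 20;
    // Arbitrary pseudo-random fill so the popcounts are nontrivial.
    let words: Vec<u64> = (0..n).map(|i| i.wrapping_mul(0x9E37_79B9_7F4A_7C15)).collect();
    let bytes: Vec<u8> = words.iter().flat_map(|w| w.to_le_bytes()).collect();

    let t = Instant::now();
    println!("u64: sum {} in {:?}", popcount_sum_u64(&words), t.elapsed());

    let t = Instant::now();
    println!("u8:  sum {} in {:?}", popcount_sum_u8(&bytes), t.elapsed());
}
```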
Plot the numbers (replace 3000 with a scaling factor appropriate for your machine):
./target/release/plot_word_perf 3000 word_perf_zen+.json word_perf_zen+.png
Finally, stack some layers and measure accuracy on a training set of your choosing:
./target/release/layer_demo 20 "/path/to_some/big/chunk/of/text.txt"
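As background for what a stack of binary layers computes, the XNOR-Net-style papers listed below replace floating-point dot products with XNOR followed by popcount over bit-packed activations and weights. A minimal sketch under that assumption (illustrative only, not the repo's actual layer code):

```rust
/// Bitwise "dot product": XNOR marks matching bit positions, popcount sums them.
fn binary_dot(input: &[u64], weights: &[u64]) -> u32 {
    input
        .iter()
        .zip(weights)
        .map(|(i, w)| (!(i ^ w)).count_ones())
        .sum()
}

/// One output bit per weight row: fire if more than half of the bits match.
fn binary_layer(input: &[u64], weight_rows: &[Vec<u64>]) -> Vec<bool> {
    let n_bits = (input.len() * 64) as u32;
    weight_rows
        .iter()
        .map(|row| binary_dot(input, row) > n_bits / 2)
        .collect()
}

fn main() {
    let input = vec![0b1010_u64, u64::MAX];
    let rows = vec![
        vec![0b1010_u64, u64::MAX], // identical to the input: fires
        vec![!0b1010_u64, 0],       // complement of the input: does not fire
    ];
    println!("{:?}", binary_layer(&input, &rows)); // [true, false]
}
```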
Or run setup.sh, which chains the steps above.
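For reference, a script covering those steps might look roughly like the following; the actual setup.sh is not reproduced here, and the arch name, CPU string, scaling factor, and text path are the same placeholders as above:

```sh
#!/usr/bin/env bash
# Rough sketch only; the repo's real setup.sh may differ.
set -euo pipefail

rustup default nightly
rustup update
cargo build --release

# Placeholders: replace "zen+", the CPU string, 3000, and the text path.
./target/release/count_bits_benchmark 20 zen+ "Threadripper 2950X 16-Core" word_perf_zen+.json
./target/release/plot_word_perf 3000 word_perf_zen+.json word_perf_zen+.png
./target/release/layer_demo 20 "/path/to_some/big/chunk/of/text.txt"
```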
Related reading:

- Binarized Neural Networks: Training Deep Neural Networks with Weights and Activations Constrained to +1 or -1
- XNOR-Net: ImageNet Classification Using Binary Convolutional Neural Networks
- Transfer Learning with Binary Neural Networks
- Combinatorial Attacks on Binarized Neural Networks
- FINN: A Framework for Fast, Scalable Binarized Neural Network Inference
- Probabilistic Binary Neural Networks
- A Review of Binarized Neural Networks
- Binary Neural Networks: A Survey
- Enabling Binary Neural Network Training on the Edge
- https://openreview.net/pdf?id=ryxfHnCctX
- Ternary Weight Networks
- Ternary Neural Networks with Fine-Grained Quantization
- Ternary Neural Networks for Resource-Efficient AI Applications
- Unrolling Ternary Neural Networks
- Two-Bit Networks for Deep Learning on Resource-Constrained Embedded Devices
- Recursive Binary Neural Network Learning Model with 2-bit/weight Storage Requirement
- Welcoming the Era of Deep Neuroevolution (Uber)
- Deep Neuroevolution: Genetic Algorithms Are a Competitive Alternative for Training Deep Neural Networks for Reinforcement Learning
- Evolving Deep Neural Networks
- Evolution Strategies as a Scalable Alternative to Reinforcement Learning
- AdaComp: Adaptive Residual Gradient Compression for Data-Parallel Distributed Training
- Deep Clustering for Unsupervised Learning of Visual Features
- Learning Feature Representations with K-means
- Convolutional Clustering for Unsupervised Learning
- https://arxiv.org/abs/1901.06656
- https://arxiv.org/abs/1803.09522
- https://arxiv.org/abs/1812.11446
- https://arxiv.org/abs/1608.05343
- https://arxiv.org/abs/1310.6343
- A Theory of Local Learning, the Learning Channel, and the Optimality of Backpropagation
- https://arxiv.org/abs/1905.11786
- Loc Quang Trinh's MIT thesis: https://dspace.mit.edu/bitstream/handle/1721.1/123128/1128279897-MIT.pdf
- Neural Network with Binary Activations for Efficient Neuromorphic Computing