MNN: A blazing-fast, lightweight inference engine battle-tested by Alibaba, powering high-performance on-device LLMs and Edge AI.
Updated Mar 6, 2026 · C++
Official implementation of "Searching for Winograd-aware Quantized Networks" (MLSys'20)
Different matrix multiplication implementations and benchmarks on CPUs
Winograd convolution implementation in C++
Implementation of parallel algorithms, with a comparison of their speed against sequential implementations. Algorithms: Ant Colony Optimization, the Gauss algorithm, and the Winograd algorithm
This repository contains algorithms I have implemented with multithreading
Winograd with Strassen algorithm implementation