Update index.html
edwardmagongo authored Nov 1, 2024
1 parent 31b35ec commit 3a5b08a
Showing 1 changed file with 31 additions and 23 deletions.
@@ -736,7 +736,7 @@ <h3 class="title is-4">4.3.3 ResNet-18</h3>
<!-- Analysis of QIANets Framework -->
<div class="columns is-centered has-text-centered">
<div class="column is-four-fifths">
- <h2 class="title is-3">Conclusion</h2>
+ <h2 class="title is-3">Analysis of QIANets Framework</h2>
<div class="content has-text-justified">
<p>
The QIANets framework exhibits effective latency reductions across models, achieving compression ratios of x1.6 for ResNet,
@@ -783,7 +783,7 @@ <h2 class="title is-3">Limitations</h2>
While our results demonstrate the potential of QIANets and quantum-inspired principles in model compression, they also
reflect on several factors that influence the performance of our approach:
</p>
<ol>
<li>
<strong>Data Constraints:</strong> The evaluation was restricted to the relatively simple CIFAR-10 dataset,
which may not represent the full diversity, complexity, or scalability challenges encountered in
@@ -819,7 +819,8 @@ <h2 class="title is-3">Limitations</h2>
</div>
</section>
<!-- Acknowledgments and Disclosure of Funding. -->
- <div class="columns is-centered has-text-centered">
+ <div class="container is-max-desktop">
+ <div class="columns is-centered has-text-centered">
<div class="column is-four-fifths">
<h2 class="title is-3">Acknowledgments and Disclosure of Funding</h2>
<div class="content has-text-justified">
@@ -835,32 +836,39 @@ <h2 class="title is-3">Acknowledgments and Disclosure of Funding</h2>
</section>

<section class="section">
<div class="container is-max-desktop">
<!-- References -->
<div class="columns is-centered has-text-centered">
<div class="column is-four-fifths">
<h2 class="title is-3">References</h2>
<div class="content has-text-justified">
<ol>
- <li>Su, N. M., &amp; Crandall, D. J. (2021). The affective growth of computer vision. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 9291–9300. <a href="https://doi.org/10.1109/CVPR46437.2021.00929259">doi:10.1109/CVPR46437.2021.00929259</a></li>
- <li>Anumol, C. S. (2023, November). Advancements in CNN Architectures for Computer Vision: A Comprehensive Review. In 2023 Annual International Conference on Emerging Research Areas: International Conference on Intelligent Systems (AICERA/ICIS), pp. 1–7. IEEE. <a href="https://doi.org/10.1109/AICERA.2023.9580123263">doi:10.1109/AICERA.2023.9580123263</a></li>
- <li>He, K., Zhang, X., Ren, S., &amp; Sun, J. (2016). Deep residual learning for image recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 770–778. <a href="https://doi.org/10.1109/CVPR.2016.90">doi:10.1109/CVPR.2016.90</a></li>
- <li>Szegedy, C., Liu, W., Jia, Y., Sermanet, P., Reed, S., Anguelov, D., ... &amp; Rabinovich, A. (2015). Going deeper with convolutions. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 1–9. <a href="https://doi.org/10.1109/CVPR.2015.7298594">doi:10.1109/CVPR.2015.7298594</a></li>
- <li>Honegger, D., Oleynikova, H., &amp; Pollefeys, M. (2014, September). Real-time and low latency embedded computer vision hardware based on a combination of FPGA and mobile CPU. In 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 4930–4935. IEEE. <a href="https://doi.org/10.1109/IROS.2014.6943192">doi:10.1109/IROS.2014.6943192</a></li>
- <li>Li, Z., Li, H., &amp; Meng, L. (2023). Model compression for deep neural networks: A survey. Computers, 12(3), 60. <a href="https://doi.org/10.3390/computers12030060">doi:10.3390/computers12030060</a></li>
- <li>Divya, R., &amp; Peter, J. D. (2021, November). Quantum machine learning: A comprehensive review on optimization of machine learning algorithms. In 2021 Fourth International Conference on Microelectronics, Signals &amp; Systems (ICMSS), pp. 1–6. IEEE. <a href="https://doi.org/10.1109/ICMSS53240.2021.9532427">doi:10.1109/ICMSS53240.2021.9532427</a></li>
- <li>Pandey, S., Basisth, N. J., Sachan, T., Kumari, N., &amp; Pakray, P. (2023). Quantum machine learning for natural language processing application. Physica A: Statistical Mechanics and its Applications, 627, 129123. <a href="https://doi.org/10.1016/j.physa.2022.129123">doi:10.1016/j.physa.2022.129123</a></li>
- <li>Francy, S., &amp; Singh, R. (2024). Edge AI: Evaluation of Model Compression Techniques for Convolutional Neural Networks. arXiv preprint arXiv:2409.02134. <a href="https://arxiv.org/abs/2409.02134">arXiv:2409.02134</a></li>
- <li>Cheng, Y., Wang, D., Zhou, P., &amp; Zhang, T. (2017). A survey of model compression and acceleration for deep neural networks. arXiv preprint arXiv:1710.09282. In Proceedings of the IEEE Signal Processing Magazine, 35(1), 126–136. <a href="https://doi.org/10.1109/MSP.2017.2765695">doi:10.1109/MSP.2017.2765695</a></li>
- <li>Hou, Z., Qin, M., Sun, F., Ma, X., Yuan, K., Xu, Y., ... &amp; Kung, S. Y. (2022). Chex: Channel exploration for CNN model compression. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 12287–12298. <a href="https://doi.org/10.1109/CVPR52688.2022.01204">doi:10.1109/CVPR52688.2022.01204</a></li>
- <li>Han, S., Pool, J., Tran, J., &amp; Dally, W. (2015). Learning both weights and connections for efficient neural network. Advances in Neural Information Processing Systems, 28. NIPS Link</li>
- <li>Shi, S., Wang, Z., Cui, G., Wang, S., Shang, R., Li, W., ... &amp; Gu, Y. (2022). Quantum-inspired complex convolutional neural networks. Applied Intelligence, 52(15), 17912–17921. <a href="https://doi.org/10.1007/s10489-022-03525-3">doi:10.1007/s10489-022-03525-3</a></li>
- <li>Hu, Z., Dong, P., Wang, Z., Lin, Y., Wang, Y., &amp; Jiang, W. (2022, October). Quantum neural network compression. In Proceedings of the 41st IEEE/ACM International Conference on Computer-Aided Design, pp. 1–9. <a href="https://doi.org/10.1145/3508352.3549415">doi:10.1145/3508352.3549415</a></li>
- <li>Tomut, A., Jahromi, S. S., Singh, S., Ishtiaq, F., Muñoz, C., Bajaj, P. S., ... &amp; Orus, R. (2024). CompactifAI: Extreme Compression of Large Language Models using Quantum-Inspired Tensor Networks. arXiv preprint arXiv:2401.14109. <a href="https://arxiv.org/abs/2401.14109">arXiv:2401.14109</a></li>
- <li>LeCun, Y., Boser, B., Denker, J. S., Henderson, D., Howard, R. E., Hubbard, W., &amp; Jackel, L. D. (1989). Backpropagation applied to handwritten zip code recognition. Neural Computation, 1(4), 541–551. <a href="https://doi.org/10.1162/neco.1989.1.4.541">doi:10.1162/neco.1989.1.4.541</a></li>
- <li>Hanson, S., &amp; Pratt, L. (1988). Comparing biases for minimal network construction with back-propagation. Advances in Neural Information Processing Systems, 1. NIPS Link</li>
- <li>Hassibi, B., Stork, D. G., &amp; Wolff, G. J. (1993, March). Optimal brain surgeon and general network pruning. In IEEE International Conference on Neural Networks, pp. 293–299. IEEE. <a href="https://doi.org/10.1109/ICNN.1993.298572">doi:10.1109/ICNN.1993.298572</a></li>
- </ol>
+ <li>Su, N. M., &amp; Crandall, D. J. (2021). The affective growth of computer vision. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 9291–9300. <a href="https://doi.org/10.1109/CVPR46437.2021.00917">https://doi.org/10.1109/CVPR46437.2021.00917</a></li>
+ <li>Huang, G., Liu, Z., Van Der Maaten, L., &amp; Weinberger, K. Q. (2017). Densely connected convolutional networks. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 4700–4708. <a href="https://doi.org/10.1109/CVPR.2017.243">https://doi.org/10.1109/CVPR.2017.243</a></li>
+ <li>Szegedy, C., Liu, W., Jia, Y., Sermanet, P., Reed, S., Anguelov, D., ... &amp; Rabinovich, A. (2015). Going deeper with convolutions. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 1–9. <a href="https://doi.org/10.1109/CVPR.2015.7298594">https://doi.org/10.1109/CVPR.2015.7298594</a></li>
+ <li>He, K., Zhang, X., Ren, S., &amp; Sun, J. (2016). Deep residual learning for image recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 770–778. <a href="https://doi.org/10.1109/CVPR.2016.90">https://doi.org/10.1109/CVPR.2016.90</a></li>
+ <li>Anumol, C. S. (2023, November). Advancements in CNN Architectures for Computer Vision: A Comprehensive Review. In 2023 Annual International Conference on Emerging Research Areas: International Conference on Intelligent Systems (AICERA/ICIS), pp. 1–7. IEEE. <a href="https://doi.org/10.1109/AICERA/ICIS59538.2023.10420413">https://doi.org/10.1109/AICERA/ICIS59538.2023.10420413</a></li>
+ <li>Honegger, D., Oleynikova, H., &amp; Pollefeys, M. (2014, September). Real-time and low latency embedded computer vision hardware based on a combination of FPGA and mobile CPU. In 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 4930–4935. IEEE. <a href="https://doi.org/10.1109/IROS.2014.6943263">https://doi.org/10.1109/IROS.2014.6943263</a></li>
+ <li>Li, Z., Li, H., &amp; Meng, L. (2023). Model compression for deep neural networks: A survey. Computers, 12(3), 60. <a href="https://doi.org/10.3390/computers12030060">https://doi.org/10.3390/computers12030060</a></li>
+ <li>Divya, R., &amp; Peter, J. D. (2021, November). Quantum machine learning: A comprehensive review on optimization of machine learning algorithms. In 2021 Fourth International Conference on Microelectronics, Signals &amp; Systems (ICMSS), pp. 1–6. IEEE. <a href="https://doi.org/10.1109/ICMSS53060.2021.9673630">https://doi.org/10.1109/ICMSS53060.2021.9673630</a></li>
+ <li>Pandey, S., Basisth, N. J., Sachan, T., Kumari, N., &amp; Pakray, P. (2023). Quantum machine learning for natural language processing application. Physica A: Statistical Mechanics and its Applications, 627, 129123. <a href="https://doi.org/10.1016/j.physa.2023.129123">https://doi.org/10.1016/j.physa.2023.129123</a></li>
+ <li>Francy, S., &amp; Singh, R. (2024). Edge AI: Evaluation of Model Compression Techniques for Convolutional Neural Networks. arXiv preprint arXiv:2409.02134. <a href="https://doi.org/10.48550/arXiv.2409.02134">https://doi.org/10.48550/arXiv.2409.02134</a></li>
+ <li>Cheng, Y., Wang, D., Zhou, P., &amp; Zhang, T. (2017). A survey of model compression and acceleration for deep neural networks. arXiv preprint arXiv:1710.09282. IEEE Signal Processing Magazine, 35(1), 126–136. <a href="https://doi.org/10.1109/MSP.2017.2765695">https://doi.org/10.1109/MSP.2017.2765695</a></li>
+ <li>Hou, Z., Qin, M., Sun, F., Ma, X., Yuan, K., Xu, Y., ... &amp; Kung, S. Y. (2022). Chex: Channel exploration for CNN model compression. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 12287–12298. <a href="https://doi.org/10.1109/CVPR52688.2022.01197">https://doi.org/10.1109/CVPR52688.2022.01197</a></li>
+ <li>Han, S., Pool, J., Tran, J., &amp; Dally, W. (2015). Learning both weights and connections for efficient neural network. Advances in Neural Information Processing Systems, 28. <a href="https://papers.nips.cc/paper/2015/hash/3e15cc52c93b93997f4c62e87d3d0ad7-Abstract.html">NIPS Link</a></li>
+ <li>Shi, S., Wang, Z., Cui, G., Wang, S., Shang, R., Li, W., ... &amp; Gu, Y. (2022). Quantum-inspired complex convolutional neural networks. Applied Intelligence, 52(15), 17912–17921. <a href="https://doi.org/10.1007/s10489-022-03525-0">https://doi.org/10.1007/s10489-022-03525-0</a></li>
+ <li>Hu, Z., Dong, P., Wang, Z., Lin, Y., Wang, Y., &amp; Jiang, W. (2022, October). Quantum neural network compression. In Proceedings of the 41st IEEE/ACM International Conference on Computer-Aided Design, pp. 1–9. <a href="https://doi.org/10.1145/3508352.3549382">https://doi.org/10.1145/3508352.3549382</a></li>
+ <li>Tomut, A., Jahromi, S. S., Singh, S., Ishtiaq, F., Muñoz, C., Bajaj, P. S., ... &amp; Orus, R. (2024). CompactifAI: Extreme Compression of Large Language Models using Quantum-Inspired Tensor Networks. arXiv preprint arXiv:2401.14109. <a href="https://doi.org/10.48550/arXiv.2401.14109">https://doi.org/10.48550/arXiv.2401.14109</a></li>
+ <li>LeCun, Y., Boser, B., Denker, J. S., Henderson, D., Howard, R. E., Hubbard, W., &amp; Jackel, L. D. (1989). Backpropagation applied to handwritten zip code recognition. Neural Computation, 1(4), 541–551. <a href="https://doi.org/10.1162/neco.1989.1.4.541">https://doi.org/10.1162/neco.1989.1.4.541</a></li>
+ <li>Hanson, S., &amp; Pratt, L. (1988). Comparing biases for minimal network construction with back-propagation. Advances in Neural Information Processing Systems, 1.</li>
+ <li>Hassibi, B., Stork, D. G., &amp; Wolff, G. J. (1993, March). Optimal brain surgeon and general network pruning. In IEEE International Conference on Neural Networks, pp. 293–299. IEEE.</li>
+ <li>Lécuyer, M., McDaniel, P., &amp; Papernot, N. (2019). Privacy-preserving and attack-resilient deep learning via compression. arXiv preprint arXiv:1902.00838.</li>
+ <li>Rumelhart, D. E., Hinton, G. E., &amp; Williams, R. J. (1986). Learning representations by back-propagating errors. Nature, 323(6088), 533–536. <a href="https://doi.org/10.1038/323533a0">https://doi.org/10.1038/323533a0</a></li>
+ <li>LeCun, Y., Bengio, Y., &amp; Hinton, G. (2015). Deep learning. Nature, 521(7553), 436–444. <a href="https://doi.org/10.1038/nature14539">https://doi.org/10.1038/nature14539</a></li>
+ <li>Goodfellow, I., Bengio, Y., &amp; Courville, A. (2016). Deep learning. MIT Press.</li>
+ <li>Hinton, G. E., &amp; Salakhutdinov, R. R. (2006). Reducing the dimensionality of data with neural networks. Science, 313(5786), 504–507. <a href="https://doi.org/10.1126/science.1127647">https://doi.org/10.1126/science.1127647</a></li>
+ <li>Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A. N., ... &amp; Polosukhin, I. (2017). Attention is all you need. Advances in Neural Information Processing Systems, 30. <a href="https://papers.nips.cc/paper/2017/hash/3f5ee243547dee91fbd053c1c4a845aa-Abstract.html">NIPS Link</a></li>
+ <li>Krizhevsky, A., Sutskever, I., &amp; Hinton, G. E. (2012). ImageNet classification with deep convolutional neural networks. Advances in Neural Information Processing Systems, 25.</li>
+ </ol>
</div>
</div>
</div>
