diff --git a/README.md b/README.md
index 5e45a86..ccef212 100644
--- a/README.md
+++ b/README.md
@@ -23,7 +23,7 @@
 - [Intel® Distribution of OpenVINO™ Toolkit](#intel-distribution-of-openvino-toolkit)
 - [Intel® Movidius™ Neural Compute Stick 2](#intel-movidius-neural-compute-stick-2)
 - [Acute Lymphoblastic Leukemia oneAPI Classifier 2021](#acute-lymphoblastic-leukemia-oneapi-classifier-2021)
-  - [GETTING STARTED](#getting-started)
+- [GETTING STARTED](#getting-started)
 - [Contributing](#contributing)
 - [Contributors](#contributors)
 - [Versioning](#versioning)
@@ -100,10 +100,9 @@ Source: [Acute Lymphoblastic Leukemia Tensorflow Classifier 2020](https://github
 [Intel® Optimization for TensorFlow](https://software.intel.com/content/www/us/en/develop/articles/intel-optimization-for-tensorflow-installation-guide.html) optimizes the popular Tensorflow framework using Intel® Math Kernel Library for Deep Neural Networks (Intel® MKL-DNN). Intel® MKL-DNN is an open-source library for enhancing performance by accelerating deep learning libraries such as Tensorflow on Intel architecture.
 
 ## Intel® Distribution of OpenVINO™ Toolkit
-[Intel® Distribution of OpenVINO™ Toolkit](https://software.intel.com/content/www/us/en/develop/tools/openvino-toolkit.html) is based on Convolutional Neural Networks and optimizes models used on Intel CPUs/GPUs, VPUs, FPGA etc. Models are converted to [Interassetste Representations (IR)](https://docs.openvinotoolkit.org/latest/openvino_docs_MO_DG_IR_and_opsets.html) which allow them to be used with the [Inference Engine](https://docs.openvinotoolkit.org/2020.2/_docs_IE_DG_Deep_Learning_Inference_Engine_DevGuide.html).
+[Intel® Distribution of OpenVINO™ Toolkit](https://software.intel.com/content/www/us/en/develop/tools/openvino-toolkit.html) is based on Convolutional Neural Networks and optimizes models used on Intel CPUs/GPUs, VPUs, FPGA etc. Models are converted to [Intermediate Representations (IR)](https://docs.openvinotoolkit.org/latest/openvino_docs_MO_DG_IR_and_opsets.html) which allow them to be used with the [Inference Engine](https://docs.openvinotoolkit.org/2020.2/_docs_IE_DG_Deep_Learning_Inference_Engine_DevGuide.html).
 
 ## Intel® Movidius™ Neural Compute Stick 2
-(https://software.intel.com/content/www/us/en/develop/hardware/neural-compute-stick.html)
 The [Intel® Movidius™ Neural Compute Stick 2](https://software.intel.com/content/www/us/en/develop/hardware/neural-compute-stick.html) is a USB plug & play AI device for deep learning inference at the edge. Combined with the Intel® OpenVINO™ Toolkit, developers can develop, fine-tune, and deploy convolutional neural networks (CNNs) on low-power applications that require real-time inference.
 
 &nbsp;
@@ -122,7 +121,9 @@ To create the newly improved **Acute Lymphoblastic Leukemia oneAPI Classifier 20
 - Test the model using commandline and classify unseen data using HTTP requests to a local API endpoint and via the HIAS network.
 
-## GETTING STARTED
+&nbsp;
+
+# GETTING STARTED
 
 Ready to get started ? Head over to the [Getting Started guide](documentation/getting-started.md) for instructions on how to download/install and setup the Acute Lymphoblastic Leukemia oneAPI Classifier 2021.
 
 &nbsp;