Commit 0c0f691: Verification commit

AlexKoff88 committed Nov 9, 2023 (1 parent: 9ce9954)
Showing 2 changed files with 0 additions and 2 deletions.
README.md (1 change: 0 additions and 1 deletion)

@@ -11,7 +11,6 @@ Intel [Neural Compressor](https://www.intel.com/content/www/us/en/developer/tool
 [OpenVINO](https://docs.openvino.ai/latest/index.html) is an open-source toolkit that enables high performance inference capabilities for Intel CPUs, GPUs, and special DL inference accelerators ([see](https://docs.openvino.ai/latest/openvino_docs_OV_UG_supported_plugins_Supported_Devices.html) the full list of supported devices). It is supplied with a set of tools to optimize your models with compression techniques such as quantization, pruning and knowledge distillation. Optimum Intel provides a simple interface to optimize your Transformers and Diffusers models, convert them to the OpenVINO Intermediate Representation (IR) format and run inference using OpenVINO Runtime.
 
-
 
 ## Installation
 
 To install the latest release of 🤗 Optimum Intel with the corresponding required dependencies, you can use `pip` as follows:
optimum/intel/openvino/configuration.py (1 change: 0 additions and 1 deletion)

@@ -70,7 +70,6 @@
             "{re}.*conv_.*",
         ],
     },
-    "overflow_fix": "disable",
 }
 
 
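For context, the dict being trimmed here appears to be a default quantization config; a minimal sketch of its shape, where every name other than `"overflow_fix"` and the visible `"{re}.*conv_.*"` pattern is a hypothetical placeholder rather than the file's actual identifier:

```python
# Minimal sketch of the config dict edited by this commit. Only the
# "{re}.*conv_.*" pattern and the removed "overflow_fix" key are visible in
# the diff; the surrounding key names are hypothetical placeholders.
QUANTIZATION_CONFIG_SKETCH = {
    "scope_overrides": {        # hypothetical section name
        "patterns": [           # hypothetical key holding regex scope patterns
            "{re}.*conv_.*",    # the "{re}" prefix marks a regex pattern
        ],
    },
    # Removed by commit 0c0f691. With the key gone, the library falls back to
    # the quantization backend's default overflow-fix behavior instead of
    # pinning it to "disable":
    # "overflow_fix": "disable",
}

# The commit's net effect, expressed as a check on the dict:
assert "overflow_fix" not in QUANTIZATION_CONFIG_SKETCH
```

The practical consequence of the deletion is only that the explicit `"disable"` override goes away; whatever default the quantization backend ships for the overflow fix takes effect instead.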
