OpenVINO™ Runtime
Intel® Distribution of OpenVINO™ toolkit is an open-source toolkit for optimizing and deploying AI inference. It can be used to develop applications and solutions for deep learning tasks such as emulation of human vision, automatic speech recognition, natural language processing, and recommendation systems. It offers high performance and a rich set of deployment options, from edge to cloud.
If you have already finished developing your models and converting them to the OpenVINO model format, you can install OpenVINO Runtime to deploy your applications on various devices. The OpenVINO™ Python package includes a set of libraries for easy inference integration with your products.
Before you start the installation, check the supported operating systems and required Python* versions. The complete list of supported hardware is available in the System Requirements.
C++ libraries are also required for installation on Windows*. To install them, you can download the Visual Studio Redistributable file (.exe).
NOTE: This package can be installed on other versions of Linux and Windows OSes, but only the specific versions above are fully validated.
Use a virtual environment to avoid dependency conflicts.
To create a virtual environment, use the following commands:
On Windows:
python -m venv openvino_env
On Linux and macOS:
python3 -m venv openvino_env
NOTE: On Linux and macOS, you may need to install pip. For example, on Ubuntu execute the following command to get pip installed:
sudo apt install python3-venv python3-pip
To activate the virtual environment, use the following commands:
On Windows:
openvino_env\Scripts\activate
On Linux and macOS:
source openvino_env/bin/activate
Upgrade pip to the latest version by running the command below:
python -m pip install --upgrade pip
Install the OpenVINO package by running the command below:
pip install openvino
Verify that the package is properly installed by running the command below:
python -c "from openvino.runtime import Core; print(Core().available_devices)"
If installation was successful, you will see the list of available devices.
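Once the devices are listed, you can integrate inference into your application. The following is a minimal, illustrative sketch of loading and running a converted model; the file name model.xml, the CPU device choice, and the zero-filled input are placeholders, and the sketch assumes the model has static input shapes:

```python
# Minimal inference sketch (illustrative only).
# "model.xml" is a placeholder for a model already converted to OpenVINO IR;
# ONNX files can also be read directly. Assumes static input shapes and
# a float32 input; adjust for your own model.
import numpy as np
from openvino.runtime import Core

core = Core()
model = core.read_model("model.xml")
compiled_model = core.compile_model(model, "CPU")  # "GPU" or "AUTO" are other options

# Build a dummy input matching the first input's shape.
input_shape = list(compiled_model.input(0).shape)
dummy_input = np.zeros(input_shape, dtype=np.float32)

# Run a synchronous inference and inspect the first output.
request = compiled_model.create_infer_request()
request.infer([dummy_input])
print(request.get_output_tensor(0).data.shape)
```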
Component | Content | Description |
---|---|---|
OpenVINO Runtime | openvino package | OpenVINO Runtime is a set of C++ libraries with C and Python bindings providing a common API to deliver inference solutions on the platform of your choice. Use the OpenVINO Runtime API to read PyTorch*, TensorFlow*, TensorFlow Lite*, ONNX*, and PaddlePaddle* models and execute them on preferred devices. OpenVINO Runtime uses a plugin architecture and includes the following plugins: CPU, GPU, Auto Batch, Auto, Hetero. |
OpenVINO Model Converter (OVC) | ovc | OpenVINO Model Converter converts models that were trained in popular frameworks to a format usable by OpenVINO components. Supported frameworks include ONNX*, TensorFlow*, TensorFlow Lite*, and PaddlePaddle*. |
Benchmark Tool | benchmark_app | Benchmark Application allows you to estimate deep learning inference performance on supported devices for synchronous and asynchronous modes. |
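For illustration, a model can be converted either with the ovc command-line tool or, in recent releases, through the Python conversion API; the sketch below assumes OpenVINO 2023.1 or newer, and the file names are placeholders:

```python
# Illustrative conversion sketch, assuming OpenVINO 2023.1 or newer.
# "model.onnx" and "model.xml" are placeholder file names.
import openvino as ov

ov_model = ov.convert_model("model.onnx")  # TensorFlow, TF Lite, and PaddlePaddle models are also supported
ov.save_model(ov_model, "model.xml")       # produces model.xml and model.bin (OpenVINO IR)
```

The resulting IR can then be passed to the Benchmark Tool, for example with benchmark_app -m model.xml, to estimate inference performance on a chosen device.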
For general troubleshooting steps and issues, see Troubleshooting Guide for OpenVINO Installation. The following sections also provide explanations of several error messages.
Users in China might encounter errors while downloading sources via PIP during OpenVINO™ installation. To resolve the issues, try the following solution:
Add the download source using the -i parameter with the Python pip command. For example:
pip install openvino -i https://mirrors.aliyun.com/pypi/simple/
Use the --trusted-host parameter if the URL above is http instead of https.
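For instance, assuming a hypothetical mirror reachable only over http, the two parameters can be combined as follows (the URL is an illustration, not a real mirror):
pip install openvino -i http://mirrors.example.com/pypi/simple/ --trusted-host mirrors.example.com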
On Windows*, additional C++ libraries are necessary to run OpenVINO. To resolve this issue, install the C++ redistributable (.exe). You can also view the full download list on the official support page.
To resolve a missing external dependency on Ubuntu*, execute the following command:
sudo apt-get install libpython3.8
Copyright © 2018-2023 Intel Corporation
LEGAL NOTICE: Your use of this software and any required dependent software (the “Software Package”) is subject to the terms and conditions of the Apache 2.0 License for the Software Package, which may also include notices, disclaimers, or license terms for third party or open source software included in or with the Software Package, and your use indicates your acceptance of all such terms. Please refer to the “third-party-programs.txt” or other similarly-named text file included with the Software Package for additional details.
Intel is committed to the respect of human rights and avoiding complicity in human rights abuses, a policy reflected in the Intel Global Human Rights Principles. Accordingly, by accessing the Intel material on this platform you agree that you will not use the material in a product or application that causes or contributes to a violation of an internationally recognized human right.