
OpenVINO™ Execution Provider for ONNXRuntime 5.6

@vthaniel vthaniel released this 04 Apr 13:30
e0b66ca

Description:
OpenVINO™ Execution Provider for ONNXRuntime v5.6 release, based on the OpenVINO™ 2025.0.0 release and the ONNXRuntime 1.21.0 release.

For all the latest information, refer to our official documentation:
https://onnxruntime.ai/docs/execution-providers/OpenVINO-ExecutionProvider.html

This release supports ONNXRuntime 1.21.0 with the latest OpenVINO™ 2025.0 release.
Please refer to the OpenVINO™ Execution Provider for ONNXRuntime build instructions for information on system prerequisites and instructions for building from source.
https://onnxruntime.ai/docs/build/eps.html#openvino

Modifications:

Supports OpenVINO™ 2025.0.0.
Supports the EP weight sharing feature, enabling weights to be shared between two models.
Adds support for contrib operators.
Extends support for creating EP context nodes for models undergoing subgraph partitioning.
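As a sketch of the EP context feature listed above, ONNX Runtime exposes session configuration entries for dumping a precompiled "EP context" model. The entry names below are ONNX Runtime session config keys; the model paths are placeholders, and the snippet guards the session creation so it only runs where onnxruntime-openvino and a real model are present.

```python
# Session config entries for EP context export (paths are placeholders).
config = {
    "ep.context_enable": "1",                  # emit EPContext nodes
    "ep.context_file_path": "model_ctx.onnx",  # where the context model is written
    "ep.context_embed_mode": "1",              # embed the compiled blob in the node
}

try:
    import onnxruntime as ort

    so = ort.SessionOptions()
    for key, value in config.items():
        so.add_session_config_entry(key, value)
    sess = ort.InferenceSession(
        "model.onnx",  # placeholder model path
        sess_options=so,
        providers=["OpenVINOExecutionProvider"],
    )
except Exception:
    sess = None  # requires onnxruntime-openvino and a real model file
```

On a subsequent run, loading the generated context model avoids recompiling the subgraphs that the OpenVINO EP claimed.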

Samples:
https://github.com/microsoft/onnxruntime-inference-examples

Python Package:
https://pypi.org/project/onnxruntime-openvino/

Installation and usage instructions on Windows:

pip install onnxruntime-openvino

# If using the OpenVINO Python package to set up the OpenVINO runtime environment:
pip install openvino==2025.0.0

# Add these two lines to the application code:
import onnxruntime.tools.add_openvino_win_libs as utils
utils.add_openvino_libs_to_path()
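Putting the steps above together, a minimal inference sketch through the OpenVINO EP on Windows might look as follows. The model path, input name, and input shape are placeholders for a real model, and the whole flow is guarded so it degrades gracefully where the packages or model are absent.

```python
# Minimal sketch: inference via the OpenVINO Execution Provider on Windows.
# "model.onnx", "input", and the shape (1, 3, 224, 224) are placeholders.
try:
    import numpy as np
    import onnxruntime as ort
    import onnxruntime.tools.add_openvino_win_libs as utils

    utils.add_openvino_libs_to_path()  # locate OpenVINO DLLs from the pip package
    sess = ort.InferenceSession(
        "model.onnx",
        providers=["OpenVINOExecutionProvider"],
    )
    x = np.zeros((1, 3, 224, 224), dtype=np.float32)
    outputs = sess.run(None, {"input": x})  # list of output arrays
except Exception:
    outputs = None  # requires onnxruntime-openvino, numpy, and a real model
```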

C# Package:
Download the Microsoft.ML.OnnxRuntime.Managed NuGet package from the link below
https://www.nuget.org/packages/Microsoft.ML.OnnxRuntime.Managed
and use it with the Intel.ML.OnnxRuntime.OpenVino NuGet package from the link below
https://www.nuget.org/packages/Intel.ML.OnnxRuntime.OpenVino

ONNXRuntime API usage:
Please refer to the link below for the Python/C++ APIs:
https://onnxruntime.ai/docs/execution-providers/OpenVINO-ExecutionProvider.html#configuration-options
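As an illustration of the configuration options documented at the link above, provider options are passed per provider when creating a session. The option names below (device_type, cache_dir) follow the OpenVINO EP documentation; the chosen values are illustrative, and the session creation is guarded since it needs onnxruntime-openvino and a real model.

```python
# Sketch: selecting an OpenVINO target device via provider options.
# Option names follow the OpenVINO EP configuration-options docs;
# the values here are illustrative.
ov_options = {
    "device_type": "GPU",     # e.g. "CPU", "GPU", "NPU"
    "cache_dir": "ov_cache",  # reuse compiled blobs across runs
}

try:
    import onnxruntime as ort

    sess = ort.InferenceSession(
        "model.onnx",  # placeholder model path
        providers=["OpenVINOExecutionProvider"],
        provider_options=[ov_options],
    )
except Exception:
    sess = None  # requires onnxruntime-openvino and a real model file
```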