docs/build/eps.md
See more information on the OpenVINO™ Execution Provider [here](../execution-providers/OpenVINO-ExecutionProvider.md).
### Prerequisites
{: .no_toc }
1. Install the OpenVINO™ offline/online installer from the Intel<sup>®</sup> Distribution of OpenVINO™ Toolkit **Release 2025.3** for the appropriate OS and target hardware:
Follow the [documentation](https://docs.openvino.ai/2025/index.html) for detailed instructions.

*2025.3 is the current recommended OpenVINO™ version. [OpenVINO™ 2025.0](https://docs.openvino.ai/2025/index.html) is the minimal OpenVINO™ version requirement.*
2. Install CMake 3.28 or higher. Download from the [official CMake website](https://cmake.org/download/).
3. Configure the target hardware with specific follow-on instructions:
* To configure Intel<sup>®</sup> Processor Graphics (GPU), please follow these instructions: [Windows](https://docs.openvino.ai/2025/get-started/install-openvino/configurations/configurations-intel-gpu.html#windows), [Linux](https://docs.openvino.ai/2025/get-started/install-openvino/configurations/configurations-intel-gpu.html#linux)
4. Initialize the OpenVINO™ environment by running the setupvars script as shown below. This is a required step:
* For Windows:
```
C:\<openvino_install_directory>\setupvars.bat
```
* For Linux:
```
$ source <openvino_install_directory>/setupvars.sh
```
**Note:** If you are using a dockerfile to use the OpenVINO™ Execution Provider, sourcing OpenVINO™ won't be possible within the dockerfile. You would have to explicitly set LD_LIBRARY_PATH to point to the OpenVINO™ libraries' location. Refer to our [dockerfile](https://github.com/microsoft/onnxruntime/blob/main/dockerfiles/Dockerfile.openvino); a minimal sketch follows below.
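As a minimal sketch (assuming, hypothetically, that OpenVINO™ is installed under `/opt/intel/openvino` inside the image; adjust the path to match your dockerfile):

```
# Hypothetical install location; point this at your image's OpenVINO™ runtime libraries
ENV LD_LIBRARY_PATH=/opt/intel/openvino/runtime/lib/intel64:${LD_LIBRARY_PATH}
```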
*Note: The default Windows CMake Generator is Visual Studio 2019, but you can also use the newer Visual Studio 2022 by passing `--cmake_generator "Visual Studio 17 2022"` to `.\build.bat`*
* `--build_wheel`: Creates a Python wheel file in the dist/ folder. Enable it when building from source.
* `--use_openvino`: Builds the OpenVINO™ Execution Provider in ONNX Runtime.
* `<hardware_option>`: Specifies the default hardware target for building the OpenVINO™ Execution Provider. This can be overridden dynamically at runtime with another option (refer to [OpenVINO™-ExecutionProvider](../execution-providers/OpenVINO-ExecutionProvider.md#configuration-options) for more details on dynamic device selection). Below are the options for different Intel target devices; an example build invocation is shown after this list.
Refer to the [Intel GPU device naming convention](https://docs.openvino.ai/2025/openvino-workflow/running-inference/inference-devices-and-modes/gpu-device.html#device-naming-convention) for specifying the correct hardware target in cases where both integrated and discrete GPUs co-exist.

The DEVICE_TYPE can be any of the devices in this list: ['CPU', 'GPU', 'NPU']
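For illustration, a build invocation combining these flags might look like the following sketch; the `GPU` target and `RelWithDebInfo` configuration are placeholders, so substitute whichever device and build configuration you need:

```
# Linux: build with GPU as the default OpenVINO™ device and produce a wheel in dist/
./build.sh --config RelWithDebInfo --use_openvino GPU --build_wheel

# Windows equivalent, from a command prompt where setupvars.bat has already been run
.\build.bat --config RelWithDebInfo --use_openvino GPU --build_wheel
```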
#### Disable subgraph partition Feature
* Builds the OpenVINO™ Execution Provider in ONNX Runtime with graph partitioning disabled. Fully supported models run on the OpenVINO™ Execution Provider; otherwise, they fall back entirely to the default CPU EP.
* To enable this feature at build time, use `--use_openvino <hardware_option>_NO_PARTITION`; usage and a full build example are shown below.
```
Usage: --use_openvino CPU_NO_PARTITION or --use_openvino GPU_NO_PARTITION or --use_openvino NPU_NO_PARTITION
```
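For example, appending the suffix to the build invocation sketched earlier (again illustrative, with `GPU` chosen arbitrarily):

```
./build.sh --config RelWithDebInfo --use_openvino GPU_NO_PARTITION --build_wheel
```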
For more information on the OpenVINO™ Execution Provider's ONNX layer support, topology support, and enabled Intel hardware, please refer to [OpenVINO™-ExecutionProvider](../execution-providers/OpenVINO-ExecutionProvider.md#support-coverage).
---
## QNN
See more information on the QNN execution provider [here](../execution-providers/QNN-ExecutionProvider.md).