Commit
[DOCS] separate OV integrations section
1 parent 90b5dc7
commit 8eeb5ce
Showing 13 changed files with 331 additions and 67 deletions.
@@ -0,0 +1,31 @@
OpenVINO Ecosystem
==================

.. meta::
   :description: Explore the OpenVINO™ ecosystem of tools and resources for developing deep
                 learning solutions.

.. toctree::
   :maxdepth: 1
   :hidden:

   OpenVINO Integrations <openvino-ecosystem/openvino-integrations>
   The OpenVINO Project <openvino-ecosystem/openvino-project>
   OpenVINO Adoptions <openvino-ecosystem/openvino-adoptions>

OpenVINO™, like any toolkit, involves multiple components and integrations that may be used
in various areas of your Deep Learning pipelines. This section gives you an overview of the
whole ecosystem of resources, either developed under the OpenVINO umbrella, integrating it
with external solutions, or utilizing its potential.

| :doc:`OpenVINO Integrations <./openvino-ecosystem/openvino-integrations>`
| See what other tools OpenVINO is easily integrated with and how you can benefit from its
  performance, without rewriting your software.
| :doc:`The OpenVINO project <./openvino-ecosystem/openvino-project>`
| Check out the most noteworthy components of the OpenVINO project.
| :doc:`OpenVINO adoptions <./openvino-ecosystem/openvino-adoptions>`
| Here, you will find information about a selection of software projects utilizing OpenVINO.
docs/articles_en/openvino-ecosystem/openvino-adoptions.rst (63 additions, 0 deletions)
@@ -0,0 +1,63 @@
OpenVINO Adoptions
==================

OpenVINO has been adopted by multiple AI projects in various areas. For an extensive list of
community-based projects involving OpenVINO, see the
`Awesome OpenVINO repository <https://github.com/openvinotoolkit/awesome-openvino>`__.

Here is a small selection of adoptions, including proprietary and commercial tools:


| **DaVinci Resolve**
| :bdg-link-info:`Official Website <https://www.blackmagicdesign.com/products/davinciresolve>`

DaVinci Resolve is a professional video editing suite by Blackmagic Design. It uses OpenVINO to
run some of its industry-leading AI features.

|hr|

| **OpenVINO AI Plugins for GIMP**
| :bdg-link-dark:`Official Repository <https://github.com/intel/openvino-ai-plugins-gimp>`

GIMP is an image editor that has promoted open source values for over two decades. Now, you can
use generative AI directly in the application, thanks to the OpenVINO plugin, just like in the
leading graphics suites.

|hr|

| **OpenVINO AI Plugins for Audacity**
| :bdg-link-info:`Official Website <https://www.audacityteam.org/download/openvino/>`
  :bdg-link-dark:`Official Repository <https://github.com/intel/openvino-plugins-ai-audacity>`

Audacity is a hugely popular audio editing and recording application. Now, it offers AI-based
plugins running on OpenVINO, providing new effects, generators, and analyzers.

|hr|

| **VisionGuard**
| :bdg-link-dark:`Official Repository <https://github.com/inbasperu/VisionGuard>`

A desktop tool developed within Google Summer of Code. It aims to help computer users battle
eye strain, utilizing gaze estimation.

|hr|

| **OpenVINO Code**
| :bdg-link-dark:`Official Repository <https://github.com/openvinotoolkit/openvino_contrib/tree/master/modules/openvino_code>`

A coding assistant: a community-developed extension for Visual Studio Code that helps
programmers by providing code completion and suggestions.

|hr|

| **NVIDIA GPU Plugin**
| :bdg-link-dark:`Official Repository <https://github.com/openvinotoolkit/openvino_contrib/tree/master/modules/nvidia_plugin>`

A device plugin for OpenVINO: a community-developed extension enabling inference on
NVIDIA GPUs.
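
As a rough sketch only (assuming the plugin is built and installed as described in its
repository, which registers an additional ``NVIDIA`` device), selecting it through the standard
OpenVINO API could look like this:

.. code-block:: python

   import openvino as ov

   core = ov.Core()
   model = core.read_model("model.xml")  # illustrative model path
   # With the plugin available, "NVIDIA" can be requested like any other device.
   compiled_model = core.compile_model(model, "NVIDIA")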

|hr|


.. |hr| raw:: html

   <hr style="margin-top:-12px!important;border-top:1px solid #383838;">
docs/articles_en/openvino-ecosystem/openvino-integrations.rst (182 additions, 0 deletions)
@@ -0,0 +1,182 @@
OpenVINO™ Integrations
======================


.. meta::
   :description: Check a list of integrations between OpenVINO and other Deep Learning solutions.



.. = 1 ========================================================================================

**Hugging Face Optimum-Intel**

|hr|

.. grid:: 1 1 2 2
   :gutter: 4

   .. grid-item::

      | Grab and use models leveraging OpenVINO within the Hugging Face API.
        The repository hosts pre-optimized OpenVINO IR models, so that you can use
        them in your projects without the need for any adjustments.
      | Benefits:
      | - Minimize complex coding for Generative AI.

   .. grid-item::

      * :doc:`Run inference with HuggingFace and Optimum Intel <../learn-openvino/llm_inference_guide/llm-inference-hf>`
      * `A notebook example: llm-chatbot <https://github.com/openvinotoolkit/openvino_notebooks/tree/main/notebooks/254-llm-chatbot>`__
      * `Hugging Face Inference documentation <https://huggingface.co/docs/optimum/main/intel/openvino/inference>`__
      * `Hugging Face Compression documentation <https://huggingface.co/docs/optimum/main/intel/openvino/optimization>`__
      * `Hugging Face Reference Documentation <https://huggingface.co/docs/optimum/main/intel/openvino/reference>`__

.. dropdown:: Check example code
   :animate: fade-in-slide-down
   :color: secondary

   .. code-block:: diff

      -from transformers import AutoModelForCausalLM
      +from optimum.intel.openvino import OVModelForCausalLM
       from transformers import AutoTokenizer, pipeline

       model_id = "togethercomputer/RedPajama-INCITE-Chat-3B-v1"
      -model = AutoModelForCausalLM.from_pretrained(model_id)
      +model = OVModelForCausalLM.from_pretrained(model_id, export=True)
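
   The exported model remains a drop-in replacement in the regular Transformers workflow. A
   minimal continuation of the snippet above (the prompt text is only an illustration):

   .. code-block:: python

      tokenizer = AutoTokenizer.from_pretrained(model_id)
      pipe = pipeline("text-generation", model=model, tokenizer=tokenizer)
      results = pipe("What is OpenVINO?", max_new_tokens=50)
      print(results[0]["generated_text"])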
.. = 2 ========================================================================================

**OpenVINO Execution Provider for ONNX Runtime**

|hr|

.. grid:: 1 1 2 2
   :gutter: 4

   .. grid-item::

      | Utilize OpenVINO as a backend with your existing ONNX Runtime code.
      | Benefits:
      | - Enhanced inference performance on Intel hardware with minimal code modifications.

   .. grid-item::

      * A notebook example: YOLOv8 object detection
      * `ONNX User documentation <https://onnxruntime.ai/docs/execution-providers/OpenVINO-ExecutionProvider.html>`__
      * `Build ONNX RT with OV EP <https://oliviajain.github.io/onnxruntime/docs/build/eps.html#openvino>`__
      * `ONNX Examples <https://onnxruntime.ai/docs/execution-providers/OpenVINO-ExecutionProvider.html#openvino-execution-provider-samples-tutorials>`__

.. dropdown:: Check example code
   :animate: fade-in-slide-down
   :color: secondary

   .. code-block:: python

      device = 'CPU_FP32'
      # Set OpenVINO as the execution provider used to infer this model
      sess.set_providers(['OpenVINOExecutionProvider'], [{'device_type': device}])
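
   The ``sess`` object above is a standard ONNX Runtime inference session. A minimal sketch of
   creating one with the OpenVINO Execution Provider enabled from the start (the model path is
   an illustrative placeholder):

   .. code-block:: python

      import onnxruntime as rt

      # Request the OpenVINO Execution Provider when creating the session;
      # "device_type" selects the target device and precision.
      sess = rt.InferenceSession(
          "model.onnx",
          providers=['OpenVINOExecutionProvider'],
          provider_options=[{'device_type': 'CPU_FP32'}],
      )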
.. = 3 ========================================================================================

**Torch.compile with OpenVINO**

|hr|

.. grid:: 1 1 2 2
   :gutter: 4

   .. grid-item::

      | Use OpenVINO for Python-native applications by JIT-compiling code into optimized kernels.
      | Benefits:
      | - Enhanced inference performance on Intel hardware with minimal code modifications.

   .. grid-item::

      * :doc:`PyTorch Deployment via torch.compile <../openvino-workflow/torch-compile>`
      * A notebook example: n.a.
      * `torch.compiler documentation <https://pytorch.org/docs/stable/torch.compiler.html>`__
      * `torch.compiler API reference <https://pytorch.org/docs/stable/torch.compiler_api.html>`__

.. dropdown:: Check example code
   :animate: fade-in-slide-down
   :color: secondary

   .. code-block:: python

      import openvino.torch
      ...
      model = torch.compile(model, backend='openvino')
      ...
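
   A fuller sketch of the same idea, using a torchvision model purely as an example:

   .. code-block:: python

      import torch
      import torchvision.models as models
      import openvino.torch  # registers the "openvino" backend for torch.compile

      model = models.resnet50(weights="DEFAULT").eval()
      model = torch.compile(model, backend="openvino")

      # The first call triggers compilation; subsequent calls reuse the optimized kernels.
      with torch.no_grad():
          output = model(torch.randn(1, 3, 224, 224))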
.. = 4 ========================================================================================

**OpenVINO LLMs with LlamaIndex**

|hr|

.. grid:: 1 1 2 2
   :gutter: 4

   .. grid-item::

      | Build context-augmented GenAI applications with the LlamaIndex framework and enhance
        runtime performance with OpenVINO.
      | Benefits:
      | - Minimize complex coding for Generative AI.

   .. grid-item::

      * :doc:`LLM inference with Optimum-intel <../learn-openvino/llm_inference_guide/llm-inference-hf>`
      * `A notebook example: llm-agent-rag <https://github.com/openvinotoolkit/openvino_notebooks/blob/latest/notebooks/llm-agent-react/llm-agent-rag-llamaindex.ipynb>`__
      * `Inference documentation <https://docs.llamaindex.ai/en/stable/examples/llm/openvino/>`__
      * `Rerank documentation <https://docs.llamaindex.ai/en/stable/examples/node_postprocessor/openvino_rerank/>`__
      * `Embeddings documentation <https://docs.llamaindex.ai/en/stable/examples/embeddings/openvino/>`__
      * `API Reference <https://docs.llamaindex.ai/en/stable/api_reference/llms/openvino/>`__

.. dropdown:: Check example code
   :animate: fade-in-slide-down
   :color: secondary

   .. code-block:: python

      from llama_index.llms.openvino import OpenVINOLLM

      ov_config = {
          "PERFORMANCE_HINT": "LATENCY",
          "NUM_STREAMS": "1",
          "CACHE_DIR": "",
      }

      ov_llm = OpenVINOLLM(
          model_id_or_path="HuggingFaceH4/zephyr-7b-beta",
          context_window=3900,
          max_new_tokens=256,
          model_kwargs={"ov_config": ov_config},
          generate_kwargs={"temperature": 0.7, "top_k": 50, "top_p": 0.95},
          messages_to_prompt=messages_to_prompt,
          completion_to_prompt=completion_to_prompt,
          device_map="cpu",
      )
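
   With the LLM constructed, generation follows the usual LlamaIndex API. A short usage sketch
   (the prompt is illustrative; ``messages_to_prompt`` and ``completion_to_prompt`` are the
   prompt-formatting helpers defined in the linked notebook):

   .. code-block:: python

      response = ov_llm.complete("What is OpenVINO?")
      print(str(response))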
.. ============================================================================================

.. |hr| raw:: html

   <hr style="margin-top:-12px!important;border-top:1px solid #383838;">