updates to calculator docs #56

Open · wants to merge 2 commits into base: main
79 changes: 78 additions & 1 deletion mediapipe/calculators/ovms/calculators.md
@@ -14,7 +14,7 @@ OVMSInferenceAdapter is an implementation of [OpenVINO Model API](https://github

### OpenVINOInferenceCalculator

- [OpenVINOInferenceCalculator](openvinoinferencecalculator.cc) is using `OVMSInferenceAdapter` received as `input_side_packet` to execute inference with [OpenVINO Model Server C-API](https://github.com/openvinotoolkit/model_server/blob/main/docs/model_server_c_api.md). It can use `options` field `tag_to_input_tensor_names` and `tag_to_output_tensor_names` to map MediaPipe stream names and servable (Model/DAG) inputs and/or outputs. Options `input_order_list` and `output_order_list` can be used together with packet types using `std::vector<T>` to transform input/output maps to desired order in vector of tensors. This guarantees correct order of inputs and outputs in the pipeline. Example of usage can be found [here](../../modules/pose_landmark/pose_landmark_by_roi_cpu.pbtxt).
+ [OpenVINOInferenceCalculator](openvinoinferencecalculator.cc) uses the `OVMSInferenceAdapter` received as an `input_side_packet` to execute inference with the [OpenVINO Model Server C-API](https://github.com/openvinotoolkit/model_server/blob/main/docs/model_server_c_api.md). The `options` fields `tag_to_input_tensor_names` and `tag_to_output_tensor_names` map MediaPipe stream names to servable (Model/DAG) inputs and/or outputs. The `input_order_list` and `output_order_list` options can be used with `std::vector<T>` packet types to arrange the input/output maps into the desired order in the vector of tensors, which guarantees the correct order of inputs and outputs in the pipeline.
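A minimal sketch (not part of this diff) of how `input_order_list` and `output_order_list` might be set; the tag and tensor names here are hypothetical, and the accepted packet types and tags are those listed in this document:

```
node {
  calculator: "OpenVINOInferenceCalculator"
  input_side_packet: "SESSION:session"
  # Assumed tag for a std::vector packet type; check the list of accepted tags.
  input_stream: "TENSORS:input_tensors"
  output_stream: "TENSORS:output_tensors"
  node_options: {
    [type.googleapis.com/mediapipe.OpenVINOInferenceCalculatorOptions]: {
      # Hypothetical model tensor names; the vectors are ordered to match them.
      input_order_list: ["input1", "input2"]
      output_order_list: ["output1", "output2"]
    }
  }
}
```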

Accepted packet types and tags are listed below:

@@ -28,5 +28,82 @@ Accepted packet types and tags are listed below:

If the tag is missing, the calculator assumes that the packet type is `ov::Tensor`.

## Example graph with OpenVINO calculators

The example below includes two models in the pipeline.
Each model is associated with one `OpenVINOInferenceCalculator` node, which receives a single side packet: the session produced by the corresponding `OpenVINOModelServerSessionCalculator`.


```
input_stream: "in1"
input_stream: "in2"
output_stream: "out"
node {
  calculator: "OpenVINOModelServerSessionCalculator"
  output_side_packet: "SESSION:increment"
  node_options: {
    [type.googleapis.com/mediapipe.OpenVINOModelServerSessionCalculatorOptions]: {
      servable_name: "increment"
      servable_version: "1"
      server_config: "/config/config.json"
    }
  }
}
node {
  calculator: "OpenVINOModelServerSessionCalculator"
  output_side_packet: "SESSION:add"
  node_options: {
    [type.googleapis.com/mediapipe.OpenVINOModelServerSessionCalculatorOptions]: {
      servable_name: "add"
      servable_version: "1"
      server_config: "/config/config.json"
    }
  }
}
node {
  calculator: "OpenVINOInferenceCalculator"
  input_side_packet: "SESSION:increment"
  input_stream: "INCREMENT_IN:in1"
  output_stream: "INCREMENT_OUT:increment_output"
  node_options: {
    [type.googleapis.com/mediapipe.OpenVINOInferenceCalculatorOptions]: {
      tag_to_input_tensor_names {
        key: "INCREMENT_IN"
        value: "b"
      }
      tag_to_output_tensor_names {
        key: "INCREMENT_OUT"
        value: "a"
      }
    }
  }
}
node {
  calculator: "OpenVINOInferenceCalculator"
  input_side_packet: "SESSION:add"
  input_stream: "ADD_INPUT1:increment_output"
  input_stream: "ADD_INPUT2:in2"
  output_stream: "SUM:out"
  node_options: {
    [type.googleapis.com/mediapipe.OpenVINOInferenceCalculatorOptions]: {
      tag_to_input_tensor_names {
        key: "ADD_INPUT1"
        value: "input1"
      }
      tag_to_input_tensor_names {
        key: "ADD_INPUT2"
        value: "input2"
      }
      tag_to_output_tensor_names {
        key: "SUM"
        value: "sum"
      }
    }
  }
}
```
![example](./example.png)

## How to adjust existing graphs to perform inference with OpenVINO Model Server
Please check the following [link](https://github.com/openvinotoolkit/mediapipe/compare/master...openvinotoolkit:mediapipe:main) and review the differences in the existing MediaPipe pbtxt files.
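As a rough sketch of the kind of change that comparison shows (not taken from it; all names are hypothetical), a graph node that ran inference locally might be replaced by a session calculator plus an OpenVINO inference node:

```
# Before (hypothetical): a built-in inference node loading a model file directly.
# node {
#   calculator: "InferenceCalculator"
#   input_stream: "TENSORS:input_tensors"
#   output_stream: "TENSORS:output_tensors"
#   options: {
#     [mediapipe.InferenceCalculatorOptions.ext] { model_path: "model.tflite" }
#   }
# }

# After (hypothetical): inference delegated to OpenVINO Model Server.
node {
  calculator: "OpenVINOModelServerSessionCalculator"
  output_side_packet: "SESSION:session"
  node_options: {
    [type.googleapis.com/mediapipe.OpenVINOModelServerSessionCalculatorOptions]: {
      servable_name: "my_model"  # hypothetical servable name
      servable_version: "1"
    }
  }
}
node {
  calculator: "OpenVINOInferenceCalculator"
  input_side_packet: "SESSION:session"
  input_stream: "TENSORS:input_tensors"
  output_stream: "TENSORS:output_tensors"
}
```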
Binary file added mediapipe/calculators/ovms/example.png
@@ -30,4 +30,6 @@ message OpenVINOModelServerSessionCalculatorOptions {
// service_url: "13.21.212.171:9718"
optional string service_url = 3;
optional string server_config = 4;
// model server config.json path must be identical in all calculators used in the graph
// when the graph is deployed in the OVMS, server_config param is not needed.
Comment on lines 32 to +34 (Collaborator):

Suggested change:
- optional string server_config = 4;
- // model server config.json path must be identical in all calculators used in the graph
- // when the graph is deployed in the OVMS, server_config param is not needed.
+ // model server config.json path must be identical in all calculators used in the graph
+ // when the graph is deployed in the OVMS, server_config param is not needed.
+ optional string server_config = 4;
Collaborator:

Move the comment before the line being commented.

}
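For context on `server_config`: it points to an OVMS configuration file listing the servables, which is why the path must be identical in every calculator in the graph. A minimal hypothetical config.json for the two models in the example graph might look like this (base paths are illustrative):

```
{
  "model_config_list": [
    {"config": {"name": "increment", "base_path": "/models/increment"}},
    {"config": {"name": "add", "base_path": "/models/add"}}
  ]
}
```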