Commit

Remove config dependency from OpenVINOInferencer (openvinotoolkit#939)
* Use case Dobot - WIP commit (openvinotoolkit#884)

* WIP commit

* training via config file and robot control

* 2 notebooks robot control

* Removed utils directory from notebooks/500_use_cases

* Removed utils directory from `notebooks/500_use_cases` (openvinotoolkit#911)

* Download the dataset and the api from anomalib assets

* adding readme file

* adding instructions

* Update README.md

* Model training is done via API now

* Metadata change

* update docstring

* Address pre-commit issues

* Fixed the bugs spotted by Paula

* Fix export tests

* Address pre-commit issues

* Update test_export.py

* Update base.txt

* Revert metadata rename in changelog

* Addressed reviewer comments

* remove config from openvino in notebooks

* add unreleased section to changelog

* Address tox output

* Change the directory path of dobot

* Update CHANGELOG.md

* Revert the figure back in notebook

* Revert padim configs

* Fix getting started jupyter notebook that failed

* Fix metadata path

---------

Co-authored-by: Paula Ramos <[email protected]>
samet-akcay and Paula Ramos authored Mar 10, 2023
1 parent 6834e5b commit f023a83
Showing 23 changed files with 239 additions and 519 deletions.
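Before the per-file diffs, it may help to see the shape of the API change this commit makes. The sketch below is a hypothetical, simplified stand-in (not the real `anomalib.deploy.OpenVINOInferencer`, and the metadata keys are illustrative): the `config=` argument is dropped, and run-time parameters are read from the exported `metadata.json` instead.

```python
import json
from pathlib import Path
from tempfile import TemporaryDirectory


class OpenVINOInferencerSketch:
    """Hypothetical, simplified stand-in for anomalib's OpenVINOInferencer.

    It only illustrates the constructor change in this commit:
      before: OpenVINOInferencer(config=..., path=..., meta_data_path=...)
      after:  OpenVINOInferencer(path=..., metadata_path=..., device=...)
    """

    def __init__(self, path, metadata_path, device="CPU"):
        self.path = Path(path)  # OpenVINO IR model (.bin/.xml); not opened in this sketch
        self.device = device
        # Everything the inferencer needs at run time now comes from
        # metadata.json rather than from a training config file.
        self.metadata = json.loads(Path(metadata_path).read_text())

    @property
    def image_threshold(self):
        return self.metadata["image_threshold"]


# Usage sketch; the metadata keys below are illustrative, not anomalib's exact schema.
with TemporaryDirectory() as tmp:
    metadata_file = Path(tmp) / "metadata.json"
    metadata_file.write_text(json.dumps({"image_threshold": 0.5, "min": 0.0, "max": 4.0}))
    inferencer = OpenVINOInferencerSketch(path="model.bin", metadata_path=metadata_file)
    print(inferencer.image_threshold)  # 0.5
```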
4 changes: 4 additions & 0 deletions .gitignore
@@ -5,6 +5,10 @@ datasets
 results
 !anomalib/core/results
 
+# Jupyter Notebooks
+notebooks/500_use_cases/501_dobot/
+!notebooks/500_use_cases/501_dobot/*.ipynb
+
 # VENV
 .python-version
 .anomalib
2 changes: 2 additions & 0 deletions CHANGELOG.md
@@ -10,6 +10,8 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
 
 ### Changed
 
+- Remove `config` flag from `OpenVINOInferencer` (<https://github.com/openvinotoolkit/anomalib/pull/939>)
+
 ### Deprecated
 
 ### Fixed
4 changes: 2 additions & 2 deletions README.md
@@ -198,12 +198,12 @@ Example OpenVINO Inference:
 python tools/inference/openvino_inference.py \
 --config anomalib/models/padim/config.yaml \
 --weights results/padim/mvtec/bottle/openvino/openvino_model.bin \
---meta_data results/padim/mvtec/bottle/openvino/meta_data.json \
+--meta_data results/padim/mvtec/bottle/openvino/metadata.json \
 --input datasets/MVTec/bottle/test/broken_large/000.png \
 --output results/padim/mvtec/bottle/images
 ```
 
-> Ensure that you provide path to `meta_data.json` if you want the normalization to be applied correctly.
+> Ensure that you provide path to `metadata.json` if you want the normalization to be applied correctly.
 
 You can also use Gradio Inference to interact with the trained models using a UI. Refer to our [guide](https://openvinotoolkit.github.io/anomalib/guides/inference.html#gradio-inference) for more details.
12 changes: 6 additions & 6 deletions docs/source/tutorials/inference.rst
@@ -52,17 +52,17 @@ To run OpenVINO inference, first make sure that your model has been exported to
 +-----------+----------+--------------------------------------------------------------------------------------+
 | save_data | False | Path to which the output images should be saved. Leave empty for live visualization. |
 +-----------+----------+--------------------------------------------------------------------------------------+
-| meta_data | True | Path to the JSON file containing the model's meta data (e.g. normalization |
+| metadata | True | Path to the JSON file containing the model's meta data (e.g. normalization |
 | | | parameters and anomaly score threshold). |
 +-----------+----------+--------------------------------------------------------------------------------------+
 | device | False | Device on which OpenVINO will perform the computations (``CPU``, ``GPU`` or ``VPU``) |
 +-----------+----------+--------------------------------------------------------------------------------------+
 
-For correct inference results, the ``meta_data`` argument should be specified and point to the ``meta_data.json`` file that was generated when exporting the OpenVINO IR model. The file is stored in the same folder as the ``.xml`` and ``.bin`` files of the model.
+For correct inference results, the ``metadata`` argument should be specified and point to the ``metadata.json`` file that was generated when exporting the OpenVINO IR model. The file is stored in the same folder as the ``.xml`` and ``.bin`` files of the model.
 
 As an example, OpenVINO inference can be triggered by the following command:
 
-``python tools/inference/openvino.py --config padim.yaml --weights results/openvino/model.xml --input image.png --meta_data results/openvino/meta_data.json``
+``python tools/inference/openvino.py --config padim.yaml --weights results/openvino/model.xml --input image.png --metadata results/openvino/metadata.json``
 
 Similar to PyTorch inference, the visualization results will be displayed on the screen, and optionally saved to the file system location specified by the ``save_data`` parameter.
 
@@ -80,15 +80,15 @@ The gradio inference is supported for both PyTorch and OpenVINO models.
 +-----------+----------+------------------------------------------------------------------+
 | weights | True | Path to the OpenVINO IR model file (either ``.xml`` or ``.bin``) |
 +-----------+----------+------------------------------------------------------------------+
-| meta_data | False | Path to the JSON file containing the model's meta data. |
+| metadata | False | Path to the JSON file containing the model's meta data. |
 | | | This is needed only for OpenVINO model. |
 +-----------+----------+------------------------------------------------------------------+
 | threshold | False | Threshold value used for identifying anomalies. Range 1-100. |
 +-----------+----------+------------------------------------------------------------------+
 | share | False | Share Gradio `share_url` |
 +-----------+----------+------------------------------------------------------------------+
 
-To use gradio with OpenVINO model, first make sure that your model has been exported to the OpenVINO IR format and ensure that the `meta_data` argument points to the ``meta_data.json`` file that was generated when exporting the OpenVINO IR model. The file is stored in the same folder as the ``.xml`` and ``.bin`` files of the model.
+To use gradio with OpenVINO model, first make sure that your model has been exported to the OpenVINO IR format and ensure that the `metadata` argument points to the ``metadata.json`` file that was generated when exporting the OpenVINO IR model. The file is stored in the same folder as the ``.xml`` and ``.bin`` files of the model.
 
 As an example, PyTorch model can be used by the following command:
 
@@ -105,4 +105,4 @@ Similarly, you can use OpenVINO model by the following command:
 python python tools/inference/gradio_inference.py \
 --config ./anomalib/models/padim/config.yaml \
 --weights ./results/padim/mvtec/bottle/openvino/openvino_model.onnx \
---meta_data ./results/padim/mvtec/bottle/openvino/meta_data.json
+--metadata ./results/padim/mvtec/bottle/openvino/metadata.json
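The docs above note that `metadata.json` stores the normalization parameters and the anomaly score threshold. As a sketch of why that file matters for correct results, the following applies a min-max style normalization that centers the threshold at 0.5 (anomalib uses a similar scheme, but the exact formula and key names here are assumptions):

```python
def normalize_min_max(score: float, threshold: float, min_val: float, max_val: float) -> float:
    """Map a raw anomaly score into [0, 1], centering the threshold at 0.5.

    With this scheme, a normalized score above 0.5 reads as "anomalous".
    """
    normalized = (score - threshold) / (max_val - min_val) + 0.5
    return min(max(normalized, 0.0), 1.0)  # clip to [0, 1]


# Illustrative metadata values, not anomalib's exact schema:
metadata = {"image_threshold": 2.0, "min": 0.0, "max": 4.0}

# Raw scores straddling the threshold map to either side of 0.5:
for raw in (1.0, 2.0, 3.0):
    print(normalize_min_max(raw, metadata["image_threshold"], metadata["min"], metadata["max"]))
# prints 0.25, 0.5, 0.75
```

Without these per-model values, scores cannot be mapped onto a comparable 0-1 scale, which is why the CLI warns to pass the metadata path for normalization to be applied correctly.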
5 changes: 2 additions & 3 deletions notebooks/000_getting_started/001_getting_started.ipynb
@@ -434,7 +434,7 @@
 ],
 "source": [
 "openvino_model_path = output_path / \"openvino\" / \"model.bin\"\n",
-"metadata_path = output_path / \"openvino\" / \"meta_data.json\"\n",
+"metadata_path = output_path / \"openvino\" / \"metadata.json\"\n",
 "print(openvino_model_path.exists(), metadata_path.exists())"
 ]
 },
@@ -445,9 +445,8 @@
 "outputs": [],
 "source": [
 "inferencer = OpenVINOInferencer(\n",
-" config=CONFIG_PATH, # Pass the config file to the inferencer.\n",
 " path=openvino_model_path, # Path to the OpenVINO IR model.\n",
-" meta_data_path=metadata_path, # Path to the metadata file.\n",
+" metadata_path=metadata_path, # Path to the metadata file.\n",
 " device=\"CPU\", # We would like to run it on an Intel CPU.\n",
 ")"
 ]
@@ -67,7 +67,6 @@
 "# importing required libraries\n",
 "import cv2 # OpenCV library\n",
 "import matplotlib.pyplot as plt\n",
-"\n",
 "from anomalib.deploy import OpenVINOInferencer"
 ]
 },
@@ -248,7 +247,6 @@
 "# If acquisition is False this notebook will work in inference mode\n",
 "if acquisition is False:\n",
 " # If you are running inference check where the OpenVINO model is stored\n",
-" CONFIG_PATH = notebook_path / \"cubes_config.yaml\"\n",
 " openvino_model_path = notebook_path / \"openvino\" / \"model.bin\"\n",
 " metadata_path = notebook_path / \"openvino\" / \"meta_data.json\"\n",
 " print(\"OpenVINO model exist: \", openvino_model_path.exists())\n",
@@ -257,9 +255,8 @@
 " print(\"Metadata path: \", metadata_path)\n",
 "\n",
 " inferencer = OpenVINOInferencer(\n",
-" config=CONFIG_PATH, # Pass the config file to the inferencer.\n",
 " path=openvino_model_path, # Path to the OpenVINO IR model.\n",
-" meta_data_path=metadata_path, # Path to the metadata file.\n",
+" metadata_path=metadata_path, # Path to the metadata file.\n",
 " device=\"CPU\", # We would like to run it on an Intel CPU.\n",
 " )\n",
 "\n",
@@ -558,19 +555,11 @@
 " dType.SetPTPCmdEx(api, 0, Calibration_X, Calibration_Y, Calibration_Z, 0, 1)\n",
 " cam_stream.stop() # stop the webcam stream"
 ]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"id": "5b1bb666",
-"metadata": {},
-"outputs": [],
-"source": []
 }
 ],
 "metadata": {
 "kernelspec": {
-"display_name": "project_anomalib",
+"display_name": "anomalib",
 "language": "python",
 "name": "python3"
 },
@@ -584,11 +573,11 @@
 "name": "python",
 "nbconvert_exporter": "python",
 "pygments_lexer": "ipython3",
-"version": "3.8.16"
+"version": "3.8.13"
 },
 "vscode": {
 "interpreter": {
-"hash": "18f8999b3d132acda9ed72c7f0f7e54d3c533278cffbadac58c30769cf876377"
+"hash": "ae223df28f60859a2f400fae8b3a1034248e0a469f5599fd9a89c32908ed7a84"
 }
 }
 },