Hello Classification Sample

This sample demonstrates how to run inference on image classification models using the Synchronous Inference Request API. Before using the sample, make sure its requirements are met.

How It Works#

At startup, the sample application sets a log message capturing callback and reads the command-line parameters. It then prepares input data, loads the specified model and image to the OpenVINO™ Runtime plugin, performs synchronous inference, and processes the output data, logging each step to the standard output stream.

You can find a detailed description of each sample step in the Integration Steps section of the "Integrate OpenVINO™ Runtime with Your Application" guide.
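The same workflow can be outlined in a few lines of Python. This is a minimal sketch for orientation, not the sample's actual source; the model path, device name, and zero-filled input are placeholders:

import numpy as np
import openvino as ov

core = ov.Core()                                           # create OpenVINO Runtime Core
model = core.read_model('./models/alexnet/alexnet.xml')    # read the model (placeholder path)
compiled_model = core.compile_model(model, 'CPU')          # load the model to a device plugin

# Stand-in for a preprocessed image matching the model's input shape.
input_shape = list(compiled_model.input(0).shape)
image = np.zeros(input_shape, dtype=np.float32)

infer_request = compiled_model.create_infer_request()      # Synchronous Inference Request API
infer_request.infer({0: image})                            # blocking (synchronous) call
output = infer_request.get_output_tensor(0).data           # raw scores, shape (1, N_classes)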

Running#

To run the sample, you need to specify a model, an image, and a target device:

Python

python hello_classification.py <path_to_model> <path_to_image> <device_name>

C++

hello_classification <path_to_model> <path_to_image> <device_name>

C

hello_classification_c <path_to_model> <path_to_image> <device_name>


Example#

  1. Download a pre-trained model.
  2. You can convert it by using, for example (see the save-to-IR sketch after this list):
    Python
    import openvino as ov
    ov_model = ov.convert_model('./models/alexnet')

or, when the model is a Python model object:

ov_model = ov.convert_model(alexnet)
    CLI
    ovc ./models/alexnet
  3. Perform inference of an image, using the model on a GPU, for example:
    Python
    python hello_classification.py ./models/alexnet/alexnet.xml ./images/banana.jpg GPU
    C++
    hello_classification ./models/googlenet-v1.xml ./images/car.bmp GPU
    C
    hello_classification_c alexnet.xml ./opt/intel/openvino/samples/scripts/car.png GPU
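If you convert the model in Python as shown in step 2, the resulting ov.Model object also needs to be serialized to OpenVINO IR on disk before the sample can load it by path. A minimal sketch, assuming the same ./models/alexnet locations used above:

import openvino as ov

ov_model = ov.convert_model('./models/alexnet')            # as in step 2
ov.save_model(ov_model, './models/alexnet/alexnet.xml')    # writes alexnet.xml and alexnet.bin

The produced alexnet.xml path can then be passed to the sample as shown in step 3.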

Sample Output#

Python

The sample application logs each step to the standard output stream and outputs the top-10 inference results.

[ INFO ] Creating OpenVINO Runtime Core
[ INFO ] Reading the model: /models/alexnet/alexnet.xml
[ INFO ] Loading the model to the plugin
[ INFO ] Starting inference in synchronous mode
[ INFO ] Image path: /images/banana.jpg
[ INFO ] Top 10 results:
[ INFO ] class_id probability
[ INFO ] --------------------
[ INFO ] 954      0.9703885
[ INFO ] 666      0.0219518
[ INFO ] 659      0.0033120
[ INFO ] 435      0.0008246
[ INFO ] 809      0.0004433
[ INFO ] 502      0.0003852
[ INFO ] 618      0.0002906
[ INFO ] 910      0.0002848
[ INFO ] 951      0.0002427
[ INFO ] 961      0.0002213
[ INFO ]
[ INFO ] This sample is an API example, for any performance measurements please use the dedicated benchmark_app tool
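For illustration, a top-10 listing like the one above can be derived from the raw output tensor with a few lines of numpy. This is a sketch assuming the output array from the earlier "How It Works" example, not the sample's exact code:

import numpy as np

# 'output' is the (1, N_classes) array produced by the inference sketch above.
probs = output.flatten()
top10 = np.argsort(probs)[-10:][::-1]              # indices of the 10 highest scores, descending
for class_id in top10:
    print(f'{class_id:8d} {probs[class_id]:.7f}')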

C++

The application outputs top-10 inference results.

[ INFO ] OpenVINO Runtime version .........
[ INFO ] Build ...........
[ INFO ]
[ INFO ] Loading model files: /models/googlenet-v1.xml
[ INFO ] model name: GoogleNet
[ INFO ]     inputs
[ INFO ]         input name: data
[ INFO ]         input type: f32
[ INFO ]         input shape: {1, 3, 224, 224}
[ INFO ]     outputs
[ INFO ]         output name: prob
[ INFO ]         output type: f32
[ INFO ]         output shape: {1, 1000}

Top 10 results:

Image /images/car.bmp

classid probability
------- -----------
656     0.8139648
654     0.0550537
468     0.0178375
436     0.0165405
705     0.0111694
817     0.0105820
581     0.0086823
575     0.0077515
734     0.0064468
785     0.0043983

C

The application outputs top-10 inference results.

Top 10 results:

Image /opt/intel/openvino/samples/scripts/car.png

classid probability
------- -----------
656     0.666479
654     0.112940
581     0.068487
874     0.033385
436     0.026132
817     0.016731
675     0.010980
511     0.010592
569     0.008178
717     0.006336

This sample is an API example; for any performance measurements, use the dedicated benchmark_app tool.
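For example, a typical benchmark_app invocation (the model path and device here are placeholders) could look like:

benchmark_app -m ./models/alexnet/alexnet.xml -d GPU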

Additional Resources#