Bert Benchmark Python Sample#

This sample demonstrates how to estimate the performance of a BERT model using the Asynchronous Inference Request API. The sample does not have configurable command-line arguments; feel free to modify the sample's source code to try out different options.

How It Works#

The sample downloads a model and a tokenizer, exports the model to ONNX format, reads the exported model, and reshapes it to enforce dynamic input shapes. It then compiles the resulting model, downloads a dataset, and runs the benchmark on that dataset.
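
A minimal sketch of this flow with the OpenVINO Python API is shown below. The model path "bert.onnx", the [-1, -1] input shapes, and the empty encoded_dataset placeholder are illustrative assumptions, not code taken from the actual sample:

    import time

    import openvino as ov

    core = ov.Core()

    # "bert.onnx" is a placeholder path to an already exported BERT model.
    model = core.read_model("bert.onnx")

    # Reshape every input to a fully dynamic shape ([batch, sequence length])
    # so the compiled model accepts batches of any size and length.
    model.reshape({inp: ov.PartialShape([-1, -1]) for inp in model.inputs})

    # The THROUGHPUT hint lets the runtime create several infer requests
    # that the AsyncInferQueue can run in parallel.
    compiled = core.compile_model(model, "CPU", {"PERFORMANCE_HINT": "THROUGHPUT"})
    infer_queue = ov.AsyncInferQueue(compiled)

    # encoded_dataset stands in for tokenizer output over a downloaded dataset;
    # each item is a dict mapping input names to numpy arrays.
    encoded_dataset = []

    start = time.perf_counter()
    for inputs in encoded_dataset:
        infer_queue.start_async(inputs)
    infer_queue.wait_all()
    print(f"Processed the dataset in {time.perf_counter() - start:.2f} s")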

You can find a detailed description of each sample step in the Integration Steps section of the "Integrate OpenVINO™ Runtime with Your Application" guide.

Running#

  1. Install the openvino Python package:
    python -m pip install openvino
  2. Install packages from requirements.txt:
    python -m pip install -r requirements.txt
  3. Run the sample script.

Sample Output#

The sample reports how long it took to process the dataset.

Additional Resources#