GET STARTED — OpenVINO™ documentation

Welcome to OpenVINO! This guide introduces installation and learning materials for Intel® Distribution of OpenVINO™ toolkit. The guide walks through the following steps:
1. Quick Start Example
2. Install OpenVINO
3. Learn OpenVINO

For a quick reference, check out the Quick Start Guide [pdf].

1. Quick Start Example (No Installation Required)#

[Animation: monodepth estimation demo — https://user-images.githubusercontent.com/15709723/127752390-f6aa371f-31b5-4846-84b9-18dd4f662406.gif]

Try out OpenVINO’s capabilities with this quick start example, which estimates depth in a scene using an OpenVINO monodepth model. It shows how to load a model, prepare an image, run inference, and display the result.
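Roughly, the example's flow looks like the sketch below: load and compile a model, prepare an image, run inference, and read back the result. This is only an illustration with placeholder file names (`model.xml`, `input.jpg`) assuming the 2023.1+ Python API; the notebook itself handles model download and visualization.

```python
# Minimal sketch of the quick-start flow (hypothetical file names; OpenVINO 2023.1+ API).
import cv2
import numpy as np
import openvino as ov

core = ov.Core()
model = core.read_model("model.xml")              # placeholder path for the monodepth model
compiled_model = core.compile_model(model, "CPU")

# Prepare an image: resize to the model's input size, assuming an NCHW float input.
image = cv2.imread("input.jpg")
_, _, h, w = compiled_model.input(0).shape
resized = cv2.resize(image, (int(w), int(h))).transpose(2, 0, 1)[np.newaxis, ...].astype(np.float32)

# Run inference and read back the first output (a depth map for a monodepth model).
depth_map = compiled_model([resized])[compiled_model.output(0)]
print(depth_map.shape)
```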

2. Install OpenVINO#

See the installation overview page for options to install OpenVINO and set up a development environment on your device.
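After installing (for example, via `pip install openvino` for the Python runtime), you can verify the environment with a short sanity check like the one below; it simply prints the runtime version and the devices OpenVINO detects on your machine.

```python
# Sanity check after installation (assumes the Python package, e.g. `pip install openvino`).
import openvino as ov

print(ov.get_version())        # runtime version string

core = ov.Core()
print(core.available_devices)  # e.g. ['CPU'] or ['CPU', 'GPU'], depending on your hardware
```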

3. Learn OpenVINO#

OpenVINO provides a wide array of examples and documentation showing how to work with models, run inference, and deploy applications. Step through the sections below to learn the basics of OpenVINO and explore its advanced optimization features. For further details, visit OpenVINO documentation.

OpenVINO Basics#

Learn the basics of working with models and inference in OpenVINO. Begin with “Hello World” Interactive Tutorials that show how to prepare models, run inference, and retrieve results using the OpenVINO API. Then, explore OpenVINO Code Samples that can be adapted for your own application.

Interactive Tutorials - Jupyter Notebooks#

Start with interactive Python notebooks that show the basics of model inference, the OpenVINO API, how to convert models to OpenVINO format, and more.
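For the model-conversion part, a minimal sketch using the `ov.convert_model` / `ov.save_model` pair might look like this (the ONNX file name is a placeholder; the same call also accepts in-memory PyTorch and TensorFlow models):

```python
# Convert a model to OpenVINO IR format (hypothetical file names; OpenVINO 2023.1+ API).
import openvino as ov

ov_model = ov.convert_model("model.onnx")   # also accepts PyTorch/TensorFlow model objects
ov.save_model(ov_model, "model.xml")        # writes model.xml + model.bin (the IR format)

# The converted model can be compiled and used for inference immediately.
compiled_model = ov.Core().compile_model(ov_model, "CPU")
```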

OpenVINO Code Samples#

View sample code for various C++ and Python applications that can be used as a starting point for your own application. For C++ developers, step through the Get Started with C++ Samples to learn how to build and run an image classification program that uses OpenVINO’s C++ API.

Integrate OpenVINO With Your Application#

Learn how to use the OpenVINO API to implement an inference pipeline in your application.
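At a high level the pipeline is: create a Core, read and compile a model, create an infer request, set inputs, run, and read outputs. The outline below sketches those steps with placeholder names and random input data; see the integration guide for the full details.

```python
# Outline of an inference pipeline inside an application (placeholder model and input data).
import numpy as np
import openvino as ov

core = ov.Core()
model = core.read_model("model.xml")                    # 1. load the model
compiled_model = core.compile_model(model, "CPU")       # 2. compile it for a target device

infer_request = compiled_model.create_infer_request()   # 3. create an inference request

# 4. set input data (shape/dtype must match the model; random data is a stand-in here)
input_tensor = ov.Tensor(np.random.rand(*compiled_model.input(0).shape).astype(np.float32))
infer_request.set_input_tensor(input_tensor)

infer_request.infer()                                   # 5. run inference synchronously

output = infer_request.get_output_tensor(0).data        # 6. read the result as a numpy array
print(output.shape)
```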

OpenVINO Advanced Features#

OpenVINO provides features to improve your model’s performance, optimize your runtime, maximize your application’s throughput on target hardware, and much more. Visit the links below to learn more about these features and how to use them.

Model Compression and Quantization#

Use OpenVINO’s model compression tools to reduce your model’s latency and memory footprint while maintaining good accuracy.
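A common route is post-training quantization with NNCF. The sketch below assumes a model already in IR format and uses random arrays with an assumed 1x3x224x224 input shape in place of a real calibration dataset, just to keep it self-contained.

```python
# Post-training INT8 quantization sketch with NNCF (placeholder paths and input shape).
import numpy as np
import nncf
import openvino as ov

core = ov.Core()
model = core.read_model("model.xml")    # placeholder: the FP32 model to quantize

# Calibration data: normally a few hundred representative samples from your own dataset;
# random arrays with an assumed 1x3x224x224 input shape are used here as a stand-in.
calibration_items = [np.random.rand(1, 3, 224, 224).astype(np.float32) for _ in range(100)]
calibration_dataset = nncf.Dataset(calibration_items)   # items passed to the model as-is

quantized_model = nncf.quantize(model, calibration_dataset)
ov.save_model(quantized_model, "model_int8.xml")        # save the quantized IR
```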

Automated Device Configuration#

OpenVINO’s hardware device configuration options enable you to write an application once and deploy it anywhere with optimal performance.
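For example, the AUTO device plugin selects the best available device at compile time, so the same code runs on a CPU-only machine or one with a GPU. A brief sketch with a placeholder model path:

```python
# Let OpenVINO choose the target device automatically (the "AUTO" plugin).
import openvino as ov

core = ov.Core()
model = core.read_model("model.xml")                 # placeholder IR path

# "AUTO" selects the best available device; a priority list can also be given,
# e.g. "AUTO:GPU,CPU" to prefer GPU and fall back to CPU.
compiled_model = core.compile_model(model, "AUTO")
print(compiled_model.get_property("EXECUTION_DEVICES"))  # which device(s) were actually chosen
```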

Flexible Model and Pipeline Configuration#

Pipeline and model configuration features in OpenVINO Runtime allow you to easily optimize your application’s performance on any target hardware.
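One such feature is performance hints: you state a high-level goal (latency or throughput) and the runtime chooses stream and thread settings for you. A hedged sketch using the string form of the properties, with a placeholder model path:

```python
# Performance hints: describe the goal, let the runtime pick streams and threads.
import openvino as ov

core = ov.Core()
model = core.read_model("model.xml")   # placeholder IR path

# "THROUGHPUT" favors many parallel infer requests; "LATENCY" favors single-request speed.
compiled_model = core.compile_model(model, "CPU", {"PERFORMANCE_HINT": "THROUGHPUT"})

# The runtime reports how many parallel requests it suggests for this configuration.
print(compiled_model.get_property("OPTIMAL_NUMBER_OF_INFER_REQUESTS"))
```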

Additional Resources#