The Neural Compute Stick 2 is a great little USB device made to run neural network inference on its dedicated hardware, a RISC processor with a set of specialized instructions. At around 99€, it can be an interesting companion to a Raspberry Pi for running medium-sized neural networks in real time. But before that, we need to install the required software!
You need a Raspberry Pi set up with the latest Raspbian Stretch and an internet connection.
Install OpenVino toolkit
We are now going to install the OpenVino toolkit, which will enable us to run inference on the Raspberry Pi with the Compute Stick.
Step 1 - Download the toolkit
cd
wget https://download.01.org/openvinotoolkit/2018_R5/packages/l_openvino_toolkit_ie_p_2018.5.445.tgz
tar -xf l_openvino_toolkit_ie_p_2018.5.445.tgz
rm l_openvino_toolkit_ie_p_2018.5.445.tgz
sed -i "s|<INSTALLDIR>|$(pwd)/inference_engine_vpu_arm|" inference_engine_vpu_arm/bin/setupvars.sh
Step 2 - Update the vars
Go to your bashrc file
Add the following line at the bottom of the file
Reload the bash environment
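The three steps above can be sketched as shell commands; the setupvars.sh path assumes the toolkit was extracted in your home directory, as in Step 1:

```shell
# Open your bashrc file (any editor works)
nano ~/.bashrc

# Add this line at the bottom of the file (path assumes the toolkit
# was extracted in your home directory, as in Step 1):
#   source ~/inference_engine_vpu_arm/bin/setupvars.sh

# Reload the bash environment
source ~/.bashrc
```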
You should see the following message:
[setupvars.sh] OpenVINO environment initialized
If that's the case, type the following commands to add yourself to the users group, then reboot.
sudo usermod -a -G users "$(whoami)"
sudo reboot
Step 3 - Update USB rules
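The udev rules are installed with a script shipped in the toolkit; a sketch, assuming the script name from the 2018 R5 release and the Step 1 install location:

```shell
# Install the udev rules so the Compute Stick can be used without root
# (script path is an assumption based on the 2018 R5 toolkit layout)
sh ~/inference_engine_vpu_arm/install_dependencies/install_NCS_udev_rules.sh
```

Plug the stick in (or unplug and replug it) after installing the rules so they take effect.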
You should now be ready to go. Let's test the installation.
Step 4 - Test installation
We'll use my repo, which shows how to convert a Keras model to an IR file for the NCS2.
cd
git clone https://github.com/MrEliptik/Keras_to_TF_NCS2.git
cd Keras_to_TF_NCS2
python3 predict_mnist.py
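Under the hood, predict_mnist.py uses the OpenVino inference engine Python API. A minimal sketch of that flow, assuming the 2018 R5 API (IENetwork/IEPlugin) and the IR file names from the repo; the input tensor here is a placeholder:

```python
import numpy as np
from openvino.inference_engine import IENetwork, IEPlugin

# Load the IR files produced by the Model Optimizer
net = IENetwork(model="IR_model/IR_model.xml", weights="IR_model/IR_model.bin")
input_blob = next(iter(net.inputs))
out_blob = next(iter(net.outputs))

# "MYRIAD" is the device name for the Neural Compute Stick 2
plugin = IEPlugin(device="MYRIAD")
exec_net = plugin.load(network=net)

# A (1, 1, 28, 28) MNIST-shaped tensor; a real script would load an image here
image = np.zeros((1, 1, 28, 28), dtype=np.float32)
res = exec_net.infer(inputs={input_blob: image})
print(res[out_blob])  # the output score vector
```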
If everything works, you should see the following output:
[ INFO ] Loading network files:
    IR_model/IR_model.xml
    IR_model/IR_model.bin
[ INFO ] Preparing input blobs
1 1 28 28
(28, 28)
(1, 28, 28)
[ INFO ] Loading model to the plugin
[ INFO ] Starting inference (1 iterations)
[ INFO ] Average running time of one iteration: 1.8284320831298828 ms
[ INFO ] Processing output blob
[[0. 0. 0. 0. 0. 0. 1. 0. 0. 0.]]
The last line is the resulting output vector: it has a 1 at index 6, so the image has been correctly classified as a 6, on the Pi, using the stick!
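Reading that vector programmatically is just an argmax: the predicted digit is the index of the largest score. A quick sketch in plain Python:

```python
# Output vector from the run above: one score per digit class 0-9
output = [0., 0., 0., 0., 0., 0., 1., 0., 0., 0.]

# The predicted class is the index of the maximum score (argmax)
predicted = max(range(len(output)), key=lambda i: output[i])
print(predicted)  # 6
```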
As you can see, installing the inference engine on the RPi3 is pretty straightforward. Note that this OpenVino installation is made for inference only: you won't be able to create networks or convert them with it. That makes sense, as the Pi is not powerful enough for that. The workflow is as follows: create and train your model on your computer using Keras, Tensorflow or Caffe; convert the model to IR files with the Model Optimizer; push the IR files to your Pi; run inference on your Pi, with the Neural Compute Stick 2.
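The convert-and-push part of that workflow can be sketched as follows; this runs on your desktop machine with the full OpenVino toolkit (mo_tf.py ships with it, not with the Pi install), and the file names and Pi hostname are illustrative:

```shell
# On the desktop: convert a frozen TensorFlow graph to IR files
# (mo_tf.py is part of the full toolkit's Model Optimizer)
python3 mo_tf.py --input_model frozen_model.pb --output_dir IR_model

# Push the resulting .xml/.bin pair to the Pi (hostname is an example)
scp IR_model/frozen_model.xml IR_model/frozen_model.bin pi@raspberrypi.local:~/
```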
My next goal is to compare inference performance between the Pi, a desktop computer and the NCS2. Note that the Pi's USB ports will slow down inference, as they are USB 2, not USB 3.
See you around for more!