This article describes how to measure the performance of a neural network model compiled for the Coral Edge TPU on STM32MPU platforms.
1. Installation[edit source]
1.1. Installing from the OpenSTLinux AI package repository[edit source]
After configuring the OpenSTLinux AI package repository, you can install the X-LINUX-AI components for this application. The minimum required package is coral-edgetpu-benchmark, which can be installed directly on your board using the following command:
x-linux-ai -i coral-edgetpu-benchmark
The model used in this example can be installed from the following package:
x-linux-ai -i object-detect-models-ssd-mobilenet-v1-10-300
2. How to use the Benchmark application[edit source]
2.1. Executing with the command line[edit source]
The coral_edgetpu_benchmark application is located in the userfs partition:
/usr/local/bin/coral-edgetpu-*/tools/coral_edgetpu_benchmark
It accepts the following input parameters:
Usage: ./coral-edgetpu-benchmark
-m --model_file <.tflite file path>: .tflite model to be executed
-l --loops <int>: provide the number of times the inference will be executed (by default nb_loops=1)
--help: show this help
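Before launching the benchmark, it can be useful to check that the Coral USB accelerator is visible on the USB bus. The following is a minimal sketch, not part of the benchmark tool itself: it greps the lsusb output for the 18d1:9302 Google vendor:product ID that appears in the benchmark log later on this page (the device may enumerate under a different ID before the Edge TPU runtime has initialized it).

```shell
#!/bin/sh
# Sketch: verify the Coral USB accelerator is enumerated before benchmarking.
# 18d1:9302 is the Google vendor:product ID shown in the benchmark output;
# the ID may differ before the Edge TPU runtime has initialized the device.
tpu_present() {
    lsusb 2>/dev/null | grep -q "18d1:9302"
}

if tpu_present; then
    echo "Edge TPU detected"
else
    echo "Edge TPU not found: check the USB connection" >&2
fi
```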
2.2. Testing with COCO SSD MobileNet V1[edit source]
The model used for testing is ssd_mobilenet_v1_10_300_int8_edgetpu.tflite, a COCO SSD MobileNet V1 model used for object detection.
On the target, the model is located here:
/usr/local/x-linux-ai/object-detection/models/coco_ssd_mobilenet/
To launch the application, use the following command:
/usr/local/bin/coral-edgetpu-*/tools/coral_edgetpu_benchmark -m /usr/local/x-linux-ai/object-detection/models/coco_ssd_mobilenet/ssd_mobilenet_v1_10_300_int8_edgetpu.tflite -l 50
Console output:
model file set to: /usr/local/x-linux-ai/object-detection/models/coco_ssd_mobilenet/ssd_mobilenet_v1_10_300_int8_edgetpu.tflite
This benchmark will execute 50 inference(s)
Bus 002 Device 004: ID 18d1:9302 Google Inc.
Loaded model /usr/local/x-linux-ai/object-detection/models/coco_ssd_mobilenet/ssd_mobilenet_v1_10_300_int8_edgetpu.tflite
resolved reporter
inferences are running: # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
inference time: min=58315us max=109714us avg=66009.4us
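The summary line reports minimum, maximum, and average inference times in microseconds. As a small sketch (using the summary line from the run above as sample input), the average can be extracted and converted to milliseconds for easier comparison between models:

```shell
#!/bin/sh
# Sketch: extract the average inference time from the benchmark summary line
# and convert it from microseconds to milliseconds.
line="inference time: min=58315us max=109714us avg=66009.4us"
avg_us=$(echo "$line" | sed 's/.*avg=\([0-9.]*\)us/\1/')
avg_ms=$(echo "$avg_us" | awk '{printf "%.1f", $1 / 1000}')
echo "average inference: ${avg_ms} ms"
```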