
People tracking and heatmap

Applicable for STM32MP23x lines, STM32MP25x lines

This article explains how to use the people tracking and heatmap application based on the OpenVX [1] back-end.

1. Description[edit | edit source]

The people tracking and heatmap neural network model allows the identification and localization of people within an image, with the possibility to enable a tracker and a heatmap based on the current and previous locations of people in the scene. The model used with this application is YOLOv8, a state-of-the-art deep learning model for object detection.

People tracking and heatmap application


The application demonstrates a computer vision use case where frames are captured from a camera input (/dev/videox) and analyzed by a neural network model executed using the OpenVX framework.
A GStreamer pipeline is used to stream camera frames (using v4l2src), to execute the neural network inference (using appsink), and to send the images (using udpsink) along with the neural network information over Ethernet.
The inference result is displayed on a remote PC connected to the STM32MPU board. The overlay is created using OpenCV and Tkinter.
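As an illustration, the sketch below shows the kind of GStreamer pipeline described above, built with the Python GStreamer bindings. The device node, capture resolution, host IP address, UDP port, and element properties are placeholder assumptions and do not reflect the exact pipeline used by the application.

 # Minimal sketch of a GStreamer pipeline similar to the one described above:
 # camera capture (v4l2src), one branch feeding the neural network through an
 # appsink, and one UDP branch streaming the frames to the host PC (udpsink).
 # The device node, caps, host IP address and port are placeholders.
 import gi
 gi.require_version("Gst", "1.0")
 from gi.repository import Gst
 
 Gst.init(None)
 
 pipeline = Gst.parse_launch(
     "v4l2src device=/dev/video0 ! videoconvert ! tee name=t "
     "t. ! queue ! videoscale ! video/x-raw,width=320,height=320 ! "
     "appsink name=nn_input emit-signals=true max-buffers=1 drop=true "
     "t. ! queue ! jpegenc ! rtpjpegpay ! udpsink host=192.168.1.10 port=5000"
 )
 
 # The appsink "new-sample" signal delivers each frame to the inference code.
 appsink = pipeline.get_by_name("nn_input")
 pipeline.set_state(Gst.State.PLAYING)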

The model used in this application is YOLOv8, downloaded from the Ultralytics GitHub fork repository[2].

Info white.png Information
For this application, a TensorFlow Lite per-tensor asymmetric quantized model is used, which is accelerated using the neural processing unit (NPU). The model is then converted to NBG format using ST Edge AI tool. For more information about this tool, refer to the dedicated article.

2. Installation on target[edit | edit source]

2.1. Install from the OpenSTLinux AI package repository[edit | edit source]

Warning white.png Warning
The software package is provided AS IS, and by downloading it, you agree to be bound to the terms of the software license agreement (SLA0048). The detailed content licenses can be found here.

After configuring the AI OpenSTLinux package, install X-LINUX-AI components for the people tracking and heatmap application:

2.1.1. Install on STM32MP2x board with AI hardware accelerator[edit | edit source]

The OpenVX application is installed to utilize the neural processing unit (NPU) and graphics processing unit (GPU). It is available only in Python and on STM32MP2x boards with an AI hardware accelerator.

  • To install this application, use the following command:
x-linux-ai -i stai-mpu-people-tracking-heatmap-python-ovx-npu
Warning DB.png Important
The people tracking and heatmap application is provided exclusively for STM32MP2x boards with an AI hardware accelerator. It is implemented as a Python application based on the OpenVX back-end.


  • Then, restart the demo launcher:
systemctl restart weston-graphical-session.service

2.2. Source code location[edit | edit source]

  • In the OpenSTLinux Distribution with X-LINUX-AI Expansion Package:
<Distribution Package installation directory>/layers/meta-st/meta-st-x-linux-ai/recipes-samples/people-tracking-heatmap/files/stai_mpu
  • On the target:
/usr/local/x-linux-ai/people-tracking-heatmap/stai_mpu_people_tracking_heatmap.py
  • On GitHub:
recipes-samples/people-tracking-heatmap/files/stai-mpu

3. How to use the application[edit | edit source]

3.1. Launching via the demo launcher[edit | edit source]

Info white.png Information
The application cannot be launched directly on the board, either from the demo launcher or from the command line.

When you click the icon to run the people tracking and heatmap Python OpenVX application, a message appears:

Demo launcher


This application is launched from the host PC using a Docker container. To learn how to run it, refer to the next section.

4. Installation on host PC[edit | edit source]

To simplify the deployment of the host PC application, a Docker container is available with all the source files. The Docker image is based on Ubuntu 20.04, and the whole environment is properly configured in the Dockerfile.

4.1. Build docker[edit | edit source]

Info white.png Information
Docker must be properly installed on the host PC to be able to execute this part.

First, build the Docker image from the sources provided in the meta-st-x-linux-ai repository. Navigate to the host-script directory containing the archive:

cd <Distribution Package installation directory>/layers/meta-st/meta-st-x-linux-ai/host-script/

Uncompress the archive:

 tar -xJvf st_people_tracking_heatmap_host.tar.xz && cd st_people_tracking_heatmap_host/docker/

Build the Docker image:

 docker build --network=host --build-arg http_proxy --build-arg https_proxy --no-cache -t st_people_tracking_heatmap_host .

Grant local access to the X server so the container can display on the host screen:

 xhost +local:

Start the Docker container:

 docker run --network=host -e DISPLAY=$DISPLAY  -v /tmp/.X11-unix:/tmp/.X11-unix:ro -it --rm --name my-running-app st_people_tracking_heatmap_host

The following overlay appears on the screen:

Demo launcher

To start the application, enter the host PC IP address and the board IP address in the Host PC IP and Board IP fields, respectively.

Warning DB.png Important
Ensure the board is powered on and connected to the same network as the host PC. The preview appears after 10-20 seconds.

Once the application starts, the camera stream appears. You can now use all the other options to customize the demo's appearance.

5. Tracking explanation[edit | edit source]

The tracking consists of three steps:

  1. Object detection:
    • The neural network detects objects in individual video frames. It identifies the presence and location of objects.
  2. Object identification:
    • Each detected object is assigned a unique identifier. This helps in distinguishing between different instances of the same object class (e.g., multiple people).
  3. Object tracking:
    • The model keeps track of each object's movement across consecutive frames. It predicts the object's position in the next frame based on its current trajectory and speed.
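To make the identification and tracking steps more concrete, the following sketch shows a deliberately simple IoU-based tracker. It is an illustration only: it assumes detections are (x, y, w, h) boxes in pixels and omits the motion prediction used by a real tracker, so it does not reflect the exact algorithm implemented in the application.

 # Minimal sketch of the identification and tracking idea described above,
 # using greedy IoU association; illustration only, not the application code.
 def iou(box_a, box_b):
     """Intersection-over-union of two (x, y, w, h) boxes."""
     ax2, ay2 = box_a[0] + box_a[2], box_a[1] + box_a[3]
     bx2, by2 = box_b[0] + box_b[2], box_b[1] + box_b[3]
     iw = max(0, min(ax2, bx2) - max(box_a[0], box_b[0]))
     ih = max(0, min(ay2, by2) - max(box_a[1], box_b[1]))
     inter = iw * ih
     union = box_a[2] * box_a[3] + box_b[2] * box_b[3] - inter
     return inter / union if union else 0.0
 
 class SimpleTracker:
     def __init__(self, iou_threshold=0.3):
         self.iou_threshold = iou_threshold
         self.tracks = {}      # identifier -> last known box
         self.next_id = 0
 
     def update(self, detections):
         """Assign an identifier to each detection of the current frame."""
         assigned = {}
         for box in detections:
             # Greedy association: reuse the best matching existing track.
             best_id, best_iou = None, self.iou_threshold
             for track_id, prev_box in self.tracks.items():
                 score = iou(box, prev_box)
                 if score > best_iou and track_id not in assigned:
                     best_id, best_iou = track_id, score
             if best_id is None:   # no match above the threshold: new person
                 best_id = self.next_id
                 self.next_id += 1
             assigned[best_id] = box
         self.tracks = assigned    # unmatched tracks are dropped in this sketch
         return assigned           # identifier -> box for the current frame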

6. Heatmap explanation[edit | edit source]

A heatmap is a visual representation that shows the concentration or frequency of detected people in specific areas over time. Here is how it works with a YOLOv8 model:

  1. Detection and coordinate mapping:
    • YOLOv8 detects people in each video frame and provides their coordinates (e.g., the bottom center of the bounding box).
  2. Heat accumulation:
    • When a person is detected at a particular coordinate, the corresponding location in the heatmap increases in intensity or "heat".
    • If a person remains stationary, the heat at that location continues to increase, indicating a higher concentration of presence.
  3. Heat dissipation:
    • If a person moves away from a location, the heat in that cell gradually decreases over time.
    • This simulates the fading of presence, reflecting less frequent occupancy.
  4. Visualization:
    • The heatmap is visualized using colors ranging from cool (e.g., blue) to hot (e.g., red). Hotter colors indicate areas with higher concentrations of people over time.
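The accumulation and dissipation steps above can be summarized in a few lines of NumPy. The grid cell size, decay factor, and maximum heat value below are illustrative assumptions, not the parameters used by the application.

 # Minimal sketch of the heat accumulation and dissipation steps described
 # above. Cell size, decay factor and maximum heat are illustrative only.
 import numpy as np
 
 class Heatmap:
     def __init__(self, width, height, cell=20, decay=0.99, max_heat=255.0):
         self.cell = cell
         self.decay = decay    # < 1.0: heat fades where nobody is detected
         self.max_heat = max_heat
         self.grid = np.zeros((height // cell, width // cell), dtype=np.float32)
 
     def update(self, people_points):
         """people_points: (x, y) image coordinates, e.g. bottom center of each box."""
         self.grid *= self.decay              # dissipation: every cell cools a little
         for x, y in people_points:           # accumulation: occupied cells heat up
             row, col = int(y) // self.cell, int(x) // self.cell
             if 0 <= row < self.grid.shape[0] and 0 <= col < self.grid.shape[1]:
                 self.grid[row, col] = min(self.grid[row, col] + 1.0, self.max_heat)
         return self.grid                     # can be colorized from cool to hot colors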

7. Using application options[edit | edit source]

Once the application is running, several options can be enabled or disabled. On the right, there are three tabs: "Connection", "Customization" and "Controls".

  • Connection tab:
    • This tab is used only to enter the IP addresses of the host PC and of the target. Once this is done, it is no longer needed.
  • Customization tab:
    • This tab offers overlay customization functionalities such as theme selection, video preview size selection, and the possibility to hide or display the text box.
  • Controls tab:
    • This tab is separated into two sections.
      1. Tracking:
        This section allows the user to enable or disable the tracking mechanism and enable or disable the trace following the detected people when tracking is activated.
      2. Heatmap:
        This section allows the user to enable or disable the heatmap and choose between three different heatmap views: Live, 1 hour, and Infinite. Additionally, it allows scaling the last view, called "Infinite", using a scroll bar (see the sketch after this list).
        • The Live view heats and cools very quickly.
        • The 1 hour view displays information from the past hour, taking more time to heat and cool down.
        • The Infinite view shows all information since the application started, with the scroll bar reducing the maximum value used to display positions as heat.
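As an illustration of how the Infinite view scroll bar could act on the accumulated heat, the hypothetical helper below clips the grid to the selected maximum before color mapping. The function name and parameters are assumptions made for this sketch, not the application's actual code.

 # Hypothetical helper showing how the Infinite view scroll bar could act on
 # the accumulated heat: the selected maximum caps the values before they are
 # normalized to 0-255 for color mapping (for example with cv2.applyColorMap).
 # The function name and parameters are illustrative only.
 import numpy as np
 
 def scale_for_display(grid, scroll_max):
     scroll_max = max(float(scroll_max), 1.0)   # avoid division by zero
     clipped = np.clip(grid, 0.0, scroll_max)
     return (255.0 * clipped / scroll_max).astype(np.uint8)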


8. References[edit | edit source]