Technical concept

This page describes how the detection and the matching of journeys work from a technical perspective.


To detect vehicle journeys, the same vehicle has to be detected at several predefined locations. This requires a dedicated identifier to tell whether detections at different locations belong to the same vehicle.

For vehicles, this unique identifier is the license plate (LP). The LP is therefore used as the unique identifier for matching vehicles across several locations. As LPs are considered personal data, a salted hashing function is applied to pseudonymize them.

How does it work?

Based on the SWARM standard use case for traffic counting, each object (vehicle) is detected and classified. If the journey time feature is enabled, the algorithm additionally runs an LP detection and an LP reading for each detected vehicle. The raw LP string is then pseudonymized with a salted hashing mechanism, and the resulting pseudonymized text is sent within the standard Counting Line (CL) event over the encrypted network.

The following sections describe the individual steps in more detail:

Detect vehicle

In each frame of the video stream, vehicles are detected and classified as cars, trucks, and buses. In addition, each vehicle is tracked across the frames of the video stream.
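As an illustration, a minimal sketch of the information kept per tracked vehicle, in Python. The class and field names are hypothetical and do not reflect SWARM's internal data structures.

```python
# Illustrative data structure for a tracked, classified vehicle.
# All names here are assumptions for the sketch, not the actual SWARM model.
from dataclasses import dataclass, field


@dataclass
class VehicleTrack:
    track_id: int                               # stable ID across frames of the stream
    vehicle_class: str                          # e.g. "car", "truck", "bus"
    boxes: list = field(default_factory=list)   # one bounding box (x, y, w, h) per frame

    def update(self, box: tuple) -> None:
        """Append the detection from the current frame to the track."""
        self.boxes.append(box)
```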

Detect license plate

For each classified vehicle, the license plate is detected and mapped to the object.

Read license plate

For each detected license plate, optical character recognition (OCR) is applied to read the plate. The output of this step is the raw text string of the license plate.
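A minimal sketch of this step in Python, assuming the plate has already been cropped from the frame and that a generic OCR engine (here pytesseract, as a stand-in for SWARM's actual LP reader) is available:

```python
# OCR sketch: read the raw LP string from a cropped plate image.
# pytesseract is only an illustrative stand-in for the real LP reader.
from PIL import Image
import pytesseract


def read_plate(plate_crop: Image.Image) -> str:
    """Run OCR on a cropped license-plate image and return the raw string."""
    raw = pytesseract.image_to_string(plate_crop, config="--psm 7")  # treat crop as a single text line
    # Keep only alphanumeric characters, which is all a plate string contains.
    return "".join(ch for ch in raw if ch.isalnum()).upper()
```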

Pseudonymize the license plate (Hashing)

To hash the LP, a salt shaker generates random salts in the backend (Cloud) and distributes them to the edge devices. A salt is random data used as an additional input when hashing data, for example passwords or, in our case, LPs. The salt is not saved in the backend; the only place where salts are temporarily stored is the edge device (Perception Box).

To make potential attacks harder, each salt has a validity window of 12 hours. After the validity window, a new randomly generated salt is used. The graphic below illustrates an example of the hashing function used for LPs.

Salts 1-4 are generated by the salt shaker and distributed to each edge device. To make sure that every journey can be detected, each LP is hashed with two salts, because a journey can take longer than the salt validity window. The section "Match event on possible journeys" below shows why two salts per LP are needed.
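A minimal sketch of the salted hashing step in Python. The hash function (SHA-256), salt length, and example plate are illustrative assumptions; the real salt distribution protocol is handled by the salt shaker in the backend as described above.

```python
# Salted hashing sketch: pseudonymize a raw LP string with two salts.
import hashlib
import os


def generate_salt() -> bytes:
    """Salt shaker side: create a new random salt (rotated every 12 hours)."""
    return os.urandom(32)


def hash_plate(lp: str, salt: bytes) -> str:
    """Edge device side: pseudonymize a raw LP string with one salt."""
    return hashlib.sha256(salt + lp.encode("utf-8")).hexdigest()


# Each LP is hashed with two salts so that journeys longer than the
# salt validity window can still be matched.
salt_a, salt_b = generate_salt(), generate_salt()
lp_hashes = [hash_plate("W12345X", s) for s in (salt_a, salt_b)]  # "W12345X" is a made-up plate
```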

Trigger and send event

If a vehicle crosses a counting line (CL), a CL event with the hashes (h) of the detected LP is sent via MQTT to the Cloud (Microsoft Azure) and saved in a structured database (DB).
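As a sketch of this transmission, the snippet below publishes an example CL event via MQTT using the paho-mqtt package. The topic name, broker address, device name, and payload fields are hypothetical and do not reflect the actual SWARM event schema.

```python
# MQTT sketch: send one CL event from the edge device to the cloud.
import json
import time
import paho.mqtt.publish as publish

# Example CL event; in practice the hashes come from the hashing step above.
event = {
    "type": "crossing_line",
    "timestamp": time.time(),
    "class": "car",
    "lp_hashes": ["<hash with salt 1>", "<hash with salt 2>"],
    "device": "edge-device-01",   # hypothetical device name
}

# publish.single() opens a connection, sends one message, and disconnects.
publish.single(
    "swarm/cl-events",            # hypothetical topic
    payload=json.dumps(event),
    hostname="broker.example.com",
    qos=1,
)
```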

Match event on possible journeys

In the cloud, the DB is regularly checked for possible matches among the hashes. As described above, two hashes are created per detected vehicle. If one of the two hashes is identical for two different detections, it is saved as a journey together with the journey time, vehicle class, edge device names, and GPS coordinates of the edge devices.

If the same hash is found at several locations, a multi-hop journey is saved based on the order of the timestamps (e.g. a journey from location A to B to C).
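The matching logic can be sketched as follows, with in-memory lists standing in for the cloud database and illustrative field names:

```python
# Journey-matching sketch: group CL events by hash and order them by timestamp.
from collections import defaultdict

# Example detections; in practice these rows come from the structured DB.
detections = [
    {"hash": "h1", "device": "A", "ts": 100, "class": "car"},
    {"hash": "h1", "device": "B", "ts": 460, "class": "car"},
    {"hash": "h1", "device": "C", "ts": 900, "class": "car"},
]

by_hash = defaultdict(list)
for det in detections:
    by_hash[det["hash"]].append(det)

journeys = []
for h, dets in by_hash.items():
    if len(dets) < 2:
        continue                       # seen at a single location only: no journey
    dets.sort(key=lambda d: d["ts"])   # order the hops by timestamp (A -> B -> C)
    journeys.append({
        "route": [d["device"] for d in dets],
        "journey_time": dets[-1]["ts"] - dets[0]["ts"],
        "class": dets[0]["class"],
    })
```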

Anonymize data

After 12 hours, the validity time of the salt used for pseudonymizing the license plate, the pseudonymized LP is deleted. This makes the remaining data anonymous. In summary, all data is anonymized 12 hours after the detection of the vehicle and its LP.
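A minimal sketch of this clean-up step, assuming a simple SQL table named cl_events with lp_hashes and created_at columns as a stand-in for the real database:

```python
# Anonymization sketch: drop pseudonymized LP hashes older than 12 hours.
import sqlite3
from datetime import datetime, timedelta, timezone


def anonymize(conn: sqlite3.Connection) -> None:
    """Remove LP hashes older than the 12-hour salt validity window."""
    cutoff = datetime.now(timezone.utc) - timedelta(hours=12)
    conn.execute(
        "UPDATE cl_events SET lp_hashes = NULL WHERE created_at < ?",
        (cutoff.isoformat(),),
    )
    conn.commit()
```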

Figures: Object Detection, License Plate Detection, Reading License Plate, Hashing Process with Salt Shaker, Event triggered, Event transmission from Edge to Cloud, Journey Time Allocation