Set-up Journey Time & Traffic Flow

Detailed information on the solution for Journey time and area-wide traffic flow in terms of data generation, camera set up and Analytics options.


Last updated 1 year ago


Beyond the traffic frequency at given locations, you may want statistics on how long vehicles take to travel from one location to another, and how traffic flows across your city or municipality. With this solution, you can generate that data with a single-sensor solution from SWARM.

What data can be generated?

For this use case, SWARM software provides the most relevant traffic insights: vehicle counts, including classification into the SWARM main classes. In addition, you can add a second counting line, calibrate the distance between the two lines, and estimate the speed of vehicles passing both. By combining several sensors at different locations, journey times as well as a statistical traffic-flow distribution are generated.

The journey time and traffic flow distribution can be generated for vehicles only (car, bus and truck).
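The speed estimate mentioned above is, in essence, the calibrated distance between the two counting lines divided by the travel time between them. A minimal sketch of that arithmetic (function and variable names are our own, not part of the SWARM software):

```python
def estimate_speed_kmh(line_distance_m: float, t_line1_s: float, t_line2_s: float) -> float:
    """Estimate vehicle speed from the calibrated distance between two
    counting lines and the timestamps of the two line crossings."""
    dt = t_line2_s - t_line1_s
    if dt <= 0:
        raise ValueError("second crossing must come after the first")
    return line_distance_m / dt * 3.6  # m/s -> km/h

# A vehicle covering a calibrated 15 m gap in 1.2 s travels ~45 km/h.
print(round(estimate_speed_kmh(15.0, 0.0, 1.2), 1))  # 45.0
```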

What needs to be considered for a successful analysis?

Find detailed information about camera requirements/settings as well as camera positioning in the table below.

Set-up parameters and recommendations

Pixels Per Meter (PPM): 250 PPM (vehicle)
Pixels Per Meter is a measurement used to define the amount of potential image detail that a camera offers at a given distance. Using the camera parameters defined below ensures that the minimum required PPM value is achieved.
Tip: Use the Axis lens calculator or a generic lens calculator.

Camera video resolution: 1920x1080 pixel

Camera video protocol/codec: RTSP/H264 or USB 3.0/UYVY, YUY2, YVYU

Camera focal length: min. 3.6-12 mm motorized adjustable focal length

Camera mounting - distance to object center: 5-20 meters
Please consider that the zoom needs to be adjusted according to the capture distance. More details are in the installation set-up guide.

Camera mounting height: 3-8 meters
Please follow the installation set-up guide in detail.

Camera mounting - vertical angle to the object: <40°
Note: Setting the correct distance to the vehicle and the correct camera mounting height should result in the correct vertical angle to the vehicle.

Camera mounting - horizontal angle to the object: 0° - 20°

Possible cameras for this use case

Dahua IPC_HFW5442EP-ZE-B: https://www.dahuasecurity.com/asset/upload/uploads/soft/20190506/DH-IPC-HFW5442E-ZE.pdf
HikVision DS-2CD2646G2-IZS: https://www.hikvision.com/en/products/IP-Products/Network-Cameras/Pro-Series-EasyIP-/ds-2cd2646g2-izs/

The configuration of the solution can be managed centrally in SWARM Control Center. Below, you can see how the standard is configured for optimal results.

Before starting your configuration, make sure you have completed your camera and data configuration.

Configuration settings

Model: Traffic & Parking (Standard)

Configuration option: Counting Line

Journey Time: Choose Journey Time mode in Global settings

Raw Tracks: Disabled

What should be configured?

To retrieve the best accuracy, we strongly recommend configuring a focus area on the (at most two) lanes that should be covered for the use case.

Think of focus areas as inverted privacy zones: the model only "sees" objects inside an area; the rest of the image is blacked out. Make sure to place focus areas so that they cover enough space before an event trigger for the model to "see" objects for roughly as long as if the focus area were not there. The model ignores all objects outside a focus area, so no detection, classification, tracking, or ANPR reading is performed on them.

To receive both the counting data and the journey time data, a Counting Line needs to be configured as an event trigger.

How to place the event triggers?

To achieve the best counting and journey time accuracy, place the Counting Line at a point where the vehicle and its number plate are visible for approximately 10 m of travel. Also make sure to configure the Counting Line at a place where the track calibration still shows stable tracks. More information is in the Installation set-up guide.

You can choose the IN/OUT direction as needed to retrieve the data the way you want it. In addition, you can assign a custom direction name to each of the IN and OUT directions.

Visualize data

You can visualize data via Data Analytics in different widgets.

Scenario

In our Traffic Scenario section, you can find more details about the possible Widgets to be created in the Traffic Scenario Dashboards.

Examples

Here is an example of a Journey Time widget. Journey time can be shown as the average, the median, or up to two different percentiles.

[Image: Journey Time (average and median) widget of two journeys]

Another example below visualizes the journey distribution. A slider lets you step through the different time periods of the chosen aggregation level. In addition, the figures can easily be switched between absolute and relative values.

[Image: Journey distributions widget]
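The aggregate figures shown in these widgets (average, median, percentiles) can be reproduced from raw journey times with Python's standard statistics module; the sample values below are invented for illustration:

```python
import statistics

# Invented sample of journey times between two sensors, in seconds.
journey_times_s = [312, 298, 405, 351, 290, 620, 333, 287, 310, 344]

avg = statistics.mean(journey_times_s)
med = statistics.median(journey_times_s)
# statistics.quantiles with n=100 yields the 1st..99th percentile cut points.
p85 = statistics.quantiles(journey_times_s, n=100)[84]

print(f"average {avg:.0f}s, median {med:.1f}s, 85th percentile {p85:.2f}s")
```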

Retrieve your data

If you need your data for further local analysis, you can export the data of any created widget as a .csv file for further processing in Excel.

If you would like to integrate the data into your IT environment, you can use the REST API. In Data Analytics, you will find a description of the request to use for retrieving the data of each widget.
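An exported widget .csv can of course also be processed programmatically. A sketch using Python's csv module; the column names below are invented and a real export's columns will differ:

```python
import csv
import io

# Invented example of an exported widget .csv (real column names may differ).
exported = io.StringIO(
    "timestamp,journey,avg_journey_time_s\n"
    "2024-05-01T08:00:00Z,Sensor A to Sensor B,312\n"
    "2024-05-01T09:00:00Z,Sensor A to Sensor B,287\n"
)

rows = list(csv.DictReader(exported))
times = [int(r["avg_journey_time_s"]) for r in rows]
print(len(rows), sum(times) / len(times))  # 2 299.5
```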

Environment specification

Object velocity: < 80 km/h

Day/Night/Lighting: Daytime / well illuminated / night vision

Indoor/Outdoor: Outdoor

Accuracy

In this technical documentation, accuracy refers to the penetration rate of a single sensor: the number of correctly identified license plates divided by the total number of vehicles counted during a ground-truth count.

The current penetration rate for this use case is 60%, taking into account different day and night times, weather conditions, and traffic situations. When calculating journey time between two sensors, approximately 36% of journeys are used as the baseline, calculated by multiplying the penetration rates of the two sensors.

The accuracy is sufficient to generate data that can be used to make valid conclusions about vehicle traffic patterns and journey times.
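The arithmetic behind that baseline can be sketched as follows (function names are our own, not part of the SWARM software):

```python
def matched_journey_share(p_sensor_a: float, p_sensor_b: float) -> float:
    """Share of journeys captured by BOTH sensors, assuming each sensor
    reads plates independently at its own penetration rate."""
    return p_sensor_a * p_sensor_b

# Two sensors at the documented 60% single-sensor penetration rate:
print(f"{matched_journey_share(0.60, 0.60):.0%}")  # 36%
```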

Hardware Specifications

Supported Products: VPX, P401, P101/OP101

Frames Per Second: 25

