
Set-up Queue Length Detection

How to get insights into traffic congestion: data generation, camera set-up, and analytics options.


Last updated 1 year ago


Beyond the traffic frequency at given locations, you may want to know the length of the queue that forms when traffic congestion occurs. Combined with the speed of the detected vehicles, this gives you solid insight into both the length and the speed of the current queue.

What data can be generated?

For this use case, SWARM software provides the most relevant traffic insights: vehicle counts, including classification into the SWARM main classes. In addition, you can add a second counting line, calibrate the distance between the two lines, and estimate the speed of vehicles passing both. Combining this with different Regions of Interest (RoIs) gives you the insights needed into traffic congestion.

For traffic frequency, all SWARM main classes can be generated. Depending on the camera settings, vehicles can be detected at distances of up to 70 m.
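The raw events behind these insights can be delivered to your own systems (see the Data Integration section). As an illustration only, with field names that are assumptions rather than the actual SWARM event schema, counting-line events carrying a class and an optional speed estimate could be aggregated like this:

```python
# Illustrative only: the field names "class" and "speed_kmh" are assumed
# for this sketch and are not the actual SWARM event schema.

def summarize_events(events):
    """Aggregate hypothetical counting-line events into per-class counts
    and an average speed over the events that carry a speed estimate."""
    counts = {}
    speeds = []
    for ev in events:
        cls = ev.get("class", "unknown")
        counts[cls] = counts.get(cls, 0) + 1
        if ev.get("speed_kmh") is not None:
            speeds.append(ev["speed_kmh"])
    avg_speed = sum(speeds) / len(speeds) if speeds else None
    return counts, avg_speed

events = [
    {"class": "car", "speed_kmh": 42.0},
    {"class": "truck", "speed_kmh": 38.5},
    {"class": "car", "speed_kmh": None},  # no speed estimated for this pass
]
counts, avg = summarize_events(events)
print(counts, avg)  # {'car': 2, 'truck': 1} 40.25
```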

What needs to be considered for a successful analysis?


Find detailed information about camera requirements/settings as well as camera positioning in the table below.

| Set-up parameter | Recommended |
| --- | --- |
| Pixels Per Meter (PPM) - a measure of the potential image detail a camera offers at a given distance | > 60 PPM (using the camera parameters defined below ensures the minimum required PPM value is achieved) |
| Camera video resolution | 1280×720 pixel |
| Camera video protocol/codec | RTSP/H264 |
| Camera focal length | min. 2.8 mm varifocal lens* |
| Camera mounting - distance to object center | 5–70 meters* |
| Camera mounting height | 3–8 meters |
| Camera mounting - vertical angle to the object | <50° (setting the correct distance to the vehicle and camera mounting height should result in the correct vertical angle) |
| Camera mounting - horizontal angle to the object | 0°–90° |

*The greater the distance of objects to the camera, the longer the required focal length and the larger the resulting dead zone. To achieve the PPM needed for object detection (30 PPM), consider the following table:

| Object distance | Focal length | Dead zone |
| --- | --- | --- |
| 30 m | 2,8 mm | 2,8 m |
| 50 m | 5 mm | 5 m |
| 70 m | 7 mm | >8 m |

Tip: Use the Axis lens calculator or a generic lens calculator.
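The relationship between distance, focal length, and PPM can be sketched numerically. The formula below is the standard pinhole approximation (field-of-view width at the object = distance × sensor width / focal length); the sensor width used here is an assumed illustrative value, since it varies per camera model:

```python
def pixels_per_meter(h_res_px, focal_mm, sensor_width_mm, distance_m):
    """Horizontal pixel density at the object plane (pinhole approximation).

    FOV width at the object = distance * sensor_width / focal_length,
    so PPM = horizontal resolution / FOV width.
    """
    fov_width_m = distance_m * sensor_width_mm / focal_mm
    return h_res_px / fov_width_m

# Assumed values for illustration (a 1280 px wide stream and a 4.0 mm
# sensor width are not taken from the SWARM requirements):
ppm = pixels_per_meter(h_res_px=1280, focal_mm=2.8, sensor_width_mm=4.0, distance_m=30)
print(round(ppm, 1))  # 29.9
```

This shows why a longer focal length is needed at larger distances: PPM falls linearly as distance grows, so the focal length must grow with it to hold the detection threshold.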

Possible cameras for this use case

| Camera | Link |
| --- | --- |
| HikVision DS-2CD2646G2-IZS | https://www.hikvision.com/en/products/IP-Products/Network-Cameras/Pro-Series-EasyIP-/ds-2cd2646g2-izs/ |

Configuration settings

The configuration of the solution can be managed centrally in SWARM Control Center. Below, you can see how the standard is configured for optimal results.

| Configuration | Settings |
| --- | --- |
| Model | Traffic & Parking (Standard) |
| Configuration option | Counting Line & RoIs |
| ANPR | Disabled |
| Raw Tracks | Disabled |

What should be configured?

In order to receive the counting data, including speed, as well as RoI occupancy, a Counting Line and several RoIs need to be configured as event triggers. Depending on the specific use case and object distance, several triggers might need to be combined. Before starting the configuration, make sure your camera is configured accordingly.

How to place the event triggers?

In order to receive information on how fast vehicles are driving and how many objects are currently present in a specific region, configure counting lines with speed estimation together with generic RoIs.

[Illustration: Combine CL, Speed and RoI for Queue Length Detection]

You can choose the IN/OUT direction as needed to retrieve the data you want, and give that direction a custom name.

Visualize data

You can visualize data via Data Analytics in different widgets.

Scenario

In our Traffic Scenario section, you can find more details about the possible widgets that can be created in Traffic Scenario dashboards for speed events and combined triggers.

If you need your data for further local analysis, you can export the data of any created widget as a .csv file for further processing, e.g. in Excel.

If you would like to integrate the data into your IT environment, you can use the REST API. In Data Analytics, you will find a description of the request to use for retrieving the data of each widget.
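An exported widget .csv can also be processed directly in a script instead of Excel. The column names in this sketch are assumptions for illustration; the actual export headers depend on the configured widget:

```python
import csv
import io

# Assumed export layout (not the actual SWARM widget schema):
sample_export = """timestamp,class,speed_kmh
2024-05-01T07:00:00Z,car,41.5
2024-05-01T07:00:12Z,truck,37.0
2024-05-01T07:00:20Z,car,44.5
"""

# Parse the CSV and compute a mean speed over all rows.
rows = list(csv.DictReader(io.StringIO(sample_export)))
speeds = [float(r["speed_kmh"]) for r in rows]
print(len(rows), sum(speeds) / len(speeds))  # 3 41.0
```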

Environment specification

| Parameter | Specification |
| --- | --- |
| Object velocity | < 130 km/h |
| Day/Night/Lighting | Daytime / Well illuminated / Night vision |
| Indoor/Outdoor | Outdoor |
| Expected accuracy (counting + classification), when all environmental, hardware, and camera requirements are met | Counting: >95% (vehicles, bicycles); Classification of main classes: >95%; Classification of subclasses: >85% |

Hardware Specifications

| Parameter | Specification |
| --- | --- |
| Supported products | VPX, P401, P101/OP101, P100/OP100 |
| Frames per second | 25 |
