Calibration support

To configure the stream properly for the best data accuracy, there are two options that will support you in the configuration process.

Live Calibration

For easy calibration, you can use our Live Calibration mode, available from the drop-down in the top right corner of the preview frame. As the screenshot below shows, this mode visualizes which objects the software is able to detect in the currently previewed frame.

We suggest using this calibration view especially when calibrating configurations that use Regions of Interest.

Each detected object is surrounded by a so-called bounding box, which also displays the center of the object. To make the objects easier to distinguish, the calibration mode colors them per main class. Any event that gets delivered via MQTT is triggered by the center of the object (the dot in the center of the bounding box).
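To illustrate this trigger logic, here is a minimal sketch. The `BoundingBox` class, the `roi` polygon, and all values are hypothetical examples, not part of the SWARM API; the point is that only the center of the box is tested against a Region of Interest, so a box that merely overlaps the region does not fire an event.

```python
# Illustration only: hypothetical names and values, not the SWARM API.
# An ROI event fires on the CENTER of the bounding box (the dot shown
# in the calibration view), never on the box edges.
from dataclasses import dataclass


@dataclass
class BoundingBox:
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    @property
    def center(self) -> tuple[float, float]:
        # The dot drawn in the middle of the box in the calibration view.
        return ((self.x_min + self.x_max) / 2, (self.y_min + self.y_max) / 2)


def point_in_polygon(point: tuple[float, float],
                     polygon: list[tuple[float, float]]) -> bool:
    """Standard ray-casting test: is the point inside the polygon?"""
    x, y = point
    inside = False
    j = len(polygon) - 1
    for i in range(len(polygon)):
        xi, yi = polygon[i]
        xj, yj = polygon[j]
        if (yi > y) != (yj > y) and x < (xj - xi) * (y - yi) / (yj - yi) + xi:
            inside = not inside
        j = i
    return inside


roi = [(0.0, 0.0), (0.5, 0.0), (0.5, 0.5), (0.0, 0.5)]  # normalized coordinates

overlapping = BoundingBox(0.4, 0.4, 0.8, 0.8)  # overlaps the ROI, center (0.6, 0.6) is outside
contained = BoundingBox(0.1, 0.1, 0.3, 0.3)    # center (0.2, 0.2) is inside

print(point_in_polygon(overlapping.center, roi))  # False -> no event triggered
print(point_in_polygon(contained.center, roi))    # True  -> event triggered
```

This is why, during Live Calibration, it pays to check where the center dots fall relative to your Regions of Interest, not just whether the boxes touch them.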

Track calibration

The track calibration feature overlays a representative number of object tracks on the screen. With the tracks overlaid, it is clearly visible where in the frame objects are detected best. Based on this input, it is much easier to configure your use cases properly and get good results on the first configuration attempt.

With track calibration history enabled, you can access the track calibration for every hour of the past 24 hours.

The track calibration images are stored on the edge device and are accessible only through the Control Center. Make sure that viewing, storing, and processing these images for up to 24 hours is compliant with your applicable data privacy regulations.

The colors of the tracks and bounding boxes are differentiated per main object class, so that cars, trucks, buses, people, and bicycles can be distinguished. You can find the color legend behind the question mark in the preview frame, as shown in the screenshot below.

We suggest using this calibration support for any of the following use cases:

  • Traffic monitoring
  • Barrierless parking
  • Single & multi-space parking
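Whichever use case you calibrate, you can cross-check the result by watching the raw events arrive. Since events are delivered via MQTT (see above), a minimal subscriber is enough. The sketch below uses the paho-mqtt client; the broker address, port, and topic are placeholders, not SWARM defaults, so substitute the values configured for your custom MQTT server.

```python
# Minimal sketch for inspecting raw events during calibration.
# BROKER and TOPIC are placeholders, not SWARM defaults; use the values
# configured for your custom MQTT server.
import json

import paho.mqtt.client as mqtt

BROKER = "mqtt.example.com"  # placeholder
TOPIC = "swarm/events/#"     # placeholder


def on_connect(client, userdata, flags, reason_code, properties):
    client.subscribe(TOPIC)


def on_message(client, userdata, msg):
    # Payload format assumed to be JSON here; adjust to your event schema.
    event = json.loads(msg.payload)
    print(msg.topic, event)


client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
client.on_connect = on_connect
client.on_message = on_message
client.connect(BROKER, 1883)
client.loop_forever()
```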
[Screenshot: Live Calibration view]
[Screenshot: Enabling track calibration history for camera streams]
[Screenshot: Slider giving access to the track calibrations of the last 24 hours]
[Screenshot: Track Calibration view]