
Set-up Single Space/Multi Space Parking

Gather the real-time occupancy state of specific parking spaces - free or occupied


If you have a parking area where you simply want to know whether your specific parking spaces are occupied or free, SWARM provides a solution for doing exactly that. See for yourself:

What data can be generated?

For this use case, the SWARM software provides all relevant data for single space detection within your parking area. The solution delivers the occupancy state of each of your configured parking spaces.

Single space detection gives you the occupancy state of each parking space (free or occupied) as well as information about the object in the parking space, including its classification. However, consider that the following configuration is optimized to detect vehicles, not people or bicycles. In addition, classification accuracy depends on the camera installation: the more top-down the view, the less accurate the classification.

Camera placement

Good camera placement and a good understanding of the following section are key to accurate detection for Single Space Parking.

The main challenge in planning a camera installation is to avoid potential occlusions by other cars. We advise using the Axis lens calculator or a generic lens calculator and testing your parking setup against the following conditions:

  • Put a car on one of the parking spaces.

  • Put a large vehicle (a high van or small truck - the largest vehicle you expect in your parking area) on all parking spaces next to your car.

  • If you can still see more than 70% of the car, the parking spot is valid.

General & easy recommendations for deciding where to place the camera:

  • Parking spots have to be fully visible (inside the field of view of the camera). We do not guarantee full accuracy for cropped single parking spaces.

  • Avoid objects (trees, poles, flags, walls, other vehicles) that occlude the parking spaces. Avoid camera positions where cars (especially high cars like vans) occlude other cars.

  • Occlusions by other parked cars mainly happen if parking spaces are aligned along the camera's line of sight.

For a better overview of installations, more details on camera distance to objects and mounting height are given below.

What needs to be considered for a successful analysis?

Find detailed information about camera requirements/settings as well as camera positioning below (all values are recommendations).

  • Pixels per Meter (PPM): > 60 PPM. Pixels per meter defines the amount of potential image detail that a camera offers at a given distance. Using the camera parameters defined below ensures that the minimum required PPM value is achieved.

  • Camera video resolution: 1280×720 pixel

  • Camera video protocol/codec: RTSP/H264

  • Camera focal length: 2.8 mm - 4 mm

  • Camera mounting - distance to object center: 5-30 m (cars in the center of the image). For a distance of 5 m we guarantee high accuracy for 3 parking spaces aligned orthogonally to the camera. The higher the distance to the camera, the more parking spaces can be monitored.

  • Camera mounting height: Indoor 2.5-5 m, Outdoor 2.5-10 m. Higher is better: vehicles can potentially occlude the parked cars, hence we recommend higher mounting points.

  • Wide Dynamic Range: must be enabled

  • Night mode: enabled
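To check whether a planned installation reaches the recommended 60 PPM, you can estimate the value from the camera parameters before mounting anything. The following is a minimal sketch, not part of the SWARM software: it assumes a simple pinhole camera model and a sensor width of about 5.4 mm (roughly a 1/2.8" sensor), so substitute the values of your actual camera.

```python
# Rough PPM estimate, assuming a pinhole camera model.
# The sensor width (5.4 mm, roughly a 1/2.8" sensor) is an assumption;
# use the horizontal sensor size of your actual camera.

def pixels_per_meter(resolution_px: int, focal_length_mm: float,
                     sensor_width_mm: float, distance_m: float) -> float:
    """Estimate pixels per meter at a given distance from the camera."""
    # Horizontal width of the scene covered by the sensor at that distance.
    scene_width_m = sensor_width_mm / focal_length_mm * distance_m
    return resolution_px / scene_width_m

# Example: 1280x720 stream, 2.8 mm lens, parking space about 10 m away.
ppm = pixels_per_meter(resolution_px=1280, focal_length_mm=2.8,
                       sensor_width_mm=5.4, distance_m=10.0)
print(f"Estimated PPM: {ppm:.1f}")  # roughly 66 PPM, above the recommended 60
```

The Axis lens calculator or a generic lens calculator gives you the same answer using the exact sensor data of your camera model.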

Configuration settings

  • Model: Traffic & Parking (Accuracy+)

  • Configuration option: Single Space (RoI) or Multi Space (RoI)

  • Raw tracks: Disabled

How to place the configuration type?

In the Parking Event templates you will find the two options Single Space (RoI) and Multi Space (RoI). These are the event types you need to set up this use case. Use a Single Space (RoI) when you configure a parking space for a single car. If you have an area where you expect more than one car, choose Multi Space (RoI). The difference between these two event types is the maximum capacity that you can set in the trigger settings.

Place the Region of Interest (RoI) on the parking space you would like to configure. A vehicle counts as being in the RoI if the center point of the object is inside the RoI.

Since the center point of the object determines whether the object is inside an RoI, take care to configure the RoI with the camera perspective in mind.
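As an illustration of this center-point rule (not SWARM code), the sketch below checks whether vehicle center points fall inside an RoI polygon and compares the count against the maximum capacity of a Multi Space (RoI); all coordinates and the capacity value are made-up example values.

```python
# Illustration of the center-point rule: a vehicle is counted for an RoI
# only if the center of its detection box lies inside the RoI polygon.
# Polygon coordinates, detections and capacity are made-up example values.

def point_in_polygon(x: float, y: float, polygon: list[tuple[float, float]]) -> bool:
    """Ray-casting point-in-polygon test."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Multi Space RoI drawn over three adjacent parking spaces (pixel coordinates).
roi = [(100, 400), (700, 400), (700, 600), (100, 600)]
max_capacity = 3  # maximum capacity set in the trigger settings

# Center points of the current vehicle detection boxes.
vehicle_centers = [(180, 480), (420, 510), (900, 450)]

occupied = sum(point_in_polygon(x, y, roi) for x, y in vehicle_centers)
print(f"Vehicles in RoI: {occupied}/{max_capacity}, free: {max_capacity - occupied}")
```

In practice you verify the placement directly with the calibration mode described further down, which overlays the detection boxes and center points on the camera image.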

Visualize data

Scenario

In our Parking Scenario section, you can find more details about the widgets that can be created in the Parking Scenario dashboards.

Example

You can visualize the data for any Single Space or Multi Space parking area you have configured with the Parking RoI: the occupancy status as well as the number of vehicles, either per RoI or aggregated across one or several camera streams. For this use case you can add the Current & Historic Parking Utilization or the Single/Multi Space Occupancy widgets.

Retrieve your data

If you need your data for further local analysis, you can export the data of any created widget as a CSV file for further processing, for example in Excel.
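If you process the exported file programmatically instead of in Excel, a few lines of scripting are enough to summarize it. The sketch below is only an illustration: the file name and the column names (roi_name, state) are assumptions, so adapt them to the header of the CSV file you actually export.

```python
# Hedged sketch: summarize an exported widget CSV locally.
# File name and column names are assumptions - adapt them to your export.
import csv
from collections import Counter

occupied_counts: Counter[str] = Counter()
total_counts: Counter[str] = Counter()

with open("single_space_occupancy_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        roi = row["roi_name"]                     # assumed column: parking RoI name
        total_counts[roi] += 1
        if row["state"].lower() == "occupied":    # assumed column: occupancy state
            occupied_counts[roi] += 1

for roi, total in total_counts.items():
    share = occupied_counts[roi] / total * 100
    print(f"{roi}: occupied in {share:.1f}% of the exported samples")
```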

Environment requirements

  • Object velocity: 0 km/h

  • Day/Night/Lighting: daytime; nighttime (only well illuminated or with night vision mode)

  • Indoor/Outdoor: indoor or outdoor

  • Expected accuracy (when all environmental, hardware and camera requirements are met): >95%; classification is not considered

Hardware Specifications

  • Supported products: VPX, P401, P101/OP101, P100/OP100

  • Frames per second (FPS): 5

Tip: Use the Axis lens calculator or a generic lens calculator.

The configuration of the solution can be managed centrally in the SWARM Control Center. Below, you can see how to configure a Single Space Parking use case to get the best results.

To start your configuration, make sure that you have completed your camera and data configuration.

If the distance from the camera to the object (parking space) is larger, the perspective has a greater impact and you need to adapt the RoI accordingly. To support the calibration in the best way, you can use the calibration mode, which can be activated at the top right of the configuration frame. There you will see the detection boxes and center points of the vehicles currently visible in the camera image, so configure the RoI so that the center point falls inside it.

You can visualize data via Data Analytics in different widgets.

If you would like to integrate the data into your IT environment, you can use the REST API. In Data Analytics, you will find a description of the request to use for retrieving the data of each widget.
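As a rough illustration of such an integration (not the exact SWARM endpoint), the sketch below pulls widget data over HTTPS with a bearer token and stores the response locally; the base URL, path, query parameters and token handling are placeholders, so copy the actual request shown for your widget in Data Analytics.

```python
# Hedged sketch of pulling widget data via the Data Analytics REST API.
# The URL, path, parameters and token below are placeholders - copy the
# actual request shown for your widget in Data Analytics.
import json
import requests

BASE_URL = "https://example.swarm-analytics.com"   # placeholder, not the real host
API_TOKEN = "YOUR_API_TOKEN"                        # token issued for your account

response = requests.get(
    f"{BASE_URL}/widget-data",                      # placeholder endpoint
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    params={"widgetId": "single-space-occupancy",   # placeholder parameters
            "from": "2024-01-01T00:00:00Z",
            "to": "2024-01-31T23:59:59Z"},
    timeout=30,
)
response.raise_for_status()

with open("occupancy_data.json", "w") as f:
    json.dump(response.json(), f, indent=2)
print("Stored widget data locally for further analysis.")
```

From there, the JSON response can be fed into whatever reporting or storage system your IT environment uses.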
