
Calibrations Overview


Scenes that combine 2D and 3D data across different Coordinate Systems need calibrations to align the sensors by position and orientation, and to project 3D points onto a camera’s image plane.

Every calibration describes a sensor’s 3D position and orientation relative to the reference system. Camera calibrations additionally describe how 3D points are mapped onto the camera’s image plane.
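
To make this concrete, the sketch below shows the underlying geometry in plain pinhole terms (illustrative only, not the Kognic API): the extrinsic pose places the camera in the reference system, and the intrinsic parameters map camera-frame points to pixel coordinates.

```python
import numpy as np

# Illustrative geometry only; this is not the Kognic API.
# Extrinsics: the camera's pose in the reference coordinate system.
R = np.eye(3)                     # orientation of the camera in the reference system
t = np.array([0.0, 0.0, 1.5])     # position of the camera in the reference system

# Intrinsics of a simple pinhole camera: focal lengths and principal point (pixels).
fx, fy, cx, cy = 1000.0, 1000.0, 960.0, 540.0

def project(point_ref: np.ndarray) -> np.ndarray:
    """Project a 3D point given in the reference system onto the image plane."""
    p_cam = R.T @ (point_ref - t)        # reference system -> camera frame
    u = fx * p_cam[0] / p_cam[2] + cx    # perspective division + intrinsics
    v = fy * p_cam[1] / p_cam[2] + cy
    return np.array([u, v])

print(project(np.array([1.0, 0.5, 10.0])))  # pixel coordinates of the point
```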

  • For LiDAR/RADAR, there is only one type of calibration available; read more here.
  • For cameras, we support several types of Standard Camera Calibrations, where you only need to provide the camera’s intrinsic parameters.

Unsupported camera model

If your camera model is not supported, you can instead provide a Custom Camera Calibration, where you supply the projection implementation as a WebAssembly module.

How to create a calibration

See the example below of how to create a calibration for a LiDAR sensor and two camera sensors of type Pinhole. For other camera types, such as Kannala and Fisheye, see kognic-io-examples.

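The snippet below is a minimal sketch of such a script, based on the kognic-io-examples repository. The exact module paths, class names, and fields used here (KognicIOClient, LidarCalibration, PinholeCalibration, Position, RotationQuaternion, and so on) are assumptions that may differ between kognic-io versions, so treat kognic-io-examples as the source of truth.

```python
# Sketch only: module paths and model names are assumed from kognic-io-examples
# and may differ in your kognic-io version.
from kognic.io.client import KognicIOClient
import kognic.io.model.calibration as calib

client = KognicIOClient()  # credentials are read from the environment (see the authentication docs)

# Extrinsics shared by all sensors: position and orientation in the reference system.
lidar = calib.LidarCalibration(
    position=calib.Position(x=0.0, y=0.0, z=0.0),
    rotation_quaternion=calib.RotationQuaternion(w=1.0, x=0.0, y=0.0, z=0.0),
)

def pinhole(x: float) -> calib.PinholeCalibration:
    # Cameras additionally need intrinsics: camera matrix and distortion coefficients.
    return calib.PinholeCalibration(
        position=calib.Position(x=x, y=0.0, z=1.5),
        rotation_quaternion=calib.RotationQuaternion(w=1.0, x=0.0, y=0.0, z=0.0),
        camera_matrix=calib.CameraMatrix(fx=1000.0, fy=1000.0, cx=960.0, cy=540.0),
        distortion_coefficients=calib.DistortionCoefficients(k1=0.0, k2=0.0, p1=0.0, p2=0.0, k3=0.0),
        image_height=1080,
        image_width=1920,
    )

# Sensor names must match the sensor names used in the scenes you upload.
sensor_calibration = calib.SensorCalibration(
    external_id="example-calibration",
    calibration={"lidar": lidar, "CAM_FRONT": pinhole(-0.5), "CAM_BACK": pinhole(0.5)},
)
created = client.calibration.create_calibration(sensor_calibration)
print(created.id)  # reuse this id (or the external id) for multiple scenes
```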


Reuse calibration

Note that after a calibration has been created you can, and should, reuse it for multiple scenes whenever possible; see below.

Existing calibrations can be fetched either by the id returned when the calibration was created or by the external id you provided. This can be done via the Python client or via kognicutil.
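
A minimal sketch of fetching a calibration with the Python client is shown below; the method names (get_calibration, get_calibrations) are assumptions based on the kognic-io-examples repository and may differ between versions. For the equivalent kognicutil commands, see the kognicutil documentation.

```python
# Sketch only: method names assumed from kognic-io-examples.
from kognic.io.client import KognicIOClient

client = KognicIOClient()

# Fetch by the id that was returned when the calibration was created...
calibration = client.calibration.get_calibration(calibration_id="<calibration-id>")

# ...or list calibrations matching the external id you provided.
calibrations = client.calibration.get_calibrations(external_id="example-calibration")
```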
