We are now hosting documentation for LibMultiSense at http://docs.carnegierobotics.com/libmultisense/. This documentation will be updated as new versions of LibMultiSense are released. You can also generate the HTML documentation yourself by installing and running the Doxygen tool on a checkout of the LibMultiSense source code.
Version 3.4.7 of the MultiSense ROS driver for the CRL MultiSense line of products was released on March 4, 2016.
This version adds support for a new MultiSense DeviceStatus message that reports the internal condition (power, temperature, and status) of the MultiSense. It also replaces the MultiSense SL calibration check utility with a version that no longer relies on a MATLAB runtime being installed.
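If you want to watch these status messages on a live unit, a minimal rospy subscriber along the following lines should work. Note that the /multisense/status topic name and the multisense_ros.msg.DeviceStatus import path are assumptions here, not taken from the release notes; confirm them with rostopic list and rosmsg show before relying on this sketch.

#!/usr/bin/env python
# Hedged sketch: print MultiSense device status messages as they arrive.
# The topic and message names below are assumptions; verify them with
# `rostopic list` and `rosmsg show` on your system.
import rospy
from multisense_ros.msg import DeviceStatus  # assumed import path

def on_status(msg):
    # Log the whole message rather than guessing at individual field names.
    rospy.loginfo("MultiSense status:\n%s", msg)

if __name__ == '__main__':
    rospy.init_node('multisense_status_listener')
    rospy.Subscriber('/multisense/status', DeviceStatus, on_status)
    rospy.spin()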
MultiSense firmware v3.4 incorporates the following new features:
- Add support for persistent imager gain and black-level offset adjustments
- Add non-volatile storage for external calibration
- Add support for external lighting on S21 cameras
- Add additional device status messages
Version 3.4.5 of the MultiSense ROS driver for the CRL MultiSense line of products was released on October 20, 2015.
This version changes the image encoding types from 8UC1/16UC1 to the mono8/mono16 data formats. In some versions of ROS these types are not equivalent, which caused processing and display issues for some customers.
This process outlines how to calibrate a MultiSense unit yourself and then load the new calibration onto the device. We suggest saving the factory calibration before uploading a calibration of your own.
Note: the accuracy of these methods is not guaranteed. They can still be useful, for example as an approximate calibration for a sanity check, or as a temporary calibration to use during the few days it takes Carnegie Robotics LLC to recalibrate the device.
- Save the factory calibration
rosrun multisense_lib ImageCalUtility -e crl_extrinsics.yml -i crl_intrinsics.yml
- Run the calibration process
rosrun camera_calibration cameracalibrator.py --size 12x9 --square 0.108 right:=/multisense/right/image_mono left:=/multisense/left/image_mono right_camera:=/multisense/right left_camera:=/multisense/left --no-service-check
- Retrieve the calibration values: extract the tar file written by the calibrator and open ost.txt.
- Fill in the intrinsics and extrinsics yml file templates from the ost.txt file (see the mapping and templates below; hedged scripting sketches follow each).
- Program the new calibration parameters to the device. Note: changes are not applied until you relaunch the device.
rosrun multisense_lib ImageCalUtility -e extrinsics.yml -i intrinsics.yml -s
Note on MultiSense Topics & Scaling
The calibration values affected by image resolution are fx, fy, cx, and cy. The fx, fy, cx, and cy values published in the MultiSense calibration topics correspond to the maximum resolution of the imager, so they must be scaled to the current operating resolution. For example, if the maximum operating resolution of the camera is 2048x1088 and you are currently operating at 1024x544, then fx and cx need to be scaled by (1024/2048), and fy and cy need to be scaled by (544/1088).
Alternatively, you can use the values in the topic /multisense/calibration/raw_cam_config, which are pre-scaled based on the current operating resolution.
Note: The maximum camera operating resolution can be found in /multisense/calibration/device_info. The current operating resolution is transmitted in each image header.
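As a concrete check, here is a minimal Python sketch of that scaling, using the 2048x1088 / 1024x544 example above; the intrinsic values are illustrative placeholders, not calibration data for any particular unit.

# Minimal sketch: scale full-resolution intrinsics down to the current
# operating resolution, as described above. Numbers are illustrative only.
full_w, full_h = 2048, 1088        # maximum resolution (see device_info)
cur_w, cur_h = 1024, 544           # current resolution (from image headers)

sx = float(cur_w) / full_w         # horizontal scale factor (0.5 here)
sy = float(cur_h) / full_h         # vertical scale factor (0.5 here)

fx, fy, cx, cy = 973.3, 973.3, 1037.4, 519.3   # full-resolution values

print(fx * sx, fy * sy, cx * sx, cy * sy)      # 486.65 486.65 518.7 259.65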
Mapping between ost.txt and MultiSense YML terms
# oST version 5.0 parameters

[image]                                     ## the relationship between these
width                                       ## width/height values and the full
1024                                        ## resolution of the MultiSense
height                                      ## imager determines your scaling
544                                         ## factor

[narrow_stereo/left]

camera matrix                               ## scale and use for M1
452.862943 0.000000 500.689767
0.000000 453.075014 259.871269
0.000000 0.000000 1.000000

distortion                                  ## use as D1
-0.005753 0.011164 -0.000383 0.002185 0.000000

rectification                               ## use for R1
0.999814 0.004405 -0.018761
-0.004402 0.999990 0.000210
0.018761 -0.000128 0.999824

projection                                  ## scale and use for P1
486.641243 0.000000 518.700439 0.000000
0.000000 486.641243 259.667084 0.000000
0.000000 0.000000 1.000000 0.000000

[narrow_stereo/right]

camera matrix                               ## scale and use for M2
454.563964 0.000000 489.861282
0.000000 454.989749 258.359351
0.000000 0.000000 1.000000

distortion                                  ## use for D2
-0.011751 0.025128 -0.000291 0.000835 0.000000

rectification                               ## use for R2
0.999811 0.001660 -0.019391
-0.001663 0.999999 -0.000153
0.019391 0.000185 0.999812

projection                                  ## scale and use for P2; entry [0,3] should be around -200
486.641243 0.000000 518.700439 -227.281969
0.000000 486.641243 259.667084 0.000000
0.000000 0.000000 1.000000 0.000000
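If you prefer to script the transcription rather than copy numbers by hand, a small parser along these lines can lift the labeled numeric blocks out of the plain ost.txt written by the calibrator. This is only a sketch: it assumes the layout shown above (a section header, a label line, then rows of numbers) and simply ignores any ## annotations.

# Hedged sketch: pull the labeled numeric blocks out of ost.txt.
import numpy as np

def read_blocks(path):
    """Return {(section, label): matrix} for each labeled block of numbers."""
    blocks, section, label, rows = {}, None, None, []
    for raw in open(path):
        line = raw.split('##')[0].strip()    # drop annotations, whitespace
        if not line or line.startswith('#'):
            continue                         # skip blanks and the header comment
        if line.startswith('['):             # new section, e.g. [narrow_stereo/left]
            if label and rows:
                blocks[(section, label)] = np.array(rows)
            section, label, rows = line.strip('[]'), None, []
        elif line[0].isalpha():              # new block label, e.g. "camera matrix"
            if label and rows:
                blocks[(section, label)] = np.array(rows)
            label, rows = line, []
        else:                                # a row of numbers
            rows.append([float(tok) for tok in line.split()])
    if label and rows:
        blocks[(section, label)] = np.array(rows)
    return blocks

# Example usage: grab the left camera matrix and the right projection matrix.
blocks = read_blocks('ost.txt')
M1 = blocks[('narrow_stereo/left', 'camera matrix')]
P2 = blocks[('narrow_stereo/right', 'projection')]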
File Template: intrinsics.yml
%YAML:1.0
M1: !!opencv-matrix
   rows: 3
   cols: 3
   dt: d
   data: [ fx_left, 0, cx_left,
           0, fy_left, cy_left,
           0, 0, 1 ]
D1: !!opencv-matrix
   rows: 1
   cols: 8
   dt: d
   data: [ k1_left, k2_left, p1_left, p2_left,
           k3_left, k4_left, k5_left, k6_left ]
M2: !!opencv-matrix
   rows: 3
   cols: 3
   dt: d
   data: [ fx_right, 0, cx_right,
           0, fy_right, cy_right,
           0, 0, 1 ]
D2: !!opencv-matrix
   rows: 1
   cols: 8
   dt: d
   data: [ k1_right, k2_right, p1_right, p2_right,
           k3_right, k4_right, k5_right, k6_right ]
Definitions of the above template terms
- M1: left camera matrix [unrectified]
- D1: left distortion coefficients [k=radial, p=tangential]
- M2: right camera matrix [unrectified]
- D2: right distortion coefficients [k=radial, p=tangential]
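Rather than hand-editing the template, you can let OpenCV write the file. The hedged Python sketch below uses cv2.FileStorage, which emits the %YAML:1.0 header and !!opencv-matrix tags shown above. The numbers are the ost.txt example values from the mapping, scaled by 2 on the assumption of a 2048x1088 full-resolution imager versus the 1024x544 calibration resolution; adjust the scale for your own unit. The five ost.txt distortion coefficients are zero-padded to the template's eight slots.

# Hedged sketch: write intrinsics.yml via cv2.FileStorage using the ost.txt
# example values. The scale factor assumes a 2048x1088 full-resolution imager.
import cv2
import numpy as np

scale = 2.0  # 2048/1024 == 1088/544; change for your resolutions

M1 = np.array([[452.862943, 0.0, 500.689767],
               [0.0, 453.075014, 259.871269],
               [0.0, 0.0, 1.0]])
M1[:2, :] *= scale                          # scale fx, cx, fy, cy
D1 = np.array([[-0.005753, 0.011164, -0.000383, 0.002185,
                0.0, 0.0, 0.0, 0.0]])       # 5 coefficients, zero-padded to 8

M2 = np.array([[454.563964, 0.0, 489.861282],
               [0.0, 454.989749, 258.359351],
               [0.0, 0.0, 1.0]])
M2[:2, :] *= scale
D2 = np.array([[-0.011751, 0.025128, -0.000291, 0.000835,
                0.0, 0.0, 0.0, 0.0]])

fs = cv2.FileStorage('intrinsics.yml', cv2.FILE_STORAGE_WRITE)
for name, mat in (('M1', M1), ('D1', D1), ('M2', M2), ('D2', D2)):
    fs.write(name, mat)
fs.release()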
File Template: extrinsics.yml
%YAML:1.0
R1: !!opencv-matrix
   rows: 3
   cols: 3
   dt: d
   data: [ r11_left, r12_left, r13_left,
           r21_left, r22_left, r23_left,
           r31_left, r32_left, r33_left ]
P1: !!opencv-matrix
   rows: 3
   cols: 4
   dt: d
   data: [ f, 0, cx_left, 0,
           0, f, cy_left, 0,
           0, 0, 1, 0 ]
R2: !!opencv-matrix
   rows: 3
   cols: 3
   dt: d
   data: [ r11_right, r12_right, r13_right,
           r21_right, r22_right, r23_right,
           r31_right, r32_right, r33_right ]
P2: !!opencv-matrix
   rows: 3
   cols: 4
   dt: d
   data: [ f, 0, cx_right, tx,
           0, f, cy_right, 0,
           0, 0, 1, 0 ]
Definitions of the above template terms
- R1: left rotation matrix [rectification matrix]
- P1: left projection matrix [rectified camera matrix]
- R2: right rotation matrix [rectification matrix]
- P2: right projection matrix [rectified camera matrix]
- tx: entry [0,3] of the right projection matrix in ost.txt (equal to -f multiplied by the stereo baseline, so roughly -200 in the example above)
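The extrinsics file can be produced the same way. In this companion sketch the rotation matrices are copied from the ost.txt example unchanged, while the projection matrices, including the [0,3] baseline term of P2, scale together with the focal length; the same 2048x1088 full-resolution assumption applies.

# Hedged sketch: write extrinsics.yml via cv2.FileStorage using the ost.txt
# example values. Rotations are not scaled; projections are.
import cv2
import numpy as np

scale = 2.0  # same assumption as the intrinsics sketch

R1 = np.array([[0.999814, 0.004405, -0.018761],
               [-0.004402, 0.999990, 0.000210],
               [0.018761, -0.000128, 0.999824]])
P1 = np.array([[486.641243, 0.0, 518.700439, 0.0],
               [0.0, 486.641243, 259.667084, 0.0],
               [0.0, 0.0, 1.0, 0.0]])
P1[:2, :] *= scale

R2 = np.array([[0.999811, 0.001660, -0.019391],
               [-0.001663, 0.999999, -0.000153],
               [0.019391, 0.000185, 0.999812]])
P2 = np.array([[486.641243, 0.0, 518.700439, -227.281969],
               [0.0, 486.641243, 259.667084, 0.0],
               [0.0, 0.0, 1.0, 0.0]])
P2[:2, :] *= scale  # also scales the [0,3] baseline term along with f

fs = cv2.FileStorage('extrinsics.yml', cv2.FILE_STORAGE_WRITE)
for name, mat in (('R1', R1), ('P1', P1), ('R2', R2), ('P2', P2)):
    fs.write(name, mat)
fs.release()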
The MultiSense ROS driver was updated on June 22, 2015.
Issue 48: adds an effort field to the joint states topic.
Issue 50: adds support for a new topic, /multisense/openni_depth, which follows the OpenNI raw depth format. This topic uses less bandwidth than the traditional depth image and can be compressed with image_transport.
Documentation was added to the Depth Camera section.
More information about these changes can be found in the ROS driver documentation.
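As a consumer-side illustration, the hedged sketch below subscribes to the new topic. It assumes the usual OpenNI raw convention of 16-bit unsigned depths in millimeters with zero marking no return; check the driver documentation before depending on that convention.

#!/usr/bin/env python
# Hedged sketch: read /multisense/openni_depth, assuming OpenNI raw format
# (uint16 depth in millimeters, 0 = no return).
import rospy
import numpy as np
from cv_bridge import CvBridge
from sensor_msgs.msg import Image

bridge = CvBridge()

def on_depth(msg):
    depth_mm = bridge.imgmsg_to_cv2(msg)   # uint16 image, millimeters
    valid = depth_mm[depth_mm > 0]         # drop no-return pixels
    if valid.size:
        rospy.loginfo("median depth: %.3f m", np.median(valid) / 1000.0)

if __name__ == '__main__':
    rospy.init_node('openni_depth_listener')
    rospy.Subscriber('/multisense/openni_depth', Image, on_depth)
    rospy.spin()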
The Cloud Demo Software Suite is a set of open-source demo applications showing usage of Carnegie Robotics' MultiSense stereo products, including the S7 and S21.
Note that these software packages generally assume operation with the 2 MP version of CRL's stereo systems and will need some tweaking to work with 4 MP imagers.
Here are the installation and build instructions.
All of our MultiSense SL, S7, S7S, and S21 sensors are offered in two versions: with 2 megapixel (CMV2000) or 4 megapixel (CMV4000) imagers from CMOSIS. This document describes the differences between the two versions.
[The linked document presents the specifications in two sections, one for CMV2000 imagers and one for CMV4000 imagers.]
The MultiSense SL unit stores a unique calibration used to transform laser range data into the left camera optical frame. This calibration comprises two static transforms: one from the motor frame to the left camera frame, and one from the laser frame to the spindle frame. An additional transform between the spindle frame and the motor frame accounts for the rotation of the laser. The MultiSense SL ROS driver automatically generates a transform tree based on these three transforms. Both the ROS driver and the underlying API also offer mechanisms to perform custom transformations of laser data. The document details this transform tree and provides a basic example of how to perform this transformation.
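A back-of-the-envelope version of that chain, using 4x4 homogeneous matrices, looks like the sketch below. The two calibration transforms are left as identity placeholders and the spindle rotation axis is an assumption; on a real unit all three come from the stored calibration and the device's frame conventions, so treat this only as an illustration of the composition order.

# Minimal sketch of the laser-to-camera transform chain described above.
import numpy as np

def spindle_rotation(theta):
    """Rotation of the spindle relative to the motor frame.
    The rotation axis (Z here) is an assumption, not a device convention."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0, 0.0],
                     [s,  c, 0.0, 0.0],
                     [0.0, 0.0, 1.0, 0.0],
                     [0.0, 0.0, 0.0, 1.0]])

# Placeholder calibration transforms (identity); on a real unit these come
# from the laser calibration stored on the device.
T_cam_motor = np.eye(4)      # motor frame -> left camera optical frame
T_spindle_laser = np.eye(4)  # laser frame -> spindle frame

def laser_point_in_camera(p_laser, theta):
    """Map a 3D point in the laser frame into the left camera frame."""
    T = T_cam_motor.dot(spindle_rotation(theta)).dot(T_spindle_laser)
    return T.dot(np.append(p_laser, 1.0))[:3]

print(laser_point_in_camera(np.array([1.0, 0.0, 0.0]), np.pi / 4))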