{"id":6080,"date":"2019-09-27T11:12:06","date_gmt":"2019-09-27T09:12:06","guid":{"rendered":"https:\/\/blog.generationrobots.com\/?p=6080"},"modified":"2023-06-29T11:02:06","modified_gmt":"2023-06-29T09:02:06","slug":"lidar-integration-with-ros-quickstart-guide-and-projects-ideas","status":"publish","type":"post","link":"https:\/\/www.generationrobots.com\/blog\/en\/lidar-integration-with-ros-quickstart-guide-and-projects-ideas\/","title":{"rendered":"LiDAR integration with ROS: quickstart guide and project ideas"},"content":{"rendered":"\t\t<div data-elementor-type=\"wp-post\" data-elementor-id=\"6080\" class=\"elementor elementor-6080\" data-elementor-post-type=\"post\">\n\t\t\t\t\t\t<section class=\"elementor-section elementor-top-section elementor-element elementor-element-66d2f94a elementor-section-boxed elementor-section-height-default elementor-section-height-default\" data-id=\"66d2f94a\" data-element_type=\"section\" data-e-type=\"section\">\n\t\t\t\t\t\t<div class=\"elementor-container elementor-column-gap-default\">\n\t\t\t\t\t<div class=\"elementor-column elementor-col-100 elementor-top-column elementor-element elementor-element-5bc1fccf\" data-id=\"5bc1fccf\" data-element_type=\"column\" data-e-type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap elementor-element-populated\">\n\t\t\t\t\t\t<div class=\"elementor-element elementor-element-7e64b474 elementor-widget elementor-widget-text-editor\" data-id=\"7e64b474\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\n<p>\u00a0<\/p>\n<p class=\"has-text-align-center\"><span style=\"color: #000000;\"> <strong> Read our other blog posts from our \u201cLiDAR technology\u201d series <\/strong> <\/span><\/p>\n\n<figure class=\"wp-block-table\">\n<table>\n<tbody>\n<tr>\n<td><a title=\"Which applications for a LiDAR?\" href=\"\/blog\/en\/which-applications-for-a-lidar\/\"> <img 
decoding=\"async\" src=\"https:\/\/static.generation-robots.com\/img\/cms\/bouton-CTA-articles-blog-lidar-3-EN.jpg\" alt=\"Which applications for a LiDAR?\" width=\"270\" height=\"110\" \/> <\/a><\/td>\n<td>\u00a0<\/td>\n<td>\u00a0<\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<\/figure>\n\n<h2 class=\"wp-block-heading\">LiDAR integration with ROS: quickstart guide and project ideas<\/h2>\n\n<p>In this post, you will learn <strong> how to connect and integrate your LiDAR with your PC or embedded system using ROS middleware <\/strong> on Ubuntu. We will also talk about <strong> data fusion <\/strong> (widely used in mobile robotics).<\/p>\n\n<h2 class=\"wp-block-heading\">Are you new to ROS?<\/h2>\n\n<p>The <strong> <a class=\"catalogue\" title=\"ROS \u2013 Robot Operating System\" href=\"\/blog\/en\/ros-robot-operating-system-2\/\"> Robot Operating System (ROS) <\/a> <\/strong> is a set of software libraries and tools that help you build robot applications. From drivers to state-of-the-art algorithms, and with powerful developer tools, you will almost certainly need ROS for your next robotics project. ROS is completely open source.<\/p>\n\n<p>If you don\u2019t have Ubuntu set up on your computer, follow these <a class=\"catalogue\" title=\"Set up Ubuntu 18.04\" href=\"https:\/\/howtoubuntu.org\/how-to-install-ubuntu-18-04-bionic-beaver\" target=\"_blank\" rel=\"noopener noreferrer\"> instructions <\/a> to download the latest version. 
You have to download the ROS distribution matching your Ubuntu version; for Ubuntu 18.04, you need to install <strong> ROS Melodic <\/strong> .<\/p>\n\n<p>Once your computer is running Ubuntu and ROS is set up, we recommend working through these <strong> <a class=\"catalogue\" title=\"ROS tutorials\" href=\"http:\/\/wiki.ros.org\/ROS\/Tutorials\" target=\"_blank\" rel=\"noopener noreferrer\"> ROS tutorials <\/a> <\/strong> to get familiar with this middleware (beginner and intermediate levels available).<\/p>\n\n<h2 class=\"wp-block-heading\">Receive data from your LiDAR<\/h2>\n\n<p>In order to connect the LiDAR to your PC, you must first take care of the power supply. Depending on your device, it may require 5 VDC or 12\/24 VDC, for instance.<\/p>\n\n<p>A 5 VDC supply is usually provided by a USB connector plugged into your PC: you just have to wire it and the LiDAR will be ready to spin.<\/p>\n\n<p>For higher supply voltages, you need an external power supply (such as a bench power supply), a transformer\/converter wired to the mains, or a battery.<\/p>\n\n<p>Once you have connected your LiDAR to its power supply, you need to connect the data line. <br \/>It can run through the same USB cable as the power supply, through a separate one, through an Rx\/Tx (UART) cable, or through an Ethernet cable. 
Usually, an adapter is sold with the LiDAR.<\/p>\n\n<h2 class=\"wp-block-heading\">Read data from your LiDAR<\/h2>\n\n<p>Once your LiDAR is ready to be used, you have to check that you have the right permissions on the data input port.<\/p>\n\n<p>Once you have connected the data cable to the USB or Ethernet port, type the following command to check the permissions:<\/p>\n\n<p><em> $ ls -l \/dev\/tty* <\/em><\/p>\n\n<p>You should see a new item labelled <em> ACMX <\/em> or <em> USBX <\/em> , <em> X <\/em> being a number equal to or greater than zero (depending on how many ports are already in use).<\/p>\n\n<p>Your output should be in the form:<\/p>\n\n<p><em> $ crw-rw-XX- 1 root dialout 166, 0 2016-09-12 14:18 \/dev\/ttyACM0 <\/em><\/p>\n\n<p>or<\/p>\n\n<p><em> $ crw-rw-XX- 1 root dialout 166, 0 2016-09-12 14:18 \/dev\/ttyUSB0 <\/em><\/p>\n\n<ul class=\"wp-block-list\">\n<li>If XX is rw, then the laser is configured properly.<\/li>\n\n<li>If XX is --, then the laser is not configured properly and you need to change the permissions as below:<\/li>\n<\/ul>\n\n<p><em> $ sudo chmod a+rw \/dev\/ttyACM0 <\/em><\/p>\n\n<p>or<\/p>\n\n<p><em> $ sudo chmod a+rw \/dev\/ttyUSB0 <\/em><\/p>\n\n<p>Once the permissions are configured, you have to download your LiDAR manufacturer\u2019s package:<\/p>\n\n<ul class=\"wp-block-list\">\n<li><a class=\"catalogue\" title=\"Slamtec GitHub\" href=\"https:\/\/github.com\/Slamtec\/rplidar_ros\" target=\"_blank\" rel=\"noopener noreferrer\"> Slamtec GitHub <\/a><\/li>\n\n<li><a class=\"catalogue\" title=\"Slamtec tutorial\" href=\"http:\/\/wiki.ros.org\/rplidar\" target=\"_blank\" rel=\"noopener noreferrer\"> Slamtec tutorial <\/a><\/li>\n\n<li><a class=\"catalogue\" title=\"YDLiDAR GitHub\" href=\"https:\/\/github.com\/EAIBOT\/ydlidar\" target=\"_blank\" rel=\"noopener noreferrer\"> YDLiDAR GitHub <\/a><\/li>\n\n<li><a class=\"catalogue\" title=\"Hokuyo GitHub\" href=\"https:\/\/github.com\/ros-drivers\/urg_node\" target=\"_blank\" 
rel=\"noopener noreferrer\"> Hokuyo GitHub <\/a><\/li>\n\n<li><a class=\"catalogue\" title=\"Hokuyo tutorial\" href=\"http:\/\/wiki.ros.org\/urg_node\" target=\"_blank\" rel=\"noopener noreferrer\"> Hokuyo tutorial <\/a><\/li>\n\n<li><a class=\"catalogue\" title=\"ROS SICK GitHub\" href=\"https:\/\/github.com\/SICKAG\/sick_scan\" target=\"_blank\" rel=\"noopener noreferrer\"> ROS SICK GitHub <\/a><\/li>\n\n<li><a class=\"catalogue\" title=\"ROS2 SICK GitHub\" href=\"https:\/\/github.com\/SICKAG\/sick_scan2\" target=\"_blank\" rel=\"noopener noreferrer\"> ROS2 SICK GitHub <\/a><\/li>\n\n<li><a class=\"catalogue\" title=\"SICK tutorial\" href=\"http:\/\/wiki.ros.org\/sick_scan\" target=\"_blank\" rel=\"noopener noreferrer\"> SICK tutorial <\/a><\/li>\n\n<li><a class=\"catalogue\" title=\"RoboSense GitHub\" href=\"https:\/\/github.com\/RoboSense-LiDAR\/ros_rslidar\" target=\"_blank\" rel=\"noopener noreferrer\"> RoboSense GitHub <\/a><\/li>\n<\/ul>\n\n<p>To download the LiDAR package from GitHub into the src folder of your <a class=\"catalogue\" title=\"ROS environment\" href=\"http:\/\/wiki.ros.org\/ROS\/Tutorials\/InstallingandConfiguringROSEnvironment\" target=\"_blank\" rel=\"noopener noreferrer\"> ROS environment <\/a> , the commands are:<\/p>\n\n<p><em> $ cd ~\/your_workspace\/src <\/em><\/p>\n\n<p><em> $ git clone <\/em><\/p>\n\n<p>(please refer to the manufacturers\u2019 GitHub links above)<\/p>\n\n<p>Write down the name of the package you have just downloaded.<\/p>\n\n<p><em> $ cd .. <\/em><\/p>\n\n<p><em> $ catkin_make <\/em><\/p>\n\n<p><em> $ source devel\/setup.bash <\/em><\/p>\n\n<p>Go into the launch folder, find the launch file matching your LiDAR version, and launch it:<\/p>\n\n<p><em> $ roslaunch your_package your_launch_file.launch <\/em><\/p>\n\n<p>To check that the LiDAR is publishing to \/scan, use<\/p>\n\n<p><em> $ rostopic list <\/em><\/p>\n\n<p>All active topics will be listed; check that \/scan is present. 
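<\/p>\n\n<p>Once \/scan is present, you can also consume it from a node of your own. The sketch below is plain Python (the helper name <em>closest_range<\/em> is hypothetical, not part of any ROS API), so it runs without ROS; it shows the range filtering a typical \/scan callback performs: readings outside the sensor\u2019s valid interval are dropped and the closest obstacle distance is kept.<\/p>\n

```python
def closest_range(ranges, range_min, range_max):
    """Return the distance to the closest valid return of a laser scan.

    The arguments mirror the fields of a sensor_msgs/LaserScan message;
    readings outside [range_min, range_max] (including inf and NaN,
    which fail both comparisons) are treated as invalid.
    """
    valid = [r for r in ranges if range_min <= r <= range_max]
    return min(valid) if valid else None

# In a ROS node this would be called from the /scan callback, e.g.:
#   rospy.Subscriber("/scan", LaserScan, lambda msg:
#       print(closest_range(msg.ranges, msg.range_min, msg.range_max)))

if __name__ == "__main__":
    # Fake scan: two invalid readings (0.0 and inf) and three valid ones.
    print(closest_range([0.0, 1.2, float("inf"), 0.8, 3.5], 0.1, 10.0))  # -> 0.8
```

\n<p>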
Next, check the messages being published to \/scan by using:<\/p>\n\n<p><em> $ rostopic echo \/scan <\/em><\/p>\n\n<p>You should now be able to stream data from the LiDAR. You can visualize the data in RViz with the command line below. More details about this method here: <a class=\"catalogue\" title=\"Using ROS to read data from a LiDAR\" href=\"http:\/\/www.daslhub.org\/unlv\/wiki\/doku.php?id=using_ros_to_read_data_from_a_hokuyo_scanning_laser_rangefinder\" target=\"_blank\" rel=\"noopener noreferrer\"> using ROS to read data from a LiDAR <\/a> . <strong> If you want to learn more about RViz, read this <a class=\"catalogue\" title=\"RViz: 3D visualization tool for ROS\" href=\"http:\/\/wiki.ros.org\/rviz\" target=\"_blank\" rel=\"noopener noreferrer\"> tutorial <\/a> . <\/strong><\/p>\n\n<p><em> $ rosrun rviz rviz <\/em><\/p>\n\n<p>Click Add, then select the \/scan topic. If you get an error related to tf, you need to manually enter your fixed frame as <em> \u201c\/your_LiDAR_frame_id\u201d <\/em> in the textbox next to Fixed Frame on the left side of the GUI.<\/p>\n\n<p>The result should be a line mapping distances from the LiDAR in a rectangular coordinate system.<\/p>\n<div class=\"wp-block-image\">\n<figure class=\"aligncenter\"><a href=\"https:\/\/blog.generationrobots.com\/wp-content\/uploads\/2019\/09\/lidar-ros-rviz-data.png\"> <img fetchpriority=\"high\" decoding=\"async\" width=\"602\" height=\"439\" class=\"wp-image-6085\" src=\"https:\/\/blog.generationrobots.com\/wp-content\/uploads\/2019\/09\/lidar-ros-rviz-data.png\" alt=\"LiDAR integration with ROS: quickstart guide and project ideas\" srcset=\"https:\/\/www.generationrobots.com\/blog\/wp-content\/uploads\/2019\/09\/lidar-ros-rviz-data.png 602w, https:\/\/www.generationrobots.com\/blog\/wp-content\/uploads\/2019\/09\/lidar-ros-rviz-data-300x219.png 300w\" sizes=\"(max-width: 602px) 100vw, 602px\" \/> <\/a><\/figure><\/div>\n<p>To integrate the LiDAR with another package, you just need 
to include this launch file in your main launch file.<\/p>\n\n<p>Some LiDARs require additional configuration steps. The readme file of your manufacturer\u2019s package should help you with that.<\/p>\n\n<p>In ROS, 2D LiDARs publish their data as <a class=\"catalogue\" title=\"ROS LaserScan\" href=\"http:\/\/docs.ros.org\/melodic\/api\/sensor_msgs\/html\/msg\/LaserScan.html\" target=\"_blank\" rel=\"noopener noreferrer\"> LaserScan <\/a> messages, while 3D LiDARs use the <a class=\"catalogue\" title=\"ROS PointCloud\" href=\"http:\/\/docs.ros.org\/melodic\/api\/sensor_msgs\/html\/msg\/PointCloud.html\" target=\"_blank\" rel=\"noopener noreferrer\"> PointCloud <\/a> message type. A lot of code samples are available online to help you process this data.<\/p>\n\n<h2 class=\"wp-block-heading\">Implementing LiDAR processed data<\/h2>\n\n<h3 class=\"wp-block-heading\">1) Mapping<\/h3>\n\n<h4 class=\"wp-block-heading\">LeGO-LOAM<\/h4>\n\n<p>LeGO-LOAM is specifically optimized for a horizontally placed VLP-16 or <a class=\"catalogue\" title=\"RoboSense 3D lidars\" href=\"\/en\/455-robosense-3d-lidars\"> <strong> Robosense LiDAR <\/strong> <\/a> on a ground vehicle. The algorithm assumes that there is always a ground plane in the scan. <br \/>The UGV we are using is the <a class=\"catalogue\" title=\"Jackal unmanned ground vehicle\" href=\"\/en\/402144-jackal-unmanned-ground-vehicle.html\"> <strong> Jackal UGV <\/strong> <\/a> . It has a built-in IMU.<\/p>\n\n<p>This <a class=\"catalogue\" title=\"Mapping with ROS: LeGO-LOAM\" href=\"https:\/\/github.com\/RobustFieldAutonomyLab\/LeGO-LOAM\" target=\"_blank\" rel=\"noopener noreferrer\"> repository <\/a> contains code for a lightweight and ground-optimized LiDAR odometry and mapping (LeGO-LOAM) system for ROS-compatible UGVs.<\/p>\n\n<p>The system takes in a point cloud from a Velodyne VLP-16 LiDAR (placed horizontally) and optional IMU data as inputs. 
You can use another 3D LiDAR, like the <strong> RS-LiDAR-16 <\/strong> by Robosense, but you will need to change some parameters.<\/p>\n\n<p>It provides a real-time 6D pose estimation.<\/p>\n\n<p>A demonstration of the system can be found <a class=\"catalogue\" title=\"LeGO-LOAM: Lightweight and Ground-Optimized Lidar Odometry and Mapping on Variable Terrain\" href=\"https:\/\/www.youtube.com\/watch?v=O3tz_ftHV48\" target=\"_blank\" rel=\"noopener noreferrer\"> here <\/a> .<\/p>\n\n<p>Requirements:<\/p>\n\n<ul class=\"wp-block-list\">\n<li><a class=\"catalogue\" title=\"ROS versions: Kinetic and Melodic\" href=\"http:\/\/wiki.ros.org\/ROS\/Installation\" target=\"_blank\" rel=\"noopener noreferrer\"> ROS <\/a> (tested with Indigo and Kinetic)<\/li>\n\n<li><a class=\"catalogue\" title=\"GTSAM ROS\" href=\"https:\/\/github.com\/borglab\/gtsam\/releases\" target=\"_blank\" rel=\"noopener noreferrer\"> GTSAM <\/a> (Georgia Tech Smoothing and Mapping library, 4.0.0-alpha2) <br \/>\n<h4>A-LOAM<\/h4>\n<p><a class=\"catalogue\" title=\"ROS A-LOAM\" href=\"https:\/\/github.com\/HKUST-Aerial-Robotics\/A-LOAM\" target=\"_blank\" rel=\"noopener noreferrer\"> A-LOAM <\/a> is an advanced implementation of LOAM.<\/p>\n<p>This code is modified from LOAM and <a class=\"catalogue\" title=\"LOAM_NOTED\" href=\"https:\/\/github.com\/cuitaixiang\/LOAM_NOTED\" target=\"_blank\" rel=\"noopener noreferrer\"> LOAM_NOTED <\/a> . 
This code is clean and simple, without any complicated mathematical derivation or redundant operations.<\/p>\n<p>It is a good learning material for SLAM beginners.<\/p>\n<p>Requirements:<\/p>\n<ul>\n<li><a class=\"catalogue\" title=\"ROS versions: Kinetic and Melodic\" href=\"http:\/\/wiki.ros.org\/ROS\/Installation\" target=\"_blank\" rel=\"noopener noreferrer\"> ROS <\/a> (Kinetic or Melodic)<\/li>\n<li><a class=\"catalogue\" title=\"Ceres Solver\" href=\"http:\/\/ceres-solver.org\/installation.html\" target=\"_blank\" rel=\"noopener noreferrer\"> Ceres Solver <\/a><\/li>\n<li><a class=\"catalogue\" title=\"Prebuilt binaries for Linux\" href=\"http:\/\/www.pointclouds.org\/downloads\/linux.html\" target=\"_blank\" rel=\"noopener noreferrer\"> PCL <\/a><\/li>\n<\/ul>\n<h4>3D LIDAR-based Graph SLAM<\/h4>\n<p><i> <a class=\"catalogue\" title=\"3D LIDAR-based Graph SLAM \" href=\"https:\/\/github.com\/koide3\/hdl_graph_slam\" target=\"_blank\" rel=\"noopener noreferrer\"> hdl_graph_slam <\/a> <\/i> is an open source ROS package for real-time 6DOF SLAM using a 3D LIDAR. 
It is based on 3D Graph SLAM with NDT scan matching-based odometry estimation and loop detection.<\/p>\n<p>It also supports several graph constraints, such as GPS, IMU acceleration (gravity vector), IMU orientation (magnetic sensor), and floor plane (detected in a point cloud).<\/p>\n<p>We have tested this package with both Velodyne (HDL32e, VLP16) and <a class=\"catalogue\" title=\"RoboSense 3D LiDAR\" href=\"\/en\/455-robosense-3d-lidars\"> Robosense <\/a> (16 channels) sensors, in indoor and outdoor environments.<\/p>\n<h4>Spin Hokuyo<\/h4>\n<p>This <a class=\"catalogue\" title=\"Repository spin_hokuyo\" href=\"https:\/\/github.com\/RobustFieldAutonomyLab\/spin_hokuyo\" target=\"_blank\" rel=\"noopener noreferrer\"> repository <\/a> contains code to control a <a class=\"catalogue\" title=\"Dynamixel servo motor\" href=\"\/en\/169-dynamixel-servomotors\"> Dynamixel servomotor <\/a> and a 2D <a class=\"catalogue\" title=\"G\u00e9n\u00e9ration Robots - Hokuyo Laser range-finders\" href=\"\/en\/262-hokuyo-lidar\"> Hokuyo LiDAR <\/a> to create a 3D point cloud that can be visualized in RViz.<\/p>\n<p>This point cloud can then be used to create an octomap.<\/p>\n<p>Wiki Page: <a class=\"catalogue\" title=\"spin_hokuyo wiki page\" href=\"http:\/\/wiki.ros.org\/spin_hokuyo\" target=\"_blank\" rel=\"noopener noreferrer\"> http:\/\/wiki.ros.org\/spin_hokuyo <\/a><\/p>\n<p>Here is another 3D scanning project that involves a <a class=\"catalogue\" title=\"Hokuyo UTM-30LX Scanning Laser Range Finder\" href=\"\/en\/401433-hokuyo-utm-30lx-laser-range-finder.html\"> Hokuyo UTM-30LX LiDAR <\/a> : <a class=\"catalogue\" title=\"3D photobooth with a LiDAR Hokuyo\" href=\"https:\/\/github.com\/gcc-robotics\/3d_photobooth\/blob\/master\/CapstoneFinalReport_VisionTeam.pdf\" target=\"_blank\" rel=\"noopener noreferrer\"> 3D photobooth <\/a> .<\/p>\n<h3>2) Calibration<\/h3>\n<h4>ROS package to calibrate a camera and a LiDAR<\/h4>\n<p>This <a class=\"catalogue\" title=\"LiDAR camera 
calibration\" href=\"https:\/\/github.com\/ankitdhall\/lidar_camera_calibration\" target=\"_blank\" rel=\"noopener noreferrer\"> ROS package <\/a> is used to calibrate a Velodyne LiDAR with a camera (works for both monocular and stereo). Specifically, the Point Grey Blackfly and ZED cameras have been successfully calibrated against a Velodyne VLP-16 using lidar_camera_calibration.<\/p>\n<div align=\"center\"><iframe src=\"https:\/\/www.youtube.com\/embed\/Om1SFPAZ5Lc\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\n     <\/iframe><\/div>\n<p>The package finds a rotation and translation that transform all the points in the LiDAR frame to the (monocular) camera frame.<\/p>\n<p>The package uses <i> aruco_ros <\/i> and a slightly modified <em> aruco_mapping <\/em> as dependencies, both of which are available in the dependencies folder in this repository.<\/p>\n<p>A paper was written about this project; you can read it <a class=\"catalogue\" title=\"LiDAR-Camera Calibration using 3D-3D Point correspondences\" href=\"https:\/\/arxiv.org\/pdf\/1705.09785.pdf\" target=\"_blank\" rel=\"noopener noreferrer\"> here <\/a> .<\/p>\n<h3>3) Tracking<\/h3>\n<h4>Multiple object tracking with LiDAR<\/h4>\n<p>This <a class=\"catalogue\" title=\"Multiple objects tracking with a LiDAR\" href=\"https:\/\/github.com\/praveen-palanisamy\/multiple-object-tracking-lidar\" target=\"_blank\" rel=\"noopener noreferrer\"> PCL-based ROS package <\/a> detects\/clusters, tracks and classifies static and dynamic objects in real time from LiDAR scans, 
implemented in C++.<\/p>\n<p>Features:<\/p>\n<ul>\n<li>K-D tree-based point cloud processing for object feature detection from point clouds<\/li>\n<li>Unsupervised k-means clustering based on detected features and refinement using RANSAC<\/li>\n<li>Stable tracking (object ID &amp; data association) with an ensemble of Kalman filters<\/li>\n<li>Robust compared to k-means clustering with mean-flow tracking<\/li>\n<\/ul>\n<h3>4) Fusion<\/h3>\n<h4>Extended Kalman Filter Project Starter Code<\/h4>\n<p>Self-Driving Car Engineer Nanodegree Program. In this <a class=\"catalogue\" title=\"Extended Kalman Filter\" href=\"https:\/\/github.com\/KathanSheth\/Extended-Kalman-Filter\" target=\"_blank\" rel=\"noopener noreferrer\"> project <\/a> , you will use a Kalman filter to estimate the state of a moving object of interest with noisy LiDAR and radar measurements.<\/p>\n<p>Passing the project requires obtaining RMSE values that are lower than the tolerance outlined in the project rubric.<\/p>\n<p>This project involves the Term 2 Simulator, which can be downloaded <a class=\"catalogue\" title=\"Term 2 simulator\" href=\"https:\/\/github.com\/udacity\/self-driving-car-sim\/releases\" target=\"_blank\" rel=\"noopener noreferrer\"> here <\/a> .<\/p>\n<p>This repository includes two files that can be used to set up and install <a class=\"catalogue\" title=\"uWebSockets\" href=\"https:\/\/github.com\/uWebSockets\/uWebSockets\" target=\"_blank\" rel=\"noopener noreferrer\"> uWebSocketIO <\/a> for either Linux or Mac systems. 
For Windows you can use Docker, VMware, or even <a class=\"catalogue\" title=\"How to Install and Use the Linux Bash Shell on Windows 10\" href=\"https:\/\/www.howtogeek.com\/249966\/how-to-install-and-use-the-linux-bash-shell-on-windows-10\/\" target=\"_blank\" rel=\"noopener noreferrer\"> Windows 10 Bash on Ubuntu <\/a> to install uWebSocketIO.<\/p>\n<ul>\n<li><a class=\"catalogue\" title=\"Python libraries\" href=\"https:\/\/github.com\/mithi\/fusion-ekf-python\" target=\"_blank\" rel=\"noopener noreferrer\"> Python libraries <\/a><\/li>\n<li><a class=\"catalogue\" title=\"C++ libraries\" href=\"https:\/\/github.com\/mithi\/fusion-ekf\" target=\"_blank\" rel=\"noopener noreferrer\"> C++ libraries <\/a><\/li>\n<\/ul>\n<p>\u00a0<\/p>\n<\/li>\n<\/ul>\n\n<p>This is an extended Kalman filter implementation for fusing LiDAR and radar sensor measurements. A Kalman filter can be used when the information about a dynamic system is uncertain: it makes educated guesses about what the system is going to do next.<\/p>\n\n<p><b> In this case, we have two &lsquo;noisy&rsquo; sensors: <\/b><\/p>\n\n<ul class=\"wp-block-list\">\n<li>A LiDAR sensor that measures our position in Cartesian coordinates (x, y)<\/li>\n\n<li>A radar sensor that measures our position and velocity in polar coordinates (rho, phi, drho)<\/li>\n<\/ul>\n\n<p><b> We want to predict our position, and how fast we are going in what direction, at any point in time: <\/b><\/p>\n\n<ul class=\"wp-block-list\">\n<li>In essence: the position and velocity of the system in Cartesian coordinates: (x, y, vx, vy)<\/li>\n\n<li>Note that we are assuming a constant velocity (CV) model for this particular system <br \/>\n<h4>UKF Fusion<\/h4>\n<p>Compared to the <a class=\"catalogue\" title=\"Fusion EKF\" href=\"https:\/\/github.com\/mithi\/Fusion-EKF-CPP\" target=\"_blank\" rel=\"noopener noreferrer\"> Extended Kalman Filter <\/a> with a constant velocity model, RMSE should be lower for the <a 
class=\"catalogue\" title=\"An unscented Kalman Filter implementation in C++ for fusing lidar and radar sensor measurements\" href=\"https:\/\/github.com\/mithi\/fusion-ukf\" target=\"_blank\" rel=\"noopener noreferrer\"> unscented Kalman filter <\/a> , especially for velocity.<\/p>\n<p>The CTRV (constant turn rate and velocity) model is more precise than a constant velocity model, and the UKF is also known for handling non-linear equations better than the EKF.<\/p>\n<p>Using a LiDAR alone is sometimes not enough for some applications. Indeed, other data must be collected to ensure accuracy. As the sensor is moving (if it&rsquo;s part of a <a class=\"catalogue\" title=\"Outdoor mobile robots\" href=\"\/en\/352-outdoor-mobile-robots\"> mobile robot <\/a> ), the location and orientation of the device must be included to determine the position of the laser pulse at the time of sending and the time of return.<\/p>\n<p>This extra information is crucial to the data&rsquo;s integrity. Common additional sensors are:<\/p>\n<ul>\n<li>A camera<\/li>\n<li>A GPS chip<\/li>\n<li>An <a class=\"catalogue\" title=\"Gyroscopes and IMU\" href=\"\/en\/333-gyroscopes-and-imu\"> IMU (Inertial Measurement Unit) <\/a><\/li>\n<li>ToF sensors<\/li>\n<\/ul>\n<p>\u00a0<\/p>\n<\/li>\n<\/ul>\n\n<p>To get accurate values, you need to merge those data and keep the most trusted ones depending on the environment and the situation. This step is called data fusion. Sometimes the sensors give the same kind of data but with different values, due to their accuracy and errors.<\/p>\n\n<p>It is very important to estimate the real value from the sensors\u2019 different readings. <br \/>Data fusion is a crucial step: merging all those data enables the robot to locate itself in its known environment. 
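<\/p>\n\n<p>As a toy illustration of the fusion idea above (not the actual code of the projects linked earlier), here is a minimal <em>linear<\/em> Kalman filter for the constant velocity state (x, y, vx, vy) with position-only \u201cLiDAR\u201d measurements. The noise levels and simulated trajectory are made up for the example; with a radar you would additionally need the non-linear polar measurement model, which is exactly what motivates the EKF\/UKF.<\/p>\n

```python
import numpy as np

def kf_predict(x, P, F, Q):
    """Predict step: propagate state and covariance through the motion model."""
    return F @ x, F @ P @ F.T + Q

def kf_update(x, P, z, H, R):
    """Update step: correct the prediction with a measurement z."""
    y = z - H @ x                      # innovation
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

dt = 0.1
# State is (x, y, vx, vy); constant velocity (CV) motion model.
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)
H = np.array([[1, 0, 0, 0],            # LiDAR measures position (x, y) only
              [0, 1, 0, 0]], dtype=float)
Q = 0.01 * np.eye(4)                   # illustrative process noise
R = 0.05 * np.eye(2)                   # illustrative LiDAR measurement noise

x, P = np.zeros(4), np.eye(4)
rng = np.random.default_rng(0)
for k in range(1, 101):
    true_pos = np.array([1.0 * k * dt, 0.5 * k * dt])  # target moving at (1.0, 0.5) m/s
    z = true_pos + rng.normal(0.0, 0.05, 2)            # noisy position measurement
    x, P = kf_predict(x, P, F, Q)
    x, P = kf_update(x, P, z, H, R)

print(x[2:])  # estimated velocity, close to (1.0, 0.5) despite noisy positions
```

\n<p>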
The higher the sensor accuracy and the navigation software accuracy, the more accurate the estimated position will be.<\/p>\n\n<h4 class=\"wp-block-heading\">SLAM implementation in mobile robotics<\/h4>\n\n<p>The most famous SLAM algorithms are:<\/p>\n\n<ul class=\"wp-block-list\">\n<li><a class=\"catalogue\" title=\"EKF SLAM\" href=\"https:\/\/en.wikipedia.org\/wiki\/EKF_SLAM\" target=\"_blank\" rel=\"noopener noreferrer\"> EKF SLAM <\/a><\/li>\n\n<li><a class=\"catalogue\" title=\"FastSLAM\" href=\"https:\/\/en.wikipedia.org\/wiki\/FastSLAM\" target=\"_blank\" rel=\"noopener noreferrer\"> FastSLAM 2.0 <\/a><\/li>\n<\/ul>\n\n<p>The following diagram shows the architecture of the ROS navigation stack, which implements a SLAM algorithm:<\/p>\n<div class=\"wp-block-image\">\n<figure class=\"aligncenter\"><a href=\"https:\/\/blog.generationrobots.com\/wp-content\/uploads\/2019\/09\/ros-slam-navigation-stack-setup.jpg\"> <img decoding=\"async\" width=\"602\" height=\"247\" class=\"wp-image-6073\" src=\"https:\/\/blog.generationrobots.com\/wp-content\/uploads\/2019\/09\/ros-slam-navigation-stack-setup.jpg\" alt=\"Architecture of the navigation stack of ROS\" srcset=\"https:\/\/www.generationrobots.com\/blog\/wp-content\/uploads\/2019\/09\/ros-slam-navigation-stack-setup.jpg 602w, https:\/\/www.generationrobots.com\/blog\/wp-content\/uploads\/2019\/09\/ros-slam-navigation-stack-setup-300x123.jpg 300w\" sizes=\"(max-width: 602px) 100vw, 602px\" \/> <\/a><\/figure><\/div>\n<p>Sensors provide the input values: sensor sources and an odometry source. Based on those two data sources, the algorithm calculates the sensor transformation, estimates the pose (amcl) and maps the environment (map_server). 
Thanks to all this data, the path planner can make decisions.<\/p>\n\n<p>Sensors are essential, and the LiDAR is the best sensor to get LaserScan or PointCloud data.<\/p>\n\n<p>You can read an in-depth publication about SLAM algorithms in this <a class=\"catalogue\" title=\"Introduction to navigation with ROS\" href=\"https:\/\/www.dis.uniroma1.it\/~nardi\/Didattica\/CAI\/matdid\/robot-programming-ROS-introduction-to-navigation.pdf\" target=\"_blank\" rel=\"noopener noreferrer\"> introduction to navigation with ROS <\/a> .<\/p>\n\n<p class=\"has-text-align-center\"><span style=\"color: #000000;\"> <strong> Read our other blog posts from our \u201cLiDAR technology\u201d series <\/strong> <\/span><\/p>\n\n<figure class=\"wp-block-table\">\n<table>\n<tbody>\n<tr>\n<td><a title=\"Which applications for a LiDAR?\" href=\"\/blog\/en\/which-applications-for-a-lidar\/\"> <img decoding=\"async\" src=\"https:\/\/static.generation-robots.com\/img\/cms\/bouton-CTA-articles-blog-lidar-3-EN.jpg\" alt=\"Which applications for a LiDAR?\" width=\"270\" height=\"110\" \/> <\/a><\/td>\n<td>\u00a0<\/td>\n<td>\u00a0<\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<\/figure>\n\n<p>Do not hesitate to browse our <a class=\"catalogue\" title=\"G\u00e9n\u00e9ration Robots - Laser range-finders\" href=\"\/en\/206-lidar-sensors-for-robotics-and-automation\"> LiDAR selection <\/a> or to <a class=\"catalogue\" title=\"Contact G\u00e9n\u00e9ration Robots\" href=\"\/en\/contact-us\"> contact us <\/a> if you need additional information or a quotation.<\/p>\n\n<p class=\"has-text-align-center\"><strong> LiDAR distributed by G\u00e9n\u00e9ration Robots <\/strong><\/p>\n\n<figure class=\"wp-block-table\">\n<table>\n<tbody>\n<tr>\n<td class=\"has-text-align-center\" data-align=\"center\"><a class=\"catalogue\" title=\"Ouster Lidar for mobile robots\" href=\"\/en\/536-lidar-ouster\"> <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/static.generation-robots.com\/img\/ouster-lidar-robots-mobiles.jpg\" 
alt=\"Ouster Lidar for mobile robots\" width=\"170\" height=\"96\" \/> <\/a><\/td>\n<td class=\"has-text-align-center\" data-align=\"center\"><a class=\"catalogue\" title=\"Slamtec LiDAR on the G\u00e9n\u00e9ration Robots website\" href=\"\/en\/198_slamtec\"> <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/static.generation-robots.com\/img\/cms\/slamtec-lidar-logo.jpg\" alt=\"LiDARs Slamtec\" width=\"170\" height=\"96\" \/> <\/a><\/td>\n<td class=\"has-text-align-center\" data-align=\"center\"><a class=\"catalogue\" title=\"YDlidar LiDAR on the G\u00e9n\u00e9ration Robots website\" href=\"\/en\/222_ydlidar\"> <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/static.generation-robots.com\/img\/cms\/YDlidar-lidar-logo.jpg\" alt=\"LiDARs YDLidar\" width=\"170\" height=\"96\" \/> <\/a><\/td>\n<\/tr>\n<tr>\n<td class=\"has-text-align-center\" data-align=\"center\"><a class=\"catalogue\" title=\"RoboSense 3D lidars\" href=\"\/en\/455-robosense-3d-lidars\"> <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/static.generation-robots.com\/img\/cms\/robosense-lidar-logo.jpg\" alt=\"LiDARs Robosense\" width=\"170\" height=\"96\" \/> <\/a><\/td>\n<td class=\"has-text-align-center\" data-align=\"center\"><a class=\"catalogue\" title=\"G\u00e9n\u00e9ration Robots - Hokuyo Laser range-finders\" href=\"\/en\/262-hokuyo-lidar\"> <img loading=\"lazy\" decoding=\"async\" class=\"aligncenter\" src=\"https:\/\/static.generation-robots.com\/img\/cms\/hokuyo-lidar-logo.jpg\" alt=\"LiDARs Hokuyo\" width=\"170\" height=\"96\" \/> <\/a><\/td>\n<td class=\"has-text-align-center\" data-align=\"center\"><a class=\"catalogue\" title=\"Sick LMS111-10100 laser scanner, outdoor version, medium range\" href=\"\/en\/263-sick-lidar\"> <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/static.generation-robots.com\/img\/cms\/sick-lidar-logo.jpg\" alt=\"LiDARs SICK\" width=\"170\" height=\"96\" \/> 
<\/a><\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<\/figure>\n\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/section>\n\t\t\t\t<\/div>\n\t\t","protected":false},"excerpt":{"rendered":"<p>\u00a0 Read our other blog posts from our \u201cLiDAR technology\u201d series \u00a0 \u00a0 LiDAR integration with ROS: quickstart guide and projects ideas In this post, you will learn how to connect and integrate your LiDAR with your PC or embedded system using ROS middleware on Ubuntu. 
We will also talk about data fusion (widely used[&#8230;]<br \/> <a class=\"button\" href=\"https:\/\/www.generationrobots.com\/blog\/en\/lidar-integration-with-ros-quickstart-guide-and-projects-ideas\/\" style=\"float:right;\">Read this article &gt;&gt;<\/a><\/p>\n","protected":false},"author":188,"featured_media":6079,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[10545,10535],"tags":[],"class_list":["post-6080","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-comparisons-and-tests-products","category-guides-and-tutorials"],"_links":{"self":[{"href":"https:\/\/www.generationrobots.com\/blog\/wp-json\/wp\/v2\/posts\/6080","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.generationrobots.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.generationrobots.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.generationrobots.com\/blog\/wp-json\/wp\/v2\/users\/188"}],"replies":[{"embeddable":true,"href":"https:\/\/www.generationrobots.com\/blog\/wp-json\/wp\/v2\/comments?post=6080"}],"version-history":[{"count":17,"href":"https:\/\/www.generationrobots.com\/blog\/wp-json\/wp\/v2\/posts\/6080\/revisions"}],"predecessor-version":[{"id":14962,"href":"https:\/\/www.generationrobots.com\/blog\/wp-json\/wp\/v2\/posts\/6080\/revisions\/14962"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.generationrobots.com\/blog\/wp-json\/wp\/v2\/media\/6079"}],"wp:attachment":[{"href":"https:\/\/www.generationrobots.com\/blog\/wp-json\/wp\/v2\/media?parent=6080"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.generationrobots.com\/blog\/wp-json\/wp\/v2\/categories?post=6080"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.generationrobots.com\/blog\/wp-json\/wp\/v2\/tags?post=6080"}],"curies":[{"name":"wp","href":"http
s:\/\/api.w.org\/{rel}","templated":true}]}}