Channel: ROS Answers: Open Source Q&A Forum - RSS feed

Use ira_laser_tools to fuse the RPLIDAR A2 and Kinect laser scans, but the output of the merger is only the Kinect's data

I have an RPLIDAR A2 and a Kinect camera. I use `ira_laser_tools` to fuse the laser scans from the RPLIDAR and the Kinect, and then use the merged data to build a 2D grid map with gmapping. But when I use `rviz` to look at the topic `/scan_multi`, which is the output of `laserscan_multi_merger`, it contains only one laser scan.

I use `roslaunch kinect_rpldiar_gmapping_demo.launch` to start the RPLIDAR node and the Kinect node, and then `roslaunch ira_laser_merge.launch`. The node `laserscan_multi_merger` reports:

```
[pcl::concatenatePointCloud] Number of fields in cloud1 (4) != Number of fields in cloud2 (5)
```

Here cloud1 is the Kinect's fake laser scan and cloud2 is the RPLIDAR's laser scan. I then looked at the part of the code that the `pcl::concatenatePointCloud` error comes from (the full code is at https://docs.ros.org/api/pcl_conversions/html/pcl__conversions_8h_source.html):

```cpp
if (!strip && cloud1.fields.size () != cloud2.fields.size ())
{
  PCL_ERROR ("[pcl::concatenatePointCloud] Number of fields in cloud1 (%u) != Number of fields in cloud2 (%u)\n",
             cloud1.fields.size (), cloud2.fields.size ());
  return (false);
}
```

So, what is `cloud.fields`? Does it mean the kind of point cloud? I want to understand it, because when I look at the `/scan_multi` laser scan in rviz there is only the data from `/scan_kinect`. Is this related to the error above?
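For reference, `cloud.fields` in a `sensor_msgs/PointCloud2` is the list of per-point channels, each described by a `sensor_msgs/PointField` with a name, offset, datatype and count; the error above says one cloud carries one more channel than the other. A minimal sketch for printing that layout yourself, with the topic name `/cloud_in` used only as a placeholder:

```python
#!/usr/bin/env python
# Minimal sketch: print the field layout of a PointCloud2 topic so the
# channels of two clouds (e.g. x, y, z, index, intensity) can be compared.
# "/cloud_in" is an example name -- substitute the cloud topic on your system.
import rospy
from sensor_msgs.msg import PointCloud2

def callback(msg):
    # Each PointField describes one per-point channel of the cloud.
    layout = [(f.name, f.offset, f.datatype, f.count) for f in msg.fields]
    rospy.loginfo("fields of cloud in frame %s: %s", msg.header.frame_id, layout)

if __name__ == "__main__":
    rospy.init_node("print_cloud_fields")
    rospy.Subscriber("/cloud_in", PointCloud2, callback)
    rospy.spin()
```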

message_filters for an array of topics

Hi there! I am wondering if anyone has ever tried to create a vectorised version of the example provided here: [message_filters](http://wiki.ros.org/message_filters) (7.3 ApproximateTime Policy). In short, what I am trying to do is synchronise a list of topics defined in a launch file. In my current implementation I have a vector of topics (in my case of type sensor_msgs::LaserScan) and a subscriber call inside a for loop, but this seems suboptimal and it also does not let me ensure synchronisation between the messages coming from the topics. Thanks in advance!
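A minimal sketch of one way to do this, using the Python message_filters API (whose `ApproximateTimeSynchronizer` takes a list of subscribers, so the number of topics does not have to be fixed at compile time); the parameter name `~scan_topics` and the default topic names are placeholders:

```python
#!/usr/bin/env python
# Sketch: synchronise an arbitrary list of LaserScan topics with the
# ApproximateTime policy.  The "~scan_topics" parameter is only an example
# of how the list could be passed in from a launch file.
import rospy
import message_filters
from sensor_msgs.msg import LaserScan

def synced_callback(*scans):
    # One LaserScan per subscribed topic, approximately time-aligned.
    rospy.loginfo("got %d synchronised scans", len(scans))

if __name__ == "__main__":
    rospy.init_node("scan_synchroniser")
    topics = rospy.get_param("~scan_topics", ["/scan_front", "/scan_rear"])
    subs = [message_filters.Subscriber(t, LaserScan) for t in topics]
    sync = message_filters.ApproximateTimeSynchronizer(subs, queue_size=10, slop=0.1)
    sync.registerCallback(synced_callback)
    rospy.spin()
```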

odom dropped 100% of messages

Hello! I am attempting to run slam_gmapping with namespaces with my TurtleBot3 on Ubuntu 16.04. I have read through other questions and found that my tf tree was not fully connected. I got around this by publishing a static transform from base_link to bot1_tf/base_link, but I am still getting a warning in the console that target=odom has dropped 100% of messages, and I am not getting any sort of map in rviz. I have also tried editing the topic names in the gmapping.rviz file, in the associated .lua configuration file, and in the gmapping.launch file itself, to no avail. I do not have any display coming through on the map from LaserScan, and `roswtf` revealed the following information about bot1_tf/scan being disconnected, which I believe to be the source of the problem:

```
Static checks summary:

No errors or warnings
================================================================================
Beginning tests of your ROS graph. These may take awhile...
analyzing graph...
... done analyzing graph
running graph rules...
... done running graph rules
running tf checks, this will take a second...
... tf checks complete

Online checks summary:

Found 1 warning(s).  Warnings are things that may be just fine, but are sometimes at fault

WARNING The following node subscriptions are unconnected:
 * /bot1/turtlebot3_core:
   * /bot1/reset
   * /bot1/motor_power
   * /bot1/sound
   * /bot1/cmd_vel
 * /rviz:
   * /bot1/map_updates
   * /bot1/move_base/DWAPlannerROS/global_plan
   * /bot1/sonar_f
   * /bot1/move_base/NavfnROS/plan
   * /bot1_tf/scan
   * /bot1/move_base/global_costmap/costmap_updates
   * /bot1/sonar_b
   * /bot1/move_base/global_costmap/costmap
```

Here's a screenshot of my tf tree:

![tf tree](/upfiles/15325369549327418.png)

Here's my .launch file that takes care of the turtlebot ssh connection and starts the initial nodes, and my SLAM launch file:
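A minimal sketch for checking whether the transform chain that gmapping needs actually resolves at runtime, assuming hypothetical frame names `bot1_tf/odom` and `bot1_tf/base_link` (substitute whatever the tf tree above actually shows):

```python
#!/usr/bin/env python
# Sketch: verify that the odom -> base_link chain resolves in tf2.
# The frame names below are examples -- adjust them to match the tf tree.
import rospy
import tf2_ros

if __name__ == "__main__":
    rospy.init_node("check_odom_chain")
    buf = tf2_ros.Buffer()
    listener = tf2_ros.TransformListener(buf)
    rospy.sleep(2.0)  # give the listener time to fill its buffer
    try:
        t = buf.lookup_transform("bot1_tf/odom", "bot1_tf/base_link",
                                 rospy.Time(0), rospy.Duration(5.0))
        rospy.loginfo("transform OK: %s", t.transform.translation)
    except (tf2_ros.LookupException, tf2_ros.ConnectivityException,
            tf2_ros.ExtrapolationException) as e:
        rospy.logwarn("transform not available: %s", e)
```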

What does the baud rate on a SICK LMS do, and how does it affect the frequency of the messages published on the topic?

On a SICK LMS range finder, what does the baud rate do, and how does it affect the sampling rate and the frequency of messages published on the /scan topic?

Let the detection angle be θ and the angular resolution be α. Then:

    angle_min = −θ/2
    angle_max = θ/2
    angle_increment = α   (the angular resolution between two beams)

If the laser scan has a frequency of β scans per second, then in 1/β seconds one scan gives g = (θ/α) + 1 beams. The time increment can be calculated as:

    time_increment = (1/β) / g

Now, how does the baud rate fit into the above calculations?
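A small worked example of the quantities defined above, using assumed numbers (180° field of view, 0.5° resolution, 75 scans/s, 2 bytes per range reading, roughly 10 bits per byte on the serial wire); none of these values are taken from a specific LMS datasheet, they only illustrate how scan rate and serial throughput can be compared:

```python
from __future__ import print_function
import math

# Assumed, illustrative numbers (not from a datasheet).
theta = math.radians(180.0)   # detection angle
alpha = math.radians(0.5)     # angular resolution
beta  = 75.0                  # scans per second

angle_min = -theta / 2.0
angle_max =  theta / 2.0
beams_per_scan = int(round(theta / alpha)) + 1      # g = theta/alpha + 1
time_increment = (1.0 / beta) / beams_per_scan

# Rough serial-throughput check: 2 bytes per range reading and ~10 bits
# per byte (8N1 framing) are assumptions for illustration only.
bytes_per_second_needed = beams_per_scan * 2 * beta
baud = 500000
bytes_per_second_available = baud / 10.0

print("beams per scan:", beams_per_scan, "time_increment:", time_increment)
print("needed B/s:", bytes_per_second_needed, "available B/s:", bytes_per_second_available)
```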

Why can't I launch a node of type [laser_geometry/laser_geometry]?

I need to convert from `sensor_msgs::LaserScan` to `sensor_msgs::PointCloud2`. I found the laser_geometry package (http://wiki.ros.org/laser_geometry). Because there is no ROS API example on the wiki, I followed this link, which helps: https://answers.ros.org/question/11232/how-to-turn-laser-scan-to-point-cloud-map/. I compiled a .cpp file matching the one in the link:

```cpp
#include "ros/ros.h"
#include "laser_geometry/laser_geometry.h"
#include "sensor_msgs/LaserScan.h"
#include "sensor_msgs/PointCloud2.h"
#include "tf/transform_listener.h"

class My_Filter {
public:
    My_Filter();
    void scanCallback(const sensor_msgs::LaserScan::ConstPtr& scan);
private:
    ros::NodeHandle node_;
    laser_geometry::LaserProjection projector_;
    tf::TransformListener tfListener_;
    ros::Publisher point_cloud_publisher_;
    ros::Subscriber scan_sub_;
};

My_Filter::My_Filter() {
    scan_sub_ = node_.subscribe("/scan", 100, &My_Filter::scanCallback, this);
    // advertise the converted PointCloud2 output
    point_cloud_publisher_ = node_.advertise<sensor_msgs::PointCloud2>("filtered_cloud", 100, false);
    tfListener_.setExtrapolationLimit(ros::Duration(0.1));
}

void My_Filter::scanCallback(const sensor_msgs::LaserScan::ConstPtr& scan) {
    sensor_msgs::PointCloud2 cloud;
    projector_.transformLaserScanToPointCloud("base_link", *scan, cloud, tfListener_);
    point_cloud_publisher_.publish(cloud);
}

int main(int argc, char** argv) {
    ros::init(argc, argv, "my_filter");
    My_Filter filter;
    ros::spin();
    return 0;
}
```

In my CMakeLists.txt I added `target_link_libraries(laser_geometry ${Boost_LIBRARIES} ${tf_LIBRARIES})`, declared the dependency in package.xml, and wrote a launch file that refers to `laser_geometry`. But I can't launch this node:

```
ERROR: cannot launch node of type [laser_geometry/laser_geometry]: can't locate node [laser_geometry] in package [laser_geometry]
```

Could you please give me some hints? How should I use the laser_geometry package and launch the node? Thank you!
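For context, the `type` attribute in a launch file must name an executable that the referenced package actually builds, and laser_geometry is a library-only package that provides no node of its own. A hypothetical launch sketch, assuming the code above is built as an executable named `my_filter` inside a package named `my_laser_pkg` (both names are placeholders):

```xml
<launch>
  <!-- "my_laser_pkg" and "my_filter" are placeholders: pkg must be the
       package that contains the code above, and type must match the
       executable name given to add_executable() in that package's
       CMakeLists.txt. -->
  <node pkg="my_laser_pkg" type="my_filter" name="my_filter" output="screen"/>
</launch>
```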

X Y and Z coordinates in LaserScan

How can I get the three coordinates (x, y, z) of the points from a LaserScan?
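A minimal sketch of the usual conversion: the return at index i lies at angle `angle_min + i * angle_increment` in the scan's own frame, so x = r * cos(angle), y = r * sin(angle), and z = 0 for a single-plane scanner. The `/scan` topic name is an example:

```python
#!/usr/bin/env python
# Sketch: turn each LaserScan range into an (x, y, z) point in the scan's
# own frame.  z is 0 because a single-plane scanner only measures in 2D.
import math
import rospy
from sensor_msgs.msg import LaserScan

def scan_to_points(scan):
    points = []
    for i, r in enumerate(scan.ranges):
        if not (scan.range_min <= r <= scan.range_max):
            continue  # skip invalid / out-of-range returns
        angle = scan.angle_min + i * scan.angle_increment
        points.append((r * math.cos(angle), r * math.sin(angle), 0.0))
    return points

def callback(scan):
    pts = scan_to_points(scan)
    rospy.loginfo("converted %d valid returns", len(pts))

if __name__ == "__main__":
    rospy.init_node("scan_to_xyz")
    rospy.Subscriber("/scan", LaserScan, callback)  # example topic name
    rospy.spin()
```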

I'm trying to build a package using catkin build; I even get a notification that the build succeeded, but I get a "no package found" error when trying to find the package. What should I do now?

```
[build] No packages were found in the source space '/home/atharva/catkin_ws/src/laserscan_kinect/src'
[build] No packages to be built.
[build] Package table is up to date.
[build] Summary: All 0 packages succeeded!
[build] Ignored: None.
[build] Warnings: None.
[build] Abandoned: None.
[build] Failed: None.
[build] Runtime: 0.0 seconds total.
atharva@atharva-Inspiron-7537:~/catkin_ws/src/laserscan_kinect$ catkin build laserscan_kinect
```

Cannot find python library/module to convert ROS LaserScan data into ROS OccupancyGrid in code

Hi, I am developing a ROS SLAM-based robot using an RPi, Ubuntu MATE and ROS Kinetic. I am programming in Python 2.7 as I wish to use both ROS (rospy) and the native Pi GPIO functionality. I am using a YDLIDAR X4 to gather laser scan data. My Python code (a laser scan publisher node) collects data from the lidar and publishes a LaserScan message. I am trying to program a subscriber node that subscribes to the LaserScan topic and performs the mathematical operations (in Python code, not on the command line) to create a 2D occupancy grid. This node should then publish the occupancy grid as a message that can be used to view the map in rviz. I am able to publish and subscribe to the LaserScan topic. I am also able to generate a simulated occupancy grid and publish it to an occupancy grid topic viewable in rviz. I am, however, unable to find a Python-friendly ROS module that will calculate an occupancy grid from laser scan data. Can you please suggest any Python module that will accept laser scan data and generate an occupancy grid?
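To illustrate what such a conversion involves, here is a naive single-scan sketch in plain rospy that marks the hit cells of a `nav_msgs/OccupancyGrid`. The grid size, resolution, topic names and the simple "occupied = 100" marking are arbitrary illustrative choices; a real mapper would also ray-trace free space and fuse many scans over time:

```python
#!/usr/bin/env python
# Naive sketch: mark the cells hit by one LaserScan as occupied in a
# nav_msgs/OccupancyGrid.  All sizes and names below are example choices.
import math
import rospy
from sensor_msgs.msg import LaserScan
from nav_msgs.msg import OccupancyGrid

RESOLUTION = 0.05      # metres per cell (assumed)
SIZE = 200             # cells per side -> 10 m x 10 m grid (assumed)

def scan_callback(scan):
    grid = OccupancyGrid()
    grid.header.stamp = rospy.Time.now()
    grid.header.frame_id = scan.header.frame_id   # grid expressed in the scan frame
    grid.info.resolution = RESOLUTION
    grid.info.width = SIZE
    grid.info.height = SIZE
    grid.info.origin.position.x = -SIZE * RESOLUTION / 2.0  # centre the grid on the sensor
    grid.info.origin.position.y = -SIZE * RESOLUTION / 2.0
    grid.info.origin.orientation.w = 1.0
    data = [-1] * (SIZE * SIZE)                   # -1 = unknown

    for i, r in enumerate(scan.ranges):
        if not (scan.range_min <= r <= scan.range_max):
            continue
        angle = scan.angle_min + i * scan.angle_increment
        cx = int((r * math.cos(angle) - grid.info.origin.position.x) / RESOLUTION)
        cy = int((r * math.sin(angle) - grid.info.origin.position.y) / RESOLUTION)
        if 0 <= cx < SIZE and 0 <= cy < SIZE:
            data[cy * SIZE + cx] = 100            # 100 = occupied
    grid.data = data
    pub.publish(grid)

if __name__ == "__main__":
    rospy.init_node("scan_to_grid_sketch")
    pub = rospy.Publisher("/map_sketch", OccupancyGrid, queue_size=1)  # example topic
    rospy.Subscriber("/scan", LaserScan, scan_callback)
    rospy.spin()
```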

laserscan not showing up correctly in rviz

I am running ROS on two Raspberry Pis. One Pi runs the master, map_server, gmapping or amcl, and my motor control and sensor node; on the other I am running rviz. In my motor control and sensor node I am turning sonar readings into laser scans by adapting a sonar-to-laserscan package. I added the LaserScan display in rviz and I see the scan dots, but they are not oriented correctly. I can see the values change as I put an obstacle in different positions. I have placed the obstacles in such a way that I can tell where the laser scan dots should be, but they are not there. Hopefully the image will attach. What should be the direction in front of the robot is out the back, and not directly opposite from where it should be (1). The one that should be on the robot's left (2) should be 90 degrees left but it is more like 120 degrees left, and the one on the right should be 90 degrees right but it is more like 160 degrees right. Sorry, I am not able to add an image. I do not know if the problem is my urdf, my code, or a missing transform.

Here is the code where I set up the message for the laser scan, with all values set to "no value":

```cpp
scan.header.frame_id = "/base_link";
scan.angle_min = 1.81514242;        // DEG_TO_RAD(MIN_ANGLE);
scan.angle_max = -1.81514242;       // DEG_TO_RAD(MAX_ANGLE);
scan.angle_increment = 0.034906585; // DEG_TO_RAD(DEG_INCREMENT);
scan.time_increment = (1.0 / 100) / (NUM_READINGS);
scan.range_min = 0.0;
scan.range_max = MAX_SONAR_CM;

scan.ranges.resize(NUM_READINGS);
scan.intensities.resize(NUM_READINGS);
for (unsigned int i = 0; i < NUM_READINGS; ++i) {
    scan.ranges[i] = (double)(MAX_SONAR_CM + .15) / 100.0;
    scan.intensities[i] = 0;
}
```

Then here is the code where I set the laser dots that correspond to the sonar readings during each sensor update:

```cpp
ROS_INFO("begin scan conversion");
scan.header.stamp = ros::Time::now();
// overwrite the "no signal" value with our sonar for the 14 readings within our beam
for (unsigned int i = 0; i < 8; ++i) {
    scan.ranges[i] = (float)(LeftF / 100.0) + SIDE_OFFSET;
    scan.intensities[i] = 0.5;
}
for (unsigned int i = 26 - 3; i < 26 + 3; ++i) {
    scan.ranges[i] = (float)Front / 100.0 + FRONT_OFFSET;
    scan.intensities[i] = 0.5;
}
for (unsigned int i = 53 - 9; i < 53; ++i) {
    scan.ranges[i] = (float)Right / 100.0 + SIDE_OFFSET;
    scan.intensities[i] = 0.5;
}
ptr_scan_pub->publish(scan);
ROS_INFO("end scan conversion");
```

I tried to add a transform from base_link to laser but got an error message saying the transform is already published by my motor control and sensor node.

Here is the urdf:

Which sensor gives out MultiEchoLaserScan messages? Which brand and which type?

The Cartographer backpack demo_2d uses the MultiEchoLaserScan message for lidar data. I would like to know which sensors output this kind of data, as I also want to use this configuration for large-area mapping.

I am new to ROS and working on a project using 2 VLP-16 LIDAR sensors. How can I visualize the laser scan of two VLP-16 sensors on 1 laptop in rviz?

I am using ROS Kinetic on Ubuntu 16.04.5. Can you please provide a detailed description of the solution?

Cartographer_ros 3D Slam (localization) with horizontal lidar and IMU

I am trying to adapt the backpack_3d tutorial of Cartographer ROS to a SICK laser scanner with an IMU. The scanner and IMU are working. However, with the current setup I do not get any localization/map (an empty /submap_list), so basically somewhere in the /cartographer_node there is an error.

![image description](/upfiles/15513483276340323.png)
![image description](/upfiles/1551348342338716.png)

My configuration file looks like this:

```lua
include "map_builder.lua"
include "trajectory_builder.lua"

options = {
  map_builder = MAP_BUILDER,
  trajectory_builder = TRAJECTORY_BUILDER,
  map_frame = "map",
  tracking_frame = "imu_link",
  published_frame = "base_link",
  odom_frame = "odom",
  provide_odom_frame = true,
  publish_frame_projected_to_2d = false,
  use_odometry = false,
  use_nav_sat = false,
  use_landmarks = false,
  num_laser_scans = 1,
  num_multi_echo_laser_scans = 0,
  num_subdivisions_per_laser_scan = 1,
  num_point_clouds = 0,
  lookup_transform_timeout_sec = 0.2,
  submap_publish_period_sec = 0.3,
  pose_publish_period_sec = 5e-3,
  trajectory_publish_period_sec = 30e-3,
  rangefinder_sampling_ratio = 1.,
  odometry_sampling_ratio = 1.,
  fixed_frame_pose_sampling_ratio = 1.,
  imu_sampling_ratio = 1.,
  landmarks_sampling_ratio = 1.,
}

TRAJECTORY_BUILDER_3D.num_accumulated_range_data = 160

MAP_BUILDER.use_trajectory_builder_3d = true
MAP_BUILDER.num_background_threads = 7
POSE_GRAPH.optimization_problem.huber_scale = 5e2
POSE_GRAPH.optimize_every_n_nodes = 320
POSE_GRAPH.constraint_builder.sampling_ratio = 0.03
POSE_GRAPH.optimization_problem.ceres_solver_options.max_num_iterations = 10
POSE_GRAPH.constraint_builder.min_score = 0.62
POSE_GRAPH.constraint_builder.global_localization_min_score = 0.66

return options
```

In comparison to the backpack tutorial (with 2 Velodyne scanners), I have a single horizontal lidar. Accordingly, I have changed the parameters `num_laser_scans = 1` and `num_point_clouds = 0`.

My major question: do you have any idea why I do not get any localization/map?

My minor question: I am using the Melodic binary version (`sudo apt install ros-melodic-cartographer-ros`). Using the debugger from Visual Studio Code with the ROS plugin I can launch Cartographer, but pausing the debugger does not work and does not give me any address in the binaries. Do you have a good idea how to debug binary ROS packages?

Thanks a lot!

Laserscan to PointCloud

Hello folks, I need to convert the laser scanner data from my Hokuyo laser range finder into a PointCloud. I know there is a package called laser_assembler which should do this, but I can't manage to make it work. There's a launch file in the test directory, but it didn't do what I expected. What I need is a launch file which converts the laser scanner's topic /scan into a point cloud topic, /scan_pointcloud for example, so I can visualize it in rviz as a first step. I will appreciate any kind of help! Regards, Timo
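For reference, the `laser_scan_assembler` node in laser_assembler does not publish a cloud by itself; a client periodically calls its `assemble_scans2` service and republishes the result. A minimal sketch of such a client, assuming the assembler node is already running and configured with a fixed frame (the 1 s assembly window and the `/scan_pointcloud` topic name are example choices):

```python
#!/usr/bin/env python
# Sketch: periodically ask laser_assembler's assemble_scans2 service for the
# scans collected over the last second and republish them as a PointCloud2.
import rospy
from laser_assembler.srv import AssembleScans2
from sensor_msgs.msg import PointCloud2

if __name__ == "__main__":
    rospy.init_node("scan_to_cloud_client")
    rospy.wait_for_service("assemble_scans2")
    assemble = rospy.ServiceProxy("assemble_scans2", AssembleScans2)
    pub = rospy.Publisher("/scan_pointcloud", PointCloud2, queue_size=1)

    rate = rospy.Rate(1.0)
    while not rospy.is_shutdown():
        try:
            resp = assemble(rospy.Time.now() - rospy.Duration(1.0), rospy.Time.now())
            pub.publish(resp.cloud)
        except rospy.ServiceException as e:
            rospy.logwarn("assemble_scans2 call failed: %s", e)
        rate.sleep()
```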

Limit the range scan on the TurtleBot3

Hello, I am a beginner in ROS and I am doing a project with the Burger version of the TurtleBot3. The TurtleBot3's LDS defaults to 360 samples, but I want to reduce that to 24. I know how to do this in the Gazebo simulation: you can modify the LDS sample settings in `turtlebot3/turtlebot3_description/urdf/turtlebot3_burger.gazebo.xacro`, where the number of samples (360 by default, at resolution 1 over a scan from 0.0 to 6.28319 rad) can be changed to 24. My doubt is that I don't know how to do the same on the real robot.

ekf_localization_node: filtered odometry yaw depends too much on the IMU, and the laser scan drifts when the robot rotates with the filtered odometry

I have an `odometry/filtered` output fused from the wheel odometry `odom` and the IMU `imu_data` with ekf_localization. I let my robot face a wall and did some tests with `odometry/filtered`. I have two problems right now:

Problem 1
---------

- When I move the robot forward and backward or rotate it on the spot, the `odometry/filtered` output looks good in rviz.
- The rviz output: https://gph.is/g/4wg9DPD (green arrow: `odometry/filtered`; blue arrow: `odom`; the axis in front of the base_link axis is my laser axis).
- But when I lift the robot up so that the wheels lose contact with the ground and move it forward and backward, both the `odometry/filtered` and `odom` outputs in rviz move forward and backward too. When I rotate the robot on the spot without wheel-ground contact, only the wheel `odom` moves and `odometry/filtered` does not, which makes sense to me because the robot is not rotating, only the wheels are spinning.
- When I leave the robot stationary and rotate only the IMU, `odom` of course does not move, but `odometry/filtered` moves exactly the way I rotate the IMU by hand.
- Rviz output with the robot lifted up (first: rotate the robot; second: move the robot forward and backward; third: rotate only the IMU): https://gph.is/g/EvdnpAY
- Does my `odometry/filtered` depend too much on the IMU? Is it related to the covariances? How can I make a better configuration for this?

Problem 2
---------

- When I rotate the robot with `odometry/filtered`, my laser scan drifts while the robot rotates, and the laser comes back to the right position when the robot stops.
- This is the rviz output when the robot is facing a wall and not rotating (green arrow: `odometry/filtered`; blue arrow: pure `odom` from the wheel encoders): https://ibb.co/Y33m99M
- This is the rviz output when I set the laser scan decay time to 20 s and rotate the robot; the laser scan rotates with the robot:
  https://ibb.co/TB3ZpmB
- Rviz video: https://gph.is/g/ZnM9XNJ

## ODOM topic ##

```
header:
  seq: 42461
  stamp:
    secs: 1560329405
    nsecs: 936365909
  frame_id: "odom"
child_frame_id: "base_link"
pose:
  pose:
    position:
      x: -0.210383832846
      y: -0.0374875475312
      z: 0.0
    orientation:
      x: 0.0
      y: 0.0
      z: 0.145501269807
      w: 0.98935806485
  covariance: [0.01, 0.0, 0.0, 0.0, 0.0, 0.0,
               0.0, 0.01, 0.0, 0.0, 0.0, 0.0,
               0.0, 0.0, 0.01, 0.0, 0.0, 0.0,
               0.0, 0.0, 0.0, 0.01, 0.0, 0.0,
               0.0, 0.0, 0.0, 0.0, 0.01, 0.0,
               0.0, 0.0, 0.0, 0.0, 0.0, 0.01]
twist:
  twist:
    linear:
      x: 0.0
      y: 0.0
      z: 0.0
    angular:
      x: 0.0
      y: 0.0
      z: 0.0
  covariance: [0.01, 0.0, 0.0, 0.0, 0.0, 0.0,
               0.0, 0.01, 0.0, 0.0, 0.0, 0.0,
               0.0, 0.0, 0.01, 0.0, 0.0, 0.0,
               0.0, 0.0, 0.0, 0.01, 0.0, 0.0,
               0.0, 0.0, 0.0, 0.0, 0.01, 0.0,
               0.0, 0.0, 0.0, 0.0, 0.0, 0.01]
```

## IMU topic ##

```
header:
  seq: 86075
  stamp:
    secs: 1560329453
    nsecs: 734069150
  frame_id: "base_imu"
orientation:
  x: -0.00954644712178
  y: 0.0186064635724
  z: 0.145939352166
  w: 0.989072479826
orientation_covariance: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
angular_velocity:
  x: 0.0
  y: 0.0
  z: 0.0
angular_velocity_covariance: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
linear_acceleration:
  x: -0.383203125
  y: -0.129331054688
  z: 9.733359375
linear_acceleration_covariance: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
```

My configuration
----------------

```yaml
frequency: 20
sensor_timeout: 1
two_d_mode: true
transform_time_offset: 0.01
transform_timeout: 0.01
print_diagnostics: true
debug: false
debug_out_file: /path/to/debug/file.txt
publish_tf: true
publish_acceleration: false

# map_frame: map
odom_frame: odom
base_link_frame: base_link
world_frame: odom

odom0: /odom
imu0: /imu_data

odom0_queue_size: 2
odom0_nodelay: false
odom0_differential: false
odom0_relative: false
odom0_config: [false, false, false,
               false, false, false,
               true,  true,  false,
               false, false, true,
               false, false, false]

imu0_config: [false, false, false,
              false, false, false,
              false, false, false,
              false, false, true,
              false, false, false]
imu0_nodelay: false
imu0_differential: false
imu0_relative: false
imu0_queue_size: 5
imu0_remove_gravitational_acceleration: true

process_noise_covariance: [0.05, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
                           0, 0.05, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
                           0, 0, 0.06, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
                           0, 0, 0, 0.03, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
                           0, 0, 0, 0, 0.03, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
                           0, 0, 0, 0, 0, 0.06, 0, 0, 0, 0, 0, 0, 0, 0, 0,
                           0, 0, 0, 0, 0, 0, 0.025, 0, 0, 0, 0, 0, 0, 0, 0,
                           0, 0, 0, 0, 0, 0, 0, 0.025, 0, 0, 0, 0, 0, 0, 0,
                           0, 0, 0, 0, 0, 0, 0, 0, 0.04, 0, 0, 0, 0, 0, 0,
                           0, 0, 0, 0, 0, 0, 0, 0, 0, 0.01, 0, 0, 0, 0, 0,
                           0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.01, 0, 0, 0, 0,
                           0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.02, 0, 0, 0,
                           0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.01, 0, 0,
                           0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.01, 0,
                           0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.015]

initial_estimate_covariance: [1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
                              0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
                              0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
                              0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
                              0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
                              0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0,
                              0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0,
                              0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0,
                              0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0,
                              0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0,
                              0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0,
                              0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0,
                              0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0,
                              0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0,
                              0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9]
```
