I have created a ROS node that does a tf transformation, and I want to merge this new frame into an already existing bag file. I did this with `rosbag record -a` while playing the existing bag with my node running and publishing the transformation. But now I get the warning "detected jump back in time, clearing tf buffer" when I play the new bag. Is there anything I have to consider when merging a frame?
Done:
-> set use_sim_time to true before recording
-> played the existing bag file with the --clock argument
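The "jump back in time" warning is typically raised when ROS time moves backwards during playback (for example when a bag played with --clock is restarted or looped), which makes tf clear its buffer, so it may be worth checking whether the re-recorded bag mixes two clock sources. One way to sidestep re-recording entirely is to merge the new transform messages into the existing bag offline. A minimal sketch with the rosbag Python API, assuming hypothetical filenames original.bag (the existing data) and new_tf.bag (a recording of only the new /tf frames):

    import rosbag

    # Copy every message from the original bag, then append the extra /tf
    # messages produced by the transform node. Both keep their recorded
    # timestamps, so no new wall-clock stamps are mixed in.
    with rosbag.Bag('merged.bag', 'w') as outbag:
        for topic, msg, t in rosbag.Bag('original.bag').read_messages():
            outbag.write(topic, msg, t)
        for topic, msg, t in rosbag.Bag('new_tf.bag').read_messages(topics=['/tf']):
            outbag.write(topic, msg, t)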
↧
Add tf to existing bag file
↧
How can I throttle multiple ROS nodes at once?
I currently have a three-camera setup that I want to record data from using `rosbag`. However, I want to slow down the video feed from these cameras (from the default 30 fps to 3 fps). I found out that you can throttle a topic using `topic_tools throttle`.
However, when we tried to create three different throttle nodes, we realized that you can apparently only run one at a time. Is there a workaround so that we can run multiple throttle nodes in parallel? Essentially, we are trying to throttle the video feed from the 3 cameras. In addition, does using the `-r` parameter for `rosbag` guarantee recording at a lower fps? That was the only workaround I could think of.
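For what it's worth, several `topic_tools throttle` instances can usually run side by side as long as each one is given a unique node name; another option is a single node that republishes all three feeds at the reduced rate. A rough sketch of the latter, where the camera topic names are assumptions:

    #!/usr/bin/env python
    # Republish several camera topics at a reduced rate (topic names are placeholders).
    import rospy
    from sensor_msgs.msg import Image

    CAMERA_TOPICS = ['/cam0/image_raw', '/cam1/image_raw', '/cam2/image_raw']
    RATE = 3.0  # target publish rate in Hz

    class Throttler(object):
        def __init__(self, topic):
            self.last = None
            self.pub = rospy.Publisher(topic + '_throttled', Image, queue_size=1)
            rospy.Subscriber(topic, Image, self.store, queue_size=1)
            rospy.Timer(rospy.Duration(1.0 / RATE), self.publish)

        def store(self, msg):
            self.last = msg

        def publish(self, event):
            if self.last is not None:
                self.pub.publish(self.last)

    if __name__ == '__main__':
        rospy.init_node('multi_camera_throttle')
        throttlers = [Throttler(t) for t in CAMERA_TOPICS]
        rospy.spin()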
↧
gmapping issues with laser pose
I'm working on creating a server map using the tutorial:
http://wiki.ros.org/slam_gmapping/Tutorials/MappingFromLoggedData
The odometry data is being published, and I have a Hokuyo UTM-30LX laser scanner publishing as well. My robot is modeled with a URDF file.
I made sure the Hokuyo was perfectly horizontal and then started recording my bag file with this:
>rosbag record -O mylaserdata /scan /tf
After the recording, it plays back into rviz fairly well. Even with all of the scan overlap, it still resembled the room recorded.
The issue comes when I try to turn it into a map with:
>rosrun gmapping slam_gmapping scan:=scan
as soon as I start playing the bag file, it gives me this warning:
[ WARN] [1384916667.376566157, 1384914870.078301371]: Failed to compute laser pose, aborting initialization ("base_link" passed to lookupTransform argument target_frame does not exist. )
This warning keeps coming up repeatedly as the bag plays. I've also tried:
>rosrun gmapping slam_gmapping scan:=base_scan _odom_frame:=odom
But it gives me the same warning.
1. Am I not correctly recording my tf data into the bag file? Shouldn't the URDF provide the laser pose?
2. The robot_state_publisher publishes on the topic named '/tf'. Why isn't my base_link from my URDF file in it? rosbag wouldn't record my base_link when I tried.
3. On a somewhat different note, if my odometry data isn't very accurate, what does that mean for gmapping and for using the Navigation Stack?
Advice would be greatly appreciated! Thank you.
Edit: I've tried to create the map live with rviz, similar to how the map was made here:
http://wiki.ros.org/pr2_simulator/Tutorials/BuildingAMapInSimulation
Unfortunately, it still gives me the same error. I thought the error came from the target frame and tried setting that in rviz. Still nothing.
I also double-checked my URDF file, and I don't believe the issue is there. The laser scan is level with the laser scanner model visualized from the URDF file.
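Regarding question 1, one quick check is whether the transforms down to base_link actually ended up in the bag at all. A small sketch with the rosbag Python API that lists every parent->child frame pair recorded on /tf in mylaserdata.bag:

    import rosbag

    # Collect the frame pairs present on the /tf topic of the recorded bag.
    frames = set()
    for topic, msg, t in rosbag.Bag('mylaserdata.bag').read_messages(topics=['/tf']):
        for tr in msg.transforms:
            frames.add((tr.header.frame_id, tr.child_frame_id))
    for parent, child in sorted(frames):
        print(parent, '->', child)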
↧
Plotting rosbag topics in a single plot
I have a (large) rosbag.
I want to plot a few of its topics (in the same plot).
I don't need any playback feature.
I can load the bag with rqt and plot topics, one topic per plot. But can I have several topics in the same plot?
Another disadvantage is that it requires a running roscore.
Are there any other tools that allow me to extract selected topics from a rosbag and plot them?
Bonus points if I can have a tool that:
1. does not require a roscore running
2. runs on Windows as well
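One roscore-free option is to read the bag directly with the rosbag Python API and draw everything with matplotlib. A minimal sketch, with hypothetical topic names and assuming simple numeric messages that carry a `data` field:

    import rosbag
    import matplotlib.pyplot as plt

    TOPICS = ['/topic_a/data', '/topic_b/data']  # placeholder topic names

    # Read each topic from the bag and add it as one line in a single figure.
    bag = rosbag.Bag('large.bag')
    for topic in TOPICS:
        times, values = [], []
        for _, msg, t in bag.read_messages(topics=[topic]):
            times.append(t.to_sec())
            values.append(msg.data)  # assumes a scalar 'data' field
        plt.plot(times, values, label=topic)
    bag.close()

    plt.xlabel('time [s]')
    plt.legend()
    plt.show()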
↧
Simulating real camera data
Hi guys,
Is there a way I can simulate camera data similarly to how rosbag works? I tried rosbag, but tf data was missing, so that didn't work.
Is it as simple as adding another topic to record? Looking through the available topics I don't see anything that looks like what I need.
In essence, what I want to do is record the data stream of my cameras to pass to my perception pipeline, so I can start working on making the robot move to a pick location. I'd like to work at my desk if possible and not at the robot, since there isn't much room for my laptop there.
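If the missing piece was only the transform data, it may be enough to also record /tf and /tf_static (plus camera_info) alongside the image topic. As an illustration of what has to end up in the bag, here is a rough Python sketch of such a recorder; the topic names and the bag path are assumptions:

    #!/usr/bin/env python
    # Record camera images together with tf so a perception pipeline can be
    # replayed at a desk. Topic names and the output path are placeholders.
    import threading
    import rospy
    import rosbag
    from sensor_msgs.msg import Image, CameraInfo
    from tf2_msgs.msg import TFMessage

    bag = rosbag.Bag('camera_with_tf.bag', 'w')
    lock = threading.Lock()  # rospy callbacks run in separate threads

    def writer(topic):
        def cb(msg):
            with lock:
                bag.write(topic, msg, rospy.Time.now())
        return cb

    rospy.init_node('camera_bag_recorder')
    for topic, msg_type in [('/camera/image_raw', Image),
                            ('/camera/camera_info', CameraInfo),
                            ('/tf', TFMessage),
                            ('/tf_static', TFMessage)]:
        rospy.Subscriber(topic, msg_type, writer(topic))
    rospy.spin()
    bag.close()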
Thanks
↧
Finding buffered frames in tf2_ros::Buffer (C++)
I am creating a post-processing utility where I read all the transforms from a .bag file into a tf2_ros::Buffer.
I would like to be able to ask the Buffer for all the frames it has cached. I don't see any **public** member functions in the [documentation](http://docs.ros.org/indigo/api/tf2_ros/html/c++/classtf2__ros_1_1Buffer.html#ab96886f1292cc3cd7452d65312ec94d0) that allow you to view the cached frames.
What is the best way to access all the cached frames in a tf2_ros::Buffer?
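In case it helps, here is the same post-processing idea sketched in Python, with input.bag as a placeholder name. The frame dump comes from `all_frames_as_yaml()`; if I remember correctly, the C++ `tf2_ros::Buffer` inherits the analogous public `allFramesAsYAML()` / `allFramesAsString()` methods from `tf2::BufferCore`.

    import rospy
    import rosbag
    import tf2_ros

    # Fill a tf2 buffer from the bag, then dump every frame it has cached.
    buf = tf2_ros.Buffer(cache_time=rospy.Duration(3600), debug=False)
    for topic, msg, t in rosbag.Bag('input.bag').read_messages(topics=['/tf', '/tf_static']):
        for tr in msg.transforms:
            if topic == '/tf_static':
                buf.set_transform_static(tr, 'bag')
            else:
                buf.set_transform(tr, 'bag')

    print(buf.all_frames_as_yaml())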
↧
Tf_echo missing timestamp
Hi,
I start playing my rosbag and **then** I start tf_echo: `rosrun tf tf_echo odom map`
This is the output:
> At time 1559381315.708
> - Translation: [0.000, 0.000, 0.000]
> - Rotation: in Quaternion [0.000, 0.000, 0.000, 1.000]
>   in RPY (radian) [0.000, -0.000, 0.000]
>   in RPY (degree) [0.000, -0.000, 0.000]
I am happy with this result.
But now, if I start tf_echo **first** and then start playing the rosbag:
> At time 0.000
> - Translation: [0.000, 0.000, 0.000]
> - Rotation: in Quaternion [0.000, 0.000, 0.000, 1.000]
>   in RPY (radian) [0.000, -0.000, 0.000]
>   in RPY (degree) [0.000, -0.000, 0.000]

You see, it doesn't display the timestamp of the rosbag but "0.000" instead. Do you have any explanation?
Thank you
↧
How to read a point cloud (.pcd) file and send it to a rosbag
Hello, I just started using the Point Cloud Library. I followed some initial tutorial [here](http://pointclouds.org/documentation/tutorials/reading_pcd.php) on how to use it.
I am trying to send a `point cloud` file `.pcd` into a `rosbag`.
I would like the `rosbag` to receive both `ascii` and `binary` files. I know that the `point cloud library` has the useful function `pcl::io::loadPCDFile`, which I used in the following example (see the snippet of code below). However, I am not sure how to advance to the next line while reading the file and make the `rosbag` understand that.
My idea was to create a `void nextLine()` function that does that, but I stopped because I realized I would have to write a `.pcd` parser from scratch according to the [structure](http://pointclouds.org/documentation/tutorials/pcd_file_format.php) of the point cloud file. I am not sure how to move on.
The final goal is to be able to read any `.pcd` file and send it to a `rosbag` --> `rosbag play example.bag`
See below the snippet of code I am using:
**pointcloud_reader_node.cpp**
#include <ros/ros.h>
#include <rosbag/bag.h>
#include <sensor_msgs/PointCloud2.h>
#include "../include/cloud.h"
#include <string>
#include <iostream>
using namespace std;
int main()
{
rosbag::Bag bag;
bag.open("test.bag", rosbag::bagmode::Write);
std::string fstring = "/home/to/Desktop/file.pcd";
Cloud p;
p.readPCloud(fstring);
// while())
// {
// bag.write("point_cloud/data", p.pCloud2Msg.header.stamp , p.pCloud2Msg);
// }
return 0;
}
**cloud.h**
#include "cloud.h"
#ifndef CLOUD_H
#define CLOUD_H
#include
#include
#include
#include
#include
class Cloud
{
public:
void readPCloud(std::string filename);
sensor_msgs::PointCloud2 pCloud2Msg;
void nextLine();
private:
unsigned int msgPointCloud;
void packPointCloudMsg();
std::string path;
};
#endif// CLOUD_H
**cloud.cpp**
#include "cloud.h"
void Cloud::readPCloud(std::string filename)
{
pcl::PointCloud<pcl::PointXYZ>::Ptr cloud(new pcl::PointCloud<pcl::PointXYZ>);
if(pcl::io::loadPCDFile<pcl::PointXYZ>(filename, *cloud) == -1) // load point cloud file
{
PCL_ERROR("Could not read the file");
return;
}
std::cout<<"Loaded"<width * cloud->height<<"data points from /home/to/Desktop/file.pcd with the following fields: "<points.size(); ++i)
std::cout << " " << cloud->points[i].x<< " " << cloud->points[i].y<< " " << cloud->points[i].z << std::endl;
}
void Cloud::nextLine()
{
}
void Cloud::packPointCloudMsg()
{
ros::Time time = ros::Time::now();
pCloud2Msg.header.stamp = time;
pCloud2Msg.header.frame_id = "point_cloud";
pCloud2Msg.header.seq = msgPointCloud;
pCloud2Msg.height = 1;
pCloud2Msg.width = 5;
pCloud2Msg.is_bigendian = false;
pCloud2Msg.is_dense = false;
}
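Not a full answer, but for the ASCII case the pipeline can also be sketched in Python without writing a C++ parser: read the lines after the `DATA ascii` marker, pack them into a `sensor_msgs/PointCloud2` with `point_cloud2.create_cloud_xyz32`, and write the message into the bag. This is only a sketch and assumes an ASCII file whose first three fields are x, y and z; binary files would need PCL (or the python-pcl bindings) to load:

    import rospy
    import rosbag
    from std_msgs.msg import Header
    from sensor_msgs import point_cloud2

    # Minimal hand parser for the ASCII data section of a .pcd file.
    points = []
    with open('/home/to/Desktop/file.pcd') as f:
        data_started = False
        for line in f:
            if data_started:
                x, y, z = [float(v) for v in line.split()[:3]]
                points.append((x, y, z))
            elif line.startswith('DATA'):
                data_started = line.split()[1] == 'ascii'

    header = Header(frame_id='point_cloud', stamp=rospy.Time(1))  # non-zero stamp
    cloud_msg = point_cloud2.create_cloud_xyz32(header, points)

    with rosbag.Bag('test.bag', 'w') as bag:
        bag.write('point_cloud/data', cloud_msg, cloud_msg.header.stamp)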
Thank you very much for pointing me in the right direction or providing an example if anyone has had this problem before.
↧
Tried to insert a message with time less than ros::TIME_MIN Aborted (core dumped)
Hello, after successfully reading a `.pcd` file as described in my [previous post](http://answers.ros.org/question/325693/conversion-from-point-cloud-pcl-to-pointcloud2-ros-not-showing-result-on-rviz/), I am trying to write it to a `rosbag`, but it is not working. I wanted to check whether messages are currently being received using `rostopic list`/`echo`, but instead I get the following runtime error:
emanuele@Manny:~$ rosrun map_ros pointcloud_reader_node
terminate called after throwing an instance of 'rosbag::BagException'
what(): Tried to insert a message with time less than ros::TIME_MIN
Aborted (core dumped)
I have never seen this error before, and after reading the official documentation and other useful [sources](http://answers.ros.org/question/243127/segmentation-fault-core-dumped/) I still don't understand how this `ros::TIME_MIN` exception should be handled.
See below the code I am using for that:
**cloud.h**
class Cloud
{
public:
void readPCloud(std::string filename, ros::NodeHandle &n);
sensor_msgs::PointCloud2 pCloud2Msg;
ros::Subscriber cloud_sub;
ros::Publisher cloud_pub;
ros::Timer particleTimer;
ros::NodeHandle _n;
private:
void timerCallback(const ros::TimerEvent&)
{
publishPointCloud();
}
void publishPointCloud()
{
sensor_msgs::PointCloud2 particleCloudMsg;
particleCloudMsg.header.stamp = ros::Time::now();
particleCloudMsg.header.frame_id = "cloud";
cloud_pub.publish(particleCloudMsg);
}
};
**cloud.cpp**
void Cloud::readPCloud(std::string filename, ros::NodeHandle &n)
{
pcl::PointCloud<pcl::PointXYZ>::Ptr cloud(new pcl::PointCloud<pcl::PointXYZ>);
if(pcl::io::loadPCDFile<pcl::PointXYZ>(filename, *cloud) == -1) // load point cloud file
{
PCL_ERROR("Could not read the file");
return;
}
std::cout<<"Loaded"<width * cloud->height
<<"data points from /home/to/Desktop/cloud_25.pcd "
<points.size(); ++i)
std::cout << " " << cloud->points[i].x
<< " " << cloud->points[i].y
<< " " << cloud->points[i].z << std::endl;
particleTimer = n.createTimer(ros::Duration(1), &Cloud::timerCallback, this);
}
**pointcloud_reader_node.cpp**
int main(int argc, char** argv)
{
ros::init (argc, argv, "pcl_tutorial_cloud");
ros::NodeHandle nh;
ros::Publisher pub = nh.advertise<sensor_msgs::PointCloud2>("output", 1000);
ros::Rate loop_rate(1);
rosbag::Bag bag;
bag.open("cloud.bag", rosbag::bagmode::Write);
std::string fstring = "/home/to/Desktop/cloud_25.pcd";
Cloud p;
int count = 0;
while(ros::ok())
{
sensor_msgs::PointCloud2 pcloud2;
pub.publish(pcloud2);
bag.write("cloud/data", p.pCloud2Msg.header.stamp , p.pCloud2Msg);
ros::spinOnce();
loop_rate.sleep();
count ++;
}
return 0;
}
Thanks for shedding light on this issue.
↧
rosbag and parameters dump with unique timestamp in filename
Dear all,
my problem is the following:
I want to save a rosbag file and, at the same time, dump the contents of the parameter server to a file.
I want the file and the rosbag to have exactly the same timestamp suffix. This is a similar problem to [this one](http://answers.ros.org/question/213145/use-rosparam-get-in-launch-file/), but not quite the same, because it doesn't involve putting timestamps into parameter server values.
I couldn't find a way to save the timestamp string into an `arg`; however, the following code kind of works for the parameters:
The output says something like:
running rosparam dump robot_description_`date +'%Y-%m-%d-%H-%M-%S'`.yaml /robot_description
but the filename is correct.
However, when it comes to the rosbag part, the command is not understood; in particular, it believes `+'%Y-%m-%d-%H-%M-%S'` is a topic to log instead of an argument of `date`.
I know I could rely on rosbag's automatic appending of the timestamp string, but there is a tiny chance the two strings would differ.
I've tried to execute a fake node with a bash script inside that sets an environment variable to be captured with `$(env ...)`, but it didn't work out.
I've tried to use Python with `$(eval ...)`, but I couldn't get it to work because I can't import and use `time` in the same line.
I understand that `param` values are inaccessible from a launch file (you can only set them, not get them) unless you do dirty tricks with `launch-prefix`.
Is there another way (please don't tell me the only way is to create a Python script to run the rosbag)?
↧
Rosbags record parameter server data?
I am working with some rosbags. `rosbag info` shows that there are some `dynamically reconfigurable` parameters which I want to configure, but these `parameters` are not shown when I use `rosparam list`.
In the documentation it is mentioned that `rosbag` records data on the topics passed to `rosbag record`, so my question is: do rosbags record any data other than what they get by subscribing to topics?
Specific information: I am using Ubuntu 16.04 and ROS Kinetic; the rosbags were taken from a Velodyne VLP-16 LIDAR. I am not sure about the drivers/packages used with the VLP-16, since I was only provided with the bags.
↧
How to sync images captured with different frequencies through timestamps
Hi,
I recently recorded some images with an RGB camera (operating at 20 Hz) and a lidar (10 Hz). Now I would like to align those two. However, they are not synchronized due to the different frequencies. Apparently, one could try to synchronize them by matching the timestamps as described here: http://wiki.ros.org/message_filters#Time_Synchronizer. However, I don't quite understand what the synchronizer actually does.
Could someone please tell me whether this is the correct direction and how I should proceed?
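Roughly speaking, a time synchronizer buffers incoming messages from several topics and only calls your callback with a set whose header stamps match; the exact `TimeSynchronizer` would almost never fire for sensors running at 20 Hz and 10 Hz, so the approximate variant with a tolerance (`slop`) is the usual choice here. A hedged sketch using the camera and point-cloud topic names from the bag info below:

    #!/usr/bin/env python
    # Pair camera images with lidar clouds whose stamps lie within `slop` seconds.
    import rospy
    import message_filters
    from sensor_msgs.msg import Image, PointCloud2

    def paired(image_msg, cloud_msg):
        # Called only for an (image, cloud) pair with closely matching stamps.
        rospy.loginfo('image %.3f <-> cloud %.3f',
                      image_msg.header.stamp.to_sec(),
                      cloud_msg.header.stamp.to_sec())

    rospy.init_node('camera_lidar_sync')
    image_sub = message_filters.Subscriber('/VIS/cam0/image_raw', Image)
    cloud_sub = message_filters.Subscriber('/os1_cloud_node/points', PointCloud2)
    sync = message_filters.ApproximateTimeSynchronizer([image_sub, cloud_sub],
                                                       queue_size=20, slop=0.05)
    sync.registerCallback(paired)
    rospy.spin()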
The info of my bag reads as the following:
xxx@ubuntu:~$ rosbag info /home/xxx/Desktop/2019-06-18-11-38-00.bag
path: /home/xxx/Desktop/2019-06-18-11-38-00.bag
version: 2.0
duration: 40.6s
start: Jun 18 2019 11:38:00.89 (1560850680.89)
end: Jun 18 2019 11:38:41.49 (1560850721.49)
size: 3.2 GB
messages: 46042
compression: none [2072/2072 chunks]
types: ouster_ros/PacketMsg [4f7b5949e76f86d01e96b0e33ba9b5e3]
sensor_msgs/Image [060021388200f6f0f447d0fcd9c64743]
sensor_msgs/Imu [6a62c6daae103f4ff57a132d6f95cec2]
sensor_msgs/PointCloud2 [1158d486dd51d683ce2f1be655c3c181]
topics: /VIS/cam0/image_raw 812 msgs : sensor_msgs/Image
/VIS/cam1/image_raw 812 msgs : sensor_msgs/Image
/VIS/cam2/image_raw 813 msgs : sensor_msgs/Image
/VIS/imu 7904 msgs : sensor_msgs/Imu
/img_node/intensity_image 406 msgs : sensor_msgs/Image
/img_node/noise_image 405 msgs : sensor_msgs/Image
/img_node/range_image 406 msgs : sensor_msgs/Image
/os1_cloud_node/imu 4057 msgs : sensor_msgs/Imu
/os1_cloud_node/points 405 msgs : sensor_msgs/PointCloud2
/os1_node/imu_packets 4055 msgs : ouster_ros/PacketMsg
/os1_node/lidar_packets 25967 msgs : ouster_ros/PacketMsg
Thanks in advance!
↧
Rosbag status problem when an intended exit occurs
Hi, all
I am using rosbag to record some data automatically with a systemd service on Ubuntu.
When an intended exit occurs, the rosbag file is left incomplete, with the *.bag.active status.
I tried the commands below to restore the file:
`rosbag reindex`
`rosbag fix`
but it was not restored, due to some unknown problem.
The bag is only saved properly when I use `rosnode kill "node_name"`, but that is not an automatic solution...
I want to know how a rosbag file can be saved properly when the recording terminates abnormally.
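The .active file usually means the recorder was killed before it could close the bag. One approach that might help is to make sure rosbag receives SIGINT when the service stops, for example via `KillSignal=SIGINT` in the systemd unit, or with a small wrapper along these lines (the record arguments and output path are placeholders):

    #!/usr/bin/env python
    # Wrapper for a systemd ExecStart: forward the SIGTERM sent on shutdown
    # as a SIGINT to rosbag record, so the bag is closed cleanly instead of
    # being left behind as *.bag.active.
    import signal
    import subprocess
    import sys

    proc = subprocess.Popen(['rosbag', 'record', '-a', '-O', '/data/auto'])

    def forward(signum, frame):
        proc.send_signal(signal.SIGINT)  # same effect as Ctrl-C on rosbag record

    signal.signal(signal.SIGTERM, forward)
    signal.signal(signal.SIGINT, forward)
    sys.exit(proc.wait())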
↧
Possible to record a specific time window (from T1 to T2) of a rosbag?
Hello, I am trying to understand whether it is possible to record only a specific time window (from time T1 to time T2) of a `rosbag`.
The reason for this is that I need to print the result of specific topics into 2 different `.csv` files.
According to the `rosbag` [official documentation](http://wiki.ros.org/rosbag/Tutorials/Recording%20and%20playing%20back%20data) it is possible to start recording from a specific point in time in the bag, and in fact that is what I am doing, but I don't know how to set an end time:
But is it possible to set a limit on the recording time?
From the example below, I need to record approximately from minute 39 to minute 224 (4h 4' 0"), so:
**Beginning of the recording**: 39*60 = 2340 seconds
**End of the recording**: 224*60 = 13440 seconds
![1](https://i.imgur.com/nZ4gGdG.png)
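One way that avoids re-recording entirely is to cut the window out of the existing bag offline, either with `rosbag filter` and a time expression or with the rosbag Python API, as sketched below (input.bag is a placeholder name; the offsets are the 2340 s and 13440 s computed above):

    import rospy
    import rosbag

    # Copy only the messages between two offsets (seconds from the bag start)
    # into a new bag.
    inbag = rosbag.Bag('input.bag')
    start = rospy.Time.from_sec(inbag.get_start_time() + 2340)
    end = rospy.Time.from_sec(inbag.get_start_time() + 13440)

    with rosbag.Bag('window.bag', 'w') as outbag:
        for topic, msg, t in inbag.read_messages(start_time=start, end_time=end):
            outbag.write(topic, msg, t)
    inbag.close()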
Thanks for pointing me in the right direction on this matter.
↧
Slow Rosbag Image Extraction.
I am having issues extracting images from bag files with the Python API; the extraction process is very slow. An example would be extracting frames from a bag containing video messages: the recording is 120 seconds long, but it takes close to 40 seconds to extract the data for just one image topic, which isn't ideal.
For context, we use stereo cameras that produce 720p RGB images at 15 FPS.
Current Data Flow:
Open image message -> use ROS CvBridge to convert the ROS image to an OpenCV image -> use the `tostring` method to convert the image to a byte array -> send the byte array to another service to create and store the image.
Below are the timings for the different parts of the program:
- Looping over the bag without doing anything: 10 seconds
- Looping and converting to an OpenCV image only: 12 seconds
- Looping and converting with ROS CvBridge, then converting the image to a NumPy array: 30 seconds
Below is the bit of code I am working with:
import rosbag
import cv2
from cv_bridge import CvBridge

bridge = CvBridge()
bag = rosbag.Bag('blackflys_left_2019-05-02-13-40-46.bag')
for topic, msg, t in bag.read_messages(topics=['/stereo/left/image_raw']):
    # Convert the ROS Image message to an OpenCV image
    cv2_img = bridge.imgmsg_to_cv2(msg, desired_encoding="passthrough")
    # Encode the image as a JPEG byte string
    img_str = cv2.imencode('.jpg', cv2_img)[1].tostring()
bag.close()
Am I missing something, or are these sorts of speeds normal? We would ideally like to extract and store the images as fast as possible.
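For 720p at 15 fps a 120-second bag holds roughly 1800 frames, so a large share of the time likely goes into reading the bag and JPEG-encoding every frame. One variation that may be worth timing is building the NumPy view straight from the message buffer and lowering the JPEG quality; whether it actually beats cv_bridge depends on the encoding, so treat it as a starting point for profiling rather than a fix:

    import numpy as np
    import cv2
    import rosbag

    # Sketch: build the image array directly from the raw message buffer.
    # Assumes an 8-bit encoding (e.g. rgb8/bgr8/mono8) without row padding.
    bag = rosbag.Bag('blackflys_left_2019-05-02-13-40-46.bag')
    for topic, msg, t in bag.read_messages(topics=['/stereo/left/image_raw']):
        img = np.frombuffer(msg.data, dtype=np.uint8).reshape(msg.height, msg.width, -1)
        if msg.encoding == 'rgb8':
            img = img[:, :, ::-1]  # OpenCV expects BGR channel order
        ok, buf = cv2.imencode('.jpg', img, [int(cv2.IMWRITE_JPEG_QUALITY), 80])
        img_str = buf.tostring()
    bag.close()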
Thank you.
↧