Hi,
I would like to use the rgbdslam_v2 package http://felixendres.github.io/rgbdslam_v2/ on my own RGB-D data.
My RGB-D data is supplied as a pair of image sequences, one consisting of depth images and the other containing the corresponding registered RGB data.
Does anybody know how to convert this pair of image streams into a rosbag file that could subsequently be used by rgbdslam_v2? The depth camera intrinsics are known ...
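For reference, something like the script below is what I had in mind — a rough sketch that assumes the frames are PNG files (16-bit for depth) read via cv_bridge, and where the topic names, frame id and frame rate are just placeholders:

    #!/usr/bin/env python
    # Rough sketch: write paired RGB/depth image sequences into a bag.
    import glob
    import cv2
    import rosbag
    import rospy
    from cv_bridge import CvBridge

    bridge = CvBridge()
    rgb_files = sorted(glob.glob('rgb/*.png'))      # placeholder paths
    depth_files = sorted(glob.glob('depth/*.png'))

    bag = rosbag.Bag('rgbd.bag', 'w')
    try:
        for i, (rgb_f, depth_f) in enumerate(zip(rgb_files, depth_files)):
            stamp = rospy.Time.from_sec(i / 30.0)  # assumed 30 Hz; use real timestamps if available
            rgb = bridge.cv2_to_imgmsg(cv2.imread(rgb_f, cv2.IMREAD_COLOR), encoding='bgr8')
            # assumes 16-bit depth PNGs
            depth = bridge.cv2_to_imgmsg(cv2.imread(depth_f, cv2.IMREAD_UNCHANGED), encoding='16UC1')
            rgb.header.stamp = depth.header.stamp = stamp
            rgb.header.frame_id = depth.header.frame_id = 'camera_rgb_optical_frame'
            bag.write('/camera/rgb/image_color', rgb, stamp)
            bag.write('/camera/depth/image_raw', depth, stamp)
            # the known intrinsics could additionally be written as a
            # sensor_msgs/CameraInfo message on a camera_info topic
    finally:
        bag.close()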
Thanks a lot
David
↧
How to convert a sequence of registered RGB and depth images into a ROSBAG?
↧
How to detect broken rosbag files
Hello ROS users,
I've recorded plenty of rosbag files during a test run, with a 1 min split and lz4 compression.
Afterwards some of the data got corrupted (by the hard drive, not by an error in rosbag).
Playing the rosbag files works and starts normally, but when rosbag hits a broken file I get the following message:
ROSLZ4_DATA_ERROR: malformed data to decompress
I now want to identify the broken rosbag files and either try to fix them or, if they are not fixable, delete them.
But I can't find a way to identify these broken files. Does anybody know a way?
My first thought was to use "rosbag check", but it doesn't seem to be intended for this use case — or is it and I'm just not seeing it?
Has anybody had a similar problem and found a solution within the ROS ecosystem?
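The closest I've come up with is a brute-force script that simply tries to read every message and flags files that raise an exception — a sketch, assuming all bags are in the current directory:

    #!/usr/bin/env python
    # Sketch: flag bag files whose messages cannot be read/decompressed.
    import glob
    import rosbag

    for path in sorted(glob.glob('*.bag')):
        try:
            bag = rosbag.Bag(path)
            for _topic, _msg, _t in bag.read_messages():
                pass  # iterating forces decompression of every chunk
            bag.close()
            print('%s: OK' % path)
        except Exception as e:
            print('%s: BROKEN (%s)' % (path, e))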
Kind regards,
Tobias
↧
↧
Playback of rosbag with tf2-static transforms
Hello ROS users,
I've got a robot setup where most of the static transforms are defined by a URDF (which results in tf2 static transforms).
During a test run we recorded plenty of bag files, which were split every 1 min.
The static transforms are only contained in the first bag file.
(1st question) Is there a way to record split bag files such that the static TFs are contained in every bag file?
Furthermore (2nd question), when the bag file containing the static TFs is played, they are only received by rviz (and probably other nodes as well) if rviz was started prior to the playback of the bag file, which seems like a bug to me.
Is there a solution for that (e.g. having rviz receive the static TFs even when it is started, let's say, 2 s after the rosbag playback started)?
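One workaround I'm considering is to extract the static transforms from the first bag and re-publish them on a latched topic alongside the playback — a rough sketch, where the bag file name is a placeholder:

    #!/usr/bin/env python
    # Sketch: republish /tf_static from the first split as a latched topic,
    # so nodes started after playback began still receive the transforms.
    import rosbag
    import rospy
    from tf2_msgs.msg import TFMessage

    rospy.init_node('static_tf_republisher')
    pub = rospy.Publisher('/tf_static', TFMessage, queue_size=1, latch=True)

    transforms = []
    bag = rosbag.Bag('first_split.bag')  # placeholder file name
    for _topic, msg, _t in bag.read_messages(topics=['/tf_static']):
        transforms.extend(msg.transforms)
    bag.close()

    pub.publish(TFMessage(transforms=transforms))
    rospy.spin()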
Regards,
Tobias
↧
Can't get an image from a rosbag file in rviz
Hello.
I have two machines. One has a running roscore and the iai_kinect2 package.
I recorded a rosbag file with one topic (/kinect2/qhd/image_color/compressed) on another PC.
But when I try to look at it in rviz, I get the warning message "No image received". However, when I start rviz alongside the iai node, it does show images. What happened to the images in the bag file? How can I record the data properly?
Thanks
↧
unable to extract images from bag file
http://wiki.ros.org/rosbag/Tutorials/Exporting%20image%20and%20video%20data
I am following this tutorial.
After I ran roslaunch export.launch, this is the message I got:
    [rosbag-2] process has finished cleanly
    log file: /home/krish/.ros/log/2e6402e2-2086-11e5-8d8c-0026c72b65a2/rosbag-2*.log
I then pressed Ctrl+C and moved this file into the test directory, but I can't see any images there.
Also, I actually want to extract stereo images.
When I do rosbag info, I get:
    types:   sensor_msgs/Image [060021388200f6f0f447d0fcd9c64743]
    topics:  /rrc_camera/left/image_raw    171 msgs : sensor_msgs/Image
             /rrc_camera/right/image_raw   171 msgs : sensor_msgs/Image
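If the launch-file approach isn't meant for this, would extracting both topics directly with the Python rosbag API be a reasonable alternative? Something like the sketch below, where the bag name and output folders are placeholders:

    #!/usr/bin/env python
    # Sketch: dump left/right stereo images from the bag to PNG files.
    import os
    import cv2
    import rosbag
    from cv_bridge import CvBridge

    bridge = CvBridge()
    topics = {'/rrc_camera/left/image_raw': 'left',
              '/rrc_camera/right/image_raw': 'right'}
    for d in topics.values():
        if not os.path.isdir(d):
            os.makedirs(d)

    counts = dict.fromkeys(topics, 0)
    bag = rosbag.Bag('stereo.bag')  # placeholder file name
    for topic, msg, _t in bag.read_messages(topics=list(topics)):
        img = bridge.imgmsg_to_cv2(msg, desired_encoding='bgr8')
        cv2.imwrite('%s/frame%04d.png' % (topics[topic], counts[topic]), img)
        counts[topic] += 1
    bag.close()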
↧
↧
How to convert a .csv or .mat file to ROS?
I collected odometry and IMU data with a dSPACE MicroAutoBox, and at the same time I recorded lidar data directly in ROS.
The odom data is currently a .mat file and can be converted to .csv. The lidar data has been recorded into a rosbag file.
Now I want to first convert the odom data into something usable in ROS and then merge the two into a single rosbag file. How can I do that?
I searched online for a long time but could only find how to convert rosbag files to other formats...
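For illustration, here is the kind of approach I'm imagining — a rough sketch that writes the CSV rows as nav_msgs/Odometry and then appends the lidar bag, where the file names, topic names and column layout are all assumptions:

    #!/usr/bin/env python
    # Sketch: CSV odometry -> bag, then copy the lidar bag into the same file.
    import csv
    import rosbag
    import rospy
    from nav_msgs.msg import Odometry

    out = rosbag.Bag('merged.bag', 'w')

    # 1) CSV rows -> nav_msgs/Odometry (column names are assumptions)
    with open('odom.csv') as f:
        for row in csv.DictReader(f):
            msg = Odometry()
            msg.header.stamp = rospy.Time.from_sec(float(row['time']))
            msg.header.frame_id = 'odom'
            msg.pose.pose.position.x = float(row['x'])
            msg.pose.pose.position.y = float(row['y'])
            out.write('/odom', msg, msg.header.stamp)

    # 2) copy every message of the existing lidar bag into the merged bag
    lidar = rosbag.Bag('lidar.bag')
    for topic, m, t in lidar.read_messages():
        out.write(topic, m, t)
    lidar.close()

    out.close()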
↧
how to use rosbag API?
I'm following the [C++ tutorial](http://wiki.ros.org/rosbag/Code%20API) and I have written the code below as a .cpp file. What should I do next?
    #include <ros/ros.h>
    #include <rosbag/bag.h>
    #include <std_msgs/String.h>
    #include <std_msgs/Int32.h>

    int main(int argc, char** argv)
    {
        ros::init(argc, argv, "bagit");
        ros::NodeHandle nh;  // creating a NodeHandle also initializes ros::Time

        rosbag::Bag bag;
        bag.open("test.bag", rosbag::bagmode::Write);

        std_msgs::String str;
        str.data = std::string("foo");

        std_msgs::Int32 i;
        i.data = 42;

        bag.write("chatter", ros::Time::now(), str);
        bag.write("number", ros::Time::now(), i);

        bag.close();
        return 0;
    }
↧
Can I record the data of the latest several seconds to a .bag?
Hi,
I would like to record the data of the latest several seconds, but I cannot find any way to do that.
I think what I need is a queue. Say a topic publishes at 1 Hz and I want to keep the latest three seconds: the first three messages are pushed onto the queue, and once the time stamp grows beyond three seconds the oldest message is popped while the new one is pushed.
t0-t1-t2 -> t1-t2-t3 -> t2-t3-t4
I do not know if I am right. Or is there an easier way to do this? Many thanks.
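To illustrate the idea, here is a rough sketch of such a rolling buffer using the Python rosbag API — the topic name, message type and window length are placeholders:

    #!/usr/bin/env python
    # Sketch: keep the last WINDOW seconds of a topic in a deque and dump
    # the buffer to a bag file when the node shuts down.
    import collections
    import rosbag
    import rospy
    from std_msgs.msg import String

    WINDOW = 3.0  # seconds to keep
    buf = collections.deque()

    def callback(msg):
        now = rospy.Time.now()
        buf.append((now, msg))
        # drop everything older than WINDOW seconds
        while buf and (now - buf[0][0]).to_sec() > WINDOW:
            buf.popleft()

    def dump():
        bag = rosbag.Bag('last_seconds.bag', 'w')
        for stamp, msg in buf:
            bag.write('/chatter', msg, stamp)
        bag.close()

    rospy.init_node('rolling_recorder')
    rospy.Subscriber('/chatter', String, callback)
    rospy.on_shutdown(dump)
    rospy.spin()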
↧
Access next element in topics from rosbag C++ API
Hi,
I have bag files with a bunch of different topics (which are synchronized). I would like to access the next element from each of those topics and perform some operations on them, and repeat this in a loop until the end of the bag file.
The [C++ API example](http://wiki.ros.org/rosbag/Code%20API#CA-f1c26e1c040572c3b8f12eae457dcc7484e9f089_1) only shows how to iterate over _all_ the messages from _all_ the topics, which then requires an extra check to find out which topic each message came from. Is there a way to process a set of topics, but each one individually?
Also, I'm not sure about the purpose of the rosbag View class. Is it the only way to access the messages?
Thank you.
↧
↧
Is it normal to see duplicate console messages with identical time stamps?
I'm debugging some robot code by playing a bag file and capturing the console messages via rqt. One thing that strikes me as odd is that there are duplicate messages with the same time stamp, but received at different times.
I'm not allowed to upload pictures, so I'll copy the text from two messages. AFAIK, every message is being duplicated. The output is from rqt.
This is the first message; it is tagged as #3239.
Node: /inter_sector_planner
Time: 03:56:15.943835610 (2017-03-17)
Severity: Info
Published Topics: /robots/arrived_home, /rosout
InterSectorPlanner: Received status update from intra-sector planner...
Location:
/root/Dev/robots/ros_indigo/src/coverage_planner/robots_inter_sector_planner/src/inter_sector.cpp:SectorStatusCallback:301
----------------------------------------------------------------------------------------------------
This is the second message and it is tagged as #3261.
Node: /inter_sector_planner
Time: 03:56:15.943835610 (2017-03-17)
Severity: Info
Published Topics: /robots/arrived_home, /rosout
InterSectorPlanner: Received status update from intra-sector planner...
Location:
/root/Dev/robots/ros_indigo/src/coverage_planner/robots_inter_sector_planner/src/inter_sector.cpp:SectorStatusCallback:301
----------------------------------------------------------------------------------------------------
Does this mean that two instances are running? Why are the messages duplicated?
↧
Is there a way to play a ROSBAG on repeat?
Is there a way to get rosbag to play a bag file in a loop?
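(I have seen `rosbag play -l <file.bag>` used elsewhere — is the `--loop`/`-l` flag the intended way to do this, or is there another mechanism?)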
↧
find position of object in a bag
Hi guys,
I need to find (estimate) the 3D position of the people in the scene (they are in the bag file) with respect to the camera.
My code is the following (I didn't write it):
    // ROS
    #include <ros/ros.h>
    #include <sensor_msgs/Image.h>
    #include <sensor_msgs/image_encodings.h>
    #include <cv_bridge/cv_bridge.h>
    // OpenCV
    #include <opencv2/core/core.hpp>
    #include <opencv2/highgui/highgui.hpp>
    #include <opencv2/imgproc/imgproc.hpp>
    // BG SUBTRACTION
    #include <opencv2/video/background_segm.hpp>

    cv::Ptr<cv::BackgroundSubtractorMOG> pMog;
    cv::Mat fgMask;

    void rgbCallback(const sensor_msgs::ImageConstPtr& msg)
    {
        cv_bridge::CvImageConstPtr cv_ptr;
        try
        {
            cv_ptr = cv_bridge::toCvCopy(msg, sensor_msgs::image_encodings::BGR8);
        }
        catch (cv_bridge::Exception& ex)
        {
            ROS_ERROR("cv_bridge exception: %s", ex.what());
            exit(-1);
        }

        pMog->operator()(cv_ptr->image, fgMask);

        cv::imshow("RGB", cv_ptr->image);
        cv::imshow("FgMask", fgMask);
        cv::waitKey(30);
    }

    void depthCallback(const sensor_msgs::ImageConstPtr& msg)
    {
        cv_bridge::CvImageConstPtr cv_ptr;
        try
        {
            cv_ptr = cv_bridge::toCvCopy(msg, sensor_msgs::image_encodings::TYPE_32FC1);
        }
        catch (cv_bridge::Exception& ex)
        {
            ROS_ERROR("cv_bridge exception: %s", ex.what());
            exit(-1);
        }

        cv::imshow("DEPTH", cv_ptr->image);
        cv::waitKey(30);
    }

    int main(int argc, char **argv)
    {
        ros::init(argc, argv, "kinectgrabber");
        pMog = new cv::BackgroundSubtractorMOG(); // MOG approach

        ros::NodeHandle n;
        ros::Subscriber sub = n.subscribe("/camera/rgb/image_color", 1, rgbCallback);
        //ros::Subscriber depth = n.subscribe("/camera/depth/image", 1, depthCallback);

        ros::spin();
        return 0;
    }
I have a bag file (it's a short video with two men walking by) and I have to make the code read the bag file and print out the 3D position of the people in the scene.
Any help is very much appreciated. Thank you in advance.
↧
TF error even after setting --clock and use_sim_time
Hello!
I am playing a rosbag after setting `rosparam set use_sim_time true` and then `rosbag play --clock 2017-03-22-08-09-56.bag -l`.
I am still getting this error:
TF_OLD_DATA ignoring data from the past for frame right_upper_forearm_visual at time 1.49018e+09 according to authority unknown_publisher
When running RViz, I can see the images from the rosbag, but they are not continuous: frames go missing and I am not getting the entire visualization. Any idea as to what could be going wrong?
↧
↧
Can I use the Python Ros bag API without making a workspace?
I'm using roscpp, but I'd like to write a portable Python script for working with bag files. However, the docs only tell me how to do it through a catkin workspace. Is there a way to import the necessary libraries without first making a workspace?
I'm referring to the error mentioned [in this post](http://answers.ros.org/question/12307/no-module-named-rosbag-error/)
The proposed solution was to change a manifest file. I'd like to be able to do this without going through a workspace so I can share the script with others.
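For what it's worth, the kind of thing I'm hoping is possible is a plain script that only points at the system-wide ROS installation — a sketch, where the install path assumes a standard Indigo install and the bag name is a placeholder:

    #!/usr/bin/env python
    # Sketch: import rosbag from the system ROS install without a workspace.
    # Sourcing /opt/ros/<distro>/setup.bash normally sets this path already;
    # appending it manually is only needed when the environment isn't sourced.
    import sys
    sys.path.append('/opt/ros/indigo/lib/python2.7/dist-packages')

    import rosbag

    bag = rosbag.Bag('example.bag')
    for topic, msg, t in bag.read_messages():
        print topic, t.to_sec()
    bag.close()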
↧
Are there any third party tools for fixing rosbag files?
We just carried out an experiment in which 15 minutes of raw depth images were recorded, at an approximate rate of 1 GB/min. The resulting bag file looks the correct size, but 'rosbag info' shows that no messages were recorded. For example:
asymingt@asymingt:~/Desktop/SSLA$ rosbag info JSC\ First\ Working\ Test\ Picoflex_2017-02-28-11-55-22.bag
path: JSC First Working Test Picoflex_2017-02-28-11-55-22.bag
version: 2.0
size: 14.6 GB
asymingt@asymingt:~/Desktop/SSLA$
In fact, no start or end date is shown either; only the size and bag version are printed. I have tried the rosbag fix, check and reindex commands without luck. This is the first time we have seen this behaviour and, unluckily for us, it affects an experiment that cannot easily be repeated.
Are there any third-party tools for fixing or extracting data from a corrupted bag file?
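Failing a dedicated tool, would a salvage script along these lines be worth trying? A sketch that copies whatever messages are still readable into a new bag and stops at the first unreadable chunk (file names are placeholders):

    #!/usr/bin/env python
    # Sketch: copy every readable message out of a damaged bag.
    # allow_unindexed lets the reader open a bag whose index was never written.
    import rosbag

    src = rosbag.Bag('broken.bag', 'r', allow_unindexed=True)
    dst = rosbag.Bag('salvaged.bag', 'w')
    saved = 0
    try:
        for topic, msg, t in src.read_messages():
            dst.write(topic, msg, t)
            saved += 1
    except Exception as e:
        print 'stopped after %d messages: %s' % (saved, e)
    finally:
        src.close()
        dst.close()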
↧
Converting a bag with uncompressed Images to CompressedImages
I have an offline program that writes sensor_msgs/Image messages with bag.write, and I would rather compress them into a smaller format (e.g. JPEG) for easier streaming; compressing the bags with bz2 or lz4 is insufficient. How do I do this offline (without using compressed image transport and publishing/subscribing), in C++ (for initial creation) and/or Python (as a post-processing script on existing bags)?
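For the Python post-processing part, this is roughly what I have in mind — a sketch that rewrites a bag, re-encoding every sensor_msgs/Image as a JPEG sensor_msgs/CompressedImage (the file names are placeholders, and only bgr8-compatible images are handled):

    #!/usr/bin/env python
    # Sketch: rewrite a bag, replacing Image messages with JPEG CompressedImage.
    import cv2
    import rosbag
    from cv_bridge import CvBridge
    from sensor_msgs.msg import CompressedImage

    bridge = CvBridge()
    src = rosbag.Bag('input.bag')
    dst = rosbag.Bag('output_compressed.bag', 'w')

    for topic, msg, t in src.read_messages():
        if msg._type == 'sensor_msgs/Image':
            cv_img = bridge.imgmsg_to_cv2(msg, desired_encoding='bgr8')
            ok, buf = cv2.imencode('.jpg', cv_img)
            out = CompressedImage(header=msg.header, format='jpeg',
                                  data=buf.tostring())
            dst.write(topic + '/compressed', out, t)
        else:
            dst.write(topic, msg, t)  # pass everything else through unchanged

    src.close()
    dst.close()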
↧
Why am I getting a nullptr from my bag file?
I am playing back a bag file and I get a nullptr where I should be getting actual data.
Below is a screenshot of the bag file being viewed from rqt. There is only one message on "decomposition/sectors" and I am certain that it should have data. Yet, when I play back the bag file in C++ code for a unit test, I get a nullptr for this particular item. Why am I getting this? The other messages are read and instantiated properly, but not this one.
![image description](/upfiles/14909806363918583.png)
Here is a snippet of code which shows how I am reading this:
    void CoverageReporterTester::SendMessage(const rosbag::MessageInstance& bag_msg)
    {
        if (topic_name == avidbots_topics::decomposition_sectors)
        {
            message_stack::sector_array msg1 = *bag_msg.instantiate<message_stack::sector_array>();
            SendMessage(topic_name, msg1);
        }
    }

    /* Read bag file */
    rosbag::Bag bag_file;
    std::string bag_dir = ros::package::getPath("coverage_reporter") +
                          "/test/data/bag_files/coverage_reporter.bag";
    bag_file.open(bag_dir, rosbag::bagmode::Read);

    /* List of topics we are interested in */
    std::vector<std::string> topics;
    topics.push_back(topics::decomposition_sectors);
    topics.push_back(avidbots_topics::static_3d_costmap_topic);
    rosbag::View view(bag_file, rosbag::TopicQuery(topics));

    /* Loop through each message in the bag file which is associated with one of the topics listed above */
    foreach(rosbag::MessageInstance const bag_msg, view)
    {
        coverage_reporter.SendMessage(bag_msg);
    }
↧
↧
Rosbag API and TF transforming tools
I am currently writing some tools in Python to visually judge the quality of my odometry. For this I record a bag with the odom topic as well as tf, lidar scans and wheel encoders.
Then I loop over the bag data, read all poses from odom, and project the lidar scans to Cartesian coordinates in the odom frame using the odom pose with the **closest timestamp**.
My odom is published at 100 Hz while the lidar scans are published at around 10 Hz, so in the worst case the time offset will be 5 ms.
I then plot all lidar scans for all poses and see how well they overlap (similar to setting the decay_time of the lidar scans to a very large value in Rviz).
Here is the result:
![image description](/upfiles/14919904558154573.png)
Compared to Rviz (with big decay time)
![image description](/upfiles/14919904846373419.png)
I noticed that my plot and the data displayed in Rviz (with a decay time covering the whole length of the bag while doing rosbag play) look slightly different.
Especially when zooming on known features, for example the half circle:
My plot:
![image description](/upfiles/14919905467122914.png)
Rviz:
![image description](/upfiles/14919905625225916.png)
The difference becomes very clear.
When looking at the source code of Rviz I saw that it just converts the lidar scan to a point cloud and then uses TF to transform it into the correct frame.
My question is: how can I do the same thing without having to play the bag, using only the rosbag API in Python? I suspect that using the closest timestamp is the source of error. Does TF interpolate to get the exact pose?
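For concreteness, this is the direction I was thinking of — a sketch that fills a tf2 buffer from the bag's /tf messages and then looks up an interpolated transform at each scan stamp, where the bag name, topic names and frame names are placeholders:

    #!/usr/bin/env python
    # Sketch: offline TF lookup from a bag, without rosbag play.
    import rosbag
    import rospy
    import tf2_ros

    bag = rosbag.Bag('run.bag')
    buf = tf2_ros.Buffer(cache_time=rospy.Duration(3600), debug=False)

    # first load every transform from the bag into the buffer
    for topic, msg, _t in bag.read_messages(topics=['/tf', '/tf_static']):
        for tr in msg.transforms:
            if topic == '/tf_static':
                buf.set_transform_static(tr, 'bag')
            else:
                buf.set_transform(tr, 'bag')

    # then query an interpolated odom pose at each scan timestamp
    for _topic, scan, _t in bag.read_messages(topics=['/scan']):
        tf = buf.lookup_transform('odom', scan.header.frame_id,
                                  scan.header.stamp)
        # ... project the scan ranges using tf.transform ...

    bag.close()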
↧
Compressed camera images from rosbag to matlab
I have a bag file containing lots of sensor messages, including compressed camera images. I would like to use this stream of camera images in MATLAB.
I know of two possible approaches: either export all images to JPG and load those into MATLAB, or use [ROSMATLAB](http://www.mathworks.nl/hardware-support/robot-operating-system.html).
I think the first approach would involve creating a Python script to collect the camera images from the `image_transport` node `republish`. This seems to involve an awful lot of decompressing and re-compressing (when saving back to JPG), so I'm looking for a way to collect the images directly from the bag file and save them. I believe the compressed images in the bag file are JPEG compressed, so it should be possible to dump them directly.
The second approach would also involve the `republish` node, I think, since ROSMATLAB does not yet seem to support image transport. Also, I've run into problems collecting the images on the subscriber side. I've seen [this](https://groups.google.com/d/msg/ros-sig-matlab/y8xaZ_OWNlM/Vj2CKvc7O0QJ) great tutorial on creating an image publisher, but I found it hard to reverse. I've also seen [this](http://answers.ros.org/question/164831/depth-image-stream-from-kinect-in-matlab-with-ros-io-package/) question on ROS Answers, which does a good job showing some of the basics, but I couldn't replicate it.
Long story short: can anyone provide a best practice to manage images from a rosbag in MATLAB?
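As a reference for the first approach, this is the kind of direct dump I had in mind — a sketch that writes each sensor_msgs/CompressedImage payload straight to a .jpg file with no re-encoding (the bag name and topic name are placeholders):

    #!/usr/bin/env python
    # Sketch: dump JPEG-compressed image payloads from a bag to files.
    import rosbag

    bag = rosbag.Bag('recording.bag')
    msgs = bag.read_messages(topics=['/camera/image_raw/compressed'])
    for i, (_topic, msg, _t) in enumerate(msgs):
        with open('frame%05d.jpg' % i, 'wb') as f:
            f.write(msg.data)  # the data field already holds JPEG bytes
    bag.close()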
↧
My rosbag is not showing any messages
Hi all!
I am trying to record a rosbag at the click of a button on Baxter. At each click, a new bag should start recording with a different file name and the previous one should close. Below is the code implementing my idea, but I am getting an error.
    #!/usr/bin/env python
    import rosbag
    import rospy
    from random import randint
    from tf2_msgs.msg import TFMessage
    from baxter_core_msgs.msg import DigitalIOState
    from std_msgs.msg import String
    from sensor_msgs.msg import Image
    from sensor_msgs.msg import JointState
    from baxter_core_msgs.msg import EndEffectorState
    from create_folder import FileRead  # reads a number from a text file

    flag = 1  # initialized as global variable
    print flag
    tfMsg = TFMessage()
    num = FileRead()  # reads a number
    print num[0]
    bag = rosbag.Bag("/home/dhiraj/ros_ws/src/baxter_examples/scripts/"+str(num[0])+".bag", 'w')

    def callbackStop(data):
        if data.state:  # If the button is pressed
            global flag
            if flag == 1:  # flag = 1 means bag is still recording
                print "closing bag"
                bag.close()
                flag = 0  # stop recording
                print "CLOSING flag value is", flag

    def callbackRestart(data):
        global flag
        if flag == 1:  # record bag only if flag is 1
            print "RECORDING NEW BAG,with flag value", flag
            bag.write("tf", tfMsg)
            bag.write("/cameras/right_hand_camera/image", Image())
            bag.write("/cameras/left_hand_camera/image", Image())
            bag.write("/robot/joint_states", JointState())
            bag.write("/robot/end_effector/right_gripper/state", EndEffectorState())
            bag.write("/robot/end_effector/left_gripper/state", EndEffectorState())
            bag.write("/softkinetic_left/rgb/image_color", Image())
            bag.write("/softkinetic_right/rgb/image_color", Image())
            bag.write("/softkinetic_left/depth/image_raw", Image())
            bag.write("/softkinetic_right/depth/image_raw", Image())
            bag.write("/camera/rgb/image_color", Image())
            bag.write("/camera/depth/image_raw", Image())

    def tf_info():
        print "inside tf info"
        rospy.init_node("get_tf")
        restart = rospy.Subscriber("/robot/digital_io/right_shoulder_button/state", DigitalIOState, callbackRestart)
        bagClose = rospy.Subscriber("/robot/digital_io/right_itb_button2/state", DigitalIOState, callbackStop)
        rospy.spin()

    if __name__ == '__main__':
        tf_info()
OUTPUT:
    RECORDING NEW BAG,with flag value 1
    RECORDING NEW BAG,with flag value 1
    RECORDING NEW BAG,with flag value 1
    closing bag
    RECORDING NEW BAG,with flag value 1
    [ERROR] [WallTime: 1492468923.696646] bad callback:
    Traceback (most recent call last):
      File "/opt/ros/indigo/lib/python2.7/dist-packages/rospy/topics.py", line 720, in _invoke_callback
        cb(msg)
      File "/home/dhiraj/ros_ws/src/baxter_examples/scripts/rosbag_recorder.py", line 32, in callbackStop
        bag.close()
      File "/opt/ros/indigo/lib/python2.7/dist-packages/rosbag/bag.py", line 417, in close
        self._close_file()
      File "/opt/ros/indigo/lib/python2.7/dist-packages/rosbag/bag.py", line 1201, in _close_file
        self._file.close()
    IOError: close() called during concurrent operation on the same file object.
**EDIT**
**The problem got solved when I changed the callback to:**

    def callbackStop(data):
        if data.state:
            global flag
            if flag == 1:
                flag = 0
                print "closing bag, with value of flag", flag
                bag.close()
This means my callback was exiting right after "bag.close()" in the original version of the function, which is why it was not setting flag = 0. Making the change shown just above solved the problem.
I don't understand why my previous callbackStop was exiting after bag.close() without completing the rest of the function. Can you please explain?
↧