I recorded a rosbag and tried to visualize it in Rviz. I get the following message `MessageFilter [target=left_hand ]: Discarding message from [unknown_publisher] due to empty frame_id. This message will only print once.` On doing a rostopic echo, I can see that the frame_id is empty. I ran `rosrun tf view_frames` to generate the pdf.
How do I rewrite the header frame id?
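A minimal sketch of doing that with the rosbag Python API (the file names are placeholders, and 'left_hand' is assumed to be the frame you want, per the RViz target above):

    #!/usr/bin/env python
    # Sketch: copy a bag while filling in empty frame_ids.
    # Bag names, topic selection, and the frame are assumptions -- adjust to your setup.
    import rosbag

    with rosbag.Bag('fixed.bag', 'w') as outbag:
        for topic, msg, t in rosbag.Bag('input.bag').read_messages():
            # Only messages that carry a std_msgs/Header have a frame_id.
            if hasattr(msg, 'header') and msg.header.frame_id == '':
                msg.header.frame_id = 'left_hand'   # assumed target frame
            outbag.write(topic, msg, t)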
↧
How to rewrite the frame_id in a rosbag?
↧
My rosbag info shows messages, but my ros msgs are empty.
Below is my code snippet to record rostopics using the rosbag API. My messages are empty when I do a rostopic echo. I am not sure what is happening.
My bag is being recorded, and `rosbag info 17Apr17.bag` gives me this output:
path: 17Apr17.bag
version: 2.0
duration: 8.9s
start: Apr 17 2017 18:57:55.21 (1492469875.21)
end: Apr 17 2017 18:58:04.08 (1492469884.08)
size: 1022.8 KB
messages: 10656
compression: none [2/2 chunks]
types: baxter_core_msgs/EndEffectorState [ade777f069d738595bc19e246b8ec7a0]
sensor_msgs/Image [060021388200f6f0f447d0fcd9c64743]
sensor_msgs/JointState [3066dcd76a6cfaef579bd0f34173e9fd]
tf2_msgs/TFMessage [94810edda583a504dfda3829e70d7eec]
topics: /camera/depth/image_raw 888 msgs : sensor_msgs/Image
/camera/rgb/image_color 888 msgs : sensor_msgs/Image
/cameras/left_hand_camera/image 888 msgs : sensor_msgs/Image
/cameras/right_hand_camera/image 888 msgs : sensor_msgs/Image
/robot/end_effector/left_gripper/state 888 msgs : baxter_core_msgs/EndEffectorState
/robot/end_effector/right_gripper/state 888 msgs : baxter_core_msgs/EndEffectorState
/robot/joint_states 888 msgs : sensor_msgs/JointState
/softkinetic_left/depth/image_raw 888 msgs : sensor_msgs/Image
/softkinetic_left/rgb/image_color 888 msgs : sensor_msgs/Image
/softkinetic_right/depth/image_raw 888 msgs : sensor_msgs/Image
/softkinetic_right/rgb/image_color 888 msgs : sensor_msgs/Image
tf 888 msgs : tf2_msgs/TFMessage
CODE:
#!/usr/bin/env python
import rosbag
import rospy
from random import randint
from tf2_msgs.msg import TFMessage
from baxter_core_msgs.msg import DigitalIOState
from std_msgs.msg import String
from sensor_msgs.msg import Image
from sensor_msgs.msg import JointState
from baxter_core_msgs.msg import EndEffectorState
from create_folder import FileRead

flag = 1
print flag
tfMsg = TFMessage()
num = FileRead()
print num[0]
bag = rosbag.Bag("/home/dhiraj/ros_ws/src/baxter_examples/scripts/" + str(num[0]) + ".bag", 'w')

# flag = 0 will not record the file (callbackStop) and flag = 1 will record the file.
# I need to record the rosbag at the click of the buttons, hence this setup.

def callbackStop(data):
    if data.state:
        global flag
        if flag == 1:
            print "closing bag"
            flag = 0
            print "closing bag with flag", flag
            bag.close()

def callbackRestart(data):
    if data.state == 0:
        print "ENTERING CALLBACK"
        global flag
        if flag == 1:
            print "RECORDING NEW BAG"
            # Note: each write below stores a freshly constructed message
            # (e.g. Image()), not data received from the corresponding topic.
            bag.write("tf", tfMsg)
            bag.write("/cameras/right_hand_camera/image", Image())
            bag.write("/cameras/left_hand_camera/image", Image())
            bag.write("/robot/joint_states", JointState())
            bag.write("/robot/end_effector/right_gripper/state", EndEffectorState())
            bag.write("/robot/end_effector/left_gripper/state", EndEffectorState())
            bag.write("/softkinetic_left/rgb/image_color", Image())
            bag.write("/softkinetic_right/rgb/image_color", Image())
            bag.write("/softkinetic_left/depth/image_raw", Image())
            bag.write("/softkinetic_right/depth/image_raw", Image())
            bag.write("/camera/rgb/image_color", Image())
            bag.write("/camera/depth/image_raw", Image())

def tf_info():
    rospy.init_node("get_tf")
    rightStart = rospy.Subscriber("/robot/digital_io/right_shoulder_button/state", DigitalIOState, callbackRestart)
    bagCloseRight = rospy.Subscriber("/robot/digital_io/right_itb_button2/state", DigitalIOState, callbackStop)
    bagCloseLeft = rospy.Subscriber("/robot/digital_io/left_itb_button2/state", DigitalIOState, callbackStop)
    # callbackSetFlag is not shown in this snippet
    leftRestart = rospy.Subscriber("/robot/digital_io/left_itb_button1/state", DigitalIOState, callbackSetFlag)
    rightRestart = rospy.Subscriber("/robot/digital_io/right_itb_button1/state", DigitalIOState, callbackSetFlag)
    rospy.spin()

if __name__ == '__main__':
    tf_info()
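For comparison, a minimal sketch (not the original setup; the topic, message type, and output file are placeholders) of writing the messages actually received from a topic into the bag, rather than freshly constructed empty ones:

    #!/usr/bin/env python
    # Sketch: write received messages (not empty constructors) into the bag.
    import rosbag
    import rospy
    from sensor_msgs.msg import JointState

    bag = rosbag.Bag('example.bag', 'w')   # hypothetical output file

    def joint_state_cb(msg):
        # msg is the populated message received from the topic
        bag.write('/robot/joint_states', msg)

    rospy.init_node('bag_writer_sketch')
    rospy.Subscriber('/robot/joint_states', JointState, joint_state_cb)
    rospy.on_shutdown(bag.close)
    rospy.spin()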
↧
↧
Data saving rate drops when recording a rosbag
I am trying to save RGB and depth images from 5 different sensors while also recording a rosbag from them. Whenever I try to record the rosbag, the rate at which images are saved drops from 30 Hz to 7 Hz. Without the rosbag, it looks fine. How can I handle this situation?
Should I try recording the rosbag on a different system, or should I selectively sample the data saved in the rosbag?
Any ideas?
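One way to selectively sample is to republish a throttled copy of each heavy topic and record only the throttled copies; ROS also ships `topic_tools throttle` for exactly this. A rough rospy sketch, with placeholder topic names and rate:

    #!/usr/bin/env python
    # Sketch: republish an image topic at a reduced rate so that only the
    # throttled copy is recorded in the bag.
    import rospy
    from sensor_msgs.msg import Image

    class Throttle(object):
        def __init__(self, in_topic, out_topic, period):
            self.pub = rospy.Publisher(out_topic, Image, queue_size=1)
            self.last = rospy.Time(0)
            self.period = rospy.Duration(period)
            rospy.Subscriber(in_topic, Image, self.cb, queue_size=1)

        def cb(self, msg):
            now = rospy.Time.now()
            if now - self.last >= self.period:
                self.last = now
                self.pub.publish(msg)

    rospy.init_node('image_throttle')
    Throttle('/camera/rgb/image_color', '/camera/rgb/image_color_throttled', 0.2)  # ~5 Hz
    rospy.spin()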
↧
Problem while Running GMapping
I have modified the GMapping code from OpenSLAM.org, and it compiles correctly. Now when I run the gmapping package with the bag file created from the Intel dataset, in the following sequence, it gives warnings followed by a strange error.
Sequence of command execution,
roscore
rosparam set use_sim_time true
rosrun tf static_transform_publisher 0 0 0 0 0 0 odom base_link 100
rosbag play --clock intel.bag
rosrun gmapping slam_gmapping scan:=laser
The warning messages and the error are as follows:
[ INFO] [1493025756.604426852, 976052875.616820005]: Initialization complete
update frame 0
update ld=0 ad=0
Laser Pose= 0 0 -0.002458
m_count 0
Rimita is registering First Scan
[ WARN] [1493025756.610961272, 976052875.626896914]: TF to MSG: Quaternion Not Properly Normalized
Laser error: processScan Dysfunction
Laser error: processScan Dysfunction
Laser error: processScan Dysfunction
Laser error: processScan Dysfunction
I do not understand how to solve this error; please suggest a solution if anyone has faced the same issue.
Thank you.
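For reference, the normalization warning only means that a quaternion in the incoming transform/odometry data does not have unit norm. A rough illustration (not tied to gmapping internals) of checking and renormalizing one:

    # Sketch: renormalize a quaternion (x, y, z, w) so that its norm is 1.
    import math

    def normalize(x, y, z, w):
        n = math.sqrt(x*x + y*y + z*z + w*w)
        return (x/n, y/n, z/n, w/n)

    print normalize(0.0, 0.0, 0.002, 1.0)   # values rescaled to unit norm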
↧
Convert CARMEN log file to rosbag
This is more of a question of 'Which docs should I read'.
I have some CARMEN log files which I need to convert to rosbag. I have read about the bag and msg formats; however, I do not quite understand how to store information from the original file in the bag format.
So, for example, I have a line:
ODOM 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 1202839366.707501 lakebodom 0.128184
The format for ODOM is already known, and it is:
ODOM x y theta tv rv accel
As I understand, I can store messages in bags, and messages contain entries in such format:
For example, `int64 var_name`.
So I could probably fill the msg for this particular record with stuff like `float32 x_a` and so on, but how do I assign values? How do I store information that `x_a` is 0.000000?
I also don't understand this: bags can contain a whole lot of different records, such as chunks, connections, message data, etc., but how do I know what to use and what not to use? As of now, I can only imagine writing a lot of messages.
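As a rough sketch of the write side (assuming nav_msgs/Odometry as the target message type and the field order above; the frame names, topic, and output file are placeholders), you assign the parsed values to the message's fields and then call bag.write; the chunk/connection records are handled by the rosbag API for you:

    #!/usr/bin/env python
    # Sketch: convert one CARMEN "ODOM x y theta tv rv accel timestamp ..." line
    # into a nav_msgs/Odometry message and append it to a bag.
    import rosbag
    import rospy
    from nav_msgs.msg import Odometry
    from tf.transformations import quaternion_from_euler

    line = "ODOM 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 1202839366.707501 lakebodom 0.128184"
    fields = line.split()
    x, y, theta, tv, rv, accel = [float(v) for v in fields[1:7]]
    stamp = rospy.Time.from_sec(float(fields[7]))

    odom = Odometry()
    odom.header.stamp = stamp
    odom.header.frame_id = 'odom'            # assumed frame names
    odom.child_frame_id = 'base_link'
    odom.pose.pose.position.x = x
    odom.pose.pose.position.y = y
    qx, qy, qz, qw = quaternion_from_euler(0.0, 0.0, theta)
    odom.pose.pose.orientation.x = qx
    odom.pose.pose.orientation.y = qy
    odom.pose.pose.orientation.z = qz
    odom.pose.pose.orientation.w = qw
    odom.twist.twist.linear.x = tv
    odom.twist.twist.angular.z = rv

    with rosbag.Bag('carmen.bag', 'w') as bag:
        bag.write('/odom', odom, stamp)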
↧
↧
Bug? Rosbag filter and /tf_static
Hi all
I am using a .bag file with RTAB-map ROS. It uses some static transforms between different camera frames to do the mapping.
When I feed "data.bag" into RTAB-map, it works fine.
Now, I create a dummy filtered rosbag, using:
rosbag filter data.bag data_filtered.bag "topic != '/thistopicdoesntexist'"
As this topic doesn't exist, it should create an identical .bag file. However, when I try to feed "data_filtered.bag" into RTAB-map, I get many errors like:
[ WARN] [1494031985.023853640, 1493939596.985478852]: Could not get transform from camera_depth to camera_color after 0.100000 seconds (for stamp=1493939594.615704)! Error=". canTransform returned after 0.10093 timeout was 0.1.".
1. It seems that the filtered bag somehow is missing data from the static TF transforms. When I "rostopic echo -b data_filtered.bag", however, it shows that the static transforms are there.
2. It seems that the dummy filtering actually alters the .bag. Both MD5 hash and file size are changed between "data.bag" and "data_filtered.bag"! No compression is used. Why is that?
Maybe the problem is related to the time stamp? In any case, I am using use_sim_time=true, and I am also using --clock when playing back the .bag files.
Any help would be appreciated.
Cheers,
- ibd
↧
How to extract images from *.bag after image_proc?
I am trying to create a pipeline in a .launch file in the following way:
1. Decompress image
2. Apply image_proc
3. Extract it
Contents of the .launch file is the following:
https://pastebin.com/8qtJLK2w
As a result, some files are saved after image_proc and some are not. My guess is that the nodes run simultaneously and sometimes the extraction executes before image_proc is applied. However, I am just a novice in ROS.
Could you please help me fix my launch file to achieve my goal?
↧
Control the frequency of saving data of rosbag record?
Is it possible to control the frequency of saving data with rosbag record? I want it to collect the data at a frequency of exactly 100 Hz, not more or less.
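rosbag record itself just saves whatever arrives, so one option, sketched below with placeholder topic and type names, is to cache the latest message in a subscriber, republish it from a fixed 100 Hz loop, and record the republished topic:

    #!/usr/bin/env python
    # Sketch: republish the most recent message at a fixed 100 Hz so that
    # "rosbag record /data_100hz" stores data at that rate.
    import rospy
    from std_msgs.msg import Float64   # placeholder message type

    latest = {'msg': None}

    def cb(msg):
        latest['msg'] = msg

    rospy.init_node('fixed_rate_relay')
    pub = rospy.Publisher('/data_100hz', Float64, queue_size=10)
    rospy.Subscriber('/data', Float64, cb)

    rate = rospy.Rate(100)
    while not rospy.is_shutdown():
        if latest['msg'] is not None:
            pub.publish(latest['msg'])
        rate.sleep()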
↧
How to change different topics pace in rosbag?
I have 2 topics (A and B) in a rosbag, but they are not synchronized: A is 10 s ahead of B. So I want to create a new rosbag and delay A by 10 s to synchronize it with B.
How should I do that?
Should I change the header.stamp to achieve it?
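Yes, shifting both the bag time and the header.stamp of topic A while copying the bag is one way to do it. A minimal sketch, assuming the topics are literally named /A and /B, the file names are placeholders, and a 10 s shift:

    #!/usr/bin/env python
    # Sketch: copy a bag, delaying topic /A by 10 seconds (bag time and header.stamp).
    import rosbag
    import rospy

    shift = rospy.Duration(10.0)

    with rosbag.Bag('synced.bag', 'w') as outbag:
        for topic, msg, t in rosbag.Bag('input.bag').read_messages():
            if topic == '/A':
                t = t + shift
                if hasattr(msg, 'header'):
                    msg.header.stamp = msg.header.stamp + shift
            outbag.write(topic, msg, t)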
↧
↧
Why are the maps generated from a rosbag (offline mode) better than those generated live?
Hi!!!
I've been using GMapping and HectorSLAM to generate maps live and from a rosbag file (offline mode) that stored the messages of the /scan and /tf topics. I can't understand why the maps generated using a rosbag are better than the ones generated live. I hope someone can explain this to me, please.
Thanks in advance!
↧
Can rosbag 'buffer exceeded' error be overcome by increasing the RAM?
I am trying to record a number of topics which include the depth map from a stereo camera, color images from the camera, IMU data, and GPS data. I get the warning '*buffer exceeded*' after storing around 4 bag files. My RAM size is 16 GB. Will increasing the RAM (to, say, 32 GB) and then increasing the buffer size help me get over this problem?
The following is the command I use to record the topics:
rosbag record -O --buffsize=15000 --split --duration=5s
↧
Error while using kitti2bag
I am trying to convert the KITTI dataset to a rosbag file using https://github.com/tomas789/kitti2bag/tree/gh-pages. However, when I use the `kitti2bag` command, the following error is displayed:
Traceback (most recent call last):
File "/usr/local/bin/kitti2bag", line 12, in
import tf
File "/opt/ros/kinetic/lib/python2.7/dist-packages/tf/__init__.py", line 28, in
from _tf import *
ImportError: No module named '_tf'
I tried changing `_tf` to `tf` in the `__init__.py` file, but then it displays:
File "/usr/local/bin/kitti2bag", line 12, in
import tf
File "/opt/ros/kinetic/lib/python2.7/dist-packages/tf/__init__.py", line 28, in
from listener import *
ImportError: No module named 'listener'
I can't figure out how to go about this. Any and all help will be appreciated.
↧
I ran into dependency problem with building catkin_ws for kitti_to_rosbag package
I am provided with a KITTI dataset at http://kitti.is.tue.mpg.de/kitti/raw_data/2011_09_26_drive_0001/2011_09_26_drive_0001_extract.zip. I have to convert this dataset into a rosbag using the package https://github.com/ethz-asl/kitti_to_rosbag. This package requires the dependencies listed in https://github.com/ethz-asl/kitti_to_rosbag/blob/master/kitti_to_rosbag/package.xml. Cloning the kitti_to_rosbag package into my ~/cw/src and running catkin_make in my catkin workspace produces the following error:
> CMake Error at kitti_to_rosbag/kitti_to_rosbag/CMakeLists.txt:4 (find_package):
> By not providing "Findcatkin_simple.cmake" in CMAKE_MODULE_PATH this project has asked CMake to find a package configuration file provided by "catkin_simple", but CMake did not find one.
> Could not find a package configuration file provided by "catkin_simple" with any of the following names: catkin_simpleConfig.cmake and catkin_simple-config.cmake
I followed the following commands:
> cd ~/cw/src
> git clone https://github.com/ethz-asl/kitti_to_rosbag
> cd ..
> catkin_make
In an attempt to rectify this error, I also cloned all the dependent packages (catkin_simple, cv_bridge, eigen_checks, ...) listed in the package.xml for kitti_to_rosbag into my catkin/src folder. However, it gives this error:
> No rule to make target '/home/user/catkin_ws/devel/lib/libgflags.so', needed by '/home/user/catkin_ws/devel/lib/libeigen_checks.so
Any help would be greatly appreciated. Since I have a deadline for this task, I am open to alternative packages/methods.
↧
↧
Bag file empty when recording
I use `rosbag play --clock` together with `use_sim_time=true` to publish recorded sensor information. The information is used in my nodes, which publish new topics that other nodes should subscribe to.
I would like to record that output, but `rosbag record` does not record any of the active topics. Even specifying **all** topics doesn't help.
Is there a solution to record the outputs of my nodes?
↧
Is there a way to change the /tf in a rosbag?
I currently have a rosbag with a /tf topic containing only static transforms. Is there a way I can alter one of the transforms?
I have tried creating a new filtered rosbag not containing the /tf and generating the /tf from a launch file. But I get issues when trying to display the pointclouds in rviz.
I get the error "Message removed because it is too old" when trying to display PointCloud2 topics.
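An alternative to stripping /tf and republishing it from a launch file (which can lead to stamp mismatches like that "too old" error) is to rewrite the bag and edit the transform in place. A minimal sketch, with placeholder file and frame names and an example new translation:

    #!/usr/bin/env python
    # Sketch: copy a bag while editing one static transform carried on /tf.
    import rosbag

    with rosbag.Bag('edited.bag', 'w') as outbag:
        for topic, msg, t in rosbag.Bag('input.bag').read_messages():
            # the topic may be recorded as 'tf' or '/tf' depending on the recorder
            if topic in ('/tf', 'tf'):
                for tr in msg.transforms:
                    # placeholder frames: pick out the transform you want to change
                    if tr.header.frame_id == 'base_link' and tr.child_frame_id == 'camera_link':
                        tr.transform.translation.z = 0.50
            outbag.write(topic, msg, t)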
↧
rosbag record performance issues
Hello ROS community,
We are running our ROS system on a 6-core multiprocessor machine (8GB RAM, 32GB eMMC storage, 64bit ubuntu Linux) and **experience significant load increase when turning on ROS recording**.
In detail, **without recording** our system has roughly a **load between 1 and 2**. Once we **turn on recording** (rosbag record -a), our **load climbs up to roughly 5 or even 6**.
The amount of data we log is roughly 200 KB/sec and mostly consists of *float32* values. This is not insignificant, but I am still surprised that recording has such a high impact.
Since we do not fully max out our CPUs, I assume the load is mostly related to I/O?
*My questions:*
- Given the amount of data we log, is this load profile "normal"? Or are there any tips to improve this?
- If this can hardly be improved, would it make sense to isolate the recording process on a separate machine to protect our core robotics system from slowing down?
Thanks a lot for reading.
↧
Display CompressedDepth Image Python cv2
Folks,
What is the correct way to subscribe to a CompressedDepth image (32FC1; no regular depth image is available) and display it (e.g., with cv2.imshow) in Python? I can do this easily in roscpp, but I get an empty array in rospy, even if I use something like:
depth_image = cv_bridge.CvBridge().compressed_imgmsg_to_cv2(msg)
Or
depth_image = np.array(cv_bridge.CvBridge().compressed_imgmsg_to_cv2(msg))
And even:
np_arr = np.fromstring(msg.data, np.uint8)
image_np = cv2.imdecode(np_arr, cv2.CV_LOAD_IMAGE_UNCHANGED)
Nothing works.
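One pattern that sometimes helps, sketched below under the assumption that the topic is published by compressed_depth_image_transport (which, in common versions, prepends a small 12-byte config header before the PNG payload), is to strip that header and decode the rest with OpenCV. Note that for a 32FC1 source the PNG holds quantized inverse depth, so recovering metric depth would additionally need the two floats stored in that header:

    # Sketch: decode a sensor_msgs/CompressedImage from a .../compressedDepth topic.
    # Assumes the compressed_depth_image_transport layout: a 12-byte config header
    # followed by PNG data; the offset may differ between versions.
    import numpy as np
    import cv2

    def compressed_depth_cb(msg):
        header_bytes = 12
        raw = np.frombuffer(msg.data[header_bytes:], dtype=np.uint8)
        depth = cv2.imdecode(raw, cv2.IMREAD_UNCHANGED)   # quantized 16-bit image for 32FC1 sources
        if depth is not None:
            cv2.imshow('depth', depth)
            cv2.waitKey(1)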
↧
↧
Possible to plot data live from a rosbag?
I want to take the data from a rosbag live and display it on a graph with respect to time. I googled it, but did not find any solution. I know how to read data from and write data to a rosbag. I want to display the graph while the rosbag play command is executing.
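As a sketch (message type, topic, and field are placeholders), a small rospy node can subscribe while `rosbag play` runs and update a matplotlib line over time; rqt_plot does the same interactively:

    #!/usr/bin/env python
    # Sketch: plot a numeric topic live while "rosbag play" republishes it.
    import rospy
    from std_msgs.msg import Float64      # placeholder message type
    import matplotlib.pyplot as plt

    times, values = [], []

    def cb(msg):
        times.append(rospy.get_time())
        values.append(msg.data)

    rospy.init_node('live_plot')
    rospy.Subscriber('/speed', Float64, cb)   # placeholder topic

    plt.ion()
    fig, ax = plt.subplots()
    line, = ax.plot([], [])
    while not rospy.is_shutdown():
        line.set_data(times, values)
        ax.relim()
        ax.autoscale_view()
        plt.pause(0.1)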
↧
How can I use bag file data in Matlab?
I have data in a bag file, such as velocity and acceleration from the sensors.
I want to read the data with respect to the time domain to make calculations in MATLAB.
How can I get the data and use it in MATLAB?
Thanks for your answers.
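One route, sketched below with placeholder file, topic, and field names, is to export the fields you need to CSV with the rosbag Python API and load the CSV in MATLAB; recent MATLAB releases with the ROS/Robotics System Toolbox can also read .bag files directly via their rosbag function.

    #!/usr/bin/env python
    # Sketch: dump time-stamped fields from a bag to CSV for use in MATLAB.
    import csv
    import rosbag

    with rosbag.Bag('input.bag') as bag, open('twist.csv', 'w') as f:
        writer = csv.writer(f)
        writer.writerow(['time', 'linear_x', 'angular_z'])
        for topic, msg, t in bag.read_messages(topics=['/odom']):
            writer.writerow([t.to_sec(),
                             msg.twist.twist.linear.x,
                             msg.twist.twist.angular.z])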
↧
Can I record the data of the latest several seconds to a .bag?
Hi,
I would like to record the data of only the latest several seconds, but I cannot find any way to do that.
I think what I need is a queue. Say there is a topic at 1 Hz, and I want to record the latest three seconds. The messages of the first three seconds are pushed onto the queue. When the time stamp grows beyond three seconds, the data at the head is popped and the new data is pushed:
t0-t1-t2 -> t1-t2-t3 -> t2-t3-t4
I do not know if I am right. Or, is there any easy way to do that? Many thanks.
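A minimal sketch of that queue idea (the topic, message type, and output file are placeholders): keep (receive time, message) pairs in a deque, drop anything older than the window in the callback, and dump the remaining messages to a bag when triggered (here, on shutdown):

    #!/usr/bin/env python
    # Sketch: keep only the last few seconds of a topic in memory and
    # write them to a bag on shutdown.
    import collections
    import rosbag
    import rospy
    from std_msgs.msg import Float64

    WINDOW = rospy.Duration(3.0)
    buf = collections.deque()

    def cb(msg):
        now = rospy.Time.now()
        buf.append((now, msg))
        # pop messages older than the window from the head of the queue
        while buf and now - buf[0][0] > WINDOW:
            buf.popleft()

    def dump():
        with rosbag.Bag('last_seconds.bag', 'w') as bag:
            for stamp, msg in buf:
                bag.write('/data', msg, stamp)

    rospy.init_node('ring_buffer_recorder')
    rospy.Subscriber('/data', Float64, cb)
    rospy.on_shutdown(dump)
    rospy.spin()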
↧