Hello. I am a ROS beginner. I am doing SLAM with a TurtleBot 2 on a personal computer running Ubuntu 14.04 and ROS Indigo.
I need the following data to write a paper.
How can I get the data?
(During navigation, I set the goal point myself in RViz and want the data from that run.)
1. The global map and the trajectory the robot actually followed within that map
2. Movement distance and travel time from the start point to the goal point
3. Data for plotting graphs of speed vs. distance and speed vs. time
that's all.
What should I do? Should I use rosbag?
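For item 3, for example, I imagine something like the following sketch (assuming I record the /odom topic with rosbag; topic, field and file names are only illustrative):

#!/usr/bin/env python
# Sketch: pull position and velocity out of a recorded bag to get
# distance/speed data for plotting (assumes /odom carries nav_msgs/Odometry).
import math
import rosbag

distance = 0.0
prev = None
rows = []  # (time, speed, cumulative distance)

for topic, msg, t in rosbag.Bag('run1.bag').read_messages(topics=['/odom']):
    p = msg.pose.pose.position
    if prev is not None:
        distance += math.hypot(p.x - prev.x, p.y - prev.y)
    prev = p
    speed = msg.twist.twist.linear.x
    rows.append((t.to_sec(), speed, distance))

print('travel time: %.2f s, distance: %.2f m' % (rows[-1][0] - rows[0][0], distance))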
Thank you.
↧
I am recording data of TurtleBot movement and want to write a paper. How can I get the data?
↧
How do I append/add to a rosbag file without overwriting it in Python?
Basically I was testing recording data to a rosbag using a timestamp.
I noticed that every time I open and close the file it just rewrites the data, so I leave the file open when the program starts and close it when it ends. However, if I later want to write to that same file again, it overwrites all of the existing data.
How can I just append/add onto the same bag file?
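For reference, this is roughly what I am doing now (a minimal sketch; topic, message type and file names are placeholders). Opening the same file in 'w' mode again on the next run wipes everything that was recorded before:

import rosbag
import rospy
from std_msgs.msg import String

# Open once at startup in write mode, write stamped messages, close at the end.
# Re-opening the same file in 'w' mode later overwrites all previous data.
bag = rosbag.Bag('data.bag', 'w')
try:
    bag.write('/stamped_data', String(data='sample'), rospy.Time.from_sec(1.0))
finally:
    bag.close()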
↧
↧
Gazebo rosbag usage
Dear all,
I have a simulation environment in Gazebo and want to be able to reproduce the behaviour of one robot in different simulations. I am already recording a rosbag while controlling a robot in a first simulation instance.
The goal is to replicate exactly what has been recorded. The problem is how to do it: since the real-time factor is not always 1, nor constant, replaying the bag with `rosbag play -r X` is not a clean option.
I have also looked into `rosbag play --clock`, but the issue remains the same: the simulation and the bag replay stay unsynchronized...
Any suggestion on how to overcome this?
↧
precise timing with rosbag or an alternative
The beginner's [tutorial](http://wiki.ros.org/ROS/Tutorials/Recording%20and%20playing%20back%20data) says that `rosbag` may not be precise when it comes to timing. Some tasks I face are not Lyapunov stable, so precise data playback is necessary.
1) Could `rosbag` be made precise with advanced technical tricks, e.g. a RAM disk as the bag file destination and a dedicated CPU core for recording/playing? Are there any ROS directives/modes for precise timing?
2) Are there alternative means of recording and playing messages and requests/responses with precise timing (95% <1ms)? Unfortunately, some non-trivial data like camera output might be involved.
For simplicity let's assume we work with a single machine to avoid network latency issues.
ROS Kinetic, Ubuntu 16.04.05 x64.
↧
undefined reference to everything
Hi,
I have a package in my workspace that uses OpenCV and cv_bridge. I needed the CUDA module of OpenCV, which is not included in the default OpenCV shipped with ROS, so I installed another version of OpenCV that includes the CUDA modules. When I tried to link my workspace and my package against the new OpenCV, I got lots of errors like "undefined reference to 'cv_bridge::...'", "undefined reference to 'ros::init::...'", "undefined reference to 'rosbag::...'". I could not solve them, so I tried to go back to the default OpenCV (which worked fine with my package before I tried to use CUDA), but now it gives the same errors. I have uninstalled and reinstalled ROS, deleted my workspace and the package and created them again, but the errors remain. All the dependencies are listed in the CMakeLists.txt.
These are some of the errors:
undefined reference to 'cv_bridge::toCvCopy(boost::shared_ptr> const> const&, std::string const&)'
undefined reference to 'ros::console::initializeLogLocation(ros::console::LogLocation*, std::string const&, ros::console::levels::Level)'
undefined reference to 'ros::init(int&, char**, std::string const&, unsigned int)'
undefined reference to 'ros::NodeHandle::NodeHandle(std::string const&, std::map, std::allocator>> const&)'
undefined reference to 'rosbag::Bag::open(std::string const&, unsigned int)'
undefined reference to 'rosbag::TypeQuery::TypeQuery(std::string const&)'
undefined reference to 'rosbag::MessageInstance::getTopic() const'
undefined reference to 'pcl_ros::transformPointCloud(std::string const&, sensor_msgs::PointCloud2_> const&, sensor_msgs::PointCloud2_>&, tf::TransformListener const&)'
undefined reference to 'pcl::PCDWriter::writeBinary(std::string const&, pcl::PCLPointCloud2 const&, Eigen::Matrix const&, Eigen::Quaternion const&)'
undefined reference to 'pcl::PCDWriter::writeASCII(std::string const&, pcl::PCLPointCloud2 const&, Eigen::Matrix const&, Eigen::Quaternion const&, int)'
undefined reference to 'rosbag::MessageInstance::getMD5Sum() const'
undefined reference to 'rosbag::Bag::readField(std::map, std::allocator>> const&, std::string const&, bool, std::string&) const'
undefined reference to 'rosbag::Bag::readField(std::map, std::allocator>> const&, std::string const&, bool, std::string&) const'
I don't know whether, while trying to use the new OpenCV, I changed something in my CMakeLists.txt or package.xml, so I include them here.
CMakeLists.txt:
cmake_minimum_required(VERSION 2.8.3)
project(b_c_f)
## Compile as C++11, supported in ROS Kinetic and newer
# add_compile_options(-std=c++11)
## Find catkin macros and libraries
## if COMPONENTS list like find_package(catkin REQUIRED COMPONENTS xyz)
## is used, also find other catkin packages
find_package(PCL REQUIRED)
find_package(OpenCV REQUIRED)
find_package(catkin REQUIRED COMPONENTS
cv_bridge
pcl_conversions
pcl_msgs
pcl_ros
rosbag
roscpp
sensor_msgs
tf
)
###################################
## catkin specific configuration ##
###################################
## The catkin_package macro generates cmake config files for your package
## Declare things to be passed to dependent projects
## INCLUDE_DIRS: uncomment this if your package contains header files
## LIBRARIES: libraries you create in this project that dependent projects also need
## CATKIN_DEPENDS: catkin_packages dependent projects also need
## DEPENDS: system dependencies of this project that dependent projects also need
catkin_package(
# INCLUDE_DIRS include
# LIBRARIES b_c_f
# CATKIN_DEPENDS cv_bridge pcl_conversions pcl_msgs pcl_ros rosbag roscpp sensor_msgs tf
# DEPENDS system_lib
)
###########
## Build ##
###########
## Specify additional locations of header files
## Your package locations should be listed before other locations
include_directories(
include
include/b_c_f
src/log
${catkin_INCLUDE_DIRS}
${PCL_INCLUDE_DIRS}
${OpenCV_INCLUDE_DIRS}
)
## Declare a C++ executable
## With catkin_make all packages are built within a single CMake context
## The recommended prefix ensures that target names across packages don't collide
add_executable(bag_to_pcd
src/BagCameraFiltering.cpp
src/Blob.cpp
src/timing-helper.cpp
src/Tracker.cpp
src/Vehicle.cpp
src/log/Logger.cpp
src/log/LogItem.cpp
src/log/Log.cpp
src/log/Thread.cpp
#include/bag_camera_filtering/Vehicle.h
)
## Specify libraries to link a library or executable target against
target_link_libraries(bag_to_pcd
${catkin_LIBRARIES}
${OpenCV_LIBS}
${PCL_LIBRARIES}
)
package.xml:
<?xml version="1.0"?>
<package format="2">
  <name>b_c_f</name>
  <version>0.0.0</version>
  <description>The b_c_f package</description>
  <maintainer>portico</maintainer>
  <license>TODO</license>
  <buildtool_depend>catkin</buildtool_depend>
  <build_depend>cv_bridge</build_depend>
  <build_depend>pcl_conversions</build_depend>
  <build_depend>pcl_msgs</build_depend>
  <build_depend>pcl_ros</build_depend>
  <build_depend>rosbag</build_depend>
  <build_depend>roscpp</build_depend>
  <build_depend>sensor_msgs</build_depend>
  <build_depend>tf</build_depend>
  <build_export_depend>cv_bridge</build_export_depend>
  <build_export_depend>pcl_conversions</build_export_depend>
  <build_export_depend>pcl_msgs</build_export_depend>
  <build_export_depend>pcl_ros</build_export_depend>
  <build_export_depend>rosbag</build_export_depend>
  <build_export_depend>roscpp</build_export_depend>
  <build_export_depend>sensor_msgs</build_export_depend>
  <build_export_depend>tf</build_export_depend>
  <exec_depend>cv_bridge</exec_depend>
  <exec_depend>pcl_conversions</exec_depend>
  <exec_depend>pcl_msgs</exec_depend>
  <exec_depend>pcl_ros</exec_depend>
  <exec_depend>rosbag</exec_depend>
  <exec_depend>roscpp</exec_depend>
  <exec_depend>sensor_msgs</exec_depend>
  <exec_depend>tf</exec_depend>
</package>
I don't know how my .bashrc should be set up, so the problem could be there too.
P.S.: I cloned cv_bridge from GitHub as another package in my catkin_ws, but that didn't solve anything.
Please, I need help!
↧
↧
postprocessing bag with transform node
Hi,
I have recorded a bag file from a robot (not simulated!) that contains raw sensor data which, in this state, is not usable by the other nodes I need.
I also have a node that, on the robot, performs preprocessing on this raw sensor data. I would now like to use this same node to post-process my recorded bag file (containing the raw sensor data).
My current approach is to have a launch file that runs a rosbag play node, a rosbag record node, and the transform node. But I don't know how to tell the rosbag record node, within the launch file, that the rosbag play node has finished.
Another related question: the recorded bag I have is quite large. Right now I do the playback with /use_sim_time set to true. Would it be possible in this case not to use sim time and thereby fast-forward the post-processing?
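One alternative I have been considering is to skip play/record entirely and rewrite the bag offline with the rosbag Python API (a sketch only; the preprocessing step and topic names here are placeholders, since my real preprocessing lives in a separate node):

import rosbag

def preprocess(msg):
    # Placeholder for the real preprocessing done by my node.
    return msg

inbag = rosbag.Bag('raw.bag')
outbag = rosbag.Bag('processed.bag', 'w')
try:
    for topic, msg, t in inbag.read_messages():
        if topic == '/sensor/raw':
            outbag.write('/sensor/processed', preprocess(msg), t)
        else:
            outbag.write(topic, msg, t)
finally:
    outbag.close()
    inbag.close()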
Thanks for any insights!
↧
How to remove a TF from a ROS bag?
I have a bag that contains a large TF tree where one TF (base_link->virtual_cam) was calculated from several other TFs and parameters. I would like to replay this bag without this generated TF in order to interactively re-generate it as I play the bag.
Hence the question, how do I remove this TF from the bag? Alternatively, can I play the bag without outputting this TF?
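For context, the kind of offline filtering I have in mind looks roughly like this (a Python sketch using the rosbag API; it assumes the generated transform has child_frame_id virtual_cam and that TFs are on /tf):

import rosbag

# Copy every message, but strip the base_link -> virtual_cam transform
# out of the /tf messages (drop a /tf message entirely if nothing is left).
outbag = rosbag.Bag('filtered.bag', 'w')
try:
    for topic, msg, t in rosbag.Bag('input.bag').read_messages():
        if topic == '/tf':
            msg.transforms = [tr for tr in msg.transforms
                              if tr.child_frame_id != 'virtual_cam']
            if not msg.transforms:
                continue
        outbag.write(topic, msg, t)
finally:
    outbag.close()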
Specs: Ubuntu 12.04, Fuerte
↧
How to save image topics of Full-HD size with ROSBag
Hello.
I'm trying to save an image topic from a Full HD camera.
I run my camera package on my PC. The publisher has a queue size of 5.
I use `ImageTransport` for the publisher.
The resolution of an image is 1920x1080, which is approximately 6 MB per frame, at 30 fps.
The throughput of the resulting .bag file is not constant; it varies between 17 and 30.
I tried some options for rosbag record such as --buffsize and --chunksize, but they made no difference.
When I do that, RAM usage keeps increasing steadily.
I want to record the images without the throughput dropping.
Is there any good solution or advice for me?
↧
[Python][Rosbag] Written rosbag has same start and end dates - All topics are published at the end
Hello everyone,
I've been digging through code APIs, forums and other sources of knowledge, but I can't find the solution to my problem.
Here's what I want to do:
- open a .mat file (Done)
- read the structure and fetch the data (Done)
- create a rosbag with a topic for each message in the mat file (Fail)
I can write to the bag file with
for t in range(MESSAGE_SIZE):
    ros_msg.data = MESSAGE[t]
    t_msg.secs = t_vec.sec
    t_msg.nsecs = t_vec.nsec
    bag.write("/topic", ros_msg, t_msg)
where t_vec is an array filled with the timestamps of said messages, in the same format as ROS time.
The code executes without errors.
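For completeness, the overall pattern I am following boils down to this (a self-contained sketch with dummy data; in my script the values and stamps come from the .mat file):

import rospy
import rosbag
from std_msgs.msg import Float64

# Dummy (time in seconds, value) pairs standing in for the .mat data.
samples = [(1.0, 1.5), (1.5, 2.0), (2.0, 2.5)]

bag = rosbag.Bag('from_mat.bag', 'w')
try:
    for t_sec, value in samples:
        stamp = rospy.Time.from_sec(t_sec)
        bag.write('/topic', Float64(data=value), stamp)
finally:
    bag.close()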
However, when I replay the bag to check what was written to it, I see these issues:
- rosbag info tells me that bag duration is 0.0s
- rosbag info tells me that the start and end date are the same (final value of the time vector used in the bag creation)
- when I play the bag, the duration displayed in the terminal is 17s, whereas the time vector is 89 seconds long.
- all messages of all topics are published only when the bag reaches its end.
- the /clock topic is well published and its time is correct.
I honestly don't know where to look now...
Any help would be greatly appreciated.
Thanks in advance!
↧
↧
Bag created with rosbag api is not publishing
I am new to ROS. I am attempting to take pointcloud data that is in a format not compatible with ROS and convert it to a ROS bag using the C++ rosbag API. I am having an issue with the newly created bag not publishing data to the topic. For example, whenever I use the command "rostopic echo my_topic" while playing the bag with "rosbag play newConverted.bag" there is nothing printed to the console.
If I open newConverted.bag with rqt_bag, right-click on a topic and select publish, the data is then published to the topic and I can see it in rviz and with the rostopic echo command as expected.
I must be missing something in my bag conversion code that tells ROS to automatically publish the topic when playing. Thoughts?
↧
Creating an array/list of existing message type without defining custom message type?
Is there a way to create an array of an existing message type without defining another message type?
As a use-case, say you have many `.bag` files, each with a username. You compile them into a summary bag file and would like all unique usernames in a `metadata` topic for convenience. Is there any way to accomplish something like this without a custom message type?
import rosbag
import std_msgs.msg

users = get_unique_users()  # a function that returns a list of strings: [user1, user2, ..., userN]
outbag = rosbag.Bag('/path/to/file.bag', 'w')
outbag.write('/metadata/unique_users', std_msgs.msg.String(users))
It seems that an array of an existing message is such a simple extension that I'd figure such a mechanism exists. My current approach was to create a new package, `std_msgs_array` with:
$ rosmsg show std_msgs_array/StringArray
string[] data
$ rosmsg show std_msgs_array/Float64Array
float64[] data
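I then write it into the summary bag roughly like this (a sketch; it assumes the std_msgs_array package above has been built and sourced):

import rosbag
from std_msgs_array.msg import StringArray

users = ['user1', 'user2', 'userN']
outbag = rosbag.Bag('/path/to/file.bag', 'w')
outbag.write('/metadata/unique_users', StringArray(data=users))
outbag.close()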
Is this expected or am I overlooking something? I do note the following from the [ROS Wiki](http://wiki.ros.org/std_msgs):
>...it's usually "better" (in the sense of making the code easier to understand, etc.) when developers use or create non-generic message types
Perhaps the intention is to encourage more project-specific message creation rather than relying on built-in, canned examples. This use case might be somewhat fringe in that I truly am looking for a "throwaway": all I care about is getting a list of very basic values into the output `.bag`, as opposed to something like sensor data tied to a more complicated process.
↧
rosbag play - skip every Nth frame
Hi,
I have a bag with X images and other messages.
I want to test whether rtabmap odometry can handle images at different frame rates (e.g. using half the frames).
Is it possible to play the bag in a way that skips every second frame, so I don't have to create a bag for every possible frame rate?
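For reference, this is the kind of per-frame-rate preprocessing I am hoping to avoid (a sketch that writes a half-rate copy of the bag; topic and file names are illustrative):

import rosbag

keep_every = 2
count = 0
inbag = rosbag.Bag('full.bag')
outbag = rosbag.Bag('half_rate.bag', 'w')
try:
    for topic, msg, t in inbag.read_messages():
        if topic == '/camera/image_raw':
            count += 1
            if count % keep_every != 0:
                continue  # drop every other image
        outbag.write(topic, msg, t)
finally:
    outbag.close()
    inbag.close()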
Thanks,
Avner
↧
Record bag file for use with time synchronizer
I've been testing an algorithm using the EuRoC dataset. The dataset publishes synchronized camera images from two cameras along with IMU data. My application uses a TimeSynchronizer to process both frames and works great on the EuRoC dataset.
I recorded my own datasets using an IMU and two Point Grey cameras that are hardware triggered. I used rosbag to record the three topics I thought I needed; however, when I play the bag back, no data makes it through the TimeSynchronizer. I can remove the TimeSynchronizer and things work, but then I can only use one camera.
Looking at the header timestamps on both messages, they are generally well within 1 ms of each other (usually within 500 µs).
How close do the timestamps have to be for the TimeSynchronizer to pass them? Is there something special I should've done when recording the bag files to indicate the two camera topics were synchronized?
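For reference, the synchronizer setup is essentially the following (a minimal Python sketch with illustrative topic names; my actual application is structured differently):

import rospy
import message_filters
from sensor_msgs.msg import Image

def stereo_callback(left, right):
    # Only called when both images arrive with matching header stamps.
    rospy.loginfo('pair: %.6f / %.6f',
                  left.header.stamp.to_sec(), right.header.stamp.to_sec())

rospy.init_node('sync_test')
left_sub = message_filters.Subscriber('/cam0/image_raw', Image)
right_sub = message_filters.Subscriber('/cam1/image_raw', Image)
sync = message_filters.TimeSynchronizer([left_sub, right_sub], 10)
sync.registerCallback(stereo_callback)
rospy.spin()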
↧
↧
rqt_image_view look at only one image
Hi
I have a rosbag with images, and with rqt_image_view I can view them, but only as the whole bag plays through. Is there a way to look at a single image for several seconds/minutes? Right now the images play like a video, at the same frequency at which the rosbag was recorded.
Thanks
↧
Rosbags saving empty messages
I am trying to record data from a Razor 9DOF IMU, a Hokuyo LIDAR and a ZED camera. The code compiles and runs fine, but it saves empty messages to the rosbag file. I have tried printing the data at different points and nothing gets printed. Can you tell me what my code is missing and why the messages are empty? I am running ROS Kinetic 1.12.14 on Ubuntu 16.04 on an NVIDIA Jetson TX2.
#include <ros/ros.h>
#include <rosbag/bag.h>
#include <sensor_msgs/Imu.h>
#include <sensor_msgs/LaserScan.h>
#include <sensor_msgs/Image.h>
sensor_msgs::Imu imu_msg;
sensor_msgs::LaserScan scan;
sensor_msgs::Image msgr;
sensor_msgs::Image msgl;
/**
* Callback function for the IMU. The argument is a reference to the received message
*/
void imuCallback(const sensor_msgs::Imu &imu_msg)
{
}
/**
* Callback function for the lidar. The argument is a constant pointer to the received message
*/
void scanCallback(const sensor_msgs::LaserScan::ConstPtr& scan)
{
}
/**
* Subscriber callbacks for the ZED images. The argument of each callback is a reference to the received message
*/
void imageRightRectifiedCallback(const sensor_msgs::Image &msgr) {
}
void imageLeftRectifiedCallback(const sensor_msgs::Image &msgl) {
}
int main(int argc, char** argv){
ros::init(argc, argv, "talker");
ros::NodeHandle n;
rosbag::Bag bag; // create a bag named bag
bag.open("data.bag", rosbag::bagmode::Write); //open bag and create a file to save to.
ros::Subscriber imu_sub = n.subscribe("imu_data", 1000, imuCallback); // subscribes to the imu and start the callback function
ros::Subscriber sub = n.subscribe("/scan", 1000, scanCallback); // subscribes to the lidar and start the callback function
// subscribes to the zed and get the right image and start the callback function
ros::Subscriber subRightRectified = n.subscribe("/zed/right/image_rect_color", 10, imageRightRectifiedCallback);
// subscribes to the zed and get the left image and start the callback function
ros::Subscriber subLeftRectified = n.subscribe("/zed/left/image_rect_color", 10, imageLeftRectifiedCallback);
while(ros::ok()){
bag.write("imu", ros::Time::now(), imu_msg); // sending msg into the bag
bag.write("lidar", ros::Time::now(), scan);
bag.write("ZED_RightImage", ros::Time::now(), msgr);
bag.write("ZED_LeftImage", ros::Time::now(), msgl);
ROS_INFO("Recording data...");
ros::spinOnce();
}
bag.close();
return 0;
}
↧
How to control rosbag record from a topic?
Hello ROS Community,
I am learning how to use ROS for controlling a UAV and have not been able to find any examples on how I might approach this properly. I'm not from a software engineering background so I have much to learn about programming still.
I would like to use a switch on my RC transmitter to start and stop recording a rosbag. I have the switch state published under /mavros/rc/in. I have been able to access the channel and print out its values with a node; however, I am having difficulty working out how to start and stop the rosbag recording correctly.
Is a node the best tool to use for this or would something like a service be more appropriate?
Any suggestions on how best to start recording a rosbag when a topic is above a threshold and stop when it is below a threshold?
Any help would be greatly appreciated. Thanks in advance!
Edit:
I have got something working now; the updated code is below. Any suggestions for improvements would still be very welcome! Hopefully this can be useful to others in the future too.
#!/usr/bin/env python
import rospy
import rosbag
from mavros_msgs.msg import RCIn
import subprocess, shlex

class Monitor_Channel(object):
    def __init__(self):
        self.is_recording = False
        self.loop_rate = rospy.Rate(0.5)
        rospy.Subscriber("/mavros/rc/in", RCIn, self.callback)

    def callback(self, data):
        self.switch_state = data.channels[5]
        if self.switch_state > 1200 and self.is_recording == False:
            subprocess.Popen(shlex.split("rosbag record -O /media/nvidia/C61C-EBE4/recording.bag /mavros/imu/data_raw __name:=bag_record"))
            self.is_recording = True
        if self.switch_state > 1200 and self.is_recording == True:
            print("Recording data")
        if self.switch_state < 1200 and self.is_recording == True:
            subprocess.Popen(shlex.split("rosnode kill /bag_record"))
            self.is_recording = False
            print("Recording stopped")

    def start(self):
        while not rospy.is_shutdown():
            self.loop_rate.sleep()

if __name__ == '__main__':
    rospy.init_node("Monitor_Channel_6", anonymous=True)
    record_node = Monitor_Channel()
    record_node.start()
↧
Data Error with Pepperl Fuchs R2000 with ROS kinetic
Dear friends,
I've encountered some wrong data with a Pepperl Fuchs R2000 on ROS Kinetic; the details are below.
Environment:
System: Ubuntu 16.04 LTS (virtual machine)
Hardware: Intel Core i7-7700HQ (2 cores, 4 threads)
Memory: 8 GB
ROS version: Kinetic
Laser: Pepperl Fuchs R2000
Issue:
I was trying to build a map with the Pepperl Fuchs R2000 sensor and found a few wrong frames in the original data.
Below are my steps:
1. Use `rosbag record -a` to record all the data from the Pepperl Fuchs R2000;
2. The laser is on top of a car; drag the car around the workshop; the recording takes about 10 minutes;
3. Play this bag (using the bag to build a map fails, because the bag contains some wrong data);
4. Monitor the data;
5. A few of the frames look as attached:
There is clearly at least one frame of wrong data.
Has anyone encountered a similar phenomenon, or does anyone know what the problem is? Thanks.
Best Regards.
![image description](/upfiles/15511750554418711.png)
↧
↧
Rosbag record dying after bag reaches 4.3GB
Hello,
I am using an NVIDIA TX2 with a set of 3 cameras and a Pixhawk flight controller. I am attempting to record three 4K image streams and IMU data from the Pixhawk.
The issue I am having is that rosbag record consistently dies once the bag reaches 4.3 GB. It also leaves the bag marked as "active" when it stops. I have tried this with 4K resolution as well as lower resolutions; the only difference in outcome is that it happens sooner with the 4K imagery (always at 4.3 GB).
This is the rosbag command I use.
rosbag record -b 0 -O /media/nvidia/C61C-EBE4/debug.bag /mavros/rc/out /mavros/cam_imu_sync/cam_imu_stamp /mavros/imu/data_raw /mavros/imu/data /camF/camera /camF/camera_info /camRR/camera /camRR/camera_info /camRL/camera /camRL/camera_info __name:=bag_record
This is the error I get.
[ INFO] [1551125050.845895518]: Recording to /media/nvidia/C61C-EBE4/debug.bag.
terminate called after throwing an instance of 'rosbag::BagIOException'
what(): Error writing to file: writing 24883246 bytes, wrote 9454209 bytes
I have tried to find an answer and came across this: https://answers.ros.org/question/218816/error-closing/ but there is no answer there. Any ideas on what might be going wrong?
Thanks in advance!
↧
Duplicate messages using `rostopic echo -b -p`
Hello,
I am using ROS Kinetic on Ubuntu 16.04.
I am trying to save the rosbag messages from topics into .csv files. When I use the command below, I see duplicate messages.
`rostopic echo -b xxx.bag -p /topicname/rgb`
I checked the timestamps using rqt_bag and did not see duplicate messages at all, so I concluded that it must be the command above. Is there a way to remove the duplicates?
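For reference, the kind of per-topic export I am after could also be scripted directly against the bag (a sketch using the rosbag Python API; it assumes the messages have a header, and the columns written out are only illustrative):

import csv
import rosbag

# Sketch: dump one topic to CSV straight from the bag, one row per stored
# message (here only the stamps; the real export would add the fields I need).
bag = rosbag.Bag('xxx.bag')
with open('rgb.csv', 'w') as f:
    writer = csv.writer(f)
    writer.writerow(['bag_time', 'header_stamp'])
    for topic, msg, t in bag.read_messages(topics=['/topicname/rgb']):
        writer.writerow([t.to_sec(), msg.header.stamp.to_sec()])
bag.close()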
Thanks,
- Praneeta
↧
When does rosbag start capturing messages, and why do I see a delay between two messages?
I am trying to understand how much time it takes for rosbag to start capturing packets after we type `rosbag record -a`.
After analyzing packets captured in Wireshark (see the capture at https://goo.gl/9kR2WC), I found that before receiving any messages from ROS nodes, the rosbag record subscribers do several rounds of negotiation with the ROS master and then close that connection to connect directly with the nodes and receive messages from them (packets no. 91 to 314 in the enclosed Wireshark capture). I want to know: during this negotiation, are ROS messages buffered and then recorded once the connection is established, or are they dropped?
The reason I ask is that my robot publishes position and velocity. Differentiating position should give velocity, but when I plot the differentiated position and the velocity, I find a lag of 0.4 s in the velocity data; time-shifting the velocity by 0.4 s matches the differentiated position. Note that the position and velocity data come from two different sensors (i.e. two different ROS nodes).
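For reference, this is roughly how I estimate the 0.4 s lag (a sketch; it assumes both signals have already been resampled onto a common, uniform time base):

import numpy as np

def estimate_lag(pos, vel, dt):
    """Estimate how much vel lags the derivative of pos (both sampled every dt seconds)."""
    d_pos = np.gradient(np.asarray(pos, dtype=float), dt)  # numerical derivative of position
    a = np.asarray(vel, dtype=float) - np.mean(vel)
    b = d_pos - np.mean(d_pos)
    corr = np.correlate(a, b, mode='full')                 # cross-correlation over all offsets
    shift = np.argmax(corr) - (len(b) - 1)                 # sample offset of the best alignment
    return shift * dt                                      # positive => velocity lags the derivative

# e.g. estimate_lag(position_samples, velocity_samples, 0.01)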
The whole point of this discussion is to find the source of the delay, i.e. why I need to time-shift the velocity.
↧