From a948415da8c155ae9457474d906ebfa54c696ae9 Mon Sep 17 00:00:00 2001 From: tfoote Date: Fri, 15 Mar 2024 16:08:12 +0000 Subject: [PATCH] =?UTF-8?q?Deploying=20to=20gh-pages=20from=20@=20ros-infr?= =?UTF-8?q?astructure/planet.ros.org@45f76061dd91faaf6ee8ad9c90148410b3834?= =?UTF-8?q?616=20=F0=9F=9A=80?= MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit --- CNAME | 1 + atom.xml | 2792 ++++++++++++++++++++++++++ foafroll.xml | 345 ++++ images/Robot_Head_clip_art.svg | 905 +++++++++ images/feed-icon-10x10.png | Bin 0 -> 469 bytes images/feed-icon-plus.svg | 142 ++ images/feed-icon.svg | 137 ++ images/planet_ros.svg | 153 ++ images/ros.ico | Bin 0 -> 1150 bytes images/ros_logo.svg | 28 + images/venus.png | Bin 0 -> 570 bytes index.html | 3356 ++++++++++++++++++++++++++++++++ opml.xml | 37 + planet.css | 132 ++ rss10.xml | 2550 ++++++++++++++++++++++++ rss20.xml | 2516 ++++++++++++++++++++++++ 16 files changed, 13094 insertions(+) create mode 100644 CNAME create mode 100644 atom.xml create mode 100644 foafroll.xml create mode 100644 images/Robot_Head_clip_art.svg create mode 100644 images/feed-icon-10x10.png create mode 100644 images/feed-icon-plus.svg create mode 100644 images/feed-icon.svg create mode 100644 images/planet_ros.svg create mode 100644 images/ros.ico create mode 100644 images/ros_logo.svg create mode 100644 images/venus.png create mode 100644 index.html create mode 100644 opml.xml create mode 100644 planet.css create mode 100644 rss10.xml create mode 100644 rss20.xml diff --git a/CNAME b/CNAME new file mode 100644 index 00000000..d05f0c5f --- /dev/null +++ b/CNAME @@ -0,0 +1 @@ +planet.ros.org diff --git a/atom.xml b/atom.xml new file mode 100644 index 00000000..fe41a591 --- /dev/null +++ b/atom.xml @@ -0,0 +1,2792 @@ + + + Planet ROS + 2024-03-15T16:08:10Z + Venus + + Open Robotics + info@openrobotics.org + + http://planet.ros.org/atom.xml + + + + + discourse.ros.org-topic-36651 + + ROS News for the Week of March 
11th, 2024 +

ROS News for the Week of March 11th, 2024

+
+


+The ROSCon 2024 call for talks and workshops is now open! We want your amazing talks! Also, the ROSCon Diversity Scholarship deadline is coming up!

+
+


+ROS By-The-Bay is next week. Open Robotics’ CEO @Vanessa_Yamzon_Orsi is dropping by to take your questions about the future of Open Robotics, and I recommend you swing by if you can. Just a heads up, we have to move to a different room on the other side of the complex; details are on Meetup.com.

+
+


+We’re planning a ROS Meetup in San Antonio on March 26th in conjunction with the ROS Industrial Consortium meeting. If you are in the area, or have colleagues in the region, please help us spread the word.

+
+


+We’ve lined up a phenomenal guest for our next Gazebo Community Meeting: Ji Zhang from Carnegie Mellon will be speaking about his work integrating ROS, Gazebo, and a variety of LIDAR-based SLAM techniques.

+

Events

+ +

News

+ +

ROS

+ +

ROS Questions

+

Got a minute? Please take some time to answer questions on Robotics Stack Exchange!

+

1 post - 1 participant

+

Read full topic

+
+ 2024-03-15T15:33:56Z + 2024-03-15T15:33:56Z + + + Katherine_Scott + + + + 2024-03-15T16:08:09Z + +
+ + + discourse.ros.org-topic-36624 + + ROSCon 2024 Call for Proposals Now Open +

ROSCon 2024 Call for Proposals

+

+

Hi Everyone,

+

The ROSCon call for proposals is now open! You can find full proposal details on the ROSCon website. ROSCon Workshop proposals are due by 2024-05-08T06:59:00Z UTC and can be submitted using this Google Form. ROSCon talks are due by 2024-06-04T06:59:00Z UTC and you can submit your proposals using Hot CRP. Please note that you’ll need a HotCRP account to submit your talk proposal. We plan to post the accepted workshops on or around 2024-07-08T07:00:00Z UTC and the accepted talks on or around 2024-07-15T07:00:00Z UTC respectively. If you think you will need financial assistance to attend ROSCon, and you meet the qualifications, please apply for our Diversity Scholarship Program as soon as possible. Diversity Scholarship applications are due on 2024-04-06T06:59:00Z UTC, well before the CFP deadlines or final speakers are announced. Questions and concerns about the ROSCon CFP can be directed to the ROSCon executive committee (roscon-2024-ec@openrobotics.org) or posted in this thread.

+

We recommend you start planning your talk early and take the time to workshop your submission with your friends and colleagues. You are more than welcome to use this Discourse thread and the #roscon-2024 channel on the ROS Discord to workshop ideas and organize collaborators.

+

Finally, I want to take a moment to recognize this year’s ROSCon Program Co-Chairs @Ingo_Lutkebohle and @Yadunund, along with a very long list of talk reviewers who are still being finalized. Reviewing talk proposals is fairly tedious task, and ROSCon wouldn’t happen without the efforts of our volunteers. If you happen to run into any of them at ROSCon please thank them for their service to the community.

+

Talk and Workshop Ideas for ROSCon 2024

+

If you’ve never been to ROSCon, but would like to submit a talk or workshop proposal, we recommend you take a look at the archive of previous ROSCon talks. Other good resources to consider are frequently discussed topics on ROS Discourse and Robotics Stack Exchange. In last year’s metrics report I included a list of frequently asked topic tags from Robotics Stack Exchange that might be helpful. Aside from code, we really want to see your robots! We want to see your race cars, mining robots, moon landers, maritime robots, development boards, and factories, and hear about the lessons you learned from making them happen. If you organize a working group, run a local meetup, or maintain a larger package, we want to hear about your big wins in the past year.

+

While we can suggest a few ideas for talks and workshops that we would like to see at ROSCon 2024, what we really want is to hear from the community about topic areas that you think are important. If there is a talk you would like to see at ROSCon 2024, consider proposing that topic in the comments below. Feel free to write a whole list! Some of our most memorable talks have been ten-minute overviews of key ROS subsystems that everyone uses. If you think a half-hour talk about writing a custom ROS 2 executor and benchmarking its performance would be helpful, please say so!

+

1 post - 1 participant

+

Read full topic

+
+ 2024-03-15T15:19:51Z + 2024-03-15T15:19:51Z + + + Katherine_Scott + + + + 2024-03-15T16:08:09Z + +
+ + + discourse.ros.org-topic-36644 + + Cobot Magic: AgileX achieved the whole process of Mobile Aloha model training in both the simulation and real environment +

Cobot Magic: AgileX achieved the whole process of Mobile Aloha model training in both the simulation and real environment

+

Mobile Aloha is a whole-body remote operation data collection system developed by Zipeng Fu, Tony Z. Zhao, and Chelsea Finn from Stanford University. link.

+

Based on Mobile Aloha, AgileX developed Cobot Magic, which runs the complete Mobile Aloha code with higher configurations and lower cost, and is equipped with larger-payload robotic arms and high-computing-power industrial computers. For more details about Cobot Magic, please check the AgileX website.

+

Currently, AgileX has successfully completed the integration of Cobot Magic based on the Mobile Aloha source code project.
+Inference

+

Simulation data training

+

Data collection

+

After setting up the Mobile Aloha software environment (mentioned in the last section), model training can be carried out in both the simulation environment and the real environment. The following covers the data collection part of the simulation environment. The data is provided by the team of Zipeng Fu, Tony Z. Zhao, and Chelsea Finn. You can find all scripted/human demos for the simulated environments here.

+

After downloading, copy it to the act-plus-plus/data directory. The directory structure is as follows:

+
act-plus-plus/data
+    ├── sim_insertion_human
+    │   ├── sim_insertion_human-20240110T054847Z-001.zip
+        ├── ...
+    ├── sim_insertion_scripted
+    │   ├── sim_insertion_scripted-20240110T054854Z-001.zip
+        ├── ... 
+    ├── sim_transfer_cube_human
+    │   ├── sim_transfer_cube_human-20240110T054900Z-001.zip
+    │   ├── ...
+    └── sim_transfer_cube_scripted
+        ├── sim_transfer_cube_scripted-20240110T054901Z-001.zip
+        ├── ...
+
+

Generate episodes and render the result graph. In the terminal output below, 10 episodes are run and 2 of them succeed.

+
# 1 Run
+python3 record_sim_episodes.py --task_name sim_transfer_cube_scripted --dataset_dir <data save dir> --num_episodes 50
+
+# 2 Take sim_transfer_cube_scripted as an example
+python3 record_sim_episodes.py --task_name sim_transfer_cube_scripted --dataset_dir data/sim_transfer_cube_scripted --num_episodes 10
+
+# 2.1 Real-time rendering
+python3 record_sim_episodes.py --task_name sim_transfer_cube_scripted --dataset_dir data/sim_transfer_cube_scripted --num_episodes 10  --onscreen_render
+
+# 2.2 The output in the terminal shows
+ube_scripted --num_episodes 10
+episode_idx=0
+Rollout out EE space scripted policy
+episode_idx=0 Failed
+Replaying joint commands
+episode_idx=0 Failed
+Saving: 0.9 secs
+
+episode_idx=1
+Rollout out EE space scripted policy
+episode_idx=1 Successful, episode_return=57
+Replaying joint commands
+episode_idx=1 Successful, episode_return=59
+Saving: 0.6 secs
+...
+Saved to data/sim_transfer_cube_scripted
+Success: 2 / 10
+
+

The loaded image renders as follows:
+

+

Data Visualization

+

Visualize simulation data. The following figures show the images of episode0 and episode9 respectively.

+

The episode 0 screen in the data set is as follows, showing a case where the gripper fails to pick up.

+

episode0

+

The visualization of the episode 9 data shows a successful case of gripping.

+

episode9

+

Print the data of each joint of the robotic arm in the simulation environment. Joints 0-13 are the 14 degrees of freedom of the robot arms and grippers.

+

+

Model training and inference

+

The simulated-environment datasets must be downloaded first (see Data Collection).

+
python3 imitate_episodes.py --task_name sim_transfer_cube_scripted --ckpt_dir <ckpt dir> --policy_class ACT --kl_weight 10 --chunk_size 100 --hidden_dim 512 --batch_size 8 --dim_feedforward 3200 --num_epochs 2000  --lr 1e-5 --seed 0
+
+# run
+python3 imitate_episodes.py --task_name sim_transfer_cube_scripted --ckpt_dir trainings --policy_class ACT --kl_weight 1 --chunk_size 10 --hidden_dim 512 --batch_size 1 --dim_feedforward 3200  --lr 1e-5 --seed 0 --num_steps 2000
+
+# During training you will be prompted as follows. If you do not have a W&B account, choose 3.
+wandb: (1) Create a W&B account
+wandb: (2) Use an existing W&B account
+wandb: (3) Don't visualize my results
+wandb: Enter your choice:
+
+

After training is completed, the weights will be saved to the trainings directory. The results are as follows:

+
trainings
+  ├── config.pkl
+  ├── dataset_stats.pkl
+  ├── policy_best.ckpt
+  ├── policy_last.ckpt
+  └── policy_step_0_seed_0.ckpt
+
+

Evaluate the model trained above:

+
# 1 Evaluate the policy; add --onscreen_render for real-time rendering
+python3 imitate_episodes.py --eval --task_name sim_transfer_cube_scripted --ckpt_dir trainings --policy_class ACT --kl_weight 1 --chunk_size 10 --hidden_dim 512 --batch_size 1 --dim_feedforward 3200  --lr 1e-5 --seed 0 --num_steps 20 --onscreen_render
+
+

And print the rendering picture.

+

+

Data Training in real environment

+

Data Collection

+

1. Environment dependency

+

1.1 ROS dependency

+

● Default: an Ubuntu 20.04 + ROS Noetic environment has already been configured

+
sudo apt install ros-$ROS_DISTRO-sensor-msgs ros-$ROS_DISTRO-nav-msgs ros-$ROS_DISTRO-cv-bridge
+
+

1.2 Python dependency

+
# Enter the current working space directory and install the dependencies in the requirements.txt file.
+pip install -r requiredments.txt
+
+

2. Data collection

+

2.1 Run ‘collect_data’

+
python collect_data.py -h # see parameters
+python collect_data.py --max_timesteps 500 --episode_idx 0
+python collect_data.py --max_timesteps 500 --is_compress --episode_idx 0
+python collect_data.py --max_timesteps 500 --use_depth_image --episode_idx 1
+python collect_data.py --max_timesteps 500 --is_compress --use_depth_image --episode_idx 1
+
+

After the data collection is completed, it will be saved in the ${dataset_dir}/{task_name} directory.

+
python collect_data.py --max_timesteps 500 --is_compress --episode_idx 0
+# Generate dataset episode_0.hdf5 . The structure is :
+
+collect_data
+  ├── collect_data.py
+  ├── data                     # --dataset_dir 
+  │   └── cobot_magic_agilex   # --task_name 
+  │       ├── episode_0.hdf5   # The location of the generated data set file
          ├── episode_idx.hdf5 # idx depends on --episode_idx
+          └── ...
+  ├── readme.md
+  ├── replay_data.py
+  ├── requiredments.txt
+  └── visualize_episodes.py
+
+

The specific parameters are shown:

+
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
NameExplanation
dataset_dirData set saving path
task_nametask name, as the file name of the data set
episode_idxAction block index number
max_timestepsThe number of time steps for the maximum action block
camera_namesCamera names, default [‘cam_high’, ‘cam_left_wrist’, ‘cam_right_wrist’]
img_front_topicCamera 1 Color Picture Topic
img_left_topicCamera 2 Color Picture Topic
img_right_topicCamera 3 Color Picture Topic
use_depth_imageWhether to use depth information
depth_front_topicCamera 1 depth map topic
depth_left_topicCamera 2 depth map topic
depth_right_topicCamera 3 depth map topic
master_arm_left_topicLeft main arm topic
master_arm_right_topicRight main arm topic
puppet_arm_left_topicLeft puppet arm topic
puppet_arm_right_topicRight puppet arm topic
use_robot_baseWhether to use mobile base information
robot_base_topicMobile base topic
frame_rateAcquisition frame rate. Because the camera image stabilization value is 30 frames, the default is 30 frames
is_compressWhether the image is compressed and saved
+

The picture of data collection from the camera perspective is as follows:

+

data collection

+

Data visualization

+

Run the following code:

+
python visualize_episodes.py --dataset_dir ./data --task_name cobot_magic_agilex --episode_idx 0
+
+

Visualize the collected data. --dataset_dir, --task_name and --episode_idx need to be the same as when ‘collecting data’. When you run the above code, the terminal will print the action and display a color image window. The visualization results are as follows:

+

+

After the operation is completed, episode${idx}qpos.png, episode${idx}base_action.png and episode${idx}video.mp4 files will be generated under ${dataset_dir}/{task_name}. The directory structure is as follows:

+
collect_data
+├── data
+│   ├── cobot_magic_agilex
+│   │   └── episode_0.hdf5
+│   ├── episode_0_base_action.png   # base_action
+│   ├── episode_0_qpos.png          # qpos
+│   └── episode_0_video.mp4         # Color video
+
+

Taking episode 30 as an example, replay the collected data. The camera perspective is as follows:

+

data visualization

+

Model Training and Inference

+

The Mobile Aloha project studied different strategies for imitation learning and proposed ACT (Action Chunking with Transformers), a Transformer-based action chunking algorithm. It is essentially an end-to-end policy: it maps real-world RGB images directly to actions, letting the robot learn and imitate from visual input without additional hand-crafted intermediate representations, and it uses action chunks as the unit of prediction to produce accurate, smooth motion trajectories.

+

The model is as follows:

+

+

Disassemble and interpret the model.

+
    +
  1. Sample data
  2. +
+

+

Input: includes 4 RGB images, each image has a resolution of 480 × 640, and the joint positions of the two robot arms (7+7=14 DoF in total)

+

Output: The action space is the absolute joint positions of the two robots, a 14-dimensional vector. Therefore, with action chunking, the policy outputs a k × 14 tensor given the current observation (each action is defined as a 14-dimensional vector, so k actions are a k × 14 tensor)
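The input/output shapes above can be sketched in a few lines. This is a minimal illustration with random numbers, not the actual ACT policy; only `k` (the chunk size) and the 14-dimensional action layout come from the text:

```python
import numpy as np

# Minimal sketch of action chunking shapes (illustrative only, not the ACT code).
k = 100          # chunk size: number of future actions predicted per observation
action_dim = 14  # absolute joint positions of the two 7-DoF arms

def toy_policy(observation):
    # A trained network would go here; we return a random chunk
    # just to show the documented output shape.
    rng = np.random.default_rng(0)
    return rng.standard_normal((k, action_dim))

chunk = toy_policy(None)
assert chunk.shape == (k, action_dim)  # k x 14 tensor, one row per action
```

Executing the whole chunk before re-querying the policy is what smooths the trajectory relative to single-step prediction.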

+
    +
  1. Infer Z
  2. +
+

+

The input to the encoder is a [CLS] token, which consists of randomly initialized learned weights. Through one linear layer (linear layer 2), the joint positions are projected to the embedding dimension (14 dimensions to 512 dimensions), giving the embedded joint positions. Through another linear layer (linear layer 1), the k × 14 action sequence is projected to an embedded action sequence of the embedding dimension (k × 14 to k × 512).

+

The above three inputs form a sequence of (k + 2) × embedding_dimension, i.e., (k + 2) × 512, which is processed by the transformer encoder. Only the first output, corresponding to the [CLS] token, is used: another linear network predicts the mean and variance of the Z distribution from it, parameterizing Z as a diagonal Gaussian. Reparameterization is then used to obtain a sample of Z.
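The mean/variance prediction and the reparameterization step can be sketched as follows. The latent dimension and the random linear head are assumptions for illustration, not the actual implementation:

```python
import numpy as np

# Sketch of predicting a diagonal Gaussian over Z from the [CLS] output and
# sampling it with the reparameterization trick (latent_dim is an assumed value).
rng = np.random.default_rng(0)
latent_dim = 32

cls_output = rng.standard_normal(512)           # encoder output at the [CLS] position
W = rng.standard_normal((2 * latent_dim, 512))  # stand-in for the linear head
head = W @ cls_output                           # predicts mean and log-variance
mu, logvar = head[:latent_dim], head[latent_dim:]

eps = rng.standard_normal(latent_dim)
z = mu + np.exp(0.5 * logvar) * eps             # differentiable sample of Z
assert z.shape == (latent_dim,)
```

Sampling `eps` separately keeps the path from `mu` and `logvar` to `z` differentiable, which is what makes the encoder trainable end to end.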

+
    +
  1. Predict an action sequence
  2. +
+

+

① First, each image observation is processed by ResNet18 to obtain a feature map (15 × 20 × 728), which is then flattened into a feature sequence (300 × 728). These features are projected to the embedding dimension (300 × 512) by a linear layer (linear layer 5), and a 2D sinusoidal position embedding is added to preserve spatial information.

+

② Secondly, this operation is repeated for all 4 images, giving a combined feature sequence of dimension 1200 × 512.

+

③ Next, the feature sequences from the cameras are concatenated and used as one of the inputs to the transformer encoder. The other two inputs, the current joint positions and the “style variable” z, are passed through linear layer 6 and linear layer 7 respectively, which project them from their original dimensions (14 and 15) to 512.

+

④ Finally, the transformer encoder input is 1202 × 512 (the features of the 4 images are 1200 × 512, the joint-position feature is 1 × 512, and the style-variable z feature is 1 × 512).
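The shape bookkeeping in steps ① through ④ can be checked directly; the dimensions below are taken from the text, with the learned projections replaced by zero placeholders:

```python
import numpy as np

# Shape walkthrough of the transformer-encoder input described above.
embed_dim = 512
tokens_per_image = 15 * 20             # flattened ResNet18 feature map -> 300 tokens
img_feats = np.zeros((4 * tokens_per_image, embed_dim))  # 4 cameras -> 1200 x 512
joint_feat = np.zeros((1, embed_dim))  # 14-D joint positions projected to 512
z_feat = np.zeros((1, embed_dim))      # style variable z projected to 512

encoder_input = np.concatenate([img_feats, joint_feat, z_feat], axis=0)
assert encoder_input.shape == (1202, embed_dim)
```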

+

The input to the transformer decoder has two aspects:

+

On the one hand, the “query” of the transformer decoder is the first layer of fixed sinusoidal position embeddings, shown as position embeddings (fixed) in the lower right corner of the figure above, with dimension k × 512.

+

On the other hand, the “keys” and “values” in the cross-attention layer of the transformer decoder come from the output of the above-mentioned transformer encoder.

+

Thereby, the transformer decoder predicts the action sequence given the encoder output.
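Putting the decoder side together, here is a shape-only sketch; the attention computation itself is omitted, and the zero-valued head is a placeholder for the learned final linear layer:

```python
import numpy as np

# Shape-only sketch of the decoder stage described above (no real attention).
k, embed_dim, action_dim = 100, 512, 14

queries = np.zeros((k, embed_dim))         # fixed sinusoidal position embeddings
encoder_out = np.zeros((1202, embed_dim))  # keys/values for cross-attention

# Cross-attention would mix queries with encoder_out while preserving the
# query shape; the final linear head then maps each output to a 14-D action.
decoder_out = queries
W_head = np.zeros((embed_dim, action_dim))
actions = decoder_out @ W_head
assert actions.shape == (k, action_dim)    # the predicted k x 14 action sequence
```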

+

By collecting data and training the above model, you can observe that the results converge.

+

+

A third-person view of the model inference results is as follows. The robotic arm can infer the motion of moving colored blocks from point A to point B.

+

Inference

+

Summary

+

Cobot Magic is a whole-body teleoperation data collection device, developed by AgileX Robotics based on the Mobile Aloha project from Stanford University. With Cobot Magic, AgileX Robotics has successfully run the open-source code from the Stanford laboratory on the Mobile Aloha platform, in both simulation and real environments.
+AgileX will continue to collect data from various motion tasks based on Cobot Magic for model training and inference. Please stay tuned for updates on
Github. And if you are interested in the Mobile Aloha project, join us via this Slack link: Slack. Let’s talk about our ideas.

+

About AgileX

+

Established in 2016, AgileX Robotics is a leading manufacturer of mobile robot platforms and a provider of unmanned system solutions. The company specializes in independently developed multi-mode wheeled and tracked wire-controlled chassis technology and has obtained multiple international certifications. AgileX Robotics offers users self-developed innovative application solutions such as autonomous driving, mobile grasping, and navigation positioning, helping users in various industries achieve automation. Additionally, AgileX Robotics has introduced research and education software and hardware products related to machine learning, embodied intelligence, and visual algorithms. The company works closely with research and educational institutions to promote robotics technology teaching and innovation.

+

1 post - 1 participant

+

Read full topic

+
+ 2024-03-15T03:07:59Z + 2024-03-15T03:07:59Z + + + Agilex_Robotics + + + + 2024-03-15T16:08:10Z + +
+ + + discourse.ros.org-topic-36604 + + Cloud Robotics WG Strategy & Next Meeting Announcement +

Hi folks!

+

I wanted to share the results of the Cloud Robotics Working Group meeting from 2024-03-11. We met to discuss the long-term strategy of the group. You can see the full meeting recording on Vimeo, with our meeting minutes here (thanks to Phil Roan for taking minutes this meeting!).

+

During the meeting, we went over some definitions of Cloud Robotics, our tenets going forward, and a phase approach of gathering data, analyzing it, and acting on it. We used slides to frame the discussion, which have since been updated from the discussion and will form the backbone of our discussion going forwards. The slide deck is publicly available here.

+

Next meeting will be about how to start collecting the data for the first phase. We will hold it from 2024-03-25T17:00:00Z to 2024-03-25T18:00:00Z UTC. If you’d like to join the group, you are welcome to, and you can sign up for our meeting invites at this Google Group.

+

Finally, we will regularly invite members and guests to give talks in our meetings. If you have a topic you’d like to talk about, or would like to invite someone to talk, please use this speaker signup sheet to let us know.

+

Hopefully I’ll see you all in future meetings!

+

6 posts - 4 participants

+

Read full topic

+
+ 2024-03-12T17:33:16Z + 2024-03-12T17:33:16Z + + + mikelikesrobots + + + + 2024-03-15T16:08:09Z + +
+ + + discourse.ros.org-topic-36583 + + Foxglove 2.0 - integrated UI, new pricing, and open source changes +

Hi everyone - excited to announce Foxglove 2.0, with a new integrated UI (merging Foxglove Studio and Data Platform), new pricing plans, and open source changes.

+

:handshake: Streamlined UI for smoother robotics observability
+:satellite: Automatic data offload through Foxglove Agent
+:credit_card: Updated pricing plans to make Foxglove accessible for teams of all sizes
+:mag_right: Changes to our open-source strategy (we’re discontinuing the open source edition of Foxglove Studio)

+

Read the details in our blog post.

+

Note that Foxglove is still free for academic teams and researchers! If you fall into that category, please contact us and we can upgrade your account.

+

15 posts - 10 participants

+

Read full topic

+
+ 2024-03-11T19:28:55Z + 2024-03-11T19:28:55Z + + + amacneil + + + + 2024-03-15T16:08:09Z + +
+ + + discourse.ros.org-topic-36572 + + Announcing open sourcing of ROS 2 Task Manager! +

:tada: My team and I are happy to announce that we at Karelics have open sourced our ROS 2 Task Manager package. This solution allows you to convert your existing ROS actions and services into tasks, offering useful features such as automatic task conflict resolution, the ability to aggregate multiple tasks into larger Missions, and straightforward tracking of active tasks and their results.

+

Check out the package and examples of its usage with the Nav2 package:
+:link: https://github.com/Karelics/task_manager

+

For an introduction and deeper insights into our design decisions, see our blog post available at: https://karelics.fi/task-manager-ros-2-package/
+

+


+

+We firmly believe that this package will prove valuable to the ROS community and accelerate the development of robot systems. We are excited to hear your thoughts and feedback on it!

+

1 post - 1 participant

+

Read full topic

+
+ 2024-03-11T12:52:42Z + 2024-03-11T12:52:42Z + + + jak + + + + 2024-03-15T16:08:09Z + +
+ + + discourse.ros.org-topic-36560 + + New Packages for Iron Irwini 2024-03-11 +

We’re happy to announce 1 new package and 82 updates are now available in ROS 2 Iron Irwini :iron: :irwini:. This sync was tagged as iron/2024-03-11.

+

Package Updates for iron

+

Added Packages [1]:

+
    +
  • ros-iron-apriltag-detector-dbgsym: 1.2.1-1
  • +
+

Updated Packages [82]:

+
    +
  • ros-iron-apriltag-detector: 1.2.0-1 → 1.2.1-1
  • +
  • ros-iron-controller-interface: 3.23.0-1 → 3.24.0-1
  • +
  • ros-iron-controller-interface-dbgsym: 3.23.0-1 → 3.24.0-1
  • +
  • ros-iron-controller-manager: 3.23.0-1 → 3.24.0-1
  • +
  • ros-iron-controller-manager-dbgsym: 3.23.0-1 → 3.24.0-1
  • +
  • ros-iron-controller-manager-msgs: 3.23.0-1 → 3.24.0-1
  • +
  • ros-iron-controller-manager-msgs-dbgsym: 3.23.0-1 → 3.24.0-1
  • +
  • ros-iron-etsi-its-cam-coding: 2.0.0-1 → 2.0.1-1
  • +
  • ros-iron-etsi-its-cam-coding-dbgsym: 2.0.0-1 → 2.0.1-1
  • +
  • ros-iron-etsi-its-cam-conversion: 2.0.0-1 → 2.0.1-1
  • +
  • ros-iron-etsi-its-cam-msgs: 2.0.0-1 → 2.0.1-1
  • +
  • ros-iron-etsi-its-cam-msgs-dbgsym: 2.0.0-1 → 2.0.1-1
  • +
  • ros-iron-etsi-its-coding: 2.0.0-1 → 2.0.1-1
  • +
  • ros-iron-etsi-its-conversion: 2.0.0-1 → 2.0.1-1
  • +
  • ros-iron-etsi-its-conversion-dbgsym: 2.0.0-1 → 2.0.1-1
  • +
  • ros-iron-etsi-its-denm-coding: 2.0.0-1 → 2.0.1-1
  • +
  • ros-iron-etsi-its-denm-coding-dbgsym: 2.0.0-1 → 2.0.1-1
  • +
  • ros-iron-etsi-its-denm-conversion: 2.0.0-1 → 2.0.1-1
  • +
  • ros-iron-etsi-its-denm-msgs: 2.0.0-1 → 2.0.1-1
  • +
  • ros-iron-etsi-its-denm-msgs-dbgsym: 2.0.0-1 → 2.0.1-1
  • +
  • ros-iron-etsi-its-messages: 2.0.0-1 → 2.0.1-1
  • +
  • ros-iron-etsi-its-msgs: 2.0.0-1 → 2.0.1-1
  • +
  • ros-iron-etsi-its-msgs-utils: 2.0.0-1 → 2.0.1-1
  • +
  • ros-iron-etsi-its-primitives-conversion: 2.0.0-1 → 2.0.1-1
  • +
  • ros-iron-etsi-its-rviz-plugins: 2.0.0-1 → 2.0.1-1
  • +
  • ros-iron-etsi-its-rviz-plugins-dbgsym: 2.0.0-1 → 2.0.1-1
  • +
  • ros-iron-flir-camera-description: 2.0.8-1 → 2.0.8-2
  • +
  • ros-iron-flir-camera-msgs: 2.0.8-1 → 2.0.8-2
  • +
  • ros-iron-flir-camera-msgs-dbgsym: 2.0.8-1 → 2.0.8-2
  • +
  • ros-iron-hardware-interface: 3.23.0-1 → 3.24.0-1
  • +
  • ros-iron-hardware-interface-dbgsym: 3.23.0-1 → 3.24.0-1
  • +
  • ros-iron-hardware-interface-testing: 3.23.0-1 → 3.24.0-1
  • +
  • ros-iron-hardware-interface-testing-dbgsym: 3.23.0-1 → 3.24.0-1
  • +
  • ros-iron-joint-limits: 3.23.0-1 → 3.24.0-1
  • +
  • ros-iron-libmavconn: 2.6.0-1 → 2.7.0-1
  • +
  • ros-iron-libmavconn-dbgsym: 2.6.0-1 → 2.7.0-1
  • +
  • ros-iron-mavlink: 2023.9.9-1 → 2024.3.3-1
  • +
  • ros-iron-mavros: 2.6.0-1 → 2.7.0-1
  • +
  • ros-iron-mavros-dbgsym: 2.6.0-1 → 2.7.0-1
  • +
  • ros-iron-mavros-extras: 2.6.0-1 → 2.7.0-1
  • +
  • ros-iron-mavros-extras-dbgsym: 2.6.0-1 → 2.7.0-1
  • +
  • ros-iron-mavros-msgs: 2.6.0-1 → 2.7.0-1
  • +
  • ros-iron-mavros-msgs-dbgsym: 2.6.0-1 → 2.7.0-1
  • +
  • ros-iron-mrpt2: 2.11.9-1 → 2.11.11-1
  • +
  • ros-iron-mrpt2-dbgsym: 2.11.9-1 → 2.11.11-1
  • +
  • ros-iron-mvsim: 0.8.3-1 → 0.9.1-1
  • +
  • ros-iron-mvsim-dbgsym: 0.8.3-1 → 0.9.1-1
  • +
  • ros-iron-ntrip-client: 1.2.0-3 → 1.3.0-1
  • +
  • ros-iron-ros2-control: 3.23.0-1 → 3.24.0-1
  • +
  • ros-iron-ros2-control-test-assets: 3.23.0-1 → 3.24.0-1
  • +
  • ros-iron-ros2controlcli: 3.23.0-1 → 3.24.0-1
  • +
  • ros-iron-rqt-controller-manager: 3.23.0-1 → 3.24.0-1
  • +
  • ros-iron-rtabmap: 0.21.3-1 → 0.21.4-1
  • +
  • ros-iron-rtabmap-conversions: 0.21.3-1 → 0.21.4-2
  • +
  • ros-iron-rtabmap-conversions-dbgsym: 0.21.3-1 → 0.21.4-2
  • +
  • ros-iron-rtabmap-dbgsym: 0.21.3-1 → 0.21.4-1
  • +
  • ros-iron-rtabmap-demos: 0.21.3-1 → 0.21.4-2
  • +
  • ros-iron-rtabmap-examples: 0.21.3-1 → 0.21.4-2
  • +
  • ros-iron-rtabmap-launch: 0.21.3-1 → 0.21.4-2
  • +
  • ros-iron-rtabmap-msgs: 0.21.3-1 → 0.21.4-2
  • +
  • ros-iron-rtabmap-msgs-dbgsym: 0.21.3-1 → 0.21.4-2
  • +
  • ros-iron-rtabmap-odom: 0.21.3-1 → 0.21.4-2
  • +
  • ros-iron-rtabmap-odom-dbgsym: 0.21.3-1 → 0.21.4-2
  • +
  • ros-iron-rtabmap-python: 0.21.3-1 → 0.21.4-2
  • +
  • ros-iron-rtabmap-ros: 0.21.3-1 → 0.21.4-2
  • +
  • ros-iron-rtabmap-rviz-plugins: 0.21.3-1 → 0.21.4-2
  • +
  • ros-iron-rtabmap-rviz-plugins-dbgsym: 0.21.3-1 → 0.21.4-2
  • +
  • ros-iron-rtabmap-slam: 0.21.3-1 → 0.21.4-2
  • +
  • ros-iron-rtabmap-slam-dbgsym: 0.21.3-1 → 0.21.4-2
  • +
  • ros-iron-rtabmap-sync: 0.21.3-1 → 0.21.4-2
  • +
  • ros-iron-rtabmap-sync-dbgsym: 0.21.3-1 → 0.21.4-2
  • +
  • ros-iron-rtabmap-util: 0.21.3-1 → 0.21.4-2
  • +
  • ros-iron-rtabmap-util-dbgsym: 0.21.3-1 → 0.21.4-2
  • +
  • ros-iron-rtabmap-viz: 0.21.3-1 → 0.21.4-2
  • +
  • ros-iron-rtabmap-viz-dbgsym: 0.21.3-1 → 0.21.4-2
  • +
  • ros-iron-simple-launch: 1.9.0-1 → 1.9.1-1
  • +
  • ros-iron-spinnaker-camera-driver: 2.0.8-1 → 2.0.8-2
  • +
  • ros-iron-spinnaker-camera-driver-dbgsym: 2.0.8-1 → 2.0.8-2
  • +
  • ros-iron-transmission-interface: 3.23.0-1 → 3.24.0-1
  • +
  • ros-iron-transmission-interface-dbgsym: 3.23.0-1 → 3.24.0-1
  • +
  • ros-iron-ur-client-library: 1.3.4-1 → 1.3.5-1
  • +
  • ros-iron-ur-client-library-dbgsym: 1.3.4-1 → 1.3.5-1
  • +
+

Removed Packages [0]:

+

Thanks to all ROS maintainers who make packages available to the ROS community. The above list of packages was made possible by the work of the following maintainers:

+
    +
  • Bence Magyar
  • +
  • Bernd Pfrommer
  • +
  • Felix Exner
  • +
  • Jean-Pierre Busch
  • +
  • Jose-Luis Blanco-Claraco
  • +
  • Luis Camero
  • +
  • Mathieu Labbe
  • +
  • Olivier Kermorgant
  • +
  • Rob Fisher
  • +
  • Vladimir Ermakov
  • +
+

1 post - 1 participant

+

Read full topic

+
+ 2024-03-11T01:54:48Z + 2024-03-11T01:54:48Z + + + Yadunund + + + + 2024-03-15T16:08:09Z + +
+ + + discourse.ros.org-topic-36532 + + ROS News for the Week of March 4th, 2024 +

ROS News for the Week of March 4th, 2024

+


+I’ve been working with the ROS Industrial team, and the Port of San Antonio, to put together a ROS Meetup in San Antonio / Austin in conjunction with the annual ROS Industrial Consortium Meeting. If you are attending the ROS-I meeting make sure you sign up!

+
+


+Gazebo Classic goes end of life in 2025! To help the community move over to modern Gazebo we’re holding open Gazebo office hours next Tuesday, March 12th, at 9am PST. If you have questions about the migration process please come by!

+
+

e1d28e85278dd4e221030828367839e4950b8cf9_2_671x500
+We often get questions about the “best” robot components for a particular application. I really hate answering these questions; my inner engineer just screams, “IT DEPENDS!” Unfortunately, we really don’t have a lot of apples-to-apples data to compare different hardware vendors.

+

Thankfully @iliao is putting in a ton of work to review ten different low cost LIDAR sensors. Check it out here.
+

+

teaser3
+This week we got a sneak peek at some of the cool CVPR 2024 papers. Check out, “Gaussian Splatting SLAM”, by Hidenobu Matsuki, Riku Murai, Paul H.J. Kelly, Andrew J. Davison, complete with source code.

+
+

1aa39368041ea4a73d78470ab0d7441453258cdf_2_353x500
+We got our new ROSCon France graphic this week! ROSCon France is currently accepting papers! Please consider applying if you speak French!

+

Events

+ +

News

+ +

ROS

+ +

ROS Questions

+

Please make ROS a better project for the next person! Take a moment to answer a question on Robotics Stack Exchange! Not your thing? Contribute to the ROS 2 Docs!

+

4 posts - 2 participants

+

Read full topic

+
Published 2024-03-08T21:50:00Z by Katherine_Scott
New packages for Humble Hawksbill 2024-03-08

Package Updates for Humble

+

Added Packages [13]:

+ +

Updated Packages [220]:

+
    +
  • ros-humble-ackermann-steering-controller: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-ackermann-steering-controller-dbgsym: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-admittance-controller: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-admittance-controller-dbgsym: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-apriltag-detector: 1.1.0-1 → 1.1.1-1
  • +
  • ros-humble-bicycle-steering-controller: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-bicycle-steering-controller-dbgsym: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-bno055: 0.4.1-1 → 0.5.0-1
  • +
  • ros-humble-camera-calibration: 3.0.3-1 → 3.0.4-1
  • +
  • ros-humble-caret-analyze: 0.5.0-1 → 0.5.0-2
  • +
  • ros-humble-cob-actions: 2.7.9-1 → 2.7.10-1
  • +
  • ros-humble-cob-actions-dbgsym: 2.7.9-1 → 2.7.10-1
  • +
  • ros-humble-cob-msgs: 2.7.9-1 → 2.7.10-1
  • +
  • ros-humble-cob-msgs-dbgsym: 2.7.9-1 → 2.7.10-1
  • +
  • ros-humble-cob-srvs: 2.7.9-1 → 2.7.10-1
  • +
  • ros-humble-cob-srvs-dbgsym: 2.7.9-1 → 2.7.10-1
  • +
  • ros-humble-controller-interface: 2.39.1-1 → 2.40.0-1
  • +
  • ros-humble-controller-interface-dbgsym: 2.39.1-1 → 2.40.0-1
  • +
  • ros-humble-controller-manager: 2.39.1-1 → 2.40.0-1
  • +
  • ros-humble-controller-manager-dbgsym: 2.39.1-1 → 2.40.0-1
  • +
  • ros-humble-controller-manager-msgs: 2.39.1-1 → 2.40.0-1
  • +
  • ros-humble-controller-manager-msgs-dbgsym: 2.39.1-1 → 2.40.0-1
  • +
  • ros-humble-dataspeed-dbw-common: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dataspeed-ulc: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dataspeed-ulc-can: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dataspeed-ulc-can-dbgsym: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dataspeed-ulc-msgs: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dataspeed-ulc-msgs-dbgsym: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-fca: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-fca-can: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-fca-can-dbgsym: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-fca-description: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-fca-joystick-demo: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-fca-joystick-demo-dbgsym: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-fca-msgs: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-fca-msgs-dbgsym: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-ford: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-ford-can: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-ford-can-dbgsym: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-ford-description: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-ford-joystick-demo: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-ford-joystick-demo-dbgsym: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-ford-msgs: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-ford-msgs-dbgsym: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-polaris: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-polaris-can: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-polaris-can-dbgsym: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-polaris-description: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-polaris-joystick-demo: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-polaris-joystick-demo-dbgsym: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-polaris-msgs: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-polaris-msgs-dbgsym: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-depth-image-proc: 3.0.3-1 → 3.0.4-1
  • +
  • ros-humble-depth-image-proc-dbgsym: 3.0.3-1 → 3.0.4-1
  • +
  • ros-humble-diff-drive-controller: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-diff-drive-controller-dbgsym: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-draco-point-cloud-transport: 1.0.9-1 → 1.0.10-1
  • +
  • ros-humble-draco-point-cloud-transport-dbgsym: 1.0.9-1 → 1.0.10-1
  • +
  • ros-humble-effort-controllers: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-effort-controllers-dbgsym: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-etsi-its-cam-coding: 2.0.0-1 → 2.0.1-1
  • +
  • ros-humble-etsi-its-cam-coding-dbgsym: 2.0.0-1 → 2.0.1-1
  • +
  • ros-humble-etsi-its-cam-conversion: 2.0.0-1 → 2.0.1-1
  • +
  • ros-humble-etsi-its-cam-msgs: 2.0.0-1 → 2.0.1-1
  • +
  • ros-humble-etsi-its-cam-msgs-dbgsym: 2.0.0-1 → 2.0.1-1
  • +
  • ros-humble-etsi-its-coding: 2.0.0-1 → 2.0.1-1
  • +
  • ros-humble-etsi-its-conversion: 2.0.0-1 → 2.0.1-1
  • +
  • ros-humble-etsi-its-conversion-dbgsym: 2.0.0-1 → 2.0.1-1
  • +
  • ros-humble-etsi-its-denm-coding: 2.0.0-1 → 2.0.1-1
  • +
  • ros-humble-etsi-its-denm-coding-dbgsym: 2.0.0-1 → 2.0.1-1
  • +
  • ros-humble-etsi-its-denm-conversion: 2.0.0-1 → 2.0.1-1
  • +
  • ros-humble-etsi-its-denm-msgs: 2.0.0-1 → 2.0.1-1
  • +
  • ros-humble-etsi-its-denm-msgs-dbgsym: 2.0.0-1 → 2.0.1-1
  • +
  • ros-humble-etsi-its-messages: 2.0.0-1 → 2.0.1-1
  • +
  • ros-humble-etsi-its-msgs: 2.0.0-1 → 2.0.1-1
  • +
  • ros-humble-etsi-its-msgs-utils: 2.0.0-1 → 2.0.1-1
  • +
  • ros-humble-etsi-its-primitives-conversion: 2.0.0-1 → 2.0.1-1
  • +
  • ros-humble-etsi-its-rviz-plugins: 2.0.0-1 → 2.0.1-1
  • +
  • ros-humble-etsi-its-rviz-plugins-dbgsym: 2.0.0-1 → 2.0.1-1
  • +
  • ros-humble-flir-camera-description: 2.0.8-2 → 2.0.8-3
  • +
  • ros-humble-flir-camera-msgs: 2.0.8-2 → 2.0.8-3
  • +
  • ros-humble-flir-camera-msgs-dbgsym: 2.0.8-2 → 2.0.8-3
  • +
  • ros-humble-force-torque-sensor-broadcaster: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-force-torque-sensor-broadcaster-dbgsym: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-forward-command-controller: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-forward-command-controller-dbgsym: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-gripper-controllers: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-gripper-controllers-dbgsym: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-hardware-interface: 2.39.1-1 → 2.40.0-1
  • +
  • ros-humble-hardware-interface-dbgsym: 2.39.1-1 → 2.40.0-1
  • +
  • ros-humble-hardware-interface-testing: 2.39.1-1 → 2.40.0-1
  • +
  • ros-humble-hardware-interface-testing-dbgsym: 2.39.1-1 → 2.40.0-1
  • +
  • ros-humble-image-pipeline: 3.0.3-1 → 3.0.4-1
  • +
  • ros-humble-image-proc: 3.0.3-1 → 3.0.4-1
  • +
  • ros-humble-image-proc-dbgsym: 3.0.3-1 → 3.0.4-1
  • +
  • ros-humble-image-publisher: 3.0.3-1 → 3.0.4-1
  • +
  • ros-humble-image-publisher-dbgsym: 3.0.3-1 → 3.0.4-1
  • +
  • ros-humble-image-rotate: 3.0.3-1 → 3.0.4-1
  • +
  • ros-humble-image-rotate-dbgsym: 3.0.3-1 → 3.0.4-1
  • +
  • ros-humble-image-view: 3.0.3-1 → 3.0.4-1
  • +
  • ros-humble-image-view-dbgsym: 3.0.3-1 → 3.0.4-1
  • +
  • ros-humble-imu-sensor-broadcaster: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-imu-sensor-broadcaster-dbgsym: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-joint-limits: 2.39.1-1 → 2.40.0-1
  • +
  • ros-humble-joint-limits-dbgsym: 2.39.1-1 → 2.40.0-1
  • +
  • ros-humble-joint-state-broadcaster: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-joint-state-broadcaster-dbgsym: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-joint-trajectory-controller: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-joint-trajectory-controller-dbgsym: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-kinematics-interface: 0.2.0-1 → 0.3.0-1
  • +
  • ros-humble-kinematics-interface-kdl: 0.2.0-1 → 0.3.0-1
  • +
  • ros-humble-kinematics-interface-kdl-dbgsym: 0.2.0-1 → 0.3.0-1
  • +
  • ros-humble-launch-pal: 0.0.16-1 → 0.0.18-1
  • +
  • ros-humble-libmavconn: 2.6.0-1 → 2.7.0-1
  • +
  • ros-humble-libmavconn-dbgsym: 2.6.0-1 → 2.7.0-1
  • +
  • ros-humble-mavlink: 2023.9.9-1 → 2024.3.3-1
  • +
  • ros-humble-mavros: 2.6.0-1 → 2.7.0-1
  • +
  • ros-humble-mavros-dbgsym: 2.6.0-1 → 2.7.0-1
  • +
  • ros-humble-mavros-extras: 2.6.0-1 → 2.7.0-1
  • +
  • ros-humble-mavros-extras-dbgsym: 2.6.0-1 → 2.7.0-1
  • +
  • ros-humble-mavros-msgs: 2.6.0-1 → 2.7.0-1
  • +
  • ros-humble-mavros-msgs-dbgsym: 2.6.0-1 → 2.7.0-1
  • +
  • ros-humble-mrpt2: 2.11.9-1 → 2.11.11-1
  • +
  • ros-humble-mrpt2-dbgsym: 2.11.9-1 → 2.11.11-1
  • +
  • ros-humble-mvsim: 0.8.3-1 → 0.9.1-1
  • +
  • ros-humble-mvsim-dbgsym: 0.8.3-1 → 0.9.1-1
  • +
  • ros-humble-ntrip-client: 1.2.0-1 → 1.3.0-1
  • +
  • ros-humble-play-motion2: 0.0.13-1 → 1.0.0-1
  • +
  • ros-humble-play-motion2-dbgsym: 0.0.13-1 → 1.0.0-1
  • +
  • ros-humble-play-motion2-msgs: 0.0.13-1 → 1.0.0-1
  • +
  • ros-humble-play-motion2-msgs-dbgsym: 0.0.13-1 → 1.0.0-1
  • +
  • ros-humble-plotjuggler: 3.9.0-1 → 3.9.1-1
  • +
  • ros-humble-plotjuggler-dbgsym: 3.9.0-1 → 3.9.1-1
  • +
  • ros-humble-pmb2-2dnav: 4.0.9-1 → 4.0.12-1
  • +
  • ros-humble-pmb2-bringup: 5.0.15-1 → 5.0.16-1
  • +
  • ros-humble-pmb2-controller-configuration: 5.0.15-1 → 5.0.16-1
  • +
  • ros-humble-pmb2-description: 5.0.15-1 → 5.0.16-1
  • +
  • ros-humble-pmb2-laser-sensors: 4.0.9-1 → 4.0.12-1
  • +
  • ros-humble-pmb2-maps: 4.0.9-1 → 4.0.12-1
  • +
  • ros-humble-pmb2-navigation: 4.0.9-1 → 4.0.12-1
  • +
  • ros-humble-pmb2-robot: 5.0.15-1 → 5.0.16-1
  • +
  • ros-humble-point-cloud-interfaces: 1.0.9-1 → 1.0.10-1
  • +
  • ros-humble-point-cloud-interfaces-dbgsym: 1.0.9-1 → 1.0.10-1
  • +
  • ros-humble-point-cloud-transport: 1.0.15-1 → 1.0.16-1
  • +
  • ros-humble-point-cloud-transport-dbgsym: 1.0.15-1 → 1.0.16-1
  • +
  • ros-humble-point-cloud-transport-plugins: 1.0.9-1 → 1.0.10-1
  • +
  • ros-humble-point-cloud-transport-py: 1.0.15-1 → 1.0.16-1
  • +
  • ros-humble-position-controllers: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-position-controllers-dbgsym: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-psdk-interfaces: 1.0.0-1 → 1.1.0-1
  • +
  • ros-humble-psdk-interfaces-dbgsym: 1.0.0-1 → 1.1.0-1
  • +
  • ros-humble-psdk-wrapper: 1.0.0-1 → 1.1.0-1
  • +
  • ros-humble-psdk-wrapper-dbgsym: 1.0.0-1 → 1.1.0-1
  • +
  • ros-humble-range-sensor-broadcaster: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-range-sensor-broadcaster-dbgsym: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-ros2-control: 2.39.1-1 → 2.40.0-1
  • +
  • ros-humble-ros2-control-test-assets: 2.39.1-1 → 2.40.0-1
  • +
  • ros-humble-ros2-controllers: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-ros2-controllers-test-nodes: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-ros2caret: 0.5.0-2 → 0.5.0-6
  • +
  • ros-humble-ros2controlcli: 2.39.1-1 → 2.40.0-1
  • +
  • ros-humble-rqt-controller-manager: 2.39.1-1 → 2.40.0-1
  • +
  • ros-humble-rqt-gauges: 0.0.1-1 → 0.0.2-1
  • +
  • ros-humble-rqt-joint-trajectory-controller: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-rtabmap: 0.21.3-1 → 0.21.4-1
  • +
  • ros-humble-rtabmap-conversions: 0.21.3-1 → 0.21.4-2
  • +
  • ros-humble-rtabmap-conversions-dbgsym: 0.21.3-1 → 0.21.4-2
  • +
  • ros-humble-rtabmap-dbgsym: 0.21.3-1 → 0.21.4-1
  • +
  • ros-humble-rtabmap-demos: 0.21.3-1 → 0.21.4-2
  • +
  • ros-humble-rtabmap-examples: 0.21.3-1 → 0.21.4-2
  • +
  • ros-humble-rtabmap-launch: 0.21.3-1 → 0.21.4-2
  • +
  • ros-humble-rtabmap-msgs: 0.21.3-1 → 0.21.4-2
  • +
  • ros-humble-rtabmap-msgs-dbgsym: 0.21.3-1 → 0.21.4-2
  • +
  • ros-humble-rtabmap-odom: 0.21.3-1 → 0.21.4-2
  • +
  • ros-humble-rtabmap-odom-dbgsym: 0.21.3-1 → 0.21.4-2
  • +
  • ros-humble-rtabmap-python: 0.21.3-1 → 0.21.4-2
  • +
  • ros-humble-rtabmap-ros: 0.21.3-1 → 0.21.4-2
  • +
  • ros-humble-rtabmap-rviz-plugins: 0.21.3-1 → 0.21.4-2
  • +
  • ros-humble-rtabmap-rviz-plugins-dbgsym: 0.21.3-1 → 0.21.4-2
  • +
  • ros-humble-rtabmap-slam: 0.21.3-1 → 0.21.4-2
  • +
  • ros-humble-rtabmap-slam-dbgsym: 0.21.3-1 → 0.21.4-2
  • +
  • ros-humble-rtabmap-sync: 0.21.3-1 → 0.21.4-2
  • +
  • ros-humble-rtabmap-sync-dbgsym: 0.21.3-1 → 0.21.4-2
  • +
  • ros-humble-rtabmap-util: 0.21.3-1 → 0.21.4-2
  • +
  • ros-humble-rtabmap-util-dbgsym: 0.21.3-1 → 0.21.4-2
  • +
  • ros-humble-rtabmap-viz: 0.21.3-1 → 0.21.4-2
  • +
  • ros-humble-rtabmap-viz-dbgsym: 0.21.3-1 → 0.21.4-2
  • +
  • ros-humble-simple-launch: 1.9.0-1 → 1.9.1-1
  • +
  • ros-humble-spinnaker-camera-driver: 2.0.8-2 → 2.0.8-3
  • +
  • ros-humble-spinnaker-camera-driver-dbgsym: 2.0.8-2 → 2.0.8-3
  • +
  • ros-humble-steering-controllers-library: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-steering-controllers-library-dbgsym: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-stereo-image-proc: 3.0.3-1 → 3.0.4-1
  • +
  • ros-humble-stereo-image-proc-dbgsym: 3.0.3-1 → 3.0.4-1
  • +
  • ros-humble-tiago-2dnav: 4.0.9-1 → 4.0.12-1
  • +
  • ros-humble-tiago-bringup: 4.1.2-1 → 4.2.3-1
  • +
  • ros-humble-tiago-controller-configuration: 4.1.2-1 → 4.2.3-1
  • +
  • ros-humble-tiago-description: 4.1.2-1 → 4.2.3-1
  • +
  • ros-humble-tiago-gazebo: 4.0.8-1 → 4.1.0-1
  • +
  • ros-humble-tiago-laser-sensors: 4.0.9-1 → 4.0.12-1
  • +
  • ros-humble-tiago-moveit-config: 3.0.7-1 → 3.0.10-1
  • +
  • ros-humble-tiago-navigation: 4.0.9-1 → 4.0.12-1
  • +
  • ros-humble-tiago-robot: 4.1.2-1 → 4.2.3-1
  • +
  • ros-humble-tiago-simulation: 4.0.8-1 → 4.1.0-1
  • +
  • ros-humble-tracetools-image-pipeline: 3.0.3-1 → 3.0.4-1
  • +
  • ros-humble-tracetools-image-pipeline-dbgsym: 3.0.3-1 → 3.0.4-1
  • +
  • ros-humble-transmission-interface: 2.39.1-1 → 2.40.0-1
  • +
  • ros-humble-transmission-interface-dbgsym: 2.39.1-1 → 2.40.0-1
  • +
  • ros-humble-tricycle-controller: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-tricycle-controller-dbgsym: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-tricycle-steering-controller: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-tricycle-steering-controller-dbgsym: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-ur-client-library: 1.3.4-1 → 1.3.5-1
  • +
  • ros-humble-ur-client-library-dbgsym: 1.3.4-1 → 1.3.5-1
  • +
  • ros-humble-velocity-controllers: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-velocity-controllers-dbgsym: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-zlib-point-cloud-transport: 1.0.9-1 → 1.0.10-1
  • +
  • ros-humble-zlib-point-cloud-transport-dbgsym: 1.0.9-1 → 1.0.10-1
  • +
  • ros-humble-zstd-point-cloud-transport: 1.0.9-1 → 1.0.10-1
  • +
  • ros-humble-zstd-point-cloud-transport-dbgsym: 1.0.9-1 → 1.0.10-1
  • +
+

Removed Packages [2]:

+ +

Thanks to all ROS maintainers who make packages available to the ROS community. The above list of packages was made possible by the work of the following maintainers:

  • Alejandro Hernandez Cordero
  • Alejandro Hernández
  • Bence Magyar
  • Bernd Pfrommer
  • Bianca Bendris
  • Boeing
  • Davide Faconti
  • Denis Štogl
  • Eloy Bricneo
  • Felix Exner
  • Felix Messmer
  • Jean-Pierre Busch
  • Jordan Palacios
  • Jordi Pages
  • Jose-Luis Blanco-Claraco
  • Kevin Hallenbeck
  • Luis Camero
  • Martin Pecka
  • Mathieu Labbe
  • Micho Radovnikovich
  • Noel Jimenez
  • Olivier Kermorgant
  • Rob Fisher
  • TIAGo PAL support team
  • Vincent Rabaud
  • Vladimir Ermakov
  • Víctor Mayoral-Vilches
  • flynneva
  • ymski

1 post - 1 participant

+

Read full topic

+
Published 2024-03-08T16:36:12Z by audrow
ROS1: Now is a great time to add `catkin_lint` to your packages!

catkin_lint is an established ROS package that can run lots of useful checks on your CMakeLists.txt and package.xml. For example, it can warn you about dependencies that do not match between package.xml and CMakeLists.txt, check that all rosdep keys in package.xml exist, watch that all executable files in your package get installed, warn you about the most common misuses of CMake, and recently it even gained the ability to warn you when you use a CMake feature that is newer than the version you’ve put in cmake_minimum_required(). And there’s much more.

+

Personally, as a maintainer, I feel much more comfortable releasing a new version of a package once I see catkin_lint passed without complaints.

+

Until recently, automatically running catkin_lint tests on packages released via the buildfarm was problematic because the buildfarm does not initialize the rosdep cache, which catkin_lint needed to work. The recently released version 1.6.22 of catkin_lint no longer fails in this case: it runs all tests that do not require rosdep on the buildfarm and disables those that do (currently only the check that package.xml keys point to existing packages).

+

Adding automatic catkin_lint to your package is easy!

+

CMakeLists.txt:

+
if (CATKIN_ENABLE_TESTING)
  find_package(roslint REQUIRED)
  roslint_custom(catkin_lint "-W2" .)
  roslint_add_test()
endif()

package.xml:

+
<test_depend>python3-catkin-lint</test_depend>
<test_depend>roslint</test_depend>

And that’s it!

+

If you want to run the test locally, you can either invoke catkin_lint . manually in your package directory, or run make roslint in the build directory.

+

And if you’re okay with some warnings catkin_lint gives you, you can always ignore them either for a single line (#catkin_lint: ignore_once duplicate_find) or globally by adding arguments to the catkin_lint call (catkin_lint -W2 --ignore duplicate_find .).

+
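If you settle on ignoring a warning permanently, the extra arguments can probably be baked into the CMake call itself rather than remembered as CLI flags. This is a sketch, not verified against every roslint version: it assumes roslint_custom forwards all extra arguments to the linter command, and duplicate_find is only an example warning ID.

```cmake
# Sketch (assumption: roslint_custom passes extra args through to catkin_lint).
if (CATKIN_ENABLE_TESTING)
  find_package(roslint REQUIRED)
  roslint_custom(catkin_lint "-W2" "--ignore" "duplicate_find" .)
  roslint_add_test()
endif()
```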

Of course, the catkin_lint automation should not substitute for manual runs of the tool before releasing a new version of your package. It is a good habit to run catkin_lint after you finish editing your build files. With the automation built in, however, you get the assurance that even if you forget to run the tool manually, the buildfarm will let you know.

+

You can see examples of catkin_lint used on buildfarm-released packages e.g. in our ROS utils stack: ros-utils/cras_topic_tools/CMakeLists.txt at master · ctu-vras/ros-utils · GitHub. Or scroll down on rosdep System Dependency: python3-catkin-lint to see all the others.

+
+

NB: I’m not the developer of catkin_lint. @roehling at FKIE is doing all of the awesome work!

+
+

NB2: When you’re at it, also have a look at:

+
find_package(roslaunch REQUIRED)
roslaunch_add_file_check(launch IGNORE_UNSET_ARGS)

and

+
<test_depend>roslaunch</test_depend>

1 post - 1 participant

+

Read full topic

+
Published 2024-03-08T10:28:13Z by peci1
Cobot Magic: Mobile Aloha system works on AgileX Robotics platform

Introduction

+

AgileX Cobot Magic is a system based on Mobile ALOHA that can simultaneously remotely control the TRACER mobile base and robotic arms.

+

(Video: watering flowers)

+

Story

+

Mobile ALOHA is a whole-body teleoperation data collection system developed by Zipeng Fu, Tony Z. Zhao, and Chelsea Finn from Stanford University. Its hardware is based on two robotic arms (ViperX 300), equipped with two wrist cameras and one top camera, and a Tracer differential-drive mobile base from AgileX Robotics. Data collected using Mobile ALOHA, combined with supervised behavior cloning and joint training with existing static ALOHA datasets, can improve the performance of mobile manipulation tasks: with 50 demonstrations per task, joint training can increase the success rate by 90%. Mobile ALOHA can autonomously perform complex mobile manipulation tasks such as cooking and opening doors. Special thanks to the Stanford research team of Zipeng Fu, Tony Z. Zhao, and Chelsea Finn for their research on Mobile ALOHA and its fully open-source implementation. For more details about this project, please check the link.

+

Based on Mobile ALOHA, AgileX developed Cobot Magic, which runs the complete Mobile ALOHA code with a higher-spec configuration at lower cost, and is equipped with larger-payload robotic arms and a high-compute industrial computer. For more details about Cobot Magic, please check the AgileX website.

+

Cobot Magic is equipped with an indoor differential-drive AGV base, a high-performance robotic arm, an industrial-grade computer, and other components. It helps users make better use of open-source hardware and the Mobile ALOHA deep-learning framework for robotics, covering a wide range of tasks from simple pick-and-place operations to more intricate actions such as pouring, cooking, riding elevators, and organizing items.

+

+

Currently, AgileX has successfully completed the integration of Cobot Magic based on the Mobile ALOHA source code. It covers the entire pipeline: data collection, data replay, data visualization, demonstration mode, model training, inference, and so on. This project introduces AgileX Cobot Magic and will provide ongoing updates on the training progress of mobile manipulation tasks.

+

Hardware configuration

+

Here is the list of hardware in AgileX Cobot Magic.

+
Component              | Item Name                      | Model
Standard Configuration | Wheeled Mobile Robot           | AgileX Tracer
                       | Deep Camera ×3                 | Orbbec Dabai
                       | USB Hub                        | 12V Power Supply, 7-port
                       | 6 DOF Lightweight Robot Arm ×4 | Customized by AgileX
                       | Adjustable Velcro ×2           | Customized by AgileX
                       | Grip Tape ×2                   | Customized by AgileX
                       | Power Strip                    | 4 Outlets, 1.8m
                       | Mobile Power Station           | 1000W
                       | ALOHA Stand                    | Customized by AgileX
Optional Configuration | Nano Development Kit           | Jetson Orin Nano Developer Kit (8G)
                       | Industrial PC                  | APQ-X7010 / GPU 4060 / i7-9700-32g-4T
                       | IMU                            | CH110
                       | Display                        | 11.6" 1080p

Note: An IPC is required. Users can choose between the Nano Development Kit and the APQ-X7010 IPC.

+

Software configuration

+

Local computer:

+

Ubuntu 20.04, CUDA 11.3.

+

Environment configuration:

+
# 1. Create python virtual environment
conda create -n aloha python=3.8

# 2. Activate
conda activate aloha

# 3. Install cuda and torch
pip install torch==1.11.0+cu113 torchvision==0.12.0+cu113 torchaudio==0.11.0 --extra-index-url https://download.pytorch.org/whl/cu113

# 4. Install act and detr
## 4.1 Get the act code
git clone https://github.com/agilexrobotics/act-plus-plus.git
cd act-plus-plus

## 4.2 Install other dependencies
pip install -r requirements.txt

## 4.3 Install detr
cd detr && pip install -v -e .

Simulated environment datasets

+

You can find all scripted/human demos for the simulated environments here.

+

After downloading, copy it to the act-plus-plus/data directory. The directory structure is as follows:

+
act-plus-plus/data
    ├── sim_insertion_human
    │   ├── sim_insertion_human-20240110T054847Z-001.zip
    │   ├── ...
    ├── sim_insertion_scripted
    │   ├── sim_insertion_scripted-20240110T054854Z-001.zip
    │   ├── ...
    ├── sim_transfer_cube_human
    │   ├── sim_transfer_cube_human-20240110T054900Z-001.zip
    │   ├── ...
    └── sim_transfer_cube_scripted
        ├── sim_transfer_cube_scripted-20240110T054901Z-001.zip
        ├── ...
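A quick sanity check before training can save a failed run. This sketch is a hypothetical helper (not part of act-plus-plus) that simply verifies the four dataset directories above exist under the data directory:

```python
from pathlib import Path

EXPECTED_DIRS = [
    "sim_insertion_human",
    "sim_insertion_scripted",
    "sim_transfer_cube_human",
    "sim_transfer_cube_scripted",
]

def missing_datasets(data_root):
    """Return the expected dataset directories that are absent under data_root."""
    root = Path(data_root)
    return [name for name in EXPECTED_DIRS if not (root / name).is_dir()]

if __name__ == "__main__":
    missing = missing_datasets("act-plus-plus/data")
    if missing:
        print("Missing dataset directories:", ", ".join(missing))
    else:
        print("All simulated-environment datasets found.")
```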

Demonstration

+

By now it is widely accepted that learning a task from scratch, i.e., without any prior knowledge, is a daunting undertaking. Humans, however, rarely attempt to learn from scratch. They extract initial biases as well as strategies on how to approach a learning problem from instructions and/or demonstrations of other humans. This is what we call ‘programming by demonstration’ or ‘Imitation learning’.

+

The demonstration data usually contains decision trajectories {τ1, τ2, …, τm}. Each trajectory contains a state and action sequence:

τi = (s1, a1, s2, a2, …, sT, aT)

Extracting all “state–action pairs” across trajectories builds a new training set:

D = {(s, a)}
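The pooling step described above is mechanically simple; here is a minimal sketch with toy data (not the actual Mobile ALOHA data format):

```python
def pool_state_action_pairs(demonstrations):
    """Flatten a list of demonstration trajectories into one dataset of
    (state, action) training pairs, discarding trajectory boundaries."""
    dataset = []
    for trajectory in demonstrations:
        for state, action in trajectory:
            dataset.append((state, action))
    return dataset

# Two toy trajectories of (state, action) tuples.
demos = [
    [((0.0, 0.1), "open_gripper"), ((0.2, 0.1), "move_down")],
    [((0.5, 0.4), "move_down")],
]
print(len(pool_state_action_pairs(demos)))  # 3 pairs pooled from 2 trajectories
```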

Currently, AgileX Cobot Magic can already perform multiple whole-body action tasks.

+

Here we will show different action task demonstrations collected using AgileX Cobot Magic.

+

Watering flowers

+


+

Opening a box

+


+

Pouring rice

+


+

Twisting a bottle cap

+


+

Throwing rubbish

+


+

Using AgileX Cobot Magic, users can flexibly complete various everyday action tasks by teleoperating the teaching arms, from simple pick-and-place skills to more sophisticated ones such as twisting bottle caps. The mobile chassis frees the robotic arms from performing actions in a fixed place, and the 14 + 2 DOFs provide limitless potential for collecting diverse data.

+

Data Presentation

+

+

Below is the collected data from one demonstration with the AgileX Cobot Magic arms: the positions of all 14 joints at different time intervals.

+

+

Summary

+

Cobot Magic is a remote whole-body data collection device developed by AgileX Robotics based on the Mobile ALOHA project from Stanford University. With Cobot Magic, AgileX Robotics has successfully brought up the open-source Mobile ALOHA code from the Stanford laboratory on its own platform. Thanks to the Tracer mobile base, data collection is no longer limited to desktops or specific surfaces, which enhances the richness and diversity of collected data.

+

AgileX will continue to collect data from various motion tasks based on Cobot Magic for model training and inference. Please stay tuned for updates on GitHub.

+

About AgileX

+

Established in 2016, AgileX Robotics is a leading manufacturer of mobile robot platforms and a provider of unmanned system solutions. The company specializes in independently developed multi-mode wheeled and tracked wire-controlled chassis technology and has obtained multiple international certifications. AgileX Robotics offers users self-developed innovative application solutions such as autonomous driving, mobile grasping, and navigation positioning, helping users in various industries achieve automation. Additionally, AgileX Robotics has introduced research and education software and hardware products related to machine learning, embodied intelligence, and visual algorithms. The company works closely with research and educational institutions to promote robotics technology teaching and innovation.

+

Appendix

+

ros_astra_camera configuration

+

ros_astra_camera (GitHub) · ros_astra_camera (Gitee)

+

Camera Parameters

+
Name                   | Parameters
Baseline               | 40 mm
Depth distance         | 0.3–3 m
Depth map resolution   | 640x400 @ 30 fps, 320x200 @ 30 fps
Color image resolution | 1920x1080 @ 30 fps, 1280x720 @ 30 fps, 640x480 @ 30 fps
Accuracy               | 6 mm @ 1 m (81% FOV area in accuracy calculation)
Depth FOV              | H 67.9°, V 45.3°
Color FOV              | H 71°, V 43.7° @ 1920x1080
Delay                  | 30–45 ms
Data transmission      | USB 2.0 or above
Working temperature    | 10°C–40°C
Size                   | Length 59.5 × Width 17.4 × Thickness 11.1 mm
1. OrbbecSDK_ROS1 driver installation
# 1 Install dependencies
sudo apt install libgflags-dev ros-$ROS_DISTRO-image-geometry ros-$ROS_DISTRO-camera-info-manager ros-$ROS_DISTRO-image-transport ros-$ROS_DISTRO-image-publisher ros-$ROS_DISTRO-libuvc-ros libgoogle-glog-dev libusb-1.0-0-dev libeigen3-dev

# 2 Download the code
## 2.1 GitHub
git clone https://github.com/orbbec/OrbbecSDK_ROS1.git astra_ws/src
## 2.2 Gitee (Chinese region)
git clone https://gitee.com/orbbecdeveloper/OrbbecSDK_ROS1 -b v1.4.6 astra_ws/src

# 3 Compile orbbec_camera
## 3.1 Enter the astra_ws workspace
cd astra_ws
## 3.2 Compile
catkin_make

# 4 Install udev rules
source devel/setup.bash && rospack list
roscd orbbec_camera/scripts
sudo cp 99-obsensor-libusb.rules /etc/udev/rules.d/99-obsensor-libusb.rules
sudo udevadm control --reload && sudo udevadm trigger

# 5 Add ros_astra_camera package environment variables
## 5.1 Enter astra_ws
cd astra_ws
## 5.2 Add environment variables (sourced in every new shell)
echo "source $(pwd)/devel/setup.bash" >> ~/.bashrc

# 6 Launch
## If step 5 was skipped, run 6.2 in every new shell so the workspace environment takes effect.
## 6.1 Enter astra_ws
cd astra_ws
## 6.2 Source the workspace
source devel/setup.bash
## 6.3 Launch astra.launch
roslaunch orbbec_camera astra.launch
## 6.4 Launch dabai.launch
roslaunch orbbec_camera dabai.launch
2. Configure multiple orbbec_camera camera nodes

① Check the device serial number

+

● After installing the camera, run the following code

+
rosrun orbbec_camera list_devices_node | grep -i serial

● The output in the terminal

+
[ INFO] [1709728787.207920484]: serial: AU1P32201SA
# Please record this serial number. Each camera corresponds to a unique serial number.
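When wiring up several cameras it helps to collect all serial numbers programmatically. This is a hypothetical helper sketch that parses the `serial:` lines from the node's log output:

```python
import re

def extract_serials(log_text):
    """Collect camera serial numbers from log lines like
    '[ INFO] [...]: serial: AU1P32201SA'."""
    return re.findall(r"serial:\s*(\S+)", log_text)

log = "[ INFO] [1709728787.207920484]: serial: AU1P32201SA\n"
print(extract_serials(log))  # ['AU1P32201SA']
```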

② Configure multiple camera nodes

+

● Cobot Magic uses three Orbbec Dabai cameras, so each camera node must be configured with the corresponding camera's serial number.

+

● Plug the USB cables of the three cameras into the industrial PC and run the command from step ① to list the serial numbers of the three cameras.

+

● In order to clarify the topics corresponding to each camera in subsequent development, please fill in the Serial number in order.

+

● Create the multi_dabai.launch file in the astra_ws/src/launch directory with the following content:

+
<!-- Mainly modify: 1 camera name prefixes, 2 serial numbers -->
<launch>
    <arg name="camera_name" default="camera"/>
    <arg name="3d_sensor" default="dabai"/>

    <!-- 1 Camera name prefixes -->
    <arg name="camera1_prefix" default="01"/>
    <arg name="camera2_prefix" default="02"/>
    <arg name="camera3_prefix" default="03"/>

    <!-- 2 Serial numbers: fill in each camera's serial number -->
    <arg name="camera1_usb_port" default="camera1 serial number"/>
    <arg name="camera2_usb_port" default="camera2 serial number"/>
    <arg name="camera3_usb_port" default="camera3 serial number"/>

    <arg name="device_num" default="3"/>

    <include file="$(find orbbec_camera)/launch/$(arg 3d_sensor).launch">
        <arg name="camera_name" value="$(arg camera_name)_$(arg camera1_prefix)"/>
        <arg name="usb_port" value="$(arg camera1_usb_port)"/>
        <arg name="device_num" value="$(arg device_num)"/>
    </include>

    <include file="$(find orbbec_camera)/launch/$(arg 3d_sensor).launch">
        <arg name="camera_name" value="$(arg camera_name)_$(arg camera2_prefix)"/>
        <arg name="usb_port" value="$(arg camera2_usb_port)"/>
        <arg name="device_num" value="$(arg device_num)"/>
    </include>

    <include file="$(find orbbec_camera)/launch/$(arg 3d_sensor).launch">
        <arg name="camera_name" value="$(arg camera_name)_$(arg camera3_prefix)"/>
        <arg name="usb_port" value="$(arg camera3_usb_port)"/>
        <arg name="device_num" value="$(arg device_num)"/>
    </include>
</launch>
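Filling three serial numbers into the launch file by hand is error-prone. This is a hypothetical generator sketch that mirrors the template above and renders a multi-camera launch file from a list of serials:

```python
# Sketch: generate a multi-camera launch file from camera serial numbers.
# The template mirrors the hand-written multi_dabai.launch; it is an
# illustration, not an official OrbbecSDK_ROS1 tool.
TEMPLATE = """<launch>
    <arg name="camera_name" default="camera"/>
    <arg name="3d_sensor" default="dabai"/>
    <arg name="device_num" default="{n}"/>
{includes}</launch>
"""

INCLUDE = """    <include file="$(find orbbec_camera)/launch/$(arg 3d_sensor).launch">
        <arg name="camera_name" value="$(arg camera_name)_{idx:02d}"/>
        <arg name="usb_port" value="{serial}"/>
        <arg name="device_num" value="$(arg device_num)"/>
    </include>
"""

def render_multi_camera_launch(serials):
    """Render one <include> block per camera serial number."""
    includes = "".join(
        INCLUDE.format(idx=i + 1, serial=s) for i, s in enumerate(serials)
    )
    return TEMPLATE.format(n=len(serials), includes=includes)

if __name__ == "__main__":
    print(render_multi_camera_launch(["AU1P32201SA", "AU1P32202SB", "AU1P32203SC"]))
```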

● Add permissions

+
# 1 Enter orbbec_camera/launch/
roscd orbbec_camera/launch/

# 2 Make multi_dabai.launch executable
chmod +x multi_dabai.launch
+

● Launch ros

+
roslaunch orbbec_camera multi_dabai.launch

1 post - 1 participant

+

Read full topic

+
Published 2024-03-08T02:01:53Z by Agilex_Robotics
New Packages for Noetic 2024-03-07

We’re happy to announce 10 new packages and 46 updates are now available in ROS Noetic. This sync was tagged as noetic/2024-03-07.

+

Thank you to every maintainer and contributor who made these updates available!

+

Package Updates for ROS Noetic

+

Added Packages [10]:

+ +

Updated Packages [46]:

+
    +
  • ros-noetic-cras-cpp-common: 2.3.8-1 → 2.3.9-1
  • +
  • ros-noetic-cras-docs-common: 2.3.8-1 → 2.3.9-1
  • +
  • ros-noetic-cras-py-common: 2.3.8-1 → 2.3.9-1
  • +
  • ros-noetic-cras-topic-tools: 2.3.8-1 → 2.3.9-1
  • +
  • ros-noetic-etsi-its-cam-coding: 2.0.0-1 → 2.0.1-1
  • +
  • ros-noetic-etsi-its-cam-conversion: 2.0.0-1 → 2.0.1-1
  • +
  • ros-noetic-etsi-its-cam-msgs: 2.0.0-1 → 2.0.1-1
  • +
  • ros-noetic-etsi-its-coding: 2.0.0-1 → 2.0.1-1
  • +
  • ros-noetic-etsi-its-conversion: 2.0.0-1 → 2.0.1-1
  • +
  • ros-noetic-etsi-its-denm-coding: 2.0.0-1 → 2.0.1-1
  • +
  • ros-noetic-etsi-its-denm-conversion: 2.0.0-1 → 2.0.1-1
  • +
  • ros-noetic-etsi-its-denm-msgs: 2.0.0-1 → 2.0.1-1
  • +
  • ros-noetic-etsi-its-messages: 2.0.0-1 → 2.0.1-1
  • +
  • ros-noetic-etsi-its-msgs: 2.0.0-1 → 2.0.1-1
  • +
  • ros-noetic-etsi-its-msgs-utils: 2.0.0-1 → 2.0.1-1
  • +
  • ros-noetic-etsi-its-primitives-conversion: 2.0.0-1 → 2.0.1-1
  • +
  • ros-noetic-gnss-info: 1.0.1-1 → 1.0.2-1
  • +
  • ros-noetic-gnss-info-msgs: 1.0.1-1 → 1.0.2-1
  • +
  • ros-noetic-gnsstk-ros: 1.0.1-1 → 1.0.2-1
  • +
  • ros-noetic-image-transport-codecs: 2.3.8-1 → 2.3.9-1
  • +
  • ros-noetic-libmavconn: 1.17.0-1 → 1.18.0-1
  • +
  • ros-noetic-mavlink: 2023.9.9-1 → 2024.3.3-1
  • +
  • ros-noetic-mavros: 1.17.0-1 → 1.18.0-1
  • +
  • ros-noetic-mavros-extras: 1.17.0-1 → 1.18.0-1
  • +
  • ros-noetic-mavros-msgs: 1.17.0-1 → 1.18.0-1
  • +
  • ros-noetic-mrpt2: 2.11.9-1 → 2.11.11-1
  • +
  • ros-noetic-mvsim: 0.8.3-1 → 0.9.1-2
  • +
  • ros-noetic-ntrip-client: 1.2.0-1 → 1.3.0-1
  • +
  • ros-noetic-rtabmap: 0.21.3-1 → 0.21.4-1
  • +
  • ros-noetic-rtabmap-conversions: 0.21.3-4 → 0.21.4-1
  • +
  • ros-noetic-rtabmap-costmap-plugins: 0.21.3-4 → 0.21.4-1
  • +
  • ros-noetic-rtabmap-demos: 0.21.3-4 → 0.21.4-1
  • +
  • ros-noetic-rtabmap-examples: 0.21.3-4 → 0.21.4-1
  • +
  • ros-noetic-rtabmap-launch: 0.21.3-4 → 0.21.4-1
  • +
  • ros-noetic-rtabmap-legacy: 0.21.3-4 → 0.21.4-1
  • +
  • ros-noetic-rtabmap-msgs: 0.21.3-4 → 0.21.4-1
  • +
  • ros-noetic-rtabmap-odom: 0.21.3-4 → 0.21.4-1
  • +
  • ros-noetic-rtabmap-python: 0.21.3-4 → 0.21.4-1
  • +
  • ros-noetic-rtabmap-ros: 0.21.3-4 → 0.21.4-1
  • +
  • ros-noetic-rtabmap-rviz-plugins: 0.21.3-4 → 0.21.4-1
  • +
  • ros-noetic-rtabmap-slam: 0.21.3-4 → 0.21.4-1
  • +
  • ros-noetic-rtabmap-sync: 0.21.3-4 → 0.21.4-1
  • +
  • ros-noetic-rtabmap-util: 0.21.3-4 → 0.21.4-1
  • +
  • ros-noetic-rtabmap-viz: 0.21.3-4 → 0.21.4-1
  • +
  • ros-noetic-test-mavros: 1.17.0-1 → 1.18.0-1
  • +
  • ros-noetic-ur-client-library: 1.3.4-1 → 1.3.5-1
  • +
+

Removed Packages [0]:

+

Thanks to all ROS maintainers who make packages available to the ROS community. The above list of packages was made possible by the work of the following maintainers:

+
    +
  • Felix Exner
  • +
  • Florian Weisshardt
  • +
  • Jean-Pierre Busch
  • +
  • Jose-Luis Blanco-Claraco
  • +
  • Martin Pecka
  • +
  • Mathieu Labbe
  • +
  • Rob Fisher
  • +
  • Robert Haschke
  • +
  • Vladimir Ermakov
  • +
+

2 posts - 2 participants

+

Read full topic

+
+ 2024-03-08T01:28:07Z + 2024-03-08T01:28:07Z + + + sloretz + + + + 2024-03-15T16:08:09Z + +
+ + + discourse.ros.org-topic-36455 + + Interoperability Interest Group March 7, 2024: Standardizing Infrastructure Messages, Part 3 +

Community Page

+

Meeting Link

+

Calendar Link

+

Continuing our discussion from the last session, our next session will get into more depth on how errors for building infrastructure devices should be represented.

+

Some questions to consider:

+
    +
  • What level of detail needs to be standardized for error messages? +
      +
  • Is it enough to simply communicate that the device is unusable?
    • +
    • Should the standardized error messages also provide enough information for a technician to troubleshoot the device?
    • +
    • Should detailed troubleshooting information be provided through a separate non-standard channel instead?
    • +
    +
  • +
  • How efficient should error messages be? +
      +
    • A simple error code is high performance and allows for millions of possible error types but then can only communicate the presence of one error at a time
    • +
    • Bitsets could express multiple simultaneous errors with high performance but then the number of error types is severely limited
    • +
    • Dynamic arrays of error codes can communicate many types of errors with no limit but then heap allocations are needed
    • +
    • A string of serialized JSON could represent unlimited types of errors and provide troubleshooting information for them, but then heap allocation and string parsing are needed
    • +
    +
  • +
  • Should standardized error definitions be specific to each type of building device, or should the definitions be abstract enough to use across all/multiple devices? +
      +
    • E.g. are doors and elevators different enough that they need their own error code definitions?
    • +
    • What kind of errors should we expect to report for each different type of device?
    • +
    +
  • +
+
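The bitset-versus-array trade-off above can be sketched in a few lines of Python (the DoorError flags and numeric codes below are invented for illustration, not part of any proposed standard):

```python
from enum import IntFlag

# Hypothetical door-device error flags packed into a bitset:
# each error type costs one bit, so a fixed-width integer caps the
# number of distinct error types but encodes any combination of them.
class DoorError(IntFlag):
    NONE = 0
    OBSTRUCTED = 1 << 0
    MOTOR_STALL = 1 << 1
    SENSOR_FAULT = 1 << 2
    COMMS_TIMEOUT = 1 << 3

status = DoorError.OBSTRUCTED | DoorError.COMMS_TIMEOUT

# A subscriber can test individual errors without any parsing:
assert DoorError.OBSTRUCTED in status
assert DoorError.MOTOR_STALL not in status

# The dynamic-array alternative: unlimited distinct codes and
# multiple simultaneous errors, but it needs a heap allocation.
error_codes = [1001, 4005]  # invented vendor-specific numeric codes
assert 1001 in error_codes
```

An `IntFlag`-style field fits in one fixed-width integer on the wire, so it avoids heap allocation but caps the number of distinct error types at the field's bit width; the list form lifts that cap at the cost of a dynamic allocation.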

We will be seeking input on all of the above questions and more. Please come armed with examples of your most hated device errors that you think a good standard should be able to express.

+

4 posts - 3 participants

+

Read full topic

+
+ 2024-03-04T15:23:04Z + 2024-03-04T15:23:04Z + + + grey + + + + 2024-03-15T16:08:09Z + +
+ + + discourse.ros.org-topic-36452 + + ROS Mapping and Navigation with AgileX Robotics Limo +

Limo is a smart educational robot developed by AgileX Robotics. For more details, please visit: https://global.agilex.ai/
+

+

Four steering modes make LIMO substantially superior to other robots in its class. The available modes are: Omni-Wheel Steering, Tracked Steering, Four-Wheel Differential Steering and Ackermann Steering. These advanced steering modes plus a built-in 360° scanning LiDAR and RealSense infrared camera make the platform perfect for industrial and commercial tasks in any scenario. With these incredible features, LIMO can achieve precise self-localization, SLAM mapping, route planning and autonomous obstacle avoidance, reverse parking, traffic light recognition, and more.
+

+

Mapping

+

Gmapping

+

Gmapping is a widely adopted open-source SLAM algorithm that operates within the filtering SLAM framework. It makes effective use of wheel odometry data and does not demand high-frequency laser LiDAR scans. When mapping a smaller environment, Gmapping requires minimal computational resources while maintaining high accuracy. Here, the ROS-packaged Gmapping node is used to perform mapping with the Limo.

+

Note: Limo should move slowly while mapping. If it moves too fast, the quality of the map will suffer.

+

Run the command in a new terminal. It launches the LiDAR.

+
 roslaunch limo_bringup limo_start.launch pub_odom_tf:=false
+
+

Then launch the gmapping algorithm. Open another new terminal, and enter the command:

+
roslaunch limo_bringup limo_gmapping.launch
+
+

After launching successfully, the rviz visualization tool will start up. The interface is shown in the figure.
+

+

At this point, the gamepad can be switched to remote-control mode to drive Limo around while it maps.

+

After building the map, run the following command to save the map to the specified directory:

+
    +
  1. Switch to the directory where the map should be saved, ~/agilex_ws/src/limo_ros/limo_bringup/maps/, by entering this command in the terminal:
  2. +
+
cd ~/agilex_ws/src/limo_ros/limo_bringup/maps/
+
+
    +
  2. After switching to ~/agilex_ws/src/limo_ros/limo_bringup/maps/, continue by entering this command in the terminal:
  2. +
+
rosrun map_server map_saver -f map1
+
+

Note: map1 is the name of the saved map, and duplicate names should be avoided when saving the map.

+

Cartographer

+

Cartographer is a set of SLAM algorithms based on graph optimization launched by Google. The main goal of the algorithm is to achieve real-time SLAM with low computing-resource consumption. The algorithm is divided into two parts. The first part, called Local SLAM, builds and maintains a series of submaps from each frame of the laser scan, where a submap is a series of grid maps. The second part, called Global SLAM, performs loop-closure detection to eliminate accumulated error: once a submap is finished, no new laser scans are inserted into it, and the algorithm adds it to the loop-closure detection.

+

Note: Before running the command, please make sure that the programs in other terminals have been terminated. The termination command is: Ctrl+c.

+

Note: Limo should move slowly while mapping. If it moves too fast, the quality of the map will suffer.

+

Launch a new terminal and enter the command:

+
roslaunch limo_bringup limo_start.launch pub_odom_tf:=false
+
+

Then start the cartographer mapping algorithm. Open another new terminal and enter the command:

+
roslaunch limo_bringup limo_cartographer.launch
+
+

After launching successfully, the rviz visualization interface will be shown as the figure below:
+

+

After building the map, it is necessary to save it. Three following commands need to be entered in the terminal:

+

(1)After completing the trajectory, no further data should be accepted.

+
rosservice call /finish_trajectory 0
+
+

(2)Serialize and save its current state.

+
rosservice call /write_state "{filename: '${HOME}/agilex_ws/src/limo_ros/limo_bringup/maps/mymap.pbstream'}"
+
+

(3)Convert pbstream to pgm and yaml

+
rosrun cartographer_ros cartographer_pbstream_to_ros_map -map_filestem=${HOME}/agilex_ws/src/limo_ros/limo_bringup/maps/mymap.pbstream -pbstream_filename=${HOME}/agilex_ws/src/limo_ros/limo_bringup/maps/mymap.pbstream -resolution=0.05
+
+

The corresponding pgm and yaml files are generated alongside the pbstream file in:

+

${HOME}/agilex_ws/src/limo_ros/limo_bringup/maps/

+

Note: During mapping, some warnings may appear in the terminal. They are caused by excessive speed and the resulting delayed data processing, and can safely be ignored.
+

+

Navigation

+

Navigation framework

+

The key to navigation is robot positioning and path planning. For these, ROS provides the following two packages.

+

(1) move_base: performs optimal path planning for robot navigation.

+

(2) amcl: performs robot localization on a two-dimensional map.

+

On the basis of the above two packages, ROS provides a complete navigation framework.
+


+The robot only needs to publish the necessary sensor data and a navigation goal, and ROS completes the navigation. In this framework, the move_base package provides the main runtime and interaction interface for navigation. To keep the planned path accurate, the robot must also localize itself precisely; this part of the functionality is implemented by the amcl package.

+

1.1 Move_base package

+

move_base is a package for path planning in ROS, which is mainly composed of the following two planners.

+

(1) Global path planning (global_planner). Global path planning computes an overall path from a given goal position and the global map. In navigation, the Dijkstra or A* algorithm is used to calculate the optimal route from the robot to the goal, which becomes the robot's global path.

+

(2) Local real-time planning (local_planner). In practice, robots often cannot strictly follow the global path, so the path to travel in each cycle must be planned from the map information and from obstacles that may appear near the robot at any time, while conforming to the global optimal path as much as possible.

+

1.2 Amcl package

+

Autonomous positioning means that the robot can compute its position on the map in any state. For this, ROS provides developers with amcl, an adaptive (KLD-sampling) Monte Carlo localization package: a probabilistic system that localizes a mobile robot in 2D by using a particle filter to track the robot's pose against a known map.

+
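The particle-filter idea behind amcl can be sketched in a few lines of plain Python (a toy 1-D corridor example with made-up noise parameters, not the amcl implementation):

```python
import math
import random

random.seed(0)

# A robot moves along a 1-D corridor and takes a noisy range
# measurement to a wall at x = 10; particles track its position.
WALL = 10.0
true_x = 2.0
particles = [random.uniform(0.0, 10.0) for _ in range(500)]

def measure(x):
    return WALL - x  # ideal range reading from position x

for _ in range(5):
    # Motion update: robot moves +1.0 m; particles get motion noise.
    true_x += 1.0
    particles = [p + 1.0 + random.gauss(0.0, 0.1) for p in particles]

    # Measurement update: weight each particle by the likelihood
    # of the actual (noisy) range reading given that particle.
    z = measure(true_x) + random.gauss(0.0, 0.2)
    weights = [math.exp(-0.5 * ((measure(p) - z) / 0.2) ** 2)
               for p in particles]

    # Resampling: draw a new particle set proportional to the weights.
    particles = random.choices(particles, weights=weights, k=len(particles))

estimate = sum(particles) / len(particles)
print(round(estimate, 2), round(true_x, 2))  # estimate should track true_x
```

amcl adds the "adaptive" part on top of this loop: it resizes the particle set on the fly (KLD sampling) so that a well-localized robot uses far fewer particles than a lost one.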

1.3 Introduction of DWA_planner and TEB_planner

+
DWA_planner
+

DWA stands for Dynamic Window Approach. The algorithm samples multiple candidate trajectories, selects the optimal one based on evaluation criteria (whether it would hit an obstacle, the time required, etc.), and computes the linear and angular velocities for the next driving cycle so as to avoid collisions with dynamic obstacles.

+
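A toy sketch of the dynamic-window idea (the goal, obstacle, weights, and sampling ranges here are invented for illustration; this is not the ROS dwa_local_planner API):

```python
import math

# Sample (v, w) velocity pairs, roll each one out briefly, and score
# the resulting trajectory by goal heading, clearance, and speed.
GOAL = (2.0, 1.0)
OBSTACLE = (1.0, 0.0)
DT, HORIZON = 0.1, 10  # 1-second rollout

def rollout(v, w):
    """Forward-simulate a constant (v, w) command from the origin."""
    x = y = th = 0.0
    for _ in range(HORIZON):
        x += v * math.cos(th) * DT
        y += v * math.sin(th) * DT
        th += w * DT
    return x, y, th

def score(v, w):
    x, y, th = rollout(v, w)
    clearance = math.hypot(x - OBSTACLE[0], y - OBSTACLE[1])
    if clearance < 0.2:
        return -math.inf  # trajectory would hit the obstacle
    # Reward pointing at the goal, keeping clear, and moving fast.
    heading = -abs(math.atan2(GOAL[1] - y, GOAL[0] - x) - th)
    return 2.0 * heading + 0.5 * clearance + 0.1 * v

# The "window": velocities reachable within one control cycle.
candidates = [(v / 10.0, w / 10.0 - 0.5)
              for v in range(1, 11) for w in range(0, 11)]
best_v, best_w = max(candidates, key=lambda c: score(*c))
print(best_v, best_w)
```

With the obstacle straight ahead and the goal up and to the left, the best-scoring command turns left (positive angular velocity), while the straight-ahead command is rejected outright for colliding.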
TEB_planner
+

TEB stands for Timed Elastic Band local planner. This method modifies the initial trajectory generated by the global path planner to optimize the robot's motion trajectory, and it belongs to local path planning. During trajectory optimization, the algorithm pursues multiple objectives, including but not limited to: overall path length, trajectory running time, distance to obstacles, passing through intermediate waypoints, and compliance with the robot's dynamics, kinematics, and geometric constraints. The TEB method explicitly considers dynamic constraints in time and space during motion, for example limits on the robot's velocity and acceleration.

+

Limo navigation

+

Note: In four-wheel differential mode, omni-wheel mode, and tracked mode, the same launch file is used for navigation.

+

Note: Before running the command, please make sure that the programs in other terminals have been terminated. The termination command is: Ctrl+c.

+

(1)First launch the LiDAR and enter the command in the terminal:

+
roslaunch limo_bringup limo_start.launch pub_odom_tf:=false
+
+

(2)Launch the navigation and enter the command in the terminal:

+
roslaunch limo_bringup limo_navigation_diff.launch
+
+

Note: If it is Ackermann motion mode, please run:

+
roslaunch limo_bringup limo_navigation_ackerman.launch
+
+

After launching successfully, the rviz interface will be shown in the figure below:
+

+

Note: If you need to customize the opened map, please open the limo_navigation_diff.launch file to modify the parameters. The file directory is: ~/agilex_ws/src/limo_ros/limo_bringup/launch. Please modify map02 to the name of the map that needs to be replaced.

+

+

(3) After launching the navigation, you may observe that the laser scan does not align with the map, requiring manual correction. To rectify this, use the rviz tools to give the vehicle an approximate initial pose matching the chassis's actual position in the scene. Then use the gamepad to remotely rotate the vehicle until the laser scan automatically aligns with the map; once the laser shape overlaps the scene shape on the map, the correction is complete. The operational steps are outlined as follows:
+

+

The correction is completed:

+

+

(4)Set the navigation goal point through 2D Nav Goal.
+

+

A purple path will be generated on the map. Switch the gamepad to command mode, and Limo will automatically navigate to the goal point.

+

+

Limo path inspection

+

(1)First launch the LiDAR and enter the command in the terminal:

+
roslaunch limo_bringup limo_start.launch pub_odom_tf:=false
+
+

(2)Launch the navigation and enter the command in the terminal:

+
roslaunch limo_bringup limo_navigation_diff.launch
+
+

Note: If it is Ackermann motion mode, please run:

+
roslaunch limo_bringup limo_navigation_ackerman.launch
+
+

(3)Launch the path recording function. Open a new terminal, and enter the command in the terminal:

+
roslaunch agilex_pure_pursuit record_path.launch
+
+

After the path recording is completed, terminate the path-recording program by pressing Ctrl+c in its terminal.

+

(4)Launch the path inspection function. Open a new terminal, and enter the command in the terminal:

+

Note: Switch the gamepad to command mode.

+
roslaunch agilex_pure_pursuit pure_pursuit.launch
+
+

2 posts - 2 participants

+

Read full topic

+
+ 2024-03-04T07:28:26Z + 2024-03-04T07:28:26Z + + + Agilex_Robotics + + + + 2024-03-15T16:08:09Z + +
+ + + discourse.ros.org-topic-36439 + + ROS Meetup Arab +

We’re excited to introduce the forthcoming installment of our Arabian Meet series, centered around the captivating theme of “Autonomous Racing: Advancing the Frontiers of Automated Technology.”

+

The topics we’ll explore include :

+
    +
  • Introduction to Autonomous Racing.
  • +
  • Autonomous Racing Competitions.
  • +
  • Racing Cars & Sensor Technologies.
  • +
  • ROS-Based Racing Simulator.
  • +
  • Autonomous Racing Software Architecture.
  • +
+

Stay tuned for more updates and save the date for this enlightening conversation! :spiral_calendar:

+

Save the date on the calendar:

+

+You can find the meeting link here:

+ +

+

1 post - 1 participant

+

Read full topic

+
+ 2024-03-03T07:16:59Z + 2024-03-03T07:16:59Z + + + khaledgabr77 + + + + 2024-03-15T16:08:09Z + +
+ + + discourse.ros.org-topic-36426 + + Potential Humanoid Robotics Monthly Working Group +

Hi Everyone,

+

I want to introduce myself: my name is Ronaldson Bellande. I'm a PhD student and the founder/CEO/CTO/director of research organizations and a startup I'm working on. You can find more information about me on my LinkedIn and GitHub profiles.

+

I wanted to create a working group that would meet monthly to discuss humanoid robotics: what everyone is working on, what you are looking for and excited about, and anything else in the space of humanoid robotics.

+

If there is interest, I will start a working group; I'm passionate about this subject and the related activities I'm constantly involved in.

+

13 posts - 8 participants

+

Read full topic

+
+ 2024-03-02T01:57:09Z + 2024-03-02T01:57:09Z + + + RonaldsonBellande + + + + 2024-03-15T16:08:09Z + +
+ + + discourse.ros.org-topic-36367 + + ROS News for the Week of February 26th, 2024 +

ROS News for the Week of February 26th, 2024

+
+

belt2

+

In manufacturing I’ve seen talented people do things with clever light placement that transform an extremely difficult computer vision task into something that’s easily solved. I came across this paper this week that does just that for the robotic manipulation of objects. The paper is titled, “Dynamics-Guided Diffusion Model for Robot Manipulator Design” and the authors use diffusion models to make simple grippers that can manipulate a specific object into a given pose. The results are pretty cool and could be very useful for any roboticist with a 3D printer.

+
+


+Amazon is putting up US$1B to fund startups that combine robotics and “AI.” While regular startup investment has fallen off a bit, it looks like there are still funding opportunities for robotics companies.

+
+


+Last week everyone was talking about how NVIDIA’s market cap had hit US$2T. According to this LinkedIn post they are putting that money to good use by funding the development of the open source Nav2 project.

+
+


+Cross-sensor calibration is a pain in the :peach:. A good robot model can only get you so far, and getting a bunch of sensor data to match up can be difficult for even the most seasoned engineers. The GitHub repository below attempts to build a toolbox to fix some of these problems: LVT2Calib: Automatic and Unified Extrinsic Calibration Toolbox for Different 3D LiDAR, Visual Camera and Thermal Camera (paper)

+

Events

+ +

News

+ +

ROS

+ +

ROS Questions

+

+

Got a minute? Please take a moment to answer a question on Robotics Stack Exchange and help out your fellow ROS users.

+

1 post - 1 participant

+

Read full topic

+
+ 2024-03-01T18:15:56Z + 2024-03-01T18:15:56Z + + + Katherine_Scott + + + + 2024-03-15T16:08:09Z + +
+ + + discourse.ros.org-topic-36406 + + Revival of client library working group? +

Hi,
+are there any plans to revive this working group?

+

15 posts - 5 participants

+

Read full topic

+
+ 2024-03-01T16:19:00Z + 2024-03-01T16:19:00Z + + + JM_ROS + + + + 2024-03-15T16:08:10Z + +
+ + + discourse.ros.org-topic-36399 + + Scalability issues with large number of nodes +

My team and I are developing a mobile platform for industrial tasks (such as rivet fastening or drilling), fully based on the ROS 2 stack (Humble).

+

The stack comprises a bunch of nodes for different tasks (SLAM, motion planning, fiducial registration…) that are coordinated through a state machine node (based on smach).

+

The issue we are facing is that the state machine node (which is connected to most of the nodes in the stack) gets slower and slower until it stops receiving events from other nodes.

+

We’ve been debugging this issue, and our feeling is that the number of objects (nodes/clients/subscribers…) is too high and the whole stack suffers a lot of overhead, most noticeably in the “biggest” node (the state machine).

+

Our stack has 80 nodes and a total of 1505 objects:

+
    +
  • Stack clients: 198
  • +
  • Stack services: 636
  • +
  • Stack publishers: 236
  • +
  • Stack subscribers: 173
  • +
+

My questions are:

+
    +
  • Is this number of nodes too high for an industrial robotics project? How large are ROS 2 projects usually?
  • +
  • What is the maximum number of objects in the stack? Is this an rmw limitation or a limit of ROS 2 itself?
  • +
+

30 posts - 14 participants

+

Read full topic

+
+ 2024-03-01T09:35:36Z + 2024-03-01T09:35:36Z + + + leander2189 + + + + 2024-03-15T16:08:09Z + +
+ + + discourse.ros.org-topic-36330 + + Robot Fleet Management: Make vs. Buy? An Alternative +

Virtually every robotics CTO we’ve spoken to has told us about this dilemma about fleet management systems: neither “make” nor “buy” are great options! With Transitive we are providing an alternative.

+ + +

8 posts - 5 participants

+

Read full topic

+
+ 2024-02-26T23:00:34Z + 2024-02-26T23:00:34Z + + + chfritz + + + + 2024-03-15T16:08:10Z + +
+ + + discourse.ros.org-topic-36319 + + Rclcpp template metaprogramming bug. Help wanted +

Hi,
+we hit a bug in the function traits that is out of my league.
+If you are really good with template metaprogramming, please have a look at:

+

+Thanks.

+

1 post - 1 participant

+

Read full topic

+
+ 2024-02-26T09:54:21Z + 2024-02-26T09:54:21Z + + + JM_ROS + + + + 2024-03-15T16:08:10Z + +
+ + + discourse.ros.org-topic-36297 + + ROS News for the Week of February 19th, 2024 +

ROS News for the Week of February 19th, 2024

+


+Open Robotics will be participating in Google Summer of Code 2024. We’re looking for a few interns to help us out! See the post for all the details.

+
+

+

Our next Gazebo Community meeting is next Wednesday, February 28th. Sikiru Salau, a competitor in the Pan-African Robotics Competition, will be joining us to talk about simulating robots for agriculture.

+
+

image
+Hello Robot is having a great month! Last week they released their third gen robot. This week they are at the top of the orange website with this “OK Robot” paper from NYU

+
+


+Check out the AutoNav robot by Jatin Patil. Hats off to the developer, this is a really well put together personal project!

+
+


+Just a reminder: Gazebo Classic goes End-Of-Life in January 2025 and ROS 2 Jazzy will not support Gazebo Classic. We put together some guidance for those of you that need to make the switch!

+

Events

+ +

News

+ +

ROS

+ +

ROS Questions

+

Got a minute to spare? Pay it forward by answering a few ROS questions on Robotics Stack Exchange.

+

3 posts - 3 participants

+

Read full topic

+
+ 2024-02-23T23:13:39Z + 2024-02-23T23:13:39Z + + + Katherine_Scott + + + + 2024-03-15T16:08:09Z + +
+ + + discourse.ros.org-topic-36283 + + New Packages for Iron Irwini 2024-02-23 +

We’re happy to announce 2 new packages and 75 updates are now available in ROS 2 Iron Irwini :iron: :irwini: . This sync was tagged as iron/2024-02-23 .

+

Package Updates for iron

+

Added Packages [2]:

+
    +
  • ros-iron-apriltag-detector: 1.2.0-1
  • +
  • ros-iron-multidim-rrt-planner: 0.0.8-1
  • +
+

Updated Packages [75]:

+ +

Removed Packages [0]:

+

Thanks to all ROS maintainers who make packages available to the ROS community. The above list of packages was made possible by the work of the following maintainers:

+
    +
  • Alejandro Hernandez Cordero
  • +
  • Alejandro Hernández
  • +
  • Alvin Sun
  • +
  • Bence Magyar
  • +
  • Bernd Pfrommer
  • +
  • Brandon Ong
  • +
  • Davide Faconti
  • +
  • Denis Štogl
  • +
  • Dharini Dutia
  • +
  • Eloy Bricneo
  • +
  • Fictionlab
  • +
  • John Wason
  • +
  • Jose-Luis Blanco-Claraco
  • +
  • Martin Pecka
  • +
  • Tim Clephas
  • +
  • david
  • +
  • flynneva
  • +
+

1 post - 1 participant

+

Read full topic

+
+ 2024-02-23T10:35:02Z + 2024-02-23T10:35:02Z + + + Yadunund + + + + 2024-03-15T16:08:09Z + +
+ + + discourse.ros.org-topic-36275 + + New packages and patch release for Humble Hawksbill 2024-02-22 +

We’re happy to announce a new Humble release!

+

This sync brings several new packages and some updates to ROS 2 core packages. (I’m not including the project board because it was empty.)

+
+

Package Updates for Humble

+

Added Packages [42]:

+
    +
  • ros-humble-apriltag-detector: 1.1.0-1
  • +
  • ros-humble-as2-gazebo-assets: 1.0.8-1
  • +
  • ros-humble-as2-gazebo-assets-dbgsym: 1.0.8-1
  • +
  • ros-humble-as2-platform-dji-osdk: 1.0.8-1
  • +
  • ros-humble-as2-platform-dji-osdk-dbgsym: 1.0.8-1
  • +
  • ros-humble-as2-platform-gazebo: 1.0.8-1
  • +
  • ros-humble-as2-platform-gazebo-dbgsym: 1.0.8-1
  • +
  • ros-humble-caret-analyze: 0.5.0-1
  • +
  • ros-humble-caret-msgs: 0.5.0-6
  • +
  • ros-humble-caret-msgs-dbgsym: 0.5.0-6
  • +
  • ros-humble-data-tamer-cpp: 0.9.3-2
  • +
  • ros-humble-data-tamer-cpp-dbgsym: 0.9.3-2
  • +
  • ros-humble-data-tamer-msgs: 0.9.3-2
  • +
  • ros-humble-data-tamer-msgs-dbgsym: 0.9.3-2
  • +
  • ros-humble-hardware-interface-testing: 2.39.1-1
  • +
  • ros-humble-hardware-interface-testing-dbgsym: 2.39.1-1
  • +
  • ros-humble-hri-msgs: 2.0.0-1
  • +
  • ros-humble-hri-msgs-dbgsym: 2.0.0-1
  • +
  • ros-humble-mocap4r2-dummy-driver: 0.0.7-1
  • +
  • ros-humble-mocap4r2-dummy-driver-dbgsym: 0.0.7-1
  • +
  • ros-humble-mocap4r2-marker-viz: 0.0.7-1
  • +
  • ros-humble-mocap4r2-marker-viz-dbgsym: 0.0.7-1
  • +
  • ros-humble-mocap4r2-marker-viz-srvs: 0.0.7-1
  • +
  • ros-humble-mocap4r2-marker-viz-srvs-dbgsym: 0.0.7-1
  • +
  • ros-humble-motion-capture-tracking: 1.0.3-1
  • +
  • ros-humble-motion-capture-tracking-dbgsym: 1.0.3-1
  • +
  • ros-humble-motion-capture-tracking-interfaces: 1.0.3-1
  • +
  • ros-humble-motion-capture-tracking-interfaces-dbgsym: 1.0.3-1
  • +
  • ros-humble-psdk-interfaces: 1.0.0-1
  • +
  • ros-humble-psdk-interfaces-dbgsym: 1.0.0-1
  • +
  • ros-humble-psdk-wrapper: 1.0.0-1
  • +
  • ros-humble-psdk-wrapper-dbgsym: 1.0.0-1
  • +
  • ros-humble-qb-softhand-industry-description: 2.1.2-4
  • +
  • ros-humble-qb-softhand-industry-msgs: 2.1.2-4
  • +
  • ros-humble-qb-softhand-industry-msgs-dbgsym: 2.1.2-4
  • +
  • ros-humble-qb-softhand-industry-ros2-control: 2.1.2-4
  • +
  • ros-humble-qb-softhand-industry-ros2-control-dbgsym: 2.1.2-4
  • +
  • ros-humble-qb-softhand-industry-srvs: 2.1.2-4
  • +
  • ros-humble-qb-softhand-industry-srvs-dbgsym: 2.1.2-4
  • +
  • ros-humble-ros2caret: 0.5.0-2
  • +
  • ros-humble-sync-parameter-server: 1.0.1-2
  • +
  • ros-humble-sync-parameter-server-dbgsym: 1.0.1-2
  • +
+

Updated Packages [280]:

+ +

Removed Packages [4]:

+
    +
  • ros-humble-as2-ign-gazebo-assets
  • +
  • ros-humble-as2-ign-gazebo-assets-dbgsym
  • +
  • ros-humble-as2-platform-ign-gazebo
  • +
  • ros-humble-as2-platform-ign-gazebo-dbgsym
  • +
+

Thanks to all ROS maintainers who make packages available to the ROS community. The above list of packages was made possible by the work of the following maintainers:

+
    +
  • Adam Serafin
  • +
  • Aditya Pande
  • +
  • Alexey Merzlyakov
  • +
  • Alvin Sun
  • +
  • Bence Magyar
  • +
  • Bernd Pfrommer
  • +
  • Bianca Bendris
  • +
  • Brian Wilcox
  • +
  • CVAR-UPM
  • +
  • Carl Delsey
  • +
  • Carlos Orduno
  • +
  • Chris Lalancette
  • +
  • David V. Lu!!
  • +
  • Davide Faconti
  • +
  • Dirk Thomas
  • +
  • Dorian Scholz
  • +
  • Fictionlab
  • +
  • Francisco Martín
  • +
  • Francisco Martín Rico
  • +
  • Jacob Perron
  • +
  • John Wason
  • +
  • Jose-Luis Blanco-Claraco
  • +
  • Matej Vargovcik
  • +
  • Michael Jeronimo
  • +
  • Mohammad Haghighipanah
  • +
  • Paul Bovbel
  • +
  • Rob Fisher
  • +
  • Sachin Guruswamy
  • +
  • Shane Loretz
  • +
  • Steve Macenski
  • +
  • Support Team
  • +
  • Séverin Lemaignan
  • +
  • Tatsuro Sakaguchi
  • +
  • Vincent Rabaud
  • +
  • Víctor Mayoral-Vilches
  • +
  • Wolfgang Hönig
  • +
  • fmrico
  • +
  • rostest
  • +
  • sachin
  • +
  • steve
  • +
  • ymski
  • +
+

5 posts - 4 participants

+

Read full topic

+
+ 2024-02-23T05:00:30Z + 2024-02-23T05:00:30Z + + + audrow + + + + 2024-03-15T16:08:09Z + +
+ + + discourse.ros.org-topic-36274 + + GTC March 18-21 highlights for ROS & AI robotics +

NVIDIA GTC is happening live on March 18–21, with registration open for the event in San Jose, CA.

+

There are multiple inspiring robotics sessions following the kickoff with CEO Jensen Huang’s must-see keynote at the SAP Center, which will share the latest breakthroughs affecting every industry.

+

Listing some highlight robotics sessions, hands-on-labs, and developers sessions there are:

+

Hands-on training Labs

+
    +
  • DLIT61534 Elevate Your Robotics Game: Unleash High Performance with Isaac ROS & Isaac Sim
  • +
  • DLIT61899 Simulating Custom Robots: A Hands-On Lab Using Isaac Sim and ROS 2
  • +
  • DLIT61523 Unlocking Local LLM Inference with Jetson AGX Orin: A Hands-On Lab
  • +
  • DLIT61797 Training an Autonomous Mobile Race Car with Open USD and Isaac Sim
  • +
+

Jetson and Robotics Developer Day

+
    +
  • SE62934 Introduction to AI-Based Robot Development With Isaac ROS
  • +
  • SE62675 Meet Jetson: The Platform for Edge AI and Robotics
  • +
  • SE62933 Overview of Jetson Software and Developer Tools
  • +
+

Robotics focused sessions

+
    +
  • S63374 (Disney Research) Breathing Life into Disney’s Robotic Characters with Deep Reinforcement Learning
  • +
  • S62602 (Boston Dynamics) Come See an Unlocked Ecosystem in the Robotics World
  • +
  • S62315 (The AI Institute) Robotics and the Role of AI: Past, Present, and Future
  • +
  • S61182 (Google DeepMind) Robotics in the Age of Generative AI
  • +
  • S63034 Panel Discussion on the Impact of Generative AI on Robotics
  • +
+

This is a great opportunity to connect, learn, and share with industry luminaries, robotics companies, NVIDIA experts, and peers face-to-face.

+

Thanks.

+

1 post - 1 participant

+

Read full topic

+
+ 2024-02-23T04:43:51Z + 2024-02-23T04:43:51Z + + + ggrigor + + + + 2024-03-15T16:08:09Z + +
+
diff --git a/foafroll.xml b/foafroll.xml new file mode 100644 index 00000000..9abccd7e --- /dev/null +++ b/foafroll.xml @@ -0,0 +1,345 @@ + + + + Open Robotics + + + + Fawkes + + + + + + + + + + + + + William Woodall + + + wjwwood.github.io + + + + + + + + + + David Hodo + + + + + + + + + + + + + Michael Ferguson + + + + + + + + + + + + + ROS news + + + ROS robotics news + + + + + + + + + + mobotica + + + mobotica + + + + + + + + + + Achu Wilson + + + Achu's TechBlog + + + + + + + + + + ASL ETHZ + + + Kommentare zu: + + + + + + + + + + Robbie The Robot + + + Robbie The Robot + + + + + + + + + + NooTriX + + + nootrix + + + + + + + + + + ROS Industrial + + + Blog - ROS-Industrial + + + + + + + + + + Yujin R&D + + + + + + + + + + + + + Isaac Saito + + + ROS Jogger + + + + + + + + + + John Stowers + + + Johns Blog + + + + + + + + + + MobileWill + + + MobileWill + + + + + + + + + + MoveIt! + + + MoveIt Motion Planning Framework + + + + + + + + + + CAR: Components, Agents, and Robots with Dynamic Languages + + + + + + + + + + + + + Open Source Robotics Foundation + + + Open Robotics + + + + + + + + + + PAL Robotics blog + + + PAL Robotics Blog + + + + + + + + + + Sachin Chitta's Blog + + + + + + + + + + + + + TORK + + + Tokyo Opensource Robotics Kyokai Association + + + + + + + + + + Pi Robot + + + + + + + + + + + + + ROSVirtual + + + + + + + + + + + + + ROS Discourse General + + + General - ROS Discourse + + + + + + + + + + Mat Sadowski Blog + + + msadowski blog + + + + + + + + + + Robots For Robots + + + RobotsForRobots + + + + + + + + + diff --git a/images/Robot_Head_clip_art.svg b/images/Robot_Head_clip_art.svg new file mode 100644 index 00000000..c38294c0 --- /dev/null +++ b/images/Robot_Head_clip_art.svg @@ -0,0 +1,905 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + 
image/svg+xml + + robo + + + hrum + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/images/feed-icon-10x10.png b/images/feed-icon-10x10.png new file mode 100644 index 0000000000000000000000000000000000000000..cc869bc61785f4db646fcbbcfc87aa3d20d99eba GIT binary patch literal 469 zcmV;`0V@89P)b#`aw-kX_Si^Jc1|2c;v&L+N%#GTkbr^vx_L@0?2ue1vae8uy9 zW>j2Fwi~;wnv#w|%)>D{wT>10d6+ znvjX&#K$e}$~4lwrKocpr#p$RZpK^viI-7mZAGBOZfK!ocn0%!gWCL!dO5}Fox+@M z<8LitMCck7=WdtW+z{sJ1iSwiD)WlBvkT!CKn3HAn`5Jatl8=pf zWMv()t_|gz0w^mJ$i`l18o*uui>ycxWHsvP8s|%U9u%*Cx=p;;psT*)T^`{*rxCTS zWX}(wG=ZN^W3ms(Xv}CQKedl?b7Aoq{>2_>AN82ZL&^z8{|hhxfn}MuvYci literal 0 HcmV?d00001 diff --git a/images/feed-icon-plus.svg b/images/feed-icon-plus.svg new file mode 100644 index 00000000..4841f7db --- /dev/null +++ b/images/feed-icon-plus.svg @@ -0,0 +1,142 @@ + + + + + + image/svg+xml + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/images/feed-icon.svg b/images/feed-icon.svg new file mode 100644 index 00000000..ce172c26 --- /dev/null +++ b/images/feed-icon.svg @@ -0,0 +1,137 @@ + + + + + + image/svg+xml + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/images/planet_ros.svg b/images/planet_ros.svg new file mode 100644 index 00000000..e7c190c3 --- /dev/null +++ b/images/planet_ros.svg @@ -0,0 +1,153 @@ + + + + + + + + + + image/svg+xml + + + + + + + + + + + + + + + + + + + + + + + diff --git a/images/ros.ico b/images/ros.ico new file mode 100644 index 0000000000000000000000000000000000000000..0d8a03c50c4270fbaa99163f7423fafab257c7f5 GIT binary patch literal 1150 zcmb`HYe}D;kykbsjUZzgn+}>^fo^!_gLii=BGv0G{e((SJpXWShXM(VdK7M`z z^{WK!azR)r2toiogoyY!4T3=LUas&xd`_L$>%~CNNA$eygvafM)ExE&biHW9=up3e zz_b%T#?bk+1#>ggeExctdM53Un`zbLkAvQyM&ScuUyBC|FIKA z5iX~5iQl5bPRz5E+Zp$rk9lYSMZ4pnO^k%8%OGdcR#$^u`mO3TdtsZgaxS`@xLBNr z>`h^)p*=8tS#cqF-Ts90-gic1Ql0Y>6e!ppjoGP5 ze(w%#jX-YvdOUBemE)_gTtH5I7z&eAu$V_UpSR7TBsGz*&(?CgMb6nBFw;Gi_QW82 
a$IhFbgT&d$Jchn}=68?HrT>Ay9e)AFCQ~c` literal 0 HcmV?d00001 diff --git a/images/ros_logo.svg b/images/ros_logo.svg new file mode 100644 index 00000000..f808ee4d --- /dev/null +++ b/images/ros_logo.svg @@ -0,0 +1,28 @@ + + + + + + + + + + + + + + + + diff --git a/images/venus.png b/images/venus.png new file mode 100644 index 0000000000000000000000000000000000000000..685035deef15ff2e82bba825fa4d94a781870bb1 GIT binary patch literal 570 zcmV-A0>%A_P)f=0>XZ6dBBH zu-r7?W7Tx0jdLv4Og324uQR1kW5c{=f}swP!V^)*3W$ke+Io25tgezcoi)?#)=joP zxWQ-7N{1!=*6pRHQ|rwZG?|{-HJ3aOV@(D4qZ*fXDtq|qrPF!QdW*ZwcP_Hsv)pNM zuWfOH@$#vTE4xjmR~S8ibd#8Tg*6T1&%>DUNxRbVKUyi>)0hK{ + + + + + + + + + + + + + + Planet ROS + + + + + + + + + + + + + +
+ + +
+
+
+ +
+
+ +
+
+
+
+
+ +
+
+ + + + + + +
+ + + + + +
+
+
+ + +
+ + +
+ + + + +
+ + +
March 15, 2024
+ + + + + + + + +
+ + +
ROS News for the Week of March 11th, 2024
+ + + +
+ +
+

ROS News for the Week of March 11th, 2024

+
+


+The ROSCon 2024 call for talks and workshops is now open! We want your amazing talks! Also, the ROSCon Diversity Scholarship deadline is coming up!

+
+


+ROS By-The-Bay is next week. Open Robotics’ CEO @Vanessa_Yamzon_Orsi is dropping by to take your questions about the future of Open Robotics, and I recommend you swing by if you can. Just a heads up, we have to move to a different room on the other side of the complex; details are on Meetup.com.

+
+


+We’re planning a ROS Meetup in San Antonio on March 26th in conjunction with the ROS Industrial Consortium meeting. If you are in the area, or have colleagues in the region, please help us spread the word.

+
+


+We’ve lined up a phenomenal guest for our next Gazebo Community Meeting: Ji Zhang from Carnegie Mellon will be speaking about his work integrating ROS, Gazebo, and a variety of LIDAR-based SLAM techniques.

+

Events

+ +

News

+ +

ROS

+ +

ROS Questions

+

Got a minute? Please take some time to answer questions on Robotics Stack Exchange!

+

1 post - 1 participant

+

Read full topic

+ + + + + + + +
+

+by Katherine_Scott on March 15, 2024 03:33 PM +

+ +
+ + + + + + + + + +
+ + +
ROSCon 2024 Call for Proposals Now Open
+ + + +
+ +
+

ROSCon 2024 Call for Proposals

+

+

Hi Everyone,

+

The ROSCon call for proposals is now open! You can find full proposal details on the ROSCon website. ROSCon Workshop proposals are due by 2024-05-08T06:59:00Z UTC and can be submitted using this Google Form. ROSCon talks are due by 2024-06-04T06:59:00Z UTC and you can submit your proposals using Hot CRP. Please note that you’ll need a HotCRP account to submit your talk proposal. We plan to post the accepted workshops on or around 2024-07-08T07:00:00Z UTC and the accepted talks on or around 2024-07-15T07:00:00Z UTC respectively. If you think you will need financial assistance to attend ROSCon, and you meet the qualifications, please apply for our Diversity Scholarship Program as soon as possible. Diversity Scholarship applications are due on 2024-04-06T06:59:00Z UTC, well before the CFP deadlines or final speakers are announced. Questions and concerns about the ROSCon CFP can be directed to the ROSCon executive committee (roscon-2024-ec@openrobotics.org) or posted in this thread.

+

We recommend you start planning your talk early and take the time to workshop your submission with your friends and colleagues. You are more than welcome to use this Discourse thread and the #roscon-2024 channel on the ROS Discord to workshop ideas and organize collaborators.

+

Finally, I want to take a moment to recognize this year’s ROSCon Program Co-Chairs @Ingo_Lutkebohle and @Yadunund, along with a very long list of talk reviewers who are still being finalized. Reviewing talk proposals is a fairly tedious task, and ROSCon wouldn’t happen without the efforts of our volunteers. If you happen to run into any of them at ROSCon, please thank them for their service to the community.

+

Talk and Workshop Ideas for ROSCon 2024

+

If you’ve never been to ROSCon, but would like to submit a talk or workshop proposal, we recommend you take a look at the archive of previous ROSCon talks. Another good resource to consider is the set of frequently discussed topics on ROS Discourse and Robotics Stack Exchange. In last year’s metrics report I included a list of frequently asked topic tags from Robotics Stack Exchange that might be helpful. Aside from code, we really want to see your robots! We want to see your race cars, mining robots, moon landers, maritime robots, development boards, and factories, and hear about lessons you learned from making them happen. If you organize a working group, run a local meetup, or maintain a larger package, we want to hear about your big wins in the past year.

+

While we can suggest a few ideas for talks and workshops that we would like to see at ROSCon 2024, what we really want is to hear from the community about topic areas that you think are important. If there is a talk you would like to see at ROSCon 2024, consider proposing that topic in the comments below. Feel free to write a whole list! Some of our most memorable talks have been ten-minute overviews of key ROS subsystems that everyone uses. If you think a half-hour talk about writing a custom ROS 2 executor and benchmarking its performance would be helpful, please say so!

+

1 post - 1 participant

+

Read full topic

+ + + + + + + +
+

+by Katherine_Scott on March 15, 2024 03:19 PM +

+ +
+ + + + + + + + + +
+ + +
Cobot Magic: AgileX achieved the whole process of Mobile Aloha model training in both the simulation and real environment
+ + + +
+ +
+

Cobot Magic: AgileX achieved the whole process of Mobile Aloha model training in both the simulation and real environment

+

Mobile Aloha is a whole-body remote operation data collection system developed by Zipeng Fu, Tony Z. Zhao, and Chelsea Finn from Stanford University. link.

+

Based on Mobile Aloha, AgileX developed Cobot Magic, which runs the complete Mobile Aloha code with a higher-spec configuration at lower cost, and is equipped with higher-payload robotic arms and a high-compute industrial computer. For more details about Cobot Magic, please check the AgileX website.

+

Currently, AgileX has successfully completed the integration of Cobot Magic based on the Mobile Aloha source code project.
+Inference

+

Simulation data training

+

Data collection

+

After setting up the Mobile Aloha software environment (mentioned in the last section), model training can be performed in both the simulation environment and the real environment. The following is the data collection part for the simulation environment. The data is provided by the team of Zipeng Fu, Tony Z. Zhao, and Chelsea Finn. You can find all scripted/human demos for the simulated environments here.

+

After downloading, copy it to the act-plus-plus/data directory. The directory structure is as follows:

+
act-plus-plus/data
+    ├── sim_insertion_human
+    │   ├── sim_insertion_human-20240110T054847Z-001.zip
+        ├── ...
+    ├── sim_insertion_scripted
+    │   ├── sim_insertion_scripted-20240110T054854Z-001.zip
+        ├── ... 
+    ├── sim_transfer_cube_human
+    │   ├── sim_transfer_cube_human-20240110T054900Z-001.zip
+    │   ├── ...
+    └── sim_transfer_cube_scripted
+        ├── sim_transfer_cube_scripted-20240110T054901Z-001.zip
+        ├── ...
+
+

Generate episodes and render the result graph. The terminal output below shows 10 episodes, of which 2 were successful.

+
# 1 Run
+python3 record_sim_episodes.py --task_name sim_transfer_cube_scripted --dataset_dir <data save dir> --num_episodes 50
+
+# 2 Take sim_transfer_cube_scripted as an example
+python3 record_sim_episodes.py --task_name sim_transfer_cube_scripted --dataset_dir data/sim_transfer_cube_scripted --num_episodes 10
+
+# 2.1 Real-time rendering
+python3 record_sim_episodes.py --task_name sim_transfer_cube_scripted --dataset_dir data/sim_transfer_cube_scripted --num_episodes 10  --onscreen_render
+
+# 2.2 The output in the terminal shows
+ube_scripted --num_episodes 10
+episode_idx=0
+Rollout out EE space scripted policy
+episode_idx=0 Failed
+Replaying joint commands
+episode_idx=0 Failed
+Saving: 0.9 secs
+
+episode_idx=1
+Rollout out EE space scripted policy
+episode_idx=1 Successful, episode_return=57
+Replaying joint commands
+episode_idx=1 Successful, episode_return=59
+Saving: 0.6 secs
+...
+Saved to data/sim_transfer_cube_scripted
+Success: 2 / 10
+
+

The loaded image renders as follows:
+

+

Data Visualization

+

Visualize simulation data. The following figures show the images of episode0 and episode9 respectively.

+

The episode 0 screen in the data set is as follows, showing a case where the gripper fails to pick up.

+

episode0

+

The visualization of the episode 9 data shows a successful grasp.

+

episode19

+

Print the data of each joint of the robotic arms in the simulation environment. Joints 0-13 are the 14 degrees of freedom of the robot arms and grippers.

+

+

Model training and inference

+

The simulated-environment datasets must be downloaded first (see Data Collection)

+
python3 imitate_episodes.py --task_name sim_transfer_cube_scripted --ckpt_dir <ckpt dir> --policy_class ACT --kl_weight 10 --chunk_size 100 --hidden_dim 512 --batch_size 8 --dim_feedforward 3200 --num_epochs 2000  --lr 1e-5 --seed 0
+
+# run
+python3 imitate_episodes.py --task_name sim_transfer_cube_scripted --ckpt_dir trainings --policy_class ACT --kl_weight 1 --chunk_size 10 --hidden_dim 512 --batch_size 1 --dim_feedforward 3200  --lr 1e-5 --seed 0 --num_steps 2000
+
+# During training, you will be prompted with the following content. Since you do not have a W&B account, choose 3 directly.
+wandb: (1) Create a W&B account
+wandb: (2) Use an existing W&B account
+wandb: (3) Don't visualize my results
+wandb: Enter your choice:
+
+

After training is completed, the weights will be saved to the trainings directory. The results are as follows:

+
trainings
+  ├── config.pkl
+  ├── dataset_stats.pkl
+  ├── policy_best.ckpt
+  ├── policy_last.ckpt
+  └── policy_step_0_seed_0.ckpt
+
+

Evaluate the model trained above:

+
# 1 Evaluate the policy; add --onscreen_render for real-time rendering
+python3 imitate_episodes.py --eval --task_name sim_transfer_cube_scripted --ckpt_dir trainings --policy_class ACT --kl_weight 1 --chunk_size 10 --hidden_dim 512 --batch_size 1 --dim_feedforward 3200  --lr 1e-5 --seed 0 --num_steps 20 --onscreen_render
+
+

And print the rendering picture.

+

+

Data Training in real environment

+

Data Collection

+

1. Environment dependency

+

1.1 ROS dependency

+

● Default: an Ubuntu 20.04 + ROS Noetic environment has already been configured

+
sudo apt install ros-$ROS_DISTRO-sensor-msgs ros-$ROS_DISTRO-nav-msgs ros-$ROS_DISTRO-cv-bridge
+
+

1.2 Python dependency

+
# Enter the current workspace directory and install the dependencies listed in requiredments.txt.
+pip install -r requiredments.txt
+
+

2. Data collection

+

2.1 Run ‘collect_data’

+
python collect_data.py -h # see parameters
+python collect_data.py --max_timesteps 500 --episode_idx 0
+python collect_data.py --max_timesteps 500 --is_compress --episode_idx 0
+python collect_data.py --max_timesteps 500 --use_depth_image --episode_idx 1
+python collect_data.py --max_timesteps 500 --is_compress --use_depth_image --episode_idx 1
+
+

After the data collection is completed, it will be saved in the ${dataset_dir}/{task_name} directory.

+
python collect_data.py --max_timesteps 500 --is_compress --episode_idx 0
+# Generates the dataset episode_0.hdf5. The structure is:
+
+collect_data
+  ├── collect_data.py
+  ├── data                     # --dataset_dir 
+  │   └── cobot_magic_agilex   # --task_name 
+  │       ├── episode_0.hdf5   # The location of the generated data set file
          ├── episode_idx.hdf5 # idx depends on --episode_idx
+          └── ...
+  ├── readme.md
+  ├── replay_data.py
+  ├── requiredments.txt
+  └── visualize_episodes.py
+
+

The specific parameters are shown:

+
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
NameExplanation
dataset_dirData set saving path
task_nametask name, as the file name of the data set
episode_idxAction block index number
max_timestepsThe number of time steps for the maximum action block
camera_namesCamera names, default [‘cam_high’, ‘cam_left_wrist’, ‘cam_right_wrist’]
img_front_topicCamera 1 Color Picture Topic
img_left_topicCamera 2 Color Picture Topic
img_right_topicCamera 3 Color Picture Topic
use_depth_imageWhether to use depth information
depth_front_topicCamera 1 depth map topic
depth_left_topicCamera 2 depth map topic
depth_right_topicCamera 3 depth map topic
master_arm_left_topicLeft main arm topic
master_arm_right_topicRight main arm topic
puppet_arm_left_topicLeft puppet arm topic
puppet_arm_right_topicRight puppet arm topic
use_robot_baseWhether to use mobile base information
robot_base_topicMobile base topic
frame_rateAcquisition frame rate. Because the camera image stabilization value is 30 frames, the default is 30 frames
is_compressWhether the image is compressed and saved
+

The picture of data collection from the camera perspective is as follows:

+

data collection

+

Data visualization

+

Run the following code:

+
python visualize_episodes.py --dataset_dir ./data --task_name cobot_magic_agilex --episode_idx 0
+
+

Visualize the collected data. --dataset_dir, --task_name and --episode_idx need to match the values used when collecting the data. When you run the above code, the terminal will print the actions and display a color image window. The visualization results are as follows:

+

+

After the operation is completed, episode_${idx}_qpos.png, episode_${idx}_base_action.png and episode_${idx}_video.mp4 files will be generated under ${dataset_dir}/{task_name}. The directory structure is as follows:

+
collect_data
+├── data
+│   ├── cobot_magic_agilex
+│   │   └── episode_0.hdf5
+│   ├── episode_0_base_action.png   # base_action
+│   ├── episode_0_qpos.png          # qpos
+│   └── episode_0_video.mp4         # Color video
+
+

Taking episode 30 as an example, replay its collected data. The camera perspective is as follows:

+

data visualization

+

Model Training and Inference

+

The Mobile Aloha project studied different imitation-learning strategies and proposed a Transformer-based action chunking algorithm, ACT (Action Chunking with Transformers). It is essentially an end-to-end policy: it maps real-world RGB images directly to actions, letting the robot learn and imitate from visual input without hand-engineered intermediate representations, and it predicts in units of action chunks to produce accurate and smooth motion trajectories.

+

The model is as follows:

+

+

Disassemble and interpret the model.

+
    +
  1. Sample data
  2. +
+

+

Input: 4 RGB images, each with a resolution of 480 × 640, plus the joint positions of the two robot arms (7 + 7 = 14 DoF in total)

+

Output: The action space is the absolute joint positions of the two robot arms, a 14-dimensional vector. Therefore, with action chunking, the policy outputs a k × 14 tensor given the current observation (each action is defined as a 14-dimensional vector, so k actions form a k × 14 tensor)
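To make the chunking arithmetic concrete, here is a minimal pure-Python sketch (illustrative only, not the actual ACT implementation) of the k × 14 output shape:

```python
# Illustrative sketch of ACT-style action chunking shapes (not the real model).
ACTION_DIM = 14  # 7 joint positions per arm, two arms

def predict_chunk(k):
    """Stand-in policy output: a k x ACTION_DIM action chunk as nested lists."""
    return [[0.0] * ACTION_DIM for _ in range(k)]

chunk = predict_chunk(100)  # e.g. chunk_size k = 100, as in the training command
assert len(chunk) == 100                                   # k actions per observation
assert all(len(action) == ACTION_DIM for action in chunk)  # each action is 14-dim
```

The real policy fills this tensor with predicted joint positions; only the shapes are shown here.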

+
    +
  1. Infer Z
  2. +
+

+

The input to the encoder is a [CLS] token, which consists of randomly initialized learned weights. Through one linear layer, the 14-dimensional joint positions are projected to 512 dimensions to obtain the embedded joint positions. Through another linear layer, the k × 14 action sequence is projected to a k × 512 embedded action sequence.

+

The above three inputs finally form a sequence of (k + 2) × embedding_dimension, that is, (k + 2) × 512, and are processed with the transformer encoder. Finally, just take the first output, which corresponds to the [CLS] token, and use another linear network to predict the mean and variance of the Z distribution, parameterizing it as a diagonal Gaussian distribution. Use reparameterization to obtain samples of Z.
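The reparameterization step at the end can be sketched in plain Python. This is a generic sketch of the trick itself, not code from the ACT repository; the latent dimension of 32 is an assumed example.

```python
import math
import random

def reparameterize(mu, logvar, rng):
    """Sample z = mu + sigma * eps with eps ~ N(0, 1), per latent dimension."""
    return [m + math.exp(0.5 * lv) * rng.gauss(0.0, 1.0)
            for m, lv in zip(mu, logvar)]

rng = random.Random(0)
mu = [0.0] * 32       # predicted mean of the diagonal Gaussian (assumed 32-dim)
logvar = [0.0] * 32   # predicted log-variance (sigma = 1 here)
z = reparameterize(mu, logvar, rng)
assert len(z) == 32   # one sample of Z, same dimension as the latent
```

Sampling this way keeps the path from mu and logvar to z differentiable, which is what lets the encoder be trained end to end.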

+
    +
  1. Predict an action sequence
  2. +
+

+

① First, each image observation is processed by a ResNet18 to obtain a 15 × 20 × 728 feature map, which is then flattened into a 300 × 728 feature sequence. These features are projected to the embedding dimension (300 × 512) with a linear layer, and a 2D sinusoidal position embedding is added to preserve spatial information.

+

② Secondly, repeat this operation for all 4 images, and the resulting feature sequence dimension is 1200 × 512.

+

③ Next, the feature sequences from the cameras are concatenated and used as one of the inputs to the transformer encoder. The other two inputs, the current joint positions and the “style variable” z, are each passed through their own linear layer and projected to 512 dimensions from their respective original dimensions (14 and 15).

+

④ Finally, the encoder input of the transformer is 1202 × 512 (the feature dimension of the 4 images is 1200 × 512, the feature dimension of the joint positions is 1 × 512, and the feature dimension of the style variable z is 1 × 512).
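The token counts in ④ can be checked with a few lines of arithmetic, pure bookkeeping from the numbers in the text above, not model code:

```python
# Token bookkeeping for the transformer encoder input described above.
EMBED_DIM = 512
CAMERAS = 4
FEAT_H, FEAT_W = 15, 20             # ResNet18 feature map per 480x640 image

tokens_per_image = FEAT_H * FEAT_W  # 300 tokens after flattening
image_tokens = CAMERAS * tokens_per_image
joint_token = 1                     # current joint positions, projected to 512
z_token = 1                         # style variable z, projected to 512
total_tokens = image_tokens + joint_token + z_token

assert tokens_per_image == 300      # 15 x 20 flattened
assert image_tokens == 1200         # 4 cameras x 300 tokens
assert total_tokens == 1202         # matches the 1202 x 512 encoder input
```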

+

The input to the transformer decoder has two aspects:

+

On the one hand, the “query” of the transformer decoder is the first layer of fixed sinusoidal position embeddings, that is, the position embeddings (fixed) shown in the lower right corner of the above figure, whose dimension is k × 512

+

On the other hand, the “keys” and “values” in the cross-attention layer of the transformer decoder come from the output of the above-mentioned transformer encoder.

+

Thereby, the transformer decoder predicts the action sequence given the encoder output.

+

By collecting data and training the above model, you can observe that the results converge.

+

+

A third-person view of the model inference results is as follows. The robotic arm can infer the movement of placing colored blocks from point A to point B.

+

Inference

+

Summary

+

Cobot Magic is a remote whole-body data collection device developed by AgileX Robotics based on the Mobile Aloha project from Stanford University. With Cobot Magic, AgileX Robotics has successfully run the Stanford laboratory’s open-source Mobile Aloha code in both simulation and real environments.
+AgileX will continue to collect data from various motion tasks based on Cobot Magic for model training and inference. Please stay tuned for updates on GitHub. And if you are interested in the Mobile Aloha project, join us via this Slack link: Slack. Let’s talk about our ideas.

+

About AgileX

+

Established in 2016, AgileX Robotics is a leading manufacturer of mobile robot platforms and a provider of unmanned system solutions. The company specializes in independently developed multi-mode wheeled and tracked wire-controlled chassis technology and has obtained multiple international certifications. AgileX Robotics offers users self-developed innovative application solutions such as autonomous driving, mobile grasping, and navigation positioning, helping users in various industries achieve automation. Additionally, AgileX Robotics has introduced research and education software and hardware products related to machine learning, embodied intelligence, and visual algorithms. The company works closely with research and educational institutions to promote robotics technology teaching and innovation.

+

1 post - 1 participant

+

Read full topic

+ + + + + + + +
+

+by Agilex_Robotics on March 15, 2024 03:07 AM +

+ +
+ +
March 12, 2024
+ + + + + + + + +
+ + +
Cloud Robotics WG Strategy & Next Meeting Announcement
+ + + +
+ +
+

Hi folks!

+

I wanted to tell you the results of the Cloud Robotics Working Group meeting from 2024-03-11. We met to discuss the long-term strategy of the group. You can see the full meeting recording on vimeo, with our meeting minutes here (thanks to Phil Roan for taking minutes this meeting!).

+

During the meeting, we went over some definitions of Cloud Robotics, our tenets going forward, and a phased approach of gathering data, analyzing it, and acting on it. We used slides to frame the discussion; they have since been updated from the discussion and will form the backbone of our discussions going forward. The slide deck is publicly available here.

+

Next meeting will be about how to start collecting the data for the first phase. We will hold it from 2024-03-25T17:00:00Z to 2024-03-25T18:00:00Z UTC. If you’d like to join the group, you are welcome to, and you can sign up for our meeting invites at this Google Group.

+

Finally, we will regularly invite members and guests to give talks in our meetings. If you have a topic you’d like to talk about, or would like to invite someone to talk, please use this speaker signup sheet to let us know.

+

Hopefully I’ll see you all in future meetings!

+

6 posts - 4 participants

+

Read full topic

+ + + + + + + +
+

+by mikelikesrobots on March 12, 2024 05:33 PM +

+ +
+ +
March 11, 2024
+ + + + + + + + +
+ + +
Foxglove 2.0 - integrated UI, new pricing, and open source changes
+ + + +
+ +
+

Hi everyone - excited to announce Foxglove 2.0, with a new integrated UI (merging Foxglove Studio and Data Platform), new pricing plans, and open source changes.

+

:handshake: Streamlined UI for smoother robotics observability
+:satellite: Automatic data offload through Foxglove Agent
+:credit_card: Updated pricing plans to make Foxglove accessible for teams of all sizes
+:mag_right: Changes to our open-source strategy (we’re discontinuing the open source edition of Foxglove Studio)

+

Read the details in our blog post.

+

Note that Foxglove is still free for academic teams and researchers! If you fall into that category, please contact us and we can upgrade your account.

+

15 posts - 10 participants

+

Read full topic

+ + + + + + + +
+

+by amacneil on March 11, 2024 07:28 PM +

+ +
+ + + + + + + + + +
+ + +
Announcing open sourcing of ROS 2 Task Manager!
+ + + +
+ +
+

:tada: My team and I are happy to announce that we at Karelics have open sourced our ROS 2 Task Manager package. This solution allows you to convert your existing ROS actions and services into tasks, offering useful features such as automatic task conflict resolution, the ability to aggregate multiple tasks into larger Missions, and straightforward tracking of active tasks and their results.

+

Check out the package and examples of its usage with the Nav2 package:
+:link: https://github.com/Karelics/task_manager

+

For an introduction and deeper insights into our design decisions, see our blog post available at: https://karelics.fi/task-manager-ros-2-package/
+

+


+

+We firmly believe that this package will prove valuable to the ROS community and accelerate the development of robot systems. We are excited to hear your thoughts and feedback on it!

+

1 post - 1 participant

+

Read full topic

+ + + + + + + +
+

+by jak on March 11, 2024 12:52 PM +

+ +
+ + + + + + + + + +
+ + +
New Packages for Iron Irwini 2024-03-11
+ + + +
+ +
+

We’re happy to announce 1 new package and 82 updates are now available in ROS 2 Iron Irwini :iron: :irwini:. This sync was tagged as iron/2024-03-11.

+

Package Updates for iron

+

Added Packages [1]:

+
    +
  • ros-iron-apriltag-detector-dbgsym: 1.2.1-1
  • +
+

Updated Packages [82]:

+
    +
  • ros-iron-apriltag-detector: 1.2.0-1 → 1.2.1-1
  • +
  • ros-iron-controller-interface: 3.23.0-1 → 3.24.0-1
  • +
  • ros-iron-controller-interface-dbgsym: 3.23.0-1 → 3.24.0-1
  • +
  • ros-iron-controller-manager: 3.23.0-1 → 3.24.0-1
  • +
  • ros-iron-controller-manager-dbgsym: 3.23.0-1 → 3.24.0-1
  • +
  • ros-iron-controller-manager-msgs: 3.23.0-1 → 3.24.0-1
  • +
  • ros-iron-controller-manager-msgs-dbgsym: 3.23.0-1 → 3.24.0-1
  • +
  • ros-iron-etsi-its-cam-coding: 2.0.0-1 → 2.0.1-1
  • +
  • ros-iron-etsi-its-cam-coding-dbgsym: 2.0.0-1 → 2.0.1-1
  • +
  • ros-iron-etsi-its-cam-conversion: 2.0.0-1 → 2.0.1-1
  • +
  • ros-iron-etsi-its-cam-msgs: 2.0.0-1 → 2.0.1-1
  • +
  • ros-iron-etsi-its-cam-msgs-dbgsym: 2.0.0-1 → 2.0.1-1
  • +
  • ros-iron-etsi-its-coding: 2.0.0-1 → 2.0.1-1
  • +
  • ros-iron-etsi-its-conversion: 2.0.0-1 → 2.0.1-1
  • +
  • ros-iron-etsi-its-conversion-dbgsym: 2.0.0-1 → 2.0.1-1
  • +
  • ros-iron-etsi-its-denm-coding: 2.0.0-1 → 2.0.1-1
  • +
  • ros-iron-etsi-its-denm-coding-dbgsym: 2.0.0-1 → 2.0.1-1
  • +
  • ros-iron-etsi-its-denm-conversion: 2.0.0-1 → 2.0.1-1
  • +
  • ros-iron-etsi-its-denm-msgs: 2.0.0-1 → 2.0.1-1
  • +
  • ros-iron-etsi-its-denm-msgs-dbgsym: 2.0.0-1 → 2.0.1-1
  • +
  • ros-iron-etsi-its-messages: 2.0.0-1 → 2.0.1-1
  • +
  • ros-iron-etsi-its-msgs: 2.0.0-1 → 2.0.1-1
  • +
  • ros-iron-etsi-its-msgs-utils: 2.0.0-1 → 2.0.1-1
  • +
  • ros-iron-etsi-its-primitives-conversion: 2.0.0-1 → 2.0.1-1
  • +
  • ros-iron-etsi-its-rviz-plugins: 2.0.0-1 → 2.0.1-1
  • +
  • ros-iron-etsi-its-rviz-plugins-dbgsym: 2.0.0-1 → 2.0.1-1
  • +
  • ros-iron-flir-camera-description: 2.0.8-1 → 2.0.8-2
  • +
  • ros-iron-flir-camera-msgs: 2.0.8-1 → 2.0.8-2
  • +
  • ros-iron-flir-camera-msgs-dbgsym: 2.0.8-1 → 2.0.8-2
  • +
  • ros-iron-hardware-interface: 3.23.0-1 → 3.24.0-1
  • +
  • ros-iron-hardware-interface-dbgsym: 3.23.0-1 → 3.24.0-1
  • +
  • ros-iron-hardware-interface-testing: 3.23.0-1 → 3.24.0-1
  • +
  • ros-iron-hardware-interface-testing-dbgsym: 3.23.0-1 → 3.24.0-1
  • +
  • ros-iron-joint-limits: 3.23.0-1 → 3.24.0-1
  • +
  • ros-iron-libmavconn: 2.6.0-1 → 2.7.0-1
  • +
  • ros-iron-libmavconn-dbgsym: 2.6.0-1 → 2.7.0-1
  • +
  • ros-iron-mavlink: 2023.9.9-1 → 2024.3.3-1
  • +
  • ros-iron-mavros: 2.6.0-1 → 2.7.0-1
  • +
  • ros-iron-mavros-dbgsym: 2.6.0-1 → 2.7.0-1
  • +
  • ros-iron-mavros-extras: 2.6.0-1 → 2.7.0-1
  • +
  • ros-iron-mavros-extras-dbgsym: 2.6.0-1 → 2.7.0-1
  • +
  • ros-iron-mavros-msgs: 2.6.0-1 → 2.7.0-1
  • +
  • ros-iron-mavros-msgs-dbgsym: 2.6.0-1 → 2.7.0-1
  • +
  • ros-iron-mrpt2: 2.11.9-1 → 2.11.11-1
  • +
  • ros-iron-mrpt2-dbgsym: 2.11.9-1 → 2.11.11-1
  • +
  • ros-iron-mvsim: 0.8.3-1 → 0.9.1-1
  • +
  • ros-iron-mvsim-dbgsym: 0.8.3-1 → 0.9.1-1
  • +
  • ros-iron-ntrip-client: 1.2.0-3 → 1.3.0-1
  • +
  • ros-iron-ros2-control: 3.23.0-1 → 3.24.0-1
  • +
  • ros-iron-ros2-control-test-assets: 3.23.0-1 → 3.24.0-1
  • +
  • ros-iron-ros2controlcli: 3.23.0-1 → 3.24.0-1
  • +
  • ros-iron-rqt-controller-manager: 3.23.0-1 → 3.24.0-1
  • +
  • ros-iron-rtabmap: 0.21.3-1 → 0.21.4-1
  • +
  • ros-iron-rtabmap-conversions: 0.21.3-1 → 0.21.4-2
  • +
  • ros-iron-rtabmap-conversions-dbgsym: 0.21.3-1 → 0.21.4-2
  • +
  • ros-iron-rtabmap-dbgsym: 0.21.3-1 → 0.21.4-1
  • +
  • ros-iron-rtabmap-demos: 0.21.3-1 → 0.21.4-2
  • +
  • ros-iron-rtabmap-examples: 0.21.3-1 → 0.21.4-2
  • +
  • ros-iron-rtabmap-launch: 0.21.3-1 → 0.21.4-2
  • +
  • ros-iron-rtabmap-msgs: 0.21.3-1 → 0.21.4-2
  • +
  • ros-iron-rtabmap-msgs-dbgsym: 0.21.3-1 → 0.21.4-2
  • +
  • ros-iron-rtabmap-odom: 0.21.3-1 → 0.21.4-2
  • +
  • ros-iron-rtabmap-odom-dbgsym: 0.21.3-1 → 0.21.4-2
  • +
  • ros-iron-rtabmap-python: 0.21.3-1 → 0.21.4-2
  • +
  • ros-iron-rtabmap-ros: 0.21.3-1 → 0.21.4-2
  • +
  • ros-iron-rtabmap-rviz-plugins: 0.21.3-1 → 0.21.4-2
  • +
  • ros-iron-rtabmap-rviz-plugins-dbgsym: 0.21.3-1 → 0.21.4-2
  • +
  • ros-iron-rtabmap-slam: 0.21.3-1 → 0.21.4-2
  • +
  • ros-iron-rtabmap-slam-dbgsym: 0.21.3-1 → 0.21.4-2
  • +
  • ros-iron-rtabmap-sync: 0.21.3-1 → 0.21.4-2
  • +
  • ros-iron-rtabmap-sync-dbgsym: 0.21.3-1 → 0.21.4-2
  • +
  • ros-iron-rtabmap-util: 0.21.3-1 → 0.21.4-2
  • +
  • ros-iron-rtabmap-util-dbgsym: 0.21.3-1 → 0.21.4-2
  • +
  • ros-iron-rtabmap-viz: 0.21.3-1 → 0.21.4-2
  • +
  • ros-iron-rtabmap-viz-dbgsym: 0.21.3-1 → 0.21.4-2
  • +
  • ros-iron-simple-launch: 1.9.0-1 → 1.9.1-1
  • +
  • ros-iron-spinnaker-camera-driver: 2.0.8-1 → 2.0.8-2
  • +
  • ros-iron-spinnaker-camera-driver-dbgsym: 2.0.8-1 → 2.0.8-2
  • +
  • ros-iron-transmission-interface: 3.23.0-1 → 3.24.0-1
  • +
  • ros-iron-transmission-interface-dbgsym: 3.23.0-1 → 3.24.0-1
  • +
  • ros-iron-ur-client-library: 1.3.4-1 → 1.3.5-1
  • +
  • ros-iron-ur-client-library-dbgsym: 1.3.4-1 → 1.3.5-1
  • +
+

Removed Packages [0]:

+

Thanks to all ROS maintainers who make packages available to the ROS community. The above list of packages was made possible by the work of the following maintainers:

+
    +
  • Bence Magyar
  • +
  • Bernd Pfrommer
  • +
  • Felix Exner
  • +
  • Jean-Pierre Busch
  • +
  • Jose-Luis Blanco-Claraco
  • +
  • Luis Camero
  • +
  • Mathieu Labbe
  • +
  • Olivier Kermorgant
  • +
  • Rob Fisher
  • +
  • Vladimir Ermakov
  • +
+

1 post - 1 participant

+

Read full topic

+ + + + + + + +
+

+by Yadunund on March 11, 2024 01:54 AM +

+ +
+ +
March 08, 2024
+ + + + + + + + +
+ + +
ROS News for the Week of March 4th, 2024
+ + + +
+ +
+

ROS News for the Week of March 4th, 2024

+


+I’ve been working with the ROS Industrial team, and the Port of San Antonio, to put together a ROS Meetup in San Antonio / Austin in conjunction with the annual ROS Industrial Consortium Meeting. If you are attending the ROS-I meeting make sure you sign up!

+
+


+Gazebo Classic goes end of life in 2025! To help the community move over to modern Gazebo we’re holding open Gazebo office hours next Tuesday, March 12th, at 9am PST. If you have questions about the migration process please come by!

+
+

+We often get questions about the “best” robot components for a particular application. I really hate answering these questions; my inner engineer just screams, “IT DEPENDS!” Unfortunately, we really don’t have a lot of apples-to-apples data to compare different hardware vendors.

+

Thankfully @iliao is putting in a ton of work to review ten different low cost LIDAR sensors. Check it out here.
+

+

teaser3
+This week we got a sneak peek at some of the cool CVPR 2024 papers. Check out, “Gaussian Splatting SLAM”, by Hidenobu Matsuki, Riku Murai, Paul H.J. Kelly, Andrew J. Davison, complete with source code.

+
+

1aa39368041ea4a73d78470ab0d7441453258cdf_2_353x500
+We got our new ROSCon France graphic this week! ROSCon France is currently accepting papers! Please consider applying if you speak French!

+

Events

+ +

News

+ +

ROS

+ +

ROS Questions

+

Please make ROS a better project for the next person! Take a moment to answer a question on Robotics Stack Exchange! Not your thing? Contribute to the ROS 2 Docs!

+

4 posts - 2 participants

+

Read full topic


+by Katherine_Scott on March 08, 2024 09:50 PM +

New packages for Humble Hawksbill 2024-03-08

Package Updates for Humble

+

Added Packages [13]:

+
  • ros-humble-apriltag-detector-dbgsym: 1.1.1-1
  • ros-humble-caret-analyze-cpp-impl: 0.5.0-5
  • ros-humble-caret-analyze-cpp-impl-dbgsym: 0.5.0-5
  • ros-humble-ds-dbw: 2.1.10-1
  • ros-humble-ds-dbw-can: 2.1.10-1
  • ros-humble-ds-dbw-can-dbgsym: 2.1.10-1
  • ros-humble-ds-dbw-joystick-demo: 2.1.10-1
  • ros-humble-ds-dbw-joystick-demo-dbgsym: 2.1.10-1
  • ros-humble-ds-dbw-msgs: 2.1.10-1
  • ros-humble-ds-dbw-msgs-dbgsym: 2.1.10-1
  • ros-humble-gazebo-no-physics-plugin: 0.1.1-1
  • ros-humble-gazebo-no-physics-plugin-dbgsym: 0.1.1-1
  • ros-humble-kinematics-interface-dbgsym: 0.3.0-1

Updated Packages [220]:

+
    +
  • ros-humble-ackermann-steering-controller: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-ackermann-steering-controller-dbgsym: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-admittance-controller: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-admittance-controller-dbgsym: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-apriltag-detector: 1.1.0-1 → 1.1.1-1
  • +
  • ros-humble-bicycle-steering-controller: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-bicycle-steering-controller-dbgsym: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-bno055: 0.4.1-1 → 0.5.0-1
  • +
  • ros-humble-camera-calibration: 3.0.3-1 → 3.0.4-1
  • +
  • ros-humble-caret-analyze: 0.5.0-1 → 0.5.0-2
  • +
  • ros-humble-cob-actions: 2.7.9-1 → 2.7.10-1
  • +
  • ros-humble-cob-actions-dbgsym: 2.7.9-1 → 2.7.10-1
  • +
  • ros-humble-cob-msgs: 2.7.9-1 → 2.7.10-1
  • +
  • ros-humble-cob-msgs-dbgsym: 2.7.9-1 → 2.7.10-1
  • +
  • ros-humble-cob-srvs: 2.7.9-1 → 2.7.10-1
  • +
  • ros-humble-cob-srvs-dbgsym: 2.7.9-1 → 2.7.10-1
  • +
  • ros-humble-controller-interface: 2.39.1-1 → 2.40.0-1
  • +
  • ros-humble-controller-interface-dbgsym: 2.39.1-1 → 2.40.0-1
  • +
  • ros-humble-controller-manager: 2.39.1-1 → 2.40.0-1
  • +
  • ros-humble-controller-manager-dbgsym: 2.39.1-1 → 2.40.0-1
  • +
  • ros-humble-controller-manager-msgs: 2.39.1-1 → 2.40.0-1
  • +
  • ros-humble-controller-manager-msgs-dbgsym: 2.39.1-1 → 2.40.0-1
  • +
  • ros-humble-dataspeed-dbw-common: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dataspeed-ulc: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dataspeed-ulc-can: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dataspeed-ulc-can-dbgsym: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dataspeed-ulc-msgs: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dataspeed-ulc-msgs-dbgsym: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-fca: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-fca-can: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-fca-can-dbgsym: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-fca-description: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-fca-joystick-demo: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-fca-joystick-demo-dbgsym: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-fca-msgs: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-fca-msgs-dbgsym: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-ford: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-ford-can: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-ford-can-dbgsym: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-ford-description: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-ford-joystick-demo: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-ford-joystick-demo-dbgsym: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-ford-msgs: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-ford-msgs-dbgsym: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-polaris: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-polaris-can: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-polaris-can-dbgsym: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-polaris-description: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-polaris-joystick-demo: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-polaris-joystick-demo-dbgsym: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-polaris-msgs: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-dbw-polaris-msgs-dbgsym: 2.1.3-1 → 2.1.10-1
  • +
  • ros-humble-depth-image-proc: 3.0.3-1 → 3.0.4-1
  • +
  • ros-humble-depth-image-proc-dbgsym: 3.0.3-1 → 3.0.4-1
  • +
  • ros-humble-diff-drive-controller: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-diff-drive-controller-dbgsym: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-draco-point-cloud-transport: 1.0.9-1 → 1.0.10-1
  • +
  • ros-humble-draco-point-cloud-transport-dbgsym: 1.0.9-1 → 1.0.10-1
  • +
  • ros-humble-effort-controllers: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-effort-controllers-dbgsym: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-etsi-its-cam-coding: 2.0.0-1 → 2.0.1-1
  • +
  • ros-humble-etsi-its-cam-coding-dbgsym: 2.0.0-1 → 2.0.1-1
  • +
  • ros-humble-etsi-its-cam-conversion: 2.0.0-1 → 2.0.1-1
  • +
  • ros-humble-etsi-its-cam-msgs: 2.0.0-1 → 2.0.1-1
  • +
  • ros-humble-etsi-its-cam-msgs-dbgsym: 2.0.0-1 → 2.0.1-1
  • +
  • ros-humble-etsi-its-coding: 2.0.0-1 → 2.0.1-1
  • +
  • ros-humble-etsi-its-conversion: 2.0.0-1 → 2.0.1-1
  • +
  • ros-humble-etsi-its-conversion-dbgsym: 2.0.0-1 → 2.0.1-1
  • +
  • ros-humble-etsi-its-denm-coding: 2.0.0-1 → 2.0.1-1
  • +
  • ros-humble-etsi-its-denm-coding-dbgsym: 2.0.0-1 → 2.0.1-1
  • +
  • ros-humble-etsi-its-denm-conversion: 2.0.0-1 → 2.0.1-1
  • +
  • ros-humble-etsi-its-denm-msgs: 2.0.0-1 → 2.0.1-1
  • +
  • ros-humble-etsi-its-denm-msgs-dbgsym: 2.0.0-1 → 2.0.1-1
  • +
  • ros-humble-etsi-its-messages: 2.0.0-1 → 2.0.1-1
  • +
  • ros-humble-etsi-its-msgs: 2.0.0-1 → 2.0.1-1
  • +
  • ros-humble-etsi-its-msgs-utils: 2.0.0-1 → 2.0.1-1
  • +
  • ros-humble-etsi-its-primitives-conversion: 2.0.0-1 → 2.0.1-1
  • +
  • ros-humble-etsi-its-rviz-plugins: 2.0.0-1 → 2.0.1-1
  • +
  • ros-humble-etsi-its-rviz-plugins-dbgsym: 2.0.0-1 → 2.0.1-1
  • +
  • ros-humble-flir-camera-description: 2.0.8-2 → 2.0.8-3
  • +
  • ros-humble-flir-camera-msgs: 2.0.8-2 → 2.0.8-3
  • +
  • ros-humble-flir-camera-msgs-dbgsym: 2.0.8-2 → 2.0.8-3
  • +
  • ros-humble-force-torque-sensor-broadcaster: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-force-torque-sensor-broadcaster-dbgsym: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-forward-command-controller: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-forward-command-controller-dbgsym: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-gripper-controllers: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-gripper-controllers-dbgsym: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-hardware-interface: 2.39.1-1 → 2.40.0-1
  • +
  • ros-humble-hardware-interface-dbgsym: 2.39.1-1 → 2.40.0-1
  • +
  • ros-humble-hardware-interface-testing: 2.39.1-1 → 2.40.0-1
  • +
  • ros-humble-hardware-interface-testing-dbgsym: 2.39.1-1 → 2.40.0-1
  • +
  • ros-humble-image-pipeline: 3.0.3-1 → 3.0.4-1
  • +
  • ros-humble-image-proc: 3.0.3-1 → 3.0.4-1
  • +
  • ros-humble-image-proc-dbgsym: 3.0.3-1 → 3.0.4-1
  • +
  • ros-humble-image-publisher: 3.0.3-1 → 3.0.4-1
  • +
  • ros-humble-image-publisher-dbgsym: 3.0.3-1 → 3.0.4-1
  • +
  • ros-humble-image-rotate: 3.0.3-1 → 3.0.4-1
  • +
  • ros-humble-image-rotate-dbgsym: 3.0.3-1 → 3.0.4-1
  • +
  • ros-humble-image-view: 3.0.3-1 → 3.0.4-1
  • +
  • ros-humble-image-view-dbgsym: 3.0.3-1 → 3.0.4-1
  • +
  • ros-humble-imu-sensor-broadcaster: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-imu-sensor-broadcaster-dbgsym: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-joint-limits: 2.39.1-1 → 2.40.0-1
  • +
  • ros-humble-joint-limits-dbgsym: 2.39.1-1 → 2.40.0-1
  • +
  • ros-humble-joint-state-broadcaster: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-joint-state-broadcaster-dbgsym: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-joint-trajectory-controller: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-joint-trajectory-controller-dbgsym: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-kinematics-interface: 0.2.0-1 → 0.3.0-1
  • +
  • ros-humble-kinematics-interface-kdl: 0.2.0-1 → 0.3.0-1
  • +
  • ros-humble-kinematics-interface-kdl-dbgsym: 0.2.0-1 → 0.3.0-1
  • +
  • ros-humble-launch-pal: 0.0.16-1 → 0.0.18-1
  • +
  • ros-humble-libmavconn: 2.6.0-1 → 2.7.0-1
  • +
  • ros-humble-libmavconn-dbgsym: 2.6.0-1 → 2.7.0-1
  • +
  • ros-humble-mavlink: 2023.9.9-1 → 2024.3.3-1
  • +
  • ros-humble-mavros: 2.6.0-1 → 2.7.0-1
  • +
  • ros-humble-mavros-dbgsym: 2.6.0-1 → 2.7.0-1
  • +
  • ros-humble-mavros-extras: 2.6.0-1 → 2.7.0-1
  • +
  • ros-humble-mavros-extras-dbgsym: 2.6.0-1 → 2.7.0-1
  • +
  • ros-humble-mavros-msgs: 2.6.0-1 → 2.7.0-1
  • +
  • ros-humble-mavros-msgs-dbgsym: 2.6.0-1 → 2.7.0-1
  • +
  • ros-humble-mrpt2: 2.11.9-1 → 2.11.11-1
  • +
  • ros-humble-mrpt2-dbgsym: 2.11.9-1 → 2.11.11-1
  • +
  • ros-humble-mvsim: 0.8.3-1 → 0.9.1-1
  • +
  • ros-humble-mvsim-dbgsym: 0.8.3-1 → 0.9.1-1
  • +
  • ros-humble-ntrip-client: 1.2.0-1 → 1.3.0-1
  • +
  • ros-humble-play-motion2: 0.0.13-1 → 1.0.0-1
  • +
  • ros-humble-play-motion2-dbgsym: 0.0.13-1 → 1.0.0-1
  • +
  • ros-humble-play-motion2-msgs: 0.0.13-1 → 1.0.0-1
  • +
  • ros-humble-play-motion2-msgs-dbgsym: 0.0.13-1 → 1.0.0-1
  • +
  • ros-humble-plotjuggler: 3.9.0-1 → 3.9.1-1
  • +
  • ros-humble-plotjuggler-dbgsym: 3.9.0-1 → 3.9.1-1
  • +
  • ros-humble-pmb2-2dnav: 4.0.9-1 → 4.0.12-1
  • +
  • ros-humble-pmb2-bringup: 5.0.15-1 → 5.0.16-1
  • +
  • ros-humble-pmb2-controller-configuration: 5.0.15-1 → 5.0.16-1
  • +
  • ros-humble-pmb2-description: 5.0.15-1 → 5.0.16-1
  • +
  • ros-humble-pmb2-laser-sensors: 4.0.9-1 → 4.0.12-1
  • +
  • ros-humble-pmb2-maps: 4.0.9-1 → 4.0.12-1
  • +
  • ros-humble-pmb2-navigation: 4.0.9-1 → 4.0.12-1
  • +
  • ros-humble-pmb2-robot: 5.0.15-1 → 5.0.16-1
  • +
  • ros-humble-point-cloud-interfaces: 1.0.9-1 → 1.0.10-1
  • +
  • ros-humble-point-cloud-interfaces-dbgsym: 1.0.9-1 → 1.0.10-1
  • +
  • ros-humble-point-cloud-transport: 1.0.15-1 → 1.0.16-1
  • +
  • ros-humble-point-cloud-transport-dbgsym: 1.0.15-1 → 1.0.16-1
  • +
  • ros-humble-point-cloud-transport-plugins: 1.0.9-1 → 1.0.10-1
  • +
  • ros-humble-point-cloud-transport-py: 1.0.15-1 → 1.0.16-1
  • +
  • ros-humble-position-controllers: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-position-controllers-dbgsym: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-psdk-interfaces: 1.0.0-1 → 1.1.0-1
  • +
  • ros-humble-psdk-interfaces-dbgsym: 1.0.0-1 → 1.1.0-1
  • +
  • ros-humble-psdk-wrapper: 1.0.0-1 → 1.1.0-1
  • +
  • ros-humble-psdk-wrapper-dbgsym: 1.0.0-1 → 1.1.0-1
  • +
  • ros-humble-range-sensor-broadcaster: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-range-sensor-broadcaster-dbgsym: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-ros2-control: 2.39.1-1 → 2.40.0-1
  • +
  • ros-humble-ros2-control-test-assets: 2.39.1-1 → 2.40.0-1
  • +
  • ros-humble-ros2-controllers: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-ros2-controllers-test-nodes: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-ros2caret: 0.5.0-2 → 0.5.0-6
  • +
  • ros-humble-ros2controlcli: 2.39.1-1 → 2.40.0-1
  • +
  • ros-humble-rqt-controller-manager: 2.39.1-1 → 2.40.0-1
  • +
  • ros-humble-rqt-gauges: 0.0.1-1 → 0.0.2-1
  • +
  • ros-humble-rqt-joint-trajectory-controller: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-rtabmap: 0.21.3-1 → 0.21.4-1
  • +
  • ros-humble-rtabmap-conversions: 0.21.3-1 → 0.21.4-2
  • +
  • ros-humble-rtabmap-conversions-dbgsym: 0.21.3-1 → 0.21.4-2
  • +
  • ros-humble-rtabmap-dbgsym: 0.21.3-1 → 0.21.4-1
  • +
  • ros-humble-rtabmap-demos: 0.21.3-1 → 0.21.4-2
  • +
  • ros-humble-rtabmap-examples: 0.21.3-1 → 0.21.4-2
  • +
  • ros-humble-rtabmap-launch: 0.21.3-1 → 0.21.4-2
  • +
  • ros-humble-rtabmap-msgs: 0.21.3-1 → 0.21.4-2
  • +
  • ros-humble-rtabmap-msgs-dbgsym: 0.21.3-1 → 0.21.4-2
  • +
  • ros-humble-rtabmap-odom: 0.21.3-1 → 0.21.4-2
  • +
  • ros-humble-rtabmap-odom-dbgsym: 0.21.3-1 → 0.21.4-2
  • +
  • ros-humble-rtabmap-python: 0.21.3-1 → 0.21.4-2
  • +
  • ros-humble-rtabmap-ros: 0.21.3-1 → 0.21.4-2
  • +
  • ros-humble-rtabmap-rviz-plugins: 0.21.3-1 → 0.21.4-2
  • +
  • ros-humble-rtabmap-rviz-plugins-dbgsym: 0.21.3-1 → 0.21.4-2
  • +
  • ros-humble-rtabmap-slam: 0.21.3-1 → 0.21.4-2
  • +
  • ros-humble-rtabmap-slam-dbgsym: 0.21.3-1 → 0.21.4-2
  • +
  • ros-humble-rtabmap-sync: 0.21.3-1 → 0.21.4-2
  • +
  • ros-humble-rtabmap-sync-dbgsym: 0.21.3-1 → 0.21.4-2
  • +
  • ros-humble-rtabmap-util: 0.21.3-1 → 0.21.4-2
  • +
  • ros-humble-rtabmap-util-dbgsym: 0.21.3-1 → 0.21.4-2
  • +
  • ros-humble-rtabmap-viz: 0.21.3-1 → 0.21.4-2
  • +
  • ros-humble-rtabmap-viz-dbgsym: 0.21.3-1 → 0.21.4-2
  • +
  • ros-humble-simple-launch: 1.9.0-1 → 1.9.1-1
  • +
  • ros-humble-spinnaker-camera-driver: 2.0.8-2 → 2.0.8-3
  • +
  • ros-humble-spinnaker-camera-driver-dbgsym: 2.0.8-2 → 2.0.8-3
  • +
  • ros-humble-steering-controllers-library: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-steering-controllers-library-dbgsym: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-stereo-image-proc: 3.0.3-1 → 3.0.4-1
  • +
  • ros-humble-stereo-image-proc-dbgsym: 3.0.3-1 → 3.0.4-1
  • +
  • ros-humble-tiago-2dnav: 4.0.9-1 → 4.0.12-1
  • +
  • ros-humble-tiago-bringup: 4.1.2-1 → 4.2.3-1
  • +
  • ros-humble-tiago-controller-configuration: 4.1.2-1 → 4.2.3-1
  • +
  • ros-humble-tiago-description: 4.1.2-1 → 4.2.3-1
  • +
  • ros-humble-tiago-gazebo: 4.0.8-1 → 4.1.0-1
  • +
  • ros-humble-tiago-laser-sensors: 4.0.9-1 → 4.0.12-1
  • +
  • ros-humble-tiago-moveit-config: 3.0.7-1 → 3.0.10-1
  • +
  • ros-humble-tiago-navigation: 4.0.9-1 → 4.0.12-1
  • +
  • ros-humble-tiago-robot: 4.1.2-1 → 4.2.3-1
  • +
  • ros-humble-tiago-simulation: 4.0.8-1 → 4.1.0-1
  • +
  • ros-humble-tracetools-image-pipeline: 3.0.3-1 → 3.0.4-1
  • +
  • ros-humble-tracetools-image-pipeline-dbgsym: 3.0.3-1 → 3.0.4-1
  • +
  • ros-humble-transmission-interface: 2.39.1-1 → 2.40.0-1
  • +
  • ros-humble-transmission-interface-dbgsym: 2.39.1-1 → 2.40.0-1
  • +
  • ros-humble-tricycle-controller: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-tricycle-controller-dbgsym: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-tricycle-steering-controller: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-tricycle-steering-controller-dbgsym: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-ur-client-library: 1.3.4-1 → 1.3.5-1
  • +
  • ros-humble-ur-client-library-dbgsym: 1.3.4-1 → 1.3.5-1
  • +
  • ros-humble-velocity-controllers: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-velocity-controllers-dbgsym: 2.32.0-1 → 2.33.0-1
  • +
  • ros-humble-zlib-point-cloud-transport: 1.0.9-1 → 1.0.10-1
  • +
  • ros-humble-zlib-point-cloud-transport-dbgsym: 1.0.9-1 → 1.0.10-1
  • +
  • ros-humble-zstd-point-cloud-transport: 1.0.9-1 → 1.0.10-1
  • +
  • ros-humble-zstd-point-cloud-transport-dbgsym: 1.0.9-1 → 1.0.10-1
  • +
+

Removed Packages [2]:

+ +

Thanks to all ROS maintainers who make packages available to the ROS community. The above list of packages was made possible by the work of the following maintainers:

+
  • Alejandro Hernandez Cordero
  • Alejandro Hernández
  • Bence Magyar
  • Bernd Pfrommer
  • Bianca Bendris
  • Boeing
  • Davide Faconti
  • Denis Štogl
  • Eloy Bricneo
  • Felix Exner
  • Felix Messmer
  • Jean-Pierre Busch
  • Jordan Palacios
  • Jordi Pages
  • Jose-Luis Blanco-Claraco
  • Kevin Hallenbeck
  • Luis Camero
  • Martin Pecka
  • Mathieu Labbe
  • Micho Radovnikovich
  • Noel Jimenez
  • Olivier Kermorgant
  • Rob Fisher
  • TIAGo PAL support team
  • Vincent Rabaud
  • Vladimir Ermakov
  • Víctor Mayoral-Vilches
  • flynneva
  • ymski

1 post - 1 participant

+

Read full topic


+by audrow on March 08, 2024 04:36 PM +

ROS1: Now is a great time to add `catkin_lint` to your packages!

catkin_lint is an established ROS package that performs many useful checks on your CMakeLists.txt and package.xml. It can, for example, warn you about dependencies that do not match between package.xml and CMakeLists.txt, check that all rosdep keys in package.xml exist, watch that all executable files in your package get installed, warn you about the most common CMake misuses, and recently it even gained the ability to warn you if you’re using a CMake feature that is newer than the version you’ve put in cmake_minimum_required(). And there’s much more.

+

Personally, as a maintainer, I feel much more comfortable releasing a new version of a package once I see catkin_lint passed without complaints.

+

Until recently, automatically running catkin_lint tests on packages released via the buildfarm was problematic because the buildfarm doesn’t initialize the rosdep cache, which catkin_lint needed to work. The recently released version 1.6.22 of catkin_lint no longer fails in this case: it runs all tests that do not require rosdep on the buildfarm, while disabling those that need it (currently only the check that package.xml keys point to existing packages).

+

Adding automatic catkin_lint to your package is easy!

+

CMakeLists.txt:

+
if (CATKIN_ENABLE_TESTING)
+  find_package(roslint REQUIRED)
+  roslint_custom(catkin_lint "-W2" .)
+  roslint_add_test()
+endif()
+
+

package.xml:

+
<test_depend>python3-catkin-lint</test_depend>
+<test_depend>roslint</test_depend>
+
+

And that’s it!

+

If you want to run the test locally, you can either manually invoke catkin_lint . in your package directory, or run make roslint in the build directory.

+

And if you’re okay with some of the warnings catkin_lint gives you, you can always ignore them, either for a single line (#catkin_lint: ignore_once duplicate_find) or globally by adding arguments to the catkin_lint call (catkin_lint -W2 --ignore duplicate_find .).

+

Of course, the catkin_lint automation should not substitute for manual runs of this tool before releasing a new version of your package. It is a good habit to run catkin_lint after you finish editing your build files. However, with the automation built in, you get assurance that even if you forget to run the tool manually, the buildfarm will let you know :slight_smile:

+

You can see examples of catkin_lint used on buildfarm-released packages e.g. in our ROS utils stack: ros-utils/cras_topic_tools/CMakeLists.txt at master · ctu-vras/ros-utils · GitHub. Or scroll down on rosdep System Dependency: python3-catkin-lint to see all the others.

+
+

NB: I’m not the developer of catkin_lint. @roehling @ FKIE is doing all of the awesome work!

+
+

NB2: When you’re at it, also have a look at:

+
find_package(roslaunch REQUIRED)
+roslaunch_add_file_check(launch IGNORE_UNSET_ARGS)
+
+

and

+
<test_depend>roslaunch</test_depend>
+
+

1 post - 1 participant

+

Read full topic


+by peci1 on March 08, 2024 10:28 AM +

Cobot Magic: Mobile Aloha system works on AgileX Robotics platform

Introduction

+

AgileX Cobot Magic is a system based on Mobile ALOHA that can simultaneously remotely control the TRACER mobile base and robotic arms.

+

浇花1

+

Story

+

Mobile ALOHA is a whole-body teleoperation data collection system developed by Zipeng Fu, Tony Z. Zhao, and Chelsea Finn from Stanford University. Its hardware is based on two robotic arms (ViperX 300) equipped with two wrist cameras and one top camera, plus a mobile base, the AgileX Robotics Tracer differential-drive robot. Data collected with Mobile ALOHA, combined with supervised behavior cloning and co-training with existing static ALOHA datasets, can improve the performance of mobile manipulation tasks. With 50 demonstrations for each task, co-training can increase the success rate by up to 90%. Mobile ALOHA can autonomously perform complex mobile manipulation tasks such as cooking and opening doors. Special thanks to the Stanford research team of Zipeng Fu, Tony Z. Zhao, and Chelsea Finn for their research on Mobile ALOHA and for making it fully open source. For more details about this project, please check the link.

+

Based on Mobile ALOHA, AgileX developed Cobot Magic, which runs the complete Mobile ALOHA code with a higher-spec configuration at lower cost, and is equipped with larger-payload robotic arms and a high-compute industrial computer. For more details about Cobot Magic, please check the AgileX website.

+

The system is equipped with an indoor differential-drive AGV base, high-performance robotic arms, an industrial-grade computer, and other components. AgileX Cobot Magic helps users make better use of open-source hardware and the Mobile ALOHA deep learning framework for robotics. It covers a wide range of tasks, from simple pick-and-place operations to more intricate and complex actions such as pouring, cooking, riding elevators, and organizing items.

+

+

Currently, AgileX has successfully completed the integration of Cobot Magic with the Mobile ALOHA source code. It covers the entire pipeline: data collection, data replay, data visualization, demonstration mode, model training, inference, and so on. This post introduces AgileX Cobot Magic and will provide ongoing updates on the training progress of mobile manipulation tasks.

+

Hardware configuration

+

Here is the list of hardware in AgileX Cobot Magic.

+
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
Component | Item Name | Model
Standard Configuration | Wheeled Mobile Robot | AgileX Tracer
 | Depth Camera x3 | Orbbec Dabai
 | USB Hub | 12V power supply, 7-port
 | 6-DOF Lightweight Robot Arm x4 | Customized by AgileX
 | Adjustable Velcro x2 | Customized by AgileX
 | Grip Tape x2 | Customized by AgileX
 | Power Strip | 4 outlets, 1.8 m
 | Mobile Power Station | 1000 W
 | ALOHA Stand | Customized by AgileX
Optional Configuration | Nano Development Kit | Jetson Orin Nano Developer Kit (8G)
 | Industrial PC | APQ-X7010 / GPU 4060 / i7-9700-32G-4T
 | IMU | CH110
 | Display | 11.6" 1080p
+

Note: An IPC is required. Users have two options: the Nano Development Kit or the APQ-X7010 IPC.

+

Software configuration

+

Local computer:

+

Ubuntu 20.04, CUDA 11.3.

+

Environment configuration:

+
# 1. Create python virtual environment
+conda create -n aloha python=3.8
+
+# 2. Activate
+conda activate aloha
+
+# 3. Install cuda and torch
+pip install torch==1.11.0+cu113 torchvision==0.12.0+cu113 torchaudio==0.11.0 --extra-index-url https://download.pytorch.org/whl/cu113  
+
+
+# 4 Install detr
+##  Get act code
+git clone https://github.com/agilexrobotics/act-plus-plus.git
+cd act-plus-plus
+
+
+# 4.1 other dependencies
+pip install -r requirements.txt
+
+## 4.2 Install detr
+cd detr && pip install -v -e .
+
+

Simulated environment datasets

+

You can find all scripted/human demos for the simulated environments here.

+

After downloading, copy it to the act-plus-plus/data directory. The directory structure is as follows:

+
act-plus-plus/data
+    ├── sim_insertion_human
+    │   ├── sim_insertion_human-20240110T054847Z-001.zip
+        ├── ...
+    ├── sim_insertion_scripted
+    │   ├── sim_insertion_scripted-20240110T054854Z-001.zip
+        ├── ... 
+    ├── sim_transfer_cube_human
+    │   ├── sim_transfer_cube_human-20240110T054900Z-001.zip
+    │   ├── ...
+    └── sim_transfer_cube_scripted
+        ├── sim_transfer_cube_scripted-20240110T054901Z-001.zip
+        ├── ...
+
+
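Before launching training, it can help to sanity-check that the downloaded archives actually landed in the expected task directories. A minimal sketch in Python (the directory names follow the listing above; the helper name is illustrative, not part of act-plus-plus):

```python
# Hypothetical helper: verify that the act-plus-plus/data directory
# contains the four task subdirectories from the listing above.
import os

EXPECTED_TASKS = [
    "sim_insertion_human",
    "sim_insertion_scripted",
    "sim_transfer_cube_human",
    "sim_transfer_cube_scripted",
]

def missing_tasks(data_dir):
    """Return the expected task subdirectories absent under data_dir."""
    return [task for task in EXPECTED_TASKS
            if not os.path.isdir(os.path.join(data_dir, task))]
```

Running `missing_tasks("act-plus-plus/data")` after copying the downloads should return an empty list if everything is in place.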

Demonstration

+

By now it is widely accepted that learning a task from scratch, i.e., without any prior knowledge, is a daunting undertaking. Humans, however, rarely attempt to learn from scratch. They extract initial biases as well as strategies on how to approach a learning problem from instructions and/or demonstrations of other humans. This is what we call “programming by demonstration” or “imitation learning”.

+

The demonstrations form a set of decision trajectories {T1, T2, …, Tm}. Each trajectory Ti contains a state and action sequence:

+

Ti = (s1, a1, s2, a2, …, sH, aH)

+

Extract all “state-action pairs” and build a new set

+

D = {(s, a)}, the set of all state-action pairs collected from T1, …, Tm

+
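The construction above can be sketched in a few lines of Python: every trajectory is unrolled and its state-action pairs are pooled into one training set (the names are illustrative, not taken from the Mobile ALOHA codebase):

```python
# Hypothetical sketch of the dataset construction described above:
# each demonstration trajectory Ti is unrolled into its (state, action)
# pairs, which are pooled into a single set for behavior cloning.

def flatten_demonstrations(demos):
    """demos: list of trajectories, each a list of (state, action) pairs."""
    dataset = []
    for trajectory in demos:
        for state, action in trajectory:
            dataset.append((state, action))
    return dataset

# Two toy trajectories with scalar "states" and "actions".
demos = [
    [(0.0, 0.1), (0.1, 0.2)],
    [(5.0, -0.1)],
]
pairs = flatten_demonstrations(demos)  # 3 state-action pairs in total
```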

Currently, based on AgileX Cobot Magic, we can achieve multiple whole-body action tasks.

+

Here we will show different action task demonstrations collected using AgileX Cobot Magic.

+

Watering flowers

+

浇花1

+

Opening a box

+

开箱子1

+

Pouring rice

+

倒米1

+

Twisting a bottle cap

+

拧瓶盖1

+

Throwing away rubbish

+

扔垃圾1

+

Using AgileX Cobot Magic, users can flexibly complete various everyday action tasks by controlling the teaching robot arms, from simple pick-and-place skills to more sophisticated ones such as twisting bottle caps. The mobile chassis opens up more possibilities for the robotic arms, so they are no longer restricted to performing actions in a fixed place. The 14 + 2 DOFs provide limitless potential for collecting diverse data.

+

Data Presentation

+

+

Below is the data collected from one demonstration with the AgileX Cobot Magic arms. The collected data includes the position information of 14 joints at different time steps.

+

+

Summary

+

Cobot Magic is a remote whole-body data collection device developed by AgileX Robotics based on the Mobile ALOHA project from Stanford University. With Cobot Magic, AgileX Robotics has successfully brought up the Stanford laboratory’s open-source Mobile ALOHA code on its own platform. Thanks to the Tracer mobile base, data collection is no longer limited to desktops or specific surfaces, which enhances the richness and diversity of the collected data.

+

AgileX will continue to collect data from various motion tasks based on Cobot Magic for model training and inference. Please stay tuned for updates on GitHub.

+

About AgileX

+

Established in 2016, AgileX Robotics is a leading manufacturer of mobile robot platforms and a provider of unmanned system solutions. The company specializes in independently developed multi-mode wheeled and tracked wire-controlled chassis technology and has obtained multiple international certifications. AgileX Robotics offers users self-developed innovative application solutions such as autonomous driving, mobile grasping, and navigation positioning, helping users in various industries achieve automation. Additionally, AgileX Robotics has introduced research and education software and hardware products related to machine learning, embodied intelligence, and visual algorithms. The company works closely with research and educational institutions to promote robotics technology teaching and innovation.

+

Appendix

+

ros_astra_camera configuration

+

ros_astra_camera (GitHub) · ros_astra_camera (Gitee)

+

Camera Parameters

+
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
Name | Parameters
Baseline | 40 mm
Depth distance | 0.3–3 m
Depth map resolution | 640x400@30fps, 320x200@30fps
Color image resolution | 1920x1080@30fps, 1280x720@30fps, 640x480@30fps
Accuracy | 6 mm @ 1 m (81% FOV area in accuracy calculation)
Depth FOV | H 67.9°, V 45.3°
Color FOV | H 71°, V 43.7° @ 1920x1080
Delay | 30–45 ms
Data transmission | USB 2.0 or above
Working temperature | 10°C–40°C
Size | Length 59.5 x Width 17.4 x Thickness 11.1 mm
+
  1. OrbbecSDK_ROS1 driver installation
# 1 Install dependencies
+sudo apt install libgflags-dev  ros-$ROS_DISTRO-image-geometry ros-$ROS_DISTRO-camera-info-manager ros-$ROS_DISTRO-image-transport ros-$ROS_DISTRO-image-publisher ros-$ROS_DISTRO-libuvc-ros libgoogle-glog-dev libusb-1.0-0-dev libeigen3-dev 
+
+# 2 Download the code
+## 2.1 github
+git clone https://github.com/orbbec/OrbbecSDK_ROS1.git astra_ws/src
+## 2.2 gitee(Chinese region)
+git clone https://gitee.com/orbbecdeveloper/OrbbecSDK_ROS1 -b v1.4.6 astra_ws/src
+
+# 3 Compile orbbec_camera
+## 3.1 Enter astra_ws workspace
+cd astra_ws
+## 3.2 Compile orbbec_camera
+catkin_make
+
+# 4 Install udev rules.
+source devel/setup.bash && rospack list
+roscd orbbec_camera/scripts
+sudo cp 99-obsensor-libusb.rules /etc/udev/rules.d/99-obsensor-libusb.rules
+sudo udevadm control --reload && sudo  udevadm trigger
+
+# 5 Add ros_astra_camera package environment variables
+## 5.1 Enter astra_ws
+cd astra_ws
+## 5.2 Add environment variables
+echo "source $(pwd)/devel/setup.bash" >> ~/.bashrc 
+## 5.3 Make the environment variables take effect
+source ~/.bashrc
+
+# 6 Launch
+## If step 5 was not performed, run the command in 6.2 in every new terminal so the workspace environment takes effect.
+## 6.1 Enter astra_ws
+cd astra_ws
+## 6.2 Source the workspace
+source devel/setup.bash
+## 6.3 Launch astra.launch
+roslaunch orbbec_camera astra.launch
+## 6.4 Launch dabai.launch
+roslaunch orbbec_camera dabai.launch
+
+
  2. Configure orbbec_camera multiple camera nodes
+

① Check the device serial number

+

● After installing the camera, run the following code

+
rosrun orbbec_camera list_devices_node | grep -i serial
+
+

● The output in the terminal

+
[ INFO] [1709728787.207920484]: serial: AU1P32201SA
+# Please record this serial number. Each camera corresponds to a unique serial number.
+
+
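When scripting the setup, the serial can also be pulled out of the log line programmatically. A small sketch, assuming the log format shown above (extract_serial is a hypothetical helper, not part of orbbec_camera):

```python
# Hypothetical helper: pull the serial number out of an
# orbbec_camera log line (format taken from the sample output above).
import re

def extract_serial(line):
    """Return the camera serial from a 'serial: ...' log line, or None."""
    match = re.search(r"serial:\s*(\S+)", line)
    return match.group(1) if match else None

serial = extract_serial("[ INFO] [1709728787.207920484]: serial: AU1P32201SA")
```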

② Configure multiple camera nodes

+

● Cobot Magic uses three Orbbec Dabai cameras, so the corresponding camera node must be configured according to each camera’s serial number.

+

● Plug the USB cables of the three cameras into the industrial PC and run the command from step ① to view the serial numbers of the three cameras.

+

● In order to clarify the topics corresponding to each camera in subsequent development, please fill in the Serial number in order.

+

● Create the multi_dabai.launch file in the astra_ws/src/launch directory with the following content:

+
<!-- Mainly modify: (1) camera name prefix, (2) serial number -->
+<launch>
+    <arg name="camera_name" default="camera"/>
+    <arg name="3d_sensor" default="dabai"/>
+    
+     <!-- 1 Mainly modify 1 camera name prefix and 2 Serial number. -->
+    <arg name="camera1_prefix" default="01"/>
+    <arg name="camera2_prefix" default="02"/>
+    <arg name="camera3_prefix" default="03"/>
+    
+    <!-- 2 Serial number: fill in each camera's serial number -->
+    <arg name="camera1_usb_port" default="camera1_serial_number"/>
+    <arg name="camera2_usb_port" default="camera2_serial_number"/>
+    <arg name="camera3_usb_port" default="camera3_serial_number"/>
+ 
+    <arg name="device_num" default="3"/>
+    <include file="$(find orbbec_camera)/launch/$(arg 3d_sensor).launch">
+        <arg name="camera_name" value="$(arg camera_name)_$(arg camera1_prefix)"/>
+        <arg name="usb_port" value="$(arg camera1_usb_port)"/>
+        <arg name="device_num" value="$(arg device_num)"/>
+    </include>
+ 
+    <include file="$(find orbbec_camera)/launch/$(arg 3d_sensor).launch">
+        <arg name="camera_name" value="$(arg camera_name)_$(arg camera2_prefix)"/>
+        <arg name="usb_port" value="$(arg camera2_usb_port)"/>
+        <arg name="device_num" value="$(arg device_num)"/>
+    </include>
+    
+    <include file="$(find orbbec_camera)/launch/$(arg 3d_sensor).launch">
+        <arg name="camera_name" value="$(arg camera_name)_$(arg camera3_prefix)"/>
+        <arg name="usb_port" value="$(arg camera3_usb_port)"/>
+        <arg name="device_num" value="$(arg device_num)"/>
+    </include>
+</launch>
+
+

● Add permissions

+
# 1 Enter orbbec_camera/launch/
+roscd orbbec_camera/launch/
+ 
+# 2 Make multi_dabai.launch executable
+chmod +x multi_dabai.launch
+
+

● Launch ros

+
roslaunch orbbec_camera multi_dabai.launch
+
+

1 post - 1 participant

+

Read full topic

+ + + + + + + +
+

+by Agilex_Robotics on March 08, 2024 02:01 AM +

+ +
+ + + + + + + + + +
+ + +
New Packages for Noetic 2024-03-07
+ + + +
+ +
+

We’re happy to announce 10 new packages and 46 updates are now available in ROS Noetic. This sync was tagged as noetic/2024-03-07.

+

Thank you to every maintainer and contributor who made these updates available!

+

Package Updates for ROS Noetic

+

Added Packages [10]:

+
    +
  • ros-noetic-atf: 0.1.1-1
  • +
  • ros-noetic-atf-core: 0.1.1-1
  • +
  • ros-noetic-atf-metrics: 0.1.1-1
  • +
  • ros-noetic-atf-msgs: 0.1.1-1
  • +
  • ros-noetic-atf-plotter: 0.1.1-1
  • +
  • ros-noetic-atf-recorder-plugins: 0.1.1-1
  • +
  • ros-noetic-atf-test: 0.1.1-1
  • +
  • ros-noetic-atf-test-tools: 0.1.1-1
  • +
  • ros-noetic-etsi-its-rviz-plugins: 2.0.1-1
  • +
  • ros-noetic-py-binding-tools: 1.0.0-1
  • +
+

Updated Packages [46]:

+
    +
  • ros-noetic-cras-cpp-common: 2.3.8-1 → 2.3.9-1
  • +
  • ros-noetic-cras-docs-common: 2.3.8-1 → 2.3.9-1
  • +
  • ros-noetic-cras-py-common: 2.3.8-1 → 2.3.9-1
  • +
  • ros-noetic-cras-topic-tools: 2.3.8-1 → 2.3.9-1
  • +
  • ros-noetic-etsi-its-cam-coding: 2.0.0-1 → 2.0.1-1
  • +
  • ros-noetic-etsi-its-cam-conversion: 2.0.0-1 → 2.0.1-1
  • +
  • ros-noetic-etsi-its-cam-msgs: 2.0.0-1 → 2.0.1-1
  • +
  • ros-noetic-etsi-its-coding: 2.0.0-1 → 2.0.1-1
  • +
  • ros-noetic-etsi-its-conversion: 2.0.0-1 → 2.0.1-1
  • +
  • ros-noetic-etsi-its-denm-coding: 2.0.0-1 → 2.0.1-1
  • +
  • ros-noetic-etsi-its-denm-conversion: 2.0.0-1 → 2.0.1-1
  • +
  • ros-noetic-etsi-its-denm-msgs: 2.0.0-1 → 2.0.1-1
  • +
  • ros-noetic-etsi-its-messages: 2.0.0-1 → 2.0.1-1
  • +
  • ros-noetic-etsi-its-msgs: 2.0.0-1 → 2.0.1-1
  • +
  • ros-noetic-etsi-its-msgs-utils: 2.0.0-1 → 2.0.1-1
  • +
  • ros-noetic-etsi-its-primitives-conversion: 2.0.0-1 → 2.0.1-1
  • +
  • ros-noetic-gnss-info: 1.0.1-1 → 1.0.2-1
  • +
  • ros-noetic-gnss-info-msgs: 1.0.1-1 → 1.0.2-1
  • +
  • ros-noetic-gnsstk-ros: 1.0.1-1 → 1.0.2-1
  • +
  • ros-noetic-image-transport-codecs: 2.3.8-1 → 2.3.9-1
  • +
  • ros-noetic-libmavconn: 1.17.0-1 → 1.18.0-1
  • +
  • ros-noetic-mavlink: 2023.9.9-1 → 2024.3.3-1
  • +
  • ros-noetic-mavros: 1.17.0-1 → 1.18.0-1
  • +
  • ros-noetic-mavros-extras: 1.17.0-1 → 1.18.0-1
  • +
  • ros-noetic-mavros-msgs: 1.17.0-1 → 1.18.0-1
  • +
  • ros-noetic-mrpt2: 2.11.9-1 → 2.11.11-1
  • +
  • ros-noetic-mvsim: 0.8.3-1 → 0.9.1-2
  • +
  • ros-noetic-ntrip-client: 1.2.0-1 → 1.3.0-1
  • +
  • ros-noetic-rtabmap: 0.21.3-1 → 0.21.4-1
  • +
  • ros-noetic-rtabmap-conversions: 0.21.3-4 → 0.21.4-1
  • +
  • ros-noetic-rtabmap-costmap-plugins: 0.21.3-4 → 0.21.4-1
  • +
  • ros-noetic-rtabmap-demos: 0.21.3-4 → 0.21.4-1
  • +
  • ros-noetic-rtabmap-examples: 0.21.3-4 → 0.21.4-1
  • +
  • ros-noetic-rtabmap-launch: 0.21.3-4 → 0.21.4-1
  • +
  • ros-noetic-rtabmap-legacy: 0.21.3-4 → 0.21.4-1
  • +
  • ros-noetic-rtabmap-msgs: 0.21.3-4 → 0.21.4-1
  • +
  • ros-noetic-rtabmap-odom: 0.21.3-4 → 0.21.4-1
  • +
  • ros-noetic-rtabmap-python: 0.21.3-4 → 0.21.4-1
  • +
  • ros-noetic-rtabmap-ros: 0.21.3-4 → 0.21.4-1
  • +
  • ros-noetic-rtabmap-rviz-plugins: 0.21.3-4 → 0.21.4-1
  • +
  • ros-noetic-rtabmap-slam: 0.21.3-4 → 0.21.4-1
  • +
  • ros-noetic-rtabmap-sync: 0.21.3-4 → 0.21.4-1
  • +
  • ros-noetic-rtabmap-util: 0.21.3-4 → 0.21.4-1
  • +
  • ros-noetic-rtabmap-viz: 0.21.3-4 → 0.21.4-1
  • +
  • ros-noetic-test-mavros: 1.17.0-1 → 1.18.0-1
  • +
  • ros-noetic-ur-client-library: 1.3.4-1 → 1.3.5-1
  • +
+

Removed Packages [0]:

+

Thanks to all ROS maintainers who make packages available to the ROS community. The above list of packages was made possible by the work of the following maintainers:

+
    +
  • Felix Exner
  • +
  • Florian Weisshardt
  • +
  • Jean-Pierre Busch
  • +
  • Jose-Luis Blanco-Claraco
  • +
  • Martin Pecka
  • +
  • Mathieu Labbe
  • +
  • Rob Fisher
  • +
  • Robert Haschke
  • +
  • Vladimir Ermakov
  • +
+

2 posts - 2 participants

+

Read full topic

+ + + + + + + +
+

+by sloretz on March 08, 2024 01:28 AM +

+ +
+ +
March 04, 2024
+ + + + + + + + +
+ + +
Interoperability Interest Group March 7, 2024: Standardizing Infrastructure Messages, Part 3
+ + + +
+ +
+

Community Page

+

Meeting Link

+

Calendar Link

+

Continuing our discussion from the last session, our next session will get into more depth on how errors for building infrastructure devices should be represented.

+

Some questions to consider:

+
    +
  • What level of detail needs to be standardized for error messages? +
      +
    • Is it enough to simply communicate that the device is unusable?
    • +
    • Should the standardized error messages also provide enough information for a technician to troubleshoot the device?
    • +
    • Should detailed troubleshooting information be provided through a separate non-standard channel instead?
    • +
    +
  • +
  • How efficient should error messages be? +
      +
    • A simple error code is high-performance and allows for millions of possible error types, but it can only communicate the presence of one error at a time
    • +
    • Bitsets can express multiple simultaneous errors with high performance, but the number of error types is severely limited
    • +
    • Dynamic arrays of error codes can communicate many types of errors with no limit, but they require heap allocations
    • +
    • A string of serialized JSON can represent unlimited types of errors and provide troubleshooting information for them, but it requires heap allocation and string parsing
    • +
    +
  • +
  • Should standardized error definitions be specific to each type of building device, or should the definitions be abstract enough to use across all/multiple devices? +
      +
    • E.g. are doors and elevators different enough that they need their own error code definitions?
    • +
    • What kind of errors should we expect to report for each different type of device?
    • +
    +
  • +
+
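To make these trade-offs concrete, here is a small Python sketch contrasting three of the encodings above: a single error code, a fixed bitset, and a dynamic list serialized as JSON. The error names and payload fields are hypothetical, not drawn from any existing standard.

```python
import json

# Illustrative error catalogue; these names and bit assignments are
# hypothetical, not from any existing message standard.
DOOR_STUCK = 1 << 0
SENSOR_FAULT = 1 << 1
POWER_LOSS = 1 << 2

# (a) Single error code: compact, but only one error at a time.
single_code = SENSOR_FAULT

# (b) Bitset: multiple simultaneous errors at fixed size, but the number
# of distinct error types is bounded by the width of the integer.
bitset = DOOR_STUCK | POWER_LOSS
active = [name for name, bit in
          [("DOOR_STUCK", DOOR_STUCK),
           ("SENSOR_FAULT", SENSOR_FAULT),
           ("POWER_LOSS", POWER_LOSS)]
          if bitset & bit]

# (c) JSON string: unlimited error types plus troubleshooting detail,
# at the cost of heap allocation and parsing on the receiving side.
payload = json.dumps([
    {"code": "DOOR_STUCK", "detail": "obstruction detected at 0.3 m"},
    {"code": "POWER_LOSS", "detail": "UPS on battery, 12 min remaining"},
])
decoded = json.loads(payload)
```

The sketch shows why the choice is really about how many simultaneous errors must be expressible and who pays the parsing cost, which is exactly the question the session is asking.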

We will be seeking input on all of the above questions and more. Please come armed with examples of your most hated device errors that you think a good standard should be able to express.

+

4 posts - 3 participants

+

Read full topic

+ + + + + + + +
+

+by grey on March 04, 2024 03:23 PM +

+ +
+ + + + + + + + + +
+ + +
ROS Mapping and Navigation with AgileX Robotics Limo
+ + + +
+ +
+

Limo is a smart educational robot published by AgileX Robotics. For more details, please visit: https://global.agilex.ai/
+

+

Four steering modes make LIMO substantially superior to other robots in its class. The available modes are: Omni-Wheel Steering, Tracked Steering, Four-Wheel Differential Steering and Ackermann Steering. These advanced steering modes plus a built-in 360° scanning LiDAR and RealSense infrared camera make the platform perfect for industrial and commercial tasks in any scenario. With these incredible features, LIMO can achieve precise self-localization, SLAM mapping, route planning and autonomous obstacle avoidance, reverse parking, traffic light recognition, and more.
+

+

Mapping

+

Gmapping

+

Gmapping is a widely adopted open-source SLAM algorithm that operates within the filtering SLAM framework. It makes effective use of wheel odometry data and does not demand high-frequency LiDAR scans. When mapping a smaller environment, Gmapping requires minimal computational resources while maintaining high accuracy. Here, the ROS-wrapped Gmapping package is used to build the map on the Limo.

+

Note: Drive the Limo slowly while mapping; moving too fast will degrade the quality of the map.

+

Run the following command in a new terminal to launch the LiDAR:

+
 roslaunch limo_bringup limo_start.launch pub_odom_tf:=false
+
+

Then launch the gmapping algorithm. Open another new terminal, and enter the command:

+
roslaunch limo_bringup limo_gmapping.launch
+
+

After launching successfully, the rviz visualization tool will start up. The interface is shown in the figure.
+

+

At this point, switch the handle to remote-control mode and drive the Limo around to build the map.

+

After building the map, run the following command to save the map to the specified directory:

+
    +
  1. Switch to the directory where the map will be saved, ~/agilex_ws/src/limo_ros/limo_bringup/maps/, by entering the command in the terminal:
  2. +
+
cd ~/agilex_ws/src/limo_ros/limo_bringup/maps/
+
+
    +
  2. After switching to the maps directory, continue by entering the command in the terminal:
  2. +
+
rosrun map_server map_saver -f map1
+
+

Note: map1 is the name of the saved map, and duplicate names should be avoided when saving the map.

+

Cartographer

+

Cartographer is a set of SLAM algorithms from Google based on graph optimization. Its main goal is to achieve real-time SLAM with low computational resource consumption. The algorithm is divided into two parts. The first part, Local SLAM, builds and maintains a series of submaps from each frame of the laser scan; each submap is a grid map. The second part, Global SLAM, performs loop closure detection to eliminate accumulated error: once a submap is finished, no new laser scans are inserted into it, and the algorithm adds it to the loop-closure detection.

+

Note: Before running the command, please make sure that the programs in other terminals have been terminated. The termination command is: Ctrl+c.

+

Note: Drive the Limo slowly while mapping; moving too fast will degrade the quality of the map.

+

Launch a new terminal and enter the command:

+
roslaunch limo_bringup limo_start.launch pub_odom_tf:=false
+
+

Then start the cartographer mapping algorithm. Open another new terminal and enter the command:

+
roslaunch limo_bringup limo_cartographer.launch
+
+

After launching successfully, the rviz visualization interface will be shown as the figure below:
+

+

After building the map, it needs to be saved. The following three commands need to be entered in the terminal:

+

(1) Finish the current trajectory so that no further data is accepted:

+
rosservice call /finish_trajectory 0
+
+

(2) Serialize and save the current state:

+
rosservice call /write_state "{filename: '${HOME}/agilex_ws/src/limo_ros/limo_bringup/maps/mymap.pbstream'}"
+
+

(3) Convert the pbstream to pgm and yaml:

+
rosrun cartographer_ros cartographer_pbstream_to_ros_map -map_filestem=${HOME}/agilex_ws/src/limo_ros/limo_bringup/maps/mymap.pbstream -pbstream_filename=${HOME}/agilex_ws/src/limo_ros/limo_bringup/maps/mymap.pbstream -resolution=0.05
+
+

This generates the corresponding .pgm and .yaml files in the maps directory:

+

${HOME}/agilex_ws/src/limo_ros/limo_bringup/maps/

+

Note: During mapping, some warnings will appear in the terminal. They are caused by excessive speed and delayed data processing and can be ignored.
+

+

Navigation

+

Navigation framework

+

The key to navigation is robot positioning and path planning. For these, ROS provides the following two packages.

+

(1) move_base: performs optimal path planning for robot navigation.

+

(2) amcl: performs robot localization on a two-dimensional map.

+

On the basis of the above two packages, ROS provides a complete navigation framework.
+


+The robot only needs to publish the necessary sensor information and navigation goal position, and ROS can complete the navigation function. In this framework, the move_base package provides the main operation and interactive interface of navigation. In order to ensure the accuracy of the navigation path, the robot also needs to accurately locate its own position. This part of the function is implemented by the amcl package.

+

1.1 Move_base package

+

move_base is a package for path planning in ROS, which is mainly composed of the following two planners.

+

(1) Global path planning (global_planner). Global path planning computes an overall path from a given goal position and the global map. In navigation, the Dijkstra or A* algorithm performs this global planning, computing the optimal route from the robot to the goal position as the robot’s global path.

+
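As a minimal illustration of what the global planner computes, here is the Dijkstra idea on a small occupancy grid. This is a sketch of the algorithm only, not move_base's actual implementation:

```python
import heapq

def dijkstra_grid(grid, start, goal):
    """Shortest 4-connected path length on an occupancy grid (0 = free, 1 = occupied)."""
    rows, cols = len(grid), len(grid[0])
    dist = {start: 0}
    pq = [(0, start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            return d
        if d > dist.get((r, c), float("inf")):
            continue  # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                nd = d + 1
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    heapq.heappush(pq, (nd, (nr, nc)))
    return None  # goal unreachable

# A wall on the middle row forces a detour around the right side.
grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path_len = dijkstra_grid(grid, (0, 0), (2, 0))
```

The real global planner works the same way on the global costmap, with weighted edge costs instead of unit steps.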

(2) Local real-time planning (local_planner). In practice, robots often cannot strictly follow the global path, so the local planner must plan, in every control cycle, the path the robot should actually travel based on the map and any obstacles that may appear nearby, keeping it as close to the global optimal path as possible.

+

1.2 Amcl package

+

Autonomous localization means that the robot can compute its position on the map in any state. ROS provides developers with amcl, a probabilistic localization system for mobile robots in 2D. It implements adaptive (KLD-sampling) Monte Carlo localization, using a particle filter to track the robot’s pose against a known map.

+
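The core of a Monte Carlo localization step can be sketched as a measurement-weighted update followed by resampling. The following is a simplified 1-D illustration with made-up numbers, not amcl's actual API:

```python
import math
import random

def mcl_step(particles, measured_range, expected_range_fn, sigma=0.1):
    """One measurement update + resampling step of Monte Carlo localization."""
    # Weight each particle by how well its predicted measurement matches.
    weights = []
    for p in particles:
        err = measured_range - expected_range_fn(p)
        weights.append(math.exp(-err * err / (2 * sigma * sigma)))
    total = sum(weights)
    weights = [w / total for w in weights]
    # Resample particles in proportion to their weights.
    return random.choices(particles, weights=weights, k=len(particles))

random.seed(0)
# Robot somewhere on a 10 m corridor; wall at x = 10 m, range sensor reads 7.0 m,
# so the true position is near x = 3 m.
particles = [random.uniform(0.0, 10.0) for _ in range(500)]
for _ in range(5):
    particles = mcl_step(particles, 7.0, lambda x: 10.0 - x)
estimate = sum(particles) / len(particles)
```

After a few updates the particle cloud collapses around the position consistent with the measurement, which is exactly the behavior you see in rviz when amcl converges.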

1.3 Introduction of DWA_planner and TEB_planner

+
DWA_planner
+

The full name of DWA is Dynamic Window Approach. The algorithm samples multiple candidate trajectories, selects the optimal one based on evaluation criteria (whether it would hit an obstacle, the time required, etc.), and computes the linear and angular velocities for the current control cycle so as to avoid collisions with dynamic obstacles.

+
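A heavily simplified sketch of the DWA idea, illustrative only and not the dwa_local_planner source: sample (v, w) pairs from the velocity window, forward-simulate each trajectory, discard those that would collide, and score the rest.

```python
import math

def dwa_choose(v_range, w_range, goal, obstacles, dt=0.1, steps=10):
    """Pick the (v, w) command whose forward-simulated trajectory scores best."""
    best, best_cmd = -float("inf"), (0.0, 0.0)
    for v in v_range:
        for w in w_range:
            x = y = th = 0.0
            min_clear = float("inf")
            for _ in range(steps):  # roll a unicycle model forward
                th += w * dt
                x += v * math.cos(th) * dt
                y += v * math.sin(th) * dt
                for ox, oy in obstacles:
                    min_clear = min(min_clear, math.hypot(x - ox, y - oy))
            if min_clear < 0.2:  # trajectory would pass too close: discard
                continue
            goal_dist = math.hypot(goal[0] - x, goal[1] - y)
            # Score: get close to the goal, reward clearance (capped at 1 m).
            score = -goal_dist + 0.5 * min(min_clear, 1.0)
            if score > best:
                best, best_cmd = score, (v, w)
    return best_cmd

# Obstacle directly ahead forces the planner off the fast straight commands.
v, w = dwa_choose(
    v_range=[0.2, 0.4, 0.6],
    w_range=[-0.5, 0.0, 0.5],
    goal=(1.0, 0.0),
    obstacles=[(0.5, 0.0)],
)
```

The real planner evaluates many more samples against the local costmap and adds velocity/acceleration feasibility checks, but the sample-simulate-score loop is the same.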
TEB_planner
+

The full name of “TEB” is Timed Elastic Band local planner. This method modifies the initial trajectory generated by the global path planner to optimize the robot’s motion trajectory, and belongs to local path planning. During trajectory optimization, the algorithm has a variety of objectives, including but not limited to: overall path length, trajectory execution time, distance to obstacles, passage through intermediate waypoints, and compliance with the robot’s dynamics, kinematics, and geometric constraints. The TEB method explicitly considers spatio-temporal dynamic constraints on the motion state, for example limits on the robot’s velocity and acceleration.

+

Limo navigation

+

Note: In the four-wheel differential mode, the omnidirectional wheel mode and the track mode, the same launch file is used for navigation.

+

Note: Before running the command, please make sure that the programs in other terminals have been terminated. The termination command is: Ctrl+c.

+

(1)First launch the LiDAR and enter the command in the terminal:

+
roslaunch limo_bringup limo_start.launch pub_odom_tf:=false
+
+

(2)Launch the navigation and enter the command in the terminal:

+
roslaunch limo_bringup limo_navigation_diff.launch
+
+

Note: If it is Ackermann motion mode, please run:

+
roslaunch limo_bringup limo_navigation_ackerman.launch
+
+

After launching successfully, the rviz interface will be shown in the figure below:
+

+

Note: To open a custom map, edit the limo_navigation_diff.launch file and modify its parameters. The file directory is: ~/agilex_ws/src/limo_ros/limo_bringup/launch. Change map02 to the name of the map you want to use.

+

+

(3) After launching the navigation, the laser-scanned shape may not align with the map, requiring manual correction. To rectify this, adjust the estimated position of the chassis to match the actual scene shown on the rviz map: use the rviz tools to give the vehicle an approximate initial pose estimate, then use the handle to remotely rotate the vehicle until automatic alignment is achieved. Once the laser shape overlaps the scene shape on the map, the correction is complete. The operational steps are as follows:
+

+

The correction is completed:

+

+

(4)Set the navigation goal point through 2D Nav Goal.
+

+

A purple path will be generated on the map. Switch the handle to command mode, and Limo will automatically navigate to the goal point.

+

+

Limo path inspection

+

(1)First launch the LiDAR and enter the command in the terminal:

+
roslaunch limo_bringup limo_start.launch pub_odom_tf:=false
+
+

(2)Launch the navigation and enter the command in the terminal:

+
roslaunch limo_bringup limo_navigation_diff.launch
+
+

Note: If it is Ackermann motion mode, please run:

+
roslaunch limo_bringup limo_navigation_ackerman.launch
+
+

(3)Launch the path recording function. Open a new terminal, and enter the command in the terminal:

+
roslaunch agilex_pure_pursuit record_path.launch
+
+

After the path recording is completed, terminate the path-recording program by pressing Ctrl+C in its terminal.

+

(4) Launch the path inspection function. Open a new terminal, and enter the command in the terminal:

+

Note: Switch the handle to command mode.

+
roslaunch agilex_pure_pursuit pure_pursuit.launch
+
+
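For reference, the classic pure pursuit steering law that this kind of path-tracking node implements can be sketched as follows. The function and parameter names here are illustrative, not the agilex_pure_pursuit API:

```python
import math

def pure_pursuit_steer(pose, lookahead_pt, wheelbase, lookahead_dist):
    """Steering angle driving an Ackermann (bicycle-model) vehicle toward a lookahead point."""
    x, y, yaw = pose
    lx, ly = lookahead_pt
    # Angle between the vehicle heading and the line to the lookahead point.
    alpha = math.atan2(ly - y, lx - x) - yaw
    # Classic pure pursuit: curvature 2*sin(alpha)/Ld mapped to a steering angle.
    return math.atan2(2.0 * wheelbase * math.sin(alpha), lookahead_dist)

# A point dead ahead yields zero steering; a point to the left, positive steering.
straight = pure_pursuit_steer((0.0, 0.0, 0.0), (1.0, 0.0), 0.2, 1.0)
left = pure_pursuit_steer((0.0, 0.0, 0.0), (1.0, 0.5), 0.2, 1.0)
```

The recorded path from the previous step supplies the stream of lookahead points; the controller simply chases the point a fixed distance ahead on that path.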

2 posts - 2 participants

+

Read full topic

+ + + + + + + +
+

+by Agilex_Robotics on March 04, 2024 07:28 AM +

+ +
+ +
March 03, 2024
+ + + + + + + + +
+ + +
ROS Meetup Arab
+ + + +
+ +
+

We’re excited to introduce the forthcoming installment of our Arabian Meet series, centered around the captivating theme of “Autonomous Racing: Advancing the Frontiers of Automated Technology.”

+

The topics we’ll explore include:

+
    +
  • Introduction to Autonomous Racing.
  • +
  • Autonomous Racing Competitions.
  • +
  • Racing Cars & Sensor Technologies.
  • +
  • ROS-Based Racing Simulator.
  • +
  • Autonomous Racing Software Architecture.
  • +
+

Stay tuned for more updates and save the date for this enlightening conversation! :spiral_calendar:

+

Save the date on the calendar:

+

+You can find the meeting link here:

+ +

+

1 post - 1 participant

+

Read full topic

+ + + + + + + +
+

+by khaledgabr77 on March 03, 2024 07:16 AM +

+ +
+ +
March 02, 2024
+ + + + + + + + +
+ + +
Potential Humanoid Robotics Monthly Working Group
+ + + +
+ +
+

Hi Everyone,

+

I want to introduce myself: my name is Ronaldson Bellande. I’m a PhD student and founder/CEO/CTO/director of research organizations and a startup I’m working on. You can find more information about me on my LinkedIn and GitHub profiles.

+

I wanted to create a working group that meets monthly to discuss humanoid robotics: what everyone is working on, what you are looking for and excited about, anything interesting you are working in, and more in the space of humanoid robotics.

+

If there is interest, I will start a working group. I’m passionate about this subject and about the related activities I’m constantly engaged in.

+

13 posts - 8 participants

+

Read full topic

+ + + + + + + +
+

+by RonaldsonBellande on March 02, 2024 01:57 AM +

+ +
+ +
March 01, 2024
+ + + + + + + + +
+ + +
ROS News for the Week of February 26th, 2024
+ + + +
+ +
+

ROS News for the Week of February 26th, 2024

+
+

belt2

+

In manufacturing I’ve seen talented people do things with clever light placement that transform an extremely difficult computer vision task into something that’s easily solved. I came across this paper this week that does just that for the robotic manipulation of objects. The paper is titled, “Dynamics-Guided Diffusion Model for Robot Manipulator Design” and the authors use diffusion models to make simple grippers that can manipulate a specific object into a given pose. The results are pretty cool and could be very useful for any roboticist with a 3D printer.

+
+


+Amazon is putting up US$1B to fund startups that combine robotics and “AI.” While regular startup investment has fallen off a bit, it looks like there are still funding opportunities for robotics companies.

+
+


+Last week everyone was talking about how NVIDIA’s market cap had hit US$2T. According to this LinkedIn post they are putting that money to good use by funding the development of the open source Nav2 project.

+
+


+Cross sensor calibration is a pain in the :peach:. A good robot model can only get you so far, and getting a bunch of sensor data to match up can be difficult for even the most seasoned engineers. The Github repository below attempts to build a toolbox to fix some of these problems. LVT2Calib: Automatic and Unified Extrinsic Calibration Toolbox for Different 3D LiDAR, Visual Camera and Thermal Camera (paper)

+

Events

+ +

News

+ +

ROS

+ +

ROS Questions

+

+

Got a minute? Please take a moment to answer a question on Robotics Stack Exchange and help out your fellow ROS users.

+

1 post - 1 participant

+

Read full topic

+ + + + + + + +
+

+by Katherine_Scott on March 01, 2024 06:15 PM +

+ +
+ + + + + + + + + +
+ + +
Revival of client library working group?
+ + + +
+ +
+

Hi,
+are there any plans to revive this working group?

+

15 posts - 5 participants

+

Read full topic

+ + + + + + + +
+

+by JM_ROS on March 01, 2024 04:19 PM +

+ +
+ + + + + + + + + +
+ + +
Scalability issues with large number of nodes
+ + + +
+ +
+

My team and I are developing a mobile platform for industrial tasks (such as rivet fastening or drilling), fully based on the ROS 2 stack (Humble).

+

The stack comprises a number of nodes for different tasks (SLAM, motion planning, fiducial registration…) that are coordinated through a state machine node (based on smach).

+

The issue we are facing is that the state machine node (which is connected to most of the nodes in the stack) gets slower and slower until it stops receiving events from other nodes.

+

We’ve been debugging this issue and our feeling is that the number of objects (nodes/clients/subscribers…) is too high and the whole stack suffers a lot of overhead, which is most noticeable in the “biggest” node (the state machine).

+

Our stack has 80 nodes and a total of 1505 objects:

+
    +
  • Stack clients: 198
  • +
  • Stack services: 636
  • +
  • Stack publishers: 236
  • +
  • Stack subscribers: 173
  • +
+
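One way to see why middleware load grows faster than the node count: in DDS-based RMW implementations, a service maps to a request reader plus a response writer, and a client to the mirror pair, so the figures above imply roughly the following endpoint tally. This is a back-of-envelope estimate, not a measurement of any particular RMW:

```python
# Figures quoted above for the stack in question.
clients, services = 198, 636
publishers, subscribers = 236, 173

# In DDS-based RMWs, each service is a request reader + response writer,
# and each client a request writer + response reader: two endpoints apiece.
dds_endpoints = publishers + subscribers + 2 * (services + clients)
```

Every one of those endpoints participates in discovery and matching, which is why a stack dominated by services and clients can feel much heavier than its 80-node count suggests.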

My questions are:

+
    +
  • Is this number of nodes too high for an industrial robotics project? How large are ROS 2 projects usually?
  • +
  • What is the maximum number of objects in the stack? Is this an rmw limitation or ROS 2 itself?
  • +
+

30 posts - 14 participants

+

Read full topic

+ + + + + + + +
+

+by leander2189 on March 01, 2024 09:35 AM +

+ +
+ +
February 26, 2024
+ + + + + + + + +
+ + +
Robot Fleet Management: Make vs. Buy? An Alternative
+ + + +
+ +
+

Virtually every robotics CTO we’ve spoken to has told us about this dilemma about fleet management systems: neither “make” nor “buy” are great options! With Transitive we are providing an alternative.

+ + +

8 posts - 5 participants

+

Read full topic

+ + + + + + + +
+

+by chfritz on February 26, 2024 11:00 PM +

+ +
+ + + + + + + + + +
+ + +
Rclcpp template metaprogramming bug. Help wanted
+ + + +
+ +
+

Hi,
+we hit a bug in the function traits that is out of my league.
+If you are really good with template metaprogramming, please have a look at:

+

+Thanks.

+

1 post - 1 participant

+

Read full topic

+ + + + + + + +
+

+by JM_ROS on February 26, 2024 09:54 AM +

+ +
+ +
February 23, 2024
+ + + + + + + + +
+ + +
ROS News for the Week of February 19th, 2024
+ + + +
+ +
+

ROS News for the Week of February 19th, 2024

+


+Open Robotics will be participating in Google Summer of Code 2024. We’re looking for a few interns to help us out! See the post for all the details.

+
+

+

Our next Gazebo Community meeting is next Wednesday, February 28th. Sikiru Salau, a competitor in the Pan-African Robotics Competition, will be joining us to talk about simulating robots for agriculture.

+
+

image
+Hello Robot is having a great month! Last week they released their third gen robot. This week they are at the top of the orange website with this “OK Robot” paper from NYU

+
+


+Check out the AutoNav robot by Jatin Patil. Hats off to the developer, this is a really well put together personal project!

+
+


+Just a reminder: Gazebo Classic goes End-Of-Life in January 2025 and ROS 2 Jazzy will not support Gazebo Classic. We put together some guidance for those of you that need to make the switch!

+

Events

+ +

News

+ +

ROS

+ +

ROS Questions

+

Got a minute to spare? Pay it forward by answering a few ROS questions on Robotics Stack Exchange.

+

3 posts - 3 participants

+

Read full topic

+ + + + + + + +
+

+by Katherine_Scott on February 23, 2024 11:13 PM +

+ +
+ + + + + + + + + +
+ + +
New Packages for Iron Irwini 2024-02-23
+ + + +
+ +
+

We’re happy to announce 2 new packages and 75 updates are now available in ROS 2 Iron Irwini :iron: :irwini:. This sync was tagged as iron/2024-02-23.

+

Package Updates for iron

+

Added Packages [2]:

+
    +
  • ros-iron-apriltag-detector: 1.2.0-1
  • +
  • ros-iron-multidim-rrt-planner: 0.0.8-1
  • +
+

Updated Packages [75]:

+
    +
  • ros-iron-ackermann-steering-controller: 3.21.0-1 → 3.22.0-1
  • +
  • ros-iron-ackermann-steering-controller-dbgsym: 3.21.0-1 → 3.22.0-1
  • +
  • ros-iron-admittance-controller: 3.21.0-1 → 3.22.0-1
  • +
  • ros-iron-admittance-controller-dbgsym: 3.21.0-1 → 3.22.0-1
  • +
  • ros-iron-azure-iot-sdk-c: 1.10.1-4 → 1.12.0-1
  • +
  • ros-iron-bicycle-steering-controller: 3.21.0-1 → 3.22.0-1
  • +
  • ros-iron-bicycle-steering-controller-dbgsym: 3.21.0-1 → 3.22.0-1
  • +
  • ros-iron-bno055: 0.4.1-4 → 0.5.0-1
  • +
  • ros-iron-diff-drive-controller: 3.21.0-1 → 3.22.0-1
  • +
  • ros-iron-diff-drive-controller-dbgsym: 3.21.0-1 → 3.22.0-1
  • +
  • ros-iron-draco-point-cloud-transport: 2.0.3-1 → 2.0.4-1
  • +
  • ros-iron-draco-point-cloud-transport-dbgsym: 2.0.3-1 → 2.0.4-1
  • +
  • ros-iron-effort-controllers: 3.21.0-1 → 3.22.0-1
  • +
  • ros-iron-effort-controllers-dbgsym: 3.21.0-1 → 3.22.0-1
  • +
  • ros-iron-force-torque-sensor-broadcaster: 3.21.0-1 → 3.22.0-1
  • +
  • ros-iron-force-torque-sensor-broadcaster-dbgsym: 3.21.0-1 → 3.22.0-1
  • +
  • ros-iron-forward-command-controller: 3.21.0-1 → 3.22.0-1
  • +
  • ros-iron-forward-command-controller-dbgsym: 3.21.0-1 → 3.22.0-1
  • +
  • ros-iron-gripper-controllers: 3.21.0-1 → 3.22.0-1
  • +
  • ros-iron-gripper-controllers-dbgsym: 3.21.0-1 → 3.22.0-1
  • +
  • ros-iron-imu-sensor-broadcaster: 3.21.0-1 → 3.22.0-1
  • +
  • ros-iron-imu-sensor-broadcaster-dbgsym: 3.21.0-1 → 3.22.0-1
  • +
  • ros-iron-joint-state-broadcaster: 3.21.0-1 → 3.22.0-1
  • +
  • ros-iron-joint-state-broadcaster-dbgsym: 3.21.0-1 → 3.22.0-1
  • +
  • ros-iron-joint-trajectory-controller: 3.21.0-1 → 3.22.0-1
  • +
  • ros-iron-joint-trajectory-controller-dbgsym: 3.21.0-1 → 3.22.0-1
  • +
  • ros-iron-leo: 2.0.0-1 → 2.0.1-1
  • +
  • ros-iron-leo-description: 2.0.0-1 → 2.0.1-1
  • +
  • ros-iron-leo-msgs: 2.0.0-1 → 2.0.1-1
  • +
  • ros-iron-leo-msgs-dbgsym: 2.0.0-1 → 2.0.1-1
  • +
  • ros-iron-leo-teleop: 2.0.0-1 → 2.0.1-1
  • +
  • ros-iron-mp2p-icp: 1.1.0-1 → 1.2.0-1
  • +
  • ros-iron-mp2p-icp-dbgsym: 1.1.0-1 → 1.2.0-1
  • +
  • ros-iron-mrpt2: 2.11.7-1 → 2.11.9-1
  • +
  • ros-iron-mrpt2-dbgsym: 2.11.7-1 → 2.11.9-1
  • +
  • ros-iron-plotjuggler-ros: 2.1.0-1 → 2.1.1-1
  • +
  • ros-iron-plotjuggler-ros-dbgsym: 2.1.0-1 → 2.1.1-1
  • +
  • ros-iron-point-cloud-interfaces: 2.0.3-1 → 2.0.4-1
  • +
  • ros-iron-point-cloud-interfaces-dbgsym: 2.0.3-1 → 2.0.4-1
  • +
  • ros-iron-point-cloud-transport: 2.0.3-1 → 2.0.4-1
  • +
  • ros-iron-point-cloud-transport-dbgsym: 2.0.3-1 → 2.0.4-1
  • +
  • ros-iron-point-cloud-transport-plugins: 2.0.3-1 → 2.0.4-1
  • +
  • ros-iron-point-cloud-transport-py: 2.0.3-1 → 2.0.4-1
  • +
  • ros-iron-position-controllers: 3.21.0-1 → 3.22.0-1
  • +
  • ros-iron-position-controllers-dbgsym: 3.21.0-1 → 3.22.0-1
  • +
  • ros-iron-range-sensor-broadcaster: 3.21.0-1 → 3.22.0-1
  • +
  • ros-iron-range-sensor-broadcaster-dbgsym: 3.21.0-1 → 3.22.0-1
  • +
  • ros-iron-rcpputils: 2.6.2-1 → 2.6.3-1
  • +
  • ros-iron-rcpputils-dbgsym: 2.6.2-1 → 2.6.3-1
  • +
  • ros-iron-robotraconteur: 1.0.0-1 → 1.0.0-2
  • +
  • ros-iron-robotraconteur-dbgsym: 1.0.0-1 → 1.0.0-2
  • +
  • ros-iron-ros2-controllers: 3.21.0-1 → 3.22.0-1
  • +
  • ros-iron-ros2-controllers-test-nodes: 3.21.0-1 → 3.22.0-1
  • +
  • ros-iron-rqt: 1.3.3-1 → 1.3.4-1
  • +
  • ros-iron-rqt-gauges: 0.0.1-1 → 0.0.2-1
  • +
  • ros-iron-rqt-gui: 1.3.3-1 → 1.3.4-1
  • +
  • ros-iron-rqt-gui-cpp: 1.3.3-1 → 1.3.4-1
  • +
  • ros-iron-rqt-gui-cpp-dbgsym: 1.3.3-1 → 1.3.4-1
  • +
  • ros-iron-rqt-gui-py: 1.3.3-1 → 1.3.4-1
  • +
  • ros-iron-rqt-joint-trajectory-controller: 3.21.0-1 → 3.22.0-1
  • +
  • ros-iron-rqt-py-common: 1.3.3-1 → 1.3.4-1
  • +
  • ros-iron-steering-controllers-library: 3.21.0-1 → 3.22.0-1
  • +
  • ros-iron-steering-controllers-library-dbgsym: 3.21.0-1 → 3.22.0-1
  • +
  • ros-iron-tricycle-controller: 3.21.0-1 → 3.22.0-1
  • +
  • ros-iron-tricycle-controller-dbgsym: 3.21.0-1 → 3.22.0-1
  • +
  • ros-iron-tricycle-steering-controller: 3.21.0-1 → 3.22.0-1
  • +
  • ros-iron-tricycle-steering-controller-dbgsym: 3.21.0-1 → 3.22.0-1
  • +
  • ros-iron-velocity-controllers: 3.21.0-1 → 3.22.0-1
  • +
  • ros-iron-velocity-controllers-dbgsym: 3.21.0-1 → 3.22.0-1
  • +
  • ros-iron-vrpn-mocap: 1.0.3-3 → 1.1.0-1
  • +
  • ros-iron-vrpn-mocap-dbgsym: 1.0.3-3 → 1.1.0-1
  • +
  • ros-iron-zlib-point-cloud-transport: 2.0.3-1 → 2.0.4-1
  • +
  • ros-iron-zlib-point-cloud-transport-dbgsym: 2.0.3-1 → 2.0.4-1
  • +
  • ros-iron-zstd-point-cloud-transport: 2.0.3-1 → 2.0.4-1
  • +
  • ros-iron-zstd-point-cloud-transport-dbgsym: 2.0.3-1 → 2.0.4-1
  • +
+

Removed Packages [0]:

+

Thanks to all ROS maintainers who make packages available to the ROS community. The above list of packages was made possible by the work of the following maintainers:

+
    +
  • Alejandro Hernandez Cordero
  • +
  • Alejandro Hernández
  • +
  • Alvin Sun
  • +
  • Bence Magyar
  • +
  • Bernd Pfrommer
  • +
  • Brandon Ong
  • +
  • Davide Faconti
  • +
  • Denis Štogl
  • +
  • Dharini Dutia
  • +
  • Eloy Bricneo
  • +
  • Fictionlab
  • +
  • John Wason
  • +
  • Jose-Luis Blanco-Claraco
  • +
  • Martin Pecka
  • +
  • Tim Clephas
  • +
  • david
  • +
  • flynneva
  • +
+

1 post - 1 participant

+

Read full topic

+ + + + + + + +
+

+by Yadunund on February 23, 2024 10:35 AM +

+ +
+ + + + + + + + + +
+ + +
New packages and patch release for Humble Hawksbill 2024-02-22
+ + + +
+ +
+

We’re happy to announce a new Humble release!

+

This sync brings several new packages and some updates to ROS 2 core packages. (I’m not including the project board because it was empty.)

+
+

Package Updates for Humble

+

Added Packages [42]:

  • ros-humble-apriltag-detector: 1.1.0-1
  • ros-humble-as2-gazebo-assets: 1.0.8-1
  • ros-humble-as2-gazebo-assets-dbgsym: 1.0.8-1
  • ros-humble-as2-platform-dji-osdk: 1.0.8-1
  • ros-humble-as2-platform-dji-osdk-dbgsym: 1.0.8-1
  • ros-humble-as2-platform-gazebo: 1.0.8-1
  • ros-humble-as2-platform-gazebo-dbgsym: 1.0.8-1
  • ros-humble-caret-analyze: 0.5.0-1
  • ros-humble-caret-msgs: 0.5.0-6
  • ros-humble-caret-msgs-dbgsym: 0.5.0-6
  • ros-humble-data-tamer-cpp: 0.9.3-2
  • ros-humble-data-tamer-cpp-dbgsym: 0.9.3-2
  • ros-humble-data-tamer-msgs: 0.9.3-2
  • ros-humble-data-tamer-msgs-dbgsym: 0.9.3-2
  • ros-humble-hardware-interface-testing: 2.39.1-1
  • ros-humble-hardware-interface-testing-dbgsym: 2.39.1-1
  • ros-humble-hri-msgs: 2.0.0-1
  • ros-humble-hri-msgs-dbgsym: 2.0.0-1
  • ros-humble-mocap4r2-dummy-driver: 0.0.7-1
  • ros-humble-mocap4r2-dummy-driver-dbgsym: 0.0.7-1
  • ros-humble-mocap4r2-marker-viz: 0.0.7-1
  • ros-humble-mocap4r2-marker-viz-dbgsym: 0.0.7-1
  • ros-humble-mocap4r2-marker-viz-srvs: 0.0.7-1
  • ros-humble-mocap4r2-marker-viz-srvs-dbgsym: 0.0.7-1
  • ros-humble-motion-capture-tracking: 1.0.3-1
  • ros-humble-motion-capture-tracking-dbgsym: 1.0.3-1
  • ros-humble-motion-capture-tracking-interfaces: 1.0.3-1
  • ros-humble-motion-capture-tracking-interfaces-dbgsym: 1.0.3-1
  • ros-humble-psdk-interfaces: 1.0.0-1
  • ros-humble-psdk-interfaces-dbgsym: 1.0.0-1
  • ros-humble-psdk-wrapper: 1.0.0-1
  • ros-humble-psdk-wrapper-dbgsym: 1.0.0-1
  • ros-humble-qb-softhand-industry-description: 2.1.2-4
  • ros-humble-qb-softhand-industry-msgs: 2.1.2-4
  • ros-humble-qb-softhand-industry-msgs-dbgsym: 2.1.2-4
  • ros-humble-qb-softhand-industry-ros2-control: 2.1.2-4
  • ros-humble-qb-softhand-industry-ros2-control-dbgsym: 2.1.2-4
  • ros-humble-qb-softhand-industry-srvs: 2.1.2-4
  • ros-humble-qb-softhand-industry-srvs-dbgsym: 2.1.2-4
  • ros-humble-ros2caret: 0.5.0-2
  • ros-humble-sync-parameter-server: 1.0.1-2
  • ros-humble-sync-parameter-server-dbgsym: 1.0.1-2

Updated Packages [280]:

  • ros-humble-ament-cmake: 1.3.7-1 → 1.3.8-1
  • ros-humble-ament-cmake-auto: 1.3.7-1 → 1.3.8-1
  • ros-humble-ament-cmake-core: 1.3.7-1 → 1.3.8-1
  • ros-humble-ament-cmake-export-definitions: 1.3.7-1 → 1.3.8-1
  • ros-humble-ament-cmake-export-dependencies: 1.3.7-1 → 1.3.8-1
  • ros-humble-ament-cmake-export-include-directories: 1.3.7-1 → 1.3.8-1
  • ros-humble-ament-cmake-export-interfaces: 1.3.7-1 → 1.3.8-1
  • ros-humble-ament-cmake-export-libraries: 1.3.7-1 → 1.3.8-1
  • ros-humble-ament-cmake-export-link-flags: 1.3.7-1 → 1.3.8-1
  • ros-humble-ament-cmake-export-targets: 1.3.7-1 → 1.3.8-1
  • ros-humble-ament-cmake-gen-version-h: 1.3.7-1 → 1.3.8-1
  • ros-humble-ament-cmake-gmock: 1.3.7-1 → 1.3.8-1
  • ros-humble-ament-cmake-google-benchmark: 1.3.7-1 → 1.3.8-1
  • ros-humble-ament-cmake-gtest: 1.3.7-1 → 1.3.8-1
  • ros-humble-ament-cmake-include-directories: 1.3.7-1 → 1.3.8-1
  • ros-humble-ament-cmake-libraries: 1.3.7-1 → 1.3.8-1
  • ros-humble-ament-cmake-nose: 1.3.7-1 → 1.3.8-1
  • ros-humble-ament-cmake-pytest: 1.3.7-1 → 1.3.8-1
  • ros-humble-ament-cmake-python: 1.3.7-1 → 1.3.8-1
  • ros-humble-ament-cmake-target-dependencies: 1.3.7-1 → 1.3.8-1
  • ros-humble-ament-cmake-test: 1.3.7-1 → 1.3.8-1
  • ros-humble-ament-cmake-vendor-package: 1.3.7-1 → 1.3.8-1
  • ros-humble-ament-cmake-version: 1.3.7-1 → 1.3.8-1
  • ros-humble-as2-alphanumeric-viewer: 1.0.6-1 → 1.0.8-1
  • ros-humble-as2-alphanumeric-viewer-dbgsym: 1.0.6-1 → 1.0.8-1
  • ros-humble-as2-behavior: 1.0.6-1 → 1.0.8-1
  • ros-humble-as2-behavior-dbgsym: 1.0.6-1 → 1.0.8-1
  • ros-humble-as2-behavior-tree: 1.0.6-1 → 1.0.8-1
  • ros-humble-as2-behavior-tree-dbgsym: 1.0.6-1 → 1.0.8-1
  • ros-humble-as2-behaviors-motion: 1.0.6-1 → 1.0.8-1
  • ros-humble-as2-behaviors-motion-dbgsym: 1.0.6-1 → 1.0.8-1
  • ros-humble-as2-behaviors-perception: 1.0.6-1 → 1.0.8-1
  • ros-humble-as2-behaviors-perception-dbgsym: 1.0.6-1 → 1.0.8-1
  • ros-humble-as2-behaviors-platform: 1.0.6-1 → 1.0.8-1
  • ros-humble-as2-behaviors-platform-dbgsym: 1.0.6-1 → 1.0.8-1
  • ros-humble-as2-behaviors-trajectory-generation: 1.0.6-1 → 1.0.8-1
  • ros-humble-as2-behaviors-trajectory-generation-dbgsym: 1.0.6-1 → 1.0.8-1
  • ros-humble-as2-cli: 1.0.6-1 → 1.0.8-1
  • ros-humble-as2-core: 1.0.6-1 → 1.0.8-1
  • ros-humble-as2-keyboard-teleoperation: 1.0.6-1 → 1.0.8-1
  • ros-humble-as2-motion-controller: 1.0.6-1 → 1.0.8-1
  • ros-humble-as2-motion-controller-dbgsym: 1.0.6-1 → 1.0.8-1
  • ros-humble-as2-motion-reference-handlers: 1.0.6-1 → 1.0.8-1
  • ros-humble-as2-msgs: 1.0.6-1 → 1.0.8-1
  • ros-humble-as2-msgs-dbgsym: 1.0.6-1 → 1.0.8-1
  • ros-humble-as2-platform-crazyflie: 1.0.6-1 → 1.0.8-1
  • ros-humble-as2-platform-crazyflie-dbgsym: 1.0.6-1 → 1.0.8-1
  • ros-humble-as2-platform-tello: 1.0.6-1 → 1.0.8-1
  • ros-humble-as2-platform-tello-dbgsym: 1.0.6-1 → 1.0.8-1
  • ros-humble-as2-python-api: 1.0.6-1 → 1.0.8-1
  • ros-humble-as2-realsense-interface: 1.0.6-1 → 1.0.8-1
  • ros-humble-as2-realsense-interface-dbgsym: 1.0.6-1 → 1.0.8-1
  • ros-humble-as2-state-estimator: 1.0.6-1 → 1.0.8-1
  • ros-humble-as2-state-estimator-dbgsym: 1.0.6-1 → 1.0.8-1
  • ros-humble-as2-usb-camera-interface: 1.0.6-1 → 1.0.8-1
  • ros-humble-as2-usb-camera-interface-dbgsym: 1.0.6-1 → 1.0.8-1
  • ros-humble-camera-calibration: 3.0.0-1 → 3.0.3-1
  • ros-humble-controller-interface: 2.37.0-1 → 2.39.1-1
  • ros-humble-controller-interface-dbgsym: 2.37.0-1 → 2.39.1-1
  • ros-humble-controller-manager: 2.37.0-1 → 2.39.1-1
  • ros-humble-controller-manager-dbgsym: 2.37.0-1 → 2.39.1-1
  • ros-humble-controller-manager-msgs: 2.37.0-1 → 2.39.1-1
  • ros-humble-controller-manager-msgs-dbgsym: 2.37.0-1 → 2.39.1-1
  • ros-humble-costmap-queue: 1.1.12-1 → 1.1.13-1
  • ros-humble-depth-image-proc: 3.0.0-1 → 3.0.3-1
  • ros-humble-depth-image-proc-dbgsym: 3.0.0-1 → 3.0.3-1
  • ros-humble-depthai-bridge: 2.8.2-1 → 2.9.0-1
  • ros-humble-depthai-bridge-dbgsym: 2.8.2-1 → 2.9.0-1
  • ros-humble-depthai-descriptions: 2.8.2-1 → 2.9.0-1
  • ros-humble-depthai-examples: 2.8.2-1 → 2.9.0-1
  • ros-humble-depthai-examples-dbgsym: 2.8.2-1 → 2.9.0-1
  • ros-humble-depthai-filters: 2.8.2-1 → 2.9.0-1
  • ros-humble-depthai-filters-dbgsym: 2.8.2-1 → 2.9.0-1
  • ros-humble-depthai-ros: 2.8.2-1 → 2.9.0-1
  • ros-humble-depthai-ros-driver: 2.8.2-1 → 2.9.0-1
  • ros-humble-depthai-ros-driver-dbgsym: 2.8.2-1 → 2.9.0-1
  • ros-humble-depthai-ros-msgs: 2.8.2-1 → 2.9.0-1
  • ros-humble-depthai-ros-msgs-dbgsym: 2.8.2-1 → 2.9.0-1
  • ros-humble-dwb-core: 1.1.12-1 → 1.1.13-1
  • ros-humble-dwb-core-dbgsym: 1.1.12-1 → 1.1.13-1
  • ros-humble-dwb-critics: 1.1.12-1 → 1.1.13-1
  • ros-humble-dwb-critics-dbgsym: 1.1.12-1 → 1.1.13-1
  • ros-humble-dwb-msgs: 1.1.12-1 → 1.1.13-1
  • ros-humble-dwb-msgs-dbgsym: 1.1.12-1 → 1.1.13-1
  • ros-humble-dwb-plugins: 1.1.12-1 → 1.1.13-1
  • ros-humble-dwb-plugins-dbgsym: 1.1.12-1 → 1.1.13-1
  • ros-humble-event-camera-codecs: 1.1.2-1 → 1.1.3-1
  • ros-humble-event-camera-codecs-dbgsym: 1.1.2-1 → 1.1.3-1
  • ros-humble-event-camera-py: 1.1.3-1 → 1.1.4-1
  • ros-humble-event-camera-renderer: 1.1.2-1 → 1.1.3-1
  • ros-humble-event-camera-renderer-dbgsym: 1.1.2-1 → 1.1.3-1
  • ros-humble-examples-tf2-py: 0.25.5-1 → 0.25.6-1
  • ros-humble-geometry2: 0.25.5-1 → 0.25.6-1
  • ros-humble-hardware-interface: 2.37.0-1 → 2.39.1-1
  • ros-humble-hardware-interface-dbgsym: 2.37.0-1 → 2.39.1-1
  • ros-humble-image-pipeline: 3.0.0-1 → 3.0.3-1
  • ros-humble-image-proc: 3.0.0-1 → 3.0.3-1
  • ros-humble-image-proc-dbgsym: 3.0.0-1 → 3.0.3-1
  • ros-humble-image-publisher: 3.0.0-1 → 3.0.3-1
  • ros-humble-image-publisher-dbgsym: 3.0.0-1 → 3.0.3-1
  • ros-humble-image-rotate: 3.0.0-1 → 3.0.3-1
  • ros-humble-image-rotate-dbgsym: 3.0.0-1 → 3.0.3-1
  • ros-humble-image-view: 3.0.0-1 → 3.0.3-1
  • ros-humble-image-view-dbgsym: 3.0.0-1 → 3.0.3-1
  • ros-humble-joint-limits: 2.37.0-1 → 2.39.1-1
  • ros-humble-joint-limits-dbgsym: 2.37.0-1 → 2.39.1-1
  • ros-humble-launch: 1.0.4-1 → 1.0.5-1
  • ros-humble-launch-pytest: 1.0.4-1 → 1.0.5-1
  • ros-humble-launch-testing: 1.0.4-1 → 1.0.5-1
  • ros-humble-launch-testing-ament-cmake: 1.0.4-1 → 1.0.5-1
  • ros-humble-launch-xml: 1.0.4-1 → 1.0.5-1
  • ros-humble-launch-yaml: 1.0.4-1 → 1.0.5-1
  • ros-humble-leo: 1.2.0-1 → 1.2.1-1
  • ros-humble-leo-description: 1.2.0-1 → 1.2.1-1
  • ros-humble-leo-msgs: 1.2.0-1 → 1.2.1-1
  • ros-humble-leo-msgs-dbgsym: 1.2.0-1 → 1.2.1-1
  • ros-humble-leo-teleop: 1.2.0-1 → 1.2.1-1
  • ros-humble-microstrain-inertial-driver: 3.2.0-2 → 3.2.1-1
  • ros-humble-microstrain-inertial-driver-dbgsym: 3.2.0-2 → 3.2.1-1
  • ros-humble-microstrain-inertial-examples: 3.2.0-2 → 3.2.1-1
  • ros-humble-microstrain-inertial-examples-dbgsym: 3.2.0-2 → 3.2.1-1
  • ros-humble-microstrain-inertial-msgs: 3.2.0-2 → 3.2.1-1
  • ros-humble-microstrain-inertial-msgs-dbgsym: 3.2.0-2 → 3.2.1-1
  • ros-humble-microstrain-inertial-rqt: 3.2.0-2 → 3.2.1-1
  • ros-humble-mocap4r2-control: 0.0.6-1 → 0.0.7-1
  • ros-humble-mocap4r2-control-dbgsym: 0.0.6-1 → 0.0.7-1
  • ros-humble-mocap4r2-control-msgs: 0.0.6-1 → 0.0.7-1
  • ros-humble-mocap4r2-control-msgs-dbgsym: 0.0.6-1 → 0.0.7-1
  • ros-humble-mocap4r2-marker-publisher: 0.0.6-1 → 0.0.7-1
  • ros-humble-mocap4r2-marker-publisher-dbgsym: 0.0.6-1 → 0.0.7-1
  • ros-humble-mocap4r2-robot-gt: 0.0.6-1 → 0.0.7-1
  • ros-humble-mocap4r2-robot-gt-dbgsym: 0.0.6-1 → 0.0.7-1
  • ros-humble-mocap4r2-robot-gt-msgs: 0.0.6-1 → 0.0.7-1
  • ros-humble-mocap4r2-robot-gt-msgs-dbgsym: 0.0.6-1 → 0.0.7-1
  • ros-humble-mp2p-icp: 1.0.0-1 → 1.2.0-1
  • ros-humble-mp2p-icp-dbgsym: 1.0.0-1 → 1.2.0-1
  • ros-humble-mrpt2: 2.11.6-1 → 2.11.9-1
  • ros-humble-mrpt2-dbgsym: 2.11.6-1 → 2.11.9-1
  • ros-humble-nav-2d-msgs: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav-2d-msgs-dbgsym: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav-2d-utils: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav-2d-utils-dbgsym: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-amcl: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-amcl-dbgsym: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-behavior-tree: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-behavior-tree-dbgsym: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-behaviors: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-behaviors-dbgsym: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-bringup: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-bt-navigator: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-bt-navigator-dbgsym: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-collision-monitor: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-collision-monitor-dbgsym: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-common: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-constrained-smoother: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-constrained-smoother-dbgsym: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-controller: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-controller-dbgsym: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-core: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-costmap-2d: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-costmap-2d-dbgsym: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-dwb-controller: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-lifecycle-manager: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-lifecycle-manager-dbgsym: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-map-server: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-map-server-dbgsym: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-mppi-controller: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-mppi-controller-dbgsym: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-msgs: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-msgs-dbgsym: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-navfn-planner: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-navfn-planner-dbgsym: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-planner: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-planner-dbgsym: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-regulated-pure-pursuit-controller: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-regulated-pure-pursuit-controller-dbgsym: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-rotation-shim-controller: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-rotation-shim-controller-dbgsym: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-rviz-plugins: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-rviz-plugins-dbgsym: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-simple-commander: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-smac-planner: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-smac-planner-dbgsym: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-smoother: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-smoother-dbgsym: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-theta-star-planner: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-theta-star-planner-dbgsym: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-util: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-util-dbgsym: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-velocity-smoother: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-velocity-smoother-dbgsym: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-voxel-grid: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-voxel-grid-dbgsym: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-waypoint-follower: 1.1.12-1 → 1.1.13-1
  • ros-humble-nav2-waypoint-follower-dbgsym: 1.1.12-1 → 1.1.13-1
  • ros-humble-navigation2: 1.1.12-1 → 1.1.13-1
  • ros-humble-pcl-conversions: 2.4.0-4 → 2.4.0-6
  • ros-humble-pcl-ros: 2.4.0-4 → 2.4.0-6
  • ros-humble-perception-pcl: 2.4.0-4 → 2.4.0-6
  • ros-humble-plotjuggler: 3.8.8-3 → 3.9.0-1
  • ros-humble-plotjuggler-dbgsym: 3.8.8-3 → 3.9.0-1
  • ros-humble-plotjuggler-ros: 2.0.0-3 → 2.1.0-1
  • ros-humble-plotjuggler-ros-dbgsym: 2.0.0-3 → 2.1.0-1
  • ros-humble-rclpy: 3.3.11-1 → 3.3.12-1
  • ros-humble-rcpputils: 2.4.1-1 → 2.4.2-1
  • ros-humble-rcpputils-dbgsym: 2.4.1-1 → 2.4.2-1
  • ros-humble-rcutils: 5.1.4-1 → 5.1.5-1
  • ros-humble-rcutils-dbgsym: 5.1.4-1 → 5.1.5-1
  • ros-humble-robotraconteur: 1.0.0-1 → 1.0.0-2
  • ros-humble-robotraconteur-dbgsym: 1.0.0-1 → 1.0.0-2
  • ros-humble-ros2-control: 2.37.0-1 → 2.39.1-1
  • ros-humble-ros2-control-test-assets: 2.37.0-1 → 2.39.1-1
  • ros-humble-ros2action: 0.18.8-1 → 0.18.9-1
  • ros-humble-ros2cli: 0.18.8-1 → 0.18.9-1
  • ros-humble-ros2cli-test-interfaces: 0.18.8-1 → 0.18.9-1
  • ros-humble-ros2cli-test-interfaces-dbgsym: 0.18.8-1 → 0.18.9-1
  • ros-humble-ros2component: 0.18.8-1 → 0.18.9-1
  • ros-humble-ros2controlcli: 2.37.0-1 → 2.39.1-1
  • ros-humble-ros2doctor: 0.18.8-1 → 0.18.9-1
  • ros-humble-ros2interface: 0.18.8-1 → 0.18.9-1
  • ros-humble-ros2lifecycle: 0.18.8-1 → 0.18.9-1
  • ros-humble-ros2lifecycle-test-fixtures: 0.18.8-1 → 0.18.9-1
  • ros-humble-ros2lifecycle-test-fixtures-dbgsym: 0.18.8-1 → 0.18.9-1
  • ros-humble-ros2multicast: 0.18.8-1 → 0.18.9-1
  • ros-humble-ros2node: 0.18.8-1 → 0.18.9-1
  • ros-humble-ros2param: 0.18.8-1 → 0.18.9-1
  • ros-humble-ros2pkg: 0.18.8-1 → 0.18.9-1
  • ros-humble-ros2run: 0.18.8-1 → 0.18.9-1
  • ros-humble-ros2service: 0.18.8-1 → 0.18.9-1
  • ros-humble-ros2topic: 0.18.8-1 → 0.18.9-1
  • ros-humble-rqt: 1.1.6-2 → 1.1.7-1
  • ros-humble-rqt-console: 2.0.2-3 → 2.0.3-1
  • ros-humble-rqt-controller-manager: 2.37.0-1 → 2.39.1-1
  • ros-humble-rqt-gui: 1.1.6-2 → 1.1.7-1
  • ros-humble-rqt-gui-cpp: 1.1.6-2 → 1.1.7-1
  • ros-humble-rqt-gui-cpp-dbgsym: 1.1.6-2 → 1.1.7-1
  • ros-humble-rqt-gui-py: 1.1.6-2 → 1.1.7-1
  • ros-humble-rqt-mocap4r2-control: 0.0.6-1 → 0.0.7-1
  • ros-humble-rqt-mocap4r2-control-dbgsym: 0.0.6-1 → 0.0.7-1
  • ros-humble-rqt-py-common: 1.1.6-2 → 1.1.7-1
  • ros-humble-rviz-assimp-vendor: 11.2.10-1 → 11.2.11-1
  • ros-humble-rviz-common: 11.2.10-1 → 11.2.11-1
  • ros-humble-rviz-common-dbgsym: 11.2.10-1 → 11.2.11-1
  • ros-humble-rviz-default-plugins: 11.2.10-1 → 11.2.11-1
  • ros-humble-rviz-default-plugins-dbgsym: 11.2.10-1 → 11.2.11-1
  • ros-humble-rviz-ogre-vendor: 11.2.10-1 → 11.2.11-1
  • ros-humble-rviz-ogre-vendor-dbgsym: 11.2.10-1 → 11.2.11-1
  • ros-humble-rviz-rendering: 11.2.10-1 → 11.2.11-1
  • ros-humble-rviz-rendering-dbgsym: 11.2.10-1 → 11.2.11-1
  • ros-humble-rviz-rendering-tests: 11.2.10-1 → 11.2.11-1
  • ros-humble-rviz-visual-testing-framework: 11.2.10-1 → 11.2.11-1
  • ros-humble-rviz2: 11.2.10-1 → 11.2.11-1
  • ros-humble-rviz2-dbgsym: 11.2.10-1 → 11.2.11-1
  • ros-humble-sick-scan-xd: 3.1.11-1 → 3.1.11-3
  • ros-humble-sick-scan-xd-dbgsym: 3.1.11-1 → 3.1.11-3
  • ros-humble-stereo-image-proc: 3.0.0-1 → 3.0.3-1
  • ros-humble-stereo-image-proc-dbgsym: 3.0.0-1 → 3.0.3-1
  • ros-humble-tf2: 0.25.5-1 → 0.25.6-1
  • ros-humble-tf2-bullet: 0.25.5-1 → 0.25.6-1
  • ros-humble-tf2-dbgsym: 0.25.5-1 → 0.25.6-1
  • ros-humble-tf2-eigen: 0.25.5-1 → 0.25.6-1
  • ros-humble-tf2-eigen-kdl: 0.25.5-1 → 0.25.6-1
  • ros-humble-tf2-eigen-kdl-dbgsym: 0.25.5-1 → 0.25.6-1
  • ros-humble-tf2-geometry-msgs: 0.25.5-1 → 0.25.6-1
  • ros-humble-tf2-kdl: 0.25.5-1 → 0.25.6-1
  • ros-humble-tf2-msgs: 0.25.5-1 → 0.25.6-1
  • ros-humble-tf2-msgs-dbgsym: 0.25.5-1 → 0.25.6-1
  • ros-humble-tf2-py: 0.25.5-1 → 0.25.6-1
  • ros-humble-tf2-py-dbgsym: 0.25.5-1 → 0.25.6-1
  • ros-humble-tf2-ros: 0.25.5-1 → 0.25.6-1
  • ros-humble-tf2-ros-dbgsym: 0.25.5-1 → 0.25.6-1
  • ros-humble-tf2-ros-py: 0.25.5-1 → 0.25.6-1
  • ros-humble-tf2-sensor-msgs: 0.25.5-1 → 0.25.6-1
  • ros-humble-tf2-tools: 0.25.5-1 → 0.25.6-1
  • ros-humble-tracetools-image-pipeline: 3.0.0-1 → 3.0.3-1
  • ros-humble-tracetools-image-pipeline-dbgsym: 3.0.0-1 → 3.0.3-1
  • ros-humble-transmission-interface: 2.37.0-1 → 2.39.1-1
  • ros-humble-transmission-interface-dbgsym: 2.37.0-1 → 2.39.1-1
  • ros-humble-vrpn-mocap: 1.0.4-1 → 1.1.0-1
  • ros-humble-vrpn-mocap-dbgsym: 1.0.4-1 → 1.1.0-1
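Each update entry above follows a simple `package: old-version → new-version` pattern. For anyone scripting against sync announcements like this one, a minimal parser can split the entries back into structured fields. This is an illustrative sketch only — the `parse_entry` helper and the sample `entries` list are not part of any ROS tooling, though the entry strings are copied verbatim from the list above.

```python
# Illustrative parser for "package: old-version → new-version" changelog
# entries, as they appear in the sync announcement above.
entries = [
    "ros-humble-ament-cmake: 1.3.7-1 → 1.3.8-1",
    "ros-humble-rclpy: 3.3.11-1 → 3.3.12-1",
    "ros-humble-tf2: 0.25.5-1 → 0.25.6-1",
]

def parse_entry(entry: str) -> tuple[str, str, str]:
    """Split one changelog line into (package, old_version, new_version)."""
    name, versions = entry.split(": ", 1)
    old, new = (v.strip() for v in versions.split("→"))
    return name, old, new

for pkg, old, new in map(parse_entry, entries):
    print(f"{pkg}: {old} -> {new}")
```

A script like this makes it easy to, for example, diff a sync against the packages actually installed on a robot before deciding to upgrade.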

Removed Packages [4]:

  • ros-humble-as2-ign-gazebo-assets
  • ros-humble-as2-ign-gazebo-assets-dbgsym
  • ros-humble-as2-platform-ign-gazebo
  • ros-humble-as2-platform-ign-gazebo-dbgsym

Thanks to all ROS maintainers who make packages available to the ROS community. The above list of packages was made possible by the work of the following maintainers:

  • Adam Serafin
  • Aditya Pande
  • Alexey Merzlyakov
  • Alvin Sun
  • Bence Magyar
  • Bernd Pfrommer
  • Bianca Bendris
  • Brian Wilcox
  • CVAR-UPM
  • Carl Delsey
  • Carlos Orduno
  • Chris Lalancette
  • David V. Lu!!
  • Davide Faconti
  • Dirk Thomas
  • Dorian Scholz
  • Fictionlab
  • Francisco Martín
  • Francisco Martín Rico
  • Jacob Perron
  • John Wason
  • Jose-Luis Blanco-Claraco
  • Matej Vargovcik
  • Michael Jeronimo
  • Mohammad Haghighipanah
  • Paul Bovbel
  • Rob Fisher
  • Sachin Guruswamy
  • Shane Loretz
  • Steve Macenski
  • Support Team
  • Séverin Lemaignan
  • Tatsuro Sakaguchi
  • Vincent Rabaud
  • Víctor Mayoral-Vilches
  • Wolfgang Hönig
  • fmrico
  • rostest
  • sachin
  • steve
  • ymski

5 posts - 4 participants


Read full topic

by audrow on February 23, 2024 05:00 AM

GTC March 18-21 highlights for ROS & AI robotics

NVIDIA GTC is happening live on March 18–21, with registration open for the event in San Jose, CA.


Multiple inspiring robotics sessions follow the kickoff: CEO Jensen Huang's must-see keynote at the SAP Center, which will share the latest breakthroughs affecting every industry.


Some highlighted robotics sessions, hands-on labs, and developer sessions:


Hands-on training Labs

  • DLIT61534 Elevate Your Robotics Game: Unleash High Performance with Isaac ROS & Isaac Sim
  • DLIT61899 Simulating Custom Robots: A Hands-On Lab Using Isaac Sim and ROS 2
  • DLIT61523 Unlocking Local LLM Inference with Jetson AGX Orin: A Hands-On Lab
  • DLIT61797 Training an Autonomous Mobile Race Car with Open USD and Isaac Sim

Jetson and Robotics Developer Day

  • SE62934 Introduction to AI-Based Robot Development With Isaac ROS
  • SE62675 Meet Jetson: The Platform for Edge AI and Robotics
  • SE62933 Overview of Jetson Software and Developer Tools

Robotics focused sessions

  • S63374 (Disney Research) Breathing Life into Disney’s Robotic Characters with Deep Reinforcement Learning
  • S62602 (Boston Dynamics) Come See an Unlocked Ecosystem in the Robotics World
  • S62315 (The AI Institute) Robotics and the Role of AI: Past, Present, and Future
  • S61182 (Google DeepMind) Robotics in the Age of Generative AI
  • S63034 Panel Discussion on the Impact of Generative AI on Robotics

This is a great opportunity to connect, learn, and share with industry luminaries, robotics companies, NVIDIA experts, and peers face-to-face.


Thanks.


1 post - 1 participant


Read full topic

by ggrigor on February 23, 2024 04:43 AM

Powered by the awesome: Planet
+ + + diff --git a/opml.xml b/opml.xml new file mode 100644 index 00000000..39d79e9f --- /dev/null +++ b/opml.xml @@ -0,0 +1,37 @@ + + + + Planet ROS + Fri, 15 Mar 2024 16:08:10 GMT + Open Robotics + info@openrobotics.org + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/planet.css b/planet.css new file mode 100644 index 00000000..e9800734 --- /dev/null +++ b/planet.css @@ -0,0 +1,132 @@ +body { + padding-left: 20px; + padding-right: 20px; + margin-top: 0; + font-family: "Arial",sans-serif; +} + +.top_block { + border-radius: 0 0 15px 15px; + -moz-border-radius: 0 0 15px 15px; + border-width: medium; + border-color: #2E3E60; + border-style:solid; + border-top: none; + margin-bottom:20px; + width:500px; + padding-bottom:15px; + padding-top:5px +} + +#participants { + text-align:left; +} + +#add_your_blog { + text-align:left; +} + +.ROS_planet_text { + font-size: 40pt; + font-family: "Interstate",sans-serif; + color:#2E3E60; + font-weight: bold; + font-stretch:semi-condensed; +} + +#top_info { + text-align: left; + border-radius: 15px; + -moz-border-radius: 15px; + border-width: medium; + border-color: #2E3E60; + border-style:solid; +} + +.entry { + font-size: 11pt; + margin-bottom: 2em; + padding-top: 20px; + padding-left: 20px; + padding-right: 20px; + padding-bottom: 10px; + text-align: left; + border-radius: 15px; + -moz-border-radius: 15px; + border-width: medium; + border-color: #2E3E60; + border-style:solid; + width: 800px; +} + +.entry .content { + padding-left: 20px; + padding-right: 20px; +} + +.entry .by_and_date { + color: grey; + text-align: right; +} + +.entry .by_and_date a { + text-decoration: none; + color: inherit; +} + +.entry_title { + font-weight: none; + font-size: 25pt; + padding-bottom: 20pt; + float:left; +} + +.channel_name { + color: grey; + font-weight: none; + float:right; +} + +.date { + font-size: 20pt; + font-weight: none; + color:#2E3E60; + padding-bottom: 15px; +} + +.entry a { + text-decoration: none; + color: 
#2E3E60; +} + +a:hover { + text-decoration: underline !important; +} + +.top_button { + color: grey; + text-decoration: none; + text-align: center; +} + +.top_button a:active, a:focus, input[type="image"] { +outline: 0; +} + +div.top_button { + padding-left: 30px; + padding-right: 30px; + text-align: center; +} + +.top_button img { + height: 30px; + width: 30px; + text-decoration: none; + text-align: center; +} + +.top_button .icon { + width: 30px; + height: 30px; +} diff --git a/rss10.xml b/rss10.xml new file mode 100644 index 00000000..755b9eb3 --- /dev/null +++ b/rss10.xml @@ -0,0 +1,2550 @@ + + + + Planet ROS + http://planet.ros.org + Planet ROS - http://planet.ros.org + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + ROS Discourse General: ROS News for the Week of March 11th, 2024 + https://discourse.ros.org/t/ros-news-for-the-week-of-march-11th-2024/36651 + <h1><a class="anchor" href="https://discourse.ros.org#ros-news-for-the-week-of-march-11th-2024-1" name="ros-news-for-the-week-of-march-11th-2024-1"></a>ROS News for the Week of March 11th, 2024</h1> +<br /> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/e/5/e501ec44fb4eefddc8e5d1f1334345b72b815ed1.png" title="ROSCon_2024_transparent"><img alt="ROSCon_2024_transparent" height="500" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/e/5/e501ec44fb4eefddc8e5d1f1334345b72b815ed1_2_545x500.png" width="545" /></a></div><br /> +<a href="https://discourse.ros.org/t/roscon-2024-call-for-proposals-now-open/36624">The ROSCon 2024 call for talks and workshops is now open!</a> We want your amazing talks! 
Also, the ROSCon Diversity Scholarship deadline is coming up!<p></p> +<br /> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/5/f/5fc4b38399ab864e35409e6f7d0b7b66b833a633.jpeg" title="ROSBTBMarch24 (2)"><img alt="ROSBTBMarch24 (2)" height="388" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/5/f/5fc4b38399ab864e35409e6f7d0b7b66b833a633_2_690x388.jpeg" width="690" /></a></div><br /> +<a href="https://www.meetup.com/ros-by-the-bay/events/299684887/">ROS By-The-Bay is next week.</a>. Open Robotic’s CEO <a class="mention" href="https://discourse.ros.org/u/vanessa_yamzon_orsi">@Vanessa_Yamzon_Orsi</a> is dropping by to take your questions about the future of Open Robotics, and I recommend you swing by if you can. Just a heads up, we have to move to a different room on the other side of the complex; details are on <a href="http://Meetup.com">Meetup.com</a>.<p></p> +<br /> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/d/3/d3f66abe058d28275dba6cf53c8bdc7afe3b1ccc.jpeg" title="MEETUP SAN ANTONIO (2)"><img alt="MEETUP SAN ANTONIO (2)" height="388" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/d/3/d3f66abe058d28275dba6cf53c8bdc7afe3b1ccc_2_690x388.jpeg" width="690" /></a></div><br /> +<a href="https://www.eventbrite.com/e/ros-meetup-san-antonio-tickets-858425041407?aff=oddtdtcreator">We’re planning a ROS Meetup in San Antonio on March 26th in conjunction with the ROS Industrial Consortium meeting.</a> If you are in the area, or have colleagues in the region, please help us spread the word.<p></p> +<br /> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/4/7/47238db6ac84cd3ced4eb5168ae8a7f829403d7e.jpeg" title="March24GCM"><img alt="March24GCM" height="388" 
src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/4/7/47238db6ac84cd3ced4eb5168ae8a7f829403d7e_2_690x388.jpeg" width="690" /></a></div><br /> +<a href="https://community.gazebosim.org/t/community-meeting-from-lidar-slam-to-full-scale-autonomy-and-beyond/2622">We’ve line up a phenomenal guest for our next Gazebo Community Meeting; Ji Zhang from Carnegie Mellon will be speaking about his work on his work integrating ROS, Gazebo, and a variety of LIDAR-based SLAM techniques. </a><p></p> +<h1><a class="anchor" href="https://discourse.ros.org#events-2" name="events-2"></a>Events</h1> +<ul> +<li><a href="https://online-learning.tudelft.nl/courses/hello-real-world-with-ros-robot-operating-systems/">ONGOING: TU Delft ROS MOOC (FREE!)</a></li> +<li><a href="https://www.meetup.com/ros-by-the-bay/events/299684887/">2024-03-21 ROS By The Bay</a></li> +<li><a href="https://partiful.com/e/0Z53uQxTyXNDOochGBgV">2024-03-23 Robo Hackathon @ Circuit Launch (Bay Area)</a></li> +<li><a href="https://discourse.ros.org/t/cracow-robotics-ai-club-8/36634">2024-03-25 Robotics &amp; AI Meetup Krakow</a></li> +<li>NEW: <a href="https://www.eventbrite.com/e/ros-meetup-san-antonio-tickets-858425041407?aff=oddtdtcreator">2024-03-26 ROS Meetup San Antonio, Texas</a></li> +<li><a href="https://community.gazebosim.org/t/community-meeting-from-lidar-slam-to-full-scale-autonomy-and-beyond/2622">2024-03-27 Gazebo Community Meeting: CMU LIDAR SLAM Expert</a></li> +<li><a href="https://rosindustrial.org/events/2024/ros-industrial-consortium-americas-2024-annual-meeting">2024-03-27 ROS Industrial Annual Meeting</a></li> +<li><a href="https://www.eventbrite.com/e/robotics-and-tech-happy-hour-tickets-835278218637?aff=soc">2024-03-27 Robotics Happy Hour Pittsburgh</a></li> +<li><a href="https://eurecat.org/serveis/eurecatevents/rosmeetupbcn/">2024-04-03 ROS Meetup Barcelona</a></li> +<li><a 
href="https://www.eventbrite.com/e/robot-block-party-2024-registration-855602328597?aff=oddtdtcreator">2024-04-06 Silicon Valley Robotics Robot Block Party</a></li> +<li><a href="https://www.zephyrproject.org/zephyr-developer-summit-2024/?utm_campaign=Zephyr%20Developer%20Summit&amp;utm_content=285280528&amp;utm_medium=social&amp;utm_source=linkedin&amp;hss_channel=lcp-27161269">2024-04-16 → 2024-04-18 Zephyr Developer Summit</a></li> +<li><a href="https://www.eventbrite.com/e/robots-on-ice-40-2024-tickets-815030266467">2024-04-21 Robots on Ice</a></li> +<li><a href="https://2024.oshwa.org/">2024-05-03 Open Hardware Summit Montreal</a></li> +<li><a href="https://www.tum-venture-labs.de/education/bots-bento-the-first-robotic-pallet-handler-competition-icra-2024/">2024-05-13 Bots &amp; Bento @ ICRA 2024</a></li> +<li><a href="https://cpsweek2024-race.f1tenth.org/">2024-05-14 → 2024-05-16 F1Tenth Autonomous Grand Prix @ CPS-IoT Week</a></li> +<li><a href="https://discourse.ros.org/t/acm-sigsoft-summer-school-on-software-engineering-for-robotics-in-brussels/35992">2024-06-04 → 2024-06-08 ACM SIGSOFT Summer School on Software Engineering for Robotics</a></li> +<li><a href="https://www.agriculture-vision.com/">2024-06-?? Workshop on Agriculture Vision at CVPR 2024</a></li> +<li><a href="https://icra2024.rt-net.jp/archives/87">2024-06-16 Food Topping Challenge at ICRA 2024</a></li> +<li><a href="https://discourse.ros.org/t/roscon-france-2024/35222">2024-06-19 =&gt; 2024-06-20 ROSCon France</a></li> +<li><a href="https://sites.udel.edu/ceoe-able/able-summer-bootcamp/">2024-08-05 Autonomous Systems Bootcam at Univ. 
Deleware</a>– <a href="https://www.youtube.com/watch?v=4ETDsBN2o8M">Video</a></li> +<li><a href="https://roscon.jp/2024/">2024-09-25 ROSConJP</a></li> +<li><a href="https://fira-usa.com/">2024-10-22 → 2024-10-24 AgRobot FIRA in Sacramento</a></li> +</ul> +<h1><a class="anchor" href="https://discourse.ros.org#news-3" name="news-3"></a>News</h1> +<ul> +<li><a href="https://discourse.ros.org/t/foxglove-2-0-integrated-ui-new-pricing-and-open-source-changes/36583">Foxglove 2.0: Integrated UI, New Price, Less Open Source</a> – <a href="https://www.therobotreport.com/foxglove-launches-upgraded-platform-with-enhanced-observability/">Robot Report</a></li> +<li><a href="https://www.bearrobotics.ai/blog/bear-robotics-secures-60m-series-c-funding-led-by-lg-electronics">LG Leads $60M Series C for Bear Robotics</a> – <a href="https://techcrunch.com/2024/03/12/bear-robotics-a-robot-waiter-startup-just-picked-up-60m-from-lg/">TechCrunch</a> – <a href="https://www.therobotreport.com/lg-makes-strategic-investment-in-bear-robotics/">Robot Report</a></li> +<li><a href="https://dronecode.org/the-2023-year-in-review/">Dronecode Annual Report</a></li> +<li><a href="https://www.ieee-ras.org/educational-resources-outreach/technical-education-programs"><img alt=":cool:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/cool.png?v=12" title=":cool:" width="20" /> Want to run a ROS Summer School? Get $25k from IEEE-RAS! 
</a></li> +<li><a href="https://www.youtube.com/watch?v=EZm_kWPMq0Q">YOKOHAMA GUNDAM FACTORY!</a></li> +<li><a href="https://hackaday.com/2024/03/09/rosie-the-robot-runs-for-real/">Actual Rosie Robot</a></li> +<li><a href="https://techcrunch.com/2024/03/14/humanoid-robots-face-continued-skepticism-at-modex/">Modex Skeptical of Humanoids</a> – <a href="https://techcrunch.com/2024/03/11/the-loneliness-of-the-robotic-humanoid/">See also: Digit only Humanoid at Modex</a></li> +<li><a href="https://techcrunch.com/2024/03/13/behold-truckbot/">Behold Truckbot</a></li> +<li><a href="https://techcrunch.com/2024/03/13/cyphers-inventory-drone-launches-from-an-autonomous-mobile-robot-base/">AMR + Drone for Inventory at Modex</a></li> +<li><a href="https://techcrunch.com/2024/03/12/locus-robotics-success-is-a-tale-of-focusing-on-what-works/">Locus Robotics’ success is a tale of focusing on what works</a></li> +<li><a href="https://www.therobotreport.com/afara-launches-autonomous-picker-to-clean-up-after-cotton-harvest/">Afara launches autonomous picker to clean up after cotton harvest</a></li> +<li><a href="https://spectrum.ieee.org/covariant-foundation-model">Covariant Announces a Universal AI Platform for Robots</a></li> +<li><a href="https://dex-cap.github.io/">DexCap: Scalable and Portable Mocap Data Collection System for Dexterous Manipulation – open hardware</a></li> +<li><a href="https://techcrunch.com/2024/03/15/these-61-robotics-companies-are-hiring/">Who’s Hiring Robotics</a></li> +</ul> +<h1><a class="anchor" href="https://discourse.ros.org#ros-4" name="ros-4"></a>ROS</h1> +<ul> +<li><a href="https://discourse.ros.org/t/fresh-edition-of-the-ros-mooc-from-tudelft/36633"><img alt=":cool:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/cool.png?v=12" title=":cool:" width="20" /> <img alt=":new:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/new.png?v=12" title=":new:" width="20" /> TU-Delft ROS MOOC</a></li> +<li><a 
href="https://discourse.ros.org/t/new-packages-for-ros-2-rolling-ridley-2024-03-13/36626">Rolling Ridley now Runs on 24.04 – 1416 Updated Packages <img alt=":tada:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/tada.png?v=12" title=":tada:" width="20" /></a></li> +<li><a href="https://discourse.ros.org/t/new-packages-for-noetic-2024-03-07/36514">10 New and 46 Updated Packages for Noetic</a></li> +<li><a href="https://discourse.ros.org/t/new-packages-for-iron-irwini-2024-03-11/36560">1 New and 82 Updated Packages for Iron Irwini</a></li> +<li><a href="https://www.baslerweb.com/en/software/pylon/camera-driver-ros/"><img alt=":cool:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/cool.png?v=12" title=":cool:" width="20" /> Pylon Basler Camera Driver for ROS 2</a> – <a href="https://github.com/basler/pylon-ros-camera">source</a> – <a href="https://www2.baslerweb.com/en/downloads/document-downloads/interfacing-basler-cameras-with-ros-2/">docs</a></li> +<li><a href="https://discourse.ros.org/t/march-2024-meetings-aerial-robotics/36495">Aerial Robotics Meetings for March</a></li> +<li><a href="https://vimeo.com/923208013?share=copy">Interop SIG: Standardizing Infrastructure Video</a></li> +<li><a href="https://discourse.ros.org/t/teleop-keyboard-node-in-rust/36555">Keyboard Teleop in Rust <img alt=":crab:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/crab.png?v=12" title=":crab:" width="20" /></a></li> +<li><a href="https://discourse.ros.org/t/ros-2-and-large-data-transfer-on-lossy-networks/36598">ROS 2 and Large Data Transfer on Lossy Networks</a></li> +<li><a href="https://discourse.ros.org/t/cloud-robotics-wg-strategy-next-meeting-announcement/36604">Cloud Robotics WG Next Meeting</a></li> +<li><a href="https://discourse.ros.org/t/announcing-open-sourcing-of-ros-2-task-manager/36572">ROS 2 Task Manager</a></li> +<li><a
href="https://github.com/jsk-ros-pkg/jsk_3rdparty/tree/master/switchbot_ros">SwitchBot ROS Package</a></li> +<li><a href="https://www.behaviortree.dev/docs/category/tutorials-advanced/">New Advanced Behavior Tree Tutorials</a></li> +<li><a href="https://github.com/ToyotaResearchInstitute/gauges2">TRI ROS 2 Gauges Package</a></li> +<li><a href="https://haraduka.github.io/continuous-state-recognition/">Continuous Object State Recognition for Cooking Robots</a></li> +<li><a href="https://www.youtube.com/watch?v=lTew9mbXrAs">ROS Python PyCharm Setup Guide</a></li> +<li><a href="https://github.com/MJavadZallaghi/ros2webots">ROS 2 Webots Starter Code</a></li> +<li><a href="https://github.com/uos/ros2_tutorial">Osnabrück University KBS Robotics Tutorial</a></li> +<li><a href="https://github.com/ika-rwth-aachen/etsi_its_messages">ROS Package for ETSI ITS Messages for V2X Comms</a></li> +<li><a href="https://github.com/suchetanrs/ORB-SLAM3-ROS2-Docker">ORB-SLAM3 ROS 2 Docker Container</a></li> +<li><a href="https://www.youtube.com/watch?v=TWTDPilQ8q0&amp;t=8s">Factory Control System from Scratch in Rust <img alt=":crab:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/crab.png?v=12" title=":crab:" width="20" /></a></li> +<li><a href="https://www.youtube.com/watch?v=sAkrG_WBqyc">ROS + QT-Creator (Arabic)</a></li> +<li><a href="https://www.allegrohand.com/">Dexterous Hand that Runs ROS</a></li> +<li><a href="https://discourse.ros.org/t/cobot-magic-agilex-achieved-the-whole-process-of-mobile-aloha-model-training-in-both-the-simulation-and-real-environment/36644">Cobot Magic: AgileX achieved the whole process of Mobile Aloha model training in both the simulation and real environment</a></li> +</ul> +<h1><a class="anchor" href="https://discourse.ros.org#ros-questions-5" name="ros-questions-5"></a>ROS Questions</h1> +<p>Got a minute?
<a href="https://robotics.stackexchange.com/">Please take some time to answer questions on Robotics Stack Exchange!</a></p> + <p><small>1 post - 1 participant</small></p> + <p><a href="https://discourse.ros.org/t/ros-news-for-the-week-of-march-11th-2024/36651">Read full topic</a></p> + 2024-03-15T15:33:56+00:00 + Katherine_Scott + + + ROS Discourse General: ROSCon 2024 Call for Proposals Now Open + https://discourse.ros.org/t/roscon-2024-call-for-proposals-now-open/36624 + <h1><a class="anchor" href="https://discourse.ros.org#roscon-2024-call-for-proposals-1" name="roscon-2024-call-for-proposals-1"></a>ROSCon 2024 Call for Proposals</h1> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/e/5/e501ec44fb4eefddc8e5d1f1334345b72b815ed1.png" title="ROSCon_2024_transparent"><img alt="ROSCon_2024_transparent" height="500" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/e/5/e501ec44fb4eefddc8e5d1f1334345b72b815ed1_2_545x500.png" width="545" /></a></div><p></p> +<p>Hi Everyone,</p> +<p>The ROSCon call for proposals is now open! You can find full proposal details on the <a href="http://roscon.ros.org/#call-for-proposals">ROSCon website</a>. ROSCon Workshop proposals are due by <span class="discourse-local-date">2024-05-08T06:59:00Z UTC</span> and can be submitted using this <a href="https://docs.google.com/forms/d/e/1FAIpQLSeciW0G6_bvlH_AL7mJrERBiajnUqnq1yO3z1rgzeb-O2hZxw/viewform?usp=header_link">Google Form</a>. ROSCon talks are due by <span class="discourse-local-date">2024-06-04T06:59:00Z UTC</span> and you can submit your proposals using <a href="https://roscon2024.hotcrp.com/">Hot CRP</a>. Please note that you’ll need a HotCRP account to submit your talk proposal. 
We plan to post the accepted workshops on or around <span class="discourse-local-date">2024-07-08T07:00:00Z UTC</span> and the accepted talks on or around <span class="discourse-local-date">2024-07-15T07:00:00Z UTC</span>. If you think you will need financial assistance to attend ROSCon, and you meet the qualifications, please apply for our <a href="https://docs.google.com/forms/d/e/1FAIpQLSfJYMAT8wXjFp6FjMMTva_bYoKhZtgRy7P9540e6MX94PgzPg/viewform?fbzx=-7920629384650366975">Diversity Scholarship Program</a> as soon as possible. Diversity Scholarship applications are due on <span class="discourse-local-date">2024-04-06T06:59:00Z UTC</span>, well before the CFP deadlines and the final speaker announcements. Questions and concerns about the ROSCon CFP can be directed to the ROSCon executive committee (<a href="mailto:roscon-2024-ec@openrobotics.org">roscon-2024-ec@openrobotics.org</a>) or posted in this thread.</p> +<p>We recommend you start planning your talk early and take the time to workshop your submission with your friends and colleagues. You are more than welcome to use this Discourse thread and the <a href="https://discord.com/channels/1077825543698927656/1208998489154129920">#roscon-2024 channel on the ROS Discord</a> to workshop ideas and organize collaborators.</p> +<p>Finally, I want to take a moment to recognize this year’s ROSCon Program Co-Chairs <a class="mention" href="https://discourse.ros.org/u/ingo_lutkebohle">@Ingo_Lutkebohle</a> and <a class="mention" href="https://discourse.ros.org/u/yadunund">@Yadunund</a>, along with a very long list of talk reviewers who are still being finalized. Reviewing talk proposals is a fairly tedious task, and ROSCon wouldn’t happen without the efforts of our volunteers.
If you happen to run into any of them at ROSCon please thank them for their service to the community.</p> +<h2><a class="anchor" href="https://discourse.ros.org#talk-and-workshop-ideas-for-roscon-2024-2" name="talk-and-workshop-ideas-for-roscon-2024-2"></a>Talk and Workshop Ideas for ROSCon 2024</h2> +<p>If you’ve never been to ROSCon, but would like to submit a talk or workshop proposal, we recommend you take a look at the <a href="https://roscon.ros.org/2024/#archive">archive of previous ROSCon talks</a>. Another good resource to consider is the set of frequently discussed topics on ROS Discourse and Robotics Stack Exchange. <a href="https://discourse.ros.org/t/2023-ros-metrics-report/35837">In last year’s metrics report</a> I included a list of frequently asked topic tags from Robotics Stack Exchange that might be helpful. Aside from code, we really want to see your robots! We want to see your race cars, mining robots, moon landers, maritime robots, development boards, and factories and hear about lessons you learned from making them happen. If you organize a working group, run a local meetup, or maintain a larger package we want to hear about your big wins in the past year.</p> +<p>While we can suggest a few ideas for talks and workshops that we would like to see at ROSCon 2024, what we really want is to hear from the community about topic areas that you think are important. <em><strong>If there is a talk you would like to see at ROSCon 2024 consider proposing that topic in the comments below.</strong></em> Feel free to write a whole list! Some of our most memorable talks have been ten-minute overviews of key ROS subsystems that everyone uses.
If you think a half-hour talk about writing a custom ROS 2 executor and benchmarking its performance would be helpful, please say so!</p> + <p><small>1 post - 1 participant</small></p> + <p><a href="https://discourse.ros.org/t/roscon-2024-call-for-proposals-now-open/36624">Read full topic</a></p> + 2024-03-15T15:19:51+00:00 + Katherine_Scott + + + ROS Discourse General: Cobot Magic: AgileX achieved the whole process of Mobile Aloha model training in both the simulation and real environment + https://discourse.ros.org/t/cobot-magic-agilex-achieved-the-whole-process-of-mobile-aloha-model-training-in-both-the-simulation-and-real-environment/36644 + <h1><a class="anchor" href="https://discourse.ros.org#cobot-magic-agilex-achieved-the-whole-process-of-mobile-aloha-model-training-in-both-the-simulation-and-real-environment-1" name="cobot-magic-agilex-achieved-the-whole-process-of-mobile-aloha-model-training-in-both-the-simulation-and-real-environment-1"></a><strong>Cobot Magic: AgileX achieved the whole process of Mobile Aloha model training in both the simulation and real environment</strong></h1> +<p>Mobile Aloha is a whole-body remote operation data collection system developed by Zipeng Fu, Tony Z. Zhao, and Chelsea Finn from Stanford University. <a href="https://mobile-aloha.github.io/" rel="noopener nofollow ugc">link.</a></p> +<p>Based on Mobile Aloha, AgileX developed Cobot Magic, which runs the complete Mobile Aloha codebase with higher-spec configurations at lower cost, and is equipped with higher-payload robotic arms and high-compute industrial computers.
For more details about Cobot Magic please check the <a href="https://global.agilex.ai/" rel="noopener nofollow ugc">AgileX website</a>.</p> +<p>Currently, AgileX has successfully completed the integration of Cobot Magic based on the Mobile Aloha source code project.<br /> +<img alt="inference" class="animated" height="400" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/4/f/4f9834cff531f45ab648f7db0a7142ee080270af.gif" width="424" /></p> +<h1><a class="anchor" href="https://discourse.ros.org#simulation-data-training-2" name="simulation-data-training-2"></a><strong>Simulation data training</strong></h1> +<h1><a class="anchor" href="https://discourse.ros.org#data-collection-3" name="data-collection-3"></a><strong>Data collection</strong></h1> +<p>After setting up the Mobile Aloha software environment (mentioned in the last section), model training in both the simulation environment and the real environment can be carried out. The following is the data collection part of the simulation environment. The data is provided by the team of Zipeng Fu, Tony Z. Zhao, and Chelsea Finn. You can find all scripted/human demos for the simulated environments <a href="https://drive.google.com/drive/folders/1gPR03v05S1xiInoVJn7G7VJ9pDCnxq9O" rel="noopener nofollow ugc">here</a>.</p> +<p>After downloading, copy it to the act-plus-plus/data directory. The directory structure is as follows:</p> +<pre><code class="lang-auto">act-plus-plus/data + ├── sim_insertion_human + │ ├── sim_insertion_human-20240110T054847Z-001.zip + ├── ... + ├── sim_insertion_scripted + │ ├── sim_insertion_scripted-20240110T054854Z-001.zip + ├── ... + ├── sim_transfer_cube_human + │ ├── sim_transfer_cube_human-20240110T054900Z-001.zip + │ ├── ... + └── sim_transfer_cube_scripted + ├── sim_transfer_cube_scripted-20240110T054901Z-001.zip + ├── ... +</code></pre> +<p>Generate episodes and render the result graph.
The terminal displays 10 episodes and 2 successful ones.</p> +<pre><code class="lang-auto"># 1 Run +python3 record_sim_episodes.py --task_name sim_transfer_cube_scripted --dataset_dir &lt;data save dir&gt; --num_episodes 50 + +# 2 Take sim_transfer_cube_scripted as an example +python3 record_sim_episodes.py --task_name sim_transfer_cube_scripted --dataset_dir data/sim_transfer_cube_scripted --num_episodes 10 + +# 2.1 Real-time rendering +python3 record_sim_episodes.py --task_name sim_transfer_cube_scripted --dataset_dir data/sim_transfer_cube_scripted --num_episodes 10 --onscreen_render + +# 2.2 The output in the terminal shows +ube_scripted --num_episodes 10 +episode_idx=0 +Rollout out EE space scripted policy +episode_idx=0 Failed +Replaying joint commands +episode_idx=0 Failed +Saving: 0.9 secs + +episode_idx=1 +Rollout out EE space scripted policy +episode_idx=1 Successful, episode_return=57 +Replaying joint commands +episode_idx=1 Successful, episode_return=59 +Saving: 0.6 secs +... +Saved to data/sim_transfer_cube_scripted +Success: 2 / 10 +</code></pre> +<p>The loaded image renders as follows:<br /> +</p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/b/5/b52f830cdca421a0a4960f61c81219922df8668d.png" rel="noopener nofollow ugc" title="1"><img alt="1" height="500" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/b/5/b52f830cdca421a0a4960f61c81219922df8668d_2_655x500.png" width="655" /></a></div><p></p> +<h1><a class="anchor" href="https://discourse.ros.org#data-visualization-4" name="data-visualization-4"></a>Data Visualization</h1> +<p>Visualize simulation data. 
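</p>
<p>To make the data layout concrete: each episode stores, per timestep, a 14-dimensional joint-position vector (qpos). The snippet below is a hypothetical NumPy sketch (fabricated random data, not the real HDF5 contents) illustrating that layout and the kind of per-joint summary a visualization script produces; the split into two 7-DoF chains is an assumption for illustration, as the actual joint ordering is defined by the recording script.</p>

```python
import numpy as np

# Hypothetical stand-in for one episode's qpos array: T timesteps x 14 joints.
# Real data would be read from the episode_*.hdf5 files produced above.
T = 400
rng = np.random.default_rng(0)
qpos = rng.uniform(-3.14, 3.14, size=(T, 14))

# Assumed split into two 7-DoF chains (arm joints plus gripper on each side).
left_arm, right_arm = qpos[:, :7], qpos[:, 7:]

# Per-joint range: the kind of summary plotted in the qpos figures.
joint_min, joint_max = qpos.min(axis=0), qpos.max(axis=0)
for j in range(14):
    print(f"joint {j:2d}: min={joint_min[j]:+.2f} max={joint_max[j]:+.2f}")
```

<p>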
The following figures show the images of episode 0 and episode 9 respectively.</p> +<p>The episode 0 screen in the data set is as follows, showing a case where the gripper fails to pick up.</p> +<p><img alt="episode0" class="animated" height="230" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/1/f/1f1e94f75c5ff731886fbf069597af5dfe0137cf.gif" width="690" /></p> +<p>The visualization of the data of episode 9 shows a successful case of gripping.</p> +<p><img alt="episode19" class="animated" height="230" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/0/9/09268db2b7338acfb94096bbd25f139a3a932006.gif" width="690" /></p> +<p>Print the data of each joint of the robotic arm in the simulation environment. Joints 0–13 are the 14 degrees of freedom of the robot arms and grippers.</p> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/d/4/d4620062ddee3643956b6bef2cf4aed3728a6aec.png" rel="noopener nofollow ugc" title="episode-qpos"><img alt="episode-qpos" height="500" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/d/4/d4620062ddee3643956b6bef2cf4aed3728a6aec_2_250x500.png" width="250" /></a></div><p></p> +<h1><a class="anchor" href="https://discourse.ros.org#model-training-and-inference-5" name="model-training-and-inference-5"></a><strong>Model training and inference</strong></h1> +<p>The simulated-environment datasets must be downloaded first (see Data Collection).</p> +<pre><code class="lang-auto">python3 imitate_episodes.py --task_name sim_transfer_cube_scripted --ckpt_dir &lt;ckpt dir&gt; --policy_class ACT --kl_weight 10 --chunk_size 100 --hidden_dim 512 --batch_size 8 --dim_feedforward 3200 --num_epochs 2000 --lr 1e-5 --seed 0 + +# run +python3 imitate_episodes.py --task_name sim_transfer_cube_scripted --ckpt_dir trainings --policy_class ACT --kl_weight 1 --chunk_size 10 --hidden_dim 512 --batch_size 1
--dim_feedforward 3200 --lr 1e-5 --seed 0 --num_steps 2000 + +# During training, you will be prompted with the following content. If you do not have a W&amp;B account, choose 3 directly. +wandb: (1) Create a W&amp;B account +wandb: (2) Use an existing W&amp;B account +wandb: (3) Don't visualize my results +wandb: Enter your choice: +</code></pre> +<p>After training is completed, the weights will be saved to the trainings directory. The results are as follows:</p> +<pre><code class="lang-auto">trainings + ├── config.pkl + ├── dataset_stats.pkl + ├── policy_best.ckpt + ├── policy_last.ckpt + └── policy_step_0_seed_0.ckpt +</code></pre> +<p>Evaluate the model trained above:</p> +<pre><code class="lang-auto"># 1 To evaluate the policy, add the --onscreen_render real-time rendering flag +python3 imitate_episodes.py --eval --task_name sim_transfer_cube_scripted --ckpt_dir trainings --policy_class ACT --kl_weight 1 --chunk_size 10 --hidden_dim 512 --batch_size 1 --dim_feedforward 3200 --lr 1e-5 --seed 0 --num_steps 20 --onscreen_render +</code></pre> +<p>The rendered picture is printed as follows.</p> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/2/d/2dfdf7294ff8c2b78a434ee0fe315b8e9f252a49.png" rel="noopener nofollow ugc" title="2"><img alt="2" height="500" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/2/d/2dfdf7294ff8c2b78a434ee0fe315b8e9f252a49_2_661x500.png" width="661" /></a></div><p></p> +<h1><a class="anchor" href="https://discourse.ros.org#data-training-in-real-environment-6" name="data-training-in-real-environment-6"></a><strong>Data Training in real environment</strong></h1> +<h1><a class="anchor" href="https://discourse.ros.org#data-collection-7" name="data-collection-7"></a><strong>Data Collection</strong></h1> +<p>1. Environment dependency</p> +<p>1.1 ROS dependency</p> +<p>● Default: an Ubuntu 20.04 + ROS Noetic environment has already been configured</p> +<pre><code
class="lang-auto">sudo apt install ros-$ROS_DISTRO-sensor-msgs ros-$ROS_DISTRO-nav-msgs ros-$ROS_DISTRO-cv-bridge +</code></pre> +<p>1.2 Python dependency</p> +<pre><code class="lang-auto"># Enter the current workspace directory and install the dependencies listed in the requiredments.txt file. +pip install -r requiredments.txt +</code></pre> +<p>2. Data collection</p> +<p>2.1 Run ‘collect_data’</p> +<pre><code class="lang-auto">python collect_data.py -h # see parameters +python collect_data.py --max_timesteps 500 --episode_idx 0 +python collect_data.py --max_timesteps 500 --is_compress --episode_idx 0 +python collect_data.py --max_timesteps 500 --use_depth_image --episode_idx 1 +python collect_data.py --max_timesteps 500 --is_compress --use_depth_image --episode_idx 1 +</code></pre> +<p>After the data collection is completed, it will be saved in the ${dataset_dir}/{task_name} directory.</p> +<pre><code class="lang-auto">python collect_data.py --max_timesteps 500 --is_compress --episode_idx 0 +# Generates dataset episode_0.hdf5. The structure is: + +collect_data + ├── collect_data.py + ├── data # --dataset_dir + │ └── cobot_magic_agilex # --task_name + │ ├── episode_0.hdf5 # Generated dataset file + ├── episode_idx.hdf5 # idx depends on --episode_idx + └── ...
+ ├── readme.md + ├── replay_data.py + ├── requiredments.txt + └── visualize_episodes.py +</code></pre> +<p>The specific parameters are shown:</p> +<div class="md-table"> +<table> +<thead> +<tr> +<th>Name</th> +<th>Explanation</th> +</tr> +</thead> +<tbody> +<tr> +<td>dataset_dir</td> +<td>Dataset save path</td> +</tr> +<tr> +<td>task_name</td> +<td>Task name, used as the dataset file name</td> +</tr> +<tr> +<td>episode_idx</td> +<td>Action block index number</td> +</tr> +<tr> +<td>max_timesteps</td> +<td>Maximum number of time steps in an action block</td> +</tr> +<tr> +<td>camera_names</td> +<td>Camera names, default [‘cam_high’, ‘cam_left_wrist’, ‘cam_right_wrist’]</td> +</tr> +<tr> +<td>img_front_topic</td> +<td>Camera 1 color image topic</td> +</tr> +<tr> +<td>img_left_topic</td> +<td>Camera 2 color image topic</td> +</tr> +<tr> +<td>img_right_topic</td> +<td>Camera 3 color image topic</td> +</tr> +<tr> +<td>use_depth_image</td> +<td>Whether to use depth information</td> +</tr> +<tr> +<td>depth_front_topic</td> +<td>Camera 1 depth map topic</td> +</tr> +<tr> +<td>depth_left_topic</td> +<td>Camera 2 depth map topic</td> +</tr> +<tr> +<td>depth_right_topic</td> +<td>Camera 3 depth map topic</td> +</tr> +<tr> +<td>master_arm_left_topic</td> +<td>Left master arm topic</td> +</tr> +<tr> +<td>master_arm_right_topic</td> +<td>Right master arm topic</td> +</tr> +<tr> +<td>puppet_arm_left_topic</td> +<td>Left puppet arm topic</td> +</tr> +<tr> +<td>puppet_arm_right_topic</td> +<td>Right puppet arm topic</td> +</tr> +<tr> +<td>use_robot_base</td> +<td>Whether to use mobile base information</td> +</tr> +<tr> +<td>robot_base_topic</td> +<td>Mobile base topic</td> +</tr> +<tr> +<td>frame_rate</td> +<td>Acquisition frame rate.
The camera delivers a stable 30 frames per second, so the default is 30</td> +</tr> +<tr> +<td>is_compress</td> +<td>Whether the image is compressed and saved</td> +</tr> +</tbody> +</table> +</div><p>The picture of data collection from the camera perspective is as follows:</p> +<p><img alt="data collection" class="animated" height="387" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/0/2/02c868b09ce46587de9150e9d6c09c62a5719a9a.gif" width="690" /></p> +<h1><a class="anchor" href="https://discourse.ros.org#data-visualization-8" name="data-visualization-8"></a><strong>Data visualization</strong></h1> +<p>Run the following code:</p> +<pre><code class="lang-auto">python visualize_episodes.py --dataset_dir ./data --task_name cobot_magic_agilex --episode_idx 0 +</code></pre> +<p>Visualize the collected data. <code>--dataset_dir</code>, <code>--task_name</code> and <code>--episode_idx</code> need to be the same as when ‘collecting data’. When you run the above code, the terminal will print the action and display a color image window. The visualization results are as follows:</p> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/7/f/7f33bb7c245190e4b69c5871d0300c3019215a89.jpeg" rel="noopener nofollow ugc" title="733bfc3a250f3d9f0a919d8f447421cb"><img alt="733bfc3a250f3d9f0a919d8f447421cb" height="316" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/7/f/7f33bb7c245190e4b69c5871d0300c3019215a89_2_690x316.jpeg" width="690" /></a></div><p></p> +<p>After the operation is completed, episode${idx}qpos.png, episode${idx}base_action.png and episode${idx}video.mp4 files will be generated under ${dataset_dir}/{task_name}.
The directory structure is as follows:</p> +<pre><code class="lang-auto">collect_data +├── data +│ ├── cobot_magic_agilex +│ │ └── episode_0.hdf5 +│ ├── episode_0_base_action.png # base_action +│ ├── episode_0_qpos.png # qpos +│ └── episode_0_video.mp4 # Color video +</code></pre> +<p>Taking episode 30 as an example, replay its collected data. The camera perspective is as follows:</p> +<p><img alt="data visualization" class="animated" height="172" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/e/a/eafb8cd13e73cd06ffacc771589c7106f080a252.gif" width="690" /></p> +<h1><a class="anchor" href="https://discourse.ros.org#model-training-and-inference-9" name="model-training-and-inference-9"></a>Model Training and Inference</h1> +<p>The Mobile Aloha project studied different strategies for imitation learning and proposed ACT (Action Chunking with Transformers), a Transformer-based action chunking algorithm. It is essentially an end-to-end policy: it maps real-world RGB images directly to actions, allowing the robot to learn and imitate from visual input without additional hand-coded intermediate representations, and it predicts in units of action chunks to integrate accurate and smooth motion trajectories.</p> +<p>The model is as follows:</p> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/a/f/af32cea48cc4e4b04932386d0bc9ec8c32ddce9e.png" rel="noopener nofollow ugc" title="image (1)"><img alt="image (1)" height="174" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/a/f/af32cea48cc4e4b04932386d0bc9ec8c32ddce9e_2_690x174.png" width="690" /></a></div><p></p> +<p>Let us break the model down and interpret it.</p> +<ol> +<li>Sample data</li> +</ol> +<p></p><div class="lightbox-wrapper"><a class="lightbox"
href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/3/1/3123a4970c9d91e665510d39acd191c588f3c216.png" rel="noopener nofollow ugc" title="image (2)"><img alt="image (2)" height="140" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/3/1/3123a4970c9d91e665510d39acd191c588f3c216_2_690x140.png" width="690" /></a></div><p></p> +<p>Input: 4 RGB images, each with a resolution of 480 × 640, plus the joint positions of the two robot arms (7 + 7 = 14 DoF in total)</p> +<p>Output: The action space is the absolute joint positions of the two robots, a 14-dimensional vector. Therefore, with action chunking, the policy outputs a k × 14 tensor given the current observation (each action is defined as a 14-dimensional vector, so k actions form a k × 14 tensor)</p> +<ol start="2"> +<li>Infer Z</li> +</ol> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/2/f/2f72d64dd82d004c926759c64b00b78647d10231.png" rel="noopener nofollow ugc" title="image (3)"><img alt="image (3)" height="215" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/2/f/2f72d64dd82d004c926759c64b00b78647d10231_2_690x215.png" width="690" /></a></div><p></p> +<p>The input to the encoder is a [CLS] token, which consists of randomly initialized learned weights. Through a linear layer (layer 2), the current joint positions are projected from 14 to 512 dimensions, giving the embedded joints. Through another linear layer (layer 1), the k × 14 action sequence is projected to a k × 512 embedded action sequence.</p> +<p>The above three inputs finally form a sequence of (k + 2) × embedding_dimension, that is, (k + 2) × 512, and are processed with the transformer encoder.
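</p>
<p>As a rough NumPy sketch of the shapes described above (illustrative only, not the actual Mobile Aloha implementation; the latent size of 32 is an assumed value, and random matrices stand in for the learned linear layers): the [CLS] embedding, the projected joints, and the projected action sequence stack into a (k + 2) × 512 sequence, and the first encoder output is mapped to the mean and log-variance of Z, from which a sample is drawn with the reparameterization trick z = mu + sigma * eps.</p>

```python
import numpy as np

rng = np.random.default_rng(0)
k, d = 100, 512  # chunk size and embedding dimension

# Embedded inputs; random matrices stand in for the learned linear layers 1 and 2.
cls_token = rng.normal(size=(1, d))                            # [CLS] weights
joints = rng.normal(size=14) @ rng.normal(size=(14, d))        # 14 -> 512
actions = rng.normal(size=(k, 14)) @ rng.normal(size=(14, d))  # k x 14 -> k x 512

# Encoder input sequence: (k + 2) x 512.
encoder_in = np.concatenate([cls_token, joints[None, :], actions], axis=0)
assert encoder_in.shape == (k + 2, d)

# Pretend the transformer encoder ran; take the first ([CLS]) output and
# predict the diagonal Gaussian parameters of Z with two linear heads.
cls_out = encoder_in[0]
z_dim = 32  # assumed latent size for illustration
w_mu = rng.normal(size=(d, z_dim)) / np.sqrt(d)
w_logvar = rng.normal(size=(d, z_dim)) / np.sqrt(d)
mu, logvar = cls_out @ w_mu, cls_out @ w_logvar

# Reparameterization: z = mu + sigma * eps keeps sampling differentiable.
z = mu + np.exp(0.5 * logvar) * rng.normal(size=z_dim)
print(z.shape)  # (32,)
```

<p>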
Finally, just take the first output, which corresponds to the [CLS] token, and use another linear network to predict the mean and variance of the Z distribution, parameterizing it as a diagonal Gaussian distribution. Use reparameterization to obtain samples of Z.</p> +<ol start="3"> +<li>Predict an action sequence</li> +</ol> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/4/1/41d86ea2457b78aa9f7c8d3172130611cc9441e5.jpeg" rel="noopener nofollow ugc" title="image (4)"><img alt="image (4)" height="267" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/4/1/41d86ea2457b78aa9f7c8d3172130611cc9441e5_2_690x267.jpeg" width="690" /></a></div><p></p> +<p>① First, each image observation is processed by a ResNet18 to obtain a 15 × 20 × 728 feature map, which is then flattened into a 300 × 728 feature sequence. These features are projected to the embedding dimension (300 × 512) by a linear layer (layer 5), and a 2D sinusoidal position embedding is added to preserve spatial information.</p> +<p>② Secondly, this operation is repeated for all 4 images, and the resulting feature sequence dimension is 1200 × 512.</p> +<p>③ Next, the feature sequences from each camera are concatenated and used as one of the inputs of the transformer encoder.
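</p>
<p>The token bookkeeping in steps ① and ② can be sanity-checked with a small, hypothetical NumPy sketch (illustrative shapes only, using the dimensions quoted above; random data stands in for real features):</p>

```python
import numpy as np

d = 512        # embedding dimension
h, w = 15, 20  # ResNet18 feature-map height and width
n_cams = 4

rng = np.random.default_rng(0)

# Each camera yields h * w = 300 feature tokens after flattening and
# projection to the embedding dimension; 4 cameras give 1200 tokens.
cam_tokens = [rng.normal(size=(h * w, d)) for _ in range(n_cams)]
image_seq = np.concatenate(cam_tokens, axis=0)
assert image_seq.shape == (1200, d)

# Appending one token each for the projected joint positions and the
# style variable z yields the full transformer encoder input.
joint_token = rng.normal(size=(1, d))
z_token = rng.normal(size=(1, d))
encoder_in = np.concatenate([image_seq, joint_token, z_token], axis=0)
print(encoder_in.shape)  # (1202, 512)
```

<p>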
For the other two inputs, the current joint positions and the “style variable” z are passed through linear layers (layer 6 and layer 7 respectively) and projected to 512 dimensions from their original dimensions (14 and 15).</p> +<p>④ Finally, the encoder input of the transformer is 1202×512 (the feature dimension of the 4 images is 1200×512, the feature dimension of the joint positions is 1×512, and the feature dimension of the style variable z is 1×512).</p> +<p>The input to the transformer decoder has two aspects:</p> +<p>On the one hand, the “query” of the transformer decoder is the first layer of fixed sinusoidal position embeddings, that is, the position embeddings (fixed) shown in the lower right corner of the above figure, whose dimension is k × 512.</p> +<p>On the other hand, the “keys” and “values” in the cross-attention layer of the transformer decoder come from the output of the above-mentioned transformer encoder.</p> +<p>Thereby, the transformer decoder predicts the action sequence given the encoder output.</p> +<p>By collecting data and training the above model, you can observe that the results converge.</p> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/f/c/fcd703b5a444096e904cbd048218f306c61f7964.png" rel="noopener nofollow ugc" title="image-20240314233128053"><img alt="image-20240314233128053" height="500" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/f/c/fcd703b5a444096e904cbd048218f306c61f7964_2_672x500.png" width="672" /></a></div><p></p> +<p>A third view of the model inference results is as follows.
The robotic arm can infer the movement of placing colored blocks from point A to point B.</p> +<p><img alt="inference" class="animated" height="400" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/4/f/4f9834cff531f45ab648f7db0a7142ee080270af.gif" width="424" /></p> +<h3><a class="anchor" href="https://discourse.ros.org#summary-10" name="summary-10"></a><strong>Summary</strong></h3> +<p>Cobot Magic is a remote whole-body data collection device developed by AgileX Robotics based on the Mobile Aloha project from Stanford University. With Cobot Magic, AgileX Robotics has successfully run the Stanford laboratory’s open-source Mobile Aloha code in both simulation and real environments.<br /> +AgileX will continue to collect data from various motion tasks based on Cobot Magic for model training and inference. Please stay tuned for updates on <a href="https://github.com/agilexrobotics?tab=repositories" rel="noopener nofollow ugc">GitHub</a>. If you are interested in the Mobile Aloha project, join us on <a class="inline-onebox" href="https://join.slack.com/t/mobilealohaproject/shared_invite/zt-2evdxspac-h9QXyigdcrR1TcYsUqTMOw" rel="noopener nofollow ugc">Slack</a> and share your ideas.</p> +<h3><a class="anchor" href="https://discourse.ros.org#about-agilex-11" name="about-agilex-11"></a><strong>About AgileX</strong></h3> +<p>Established in 2016, AgileX Robotics is a leading manufacturer of mobile robot platforms and a provider of unmanned system solutions. The company specializes in independently developed multi-mode wheeled and tracked wire-controlled chassis technology and has obtained multiple international certifications. AgileX Robotics offers users self-developed innovative application solutions such as autonomous driving, mobile grasping, and navigation positioning, helping users in various industries achieve automation.
Additionally, AgileX Robotics has introduced research and education software and hardware products related to machine learning, embodied intelligence, and visual algorithms. The company works closely with research and educational institutions to promote robotics technology teaching and innovation.</p> + <p><small>1 post - 1 participant</small></p> + <p><a href="https://discourse.ros.org/t/cobot-magic-agilex-achieved-the-whole-process-of-mobile-aloha-model-training-in-both-the-simulation-and-real-environment/36644">Read full topic</a></p> + 2024-03-15T03:07:59+00:00 + Agilex_Robotics + + + ROS Discourse General: Cloud Robotics WG Strategy & Next Meeting Announcement + https://discourse.ros.org/t/cloud-robotics-wg-strategy-next-meeting-announcement/36604 + <p>Hi folks!</p> +<p>I wanted to tell you the results of the Cloud Robotics Working Group meeting from 2024-03-11. We met to discuss the long-term strategy of the group. You can see the full meeting recording on <a href="https://vimeo.com/922530909?share=copy" rel="noopener nofollow ugc">vimeo</a>, with our meeting minutes <a href="https://docs.google.com/document/d/10yT-0DKkrw1gDKGlWKl_c--2yM1b-UOP5rWW73bJuMw" rel="noopener nofollow ugc">here</a> (thanks to Phil Roan for taking minutes at this meeting!).</p> +<p>During the meeting, we went over some definitions of Cloud Robotics, our tenets going forward, and a phased approach of gathering data, analyzing it, and acting on it. We used slides to frame the discussion; they have since been updated based on that discussion and will form the backbone of our discussions going forward. The slide deck is publicly available <a href="https://docs.google.com/presentation/d/1PPBYw7EZNTE8YnGF8CSYQ4DyErXX2sRI" rel="noopener nofollow ugc">here</a>.</p> +<p>Next meeting will be about how to start collecting the data for the first phase. We will hold it <span class="discourse-local-date">2024-03-25T17:00:00Z UTC</span>→<span class="discourse-local-date">2024-03-25T18:00:00Z UTC</span>. 
If you’d like to join the group, you are welcome to, and you can sign up for our meeting invites at <a href="https://groups.google.com/g/cloud-robotics-working-group-invites" rel="noopener nofollow ugc">this Google Group</a>.</p> +<p>Finally, we will regularly invite members and guests to give talks in our meetings. If you have a topic you’d like to talk about, or would like to invite someone to talk, please use this <a href="https://docs.google.com/spreadsheets/d/1drBcG-CXmX8YxBZuRK8Lr3eTTfqe2p_RF_HlDw4Rj5g/" rel="noopener nofollow ugc">speaker signup sheet</a> to let us know.</p> +<p>Hopefully I’ll see you all in future meetings!</p> + <p><small>6 posts - 4 participants</small></p> + <p><a href="https://discourse.ros.org/t/cloud-robotics-wg-strategy-next-meeting-announcement/36604">Read full topic</a></p> + 2024-03-12T17:33:16+00:00 + mikelikesrobots + + + ROS Discourse General: Foxglove 2.0 - integrated UI, new pricing, and open source changes + https://discourse.ros.org/t/foxglove-2-0-integrated-ui-new-pricing-and-open-source-changes/36583 + <p>Hi everyone - excited to announce Foxglove 2.0, with a new integrated UI (merging Foxglove Studio and Data Platform), new pricing plans, and open source changes.</p> +<p><img alt=":handshake:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/handshake.png?v=12" title=":handshake:" width="20" /> Streamlined UI for smoother robotics observability<br /> +<img alt=":satellite:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/satellite.png?v=12" title=":satellite:" width="20" /> Automatic data offload through Foxglove Agent<br /> +<img alt=":credit_card:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/credit_card.png?v=12" title=":credit_card:" width="20" /> Updated pricing plans to make Foxglove accessible for teams of all sizes<br /> +<img alt=":mag_right:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/mag_right.png?v=12" 
title=":mag_right:" width="20" /> Changes to our open-source strategy (we’re discontinuing the open source edition of Foxglove Studio)</p> +<p><a href="https://foxglove.dev/blog/foxglove-2-0-unifying-robotics-observability" rel="noopener nofollow ugc">Read the details in our blog post</a>.</p> +<p>Note that Foxglove is still free for academic teams and researchers! If you fall into that category, please <a href="https://foxglove.dev/contact" rel="noopener nofollow ugc">contact us</a> and we can upgrade your account.</p> + <p><small>15 posts - 10 participants</small></p> + <p><a href="https://discourse.ros.org/t/foxglove-2-0-integrated-ui-new-pricing-and-open-source-changes/36583">Read full topic</a></p> + 2024-03-11T19:28:55+00:00 + amacneil + + + ROS Discourse General: Announcing open sourcing of ROS 2 Task Manager! + https://discourse.ros.org/t/announcing-open-sourcing-of-ros-2-task-manager/36572 + <p><img alt=":tada:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/tada.png?v=12" title=":tada:" width="20" /> My team and I are happy to announce that we at Karelics have open sourced our ROS 2 Task Manager package. 
This solution allows you to convert your existing ROS actions and services into tasks, offering useful features such as automatic task conflict resolution, the ability to aggregate multiple tasks into larger Missions, and straightforward tracking for active tasks and their results.</p> +<p>Check out the package and examples of its usage with the Nav2 package:<br /> +<img alt=":link:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/link.png?v=12" title=":link:" width="20" /> <a href="https://github.com/Karelics/task_manager" rel="noopener nofollow ugc">https://github.com/Karelics/task_manager</a></p> +<p>For an introduction and deeper insights into our design decisions, see our blog post available at: <a href="https://karelics.fi/task-manager-ros-2-package/" rel="noopener nofollow ugc">https://karelics.fi/task-manager-ros-2-package/</a><br /> +<br /></p> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/6/e/6ec33466cb152ca88bc1d2c9e1a60415db944598.png" rel="noopener nofollow ugc" title="task_manager_overview"><img alt="task_manager_overview" height="464" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/6/e/6ec33466cb152ca88bc1d2c9e1a60415db944598_2_690x464.png" width="690" /></a></div><br /> +<br /><br /> +We firmly believe that this package will prove valuable to the ROS community and accelerate the development of robot systems. 
We are excited to hear your thoughts and feedback on it!<p></p> + <p><small>1 post - 1 participant</small></p> + <p><a href="https://discourse.ros.org/t/announcing-open-sourcing-of-ros-2-task-manager/36572">Read full topic</a></p> + 2024-03-11T12:52:42+00:00 + jak + + + ROS Discourse General: New Packages for Iron Irwini 2024-03-11 + https://discourse.ros.org/t/new-packages-for-iron-irwini-2024-03-11/36560 + <p>We’re happy to announce <strong>1</strong> new package and <strong>82</strong> updates are now available in ROS 2 Iron Irwini <img alt=":iron:" class="emoji emoji-custom" height="20" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/b/3/b3c1340fc185f5e47c7ec55ef5bb1771802de993.png?v=12" title=":iron:" width="20" /> <img alt=":irwini:" class="emoji emoji-custom" height="20" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/d/2/d2f3dcbdaff6f33258719fe5b8f692594a9feab0.png?v=12" title=":irwini:" width="20" /> . This sync was tagged as <a href="https://github.com/ros/rosdistro/blob/iron/2024-03-11/iron/distribution.yaml" rel="noopener nofollow ugc"><code>iron/2024-03-11</code> </a>.</p> +<h2><a class="anchor" href="https://discourse.ros.org#package-updates-for-iron-1" name="package-updates-for-iron-1"></a>Package Updates for iron</h2> +<h3><a class="anchor" href="https://discourse.ros.org#added-packages-1-2" name="added-packages-1-2"></a>Added Packages [1]:</h3> +<ul> +<li>ros-iron-apriltag-detector-dbgsym: 1.2.1-1</li> +</ul> +<h3><a class="anchor" href="https://discourse.ros.org#updated-packages-82-3" name="updated-packages-82-3"></a>Updated Packages [82]:</h3> +<ul> +<li>ros-iron-apriltag-detector: 1.2.0-1 → 1.2.1-1</li> +<li>ros-iron-controller-interface: 3.23.0-1 → 3.24.0-1</li> +<li>ros-iron-controller-interface-dbgsym: 3.23.0-1 → 3.24.0-1</li> +<li>ros-iron-controller-manager: 3.23.0-1 → 3.24.0-1</li> +<li>ros-iron-controller-manager-dbgsym: 3.23.0-1 → 3.24.0-1</li> +<li><a 
href="http://ros.org/wiki/controller_manager_msgs">ros-iron-controller-manager-msgs</a>: 3.23.0-1 → 3.24.0-1</li> +<li>ros-iron-controller-manager-msgs-dbgsym: 3.23.0-1 → 3.24.0-1</li> +<li>ros-iron-etsi-its-cam-coding: 2.0.0-1 → 2.0.1-1</li> +<li>ros-iron-etsi-its-cam-coding-dbgsym: 2.0.0-1 → 2.0.1-1</li> +<li>ros-iron-etsi-its-cam-conversion: 2.0.0-1 → 2.0.1-1</li> +<li>ros-iron-etsi-its-cam-msgs: 2.0.0-1 → 2.0.1-1</li> +<li>ros-iron-etsi-its-cam-msgs-dbgsym: 2.0.0-1 → 2.0.1-1</li> +<li>ros-iron-etsi-its-coding: 2.0.0-1 → 2.0.1-1</li> +<li>ros-iron-etsi-its-conversion: 2.0.0-1 → 2.0.1-1</li> +<li>ros-iron-etsi-its-conversion-dbgsym: 2.0.0-1 → 2.0.1-1</li> +<li>ros-iron-etsi-its-denm-coding: 2.0.0-1 → 2.0.1-1</li> +<li>ros-iron-etsi-its-denm-coding-dbgsym: 2.0.0-1 → 2.0.1-1</li> +<li>ros-iron-etsi-its-denm-conversion: 2.0.0-1 → 2.0.1-1</li> +<li>ros-iron-etsi-its-denm-msgs: 2.0.0-1 → 2.0.1-1</li> +<li>ros-iron-etsi-its-denm-msgs-dbgsym: 2.0.0-1 → 2.0.1-1</li> +<li>ros-iron-etsi-its-messages: 2.0.0-1 → 2.0.1-1</li> +<li>ros-iron-etsi-its-msgs: 2.0.0-1 → 2.0.1-1</li> +<li>ros-iron-etsi-its-msgs-utils: 2.0.0-1 → 2.0.1-1</li> +<li>ros-iron-etsi-its-primitives-conversion: 2.0.0-1 → 2.0.1-1</li> +<li>ros-iron-etsi-its-rviz-plugins: 2.0.0-1 → 2.0.1-1</li> +<li>ros-iron-etsi-its-rviz-plugins-dbgsym: 2.0.0-1 → 2.0.1-1</li> +<li>ros-iron-flir-camera-description: 2.0.8-1 → 2.0.8-2</li> +<li>ros-iron-flir-camera-msgs: 2.0.8-1 → 2.0.8-2</li> +<li>ros-iron-flir-camera-msgs-dbgsym: 2.0.8-1 → 2.0.8-2</li> +<li>ros-iron-hardware-interface: 3.23.0-1 → 3.24.0-1</li> +<li>ros-iron-hardware-interface-dbgsym: 3.23.0-1 → 3.24.0-1</li> +<li>ros-iron-hardware-interface-testing: 3.23.0-1 → 3.24.0-1</li> +<li>ros-iron-hardware-interface-testing-dbgsym: 3.23.0-1 → 3.24.0-1</li> +<li><a href="https://github.com/ros-controls/ros2_control/wiki" rel="noopener nofollow ugc">ros-iron-joint-limits</a>: 3.23.0-1 → 3.24.0-1</li> +<li><a href="http://wiki.ros.org/mavros">ros-iron-libmavconn</a>: 
2.6.0-1 → 2.7.0-1</li> +<li>ros-iron-libmavconn-dbgsym: 2.6.0-1 → 2.7.0-1</li> +<li><a href="https://mavlink.io/en/" rel="noopener nofollow ugc">ros-iron-mavlink</a>: 2023.9.9-1 → 2024.3.3-1</li> +<li><a href="http://wiki.ros.org/mavros">ros-iron-mavros</a>: 2.6.0-1 → 2.7.0-1</li> +<li>ros-iron-mavros-dbgsym: 2.6.0-1 → 2.7.0-1</li> +<li><a href="http://wiki.ros.org/mavros_extras">ros-iron-mavros-extras</a>: 2.6.0-1 → 2.7.0-1</li> +<li>ros-iron-mavros-extras-dbgsym: 2.6.0-1 → 2.7.0-1</li> +<li><a href="http://wiki.ros.org/mavros_msgs">ros-iron-mavros-msgs</a>: 2.6.0-1 → 2.7.0-1</li> +<li>ros-iron-mavros-msgs-dbgsym: 2.6.0-1 → 2.7.0-1</li> +<li><a href="https://www.mrpt.org/" rel="noopener nofollow ugc">ros-iron-mrpt2</a>: 2.11.9-1 → 2.11.11-1</li> +<li>ros-iron-mrpt2-dbgsym: 2.11.9-1 → 2.11.11-1</li> +<li><a href="https://wiki.ros.org/mvsim">ros-iron-mvsim</a>: 0.8.3-1 → 0.9.1-1</li> +<li>ros-iron-mvsim-dbgsym: 0.8.3-1 → 0.9.1-1</li> +<li><a href="https://github.com/LORD-MicroStrain/ntrip_client" rel="noopener nofollow ugc">ros-iron-ntrip-client</a>: 1.2.0-3 → 1.3.0-1</li> +<li>ros-iron-ros2-control: 3.23.0-1 → 3.24.0-1</li> +<li>ros-iron-ros2-control-test-assets: 3.23.0-1 → 3.24.0-1</li> +<li>ros-iron-ros2controlcli: 3.23.0-1 → 3.24.0-1</li> +<li><a href="http://ros.org/wiki/rqt_controller_manager">ros-iron-rqt-controller-manager</a>: 3.23.0-1 → 3.24.0-1</li> +<li><a href="http://introlab.github.io/rtabmap" rel="noopener nofollow ugc">ros-iron-rtabmap</a>: 0.21.3-1 → 0.21.4-1</li> +<li>ros-iron-rtabmap-conversions: 0.21.3-1 → 0.21.4-2</li> +<li>ros-iron-rtabmap-conversions-dbgsym: 0.21.3-1 → 0.21.4-2</li> +<li>ros-iron-rtabmap-dbgsym: 0.21.3-1 → 0.21.4-1</li> +<li>ros-iron-rtabmap-demos: 0.21.3-1 → 0.21.4-2</li> +<li>ros-iron-rtabmap-examples: 0.21.3-1 → 0.21.4-2</li> +<li>ros-iron-rtabmap-launch: 0.21.3-1 → 0.21.4-2</li> +<li>ros-iron-rtabmap-msgs: 0.21.3-1 → 0.21.4-2</li> +<li>ros-iron-rtabmap-msgs-dbgsym: 0.21.3-1 → 0.21.4-2</li> +<li>ros-iron-rtabmap-odom: 
0.21.3-1 → 0.21.4-2</li> +<li>ros-iron-rtabmap-odom-dbgsym: 0.21.3-1 → 0.21.4-2</li> +<li>ros-iron-rtabmap-python: 0.21.3-1 → 0.21.4-2</li> +<li>ros-iron-rtabmap-ros: 0.21.3-1 → 0.21.4-2</li> +<li>ros-iron-rtabmap-rviz-plugins: 0.21.3-1 → 0.21.4-2</li> +<li>ros-iron-rtabmap-rviz-plugins-dbgsym: 0.21.3-1 → 0.21.4-2</li> +<li>ros-iron-rtabmap-slam: 0.21.3-1 → 0.21.4-2</li> +<li>ros-iron-rtabmap-slam-dbgsym: 0.21.3-1 → 0.21.4-2</li> +<li>ros-iron-rtabmap-sync: 0.21.3-1 → 0.21.4-2</li> +<li>ros-iron-rtabmap-sync-dbgsym: 0.21.3-1 → 0.21.4-2</li> +<li>ros-iron-rtabmap-util: 0.21.3-1 → 0.21.4-2</li> +<li>ros-iron-rtabmap-util-dbgsym: 0.21.3-1 → 0.21.4-2</li> +<li>ros-iron-rtabmap-viz: 0.21.3-1 → 0.21.4-2</li> +<li>ros-iron-rtabmap-viz-dbgsym: 0.21.3-1 → 0.21.4-2</li> +<li>ros-iron-simple-launch: 1.9.0-1 → 1.9.1-1</li> +<li>ros-iron-spinnaker-camera-driver: 2.0.8-1 → 2.0.8-2</li> +<li>ros-iron-spinnaker-camera-driver-dbgsym: 2.0.8-1 → 2.0.8-2</li> +<li>ros-iron-transmission-interface: 3.23.0-1 → 3.24.0-1</li> +<li>ros-iron-transmission-interface-dbgsym: 3.23.0-1 → 3.24.0-1</li> +<li><a href="http://wiki.ros.org/ur_client_library">ros-iron-ur-client-library</a>: 1.3.4-1 → 1.3.5-1</li> +<li>ros-iron-ur-client-library-dbgsym: 1.3.4-1 → 1.3.5-1</li> +</ul> +<h3><a class="anchor" href="https://discourse.ros.org#removed-packages-0-4" name="removed-packages-0-4"></a>Removed Packages [0]:</h3> +<p>Thanks to all ROS maintainers who make packages available to the ROS community. 
The above list of packages was made possible by the work of the following maintainers:</p> +<ul> +<li>Bence Magyar</li> +<li>Bernd Pfrommer</li> +<li>Felix Exner</li> +<li>Jean-Pierre Busch</li> +<li>Jose-Luis Blanco-Claraco</li> +<li>Luis Camero</li> +<li>Mathieu Labbe</li> +<li>Olivier Kermorgant</li> +<li>Rob Fisher</li> +<li>Vladimir Ermakov</li> +</ul> + <p><small>1 post - 1 participant</small></p> + <p><a href="https://discourse.ros.org/t/new-packages-for-iron-irwini-2024-03-11/36560">Read full topic</a></p> + 2024-03-11T01:54:48+00:00 + Yadunund + + + ROS Discourse General: ROS News for the Week of March 4th, 2024 + https://discourse.ros.org/t/ros-news-for-the-week-of-march-4th-2024/36532 + <h1><a class="anchor" href="https://discourse.ros.org#ros-news-for-the-week-of-march-4th-2024-1" name="ros-news-for-the-week-of-march-4th-2024-1"></a>ROS News for the Week of March 4th, 2024</h1> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/d/3/d3f66abe058d28275dba6cf53c8bdc7afe3b1ccc.jpeg" title="MEETUP SAN ANTONIO (2)"><img alt="MEETUP SAN ANTONIO (2)" height="291" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/d/3/d3f66abe058d28275dba6cf53c8bdc7afe3b1ccc_2_517x291.jpeg" width="517" /></a></div><br /> +I’ve been working with the ROS Industrial team, and the Port of San Antonio, to put together a <a href="https://www.eventbrite.com/e/ros-meetup-san-antonio-tickets-858425041407?aff=oddtdtcreator">ROS Meetup in San Antonio / Austin</a> in conjunction with the annual <a href="https://rosindustrial.org/events/2024/ros-industrial-consortium-americas-2024-annual-meeting">ROS Industrial Consortium Meeting.</a> If you are attending the ROS-I meeting make sure you sign up!<p></p> +<br /> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/6/0/60e45baa168f3f7246a0f17cdb3985e476b9cd0f.jpeg" 
title="Add a heading (3)"><img alt="Add a heading (3)" height="194" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/6/0/60e45baa168f3f7246a0f17cdb3985e476b9cd0f_2_345x194.jpeg" width="345" /></a></div><br /> +Gazebo Classic goes end of life in 2025! To help the community move over to modern Gazebo we’re holding open <a href="https://community.gazebosim.org/t/gazebo-migration-guide-office-hours/2543">Gazebo office hours</a> next Tuesday, March 12th, at 9am PST. If you have questions about the migration process please come by!<p></p> +<br /> +<p><img alt="e1d28e85278dd4e221030828367839e4950b8cf9_2_671x500" height="250" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/4/b/4b596515548ed682aef78e342b55bab8167c62aa.jpeg" width="335" /><br /> +We often get questions about the “best” robot components for a particular application. I really hate answering these questions; my inner engineer just screams, “IT DEPENDS!” Unfortunately, we really don’t have a lot of apples-to-apples data to compare different hardware vendors.</p> +<p>Thankfully <a class="mention" href="https://discourse.ros.org/u/iliao">@iliao</a> is putting in a ton of work to review ten different low-cost LIDAR sensors. <a href="https://discourse.ros.org/t/fyi-10-low-cost-lidar-lds-interfaced-to-ros2-micro-ros-arduino/36369">Check it out here.</a><br /> +<br /></p> +<p><img alt="teaser3" class="animated" height="108" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/2/c/2c31dbd971221364f6a944324235d66203fb4362.gif" width="405" /><br /> +This week we got a sneak peek at some of the cool CVPR 2024 papers. Check out <a href="https://rmurai.co.uk/projects/GaussianSplattingSLAM/">“Gaussian Splatting SLAM”, by Hidenobu Matsuki, Riku Murai, Paul H.J. Kelly, Andrew J. 
Davison</a>, complete with <a href="https://github.com/muskie82/MonoGS">source code</a>.</p> +<br /> +<p><img alt="1aa39368041ea4a73d78470ab0d7441453258cdf_2_353x500" height="375" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/d/d/ddc4e36f6a5ea13ccf25f28256bd8f6bf3b8247a.jpeg" width="264" /><br /> +<a href="https://roscon.fr/">We got our new ROSCon France graphic this week!</a> ROSCon France is currently accepting papers! Please consider applying if you speak French!</p> +<h1><a class="anchor" href="https://discourse.ros.org#events-2" name="events-2"></a>Events</h1> +<ul> +<li><a href="https://discourse.ros.org/t/ros-2-rust-meeting-march-11th/36523">2024-03-11 ROS 2 <img alt=":crab:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/crab.png?v=12" title=":crab:" width="20" /> Rust Meeting</a></li> +<li><a href="https://twitter.com/HRI_Conference/status/1765426051503595991">2024-03-12 Queer in Robotics Social @ HRI</a></li> +<li><a href="https://community.gazebosim.org/t/gazebo-migration-guide-office-hours/2543">2024-03-12 Gazebo Migration Office Hours</a></li> +<li><a href="https://online-learning.tudelft.nl/courses/hello-real-world-with-ros-robot-operating-systems/">2024-03-14 TU Delft ROS MOOC (FREE!)</a></li> +<li><img alt=":new:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/new.png?v=12" title=":new:" width="20" /> <a href="https://www.meetup.com/ros-by-the-bay/events/299684887/">2024-03-21 ROS By-The-Bay with Dusty Robotics and Project Q&amp;A Session</a></li> +<li><a href="https://partiful.com/e/0Z53uQxTyXNDOochGBgV">2024-03-23 Robo Hackathon @ Circuit Launch (Bay Area)</a></li> +<li><img alt=":new:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/new.png?v=12" title=":new:" width="20" /> <a href="https://www.eventbrite.com/e/ros-meetup-san-antonio-tickets-858425041407?aff=oddtdtcreator">2024-03-26 ROS Meetup San Antonio, Texas</a></li> +<li><a 
href="https://rosindustrial.org/events/2024/ros-industrial-consortium-americas-2024-annual-meeting">2024-03-27 ROS Industrial Annual Meeting</a></li> +<li><a href="https://www.eventbrite.com/e/robotics-and-tech-happy-hour-tickets-835278218637?aff=soc">2024-03-27 Robotics Happy Hour Pittsburgh</a></li> +<li><img alt=":new:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/new.png?v=12" title=":new:" width="20" /> <a href="https://eurecat.org/serveis/eurecatevents/rosmeetupbcn/">2024-04-03 ROS Meetup Barcelona</a></li> +<li><img alt=":new:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/new.png?v=12" title=":new:" width="20" /> <a href="https://www.eventbrite.com/e/robot-block-party-2024-registration-855602328597?aff=oddtdtcreator">2024-04-06 Silicon Valley Robotics Robot Block Party</a></li> +<li><a href="https://www.zephyrproject.org/zephyr-developer-summit-2024/?utm_campaign=Zephyr%20Developer%20Summit&amp;utm_content=285280528&amp;utm_medium=social&amp;utm_source=linkedin&amp;hss_channel=lcp-27161269">2024-04-16 → 2024-04-18 Zephyr Developer Summit</a></li> +<li><a href="https://www.eventbrite.com/e/robots-on-ice-40-2024-tickets-815030266467">2024-04-21 Robots on Ice</a></li> +<li><a href="https://2024.oshwa.org/">2024-05-03 Open Hardware Summit Montreal</a></li> +<li><a href="https://www.tum-venture-labs.de/education/bots-bento-the-first-robotic-pallet-handler-competition-icra-2024/">2024-05-13 Bots &amp; Bento @ ICRA 2024</a></li> +<li><a href="https://cpsweek2024-race.f1tenth.org/">2024-05-14 → 2024-05-16 F1Tenth Autonomous Grand Prix @ CPS-IoT Week</a></li> +<li><a href="https://discourse.ros.org/t/acm-sigsoft-summer-school-on-software-engineering-for-robotics-in-brussels/35992">2024-06-04 → 2024-06-08 ACM SIGSOFT Summer School on Software Engineering for Robotics</a></li> +<li><a href="https://www.agriculture-vision.com/">2024-06-?? 
Workshop on Agriculture Vision at CVPR 2024</a></li> +<li><a href="https://icra2024.rt-net.jp/archives/87">2024-06-16 Food Topping Challenge at ICRA 2024</a></li> +<li><a href="https://roscon.fr/">2024-06-19 → 2024-06-20 ROSCon France</a></li> +<li><a href="https://sites.udel.edu/ceoe-able/able-summer-bootcamp/">2024-08-05 Autonomous Systems Bootcamp at Univ. Delaware</a> – <a href="https://www.youtube.com/watch?v=4ETDsBN2o8M">Video</a></li> +<li><a href="https://roscon.jp/2024/">2024-09-25 ROSConJP</a></li> +</ul> +<h1><a class="anchor" href="https://discourse.ros.org#news-3" name="news-3"></a>News</h1> +<ul> +<li><a href="https://techcrunch.com/2024/03/06/saildrones-first-aluminum-surveyor-autonomous-vessel-splashes-down-for-navy-testing/">Saildrone’s New Aluminum Surveyor</a></li> +<li><a href="https://techcrunch.com/2024/03/06/amazon-teams-with-recycling-robot-firm-to-track-package-waste/">Glacier Recycling Robot Raises $7.7M</a> – <a href="https://www.therobotreport.com/recycling-automation-startup-glacier-brings-in-7-7m/">Robot Report</a></li> +<li><a href="https://techcrunch.com/2024/03/05/agility-robotics-new-ceo-is-focused-on-the-here-and-now/">New CEO at Agility</a></li> +<li><a href="https://techcrunch.com/2024/02/29/figure-rides-the-humanoid-robot-hype-wave-to-2-6b-valuation-and-openai-collab/">Figure raises $675M for Humanoid Robots</a></li> +<li><a href="https://www.therobotreport.com/rios-intelligent-machines-raises-series-b-funding-starts-rolls-out-mission-control/">RIOS Raises $13M Series B</a></li> +<li><a href="https://www.therobotreport.com/robotics-companies-raised-578m-in-january-2024/">$578M Raised for Robotics in January 2024</a></li> +<li><a href="https://hackaday.com/2024/03/06/the-16-pcb-robot/">$16 PCB Robot</a></li> +<li><a href="https://github.com/muskie82/MonoGS"><img alt=":cool:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/cool.png?v=12" title=":cool:" width="20" /> Gaussian Splatting SLAM source 
code</a></li> +<li><a href="https://www.therobotreport.com/researchers-develop-interface-for-quadriplegics-to-control-robots/">Researchers develop interface for quadriplegics to control robots</a></li> +<li><a href="https://github.com/Wuziyi616/LEOD">LEOD: Label-Efficient Object Detection for Event Cameras</a></li> +<li><a href="https://www.youtube.com/watch?v=uL5ClqHg5Jw">Taylor Alexander on Solar Powered Farming Robots</a></li> +<li><a href="https://discourse.ros.org/t/roscon-france-2024/35222/3">ROSCon France Logo Drops</a></li> +<li><a href="https://www.swri.org/industry/industrial-robotics-automation/blog/making-robot-programming-user-friendly">SwRI Workbench for Offline Robotics Development (SWORD)</a></li> +<li><img alt=":cool:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/cool.png?v=12" title=":cool:" width="20" /> <a href="https://github.com/Romea/cropcraft">Procedural World Generator for Farm Robots</a></li> +<li><a href="https://github.com/ulagbulag/kiss-icp-rs">KISS ICP Odometry in Rust <img alt=":crab:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/crab.png?v=12" title=":crab:" width="20" /></a></li> +<li><a href="https://github.com/princeton-vl/OcMesher?tab=readme-ov-file">View-Dependent Octree-based Mesh Extraction in Unbounded Scenes for Procedural Synthetic Data</a></li> +<li><a href="https://github.com/juanb09111/FinnForest">Woodlands Dataset with Stereo and LIDAR</a></li> +<li><a href="https://github.com/peterstratton/Volume-DROID">Volume-DROID SLAM Source Code</a></li> +<li><a href="https://spectrum.ieee.org/video-friday-human-to-humanoid">Video Friday</a></li> +</ul> +<h1><a class="anchor" href="https://discourse.ros.org#ros-4" name="ros-4"></a>ROS</h1> +<ul> +<li><a href="https://discourse.ros.org/t/releasing-packages-to-integrate-brickpi3-with-ros2/36389">ROS for Lego Mindstorms!</a></li> +<li><a 
href="https://discourse.ros.org/t/fyi-10-low-cost-lidar-lds-interfaced-to-ros2-micro-ros-arduino/36369"><img alt=":cool:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/cool.png?v=12" title=":cool:" width="20" /> 10+ Low-Cost LIDARs Compared</a></li> +<li><a href="https://discourse.ros.org/t/revival-of-client-library-working-group/36406">Reboot Client Library Working Group?</a></li> +<li><a href="https://discourse.ros.org/t/new-packages-for-humble-hawksbill-2024-03-08/36529">13 New and 220 Updated Packages for ROS 2 Humble</a></li> +<li><a href="https://discourse.ros.org/t/ros1-now-is-a-great-time-to-add-catkin-lint-to-your-packages/36521">Now is a Great Time to Add Catkin Lint to Your Package</a></li> +<li><a href="https://discourse.ros.org/t/cobot-magic-mobile-aloha-system-works-on-agilex-robotics-platform/36515">Cobot Magic: Mobile Aloha system works on AgileX Robotics platform</a></li> +<li><a href="https://discourse.ros.org/t/new-packages-for-noetic-2024-03-07/36514">10 New and 46 Updated Packages for ROS 1 Noetic</a></li> +<li><a href="https://discourse.ros.org/t/new-packages-for-ros-2-rolling-ridley-2024-02-28/36358">5 New and 279 Updated Packages for ROS 2 Rolling Ridley (Last 22.04 Update)</a></li> +<li><a href="https://discourse.ros.org/t/potential-humanoid-robotics-monthly-working-group/36426">Humanoid Working Group?</a></li> +<li><a href="https://discourse.ros.org/t/ros-mapping-and-navigation-with-agilex-robotics-limo/36452">New Agile-X LIMO</a></li> +<li><a href="https://discourse.ros.org/t/rosmicropy-graphical-controller-proposal-feedback/36424">ROS MicroPy Graphical Controller</a></li> +<li><a href="https://discourse.ros.org/t/noise-model-for-depth-camera-simulation/36385">Simulating Noise in Depth Cameras</a></li> +<li><a href="https://discourse.ros.org/t/what-are-the-main-challenges-you-faced-in-using-ros2-to-develop-industrial-applications-with-manipulators/36393">What are the main challenges you faced in using ROS2 to 
develop industrial applications with manipulators? </a></li> +<li><a href="https://www.youtube.com/playlist?list=PL8EeqqtDev57JEEs_HL3g9DbAwGkbWmhK">Autoware Foundation General Assembly 2023 Recordings</a></li> +<li><a href="https://arxiv.org/abs/2312.14808">F1Tenth: A Tricycle Model to Accurately Control an Autonomous Racecar with Locked Differential</a></li> +<li><img alt=":cool:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/cool.png?v=12" title=":cool:" width="20" /> <a href="https://arxiv.org/abs/2402.18558">Unifying F1TENTH Autonomous Racing: Survey, Methods and Benchmarks</a> – <a href="https://github.com/BDEvan5/f1tenth_benchmarks">Benchmark Data</a></li> +<li><a href="https://github.com/dimaxano/ros2-lifecycle-monitoring">RViz Plugin for Monitoring Node Life Cycles</a></li> +<li><a href="https://github.com/suchetanrs/ORB-SLAM3-ROS2-Docker">ROS 2 + ORB SLAM 3 Docker Container</a></li> +<li><a href="https://www.youtube.com/@kevinwoodrobot/playlists">Kevin Wood ROS Youtube Videos</a></li> +<li><img alt=":cool:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/cool.png?v=12" title=":cool:" width="20" /> <a href="https://arxiv.org/abs/2402.19341">JPL + ROS: RoadRunner - Learning Traversability Estimation for Autonomous Off-road Driving </a></li> +<li><a href="https://navigation.ros.org/tutorials/docs/integrating_vio.html">Nav2: Using VIO to Augment Robot Odometry</a></li> +<li><a href="https://github.com/MRPT/mvsim">MultiVehicle simulator (MVSim)</a></li> +<li><a href="https://kylew239.github.io/in_progress/crazyflie/">Light Painting with a Drone Swarm</a></li> +<li><a href="https://github.com/TKG-Tou-Kai-Group/CoRE-jp-Isaac-Sim-ROS2-packages">ROS 2 + Isaac Sim Docker (Japanese) </a></li> +<li><a href="https://github.com/husarion/rosbot-telepresence/tree/foxglove">Real-Time Internet Control and Video Streaming with ROSbot 2R / 2 PRO</a></li> +</ul> +<h1><a class="anchor" 
href="https://discourse.ros.org#ros-questions-5" name="ros-questions-5"></a>ROS Questions</h1> +<p>Please make ROS a better project for the next person! Take a moment to answer a question on <a href="https://robotics.stackexchange.com/">Robotics Stack Exchange</a>! Not your thing? <a href="https://github.com/ros2/ros2_documentation">Contribute to the ROS 2 Docs!</a></p> + <p><small>4 posts - 2 participants</small></p> + <p><a href="https://discourse.ros.org/t/ros-news-for-the-week-of-march-4th-2024/36532">Read full topic</a></p> + 2024-03-08T21:50:00+00:00 + Katherine_Scott + + + ROS Discourse General: New packages for Humble Hawksbill 2024-03-08 + https://discourse.ros.org/t/new-packages-for-humble-hawksbill-2024-03-08/36529 + <h2><a class="anchor" href="https://discourse.ros.org#package-updates-for-humble-1" name="package-updates-for-humble-1"></a>Package Updates for Humble</h2> +<h3><a class="anchor" href="https://discourse.ros.org#added-packages-13-2" name="added-packages-13-2"></a>Added Packages [13]:</h3> +<ul> +<li>ros-humble-apriltag-detector-dbgsym: 1.1.1-1</li> +<li>ros-humble-caret-analyze-cpp-impl: 0.5.0-5</li> +<li>ros-humble-caret-analyze-cpp-impl-dbgsym: 0.5.0-5</li> +<li><a href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-ds-dbw</a>: 2.1.10-1</li> +<li><a href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-ds-dbw-can</a>: 2.1.10-1</li> +<li>ros-humble-ds-dbw-can-dbgsym: 2.1.10-1</li> +<li><a href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-ds-dbw-joystick-demo</a>: 2.1.10-1</li> +<li>ros-humble-ds-dbw-joystick-demo-dbgsym: 2.1.10-1</li> +<li><a href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-ds-dbw-msgs</a>: 2.1.10-1</li> +<li>ros-humble-ds-dbw-msgs-dbgsym: 2.1.10-1</li> +<li>ros-humble-gazebo-no-physics-plugin: 0.1.1-1</li> +<li>ros-humble-gazebo-no-physics-plugin-dbgsym: 0.1.1-1</li> +<li>ros-humble-kinematics-interface-dbgsym: 0.3.0-1</li> +</ul> +<h3><a 
class="anchor" href="https://discourse.ros.org#updated-packages-220-3" name="updated-packages-220-3"></a>Updated Packages [220]:</h3> +<ul> +<li>ros-humble-ackermann-steering-controller: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-ackermann-steering-controller-dbgsym: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-admittance-controller: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-admittance-controller-dbgsym: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-apriltag-detector: 1.1.0-1 → 1.1.1-1</li> +<li>ros-humble-bicycle-steering-controller: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-bicycle-steering-controller-dbgsym: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-bno055: 0.4.1-1 → 0.5.0-1</li> +<li><a href="https://index.ros.org/p/camera_calibration/github-ros-perception-image_pipeline/">ros-humble-camera-calibration</a>: 3.0.3-1 → 3.0.4-1</li> +<li>ros-humble-caret-analyze: 0.5.0-1 → 0.5.0-2</li> +<li>ros-humble-cob-actions: 2.7.9-1 → 2.7.10-1</li> +<li>ros-humble-cob-actions-dbgsym: 2.7.9-1 → 2.7.10-1</li> +<li>ros-humble-cob-msgs: 2.7.9-1 → 2.7.10-1</li> +<li>ros-humble-cob-msgs-dbgsym: 2.7.9-1 → 2.7.10-1</li> +<li><a href="http://ros.org/wiki/cob_srvs">ros-humble-cob-srvs</a>: 2.7.9-1 → 2.7.10-1</li> +<li>ros-humble-cob-srvs-dbgsym: 2.7.9-1 → 2.7.10-1</li> +<li>ros-humble-controller-interface: 2.39.1-1 → 2.40.0-1</li> +<li>ros-humble-controller-interface-dbgsym: 2.39.1-1 → 2.40.0-1</li> +<li>ros-humble-controller-manager: 2.39.1-1 → 2.40.0-1</li> +<li>ros-humble-controller-manager-dbgsym: 2.39.1-1 → 2.40.0-1</li> +<li><a href="http://ros.org/wiki/controller_manager_msgs">ros-humble-controller-manager-msgs</a>: 2.39.1-1 → 2.40.0-1</li> +<li>ros-humble-controller-manager-msgs-dbgsym: 2.39.1-1 → 2.40.0-1</li> +<li><a href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-dataspeed-dbw-common</a>: 2.1.3-1 → 2.1.10-1</li> +<li><a href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-dataspeed-ulc</a>: 2.1.3-1 → 2.1.10-1</li> +<li><a 
href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-dataspeed-ulc-can</a>: 2.1.3-1 → 2.1.10-1</li> +<li>ros-humble-dataspeed-ulc-can-dbgsym: 2.1.3-1 → 2.1.10-1</li> +<li><a href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-dataspeed-ulc-msgs</a>: 2.1.3-1 → 2.1.10-1</li> +<li>ros-humble-dataspeed-ulc-msgs-dbgsym: 2.1.3-1 → 2.1.10-1</li> +<li><a href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-dbw-fca</a>: 2.1.3-1 → 2.1.10-1</li> +<li><a href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-dbw-fca-can</a>: 2.1.3-1 → 2.1.10-1</li> +<li>ros-humble-dbw-fca-can-dbgsym: 2.1.3-1 → 2.1.10-1</li> +<li><a href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-dbw-fca-description</a>: 2.1.3-1 → 2.1.10-1</li> +<li><a href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-dbw-fca-joystick-demo</a>: 2.1.3-1 → 2.1.10-1</li> +<li>ros-humble-dbw-fca-joystick-demo-dbgsym: 2.1.3-1 → 2.1.10-1</li> +<li><a href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-dbw-fca-msgs</a>: 2.1.3-1 → 2.1.10-1</li> +<li>ros-humble-dbw-fca-msgs-dbgsym: 2.1.3-1 → 2.1.10-1</li> +<li><a href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-dbw-ford</a>: 2.1.3-1 → 2.1.10-1</li> +<li><a href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-dbw-ford-can</a>: 2.1.3-1 → 2.1.10-1</li> +<li>ros-humble-dbw-ford-can-dbgsym: 2.1.3-1 → 2.1.10-1</li> +<li><a href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-dbw-ford-description</a>: 2.1.3-1 → 2.1.10-1</li> +<li><a href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-dbw-ford-joystick-demo</a>: 2.1.3-1 → 2.1.10-1</li> +<li>ros-humble-dbw-ford-joystick-demo-dbgsym: 2.1.3-1 → 2.1.10-1</li> +<li><a href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-dbw-ford-msgs</a>: 2.1.3-1 → 2.1.10-1</li> +<li>ros-humble-dbw-ford-msgs-dbgsym: 2.1.3-1 → 2.1.10-1</li> +<li><a 
href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-dbw-polaris</a>: 2.1.3-1 → 2.1.10-1</li> +<li><a href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-dbw-polaris-can</a>: 2.1.3-1 → 2.1.10-1</li> +<li>ros-humble-dbw-polaris-can-dbgsym: 2.1.3-1 → 2.1.10-1</li> +<li><a href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-dbw-polaris-description</a>: 2.1.3-1 → 2.1.10-1</li> +<li><a href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-dbw-polaris-joystick-demo</a>: 2.1.3-1 → 2.1.10-1</li> +<li>ros-humble-dbw-polaris-joystick-demo-dbgsym: 2.1.3-1 → 2.1.10-1</li> +<li><a href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-dbw-polaris-msgs</a>: 2.1.3-1 → 2.1.10-1</li> +<li>ros-humble-dbw-polaris-msgs-dbgsym: 2.1.3-1 → 2.1.10-1</li> +<li><a href="https://index.ros.org/p/depth_image_proc/github-ros-perception-image_pipeline/">ros-humble-depth-image-proc</a>: 3.0.3-1 → 3.0.4-1</li> +<li>ros-humble-depth-image-proc-dbgsym: 3.0.3-1 → 3.0.4-1</li> +<li>ros-humble-diff-drive-controller: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-diff-drive-controller-dbgsym: 2.32.0-1 → 2.33.0-1</li> +<li><a href="https://wiki.ros.org/draco_point_cloud_transport">ros-humble-draco-point-cloud-transport</a>: 1.0.9-1 → 1.0.10-1</li> +<li>ros-humble-draco-point-cloud-transport-dbgsym: 1.0.9-1 → 1.0.10-1</li> +<li>ros-humble-effort-controllers: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-effort-controllers-dbgsym: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-etsi-its-cam-coding: 2.0.0-1 → 2.0.1-1</li> +<li>ros-humble-etsi-its-cam-coding-dbgsym: 2.0.0-1 → 2.0.1-1</li> +<li>ros-humble-etsi-its-cam-conversion: 2.0.0-1 → 2.0.1-1</li> +<li>ros-humble-etsi-its-cam-msgs: 2.0.0-1 → 2.0.1-1</li> +<li>ros-humble-etsi-its-cam-msgs-dbgsym: 2.0.0-1 → 2.0.1-1</li> +<li>ros-humble-etsi-its-coding: 2.0.0-1 → 2.0.1-1</li> +<li>ros-humble-etsi-its-conversion: 2.0.0-1 → 2.0.1-1</li> +<li>ros-humble-etsi-its-conversion-dbgsym: 2.0.0-1 → 
2.0.1-1</li> +<li>ros-humble-etsi-its-denm-coding: 2.0.0-1 → 2.0.1-1</li> +<li>ros-humble-etsi-its-denm-coding-dbgsym: 2.0.0-1 → 2.0.1-1</li> +<li>ros-humble-etsi-its-denm-conversion: 2.0.0-1 → 2.0.1-1</li> +<li>ros-humble-etsi-its-denm-msgs: 2.0.0-1 → 2.0.1-1</li> +<li>ros-humble-etsi-its-denm-msgs-dbgsym: 2.0.0-1 → 2.0.1-1</li> +<li>ros-humble-etsi-its-messages: 2.0.0-1 → 2.0.1-1</li> +<li>ros-humble-etsi-its-msgs: 2.0.0-1 → 2.0.1-1</li> +<li>ros-humble-etsi-its-msgs-utils: 2.0.0-1 → 2.0.1-1</li> +<li>ros-humble-etsi-its-primitives-conversion: 2.0.0-1 → 2.0.1-1</li> +<li>ros-humble-etsi-its-rviz-plugins: 2.0.0-1 → 2.0.1-1</li> +<li>ros-humble-etsi-its-rviz-plugins-dbgsym: 2.0.0-1 → 2.0.1-1</li> +<li>ros-humble-flir-camera-description: 2.0.8-2 → 2.0.8-3</li> +<li>ros-humble-flir-camera-msgs: 2.0.8-2 → 2.0.8-3</li> +<li>ros-humble-flir-camera-msgs-dbgsym: 2.0.8-2 → 2.0.8-3</li> +<li>ros-humble-force-torque-sensor-broadcaster: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-force-torque-sensor-broadcaster-dbgsym: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-forward-command-controller: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-forward-command-controller-dbgsym: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-gripper-controllers: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-gripper-controllers-dbgsym: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-hardware-interface: 2.39.1-1 → 2.40.0-1</li> +<li>ros-humble-hardware-interface-dbgsym: 2.39.1-1 → 2.40.0-1</li> +<li>ros-humble-hardware-interface-testing: 2.39.1-1 → 2.40.0-1</li> +<li>ros-humble-hardware-interface-testing-dbgsym: 2.39.1-1 → 2.40.0-1</li> +<li><a href="https://index.ros.org/p/image_pipeline/github-ros-perception-image_pipeline/">ros-humble-image-pipeline</a>: 3.0.3-1 → 3.0.4-1</li> +<li><a href="https://index.ros.org/p/image_proc/github-ros-perception-image_pipeline/">ros-humble-image-proc</a>: 3.0.3-1 → 3.0.4-1</li> +<li>ros-humble-image-proc-dbgsym: 3.0.3-1 → 3.0.4-1</li> +<li><a 
href="https://index.ros.org/p/image_publisher/github-ros-perception-image_pipeline/">ros-humble-image-publisher</a>: 3.0.3-1 → 3.0.4-1</li> +<li>ros-humble-image-publisher-dbgsym: 3.0.3-1 → 3.0.4-1</li> +<li><a href="https://index.ros.org/p/image_rotate/github-ros-perception-image_pipeline/">ros-humble-image-rotate</a>: 3.0.3-1 → 3.0.4-1</li> +<li>ros-humble-image-rotate-dbgsym: 3.0.3-1 → 3.0.4-1</li> +<li><a href="https://index.ros.org/p/image_view/github-ros-perception-image_pipeline/">ros-humble-image-view</a>: 3.0.3-1 → 3.0.4-1</li> +<li>ros-humble-image-view-dbgsym: 3.0.3-1 → 3.0.4-1</li> +<li>ros-humble-imu-sensor-broadcaster: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-imu-sensor-broadcaster-dbgsym: 2.32.0-1 → 2.33.0-1</li> +<li><a href="https://github.com/ros-controls/ros2_control/wiki" rel="noopener nofollow ugc">ros-humble-joint-limits</a>: 2.39.1-1 → 2.40.0-1</li> +<li>ros-humble-joint-limits-dbgsym: 2.39.1-1 → 2.40.0-1</li> +<li>ros-humble-joint-state-broadcaster: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-joint-state-broadcaster-dbgsym: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-joint-trajectory-controller: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-joint-trajectory-controller-dbgsym: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-kinematics-interface: 0.2.0-1 → 0.3.0-1</li> +<li>ros-humble-kinematics-interface-kdl: 0.2.0-1 → 0.3.0-1</li> +<li>ros-humble-kinematics-interface-kdl-dbgsym: 0.2.0-1 → 0.3.0-1</li> +<li><a href="https://github.com/pal-robotics/launch_pal" rel="noopener nofollow ugc">ros-humble-launch-pal</a>: 0.0.16-1 → 0.0.18-1</li> +<li><a href="http://wiki.ros.org/mavros">ros-humble-libmavconn</a>: 2.6.0-1 → 2.7.0-1</li> +<li>ros-humble-libmavconn-dbgsym: 2.6.0-1 → 2.7.0-1</li> +<li><a href="https://mavlink.io/en/" rel="noopener nofollow ugc">ros-humble-mavlink</a>: 2023.9.9-1 → 2024.3.3-1</li> +<li><a href="http://wiki.ros.org/mavros">ros-humble-mavros</a>: 2.6.0-1 → 2.7.0-1</li> +<li>ros-humble-mavros-dbgsym: 2.6.0-1 → 2.7.0-1</li> +<li><a 
href="http://wiki.ros.org/mavros_extras">ros-humble-mavros-extras</a>: 2.6.0-1 → 2.7.0-1</li> +<li>ros-humble-mavros-extras-dbgsym: 2.6.0-1 → 2.7.0-1</li> +<li><a href="http://wiki.ros.org/mavros_msgs">ros-humble-mavros-msgs</a>: 2.6.0-1 → 2.7.0-1</li> +<li>ros-humble-mavros-msgs-dbgsym: 2.6.0-1 → 2.7.0-1</li> +<li><a href="https://www.mrpt.org/" rel="noopener nofollow ugc">ros-humble-mrpt2</a>: 2.11.9-1 → 2.11.11-1</li> +<li>ros-humble-mrpt2-dbgsym: 2.11.9-1 → 2.11.11-1</li> +<li><a href="https://wiki.ros.org/mvsim">ros-humble-mvsim</a>: 0.8.3-1 → 0.9.1-1</li> +<li>ros-humble-mvsim-dbgsym: 0.8.3-1 → 0.9.1-1</li> +<li><a href="https://github.com/LORD-MicroStrain/ntrip_client" rel="noopener nofollow ugc">ros-humble-ntrip-client</a>: 1.2.0-1 → 1.3.0-1</li> +<li><a href="https://github.com/pal-robotics/play_motion2" rel="noopener nofollow ugc">ros-humble-play-motion2</a>: 0.0.13-1 → 1.0.0-1</li> +<li>ros-humble-play-motion2-dbgsym: 0.0.13-1 → 1.0.0-1</li> +<li><a href="https://github.com/pal-robotics/play_motion2" rel="noopener nofollow ugc">ros-humble-play-motion2-msgs</a>: 0.0.13-1 → 1.0.0-1</li> +<li>ros-humble-play-motion2-msgs-dbgsym: 0.0.13-1 → 1.0.0-1</li> +<li><a href="https://github.com/facontidavide/PlotJuggler" rel="noopener nofollow ugc">ros-humble-plotjuggler</a>: 3.9.0-1 → 3.9.1-1</li> +<li>ros-humble-plotjuggler-dbgsym: 3.9.0-1 → 3.9.1-1</li> +<li><a href="https://github.com/pal-robotics/pmb2_simulation" rel="noopener nofollow ugc">ros-humble-pmb2-2dnav</a>: 4.0.9-1 → 4.0.12-1</li> +<li><a href="https://github.com/pal-robotics/pmb2_simulation" rel="noopener nofollow ugc">ros-humble-pmb2-bringup</a>: 5.0.15-1 → 5.0.16-1</li> +<li><a href="https://github.com/pal-robotics/pmb2_simulation" rel="noopener nofollow ugc">ros-humble-pmb2-controller-configuration</a>: 5.0.15-1 → 5.0.16-1</li> +<li><a href="https://github.com/pal-robotics/pmb2_simulation" rel="noopener nofollow ugc">ros-humble-pmb2-description</a>: 5.0.15-1 → 5.0.16-1</li> +<li><a 
href="https://github.com/pal-robotics/pmb2_simulation" rel="noopener nofollow ugc">ros-humble-pmb2-laser-sensors</a>: 4.0.9-1 → 4.0.12-1</li> +<li><a href="https://github.com/pal-robotics/pmb2_simulation" rel="noopener nofollow ugc">ros-humble-pmb2-maps</a>: 4.0.9-1 → 4.0.12-1</li> +<li><a href="https://github.com/pal-robotics/pmb2_simulation" rel="noopener nofollow ugc">ros-humble-pmb2-navigation</a>: 4.0.9-1 → 4.0.12-1</li> +<li><a href="https://github.com/pal-robotics/pmb2_simulation" rel="noopener nofollow ugc">ros-humble-pmb2-robot</a>: 5.0.15-1 → 5.0.16-1</li> +<li><a href="https://wiki.ros.org/draco_point_cloud_transport">ros-humble-point-cloud-interfaces</a>: 1.0.9-1 → 1.0.10-1</li> +<li>ros-humble-point-cloud-interfaces-dbgsym: 1.0.9-1 → 1.0.10-1</li> +<li>ros-humble-point-cloud-transport: 1.0.15-1 → 1.0.16-1</li> +<li>ros-humble-point-cloud-transport-dbgsym: 1.0.15-1 → 1.0.16-1</li> +<li><a href="https://wiki.ros.org/point_cloud_transport">ros-humble-point-cloud-transport-plugins</a>: 1.0.9-1 → 1.0.10-1</li> +<li>ros-humble-point-cloud-transport-py: 1.0.15-1 → 1.0.16-1</li> +<li>ros-humble-position-controllers: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-position-controllers-dbgsym: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-psdk-interfaces: 1.0.0-1 → 1.1.0-1</li> +<li>ros-humble-psdk-interfaces-dbgsym: 1.0.0-1 → 1.1.0-1</li> +<li>ros-humble-psdk-wrapper: 1.0.0-1 → 1.1.0-1</li> +<li>ros-humble-psdk-wrapper-dbgsym: 1.0.0-1 → 1.1.0-1</li> +<li>ros-humble-range-sensor-broadcaster: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-range-sensor-broadcaster-dbgsym: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-ros2-control: 2.39.1-1 → 2.40.0-1</li> +<li>ros-humble-ros2-control-test-assets: 2.39.1-1 → 2.40.0-1</li> +<li>ros-humble-ros2-controllers: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-ros2-controllers-test-nodes: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-ros2caret: 0.5.0-2 → 0.5.0-6</li> +<li>ros-humble-ros2controlcli: 2.39.1-1 → 2.40.0-1</li> +<li><a 
href="http://ros.org/wiki/rqt_controller_manager">ros-humble-rqt-controller-manager</a>: 2.39.1-1 → 2.40.0-1</li> +<li>ros-humble-rqt-gauges: 0.0.1-1 → 0.0.2-1</li> +<li>ros-humble-rqt-joint-trajectory-controller: 2.32.0-1 → 2.33.0-1</li> +<li><a href="http://introlab.github.io/rtabmap" rel="noopener nofollow ugc">ros-humble-rtabmap</a>: 0.21.3-1 → 0.21.4-1</li> +<li>ros-humble-rtabmap-conversions: 0.21.3-1 → 0.21.4-2</li> +<li>ros-humble-rtabmap-conversions-dbgsym: 0.21.3-1 → 0.21.4-2</li> +<li>ros-humble-rtabmap-dbgsym: 0.21.3-1 → 0.21.4-1</li> +<li>ros-humble-rtabmap-demos: 0.21.3-1 → 0.21.4-2</li> +<li>ros-humble-rtabmap-examples: 0.21.3-1 → 0.21.4-2</li> +<li>ros-humble-rtabmap-launch: 0.21.3-1 → 0.21.4-2</li> +<li>ros-humble-rtabmap-msgs: 0.21.3-1 → 0.21.4-2</li> +<li>ros-humble-rtabmap-msgs-dbgsym: 0.21.3-1 → 0.21.4-2</li> +<li>ros-humble-rtabmap-odom: 0.21.3-1 → 0.21.4-2</li> +<li>ros-humble-rtabmap-odom-dbgsym: 0.21.3-1 → 0.21.4-2</li> +<li>ros-humble-rtabmap-python: 0.21.3-1 → 0.21.4-2</li> +<li>ros-humble-rtabmap-ros: 0.21.3-1 → 0.21.4-2</li> +<li>ros-humble-rtabmap-rviz-plugins: 0.21.3-1 → 0.21.4-2</li> +<li>ros-humble-rtabmap-rviz-plugins-dbgsym: 0.21.3-1 → 0.21.4-2</li> +<li>ros-humble-rtabmap-slam: 0.21.3-1 → 0.21.4-2</li> +<li>ros-humble-rtabmap-slam-dbgsym: 0.21.3-1 → 0.21.4-2</li> +<li>ros-humble-rtabmap-sync: 0.21.3-1 → 0.21.4-2</li> +<li>ros-humble-rtabmap-sync-dbgsym: 0.21.3-1 → 0.21.4-2</li> +<li>ros-humble-rtabmap-util: 0.21.3-1 → 0.21.4-2</li> +<li>ros-humble-rtabmap-util-dbgsym: 0.21.3-1 → 0.21.4-2</li> +<li>ros-humble-rtabmap-viz: 0.21.3-1 → 0.21.4-2</li> +<li>ros-humble-rtabmap-viz-dbgsym: 0.21.3-1 → 0.21.4-2</li> +<li>ros-humble-simple-launch: 1.9.0-1 → 1.9.1-1</li> +<li>ros-humble-spinnaker-camera-driver: 2.0.8-2 → 2.0.8-3</li> +<li>ros-humble-spinnaker-camera-driver-dbgsym: 2.0.8-2 → 2.0.8-3</li> +<li>ros-humble-steering-controllers-library: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-steering-controllers-library-dbgsym: 2.32.0-1 → 
2.33.0-1</li> +<li><a href="https://index.ros.org/p/stereo_image_proc/github-ros-perception-image_pipeline/">ros-humble-stereo-image-proc</a>: 3.0.3-1 → 3.0.4-1</li> +<li>ros-humble-stereo-image-proc-dbgsym: 3.0.3-1 → 3.0.4-1</li> +<li><a href="https://github.com/pal-robotics/tiago_simulation" rel="noopener nofollow ugc">ros-humble-tiago-2dnav</a>: 4.0.9-1 → 4.0.12-1</li> +<li><a href="https://github.com/pal-robotics/tiago_simulation" rel="noopener nofollow ugc">ros-humble-tiago-bringup</a>: 4.1.2-1 → 4.2.3-1</li> +<li><a href="https://github.com/pal-robotics/tiago_simulation" rel="noopener nofollow ugc">ros-humble-tiago-controller-configuration</a>: 4.1.2-1 → 4.2.3-1</li> +<li><a href="https://github.com/pal-robotics/tiago_simulation" rel="noopener nofollow ugc">ros-humble-tiago-description</a>: 4.1.2-1 → 4.2.3-1</li> +<li><a href="https://github.com/pal-robotics/tiago_simulation" rel="noopener nofollow ugc">ros-humble-tiago-gazebo</a>: 4.0.8-1 → 4.1.0-1</li> +<li><a href="https://github.com/pal-robotics/tiago_simulation" rel="noopener nofollow ugc">ros-humble-tiago-laser-sensors</a>: 4.0.9-1 → 4.0.12-1</li> +<li><a href="https://github.com/pal-robotics/tiago_simulation" rel="noopener nofollow ugc">ros-humble-tiago-moveit-config</a>: 3.0.7-1 → 3.0.10-1</li> +<li><a href="https://github.com/pal-robotics/tiago_simulation" rel="noopener nofollow ugc">ros-humble-tiago-navigation</a>: 4.0.9-1 → 4.0.12-1</li> +<li><a href="https://github.com/pal-robotics/tiago_simulation" rel="noopener nofollow ugc">ros-humble-tiago-robot</a>: 4.1.2-1 → 4.2.3-1</li> +<li><a href="https://github.com/pal-robotics/tiago_simulation" rel="noopener nofollow ugc">ros-humble-tiago-simulation</a>: 4.0.8-1 → 4.1.0-1</li> +<li>ros-humble-tracetools-image-pipeline: 3.0.3-1 → 3.0.4-1</li> +<li>ros-humble-tracetools-image-pipeline-dbgsym: 3.0.3-1 → 3.0.4-1</li> +<li>ros-humble-transmission-interface: 2.39.1-1 → 2.40.0-1</li> +<li>ros-humble-transmission-interface-dbgsym: 2.39.1-1 → 2.40.0-1</li> 
+<li>ros-humble-tricycle-controller: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-tricycle-controller-dbgsym: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-tricycle-steering-controller: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-tricycle-steering-controller-dbgsym: 2.32.0-1 → 2.33.0-1</li> +<li><a href="http://wiki.ros.org/ur_client_library">ros-humble-ur-client-library</a>: 1.3.4-1 → 1.3.5-1</li> +<li>ros-humble-ur-client-library-dbgsym: 1.3.4-1 → 1.3.5-1</li> +<li>ros-humble-velocity-controllers: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-velocity-controllers-dbgsym: 2.32.0-1 → 2.33.0-1</li> +<li><a href="https://wiki.ros.org/draco_point_cloud_transport">ros-humble-zlib-point-cloud-transport</a>: 1.0.9-1 → 1.0.10-1</li> +<li>ros-humble-zlib-point-cloud-transport-dbgsym: 1.0.9-1 → 1.0.10-1</li> +<li><a href="https://wiki.ros.org/draco_point_cloud_transport">ros-humble-zstd-point-cloud-transport</a>: 1.0.9-1 → 1.0.10-1</li> +<li>ros-humble-zstd-point-cloud-transport-dbgsym: 1.0.9-1 → 1.0.10-1</li> +</ul> +<h3><a class="anchor" href="https://discourse.ros.org#removed-packages-2-4" name="removed-packages-2-4"></a>Removed Packages [2]:</h3> +<ul> +<li><a href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-dataspeed-dbw-gateway</a></li> +<li>ros-humble-dataspeed-dbw-gateway-dbgsym</li> +</ul> +<p>Thanks to all ROS maintainers who make packages available to the ROS community. 
The above list of packages was made possible by the work of the following maintainers:</p> +<ul> +<li>Alejandro Hernandez Cordero</li> +<li>Alejandro Hernández</li> +<li>Bence Magyar</li> +<li>Bernd Pfrommer</li> +<li>Bianca Bendris</li> +<li>Boeing</li> +<li>Davide Faconti</li> +<li>Denis Štogl</li> +<li>Eloy Bricneo</li> +<li>Felix Exner</li> +<li>Felix Messmer</li> +<li>Jean-Pierre Busch</li> +<li>Jordan Palacios</li> +<li>Jordi Pages</li> +<li>Jose-Luis Blanco-Claraco</li> +<li>Kevin Hallenbeck</li> +<li>Luis Camero</li> +<li>Martin Pecka</li> +<li>Mathieu Labbe</li> +<li>Micho Radovnikovich</li> +<li>Noel Jimenez</li> +<li>Olivier Kermorgant</li> +<li>Rob Fisher</li> +<li>TIAGo PAL support team</li> +<li>Vincent Rabaud</li> +<li>Vladimir Ermakov</li> +<li>Víctor Mayoral-Vilches</li> +<li>flynneva</li> +<li>ymski</li> +</ul> + <p><small>1 post - 1 participant</small></p> + <p><a href="https://discourse.ros.org/t/new-packages-for-humble-hawksbill-2024-03-08/36529">Read full topic</a></p> + 2024-03-08T16:36:12+00:00 + audrow + + + ROS Discourse General: ROS1: Now is a great time to add `catkin_lint` to your packages! + https://discourse.ros.org/t/ros1-now-is-a-great-time-to-add-catkin-lint-to-your-packages/36521 + <p><a href="https://fkie.github.io/catkin_lint/" rel="noopener nofollow ugc">catkin_lint</a> is an established ROS package that can do lots of useful checks on your CMakeLists.txt and package.xml. It can, for example, warn you about dependencies that do not match between package.xml and CMakeLists.txt, check that every rosdep key in package.xml exists, watch that all executable files in your package get installed, warn you about the most common CMake misuses, and it recently even gained the ability to warn you if you’re using a CMake feature that is too new for the CMake version you’ve put in <code>cmake_minimum_required()</code>.
And there’s <a href="https://fkie.github.io/catkin_lint/messages/" rel="noopener nofollow ugc">much more</a>.</p> +<p>Personally, as a maintainer, I feel much more comfortable releasing a new version of a package once I see catkin_lint pass without complaints.</p> +<p>Until recently, automatically running catkin_lint tests on packages released via the buildfarm was problematic because <a href="https://github.com/ros-infrastructure/ros_buildfarm/issues/923" rel="noopener nofollow ugc">the buildfarm doesn’t initialize the rosdep cache</a> and <a href="https://github.com/fkie/catkin_lint/issues/108" rel="noopener nofollow ugc">catkin_lint needed it to work</a>. The recently released version 1.6.22 of catkin_lint no longer fails in this case, so it is able to run all tests that do not require rosdep on the buildfarm, while disabling those that do (currently only the check that package.xml keys point to existing packages).</p> +<p>Adding automatic catkin_lint checks to your package is easy!</p> +<p>CMakeLists.txt:</p> +<pre><code class="lang-cmake">if (CATKIN_ENABLE_TESTING) + find_package(roslint REQUIRED) + roslint_custom(catkin_lint "-W2" .)
+ roslint_add_test() +endif() +</code></pre> +<p>package.xml:</p> +<pre><code class="lang-XML">&lt;test_depend&gt;python3-catkin-lint&lt;/test_depend&gt; +&lt;test_depend&gt;roslint&lt;/test_depend&gt; +</code></pre> +<p>And that’s it!</p> +<p>If you want to run the test locally, you can either manually invoke <code>catkin_lint .</code> in your package directory, or run <code>make roslint</code> in the build directory.</p> +<p>And if you’re okay with some of the warnings catkin_lint gives you, you can always ignore them, either for a single line (<code>#catkin_lint: ignore_once duplicate_find</code>) or globally by adding arguments to the <code>catkin_lint</code> call (<code>catkin_lint -W2 --ignore duplicate_find .</code>).</p> +<p>Of course, the catkin_lint automation should not substitute for manual runs of this tool before releasing a new version of your package. It should be a good habit to run catkin_lint after you finish editing your build files. However, with the automation built in, you get the assurance that even if you forget to run the tool manually, the buildfarm will let you know <img alt=":slight_smile:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/slight_smile.png?v=12" title=":slight_smile:" width="20" /></p> +<p>You can see examples of catkin_lint used on buildfarm-released packages e.g. in our ROS utils stack: <a class="inline-onebox" href="https://github.com/ctu-vras/ros-utils/blob/master/cras_topic_tools/CMakeLists.txt#L108" rel="noopener nofollow ugc">ros-utils/cras_topic_tools/CMakeLists.txt at master · ctu-vras/ros-utils · GitHub</a>. Or scroll down on <a class="inline-onebox" href="https://index.ros.org/d/python3-catkin-lint/">rosdep System Dependency: python3-catkin-lint</a> to see all the others.</p> +<hr /> +<p>NB: I’m not the developer of catkin_lint.
<a class="mention" href="https://discourse.ros.org/u/roehling">@roehling</a> @ FKIE is doing all of the awesome work!</p> +<hr /> +<p>NB2: While you’re at it, also have a look at:</p> +<pre><code class="lang-auto">find_package(roslaunch REQUIRED) +roslaunch_add_file_check(launch IGNORE_UNSET_ARGS) +</code></pre> +<p>and</p> +<pre><code class="lang-auto">&lt;test_depend&gt;roslaunch&lt;/test_depend&gt; +</code></pre> + <p><small>1 post - 1 participant</small></p> + <p><a href="https://discourse.ros.org/t/ros1-now-is-a-great-time-to-add-catkin-lint-to-your-packages/36521">Read full topic</a></p> + 2024-03-08T10:28:13+00:00 + peci1 + + + ROS Discourse General: Cobot Magic: Mobile Aloha system works on AgileX Robotics platform + https://discourse.ros.org/t/cobot-magic-mobile-aloha-system-works-on-agilex-robotics-platform/36515 + <h1><a class="anchor" href="https://discourse.ros.org#introduction-1" name="introduction-1"></a>Introduction</h1> +<p>AgileX Cobot Magic is a system based on Mobile ALOHA that can simultaneously remotely control the TRACER mobile base and robotic arms.</p> +<p><img alt="Watering flowers" class="animated" height="390" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/5/1/51f4b207bad0b9e972c323fe70722378c752cbf1.gif" width="690" /></p> +<h2><a class="anchor" href="https://discourse.ros.org#story-2" name="story-2"></a>Story</h2> +<p>Mobile Aloha is a whole-body teleoperation data collection system developed by Zipeng Fu, Tony Z. Zhao, and Chelsea Finn from Stanford University. Its hardware is based on two robotic arms (ViperX 300), equipped with two wrist cameras and one top camera, and an <strong>AgileX Robotics Tracer</strong> differential-drive mobile base. Data collected using Mobile ALOHA, combined with supervised behavior cloning and joint training with existing static ALOHA datasets, can improve the performance of mobile manipulation tasks.
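</p>
<p>The supervised behavior cloning mentioned above can be sketched in a few lines. The following is an illustrative toy example only, not the Stanford or AgileX code; all names and values in it are made up for the sketch.</p>

```python
# Illustrative sketch only: behavior cloning treats demonstrations as a
# supervised dataset of (state, action) pairs. Toy integer states stand in
# for the real system's camera images and joint positions; toy float actions
# stand in for its joint-space action targets.

def flatten_demos(demos):
    """Flatten trajectories [(s0, a0), (s1, a1), ...] into one (s, a) dataset."""
    pairs = []
    for trajectory in demos:
        pairs.extend(trajectory)
    return pairs

def fit_policy(pairs):
    """A minimal 'policy': for each seen state, average the demonstrated actions."""
    sums, counts = {}, {}
    for state, action in pairs:
        sums[state] = sums.get(state, 0.0) + action
        counts[state] = counts.get(state, 0) + 1
    return {state: sums[state] / counts[state] for state in sums}

demos = [
    [(0, 1.0), (1, 2.0)],            # demonstration 1
    [(0, 3.0), (1, 2.0), (2, 0.5)],  # demonstration 2
]
policy = fit_policy(flatten_demos(demos))
print(policy)  # state 0 maps to the mean of actions 1.0 and 3.0, i.e. 2.0
```

<p>The real pipeline fits a learned sequence model to these pairs rather than a lookup table, but the supervised structure of the data is the same.</p>
<p>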
With 50 demonstrations for each task, joint training can increase the success rate by 90%. Mobile ALOHA can autonomously perform complex mobile manipulation tasks such as cooking and opening doors. <strong>Special thanks to the Stanford research team Zipeng Fu, Tony Z. Zhao, and Chelsea Finn for their research on Mobile ALOHA</strong>, which enabled a fully open-source implementation. For more details about this project, please check the <a href="https://mobile-aloha.github.io/" rel="noopener nofollow ugc">link</a>.</p> +<p>Based on Mobile Aloha, AgileX developed Cobot Magic, which runs the complete Mobile Aloha codebase on a higher-spec, lower-cost platform equipped with higher-payload robotic arms and a high-compute industrial PC. For more details about Cobot Magic, please check the <a href="https://global.agilex.ai/" rel="noopener nofollow ugc">AgileX website</a>.</p> +<p>AgileX Cobot Magic is a system based on Mobile ALOHA that can simultaneously remotely control the TRACER mobile base and robotic arms. It is equipped with an indoor differential-drive AGV base, high-performance robotic arms, an industrial-grade computer, and other components. AgileX Cobot Magic assists users in better utilizing open-source hardware and the Mobile ALOHA deep learning framework for robotics.
It covers a wide range of tasks, from simple pick-and-place operations to more intricate and complex actions such as pouring, cooking, riding elevators, and organizing items.</p> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/1/f/1f9a0a4bbbad855e74db30ed0f83c9efd121753c.jpeg" rel="noopener nofollow ugc" title="cobot magic"><img alt="cobot magic" height="500" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/1/f/1f9a0a4bbbad855e74db30ed0f83c9efd121753c_2_500x500.jpeg" width="500" /></a></div><p></p> +<p>AgileX has successfully completed the integration of Cobot Magic based on the Mobile Aloha source code project. It currently covers the entire process of data collection, data replay, data visualization, demonstration mode, model training, inference, and so on. This project will introduce AgileX Cobot Magic and provide ongoing updates on the training progress of mobile manipulation tasks.</p> +<h3><a class="anchor" href="https://discourse.ros.org#hardware-configuration-3" name="hardware-configuration-3"></a><strong>Hardware configuration</strong></h3> +<p>Here is the list of hardware in AgileX Cobot Magic.</p> +<div class="md-table"> +<table> +<thead> +<tr> +<th>Component</th> +<th>Item Name</th> +<th>Model</th> +</tr> +</thead> +<tbody> +<tr> +<td>Standard Configuration</td> +<td>Wheeled Mobile Robot</td> +<td>AgileX Tracer</td> +</tr> +<tr> +<td>Depth Camera x3</td> +<td>Orbbec Dabai</td> +<td></td> +</tr> +<tr> +<td>USB Hub</td> +<td>12V Power Supply, 7-port</td> +<td></td> +</tr> +<tr> +<td>6 DOF Lightweight Robot Arm x4</td> +<td>Customized by AgileX</td> +<td></td> +</tr> +<tr> +<td>Adjustable Velcro x2</td> +<td>Customized by AgileX</td> +<td></td> +</tr> +<tr> +<td>Grip Tape x2</td> +<td>Customized by AgileX</td> +<td></td> +</tr> +<tr> +<td>Power Strip</td> +<td>4 Outlets, 1.8m</td> +<td></td> +</tr> +<tr> +<td>Mobile Power Station</td> 
+<td>1000W</td> +<td></td> +</tr> +<tr> +<td>ALOHA Stand</td> +<td>Customized by AgileX</td> +<td></td> +</tr> +<tr> +<td>Optional Configuration</td> +<td>Nano Development Kit</td> +<td>Jetson Orin Nano Developer Kit (8G)</td> +</tr> +<tr> +<td>Industrial PC</td> +<td>APQ-X7010/GPU 4060/i7-9700-32g-4T</td> +<td></td> +</tr> +<tr> +<td>IMU</td> +<td>CH110</td> +<td></td> +</tr> +<tr> +<td>Display</td> +<td>11.6" 1080p</td> +<td></td> +</tr> +</tbody> +</table> +</div><p>Note: An IPC is required. Users have two options: the Nano Development Kit and the APQ-X7010 IPC.</p> +<h3><a class="anchor" href="https://discourse.ros.org#software-configuration-4" name="software-configuration-4"></a><strong>Software configuration</strong></h3> +<p><strong>Local computer:</strong></p> +<p>Ubuntu 20.04, CUDA 11.3.</p> +<p><strong>Environment configuration:</strong></p> +<pre><code class="lang-auto"># 1. Create python virtual environment +conda create -n aloha python=3.8 + +# 2. Activate +conda activate aloha + +# 3. Install cuda and torch +pip install torch==1.11.0+cu113 torchvision==0.12.0+cu113 torchaudio==0.11.0 --extra-index-url https://download.pytorch.org/whl/cu113 + + +# 4. Install detr +## Get act code +git clone https://github.com/agilexrobotics/act-plus-plus.git +cd act-plus-plus + + +# 4.1 Install other dependencies +pip install -r requirements.txt + +## 4.2 Install detr +cd detr &amp;&amp; pip install -v -e . +</code></pre> +<p><strong>Simulated environment datasets</strong></p> +<p>You can find all scripted/human demos for the simulated environments <a href="https://drive.google.com/drive/folders/1gPR03v05S1xiInoVJn7G7VJ9pDCnxq9O" rel="noopener nofollow ugc">here</a>.</p> +<p>After downloading, copy them to the act-plus-plus/data directory. The directory structure is as follows:</p> +<pre><code class="lang-auto">act-plus-plus/data + ├── sim_insertion_human + │ ├── sim_insertion_human-20240110T054847Z-001.zip + ├── ... 
+ ├── sim_insertion_scripted + │ ├── sim_insertion_scripted-20240110T054854Z-001.zip + ├── ... + ├── sim_transfer_cube_human + │ ├── sim_transfer_cube_human-20240110T054900Z-001.zip + │ ├── ... + └── sim_transfer_cube_scripted + ├── sim_transfer_cube_scripted-20240110T054901Z-001.zip + ├── ... +</code></pre> +<h3><a class="anchor" href="https://discourse.ros.org#demonstration-5" name="demonstration-5"></a><strong>Demonstration</strong></h3> +<p>By now it is widely accepted that learning a task from scratch, i.e., without any prior knowledge, is a daunting undertaking. Humans, however, rarely attempt to learn from scratch. They extract initial biases as well as strategies on how to approach a learning problem from instructions and/or demonstrations of other humans. This is what we call ‘programming by demonstration’ or ‘imitation learning’.</p> +<p>The demonstration data usually contains decision trajectories {T1, T2, …, Tm}. Each trajectory contains a state and action sequence</p> +<p><img alt="image.png" height="25" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/6/1/6111251fd1d4fc2705870cbb78cec3e6a365c753.png" width="226" /></p> +<p>Extract all “state-action pairs” and build a new set</p> +<p><img alt="image.png" height="22" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/0/f/0fdd7c382a74a5e19eb6e32f31c0dddcd48e5e2b.png" width="336" /></p> +<p>Based on AgileX Cobot Magic, we can currently perform multiple whole-body action tasks.</p> +<p>Here we show demonstrations of different action tasks collected using AgileX Cobot Magic.</p> +<p>● <strong>Watering flowers</strong></p> +<p><img alt="Watering flowers" class="animated" height="390" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/5/1/51f4b207bad0b9e972c323fe70722378c752cbf1.gif" width="690" /></p> +<p>● <strong>Opening a box</strong></p> +<p><img alt="Opening a box" class="animated" height="390" 
src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/8/6/86e16475476bf07b16c76d9af74d99b2469ecc80.gif" width="690" /></p> +<p>● <strong>Pouring rice</strong></p> +<p><img alt="Pouring rice" class="animated" height="390" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/5/5/551c084ea7f0ac49044a4bc8c78c95fba2570c88.gif" width="690" /></p> +<p>● <strong>Twisting a bottle cap</strong></p> +<p><img alt="Twisting a bottle cap" class="animated" height="444" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/5/3/53214e9bd6afb1292baf5ba6a3d548b21aa468ee.gif" width="400" /></p> +<p>● <strong>Throwing rubbish</strong></p> +<p><img alt="Throwing rubbish" class="animated" height="387" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/e/a/eaa1703cf4e551c161caf33d4d666987191b98e8.gif" width="690" /></p> +<p>Using AgileX Cobot Magic, users can flexibly complete a variety of everyday action tasks by controlling the teaching arms, from simple pick-and-place skills to more sophisticated ones such as twisting bottle caps. The mobile chassis opens up more possibilities for the robotic arms, which are no longer restricted to performing actions in a fixed place. The 14 + 2 DOFs provide limitless potential for collecting diverse data.</p> +<h3><a class="anchor" href="https://discourse.ros.org#data-presentation-6" name="data-presentation-6"></a><strong>Data Presentation</strong></h3> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/4/7/472aee2a6ebf04518c8af69a3acf2ee09fcc9be4.jpeg" rel="noopener nofollow ugc" title="image-20240307183526584"><img alt="image-20240307183526584" height="500" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/4/7/472aee2a6ebf04518c8af69a3acf2ee09fcc9be4.jpeg" width="602" /></a></div><p></p> +<p>The figure above displays the data collected during one demonstration with the AgileX Cobot Magic arms.
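</p>
<p>To make the episode format concrete, here is a hypothetical Python sketch of inspecting such a recording. The list-of-timesteps layout below simply mirrors the description in this post; the actual Cobot Magic / act-plus-plus storage format may differ.</p>

```python
# Hypothetical inspection sketch: an episode is modeled as a list of
# timesteps, each holding 14 joint positions, matching the post's
# description of the collected data.

def joint_ranges(episode):
    """Return the (min, max) position per joint across an episode."""
    n_joints = len(episode[0])
    ranges = []
    for j in range(n_joints):
        values = [step[j] for step in episode]
        ranges.append((min(values), max(values)))
    return ranges

# Three timesteps of a toy episode: joints 0-12 drift slightly,
# while joint 13 moves the most.
episode = [
    [0.00] * 14,
    [0.05] * 13 + [0.30],
    [0.10] * 13 + [0.60],
]
ranges = joint_ranges(episode)
print(ranges[0])   # (0.0, 0.1)
print(ranges[13])  # (0.0, 0.6)
```

<p>A quick range check like this is a cheap way to spot a dead joint or a truncated recording before spending time on training.</p>
<p>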
The collected data includes the position information of the 14 joints over time.</p> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/0/1/0107b4dc478e3d9069119b915825a2daba5202a7.jpeg" rel="noopener nofollow ugc" title="img"><img alt="img" height="318" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/0/1/0107b4dc478e3d9069119b915825a2daba5202a7.jpeg" width="690" /></a></div><p></p> +<h3><a class="anchor" href="https://discourse.ros.org#summary-7" name="summary-7"></a><strong>Summary</strong></h3> +<p>Cobot Magic is a remote whole-body data collection device, developed by AgileX Robotics based on the Mobile Aloha project from Stanford University. With Cobot Magic, AgileX Robotics has successfully deployed the Stanford laboratory's open-source Mobile Aloha code on its own platform. Data collection is no longer limited to desktops or specific surfaces thanks to the mobile base Tracer on the Cobot Magic, which enhances the richness and diversity of collected data.</p> +<p>AgileX will continue to collect data from various motion tasks based on Cobot Magic for model training and inference. Please stay tuned for updates on <a href="https://github.com/agilexrobotics?tab=repositories" rel="noopener nofollow ugc">GitHub</a>.</p> +<h3><a class="anchor" href="https://discourse.ros.org#about-agilex-8" name="about-agilex-8"></a><strong>About AgileX</strong></h3> +<p>Established in 2016, AgileX Robotics is a leading manufacturer of mobile robot platforms and a provider of unmanned system solutions. The company specializes in independently developed multi-mode wheeled and tracked wire-controlled chassis technology and has obtained multiple international certifications.
AgileX Robotics offers users self-developed innovative application solutions such as autonomous driving, mobile grasping, and navigation and positioning, helping users in various industries achieve automation. Additionally, AgileX Robotics has introduced research and education software and hardware products related to machine learning, embodied intelligence, and visual algorithms. The company works closely with research and educational institutions to promote robotics technology teaching and innovation.</p> +<h3><a class="anchor" href="https://discourse.ros.org#appendix-9" name="appendix-9"></a><strong>Appendix</strong></h3> +<p><strong>ros_astra_camera configuration</strong></p> +<p><a href="https://github.com/orbbec/ros_astra_camera" rel="noopener nofollow ugc">ros_astra_camera-github</a>, <a href="https://gitee.com/orbbecdeveloper/OrbbecSDK_ROS1" rel="noopener nofollow ugc">ros_astra_camera-gitee</a></p> +<p>● <strong>Camera Parameters</strong></p> +<div class="md-table"> +<table> +<thead> +<tr> +<th>Name</th> +<th>Parameters</th> +</tr> +</thead> +<tbody> +<tr> +<td>Baseline</td> +<td>40 mm</td> +</tr> +<tr> +<td>Depth distance</td> +<td>0.3-3 m</td> +</tr> +<tr> +<td>Depth map resolution</td> +<td>640x400@30fps, 320x200@30fps</td> +</tr> +<tr> +<td>Color image resolution</td> +<td>1920x1080@30fps, 1280x720@30fps, 640x480@30fps</td> +</tr> +<tr> +<td>Accuracy</td> +<td>6mm@1m (81% FOV area in accuracy calculation)</td> +</tr> +<tr> +<td>Depth FOV</td> +<td>H 67.9° V 45.3°</td> +</tr> +<tr> +<td>Color FOV</td> +<td>H 71° V 43.7° @ 1920x1080</td> +</tr> +<tr> +<td>Delay</td> +<td>30-45 ms</td> +</tr> +<tr> +<td>Data transmission</td> +<td>USB 2.0 or above</td> +</tr> +<tr> +<td>Working temperature</td> +<td>10°C~40°C</td> +</tr> +<tr> +<td>Size</td> +<td>Length 59.5 x Width 17.4 x Thickness 11.1 mm</td> +</tr> +</tbody> +</table> +</div><ol> +<li>OrbbecSDK_ROS1 driver installation</li> +</ol> +<pre><code class="lang-auto"># 1 Install dependencies +sudo apt install libgflags-dev
ros-$ROS_DISTRO-image-geometry ros-$ROS_DISTRO-camera-info-manager ros-$ROS_DISTRO-image-transport ros-$ROS_DISTRO-image-publisher ros-$ROS_DISTRO-libuvc-ros libgoogle-glog-dev libusb-1.0-0-dev libeigen3-dev + +# 2 Download the code +## 2.1 github +git clone https://github.com/orbbec/OrbbecSDK_ROS1.git astra_ws/src +## 2.2 gitee (China region) +git clone https://gitee.com/orbbecdeveloper/OrbbecSDK_ROS1 -b v1.4.6 astra_ws/src + +# 3 Compile orbbec_camera +## 3.1 Enter the astra_ws workspace +cd astra_ws +## 3.2 Compile orbbec_camera +catkin_make + +# 4 Install udev rules +source devel/setup.bash &amp;&amp; rospack list +roscd orbbec_camera/scripts +sudo cp 99-obsensor-libusb.rules /etc/udev/rules.d/99-obsensor-libusb.rules +sudo udevadm control --reload &amp;&amp; sudo udevadm trigger + +# 5 Add the ros_astra_camera package environment variables +## 5.1 Enter astra_ws +cd astra_ws +## 5.2 Add the environment variables +echo "source $(pwd)/devel/setup.bash" &gt;&gt; ~/.bashrc +## 5.3 Apply the environment variables +source ~/.bashrc + +# 6 Launch +## If step 5 was skipped, the command in 6.2 must be run in every new terminal to make the ROS workspace environment take effect. +## 6.1 Enter astra_ws +cd astra_ws +## 6.2 Source the workspace +source devel/setup.bash +## 6.3 Launch astra.launch +roslaunch orbbec_camera astra.launch +## 6.4 Launch dabai.launch +roslaunch orbbec_camera dabai.launch +</code></pre> +<ol> +<li>Configure orbbec_camera multiple camera nodes</li> +</ol> +<p>① Check the device serial number</p> +<p>● After installing the camera, run the following command:</p> +<pre><code class="lang-auto">rosrun orbbec_camera list_devices_node | grep -i serial +</code></pre> +<p>● Output in the terminal:</p> +<pre><code class="lang-auto">[ INFO] [1709728787.207920484]: serial: AU1P32201SA +# Please record this serial number. Each camera has a unique serial number.
+</code></pre> +<p>② Configure multiple camera nodes</p> +<p>● cobot_magic uses three orbbec_camera Dabai cameras, so each camera must be configured according to its serial number.</p> +<p>● Plug the USB data cables of the three cameras into the industrial PC and run the command from step ① to read the serial numbers of the three cameras.</p> +<p>● To make clear which topics correspond to which camera in subsequent development, please fill in the serial numbers in order.</p> +<p>● Create the multi_dabai.launch file in the astra_ws/src/launch directory with the following content:</p> +<pre><code class="lang-auto"># Mainly modify: 1 camera name, 2 serial number +&lt;launch&gt; + &lt;arg name="camera_name" default="camera"/&gt; + &lt;arg name="3d_sensor" default="dabai"/&gt; + + &lt;!-- 1 Camera name prefix --&gt; + &lt;arg name="camera1_prefix" default="01"/&gt; + &lt;arg name="camera2_prefix" default="02"/&gt; + &lt;arg name="camera3_prefix" default="03"/&gt; + + &lt;!-- 2 Serial number: fill in each camera's serial number --&gt; + &lt;arg name="camera1_usb_port" default="camera1 serial number"/&gt; + &lt;arg name="camera2_usb_port" default="camera2 serial number"/&gt; + &lt;arg name="camera3_usb_port" default="camera3 serial number"/&gt; + + &lt;arg name="device_num" default="3"/&gt; + &lt;include file="$(find orbbec_camera)/launch/$(arg 3d_sensor).launch"&gt; + &lt;arg name="camera_name" value="$(arg camera_name)_$(arg camera1_prefix)"/&gt; + &lt;arg name="usb_port" value="$(arg camera1_usb_port)"/&gt; + &lt;arg name="device_num" value="$(arg device_num)"/&gt; + &lt;/include&gt; + + &lt;include file="$(find orbbec_camera)/launch/$(arg 3d_sensor).launch"&gt; + &lt;arg name="camera_name" value="$(arg camera_name)_$(arg camera2_prefix)"/&gt; + &lt;arg name="usb_port" value="$(arg camera2_usb_port)"/&gt; + &lt;arg name="device_num" value="$(arg
device_num)"/&gt; + &lt;/include&gt; + + &lt;include file="$(find orbbec_camera)/launch/$(arg 3d_sensor).launch"&gt; + &lt;arg name="camera_name" value="$(arg camera_name)_$(arg camera3_prefix)"/&gt; + &lt;arg name="usb_port" value="$(arg camera3_usb_port)"/&gt; + &lt;arg name="device_num" value="$(arg device_num)"/&gt; + &lt;/include&gt; +&lt;/launch&gt; +</code></pre> +<p>● Add permissions</p> +<pre><code class="lang-auto"># 1 Enter astra_camera/launch/ +roscd orbbec_camera/launch/ + +# 2 multi_dabai.launch add permissions +chmod +x multi_dabai.launch +</code></pre> +<p>● Launch ros</p> +<pre><code class="lang-auto">roslaunch orbbec_camera multi_dabai.launch +</code></pre> + <p><small>1 post - 1 participant</small></p> + <p><a href="https://discourse.ros.org/t/cobot-magic-mobile-aloha-system-works-on-agilex-robotics-platform/36515">Read full topic</a></p> + 2024-03-08T02:01:53+00:00 + Agilex_Robotics + + + ROS Discourse General: New Packages for Noetic 2024-03-07 + https://discourse.ros.org/t/new-packages-for-noetic-2024-03-07/36514 + <p>We’re happy to announce <strong>10</strong> new packages and <strong>46</strong> updates are now available in ROS Noetic. 
This sync was tagged as <a href="https://github.com/ros/rosdistro/blob/noetic/2024-03-07/noetic/distribution.yaml" rel="noopener nofollow ugc"><code>noetic/2024-03-07</code></a>.</p> +<p>Thank you to every maintainer and contributor who made these updates available!</p> +<h2><a class="anchor" href="https://discourse.ros.org#package-updates-for-ros-noetic-1" name="package-updates-for-ros-noetic-1"></a>Package Updates for ROS Noetic</h2> +<h3><a class="anchor" href="https://discourse.ros.org#added-packages-10-2" name="added-packages-10-2"></a>Added Packages [10]:</h3> +<ul> +<li>ros-noetic-atf: 0.1.1-1</li> +<li>ros-noetic-atf-core: 0.1.1-1</li> +<li>ros-noetic-atf-metrics: 0.1.1-1</li> +<li>ros-noetic-atf-msgs: 0.1.1-1</li> +<li>ros-noetic-atf-plotter: 0.1.1-1</li> +<li>ros-noetic-atf-recorder-plugins: 0.1.1-1</li> +<li>ros-noetic-atf-test: 0.1.1-1</li> +<li>ros-noetic-atf-test-tools: 0.1.1-1</li> +<li>ros-noetic-etsi-its-rviz-plugins: 2.0.1-1</li> +<li><a href="http://wiki.ros.org/py_binding_tools">ros-noetic-py-binding-tools</a>: 1.0.0-1</li> +</ul> +<h3><a class="anchor" href="https://discourse.ros.org#updated-packages-46-3" name="updated-packages-46-3"></a>Updated Packages [46]:</h3> +<ul> +<li><a href="https://wiki.ros.org/cras_cpp_common">ros-noetic-cras-cpp-common</a>: 2.3.8-1 → 2.3.9-1</li> +<li>ros-noetic-cras-docs-common: 2.3.8-1 → 2.3.9-1</li> +<li><a href="https://wiki.ros.org/cras_py_common">ros-noetic-cras-py-common</a>: 2.3.8-1 → 2.3.9-1</li> +<li><a href="https://wiki.ros.org/cras_topic_tools">ros-noetic-cras-topic-tools</a>: 2.3.8-1 → 2.3.9-1</li> +<li>ros-noetic-etsi-its-cam-coding: 2.0.0-1 → 2.0.1-1</li> +<li>ros-noetic-etsi-its-cam-conversion: 2.0.0-1 → 2.0.1-1</li> +<li>ros-noetic-etsi-its-cam-msgs: 2.0.0-1 → 2.0.1-1</li> +<li>ros-noetic-etsi-its-coding: 2.0.0-1 → 2.0.1-1</li> +<li>ros-noetic-etsi-its-conversion: 2.0.0-1 → 2.0.1-1</li> +<li>ros-noetic-etsi-its-denm-coding: 2.0.0-1 → 2.0.1-1</li> +<li>ros-noetic-etsi-its-denm-conversion: 2.0.0-1 
→ 2.0.1-1</li> +<li>ros-noetic-etsi-its-denm-msgs: 2.0.0-1 → 2.0.1-1</li> +<li>ros-noetic-etsi-its-messages: 2.0.0-1 → 2.0.1-1</li> +<li>ros-noetic-etsi-its-msgs: 2.0.0-1 → 2.0.1-1</li> +<li>ros-noetic-etsi-its-msgs-utils: 2.0.0-1 → 2.0.1-1</li> +<li>ros-noetic-etsi-its-primitives-conversion: 2.0.0-1 → 2.0.1-1</li> +<li>ros-noetic-gnss-info: 1.0.1-1 → 1.0.2-1</li> +<li>ros-noetic-gnss-info-msgs: 1.0.1-1 → 1.0.2-1</li> +<li>ros-noetic-gnsstk-ros: 1.0.1-1 → 1.0.2-1</li> +<li><a href="https://wiki.ros.org/image_transport_codecs">ros-noetic-image-transport-codecs</a>: 2.3.8-1 → 2.3.9-1</li> +<li><a href="http://wiki.ros.org/mavros">ros-noetic-libmavconn</a>: 1.17.0-1 → 1.18.0-1</li> +<li><a href="https://mavlink.io/en/" rel="noopener nofollow ugc">ros-noetic-mavlink</a>: 2023.9.9-1 → 2024.3.3-1</li> +<li><a href="http://wiki.ros.org/mavros">ros-noetic-mavros</a>: 1.17.0-1 → 1.18.0-1</li> +<li><a href="http://wiki.ros.org/mavros_extras">ros-noetic-mavros-extras</a>: 1.17.0-1 → 1.18.0-1</li> +<li><a href="http://wiki.ros.org/mavros_msgs">ros-noetic-mavros-msgs</a>: 1.17.0-1 → 1.18.0-1</li> +<li><a href="https://www.mrpt.org/" rel="noopener nofollow ugc">ros-noetic-mrpt2</a>: 2.11.9-1 → 2.11.11-1</li> +<li><a href="https://wiki.ros.org/mvsim">ros-noetic-mvsim</a>: 0.8.3-1 → 0.9.1-2</li> +<li><a href="https://github.com/LORD-MicroStrain/ntrip_client" rel="noopener nofollow ugc">ros-noetic-ntrip-client</a>: 1.2.0-1 → 1.3.0-1</li> +<li><a href="http://introlab.github.io/rtabmap" rel="noopener nofollow ugc">ros-noetic-rtabmap</a>: 0.21.3-1 → 0.21.4-1</li> +<li>ros-noetic-rtabmap-conversions: 0.21.3-4 → 0.21.4-1</li> +<li>ros-noetic-rtabmap-costmap-plugins: 0.21.3-4 → 0.21.4-1</li> +<li>ros-noetic-rtabmap-demos: 0.21.3-4 → 0.21.4-1</li> +<li>ros-noetic-rtabmap-examples: 0.21.3-4 → 0.21.4-1</li> +<li>ros-noetic-rtabmap-launch: 0.21.3-4 → 0.21.4-1</li> +<li>ros-noetic-rtabmap-legacy: 0.21.3-4 → 0.21.4-1</li> +<li>ros-noetic-rtabmap-msgs: 0.21.3-4 → 0.21.4-1</li> 
+<li>ros-noetic-rtabmap-odom: 0.21.3-4 → 0.21.4-1</li> +<li>ros-noetic-rtabmap-python: 0.21.3-4 → 0.21.4-1</li> +<li>ros-noetic-rtabmap-ros: 0.21.3-4 → 0.21.4-1</li> +<li>ros-noetic-rtabmap-rviz-plugins: 0.21.3-4 → 0.21.4-1</li> +<li>ros-noetic-rtabmap-slam: 0.21.3-4 → 0.21.4-1</li> +<li>ros-noetic-rtabmap-sync: 0.21.3-4 → 0.21.4-1</li> +<li>ros-noetic-rtabmap-util: 0.21.3-4 → 0.21.4-1</li> +<li>ros-noetic-rtabmap-viz: 0.21.3-4 → 0.21.4-1</li> +<li>ros-noetic-test-mavros: 1.17.0-1 → 1.18.0-1</li> +<li><a href="http://wiki.ros.org/ur_client_library">ros-noetic-ur-client-library</a>: 1.3.4-1 → 1.3.5-1</li> +</ul> +<h3><a class="anchor" href="https://discourse.ros.org#removed-packages-0-4" name="removed-packages-0-4"></a>Removed Packages [0]:</h3> +<p>Thanks to all ROS maintainers who make packages available to the ROS community. The above list of packages was made possible by the work of the following maintainers:</p> +<ul> +<li>Felix Exner</li> +<li>Florian Weisshardt</li> +<li>Jean-Pierre Busch</li> +<li>Jose-Luis Blanco-Claraco</li> +<li>Martin Pecka</li> +<li>Mathieu Labbe</li> +<li>Rob Fisher</li> +<li>Robert Haschke</li> +<li>Vladimir Ermakov</li> +</ul> + <p><small>2 posts - 2 participants</small></p> + <p><a href="https://discourse.ros.org/t/new-packages-for-noetic-2024-03-07/36514">Read full topic</a></p> + 2024-03-08T01:28:07+00:00 + sloretz + + + ROS Discourse General: Interoperability Interest Group March 7, 2024: Standardizing Infrastructure Messages, Part 3 + https://discourse.ros.org/t/interoperability-interest-group-march-7-2024-standardizing-infrastructure-messages-part-3/36455 + <p><a href="https://github.com/osrf-sig-interoperability/community" rel="noopener nofollow ugc">Community Page </a></p> +<p><a href="https://meet.google.com/qoj-nfxx-bxy" rel="noopener nofollow ugc">Meeting Link </a></p> +<p><a href="https://calendar.google.com/calendar/u/0/embed?src=agf3kajirket8khktupm9go748@group.calendar.google.com" rel="noopener nofollow ugc">Calendar 
Link </a></p> +<p>Continuing our discussion from the last session, our next session will get into more depth on how errors for building infrastructure devices should be represented.</p> +<p>Some questions to consider:</p> +<ul> +<li>What level of detail needs to be standardized for error messages? +<ul> +<li>Is it enough to simply communicate that the device is unusable?</li> +<li>Should the standardized error messages also provide enough information for a technician to troubleshoot the device?</li> +<li>Should detailed troubleshooting information be provided through a separate non-standard channel instead?</li> +</ul> +</li> +<li>How efficient should error messages be? +<ul> +<li>A simple error code is high-performance and allows for millions of possible error types, but can only communicate the presence of one error at a time</li> +<li>Bitsets could express multiple simultaneous errors with high performance, but the number of error types is severely limited</li> +<li>Dynamic arrays of error codes can communicate many types of errors with no limit, but heap allocations are needed</li> +<li>A string of serialized JSON could represent unlimited types of errors and provide troubleshooting information for them, but heap allocation and string parsing are needed</li> +</ul> +</li> +<li>Should standardized error definitions be specific to each type of building device, or should the definitions be abstract enough to use across all/multiple devices? +<ul> +<li>E.g. are doors and elevators different enough that they need their own error code definitions?</li> +<li>What kinds of errors should we expect to report for each different type of device?</li> +</ul> +</li> +</ul> +<p>We will be seeking input on all of the above questions and more.
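<p>To make the efficiency trade-offs concrete, here is a small illustrative sketch contrasting a bitset encoding with a dynamic array of error codes. The error names are hypothetical, invented for illustration only; this is not a proposed standard.</p>

```python
# Hypothetical door-device error flags -- illustration only, not a standard.
DOOR_ERRORS = {
    "MOTOR_STALL": 1 << 0,
    "SENSOR_FAULT": 1 << 1,
    "OBSTRUCTION": 1 << 2,
    "COMMS_TIMEOUT": 1 << 3,
}

def encode_bitset(errors):
    """Pack multiple simultaneous errors into one fixed-size integer."""
    code = 0
    for name in errors:
        code |= DOOR_ERRORS[name]
    return code

def decode_bitset(code):
    """Recover the set of active errors from the packed integer."""
    return {name for name, bit in DOOR_ERRORS.items() if code & bit}

def encode_array(errors):
    """A dynamic array of error codes: unlimited error-type space,
    but requires a heap allocation per message."""
    return sorted(DOOR_ERRORS[name] for name in errors)

active = {"MOTOR_STALL", "OBSTRUCTION"}
packed = encode_bitset(active)
assert decode_bitset(packed) == active   # round-trips multiple errors
assert encode_array(active) == [1, 4]
```

The bitset fits in one machine word and needs no allocation, but its capacity is bounded by the word width; the array form removes that bound at the cost of variable-length messages.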
Please come armed with examples of your most hated device errors that you think a good standard should be able to express.</p> + <p><small>4 posts - 3 participants</small></p> + <p><a href="https://discourse.ros.org/t/interoperability-interest-group-march-7-2024-standardizing-infrastructure-messages-part-3/36455">Read full topic</a></p> + 2024-03-04T15:23:04+00:00 + grey + + + ROS Discourse General: ROS Mapping and Navigation with AgileX Robotics Limo + https://discourse.ros.org/t/ros-mapping-and-navigation-with-agilex-robotics-limo/36452 + <p>Limo is a smart educational robot developed by AgileX Robotics. For more details, please visit: <a href="https://global.agilex.ai/" rel="noopener nofollow ugc">https://global.agilex.ai/</a><br /> +</p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/0/6/0606141abd360c2207b55c28a051a29aad2cb9f5.jpeg" rel="noopener nofollow ugc" title="808f249cb1f134a0088fbe659c204d9b_original"><img alt="808f249cb1f134a0088fbe659c204d9b_original" height="500" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/0/6/0606141abd360c2207b55c28a051a29aad2cb9f5_2_383x500.jpeg" width="383" /></a></div><p></p> +<p>Four steering modes make LIMO substantially superior to other robots in its class. The available modes are: Omni-Wheel Steering, Tracked Steering, Four-Wheel Differential Steering and Ackermann Steering. These advanced steering modes plus a built-in 360° scanning LiDAR and RealSense infrared camera make the platform perfect for industrial and commercial tasks in any scenario.
With these incredible features, LIMO can achieve precise self-localization, SLAM mapping, route planning and autonomous obstacle avoidance, reverse parking, traffic light recognition, and more.<br /> +</p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/c/f/cf84a27f52dae814140c99ea284bc63816864697.jpeg" rel="noopener nofollow ugc" title="1_63677_74581"><img alt="1_63677_74581" height="304" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/c/f/cf84a27f52dae814140c99ea284bc63816864697_2_690x304.jpeg" width="690" /></a></div><p></p> +<h1><a class="anchor" href="https://discourse.ros.org#mapping-1" name="mapping-1"></a><strong>Mapping</strong></h1> +<h2><a class="anchor" href="https://discourse.ros.org#gmapping-2" name="gmapping-2"></a>Gmapping</h2> +<p>Gmapping is a widely adopted open-source SLAM algorithm that operates within the filtering SLAM framework. It effectively uses wheel odometry data and does not heavily rely on high-frequency LiDAR scans. When constructing a map of a smaller environment, Gmapping requires minimal computational resources to maintain high accuracy. Here, the ROS Gmapping package is used to perform mapping with Limo.</p> +<p><strong>Note:</strong> Limo should move slowly during mapping. If the speed is too high, the quality of the map will suffer.</p> +<p>Run the following command in a new terminal to launch the LiDAR:</p> +<pre><code class="lang-auto">roslaunch limo_bringup limo_start.launch pub_odom_tf:=false +</code></pre> +<p>Then launch the gmapping algorithm. Open another new terminal, and enter the command:</p> +<pre><code class="lang-auto">roslaunch limo_bringup limo_gmapping.launch +</code></pre> +<p>After launching successfully, the rviz visualization tool will start up.
The interface is shown in the figure.<br /> +</p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/9/d/9de4e145dccc9ebb806ef5ec3f8094c11d1859ab.png" rel="noopener nofollow ugc" title="gmapping"><img alt="gmapping" height="391" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/9/d/9de4e145dccc9ebb806ef5ec3f8094c11d1859ab_2_690x391.png" width="690" /></a></div><p></p> +<p>At this point, set the handle to remote control mode and drive Limo around to build the map.</p> +<p>After building the map, run the following commands to save it to the specified directory:</p> +<ol> +<li>Switch to the directory where the map should be saved, here ~/agilex_ws/src/limo_ros/limo_bringup/maps/, by entering the command in the terminal:</li> +</ol> +<pre><code class="lang-auto">cd ~/agilex_ws/src/limo_ros/limo_bringup/maps/ +</code></pre> +<ol start="2"> +<li>After switching to the maps directory, continue by entering the command in the terminal:</li> +</ol> +<pre><code class="lang-auto">rosrun map_server map_saver -f map1 +</code></pre> +<p><strong>Note:</strong> map1 is the name of the saved map, and duplicate names should be avoided when saving maps.</p> +<h2><a class="anchor" href="https://discourse.ros.org#cartographer-3" name="cartographer-3"></a>Cartographer</h2> +<p>Cartographer is a set of SLAM algorithms based on graph optimization, launched by Google. Its main goal is to achieve real-time SLAM with low computational resource consumption. The algorithm is divided into two parts. The first part, called Local SLAM, builds and maintains a series of submaps from each laser scan frame; each submap is a grid map.
The second part, called Global SLAM, performs loop closure detection to eliminate accumulated errors: once a submap is finished, no new laser scans are inserted into it, and the algorithm adds the submap to loop closure detection.</p> +<p><strong>Note:</strong> Before running the commands, please make sure that the programs in other terminals have been terminated by pressing Ctrl+c.</p> +<p><strong>Note:</strong> Limo should move slowly during mapping. If the speed is too high, the quality of the map will suffer.</p> +<p>Launch a new terminal and enter the command:</p> +<pre><code class="lang-auto">roslaunch limo_bringup limo_start.launch pub_odom_tf:=false +</code></pre> +<p>Then start the cartographer mapping algorithm. Open another new terminal and enter the command:</p> +<pre><code class="lang-auto">roslaunch limo_bringup limo_cartographer.launch +</code></pre> +<p>After launching successfully, the rviz visualization interface will be shown as in the figure below:<br /> +</p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/d/c/dc83e862c3071db60baa93c1e067da859514e4ed.jpeg" rel="noopener nofollow ugc" title="carto_1"><img alt="carto_1" height="500" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/d/c/dc83e862c3071db60baa93c1e067da859514e4ed_2_661x500.jpeg" width="661" /></a></div><p></p> +<p>After building the map, it is necessary to save it.
The following three commands need to be entered in the terminal:</p> +<p>(1) After completing the trajectory, no further data should be accepted:</p> +<pre><code class="lang-auto">rosservice call /finish_trajectory 0 +</code></pre> +<p>(2) Serialize and save its current state:</p> +<pre><code class="lang-auto">rosservice call /write_state "{filename: '${HOME}/agilex_ws/src/limo_ros/limo_bringup/maps/mymap.pbstream'}" +</code></pre> +<p>(3) Convert the pbstream to pgm and yaml:</p> +<pre><code class="lang-auto">rosrun cartographer_ros cartographer_pbstream_to_ros_map -map_filestem=${HOME}/agilex_ws/src/limo_ros/limo_bringup/maps/mymap -pbstream_filename=${HOME}/agilex_ws/src/limo_ros/limo_bringup/maps/mymap.pbstream -resolution=0.05 +</code></pre> +<p>This generates the corresponding pgm and yaml files in the directory:</p> +<p>${HOME}/agilex_ws/src/limo_ros/limo_bringup/maps/</p> +<p>Note: During mapping, some warnings will appear in the terminal. They are caused by excessive speed and delayed data processing and can be ignored.<br /> +</p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/3/2/32bd0fbb75ffcfa0caedc3c4eb0a48a1989617e3.png" rel="noopener nofollow ugc" title="carto_2"><img alt="carto_2" height="220" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/3/2/32bd0fbb75ffcfa0caedc3c4eb0a48a1989617e3_2_690x220.png" width="690" /></a></div><p></p> +<h1><a class="anchor" href="https://discourse.ros.org#navigation-4" name="navigation-4"></a>Navigation</h1> +<h3><a class="anchor" href="https://discourse.ros.org#navigation-framework-5" name="navigation-framework-5"></a>Navigation framework</h3> +<p>The key to navigation is robot positioning and path planning.
For these, ROS provides the following two packages.</p> +<p>(1) move_base: achieves optimal path planning in robot navigation.</p> +<p>(2) amcl: achieves robot localization on a two-dimensional map.</p> +<p>On the basis of these two packages, ROS provides a complete navigation framework.<br /> +</p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/7/7/7741fb79c9ba4edc70d038066e1471fb5b22ad1e.png" rel="noopener nofollow ugc" title="ROS 导航框架"><img alt="ROS 导航框架" height="291" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/7/7/7741fb79c9ba4edc70d038066e1471fb5b22ad1e_2_690x291.png" width="690" /></a></div><br /> +The robot only needs to publish the necessary sensor information and a navigation goal position, and ROS can complete the navigation function. In this framework, the move_base package provides the main operation and interaction interface for navigation. To ensure the accuracy of the navigation path, the robot also needs to accurately locate its own position; this part of the functionality is implemented by the amcl package.<p></p> +<h4><a class="anchor" href="https://discourse.ros.org#h-11-move_base-package-6" name="h-11-move_base-package-6"></a>1.1 move_base package</h4> +<p>move_base is a package for path planning in ROS, mainly composed of the following two planners.</p> +<p>(1) Global path planning (global_planner). Global path planning computes an overall path according to a given goal position and the global map. In navigation, the Dijkstra or A* algorithm is used for global path planning, and the optimal route from the robot to the goal position is calculated as the robot’s global path.</p> +<p>(2) Local real-time planning (local_planner). In practice, robots often cannot strictly follow the global path.
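<p>As an aside, the global planning step described above is easy to sketch. The following is a minimal Dijkstra search on a 4-connected occupancy grid: a simplified, illustrative stand-in for what global_planner does on the real costmap, not the actual implementation.</p>

```python
import heapq

def dijkstra_grid(grid, start, goal):
    """Minimal global-planner sketch: Dijkstra on a 4-connected occupancy
    grid (0 = free, 1 = obstacle); unit cost per step."""
    rows, cols = len(grid), len(grid[0])
    dist = {start: 0}
    prev = {}
    pq = [(0, start)]
    while pq:
        d, cell = heapq.heappop(pq)
        if cell == goal:
            break
        if d > dist.get(cell, float("inf")):
            continue  # stale queue entry
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                nd = d + 1
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = cell
                    heapq.heappush(pq, (nd, (nr, nc)))
    if goal not in dist:
        return None  # no path exists
    path = [goal]
    while path[-1] != start:
        path.append(prev[path[-1]])
    return path[::-1]

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = dijkstra_grid(grid, (0, 0), (2, 0))
assert path[0] == (0, 0) and path[-1] == (2, 0)
assert len(path) == 7  # forced around the obstacle row
```

A* is the same search with a goal-distance heuristic added to the queue priority, which is why the two are interchangeable in global_planner.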
So it is necessary to plan, in each control cycle, the path the robot should travel according to the map information and any obstacles that may appear near the robot, so that it conforms to the globally optimal path as much as possible.</p> +<h4><a class="anchor" href="https://discourse.ros.org#h-12-amcl-package-7" name="h-12-amcl-package-7"></a>1.2 amcl package</h4> +<p>Autonomous localization means that the robot can compute its position on the map in any state. For this, ROS provides developers with amcl, a probabilistic system for localizing mobile robots in 2D; it implements adaptive (KLD-sampling) Monte Carlo localization, using a particle filter to track the pose of the robot on a known map.</p> +<h4><a class="anchor" href="https://discourse.ros.org#h-13-introduction-of-dwa_planner-and-teb_planner-8" name="h-13-introduction-of-dwa_planner-and-teb_planner-8"></a>1.3 Introduction of DWA_planner and TEB_planner</h4> +<h5><a class="anchor" href="https://discourse.ros.org#dwa_planner-9" name="dwa_planner-9"></a>DWA_planner</h5> +<p>The full name of DWA is Dynamic Window Approach. The algorithm samples multiple candidate trajectories, selects the optimal one based on various evaluation criteria (whether it would hit an obstacle, the time required, etc.), and computes the linear and angular velocities for the current driving cycle so as to avoid collisions with dynamic obstacles.</p> +<h5><a class="anchor" href="https://discourse.ros.org#teb_planner-10" name="teb_planner-10"></a>TEB_planner</h5> +<p>The full name of TEB is Timed Elastic Band. This method modifies the initial trajectory generated by the global path planner to optimize the robot’s motion trajectory and belongs to local path planning.
In the process of trajectory optimization, the algorithm has a variety of optimization goals, including but not limited to: overall path length, trajectory running time, distance to obstacles, passing intermediate path points, and compliance with robot dynamics, kinematics, and geometric constraints. The TEB method explicitly considers the dynamic constraints of time and space in the state of motion; for example, the velocity and acceleration of the robot are limited.</p> +<h3><a class="anchor" href="https://discourse.ros.org#limo-navigation-11" name="limo-navigation-11"></a>Limo navigation</h3> +<p><strong>Note:</strong> In four-wheel differential mode, omnidirectional wheel mode and tracked mode, the launch file used for navigation is the same.</p> +<p><strong>Note:</strong> Before running the commands, please make sure that the programs in other terminals have been terminated by pressing Ctrl+c.</p> +<p>(1) First launch the LiDAR and enter the command in the terminal:</p> +<pre><code class="lang-auto">roslaunch limo_bringup limo_start.launch pub_odom_tf:=false +</code></pre> +<p>(2) Launch the navigation and enter the command in the terminal:</p> +<pre><code class="lang-auto">roslaunch limo_bringup limo_navigation_diff.launch +</code></pre> +<p><strong>Note:</strong> If Limo is in Ackermann motion mode, please run:</p> +<pre><code class="lang-auto">roslaunch limo_bringup limo_navigation_ackerman.launch +</code></pre> +<p>After launching successfully, the rviz interface will be shown in the figure below:<br /> +</p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/c/b/cbf90b32a8bb75cce05af0c01f5094809e1f4069.jpeg" rel="noopener nofollow ugc" title="navi_1"><img alt="navi_1" height="388" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/c/b/cbf90b32a8bb75cce05af0c01f5094809e1f4069_2_690x388.jpeg" width="690" /></a></div><p></p> +<p><strong>Note:</strong>
If you need to use a custom map, please open the limo_navigation_diff.launch file to modify the parameters. The file directory is: ~/agilex_ws/src/limo_ros/limo_bringup/launch. Please change map02 to the name of the map that should be used instead.</p> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/4/0/4076ecf0b20ad45b7ab64dda8e6707572a3f8593.png" rel="noopener nofollow ugc" title="navi_diff"><img alt="navi_diff" height="391" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/4/0/4076ecf0b20ad45b7ab64dda8e6707572a3f8593_2_690x391.png" width="690" /></a></div><p></p> +<p>(3) After launching the navigation, it may be observed that the laser-scanned shape does not align with the map, requiring manual correction. To rectify this, adjust the actual position of the chassis in the scene displayed on the rviz map. Use the rviz tools to designate an approximate position for the vehicle, providing it with a preliminary estimation. Subsequently, use the handle tool to remotely rotate the vehicle until automatic alignment is achieved. Once the laser shape overlaps with the scene shape on the map, the correction process is concluded.
The operational steps are outlined as follows:<br /> +</p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/d/d/dddc6afc5b57a2d16e952c0417769f606289bd3c.jpeg" rel="noopener nofollow ugc" title="limo_tu_2"><img alt="limo_tu_2" height="384" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/d/d/dddc6afc5b57a2d16e952c0417769f606289bd3c_2_690x384.jpeg" width="690" /></a></div><p></p> +<p>The correction is completed:</p> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/5/3/539627d512375c5538e1891d2764a830a018e691.jpeg" rel="noopener nofollow ugc" title="navi3"><img alt="navi3" height="388" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/5/3/539627d512375c5538e1891d2764a830a018e691_2_690x388.jpeg" width="690" /></a></div><p></p> +<p>(4)Set the navigation goal point through 2D Nav Goal.<br /> +</p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/c/0/c087f25803dfbb510b6f973eeef9c20e8be5e507.jpeg" rel="noopener nofollow ugc" title="limo_tu_3"><img alt="limo_tu_3" height="382" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/c/0/c087f25803dfbb510b6f973eeef9c20e8be5e507_2_690x382.jpeg" width="690" /></a></div><p></p> +<p>A purple path will be generated on the map. 
Switch the handle to command mode, and Limo will automatically navigate to the goal point.</p> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/1/a/1a70d62038c185a244bc7b1a0d4de413ad8de05c.jpeg" rel="noopener nofollow ugc" title="navi_5"><img alt="navi_5" height="388" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/1/a/1a70d62038c185a244bc7b1a0d4de413ad8de05c_2_690x388.jpeg" width="690" /></a></div><p></p> +<h1><a class="anchor" href="https://discourse.ros.org#limo-path-inspection-12" name="limo-path-inspection-12"></a>Limo path inspection</h1> +<p>(1)First launch the LiDAR and enter the command in the terminal:</p> +<pre><code class="lang-auto">roslaunch limo_bringup limo_start.launch pub_odom_tf:=false +</code></pre> +<p>(2)Launch the navigation and enter the command in the terminal:</p> +<pre><code class="lang-auto">roslaunch limo_bringup limo_navigation_diff.launch +</code></pre> +<p><strong>Note:</strong> If it is Ackermann motion mode, please run:</p> +<pre><code class="lang-auto">roslaunch limo_bringup limo_navigation_ackerman.launch +</code></pre> +<p>(3)Launch the path recording function. Open a new terminal, and enter the command in the terminal:</p> +<pre><code class="lang-auto">roslaunch agilex_pure_pursuit record_path.launch +</code></pre> +<p>After the path recording is completed, terminate the path recording program, and enter the command in the terminal: Ctrl+c.</p> +<p>(4)Launch the path inspection function. 
Open a new terminal, and enter the command in the terminal:</p> +<p><strong>Note:</strong> Switch the handle to command mode.</p> +<pre><code class="lang-auto">roslaunch agilex_pure_pursuit pure_pursuit.launch +</code></pre> + <p><small>2 posts - 2 participants</small></p> + <p><a href="https://discourse.ros.org/t/ros-mapping-and-navigation-with-agilex-robotics-limo/36452">Read full topic</a></p> + 2024-03-04T07:28:26+00:00 + Agilex_Robotics + + + ROS Discourse General: ROS Meetup Arab + https://discourse.ros.org/t/ros-meetup-arab/36439 + <p>We’re excited to introduce the forthcoming installment of our Arabian Meet series, centered around the captivating theme of “Autonomous Racing: Advancing the Frontiers of Automated Technology.”</p> +<p>The topics we’ll explore include :</p> +<ul> +<li>Introduction to Autonomous Racing.</li> +<li>Autonomous Racing Competitions.</li> +<li>Racing Cars &amp; Sensor Technologies.</li> +<li>ROS-Based Racing Simulator.</li> +<li>Autonomous Racing Software Architecture.</li> +</ul> +<p>Stay tuned for more updates and save the date for this enlightening conversation! 
<img alt=":spiral_calendar:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/spiral_calendar.png?v=12" title=":spiral_calendar:" width="20" /></p> +<p>save time on the calendar:</p><aside class="onebox allowlistedgeneric"> + <header class="source"> + + <a href="https://accounts.google.com/v3/signin/identifier?continue=https%3A%2F%2Fcalendar.google.com%2Fcalendar%2Fevent%3Faction%3DTEMPLATE%26tmeid%3DMGhjNWNodTNya2MxMHEycW9vNHBkMGFycmQgYXJhYnJvYm9lbnRodXNpYXN0QG0%26tmsrc%3Darabroboenthusiast%40gmail.com&amp;emr=1&amp;followup=https%3A%2F%2Fcalendar.google.com%2Fcalendar%2Fevent%3Faction%3DTEMPLATE%26tmeid%3DMGhjNWNodTNya2MxMHEycW9vNHBkMGFycmQgYXJhYnJvYm9lbnRodXNpYXN0QG0%26tmsrc%3Darabroboenthusiast%40gmail.com&amp;ifkv=ATuJsjxgQkUkOFW5ri_9_zB_Q28be014orSHg4eFYV2ZWt5CNfRXOcqrlZmxblEmLhDYw8zd5Arlgg&amp;osid=1&amp;passive=1209600&amp;service=cl&amp;flowName=WebLiteSignIn&amp;flowEntry=ServiceLogin&amp;dsh=S-1425911791%3A1709450102390372" rel="noopener nofollow ugc" target="_blank">accounts.google.com</a> + </header> + + <article class="onebox-body"> + + +<h3><a href="https://accounts.google.com/v3/signin/identifier?continue=https%3A%2F%2Fcalendar.google.com%2Fcalendar%2Fevent%3Faction%3DTEMPLATE%26tmeid%3DMGhjNWNodTNya2MxMHEycW9vNHBkMGFycmQgYXJhYnJvYm9lbnRodXNpYXN0QG0%26tmsrc%3Darabroboenthusiast%40gmail.com&amp;emr=1&amp;followup=https%3A%2F%2Fcalendar.google.com%2Fcalendar%2Fevent%3Faction%3DTEMPLATE%26tmeid%3DMGhjNWNodTNya2MxMHEycW9vNHBkMGFycmQgYXJhYnJvYm9lbnRodXNpYXN0QG0%26tmsrc%3Darabroboenthusiast%40gmail.com&amp;ifkv=ATuJsjxgQkUkOFW5ri_9_zB_Q28be014orSHg4eFYV2ZWt5CNfRXOcqrlZmxblEmLhDYw8zd5Arlgg&amp;osid=1&amp;passive=1209600&amp;service=cl&amp;flowName=WebLiteSignIn&amp;flowEntry=ServiceLogin&amp;dsh=S-1425911791%3A1709450102390372" rel="noopener nofollow ugc" target="_blank">Google Calendar - Sign in to Access &amp; Edit Your Schedule</a></h3> + + <p>Access Google Calendar with a Google account (for personal use) or Google Workspace 
account (for business use).</p> + + + </article> + + <div class="onebox-metadata"> + + + </div> + + <div style="clear: both;"></div> +</aside> +<p> +You can find the meeting link here:</p><aside class="onebox allowlistedgeneric"> + <header class="source"> + <img class="site-icon" height="32" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/9/e/9ed0292ad9ae04ccaa893508cebd1fd741a2ba80.png" width="32" /> + + <a href="https://meet.google.com/ecu-cseq-hhv" rel="noopener nofollow ugc" target="_blank">Google Workspace</a> + </header> + + <article class="onebox-body"> + <div class="aspect-image"><img class="thumbnail" height="362" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/0/e/0ec45595982470ef64dcc81caf9a6f8837a2f064_2_690x362.jpeg" width="690" /></div> + +<h3><a href="https://meet.google.com/ecu-cseq-hhv" rel="noopener nofollow ugc" target="_blank">Google Meet: Online Web and Video Conferencing Calls | Google Workspace</a></h3> + + <p>Use Google Meet for secure online web conferencing calls and video chat as a part of Google Workspace.</p> + + + </article> + + <div class="onebox-metadata"> + + + </div> + + <div style="clear: both;"></div> +</aside> + +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/5/5/55c3ec018f4f6d71439c644e9551da08908f7e38.jpeg" rel="noopener nofollow ugc" title="image"><img alt="image" height="500" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/5/5/55c3ec018f4f6d71439c644e9551da08908f7e38.jpeg" width="500" /></a></div><p></p> + <p><small>1 post - 1 participant</small></p> + <p><a href="https://discourse.ros.org/t/ros-meetup-arab/36439">Read full topic</a></p> + 2024-03-03T07:16:59+00:00 + khaledgabr77 + + + ROS Discourse General: Potential Humanoid Robotics Monthly Working Group + https://discourse.ros.org/t/potential-humanoid-robotics-monthly-working-group/36426 + <p>Hi Everyone,</p> +<p>I 
want to introduce myself - my name is Ronaldson Bellande; I’m a PhD student and the founder/CEO/CTO/director of research organizations and a startup I’m working on. You can find more information about me on my <a href="https://www.linkedin.com/in/ronaldson-bellande-5b9699178" rel="noopener nofollow ugc">linkedin</a> and <a href="https://github.com/RonaldsonBellande" rel="noopener nofollow ugc">Github Profile</a></p> +<p>I wanted to create a working group that meets monthly to discuss humanoid robotics: what everyone is working on, what you are looking for and excited about, anything interesting you are working on, and more in the space of humanoid robotics.</p> +<p>If there is interest I will start a Working Group; I’m passionate about this subject and other related subjects.</p> + <p><small>13 posts - 8 participants</small></p> + <p><a href="https://discourse.ros.org/t/potential-humanoid-robotics-monthly-working-group/36426">Read full topic</a></p> + 2024-03-02T01:57:09+00:00 + RonaldsonBellande + + + ROS Discourse General: ROS News for the Week of February 26th, 2024 + https://discourse.ros.org/t/ros-news-for-the-week-of-february-26th-2024/36367 + <h1><a class="anchor" href="https://discourse.ros.org#ros-news-for-the-week-of-february-26th-2024-1" name="ros-news-for-the-week-of-february-26th-2024-1"></a>ROS News for the Week of February 26th, 2024</h1> +<br /> +<p><img alt="belt2" class="animated" height="288" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/b/7/b7eaaea21f96fb5a468b92cd87c38346522f10c3.gif" width="512" /></p> +<p>In manufacturing I’ve seen talented people do things with clever light placement that transform an extremely difficult computer vision task into something that’s easily solved. I came across this paper this week that does just that for the robotic manipulation of objects. 
The paper is titled, <a href="https://dgdm-robot.github.io/">“Dynamics-Guided Diffusion Model for Robot Manipulator Design”</a> and the authors use diffusion models to make simple grippers that can manipulate a specific object into a given pose. The results are pretty cool and could be very useful for any roboticist with a 3D printer.</p> +<br /> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/0/6/06808ab2cd53461d9e9170f3315f78f669ac702b.jpeg" title="image"><img alt="image" height="250" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/0/6/06808ab2cd53461d9e9170f3315f78f669ac702b_2_313x250.jpeg" width="313" /></a></div><br /> +<a href="https://arstechnica.com/ai/2024/02/amazon-to-spend-1-billion-on-startups-that-combine-ai-with-robots/">Amazon is putting up US$1B to fund startups that combine robotics and “AI.”</a> While regular startup investment has fallen off a bit, it looks like there are still funding opportunities for robotics companies.<p></p> +<br /> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/e/b/eb729571e2a23ef928c2429a0c5005f5763b4c1a.jpeg" title="image"><img alt="image" height="250" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/e/b/eb729571e2a23ef928c2429a0c5005f5763b4c1a_2_310x250.jpeg" width="310" /></a></div><br /> +Last week everyone was talking about how <a href="https://www.forbes.com/sites/greatspeculations/2024/02/26/ai-revolution-sparks-nvidias-historic-market-cap-achievement/?sh=31a262673362">NVIDIA’s market cap had hit US$2T</a>. 
<a href="https://www.linkedin.com/pulse/nvidia-open-navigation-collaborate-drive-new-mobile-amulya-vishwanath-pesqc/?trackingId=1qN86jkFQleBX%2FC%2Bx%2Fv%2BMA%3D%3D">According to this LinkedIn</a> post they are putting that money to good use by funding the development of the open source Nav2 project.<p></p> +<br /> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/1/e/1e1a12e89f98ef4f5ee95da821dabc380e6ceaaf.jpeg" title="fig_lvt2calib_overview"><img alt="fig_lvt2calib_overview" height="374" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/1/e/1e1a12e89f98ef4f5ee95da821dabc380e6ceaaf_2_416x374.jpeg" width="416" /></a></div><br /> +Cross-sensor calibration is a pain in the <img alt=":peach:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/peach.png?v=12" title=":peach:" width="20" />. A good robot model can only get you so far, and getting a bunch of sensor data to match up can be difficult for even the most seasoned engineers. The GitHub repository below attempts to build a toolbox to fix some of these problems. 
<a href="https://github.com/Clothooo/lvt2calib?tab=readme-ov-file">LVT2Calib: Automatic and Unified Extrinsic Calibration Toolbox for Different 3D LiDAR, Visual Camera and Thermal Camera</a> – <a href="https://www.researchgate.net/publication/371377845_L2V2T2Calib_Automatic_and_Unified_Extrinsic_Calibration_Toolbox_for_Different_3D_LiDAR_Visual_Camera_and_Thermal_Camera">paper</a><p></p> +<h1><a class="anchor" href="https://discourse.ros.org#events-2" name="events-2"></a>Events</h1> +<ul> +<li><a href="https://www.meetup.com/boulderisforrobots/events/299280969/">2024-03-06 Boulder is for Robots Meetup</a></li> +<li><a href="https://online-learning.tudelft.nl/courses/hello-real-world-with-ros-robot-operating-systems/">2024-03-14 TU Delft ROS MOOC (FREE!)</a></li> +<li><a href="https://rosindustrial.org/events/2024/ros-industrial-consortium-americas-2024-annual-meeting">2024-03-27 ROS Industrial Annual Meeting</a></li> +<li><a href="https://www.eventbrite.com/e/robotics-and-tech-happy-hour-tickets-835278218637?aff=soc">2024-03-27 Robotics Happy Hour Pittsburgh</a></li> +<li><a href="https://www.eventbrite.com/e/robots-on-ice-40-2024-tickets-815030266467">2024-04-21 Robots on Ice</a></li> +<li><a href="https://2024.oshwa.org/">2024-05-03 Open Hardware Summit Montreal</a></li> +<li><a href="https://www.tum-venture-labs.de/education/bots-bento-the-first-robotic-pallet-handler-competition-icra-2024/">2024-05-13 Bots &amp; Bento @ ICRA 2024</a></li> +<li><a href="https://discourse.ros.org/t/acm-sigsoft-summer-school-on-software-engineering-for-robotics-in-brussels/35992">2024-06-04 → 2024-06-08 ACM SIGSOFT Summer School on Software Engineering for Robotics</a></li> +<li><a href="https://www.agriculture-vision.com/">2024-06-?? 
Workshop on Agriculture Vision at CVPR 2024</a></li> +<li><a href="https://icra2024.rt-net.jp/archives/87">2024-06-16 Food Topping Challenge at ICRA 2024</a></li> +<li><a href="https://discourse.ros.org/t/roscon-france-2024/35222">2024-06-19 =&gt; 2024-06-20 ROSCon France</a></li> +<li><a href="https://sites.udel.edu/ceoe-able/able-summer-bootcamp/">2024-08-05 Autonomous Systems Bootcamp at Univ. Delaware</a>– <a href="https://www.youtube.com/watch?v=4ETDsBN2o8M">Video</a></li> +<li><a href="https://roscon.jp/2024/">2024-09-25 ROSConJP</a></li> +</ul> +<h1><a class="anchor" href="https://discourse.ros.org#news-3" name="news-3"></a>News</h1> +<ul> +<li><a href="https://github.com/Tanneguydv/pyolp_robotics">PyOLP – Python tool for Offline Robot Programming</a></li> +<li><a href="https://concreteproducts.com/index.php/2024/02/26/rebar-placement-robot-books-15-ton-day-on-florida-bridge-deck/">Rebar and Tie Robot</a></li> +<li><a href="https://www.therobotreport.com/apple-reportedly-pulls-plug-on-autonomous-vehicles/">Apple Car Canceled</a></li> +<li><a href="https://www.therobotreport.com/robco-raises-42-5m-for-automation-for-small-midsize-manufacturers/">RobCo Raises $42.5M</a></li> +<li><a href="https://spectrum.ieee.org/air-force-research-ares-os">Air Force Uses Robots for Lab Work</a></li> +<li><a href="https://arstechnica.com/ai/2024/02/amazon-to-spend-1-billion-on-startups-that-combine-ai-with-robots/">Amazon to drop $1B on Startups that use Robots and AI</a></li> +<li><a href="https://vimeo.com/917608513?share=copy">Gazebo Community Meeting on Pan-African Robotics Competition</a></li> +<li><a href="https://medium.com/toyotaresearch/meet-punyo-tris-soft-robot-for-whole-body-manipulation-research-949c934ac3d8">TRI’s Soft Robot Punyo</a></li> +<li><a href="https://discourse.ros.org/t/robot-fleet-management-make-vs-buy-an-alternative/36330">Fleet Managers: Build? Buy? 
A third thing?</a></li> +<li><a href="https://discourse.ros.org/t/roscon-france-j-28-pour-le-call-for-paper/36317">ROSCon France CFP is Now Open</a></li> +<li><a href="https://www.linkedin.com/pulse/nvidia-open-navigation-collaborate-drive-new-mobile-amulya-vishwanath-pesqc/?trackingId=1qN86jkFQleBX%2FC%2Bx%2Fv%2BMA%3D%3D">NVIDIA Support for Nav2 (LinkedIn)</a></li> +<li><a href="https://cvpr.thecvf.com/Conferences/2024">CVPR Tutorials and Workshops Announced</a></li> +<li><a href="https://svrobo.org/volunteer-with-silicon-valley-robotics/">SVR and IEEE RAS Volunteer Form (also free desks at Circuit launch)</a></li> +<li><a href="https://www.therobotreport.com/picknik-robotics-moveit-studio-is-now-moveit-pro/">Robot Report MoveIt Pro</a></li> +</ul> +<h1><a class="anchor" href="https://discourse.ros.org#ros-4" name="ros-4"></a>ROS</h1> +<ul> +<li><a href="https://vimeo.com/917307697?share=copy">Cloud Robotics Working Group Meeting Recording</a></li> +<li><a href="https://vimeo.com/917307697?share=copy">Updates to REP-147 to Improve Drone Performance</a></li> +<li><a href="https://discourse.ros.org/t/development-topics-for-aerial-robotics-indoor-navigation/36347">Development topics for Aerial Robotics - Indoor Navigation </a></li> +<li><a href="https://discourse.ros.org/t/new-packages-for-ros-2-rolling-ridley-2024-02-28/36358">5 new and 279 Updated Package for Rolling – Sync hold for 24.04 Migration</a></li> +<li><a href="https://discourse.ros.org/t/new-ros-enhancement-proposal-for-marine-robotics/36218">Marine Robotics Conventions REP Discussion</a></li> +<li><a href="https://discourse.ros.org/t/gtc-march-18-21-highlights-for-ros-ai-robotics/36274">NVIDIA GTC Highlights</a></li> +<li><a href="https://github.com/Clothooo/lvt2calib">Unified Calibration for 3D LiDARs, Cameras and Thermal Cameras</a></li> +<li><a href="https://github.com/scepter914/DepthAnything-ROS">Depth-Anything ROS</a></li> +<li><a 
href="https://github.com/tatsuyai713/RCL-like-Wrapper-for-Fast-DDS/">RCL Wrapper for Fast DDS</a></li> +<li><a href="https://github.com/alexklwong/awesome-state-of-depth-completion">Awesome State of Depth Completion</a></li> +<li><a href="https://www.youtube.com/watch?app=desktop&amp;v=rBPPuN-KQ08&amp;feature=youtu.be">Bug Fixes and Logging with ROS 2</a></li> +<li><a href="https://github.com/TUMFTM/Multi_LiCa">Multi - LiDAR-to-LiDAR calibration framework for ROS2 and non-ROS applications</a></li> +<li><a href="https://github.com/RTI-BDI/ROS2-BDI">ROS 2 / PlanSys BDI Framework</a></li> +<li><a href="https://github.com/unitreerobotics/point_lio_unilidar">LiDAR Odometry for Unitree L1</a></li> +<li><a href="https://www.dihnamic.eu/fr/1603-2/">What can ROS bring to my industrial robots? (French)</a></li> +<li><a href="https://www.youtube.com/@ROSCon-India">ROSCon India 2023 Videos</a></li> +<li><a href="https://dgdm-robot.github.io">(COOL) Dynamics Guided Diffusion Model for Robot Manipulator Design</a></li> +<li><a href="https://github.com/teamspatzenhirn/rviz_2d_overlay_plugins">RViz 2D Overlay Plugins</a></li> +<li><a href="https://foxglove.dev/blog/announcing-h264-support-in-foxglove">H.264 Support in Foxglove </a></li> +<li><a href="https://www.youtube.com/watch?v=JIGO3b-aoz8">What to do when your robot’s camera doesn’t work</a></li> +<li><a href="https://fieldrobotics.net/Field_Robotics/Volume_4_files/Vol4_01.pdf">Fast and Modular Autonomy Software for Autonomous Racing Vehicles</a></li> +</ul> +<h1><a class="anchor" href="https://discourse.ros.org#ros-questions-5" name="ros-questions-5"></a>ROS Questions</h1> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/8/f/8fb2ed962f40d1b9346f5514c82d543fe805e6e8.jpeg" title="image"><img alt="image" height="250" 
src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/8/f/8fb2ed962f40d1b9346f5514c82d543fe805e6e8_2_186x250.jpeg" width="186" /></a></div><p></p> +<p>Got a minute? <a href="https://robotics.stackexchange.com/">Please take a moment to answer a question on Robotics Stack Exchange and help out your fellow ROS users.</a></p> + <p><small>1 post - 1 participant</small></p> + <p><a href="https://discourse.ros.org/t/ros-news-for-the-week-of-february-26th-2024/36367">Read full topic</a></p> + 2024-03-01T18:15:56+00:00 + Katherine_Scott + + + ROS Discourse General: Revival of client library working group? + https://discourse.ros.org/t/revival-of-client-library-working-group/36406 + <p>Hi,<br /> +are there any plans to revive this working group?</p> + <p><small>15 posts - 5 participants</small></p> + <p><a href="https://discourse.ros.org/t/revival-of-client-library-working-group/36406">Read full topic</a></p> + 2024-03-01T16:19:00+00:00 + JM_ROS + + + ROS Discourse General: Scalability issues with large number of nodes + https://discourse.ros.org/t/scalability-issues-with-large-number-of-nodes/36399 + <p>My team and I are developing a mobile platform for industrial tasks (such as rivet fastening or drilling), fully based on the ROS 2 stack (Humble).</p> +<p>The stack comprises a bunch of nodes for different tasks (slam, motion planning, fiducial registration…) which are coordinated through a state machine node (based on smach).</p> +<p>The issue we are facing is that the state machine node (which is connected to most of the nodes in the stack) gets slower and slower until it stops receiving events from other nodes.</p> +<p>We’ve been debugging this issue and our feeling is that the number of objects (nodes/clients/subscribers…) is too high and the whole stack suffers a lot of overhead, this being most noticeable in the “biggest” node (the state machine).</p> +<p>Our stack has 80 nodes, and a total of 1505 objects:</p> +<ul> +<li>Stack clients: 198</li> +<li>Stack 
services: 636</li> +<li>Stack publishers: 236</li> +<li>Stack subscribers: 173</li> +</ul> +<p>My questions are:</p> +<ul> +<li>Is this number of nodes too high for an industrial robotics project? How large are usually projects using ROS2?</li> +<li>Which is the maximum number of objects in the stack? Is this a rmw limitation or ROS2 itself?</li> +</ul> + <p><small>30 posts - 14 participants</small></p> + <p><a href="https://discourse.ros.org/t/scalability-issues-with-large-number-of-nodes/36399">Read full topic</a></p> + 2024-03-01T09:35:36+00:00 + leander2189 + + + ROS Discourse General: Robot Fleet Management: Make vs. Buy? An Alternative + https://discourse.ros.org/t/robot-fleet-management-make-vs-buy-an-alternative/36330 + <p>Virtually every robotics CTO we’ve spoken to has told us about this dilemma about fleet management systems: neither “make” nor “buy” are great options! With Transitive we are providing an alternative.</p> +<aside class="onebox allowlistedgeneric"> + <header class="source"> + <img class="site-icon" height="16" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/e/c/ec02994dee445cd82f49ef66d73f44e715a315b3.png" width="16" /> + + <a href="https://transitiverobotics.com/blog/make-vs-buy/" rel="noopener nofollow ugc" target="_blank" title="12:00AM - 26 February 2024">transitiverobotics.com – 26 Feb 24</a> + </header> + + <article class="onebox-body"> + <div class="aspect-image"><img class="thumbnail" height="362" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/3/0/30d7b39881ac5c6bfd5ca25ba8e15d49daee742f_2_690x362.png" width="690" /></div> + +<h3><a href="https://transitiverobotics.com/blog/make-vs-buy/" rel="noopener nofollow ugc" target="_blank">Robot Fleet Management: Make vs. Buy? An Alternative | Transitive Robotics</a></h3> + + <p>Transitive provides an alternative to the make vs. 
buy dilemma of robot fleet management.</p> + + + </article> + + <div class="onebox-metadata"> + + + </div> + + <div style="clear: both;"></div> +</aside> + + <p><small>8 posts - 5 participants</small></p> + <p><a href="https://discourse.ros.org/t/robot-fleet-management-make-vs-buy-an-alternative/36330">Read full topic</a></p> + 2024-02-26T23:00:34+00:00 + chfritz + + + ROS Discourse General: Rclcpp template metaprogramming bug. Help wanted + https://discourse.ros.org/t/rclcpp-template-metaprogramming-bug-help-wanted/36319 + <p>Hi,<br /> +we hit a bug in the function traits that is out of my league.<br /> +If you are really good with template metaprogramming, please have a look at:</p><aside class="onebox githubissue"> + <header class="source"> + + <a href="https://github.com/ros2/rclcpp/issues/2429" rel="noopener nofollow ugc" target="_blank">github.com/ros2/rclcpp</a> + </header> + + <article class="onebox-body"> + <div class="github-row"> + <div class="github-icon-container" title="Issue"> + <svg class="github-icon" height="60" viewBox="0 0 14 16" width="60" xmlns="http://www.w3.org/2000/svg"><path d="M7 2.3c3.14 0 5.7 2.56 5.7 5.7s-2.56 5.7-5.7 5.7A5.71 5.71 0 0 1 1.3 8c0-3.14 2.56-5.7 5.7-5.7zM7 1C3.14 1 0 4.14 0 8s3.14 7 7 7 7-3.14 7-7-3.14-7-7-7zm1 3H6v5h2V4zm0 6H6v2h2v-2z" fill-rule="evenodd"></path></svg> + </div> + + <div class="github-info-container"> + <h4> + <a href="https://github.com/ros2/rclcpp/issues/2429" rel="noopener nofollow ugc" target="_blank">AnySubscriptionCallback doesn't accept `std::bind` callbacks with bound arguments</a> + </h4> + + <div class="github-info"> + <div class="date"> + opened <span class="discourse-local-date">09:56AM - 20 Feb 24 UTC</span> + </div> + + + <div class="user"> + <a href="https://github.com/HovorunB" rel="noopener nofollow ugc" target="_blank"> + <img alt="HovorunB" class="onebox-avatar-inline" height="20" 
src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/a/0/a09013e7344330898234ab4ff20056eac37b1631.png" width="20" /> + HovorunB + </a> + </div> + </div> + + <div class="labels"> + </div> + </div> +</div> + + <div class="github-row"> + <p class="github-body-container">## Bug report + +**Required Info:** + +- Operating System: + - Ubuntu 22.04 +- <span class="show-more-container"><a class="show-more" href="https://discourse.ros.org" rel="noopener">…</a></span><span class="excerpt hidden">Installation type: + - from source +- Version or commit hash: + - rolling +- DDS implementation: + - Fast-RTPS +- Client library (if applicable): + - rclcpp + +#### Steps to reproduce issue + +1) Build rclcpp on a version after https://github.com/ros2/rclcpp/pull/1928 +2) Try to build for example [foxglove-bridge](https://github.com/foxglove/ros-foxglove-bridge) + +#### Expected behavior +`std::bind(&amp;FoxgloveBridge::rosMessageHandler, this, channelId, clientHandle, _1),` [(source)](https://github.com/foxglove/ros-foxglove-bridge/blob/89239eb5bbb8549fec08ade82254fabcf773cc37/ros2_foxglove_bridge/src/ros2_foxglove_bridge.cpp#L535-L538) is cast to `std::function` from [any_subscription_callback.hpp](https://github.com/ros2/rclcpp/blob/10252e9f66ac87f3903f301b64320d32457f0658/rclcpp/include/rclcpp/any_subscription_callback.hpp#L416) + + +#### Actual behavior +![Screenshot from 2024-02-19 10-56-06](https://github.com/ros2/rclcpp/assets/87417416/56f16069-6ab4-4a4d-ae28-a9865cafaf17) + +#### Additional information + +For example `std::bind(&amp;Class::method, this, std::placeholders::_1)` (without bound arguments) will build fine + +We also were able to fix the issue by casting the callback to `std::function` before passing it to the subscription +``` +auto subscriber = this-&gt;create_generic_subscription( + topic, datatype, qos, + 
static_cast&lt;std::function&lt;void(std::shared_ptr&lt;rclcpp::SerializedMessage&gt;)&gt;&gt;(std::bind(&amp;FoxgloveBridge::rosMessageHandler, this, channelId, clientHandle, _1)), + subscriptionOptions); +``` +Is this how it is supposed to be done now, or is there a bug in casting std::bind from any_subscription_callback.hpp?</span></p> + </div> + + </article> + + <div class="onebox-metadata"> + + + </div> + + <div style="clear: both;"></div> +</aside> +<p> +Thanks.</p> + <p><small>1 post - 1 participant</small></p> + <p><a href="https://discourse.ros.org/t/rclcpp-template-metaprogramming-bug-help-wanted/36319">Read full topic</a></p> + 2024-02-26T09:54:21+00:00 + JM_ROS + + + ROS Discourse General: ROS News for the Week of February 19th, 2024 + https://discourse.ros.org/t/ros-news-for-the-week-of-february-19th-2024/36297 + <h1><a class="anchor" href="https://discourse.ros.org#ros-news-for-the-week-of-february-19th-2024-1" name="ros-news-for-the-week-of-february-19th-2024-1"></a>ROS News for the Week of February 19th, 2024</h1> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/1/5/156ac623189dad6794d1ea4909bc5936954f0a75.jpeg" title="ROS &amp; Gazebo GSoC 2024 (4)"><img alt="ROS &amp; Gazebo GSoC 2024 (4)" height="291" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/1/5/156ac623189dad6794d1ea4909bc5936954f0a75_2_517x291.jpeg" width="517" /></a></div><br /> +<a href="https://discourse.ros.org/t/attention-students-open-robotics-google-summer-of-code-2024-projects/36271">Open Robotics will be participating in Google Summer of Code 2024.</a> We’re looking for a few interns to help us out! 
See the post for all the details.<p></p> +<br /> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/e/c/ec6167c7415dff0d97740f4de4ae75be20f9108d.jpeg" title="Copy of Feb24GCM"><img alt="Copy of Feb24GCM" height="291" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/e/c/ec6167c7415dff0d97740f4de4ae75be20f9108d_2_517x291.jpeg" width="517" /></a></div><p></p> +<p><a href="https://community.gazebosim.org/t/community-meeting-pan-african-robotics-competition-parc/2564">Our next Gazebo Community meeting is next Wednesday, February 28th. </a> Sikiru Salau, a competitor in the <a href="https://parcrobotics.org/">Pan-African Robotics Competition</a>, will be joining us to talk about simulating robots for agriculture.</p> +<br /> +<p><img alt="image" height="272" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/f/f/ff6ea0e30d4779c75920306594cb6795825311c7.jpeg" width="405" /><br /> +Hello Robot is having a great month! Last week they released their <a href="https://hello-robot.com/stretch-3-whats-new">third gen robot</a>. 
This week they are at the top of the <a href="https://news.ycombinator.com/item?id=39483482">orange website</a> with <a href="https://ok-robot.github.io">this “OK Robot” paper from NYU</a></p> +<br /> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/2/4/246645eb0057f4128b9c46ed3e3b62211654347f.jpeg" title="282791034-352fa4d7-270b-43e4-bd51-bcee4377b07a"><img alt="282791034-352fa4d7-270b-43e4-bd51-bcee4377b07a" height="286" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/2/4/246645eb0057f4128b9c46ed3e3b62211654347f_2_517x286.jpeg" width="517" /></a></div><br /> +<a href="https://github.com/JatinPatil2003/AutoNav">Check out the AutoNav robot by Jatin Patil.</a> Hats off to the developer, this is a really well put together personal project!<p></p> +<br /> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/5/5/5581295f1c63eb51bd8ab79e6bdf722d3c2ac818.jpeg" title="RIP"><img alt="RIP" height="250" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/5/5/5581295f1c63eb51bd8ab79e6bdf722d3c2ac818_2_250x250.jpeg" width="250" /></a></div><br /> +Just a reminder: <a href="https://discourse.ros.org/t/gazebo-classic-end-of-life-ros-2-jazzy/36239">Gazebo Classic goes End-Of-Life in January 2025 and ROS 2 Jazzy will not support Gazebo Classic.</a> We put together some guidance for those of you that need to make the switch!<p></p> +<h1><a class="anchor" href="https://discourse.ros.org#events-2" name="events-2"></a>Events</h1> +<ul> +<li><a href="https://discourse.ros.org/t/february-2024-meetings-aerial-robotics/35981">February Aerial Robotics Meetings</a></li> +<li><a href="https://community.gazebosim.org/t/community-meeting-pan-african-robotics-competition-parc/2564">2024-02-28 Gazebo Community Meeting wsg Sikiru Salau of Pan-African Robotics Competition</a></li> +<li><a 
href="https://www.meetup.com/boulderisforrobots/events/299280969/">2024-03-06 Boulder is for Robots Meetup</a></li> +<li><a href="https://online-learning.tudelft.nl/courses/hello-real-world-with-ros-robot-operating-systems/">2024-03-14 TU Delft ROS MOOC (FREE!)</a></li> +<li><a href="https://rosindustrial.org/events/2024/ros-industrial-consortium-americas-2024-annual-meeting">2024-03-27 ROS Industrial Annual Meeting</a></li> +<li><a href="https://www.eventbrite.com/e/robotics-and-tech-happy-hour-tickets-835278218637?aff=soc">2024-03-27 Robotics Happy Hour Pittsburgh</a></li> +<li><a href="https://www.eventbrite.com/e/robots-on-ice-40-2024-tickets-815030266467">2024-04-21 Robots on Ice</a></li> +<li><a href="https://2024.oshwa.org/">2024-05-03 Open Hardware Summit Montreal</a></li> +<li><a href="https://www.tum-venture-labs.de/education/bots-bento-the-first-robotic-pallet-handler-competition-icra-2024/">2024-05-13 Bots &amp; Bento @ ICRA 2024</a></li> +<li><a href="https://discourse.ros.org/t/acm-sigsoft-summer-school-on-software-engineering-for-robotics-in-brussels/35992">2024-06-04 → 2024-06-08 ACM SIGSOFT Summer School on Software Engineering for Robotics</a></li> +<li><a href="https://www.agriculture-vision.com/">2024-06-?? Workshop on Agriculture Vision at CVPR 2024</a></li> +<li><a href="https://icra2024.rt-net.jp/archives/87">2024-06-16 Food Topping Challenge at ICRA 2024</a></li> +<li><a href="https://discourse.ros.org/t/roscon-france-2024/35222">2024-06-19 → 2024-06-20 ROSCon France</a></li> +<li><a href="https://sites.udel.edu/ceoe-able/able-summer-bootcamp/">2024-08-05 Autonomous Systems Bootcamp at Univ. 
Delaware</a> – <a href="https://www.youtube.com/watch?v=4ETDsBN2o8M">Video</a></li> +<li><a href="https://roscon.jp/2024/">2024-09-25 ROSConJP</a></li> +</ul> +<h1><a class="anchor" href="https://discourse.ros.org#news-3" name="news-3"></a>News</h1> +<ul> +<li><a href="https://discourse.ros.org/t/announcing-moveit-pro-runtime-and-developer-platform-previously-moveit-studio/36235">MoveIt Pro Released</a> – <a href="https://www.therobotreport.com/picknik-robotics-moveit-studio-is-now-moveit-pro/">The Robot Report</a></li> +<li><a href="https://www.therobotreport.com/intuitive-machines-odysseus-makes-first-us-lunar-landing-50-years/">Intuitive Machines Lands on the Moon!</a> – <a href="https://spectrum.ieee.org/lunar-landing-intuitive-machines">IEEE Spectrum</a></li> +<li><a href="https://www.swri.org/industry/industrial-robotics-automation/blog/unveiling-novel-lunar-rover-navigation">SWRI’s Novel Lunar Rover Navigation System</a></li> +<li><a href="https://arxiv.org/abs/2402.13616">YOLO v. 
9 Paper</a> – <a href="https://github.com/WongKinYiu/yolov9">Source</a></li> +<li><a href="https://github.com/snt-arg/lidar_s_graphs/">Real Time S-Graphs for Robot Pose</a> – <a href="https://ieeexplore.ieee.org/document/10168233">Paper</a></li> +<li><a href="https://github.com/ywyeli/lidar-camera-placement">Influence of Camera-LiDAR Configuration on 3D Object Detection for Autonomous Driving</a></li> +<li><a href="https://ok-robot.github.io/">An open, modular framework for zero-shot, language conditioned pick-and-drop tasks in arbitrary homes.</a></li> +<li><a href="https://www.robot-learning.uk/dinobot">DINOBot: Robot Manipulation via Retrieval and Alignment with Vision Foundation Models</a></li> +<li><a href="https://marwan99.github.io/Fit-NGP/">Cool: Fit-NGP: Fitting Object Models to Neural Graphics Primitives</a> – <a href="https://www.youtube.com/watch?v=KQ7yH_em3Qg">Video</a></li> +<li><a href="https://github.com/MIT-SPARK/Khronos">Khronos: Spatio-Temporal Metric-Semantic SLAM</a></li> +<li><a href="https://github.com/huiyu-gao/VisFusion">VisFusion: Visibility-aware Online 3D Scene Reconstruction from Videos</a></li> +<li><a href="https://github.com/ISEE-Technology/lidar-with-velocity">Lidar With Velocity: Correcting Moving Objects Point Cloud Distortion From Oscillating Scanning Lidars by Fusion With Camera</a></li> +<li><a href="https://techcrunch.com/2024/02/17/dutch-startup-monumental-is-using-robots-to-lay-bricks/">Dutch startup Monumental is using robots to lay bricks</a></li> +<li><a href="https://www.therobotreport.com/olis-robotics-and-kawasaki-partner-to-offer-remote-troubleshooting/">Olis Robotics and Kawasaki Ink Deal for Remote Support</a></li> +<li><a href="https://spectrum.ieee.org/video-friday-pedipulate">Video Friday</a></li> +</ul> +<h1><a class="anchor" href="https://discourse.ros.org#ros-4" name="ros-4"></a>ROS</h1> +<ul> +<li><a href="https://discourse.ros.org/t/attention-students-open-robotics-google-summer-of-code-2024-projects/36271">Open 
Robotics @ GSoC 2024</a></li> +<li><a href="https://discourse.ros.org/t/gazebo-classic-end-of-life-ros-2-jazzy/36239">Gazebo Classic End-Of-Life and ROS 2 Jazzy</a></li> +<li><a href="https://discourse.ros.org/t/new-packages-and-patch-release-for-humble-hawksbill-2024-02-22/36275">42 New and 280 Updated Packages for Humble</a></li> +<li><a href="https://discourse.ros.org/t/new-packages-for-iron-irwini-2024-02-23/36283">2 New and 75 Updated Packages for Iron</a></li> +<li><a href="https://discourse.ros.org/t/new-packages-for-noetic-2024-02-22/36273">0 New and 136 Updated Packages for Noetic</a></li> +<li><a href="https://discourse.ros.org/t/migrating-turtlebot-2-to-ros-2/36225">ROS 2 Migration Guide using TurtleBot 2</a></li> +<li><a href="https://discourse.ros.org/t/new-ros-enhancement-proposal-for-marine-robotics/36218/5">REP Proposal: Coordinate Frames for Maritime Robots</a></li> +<li><a href="https://discourse.ros.org/t/introducing-psdk-ros2-bridging-djis-psdk-libraries-with-ros-2/33500">ROS 2 DJI Drone PSDK Bridge</a></li> +<li><a href="https://discourse.ros.org/t/learn-ros2-with-a-limo-robot-ros-developers-openclass-182/36287">Learn ROS 2 with LIMO Robot</a></li> +<li><a href="https://discourse.ros.org/t/gtc-march-18-21-highlights-for-ros-ai-robotics/36274">Highlights from NVIDIA GTC for ROS Developers</a></li> +<li><a href="https://discourse.ros.org/t/handle-unique-parameters-for-robot-instances/36074">Handle Unique Parameters for Robot Instances</a></li> +<li><a href="https://discourse.ros.org/t/we-use-websockets-and-ros-messaging-together-in-our-robot-software-stack-should-you/36199">Thoughts on Websockets with ROS Messaging</a></li> +<li><a href="https://vimeo.com/915293743">Maritime Robotics Working Group wsg HoloOcean</a></li> +<li><a href="https://discourse.ros.org/t/plotjuggler-3-9-1-is-out-and-few-more-things-you-should-know/36210">PlotJuggler 3.9.1 Release</a></li> +<li><a 
href="https://discourse.ros.org/t/devops-for-robotics-certificate-training-in-barcelona-spain-march-20-22-2024/36213">Devops for Robotics Certificate Training</a></li> +<li><a href="https://discourse.ros.org/t/preparing-ros-2-rolling-for-the-transition-to-ubuntu-24-04/35673">Rolling out 24.04 for Rolling</a></li> +<li><a href="https://discourse.ros.org/t/mcap-file-editor-gui-in-your-browser/36198">MCAP File Editor from @facontidavide</a></li> +<li><a href="https://www.hackster.io/aal-shaji/differential-drive-robot-using-ros2-and-esp32-aae289">Diff Drive ROS 2 Robot with ESP32</a></li> +<li><a href="https://www.petrikvandervelde.nl/posts/Swerve-drive-motor-limitations">Swerve Drive Motor Limitations</a> ← This personal blog is <img alt=":100:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/100.png?v=12" title=":100:" width="20" /></li> +<li><a href="https://github.com/JatinPatil2003/AutoNav">New AutoNav ROS Robot</a> – <a href="https://www.youtube.com/watch?v=g5LXZQY55DI">Video</a></li> +<li><a href="https://www.youtube.com/watch?v=Sz1fanH58kg">ROS Intro Workshop @ Purdue</a> – <a href="https://ivory-sale-974.notion.site/ARC-ROS-Workshop-2d26d5bcdd69496996806ccf8e5a011b">Materials</a></li> +<li><a href="https://www.youtube.com/watch?v=Y6AUsB3RUhA">Robotics at Compile Time: Optimizing Robotics Algorithms With C++'s Compile-Time Features - CppCon23</a></li> +<li><a href="https://github.com/jarain78/mycobot280_movelt2">MyCobot280 for MoveIt</a></li> +<li><a href="https://rosmo-robot.github.io/zio/">Ziobot ROS Robot</a></li> +</ul> +<h1><a class="anchor" href="https://discourse.ros.org#ros-questions-5" name="ros-questions-5"></a>ROS Questions</h1> +<p>Got a minute to spare? 
Pay it forward by answering a few ROS questions on <a href="https://robotics.stackexchange.com/">Robotics Stack Exchange</a>.</p> + <p><small>3 posts - 3 participants</small></p> + <p><a href="https://discourse.ros.org/t/ros-news-for-the-week-of-february-19th-2024/36297">Read full topic</a></p> + 2024-02-23T23:13:39+00:00 + Katherine_Scott + + + ROS Discourse General: New Packages for Iron Irwini 2024-02-23 + https://discourse.ros.org/t/new-packages-for-iron-irwini-2024-02-23/36283 + <p>We’re happy to announce <strong>2</strong> new packages and <strong>75</strong> updates are now available in ROS 2 Iron Irwini <img alt=":iron:" class="emoji emoji-custom" height="20" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/b/3/b3c1340fc185f5e47c7ec55ef5bb1771802de993.png?v=12" title=":iron:" width="20" /> <img alt=":irwini:" class="emoji emoji-custom" height="20" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/d/2/d2f3dcbdaff6f33258719fe5b8f692594a9feab0.png?v=12" title=":irwini:" width="20" /> . 
This sync was tagged as <a href="https://github.com/ros/rosdistro/blob/iron/2024-02-23/iron/distribution.yaml" rel="noopener nofollow ugc"><code>iron/2024-02-23</code> </a>.</p> +<h2><a class="anchor" href="https://discourse.ros.org#package-updates-for-iron-1" name="package-updates-for-iron-1"></a>Package Updates for iron</h2> +<h3><a class="anchor" href="https://discourse.ros.org#added-packages-2-2" name="added-packages-2-2"></a>Added Packages [2]:</h3> +<ul> +<li>ros-iron-apriltag-detector: 1.2.0-1</li> +<li>ros-iron-multidim-rrt-planner: 0.0.8-1</li> +</ul> +<h3><a class="anchor" href="https://discourse.ros.org#updated-packages-75-3" name="updated-packages-75-3"></a>Updated Packages [75]:</h3> +<ul> +<li>ros-iron-ackermann-steering-controller: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-ackermann-steering-controller-dbgsym: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-admittance-controller: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-admittance-controller-dbgsym: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-azure-iot-sdk-c: 1.10.1-4 → 1.12.0-1</li> +<li>ros-iron-bicycle-steering-controller: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-bicycle-steering-controller-dbgsym: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-bno055: 0.4.1-4 → 0.5.0-1</li> +<li>ros-iron-diff-drive-controller: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-diff-drive-controller-dbgsym: 3.21.0-1 → 3.22.0-1</li> +<li><a href="https://wiki.ros.org/draco_point_cloud_transport">ros-iron-draco-point-cloud-transport</a>: 2.0.3-1 → 2.0.4-1</li> +<li>ros-iron-draco-point-cloud-transport-dbgsym: 2.0.3-1 → 2.0.4-1</li> +<li>ros-iron-effort-controllers: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-effort-controllers-dbgsym: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-force-torque-sensor-broadcaster: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-force-torque-sensor-broadcaster-dbgsym: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-forward-command-controller: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-forward-command-controller-dbgsym: 3.21.0-1 → 3.22.0-1</li> 
+<li>ros-iron-gripper-controllers: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-gripper-controllers-dbgsym: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-imu-sensor-broadcaster: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-imu-sensor-broadcaster-dbgsym: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-joint-state-broadcaster: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-joint-state-broadcaster-dbgsym: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-joint-trajectory-controller: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-joint-trajectory-controller-dbgsym: 3.21.0-1 → 3.22.0-1</li> +<li><a href="http://wiki.ros.org/leo">ros-iron-leo</a>: 2.0.0-1 → 2.0.1-1</li> +<li><a href="http://wiki.ros.org/leo_description">ros-iron-leo-description</a>: 2.0.0-1 → 2.0.1-1</li> +<li><a href="http://wiki.ros.org/leo">ros-iron-leo-msgs</a>: 2.0.0-1 → 2.0.1-1</li> +<li>ros-iron-leo-msgs-dbgsym: 2.0.0-1 → 2.0.1-1</li> +<li><a href="http://wiki.ros.org/leo_teleop">ros-iron-leo-teleop</a>: 2.0.0-1 → 2.0.1-1</li> +<li><a href="https://github.com/MOLAorg/mp2p_icp" rel="noopener nofollow ugc">ros-iron-mp2p-icp</a>: 1.1.0-1 → 1.2.0-1</li> +<li>ros-iron-mp2p-icp-dbgsym: 1.1.0-1 → 1.2.0-1</li> +<li><a href="https://www.mrpt.org/" rel="noopener nofollow ugc">ros-iron-mrpt2</a>: 2.11.7-1 → 2.11.9-1</li> +<li>ros-iron-mrpt2-dbgsym: 2.11.7-1 → 2.11.9-1</li> +<li><a href="https://github.com/facontidavide/PlotJuggler" rel="noopener nofollow ugc">ros-iron-plotjuggler-ros</a>: 2.1.0-1 → 2.1.1-1</li> +<li>ros-iron-plotjuggler-ros-dbgsym: 2.1.0-1 → 2.1.1-1</li> +<li><a href="https://wiki.ros.org/draco_point_cloud_transport">ros-iron-point-cloud-interfaces</a>: 2.0.3-1 → 2.0.4-1</li> +<li>ros-iron-point-cloud-interfaces-dbgsym: 2.0.3-1 → 2.0.4-1</li> +<li>ros-iron-point-cloud-transport: 2.0.3-1 → 2.0.4-1</li> +<li>ros-iron-point-cloud-transport-dbgsym: 2.0.3-1 → 2.0.4-1</li> +<li><a href="https://wiki.ros.org/point_cloud_transport">ros-iron-point-cloud-transport-plugins</a>: 2.0.3-1 → 2.0.4-1</li> +<li>ros-iron-point-cloud-transport-py: 2.0.3-1 → 
2.0.4-1</li> +<li>ros-iron-position-controllers: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-position-controllers-dbgsym: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-range-sensor-broadcaster: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-range-sensor-broadcaster-dbgsym: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-rcpputils: 2.6.2-1 → 2.6.3-1</li> +<li>ros-iron-rcpputils-dbgsym: 2.6.2-1 → 2.6.3-1</li> +<li><a href="http://robotraconteur.com" rel="noopener nofollow ugc">ros-iron-robotraconteur</a>: 1.0.0-1 → 1.0.0-2</li> +<li>ros-iron-robotraconteur-dbgsym: 1.0.0-1 → 1.0.0-2</li> +<li>ros-iron-ros2-controllers: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-ros2-controllers-test-nodes: 3.21.0-1 → 3.22.0-1</li> +<li><a href="http://ros.org/wiki/rqt">ros-iron-rqt</a>: 1.3.3-1 → 1.3.4-1</li> +<li>ros-iron-rqt-gauges: 0.0.1-1 → 0.0.2-1</li> +<li><a href="http://ros.org/wiki/rqt_gui">ros-iron-rqt-gui</a>: 1.3.3-1 → 1.3.4-1</li> +<li><a href="http://ros.org/wiki/rqt_gui_cpp">ros-iron-rqt-gui-cpp</a>: 1.3.3-1 → 1.3.4-1</li> +<li>ros-iron-rqt-gui-cpp-dbgsym: 1.3.3-1 → 1.3.4-1</li> +<li><a href="http://ros.org/wiki/rqt_gui_py">ros-iron-rqt-gui-py</a>: 1.3.3-1 → 1.3.4-1</li> +<li>ros-iron-rqt-joint-trajectory-controller: 3.21.0-1 → 3.22.0-1</li> +<li><a href="http://ros.org/wiki/rqt_py_common">ros-iron-rqt-py-common</a>: 1.3.3-1 → 1.3.4-1</li> +<li>ros-iron-steering-controllers-library: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-steering-controllers-library-dbgsym: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-tricycle-controller: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-tricycle-controller-dbgsym: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-tricycle-steering-controller: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-tricycle-steering-controller-dbgsym: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-velocity-controllers: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-velocity-controllers-dbgsym: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-vrpn-mocap: 1.0.3-3 → 1.1.0-1</li> +<li>ros-iron-vrpn-mocap-dbgsym: 1.0.3-3 → 1.1.0-1</li> +<li><a 
href="https://wiki.ros.org/draco_point_cloud_transport">ros-iron-zlib-point-cloud-transport</a>: 2.0.3-1 → 2.0.4-1</li> +<li>ros-iron-zlib-point-cloud-transport-dbgsym: 2.0.3-1 → 2.0.4-1</li> +<li><a href="https://wiki.ros.org/draco_point_cloud_transport">ros-iron-zstd-point-cloud-transport</a>: 2.0.3-1 → 2.0.4-1</li> +<li>ros-iron-zstd-point-cloud-transport-dbgsym: 2.0.3-1 → 2.0.4-1</li> +</ul> +<h3><a class="anchor" href="https://discourse.ros.org#removed-packages-0-4" name="removed-packages-0-4"></a>Removed Packages [0]:</h3> +<p>Thanks to all ROS maintainers who make packages available to the ROS community. The above list of packages was made possible by the work of the following maintainers:</p> +<ul> +<li>Alejandro Hernandez Cordero</li> +<li>Alejandro Hernández</li> +<li>Alvin Sun</li> +<li>Bence Magyar</li> +<li>Bernd Pfrommer</li> +<li>Brandon Ong</li> +<li>Davide Faconti</li> +<li>Denis Štogl</li> +<li>Dharini Dutia</li> +<li>Eloy Bricneo</li> +<li>Fictionlab</li> +<li>John Wason</li> +<li>Jose-Luis Blanco-Claraco</li> +<li>Martin Pecka</li> +<li>Tim Clephas</li> +<li>david</li> +<li>flynneva</li> +</ul> + <p><small>1 post - 1 participant</small></p> + <p><a href="https://discourse.ros.org/t/new-packages-for-iron-irwini-2024-02-23/36283">Read full topic</a></p> + 2024-02-23T10:35:02+00:00 + Yadunund + + + ROS Discourse General: New packages and patch release for Humble Hawksbill 2024-02-22 + https://discourse.ros.org/t/new-packages-and-patch-release-for-humble-hawksbill-2024-02-22/36275 + <p>We’re happy to announce a <a href="https://github.com/ros2/ros2/releases/tag/release-humble-20240222" rel="noopener nofollow ugc">new Humble release</a>!</p> +<p>This sync brings several new packages and some updates to ROS 2 core packages. 
(I’m not including the project board because it was empty.)</p> +<hr /> +<h2><a class="anchor" href="https://discourse.ros.org#package-updates-for-humble-1" name="package-updates-for-humble-1"></a>Package Updates for Humble</h2> +<h3><a class="anchor" href="https://discourse.ros.org#added-packages-42-2" name="added-packages-42-2"></a>Added Packages [42]:</h3> +<ul> +<li>ros-humble-apriltag-detector: 1.1.0-1</li> +<li>ros-humble-as2-gazebo-assets: 1.0.8-1</li> +<li>ros-humble-as2-gazebo-assets-dbgsym: 1.0.8-1</li> +<li>ros-humble-as2-platform-dji-osdk: 1.0.8-1</li> +<li>ros-humble-as2-platform-dji-osdk-dbgsym: 1.0.8-1</li> +<li>ros-humble-as2-platform-gazebo: 1.0.8-1</li> +<li>ros-humble-as2-platform-gazebo-dbgsym: 1.0.8-1</li> +<li>ros-humble-caret-analyze: 0.5.0-1</li> +<li>ros-humble-caret-msgs: 0.5.0-6</li> +<li>ros-humble-caret-msgs-dbgsym: 0.5.0-6</li> +<li>ros-humble-data-tamer-cpp: 0.9.3-2</li> +<li>ros-humble-data-tamer-cpp-dbgsym: 0.9.3-2</li> +<li>ros-humble-data-tamer-msgs: 0.9.3-2</li> +<li>ros-humble-data-tamer-msgs-dbgsym: 0.9.3-2</li> +<li>ros-humble-hardware-interface-testing: 2.39.1-1</li> +<li>ros-humble-hardware-interface-testing-dbgsym: 2.39.1-1</li> +<li>ros-humble-hri-msgs: 2.0.0-1</li> +<li>ros-humble-hri-msgs-dbgsym: 2.0.0-1</li> +<li>ros-humble-mocap4r2-dummy-driver: 0.0.7-1</li> +<li>ros-humble-mocap4r2-dummy-driver-dbgsym: 0.0.7-1</li> +<li>ros-humble-mocap4r2-marker-viz: 0.0.7-1</li> +<li>ros-humble-mocap4r2-marker-viz-dbgsym: 0.0.7-1</li> +<li>ros-humble-mocap4r2-marker-viz-srvs: 0.0.7-1</li> +<li>ros-humble-mocap4r2-marker-viz-srvs-dbgsym: 0.0.7-1</li> +<li>ros-humble-motion-capture-tracking: 1.0.3-1</li> +<li>ros-humble-motion-capture-tracking-dbgsym: 1.0.3-1</li> +<li>ros-humble-motion-capture-tracking-interfaces: 1.0.3-1</li> +<li>ros-humble-motion-capture-tracking-interfaces-dbgsym: 1.0.3-1</li> +<li>ros-humble-psdk-interfaces: 1.0.0-1</li> +<li>ros-humble-psdk-interfaces-dbgsym: 1.0.0-1</li> +<li>ros-humble-psdk-wrapper: 
1.0.0-1</li> +<li>ros-humble-psdk-wrapper-dbgsym: 1.0.0-1</li> +<li>ros-humble-qb-softhand-industry-description: 2.1.2-4</li> +<li>ros-humble-qb-softhand-industry-msgs: 2.1.2-4</li> +<li>ros-humble-qb-softhand-industry-msgs-dbgsym: 2.1.2-4</li> +<li>ros-humble-qb-softhand-industry-ros2-control: 2.1.2-4</li> +<li>ros-humble-qb-softhand-industry-ros2-control-dbgsym: 2.1.2-4</li> +<li>ros-humble-qb-softhand-industry-srvs: 2.1.2-4</li> +<li>ros-humble-qb-softhand-industry-srvs-dbgsym: 2.1.2-4</li> +<li>ros-humble-ros2caret: 0.5.0-2</li> +<li>ros-humble-sync-parameter-server: 1.0.1-2</li> +<li>ros-humble-sync-parameter-server-dbgsym: 1.0.1-2</li> +</ul> +<h3><a class="anchor" href="https://discourse.ros.org#updated-packages-280-3" name="updated-packages-280-3"></a>Updated Packages [280]:</h3> +<ul> +<li>ros-humble-ament-cmake: 1.3.7-1 → 1.3.8-1</li> +<li>ros-humble-ament-cmake-auto: 1.3.7-1 → 1.3.8-1</li> +<li>ros-humble-ament-cmake-core: 1.3.7-1 → 1.3.8-1</li> +<li>ros-humble-ament-cmake-export-definitions: 1.3.7-1 → 1.3.8-1</li> +<li>ros-humble-ament-cmake-export-dependencies: 1.3.7-1 → 1.3.8-1</li> +<li>ros-humble-ament-cmake-export-include-directories: 1.3.7-1 → 1.3.8-1</li> +<li>ros-humble-ament-cmake-export-interfaces: 1.3.7-1 → 1.3.8-1</li> +<li>ros-humble-ament-cmake-export-libraries: 1.3.7-1 → 1.3.8-1</li> +<li>ros-humble-ament-cmake-export-link-flags: 1.3.7-1 → 1.3.8-1</li> +<li>ros-humble-ament-cmake-export-targets: 1.3.7-1 → 1.3.8-1</li> +<li>ros-humble-ament-cmake-gen-version-h: 1.3.7-1 → 1.3.8-1</li> +<li>ros-humble-ament-cmake-gmock: 1.3.7-1 → 1.3.8-1</li> +<li>ros-humble-ament-cmake-google-benchmark: 1.3.7-1 → 1.3.8-1</li> +<li>ros-humble-ament-cmake-gtest: 1.3.7-1 → 1.3.8-1</li> +<li>ros-humble-ament-cmake-include-directories: 1.3.7-1 → 1.3.8-1</li> +<li>ros-humble-ament-cmake-libraries: 1.3.7-1 → 1.3.8-1</li> +<li>ros-humble-ament-cmake-nose: 1.3.7-1 → 1.3.8-1</li> +<li>ros-humble-ament-cmake-pytest: 1.3.7-1 → 1.3.8-1</li> 
+<li>ros-humble-ament-cmake-python: 1.3.7-1 → 1.3.8-1</li> +<li>ros-humble-ament-cmake-target-dependencies: 1.3.7-1 → 1.3.8-1</li> +<li>ros-humble-ament-cmake-test: 1.3.7-1 → 1.3.8-1</li> +<li>ros-humble-ament-cmake-vendor-package: 1.3.7-1 → 1.3.8-1</li> +<li>ros-humble-ament-cmake-version: 1.3.7-1 → 1.3.8-1</li> +<li>ros-humble-as2-alphanumeric-viewer: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-alphanumeric-viewer-dbgsym: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-behavior: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-behavior-dbgsym: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-behavior-tree: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-behavior-tree-dbgsym: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-behaviors-motion: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-behaviors-motion-dbgsym: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-behaviors-perception: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-behaviors-perception-dbgsym: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-behaviors-platform: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-behaviors-platform-dbgsym: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-behaviors-trajectory-generation: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-behaviors-trajectory-generation-dbgsym: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-cli: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-core: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-keyboard-teleoperation: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-motion-controller: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-motion-controller-dbgsym: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-motion-reference-handlers: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-msgs: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-msgs-dbgsym: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-platform-crazyflie: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-platform-crazyflie-dbgsym: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-platform-tello: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-platform-tello-dbgsym: 1.0.6-1 → 1.0.8-1</li> 
+<li>ros-humble-as2-python-api: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-realsense-interface: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-realsense-interface-dbgsym: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-state-estimator: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-state-estimator-dbgsym: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-usb-camera-interface: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-usb-camera-interface-dbgsym: 1.0.6-1 → 1.0.8-1</li> +<li><a href="https://index.ros.org/p/camera_calibration/github-ros-perception-image_pipeline/">ros-humble-camera-calibration</a>: 3.0.0-1 → 3.0.3-1</li> +<li>ros-humble-controller-interface: 2.37.0-1 → 2.39.1-1</li> +<li>ros-humble-controller-interface-dbgsym: 2.37.0-1 → 2.39.1-1</li> +<li>ros-humble-controller-manager: 2.37.0-1 → 2.39.1-1</li> +<li>ros-humble-controller-manager-dbgsym: 2.37.0-1 → 2.39.1-1</li> +<li><a href="http://ros.org/wiki/controller_manager_msgs">ros-humble-controller-manager-msgs</a>: 2.37.0-1 → 2.39.1-1</li> +<li>ros-humble-controller-manager-msgs-dbgsym: 2.37.0-1 → 2.39.1-1</li> +<li>ros-humble-costmap-queue: 1.1.12-1 → 1.1.13-1</li> +<li><a href="https://index.ros.org/p/depth_image_proc/github-ros-perception-image_pipeline/">ros-humble-depth-image-proc</a>: 3.0.0-1 → 3.0.3-1</li> +<li>ros-humble-depth-image-proc-dbgsym: 3.0.0-1 → 3.0.3-1</li> +<li>ros-humble-depthai-bridge: 2.8.2-1 → 2.9.0-1</li> +<li>ros-humble-depthai-bridge-dbgsym: 2.8.2-1 → 2.9.0-1</li> +<li>ros-humble-depthai-descriptions: 2.8.2-1 → 2.9.0-1</li> +<li>ros-humble-depthai-examples: 2.8.2-1 → 2.9.0-1</li> +<li>ros-humble-depthai-examples-dbgsym: 2.8.2-1 → 2.9.0-1</li> +<li>ros-humble-depthai-filters: 2.8.2-1 → 2.9.0-1</li> +<li>ros-humble-depthai-filters-dbgsym: 2.8.2-1 → 2.9.0-1</li> +<li>ros-humble-depthai-ros: 2.8.2-1 → 2.9.0-1</li> +<li>ros-humble-depthai-ros-driver: 2.8.2-1 → 2.9.0-1</li> +<li>ros-humble-depthai-ros-driver-dbgsym: 2.8.2-1 → 2.9.0-1</li> +<li>ros-humble-depthai-ros-msgs: 2.8.2-1 → 2.9.0-1</li> 
+<li>ros-humble-depthai-ros-msgs-dbgsym: 2.8.2-1 → 2.9.0-1</li> +<li>ros-humble-dwb-core: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-dwb-core-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-dwb-critics: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-dwb-critics-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-dwb-msgs: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-dwb-msgs-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-dwb-plugins: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-dwb-plugins-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-event-camera-codecs: 1.1.2-1 → 1.1.3-1</li> +<li>ros-humble-event-camera-codecs-dbgsym: 1.1.2-1 → 1.1.3-1</li> +<li>ros-humble-event-camera-py: 1.1.3-1 → 1.1.4-1</li> +<li>ros-humble-event-camera-renderer: 1.1.2-1 → 1.1.3-1</li> +<li>ros-humble-event-camera-renderer-dbgsym: 1.1.2-1 → 1.1.3-1</li> +<li>ros-humble-examples-tf2-py: 0.25.5-1 → 0.25.6-1</li> +<li><a href="http://www.ros.org/wiki/geometry2">ros-humble-geometry2</a>: 0.25.5-1 → 0.25.6-1</li> +<li>ros-humble-hardware-interface: 2.37.0-1 → 2.39.1-1</li> +<li>ros-humble-hardware-interface-dbgsym: 2.37.0-1 → 2.39.1-1</li> +<li><a href="https://index.ros.org/p/image_pipeline/github-ros-perception-image_pipeline/">ros-humble-image-pipeline</a>: 3.0.0-1 → 3.0.3-1</li> +<li><a href="https://index.ros.org/p/image_proc/github-ros-perception-image_pipeline/">ros-humble-image-proc</a>: 3.0.0-1 → 3.0.3-1</li> +<li>ros-humble-image-proc-dbgsym: 3.0.0-1 → 3.0.3-1</li> +<li><a href="https://index.ros.org/p/image_publisher/github-ros-perception-image_pipeline/">ros-humble-image-publisher</a>: 3.0.0-1 → 3.0.3-1</li> +<li>ros-humble-image-publisher-dbgsym: 3.0.0-1 → 3.0.3-1</li> +<li><a href="https://index.ros.org/p/image_rotate/github-ros-perception-image_pipeline/">ros-humble-image-rotate</a>: 3.0.0-1 → 3.0.3-1</li> +<li>ros-humble-image-rotate-dbgsym: 3.0.0-1 → 3.0.3-1</li> +<li><a href="https://index.ros.org/p/image_view/github-ros-perception-image_pipeline/">ros-humble-image-view</a>: 3.0.0-1 → 
3.0.3-1</li> +<li>ros-humble-image-view-dbgsym: 3.0.0-1 → 3.0.3-1</li> +<li><a href="https://github.com/ros-controls/ros2_control/wiki" rel="noopener nofollow ugc">ros-humble-joint-limits</a>: 2.37.0-1 → 2.39.1-1</li> +<li>ros-humble-joint-limits-dbgsym: 2.37.0-1 → 2.39.1-1</li> +<li>ros-humble-launch: 1.0.4-1 → 1.0.5-1</li> +<li>ros-humble-launch-pytest: 1.0.4-1 → 1.0.5-1</li> +<li>ros-humble-launch-testing: 1.0.4-1 → 1.0.5-1</li> +<li>ros-humble-launch-testing-ament-cmake: 1.0.4-1 → 1.0.5-1</li> +<li>ros-humble-launch-xml: 1.0.4-1 → 1.0.5-1</li> +<li>ros-humble-launch-yaml: 1.0.4-1 → 1.0.5-1</li> +<li><a href="http://wiki.ros.org/leo">ros-humble-leo</a>: 1.2.0-1 → 1.2.1-1</li> +<li><a href="http://wiki.ros.org/leo_description">ros-humble-leo-description</a>: 1.2.0-1 → 1.2.1-1</li> +<li><a href="http://wiki.ros.org/leo">ros-humble-leo-msgs</a>: 1.2.0-1 → 1.2.1-1</li> +<li>ros-humble-leo-msgs-dbgsym: 1.2.0-1 → 1.2.1-1</li> +<li><a href="http://wiki.ros.org/leo_teleop">ros-humble-leo-teleop</a>: 1.2.0-1 → 1.2.1-1</li> +<li><a href="https://github.com/LORD-MicroStrain/microstrain_inertial" rel="noopener nofollow ugc">ros-humble-microstrain-inertial-driver</a>: 3.2.0-2 → 3.2.1-1</li> +<li>ros-humble-microstrain-inertial-driver-dbgsym: 3.2.0-2 → 3.2.1-1</li> +<li><a href="https://github.com/LORD-MicroStrain/microstrain_inertial" rel="noopener nofollow ugc">ros-humble-microstrain-inertial-examples</a>: 3.2.0-2 → 3.2.1-1</li> +<li>ros-humble-microstrain-inertial-examples-dbgsym: 3.2.0-2 → 3.2.1-1</li> +<li><a href="https://github.com/LORD-MicroStrain/microstrain_inertial" rel="noopener nofollow ugc">ros-humble-microstrain-inertial-msgs</a>: 3.2.0-2 → 3.2.1-1</li> +<li>ros-humble-microstrain-inertial-msgs-dbgsym: 3.2.0-2 → 3.2.1-1</li> +<li><a href="https://github.com/LORD-MicroStrain/microstrain_inertial" rel="noopener nofollow ugc">ros-humble-microstrain-inertial-rqt</a>: 3.2.0-2 → 3.2.1-1</li> +<li>ros-humble-mocap4r2-control: 0.0.6-1 → 0.0.7-1</li> 
+<li>ros-humble-mocap4r2-control-dbgsym: 0.0.6-1 → 0.0.7-1</li> +<li>ros-humble-mocap4r2-control-msgs: 0.0.6-1 → 0.0.7-1</li> +<li>ros-humble-mocap4r2-control-msgs-dbgsym: 0.0.6-1 → 0.0.7-1</li> +<li>ros-humble-mocap4r2-marker-publisher: 0.0.6-1 → 0.0.7-1</li> +<li>ros-humble-mocap4r2-marker-publisher-dbgsym: 0.0.6-1 → 0.0.7-1</li> +<li>ros-humble-mocap4r2-robot-gt: 0.0.6-1 → 0.0.7-1</li> +<li>ros-humble-mocap4r2-robot-gt-dbgsym: 0.0.6-1 → 0.0.7-1</li> +<li>ros-humble-mocap4r2-robot-gt-msgs: 0.0.6-1 → 0.0.7-1</li> +<li>ros-humble-mocap4r2-robot-gt-msgs-dbgsym: 0.0.6-1 → 0.0.7-1</li> +<li><a href="https://github.com/MOLAorg/mp2p_icp" rel="noopener nofollow ugc">ros-humble-mp2p-icp</a>: 1.0.0-1 → 1.2.0-1</li> +<li>ros-humble-mp2p-icp-dbgsym: 1.0.0-1 → 1.2.0-1</li> +<li><a href="https://www.mrpt.org/" rel="noopener nofollow ugc">ros-humble-mrpt2</a>: 2.11.6-1 → 2.11.9-1</li> +<li>ros-humble-mrpt2-dbgsym: 2.11.6-1 → 2.11.9-1</li> +<li>ros-humble-nav-2d-msgs: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav-2d-msgs-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav-2d-utils: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav-2d-utils-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-amcl: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-amcl-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-behavior-tree: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-behavior-tree-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-behaviors: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-behaviors-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-bringup: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-bt-navigator: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-bt-navigator-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-collision-monitor: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-collision-monitor-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-common: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-constrained-smoother: 1.1.12-1 → 1.1.13-1</li> 
+<li>ros-humble-nav2-constrained-smoother-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-controller: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-controller-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-core: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-costmap-2d: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-costmap-2d-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-dwb-controller: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-lifecycle-manager: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-lifecycle-manager-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-map-server: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-map-server-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-mppi-controller: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-mppi-controller-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-msgs: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-msgs-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-navfn-planner: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-navfn-planner-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-planner: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-planner-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-regulated-pure-pursuit-controller: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-regulated-pure-pursuit-controller-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-rotation-shim-controller: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-rotation-shim-controller-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-rviz-plugins: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-rviz-plugins-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-simple-commander: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-smac-planner: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-smac-planner-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-smoother: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-smoother-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-theta-star-planner: 1.1.12-1 → 
1.1.13-1</li> +<li>ros-humble-nav2-theta-star-planner-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-util: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-util-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-velocity-smoother: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-velocity-smoother-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-voxel-grid: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-voxel-grid-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-waypoint-follower: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-waypoint-follower-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-navigation2: 1.1.12-1 → 1.1.13-1</li> +<li><a href="http://wiki.ros.org/pcl_conversions">ros-humble-pcl-conversions</a>: 2.4.0-4 → 2.4.0-6</li> +<li><a href="http://ros.org/wiki/perception_pcl">ros-humble-pcl-ros</a>: 2.4.0-4 → 2.4.0-6</li> +<li><a href="http://ros.org/wiki/perception_pcl">ros-humble-perception-pcl</a>: 2.4.0-4 → 2.4.0-6</li> +<li><a href="https://github.com/facontidavide/PlotJuggler" rel="noopener nofollow ugc">ros-humble-plotjuggler</a>: 3.8.8-3 → 3.9.0-1</li> +<li>ros-humble-plotjuggler-dbgsym: 3.8.8-3 → 3.9.0-1</li> +<li><a href="https://github.com/facontidavide/PlotJuggler" rel="noopener nofollow ugc">ros-humble-plotjuggler-ros</a>: 2.0.0-3 → 2.1.0-1</li> +<li>ros-humble-plotjuggler-ros-dbgsym: 2.0.0-3 → 2.1.0-1</li> +<li>ros-humble-rclpy: 3.3.11-1 → 3.3.12-1</li> +<li>ros-humble-rcpputils: 2.4.1-1 → 2.4.2-1</li> +<li>ros-humble-rcpputils-dbgsym: 2.4.1-1 → 2.4.2-1</li> +<li>ros-humble-rcutils: 5.1.4-1 → 5.1.5-1</li> +<li>ros-humble-rcutils-dbgsym: 5.1.4-1 → 5.1.5-1</li> +<li><a href="http://robotraconteur.com" rel="noopener nofollow ugc">ros-humble-robotraconteur</a>: 1.0.0-1 → 1.0.0-2</li> +<li>ros-humble-robotraconteur-dbgsym: 1.0.0-1 → 1.0.0-2</li> +<li>ros-humble-ros2-control: 2.37.0-1 → 2.39.1-1</li> +<li>ros-humble-ros2-control-test-assets: 2.37.0-1 → 2.39.1-1</li> +<li>ros-humble-ros2action: 0.18.8-1 → 0.18.9-1</li> 
+<li>ros-humble-ros2cli: 0.18.8-1 → 0.18.9-1</li> +<li>ros-humble-ros2cli-test-interfaces: 0.18.8-1 → 0.18.9-1</li> +<li>ros-humble-ros2cli-test-interfaces-dbgsym: 0.18.8-1 → 0.18.9-1</li> +<li>ros-humble-ros2component: 0.18.8-1 → 0.18.9-1</li> +<li>ros-humble-ros2controlcli: 2.37.0-1 → 2.39.1-1</li> +<li>ros-humble-ros2doctor: 0.18.8-1 → 0.18.9-1</li> +<li>ros-humble-ros2interface: 0.18.8-1 → 0.18.9-1</li> +<li>ros-humble-ros2lifecycle: 0.18.8-1 → 0.18.9-1</li> +<li>ros-humble-ros2lifecycle-test-fixtures: 0.18.8-1 → 0.18.9-1</li> +<li>ros-humble-ros2lifecycle-test-fixtures-dbgsym: 0.18.8-1 → 0.18.9-1</li> +<li>ros-humble-ros2multicast: 0.18.8-1 → 0.18.9-1</li> +<li>ros-humble-ros2node: 0.18.8-1 → 0.18.9-1</li> +<li>ros-humble-ros2param: 0.18.8-1 → 0.18.9-1</li> +<li>ros-humble-ros2pkg: 0.18.8-1 → 0.18.9-1</li> +<li>ros-humble-ros2run: 0.18.8-1 → 0.18.9-1</li> +<li>ros-humble-ros2service: 0.18.8-1 → 0.18.9-1</li> +<li>ros-humble-ros2topic: 0.18.8-1 → 0.18.9-1</li> +<li><a href="http://ros.org/wiki/rqt">ros-humble-rqt</a>: 1.1.6-2 → 1.1.7-1</li> +<li><a href="http://wiki.ros.org/rqt_console">ros-humble-rqt-console</a>: 2.0.2-3 → 2.0.3-1</li> +<li><a href="http://ros.org/wiki/rqt_controller_manager">ros-humble-rqt-controller-manager</a>: 2.37.0-1 → 2.39.1-1</li> +<li><a href="http://ros.org/wiki/rqt_gui">ros-humble-rqt-gui</a>: 1.1.6-2 → 1.1.7-1</li> +<li><a href="http://ros.org/wiki/rqt_gui_cpp">ros-humble-rqt-gui-cpp</a>: 1.1.6-2 → 1.1.7-1</li> +<li>ros-humble-rqt-gui-cpp-dbgsym: 1.1.6-2 → 1.1.7-1</li> +<li><a href="http://ros.org/wiki/rqt_gui_py">ros-humble-rqt-gui-py</a>: 1.1.6-2 → 1.1.7-1</li> +<li>ros-humble-rqt-mocap4r2-control: 0.0.6-1 → 0.0.7-1</li> +<li>ros-humble-rqt-mocap4r2-control-dbgsym: 0.0.6-1 → 0.0.7-1</li> +<li><a href="http://ros.org/wiki/rqt_py_common">ros-humble-rqt-py-common</a>: 1.1.6-2 → 1.1.7-1</li> +<li><a href="http://assimp.sourceforge.net/index.html" rel="noopener nofollow ugc">ros-humble-rviz-assimp-vendor</a>: 11.2.10-1 → 
11.2.11-1</li> +<li><a href="https://github.com/ros2/rviz/blob/ros2/README.md" rel="noopener nofollow ugc">ros-humble-rviz-common</a>: 11.2.10-1 → 11.2.11-1</li> +<li>ros-humble-rviz-common-dbgsym: 11.2.10-1 → 11.2.11-1</li> +<li><a href="https://github.com/ros2/rviz/blob/ros2/README.md" rel="noopener nofollow ugc">ros-humble-rviz-default-plugins</a>: 11.2.10-1 → 11.2.11-1</li> +<li>ros-humble-rviz-default-plugins-dbgsym: 11.2.10-1 → 11.2.11-1</li> +<li><a href="https://www.ogre3d.org/" rel="noopener nofollow ugc">ros-humble-rviz-ogre-vendor</a>: 11.2.10-1 → 11.2.11-1</li> +<li>ros-humble-rviz-ogre-vendor-dbgsym: 11.2.10-1 → 11.2.11-1</li> +<li><a href="https://github.com/ros2/rviz/blob/ros2/README.md" rel="noopener nofollow ugc">ros-humble-rviz-rendering</a>: 11.2.10-1 → 11.2.11-1</li> +<li>ros-humble-rviz-rendering-dbgsym: 11.2.10-1 → 11.2.11-1</li> +<li>ros-humble-rviz-rendering-tests: 11.2.10-1 → 11.2.11-1</li> +<li><a href="http://ros.org/wiki/rviz2">ros-humble-rviz-visual-testing-framework</a>: 11.2.10-1 → 11.2.11-1</li> +<li><a href="https://github.com/ros2/rviz/blob/ros2/README.md" rel="noopener nofollow ugc">ros-humble-rviz2</a>: 11.2.10-1 → 11.2.11-1</li> +<li>ros-humble-rviz2-dbgsym: 11.2.10-1 → 11.2.11-1</li> +<li>ros-humble-sick-scan-xd: 3.1.11-1 → 3.1.11-3</li> +<li>ros-humble-sick-scan-xd-dbgsym: 3.1.11-1 → 3.1.11-3</li> +<li><a href="https://index.ros.org/p/stereo_image_proc/github-ros-perception-image_pipeline/">ros-humble-stereo-image-proc</a>: 3.0.0-1 → 3.0.3-1</li> +<li>ros-humble-stereo-image-proc-dbgsym: 3.0.0-1 → 3.0.3-1</li> +<li><a href="http://www.ros.org/wiki/tf2">ros-humble-tf2</a>: 0.25.5-1 → 0.25.6-1</li> +<li><a href="http://www.ros.org/wiki/tf2_bullet">ros-humble-tf2-bullet</a>: 0.25.5-1 → 0.25.6-1</li> +<li>ros-humble-tf2-dbgsym: 0.25.5-1 → 0.25.6-1</li> +<li>ros-humble-tf2-eigen: 0.25.5-1 → 0.25.6-1</li> +<li>ros-humble-tf2-eigen-kdl: 0.25.5-1 → 0.25.6-1</li> +<li>ros-humble-tf2-eigen-kdl-dbgsym: 0.25.5-1 → 0.25.6-1</li> +<li><a 
href="http://www.ros.org/wiki/tf2_ros">ros-humble-tf2-geometry-msgs</a>: 0.25.5-1 → 0.25.6-1</li> +<li><a href="http://ros.org/wiki/tf2">ros-humble-tf2-kdl</a>: 0.25.5-1 → 0.25.6-1</li> +<li><a href="http://www.ros.org/wiki/tf2_msgs">ros-humble-tf2-msgs</a>: 0.25.5-1 → 0.25.6-1</li> +<li>ros-humble-tf2-msgs-dbgsym: 0.25.5-1 → 0.25.6-1</li> +<li><a href="http://ros.org/wiki/tf2_py">ros-humble-tf2-py</a>: 0.25.5-1 → 0.25.6-1</li> +<li>ros-humble-tf2-py-dbgsym: 0.25.5-1 → 0.25.6-1</li> +<li><a href="http://www.ros.org/wiki/tf2_ros">ros-humble-tf2-ros</a>: 0.25.5-1 → 0.25.6-1</li> +<li>ros-humble-tf2-ros-dbgsym: 0.25.5-1 → 0.25.6-1</li> +<li><a href="http://www.ros.org/wiki/tf2_ros">ros-humble-tf2-ros-py</a>: 0.25.5-1 → 0.25.6-1</li> +<li><a href="http://www.ros.org/wiki/tf2_ros">ros-humble-tf2-sensor-msgs</a>: 0.25.5-1 → 0.25.6-1</li> +<li><a href="http://www.ros.org/wiki/tf2_tools">ros-humble-tf2-tools</a>: 0.25.5-1 → 0.25.6-1</li> +<li>ros-humble-tracetools-image-pipeline: 3.0.0-1 → 3.0.3-1</li> +<li>ros-humble-tracetools-image-pipeline-dbgsym: 3.0.0-1 → 3.0.3-1</li> +<li>ros-humble-transmission-interface: 2.37.0-1 → 2.39.1-1</li> +<li>ros-humble-transmission-interface-dbgsym: 2.37.0-1 → 2.39.1-1</li> +<li>ros-humble-vrpn-mocap: 1.0.4-1 → 1.1.0-1</li> +<li>ros-humble-vrpn-mocap-dbgsym: 1.0.4-1 → 1.1.0-1</li> +</ul> +<h3><a class="anchor" href="https://discourse.ros.org#removed-packages-4-4" name="removed-packages-4-4"></a>Removed Packages [4]:</h3> +<ul> +<li>ros-humble-as2-ign-gazebo-assets</li> +<li>ros-humble-as2-ign-gazebo-assets-dbgsym</li> +<li>ros-humble-as2-platform-ign-gazebo</li> +<li>ros-humble-as2-platform-ign-gazebo-dbgsym</li> +</ul> +<p>Thanks to all ROS maintainers who make packages available to the ROS community. 
The above list of packages was made possible by the work of the following maintainers:</p> +<ul> +<li>Adam Serafin</li> +<li>Aditya Pande</li> +<li>Alexey Merzlyakov</li> +<li>Alvin Sun</li> +<li>Bence Magyar</li> +<li>Bernd Pfrommer</li> +<li>Bianca Bendris</li> +<li>Brian Wilcox</li> +<li>CVAR-UPM</li> +<li>Carl Delsey</li> +<li>Carlos Orduno</li> +<li>Chris Lalancette</li> +<li>David V. Lu!!</li> +<li>Davide Faconti</li> +<li>Dirk Thomas</li> +<li>Dorian Scholz</li> +<li>Fictionlab</li> +<li>Francisco Martín</li> +<li>Francisco Martín Rico</li> +<li>Jacob Perron</li> +<li>John Wason</li> +<li>Jose-Luis Blanco-Claraco</li> +<li>Matej Vargovcik</li> +<li>Michael Jeronimo</li> +<li>Mohammad Haghighipanah</li> +<li>Paul Bovbel</li> +<li>Rob Fisher</li> +<li>Sachin Guruswamy</li> +<li>Shane Loretz</li> +<li>Steve Macenski</li> +<li>Support Team</li> +<li>Séverin Lemaignan</li> +<li>Tatsuro Sakaguchi</li> +<li>Vincent Rabaud</li> +<li>Víctor Mayoral-Vilches</li> +<li>Wolfgang Hönig</li> +<li>fmrico</li> +<li>rostest</li> +<li>sachin</li> +<li>steve</li> +<li>ymski</li> +</ul> + <p><small>5 posts - 4 participants</small></p> + <p><a href="https://discourse.ros.org/t/new-packages-and-patch-release-for-humble-hawksbill-2024-02-22/36275">Read full topic</a></p> + 2024-02-23T05:00:30+00:00 + audrow + + + ROS Discourse General: GTC March 18-21 highlights for ROS & AI robotics + https://discourse.ros.org/t/gtc-march-18-21-highlights-for-ros-ai-robotics/36274 + <p><strong><a href="https://www.nvidia.com/gtc/" rel="noopener nofollow ugc">NVIDIA GTC</a></strong> is happening live on March 18–21, with <strong><a href="https://www.nvidia.com/gtc/sessions/robotics/" rel="noopener nofollow ugc">registration open</a></strong> for the event in San Jose, CA.</p> +<p>There are multiple inspiring <a href="https://www.nvidia.com/gtc/sessions/robotics/" rel="noopener nofollow ugc">robotics</a> sessions following the kickoff with <strong>CEO Jensen Huang’s</strong> must-see keynote at the 
SAP Center, which will share the latest breakthroughs affecting every industry.</p> +<p>Some highlighted robotics sessions, hands-on labs, and developer sessions include:</p> +<p><strong>Hands-on Training Labs</strong></p> +<ul> +<li><a href="https://www.nvidia.com/gtc/session-catalog/?tab.allsessions=1700692987788001F1cG&amp;search=DLIT61534#/" rel="noopener nofollow ugc">DLIT61534</a> Elevate Your Robotics Game: Unleash High Performance with Isaac ROS &amp; Isaac Sim</li> +<li><a href="https://www.nvidia.com/gtc/session-catalog/?tab.allsessions=1700692987788001F1cG&amp;search=DLIT61899#/" rel="noopener nofollow ugc">DLIT61899</a> Simulating Custom Robots: A Hands-On Lab Using Isaac Sim and ROS 2</li> +<li><a href="https://www.nvidia.com/gtc/session-catalog/?search=DLIT61523" rel="noopener nofollow ugc">DLIT61523</a> Unlocking Local LLM Inference with Jetson AGX Orin: A Hands-On Lab</li> +<li><a href="https://www.nvidia.com/gtc/session-catalog/?tab.allsessions=1700692987788001F1cG&amp;search=DLIT61797#/" rel="noopener nofollow ugc">DLIT61797</a> Training an Autonomous Mobile Race Car with Open USD and Isaac Sim</li> +</ul> +<p><strong><a href="https://www.nvidia.com/gtc/sessions/jetson-and-robotics-developer-day/" rel="noopener nofollow ugc">Jetson and Robotics Developer Day</a></strong></p> +<ul> +<li><a href="https://www.nvidia.com/gtc/session-catalog/?search=SE62934" rel="noopener nofollow ugc">SE62934</a> Introduction to AI-Based Robot Development With Isaac ROS</li> +<li><a href="https://www.nvidia.com/gtc/session-catalog/?search=SE62675&amp;tab.allsessions=1700692987788001F1cG#/" rel="noopener nofollow ugc">SE62675</a> Meet Jetson: The Platform for Edge AI and Robotics</li> +<li><a href="https://www.nvidia.com/gtc/session-catalog/?search=SE62933" rel="noopener nofollow ugc">SE62933</a> Overview of Jetson Software and Developer Tools</li> +</ul> +<p><strong>Robotics-focused sessions</strong></p> +<ul> +<li><a 
href="https://www.nvidia.com/gtc/session-catalog/?tab.allsessions=1700692987788001F1cG&amp;search=s63374#/" rel="noopener nofollow ugc">S63374</a> <em>(Disney Research)</em> Breathing Life into Disney’s Robotic Characters with Deep Reinforcement Learning</li> +<li><a href="https://www.nvidia.com/gtc/session-catalog/?tab.allsessions=1700692987788001F1cG&amp;search=S62602#/" rel="noopener nofollow ugc">S62602</a> <em>(Boston Dynamics)</em> Come See an Unlocked Ecosystem in the Robotics World</li> +<li><a href="https://www.nvidia.com/gtc/session-catalog/?tab.allsessions=1700692987788001F1cG&amp;search=S62315#/" rel="noopener nofollow ugc">S62315</a> <em>(The AI Institute)</em> Robotics and the Role of AI: Past, Present, and Future</li> +<li><a href="https://www.nvidia.com/gtc/session-catalog/?tab.allsessions=1700692987788001F1cG&amp;search=S61182#/" rel="noopener nofollow ugc">S61182</a> <em>(Google DeepMind)</em> Robotics in the Age of Generative AI</li> +<li><a href="https://www.nvidia.com/gtc/session-catalog/?tab.allsessions=1700692987788001F1cG&amp;search=S63034#/" rel="noopener nofollow ugc">S63034</a> Panel Discussion on the Impact of Generative AI on Robotics</li> +</ul> +<p>This is a great opportunity to connect, learn, and share with industry luminaries, robotics companies, NVIDIA experts, and peers face-to-face.</p> +<p>Thanks.</p> + <p><small>1 post - 1 participant</small></p> + <p><a href="https://discourse.ros.org/t/gtc-march-18-21-highlights-for-ros-ai-robotics/36274">Read full topic</a></p> + 2024-02-23T04:43:51+00:00 + ggrigor + + + diff --git a/rss20.xml b/rss20.xml new file mode 100644 index 00000000..837c81ab --- /dev/null +++ b/rss20.xml @@ -0,0 +1,2516 @@ + + + + + Planet ROS + http://planet.ros.org + en + Planet ROS - http://planet.ros.org + + + ROS Discourse General: ROS News for the Week of March 11th, 2024 + discourse.ros.org-topic-36651 + https://discourse.ros.org/t/ros-news-for-the-week-of-march-11th-2024/36651 + <h1><a class="anchor" 
href="https://discourse.ros.org#ros-news-for-the-week-of-march-11th-2024-1" name="ros-news-for-the-week-of-march-11th-2024-1"></a>ROS News for the Week of March 11th, 2024</h1> +<br /> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/e/5/e501ec44fb4eefddc8e5d1f1334345b72b815ed1.png" title="ROSCon_2024_transparent"><img alt="ROSCon_2024_transparent" height="500" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/e/5/e501ec44fb4eefddc8e5d1f1334345b72b815ed1_2_545x500.png" width="545" /></a></div><br /> +<a href="https://discourse.ros.org/t/roscon-2024-call-for-proposals-now-open/36624">The ROSCon 2024 call for talks and workshops is now open!</a> We want your amazing talks! Also, the ROSCon Diversity Scholarship deadline is coming up!<p></p> +<br /> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/5/f/5fc4b38399ab864e35409e6f7d0b7b66b833a633.jpeg" title="ROSBTBMarch24 (2)"><img alt="ROSBTBMarch24 (2)" height="388" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/5/f/5fc4b38399ab864e35409e6f7d0b7b66b833a633_2_690x388.jpeg" width="690" /></a></div><br /> +<a href="https://www.meetup.com/ros-by-the-bay/events/299684887/">ROS By-The-Bay is next week.</a> Open Robotics’ CEO <a class="mention" href="https://discourse.ros.org/u/vanessa_yamzon_orsi">@Vanessa_Yamzon_Orsi</a> is dropping by to take your questions about the future of Open Robotics, and I recommend you swing by if you can. 
Just a heads up, we have to move to a different room on the other side of the complex; details are on <a href="http://Meetup.com">Meetup.com</a>.<p></p> +<br /> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/d/3/d3f66abe058d28275dba6cf53c8bdc7afe3b1ccc.jpeg" title="MEETUP SAN ANTONIO (2)"><img alt="MEETUP SAN ANTONIO (2)" height="388" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/d/3/d3f66abe058d28275dba6cf53c8bdc7afe3b1ccc_2_690x388.jpeg" width="690" /></a></div><br /> +<a href="https://www.eventbrite.com/e/ros-meetup-san-antonio-tickets-858425041407?aff=oddtdtcreator">We’re planning a ROS Meetup in San Antonio on March 26th in conjunction with the ROS Industrial Consortium meeting.</a> If you are in the area, or have colleagues in the region, please help us spread the word.<p></p> +<br /> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/4/7/47238db6ac84cd3ced4eb5168ae8a7f829403d7e.jpeg" title="March24GCM"><img alt="March24GCM" height="388" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/4/7/47238db6ac84cd3ced4eb5168ae8a7f829403d7e_2_690x388.jpeg" width="690" /></a></div><br /> +<a href="https://community.gazebosim.org/t/community-meeting-from-lidar-slam-to-full-scale-autonomy-and-beyond/2622">We’ve lined up a phenomenal guest for our next Gazebo Community Meeting; Ji Zhang from Carnegie Mellon will be speaking about his work integrating ROS, Gazebo, and a variety of LIDAR-based SLAM techniques. 
</a><p></p> +<h1><a class="anchor" href="https://discourse.ros.org#events-2" name="events-2"></a>Events</h1> +<ul> +<li><a href="https://online-learning.tudelft.nl/courses/hello-real-world-with-ros-robot-operating-systems/">ONGOING: TU Delft ROS MOOC (FREE!)</a></li> +<li><a href="https://www.meetup.com/ros-by-the-bay/events/299684887/">2024-03-21 ROS By The Bay</a></li> +<li><a href="https://partiful.com/e/0Z53uQxTyXNDOochGBgV">2024-03-23 Robo Hackathon @ Circuit Launch (Bay Area)</a></li> +<li><a href="https://discourse.ros.org/t/cracow-robotics-ai-club-8/36634">2024-03-25 Robotics &amp; AI Meetup Krakow</a></li> +<li>NEW: <a href="https://www.eventbrite.com/e/ros-meetup-san-antonio-tickets-858425041407?aff=oddtdtcreator">2024-03-26 ROS Meetup San Antonio, Texas</a></li> +<li><a href="https://community.gazebosim.org/t/community-meeting-from-lidar-slam-to-full-scale-autonomy-and-beyond/2622">2024-03-27 Gazebo Community Meeting: CMU LIDAR SLAM Expert</a></li> +<li><a href="https://rosindustrial.org/events/2024/ros-industrial-consortium-americas-2024-annual-meeting">2024-03-27 ROS Industrial Annual Meeting</a></li> +<li><a href="https://www.eventbrite.com/e/robotics-and-tech-happy-hour-tickets-835278218637?aff=soc">2024-03-27 Robotics Happy Hour Pittsburgh</a></li> +<li><a href="https://eurecat.org/serveis/eurecatevents/rosmeetupbcn/">2024-04-03 ROS Meetup Barcelona</a></li> +<li><a href="https://www.eventbrite.com/e/robot-block-party-2024-registration-855602328597?aff=oddtdtcreator">2024-04-06 Silicon Valley Robotics Robot Block Party</a></li> +<li><a href="https://www.zephyrproject.org/zephyr-developer-summit-2024/?utm_campaign=Zephyr%20Developer%20Summit&amp;utm_content=285280528&amp;utm_medium=social&amp;utm_source=linkedin&amp;hss_channel=lcp-27161269">2024-04-16 → 2024-04-18 Zephyr Developer Summit</a></li> +<li><a href="https://www.eventbrite.com/e/robots-on-ice-40-2024-tickets-815030266467">2024-04-21 Robots on Ice</a></li> +<li><a 
href="https://2024.oshwa.org/">2024-05-03 Open Hardware Summit Montreal</a></li> +<li><a href="https://www.tum-venture-labs.de/education/bots-bento-the-first-robotic-pallet-handler-competition-icra-2024/">2024-05-13 Bots &amp; Bento @ ICRA 2024</a></li> +<li><a href="https://cpsweek2024-race.f1tenth.org/">2024-05-14 → 2024-05-16 F1Tenth Autonomous Grand Prix @ CPS-IoT Week</a></li> +<li><a href="https://discourse.ros.org/t/acm-sigsoft-summer-school-on-software-engineering-for-robotics-in-brussels/35992">2024-06-04 → 2024-06-08 ACM SIGSOFT Summer School on Software Engineering for Robotics</a></li> +<li><a href="https://www.agriculture-vision.com/">2024-06-?? Workshop on Agriculture Vision at CVPR 2024</a></li> +<li><a href="https://icra2024.rt-net.jp/archives/87">2024-06-16 Food Topping Challenge at ICRA 2024</a></li> +<li><a href="https://discourse.ros.org/t/roscon-france-2024/35222">2024-06-19 → 2024-06-20 ROSCon France</a></li> +<li><a href="https://sites.udel.edu/ceoe-able/able-summer-bootcamp/">2024-08-05 Autonomous Systems Bootcamp at Univ.
Delaware</a> – <a href="https://www.youtube.com/watch?v=4ETDsBN2o8M">Video</a></li> +<li><a href="https://roscon.jp/2024/">2024-09-25 ROSConJP</a></li> +<li><a href="https://fira-usa.com/">2024-10-22 → 2024-10-24 AgRobot FIRA in Sacramento</a></li> +</ul> +<h1><a class="anchor" href="https://discourse.ros.org#news-3" name="news-3"></a>News</h1> +<ul> +<li><a href="https://discourse.ros.org/t/foxglove-2-0-integrated-ui-new-pricing-and-open-source-changes/36583">Foxglove 2.0: Integrated UI, New Price, Less Open Source</a> – <a href="https://www.therobotreport.com/foxglove-launches-upgraded-platform-with-enhanced-observability/">Robot Report</a></li> +<li><a href="https://www.bearrobotics.ai/blog/bear-robotics-secures-60m-series-c-funding-led-by-lg-electronics">LG Leads $60M Series C for Bear Robotics</a> – <a href="https://techcrunch.com/2024/03/12/bear-robotics-a-robot-waiter-startup-just-picked-up-60m-from-lg/">TechCrunch</a> – <a href="https://www.therobotreport.com/lg-makes-strategic-investment-in-bear-robotics/">Robot Report</a></li> +<li><a href="https://dronecode.org/the-2023-year-in-review/">Dronecode Annual Report</a></li> +<li><a href="https://www.ieee-ras.org/educational-resources-outreach/technical-education-programs"><img alt=":cool:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/cool.png?v=12" title=":cool:" width="20" /> Want to run a ROS Summer School? Get $25k from IEEE-RAS! 
</a></li> +<li><a href="https://www.youtube.com/watch?v=EZm_kWPMq0Q">YOKOHAMA GUNDAM FACTORY!</a></li> +<li><a href="https://hackaday.com/2024/03/09/rosie-the-robot-runs-for-real/">Actual Rosie Robot</a></li> +<li><a href="https://techcrunch.com/2024/03/14/humanoid-robots-face-continued-skepticism-at-modex/">Modex Skeptical of Humanoids</a> – <a href="https://techcrunch.com/2024/03/11/the-loneliness-of-the-robotic-humanoid/">See also: Digit only Humanoid at Modex</a></li> +<li><a href="https://techcrunch.com/2024/03/13/behold-truckbot/">Behold Truckbot</a></li> +<li><a href="https://techcrunch.com/2024/03/13/cyphers-inventory-drone-launches-from-an-autonomous-mobile-robot-base/">AMR + Drone for Inventory at Modex</a></li> +<li><a href="https://techcrunch.com/2024/03/12/locus-robotics-success-is-a-tale-of-focusing-on-what-works/">Locus Robotics’ success is a tale of focusing on what works</a></li> +<li><a href="https://www.therobotreport.com/afara-launches-autonomous-picker-to-clean-up-after-cotton-harvest/">Afara launches autonomous picker to clean up after cotton harvest</a></li> +<li><a href="https://spectrum.ieee.org/covariant-foundation-model">Covariant Announces a Universal AI Platform for Robots</a></li> +<li><a href="https://dex-cap.github.io/">DexCap: Scalable and Portable Mocap Data Collection System for Dexterous Manipulation – open hardware</a></li> +<li><a href="https://techcrunch.com/2024/03/15/these-61-robotics-companies-are-hiring/">Who’s Hiring Robotics</a></li> +</ul> +<h1><a class="anchor" href="https://discourse.ros.org#ros-4" name="ros-4"></a>ROS</h1> +<ul> +<li><a href="https://discourse.ros.org/t/fresh-edition-of-the-ros-mooc-from-tudelft/36633"><img alt=":cool:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/cool.png?v=12" title=":cool:" width="20" /> <img alt=":new:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/new.png?v=12" title=":new:" width="20" /> TU-Delft ROS MOOC</a></li> +<li><a 
href="https://discourse.ros.org/t/new-packages-for-ros-2-rolling-ridley-2024-03-13/36626">Rolling Ridley now Runs on 24.04 – 1416 Updated Packages <img alt=":tada:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/tada.png?v=12" title=":tada:" width="20" /></a></li> +<li><a href="https://discourse.ros.org/t/new-packages-for-noetic-2024-03-07/36514">10 New and 46 Updated Packages for Noetic</a></li> +<li><a href="https://discourse.ros.org/t/new-packages-for-iron-irwini-2024-03-11/36560">1 New and 82 Updated Packages for Iron Irwini</a></li> +<li><a href="https://www.baslerweb.com/en/software/pylon/camera-driver-ros/"><img alt=":cool:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/cool.png?v=12" title=":cool:" width="20" /> Pylon Basler Camera Driver for ROS 2</a> – <a href="https://github.com/basler/pylon-ros-camera">source</a> – <a href="https://www2.baslerweb.com/en/downloads/document-downloads/interfacing-basler-cameras-with-ros-2/">docs</a></li> +<li><a href="https://discourse.ros.org/t/march-2024-meetings-aerial-robotics/36495">Aerial Robotics Meetings for March</a></li> +<li><a href="https://vimeo.com/923208013?share=copy">Interop SIG: Standardizing Infrastructure Video</a></li> +<li><a href="https://discourse.ros.org/t/teleop-keyboard-node-in-rust/36555">Keyboard Teleop in Rust <img alt=":crab:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/crab.png?v=12" title=":crab:" width="20" /></a></li> +<li><a href="https://discourse.ros.org/t/ros-2-and-large-data-transfer-on-lossy-networks/36598">ROS 2 and Large Data Transfer on Lossy Networks</a></li> +<li><a href="https://discourse.ros.org/t/cloud-robotics-wg-strategy-next-meeting-announcement/36604">Cloud Robotics WG Next Meeting</a></li> +<li><a href="https://discourse.ros.org/t/announcing-open-sourcing-of-ros-2-task-manager/36572">ROS 2 Task Manager</a></li> +<li><a 
href="https://github.com/jsk-ros-pkg/jsk_3rdparty/tree/master/switchbot_ros">SwitchBot ROS Package</a></li> +<li><a href="https://www.behaviortree.dev/docs/category/tutorials-advanced/">New Advanced Behavior Tree Tutorials</a></li> +<li><a href="https://github.com/ToyotaResearchInstitute/gauges2">TRI ROS 2 Gauges Package</a></li> +<li><a href="https://haraduka.github.io/continuous-state-recognition/">Continuous Object State Recognition for Cooking Robots</a></li> +<li><a href="https://www.youtube.com/watch?v=lTew9mbXrAs">ROS Python PyCharm Setup Guide</a></li> +<li><a href="https://github.com/MJavadZallaghi/ros2webots">ROS 2 Webots Starter Code</a></li> +<li><a href="https://github.com/uos/ros2_tutorial">Osnabrück University KBS Robotics Tutorial</a></li> +<li><a href="https://github.com/ika-rwth-aachen/etsi_its_messages">ROS Package for ETSI ITS Messages for V2X Comms</a></li> +<li><a href="https://github.com/suchetanrs/ORB-SLAM3-ROS2-Docker">ORB-SLAM3 ROS 2 Docker Container</a></li> +<li><a href="https://www.youtube.com/watch?v=TWTDPilQ8q0&amp;t=8s">Factory Control System from Scratch in Rust <img alt=":crab:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/crab.png?v=12" title=":crab:" width="20" /></a></li> +<li><a href="https://www.youtube.com/watch?v=sAkrG_WBqyc">ROS + QT-Creator (Arabic)</a></li> +<li><a href="https://www.allegrohand.com/">Dexterous Hand that Runs ROS</a></li> +<li><a href="https://discourse.ros.org/t/cobot-magic-agilex-achieved-the-whole-process-of-mobile-aloha-model-training-in-both-the-simulation-and-real-environment/36644">Cobot Magic: AgileX achieved the whole process of Mobile Aloha model training in both the simulation and real environment</a></li> +</ul> +<h1><a class="anchor" href="https://discourse.ros.org#ros-questions-5" name="ros-questions-5"></a>ROS Questions</h1> +<p>Got a minute? 
<a href="https://robotics.stackexchange.com/">Please take some time to answer questions on Robotics Stack Exchange!</a></p> + <p><small>1 post - 1 participant</small></p> + <p><a href="https://discourse.ros.org/t/ros-news-for-the-week-of-march-11th-2024/36651">Read full topic</a></p> + Fri, 15 Mar 2024 15:33:56 +0000 + + + ROS Discourse General: ROSCon 2024 Call for Proposals Now Open + discourse.ros.org-topic-36624 + https://discourse.ros.org/t/roscon-2024-call-for-proposals-now-open/36624 + <h1><a class="anchor" href="https://discourse.ros.org#roscon-2024-call-for-proposals-1" name="roscon-2024-call-for-proposals-1"></a>ROSCon 2024 Call for Proposals</h1> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/e/5/e501ec44fb4eefddc8e5d1f1334345b72b815ed1.png" title="ROSCon_2024_transparent"><img alt="ROSCon_2024_transparent" height="500" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/e/5/e501ec44fb4eefddc8e5d1f1334345b72b815ed1_2_545x500.png" width="545" /></a></div><p></p> +<p>Hi Everyone,</p> +<p>The ROSCon call for proposals is now open! You can find full proposal details on the <a href="http://roscon.ros.org/#call-for-proposals">ROSCon website</a>. ROSCon Workshop proposals are due by <span class="discourse-local-date">2024-05-08T06:59:00Z UTC</span> and can be submitted using this <a href="https://docs.google.com/forms/d/e/1FAIpQLSeciW0G6_bvlH_AL7mJrERBiajnUqnq1yO3z1rgzeb-O2hZxw/viewform?usp=header_link">Google Form</a>. ROSCon talks are due by <span class="discourse-local-date">2024-06-04T06:59:00Z UTC</span> and you can submit your proposals using <a href="https://roscon2024.hotcrp.com/">Hot CRP</a>. Please note that you’ll need a HotCRP account to submit your talk proposal. 
We plan to post the accepted workshops on or around <span class="discourse-local-date">2024-07-08T07:00:00Z UTC</span> and the accepted talks on or around <span class="discourse-local-date">2024-07-15T07:00:00Z UTC</span> respectively. If you think you will need financial assistance to attend ROSCon, and you meet the qualifications, please apply for our <a href="https://docs.google.com/forms/d/e/1FAIpQLSfJYMAT8wXjFp6FjMMTva_bYoKhZtgRy7P9540e6MX94PgzPg/viewform?fbzx=-7920629384650366975">Diversity Scholarship Program</a> as soon as possible. Diversity Scholarship applications are due on <span class="discourse-local-date">2024-04-06T06:59:00Z UTC</span>, well before the CFP deadlines or final speakers are announced. Questions and concerns about the ROSCon CFP can be directed to the ROSCon executive committee (<a href="mailto:roscon-2024-ec@openrobotics.org">roscon-2024-ec@openrobotics.org</a>) or posted in this thread.</p> +<p>We recommend you start planning your talk early and take the time to workshop your submission with your friends and colleagues. You are more than welcome to use this Discourse thread and the <a href="https://discord.com/channels/1077825543698927656/1208998489154129920">#roscon-2024 channel on the ROS Discord</a> to workshop ideas and organize collaborators.</p> +<p>Finally, I want to take a moment to recognize this year’s ROSCon Program Co-Chairs <a class="mention" href="https://discourse.ros.org/u/ingo_lutkebohle">@Ingo_Lutkebohle</a> and <a class="mention" href="https://discourse.ros.org/u/yadunund">@Yadunund</a>, along with a very long list of talk reviewers who are still being finalized. Reviewing talk proposals is a fairly tedious task, and ROSCon wouldn’t happen without the efforts of our volunteers. 
If you happen to run into any of them at ROSCon, please thank them for their service to the community.</p> +<h2><a class="anchor" href="https://discourse.ros.org#talk-and-workshop-ideas-for-roscon-2024-2" name="talk-and-workshop-ideas-for-roscon-2024-2"></a>Talk and Workshop Ideas for ROSCon 2024</h2> +<p>If you’ve never been to ROSCon, but would like to submit a talk or workshop proposal, we recommend you take a look at the <a href="https://roscon.ros.org/2024/#archive">archive of previous ROSCon talks</a>. Another good resource to consider is the list of frequently discussed topics on ROS Discourse and Robotics Stack Exchange. <a href="https://discourse.ros.org/t/2023-ros-metrics-report/35837">In last year’s metrics report</a> I included a list of frequently asked topic tags from Robotics Stack Exchange that might be helpful. Aside from code, we really want to see your robots! We want to see your race cars, mining robots, moon landers, maritime robots, development boards, and factories, and hear about the lessons you learned from making them happen. If you organize a working group, run a local meetup, or maintain a larger package, we want to hear about your big wins in the past year.</p> +<p>While we can suggest a few ideas for talks and workshops that we would like to see at ROSCon 2024, what we really want is to hear from the community about topic areas that you think are important. <em><strong>If there is a talk you would like to see at ROSCon 2024, consider proposing that topic in the comments below.</strong></em> Feel free to write a whole list! Some of our most memorable talks have been ten-minute overviews of key ROS subsystems that everyone uses. 
If you think a half-hour talk about writing a custom ROS 2 executor and benchmarking its performance would be helpful, please say so!</p> + <p><small>1 post - 1 participant</small></p> + <p><a href="https://discourse.ros.org/t/roscon-2024-call-for-proposals-now-open/36624">Read full topic</a></p> + Fri, 15 Mar 2024 15:19:51 +0000 + + + ROS Discourse General: Cobot Magic: AgileX achieved the whole process of Mobile Aloha model training in both the simulation and real environment + discourse.ros.org-topic-36644 + https://discourse.ros.org/t/cobot-magic-agilex-achieved-the-whole-process-of-mobile-aloha-model-training-in-both-the-simulation-and-real-environment/36644 + <h1><a class="anchor" href="https://discourse.ros.org#cobot-magic-agilex-achieved-the-whole-process-of-mobile-aloha-model-training-in-both-the-simulation-and-real-environment-1" name="cobot-magic-agilex-achieved-the-whole-process-of-mobile-aloha-model-training-in-both-the-simulation-and-real-environment-1"></a><strong>Cobot Magic: AgileX achieved the whole process of Mobile Aloha model training in both the simulation and real environment</strong></h1> +<p>Mobile Aloha is a whole-body remote operation data collection system developed by Zipeng Fu, Tony Z. Zhao, and Chelsea Finn from Stanford University. <a href="https://mobile-aloha.github.io/" rel="noopener nofollow ugc">link.</a></p> +<p>Based on Mobile Aloha, AgileX developed Cobot Magic, which supports the complete Mobile Aloha codebase with a higher-specification, lower-cost setup, and is equipped with larger-payload robotic arms and high-computing-power industrial computers. 
For more details about Cobot Magic please check the <a href="https://global.agilex.ai/" rel="noopener nofollow ugc">AgileX website </a>.</p> +<p>Currently, AgileX has successfully completed the integration of Cobot Magic based on the Mobile Aloha source code project.<br /> +<img alt="inference" class="animated" height="400" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/4/f/4f9834cff531f45ab648f7db0a7142ee080270af.gif" width="424" /></p> +<h1><a class="anchor" href="https://discourse.ros.org#simulation-data-training-2" name="simulation-data-training-2"></a><strong>Simulation data training</strong></h1> +<h1><a class="anchor" href="https://discourse.ros.org#data-collection-3" name="data-collection-3"></a><strong>Data collection</strong></h1> +<p>After setting up the Mobile Aloha software environment (mentioned in the last section), model training in the simulation environment and real environment can be achieved. The following is the data collection part of the simulation environment. The data is provided by the team of Zipeng Fu, Tony Z. Zhao, and Chelsea Finn. You can find all scripted/human demos for the simulated environments <a href="https://drive.google.com/drive/folders/1gPR03v05S1xiInoVJn7G7VJ9pDCnxq9O" rel="noopener nofollow ugc">here</a>.</p> +<p>After downloading, copy it to the act-plus-plus/data directory. The directory structure is as follows:</p> +<pre><code class="lang-auto">act-plus-plus/data + ├── sim_insertion_human + │ ├── sim_insertion_human-20240110T054847Z-001.zip + ├── ... + ├── sim_insertion_scripted + │ ├── sim_insertion_scripted-20240110T054854Z-001.zip + ├── ... + ├── sim_transfer_cube_human + │ ├── sim_transfer_cube_human-20240110T054900Z-001.zip + │ ├── ... + └── sim_transfer_cube_scripted + ├── sim_transfer_cube_scripted-20240110T054901Z-001.zip + ├── ... +</code></pre> +<p>Generate episodes and render the result graph. 
The terminal displays 10 episodes and 2 successful ones.</p> +<pre><code class="lang-auto"># 1 Run +python3 record_sim_episodes.py --task_name sim_transfer_cube_scripted --dataset_dir &lt;data save dir&gt; --num_episodes 50 + +# 2 Take sim_transfer_cube_scripted as an example +python3 record_sim_episodes.py --task_name sim_transfer_cube_scripted --dataset_dir data/sim_transfer_cube_scripted --num_episodes 10 + +# 2.1 Real-time rendering +python3 record_sim_episodes.py --task_name sim_transfer_cube_scripted --dataset_dir data/sim_transfer_cube_scripted --num_episodes 10 --onscreen_render + +# 2.2 The output in the terminal shows +ube_scripted --num_episodes 10 +episode_idx=0 +Rollout out EE space scripted policy +episode_idx=0 Failed +Replaying joint commands +episode_idx=0 Failed +Saving: 0.9 secs + +episode_idx=1 +Rollout out EE space scripted policy +episode_idx=1 Successful, episode_return=57 +Replaying joint commands +episode_idx=1 Successful, episode_return=59 +Saving: 0.6 secs +... +Saved to data/sim_transfer_cube_scripted +Success: 2 / 10 +</code></pre> +<p>The loaded image renders as follows:<br /> +</p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/b/5/b52f830cdca421a0a4960f61c81219922df8668d.png" rel="noopener nofollow ugc" title="1"><img alt="1" height="500" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/b/5/b52f830cdca421a0a4960f61c81219922df8668d_2_655x500.png" width="655" /></a></div><p></p> +<h1><a class="anchor" href="https://discourse.ros.org#data-visualization-4" name="data-visualization-4"></a>Data Visualization</h1> +<p>Visualize simulation data. 
The following figures show the images of episode 0 and episode 9 respectively.</p> +<p>The episode 0 screen in the data set is as follows, showing a case where the gripper fails to pick up.</p> +<p><img alt="episode0" class="animated" height="230" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/1/f/1f1e94f75c5ff731886fbf069597af5dfe0137cf.gif" width="690" /></p> +<p>The visualization of the data of episode 9 shows a successful case of gripping.</p> +<p><img alt="episode19" class="animated" height="230" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/0/9/09268db2b7338acfb94096bbd25f139a3a932006.gif" width="690" /></p> +<p>Print the data of each joint of the robotic arm in the simulation environment. Joints 0-13 are the data of the 14 degrees of freedom of the robot arms and grippers.</p> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/d/4/d4620062ddee3643956b6bef2cf4aed3728a6aec.png" rel="noopener nofollow ugc" title="episode-qpos"><img alt="episode-qpos" height="500" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/d/4/d4620062ddee3643956b6bef2cf4aed3728a6aec_2_250x500.png" width="250" /></a></div><p></p> +<h1><a class="anchor" href="https://discourse.ros.org#model-training-and-inference-5" name="model-training-and-inference-5"></a><strong>Model training and inference</strong></h1> +<p>The simulated-environment datasets must be downloaded first (see Data Collection).</p> +<pre><code class="lang-auto">python3 imitate_episodes.py --task_name sim_transfer_cube_scripted --ckpt_dir &lt;ckpt dir&gt; --policy_class ACT --kl_weight 10 --chunk_size 100 --hidden_dim 512 --batch_size 8 --dim_feedforward 3200 --num_epochs 2000 --lr 1e-5 --seed 0 + +# run +python3 imitate_episodes.py --task_name sim_transfer_cube_scripted --ckpt_dir trainings --policy_class ACT --kl_weight 1 --chunk_size 10 --hidden_dim 512 --batch_size 1 
--dim_feedforward 3200 --lr 1e-5 --seed 0 --num_steps 2000 + +# During training, you will be prompted with the following content. Since you do not have a W&amp;B account, choose 3 directly. +wandb: (1) Create a W&amp;B account +wandb: (2) Use an existing W&amp;B account +wandb: (3) Don't visualize my results +wandb: Enter your choice: +</code></pre> +<p>After training is completed, the weights will be saved to the trainings directory. The results are as follows:</p> +<pre><code class="lang-auto">trainings + ├── config.pkl + ├── dataset_stats.pkl + ├── policy_best.ckpt + ├── policy_last.ckpt + └── policy_step_0_seed_0.ckpt +</code></pre> +<p>Evaluate the model trained above:</p> +<pre><code class="lang-auto"># 1 Evaluate the policy; add the --onscreen_render parameter for real-time rendering +python3 imitate_episodes.py --eval --task_name sim_transfer_cube_scripted --ckpt_dir trainings --policy_class ACT --kl_weight 1 --chunk_size 10 --hidden_dim 512 --batch_size 1 --dim_feedforward 3200 --lr 1e-5 --seed 0 --num_steps 20 --onscreen_render +</code></pre> +<p>The rendered picture is shown below.</p> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/2/d/2dfdf7294ff8c2b78a434ee0fe315b8e9f252a49.png" rel="noopener nofollow ugc" title="2"><img alt="2" height="500" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/2/d/2dfdf7294ff8c2b78a434ee0fe315b8e9f252a49_2_661x500.png" width="661" /></a></div><p></p> +<h1><a class="anchor" href="https://discourse.ros.org#data-training-in-real-environment-6" name="data-training-in-real-environment-6"></a><strong>Data Training in real environment</strong></h1> +<h1><a class="anchor" href="https://discourse.ros.org#data-collection-7" name="data-collection-7"></a><strong>Data Collection</strong></h1> +<p>1. Environment dependency</p> +<p>1.1 ROS dependency</p> +<p>● Default: the Ubuntu 20.04 + ROS Noetic environment has been configured</p> +<pre><code 
class="lang-auto">sudo apt install ros-$ROS_DISTRO-sensor-msgs ros-$ROS_DISTRO-nav-msgs ros-$ROS_DISTRO-cv-bridge +</code></pre> +<p>1.2 Python dependency</p> +<pre><code class="lang-auto"># Enter the current workspace directory and install the dependencies in the requirements file. +pip install -r requiredments.txt +</code></pre> +<p>2. Data collection</p> +<p>2.1 Run ‘collect_data’</p> +<pre><code class="lang-auto">python collect_data.py -h # see parameters +python collect_data.py --max_timesteps 500 --episode_idx 0 +python collect_data.py --max_timesteps 500 --is_compress --episode_idx 0 +python collect_data.py --max_timesteps 500 --use_depth_image --episode_idx 1 +python collect_data.py --max_timesteps 500 --is_compress --use_depth_image --episode_idx 1 +</code></pre> +<p>After the data collection is completed, it will be saved in the ${dataset_dir}/{task_name} directory.</p> +<pre><code class="lang-auto">python collect_data.py --max_timesteps 500 --is_compress --episode_idx 0 +# Generates dataset episode_0.hdf5. The structure is: + +collect_data + ├── collect_data.py + ├── data # --dataset_dir + │ └── cobot_magic_agilex # --task_name + │ ├── episode_0.hdf5 # The location of the generated data set file + ├── episode_idx.hdf5 # idx depends on --episode_idx + └── ... 
+ ├── readme.md + ├── replay_data.py + ├── requiredments.txt + └── visualize_episodes.py +</code></pre> +<p>The specific parameters are shown:</p> +<div class="md-table"> +<table> +<thead> +<tr> +<th>Name</th> +<th>Explanation</th> +</tr> +</thead> +<tbody> +<tr> +<td>dataset_dir</td> +<td>Data set saving path</td> +</tr> +<tr> +<td>task_name</td> +<td>task name, as the file name of the data set</td> +</tr> +<tr> +<td>episode_idx</td> +<td>Action block index number</td> +</tr> +<tr> +<td>max_timesteps</td> +<td>The number of time steps for the maximum action block</td> +</tr> +<tr> +<td>camera_names</td> +<td>Camera names, default [‘cam_high’, ‘cam_left_wrist’, ‘cam_right_wrist’]</td> +</tr> +<tr> +<td>img_front_topic</td> +<td>Camera 1 Color Picture Topic</td> +</tr> +<tr> +<td>img_left_topic</td> +<td>Camera 2 Color Picture Topic</td> +</tr> +<tr> +<td>img_right_topic</td> +<td>Camera 3 Color Picture Topic</td> +</tr> +<tr> +<td>use_depth_image</td> +<td>Whether to use depth information</td> +</tr> +<tr> +<td>depth_front_topic</td> +<td>Camera 1 depth map topic</td> +</tr> +<tr> +<td>depth_left_topic</td> +<td>Camera 2 depth map topic</td> +</tr> +<tr> +<td>depth_right_topic</td> +<td>Camera 3 depth map topic</td> +</tr> +<tr> +<td>master_arm_left_topic</td> +<td>Left main arm topic</td> +</tr> +<tr> +<td>master_arm_right_topic</td> +<td>Right main arm topic</td> +</tr> +<tr> +<td>puppet_arm_left_topic</td> +<td>Left puppet arm topic</td> +</tr> +<tr> +<td>puppet_arm_right_topic</td> +<td>Right puppet arm topic</td> +</tr> +<tr> +<td>use_robot_base</td> +<td>Whether to use mobile base information</td> +</tr> +<tr> +<td>robot_base_topic</td> +<td>Mobile base topic</td> +</tr> +<tr> +<td>frame_rate</td> +<td>Acquisition frame rate. 
Because the camera outputs a stable 30 frames per second, the default is 30</td> +</tr> +<tr> +<td>is_compress</td> +<td>Whether the image is compressed and saved</td> +</tr> +</tbody> +</table> +</div><p>The picture of data collection from the camera perspective is as follows:</p> +<p><img alt="data collection" class="animated" height="387" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/0/2/02c868b09ce46587de9150e9d6c09c62a5719a9a.gif" width="690" /></p> +<h1><a class="anchor" href="https://discourse.ros.org#data-visualization-8" name="data-visualization-8"></a><strong>Data visualization</strong></h1> +<p>Run the following code:</p> +<pre><code class="lang-auto">python visualize_episodes.py --dataset_dir ./data --task_name cobot_magic_agilex --episode_idx 0 +</code></pre> +<p>Visualize the collected data. <code>--dataset_dir</code>, <code>--task_name</code> and <code>--episode_idx</code> need to be the same as when ‘collecting data’. When you run the above code, the terminal will print the action and display a color image window. The visualization results are as follows:</p> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/7/f/7f33bb7c245190e4b69c5871d0300c3019215a89.jpeg" rel="noopener nofollow ugc" title="733bfc3a250f3d9f0a919d8f447421cb"><img alt="733bfc3a250f3d9f0a919d8f447421cb" height="316" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/7/f/7f33bb7c245190e4b69c5871d0300c3019215a89_2_690x316.jpeg" width="690" /></a></div><p></p> +<p>After the operation is completed, episode${idx}_qpos.png, episode${idx}_base_action.png and episode${idx}_video.mp4 files will be generated under ${dataset_dir}/{task_name}. 
The directory structure is as follows:</p> +<pre><code class="lang-auto">collect_data +├── data +│ ├── cobot_magic_agilex +│ │ └── episode_0.hdf5 +│ ├── episode_0_base_action.png # base_action +│ ├── episode_0_qpos.png # qpos +│ └── episode_0_video.mp4 # Color video +</code></pre> +<p>Taking episode 30 as an example, replay the collected episode 30 data. The camera perspective is as follows:</p> +<p><img alt="data visualization" class="animated" height="172" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/e/a/eafb8cd13e73cd06ffacc771589c7106f080a252.gif" width="690" /></p> +<h1><a class="anchor" href="https://discourse.ros.org#model-training-and-inference-9" name="model-training-and-inference-9"></a>Model Training and Inference</h1> +<p>The Mobile Aloha project has studied different strategies for imitation learning, and proposed a Transformer-based action chunking algorithm, ACT (Action Chunking with Transformers). It is essentially an end-to-end policy: it directly maps real-world RGB images to actions, allowing the robot to learn and imitate from the visual input without additional hand-engineered intermediate representations, and it uses action chunks as the prediction unit to predict and integrate accurate and smooth motion trajectories.</p> +<p>The model is as follows:</p> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/a/f/af32cea48cc4e4b04932386d0bc9ec8c32ddce9e.png" rel="noopener nofollow ugc" title="image (1)"><img alt="image (1)" height="174" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/a/f/af32cea48cc4e4b04932386d0bc9ec8c32ddce9e_2_690x174.png" width="690" /></a></div><p></p> +<p>Disassemble and interpret the model.</p> +<ol> +<li>Sample data</li> +</ol> +<p></p><div class="lightbox-wrapper"><a class="lightbox" 
href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/3/1/3123a4970c9d91e665510d39acd191c588f3c216.png" rel="noopener nofollow ugc" title="image (2)"><img alt="image (2)" height="140" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/3/1/3123a4970c9d91e665510d39acd191c588f3c216_2_690x140.png" width="690" /></a></div><p></p> +<p>Input: includes 4 RGB images, each with a resolution of 480 × 640, and the joint positions of the two robot arms (7+7=14 DoF in total)</p> +<p>Output: The action space is the absolute joint positions of the two robots, a 14-dimensional vector. Therefore, with action chunking, the policy outputs a k × 14 tensor given the current observation (each action is defined as a 14-dimensional vector, so k actions are a k × 14 tensor)</p> +<ol start="2"> +<li>Infer Z</li> +</ol> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/2/f/2f72d64dd82d004c926759c64b00b78647d10231.png" rel="noopener nofollow ugc" title="image (3)"><img alt="image (3)" height="215" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/2/f/2f72d64dd82d004c926759c64b00b78647d10231_2_690x215.png" width="690" /></a></div><p></p> +<p>The input to the encoder is a [CLS] token, which consists of randomly initialized learned weights. Through linear layer 2, the joint positions are projected to the embedding dimension (14 dimensions to 512 dimensions) to obtain the embedded joint positions (embedded joints). Through another linear layer, linear layer 1, the k × 14 action sequence is projected to the embedding dimension (k × 14 to k × 512) to obtain the embedded action sequence.</p> +<p>The above three inputs finally form a sequence of (k + 2) × embedding_dimension, that is, (k + 2) × 512, and are processed with the transformer encoder. 
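As a shape check only, the (k + 2) × 512 encoder input described above can be sketched with random stand-in arrays (k = 100 here, and every matrix below is a placeholder for illustration, not the trained model's weights):

```python
import numpy as np

# Shape sketch of the CVAE encoder input: [CLS] + embedded joints + embedded actions.
# k (chunk size) and the 512 embedding dim are the values quoted in the post.
k, emb = 100, 512
rng = np.random.default_rng(0)

cls_tok = rng.standard_normal((1, emb))     # [CLS] token: 1 x 512
joints = rng.standard_normal(14)            # 14-DoF joint positions
W2 = rng.standard_normal((14, emb))         # stand-in for "linear layer 2"
joints_emb = (joints @ W2)[None, :]         # 1 x 512
actions = rng.standard_normal((k, 14))      # k x 14 action sequence
W1 = rng.standard_normal((14, emb))         # stand-in for "linear layer 1"
actions_emb = actions @ W1                  # k x 512

encoder_in = np.vstack([cls_tok, joints_emb, actions_emb])
print(encoder_in.shape)                     # (102, 512), i.e. (k + 2) x 512
```
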
Finally, just take the first output, which corresponds to the [CLS] tag, and use another linear network to predict the mean and variance of the Z distribution, parameterizing it as a diagonal Gaussian distribution. Use reparameterization to obtain samples of Z.</p> +<ol start="3"> +<li>Predict an action sequence</li> +</ol> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/4/1/41d86ea2457b78aa9f7c8d3172130611cc9441e5.jpeg" rel="noopener nofollow ugc" title="image (4)"><img alt="image (4)" height="267" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/4/1/41d86ea2457b78aa9f7c8d3172130611cc9441e5_2_690x267.jpeg" width="690" /></a></div><p></p> +<p>① First, each image observation is processed by ResNet18 to obtain a feature map (15 × 20 × 728), which is then flattened to obtain a feature sequence (300 × 728). These features are projected to the embedding dimension (300 × 512) using linear layer 5, and in order to preserve spatial information, a 2D sinusoidal position embedding is added.</p> +<p>② Secondly, repeat this operation for all 4 images, and the resulting feature sequence dimension is 1200 × 512.</p> +<p>③ Next, the feature sequences from each camera are concatenated and used as one of the inputs of the transformer encoder. 
For the other two inputs, the current joint positions (joints) and the “style variable” z, they are passed through linear layer 6 and linear layer 7 respectively and uniformly projected to 512 from their respective original dimensions (14, 15).</p> +<p>④ Finally, the encoder input of the transformer is 1202×512 (the feature dimension of the 4 images is 1200×512, the feature dimension of the joint positions (joints) is 1×512, and the feature dimension of the style variable z is 1×512).</p> +<p>The input to the transformer decoder has two aspects:</p> +<p>On the one hand, the “query” of the transformer decoder is the first layer of fixed sinusoidal position embeddings, that is, the position embeddings (fixed) shown in the lower right corner of the above figure, whose dimension is k × 512.</p> +<p>On the other hand, the “keys” and “values” in the cross-attention layer of the transformer decoder come from the output of the above-mentioned transformer encoder.</p> +<p>Thereby, the transformer decoder predicts the action sequence given the encoder output.</p> +<p>By collecting data and training the above model, you can observe that the results converge.</p> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/f/c/fcd703b5a444096e904cbd048218f306c61f7964.png" rel="noopener nofollow ugc" title="image-20240314233128053"><img alt="image-20240314233128053" height="500" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/f/c/fcd703b5a444096e904cbd048218f306c61f7964_2_672x500.png" width="672" /></a></div><p></p> +<p>A third view of the model inference results is as follows. 
The robotic arm can infer the movement of placing colored blocks from point A to point B.</p> +<p><img alt="inference" class="animated" height="400" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/4/f/4f9834cff531f45ab648f7db0a7142ee080270af.gif" width="424" /></p> +<h3><a class="anchor" href="https://discourse.ros.org#summary-10" name="summary-10"></a><strong>Summary</strong></h3> +<p>Cobot Magic is a remote whole-body data collection device, developed by AgileX Robotics based on the Mobile Aloha project from Stanford University. With Cobot Magic, AgileX Robotics has successfully run the Stanford laboratory’s open-source Mobile Aloha code in both simulation and real environments.<br /> +AgileX will continue to collect data from various motion tasks based on Cobot Magic for model training and inference. Please stay tuned for updates on <a href="https://github.com/agilexrobotics?tab=repositories" rel="noopener nofollow ugc">Github. </a> And if you are interested in this Mobile Aloha project, join us via this Slack link: <a class="inline-onebox" href="https://join.slack.com/t/mobilealohaproject/shared_invite/zt-2evdxspac-h9QXyigdcrR1TcYsUqTMOw" rel="noopener nofollow ugc">Slack</a>. Let’s talk about our ideas.</p> +<h3><a class="anchor" href="https://discourse.ros.org#about-agilex-11" name="about-agilex-11"></a><strong>About AgileX</strong></h3> +<p>Established in 2016, AgileX Robotics is a leading manufacturer of mobile robot platforms and a provider of unmanned system solutions. The company specializes in independently developed multi-mode wheeled and tracked wire-controlled chassis technology and has obtained multiple international certifications. AgileX Robotics offers users self-developed innovative application solutions such as autonomous driving, mobile grasping, and navigation positioning, helping users in various industries achieve automation. 
Additionally, AgileX Robotics has introduced research and education software and hardware products related to machine learning, embodied intelligence, and visual algorithms. The company works closely with research and educational institutions to promote robotics technology teaching and innovation.</p> + <p><small>1 post - 1 participant</small></p> + <p><a href="https://discourse.ros.org/t/cobot-magic-agilex-achieved-the-whole-process-of-mobile-aloha-model-training-in-both-the-simulation-and-real-environment/36644">Read full topic</a></p> + Fri, 15 Mar 2024 03:07:59 +0000 + + + ROS Discourse General: Cloud Robotics WG Strategy & Next Meeting Announcement + discourse.ros.org-topic-36604 + https://discourse.ros.org/t/cloud-robotics-wg-strategy-next-meeting-announcement/36604 + <p>Hi folks!</p> +<p>I wanted to tell you the results of the Cloud Robotics Working Group meeting from 2024-03-11. We met to discuss the long-term strategy of the group. You can see the full meeting recording on <a href="https://vimeo.com/922530909?share=copy" rel="noopener nofollow ugc">vimeo</a>, with our meeting minutes <a href="https://docs.google.com/document/d/10yT-0DKkrw1gDKGlWKl_c--2yM1b-UOP5rWW73bJuMw" rel="noopener nofollow ugc">here</a> (thanks to Phil Roan for taking minutes at this meeting!).</p> +<p>During the meeting, we went over some definitions of Cloud Robotics, our tenets going forward, and a phased approach of gathering data, analyzing it, and acting on it. We used slides to frame the discussion, which have since been updated based on the discussion and will form the backbone of our discussions going forward. The slide deck is publicly available <a href="https://docs.google.com/presentation/d/1PPBYw7EZNTE8YnGF8CSYQ4DyErXX2sRI" rel="noopener nofollow ugc">here</a>.</p> +<p>Our next meeting will be about how to start collecting the data for the first phase. 
We will hold it <span class="discourse-local-date">2024-03-25T17:00:00Z UTC</span>→<span class="discourse-local-date">2024-03-25T18:00:00Z UTC</span>. If you’d like to join the group, you are welcome to, and you can sign up for our meeting invites at <a href="https://groups.google.com/g/cloud-robotics-working-group-invites" rel="noopener nofollow ugc">this Google Group</a>.</p> +<p>Finally, we will regularly invite members and guests to give talks in our meetings. If you have a topic you’d like to talk about, or would like to invite someone to talk, please use this <a href="https://docs.google.com/spreadsheets/d/1drBcG-CXmX8YxBZuRK8Lr3eTTfqe2p_RF_HlDw4Rj5g/" rel="noopener nofollow ugc">speaker signup sheet</a> to let us know.</p> +<p>Hopefully I’ll see you all in future meetings!</p> + <p><small>6 posts - 4 participants</small></p> + <p><a href="https://discourse.ros.org/t/cloud-robotics-wg-strategy-next-meeting-announcement/36604">Read full topic</a></p> + Tue, 12 Mar 2024 17:33:16 +0000 + + + ROS Discourse General: Foxglove 2.0 - integrated UI, new pricing, and open source changes + discourse.ros.org-topic-36583 + https://discourse.ros.org/t/foxglove-2-0-integrated-ui-new-pricing-and-open-source-changes/36583 + <p>Hi everyone - excited to announce Foxglove 2.0, with a new integrated UI (merging Foxglove Studio and Data Platform), new pricing plans, and open source changes.</p> +<p><img alt=":handshake:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/handshake.png?v=12" title=":handshake:" width="20" /> Streamlined UI for smoother robotics observability<br /> +<img alt=":satellite:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/satellite.png?v=12" title=":satellite:" width="20" /> Automatic data offload through Foxglove Agent<br /> +<img alt=":credit_card:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/credit_card.png?v=12" title=":credit_card:" width="20" /> Updated pricing plans to make 
Foxglove accessible for teams of all sizes<br /> +<img alt=":mag_right:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/mag_right.png?v=12" title=":mag_right:" width="20" /> Changes to our open-source strategy (we’re discontinuing the open source edition of Foxglove Studio)</p> +<p><a href="https://foxglove.dev/blog/foxglove-2-0-unifying-robotics-observability" rel="noopener nofollow ugc">Read the details in our blog post</a>.</p> +<p>Note that Foxglove is still free for academic teams and researchers! If you fall into that category, please <a href="https://foxglove.dev/contact" rel="noopener nofollow ugc">contact us</a> and we can upgrade your account.</p> + <p><small>15 posts - 10 participants</small></p> + <p><a href="https://discourse.ros.org/t/foxglove-2-0-integrated-ui-new-pricing-and-open-source-changes/36583">Read full topic</a></p> + Mon, 11 Mar 2024 19:28:55 +0000 + + + ROS Discourse General: Announcing open sourcing of ROS 2 Task Manager! + discourse.ros.org-topic-36572 + https://discourse.ros.org/t/announcing-open-sourcing-of-ros-2-task-manager/36572 + <p><img alt=":tada:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/tada.png?v=12" title=":tada:" width="20" /> My team and I are happy to announce that we at Karelics have open sourced our ROS 2 Task Manager package. 
This solution allows you to convert your existing ROS actions and services into tasks, offering useful features such as automatic task conflict resolution, the ability to aggregate multiple tasks into larger Missions, and straightforward tracking for active tasks and their results.</p> +<p>Check out the package and examples of its usage with the Nav2 package:<br /> +<img alt=":link:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/link.png?v=12" title=":link:" width="20" /> <a href="https://github.com/Karelics/task_manager" rel="noopener nofollow ugc">https://github.com/Karelics/task_manager</a></p> +<p>For an introduction and deeper insights into our design decisions, see our blog post available at: <a href="https://karelics.fi/task-manager-ros-2-package/" rel="noopener nofollow ugc">https://karelics.fi/task-manager-ros-2-package/</a><br /> +<br /></p> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/6/e/6ec33466cb152ca88bc1d2c9e1a60415db944598.png" rel="noopener nofollow ugc" title="task_manager_overview"><img alt="task_manager_overview" height="464" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/6/e/6ec33466cb152ca88bc1d2c9e1a60415db944598_2_690x464.png" width="690" /></a></div><br /> +<br /><br /> +We firmly believe that this package will prove valuable to the ROS community and accelerate the development of robot systems. 
We are excited to hear your thoughts and feedback on it!<p></p> + <p><small>1 post - 1 participant</small></p> + <p><a href="https://discourse.ros.org/t/announcing-open-sourcing-of-ros-2-task-manager/36572">Read full topic</a></p> + Mon, 11 Mar 2024 12:52:42 +0000 + + + ROS Discourse General: New Packages for Iron Irwini 2024-03-11 + discourse.ros.org-topic-36560 + https://discourse.ros.org/t/new-packages-for-iron-irwini-2024-03-11/36560 + <p>We’re happy to announce <strong>1</strong> new package and <strong>82</strong> updates are now available in ROS 2 Iron Irwini <img alt=":iron:" class="emoji emoji-custom" height="20" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/b/3/b3c1340fc185f5e47c7ec55ef5bb1771802de993.png?v=12" title=":iron:" width="20" /> <img alt=":irwini:" class="emoji emoji-custom" height="20" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/d/2/d2f3dcbdaff6f33258719fe5b8f692594a9feab0.png?v=12" title=":irwini:" width="20" /> . 
This sync was tagged as <a href="https://github.com/ros/rosdistro/blob/iron/2024-03-11/iron/distribution.yaml" rel="noopener nofollow ugc"><code>iron/2024-03-11</code> </a>.</p> +<h2><a class="anchor" href="https://discourse.ros.org#package-updates-for-iron-1" name="package-updates-for-iron-1"></a>Package Updates for iron</h2> +<h3><a class="anchor" href="https://discourse.ros.org#added-packages-1-2" name="added-packages-1-2"></a>Added Packages [1]:</h3> +<ul> +<li>ros-iron-apriltag-detector-dbgsym: 1.2.1-1</li> +</ul> +<h3><a class="anchor" href="https://discourse.ros.org#updated-packages-82-3" name="updated-packages-82-3"></a>Updated Packages [82]:</h3> +<ul> +<li>ros-iron-apriltag-detector: 1.2.0-1 → 1.2.1-1</li> +<li>ros-iron-controller-interface: 3.23.0-1 → 3.24.0-1</li> +<li>ros-iron-controller-interface-dbgsym: 3.23.0-1 → 3.24.0-1</li> +<li>ros-iron-controller-manager: 3.23.0-1 → 3.24.0-1</li> +<li>ros-iron-controller-manager-dbgsym: 3.23.0-1 → 3.24.0-1</li> +<li><a href="http://ros.org/wiki/controller_manager_msgs">ros-iron-controller-manager-msgs</a>: 3.23.0-1 → 3.24.0-1</li> +<li>ros-iron-controller-manager-msgs-dbgsym: 3.23.0-1 → 3.24.0-1</li> +<li>ros-iron-etsi-its-cam-coding: 2.0.0-1 → 2.0.1-1</li> +<li>ros-iron-etsi-its-cam-coding-dbgsym: 2.0.0-1 → 2.0.1-1</li> +<li>ros-iron-etsi-its-cam-conversion: 2.0.0-1 → 2.0.1-1</li> +<li>ros-iron-etsi-its-cam-msgs: 2.0.0-1 → 2.0.1-1</li> +<li>ros-iron-etsi-its-cam-msgs-dbgsym: 2.0.0-1 → 2.0.1-1</li> +<li>ros-iron-etsi-its-coding: 2.0.0-1 → 2.0.1-1</li> +<li>ros-iron-etsi-its-conversion: 2.0.0-1 → 2.0.1-1</li> +<li>ros-iron-etsi-its-conversion-dbgsym: 2.0.0-1 → 2.0.1-1</li> +<li>ros-iron-etsi-its-denm-coding: 2.0.0-1 → 2.0.1-1</li> +<li>ros-iron-etsi-its-denm-coding-dbgsym: 2.0.0-1 → 2.0.1-1</li> +<li>ros-iron-etsi-its-denm-conversion: 2.0.0-1 → 2.0.1-1</li> +<li>ros-iron-etsi-its-denm-msgs: 2.0.0-1 → 2.0.1-1</li> +<li>ros-iron-etsi-its-denm-msgs-dbgsym: 2.0.0-1 → 2.0.1-1</li> +<li>ros-iron-etsi-its-messages: 
2.0.0-1 → 2.0.1-1</li> +<li>ros-iron-etsi-its-msgs: 2.0.0-1 → 2.0.1-1</li> +<li>ros-iron-etsi-its-msgs-utils: 2.0.0-1 → 2.0.1-1</li> +<li>ros-iron-etsi-its-primitives-conversion: 2.0.0-1 → 2.0.1-1</li> +<li>ros-iron-etsi-its-rviz-plugins: 2.0.0-1 → 2.0.1-1</li> +<li>ros-iron-etsi-its-rviz-plugins-dbgsym: 2.0.0-1 → 2.0.1-1</li> +<li>ros-iron-flir-camera-description: 2.0.8-1 → 2.0.8-2</li> +<li>ros-iron-flir-camera-msgs: 2.0.8-1 → 2.0.8-2</li> +<li>ros-iron-flir-camera-msgs-dbgsym: 2.0.8-1 → 2.0.8-2</li> +<li>ros-iron-hardware-interface: 3.23.0-1 → 3.24.0-1</li> +<li>ros-iron-hardware-interface-dbgsym: 3.23.0-1 → 3.24.0-1</li> +<li>ros-iron-hardware-interface-testing: 3.23.0-1 → 3.24.0-1</li> +<li>ros-iron-hardware-interface-testing-dbgsym: 3.23.0-1 → 3.24.0-1</li> +<li><a href="https://github.com/ros-controls/ros2_control/wiki" rel="noopener nofollow ugc">ros-iron-joint-limits</a>: 3.23.0-1 → 3.24.0-1</li> +<li><a href="http://wiki.ros.org/mavros">ros-iron-libmavconn</a>: 2.6.0-1 → 2.7.0-1</li> +<li>ros-iron-libmavconn-dbgsym: 2.6.0-1 → 2.7.0-1</li> +<li><a href="https://mavlink.io/en/" rel="noopener nofollow ugc">ros-iron-mavlink</a>: 2023.9.9-1 → 2024.3.3-1</li> +<li><a href="http://wiki.ros.org/mavros">ros-iron-mavros</a>: 2.6.0-1 → 2.7.0-1</li> +<li>ros-iron-mavros-dbgsym: 2.6.0-1 → 2.7.0-1</li> +<li><a href="http://wiki.ros.org/mavros_extras">ros-iron-mavros-extras</a>: 2.6.0-1 → 2.7.0-1</li> +<li>ros-iron-mavros-extras-dbgsym: 2.6.0-1 → 2.7.0-1</li> +<li><a href="http://wiki.ros.org/mavros_msgs">ros-iron-mavros-msgs</a>: 2.6.0-1 → 2.7.0-1</li> +<li>ros-iron-mavros-msgs-dbgsym: 2.6.0-1 → 2.7.0-1</li> +<li><a href="https://www.mrpt.org/" rel="noopener nofollow ugc">ros-iron-mrpt2</a>: 2.11.9-1 → 2.11.11-1</li> +<li>ros-iron-mrpt2-dbgsym: 2.11.9-1 → 2.11.11-1</li> +<li><a href="https://wiki.ros.org/mvsim">ros-iron-mvsim</a>: 0.8.3-1 → 0.9.1-1</li> +<li>ros-iron-mvsim-dbgsym: 0.8.3-1 → 0.9.1-1</li> +<li><a href="https://github.com/LORD-MicroStrain/ntrip_client" 
rel="noopener nofollow ugc">ros-iron-ntrip-client</a>: 1.2.0-3 → 1.3.0-1</li> +<li>ros-iron-ros2-control: 3.23.0-1 → 3.24.0-1</li> +<li>ros-iron-ros2-control-test-assets: 3.23.0-1 → 3.24.0-1</li> +<li>ros-iron-ros2controlcli: 3.23.0-1 → 3.24.0-1</li> +<li><a href="http://ros.org/wiki/rqt_controller_manager">ros-iron-rqt-controller-manager</a>: 3.23.0-1 → 3.24.0-1</li> +<li><a href="http://introlab.github.io/rtabmap" rel="noopener nofollow ugc">ros-iron-rtabmap</a>: 0.21.3-1 → 0.21.4-1</li> +<li>ros-iron-rtabmap-conversions: 0.21.3-1 → 0.21.4-2</li> +<li>ros-iron-rtabmap-conversions-dbgsym: 0.21.3-1 → 0.21.4-2</li> +<li>ros-iron-rtabmap-dbgsym: 0.21.3-1 → 0.21.4-1</li> +<li>ros-iron-rtabmap-demos: 0.21.3-1 → 0.21.4-2</li> +<li>ros-iron-rtabmap-examples: 0.21.3-1 → 0.21.4-2</li> +<li>ros-iron-rtabmap-launch: 0.21.3-1 → 0.21.4-2</li> +<li>ros-iron-rtabmap-msgs: 0.21.3-1 → 0.21.4-2</li> +<li>ros-iron-rtabmap-msgs-dbgsym: 0.21.3-1 → 0.21.4-2</li> +<li>ros-iron-rtabmap-odom: 0.21.3-1 → 0.21.4-2</li> +<li>ros-iron-rtabmap-odom-dbgsym: 0.21.3-1 → 0.21.4-2</li> +<li>ros-iron-rtabmap-python: 0.21.3-1 → 0.21.4-2</li> +<li>ros-iron-rtabmap-ros: 0.21.3-1 → 0.21.4-2</li> +<li>ros-iron-rtabmap-rviz-plugins: 0.21.3-1 → 0.21.4-2</li> +<li>ros-iron-rtabmap-rviz-plugins-dbgsym: 0.21.3-1 → 0.21.4-2</li> +<li>ros-iron-rtabmap-slam: 0.21.3-1 → 0.21.4-2</li> +<li>ros-iron-rtabmap-slam-dbgsym: 0.21.3-1 → 0.21.4-2</li> +<li>ros-iron-rtabmap-sync: 0.21.3-1 → 0.21.4-2</li> +<li>ros-iron-rtabmap-sync-dbgsym: 0.21.3-1 → 0.21.4-2</li> +<li>ros-iron-rtabmap-util: 0.21.3-1 → 0.21.4-2</li> +<li>ros-iron-rtabmap-util-dbgsym: 0.21.3-1 → 0.21.4-2</li> +<li>ros-iron-rtabmap-viz: 0.21.3-1 → 0.21.4-2</li> +<li>ros-iron-rtabmap-viz-dbgsym: 0.21.3-1 → 0.21.4-2</li> +<li>ros-iron-simple-launch: 1.9.0-1 → 1.9.1-1</li> +<li>ros-iron-spinnaker-camera-driver: 2.0.8-1 → 2.0.8-2</li> +<li>ros-iron-spinnaker-camera-driver-dbgsym: 2.0.8-1 → 2.0.8-2</li> +<li>ros-iron-transmission-interface: 3.23.0-1 → 
3.24.0-1</li> +<li>ros-iron-transmission-interface-dbgsym: 3.23.0-1 → 3.24.0-1</li> +<li><a href="http://wiki.ros.org/ur_client_library">ros-iron-ur-client-library</a>: 1.3.4-1 → 1.3.5-1</li> +<li>ros-iron-ur-client-library-dbgsym: 1.3.4-1 → 1.3.5-1</li> +</ul> +<h3><a class="anchor" href="https://discourse.ros.org#removed-packages-0-4" name="removed-packages-0-4"></a>Removed Packages [0]:</h3> +<p>Thanks to all ROS maintainers who make packages available to the ROS community. The above list of packages was made possible by the work of the following maintainers:</p> +<ul> +<li>Bence Magyar</li> +<li>Bernd Pfrommer</li> +<li>Felix Exner</li> +<li>Jean-Pierre Busch</li> +<li>Jose-Luis Blanco-Claraco</li> +<li>Luis Camero</li> +<li>Mathieu Labbe</li> +<li>Olivier Kermorgant</li> +<li>Rob Fisher</li> +<li>Vladimir Ermakov</li> +</ul> + <p><small>1 post - 1 participant</small></p> + <p><a href="https://discourse.ros.org/t/new-packages-for-iron-irwini-2024-03-11/36560">Read full topic</a></p> + Mon, 11 Mar 2024 01:54:48 +0000 + + + ROS Discourse General: ROS News for the Week of March 4th, 2024 + discourse.ros.org-topic-36532 + https://discourse.ros.org/t/ros-news-for-the-week-of-march-4th-2024/36532 + <h1><a class="anchor" href="https://discourse.ros.org#ros-news-for-the-week-of-march-4th-2024-1" name="ros-news-for-the-week-of-march-4th-2024-1"></a>ROS News for the Week of March 4th, 2024</h1> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/d/3/d3f66abe058d28275dba6cf53c8bdc7afe3b1ccc.jpeg" title="MEETUP SAN ANTONIO (2)"><img alt="MEETUP SAN ANTONIO (2)" height="291" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/d/3/d3f66abe058d28275dba6cf53c8bdc7afe3b1ccc_2_517x291.jpeg" width="517" /></a></div><br /> +I’ve been working with the ROS Industrial team, and the Port of San Antonio, to put together a <a 
href="https://www.eventbrite.com/e/ros-meetup-san-antonio-tickets-858425041407?aff=oddtdtcreator">ROS Meetup in San Antonio / Austin</a> in conjunction with the annual <a href="https://rosindustrial.org/events/2024/ros-industrial-consortium-americas-2024-annual-meeting">ROS Industrial Consortium Meeting.</a> If you are attending the ROS-I meeting make sure you sign up!<p></p> +<br /> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/6/0/60e45baa168f3f7246a0f17cdb3985e476b9cd0f.jpeg" title="Add a heading (3)"><img alt="Add a heading (3)" height="194" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/6/0/60e45baa168f3f7246a0f17cdb3985e476b9cd0f_2_345x194.jpeg" width="345" /></a></div><br /> +Gazebo Classic goes end of life in 2025! To help the community move over to modern Gazebo we’re holding open <a href="https://community.gazebosim.org/t/gazebo-migration-guide-office-hours/2543">Gazebo office hours</a> next Tuesday, March 12th, at 9am PST. If you have questions about the migration process please come by!<p></p> +<br /> +<p><img alt="e1d28e85278dd4e221030828367839e4950b8cf9_2_671x500" height="250" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/4/b/4b596515548ed682aef78e342b55bab8167c62aa.jpeg" width="335" /><br /> +We often get questions about the “best” robot components for a particular application. I really hate answering these questions; my inner engineer just screams, “IT DEPENDS!” Unfortunately, we really don’t have a lot of apples-to-apples data to compare different hardware vendors.</p> +<p>Thankfully <a class="mention" href="https://discourse.ros.org/u/iliao">@iliao</a> is putting in a ton of work to review ten different low cost LIDAR sensors. 
<a href="https://discourse.ros.org/t/fyi-10-low-cost-lidar-lds-interfaced-to-ros2-micro-ros-arduino/36369">Check it out here.</a><br /> +<br /></p> +<p><img alt="teaser3" class="animated" height="108" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/2/c/2c31dbd971221364f6a944324235d66203fb4362.gif" width="405" /><br /> +This week we got a sneak peek at some of the cool CVPR 2024 papers. Check out, <a href="https://rmurai.co.uk/projects/GaussianSplattingSLAM/">“Gaussian Splatting SLAM”, by Hidenobu Matsuki, Riku Murai, Paul H.J. Kelly, Andrew J. Davison</a>, complete with <a href="https://github.com/muskie82/MonoGS">source code</a>.</p> +<br /> +<p><img alt="1aa39368041ea4a73d78470ab0d7441453258cdf_2_353x500" height="375" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/d/d/ddc4e36f6a5ea13ccf25f28256bd8f6bf3b8247a.jpeg" width="264" /><br /> +<a href="https://roscon.fr/">We got our new ROSCon France graphic this week!</a> ROSCon France is currently accepting papers! 
Please consider applying if you speak French!</p> +<h1><a class="anchor" href="https://discourse.ros.org#events-2" name="events-2"></a>Events</h1> +<ul> +<li><a href="https://discourse.ros.org/t/ros-2-rust-meeting-march-11th/36523">2024-03-11 ROS 2 <img alt=":crab:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/crab.png?v=12" title=":crab:" width="20" /> Rust Meeting</a></li> +<li><a href="https://twitter.com/HRI_Conference/status/1765426051503595991">2024-03-12 Queer in Robotics Social @ HRI</a></li> +<li><a href="https://community.gazebosim.org/t/gazebo-migration-guide-office-hours/2543">2024-03-12 Gazebo Migration Office Hours</a></li> +<li><a href="https://online-learning.tudelft.nl/courses/hello-real-world-with-ros-robot-operating-systems/">2024-03-14 TU Delft ROS MOOC (FREE!)</a></li> +<li><img alt=":new:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/new.png?v=12" title=":new:" width="20" /> <a href="https://www.meetup.com/ros-by-the-bay/events/299684887/">2024-03-21 ROS By-The-Bay with Dusty Robotics and Project Q&amp;A Session</a></li> +<li><a href="https://partiful.com/e/0Z53uQxTyXNDOochGBgV">2024-03-23 Robo Hackathon @ Circuit Launch (Bay Area)</a></li> +<li><img alt=":new:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/new.png?v=12" title=":new:" width="20" /> <a href="https://www.eventbrite.com/e/ros-meetup-san-antonio-tickets-858425041407?aff=oddtdtcreator">2024-03-26 ROS Meetup San Antonio, Texas</a></li> +<li><a href="https://rosindustrial.org/events/2024/ros-industrial-consortium-americas-2024-annual-meeting">2024-03-27 ROS Industrial Annual Meeting</a></li> +<li><a href="https://www.eventbrite.com/e/robotics-and-tech-happy-hour-tickets-835278218637?aff=soc">2024-03-27 Robotics Happy Hour Pittsburgh</a></li> +<li><img alt=":new:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/new.png?v=12" title=":new:" width="20" /> <a 
href="https://eurecat.org/serveis/eurecatevents/rosmeetupbcn/">2024-04-03 ROS Meetup Barcelona</a></li> +<li><img alt=":new:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/new.png?v=12" title=":new:" width="20" /> <a href="https://www.eventbrite.com/e/robot-block-party-2024-registration-855602328597?aff=oddtdtcreator">2024-04-06 Silicon Valley Robotics Robot Block Party</a></li> +<li><a href="https://www.zephyrproject.org/zephyr-developer-summit-2024/?utm_campaign=Zephyr%20Developer%20Summit&amp;utm_content=285280528&amp;utm_medium=social&amp;utm_source=linkedin&amp;hss_channel=lcp-27161269">2024-04-16 → 2024-04-18 Zephyr Developer Summit</a></li> +<li><a href="https://www.eventbrite.com/e/robots-on-ice-40-2024-tickets-815030266467">2024-04-21 Robots on Ice</a></li> +<li><a href="https://2024.oshwa.org/">2024-05-03 Open Hardware Summit Montreal</a></li> +<li><a href="https://www.tum-venture-labs.de/education/bots-bento-the-first-robotic-pallet-handler-competition-icra-2024/">2024-05-13 Bots &amp; Bento @ ICRA 2024</a></li> +<li><a href="https://cpsweek2024-race.f1tenth.org/">2024-05-14 → 2024-05-16 F1Tenth Autonomous Grand Prix @ CPS-IoT Week</a></li> +<li><a href="https://discourse.ros.org/t/acm-sigsoft-summer-school-on-software-engineering-for-robotics-in-brussels/35992">2024-06-04 → 2024-06-08 ACM SIGSOFT Summer School on Software Engineering for Robotics</a></li> +<li><a href="https://www.agriculture-vision.com/">2024-06-?? Workshop on Agriculture Vision at CVPR 2024</a></li> +<li><a href="https://icra2024.rt-net.jp/archives/87">2024-06-16 Food Topping Challenge at ICRA 2024</a></li> +<li><a href="https://roscon.fr/">2024-06-19 → 2024-06-20 ROSCon France</a></li> +<li><a href="https://sites.udel.edu/ceoe-able/able-summer-bootcamp/">2024-08-05 Autonomous Systems Bootcamp at Univ. 
Delaware</a> – <a href="https://www.youtube.com/watch?v=4ETDsBN2o8M">Video</a></li> +<li><a href="https://roscon.jp/2024/">2024-09-25 ROSConJP</a></li> +</ul> +<h1><a class="anchor" href="https://discourse.ros.org#news-3" name="news-3"></a>News</h1> +<ul> +<li><a href="https://techcrunch.com/2024/03/06/saildrones-first-aluminum-surveyor-autonomous-vessel-splashes-down-for-navy-testing/">Saildrone’s New Aluminum Surveyor</a></li> +<li><a href="https://techcrunch.com/2024/03/06/amazon-teams-with-recycling-robot-firm-to-track-package-waste/">Glacier Recycling Robot Raises $7.7M</a> – <a href="https://www.therobotreport.com/recycling-automation-startup-glacier-brings-in-7-7m/">Robot Report</a></li> +<li><a href="https://techcrunch.com/2024/03/05/agility-robotics-new-ceo-is-focused-on-the-here-and-now/">New CEO at Agility</a></li> +<li><a href="https://techcrunch.com/2024/02/29/figure-rides-the-humanoid-robot-hype-wave-to-2-6b-valuation-and-openai-collab/">Figure raises $675M for Humanoid Robots</a></li> +<li><a href="https://www.therobotreport.com/rios-intelligent-machines-raises-series-b-funding-starts-rolls-out-mission-control/">RIOS Raises $13M Series B</a></li> +<li><a href="https://www.therobotreport.com/robotics-companies-raised-578m-in-january-2024/">$578M Raised for Robotics in January 2024</a></li> +<li><a href="https://hackaday.com/2024/03/06/the-16-pcb-robot/">$16 PCB Robot</a></li> +<li><a href="https://github.com/muskie82/MonoGS"><img alt=":cool:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/cool.png?v=12" title=":cool:" width="20" /> Gaussian Splatting SLAM source code</a></li> +<li><a href="https://www.therobotreport.com/researchers-develop-interface-for-quadriplegics-to-control-robots/">Researchers develop interface for quadriplegics to control robots</a></li> +<li><a href="https://github.com/Wuziyi616/LEOD">LEOD: Label-Efficient Object Detection for Event Cameras</a></li> +<li><a 
href="https://www.youtube.com/watch?v=uL5ClqHg5Jw">Taylor Alexander on Solar Powered Farming Robots</a></li> +<li><a href="https://discourse.ros.org/t/roscon-france-2024/35222/3">ROSCon France Logo Drops</a></li> +<li><a href="https://www.swri.org/industry/industrial-robotics-automation/blog/making-robot-programming-user-friendly">SwRI Workbench for Offline Robotics Development (SWORD)</a></li> +<li><img alt=":cool:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/cool.png?v=12" title=":cool:" width="20" /> <a href="https://github.com/Romea/cropcraft">Procedural World Generator for Farm Robots</a></li> +<li><a href="https://github.com/ulagbulag/kiss-icp-rs">KISS ICP Odometry in Rust <img alt=":crab:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/crab.png?v=12" title=":crab:" width="20" /></a></li> +<li><a href="https://github.com/princeton-vl/OcMesher?tab=readme-ov-file">View-Dependent Octree-based Mesh Extraction in Unbounded Scenes for Procedural Synthetic Data</a></li> +<li><a href="https://github.com/juanb09111/FinnForest">Woodlands Dataset with Stereo and LIDAR</a></li> +<li><a href="https://github.com/peterstratton/Volume-DROID">Volume-DROID SLAM Source Code</a></li> +<li><a href="https://spectrum.ieee.org/video-friday-human-to-humanoid">Video Friday</a></li> +</ul> +<h1><a class="anchor" href="https://discourse.ros.org#ros-4" name="ros-4"></a>ROS</h1> +<ul> +<li><a href="https://discourse.ros.org/t/releasing-packages-to-integrate-brickpi3-with-ros2/36389">ROS for Lego Mindstorms!</a></li> +<li><a href="https://discourse.ros.org/t/fyi-10-low-cost-lidar-lds-interfaced-to-ros2-micro-ros-arduino/36369"><img alt=":cool:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/cool.png?v=12" title=":cool:" width="20" /> 10+ Low-Cost LIDARs Compared</a></li> +<li><a href="https://discourse.ros.org/t/revival-of-client-library-working-group/36406">Reboot Client Library Working Group?</a></li> +<li><a 
href="https://discourse.ros.org/t/new-packages-for-humble-hawksbill-2024-03-08/36529">13 New and 220 Updated Packages for ROS 2 Humble</a></li> +<li><a href="https://discourse.ros.org/t/ros1-now-is-a-great-time-to-add-catkin-lint-to-your-packages/36521">Now is a Great Time to Add Catkin Lint to Your Package</a></li> +<li><a href="https://discourse.ros.org/t/cobot-magic-mobile-aloha-system-works-on-agilex-robotics-platform/36515">Cobot Magic: Mobile Aloha system works on AgileX Robotics platform</a></li> +<li><a href="https://discourse.ros.org/t/new-packages-for-noetic-2024-03-07/36514">10 New and 46 Updated Packages for ROS 1 Noetic</a></li> +<li><a href="https://discourse.ros.org/t/new-packages-for-ros-2-rolling-ridley-2024-02-28/36358">5 New and 279 Updated Packages for ROS 2 Rolling Ridley (Last 22.04 Update)</a></li> +<li><a href="https://discourse.ros.org/t/potential-humanoid-robotics-monthly-working-group/36426">Humanoid Working Group?</a></li> +<li><a href="https://discourse.ros.org/t/ros-mapping-and-navigation-with-agilex-robotics-limo/36452">New Agile-X LIMO</a></li> +<li><a href="https://discourse.ros.org/t/rosmicropy-graphical-controller-proposal-feedback/36424">ROS MicroPy Graphical Controller</a></li> +<li><a href="https://discourse.ros.org/t/noise-model-for-depth-camera-simulation/36385">Simulating Noise in Depth Cameras</a></li> +<li><a href="https://discourse.ros.org/t/what-are-the-main-challenges-you-faced-in-using-ros2-to-develop-industrial-applications-with-manipulators/36393">What are the main challenges you faced in using ROS2 to develop industrial applications with manipulators? 
</a></li> +<li><a href="https://www.youtube.com/playlist?list=PL8EeqqtDev57JEEs_HL3g9DbAwGkbWmhK">Autoware Foundation General Assembly 2023 Recordings</a></li> +<li><a href="https://arxiv.org/abs/2312.14808">F1Tenth: A Tricycle Model to Accurately Control an Autonomous Racecar with Locked Differential</a></li> +<li><img alt=":cool:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/cool.png?v=12" title=":cool:" width="20" /> <a href="https://arxiv.org/abs/2402.18558">Unifying F1TENTH Autonomous Racing: Survey, Methods and Benchmarks</a> – <a href="https://github.com/BDEvan5/f1tenth_benchmarks">Benchmark Data</a></li> +<li><a href="https://github.com/dimaxano/ros2-lifecycle-monitoring">RViz Plugin for Monitoring Node Life Cycles</a></li> +<li><a href="https://github.com/suchetanrs/ORB-SLAM3-ROS2-Docker">ROS 2 + ORB SLAM 3 Docker Container</a></li> +<li><a href="https://www.youtube.com/@kevinwoodrobot/playlists">Kevin Wood ROS Youtube Videos</a></li> +<li><img alt=":cool:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/cool.png?v=12" title=":cool:" width="20" /> <a href="https://arxiv.org/abs/2402.19341">JPL + ROS: RoadRunner - Learning Traversability Estimation for Autonomous Off-road Driving </a></li> +<li><a href="https://navigation.ros.org/tutorials/docs/integrating_vio.html">Nav2: Using VIO to Augment Robot Odometry</a></li> +<li><a href="https://github.com/MRPT/mvsim">MultiVehicle simulator (MVSim)</a></li> +<li><a href="https://kylew239.github.io/in_progress/crazyflie/">Light Painting with a Drone Swarm</a></li> +<li><a href="https://github.com/TKG-Tou-Kai-Group/CoRE-jp-Isaac-Sim-ROS2-packages">ROS 2 + Isaac Sim Docker (Japanese) </a></li> +<li><a href="https://github.com/husarion/rosbot-telepresence/tree/foxglove">Real-Time Internet Control and Video Streaming with ROSbot 2R / 2 PRO</a></li> +</ul> +<h1><a class="anchor" href="https://discourse.ros.org#ros-questions-5" name="ros-questions-5"></a>ROS Questions</h1> 
+<p>Please make ROS a better project for the next person! Take a moment to answer a question on <a href="https://robotics.stackexchange.com/">Robotics Stack Exchange</a>! Not your thing? <a href="https://github.com/ros2/ros2_documentation">Contribute to the ROS 2 Docs!</a></p> + <p><small>4 posts - 2 participants</small></p> + <p><a href="https://discourse.ros.org/t/ros-news-for-the-week-of-march-4th-2024/36532">Read full topic</a></p> + Fri, 08 Mar 2024 21:50:00 +0000 + + + ROS Discourse General: New packages for Humble Hawksbill 2024-03-08 + discourse.ros.org-topic-36529 + https://discourse.ros.org/t/new-packages-for-humble-hawksbill-2024-03-08/36529 + <h2><a class="anchor" href="https://discourse.ros.org#package-updates-for-humble-1" name="package-updates-for-humble-1"></a>Package Updates for Humble</h2> +<h3><a class="anchor" href="https://discourse.ros.org#added-packages-13-2" name="added-packages-13-2"></a>Added Packages [13]:</h3> +<ul> +<li>ros-humble-apriltag-detector-dbgsym: 1.1.1-1</li> +<li>ros-humble-caret-analyze-cpp-impl: 0.5.0-5</li> +<li>ros-humble-caret-analyze-cpp-impl-dbgsym: 0.5.0-5</li> +<li><a href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-ds-dbw</a>: 2.1.10-1</li> +<li><a href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-ds-dbw-can</a>: 2.1.10-1</li> +<li>ros-humble-ds-dbw-can-dbgsym: 2.1.10-1</li> +<li><a href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-ds-dbw-joystick-demo</a>: 2.1.10-1</li> +<li>ros-humble-ds-dbw-joystick-demo-dbgsym: 2.1.10-1</li> +<li><a href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-ds-dbw-msgs</a>: 2.1.10-1</li> +<li>ros-humble-ds-dbw-msgs-dbgsym: 2.1.10-1</li> +<li>ros-humble-gazebo-no-physics-plugin: 0.1.1-1</li> +<li>ros-humble-gazebo-no-physics-plugin-dbgsym: 0.1.1-1</li> +<li>ros-humble-kinematics-interface-dbgsym: 0.3.0-1</li> +</ul> +<h3><a class="anchor" href="https://discourse.ros.org#updated-packages-220-3" 
name="updated-packages-220-3"></a>Updated Packages [220]:</h3> +<ul> +<li>ros-humble-ackermann-steering-controller: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-ackermann-steering-controller-dbgsym: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-admittance-controller: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-admittance-controller-dbgsym: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-apriltag-detector: 1.1.0-1 → 1.1.1-1</li> +<li>ros-humble-bicycle-steering-controller: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-bicycle-steering-controller-dbgsym: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-bno055: 0.4.1-1 → 0.5.0-1</li> +<li><a href="https://index.ros.org/p/camera_calibration/github-ros-perception-image_pipeline/">ros-humble-camera-calibration</a>: 3.0.3-1 → 3.0.4-1</li> +<li>ros-humble-caret-analyze: 0.5.0-1 → 0.5.0-2</li> +<li>ros-humble-cob-actions: 2.7.9-1 → 2.7.10-1</li> +<li>ros-humble-cob-actions-dbgsym: 2.7.9-1 → 2.7.10-1</li> +<li>ros-humble-cob-msgs: 2.7.9-1 → 2.7.10-1</li> +<li>ros-humble-cob-msgs-dbgsym: 2.7.9-1 → 2.7.10-1</li> +<li><a href="http://ros.org/wiki/cob_srvs">ros-humble-cob-srvs</a>: 2.7.9-1 → 2.7.10-1</li> +<li>ros-humble-cob-srvs-dbgsym: 2.7.9-1 → 2.7.10-1</li> +<li>ros-humble-controller-interface: 2.39.1-1 → 2.40.0-1</li> +<li>ros-humble-controller-interface-dbgsym: 2.39.1-1 → 2.40.0-1</li> +<li>ros-humble-controller-manager: 2.39.1-1 → 2.40.0-1</li> +<li>ros-humble-controller-manager-dbgsym: 2.39.1-1 → 2.40.0-1</li> +<li><a href="http://ros.org/wiki/controller_manager_msgs">ros-humble-controller-manager-msgs</a>: 2.39.1-1 → 2.40.0-1</li> +<li>ros-humble-controller-manager-msgs-dbgsym: 2.39.1-1 → 2.40.0-1</li> +<li><a href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-dataspeed-dbw-common</a>: 2.1.3-1 → 2.1.10-1</li> +<li><a href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-dataspeed-ulc</a>: 2.1.3-1 → 2.1.10-1</li> +<li><a href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-dataspeed-ulc-can</a>: 
2.1.3-1 → 2.1.10-1</li> +<li>ros-humble-dataspeed-ulc-can-dbgsym: 2.1.3-1 → 2.1.10-1</li> +<li><a href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-dataspeed-ulc-msgs</a>: 2.1.3-1 → 2.1.10-1</li> +<li>ros-humble-dataspeed-ulc-msgs-dbgsym: 2.1.3-1 → 2.1.10-1</li> +<li><a href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-dbw-fca</a>: 2.1.3-1 → 2.1.10-1</li> +<li><a href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-dbw-fca-can</a>: 2.1.3-1 → 2.1.10-1</li> +<li>ros-humble-dbw-fca-can-dbgsym: 2.1.3-1 → 2.1.10-1</li> +<li><a href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-dbw-fca-description</a>: 2.1.3-1 → 2.1.10-1</li> +<li><a href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-dbw-fca-joystick-demo</a>: 2.1.3-1 → 2.1.10-1</li> +<li>ros-humble-dbw-fca-joystick-demo-dbgsym: 2.1.3-1 → 2.1.10-1</li> +<li><a href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-dbw-fca-msgs</a>: 2.1.3-1 → 2.1.10-1</li> +<li>ros-humble-dbw-fca-msgs-dbgsym: 2.1.3-1 → 2.1.10-1</li> +<li><a href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-dbw-ford</a>: 2.1.3-1 → 2.1.10-1</li> +<li><a href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-dbw-ford-can</a>: 2.1.3-1 → 2.1.10-1</li> +<li>ros-humble-dbw-ford-can-dbgsym: 2.1.3-1 → 2.1.10-1</li> +<li><a href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-dbw-ford-description</a>: 2.1.3-1 → 2.1.10-1</li> +<li><a href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-dbw-ford-joystick-demo</a>: 2.1.3-1 → 2.1.10-1</li> +<li>ros-humble-dbw-ford-joystick-demo-dbgsym: 2.1.3-1 → 2.1.10-1</li> +<li><a href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-dbw-ford-msgs</a>: 2.1.3-1 → 2.1.10-1</li> +<li>ros-humble-dbw-ford-msgs-dbgsym: 2.1.3-1 → 2.1.10-1</li> +<li><a href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-dbw-polaris</a>: 2.1.3-1 → 
2.1.10-1</li> +<li><a href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-dbw-polaris-can</a>: 2.1.3-1 → 2.1.10-1</li> +<li>ros-humble-dbw-polaris-can-dbgsym: 2.1.3-1 → 2.1.10-1</li> +<li><a href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-dbw-polaris-description</a>: 2.1.3-1 → 2.1.10-1</li> +<li><a href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-dbw-polaris-joystick-demo</a>: 2.1.3-1 → 2.1.10-1</li> +<li>ros-humble-dbw-polaris-joystick-demo-dbgsym: 2.1.3-1 → 2.1.10-1</li> +<li><a href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-dbw-polaris-msgs</a>: 2.1.3-1 → 2.1.10-1</li> +<li>ros-humble-dbw-polaris-msgs-dbgsym: 2.1.3-1 → 2.1.10-1</li> +<li><a href="https://index.ros.org/p/depth_image_proc/github-ros-perception-image_pipeline/">ros-humble-depth-image-proc</a>: 3.0.3-1 → 3.0.4-1</li> +<li>ros-humble-depth-image-proc-dbgsym: 3.0.3-1 → 3.0.4-1</li> +<li>ros-humble-diff-drive-controller: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-diff-drive-controller-dbgsym: 2.32.0-1 → 2.33.0-1</li> +<li><a href="https://wiki.ros.org/draco_point_cloud_transport">ros-humble-draco-point-cloud-transport</a>: 1.0.9-1 → 1.0.10-1</li> +<li>ros-humble-draco-point-cloud-transport-dbgsym: 1.0.9-1 → 1.0.10-1</li> +<li>ros-humble-effort-controllers: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-effort-controllers-dbgsym: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-etsi-its-cam-coding: 2.0.0-1 → 2.0.1-1</li> +<li>ros-humble-etsi-its-cam-coding-dbgsym: 2.0.0-1 → 2.0.1-1</li> +<li>ros-humble-etsi-its-cam-conversion: 2.0.0-1 → 2.0.1-1</li> +<li>ros-humble-etsi-its-cam-msgs: 2.0.0-1 → 2.0.1-1</li> +<li>ros-humble-etsi-its-cam-msgs-dbgsym: 2.0.0-1 → 2.0.1-1</li> +<li>ros-humble-etsi-its-coding: 2.0.0-1 → 2.0.1-1</li> +<li>ros-humble-etsi-its-conversion: 2.0.0-1 → 2.0.1-1</li> +<li>ros-humble-etsi-its-conversion-dbgsym: 2.0.0-1 → 2.0.1-1</li> +<li>ros-humble-etsi-its-denm-coding: 2.0.0-1 → 2.0.1-1</li> 
+<li>ros-humble-etsi-its-denm-coding-dbgsym: 2.0.0-1 → 2.0.1-1</li> +<li>ros-humble-etsi-its-denm-conversion: 2.0.0-1 → 2.0.1-1</li> +<li>ros-humble-etsi-its-denm-msgs: 2.0.0-1 → 2.0.1-1</li> +<li>ros-humble-etsi-its-denm-msgs-dbgsym: 2.0.0-1 → 2.0.1-1</li> +<li>ros-humble-etsi-its-messages: 2.0.0-1 → 2.0.1-1</li> +<li>ros-humble-etsi-its-msgs: 2.0.0-1 → 2.0.1-1</li> +<li>ros-humble-etsi-its-msgs-utils: 2.0.0-1 → 2.0.1-1</li> +<li>ros-humble-etsi-its-primitives-conversion: 2.0.0-1 → 2.0.1-1</li> +<li>ros-humble-etsi-its-rviz-plugins: 2.0.0-1 → 2.0.1-1</li> +<li>ros-humble-etsi-its-rviz-plugins-dbgsym: 2.0.0-1 → 2.0.1-1</li> +<li>ros-humble-flir-camera-description: 2.0.8-2 → 2.0.8-3</li> +<li>ros-humble-flir-camera-msgs: 2.0.8-2 → 2.0.8-3</li> +<li>ros-humble-flir-camera-msgs-dbgsym: 2.0.8-2 → 2.0.8-3</li> +<li>ros-humble-force-torque-sensor-broadcaster: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-force-torque-sensor-broadcaster-dbgsym: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-forward-command-controller: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-forward-command-controller-dbgsym: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-gripper-controllers: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-gripper-controllers-dbgsym: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-hardware-interface: 2.39.1-1 → 2.40.0-1</li> +<li>ros-humble-hardware-interface-dbgsym: 2.39.1-1 → 2.40.0-1</li> +<li>ros-humble-hardware-interface-testing: 2.39.1-1 → 2.40.0-1</li> +<li>ros-humble-hardware-interface-testing-dbgsym: 2.39.1-1 → 2.40.0-1</li> +<li><a href="https://index.ros.org/p/image_pipeline/github-ros-perception-image_pipeline/">ros-humble-image-pipeline</a>: 3.0.3-1 → 3.0.4-1</li> +<li><a href="https://index.ros.org/p/image_proc/github-ros-perception-image_pipeline/">ros-humble-image-proc</a>: 3.0.3-1 → 3.0.4-1</li> +<li>ros-humble-image-proc-dbgsym: 3.0.3-1 → 3.0.4-1</li> +<li><a href="https://index.ros.org/p/image_publisher/github-ros-perception-image_pipeline/">ros-humble-image-publisher</a>: 3.0.3-1 → 
3.0.4-1</li> +<li>ros-humble-image-publisher-dbgsym: 3.0.3-1 → 3.0.4-1</li> +<li><a href="https://index.ros.org/p/image_rotate/github-ros-perception-image_pipeline/">ros-humble-image-rotate</a>: 3.0.3-1 → 3.0.4-1</li> +<li>ros-humble-image-rotate-dbgsym: 3.0.3-1 → 3.0.4-1</li> +<li><a href="https://index.ros.org/p/image_view/github-ros-perception-image_pipeline/">ros-humble-image-view</a>: 3.0.3-1 → 3.0.4-1</li> +<li>ros-humble-image-view-dbgsym: 3.0.3-1 → 3.0.4-1</li> +<li>ros-humble-imu-sensor-broadcaster: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-imu-sensor-broadcaster-dbgsym: 2.32.0-1 → 2.33.0-1</li> +<li><a href="https://github.com/ros-controls/ros2_control/wiki" rel="noopener nofollow ugc">ros-humble-joint-limits</a>: 2.39.1-1 → 2.40.0-1</li> +<li>ros-humble-joint-limits-dbgsym: 2.39.1-1 → 2.40.0-1</li> +<li>ros-humble-joint-state-broadcaster: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-joint-state-broadcaster-dbgsym: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-joint-trajectory-controller: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-joint-trajectory-controller-dbgsym: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-kinematics-interface: 0.2.0-1 → 0.3.0-1</li> +<li>ros-humble-kinematics-interface-kdl: 0.2.0-1 → 0.3.0-1</li> +<li>ros-humble-kinematics-interface-kdl-dbgsym: 0.2.0-1 → 0.3.0-1</li> +<li><a href="https://github.com/pal-robotics/launch_pal" rel="noopener nofollow ugc">ros-humble-launch-pal</a>: 0.0.16-1 → 0.0.18-1</li> +<li><a href="http://wiki.ros.org/mavros">ros-humble-libmavconn</a>: 2.6.0-1 → 2.7.0-1</li> +<li>ros-humble-libmavconn-dbgsym: 2.6.0-1 → 2.7.0-1</li> +<li><a href="https://mavlink.io/en/" rel="noopener nofollow ugc">ros-humble-mavlink</a>: 2023.9.9-1 → 2024.3.3-1</li> +<li><a href="http://wiki.ros.org/mavros">ros-humble-mavros</a>: 2.6.0-1 → 2.7.0-1</li> +<li>ros-humble-mavros-dbgsym: 2.6.0-1 → 2.7.0-1</li> +<li><a href="http://wiki.ros.org/mavros_extras">ros-humble-mavros-extras</a>: 2.6.0-1 → 2.7.0-1</li> +<li>ros-humble-mavros-extras-dbgsym: 
2.6.0-1 → 2.7.0-1</li> +<li><a href="http://wiki.ros.org/mavros_msgs">ros-humble-mavros-msgs</a>: 2.6.0-1 → 2.7.0-1</li> +<li>ros-humble-mavros-msgs-dbgsym: 2.6.0-1 → 2.7.0-1</li> +<li><a href="https://www.mrpt.org/" rel="noopener nofollow ugc">ros-humble-mrpt2</a>: 2.11.9-1 → 2.11.11-1</li> +<li>ros-humble-mrpt2-dbgsym: 2.11.9-1 → 2.11.11-1</li> +<li><a href="https://wiki.ros.org/mvsim">ros-humble-mvsim</a>: 0.8.3-1 → 0.9.1-1</li> +<li>ros-humble-mvsim-dbgsym: 0.8.3-1 → 0.9.1-1</li> +<li><a href="https://github.com/LORD-MicroStrain/ntrip_client" rel="noopener nofollow ugc">ros-humble-ntrip-client</a>: 1.2.0-1 → 1.3.0-1</li> +<li><a href="https://github.com/pal-robotics/play_motion2" rel="noopener nofollow ugc">ros-humble-play-motion2</a>: 0.0.13-1 → 1.0.0-1</li> +<li>ros-humble-play-motion2-dbgsym: 0.0.13-1 → 1.0.0-1</li> +<li><a href="https://github.com/pal-robotics/play_motion2" rel="noopener nofollow ugc">ros-humble-play-motion2-msgs</a>: 0.0.13-1 → 1.0.0-1</li> +<li>ros-humble-play-motion2-msgs-dbgsym: 0.0.13-1 → 1.0.0-1</li> +<li><a href="https://github.com/facontidavide/PlotJuggler" rel="noopener nofollow ugc">ros-humble-plotjuggler</a>: 3.9.0-1 → 3.9.1-1</li> +<li>ros-humble-plotjuggler-dbgsym: 3.9.0-1 → 3.9.1-1</li> +<li><a href="https://github.com/pal-robotics/pmb2_simulation" rel="noopener nofollow ugc">ros-humble-pmb2-2dnav</a>: 4.0.9-1 → 4.0.12-1</li> +<li><a href="https://github.com/pal-robotics/pmb2_simulation" rel="noopener nofollow ugc">ros-humble-pmb2-bringup</a>: 5.0.15-1 → 5.0.16-1</li> +<li><a href="https://github.com/pal-robotics/pmb2_simulation" rel="noopener nofollow ugc">ros-humble-pmb2-controller-configuration</a>: 5.0.15-1 → 5.0.16-1</li> +<li><a href="https://github.com/pal-robotics/pmb2_simulation" rel="noopener nofollow ugc">ros-humble-pmb2-description</a>: 5.0.15-1 → 5.0.16-1</li> +<li><a href="https://github.com/pal-robotics/pmb2_simulation" rel="noopener nofollow ugc">ros-humble-pmb2-laser-sensors</a>: 4.0.9-1 → 4.0.12-1</li> 
+<li><a href="https://github.com/pal-robotics/pmb2_simulation" rel="noopener nofollow ugc">ros-humble-pmb2-maps</a>: 4.0.9-1 → 4.0.12-1</li> +<li><a href="https://github.com/pal-robotics/pmb2_simulation" rel="noopener nofollow ugc">ros-humble-pmb2-navigation</a>: 4.0.9-1 → 4.0.12-1</li> +<li><a href="https://github.com/pal-robotics/pmb2_simulation" rel="noopener nofollow ugc">ros-humble-pmb2-robot</a>: 5.0.15-1 → 5.0.16-1</li> +<li><a href="https://wiki.ros.org/draco_point_cloud_transport">ros-humble-point-cloud-interfaces</a>: 1.0.9-1 → 1.0.10-1</li> +<li>ros-humble-point-cloud-interfaces-dbgsym: 1.0.9-1 → 1.0.10-1</li> +<li>ros-humble-point-cloud-transport: 1.0.15-1 → 1.0.16-1</li> +<li>ros-humble-point-cloud-transport-dbgsym: 1.0.15-1 → 1.0.16-1</li> +<li><a href="https://wiki.ros.org/point_cloud_transport">ros-humble-point-cloud-transport-plugins</a>: 1.0.9-1 → 1.0.10-1</li> +<li>ros-humble-point-cloud-transport-py: 1.0.15-1 → 1.0.16-1</li> +<li>ros-humble-position-controllers: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-position-controllers-dbgsym: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-psdk-interfaces: 1.0.0-1 → 1.1.0-1</li> +<li>ros-humble-psdk-interfaces-dbgsym: 1.0.0-1 → 1.1.0-1</li> +<li>ros-humble-psdk-wrapper: 1.0.0-1 → 1.1.0-1</li> +<li>ros-humble-psdk-wrapper-dbgsym: 1.0.0-1 → 1.1.0-1</li> +<li>ros-humble-range-sensor-broadcaster: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-range-sensor-broadcaster-dbgsym: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-ros2-control: 2.39.1-1 → 2.40.0-1</li> +<li>ros-humble-ros2-control-test-assets: 2.39.1-1 → 2.40.0-1</li> +<li>ros-humble-ros2-controllers: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-ros2-controllers-test-nodes: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-ros2caret: 0.5.0-2 → 0.5.0-6</li> +<li>ros-humble-ros2controlcli: 2.39.1-1 → 2.40.0-1</li> +<li><a href="http://ros.org/wiki/rqt_controller_manager">ros-humble-rqt-controller-manager</a>: 2.39.1-1 → 2.40.0-1</li> +<li>ros-humble-rqt-gauges: 0.0.1-1 → 0.0.2-1</li> 
+<li>ros-humble-rqt-joint-trajectory-controller: 2.32.0-1 → 2.33.0-1</li> +<li><a href="http://introlab.github.io/rtabmap" rel="noopener nofollow ugc">ros-humble-rtabmap</a>: 0.21.3-1 → 0.21.4-1</li> +<li>ros-humble-rtabmap-conversions: 0.21.3-1 → 0.21.4-2</li> +<li>ros-humble-rtabmap-conversions-dbgsym: 0.21.3-1 → 0.21.4-2</li> +<li>ros-humble-rtabmap-dbgsym: 0.21.3-1 → 0.21.4-1</li> +<li>ros-humble-rtabmap-demos: 0.21.3-1 → 0.21.4-2</li> +<li>ros-humble-rtabmap-examples: 0.21.3-1 → 0.21.4-2</li> +<li>ros-humble-rtabmap-launch: 0.21.3-1 → 0.21.4-2</li> +<li>ros-humble-rtabmap-msgs: 0.21.3-1 → 0.21.4-2</li> +<li>ros-humble-rtabmap-msgs-dbgsym: 0.21.3-1 → 0.21.4-2</li> +<li>ros-humble-rtabmap-odom: 0.21.3-1 → 0.21.4-2</li> +<li>ros-humble-rtabmap-odom-dbgsym: 0.21.3-1 → 0.21.4-2</li> +<li>ros-humble-rtabmap-python: 0.21.3-1 → 0.21.4-2</li> +<li>ros-humble-rtabmap-ros: 0.21.3-1 → 0.21.4-2</li> +<li>ros-humble-rtabmap-rviz-plugins: 0.21.3-1 → 0.21.4-2</li> +<li>ros-humble-rtabmap-rviz-plugins-dbgsym: 0.21.3-1 → 0.21.4-2</li> +<li>ros-humble-rtabmap-slam: 0.21.3-1 → 0.21.4-2</li> +<li>ros-humble-rtabmap-slam-dbgsym: 0.21.3-1 → 0.21.4-2</li> +<li>ros-humble-rtabmap-sync: 0.21.3-1 → 0.21.4-2</li> +<li>ros-humble-rtabmap-sync-dbgsym: 0.21.3-1 → 0.21.4-2</li> +<li>ros-humble-rtabmap-util: 0.21.3-1 → 0.21.4-2</li> +<li>ros-humble-rtabmap-util-dbgsym: 0.21.3-1 → 0.21.4-2</li> +<li>ros-humble-rtabmap-viz: 0.21.3-1 → 0.21.4-2</li> +<li>ros-humble-rtabmap-viz-dbgsym: 0.21.3-1 → 0.21.4-2</li> +<li>ros-humble-simple-launch: 1.9.0-1 → 1.9.1-1</li> +<li>ros-humble-spinnaker-camera-driver: 2.0.8-2 → 2.0.8-3</li> +<li>ros-humble-spinnaker-camera-driver-dbgsym: 2.0.8-2 → 2.0.8-3</li> +<li>ros-humble-steering-controllers-library: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-steering-controllers-library-dbgsym: 2.32.0-1 → 2.33.0-1</li> +<li><a href="https://index.ros.org/p/stereo_image_proc/github-ros-perception-image_pipeline/">ros-humble-stereo-image-proc</a>: 3.0.3-1 → 3.0.4-1</li> 
+<li>ros-humble-stereo-image-proc-dbgsym: 3.0.3-1 → 3.0.4-1</li> +<li><a href="https://github.com/pal-robotics/tiago_simulation" rel="noopener nofollow ugc">ros-humble-tiago-2dnav</a>: 4.0.9-1 → 4.0.12-1</li> +<li><a href="https://github.com/pal-robotics/tiago_simulation" rel="noopener nofollow ugc">ros-humble-tiago-bringup</a>: 4.1.2-1 → 4.2.3-1</li> +<li><a href="https://github.com/pal-robotics/tiago_simulation" rel="noopener nofollow ugc">ros-humble-tiago-controller-configuration</a>: 4.1.2-1 → 4.2.3-1</li> +<li><a href="https://github.com/pal-robotics/tiago_simulation" rel="noopener nofollow ugc">ros-humble-tiago-description</a>: 4.1.2-1 → 4.2.3-1</li> +<li><a href="https://github.com/pal-robotics/tiago_simulation" rel="noopener nofollow ugc">ros-humble-tiago-gazebo</a>: 4.0.8-1 → 4.1.0-1</li> +<li><a href="https://github.com/pal-robotics/tiago_simulation" rel="noopener nofollow ugc">ros-humble-tiago-laser-sensors</a>: 4.0.9-1 → 4.0.12-1</li> +<li><a href="https://github.com/pal-robotics/tiago_simulation" rel="noopener nofollow ugc">ros-humble-tiago-moveit-config</a>: 3.0.7-1 → 3.0.10-1</li> +<li><a href="https://github.com/pal-robotics/tiago_simulation" rel="noopener nofollow ugc">ros-humble-tiago-navigation</a>: 4.0.9-1 → 4.0.12-1</li> +<li><a href="https://github.com/pal-robotics/tiago_simulation" rel="noopener nofollow ugc">ros-humble-tiago-robot</a>: 4.1.2-1 → 4.2.3-1</li> +<li><a href="https://github.com/pal-robotics/tiago_simulation" rel="noopener nofollow ugc">ros-humble-tiago-simulation</a>: 4.0.8-1 → 4.1.0-1</li> +<li>ros-humble-tracetools-image-pipeline: 3.0.3-1 → 3.0.4-1</li> +<li>ros-humble-tracetools-image-pipeline-dbgsym: 3.0.3-1 → 3.0.4-1</li> +<li>ros-humble-transmission-interface: 2.39.1-1 → 2.40.0-1</li> +<li>ros-humble-transmission-interface-dbgsym: 2.39.1-1 → 2.40.0-1</li> +<li>ros-humble-tricycle-controller: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-tricycle-controller-dbgsym: 2.32.0-1 → 2.33.0-1</li> 
+<li>ros-humble-tricycle-steering-controller: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-tricycle-steering-controller-dbgsym: 2.32.0-1 → 2.33.0-1</li> +<li><a href="http://wiki.ros.org/ur_client_library">ros-humble-ur-client-library</a>: 1.3.4-1 → 1.3.5-1</li> +<li>ros-humble-ur-client-library-dbgsym: 1.3.4-1 → 1.3.5-1</li> +<li>ros-humble-velocity-controllers: 2.32.0-1 → 2.33.0-1</li> +<li>ros-humble-velocity-controllers-dbgsym: 2.32.0-1 → 2.33.0-1</li> +<li><a href="https://wiki.ros.org/draco_point_cloud_transport">ros-humble-zlib-point-cloud-transport</a>: 1.0.9-1 → 1.0.10-1</li> +<li>ros-humble-zlib-point-cloud-transport-dbgsym: 1.0.9-1 → 1.0.10-1</li> +<li><a href="https://wiki.ros.org/draco_point_cloud_transport">ros-humble-zstd-point-cloud-transport</a>: 1.0.9-1 → 1.0.10-1</li> +<li>ros-humble-zstd-point-cloud-transport-dbgsym: 1.0.9-1 → 1.0.10-1</li> +</ul> +<h3><a class="anchor" href="https://discourse.ros.org#removed-packages-2-4" name="removed-packages-2-4"></a>Removed Packages [2]:</h3> +<ul> +<li><a href="http://dataspeedinc.com" rel="noopener nofollow ugc">ros-humble-dataspeed-dbw-gateway</a></li> +<li>ros-humble-dataspeed-dbw-gateway-dbgsym</li> +</ul> +<p>Thanks to all ROS maintainers who make packages available to the ROS community. 
The above list of packages was made possible by the work of the following maintainers:</p> +<ul> +<li>Alejandro Hernandez Cordero</li> +<li>Alejandro Hernández</li> +<li>Bence Magyar</li> +<li>Bernd Pfrommer</li> +<li>Bianca Bendris</li> +<li>Boeing</li> +<li>Davide Faconti</li> +<li>Denis Štogl</li> +<li>Eloy Bricneo</li> +<li>Felix Exner</li> +<li>Felix Messmer</li> +<li>Jean-Pierre Busch</li> +<li>Jordan Palacios</li> +<li>Jordi Pages</li> +<li>Jose-Luis Blanco-Claraco</li> +<li>Kevin Hallenbeck</li> +<li>Luis Camero</li> +<li>Martin Pecka</li> +<li>Mathieu Labbe</li> +<li>Micho Radovnikovich</li> +<li>Noel Jimenez</li> +<li>Olivier Kermorgant</li> +<li>Rob Fisher</li> +<li>TIAGo PAL support team</li> +<li>Vincent Rabaud</li> +<li>Vladimir Ermakov</li> +<li>Víctor Mayoral-Vilches</li> +<li>flynneva</li> +<li>ymski</li> +</ul> + <p><small>1 post - 1 participant</small></p> + <p><a href="https://discourse.ros.org/t/new-packages-for-humble-hawksbill-2024-03-08/36529">Read full topic</a></p> + Fri, 08 Mar 2024 16:36:12 +0000 + + + ROS Discourse General: ROS1: Now is a great time to add `catkin_lint` to your packages! + discourse.ros.org-topic-36521 + https://discourse.ros.org/t/ros1-now-is-a-great-time-to-add-catkin-lint-to-your-packages/36521 + <p><a href="https://fkie.github.io/catkin_lint/" rel="noopener nofollow ugc">catkin_lint</a> is an established ROS package that can do lots of useful checks on your CMakeLists.txt and package.xml. It can, e.g., warn you about dependencies that do not match between package.xml and CMakeLists.txt, check the existence of all rosdep keys in package.xml, watch whether all executable files in your package get installed, and warn you about the most common wrong usages of CMake; recently it even got the ability to warn you if you’re using a CMake feature that is too new for the CMake version you’ve put in <code>cmake_minimum_required()</code>. 
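+</p>
+<p>For example, assuming catkin_lint is available as the <code>python3-catkin-lint</code> system package (the same name used as the test dependency below), a one-off manual check of a package looks like this:</p>
+<pre><code class="lang-auto"># install the linter (package name on Ubuntu/Debian with the ROS repositories configured)
+sudo apt install python3-catkin-lint
+
+# run all checks at the highest warning level on the package in the current directory
+catkin_lint -W2 .
+</code></pre>
+<p>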
And there’s <a href="https://fkie.github.io/catkin_lint/messages/" rel="noopener nofollow ugc">much more</a>.</p> +<p>Personally, as a maintainer, I feel much more comfortable releasing a new version of a package once I see that catkin_lint passed without complaints.</p> +<p>Until recently, automatically running catkin_lint tests on packages released via the buildfarm was problematic because <a href="https://github.com/ros-infrastructure/ros_buildfarm/issues/923" rel="noopener nofollow ugc">the buildfarm doesn’t initialize the rosdep cache</a> and <a href="https://github.com/fkie/catkin_lint/issues/108" rel="noopener nofollow ugc">catkin_lint needed it to work</a>. The recently released version 1.6.22 of catkin_lint no longer fails in this case, so it is able to run all the tests that do not require rosdep on the buildfarm, while disabling those that need rosdep (currently only the check that package.xml keys point to existing packages).</p> +<p>Adding automatic catkin_lint checks to your package is easy!</p> +<p>CMakeLists.txt:</p> +<pre><code class="lang-cmake">if (CATKIN_ENABLE_TESTING) + find_package(roslint REQUIRED) + roslint_custom(catkin_lint "-W2" .) 
+ roslint_add_test() +endif() +</code></pre> +<p>package.xml:</p> +<pre><code class="lang-XML">&lt;test_depend&gt;python3-catkin-lint&lt;/test_depend&gt; +&lt;test_depend&gt;roslint&lt;/test_depend&gt; +</code></pre> +<p>And that’s it!</p> +<p>If you want to run the test locally, you can either manually invoke <code>catkin_lint .</code> in your package directory, or run <code>make roslint</code> in the build directory.</p> +<p>And if you’re okay with some of the warnings catkin_lint gives you, you can always ignore them either for a single line (<code>#catkin_lint: ignore_once duplicate_find</code>) or globally by adding arguments to the <code>catkin_lint</code> call (<code>catkin_lint -W2 --ignore duplicate_find .</code>).</p> +<p>Of course, the catkin_lint automation should not substitute for manual runs of this tool before releasing a new version of your package. It should be a good habit to run catkin_lint after you finish editing your build files. However, with the automation built in, you get the assurance that even if you forget to run the tool manually, the buildfarm will let you know <img alt=":slight_smile:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/slight_smile.png?v=12" title=":slight_smile:" width="20" /></p> +<p>You can see examples of catkin_lint used on buildfarm-released packages e.g. in our ROS utils stack: <a class="inline-onebox" href="https://github.com/ctu-vras/ros-utils/blob/master/cras_topic_tools/CMakeLists.txt#L108" rel="noopener nofollow ugc">ros-utils/cras_topic_tools/CMakeLists.txt at master · ctu-vras/ros-utils · GitHub</a>. Or scroll down on <a class="inline-onebox" href="https://index.ros.org/d/python3-catkin-lint/">rosdep System Dependency: python3-catkin-lint</a> to see all the others.</p> +<hr /> +<p>NB: I’m not the developer of catkin_lint. 
<a class="mention" href="https://discourse.ros.org/u/roehling">@roehling</a> @ FKIE is doing all of the awesome work!</p> +<hr /> +<p>NB2: While you’re at it, also have a look at:</p> +<pre><code class="lang-auto">find_package(roslaunch REQUIRED) +roslaunch_add_file_check(launch IGNORE_UNSET_ARGS) +</code></pre> +<p>and</p> +<pre><code class="lang-auto">&lt;test_depend&gt;roslaunch&lt;/test_depend&gt; +</code></pre> + <p><small>1 post - 1 participant</small></p> + <p><a href="https://discourse.ros.org/t/ros1-now-is-a-great-time-to-add-catkin-lint-to-your-packages/36521">Read full topic</a></p> + Fri, 08 Mar 2024 10:28:13 +0000 + + + ROS Discourse General: Cobot Magic: Mobile Aloha system works on AgileX Robotics platform + discourse.ros.org-topic-36515 + https://discourse.ros.org/t/cobot-magic-mobile-aloha-system-works-on-agilex-robotics-platform/36515 + <h1><a class="anchor" href="https://discourse.ros.org#introduction-1" name="introduction-1"></a>Introduction</h1> +<p>AgileX Cobot Magic is a system based on Mobile ALOHA that can simultaneously remotely control the TRACER mobile base and robotic arms.</p> +<p><img alt="浇花1" class="animated" height="390" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/5/1/51f4b207bad0b9e972c323fe70722378c752cbf1.gif" width="690" /></p> +<h2><a class="anchor" href="https://discourse.ros.org#story-2" name="story-2"></a>Story</h2> +<p>Mobile Aloha is a whole-body teleoperation data collection system developed by Zipeng Fu, Tony Z. Zhao, and Chelsea Finn from Stanford University. Its hardware is based on 2 robotic arms (ViperX 300), equipped with 2 wrist cameras and 1 top camera, and the <strong>AgileX Robotics Tracer</strong> differential-drive mobile base. Data collected using Mobile ALOHA, combined with supervised behavior cloning and joint training with existing static ALOHA datasets, can improve the performance of mobile manipulation tasks. 
With 50 demonstrations for each task, joint training can increase success rates by up to 90%. Mobile ALOHA can autonomously perform complex mobile manipulation tasks such as cooking and opening doors. <strong>Special thanks to the Stanford research team Zipeng Fu, Tony Z. Zhao, and Chelsea Finn for their research on Mobile ALOHA</strong> and for releasing a fully open-source implementation. For more details about this project please check the <a href="https://mobile-aloha.github.io/" rel="noopener nofollow ugc">link</a>.</p> +<p>Based on Mobile Aloha, AgileX developed Cobot Magic, which runs the complete Mobile Aloha code with a higher-end configuration at a lower cost, and is equipped with higher-payload robotic arms and a high-compute industrial computer. For more details about Cobot Magic please check the <a href="https://global.agilex.ai/" rel="noopener nofollow ugc">AgileX website</a>.</p> +<p>AgileX Cobot Magic is a system based on Mobile ALOHA that can simultaneously remotely control the TRACER mobile base and robotic arms. It is equipped with an indoor differential-drive AGV base, high-performance robotic arms, an industrial-grade computer, and other components. AgileX Cobot Magic assists users in better utilizing open-source hardware and the Mobile ALOHA deep learning framework for robotics. 
It covers a wide range of tasks, from simple pick-and-place operations to more intricate and complex actions such as pouring, cooking, riding elevators, and organizing items.</p> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/1/f/1f9a0a4bbbad855e74db30ed0f83c9efd121753c.jpeg" rel="noopener nofollow ugc" title="cobot magic"><img alt="cobot magic" height="500" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/1/f/1f9a0a4bbbad855e74db30ed0f83c9efd121753c_2_500x500.jpeg" width="500" /></a></div><p></p> +<p>AgileX has successfully completed the integration of Cobot Magic based on the Mobile Aloha source code project. It includes the entire pipeline of data collection, data replay, data visualization, demonstration mode, model training, inference, and so on. This project will introduce AgileX Cobot Magic and provide ongoing updates on the training progress of mobile manipulation tasks.</p> +<h3><a class="anchor" href="https://discourse.ros.org#hardware-configuration-3" name="hardware-configuration-3"></a><strong>Hardware configuration</strong></h3> +<p>Here is the list of hardware in AgileX Cobot Magic.</p> +<div class="md-table"> +<table> +<thead> +<tr> +<th>Component</th> +<th>Item Name</th> +<th>Model</th> +</tr> +</thead> +<tbody> +<tr> +<td>Standard Configuration</td> +<td>Wheeled Mobile Robot</td> +<td>AgileX Tracer</td> +</tr> +<tr> +<td>Depth Camera x3</td> +<td>Orbbec Dabai</td> +<td></td> +</tr> +<tr> +<td>USB Hub</td> +<td>12V Power Supply, 7-port</td> +<td></td> +</tr> +<tr> +<td>6 DOF Lightweight Robot Arm x4</td> +<td>Customized by AgileX</td> +<td></td> +</tr> +<tr> +<td>Adjustable Velcro x2</td> +<td>Customized by AgileX</td> +<td></td> +</tr> +<tr> +<td>Grip Tape x2</td> +<td>Customized by AgileX</td> +<td></td> +</tr> +<tr> +<td>Power Strip</td> +<td>4 Outlets, 1.8m</td> +<td></td> +</tr> +<tr> +<td>Mobile Power Station</td> 
+<td>1000W</td> +<td></td> +</tr> +<tr> +<td>ALOHA Stand</td> +<td>Customized by AgileX</td> +<td></td> +</tr> +<tr> +<td>Optional Configuration</td> +<td>Nano Development Kit</td> +<td>Jetson Orin Nano Developer Kit (8G)</td> +</tr> +<tr> +<td>Industrial PC</td> +<td>APQ-X7010/GPU 4060/i7-9700-32g-4T</td> +<td></td> +</tr> +<tr> +<td>IMU</td> +<td>CH110</td> +<td></td> +</tr> +<tr> +<td>Display</td> +<td>11.6" 1080p</td> +<td></td> +</tr> +</tbody> +</table> +</div><p>Note: An IPC is required. Users have two options: the Nano Development Kit and the APQ-X7010 IPC.</p> +<h3><a class="anchor" href="https://discourse.ros.org#software-configuration-4" name="software-configuration-4"></a><strong>Software configuration</strong></h3> +<p><strong>Local computer:</strong></p> +<p>Ubuntu 20.04, CUDA 11.3.</p> +<p><strong>Environment configuration:</strong></p> +<pre><code class="lang-auto"># 1. Create python virtual environment +conda create -n aloha python=3.8 + +# 2. Activate +conda activate aloha + +# 3. Install cuda and torch +pip install torch==1.11.0+cu113 torchvision==0.12.0+cu113 torchaudio==0.11.0 --extra-index-url https://download.pytorch.org/whl/cu113 + + +# 4 Install detr +## Get act code +git clone https://github.com/agilexrobotics/act-plus-plus.git +cd act-plus-plus + + +## 4.1 Install other dependencies +pip install -r requirements.txt + +## 4.2 Install detr +cd detr &amp;&amp; pip install -v -e . +</code></pre> +<p><strong>Simulated environment datasets</strong></p> +<p>You can find all scripted/human demos for the simulated environments <a href="https://drive.google.com/drive/folders/1gPR03v05S1xiInoVJn7G7VJ9pDCnxq9O" rel="noopener nofollow ugc">here</a>.</p> +<p>After downloading, copy the data to the act-plus-plus/data directory. The directory structure is as follows:</p> +<pre><code class="lang-auto">act-plus-plus/data + ├── sim_insertion_human + │ ├── sim_insertion_human-20240110T054847Z-001.zip + ├── ... 
+ ├── sim_insertion_scripted + │ ├── sim_insertion_scripted-20240110T054854Z-001.zip + ├── ... + ├── sim_transfer_cube_human + │ ├── sim_transfer_cube_human-20240110T054900Z-001.zip + │ ├── ... + └── sim_transfer_cube_scripted + ├── sim_transfer_cube_scripted-20240110T054901Z-001.zip + ├── ... +</code></pre> +<h3><a class="anchor" href="https://discourse.ros.org#demonstration-5" name="demonstration-5"></a><strong>Demonstration</strong></h3> +<p>By now it is widely accepted that learning a task from scratch, i.e., without any prior knowledge, is a daunting undertaking. Humans, however, rarely attempt to learn from scratch. They extract initial biases as well as strategies on how to approach a learning problem from instructions and/or demonstrations of other humans. This is what we call ‘programming by demonstration’ or ‘Imitation learning’.</p> +<p>The demonstration usually contains decision data {T1, T2,…, Tm}. Each decision contains the state and action sequence</p> +<p><img alt="image.png" height="25" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/6/1/6111251fd1d4fc2705870cbb78cec3e6a365c753.png" width="226" /></p> +<p>Extract all “state-action pairs” and build a new set</p> +<p><img alt="image.png" height="22" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/0/f/0fdd7c382a74a5e19eb6e32f31c0dddcd48e5e2b.png" width="336" /></p> +<p>Currently, based on AgileX Cobot Magic, we can achieve multiple whole-body action tasks.</p> +<p>Here we will show different action task demonstrations collected using AgileX Cobot Magic.</p> +<p>● <strong>Watering flowers</strong></p> +<p><img alt="浇花1" class="animated" height="390" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/5/1/51f4b207bad0b9e972c323fe70722378c752cbf1.gif" width="690" /></p> +<p>● <strong>Opening a box</strong></p> +<p><img alt="开箱子1" class="animated" height="390" 
src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/8/6/86e16475476bf07b16c76d9af74d99b2469ecc80.gif" width="690" /></p> +<p>● <strong>Pouring rice</strong></p> +<p><img alt="倒米1" class="animated" height="390" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/5/5/551c084ea7f0ac49044a4bc8c78c95fba2570c88.gif" width="690" /></p> +<p>● <strong>Twisting a bottle cap</strong></p> +<p><img alt="拧瓶盖1" class="animated" height="444" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/5/3/53214e9bd6afb1292baf5ba6a3d548b21aa468ee.gif" width="400" /></p> +<p>● <strong>Throwing rubbish</strong></p> +<p><img alt="扔垃圾1" class="animated" height="387" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/e/a/eaa1703cf4e551c161caf33d4d666987191b98e8.gif" width="690" /></p> +<p>Using AgileX Cobot Magic, users can flexibly complete a variety of everyday action tasks by controlling the teaching robot arms, from simple pick-and-place skills to more sophisticated ones such as twisting bottle caps. The mobile chassis opens up more possibilities for the robotic arms, so they are no longer restricted to performing actions in a fixed place. The 14 + 2 DOFs provide limitless potential for collecting diverse data.</p> +<h3><a class="anchor" href="https://discourse.ros.org#data-presentation-6" name="data-presentation-6"></a><strong>Data Presentation</strong></h3> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/4/7/472aee2a6ebf04518c8af69a3acf2ee09fcc9be4.jpeg" rel="noopener nofollow ugc" title="image-20240307183526584"><img alt="image-20240307183526584" height="500" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/4/7/472aee2a6ebf04518c8af69a3acf2ee09fcc9be4.jpeg" width="602" /></a></div><p></p> +<p>The plots above display the data collected during one demonstration with the AgileX Cobot Magic arms. 
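+</p>
+<p>As a rough illustration of the “state-action pair” structure described earlier (the array shapes and variable names here are illustrative assumptions, not the actual Cobot Magic storage format), an episode can be thought of as parallel arrays of observed and commanded positions for the 14 arm joints:</p>
+<pre><code class="lang-auto">import numpy as np
+
+T = 50                       # number of timesteps in a toy episode
+qpos = np.zeros((T, 14))     # observed positions of the 14 joints at each timestep
+actions = np.zeros((T, 14))  # commanded joint positions (the "action" at each step)
+
+# Build the set of state-action pairs used for supervised behavior cloning
+pairs = list(zip(qpos, actions))
+print(len(pairs))            # one (state, action) pair per timestep
+</code></pre>
+<p>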
The collected data includes the positional information of the 14 joints over time.</p> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/0/1/0107b4dc478e3d9069119b915825a2daba5202a7.jpeg" rel="noopener nofollow ugc" title="img"><img alt="img" height="318" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/0/1/0107b4dc478e3d9069119b915825a2daba5202a7.jpeg" width="690" /></a></div><p></p> +<h3><a class="anchor" href="https://discourse.ros.org#summary-7" name="summary-7"></a><strong>Summary</strong></h3> +<p>Cobot Magic is a remote whole-body data collection device, developed by AgileX Robotics based on the Mobile Aloha project from Stanford University. With Cobot Magic, AgileX Robotics has successfully deployed the Stanford laboratory’s open-source Mobile Aloha code on its own platform. Data collection is no longer limited to desktops or specific surfaces thanks to the Tracer mobile base of the Cobot Magic, which enhances the richness and diversity of collected data.</p> +<p>AgileX will continue to collect data from various motion tasks based on Cobot Magic for model training and inference. Please stay tuned for updates on <a href="https://github.com/agilexrobotics?tab=repositories" rel="noopener nofollow ugc">GitHub</a>.</p> +<h3><a class="anchor" href="https://discourse.ros.org#about-agilex-8" name="about-agilex-8"></a><strong>About AgileX</strong></h3> +<p>Established in 2016, AgileX Robotics is a leading manufacturer of mobile robot platforms and a provider of unmanned system solutions. The company specializes in independently developed multi-mode wheeled and tracked wire-controlled chassis technology and has obtained multiple international certifications. 
AgileX Robotics offers users self-developed innovative application solutions such as autonomous driving, mobile grasping, and navigation positioning, helping users in various industries achieve automation. Additionally, AgileX Robotics has introduced research and education software and hardware products related to machine learning, embodied intelligence, and visual algorithms. The company works closely with research and educational institutions to promote robotics technology teaching and innovation.</p> +<h3><a class="anchor" href="https://discourse.ros.org#appendix-9" name="appendix-9"></a><strong>Appendix</strong></h3> +<p><strong>ros_astra_camera configuration</strong></p> +<p><a href="https://github.com/orbbec/ros_astra_camera" rel="noopener nofollow ugc">ros_astra_camera-github</a>, <a href="https://gitee.com/orbbecdeveloper/OrbbecSDK_ROS1" rel="noopener nofollow ugc">ros_astra_camera-gitee</a></p> +<p>● <strong>Camera Parameters</strong></p> +<div class="md-table"> +<table> +<thead> +<tr> +<th>Name</th> +<th>Parameters</th> +</tr> +</thead> +<tbody> +<tr> +<td>Baseline</td> +<td>40mm</td> +</tr> +<tr> +<td>Depth distance</td> +<td>0.3-3m</td> +</tr> +<tr> +<td>Depth map resolution</td> +<td>640x400x30fps, 320x200x30fps</td> +</tr> +<tr> +<td>Color image resolution</td> +<td>1920x1080x30fps, 1280x720x30fps, 640x480x30fps</td> +</tr> +<tr> +<td>Accuracy</td> +<td>6mm@1m (81% FOV area in accuracy calculation)</td> +</tr> +<tr> +<td>Depth FOV</td> +<td>H 67.9° V 45.3°</td> +</tr> +<tr> +<td>Color FOV</td> +<td>H 71° V 43.7° @ 1920x1080</td> +</tr> +<tr> +<td>Delay</td> +<td>30-45ms</td> +</tr> +<tr> +<td>Data transmission</td> +<td>USB2.0 or above</td> +</tr> +<tr> +<td>Working temperature</td> +<td>10°C~40°C</td> +</tr> +<tr> +<td>Size</td> +<td>Length 59.5 x Width 17.4 x Thickness 11.1 mm</td> +</tr> +</tbody> +</table> +</div><ol> +<li>OrbbecSDK_ROS1 driver installation</li> +</ol> +<pre><code class="lang-auto"># 1 Install dependencies +sudo apt install libgflags-dev 
ros-$ROS_DISTRO-image-geometry ros-$ROS_DISTRO-camera-info-manager ros-$ROS_DISTRO-image-transport ros-$ROS_DISTRO-image-publisher ros-$ROS_DISTRO-libuvc-ros libgoogle-glog-dev libusb-1.0-0-dev libeigen3-dev + +# 2 Download the code +## 2.1 github +git clone https://github.com/orbbec/OrbbecSDK_ROS1.git astra_ws/src +## 2.2 gitee (Chinese region) +git clone https://gitee.com/orbbecdeveloper/OrbbecSDK_ROS1 -b v1.4.6 astra_ws/src + +# 3 Compile orbbec_camera +## 3.1 Enter astra_ws workspace +cd astra_ws +## 3.2 Compile orbbec_camera +catkin_make + +# 4 Install udev rules. +source devel/setup.bash &amp;&amp; rospack list +roscd orbbec_camera/scripts +sudo cp 99-obsensor-libusb.rules /etc/udev/rules.d/99-obsensor-libusb.rules +sudo udevadm control --reload &amp;&amp; sudo udevadm trigger + +# 5 Add ros_astra_camera package environment variables +## 5.1 Enter astra_ws +cd astra_ws +## 5.2 Add environment variables +echo "source $(pwd)/devel/setup.bash" &gt;&gt; ~/.bashrc +## 5.3 Make the environment variables take effect +source ~/.bashrc + +# 6 Launch +## If step 5 was not performed, the command in 6.2 must be run in every new terminal to make the ROS workspace environment take effect. +## 6.1 Enter astra_ws +cd astra_ws +## 6.2 Source the workspace +source devel/setup.bash +## 6.3 Launch astra.launch +roslaunch orbbec_camera astra.launch +## 6.4 Launch dabai.launch +roslaunch orbbec_camera dabai.launch +</code></pre> +<ol> +<li>Configure orbbec_camera multiple camera nodes</li> +</ol> +<p>① Check the device serial number</p> +<p>● After installing the camera, run the following command</p> +<pre><code class="lang-auto">rosrun orbbec_camera list_devices_node | grep -i serial +</code></pre> +<p>● The output in the terminal</p> +<pre><code class="lang-auto">[ INFO] [1709728787.207920484]: serial: AU1P32201SA +# Please record this serial number. Each camera corresponds to a unique serial number. 
+</code></pre> +<p>② Configure multiple camera nodes</p> +<p>● cobot_magic uses three Dabai cameras with orbbec_camera, so each camera must be configured according to its serial number.</p> +<p>● Plug the USB data cables of the three cameras into the industrial PC and run the command from step ① above to view the serial numbers of the three cameras.</p> +<p>● To make clear which topics correspond to each camera in subsequent development, please fill in the serial numbers in order.</p> +<p>● Create the multi_dabai.launch file in the astra_ws/src/launch directory with the following content:</p> +<pre><code class="lang-auto"># Mainly modify: 1 Camera name, 2 Serial number +&lt;launch&gt; + &lt;arg name="camera_name" default="camera"/&gt; + &lt;arg name="3d_sensor" default="dabai"/&gt; + + &lt;!-- 1 Mainly modify the camera name prefixes and serial numbers. --&gt; + &lt;arg name="camera1_prefix" default="01"/&gt; + &lt;arg name="camera2_prefix" default="02"/&gt; + &lt;arg name="camera3_prefix" default="03"/&gt; + + &lt;!-- 2 Serial number: fill in each camera's serial number --&gt; + &lt;arg name="camera1_usb_port" default="camera1's serial number"/&gt; + &lt;arg name="camera2_usb_port" default="camera2's serial number"/&gt; + &lt;arg name="camera3_usb_port" default="camera3's serial number"/&gt; + + &lt;arg name="device_num" default="3"/&gt; + &lt;include file="$(find orbbec_camera)/launch/$(arg 3d_sensor).launch"&gt; + &lt;arg name="camera_name" value="$(arg camera_name)_$(arg camera1_prefix)"/&gt; + &lt;arg name="usb_port" value="$(arg camera1_usb_port)"/&gt; + &lt;arg name="device_num" value="$(arg device_num)"/&gt; + &lt;/include&gt; + + &lt;include file="$(find orbbec_camera)/launch/$(arg 3d_sensor).launch"&gt; + &lt;arg name="camera_name" value="$(arg camera_name)_$(arg camera2_prefix)"/&gt; + &lt;arg name="usb_port" value="$(arg camera2_usb_port)"/&gt; + &lt;arg name="device_num" value="$(arg 
device_num)"/&gt; + &lt;/include&gt; + + &lt;include file="$(find orbbec_camera)/launch/$(arg 3d_sensor).launch"&gt; + &lt;arg name="camera_name" value="$(arg camera_name)_$(arg camera3_prefix)"/&gt; + &lt;arg name="usb_port" value="$(arg camera3_usb_port)"/&gt; + &lt;arg name="device_num" value="$(arg device_num)"/&gt; + &lt;/include&gt; +&lt;/launch&gt; +</code></pre> +<p>● Add permissions</p> +<pre><code class="lang-auto"># 1 Enter astra_camera/launch/ +roscd orbbec_camera/launch/ + +# 2 multi_dabai.launch add permissions +chmod +x multi_dabai.launch +</code></pre> +<p>● Launch ros</p> +<pre><code class="lang-auto">roslaunch orbbec_camera multi_dabai.launch +</code></pre> + <p><small>1 post - 1 participant</small></p> + <p><a href="https://discourse.ros.org/t/cobot-magic-mobile-aloha-system-works-on-agilex-robotics-platform/36515">Read full topic</a></p> + Fri, 08 Mar 2024 02:01:53 +0000 + + + ROS Discourse General: New Packages for Noetic 2024-03-07 + discourse.ros.org-topic-36514 + https://discourse.ros.org/t/new-packages-for-noetic-2024-03-07/36514 + <p>We’re happy to announce <strong>10</strong> new packages and <strong>46</strong> updates are now available in ROS Noetic. 
This sync was tagged as <a href="https://github.com/ros/rosdistro/blob/noetic/2024-03-07/noetic/distribution.yaml" rel="noopener nofollow ugc"><code>noetic/2024-03-07</code></a>.</p> +<p>Thank you to every maintainer and contributor who made these updates available!</p> +<h2><a class="anchor" href="https://discourse.ros.org#package-updates-for-ros-noetic-1" name="package-updates-for-ros-noetic-1"></a>Package Updates for ROS Noetic</h2> +<h3><a class="anchor" href="https://discourse.ros.org#added-packages-10-2" name="added-packages-10-2"></a>Added Packages [10]:</h3> +<ul> +<li>ros-noetic-atf: 0.1.1-1</li> +<li>ros-noetic-atf-core: 0.1.1-1</li> +<li>ros-noetic-atf-metrics: 0.1.1-1</li> +<li>ros-noetic-atf-msgs: 0.1.1-1</li> +<li>ros-noetic-atf-plotter: 0.1.1-1</li> +<li>ros-noetic-atf-recorder-plugins: 0.1.1-1</li> +<li>ros-noetic-atf-test: 0.1.1-1</li> +<li>ros-noetic-atf-test-tools: 0.1.1-1</li> +<li>ros-noetic-etsi-its-rviz-plugins: 2.0.1-1</li> +<li><a href="http://wiki.ros.org/py_binding_tools">ros-noetic-py-binding-tools</a>: 1.0.0-1</li> +</ul> +<h3><a class="anchor" href="https://discourse.ros.org#updated-packages-46-3" name="updated-packages-46-3"></a>Updated Packages [46]:</h3> +<ul> +<li><a href="https://wiki.ros.org/cras_cpp_common">ros-noetic-cras-cpp-common</a>: 2.3.8-1 → 2.3.9-1</li> +<li>ros-noetic-cras-docs-common: 2.3.8-1 → 2.3.9-1</li> +<li><a href="https://wiki.ros.org/cras_py_common">ros-noetic-cras-py-common</a>: 2.3.8-1 → 2.3.9-1</li> +<li><a href="https://wiki.ros.org/cras_topic_tools">ros-noetic-cras-topic-tools</a>: 2.3.8-1 → 2.3.9-1</li> +<li>ros-noetic-etsi-its-cam-coding: 2.0.0-1 → 2.0.1-1</li> +<li>ros-noetic-etsi-its-cam-conversion: 2.0.0-1 → 2.0.1-1</li> +<li>ros-noetic-etsi-its-cam-msgs: 2.0.0-1 → 2.0.1-1</li> +<li>ros-noetic-etsi-its-coding: 2.0.0-1 → 2.0.1-1</li> +<li>ros-noetic-etsi-its-conversion: 2.0.0-1 → 2.0.1-1</li> +<li>ros-noetic-etsi-its-denm-coding: 2.0.0-1 → 2.0.1-1</li> +<li>ros-noetic-etsi-its-denm-conversion: 2.0.0-1 
→ 2.0.1-1</li> +<li>ros-noetic-etsi-its-denm-msgs: 2.0.0-1 → 2.0.1-1</li> +<li>ros-noetic-etsi-its-messages: 2.0.0-1 → 2.0.1-1</li> +<li>ros-noetic-etsi-its-msgs: 2.0.0-1 → 2.0.1-1</li> +<li>ros-noetic-etsi-its-msgs-utils: 2.0.0-1 → 2.0.1-1</li> +<li>ros-noetic-etsi-its-primitives-conversion: 2.0.0-1 → 2.0.1-1</li> +<li>ros-noetic-gnss-info: 1.0.1-1 → 1.0.2-1</li> +<li>ros-noetic-gnss-info-msgs: 1.0.1-1 → 1.0.2-1</li> +<li>ros-noetic-gnsstk-ros: 1.0.1-1 → 1.0.2-1</li> +<li><a href="https://wiki.ros.org/image_transport_codecs">ros-noetic-image-transport-codecs</a>: 2.3.8-1 → 2.3.9-1</li> +<li><a href="http://wiki.ros.org/mavros">ros-noetic-libmavconn</a>: 1.17.0-1 → 1.18.0-1</li> +<li><a href="https://mavlink.io/en/" rel="noopener nofollow ugc">ros-noetic-mavlink</a>: 2023.9.9-1 → 2024.3.3-1</li> +<li><a href="http://wiki.ros.org/mavros">ros-noetic-mavros</a>: 1.17.0-1 → 1.18.0-1</li> +<li><a href="http://wiki.ros.org/mavros_extras">ros-noetic-mavros-extras</a>: 1.17.0-1 → 1.18.0-1</li> +<li><a href="http://wiki.ros.org/mavros_msgs">ros-noetic-mavros-msgs</a>: 1.17.0-1 → 1.18.0-1</li> +<li><a href="https://www.mrpt.org/" rel="noopener nofollow ugc">ros-noetic-mrpt2</a>: 2.11.9-1 → 2.11.11-1</li> +<li><a href="https://wiki.ros.org/mvsim">ros-noetic-mvsim</a>: 0.8.3-1 → 0.9.1-2</li> +<li><a href="https://github.com/LORD-MicroStrain/ntrip_client" rel="noopener nofollow ugc">ros-noetic-ntrip-client</a>: 1.2.0-1 → 1.3.0-1</li> +<li><a href="http://introlab.github.io/rtabmap" rel="noopener nofollow ugc">ros-noetic-rtabmap</a>: 0.21.3-1 → 0.21.4-1</li> +<li>ros-noetic-rtabmap-conversions: 0.21.3-4 → 0.21.4-1</li> +<li>ros-noetic-rtabmap-costmap-plugins: 0.21.3-4 → 0.21.4-1</li> +<li>ros-noetic-rtabmap-demos: 0.21.3-4 → 0.21.4-1</li> +<li>ros-noetic-rtabmap-examples: 0.21.3-4 → 0.21.4-1</li> +<li>ros-noetic-rtabmap-launch: 0.21.3-4 → 0.21.4-1</li> +<li>ros-noetic-rtabmap-legacy: 0.21.3-4 → 0.21.4-1</li> +<li>ros-noetic-rtabmap-msgs: 0.21.3-4 → 0.21.4-1</li> 
+<li>ros-noetic-rtabmap-odom: 0.21.3-4 → 0.21.4-1</li> +<li>ros-noetic-rtabmap-python: 0.21.3-4 → 0.21.4-1</li> +<li>ros-noetic-rtabmap-ros: 0.21.3-4 → 0.21.4-1</li> +<li>ros-noetic-rtabmap-rviz-plugins: 0.21.3-4 → 0.21.4-1</li> +<li>ros-noetic-rtabmap-slam: 0.21.3-4 → 0.21.4-1</li> +<li>ros-noetic-rtabmap-sync: 0.21.3-4 → 0.21.4-1</li> +<li>ros-noetic-rtabmap-util: 0.21.3-4 → 0.21.4-1</li> +<li>ros-noetic-rtabmap-viz: 0.21.3-4 → 0.21.4-1</li> +<li>ros-noetic-test-mavros: 1.17.0-1 → 1.18.0-1</li> +<li><a href="http://wiki.ros.org/ur_client_library">ros-noetic-ur-client-library</a>: 1.3.4-1 → 1.3.5-1</li> +</ul> +<h3><a class="anchor" href="https://discourse.ros.org#removed-packages-0-4" name="removed-packages-0-4"></a>Removed Packages [0]:</h3> +<p>Thanks to all ROS maintainers who make packages available to the ROS community. The above list of packages was made possible by the work of the following maintainers:</p> +<ul> +<li>Felix Exner</li> +<li>Florian Weisshardt</li> +<li>Jean-Pierre Busch</li> +<li>Jose-Luis Blanco-Claraco</li> +<li>Martin Pecka</li> +<li>Mathieu Labbe</li> +<li>Rob Fisher</li> +<li>Robert Haschke</li> +<li>Vladimir Ermakov</li> +</ul> + <p><small>2 posts - 2 participants</small></p> + <p><a href="https://discourse.ros.org/t/new-packages-for-noetic-2024-03-07/36514">Read full topic</a></p> + Fri, 08 Mar 2024 01:28:07 +0000 + + + ROS Discourse General: Interoperability Interest Group March 7, 2024: Standardizing Infrastructure Messages, Part 3 + discourse.ros.org-topic-36455 + https://discourse.ros.org/t/interoperability-interest-group-march-7-2024-standardizing-infrastructure-messages-part-3/36455 + <p><a href="https://github.com/osrf-sig-interoperability/community" rel="noopener nofollow ugc">Community Page </a></p> +<p><a href="https://meet.google.com/qoj-nfxx-bxy" rel="noopener nofollow ugc">Meeting Link </a></p> +<p><a href="https://calendar.google.com/calendar/u/0/embed?src=agf3kajirket8khktupm9go748@group.calendar.google.com" 
rel="noopener nofollow ugc">Calendar Link </a></p>
+<p>Continuing our discussion from the last session, our next session will get into more depth on how errors for building infrastructure devices should be represented.</p>
+<p>Some questions to consider:</p>
+<ul>
+<li>What level of detail needs to be standardized for error messages?
+<ul>
+<li>Is it enough to simply communicate that the device is unusable?</li>
+<li>Should the standardized error messages also provide enough information for a technician to troubleshoot the device?</li>
+<li>Should detailed troubleshooting information be provided through a separate non-standard channel instead?</li>
+</ul>
+</li>
+<li>How efficient should error messages be?
+<ul>
+<li>A simple error code is high performance and allows for millions of possible error types, but can only communicate the presence of one error at a time</li>
+<li>Bitsets could express multiple simultaneous errors with high performance, but the number of error types is severely limited</li>
+<li>Dynamic arrays of error codes can communicate many types of errors with no limit, but heap allocations are needed</li>
+<li>A string of serialized JSON could represent unlimited types of errors and provide troubleshooting information for them, but heap allocation and string parsing are needed</li>
+</ul>
+</li>
+<li>Should standardized error definitions be specific to each type of building device, or should the definitions be abstract enough to use across all/multiple devices?
+<ul>
+<li>E.g. are doors and elevators different enough that they need their own error code definitions?</li>
+<li>What kinds of errors should we expect to report for each different type of device?</li>
+</ul>
+</li>
+</ul>
+<p>We will be seeking input on all of the above questions and more. 
Please come armed with examples of your most hated device errors that you think a good standard should be able to express.</p> + <p><small>4 posts - 3 participants</small></p> + <p><a href="https://discourse.ros.org/t/interoperability-interest-group-march-7-2024-standardizing-infrastructure-messages-part-3/36455">Read full topic</a></p> + Mon, 04 Mar 2024 15:23:04 +0000 + + + ROS Discourse General: ROS Mapping and Navigation with AgileX Robotics Limo + discourse.ros.org-topic-36452 + https://discourse.ros.org/t/ros-mapping-and-navigation-with-agilex-robotics-limo/36452 + <p>Limo is a smart educational robot published by AgileX Robotics. More details please visit: <a href="https://global.agilex.ai/" rel="noopener nofollow ugc">https://global.agilex.ai/</a><br /> +</p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/0/6/0606141abd360c2207b55c28a051a29aad2cb9f5.jpeg" rel="noopener nofollow ugc" title="808f249cb1f134a0088fbe659c204d9b_original"><img alt="808f249cb1f134a0088fbe659c204d9b_original" height="500" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/0/6/0606141abd360c2207b55c28a051a29aad2cb9f5_2_383x500.jpeg" width="383" /></a></div><p></p> +<p>Four steering modes make LIMO substantially superior to other robots in its class. The available modes are: Omni-Wheel Steering, Tracked Steering, Four-Wheel Differential Steering and Ackermann Steering. These advanced steering modes plus a built-in 360° scanning LiDAR and RealSense infrared camera make the platform perfect for industrial and commercial tasks in any scenario. 
With these incredible features, LIMO can achieve precise self-localization, SLAM mapping, route planning and autonomous obstacle avoidance, reverse parking, traffic light recognition, and more.<br />
+</p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/c/f/cf84a27f52dae814140c99ea284bc63816864697.jpeg" rel="noopener nofollow ugc" title="1_63677_74581"><img alt="1_63677_74581" height="304" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/c/f/cf84a27f52dae814140c99ea284bc63816864697_2_690x304.jpeg" width="690" /></a></div><p></p>
+<h1><a class="anchor" href="https://discourse.ros.org#mapping-1" name="mapping-1"></a><strong>Mapping</strong></h1>
+<h2><a class="anchor" href="https://discourse.ros.org#gmapping-2" name="gmapping-2"></a>Gmapping</h2>
+<p>Gmapping is a widely adopted open-source SLAM algorithm that operates within the filtering SLAM framework. It effectively uses wheel odometry data and does not heavily rely on high-frequency laser LiDAR scans. When constructing a map of a smaller environment, Gmapping requires minimal computational resources to maintain high accuracy. Here, the ROS Gmapping package is used to perform mapping with Limo.</p>
+<p><strong>Note:</strong> The speed of Limo should be slow in the process of mapping. If the speed is too fast, the effect of mapping will be affected.</p>
+<p>Run the following command in a new terminal to launch the LiDAR:</p>
+<pre><code class="lang-auto">roslaunch limo_bringup limo_start.launch pub_odom_tf:=false
+</code></pre>
+<p>Then launch the gmapping algorithm. Open another new terminal, and enter the command:</p>
+<pre><code class="lang-auto">roslaunch limo_bringup limo_gmapping.launch
+</code></pre>
+<p>After launching successfully, the rviz visualization tool will start up. 
The interface is shown in the figure.<br />
+</p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/9/d/9de4e145dccc9ebb806ef5ec3f8094c11d1859ab.png" rel="noopener nofollow ugc" title="gmapping"><img alt="gmapping" height="391" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/9/d/9de4e145dccc9ebb806ef5ec3f8094c11d1859ab_2_690x391.png" width="690" /></a></div><p></p>
+<p>At this time, the handle can be set to remote control mode to drive Limo around and build the map.</p>
+<p>After building the map, run the following commands to save the map to the specified directory:</p>
+<ol>
+<li>Switch to the directory where the map should be saved, here ~/agilex_ws/src/limo_ros/limo_bringup/maps/, by entering the command in the terminal:</li>
+</ol>
+<pre><code class="lang-auto">cd ~/agilex_ws/src/limo_ros/limo_bringup/maps/
+</code></pre>
+<ol start="2">
+<li>After switching to ~/agilex_ws/src/limo_ros/limo_bringup/maps/, continue to enter the command in the terminal:</li>
+</ol>
+<pre><code class="lang-auto">rosrun map_server map_saver -f map1
+</code></pre>
+<p><strong>Note:</strong> map1 is the name of the saved map, and duplicate names should be avoided when saving maps.</p>
+<h2><a class="anchor" href="https://discourse.ros.org#cartographer-3" name="cartographer-3"></a>Cartographer</h2>
+<p>Cartographer is a set of SLAM algorithms based on graph optimization launched by Google. The main goal of this algorithm is to achieve real-time SLAM with low computing resource consumption. The algorithm is mainly divided into two parts. The first part, called Local SLAM, builds and maintains a series of submaps from each laser scan frame, where each submap is a grid map. 
The second part, called Global SLAM, performs loop-closure detection to eliminate accumulated errors: once a submap has been built, no new laser scans are inserted into it, and the algorithm adds it to the loop-closure detection.</p>
+<p><strong>Note:</strong> Before running the command, please make sure that the programs in other terminals have been terminated. The termination command is: Ctrl+C.</p>
+<p><strong>Note:</strong> The speed of Limo should be slow in the process of mapping. If the speed is too fast, the effect of mapping will be affected.</p>
+<p>Launch a new terminal and enter the command:</p>
+<pre><code class="lang-auto">roslaunch limo_bringup limo_start.launch pub_odom_tf:=false
+</code></pre>
+<p>Then start the cartographer mapping algorithm. Open another new terminal and enter the command:</p>
+<pre><code class="lang-auto">roslaunch limo_bringup limo_cartographer.launch
+</code></pre>
+<p>After launching successfully, the rviz visualization interface will be shown as in the figure below:<br />
+</p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/d/c/dc83e862c3071db60baa93c1e067da859514e4ed.jpeg" rel="noopener nofollow ugc" title="carto_1"><img alt="carto_1" height="500" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/d/c/dc83e862c3071db60baa93c1e067da859514e4ed_2_661x500.jpeg" width="661" /></a></div><p></p>
+<p>After building the map, it is necessary to save it. 
The following three commands need to be entered in the terminal:</p>
+<p>(1) After completing the trajectory, stop accepting further data.</p>
+<pre><code class="lang-auto">rosservice call /finish_trajectory 0
+</code></pre>
+<p>(2) Serialize and save the current state.</p>
+<pre><code class="lang-auto">rosservice call /write_state "{filename: '${HOME}/agilex_ws/src/limo_ros/limo_bringup/maps/mymap.pbstream'}"
+</code></pre>
+<p>(3) Convert the pbstream to pgm and yaml.</p>
+<pre><code class="lang-auto">rosrun cartographer_ros cartographer_pbstream_to_ros_map -map_filestem=${HOME}/agilex_ws/src/limo_ros/limo_bringup/maps/mymap.pbstream -pbstream_filename=${HOME}/agilex_ws/src/limo_ros/limo_bringup/maps/mymap.pbstream -resolution=0.05
+</code></pre>
+<p>This generates the corresponding pgm and yaml in the same directory as:</p>
+<p>${HOME}/agilex_ws/src/limo_ros/limo_bringup/maps/mymap.pbstream</p>
+<p>Note: During the process of mapping, some warnings will appear in the terminal. They are caused by excessive speed and delayed data processing, and can be ignored.<br />
+</p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/3/2/32bd0fbb75ffcfa0caedc3c4eb0a48a1989617e3.png" rel="noopener nofollow ugc" title="carto_2"><img alt="carto_2" height="220" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/3/2/32bd0fbb75ffcfa0caedc3c4eb0a48a1989617e3_2_690x220.png" width="690" /></a></div><p></p>
+<h1><a class="anchor" href="https://discourse.ros.org#navigation-4" name="navigation-4"></a>Navigation</h1>
+<h3><a class="anchor" href="https://discourse.ros.org#navigation-framework-5" name="navigation-framework-5"></a>Navigation framework</h3>
+<p>The key to navigation is robot positioning and path planning. 
For these, ROS provides the following two packages.</p>
+<p>(1) move_base: achieves optimal path planning in robot navigation.</p>
+<p>(2) amcl: achieves robot positioning in a two-dimensional map.</p>
+<p>On the basis of the above two packages, ROS provides a complete navigation framework.<br />
+</p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/7/7/7741fb79c9ba4edc70d038066e1471fb5b22ad1e.png" rel="noopener nofollow ugc" title="ROS navigation framework"><img alt="ROS navigation framework" height="291" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/7/7/7741fb79c9ba4edc70d038066e1471fb5b22ad1e_2_690x291.png" width="690" /></a></div><br />
+The robot only needs to publish the necessary sensor information and the navigation goal position, and ROS can complete the navigation function. In this framework, the move_base package provides the main operation and interaction interface of navigation. To ensure the accuracy of the navigation path, the robot also needs to accurately locate its own position; this part of the functionality is implemented by the amcl package.<p></p>
+<h4><a class="anchor" href="https://discourse.ros.org#h-11-move_base-package-6" name="h-11-move_base-package-6"></a>1.1 Move_base package</h4>
+<p>move_base is a package for path planning in ROS, which is mainly composed of the following two planners.</p>
+<p>(1) Global path planning (global_planner). Global path planning plans the overall path according to a given goal position and the global map. In navigation, the Dijkstra or A* algorithm is used for global path planning, and the optimal route from the robot to the goal position is calculated as the robot’s global path.</p>
+<p>(2) Local real-time planning (local_planner). In practice, robots often cannot strictly follow the global path. 
So it is necessary to plan, in each cycle, the path the robot should travel according to the map information and any obstacles that may appear near the robot, so that it conforms to the global optimal path as much as possible.</p>
+<h4><a class="anchor" href="https://discourse.ros.org#h-12-amcl-package-7" name="h-12-amcl-package-7"></a>1.2 Amcl package</h4>
+<p>Autonomous positioning means that the robot can calculate its position on the map in any state. ROS provides developers with amcl, a probabilistic positioning system that localizes mobile robots in 2D: it implements adaptive (or KLD-sampling) Monte Carlo localization, using particle filtering to track the pose of the robot on a known map.</p>
+<h4><a class="anchor" href="https://discourse.ros.org#h-13-introduction-of-dwa_planner-and-teb_planner-8" name="h-13-introduction-of-dwa_planner-and-teb_planner-8"></a>1.3 Introduction of DWA_planner and TEB_planner</h4>
+<h5><a class="anchor" href="https://discourse.ros.org#dwa_planner-9" name="dwa_planner-9"></a>DWA_planner</h5>
+<p>The full name of DWA is Dynamic Window Approach. The algorithm searches multiple candidate paths, selects the optimal one based on various evaluation criteria (whether it will hit an obstacle, the time required, etc.), and calculates the linear and angular velocity for the driving cycle so as to avoid collisions with dynamic obstacles.</p>
+<h5><a class="anchor" href="https://discourse.ros.org#teb_planner-10" name="teb_planner-10"></a>TEB_planner</h5>
+<p>The full name of “TEB” is Timed Elastic Band local planner. This method modifies the initial trajectory generated by the global path planner to optimize the robot’s motion trajectory, and belongs to local path planning. 
In the process of trajectory optimization, the algorithm has a variety of optimization goals, including but not limited to: overall path length, trajectory running time, distance to obstacles, passing intermediate path points, and compliance with robot dynamics, kinematics, and geometric constraints. The TEB method explicitly considers the dynamic constraints of time and space in the state of motion; for example, the velocity and acceleration of the robot are limited.</p>
+<h3><a class="anchor" href="https://discourse.ros.org#limo-navigation-11" name="limo-navigation-11"></a>Limo navigation</h3>
+<p><strong>Note:</strong> In the four-wheel differential mode, the omnidirectional wheel mode and the track mode, the launch file used for navigation is the same.</p>
+<p><strong>Note:</strong> Before running the command, please make sure that the programs in other terminals have been terminated. The termination command is: Ctrl+C.</p>
+<p>(1) First launch the LiDAR and enter the command in the terminal:</p>
+<pre><code class="lang-auto">roslaunch limo_bringup limo_start.launch pub_odom_tf:=false
+</code></pre>
+<p>(2) Launch the navigation and enter the command in the terminal:</p>
+<pre><code class="lang-auto">roslaunch limo_bringup limo_navigation_diff.launch
+</code></pre>
+<p><strong>Note:</strong> If it is Ackermann motion mode, please run:</p>
+<pre><code class="lang-auto">roslaunch limo_bringup limo_navigation_ackerman.launch
+</code></pre>
+<p>After launching successfully, the rviz interface will be shown as in the figure below:<br />
+</p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/c/b/cbf90b32a8bb75cce05af0c01f5094809e1f4069.jpeg" rel="noopener nofollow ugc" title="navi_1"><img alt="navi_1" height="388" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/c/b/cbf90b32a8bb75cce05af0c01f5094809e1f4069_2_690x388.jpeg" width="690" /></a></div><p></p>
+<p><strong>Note:</strong> 
If you need to customize the map that is opened, please open the limo_navigation_diff.launch file to modify the parameters. The file directory is: ~/agilex_ws/src/limo_ros/limo_bringup/launch. Please change map02 to the name of the map that should be loaded.</p>
+<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/4/0/4076ecf0b20ad45b7ab64dda8e6707572a3f8593.png" rel="noopener nofollow ugc" title="navi_diff"><img alt="navi_diff" height="391" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/4/0/4076ecf0b20ad45b7ab64dda8e6707572a3f8593_2_690x391.png" width="690" /></a></div><p></p>
+<p>(3) After launching the navigation, it may be observed that the laser-scanned shape does not align with the map, requiring manual correction. To rectify this, adjust the actual position of the chassis in the scene displayed on the rviz map. Use the rviz tools to designate an approximate position for the vehicle, providing it with a preliminary estimate. Subsequently, use the handle tool to remotely rotate the vehicle until automatic alignment is achieved. Once the laser shape overlaps with the scene shape on the map, the correction process is concluded. 
The operational steps are outlined as follows:<br />
+</p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/d/d/dddc6afc5b57a2d16e952c0417769f606289bd3c.jpeg" rel="noopener nofollow ugc" title="limo_tu_2"><img alt="limo_tu_2" height="384" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/d/d/dddc6afc5b57a2d16e952c0417769f606289bd3c_2_690x384.jpeg" width="690" /></a></div><p></p>
+<p>After the correction is completed:</p>
+<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/5/3/539627d512375c5538e1891d2764a830a018e691.jpeg" rel="noopener nofollow ugc" title="navi3"><img alt="navi3" height="388" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/5/3/539627d512375c5538e1891d2764a830a018e691_2_690x388.jpeg" width="690" /></a></div><p></p>
+<p>(4) Set the navigation goal point through 2D Nav Goal.<br />
+</p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/c/0/c087f25803dfbb510b6f973eeef9c20e8be5e507.jpeg" rel="noopener nofollow ugc" title="limo_tu_3"><img alt="limo_tu_3" height="382" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/c/0/c087f25803dfbb510b6f973eeef9c20e8be5e507_2_690x382.jpeg" width="690" /></a></div><p></p>
+<p>A purple path will be generated on the map. 
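</p>
<p>For reference, the 2D Nav Goal tool publishes a <code>geometry_msgs/PoseStamped</code> message on the <code>/move_base_simple/goal</code> topic, in which the goal heading is encoded as a quaternion. The following sketch (plain Python, no ROS required; the helper name is ours) shows how a planar heading maps to that quaternion:</p>

```python
import math

def yaw_to_quaternion(yaw):
    """Convert a planar heading (radians) to the (x, y, z, w) quaternion
    stored in a PoseStamped orientation; rotation is about the z-axis only."""
    return (0.0, 0.0, math.sin(yaw / 2.0), math.cos(yaw / 2.0))

# A goal facing along +x (yaw = 0) is the identity quaternion:
print(yaw_to_quaternion(0.0))  # (0.0, 0.0, 0.0, 1.0)
```

<p>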
Switch the handle to command mode, and Limo will automatically navigate to the goal point.</p>
+<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/1/a/1a70d62038c185a244bc7b1a0d4de413ad8de05c.jpeg" rel="noopener nofollow ugc" title="navi_5"><img alt="navi_5" height="388" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/1/a/1a70d62038c185a244bc7b1a0d4de413ad8de05c_2_690x388.jpeg" width="690" /></a></div><p></p>
+<h1><a class="anchor" href="https://discourse.ros.org#limo-path-inspection-12" name="limo-path-inspection-12"></a>Limo path inspection</h1>
+<p>(1) First launch the LiDAR and enter the command in the terminal:</p>
+<pre><code class="lang-auto">roslaunch limo_bringup limo_start.launch pub_odom_tf:=false
+</code></pre>
+<p>(2) Launch the navigation and enter the command in the terminal:</p>
+<pre><code class="lang-auto">roslaunch limo_bringup limo_navigation_diff.launch
+</code></pre>
+<p><strong>Note:</strong> If it is Ackermann motion mode, please run:</p>
+<pre><code class="lang-auto">roslaunch limo_bringup limo_navigation_ackerman.launch
+</code></pre>
+<p>(3) Launch the path recording function. Open a new terminal, and enter the command in the terminal:</p>
+<pre><code class="lang-auto">roslaunch agilex_pure_pursuit record_path.launch
+</code></pre>
+<p>After the path recording is completed, terminate the path recording program by pressing Ctrl+C in the terminal.</p>
+<p>(4) Launch the path inspection function. 
Open a new terminal, and enter the command in the terminal:</p> +<p><strong>Note:</strong> Switch the handle to command mode.</p> +<pre><code class="lang-auto">roslaunch agilex_pure_pursuit pure_pursuit.launch +</code></pre> + <p><small>2 posts - 2 participants</small></p> + <p><a href="https://discourse.ros.org/t/ros-mapping-and-navigation-with-agilex-robotics-limo/36452">Read full topic</a></p> + Mon, 04 Mar 2024 07:28:26 +0000 + + + ROS Discourse General: ROS Meetup Arab + discourse.ros.org-topic-36439 + https://discourse.ros.org/t/ros-meetup-arab/36439 + <p>We’re excited to introduce the forthcoming installment of our Arabian Meet series, centered around the captivating theme of “Autonomous Racing: Advancing the Frontiers of Automated Technology.”</p> +<p>The topics we’ll explore include :</p> +<ul> +<li>Introduction to Autonomous Racing.</li> +<li>Autonomous Racing Competitions.</li> +<li>Racing Cars &amp; Sensor Technologies.</li> +<li>ROS-Based Racing Simulator.</li> +<li>Autonomous Racing Software Architecture.</li> +</ul> +<p>Stay tuned for more updates and save the date for this enlightening conversation! 
<img alt=":spiral_calendar:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/spiral_calendar.png?v=12" title=":spiral_calendar:" width="20" /></p> +<p>save time on the calendar:</p><aside class="onebox allowlistedgeneric"> + <header class="source"> + + <a href="https://accounts.google.com/v3/signin/identifier?continue=https%3A%2F%2Fcalendar.google.com%2Fcalendar%2Fevent%3Faction%3DTEMPLATE%26tmeid%3DMGhjNWNodTNya2MxMHEycW9vNHBkMGFycmQgYXJhYnJvYm9lbnRodXNpYXN0QG0%26tmsrc%3Darabroboenthusiast%40gmail.com&amp;emr=1&amp;followup=https%3A%2F%2Fcalendar.google.com%2Fcalendar%2Fevent%3Faction%3DTEMPLATE%26tmeid%3DMGhjNWNodTNya2MxMHEycW9vNHBkMGFycmQgYXJhYnJvYm9lbnRodXNpYXN0QG0%26tmsrc%3Darabroboenthusiast%40gmail.com&amp;ifkv=ATuJsjxgQkUkOFW5ri_9_zB_Q28be014orSHg4eFYV2ZWt5CNfRXOcqrlZmxblEmLhDYw8zd5Arlgg&amp;osid=1&amp;passive=1209600&amp;service=cl&amp;flowName=WebLiteSignIn&amp;flowEntry=ServiceLogin&amp;dsh=S-1425911791%3A1709450102390372" rel="noopener nofollow ugc" target="_blank">accounts.google.com</a> + </header> + + <article class="onebox-body"> + + +<h3><a href="https://accounts.google.com/v3/signin/identifier?continue=https%3A%2F%2Fcalendar.google.com%2Fcalendar%2Fevent%3Faction%3DTEMPLATE%26tmeid%3DMGhjNWNodTNya2MxMHEycW9vNHBkMGFycmQgYXJhYnJvYm9lbnRodXNpYXN0QG0%26tmsrc%3Darabroboenthusiast%40gmail.com&amp;emr=1&amp;followup=https%3A%2F%2Fcalendar.google.com%2Fcalendar%2Fevent%3Faction%3DTEMPLATE%26tmeid%3DMGhjNWNodTNya2MxMHEycW9vNHBkMGFycmQgYXJhYnJvYm9lbnRodXNpYXN0QG0%26tmsrc%3Darabroboenthusiast%40gmail.com&amp;ifkv=ATuJsjxgQkUkOFW5ri_9_zB_Q28be014orSHg4eFYV2ZWt5CNfRXOcqrlZmxblEmLhDYw8zd5Arlgg&amp;osid=1&amp;passive=1209600&amp;service=cl&amp;flowName=WebLiteSignIn&amp;flowEntry=ServiceLogin&amp;dsh=S-1425911791%3A1709450102390372" rel="noopener nofollow ugc" target="_blank">Google Calendar - Sign in to Access &amp; Edit Your Schedule</a></h3> + + <p>Access Google Calendar with a Google account (for personal use) or Google Workspace 
account (for business use).</p> + + + </article> + + <div class="onebox-metadata"> + + + </div> + + <div style="clear: both;"></div> +</aside> +<p> +You can find the meeting link here:</p><aside class="onebox allowlistedgeneric"> + <header class="source"> + <img class="site-icon" height="32" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/9/e/9ed0292ad9ae04ccaa893508cebd1fd741a2ba80.png" width="32" /> + + <a href="https://meet.google.com/ecu-cseq-hhv" rel="noopener nofollow ugc" target="_blank">Google Workspace</a> + </header> + + <article class="onebox-body"> + <div class="aspect-image"><img class="thumbnail" height="362" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/0/e/0ec45595982470ef64dcc81caf9a6f8837a2f064_2_690x362.jpeg" width="690" /></div> + +<h3><a href="https://meet.google.com/ecu-cseq-hhv" rel="noopener nofollow ugc" target="_blank">Google Meet: Online Web and Video Conferencing Calls | Google Workspace</a></h3> + + <p>Use Google Meet for secure online web conferencing calls and video chat as a part of Google Workspace.</p> + + + </article> + + <div class="onebox-metadata"> + + + </div> + + <div style="clear: both;"></div> +</aside> + +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/5/5/55c3ec018f4f6d71439c644e9551da08908f7e38.jpeg" rel="noopener nofollow ugc" title="image"><img alt="image" height="500" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/5/5/55c3ec018f4f6d71439c644e9551da08908f7e38.jpeg" width="500" /></a></div><p></p> + <p><small>1 post - 1 participant</small></p> + <p><a href="https://discourse.ros.org/t/ros-meetup-arab/36439">Read full topic</a></p> + Sun, 03 Mar 2024 07:16:59 +0000 + + + ROS Discourse General: Potential Humanoid Robotics Monthly Working Group + discourse.ros.org-topic-36426 + https://discourse.ros.org/t/potential-humanoid-robotics-monthly-working-group/36426 + <p>Hi 
Everyone,</p> +<p>I want to introduce myself - my name is Ronaldson Bellande. I’m a PhD student and the founder/CEO/CTO/director of research organizations and a startup I’m working on. You can find more information about me on my <a href="https://www.linkedin.com/in/ronaldson-bellande-5b9699178" rel="noopener nofollow ugc">LinkedIn</a> and <a href="https://github.com/RonaldsonBellande" rel="noopener nofollow ugc">GitHub profile</a>.</p> +<p>I wanted to create a working group that would meet monthly to discuss humanoid robotics: what everyone is working on, what you are looking for and excited about, anything interesting you are building, and more in the space of humanoid robotics.</p> +<p>If there is interest, I will start a working group; I’m passionate about this subject and other subjects related to my ongoing work.</p> + <p><small>13 posts - 8 participants</small></p> + <p><a href="https://discourse.ros.org/t/potential-humanoid-robotics-monthly-working-group/36426">Read full topic</a></p> + Sat, 02 Mar 2024 01:57:09 +0000 + + + ROS Discourse General: ROS News for the Week of February 26th, 2024 + discourse.ros.org-topic-36367 + https://discourse.ros.org/t/ros-news-for-the-week-of-february-26th-2024/36367 + <h1><a class="anchor" href="https://discourse.ros.org#ros-news-for-the-week-of-february-26th-2024-1" name="ros-news-for-the-week-of-february-26th-2024-1"></a>ROS News for the Week of February 26th, 2024</h1> +<br /> +<p><img alt="belt2" class="animated" height="288" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/b/7/b7eaaea21f96fb5a468b92cd87c38346522f10c3.gif" width="512" /></p> +<p>In manufacturing I’ve seen talented people use clever light placement to transform an extremely difficult computer vision task into something that’s easily solved. I came across a paper this week that does just that for the robotic manipulation of objects.
The paper is titled, <a href="https://dgdm-robot.github.io/">“Dynamics-Guided Diffusion Model for Robot Manipulator Design”</a> and the authors use diffusion models to make simple grippers that can manipulate a specific object into a given pose. The results are pretty cool and could be very useful for any roboticist with a 3D printer.</p> +<br /> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/0/6/06808ab2cd53461d9e9170f3315f78f669ac702b.jpeg" title="image"><img alt="image" height="250" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/0/6/06808ab2cd53461d9e9170f3315f78f669ac702b_2_313x250.jpeg" width="313" /></a></div><br /> +<a href="https://arstechnica.com/ai/2024/02/amazon-to-spend-1-billion-on-startups-that-combine-ai-with-robots/">Amazon is putting up US$1B to fund startups that combine robotics and “AI.”</a> While regular startup investment has fallen off a bit, it looks like there are still funding opportunities for robotics companies.<p></p> +<br /> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/e/b/eb729571e2a23ef928c2429a0c5005f5763b4c1a.jpeg" title="image"><img alt="image" height="250" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/e/b/eb729571e2a23ef928c2429a0c5005f5763b4c1a_2_310x250.jpeg" width="310" /></a></div><br /> +Last week everyone was talking about how <a href="https://www.forbes.com/sites/greatspeculations/2024/02/26/ai-revolution-sparks-nvidias-historic-market-cap-achievement/?sh=31a262673362">NVIDIA’s market cap had hit US$2T</a>. 
<a href="https://www.linkedin.com/pulse/nvidia-open-navigation-collaborate-drive-new-mobile-amulya-vishwanath-pesqc/?trackingId=1qN86jkFQleBX%2FC%2Bx%2Fv%2BMA%3D%3D">According to this LinkedIn</a> post, they are putting that money to good use by funding the development of the open source Nav2 project.<p></p> +<br /> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/1/e/1e1a12e89f98ef4f5ee95da821dabc380e6ceaaf.jpeg" title="fig_lvt2calib_overview"><img alt="fig_lvt2calib_overview" height="374" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/1/e/1e1a12e89f98ef4f5ee95da821dabc380e6ceaaf_2_416x374.jpeg" width="416" /></a></div><br /> +Cross-sensor calibration is a pain in the <img alt=":peach:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/peach.png?v=12" title=":peach:" width="20" />. A good robot model can only get you so far, and getting a bunch of sensor data to match up can be difficult for even the most seasoned engineers. The GitHub repository below attempts to build a toolbox to fix some of these problems.
<a href="https://github.com/Clothooo/lvt2calib?tab=readme-ov-file">LVT2Calib: Automatic and Unified Extrinsic Calibration Toolbox for Different 3D LiDAR, Visual Camera and Thermal Camera</a> – <a href="https://www.researchgate.net/publication/371377845_L2V2T2Calib_Automatic_and_Unified_Extrinsic_Calibration_Toolbox_for_Different_3D_LiDAR_Visual_Camera_and_Thermal_Camera">paper</a><p></p> +<h1><a class="anchor" href="https://discourse.ros.org#events-2" name="events-2"></a>Events</h1> +<ul> +<li><a href="https://www.meetup.com/boulderisforrobots/events/299280969/">2024-03-06 Boulder is for Robots Meetup</a></li> +<li><a href="https://online-learning.tudelft.nl/courses/hello-real-world-with-ros-robot-operating-systems/">2024-03-14 TU Delft ROS MOOC (FREE!)</a></li> +<li><a href="https://rosindustrial.org/events/2024/ros-industrial-consortium-americas-2024-annual-meeting">2024-03-27 ROS Industrial Annual Meeting</a></li> +<li><a href="https://www.eventbrite.com/e/robotics-and-tech-happy-hour-tickets-835278218637?aff=soc">2024-03-27 Robotics Happy Hour Pittsburgh</a></li> +<li><a href="https://www.eventbrite.com/e/robots-on-ice-40-2024-tickets-815030266467">2024-04-21 Robots on Ice</a></li> +<li><a href="https://2024.oshwa.org/">2024-05-03 Open Hardware Summit Montreal</a></li> +<li><a href="https://www.tum-venture-labs.de/education/bots-bento-the-first-robotic-pallet-handler-competition-icra-2024/">2024-05-13 Bots &amp; Bento @ ICRA 2024</a></li> +<li><a href="https://discourse.ros.org/t/acm-sigsoft-summer-school-on-software-engineering-for-robotics-in-brussels/35992">2024-06-04 → 2024-06-08 ACM SIGSOFT Summer School on Software Engineering for Robotics</a></li> +<li><a href="https://www.agriculture-vision.com/">2024-06-?? 
Workshop on Agriculture Vision at CVPR 2024</a></li> +<li><a href="https://icra2024.rt-net.jp/archives/87">2024-06-16 Food Topping Challenge at ICRA 2024</a></li> +<li><a href="https://discourse.ros.org/t/roscon-france-2024/35222">2024-06-19 → 2024-06-20 ROSCon France</a></li> +<li><a href="https://sites.udel.edu/ceoe-able/able-summer-bootcamp/">2024-08-05 Autonomous Systems Bootcamp at Univ. Delaware</a> – <a href="https://www.youtube.com/watch?v=4ETDsBN2o8M">Video</a></li> +<li><a href="https://roscon.jp/2024/">2024-09-25 ROSConJP</a></li> +</ul> +<h1><a class="anchor" href="https://discourse.ros.org#news-3" name="news-3"></a>News</h1> +<ul> +<li><a href="https://github.com/Tanneguydv/pyolp_robotics">PyOLP – Python tool for Offline Robot Programming</a></li> +<li><a href="https://concreteproducts.com/index.php/2024/02/26/rebar-placement-robot-books-15-ton-day-on-florida-bridge-deck/">Rebar and Tie Robot</a></li> +<li><a href="https://www.therobotreport.com/apple-reportedly-pulls-plug-on-autonomous-vehicles/">Apple Car Canceled</a></li> +<li><a href="https://www.therobotreport.com/robco-raises-42-5m-for-automation-for-small-midsize-manufacturers/">RobCo Raises $42.5M</a></li> +<li><a href="https://spectrum.ieee.org/air-force-research-ares-os">Air Force Uses Robots for Lab Work</a></li> +<li><a href="https://arstechnica.com/ai/2024/02/amazon-to-spend-1-billion-on-startups-that-combine-ai-with-robots/">Amazon to drop $1B on Startups that use Robots and AI</a></li> +<li><a href="https://vimeo.com/917608513?share=copy">Gazebo Community Meeting on Pan-African Robotics Competition</a></li> +<li><a href="https://medium.com/toyotaresearch/meet-punyo-tris-soft-robot-for-whole-body-manipulation-research-949c934ac3d8">TRI’s Soft Robot Punyo</a></li> +<li><a href="https://discourse.ros.org/t/robot-fleet-management-make-vs-buy-an-alternative/36330">Fleet Managers: Build? Buy?
A third thing?</a></li> +<li><a href="https://discourse.ros.org/t/roscon-france-j-28-pour-le-call-for-paper/36317">ROSCon France CFP is Now Open</a></li> +<li><a href="https://www.linkedin.com/pulse/nvidia-open-navigation-collaborate-drive-new-mobile-amulya-vishwanath-pesqc/?trackingId=1qN86jkFQleBX%2FC%2Bx%2Fv%2BMA%3D%3D">NVIDIA Support for Nav2 (LinkedIn)</a></li> +<li><a href="https://cvpr.thecvf.com/Conferences/2024">CVPR Tutorials and Workshops Announced</a></li> +<li><a href="https://svrobo.org/volunteer-with-silicon-valley-robotics/">SVR and IEEE RAS Volunteer Form (also free desks at Circuit Launch)</a></li> +<li><a href="https://www.therobotreport.com/picknik-robotics-moveit-studio-is-now-moveit-pro/">Robot Report MoveIt Pro</a></li> +</ul> +<h1><a class="anchor" href="https://discourse.ros.org#ros-4" name="ros-4"></a>ROS</h1> +<ul> +<li><a href="https://vimeo.com/917307697?share=copy">Cloud Robotics Working Group Meeting Recording</a></li> +<li><a href="https://vimeo.com/917307697?share=copy">Updates to REP-147 to Improve Drone Performance</a></li> +<li><a href="https://discourse.ros.org/t/development-topics-for-aerial-robotics-indoor-navigation/36347">Development topics for Aerial Robotics - Indoor Navigation</a></li> +<li><a href="https://discourse.ros.org/t/new-packages-for-ros-2-rolling-ridley-2024-02-28/36358">5 New and 279 Updated Packages for Rolling – Sync hold for 24.04 Migration</a></li> +<li><a href="https://discourse.ros.org/t/new-ros-enhancement-proposal-for-marine-robotics/36218">Marine Robotics Conventions REP Discussion</a></li> +<li><a href="https://discourse.ros.org/t/gtc-march-18-21-highlights-for-ros-ai-robotics/36274">NVIDIA GTC Highlights</a></li> +<li><a href="https://github.com/Clothooo/lvt2calib">Unified Calibration for 3D LiDARs, Cameras and Thermal Cameras</a></li> +<li><a href="https://github.com/scepter914/DepthAnything-ROS">Depth-Anything ROS</a></li> +<li><a
href="https://github.com/tatsuyai713/RCL-like-Wrapper-for-Fast-DDS/">RCL Wrapper for Fast DDS</a></li> +<li><a href="https://github.com/alexklwong/awesome-state-of-depth-completion">Awesome State of Depth Completion</a></li> +<li><a href="https://www.youtube.com/watch?app=desktop&amp;v=rBPPuN-KQ08&amp;feature=youtu.be">Bug Fixes and Logging with ROS 2</a></li> +<li><a href="https://github.com/TUMFTM/Multi_LiCa">Multi - LiDAR-to-LiDAR calibration framework for ROS2 and non-ROS applications</a></li> +<li><a href="https://github.com/RTI-BDI/ROS2-BDI">ROS 2 / PlanSys BDI Framework</a></li> +<li><a href="https://github.com/unitreerobotics/point_lio_unilidar">LiDAR Odometry for Unitree L1</a></li> +<li><a href="https://www.dihnamic.eu/fr/1603-2/">What can ROS bring to my industrial robots? (French)</a></li> +<li><a href="https://www.youtube.com/@ROSCon-India">ROSCon India 2023 Videos</a></li> +<li><a href="https://dgdm-robot.github.io">(COOL) Dynamics Guided Diffusion Model for Robot Manipulator Design</a></li> +<li><a href="https://github.com/teamspatzenhirn/rviz_2d_overlay_plugins">RViz 2D Overlay Plugins</a></li> +<li><a href="https://foxglove.dev/blog/announcing-h264-support-in-foxglove">H.264 Support in Foxglove </a></li> +<li><a href="https://www.youtube.com/watch?v=JIGO3b-aoz8">What to do when your robot’s camera doesn’t work</a></li> +<li><a href="https://fieldrobotics.net/Field_Robotics/Volume_4_files/Vol4_01.pdf">Fast and Modular Autonomy Software for Autonomous Racing Vehicles</a></li> +</ul> +<h1><a class="anchor" href="https://discourse.ros.org#ros-questions-5" name="ros-questions-5"></a>ROS Questions</h1> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/8/f/8fb2ed962f40d1b9346f5514c82d543fe805e6e8.jpeg" title="image"><img alt="image" height="250" 
src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/8/f/8fb2ed962f40d1b9346f5514c82d543fe805e6e8_2_186x250.jpeg" width="186" /></a></div><p></p> +<p>Got a minute? <a href="https://robotics.stackexchange.com/">Please take a moment to answer a question on Robotics Stack Exchange and help out your fellow ROS users.</a></p> + <p><small>1 post - 1 participant</small></p> + <p><a href="https://discourse.ros.org/t/ros-news-for-the-week-of-february-26th-2024/36367">Read full topic</a></p> + Fri, 01 Mar 2024 18:15:56 +0000 + + + ROS Discourse General: Revival of client library working group? + discourse.ros.org-topic-36406 + https://discourse.ros.org/t/revival-of-client-library-working-group/36406 + <p>Hi,<br /> +are there any plans to revive this working group?</p> + <p><small>15 posts - 5 participants</small></p> + <p><a href="https://discourse.ros.org/t/revival-of-client-library-working-group/36406">Read full topic</a></p> + Fri, 01 Mar 2024 16:19:00 +0000 + + + ROS Discourse General: Scalability issues with large number of nodes + discourse.ros.org-topic-36399 + https://discourse.ros.org/t/scalability-issues-with-large-number-of-nodes/36399 + <p>My team and I are developing a mobile platform for industrial tasks (such as rivet fastening or drilling), fully based on the ROS 2 stack (Humble).</p> +<p>The stack comprises a bunch of nodes for different tasks (SLAM, motion planning, fiducial registration…), which are coordinated through a state machine node (based on smach).</p> +<p>The issue we are facing is that the state machine node (which is connected to most of the nodes in the stack) gets slower and slower until it stops receiving events from other nodes.</p> +<p>We’ve been debugging this issue, and our feeling is that the number of objects (nodes/clients/subscribers…) is too high and the whole stack suffers a lot of overhead, this being most noticeable in the “biggest” node (the state machine).</p> +<p>Our stack has 80 nodes and a total of 1505 objects:</p>
+<ul> +<li>Stack clients: 198</li> +<li>Stack services: 636</li> +<li>Stack publishers: 236</li> +<li>Stack subscribers: 173</li> +</ul> +<p>My questions are:</p> +<ul> +<li>Is this number of nodes too high for an industrial robotics project? How large do projects using ROS 2 usually get?</li> +<li>What is the maximum number of objects in the stack? Is this an rmw limitation, or a limitation of ROS 2 itself?</li> +</ul> + <p><small>30 posts - 14 participants</small></p> + <p><a href="https://discourse.ros.org/t/scalability-issues-with-large-number-of-nodes/36399">Read full topic</a></p> + Fri, 01 Mar 2024 09:35:36 +0000 + + + ROS Discourse General: Robot Fleet Management: Make vs. Buy? An Alternative + discourse.ros.org-topic-36330 + https://discourse.ros.org/t/robot-fleet-management-make-vs-buy-an-alternative/36330 + <p>Virtually every robotics CTO we’ve spoken to has told us about this dilemma with fleet management systems: neither “make” nor “buy” is a great option! With Transitive we are providing an alternative.</p> +<aside class="onebox allowlistedgeneric"> + <header class="source"> + <img class="site-icon" height="16" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/e/c/ec02994dee445cd82f49ef66d73f44e715a315b3.png" width="16" /> + + <a href="https://transitiverobotics.com/blog/make-vs-buy/" rel="noopener nofollow ugc" target="_blank" title="12:00AM - 26 February 2024">transitiverobotics.com – 26 Feb 24</a> + </header> + + <article class="onebox-body"> + <div class="aspect-image"><img class="thumbnail" height="362" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/3/0/30d7b39881ac5c6bfd5ca25ba8e15d49daee742f_2_690x362.png" width="690" /></div> + +<h3><a href="https://transitiverobotics.com/blog/make-vs-buy/" rel="noopener nofollow ugc" target="_blank">Robot Fleet Management: Make vs. Buy? An Alternative | Transitive Robotics</a></h3> + + <p>Transitive provides an alternative to the make vs.
buy dilemma of robot fleet management.</p> + + + </article> + + <div class="onebox-metadata"> + + + </div> + + <div style="clear: both;"></div> +</aside> + + <p><small>8 posts - 5 participants</small></p> + <p><a href="https://discourse.ros.org/t/robot-fleet-management-make-vs-buy-an-alternative/36330">Read full topic</a></p> + Mon, 26 Feb 2024 23:00:34 +0000 + + + ROS Discourse General: Rclcpp template metaprogramming bug. Help wanted + discourse.ros.org-topic-36319 + https://discourse.ros.org/t/rclcpp-template-metaprogramming-bug-help-wanted/36319 + <p>Hi,<br /> +we hit a bug in the function traits that is out of my league.<br /> +If you are really good with template metaprogramming, please have a look at:</p><aside class="onebox githubissue"> + <header class="source"> + + <a href="https://github.com/ros2/rclcpp/issues/2429" rel="noopener nofollow ugc" target="_blank">github.com/ros2/rclcpp</a> + </header> + + <article class="onebox-body"> + <div class="github-row"> + <div class="github-icon-container" title="Issue"> + <svg class="github-icon" height="60" viewBox="0 0 14 16" width="60" xmlns="http://www.w3.org/2000/svg"><path d="M7 2.3c3.14 0 5.7 2.56 5.7 5.7s-2.56 5.7-5.7 5.7A5.71 5.71 0 0 1 1.3 8c0-3.14 2.56-5.7 5.7-5.7zM7 1C3.14 1 0 4.14 0 8s3.14 7 7 7 7-3.14 7-7-3.14-7-7-7zm1 3H6v5h2V4zm0 6H6v2h2v-2z" fill-rule="evenodd"></path></svg> + </div> + + <div class="github-info-container"> + <h4> + <a href="https://github.com/ros2/rclcpp/issues/2429" rel="noopener nofollow ugc" target="_blank">AnySubscriptionCallback doesn't accept `std::bind` callbacks with bound arguments</a> + </h4> + + <div class="github-info"> + <div class="date"> + opened <span class="discourse-local-date">09:56AM - 20 Feb 24 UTC</span> + </div> + + + <div class="user"> + <a href="https://github.com/HovorunB" rel="noopener nofollow ugc" target="_blank"> + <img alt="HovorunB" class="onebox-avatar-inline" height="20"
src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/a/0/a09013e7344330898234ab4ff20056eac37b1631.png" width="20" /> + HovorunB + </a> + </div> + </div> + + <div class="labels"> + </div> + </div> +</div> + + <div class="github-row"> + <p class="github-body-container">## Bug report + +**Required Info:** + +- Operating System: + - Ubuntu 22.04 +- <span class="show-more-container"><a class="show-more" href="https://discourse.ros.org" rel="noopener">…</a></span><span class="excerpt hidden">Installation type: + - from source +- Version or commit hash: + - rolling +- DDS implementation: + - Fast-RTPS +- Client library (if applicable): + - rclcpp + +#### Steps to reproduce issue + +1) Build rclcpp on a version after https://github.com/ros2/rclcpp/pull/1928 +2) Try to build for example [foxglove-bridge](https://github.com/foxglove/ros-foxglove-bridge) + +#### Expected behavior +`std::bind(&amp;FoxgloveBridge::rosMessageHandler, this, channelId, clientHandle, _1),` [(source)](https://github.com/foxglove/ros-foxglove-bridge/blob/89239eb5bbb8549fec08ade82254fabcf773cc37/ros2_foxglove_bridge/src/ros2_foxglove_bridge.cpp#L535-L538) is cast to `std::function` from [any_subscription_callback.hpp](https://github.com/ros2/rclcpp/blob/10252e9f66ac87f3903f301b64320d32457f0658/rclcpp/include/rclcpp/any_subscription_callback.hpp#L416) + + +#### Actual behavior +![Screenshot from 2024-02-19 10-56-06](https://github.com/ros2/rclcpp/assets/87417416/56f16069-6ab4-4a4d-ae28-a9865cafaf17) + +#### Additional information + +For example `std::bind(&amp;Class::method, this, std::placeholders::_1)` (without bound arguments) will build fine + +We also were able to fix the issue by casting the callback to `std::function` before passing it to the subscription +``` +auto subscriber = this-&gt;create_generic_subscription( + topic, datatype, qos, + 
static_cast&lt;std::function&lt;void(std::shared_ptr&lt;rclcpp::SerializedMessage&gt;)&gt;&gt;(std::bind(&amp;FoxgloveBridge::rosMessageHandler, this, channelId, clientHandle, _1)), + subscriptionOptions); +``` +Is this how it is supposed to be done now, or is there a bug in casting std::bind from any_subscription_callback.hpp?</span></p> + </div> + + </article> + + <div class="onebox-metadata"> + + + </div> + + <div style="clear: both;"></div> +</aside> +<p> +Thanks.</p> + <p><small>1 post - 1 participant</small></p> + <p><a href="https://discourse.ros.org/t/rclcpp-template-metaprogramming-bug-help-wanted/36319">Read full topic</a></p> + Mon, 26 Feb 2024 09:54:21 +0000 + + + ROS Discourse General: ROS News for the Week of February 19th, 2024 + discourse.ros.org-topic-36297 + https://discourse.ros.org/t/ros-news-for-the-week-of-february-19th-2024/36297 + <h1><a class="anchor" href="https://discourse.ros.org#ros-news-for-the-week-of-february-19th-2024-1" name="ros-news-for-the-week-of-february-19th-2024-1"></a>ROS News for the Week of February 19th, 2024</h1> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/1/5/156ac623189dad6794d1ea4909bc5936954f0a75.jpeg" title="ROS &amp; Gazebo GSoC 2024 (4)"><img alt="ROS &amp; Gazebo GSoC 2024 (4)" height="291" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/1/5/156ac623189dad6794d1ea4909bc5936954f0a75_2_517x291.jpeg" width="517" /></a></div><br /> +<a href="https://discourse.ros.org/t/attention-students-open-robotics-google-summer-of-code-2024-projects/36271">Open Robotics will be participating in Google Summer of Code 2024.</a> We’re looking for a few interns to help us out! 
See the post for all the details.<p></p> +<br /> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/e/c/ec6167c7415dff0d97740f4de4ae75be20f9108d.jpeg" title="Copy of Feb24GCM"><img alt="Copy of Feb24GCM" height="291" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/e/c/ec6167c7415dff0d97740f4de4ae75be20f9108d_2_517x291.jpeg" width="517" /></a></div><p></p> +<p><a href="https://community.gazebosim.org/t/community-meeting-pan-african-robotics-competition-parc/2564">Our next Gazebo Community meeting is next Wednesday, February 28th. </a> Sikiru Salau, a competitor in the <a href="https://parcrobotics.org/">Pan-African Robotics Competition</a>, will be joining us to talk about simulating robots for agriculture.</p> +<br /> +<p><img alt="image" height="272" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/f/f/ff6ea0e30d4779c75920306594cb6795825311c7.jpeg" width="405" /><br /> +Hello Robot is having a great month! Last week they released their <a href="https://hello-robot.com/stretch-3-whats-new">third gen robot</a>. 
This week they are at the top of the <a href="https://news.ycombinator.com/item?id=39483482">orange website</a> with <a href="https://ok-robot.github.io">this “OK Robot” paper from NYU</a>.</p> +<br /> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/2/4/246645eb0057f4128b9c46ed3e3b62211654347f.jpeg" title="282791034-352fa4d7-270b-43e4-bd51-bcee4377b07a"><img alt="282791034-352fa4d7-270b-43e4-bd51-bcee4377b07a" height="286" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/2/4/246645eb0057f4128b9c46ed3e3b62211654347f_2_517x286.jpeg" width="517" /></a></div><br /> +<a href="https://github.com/JatinPatil2003/AutoNav">Check out the AutoNav robot by Jatin Patil.</a> Hats off to the developer, this is a really well-put-together personal project!<p></p> +<br /> +<p></p><div class="lightbox-wrapper"><a class="lightbox" href="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/5/5/5581295f1c63eb51bd8ab79e6bdf722d3c2ac818.jpeg" title="RIP"><img alt="RIP" height="250" src="https://global.discourse-cdn.com/business7/uploads/ros/optimized/3X/5/5/5581295f1c63eb51bd8ab79e6bdf722d3c2ac818_2_250x250.jpeg" width="250" /></a></div><br /> +Just a reminder: <a href="https://discourse.ros.org/t/gazebo-classic-end-of-life-ros-2-jazzy/36239">Gazebo Classic goes End-Of-Life in January 2025 and ROS 2 Jazzy will not support Gazebo Classic.</a> We put together some guidance for those of you who need to make the switch!<p></p> +<h1><a class="anchor" href="https://discourse.ros.org#events-2" name="events-2"></a>Events</h1> +<ul> +<li><a href="https://discourse.ros.org/t/february-2024-meetings-aerial-robotics/35981">February Aerial Robotics Meetings</a></li> +<li><a href="https://community.gazebosim.org/t/community-meeting-pan-african-robotics-competition-parc/2564">2024-02-28 Gazebo Community Meeting wsg Sikiru Salau of Pan-African Robotics Competition</a></li> +<li><a
href="https://www.meetup.com/boulderisforrobots/events/299280969/">2024-03-06 Boulder is for Robots Meetup</a></li> +<li><a href="https://online-learning.tudelft.nl/courses/hello-real-world-with-ros-robot-operating-systems/">2024-03-14 TU Delft ROS MOOC (FREE!)</a></li> +<li><a href="https://rosindustrial.org/events/2024/ros-industrial-consortium-americas-2024-annual-meeting">2024-03-27 ROS Industrial Annual Meeting</a></li> +<li><a href="https://www.eventbrite.com/e/robotics-and-tech-happy-hour-tickets-835278218637?aff=soc">2024-03-27 Robotics Happy Hour Pittsburgh</a></li> +<li><a href="https://www.eventbrite.com/e/robots-on-ice-40-2024-tickets-815030266467">2024-04-21 Robots on Ice</a></li> +<li><a href="https://2024.oshwa.org/">2024-05-03 Open Hardware Summit Montreal</a></li> +<li><a href="https://www.tum-venture-labs.de/education/bots-bento-the-first-robotic-pallet-handler-competition-icra-2024/">2024-05-13 Bots &amp; Bento @ ICRA 2024</a></li> +<li><a href="https://discourse.ros.org/t/acm-sigsoft-summer-school-on-software-engineering-for-robotics-in-brussels/35992">2024-06-04 → 2024-06-08 ACM SIGSOFT Summer School on Software Engineering for Robotics</a></li> +<li><a href="https://www.agriculture-vision.com/">2024-06-?? Workshop on Agriculture Vision at CVPR 2024</a></li> +<li><a href="https://icra2024.rt-net.jp/archives/87">2024-06-16 Food Topping Challenge at ICRA 2024</a></li> +<li><a href="https://discourse.ros.org/t/roscon-france-2024/35222">2024-06-19 → 2024-06-20 ROSCon France</a></li> +<li><a href="https://sites.udel.edu/ceoe-able/able-summer-bootcamp/">2024-08-05 Autonomous Systems Bootcamp at Univ.
Delaware</a> – <a href="https://www.youtube.com/watch?v=4ETDsBN2o8M">Video</a></li> +<li><a href="https://roscon.jp/2024/">2024-09-25 ROSConJP</a></li> +</ul> +<h1><a class="anchor" href="https://discourse.ros.org#news-3" name="news-3"></a>News</h1> +<ul> +<li><a href="https://discourse.ros.org/t/announcing-moveit-pro-runtime-and-developer-platform-previously-moveit-studio/36235">MoveIt Pro Released</a> – <a href="https://www.therobotreport.com/picknik-robotics-moveit-studio-is-now-moveit-pro/">The Robot Report</a></li> +<li><a href="https://www.therobotreport.com/intuitive-machines-odysseus-makes-first-us-lunar-landing-50-years/">Intuitive Machines Lands on the Moon!</a> – <a href="https://spectrum.ieee.org/lunar-landing-intuitive-machines">IEEE Spectrum</a></li> +<li><a href="https://www.swri.org/industry/industrial-robotics-automation/blog/unveiling-novel-lunar-rover-navigation">SWRI’s Novel Lunar Rover Navigation System</a></li> +<li><a href="https://arxiv.org/abs/2402.13616">YOLO v.
9 Paper</a> – <a href="https://github.com/WongKinYiu/yolov9">Source</a></li> +<li><a href="https://github.com/snt-arg/lidar_s_graphs/">Real Time S-Graphs for Robot Pose</a> – <a href="https://ieeexplore.ieee.org/document/10168233">Paper</a></li> +<li><a href="https://github.com/ywyeli/lidar-camera-placement">Influence of Camera-LiDAR Configuration on 3D Object Detection for Autonomous Driving</a></li> +<li><a href="https://ok-robot.github.io/">An open, modular framework for zero-shot, language conditioned pick-and-drop tasks in arbitrary homes.</a></li> +<li><a href="https://www.robot-learning.uk/dinobot">DINOBot: Robot Manipulation via Retrieval and Alignment with Vision Foundation Models</a></li> +<li><a href="https://marwan99.github.io/Fit-NGP/">Cool: Fit-NGP: Fitting Object Models to Neural Graphics Primitives</a> – <a href="https://www.youtube.com/watch?v=KQ7yH_em3Qg">Video</a></li> +<li><a href="https://github.com/MIT-SPARK/Khronos">Khronos: Spatio-Temporal Metric-Semantic SLAM</a></li> +<li><a href="https://github.com/huiyu-gao/VisFusion">VisFusion: Visibility-aware Online 3D Scene Reconstruction from Videos</a></li> +<li><a href="https://github.com/ISEE-Technology/lidar-with-velocity">Lidar With Velocity: Correcting Moving Objects Point Cloud Distortion From Oscillating Scanning Lidars by Fusion With Camera</a></li> +<li><a href="https://techcrunch.com/2024/02/17/dutch-startup-monumental-is-using-robots-to-lay-bricks/">Dutch startup Monumental is using robots to lay bricks</a></li> +<li><a href="https://www.therobotreport.com/olis-robotics-and-kawasaki-partner-to-offer-remote-troubleshooting/">Olis and Kawasaki Ink Deal for Remote Support</a></li> +<li><a href="https://spectrum.ieee.org/video-friday-pedipulate">Video Friday</a></li> +</ul> +<h1><a class="anchor" href="https://discourse.ros.org#ros-4" name="ros-4"></a>ROS</h1> +<ul> +<li><a href="https://discourse.ros.org/t/attention-students-open-robotics-google-summer-of-code-2024-projects/36271">Open
Robotics @ GSoC 2024</a></li> +<li><a href="https://discourse.ros.org/t/gazebo-classic-end-of-life-ros-2-jazzy/36239">Gazebo Classic End-Of-Life and ROS 2 Jazzy</a></li> +<li><a href="https://discourse.ros.org/t/new-packages-and-patch-release-for-humble-hawksbill-2024-02-22/36275">42 New and 280 Updated Packages for Humble</a></li> +<li><a href="https://discourse.ros.org/t/new-packages-for-iron-irwini-2024-02-23/36283">2 New and 75 Updated Packages for Iron</a></li> +<li><a href="https://discourse.ros.org/t/new-packages-for-noetic-2024-02-22/36273">0 New and 136 Updated Packages for Noetic</a></li> +<li><a href="https://discourse.ros.org/t/migrating-turtlebot-2-to-ros-2/36225">ROS 2 Migration Guide using TurtleBot 2</a></li> +<li><a href="https://discourse.ros.org/t/new-ros-enhancement-proposal-for-marine-robotics/36218/5">REP Proposal: Coordinate Frames for Maritime Robots</a></li> +<li><a href="https://discourse.ros.org/t/introducing-psdk-ros2-bridging-djis-psdk-libraries-with-ros-2/33500">ROS 2 DJI Drone PSDK Bridge</a></li> +<li><a href="https://discourse.ros.org/t/learn-ros2-with-a-limo-robot-ros-developers-openclass-182/36287">Learn ROS 2 with LIMO Robot</a></li> +<li><a href="https://discourse.ros.org/t/gtc-march-18-21-highlights-for-ros-ai-robotics/36274">Highlights from NVIDIA GTC for ROS Developers</a></li> +<li><a href="https://discourse.ros.org/t/handle-unique-parameters-for-robot-instances/36074">Handle Unique Parameters for Robot Instances</a></li> +<li><a href="https://discourse.ros.org/t/we-use-websockets-and-ros-messaging-together-in-our-robot-software-stack-should-you/36199">Thoughts on Websockets with ROS Messaging</a></li> +<li><a href="https://vimeo.com/915293743">Maritime Robotics Working Group wsg HoloOcean</a></li> +<li><a href="https://discourse.ros.org/t/plotjuggler-3-9-1-is-out-and-few-more-things-you-should-know/36210">PlotJuggler 3.9.1 Release</a></li> +<li><a
href="https://discourse.ros.org/t/devops-for-robotics-certificate-training-in-barcelona-spain-march-20-22-2024/36213">Devops for Robotics Certificate Training</a></li> +<li><a href="https://discourse.ros.org/t/preparing-ros-2-rolling-for-the-transition-to-ubuntu-24-04/35673">Rolling out 24.04 for Rolling</a></li> +<li><a href="https://discourse.ros.org/t/mcap-file-editor-gui-in-your-browser/36198">MCAP File Editor from @facontidavide</a></li> +<li><a href="https://www.hackster.io/aal-shaji/differential-drive-robot-using-ros2-and-esp32-aae289">Diff Drive ROS 2 Robot with ESP32</a></li> +<li><a href="https://www.petrikvandervelde.nl/posts/Swerve-drive-motor-limitations">Swerve Drive Motor Limitations</a> ← This personal blog is <img alt=":100:" class="emoji" height="20" src="https://emoji.discourse-cdn.com/twitter/100.png?v=12" title=":100:" width="20" /></li> +<li><a href="https://github.com/JatinPatil2003/AutoNav">New AutoNav ROS Robot</a> – <a href="https://www.youtube.com/watch?v=g5LXZQY55DI">Video</a></li> +<li><a href="https://www.youtube.com/watch?v=Sz1fanH58kg">ROS Intro Workshop @ Purdue</a> – <a href="https://ivory-sale-974.notion.site/ARC-ROS-Workshop-2d26d5bcdd69496996806ccf8e5a011b">Materials</a></li> +<li><a href="https://www.youtube.com/watch?v=Y6AUsB3RUhA">Robotics at Compile Time: Optimizing Robotics Algorithms With C++'s Compile-Time Features - CppCon23</a></li> +<li><a href="https://github.com/jarain78/mycobot280_movelt2">MyCobot280 for MoveIt</a></li> +<li><a href="https://rosmo-robot.github.io/zio/">Ziobot ROS Robot</a></li> +</ul> +<h1><a class="anchor" href="https://discourse.ros.org#ros-questions-5" name="ros-questions-5"></a>ROS Questions</h1> +<p>Got a minute to spare? 
Pay it forward by answering a few ROS questions on <a href="https://robotics.stackexchange.com/">Robotics Stack Exchange</a>.</p> + <p><small>3 posts - 3 participants</small></p> + <p><a href="https://discourse.ros.org/t/ros-news-for-the-week-of-february-19th-2024/36297">Read full topic</a></p> + Fri, 23 Feb 2024 23:13:39 +0000 + + + ROS Discourse General: New Packages for Iron Irwini 2024-02-23 + discourse.ros.org-topic-36283 + https://discourse.ros.org/t/new-packages-for-iron-irwini-2024-02-23/36283 + <p>We’re happy to announce <strong>2</strong> new packages and <strong>75</strong> updates are now available in ROS 2 Iron Irwini <img alt=":iron:" class="emoji emoji-custom" height="20" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/b/3/b3c1340fc185f5e47c7ec55ef5bb1771802de993.png?v=12" title=":iron:" width="20" /> <img alt=":irwini:" class="emoji emoji-custom" height="20" src="https://global.discourse-cdn.com/business7/uploads/ros/original/3X/d/2/d2f3dcbdaff6f33258719fe5b8f692594a9feab0.png?v=12" title=":irwini:" width="20" /> . 
This sync was tagged as <a href="https://github.com/ros/rosdistro/blob/iron/2024-02-23/iron/distribution.yaml" rel="noopener nofollow ugc"><code>iron/2024-02-23</code> </a>.</p> +<h2><a class="anchor" href="https://discourse.ros.org#package-updates-for-iron-1" name="package-updates-for-iron-1"></a>Package Updates for iron</h2> +<h3><a class="anchor" href="https://discourse.ros.org#added-packages-2-2" name="added-packages-2-2"></a>Added Packages [2]:</h3> +<ul> +<li>ros-iron-apriltag-detector: 1.2.0-1</li> +<li>ros-iron-multidim-rrt-planner: 0.0.8-1</li> +</ul> +<h3><a class="anchor" href="https://discourse.ros.org#updated-packages-75-3" name="updated-packages-75-3"></a>Updated Packages [75]:</h3> +<ul> +<li>ros-iron-ackermann-steering-controller: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-ackermann-steering-controller-dbgsym: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-admittance-controller: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-admittance-controller-dbgsym: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-azure-iot-sdk-c: 1.10.1-4 → 1.12.0-1</li> +<li>ros-iron-bicycle-steering-controller: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-bicycle-steering-controller-dbgsym: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-bno055: 0.4.1-4 → 0.5.0-1</li> +<li>ros-iron-diff-drive-controller: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-diff-drive-controller-dbgsym: 3.21.0-1 → 3.22.0-1</li> +<li><a href="https://wiki.ros.org/draco_point_cloud_transport">ros-iron-draco-point-cloud-transport</a>: 2.0.3-1 → 2.0.4-1</li> +<li>ros-iron-draco-point-cloud-transport-dbgsym: 2.0.3-1 → 2.0.4-1</li> +<li>ros-iron-effort-controllers: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-effort-controllers-dbgsym: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-force-torque-sensor-broadcaster: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-force-torque-sensor-broadcaster-dbgsym: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-forward-command-controller: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-forward-command-controller-dbgsym: 3.21.0-1 → 3.22.0-1</li> 
+<li>ros-iron-gripper-controllers: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-gripper-controllers-dbgsym: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-imu-sensor-broadcaster: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-imu-sensor-broadcaster-dbgsym: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-joint-state-broadcaster: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-joint-state-broadcaster-dbgsym: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-joint-trajectory-controller: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-joint-trajectory-controller-dbgsym: 3.21.0-1 → 3.22.0-1</li> +<li><a href="http://wiki.ros.org/leo">ros-iron-leo</a>: 2.0.0-1 → 2.0.1-1</li> +<li><a href="http://wiki.ros.org/leo_description">ros-iron-leo-description</a>: 2.0.0-1 → 2.0.1-1</li> +<li><a href="http://wiki.ros.org/leo">ros-iron-leo-msgs</a>: 2.0.0-1 → 2.0.1-1</li> +<li>ros-iron-leo-msgs-dbgsym: 2.0.0-1 → 2.0.1-1</li> +<li><a href="http://wiki.ros.org/leo_teleop">ros-iron-leo-teleop</a>: 2.0.0-1 → 2.0.1-1</li> +<li><a href="https://github.com/MOLAorg/mp2p_icp" rel="noopener nofollow ugc">ros-iron-mp2p-icp</a>: 1.1.0-1 → 1.2.0-1</li> +<li>ros-iron-mp2p-icp-dbgsym: 1.1.0-1 → 1.2.0-1</li> +<li><a href="https://www.mrpt.org/" rel="noopener nofollow ugc">ros-iron-mrpt2</a>: 2.11.7-1 → 2.11.9-1</li> +<li>ros-iron-mrpt2-dbgsym: 2.11.7-1 → 2.11.9-1</li> +<li><a href="https://github.com/facontidavide/PlotJuggler" rel="noopener nofollow ugc">ros-iron-plotjuggler-ros</a>: 2.1.0-1 → 2.1.1-1</li> +<li>ros-iron-plotjuggler-ros-dbgsym: 2.1.0-1 → 2.1.1-1</li> +<li><a href="https://wiki.ros.org/draco_point_cloud_transport">ros-iron-point-cloud-interfaces</a>: 2.0.3-1 → 2.0.4-1</li> +<li>ros-iron-point-cloud-interfaces-dbgsym: 2.0.3-1 → 2.0.4-1</li> +<li>ros-iron-point-cloud-transport: 2.0.3-1 → 2.0.4-1</li> +<li>ros-iron-point-cloud-transport-dbgsym: 2.0.3-1 → 2.0.4-1</li> +<li><a href="https://wiki.ros.org/point_cloud_transport">ros-iron-point-cloud-transport-plugins</a>: 2.0.3-1 → 2.0.4-1</li> +<li>ros-iron-point-cloud-transport-py: 2.0.3-1 → 
2.0.4-1</li> +<li>ros-iron-position-controllers: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-position-controllers-dbgsym: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-range-sensor-broadcaster: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-range-sensor-broadcaster-dbgsym: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-rcpputils: 2.6.2-1 → 2.6.3-1</li> +<li>ros-iron-rcpputils-dbgsym: 2.6.2-1 → 2.6.3-1</li> +<li><a href="http://robotraconteur.com" rel="noopener nofollow ugc">ros-iron-robotraconteur</a>: 1.0.0-1 → 1.0.0-2</li> +<li>ros-iron-robotraconteur-dbgsym: 1.0.0-1 → 1.0.0-2</li> +<li>ros-iron-ros2-controllers: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-ros2-controllers-test-nodes: 3.21.0-1 → 3.22.0-1</li> +<li><a href="http://ros.org/wiki/rqt">ros-iron-rqt</a>: 1.3.3-1 → 1.3.4-1</li> +<li>ros-iron-rqt-gauges: 0.0.1-1 → 0.0.2-1</li> +<li><a href="http://ros.org/wiki/rqt_gui">ros-iron-rqt-gui</a>: 1.3.3-1 → 1.3.4-1</li> +<li><a href="http://ros.org/wiki/rqt_gui_cpp">ros-iron-rqt-gui-cpp</a>: 1.3.3-1 → 1.3.4-1</li> +<li>ros-iron-rqt-gui-cpp-dbgsym: 1.3.3-1 → 1.3.4-1</li> +<li><a href="http://ros.org/wiki/rqt_gui_py">ros-iron-rqt-gui-py</a>: 1.3.3-1 → 1.3.4-1</li> +<li>ros-iron-rqt-joint-trajectory-controller: 3.21.0-1 → 3.22.0-1</li> +<li><a href="http://ros.org/wiki/rqt_py_common">ros-iron-rqt-py-common</a>: 1.3.3-1 → 1.3.4-1</li> +<li>ros-iron-steering-controllers-library: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-steering-controllers-library-dbgsym: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-tricycle-controller: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-tricycle-controller-dbgsym: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-tricycle-steering-controller: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-tricycle-steering-controller-dbgsym: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-velocity-controllers: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-velocity-controllers-dbgsym: 3.21.0-1 → 3.22.0-1</li> +<li>ros-iron-vrpn-mocap: 1.0.3-3 → 1.1.0-1</li> +<li>ros-iron-vrpn-mocap-dbgsym: 1.0.3-3 → 1.1.0-1</li> +<li><a 
href="https://wiki.ros.org/draco_point_cloud_transport">ros-iron-zlib-point-cloud-transport</a>: 2.0.3-1 → 2.0.4-1</li> +<li>ros-iron-zlib-point-cloud-transport-dbgsym: 2.0.3-1 → 2.0.4-1</li> +<li><a href="https://wiki.ros.org/draco_point_cloud_transport">ros-iron-zstd-point-cloud-transport</a>: 2.0.3-1 → 2.0.4-1</li> +<li>ros-iron-zstd-point-cloud-transport-dbgsym: 2.0.3-1 → 2.0.4-1</li> +</ul> +<h3><a class="anchor" href="https://discourse.ros.org#removed-packages-0-4" name="removed-packages-0-4"></a>Removed Packages [0]:</h3> +<p>Thanks to all ROS maintainers who make packages available to the ROS community. The above list of packages was made possible by the work of the following maintainers:</p> +<ul> +<li>Alejandro Hernandez Cordero</li> +<li>Alejandro Hernández</li> +<li>Alvin Sun</li> +<li>Bence Magyar</li> +<li>Bernd Pfrommer</li> +<li>Brandon Ong</li> +<li>Davide Faconti</li> +<li>Denis Štogl</li> +<li>Dharini Dutia</li> +<li>Eloy Bricneo</li> +<li>Fictionlab</li> +<li>John Wason</li> +<li>Jose-Luis Blanco-Claraco</li> +<li>Martin Pecka</li> +<li>Tim Clephas</li> +<li>david</li> +<li>flynneva</li> +</ul> + <p><small>1 post - 1 participant</small></p> + <p><a href="https://discourse.ros.org/t/new-packages-for-iron-irwini-2024-02-23/36283">Read full topic</a></p> + Fri, 23 Feb 2024 10:35:02 +0000 + + + ROS Discourse General: New packages and patch release for Humble Hawksbill 2024-02-22 + discourse.ros.org-topic-36275 + https://discourse.ros.org/t/new-packages-and-patch-release-for-humble-hawksbill-2024-02-22/36275 + <p>We’re happy to announce a <a href="https://github.com/ros2/ros2/releases/tag/release-humble-20240222" rel="noopener nofollow ugc">new Humble release</a>!</p> +<p>This sync brings several new packages and some updates to ROS 2 core packages. 
(I’m not including the project board because it was empty.)</p> +<hr /> +<h2><a class="anchor" href="https://discourse.ros.org#package-updates-for-humble-1" name="package-updates-for-humble-1"></a>Package Updates for Humble</h2> +<h3><a class="anchor" href="https://discourse.ros.org#added-packages-42-2" name="added-packages-42-2"></a>Added Packages [42]:</h3> +<ul> +<li>ros-humble-apriltag-detector: 1.1.0-1</li> +<li>ros-humble-as2-gazebo-assets: 1.0.8-1</li> +<li>ros-humble-as2-gazebo-assets-dbgsym: 1.0.8-1</li> +<li>ros-humble-as2-platform-dji-osdk: 1.0.8-1</li> +<li>ros-humble-as2-platform-dji-osdk-dbgsym: 1.0.8-1</li> +<li>ros-humble-as2-platform-gazebo: 1.0.8-1</li> +<li>ros-humble-as2-platform-gazebo-dbgsym: 1.0.8-1</li> +<li>ros-humble-caret-analyze: 0.5.0-1</li> +<li>ros-humble-caret-msgs: 0.5.0-6</li> +<li>ros-humble-caret-msgs-dbgsym: 0.5.0-6</li> +<li>ros-humble-data-tamer-cpp: 0.9.3-2</li> +<li>ros-humble-data-tamer-cpp-dbgsym: 0.9.3-2</li> +<li>ros-humble-data-tamer-msgs: 0.9.3-2</li> +<li>ros-humble-data-tamer-msgs-dbgsym: 0.9.3-2</li> +<li>ros-humble-hardware-interface-testing: 2.39.1-1</li> +<li>ros-humble-hardware-interface-testing-dbgsym: 2.39.1-1</li> +<li>ros-humble-hri-msgs: 2.0.0-1</li> +<li>ros-humble-hri-msgs-dbgsym: 2.0.0-1</li> +<li>ros-humble-mocap4r2-dummy-driver: 0.0.7-1</li> +<li>ros-humble-mocap4r2-dummy-driver-dbgsym: 0.0.7-1</li> +<li>ros-humble-mocap4r2-marker-viz: 0.0.7-1</li> +<li>ros-humble-mocap4r2-marker-viz-dbgsym: 0.0.7-1</li> +<li>ros-humble-mocap4r2-marker-viz-srvs: 0.0.7-1</li> +<li>ros-humble-mocap4r2-marker-viz-srvs-dbgsym: 0.0.7-1</li> +<li>ros-humble-motion-capture-tracking: 1.0.3-1</li> +<li>ros-humble-motion-capture-tracking-dbgsym: 1.0.3-1</li> +<li>ros-humble-motion-capture-tracking-interfaces: 1.0.3-1</li> +<li>ros-humble-motion-capture-tracking-interfaces-dbgsym: 1.0.3-1</li> +<li>ros-humble-psdk-interfaces: 1.0.0-1</li> +<li>ros-humble-psdk-interfaces-dbgsym: 1.0.0-1</li> +<li>ros-humble-psdk-wrapper: 
1.0.0-1</li> +<li>ros-humble-psdk-wrapper-dbgsym: 1.0.0-1</li> +<li>ros-humble-qb-softhand-industry-description: 2.1.2-4</li> +<li>ros-humble-qb-softhand-industry-msgs: 2.1.2-4</li> +<li>ros-humble-qb-softhand-industry-msgs-dbgsym: 2.1.2-4</li> +<li>ros-humble-qb-softhand-industry-ros2-control: 2.1.2-4</li> +<li>ros-humble-qb-softhand-industry-ros2-control-dbgsym: 2.1.2-4</li> +<li>ros-humble-qb-softhand-industry-srvs: 2.1.2-4</li> +<li>ros-humble-qb-softhand-industry-srvs-dbgsym: 2.1.2-4</li> +<li>ros-humble-ros2caret: 0.5.0-2</li> +<li>ros-humble-sync-parameter-server: 1.0.1-2</li> +<li>ros-humble-sync-parameter-server-dbgsym: 1.0.1-2</li> +</ul> +<h3><a class="anchor" href="https://discourse.ros.org#updated-packages-280-3" name="updated-packages-280-3"></a>Updated Packages [280]:</h3> +<ul> +<li>ros-humble-ament-cmake: 1.3.7-1 → 1.3.8-1</li> +<li>ros-humble-ament-cmake-auto: 1.3.7-1 → 1.3.8-1</li> +<li>ros-humble-ament-cmake-core: 1.3.7-1 → 1.3.8-1</li> +<li>ros-humble-ament-cmake-export-definitions: 1.3.7-1 → 1.3.8-1</li> +<li>ros-humble-ament-cmake-export-dependencies: 1.3.7-1 → 1.3.8-1</li> +<li>ros-humble-ament-cmake-export-include-directories: 1.3.7-1 → 1.3.8-1</li> +<li>ros-humble-ament-cmake-export-interfaces: 1.3.7-1 → 1.3.8-1</li> +<li>ros-humble-ament-cmake-export-libraries: 1.3.7-1 → 1.3.8-1</li> +<li>ros-humble-ament-cmake-export-link-flags: 1.3.7-1 → 1.3.8-1</li> +<li>ros-humble-ament-cmake-export-targets: 1.3.7-1 → 1.3.8-1</li> +<li>ros-humble-ament-cmake-gen-version-h: 1.3.7-1 → 1.3.8-1</li> +<li>ros-humble-ament-cmake-gmock: 1.3.7-1 → 1.3.8-1</li> +<li>ros-humble-ament-cmake-google-benchmark: 1.3.7-1 → 1.3.8-1</li> +<li>ros-humble-ament-cmake-gtest: 1.3.7-1 → 1.3.8-1</li> +<li>ros-humble-ament-cmake-include-directories: 1.3.7-1 → 1.3.8-1</li> +<li>ros-humble-ament-cmake-libraries: 1.3.7-1 → 1.3.8-1</li> +<li>ros-humble-ament-cmake-nose: 1.3.7-1 → 1.3.8-1</li> +<li>ros-humble-ament-cmake-pytest: 1.3.7-1 → 1.3.8-1</li> 
+<li>ros-humble-ament-cmake-python: 1.3.7-1 → 1.3.8-1</li> +<li>ros-humble-ament-cmake-target-dependencies: 1.3.7-1 → 1.3.8-1</li> +<li>ros-humble-ament-cmake-test: 1.3.7-1 → 1.3.8-1</li> +<li>ros-humble-ament-cmake-vendor-package: 1.3.7-1 → 1.3.8-1</li> +<li>ros-humble-ament-cmake-version: 1.3.7-1 → 1.3.8-1</li> +<li>ros-humble-as2-alphanumeric-viewer: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-alphanumeric-viewer-dbgsym: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-behavior: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-behavior-dbgsym: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-behavior-tree: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-behavior-tree-dbgsym: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-behaviors-motion: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-behaviors-motion-dbgsym: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-behaviors-perception: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-behaviors-perception-dbgsym: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-behaviors-platform: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-behaviors-platform-dbgsym: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-behaviors-trajectory-generation: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-behaviors-trajectory-generation-dbgsym: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-cli: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-core: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-keyboard-teleoperation: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-motion-controller: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-motion-controller-dbgsym: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-motion-reference-handlers: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-msgs: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-msgs-dbgsym: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-platform-crazyflie: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-platform-crazyflie-dbgsym: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-platform-tello: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-platform-tello-dbgsym: 1.0.6-1 → 1.0.8-1</li> 
+<li>ros-humble-as2-python-api: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-realsense-interface: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-realsense-interface-dbgsym: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-state-estimator: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-state-estimator-dbgsym: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-usb-camera-interface: 1.0.6-1 → 1.0.8-1</li> +<li>ros-humble-as2-usb-camera-interface-dbgsym: 1.0.6-1 → 1.0.8-1</li> +<li><a href="https://index.ros.org/p/camera_calibration/github-ros-perception-image_pipeline/">ros-humble-camera-calibration</a>: 3.0.0-1 → 3.0.3-1</li> +<li>ros-humble-controller-interface: 2.37.0-1 → 2.39.1-1</li> +<li>ros-humble-controller-interface-dbgsym: 2.37.0-1 → 2.39.1-1</li> +<li>ros-humble-controller-manager: 2.37.0-1 → 2.39.1-1</li> +<li>ros-humble-controller-manager-dbgsym: 2.37.0-1 → 2.39.1-1</li> +<li><a href="http://ros.org/wiki/controller_manager_msgs">ros-humble-controller-manager-msgs</a>: 2.37.0-1 → 2.39.1-1</li> +<li>ros-humble-controller-manager-msgs-dbgsym: 2.37.0-1 → 2.39.1-1</li> +<li>ros-humble-costmap-queue: 1.1.12-1 → 1.1.13-1</li> +<li><a href="https://index.ros.org/p/depth_image_proc/github-ros-perception-image_pipeline/">ros-humble-depth-image-proc</a>: 3.0.0-1 → 3.0.3-1</li> +<li>ros-humble-depth-image-proc-dbgsym: 3.0.0-1 → 3.0.3-1</li> +<li>ros-humble-depthai-bridge: 2.8.2-1 → 2.9.0-1</li> +<li>ros-humble-depthai-bridge-dbgsym: 2.8.2-1 → 2.9.0-1</li> +<li>ros-humble-depthai-descriptions: 2.8.2-1 → 2.9.0-1</li> +<li>ros-humble-depthai-examples: 2.8.2-1 → 2.9.0-1</li> +<li>ros-humble-depthai-examples-dbgsym: 2.8.2-1 → 2.9.0-1</li> +<li>ros-humble-depthai-filters: 2.8.2-1 → 2.9.0-1</li> +<li>ros-humble-depthai-filters-dbgsym: 2.8.2-1 → 2.9.0-1</li> +<li>ros-humble-depthai-ros: 2.8.2-1 → 2.9.0-1</li> +<li>ros-humble-depthai-ros-driver: 2.8.2-1 → 2.9.0-1</li> +<li>ros-humble-depthai-ros-driver-dbgsym: 2.8.2-1 → 2.9.0-1</li> +<li>ros-humble-depthai-ros-msgs: 2.8.2-1 → 2.9.0-1</li> 
+<li>ros-humble-depthai-ros-msgs-dbgsym: 2.8.2-1 → 2.9.0-1</li> +<li>ros-humble-dwb-core: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-dwb-core-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-dwb-critics: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-dwb-critics-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-dwb-msgs: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-dwb-msgs-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-dwb-plugins: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-dwb-plugins-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-event-camera-codecs: 1.1.2-1 → 1.1.3-1</li> +<li>ros-humble-event-camera-codecs-dbgsym: 1.1.2-1 → 1.1.3-1</li> +<li>ros-humble-event-camera-py: 1.1.3-1 → 1.1.4-1</li> +<li>ros-humble-event-camera-renderer: 1.1.2-1 → 1.1.3-1</li> +<li>ros-humble-event-camera-renderer-dbgsym: 1.1.2-1 → 1.1.3-1</li> +<li>ros-humble-examples-tf2-py: 0.25.5-1 → 0.25.6-1</li> +<li><a href="http://www.ros.org/wiki/geometry2">ros-humble-geometry2</a>: 0.25.5-1 → 0.25.6-1</li> +<li>ros-humble-hardware-interface: 2.37.0-1 → 2.39.1-1</li> +<li>ros-humble-hardware-interface-dbgsym: 2.37.0-1 → 2.39.1-1</li> +<li><a href="https://index.ros.org/p/image_pipeline/github-ros-perception-image_pipeline/">ros-humble-image-pipeline</a>: 3.0.0-1 → 3.0.3-1</li> +<li><a href="https://index.ros.org/p/image_proc/github-ros-perception-image_pipeline/">ros-humble-image-proc</a>: 3.0.0-1 → 3.0.3-1</li> +<li>ros-humble-image-proc-dbgsym: 3.0.0-1 → 3.0.3-1</li> +<li><a href="https://index.ros.org/p/image_publisher/github-ros-perception-image_pipeline/">ros-humble-image-publisher</a>: 3.0.0-1 → 3.0.3-1</li> +<li>ros-humble-image-publisher-dbgsym: 3.0.0-1 → 3.0.3-1</li> +<li><a href="https://index.ros.org/p/image_rotate/github-ros-perception-image_pipeline/">ros-humble-image-rotate</a>: 3.0.0-1 → 3.0.3-1</li> +<li>ros-humble-image-rotate-dbgsym: 3.0.0-1 → 3.0.3-1</li> +<li><a href="https://index.ros.org/p/image_view/github-ros-perception-image_pipeline/">ros-humble-image-view</a>: 3.0.0-1 → 
3.0.3-1</li> +<li>ros-humble-image-view-dbgsym: 3.0.0-1 → 3.0.3-1</li> +<li><a href="https://github.com/ros-controls/ros2_control/wiki" rel="noopener nofollow ugc">ros-humble-joint-limits</a>: 2.37.0-1 → 2.39.1-1</li> +<li>ros-humble-joint-limits-dbgsym: 2.37.0-1 → 2.39.1-1</li> +<li>ros-humble-launch: 1.0.4-1 → 1.0.5-1</li> +<li>ros-humble-launch-pytest: 1.0.4-1 → 1.0.5-1</li> +<li>ros-humble-launch-testing: 1.0.4-1 → 1.0.5-1</li> +<li>ros-humble-launch-testing-ament-cmake: 1.0.4-1 → 1.0.5-1</li> +<li>ros-humble-launch-xml: 1.0.4-1 → 1.0.5-1</li> +<li>ros-humble-launch-yaml: 1.0.4-1 → 1.0.5-1</li> +<li><a href="http://wiki.ros.org/leo">ros-humble-leo</a>: 1.2.0-1 → 1.2.1-1</li> +<li><a href="http://wiki.ros.org/leo_description">ros-humble-leo-description</a>: 1.2.0-1 → 1.2.1-1</li> +<li><a href="http://wiki.ros.org/leo">ros-humble-leo-msgs</a>: 1.2.0-1 → 1.2.1-1</li> +<li>ros-humble-leo-msgs-dbgsym: 1.2.0-1 → 1.2.1-1</li> +<li><a href="http://wiki.ros.org/leo_teleop">ros-humble-leo-teleop</a>: 1.2.0-1 → 1.2.1-1</li> +<li><a href="https://github.com/LORD-MicroStrain/microstrain_inertial" rel="noopener nofollow ugc">ros-humble-microstrain-inertial-driver</a>: 3.2.0-2 → 3.2.1-1</li> +<li>ros-humble-microstrain-inertial-driver-dbgsym: 3.2.0-2 → 3.2.1-1</li> +<li><a href="https://github.com/LORD-MicroStrain/microstrain_inertial" rel="noopener nofollow ugc">ros-humble-microstrain-inertial-examples</a>: 3.2.0-2 → 3.2.1-1</li> +<li>ros-humble-microstrain-inertial-examples-dbgsym: 3.2.0-2 → 3.2.1-1</li> +<li><a href="https://github.com/LORD-MicroStrain/microstrain_inertial" rel="noopener nofollow ugc">ros-humble-microstrain-inertial-msgs</a>: 3.2.0-2 → 3.2.1-1</li> +<li>ros-humble-microstrain-inertial-msgs-dbgsym: 3.2.0-2 → 3.2.1-1</li> +<li><a href="https://github.com/LORD-MicroStrain/microstrain_inertial" rel="noopener nofollow ugc">ros-humble-microstrain-inertial-rqt</a>: 3.2.0-2 → 3.2.1-1</li> +<li>ros-humble-mocap4r2-control: 0.0.6-1 → 0.0.7-1</li> 
+<li>ros-humble-mocap4r2-control-dbgsym: 0.0.6-1 → 0.0.7-1</li> +<li>ros-humble-mocap4r2-control-msgs: 0.0.6-1 → 0.0.7-1</li> +<li>ros-humble-mocap4r2-control-msgs-dbgsym: 0.0.6-1 → 0.0.7-1</li> +<li>ros-humble-mocap4r2-marker-publisher: 0.0.6-1 → 0.0.7-1</li> +<li>ros-humble-mocap4r2-marker-publisher-dbgsym: 0.0.6-1 → 0.0.7-1</li> +<li>ros-humble-mocap4r2-robot-gt: 0.0.6-1 → 0.0.7-1</li> +<li>ros-humble-mocap4r2-robot-gt-dbgsym: 0.0.6-1 → 0.0.7-1</li> +<li>ros-humble-mocap4r2-robot-gt-msgs: 0.0.6-1 → 0.0.7-1</li> +<li>ros-humble-mocap4r2-robot-gt-msgs-dbgsym: 0.0.6-1 → 0.0.7-1</li> +<li><a href="https://github.com/MOLAorg/mp2p_icp" rel="noopener nofollow ugc">ros-humble-mp2p-icp</a>: 1.0.0-1 → 1.2.0-1</li> +<li>ros-humble-mp2p-icp-dbgsym: 1.0.0-1 → 1.2.0-1</li> +<li><a href="https://www.mrpt.org/" rel="noopener nofollow ugc">ros-humble-mrpt2</a>: 2.11.6-1 → 2.11.9-1</li> +<li>ros-humble-mrpt2-dbgsym: 2.11.6-1 → 2.11.9-1</li> +<li>ros-humble-nav-2d-msgs: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav-2d-msgs-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav-2d-utils: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav-2d-utils-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-amcl: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-amcl-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-behavior-tree: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-behavior-tree-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-behaviors: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-behaviors-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-bringup: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-bt-navigator: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-bt-navigator-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-collision-monitor: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-collision-monitor-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-common: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-constrained-smoother: 1.1.12-1 → 1.1.13-1</li> 
+<li>ros-humble-nav2-constrained-smoother-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-controller: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-controller-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-core: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-costmap-2d: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-costmap-2d-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-dwb-controller: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-lifecycle-manager: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-lifecycle-manager-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-map-server: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-map-server-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-mppi-controller: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-mppi-controller-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-msgs: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-msgs-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-navfn-planner: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-navfn-planner-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-planner: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-planner-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-regulated-pure-pursuit-controller: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-regulated-pure-pursuit-controller-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-rotation-shim-controller: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-rotation-shim-controller-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-rviz-plugins: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-rviz-plugins-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-simple-commander: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-smac-planner: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-smac-planner-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-smoother: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-smoother-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-theta-star-planner: 1.1.12-1 → 
1.1.13-1</li> +<li>ros-humble-nav2-theta-star-planner-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-util: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-util-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-velocity-smoother: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-velocity-smoother-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-voxel-grid: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-voxel-grid-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-waypoint-follower: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-nav2-waypoint-follower-dbgsym: 1.1.12-1 → 1.1.13-1</li> +<li>ros-humble-navigation2: 1.1.12-1 → 1.1.13-1</li> +<li><a href="http://wiki.ros.org/pcl_conversions">ros-humble-pcl-conversions</a>: 2.4.0-4 → 2.4.0-6</li> +<li><a href="http://ros.org/wiki/perception_pcl">ros-humble-pcl-ros</a>: 2.4.0-4 → 2.4.0-6</li> +<li><a href="http://ros.org/wiki/perception_pcl">ros-humble-perception-pcl</a>: 2.4.0-4 → 2.4.0-6</li> +<li><a href="https://github.com/facontidavide/PlotJuggler" rel="noopener nofollow ugc">ros-humble-plotjuggler</a>: 3.8.8-3 → 3.9.0-1</li> +<li>ros-humble-plotjuggler-dbgsym: 3.8.8-3 → 3.9.0-1</li> +<li><a href="https://github.com/facontidavide/PlotJuggler" rel="noopener nofollow ugc">ros-humble-plotjuggler-ros</a>: 2.0.0-3 → 2.1.0-1</li> +<li>ros-humble-plotjuggler-ros-dbgsym: 2.0.0-3 → 2.1.0-1</li> +<li>ros-humble-rclpy: 3.3.11-1 → 3.3.12-1</li> +<li>ros-humble-rcpputils: 2.4.1-1 → 2.4.2-1</li> +<li>ros-humble-rcpputils-dbgsym: 2.4.1-1 → 2.4.2-1</li> +<li>ros-humble-rcutils: 5.1.4-1 → 5.1.5-1</li> +<li>ros-humble-rcutils-dbgsym: 5.1.4-1 → 5.1.5-1</li> +<li><a href="http://robotraconteur.com" rel="noopener nofollow ugc">ros-humble-robotraconteur</a>: 1.0.0-1 → 1.0.0-2</li> +<li>ros-humble-robotraconteur-dbgsym: 1.0.0-1 → 1.0.0-2</li> +<li>ros-humble-ros2-control: 2.37.0-1 → 2.39.1-1</li> +<li>ros-humble-ros2-control-test-assets: 2.37.0-1 → 2.39.1-1</li> +<li>ros-humble-ros2action: 0.18.8-1 → 0.18.9-1</li> 
+<li>ros-humble-ros2cli: 0.18.8-1 → 0.18.9-1</li> +<li>ros-humble-ros2cli-test-interfaces: 0.18.8-1 → 0.18.9-1</li> +<li>ros-humble-ros2cli-test-interfaces-dbgsym: 0.18.8-1 → 0.18.9-1</li> +<li>ros-humble-ros2component: 0.18.8-1 → 0.18.9-1</li> +<li>ros-humble-ros2controlcli: 2.37.0-1 → 2.39.1-1</li> +<li>ros-humble-ros2doctor: 0.18.8-1 → 0.18.9-1</li> +<li>ros-humble-ros2interface: 0.18.8-1 → 0.18.9-1</li> +<li>ros-humble-ros2lifecycle: 0.18.8-1 → 0.18.9-1</li> +<li>ros-humble-ros2lifecycle-test-fixtures: 0.18.8-1 → 0.18.9-1</li> +<li>ros-humble-ros2lifecycle-test-fixtures-dbgsym: 0.18.8-1 → 0.18.9-1</li> +<li>ros-humble-ros2multicast: 0.18.8-1 → 0.18.9-1</li> +<li>ros-humble-ros2node: 0.18.8-1 → 0.18.9-1</li> +<li>ros-humble-ros2param: 0.18.8-1 → 0.18.9-1</li> +<li>ros-humble-ros2pkg: 0.18.8-1 → 0.18.9-1</li> +<li>ros-humble-ros2run: 0.18.8-1 → 0.18.9-1</li> +<li>ros-humble-ros2service: 0.18.8-1 → 0.18.9-1</li> +<li>ros-humble-ros2topic: 0.18.8-1 → 0.18.9-1</li> +<li><a href="http://ros.org/wiki/rqt">ros-humble-rqt</a>: 1.1.6-2 → 1.1.7-1</li> +<li><a href="http://wiki.ros.org/rqt_console">ros-humble-rqt-console</a>: 2.0.2-3 → 2.0.3-1</li> +<li><a href="http://ros.org/wiki/rqt_controller_manager">ros-humble-rqt-controller-manager</a>: 2.37.0-1 → 2.39.1-1</li> +<li><a href="http://ros.org/wiki/rqt_gui">ros-humble-rqt-gui</a>: 1.1.6-2 → 1.1.7-1</li> +<li><a href="http://ros.org/wiki/rqt_gui_cpp">ros-humble-rqt-gui-cpp</a>: 1.1.6-2 → 1.1.7-1</li> +<li>ros-humble-rqt-gui-cpp-dbgsym: 1.1.6-2 → 1.1.7-1</li> +<li><a href="http://ros.org/wiki/rqt_gui_py">ros-humble-rqt-gui-py</a>: 1.1.6-2 → 1.1.7-1</li> +<li>ros-humble-rqt-mocap4r2-control: 0.0.6-1 → 0.0.7-1</li> +<li>ros-humble-rqt-mocap4r2-control-dbgsym: 0.0.6-1 → 0.0.7-1</li> +<li><a href="http://ros.org/wiki/rqt_py_common">ros-humble-rqt-py-common</a>: 1.1.6-2 → 1.1.7-1</li> +<li><a href="http://assimp.sourceforge.net/index.html" rel="noopener nofollow ugc">ros-humble-rviz-assimp-vendor</a>: 11.2.10-1 → 
11.2.11-1</li> +<li><a href="https://github.com/ros2/rviz/blob/ros2/README.md" rel="noopener nofollow ugc">ros-humble-rviz-common</a>: 11.2.10-1 → 11.2.11-1</li> +<li>ros-humble-rviz-common-dbgsym: 11.2.10-1 → 11.2.11-1</li> +<li><a href="https://github.com/ros2/rviz/blob/ros2/README.md" rel="noopener nofollow ugc">ros-humble-rviz-default-plugins</a>: 11.2.10-1 → 11.2.11-1</li> +<li>ros-humble-rviz-default-plugins-dbgsym: 11.2.10-1 → 11.2.11-1</li> +<li><a href="https://www.ogre3d.org/" rel="noopener nofollow ugc">ros-humble-rviz-ogre-vendor</a>: 11.2.10-1 → 11.2.11-1</li> +<li>ros-humble-rviz-ogre-vendor-dbgsym: 11.2.10-1 → 11.2.11-1</li> +<li><a href="https://github.com/ros2/rviz/blob/ros2/README.md" rel="noopener nofollow ugc">ros-humble-rviz-rendering</a>: 11.2.10-1 → 11.2.11-1</li> +<li>ros-humble-rviz-rendering-dbgsym: 11.2.10-1 → 11.2.11-1</li> +<li>ros-humble-rviz-rendering-tests: 11.2.10-1 → 11.2.11-1</li> +<li><a href="http://ros.org/wiki/rviz2">ros-humble-rviz-visual-testing-framework</a>: 11.2.10-1 → 11.2.11-1</li> +<li><a href="https://github.com/ros2/rviz/blob/ros2/README.md" rel="noopener nofollow ugc">ros-humble-rviz2</a>: 11.2.10-1 → 11.2.11-1</li> +<li>ros-humble-rviz2-dbgsym: 11.2.10-1 → 11.2.11-1</li> +<li>ros-humble-sick-scan-xd: 3.1.11-1 → 3.1.11-3</li> +<li>ros-humble-sick-scan-xd-dbgsym: 3.1.11-1 → 3.1.11-3</li> +<li><a href="https://index.ros.org/p/stereo_image_proc/github-ros-perception-image_pipeline/">ros-humble-stereo-image-proc</a>: 3.0.0-1 → 3.0.3-1</li> +<li>ros-humble-stereo-image-proc-dbgsym: 3.0.0-1 → 3.0.3-1</li> +<li><a href="http://www.ros.org/wiki/tf2">ros-humble-tf2</a>: 0.25.5-1 → 0.25.6-1</li> +<li><a href="http://www.ros.org/wiki/tf2_bullet">ros-humble-tf2-bullet</a>: 0.25.5-1 → 0.25.6-1</li> +<li>ros-humble-tf2-dbgsym: 0.25.5-1 → 0.25.6-1</li> +<li>ros-humble-tf2-eigen: 0.25.5-1 → 0.25.6-1</li> +<li>ros-humble-tf2-eigen-kdl: 0.25.5-1 → 0.25.6-1</li> +<li>ros-humble-tf2-eigen-kdl-dbgsym: 0.25.5-1 → 0.25.6-1</li> +<li><a 
href="http://www.ros.org/wiki/tf2_ros">ros-humble-tf2-geometry-msgs</a>: 0.25.5-1 → 0.25.6-1</li> +<li><a href="http://ros.org/wiki/tf2">ros-humble-tf2-kdl</a>: 0.25.5-1 → 0.25.6-1</li> +<li><a href="http://www.ros.org/wiki/tf2_msgs">ros-humble-tf2-msgs</a>: 0.25.5-1 → 0.25.6-1</li> +<li>ros-humble-tf2-msgs-dbgsym: 0.25.5-1 → 0.25.6-1</li> +<li><a href="http://ros.org/wiki/tf2_py">ros-humble-tf2-py</a>: 0.25.5-1 → 0.25.6-1</li> +<li>ros-humble-tf2-py-dbgsym: 0.25.5-1 → 0.25.6-1</li> +<li><a href="http://www.ros.org/wiki/tf2_ros">ros-humble-tf2-ros</a>: 0.25.5-1 → 0.25.6-1</li> +<li>ros-humble-tf2-ros-dbgsym: 0.25.5-1 → 0.25.6-1</li> +<li><a href="http://www.ros.org/wiki/tf2_ros">ros-humble-tf2-ros-py</a>: 0.25.5-1 → 0.25.6-1</li> +<li><a href="http://www.ros.org/wiki/tf2_ros">ros-humble-tf2-sensor-msgs</a>: 0.25.5-1 → 0.25.6-1</li> +<li><a href="http://www.ros.org/wiki/tf2_tools">ros-humble-tf2-tools</a>: 0.25.5-1 → 0.25.6-1</li> +<li>ros-humble-tracetools-image-pipeline: 3.0.0-1 → 3.0.3-1</li> +<li>ros-humble-tracetools-image-pipeline-dbgsym: 3.0.0-1 → 3.0.3-1</li> +<li>ros-humble-transmission-interface: 2.37.0-1 → 2.39.1-1</li> +<li>ros-humble-transmission-interface-dbgsym: 2.37.0-1 → 2.39.1-1</li> +<li>ros-humble-vrpn-mocap: 1.0.4-1 → 1.1.0-1</li> +<li>ros-humble-vrpn-mocap-dbgsym: 1.0.4-1 → 1.1.0-1</li> +</ul> +<h3><a class="anchor" href="https://discourse.ros.org#removed-packages-4-4" name="removed-packages-4-4"></a>Removed Packages [4]:</h3> +<ul> +<li>ros-humble-as2-ign-gazebo-assets</li> +<li>ros-humble-as2-ign-gazebo-assets-dbgsym</li> +<li>ros-humble-as2-platform-ign-gazebo</li> +<li>ros-humble-as2-platform-ign-gazebo-dbgsym</li> +</ul> +<p>Thanks to all ROS maintainers who make packages available to the ROS community. 
The above list of packages was made possible by the work of the following maintainers:</p> +<ul> +<li>Adam Serafin</li> +<li>Aditya Pande</li> +<li>Alexey Merzlyakov</li> +<li>Alvin Sun</li> +<li>Bence Magyar</li> +<li>Bernd Pfrommer</li> +<li>Bianca Bendris</li> +<li>Brian Wilcox</li> +<li>CVAR-UPM</li> +<li>Carl Delsey</li> +<li>Carlos Orduno</li> +<li>Chris Lalancette</li> +<li>David V. Lu!!</li> +<li>Davide Faconti</li> +<li>Dirk Thomas</li> +<li>Dorian Scholz</li> +<li>Fictionlab</li> +<li>Francisco Martín</li> +<li>Francisco Martín Rico</li> +<li>Jacob Perron</li> +<li>John Wason</li> +<li>Jose-Luis Blanco-Claraco</li> +<li>Matej Vargovcik</li> +<li>Michael Jeronimo</li> +<li>Mohammad Haghighipanah</li> +<li>Paul Bovbel</li> +<li>Rob Fisher</li> +<li>Sachin Guruswamy</li> +<li>Shane Loretz</li> +<li>Steve Macenski</li> +<li>Support Team</li> +<li>Séverin Lemaignan</li> +<li>Tatsuro Sakaguchi</li> +<li>Vincent Rabaud</li> +<li>Víctor Mayoral-Vilches</li> +<li>Wolfgang Hönig</li> +<li>fmrico</li> +<li>rostest</li> +<li>sachin</li> +<li>steve</li> +<li>ymski</li> +</ul> + <p><small>5 posts - 4 participants</small></p> + <p><a href="https://discourse.ros.org/t/new-packages-and-patch-release-for-humble-hawksbill-2024-02-22/36275">Read full topic</a></p> + Fri, 23 Feb 2024 05:00:30 +0000 + + + ROS Discourse General: GTC March 18-21 highlights for ROS & AI robotics + discourse.ros.org-topic-36274 + https://discourse.ros.org/t/gtc-march-18-21-highlights-for-ros-ai-robotics/36274 + <p><strong><a href="https://www.nvidia.com/gtc/" rel="noopener nofollow ugc">NVIDIA GTC</a></strong> is happening live on March 18–21, with <strong><a href="https://www.nvidia.com/gtc/sessions/robotics/" rel="noopener nofollow ugc">registration open</a></strong> for the event in San Jose, CA.</p> +<p>There are multiple inspiring <a href="https://www.nvidia.com/gtc/sessions/robotics/" rel="noopener nofollow ugc">robotics</a> sessions following the kickoff with <strong>CEO Jensen 
Huang’s</strong> must-see keynote at the SAP Center, which will share the latest breakthroughs affecting every industry.</p> +<p>Highlighted robotics sessions, hands-on labs, and developer sessions include:</p> +<p><strong>Hands-on Training Labs</strong></p> +<ul> +<li><a href="https://www.nvidia.com/gtc/session-catalog/?tab.allsessions=1700692987788001F1cG&amp;search=DLIT61534#/" rel="noopener nofollow ugc">DLIT61534</a> Elevate Your Robotics Game: Unleash High Performance with Isaac ROS &amp; Isaac Sim</li> +<li><a href="https://www.nvidia.com/gtc/session-catalog/?tab.allsessions=1700692987788001F1cG&amp;search=DLIT61899#/" rel="noopener nofollow ugc">DLIT61899</a> Simulating Custom Robots: A Hands-On Lab Using Isaac Sim and ROS 2</li> +<li><a href="https://www.nvidia.com/gtc/session-catalog/?search=DLIT61523" rel="noopener nofollow ugc">DLIT61523</a> Unlocking Local LLM Inference with Jetson AGX Orin: A Hands-On Lab</li> +<li><a href="https://www.nvidia.com/gtc/session-catalog/?tab.allsessions=1700692987788001F1cG&amp;search=DLIT61797#/" rel="noopener nofollow ugc">DLIT61797</a> Training an Autonomous Mobile Race Car with Open USD and Isaac Sim</li> +</ul> +<p><strong><a href="https://www.nvidia.com/gtc/sessions/jetson-and-robotics-developer-day/" rel="noopener nofollow ugc">Jetson and Robotics Developer Day</a></strong></p> +<ul> +<li><a href="https://www.nvidia.com/gtc/session-catalog/?search=SE62934" rel="noopener nofollow ugc">SE62934</a> Introduction to AI-Based Robot Development With Isaac ROS</li> +<li><a href="https://www.nvidia.com/gtc/session-catalog/?search=SE62675&amp;tab.allsessions=1700692987788001F1cG#/" rel="noopener nofollow ugc">SE62675</a> Meet Jetson: The Platform for Edge AI and Robotics</li> +<li><a href="https://www.nvidia.com/gtc/session-catalog/?search=SE62933" rel="noopener nofollow ugc">SE62933</a> Overview of Jetson Software and Developer Tools</li> +</ul> +<p><strong>Robotics-Focused Sessions</strong></p> +<ul> +<li><a
href="https://www.nvidia.com/gtc/session-catalog/?tab.allsessions=1700692987788001F1cG&amp;search=s63374#/" rel="noopener nofollow ugc">S63374</a> <em>(Disney Research)</em> Breathing Life into Disney’s Robotic Characters with Deep Reinforcement Learning</li> +<li><a href="https://www.nvidia.com/gtc/session-catalog/?tab.allsessions=1700692987788001F1cG&amp;search=S62602#/" rel="noopener nofollow ugc">S62602</a> <em>(Boston Dynamics)</em> Come See an Unlocked Ecosystem in the Robotics World</li> +<li><a href="https://www.nvidia.com/gtc/session-catalog/?tab.allsessions=1700692987788001F1cG&amp;search=S62315#/" rel="noopener nofollow ugc">S62315</a> <em>(The AI Institute)</em> Robotics and the Role of AI: Past, Present, and Future</li> +<li><a href="https://www.nvidia.com/gtc/session-catalog/?tab.allsessions=1700692987788001F1cG&amp;search=S61182#/" rel="noopener nofollow ugc">S61182</a> <em>(Google DeepMind)</em> Robotics in the Age of Generative AI</li> +<li><a href="https://www.nvidia.com/gtc/session-catalog/?tab.allsessions=1700692987788001F1cG&amp;search=S63034#/" rel="noopener nofollow ugc">S63034</a> Panel Discussion on the Impact of Generative AI on Robotics</li> +</ul> +<p>This is a great opportunity to connect, learn, and share with industry luminaries, robotics companies, NVIDIA experts, and peers face-to-face.</p> +<p>Thanks.</p> + <p><small>1 post - 1 participant</small></p> + <p><a href="https://discourse.ros.org/t/gtc-march-18-21-highlights-for-ros-ai-robotics/36274">Read full topic</a></p> + Fri, 23 Feb 2024 04:43:51 +0000 + + + +