R1 Pro Autonomous Navigation System Tutorial
1. Product Introduction
This system includes mapping, localization, navigation, and control modules. The robot can construct a point cloud map of its environment and, based on global localization, move autonomously to target points while avoiding obstacles.
The autonomous navigation system is a paid add-on feature, currently in the testing phase. To learn more or purchase a trial, please contact product@galaxea-dynamics.com or call 4008780980.
2. Hardware Introduction
2.1 Performance Parameters
| Localization | Description |
|---|---|
| Localization Method | Laser SLAM |
| Localization Frequency | 100 Hz |
| Localization Accuracy | < 0.05 m |

| Motion Control | Description |
|---|---|
| Control Method | Autonomous Navigation (Path Tracking) |
| Maximum Travel Speed | 0.6 m/s |
| Obstacle Avoidance Method | Obstacle Circumvention |
| Obstacle Avoidance Frequency | 10-20 Hz |

| Network | Description |
|---|---|
| Wired Network | Supported |
| WiFi | Supported |
2.2 Sensor Configuration
The Galaxea R1 Pro is equipped with various sensors, including 9 HD cameras and up to 2 LiDARs, enabling it to perceive its surroundings in all directions.

| Sensor | Description |
|---|---|
| Camera | Head: 1 x Stereo Depth Camera<br>Wrist: 2 x Monocular Depth Cameras<br>Chassis: 5 x Monocular Cameras |
| LiDAR | 1 x 360° (Optional dual LiDARs) |
2.2.1 Camera
| Specifications | Head | Wrist | Chassis |
|---|---|---|---|
| Type | Stereo Depth Camera | Monocular Depth Camera | Monocular Camera |
| Quantity | 1 | 2 | 5 |
| Output Resolution | 1920 x 1080 @30FPS | 1280 x 720 @30FPS | 1920 x 1080 @30FPS |
| Field of View | 110°H x 70°V | 87°H x 58°V x 95°D | 118°H x 62°V |
| Depth Range | 0.3 m ~ 20 m | 0.2 m ~ 3 m | N/A |
| Operating Temperature | 0 °C ~ +35 °C | 0 °C ~ +85 °C | -40 °C ~ +85 °C |
| Dimensions | 175L x 30W x 32H mm | 90L x 25W x 25H mm | 30L x 30W x 23H mm |
| Weight | 164 g | 75 g | <50 g |
2.2.2 LiDAR
The chassis is equipped with a 360° LiDAR* offering high precision and strong anti-interference capability.
| LiDAR | Description |
|---|---|
| Quantity | 1 ~ 2 |
| Field of View | 360°H x 59°V |
| Laser Wavelength | 905 nm |
| Detection Range | 40 m @10% Reflectivity<br>70 m @80% Reflectivity |
| Close-range Blind Zone | 0.1 m |
| Data Port | 100 BASE-TX Ethernet |
| IMU | Built-in IMU |
| Operating Temperature Range | -20 ~ +55°C |
| Dimensions | 65L x 65W x 60H mm |
| Weight | 265 g |
* The standard configuration includes 1 LiDAR; the number of LiDARs can be selected according to user requirements.
3. Software Introduction
Please ensure that your environment meets the following software dependency requirements.
- Hardware Dependency: R1 Pro Computing Unit
- Operating System Dependency: Ubuntu 22.04 LTS
- Middleware Dependency: ROS Humble
The R1 Pro robot must have software SDK V2.1.1 or higher installed.
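As a quick sanity check, the OS and middleware versions can be verified from a shell on the computing unit. This is a minimal sketch assuming standard Ubuntu tooling and a sourced ROS 2 environment; there is no public CLI check for the SDK version assumed here.

```bash
# Verify the OS release (expected: 22.04)
lsb_release -rs
# Verify the ROS 2 distribution (expected: humble; requires the ROS setup to be sourced)
echo $ROS_DISTRO
```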
4. Localization and Navigation Operation Process
Map construction is the basic step for autonomous navigation: drive the robot by remote control to record map data (mcap files), process the data and construct the map on a local computer, then upload the map to the designated directory on the robot to complete deployment. Following the tutorial below, set the target pose, modify the target file, and run it to achieve point-to-point navigation.
4.1 Map Construction
4.1.1 Start R1 Pro
Log in to the R1 Pro ECU via SSH.
Run the following commands to start the relevant nodes of R1:
```bash
cd ~/galaxea/install/startup_config/share/startup_config/script/
./robot_startup.sh boot ../sessions.d/ATCNavigation/R1PROVRTeleopNAV.d/
```
4.1.2 Record Data Packets
First verify that the required topics are publishing, then start recording the bag file:
```bash
# Check that the topics are complete and publishing at a normal frame rate
ros2 topic hz /hdas/imu_chassis
ros2 topic hz /hdas/lidar_chassis_left   # ~10 Hz
ros2 topic hz /hdas/feedback_chassis

cd ~
ros2 bag record /hdas/imu_chassis /hdas/lidar_chassis_left /hdas/feedback_chassis -s mcap
```
Using the remote controller, move the robot through the space to be mapped, ensuring coverage of all areas that require navigation.
For instructions on operating the robot chassis with the remote controller, please click here.
After completing the map data recording, press Ctrl + C to stop recording.
Note:
- Move the robot to the area to be mapped. Currently, only indoor environments are supported, with a maximum area of 100 square meters and a ceiling height not exceeding 5 meters.
- At the beginning of recording, the robot should remain stationary for more than 5 seconds to ensure data quality.
- During data recording, ensure there are no dynamic targets in the environment (such as moving people or objects), and do not follow closely behind the robot, as this interferes with map construction.
- After map construction is completed, ensure that there are no environmental changes (such as newly added tables, partitions, etc.) during subsequent use, otherwise, remapping is required.
- Robot movement should completely cover the navigation area twice, i.e., control the robot to travel the same route two times, completing two laps from point 1 to point 8 as shown in the figure below.

4.1.3 Obtain Mapping Runtime Environment (Docker)
- Download and Install Docker Image

  Please send an email to support@galaxea-dynamics.com to obtain the image file. It is recommended to refer to the Docker Installation Tutorial for installation.

- Load Docker File

  Execute the load command on the local computer (a hedged example is shown after this list). Docker stores its data under the root directory by default; please reserve more than 20 GB of storage space. If you need to change the storage path, proceed as follows:

  ```bash
  # 1. Create a new daemon.json file in the /etc/docker/ directory:
  sudo vim /etc/docker/daemon.json
  # 2. In the opened file, add the following content to change the Docker storage directory to the desired mount path:
  {
    "data-root": "/path/to/target_dir"
  }
  # 3. Save and exit: press 'Shift' + ':', type 'wq', then press Enter.
  # 4. After modifying the configuration, restart the Docker service to make it take effect:
  sudo systemctl restart docker
  ```

- Download the default calibration file. If purchased, we will send it to you via official email.
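The exact load command is not reproduced in this tutorial. As a hedged example, if the image arrives as a tar archive (the filename below is a placeholder), loading it would look like this:

```bash
# Load the emailed image archive into the local Docker daemon
# (galaxea_mapping.tar is a hypothetical filename)
docker load -i galaxea_mapping.tar
# Confirm the image now appears in the local image list
docker images
```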
4.1.4 Map in Environment
- On the local computer terminal, pull the recorded bag file from the R1 Pro to the local machine (a hedged command is sketched after this list).
- Prepare the bag file and calibration parameter file.
- Start Docker and begin mapping.
- View mapping results.
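The pull command itself is not reproduced here. A plausible sketch using scp, consistent with the `nvidia@{robot_ip}` convention used elsewhere in this tutorial (the bag directory name is a placeholder; `ros2 bag record` creates `rosbag2_<timestamp>` in the working directory by default):

```bash
# Copy the recorded bag directory from the robot to the local machine
# (replace rosbag2_<timestamp> with the actual directory name in ~ on the robot)
mkdir -p ~/mapping_data
scp -r nvidia@{robot_ip}:~/rosbag2_<timestamp> ~/mapping_data/
```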
4.1.5 Import Map and Calibration Files
Execute the following commands to import the map and calibration files to the R1 Pro:
```bash
ssh nvidia@{robot_ip} "mkdir -p ~/galaxea/calib ~/galaxea/maps"
scp -r ~/mapping_data/map/* nvidia@{robot_ip}:~/galaxea/maps/
scp -r ~/mapping_data/robot_calibration.json nvidia@{robot_ip}:~/galaxea/calib/
```
4.2 Start Localization Function
When starting the localization function, ensure that the robot is within a previously mapped area.

- Start Software

  Execute the startup command on the R1 Pro side to start the relevant nodes.

- Obtain Localization

  Switch the remote controller to chassis control mode and drive the robot in slow circles for 10-30 seconds within 2 m of the mapped environment to initialize localization. Then check the localization status on the R1 Pro; if data similar to the expected output is returned, localization is successful. A hedged check command is sketched below.
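A minimal sketch of such a check, assuming the localization output topic listed in Section 5.3:

```bash
# Print localization results; steady output with a plausible pose
# indicates that localization has initialized successfully
ros2 topic echo /localization/localization_results
```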
4.3 Set Target Pose
- Remote control robot to target point

  After localization starts successfully, remotely control the robot to the desired target point, ensuring that the center of the R1 is at least 45 cm away from obstacles.

- Record pose information

  Each time the robot is remotely controlled to a target point, record the pose information of that position (a hedged capture command is sketched after the script example below).

- Update navigation target point issuing script

  Repeat the above process; after recording the pose information of all target points, update them all in the navigation target point issuing script.

Script example as follows; modify `position` and `orientation` in `pose` to the target point information.

```bash
ros2 action send_goal /system_manager/navi/action system_manager_msg/action/NavigationTask -f --feedback "{
  header: { stamp: {sec: 0, nanosec: 0}, frame_id: 'map' },
  pose: {
    position: {x: -2.6662282943725586, y: 3.4633750915527344, z: 0.0},
    orientation: {x: 0.0, y: 0.0, z: 0.3263046490397236, w: 0.945264659243676}
  },
  frame_id: 'map',
  target_point_type: 1
}"
```
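For recording pose information at each target point, one plausible approach is to capture a single message from the localization output topic while the robot is parked at the target point, then copy its position and orientation into the script above. This is a sketch, not the documented command:

```bash
# Capture one localization result at the current target point
# (--once exits after the first message is received)
ros2 topic echo /localization/localization_results --once
```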
5. Software Interface
5.1 Driver Interface
R1 provides various driver interfaces for communication and control with hardware devices. The following are the main driver interfaces and their descriptions:
5.1.1 Chassis Driver Interface
/motion_control/chassis_speed: Used to control the movement of the robot chassis, including speed control, direction control, etc. Please go to the R1 Pro Software Introduction, chassis driver interface section, for more detailed information.
5.1.2 LiDAR Interface
/hdas/lidar_chassis_left: The LiDAR is used for environmental perception and distance measurement, providing real-time environmental information for the robot. Please go to the R1 Pro Software Introduction, LiDAR interface section, for more detailed information.
5.1.3 IMU Interface
/hdas/imu_chassis: The IMU measures the robot's acceleration, angular velocity, and related information, providing data support for navigation and attitude control. Please go to the R1 Pro Software Introduction, IMU interface section, for more detailed information.
5.2 Motion Control Interface
The R1 Pro robot provides various motion control interfaces for precise control of robot movement. The following are the main motion control interfaces and their descriptions:
5.2.1 Chassis Control Interface
/motion_target/target_speed_chassis: Used to control the movement of the robot chassis, including speed control, direction control, etc. Please go to the "Chassis Control Interface" section of the R1 Pro software manual for more detailed information.
5.3 Localization Interface
The localization interface is the core component for the R1 Pro robot to achieve autonomous navigation and environmental perception. Through these interfaces, the robot can receive data from various sensors, such as IMU (Inertial Measurement Unit) and LiDAR, thereby achieving accurate multi-sensor fusion positioning. These interfaces ensure that the robot can accurately perceive its position and attitude in complex environments, providing reliable data support for subsequent path planning and navigation. This chapter details the various topics of the localization interface, including the types and purposes of input and output data.
| Topic Name | I/O | Description | Message Type |
|---|---|---|---|
| /hdas/imu_chassis | Input | IMU data for multi-sensor fusion localization | sensor_msgs::msg::Imu |
| /hdas/lidar_chassis_left | Input | Multi-line LiDAR point cloud for localization | sensor_msgs::msg::PointCloud2 |
| /localization/localization_results | Output | SLAM localization status | localization_msg::msg::LocLocalization |
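To verify at runtime that the interface is wired as tabulated, the standard ros2 CLI can inspect each topic. A minimal sketch using the topic names from the table above:

```bash
# Inspect each localization topic; the reported types should match the table
ros2 topic info /hdas/imu_chassis                   # sensor_msgs/msg/Imu
ros2 topic info /hdas/lidar_chassis_left            # sensor_msgs/msg/PointCloud2
ros2 topic info /localization/localization_results  # localization_msg/msg/LocLocalization
```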
5.4 Navigation Topic Interface
The navigation interface is a key part of the R1 Pro robot's autonomous path planning and motion control. These interfaces allow the robot to perform global and local path planning based on input sensor data (such as LiDAR point clouds and SLAM localization status), and output control commands to drive the robot chassis movement. The navigation interface not only supports obstacle avoidance functions but can also update the robot's motion trajectory and task status in real-time, ensuring that the robot can efficiently and safely complete navigation tasks. This chapter details the various topics of the navigation interface, including the types and purposes of input and output data.
| Topic Name | I/O | Description | Message Type |
|---|---|---|---|
| /hdas/lidar_chassis_left | Input | Multi-line LiDAR point cloud for obstacle avoidance | sensor_msgs::msg::PointCloud2 |
| /localization/localization_results | Input | SLAM localization status | localization_msg::msg::LocLocalization |
| /nav/local_path | Output | Local path planned by the local path planner | sensor_msgs::msg::PointCloud2 |
| /nav/global_path | Output | Global path planned by the global path planner | sensor_msgs::msg::PointCloud2 |
| /nav/robot_global_traj | Output | Global trajectory of robot travel | nav_msgs::msg::Path |
| /nav/global_map | Output | Navigation global cost map for global path planning | nav_msgs::msg::OccupancyGrid |
| /nav/local_map | Output | Navigation local cost map for local path planning | nav_msgs::msg::OccupancyGrid |
| /nav/global_goal | Output | Target point received by navigation for visualization | geometry_msgs::msg::PoseStamped |
| /motion_target/target_speed_chassis | Output | Navigation output control speed | geometry_msgs::msg::Twist |
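While a navigation task is running, the output topics above can be monitored with the ros2 CLI. A short sketch using two of the tabulated topics:

```bash
# Show the goal currently being pursued (exits after one message)
ros2 topic echo /nav/global_goal --once
# Confirm that speed commands are being published to the chassis
ros2 topic hz /motion_target/target_speed_chassis
```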