1. System Architecture Design
1.1 Functional Requirements Analysis
1.2 Technology Stack Selection
| Component | Technology |
| --- | --- |
| Robot framework | ROS2 Humble |
| Simulation environment | Gazebo 11 |
| Navigation stack | Nav2 (Navigation2) |
| Mapping algorithm | SLAM Toolbox |
| Voice interaction | SpeechRecognition + PyAudio |
| Modeling tools | URDF + SolidWorks plugin |
2. Development Environment Setup
2.1 Installing System Dependencies
sudo apt install -y \
  ros-humble-desktop \
  ros-humble-gazebo-* \
  ros-humble-navigation2 \
  ros-humble-nav2-bringup \
  python3-pyaudio
2.2 Workspace Initialization
mkdir -p ~/delivery_robot_ws/src
cd ~/delivery_robot_ws/
rosdep install -i --from-path src --rosdistro humble -y
colcon build --symlink-install
3. Robot URDF Modeling
3.1 Base Structure Definition (base.urdf.xacro)
<?xml version="1.0"?>
<robot name="delivery_robot" xmlns:xacro="http://www.ros.org/wiki/xacro">
  <link name="base_link">
    <visual>
      <geometry>
        <cylinder radius="0.25" length="0.15"/>
      </geometry>
      <material name="blue">
        <color rgba="0 0 1 1"/>
      </material>
    </visual>
  </link>

  <xacro:macro name="wheel" params="prefix reflect">
    <link name="${prefix}_wheel_link">
      <visual>
        <geometry>
          <cylinder radius="0.05" length="0.03"/>
        </geometry>
      </visual>
    </link>
    <joint name="${prefix}_wheel_joint" type="continuous">
      <origin xyz="0 ${reflect*0.15} -0.05" rpy="1.5708 0 0"/>
      <parent link="base_link"/>
      <child link="${prefix}_wheel_link"/>
      <axis xyz="0 0 1"/>
    </joint>
  </xacro:macro>

  <xacro:wheel prefix="left" reflect="1"/>
  <xacro:wheel prefix="right" reflect="-1"/>
</robot>
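The xacro model only becomes useful once it is expanded and published as the robot description. Below is a minimal sketch, assuming a hypothetical description launch file inside delivery_robot_description (package and file names follow the repository layout in section 8, not a published implementation), that runs the xacro processor and hands the result to robot_state_publisher:

# Hypothetical launch/description.launch.py (sketch, names assumed)
import os
from ament_index_python.packages import get_package_share_directory
from launch import LaunchDescription
from launch.substitutions import Command
from launch_ros.actions import Node
from launch_ros.parameter_descriptions import ParameterValue

def generate_launch_description():
    xacro_file = os.path.join(
        get_package_share_directory('delivery_robot_description'),
        'urdf', 'delivery_robot.urdf.xacro')
    # Expand the xacro at launch time and expose it as the robot_description parameter
    robot_description = ParameterValue(Command(['xacro ', xacro_file]), value_type=str)
    return LaunchDescription([
        Node(
            package='robot_state_publisher',
            executable='robot_state_publisher',
            parameters=[{'robot_description': robot_description}],
        ),
    ])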
3.2 Sensor Integration
<link name="lidar_link">
  <visual>
    <geometry>
      <cylinder radius="0.05" length="0.1"/>
    </geometry>
  </visual>
</link>
<joint name="lidar_joint" type="fixed">
  <origin xyz="0 0 0.2" rpy="0 0 0"/>
  <parent link="base_link"/>
  <child link="lidar_link"/>
</joint>

<!-- camera_link is assumed to be defined analogously to lidar_link -->
<gazebo reference="camera_link">
  <sensor type="depth" name="camera">
    <update_rate>30</update_rate>
    <camera>
      <horizontal_fov>1.047</horizontal_fov>
      <image>
        <width>640</width>
        <height>480</height>
      </image>
    </camera>
  </sensor>
</gazebo>
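Once the corresponding gazebo_ros sensor plugins are configured, it is worth confirming in simulation that the sensors actually publish data. The following throwaway checker node is an illustrative sketch (not part of the original code base); the /scan topic name matches the scan_topic used by the SLAM configuration in section 4.1:

# Optional sanity check: log the closest obstacle seen by the simulated lidar
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import LaserScan

class ScanChecker(Node):
    def __init__(self):
        super().__init__('scan_checker')
        self.create_subscription(LaserScan, '/scan', self.scan_callback, 10)

    def scan_callback(self, msg):
        # Keep only finite readings inside the sensor's valid range
        valid = [r for r in msg.ranges if msg.range_min < r < msg.range_max]
        if valid:
            self.get_logger().info(f'Closest obstacle: {min(valid):.2f} m')

def main():
    rclpy.init()
    rclpy.spin(ScanChecker())
    rclpy.shutdown()

if __name__ == '__main__':
    main()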
4. SLAM Mapping and Localization
4.1 SLAM Toolbox Configuration
slam_toolbox:
  ros__parameters:
    base_frame: "base_link"
    odom_frame: "odom"
    map_frame: "map"
    mode: "mapping"
    scan_topic: "/scan"
    map_resolution: 0.05
    map_size: 204.8
4.2 Mapping Workflow
# 1. Start the simulation environment
ros2 launch delivery_robot_gazebo simulation.launch.py
# 2. Start SLAM Toolbox in mapping mode
ros2 launch delivery_robot_slam slam.launch.py
# 3. Drive the robot with the keyboard until the environment is covered
ros2 run teleop_twist_keyboard teleop_twist_keyboard
# 4. Save the finished map
ros2 run nav2_map_server map_saver_cli -f ~/maps/office_map
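For reference, the slam.launch.py used in step 2 can be as simple as starting SLAM Toolbox with the YAML from section 4.1. A minimal sketch, assuming the parameter file is installed as config/slam_toolbox_params.yaml inside delivery_robot_slam (the file name is an assumption):

# Hypothetical launch/slam.launch.py (sketch, file name assumed)
import os
from ament_index_python.packages import get_package_share_directory
from launch import LaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    slam_params = os.path.join(
        get_package_share_directory('delivery_robot_slam'),
        'config', 'slam_toolbox_params.yaml')
    return LaunchDescription([
        Node(
            package='slam_toolbox',
            executable='async_slam_toolbox_node',
            name='slam_toolbox',
            output='screen',
            # Load the section 4.1 parameters and use simulation time from Gazebo
            parameters=[slam_params, {'use_sim_time': True}],
        ),
    ])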
5. Nav2 Navigation System Configuration
5.1 Navigation Parameter Configuration
nav2_costmap_2d:
  global_costmap:
    global_frame: "map"
    robot_base_frame: "base_link"
    resolution: 0.05
    plugins:
      - {name: static_layer, type: "nav2_costmap_2d::StaticLayer"}
      - {name: obstacle_layer, type: "nav2_costmap_2d::ObstacleLayer"}
      - {name: inflation_layer, type: "nav2_costmap_2d::InflationLayer"}
  local_costmap:
    global_frame: "odom"
    rolling_window: true
5.2 Path Planning and Goal Publishing
The node below does not implement a planner itself: it periodically publishes a target pose on /goal_pose, and Nav2's planner server turns that goal into a path.
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import PoseStamped

class NavGoalPublisher(Node):
    def __init__(self):
        super().__init__('nav_goal_publisher')
        self.publisher_ = self.create_publisher(PoseStamped, '/goal_pose', 10)
        timer_period = 5.0  # seconds between goal publications
        self.timer = self.create_timer(timer_period, self.timer_callback)

    def timer_callback(self):
        msg = PoseStamped()
        msg.header.stamp = self.get_clock().now().to_msg()
        msg.header.frame_id = 'map'
        msg.pose.position.x = 2.0
        msg.pose.position.y = 1.0
        msg.pose.orientation.w = 1.0  # identity orientation; Nav2 expects a valid quaternion
        self.publisher_.publish(msg)
        self.get_logger().info(
            f'Publishing goal: ({msg.pose.position.x}, {msg.pose.position.y})')

def main():
    rclpy.init()
    rclpy.spin(NavGoalPublisher())
    rclpy.shutdown()

if __name__ == '__main__':
    main()
6. Voice Interaction Module Integration
6.1 Speech Recognition Implementation
import speech_recognition as sr
import rclpy
from rclpy.node import Node
from std_msgs.msg import String

class VoiceController(Node):
    def __init__(self):
        super().__init__('voice_controller')
        self.publisher_ = self.create_publisher(String, '/voice_command', 10)
        self.recognizer = sr.Recognizer()
        self.mic = sr.Microphone()

    def listen(self):
        with self.mic as source:
            self.recognizer.adjust_for_ambient_noise(source)
            audio = self.recognizer.listen(source)
        try:
            # language='zh-CN' because the command keywords parsed below are Chinese
            command = self.recognizer.recognize_google(audio, language='zh-CN').lower()
            self.publisher_.publish(String(data=command))
            return command
        except sr.UnknownValueError:
            return ""
        except sr.RequestError as e:
            self.get_logger().error(f'Recognition error: {e}')
            return ""
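listen() blocks while waiting for audio, so the node needs an entry point that keeps the ROS executor alive at the same time. A minimal sketch of such a main() (an assumption, not shown in the original):

# Hypothetical entry point: spin callbacks in a background thread, listen in the foreground
import threading
import rclpy

def main():
    rclpy.init()
    node = VoiceController()  # the class defined above
    spin_thread = threading.Thread(target=rclpy.spin, args=(node,), daemon=True)
    spin_thread.start()
    try:
        while rclpy.ok():
            command = node.listen()  # blocks until an utterance is heard
            if command:
                node.get_logger().info(f'Heard: {command}')
    except KeyboardInterrupt:
        pass
    finally:
        node.destroy_node()
        rclpy.shutdown()

if __name__ == '__main__':
    main()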
6.2 Command Parsing Logic
def command_callback(self, msg):
    cmd = msg.data
    if '送餐' in cmd:  # "deliver food" keyword
        target = self.parse_target(cmd)
        self.send_navigation_goal(target)
    elif '停止' in cmd:  # "stop" keyword
        self.cancel_navigation()

def parse_target(self, cmd):
    # Known delivery locations and their map-frame coordinates (x, y)
    locations = {
        '前台': (2.0, 1.0),    # front desk
        '会议室': (5.0, -3.0),  # meeting room
        '休息区': (-1.0, 4.0),  # lounge
    }
    for loc, coord in locations.items():
        if loc in cmd:
            return coord
    return None
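The callbacks above reference send_navigation_goal() and cancel_navigation(), which are not shown. Below is a hedged sketch of how they could be implemented on Nav2's standard NavigateToPose action interface; the mixin structure and method names are assumptions layered on that interface, not a published implementation:

from rclpy.action import ActionClient
from nav2_msgs.action import NavigateToPose
from geometry_msgs.msg import PoseStamped

class NavigationMixin:
    """Intended to be mixed into an rclpy Node such as VoiceController."""

    def init_navigation(self):
        # Client for Nav2's standard navigate_to_pose action server
        self._nav_client = ActionClient(self, NavigateToPose, 'navigate_to_pose')
        self._goal_handle = None

    def send_navigation_goal(self, target):
        if target is None:
            self.get_logger().warn('Command did not contain a known location')
            return
        x, y = target
        goal = NavigateToPose.Goal()
        goal.pose = PoseStamped()
        goal.pose.header.frame_id = 'map'
        goal.pose.header.stamp = self.get_clock().now().to_msg()
        goal.pose.pose.position.x = x
        goal.pose.pose.position.y = y
        goal.pose.pose.orientation.w = 1.0
        self._nav_client.wait_for_server()
        send_future = self._nav_client.send_goal_async(goal)
        send_future.add_done_callback(self._store_goal_handle)

    def _store_goal_handle(self, future):
        self._goal_handle = future.result()

    def cancel_navigation(self):
        if self._goal_handle is not None:
            self._goal_handle.cancel_goal_async()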
7. System Integration and Testing
7.1 Full System Launch Procedure
# 1. Simulation environment, robot model and sensors
ros2 launch delivery_robot_gazebo full_system.launch.py
# 2. Nav2 navigation stack (localization, planners, controllers)
ros2 launch delivery_robot_nav2 nav2.launch.py
# 3. Voice interaction node
ros2 run delivery_robot_voice voice_control_node
# 4. Visualization with the project's RViz configuration
ros2 run rviz2 rviz2 -d $(ros2 pkg prefix delivery_robot_bringup)/share/delivery_robot_bringup/rviz/nav2.rviz
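Section 8.2 lists launch/full_system.launch.py as the entry point for the complete system. One way such a file could compose the pieces started manually above is sketched below; package and file names follow the repository layout in section 8 and are assumptions rather than the published implementation:

# Hypothetical full_system.launch.py (sketch)
import os
from ament_index_python.packages import get_package_share_directory
from launch import LaunchDescription
from launch.actions import IncludeLaunchDescription
from launch.launch_description_sources import PythonLaunchDescriptionSource
from launch_ros.actions import Node

def generate_launch_description():
    gazebo_launch = os.path.join(
        get_package_share_directory('delivery_robot_gazebo'),
        'launch', 'simulation.launch.py')
    nav2_launch = os.path.join(
        get_package_share_directory('delivery_robot_nav2'),
        'launch', 'nav2.launch.py')
    return LaunchDescription([
        IncludeLaunchDescription(PythonLaunchDescriptionSource(gazebo_launch)),
        IncludeLaunchDescription(PythonLaunchDescriptionSource(nav2_launch)),
        Node(package='delivery_robot_voice', executable='voice_control_node'),
    ])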
7.2 Test Case Design
| Test scenario | Expected result | Verification method |
| --- | --- | --- |
| Navigation in an open environment | Smooth planned path, on-time arrival | Path display in RViz + arrival notification |
| Dynamic obstacle avoidance | Path adjusted in real time, safe distance maintained | Add moving obstacles in Gazebo |
| Voice command recognition | Location commands parsed correctly and navigation executed | Colloquial command tests (in a noisy environment) |
| Low-battery warning | Autonomous return to the charging dock | Simulate the battery level dropping below the threshold (see the sketch below) |
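The low-battery test case presupposes a node that reacts to battery state. A minimal sketch of such a monitor follows; the topic name, threshold and dock coordinates are illustrative assumptions. When the reported charge falls below the threshold, it publishes the charging-dock pose as a navigation goal:

# Hypothetical battery monitor used by the low-battery test case (sketch)
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import BatteryState
from geometry_msgs.msg import PoseStamped

class BatteryMonitor(Node):
    def __init__(self):
        super().__init__('battery_monitor')
        self.goal_pub = self.create_publisher(PoseStamped, '/goal_pose', 10)
        self.create_subscription(BatteryState, '/battery_state', self.battery_callback, 10)
        self.low_threshold = 0.2     # assumed threshold: 20% charge (BatteryState.percentage is 0..1)
        self.dock_pose = (0.0, 0.0)  # assumed charging-dock position in the map frame
        self.returning = False

    def battery_callback(self, msg):
        if msg.percentage < self.low_threshold and not self.returning:
            self.returning = True
            goal = PoseStamped()
            goal.header.frame_id = 'map'
            goal.header.stamp = self.get_clock().now().to_msg()
            goal.pose.position.x, goal.pose.position.y = self.dock_pose
            goal.pose.orientation.w = 1.0
            self.get_logger().warn('Battery low, returning to the charging dock')
            self.goal_pub.publish(goal)

def main():
    rclpy.init()
    rclpy.spin(BatteryMonitor())
    rclpy.shutdown()

if __name__ == '__main__':
    main()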
8. Code Repository Overview
8.1 Code Structure
delivery_robot_ws/
├── src/
│   ├── delivery_robot_description/   # URDF model
│   ├── delivery_robot_gazebo/        # simulation environment
│   ├── delivery_robot_slam/          # SLAM configuration
│   ├── delivery_robot_nav2/          # navigation stack
│   ├── delivery_robot_voice/         # voice interaction
│   └── delivery_robot_bringup/       # system integration
8.2 Key Files
- urdf/delivery_robot.urdf.xacro: robot model description file;
- config/nav2_params.yaml: navigation parameter configuration;
- scripts/voice_control.py: main voice interaction program;
- launch/full_system.launch.py: launch file for the complete system.
8.3 Runtime Requirements
- Hardware: an 8th-generation Intel i5 CPU, 16 GB of RAM and a discrete GPU are recommended;
- Software: Ubuntu 22.04 with a full ROS2 Humble installation;
- Dependencies: the Gazebo Classic ROS integration packages must be installed (sudo apt install ros-humble-gazebo-ros-pkgs).
9. Suggestions for Extending Functionality
- Multi-robot coordination: use ROS2's DDS middleware for communication between robots;
- Elevator interaction: add a digital I/O interface to operate elevator buttons;
- Task scheduling system: manage a task queue on top of the ROS2 action interface (see the sketch after this list);
- Cloud monitoring: integrate WebSocket for remote status viewing.
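As a starting point for the task-scheduling suggestion, here is a very small sketch of a sequential task queue built on the NavigateToPose action interface; the class name, coordinates and blocking execution style are illustrative assumptions:

# Hypothetical sequential task scheduler (sketch)
from collections import deque
import rclpy
from rclpy.node import Node
from rclpy.action import ActionClient
from nav2_msgs.action import NavigateToPose

class TaskScheduler(Node):
    def __init__(self):
        super().__init__('task_scheduler')
        self.client = ActionClient(self, NavigateToPose, 'navigate_to_pose')
        self.queue = deque()

    def add_task(self, x, y):
        goal = NavigateToPose.Goal()
        goal.pose.header.frame_id = 'map'
        goal.pose.pose.position.x = x
        goal.pose.pose.position.y = y
        goal.pose.pose.orientation.w = 1.0
        self.queue.append(goal)

    def run(self):
        self.client.wait_for_server()
        while self.queue:
            goal = self.queue.popleft()
            goal.pose.header.stamp = self.get_clock().now().to_msg()
            # Send the goal and block until Nav2 reports the task is finished
            send_future = self.client.send_goal_async(goal)
            rclpy.spin_until_future_complete(self, send_future)
            goal_handle = send_future.result()
            if not goal_handle.accepted:
                self.get_logger().warn('Goal rejected, skipping task')
                continue
            result_future = goal_handle.get_result_async()
            rclpy.spin_until_future_complete(self, result_future)
            self.get_logger().info('Task finished, moving on to the next one')

def main():
    rclpy.init()
    scheduler = TaskScheduler()
    scheduler.add_task(2.0, 1.0)   # e.g. front desk
    scheduler.add_task(5.0, -3.0)  # e.g. meeting room
    scheduler.run()
    rclpy.shutdown()

if __name__ == '__main__':
    main()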
Note: for real deployment, sensor parameters and the navigation configuration must be tuned to the specific site; it is recommended to validate more than 90% of the functionality in simulation before moving to a physical robot. Different hardware platforms can be accommodated by adjusting the sensor plugin parameters in the URDF.