Following the senior's blog post "aubo_i5 robotic arm unordered grasping practice II: eye_in_hand hand-eye calibration" (CSDN blog 我不不不不喝可乐的博客),
this post adds the TF publishing step that follows hand-eye calibration and, building on it, controls the aubo arm's end effector to track the calibration marker.


1. Reference links

MoveIt official documentation:

ArUco markers generator:


2. Install the packages

The required packages have already been installed in aubo_ws according to the instructions on GitHub.


3. Create the eye_on_hand_calibration.launch file

Referring to aubo_ws/src/easy_handeye/docs/example_launch/ur5e_realsense_calibration.launch,

create eye_on_hand_calibration.launch

The file lives in aubo_ws/src/easy_handeye/easy_handeye/launch/

eye_on_hand_calibration.launch:

<launch>
    <arg name="namespace_prefix" default="aubo_i5_calibration" />
    <arg name="robot_ip" doc="The IP address of the aubo_i5 robot" />
    <arg name="marker_size" doc="Size of the ArUco marker used, in meters" />
    <arg name="marker_id" doc="The ID of the ArUco marker used" />
    <!-- 1 start realsense-->
    <!-- include file="$(find realsense2_camera)/launch/rs_camera.launch" /-->

    <!-- 2 start ArUco -->
    <node name="aruco_tracker" pkg="aruco_ros" type="single">
        <remap from="/camera_info" to="/camera/color/camera_info" />
        <remap from="/image" to="/camera/color/image_raw" />
        <param name="image_is_rectified" value="true"/>
        <param name="marker_size"        value="0.1"/>
        <param name="marker_id"          value="25"/>
        <param name="reference_frame"    value="camera_color_optical_frame"/>
        <param name="camera_frame"       value="camera_color_optical_frame"/>
        <param name="marker_frame"       value="camera_marker"/>
    </node>

    <!-- 3 start the robot -->
    <include file="$(find aubo_i5_moveit_config)/launch/moveit_planning_execution.launch">
        <arg name="sim" value="true" />
        <arg name="robot_ip" value="192.168.99.254" />
    </include>


    <!-- 4 start easy_handeye -->
    <include file="$(find easy_handeye)/launch/calibrate.launch" >
        <arg name="namespace_prefix" value="$(arg namespace_prefix)" />
        <arg name="eye_on_hand" value="true" />

        <arg name="tracking_base_frame" value="camera_link" />
        <arg name="tracking_marker_frame" value="camera_marker" />
        <arg name="robot_base_frame" value="base_link" />
        <arg name="robot_effector_frame" value="wrist3_Link" />

        <arg name="freehand_robot_movement" value="false" />
        <arg name="robot_velocity_scaling" value="0.5" />
        <arg name="robot_acceleration_scaling" value="0.2" />
    </include>
</launch>

Notes:

  1. namespace_prefix is customized as "aubo_i5_calibration".

The calibrated TF can later be published with publish.launch; its namespace_prefix must match this custom value, otherwise the calibrated YAML file cannot be found.

  2. marker_id must match the ID of the marker image actually used.
  3. In the "start ArUco" block, camera_frame must be set to camera_color_optical_frame (a quick way to verify this frame name is sketched below).
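
If in doubt about the frame name, the following minimal sketch prints the frame_id reported by the color camera_info topic (it assumes the RealSense driver is running and the same topic remapping used above, /camera/color/camera_info):

#!/usr/bin/python
# -*- coding: utf-8 -*-
# Print the frame_id carried by the color camera_info topic; for a RealSense
# color stream this should be camera_color_optical_frame.
import rospy
from sensor_msgs.msg import CameraInfo

rospy.init_node('check_camera_frame')
info = rospy.wait_for_message('/camera/color/camera_info', CameraInfo)
print(info.header.frame_id)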

4. Calibration procedure

  • Run the launch file; the corresponding RViz and rqt windows will open
roslaunch easy_handeye eye_on_hand_calibration.launch

image-20221030142901533

Robot automatic-movement window:

image-20221028202523785

Calibration window:

image-20221030142730854

  • Calibration steps

(1) Manually jog the arm so that the ArUco marker sits near the center of the camera's field of view. In the robot automatic-movement window, click Check starting pose; if the check succeeds, the window shows: 0/17, ready to start.
(2) In the robot automatic-movement window, click Next Pose, Plan, and Execute in turn; the arm moves to a new pose. If the marker is still within the camera's field of view and is detected successfully, continue to the next step.
(3) In the calibration window, click Take Sample; if valid data appears in the Samples box, the first sample was taken successfully.
(4) Repeat steps (2) and (3) until all 17 points have been sampled.
(5) In the calibration window, click Compute; the result appears in the Result box.
(6) In the calibration window, click Save; the result is saved as a YAML file under ~/.ros/easy_handeye

aubo_i5_calibration_eye_on_hand.yaml

parameters:
  eye_on_hand: true
  freehand_robot_movement: false
  move_group: manipulator_i5
  move_group_namespace: /
  namespace: /aubo_i5_calibration_eye_on_hand/
  robot_base_frame: base_link
  robot_effector_frame: wrist3_Link
  tracking_base_frame: camera_link
  tracking_marker_frame: camera_marker
transformation:
  qw: 0.5044090478676875
  qx: -0.5437136044950396
  qy: -0.4612761515238514
  qz: -0.48700240322976784
  x: -0.0024504907487361725
  y: 0.03589105151444005
  z: 0.024799751488110557
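
As a quick sanity check on the saved calibration, the YAML can be loaded and the rotation printed as roll/pitch/yaw (a minimal sketch; it assumes the default save location shown above):

#!/usr/bin/python
# -*- coding: utf-8 -*-
# Minimal sketch: load the saved calibration YAML and print the translation
# plus the rotation as roll/pitch/yaw, assuming the default save location.
import os
import yaml
from tf.transformations import euler_from_quaternion

path = os.path.expanduser('~/.ros/easy_handeye/aubo_i5_calibration_eye_on_hand.yaml')
with open(path) as f:
    calib = yaml.safe_load(f)

t = calib['transformation']
print('translation (m):', t['x'], t['y'], t['z'])
print('rotation (rad, rpy):',
      euler_from_quaternion([t['qx'], t['qy'], t['qz'], t['qw']]))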

5. Publish the calibrated TF

The easy_handeye package provides a publish.launch file that publishes the calibrated TF.

publish.launch

<?xml version="1.0"?>
<launch>
    <arg name="eye_on_hand" doc="eye-on-hand instead of eye-on-base" default="true"/>
    <arg name="namespace_prefix" default="easy_handeye" />
    <arg if="$(arg eye_on_hand)" name="namespace" value="$(arg namespace_prefix)_eye_on_hand" />
    <arg unless="$(arg eye_on_hand)" name="namespace" value="$(arg namespace_prefix)_eye_on_base" />

    <!--it is possible to override the link names saved in the yaml file in case of name clashes, for example-->
    <arg if="$(arg eye_on_hand)" name="robot_effector_frame" default="" />
    <arg unless="$(arg eye_on_hand)" name="robot_base_frame" default="" />
    <arg name="tracking_base_frame" default="" />
    
    <arg name="inverse" default="false" />
    <arg name="calibration_file" default="" />
    
    <!--publish hand-eye calibration-->
    <group ns="$(arg namespace)">
        <param name="eye_on_hand" value="$(arg eye_on_hand)" />
        <param unless="$(arg eye_on_hand)" name="robot_base_frame" value="$(arg robot_base_frame)" />
        <param if="$(arg eye_on_hand)" name="robot_effector_frame" value="$(arg robot_effector_frame)" />
        <param name="tracking_base_frame" value="$(arg tracking_base_frame)" />
        <param name="inverse" value="$(arg inverse)" />
        <param name="calibration_file" value="$(arg calibration_file)" />
        <node name="$(anon handeye_publisher)" pkg="easy_handeye" type="publish.py" output="screen"/>
    </group>
</launch>

Make sure the namespace_prefix argument matches the one used in the calibration launch file; otherwise the calibrated YAML file cannot be found.

  • ---- Method 1 ----
roslaunch easy_handeye publish.launch namespace_prefix:=aubo_i5_calibration

Check the TF (from the base frame to the camera frame):

rosrun tf tf_echo /base_link /camera_link
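
The same transform can also be read programmatically; a minimal sketch of the equivalent lookup:

#!/usr/bin/python
# -*- coding: utf-8 -*-
# Minimal sketch: look up the base_link -> camera_link transform made available
# by publish.launch, equivalent to the tf_echo command above.
import rospy
import tf

rospy.init_node('echo_handeye_tf')
listener = tf.TransformListener()
listener.waitForTransform('base_link', 'camera_link', rospy.Time(0), rospy.Duration(5.0))
trans, rot = listener.lookupTransform('base_link', 'camera_link', rospy.Time(0))
print('translation:', trans)
print('rotation (x, y, z, w):', rot)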

In a real application, the calibration can be published from the robot's launch file like this:

<include file="$(find easy_handeye)/launch/publish.launch" >
    <arg name="namespace_prefix" value="aubo_i5_calibration" />
    <arg name="eye_on_hand" value="true" />
</include>
  • ---- Method 2 ----

    Add a static_transform_publisher node that publishes the transform periodically (the trailing 100 is the publishing period in milliseconds, i.e. 10 Hz)

<node pkg="tf" type="static_transform_publisher" name="static_transform_publisher" args="x y z qx qy qz qw <frame_id> <child_frame_id> 100" />

---- Method 1: concrete configuration ----

In moveit_planning_execution.launch under /home/robot/amr_ws/src/ir100_aubo_moveit_config/launch

and moveit_planning_execution.launch under /home/robot/aubo_ws/src/iaubo_i5_moveit_config/launch

add the following:

<include file="$(find easy_handeye)/launch/publish.launch" >
    <arg name="namespace_prefix" value="aubo_i5_calibration" />
    <arg name="eye_on_hand" value="true" />
</include>

---- Method 2: concrete configuration ----

In the same moveit_planning_execution.launch files,

add the following node to publish the calibration result.

Set the parameter values according to the latest calibration result under ~/.ros/

image-20221030142304620
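
The node itself, with the concrete values filled in, is shown in the screenshot above. Purely as an illustration, the sketch below does the same job as a standalone Python static broadcaster instead of a launch-file node; the numbers are copied from the sample YAML in step 4, and the frame names follow the eye-on-hand convention (parent wrist3_Link, child camera_link). Substitute your own latest calibration values.

#!/usr/bin/python
# -*- coding: utf-8 -*-
# Illustrative alternative to the static_transform_publisher launch node:
# publish the calibrated wrist3_Link -> camera_link transform with tf2.
import rospy
import tf2_ros
from geometry_msgs.msg import TransformStamped

rospy.init_node('handeye_static_publisher')

t = TransformStamped()
t.header.stamp = rospy.Time.now()
t.header.frame_id = 'wrist3_Link'   # robot_effector_frame
t.child_frame_id = 'camera_link'    # tracking_base_frame
# Values copied from the sample aubo_i5_calibration_eye_on_hand.yaml above;
# replace them with your own latest calibration result.
t.transform.translation.x = -0.0024504907487361725
t.transform.translation.y = 0.03589105151444005
t.transform.translation.z = 0.024799751488110557
t.transform.rotation.x = -0.5437136044950396
t.transform.rotation.y = -0.4612761515238514
t.transform.rotation.z = -0.48700240322976784
t.transform.rotation.w = 0.5044090478676875

broadcaster = tf2_ros.StaticTransformBroadcaster()
broadcaster.sendTransform(t)
rospy.spin()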

Summary:

When re-calibrating later, Method 2 requires manually updating the parameter values from the latest .yaml file,

whereas Method 1 only requires that namespace_prefix match the namespace_prefix in eye_on_hand_calibration.launch from step 3.


6. Get the marker pose relative to ref_frame

Referring to /home/robot/aubo_ws/src/aruco_ros/aruco_ros/launch/single.launch,

create single_marker_frame_to_ref_frame.launch

<launch>

    <arg name="markerId"        default="25"/>
    <arg name="markerSize"      default="0.1"/>    <!-- in m -->
    
    <arg name="eye"             default="left"/>
    
    <arg name="marker_frame"    default="aruco_marker_frame"/>
    
    <!-- leave empty and the pose will be published wrt param parent_name -->
    <!--arg name="ref_frame"       default="camera_color_frame"/--> 
    <!--arg name="ref_frame"       default="wrist3_Link"/-->
    <arg name="ref_frame"       default="camera_link"/>

    <arg name="corner_refinement" default="LINES" /> <!-- NONE, HARRIS, LINES, SUBPIX -->    
    
    <node pkg="aruco_ros" type="single" name="aruco_single">

        <remap from="/camera_info" to="/camera/color/camera_info" />
         <remap from="/image" to="/camera/color/image_raw" />

        <param name="image_is_rectified" value="True"/>
        <param name="marker_size"        value="$(arg markerSize)"/>
        <param name="marker_id"          value="$(arg markerId)"/>


        <param name="reference_frame"    value="$(arg ref_frame)"/>   <!-- frame in which the marker pose will be refered -->
        <param name="camera_frame"       value="camera_color_optical_frame"/>
        <param name="marker_frame"       value="$(arg marker_frame)" />
        <param name="corner_refinement"  value="$(arg corner_refinement)" />
    </node>

</launch>

Test:

  • Start the realsense camera on its own
roslaunch realsense2_camera rs_camera.launch
  • Start single_marker_frame_to_ref_frame.launch
roslaunch aruco_ros single_marker_frame_to_ref_frame.launch ref_frame:=camera_color_optical_frame
  • Open rqt and select /aruco_single/result
rqt

image-20221101171354512

  • Print the relative pose in the terminal
rostopic echo /aruco_single/pose

image-20221101171432883

As shown in RViz (the X axis is red, the Y axis green, the Z axis blue),

the pose of aruco_marker_frame relative to camera_color_optical_frame essentially matches the terminal output. A quick way to check this programmatically is sketched below, after the screenshot.

image-20221101171511907
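
A minimal sketch of that check (it assumes the launch file above is running with ref_frame:=camera_color_optical_frame): read one pose from the topic and look up the same transform via TF; the two should agree.

#!/usr/bin/python
# -*- coding: utf-8 -*-
# Compare the marker pose from /aruco_single/pose with a TF lookup of
# aruco_marker_frame expressed in camera_color_optical_frame.
import rospy
import tf
from geometry_msgs.msg import PoseStamped

rospy.init_node('check_marker_pose')
listener = tf.TransformListener()

msg = rospy.wait_for_message('/aruco_single/pose', PoseStamped)
print('topic: position', msg.pose.position, 'orientation', msg.pose.orientation)

listener.waitForTransform('camera_color_optical_frame', 'aruco_marker_frame',
                          rospy.Time(0), rospy.Duration(5.0))
trans, rot = listener.lookupTransform('camera_color_optical_frame',
                                      'aruco_marker_frame', rospy.Time(0))
print('tf:    position', trans, 'orientation (x, y, z, w)', rot)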


7. Get the marker pose relative to the robot base

Using the arm on its own

  • Start the arm
roslaunch aubo_i5_moveit_config moveit_planning_execution.launch robot_ip:=192.168.99.254
  • Get the marker pose relative to the robot base
roslaunch aruco_ros single_marker_frame_to_ref_frame.launch ref_frame:=base_link
  • Print the relative pose in the terminal
rostopic echo /aruco_single/pose

Using the arm together with the mobile base (cart)

  1. Start the cart, the arm, and MoveIt
roslaunch ir100_driver ir100_aubo.launch
  2. Get the marker pose relative to the arm's base in this setup; the base frame is now base_link1, so just change the ref_frame argument
roslaunch aruco_ros single_marker_frame_to_ref_frame.launch ref_frame:=base_link1
  3. Print the relative pose in the terminal
rostopic echo /aruco_single/pose

image-20221030152623731


8. Control the end effector to track the marker: Python script

You can create a node that subscribes to the /aruco_single/pose topic and sends the marker's pose relative to the robot base to MoveIt,

thereby moving the end effector to the marker's position.

The code uses the MoveIt Python interface; see the official documentation.

Note: as displayed in RViz, the orientation of the end-effector frame wrist3_Link does not match that of the marker frame aruco_marker_frame.

To keep the end effector pointing at the marker, given the relative orientation shown in the figure,

aruco_marker_frame has to be rotated 180° about its own y axis.
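
To see why the quaternion [0, 1, 0, 0] achieves this, a quick standalone check (independent of the script below):

#!/usr/bin/python
# -*- coding: utf-8 -*-
# The quaternion (x, y, z, w) = (0, 1, 0, 0) is a 180-degree rotation about the
# y axis; applying it flips the z axis, which is what makes the end effector
# face the marker. Prints approximately [0, 0, -1].
import tf.transformations as tfs

q_y180 = [0, 1, 0, 0]
R = tfs.quaternion_matrix(q_y180)
print(R[:3, :3].dot([0, 0, 1]))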

aubo_tracktomarker.py

#!/usr/bin/python
# -*- coding: utf-8 -*-

import rospy, sys, tf
import numpy as np
import moveit_commander
import moveit_msgs.msg
import tf.transformations

from tf.transformations import quaternion_from_euler 
from moveit_commander import MoveGroupCommander
from geometry_msgs.msg import Pose, PoseStamped 
from copy import deepcopy
from std_msgs.msg import String

def track(msg):
    # We can get the name of the reference frame for this robot:
    planning_frame = arm.get_planning_frame()
    print("============ Planning frame: %s" % planning_frame)

    # We can also print the name of the end-effector link for this group:
    eef_link = arm.get_end_effector_link()
    print("============ End effector link: %s" % eef_link)

    # Sometimes for debugging it is useful to print the entire state of the
    # robot:
    print("============ Printing robot state")
    print(robot.get_current_state())
    print("")
    
    end_effector_link = "wrist3_Link"
    
    pose_goal = Pose()
    
    # User-defined offset, expressed in the marker frame
    x_offset = -0.15
    y_offset = -0.15
    z_offset = 0.30

    v_offset = (x_offset, y_offset, z_offset)
    v = np.asarray(v_offset)
    t_m = tf.transformations.translation_matrix(v)
    print("--- offset matrix ---")
    print(t_m)

    marker_m = tf.transformations.quaternion_matrix([msg.pose.orientation.x,
                                                     msg.pose.orientation.y,
                                                     msg.pose.orientation.z,
                                                     msg.pose.orientation.w])
    marker_m[0, 3] = msg.pose.position.x
    marker_m[1, 3] = msg.pose.position.y
    marker_m[2, 3] = msg.pose.position.z
    print("--- marker matrix ---")
    print(marker_m)

    # Right-multiplying the marker matrix by the translation matrix
    # translates the marker along its own axes
    r_dot = np.dot(marker_m, t_m)
    print("--- marker matrix after offset ---")
    print(r_dot)

    pose_goal.position.x = r_dot[0, 3]
    pose_goal.position.y = r_dot[1, 3]
    pose_goal.position.z = r_dot[2, 3]

    r_q = tf.transformations.quaternion_from_matrix(r_dot)
    print("--- marker quaternion ---")
    print(r_q)

    # Right-multiply by the quaternion [0, 1, 0, 0]: rotate 180 degrees
    # about the marker's own y axis
    qf = tf.transformations.quaternion_multiply(r_q, [0, 1, 0, 0])
    print("--- marker quaternion after 180-degree rotation about its own y axis ---")
    print(qf)
    pose_goal.orientation.x = qf[0]
    pose_goal.orientation.y = qf[1]
    pose_goal.orientation.z = qf[2]
    pose_goal.orientation.w = qf[3]

    arm.set_pose_target(pose_goal, end_effector_link)
    arm.go()

    # It is always good to clear your targets after planning with poses.
    # Note: there is no equivalent function for clear_joint_value_targets().
    arm.clear_pose_targets()
    rospy.sleep(2)
 
if __name__ == '__main__':

    try:
        rospy.init_node('track_marker', anonymous=True)

        # Initialize the moveit_commander API
        moveit_commander.roscpp_initialize(sys.argv)

        # Initialize the arm move group to be controlled
        # arm = MoveGroupCommander('aubo_i5')  # group for the cart + arm model
        arm = MoveGroupCommander('manipulator_i5')

        # Instantiate a RobotCommander object.
        # Provides information such as the robot's
        # kinematic model and the robot's current joint states
        robot = moveit_commander.RobotCommander()

        # Allow replanning when a motion plan fails
        arm.allow_replanning(True)

        # Reference frame used for pose targets
        # reference_frame = 'base_link1'
        reference_frame = 'base_link'
        arm.set_pose_reference_frame(reference_frame)
        arm.set_planning_time(5)

        # Allowed position (m) and orientation (rad) goal tolerances
        arm.set_goal_position_tolerance(0.001)
        arm.set_goal_orientation_tolerance(0.001)

        # Maximum velocity and acceleration scaling
        arm.set_max_acceleration_scaling_factor(1)
        arm.set_max_velocity_scaling_factor(1)

        track_sub = rospy.Subscriber('/aruco_single/pose', PoseStamped, track, queue_size=1)

        # Keep the node alive so the subscriber callback keeps firing
        rospy.spin()

    except rospy.ROSInterruptException:
        pass

Summary

  • In steps 4 and 5 above:

the calibration procedure run via roslaunch easy_handeye eye_on_hand_calibration.launch

generates a .yaml file under ~/.ros/easy_handeye,

and based on that file the node shown below is added to moveit_planning_execution.launch under /home/robot/amr_ws/src/ir100_aubo_moveit_config/launch

to publish the calibration result (alternatively, use publish.launch), which overrides the camera_link-to-wrist3_Link TF transform from the URDF model

image-20221030142304620

  • In steps 6 and 7:

with roslaunch aruco_ros single_marker_frame_to_ref_frame.launch ref_frame:=....

and rostopic echo /aruco_single/pose

the marker's pose relative to ref_frame can be displayed in the terminal

  • In step 8:

a node subscribes to the /aruco_single/pose topic and, combined with MoveIt, drives the end effector to the marker's position

Command summary

Re-calibrate

The default namespace_prefix is aubo_i5_calibration

roslaunch easy_handeye eye_on_hand_calibration.launch
Start the arm
## arm-only model
roslaunch aubo_i5_moveit_config moveit_planning_execution.launch robot_ip:=192.168.99.254

## cart + arm model
roslaunch ir100_driver ir100_aubo.launch
Get the marker pose relative to the robot base
## arm-only model
roslaunch aruco_ros single_marker_frame_to_ref_frame.launch ref_frame:=base_link

## cart + arm model
roslaunch aruco_ros single_marker_frame_to_ref_frame.launch ref_frame:=base_link1
Subscribe to the /aruco_single/pose topic and track the marker
cd /home/robot/program/grasp

./aubo_tracktomarker.py
Live demo

Tracking the marker
