Human-Robot Cooperative Handling Using Virtual Nonholonomic Constraint in 3-D space

Tomohito TAKUBO*, Hirohiko ARAI**, Kazuo TANIE**
*Graduate School of Engineering, University of Tsukuba, 1-1-1 Tennodai, Tsukuba, Ibaraki, 305-8573 Japan
**Intelligent Systems Laboratory, AIST, 1-2 Namiki, Tsukuba, Ibaraki, 305-8564 Japan
Email: [email protected]

Abstract

This paper deals with human-robot cooperative handling of a long object in 3-D space. We assume that the operator and the robot grasp opposite ends of the object and carry it cooperatively. In this task, it is difficult for the human operator to apply a large torque to produce a translational force at the robot hand. We therefore propose assigning a virtual nonholonomic constraint to the robot hand. Under this constraint, the robot hand behaves like a unicycle and is manipulated by translational force alone. Controllability is assured based on nonholonomy, and the operator can transport the object to the desired position and posture in 3-D space with a skill similar to steering a wheelbarrow. The effectiveness of our method is verified experimentally.

1. Introduction

Most robots are currently used for simple tasks in places isolated from humans. However, techniques that allow humans and robots to work together in the same place are in demand in the construction industry, the transportation industry, homes, offices, etc. In this paper we therefore focus on human-robot cooperative handling of a long object. The concept of our target system is depicted in Fig.1. A long or big object can be difficult to transport while holding it at one place, even if it is not heavy. Such an object is handled more easily by two or more persons, and it would be useful if this kind of cooperative transportation could be achieved by a robot instead of a human assistant. In this paper, we use only the force sensor at the robot wrist and the joint angle sensors. We do not use visual or auditory information that requires a complicated setup, e.g. image processing, voice recognition, or intelligent processing for understanding human intention. We aim for a system used as a simple tool rather than as an autonomous agent equivalent to a human assistant.

Previous works [1]-[3] investigated techniques called extender or power assist, which amplify a human's physical force by a robot. The robot has two force sensors, which measure the operator's force and the work load, respectively, and the operator's force is amplified by the robot actuators. This method is convenient when the load can be grasped at one point. However, in the cooperative handling of a long object, the sensor cannot measure the force from the operator and the load independently, because the operator grasps the object directly. If the object's mass and size are known, the measured force can be separated, but the exact characteristics of the load are rarely known. Moreover, a disturbance cannot be distinguished from the operator's force, and it might be dangerous to amplify the estimated force. Therefore, it is difficult to simply apply the power assist technique to cooperative transportation.

Fig.1 Cooperative handling with human

There have been several studies on human-robot collaboration based on impedance control [4]-[8]. In this approach, the robot is given a virtual impedance and passively follows the force which the human applies. However, if the operator and the robot grasp opposite ends of a long object, it is difficult for the operator to apply a translational force at the robot side, because doing so requires a large torque. Thus, the operating force consists mainly of translational components. Though the axial translation is easy, this method causes sideslip of the object in the normal direction, which hinders the manipulation when the object turns (Fig.2(a)). To solve this problem, we proposed a method that assigns a virtual nonholonomic constraint at the robot hand, equivalent to a wheel attached axially at the robot grasping point (Fig.2(b)) [10]. This constraint prevents the sideslip, and the human operator can maneuver the object like a wheelbarrow: he/she does not have to apply torque and can transport the object to an arbitrary position and posture using only translational force. In our previous study [11], we extended this method to cooperative handling in the vertical plane, and furthermore combined the horizontal and vertical motions to carry the object to the desired position in 3-D space. However, this method cannot rotate the object about its principal axis.

In this paper, we consider the rotation about the principal axis so that 6-degree-of-freedom operation (position and posture) in 3-D space can be achieved. First, we add impedance control of the axial rotation to the technique of Ref. [11]. However, the rotation axes may reach a singular posture while the posture is being changed. To solve this problem, we propose rotating the object about a new rotation axis that avoids the singular posture. As a simpler alternative, we apply the impedance control in the hand coordinate; this also avoids the singular posture and maintains controllability. The rest of this paper is organized as follows. The basic concept of the virtual nonholonomic constraint is illustrated in Section 2. Section 3 explains the method to accomplish the cooperative handling in 3-D space. In Section 4, we implement the control law on a robot and conduct experiments on human-robot cooperation to verify its effectiveness.

Fig.2 Concept of virtual nonholonomic constraint (top view): (a) impedance control, (b) virtual nonholonomic constraint

2. Cooperative handling with nonholonomic constraint

Impedance control, by which the robot moves compliantly with the human force, has been a typical approach for human-robot collaboration. However, when the human and the robot cooperatively handle opposite ends of a long object, it is difficult to detect the operator's torque with the force sensor at the robot hand, so he/she has to manipulate the object mainly with translational force. It is difficult for the human to control the rotation and the translation in the normal direction by translational force alone, because the object behaves as if it were moving in a gravity-free state. This problem occurs because the robot impedance parameters are set uniformly in all directions. We therefore proposed to simplify the object's movement by limiting the directions in which the object can move. We considered a virtual nonholonomic constraint under which the hand behaves like a unicycle while maintaining controllability in the plane. This technique prevents sideslip in the normal direction, and the operator can control the object position using only translational force. The constraint is achieved by anisotropic impedance control, in which the impedance parameters depend on the direction in the hand coordinate frame. The anisotropic impedance control law, which relates the movement of the robot hand to the force/torque applied by the object to the robot, is formulated as follows:

$$f_{Xr} = m_X \ddot{X}_r + b_X \dot{X}_r, \quad f_{Yr} = m_Y \ddot{Y}_r + b_Y \dot{Y}_r, \quad \tau_r = i\ddot{\theta}_r + c\dot{\theta}_r \quad (1)$$

where $m_X$ and $b_X$ are the mass and friction coefficient in the X direction, $m_Y$ and $b_Y$ are those in the Y direction, and $i$ and $c$ are the moment of inertia and friction coefficient about the robot hand. $m_Y$ and $b_Y$ are given large values to suppress the velocity and acceleration in the normal direction. On the other hand, $m_X$, $b_X$, $i$ and $c$ are set as small as possible so that the robot can move almost freely. A virtual nonholonomic constraint equivalent to a wheel attached to the robot hand is thereby realized.

Fig.3 compares the cooperative handling by the proposed method with that by normal impedance control in a horizontal plane. We confirmed that the nonholonomic constraint prevented the sideslip and that the operator could transport the object more easily with the proposed method than with normal impedance control. Moreover, we extended this method to motion in a vertical plane, so that the object can be moved to the desired height and inclination with a skill similar to steering a wheel in a horizontal plane (Fig.4). Because the controllability in the vertical plane is guaranteed based on nonholonomy equivalent to a unicycle, the grip position and the control law need not be changed to reach the desired position. Fig.5 illustrates the proposed cooperation in the vertical plane. Though the operator cannot directly lift the object in the vertical direction, he/she can transport it to an arbitrary position and inclination in the vertical plane by translational movements alone.
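The anisotropic law of eq.(1) can be sketched in a few lines. Below is a minimal planar simulation, assuming a fixed wheel orientation and a constant operator force in the world frame; the parameter values and variable names are ours and purely illustrative, not those used in the experiments.

```python
import math

# Minimal sketch of the anisotropic impedance law (1): small impedance along
# the virtual wheel (X), large impedance normal to it (Y). Assumes the hand
# orientation theta stays fixed; parameter values are illustrative only.
m_x, b_x = 5.0, 20.0        # free direction along the virtual wheel
m_y, b_y = 100.0, 5000.0    # constrained (normal) direction

theta = math.radians(30.0)  # orientation of the virtual wheel in the world
fx_w, fy_w = 10.0, 10.0     # constant operator force in the world frame [N]
dt, steps = 0.002, 2000     # 4 s of simulated motion at a 2 ms period

vx = vy = px = py = 0.0     # hand-frame velocities and displacements
for _ in range(steps):
    # project the world-frame force onto the hand frame
    f_x = fx_w * math.cos(theta) + fy_w * math.sin(theta)
    f_y = -fx_w * math.sin(theta) + fy_w * math.cos(theta)
    # impedance dynamics per axis: f = m*a + b*v  ->  a = (f - b*v) / m
    vx += dt * (f_x - b_x * vx) / m_x
    vy += dt * (f_y - b_y * vy) / m_y
    px += dt * vx
    py += dt * vy

print(px, py)  # motion along the wheel dominates; sideslip stays near zero
```

The steady-state velocity of each axis is force over friction, so the normal velocity is suppressed by the large $b_Y$, which is exactly the virtual-wheel behavior.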

Fig.3 Trajectory of cooperative handling in horizontal plane: (a) impedance control, (b) virtual nonholonomic constraint

Fig.4 Nonholonomic constraint in vertical plane

Fig.5 Trajectory of cooperative handling in vertical plane

Fig.6 Rotation about P axis

3. Cooperative handling in 3-D space

Here, we consider combining the transportation in a horizontal plane and in a vertical plane to achieve cooperative transportation in 3-D space.

The constraints in the horizontal plane and the vertical plane have been defined in the robot hand coordinate. If we simply combine these constraints for transportation in 3-D space, the object can twist around the X-axis while it is rotated around the Y- and Z-axes in the hand frame. This rotation of the posture corresponds to ZYZ (YZY) Euler angles, a well-known representation of orientation in 3-D space. However, it is not easy for the operator to understand this motion so as to prevent the twist. To solve this problem, we suggested defining the horizontal constraint in the xy-plane of the absolute coordinate and the vertical constraint in the XZ-plane of the robot hand coordinate [11]. In this method, the operator can transport the object to the desired position (x, y, z) and posture (pitch, yaw), but he/she cannot twist the object about the X-axis (Fig.6). We therefore consider applying impedance control about the X-axis to control the roll angle of the object. It is difficult for the operator to apply a large torque at the tip of the long object to control the pitch and yaw angles. However, when the object is rigid, a torque applied around the X-axis is transmitted directly to the other end. Thus we can roll the object easily by impedance control about the X-axis.

Fig.7 If the object is rotated 90 degrees around the X-axis, the Y-axis falls into the XP-plane and the object cannot rotate in the XP-plane.

Here, we focus on the composite rotation about the X-, Y- and P-axes for posture control in 3-D space. In this method, the orientation of the P-axis is fixed in the absolute coordinate. Then, if the object rotates ±90 degrees about the X- or Y-axis, the X-, Y- and P-axes lie in the same plane and form a singular posture, which degrades the controllability (Fig.7). To solve this problem, we consider rotation about the Q-axis, which is perpendicular to the z- and X-axes, instead of the Z-axis (Fig.8). This method is effective except in the case that the P- and X-axes coincide when the Q-axis rotates ±90 degrees. These problems occur because the posture of the P-axis is fixed; the P-axis was introduced so that no rotation about the X-axis is generated and the operator can easily move the object to the desired posture.

Ultimately, however, we want to control all 6 degrees of freedom of position and posture in 3-D space. Since the X-axis is impedance controlled, we can actively maintain the desired posture even if the rotation about the X-axis couples with the rotations about the Y- and Z-axes. This method uses only impedance control on each axis of the robot hand coordinate (Fig.9). The impedance parameter of each rotational axis is set to a small value, so the robot can rotate the object around the grasping position like a free joint. Because the axes of the robot hand coordinate always keep their relative angles after a rotation in 3-D space, controllability in 3-D space is maintained (Fig.10).
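The property behind Fig.10, that rotation axes carried with the hand frame never become coplanar, can be illustrated numerically. This is our own sketch with arbitrary random rotations, not the robot controller itself.

```python
import numpy as np

# Hand-frame scheme (Fig.9, Fig.10): every rotation is taken about an axis of
# the *current* hand frame, so the three control axes stay orthonormal and
# never degenerate into a singular (coplanar) posture. Angles are arbitrary.

def rot(axis: np.ndarray, angle: float) -> np.ndarray:
    """Rotation matrix about a unit axis (Rodrigues' formula)."""
    axis = axis / np.linalg.norm(axis)
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

rng = np.random.default_rng(0)
R = np.eye(3)                      # columns = hand-frame X, Y, Z in the world
for _ in range(20):
    axis_idx = rng.integers(3)     # rotate about the current hand X, Y or Z
    R = rot(R[:, axis_idx], rng.uniform(-np.pi, np.pi)) @ R

# the three control axes remain orthonormal: |det R| stays 1
print(abs(np.linalg.det(R)))
```

By contrast, a scheme that keeps one rotation axis fixed in the absolute coordinate (the P-axis method above) can let the three axes become coplanar, which is exactly the singular posture of Fig.7.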

Fig.8 Rotation about Q axis

Fig.9 Rotation about hand frame

Fig.10 If the posture of the object is changed, the orthogonality of the rotation axes is kept.

We now examine the controllability of the proposed method. The operator holds the object at $H:(x_h, y_h, z_h)$. The position and posture of the robot hand are $R:(x_r, y_r, z_r, \theta_X, \theta_Y, \theta_Z)$, and the length of the object is $L$. The conditions that the robot hand does not slip in the directions of the Y- and Z-axes are

$$\dot{x}_r \sin\theta_Z \cos\theta_Y + \dot{y}_r (\sin\theta_Z \sin\theta_Y \sin\theta_X - \cos\theta_Z \cos\theta_X) + \dot{z}_r (\sin\theta_Z \sin\theta_Y \cos\theta_X - \cos\theta_Z \sin\theta_X) = 0 \quad (2)$$

$$-\dot{x}_r \sin\theta_Y + \dot{y}_r \cos\theta_Y \sin\theta_X + \dot{z}_r \cos\theta_Y \cos\theta_X = 0 \quad (3)$$

The robot and human holding points are related as

$$\begin{pmatrix} x_h \\ y_h \\ z_h \end{pmatrix} = \begin{pmatrix} \cos\theta_Z \cos\theta_Y \\ \sin\theta_Z \cos\theta_Y \\ -\sin\theta_Y \end{pmatrix} L + \begin{pmatrix} x_r \\ y_r \\ z_r \end{pmatrix} \quad (4)$$

Differentiating eq.(4), the velocities are related as

$$\begin{pmatrix} \dot{x}_h \\ \dot{y}_h \\ \dot{z}_h \end{pmatrix} = \begin{pmatrix} -\sin\theta_Z \cos\theta_Y \\ \cos\theta_Z \cos\theta_Y \\ 0 \end{pmatrix} L \dot{\theta}_Z + \begin{pmatrix} -\cos\theta_Z \sin\theta_Y \\ -\sin\theta_Z \sin\theta_Y \\ -\cos\theta_Y \end{pmatrix} L \dot{\theta}_Y + \begin{pmatrix} \dot{x}_r \\ \dot{y}_r \\ \dot{z}_r \end{pmatrix} \quad (5)$$

The state equation of this system is obtained from eqs.(2), (3), and (5).
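As a consistency check, the differentiation step from eq.(4) to eq.(5) can be reproduced symbolically. The sketch below uses sympy with our own variable names; it verifies that the two column vectors of eq.(5) are exactly the time derivative of the object-axis term in eq.(4).

```python
import sympy as sp

t = sp.symbols('t')
xr, yr, zr = [sp.Function(s)(t) for s in ('x_r', 'y_r', 'z_r')]
tZ, tY = sp.Function('theta_Z')(t), sp.Function('theta_Y')(t)
L = sp.symbols('L')

# eq (4): human hand position = robot hand position + L times the object axis
h = sp.Matrix([xr, yr, zr]) + L * sp.Matrix([sp.cos(tZ) * sp.cos(tY),
                                             sp.sin(tZ) * sp.cos(tY),
                                             -sp.sin(tY)])

# eq (5): the same derivative written as two column vectors times L*theta_dot
rhs = (L * sp.Matrix([-sp.sin(tZ) * sp.cos(tY),
                      sp.cos(tZ) * sp.cos(tY), 0]) * sp.diff(tZ, t)
       + L * sp.Matrix([-sp.cos(tZ) * sp.sin(tY),
                        -sp.sin(tZ) * sp.sin(tY),
                        -sp.cos(tY)]) * sp.diff(tY, t)
       + sp.Matrix([sp.diff(xr, t), sp.diff(yr, t), sp.diff(zr, t)]))

print(sp.simplify(sp.diff(h, t) - rhs))  # zero vector: (5) follows from (4)
```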

$$\begin{pmatrix} \dot{x}_r \\ \dot{y}_r \\ \dot{z}_r \\ \dot{\theta}_Z \\ \dot{\theta}_Y \end{pmatrix} = R_T \begin{pmatrix} \dot{x}_h \\ \dot{y}_h \\ \dot{z}_h \end{pmatrix} = g_1 \dot{x}_h + g_2 \dot{y}_h + g_3 \dot{z}_h \quad (6)$$

where the matrix $R_T$ is $5 \times 3$ and the vectors $g_1$, $g_2$ and $g_3$ are $5 \times 1$. This is a drift-free affine system with the configuration of the object as the state variable and the translational velocities of H as the control inputs. We now prove that this system is controllable and can reach any arbitrary state [9]. The Lie brackets of the vector fields $g_1$, $g_2$ and $g_3$ are

$$[g_1, g_2] = \frac{\partial g_2}{\partial q} g_1 - \frac{\partial g_1}{\partial q} g_2, \quad [g_2, g_3] = \frac{\partial g_3}{\partial q} g_2 - \frac{\partial g_2}{\partial q} g_3, \quad [g_3, g_1] = \frac{\partial g_1}{\partial q} g_3 - \frac{\partial g_3}{\partial q} g_1 \quad (7)$$

Evaluating these with the symbolic computation software "Mathematica" gives

$$\mathrm{rank}(g_1, g_2, g_3, [g_1, g_2], [g_2, g_3], [g_3, g_1]) = 5 \quad (8)$$

As the matrix $(g_1, g_2, g_3, [g_1, g_2], [g_2, g_3], [g_3, g_1])$ is of full rank from eq.(8), the system (6) satisfies the Lie algebra rank condition and is controllable. It is therefore possible to control the position $(x_r, y_r, z_r)$ and posture $(\theta_Y, \theta_Z)$ of the object using only the translational velocity of the operator's hand. Finally, the rotation about the X-axis ($\theta_X$) is controllable because it is driven directly by the impedance control according to the operator's torque about the X-axis ($\tau_X$).
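The Lie-algebra rank condition of eqs.(7) and (8) can be illustrated on the planar unicycle, the 3-state analogue of system (6). The sketch below is ours, not the paper's 5-state Mathematica computation; it shows the same test succeeding for the simpler system.

```python
import sympy as sp

# Planar unicycle: state q = (x, y, theta), no sideslip, two control fields.
x, y, th = sp.symbols('x y theta')
q = sp.Matrix([x, y, th])

g1 = sp.Matrix([sp.cos(th), sp.sin(th), 0])  # roll along the wheel heading
g2 = sp.Matrix([0, 0, 1])                    # turn in place

def lie_bracket(f, g):
    # [f, g] = (dg/dq) f - (df/dq) g, as in eq.(7)
    return g.jacobian(q) * f - f.jacobian(q) * g

# stack the fields and their bracket; full rank <=> controllable
M = sp.Matrix.hstack(g1, g2, lie_bracket(g1, g2))
print(sp.simplify(M.det()))  # 1 for every theta, so the rank is 3 and the
                             # unicycle is controllable despite the constraint
```

The bracket $[g_1, g_2] = (\sin\theta, -\cos\theta, 0)^T$ supplies exactly the sideways direction forbidden by the constraint, which is why nonholonomy restores full controllability.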

4. Experiments

Now we explain the robot controller that implements the proposed method on a robot arm. The robot has a force sensor at the wrist. We consider the robot hand frame of Fig.9 and apply impedance control to each axis with the following impedance characteristics:

$$\begin{aligned} f_{Xr} &= m_X \ddot{X}_r + b_X \dot{X}_r \\ f_{Yr} &= m_Y \ddot{Y}_r + b_Y \dot{Y}_r \\ f_{Zr} &= m_Z \ddot{Z}_r + b_Z \dot{Z}_r \\ \tau_{Xr} &= i_X \ddot{\theta}_X + c_X \dot{\theta}_X \\ \tau_{Yr} &= i_Y \ddot{\theta}_Y + c_Y \dot{\theta}_Y \\ \tau_{Zr} &= i_Z \ddot{\theta}_Z + c_Z \dot{\theta}_Z \end{aligned} \quad (9)$$

Here $f_{Xr}, f_{Yr}, f_{Zr}, \tau_{Xr}, \tau_{Yr}, \tau_{Zr}$ are the forces and torques, $\dot{X}_r, \dot{Y}_r, \dot{Z}_r, \dot{\theta}_X, \dot{\theta}_Y, \dot{\theta}_Z$ are the velocities and angular velocities, $\ddot{X}_r, \ddot{Y}_r, \ddot{Z}_r, \ddot{\theta}_X, \ddot{\theta}_Y, \ddot{\theta}_Z$ are the accelerations and angular accelerations, $m_X, m_Y, m_Z, i_X, i_Y, i_Z$ are the masses and moments of inertia, and $b_X, b_Y, b_Z, c_X, c_Y, c_Z$ are the viscous friction coefficients. The impedance parameters of the Y- and Z-axes are set large to realize the constraints of eqs.(2) and (3), while the X-axis and rotational parameters are set small so that the object can be moved easily by the operator's force. Table 1 shows the impedance parameters; those of the constrained axes are defined 20 times as large as those of the unconstrained axes. Fig.11 shows the block diagram of the control system.

In the experiment, we use an industrial robot arm (PA-10, MHI) with a 6-axis force/torque sensor attached at the wrist. The controller is implemented on a personal computer (CPU: Pentium 550 MHz), and the sampling period is 2 ms. The length and weight of the object (an aluminium pipe) are 0.75 m and 0.7 kg. The operator and the robot cooperatively transport the object from the initial position to the target position. The initial and target positions of the robot are $(x, y, z, \theta_x, \theta_y, \theta_z) = (-0.47, -0.47, 0.37, 0, 0, 0)$ and $(0.35, 0.2, 0.63, 0, 0, \pi/2)$ in the absolute coordinate, respectively. Fig.12 shows the experimental result with rotation about the Y- and Z-axes, and Fig.13 shows the result with rotation about the X-, Y- and Z-axes. Fig.14 shows the rotation angle about the X-axis in each experiment. With rotation about the Y- and Z-axes only, an unintended rotation about the X-axis arises during the transportation to the target position; the operator cannot tell how the posture is changing, and it is too difficult to reach the desired angle. With rotation about the X-, Y- and Z-axes, on the other hand, the operator can easily control the roll angle about the X-axis, and the object reaches the target position and posture. This result shows that the proposed method can control position and posture with 6 degrees of freedom in 3-D space.

5. Conclusions

In this paper, we proposed a method for human-robot cooperative transportation in 3-D space. In this method, we apply a virtual nonholonomic constraint, like that of a unicycle, to the robot hand. The constraint is achieved by anisotropic impedance control in the robot hand coordinate. The operator can transport the object to an arbitrary position and posture in 3-D space with a skill similar to steering a wheelbarrow. We verified the effectiveness of the proposed method experimentally.

References

[1] H. Kazerooni, "Human Robot Interaction via the Transfer of Power and Information Signals," IEEE Trans. Systems, Man and Cybernetics, Vol.20, No.2, pp.450-463, 1990.
[2] K. Kosuge, Y. Fujisawa and T. Fukuda, "Mechanical System Control with Man-Machine-Environment Interactions," Proc. 1993 IEEE Int. Conf. on Robotics and Automation, pp.239-244, 1993.
[3] Y. Hayashibara, K. Tanie, H. Arai and H. Tokashiki, "Development of Power Assist System with Individual Compensation Ratios for Gravity and Dynamic Load," Proc. 1997 IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS'97), pp.640-646, 1997.
[4] Y. Hayashibara et al., "Assist System for Carrying a Long Object with a Human - Analysis of a Human Cooperative Behavior in the Vertical Direction -," Proc. 1999 IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS'99), pp.695-700, 1999.
[5] K. Kosuge, H. Yoshida and T. Fukuda, "Dynamic Control for Robot-Human Collaboration," IEEE Int. Workshop on Robot and Human Communication, pp.398-401, 1993.
[6] K. Kosuge and N. Kazamura, "Control of a Robot Handling an Object in Cooperation with a Human," IEEE Int. Workshop on Robot and Human Communication, pp.142-147, 1997.
[7] R. Ikeura and H. Inooka, "Variable Impedance Control of a Robot for Cooperation with a Human," Proc. 1995 IEEE Int. Conf. on Robotics and Automation, pp.3097-3102, 1995.
[8] O. M. Al-Jarrah and Y. F. Zheng, "Arm-Manipulator Coordination for Load Sharing Using Reflexive Motion Control," Proc. 1997 IEEE Int. Conf. on Robotics and Automation, pp.2326-2331, 1997.
[9] R. M. Murray, Z. Li and S. S. Sastry, A Mathematical Introduction to Robotic Manipulation, CRC Press, 1993.
[10] H. Arai, T. Takubo and K. Tanie, "Human-Robot Cooperative Manipulation Using a Virtual Nonholonomic Constraint," Proc. 2000 IEEE Int. Conf. on Robotics and Automation (ICRA2000), pp.4064-4070, 2000.
[11] T. Takubo, H. Arai and K. Tanie, "Virtual Nonholonomic Constraint for Human-Robot Cooperation in 3-D Space," Proc. 2000 IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS'00), to appear, 2000.

Table 1 Impedance parameters

mX [kg]: 5.5    bX [kg/s]: 20
mY [kg]: 100    bY [kg/s]: 5000
mZ [kg]: 100    bZ [kg/s]: 5000
iX, iY, iZ [kg·m²]: 5.5    cX, cY, cZ [kg·m²/s]: 20

Fig.11 Control system. Notation in the block diagram:
F = (fXr fYr fZr τXr τYr τZr)^T : force vector applied to the robot arm
M = diag(mX mY mZ iX iY iZ) : inertia impedance parameters
B = diag(bX bY bZ cX cY cZ) : viscous friction impedance parameters
Φ : joint angle vector
Ẋd : desired velocity in the robot hand coordinate
Φ̇d : desired joint velocity vector
J(Φ) : Jacobian matrix
τ : torque vector which the actuators should generate
X : position of the robot hand in the absolute coordinate

Fig.12 Experimental result (rotation about Y-, Z-axes)

Fig.13 Experimental result (rotation about X-, Y-, Z-axes)

Fig.14 Rotation angle about X-axis

Fig.15 Experiment of 3-D transportation