Human-like Arm Motion Generation for Humanoid Robots Using Motion Capture Database

Seungsu Kim*, ChangHwan Kim† and Jong Hyeon Park‡

*‡ School of Mechanical Engineering, Hanyang University, Seoul, 133-791, Korea.
*† Intelligent Robotics Research Center, Korea Institute of Science and Technology, Seoul, 130-650, Korea.
Email: * [email protected]; † [email protected]; ‡ [email protected]

Abstract— When communicating and interacting with a human through motions or gestures, a humanoid robot needs not only to look like a human but also to behave like a human, to avoid confusion in the communication and interaction. Among human-like behaviors, the arm motions of a humanoid robot are essential for communicating with people through motion. In this work, a mathematical representation for characterizing human arm motions is first proposed. Human arm motions are characterized by the elbow elevation angle, which is determined from the position and orientation of the human hand. The representation is obtained mathematically using an approximation tool, the Response Surface Method (RSM). A method to generate human-like arm motions in real time using the proposed representation is then presented. The proposed method was evaluated by generating human-like arm motions when the humanoid robot was asked to move its arm from one point to another, including the rotation of the hand. An example motion was performed on the KIST humanoid robot, MAHRU.

I. INTRODUCTION

A number of humanoid robots have been developed and shown to the public in the last decades, aiming to provide people with useful services. Most interactions between a humanoid robot and a human happen through voice and behavior. Such behaviors need to look human-like; otherwise they may cause people to misunderstand their meaning. It is natural to require that the behavior of a humanoid robot be comfortable for, and predictable by, a human. Human-like arm motions are therefore a basic requirement for humanoid robots to behave like humans.

Some studies have generated human-like motion by imitating a human motion as closely as possible. The human motion is measured by a motion capture system and then adapted to a humanoid robot or an animation character. When an optical motion capture system is used, the human motion is captured in the form of time trajectories of markers attached to the human body. This approach has been developed by several researchers. Kim et al. [1] proposed a method to convert captured marker data of a human arm into motions feasible for a humanoid robot using an optimization scheme. The position and orientation of the hand, and the orientation of the upper arm of a human, were imitated by a humanoid robot under bounded joint motor capacities. However, this method was not able to generate new human-like motions. Pollard et al. [2] also developed a method to adapt captured human motions to a humanoid robot consisting of only an upper body. The captured upper-body motions of an actor were optimized by minimizing the posture differences between the humanoid robot and the actor, with the limits of joint position and velocity taken into account. Nakaoka et al. [3] explored a procedure to let a humanoid robot (HRP-1S) imitate a Japanese folk dance captured by a motion capture system. Symbolic representations of primitive motions were presented. Time trajectories of joint positions were first generated to imitate the primitive motions, and then modified to satisfy the mechanical constraints of the humanoid robot. In particular, for dynamic stability the waist trajectory was modified to be consistent with the desired ZMP trajectory. The imitation of the Japanese folk dance was performed in the dynamics simulator OpenHRP and realized on the real humanoid robot HRP-1S as well. These methods all imitate given human motions. They may have difficulty generating a variety of new human-like motions from a human motion capture database, since they adapt only the captured motions themselves.

Another approach, generating human-like arm motions from a mathematical representation of human arm motion, was pursued by Asfour et al. [4], using the representation of [5] and [6]. In those papers four parameters were defined and represented in terms of the wrist positions of a human in a spherical coordinate system at the shoulder. However, that representation approximated the arm movements, so the method developed in [4] produced errors in the position and orientation of the humanoid hand. In addition, the four parameters used in that work may not have physical meanings.

For a humanoid robot not only to imitate human motions but also to perform human-like motions whenever needed using a motion database, a new method is required. In this paper, a method for extracting the movement characteristics of human arms from a motion capture database is presented.

The characteristics are described in terms of the elbow elevation angle. This angle is determined by the position of the wrist and the angle between the palm and the ground. Using this representation of a human's natural elbow elevation angle, a human-like motion is generated.

II. ELBOW ELEVATION ANGLE: CHARACTERIZING A HUMAN ARM MOTION

To obtain the natural elbow postures of a human, the kinematic analysis was performed as seen in Fig. 3. During the experiments, the actor relaxed his arm and moved without planning arm postures. A number of wrist positions and palm directions were examined under the rule given to the actor. For the experiments, the volume reachable by the human arm was divided by six planes vertical to the ground, as equally as possible. The actor then drew five circles of different diameters, each over five seconds. The same experiment was repeated three times with varying palm directions.

Fig. 1. The definition of the elbow elevation angle for a human arm

Fig. 2. Motion capture system and actor

In this section the process of characterizing the movement of a human arm in the motion capture database is described. The motion database is constructed using a commercially available optical motion capture system, as seen in Fig. 2. The human model in Fig. 3 was built with the software provided with the motion capture system. In daily life, hand motions that move from one point to another while varying orientation occur all the time, such as pointing with a hand, moving a hand to grasp an object on a table or in the air, or talking with hand gestures. The posture of a human arm may be described in terms of the wrist position, the hand orientation, the elbow posture relative to the body, and more. From the captured arm motion database it was observed that the elbow posture is determined mainly by the position of the wrist and the direction of the vector normal to the palm, called the palm direction. In other words, the posture of an arm at a certain instant can be described in terms of the wrist position, the palm direction and the elbow posture; moreover, the elbow posture can be expressed by the wrist position and the palm direction. The wrist position is obtained from the markers on the human arm, first in the global Cartesian coordinate frame on the ground and then converted to the reference frame attached at the shoulder. The elbow posture is defined by the angle between a plane vertical to the ground (the red dashed triangle in Fig. 1) and the plane defined by the three markers at the shoulder, elbow and wrist (the blue dashed triangle in Fig. 1). The angle between these two planes is called the elbow elevation angle throughout this paper. Using this angle, human arm motions are characterized, since the angle is represented in terms of the wrist position and the palm direction, which are the key factors for natural postures of human arms. The elbow elevation angle is defined as zero when the blue dashed plane in Fig. 1 is parallel to the vertical plane (the red dashed plane in the figure).
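The two-plane definition above can be sketched numerically. The following is a minimal illustration (not the authors' code) that computes the elbow elevation angle from three marker positions, assuming a z-up global frame:

```python
import numpy as np

def elbow_elevation_angle(shoulder, elbow, wrist):
    """Angle (degrees) between the arm plane through the shoulder, elbow
    and wrist markers and the vertical plane containing the shoulder-wrist
    line; zero when the elbow lies in the vertical plane."""
    sw = wrist - shoulder                        # shoulder-to-wrist line
    se = elbow - shoulder
    n_arm = np.cross(se, sw)                     # normal of the arm plane
    up = np.array([0.0, 0.0, 1.0])               # ground normal (z-up assumed)
    n_vert = np.cross(sw, up)                    # normal of the vertical plane
    c = np.dot(n_arm, n_vert) / (np.linalg.norm(n_arm) * np.linalg.norm(n_vert))
    return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))
```

With the elbow hanging in the vertical plane the angle is 0 degrees; lifting the elbow sideways to the horizontal gives 90 degrees.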

Fig. 3. Kinematic analysis for a variety of human arm postures

The human arm motions were captured using the Hawk Digital System, commercially available from Motion Analysis Inc., as seen in Fig. 2. Twenty-nine markers were attached to the upper body of the actor, and eight cameras were used at a capture rate of 120 frames per second. The time trajectories of the markers representing the human motions were stored. Using these marker trajectories, the wrist positions were obtained with respect to the reference frame at the shoulder, and the palm direction was calculated at each frame as well.

III. EQUATION OF ELBOW ELEVATION ANGLE

From the kinematic analysis in the foregoing section it was observed that the arm posture could be characterized by the elbow elevation angle, which is represented in terms of the wrist position and the palm direction. In this section, the representation of the elbow elevation angle is obtained using the Response Surface Methodology (RSM) given in [7].

A. Response Surface Methodology

The Response Surface Methodology (RSM) in [7] is a technique for representing the relationship between controllable input variables and a response. In the methodology a response function is defined to approximate experimental results. A brief description follows.

The response of an experiment is approximated using a response function as

y(x) = \hat{y}(x) + e    (1)

where y denotes the given response of the experiment, \hat{y} is the unknown response function of y, and e is the error between the response and the response function; x is a vector of controllable input variables. The response function approximates the response using shape functions as

\hat{y}(x) = \sum_{i=1}^{N_b} b_i \xi_i(x)    (2)

where N_b is the number of terms in the response function. The \xi_i for i = 1, ..., N_b are called shape functions (or basis functions by some researchers). The unknown coefficients b_i for i = 1, ..., N_b are determined by curve-fitting the experimental results. When multiple responses are given, the corresponding errors follow from Eq. (1) and Eq. (2) as

e_j = y_j - \hat{y}(x_j) = y_j - \sum_{i=1}^{N_b} b_i \xi_i(x_j),    for j = 1, ..., N    (3)

where N is the number of responses (or experiments); y_j and e_j are the value of the j-th response and the corresponding error, respectively; and x_j is the input variable vector corresponding to the j-th response. Equation (3) can be rewritten in vector form as

e = y - Xb    (4)

where X is an N x N_b matrix whose elements are the values \xi_i(x_j). The unknown coefficient vector b is then determined by minimizing the root mean square (RMS) of e,

e_{RMS} = \sqrt{ \frac{1}{N} \sum_{i=1}^{N} e_i^2 } = \sqrt{ \frac{1}{N} e^T e }.    (5)

Note that minimizing e_{RMS} is equivalent to minimizing e^T e. Using the optimality conditions, b is obtained as

b = (X^T X)^{-1} X^T y    (6)

so that the response function is determined. This process is the well-known least squares method.

B. Normalization of input variables

In the solution process of the previous section it is worthwhile to normalize the input variables separately, since large differences in their magnitudes may exist; normalization helps reduce the approximation error. Moreover, since the size of the humanoid differs from that of the human, normalization also makes it easy to apply the human database to the humanoid. As mentioned in Sec. II, the characteristics of human arm motions can be represented using the wrist position and the palm angle. The wrist positions are obtained in the spherical coordinate system at the shoulder from the trajectory of the wrist marker. The palm direction is the vector normal to the palm, as defined in Sec. II; the angle between this direction and the ground is used as one of the input variables. These parameters are normalized to dimensionless ranges of magnitude 2:

0 <= \bar{r} <= 2;  -1 <= \bar{\alpha} <= 1;  -1 <= \bar{\beta} <= 1;  -1 <= \bar{\theta} <= 1    (7)

where \bar{r} is the normalized distance from the shoulder to the wrist; \bar{\alpha} and \bar{\beta} are the angles of the spherical coordinate system at the shoulder, as seen in Fig. 7; and \bar{\theta} is the angle between the palm direction and the ground.

C. Characteristic equation for elbow elevation angle

A second-order polynomial is widely used as the shape function in response surface methodology. Using the parameters defined in the previous section, the response function for the elbow elevation angle is represented as

\hat{\gamma} = b_0 + b_1 x_1 + b_2 x_2 + b_3 x_3 + b_4 x_4 + b_5 x_1 x_2 + \cdots + b_{13} x_3^2 + b_{14} x_4^2    (8)

[x_1\ x_2\ x_3\ x_4] = [\bar{r}\ \bar{\alpha}\ \bar{\beta}\ \bar{\theta}]    (9)
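The least-squares fit of Eq. (6) with the second-order basis of Eq. (8) can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation; the function names and the synthetic test surface are assumptions:

```python
import numpy as np
from itertools import combinations

def quadratic_basis(x):
    """Second-order shape functions in four variables:
    1, x_i, the cross terms x_i x_j (i < j), and the squares x_i^2."""
    x = np.asarray(x, dtype=float)
    terms = [1.0] + list(x)
    terms += [x[i] * x[j] for i, j in combinations(range(len(x)), 2)]
    terms += list(x ** 2)
    return np.array(terms)

def fit_response_surface(samples, responses):
    """Solve b = (X^T X)^{-1} X^T y of Eq. (6); lstsq is used instead of
    an explicit matrix inverse for numerical robustness."""
    X = np.array([quadratic_basis(x) for x in samples])
    b, *_ = np.linalg.lstsq(X, responses, rcond=None)
    return b

def predict(b, x):
    """Evaluate the fitted response function of Eq. (8)."""
    return quadratic_basis(x) @ b
```

Given normalized samples (r̄, ᾱ, β̄, θ̄) and the measured elevation angles, `fit_response_surface` returns the 15 coefficients b_0 ... b_14 of the quadratic surface.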

where \hat{\gamma} is the normalized response function for the elbow elevation angle, and the input variable vector x is given by Eq. (9). The unknown coefficient vector b for the shape function above is then obtained using Eq. (6) and the results of the kinematic analysis of the human arm in Sec. II. Once the response function for the elevation angle of a human is complete, the most natural elbow elevation angle of a humanoid robot is determined by the wrist position and the palm direction of the humanoid robot. In addition, the motions generated using this response function should look like those of a human. Figure 4 shows the effects of the input parameters on the elbow elevation angle.

IV. INVERSE KINEMATICS

In the previous sections the elbow elevation angle of a human was obtained using RSM and the motion capture database. In this section, the inverse kinematics problem of generating a human-like arm motion, using the elbow elevation angle and a typical inverse kinematics solution procedure from robotics, is solved for the joint positions. As a testbed, the humanoid robot MAHRU in Fig. 5, developed by the Korea Institute of Science and Technology (KIST) with six degrees of freedom in each arm, was used. To solve the inverse kinematics problem, six holonomic constraints are needed. The input for the desired posture is a wrist position in the

Fig. 4. Elbow elevation angles of a human with respect to the four parameters r, α, β, and θ (elbow elevation angle γ, in degrees, plotted against: the palm direction angle θ for r = 1.7, β = 0; the wrist distance r for β = 0, θ = 0; the wrist pitch angle β for r = 1.7, α = 0; and the wrist yaw angle α for r = 1.7, θ = 0)

shoulder-centered spherical coordinate system and a palm direction angle. The wrist stoop angle can also be given as an input, but in this paper it was set to zero. To generate a human-like posture, the human arm characteristic equation is used; together these yield the six constraints. Our approach to solving the inverse kinematics is derived from a geometric analysis of the problem.
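The wrist input (r, α, β) lives in the shoulder-centered spherical coordinate system of Fig. 7. The paper does not spell out the angle convention, so the conversion below assumes α is a yaw about the shoulder z-axis and β a pitch out of the x-y plane, purely for illustration:

```python
import numpy as np

def wrist_from_spherical(r, alpha, beta):
    """Assumed convention (hypothetical): alpha = yaw about the shoulder
    z-axis, beta = pitch out of the x-y plane, both in radians."""
    x = r * np.cos(beta) * np.cos(alpha)
    y = r * np.cos(beta) * np.sin(alpha)
    z = r * np.sin(beta)
    return np.array([x, y, z])
```

Whatever convention is used, the Cartesian wrist target always has norm r, which is the quantity entering Eq. (10).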

Fig. 5. The KIST humanoid robot, MAHRU

Figure 6 shows the home position of the left arm. In this posture the arm stretches down toward the ground and the palm faces the hip.

When the elbow elevation angle has been obtained as in the previous section, the remaining joint angles θ_0 to θ_4 are obtained through the procedure in this section. First, the joint angle θ_3 depends only on the distance r, as seen in Fig. 7:

\theta_3 = \pi - \cos^{-1}\left( \frac{L_u^2 + L_l^2 - r^2}{2 L_u L_l} \right)    (10)

The joint angles θ_0 and θ_1 depend on the elbow position vector \vec{E}. Here \vec{E}_0 is the elbow position when α, β and γ are set to zero for the given wrist position and palm direction; the plane built from \vec{E}_0 and the vector from shoulder to wrist lies on the x-z plane of the shoulder-centered coordinate system:

\vec{E}_0 = \left[ \frac{r^2 + L_u^2 - L_l^2}{2r},\ 0,\ -L_u \sin\left( \cos^{-1}\left( \frac{r^2 + L_u^2 - L_l^2}{2 r L_u} \right) \right) \right]^T    (11)

\vec{E} can then be calculated from the elbow elevation angle \hat{\gamma} of Eq. (8) and the wrist position:

\vec{E} = R_x(\gamma) \, R_y(\beta) \, R_z(\alpha) \, \vec{E}_0    (12)

\theta_1 = \sin^{-1}\left( \frac{\vec{E}_y}{L_u} \right)    (13)

\theta_0 = \mathrm{atan2}\left( \frac{\vec{E}_x}{L_u \cos\theta_1},\ \frac{\vec{E}_z}{L_u \cos\theta_1} \right)    (14)
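Equations (10)-(14) can be checked with a short numerical sketch. The rotation matrices use the usual right-handed conventions; this is an illustration of the geometry, not the authors' code:

```python
import numpy as np

def Rx(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def Ry(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def Rz(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def arm_angles(r, alpha, beta, gamma, Lu, Ll):
    """theta3 from Eq. (10); theta0 and theta1 from Eqs. (11)-(14)."""
    theta3 = np.pi - np.arccos((Lu**2 + Ll**2 - r**2) / (2 * Lu * Ll))
    k = (r**2 + Lu**2 - Ll**2) / (2 * r)               # x component of E0
    E0 = np.array([k, 0.0, -Lu * np.sin(np.arccos(k / Lu))])  # Eq. (11)
    E = Rx(gamma) @ Ry(beta) @ Rz(alpha) @ E0          # Eq. (12)
    theta1 = np.arcsin(E[1] / Lu)                      # Eq. (13)
    d = Lu * np.cos(theta1)
    theta0 = np.arctan2(E[0] / d, E[2] / d)            # Eq. (14)
    return theta0, theta1, theta3
```

For a fully stretched arm (r = L_u + L_l) the elbow flexion θ_3 from Eq. (10) reduces to zero, a quick sanity check on the law-of-cosines term.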

Fig. 6. Coordinates of the left arm

Fig. 7. Parameters for human arm posture

Fig. 8. Coordinates of the left arm

where \vec{E}_* is the * component of \vec{E}. The wrist position is expressed as

\vec{W} = {}^0_1A \, {}^1_2A \, {}^2_3A \, {}^3_4A \, {}^4\vec{W}    (15)

where {}^i_jA is the homogeneous transformation matrix from the i-th reference frame to the j-th reference frame and {}^4\vec{W} is the wrist position vector in the 4th frame, {}^4\vec{W} = [0\ -L_l\ 0\ 0]^T. The wrist position is given, and θ_0, θ_1 and θ_3 are known from the equations above. Therefore θ_2 can be obtained as

s_2 = \frac{\vec{W}_z + (L_u + L_l c_3) s_1}{L_l c_1 s_3}    (16)

c_2 = \frac{\vec{W}_y + c_0 \left( c_1 (L_u + L_l c_3) + L_l s_1 s_3 \right)}{L_l s_0 s_3}    (17)

\theta_2 = \mathrm{atan2}(s_2, c_2)    (18)

where c_i denotes cos θ_i and s_i denotes sin θ_i. In this paper, the wrist stoop angle θ_5 was set to zero. To find θ_4, the angle between the two vectors below was used:

\vec{N}_c = \vec{E} \times \vec{EW}

\vec{N}_v = \vec{EW} \times [0\ 0\ 1]^T

\theta_{diff} = \cos^{-1}\left( \frac{\vec{N}_c \cdot \vec{N}_v}{\|\vec{N}_c\| \, \|\vec{N}_v\|} \right)

where \vec{N}_v is the normal vector of the plane formed by the vector from the elbow to the wrist and the normal direction vector of the ground, and \vec{N}_c is the normal vector of the plane formed by the origin at the shoulder, the wrist position and the elbow position under the given input variables. Then

\theta_4 = \theta - \theta_{diff}    (19)

V. AN EXAMPLE
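The θ_diff computation above reduces to two cross products and a dot product. A minimal sketch, assuming a z-up frame (not the authors' code):

```python
import numpy as np

def theta_diff(E, EW):
    """Angle between N_c = E x EW (normal of the shoulder-elbow-wrist
    plane) and N_v = EW x [0,0,1]^T (normal of the plane spanned by the
    forearm and the ground normal)."""
    Nc = np.cross(E, EW)
    Nv = np.cross(EW, np.array([0.0, 0.0, 1.0]))
    c = np.dot(Nc, Nv) / (np.linalg.norm(Nc) * np.linalg.norm(Nv))
    return np.arccos(np.clip(c, -1.0, 1.0))

def theta4(theta_des, E, EW):
    """Eq. (19): theta4 = theta - theta_diff."""
    return theta_des - theta_diff(E, EW)
```

For example, with the upper arm along x and the forearm along y, both planes are orthogonal and θ_diff is 90 degrees.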

In the foregoing sections, the equation of the elbow elevation angle was implemented. Using this equation, the most natural human-like posture can be obtained; moreover, the inverse kinematics of the KIST humanoid MAHRU can be solved for any reachable wrist position and palm direction. To evaluate the equation and the inverse kinematics solution, the humanoid robot was required to follow desired trajectories of the wrist position and palm direction. The wrist trajectory was given by a sine wave in the y-z plane of the Cartesian coordinate system at the shoulder, at a distance of 0.44 m in the x direction. The desired trajectory of the palm direction was generated from the tangent vectors of the sine wave at each time frame. Using these desired trajectories, the desired joint angle trajectories were calculated, giving human-like arm motions. These desired joint trajectories were executed on the KIST humanoid robot, MAHRU. The experiment was performed using a PC running real-time Linux (RTAI) and DSP control boards at each joint motor. The Linux PC sent the desired joint angles and

Fig. 9. Comparison of the human arm motion and the human-like arm motion by MAHRU using the developed method

desired joint velocities to each DSP board via the CAN protocol every 5 ms; the real-time Linux (RTAI) system guaranteed the 5 ms sampling time. Each DSP board controlled its motor to track the desired joint values with a PD controller. Figure 9 shows snapshots of the experimental result. The left and right wrist positions are symmetric, and the palm directions in the first, third and last scenes of the figure are the same with respect to the Cartesian coordinate systems at each shoulder. Note that the resulting arm postures of the humanoid robot in these scenes are not symmetric: one elbow is lifted more than the other, as happens with a human.

VI. CONCLUSION

A mathematical representation for characterizing human arm motions has been proposed, based on a motion capture database. The representation was implemented and evaluated successfully on the KIST humanoid robot, MAHRU. The developed method for characterizing human arm motion is simple to implement and generates a human-like posture for an arbitrary arm configuration; it can be used to generate arm motions in real time. In addition, the generated motion followed the desired wrist positions exactly, since the elbow elevation angle does not affect the wrist position. Furthermore, the method may be used where the humanoid robot is required to move its wrist or hand from one point to another, such as an approaching arm action toward an object in visual servoing. The method may not fully satisfy a desired hand orientation, since the elbow elevation angle uses only one angle, relative to the palm direction, out of the three angles of the desired orientation. To satisfy the desired hand orientation, more degrees of freedom are needed in the humanoid robot. Arm motion generation considering dynamics and the self-collision problem remains future work.

REFERENCES

[1] C. Kim, D. Kim, and Y. Oh, "Solving an inverse kinematics problem for a humanoid robot's imitation of human motions using optimization," in Proc. of Int. Conf. on Informatics in Control, Automation and Robotics, 2005, pp. 85-92.
[2] N. S. Pollard, J. K. Hodgins, M. J. Riley, and C. G. Atkeson, "Adapting human motion for the control of a humanoid robot," in Proc. of IEEE Int. Conf. on Robotics and Automation, 2002, vol. 2, pp. 1390-1397.
[3] S. Nakaoka, A. Nakazawa, K. Yokoi, H. Hirukawa, and K. Ikeuchi, "Generating whole body motions for a biped humanoid robot from captured human dances," in Proc. of Int. Conf. on Robotics and Automation, 2003, pp. 3905-3910.
[4] T. Asfour and R. Dillmann, "Human-like motion of a humanoid robot arm based on a closed-form solution of the inverse kinematics problem," in Proc. of IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, 2003, vol. 2, pp. 1407-1412.
[5] J. F. Soechting and M. Flanders, "Errors in pointing are due to approximations in sensorimotor transformations," Journal of Neurophysiology, 1989, vol. 62, pp. 595-608.
[6] J. F. Soechting and M. Flanders, "Sensorimotor representations for pointing to targets in three-dimensional space," Journal of Neurophysiology, 1989, vol. 62, pp. 582-594.
[7] R. T. Haftka, Experimental Optimum Engineering Design Course Notes, Department of Aerospace Engineering, Mechanics and Engineering Science, University of Florida, Gainesville, FL, 2000.