Mobile Robot Relocation from Echolocation Constraints

Jong Hwan Lim
Department of Mechanical Engineering
Cheju National University, Cheju, Korea
[email protected]

John J. Leonard
Department of Ocean Engineering
Massachusetts Institute of Technology
Cambridge, MA 02139, USA
[email protected]

Abstract

This paper presents a method for relocation of a mobile robot using sonar data. The process of determining the pose of a mobile robot with respect to a global reference frame in situations where no a priori estimate of the robot's location is available is cast as a problem of searching for correspondences between measurements and an a priori map of the environment. A physically-based sonar sensor model is used to characterize the geometric constraints provided by echolocation measurements of different types of objects. Individual range returns are used as data features in a constraint-based search to determine the robot's position. A hypothesize and test technique is employed in which positions of the robot are calculated from all possible combinations of two range returns that satisfy the measurement model. The algorithm determines the positions which provide the best match between the range returns and the environment model. The performance of the approach is demonstrated using data from both a single scanning Polaroid sonar and from a ring of Polaroid sonar sensors.

I. Introduction

Navigation is a central problem in mobile robotics [6]. The terms relocation and localization refer to two different scenarios for determining the position of an autonomous mobile robot with respect to a global reference frame [16]. The former is the direct measurement of the position in a way that is independent of assumptions about previous movements [8], while the latter is the continual maintenance of a position estimate deduced from the robot's a priori position estimate [15], [5]. While there has been a large amount of work on continuous localization of a mobile robot using sonar, less work has been done on relocation. Relocation is an important problem for the development of long-term autonomous systems, because it will be necessary for error recovery if a mobile robot gets lost. The present paper addresses a limited form of the relocation problem, in which it is assumed that the mobile robot has an accurate a priori map.

The problem of relocation using sonar was first considered by Drumheller [8]. He developed a search procedure for determining the robot's position based on the Interpretation Tree method of Grimson and Lozano-Perez [11], [12]. Drumheller's approach is strongly based on the use of line segments as features extracted from scanning sonar data. Kuc [14], however, has demonstrated that in a specular wavelength regime, sonar scans should be expected to consist of circular arcs, not straight line segments. Leonard [16] presented a simple thresholding technique for extracting these circular arc features from Polaroid sonar data. These features are called regions of constant depth (RCDs). RCDs are caused by specular planes and cylinders and also by corners and edges. Since most reflective surfaces in man-made environments contain specular reflectors, RCDs are a more natural and useful feature for sonar data interpretation than straight line segments. Using range information from a single sensor only, the RCD features produced by edges and corners are indistinguishable from the RCDs produced by planes. Hence, the extension to apply Drumheller's method with RCDs in lieu of line segments is not straightforward. In addition, it would be highly beneficial to develop a technique which can be applied to sparse sonar data, collected by a ring of sensors, as well as to densely-sampled sonar data from a rotating sensor.

In the approach to relocation described in this paper, the environment is a room or area inside a building, which can be modeled in terms of four types of geometric primitives (corners, edges, cylinders, and walls). We assume that an approximate model of the objects in the environment is available, in terms of these types of features.
In practice, a small hole or narrow crack between two objects that is not in the model can also produce range returns. The relocation method does not rely on an exhaustively detailed model, but instead can be applied when the locations of key environmental features are known. Our work assumes a feature-based representation of the environment. Alternative formulations of relocation and continuous localization employing grid-based representations were considered by Moravec and Elfes [18], [9], Lim [17], and Schultz and Adams [20]. More recently, a Monte Carlo Localization procedure using laser data has been reported by Dellaert et al. [7], and an Interpretation Tree approach has been presented by Castellanos et al. [13].

The structure of this paper is as follows: Section II reviews the sonar measurement model which is used to derive the geometric constraints employed in the relocation method. The details of the relocation procedure are specified in Section III. Experimental results obtained with the algorithm are presented in Section IV. Finally, conclusions are drawn in Section V.

II. Measurement Model

Echolocation refers to the process of measuring the location of an object by determining the time for an echo to return from the object to the sonar, in response to an interrogating pulse from the sonar [1]. If the echoes reflected from an object can be simultaneously detected by multiple transducers [2], then one can also measure the direction to the reflecting object. However, in the present paper, we assume that each transducer operates independently, and hence angle cannot be measured directly. Because of the wide beam pattern of the Polaroid sonar, the angle to the reflecting object cannot be reliably estimated from individual returns [16]. The measurement produced by the sensor is a range value r = ct/2, where c is the speed of sound in air and t is the round-trip travel time from the sensor to the object and back. A physics-based sonar sensor model can be used to derive the geometric constraints provided by echolocation measurements for different types of objects [16]. We assume that environmental features can be classified according to four types: planes, cylinders, corners, and edges. We approximate the world as being two dimensional, so that planes are represented by lines, cylinders by circles, and corners or edges by points. Examples of cylindrical features that might be encountered in an indoor environment include building pillars. We use the word "target" to refer to the environmental features.
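For concreteness, the range computation r = ct/2 can be written out in a few lines of Python (a toy sketch, not code from the paper; the nominal sound speed of 343 m/s is our assumption):

```python
def echo_range(t_round_trip_s, c=343.0):
    """Return the range r (in meters) implied by a round-trip echo time
    t (in seconds): r = c * t / 2, since the pulse travels out and back."""
    return c * t_round_trip_s / 2.0

# A 10 ms round trip corresponds to a target roughly 1.7 m away.
r = echo_range(0.01)
```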

Fig. 1. Plane target model. A plane is represented by the perpendicular distance r and its orientation. The shaded rectangle indicates a single sonar sensor located at the position (xs, ys, θs).

In addition, we assume that the surfaces of the environment are smooth in relation to the wavelength of the sonar. In a specular wavelength regime, rough surface diffraction [4] can be ignored. If rough surfaces are encountered, extra returns will be produced at high angles of incidence from line targets; these will need to be rejected as outliers as a by-product of the constraint-based search procedure. For a long duration, single frequency transmitted pulse, the beam pattern b(θ) of a circular disc transducer such as the Polaroid sonar is given by [19]:

b(θ) = [ 2 J1(ka sin θ) / (ka sin θ) ]²,   (1)

where J1 is a first-order Bessel function, θ is the angle from the sensor axis, k = 2π/λ is the wavenumber, and a is the radius of the transducer. For the Polaroid sensor, a = 39 mm and λ = 6.9 mm. Bozma and Kuc [3] have found that with short, impulsive excitations, the beam pattern of the Polaroid transducer has a Gaussian shape and side-lobe effects are minimized. However, with the standard Polaroid driver circuit, which uses a longer transmitted pulse, side-lobe levels can be significant. While under normal circumstances a range return is produced by the main central lobe, returns can also be generated from the side lobes of the radiation pattern. For specular surfaces, only the perpendicular portion of the surface reflects the beam directly back to the transducer [14]. For a line target, if we let θt be the angle with respect to the x axis of a perpendicular drawn from the line to the sensor location, as shown in Figure 1, the range of values of the sensor bearing θs that can produce a range return is

θt − β/2 ≤ θs ≤ θt + β/2,   (2)

where β is defined as the visibility angle of the target and represents the maximum range of angles over which a return is produced by a target. We define Equation 2 as the sensor orientation constraint. For a point target such as a corner or edge, the range of values of θs is identical to Equation 2, except that θt in this case is defined as the angle between the x axis of the global reference frame and the line drawn from the sensor location to the point that defines the location of the point target. In practice, edge targets will be visible over a smaller range of angles than other types of targets, because they provide weaker returns. However, in this paper we adopt the conservative strategy of using the large value of β = 50 degrees for all types of targets. A cylinder is represented by a circle and is defined by the x and y coordinates of the center and the radius of the circle. θt for a cylinder is defined as the angle between the x axis of the global reference frame and the line drawn from the sensor location to the center of the cylinder.
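The sensor orientation constraint of Equation 2 can be sketched in Python (an illustrative sketch, not the authors' code; the angle-wrapping convention and default β = 50 degrees follow the text):

```python
import math

def angle_diff(a, b):
    """Smallest signed difference a - b, wrapped to (-pi, pi]."""
    d = (a - b) % (2.0 * math.pi)
    return d - 2.0 * math.pi if d > math.pi else d

def satisfies_orientation_constraint(theta_s, theta_t, beta=math.radians(50.0)):
    """Equation 2: a return is feasible only if the sensor bearing theta_s
    lies within +/- beta/2 of theta_t, the direction of the perpendicular
    (line target) or the line of sight to the point/cylinder target."""
    return abs(angle_diff(theta_s, theta_t)) <= beta / 2.0
```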

III. Relocation Procedure

Relocation is basically a search problem that finds the best correspondence between the sensor data and the model features. Thus the reduction of the search cost, as well as the accuracy of the result, is very important. If we have m data features (sonar returns) and n model features, the search cost will grow at the rate of (n + 1)^m when we use the basic Interpretation Tree algorithm of Grimson and Lozano-Perez [11]. To reduce the number of data values to be considered in such a procedure, one method is to group sonar returns that are hypothesized to originate from the same environmental object into a "data feature". This was the motivation for Drumheller to extract straight line segments from sonar scans, to serve as input data. In Drumheller's work, line segments extracted from sonar scans were effectively used as constraints for relocation. A line segment can dramatically reduce both the search cost and the angular uncertainty of the robot's configuration. As stated earlier, however, it is difficult to extract line segments from even densely sampled data because most of the object surfaces in an indoor environment can be considered specular.

Furthermore, it is impossible to extract line segments from sparse data such as that of a circular ring of 16 sonar sensors. The relocation method presented here uses the constraints derived from a physics-based measurement model in a hypothesize and test search procedure [11]. The algorithm can employ either individual sonar returns or multiple adjacent sonar returns grouped together (circular arcs) as data features. For simplicity, we state the algorithm in terms of the situation when all sonar returns are referenced to the center of the robot, but the method can be generalized to accommodate arbitrary sonar configurations. The method is summarized in Algorithm 1. The key steps are described below:

Step 1: generation of trial positions. From the combination of any two range returns, fi and fj, we consider all possible ways of pairing the returns with targets Fp and Fq, i.e., sets of pairings fi:Fp and fj:Fq. The association of a return with a plane target gives a line on which the sensor is constrained to be located. The association of a return with a corner, edge, or cylinder target gives a circle of possible sensor positions. Each pair of feature-to-model associations generates zero, one, or two possible positions of the robot, based on computation of the intersection points of the line and circle position constraints produced from each association (see Figures 2 to 4). For associations that match both returns to plane targets, many infeasible pairings can be quickly removed, without calculation of a trial position, through application of a binary angle constraint [11]. The binary angle constraint tests the relative orientation of the measurements fi and fj against the difference in angle between the normals of Fp and Fq, to see if they agree within the target visibility angle β. Additional binary constraints based on the intersection of visibility regions for different targets can also be incorporated in the method.
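The line-circle intersection underlying step 1 can be sketched as follows (an illustrative Python sketch under our own parameterization of the targets, not the authors' implementation):

```python
import math

def trial_positions(plane_phi, plane_d, r_plane, corner_xy, r_corner):
    """Intersect the line of sensor positions at perpendicular distance
    r_plane from a plane target with the circle of radius r_corner around
    a corner/edge target. Returns 0, 1, or 2 candidate (x, y) positions.

    The plane is the set of points p with n . p = plane_d, where
    n = (cos(plane_phi), sin(plane_phi)) points into free space, so the
    sensor must lie on the line n . p = plane_d + r_plane."""
    nx, ny = math.cos(plane_phi), math.sin(plane_phi)
    e = plane_d + r_plane                 # sensor line: n . p = e
    cx, cy = corner_xy
    h = nx * cx + ny * cy - e             # signed distance of corner to line
    disc = r_corner ** 2 - h ** 2
    if disc < 0.0:
        return []                         # constraints are inconsistent
    foot = (cx - h * nx, cy - h * ny)     # closest point on line to corner
    s = math.sqrt(disc)                   # half-chord length
    tx, ty = -ny, nx                      # direction along the line
    if s == 0.0:
        return [foot]                     # tangent case: one position
    return [(foot[0] + s * tx, foot[1] + s * ty),
            (foot[0] - s * tx, foot[1] - s * ty)]
```

A pair of plane associations intersects two such lines instead, and a pair of point/cylinder associations intersects two circles; all three cases yield at most two trial positions.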

Step 2: removal of infeasible trial positions.

For each trial position x̂k calculated in step 1, a trial angle is calculated for the sensor using the mean of the angles to each target from the trial position. Then, predicted sonar returns are generated for the features Fp and Fq, and these are matched against the measurements fi and fj by applying the sensor orientation constraint (Equation 2). The sonar prediction function incorporates an occlusion test, which determines whether target Fp is visible from position x̂k. If the occlusion test fails for either pairing, then the position is considered infeasible.
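The trial-angle computation in step 2 can be sketched as a circular mean of the bearings to the two hypothesized targets (our own illustrative reading of "mean of the angle to each target"; the occlusion test is omitted):

```python
import math

def trial_angle(bearings_rad):
    """Circular mean of the bearings (radians) from the trial position to
    the hypothesized targets; using vector summation avoids wrap-around
    problems that a plain arithmetic mean would have near +/- pi."""
    s = sum(math.sin(a) for a in bearings_rad)
    c = sum(math.cos(a) for a in bearings_rad)
    return math.atan2(s, c)
```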

Fig. 2. A possible trial position P for the robot calculated from the hypothesized match of return Ri with line target Fp and return Rj with line target Fq.

Fig. 3. Possible trial positions P1 and P2 for the robot calculated from the hypothesized match of return Ri with line target Fp and return Rj with point target Fq.

Fig. 4. Possible trial positions P1 and P2 for the robot calculated from the hypothesized match of return Ri with line target Fp and return Rj with cylinder target Fq (r is the radius of the cylinder).

Step 3: matching of additional range values based on the hypothesized location. Predicted returns (range and angle) are generated for each model feature that is visible from the trial position and matched to the remaining ranges in the data set. Predicted ranges R̂t and actual ranges Rt are compared with a set threshold:

|R̂t − Rt| ≤ re.   (3)

If Equation 3 holds true, we get a successful match. The value of re is chosen based on the range accuracy of the sensor and the uncertainty of the trial position. If multiple predicted ranges match a measured range, then the closest prediction is used.
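The thresholded matching of Equation 3, with closest-prediction selection, can be sketched as follows (illustrative Python; the default re value is our assumption, and the paper chooses it from sensor accuracy and trial-position uncertainty):

```python
def match_range(measured, predictions, r_e=0.1):
    """Return the index of the predicted range closest to the measured
    range, provided |prediction - measured| <= r_e (Equation 3);
    return None when no prediction matches within the threshold."""
    best_idx, best_err = None, r_e
    for idx, pred in enumerate(predictions):
        err = abs(pred - measured)
        if err <= best_err:
            best_idx, best_err = idx, err
    return best_idx
```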

Step 4: pose refinement using least squares estimation. For each trial position, suppose that K is the number of matched predictions and observations determined in step 3. If K is greater than a minimum number of matches N, then a least squares algorithm is used to compute an improved location and orientation estimate for the robot based on all of the matched predictions and observations, and this location is added to the solution set. An additional parameter δ is used to control the size of the solution set. If K − δ is greater than N, then the solution set is pruned by removing any solutions with fewer than K − δ matches, and N is set to K − δ. In our experiments, good results have been obtained with δ = 0. By increasing δ to a larger value, such as δ = 2, a greater number of solutions is provided, and better insight into the algorithm's performance can be obtained.

Step 5: clustering of computed locations. The locations in the solution set are clustered together to produce a single solution for each set of locations within a specified error range (xe, ye, θe) of one another.
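The clustering of step 5 can be sketched with a simple greedy grouping (an illustrative stand-in, since the paper does not specify the clustering algorithm; the angle averaging here is naive and assumes poses in a cluster do not straddle the ±180 degree wrap):

```python
def cluster_solutions(solutions, xe=0.1, ye=0.1, theta_e_deg=10.0):
    """Greedily group (x, y, theta_deg) solutions that lie within the
    tolerances (xe, ye, theta_e) of a cluster's seed, then return one
    average pose per cluster."""
    clusters = []
    for sol in solutions:
        placed = False
        for cl in clusters:
            seed = cl[0]
            if (abs(sol[0] - seed[0]) <= xe and
                    abs(sol[1] - seed[1]) <= ye and
                    abs(sol[2] - seed[2]) <= theta_e_deg):
                cl.append(sol)
                placed = True
                break
        if not placed:
            clusters.append([sol])
    # Average each coordinate over the members of each cluster.
    return [tuple(sum(v) / len(cl) for v in zip(*cl)) for cl in clusters]
```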

IV. Experimental Results

The algorithm described above has been implemented with several data sets, including data from a single scanning Polaroid sonar and from an array of fixed transducers. The sonar scan data was taken from Leonard and Durrant-Whyte [16]. Circular arc features were extracted as in [16] and supplied as input to the relocation procedure. The algorithm parameters were set to the values: β = 30 degrees, xe = 0.1 m, ye = 0.1 m, θe = 10 degrees, and re = 0.05 m. The results of the method for a sequence of 18 sonar scans are shown in Figure 5 and Table I. The robot's actual position was measured carefully by hand to an accuracy of approximately 2 millimeters and 1 degree. Figure 6 shows the set of circular arc features used as input for several scans, and the set of circular arc features successfully matched for these positions.

TABLE I
Summary of localization results for scan data (18 positions).

                    mean            std. dev.
distance error      0.011 m         0.006 m
angle error         -0.65 degrees   0.59 degrees
number of matches   8.33            0.97

TABLE II
Summary of localization results for ring sonar data (84 positions).

                    mean             std. dev.
distance error      0.021 m          0.012 m
angle error         -0.016 degrees   2.97 degrees
number of matches   11.7             1.12

Sonar range values from a ring of sonar sensors were obtained using a Scout mobile robot from Nomadic Technologies, Inc. The robot is equipped with 16 sonar sensors spaced at 22.5 degree angular intervals. The range resolution of the sonar data is 0.025 m. The data sets were acquired at a sequence of 84 locations in a room, as shown in Figure 7. The objects in the room were paper boxes, desks, and a cylinder and a triangular shaped object made of metal. Narrow cracks between some of the objects were not modeled. The positions were distributed fairly evenly around the room, and a range of robot orientations was considered. The algorithm parameters were set to the values: β = 50 degrees, xe = 0.1 m, ye = 0.1 m, θe = 10 degrees, and re = 0.1 m. Figures 8 and 9 and Table II show the results for the ring sonar data. The robot's actual position was estimated by using the initialization technique for the model-based localization method described by Leonard [16]. This method was determined to be accurate to within approximately 0.02 meters and approximately 2 degrees, based on comparison with hand-measured vehicle locations.

V. Conclusion and Discussion

A method for mobile robot relocation using sonar has been implemented and tested with real data obtained in an indoor environment. The method can determine the position of a Nomad Scout robot within a known room to an accuracy of approximately ±3 centimeters and approximately ±3 degrees. Since most object surfaces in an indoor environment are specular, it is difficult to extract line segment features reliably in typical indoor environments using data obtained from only one position. The algorithm presented in this paper can be applied to individual sonar returns, or to groups of returns combined together to form circular arcs. Geometric constraints derived from a physics-based sensor model are used in a "hypothesize and test" search technique. Trial positions and orientations of the robot are calculated by considering all possible pairs of range values and model primitives that satisfy the geometric constraints. The algorithm determines a set of positions that produce the best correspondence between the environment and the measurements.

A critical issue with this type of algorithm is its performance when the model does not fully match the environment. The method reported in this paper exhibited good performance in limited testing when objects were present in the environment but not in the model, and when objects were present in the model but not in the environment. In preliminary experiments, we found that the introduction of an unmodeled object led to position errors of less than 10 cm and angular errors of less than 10 degrees for the environment shown in Figure 7. Although many indoor surfaces are indeed specular, rough surface reflections can be important in many environments. Incorporation of echolocation constraints from rough surfaces is more difficult because diffracted sonar returns provide weaker geometric constraints than specular sonar returns.

An additional topic to consider in future research is to combine a heuristic search termination strategy with a pre-selection method that can consider data-to-model pairings that have a higher likelihood earlier in the search process [11]. This could result in a significant decrease in computation time, but might lead to erroneous results in environments, such as building corridors, that exhibit a high degree of symmetry. Further testing is necessary to determine the performance of the method in very large environments. An additional constraint that might be employed to disambiguate multiple hypothesized positions would be the sonar barrier test employed by Drumheller [8]. This test would reject any hypothesized positions which have measured sonar returns which "penetrate" planar objects at near-orthogonal angles of incidence. We believe, however, that the sonar barrier test may cause problems in situations when there are unmodeled objects present or when there are objects in the model which are no longer in the same positions in the environment.

The true test of a relocation method will be its performance using maps built autonomously by the robot [21], [10], [22], [23]. We believe that the approach adopted in this paper offers sufficient generality such that integration with concurrent mapping and localization can be achieved without major modifications. A search-based relocation algorithm, using geometric constraints derived from individual sonar returns, can enable successful mobile robot relocation in a wide range of operating environments.

Acknowledgments

J. H. Lim has been supported by the Korean Science and Engineering Foundation (KOSEF 97 postdoctoral fellowship program). J. Leonard has been partially supported by the Henry L. and Grace Doherty Assistant Professorship in Ocean Utilization and NSF grant BES-9733040.

References

[1] W. Au. The Sonar of Dolphins. New York: Springer-Verlag, 1993.
[2] B. Barshan and R. Kuc. Differentiating sonar reflections from corners and planes by employing an intelligent sensor. IEEE Transactions on Pattern Analysis and Machine Intelligence, PAMI-12(6):560–569, June 1990.
[3] O. Bozma and R. Kuc. Building a sonar map in a specular environment using a single mobile transducer. IEEE Trans. Pattern Analysis and Machine Intelligence, 13(12), December 1991.
[4] O. Bozma and R. Kuc. Characterizing pulses reflected from rough surfaces using ultrasound. J. Acoustical Society of America, 89(6):2519–2531, June 1991.
[5] J. A. Castellanos. Mobile Robot Localization and Map Building: A Multisensor Fusion Approach. PhD thesis, University of Zaragoza, Spain, 1989.
[6] I. J. Cox and G. T. Wilfong. Autonomous Robot Vehicles. Springer-Verlag, 1990.
[7] F. Dellaert, D. Fox, W. Burgard, and S. Thrun. Monte Carlo localization for mobile robots. In Proc. IEEE Int. Conf. Robotics and Automation, 1999.
[8] M. Drumheller. Mobile robot localization using sonar. IEEE Transactions on Pattern Analysis and Machine Intelligence, PAMI-9(2):325–332, March 1987.
[9] A. Elfes. Sonar-based real-world mapping and navigation. IEEE Journal of Robotics and Automation, RA-3(3):249–265, June 1987.
[10] H. J. S. Feder, J. J. Leonard, and C. M. Smith. Adaptive mobile robot navigation and mapping. Int. J. Robotics Research, 18(7):650–668, July 1999.
[11] W. E. L. Grimson. Object Recognition by Computer: The Role of Geometric Constraints. MIT Press, 1990. (With contributions from T. Lozano-Perez and D. P. Huttenlocher.)
[12] W. E. L. Grimson and T. Lozano-Perez. Model-based recognition and localization from sparse range or tactile data. International Journal of Robotics Research, 3(3):3–35, 1984.
[13] J. Neira, J. A. Castellanos, and J. D. Tardos. Constraint-based mobile robot localization. In Advanced Robotics and Intelligent Systems, Control Series 51. IEE, January 1996. ISBN 0-85296-853-1.
[14] R. Kuc and M. W. Siegel. Physically based simulation model for acoustic sensor robot navigation. IEEE Transactions on Pattern Analysis and Machine Intelligence, PAMI-9(6):766–778, November 1987.
[15] J. J. Leonard and H. F. Durrant-Whyte. Mobile robot localization by tracking geometric beacons. IEEE Trans. Robotics and Automation, 7(3):376–382, June 1991.
[16] J. J. Leonard and H. F. Durrant-Whyte. Directed Sonar Sensing for Mobile Robot Navigation. Boston: Kluwer Academic Publishers, 1992.
[17] J. H. Lim. Map construction, exploration, and position estimation for an autonomous mobile robot using sonar sensors. PhD thesis, Pohang Institute of Science and Technology, Korea, 1994.
[18] H. Moravec. Sensor fusion in certainty grids for mobile robots. In Sensor Devices and Systems for Robotics, pages 253–276. Springer-Verlag, 1989. NATO ASI Series.
[19] P. M. Morse and K. U. Ingard. Theoretical Acoustics. New York: McGraw-Hill, 1968.
[20] A. C. Schultz and W. Adams. Continuous localization using evidence grids. In Proc. IEEE Int. Conf. Robotics and Automation, pages 2833–2839, 1998.
[21] R. Smith, M. Self, and P. Cheeseman. Estimating uncertain spatial relationships in robotics. In I. Cox and G. Wilfong, editors, Autonomous Robot Vehicles. Springer-Verlag, 1990.
[22] S. Thrun, J.-S. Gutmann, D. Fox, W. Burgard, and B. J. Kuipers. Integrating topological and metric maps for mobile robot navigation: A statistical approach. In AAAI-98, 1998.
[23] B. Yamauchi, A. Schultz, and W. Adams. Mobile robot exploration and map building with continuous localization. In Proc. IEEE Int. Conf. Robotics and Automation, pages 3715–3720, May 1998.

Fig. 7. Room layout for ring sonar data sets. The small circles indicate the 84 locations from which sonar data was acquired.

Fig. 5. (Top) distance error for each scan data set, (middle) angle error for each scan data set, (bottom) number of matches for each scan data set.

Fig. 8. (Top) distance and angle errors for each ring sonar data set, (bottom) number of matches for each position.

Fig. 6. (Left column) all circular arcs referenced to the true pose for data sets 1 and 9, (right column) matched circular arcs referenced to the estimated pose for data sets 1 and 9. (The difference in scale between the figures on the left and on the right is necessary to show the long sonar returns that arise from multiple reflections.)

Fig. 9. (Left column) all sonar returns referenced to the true pose for ring sonar data sets 1 and 31, (right column) matched sonar returns referenced to the estimated pose for ring sonar data sets 1 and 31.

function x = relocation(R, F)
inputs:
  R = {R1, ..., Rm}     ▷ a set of m sonar returns (observations)
  F = {F1, ..., Fn}     ▷ a set of n model features (targets)
outputs:
  x = {x1, ..., xr}     ▷ a set of r potential robot positions
  M                     ▷ number of positions in x
control parameters:
  xe, ye, θe, re        ▷ position, range, and angle tolerances (typical values are 10 cm and 10 degrees)
  β                     ▷ maximum range of angles over which a target is visible (50 degrees for the Polaroid sonar)
  N                     ▷ initial value for the minimum number of matching sonar returns for each position in x (typically six)
  δ                     ▷ maximum difference in the number of matching sonar returns for each location in the solution set (typically zero)
local variables:
  a                     ▷ set of assignments between predicted and observed sonar returns
  R̂                     ▷ predicted sonar returns
  K                     ▷ number of matches for a hypothesized location
  x̂                     ▷ hypothesized vehicle positions

x ← ∅
M ← 0
for i = 1 to m − 1
  for j = i + 1 to m
    for p = 1 to n
      for q = 1 to n
        if binary_constraints(Ri, Rj, Fp, Fq) == TRUE then        ▷ apply binary constraints
          x̂ = {x̂1, ..., x̂s} ← generate_locations(Ri, Rj, Fp, Fq)  ▷ generate s hypothesized positions
          for k = 1 to s
            R̂i ← sonar_prediction(x̂k, Fp)                         ▷ predict sonar return for feature Fp
            if match_returns(Ri, R̂i) == TRUE then
              R̂j ← sonar_prediction(x̂k, Fq)                       ▷ predict sonar return for feature Fq
              if match_returns(Rj, R̂j) == TRUE then
                R̂ = {R̂1, ..., R̂t} ← sonar_prediction(x̂k, {F1, ..., Fn})   ▷ generate predicted sonar returns from the hypothesized position
                (K, a) ← match_returns(R̂, {R1, ..., Rm} \ {Ri, Rj})        ▷ match predicted sonar returns with the remaining measurements, producing a set of assignments
                if K ≥ N then
                  xM ← compute_position(Ri, Fp, Rj, Fq, a)        ▷ calculate improved position estimate using all matched predictions and observations
                  x ← x ∪ {xM}                                    ▷ add new solution to the set of solutions
                  M ← M + 1
                end
                if (K − δ) > N then
                  (x, M) ← prune(x, K − δ)                        ▷ remove any solutions with fewer than K − δ matches
                  N ← K − δ                                       ▷ increase the minimum number of matches
                end
              end
            end
          end
        end
      end
    end
  end
end

x ← clustering(x, xe, ye, θe)  ▷ group positions in x into clusters based on the tolerances xe, ye, θe and replace x by the average position of each cluster
x ← sort(x)                    ▷ sort solutions by number of matches and least squares residual
return x, M

Algorithm 1: Summary of the relocation procedure.