Adaptive Reconstruction of Freeform Objects with 3D SOM Neural Network Grids
Jacob Barhak, Anath Fischer
CMSR Laboratory for Computer Graphics and CAD, Department of Mechanical Engineering, Technion - Israel Institute of Technology, Haifa 32000, Israel
Tel: 972-4-829-3260, Fax: 972-4-832-4533, meranath@tx.technion.ac.il
Abstract

Reverse engineering is an important process in CAD systems today. Yet several open problems lead to a bottleneck in the reverse engineering process. First, because the topology of the object to be reconstructed is unknown, point connectivity relations are undefined. Second, the fitted surface must satisfy global and local shape preservation criteria that are not defined explicitly. In reverse engineering, object reconstruction is based both on parameterization and on fitting. Nevertheless, the above problems mainly affect the parameterization. In order to overcome these problems, this paper proposes a neural network Self Organizing Map (SOM) method for creating a 3D parametric grid. The main advantage of the proposed SOM method is that it detects both the orientation of the grid and the position of the sub-boundaries. The neural network grid converges to the sampled shape through an adaptive learning process. The SOM method is applied directly on 3D sampled data and avoids the projection anomalies common to other methods. The paper also presents boundary correction and growing grid extensions to the SOM method. In the surface fitting stage, an RSEC (Random Surface Error Correction) fitting method based on the SOM method was developed and implemented.

Keywords: Reverse Engineering, Parameterization, Surface Fitting, SOM, Neural Networks.

1 Overview

Reverse engineering (RE) is an important process used today for reconstructing CAD models. Currently, the laser scanner is the most common sampling technique used in RE applications because it is fast and robust relative to other methods. Data scanned by a laser provides explicit 3D points from which a 3D model can be reconstructed [20]. Reconstructing a CAD model represented by parametric smooth surfaces is based on two main stages: parameterization and surface fitting.

1.1 Parameterization

The parameterization stage in CAD model reconstruction establishes the intrinsic parametric order and connectivity between points. Parameterization poses a problem similar to that of the chicken and the egg. In order to fit a parametric surface to a set of points, the parameterization of the points should be known; however, in order to determine the exact parameterization of these points, the surface shape should be known. Therefore, surface prediction methods need to be devised. The literature discusses several parameterization schemes, most of which are based on triangular meshes [10,11]. Other parametric surface fitting methods over a rectangular domain can be found in [4,12,15,19,14,18,1]. All of these methods have the following disadvantages:
1. When the scanned data has varying density, these methods yield large parametric gaps between adjacent points. These gaps can subsequently create surfaces that are poorly fitted due to over-stretching or wrinkles on the fitted surface.
2. These methods rely on user intervention or on arbitrary decisions regarding the location of the surface boundary, thus leading to incorrect detection of the surface orientation.
This paper proposes a neural network parameterization method for rectangular domains that avoids the above problems. Neural networks have been used for surface approximation in [3,9,21]. Despite the ability of neural networks to learn complex patterns by examples, they have not been widely used in reverse engineering applications. In [9], feed-forward neural networks are used to approximate a B-Spline surface; the surface, however, is not explicitly extracted. In [3], self-organization is used to reproduce a surface represented by
0-7695-1227-5/01 $10.00 © 2001 IEEE
polygons. In [21], B-Spline surfaces are extracted from scattered data by using the growing grid method, which is an extension of the SOM method and is also used in this paper. However, that method does not address the problem of outlier points, which is addressed in this paper.

1.2 Surface Fitting

A fitted surface must satisfy global and local constraints, such as shape preservation, continuity conditions, boundary conditions and geometric constraints, as well as conform to smoothness and accuracy according to a given tolerance. In order for a parametric B-Spline surface to be fitted to the sampled points, all the following parameters should be determined: mesh size, control points, degree of basis functions, and knot vectors. Generally, mesh size, degree and knot vectors are predicted, while point parameterization is calculated in the parameterization stage and the control points are calculated in the surface fitting stage [12,18]. In [15], these methods are expanded through a relatively fast but non-optimal algorithm that calculates an average knot vector. In [19], the parameterization and the control points are calculated during the surface fitting stage. In [14], only the size and the order of the B-Spline are predetermined, while all other variables are calculated during the surface fitting stage. Most techniques use least squares (LSQ) algorithms that minimize according to a distance criterion [15,12,18,19,14,1]. Some LSQ methods satisfy additional geometric constraints that are formulated in the minimization function [18,1]. LSQ methods can be divided into iterative and non-iterative methods. Common iterative methods include gradient methods, Newton and quasi-Newton methods, and the Levenberg-Marquardt method [18,15,19,14,1]. Non-iterative LSQ methods solve an over-constrained equation set by using Householder reduction [15] or SVD [6]. However, only iterative methods can handle large-scale data due to their small memory consumption. Therefore, an iterative method was used as a base for our approach.

2 Approach

In this paper, a Neural Network Self Organizing Map (SOM) method is proposed for reconstructing a single B-Spline surface from a single range image. Neural network techniques are used in this paper both for parameterization and for surface fitting. The stages of the proposed reconstruction method are as follows (Figure 1):
1. Constructing a parametric grid by a SOM neural network method.
2. Parameterizing the sampled data according to the parametric grid.
3. Creating an initial parametric 3D base surface.
4. Projecting the sampled points onto the 3D base surface and correcting their parameterization.
5. Approximating a B-Spline surface to the 3D digitized points.
6. Adaptively optimizing the parametric surface by repeating Stages 4-5 until the surface approximation error satisfies a given convergence tolerance.
Stages 1-3 deal with the initial parameterization from which the initial base surface is calculated. Stages 4-6 adaptively improve the reconstructed base surface until it converges to the sampled surface under a given convergence tolerance. Each of the above stages is described in detail in the following sections. We have distinguished between 2D parameterization and 3D parameterization, and for simplicity have chosen to begin with 2D parameterization. The sampled points are first projected onto a plane in the sampling direction as a preprocessing stage. Then, the 2D points are parameterized and used to create a 3D base surface. In 3D parameterization, the initial parameterization is performed directly on 3D sampled points to create the initial 3D base surface. Thus, steps 3-4 can be skipped for the 3D case. In the proposed approach, surface fitting is performed by the Random Surface Error Correction (RSEC) method, which is based on the SOM neural network method. The SOM and RSEC methods will be described in detail in the next section.

3 Implementation

First, parameterization is performed using the SOM neural network method. Then, the surface is fitted by the proposed RSEC method.

3.1 SOM Neural Network Parameterization

A Self Organizing Map (SOM) is a neural network that can be described as a mesh whose nodes are neurons. Each neuron contains the node position in 3D space (x,y,z) and represents the geometrical properties of the surface. The neurons are connected to each other with edges that represent topological characteristics of the surface. The SOM grid is trained by the sampled data in random order and is modified according to the shape of the sampled object. In this work we concentrate on SOM with rectangular topology, which is the same intrinsic topology as that of B-Spline surfaces. However, other topologies can be used, as described in [7]. The advantage of SOM over other parameterization methods is that by its nature all the sampled points, and not only the boundary points, participate in the creation of the grid. In this section, the concept of the SOM method is proposed as the basis for the parameterization method. Implementation of the basic SOM algorithm can be found in [13]. However, we have added some modifications to
interpolate the boundary of the sampled points for the parameterization process.

The SOM data structure. The SOM structure is defined as an array of neurons N(i,j) of size n x m. In our case, the number of neurons n x m is usually ten times smaller than the number of sampled points. In effect, each neuron is a structure that describes a vector in 2D or 3D space. An edge exists between neighboring neurons in the array and describes the topological connectivity in the grid, i.e. edges exist between N(i,j) and N(i+1,j) and between N(i,j) and N(i,j+1). Each neuron also has the following characteristics:
Index - the topological position of the neuron in the neuron array (i,j).
Position - the neuron's position in space (x,y,z).
Mobile - a Boolean parameter that represents whether the neuron can move during training.
Active - a Boolean parameter that signals whether the neuron can be chosen as a winner.

The SOM neural network algorithm. The SOM neural network algorithm is as follows:
1. Initialize the position of all the neurons. Neuron coordinates are set to random numbers within the bounding box of the sampled points. Unless otherwise stated, all neurons are initialized as mobile and active.
2. Pick a sampled point P_s in random order.
3. Find the (i*,j*) parametric coordinates of the winner neuron N(i*,j*) among all the active neurons. The winner neuron is the closest active neuron to the sampled point P_s.
4. Update the position of each active and mobile neuron in the neighborhood of the winner neuron with respect to the sample point (according to the update rules below).
5. Increment s; if s is smaller than the run length, go back to step 2; otherwise stop.
The neuron update rules and parameters are: Sampled points P_s are represented in (p,q) coordinates for 2D parameterization or in (x,y,z) coordinates for 3D parameterization, as explained in Sections 4.1.1 and 4.1.2 respectively. The weight of each sampled point is W_s, and is initially set to 1.
Initial and final neighborhood radius: R_0, R_f
Initial and final learning rate: \alpha_0, \alpha_f
Initial and final width parameter: \sigma_0, \sigma_f
Run length of the algorithm: L
The bubble neighborhood of the winner neuron is defined as all the neurons N(i,j) that satisfy:

|i - i*| + |j - j*| \le R_s, \quad R_s = R_0 (R_f / R_0)^{s/L}    (1)

The neuron position displacement is defined as:

N(i,j).Position = N(i,j).Position + h_{cs} (P_s - N(i,j).Position)    (2)

where h_{cs} is the Gaussian training function:

h_{cs} = \alpha_s \, e^{-\frac{(i - i*)^2 + (j - j*)^2}{2 \sigma_s^2}}    (3)

\alpha_s is the learning rate for point s:

\alpha_s = \alpha_0 (\alpha_f / \alpha_0)^{s/L}    (4)

\sigma_s defines the width parameter. It can be viewed as an elasticity feature of the SOM:

\sigma_s = \sigma_0 (\sigma_f / \sigma_0)^{s/L}    (5)

A proof of the convergence of the SOM to the sampled points exists for a simple 1D case and can be found in [13]. The proof also shows that the boundary does not always coincide with the boundary of the sampled points. This problem is corrected in this paper.

Figure 1. General scheme of the proposed approach: laser scanning of 3D points; creating the parametric grid by neural network SOM; initial parameterization of the sampled points; fitting correction of the B-Spline surface using RSEC.
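As an illustration, the training loop (steps 1-5 with update rules (1)-(5)) can be sketched in NumPy. The parameter values below are illustrative defaults, not the paper's settings:

```python
import numpy as np

def train_som(points, n=10, m=10, L=10000, R0=5.0, Rf=1.0,
              a0=0.5, af=0.05, s0=3.0, sf=0.5, seed=0):
    """Basic SOM training sketch following steps 1-5 and eqs. (1)-(5).

    points: (N, 3) array of sampled points. Returns the (n, m, 3) neuron grid.
    """
    rng = np.random.default_rng(seed)
    lo, hi = points.min(axis=0), points.max(axis=0)
    grid = rng.uniform(lo, hi, size=(n, m, 3))      # step 1: random init in bbox
    ii, jj = np.meshgrid(np.arange(n), np.arange(m), indexing="ij")
    for s in range(L):
        t = s / L
        Rs = R0 * (Rf / R0) ** t                    # eq. (1): neighborhood radius
        a_s = a0 * (af / a0) ** t                   # eq. (4): learning rate
        sig = s0 * (sf / s0) ** t                   # eq. (5): width parameter
        P = points[rng.integers(len(points))]       # step 2: random sample
        d2 = ((grid - P) ** 2).sum(axis=2)
        i_star, j_star = np.unravel_index(d2.argmin(), d2.shape)  # step 3: winner
        bubble = (np.abs(ii - i_star) + np.abs(jj - j_star)) <= Rs  # eq. (1)
        h = a_s * np.exp(-((ii - i_star) ** 2 + (jj - j_star) ** 2)
                         / (2 * sig ** 2))          # eq. (3): Gaussian function
        grid += (bubble * h)[:, :, None] * (P - grid)  # eq. (2): displacement
    return grid
```

Since every update moves a neuron by a convex step toward a sample, the grid remains inside the bounding box of the data throughout training.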
The above algorithm was used as the basis for parameterization. The SOM parameterization algorithm is first performed in 2D due to its simplicity and relative robustness (Section 4.1.1). Later, we present an extension to direct 3D parameterization (Section 4.1.2).
2D SOM Neural Network Parameterization. 2D parameterization includes the following stages: (1) projection, (2) 2D parametric grid generation, and (3) point parameterization according to the parametric grid. In the projection stage, 3D data is mapped into 2D data. During the mapping process, the point neighborhood should be preserved by projecting the 3D points (x,y,z) onto a surface in the scanning direction. Each point is then assigned its 2D projected coordinates (p,q). Next, a rectangular grid is created by the SOM method. Finally, the parameterization is established in two steps: (1) mapping a point (p,q) to a quadrilateral cell (i,j) and (2) calculating its (u,v) parametric values by the inverse of the bilinear interpolation in that quadrilateral cell, as explained in [2]. The algorithm creates a 2D SOM grid based on the projected points. This grid also reflects the distribution of the projected sampled points. It can be proven that the neurons are spread uniformly over the sample domain for the 1D case and that the boundary of the grid created by the basic SOM algorithm does not exactly coincide with the boundary of the sampled points [13]. However, there is no proof for higher dimension networks. Therefore, the SOM boundary first algorithm was proposed to cope with the boundary problem [2]. In the boundary first method, the boundary of the grid is isolated from the rest of the grid and is trained separately. In this way, the boundary, which has the same intrinsic dimension as a closed curve, is trained by data with the same intrinsic dimension and is not affected by the inner points. Later, the interior of the grid is trained while keeping the boundary of the grid immobile. This training yields a better boundary than the regular SOM method (Figure 2). Moreover, the SOM method independently determines the grid orientation and the four grid corner positions. However, the boundary has to be predefined through an edge detection technique.
3D SOM Neural Network Parameterization. Although 2D SOM parameterization is very suitable for parameterization of a single range image obtained by a parallel scan, it has drawbacks when several range images are merged. These drawbacks emerge in the initial projection stage:
1. The 2D parametric density does not always coincide with the geometric density of the 3D sampled points due to dimension reduction during the projection.
2. It is not always possible to find a suitable projection direction that preserves the topological connectivity relations among points.
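The cell-level inversion in step (2) can be sketched as follows. Reference [2] gives the exact inversion used in the paper, so this Newton-based version is only an illustrative stand-in:

```python
import numpy as np

def inverse_bilinear(p, q00, q10, q01, q11, iters=20):
    """Recover the local (u, v) in [0,1]^2 of 2D point p inside a quadrilateral
    cell with corners q00 (u=0,v=0), q10 (u=1,v=0), q01 (u=0,v=1), q11 (u=1,v=1),
    by Newton iteration on the bilinear map."""
    p = np.asarray(p, float)
    c = [np.asarray(x, float) for x in (q00, q10, q01, q11)]
    u = v = 0.5                                     # start at the cell center
    for _ in range(iters):
        # residual of the bilinear map S(u,v) - p
        f = ((1-u)*(1-v)*c[0] + u*(1-v)*c[1] + (1-u)*v*c[2] + u*v*c[3]) - p
        du = (-(1-v)*c[0] + (1-v)*c[1] - v*c[2] + v*c[3])   # dS/du
        dv = (-(1-u)*c[0] - u*c[1] + (1-u)*c[2] + u*c[3])   # dS/dv
        step = np.linalg.solve(np.column_stack([du, dv]), f)
        u, v = u - step[0], v - step[1]
    return u, v
```

For a convex, non-degenerate cell the Newton iteration converges rapidly from the cell center; a closed-form quadratic solution is an equivalent alternative.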
Figure 2. Different phases in the SOM "boundary first" method applied on a mask model: (a) ordering, (b) boundary learning, (c) inner point learning; and on a plane model: (d) ordering, (e) boundary learning, (f) inner point learning. It can be seen that the SOM method works well unless the sample boundary is highly concave.
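The boundary first idea, training the boundary ring on the boundary samples and then freezing it while the interior trains, can be sketched with mobility and activity masks. The pass parameters below are illustrative, not the paper's:

```python
import numpy as np

def som_pass(grid, mobile, active, points, steps, rng,
             R=3.0, alpha=0.3, sigma=1.5):
    """Simplified SOM pass: winners are chosen among `active` neurons and
    only `mobile` neurons move (constant illustrative parameters)."""
    n, m, _ = grid.shape
    ii, jj = np.meshgrid(np.arange(n), np.arange(m), indexing="ij")
    for _ in range(steps):
        P = points[rng.integers(len(points))]
        d2 = ((grid - P) ** 2).sum(axis=2)
        d2[~active] = np.inf                        # inactive neurons cannot win
        i, j = np.unravel_index(d2.argmin(), d2.shape)
        h = alpha * np.exp(-((ii - i) ** 2 + (jj - j) ** 2) / (2 * sigma ** 2))
        mask = ((np.abs(ii - i) + np.abs(jj - j)) <= R) & mobile
        grid += (mask * h)[:, :, None] * (P - grid)
    return grid

def boundary_first(grid, boundary_pts, inner_pts, steps, seed=0):
    """Phase 1 trains only the boundary ring on boundary samples;
    phase 2 freezes the ring and trains the interior on the inner samples."""
    rng = np.random.default_rng(seed)
    n, m, _ = grid.shape
    ring = np.zeros((n, m), bool)
    ring[[0, -1], :] = True
    ring[:, [0, -1]] = True
    all_on = np.ones((n, m), bool)
    grid = som_pass(grid, ring, ring, boundary_pts, steps, rng)   # phase 1
    grid = som_pass(grid, ~ring, all_on, inner_pts, steps, rng)   # phase 2
    return grid
```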
Figure 3. Classification of boundary sampled points: a boundary sample point and its winner boundary neuron, a sampled point near the boundary that is not a boundary sample, and an inner neuron.

extended by a growing grid algorithm [21,7]. In this algorithm, the size of the SOM network is modified during training by the following extension algorithm.
The problems described above suggest a 3D approach. As can be seen from the basic SOM algorithm, the 2D topology is defined by the edges connecting the grid neurons, while the geometry is defined by the training process over the 3D points. In the 2D case, the 2D grid is trained over 2D projected sampled data. For a 3D SOM neural network, the SOM neural network is trained directly over the 3D sampled data. The result is a 3D mesh with rectangular topology. This 3D mesh can be viewed as a bi-linear B-Spline surface approximating the sampled points. Thus, no initial parameterization is needed, so parametric optimization and surface fitting can be performed directly on the 3D mesh. However, detecting the 3D boundary of the mesh is more complex than in the 2D case. For this reason, a boundary correction algorithm was added to the basic SOM algorithm:
1. Every L_k processed sampled points, stages 2-4 are executed for L_e iterations:
2. Pick sampled points at random until a boundary sampled point is found. A boundary sampled point is characterized by being closest to a boundary neuron. Boundary neurons are the neurons N(i,j) where i=1 or i=n or j=1 or j=m (Figure 3).
3. The weight of the sampled point is increased by a constant ΔW.
4. Train only the boundary neurons of the SOM using the boundary sampled points.
5. Whenever a processed sampled point is not a boundary point, its weight is reset to 1.
Even with this correction, some points are still left outside the grid. These points are eliminated at this stage and are corrected later during the adaptive parameterization process, as explained in Section 4.2.1.
One more shortcoming of the SOM algorithm is its inability to detect the aspect ratio of the range image. In order to overcome this problem, the SOM algorithm is
SOM Extension: Growing Grid Algorithm
Initialization:
1. The initial SOM grid size is defined as n=3, m=3.
2. Every neuron retains a win counter that accumulates the number of times it was chosen as the neuron closest to a sampled point. This counter is reset to 0 at initialization.
SOM algorithm modification: The grid size is increased by adding one row or column to the grid whenever the number of processed points s is a multiple h of the number of neurons (s = h*n*m), as follows:
1. Find the neuron NB with the largest number of wins.
2. Find the direct neighbor NN of NB with the largest number of wins.
3. If NB and NN are on the same row, add a column of neurons between the columns of NB and NN: increase the network size m by 1 and modify the indices of all the neurons accordingly. Interpolate the positions of the new column's neurons with respect to the neighboring columns.
4. If NB and NN are on the same column, add a row of neurons between the rows of NB and NN: increase the network size n by 1 and modify the indices of all the neurons accordingly. Interpolate the positions of the new row's neurons with respect to the neighboring rows.
5. Reset the win counters of all the neurons to zero and continue the SOM algorithm.
Using the growing grid algorithm improves the speed and stability of the SOM algorithm. Also, the aspect ratio
SOM parameterization is used (Section 4.1.1), the initial 3D approximation of the surface is defined first by the CMA method [6].
of the sample points is better detected using a growing grid, and thus the orientation of the grid is better detected by the SOM. Consequently, the quadrilateral grid cells become more rectangular. Figure 4 shows the progress of the SOM algorithm with the boundary correction and growing grid extension. Figure 5 compares the results to results obtained without these SOM extensions.
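A minimal sketch of the growing grid insertion step (steps 1-5 of the extension algorithm), assuming the neuron grid is a NumPy array with per-neuron win counters:

```python
import numpy as np

def grow_grid(grid, wins):
    """One growing-grid step: insert a row or a column of interpolated neurons
    between the most-winning neuron NB and its most-winning direct neighbor NN,
    then reset the win counters.

    grid: (n, m, 3) neuron positions; wins: (n, m) win counters.
    Returns the enlarged (grid, wins)."""
    n, m, _ = grid.shape
    i, j = np.unravel_index(np.argmax(wins), wins.shape)           # step 1: NB
    # step 2: direct (4-connected) neighbor of NB with the most wins
    nbrs = [(i+di, j+dj) for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))
            if 0 <= i+di < n and 0 <= j+dj < m]
    k, l = max(nbrs, key=lambda t: wins[t])                        # NN
    if i == k:                               # same row -> insert a column
        c = min(j, l)
        new = 0.5 * (grid[:, c] + grid[:, c+1])       # step 3: interpolate
        grid = np.concatenate([grid[:, :c+1], new[:, None], grid[:, c+1:]], axis=1)
    else:                                    # same column -> insert a row
        r = min(i, k)
        new = 0.5 * (grid[r] + grid[r+1])             # step 4: interpolate
        grid = np.concatenate([grid[:r+1], new[None], grid[r+1:]], axis=0)
    return grid, np.zeros(grid.shape[:2])             # step 5: reset counters
```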
RSEC algorithm. In the proposed RSEC algorithm, fitting is performed according to a local rather than a global error. The random concept is based on the SOM method. The main difference is that instead of finding the winner neuron in an array, a winner parameter (u*,v*) is selected on a B-Spline surface. The algorithm is based on the following stages (Figure 6):
1. Pick L sampled points at random from the range image. Randomness is essential for convergence to the desired surface.
2. For each point P_s, find the closest point on the surface S(u_s,v_s), s=1..L:
a. Evaluate the distance of the point P_s from the surface S(u_s,v_s):
SOM comparison to other methods. The SOM method has several advantages:
1. The SOM method detects the orientation of the grid and the position of the four sub-boundaries.
2. The SOM method takes into account all the sampled points, both internal and boundary, and is sensitive to the density of the sampled points. The criterion for spreading the neurons is the minimal quantization error, that is, the minimal sum of distances between the sampled points and their nearest neuron (see [7]). Other training methods, however, can be used to achieve the maximal entropy criterion, by which the neurons are spread so that the number of sampled points in their Voronoi region is the same for every neuron.
The disadvantages of the SOM method are:
1. SOM grids may self-intersect. This is especially problematic for highly concave boundaries.
2. The SOM method is sensitive to training parameters.
3. The SOM method may yield different solutions for the same data due to the random order of the points, especially in 3D for range images with complex boundaries (Figure 5b).
It is important to note that 2D parameterization is best used for a single range image taken from a single direction, while 3D SOM parameterization can handle several merged range images. 3D parameterization is usually superior, but is more sensitive to training parameters than 2D SOM is.
Err_s = P_s - \sum_{i=1}^{n} \sum_{j=1}^{m} B_i(u_s) B_j(v_s) V_{ij}    (6)

b. Correct the surface control points as follows:

V_{ij} = V_{ij} + \eta \, B_i(u_s) B_j(v_s) \, Err_s W_s    (7)

3. Re-parametrize the control points by projecting them onto a base surface (Section 4.2.1). The error is measured, and stage 2 is repeated until the global error converges to a given convergence tolerance.
Equation (7) for the B-Spline control points is analogous to equation (2) for the SOM neural network, but instead of a Gaussian training function, the B-Spline basis functions are used to describe the influence of the winner parameter on the control points. It is important that \eta < 1 so that the local error will decrease. Although this algorithm worked well in practice (Figure 7), there is no proof of its convergence. The algorithm is very conservative in memory usage, since one point is processed at a time and no special data structure is needed beyond the B-Spline surface representation. Time complexity is hard to predict, since measuring the convergence is not as simple as in other iterative methods such as gradient descent. The error measurement process in stage 3 of the algorithm is quite exhaustive. As a rule of thumb, the number of processed points L should be linear in the number of control points of the B-Spline surface, with a large constant (about 40 times the number of control points worked in our examples).
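Equations (6) and (7) amount to the following per-sample update. For brevity this sketch uses the simplest degree-1 (hat) B-spline basis over uniform knots; the paper uses general B-spline bases:

```python
import numpy as np

def hat_basis(u, n):
    """Degree-1 B-spline (hat) basis values B_1..B_n at u in [0, 1]
    over uniform knots; at most two entries are nonzero and they sum to 1."""
    t = u * (n - 1)
    B = np.zeros(n)
    i = min(int(t), n - 2)
    B[i], B[i + 1] = 1 - (t - i), t - i
    return B

def rsec_step(V, P, u, v, eta=0.5, w=1.0):
    """One RSEC correction (eqs. 6-7): move control points V of shape (n, m, 3)
    toward sample P at its closest parameter (u, v). eta < 1 keeps the local
    error shrinking; w is the sample weight W_s."""
    n, m, _ = V.shape
    Bu, Bv = hat_basis(u, n), hat_basis(v, m)
    S = np.einsum("i,j,ijk->k", Bu, Bv, V)                         # surface point
    err = P - S                                                    # eq. (6)
    V += eta * w * Bu[:, None, None] * Bv[None, :, None] * err     # eq. (7)
    return V, err
```

Because the basis values are nonnegative and sum to at most 1 in each direction, a single step with eta < 1 strictly reduces the local residual, mirroring the contraction argument made for eq. (2).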
3.2 The RSEC Surface Fitting Method After the parameterization stage, the neighborhood relations between the points are known. A surface fitting process should then be applied in order to find the best approximation with respect to the sampled points and their parametric values. In this research, B-Spline representation is used as the geometric representation due to its natural compatibility with CAGD. B-Spline surface fitting is performed using Random Surface Error Correction (RSEC) based on the SOM algorithm. Mesh size, degree order and knot vectors of the surface are defined, and an initial B-Spline surface is created prior to initiating the RSEC algorithm. The RSEC fitting algorithm improves the (x,y,z) coordinates of the control points. For 3D SOM parameterization (Section 4.1.2), the initial surface is defined by the 3D parametric grid. However, for the 2D case where 2D
Adaptive parametric correction. The alternating method [15] was chosen as a surface fitting and parametric optimization method due to its small memory consumption and relative simplicity. The role of 3D base surface parameterization is to correct the initial parameterization. Once an approximated surface has been computed (Section 4.2), another parameterization iteration can be calculated where the current surface is
used as the new parametric base surface. This is accomplished by projecting the 3D sample points onto the base surface, as follows: An initial point on the base surface that is close to the sampled point is found through an exhaustive search over the parametric domain of the base surface by steps of Δu, Δv. The point closest to the sample point is chosen as an initial approximation. The sampled point parameter is then corrected iteratively using the Newton method described in [16] until it converges to a given tolerance.

4 Summary & Conclusions

This paper proposes a SOM neural network technique for reverse engineering that is applicable both to parameterization and to surface fitting. The inherent characteristics of the SOM enable it to automatically detect the orientation and sub-boundaries of the grid. Moreover, the initial parameterization takes into account all the sampled points and not only the boundary points. The proposed approach is based on the following stages: (a) reconstructing a parametric base surface by applying a 2D/3D SOM neural network method; (b) adaptively calculating the parameterization of the 3D sampled points; and (c) fitting the surface by applying a B-Spline RSEC method. The RSEC method is based on the SOM neural network method and is competitive with respect to the LSQ method. As a result of applying the proposed parametric and fitting SOM method, the reconstruction process is highly improved. The initial base surface is calculated quickly while preserving the shape characteristics of the reconstructed surface. The correct orientation of the surface is detected while preserving topology under a minimal quantization error criterion. The aspect ratio of the grid is made compatible with the aspect ratio of the range image using the growing grid extension. The boundary of the 3D range image is extracted using either the boundary first extension in 2D or the boundary correction extension in 3D. The RSEC method enables high quality approximation of the final surface. Currently, this work relates to a single sampling view. In the future, the SOM neural network will be enhanced to arbitrary topology in order to create a complete envelope of a volumetric object. Preliminary results have already been achieved in this direction, but they are beyond the scope of this paper.

5 Bibliography

1. M. Alhanaty and M. Bercovier. Curve and Surface Fitting and Design by Optimal Control Methods. Hebrew University, Leibniz Center Technical Report No. 99-29, 1999.
2. J. Barhak and A. Fischer. Parameterization and Reconstruction from 3D Scattered Points Based on Neural Network and PDE Techniques. IEEE Transactions on Visualization and Computer Graphics, 7(1), January-March 2001.
3. S.W. Chen, G.C. Stockman and Kuo-En Chang. SO Dynamic Deformation for Building of 3-D Models. IEEE Transactions on Neural Networks, 7(2):374-387, 1996.
4. I.O. Dellahy, B. Shariat and D. Vandorpe. Conformal Mapping for the Parameterization of Surfaces to Fit Range Data. Product Modeling for Computer Integrated Design and Manufacture. Chapman & Hall, 1997, pp. 263-272.
5. G. Elber. The Irit Solid Modeller, Version 7.0. http://www.cs.technion.ac.il, 1996.
6. A. Fischer, A. Manor and J. Barhak. Adaptive Parameterization for Reconstruction of 3D Freeform Objects from Laser-Scanned Data. IEEE Pacific Graphics '99, Seoul, Korea, October 5-7, 1999.
7. B. Fritzke. Some Competitive Learning Methods. Institute for Neural Computation, Ruhr-Universitat Bochum, report draft, 1997. http://www.neuroinformatik.ruhr-uni-bochum.de/ini/VDM/research/gsn/JavaPaper/
9. P. Gu and X. Yuan. Neural Network Approach to the Reconstruction of Freeform Surfaces for Reverse Engineering. Computer-Aided Design, 27(1):59-69, 1995.
10. B. Guo. Surface Reconstruction from Points to Splines. Computer-Aided Design, 29(4):269-277, 1997.
11. H. Hoppe, T. DeRose, T. Duchamp, M. Halstead, H. Jin, J. McDonald, J. Schweitzer and W. Stuetzle. Piecewise Smooth Surface Reconstruction. SIGGRAPH 1994, pp. 295-301.
12. J. Hoschek, D. Lasser and L. Schumaker (trans.). Fundamentals of Computer Aided Geometric Design. A K Peters, 1993.
13. T. Kohonen. Self-Organizing Maps. Springer, 1997.
14. P. Laurent-Gengoux and M. Mekhilef. Optimization of a NURBS Representation. Computer-Aided Design, 25(11):699-710, 1993.
15. W. Ma and J.P. Kruth. Parameterization of Randomly Measured Points for Least Squares Fitting of B-Spline Curves and Surfaces. Computer-Aided Design, 27(9):663-675, 1995.
16. L. Piegl and W. Tiller. The NURBS Book. Springer, 1997.
17. W.H. Press, B.P. Flannery, S.A. Teukolsky and W.T. Vetterling. Numerical Recipes. Cambridge University Press, 1989.
18. D.F. Rogers and N.G. Fog. Constrained B-Spline Curve and Surface Fitting. Computer-Aided Design, 21(10):641-648, 1989.
19. T. Speer, M. Kuppe and J. Hoschek. Global Reparameterization for Curve Approximation. Computer Aided Geometric Design, 15:869-877, 1998.
20. T. Varady, R.R. Martin and J. Cox. Reverse Engineering of Geometric Models - an Introduction. Computer-Aided Design, 29(4):255-268, 1997.
21. L. Varady, M. Hoffmann and E. Kovacs. Improved Free-form Modelling of Scattered Data by Dynamic Neural Networks. Journal for Geometry and Graphics, 3(2):177-181, 1999.
Figure 4. The progress of the SOM algorithm with the growing grid extension and the boundary correction: (a) random initialization, (b)-(e) intermediate stages, and (f) the final grid.
Figure 5. Comparison of the basic SOM algorithm to the SOM algorithm with the growing grid and boundary correction extensions: (a) the SOM grid with no extensions; (b) the SOM grid with the growing grid extension and boundary correction.
Figure 6. The effect of a single sampled point on the control mesh of a B-Spline surface iso-curve: the closest point to P_s and Err_s are calculated, and the control vertices of S are displaced by eq. (7).
Figure 7. Reconstructed surface with 2D SOM parameterization and RSEC fitting: (a) mask surface shaded, (b) iso-lines, (c) error map, (d) error scale in cm.