在好例子网,分享、交流、成长!
您当前所在位置:首页Others 开发实例一般编程问题 → 点云的特征提取

点云的特征提取

一般编程问题

下载此实例
  • 开发语言:Others
  • 实例大小:4.28M
  • 下载次数:8
  • 浏览次数:107
  • 发布时间:2021-01-25
  • 实例类别:一般编程问题
  • 发 布 人:好学IT男
  • 文件格式:.pdf
  • 所需积分:2
 

实例介绍

【实例简介】
This paper describes a new method to extract feature lines directly from a surface point cloud. No surface reconstruction is needed in advance, only the inexpensive computation of a neighbor graph connecting nearby points.
submitted for publication tetrahedron, we check if the center of its circumsphere is on the same side of the triangle as the fourth point of the tetrahedron. In case of two incident tetrahedra, we check if their circumsphere cen- ters lay on different sides of the triangle The second step of the algorithm is to filter out a reasonable sub- set of the Gabriel complex, which describes the surface. We used two triangle filters. The first was a simplified version of Adamy et al. s umbrella filter, which ensures that the triangles incident on each data point form at least one closed fan to find a good guess for this fan, thc gabriel triangles incident on a point arc sorted accord ing to a parameter, which gives a reasonable measure of whether the triangle should be part of the surface, For each data point triangles are extracted from the sorted list, until the first fan is completed at which point these triangles are validated as part of the output of the filter. We implemented the fan detection and identification with a modified version of a disjoint set data structure that did not prune the trees of the connected sets The simplified umbrella filter closes the holes of the model and connects different sur face parts(see figure 4 a)and c). The aim of the second filter was to recover the holes and to disconnect distant surface parts simply by eliminating triangles that have extraordinar- ily long edges. For this, we first computed the average edge length at each data point. Then we removed all triangles which contained an edge that was three times longer than the average edge length d次 然然入 of the two incident vertices In this way we coverered holes and disconnected nearby but disjoint surface parts(see figure 4 b)and Figure 4: Effects of filtering long triangles: a) closed holes before filter, b)holes recovered, c)inter-surface connections before filter, The delaunay filtering has the advantage that it helps to discon d)surface separated nect data points that lay on different surface parts. Our simplified filter did not serve for surface reconstruction. in cases in which the Models contained sharp edges as figures 3 c)and d)illustrate. But this is not important for our purpose as we detect the creases later of each data point and classifies the points into surface points, po tential crease points and potential border points. For the latter two on and only nccd the neighbor information contained in the pro duced triangles. More serious is that the delaunay filtering does not cases we must determine penalty functions, which measure the un produce enough edges(figure 3 a)) for very noisy input data sets, likelihood of a data point to be on a crease or a border line which causes problems especially for the feature detection stage in The second stage extracts the feature line patterns. The edges which the output is a subset of the neighbor graph. The two Dela of the adjacency graph are weighted with the penalty functions of nay filters are relatively fast and consumed for the brain dataset of their incident data points. We used a modified minimum spanning tree algorithm to extract the feature line patterns figure 13 with 84,000 points about 10 seconds, which is about one third of the time consumed by Qhull 3.1 Point classification Riemannian Graph The Riemannian graph contains for each data point the edges to the k nearest neighbors where we chose k For each data point pi one must gather from the neighbor graph between 10 and 16. For our purposes we could compute a sufficient set Ni of neighbor data points. The number of neighbors collected approximation to the riemannian graph with the approximate near- dcpcnds on thc noisc lcvcl of thc datasct. For data with low or no est neighbor library Ann [4]. For the brain dataset the Riemannian noise it is necessary only to gather the direct neighbors of the data graph was computed in 7 to 10 seconds for h:= 10 to 16. The points while for highly noisy data one must consider all neighbors advantage of the Riemannian Graph is that it can handle noisy data with a graph distance less than three four or five edges depending and consumes much less time than the Delaunay filtering approach on the noise leve From the set of neighbors we extracted two quantities. The cen- ter location Ci and the correlation matrix Ci given by 2.2 Sampling Density In the following computations it was often important to relate the R ∑ computed quantities to the sampling density. Most often we needed q∈N the sampling density in units of length. Therefore we defined the average distance pi of a data point pi with the set Ni of direct neighbor points in the neighbor graph as q∈N ∑|四 The eigenvectors eo, e1, e2 of the correlation matrix together q∈N with the corresponding eigenvalues{λ0,A1,λ2}, where Ao≤ A1 < X2, define the correlation ellipsoid that adopts the genera form of thc neighbor points. Figurc 5 shows the diffcrent shapes 3 Feature detection that we are interested in. It shows in green color the surface typi at the current data point: plane, crease wedge, border half plane In this section we describe how to find the crease and border pat- and corner. The data point is the yellow ball and the correlation terns in a point cloud The first stage analyzes the neighborhood ellipsoid is depicted around the centroid ci in violet submitted for publication to 2d/u and is a good criterion for detecting crease and corner points. We define the curvature estimate ki for each data point pi with distance d: from the fitted plane as de: 2d: By computing the maximum curvature estimation fmax of all data points, wc can dcfinc thc curvature penalty function wr as Rma The correlation ellipsoid tells us more than the approximated normal direction. Figure 5 b)shows that in the case of a crease point the correlation ellipsoid is stretched in the direction of the Figure 5: shape of the correlation ellipsoid around different types crease and the eigenvalues obeyλo≈h1andA+A1≈A2.We encapsulate the information of how well the eigenvalues fit to the of data points: a)surface point, b) crease point, c) border point, d) crcasc casc togcthcr with thc primary direction of thc ellipsoid into corner point a vector valued penalty function wicm For a point on a flat surface the ellipsoid degenerates to a pancake smax、入(m)-A0(m),A入2()=(A0(2)+A1(D) with入1≈A2 and do≈0. The unit vector en points in normal (6) direction, illustrated by the arrow in figure 5 a). The normal dervied In the case of a border point, the ellipsoid degenerates to an el- in this way is actually the sur face normal of a plane fitted to the set lipse. The smallest eigenvalue is still approximately zero and the of neighbors, that minimizes the squared distance to the neighbors. corresponding eigenvector gives a good approximation of the sur- Thc optimal planc must go through thc centroid Ci. The equivalence face normal. The other two eigenvalues obey 2A he of the different normal estimations can be seen easily by looking cigcnvcctor ey of the biggest cigcnvaluc givcs again thc approxi briefly at the least squares problem. Suppose n is the normal of the mate direction of the border loop. This yields the vector valued optimal plane. Then we want to minimize 2oN (n (a-ci))2 border penalty function Gbl under the constraint iiit= 1. By the method of the Langrange multipliers we need to minimize dr|入2(m)-2入1(m) A2(m) ∑n(q-c)(q-c)n+1-分=(C-ADi+入 With the help of the normal direction we can derive a second ∈ penalty function for the border. For this we project the point p to gether with its neighbors into an y-coordinate system orthogonal with the Langrange multiplier A. Notice that uu denotes the outer to the normal direction, where p is the origin as shown in figure 6 rector product, which results in a simmetric matrix, whereasv'u b). Then we sort the neighbors according to their angle around p denotes the dot product. Setting the gradient for n of this equa tion to zero yields exactly the eigenvalue equation C:n=An. The Lct b be the maximum angle interval, that docs not contain any neighbor point. The bigger B is, the higher is the probability that p solution with the minimum residual is given by the eigenvector cor- responding to the smallest eigenvalue is a border point. The corresponding penalty wv2 function is given (P) B(p) °Q4 2 Finally, at a corner(see figure 5 d) the correlation ellipsoid has no preference direction and all eigenvalues should be approxi mately the same. The corner penalty function wco is defined as 入2(m) 3.2 Feature Line Linking Figure 6: a)curvature estimation from fitted plane and average In the previous section we defined several penalty functions for neighbor distance A, b) determination of maximum open angle crease, corner and border detection Now we describe how to use these functions to extract the crease and border patterns. This task For data with low noise we can estimate the average curvature would be much easier if we knew the topology of the crease pattern froIn the least square fitted plane given by (eo, ci). Figure 6 a) and probably also the location of corners shows a two dimensional projection of the data point p with its igure 7 illustrates our approach. Given the penalty functions tangent planc dcgcncratcd to thc horizontal linc. Thc point has the (figurc 7 a)), wc compute a minimum spanning pattcrn on a subsct distance d i(p ci) from its tangent plane. The curvature of the neighbor graph. A minimum spanning pattern allows for radius, shown as a dashed line in the figure intersects the tangent cycles that contain more edges than a user defined constant(figure 7 plane at approximately the average neighbor distance u. From s b)). In a second step we remove short branches of the minimum 2d2 and the curvature k=1/computes spanning pattern(7 c)) submitted for publication a disjoint set data structure is used to determine whether this edge produces a cycle. If not, the edge is added to the crease pattern. In the case in which the edge under consideration produces a cycle in the crease pattern, we check if the produced cycle is longer than a user defined constant p. In that case this edge also becomes part of the crease pattern. Given the number of input points n a good guess for p is vr/2, which reflects the assumption that a cyclic crease should at least have the length of half of the diameter of the described object. The pseudo code for the first part of the feature line detection looks like this: iterate over all edges e in neighbor graph if weight(e). penalty T th queue. insert(e, weight(e)) while not queue empty ()do queue. extractMinimum() Figure 7: Feature line extrac- tion a) correlation tensors(the if disjointSet. findAndUpdate(p, q)or darker the color, the high the attern. pathIsLonger(p, p) then pattern. addEdge(p, g) curvature cstimatc), b)mod fied minimum spanning tree c) short branches removed The method findAndUpdate(p, g) in the disjoint set checks if the edge produces a cycle. If not, it joins the two connected components incident to p and q and returns true. Otherwise it re- In the case of crease pattern extraction we define the weight turns false and the method pathIsLonger(p, p) of the graph wc(e) of an edge e(p q) with length e =lq-pl and direction data structure checks if the cycle is long enough. The latter method e=(g-p/e as is time critical for large values of p. We implemented it with a breadth first search through the graph, in which we kept track of def the visited vertices. We used counters instead of flags for the track (Pq)) a(wr(p)+w(a))+ ing of visited vertices in order to avoid expensive initializations for (1-a)(min ler(p)'el,=wco(p))+ each breadth first search. In this way the search cost was O(p2)in min{问n(q)引u(q)})+ the worst casc. But the vast majority of edges arc addcd to graph components with diameter much smaller than p. In practicc thcrc 2el is hardly any dependence of the run time on p. Even for the brain p)+u(a model with 84,000 vertices and a p of 150 the pattern was extracted Wc takc thc minimum of the directional penalty function wcr and At this point in the process, the extracted pattern looks like the the corner penalty function uco in order to incorporate the corner one in figure? b). There are a lot of short branches that we have to penalty function what improves the locations of the detected crease remove. The basic idea is to remove all branches shorter than p/2 corners. The factor a depends on the amount of noise in the data while still managing to take care of singleton ends, which we do not For data sets with nearly no noise we choose a1/2. In cases of want to cut short. Thus in a first iteration over these points in the high noise levels we reduce a to 0. 2. as the curvature estimate is no pattern that have more than two incident edges, we check at each in- longer very reliable. The edge length term that favors short edges cident edge whether the incident branch is a subtree with maximum is added in order to straighten the feature lines depth less than p/2. The check is performed with a similar breadth corresponding peaty weights of the incident poinl (e)from the To detect border loops we define the edge weigl first searching algorithm as we used for pathIsLonger(p, 9, P If there are at least two branches with depht larger than p/2, we can safely remove all incident branches with smaller depth without af- b(e=(,q)当(b2(p)+a2(q)+ fecting singleton ends. The second step takes care of the singleton (1-1)(dn(p)e+|b(q)2d)+ ends and finds for each tree at a singleton end the longest path with minimum edge weight. All other branches at the singleton end are 2el removed. The final result of this process can be seen in figure 7c) (p)+p1(q We chose the factor y= 1/2, which produced closed border loops 4 Feature Recovery in all cases that we tested For the extraction of the crease pattern as shown in figure 7 b)we For datasets in which the data points lie directly on the feature lines first compute the weights for all edges in the neighbor graph edges the above algorithm robustly extracts the crease and border loop are not considered at all if the portion of the penalty derived from patterns. For datasets scanned from real data with for example a the vertex penalty functions exceeds a tolerance T. The tolerance laser range scanner even without any noise the probability that a tau is a measure of the sensitivity of the crease detection. as all the data point lies on a feature line is extremely low. If the laser beam cnalty functions arc normalized to thc interval [0, 1, it turns out of thc scanning dcvicc directly hits a fcaturc linc, the bcam is dis that a value of t= l is appropriate. if the dataset contains blunt persed and the measurement impossible. Thus we rather have to creases, T will require some fine tuning by the user expect that there are no points directly on the feature lines. As a The remaining edges are entered into a queue. The crease pattern result of the lack of feature points, the output of the feature line is built by considering the edge with the smallest remaining weight. detection algorithm is a zigzag line strip around the actual feature submitted for publication defined by two planes p1 and P2. The dotted line shows how the wedge splits the space into three regions A, B and C. All points in region A are closest to plane p1, the ones in region B to p2, and the points in region C are closest to the intersection line(or in two dimensions, the intersection point) of the two planes. This split into three regions makes a least squares fitting approach extremely dif- ficult, as it is not known in advance which point will fall into which region after the fitting process. Thus the primary question is how to lit the input points into regions The approach described in the next subsection splits the diffi cult three-dimensional problem into a number of simplified two b)三 dimensional problems, where the grouping problem can be solved efficiently. A robust approximate solution to the three-dimensional grouping problem is then composed from the two-dimensional So- lutions via a voting process In this way we can exploit the infor- vailable from the feature detection stage to simplify th feature recovery. After we have derived a reliable grouping we can fit the wedges to the points in the crease neighborhood and project the potential crease points This crease line recovery approach breaks down close to crease junctions as too many data points arc incorporated into the fit- ting process. For this reason we first determine the point group ing around corners. As we exploit the solution strategy of the line d)图 recovery but in a more complicated manner, we describe the line recovery process first Figure 8: a)zigzag crease line, b)zigzag border line, c)smoothed crease, d)smoothed border loops 4.1 Feature Line Recovery line. Figure a)and b) give two examples of this behavior. A sim- ple solution is sufficient for many applications. We just smooth the zigzag away by fitting a low degree spline to the feature lines. An example of the result from such an approach is shown in figures 8 c)and d). As the points on the crease lines are ordered, we can use a standard spline fitting procedure, which estimates the spline param eter values of the input points via a chordal length parametrization and then solves the resulting least squarcs problcm. For this fitting we have used the Nurbs++ library A A b) sp it e:" B A split lines 9: a) the two dimen- sional wedge fitting problem, b)the simplified wedge fit- Figure 10: Crease Recovery a)detected crease pattern, b) smooth- Q ting, c)sweeping through all ing through spline fitting, c)summed grouping weights along crease possible groupings strip, d)points moved onto recovered crease ae. n the case of crease line reconstruction, we propose a better so- Figure 10 demonstrates the different steps of crease line recot on than spline fitting. The basic idea is to reconstruct the crease ery on a very noisy dataset. We generated this dataset by randomly from a sufficiently large neighborhood and then move the points distribution 10,000 points in a volume defined as the solid differ on the zigzag pattcrn linc onto thc reconstructed crcasc. The qucs cncc bctwccn a cubc with edgc lcngth 2 and onc with cdgc length tion is how to reconstruct the crease lines and corner locations at 1.6, both centered around the origin. The result was a cube surface the crease junctions. Given a point close to a crease line and its with 20% noise in all directions neighborhood one would like to fit a wedge to the neighbor points In order to project the wedge fitting problem for each crease Figure 9 a) ill lustrates the fitting in two dimensions. The wedge is point to the two dimensions case as shown in figure 9 a), we need submitted for publication to know the crease direction or a good approximation to it. As the wedge fitting problem. However, each single grouping might con detected feature lines are noisy, we use the spline approximation. tain a few misgrouped data points, for example if the crease has From the spline fitting process we know for each crease point, p a sharp turn. We want to combine as many groupings of adjacent the parameter value tp of its counterpart on the spline s(t). Thus crease points as possible. The maximum subgraph of a crease pat we can compute the smoothed location Ps s(tp) and crease di tern, where the neighboring points split into exactly two groups, is rection ds =s(tp)/s(tp) We extend ds to an orthogonal coordi- either a closed loop or a path connecting two crease points of degree ate system with an ay plane orthogonal to ds and project neighbor unequal two, i. e. crease junctions and singleton ends. We call these points of p to the ay plane arriving at the configuration described maximum crease paths crease strips. For each crease strip we attach a weight field to each of the data point that is adjacent to at least one in figure 9 b) The wedge fitting problem in two dimensions is still very diffi point on the crease strip. After initializing the weights to zero, we cult as wc do not know the zenith location of the wedge. Thus we start a voting process at one end of the strip. We build the a y coordi simplify the problem be assuming that ps is the zenith of the wed nate system orthogonal to the spline direction and find the optimal Furthermore we split the space at the split line into only two regions grouping of the neighbors that solves the simplified wedge fitting A and B, as depicted in figure 9 b). The distances from the points problem. The weights of neighbors are incremented for members in the previously defined region c to the wedge are underestimated of group A and decremented for group B members. The amount of the increment decreases with increasing distance from the wedge (see neighbors qA and qB in figure 9 b) for two examples ). This is not a serious problem for our application as these points are rare zenith. When we move on to the next crease point on the strip and their existance either implies that we detected the crease point we choose I-and y-axes as close as possible to the previous axes wrong or that there is noise in the data such that we can identify the groups A and b of the new grouping The helpful property of the simplified wedge fitting problem is ith the previous groups. In this way we sweep through the strip and increment and decrement weights for the neighbors. a samplc that no matter what the solution is, the points are split into two weighting along one crease strip in the cube model is shown in fig groups at a split linc that gocs through pa. Thcrc arc only n of thesc groupings, where n is the number of points in the neighborhood ure 10 c). After all crease points have contributed to the weights We can enumerate them by sweeping the split line in a rotational we classify the neighbor points into group A if their final weight is motion through the neighbor points as illustrated in figure 9 c). The positive and into group B if the resulting weight is negative. If a weight is very close to zero we do not consider the corresponding final solution is the one with the smallest residual Given the set of neighbor points N and a grouping M1nN2 neighbor point in the final recovery stage N, the solution of the simplified wedge fitting problem is found by In the end we derive a grouping of the crease neighbors that ex solving two least square line fitting problems. To each neighbor ploits all the information we get from the feature detection stage.A group N1 and N2 we fit a line through the origin pa. To avoid th final iteration through the points on the crease strip recovers the ori O(n floating point operations per grouping, we prepare the input gin crease locations. The neighborhood of each point on the strip matrices is now reliably split into two groups. Thus we can fit two planes to the two groups in the least square sense by computing and ana =∑ lyzing thc corrclation matrix for cach group rclativc to the centroid of the group The local crease approximation is the intersection of the resulting planes. We project the crease point orthogonally onto incrementally during the rotation of the split line. For the first split the crease approximation. Figure 10 d) shows the result, a more line location we compute the matrices explicitly. Each time the or less straight line for the cube dataset with 20% noise. There plit linc rotates onc stcp furthcr, only onc point q changcs from are still problems near the crease junctions because we do not ex one group to the next. We subtract qg from the matrix of the group clude neighbors in opposite faces of the creases. The solution to that q leaves and add it to the other matrix. As the matrices are this problem is described in the next section symmetric and we are in two dimensions, the cost is only three multiplications and six additions. Given the matrix mi, the normal to the optimal line is given by the eigenvector of mi correspond 4.2 Feature Junction Recovery ing to the smallest eigenvalue. The following algorithm computes In order to find realistic crease lines and their intersection at creas this eigenvector of a matrix m with three square root operations one division, four multiplications, five additions and some sign and line junctions, we have to solve the corner fitting problem. Suppose binary shift operations we have a junction of k intersecting crease lines. The surface is lit into h regions, which are planar near the junction as depicted in figurc 11 a). The corner, which we want to fit to the data points, a:=(m11-m22)/2 is given by the piecewise planar approximation of the surface at if a =0 and m12=0 then return (1.0 the junction. It is defined by the corner point and k plane normal if m12 =0 then or a least squares fit of a corner to a set of data points we have if a>0 then return(0, 1) to split the data points into k groups, such that the total residual else return (1,0) of square distances is minimized, when k planes are fit to the k if o then groups. Figure ll c)shows such a grouping. This problem is more if 12 >0 then return(v2 difficult than the wedge fitting problem and the number of different else return(v2,√2/2 groupings increases exponentially with k, even if we can arrange the vertices in a linear orde f:=a/(2√a2+m2 Figure 1l illustrates our approach to an approximate solution of 1/2+f the corner fitting problem Suppose we start of with a crease junc d:=sgn(m12)√1/2 tion point pc detected in the feature detection stage. First we find if sgn(a)= sgn(d=-c) then return (c, d) a two dimcnsional projcction of the problcm, in which the points else return (d, -c) in the neighborhood Nc of pc can be sorted around pc according to their angle relative to an arbitrary -axis in the projected space(fig Now we are in the position to efficiently group the neighbor ure 11 b). We can imagine different approaches to do this. First hoods of each crease point optimally according to the simplified we could fit an ellipsoid to the point set [l] and use the direction submitted for publication the grouping terminated alter two or three iterations Given the optimal grouping, we can fit k planes to each of the k groups and reconstruct the corner For k> 3 the planes do not always intersect in a single point. Therefore, we again use a least squares approach to reconstruct the corner location. We minimize the square distance to all of the k planes the tool to solve this min imization problem is also known as quadric error metrics, orignal introduced by Garland[9 in mesh simplification to find optimal lo- cations for the mesh vertice after an edge collapse operation. With thc knowledge of thc vortex grouping around thc junction point, we can finally recover the crease close to the junction by not consider- b ing any of the data points of Nc that do not belong to the two groups Let us summarize. The feature recovery stage first groups the neighborhoods around crease junctions into as many groups as there 唱 are incident faces. From the grouping the corner locations are re- covered. Then the crease lines close to the junctions are recon structed with the knowledge about grouping near the junctions. Fi 2.. nally, the remaining parts of the crease lines are recovered with the Inethod described in the previous section. Figure 11 f shows the final result of the noisy cube. 5 Applications Results 5.1 Surface Meshing Our primary goal in developing the feature extraction algorithIm was as the first stage of a point cloud mesh generation tool. The basic idca behind the point cloud mesh generator is to locally fit polyno mial patches to the point cloud and directly generate a mesh with element sizes that are adapted to curvature and to an anisotropic siz ing function. The mesh is generated with an advancing front algo rithm. While the front advances a new local approximation is com puNted, whenever the front leaves the domain of the current polyno- f) mial patch. An important ingredient is again a neighbor graph that reflects the proximity along the surface. Problems arise as always at Figure 11: Junction Recovery a) a corner is given by a set of planes, the crease lines, where a naively computed neighbor graph connects b) finding a suitable 2d projection, c)initial grouping, d) grouping the different sur face parts incident to the crease. This will dimin along crease strip, e)summed weights after strip grouping, f)cor- sh the quality of the fitted patch near the crease lines significantl ners and creases reconstructed With the help of point grouping along the edges, as described in sec- tion 4.1 and 4.2, we can easily remove all edges that cut the creases and corners. Thus our feature extraction stage not only produces from the centcr to thc junction point as thc projcction direction. A the feature lines for the point cloud mesher, but also a sophisticated second approach would be to trace the normal direction, given from neighbor graph the correlation tensor, from the surrounding points to the junction We used the following approach, which works very well in all test 5.2 Point cloud Enrichment cases we evaluated. We selected the direction from the centroid of the neighborhood to the junction point as the projection direction ∴:的: An initial group of the neighbor points is derived from the spline pproximation of the incident crease lines. We partition the two di mensional space into k angular partitions, which split the neighbor points from Nc into k groups as shown in 11 c) To improve the grouping, we use the weighting idea described in the previous section. We trace along each of the k incident crease strips and solve the simplified wedge fitting problem without con sidering data points in the groups not incident to the strip(see fig- ure 1l d)). Each data point in N accumulates distance-weighted group votes in k weight variables. Each strip is traced until no more Figure 12: The triangulations produced by simple delaunay filter votes are added to the points in N or the end of the strip is reached ing resulting from point cloud enrichment. a) enriched torso model After all incident strips have been traced, a new grouping is derived (compare figure 3 d)), b) fandisk without enrichment, c)enriched from the weights, an cxamplc of thcsc stcps is shown in figurc 11 c) fandisk Each data point in Nc is assigned to the group with the maximum weight. Then we start the whole strip tracing and voting process A very simple but useful application of the feature extraction al over and over again until no data point changes the group anymore gorithm is point cloud enrichment. The goal here is to improve the or a maximum number of iterations is reached. In all our examples resolution of feature lines by resampling the initial point set so that submitted for publication standard algorithms can now handle the features. Figure 12 shows The last three columns tabulate the feature recovery stage. First we two examples, the torso model and the fandisk. The torso is re- specify the graph distance in which we analysed the neighborhood constructed correctly with the simplified delaunay filter. On the fan round corners and crease strips last two columns contain the disk there are problems with a few triangles that connect two differ- run times for the corner and strip recovery, respectively ent sides of a crease. An example can be seen in the middle of the concave crease. As we group the vertices around creases during the feature recovery stage, we could add a third filter for the simplified 6 Conclusion surface reconstruction that removes triangles, that interconnected different sides of a crease. This was not implemented yet 6.1 Summary We have presented a new approach for feature extraction of crease 5.3 Non Photorealistic Rendering and border patterns from point clouds. The proposed method is reasonably fast, robust against noise and adapts easily to non uni- formly sampled point clouds. For noisy data or data that is undcr sampled on thc crcasc lincs, a Icast squarc bascd fcaturc recovery algorithm allows us to project data points that are close to the actual crease junctions and lines onto the features. These properties make our approach an excellent preprocessing step for surface reconstruc tion algorithms. Other applications like point cloud enrichment and non photorealistic rendering have been described b 6.2 Discussion Future work Figure 13: non photorealistic rendering of brain a)surface model reconstructed with delaunay filtering b)extracted feature lines c We presented efficient solutions to the wedge and corner fitting roblem. The information from the feature detection stage was combined rendering ploited to simplify the problems. In future work we plan to termine whether the feature detection and recovery stages can In Nonphotorealistic rendering [ll] surface display is often en- combined to make the algorithm even more robust hanced along feature lines and lines of high curvature. Our first Some datasets contain isolated peeks without any creases. For example of surface enhancement was the bunny model in figure 2, example, the well known cow dataset has two peaks at the horns in which we singled out the crease and border lines. We did the These peaks are singleton points in the crease pattern and not de- same for the brain model as shown in figure 13 tcctcd by our algorithm. Thcir ncighborhood is also quite diffi cult to mesh with polygonal faces. We want to investigate how to 5.4 Results fit cones to the peaks and how to incorporate them into polygonal mesh representations Acknowledgements was supported by al Institutes of health References [1 R.Fisher A. Fitzgibbon, M. Pilu. Direct least-square fitting of ellipses. In Inter Figure 14: Crease patterns on torso, fandisk and bunny model national Conference on Pattern Recognition, August 1996 [2 Udo Adamy, Joachim Giesen, and Matthias John. New techniques for topolog- ically correct surface reconstruction. In T. Ertl, B. Ilamann, and A. Varshney, Figure 14 shows the extracted crease patterns of the torso, the editors, Proceedings Visualization 2000, pages 373-380. IEEE Computer Soci fandisk and the bunny models. The crease patterns of the random ety 'Technical Committee on Computer Graphics, 2000 cube and the brain can be seen in figure 11 f) and 13 c). The surface triangulations were generated with the Delaunay filtering approach [3 Maria-Elcna Algorri and Francis Schmitt. Surfacc reconstruction from unstruc describcd in scction 2.1 tured 3D data. Computer Graphics Farum, 15(1): 47-60, March 1996 Table I gathers more information on the used parameters and the [4] Arya, Mount, Netanyahu, Silverman, and Wu. An optimal algorithm for approx run time consumption of the different stages. All times were mea imate ncarcst neighbor searching fixcd dimensions. JACM: J Journal of the ACM, sured in seconds on a Pentium Ill 600 MHz. The capitalized colums 45,1998 in table I contain timing measurements. The second column con [5 C. Bradford Barber, David P. Dobkin, and Hannu Huhdanpaa. The quick ains the number of input data points. For the analysis stage we hull algorithm for convex hulls. ACM Transactions on Mathematical Software, measured the time consumed by the delaunay filtering split into 22(4):469483, December1996. the run time of qhull and the run time of the filters- see column [6E. Bittar, N.(or R??)Tsingos, and M -P Gascuel. Automatic reconstruction of three. We also built the riemannian graph, which we used for fea- unstructured 3D data: Combining medial axis and implicit surfaces. Computer ture extraction. The timings are in the fourth column and the num- rm,14(3)C457-C/468, September1995 ber of nearest neighbors in the fifth. The next four columns contain [7 A Blake and A. Zisserman. Invariant surface reconstruction using weak conti- the parameter values uscd for fcature detection and the consumed nuity constraints. In Proc. IEEE Computer Vision and Pattern Recognition, July time, which is split into the time for the point classification and the 1985 time for the minimum pattern computation and pruning. The point [8 P. Fua and P Sander. Reconstructing surfaces from unstructured 3d points. In classification consumed most time for the cube dataset with a lot of Defense Advanced Research Projects Ager noise as we used all neighbor points with graph distance up to three. Software and Intelligent Systems Office, January 1992. San Diego, California submitted for publication Model Analysis Feature Detection Feature Recovery naine vertices DELAUNAY RIemann TIME avg dist CORNERS STRIPS torso 71‖[0.15+005 0.0613 0.50.515 0.05 0.07 random cube10,000‖25-1.22160.2505[40[10z+1.4‖ fandisk 16.477 6.4+2.0 3.1313‖0.670.540 2.9 043 bunny 9.5+5.1 2.7[1006670.54035+08 0.8 2.3 brain 83.835 26+10 0.50.580 10+3.5 Table l Parameter values and run time measurements for the different feature extraction stages measured on different models the times were measured in seconds on a pentium iii with 600 Mhz [9]M. Garland and P. S. Heckbert. Surface simplification using quadric error met- rics. In SIGGRAPH'97 Conference Proceedings, pages 209-216, 1997 10 G. Guy and G. Medioni. Inference of surfaces, 3d curves and junctions from sparse, noisy 3d data, 1997. [11] John Lansdown and Simon Schofield. Expressive rendering: a review of nonpho- torealistic techniques IEEE Computer Graphics and Applications, 15(3): 29 May 1995 [12]R. Mencl and H. Miller. Graph-based surface reconstruction using structures in scattered point sets. In Franz-Erich Wolter and Nicholas M. Patrikalakis edi dings of the Conf (CG1-98), pages 298-311, Los Alamitos, California, June 22-26 1998 IEEE Computer societv [13 S. Choi N. Amenta and R Kolluri. The powcr crust. In to appear in ACM Symposium on Solid Modeling and Applications, 2001 [14S. S. Sinha and B G. Schunck. A Lwo-slage algorithIn for discontinuity- preserving surface reconstruction. IEEE Transactions on Pattern Analysis and Machine intelligence, 14(1): 36-55, January 1992 [15 Richard Szeliski and David Tonnesen. Surface modeling with oriented particle systems. volume 26, pages 185-194, July 1992 [16 Demetri Terzopoulos and Dimitri Metaxas. Dynamic 3D models with local and global deformations: Deformable superquadrics. IEEE Tr nsactions on pattern Analysis and Machine Intelligence, PAM1-13(7): 703-714, July 1991 【实例截图】
【核心代码】

标签:

实例下载地址

点云的特征提取

不能下载?内容有错? 点击这里报错 + 投诉 + 提问

好例子网口号:伸出你的我的手 — 分享

网友评论

发表评论

(您的评论需要经过审核才能显示)

查看所有0条评论>>

小贴士

感谢您为本站写下的评论,您的评论对其它用户来说具有重要的参考价值,所以请认真填写。

  • 类似“顶”、“沙发”之类没有营养的文字,对勤劳贡献的楼主来说是令人沮丧的反馈信息。
  • 相信您也不想看到一排文字/表情墙,所以请不要反馈意义不大的重复字符,也请尽量不要纯表情的回复。
  • 提问之前请再仔细看一遍楼主的说明,或许是您遗漏了。
  • 请勿到处挖坑绊人、招贴广告。既占空间让人厌烦,又没人会搭理,于人于己都无利。

关于好例子网

本站旨在为广大IT学习爱好者提供一个非营利性互相学习交流分享平台。本站所有资源都可以被免费获取学习研究。本站资源来自网友分享,对搜索内容的合法性不具有预见性、识别性、控制性,仅供学习研究,请务必在下载后24小时内给予删除,不得用于其他任何用途,否则后果自负。基于互联网的特殊性,平台无法对用户传输的作品、信息、内容的权属或合法性、安全性、合规性、真实性、科学性、完整权、有效性等进行实质审查;无论平台是否已进行审查,用户均应自行承担因其传输的作品、信息、内容而可能或已经产生的侵权或权属纠纷等法律责任。本站所有资源不代表本站的观点或立场,基于网友分享,根据中国法律《信息网络传播权保护条例》第二十二与二十三条之规定,若资源存在侵权或相关问题请联系本站客服人员,点此联系我们。关于更多版权及免责申明参见 版权及免责申明

;
报警