

Regularized Principal Component Analysis


        Yonathan AFLALO Ron KIMMEL

        (Dedicated to Haim Brezis on the occasion of his 70th birthday)

        1 Introduction

One of the most popular procedures used for data representation is the principal component analysis (PCA for short). Based on recent results reported in [2] that proclaim the optimality of eigenfunctions of the Laplace-Beltrami operator (LBO for short), we revisit the construction of PCA for geometric structures and propose to redesign one of the fundamental tools in information analysis. Provided smoothness of a given data entity, we could exploit the intrinsic geometric structure to regularize the construction of a basis by which the observed data is represented. The same LBO that, when applied to the coordinates of a surface, produces the mean curvature vector, can be decomposed to provide a representation space optimized for both internal structure and external observations. As a motivational sketch, the reconstruction of the horse model, given the training set, is demonstrated in Figure 1. The reconstruction is computed by projecting the coordinates of the model onto the first 100 eigenvectors of the PCA (left), the LBO (middle), and the suggested hybrid measure (right). In this case, the proposed hybrid model best represents the shape of the desired model. It allows the horse to rotate its head, a pose that does not exist in the training set, and thus could not have been realized by a convex combination of its decomposition captured by the PCA basis.

Figure 1 The reconstruction of a surface of a horse by projecting its coordinates onto the first 100 PCA eigenvectors (left) extracted from training on two horses, on the LBO eigenvectors (middle) extracted from one of the training horses, and the proposed hybrid (right).

        2 Ubiquitous LBO

We start with the simple example of a scalar periodic function $f$ defined on a closed curve $C: [0,L) \to \mathbb{R}^2$ in the plane, such that one could write $f: [0,L) \to \mathbb{R}$. Here, $L$ is the length of the planar curve, and $s \in [0,L)$ denotes the arclength parametrization along the curve $C(s) = \{x(s), y(s)\}$. The second order derivative of $f(s)$ can be written as $f_{ss}(s) = \partial_{ss} f(s)$. If instead of a general $f$, we differentiate the curve's coordinates, we obtain $\partial_{ss} C(s) = C_{ss}(s) = \{x_{ss}(s), y_{ss}(s)\} = \kappa(s)\vec{n}(s)$, where $\kappa$ is the curvature of the curve, and $\vec{n}$ its normal. Next, consider the operator $\partial_{ss}$ in its discrete setting along the curve. That is, assume uniform sampling of $n$ points along the closed curve, with spacing $h = L/n$, such that the operator is approximated by the circulant matrix
$$
\partial_{ss} \approx \frac{1}{h^2}
\begin{pmatrix}
-2 & 1  &        &        & 1\\
 1 & -2 & 1      &        &  \\
   & \ddots & \ddots & \ddots & \\
   &        & 1      & -2     & 1\\
 1 &        &        & 1      & -2
\end{pmatrix}.
$$

The discrete operator is a circulant matrix which is diagonalized by the discrete Fourier transform. In other words, the eigenvectors of the matrix are the sine and cosine functions uniformly sampled along the closed curve. It appears that, ordering these eigenvectors in an ascending order by their corresponding eigenvalues and picking only the first vectors as the representation sub-space, one obtains an optimal subspace for representing smooth functions defined on the curve. In signal processing terminology, we say that a low pass filter, applied to the family of functions with bounded gradient magnitude, is optimal in terms of representation error. In approximation theory terms, projecting bounded gradient functions onto the leading eigenvectors is optimal in a worst case $L_2$ sense.
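The following short numerical sketch (not part of the original text; plain NumPy with an assumed curve length and sample count) builds this circulant second-difference matrix and verifies that a sampled Fourier mode is indeed one of its eigenvectors.

    # Sketch (hypothetical parameters): discrete periodic second-derivative operator
    # on n uniform samples of a closed curve of length L_curve.
    import numpy as np

    n, L_curve = 256, 2.0 * np.pi
    h = L_curve / n                                   # uniform arclength spacing

    # Circulant finite-difference approximation of d^2/ds^2 with periodic boundary.
    D2 = -2.0 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)
    D2[0, -1] = D2[-1, 0] = 1.0
    D2 /= h ** 2

    # Sampled mode cos(2*pi*m*s/L_curve) for an arbitrary integer frequency m.
    s = np.arange(n) * h
    m = 3
    v = np.cos(2.0 * np.pi * m * s / L_curve)

    # For a circulant matrix this is an exact eigenvector; its eigenvalue
    # -(2/h^2)(1 - cos(2*pi*m/n)) approximates -(2*pi*m/L_curve)^2.
    lam = -2.0 / h ** 2 * (1.0 - np.cos(2.0 * np.pi * m / n))
    print(np.allclose(D2 @ v, lam * v))               # True
    print(lam, -(2.0 * np.pi * m / L_curve) ** 2)     # close for m << n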

The second derivative operator that generated the curvature and provided the Fourier transform as an efficient representation space can be extended to more interesting domains. The second order derivative of a scalar function in a flat domain is known as the Laplacian. Its extension to curved domains is the Laplace-Beltrami operator, or LBO.

The influence of the Laplace-Beltrami operator can be found in all modern fields of science. Its eigenspace is used to analyze and represent structures in graph theory, geometry, chemistry, biology, physics and machine learning. Its action on the coordinates of a surface produces the mean curvature vector, while its decomposition as an operator defined for regularly sampled periodic scalar functions yields the vectors that compose the Fourier transform. Here, we exploit some of the LBO's nice properties in order to accurately and efficiently represent signals in a way that goes beyond the classical convex hull of the given observation set. To that end, we revisit the notion of principal component analysis (PCA for short) in data understanding and couple it with a regularization term in an attempt to construct a natural orthonormal low dimensional representation space. We relate our construction to the Laplace-Beltrami operator that is known to provide an optimal basis for smooth functions on manifolds (see [2]). The proposed space is demonstrated to provide a compact and efficient representation of articulated objects. The surface coordinates at one end, captured by the classical PCA, and the intrinsic geometric structure of the objects, expressed by the metric at the other end, support one another in reconstructing the object from a small number of projections. Out-of-sample poses are encapsulated by the metric, while the fine details of the shapes are handled by the principal components. It is worth noting that a similar motivation led the authors of [17] to construct a different numerical solver for a dual regularized PCA problem, while different (non-geometric) smooth PCA methods can be found, for example, in [28]. For further reading on regularization methods using the LBO for other problems, see [19, 21–22, 31].

        3 History in a Brief

A classical problem in machine learning is one in which the data is to be represented by a linear combination of a small set of vectors that belong to a specific orthonormal basis. The most common approach in data representation is probably the principal component analysis method known as PCA (see [18]). Related to the PCA, we have multidimensional scaling (MDS for short) methods (see [7]) and generalized MDS (GMDS for short) (see [9]). Observing the data from a different geometric perspective, local relations between data points can construct a manifold for which a Laplacian could be defined. The spectral domain of such a Laplacian could serve for data representation. We focus on a specific data representation application, namely shape reconstruction, that would allow us to demonstrate the regularized PCA procedure. Reviewing all existing shape representation techniques is obviously beyond the scope of this paper; yet, it is interesting to mention Karni and Gotsman [20], who used the LBO spectral domain to approximate coordinates of shapes, while Lévy [23] filtered the spectral domain to deform shapes. Eigenfunctions and eigenvalues of the LBO are often used to approximate diffusion distances on manifolds (see [6, 11–13, 27]). The spectral domain provides an efficient way to construct shape descriptors like the scale-space representation (see [35]), the heat kernel signature (HKS for short) (see [33, 15]), the wave kernel signature (WKS for short) (see [3]), and the global point signature (GPS for short) (see [30]). The eigenfunctions were also used for dimensionality reduction for object recognition, as demonstrated in [29].

Traditional signal representation techniques resort to low pass filters in the Fourier domain that are classically justified through statistical reasoning, like the optimality of the Karhunen-Loève transform that coincides with the Fourier transform in some cases. Related stochastic geometry processing reasoning motivated Ben-Chen et al. [5] to support the eigenfunctions of the LBO as a convenient representation space. Recently, with Brezis, we have been able to prove that the LBO spectral domain is optimal for the representation of smooth functions on the manifold (see [2], where a deterministic optimality proof was provided, based on the Courant-Fischer min-max principle in [34], and Problems 37 and 49 in [8]). The LBO eigenspace was also suggested by Belkin and Niyogi [4] to represent information about data manifolds. The number of required eigenfunctions was estimated in [36] (see [32, 16, 14] for related approaches).

The generalization of Fourier produced by the LBO optimally handles signals of bounded gradients. In some cases, more information about the data we would like to represent is provided, say, samples of typical signals. Assuming there is a low rank description of the information, we could use the observations to estimate a low dimensional representation space. Technically, instead of considering all possible periodic functions, assuming that we are given $k$ observations $\{f_i\}_{i=1}^{k}$, where $f_i: [0,L) \to \mathbb{R}$, we represent the space of functions which we are interested in. Next, denote by $x_i \in \mathbb{R}^n$ the sampled $f_i(s)$ at $n$ equally spaced points.

When dealing with observations of discrete data, the most popular tool for finding a low rank representation space is probably the PCA algorithm. It finds an orthonormal basis of $m$ vectors $P_j \in \mathbb{R}^n$, denoted by $P_{n\times m}$, out of $k$ sample vectors $x_i \in \mathbb{R}^n$, denoted by $X_{n\times k}$, for $k \ge m$.

The PCA algorithm searches for a low rank space $P$ that would best describe these observations. In a flat domain, the PCA algorithm computes an orthonormal basis $P$ that minimizes the error of projecting $x_i$ onto $P$, namely,
$$
P = \arg\min_{P \,:\, P^T P = I} \sum_{i=1}^{k} \| x_i - P P^T x_i \|^2, \tag{3.1}
$$

where the projection of $x_i$ onto the basis $P$ is given by $P P^T x_i$. Here, $I$ denotes an identity matrix of size $m \times m$. Using these notations, one could show that
$$
\sum_{i=1}^{k} \| x_i - P P^T x_i \|^2 \;=\; \| X - P P^T X \|_F^2 \;=\; \operatorname{trace}(X^T X) - \operatorname{trace}(P^T X X^T P). \tag{3.2}
$$

Recalling that $\operatorname{trace}(X^T X)$ does not depend on $P$, the minimization (3.1) can be easily shown to be equivalent to the maximization problem
$$
P = \arg\max_{P \,:\, P^T P = I} \operatorname{trace}(P^T X X^T P). \tag{3.3}
$$

The $P$ that solves (3.3) can be extracted by a singular value decomposition (SVD for short) of $X$.
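As an illustration of this step, the following sketch (an assumed toy setup, not the paper's code) extracts $P$ from the SVD of a synthetic low rank $X$ and evaluates the projection error of (3.1).

    # Sketch (synthetic data): PCA basis of X via SVD and the projection error of (3.1).
    import numpy as np

    rng = np.random.default_rng(0)
    n, k, m = 500, 40, 10
    X = rng.standard_normal((n, 5)) @ rng.standard_normal((5, k))  # low rank observations
    X += 0.01 * rng.standard_normal((n, k))                        # mild noise

    U, svals, Vt = np.linalg.svd(X, full_matrices=False)
    P = U[:, :m]                          # maximizes trace(P^T X X^T P) under P^T P = I

    residual = X - P @ (P.T @ X)          # all projection errors at once
    print(np.linalg.norm(residual, 'fro') ** 2)   # small, since rank(X) is about 5 < m
    print(np.allclose(P.T @ P, np.eye(m)))        # orthonormality P^T P = I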

        4 Intrinsic LBO Versus Extrinsic PCA

A shape representation model that is invariant to poses of articulated objects and to gestures of most creatures in nature is the isometry model. As empirically shown in [10] for the case of facial expressions, the intrinsic geometry of a facial surface is more or less preserved for the same subject when smiling, posing a neutral expression, or expressing sadness. Isometry, or length preserving mappings, were thus used for expression invariant face recognition. At the other end, the coordinates of the surface, that can be thought of as functions defined on the surface manifold, determine the pose itself.

We suggest considering two geometric structures: the intrinsic one, defined by the differential relations between surface points, also known as the metric, and the extrinsic one, defined by the coordinates themselves. As shown in [2], smooth functions defined on the surface are optimally approximated by the leading eigenfunctions of the Laplace-Beltrami operator. In some cases, more information about functions defined on the surface is provided, like examples of the surface coordinates that we would like to represent, given in different poses. In that case, the leading terms, or the principal components derived through a PCA method, could provide the best representation space for realizations within the convex hull of the observed functions. The combination of both intrinsic structure and extrinsic observations could allow us to enjoy the best of both worlds. The result that we are looking for would be an orthonormal representation space that is optimal in terms of projection onto the leading subspace. It could faithfully approximate realizations outside the convex hull of the observed functions. Here, we propose to combine the knowledge of both the intrinsic geometry, captured by the decomposition of the LBO, and the extrinsic examples, apprehended by the PCA, to construct a naturally regularized PCA.

        5 Regularized-PCA Formulation

Principal component analysis, introduced in [25], as a classical method in data representation and analysis, also plays a major role in modeling and studying surfaces.

Given the surface $S$, the principal components of $k$ given scalar continuous functions, $f_i: S \to \mathbb{R}$, are defined by the orthonormal basis $\{\psi_j\}$ that is obtained as a solution of
$$
\{\psi_j\} = \arg\min_{\{\psi_j\}\,:\,\langle \psi_p, \psi_q\rangle = \delta_{pq}} \; \sum_{i=1}^{k} \Big\| f_i - \sum_{j=1}^{m} \langle f_i, \psi_j\rangle \, \psi_j \Big\|^2 \tag{5.1}
$$

for any given $m$. Up to this point, the surface $S$ and its metric $g$ do not play a major role in the definition of $\{\psi_j\}$. The manifold $S$ and its metric $g$ should allow us to better tune the representation space. In the above definition of continuous principal components, there is no smoothness assumption on the representation space $\{\psi_j\}$ over the manifold.

Next, assume that the functions that we would like to represent have a bounded gradient magnitude $\|\nabla_g f\|_g$ on the surface $S$. Such smooth functions are best represented by the eigenfunctions of the LBO, and those can be obtained as a minimization of the Dirichlet energy
$$
\{\psi_j\} = \arg\min_{\{\psi_j\}\,:\,\langle \psi_p, \psi_q\rangle_g = \delta_{pq}} \; \sum_{j} \int_S \|\nabla_g \psi_j\|_g^2 \, da. \tag{5.2}
$$

Next, let us translate the general metric case into its discrete form. The metric $(g_{ij})$ defines the mapping between an arc length on a given parametrization space and length on the given manifold. An area element can then be defined by $da = \sqrt{\det g}\, du\, dv$, and in a discrete setting, one can define a diagonal matrix $A$ of area elements. For example, for a triangulated surface, the area elements $A_{ii}$ are proportional to the sum of the areas of all triangles that share the vertex $i$ (see Figure 2).

Figure 2 The area of the Voronoi cell about the vertex $v_i$ (red region) defines the entry $A_{ii}$ of the diagonal matrix $A$.
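For concreteness, the following sketch (assumed mesh arrays $V$ and $F$, not from the paper) assembles the diagonal matrix $A$; it uses the simple barycentric rule, one third of each incident triangle area per vertex, as a common stand-in for the Voronoi areas of Figure 2.

    # Sketch (hypothetical mesh arrays): V is |V| x 3 vertex coordinates, F is |F| x 3
    # triangle indices; each vertex collects one third of every incident triangle area.
    import numpy as np
    from scipy import sparse

    def vertex_area_matrix(V, F):
        e1 = V[F[:, 1]] - V[F[:, 0]]
        e2 = V[F[:, 2]] - V[F[:, 0]]
        tri_area = 0.5 * np.linalg.norm(np.cross(e1, e2), axis=1)  # per-triangle areas
        a = np.zeros(len(V))
        for corner in range(3):                                    # scatter a third per corner
            np.add.at(a, F[:, corner], tri_area / 3.0)
        return sparse.diags(a)

    # Tiny example: a single right triangle of area 0.5.
    V = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.]])
    F = np.array([[0, 1, 2]])
    print(vertex_area_matrix(V, F).diagonal())         # each vertex gets 0.5 / 3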

On the surface $S$, the geometric projection of $x_i$ onto $P$ reads $P P^T A x_i$, and an orthonormal $P$ is defined by $P^T A P = I$. The geometric PCA reads
$$
P = \arg\min_{P \,:\, P^T A P = I} \sum_{i=1}^{k} \| x_i - P P^T A x_i \|_A^2, \tag{5.3}
$$

which can be shown to be equivalent to solving
$$
P = \arg\max_{P \,:\, P^T A P = I} \operatorname{trace}(P^T A X X^T A P). \tag{5.4}
$$
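Since $A$ is diagonal and positive, (5.4) reduces to an ordinary SVD after the change of variables $Q = A^{1/2} P$. A minimal sketch of this reduction (assumed inputs: $X$ with the sampled functions as columns, and the diagonal of $A$):

    # Sketch (assumed inputs): geometric PCA of (5.4); with Q = A^(1/2) P the problem
    # becomes an ordinary SVD of A^(1/2) X, and P = A^(-1/2) Q satisfies P^T A P = I.
    import numpy as np

    def geometric_pca(X, A_diag, m):
        sqrtA = np.sqrt(A_diag)
        Q, svals, _ = np.linalg.svd(sqrtA[:, None] * X, full_matrices=False)
        return Q[:, :m] / sqrtA[:, None]   # columns are the area-weighted principal vectors

    # Usage (hypothetical): P = geometric_pca(X, A.diagonal(), m=100)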

Next, we would also like to incorporate the natural smoothness of the signals that we would like to represent. It turns out that the geometric Dirichlet energy produces the Laplace-Beltrami operator that would have yielded the generalized Fourier basis. On a manifold, the discrete Laplacian can be defined, for example, by the general form $L = A^{-1} W$. A specific realization of these weights for two dimensional triangulated surfaces is given by the cotangent weights scheme (see, for example, [26, 24]). The Laplacian for a triangulated surface given by its vertices $\{v_i\}$ and edges $\{e_{ij}\}$ can be approximated by $L = A^{-1} W$, where $A$ is the diagonal matrix of the areas of the Voronoi cells about each vertex (see Figure 2).

The matrix $W$ is defined by the cotangent weights
$$
(W f)_i = \sum_{v_j \in N_1(v_i)} \frac{\cot \alpha_{ij} + \cot \beta_{ij}}{2}\,\big(f(v_i) - f(v_j)\big), \tag{5.5}
$$

where $N_1(v_i)$ is the set of 1-ring (neighboring) vertices about $v_i$, and $\alpha_{ij}$ and $\beta_{ij}$ are the angles opposite to the edge $(v_i, v_j)$ (see Figure 3).

Figure 3 Cotangent weights as defined in equation (5.5).
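The following sketch (same assumed mesh arrays as above, not from the paper) assembles the sparse cotangent matrix $W$ so that $(Wf)_i = \sum_j w_{ij}(f_i - f_j)$, and checks that constant functions carry zero Dirichlet energy.

    # Sketch (hypothetical mesh arrays): sparse cotangent matrix W with
    # (W f)_i = sum_j (cot(alpha_ij) + cot(beta_ij)) / 2 * (f_i - f_j).
    import numpy as np
    from scipy import sparse

    def cotangent_matrix(V, F):
        I, J, S = [], [], []
        for a, b, c in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
            # The angle at vertex F[:, c] faces the edge (F[:, a], F[:, b]).
            u = V[F[:, a]] - V[F[:, c]]
            v = V[F[:, b]] - V[F[:, c]]
            w = 0.5 * np.einsum('ij,ij->i', u, v) / np.linalg.norm(np.cross(u, v), axis=1)
            I += [F[:, a], F[:, b], F[:, a], F[:, b]]
            J += [F[:, b], F[:, a], F[:, a], F[:, b]]
            S += [-w, -w, w, w]                        # off-diagonal -w, diagonal +w
        return sparse.coo_matrix((np.concatenate(S), (np.concatenate(I), np.concatenate(J))),
                                 shape=(len(V), len(V))).tocsr()

    # Constant functions span the null space of W (zero Dirichlet energy).
    V = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.], [1., 1., 0.]])
    F = np.array([[0, 1, 2], [1, 3, 2]])
    print(np.abs(cotangent_matrix(V, F) @ np.ones(4)).max())       # ~0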

For the $j$-th column of $P$ we have the discrete Dirichlet energy
$$
\int_S \|\nabla_g \psi_j\|_g^2 \, da \;\approx\; P_j^T W P_j, \tag{5.6}
$$

and for the basis $P$ as a whole, optimizing for the geometric discrete Dirichlet energy reads
$$
P = \arg\min_{P \,:\, P^T A P = I} \operatorname{trace}(P^T W P). \tag{5.7}
$$

We could replace the above minimization problem with the following maximization one
$$
P = \arg\max_{P \,:\, P^T A P = I} \operatorname{trace}(P^T W^{+} P), \tag{5.8}
$$

where $W^{+}$ is the pseudo-inverse, also known as the Moore-Penrose matrix inverse, of $W$.
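In practice, the minimizer of (5.7) is spanned by the eigenvectors of the generalized problem $W\varphi = \lambda A \varphi$ with the smallest eigenvalues. A minimal sketch, assuming the helpers above and SciPy's shift-invert Lanczos solver:

    # Sketch (assumed helpers above): the LBO basis as the generalized eigenvectors
    # W phi = lambda A phi with the smallest eigenvalues, A-orthonormal by construction.
    from scipy.sparse.linalg import eigsh

    def lbo_basis(W, A, m):
        # Shift-invert around a value slightly below zero reaches the small end of the
        # spectrum even though W is singular on constant functions.
        vals, vecs = eigsh(W.tocsc(), k=m, M=A.tocsc(), sigma=-1e-8, which='LM')
        return vals, vecs                  # vecs.T @ A @ vecs is (numerically) the identity

    # Usage (hypothetical mesh): A = vertex_area_matrix(V, F); W = cotangent_matrix(V, F)
    # lam, Phi = lbo_basis(W, A, m=100)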

        6 Secrets of Marriage

In [1], it was proven that an optimal and unique solution to the regularized-PCA problem
$$
P = \arg\min_{P \,:\, P^T A P = I} \; \sum_{i=1}^{k} \| x_i - P P^T A x_i \|_A^2 + \mu \operatorname{trace}(P^T W P), \tag{6.1}
$$

whose solution is equivalent to that of
$$
P = \arg\max_{P \,:\, P^T A P = I} \operatorname{trace}\big(P^T (A X X^T A - \mu W) P\big), \tag{6.2}
$$

can be realized by decomposing the matrix $M = A X X^T A - \mu W$. The leading eigenvectors of the decomposition of $M$ define $P$. Another option is treating the intrinsic smoothness, given by equation (5.8), and the PCA, given by equation (5.4), as eigendecomposition maximization problems, which again allows us to unite them into a single regularized-PCA problem
$$
P = \arg\max_{P \,:\, P^T A P = I} \operatorname{trace}\big(P^T (A X X^T A + \mu W^{+}) P\big). \tag{6.3}
$$

Here, the matrix that we have to decompose, namely $A X X^T A + \mu W^{+}$, is positive definite and symmetric. Most numerical decomposition algorithms extract the eigenvectors of a given matrix one at a time, where the order is determined by the magnitude of the corresponding eigenvalues. The basic operation of such procedures, also known as the Arnoldi iteration, involves the multiplication of the matrix with a vector. It is obviously efficient for sparse matrices. Here, we are combining a low-rank (PCA) part, given by $A X X^T A$, with a sparse LBO part, given by $W$. This combination poses a numerical challenge.

As a remedy, we propose to apply a small size SVD to $X_{n\times k}$, such that
$$
X = U D V^T, \tag{6.4}
$$

where $D$ is diagonal, and $U$ and $V$ are both orthonormal matrices. Denoting $\hat{U} = A U$, we can write
$$
A X X^T A = \hat{U} D^2 \hat{U}^T. \tag{6.5}
$$

Given a general vector $y$, the multiplication
$$
\big(A X X^T A + \mu W^{+}\big)\, y \;=\; \hat{U} D^2 \hat{U}^T y + \mu W^{+} y \tag{6.6}
$$

        can be efficiently executed by the following procedure.

1. Compute $z_1 = \hat{U}^T y$.
2. Compute $z_2 = D^2 z_1$.
3. Compute $z_3 = \hat{U} z_2$.
4. Solve the sparse system $W b = y$ for $b$, realizing $b \approx W^{+} y$.
5. Return $z_3 + \mu b$.

Let us analyze the complexity of the above procedure. The first step takes $O(kn)$, after $\hat{U}$ is truncated to an $n \times k$ matrix, where $k$ is the number of required PCA components, and $k \ll n$. The result, a $k \times 1$ vector, is then multiplied by $D^2$, which takes $O(k^2)$. The complexity of the third step is $O(kn)$, obtained by computing $\hat{U}(D^2 \hat{U}^T y)$. Next, numerically solving the sparse system $y = W b$, with $O(n)$ elements in $W$, can be efficiently performed by classical procedures. The above marriage of a low rank term with a sparse structured part is computed very fast with very low computational requirements. For example, a hundred eigenvectors for a 10,000-vertex surface can be computed in a few seconds using Matlab on a modern laptop computer. Note also that the above procedure exploits both the sparsity of $W$ and the low rank property of $X$ within a unified procedure.
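A minimal sketch of this low-rank-plus-sparse matrix-vector product, wrapped as a linear operator and handed to a Lanczos eigensolver (assumed inputs $X$, $A$, $W$ and a weight $\mu$; for clarity it extracts plainly orthonormal eigenvectors, while the area-weighted constraint $P^T A P = I$ of Section 5 would call for a generalized eigensolve; a tiny diagonal shift stands in for the pseudo-inverse of the singular $W$):

    # Sketch (assumed inputs X, A, W, mu): the product (A X X^T A + mu W^+) y realized
    # without forming a dense n x n matrix, then handed to a Lanczos eigensolver.
    import numpy as np
    from scipy import sparse
    from scipy.sparse.linalg import LinearOperator, eigsh, splu

    def regularized_pca_basis(X, A, W, mu, m):
        n, k = X.shape
        U, D, _ = np.linalg.svd(X, full_matrices=False)    # small SVD, X = U D V^T
        U_hat = A @ U                                      # so A X X^T A = U_hat D^2 U_hat^T
        D2 = D ** 2
        # Factor W once; a tiny diagonal shift keeps the singular W invertible, and the
        # solve below stands in for W^+ y.
        lu = splu((W + 1e-9 * sparse.identity(n)).tocsc())

        def matvec(y):
            low_rank = U_hat @ (D2 * (U_hat.T @ y))        # O(kn) + O(k) + O(kn)
            return low_rank + mu * lu.solve(y)             # sparse solve for the smooth part

        M_op = LinearOperator((n, n), matvec=matvec, dtype=float)
        # Leading eigenvectors of the hybrid matrix; an A-orthonormal basis would be
        # obtained from the corresponding generalized problem instead.
        return eigsh(M_op, k=m, which='LA')

    # Usage (hypothetical): vals, P = regularized_pca_basis(X, A, W, mu=1e-3, m=100)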

        7 Experimenting with Shapes

Figure 4 Top row: Training set. Second row: Test set. Third row: Projection onto the 100 leading eigenfunctions of the LBO. Fourth row: Projection onto the first 100 principal component vectors. Bottom row: Projection onto the first 100 vectors computed by the regularized-PCA hybrid model.

In the following example, shown in Figure 4, we demonstrate the power of the regularized PCA in modeling out-of-sample shapes. We treat the X, Y and Z coordinates of given shapes as training vectors for a PCA procedure. We also use the intrinsic geometry of one of the shapes, which is assumed to be similar for all poses, to define a Laplace-Beltrami operator for which we computed the leading eigenfunctions (corresponding to the smallest eigenvalues). The coordinates of five shapes, which differ substantially from the training ones, are then projected onto the first 100 leading eigenvectors of the LBO (third row), of the PCA (fourth row), and of the regularized model (bottom row). From Figure 4, it appears that the intrinsic geometry provided by the leading eigenfunctions of the LBO captures the general pose while giving up the fine details. The PCA subspace, at the other end, captures the fine details and poses within the convex hull of the given training set, while failing to represent poses beyond the observed ones. The hybrid model captures both poses and details, and obviously enjoys the best of both representation spaces.

        8 Happy Ending

The LBO $\Delta_g$ applied to the coordinates of a surface $S$ yields the mean curvature vector, $H\vec{N} = \Delta_g S$. At the spectral end, smooth functions on the manifold, like shape coordinates, are best represented by the leading eigenfunctions of the LBO, $\Delta_g \psi_i = \lambda_i \psi_i$. However, when a set of observations is provided, like many instances of the same object in various poses $\{S_i\}$, classical statistical learning techniques suggest the PCA as the method of choice for finding a low rank matrix $P$ that would represent these observations, by minimizing the projection error $\sum_i \|S_i - P P^T S_i\|^2$, or equivalently maximizing $\operatorname{trace}(P P^T S S^T)$, where the columns of $S$ hold the observed instances. We have shown that the best representation space could be provided by a model that unifies the two different sources of information about our data. By marrying the intrinsic structure with extrinsic observations, we have shown that even extreme out-of-sample configurations can be projected onto a new low dimensional space that preserves poses without sacrificing the fine details.

[1] Aflalo, Y., Brezis, H., Bruckstein, A. M., Kimmel, R. and Sochen, N., Best bases for signal spaces, Comptes Rendus de l'Académie des Sciences, Series I, 2016.

[2] Aflalo, Y., Brezis, H. and Kimmel, R., On the optimality of shape and data representation in the spectral domain, SIAM Journal on Imaging Sciences, 8(2), 2015, 1141–1160.

[3] Aubry, M., Schlickewei, U. and Cremers, D., The wave kernel signature: A quantum mechanical approach to shape analysis, Computer Vision Workshops (ICCV Workshops), 2011 IEEE International Conference, IEEE, 2011, 1626–1633.

[4] Belkin, M. and Niyogi, P., Convergence of Laplacian eigenmaps, Proceedings of the Twentieth Annual Conference on NIPS, 2006, 129–136.

[5] Ben-Chen, M. and Gotsman, C., On the optimality of spectral compression of mesh data, ACM Trans. Graph., 24(1), 2005, 60–80.

[6] Bérard, P., Besson, G. and Gallot, S., Embedding Riemannian manifolds by their heat kernel, Geometric and Functional Analysis, 4(4), 1994, 373–398.

[7] Borg, I. and Groenen, P., Modern Multidimensional Scaling: Theory and Applications, Springer-Verlag, New York, 1997.

[8] Brezis, H., Functional Analysis, Sobolev Spaces and Partial Differential Equations, Universitext Series, Springer-Verlag, New York, 2010.

[9] Bronstein, A. M., Bronstein, M. M. and Kimmel, R., Generalized multidimensional scaling: A framework for isometry-invariant partial surface matching, Proceedings of the National Academy of Sciences, 103(5), 2006, 1168–1172.

[10] Bronstein, A. M., Bronstein, M. M. and Kimmel, R., Rock, paper, and scissors: Extrinsic vs. intrinsic similarity of non-rigid shapes, IEEE 11th International Conference on Computer Vision, 2007.

[11] Bronstein, A. M., Bronstein, M. M., Kimmel, R., et al., A Gromov-Hausdorff framework with diffusion geometry for topologically-robust non-rigid shape matching, International Journal of Computer Vision, 89(2–3), 2010, 266–286.

[12] Coifman, R. R. and Lafon, S., Diffusion maps, Applied and Computational Harmonic Analysis, 21(1), 2006, 5–30.

[13] Coifman, R. R., Lafon, S., Lee, A. B., et al., Geometric diffusions as a tool for harmonic analysis and structure definition of data: Diffusion maps, Proceedings of the National Academy of Sciences, 102(21), 2005, 7426–7431.

[14] Coifman, R. R. and Maggioni, M., Diffusion wavelets, Applied and Computational Harmonic Analysis, 21(1), 2006, 53–94.

[15] Gębal, K., Bærentzen, J. A., Aanæs, H. and Larsen, R., Shape analysis using the auto diffusion function, Computer Graphics Forum, 28, 2009, 1405–1413.

[16] Hammond, D. K., Vandergheynst, P. and Gribonval, R., Wavelets on graphs via spectral graph theory, Applied and Computational Harmonic Analysis, 30(2), 2011, 129–150.

[17] Jiang, B., Ding, C., Luo, B. and Tang, J., Graph-Laplacian PCA: Closed-form solution and robustness, IEEE Conf. Computer Vision and Pattern Recognition (CVPR), IEEE, 2013, 3492–3498.

[18] Jolliffe, I. T., Principal Component Analysis, Second Edition, Springer-Verlag, New York, 2002.

[19] Kalofolias, V., Bresson, X., Bronstein, M. M. and Vandergheynst, P., Matrix completion on graphs, Proc. NIPS, Workshop Out of the Box: Robustness in High Dimension, 2014.

[20] Karni, Z. and Gotsman, C., Spectral compression of mesh geometry, ACM Transactions on Graphics, 2000, 279–286.

[21] Kovnatsky, A., Bronstein, M. M., Bresson, X. and Vandergheynst, P., Functional correspondence by matrix completion, Proc. Computer Vision and Pattern Recognition (CVPR), 2015.

[22] Kovnatsky, A., Bronstein, M. M., Bronstein, A. M., et al., Coupled quasi-harmonic bases, Computer Graphics Forum, 32, 2013, 439–448.

[23] Lévy, B., Laplace-Beltrami eigenfunctions towards an algorithm that "understands" geometry, Shape Modeling and Applications, IEEE International Conference, IEEE, 2006, 13–13.

[24] Meyer, M., Desbrun, M., Schröder, P. and Barr, A. H., Discrete differential-geometry operators for triangulated 2-manifolds, Visualization and Mathematics, 3(2), 2002, 52–58.

[25] Pearson, K., On lines and planes of closest fit to systems of points in space, Philosophical Magazine, 2(11), 1901, 559–572.

[26] Pinkall, U. and Polthier, K., Computing discrete minimal surfaces and their conjugates, Experiment. Math., 2, 1993, 15–36.

[27] Qiu, H. and Hancock, E. R., Clustering and embedding using commute times, IEEE Trans. Pattern Anal. Mach. Intell., 29(11), 2007, 1873–1890.

[28] Ramsay, J. O. and Silverman, B. W., Functional Data Analysis, Second Edition, Springer-Verlag, New York, 2005.

[29] Rong, G., Cao, Y. and Guo, X., Spectral mesh deformation, Vis. Comput., 24(7), 2008, 787–796.

[30] Rustamov, R., Ovsjanikov, M., Azencot, O., et al., Map-based exploration of intrinsic shape differences and variability, SIGGRAPH, ACM, 2013.

[31] Shahid, N., Bresson, X., Bronstein, M. M. and Vandergheynst, P., Robust principal component analysis on graphs, Int. Conf. Comp. Vision (ICCV), 2015.

[32] Shawe-Taylor, J. and Cristianini, N., Kernel Methods for Pattern Analysis, Cambridge University Press, Cambridge, 2004.

[33] Sun, J., Ovsjanikov, M. and Guibas, L., A concise and provably informative multi-scale signature based on heat diffusion, Proceedings of the Symposium on Geometry Processing, Eurographics Association, SGP '09, Aire-la-Ville, Switzerland, 2009, 1383–1392.

[34] Weinberger, H., Variational Methods for Eigenvalue Approximation, Society for Industrial and Applied Mathematics, 1974.

[35] Zaharescu, A., Boyer, E., Varanasi, K. and Horaud, R., Surface feature detection and description with applications to mesh matching, Computer Vision and Pattern Recognition, IEEE Conference on CVPR 2009, IEEE, 2009, 373–380.

[36] Zhou, X. and Srebro, N., Error analysis of Laplacian eigenmaps for semi-supervised learning, International Conference on Artificial Intelligence and Statistics, 2011, 901–908.
