
        Variable Selection and Parameter Estimation with M-Estimation Method for Ultra-high Dimensional Linear Model

        Applied Mathematics (应用数学), 2021, No. 4

        ZHU Yanling(朱艷玲),WANG Kai(汪凱),ZHAO Mingtao(趙明濤)

        (School of Statistics and Applied Mathematics,Anhui University of Finance and Economics,Bengbu 233000,China)

        Abstract: In this paper, variable selection and parameter estimation for the linear regression model in the ultra-high dimensional case are considered. By unifying least squares, least absolute deviation, quantile regression and Huber regression into a general framework, the proposed penalized likelihood M-estimator is proved to have good large-sample properties. In the numerical simulations, variable selection and parameter estimation perform best when forward regression screening is combined with the local linear approximation (LLA) penalty. In the ultra-high dimensional case, our general method is robust and effective for variable selection and parameter estimation.

        Key words: Ultra-high dimensionality; M-estimation; Penalized likelihood; Variable selection

        1.Introduction

        Consider the classical linear regression model Y = Xβ + ε, where Y = (y1, y2, ..., yn)^T is the response vector, X = (X1, X2, ..., Xpn) = (x1, x2, ..., xn)^T = (xij)n×pn is an n×pn design matrix, and ε = (ε1, ε2, ..., εn)^T is a random error vector. When the dimension pn is high, it is often assumed that only a small number of the predictors contribute to the response, which ideally amounts to assuming that the parameter vector β is sparse. Exploiting this sparsity, variable selection can improve estimation accuracy by effectively identifying the subset of important predictors, and also enhance the interpretability of the model.
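        The objective function (1.1) is not reproduced in this excerpt; as a hedged sketch, a penalized likelihood M-estimator of the kind described in the abstract typically takes the form

\[
\hat{\beta} = \arg\min_{\beta \in \mathbb{R}^{p_n}} \Big\{ \frac{1}{n}\sum_{i=1}^{n} \rho\big(y_i - x_i^{T}\beta\big) + \sum_{j=1}^{p_n} p_{\lambda}\big(|\beta_j|\big) \Big\},
\]

        where ρ is a generic convex loss (ρ(t) = t² for least squares, ρ(t) = |t| for LAD, the check loss for quantile regression, or the Huber loss) and p_λ is a penalty function, e.g. a folded-concave penalty handled by the LLA algorithm.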

        In this paper, we assume that the loss function ρ is convex; hence the objective function is convex and the obtained local minimizer is a global minimizer. In Section 2, we discuss some theoretical properties of the LLA estimator. In Section 3, we supply a simple and efficient algorithm and report numerical simulation results. The proofs are given in Section 4.
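        To make the LLA step concrete, the following Python sketch (an illustration under the assumption of a squared-error loss and a SCAD penalty, with helper names of our own choosing, not the authors' implementation) solves each local linear approximation as a weighted lasso via column rescaling:

import numpy as np
from sklearn.linear_model import Lasso

def scad_derivative(t, lam, a=3.7):
    """Derivative p'_lambda(t) of the SCAD penalty for t >= 0 (applied elementwise)."""
    t = np.abs(t)
    return lam * ((t <= lam) + np.maximum(a * lam - t, 0.0) / ((a - 1.0) * lam) * (t > lam))

def lla_estimator(X, y, lam, n_steps=3, eps=1e-4):
    """Multi-step LLA: repeatedly minimize 1/(2n)||y - X b||^2 + sum_j w_j |b_j|
    with weights w_j = p'_lambda(|b_j|) taken at the current iterate."""
    n, p = X.shape
    beta = np.zeros(p)                                        # initial value; a lasso fit also works
    for _ in range(n_steps):
        w = np.maximum(scad_derivative(beta, lam), eps * lam) # clip weights away from zero
        Xw = X / w                                            # rescale columns so the penalty becomes plain L1
        z = Lasso(alpha=1.0, fit_intercept=False, max_iter=10000).fit(Xw, y).coef_
        beta = z / w                                          # map back to the original scale
    return beta

        For the LAD, quantile or Huber losses covered by the paper's general framework, the inner weighted problem would instead be solved by a linear-programming or iteratively reweighted routine rather than the squared-error Lasso used here.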

        2.Main Results

        3.Simulation Results

        In this section we evaluate the performance of the M-estimator proposed in (1.1) by simulation studies.
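        The exact simulation design (sample size, dimension and true coefficient vector) is not reproduced in this excerpt; as an assumption-laden illustration only, data with a sparse β and the three error distributions used in Examples 1-3 below could be generated as follows:

import numpy as np

rng = np.random.default_rng(0)

def simulate(n=100, p=1000, error="normal"):
    """Generate (X, y) from y = X beta + eps with a sparse beta (hypothetical values)."""
    X = rng.standard_normal((n, p))
    beta = np.zeros(p)
    beta[:3] = [3.0, 1.5, 2.0]                     # hypothetical nonzero coefficients
    if error == "normal":                          # Example 1: eps ~ N(0, 1)
        eps = rng.standard_normal(n)
    elif error == "t5":                            # Example 2: eps ~ t_5
        eps = rng.standard_t(df=5, size=n)
    else:                                          # Example 3: eps ~ 0.9 N(0,1) + 0.1 N(0,9)
        contaminated = rng.random(n) < 0.1
        eps = np.where(contaminated, rng.normal(0.0, 3.0, n), rng.normal(0.0, 1.0, n))
    return X, X @ beta + eps

X, y = simulate(error="t5")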

        For the RSIS+LLA method, the first screening step tends to be more robust, and the method performs well on the two indicators of estimation error and prediction error, but it loses part of its ability to identify important variables, so important variables are easily omitted, which causes large errors. In fact, we can guarantee the robustness of selection and estimation by using the LAD loss function in the second-step variable selection.

        For the HOLP+LLA method, its performance on the three indicators of prediction error, correct exclusion of unimportant variables, and misidentification of important variables is almost equal to that of FR+LLA, but it is slightly worse on the first indicator, estimation error.

        Tab.1 Numerical Results for ε ~N(0,1)

        Example 2 We consider the case of the LAD loss function and the error term ε ~ t5. The estimated results are shown in Tab.2.

        Tab.2 Numerical Results for ε ~t5

        When the error term follows the heavy-tailed distribution, all six methods in Tab.2 perform better than in the standard normal error case on the first indicator (estimation error) and the third indicator (correct exclusion of unimportant variables). The second indicator, prediction error, is slightly worse, and the fourth indicator is essentially unchanged. The overall conclusion is consistent with Example 1, i.e., the FR+LLA method performs slightly better.

        Example 3 We consider the case of the LAD loss function and the error term ε ~ 0.9N(0,1)+0.1N(0,9). The estimated results are shown in Tab.3.

        Tab.3 Numerical Results for ε ~0.9N(0,1)+0.1N(0,9)

        Synthesizing the simulation results of Examples 1 to 3, it can be seen that when the number of explanatory variables is larger than the sample size, the proposed scheme, which uses forward regression (FR) for variable screening in the first step and the proposed local linear approximation (LLA) penalty in the second step, performs quite well on all four indicators. It also shows that for ultra-high dimensional data models, the FR+LLA screening method we provide is feasible and effective, and can be applied to a wider range of data to obtain satisfactory results; a rough sketch of such a two-step pipeline is given below.
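        As an illustration only (the screening size bound d = n/log n, the tuning value of λ, and the helpers simulate and lla_estimator from the earlier sketches are assumptions, not the authors' exact procedure), the two-step FR+LLA pipeline could look as follows:

import numpy as np

def forward_screening(X, y, d):
    """Greedy forward regression: repeatedly add the predictor whose inclusion
    most reduces the residual sum of squares, until d predictors are selected."""
    n, p = X.shape
    selected = []
    for _ in range(d):
        best_j, best_rss = None, np.inf
        for j in range(p):
            if j in selected:
                continue
            Xs = X[:, selected + [j]]
            coef, *_ = np.linalg.lstsq(Xs, y, rcond=None)
            rss = np.sum((y - Xs @ coef) ** 2)
            if rss < best_rss:
                best_j, best_rss = j, rss
        selected.append(best_j)
    return selected

# step 1: screen down to d = n / log(n) predictors; step 2: refit with the LLA estimator
X, y = simulate(n=100, p=1000, error="normal")
d = int(len(y) / np.log(len(y)))
kept = forward_screening(X, y, d)
beta_hat = np.zeros(X.shape[1])
beta_hat[kept] = lla_estimator(X[:, kept], y, lam=0.1)   # lam chosen ad hoc for illustration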

        4.Proofs of Main Results
