Logistic Regression Program Removal

12/28/2017

I want to use R to perform a stepwise linear regression using p-values as the selection criterion, e.g. at each step dropping the variable with the highest (i.e. most insignificant) p-value, and stopping when all remaining p-values are significant at some threshold alpha. I am fully aware that I should use the AIC (e.g. the step or stepAIC commands) or some other criterion instead, but my boss has no grasp of statistics and insists on using p-values. If necessary I could program my own routine, but I am wondering whether there is an already implemented version of this. Here is an example. Start with the most complicated model: this includes interactions between all three explanatory variables.
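
A minimal sketch of that starting model, assuming a data frame dat containing a response y together with the three explanatory variables temp, wind and rad that appear in the output below (the data frame and response names are placeholders, not taken from the original post):

# Most complicated model: all main effects, all two-way interactions,
# and the three-way interaction between the explanatory variables.
model1 <- lm(y ~ temp * wind * rad, data = dat)
summary(model1)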

A related reference on this topic: Bursac Z, Gauss CH, Williams DK, Hosmer DW. Purposeful selection of variables in logistic regression. Source Code for Biology and Medicine 2008, 3:17. © Bursac et al.; licensee BioMed Central Ltd. Received: 22 August 2008.

summary(model1)

Coefficients:
               Estimate   Std. Error t value Pr(>|t|)
(Intercept)    5.683e+02  2.073e+02   2.741  0.00725 **
temp          -1.076e+01  4.303e+00  -2.501  0.01401 *
wind          -3.237e+01  1.173e+01  -2.760  0.00687 **
rad           -3.117e-01  5.585e-01  -0.558  0.57799
temp:wind      2.377e-01  1.367e-01   1.739  0.08519
temp:rad       8.402e-03  7.512e-03   1.119  0.26602
wind:rad       2.054e-02  4.892e-02   0.420  0.47552
temp:wind:rad -4.324e-04  6.595e-04  -0.656  0.51358

The three-way interaction is clearly not significant. This is how you remove it, to begin the process of model simplification, giving model2 (see the sketch below). If you are just trying to get the best predictive model, then perhaps it doesn't matter too much, but for anything else, don't bother with this sort of model selection.
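A minimal sketch of that removal step, and of a hand-rolled routine that keeps dropping the least significant term until everything left clears the threshold, assuming the fitted object model1 from above and an illustrative threshold alpha = 0.05:

# Remove the non-significant three-way interaction to obtain model2
model2 <- update(model1, . ~ . - temp:wind:rad)
summary(model2)

# The same idea automated: at each step refit after dropping the term with
# the largest p-value, stopping once every remaining term is significant.
# drop1() respects marginality, so main effects only become candidates once
# their interactions have been removed.
alpha <- 0.05
model <- model1
repeat {
  tab   <- drop1(model, test = "F")
  pvals <- tab[["Pr(>F)"]][-1]                 # first row is "<none>"
  terms <- rownames(tab)[-1]
  if (length(pvals) == 0 || max(pvals, na.rm = TRUE) < alpha) break
  worst <- terms[which.max(pvals)]
  model <- update(model, as.formula(paste(". ~ . -", worst)))
}
summary(model)

This loop relies only on base R and is one way to program the routine the question mentions, not an existing packaged implementation.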

Use a shrinkage method such as ridge regression (e.g. lm.ridge() in the MASS package), or the lasso, or the elastic net (a combination of the ridge and lasso constraints). Of these, only the lasso and elastic net will do some form of model selection, i.e. force the coefficients of some covariates to zero. See the Regularization and Shrinkage section of the task view on CRAN.
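
As a hedged illustration of those options: lm.ridge() comes from MASS as the text says, while the lasso/elastic-net fit below uses the glmnet package, which is my choice of implementation rather than one named in the text. In glmnet, alpha = 1 gives the lasso, values between 0 and 1 give the elastic net, and cross-validation chooses the penalty; the data frame dat, response y and predictors are the same placeholders as above.

library(MASS)
library(glmnet)

# Ridge regression over a grid of penalties (shrinks but does not select)
ridge <- lm.ridge(y ~ temp * wind * rad, data = dat, lambda = seq(0, 100, by = 1))

# Lasso (alpha = 1): shrinks coefficients and sets some exactly to zero
x     <- model.matrix(y ~ temp * wind * rad, data = dat)[, -1]
cvfit <- cv.glmnet(x, dat$y, alpha = 1)
coef(cvfit, s = "lambda.min")   # coefficients at the CV-chosen penalty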
