THE EFFECTIVENESS OF THE NEW STEPWISE METHOD FOR VARIABLE SELECTION IN MULTIPLE REGRESSION MODELS
Abstract
The new stepwise method is a technique for selecting predictor variables in a linear regression model. It extends principal component regression: the original predictor variables are selected iteratively while, at each step, a subset of principal components is reselected. The method also retains the basic properties of stepwise selection, so it combines the strengths of stepwise selection and principal component selection. Models obtained with this method are characterized by low PRESS values. The method is not limited to linear models and can be extended to generalized linear models. When the two methods are compared on the R² criterion for variable selection, the solid-waste data case yields nearly identical R² values for both models; taking into account the number of predictor variables entered into each model, the new stepwise method tends to be better than principal component regression.
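The PRESS criterion mentioned in the abstract (the sum of squared leave-one-out prediction errors) can be computed without refitting the model n times, using the exact hat-matrix shortcut e_(i) = e_i / (1 - h_ii). The sketch below is illustrative only and is not taken from the paper; the simulated data and function name are the author's own assumptions.

```python
import numpy as np

def press_statistic(X, y):
    """PRESS = sum of squared leave-one-out prediction errors.

    Uses the exact hat-matrix shortcut e_(i) = e_i / (1 - h_ii),
    so no refitting is needed. X is assumed to already contain an
    intercept column. (Illustrative sketch, not the paper's code.)
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    # Leverages: diagonal of the hat matrix H = X (X'X)^{-1} X'
    h = np.einsum("ij,ji->i", X, np.linalg.pinv(X.T @ X) @ X.T)
    return float(np.sum((resid / (1.0 - h)) ** 2))

# Tiny illustration with simulated data (hypothetical, not the
# solid-waste data analyzed in the paper)
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(20), rng.normal(size=(20, 2))])
y = X @ np.array([1.0, 2.0, -0.5]) + rng.normal(scale=0.1, size=20)
print(press_statistic(X, y))
```

A model with a lower PRESS value predicts held-out observations better, which is why the abstract uses it to characterize the quality of the selected subset.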