dc.contributor.author: Gatnar, Eugeniusz
dc.date.accessioned: 2016-02-01T12:05:48Z
dc.date.available: 2016-02-01T12:05:48Z
dc.date.issued: 2007
dc.identifier.issn: 0208-6018
dc.identifier.uri: http://hdl.handle.net/11089/16836
dc.description.abstract: Significant improvement of model stability and prediction accuracy in classification and regression can be obtained by using the multiple model approach. In classification, multiple models are built on subsets selected from the training set and combined into an ensemble, or committee; the component models (classification trees) then determine the predicted class by voting. In this paper we discuss some problems of feature selection for ensembles and propose a new correlation-based feature selection method combined with the wrapper approach.
dc.description.sponsorship: The task "Digitisation and making available, in the Digital Repository of the University of Łódź, of the collection of scholarly journals published by the University of Łódź" (no. 885/P-DUN/2014) was co-financed by the Polish Ministry of Science and Higher Education (MNiSW) from funds supporting the dissemination of science.
dc.language.iso: en
dc.publisher: Wydawnictwo Uniwersytetu Łódzkiego
dc.relation.ispartofseries: Acta Universitatis Lodziensis. Folia Oeconomica; 206
dc.subject: tree-based models
dc.subject: aggregation
dc.subject: feature selection
dc.subject: random subspaces
dc.title: Feature Selection and Multiple Model Approach in Discriminant Analysis
dc.title.alternative: Dobór zmiennych a podejście wielomodelowe w analizie dyskryminacyjnej
dc.type: Article
dc.rights.holder: © Copyright by Wydawnictwo Uniwersytetu Łódzkiego, Łódź 2007
dc.page.number: 159-165
dc.contributor.authorAffiliation: Katowice University of Economics, Institute of Statistics
dc.references: Blake C., Keogh E., Merz C. J. (1998), UCI Repository of Machine Learning Databases, Department of Information and Computer Science, University of California, Irvine (CA).
dc.references: Breiman L. (2001), Random forests, “Machine Learning”, 45, 5-32.
dc.references: Fayyad U. M., Irani K. B. (1993), Multi-interval discretisation of continuous-valued attributes, [in:] Proceedings of the XIII International Joint Conference on Artificial Intelligence, Morgan Kaufmann, San Francisco, 1022-1027.
dc.references: Gatnar E. (2001), Nieparametryczna metoda dyskryminacji i regresji, Wydawnictwo Naukowe PWN, Warszawa.
dc.references: Hall M. (2000), Correlation-based feature selection for discrete and numeric class machine learning, [in:] Proceedings of the 17th International Conference on Machine Learning, Morgan Kaufmann, San Francisco.
dc.references: Hellwig Z. (1969), On the problem of the optimal selection of predictors, “Statistical Revue”, 3-4 (in Polish).
dc.references: Ho T. K. (1998), The random subspace method for constructing decision forests, “IEEE Transactions on Pattern Analysis and Machine Intelligence”, 20, 832-844.
dc.references: Kira K., Rendell L. (1992), A practical approach to feature selection, [in:] Proceedings of the 9th International Conference on Machine Learning, D. Sleeman, P. Edwards (eds.), Morgan Kaufmann, San Francisco, 249-256.
dc.references: Kohavi R., John G. H. (1997), Wrappers for feature subset selection, “Artificial Intelligence”, 97, 273-324.
dc.references: Oza N. C., Tumer K. (1999), Dimensionality reduction through classifier ensembles, Technical Report NASA-ARC-IC-1999-126, Computational Sciences Division, NASA Ames Research Center.
dc.references: Press W. H., Flannery B. P., Teukolsky S. A., Vetterling W. T. (1989), Numerical recipes in Pascal, Cambridge University Press, Cambridge.
dc.references: Therneau T. M., Atkinson E. J. (1997), An introduction to recursive partitioning using the RPART routines, Mayo Foundation, Rochester.
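
The abstract above describes two technical ideas: component classification trees fitted on subsets of the training set and combined by majority voting, and a correlation-based criterion (after Hall 2000, cited in the references) for selecting the features those component models use. The sketch below is a minimal illustration of both under assumed tooling (NumPy and scikit-learn) on a stand-in data set; names such as cfs_merit are illustrative, and this is not the paper's implementation.

# Minimal sketch of the multiple-model approach described in the abstract:
# component classification trees are grown on subsets of the training set
# and the ensemble predicts by majority vote. NumPy and scikit-learn are
# assumed; this illustrates the general technique, not the authors' code.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)          # stand-in data set
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

rng = np.random.default_rng(0)
trees = []
for _ in range(25):
    # Each component tree sees a bootstrap subset drawn from the training set.
    rows = rng.integers(0, len(X_tr), size=len(X_tr))
    trees.append(DecisionTreeClassifier(random_state=0).fit(X_tr[rows], y_tr[rows]))

# The component trees vote; the ensemble returns the majority class.
votes = np.stack([t.predict(X_te) for t in trees])   # shape (n_trees, n_test)
y_pred = np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)
print("ensemble accuracy:", (y_pred == y_te).mean())

def cfs_merit(r_cf, r_ff, k):
    # Hall's (2000) correlation-based merit of a k-feature subset: it favours
    # features that correlate strongly with the class (r_cf, the mean
    # feature-class correlation) but weakly with one another (r_ff, the mean
    # feature-feature correlation). Function name is illustrative.
    return k * r_cf / np.sqrt(k + k * (k - 1) * r_ff)

Breiman's (2001) random forests and Ho's (1998) random subspace method, both cited above, vary the row subsets and the feature subsets seen by each component tree, respectively; the wrapper step mentioned in the abstract would instead score candidate feature subsets by the ensemble's own validated accuracy.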

