dc.contributor.author | Gatnar, Eugeniusz | |
dc.date.accessioned | 2016-02-01T12:05:48Z | |
dc.date.available | 2016-02-01T12:05:48Z | |
dc.date.issued | 2007 | |
dc.identifier.issn | 0208-6018 | |
dc.identifier.uri | http://hdl.handle.net/11089/16836 | |
dc.description.abstract | Significant improvements in model stability and prediction accuracy in classification
and regression can be obtained by using the multiple model approach. In classification, multiple
models are built on the basis of training subsets (selected from the training set) and combined
into an ensemble or a committee. The component models (classification trees) then determine
the predicted class by voting.
In this paper some problems of feature selection for ensembles are discussed, and we
propose a new correlation-based feature selection method combined with the wrapper approach. | pl_PL
dc.description.sponsorship | The task „Digitalizacja i udostępnienie w Cyfrowym Repozytorium Uniwersytetu Łódzkiego kolekcji czasopism naukowych wydawanych przez Uniwersytet Łódzki” (Digitization and making available, in the Digital Repository of the University of Łódź, of the collection of scientific journals published by the University of Łódź), no. 885/P-DUN/2014, was co-financed by the Ministry of Science and Higher Education (MNiSW) from funds for the dissemination of science. | pl_PL
dc.language.iso | en | pl_PL |
dc.publisher | Wydawnictwo Uniwersytetu Łódzkiego | pl_PL |
dc.relation.ispartofseries | Acta Universitatis Lodziensis. Folia Oeconomica;206 | |
dc.subject | tree-based models | pl_PL |
dc.subject | aggregation | pl_PL |
dc.subject | feature selection | pl_PL |
dc.subject | random subspaces | pl_PL |
dc.title | Feature Selection and Multiple Model Approach in Discriminant Analysis | pl_PL |
dc.title.alternative | Dobór zmiennych a podejście wielomodelowe w analizie dyskryminacyjnej | pl_PL |
dc.type | Article | pl_PL |
dc.rights.holder | © Copyright by Wydawnictwo Uniwersytetu Łódzkiego, Łódź 2007 | pl_PL |
dc.page.number | 159-165 | pl_PL |
dc.contributor.authorAffiliation | Katowice University of Economics, Institute of Statistics | pl_PL |
dc.references | Blake C., Keogh E., Merz C. J. (1998), UCI Repository of Machine Learning Databases, Department of Information and Computer Science, University of California, Irvine (CA). | pl_PL |
dc.references | Breiman L. (2001), Random forests, “Machine Learning”, 45, 5-32. | pl_PL
dc.references | Fayyad U. M., Irani K. B. (1993), Multi-interval discretisation of continuous-valued attributes, [in:] Proceedings of the XIII International Joint Conference on Artificial Intelligence, Morgan Kaufmann, San Francisco, 1022-1027. | pl_PL
dc.references | Gatnar E. (2001), Nieparametryczna metoda dyskryminacji i regresji, Wydawnictwo Naukowe PWN, Warszawa (in Polish). | pl_PL
dc.references | Hall M. (2000), Correlation-based feature selection for discrete and numeric class machine learning, [in:] Proceedings of the 17th International Conference on Machine Learning, Morgan Kaufmann, San Francisco. | pl_PL |
dc.references | Hellwig Z. (1969), On the problem of the optimal selection of predictors, “Statistical Review”, 3-4 (in Polish). | pl_PL
dc.references | Ho T. K. (1998), The random subspace method for constructing decision forests, “IEEE Transactions on Pattern Analysis and Machine Intelligence”, 20, 832-844. | pl_PL
dc.references | Kira K., Rendell L. (1992), A practical approach to feature selection, [in:] Proceedings of the 9th International Conference on Machine Learning, D. Sleeman, P. Edwards (eds.), Morgan Kaufmann, San Francisco, 249-256. | pl_PL
dc.references | Kohavi R., John G. H. (1997), Wrappers for feature subset selection, “Artificial Intelligence”, 97, 273-324. | pl_PL |
dc.references | Oza N. C., Tumer K. (1999), Dimensionality reduction through classifier ensembles, Technical Report NASA-ARC-IC-1999-126, Computational Sciences Division, NASA Ames Research Center. | pl_PL
dc.references | Press W. H., Flannery B. P., Teukolsky S. A., Vetterling W. T. (1989), Numerical recipes in Pascal, Cambridge University Press, Cambridge. | pl_PL |
dc.references | Therneau T. M., Atkinson E. J. (1997), An introduction to recursive partitioning using the RPART routines, Mayo Foundation, Rochester. | pl_PL |