Please use this identifier to cite or link to this item: http://inaoe.repositorioinstitucional.mx/jspui/handle/1009/1592
Decision tree induction using a fast splitting attribute selection for large datasets
Anilú Franco Arcega
Jesús Ariel Carrasco Ochoa
José Francisco Martínez Trinidad
Open Access
Attribution-NonCommercial-NoDerivatives
Decision trees
Large datasets
Gain-ratio criterion
Several algorithms have been proposed in the literature for building decision trees (DT) for large datasets; however, almost all of them have memory restrictions because they need to keep the whole training set, or a large part of it, in main memory, while those algorithms that avoid memory restrictions by choosing a subset of the training set need extra time to perform this selection or have parameters that can be very difficult to determine. In this paper, we introduce a new algorithm that builds decision trees using a fast splitting attribute selection (DTFS) for large datasets. The proposed algorithm builds a DT without storing the whole training set in main memory and requires only one parameter, to which it is very stable. Experimental results on both real and synthetic datasets show that our algorithm is faster than three of the most recent algorithms for building decision trees for large datasets, while achieving competitive accuracy.
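As a point of reference for the gain-ratio keyword above, the sketch below shows one way the gain-ratio splitting criterion can be computed for a nominal attribute. It is a minimal illustration only: the function and variable names are hypothetical, and it is not the DTFS algorithm described in the article, which additionally processes the training instances incrementally so that the whole training set never needs to be kept in main memory.

from collections import Counter
from math import log2

def entropy(labels):
    # Shannon entropy of a list of class labels.
    total = len(labels)
    return -sum((c / total) * log2(c / total) for c in Counter(labels).values())

def gain_ratio(instances, labels, attribute):
    # Gain ratio of a nominal attribute: information gain / split information.
    total = len(labels)
    partitions = {}
    for x, y in zip(instances, labels):
        partitions.setdefault(x[attribute], []).append(y)
    expected = sum(len(p) / total * entropy(p) for p in partitions.values())
    split_info = -sum((len(p) / total) * log2(len(p) / total) for p in partitions.values())
    if split_info == 0:  # attribute takes a single value: it cannot split the node
        return 0.0
    return (entropy(labels) - expected) / split_info

# Hypothetical toy data: the attribute with the highest gain ratio is chosen as the split.
data = [{"outlook": "sunny", "windy": "no"},
        {"outlook": "sunny", "windy": "yes"},
        {"outlook": "rain", "windy": "no"},
        {"outlook": "rain", "windy": "yes"}]
classes = ["play", "stay", "play", "stay"]
best = max(["outlook", "windy"], key=lambda a: gain_ratio(data, classes, a))
print(best)  # -> "windy", which separates the two classes perfectly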
Elsevier Ltd.
2011
Article
English
Students
Researchers
General public
Franco-Arcega, A., Carrasco-Ochoa, J. A., & Martínez-Trinidad, J. F. (2011). Decision tree induction using a fast splitting attribute selection for large datasets. Expert Systems with Applications, 38, 14290–14300.
COMPUTER SCIENCE
Accepted version
acceptedVersion - Accepted version
Appears in collections: Artículos de Ciencias Computacionales

File: 4 Franco_2011_ExperSystems38.pdf (1.04 MB, Adobe PDF)