An in-depth comparison of methods handling mixed-attribute data for general fuzzy min-max neural network
- Publication Type:
- Journal Article
- Citation:
- 2020
- Issue Date:
- 2020-09-01
This item is open access.
A general fuzzy min-max (GFMM) neural network is one of the most efficient
neuro-fuzzy systems for classification problems. However, most current
learning algorithms for GFMM can effectively handle only numerical-valued
features. This paper therefore examines potential approaches to adapting GFMM
learning algorithms to classification problems with mixed-type or purely
categorical features, which are very common in practical applications and
often carry useful information. We compare and assess three main methods of
handling datasets with mixed features: applying categorical encoding methods,
combining the GFMM model with other classifiers, and employing learning
algorithms designed for both types of features. The experimental results show
that target and James-Stein encoding are appropriate categorical encoding
methods for the learning algorithms of GFMM models, while combining GFMM
neural networks with decision trees is a flexible way to enhance the
classification performance of GFMM models on mixed-feature datasets. Learning
algorithms that natively handle mixed-type features are a promising way to
deal with mixed-attribute data, but they need further improvement to achieve
better classification accuracy. Based on this analysis, we also identify the
strengths and weaknesses of the different methods and propose potential
research directions.
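To illustrate the encoding approach the abstract recommends, here is a minimal sketch of target (mean) encoding for a single categorical feature, with a shrinkage term toward the global mean in the spirit of the James-Stein encoder. The function name, smoothing scheme, and data are illustrative assumptions, not taken from the paper:

```python
# Hypothetical sketch of target encoding with optional shrinkage;
# not the paper's implementation.
from collections import defaultdict

def target_encode(categories, labels, smoothing=0.0):
    """Replace each category with the (optionally smoothed) mean of its labels."""
    global_mean = sum(labels) / len(labels)
    sums = defaultdict(float)
    counts = defaultdict(int)
    for c, y in zip(categories, labels):
        sums[c] += y
        counts[c] += 1
    encoding = {}
    for c in counts:
        # Smoothing pulls rare categories toward the global mean,
        # the same shrinkage idea the James-Stein encoder formalises.
        encoding[c] = (sums[c] + smoothing * global_mean) / (counts[c] + smoothing)
    return [encoding[c] for c in categories]

# Example: category "a" appears with labels 1 and 0, "b" with label 1.
encoded = target_encode(["a", "a", "b"], [1, 0, 1])
# → [0.5, 0.5, 1.0]
```

After such an encoding, every feature is numerical, so a standard GFMM learning algorithm can process the data without modification.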