Computational Intelligence - February 2016 - 50

Deep structures with multiple layers of non-linear operations are able to learn the high-level abstractions needed to solve challenging vision, language, and other AI-level tasks. Ensembles of deep learners can boost performance.
A conversion method called extreme randomized discretization (ERD) converts continuous values to discrete values. ERD groups the continuous values into bins whose boundaries are created randomly in order to generate diverse ensemble members. The authors evaluated the proposed method on several regression datasets and concluded that it outperformed the ensemble regression-via-classification method with equal-width discretization bins. They also found that a greater number of bins yielded better performance.
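The random-boundary binning idea behind ERD can be sketched as follows. This is a minimal illustration of the concept, not the authors' exact algorithm; the function names and the choice of uniform boundary sampling are assumptions.

```python
import numpy as np

def random_bins(y, n_bins, rng):
    """Draw bin boundaries uniformly at random over the target range
    (a hypothetical sketch of the ERD idea, not the paper's exact method)."""
    lo, hi = y.min(), y.max()
    return np.sort(rng.uniform(lo, hi, size=n_bins - 1))

def discretize(y, edges):
    """Map each continuous target to the index of its bin (a class label)."""
    return np.searchsorted(edges, y)

rng = np.random.default_rng(0)
y = rng.normal(size=100)  # continuous regression targets

# Each ensemble member gets its own random discretization, which injects
# diversity: the same target can fall into different classes across members.
labels_a = discretize(y, random_bins(y, 8, rng))
labels_b = discretize(y, random_bins(y, 8, rng))
```

Each member is then trained as a classifier on its own labels, and a regression estimate is recovered by mapping predicted bins back to representative values and averaging across the ensemble.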
IV. Conclusion

This paper has reviewed the state-of-the-art in ensemble classification and ensemble regression. The theory underlying both has been discussed, including bias-variance decomposition and the diversity issue. Conventional ensemble methods have been introduced, together with recent improvements to them. State-of-the-art ensemble methods such as multi-objective optimization based ensembles, multiple kernel learning, and fuzzy ensembles have been reviewed, and early-stage work on ensembles incorporating deep learning has been surveyed. Some divide-and-conquer based ensemble methods specialized for time series forecasting have also been presented. Finally, the paper has reviewed methods that convert regression problems into multiple-class classification problems and apply ensemble classification to them.
V. Future Work

Although numerous recent ensemble methods for solving classification and regression problems have been reported in the literature, there is still ample room for improvement. Several promising research directions for ensemble classification and regression are discussed below.
"Big Data" [176] has attracted considerable attention recently. It is worthy
to investigate the benefits of ensemble
learning for solving big data problems.
We can also investigate the benefits of
ensemble methods for solving other
machine learning tasks such as clustering
[177], [178].
Some learning algorithms are unstable [13], [25], [179], especially randomized learning approaches such as the random vector functional link (RVFL) neural network. According to a recent comprehensive evaluation [180], there is a large performance gap between randomized learning algorithms and the rank-1 method (Random Forest). On the other hand, according to [25], highly unstable methods are naturally suited to ensembles: the variance of the ensemble can be reduced significantly. It is therefore worthwhile to investigate the performance of ensemble methods with fast randomized base learners such as RVFL.
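An RVFL ensemble of the kind suggested here can be sketched in a few lines. This is a simplified illustration under stated assumptions (tanh activation, ridge-regularized closed-form output weights, plain averaging of members); it is not a reference implementation of any specific published variant.

```python
import numpy as np

def rvfl_fit(X, y, n_hidden, rng, ridge=1e-3):
    """Fit one RVFL regressor: random hidden weights stay untrained;
    only the output weights are solved in closed form (ridge regression)."""
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)
    D = np.hstack([X, H])  # direct input-to-output links plus hidden features
    beta = np.linalg.solve(D.T @ D + ridge * np.eye(D.shape[1]), D.T @ y)
    return W, b, beta

def rvfl_predict(X, model):
    W, b, beta = model
    D = np.hstack([X, np.tanh(X @ W + b)])
    return D @ beta

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(200, 2))
y = np.sin(X[:, 0]) + X[:, 1] ** 2  # a smooth toy regression target

# Ensemble of fast randomized base learners: each member differs only in
# its random hidden weights; predictions are averaged.
models = [rvfl_fit(X, y, 30, rng) for _ in range(10)]
pred = np.mean([rvfl_predict(X, m) for m in models], axis=0)
```

Because each member trains in closed form, building the whole ensemble costs little more than a handful of small linear solves, which is precisely why fast randomized base learners are attractive here.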
Few ensemble classification and regression methods use fuzzy systems as base predictors, and multi-objective optimization based ensemble methods are likewise under-researched. Further work on both is recommended. One possible direction is to combine fuzzy systems with optimization algorithms: ensemble methods built on fuzzy systems require considerably more parameter tuning, so employing optimization algorithms could accelerate the process. For multi-objective optimization based ensembles, we highlight the possibility of applying the optimization to base predictors other than feed-forward neural networks. The applicability of ensemble methods to evolutionary algorithms [181], [182] is also worth investigating.
In Section III-G, we reviewed some deep learning based ensemble classification and regression methods, such as CNN ensembles and deep SVM ensembles. Current deep learning approaches aim to generate deep representations of the data. However, deep neural networks are notoriously difficult to train because of the vanishing gradient problem. Recent research remedies this either by initializing the network with an unsupervised training approach such as the restricted Boltzmann machine, or by reducing the number of parameters through weight sharing, as in CNNs. How well these approaches regularize the network remains an open question. Ensemble methods, which can be regarded as complementary to this line of research, boost the performance of neural networks by significantly reducing the variance, as mentioned in Section II. Researchers should further investigate deep learning based ensemble methods. Several questions may be addressed: Is it necessary to pretrain the base deep networks in an unsupervised manner within the ensemble? Is it possible to increase the diversity of the base deep networks without losing too much accuracy? How can one develop advanced ensemble methods that are suitable for deep learning?
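The variance-reduction argument from Section II can be seen numerically: averaging K independent, unbiased but noisy predictors cuts the variance of the combined prediction by roughly a factor of K. The numbers below are purely illustrative; real base learners are correlated, so the reduction in practice is smaller.

```python
import numpy as np

rng = np.random.default_rng(42)
truth = 1.0
K, trials = 10, 20000

# Each simulated "predictor" is unbiased around the truth with unit noise.
single = truth + rng.normal(size=trials)
ensemble = truth + rng.normal(size=(trials, K)).mean(axis=1)

var_single = single.var()      # close to 1
var_ensemble = ensemble.var()  # close to 1/K for independent members
```

For independent members, var_ensemble is approximately var_single / K; correlation between members is exactly what the diversity-promoting techniques discussed in this survey try to limit.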
Ensemble regressors for approximating fitness landscapes in evolutionary and swarm algorithms are also under-researched [183], [184]. Deep learning ensembles are another future direction: although certain deep learning ensemble classification methods can easily be adapted to regression problems, deep learning ensemble regression algorithms need further development. Deep RVFL ensembles are a promising approach owing to their diversity and fast training. Deep and complex models are much more difficult to train than shallow and simple ones.


