
V. Conclusions and Future Work

In this paper, we proposed an innovative
ENN classification methodology based
on the maximum gain of intra-class
coherence. By analyzing the generalized
class-wise statistics, ENN is able to learn
from the global distribution to improve
pattern recognition performance. Unlike
the classic KNN rule, which considers only the nearest neighbors of a test
sample when making a classification decision,
the ENN method considers not only
which samples are the nearest neighbors of the test
sample, but also which samples regard the test
sample as one of their own nearest neighbors. We
have developed three versions of the
ENN classifier, ENN, ENN.V1, and
ENN.V2, and analyzed their foundations and their relationships to each other.
The experimental results on numerous
benchmarks demonstrated the effectiveness of our ENN method.
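To make this two-sided neighborhood view concrete, the following Python sketch implements an ENN-style decision rule as we have described it: a test sample is tentatively assigned to each candidate class, the generalized class-wise statistic Ti (the average fraction of within-class k-nearest-neighbor relations of class i) is recomputed on the augmented data, and the class that maximizes the total intra-class coherence is returned. This is only a minimal, brute-force illustration; the function and variable names (class_wise_statistics, enn_predict) are ours and are not part of the original formulation.

import numpy as np

def class_wise_statistics(X, y, k):
    """Ti: fraction of the k nearest neighbors of each class-i sample
    that also belong to class i, averaged over class i (brute force)."""
    T = {}
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        hits = 0
        for i in idx:
            d = np.linalg.norm(X - X[i], axis=1)
            d[i] = np.inf                      # exclude the sample itself
            nn = np.argsort(d)[:k]             # indices of its k nearest neighbors
            hits += np.sum(y[nn] == c)
        T[c] = hits / (len(idx) * k)
    return T

def enn_predict(X, y, x_test, k=3):
    """Assign x_test to the class whose tentative membership maximizes
    the total intra-class coherence sum_i Ti."""
    best_class, best_score = None, -np.inf
    for c in np.unique(y):
        X_aug = np.vstack([X, x_test])         # add the test sample...
        y_aug = np.append(y, c)                # ...tentatively labeled as class c
        score = sum(class_wise_statistics(X_aug, y_aug, k).values())
        if score > best_score:
            best_class, best_score = c, score
    return best_class

Under this sketch, a call such as enn_predict(X_train, y_train, x_new, k=5) returns the predicted label for a new sample x_new.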
As ENN is a new classification method, we are
currently exploring its power and developing variants
for a range of machine learning and data
mining problems, such as imbalanced
learning, class-conditional density estimation, clustering, and regression.
For example, in imbalanced learning
problems [30], [36], [37], the
distribution-scale sensitivity addressed
by our ENN method can also be viewed as an unequal-distribution learning
problem; therefore, we expect that the
ENN method could be readily applied to
learning from imbalanced data.
Meanwhile, the idea behind the proposed
ENN method may also benefit class-conditional density estimation [38], [39],
if one uses a different neighborhood size for each class according to our
generalized class-wise statistic Ti.
A similar idea is adaptive, or variable-bandwidth, kernel density estimation,
in which the kernel width varies across
samples [11], [40]; a brief illustration is sketched below.
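As a hedged illustration of this variable-bandwidth idea, the Python sketch below gives each training sample its own Gaussian kernel width proportional to the distance to its k-th nearest neighbor, a common heuristic for sample-point adaptive estimators; it is not the specific estimator of [11] or [40], and the names adaptive_kde, h0, and k are illustrative.

import numpy as np

def adaptive_kde(X, x_query, k=10, h0=1.0):
    """Variable-bandwidth Gaussian KDE: each training point x_j gets its
    own bandwidth h_j = h0 * d_k(x_j), where d_k(x_j) is the distance from
    x_j to its k-th nearest neighbor, so kernels widen in sparse regions."""
    n, d = X.shape
    # per-sample bandwidths from k-NN distances (brute force)
    dists = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    np.fill_diagonal(dists, np.inf)
    h = h0 * np.sort(dists, axis=1)[:, k - 1]  # distance to the k-th neighbor
    # evaluate the mixture of per-sample Gaussians at the query point
    diff = np.linalg.norm(X - x_query, axis=1)
    kernels = np.exp(-0.5 * (diff / h) ** 2) / ((np.sqrt(2.0 * np.pi) * h) ** d)
    return kernels.mean()

Because the per-sample bandwidths grow in sparse regions, the estimate is smoothed exactly where data are scarce, which is the same intuition behind choosing class-dependent neighborhood sizes from Ti.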
Furthermore, just as there are many variations of the
KNN method, other forms of the ENN classifier could be developed, such as a distance-weighted ENN. Since nearest-neighbor-based classification methods are used
in many scientific applications because of
their easy implementation, non-parametric nature, and competitive classification
performance, we expect that the new
ENN method and its future variations
could find widespread use in many areas
of data and information processing.
VI. Acknowledgment

This research was partially supported by
the National Science Foundation (NSF)
under grants ECCS 1053717 and CCF
1439011, and by the Army Research Office
under grant W911NF-12-1-0378.
References

[1] C. M. Bishop, Pattern Recognition and Machine Learning.
New York: Springer, 2006.
[2] Nature. (2008). Big data. [Online]. Available: http://
www.nature.com/news/specials/bigdata/index.html
[3] Z. Zhou, N. Chawla, Y. Jin, and G. Williams, "Big
data opportunities and challenges: Discussions from data
analytics perspectives," IEEE Comput. Intell. Mag., vol. 9,
no. 4, pp. 62-74, 2014.
[4] Y. Zhai, Y. Ong, and I. Tsang, "The emerging big
dimensionality," IEEE Comput. Intell. Mag., vol. 9, no. 3,
pp. 14-26, 2014.
[5] E. Fix and J. L. Hodges Jr., "Discriminatory analysis: Nonparametric discrimination: Consistency properties,"
U.S. Air Force Sch. Aviation Medicine, Randolph Field,
TX, Project 21-49-004, Tech. Rep. 4, 1951.
[6] T. Cover and P. Hart, "Nearest neighbor pattern classification," IEEE Trans. Inform. Theory, vol. 13, no. 1, pp.
21-27, 1967.
[7] P. Horton and K. Nakai, "Better prediction of protein
cellular localization sites with the k nearest neighbors
classifier," in Proc. 5th Int. Conf. Intelligent Systems Molecular Biology, 1997, vol. 5, pp. 147-152.
[8] R. Parry, W. Jones, T. Stokes, J. Phan, R. Moffitt, H.
Fang, L. Shi, A. Oberthuer, M. Fischer, W. Tong, and M.
Wang, "k-Nearest neighbor models for microarray gene
expression analysis and clinical outcome prediction,"
Pharmacogenomics J., vol. 10, no. 4, pp. 292-312, 2010.
[9] J. MacQueen, "Some methods for classification and
analysis of multivariate observations," in Proc. 5th Berkeley
Symp. Mathematical Statistics Probability, California, 1967,
vol. 1, pp. 281-297.
[10] N. S. Altman, "An introduction to kernel and nearest-neighbor nonparametric regression," Amer. Stat., vol.
46, no. 3, pp. 175-185, 1992.
[11] G. R. Terrell and D. W. Scott, "Variable kernel density
estimation," Ann. Stat., vol. 20, no. 3, pp. 1236-1265, 1992.
[12] M. M. Breunig, H.-P. Kriegel, R. T. Ng, and J.
Sander, "LOF: Identifying density-based local outliers,"
ACM SIGMOD Rec., vol. 29, no. 2, pp. 93-104, 2000.
[13] X. Wu, V. Kumar, J. R. Quinlan, J. Ghosh, Q. Yang,
H. Motoda, G. J. McLachlan, A. Ng, B. Liu, P. S. Yu,
Z.-H. Zhou, M. Steinbach, D. J. Hand, and D. Steinberg,
"Top 10 algorithms in data mining," Knowl. Inform. Syst.,
vol. 14, no. 1, pp. 1-37, 2007.
[14] R. Kohavi, "A study of cross-validation and bootstrap for accuracy estimation and model selection,"
in Proc. Int. Joint Conf. Artificial Intelligence, vol. 14, pp.
1137-1145, 1995.
[15] J. Goldberger, S. Roweis, G. Hinton, and R. Salakhutdinov, "Neighbourhood components analysis," in
Proc. Advances Neural Information Processing Systems, 2005,
pp. 513-520.
[16] K. Q. Weinberger, J. Blitzer, and L. K. Saul, "Distance metric learning for large margin nearest neighbor
classification," in Proc. Advances Neural Information Processing Systems, 2005, pp. 1473-1480.
[17] K. Q. Weinberger and L. K. Saul, "Distance metric
learning for large margin nearest neighbor classification,"
J. Machine Learn. Res., vol. 10, pp. 207-244, Feb. 2009.
[18] S. Chopra, R. Hadsell, and Y. LeCun, "Learning
a similarity metric discriminatively, with application to
face verification," in Proc. IEEE Computer Society Conf.
Vision Pattern Recognition, 2005, pp. 539-546.
[19] J. Wang, P. Neskovic, and L. N. Cooper, "Improving nearest neighbor rule with a simple adaptive distance
measure," Pattern Recognit. Lett., vol. 28, no. 2, pp. 207-
213, 2007.
[20] A. Bellet, A. Habrard, and M. Sebban, "A survey on
metric learning for feature vectors and structured data,"
Comput. Res. Repository, 2013, to be published.
[21] T. Hastie and R. Tibshirani, "Discriminant adaptive
nearest neighbor classification," IEEE Trans. Pattern Anal.
Machine Intell., vol. 18, no. 6, pp. 607-616, 1996.
[22] K. C. Gowda and G. Krishna, "The condensed nearest neighbor rule using the concept of mutual nearest
neighborhood," IEEE Trans. Inform. Theory, vol. 25, no.
4, pp. 488-490, 1979.
[23] S. C. Bagui, S. Bagui, K. Pal, and N. R. Pal, "Breast
cancer detection using rank nearest neighbor classification rules," Pattern Recognit., vol. 36, no. 1, pp. 25-34,
2003.
[24] J. H. Friedman, J. L. Bentley, and R. A. Finkel,
"An algorithm for finding best matches in logarithmic
expected time," ACM Trans. Math. Softw., vol. 3, no. 3,
pp. 209-226, 1977.
[25] S. Z. Li, K. L. Chan, and C. Wang, "Performance
evaluation of the nearest feature line method in image
classification and retrieval," IEEE Trans. Pattern Anal.
Machine Intell., vol. 22, no. 11, pp. 1335-1349, 2000.
[26] J. McNames, "A fast nearest-neighbor algorithm
based on a principal axis search tree," IEEE Trans. Pattern Anal. Machine Intell., vol. 23, no. 9, pp. 964-976,
2001.
[27] T. Liu, A. W. Moore, and A. Gray, "New algorithms
for efficient high-dimensional nonparametric classification," J. Machine Learn. Res., vol. 7, pp. 1135-1158, 2006.
[28] V. Garcia, E. Debreuve, and M. Barlaud, "Fast k
nearest neighbor search using GPU," in Proc. IEEE Computer Society Conf. Computer Vision Pattern Recognition
Workshops, 2008, pp. 1-6.
[29] P. Indyk and R. Motwani, "Approximate nearest
neighbors: Towards removing the curse of dimensionality," in Proc. 30th Annu. ACM Symp. Theory Computing,
1998, pp. 604-613.
[30] H. He and E. A. Garcia, "Learning from imbalanced
data," IEEE Trans. Knowl. Data Eng., vol. 21, no. 9, pp.
1263-1284, 2009.
[31] J. H. Friedman, S. Steppel, and J. Tukey, "A nonparametric procedure for comparing multivariate point sets,"
Stanford Linear Accelerator Center, Computation
Research Group, Tech. Memo 153, 1973.
[32] M. F. Schilling, "Multivariate two-sample tests
based on nearest neighbors," J. Amer. Stat. Assoc., vol. 81,
no. 395, pp. 799-806, 1986.
[33] M. Schilling, "Mutual and shared neighbor probabilities: Finite- and infinite-dimensional results," Adv.
Appl. Probab., vol. 18, no. 2, pp. 388-405, 1986.
[34] Y. LeCun, L. Bottou, Y. Bengio, and P. Haffner,
"Gradient-based learning applied to document recognition," Proc. IEEE, vol. 86, no. 11, pp. 2278-2324, 1998.
[35] M. Lichman. (2013). UCI Machine Learning Repository. Irvine, CA: Univ. of California, School of Information
and Computer Science. [Online]. Available: http://
archive.ics.uci.edu/ml/
[36] H. He, Y. Bai, E. A. Garcia, and S. Li, "ADASYN:
Adaptive synthetic sampling approach for imbalanced
learning," in Proc. IEEE Int. Joint Conf. Neural Networks,
2008, pp. 1322-1328.
[37] S. Chen, H. He, and E. A. Garcia, "RAMOBoost:
Ranked minority over-sampling in boosting," IEEE
Trans. Neural Networks, vol. 21, no. 10, pp. 1624-1642,
2010.
[38] G. Biau, F. Chazal, D. Cohen-Steiner, L. Devroye,
and C. Rodriguez, "A weighted k-nearest neighbor density estimate for geometric inference," Electron. J. Stat.,
vol. 5, pp. 204-237, 2011.
[39] U. von Luxburg and M. Alamgir, "Density estimation from unweighted k-nearest neighbor graphs: A
roadmap," in Proc. Advances Neural Information Processing
Systems, 2013, pp. 225-233.
[40] D. W. Scott, Multivariate Density Estimation: Theory,
Practice, and Visualization. Hoboken, NJ: Wiley, 2009.