IEEE Computational Intelligence Magazine - May 2022 - 48

[49] B. Carpenter et al., "Stan: A probabilistic programming language," J. Statist. Softw., vol. 76, no. 1, 2017, doi: 10.18637/jss.v076.i01.
[50] A. Gelman et al., "Prior choice recommendations," 2020. [Online]. Available: https://github.com/stan-dev/stan/wiki/Prior-Choice-Recommendations (Accessed: Jul. 13, 2020).
[51] D. Silvestro and T. Andermann, "Prior choice affects ability of Bayesian neural networks to identify unknowns," 2020. [Online]. Available: http://arxiv.org/abs/2005.04987
[52] K. P. Murphy, Machine Learning: A Probabilistic Perspective. Cambridge, MA, USA: MIT Press, 2012.
[53] A. A. Pourzanjani, R. M. Jiang, B. Mitchell, P. J. Atzberger, and L. R. Petzold, "Bayesian inference over the Stiefel manifold via the Givens representation," 2017. [Online]. Available: http://arxiv.org/abs/1710.09443
[54] J. L. Ba, J. R. Kiros, and G. E. Hinton, "Layer normalization," 2016, arXiv:1607.06450.
[55] G.-J. Qi and J. Luo, "Small data challenges in big data era: A survey of recent progress on unsupervised and semi-supervised methods," 2019. [Online]. Available: http://arxiv.org/abs/1903.11260
[56] N. Natarajan, I. S. Dhillon, P. K. Ravikumar, and A. Tewari, "Learning with noisy labels," in Proc. Adv. Neural Inf. Process. Syst., Curran Associates, 2013, pp. 1196-1204.
[57] B. Frenay and M. Verleysen, "Classification in the presence of label noise: A survey," IEEE Trans. Neural Netw. Learn. Syst., vol. 25, no. 5, pp. 845-869, May 2014, doi: 10.1109/TNNLS.2013.2292894.
[58] A. Corduneanu and T. Jaakkola, "On information regularization," in Proc. 19th Conf. Uncertainty Artif. Intell. (UAI), 2003.
[59] K. Sohn et al., "FixMatch: Simplifying semi-supervised learning with consistency and confidence," 2020. [Online]. Available: https://arxiv.org/abs/2001.07685
[60] M. Belkin, P. Niyogi, and V. Sindhwani, "Manifold regularization: A geometric framework for learning from labeled and unlabeled examples," J. Mach. Learn. Res., vol. 7, pp. 2399-2434, Dec. 2006.
[61] S. Yu, B. Krishnapuram, R. Rosales, and R. B. Rao, "Bayesian co-training," J. Mach. Learn. Res., vol. 12, no. 80, pp. 2649-2680, 2011.
[62] R. Kunwar, U. Pal, and M. Blumenstein, "Semi-supervised online Bayesian network learner for handwritten characters recognition," in Proc. 22nd Int. Conf. Pattern Recognit., 2014, pp. 3104-3109.
[63] D.-H. Lee, "Pseudo-label: The simple and efficient semi-supervised learning method for deep neural networks," in Proc. Workshop Challenges Representation Learn., ICML, 2013, vol. 3.
[64] Z. Li, B. Ko, and H.-J. Choi, "Naive semi-supervised deep learning using pseudo-label," Peer-to-Peer Netw. Appl., vol. 12, no. 5, pp. 1358-1368, 2019, doi: 10.1007/s12083-018-0702-9.
[65] M. S. Bari, M. T. Mohiuddin, and S. Joty, "MultiMix: A robust data augmentation strategy for cross-lingual NLP," in Proc. ICML, 2020.
[66] O. Chapelle, J. Weston, L. Bottou, and V. Vapnik, "Vicinal risk minimization," in Proc. Adv. Neural Inf. Process. Syst., MIT Press, 2001, pp. 416-422.
[67] Q. Xie, Z. Dai, E. H. Hovy, M. Luong, and Q. V. Le, "Unsupervised data augmentation," 2019. [Online]. Available: http://arxiv.org/abs/1904.12848
[68] T. Hospedales, A. Antoniou, P. Micaelli, and A. Storkey, "Meta-learning in neural networks: A survey," 2020. [Online]. Available: http://arxiv.org/abs/2004.05439
[69] S. J. Pan and Q. Yang, "A survey on transfer learning," IEEE Trans. Knowl. Data Eng., vol. 22, no. 10, pp. 1345-1359, Oct. 2010, doi: 10.1109/TKDE.2009.191.
[70] X. Qiu, T. Sun, Y. Xu, Y. Shao, N. Dai, and X. Huang, "Pre-trained models for natural language processing: A survey," CoRR, vol. abs/2003.08271, 2020.
[71] L. Jing and Y. Tian, "Self-supervised visual feature learning with deep neural networks: A survey," IEEE Trans. Pattern Anal. Mach. Intell., vol. 43, no. 11, pp. 4037-4058, Nov. 2021, doi: 10.1109/TPAMI.2020.2992393.
[72] X.-F. Han, H. Laga, and M. Bennamoun, "Image-based 3D object reconstruction: State-of-the-art and trends in the deep learning era," IEEE Trans. Pattern Anal. Mach. Intell., vol. 43, no. 5, pp. 1578-1604, May 2021, doi: 10.1109/TPAMI.2019.2954885.
[73] H. Laga, L. V. Jospin, F. Boussaid, and M. Bennamoun, "A survey on deep learning techniques for stereo-based depth estimation," IEEE Trans. Pattern Anal. Mach. Intell., vol. 44, no. 4, pp. 1738-1764, Apr. 2022, doi: 10.1109/TPAMI.2020.3032602.
[74] E. Grant, C. Finn, S. Levine, T. Darrell, and T. L. Griffiths, "Recasting gradient-based meta-learning as hierarchical Bayes," in Proc. 6th Int. Conf. Learn. Representations, Vancouver, BC, Canada, 2018.
[75] L. Beyer, X. Zhai, A. Oliver, and A. Kolesnikov, "S4L: Self-supervised semi-supervised learning," in Proc. IEEE/CVF Int. Conf. Comput. Vis. (ICCV), 2019, pp. 1476-1485.
[76] W. K. Hastings, "Monte Carlo sampling methods using Markov chains and their applications," Biometrika, vol. 57, no. 1, pp. 97-109, Apr. 1970, doi: 10.1093/biomet/57.1.97.
[77] D. M. Blei, A. Kucukelbir, and J. D. McAuliffe, "Variational inference: A review for statisticians," J. Amer. Statist. Assoc., vol. 112, no. 518, pp. 859-877, 2017, doi: 10.1080/01621459.2017.1285773.
[78] R. Bardenet, A. Doucet, and C. Holmes, "On Markov chain Monte Carlo methods for tall data," J. Mach. Learn. Res., vol. 18, no. 1, pp. 1515-1557, Jan. 2017.
[79] G. Casella and E. I. George, "Explaining the Gibbs sampler," Amer. Statist., vol. 46, no. 3, pp. 167-174, 1992.
[80] S. Chib and E. Greenberg, "Understanding the Metropolis-Hastings algorithm," Amer. Statist., vol. 49, no. 4, pp. 327-335, 1995, doi: 10.2307/2684568.
[81] R. M. Neal, "MCMC using Hamiltonian dynamics," in Handbook of Markov Chain Monte Carlo. Boca Raton, FL, USA: CRC Press, 2011, ch. 5, pp. 113-162.
[82] M. D. Hoffman and A. Gelman, "The No-U-Turn sampler: Adaptively setting path lengths in Hamiltonian Monte Carlo," J. Mach. Learn. Res., vol. 15, no. 1, pp. 1593-1623, 2014.
[83] S. Kullback and R. A. Leibler, "On information and sufficiency," Ann. Math. Statist., vol. 22, no. 1, pp. 79-86, 1951, doi: 10.1214/aoms/1177729694.
[84] C. E. Shannon, "A mathematical theory of communication," Bell Syst. Tech. J., vol. 27, no. 3, pp. 379-423, 1948, doi: 10.1002/j.1538-7305.1948.tb01338.x.
[85] M. D. Hoffman, D. M. Blei, C. Wang, and J. Paisley, "Stochastic variational inference," J. Mach. Learn. Res., vol. 14, no. 1, pp. 1303-1347, May 2013.
[86] A. Graves, "Practical variational inference for neural networks," in Proc. Adv. Neural Inf. Process. Syst., Curran Associates, 2011, pp. 2348-2356.
[87] Z. Ghahramani and M. J. Beal, "Propagation algorithms for variational Bayesian learning," in Proc. Adv. Neural Inf. Process. Syst., MIT Press, 2001, pp. 507-513.
[88] H. Ritter, A. Botev, and D. Barber, "A scalable Laplace approximation for neural networks," in Proc. Int. Conf. Learn. Representations, 2018.
[89] W. J. Maddox, P. Izmailov, T. Garipov, D. P. Vetrov, and A. G. Wilson, "A simple baseline for Bayesian uncertainty in deep learning," in Proc. Adv. Neural Inf. Process. Syst., Curran Associates, 2019, pp. 13,153-13,164.
[90] J. M. Hernández-Lobato and R. P. Adams, "Probabilistic backpropagation for scalable learning of Bayesian neural networks," in Proc. 32nd Int. Conf. Mach. Learn., 2015, vol. 37, pp. 1861-1869.
[91] C. Blundell, J. Cornebise, K. Kavukcuoglu, and D. Wierstra, "Weight uncertainty in neural network," in Proc. 32nd Int. Conf. Mach. Learn., 2015, vol. 37, pp. 1613-1622.
[92] D. P. Kingma and M. Welling, "An introduction to variational autoencoders," Found. Trends Mach. Learn., vol. 12, no. 4, pp. 307-392, 2019, doi: 10.1561/2200000056.
[93] D. Kingma and J. Ba, "Adam: A method for stochastic optimization," in Proc. Int. Conf. Learn. Representations, Dec. 2014.
[94] N. Srivastava, G. Hinton, A. Krizhevsky, I. Sutskever, and R. Salakhutdinov, "Dropout: A simple way to prevent neural networks from overfitting," J. Mach. Learn. Res., vol. 15, no. 56, pp. 1929-1958, 2014.
[95] Y. Gal and Z. Ghahramani, "Dropout as a Bayesian approximation: Representing model uncertainty in deep learning," in Proc. 33rd Int. Conf. Mach. Learn., 2016, vol. 48, pp. 1050-1059.
[96] Y. Li and Y. Gal, "Dropout inference in Bayesian neural networks with alpha-divergences," in Proc. 34th Int. Conf. Mach. Learn., 2017, vol. 70, pp. 2052-2061.
[97] J. Hron, A. Matthews, and Z. Ghahramani, "Variational Bayesian dropout: Pitfalls and fixes," in Proc. 35th Int. Conf. Mach. Learn., 2018, vol. 80, pp. 2019-2028.
[98] A. Chan, A. Alaa, Z. Qian, and M. van der Schaar, "Unlabelled data improves Bayesian uncertainty calibration under covariate shift," in Proc. 37th Int. Conf. Mach. Learn., Jul. 13-18, 2020, vol. 119, pp. 1392-1402.
[99] S. Mandt, M. D. Hoffman, and D. M. Blei, "Stochastic gradient descent as approximate Bayesian inference," J. Mach. Learn. Res., vol. 18, no. 1, pp. 4873-4907, 2017.
[100] M. Welling and Y. W. Teh, "Bayesian learning via stochastic gradient Langevin dynamics," in Proc. 28th Int. Conf. Mach. Learn., 2011, pp. 681-688.
[101] N. Seedat and C. Kanan, "Towards calibrated and scalable uncertainty representations for neural networks," 2019. [Online]. Available: http://arxiv.org/abs/1911.00104
[102] B. Lakshminarayanan, A. Pritzel, and C. Blundell, "Simple and scalable predictive uncertainty estimation using deep ensembles," in Proc. Adv. Neural Inf. Process. Syst., Curran Associates, 2017, pp. 6402-6413.
[103] M. Khan, D. Nielsen, V. Tangkaratt, W. Lin, Y. Gal, and A. Srivastava, "Fast and scalable Bayesian deep learning by weight-perturbation in Adam," in Proc. 35th Int. Conf. Mach. Learn., 2018, vol. 80, pp. 2611-2620.
[104] T. Pearce, F. Leibfried, A. Brintrup, M. Zaki, and A. Neely, "Uncertainty in neural networks: Approximately Bayesian ensembling," in Proc. AISTATS, 2020.
[105] J. Zeng, A. Lesnikowski, and J. M. Alvarez, "The relevance of Bayesian layer positioning to model uncertainty in deep Bayesian active learning," 2018. [Online]. Available: http://arxiv.org/abs/1811.12535
[106] N. Brosse, C. Riquelme, A. Martin, S. Gelly, and É. Moulines, "On last-layer algorithms for classification: Decoupling representation from uncertainty estimation," 2020. [Online]. Available: http://arxiv.org/abs/2001.08049
[107] E. Snelson and Z. Ghahramani, "Compact approximations to Bayesian predictive distributions," in Proc. 22nd Int. Conf. Mach. Learn., 2005, pp. 840-847.
[108] A. Korattikara, V. Rathod, K. Murphy, and M. Welling, "Bayesian dark knowledge," in Proc. 28th Int. Conf. Neural Inf. Process. Syst., 2015, vol. 2, pp. 3438-3446.
[109] G. Hinton, O. Vinyals, and J. Dean, "Distilling the knowledge in a neural network," 2015, arXiv:1503.02531.
[110] A. K. Menon, A. S. Rawat, S. J. Reddi, S. Kim, and S. Kumar, "Why distillation helps: A statistical perspective," 2020. [Online]. Available: https://arxiv.org/abs/2005.10419
[111] K.-C. Wang, P. Vicol, J. Lucas, L. Gu, R. Grosse, and R. Zemel, "Adversarial distillation of Bayesian neural network posteriors," in Proc. 35th Int. Conf. Mach. Learn., 2018, vol. 80, pp. 5190-5199.
[112] K. Janocha and W. M. Czarnecki, "On loss functions for deep neural networks in classification," Schedae Informaticae, vol. 1/2016, 2017, doi: 10.4467/20838476SI.16.004.6185.
[113] V. Kuleshov, N. Fenner, and S. Ermon, "Accurate uncertainties for deep learning using calibrated regression," in Proc. 35th Int. Conf. Mach. Learn., 2018, vol. 80, pp. 2796-2804.