network acceleration: A survey," Neurocomputing, vol. 461, pp. 370-403, 2021.
[30] W. Wen, C. Wu, Y. Wang, Y. Chen, and H. Li, "Learning structured sparsity in deep neural networks," in Proc. Adv. Neural Inf. Process. Syst., 2016, pp. 2074-2082.
[31] Z. Liu, J. Li, Z. Shen, G. Huang, S. Yan, and C. Zhang, "Learning efficient convolutional networks through network slimming," in Proc. IEEE Int. Conf. Comput. Vis., 2017, pp. 2736-2744.
[32] H. Hu, R. Peng, Y.-W. Tai, and C.-K. Tang, "Network trimming: A data-driven neuron pruning approach towards efficient deep architectures," 2016, arXiv:1607.03250.
[33] B. Hassibi and D. G. Stork, Second Order Derivatives for Network Pruning: Optimal Brain Surgeon. San Mateo, CA, USA: Morgan, 1993.
[34] N. Lee, T. Ajanthan, and P. H. Torr, "SNIP: Single-shot network pruning based on connection sensitivity," 2018, arXiv:1810.02340.
[35] H. X. Choong, Y.-S. Ong, A. Gupta, and R. Lim, "Jack and masters of all trades: One-pass learning of a set of model sets from foundation models," 2022, arXiv:2205.00671.
[36] Q. Huang, K. Zhou, S. You, and U. Neumann, "Learning to prune filters in convolutional neural networks," in Proc. IEEE Winter Conf. Appl. Comput. Vis., 2018, pp. 709-718.
[37] A. Renda, J. Frankle, and M. Carbin, "Comparing rewinding and fine-tuning in neural network pruning," 2020, arXiv:2003.02389.
[38] C. Wang, G. Zhang, and R. Grosse, "Picking winning tickets before training by preserving gradient flow," 2020, arXiv:2002.07376.
[39] D. C. Mocanu, E. Mocanu, P. Stone, P. H. Nguyen, M. Gibescu, and A. Liotta, "Scalable training of artificial neural networks with adaptive sparse connectivity inspired by network science," Nature Commun., vol. 9, no. 1, pp. 1-12, 2018.
[40] S. Gupta, A. Agrawal, K. Gopalakrishnan, and P. Narayanan, "Deep learning with limited numerical precision," in Proc. Int. Conf. Mach. Learn., 2015, pp. 1737-1746.
[41] V. Vanhoucke, A. Senior, and M. Z. Mao, "Improving the speed of neural networks on CPUs," in Proc. Deep Learn. Unsupervised Feature Learn. Workshop, 2011.
[42] I. Hubara, M. Courbariaux, D. Soudry, R. El-Yaniv, and Y. Bengio, "Binarized neural networks," in Proc. Adv. Neural Inf. Process. Syst., vol. 29, 2016.
[43] M. Rastegari, V. Ordonez, J. Redmon, and A. Farhadi, "XNOR-Net: ImageNet classification using binary convolutional neural networks," in Proc. Eur. Conf. Comput. Vis., 2016, pp. 525-542.
[44] B. Jacob et al., "Quantization and training of neural networks for efficient integer-arithmetic-only inference," in Proc. IEEE Conf. Comput. Vis. Pattern Recognit., 2018, pp. 2704-2713.
[45] A. Zhou, A. Yao, Y. Guo, L. Xu, and Y. Chen, "Incremental network quantization: Towards lossless CNNs with low-precision weights," 2017, arXiv:1702.03044.
[46] S. Zhou, Y. Wu, Z. Ni, X. Zhou, H. Wen, and Y. Zou, "DoReFa-Net: Training low bitwidth convolutional neural networks with low bitwidth gradients," 2016, arXiv:1606.06160.
[47] Y. Gong, L. Liu, M. Yang, and L. Bourdev, "Compressing deep convolutional networks using vector quantization," 2014, arXiv:1412.6115.
[48] J. Wu, C. Leng, Y. Wang, Q. Hu, and J. Cheng, "Quantized convolutional neural networks for mobile devices," in Proc. IEEE Conf. Comput. Vis. Pattern Recognit., 2016, pp. 4820-4828.
[49] S. Han, H. Mao, and W. J. Dally, "Deep compression: Compressing deep neural networks with pruning, trained quantization and Huffman coding," 2015, arXiv:1510.00149.
[50] L. Wang and K.-J. Yoon, "Knowledge distillation and student-teacher learning for visual intelligence: A review and new outlooks," IEEE Trans. Pattern Anal. Mach. Intell., vol. 44, no. 6, pp. 3048-3068, Jun. 2022.
[51] J. Gou, B. Yu, S. J. Maybank, and D. Tao, "Knowledge distillation: A survey," Int. J. Comput. Vis., vol. 129, no. 6, pp. 1789-1819, 2021.
[52] G. Hinton, O. Vinyals, and J. Dean, "Distilling the knowledge in a neural network," in Proc. NIPS Deep Learn. Representation Learn. Workshop, vol. 1050, 2014, Art. no. 9.
[53] A. Romero, N. Ballas, S. E. Kahou, A. Chassang, C. Gatta, and Y. Bengio, "FitNets: Hints for thin deep nets," 2014, arXiv:1412.6550.
[54] N. Komodakis and S. Zagoruyko, "Paying more attention to attention: Improving the performance of convolutional neural networks via attention transfer," in Proc. Int. Conf. Learn. Representations, 2017.
[55] S. Lee and B. C. Song, "Graph-based knowledge distillation by multi-head attention network," 2019, arXiv:1907.02226.
[56] W. Park, D. Kim, Y. Lu, and M. Cho, "Relational knowledge distillation," in Proc. IEEE/CVF Conf. Comput. Vis. Pattern Recognit., 2019, pp. 3967-3976.
[57] Y. Tian, D. Krishnan, and P. Isola, "Contrastive representation distillation," 2019, arXiv:1910.10699.
[58] S. H. Lee, D. H. Kim, and B. C. Song, "Self-supervised knowledge distillation using singular value decomposition," in Proc. Eur. Conf. Comput. Vis., 2018, pp. 335-350.
[59] J. Yim, D. Joo, J. Bae, and J. Kim, "A gift from knowledge distillation: Fast optimization, network minimization and transfer learning," in Proc. IEEE Conf. Comput. Vis. Pattern Recognit., 2017, pp. 4133-4141.
[60] T. Fukuda, M. Suzuki, G. Kurata, S. Thomas, J. Cui, and B. Ramabhadran, "Efficient knowledge distillation from an ensemble of teachers," in Proc. Annu. Conf. Int. Speech Commun. Assoc., 2017, pp. 3697-3701.
[61] F. Yuan et al., "Reinforced multi-teacher selection for knowledge distillation," in Proc. AAAI Conf. Artif. Intell., 2021, pp. 14284-14291.
[62] T. Furlanello, Z. Lipton, M. Tschannen, L. Itti, and A. Anandkumar, "Born again neural networks," in Proc. Int. Conf. Mach. Learn., 2018, pp. 1607-1616.
[63] Y. Zhang, T. Xiang, T. M. Hospedales, and H. Lu, "Deep mutual learning," in Proc. IEEE Conf. Comput. Vis. Pattern Recognit., 2018, pp. 4320-4328.
[64] Q. Xu, Z. Chen, K. Wu, C. Wang, M. Wu, and X. Li, "KDnet-RUL: A knowledge distillation framework to compress deep neural networks for machine remaining useful life prediction," IEEE Trans. Ind. Electron., vol. 69, no. 2, pp. 2022-2032, Feb. 2022.
[65] L. Beyer, X. Zhai, A. Royer, L. Markeeva, R. Anil, and A. Kolesnikov, "Knowledge distillation: A good teacher is patient and consistent," 2021, arXiv:2106.05237.
[66] B. Li et al., "Full-cycle energy consumption benchmark for low-carbon computer vision," 2021, arXiv:2108.13465.
[67] J. Deng, W. Dong, R. Socher, L.-J. Li, K. Li, and L. Fei-Fei, "ImageNet: A large-scale hierarchical image database," in Proc. IEEE Conf. Comput. Vis. Pattern Recognit., 2009, pp. 248-255.
[68] P. Ren et al., "A survey of deep active learning," 2020, arXiv:2009.00236.
[69] B. Settles, "Active learning literature survey," Technical Report, University of Wisconsin-Madison, 2009.
[70] R. Schumann and I. Rehbein, "Active learning via membership query synthesis for semi-supervised sentence classification," in Proc. 23rd Conf. Comput. Natural Lang. Learn., 2019, pp. 472-481.
[71] J. Smailović, M. Grčar, N. Lavrač, and M. Žnidaršič, "Stream-based active learning for sentiment analysis in the financial domain," Inf. Sci., vol. 285, pp. 181-203, 2014.
[72] T. He, X. Jin, G. Ding, L. Yi, and C. Yan, "Towards better uncertainty sampling: Active learning with multiple views for deep convolutional neural network," in Proc. IEEE Int. Conf. Multimedia Expo, 2019, pp. 1360-1365.
[73] Y. Siddiqui, J. Valentin, and M. Nießner, "ViewAL: Active learning with viewpoint entropy for semantic segmentation," in Proc. IEEE/CVF Conf. Comput. Vis. Pattern Recognit., 2020, pp. 9433-9443.
[74] Y. Geifman and R. El-Yaniv, "Deep active learning over the long tail," 2017, arXiv:1711.00941.
[75] Y. Wang and Q. Yao, "Few-shot learning: A survey," 2019, arXiv:1904.05046.
[76] J. Vanschoren, "Meta-learning: A survey," 2018, arXiv:1810.03548.
[77] C. Finn, P. Abbeel, and S. Levine, "Model-agnostic meta-learning for fast adaptation of deep networks," in Proc. Int. Conf. Mach. Learn., 2017, pp. 1126-1135.
[78] M. A. Jamal and G.-J. Qi, "Task agnostic meta-learning for few-shot learning," in Proc. IEEE/CVF Conf. Comput. Vis. Pattern Recognit., 2019, pp. 11719-11727.
[79] Z. Hou, A. Walid, and S.-Y. Kung, "Meta-learning with attention for improved few-shot learning," in Proc. IEEE Int. Conf. Acoust., Speech, Signal Process., 2021, pp. 2725-2729.
[80] G. Koch et al., "Siamese neural networks for one-shot image recognition," in Proc. Int. Conf. Mach. Learn. Deep Learn. Workshop, 2015.
[81] E. Hoffer and N. Ailon, "Deep metric learning using triplet network," in Proc. Int. Workshop Similarity-Based Pattern Recognit., 2015, pp. 84-92.
[82] O. Vinyals et al., "Matching networks for one shot learning," in Proc. Adv. Neural Inf. Process. Syst., 2016, pp. 3630-3638.
[83] J. Snell, K. Swersky, and R. S. Zemel, "Prototypical networks for few-shot learning," in Proc. 31st Int. Conf. Neural Inf. Process. Syst., 2017, pp. 4080-4090.
[84] F. Sung, Y. Yang, L. Zhang, T. Xiang, P. H. Torr, and T. M. Hospedales, "Learning to compare: Relation network for few-shot learning," in Proc. IEEE Conf. Comput. Vis. Pattern Recognit., 2018, pp. 1199-1208.
[85] P. Hemmer, N. Kühl, and J. Schöffer, "DEAL: Deep evidential active learning for image classification," in Deep Learning Applications, vol. 3. Berlin, Germany: Springer, 2022, pp. 171-192.
[86] T. Yuan et al., "Multiple instance active learning for object detection," in Proc. IEEE/CVF Conf. Comput. Vis. Pattern Recognit., 2021, pp. 5330-5339.
[87] D. Wertheimer, L. Tang, and B. Hariharan, "Few-shot classification with feature map reconstruction networks," in Proc. IEEE/CVF Conf. Comput. Vis. Pattern Recognit., 2021, pp. 8012-8021.
[88] S. J. Pan and Q. Yang, "A survey on transfer learning," IEEE Trans. Knowl. Data Eng., vol. 22, no. 10, pp. 1345-1359, Oct. 2010.
[89] E. Tzeng, J. Hoffman, N. Zhang, K. Saenko, and T. Darrell, "Deep domain confusion: Maximizing for domain invariance," 2014, arXiv:1412.3474.
[90] C. Chen et al., "HoMM: Higher-order moment matching for unsupervised domain adaptation," in Proc. AAAI Conf. Artif. Intell., 2020, pp. 3422-3429.
[91] B. Sun, J. Feng, and K. Saenko, "Correlation alignment for unsupervised domain adaptation," in Domain Adaptation in Computer Vision Applications. Berlin, Germany: Springer, 2017, pp. 153-171.
[92] M. M. Rahman, C. Fookes, M. Baktashmotlagh, and S. Sridharan, "On minimum discrepancy estimation for deep domain adaptation," in Domain Adaptation for Visual Understanding. Berlin, Germany: Springer, 2020, pp. 81-94.
[93] E. Tzeng, J. Hoffman, K. Saenko, and T. Darrell, "Adversarial discriminative domain adaptation," in Proc. IEEE Conf. Comput. Vis. Pattern Recognit., 2017, pp. 7167-7176.
[94] Y. Ganin et al., "Domain-adversarial training of neural networks," J. Mach. Learn. Res., vol. 17, no. 1, pp. 1-35, 2016.
[95] K. Wu, M. Wu, J. Yang, Z. Chen, Z. Li, and X. Li, "Deep reinforcement learning boosted partial domain adaptation," in Proc. Int. Joint Conf. Artif. Intell., 2021, pp. 3192-3199.
[96] S. Wang and L. Zhang, "Self-adaptive reweighted adversarial domain adaptation," 2020, arXiv:2006.00223.