IEEE Computational Intelligence Magazine - February 2023 - 50

References
[1] Z. Wu, S. Pan, F. Chen, G. Long, C. Zhang, and S. Y. Philip, "A comprehensive survey on graph neural networks," IEEE Trans. Neural Netw. Learn. Syst., vol. 32, no. 1, pp. 4-24, Jan. 2020.
[2] Z. Zhang, P. Cui, and W. Zhu, "Deep learning on graphs: A survey," IEEE Trans. Knowl. Data Eng., vol. 34, no. 1, pp. 249-270, Jan. 2022.
[3] F. Xia et al., "Graph learning: A survey," IEEE Trans. Artif. Intell., vol. 2, no. 2, pp. 109-127, Apr. 2021.
[4] M. Zitnik, M. Agrawal, and J. Leskovec, "Modeling polypharmacy side effects with graph convolutional networks," Bioinformatics, vol. 34, no. 13, pp. i457-i466, 2018.
[5] Z. Wang, T. Chen, J. Ren, W. Yu, H. Cheng, and L. Lin, "Deep reasoning with knowledge graph for social relationship understanding," in Proc. 27th Int. Joint Conf. Artif. Intell., 2018, pp. 1021-1028.
[6] J. Liu et al., "Shifu2: A network representation learning based model for advisor-advisee relationship mining," IEEE Trans. Knowl. Data Eng., vol. 33, no. 4, pp. 1763-1777, Apr. 2019.
[7] X. Chen, T. Tang, J. Ren, I. Lee, H. Chen, and F. Xia, "Heterogeneous graph learning for explainable recommendation over academic networks," in Proc. IEEE/WIC/ACM Int. Conf. Web Intell. Intell. Agent Technol., 2021, pp. 29-36.
[8] X. Kong, K. Wang, M. Hou, F. Xia, G. Karmakar, and J. Li, "Exploring human mobility for multi-pattern passenger prediction: A graph learning framework," IEEE Trans. Intell. Transp. Syst., vol. 23, no. 9, pp. 16148-16160, Sep. 2022.
[9] F. Zhou and C. Cao, "Overcoming catastrophic forgetting in graph neural networks with experience replay," in Proc. AAAI Conf. Artif. Intell., 2021, pp. 4714-4722.
[10] J. Wang, G. Song, Y. Wu, and L. Wang, "Streaming graph neural networks via continual learning," in Proc. 29th ACM Int. Conf. Inf. Knowl. Manage., 2020, pp. 1515-1524.
[11] T. Lesort, V. Lomonaco, A. Stoian, D. Maltoni, D. Filliat, and N. Díaz-Rodríguez, "Continual learning for robotics: Definition, framework, learning strategies, opportunities and challenges," Inf. Fusion, vol. 58, pp. 52-68, 2020.
[12] A. Chaudhry, P. K. Dokania, T. Ajanthan, and P. H. Torr, "Riemannian walk for incremental learning: Understanding forgetting and intransigence," in Proc. Eur. Conf. Comput. Vis., 2018, pp. 532-547.
[13] T. Mitchell et al., "Never-ending learning," Commun. ACM, vol. 61, no. 5, pp. 103-115, 2018.
[14] Z. Chen and B. Liu, Lifelong Machine Learning, vol. 12. San Rafael, CA, USA: Morgan & Claypool, 2018.
[15] P. Ruvolo and E. Eaton, "ELLA: An efficient lifelong learning algorithm," in Proc. 30th Int. Conf. Mach. Learn., S. Dasgupta and D. McAllester, Eds., vol. 28, Jun. 2013, pp. 507-515. [Online]. Available: https://proceedings.mlr.press/v28/ruvolo13.html
[16] G. I. Parisi, R. Kemker, J. L. Part, C. Kanan, and S. Wermter, "Continual lifelong learning with neural networks: A review," Neural Netw., vol. 113, pp. 54-71, 2019.
[17] R. French, "Catastrophic forgetting in connectionist networks," Trends Cogn. Sci., vol. 3, no. 4, pp. 128-135, 1999.
[18] A. Robins, "Catastrophic forgetting, rehearsal and pseudorehearsal," Connection Sci., vol. 7, no. 2, pp. 123-146, 1995.
[19] J. Kirkpatrick et al., "Overcoming catastrophic forgetting in neural networks," Proc. Nat. Acad. Sci., vol. 114, no. 13, pp. 3521-3526, 2017.
[20] Z. Li and D. Hoiem, "Learning without forgetting," IEEE Trans. Pattern Anal. Mach. Intell., vol. 40, no. 12, pp. 2935-2947, Dec. 2018.
[21] A. A. Rusu et al., "Progressive neural networks," 2022, arXiv:1606.04671.
[22] V. Lomonaco and D. Maltoni, "CORe50: A new dataset and benchmark for continuous object recognition," in Proc. Conf. Robot Learn., 2017, pp. 17-26.
[23] D. Lopez-Paz and M. Ranzato, "Gradient episodic memory for continual learning," in Proc. 31st Int. Conf. Neural Inf. Process. Syst., 2017, pp. 6470-6479.
[24] D. Maltoni and V. Lomonaco, "Continuous learning in single-incremental-task scenarios," Neural Netw., vol. 116, pp. 56-73, 2019.
[25] J. Yoon, E. Yang, J. Lee, and S. J. Hwang, "Lifelong learning with dynamically expandable networks," in Proc. Int. Conf. Learn. Representations (ICLR), 2018.
[26] H. Cai, V. W. Zheng, and K. C.-C. Chang, "A comprehensive survey of graph embedding: Problems, techniques, and applications," IEEE Trans. Knowl. Data Eng., vol. 30, no. 9, pp. 1616-1637, Sep. 2018.
[27] T. N. Kipf and M. Welling, "Semi-supervised classification with graph convolutional networks," in Proc. Int. Conf. Learn. Representations (ICLR), 2017.
[28] B. Li and D. Pi, "Learning deep neural networks for node classification," Expert Syst. Appl., vol. 137, pp. 324-334, 2019.
[29] L. Cai, J. Li, J. Wang, and S. Ji, "Line graph neural networks for link prediction," IEEE Trans. Pattern Anal. Mach. Intell., vol. 44, no. 9, pp. 5103-5113, Sep. 2022.
[30] M. Zhang, Z. Cui, M. Neumann, and Y. Chen, "An end-to-end deep learning architecture for graph classification," in Proc. 32nd AAAI Conf. Artif. Intell., 2018, pp. 4438-4445.
[31] F. Scarselli, M. Gori, A. C. Tsoi, M. Hagenbuchner, and G. Monfardini, "The graph neural network model," IEEE Trans. Neural Netw., vol. 20, no. 1, pp. 61-80, Jan. 2009.
[32] J. You, R. Ying, X. Ren, W. L. Hamilton, and J. Leskovec, "GraphRNN: Generating realistic graphs with deep auto-regressive models," in Proc. Int. Conf. Mach. Learn., 2018, pp. 5708-5717, arXiv:1802.08773v3.
[33] T. N. Kipf and M. Welling, "Variational graph auto-encoders," in Proc. NIPS Workshop Bayesian Deep Learn., 2016.
[34] Z. Wu, S. Pan, G. Long, J. Jiang, and C. Zhang, "Graph WaveNet for deep spatial-temporal graph modeling," in Proc. 28th Int. Joint Conf. Artif. Intell., 2019, pp. 1907-1913.
[35] W. Hamilton, Z. Ying, and J. Leskovec, "Inductive representation learning on large graphs," in Proc. 31st Int. Conf. Neural Inf. Process. Syst., 2017, pp. 1025-1035.
[36] L. Galke, B. Franke, T. Zielke, and A. Scherp, "Lifelong learning of graph neural networks for open-world node classification," in Proc. Int. Joint Conf. Neural Netw., 2021, pp. 1-8.
[37] S. M. Kazemi et al., "Representation learning for dynamic graphs: A survey," J. Mach. Learn. Res., vol. 21, no. 70, pp. 1-73, 2020.
[38] C. D. T. Barros, M. R. F. Mendonça, A. B. Vieira, and A. Ziviani, "A survey on embedding dynamic graphs," ACM Comput. Surv., vol. 55, no. 1, pp. 1-37, Nov. 2021.
[39] G. Xue, M. Zhong, J. Li, J. Chen, C. Zhai, and R. Kong, "Dynamic network embedding survey," Neurocomputing, vol. 472, pp. 212-223, 2022.
[40] P. Goyal, N. Kamra, X. He, and Y. Liu, "DynGEM: Deep embedding method for dynamic graphs," 2018, arXiv:1805.11273.
[41] P. Goyal, S. R. Chhetri, and A. Canedo, "dyngraph2vec: Capturing network dynamics using dynamic graph representation learning," Knowl.-Based Syst., vol. 187, 2020, Art. no. 104816.
[42] L. Zhou, Y. Yang, X. Ren, F. Wu, and Y. Zhuang, "Dynamic network embedding by modeling triadic closure process," in Proc. 32nd AAAI Conf. Artif. Intell., 2018, pp. 571-578.
[43] Y. Han, S. Karunasekera, and C. Leckie, "Graph neural networks with continual learning for fake news detection from social media," 2020, arXiv:2007.03316.
[44] X. Chen, J. Wang, and K. Xie, "TrafficStream: A streaming traffic flow forecasting framework based on graph neural networks and continual learning," in Proc. 30th Int. Joint Conf. Artif. Intell., 2021, pp. 3620-3626.
[45] T. Guo et al., "Graduate employment prediction with bias," in Proc. AAAI Conf. Artif. Intell., 2020, vol. 34, pp. 670-677.
[46] A. Chaudhary, H. Mittal, and A. Arora, "Anomaly detection using graph neural networks," in Proc. Int. Conf. Mach. Learn., Big Data, Cloud Parallel Comput., 2019, pp. 346-350.
[47] M. Delange et al., "A continual learning survey: Defying forgetting in classification tasks," IEEE Trans. Pattern Anal. Mach. Intell., vol. 44, no. 7, pp. 3366-3385, Jul. 2022.
[48] M. Biesialska, K. Biesialska, and M. R. Costa-jussà, "Continual lifelong learning in natural language processing: A survey," in Proc. 28th Int. Conf. Comput. Linguistics, 2020, pp. 6523-6541.
[49] C.-Y. Hung, C.-H. Tu, C.-E. Wu, C.-H. Chen, Y.-M. Chan, and C.-S. Chen, "Compacting, picking and growing for unforgetting continual learning," in Proc. 33rd Int. Conf. Neural Inf. Process. Syst., 2019, pp. 13677-13687.
[50] T. L. Hayes, N. D. Cahill, and C. Kanan, "Memory efficient experience replay for streaming learning," in Proc. Int. Conf. Robot. Automat., 2019, pp. 9769-9776.
[51] S.-A. Rebuffi, A. Kolesnikov, G. Sperl, and C. H. Lampert, "iCaRL: Incremental classifier and representation learning," in Proc. IEEE Conf. Comput. Vis. Pattern Recognit., 2017, pp. 2001-2010.
[52] A. Jain, A. R. Zamir, S. Savarese, and A. Saxena, "Structural-RNN: Deep learning on spatio-temporal graphs," in Proc. IEEE Conf. Comput. Vis. Pattern Recognit., 2016, pp. 5308-5317.
[53] B. Yu, H. Yin, and Z. Zhu, "Spatio-temporal graph convolutional networks: A deep learning framework for traffic forecasting," in Proc. 27th Int. Joint Conf. Artif. Intell., 2018, pp. 3634-3640.
[54] J. Zhang, X. Shi, J. Xie, H. Ma, I. King, and D.-Y. Yeung, "GaAN: Gated attention networks for learning on large and spatiotemporal graphs," in Proc. 24th Conf. Uncertainty Artif. Intell., 2018, pp. 339-349.
[55] S. Guo, Y. Lin, N. Feng, C. Song, and H. Wan, "Attention based spatial-temporal graph convolutional networks for traffic flow forecasting," in Proc. AAAI Conf. Artif. Intell., 2019, vol. 33, pp. 922-929.
[56] C. Wang, D. Gao, Y. Qiu, and S. Scherer, "Lifelong graph learning," in Proc. Conf. Comput. Vis. Pattern Recognit., 2022, pp. 13719-13728.
[57] A. Carta, A. Cossu, F. Errica, and D. Bacciu, "Catastrophic forgetting in deep graph networks: An introductory benchmark for graph classification," in Proc. ACM Symp. Neural Gaze Detection, 2018.
[58] F. Zhuang et al., "A comprehensive survey on transfer learning," Proc. IEEE, vol. 109, no. 1, pp. 43-76, Jan. 2021.
[59] Y. Zhang and Q. Yang, "A survey on multi-task learning," IEEE Trans. Knowl. Data Eng., vol. 34, no. 12, pp. 5586-5609, Dec. 2022.
[60] S. C. Hoi, D. Sahoo, J. Lu, and P. Zhao, "Online learning: A comprehensive survey," Neurocomputing, vol. 459, pp. 249-289, 2021.
[61] J. Yang, K. Zhou, Y. Li, and Z. Liu, "Generalized out-of-distribution detection: A survey," 2021, arXiv:2110.11334.
[62] H. Li, X. Wang, Z. Zhang, and W. Zhu, "Out-of-distribution generalization on graphs: A survey," 2022, arXiv:2202.07987.
[63] T. Lesort, M. Caccia, and I. Rish, "Understanding continual learning settings with data distribution drift analysis," 2021, arXiv:2104.01678.
[64] M. Yang, M. Zhou, M. Kalander, Z. Huang, and I. King, "Discrete-time temporal network embedding via implicit hierarchical learning in hyperbolic space," in Proc. 27th ACM SIGKDD Conf. Knowl. Discov. Data Mining, 2021, pp. 1975-1985.
[65] J. Skarding, B. Gabrys, and K. Musial, "Foundations and modeling of dynamic networks using dynamic graph neural networks: A survey," IEEE Access, vol. 9, pp. 79143-79168, 2021.