MMORPGs. MHiCS is a multi-layer
architecture in which each layer consists
of multiple LCS. The first-level LCS aim
at learning multiple high-level behaviors
(e.g., attack, track and eat), based on different motivation values that are computed as a function of the current environmental conditions and the given NPC's personality traits. The second-level LCS are applied to produce a specific action based on the given motivation values and the fitness of the first-level LCS. The last two levels of LCS are
used to produce final actions, by diffusing outputs from the activated LCS in
the previous levels, and to manipulate
effectors or resources related to these
actions. The overall architecture for
MHiCS is quite involved and this precludes conducting an in-depth analysis
of its utility [60]. The authors contend
that LCS are better methods for modeling NPCs, as they fulfill four key agent
design requirements: a) reactivity
through event-action rules; b) proactivity through encoding flexibility; c)
autonomy through evolutionary learning; and d) reconfigurability because of
their interpretable knowledge base.
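To give a rough feel for this layered arrangement, the following Python sketch caricatures a two-level, motivation-driven selection: first-level rules vote for a high-level behavior weighted by motivation, and second-level rules then pick a concrete action. The behavior names, motivation function and weighting scheme are illustrative assumptions and are not taken from the MHiCS implementation described in [60].

# A minimal sketch of two-level, motivation-driven rule selection in the
# spirit of MHiCS. The behaviors, motivation function and weighting scheme
# are illustrative assumptions, not the published architecture of [60].
from dataclasses import dataclass

@dataclass
class Rule:
    condition: dict       # attribute -> required value ('#' = don't care)
    output: str           # a behavior (level 1) or a concrete action (level 2)
    fitness: float = 1.0

def matches(rule, state):
    return all(v == '#' or state.get(k) == v for k, v in rule.condition.items())

def motivation(behavior, drives, personality):
    # Toy motivation: an environmental drive scaled by a personality trait.
    return drives[behavior] * personality.get(behavior, 1.0)

def select_action(level1, level2, state, drives, personality):
    # Level 1: score each matched high-level behavior by motivation x fitness.
    scores = {}
    for r in level1:
        if matches(r, state):
            s = motivation(r.output, drives, personality) * r.fitness
            scores[r.output] = scores.get(r.output, 0.0) + s
    behavior = max(scores, key=scores.get)
    # Level 2: among the rules serving that behavior, emit the fittest action.
    candidates = [r for r in level2[behavior] if matches(r, state)]
    return behavior, max(candidates, key=lambda r: r.fitness).output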
Sanchez et al. [63] used an XCS within a behavioral animation framework called Virtual Behaviors (ViBes), which was used to model and simulate NPCs. ViBes
consists of four main components,
including a decision-making module
that is responsible for selecting given
behaviors (e.g., eat, sleep or walk), based
on a perceived input. An XCS was used
in this module to learn rules adaptively
for selecting different behaviors. The
resulting system was tested in a game
called V-Man, to model the agent behavior in a virtual kitchen environment.
The authors observed that LCS-based
agent models are able to provide reactive
behaviors while simultaneously planning the long sequences of actions that virtual characters need in order to operate in situated environments.
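The decision-making step itself follows the usual XCS cycle: build a match set for the perceived input, compute a fitness-weighted prediction per behavior, and act on the best one. The sketch below illustrates only that selection step; the sensor encoding and behavior names are invented for illustration and are not ViBes internals.

# Toy XCS-style behavior selection for a ViBes-like decision module.
# Sensor bits and behavior names are illustrative assumptions.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Classifier:
    condition: str        # ternary string over sensed bits ('#' = don't care)
    behavior: str         # e.g., 'eat', 'sleep' or 'walk'
    prediction: float     # expected payoff
    fitness: float        # accuracy-based fitness maintained by XCS

def matches(condition, sensors):
    return all(c == '#' or c == s for c, s in zip(condition, sensors))

def select_behavior(population, sensors):
    match_set = [cl for cl in population if matches(cl.condition, sensors)]
    # Fitness-weighted prediction per behavior (the XCS prediction array).
    num, den = defaultdict(float), defaultdict(float)
    for cl in match_set:
        num[cl.behavior] += cl.prediction * cl.fitness
        den[cl.behavior] += cl.fitness
    return max(num, key=lambda b: num[b] / den[b])

population = [
    Classifier('1#0', 'eat',   prediction=0.9, fitness=0.8),
    Classifier('0#1', 'sleep', prediction=0.7, fitness=0.6),
    Classifier('###', 'walk',  prediction=0.3, fitness=0.5),
]
print(select_behavior(population, '100'))    # -> 'eat'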
Recently, Kop et al. [64] proposed the Evolutionary Dynamic Scripting (EDS) method to model NPCs
in serious games. EDS borrows ideas
from LCS, using a rule-based framework
that is similar to that used in LCS but
differs in two algorithmic components.
First, instead of using a GA, EDS uses a
tree representation for rules and genetic
programming to evolve the rules. Second, EDS replaces the Q-learning-based
RL component with a technique called
Dynamic Scripting (DS). Unlike other RL
methods that aim at learning the state-action mapping or a policy during
interaction with the environment, DS
works on the pre-defined rules (or a
policy) and only adjusts the rule weights
based on their applicability to different states and their corresponding performance.
An air combat simulation was used to
evaluate EDS and showed that EDS can
produce slightly improved NPC behavior and can discover useful novel rules.
The authors note that evolutionary rule-based RL systems have an edge over traditional techniques because of their ability to continuously produce novel behavior in such scenarios and to allow the integration of external
domain knowledge.
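Dynamic Scripting itself is simple to state: a script is drawn from a weighted rulebase and, after each encounter, the weights of the rules that were actually used are adjusted in proportion to the outcome, with the change redistributed over the unused rules so that the total weight stays roughly constant. The sketch below illustrates this generic weight-update idea only; the constants and the fitness signal are assumptions, and it is not the EDS implementation of [64].

# Generic Dynamic Scripting-style weight adjustment (an illustrative sketch,
# not the EDS implementation of [64]). Rules keep their conditions and
# actions fixed; only their selection weights track observed performance.
import random

def generate_script(n_rules, weights, script_size):
    # Draw rule indices with probability proportional to weight
    # (duplicates collapse via set(); kept simple for illustration).
    return set(random.choices(range(n_rules), weights=weights, k=script_size))

def update_weights(weights, used, fitness, lr=0.2, w_min=0.1, w_max=10.0):
    # fitness in [-1, 1]: outcome of the encounter for the script's owner.
    new = list(weights)
    surplus = 0.0
    for i in used:
        target = min(w_max, max(w_min, new[i] + lr * fitness))
        surplus += new[i] - target            # weight released by used rules
        new[i] = target
    unused = [i for i in range(len(new)) if i not in used]
    if unused:
        share = surplus / len(unused)         # keeps total weight roughly constant
        for i in unused:
            new[i] = min(w_max, max(w_min, new[i] + share))
    return new

weights = [1.0] * 8
script = generate_script(8, weights, script_size=4)
weights = update_weights(weights, script, fitness=0.6)   # a successful encounter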
Several studies have explored the use
of LCS for designing simulated soccer-playing
agents. Among these, Bonarini and Trianni's [65] work was the first application
of fuzzy-LCS to simulated soccer games,
specifically to RoboCup. The fuzzy-LCS
are similar in effect to XCS, but differ by
using fuzzy logic to encode rules. The
experimental results showed a significantly better passing performance when
communication between agents was
enabled. These results provided evidence
for a successful co-evolution of cooperation using LCS-based agents.
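The essential difference introduced by the fuzzy encoding is that a condition is a conjunction of fuzzy memberships, so several rules can fire partially at once and their actions compete by firing strength. The fragment below is a minimal, generic illustration of that idea; the variables and membership functions are invented and do not reproduce the fuzzy-LCS of [65].

# Minimal fuzzy rule firing in the spirit of a fuzzy-LCS. The variables and
# membership functions are invented and not those of [65].
def close(x, scale):      # membership of "x is small/close"
    return max(0.0, 1.0 - x / scale)

def far(x, scale):        # membership of "x is large/far"
    return min(1.0, x / scale)

# Fuzzy rules: a conjunction of memberships -> an action. A fuzzy-LCS evolves
# and reinforces such rules; several of them can fire partially at once.
rules = [
    (lambda s: min(close(s['ball_dist'], 10), far(s['mate_openness'], 1.0)), 'pass'),
    (lambda s: far(s['ball_dist'], 20),                                      'intercept'),
]

def act(state):
    # The action with the highest firing strength wins (a crude defuzzification).
    strengths = [(cond(state), action) for cond, action in rules]
    return max(strengths)[1]

print(act({'ball_dist': 3.0, 'mate_openness': 0.9}))     # -> 'pass'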
Castillo et al. [66] used an XCS to model the players in RoboCup. The XCS-based players were built on top of an existing static
rule-based agent architecture (11Monkeys [67]). Experiments with different
settings were conducted. However, the
best results were obtained when the
agents used a mixed strategy in which
the XCS population was always initialized with scripted rules taken from
11Monkeys. These rules were kept
unchanged during the evolution, but
additional rules were learned to complement this fixed strategy.
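In code, this mixed strategy corresponds to initializing the classifier population with hand-written rules that are protected from deletion, while the GA evolves the remaining slots around them. The schematic below illustrates only that seeding mechanism; the rule syntax and placeholder rules are invented and are not the actual 11Monkeys rules.

# Schematic population seeding: scripted rules are protected from deletion,
# and the GA evolves learned rules around them. The rule syntax and the
# placeholder rules are invented, not the actual 11Monkeys rules.
import random
from dataclasses import dataclass

@dataclass
class Classifier:
    condition: str            # ternary string over sensed bits
    action: str
    fitness: float = 1.0
    protected: bool = False   # True for seeded, scripted rules

def random_classifier(n_bits, actions):
    condition = ''.join(random.choice('01#') for _ in range(n_bits))
    return Classifier(condition, random.choice(actions))

def seed_population(scripted, n_bits, actions, pop_size):
    pop = [Classifier(c, a, protected=True) for c, a in scripted]
    while len(pop) < pop_size:                # fill the rest with random rules
        pop.append(random_classifier(n_bits, actions))
    return pop

def deletion_candidates(pop):
    # The GA may only replace unprotected (learned) classifiers.
    return [cl for cl in pop if not cl.protected]

scripted = [('1#####', 'shoot'), ('0#1###', 'pass')]      # invented placeholders
pop = seed_population(scripted, n_bits=6,
                      actions=['shoot', 'pass', 'dribble'], pop_size=50)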
Sato, along with different co-authors, later published a series of papers on an event-driven hybrid LCS approach for modeling online soccer-playing agents [68]-[72].
The main idea behind this system is to
employ LCS as a meta-learner or a
hyper-heuristic algorithm that learns
rules to select appropriate action-selection algorithms or strategies in an online
soccer game. That is, different soccer-playing algorithms and strategies may
perform differently in different game
scenarios; the goal of meta-learning LCS
is to map the most effective algorithms
to the corresponding game scenarios.
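Viewed as a hyper-heuristic, the LCS in this line of work does not emit primitive actions at all; its classifiers map descriptions of the current game scenario to one of several pre-built action-selection strategies, and the reward earned by the delegated strategy reinforces the classifier that advocated it. The sketch below conveys only that mapping; the scenario features, strategy names and update rule are assumptions rather than details of [68]-[72].

# An LCS as a meta-learner: classifiers select among whole strategies rather
# than primitive actions. Scenario features, strategy names and the update
# rule are illustrative assumptions, not details of [68]-[72].
from dataclasses import dataclass

@dataclass
class MetaClassifier:
    scenario: dict      # e.g., {'possession': 'ours', 'zone': '#'} ('#' = any)
    strategy: str       # name of the action-selection algorithm to delegate to
    value: float = 0.0  # running estimate of the strategy's payoff here

def matches(cl, scenario):
    return all(v == '#' or scenario.get(k) == v for k, v in cl.scenario.items())

def choose_strategy(population, scenario):
    matched = [cl for cl in population if matches(cl, scenario)]
    return max(matched, key=lambda cl: cl.value)

def reinforce(cl, reward, beta=0.1):
    # Widrow-Hoff style update of the advocating classifier's value.
    cl.value += beta * (reward - cl.value)

population = [
    MetaClassifier({'possession': 'ours',   'zone': 'attack'},  'positional_play'),
    MetaClassifier({'possession': 'theirs', 'zone': '#'},       'pressing'),
    MetaClassifier({'possession': '#',      'zone': 'defence'}, 'clear_ball'),
]
cl = choose_strategy(population, {'possession': 'theirs', 'zone': 'midfield'})
reinforce(cl, reward=1.0)    # the delegated strategy produced a good outcome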
RoboCup, being a real-time, dynamic learning environment that requires coordination within a team of agents, introduces interesting challenges for agent design. The above-mentioned studies have demonstrated
the modeling power and flexibility that
LCS provide in dealing with such an
environment. Bonarini's work showed
the integration of a fuzzy knowledge
base with RL using the LCS framework.
Castillo's work showed how LCS agents
could be seeded with pre-scripted rules
and evolved thereafter. Sato's work
showed how LCS could be used as an
agent as well as a meta-agent. These
studies also demonstrated the ability of
LCS to incorporate communication between agents and to evolve
their knowledge cooperatively.
Among applications of LCS to other
action and RTS games, Falke and Ross
[73] used ZCS to model an adaptive
agent in a war game that models a battle between two armies.
ZCS was used to learn an adaptive strategy to control an army squad against
human-controlled army squads. To avoid
slow convergence, the system was provided with an immediate reward signal
and was made to evolve classifiers at an
abstract level. Learning in this more generalized or simpler search space helped
ZCS to evolve increasingly complex
controllers when competing against
humans. This work showed that lightweight LCS models encoded with simpler conditions and abstracted actions
could be more effective for RTS games
than complex LCS models. Such models
also provided better transparency and
portability than previous attempts in the
literature.
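The abstraction used there amounts to sensing and acting at the squad level rather than per unit: each classifier covers a coarse situation and issues a coarse order, and an immediate shaping reward is credited to the classifiers that fired instead of waiting for the battle outcome. The fragment below only illustrates that encoding idea; the features, orders and reward rule are invented and are not those of [73].

# Illustration of abstract, squad-level ZCS encoding with an immediate reward.
# The features, orders and reward rule are invented, not those of [73].
from dataclasses import dataclass

@dataclass
class ZcsClassifier:
    condition: str        # e.g., 'S # O' over the coarse features below
    order: str            # a squad-level order rather than a per-unit action
    strength: float = 10.0

def sense(squad, enemy):
    # Coarse, abstracted features instead of raw unit positions (assumed fields):
    # relative enemy strength (S/W), own health (H/L), cover (C/O).
    return ('S' if enemy['power'] > squad['power'] else 'W',
            'H' if squad['health'] > 0.5 else 'L',
            'C' if squad['in_cover'] else 'O')

def matches(condition, features):
    return all(c == '#' or c == f for c, f in zip(condition.split(), features))

def choose(population, features):
    matched = [cl for cl in population if matches(cl.condition, features)]
    return max(matched, key=lambda cl: cl.strength)

def immediate_reward(damage_dealt, damage_taken):
    # A per-step shaping signal instead of waiting for the battle outcome.
    return damage_dealt - damage_taken

population = [ZcsClassifier('S # O', 'retreat'),
              ZcsClassifier('W H #', 'assault'),
              ZcsClassifier('# L #', 'hold_position')]
cl = choose(population, sense({'power': 5, 'health': 0.8, 'in_cover': False},
                              {'power': 9}))
cl.strength += 0.2 * immediate_reward(damage_dealt=1.0, damage_taken=3.0)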
Lujan et al. [74], [75] applied