IEEE Systems, Man and Cybernetics Magazine - April 2018 - 20

Discussion

Remarks on the Identification of ROIs for Movement Decoding

In the previous section, we noted that there are several ROIs involved in the encoding and decoding of motor tasks. However, there is no general consensus regarding the selection of ROIs, as there are studies reporting both brain-atlas-based predefined selection of ROIs (e.g., [28], [35], [71]) and data-driven methods for such selection [24], [26], [38], [72]. In the absence of a subject-specific cortical model, we suggest employing data-driven ROI selection, either by using information-theoretic approaches [26], [72] or based on statistically significant voxels, as in [24] and [38].

From a BCI perspective, classification performance may suffer if we use only the data within a predefined ROI, as there is a risk that the discriminative sources lie outside the ROIs. Nevertheless, in the context of BCI for movement decoding, the neural ROIs are well established, and they can complement the information provided by data-driven ROI selection. An objective comparison of these two approaches in a BCI paradigm would be worth exploring and is another possible future direction for ESI studies.

Source Localization: Pitfalls and Challenges

There are some practical concerns regarding source imaging. We recommend acquiring a subject-specific MRI scan to build a more reliable head model, which is crucial for better source-imaging results. Although MRI is expensive, only one scan is ever needed for a subject-specific anatomy, as it can be reused. Furthermore, we advise digitizing the EEG electrode positions to create a coordinate transform between the subject-specific head model and the electrode locations in three-dimensional (3-D) geophysical space. Although it is obvious that a higher number of EEG electrodes yields better scalp spatial resolution, the improvement in source localization accuracy is minimal beyond a certain number of channels [69]. If no subject-specific MRI scan is available, the next suggested approach is to use an ICBM-152 template anatomy, coregistering the positions of the EEG electrodes with the aid of a digitizer such as Polhemus or ANT Xensor. In the absence of both an MRI and a 3-D digitizer, the final recourse is to apply a template anatomy as is, at the cost of less reliable source localization accuracy.

Recently, Yu et al. proposed the New York head, a precise, standardized head model that can be used in the absence of an MRI [70]. Because the New York head is a highly detailed anatomical model developed using the finite-element method (FEM), its source localization accuracy is higher than that obtained with a boundary-element model (BEM) of ICBM-152 and is competitive with individualized BEMs. Furthermore, the transfer-learning approach used in [71] has been shown to handle intersubject variability by training a BCI classifier on source-imaging data transferred from other subjects, with better accuracy than the standard subject-specific approach.

Future Directions for Practical Applications in ESI-Based Online BCI

Although there has been significant progress in identifying the cortical sources responsible for the movement of arm parts, there is a long way to go toward realizing BCI-based neuroprosthetic control that is functionally as capable as the real human hand. To this end, future work should aim at real-time source imaging of multiclass BCI with far more degrees of freedom (DoFs). In Table 2, we have highlighted some of the studies in this direction, most of which are offline data analyses of EEG-based decoding of different DoFs associated with the upper limb. Prompted by recent work on robotic arm control for reach and grasp using noninvasive scalp EEG [73], the objective of noninvasive BCI-based neuroprosthetics seems achievable.

Although the online use of ESI decoding is evidently paramount for the practical application of state-of-the-art neural signal processing methods, there are certain hurdles in online ESI, such as the low signal-to-noise ratio of single-trial EEG and the limited time available to compute the inverse solution [74]. Because of this concern of computational intractability, only a handful of studies report online source imaging for BCI [56], [57], [75], [76]. One way to address this challenge is to use a smaller lead-field matrix (of the forward model) with appropriate regularization techniques to handle single-trial nonstationarity.

Future studies should aim at leveraging the anatomical information of ROIs and extracting task-relevant features for fast classification of complex movement types. Recently, real-time source-imaging toolboxes have been made available as open source [77], [78], so we hope there will be an increasing number of real-time ESI studies for BCI applications.
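As one concrete illustration of data-driven ROI selection based on statistically significant voxels, the sketch below ranks simulated source-space voxels by a Welch t-statistic between two conditions and keeps the most discriminative ones. All sizes, condition names, and the helper `select_roi` are hypothetical, not from any cited study:

```python
import numpy as np

# Hypothetical single-subject data: source power for 200 voxels over
# 40 trials under two conditions (e.g., movement vs. rest).
rng = np.random.default_rng(0)
n_trials, n_vox = 40, 200
rest = rng.standard_normal((n_trials, n_vox))
move = rng.standard_normal((n_trials, n_vox))
move[:, :10] += 1.5          # plant 10 genuinely task-modulated voxels

def select_roi(a, b, n_keep=10):
    """Rank voxels by the absolute Welch t-statistic between the two
    conditions and keep the n_keep most discriminative ones."""
    t = (a.mean(0) - b.mean(0)) / np.sqrt(a.var(0, ddof=1) / len(a)
                                          + b.var(0, ddof=1) / len(b))
    return np.argsort(-np.abs(t))[:n_keep]

roi = select_roi(move, rest)   # indices of the data-driven ROI
```

In practice, one would replace the raw t-statistic ranking with a proper significance threshold corrected for multiple comparisons, or with an information-theoretic criterion such as mutual information, as in the approaches cited above.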
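The coordinate transform between digitized electrode positions and the head model is, at its core, a rigid-body alignment of corresponding landmarks (e.g., the nasion and preauricular fiducials). A minimal sketch using the standard Kabsch algorithm, with toy point sets standing in for real digitizer output, follows; the function name and data are illustrative only:

```python
import numpy as np

def rigid_align(src, dst):
    """Kabsch algorithm: least-squares rotation R and translation t
    mapping src onto dst (both (N, 3) arrays of corresponding landmarks,
    e.g., fiducials plus electrode positions). Returns (R, t) such that
    dst ~= src @ R.T + t."""
    src_c, dst_c = src.mean(0), dst.mean(0)
    H = (src - src_c).T @ (dst - dst_c)        # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

# Toy check: recover a known rotation about z plus a small shift.
theta = np.deg2rad(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
pts = np.random.default_rng(1).standard_normal((6, 3))   # digitizer frame
moved = pts @ R_true.T + np.array([0.01, -0.02, 0.05])   # head-model frame
R, t = rigid_align(pts, moved)
```

Real coregistration pipelines typically follow this landmark fit with an iterative closest-point refinement against the scalp surface, but the rigid transform above is the essential step.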
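To make the computational cost of the inverse solution concrete, the sketch below shows a Tikhonov-regularized minimum-norm estimate with a reduced lead field, using the channel-space formulation so that the matrix actually inverted is only channels-by-channels. The matrix sizes, the regularization value, and the random data are hypothetical stand-ins, not a specific method from the cited studies:

```python
import numpy as np

# Hypothetical sizes: 32 scalp channels, 500 sources (a reduced lead
# field, in the spirit of making online ESI tractable).
rng = np.random.default_rng(0)
n_ch, n_src = 32, 500
L = rng.standard_normal((n_ch, n_src))   # lead-field (forward) matrix
y = rng.standard_normal(n_ch)            # single-trial EEG snapshot

def min_norm_inverse(L, y, lam=1e-2):
    """Tikhonov-regularized minimum-norm source estimate.

    Solves min_x ||y - L x||^2 + lam * ||x||^2 via
    x_hat = L' (L L' + lam I)^{-1} y, so only an
    (n_ch x n_ch) system is solved -- cheap per trial.
    """
    G = L @ L.T + lam * np.eye(L.shape[0])   # regularized Gram matrix
    return L.T @ np.linalg.solve(G, y)

x_hat = min_norm_inverse(L, y)
```

Keeping the per-trial work at the size of the channel count, rather than the source count, is precisely why a smaller lead field with regularization is attractive for online use.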