
[Figure 1 schematic: panels labeled f1, f2, f3, f4, and g; Domain Adaptation (left), Rule Adaptation (right).]

Figure 1 Given a set of training datasets (top), there are two ways to model the similarities shared by them. Domain adaptation (left) refers to the strategy of attempting to find a transformation to a data space in which a single decision rule will classify all samples. Instead of learning a new rule for the new data, data is simply transformed to the invariant space. Rule adaptation (right) is the strategy of attempting to learn the structure of the classification rules. New datasets are faced with a much smaller search space of possible rules, which allows for much faster learning of novel decision boundaries.

years, several groups have started explicitly modelling such variations to exploit structure that is shared between data recorded from multiple subjects and/or sessions. In this article, we provide an overview of previous work on the topic and present a unifying approach to transfer learning in the field of BCIs. We demonstrate the utility of our framework on subject-to-subject transfer in a motor-imagery paradigm as well as on session-to-session transfer in one patient diagnosed with amyotrophic lateral sclerosis (ALS).
1.1. Previous Work

Transfer learning describes the procedure of using data recorded in one task to boost performance in another, related task (for a more exhaustive review of the machine learning literature, see [11]). That is, we assume a priori that there is some structure shared by these tasks; the goal, then, is to learn a representation of this structure so that further tasks can be solved more easily. In the context of BCIs, transfer learning is of critical importance: it has long been known that the EEG signal is not stationary, so in the strictest sense every trial can be considered a slightly new task. As such, long sessions of BCI usage present unique problems in terms of consistent classification [12]. The question is how to transfer knowledge between these tasks, and it can be answered in one of two general ways. Either we can attempt to find some structure in the data that is invariant across datasets, or we can find some structure in how the decision rules differ between subjects or sessions. We denote these strategies as domain adaptation and rule adaptation, respectively (Figure 1).
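To make this distinction concrete, the toy sketch below (our own illustration in Python, not code from any of the cited works) contrasts the two strategies on simulated data: domain adaptation maps every dataset into a shared space and fits a single rule, whereas rule adaptation summarizes the per-dataset rules so that a new dataset only has to estimate what the shared structure leaves open (here, a single bias term). The generator make_subject, the per-subject standardization, and the averaged weight vector are all hypothetical choices made for illustration.

    # Toy contrast of domain adaptation vs. rule adaptation (hypothetical example).
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    def make_subject(n=200):
        """Two classes separated along feature 0, plus a subject-specific shift and scale."""
        y = np.tile([0, 1], n // 2)
        X = rng.normal(0.0, 1.0, (n, 2))
        X[:, 0] += 2.0 * y                              # shared class structure
        shift, scale = rng.normal(0, 5, 2), rng.uniform(0.5, 2.0)
        return scale * X + shift, y                     # subject-specific distortion

    subjects = [make_subject() for _ in range(4)]

    # Domain adaptation: transform each dataset into an invariant space
    # (here, simple per-subject standardization) and learn ONE decision rule.
    def standardize(X):
        return (X - X.mean(axis=0)) / X.std(axis=0)

    X_all = np.vstack([standardize(X) for X, _ in subjects])
    y_all = np.hstack([y for _, y in subjects])
    shared_rule = LogisticRegression().fit(X_all, y_all)

    # Rule adaptation: learn one rule per dataset, then keep only the structure
    # shared by those rules (here, the mean weight direction). A new subject
    # then only needs to estimate a bias term, a far smaller search space.
    rule_prior = np.mean([LogisticRegression().fit(X, y).coef_.ravel()
                          for X, y in subjects], axis=0)

    X_new, y_new = make_subject(n=20)                   # tiny calibration set
    scores = X_new @ rule_prior
    bias = -0.5 * (scores[y_new == 0].mean() + scores[y_new == 1].mean())
    print("new-subject accuracy:", np.mean(((scores + bias) > 0) == y_new))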


Looking at the literature, BCI has been almost exclusively dominated by domain adaptation approaches. One popular feature representation in the field is the trial covariance matrix, used both in Common Spatial Patterns (CSP) [4], [13] and in more recent methods [14]. Many transfer learning techniques have been attempted with CSP, mostly relying on the assumption that there exists a set of linear filters that is invariant across either sessions or subjects. An early example of session-to-session transfer of spatial filters is the work by Krauledat et al. [15], in which a clustering procedure is employed to select prototypical spatial filters and classifiers, which are in turn applied to newly recorded data. Using this approach, the authors demonstrate that calibration time can be greatly reduced with only a slight loss in classification accuracy. The problem of subject-to-subject transfer of spatial filters is addressed by Fazli et al. [16]: also building upon CSP for spatial filtering, the authors utilize a large database of pairs of spatial filters and classifiers from 45 subjects to learn a sparse subset of these pairs that is predictive across subjects. Using a leave-one-subject-out cross-validation procedure, the authors then demonstrate that this sparse subset of spatial filters and classifiers can be applied to new subjects with only a moderate performance loss in comparison to subject-specific calibration. Note that in both of the above approaches, transfer learning amounts to determining invariant spaces onto which to project the data and learning classifiers in these spaces. This line of work has been further extended by Kang et al. [17], [18], Lotte and Guan [19], and Devlaminck et al. [20]. In these contributions, the authors demonstrate successful subject-to-subject transfer by regularizing spatial filters derived by CSP with data from other subjects, which amounts to attempting to find an invariant subspace onto which to project the data of new subjects.
Recently, distance measures between trial covariance matrices have also been used to great effect as a domain adaptation tool in both motor imagery [21] and event-related potential paradigms [22]. In a spirit related to the regularized CSP methods described above, these methods work by finding a projection plane for the trial covariance matrices that is invariant across subjects and sessions, and then running a classification algorithm in that space. Other domain adaptation approaches include that of Morioka et al. [23], in which an invariant sparse representation of the data is learned from many subjects and the transformation into that space is then applied to new subjects, and the technique of stationary subspace analysis [24], [25], which attempts to find a stationary subspace of the data from multiple subjects and/or sessions.
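The covariance-distance approaches of [21], [22] mentioned above rest on a metric between symmetric positive-definite matrices. As a minimal sketch (our own illustration; the cited works may use different variants and additional processing), the widely used affine-invariant Riemannian distance, together with a nearest-class-mean classifier built on it, can be written as:

    # Affine-invariant Riemannian distance between covariance matrices (sketch).
    import numpy as np
    from scipy.linalg import eigh

    def airm_distance(A, B):
        """d(A, B) = ||log(A^{-1/2} B A^{-1/2})||_F for SPD matrices A and B,
        computed via the generalized eigenvalues of (B, A)."""
        eigvals = eigh(B, A, eigvals_only=True)   # solves B v = lambda A v
        return np.sqrt(np.sum(np.log(eigvals) ** 2))

    def nearest_class_mean(trial_cov, class_mean_covs):
        """Assign a trial covariance to the class whose (pre-computed) mean
        covariance is closest in the Riemannian metric."""
        dists = [airm_distance(trial_cov, m) for m in class_mean_covs]
        return int(np.argmin(dists))

Because this distance is unchanged when both matrices undergo the same invertible linear transformation, it is comparatively robust to the subject- and session-specific mixing that motivates domain adaptation in the first place.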
A technique closely related to domain adaptation is covariate shift adaptation, which has also found use in BCIs. Sugiyama et al. have used covariate shift adaptation to combine labeled training data with unlabeled test data [26]. Here, it is assumed that the marginal distribution of the data changes between subjects and/or sessions, but that the decision rule with respect to this marginal distribution remains constant. This assumption leads to a re-weighting of training data from other subjects and/or previous sessions based on unlabeled data from the current subject and/or session.
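The re-weighting described here can be sketched as follows; this is our own minimal illustration of the general covariate shift idea in Python, not the estimator used in [26]. The density ratio between the new unlabeled data and the old training data is approximated with a probabilistic classifier that discriminates the two sets, and the resulting importance weights are used when refitting the decision rule.

    # Importance weighting for covariate shift (hypothetical density-ratio sketch).
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def covariate_shift_weights(X_train, X_new):
        """Approximate p_new(x) / p_train(x) by discriminating new (unlabeled)
        samples from old training samples; a generic density-ratio trick."""
        X = np.vstack([X_train, X_new])
        d = np.hstack([np.zeros(len(X_train)), np.ones(len(X_new))])
        clf = LogisticRegression(max_iter=1000).fit(X, d)
        p_new = clf.predict_proba(X_train)[:, 1]
        # Posterior-odds ratio, corrected for the relative sizes of the two sets.
        return (p_new / (1.0 - p_new)) * (len(X_train) / len(X_new))

    def reweighted_rule(X_train, y_train, X_new):
        """Refit the decision rule on old labeled data, weighted toward the region
        occupied by the new subject's or session's unlabeled data."""
        w = covariate_shift_weights(X_train, X_new)
        return LogisticRegression(max_iter=1000).fit(X_train, y_train, sample_weight=w)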


