
\tilde{w}_s^\top \Sigma^{-1} \tilde{w}_s \;=\; \sum_{i}\sum_{j} (\Sigma^{-1})_{i,j}\,\tilde{w}_{s,i}\,\tilde{w}_{s,j},

we observe that (Σ^{-1})_{i,j} is proportional to the partial correlation between the i-th and j-th components of the weight vector, which is defined as the correlation between these components after all other components have been regressed out. Thus, for a given matrix Σ^{-1}, this term will be minimized when, for each set of components with high partial correlation, the subject/session-specific weight vectors w_s allow only one of them to deviate greatly from the mean of that component. Hence, Σ^{-1} acts as an implicit feature selector. The final term, which is a constant in the independent setting of (8), controls the complexity of the covariance matrix.
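For reference, the claim above rests on a standard identity that is not stated explicitly in the text: the partial correlation between components i and j, given all remaining components, is determined by the precision matrix via

\rho_{ij\cdot\mathrm{rest}} \;=\; -\,\frac{(\Sigma^{-1})_{ij}}{\sqrt{(\Sigma^{-1})_{ii}\,(\Sigma^{-1})_{jj}}},

so large off-diagonal entries of Σ^{-1} correspond to strongly partially correlated weight components.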
We solve the minimization in (10) with respect to W and (μ, Σ) iteratively, alternating between holding (μ, Σ) and holding W constant. For fixed μ and Σ, the optimization decouples across subjects/sessions, so each w_s can be optimized independently. In each iteration we obtain the new w_s by taking the derivative with respect to w_s for all s and equating it to 0. This yields the following closed-form update for each w_s:

w_s = \left(\frac{1}{\lambda} X_s^\top X_s + \Sigma^{-1}\right)^{-1} \left(\frac{1}{\lambda} X_s^\top y_s + \Sigma^{-1}\mu\right).   (11)

Hence, the model parameters are a combination of the shared model contribution Σ^{-1}μ and the contribution of the individual subject/session data X_s^T y_s. This combination is scaled by the inverse-covariance term, which again comes from both the data X_s^T X_s and the shared model Σ^{-1}. In order to avoid inverting Σ, which is an O(d^3) operation, we perform the equivalent update

w_s = \left(\frac{1}{\lambda} \Sigma X_s^\top X_s + I\right)^{-1} \left(\frac{1}{\lambda} \Sigma X_s^\top y_s + \mu\right).   (12)

For fixed W, the updates of μ and Σ are given in Algorithm 1 and derived in the Supplementary Materials.
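The equivalence of (11) and (12) can be verified in one line; the check below is added here only for completeness:

\left(\tfrac{1}{\lambda} X_s^\top X_s + \Sigma^{-1}\right)^{-1}
= \left[\Sigma^{-1}\left(\tfrac{1}{\lambda}\Sigma X_s^\top X_s + I\right)\right]^{-1}
= \left(\tfrac{1}{\lambda}\Sigma X_s^\top X_s + I\right)^{-1}\Sigma,

and multiplying the trailing Σ into the second factor of (11) turns \tfrac{1}{\lambda}X_s^\top y_s + \Sigma^{-1}\mu into \tfrac{1}{\lambda}\Sigma X_s^\top y_s + \mu, which is exactly (12).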

Algorithm 1 Multitask BCI training.
1: Input: D, λ
2: Set (μ, Σ) = (0, I)
3: repeat
4:   Update w_s using (12)
5:   Update μ using μ* = (1/S) ∑_s w_s
6:   Update Σ using Σ* = ∑_s (w_s − μ)(w_s − μ)^T / Tr(∑_s (w_s − μ)(w_s − μ)^T) + εI
7: until convergence
8: Output: (μ, Σ)
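A minimal NumPy sketch of Algorithm 1 follows; it is added for illustration only, and the data layout (each X_s an n_s × d trial-by-feature matrix with targets y_s), the fixed iteration count, and all variable names are assumptions rather than the authors' implementation.

import numpy as np

def multitask_train(Xs, ys, lam, eps=1e-3, n_iter=50):
    # Alternating minimization as in Algorithm 1: for fixed (mu, Sigma) update
    # each w_s with the closed form (12); for fixed W update the prior mean and
    # the trace-normalized covariance. Assumes more than one subject/session.
    S, d = len(Xs), Xs[0].shape[1]
    I = np.eye(d)
    mu, Sigma = np.zeros(d), I.copy()
    W = np.zeros((S, d))
    for _ in range(n_iter):
        for s, (X, y) in enumerate(zip(Xs, ys)):
            A = Sigma @ (X.T @ X) / lam + I        # (1/lam) Sigma X^T X + I
            b = Sigma @ (X.T @ y) / lam + mu       # (1/lam) Sigma X^T y + mu
            W[s] = np.linalg.solve(A, b)           # update (12)
        mu = W.mean(axis=0)                        # mu* = (1/S) sum_s w_s
        C = (W - mu).T @ (W - mu)                  # sum_s (w_s - mu)(w_s - mu)^T
        Sigma = C / np.trace(C) + eps * I          # trace-normalized scatter + eps*I
    return W, mu, Sigma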

Algorithm 2 Multitask BCI training with uninformative α.
1: Input: D, λ
2: Set {(μ_w, Σ_w), (μ_α, Σ_α)} = (0, I)
3: Set α_s = 1
4: repeat
5:   repeat
6:     Compute X̃_s = [α_s^T X_s^1; …; α_s^T X_s^n]
7:     Compute X̂_s = [X_s^1 w_s, …, X_s^n w_s]
8:     Update w_s using w_s* = (1/λ Σ_w X̃_s^T X̃_s + I)^{-1} (1/λ Σ_w X̃_s^T y_s + μ_w)
9:     Update α_s using α_s* = (1/λ Σ_α X̂_s X̂_s^T + I)^{-1} (1/λ Σ_α X̂_s y_s + μ_α)
10:   until W and A converge for fixed (μ, Σ)
11:   Update μ_w, μ_α using μ_w* = (1/S) ∑_s w_s and μ_α* = (1/S) ∑_s α_s
12:   Update Σ_w, Σ_α using Σ_w* = ∑_s (w_s − μ_w)(w_s − μ_w)^T / Tr(∑_s (w_s − μ_w)(w_s − μ_w)^T) + εI and Σ_α* = ∑_s (α_s − μ_α)(α_s − μ_α)^T / Tr(∑_s (α_s − μ_α)(α_s − μ_α)^T) + εI
13: until convergence
14: Output: (μ_w, Σ_w, μ_α, Σ_α)
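The following NumPy sketch implements one pass of lines 6-9 of Algorithm 2 for a single subject/session. It is illustrative only: it assumes each trial is given as an E × F matrix (electrodes × features), y is the target vector, and (mu_w, Sigma_w, mu_a, Sigma_a, lam) are the current prior parameters and regularizer; all names are placeholders.

import numpy as np

def inner_step(trials, y, w, alpha, mu_w, Sigma_w, mu_a, Sigma_a, lam):
    # One pass of the inner loop (lines 6-9) for a single subject/session.
    F, E = len(w), len(alpha)
    X_tilde = np.stack([alpha @ X for X in trials])        # line 6: (n, F), rows alpha^T X_s^i
    X_hat = np.stack([X @ w for X in trials], axis=1)      # line 7: (E, n), columns X_s^i w
    w_new = np.linalg.solve(Sigma_w @ (X_tilde.T @ X_tilde) / lam + np.eye(F),
                            Sigma_w @ (X_tilde.T @ y) / lam + mu_w)     # line 8
    alpha_new = np.linalg.solve(Sigma_a @ (X_hat @ X_hat.T) / lam + np.eye(E),
                                Sigma_a @ (X_hat @ y) / lam + mu_a)     # line 9
    return w_new, alpha_new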
2.4. Decomposition of Spatial and Spectral Features

The learning method described above can be applied to any feature representation in which the features extracted from each electrode are appended together. Let E be the number of electrodes and F the number of features extracted from each electrode. The final feature vector is then of size EF, rendering the covariance matrix large and the iterative updates expensive. It also causes the number of features to grow linearly with the number of channels and channel-specific features, an increase that can be avoided by taking advantage of the structure of the EEG. Specifically, we assume that the contribution of the features is invariant across electrodes but that the importance of each electrode varies. Hence, the weights corresponding to the feature vector mentioned above can be decomposed into two components: the weight of each electrode, α = (α_1, …, α_E), and the weights of the features that are shared across all electrodes, w = (w_1, …, w_F); a small numerical illustration of this decomposition is given below. We note that though in this paper spectral features are used, there is no reason that temporal features such
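As a concrete numerical illustration of this decomposition (added here; the dimensions and data are invented for the example), the EF-dimensional weight vector implied by α and w is their Kronecker product, so the linear score of a trial X in R^{E×F} can be computed either from the flattened features or as the bilinear form α^T X w:

import numpy as np

rng = np.random.default_rng(0)
E, F = 32, 10                       # electrodes and features per electrode (illustrative)
X = rng.standard_normal((E, F))     # one trial: F features for each of E electrodes
alpha = rng.standard_normal(E)      # spatial weights, one per electrode
w = rng.standard_normal(F)          # feature weights, shared across electrodes

w_full = np.kron(alpha, w)          # EF-dimensional weight vector (flattened outer product)
flat_score = w_full @ X.reshape(-1) # score using the appended EF-dimensional features
bilinear_score = alpha @ X @ w      # score using the spatial/spectral decomposition
assert np.allclose(flat_score, bilinear_score)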

Algorithm 3 Multitask BCI training with α initialization.
1: Input: D, λ
2: Set {(μ_w, Σ_w), (μ_α, Σ_α)} = (0, I)
3: Concatenate the subject data in D into a single pooled subject D̂
4: Run ridge regression on D̂ using the feature decomposition regression function
5: Set α_s to the ridge regression spatial weights
6: repeat
7:   repeat
8:     Compute X̃_s = [α_s^T X_s^1; …; α_s^T X_s^n]
9:     Compute X̂_s = [X_s^1 w_s, …, X_s^n w_s]
10:    Update w_s using w_s = (1/λ Σ_w X̃_s^T X̃_s + I)^{-1} (1/λ Σ_w X̃_s^T y_s + μ_w)
11:    Update α_s using α_s = (1/λ Σ_α X̂_s X̂_s^T + I)^{-1} (1/λ Σ_α X̂_s y_s + μ_α)
12:   until W and A converge for fixed (μ, Σ)
13:   Update μ_w, μ_α using μ_w* = (1/S) ∑_s w_s and μ_α* = (1/S) ∑_s α_s
14:   Update Σ_w, Σ_α using Σ_w* = ∑_s (w_s − μ_w)(w_s − μ_w)^T / Tr(∑_s (w_s − μ_w)(w_s − μ_w)^T) + εI and Σ_α* = ∑_s (α_s − μ_α)(α_s − μ_α)^T / Tr(∑_s (α_s − μ_α)(α_s − μ_α)^T) + εI
15: until convergence
16: Output: (μ_w, Σ_w, μ_α, Σ_α)
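Steps 3-5 of Algorithm 3 are not spelled out on this page. The sketch below shows one plausible reading, added purely for illustration: fit an ordinary ridge regression on the pooled, flattened EF-dimensional features and take a rank-1 factorization of the reshaped weight vector as the initial spatial weights. This is an assumption, not necessarily the authors' procedure.

import numpy as np

def init_alpha_from_pooled_ridge(X_pool, y_pool, E, F, lam):
    # X_pool: (N, E*F) pooled trials from all subjects, flattened row-major
    # (electrode-major, matching np.kron(alpha, w)); y_pool: (N,) targets.
    # Hypothetical initialization: ridge regression on the pooled data,
    # followed by a rank-1 factorization of the reshaped weight vector.
    d = E * F
    w_full = np.linalg.solve(X_pool.T @ X_pool + lam * np.eye(d), X_pool.T @ y_pool)
    U, s, Vt = np.linalg.svd(w_full.reshape(E, F), full_matrices=False)
    alpha0 = U[:, 0] * np.sqrt(s[0])   # initial spatial weights (up to sign/scale)
    w0 = Vt[0] * np.sqrt(s[0])         # initial shared feature weights
    return alpha0, w0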
