partial eta squared (η²_partial), whose values can be benchmarked against Cohen's [43] criteria of small (0.01), medium (0.06), and large (0.14) effects, according to Richardson [42].
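For reference, partial eta squared follows the standard definition in terms of the ANOVA sums of squares; the formula below is the textbook definition and is not stated explicitly in the article:

    \eta^2_{\mathrm{partial}} = \frac{SS_{\mathrm{effect}}}{SS_{\mathrm{effect}} + SS_{\mathrm{error}}}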
During this subject-independent (cross-subject) analysis, all the deep networks (the EEGNet, shallow CNN, and deep CNN) and methods (the fuzzy CSP and FTDCSSP) used the same hyperparameter settings as those described earlier for the cross-session (subject-specific) analysis. In addition, 500 epochs were employed to train the deep networks. In summary, the F-DivIT-JAD predicted the RT better than the other algorithms in both the subject-specific and subject-independent settings. Table 2 contains the average RMSE obtained from all the methods in the subject-specific and subject-independent settings.
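To make the evaluation protocol concrete, the following is a minimal sketch of leave-one-subject-out validation with the RMSE as the error metric; the fit_model callable and the array layout are hypothetical placeholders, not the article's actual code:

    import numpy as np

    def leave_one_subject_out_rmse(X, y, subjects, fit_model):
        """X: per-trial EEG features; y: reaction times (RTs);
        subjects: per-trial subject IDs; fit_model: any callable
        returning a fitted object with a .predict() method."""
        rmses = []
        for s in np.unique(subjects):
            train, test = subjects != s, subjects == s
            model = fit_model(X[train], y[train])  # e.g., 500 epochs for the deep nets
            errors = model.predict(X[test]) - y[test]
            rmses.append(np.sqrt(np.mean(errors ** 2)))
        return np.mean(rmses)  # average RMSE across held-out subjects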
Deep learning has widely nullified the necessity for expertise in feature extraction, achieving advanced performance in computer vision and speech recognition.
Discussion and Conclusion
In this article, several intelligent machine learning systems were presented for driver drowsiness detection from EEG signals. The algorithms were tested under subject-specific and subject-independent calibration settings. In summary, we studied the EEGNet, shallow CNN, deep CNN, fuzzy CSPROVR with a LASSO, FTDCSSP with a LASSO, and two new methods (the F-DivCSP-WS with a LASSO and the F-DivIT-JAD with a LASSO). We observed that the EEGNet performed decently in the subject-specific and subject-independent settings, whereas the deep and shallow CNNs demonstrated overfitting in the subject-independent setting; the EEGNet's robustness is attributed to the depthwise and separable convolutions used in its architecture. The proposed methods (the F-DivCSP-WS with a LASSO and the F-DivIT-JAD with a LASSO) had a lower average RMSE than the baseline methods (the FTDCSSP, shallow CNN, deep CNN, and fuzzy CSPROVR). The fuzzy CSPROVR and FTDCSSP performed close to, and even better than, the EEGNet in the subject-specific and subject-independent settings. This is attributed to the different natures of the filters learned by the two families of methods: the optimal variance criterion (the fuzzy CSPROVR and FTDCSSP) versus the mean-square-error loss (the EEGNet, deep CNN, and shallow CNN).
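Each spatial-filtering pipeline above feeds its features into a LASSO regressor that maps them to the RT. A minimal sketch of that final stage with scikit-learn, assuming the features (e.g., log-variances of the spatially filtered signals) have already been extracted, is shown below; the data and the alpha value are illustrative placeholders:

    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(0)
    features = rng.standard_normal((200, 12))    # (n_trials, n_spatial_filters)
    reaction_times = rng.uniform(0.3, 2.0, 200)  # RTs in seconds (synthetic)

    # L1-regularized linear regression; in practice, alpha is tuned
    # by cross-validation rather than fixed as it is here.
    lasso = Lasso(alpha=0.01).fit(features, reaction_times)
    rmse = np.sqrt(np.mean((lasso.predict(features) - reaction_times) ** 2))
    print(f"training RMSE: {rmse:.3f}")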
Table 1. Paired t-test results (p values) for the RMSE comparison of the F-DivIT-JAD against the other methods.

Cross-Session (Leave-One-Session-Out Validation)
  F-DivCSP-WS: 0.01
  FTDCSSP: 0.001
  Fuzzy CSPROVR: 0.002
  Shallow CNN: 0.002
  Deep CNN: 0.01
  EEGNet: 0.001

Cross-Subject (Leave-One-Subject-Out Validation)
  F-DivCSP-WS: 0.02
  FTDCSSP: 0.002
  Fuzzy CSPROVR: 0.001
  Shallow CNN: 0.001
  Deep CNN: 0.01
  EEGNet: 0.0001

Boldface p values indicate large effects (η²_partial > 0.13).
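A sketch of how entries like those in Table 1 can be produced: a paired t-test on the per-fold RMSEs of two methods, with η²_partial derived from the t statistic via the standard paired-design conversion t²/(t² + df). The RMSE arrays here are synthetic placeholders:

    import numpy as np
    from scipy import stats

    rmse_a = np.array([0.020, 0.018, 0.021, 0.019, 0.022])  # method A, per fold
    rmse_b = np.array([0.025, 0.024, 0.028, 0.026, 0.027])  # method B, per fold

    t, p = stats.ttest_rel(rmse_a, rmse_b)  # paired t-test across folds
    df = len(rmse_a) - 1
    eta_sq_partial = t**2 / (t**2 + df)     # effect size for the paired design
    print(f"p = {p:.4f}, partial eta squared = {eta_sq_partial:.3f}")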
Table 2. A comparison of the average RMSE across methods for subject-specific (cross-session) and subject-independent validation.

Subject Specific (Leave-One-Session-Out Validation)
  EEGNet: 0.032
  Deep CNN: 0.038
  Shallow CNN: 0.024
  Fuzzy CSPROVR: 0.022
  FTDCSSP: 0.024
  F-DivCSP-WS: 0.02
  F-DivIT-JAD: 0.019

Subject Independent (Leave-One-Subject-Out Validation)
  EEGNet: 0.028
  Deep CNN: 0.025
  Shallow CNN: 0.021
  Fuzzy CSPROVR: 0.019
  FTDCSSP: 0.029
  F-DivCSP-WS: 0.017
  F-DivIT-JAD: 0.015
In general, the regression performances of the deep CNN and EEGNet were analogous across all the cross-subject analyses, whereas the deep CNN performed worse in the subject-specific analyses. One explanation for this is the amount of data employed to train the model: in the subject-independent analyses, the training set sizes were 15-20 times larger than those for the subject-specific analyses. This suggests that the deep CNN is more data hungry than the EEGNet, an expected result given that the deep CNN's architecture is roughly twice the size of the EEGNet's. We presume this argument is consistent with the findings originally published by the developers of the deep CNN [27], who mentioned that a training data augmentation technique was mandatory to obtain good classification performance on sensorimotor rhythm data. In contrast to that work, the EEGNet and the other proposed and baseline models performed well across all the test settings, without the need for such data augmentation.
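The article does not detail the augmentation scheme used in [27]; purely as an illustration, a common EEG augmentation is sliding-window cropping, which multiplies the number of training examples by taking overlapping crops of each trial:

    import numpy as np

    def sliding_window_crops(epoch, win_len, stride):
        """Cut one EEG epoch (channels x samples) into overlapping
        crops, a common augmentation when training deep EEG models."""
        n_channels, n_samples = epoch.shape
        starts = range(0, n_samples - win_len + 1, stride)
        return np.stack([epoch[:, s:s + win_len] for s in starts])

    trial = np.random.randn(30, 750)  # one synthetic 30-channel trial
    crops = sliding_window_crops(trial, win_len=500, stride=125)
    print(crops.shape)  # (3, 30, 500): three crops per original trial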
