ground-truth images for supervised learning [27]. Therefore, some semi-supervised deep-learning algorithms [28], [29] have been proposed for time-series classification (TSC), achieving excellent supervised and unsupervised performance on a large number of datasets. Furthermore, some unsupervised methods [30], [31], [32], [33], [34] have also been proposed and achieve good performance, but the methods based on generative adversarial networks (GANs) suffer from unstable training, and the deep structural information and shallow texture information of a pair of source images cannot both be fully retained at the same time.
To address the deficiencies of the above methods, we propose a self-supervised network with contrastive auto-encoding and information exchange, named SSN-CAE+IE, to perform the MMIF task. To this end, a contrastive auto-encoder (CAE), based on a parallel combination of a vision transformer (ViT) and a convolutional neural network (CNN), is first constructed to extract the features of source images pairwise. Furthermore, inspired by the principle of information exchange [35], a multi-convolutional information exchange network (MCIEN) is designed to measure the fusion contribution between source images pixel-wise. The main contributions of the proposal are as follows (a schematic code sketch of the contribution-driven fusion is given after the list).
1) The MMIF task is modeled as a contribution estimation problem, such that a self-supervised framework consisting of CAE and MCIEN is constructed to generate fusion contributions between source images.
2) This paper proposes a novel CAE combining ViT and CNN to adaptively extract and integrate the local and global features from paired source images in a contrastive-learning manner. Also, MCIEN is designed to perform the information exchange among feature maps in multi-convolutional spaces and finally estimate the fusion contribution between source images.
3) Effective losses are proposed for the auxiliary and downstream tasks. For the auxiliary task with the CAE, an effective contrastive loss is designed, based on a group of elaborately constructed positive and negative results. Further, a hybrid loss with a weighted fidelity term and a regularization of the information exchange is employed to train our SSN-CAE+IE.
4) Extensive simulations verify that the proposal is superior to
other typical and state-of-the-art fusion methods.
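To make the overall pipeline concrete, the following PyTorch sketch illustrates the general idea of parallel local/global feature extraction, pixel-wise contribution estimation, and a hybrid loss combining weighted fidelity with a regularization term. All module structures, names (ParallelViTCNNEncoder, ContributionHead, hybrid_loss), hyper-parameters, and the specific regularizer are illustrative assumptions and do not reproduce the actual SSN-CAE+IE implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ParallelViTCNNEncoder(nn.Module):
    """Stand-in for the CAE encoder: a small CNN branch for local features
    running in parallel with a transformer branch for global features.
    Depths, widths, and the patch size are illustrative assumptions."""
    def __init__(self, in_ch=1, feat_ch=32, patch=16):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv2d(in_ch, feat_ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(feat_ch, feat_ch, 3, padding=1), nn.ReLU(inplace=True),
        )
        self.embed = nn.Conv2d(in_ch, feat_ch, kernel_size=patch, stride=patch)
        self.transformer = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model=feat_ch, nhead=4, batch_first=True),
            num_layers=2,
        )

    def forward(self, x):
        local = self.cnn(x)                                   # (B, C, H, W) local features
        tokens = self.embed(x)                                # (B, C, H/p, W/p) patch tokens
        b, c, h, w = tokens.shape
        glob = self.transformer(tokens.flatten(2).transpose(1, 2))   # (B, N, C)
        glob = glob.transpose(1, 2).reshape(b, c, h, w)
        glob = F.interpolate(glob, size=local.shape[-2:], mode="bilinear",
                             align_corners=False)
        return torch.cat([local, glob], dim=1)                # local + global features

class ContributionHead(nn.Module):
    """Stand-in for MCIEN: concatenates (rather than truly exchanging) the
    features of the two source images and predicts a per-pixel
    fusion-contribution map w in [0, 1]."""
    def __init__(self, feat_ch=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(2 * feat_ch, feat_ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(feat_ch, 1, 1),
        )
    def forward(self, fa, fb):
        return torch.sigmoid(self.net(torch.cat([fa, fb], dim=1)))

def fuse(img_a, img_b, w):
    """Pixel-wise weighted fusion driven by the contribution map."""
    return w * img_a + (1.0 - w) * img_b

def hybrid_loss(fused, img_a, img_b, w, lam=0.1):
    """Weighted fidelity to both sources plus a smoothness regularizer on the
    contribution map; the exact regularization used in the paper may differ."""
    fidelity = (w * (fused - img_a) ** 2 + (1 - w) * (fused - img_b) ** 2).mean()
    tv = (w[..., 1:, :] - w[..., :-1, :]).abs().mean() + \
         (w[..., :, 1:] - w[..., :, :-1]).abs().mean()
    return fidelity + lam * tv

# Toy usage on a 1-channel 256x256 pair (e.g., MRI and a PET luminance channel).
enc, head = ParallelViTCNNEncoder(), ContributionHead(feat_ch=64)
img_a, img_b = torch.rand(1, 1, 256, 256), torch.rand(1, 1, 256, 256)
w = head(enc(img_a), enc(img_b))
fused = fuse(img_a, img_b, w)
loss = hybrid_loss(fused, img_a, img_b, w)
```

The design choice mirrored here is that the network predicts a contribution map w rather than the fused image directly, so the fusion itself remains a transparent pixel-wise weighting of the two source images.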
II. Related Work
A. Deep Learning-Based Fusion Methods
With the rise of deep learning in the last few years, many DL-based methods have been employed in MMIF. This section covers the DL-based medical image fusion methods first, and then briefly introduces a novel improved deep-learning method.
Lahoud et al. [25] proposed a real-time image fusion method named "Zero Learning" (ZL) that uses a pre-trained CNN to extract deep feature maps as weight maps, and
Zhang et al. [33] proposed a general image fusion framework
based on CNN, named IFCNN. The above two methods
only extract the features of the last layer from the networks,
leading to the loss of some detailed texture information. In CNN-based networks, a degradation problem emerges as the network depth increases. To address this degradation problem and improve the information flow between layers, Li et al. [36] introduced an auto-encoding network with a Siamese structure into the image fusion task, in which the encoder is composed of CNN layers and dense blocks.
However, the source image features extracted by the
encoders are independent and lack complementarity in this
method. U2Fusion [34] adopts only one convolutional
encoder to replace two Siamese encoders in the feature
extraction of a pair of source images, in which the source images are concatenated along the channel dimension before being fed into the network. In principle, this encoder can only extract weak complementary information from a pair of source images, and it struggles to produce better conditional features. Shi et al.
[37] proposed an end-to-end multimodal brain image fusion
framework (MMI-fuse) that applies an auto-encoder to extract the features of source images and uses a decoder model to reconstruct the fused medical image. However, it adopts an information-preservation weighted channel-spatial attention model (ICS) to fuse the features, which increases the complexity of the manual design. Ma et al. [31] proposed a dual-discriminator conditional generative adversarial network
(DDcGAN) that establishes an adversarial game between a
generator and two discriminators. It does not require
ground-truth fused images for training. However, the success of GANs is limited because they are known to be unstable during training. DDcGAN reduces the structural difference between the fused image and the source images, but it still suffers from unstable training. A fast unified image
fusion network based on proportional maintenance of gradient
and intensity (PMGI) [38] has been proposed; it is divided into a gradient path and an intensity path for information extraction to avoid the loss of information caused by convolution. PMGI considers the detailed texture information of the source images but ignores the complementarity of their deep structural information. These methods
always use the same representations of medical images from
different modalities, resulting in the distortion of unique
information and limiting the fusion performance. Xu et al.
[30] proposed an unsupervised enhanced medical image
fusion network (EMFusion) that performs both surface-level
and deep-level constraints for enhanced information preservation.
EMFusion focuses on the unique features of medical images from different modalities, but it ignores the fact that these unique features are related across a pair of source images. Extracting unique features from a single image will lead to information redundancy. Therefore, a contrastive loss is proposed to constrain the extracted features, which ensures that the unique features of each single image are maintained while the shared features of a pair of source images are extracted.
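To illustrate what such a constraint might look like, the snippet below sketches an InfoNCE-style contrastive loss in which features from the same source-image pair are treated as positives and features from other pairs in the batch as negatives. This is a hedged, generic form; the actual construction of positive and negative results used in SSN-CAE+IE may differ.

```python
import torch
import torch.nn.functional as F

def pairwise_contrastive_loss(feat_a, feat_b, temperature=0.1):
    """InfoNCE-style loss: for each index i, feat_a[i] and feat_b[i] come
    from the same source-image pair (positive), while features from other
    pairs in the batch serve as negatives. feat_a, feat_b: (batch, dim)
    pooled features of the two modalities (an assumed interface)."""
    za = F.normalize(feat_a, dim=1)
    zb = F.normalize(feat_b, dim=1)
    logits = za @ zb.t() / temperature            # (batch, batch) similarity matrix
    targets = torch.arange(za.size(0), device=za.device)
    # Symmetric cross-entropy pulls matched pairs together and pushes
    # mismatched pairs apart.
    return 0.5 * (F.cross_entropy(logits, targets) +
                  F.cross_entropy(logits.t(), targets))
```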