Signal Processing - May 2017 - 96

16 pulse emissions in the same direction. This is performed in a number of
directions, and an image of the axial
velocity distribution is acquired and
displayed as shown in Figure 1(b).
The image is acquired from the neck
and shows the carotid artery supplying blood to the brain and the jugular vein returning blood in the opposite direction. The underlying gray-level image shows the anatomy. The red color indicates velocities
toward the probe, which is placed at
the top of the image, and blue represents
velocities away. Such color flow map
(CFM) images can be acquired at a rate of
10-60 Hz, depending on the depth and
the number of directions for performing velocity imaging. The images give
a real-time visualization of flow in the
arteries and an indication of the axial
velocity direction and magnitude. CFM
images are not quantitative due to the
angle dependence of the estimates, and
the standard deviation is usually high
due to the few emissions employed in
the velocity estimation. They are, however, very helpful in diagnosing many haemodynamic problems, such as stenoses, and in assessing cardiac function, and they are widely used for detecting vascularization in, e.g., infection, inflammation, and cancer.

Challenges
The scattering from blood is weak, and the vessels in an ultrasound image appear black. It is, thus, vital to process the data to reduce the influence of noise. Matched filters are employed on the data, and averaging is performed in the velocity estimators both over a number of pulse emissions $N_c$ and along the depth direction. This reduces the variance of the velocity estimates in proportion to the number of independent lines and samples. Often
eight to 16 emissions in the same direction are averaged at the penalty of a reduction in the frame rate. For triplex images
that show the anatomic B-mode image,
the spectrum, and a CFM image of the
velocity, a frame rate down to 5 Hz is often
attained. This is not sufficient to visualize
the dynamics of the heart, where a frame
rate above 20 Hz up to 60 Hz is preferred.
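The variance reduction from averaging over emissions can be illustrated with a small simulation. The Gaussian noise model and the numbers below are illustrative assumptions, not values from the article:

```python
import numpy as np

rng = np.random.default_rng(0)
v_true = 0.5      # true axial velocity [m/s] (illustrative)
sigma = 0.1       # std of a single-emission estimate (assumed)
n_trials = 10_000

# Averaging N_c independent single-emission estimates shrinks the
# standard deviation roughly as 1/sqrt(N_c).
for n_c in (1, 4, 16):
    est = rng.normal(v_true, sigma, size=(n_trials, n_c)).mean(axis=1)
    print(f"N_c = {n_c:2d}: std = {est.std():.4f}")
```

In practice the emissions are not fully independent, so the improvement is smaller, which is why eight to 16 emissions are traded against frame rate rather than averaged without limit.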
A further complication is the separation of tissue and blood signals. The tissue signal
amplitude is often 20-40 dB larger than
that of the blood and the tissue signal has
to be removed to avoid bias in the blood
velocity estimation. It is assumed that the tissue is roughly stationary, and filtering along the emission direction is performed by, e.g., subtracting the mean signal over all emissions from each signal. Other more advanced methods exist, and this filtering is often the most challenging part of velocity estimation, as there is a limited amount of data available, and the filtering inevitably introduces noise, especially for low velocities.
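The mean-subtraction filter described above can be sketched in a few lines. The array shapes and signal values are illustrative assumptions:

```python
import numpy as np

def remove_stationary_tissue(rf):
    """Subtract the mean over emissions (axis 0) from each emission.

    rf: array of shape (n_emissions, n_depth_samples) of received signals.
    A perfectly stationary tissue component is removed exactly; slowly
    moving tissue is only attenuated.
    """
    return rf - rf.mean(axis=0, keepdims=True)

# Tiny demonstration: a strong stationary "tissue" component plus a weak
# "blood" component whose phase advances from emission to emission.
n_emissions, n_depth = 8, 64
depth = np.arange(n_depth)
tissue = 100.0 * np.cos(2 * np.pi * depth / 16)        # identical every emission
blood = 1.0 * np.sin(2 * np.pi * (depth[None, :] / 16
                                  - np.arange(n_emissions)[:, None] / 8))
filtered = remove_stationary_tissue(tissue[None, :] + blood)
# The 40-dB-stronger tissue component is canceled, leaving the blood signal.
```

The hard cases are exactly those the text mentions: with few emissions the mean is a poor tissue estimate, and slowly moving tissue or slow blood falls in the filter's transition band.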

Estimating the velocity vector
The estimation schemes in the "Axial
Velocity Estimation" section only find
the axial velocity component, but most
vessels run parallel to the skin surface,
so the least important component is the axial velocity. This is often compensated for by tilting the ultrasound beam, but it can be difficult to maintain the beam-to-flow angle below the 60° needed for achieving an acceptable 10% precision of the velocity estimate. Also, flow
in the human circulation is pulsating and complex, with disturbed and turbulent flow patterns, and transitory vortices that appear for short periods of time are often found. A single angle correction,
thus, cannot be performed, and the angle
should be estimated for all positions in
each individual image.
Several methods for finding the velocity vector have been developed and investigated. The first methods were based on
dual beam systems, where the velocities
were estimated for two beams at two different angles [5]. The axial and lateral
velocity components could then be found
from the sum and difference of the two
estimates. Other methods include tracking the speckle pattern in the ultrasound
image [6], and various forms of beamforming transverse to the ultrasound
beam or along the flow direction.
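The dual-beam geometry of [5] can be written out directly: with two beams steered to ±θ from the axial direction, each measured velocity is the projection of the velocity vector onto its beam, and the sum and difference recover the axial and lateral components. The function below is an illustrative sketch, not the original implementation:

```python
import numpy as np

def dual_beam_velocity(v1, v2, theta):
    """Recover axial (v_z) and lateral (v_x) velocity from two beams
    steered to +theta and -theta [rad] from the axial direction.

    Each beam measures the projection of (v_z, v_x) onto its axis:
        v1 = v_z*cos(theta) + v_x*sin(theta)
        v2 = v_z*cos(theta) - v_x*sin(theta)
    so the sum gives v_z and the difference gives v_x.
    """
    v_z = (v1 + v2) / (2 * np.cos(theta))
    v_x = (v1 - v2) / (2 * np.sin(theta))
    return v_z, v_x

# Forward-project a known velocity and recover it (values illustrative).
theta = np.deg2rad(15)
v_z_true, v_x_true = 0.2, 0.8                 # m/s
v1 = v_z_true * np.cos(theta) + v_x_true * np.sin(theta)
v2 = v_z_true * np.cos(theta) - v_x_true * np.sin(theta)
print(dual_beam_velocity(v1, v2, theta))      # recovers (0.2, 0.8) up to rounding
```

The division by sin(θ) also shows the method's weakness: for the small steering angles achievable in practice, noise on v1 and v2 is strongly amplified in the lateral estimate.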
A commercially introduced approach
is transverse oscillation (TO) [7] for
which advanced beamforming is used
to introduce an oscillation in the ultrasound field transverse to the propagation
direction. The feature that enables axial
velocity estimation is the sinusoidal oscillation of the emitted pulse. This makes
it possible to find the frequency, time, or
phase shift. Introducing a lateral oscillation enables the estimation of the lateral
velocity component.
In the focal region of an ultrasound
probe, there is a Fourier relation between
the amplitude weighting (apodization) of
the individual elements and the lateral
beam pattern. Generating a sinusoidal
oscillation is, thus, possible by having two
separated peaks in the apodization function. Shaping the peaks with a window
function limits the width of the oscillation. Commonly, two peaks with a von Hann or Gauss window shape are used when processing the received signals, and the lateral oscillation period or wavelength $\lambda_x$ is given by [7]

$$\lambda_x = \frac{2\lambda d}{P_d} = \frac{2\lambda d}{N_d P_i}, \qquad (7)$$

where $\lambda$ is the wavelength of the emitted pulse, $d$ is the depth, and $P_d$ is the distance between the two peaks in the apodization function. The transducer pitch is $P_i$, and the number of elements between the peaks is $N_d$. A fairly broad transmit
field with a focus beneath the region of
interest is used, and the apodization is
applied during the receive processing
making it possible to dynamically adapt
the lateral wavelength. Two beams separated by a lateral distance of $\lambda_x/4$ are
focused in parallel to make it possible to
find the sign of the velocity. The beams
are roughly 90° phase shifted compared
to each other, and a complex signal is
attained like for the axial velocity estimator. Each pulse emission, thus, gives
four samples: two for each beam from the
axial Hilbert transform. The measured
signals are given by

$$r_{sq}(n, i) = x_l(n, i) + j y_r(n, i)$$
$$r_{sqh}(n, i) = \mathcal{H}\{x_l(n, i)\} + j \mathcal{H}\{y_r(n, i)\}, \qquad (8)$$
where $x_l(n, i)$ is the left beam signal and $y_r(n, i)$ the right. The received signals from the transducer are then Hilbert transformed in the temporal direction $n$ to yield $r_{sqh}(n, k, i)$. Two new signals are then formed from
$$r_1(n, k, i) = r_{sq}(n, k, i) + j r_{sqh}(n, k, i),$$
$$r_2(n, k, i) = r_{sq}(n, k, i) - j r_{sqh}(n, k, i), \qquad (9)$$


