The correlation of automatic metrics with human judgment remains a contentious issue, but their role in MT research and development is indispensable. Such metrics allow developers to compare the output of their models on the fly across numerous modifications to see whether a small change in the system (e.g., adding another transformer layer, changing the batch size, or tweaking another hyperparameter of the model) leads to quality improvements. Automatic metrics are also used to track the training of neural MT models by computing, say, a BLEU score on a held-out test set (a portion of the bilingual corpus that was not used in training) after each iteration. One can stop the training when there is no further improvement.
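For readers who like to peek under the hood, here is a minimal sketch of that kind of check using the open-source sacrebleu library (one common choice among several); the file names are made up for illustration.

    # Minimal sketch: scoring MT output against a held-out reference set
    # with the open-source sacrebleu library (pip install sacrebleu).
    # File names are hypothetical; each file holds one sentence per line.
    import sacrebleu

    with open("heldout.mt.txt", encoding="utf-8") as f:
        hypotheses = [line.strip() for line in f]
    with open("heldout.ref.txt", encoding="utf-8") as f:
        references = [line.strip() for line in f]

    # corpus_bleu expects a list of hypotheses and a list of reference lists.
    bleu = sacrebleu.corpus_bleu(hypotheses, [references])
    print(f"BLEU: {bleu.score:.2f}")  # higher is better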
Until very recently, automatic metrics were both technically inaccessible and of no real use to freelance translators, who have always relied on their professional judgment and manual analysis. But with the seamless integration of MT engines with CAT tools, the rapid increase in the amount of bilingual data available to human translators at a mouse click, and the recent progress in generative AI, the playing field is changing very rapidly. More and more translators are using MT in their workflow, often confronting the need to choose among numerous available MT engines for another 50K-word project in a specialized field.
Typically, translators possess high-quality bilingual resources such as translation memories and term bases from previous projects in the same field. This gives them everything they need to compare the output of several MT engines in just a few minutes in terms of the automatic metrics mentioned earlier, using the recently developed free online toolkits intended for users with no programming skills.2
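MATEO, described below, handles such a comparison through a web interface, but as a rough illustration of what it involves, here is a short Python sketch using the open-source sacrebleu library; the file names are hypothetical and stand in for your own reference translation and MT outputs.

    # Sketch: comparing several candidate MT outputs against one reference
    # translation (e.g., exported from a translation memory) with sacrebleu.
    # File names are hypothetical; each file has one segment per line, aligned.
    import sacrebleu

    def read_lines(path):
        with open(path, encoding="utf-8") as f:
            return [line.strip() for line in f]

    reference = read_lines("reference.txt")
    candidates = {
        "MT1": read_lines("mt1.txt"),
        "MT2": read_lines("mt2.txt"),
        "MT3": read_lines("mt3.txt"),
    }

    for name, hyp in candidates.items():
        bleu = sacrebleu.corpus_bleu(hyp, [reference]).score  # higher is better
        chrf = sacrebleu.corpus_chrf(hyp, [reference]).score  # higher is better
        ter = sacrebleu.corpus_ter(hyp, [reference]).score    # lower is better
        print(f"{name}: BLEU={bleu:.1f}  chrF={chrf:.1f}  TER={ter:.1f}")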
MATEO
Developed by a group of researchers at Ghent University in Belgium, MAchine Translation Evaluation Online (MATEO)3 is an online application with an easy user interface that allows anyone to estimate the quality of several candidate translations with any or all of the automatic metrics mentioned earlier. (Please see MATEO's homepage, https://mateo.ivdnt.org/, for more information.)
You simply need to upload the source file and its reference translation (i.e., your "gold standard" from a previous project) and up to four candidate translations of the same source document, whether from standard MT, large language models, or another human translator. One of the outputs ("System #1") will serve as a baseline. Important: all files must be in plain text in Unicode (UTF-8), one sentence per line, and must be perfectly aligned. That means no tags or inline formatting, just plain Unicode. This may be dissatisfying to some users, but remember that the goal is to evaluate the quality of the translated text, not the layout.
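If you want to double-check your files before uploading them, a few lines of Python will do it. The sketch below is only an illustration (MATEO does not require it), and the file names are placeholders.

    # Sketch: a quick sanity check before uploading files to MATEO.
    # It verifies that every file is valid UTF-8, flags empty lines,
    # and confirms that all files contain the same number of lines
    # (i.e., are aligned). File names are hypothetical.
    from pathlib import Path

    files = ["source.txt", "reference.txt", "mt1.txt", "mt2.txt"]
    line_counts = {}

    for name in files:
        text = Path(name).read_text(encoding="utf-8")  # raises an error if not valid UTF-8
        lines = text.splitlines()
        empty = [i + 1 for i, line in enumerate(lines) if not line.strip()]
        if empty:
            print(f"{name}: empty line(s) at {empty}")
        line_counts[name] = len(lines)

    if len(set(line_counts.values())) == 1:
        print("All files are aligned:", line_counts)
    else:
        print("Line counts differ, so the files are NOT aligned:", line_counts)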
Assuming you have done everything correctly, you should see the "Evaluate MT" button at the bottom of the screen. Click it, then get yourself a cup of coffee or do something else on your computer while MATEO goes to work. Calculating the three neural metrics (COMET, BLEURT, and BERTScore) will take several minutes or longer, depending on the document size. But most users will not need all three metrics. I suggest combining COMET with the string-based metrics (BLEU, chrF, and TER), which will prove much faster.
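If you would rather run that same combination on your own machine, COMET is also distributed as an open-source Python package (unbabel-comet). The sketch below follows the package's documented usage, but the model name and file names are assumptions for illustration and have nothing to do with MATEO itself.

    # Sketch: scoring one candidate translation with COMET locally
    # (pip install unbabel-comet). The first run downloads a model checkpoint.
    # File names are hypothetical and the segments must be aligned line by line.
    from comet import download_model, load_from_checkpoint

    def read_lines(path):
        with open(path, encoding="utf-8") as f:
            return [line.strip() for line in f]

    src = read_lines("source.txt")
    ref = read_lines("reference.txt")
    mt = read_lines("mt1.txt")

    model_path = download_model("Unbabel/wmt22-comet-da")
    model = load_from_checkpoint(model_path)

    data = [{"src": s, "mt": m, "ref": r} for s, m, r in zip(src, mt, ref)]
    output = model.predict(data, batch_size=8, gpus=0)  # gpus=0 runs on the CPU
    print(f"COMET system score: {output.system_score:.4f}")  # higher is better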
When finished, the system will present the evaluation results, complete with bar and radar plots and tables of the metric scores, which can also be downloaded in PNG, SVG, and Excel formats. You can easily use the data to create your own, even more impressive charts in Excel for presentations (or just to impress your client!).
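If Excel is not your tool of choice, the downloaded spreadsheet can just as easily be charted with a few lines of Python. This is only a sketch: the file name and column layout are assumptions, so adjust them to match the spreadsheet MATEO actually gives you.

    # Sketch: turning MATEO's downloadable Excel scores into a custom bar chart
    # with pandas and matplotlib (pip install pandas matplotlib openpyxl).
    # The file name and the assumed layout (one row per MT system, one column
    # per metric) are placeholders to adapt to the real download.
    import pandas as pd
    import matplotlib.pyplot as plt

    df = pd.read_excel("mateo_scores.xlsx", index_col=0)

    ax = df.plot(kind="bar", figsize=(8, 4), rot=0)
    ax.set_ylabel("Score")
    ax.set_title("MT engine comparison (for TER, lower is better)")
    plt.tight_layout()
    plt.savefig("mt_comparison.png", dpi=200)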
Figure 1: Evaluation results (bar plot)

The bar plot in Figure 1 above shows that MT2 (in red) consistently outperforms the other three MT engines (which will remain unnamed). Note that for TER, less is better.