
EXHIBIT 1
Mapping of Refinitiv's ESG Data

Pillar           Categories
Environmental    Resource use; Emissions; Innovation
Social           Workforce; Human rights; Community; Product responsibility
Governance       Management; Shareholders; Corporate social responsibility strategy

architecture adjustments. The BERT framework includes two steps: pre-training and
fine-tuning.
Pre-training. In the pre-training step, the model trains deep bidirectional representations
using a large scale of unlabeled textual corpora (e.g., BooksCorpus [800M words; Zhu et al.
2015] and English Wikipedia [2,500M words]), based on the Transformer encoder (Vaswani et
al. 2017). The textual input is first transformed into a token sequence x. A token sequence
can be a single sentence or a sentence pair, where a "sentence" need not be a real
linguistic sentence but can be an arbitrary length of contiguous text. A single-sentence
sequence takes the form [CLS] x [SEP], while a sentence-pair sequence consisting of
sentences A and B takes the form [CLS] xA [SEP] xB [SEP], where [CLS] is a special
classification token and [SEP] is a special separator token. BERT then learns the hidden
vectors (embeddings) of the input tokens. Two unsupervised tasks are used for pre-training.
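To make these input formats concrete, the following minimal sketch (assuming the Hugging Face transformers library and the standard bert-base-uncased checkpoint; the example sentences are hypothetical) builds a single-sentence sequence and a sentence-pair sequence and prints the resulting tokens:

from transformers import BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")

# Single-sentence sequence: [CLS] x [SEP]
single = tokenizer("The firm reduced its carbon emissions by 20 percent.")
print(tokenizer.convert_ids_to_tokens(single["input_ids"]))
# ['[CLS]', 'the', 'firm', ..., '[SEP]']

# Sentence-pair sequence: [CLS] xA [SEP] xB [SEP]
pair = tokenizer("The firm reduced its carbon emissions by 20 percent.",
                 "It also expanded its renewable energy program.")
print(tokenizer.convert_ids_to_tokens(pair["input_ids"]))
# ['[CLS]', ..., '[SEP]', 'it', 'also', ..., '[SEP]']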
Masked Language Modeling (MLM). MLM masks some percentage of the input tokens at random,
and the model must predict the masked tokens from their context. BERT is designed to fill in
the masked tokens in a bidirectional fashion, using both the left and right contexts. The
model randomly selects 15% of the input tokens to predict, and a specific replacement
strategy is employed to mitigate the mismatch between pre-training and fine-tuning (the
[MASK] token never appears in fine-tuning data). Specifically, for a chosen token, BERT
replaces it with the [MASK] token 80% of the time, with a random token 10% of the time, and
leaves it unchanged 10% of the time.
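The 15% selection and the 80/10/10 replacement rule can be written down compactly. The sketch below is a minimal PyTorch re-implementation of that corruption step (the function name and arguments are my own; it mirrors the strategy described above rather than any particular library's internals):

import torch

def mask_for_mlm(input_ids, mask_token_id, vocab_size, special_token_ids, mlm_prob=0.15):
    # input_ids: LongTensor of token ids, e.g., shape (batch, seq_len).
    labels = input_ids.clone()

    # Select roughly 15% of positions to predict, never special tokens such as [CLS]/[SEP].
    pick_prob = torch.full(input_ids.shape, mlm_prob)
    for tid in special_token_ids:
        pick_prob[input_ids == tid] = 0.0
    picked = torch.bernoulli(pick_prob).bool()
    labels[~picked] = -100  # unselected positions do not contribute to the loss

    corrupted = input_ids.clone()

    # 80% of the selected positions become [MASK].
    to_mask = torch.bernoulli(torch.full(input_ids.shape, 0.8)).bool() & picked
    corrupted[to_mask] = mask_token_id

    # 10% become a random vocabulary token (half of the remaining 20%);
    # the final 10% keep the original, unchanged token.
    to_randomize = torch.bernoulli(torch.full(input_ids.shape, 0.5)).bool() & picked & ~to_mask
    corrupted[to_randomize] = torch.randint(vocab_size, input_ids.shape)[to_randomize]

    return corrupted, labels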
Next Sentence Prediction (NSP). Many NLP tasks involve finding patterns across sentence
pairs, such as question-passage pairs in question answering or hypothesis-premise pairs in
natural language inference. To make BERT adaptive to such tasks, a binarized NSP task is
formulated in pre-training. In particular, the model is trained, under a classification
framework, to distinguish whether sentence B is the actual sentence that follows sentence A
in a sentence-pair sequence. To achieve this, for 50% of the sentence pairs, sentence B is
the original next sentence following sentence A, while for the other 50%, sentence B is a
random sentence drawn from the corpus.
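Constructing these 50/50 sentence pairs can be sketched as follows (a simplified illustration assuming documents is a list of documents, each a list of sentence strings; real pre-training pipelines additionally pack sequences to a target length and avoid drawing the "random" sentence from the same document):

import random

def make_nsp_pairs(documents):
    # Returns (sentence_a, sentence_b, is_next) triples for the binarized NSP task.
    pairs = []
    for doc in documents:
        for i in range(len(doc) - 1):
            sentence_a = doc[i]
            if random.random() < 0.5:
                sentence_b, is_next = doc[i + 1], 1  # the actual next sentence ("IsNext")
            else:
                other_doc = random.choice(documents)
                sentence_b, is_next = random.choice(other_doc), 0  # a random sentence ("NotNext")
            pairs.append((sentence_a, sentence_b, is_next))
    return pairs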
Fine-tuning. In the fine-tuning step, apart from the output layers, the model architecture
remains the same as in the pre-training step, and the input is task-specific labeled data.
For each downstream task, the task-specific inputs are first plugged into BERT, and the
representations BERT generates are then fed into the output layers. In particular, for
classification tasks, the hidden vector of [CLS] is fed into the output layers. The
parameters of the BERT model are initialized with the pre-trained ones, while the parameters
of the output layers are randomly initialized. All parameters are then fine-tuned using the
task-specific labeled data.
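For a four-way ESG classification task such as the one implied by Exhibit 2 (Environmental, Social, Governance, Non-ESG), the fine-tuning setup might look like the sketch below (assuming the Hugging Face transformers library; the example text, label id, and learning rate are illustrative, and an actual run would iterate over batches of the labeled training data for several epochs):

import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
# The encoder weights are initialized from pre-training; the classification head on top
# of the [CLS] representation is randomly initialized.
model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=4)

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)  # all parameters are fine-tuned

batch = tokenizer(["The board approved a new emissions-reduction target."],
                  return_tensors="pt", padding=True, truncation=True)
labels = torch.tensor([0])  # e.g., 0 = Environmental (label ids here are arbitrary)

outputs = model(**batch, labels=labels)  # cross-entropy loss over the [CLS]-based logits
outputs.loss.backward()
optimizer.step()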
EXHIBIT 2
Sizes of Datasets

                 Training Dataset    Testing Dataset    Total
Environmental    500                 3,000              3,500
Social           500                 3,000              3,500
Governance       500                 3,000              3,500
Non-ESG          500                 500                1,000
Total            2,000               9,500              11,500