Prompt-Based Learning
The traditional pre-train and fine-tune approach with a pre-trained model such as BERT requires a minimum amount of labeled data from the downstream task, but in many applications a sufficient amount of labeled data is a luxury. In addition, the traditional pre-train and fine-tune paradigm often requires modifying the language model's architecture when applying it to downstream tasks. Prompt-based learning is inspired by the observation that a task tends to be easier when a task description helps clarify what the task is. By introducing a prompt, the method transforms a specific task into a masked language prediction problem and hence overcomes two issues: (1) it reduces the need for a large amount of task-specific labeled data, and (2) it mitigates the gap between pre-training and fine-tuning. A prompt usually consists of a template and a verbalizer.
Template formulation. A crucial step of prompt construction is to formulate a template. A template is a piece of text that wraps the original text in a sequence containing predefined text and the mask token [MASK]. Let x be the original input; we denote the template by F(x). The design of the template depends on the task to be solved. For example, given an input x in a classification task such as sentiment analysis, a template can be formulated as
F(x) = [CLS] x It is [MASK].
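As a concrete sketch, the template above can be written as a small function that wraps the raw input; the exact wording ("It is [MASK].") and the sample sentence are illustrative assumptions, and in practice the [CLS] token is usually added automatically by the tokenizer.

# A minimal sketch of the template F(x) above; the wording and the example
# input are illustrative assumptions.
MASK = "[MASK]"

def template(x: str) -> str:
    # Wrap the original input x with predefined text and the mask token.
    return f"[CLS] {x} It is {MASK}."

print(template("The firm reduced its carbon emissions."))
# -> [CLS] The firm reduced its carbon emissions. It is [MASK].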
Verbalizer construction. A verbalizer is an injective mapping v: L → V that maps each label in the task-specific label space L to one or more words in the vocabulary V of a pre-trained language model. For instance, in sentiment analysis, the verbalizer can be set as {"positive": "great"} and {"negative": "terrible"}.
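In code, such a verbalizer is simply a mapping from task labels to vocabulary words; the sketch below uses the label words given above and is purely illustrative.

# A minimal sketch of the verbalizer v: L -> V described above; the label
# words are the ones given in the text.
verbalizer = {"positive": "great", "negative": "terrible"}

def verbalize(label: str) -> str:
    # Map a task-specific label in L to a word in the model vocabulary V.
    return verbalizer[label]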
After defining the prompt, we can use it to solve specific tasks. Specifically, we are given an input x and a task-specific label space L. We first transform the original input into the template form F(x). Next, we employ a pre-trained language model with vocabulary V to determine the label y ∈ L such that v(y) ∈ V has the highest probability of filling the mask token [MASK]. In practice, we can choose from a wide variety of pre-trained language models, ranging from moderately sized language models (e.g., BERT, RoBERTa) to large language models (e.g., GPT-3).
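This procedure can be sketched with a masked language model from the Hugging Face transformers library; the model name, template wording, verbalizer, and sample sentence below are illustrative assumptions rather than the configuration used in this article.

# A minimal sketch of prompt-based classification with a masked language model.
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

model_name = "bert-base-uncased"          # any masked LM (e.g., BERT, RoBERTa)
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)
model.eval()

verbalizer = {"positive": "great", "negative": "terrible"}

def predict(x: str) -> str:
    # Template F(x): wrap the input with predefined text and the mask token.
    prompt = f"{x} It is {tokenizer.mask_token}."
    inputs = tokenizer(prompt, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits                     # (1, seq_len, vocab)
    mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero()[0, 1]
    mask_logits = logits[0, mask_pos]
    # Score each label by the logit of its verbalizer word at the [MASK] position
    # and return the label whose word is the most probable filler.
    scores = {
        label: mask_logits[tokenizer.convert_tokens_to_ids(word)].item()
        for label, word in verbalizer.items()
    }
    return max(scores, key=scores.get)

print(predict("The company cut its carbon emissions by 40% this year."))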
Training strategy. Before applying a prompt to solve downstream tasks, it is necessary to design the training strategy for the prompt-based method. Because the downstream task is incorporated into the prompt, we can directly use the pre-trained language model to predict the desired text at the mask token in the prompt and obtain the answer for the task of interest. Therefore, in many circumstances the pre-trained language model can be applied to a downstream task without additional training under the prompt-based approach (zero-shot learning). Alternatively, we can use task-specific data to train the model. The prompt-based paradigm is shown to be effective when only a small amount of task-specific data is available (few-shot learning).
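As an illustration of the few-shot setting, the sketch below fine-tunes a masked language model on a handful of labeled examples by restricting the [MASK] distribution to the verbalizer words; the model name, template wording, verbalizer, learning rate, and training examples are all illustrative assumptions.

# A minimal few-shot fine-tuning sketch: cross-entropy over verbalizer words
# at the [MASK] position.
import torch
from torch.optim import AdamW
from transformers import AutoModelForMaskedLM, AutoTokenizer

model_name = "bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

verbalizer = {"positive": "great", "negative": "terrible"}
labels = list(verbalizer)
label_ids = [tokenizer.convert_tokens_to_ids(verbalizer[l]) for l in labels]

# A hypothetical handful of labeled task-specific examples (few-shot setting).
train_data = [
    ("The company cut emissions sharply this year.", "positive"),
    ("The board ignored repeated safety violations.", "negative"),
]

optimizer = AdamW(model.parameters(), lr=2e-5)
model.train()

for epoch in range(3):
    for text, label in train_data:
        # Template F(x): wrap the input with predefined text and the mask token.
        prompt = f"{text} It is {tokenizer.mask_token}."
        inputs = tokenizer(prompt, return_tensors="pt")
        logits = model(**inputs).logits                      # (1, seq_len, vocab)
        mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero()[0, 1].item()
        # Keep only the logits of the verbalizer words at the [MASK] position and
        # apply cross-entropy against the true label.
        class_logits = logits[0, mask_pos, label_ids].unsqueeze(0)   # (1, num_labels)
        target = torch.tensor([labels.index(label)])
        loss = torch.nn.functional.cross_entropy(class_logits, target)
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()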
The parameters of prompt-based learning usually include parameters from the pre-trained language model and the prompt. The fine-tuning strategy varies depending on the choice of pre-trained model and the specific downstream task. We present four typical fine-tuning methods in prompt-based learning as follows (Liu et al. 2021):
1. Tuning-free prompting: No additional prompt parameters are introduced, and the parameters of the pre-trained language model are not updated. The method directly generates task answers with the aid of the prompt.
2. Fixed-prompt language model tuning: No additional prompt parameters are introduced, and the parameters of the pre-trained language model are fine-tuned using the traditional pre-train and fine-tune approach. In this method,