Quarterly Mark December 2023 - 10
Rules-Based NLP (1950s to 1970s)
Early NLP research focused on rules-based approaches, including grammar-based
parsers and dictionary-based information extraction systems.
§ Grammar-based parsers: These systems used formal grammars to analyze
the structure of text, such as its syntax, and to generate parse trees. This
information could then be used to answer questions about the semantics of
a text or to perform other NLP tasks, such as information extraction or text
classification. Some early influential articles include Chomsky (2015), Ross
(1967), and Earley (1970).
§ Dictionaries: These systems relied on predefined dictionaries of words and
phrases to identify and extract specific pieces of information from text, such
as names, dates, and locations (Luhn 1958). Handcrafted rules were used to
classify text into predefined categories, such as positive, negative, or neutral
sentiment (Plath 1976). They were likewise used to identify and classify
named entities in text, such as people, organizations, and locations.
Exhibit 3 provides an example of rules-based parsing of a sentence. Such
parsing and pattern matching were used by early chatbots such as ELIZA. These
rules-based models represented an important early step in the development of NLP,
but they had several inherent limitations, including the need for extensive handcrafted
rules and dictionaries and an inability to handle exceptions and outliers in text.
Some recent works related to rules-based methods are Klein and Manning (2003),
Agichtein and Gravano (2000), Turney (2002), and Nadeau and Sekine (2007).
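A grammar-based parser of the kind described above can be sketched as a tiny recursive-descent parser. The grammar and lexicon below are toy assumptions that cover only a single example sentence; production systems used far richer formal grammars:

```python
# A toy recursive-descent parser, in the spirit of early rules-based
# grammar parsers. Lexicon and grammar rules are illustrative only.

LEXICON = {
    "happy": "Adj", "researchers": "Noun", "develop": "Verb",
    "a": "Det", "language": "Noun", "model": "Noun",
}

def parse_np(tokens, i):
    """NP -> Adj Noun | Det Noun+ (greedy noun run after a determiner)."""
    tag = LEXICON[tokens[i]]
    if tag == "Adj":
        return ("NP", tokens[i], tokens[i + 1]), i + 2
    if tag == "Det":
        j = i + 1
        while j < len(tokens) and LEXICON[tokens[j]] == "Noun":
            j += 1
        return ("NP", *tokens[i:j]), j
    raise ValueError("no NP at position %d" % i)

def parse_sentence(text):
    """S -> NP VP, where VP -> Verb NP."""
    tokens = text.lower().split()
    np1, i = parse_np(tokens, 0)
    verb = tokens[i]
    assert LEXICON[verb] == "Verb"
    np2, _ = parse_np(tokens, i + 1)
    return ("S", np1, ("VP", verb, np2))

tree = parse_sentence("Happy researchers develop a language model")
print(tree)
# ('S', ('NP', 'happy', 'researchers'),
#  ('VP', 'develop', ('NP', 'a', 'language', 'model')))
```

Any sentence whose words or structure fall outside the handcrafted rules simply fails to parse, illustrating the brittleness noted above.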
Early Statistical NLP Models (1980s to 1990s)
Hidden Markov models (HMMs) (Baum and Petrie 1966) and maximum entropy
models (MEMs) were among the first statistical NLP models to be developed
and applied to NLP tasks, including part-of-speech tagging, named entity
recognition, and speech recognition (Baker 1979; Berger et al. 1996). HMMs are a
EXHIBIT 3
Sentence Tree Constructed by a Rule-Based Grammar Parser for the Sentence
"Happy researchers develop a language model"

Sentence
├── Noun Phrase
│   ├── Adjective: Happy
│   └── Noun: researchers
└── Verb Phrase
    ├── Verb: develop
    └── Noun Phrase: a language model

NOTE: Such grammar parsers were used by early chatbots like ELIZA to understand the sentence and respond based on the script (dictionary of sentences).
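The HMM-based part-of-speech tagging mentioned above rests on Viterbi decoding: finding the most likely hidden tag sequence given observed words. The sketch below uses a two-state toy model whose transition and emission probabilities are invented for illustration, not estimated from any real corpus:

```python
# A minimal sketch of Viterbi decoding for an HMM part-of-speech
# tagger. All probabilities below are illustrative assumptions.

STATES = ["Noun", "Verb"]
START = {"Noun": 0.7, "Verb": 0.3}                    # P(tag at start)
TRANS = {"Noun": {"Noun": 0.3, "Verb": 0.7},          # P(next tag | tag)
         "Verb": {"Noun": 0.8, "Verb": 0.2}}
EMIT = {"Noun": {"researchers": 0.5, "models": 0.5},  # P(word | tag)
        "Verb": {"develop": 0.9, "models": 0.1}}

def viterbi(words):
    """Return the most likely tag sequence for the observed words."""
    # v[t][s]: probability of the best path ending in state s at time t
    v = [{s: START[s] * EMIT[s].get(words[0], 1e-6) for s in STATES}]
    back = [{}]
    for t in range(1, len(words)):
        v.append({})
        back.append({})
        for s in STATES:
            prev, p = max(((r, v[t - 1][r] * TRANS[r][s]) for r in STATES),
                          key=lambda x: x[1])
            v[t][s] = p * EMIT[s].get(words[t], 1e-6)
            back[t][s] = prev
    # Trace the best final state backwards through the pointers.
    best = max(STATES, key=lambda s: v[-1][s])
    tags = [best]
    for t in range(len(words) - 1, 0, -1):
        tags.append(back[t][tags[-1]])
    return tags[::-1]

print(viterbi(["researchers", "develop", "models"]))  # ['Noun', 'Verb', 'Noun']
```

Unlike the handcrafted rules of the previous era, the probability tables here would be estimated from annotated corpora, which is what made these models the first data-driven NLP systems.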
10 | From ELIZA to ChatGPT: The Evolution of Natural Language Processing and Financial Applications