EXHIBIT 1 Word Cloud Representing the Most Frequently Occurring Words in this Article

EVOLUTION OF NLP MODELS

NLP models have evolved over time with theoretical innovation and the growing availability of data and computational resources. Exhibit 2 shows the computation and data points used by artificial intelligence (AI) systems, including language systems, developed over time, in which we can observe an exponential increase in the computation and datasets used by AI systems. This growth was made possible by expanding computational resources and the declining cost of computation and data storage. The increase in computation and datasets implies increasing complexity of AI/NLP models over time.

The earliest NLP models in the literature are rules-based systems, developed in the 1950s and 1960s. These systems used handcrafted rules and grammars to analyze natural language text (a minimal sketch of this style appears at the end of this section). In the 1980s and 1990s, statistical NLP models emerged and began to supplant rules-based approaches. During the 2000s, with the increased availability of datasets and the exponential increase in computational power, ML models were developed that could learn patterns in textual data and make predictions based on this interpretation of text.

In this section, we summarize the models developed over this decades-long history. We have divided the timeline of NLP models into multiple categories based on model complexity and consider them one by one.
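To make the rules-based approach concrete, the following is a minimal, hypothetical sketch of an ELIZA-style responder in Python. The patterns, response templates, and the `respond` function are illustrative assumptions rather than a reconstruction of any specific historical system; they simply show how handcrafted rules map input text to output text.

```python
import re

# Hypothetical handcrafted rules: each pairs a regular-expression pattern
# with a response template. The first matching rule produces the reply.
RULES = [
    (re.compile(r"\bI need (.+)", re.IGNORECASE), "Why do you need {0}?"),
    (re.compile(r"\bI am (.+)", re.IGNORECASE), "How long have you been {0}?"),
    (re.compile(r"\bbecause\b", re.IGNORECASE), "Is that the real reason?"),
]

def respond(utterance: str) -> str:
    """Return a canned response based on the first matching handcrafted rule."""
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(*match.groups())
    return "Please tell me more."  # fallback when no rule applies

if __name__ == "__main__":
    print(respond("I need a break from the markets"))
    # -> Why do you need a break from the markets?
```

The statistical and ML models discussed in the remainder of this section replace such handcrafted patterns with parameters learned from data.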