{"id":294,"date":"2023-01-29T17:16:49","date_gmt":"2023-01-29T17:16:49","guid":{"rendered":"https:\/\/fintext.ai\/?page_id=294"},"modified":"2023-01-30T12:10:10","modified_gmt":"2023-01-30T12:10:10","slug":"gold-standard-financial-benchmarks","status":"publish","type":"page","link":"https:\/\/fintext.ai\/?page_id=294","title":{"rendered":"Gold-Standard Financial Benchmarks"},"content":{"rendered":"<p style=\"text-align: justify;\">We introduce the first <em>gold-standard financial benchmark<\/em> for systematically comparing word embeddings using a financial language framework. This benchmark covers seven groups of financial analogies. Each group contains 380 analogies, giving 2660 unique analogies in total across all groups. All financial analogies are developed using Bureau van Dijk\u2019s Orbis database and are available for <a href=\"https:\/\/fintext.ai\/?page_id=44\">download<\/a>.<\/p>\n<p style=\"text-align: justify;\">In the table below, the first five groups cover publicly listed US companies, the sixth group mixes publicly listed US and UK companies, and the last group mixes publicly listed companies from the US, UK, China, and Japan. \u2018Ticker\u2019 is the security ticker identifier, \u2018Name\u2019 is the full name of the company, \u2018City\u2019 is the headquarters location, \u2018Exchange\u2019 is the stock exchange where the company\u2019s shares are traded, \u2018Country\u2019 is the country where the headquarters is located, \u2018State\u2019 (for US companies) is the state where the headquarters is located, and \u2018Incorporation year\u2019 is the year the company was incorporated. To generate sufficiently challenging analogies, we chose the top 20, 10, and 5 companies from the \u2018very large companies\u2019 class for groups I-V, VI, and VII, respectively, so that each group contains 20 companies. Permuting the companies in each group into ordered pairs generates 20 \u00d7 19 = 380 unique analogies per group and 7 \u00d7 380 = 2660 analogies in total. 
The accuracy of each word embedding is reported for each group and overall.<\/p>\n<p><img fetchpriority=\"high\" decoding=\"async\" class=\"wp-image-303 aligncenter\" src=\"https:\/\/fintext.ai\/wp-content\/uploads\/2023\/01\/FinText_WE_Gold_Standard_Benchmark_table-300x112.jpg\" alt=\"Accuracy of each word embedding for each analogy group and overall\" width=\"1020\" height=\"381\" srcset=\"https:\/\/fintext.ai\/wp-content\/uploads\/2023\/01\/FinText_WE_Gold_Standard_Benchmark_table-300x112.jpg 300w, https:\/\/fintext.ai\/wp-content\/uploads\/2023\/01\/FinText_WE_Gold_Standard_Benchmark_table-1024x383.jpg 1024w, https:\/\/fintext.ai\/wp-content\/uploads\/2023\/01\/FinText_WE_Gold_Standard_Benchmark_table-768x287.jpg 768w, https:\/\/fintext.ai\/wp-content\/uploads\/2023\/01\/FinText_WE_Gold_Standard_Benchmark_table-1536x575.jpg 1536w, https:\/\/fintext.ai\/wp-content\/uploads\/2023\/01\/FinText_WE_Gold_Standard_Benchmark_table-2048x766.jpg 2048w, https:\/\/fintext.ai\/wp-content\/uploads\/2023\/01\/FinText_WE_Gold_Standard_Benchmark_table-800x299.jpg 800w\" sizes=\"(max-width: 1020px) 100vw, 1020px\" \/><\/p>\n<p style=\"text-align: justify;\">The table shows that FinText substantially outperforms all other word embeddings: its overall accuracy is 8 times that of Google Word2Vec and 512 times that of WikiNews. WikiNews\u2019s accuracy is below 0.1% for every group, with an overall accuracy of 0.05%. 
For Google Word2Vec, the overall accuracy is 3.01%.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>We introduce the first gold-standard financial benchmark for systematically comparing word embeddings using a financial language&#46;&#46;&#46;<\/p>\n","protected":false},"author":1,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"om_disable_all_campaigns":false,"_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"_uf_show_specific_survey":0,"_uf_disable_surveys":false,"footnotes":""},"class_list":["post-294","page","type-page","status-publish","hentry"],"_links":{"self":[{"href":"https:\/\/fintext.ai\/index.php?rest_route=\/wp\/v2\/pages\/294","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/fintext.ai\/index.php?rest_route=\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/fintext.ai\/index.php?rest_route=\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/fintext.ai\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/fintext.ai\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=294"}],"version-history":[{"count":15,"href":"https:\/\/fintext.ai\/index.php?rest_route=\/wp\/v2\/pages\/294\/revisions"}],"predecessor-version":[{"id":323,"href":"https:\/\/fintext.ai\/index.php?rest_route=\/wp\/v2\/pages\/294\/revisions\/323"}],"wp:attachment":[{"href":"https:\/\/fintext.ai\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=294"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}