The Informativeness of Text, the Deep Learning Approach

File: HARC-2021 paper 169.pdf (805.69 kB, Adobe PDF)

Item Summary

Title: The Informativeness of Text, the Deep Learning Approach
Authors: Allen Huang; Hui Wang; Yi Yang
Keywords: Natural Language Processing; Machine Learning; Deep Learning; Textual Analysis; Earnings; Capital Investment
Date Issued: 16 Aug 2020
Abstract: This paper uses a deep learning natural language processing approach (Google's Bidirectional Encoder Representations from Transformers, hereafter BERT) to comprehensively summarize financial texts and examine their informativeness. First, we compare BERT's effectiveness in sentiment classification of financial texts with that of a finance-specific dictionary, naïve Bayes, and Word2Vec, a shallow machine learning approach. We find, first, that BERT outperforms all other approaches and, second, that pre-training BERT on financial texts further improves its performance. Using BERT, we show that conference call texts provide information to investors and that less accurate approaches underestimate the economic significance of textual informativeness by at least 25%. Last, textual sentiment summarized by BERT predicts future earnings and capital expenditure after controlling for the financial statement-based determinants commonly used in finance and accounting research.
Appears in Collections: 09 Financial Accounting 2: Disclosure


Items in ScholarSpace are protected by copyright, with all rights reserved, unless otherwise indicated.