The Informativeness of Text, the Deep Learning Approach
Huang, Allen; Wang, Hui; Yang, Yi
2020-12-01T01:01:19Z; 2020-08-16
dc.description.abstract This paper uses a deep learning natural language processing approach (Google's Bidirectional Encoder Representations from Transformers, hereafter BERT) to comprehensively summarize financial texts and examine their informativeness. First, we compare BERT's effectiveness in sentiment classification of financial texts with that of a finance-specific dictionary, naïve Bayes, and Word2Vec, a shallow machine learning approach. We find, first, that BERT outperforms all other approaches and, second, that pre-training BERT with financial texts further improves its performance. Using BERT, we show that conference call texts provide information to investors and that other, less accurate approaches underestimate the economic significance of textual informativeness by at least 25%. Last, textual sentiment summarized by BERT predicts future earnings and capital expenditure after controlling for financial-statement-based determinants commonly used in finance and accounting research.
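One of the baseline approaches the abstract compares against BERT, naïve Bayes sentiment classification, can be sketched in a few lines. This is an illustrative toy example, not the paper's actual pipeline: the training sentences, labels, and the scikit-learn bag-of-words setup are all assumptions made here for demonstration.

```python
# Minimal naive Bayes sentiment baseline (illustrative only; toy data,
# not the paper's conference-call corpus or its actual feature pipeline).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Tiny hypothetical training set of financial sentences with sentiment labels.
train_texts = [
    "earnings beat expectations and margins expanded",
    "record revenue growth this quarter",
    "guidance raised on strong demand",
    "revenue declined and losses widened",
    "missed earnings estimates amid weak demand",
    "impairment charges hurt the bottom line",
]
train_labels = ["positive", "positive", "positive",
                "negative", "negative", "negative"]

# Bag-of-words counts feeding a multinomial naive Bayes classifier.
clf = make_pipeline(CountVectorizer(), MultinomialNB())
clf.fit(train_texts, train_labels)

print(clf.predict(["strong revenue growth and raised guidance"])[0])
# → positive
```

A dictionary-based baseline would instead count hits against a fixed word list (e.g., a finance-specific lexicon), while BERT replaces the bag-of-words step with contextual embeddings fine-tuned on labeled sentences.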
dc.subject Natural Language Processing
dc.subject Machine Learning
dc.subject Deep Learning
dc.subject Textual Analysis
dc.subject Informativeness
dc.subject Earnings
dc.subject Capital Investment
dc.title The Informativeness of Text, the Deep Learning Approach