The Informativeness of Text, the Deep Learning Approach

dc.contributor.author Huang, Allen
dc.contributor.author Wang, Hui
dc.contributor.author Yang, Yi
dc.date.accessioned 2020-12-01T01:01:19Z
dc.date.available 2020-12-01T01:01:19Z
dc.date.issued 2020-08-16
dc.description.abstract This paper uses a deep learning natural language processing approach (Google's Bidirectional Encoder Representations from Transformers, hereafter BERT) to comprehensively summarize financial texts and examine their informativeness. First, we compare BERT's effectiveness in sentiment classification of financial texts with that of a finance-specific dictionary, naïve Bayes, and Word2Vec, a shallow machine learning approach. We find, first, that BERT outperforms all other approaches and, second, that pre-training BERT on financial texts further improves its performance. Using BERT, we show that conference call texts provide information to investors and that other, less accurate approaches underestimate the economic significance of textual informativeness by at least 25%. Last, textual sentiments summarized by BERT predict future earnings and capital expenditures, after controlling for financial-statement-based determinants commonly used in finance and accounting research.
dc.identifier.uri http://hdl.handle.net/10125/70549
dc.subject Natural Language Processing
dc.subject Machine Learning
dc.subject Deep Learning
dc.subject Textual Analysis
dc.subject Informativeness
dc.subject Earnings
dc.subject Capital Investment
dc.title The Informativeness of Text, the Deep Learning Approach
Files
Original bundle
Name: HARC-2021_paper_169.pdf
Size: 805.69 KB
Format: Adobe Portable Document Format