The Informativeness of Text, the Deep Learning Approach

Date
2020-08-16
Authors
Huang, Allen
Wang, Hui
Yang, Yi
Abstract
This paper uses a deep learning natural language processing approach (Google's Bidirectional Encoder Representations from Transformers, hereafter BERT) to comprehensively summarize financial texts and examine their informativeness. First, we compare BERT's effectiveness in sentiment classification of financial texts with that of a finance-specific dictionary, naïve Bayes, and Word2Vec, a shallow machine learning approach. We find, first, that BERT outperforms all other approaches and, second, that pre-training BERT on financial texts further improves its performance. Using BERT, we show that conference call texts provide information to investors and that other, less accurate approaches underestimate the economic significance of textual informativeness by at least 25%. Last, textual sentiment summarized by BERT can predict future earnings and capital expenditure, after controlling for financial-statement-based determinants commonly used in finance and accounting research.
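To make the sentiment-classification step concrete, the following is a minimal sketch (not the authors' code) of fine-tuning a BERT model on labeled financial sentences using the Hugging Face transformers library. The model name, three-class label set, and example sentences are illustrative assumptions; the paper does not publish an implementation here.

import torch
from torch.utils.data import DataLoader, TensorDataset
from transformers import BertTokenizerFast, BertForSequenceClassification

# Hypothetical labeled sentences: 0 = negative, 1 = neutral, 2 = positive.
train_texts = [
    "Operating margins declined sharply due to rising input costs.",
    "The company reaffirmed its full-year guidance.",
    "Revenue grew 20% year over year, beating expectations.",
]
train_labels = [0, 1, 2]

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=3
)

# Tokenize once and wrap the tensors in a simple dataset.
enc = tokenizer(train_texts, padding=True, truncation=True,
                max_length=128, return_tensors="pt")
dataset = TensorDataset(enc["input_ids"], enc["attention_mask"],
                        torch.tensor(train_labels))
loader = DataLoader(dataset, batch_size=2, shuffle=True)

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
for epoch in range(3):
    for input_ids, attention_mask, labels in loader:
        optimizer.zero_grad()
        out = model(input_ids=input_ids, attention_mask=attention_mask,
                    labels=labels)
        out.loss.backward()   # cross-entropy over the 3 sentiment classes
        optimizer.step()

# Score a new conference-call sentence.
model.eval()
with torch.no_grad():
    test = tokenizer("Capital expenditures will increase next quarter.",
                     return_tensors="pt")
    pred = model(**test).logits.argmax(dim=-1).item()
print(["negative", "neutral", "positive"][pred])

In the paper's setting, the same fine-tuning procedure would start from a BERT checkpoint further pre-trained on financial text rather than the generic bert-base-uncased assumed above.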
Keywords
Natural Language Processing, Machine Learning, Deep Learning, Textual Analysis, Informativeness, Earnings, Capital Investment