Neural Network Translated into Bag-of-Words: Lexicon of Attentions

dc.contributor.author Iwasaki, Hitoshi
dc.contributor.author Chen, Ying
dc.contributor.author Huang, Allen
dc.contributor.author Wang, Hui
dc.date.accessioned 2021-11-12T18:45:51Z
dc.date.available 2021-11-12T18:45:51Z
dc.date.issued 2021
dc.description.abstract We present a framework that translates a trained neural network's decision-making process into a lexicon. First, we train an interpretable neural network, a hierarchical attention network (HAN), to predict cumulative abnormal returns (CAR) from analyst reports. Second, we relate the trained attention weights to words and compile the analytical results into a parsimonious lexicon. The attention-based lexicon reflects contextual information, and in out-of-sample experiments it both outperforms and complements the Loughran and McDonald (LM) lexicon; it also offers a smart weighting scheme that dominates existing word-weighting methods. Additional experiments confirm the advantage of the proposed lexicon on earnings call transcripts and generalize its usefulness beyond the original training corpus. Our framework materializes the contextual information in financial texts so that bag-of-words models can incorporate it, giving subsequent users an interpretable way of exploring contextual information.
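As a minimal illustrative sketch (not the authors' implementation) of the lexicon-compilation step described in the abstract, the snippet below averages word-level attention weights across documents to obtain a lexicon score per word; the function name, tokens, and attention values are hypothetical placeholders, and in the paper the weights would come from a HAN trained to predict CAR from analyst reports.

```python
# Sketch: aggregate word-level attention weights into a lexicon.
# Attention values here are placeholders, not outputs of the authors' model.
from collections import defaultdict

def compile_attention_lexicon(docs, attentions):
    """docs: list of token lists; attentions: matching lists of attention weights."""
    total = defaultdict(float)
    count = defaultdict(int)
    for tokens, weights in zip(docs, attentions):
        for token, w in zip(tokens, weights):
            total[token] += w
            count[token] += 1
    # Average attention per word serves as its lexicon weight.
    return {token: total[token] / count[token] for token in total}

# Toy usage with made-up tokens and attention weights.
docs = [["revenue", "beat", "guidance"], ["guidance", "cut", "sharply"]]
attentions = [[0.2, 0.5, 0.3], [0.4, 0.4, 0.2]]
print(compile_attention_lexicon(docs, attentions))
```

The resulting word-to-weight mapping can then be used like any bag-of-words lexicon, which is the sense in which the framework lets bag-of-words models incorporate contextual information.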
dc.identifier.uri http://hdl.handle.net/10125/76961
dc.subject Textual Analysis
dc.subject Attention
dc.subject Information Content
dc.title Neural Network Translated into Bag-of-Words: Lexicon of Attentions
dc.type.dcmi Text
Files
Original bundle: HARC-2022_paper_194.pdf (653.41 KB, Adobe Portable Document Format)