Is a Pretrained Model the Answer to Situational Awareness Detection on Social Media?

Date

2023-01-03

Starting Page

2110

Abstract

Social media can be a valuable source of information about events and incidents on the ground. However, the vast amount of content shared and the linguistic variants used on social media make it challenging to identify the situational awareness content that supports decision-making by first responders. In this study, we assess whether pretrained models can address these challenges. Various pretrained models, including static word embeddings (such as Word2Vec and GloVe) and contextualized word embeddings (such as DistilBERT), are studied in detail. According to our findings, a vanilla pretrained DistilBERT language model is insufficient for identifying situational awareness information; fine-tuning on datasets of various event types, combined with vocabulary extension, is essential to adapt a DistilBERT model for real-world situational awareness detection.
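The recipe the abstract describes (extending DistilBERT's vocabulary with domain terms, then fine-tuning it as a situational awareness classifier) can be illustrated with a minimal sketch using the Hugging Face transformers library. The domain tokens, label convention, and example post below are illustrative assumptions, not the paper's actual data or code.

import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Load a vanilla DistilBERT checkpoint as a binary classifier
# (situational awareness content vs. not).
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2)

# Vocabulary extension: register out-of-vocabulary social-media terms
# (hypothetical examples) so the tokenizer stops splitting them into
# uninformative subword pieces.
num_added = tokenizer.add_tokens(["evac", "floodwatch", "shelterinplace"])
model.resize_token_embeddings(len(tokenizer))  # grow the embedding matrix

# One illustrative fine-tuning step on a single labeled post; in the
# paper's setting this would run over labeled datasets of various
# event types.
batch = tokenizer(["Bridge on Route 9 is out, detour via Mill Rd"],
                  return_tensors="pt", truncation=True, padding=True)
labels = torch.tensor([1])  # 1 = contains situational awareness content
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
model.train()
loss = model(**batch, labels=labels).loss
loss.backward()
optimizer.step()
print(f"added {num_added} tokens; step loss = {loss.item():.3f}")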

Keywords

Data Analytics, Data Mining, and Machine Learning for Social Media, BERT, fine-tuning, pretrained models, situational awareness, vocabulary extension

Extent

10 pages

Related To

Proceedings of the 56th Hawaii International Conference on System Sciences

Rights

Attribution-NonCommercial-NoDerivatives 4.0 International
