A Lite Hierarchical Model for Dialogue Summarization with Multi-Granularity Decoder

Date

2023-01-03

Starting Page

2180

Abstract

Abstractive dialogue summarization has recently attracted considerable research attention, and hierarchical models in particular have been applied successfully to the task. However, recent hierarchical approaches often suffer from an excessive number of model parameters and long training times, mainly because they add extra encoders and attention layers to the decoder to strengthen the model's learning and summary-generation ability. A more lightweight hierarchical model is therefore needed. This study proposes a lightweight hierarchical model, ALH-BART, that generates accurate dialogue summaries quickly. The model includes word and turn encoders that improve its understanding of the dialogue, and a multi-granularity decoder that decodes word- and turn-level information simultaneously. The multi-head self-attention parameters of the encoders are reused by the corresponding multi-head self-attention layers in the decoder, which reduces the number of model parameters and speeds up training. Finally, the effectiveness of the model is verified on the SAMSum and DialogSum datasets.
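
The parameter reuse described in the abstract can be illustrated with a minimal PyTorch sketch. This is not the authors' implementation; the class names, layer structure, and dimensions are assumptions for illustration only. The idea shown is that a decoder layer points at the self-attention module of its corresponding encoder layer, so those projection weights are counted only once in the parameter budget.

# Minimal sketch (hypothetical, not the ALH-BART code): decoder self-attention
# reuses the projection weights of the corresponding encoder self-attention.
import torch
import torch.nn as nn

class WordEncoderLayer(nn.Module):
    def __init__(self, d_model=768, n_heads=12):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ffn = nn.Sequential(nn.Linear(d_model, 4 * d_model), nn.GELU(),
                                 nn.Linear(4 * d_model, d_model))
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, x):
        attn_out, _ = self.self_attn(x, x, x)
        x = self.norm1(x + attn_out)
        return self.norm2(x + self.ffn(x))

class SharedAttentionDecoderLayer(nn.Module):
    """Decoder layer whose self-attention module is the encoder layer's
    self-attention, so no new attention parameters are introduced."""
    def __init__(self, encoder_layer, d_model=768):
        super().__init__()
        self.self_attn = encoder_layer.self_attn  # shared module -> shared weights
        self.ffn = nn.Sequential(nn.Linear(d_model, 4 * d_model), nn.GELU(),
                                 nn.Linear(4 * d_model, d_model))
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, y, causal_mask=None):
        attn_out, _ = self.self_attn(y, y, y, attn_mask=causal_mask)
        y = self.norm1(y + attn_out)
        return self.norm2(y + self.ffn(y))

# Each decoder layer is built from its corresponding encoder layer,
# so the attention projections appear only once in the parameter count.
enc_layers = [WordEncoderLayer() for _ in range(2)]
dec_layers = [SharedAttentionDecoderLayer(enc) for enc in enc_layers]

Under these assumptions, the shared attention projections are both smaller to store and updated from encoder and decoder gradients jointly, which is consistent with the abstract's claim of fewer parameters and faster learning.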

Keywords

Data Analytics, Data Mining, and Machine Learning for Social Media, dialog/chat summarization, hierarchical model, machine learning, social media

Extent

10

Related To

Proceedings of the 56th Hawaii International Conference on System Sciences

Rights

Attribution-NonCommercial-NoDerivatives 4.0 International
