DEEPFAKE DETECTION GENERALIZATION VIA KNOWLEDGE DISTILLATION

Date
2023
Authors
Flores, Cristian
Contributor (Advisor)
Baek, Kyungim
Department
Computer Science
Abstract
Two ongoing challenges in the field of deepfake detection are a lack of generalized models and a loss in performance when analyzing highly compressed videos. This work attempts to address these problems through the use of knowledge distillation (KD) with heterogeneous teachers. In a typical classroom setting, a student learns from several teachers, each an expert in their own domain; KD with heterogeneous teachers is analogous to this process. A contribution of this work is a KD pipeline that can effectively utilize the knowledge of heterogeneous teachers to train a student model. This pipeline introduces the Winner-takes-all method for utilizing the knowledge of all teachers at training time. The three primary goals of this work are to show that a student model trained with the proposed KD pipeline is a generalized learner, to demonstrate that knowledge from its teachers was retained, and to maintain a sufficiently high level of performance despite the student being a shallower, more compact model than its teacher(s). The results indicate that all three goals were met, with the benefits of using the KD pipeline ranging from marginal to moderate.
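The abstract does not specify how the Winner-takes-all step chooses among teachers, so the following is only a minimal sketch of one plausible reading: per training sample, the "winner" is the heterogeneous teacher with the lowest loss on the ground-truth real/fake label, and the student is distilled from that teacher's softened output alongside the usual hard-label loss. All function and parameter names here (wta_distillation_loss, temperature, alpha) are hypothetical and not taken from the thesis.

    # Hypothetical sketch of a winner-takes-all knowledge-distillation loss.
    # Assumptions (not from the thesis): teachers emit binary real/fake logits,
    # the per-sample winner is the teacher with the lowest cross-entropy on the
    # ground-truth label, and the student matches that teacher's soft targets.
    import torch
    import torch.nn.functional as F

    def wta_distillation_loss(student_logits, teacher_logits_list, labels,
                              temperature=4.0, alpha=0.5):
        """student_logits: (B, 2); teacher_logits_list: list of (B, 2); labels: (B,)."""
        # Per-sample cross-entropy for each teacher: shape (num_teachers, B).
        teacher_ce = torch.stack([
            F.cross_entropy(t_logits, labels, reduction="none")
            for t_logits in teacher_logits_list
        ])
        # Winner-takes-all: index of the lowest-loss teacher for each sample.
        winner_idx = teacher_ce.argmin(dim=0)                   # (B,)
        all_teachers = torch.stack(teacher_logits_list)         # (T, B, 2)
        batch_idx = torch.arange(labels.size(0), device=labels.device)
        winner_logits = all_teachers[winner_idx, batch_idx]     # (B, 2)

        # Soft-target KD loss against the winning teacher only.
        kd = F.kl_div(
            F.log_softmax(student_logits / temperature, dim=1),
            F.softmax(winner_logits / temperature, dim=1),
            reduction="batchmean",
        ) * temperature ** 2
        # Standard supervised loss on the hard labels.
        ce = F.cross_entropy(student_logits, labels)
        return alpha * kd + (1 - alpha) * ce

Under this reading, each sample is supervised by only the single most reliable teacher for that sample, which is one way a compact student could retain knowledge from several domain-expert teachers; the thesis's actual selection criterion and loss weighting may differ.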
Keywords
Artificial intelligence, Computer science, AI, computer vision, Deepfake, knowledge distillation
Extent
47 pages
Rights
All UHM dissertations and theses are protected by copyright. They may be viewed from this source for any purpose, but reproduction or distribution in any format is prohibited without written permission from the copyright owner.