PhD Qualifying Examination

A Survey for Knowledge Distillation in Image Classification

The Hong Kong University of Science and Technology (Guangzhou)

Data Science and Analytics Thrust


By Mr. Bo HUANG

Abstract

The advent of deep learning has revolutionized various domains, including computer vision and natural language processing. However, the deployment of large deep neural networks is often constrained by their extensive resource requirements, particularly in edge computing scenarios. Knowledge distillation offers an effective solution to this problem by enabling the training of compact yet high-performing student models through transferring knowledge from larger teacher models. This survey provides a comprehensive review of knowledge distillation techniques, emphasizing their effectiveness in model compression and the architectural flexibility that is crucial for adapting to diverse deployment environments.

The survey reviews the evolution of knowledge distillation from logits-based methods to advanced relational approaches, and highlights the importance of transferring not only knowledge but also adversarial robustness, so that compact models remain protected against attacks in real-world applications. It also discusses the capacity gap challenge and presents various strategies to mitigate this issue.
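To make the logits-based starting point of this line of work concrete, below is a minimal sketch of the classic soft-target distillation loss in the style of Hinton et al.; it is an illustration, not code from the survey, and all function names and default hyperparameters (`temperature`, `alpha`) are illustrative choices.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; higher T yields a softer distribution,
    # exposing the teacher's "dark knowledge" about non-target classes.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, true_label,
            temperature=4.0, alpha=0.9):
    """Logits-based distillation loss (illustrative sketch).

    alpha weighs the soft (teacher) term against the hard (label) term;
    the T**2 factor keeps soft-target gradients on a comparable scale.
    """
    soft_t = softmax(teacher_logits, temperature)
    soft_s = softmax(student_logits, temperature)
    # KL divergence between softened teacher and student distributions.
    kl = sum(p * math.log(p / q) for p, q in zip(soft_t, soft_s))
    # Standard cross-entropy of the student against the ground-truth label.
    ce = -math.log(softmax(student_logits)[true_label])
    return alpha * (temperature ** 2) * kl + (1 - alpha) * ce
```

When the student's logits match the teacher's exactly, the KL term vanishes and only the hard-label cross-entropy remains; relational approaches extend this idea by matching structures between examples rather than per-example logits.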

Additionally, the survey investigates the effectiveness of knowledge distillation from the perspectives of multi-view data characterization and variance reduction, providing deep insights into the methodology's success. Finally, the survey outlines potential research directions, aiming to guide future advancements in the field of knowledge distillation.


PQE Committee

Chairperson: Dr. Nan TANG

Prime Supervisor: Prof. Wei WANG

Co-Supervisor: Dr. Minhao CHENG

Examiner: Dr. Wenjia WANG

Date

23 January 2024

Time

10:00 - 11:30

Venue

E3-2F-201, HKUST(GZ)

Join Link

Zoom Meeting ID: 875 8964 8454

Passcode: dsa2024

Contact Email

dsarpg@hkust-gz.edu.cn

Audience

All are welcome!