Thesis Proposal Examination

TOWARDS EFFICIENT NEURAL NETWORK TRAINING: DATA AUGMENTATION AND DISTILLATION

The Hong Kong University of Science and Technology (Guangzhou)

Data Science and Analytics Thrust

PhD Thesis Proposal Examination

By Mr. Jiahang JIANG

Abstract

Deep learning has achieved remarkable success over the past decade and has become the method of choice across many research fields. Beyond optimizing model architectures, researchers are increasingly turning to direct manipulation of the training data itself, and many dataset-centric methods have been proposed to enhance prediction accuracy, training efficiency, and generalization. In this thesis, we specifically investigate data augmentation and distillation techniques, with the goal of improving both their theoretical understanding and their empirical performance.


TPE Committee

Chairperson: Prof. Fugee TSUNG

Prime Supervisor: Prof. Wenjia WANG

Co-Supervisor: Prof. Jia LI

Examiner: Prof. Xinlei HE

Date

13 June 2024

Time

14:50 - 16:05

Venue

E1-149