[Image Super-Resolution] Paper Reading: MTKD: Multi-Teacher Knowledge Distillation for Image Super-Resolution

Please start with the column introduction article: [Super-Resolution] Notes on the Super-Resolution Reconstruction column, which covers the column overview, highlights, target audience, related notes, recommended reading order, an overview of super-resolution, the implementation workflow, research directions, and a summary of papers, code, and datasets.


Preface

Paper title: MTKD: Multi-Teacher Knowledge Distillation for Image Super-Resolution

Paper link: MTKD: Multi-Teacher Knowledge Distillation for Image Super-Resolution

Source code: https://github.com/YuxuanJJ/MTKD

Venue: ECCV 2024


Table of Contents

  • Preface
  • Abstract
  • 1 Introduction
  • 2 Related Work
  • 3 Proposed method: MTKD
    • 3.1 Stage 1: Knowledge Aggregation
    • 3.2 Stage 2: Model Distillation
  • 4 Experiment Configuration
  • 5 Results and Discussion
    • 5.1 Quantitative Evaluation
    • 5.2 Qualitative Evaluation
    • 5.3 Ablation Study
  • 6 Conclusion

