Defang Chen


I am a Postdoctoral Associate working with SUNY Distinguished Professor Siwei Lyu (IEEE/IAPR Fellow) at the University at Buffalo, State University of New York (SUNY). I received my Ph.D. from Zhejiang University in June 2024. My Ph.D. thesis, Knowledge Distillation on Deep Neural Networks, won the Outstanding Doctoral Dissertation award. My Google Scholar citations reached 2025 in 2025.

I work on diffusion-based generative models (theoretical understanding and accelerated sampling) and knowledge distillation. I have reviewed over 100 papers for top-tier conferences and journals, and have also served in senior reviewer roles. I lived in Hangzhou (Paradise on Earth) and Wenzhou (Cradle of Mathematicians) for more than 25 years.

Ten selected papers (full list)

† denotes the corresponding author / project lead

  1. 25-JSTAT
    Diffusion
    Geometric Regularity in Deterministic Sampling of Diffusion-based Generative Models
    J. Stat. Mech., 2025
    A top-tier journal in statistical mechanics and mathematical physics; it has published more than 30 papers by Nobel Laureate Giorgio Parisi (Physics, 2021).
  2. 26-AAAI
    Diffusion
    DICE: Distilling Classifier-Free Guidance into Text Embeddings
    Zhenyu Zhou, Defang Chen, Can Wang, Chun Chen, and Siwei Lyu
    In Proceedings of the AAAI Conference on Artificial Intelligence, 2026
    Oral, less than 5%
  3. 24-NeurIPS
    Diffusion
    Simple and Fast Distillation of Diffusion Models
    Zhenyu Zhou, Defang Chen, Can Wang, Chun Chen, and Siwei Lyu
    In Advances in Neural Information Processing Systems, 2024
  4. 25-TMLR
    Survey
    Conditional Image Synthesis with Diffusion Models: A Survey
    Zheyuan Zhan, Defang Chen, Jian-Ping Mei, Zhenghe Zhao, Jiawei Chen, and 3 more authors
    Transactions on Machine Learning Research, 2025
  5. 25-ICCV
    Distillation
    Knowledge Distillation with Refined Logits
    Wujie Sun, Defang Chen, Siwei Lyu, Genlang Chen, Chun Chen, and 1 more author
    In Proceedings of the IEEE/CVF International Conference on Computer Vision, 2025
  6. 24-ICML
    Diffusion
    On the Trajectory Regularity of ODE-based Diffusion Sampling
    In International Conference on Machine Learning, 2024
  7. 24-CVPR
    Diffusion
    Fast ODE-based Sampling for Diffusion Models in Around 5 Steps
    Zhenyu Zhou, Defang Chen, Can Wang, and Chun Chen
    In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2024
  8. 22-CVPR
    Distillation
    Knowledge Distillation with the Reused Teacher Classifier
    Defang Chen, Jian-Ping Mei, Hailin Zhang, Can Wang, Yan Feng, and 1 more author
    In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2022
  9. 21-AAAI
    Distillation
    Cross-Layer Distillation with Semantic Calibration
    Defang Chen, Jian-Ping Mei, Yuan Zhang, Can Wang, Zhe Wang, and 2 more authors
    In Proceedings of the AAAI Conference on Artificial Intelligence, 2021
    Journal version published in IEEE Trans. Knowl. Data Eng. (TKDE)
    Highly-Cited Paper Indexed by 2024/2025 Google Scholar Metrics
  10. 20-AAAI
    Distillation
    Online Knowledge Distillation with Diverse Peers
    Defang Chen, Jian-Ping Mei, Can Wang, Yan Feng, and Chun Chen
    In Proceedings of the AAAI Conference on Artificial Intelligence, 2020
    Highly-Cited Paper Indexed by 2024/2025 Google Scholar Metrics

Misc.

Recommended online courses related to my research interests: (1) Differential Equations and Dynamical Systems by Steve Brunton; (2) Principles of Deep Representation Learning by Yi Ma.