About
I am a PhD student in the Optimization & Machine Learning (OptiML) Laboratory at the Kim Jaechul Graduate School of AI, Korea Advanced Institute of Science and Technology (KAIST AI), where I am fortunate to be advised by Prof. Chulhee Yun.
Before this, I received my master’s degree in Artificial Intelligence and my bachelor’s degree in Mathematical Sciences and Electrical Engineering (double major), both at Korea Advanced Institute of Science and Technology (KAIST).
Research Interests
- Deep Learning Theory
Education
- PhD student in Artificial Intelligence, KAIST, Mar. 2024 – Present
- MSc in Artificial Intelligence, KAIST, Mar. 2022 – Feb. 2024
- BSc in Mathematical Sciences and Electrical Engineering (double major), KAIST, Mar. 2018 – Feb. 2022
Academic Services
Conference/Workshop Reviewer
- NeurIPS 2024, 2025
- ICML 2025
- ICLR 2025, 2026
- AISTATS 2025, 2026
- ICML 2025 Workshop on High-dimensional Learning Dynamics
Contact
- {first name}.{last name} at kaist dot ac dot kr
News
- [10/2025] Our new preprint on Mamba’s in-context learning is out! This work was done during my visit to the University of Tokyo, where I was fortunate to work with Wei Huang and Taiji Suzuki!
- [09/2025] Our paper on weak-to-strong generalization got accepted to NeurIPS 2025!
- [08/2025] Our work on weak-to-strong generalization received the KT Best Paper Award at the 2025 Korean AI Association Conference (CKAIA 2025)!
- [07/2025] I will soon start a research internship at the University of Tokyo, hosted by Prof. Taiji Suzuki, and stay there until September. If you are around, feel free to reach out!
- [06/2025] Our work on weak-to-strong generalization got accepted to the ICML 2025 Workshop on High-dimensional Learning Dynamics!
- [09/2024] Two papers got accepted to NeurIPS 2024, and our paper on Cutout/CutMix was selected as a spotlight presentation!
- [08/2024] Our paper on Cutout and CutMix received the KT Best Paper Award at the 2024 Korean AI Association Conference (CKAIA 2024)!
- [06/2024] One paper got accepted to the ICML 2024 Workshop on Advancing Neural Network Training (WANT): Computational Efficiency, Scalability, and Resource Optimization. The paper proposes a theory-inspired method called Direction-Aware SHrinking (DASH), which aims to mitigate plasticity loss.
- [06/2024] Our theoretical work on Cutout and CutMix got accepted to the ICML 2024 Workshop on High-dimensional Learning Dynamics: The Emergence of Structure and Reasoning.
- [12/2023] I defended my Master’s thesis and will start my Ph.D.!
- [06/2023] Our paper on a theoretical analysis of Mixup is out! I’m very happy that this work got accepted to ICML 2023.
- [02/2022] I received my bachelor’s degree and will start the master’s course at KAIST AI.
