About Me

I am a second-year Ph.D. student in the Optimization & Machine Learning (OptiML) Laboratory at the Kim Jaechul Graduate School of AI, Korea Advanced Institute of Science and Technology (KAIST AI), where I am fortunate to be advised by Prof. Chulhee Yun. Previously, I completed my M.Sc. in AI at KAIST and my B.E. in Cyber Security (with a minor in Mathematics) at Korea University.

Research Interests

I study the theoretical foundations of modern deep learning and design algorithms grounded in rigorous analysis. My work [C1, C4] establishes lower bounds on convergence rates for without-replacement SGD. Another line of work [C2, C3] develops and theoretically justifies an algorithm that strengthens length generalization in Transformers. My current projects investigate speculative decoding, where I design sampling strategies that improve inference efficiency given a tree-structured draft. I am also studying formal constructions showing how Transformers can perform in-context inference in hidden Markov models.
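For readers unfamiliar with the sampling distinction behind without-replacement SGD, here is a minimal toy sketch (not code from my papers; the function names and the least-squares setup are placeholders for illustration) contrasting one epoch of random reshuffling, where every example is visited exactly once per epoch, with classical with-replacement sampling.

```python
import numpy as np

def sgd_epoch_without_replacement(w, X, y, lr, rng):
    """One epoch of without-replacement SGD (random reshuffling):
    each example is visited exactly once, in a freshly shuffled order."""
    perm = rng.permutation(len(X))          # indices drawn without replacement
    for i in perm:
        grad = (X[i] @ w - y[i]) * X[i]     # per-example least-squares gradient
        w = w - lr * grad
    return w

def sgd_epoch_with_replacement(w, X, y, lr, rng):
    """For contrast: classical SGD draws an independent uniform index at
    every step, so some examples repeat and others are skipped."""
    for _ in range(len(X)):
        i = rng.integers(len(X))            # index drawn with replacement
        grad = (X[i] @ w - y[i]) * X[i]
        w = w - lr * grad
    return w

# Toy usage on a random least-squares problem.
rng = np.random.default_rng(0)
X, w_true = rng.normal(size=(256, 8)), rng.normal(size=8)
y = X @ w_true
w = np.zeros(8)
for _ in range(50):
    w = sgd_epoch_without_replacement(w, X, y, lr=0.01, rng=rng)
print(np.linalg.norm(w - w_true))           # distance to the true solution
```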

Academic Services

Conference Reviewer

  • NeurIPS 2023, 2024, 2025
  • ICLR 2025, 2026
  • ICML 2025

Journal Reviewer

  • TMLR 2025
  • JOTA 2025

Education

  • Ph.D. in Artificial Intelligence, KAIST (Mar. 2024 – Present)
  • M.Sc. in Artificial Intelligence, KAIST (Mar. 2022 – Feb. 2024)
  • B.E. in Cyber Security, Korea University (Mar. 2018 – Feb. 2022)
    • Minor in Mathematics

Contact

  • CV: PDF
  • Email: chajaeyoung at kaist dot ac dot kr