MoonJeong Park


I am a Ph.D. student in the Machine Learning Lab at the Graduate School of Artificial Intelligence, POSTECH, advised by Prof. Dongwoo Kim.

My research interests lie in various aspects of machine learning, with a particular focus on the theoretical analysis of neural network architectures and their implicit biases. My recent work examines how GNN architectures shape generalization error and optimization landscapes. I am also broadening my scope to Transformer architectures, aiming to generalize these findings to a wider range of neural systems.

I am actively seeking opportunities for meaningful collaborations and internships. If you find my work interesting or have any questions, please don’t hesitate to reach out.

News

Mar 25, 2026 🇦🇺 Excited to be visiting the Computational Media Lab at ANU as a visiting researcher! (March – June 2026)
Mar 11, 2026 📚 Check out our latest preprint on arXiv, where we propose practically applicable transductive generalization bounds for GNNs.
Nov 8, 2025 🏆 A paper on label smoothing in GNNs has been accepted to AAAI 2026 as an oral presentation!
Sep 18, 2025 🎉 A paper proposing influence functions for GNNs has been accepted to NeurIPS 2025!
Aug 21, 2025 🎉 A paper on personalizing LLMs with a graph-based collaborative filtering framework has been accepted to EMNLP 2025!

Education

Sep, 2022 - Present Pohang University of Science and Technology (POSTECH), Pohang, South Korea
Ph.D. student in Computer Science and Engineering
Advisor: Dongwoo Kim
Sep, 2019 - Sep, 2022 Pohang University of Science and Technology (POSTECH), Pohang, South Korea
Integrated M.S. student in Computer Science and Engineering
Advisor: Dongwoo Kim
Mar, 2014 - Sep, 2019 Daegu Gyeongbuk Institute of Science and Technology (DGIST), Daegu, South Korea
B.S., School of Undergraduate Studies

Experience

March, 2026 - Present Computational Media Lab at Australian National University, Canberra, Australia
Visiting Researcher
  • Participating in research projects on graph neural networks.
August, 2020 - June, 2022 POSCO AI education program, Pohang, South Korea
Teaching Assistant
  • Served as a teaching assistant for machine learning and deep learning courses.
June, 2018 - January, 2019 Data Mining Lab at Seoul National University, Seoul, South Korea
Research Intern
  • Mentor: Prof. U Kang and Prof. Sael Lee
  • Participated in research on sparse Tucker factorization for large-scale tensors.
June, 2016 - July, 2016 Multi-Scale Robotics Lab at Eidgenössische Technische Hochschule Zürich, Zürich, Switzerland
Research Intern
  • Mentor: Dr. Carmela De Marco
  • Participated in research on the fabrication of microrobots for single-cell acquisition.

Publications

* indicates equal contribution.

  1. Transductive Generalization via Optimal Transport and Its Application to Graph Node Classification
    MoonJeong Park*, Seungbeom Lee*, Kyungmin Kim, Jaeseung Heo, Seunghyuk Cho, Shouheng Li, Sangdon Park, and Dongwoo Kim
    arXiv preprint, 2026
  2. In-Place Feedback: A New Paradigm for Guiding LLMs in Multi-Turn Reasoning
    Youngbin Choi*, Minjong Lee*, Saemi Moon, Seunghyuk Cho, Chaehyeon Chung, MoonJeong Park, and Dongwoo Kim
    arXiv preprint, 2026
  3. Training-free Composition of Pre-trained GFlowNets for Multi-Objective Generation
    Seokwon Yoon, Youngbin Choi, Seunghyuk Cho, Seungbeom Lee, MoonJeong Park, and Dongwoo Kim
    arXiv preprint, 2026
  4. The Oversmoothing Fallacy: A Misguided Narrative in GNN Research
    MoonJeong Park, Sunghyun Choi, Jaeseung Heo, Eunhyeok Park, and Dongwoo Kim
    arXiv preprint, 2025
  5. Taming Gradient Oversmoothing and Expansion in Graph Neural Networks
    MoonJeong Park and Dongwoo Kim
    arXiv preprint, 2024
  6. Posterior Label Smoothing for Node Classification
    Jaeseung Heo, MoonJeong Park, and Dongwoo Kim
    AAAI Conference on Artificial Intelligence (AAAI), 2026
    Oral
  7. Influence Functions for Edge Edits in Non-Convex Graph Neural Networks
    Jaeseung Heo, Kyeongheung Yun, Seokwon Yoon, MoonJeong Park, Jungseul Ok, and Dongwoo Kim
    Conference on Neural Information Processing Systems (NeurIPS), 2025
  8. CoPL: Collaborative Preference Learning for Personalizing LLMs
    Youngbin Choi, Seunghyuk Cho, Minjong Lee, MoonJeong Park, Yesong Ko, Jungseul Ok, and Dongwoo Kim
    Empirical Methods in Natural Language Processing (EMNLP), 2025
  9. Mitigating Oversmoothing Through Reverse Process of GNNs for Heterophilic Graphs
    MoonJeong Park, Jaeseung Heo, and Dongwoo Kim
    International Conference on Machine Learning (ICML), 2024
    Excellence Award, BK21 Best Paper Award
  10. SpReME: Sparse Regression for Multi-Environment Dynamic Systems
    MoonJeong Park*, Youngbin Choi*, and Dongwoo Kim
    AAAI Conference on Artificial Intelligence, Workshop on When Machine Learning meets Dynamical Systems: Theory and Applications (AAAIw), 2023
  11. MetaSSD: Meta-Learned Self-Supervised Detection
    MoonJeong Park, Jungseul Ok, Yo-Seb Jeon, and Dongwoo Kim
    IEEE International Symposium on Information Theory (ISIT), 2022
  12. Large-Scale Tucker Tensor Factorization for Sparse and Accurate Decomposition
    Jun-Gi Jang*, MoonJeong Park*, Jongwuk Lee, and Lee Sael
    The Journal of Supercomputing, 2022
    Extended version of the conference paper "VeST: Very Sparse Tucker Factorization of Large-Scale Tensors"
  13. VeST: Very Sparse Tucker Factorization of Large-Scale Tensors
    MoonJeong Park*, Jun-Gi Jang*, and Lee Sael
    IEEE International Conference on Big Data and Smart Computing (BigComp), 2021
    Best Paper Award, 1st Place

Honors and Awards

BK21 Best Paper Award, POSTECH AIGS (2024)
  • Excellence Award - Mitigating Oversmoothing Through Reverse Process of GNNs for Heterophilic Graphs (ICML 2024)
Best Paper Award, IEEE BigComp (2021)
  • Best Paper Award, 1st Place - VeST: Very Sparse Tucker Factorization of Large-Scale Tensors (BigComp 2021)