About

I am a Ph.D. student at the KAIST School of Computing, working under the supervision of Prof. Alice Oh. My research focuses on learning representations of structured and unstructured knowledge using Graph Neural Networks (GNNs) and Large Language Models (LLMs).

Previously, I studied graph representation learning methods that leverage pairwise and higher-order interactions in graph-structured data (edges [C2], partial subgraphs [C3], subgraphs [C4], and k-hop subgraphs [P1]).

Currently, my research explores the intersection of graph and language models, with a focus on uncovering latent structures in unstructured language data. My ongoing projects include integrating GNNs and LLMs to develop multi-cultural network and language models [W4], identifying gene interactions that regulate Multiple Sclerosis from single-cell RNA-seq data, and analyzing U.S. lobbying networks.

Recent Publications

Chani Jung, Dongkwan Kim, Jiho Jin, Jiseon Kim, Yeon Seonwoo, Yejin Choi, Alice Oh, and Hyunwoo Kim. "Perceptions to Beliefs: Exploring Precursory Inferences for Theory of Mind in Large Language Models", Empirical Methods in Natural Language Processing (EMNLP), 2024

Dongkwan Kim and Alice Oh. "Generalizing Weisfeiler-Lehman Kernels to Subgraphs", arXiv preprint, 2024

Education

  • M.S., School of Computing, KAIST, Sep 2019
  • B.S. in Computer Science with a minor in Chemistry, KAIST, Feb 2018

Talks & Presentations

Dongkwan Kim. "Salad-Bowl-LLM: Multi-Culture LLMs by Mixed In-Context Demonstrations", International NLP Workshop at KAIST 2024, 2024

Dongkwan Kim. "Leveraging Structure for Graph Neural Networks", IBS Data Science Group Seminar, 2022

Dongkwan Kim. "How to Find Your Friendly Neighborhood: Graph Attention Design with Self-Supervision", KAIST AI Workshop 21/22, 2022

Dongkwan Kim. "How to Find Your Friendly Neighborhood: Graph Attention Design with Self-Supervision", Learning on Graphs and Geometry Reading Group (LoGaG), 2021

Academic Services

Teaching Experience

  • TA and Head TA, Data Structure (Spring 2018, Fall 2018)
  • Head TA and TA, Machine Learning for Natural Language Processing (Fall 2019, Spring 2021); Best TA Award in Fall 2019
  • Head TA, Deep Learning for Real-world Problems (Spring 2020, Fall 2020); Best TA Award in Spring 2020
  • TA, AI Tech Boostcamp, NAVER Connect Foundation (Fall 2024)

Open Source Contributions