Yujin Sung

Hi!😀 I’m an M.S. student at Korea University, Vision & AI Lab (Advisors: Prof. Jungbeom Lee and Prof. Jinkyu Kim). I earned my Bachelor’s degree in Computer Science from Korea University.

My research focuses on developing generalist robot policies that enable robots to perform diverse tasks in complex, unstructured environments. I am particularly interested in enabling robots to continually learn from real-world interaction and efficiently acquire skills from human demonstrations. Ultimately, I aim to build safe and robust robotic systems that can operate reliably in everyday environments.

Currently, I am researching a fully autonomous robot learning framework that can learn from real-world interaction without human intervention.

Email  /  CV  /  GitHub


Research

AxisGuide: Grounding Robot Action Coordinate System in RGB Observations for Robust Visuomotor Manipulation
Jiyun Jang, Yujin Sung, Woosung Joung, Daewon Chae, Sangwon Lee, Sohwi Kim, Jinkyu Kim, Jungbeom Lee
RSS, 2026

A framework that injects action coordinate cues into RGB observations to explicitly ground the robot’s action space, improving zero-shot execution and robustness in visuomotor manipulation across diverse environments.

uCLIP: Parameter-Efficient Multilingual Extension of Vision-Language Models with Unpaired Data
Dahyun Chung*, Donghyun Shin*, Yujin Sung*, Seunggi Moon*, Jinwoo Jeon, Byung-Jun Lee
AAAI, 2026
project / paper / code

A lightweight framework that enables multilingual vision–language alignment for underrepresented languages by using English as a semantic pivot, requiring no paired supervision.

Website template from Jon Barron.