Yiyang Ling

I am a first-year PhD student in Computer Science at the University of Southern California, advised by Prof. Daniel Seita. I received my Bachelor's degree from the ACM Honors Class at Shanghai Jiao Tong University.

My research interests lie in robot learning. Previously, I was a student intern at the University of California San Diego, under the supervision of Prof. Xiaolong Wang. Before that, I was a research assistant in the SJTU Machine Vision and Intelligence Group, advised by Prof. Cewu Lu.

Email  /  GitHub  /  Google Scholar  /  Twitter  

profile photo

Research

project image

GenSim: Generating Robotic Simulation Tasks via Large Language Models


Lirui Wang, Yiyang Ling*, Zhecheng Yuan*, Mohit Shridhar, Chen Bao, Yuzhe Qin, Bailin Wang, Huazhe Xu, Xiaolong Wang
Workshop on Language Grounding and Robot Learning, CoRL 2023 (Best Paper);
International Conference on Learning Representations (ICLR), 2024 (Spotlight)

arxiv / code / website

In this work, we propose to automatically generate rich simulation environments and expert demonstrations by exploiting the grounding and coding abilities of LLMs. Our approach has two modes: goal-directed generation, in which the LLM proposes a task curriculum to solve a target task, and exploratory generation, in which the LLM bootstraps from previous tasks and iteratively proposes novel, more complex tasks.

project image

TC-CNE: Scalable Tensorized Contrastive Cross-Network Embedding


Hao Xiong, Yiyang Ling, Junchi Yan

We propose a scalable tensorized cross-network embedding method based on contrastive learning, which introduces CP decomposition together with intra- and inter-network sub-embeddings. This design reduces storage and accelerates embedding learning simultaneously within a unified training pipeline.




Design and source code from Leonid Keselman's website