[Seminar] Text Diffusion Models: DiffuSeq, RDMs, and DoTs

Title: Text Diffusion Models: DiffuSeq, RDMs, and DoTs


Speaker: Prof. Lingpeng Kong @ HKU


Time: 15:00–16:00, June 3, 2024


Location: Online


Language: English (speech and slides)


Abstract

This talk explores three recent works on text diffusion models from our group: DiffuSeq, Reparameterized Discrete Diffusion Models (RDMs), and Diffusion of Thoughts (DoTs).


DiffuSeq extends the unconditional diffusion framework to conditional generation tasks, introducing partial noising and conditional denoising for high-quality, diverse text generation. RDMs reveal a latent routing mechanism in discrete diffusion, enabling more effective training and decoding strategies that achieve a better runtime-performance tradeoff than existing language models. DoTs implement chain-of-thought reasoning in diffusion language models, allowing both single-pass and multi-pass refinement of thoughts. Built on Plaid 1B, DoTs show strong performance on reasoning tasks while providing speed-ups and benefiting from self-consistency.
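
To make the partial-noising idea above concrete, here is a minimal sketch under standard Gaussian-diffusion conventions; it is an illustration only, not the authors' released DiffuSeq implementation, and the names (`partial_noising`, `alpha_bar_t`, `target_mask`) are assumptions for the example. The forward process corrupts only the target positions of a joint [source; target] embedding sequence, so the denoiser always conditions on the clean source.

```python
import torch

def partial_noising(z0: torch.Tensor, target_mask: torch.Tensor,
                    alpha_bar_t: float) -> torch.Tensor:
    """One forward step of DiffuSeq-style partial noising (sketch).

    z0          : (batch, seq_len, dim) clean embeddings of [source; target]
    target_mask : (batch, seq_len) 1.0 at target positions, 0.0 at source positions
    alpha_bar_t : cumulative noise-schedule value at step t, in (0, 1)
    """
    noise = torch.randn_like(z0)
    # Standard Gaussian forward process applied to every position...
    zt = (alpha_bar_t ** 0.5) * z0 + ((1.0 - alpha_bar_t) ** 0.5) * noise
    mask = target_mask.unsqueeze(-1)  # broadcast over the embedding dim
    # ...but kept only at target positions: source positions stay clean,
    # which is what "conditional denoising" conditions on.
    return mask * zt + (1.0 - mask) * z0
```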


The talk covers the basics of diffusion processes, contrasting continuous and discrete diffusion. Results highlight how these methods are closing the gap with autoregressive models in generation quality while offering advantages in parallel generation and runtime. Overall, the talk showcases the rapid progress and potential of diffusion models for advancing state-of-the-art natural language generation.
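
For readers unfamiliar with that contrast, the difference is easiest to see in the forward (corruption) step. The sketch below assumes the common formulations, Gaussian noise on token embeddings for continuous diffusion and absorbing-state masking of token ids for discrete diffusion; it is generic background, not code from any of the papers above.

```python
import torch

def continuous_forward(z0: torch.Tensor, alpha_bar_t: float) -> torch.Tensor:
    """Continuous diffusion: Gaussian corruption of token *embeddings*."""
    noise = torch.randn_like(z0)
    return (alpha_bar_t ** 0.5) * z0 + ((1.0 - alpha_bar_t) ** 0.5) * noise

def discrete_forward(tokens: torch.Tensor, mask_id: int, p_t: float) -> torch.Tensor:
    """Discrete (absorbing-state) diffusion: corrupt token *ids* by replacing
    each token with [MASK] independently with probability p_t (grows with t)."""
    drop = torch.rand(tokens.shape, device=tokens.device) < p_t
    return torch.where(drop, torch.full_like(tokens, mask_id), tokens)
```

The discrete case also shows where the parallel-generation advantage comes from: a reverse step can re-predict all masked positions at once, rather than emitting one token at a time as autoregressive decoders do.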



Bio:

Lingpeng Kong is an assistant professor in the Department of Computer Science at the University of Hong Kong (HKU) and a co-director of the HKU NLP Lab. His work lies at the intersection of natural language processing (NLP) and machine learning (ML), with a focus on representation learning, structured prediction, and generative models. Before joining HKU, Kong was a research scientist at Google DeepMind. He obtained his Ph.D. from Carnegie Mellon University.