Syntax-Guided Diffusion Large Language Model for Personalized Text Generation
Annie Qu
Speaker
University of California, Irvine
Monday, Aug 4: 9:00 AM - 9:25 AM
Invited Paper Session
Music City Center
Large language models (LLMs) have demonstrated success in generating human-like text. However, sentences generated by LLMs (e.g., ChatGPT) tend to be generic and lack personalized characteristics. Recent developments in diffusion models have shown potential for diversified generation and iterative refinement; however, limitations remain, especially when the generated text is complex. In this work, we propose a syntax-guided diffusion model to achieve text generation that is both well-written and personalized. A hierarchical pipeline is designed to first generate a syntactic structure and then generate the corresponding text, and an encoder is introduced to extract personalized characteristics. By incorporating syntactic information into the generation process, we capture both general and personalized patterns of sentence construction. A novel loss function guides the cross-attention maps to align with the desired syntactic and personalized features. We further extend our framework to support more sophisticated and diversified paragraph generation in a hierarchical manner. We validate the effectiveness of the proposed method through comprehensive experiments and analysis, showing its capability to generate high-quality text with personalized characteristics.
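The abstract does not include implementation details. As a rough illustration of how a loss term could steer cross-attention maps toward desired syntactic or personalized features, the PyTorch sketch below penalizes the KL divergence between head-averaged attention maps and binary target masks. The function name, the mask construction, and the KL-based formulation are all assumptions made for illustration, not the authors' actual loss.

import torch
import torch.nn.functional as F

def attention_alignment_loss(attn_maps, target_masks):
    # Hypothetical alignment loss, not the paper's implementation.
    # attn_maps:    (batch, heads, query_len, key_len) cross-attention weights
    # target_masks: (batch, query_len, key_len) binary masks marking where
    #               attention should concentrate, e.g. tokens that realize a
    #               given syntactic slot or a user-specific trait
    avg_attn = attn_maps.mean(dim=1)  # average over heads -> (B, Q, K)
    avg_attn = avg_attn / avg_attn.sum(dim=-1, keepdim=True).clamp_min(1e-8)

    # Normalize the masks into probability distributions over key positions.
    target = target_masks.float()
    target = target / target.sum(dim=-1, keepdim=True).clamp_min(1e-8)

    # KL divergence pulls each query's attention distribution toward its mask.
    return F.kl_div(avg_attn.clamp_min(1e-8).log(), target, reduction="batchmean")

# Toy usage with random tensors:
attn = torch.softmax(torch.randn(2, 8, 16, 32), dim=-1)
mask = (torch.rand(2, 16, 32) > 0.7)
loss = attention_alignment_loss(attn, mask)

In a training loop, such a term would presumably be added, with a weighting coefficient, to the standard diffusion denoising objective.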
Large language model
Personalization
Diffusion model