
Xiang Lisa Li
My research encompasses many stages of language model development, including architecture (Diffusion-LM), adaptation (Prefix-Tuning), self-supervision (GV-consistency), decoding …
Xiang Lisa Li - Google Scholar
Xiang Lisa Li. Stanford University. Verified email at stanford.edu - Homepage. Computational Linguistics.
Prefix-Tuning: Optimizing Continuous Prompts for Generation
Jan 1, 2021 · In this paper, we propose prefix-tuning, a lightweight alternative to fine-tuning for natural language generation tasks, which keeps language model parameters frozen, but …
The U.S. Version of the "Top Undergraduate Scholarship" Is Here! Some Winners Published in Nature, Others Are Top …
Dec 29, 2020 · Xiang Li is currently pursuing a PhD in the Stanford NLP Group, studying the intersection of machine learning and natural language processing. The Stanford NLP Group is packed with leading researchers; just look at who is in this group …
[2205.14217] Diffusion-LM Improves Controllable Text Generation
May 27, 2022 · View a PDF of the paper titled Diffusion-LM Improves Controllable Text Generation, by Xiang Lisa Li and 4 other authors View PDF Abstract: Controlling the behavior …
[2210.15097] Contrastive Decoding: Open-ended Text Generation …
Oct 27, 2022 · View a PDF of the paper titled Contrastive Decoding: Open-ended Text Generation as Optimization, by Xiang Lisa Li and 7 other authors
EMNLP 2019 Best Paper Announced: Chinese Author from Johns Hopkins University and NLP …
Nov 8, 2019 · The first author of the best paper is Xiang Lisa Li from the Department of Computer Science at Johns Hopkins University; her advisor is Jason Eisner, widely regarded as a leading figure in NLP, who won the ACL 2017 Best Long Paper Award.
Prefix-Tuning: Optimizing Continuous Prompts for Generation
4 days ago · We show that by learning only 0.1% of the parameters, prefix-tuning obtains comparable performance in the full data setting, outperforms fine-tuning in low-data settings, …
Xiang Lisa Li - ACL Anthology
Mar 23, 2025 · Xiang Lisa Li | Percy Liang Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on …
40+ "Lisa Xiang" profiles | LinkedIn
View the profiles of professionals named "Lisa Xiang" on LinkedIn. There are 40+ professionals named "Lisa Xiang", who use LinkedIn to exchange information, ideas, and opportunities.