
(Bill) Yuchen Lin, PhD Candidate



👨‍🎓 I am on the job market for postdoc and industry research positions starting Winter 2022!

Yuchen Lin is a final-year Ph.D. candidate in Computer Science at the University of Southern California, where he works with Prof. Xiang Ren at the Intelligence and Knowledge Discovery Research Lab (USC-INK). He is interested in building intelligent systems that demonstrate a deep understanding of the world, with commonsense knowledge and reasoning ability, by teaching machines to think, talk, and act as humans do. To this end, his research integrates techniques from information extraction, knowledge graphs, robustness, and cross-task generalization. He is also interested in continual learning and federated learning.

Previously, he received his bachelor’s degree from the IEEE Honored Class at Shanghai Jiao Tong University (2014-2018), advised by Prof. Kenny Zhu, and won the Best Thesis Award. He interned at Facebook AI Research (FAIR) (2021, with Scott Yih), Google AI (2020, with William Cohen; 2019, with Sandeep Tata), and Microsoft Research Asia (2017-2018). He has served as a program committee member (i.e., reviewer) for ARR (*CL conferences), ICML, NeurIPS, ICLR, AAAI, etc. He co-organizes several workshops (e.g., CSKB, CSRR, and FL4NLP) and will give a tutorial on knowledge-augmented NLP at ACL 2022.

  • Email: yuchen [dot] lin [at] usc [dot] edu
  • Resume/CV: Please email me for the latest pdf version.

News

[full list]

04-17, 2022 New preprint on unsupervised cross-task generalization via retrieval augmentation. We will present it as a non-archival paper at the Spa-NLP and LNLS workshops at ACL 2022.
04-07, 2022 Our FedNLP project has been accepted to Findings of NAACL 2022, and joint work with Jun Yan on entity robustness has been accepted to the main conference! :)
02-24, 2022 My internship work at FAIR has been accepted to ACL 2022!
02-24, 2022 Invited as a reviewer for TMLR (Transactions on Machine Learning Research)! It's a new venue for the dissemination of machine learning research, intended to complement JMLR while supporting the unmet needs of a growing ML community. Please consider submitting your work via OpenReview!
02-08, 2022 I finally passed my thesis proposal! Thanks a lot for the support of my committee members: Prof. Xiang Ren (chair), Prof. Cyrus Shahabi, Prof. Yan Liu, Prof. Robin Jia, and Prof. Toby Mintz.
01-26, 2022 Chenguang, Yicong, Meng, Wenhao, Xiang, and I will be giving a tutorial on "Knowledge-Augmented Methods for Natural Language Processing" at ACL 2022.
01-01, 2022 Excited to co-organize two workshops at ACL 2022 with really cool teams: CSRR (commonsense representation & reasoning) and FL4NLP (federated learning for NLP). Please follow both for more info and consider participating!
07-06, 2021 Vered, Antoine, Lorraine, and I are organizing an AKBC workshop on commonsense knowledge and knowledge bases (CSKB@AKBC21). Please consider submitting your (published or unpublished) work! We have some stellar speakers and panelists. Check it out here.
06-09, 2021 Our paper “AutoTriggER: Named Entity Recognition with Auxiliary Trigger Extraction” won the Best Paper Award at the NAACL 2021 TrustNLP workshop!
05-11, 2021 Selected as one of the AI Rising Stars among Chinese students by Baidu Research.
05-06, 2021 We have two papers on commonsense reasoning accepted to ACL 2021 (one long paper and one in Findings): X-CSR and RiddleSense!
04-17, 2021 Releasing FedNLP: a research platform for federated learning in NLP. [Tweet]
04-17, 2021 A new arXiv preprint with Qinyuan: “CrossFit: A Few-shot Learning Challenge for Cross-Task Generalization in NLP.” [Tweet]
04-07, 2021 Our work on commonsense reasoning was covered by an article in Communications of the ACM: “The Best of NLP.”
03-15, 2021 Finally passed my qualifying exam and officially became a PhD candidate. [Slides]
03-10, 2021 My Google internship work on open-ended commonsense reasoning was accepted to NAACL 2021. Check out our website here.
01-25, 2021 Releasing Rebiber, a simple tool to fix outdated arXiv citations! [Twitter]
01-20, 2021 Check out our ICLR 2021 paper on pre-training text-to-text transformers for common sense.