Chao-Wei Huang

Ph.D. Candidate at National Taiwan University

Biography

Hi! My name is Chao-Wei Huang and I’m a Ph.D. candidate at National Taiwan University, advised by Prof. Yun-Nung (Vivian) Chen. My research interests include dialogue modeling and retrieval-augmented large language models. Previously, I interned as an Applied Scientist at Amazon Alexa AI (‘20, ‘21) and as a Research Scientist at Meta AI Research (‘22, ‘23).

Interests
  • Computational Linguistics
  • Information Retrieval
Education
  • Ph.D. in Computer Science and Information Engineering, 2018

    National Taiwan University

  • B.S. in Computer Science and Information Engineering, 2014

    National Taiwan University

Experience

Research Scientist Intern
Meta AI Research (FAIR)
July 2023 – November 2023 · Seattle, WA

Research Scientist Intern
Meta AI
September 2022 – January 2023 · New York City, NY

Applied Scientist Intern
Amazon Alexa AI
July 2021 – November 2021 · Sunnyvale, CA

Applied Scientist Intern
Amazon Alexa AI
March 2020 – June 2020 · Sunnyvale, CA

Recent Publications

(2024). InstUPR: Instruction-based Unsupervised Passage Reranking with Large Language Models. arXiv preprint arXiv:2403.16435.

(2024). Investigating Decoder-only Large Language Models for Speech-to-text Translation. Proceedings of the Annual Conference of the International Speech Communication Association, INTERSPEECH.

(2024). Unsupervised Multilingual Dense Retrieval via Generative Pseudo Labeling. Proceedings of the 18th Conference of the European Chapter of the Association for Computational Linguistics.

(2023). CONVERSER: Few-shot Conversational Dense Retrieval with Synthetic Data Generation. Proceedings of the 24th Annual Meeting of the Special Interest Group on Discourse and Dialogue.

(2023). Visually-Enhanced Phrase Understanding. Findings of the Association for Computational Linguistics: ACL 2023.
