About me
I am a third-year Ph.D. student (2021 - present) in the Department of Computer Science and Engineering at Shanghai Jiao Tong University (SJTU), where I am fortunate to be advised by Prof. Rui Wang. Before that, I received my bachelor's degree in Software Engineering from South China University of Technology (SCUT). I am currently a research intern at Tencent AI Lab, co-advised by Dr. Xing Wang and Dr. Zhaopeng Tu.
🔬 Research
Autonomous Agent powered by Large Language Models
- Multi-agent debate [Preprint]
Human-centered Machine Translation
- Bridging the gap between training signal and real user input [ACL 2022]
- Human-like translation strategy [TACL 2024]
- Improving translation with human feedback
🔥 News
- 2023.11: 🎉🎉 One paper about human-like translation strategy is accepted by TACL 2024.
- 2023.05: We introduce the MAPS framework, enabling LLMs to mimic the human translation strategy. See also the media coverage and demo.
- 2023.05: We propose the multi-agent debate (MAD) framework with large language models (preprint).
- 2023.05: One short paper about tense consistency of machine translation is accepted by ACL 2023.
🖨️ Preprints
* denotes co-first authors

Zhuosheng Zhang, Yao Yao, Aston Zhang, Xiangru Tang, Xinbei Ma, Zhiwei He, Yiming Wang, Mark Gerstein, Rui Wang, Gongshen Liu, Hai Zhao
- A journey from CoT to language agents.

Encouraging Divergent Thinking in Large Language Models through Multi-Agent Debate
Tian Liang*, Zhiwei He*, Wenxiang Jiao*, Xing Wang, Yan Wang, Rui Wang, Yujiu Yang, Zhaopeng Tu, Shuming Shi
- A multi-agent debate framework that encourages divergent thinking in LLMs.
Leveraging Word Guessing Games to Assess the Intelligence of Large Language Models
Tian Liang, Zhiwei He, Jen-tse Huang, Wenxuan Wang, Wenxiang Jiao, Rui Wang, Yujiu Yang, Zhaopeng Tu, Shuming Shi, Xing Wang
📝 Publications
* denotes co-first authors

Exploring Human-Like Translation Strategy with Large Language Models
Zhiwei He*, Tian Liang*, Wenxiang Jiao, Zhuosheng Zhang, Yujiu Yang, Rui Wang, Zhaopeng Tu, Shuming Shi, Xing Wang
- We propose MAPS, the first machine translation system that mimics human translation strategies.
- Outperforms WMT22 winners in 5 out of 11 translation directions.
- Media coverage
Wenxiang Jiao, Jen-tse Huang, Wenxuan Wang, Zhiwei He, Tian Liang, Xing Wang, Shuming Shi, Zhaopeng Tu
TeCS: A Dataset and Benchmark for Tense Consistency of Machine Translation
Yiming Ai, Zhiwei He, Kai Yu, Rui Wang
Zhiwei He, Xing Wang, Zhaopeng Tu, Shuming Shi, Rui Wang
- Machine translation system for Livonian
- 🥇 1st place for English⇒Livonian (unconstrained system)
- 🥈 2nd place for Livonian⇒English (unconstrained system)
Bridging the Data Gap between Training and Inference for Unsupervised Neural Machine Translation
Zhiwei He, Xing Wang, Rui Wang, Shuming Shi, Zhaopeng Tu
🎖 Honors and Awards
- 2022.08: 1st place in the WMT22 General Translation Task, English to Livonian (Unconstrained System).
- 2022.08: 2nd place in the WMT22 General Translation Task, Livonian to English (Unconstrained System).
- 2018, 2019: First Class Scholarship.
💬 Invited Talks
- 2023.11: Improving Machine Translation with Human Strategy and Feedback, CJNLP | [slides]
- 2022.08: Unsupervised Neural Machine Translation, CCKS 2022
💻 Internships
- 2021 - present: Tencent AI Lab, Shenzhen. Mentors: Dr. Xing Wang and Dr. Zhaopeng Tu.