From Complex to Simple: Enhancing Multi-Constraint Complex Instruction Following Ability of Large Language Models

Abstract

It is imperative for large language models (LLMs) to follow instructions with elaborate requirements (i.e., complex instruction following). Yet, how to enhance the ability of LLMs to follow complex instructions with multiple constraints remains under-explored. To bridge this gap, we first study what training data is effective in enhancing the ability to follow complex constraints. We find that training LLMs on instructions containing multiple constraints enhances their understanding of complex instructions, especially those of lower complexity levels. The improvement even generalizes to compositions of out-of-domain constraints. We then propose methods for obtaining and utilizing such effective training data. Finally, we conduct extensive experiments demonstrating the effectiveness of our methods in terms of overall performance, training efficiency, and generalization ability under four settings.
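As a rough illustration of what "instructions containing multiple constraints" might look like as supervised fine-tuning data, the sketch below composes a base task with several atomic constraints and packs the result into a typical (instruction, output) JSON record. The constraint list, function names, and field names are placeholder assumptions for illustration, not the paper's actual data pipeline.

```python
import json
import random

# Hypothetical atomic constraints; the paper's actual constraint taxonomy and
# construction pipeline are not shown here -- this only illustrates composing
# several constraints into one training instruction.
ATOMIC_CONSTRAINTS = [
    "Answer in exactly three sentences.",
    "Include the keyword 'ecosystem' at least twice.",
    "Write the answer entirely in lowercase.",
    "End the response with the phrase 'That is all.'",
]


def compose_instruction(base_task: str, num_constraints: int, rng: random.Random) -> str:
    """Attach a random subset of constraints to a base task description."""
    chosen = rng.sample(ATOMIC_CONSTRAINTS, k=num_constraints)
    return f"{base_task} " + " ".join(chosen)


def build_sft_example(base_task: str, response: str, num_constraints: int,
                      rng: random.Random) -> dict:
    """Pack one (instruction, response) pair in a common SFT JSON format."""
    return {
        "instruction": compose_instruction(base_task, num_constraints, rng),
        "output": response,
    }


if __name__ == "__main__":
    rng = random.Random(0)
    example = build_sft_example(
        base_task="Explain why coral reefs matter.",
        response="(a reference answer that satisfies all attached constraints)",
        num_constraints=3,
        rng=rng,
    )
    print(json.dumps(example, indent=2))
```

Varying `num_constraints` gives training instructions of different complexity levels, which is the kind of multi-constraint data the abstract reports as effective.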

Qianyu He
Ph.D. Candidate

My research interests focus on the evaluation and enhancement of large language models (LLMs), with an emphasis on models' instruction-following and creative generation abilities.