Part of International Conference on Representation Learning 2025 (ICLR 2025) Conference
Jonathan Light, Yue Wu, Yiyou Sun, Wenchao Yu, Yanchi Liu, Xujiang Zhao, Ziniu Hu, Haifeng Chen, Wei Cheng
We frame code generation as a black-box optimization problem within the code space and demonstrate how optimization-inspired techniques can enhance inference scaling over text. Based on this perspective, we propose SCATTERED FOREST SEARCH (SFS), a novel approach that improves solution diversity during evolutionary search, thereby avoiding local optima. Our theoretical analysis illustrates how these methods improve exploration and enhance efficiency. Extensive experiments on HumanEval, MBPP, APPS, CodeContests, and Leetcode reveal significant performance gains. For instance, our method achieves a pass@1 rate of 67.1% on HumanEval+ and 87.2% on HumanEval with GPT-3.5, marking improvements of 8.6% and 4.3% over the state-of-the-art, while also halving the iterations needed to find the correct solution. Furthermore, our approach scales more efficiently than existing search techniques, including tree search, line search, and repeated sampling (Best of N).