University of California, Irvine, USA.
World Journal of Advanced Research and Reviews, 2025, 26(01), 1353-1359
Article DOI: 10.30574/wjarr.2025.26.1.1172
Received on 01 March 2025; revised on 07 April 2025; accepted on 10 April 2025
Fine-tuning pre-trained language models for code generation represents a significant advancement in bridging artificial intelligence and software development. This process adapts foundation models trained on vast code repositories to specific programming languages, frameworks, and domains. The article examines the complete pipeline for effective fine-tuning, beginning with the selection of appropriate base architectures such as Code Llama, StarCoder, and Codex, which are specifically designed for code understanding. A critical exploration of dataset preparation techniques highlights the importance of curated, diverse examples that represent target domains accurately while avoiding biases. The article further delves into parameter-efficient adaptation techniques such as Low-Rank Adaptation (LoRA), adapter modules, and prompt tuning, which dramatically reduce computational requirements while preserving performance. These innovations democratize access to specialized code-generation capabilities, making them available even with limited resources. Applications span intelligent code completion, natural-language-to-code translation, refactoring, cross-language conversion, and test generation, transforming developer workflows across experience levels. By examining the interplay between model architecture, data quality, fine-tuning techniques, and practical applications, the article provides comprehensive insights into how fine-tuned models are reshaping software development practices.
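To make the parameter-efficient adaptation mentioned above concrete, the sketch below shows how Low-Rank Adaptation might be applied to a code model such as Code Llama, assuming the Hugging Face transformers and peft libraries; the checkpoint name, rank, and target modules are illustrative assumptions rather than details taken from the article.

```python
# A minimal LoRA fine-tuning setup, assuming Hugging Face `transformers`
# and `peft` are installed. Checkpoint name and hyperparameters below are
# illustrative choices, not prescriptions from the article.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

base_model = "codellama/CodeLlama-7b-hf"  # any causal code LM would do
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForCausalLM.from_pretrained(base_model)

# LoRA injects small trainable low-rank matrices into selected layers;
# the original pre-trained weights stay frozen.
config = LoraConfig(
    r=8,                                  # rank of the low-rank update
    lora_alpha=16,                        # scaling factor for the update
    target_modules=["q_proj", "v_proj"],  # attention projections to adapt
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, config)

# Only the adapter weights are trainable, typically well under 1% of the
# full parameter count, which is what makes fine-tuning affordable.
model.print_trainable_parameters()
```

From here, the wrapped model can be trained with a standard causal-language-modeling loop on a curated code dataset, and the resulting adapter weights can be saved and shared separately from the frozen base model.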
Code Generation; Fine-Tuning; Parameter Efficiency; Knowledge Distillation; Security-Aware Programming; Adaptive Learning
Siddhant Sonkar. Fine-tuning AI Models for code generation: Advances and applications. World Journal of Advanced Research and Reviews, 2025, 26(01), 1353-1359. Article DOI: https://doi.org/10.30574/wjarr.2025.26.1.1172.
Copyright © 2025 Author(s) retain the copyright of this article. This article is published under the terms of the Creative Commons Attribution License 4.0.