
Transfer learning optimization based on evolutionary selective fine tuning

2508.15367v1

Title#

Transfer learning optimization based on evolutionary selective fine tuning

Abstract#

Deep learning has shown substantial progress in image analysis. However, the computational demands of large, fully trained models remain a consideration. Transfer learning offers a strategy for adapting pre-trained models to new tasks. Traditional fine-tuning often involves updating all model parameters, which can potentially lead to overfitting and higher computational costs. This paper introduces BioTune, an evolutionary adaptive fine-tuning technique that selectively fine-tunes layers to enhance transfer learning efficiency. BioTune employs an evolutionary algorithm to identify a focused set of layers for fine-tuning, aiming to optimize model performance on a given target task. Evaluation across nine image classification datasets from various domains indicates that BioTune achieves competitive or improved accuracy and efficiency compared to existing fine-tuning methods such as AutoRGN and LoRA. By concentrating the fine-tuning process on a subset of relevant layers, BioTune reduces the number of trainable parameters, potentially leading to decreased computational cost and facilitating more efficient transfer learning across diverse data characteristics and distributions.
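
The abstract describes the method only at a high level and gives no implementation details. Below is a minimal, hypothetical Python sketch of the general idea of evolutionary layer selection, assuming a binary mask over the four residual stages of a torchvision ResNet-18 as the chromosome, a short fine-tuning run scored by validation accuracy (with a small trainable-parameter penalty) as fitness, and truncation selection with one-point crossover and bit-flip mutation. The names `build_model`, `fitness`, `evolve`, `train_fn`, and `val_fn` are illustrative assumptions, not BioTune's actual interface.

```python
# Hypothetical sketch of evolutionary selective fine-tuning in the spirit of
# BioTune as described above; the chromosome, fitness proxy, and GA operators
# are illustrative assumptions, not the paper's actual procedure.
import random

import torch.nn as nn
from torchvision import models


def build_model(mask, num_classes=10):
    """Load a pretrained ResNet-18 and unfreeze only the residual stages
    selected by the binary mask; the new classifier head is always trained."""
    model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    for p in model.parameters():
        p.requires_grad = False                        # freeze the backbone
    stages = [model.layer1, model.layer2, model.layer3, model.layer4]
    for stage, keep in zip(stages, mask):
        for p in stage.parameters():
            p.requires_grad = bool(keep)               # unfreeze selected stages
    model.fc = nn.Linear(model.fc.in_features, num_classes)  # fresh head
    return model


def fitness(mask, train_fn, val_fn):
    """Fitness = validation accuracy after a short fine-tuning budget, with a
    small penalty on trainable-parameter count to reward sparser masks."""
    model = build_model(mask)
    train_fn(model)            # user-supplied short training loop (assumption)
    accuracy = val_fn(model)   # user-supplied evaluation loop (assumption)
    n_trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
    return accuracy - 1e-8 * n_trainable


def evolve(train_fn, val_fn, pop_size=8, generations=5, mut_rate=0.25):
    """Simple genetic algorithm over binary layer-selection masks."""
    population = [[random.randint(0, 1) for _ in range(4)]
                  for _ in range(pop_size)]
    best_score, best_mask = float("-inf"), None
    for _ in range(generations):
        scores = [fitness(m, train_fn, val_fn) for m in population]
        ranked = sorted(zip(scores, population), key=lambda t: t[0], reverse=True)
        if ranked[0][0] > best_score:
            best_score, best_mask = ranked[0]
        parents = [m for _, m in ranked[: pop_size // 2]]  # truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(a))              # one-point crossover
            children.append([1 - g if random.random() < mut_rate else g
                             for g in a[:cut] + b[cut:]])  # bit-flip mutation
        population = parents + children
    return best_mask
```

In this sketch, the returned mask would then seed a full fine-tuning run on the target task; the paper's actual chromosome encoding, fitness function, and selection scheme may differ.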

Article Page#

Transfer learning optimization based on evolutionary selective fine tuning

PDF Access#

View Chinese PDF - 2508.15367v1
