zikele


Transfer Learning Optimization Based on Evolutionary Selective Fine-Tuning

2508.15367v1

Chinese Title#

Transfer Learning Optimization Based on Evolutionary Selective Fine-Tuning

English Title#

Transfer learning optimization based on evolutionary selective fine tuning

Chinese Abstract#

Deep learning has made remarkable progress in image analysis. However, the computational demands of large, fully trained models remain a concern. Transfer learning offers a strategy for adapting pre-trained models to new tasks. Traditional fine-tuning typically updates all model parameters, which can lead to overfitting and higher computational costs. This paper introduces BioTune, an evolutionary adaptive fine-tuning technique that selectively fine-tunes layers to improve transfer learning efficiency. BioTune employs an evolutionary algorithm to determine a focused set of layers to fine-tune, aiming to optimize model performance on a given target task. Evaluation on nine image classification datasets from different domains shows that, compared with existing fine-tuning methods such as AutoRGN and LoRA, BioTune achieves competitive or improved accuracy and efficiency. By concentrating fine-tuning on a subset of relevant layers, BioTune reduces the number of trainable parameters, potentially lowering computational cost and enabling more efficient transfer learning across diverse data characteristics and distributions.

English Abstract#

Deep learning has shown substantial progress in image analysis. However, the computational demands of large, fully trained models remain a consideration. Transfer learning offers a strategy for adapting pre-trained models to new tasks. Traditional fine-tuning often involves updating all model parameters, which can potentially lead to overfitting and higher computational costs. This paper introduces BioTune, an evolutionary adaptive fine-tuning technique that selectively fine-tunes layers to enhance transfer learning efficiency. BioTune employs an evolutionary algorithm to identify a focused set of layers for fine-tuning, aiming to optimize model performance on a given target task. Evaluation across nine image classification datasets from various domains indicates that BioTune achieves competitive or improved accuracy and efficiency compared to existing fine-tuning methods such as AutoRGN and LoRA. By concentrating the fine-tuning process on a subset of relevant layers, BioTune reduces the number of trainable parameters, potentially leading to decreased computational cost and facilitating more efficient transfer learning across diverse data characteristics and distributions.
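The abstract describes the core idea: an evolutionary search over which layers of a pre-trained model to fine-tune, balancing task performance against the number of trainable parameters. The sketch below illustrates that idea only in outline; it is not the paper's actual algorithm. The binary-mask encoding, truncation selection, one-point crossover, and the toy fitness function are all illustrative assumptions (a real fitness would train and validate the masked model on the target task).

```python
import random

def evolve_layer_mask(num_layers, fitness_fn, pop_size=20, generations=30,
                      mutation_rate=0.1, seed=0):
    """Evolve a binary mask over layers: 1 = fine-tune the layer, 0 = freeze it."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(num_layers)] for _ in range(pop_size)]
    for _ in range(generations):
        # Rank masks by fitness and keep the top half (truncation selection).
        elite = sorted(pop, key=fitness_fn, reverse=True)[: pop_size // 2]
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, num_layers)           # one-point crossover
            child = a[:cut] + b[cut:]
            child = [g ^ 1 if rng.random() < mutation_rate else g
                     for g in child]                     # per-gene bit-flip mutation
            children.append(child)
        pop = elite + children                           # elitism: best masks survive
    return max(pop, key=fitness_fn)

# Toy stand-in fitness: pretend only the last 3 layers matter for the target
# task, and charge a small cost per trainable layer to mimic a parameter budget.
def toy_fitness(mask):
    relevance = sum(mask[-3:])
    cost = 0.1 * sum(mask)
    return relevance - cost

best = evolve_layer_mask(num_layers=10, fitness_fn=toy_fitness)
print(best)
```

Under this toy fitness the search drives the mask toward fine-tuning only the few "relevant" layers, which mirrors the abstract's claim that concentrating updates on a subset of layers cuts trainable parameters while preserving accuracy.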

Article Page#

Transfer Learning Optimization Based on Evolutionary Selective Fine-Tuning

PDF Access#

View Chinese PDF - 2508.15367v1
