  • PyTorch: freezing parameters

    When finetuning, the parameters of the backbone network usually need to be frozen. This takes two steps.

    First, locate the layers in question and set their requires_grad attribute to False.

    # Option A: freeze an entire submodule (here, the backbone).
    for param in net.backbone.parameters():
        param.requires_grad = False

    # Option B: freeze parameters selected by name; key_word is a placeholder
    # substring such as "backbone" or "layer1".
    for pname, param in net.named_parameters():
        if key_word in pname:
            param.requires_grad = False

    Here we use the parameters() or named_parameters() method; both yield every parameter tensor of the module, i.e. the weights as well as the biases.
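
    For example, a quick loop over named_parameters() makes it easy to verify which tensors are now frozen. Here net is assumed to be any nn.Module with a submodule called backbone; the printed names and shapes are purely illustrative.

    for pname, param in net.named_parameters():
        print(pname, tuple(param.shape), param.requires_grad)
    # Illustrative output:
    #   backbone.conv1.weight (64, 3, 7, 7) False
    #   backbone.conv1.bias   (64,)         False
    #   fc.weight             (10, 512)     True
    #   fc.bias               (10,)         True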

    Second, keep only the parameters that still need to be updated and pass them to the optimizer.

    optimizer = torch.optim.SGD(filter(lambda p: p.requires_grad, net.parameters()),
                                lr=learning_rate, momentum=mom)
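
    Putting the two steps together, here is a minimal end-to-end sketch. The choice of torchvision's resnet18 as the backbone, the 10-class head, and the dummy batch are all assumptions made for illustration.

    import torch
    import torch.nn as nn
    from torchvision import models  # assumption: torchvision is installed

    # Pretrained backbone with a fresh classification head (hypothetical 10-class task).
    model = models.resnet18(weights="DEFAULT")   # older torchvision uses pretrained=True
    model.fc = nn.Linear(model.fc.in_features, 10)

    # Step 1: freeze everything except the new head.
    for pname, param in model.named_parameters():
        if not pname.startswith("fc."):
            param.requires_grad = False

    # Step 2: pass only the trainable parameters to the optimizer.
    optimizer = torch.optim.SGD((p for p in model.parameters() if p.requires_grad),
                                lr=1e-3, momentum=0.9)

    # One illustrative training step on dummy data.
    x = torch.randn(4, 3, 224, 224)
    y = torch.randint(0, 10, (4,))
    loss = nn.functional.cross_entropy(model(x), y)
    loss.backward()          # gradients flow only into the head
    optimizer.step()

    Note that requires_grad = False only stops gradient updates; BatchNorm layers in the frozen part still update their running statistics while the model is in training mode, so put that submodule in eval() mode if those statistics should stay fixed as well.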
  • Original article: https://www.cnblogs.com/hizhaolei/p/12535294.html