Customize Runtime Settings — MMSegmentation 1.2.2 documentation

Customize hooks

Step 1: Implement a new hook

MMEngine implements commonly used hooks for training and testing. When users have customization requirements, they can follow the example below. For instance, if some hyper-parameter of the model needs to be changed during training, we can implement a new hook for it:

# Copyright (c) OpenMMLab. All rights reserved.
from typing import Optional, Sequence

from mmengine.hooks import Hook
from mmengine.model import is_model_wrapper

from mmseg.registry import HOOKS


@HOOKS.register_module()
class NewHook(Hook):
    """Docstring for NewHook.
    """

    def __init__(self, a: int, b: int) -> None:
        self.a = a
        self.b = b

    def before_train_iter(self,
                          runner,
                          batch_idx: int,
                          data_batch: Optional[Sequence[dict]] = None) -> None:
        cur_iter = runner.iter
        # acquire the bare model when it is wrapped (e.g. by DDP)
        if is_model_wrapper(runner.model):
            model = runner.model.module
        else:
            model = runner.model
        model.hyper_parameter = self.a * cur_iter + self.b

Step 2: Import a new hook

The module defined above must be imported into the main namespace so that it gets registered. Assuming NewHook is implemented in mmseg/engine/hooks/new_hook.py, there are two ways to import it: either modify mmseg/engine/hooks/__init__.py to import it, or use custom_imports in the config file.

from .new_hook import NewHook

__all__ = [..., 'NewHook']
custom_imports = dict(imports=['mmseg.engine.hooks.new_hook'], allow_failed_imports=False)
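The custom_imports mechanism boils down to importing the listed modules so their register_module decorators run. The following framework-free sketch (my own illustration, not MMEngine's actual code) mimics that behavior with importlib; the standard-library module 'json' stands in for 'mmseg.engine.hooks.new_hook':

```python
import importlib


def apply_custom_imports(custom_imports):
    """Illustrative sketch of what a ``custom_imports`` dict triggers:
    each listed module is imported, which executes any
    ``@HOOKS.register_module()`` decorators defined inside it."""
    imported = []
    for name in custom_imports['imports']:
        try:
            imported.append(importlib.import_module(name))
        except ImportError:
            # with allow_failed_imports=False, a missing module is an error
            if not custom_imports.get('allow_failed_imports', False):
                raise
    return imported


# 'json' stands in for a real hook module such as mmseg.engine.hooks.new_hook.
mods = apply_custom_imports(dict(imports=['json'], allow_failed_imports=False))
```

With allow_failed_imports=True, a missing module is silently skipped instead of raising.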
Step 3: Modify config file

Users can set and use customized hooks in training and testing as shown below. The execution order of hooks registered at the same site of the Runner is determined by their priority; the default priority of a customized hook is NORMAL.

custom_hooks = [
    dict(type='NewHook', a=a_value, b=b_value, priority='ABOVE_NORMAL')
]
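To make the effect of the priority field concrete, here is a framework-free sketch (my own illustration; the numeric values mirror MMEngine's priority levels, smaller runs earlier) of how hooks at the same site are ordered:

```python
# Numeric levels as defined by MMEngine's Priority enum (lower runs first).
PRIORITY = {
    'HIGHEST': 0, 'VERY_HIGH': 10, 'HIGH': 30, 'ABOVE_NORMAL': 40,
    'NORMAL': 50, 'BELOW_NORMAL': 60, 'LOW': 70, 'VERY_LOW': 90,
    'LOWEST': 100,
}


def call_order(hooks):
    """Return hook type names in execution order (lower priority value first)."""
    return [h['type']
            for h in sorted(hooks, key=lambda h: PRIORITY[h.get('priority', 'NORMAL')])]


hooks = [
    dict(type='LoggerHook', priority='BELOW_NORMAL'),
    dict(type='NewHook', priority='ABOVE_NORMAL'),
    dict(type='CheckpointHook'),  # no priority given, defaults to NORMAL
]
order = call_order(hooks)  # NewHook runs before CheckpointHook and LoggerHook
```

So setting priority='ABOVE_NORMAL' in custom_hooks makes NewHook run before default hooks registered at NORMAL.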
Customize optimizer

Step 1: Implement a new optimizer

We recommend implementing the customized optimizer in mmseg/engine/optimizers/my_optimizer.py. Here is an example of a new optimizer MyOptimizer which has parameters a, b and c:

from mmseg.registry import OPTIMIZERS
from torch.optim import Optimizer


@OPTIMIZERS.register_module()
class MyOptimizer(Optimizer):

    def __init__(self, a, b, c):
        ...

Step 2: Import a new optimizer

The module defined above must be imported into the main namespace so that it gets registered. Assuming MyOptimizer is implemented in mmseg/engine/optimizers/my_optimizer.py, there are two ways to import it: either modify mmseg/engine/optimizers/__init__.py to import it, or use custom_imports in the config file.

from .my_optimizer import MyOptimizer
custom_imports = dict(imports=['mmseg.engine.optimizers.my_optimizer'], allow_failed_imports=False)
Step 3: Modify config file

Then modify the optimizer field in the optim_wrapper of the config file. If users want to use the customized MyOptimizer, it can be set as:

optim_wrapper = dict(type='OptimWrapper',
                     optimizer=dict(type='MyOptimizer',
                                    a=a_value, b=b_value, c=c_value),
                     clip_grad=None)
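Since the MyOptimizer stub above leaves the update rule open, here is a framework-free sketch of what such an optimizer could compute. The roles assigned to a, b and c (learning rate, momentum, weight decay) are purely illustrative assumptions; a real implementation would subclass torch.optim.Optimizer and operate on tensors:

```python
class MyOptimizer:
    """Illustrative sketch only: ``a`` acts as a learning rate, ``b`` as a
    momentum factor and ``c`` as a weight-decay coefficient. Parameters are
    plain floats standing in for tensors."""

    def __init__(self, params, a, b, c):
        self.params = list(params)
        self.a, self.b, self.c = a, b, c
        self.velocity = [0.0] * len(self.params)

    def step(self, grads):
        for i, g in enumerate(grads):
            g = g + self.c * self.params[i]               # weight decay
            self.velocity[i] = self.b * self.velocity[i] + g  # momentum buffer
            self.params[i] -= self.a * self.velocity[i]   # parameter update
        return self.params


opt = MyOptimizer([1.0], a=0.1, b=0.0, c=0.0)
opt.step([2.0])  # 1.0 - 0.1 * 2.0 = 0.8
```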
Customize optimizer constructor

Step 1: Implement a new optimizer constructor

The optimizer constructor is used to create the optimizer and optimizer wrapper for model training, and supports advanced features such as specifying different learning rates and weight decay for different model layers. Here is an example of a customized optimizer constructor.

from mmengine.optim import DefaultOptimWrapperConstructor
from mmseg.registry import OPTIM_WRAPPER_CONSTRUCTORS

@OPTIM_WRAPPER_CONSTRUCTORS.register_module()
class LearningRateDecayOptimizerConstructor(DefaultOptimWrapperConstructor):

    def __init__(self, optim_wrapper_cfg, paramwise_cfg=None):
        ...

    def __call__(self, model):
        ...
        return my_optimizer

The default optimizer constructor is implemented in MMEngine. It can also be used as the base class of new optimizer constructors.
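A constructor like LearningRateDecayOptimizerConstructor typically scales each layer's learning rate down geometrically with its distance from the output. The helper below is my own simplified sketch of that decay rule (the real grouping logic in MMSegmentation is considerably more involved):

```python
def layerwise_lrs(num_layers, base_lr, decay_rate):
    """Sketch of a layer-wise learning-rate decay rule: the last layer keeps
    roughly ``base_lr`` and earlier layers are scaled by ``decay_rate`` once
    per layer of distance from the output."""
    # index 0 is the earliest layer; index num_layers is the output side.
    return [base_lr * decay_rate ** (num_layers - i) for i in range(num_layers + 1)]


lrs = layerwise_lrs(num_layers=3, base_lr=1e-3, decay_rate=0.9)
# earliest layer gets 1e-3 * 0.9**3; the last entry stays at base_lr
```

The constructor would then place each parameter group into the optimizer with its computed learning rate.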

Step 2: Import a new optimizer constructor

The module defined above must be imported into the main namespace so that it gets registered. Assuming MyOptimizerConstructor is implemented in mmseg/engine/optimizers/my_optimizer_constructor.py, there are two ways to import it: either modify mmseg/engine/optimizers/__init__.py to import it, or use custom_imports in the config file.

from .my_optimizer_constructor import MyOptimizerConstructor
custom_imports = dict(imports=['mmseg.engine.optimizers.my_optimizer_constructor'], allow_failed_imports=False)
Step 3: Modify config file

Then modify the constructor field in the optim_wrapper of the config file. If users want to use the customized MyOptimizerConstructor, it can be set as:

optim_wrapper = dict(type='OptimWrapper',
                     constructor='MyOptimizerConstructor',
                     clip_grad=None)
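Note that the constructor and optimizer fields coexist: the constructor receives the optimizer config and decides how to build the parameter groups. Combining the earlier examples, a config using both customizations could look like this (a_value, b_value and c_value are placeholders, as above):

```python
optim_wrapper = dict(
    type='OptimWrapper',
    constructor='MyOptimizerConstructor',
    optimizer=dict(type='MyOptimizer', a=a_value, b=b_value, c=c_value),
    clip_grad=None)
```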
