paddlespeech.vector.training.scheduler module

class paddlespeech.vector.training.scheduler.CyclicLRScheduler(base_lr: float = 1e-08, max_lr: float = 0.001, step_size: int = 10000)[source]

Bases: LRScheduler
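
Examples:

A minimal construction sketch. The Linear model and AdamW optimizer below are illustrative placeholders, not part of this module; the exact shape of the cycle between base_lr and max_lr is determined by the scheduler itself, with step_size controlling its length.

    import paddle
    from paddlespeech.vector.training.scheduler import CyclicLRScheduler

    # Placeholder model; any paddle.nn.Layer with parameters works here.
    model = paddle.nn.Linear(80, 192)

    # base_lr and max_lr bound the cyclic schedule; step_size controls
    # how many steps the cycle spans (see the class parameters above).
    scheduler = CyclicLRScheduler(base_lr=1e-8, max_lr=1e-3, step_size=10000)

    # Paddle optimizers accept an LRScheduler instance as learning_rate.
    optimizer = paddle.optimizer.AdamW(
        learning_rate=scheduler, parameters=model.parameters())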

Methods

__call__()

Return the latest computed learning rate on the current epoch.

get_lr()

Subclasses that override LRScheduler (the base class) should provide a custom implementation of get_lr().

set_dict(state_dict)

Loads the scheduler's state.

set_state_dict(state_dict)

Loads the scheduler's state.

state_dict()

Returns the state of the scheduler as a dict.

state_keys()

For subclasses that override LRScheduler (the base class); sets the keys saved by state_dict().

step()

step() should be called after optimizer.step().

get_lr()[source]

Subclasses that override LRScheduler (the base class) should provide a custom implementation of get_lr().

Otherwise, a NotImplementedError exception will be raised.
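
As an illustration of this contract, a hypothetical subclass (not part of paddlespeech) built on paddle.optimizer.lr.LRScheduler only needs to set its own state and override get_lr():

    import paddle

    class WarmupScheduler(paddle.optimizer.lr.LRScheduler):
        # Hypothetical scheduler: linear warmup from 0 up to the base rate.
        def __init__(self, learning_rate, warmup_steps, last_epoch=-1, verbose=False):
            self.warmup_steps = warmup_steps
            super().__init__(learning_rate, last_epoch, verbose)

        def get_lr(self):
            # self.last_epoch is advanced by step(); self.base_lr holds the
            # learning_rate passed to the base-class constructor.
            if self.last_epoch < self.warmup_steps:
                return self.base_lr * (self.last_epoch + 1) / self.warmup_steps
            return self.base_lr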

step()[source]

step() should be called after optimizer.step(). It will update the learning rate in the optimizer according to the current epoch. The new learning rate will take effect on the next optimizer.step().

Args:

epoch (int, optional): the current epoch. Default: None, in which case the epoch auto-increments from last_epoch=-1.

Returns:

None
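
A minimal training-loop sketch showing this ordering; the model, data, and loss are placeholders, and the scheduler/optimizer construction repeats the class-level example above.

    import paddle
    from paddlespeech.vector.training.scheduler import CyclicLRScheduler

    model = paddle.nn.Linear(80, 192)          # placeholder model
    scheduler = CyclicLRScheduler(base_lr=1e-8, max_lr=1e-3, step_size=10000)
    optimizer = paddle.optimizer.AdamW(
        learning_rate=scheduler, parameters=model.parameters())

    for _ in range(100):
        x = paddle.randn([32, 80])             # placeholder batch
        y = paddle.randn([32, 192])
        loss = paddle.nn.functional.mse_loss(model(x), y)
        loss.backward()
        optimizer.step()        # applies the current learning rate
        optimizer.clear_grad()
        scheduler.step()        # updated rate takes effect on the next optimizer.step()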