zae_engine.schedulers package

Submodules

zae_engine.schedulers.core module

class zae_engine.schedulers.core.SchedulerBase(optimizer: Optimizer, total_iters: int, eta_min: float, last_epoch: int = -1)[source]

Bases: LRScheduler, ABC

Base class for learning rate schedulers.

This class extends PyTorch’s LRScheduler and adds functionality for custom learning rate scheduling.

Parameters:
  • optimizer (Optimizer) – The optimizer for which to schedule the learning rate.

  • total_iters (int) – The total number of iterations for the scheduler.

  • eta_min (float) – The minimum learning rate.

  • last_epoch (int, optional) – The index of the last epoch. Default is -1.

optimizer

The optimizer being used.

Type: Optimizer

total_iters

The total number of iterations for the scheduler.

Type: int

eta_min

The minimum learning rate.

Type: float

last_epoch

The index of the last epoch.

Type: int

_step_count

The step count for the scheduler.

Type: int

abstract get_lr()[source]

Get the learning rate for the current epoch.

Returns:

The learning rate for the current epoch.

Return type: float
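The abstract get_lr hook is the only piece a concrete subclass has to supply. As a hedged sketch of that contract (pure Python, no torch dependency; the LinearDecaySketch class is hypothetical and only mimics the documented attributes, it does not inherit from the real SchedulerBase):

```python
class LinearDecaySketch:
    """Hypothetical stand-in for a SchedulerBase subclass: decays linearly
    from base_lr to eta_min over total_iters steps (illustration only)."""

    def __init__(self, base_lr: float, total_iters: int, eta_min: float = 0.0):
        self.base_lr = base_lr
        self.total_iters = total_iters
        self.eta_min = eta_min
        self.last_epoch = -1  # mirrors the LRScheduler convention

    def get_lr(self) -> float:
        # Fraction of the schedule completed, clamped to [0, 1].
        t = min(max(self.last_epoch, 0) / self.total_iters, 1.0)
        return self.eta_min + (self.base_lr - self.eta_min) * (1.0 - t)

    def step(self) -> float:
        self.last_epoch += 1
        return self.get_lr()


sched = LinearDecaySketch(base_lr=0.1, total_iters=10, eta_min=0.01)
lrs = [sched.step() for _ in range(11)]
print(round(lrs[0], 3), round(lrs[-1], 3))  # 0.1 0.01
```

The real base class delegates the bookkeeping (step counting, last_epoch updates) to PyTorch's LRScheduler; only the per-step learning-rate computation varies between subclasses.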

class zae_engine.schedulers.core.SchedulerChain(*schedulers: SchedulerBase)[source]

Bases: SchedulerBase

Chain multiple learning rate schedulers together.

This class allows you to chain multiple learning rate schedulers so that they are applied sequentially.

Parameters:

*schedulers (SchedulerBase) – The schedulers to chain together.

schedulers

The list of chained schedulers.

Type: list of SchedulerBase

optimizer

The optimizer being used.

Type: Optimizer

next_iters

The iteration counts at which to switch to the next scheduler.

Type: list of int

total_iters

The total number of iterations for the entire chain of schedulers.

Type: int

i_scheduler

The index of the current scheduler in the chain.

Type: int

get_lr()[source]

Get the learning rate for the current epoch.

Returns:

The learning rate for the current epoch.

Return type: float

sanity_check()[source]

Check that all schedulers use the same optimizer.

Returns:

The common optimizer used by all schedulers.

Return type: Optimizer

Raises:

AssertionError – If multiple optimizers are detected.

step(epoch=None)[source]

Perform a step of the scheduler.

Parameters:

epoch (int, optional) – The current epoch. If None, use the internal step count.
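The switching logic itself is not reproduced in this reference, so the following is a pure-Python sketch inferred from the documented attributes: next_iters holds the cumulative iteration boundaries and i_scheduler is the index of the currently active scheduler. The chain_boundaries/active_index helpers and the SimpleNamespace stand-ins are hypothetical names, not zae_engine API:

```python
from itertools import accumulate
from types import SimpleNamespace

# Two stand-in schedulers carrying only the attribute the chain needs.
warmup = SimpleNamespace(total_iters=3)
cosine = SimpleNamespace(total_iters=7)

def chain_boundaries(*schedulers):
    """Cumulative step counts at which the chain advances to the next
    scheduler -- analogous to the documented next_iters attribute."""
    return list(accumulate(s.total_iters for s in schedulers))

def active_index(step_count, next_iters):
    """Index of the scheduler handling the given step (cf. i_scheduler)."""
    for i, boundary in enumerate(next_iters):
        if step_count < boundary:
            return i
    return len(next_iters) - 1

next_iters = chain_boundaries(warmup, cosine)
schedule = [active_index(t, next_iters) for t in range(10)]
print(next_iters, schedule)  # [3, 10] [0, 0, 0, 1, 1, 1, 1, 1, 1, 1]
```

Under this reading, total_iters for the chain is simply the last boundary, and sanity_check guards the one invariant the hand-off relies on: every scheduler in the chain must drive the same optimizer.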

zae_engine.schedulers.scheduler module

class zae_engine.schedulers.scheduler.CosineAnnealingScheduler(optimizer: Optimizer, total_iters, eta_min: float = 0, last_epoch: int = -1)[source]

Bases: SchedulerBase

get_lr()[source]

Get the learning rate for the current epoch.

Returns:

The learning rate for the current epoch.

Return type: float
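The source is not quoted here, but the class name suggests the textbook cosine-annealing rule, lr(t) = eta_min + (base_lr - eta_min) * (1 + cos(pi * t / T)) / 2. A minimal sketch of that assumed formula:

```python
import math

def cosine_annealing(base_lr: float, eta_min: float, t: int, total_iters: int) -> float:
    """The standard cosine-annealing rule, assumed (not read from the
    zae_engine source) to be what this scheduler implements."""
    return eta_min + 0.5 * (base_lr - eta_min) * (1 + math.cos(math.pi * t / total_iters))

# Decays smoothly from base_lr toward eta_min over total_iters steps.
print(round(cosine_annealing(0.1, 0.0, 0, 100), 4))    # 0.1
print(round(cosine_annealing(0.1, 0.0, 50, 100), 4))   # 0.05
print(round(cosine_annealing(0.1, 0.0, 100, 100), 4))  # 0.0
```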

class zae_engine.schedulers.scheduler.WarmUpScheduler(optimizer: Optimizer, total_iters, eta_min: float = 0, last_epoch: int = -1)[source]

Bases: SchedulerBase

get_lr()[source]

Get the learning rate for the current epoch.

Returns:

The learning rate for the current epoch.

Return type: float
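Warm-up schedulers conventionally ramp the learning rate up from a small value to the optimizer's base rate; the exact curve zae_engine uses is not shown in this reference, so the linear ramp below is an assumption for illustration only:

```python
def linear_warmup(base_lr: float, eta_min: float, t: int, total_iters: int) -> float:
    """A linear ramp from eta_min up to base_lr over total_iters steps --
    the common warm-up shape, assumed here; check the zae_engine source
    for the exact curve."""
    frac = min(t / total_iters, 1.0)
    return eta_min + (base_lr - eta_min) * frac

lrs = [round(linear_warmup(0.1, 0.0, t, 5), 3) for t in range(6)]
print(lrs)  # [0.0, 0.02, 0.04, 0.06, 0.08, 0.1]
```

A typical use of the two schedulers in this module is to place a WarmUpScheduler ahead of a CosineAnnealingScheduler inside a SchedulerChain, so the rate ramps up and then anneals down over the combined total_iters.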

Module contents