clr_keras_utils module

class clr_keras_utils.CyclicLR(base_lr=0.001, max_lr=0.006, step_size=2000.0, mode='triangular', gamma=1.0, scale_fn=None, scale_mode='cycle')[source]

Bases: keras.callbacks.Callback

This callback implements a cyclical learning rate (CLR) policy. The method cycles the learning rate between two boundaries at a constant frequency.

# Arguments

• base_lr: initial learning rate, which is the lower boundary of the cycle.

• max_lr: upper boundary of the cycle. Functionally, it defines the cycle amplitude (max_lr - base_lr). The lr at any point in the cycle is the sum of base_lr and some scaling of the amplitude, so max_lr may not actually be reached, depending on the scaling function.

• step_size: number of training iterations per half cycle. The authors suggest setting step_size to 2-8x the training iterations per epoch.

• mode: one of {'triangular', 'triangular2', 'exp_range'}. Default 'triangular'. Values correspond to the policies detailed below. If scale_fn is not None, this argument is ignored.

• gamma: constant in the 'exp_range' scaling function: gamma**(cycle iterations).

• scale_fn: custom scaling policy defined by a single-argument lambda function, where 0 <= scale_fn(x) <= 1 for all x >= 0. The mode parameter is ignored when this is set.

• scale_mode: one of {'cycle', 'iterations'}. Defines whether scale_fn is evaluated on the cycle number or on cycle iterations (training iterations since the start of the cycle). Default 'cycle'.

The amplitude of the cycle can be scaled on a per-iteration or per-cycle basis. This class has three built-in policies, as put forth in the paper:

• "triangular": a basic triangular cycle with no amplitude scaling.

• "triangular2": a basic triangular cycle that halves the initial amplitude each cycle.

• "exp_range": a cycle that scales the initial amplitude by gamma**(cycle iterations) at each cycle iteration.

For more detail, please see the paper.
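As an illustrative sketch (not this module's exact source), the schedule behind these policies can be computed as in the reference CLR implementation, with scale_fn controlling the amplitude:

```python
import numpy as np

def cyclic_lr(iterations, base_lr=0.001, max_lr=0.006, step_size=2000.,
              scale_fn=lambda cycle: 1.0):
    """Sketch of the CLR schedule; scale_fn=1 gives the 'triangular' policy."""
    # Each full cycle spans 2 * step_size training iterations.
    cycle = np.floor(1 + iterations / (2 * step_size))
    # x is 1 at the cycle boundaries (lr = base_lr) and 0 at the
    # mid-cycle peak (lr approaches max_lr).
    x = np.abs(iterations / step_size - 2 * cycle + 1)
    return base_lr + (max_lr - base_lr) * np.maximum(0, 1 - x) * scale_fn(cycle)
```

Under this sketch, 'triangular2' corresponds to scale_fn=lambda cycle: 1 / 2.**(cycle - 1), and 'exp_range' applies gamma**iterations per iteration rather than per cycle.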

# Example for CIFAR-10 w/ batch size 100:
```python
clr = CyclicLR(base_lr=0.001, max_lr=0.006,
               step_size=2000., mode='triangular')
model.fit(X_train, Y_train, callbacks=[clr])
```
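With 50,000 CIFAR-10 training images and a batch size of 100, one epoch is 500 iterations, so step_size=2000. corresponds to 4 epochs, within the suggested 2-8x range.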

The class also supports custom scaling functions:
```python
import numpy as np

clr_fn = lambda x: 0.5 * (1 + np.sin(x * np.pi / 2.))
clr = CyclicLR(base_lr=0.001, max_lr=0.006,
               step_size=2000., scale_fn=clr_fn, scale_mode='cycle')
model.fit(X_train, Y_train, callbacks=[clr])
```
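Note that with scale_mode='cycle' the lambda above is evaluated on the cycle number, so the amplitude itself oscillates: 0.5*(1+np.sin(x*np.pi/2.)) yields 1 at cycle 1, 0.5 at cycle 2, 0 at cycle 3, 0.5 at cycle 4, and so on.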

# References

  • [Cyclical Learning Rates for Training Neural Networks](https://arxiv.org/abs/1506.01186)

clr()[source]
on_batch_end(epoch, logs=None)[source]

A backwards compatibility alias for on_train_batch_end.
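As a hedged sketch of what this hook typically does in CLR callbacks (assuming the common open-source implementation; not verified against this module's exact source), shown as it would appear inside the class:

```python
from keras import backend as K

def on_batch_end(self, epoch, logs=None):
    # Despite the name, `epoch` receives the batch index here; the
    # signature is kept for backwards compatibility, as noted above.
    logs = logs or {}
    self.trn_iterations += 1   # batches seen over all of training
    self.clr_iterations += 1   # batches since the cycle clock started
    # Write the freshly computed cyclical rate into the optimizer.
    K.set_value(self.model.optimizer.lr, self.clr())
```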

on_epoch_end(epoch, logs=None)[source]

Called at the end of an epoch.

Subclasses should override for any actions to run. This function should only be called during TRAIN mode.

Parameters
  • epoch – Integer, index of epoch.

  • logs – Dict, metric results for this training epoch, and for the validation epoch if validation is performed. Validation result keys are prefixed with val_. For the training epoch, the values of the Model's metrics are returned. Example: `{'loss': 0.2, 'accuracy': 0.7}`.

on_train_begin(logs={})[source]

Called at the beginning of training.

Subclasses should override for any actions to run.

Parameters

logs – Dict. Currently no data is passed to this argument for this method but that may change in the future.
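A minimal sketch of the start-of-training behavior, under the same assumption as the on_batch_end sketch above:

```python
from keras import backend as K

def on_train_begin(self, logs=None):
    logs = logs or {}
    if self.clr_iterations == 0:
        # Fresh run: start at the lower boundary of the cycle.
        K.set_value(self.model.optimizer.lr, self.base_lr)
    else:
        # Resuming mid-cycle: recompute the rate from the cycle clock.
        K.set_value(self.model.optimizer.lr, self.clr())
```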

clr_keras_utils.clr_callback(mode=None, base_lr=0.0001, max_lr=0.001, gamma=0.999994)[source]

Creates a Keras callback for a cyclical learning rate schedule.
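A plausible usage sketch, assuming the mode values mirror the CyclicLR policies above (model, X_train, and Y_train are placeholders):

```python
from clr_keras_utils import clr_callback

# Assumption: gamma is only relevant to the 'exp_range' policy.
clr = clr_callback(mode='triangular2', base_lr=0.0001, max_lr=0.001)
model.fit(X_train, Y_train, callbacks=[clr])
```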

clr_keras_utils.clr_check_args(args)[source]
clr_keras_utils.clr_set_args(args)[source]