
case class Plateau(monitor: String, factor: Float = 0.1f, patience: Int = 10, mode: String = "min", epsilon: Float = 1e-4f, cooldown: Int = 0, minLr: Float = 0) extends LearningRateSchedule with Product with Serializable

Plateau is a learning rate schedule that reduces the learning rate when a metric has stopped improving. Models often benefit from reducing the learning rate by a factor of 2-10 once learning stagnates. The schedule monitors a quantity and, if no improvement is seen for `patience` epochs, reduces the learning rate.

monitor

the quantity to be monitored; can be "Loss" or "score"

factor

the factor by which the learning rate will be reduced: new_lr = lr * factor

patience

number of epochs with no improvement after which the learning rate will be reduced.

mode

one of {min, max}. In min mode, the learning rate will be reduced when the monitored quantity has stopped decreasing; in max mode, it will be reduced when the monitored quantity has stopped increasing.

epsilon

threshold for measuring a new optimum; only changes larger than this count as improvement.

cooldown

number of epochs to wait before resuming normal operation after the learning rate has been reduced.

minLr

lower bound on the learning rate.
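For example, a Plateau schedule can be attached to an SGD optimizer. This is a hedged sketch: the import path and the SGD parameter names (`learningRate`, `learningRateSchedule`) assume the BigDL API and may differ in your version.

```scala
import com.intel.analytics.bigdl.optim.SGD
import com.intel.analytics.bigdl.optim.SGD.Plateau

// Halve the learning rate after 5 epochs without a drop in the monitored loss,
// never going below 1e-5.
val schedule = Plateau(
  monitor = "Loss",
  factor = 0.5f,
  patience = 5,
  mode = "min",
  minLr = 1e-5f)

val optimMethod = new SGD[Float](
  learningRate = 0.01,
  learningRateSchedule = schedule)
```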

Linear Supertypes
Serializable, Serializable, Product, Equals, LearningRateSchedule, AnyRef, Any

Instance Constructors

  1. new Plateau(monitor: String, factor: Float = 0.1f, patience: Int = 10, mode: String = "min", epsilon: Float = 1e-4f, cooldown: Int = 0, minLr: Float = 0)

    monitor

    the quantity to be monitored; can be "Loss" or "score"

    factor

    the factor by which the learning rate will be reduced: new_lr = lr * factor

    patience

    number of epochs with no improvement after which the learning rate will be reduced.

    mode

    one of {min, max}. In min mode, the learning rate will be reduced when the monitored quantity has stopped decreasing; in max mode, it will be reduced when the monitored quantity has stopped increasing.

    epsilon

    threshold for measuring a new optimum; only changes larger than this count as improvement.

    cooldown

    number of epochs to wait before resuming normal operation after the learning rate has been reduced.

    minLr

    lower bound on the learning rate.

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  5. var best: Float
  6. def clone(): AnyRef
    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @native() @throws( ... )
  7. val cooldown: Int
  8. val currentRate: Double
    Definition Classes
    LearningRateSchedule
  9. val epsilon: Float
  10. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  11. val factor: Float
  12. def finalize(): Unit
    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  13. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  14. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  15. val minLr: Float
  16. val mode: String
  17. val monitor: String
  18. var monitorOp: (Float, Float) ⇒ Boolean
  19. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  20. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  21. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  22. val patience: Int
  23. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  24. def updateHyperParameter[T](optimMethod: SGD[T]): Unit

    Update the learning rate according to the optimizer's config table and state table.

    optimMethod

    the SGD optimization method whose learning rate will be updated.

    Definition Classes
    PlateauLearningRateSchedule
  25. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  26. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  27. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @native() @throws( ... )
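The reduction logic that updateHyperParameter applies can be sketched in plain Scala. This is an illustrative re-implementation under the parameter semantics documented above, not BigDL's actual code; the class and field names here are invented for the sketch.

```scala
// Minimal sketch of the Plateau update rule (illustrative only).
// Parameter names mirror the Plateau constructor documented above.
class PlateauSketch(
    val factor: Float = 0.1f,
    val patience: Int = 10,
    val mode: String = "min",
    val epsilon: Float = 1e-4f,
    val cooldown: Int = 0,
    val minLr: Float = 0f) {

  // best value of the monitored quantity seen so far
  private var best: Float = if (mode == "min") Float.MaxValue else Float.MinValue
  private var waiting = 0          // epochs since the last improvement
  private var cooldownCounter = 0  // epochs left before monitoring resumes

  // true when `current` is a significant improvement over `best`
  private def improved(current: Float): Boolean =
    if (mode == "min") current < best - epsilon else current > best + epsilon

  // feed the monitored metric for one epoch; returns the (possibly reduced) lr
  def step(current: Float, lr: Float): Float = {
    if (cooldownCounter > 0) { cooldownCounter -= 1; waiting = 0 }
    if (improved(current)) {
      best = current
      waiting = 0
      lr
    } else if (cooldownCounter == 0) {
      waiting += 1
      if (waiting >= patience) {
        waiting = 0
        cooldownCounter = cooldown
        math.max(lr * factor, minLr)
      } else lr
    } else lr
  }
}

// demo: the loss stalls for `patience` epochs, so the lr is halved
val sched = new PlateauSketch(factor = 0.5f, patience = 2, mode = "min")
var lr = 1.0f
lr = sched.step(0.9f, lr)   // improvement: lr unchanged
lr = sched.step(0.95f, lr)  // no improvement (1 of 2)
lr = sched.step(0.95f, lr)  // no improvement (2 of 2): lr reduced
println(lr)                 // 0.5
```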

Deprecated Value Members

  1. def updateHyperParameter(config: Table, state: Table): Unit
    Definition Classes
    LearningRateSchedule
    Annotations
    @deprecated
    Deprecated

    (Since version 0.2.0) Please input SGD instead of Table

Inherited from Serializable

Inherited from Serializable

Inherited from Product

Inherited from Equals

Inherited from LearningRateSchedule

Inherited from AnyRef

Inherited from Any
