class AdaDelta
    Adadelta is an optimizer that uses two ideas to improve upon the two main drawbacks of the Adagrad method: the continual decay of learning rates throughout training, and the need for a manually selected global learning rate.

class Adam
    Adam is an optimizer that computes individual adaptive learning rates for different parameters from estimates of first and second moments of the gradients.

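    The brief above names the ingredients without the update itself; for reference, the textbook rule from the Adam paper (Kingma and Ba) is the following, where the parameter names are the paper's and not necessarily the class's:

        m_t = beta_1 * m_{t-1} + (1 - beta_1) * g_t
        v_t = beta_2 * v_{t-1} + (1 - beta_2) * g_t^2
        theta_{t+1} = theta_t - alpha * mhat_t / (sqrt(vhat_t) + eps)

    Here g_t is the current gradient, mhat_t and vhat_t are the bias-corrected versions of m_t and v_t, alpha is the step size, and eps guards against division by zero.
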
class AugLagrangian
    The AugLagrangian class implements the Augmented Lagrangian method of optimization.

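    For a constrained problem min f(x) subject to c_i(x) = 0, the textbook augmented Lagrangian (the standard definition of the method, not a quote from the class documentation; sign conventions in the implementation may differ) is

        L(x, lambda; sigma) = f(x) - sum_i lambda_i c_i(x) + (sigma / 2) * sum_i c_i(x)^2

    The method alternates between minimizing L in x with an unconstrained optimizer and updating the multipliers lambda_i and penalty parameter sigma.
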
class AugLagrangianFunction
    This is a utility class used by AugLagrangian, meant to wrap a LagrangianFunction into a function usable by a simple optimizer like L-BFGS.

class AugLagrangianTestFunction
    This function is taken from "Practical Mathematical Optimization" (Snyman), section 5.3.8 ("Application of the Augmented Lagrangian Method").

class ExponentialSchedule
    The exponential cooling schedule cools the temperature T at every step according to an exponential decay rule (given below).

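    Assuming the usual one-parameter form of exponential cooling, the rule is

        T_{n+1} = (1 - lambda) * T_n

    with cooling rate lambda in (0, 1), so the temperature decays geometrically toward zero.
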
class GockenbachFunction
    This function is taken from M. Gockenbach's lectures on general nonlinear programs.

class GradientDescent
    Gradient Descent is a technique to minimize a function by repeatedly stepping in the direction of the negative gradient.

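    The iteration is the textbook one: starting from some x_0, repeat

        x_{t+1} = x_t - alpha * grad f(x_t)

    for a step size alpha until the improvement falls below a tolerance.
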
class L_BFGS
    The generic L-BFGS optimizer, which uses a back-tracking line search algorithm to minimize a function.

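    The differentiable-function optimizers in this list (GradientDescent, L_BFGS) all consume a FunctionType that exposes Evaluate() and Gradient(). A minimal sketch of that contract, assuming the conventional mlpack signatures and the mlpack 2.x-era header layout; the SimpleQuadratic function is invented for illustration:

        #include <iostream>
        #include <mlpack/core.hpp>
        #include <mlpack/core/optimizers/lbfgs/lbfgs.hpp>

        using namespace mlpack::optimization;

        // Hypothetical differentiable objective: f(x) = || x - 1 ||^2.
        class SimpleQuadratic
        {
         public:
          // Objective value at the given coordinates.
          double Evaluate(const arma::mat& x)
          {
            return arma::accu(arma::square(x - 1.0));
          }

          // Gradient of the objective at the given coordinates.
          void Gradient(const arma::mat& x, arma::mat& gradient)
          {
            gradient = 2.0 * (x - 1.0);
          }
        };

        int main()
        {
          SimpleQuadratic f;
          L_BFGS<SimpleQuadratic> lbfgs(f);

          // Start from the origin; Optimize() overwrites this with the solution.
          arma::mat coordinates = arma::zeros<arma::mat>(3, 1);
          const double objective = lbfgs.Optimize(coordinates);

          std::cout << "f(x*) = " << objective << std::endl;  // Should be near 0.
          return 0;
        }

    The optimizers are templatized on the function type, so the same pattern applies to GradientDescent.
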
class LovaszThetaSDP
    This function is the Lovasz-Theta semidefinite program, as implemented in the Monteiro-Burer low-rank SDP paper on which LRSDP is based.

class LRSDP
    LRSDP is the implementation of Monteiro and Burer's formulation of low-rank semidefinite programs (LR-SDP).

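    In that formulation (stated here in its standard textbook form), the positive semidefinite matrix variable X of the SDP is factored as X = R * R^T for a tall, low-rank matrix R, which turns the SDP into a much smaller, though nonconvex, nonlinear program over R that can then be attacked with a method like AugLagrangian.
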
class LRSDPFunction
    The objective function that LRSDP is trying to optimize.

class MiniBatchSGD
    Mini-batch Stochastic Gradient Descent is a technique for minimizing a function which can be expressed as a sum of other functions.

class PrimalDualSolver
    Interface to a primal-dual interior point solver.

class RMSprop
    RMSprop is an optimizer that uses the magnitude of recent gradients to normalize the current gradient.

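    The standard RMSprop rule (textbook form; parameter names here are generic rather than taken from the class) keeps a running average of squared gradients and divides by its root:

        v_t = rho * v_{t-1} + (1 - rho) * g_t^2
        theta_{t+1} = theta_t - alpha * g_t / (sqrt(v_t) + eps)
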
class SA
    Simulated Annealing is a stochastic optimization algorithm that can deliver near-optimal results quickly without needing the gradient of the function being optimized.

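    Concretely, a candidate move that changes the objective by deltaE is accepted with the standard Metropolis probability used in simulated annealing,

        P(accept) = min(1, exp(-deltaE / T))

    so a high temperature T permits uphill moves and the cooling schedule (e.g. ExponentialSchedule) gradually forbids them.
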
class SDP
    Specify an SDP in primal form.

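    The standard primal form meant here is

        min_X <C, X>   subject to   <A_i, X> = b_i for each i,   X positive semidefinite,

    where <., .> denotes the trace inner product.
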
class SGD
    Stochastic Gradient Descent is a technique for minimizing a function which can be expressed as a sum of other functions.

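    SGD (like MiniBatchSGD, Adam, AdaDelta, and RMSprop) consumes a decomposable function: an extra index argument selects which term of the sum is being evaluated. A minimal sketch, assuming the conventional NumFunctions()/Evaluate()/Gradient() signatures and the mlpack 2.x-era header layout; the SumOfSquares function is invented for illustration:

        #include <cmath>
        #include <mlpack/core.hpp>
        #include <mlpack/core/optimizers/sgd/sgd.hpp>

        using namespace mlpack::optimization;

        // Hypothetical decomposable objective: f(x) = sum_i (x - a_i)^2,
        // where each term i can be evaluated on its own.
        class SumOfSquares
        {
         public:
          explicit SumOfSquares(const arma::vec& a) : a(a) { }

          // Number of separable terms in the sum.
          size_t NumFunctions() const { return a.n_elem; }

          // Value of the i-th term only.
          double Evaluate(const arma::mat& x, const size_t i) const
          {
            return std::pow(x(0, 0) - a(i), 2.0);
          }

          // Gradient of the i-th term only.
          void Gradient(const arma::mat& x, const size_t i, arma::mat& gradient) const
          {
            gradient.set_size(1, 1);
            gradient(0, 0) = 2.0 * (x(0, 0) - a(i));
          }

         private:
          arma::vec a;
        };

        int main()
        {
          SumOfSquares f(arma::vec("1.0 2.0 3.0"));
          SGD<SumOfSquares> sgd(f, 0.01 /* step size */, 50000 /* max iterations */);

          arma::mat coordinates = arma::zeros<arma::mat>(1, 1);
          sgd.Optimize(coordinates);
          // coordinates(0, 0) should converge toward the mean of a, i.e. 2.0.
          return 0;
        }
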