MyCaffe
1.12.2.41
Deep learning software for Windows C# programmers.

The MyCaffe.solvers namespace contains all solver classes, including the base Solver.
Classes  
class  AdaDeltaSolver 
Use the AdaDelta Solver, which performs gradient-based optimization like SGD while adapting per-parameter learning rates from running averages of squared gradients and squared updates.
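For reference, the textbook AdaDelta update rule (Zeiler, 2012) is sketched below; MyCaffe's exact implementation may differ in details such as epsilon placement:

```latex
\begin{aligned}
E[g^2]_t &= \rho\, E[g^2]_{t-1} + (1-\rho)\, g_t^2 \\
\Delta\theta_t &= -\frac{\sqrt{E[\Delta\theta^2]_{t-1} + \varepsilon}}{\sqrt{E[g^2]_t + \varepsilon}}\; g_t \\
E[\Delta\theta^2]_t &= \rho\, E[\Delta\theta^2]_{t-1} + (1-\rho)\, \Delta\theta_t^2 \\
\theta_{t+1} &= \theta_t + \Delta\theta_t
\end{aligned}
```

Because the numerator and denominator carry the same units, AdaDelta needs no hand-tuned global learning rate.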
class  AdaGradSolver 
Use the AdaGrad Solver, which performs gradient-based optimization like SGD while adapting the learning rate per parameter, so that rarely seen (infrequently updated) features receive larger effective updates.
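The "rarely seen features" behavior comes from the standard AdaGrad rule, sketched below (MyCaffe's formulation may vary in minor details): each parameter's step is scaled by the inverse square root of its accumulated squared gradients, so parameters that are seldom active have a small accumulator and thus a larger effective step.

```latex
\begin{aligned}
G_t &= \sum_{\tau=1}^{t} g_\tau^2 \quad \text{(element-wise accumulation)} \\
\theta_{t+1} &= \theta_t - \frac{\alpha}{\sqrt{G_t} + \varepsilon}\; g_t
\end{aligned}
```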
class  AdamSolver 
Use the Adam Solver, which performs gradient-based optimization like SGD, adds 'adaptive moment estimation', and can be thought of as a generalization of AdaGrad.
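The 'adaptive moment estimation' refers to exponential moving averages of the gradient and its square, as in the textbook Adam rule sketched here (a reference sketch, not necessarily MyCaffe's exact formulation):

```latex
\begin{aligned}
m_t &= \beta_1 m_{t-1} + (1-\beta_1)\, g_t, \qquad
v_t = \beta_2 v_{t-1} + (1-\beta_2)\, g_t^2 \\
\hat{m}_t &= \frac{m_t}{1-\beta_1^t}, \qquad
\hat{v}_t = \frac{v_t}{1-\beta_2^t} \\
\theta_{t+1} &= \theta_t - \alpha\, \frac{\hat{m}_t}{\sqrt{\hat{v}_t} + \varepsilon}
\end{aligned}
```

The Adam paper shows that AdaGrad corresponds to a limiting case ($\beta_1 = 0$ and $\beta_2 \to 1$ with an annealed step size), which is the sense in which Adam generalizes it.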
class  AdamWSolver 
Use the AdamW Solver, which performs gradient-based optimization like Adam but with decoupled weight decay.
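"Decoupled" means the weight-decay term is applied directly to the weights rather than folded into the gradient (where Adam's adaptive denominator would rescale it). A sketch of the standard AdamW step, using the bias-corrected moments $\hat{m}_t$, $\hat{v}_t$ from Adam and decay rate $\lambda$:

```latex
\theta_{t+1} = \theta_t - \alpha \left( \frac{\hat{m}_t}{\sqrt{\hat{v}_t} + \varepsilon} + \lambda\, \theta_t \right)
```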
class  LBFGSSolver 
Optimizes the parameters of a Net using L-BFGS. This implementation is based on minFunc by Mark Schmidt.
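In general terms (a sketch of the standard method, not MyCaffe-specific), L-BFGS is a quasi-Newton method: it keeps the last $m$ curvature pairs $s_i = \theta_{i+1} - \theta_i$, $y_i = \nabla L(\theta_{i+1}) - \nabla L(\theta_i)$ and uses them to implicitly approximate the inverse Hessian $H_k$, stepping along

```latex
\theta_{k+1} = \theta_k - \alpha_k\, H_k\, \nabla L(\theta_k)
```

where $\alpha_k$ is chosen by a line search, so second-order curvature is exploited without ever forming the full Hessian.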
class  NesterovSolver 
Use Nesterov's accelerated gradient Solver, which is similar to SGD with momentum except that the error gradient is evaluated at the look-ahead point, i.e. at the weights after the momentum step has been applied.
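The look-ahead evaluation can be sketched as follows (standard form; implementation details may vary), with momentum $\mu$ and learning rate $\alpha$:

```latex
\begin{aligned}
V_{t+1} &= \mu V_t - \alpha\, \nabla L(W_t + \mu V_t) \\
W_{t+1} &= W_t + V_{t+1}
\end{aligned}
```

Compared to plain momentum, evaluating the gradient at $W_t + \mu V_t$ lets the update correct course before overshooting.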
class  RmsPropSolver 
Use the RmsProp Solver, which performs gradient-based optimization like SGD but divides each gradient by a running average of its recent magnitude.
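The standard RMSProp rule is sketched below (MyCaffe's formulation may differ in minor details), where $\delta$ is the rms decay rate:

```latex
\begin{aligned}
E[g^2]_t &= \delta\, E[g^2]_{t-1} + (1-\delta)\, g_t^2 \\
\theta_{t+1} &= \theta_t - \frac{\alpha}{\sqrt{E[g^2]_t} + \varepsilon}\; g_t
\end{aligned}
```

Using a decaying average instead of AdaGrad's unbounded sum keeps the effective learning rate from shrinking to zero over long runs.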
class  SGDSolver 
The Stochastic Gradient Descent solver with momentum; it updates the weights by a linear combination of the negative gradient and the previous weight update.
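The "linear combination" described above is the classic momentum update (as documented for Caffe-style solvers), with learning rate $\alpha$ and momentum $\mu$:

```latex
\begin{aligned}
V_{t+1} &= \mu V_t - \alpha\, \nabla L(W_t) \\
W_{t+1} &= W_t + V_{t+1}
\end{aligned}
```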
class  Solver 
An interface for classes that perform optimization on Nets; this class serves as the base class for all solvers. More...  