1. Unconstrained Optimization Methods
(Used when there are no explicit constraints on variables)
- Gradient-Based Methods
- Second-Order Methods
- Heuristic & Meta-Heuristic Methods
- Bayesian Optimization
2. Constrained Optimization Methods
(Used when optimization involves constraints on variables)
- Convex Optimization Methods
- Augmented Lagrangian Methods
- Penalty Methods
- Sequential Quadratic Programming (SQP)
- Interior-Point Methods
- Constraint-Specific Heuristic Approaches
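The distinction above can be seen on a toy problem (my own illustration, not from any library): minimize f(x) = (x - 3)^2. Unconstrained, the minimum is at x = 3; adding the constraint x <= 2 moves it to the boundary x = 2. A minimal projected-gradient sketch:

```python
# Toy objective (illustrative): f(x) = (x - 3)^2, gradient 2(x - 3).
def grad(x):
    return 2.0 * (x - 3.0)

def minimize(x0, project=None, lr=0.1, steps=200):
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)          # plain gradient step
        if project is not None:
            x = project(x)         # enforce the constraint after each step
    return x

x_free = minimize(0.0)                                   # approaches 3.0
x_con = minimize(0.0, project=lambda x: min(x, 2.0))     # clamped at 2.0
```

The same objective lands in different places purely because of the feasibility projection, which is the essence of why constrained methods need extra machinery.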
Here are the individual methods in each category (each is covered in its own Wikipedia article):
Unconstrained Optimization Methods
- Gradient Descent
- Stochastic Gradient Descent (SGD)
- Mini-Batch Gradient Descent
- Momentum
- Nesterov Accelerated Gradient (NAG)
- RMSprop
- AdaGrad
- Adam
- Adadelta
- Newton’s Method
- Quasi-Newton Methods (BFGS, L-BFGS)
- Conjugate Gradient Method
- Genetic Algorithms
- Simulated Annealing
- Particle Swarm Optimization (PSO)
- Bayesian Optimization
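To sketch how the gradient-based optimizers above differ, here are vanilla gradient descent and momentum side by side on a simple quadratic (an illustrative toy, not a library implementation; the objective and step sizes are my own choices):

```python
# Toy objective (illustrative): f(x) = 0.5 * x^2, gradient is simply x.

def gd(x, lr=0.1, steps=200):
    for _ in range(steps):
        x -= lr * x            # vanilla gradient descent step
    return x

def momentum(x, lr=0.1, beta=0.9, steps=200):
    v = 0.0
    for _ in range(steps):
        v = beta * v + x       # exponential moving average of gradients
        x -= lr * v            # step along the accumulated direction
    return x
```

Both drive x toward the minimum at 0; momentum smooths the trajectory by reusing past gradients. The adaptive methods in the list (AdaGrad, RMSprop, Adam, Adadelta) extend this idea by also rescaling the step size per parameter from a running history of squared gradients.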
Constrained Optimization Methods
- Linear Programming (LP)
- Simplex Method
- Interior Point Methods
- Quadratic Programming (QP)
- Semidefinite Programming (SDP)
- Augmented Lagrangian Method
- Penalty Method (Quadratic Penalty)
- Barrier Methods (Log Barrier)
- Sequential Quadratic Programming (SQP)
- Genetic Algorithms with Constraints
- Constrained Particle Swarm Optimization (CPSO)
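The penalty method from the list above can be sketched in a few lines (an illustrative toy of my own: minimize x1^2 + x2^2 subject to x1 + x2 = 1, whose exact solution is (0.5, 0.5)). The constraint is folded into the objective as a quadratic penalty whose weight is increased in stages:

```python
# Quadratic-penalty sketch: minimize x1^2 + x2^2  s.t.  x1 + x2 = 1.
# Penalized objective: x1^2 + x2^2 + (mu / 2) * (x1 + x2 - 1)^2.

def penalty_solve(mu_schedule=(1.0, 10.0, 100.0, 1000.0), steps=500):
    x1 = x2 = 0.0
    for mu in mu_schedule:              # tighten the penalty in stages
        lr = 0.5 / (1.0 + mu)           # step size kept stable for this weight
        for _ in range(steps):
            c = x1 + x2 - 1.0           # current constraint violation
            g1 = 2.0 * x1 + mu * c      # gradient of penalized objective
            g2 = 2.0 * x2 + mu * c
            x1 -= lr * g1
            x2 -= lr * g2
    return x1, x2
```

Each finite mu only approximates feasibility (the iterate sits slightly inside the constraint), which is why augmented Lagrangian methods add a multiplier term to reach the exact solution without driving mu to infinity.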
- Email me: Neil@HarwaniSystems.in
- Website: www.HarwaniSystems.in
- Blog: www.TechAndTrain.com/blog
- LinkedIn: Neil Harwani