In Chapter 7, we discussed many optimization algorithms and built-in MATLAB functions, and we saw examples of their use. However, almost all of these algorithms are gradient-based and return only local minima. If the objective function has many local minima and we start from an initial guess close to one of them, the algorithm will most probably converge to the local minimum nearest the initial guess; in that case we may never reach the global minimum. The success of such a method therefore depends strongly on the initial guess. Most real-world objectives are multivariate functions with multiple minima. To address this, heuristic search-based algorithms have been devised. These algorithms start with multiple initial guesses and maintain a set of candidate solutions that evolves over time. Hence these methods are known as evolutionary computation methods.
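A minimal sketch of this pitfall, written in Python for brevity: gradient descent on the multimodal function f(x) = x^4 - 4x^2 + x converges to whichever minimum is nearest the starting point. The objective, step size, and starting points below are illustrative choices, not taken from the text.

```python
def f(x):
    """Objective with two minima: a global one near x = -1.473
    and a local one near x = 1.347 (illustrative example)."""
    return x**4 - 4*x**2 + x

def grad(x):
    """Analytical derivative f'(x) = 4x^3 - 8x + 1."""
    return 4*x**3 - 8*x + 1

def gradient_descent(x0, lr=0.01, steps=1000):
    """Plain fixed-step gradient descent from initial guess x0."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# Two different initial guesses land in two different minima.
left = gradient_descent(-2.0)   # converges to the global minimum
right = gradient_descent(2.0)   # trapped in the local minimum

print(left, f(left))    # near -1.473, the global minimum
print(right, f(right))  # near 1.347, only a local minimum
```

Both runs "succeed" in the sense that the gradient vanishes, yet only the first finds the global minimum, which is exactly why the outcome hinges on the initial guess.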