This special issue emerged from contributions presented at the 10th Metaheuristics International Conference (MIC), held at the Singapore Management University in Singapore, August 5–8, 2013.

The first MIC took place in Breckenridge, Colorado, USA in 1995. Due to its success, the conference was established as a biennial event that attracted growing attention and steadily grew in size. Today, MIC has a long-standing international tradition and is, for many researchers and practitioners in the field of metaheuristics, one of the leading scientific events at which to exchange ideas with colleagues and meet friends.

Following the tradition, MIC2013 invited extended abstracts as well as papers of up to 10 pages describing original, unpublished scientific work related to metaheuristics. We received 86 submissions, which were reviewed by the Program Committee, yielding 25 accepted full papers and 34 extended abstracts that were scheduled for presentation at the conference and published in the conference proceedings (Lau et al. 2013). These contributions document the many facets of the field of metaheuristics. While a substantial portion of the papers focuses on applications in transport optimization, many other application areas are addressed, including network design, data analysis, scheduling, packing, and constraint satisfaction. From an algorithmic point of view, a broad range of techniques is considered: established metaheuristics are adapted, new variants are proposed, and hybrid approaches in particular, in which metaheuristics are applied in conjunction with other methods, e.g. from the field of mathematical programming, are becoming more and more popular.

After the conference, we invited authors to submit significantly extended manuscripts for publication in this special issue of the Journal of Heuristics. We received 29 submissions, which were independently reviewed according to the standards of this journal. Twelve articles were finally accepted, and we feel that they represent true highlights. They also illustrate the breadth of algorithmic strategies and applications that the field of metaheuristics has to offer.

The article de Araújo et al. (2015) entitled “DC-GRASP: directing the search on continuous-GRASP” is authored by Tiago Maritan Ugulino de Araújo, Lisieux Marie M. S. Andrade, Carlos Magno, Lucídio dos Anjos Formiga Cabral, Roberto Quirino do Nascimento, and Cláudio N. Meneses. The well-known Greedy Randomized Adaptive Search Procedure (GRASP) is considered in conjunction with continuous optimization problems. Earlier work already described a variant of GRASP, called C-GRASP, that is applicable to continuous problems. It has been observed, however, that due to its random construction, C-GRASP may fail to detect promising search directions, especially in the vicinity of minima, resulting in slow convergence. In the new DC-GRASP, this problem is alleviated by incorporating effective local search methods for a more directed search.

The article Benaichouche et al. (2014) entitled “Multiobjective improved spatial fuzzy c-means clustering for image segmentation combining Pareto-optimal clusters” by Ahmed Nasreddine Benaichouche, Hamouche Oulhadj, and Patrick Siarry proposes a grayscale image segmentation method based on a multiobjective optimization approach that optimizes two complementary criteria (region-based and edge-based). The paper uses an improved spatial fuzzy c-means clustering measure for the region-based fitness, while the edge-based fitness measure relies on contour statistics and the number of connected components in the image segmentation result. The method was compared to the most widely used FCM-based algorithms in the literature, and experimental results demonstrate its effectiveness.

The article Caserta and Voß (2014) entitled “A corridor method based hybrid algorithm for redundancy allocation”, by Marco Caserta and Stefan Voß, considers the problem of allocating redundant components within series-parallel systems to increase the reliability of hardware or software systems. The proposed heuristic approach consists of three phases, effectively combining the cross entropy method, the corridor method, and a dynamic programming-based scheme.

The article Fawcett and Hoos (2015) “Analysing differences between algorithm configurations through ablation” is from Chris Fawcett and Holger H. Hoos. The authors present a technique that seeks to analyse differences between algorithm configurations so that developers who apply automated configuration tools can determine which parameter changes contribute most to improved performance. They perform an extensive empirical analysis of their technique on five scenarios from propositional satisfiability, mixed-integer programming, and AI planning, and show that in all of these scenarios more than 95 % of the performance gains between default configurations and optimised configurations obtained from automated configuration tools can be explained by modifying the values of a small number of parameters.

The article Fischetti and Monaci (2015) “Proximity search heuristics for wind farm optimal layout” by Martina Fischetti and Michele Monaci proposes a heuristic framework for turbine layout optimization in a wind farm that combines ad-hoc heuristics and mixed-integer linear programming, refining the current best solution according to the recently-proposed proximity search paradigm. Computational results on instances with up to 20,000 potential turbine sites demonstrate the potential benefits of the approach.

In the article Inführ and Raidl (2014) “A memetic algorithm for the virtual network mapping problem”, Johannes Inführ and Günther Raidl consider the problem of network virtualization of the Internet, where multiple virtual networks, each adapted to a specific application class, are embedded on demand into the physical network. To map the different virtual networks, with their resource requirements, onto the available physical network, they introduce a memetic algorithm that significantly outperforms the previously best algorithms. They also analyse the influence of different problem representations and, in particular, the implementation of a uniform crossover for the grouping genetic algorithm, which may also be of interest outside the virtual network mapping domain.

The article Irawan et al. (2015) “Hybrid meta-heuristics with VNS and exact methods: application to large unconditional and conditional vertex p-centre problems” is from Chandra Ade Irawan, Said Salhi, and Zvi Drezner. The authors consider the problem of optimally locating p facilities among a set of potential sites and assigning demand points to these facilities so as to minimize the maximum distance between demand points and their nearest facility. Two new hybrid heuristics are proposed and experimentally compared. The first method is a three-stage approach that first solves an aggregated version of the problem, then applies a variable neighborhood search that makes use of the potential sites identified in the first stage, and finally applies a post-optimization procedure. In the second approach, these three stages are applied more flexibly in an intertwined way, resulting in a guided multi-start method.

In the article Mendoza et al. (2015) “A hybrid metaheuristic for the vehicle routing problem with stochastic demand and duration constraints” by Jorge E. Mendoza, Louis-Martin Rousseau, and Juan G. Villegas, the vehicle routing problem with stochastic demands (VRPSD) is considered, where the goal is to generate routes with minimal expected travel time to satisfy a set of customers whose random demands follow known probability distributions. Due to the uncertainty in demands, the route duration becomes a random variable. The authors present two strategies to deal with route-duration constraints: first, the individual route duration constraints are handled as chance constraints, and second, the expected violations of the duration constraint are penalized in the objective function. The authors propose a greedy randomized adaptive search procedure (GRASP) enhanced with heuristic concentration and provide extensive experimental results.

In the article Otsuki and Aihara (2014) “New variable depth local search for multiple depot vehicle scheduling problems” by Tomoshi Otsuki and Kazuyuki Aihara, the authors consider the problem of scheduling given trips from specified departure points to arrival points at certain times using a fleet of vehicles belonging to different depots. For this well-known problem, an iterative deepening local search is first described, which is then extended by effective pruning techniques into a variable depth search. Computational results are remarkable, especially when the running time is strongly limited.

In the article Pranzo and Pacciarelli (2015) “An iterated greedy metaheuristic for the blocking job shop scheduling problem” by Marco Pranzo and Dario Pacciarelli, the authors consider a job shop scheduling problem with blocking constraints, which model the absence of buffers (zero buffer), whereas in the traditional job shop scheduling model buffers have infinite capacity. The authors consider two variants of the problem, one in which swaps are allowed and one in which they are not. They propose an Iterated Greedy (IG) algorithm to solve both variants. The experimental comparison with recently published results shows that the proposed iterated greedy algorithm outperforms other state-of-the-art algorithms on benchmark instances.

The article Truchet et al. (2015) “Estimating parallel runtimes for randomized algorithms in constraint solving” by Charlotte Truchet, Alejandro Arbelaez, Florian Richoux, and Philippe Codognet proposes a framework for estimating the parallel performance of an algorithm by analyzing the runtime behavior of its sequential version. Experimental results indicate that the proposed framework estimates the runtime of the parallel algorithm with great accuracy.

The article Xavier and Xavier (2014) “Flying elephants: a general method for solving non-differentiable problems” by Adilson Elias Xavier and Vinicius Layter Xavier proposes a generalization of the Hyperbolic Smoothing approach. Computational experiments on distance geometry, covering, clustering, Fermat–Weber, and hub location problems indicate that the proposed method has good performance and robustness.

Finally, we thank all authors who submitted their manuscripts to this special issue and the reviewers for their hard work in the reviewing process.