
Parametric Optimization of Reconfigurable Designs Using Machine Learning

  • Maciej Kurek
  • Tobias Becker
  • Wayne Luk
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7806)

Abstract

This paper presents a novel technique that uses meta-heuristics and machine learning to automate the optimization of design parameters for reconfigurable designs. Traditionally, such optimization involves manual application analysis as well as the creation of models and parameter space exploration tools. We develop a Machine Learning Optimizer (MLO) to automate this process. From a number of benchmark executions, we automatically derive the characteristics of the parameter space and create a surrogate fitness function through regression and classification. Based on this surrogate model, design parameters are optimized with meta-heuristics. We evaluate our approach using two case studies, showing that the number of benchmark evaluations can be reduced by up to 85% compared to previously performed manual optimization.
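The sketch below illustrates the general surrogate-assisted workflow described above: a regression model (here a Gaussian process) is trained on a few expensive benchmark executions, particle swarm optimization searches the cheap surrogate, and the best candidate is then evaluated on the real benchmark and fed back into the model. It is a minimal illustration only, not the authors' MLO tool; the benchmark function, bounds, and PSO constants are placeholder assumptions, and the paper's classification of invalid parameter regions is omitted.

    # Minimal surrogate-assisted optimization sketch (illustrative, not the MLO tool).
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor

    def benchmark(x):
        # Placeholder for an expensive design evaluation (e.g. a bitstream
        # build plus a benchmark run); here a cheap analytic stand-in.
        return float(np.sum((x - 3.0) ** 2))

    rng = np.random.default_rng(0)
    dim, lo, hi = 2, 0.0, 10.0

    # Initial benchmark executions used to train the surrogate.
    X = rng.uniform(lo, hi, size=(10, dim))
    y = np.array([benchmark(x) for x in X])

    for it in range(5):
        surrogate = GaussianProcessRegressor().fit(X, y)

        # Particle swarm optimization over the cheap surrogate model.
        pos = rng.uniform(lo, hi, size=(20, dim))
        vel = np.zeros_like(pos)
        pbest, pbest_val = pos.copy(), surrogate.predict(pos)
        gbest = pbest[np.argmin(pbest_val)]
        for _ in range(50):
            r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
            vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
            pos = np.clip(pos + vel, lo, hi)
            val = surrogate.predict(pos)
            better = val < pbest_val
            pbest[better], pbest_val[better] = pos[better], val[better]
            gbest = pbest[np.argmin(pbest_val)]

        # Evaluate the candidate on the real benchmark and refine the model.
        X = np.vstack([X, gbest])
        y = np.append(y, benchmark(gbest))
        print(f"iteration {it}: best measured fitness = {y.min():.4f}")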

Keywords

Optimization · Surrogate modeling · PSO · GP · SVM · FPGA



Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • Maciej Kurek (1)
  • Tobias Becker (1)
  • Wayne Luk (1)
  1. Department of Computing, Imperial College London, UK
