
A Hybrid CS–GSA Algorithm for Optimization

  • Chapter in:
Hybrid Soft Computing Approaches

Part of the book series: Studies in Computational Intelligence (SCI, volume 611)

Abstract

This chapter presents a hybrid population-based Cuckoo search–Gravitational search algorithm (CS–GSA) for optimization. The central idea is to bring the exploration capability of the Gravitational search algorithm (GSA) into the Cuckoo search (CS) algorithm, which is known chiefly for its exploitation behaviour. A further motivation is to obtain faster and more stable solutions. Twenty-three standard test functions are used to compare the performance of the hybrid algorithm with that of CS and GSA. Extensive simulation results show that the proposed algorithm outperforms both CS and GSA, converging faster and reaching the best solutions with significantly fewer function evaluations. The chapter also explains, with suitable examples, how to handle constrained optimization problems.
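The hybridization described above can be illustrated in code. The following is a minimal sketch, not the authors' exact update rule: the step scaling, the decaying gravitational constant `G0 * exp(-20 t / T)`, the abandonment fraction `pa`, and the way the mass-weighted gravity term is added to the Lévy-flight move are all illustrative assumptions.

```python
import numpy as np


def levy_step(beta=1.5, size=1):
    """Mantegna's algorithm for Levy-distributed step lengths (CS-style)."""
    from math import gamma, sin, pi
    sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = np.random.normal(0, sigma, size)
    v = np.random.normal(0, 1, size)
    return u / np.abs(v) ** (1 / beta)


def hybrid_cs_gsa(f, dim, n=25, iters=200, lb=-100.0, ub=100.0,
                  pa=0.25, G0=100.0, seed=0):
    """Generic CS-GSA hybrid sketch: Levy-flight moves biased by a
    gravity-like pull toward better (heavier) agents. Illustrative only."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, (n, dim))
    fit = np.apply_along_axis(f, 1, X)
    best = X[fit.argmin()].copy()
    for t in range(iters):
        G = G0 * np.exp(-20 * t / iters)            # decaying gravitational constant
        worst, bst = fit.max(), fit.min()
        m = (worst - fit) / (worst - bst + 1e-12)   # GSA-style masses
        M = m / (m.sum() + 1e-12)
        for i in range(n):
            # gravitational pull of all agents on agent i
            acc = np.zeros(dim)
            for j in range(n):
                if j != i:
                    d = np.linalg.norm(X[j] - X[i]) + 1e-12
                    acc += rng.random() * G * M[j] * (X[j] - X[i]) / d
            # CS-style Levy flight around the current best, plus the pull
            step = 0.01 * levy_step(size=dim) * (X[i] - best)
            cand = np.clip(X[i] + step + acc, lb, ub)
            fc = f(cand)
            if fc < fit[i]:                         # greedy replacement
                X[i], fit[i] = cand, fc
        # abandon a fraction pa of the worst nests (CS discovery step)
        k = max(1, int(pa * n))
        for i in np.argsort(fit)[-k:]:
            X[i] = rng.uniform(lb, ub, dim)
            fit[i] = f(X[i])
        best = X[fit.argmin()].copy()
    return best, float(fit.min())
```

On a smooth unimodal function such as the sphere model, the Lévy flights supply large exploratory jumps early on, while the mass-weighted pull concentrates the population around the current best solution.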


References

  1. Du W, Li B (2008) Multi-strategy ensemble particle swarm optimization for dynamic optimization. Inf Sci 178:3096–3109
  2. Panda R, Naik MK (2012) A crossover bacterial foraging optimization algorithm. Appl Comput Intell Soft Comput 1–7, Hindawi
  3. Mastorakis NE, Gonos IF, Swamy MNS (2003) Design of two-dimensional recursive filters using genetic algorithm. IEEE Trans Circuits Syst I Fundam Theory Appl 50:634–639
  4. Panda R, Naik MK (2013) Design of two-dimensional recursive filters using bacterial foraging optimization. In: Proceedings of the 2013 IEEE symposium on swarm intelligence (SIS), pp 188–193
  5. Cordon O, Damas S, Santamari J (2006) A fast and accurate approach for 3D image registration using the scatter search evolutionary algorithm. Pattern Recogn Lett 26:1191–1200
  6. Panda R, Agrawal S, Bhuyan S (2013) Edge magnitude based multilevel thresholding using cuckoo search technique. Expert Syst Appl 40:7617–7628
  7. Panda R, Naik MK, Panigrahi BK (2011) Face recognition using bacterial foraging strategy. Swarm Evol Comput 1:138–146
  8. Liu C, Wechsler H (2000) Evolutionary pursuit and its application to face recognition. IEEE Trans Pattern Anal Mach Intell 22:570–582
  9. Zheng WS, Lai JH, Yuen PC (2005) GA-Fisher: a new LDA-based face recognition algorithm with selection of principal components. IEEE Trans Syst Man Cybern Part B 35:1065–1078
  10. Mitchell M (1998) An introduction to genetic algorithms. MIT Press, Cambridge
  11. Dorigo M, Maniezzo V, Colorni A (1996) The ant system: optimization by a colony of cooperating agents. IEEE Trans Syst Man Cybern Part B 26:29–41
  12. Kennedy J, Eberhart RC (1995) Particle swarm optimization. In: Proceedings of the IEEE international conference on neural networks, vol 4, pp 1942–1948
  13. Gazi V, Passino KM (2004) Stability analysis of social foraging swarms. IEEE Trans Syst Man Cybern Part B 34:539–557
  14. Yang XS, Deb S (2009) Cuckoo search via Lévy flights. In: Proceedings of the world congress on nature and biologically inspired computing (NaBIC 2009), pp 210–214
  15. Yang XS, Deb S (2013) Cuckoo search: recent advances and applications. Neural Comput Appl 24(1):169–174
  16. Cuckoo Search and Firefly Algorithm. http://link.springer.com/book/10.1007%2F978-3-319-02141-6
  17. Civicioglu P, Besdok E (2011) A conceptual comparison of the cuckoo-search, particle swarm optimization, differential evolution and artificial bee colony algorithms. Artif Intell Rev. doi:10.1007/s10462-011-9276-0
  18. Chakraverty S, Kumar A (2011) Design optimization for reliable embedded system using cuckoo search. In: Proceedings of the international conference on electronics, computer technology, pp 164–268
  19. Barthelemy P, Bertolotti J, Wiersma DS (2008) A Lévy flight for light. Nature 453:495–498
  20. Rashedi E, Nezamabadi-pour H, Saryazdi S (2009) GSA: a gravitational search algorithm. Inf Sci 179:2232–2248
  21. Yao X, Liu Y, Lin G (1999) Evolutionary programming made faster. IEEE Trans Evol Comput 3:82–102
  22. Chetty S, Adewumi AO (2014) Comparison study of swarm intelligence techniques for annual crop planning problem. IEEE Trans Evol Comput 18:258–268
  23. Chen J-F, Do QH (2014) Training neural networks to predict student academic performance: a comparison of cuckoo search and gravitational search algorithms. Int J Comput Intell Appl 13(1):1450005
  24. Swain KB, Solanki SS, Mahakula AK (2014) Bio-inspired cuckoo search algorithm based neural network and its application to noise cancellation. In: Proceedings of the international conference on signal processing and integrated networks (SPIN), pp 632–635
  25. Khodier M (2013) Optimisation of antenna arrays using the cuckoo search algorithm. IET Microwaves Antennas Propag 7(6):458–464
  26. Zhao P, Li H (2012) Opposition-based cuckoo search algorithm for optimization problems. In: Proceedings of the 2012 fifth international symposium on computational intelligence and design, pp 344–347
  27. Saha SK, Kar R, Mandal D, Ghosal SP (2013) Gravitational search algorithm: application to the optimal IIR filter design. Journal of King Saud University, 1–13
  28. Rashedi E, Nezamabadi-pour H, Saryazdi S (2011) Filter modeling using gravitational search algorithm. Eng Appl Artif Intell 24:117–122
  29. Rashedi E, Nezamabadi-pour H, Saryazdi S (2011) Disruption: a new operator in gravitational search algorithm. Sci Iranica D 18:539–548
  30. Doraghinejad M, Nezamabadi-pour H, Sadeghian AH, Maghfoori M (2012) A hybrid algorithm based on gravitational search algorithm for unimodal optimization. In: Proceedings of the 2nd international conference on computer and knowledge engineering (ICCKE), pp 129–132
  31. Yazdani S, Nezamabadi-pour H, Kamyab S (2013) A gravitational search algorithm for multimodal optimization. Swarm Evol Comput 1–14
  32. Rashedi E, Nezamabadi-pour H, Saryazdi S (2009) BGSA: binary gravitational search algorithm. Nat Comput 9(3):727–745
  33. Mirjalili S, Hashim SZM (2010) A new hybrid PSOGSA algorithm for function optimization. In: Proceedings of the 2010 international conference on computer and information application, pp 374–377
  34. Jiang S, Ji Z, Shen Y (2014) A novel hybrid particle swarm optimization and gravitational search algorithm for solving economic emission load dispatch problems with various practical constraints. Electr Power Energy Syst 55:628–644
  35. Ghodrati A, Lotfi S (2012) A hybrid CS/PSO algorithm for global optimization. Lect Notes Comput Sci 7198:89–98
  36. Guo Z (2012) A hybrid optimization algorithm based on artificial bee colony and gravitational search algorithm. Int J Digit Content Technol Appl 6(17):620–626
  37. Sun G, Zhang A (2013) A hybrid genetic algorithm and gravitational search algorithm for image segmentation using multilevel thresholding. Pattern Recognit Image Anal 7887:707–714
  38. Yin M, Hu Y, Yang F, Li X, Gu W (2011) A novel hybrid K-harmonic means and gravitational search algorithm approach for clustering. Expert Syst Appl 38:9319–9324
  39. Liu H, Cai Z, Wang Y (2010) Hybridizing particle swarm optimization with differential evolution for constrained numerical and engineering optimization. Appl Soft Comput 10:629–640
  40. Sarangi SK, Panda R, Dash M (2014) Design of 1-D and 2-D recursive filters using crossover bacterial foraging and cuckoo search techniques. Eng Appl Artif Intell 34:109–121
  41. He J (2008) An experimental study on the self-adaption mechanism used by evolutionary programming. Prog Nat Sci 10:167–175
  42. Ji M (2004) A single point mutation evolutionary programming. Inf Process Lett 90:293–299


Author information

Correspondence to Manoj Kumar Naik.


Appendix: Benchmark Functions


  a.

    Sphere Model

    $$F_{1} \left( {\rm X} \right) = \sum\nolimits_{i = 1}^{n} {x_{i}^{2} } , \, - 95 \le x_{i} \le 95,{\text{ and min}}\left( {F_{1} } \right) = F_{1} \left( {0, \ldots ,0} \right) = 0$$
  b.

    Schwefel’s Problem 2.22 [21, 42]

    $$F_{2} \left( {\rm X} \right) = \sum\nolimits_{i = 1}^{n} {\left| {x_{i} } \right| + \prod\nolimits_{i = 1}^{n} {\left| {x_{i} } \right|} } , \, - 12 \le x_{i} \le 12 , {\text{ and min}}\left( {F_{2} } \right) = F_{2} \left( {0, \ldots ,0} \right) = 0$$
  c.

    Schwefel’s Problem 1.2

    $$F_{3} \left( {\rm X} \right) = \sum\nolimits_{i = 1}^{n} {\left( {\sum\nolimits_{j = 1}^{i} {x_{j} } } \right)^{2} } , { } - 90 \le x_{i} \le 90,{\text{ and min}}\left( {F_{3} } \right) = F_{3} \left( {0, \ldots ,0} \right) = 0$$
  d.

    Schwefel’s Problem 2.21

    $$F_{4} \left( {\rm X} \right) = \mathop {\hbox{max} }\limits_{i} \left\{ {\left| {x_{i} } \right|,1 \le i \le n} \right\} , { } - 90 \le x_{i} \le 90,{\text{ and min}}\left( {F_{4} } \right) = F_{4} \left( {0, \ldots ,0} \right) = 0$$
  e.

    Generalized Rosenbrock’s Function

    $$\begin{aligned} & F_{5} \left( {\rm X} \right) = \sum\nolimits_{i = 1}^{n - 1} {\left[ {100\left( {x_{i + 1} - x_{i}^{2} } \right)^{2} + \left( {x_{i} - 1} \right)^{2} } \right]} , { } - 30 \le x_{i} \le 30 \\ & { \hbox{min} }\left( {F_{5} } \right) = F_{5} \left( {1, \ldots ,1} \right) = 0. \\ \end{aligned}$$
  f.

    Step Function

    $$F_{6} \left( {\rm X} \right) = \sum\nolimits_{i = 1}^{n} {\left( {\left\lfloor {x_{i} + 0.5} \right\rfloor } \right)^{2} } , { } - 100 \le x_{i} \le 100,{\text{ and min}}\left( {F_{6} } \right) = F_{6} \left( {0, \ldots ,0} \right) = 0.$$
  g.

    Quartic Function (i.e., Noise)

    $$\begin{aligned} & F_{7} \left( {\rm X} \right) = \sum\nolimits_{i = 1}^{n} i x_{i}^{4} + \text{random}\left[ {0,1} \right), \, - 1.28 \le x_{i} \le 1.28 \\ & { \hbox{min} }\left( {{\text{F}}_{ 7} } \right) = F_{7} \left( {0, \ldots ,0} \right) = 0 \\ \end{aligned}$$
  h.

    Generalized Rastrigin’s Function

    $$\begin{aligned} & F_{8} \left( {\rm X} \right) = \sum\nolimits_{i = 1}^{n} {\left[ {x_{i}^{2} - 10\cos \left( {2\pi x_{i} } \right) + 10} \right] , { } - 5.12 \le x_{i} \le 5.12} \\ & \hbox{min} \left( {F_{8} } \right) = F_{8} \left( {0, \ldots ,0} \right) = 0. \\ \end{aligned}$$
  i.

    Ackley’s Function

    $$\begin{aligned} & F_{9} \left( {\rm X} \right) = - 20\exp \left( { - 0.2\sqrt {\frac{1}{n}\sum\nolimits_{i = 1}^{n} {x_{i}^{2} } } } \right) - \, \exp \left( {\frac{1}{n}\sum\nolimits_{i = 1}^{n} {\cos \left( {2\pi x_{i} } \right)} } \right) + 20 + e \\ & - 32 \le x_{i} \le 32,{\text{ and min}}\left( {F_{9} } \right) = F_{9} \left( {0, \ldots ,0} \right) = 0. \\ \end{aligned}$$
  j.

    Generalized Griewank Function

    $$\begin{aligned} & F_{10} \left( {\rm X} \right) = \frac{1}{4000}\sum\nolimits_{i = 1}^{n} {x_{i}^{2} - \prod\limits_{i = 1}^{n} {\cos \left( {\frac{{x_{i} }}{\sqrt i }} \right) + 1} } , { } - 600 \le x_{i} \le 600 \\ & \hbox{min} \left( {F_{10} } \right) = F_{10} \left( {0, \ldots ,0} \right) = 0. \\ \end{aligned}$$
  k.

    Generalized Penalized Function 1

    $$\begin{aligned} F_{11} \left( {\rm X} \right) = & \frac{\pi }{n}\left\{ {10\sin^{2} \left( {\pi y_{1} } \right) + \sum\nolimits_{i = 1}^{n - 1} {\left( {y_{i} - 1} \right)^{2} \left[ {1 + 10\sin^{2} \left( {\pi y_{i + 1} } \right)} \right]} + \left( {y_{n} - 1} \right)^{2} } \right\} \\ + & \sum\nolimits_{i = 1}^{n} {u\left( {x_{i} ,10,100,4} \right)} , \\ \end{aligned}$$

    where

    $$\begin{aligned} & u\left( {x_{i} ,a,k,m} \right) = \left\{ {\begin{array}{*{20}c} {k\left( {x_{i} - a} \right)^{m} , \, x_{i} > a} \\ {0,{ - }a < x_{i} < a,} \\ {k\left( { - x_{i} - a} \right)^{m} , \, x_{i} < - a} \\ \end{array} } \right.{\text{ and }}y_{i} = 1 + \frac{1}{4}\left( {x_{i} + 1} \right) \\ & - 50 \le x_{i} \le 50,{\text{ and min}}\left( {F_{11} } \right) = F_{11} \left( {1, \ldots ,1} \right) = 0. \\ \end{aligned}$$
  l.

    Generalized Penalized Function 2

    $$\begin{aligned} F_{12} \left( {\rm X} \right) = & 0.1\left\{ {\sin^{2} \left( {3\pi x_{1} } \right) + \sum\nolimits_{i = 1}^{n - 1} {\left( {x_{i} - 1} \right)^{2} \left[ {1 + \sin^{2} \left( {3\pi x_{i + 1} } \right)} \right]} + \left( {x_{n} - 1} \right)^{2} \left[ {1 + \sin^{2} \left( {2\pi x_{n} } \right)} \right]} \right\} \\ + & \sum\nolimits_{i = 1}^{n} {u\left( {x_{i} ,5,100,4} \right)} , \\ \end{aligned}$$

    where

    $$\begin{aligned} & u\left( {x_{i} ,a,k,m} \right) = \left\{ {\begin{array}{*{20}c} {k\left( {x_{i} - a} \right)^{m} , \, x_{i} > a} \\ {0, \, - a \le x_{i} \le a} \\ {k\left( { - x_{i} - a} \right)^{m} , \, x_{i} < - a} \\ \end{array} } \right. \\ & - 50 \le x_{i} \le 50,{\text{ and min}}\left( {F_{12} } \right) = F_{12} \left( {1, \ldots ,1} \right) = 0. \\ \end{aligned}$$
  m.

    Generalized Schwefel’s Problem 2.26

    $$\begin{aligned} & F_{13} \left( {\rm X} \right) = \sum\nolimits_{i = 1}^{30} { - x_{i} \sin \left( {\sqrt {\left| {x_{i} } \right|} } \right)} , \, - 500 \le x_{i} \le 500 \\ & \hbox{min} \left( {F_{13} } \right) = F_{13} \left( {420.9687, \ldots ,420.9687} \right) = - 12569.5. \\ \end{aligned}$$
  n.

    Shekel’s Foxholes Function

    $$\begin{aligned} & F_{14} \left( {\rm X} \right) = \left( {\frac{1}{500} + \sum\nolimits_{j = 1}^{25} {\frac{1}{{j + \sum\nolimits_{i = 1}^{2} {\left( {x_{i} - a_{\text{ij}} } \right)^{6} } }}} } \right)^{ - 1} ,- 65.536 \le x_{i} \le 65.536 \\ & \hbox{min} \left( {F_{14} } \right) = F_{14} \left( { - 32, - 32} \right) \approx 1, \\ \end{aligned}$$

    where

    $$a_{\text{ij}} = \left( {\begin{array}{*{20}l} { - 32} \hfill & { - 16} \hfill & 0 \hfill & {16} \hfill & {32} \hfill & { - 32} \hfill & \cdots \hfill & 0 \hfill & {16} \hfill & {32} \hfill \\ { - 32} \hfill & { - 32} \hfill & { - 32} \hfill & { - 32} \hfill & { - 32} \hfill & { - 16} \hfill & \cdots \hfill & {32} \hfill & {32} \hfill & {32} \hfill \\ \end{array} } \right).$$
  o.

    Kowalik’s Function

    $$F_{15} \left( {\rm X} \right) = \sum\nolimits_{i = 1}^{11} {\left[ {a_{i} - \frac{{x_{1} \left( {b_{i}^{2} + b_{i} x_{2} } \right)}}{{b_{i}^{2} + b_{i} x_{3} + x_{4} }}} \right]^{2} , { } - 5 \le x_{i} \le 5}$$

    \(\hbox{min} \left( {F_{15} } \right) \approx F_{15} \left( {0.1928,0.1908,0.1231,0.1358} \right) \approx 0.0003075\). The coefficients are displayed in Table 14 [21, 41, 42].

    Table 14 Kowalik’s function F15
  p.

    Six-Hump Camel-Back Function

    $$F_{16} \left( {\rm X} \right) = 4x_{1}^{2} - 2.1x_{1}^{4} + \frac{1}{3}x_{1}^{6} + x_{1} x_{2} - 4x_{2}^{2} + 4x_{2}^{4} , \, - 5 \le x_{i} \le 5$$
    $$\begin{aligned} & X_{\hbox{min} } = \left( {0.08983, - 0.7126} \right),\left( { - 0.08983,0.7126} \right) \\ & \hbox{min} \left( {F_{16} } \right) = - 1.0316285. \\ \end{aligned}$$
  q.

    Branin Function

    $$\begin{aligned} & F_{17} \left( {\rm X} \right) = \left( {x_{2} - \frac{5.1}{{4\pi^{2} }}x_{1}^{2} + \frac{5}{\pi }x_{1} - 6} \right)^{2} + 10\left( {1 - \frac{1}{8\pi }} \right)\cos x_{1} + 10 \\ & - 5 \le x_{1} \le 10, \, 0 \le x_{2} \le 15 \\ \end{aligned}$$
    $$\begin{aligned} & X_{\hbox{min} } = \left( { - 3.142,12.275} \right),\left( {3.142,2.275} \right),\left( {9.425,2.425} \right) \\ & \hbox{min} \left( {F_{17} } \right) = 0.398. \\ \end{aligned}$$
  r.

    Goldstein-Price Function

    $$\begin{aligned} F_{18} \left( {\rm X} \right) = & \left[ {1 + \left( {x_{1} + x_{2} + 1} \right)^{2} \left( {19 - 14x_{1} + 3x_{1}^{2} - 14x_{2} + 6x_{1} x_{2} + 3x_{2}^{2} } \right)} \right] \\ & \times \left[ {30 + \left( {2x_{1} - 3x_{2} } \right)^{2} \left( {18 - 32x_{1} + 12x_{1}^{2} + 48x_{2} - 36x_{1} x_{2} + 27x_{2}^{2} } \right)} \right] \\ & - 2 \le x_{i} \le 2,{\text{ and }}\hbox{min} \left( {F_{18} } \right) = F_{18} \left( {0, - 1} \right) = 3. \\ \end{aligned}$$
  s.

    Hartman’s Family

    $$F\left( {\rm X} \right) = - \sum\limits_{i = 1}^{4} {c_{i} \exp \left( { - \sum\limits_{j = 1}^{n} {a_{\text{ij}} \left( {x_{j} - p_{\text{ij}} } \right)^{2} } } \right)} , \, 0 \le x_{j} \le 1, \, n = 3,6$$

    for F19(X) and F20(X), respectively. \(X_{\hbox{min} } {\text{ of }}F_{19} = \left( {0.114,0.556,0.852} \right)\), and \(\hbox{min} \left( {F_{19} } \right) = - 3.86\). \(X_{\hbox{min} } {\text{ of }}F_{20} = \left( {0.201,0.150,0.477,0.275,0.311,0.657} \right)\), and \(\hbox{min} \left( {F_{20} } \right) = - 3.32\).

    The coefficients are shown in Tables 15 and 16, respectively.

    Table 15 Hartman function F19
    Table 16 Hartman function F20
  t.

    Shekel’s Family

    $$F\left( X \right) = - \sum\nolimits_{i = 1}^{m} {\left[ {\left( {X - a_{i} } \right)\left( {X - a_{i} } \right)^{T} + c_{i} } \right]^{ - 1} } , \, m = 5,7,{\text{ and }}10{\text{ for }}F_{21} ,F_{22} ,{\text{ and }}F_{23} ;\quad 0 \le x_{j} \le 10,\,x_{{{\text{local\_optima}}}} \approx a_{i} ,\,{\text{and}}\,F\left( {x_{{{\text{local\_optima}}}} } \right) \approx \frac{1}{{c_{i} }}\,{\text{for}}\,1 \le i \le m.$$

    These functions have five, seven, and ten local minima for \(F_{21} ,F_{22} , {\text{ and }}F_{23}\), respectively. The coefficients are shown in Table 17.

    Table 17 Shekel function F21, F22, and F23
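Several of the unconstrained benchmarks above transcribe directly into code. The following short sketch (the function names are ours) implements a few of them; each evaluates to its known minimum at the minimizer stated in the appendix.

```python
import numpy as np


def sphere(x):                      # F1: min 0 at the origin
    x = np.asarray(x, float)
    return float(np.sum(x ** 2))


def rosenbrock(x):                  # F5: min 0 at (1, ..., 1)
    x = np.asarray(x, float)
    return float(np.sum(100 * (x[1:] - x[:-1] ** 2) ** 2 + (x[:-1] - 1) ** 2))


def rastrigin(x):                   # F8: min 0 at the origin
    x = np.asarray(x, float)
    return float(np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x) + 10))


def ackley(x):                      # F9: min 0 at the origin
    x = np.asarray(x, float)
    n = x.size
    return float(-20 * np.exp(-0.2 * np.sqrt(np.sum(x ** 2) / n))
                 - np.exp(np.sum(np.cos(2 * np.pi * x)) / n) + 20 + np.e)


def griewank(x):                    # F10: min 0 at the origin
    x = np.asarray(x, float)
    i = np.arange(1, x.size + 1)
    return float(np.sum(x ** 2) / 4000 - np.prod(np.cos(x / np.sqrt(i))) + 1)
```

Such direct transcriptions make it straightforward to reproduce the comparison experiments: each optimizer is run on the same function handles, and only the number of function evaluations is tallied.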

Copyright information

© 2016 Springer India

About this chapter

Cite this chapter

Naik, M.K., Samantaray, L., Panda, R. (2016). A Hybrid CS–GSA Algorithm for Optimization. In: Bhattacharyya, S., Dutta, P., Chakraborty, S. (eds) Hybrid Soft Computing Approaches. Studies in Computational Intelligence, vol 611. Springer, New Delhi. https://doi.org/10.1007/978-81-322-2544-7_1


  • DOI: https://doi.org/10.1007/978-81-322-2544-7_1


  • Publisher Name: Springer, New Delhi

  • Print ISBN: 978-81-322-2543-0

  • Online ISBN: 978-81-322-2544-7

  • eBook Packages: Engineering (R0)
