Boosted Incremental Nelder-Mead Simplex Algorithm: Distributed Regression in Wireless Sensor Networks

  • Parisa Jalili Marandi
  • Nasrollah Moghadam Charkari
Part of the IFIP International Federation for Information Processing book series (IFIPAICT, volume 284)

Wireless sensor networks (WSNs) have attracted great interest in academia and industry in recent years due to their diverse applications. The main goal of a WSN is data collection, and as the amount of collected data grows, techniques for analyzing it in-network become essential. In this paper, we propose an in-network optimization algorithm based on the Nelder-Mead simplex method to perform regression analysis incrementally over distributed data. We then improve the resulting regressor by applying the boosting concept from machine-learning theory. Simulation results show that the proposed algorithm not only increases accuracy but is also more communication-efficient than its gradient-based counterparts.
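The paper gives no listing here, but the core idea of the approach (fitting regression parameters by Nelder-Mead simplex search instead of gradient descent) can be sketched as follows. This is a minimal, centralized illustration in Python, not the authors' distributed algorithm; the simplex coefficients, step sizes, and the toy dataset are illustrative assumptions:

```python
def nelder_mead(f, x0, iters=200, alpha=1.0, gamma=2.0, rho=0.5, sigma=0.5):
    """Simplified Nelder-Mead: reflection, expansion, contraction, shrink."""
    n = len(x0)
    simplex = [list(x0)]
    for i in range(n):                      # build an initial simplex around x0
        p = list(x0)
        p[i] += 0.5
        simplex.append(p)
    for _ in range(iters):
        simplex.sort(key=f)
        worst = simplex[-1]
        cen = [sum(p[i] for p in simplex[:-1]) / n for i in range(n)]
        refl = [cen[i] + alpha * (cen[i] - worst[i]) for i in range(n)]
        if f(refl) < f(simplex[0]):         # reflection beats the best: try expanding
            exp = [cen[i] + gamma * (refl[i] - cen[i]) for i in range(n)]
            simplex[-1] = exp if f(exp) < f(refl) else refl
        elif f(refl) < f(simplex[-2]):      # plain reflection is good enough
            simplex[-1] = refl
        else:                               # contract toward the worst vertex
            con = [cen[i] + rho * (worst[i] - cen[i]) for i in range(n)]
            if f(con) < f(worst):
                simplex[-1] = con
            else:                           # last resort: shrink toward the best
                best = simplex[0]
                simplex = [best] + [
                    [best[i] + sigma * (p[i] - best[i]) for i in range(n)]
                    for p in simplex[1:]
                ]
    return min(simplex, key=f)

# Fit y = a*x + b to hypothetical sensor readings by minimizing squared error.
data = [(x, 2.0 * x + 1.0) for x in range(10)]
sse = lambda th: sum((th[0] * x + th[1] - y) ** 2 for x, y in data)
a_hat, b_hat = nelder_mead(sse, [0.0, 0.0])
```

In the paper's incremental setting, each node would instead run such simplex updates against its local loss and pass the current parameters to the next node along a cycle through the network; boosting then reweights poorly fitted samples across rounds to improve the combined regressor.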


Keywords: Wireless Sensor Network, Root Mean Square, Mobile Networking, Hamiltonian Cycle, Fusion Center



Copyright information

© International Federation for Information Processing 2008

Authors and Affiliations

  • Parisa Jalili Marandi (1)
  • Nasrollah Moghadam Charkari (1)
  1. Parallel Processing Lab, Electrical and Computer Engineering Department, Faculty of Engineering, Tarbiat Modares University, Iran
