Learning Effectiveness Enhancement Project “LEEP”

  • Riadh Besbes
Part of the Lecture Notes in Educational Technology book series (LNET)


The overall objective, to which the project will contribute, is to improve teaching and learning effectiveness within academic institutions by applying data mining methods to collected databases for educational knowledge extraction. These teaching and learning databases are built from quantitative measures gathered through classroom observation visits in academic institutions, online learner questionnaires and answers, analysis of written scripts from academic exams in STEM education (science, technology, engineering, and mathematics), and online entry of elementary grades from the written traces of learners' performances on STEM exams. The findings of these processes, produced by the research team within the beneficiary organizations, are disseminated through diverse publications and presented at multiple professional meetings, especially teachers' training sessions. The project's data mining strategy in an educational context will support and develop teachers' expertise, enhance and scaffold students' learning, and raise the education system's performance. The project combines data mining methods with findings from educational and cognitive science, attempting to unify two paradigms that are generally distant from each other. New strategies for educational assessment, training, and innovation are designed that can significantly enhance the effectiveness of teaching and learning in academic institutions such as secondary schools. These methods aim to identify and better understand learners' profiles, teaching practices and their characteristics, and the context in which teachers and learners act. The resulting decision-support tools are exploited by the researcher, an educational inspector and expert in educational assessment, to generate, make available, and process databases on teaching practices, learning performances, and learners' profiles.
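The profile-recognition idea described above can be illustrated with a small clustering sketch. The data, the number of profiles, and the scoring scale below are invented for illustration; the project itself does not specify an algorithm, so a minimal k-means over learners' STEM exam scores is used here purely as an assumed, representative technique.

```python
# Hypothetical sketch: grouping learners into performance profiles by
# clustering their exam scores. All data and parameters are invented.
import random


def kmeans(points, k, iters=20, seed=0):
    """Minimal k-means over score vectors (one list of floats per learner)."""
    rng = random.Random(seed)
    centroids = [list(p) for p in rng.sample(points, k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # Assign each learner to the nearest centroid (squared distance).
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2
                                      for a, b in zip(p, centroids[c])))
            clusters[i].append(p)
        for c in range(k):
            if clusters[c]:
                # Recompute the centroid as the mean of its members.
                centroids[c] = [sum(vals) / len(vals)
                                for vals in zip(*clusters[c])]
    return centroids, clusters


# Toy data: (math score, science score) per learner, out of 20.
scores = [[18, 17], [16, 18], [5, 6], [4, 7], [12, 11], [11, 12]]
centroids, clusters = kmeans(scores, k=3)
for centroid, members in zip(centroids, clusters):
    print([round(x, 1) for x in centroid], "->", len(members), "learners")
```

In practice such profiles would be cross-referenced with questionnaire and classroom-observation data before being used for decision support.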


Keywords: Blended learning · Assessment for learning · Knowledge extraction · Profile recognition



I would like to express my deepest appreciation to Professor Mohamed JEMNI, Director of ICT in The Arab League Educational, Cultural and Scientific Organization (ALECSO), whose assistance greatly enhanced the quality of this work.



Copyright information

© Springer Science+Business Media Singapore 2016

Authors and Affiliations

  1. Research Laboratory of Technologies of Information and Communication and Electrical Engineering (LaTICE), Tunis, Tunisia
  2. Ecole Supérieure des Sciences et Techniques de Tunis, Tunis, Tunisia
  3. Monastir, Tunisia
