Clinically Relevant Complex Systematic Reviews in Endodontics: Relevance to Comparative Effectiveness Research and Evidence-Based Clinical Decision-Making

Abstract

Clinically relevant complex systematic reviews follow the scientific process in identifying, selecting, and evaluating the best available systematic reviews and meta-analyses on any given clinical question, specifically in endodontics and, more generally, in dentistry as a model of health care. Careful consideration must be given to the tools for quantifying the best evidence and to the process of updating the analysis of the concerted available evidence. The relevance of the research metasynthesis that characterizes clinically relevant complex systematic reviews is discussed in the context of the distinct modes of clinical decision-making in comparative effectiveness vs. evidence-based analysis.

Notes

  1.

    Cf., Baruch Spinoza (1632–1677): a “thing” can be defined in se and per se: in se, meaning that its totality effectively defines what it is, and per se, indicating that it can actually define itself as the concept of itself. That is to say, the product of a research synthesis endeavor is, in its totality, a fundamental primary research product, and it so defines its own essence because it follows the scientific method: the body of techniques we agree to employ in investigating phenomena, acquiring and creating new knowledge, and correcting and integrating previous knowledge, as originally crafted by Aristotle (384 B.C.–322 B.C.), and which is the foundation of modern science.

  2.

    Some years ago, I had a dialectical discussion on science and philosophy with a humanist, world-renowned for his ground-breaking work in the linguistics and literature of the Middle Ages. His assertion that “biochemistry does not exist as a science” was as grounded in his ignorance of the fundamentals of the science of biochemistry as the assertion of many that “research synthesis is not a science” rests on their lack of awareness of the rich complexity of the science of research synthesis.

  3.

    This property is indeed not unlike what commonly occurs in the other health sciences: multiple reports by several groups of investigators may be discordant or concordant – hence the need for replicative studies, literature reviews, etc.

  4.

    The term “complex systematic review” to indicate a research meta-analysis paper (i.e., a synthesis of systematic reviews) might suggest that a “systematic review,” a research synthesis paper (i.e., a synthesis of primary research reports), is in some manner or another simple or simplistic, or at least not at all complex and complicated. Such a suggestion is misleading and far from the reality of things, and it reinforces the points made above about misconception, misinformation, misunderstanding, and, frankly, ignorance (cf., footnote 2) regarding the science of research synthesis and metasynthesis.

  5.

    Gestalt – from the German (Berlin School of Psychology, late nineteenth century), meaning the entirety of the essence or being of an entity’s complete form; e.g., the mind, viewed as holistic, parallel, and analog, with self-organizing tendencies.

  6.

    Sequential statistical hypothesis testing is not bound by a predetermined sample size. Rather, data are evaluated as they are collected, and sampling is stopped if statistical significance is obtained, based on criteria established at the onset of the study. In other words, a study may end at a much earlier stage, with a smaller sample size and at a lesser overall cost, than if a traditional power analysis had estimated the required sample size prior to the onset of the study.
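The stopping logic described in note 6 can be illustrated with a small simulation. This is a minimal sketch: the batch size, the maximum sample size, and the critical boundary z = 2.41 (roughly the Pocock constant for five interim looks at an overall α of 0.05) are hypothetical choices for illustration, not values from the chapter.

```python
import math
import random

def sequential_z_test(draw, mu0=0.0, sigma=1.0, batch=10, max_n=100, z_crit=2.41):
    """Group-sequential z test: examine the data after every batch and stop
    sampling as soon as the interim statistic crosses the pre-set boundary."""
    sample = []
    while len(sample) < max_n:
        sample.extend(draw() for _ in range(batch))
        mean = sum(sample) / len(sample)
        z = (mean - mu0) / (sigma / math.sqrt(len(sample)))
        if abs(z) >= z_crit:
            return len(sample), z, True   # boundary crossed: stop early
    return len(sample), z, False          # boundary never crossed at any look

random.seed(42)
# With a true effect of 0.8 SD, the boundary is typically crossed
# long before the maximum sample size of 100 is reached.
n_used, z, stopped = sequential_z_test(lambda: random.gauss(0.8, 1.0))
```

With these settings the simulated study usually stops after only a few batches, illustrating the savings in sample size and cost that the note describes.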

  7.

    Luc de Clapiers, Marquis de Vauvenargues (1715–1747), stated in Réflexions et Maximes that “…il est plus aisé de dire des choses nouvelles que de concilier celles qui ont été dites” (it is easier to say new things than to reconcile those things that have already been said). John William Strutt, 3rd Baron Rayleigh (1842–1919), said: “The work which deserves, but I am afraid does not always receive, the most credit is that in which discovery and explanation go hand in hand, in which not only are new facts presented, but their relation to old ones is pointed out.”

  8.

    One could conceive of the following: the (simple) design of research synthesis yields the (simple) systematic review by pooling and evaluating primary research (e.g., clinical trials); the (complex) research synthesis, which we have called research metasynthesis, yields the complex systematic review, also termed CRCSR, and results from pooling and evaluating existing systematic reviews. A compounded or mixed-form research synthesis would in turn yield a clinically relevant mixed systematic review (CRMSR), pooling and evaluating together both existing systematic reviews and primary research (e.g., clinical trials) that has not yet been incorporated in any existing systematic review. CRMSRs promise to be the most arduous and challenging of the research synthesis designs because they must coalesce two fundamentally divergent research entities: primary research reports (e.g., clinical trials) and secondary research reports (i.e., systematic reviews). Study validity issues will be particularly difficult to untangle in CRMSRs, and special care must be given not only to the sampling and measurement processes but also to the data analysis component. Case in point: the meta-analysis of a CRMSR will involve a notably greater level of difficulty than even the Bayesian meta-analytical inference required for CRCSRs.

  9.

    Vide infra, Part III, 1 and III, 2.

  10.

    Rather than inquisitive.

  11.

    Vide supra, Part I.

  12.

    The “bibliome” is the body of pertinent research literature available for any given systematic review [16].

  13.

    That is, purely on statistical grounds: if, following acceptable sampling and homogeneity analyses, the number of papers fed into a meta-analysis is less than, say, 2 or 3, one may question the validity of the generated forest plot.

  14.

    The process of random sampling can be expected to have exactly the same effects when one randomly samples literature in a research synthesis design, as when one obtains a random sample of subjects in an experimental design.

  15.

    Such selection barriers include language, search engine, and library availability, among others.

  16.

    That is, picking and choosing the scientific findings that we want to disseminate because they fit our preferred theory.

  17.

    Cf., Levins R, Lewontin R. The Dialectical Biologist. Harvard University Press, 1985.

  18.

    That is why, even at that very early stage, the National Institutes of Health refers to this research as “clinical research.”

  19.

    Cf., Cochran’s Q statistic, and its transformation into the I² index of heterogeneity.
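As a concrete illustration of the heterogeneity statistics mentioned in note 19, the following sketch computes Cochran’s Q from inverse-variance weights and derives I² = max(0, (Q − df)/Q) × 100. The effect sizes and variances are hypothetical values chosen for illustration.

```python
def cochran_q_and_i2(effects, variances):
    """Cochran's Q for a fixed-effect, inverse-variance-weighted pooling,
    and the I^2 heterogeneity index derived from it."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    q = sum(w * (e - pooled) ** 2 for w, e in zip(weights, effects))
    df = len(effects) - 1
    i2 = max(0.0, (q - df) / q) * 100.0 if q > 0 else 0.0
    return q, i2

# Three hypothetical studies with widely spread effect sizes:
q, i2 = cochran_q_and_i2([0.10, 0.50, 0.90], [0.02, 0.02, 0.02])
# q = 16.0, i2 = 87.5: most of the observed variability reflects
# heterogeneity between studies rather than sampling error.
```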

  20.

    Not all, it must be emphasized. There exists a school of thought that argues in favor of including all – bad and good – studies in a meta-analysis, akin perhaps to including all – good and bad – materials in the construction of a skyscraper. Should we be surprised if the foundations eventually give, and the edifice crumbles? Should we be surprised if a high proportion of meta-analyses conducted in this manner is likely to evince no statistical significance overall?

  21.

    In the case of repeated t tests, this bias renders the analysis increasingly unreliable by “chipping away” at the level of significance, α, as follows: p(Type I error) = 1 − (1 − α)^c. Thus, when c = 1 (one comparison), p = 1 − (1 − 0.05)^1 = 1 − (0.95)^1 = 0.05; but if we were to perform three “cumulative” t tests, then c = 3 and p = 1 − (0.95)^3 ≈ 0.14.
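The familywise-error arithmetic in the note above can be verified in a few lines of Python (a trivial sketch of the formula, assuming independent comparisons):

```python
def familywise_error(alpha, c):
    """Probability of at least one Type I error across c independent
    comparisons, each conducted at level alpha: 1 - (1 - alpha)^c."""
    return 1.0 - (1.0 - alpha) ** c

assert round(familywise_error(0.05, 1), 2) == 0.05   # one comparison
assert round(familywise_error(0.05, 3), 2) == 0.14   # three cumulative tests
```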

  22.

    An attempt toward quantifying clinical recommendations based on R-AMSTAR scores, and translating the outcome into a simple ranking system has been proposed by Kung et al. [16].

  23.

    As noted elsewhere, the research synthesis process for comparative effectiveness typically includes bibliomes composed of cluster randomized controlled trials (CRCTs), whereas better-informed patient-centered clinical decisions about comparative efficacy for evidence-based practice result from systematic reviews that arise from the research synthesis of bibliomes consisting of traditional RCTs [36].

References

  1. Littell JH, Corcoran J, Pillai V (2008) Systematic reviews and meta-analysis. Oxford University Press, New York

  2. Chiappelli F (2008) The science of research synthesis: a manual of evidence-based research for the health sciences – implications and applications in dentistry. NovaScience, Hauppauge

  3. Chiappelli F, Cajulis OC, Oluwadara O, Ramchandani MH (2009) Evidence-based decision making — implications for clinical dentistry. In: Columbus F (ed) Clinical dentistry: diagnostic, preventive, and restorative services. NovaScience, Hauppauge

  4. Chiappelli F (2010) Sustainable evidence-based decision-making. Monograph. NovaScience, Hauppauge

  5. Chiappelli F, Brant XMC, Oluwadara OO, Neagos N, Ramchandani MH (eds) (2010) Understanding evidence-based practice: toward optimizing clinical outcomes. Springer, Heidelberg

  6. Filipp G, Szentivanyi A, Mess B (1952) Anaphylaxis and the nervous system. (Hungarian). Acta Med Acad Sci Hung 3(2):163–173

  7. Solomon GF, Moos RH (1964) Emotions, immunity, and disease: a speculative theoretical integration. Arch Gen Psychiatry 11:657–674

  8. Ader R (1981) Psychoneuroimmunology. Academic Press, New York

  9. Chiappelli F, Abanomy A, Hodgson D, Mazey KA, Messadi DV, Mito RS, Nishimura I, Spigelman I (2001) Clinical, experimental and translational psychoneuroimmunology research models in oral biology and medicine. In: Ader R, Cohen N, Felten D (eds) Psychoneuroimmunology, III. Academic Press, New York

  10. Whitlock EP, Lin JS, Chou R, Shekelle P, Robinson KA (2008) Using existing systematic reviews in complex systematic reviews. Ann Intern Med 148(10):776–782

  11. Chiappelli F (2010) Future avenues of research synthesis for evidence-based clinical decision making. In: Chiappelli F, Brant XMC, Oluwadara OO, Neagos N, Ramchandani MH (eds) Understanding evidence-based practice: toward optimizing clinical outcomes. Springer, Heidelberg

  12. Pocock SJ (1982) Interim analyses for randomized clinical trials: the group sequential approach. Biometrics 38(1):153–162

  13. Chiappelli F, Cajulis O, Newman M (2009) Comparative effectiveness research in evidence-based dental practice. J Evid Based Dent Pract 9:57–58

  14. Systematic Reviews: CRD’s guidance for undertaking reviews in health care (2009) Centre for Reviews and Dissemination, University of York. www.york.ac.uk/inst/crd/index.htm. Last accessed 25 Sept 2011

  15. Whitlock EP, Lopez SA, Chang S, Helfand M, Eder M, Floyd N (2009) Identifying, selecting, and refining topics. In: Methods guide for comparative effectiveness reviews. Agency for Healthcare Research and Quality, Rockville. http://effectivehealthcare.ahrq.gov/healthInfo.cfm?infotype=rr&ProcessID=60. Last accessed 25 Sept 2011

  16. Kung J, Chiappelli F, Cajulis OS, Avezova R, Kossan G, Chew L, Maida CA (2010) From systematic reviews to clinical recommendations for evidence-based health care: validation of revised assessment of multiple systematic reviews (R-AMSTAR) for grading of clinical relevance. Open Dent J 4:84–91

  17. Moher D, Schulz KF, Altman DG (2001) The CONSORT statement: revised recommendations for improving the quality of reports of parallel-group randomized trials. Ann Intern Med 134(8):657–662

  18. Jadad AR, Moore RA, Carroll D, Jenkinson C, Reynolds DJM, Gavaghan DJ, McQuay HJ (1996) Assessing the quality of reports of randomized clinical trials: is blinding necessary? Control Clin Trials 17(1):1–12

  19. Suebnukarn S, Ngamboonsirisingh S, Rattanabanlang A (2010) A systematic evaluation of the quality of meta-analyses in endodontics. J Endod 36(4):602–608

  20. Wong J, Prolo P, Chiappelli F (2003) Extending evidence-based dentistry beyond clinical trials: implications for materials research in endodontics. Braz J Oral Sci 2:227–231

  21. Chiappelli F, Navarro AM, Moradi DR, Manfrini E, Prolo P (2006) Evidence-based research in complementary and alternative medicine III: treatment of patients with Alzheimer’s disease. Evid Based Complement Alternat Med 3:411–424

  22. Shea BJ, Bouter LM, Peterson J, Boers M, Andersson N, Ortiz Z, Ramsay T, Bai A, Shukla VK, Grimshaw JM (2007) External validation of a measurement tool to assess systematic reviews (AMSTAR). PLoS One 2:e1350

  23. Shea BJ, Grimshaw JM, Wells GA, Boers M, Andersson N, Hamel C, Porter AC, Tugwell P, Moher D, Bouter LM (2007) Development of AMSTAR: a measurement tool to assess the methodological quality of systematic reviews. BMC Med Res Methodol 7:10

  24. Bartolucci AA, Hillegas WB (2010) Overview, strengths, and limitations of systematic reviews and meta-analyses. In: Chiappelli F, Brant XMC, Oluwadara OO, Neagos N, Ramchandani MH (eds) Understanding evidence-based practice: toward optimizing clinical outcomes. Springer, Heidelberg

  25. Lau J, Antman EM, Jimenez-Silva J, Kupelnick B, Mosteller F, Chalmers TC (1992) Cumulative meta-analysis of therapeutic trials for myocardial infarction. N Engl J Med 327(4):248–254

  26. Lau J, Schmid CH, Chalmers TC (1995) Cumulative meta-analysis of clinical trials builds evidence for exemplary medical care. J Clin Epidemiol 48(1):45–57

  27. Moles DR, Needleman IG, Niederman R, Lau J (2005) Introduction to cumulative meta-analysis in dentistry: lessons learned from undertaking a cumulative meta-analysis in periodontology. J Dent Res 84(4):345–349

  28. Janket SJ, Moles DR, Lau J, Needleman I, Niederman R (2005) Caveat for a cumulative meta-analysis. J Dent Res 84(6):487

  29. Moradi DR, Moy PK, Chiappelli F (2006) Evidence-based research in alternative protocols to dental implantology: a closer look at publication bias. J Calif Dent Assoc 34:877–886

  30. Atkins D, Best D, Briss PA, Eccles M, Falck-Ytter Y, Flottorp S, Guyatt GH, Harbour RT, Haugh MC, Henry D, Hill S, Jaeschke R, Leng G, Liberati A, Magrini N, Mason J, Middleton P, Mrukowicz J, O’Connell D, Oxman AD, Phillips B, Schünemann HJ, Edejer TT, Varonen H, Vist GE, Williams JW Jr, Zaza S (2004) GRADE working group. Grading quality of evidence and strength of recommendations. BMJ 328(7454):1490

  31. Kunz R, Burnand B, Schünemann HJ (2008) Grading of recommendations, assessment, development and evaluation (GRADE) working group. The GRADE system. An international approach to standardize the graduation of evidence and recommendations in guidelines. Internist (Berl) 49(6):673–680

  32. Cluzeau FA, Littlejohns P, Grimshaw JM, Feder G, Moran SE (1999) Development and application of a generic methodology to assess the quality of clinical guidelines. Int J Qual Health Care 11(1):21–28

  33. AGREE Collaboration (2003) Development and validation of an international appraisal instrument for assessing the quality of clinical practice guidelines: the AGREE project. Qual Saf Health Care 12(1):18–23

  34. Phi L, Ajaj RA, Ramchandani MH, Brant XMC, Oluwadara O, Polinovsky O, Moradi D, Barkhordarian A, Sriphanlop P, Ong M, Giroux A, Lee J, Siddiqui M, Ghodousi N, Chiappelli F (2011) Expanding the grading of recommendations assessment, development, and evaluation (Ex-GRADE) for evidence-based clinical recommendations: validation study. Open Dent J 5: in press

  35. Dousti M, Ramchandani MH, Chiappelli F (2011) Evidence-based clinical significance in health care: toward an inferential analysis of clinical relevance. Dent Hypotheses 2:165–177

  36. Chiappelli F (2012) Cluster randomized controlled trials (CRCTs) in evidence-based dentistry. Dent Hypotheses 3:1–4

Author information

Correspondence to Francesco Chiappelli.

Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Chiappelli, F., Ramchandani, M.H., Phi, L., Brant, X.M.C. (2012). Clinically Relevant Complex Systematic Reviews in Endodontics: Relevance to Comparative Effectiveness Research and Evidence-Based Clinical Decision-Making. In: Chiappelli, F. (eds) Comparative Effectiveness and Efficacy Research and Analysis for Practice (CEERAP). Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-23144-5_3

  • DOI: https://doi.org/10.1007/978-3-642-23144-5_3

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-23143-8

  • Online ISBN: 978-3-642-23144-5
