Year: 2014 Volume: 39 Issue: 174 Page Range: 1-32 Text Language: Turkish Indexed Date: 29-07-2022

Etkili Araştırma Sentezleri Yapabilmek için Bir Araştırma Yöntemi: Meta-Analiz

Abstract:
In recent years, as the number of primary studies in the educational sciences has grown, so has the need for comprehensive and systematic research syntheses. This is the main reason why various applications of meta-analysis, one of the most effective methods of research synthesis, have been encouraged in many fields, including the social and educational sciences. The main purpose of this article is to establish a conceptual framework for meta-analysis by examining its strengths and weaknesses relative to other research synthesis methods. In addition, the article discusses in detail the comparison of fixed-effect and random-effects models, different effect size measures, validity-related issues such as the unit of analysis, publication bias, and the quality of primary studies, and certain methodological and statistical considerations such as heterogeneity, moderator, and power analyses. The article also provides concise information about software that can be used for meta-analysis and a summary of the standards developed for reporting meta-analyses. How well meta-analyses are conducted and reported matters greatly, not only for the cumulative nature of science but also because of their crucial role for policy makers and practitioners. Accordingly, this article aims to provide meta-analysts with an introductory guide they can draw on during their meta-analytic research.
Keywords:

Document Type: Article Article Type: Research Article Access Type: Open Access
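The abstract above mentions fixed-effect versus random-effects models and the heterogeneity statistics (Q, I²) discussed in the article. As an illustrative aside that is not taken from the article itself, the following minimal sketch pools a set of hypothetical effect sizes under both models, estimating the between-study variance with the DerSimonian-Laird method:

```python
# Illustrative sketch (hypothetical data, not from the article): pooling effect
# sizes under a fixed-effect model and a DerSimonian-Laird random-effects model,
# with the Q statistic and I^2 heterogeneity index the abstract refers to.
import math

# Hypothetical effect sizes (e.g., Hedges' g) and their variances from k = 5 studies
effects = [0.30, 0.55, 0.10, 0.45, 0.70]
variances = [0.02, 0.05, 0.03, 0.04, 0.06]

def fixed_effect(effects, variances):
    """Inverse-variance weighted mean and its variance (fixed-effect model)."""
    w = [1.0 / v for v in variances]
    mean = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    return mean, 1.0 / sum(w)

def dersimonian_laird(effects, variances):
    """Random-effects pooling with the DerSimonian-Laird tau^2 estimator."""
    w = [1.0 / v for v in variances]
    fe_mean, _ = fixed_effect(effects, variances)
    # Cochran's Q: weighted squared deviations from the fixed-effect mean
    q = sum(wi * (yi - fe_mean) ** 2 for wi, yi in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                      # between-study variance
    i2 = (max(0.0, (q - df) / q) * 100) if q > 0 else 0.0
    # Re-weight each study by 1 / (within-study + between-study variance)
    w_star = [1.0 / (v + tau2) for v in variances]
    mean = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return mean, se, tau2, q, i2

fe, fe_var = fixed_effect(effects, variances)
re, se, tau2, q, i2 = dersimonian_laird(effects, variances)
print(f"fixed-effect mean   = {fe:.3f}")
print(f"random-effects mean = {re:.3f} (SE = {se:.3f})")
print(f"Q = {q:.2f}, I^2 = {i2:.1f}%, tau^2 = {tau2:.3f}")
```

When tau² > 0 the random-effects weights are more nearly equal than the fixed-effect weights, so less precise studies contribute relatively more to the pooled estimate, which is one of the model differences the article contrasts.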
  • Ahn, S., Ames, A. J., & Myers, N. D. (2012). A review of meta-analyses in education: Methodological strengths and weaknesses. Review of Educational Research, 82(4), 436-476. doi:10.3102/0034654312458162
  • APA Publications and Communications Board Working Group on Journal Article Reporting Standards. (2008). Reporting standards for research in psychology: Why do we need them? What might they be? American Psychologist, 63(9), 839-851. Retrieved from http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2957094/
  • Bax, L., Yu, L. M., Ikeda, N., & Moons, K. G. (2007). A systematic comparison of software dedicated to meta-analysis of causal studies. BMC Medical Research Methodology, 7(1), 40.
  • Becker, B. J. (2005). Failsafe N or file-drawer number. In H. R. Rothstein, A. J. Sutton & M. Borenstein (Eds.), Publication bias in meta-analysis: Prevention, assessment and adjustments. West Sussex, England: John Wiley & Sons, Ltd.
  • Bennett, D. A., Latham, N. K., Stretton, C., & Anderson, C. S. (2004). Capture-recapture is a potentially useful method for assessing publication bias. Journal of Clinical Epidemiology, 57(4), 349-357.
  • Bennett, J. (2005). Systematic reviews of research in science education: Rigour or rigidity? International Journal of Science Education, 27(4), 387-406.
  • Berliner, D. C. (2002). Educational research: The hardest science of all. Educational Researcher, 31(8), 18.
  • Berman, N., & Parker, R. (2002). Meta-analysis: Neither quick nor easy. BMC Medical Research Methodology, 2(1), 10.
  • Bligh, J. (2000). Problem-based learning: The story continues to unfold. Medical Education, 34(9), 688-689.
  • Borenstein, M. (2005). Software for Publication Bias. In H. R. Rothstein, A. J. Sutton & M. Borenstein (Eds.), Publication Bias for Meta-Analysis: Prevention, Assessment and Adjustments. West Sussex, England: John Wiley & Sons Ltd.
  • Borenstein, M. (2009). Effect size for continuous data. In H. Cooper, L. V. Hedges & J. C. Valentine (Eds.), The handbook of research synthesis and meta-analysis (2nd ed.). New York: Russell Sage Foundation.
  • Borenstein, M., Hedges, L. V., Higgins, J. P. T., & Rothstein, H. R. (2009). Introduction to meta-analysis. West Sussex, UK: John Wiley & Sons, Ltd.
  • Bushman, B. J., & Wang, M. C. (2009). Vote-counting procedures in meta-analysis. In H. Cooper, L. V. Hedges & J. C. Valentine (Eds.), The handbook of research synthesis and meta-analysis (pp. 207-220). New York: Russell Sage Foundation
  • Bushman, B. J., & Wells, G. L. (2001). Narrative impressions of literature: The availability bias and the corrective properties of meta-analytic approaches. Personality and Social Psychology Bulletin, 27(9), 1123-1130.
  • Card, N. A. (2012). Applied meta-analysis for social science research. New York: The Guilford Press.
  • Carlton, P. L., & Strawderman, W. E. (1996). Evaluating cumulated research I: The inadequacy of traditional methods. Biological Psychiatry, 39(1), 65-72.
  • Chalmers, I., Hedges, L. V., & Cooper, H. (2002). A brief history of research synthesis. Evaluation & The Health Professions, 25(1), 12-37.
  • Chan, M. L. E., & Arvey, R. D. (2012). Meta-analysis and the development of knowledge. Perspectives on Psychological Science, 7(1), 79-92.
  • Clarke, M. (2009). Reporting format. In H. Cooper, L. V. Hedges & J. C. Valentine (Eds.), The handbook of research synthesis and meta-analysis (2 ed., pp. 521-534). New York: Russell Sage Foundation.
  • Cohen, J. (1988). Statistical power analysis for the behavioral sciences. Hillsdale, NJ: Lawrance Erlbaum Associates, Inc.
  • Cohen, J. (1990). Things I have learned (so far). American Psychologist, 45(12), 1304.
  • Cohn, L. D., & Becker, B. J. (2003). How meta-analysis increases statistical power. Psychological Methods, 8(3), 243-253.
  • Cooper, H. (1997). Some finer points in the meta-analysis. In M. Hunt (Ed.), How science takes stock: The story of meta-analysis. New York: Russell Sage Foundation.
  • Cooper, H., & Hedges, L. V. (2009). Research synthesis as a scientific process. In H. Cooper, L. V. Hedges & J. C. Valentine (Eds.), The handbook of research synthesis and meta-analysis (2nd ed., pp. 3- 16). New York: Russell Sage Foundation
  • Cooper, H., & Rosenthal, R. (1980). Statistical versus traditional procedures for summarizing research findings. Psychological Bulletin, 87(3), 442.
  • Dalton, D. R., & Dalton, C. M. (2008). Meta-analyses. Organizational Research Methods, 11(1), 127-147.
  • Davies, P. (2000). The relevance of systematic reviews to educational policy and practice. Oxford Review of Education, 26(3-4), 365-378.
  • Deeks, J. J., Dinnes, J., D'Amico, R., Sowden, A. J., Sakarovitch, C., Song, F., . . . Altman, D. G. (2003). Evaluating non-randomised intervention studies. Health Technology Assessment, 7(27).
  • Duval, S., Rothstein, H. R., Sutton, A. J., & Borenstein, M. (2005). The trim and fill method. In H. R. Rothstein, A. J. Sutton & M. Borenstein (Eds.), Publication bias in meta-analysis: Prevention, assessment and adjustments. West Sussex, England: John Wiley & Sons, Ltd.
  • Duval, S., & Tweedie, R. (2000a). A nonparametric "trim and fill" method of accounting for publication bias in meta-analysis. Journal of the American Statistical Association, 95(449), 89-98.
  • Duval, S., & Tweedie, R. (2000b). Trim and fill: A simple funnel-plot-based method of testing and adjusting for publication bias in meta-analysis. Biometrics, 56(2), 455-463.
  • Egger, M., Smith, G. D., Schneider, M., & Minder, C. (1997). Bias in meta-analysis detected by a simple, graphical test. British Medical Journal, 315(7109), 629.
  • Ellis, P. D. (2010). The essential guide to effect sizes: Statistical power, meta-analysis, and the interpretation of research results. Cambridge: Cambridge University Press.
  • Erez, A., Bloom, M. C., & Wells, M. T. (1996). Using random rather than fixed effects models in meta-analysis: Implications for situational specificity and validity generalization. Personnel Psychology,
  • Eysenck, H. J. (1978). An exercise in mega-silliness. American Psychologist, 33(5), 517.
  • Eysenck, H. J. (1984). Meta-analysis: An abuse of research integration. The Journal of Special Education, 18(1), 41-59.
  • Eysenck, H. J. (1994). Systematic reviews: Meta-analysis and its problems. British Medical Journal, 309, 789-792.
  • Fan, X. (2001). Statistical significance and effect size in education research: Two sides of a coin. The Journal of Educational Research, 94(5), 275-282.
  • Feinstein, A. R. (1995). Meta-analysis: Statistical alchemy for the 21st century. Journal of Clinical Epidemiology, 48(1), 71-79.
  • Field, A. P. (2003). The problem in using fixed-effects models of meta-analysis on real world data. Understanding Statistics, 2, 77-96.
  • Fitz-Gibbon, C. T. (1985). The implications of meta-analysis for educational research. British Educational Research Journal, 11(1), 45-49.
  • Fitzgerald, S. M., & Rumrill, P. D. (2003). Meta-analysis as a tool for understanding existing research literature. Work: A Journal of Prevention, Assessment and Rehabilitation, 21(1), 97-103.
  • Fitzgerald, S. M., & Rumrill, P. D. (2005). Quantitative alternatives to narrative reviews for understanding existing research literature. Work: A Journal of Prevention, Assessment and Rehabilitation, 24(3), 317-323.
  • Fleiss, J. L., & Berlin, J. A. (2009). Effect size for dichotomous data. In H. Cooper, L. V. Hedges & J. C. Valentine (Eds.), The handbook of research synthesis and meta-analysis (2nd ed.). New York: Russell Sage Foundation.
  • Glass, G. V. (1976). Primary, secondary, and meta-analysis of research. Educational Researcher, 5(10), 3-8.
  • Glass, G. V. (1982). Meta-analysis: An approach to the synthesis of research results. Journal of Research in Science Teaching, 19(2), 93-112.
  • Glass, G. V. (2006). Meta-analysis: The quantitative synthesis of research findings. In J. L. Green, P. B. Elmore & G. Camilli (Eds.), Handbook of Complementary Methods in Education Research. Mahwah: Lawrence Erlbaum Associates.
  • Glass, G. V., McGaw, B., & Smith, M. L. (1981). Meta-analysis in social research. CA: Sage Publications.
  • Gleser, L. J., & Olkin, I. (2009). Stochastically dependent effect sizes. In H. Cooper, L. V. Hedges & J. C. Valentine (Eds.), The handbook of research synthesis and meta-analysis. New York: Russell Sage Foundation.
  • Gliner, J. A., Morgan, G. A., & Harmon, R. J. (2003). Meta-analysis: Formulation and interpretation. Journal of the American Academy of Child and Adolescent Psychiatry, 42(11), 1376.
  • Gravetter, F. J., & Walnau, L. B. (2007). Statistics for behavioral sciences. Belmont, CA: Thomson Learning, Inc.
  • Hedges, L. V. (1992). Meta-analysis. Journal of Educational and Behavioral Statistics, 17(4), 279-296.
  • Hedges, L. V., & Olkin, I. (1980). Vote-counting methods in research synthesis. Psychological Bulletin, 88(2), 359-369.
  • Hedges, L. V., & Olkin, I. (1985). Statistical methods for meta-analysis. Orlando: Academic Press
  • Hedges, L. V., & Vevea, J. L. (1998). Fixed-and random-effects models in meta-analysis. Psychological Methods, 3, 486-504.
  • Herbison, P., Hay-Smith, J., & Gillespie, W. J. (2006). Adjustment of meta-analyses on the basis of quality scores should be abandoned. Journal of Clinical Epidemiology, 59, 1249-1256.
  • Higgins, J. P. T., Thompson, S. G., Deeks, J. J., & Altman, D. G. (2003). Measuring inconsistency in meta-analyses. British Medical Journal, 327(7414), 557-560.
  • Huberty, C. J. (2002). A history of effect size indices. Educational and Psychological Measurement, 62(2), 227.
  • Huedo-Medina, T. B., Sanchez-Meca, J., Marin-Martinez, F., & Botella, J. (2006). Assessing heterogeneity in meta-analysis: Q statistic or I² index? Psychological Methods, 11(2), 193-206.
  • Hunt, M. (1997). How science takes stock: The story of meta-analysis. NY: The Russell Sage Foundation.
  • Hunter, J. E., & Schmidt, F. L. (2000). Fixed effects vs. Random effects meta-analysis models: Implications for cumulative research knowledge. International Journal of Selection and Assessment, 8(4), 275-292.
  • Hunter, J. E., & Schmidt, F. L. (2004). Methods of meta-analysis: Correcting error and bias in research findings (2 ed.). California: Sage Publications.
  • Jüni, P., Altman, D. G., & Egger, M. (2001). Assessing the quality of controlled clinical trials. British Medical Journal, 323, 42-46.
  • Jüni, P., Witschi, A., Bloch, R., & Egger, M. (1999). The hazards of scoring the quality of clinical trials for meta-analysis. Journal of the American Medical Association, 282, 1054-1060.
  • Kirk, R. E. (1996). Practical significance: A concept whose time has come. Educational and Psychological Measurement, 56(5), 746-759.
  • Kirk, R. E. (2001). Promoting good statistical practices: Some suggestions. Educational and Psychological Measurement, 61(2), 213-218.
  • Lau, J., Ioannidis, J. P. A., Terrin, N., Schmid, C. H., & Olkin, I. (2006). Evidence based medicine: The case of the misleading funnel plot. BMJ: British Medical Journal, 333(7568), 597.
  • Lewis, S., & Clarke, M. (2001). Forest plots: trying to see the wood and the trees. British Medical Journal, 322(7300), 1479.
  • Lipsey, M. W., & Wilson, D. B. (2001). Practical meta-analysis. California: Sage Publications.
  • Littell, J. H., Corcoran, J., & Pillai, V. (2008). Systematic reviews and meta-analysis. Oxford: Oxford University Press.
  • Lundahl, B., & Yaffe, J. (2007). Use of meta-analysis in social work and allied disciplines. Journal of Social Service Research, 33(3), 1-11.
  • Marin-Martinez, F., & Sanchez-Meca, J. (1999). Averaging dependent effect sizes in meta-analysis: A cautionary note about procedures. The Spanish Journal of Psychology, 2(1), 32-38.
  • Moher, D., Cook, D. J., Eastwood, S., Olkin, I., Rennie, D., Stroup, D., & The QUOROM group. (1999). Improving the quality of reporting of meta-analysis of randomized controlled trials: The QUOROM statement. Lancet, 354(9193), 1896-1900.
  • Moher, D., Liberati, A., Tetzlaff, J., Altman, D. G., & PRISMA Group. (2009). Preferred reporting items for systematic reviews and meta-analyses: The PRISMA Statement. Annals of Internal Medicine, 151(4), 264-269.
  • Mullen, B., Muellerleile, P., & Bryant, B. (2001). Cumulative meta-analysis: a consideration of indicators of sufficiency and stability. Personality and Social Psychology Bulletin, 27(11), 1450.
  • Mulrow, C. D. (1994). Systematic reviews: Rationale for systematic reviews. British Medical Journal, 309, 597-599.
  • National Research Council. (1992). Combining information: Statistical issues and opportunities for research. Washington, DC: National Academy Press.
  • Normand, S. L. T. (1999). Tutorial in biostatistics meta-analysis: Formulating, evaluating, combining, and reporting. Statistics in Medicine, 18(3), 321-359.
  • O'Rourke, K. (2007). An historical perspective on meta-analysis: Dealing quantitatively with varying study results. Journal of the Royal Society of Medicine, 100(12), 579-582.
  • Oakley, A. (2002). Social science and evidence-based everything: The case of education. Educational Review, 54(3), 277-286.
  • Olejnik, S., & Algina, J. (2000). Measures of effect size for comparative studies: Applications, interpretations and limitations. Contemporary Educational Psychology, 25(3), 241-286.
  • Orwin, R. G., & Vevea, J. L. (2009). Evaluating Coding Decisions. In H. Cooper, L. V. Hedges & J. C. Valentine (Eds.), The handbook of research synthesis and meta-analysis. New York: Russell Sage Foundation.
  • Overton, R. C. (1998). A comparison of fixed-effects and mixed (random-effects) models for meta-analysis tests of moderator variable effects. Psychological Methods, 3(3), 354-379.
  • Pearson, K. (1904). Report on certain enteric fever inoculation statistics. British Medical Journal, 3, 1243-1246.
  • Petticrew, M. (2003). Why certain systematic reviews reach uncertain conclusions. British Medical Journal, 326(7392), 756-758.
  • Petticrew, M., & Roberts, H. (2006). Systematic reviews in the social sciences: A practical guide. Malden: Blackwell Publishing.
  • Pigott, T. D. (2012). Advances in meta-analysis. NY: Springer Verlag.
  • Rendina-Gobioff, G. (2006). Detecting publication bias in random-effects meta-analysis: An empirical comparison of statistical methods. Unpublished doctoral dissertation, University of South Florida, Florida.
  • Rosenthal, R. (1979). The 'file drawer' problem and tolerance for null results. Psychological Bulletin, 86, 638-641.
  • Rosenthal, R. (1991). Meta-analytic procedures for social research. (Vol. 6). CA: Sage Publication.
  • Rosenthal, R., & DiMatteo, M. R. (2001). Meta-analysis: Recent developments in quantitative methods for literature reviews. Annual Review of Psychology, 52(1), 59-82.
  • Rosenthal, R., & Rubin, D. B. (1978). Interpersonal expectancy effects: The first 345 studies. Behavioral and Brain Sciences, 3, 377-386.
  • Rosenthal, R., & Rubin, D. B. (1986). Meta-analytic procedures for combining studies with multiple effect sizes. Psychological Bulletin, 99, 400-406.
  • Rothstein, H. R., Sutton, A. J., & Borenstein, M. (2005). Publication bias in meta-analysis. In H. R. Rothstein, A. J. Sutton & M. Borenstein (Eds.), Publication bias in meta-analysis: Prevention, assessment and adjustments. West Sussex, England: John Wiley & Sons.
  • Sánchez-Meca, J., & Marín-Martínez, F. (1998). Testing continuous moderators in meta-analysis: A comparison of procedures. British Journal of Mathematical and Statistical Psychology, 51(2), 311-326.
  • Sánchez-Meca, J., & Marín-Martínez, F. (2010a). Meta-analysis in psychological research. International Journal of Psychological Research, 3(1), 150-162.
  • Sánchez-Meca, J., & Marín-Martínez, F. (2010b). Meta Analysis. In P. Peterson, E. Baker & B. McGaw (Eds.), International Encyclopedia of Education (Vol. 7, pp. 274-282). Oxford: Elsevier.
  • Schmidt, F. L. (1992). What do data really mean? Research findings, meta-analysis, and cumulative knowledge in psychology. American Psychologist, 47(10), 1173-1181.
  • Schmidt, F. L. (1996). Statistical significance testing and cumulative knowledge in psychology: Implications for training of researchers. Psychological Methods, 1(2), 115-129.
  • Schmidt, F. L., & Hunter, J. E. (1977). Development of a general solution to the problem of validity generalization. Journal of Applied Psychology, 62(5), 529-540.
  • Schmidt, F. L., Oh, I.-S., & Hayes, T. L. (2009). Fixed- versus random-effects models in meta-analysis: Model properties and an empirical comparison of differences in results. British Journal of Mathematical and Statistical Psychology, 62, 97-128.
  • Schulze, R. (2007). The state and the art of meta-analysis. Zeitschrift für Psychologie/Journal of Psychology,
  • Shapiro, S. (1994). Meta-analysis/Shmeta-analysis. American Journal of Epidemiology, 140(9), 771-778.
  • Shelby, L. B., & Vaske, J. J. (2008). Understanding meta-analysis: A review of the methodological literature. Leisure Sciences, 30(2), 96-110.
  • Smith, M. L., & Glass, G. V. (1977). Meta-analysis of psychotherapy outcome studies. American Psychologist, 32(9), 752-760.
  • Smith, M. L., Glass, G. V., & Miller, T. I. (1980). The benefits of psychotherapy. Baltimore, MD: Johns Hopkins University Press.
  • Song, F., Khan, K. S., Dinnes, J., & Sutton, A. J. (2002). Asymmetric funnel plots and publication bias in meta-analyses of diagnostic accuracy. International Journal of Epidemiology, 31(1), 88.
  • Sterne, J. A. C., & Egger, M. (2001). Funnel plots for detecting bias in meta-analysis: Guidelines on choice of axis. Journal of Clinical Epidemiology, 54(10), 1046-1055.
  • Sterne, J. A. C., & Egger, M. (2005). Regression methods to detect publication and other bias in meta-analysis. In H. R. Rothstein, A. J. Sutton & M. Borenstein (Eds.), Publication bias in meta-analysis: Prevention, assessment and adjustments. West Sussex, England: John Wiley & Sons, Ltd.
  • Sterne, J. A. C., & Harbord, R. M. (2004). Funnel plots in meta-analysis. The Stata Journal, 4(2), 127-141.
  • Stroup, D. F., Berlin, J. A., Morton, S. C., Olkin, I., Williamson, G. D., Rennie, D., . . . Thacker, S. B. (2000). Meta-analysis of observational studies in epidemiology: A proposal for reporting. The Journal of the American Medical Association, 283(15), 2008-2012.
  • Sutton, A. J. (2009). Publication bias. In H. Cooper, L. V. Hedges & J. C. Valentine (Eds.), The handbook of research synthesis and meta-analysis (2nd ed., pp. 435-452). New York: Russell Sage Foundation.
  • Tang, J. L., & Liu, J. L. Y. (2000). Misleading funnel plot for detection of bias in meta-analysis. Journal of Clinical Epidemiology, 53(5), 477-484.
  • Terrin, N., Schmid, C. H., & Lau, J. (2005). In an empirical evaluation of the funnel plot, researchers could not visually identify publication bias. Journal of Clinical Epidemiology, 58(9), 894-901.
  • Thornton, A., & Lee, P. (2000). Publication bias in meta-analysis its causes and consequences. Journal of Clinical Epidemiology, 53(2), 207-216.
  • Torgerson, C. (2003). Systematic reviews. London: Continuum International Publishing Group.
  • Tweedie, R. L. (2004). Meta-analysis: Overview. In N. J. Smelser & P. B. Baltes (Eds.), International Encyclopedia of the Social & Behavioral Sciences (pp. 9717-9724). Elsevier Science Ltd.
  • Üstün, U. (2012). To what extent is problem-based learning effective as compared to traditional teaching in science education? A meta-analysis study. Unpublished doctoral dissertation. METU. Ankara.
  • Vacha-Haase, T. (2001). Statistical significance should not be considered one of life's guarantees: Effect sizes are needed. Educational and Psychological Measurement, 61(2), 219-224.
  • Valentine, J. C. (2009). Judging the quality of primary research. In H. Cooper, L. V. Hedges & J. C. Valentine (Eds.), The handbook of research synthesis and meta-analysis (2nd ed.). New York: Russell Sage Foundation.
  • Wallace, B. C., Schmid, C. H., Lau, J., & Trikalinos, T. A. (2009). Meta-Analyst: Software for meta-analysis of binary, continuous and diagnostic data. BMC Medical Research Methodology, 9(80). doi:10.1186/1471-2288-9-80
  • Wells, K., & Littell, J. H. (2009). Study quality assessment in systematic reviews of research on intervention effects. Research on Social Work Practice, 19(1), 52-62.
  • Wolf, F. M. (1986). Meta-analysis: Quantitative methods for research synthesis. California: Sage Publications Inc.
  • Yeh, J., & D'Amico, F. (2004). Forest plots: data summaries at a glance. The Journal of Family Practice, 53, 1007.
APA Ustun U, ERYILMAZ A (2014). Etkili Araştırma Sentezleri Yapabilmek için Bir Araştırma Yöntemi: Meta-Analiz. Eğitim ve Bilim, 39(174), 1 - 32.
Chicago Ustun Ulas,ERYILMAZ Ali Etkili Araştırma Sentezleri Yapabilmek için Bir Araştırma Yöntemi: Meta-Analiz. Eğitim ve Bilim 39, no.174 (2014): 1 - 32.
MLA Ustun Ulas,ERYILMAZ Ali Etkili Araştırma Sentezleri Yapabilmek için Bir Araştırma Yöntemi: Meta-Analiz. Eğitim ve Bilim, vol.39, no.174, 2014, ss.1 - 32.
AMA Ustun U,ERYILMAZ A Etkili Araştırma Sentezleri Yapabilmek için Bir Araştırma Yöntemi: Meta-Analiz. Eğitim ve Bilim. 2014; 39(174): 1 - 32.
Vancouver Ustun U,ERYILMAZ A Etkili Araştırma Sentezleri Yapabilmek için Bir Araştırma Yöntemi: Meta-Analiz. Eğitim ve Bilim. 2014; 39(174): 1 - 32.
IEEE Ustun U,ERYILMAZ A "Etkili Araştırma Sentezleri Yapabilmek için Bir Araştırma Yöntemi: Meta-Analiz." Eğitim ve Bilim, 39, ss.1 - 32, 2014.
ISNAD Ustun, Ulas - ERYILMAZ, Ali. "Etkili Araştırma Sentezleri Yapabilmek için Bir Araştırma Yöntemi: Meta-Analiz". Eğitim ve Bilim 39/174 (2014), 1-32.