Clearinghouse on Assessment and Evaluation

ERIC Documents Database Citations & Abstracts for the Methodology of Meta Analysis


Instructions for ERIC Documents Access

Search Strategy:
Meta Analysis [ERIC Descriptor, with heavily weighted status]
AND
Research Methodology OR Evaluation Methods OR Methods Research OR Evaluation Research OR Synthesis [ERIC Descriptors]
AND
Statistical Significance OR Effect Size OR Analysis of Variance OR Generalizability Theory OR Statistical Analysis [ERIC Descriptors]
-----
OR
Gene V Glass [author]
  ED403270  TM025465
  A Meta-Meta-Analysis: Methodological Aspects of Meta-Analyses in
Educational Achievement.
  Sipe, Theresa Ann; Curlette, William L.
  Apr 1996
  44p.; Paper presented at the Annual Meeting of the American 
Educational Research Association (New York, NY, April 8-12, 1996).
  Document Type: REVIEW LITERATURE (070);  EVALUATIVE REPORT (142);  
CONFERENCE PAPER (150)
  Selected methodological characteristics of meta-analyses related to 
educational achievement are reviewed in an exploration of the 
practice of meta-analysis in this area, as well as possible relationships 
among background, methodological and substantive characteristics, and 
effect sizes.  A literature search identified 1,197 documents, of 
which 694 were retrieved as pertinent.  Using only meta-analyses 
published after 1984, 103 published meta-analyses were selected as 
having met study criteria.  The most frequent type of meta-analysis 
was that of treatment effectiveness.  Hypothesis and theory testing 
did not appear as frequently as descriptive research.  Many primary 
research articles did not include sample size, precluding the 
computation of effect size.  Many details of the search procedures in 
meta-analyses were not included, and fewer than 40% of the authors 
reported some kind of homogeneity of effect size testing.  Overall, 
results suggest that researchers are not exploiting the full 
capabilities of meta-analytic techniques.  Appendix A lists meta-
analyses included in the study, and Appendix B lists those 
specifically excluded.  (Contains 6 tables, 10 figures, and 38 
references.) (SLD)
  Descriptors: *Academic Achievement; *Effect Size; Elementary 
Secondary Education; Higher Education; Hypothesis Testing; Literature 
Reviews; *Meta Analysis; *Outcomes of Treatment; *Research 
Methodology; Sample Size
  Identifiers: *Descriptive Research

  
  EJ520936  TM519322
  The Impact of Data-Analysis Methods on Cumulative Research 
Knowledge: Statistical Significance Testing, Confidence Intervals, 
and Meta-Analysis.
  Schmidt, Frank; Hunter, John E.
  Evaluation and the Health Professions, v18 n4 p408-27 Dec 
  1995
  Special issue titled "The Meta-Analytic Revolution in Health 
Research: Part II."
  ISSN: 0163-2787
  Available From: UMI
  Document Type: EVALUATIVE REPORT (142);  JOURNAL ARTICLE (080)
  It is argued that point estimates of effect sizes and confidence 
intervals around these point estimates are more appropriate 
statistics for individual studies than reliance on statistical 
significance testing and that meta-analysis is appropriate for 
analysis of data from multiple studies.  (SLD)
  Descriptors: *Effect Size; Estimation (Mathematics); *Knowledge 
Level; *Meta Analysis; *Research Methodology; *Statistical 
Significance; Test Use
  Identifiers: *Confidence Intervals (Statistics)
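
As a concrete illustration of the argument above, the following minimal 
Python sketch computes a standardized mean difference for a single 
hypothetical study and a large-sample 95% confidence interval around it, 
using a common approximation for the variance of d; all numbers are invented.

    import math

    # Invented summary statistics for one hypothetical study.
    n1, n2 = 40, 40          # group sizes
    m1, m2 = 52.0, 47.0      # group means
    s1, s2 = 10.0, 11.0      # group standard deviations

    # Standardized mean difference using the pooled within-group SD.
    s_pooled = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / s_pooled

    # Large-sample variance of d (a standard approximation) and a 95% CI.
    var_d = (n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2))
    se_d = math.sqrt(var_d)
    print(f"d = {d:.3f}, 95% CI = ({d - 1.96 * se_d:.3f}, {d + 1.96 * se_d:.3f})")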

  
  EJ520935  TM519321
  Interpreting and Evaluating Meta-Analysis.
  Hall, Judith A.; Rosenthal, Robert
  Evaluation and the Health Professions, v18 n4 p393-407 Dec 
  1995
  Special issue titled "The Meta-Analytic Revolution in Health 
Research: Part II."
  ISSN: 0163-2787
  Available From: UMI
  Document Type: EVALUATIVE REPORT (142);  JOURNAL ARTICLE (080)
  Some guidelines are offered for interpreting and evaluating meta-
analytic reviews of research.  The choice of unit of analysis, the 
issue of fixed versus random effects, the meaning of heterogeneity, 
the determination of appropriate contrasts, and the choice of 
measures of central tendency are discussed.  (SLD)
  Descriptors: Comparative Analysis; Effect Size; *Evaluation Methods; 
Health; *Medical Care Evaluation; *Meta Analysis; *Research 
Methodology; *Synthesis
  Identifiers: Heterogeneity of Variance
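
The entry above raises the fixed- versus random-effects choice and the 
meaning of heterogeneity. The minimal Python sketch below shows one common 
way those ideas are operationalized: an inverse-variance weighted mean, the 
Q heterogeneity statistic, and a DerSimonian-Laird estimate of between-study 
variance (the specific estimator is an assumption here, not necessarily the 
one the authors discuss; effect sizes and variances are invented).

    # Invented effect sizes (d) and their sampling variances for five studies.
    effects   = [0.10, 0.35, 0.52, 0.20, 0.64]
    variances = [0.04, 0.05, 0.03, 0.06, 0.05]

    # Fixed-effect model: weight each study by the inverse of its variance.
    w = [1 / v for v in variances]
    d_fixed = sum(wi * di for wi, di in zip(w, effects)) / sum(w)

    # Q statistic: heterogeneity of effect sizes around the fixed-effect mean.
    Q = sum(wi * (di - d_fixed) ** 2 for wi, di in zip(w, effects))
    df = len(effects) - 1

    # DerSimonian-Laird estimate of the between-study variance tau^2.
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (Q - df) / c)

    # Random-effects model: add tau^2 to each study's variance before weighting.
    w_re = [1 / (v + tau2) for v in variances]
    d_random = sum(wi * di for wi, di in zip(w_re, effects)) / sum(w_re)

    print(f"Q = {Q:.2f} on {df} df, tau^2 = {tau2:.3f}")
    print(f"fixed-effect mean = {d_fixed:.3f}, random-effects mean = {d_random:.3f}")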

  
  EJ520933  TM519319
  Meta-analysis at 20: Retrospect and Prospect.
  Kavale, Kenneth A.
  Evaluation and the Health Professions, v18 n4 p349-69 Dec 
  1995
  Special issue titled "The Meta-Analytic Revolution in Health 
Research: Part II."
  ISSN: 0163-2787
  Available From: UMI
  Document Type: EVALUATIVE REPORT (142);  JOURNAL ARTICLE (080)
  Explores the nature of meta-analysis by placing it in the context 
of research synthesis.  Methods of meta-analysis are described and 
compared with other forms of research integration, and findings for 
several meta-analyses are provided to show advantages of quantitative 
review methods.  (SLD)
  Descriptors: *Comparative Analysis; Effect Size; Health; *Medical 
Care Evaluation; *Meta Analysis; *Research Methodology; Statistical 
Data; *Synthesis

  
  EJ498134  RC510475
  Going beyond the Literature Review with Meta-Analysis.
  McNeil, Keith; Newman, Isadore
  Mid-Western Educational Researcher, v8 n1 p23-26 Win 
  1995
  ISSN: 1056-3997
  Document Type: RESEARCH REPORT (143);  JOURNAL ARTICLE (080)
  Presents situations in which researchers can use the general linear 
model to uncover reasons for discrepant effect-size results of meta-
analysis of similar studies.  Situations include similarly labeled 
treatments or participants differing in important ways, treatment 
effectiveness varying by subject aptitude or situational variables, 
research design strongly influencing outcome, and analysis of several 
results from one study.  (RAH)
  Descriptors: *Effect Size; *Hypothesis Testing; *Meta Analysis; 
*Research Methodology; *Research Problems; Statistical Analysis; 
Synthesis
  Identifiers: *General Linear Model
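
To make the general-linear-model idea above concrete, the Python sketch 
below regresses study effect sizes on a single study-level moderator by 
weighted least squares, a simple meta-regression; the data and the moderator 
(weeks of treatment) are invented for illustration.

    import numpy as np

    # Invented study-level data: effect size, its sampling variance, and one
    # moderator (weeks of treatment) that might explain discrepant results.
    d     = np.array([0.15, 0.30, 0.55, 0.60, 0.90])
    var_d = np.array([0.05, 0.04, 0.05, 0.03, 0.06])
    weeks = np.array([2.0, 4.0, 8.0, 10.0, 16.0])

    # Weighted least squares: design matrix [1, moderator], weights 1 / variance.
    X = np.column_stack([np.ones_like(d), weeks])
    W = np.diag(1.0 / var_d)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ d)

    print(f"intercept = {beta[0]:.3f}, slope per week of treatment = {beta[1]:.3f}")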

  
  EJ490157  SP523457
  Comparison of the Glass and Hunter-Schmidt Meta-Analytic 
Techniques.
  Hough, Susan L.; Hall, Bruce W.
  Journal of Educational Research, v87 n5 p292-96 May-Jun 
  1994
  ISSN: 0022-0671
  Document Type: RESEARCH REPORT (143);  JOURNAL ARTICLE (080)
  Compares results of Hunter-Schmidt meta-analytic technique with 
results of Glass meta-analytic technique on three meta-analytic data 
sets chosen from the literature, hypothesizing that the Hunter-
Schmidt mean effect size would be significantly larger than the Glass 
mean effect size because of correction for measurement error.  
Results confirmed the hypothesis, but the Glass formulas appear 
adequate and are more easily calculated.  (SM)
  Descriptors: Comparative Analysis; Educational Research; *Effect 
Size; *Error of Measurement; *Evaluation Methods; Higher Education; 
*Meta Analysis
  Identifiers: *Glass Analysis Method; *Hunter Schmidt Meta Analysis
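
The contrast reported above comes down to whether the effect size is 
corrected for unreliability of the outcome measure. A minimal Python sketch 
of that correction follows; the summary statistics and the reliability value 
are invented.

    import math

    # Invented summary statistics for one study.
    mean_exp, mean_ctl = 55.0, 50.0
    sd_ctl = 10.0            # control-group SD, Glass's denominator
    r_yy = 0.80              # reliability of the outcome measure (invented)

    # Glass's delta: mean difference divided by the control-group SD, uncorrected.
    delta = (mean_exp - mean_ctl) / sd_ctl

    # Hunter-Schmidt-style correction for attenuation due to measurement error:
    # divide by the square root of the outcome reliability.
    d_corrected = delta / math.sqrt(r_yy)

    # The corrected estimate is larger, the direction reported in the study above.
    print(f"Glass delta = {delta:.3f}, reliability-corrected d = {d_corrected:.3f}")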

  
  EJ488861  TM518079
  A Conservative Inverse Normal Test Procedure for Combining P-Values 
in Integrative Research.
  Saner, Hilary
  Psychometrika, v59 n2 p253-67 Jun   1994
  ISSN: 0033-3123
  Document Type: RESEARCH REPORT (143);  JOURNAL ARTICLE (080)
  The use of p-values in combining results of studies often involves 
studies that are potentially aberrant.  This paper proposes a 
combined test that permits trimming some of the extreme p-values.  
The trimmed statistic is based on an inverse cumulative normal 
transformation of the ordered p-values.  (SLD)
  Descriptors: *Effect Size; *Meta Analysis; *Research Methodology; 
Sample Size; Simulation; Statistical Distributions; *Statistical 
Significance; Statistical Studies; *Synthesis
  Identifiers: *Integrative Processes; Inverse Normal Test; *P Values
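
A rough Python sketch of the basic inverse normal (Stouffer-type) 
combination that the entry above builds on, with the most extreme p-values 
trimmed before combining. Saner's conservative critical values and weights 
are not reproduced here, so treat this only as the general idea; the 
p-values are invented.

    from statistics import NormalDist

    norm = NormalDist()
    p_values = [0.001, 0.02, 0.04, 0.30, 0.55, 0.97]   # invented study p-values

    def inverse_normal_combined(ps):
        """Stouffer-type statistic: sum of normal quantiles over sqrt(k)."""
        z = [norm.inv_cdf(1 - p) for p in ps]
        return sum(z) / len(z) ** 0.5

    # Trim the smallest and largest p-values (potentially aberrant studies),
    # then combine what remains.
    trimmed = sorted(p_values)[1:-1]
    z_all = inverse_normal_combined(p_values)
    z_trim = inverse_normal_combined(trimmed)

    print(f"combined z, all studies: {z_all:.2f}")
    print(f"combined z, trimmed:     {z_trim:.2f}")
    print(f"one-sided p (naive normal reference) = {1 - norm.cdf(z_trim):.4f}")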

  
  ED372120  TM021963
  Philosophical Inquiry into Meta-Analysis.
  Grover, Burton L.
  Oct 1993
  14p.; Paper presented to the Northwest Philosophy of Education 
Society (Vancouver, British Columbia, Canada, October 1993).
  Document Type: POSITION PAPER (120);  EVALUATIVE REPORT (142);  
CONFERENCE PAPER (150)
  A search of the ERIC database and a review of the literature 
suggests that meta-analysis is ignored by philosophers, a situation 
that is regrettable but remediable.  Meta-analysis is a method by 
which one attempts to integrate findings quantitatively from several 
research studies related to a common general topic.  Philosophers 
should certainly pay attention to meta-analysis if their task is to 
investigate knowledge claims and assess their significance.  Three 
areas in particular are fertile ground for philosophers.  One is the 
importance of the questions considered by meta-analysis.  Another is 
the matter of generalization to a population.  A third area for 
philosophers to consider is variation in criterion variables and 
parsimony.  Many have been excited about the potential of meta-
analysis to make sense of a mass of confusing contradictory studies 
and to reach new conclusions where none seemed logically possible.  
While results of some meta-analyses encourage this excitement, 
disagreements among methodologists can be disconcerting.  Better 
technical expertise may resolve such problems, but it is also 
possible that philosophical consideration will give more direction to 
these efforts.  (Contains 9 references.) (SLD)
  Descriptors: *Effect Size; Hypothesis Testing; Integrated 
Activities; Literature Reviews; *Meta Analysis; *Philosophy; 
*Research Methodology; *Statistical Analysis
  Identifiers: Criterion Variables; *Philosophers


  ED358115  TM019881
  Trends in Published Meta-Analyses.
  Grover, Burton L.
  Apr 1993
  22p.; Paper presented at the Annual Meeting of the American 
Educational Research Association (Atlanta, GA, April 12-16, 1993).
  Document Type: REVIEW LITERATURE (070);  PROJECT DESCRIPTION (141); 
 CONFERENCE PAPER (150)
  Meta analytic procedures recommended by various authorities were 
the subject of a literature review designed not to discuss the 
relative merits of contrasting recommendations, but to find what is 
actually in the literature.  The sample reviewed included 89 articles 
published between 1986 and 1992, from 2 journals and 2 information 
databases.  Meta analyses were coded for a number of variables.  Most 
reported the databases used to find the studies.  The median number 
of studies synthesized by the data analysis was 48.  About three-
quarters of these reported collecting and aggregating mean 
differences.  Of the 66 that examined mean differences, 55 calculated 
and reported these differences as standardized mean differences.  
Fifteen studies reported effect size, eight used the standard normal 
deviate, and eight used some other method.  A large variety of 
statistical methods was reported for the analysis of the relationship 
of moderator variables with effect size.  Forty-seven studies 
reported an overall test of homogeneity of effect sizes.  One 
implication of the study for researchers is that, given the diversity 
of approaches to meta analysis, a good part of the potential audience 
may well prefer a meta analytic approach that differs from that 
chosen by the researcher.  Five tables present information on trends 
in meta analysis.  An appendix lists studies that appeared in the two 
main journals reviewed.  (SLD)
  Descriptors: Comparative Analysis; Databases; *Effect Size; 
Literature Reviews; *Meta Analysis; *Publications; *Research 
Methodology; Scholarly Journals; Synthesis; *Trend Analysis
  Identifiers: Mean (Statistics)


  EJ484369  TM517924
  Program Evaluation: A Pluralistic Enterprise.
  Sechrest, Lee, Ed.
  New Directions for Program Evaluation, n60 p1-101 Win 
  1993
  ISSN: 0164-7989
  Document Type: SERIAL (022);  EVALUATIVE REPORT (142);  JOURNAL 
ARTICLE (080)
  Two chapters of this issue consider critical multiplism as a 
research strategy with links to meta analysis and generalizability 
theory.  The unifying perspective it can provide for quantitative and 
qualitative evaluation is discussed.  The third chapter explores meta 
analysis as a way to improve causal inferences in nonexperimental 
data.  (SLD)
  Descriptors: Causal Models; *Evaluation Methods; *Generalizability 
Theory; Inferences; *Meta Analysis; *Program Evaluation; Qualitative 
Research; *Research Methodology; Statistical Analysis
  Identifiers: *Critical Multiplism; *Pluralistic Method

  
  EJ458576  TM517006
  Meta-Analysis: Literature Synthesis or Effect-Size Surface 
Estimation?
  Rubin, Donald B.
  Journal of Educational Statistics, v17 n4 p363-74 Win 
  1992
  Special issue with title "Meta-Analysis."
  ISSN: 0362-9791
  Document Type: JOURNAL ARTICLE (080);  EVALUATIVE REPORT (142)
  In contrast to the average effect sizes produced by the approach to 
meta-analysis that can be thought of as literature synthesis, an effect-
size surface is proposed as a function of scientifically relevant 
factors, estimated by extrapolating a response surface of observed 
effect sizes to a region of ideal studies.  (SLD)
  Descriptors: *Effect Size; Equations (Mathematics); *Estimation 
(Mathematics); *Literature Reviews; *Mathematical Models; *Meta 
Analysis; Research Methodology; *Synthesis
  Identifiers: *Extrapolation
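
A very loose Python sketch of the idea above, under the assumption that the 
scientifically relevant factors can be coded numerically: fit a regression 
(response) surface of observed effect sizes on those factors, then evaluate 
it at the values an ideal study would have. This is only a schematic reading 
of the abstract, not Rubin's actual estimator, and the data are invented.

    import numpy as np

    # Invented data: observed effect sizes with two study factors, e.g.,
    # dosage level and a 0-1 study-quality score.
    d       = np.array([0.20, 0.35, 0.30, 0.55, 0.65, 0.80])
    dosage  = np.array([1.0, 1.0, 2.0, 2.0, 3.0, 3.0])
    quality = np.array([0.5, 0.7, 0.6, 0.8, 0.7, 0.9])

    # Fit a simple linear response surface: d ~ 1 + dosage + quality.
    X = np.column_stack([np.ones_like(d), dosage, quality])
    beta, *_ = np.linalg.lstsq(X, d, rcond=None)

    # Extrapolate to a hypothetical "ideal" study (high dosage, quality of 1.0).
    ideal = np.array([1.0, 3.0, 1.0])
    print(f"estimated effect size at the ideal-study point: {ideal @ beta:.3f}")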

  
  EJ458572  TM517002
  Meta-Analysis.
  Hedges, Larry V.
  Journal of Educational Statistics, v17 n4 p279-96 Win 
  1992
  Special issue with title "Meta-Analysis."
  ISSN: 0362-9791
  Document Type: JOURNAL ARTICLE (080);  REVIEW LITERATURE (070);  
EVALUATIVE REPORT (142)
  The use of statistical methods to combine the results of 
independent empirical research studies (meta-analysis) has a long 
history, with work mainly divided into tests of the statistical 
significance of combined results and methods for combining estimates 
across studies.  Methods of meta-analysis and their applications are 
reviewed.  (SLD)
  Descriptors: Chi Square; *Educational Research; Effect Size; 
*Estimation (Mathematics); Hypothesis Testing; *Mathematical Models; 
*Meta Analysis; Research Methodology; *Statistical Data; Statistical 
Significance
  Identifiers: Empirical Research; Missing Data; Parametric Analysis; 
Random Effects

  
  EJ437344  CS742861
  Meta-Analysis for Primary and Secondary Data Analysis: The Super-
Experiment Metaphor.
  Jackson, Sally
  Communication Monographs, v58 n4 p449-62 Dec   1991
  ISSN: 0363-7751
  Document Type: JOURNAL ARTICLE (080);  RESEARCH REPORT (143)
  Considers the relation between meta-analysis statistics and 
analysis of variance statistics.  Discusses advantages and 
disadvantages as a primary data analysis tool.  Argues that the two 
approaches are partial paraphrases of one another.  Advocates an 
integrative approach that introduces the best of meta-analytic 
thinking into primary analysis without abandoning the characteristic 
features of analysis of variance.  (SR)
  Descriptors: *Analysis of Variance; Higher Education; *Meta 
Analysis; *Research; *Research Methodology

  
  ED339743  TM017675
  A Comparison of the Glass Meta-Analytic Technique with the Hunter-
Schmidt Meta-Analytic Technique on Three Studies from the Education 
Literature.
  Hough, Susan L.; Hall, Bruce W.
  Nov 1991
  25p.; Paper presented at the Annual Meeting of the Florida 
Educational Research Association (Clearwater, FL, November 13-16, 
1991).
  Document Type: RESEARCH REPORT (143);  CONFERENCE PAPER (150)
  The meta-analytic techniques of G. V. Glass (1976) and J. E. Hunter 
and F. L. Schmidt (1977) were compared through their application to 
three meta-analytic studies from education literature.  The following 
hypotheses were explored: (1) the overall mean effect size would be 
larger in a Hunter-Schmidt meta-analysis (HSMA) than in a Glass meta-
analysis (GMA) due to correction for measurement error when compared 
on the same set of experimental data; (2) the overall mean effect 
size calculated using the pooled within-group standard deviation in 
HSMA would not differ significantly from that in a GMA that uses the 
control group standard deviation; (3) most of the variation between 
study effect sizes would be due to sampling error according to 
sampling error correction formulas from the HSMA method; and (4) no 
moderator variables would be found because most of the variation 
between study effect sizes is due to sampling error.  A correlated t-
test was used to compare the overall mean effect sizes that were 
calculated using GMA and HSMA.  Pearson correlations and analyses of 
variances were run on the study data.  Three meta-analytic studies 
were selected and statistical data from each of the individual 
studies were collated.  Results support Hypotheses 1 and 2, but 
reject Hypotheses 3 and 4. It is argued that the HS correction 
formulas are technically more accurate, but that the Glass method is 
adequate in portraying effect size and more easily calculated.  Three 
tables present data from the meta-analyses.  A 21-item list of 
references is included.  (SLD)
  Descriptors: Comparative Analysis; Educational Research; *Effect 
Size; *Error of Measurement; Hypothesis Testing; *Literature Reviews; 
*Meta Analysis; *Research Methodology; Sampling
  Identifiers: *Glass Analysis Method; *Hunter Schmidt Meta Analysis
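
Hypothesis 3 above concerns how much of the between-study variation in 
effect sizes can be attributed to sampling error alone. The Python sketch 
below uses one common approximation for the sampling error variance of d 
that is associated with Hunter and Schmidt's procedures (their published 
formula differs slightly); the effect sizes and sample sizes are invented.

    # Invented per-study effect sizes and total sample sizes (d, N).
    studies = [(0.10, 60), (0.60, 80), (-0.05, 50), (0.80, 100), (0.30, 70)]

    k = len(studies)
    n_total = sum(n for _, n in studies)

    # Sample-size-weighted mean effect size.
    d_bar = sum(d * n for d, n in studies) / n_total

    # Observed (weighted) variance of effect sizes across studies.
    var_obs = sum(n * (d - d_bar) ** 2 for d, n in studies) / n_total

    # Approximate sampling error variance for a study of total size N:
    #   var_e(N) ~ (4 / N) * (1 + d_bar**2 / 8)
    var_err = sum(n * (4 / n) * (1 + d_bar ** 2 / 8) for _, n in studies) / n_total

    share = min(1.0, var_err / var_obs)
    print(f"mean d = {d_bar:.3f}")
    print(f"observed variance = {var_obs:.4f}, sampling error variance = {var_err:.4f}")
    print(f"about {100 * share:.0f}% of the observed variance is sampling error")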

  
  EJ412548  TM515213
  The Usefulness of the "Fail-Safe" Statistic in Meta-Analysis.
  Carson, Kenneth P.; And Others
  Educational and Psychological Measurement, v50 n2 p233-43 Sum 
  1990
  Document Type: JOURNAL ARTICLE (080);  RESEARCH REPORT (143)
  The utility of the fail-safe "N" statistic was evaluated by 
computing it for studies in three organizational research domains in 
which discrepant conclusions were reached by initial and subsequent 
meta-analyses.  Calculation of the fail-safe "N" may have led to more 
cautious interpretations.  Implications for meta-analyses are 
discussed.  (SLD)
  Descriptors: Comparative Analysis; Effect Size; Evaluation Methods; 
Institutional Research; *Mathematical Models; *Meta Analysis; 
Organizations (Groups); *Research Methodology
  Identifiers: *Fail Safe Strategies
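
For reference, a minimal Python sketch of Rosenthal's fail-safe N, the 
statistic evaluated above: the number of averaged-null unpublished studies 
that would have to exist before a combined result drops below significance. 
The study z-values are invented.

    import math

    # Invented one-sided z statistics from k located studies.
    z_values = [2.1, 1.7, 2.8, 0.9, 2.4]
    k = len(z_values)
    z_alpha = 1.645                      # one-sided .05 criterion

    # Stouffer combined z for the located studies.
    combined_z = sum(z_values) / math.sqrt(k)

    # Rosenthal's fail-safe N: how many z = 0 studies in the "file drawer"
    # would pull the combined z down to z_alpha.
    n_fs = sum(z_values) ** 2 / z_alpha ** 2 - k

    print(f"combined z = {combined_z:.2f}, fail-safe N = {n_fs:.1f}")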

  
  ED322218  TM015467
  Conventional and Newer Statistical Methods in Meta-Analysis.
  Kulik, James A.; Kulik, Chen-Lin C.
  Apr 1990
  6p.; Paper presented at the Annual Meeting of the American 
Educational Research Association (Boston, MA, April 16-20, 1990).
  Document Type: EVALUATIVE REPORT (142);  CONFERENCE PAPER (150)
  The assumptions and consequences of applying conventional and newer 
statistical methods to meta-analytic data sets are reviewed.  The 
application of the two approaches to a meta-analytic data set 
described by L. V. Hedges (1984) illustrates the differences.  Hedges 
analyzed six studies of the effects of open education on student 
cooperation.  The conventional way to test the hypothesis that 
treatment fidelity significantly influenced results is through a t-
test for independent results.  Hedges' more modern approach was to 
use a chi-square analog of the analysis of variance (ANOVA), a method 
that, in contrast to conventional statistics, found strong support 
for the hypothesized effect.  Conventional ANOVA and newer techniques 
were also applied to a data set in which all studies were of the same 
size, with each assumed to have experimental and control groups 
containing 25 students each.  The cell means and variances for 
Hedges' meta-analytic data set were reconstructed to determine the 
source of the difference in results between conventional and newer 
tests.  It is concluded that conventional ANOVA is appropriate for 
use with meta-analytic data sets because conventional ANOVA uses the 
correct error term for testing the significance of effects of group 
factors.  Newer meta-analytic methods are not recommended because of 
their use of an inappropriate error term.  (SLD)
  Descriptors: *Analysis of Variance; Chi Square; Comparative 
Analysis; *Data Analysis; Hypothesis Testing; *Meta Analysis; 
Research Methodology; Statistical Significance
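
The contrast described above can be seen in a small Python sketch: testing 
whether a study feature (here, high versus low treatment fidelity) matters, 
once with an ordinary two-sample t-test on the unweighted effect sizes and 
once with the weighted chi-square (Q-between) analog. The data are invented, 
not Hedges' six open-education studies.

    from scipy import stats

    # Invented (effect size, sampling variance) pairs, split by a study feature.
    high_fidelity = [(0.60, 0.05), (0.75, 0.04), (0.55, 0.06)]
    low_fidelity  = [(0.15, 0.05), (0.25, 0.06), (0.10, 0.04)]

    # "Conventional" approach: two-sample t-test on the raw effect sizes,
    # ignoring how precise each study is.
    t, p_t = stats.ttest_ind([d for d, _ in high_fidelity],
                             [d for d, _ in low_fidelity])

    # "Newer" approach: inverse-variance weighted group means and a chi-square
    # (Q-between) test on 1 degree of freedom.
    def weighted_mean(group):
        w = [1 / v for _, v in group]
        mean = sum(wi * d for wi, (d, _) in zip(w, group)) / sum(w)
        return mean, sum(w)

    (m_hi, w_hi), (m_lo, w_lo) = weighted_mean(high_fidelity), weighted_mean(low_fidelity)
    grand = (w_hi * m_hi + w_lo * m_lo) / (w_hi + w_lo)
    q_between = w_hi * (m_hi - grand) ** 2 + w_lo * (m_lo - grand) ** 2
    p_q = stats.chi2.sf(q_between, df=1)

    print(f"t-test: t = {t:.2f}, p = {p_t:.4f}")
    print(f"Q-between = {q_between:.2f} on 1 df, p = {p_q:.4f}")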

  
  EJ397347  TM514667
  Meta-Analysis in Education.
  Kulik, James A.; Kulik, Chen-Lin C.
  International Journal of Educational Research, v13 n3 p221-340 
  1989
  Document Type: JOURNAL ARTICLE (080);  EVALUATIVE REPORT (142);  
REVIEW LITERATURE (070)
  An overview of meta-analysis in education is provided.  Methodology 
is discussed and substantive findings from meta-analytic studies are 
reviewed for six major areas of educational research: (1) 
instructional systems; (2) instructional design; (3) curricular 
innovation; (4) teacher education and evaluation; (5) class and 
school organization; and (6) equity.  (SLD)
  Descriptors: Comparative Analysis; *Educational Research; 
Evaluation Methods; Literature Reviews; *Meta Analysis; *Research 
Methodology; Statistical Analysis

  
  ED309952  SE050788
  A Practical Guide to Modern Methods of Meta-Analysis.
  Hedges, Larry V.; And Others
  National Science Teachers Association, Washington, D.C.
  1989
  80p.
  Sponsoring Agency: National Science Foundation, Washington, D.C.
  Available From: National Science Teachers Association, 1742 
Connecticut Avenue, NW, Washington, DC 20009 ($9.50; PB-52).
  Document Type: INSTRUCTIONAL MATERIAL (051);  BOOK (010);  RESEARCH 
REPORT (143)
  Target Audience: Teachers; Researchers; Students; Practitioners
  Methods for meta-analysis have evolved dramatically since Gene 
Glass first proposed the term in 1976.  Since that time statistical 
and nonstatistical aspects of methodology for meta-analysis have been 
developing at a steady pace.  This guide is an attempt to provide a 
practical introduction to rigorous procedures in the meta-analysis of 
social science research.  It approaches the use of modern statistical 
methods in meta-analysis from the perspective of a potential user.
The treatment is limited to meta-analysis of studies of between-group 
comparisons using the standardized mean difference as an index of 
effect magnitude.  This guide is organized according to a variant of 
Cooper's stages of the research review process: (1) problem 
formulation; (2) data collection and data evaluation, data analysis 
and interpretation; and (3) presentation of results.  Although each 
stage is discussed, the greatest emphasis is placed on the stage of 
data analysis and interpretation.  Examples from a synthesis of 
research on the effects of science curricula are used throughout for 
illustration.  Because this book is intended to be a practical guide, 
the references are provided primarily to exemplify issues or 
techniques rather than to provide theoretical discussions or 
derivations.  (CW)
  Descriptors: *Comparative Analysis; Effect Size; Higher Education; 
*Meta Analysis; *Research Methodology; *Science Education; 
*Statistical Analysis; Statistical Data; Statistical Studies; 
Synthesis

  
  EJ388296  CG535451
  Meta-Analysis: A Statistical Method for Integrating the Results of 
Empirical Studies.
  Blimling, Gregory S.
  Journal of College Student Development, v29 n6 p543-49 Nov 
  1988
  Document Type: JOURNAL ARTICLE (080);  GENERAL REPORT (140)
  Introduces statistical and procedural methods of meta-analysis, and 
explains how to interpret the findings of meta-analytic studies 
currently appearing throughout the social science literature.  
Includes overview of meta-analysis and discussion of seven steps used 
in conducting a meta-analysis.  (Author/NB)
  Descriptors: *Meta Analysis; *Research Methodology; *Social Science 
Research; Statistical Analysis

  
  EJ382869  SE543620
  Disturbed by Meta-Analysis?
  Wachter, Kenneth W.
  Science, v241 n4872 p1407-08 Sep 16   1988
  Document Type: JOURNAL ARTICLE (080);  PROJECT DESCRIPTION (141)
  Defines meta-analysis as statistical procedures for combining 
results from previous separate studies.  Discusses four charges 
promoted by some skeptics as they relate to this statistical procedure.  
States that many of the trends making a place for meta-analysis are 
disturbing.  (RT)
  Descriptors: *Comparative Analysis; Effect Size; Higher Education; 
*Meta Analysis; *Research Methodology; *Statistical Analysis; 
*Statistical Data

  
  ED300411  TM012402
  Meta-analysis: A Bibliography of Conceptual Issues and Statistical 
Methods.
  Preiss, Raymond W.
  1988
  28p.
  Document Type: BIBLIOGRAPHY (131)
  Target Audience: Researchers; Teachers; Practitioners
  The usefulness of meta-analysis in summarizing domains of primary 
research has led to widespread use of the techniques.  Often, 
however, the researcher will have several options when cumulating 
empirical studies and readers will have questions regarding judgment 
calls made during a meta-analysis.  In these cases, it is helpful to 
consult the primary literature on meta-analytic theory and practice.  
Reported here are over 160 articles and papers divided into two 
bibliographies: (1) conceptual issues; and (2) statistical issues.  
The bibliographies are discussed in terms of conducting a meta-
analysis and teaching research methods.  Both bibliographies trace 
the evolution of meta-analytic procedures in order to highlight the 
intellectual dialog surrounding the practice.  The bibliography as a 
whole may prove valuable to teachers of graduate and undergraduate 
research methods courses, and may be used as a teaching tool by 
alerting students to the conceptual similarities of diverse 
statistical methods.  (Author/SLD)
  Descriptors: *Educational Research; Higher Education; *Meta 
Analysis; *Research Methodology; *Resource Materials; Statistical 
Analysis
  Identifiers: Conceptual Models

  
  ED297015  TM011982
  Meta-analysis: Historical Origins and Contemporary Practice.
  Kulik, James A.; Kulik, Chen-Lin C.
  Apr 1988
  39p.; Paper presented at the Annual Meeting of the American 
Educational Research Association (New Orleans, LA, April 5-9, 1988).
  Document Type: RESEARCH REPORT (143);  CONFERENCE PAPER (150);  
REVIEW LITERATURE (070)
  The early and recent history of meta-analysis is outlined.  After 
providing a definition of meta-analysis and listing its major 
characteristics, developments in statistics and research are 
described that influenced the formulation of modern meta-analytic 
methods.  Major meta-analytic methods currently in use are described.  
Statistical and other research developments contributing to meta-
analysis include the introduction of combined tests, combined 
treatment effects, use of percentages as outcome variables, and use 
of correlations as outcomes.  Meta-analytic approaches reviewed 
include Glass' methodology, Hedges' modern statistical methods, 
Hunter and Schmidt's validity generalization, and Rosenthal's methods.  
Problems affecting meta-analysis include inflated sample sizes, non-
independent measures in statistical analyses, the failure to take 
experimental design into account when estimating effect sizes and 
sampling errors, and the development of inappropriate statistical 
methods for testing the influence of study features on study outcomes.  
Four tables and two graphs are included.  (TJH)
  Descriptors: Generalization; Literature Reviews; *Meta Analysis; 
*Research Methodology; *Statistical Analysis
  Identifiers: *Historical Background

  
  ED319784  TM015018
  Putting the "But" Back in Meta-Analysis: Issues Affecting the 
Validity of Quantitative Reviews.
  L'Hommedieu, Randi; And Others
  Apr 1987
  12p.; Paper presented at the Annual Meeting of the American 
Educational Research Association (Washington, DC, April 20-24, 1987).
  Document Type: EVALUATIVE REPORT (142);  CONFERENCE PAPER (150)
  Some of the frustrations inherent in trying to incorporate 
qualifications of statistical results into meta-analysis are 
reviewed, and some solutions are proposed to prevent the loss of 
information in meta-analytic reports.  The validity of a meta-
analysis depends on several factors, including the thoroughness of 
the literature search, the selection of studies for inclusion, 
appropriate coding and analysis of studies, and the report format 
selected.  The solution proposed to the problem of methodological 
quality is to include all selected studies and report an average 
effect size for the aggregate.  The report on the meta-analysis then 
should be a qualitative, discursive argument rather than a simple 
statistic.  Proposals for putting the "but" back in meta-analysis 
are: (1) assure that it is not a substitute for qualitative review; 
(2) offer the reader information necessary to evaluate the validity 
of decisions made at the individual level; and (3) assure that 
qualifications of studies are not excluded from the analysis.  A 
thorough quantitative review should include: a discursive review of 
each study; a report on how each effect size was calculated; the 
location of the summary statistics upon which each effect size was 
based; and a discussion of study limitations and the factors that 
affect validity of effect size.  Suggestions are also given for 
appropriate reporting and avoiding publication bias.  (SLD)
  Descriptors: *Data Analysis; Literature Reviews; *Meta Analysis; 
*Research Methodology; Research Problems; Research Reports; 
*Statistical Analysis; *Validity

 
  ED262095  TM850578
  The Meta-Analytic Debate.
  Bangert-Drowns, Robert L.
  1985
  12p.; Paper presented at the Annual Meeting of the American 
Educational Research Association (69th, Chicago, IL, March 31-April 
4, 1985).
  Document Type: CONFERENCE PAPER (150);  EVALUATIVE REPORT (142)
  Target Audience: Researchers
  Since meta-analysis was described in 1976 (Glass) as the 
application of familiar experimental methods to the integration of 
available research, at least five coherent approaches to meta-
analysis have appeared in common use.  These approaches can be 
divided into two broad groups.  In the first group (including 
procedures by Robert Rosenthal, Larry Hedges, and Frank Schmidt and 
John Hunter), meta-analysis is used to approximate data-pooling.  
This type of meta-analysis attempts to answer the same questions as 
primary research, only larger samples are used by combining 
information from many studies.  The alternate view (shown in the 
procedures of Gene Glass and James Kulik) is that meta-analysis is a 
form of literature review.  As a literature review, meta-analysis is 
not meant to test a hypothesis but to summarize features and outcomes 
of a body of research.  The differences in approaches to meta-
analysis indicate that the procedure is still in a period of 
invention and change.  It is important that editors, consumers, and 
critics of meta-analysis know about these differences so that they 
can make more informed evaluations of meta-analytic findings.  
(Author/PN)
  Descriptors: Data; Effect Size; Error of Measurement; *Literature 
Reviews; *Measurement Techniques; *Meta Analysis; Research Design; 
*Research Methodology; Sampling
  Identifiers: Glass (GV)


  EJ309340  TM510208
  Advances in Statistical Methods for Meta-Analysis.
  Hedges, Larry V.
  New Directions for Program Evaluation, n24 p25-42 Dec 
  1984
  Theme issue with title "Issues in Data Synthesis." Research 
supported by the Spencer Foundation.
  Document Type: JOURNAL ARTICLE (080);  POSITION PAPER (120);  
RESEARCH REPORT (143)
  The adequacy of traditional effect size measures for research 
synthesis is challenged.  Analogues to analysis of variance and 
multiple regression analysis for effect sizes are presented.  The 
importance of tests for the consistency of effect sizes in 
interpreting results, and problems in obtaining well-specified models 
for meta-analysis are discussed.  (BS)
  Descriptors: Analysis of Variance; *Effect Size; Mathematical 
Models; *Meta Analysis; Research Methodology; Research Problems; 
*Statistical Analysis

  
  EJ297528  TM508771
  Effect Size Estimation in Meta-Analysis.
  Holmes, C. Thomas
  Journal of Experimental Education, v52 n2 p106-09 Win 
  1984
  Document Type: RESEARCH REPORT (143)
  Methods for estimating effect sizes when complete data are not 
reported are presented.  When not precise, these methods provide a 
conservative estimate and, therefore, allow for the inclusion in a 
meta-analysis of relevant studies whose data might otherwise be 
discarded.  (Author/PN)
  Descriptors: *Effect Size; *Estimation (Mathematics); *Measurement 
Techniques; *Meta Analysis; Research Methodology; *Research Problems; 
Statistical Analysis
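
As one example of the kind of recovery described above, the Python sketch 
below reconstructs a standardized mean difference from a reported t 
statistic and group sizes when means and standard deviations are not given. 
The conversion is standard; the reported values are invented and are not 
from the article.

    import math

    # A primary study reports only t(58) = 2.40 with n1 = n2 = 30 (invented).
    t, n1, n2 = 2.40, 30, 30

    # Standard conversion from an independent-groups t statistic to d.
    d = t * math.sqrt((n1 + n2) / (n1 * n2))

    # The same idea works for a two-group F, since F = t squared.
    F = t ** 2
    d_from_F = math.sqrt(F * (n1 + n2) / (n1 * n2))

    print(f"d from t = {d:.3f}, d from F = {d_from_F:.3f}")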

  
  ED249266  TM840619
  Power Differences among Tests of Combined Significance.
  Becker, Betsy Jane
  Apr 1984
  21p.; Paper presented at the Annual Meeting of the American 
Educational Research Association (68th, New Orleans, LA, April 23-27, 
1984).
  Document Type: CONFERENCE PAPER (150);  RESEARCH REPORT (143)
  Target Audience: Researchers
  Power is an indicator of the ability of a statistical analysis to 
detect a phenomenon that does in fact exist.  The issue of power is 
crucial for social science research because sample size, effects, and 
relationships studied tend to be small and the power of a study 
relates directly to the size of the effect of interest and the sample 
size.  Quantitative synthesis methods can provide ways to overcome 
the problem of low power by combining the results of many studies.  
In the study at hand, large-sample (approximate) normal distribution 
theory for the non-null density of the individual p value is used to 
obtain power functions for significance value summaries.  Three p-
value summary methods are examined: Tippett's counting method, 
Fisher's inverse chi-square summary, and the logit method.  Results 
for pairs of studies and for a set of five studies are reported.  
They indicate that the choice of a "most-powerful" summary will 
depend on the number of studies to be summarized, the sizes of the 
effects in the populations studied, and the sizes of the samples 
chosen from those populations.  (BW)
  Descriptors: Effect Size; Hypothesis Testing; *Meta Analysis; 
Research Methodology; Sample Size; *Statistical Analysis; 
*Statistical Significance
  Identifiers: *Power (Statistics)
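
Two of the three p-value summaries examined above are easy to state. The 
Python sketch below applies Fisher's inverse chi-square summary and 
Tippett's minimum-p counting rule to invented p-values; the logit method and 
the power calculations of the paper are not reproduced.

    import math
    from scipy.stats import chi2

    p_values = [0.04, 0.20, 0.01, 0.35, 0.08]      # invented study p-values
    k = len(p_values)
    alpha = 0.05

    # Fisher: -2 * sum(ln p) follows chi-square with 2k df under the joint null.
    fisher_stat = -2 * sum(math.log(p) for p in p_values)
    fisher_p = chi2.sf(fisher_stat, df=2 * k)

    # Tippett: reject if the smallest p-value falls below 1 - (1 - alpha)**(1/k).
    tippett_crit = 1 - (1 - alpha) ** (1 / k)
    tippett_reject = min(p_values) < tippett_crit

    print(f"Fisher: chi-square({2 * k}) = {fisher_stat:.2f}, p = {fisher_p:.4f}")
    print(f"Tippett: min p = {min(p_values):.3f}, critical value = {tippett_crit:.4f}, "
          f"reject = {tippett_reject}")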

  
  ED248618  EA017186
  Uses and Misuses of Meta-Analysis.
  Kulik, James A.
  Apr 1984
  10p.; Paper presented at the Annual Meeting of the American 
Educational Research Association (New Orleans, LA, April 23-27, 
1984).
  Document Type: POSITION PAPER (120);  CONFERENCE PAPER (150)
  Target Audience: Researchers
  Several developments in the use of the new method of meta-analysis 
give cause for optimism.  First, different meta-analysts are doing 
work in the same areas, leading to increased confidence in meta-
analytic results.  Second, meta-analysts are beginning to include raw 
data in their reports, which helps readers pinpoint the exact studies 
that lead to disagreements in conclusions.  Third, reviewers are 
comparing results from unrelated meta-analyses, which can lead to a 
better understanding of the factors influencing the outcomes of 
educational research.  Finally, some of the worst abuses that have 
taken place in meta-analysis appear to be in the past.  (DCS)
  Descriptors: *Comparative Analysis; *Meta Analysis; Research; 
*Research Methodology; *Statistical Analysis; Statistical Data; 
Synthesis


  
  ED248262  TM840565
  Developments in Meta-Analysis: A Review of Five Methods.
  Bangert-Drowns, Robert L.
  Michigan Univ., Ann Arbor. Center for Research on Learning and 
Teaching.  Jun 1984
  69p.
  Document Type: REVIEW LITERATURE (070)
  Target Audience: Researchers
  It is easy to observe that meta-analysis is quickly establishing 
itself as a useful tool of the social sciences.  Perusal of 
representative journals confirms that meta-analysis has been applied 
in various ways to diverse literatures.  It is imperative, therefore, 
that reviewers, publishers, consumers, and critics of these reviews 
be best informed about the method.  It is especially important to 
clarify exactly what the term "meta-analysis" refers to.  This 
article proposes a clarification in two ways.  First, meta-analysis 
is compared to and distinguished from other methods of research 
integration that preceded it.  Second, five different types of meta-
analytic method are distinguished by their purposes (for scientific 
criticism or for literature review) or by their methods (using 
combined probability, using tests of homogeneity, or using estimates 
of population variation).  (Author/BW)
  Descriptors: Estimation (Mathematics); Literature Reviews; *Meta 
Analysis; Probability; *Research Methodology; Research Utilization; 
Scientific Research; Statistical Analysis

  
  ED243937  TM840250
  Diagnostic Techniques in Research Synthesis.
  Ludlow, Larry H.
  Apr 1984
  23p.; Paper presented at the Annual Meeting of the American 
Educational Research Association (68th, New Orleans, LA, April 23-27, 
1984).
  Document Type: CONFERENCE PAPER (150);  RESEARCH REPORT (143)
  Target Audience: Researchers
  One purpose for combining research studies is to estimate a 
population treatment effect.  The internal validity of a model for 
how effect size estimates should be computed and combined will hinge 
upon the homogeneity of the effect size variation.  Effect size 
variation may be assessed in the form of a summary fit statistic, and 
a direct consideration of the extent of individual effect variation 
from the population estimate.  This paper presents some diagnostic 
techniques that facilitate the analysis of effect size variation.  
Bivariate plots of effect size residuals can aid in detecting sources 
of variation inconsistent with the model.  Particularly, plotting the 
standardized residual of each study against the homogeneity of the 
sample if that study were removed is of interest for assessing the 
extent of heterogeneity contributed by individual studies.  It is 
emphasized that the use of diagnostic techniques is useful for 
revealing why a lack of fit occurred, and is not advocated for the ad 
hoc purpose of finding a best-fitting subset of studies.  (BW)
  Descriptors: *Data Analysis; *Effect Size; Estimation (Mathematics); 
Graphs; *Meta Analysis; Research Methodology
  Identifiers: Data Interpretation; *Residuals (Statistics)
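
A small Python sketch of the kind of diagnostics described above: a 
standardized residual for each effect size from the weighted mean, and the 
homogeneity statistic recomputed with each study removed. The plots 
discussed in the paper are not reproduced, and the data are invented.

    import math

    # Invented effect sizes and sampling variances; study 3 looks aberrant.
    effects   = [0.25, 0.30, 0.95, 0.20, 0.35]
    variances = [0.04, 0.05, 0.05, 0.06, 0.04]

    def weighted_mean_and_Q(d, v):
        w = [1 / vi for vi in v]
        mean = sum(wi * di for wi, di in zip(w, d)) / sum(w)
        Q = sum(wi * (di - mean) ** 2 for wi, di in zip(w, d))
        return mean, Q

    mean_all, Q_all = weighted_mean_and_Q(effects, variances)
    print(f"overall weighted mean = {mean_all:.3f}, Q = {Q_all:.2f}")

    for i, (d, v) in enumerate(zip(effects, variances)):
        # Rough standardized residual of study i from the overall weighted mean
        # (more refined versions adjust for the study's own contribution).
        z = (d - mean_all) / math.sqrt(v)
        # Homogeneity of the remaining studies if study i were removed.
        _, Q_minus_i = weighted_mean_and_Q(effects[:i] + effects[i + 1:],
                                           variances[:i] + variances[i + 1:])
        print(f"study {i + 1}: residual z = {z:+.2f}, Q without it = {Q_minus_i:.2f}")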

  
  EJ292517  TM508577
  Theory of Estimation and Testing of Effect Sizes: Use in Meta-
Analysis.
  Kraemer, Helena Chmura
  Journal of Educational Statistics, v8 n2 p93-101 Sum 
  1983
  Available From: UMI
  Document Type: RESEARCH REPORT (143)
  Approximations to the distribution of a common form of effect size 
are presented.  Single sample tests, confidence interval formulation, 
tests of homogeneity, and pooling procedures are based on these 
approximations.  Caveats are presented concerning statistical 
procedures as applied to sample effect sizes commonly used in meta-
analysis.  (Author)
  Descriptors: *Effect Size; *Meta Analysis; *Research Methodology; 
Statistical Data; *Statistical Distributions; *Statistical Studies; 
Synthesis


  ED225049  CG016397
  Meta-Analytic Applications in Program Evaluation.
  Wolf, Fredric M.
  Aug 1982
  27p.; Paper presented at the Annual Convention of the American 
Psychological Association (90th, Washington, DC, August 23-27, 1982).
  Document Type: RESEARCH REPORT (143);  CONFERENCE PAPER (150)
  In a variety of psychological and educational situations, it is 
desirable to be able to make data-based evaluative summary statements 
regarding the impact of a given program.  Certain procedures 
typically used in meta-analytic studies that review and integrate 
results from individual studies, such as combined tests and measures 
of effect size, are particularly well suited for program evaluation 
in certain situations.  This paper describes a number of such 
situations, briefly reviews the literature on combined tests and 
effect size, and provides several illustrative numerical examples of 
their application in program evaluation.  The three examples 
illustrate the practical utility of using combined tests and measures 
of effect size in program evaluations in situations where data are 
available either cross-sectionally, or on successive occasions, or on 
independent components of a larger program.  The materials suggest 
that measures of effect size are clearly valuable in providing 
potential insight into the differential impact of a given program, 
information that is more obscured when relying solely on statistical 
tests.  (Author/JAC)
  Descriptors: Case Studies; *Data Analysis; Elementary Secondary 
Education; *Evaluation Methods; Higher Education; Literature Reviews; 
Pretests Posttests; *Program Effectiveness; *Program Evaluation; 
Psychological Evaluation; *Research Methodology; Student Evaluation
  Identifiers: *Meta Analysis


  EJ269282  TM507246
  Fitting Categorical Models to Effect Sizes from a Series of 
Experiments.
  Hedges, Larry V.
  Journal of Educational Statistics, v7 n2 p119-37 Sum 
  1982
  Document Type: JOURNAL ARTICLE (080);  RESEARCH REPORT (143)
  A statistical test is described which determines homogeneity of 
effect size of an experiment series.  An overall fit statistic is 
partitioned into between-class fit statistic and within-class fit 
statistic.  These statistics permit assessment of differences between 
effect sizes for different classes and homogeneity of effect size 
within classes.  (Author/DWH)
  Descriptors: *Analysis of Variance; *Data Analysis; *Estimation 
(Mathematics); Goodness of Fit; Mathematical Models; Research 
Methodology; Statistical Significance; *Statistical Studies
  Identifiers: *Effect Size; *Meta Analysis
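
The partitioning described above can be sketched directly in Python: the 
overall homogeneity statistic splits into a between-class part and 
within-class parts that sum back to the total. Classes, effect sizes, and 
variances are invented.

    # Invented (class label, effect size, sampling variance) for six studies.
    studies = [
        ("A", 0.55, 0.05), ("A", 0.60, 0.04), ("A", 0.70, 0.06),
        ("B", 0.15, 0.05), ("B", 0.25, 0.04), ("B", 0.10, 0.06),
    ]

    def weighted_mean_and_Q(rows):
        w = [1 / v for _, _, v in rows]
        mean = sum(wi * d for wi, (_, d, _) in zip(w, rows)) / sum(w)
        Q = sum(wi * (d - mean) ** 2 for wi, (_, d, _) in zip(w, rows))
        return mean, Q, sum(w)

    grand_mean, Q_total, _ = weighted_mean_and_Q(studies)

    Q_between = Q_within = 0.0
    for label in sorted({c for c, _, _ in studies}):
        rows = [r for r in studies if r[0] == label]
        class_mean, class_Q, class_w = weighted_mean_and_Q(rows)
        Q_within += class_Q
        Q_between += class_w * (class_mean - grand_mean) ** 2
        print(f"class {label}: mean = {class_mean:.3f}, within-class Q = {class_Q:.2f}")

    print(f"Q_between = {Q_between:.2f}, Q_within = {Q_within:.2f}, "
          f"Q_total = {Q_total:.2f} (the first two sum to the third)")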

  
  ED228280  TM830156
  Meta-Analysis, Meta-Evaluation and Secondary Analysis.
  Martin, Paula H.
  Oct 1982
  37p.
  Document Type: REVIEW LITERATURE (070)
  Meta-analysis, meta-evaluation and secondary analysis are methods 
of summarizing, examining, evaluating or re-analyzing data in 
research and evaluation efforts.  Meta-analysis involves the 
summarization of research findings to come to some general 
conclusions regarding effects or outcomes of a given 
treatment/project/program.  Glass's approach standardizes various 
effect measures and controls for these in analyzing data.  Meta-
evaluation is a method of evaluation research examining evaluation 
methodologies, procedures, data analysis techniques, interpretation 
of results, and the validity and reliability of conclusions.  
Secondary analysis, as defined by Glass, is, "the re-analysis of data 
for the purpose of answering the original research question with 
better statistical techniques or answering new questions with old 
data." A review of the literature related to these methodologies 
gives examples of actual studies using these techniques.  Specifics 
on meta-evaluation in federally funded bilingual education programs 
illustrate the methodology.  As current budgetary cutbacks affect 
state and federal programs, meta-analysis and meta-evaluation are 
assuming important roles.  (CM)
  Descriptors: Bilingual Education Programs; *Educational Research; 
Elementary Secondary Education; Federal Programs; *Program Evaluation; 
*Research Methodology; *Statistical Analysis; *Statistical Data
  Identifiers: Elementary Secondary Education Act Title VII; Glass (G 
V); *Meta Analysis; *Meta Evaluation; Secondary Analysis


  EJ266601  UD509289
  Rigor in Data Synthesis: A Case Study of Reliability in Meta-
analysis.
  Stock, William A.; And Others
  Educational Researcher, v11 n6 p10-14 Jun-Jul   1982
  Document Type: JOURNAL ARTICLE (080);  RESEARCH REPORT (143)
  Describes a study of reliability among coders of information for 
meta-analysis (a quantitative procedure for synthesizing data in 
primary research reports) of research on life satisfaction in 
American adults.  Identifies sources and areas of disagreement among 
coders and discusses measures that can be used to enhance intercoder 
consistency.  (Author/MJL)
  Descriptors: *Classification; Correlation; *Data Analysis; 
Experimenter Characteristics; *Reliability; *Research Problems; 
Researchers; *Synthesis; Validity
  Identifiers: Coding; *Meta Analysis
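
The reliability question above is usually quantified with simple agreement 
indices. The Python sketch below computes percent agreement and Cohen's 
kappa for two coders classifying the same studies; the codes are invented, 
and the specific measures used in the article may differ.

    from collections import Counter

    # Invented design codes assigned to ten studies by two coders.
    coder1 = ["exp", "exp", "quasi", "survey", "exp",
              "quasi", "survey", "exp", "quasi", "exp"]
    coder2 = ["exp", "quasi", "quasi", "survey", "exp",
              "quasi", "exp", "exp", "quasi", "exp"]

    n = len(coder1)
    p_obs = sum(a == b for a, b in zip(coder1, coder2)) / n

    # Chance agreement computed from each coder's marginal distribution.
    c1, c2 = Counter(coder1), Counter(coder2)
    p_chance = sum((c1[cat] / n) * (c2[cat] / n) for cat in set(coder1) | set(coder2))

    kappa = (p_obs - p_chance) / (1 - p_chance)
    print(f"percent agreement = {100 * p_obs:.0f}%, Cohen's kappa = {kappa:.2f}")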

  
  ED227133  TM830125
  Statistical Methodology in Meta-Analysis.
  Hedges, Larry V.
  ERIC Clearinghouse on Tests, Measurement, and Evaluation, 
Princeton, N.J.  Dec 1982
  79p.
  Sponsoring Agency: National Inst. of Education (ED), Washington, 
DC.
  Available From: ERIC/TM, Educational Testing Service, Princeton, NJ 
08541 ($7.00).
  Document Type: ERIC PRODUCT (071);  NON-CLASSROOM MATERIAL (055)
  Meta-analysis has become an important supplement to traditional 
methods of research reviewing, although many problems must be 
addressed by the reviewer who carries out a meta-analysis.  These 
problems include identifying and obtaining appropriate studies, 
extracting estimates of effect size from the studies, coding or 
classifying studies, analyzing the data, and reporting the results of 
the data analysis.  Earlier work by Glass, McGaw, and Smith describes 
methods for dealing with these problems and has generated a great 
interest in the development of systematic statistical theory for meta-
analysis.  This monograph supplements the existing literature on meta-
analysis by providing a unified treatment of rigorous statistical 
methods for meta-analysis.  These methods provide a mechanism for 
responding to criticisms of meta-analysis, such as that meta-analysis 
may lead to oversimplified conclusions or be influenced by design 
flaws in the original research studies.  Contents include: indices of 
effect size, statistical analysis of effect size data, assumptions 
and the statistical model, estimations of effect size, an analogue to 
the analysis of variance for effect sizes, the effects of measurement 
error on effect size, statistical analysis when correlations or 
proportions are the index of effect magnitude, and statistical 
analysis for correlations as effect magnitude.  (Author/PN)
  Descriptors: Analysis of Variance; Correlation; Error of 
Measurement; *Mathematical Models; *Research Methodology; Research 
Problems; *Statistical Analysis; Statistical Studies
  Identifiers: *Effect Size; Glass (G V); *Meta Analysis
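
One of the monograph topics listed above is statistical analysis when 
correlations are the index of effect magnitude. A minimal Python sketch of 
the usual route follows: transform each correlation to Fisher's z, combine 
with weights of n - 3, and transform back. The study correlations and sample 
sizes are invented.

    import math

    # Invented study correlations and sample sizes (r, n).
    studies = [(0.30, 50), (0.45, 80), (0.25, 40), (0.38, 120)]

    # Fisher z transform; the large-sample variance of z is 1 / (n - 3).
    z_w = [(math.atanh(r), n - 3) for r, n in studies]

    z_bar = sum(z * w for z, w in z_w) / sum(w for _, w in z_w)
    se = 1 / math.sqrt(sum(w for _, w in z_w))

    # Transform the combined estimate and its CI limits back to the r metric.
    r_bar = math.tanh(z_bar)
    lo, hi = math.tanh(z_bar - 1.96 * se), math.tanh(z_bar + 1.96 * se)
    print(f"combined r = {r_bar:.3f}, 95% CI = ({lo:.3f}, {hi:.3f})")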

  
  EJ260365  SE530947
  Meta-analysis: An Approach to the Synthesis of Research Results.
  Glass, Gene V.
  Journal of Research in Science Teaching, v19 n2 p93-112 Feb 
  1982
  Document Type: JOURNAL ARTICLE (080);  RESEARCH REPORT (143)
  Discusses three general characteristics and four criticisms of meta-
analysis (statistical analysis of the summary findings of many 
research studies).  Illustrates application of meta-analysis on 
research studies relating to school class size and achievement and 
inquiry teaching of biology.  (JN)
  Descriptors: Academic Achievement; Biology; Class Size; College 
Science; Elementary School Science; Elementary Secondary Education; 
Higher Education; Inquiry; *Research Methodology; *Science Education; 
*Science Instruction; Secondary School Science; *Statistical Analysis
  Identifiers: *Meta Analysis; *Science Education Research


  EJ256426  EC140469
  Meta-Analysis and the Integration of Research in Special Education.
  Kavale, Kenneth A.; Glass, Gene V.
  Journal of Learning Disabilities, v14 n9 p531-38 Nov 
  1981
  Document Type: JOURNAL ARTICLE (080);  REVIEW LITERATURE (070);  
POSITION PAPER (120)
  Traditional methods of integrating special education research (such 
as narrative reviews and box score analyses) are described, and the 
procedures involved in meta-analysis, by which findings from previous 
studies are systematically synthesized, are detailed.  Benefits of 
this approach are noted.  (CL)
  Descriptors: *Disabilities; *Research Methodology; *Statistical 
Analysis
  Identifiers: *Meta Analysis


  ED208024  TM810715
  Statistical Aspects of Effect Size Estimation.
  Hedges, Larry V.
  Apr 1981
  40p.; Paper presented at the Annual Meeting of the American 
Educational Research Association (65th, Los Angeles, CA, April 13-17, 
1981).
  Sponsoring Agency: Spencer Foundation, Chicago, Ill.
  Document Type: CONFERENCE PAPER (150);  RESEARCH REPORT (143)
  When the results of a series of independent studies are combined, 
it is useful to quantitatively estimate the magnitude of the effects.  
Several methods for estimating effect size are compared in this paper.  
Glass' estimator and the uniformly minimum variance unbiased 
estimator are based on the ratio of the sample mean difference and 
the pooled within-group standard deviation.  The third estimator is 
the maximum likelihood estimator.  The fourth estimator is a shrunken 
form of the minimum variance unbiased estimator.  All four estimators 
are shown to be equivalent in large samples, but they differ in 
finite samples.  Two procedures for testing the fit of the data to 
the proposed structural model and for detection of outliers, an 
example of the application of the techniques, and a summary of 
recommendations on statistical procedures for estimation of effect 
size from a series of experiments are presented.  (Author/BW)
  Descriptors: *Literature Reviews; *Mathematical Models; Maximum 
Likelihood Statistics; *Statistical Analysis
  Identifiers: *Effect Size; Estimation (Mathematics); *Meta Analysis; 
Sample Size
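
The estimators compared above agree in large samples but differ in small 
ones. A minimal Python sketch of the simplest contrast follows: the 
uncorrected standardized mean difference based on the pooled within-group 
SD versus the approximately unbiased version obtained by multiplying by the 
usual small-sample correction factor 1 - 3/(4*df - 1). The summary 
statistics are invented.

    import math

    # Invented summary statistics for one small study.
    n1, n2 = 12, 12
    m1, m2 = 30.0, 25.0
    s1, s2 = 9.0, 11.0

    # Uncorrected estimator based on the pooled within-group SD.
    df = n1 + n2 - 2
    s_pooled = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / df)
    g_raw = (m1 - m2) / s_pooled

    # Approximately unbiased estimator: apply the small-sample correction factor.
    J = 1 - 3 / (4 * df - 1)
    g_unbiased = J * g_raw

    # In large samples J approaches 1, so the estimators converge, as noted above.
    print(f"uncorrected estimate = {g_raw:.3f}, bias-corrected estimate = {g_unbiased:.3f}")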


  ED208003  TM810675
  Integration of Research Studies: Meta-Analysis of Research. Methods 
of Integrative Analysis; Final Report.
  Glass, Gene V.; And Others
  Colorado Univ., Boulder. Lab. of Educational Research.
  15 Aug 1980
  340p.; Appendix B is removed due to copyright restrictions.
  Sponsoring Agency: National Inst. of Education (ED), Washington, 
D.C.
  Document Type: REVIEW LITERATURE (070);  NON-CLASSROOM MATERIAL (055)
  Integrative analysis, or what is coming to be known as meta-
analysis, is the integration of the findings of many empirical 
research studies of a topic.  Meta-analysis differs from traditional 
narrative forms of research reviewing in that it is more quantitative 
and statistical.  Thus, the methods of meta-analysis are merely 
statistical methods, suitably adapted in many instances, that are 
applicable to the job of integrating findings from many studies.  A 
meta-analysis involves about a half-dozen steps: (1) defining the 
problem, (2) finding the research studies, (3) coding the study 
characteristics, (4) measuring the study findings on a common scale, 
and (5) analyzing the aggregation of findings and their relationship 
to the characteristics.  The thinking and research reported here is 
recorded in roughly the same order.  The report encompasses general 
background on the approach taken in a meta-analysis, numerous 
illustrations of the approach, and the results of some original 
research.  The thinking can be read in at least three 
ways: as a textbook of methods of integrative analysis, as a record 
of some new ideas about integrative analysis, or as an apologia for 
meta-analysis.  (Author/BW)
  Descriptors: *Data Analysis; *Literature Reviews; *Research 
Methodology; Research Problems; Statistical Analysis
  Identifiers: *Meta Analysis
  
  
  EJ235541  TM505692
  Research Integration: The State of the Art.
  Walberg, Herbert J., Ed.; Haertel, Edward H., Ed.
  Evaluation in Education: International Progress, v4 n1 p1-142 
  1980
  Document Type: JOURNAL ARTICLE (080);  RESEARCH REPORT (143);  
REVIEW LITERATURE (070)
  Forty-five brief papers cover four areas: research integration 
methodology; standard curriculum topics in educational research and 
evaluation; individual differences and special programs suited to 
particular groups of students; and programs of research integration 
being conducted by the University of Illinois-Chicago Circle and the 
National Institute of Education.  (BW)
  Descriptors: Curriculum Research; Elementary Secondary Education; 
Higher Education; Individual Differences; Policy Formation; 
Productivity; *Research Methodology; *Research Problems; State of the 
Art Reviews; *Synthesis
  Identifiers: *Research Integration


  EJ239575  TM505815
  Methods for Integrative Reviews.
  Jackson, Gregg B.
  Review of Educational Research, v50 n3 p438-60 Fall 
  1980
  Document Type: JOURNAL ARTICLE (080);  REVIEW LITERATURE (070);  
EVALUATIVE REPORT (142)
  Methods for reviews of research that focus on inferring 
generalizations about substantive issues from a set of studies 
directly bearing on those issues are examined.  The primary source of 
data was a content analysis of two samples of such reviews.  
(Author/RL)
  Descriptors: *Data Analysis; *Literature Reviews; *Research 
Methodology; *Social Science Research
  Identifiers: *Integrative Reviews; *Meta Analysis

  
  EJ237866  TM505768
  Choice of the Metric for Effect Size in Meta-analysis.
  McGaw, Barry; Glass, Gene V.
  American Educational Research Journal, v17 n3 p325-37 Fall 
  1980
  Document Type: JOURNAL ARTICLE (080);  RESEARCH REPORT (143)
  There are difficulties in expressing effect sizes on a common 
metric when some studies use transformed scales to express group 
differences, or use factorial designs or covariance adjustments to 
obtain a reduced error term.  A common metric on which effect sizes 
may be standardized is described.  (Author/RL)
  Descriptors: Control Groups; Error of Measurement; *Mathematical 
Models; *Research Problems; *Scaling; *Scoring Formulas; Statistical 
Bias; Statistical Significance
  Identifiers: *Effect Size; *Meta Analysis; Standard Deviation
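
One of the difficulties named above is that a covariance adjustment shrinks 
the error term, which inflates an effect size standardized on the adjusted 
SD. A rough Python sketch of putting such an estimate back on the raw-score 
metric follows; it assumes a single covariate and the standard relation 
s_adjusted = s_raw * sqrt(1 - r^2), which may not be the metric the authors 
recommend, and the numbers are invented.

    import math

    # A study reports an effect size standardized on the covariance-adjusted
    # (residual, within-group) SD, plus the covariate-outcome correlation.
    d_adjusted = 0.70        # invented
    r_covariate = 0.60       # invented pooled within-group correlation

    # With one covariate the residual SD is s_raw * sqrt(1 - r^2), so an effect
    # size computed on the adjusted SD can be rescaled to the raw-score metric.
    d_raw_metric = d_adjusted * math.sqrt(1 - r_covariate ** 2)

    print(f"adjusted-metric d = {d_adjusted:.2f}, raw-score-metric d = {d_raw_metric:.2f}")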
  
  

  EJ207324  TM504364
  Meta-Analysis of Research on Class Size and Achievement.
  Glass, Gene V.; Smith, Mary Lee
  Educational Evaluation and Policy Analysis, v1 n1 p2-16 Jan-Feb 
  1979
  Document Type: JOURNAL ARTICLE (080);  REVIEW LITERATURE (070)
  The relationship between class size and school achievement is 
investigated in this "meta-analysis" of the class size literature.  
In addition, some methodological considerations of meta-analysis are 
discussed.  (JKS)
  Descriptors: *Academic Achievement; Classroom Research; *Class Size; 
Elementary Secondary Education; *Evaluation; Evaluation Methods; 
Literature Reviews
  Identifiers: Meta Analysis


  EJ149191  UD504875
  Primary, Secondary, and Meta-Analysis of Research
  Glass, Gene V.
  Educational Researcher, v5 n10 p3-8 Nov 1976
  Examines data analysis at three levels: primary analysis is the 
original analysis of data in a research study; secondary analysis is 
the re-analysis of data for the purpose of answering the original 
research question with better statistical techniques, or answering 
new questions with old data; and meta-analysis refers to the 
statistical analysis of many analysis results from individual studies 
for the purpose of integrating the findings.  (Author/JM)
  Descriptors: *Research Methodology; *Data Analysis; *Statistical 
Analysis; *Research Utilization; *Educational Research; Research 
Problems; Information Utilization; Research Design; Literature 
Reviews; Research Reviews (Publications)

Return to FAQ on Meta Analysis in Educational Research

Return to the Index of FAQs

