Educational policy-makers around the world constantly make decisions about how to use scarce resources to improve the education of children. Unfortunately, their decisions are rarely informed by evidence on the consequences of similar initiatives in other settings. Nor are decisions typically accompanied by well-formulated plans to evaluate their causal impacts. As a result, knowledge about what works in different situations has been very slow to accumulate. Over the last several decades, advances in research methodology, administrative record keeping, and statistical software have dramatically increased the potential for researchers to conduct compelling evaluations of the causal impacts of educational interventions, and the number of well-designed studies is growing. Written in clear, concise prose, Methods Matter: Improving Causal Inference in Educational and Social Science Research offers essential guidance for those who evaluate educational policies. Using numerous examples of high-quality studies that have evaluated the causal impacts of important educational interventions, the authors go beyond the simple presentation of new analytical methods to discuss the controversies surrounding each study and to provide heuristic explanations that are broadly accessible. Murnane and Willett offer strong methodological insights on causal inference while also examining the consequences of a wide variety of educational policies implemented in the U.S. and abroad. Representing a unique contribution to the literature on educational research, this landmark text will be invaluable for students and researchers in education and public policy, as well as those interested in social science.



About the Authors

Richard J. Murnane, Juliana W. and William Foss Thompson Professor of Education and Society at Harvard University, is an economist who focuses his research on the relationships between education and the economy, teacher labor markets, the determinants of children's achievement, and strategies for making schools more effective. John B. Willett, Charles William Eliot Professor of Education at Harvard University, is a quantitative methodologist who has devoted his career to improving the research design and data-analytic methods used in education and the social sciences, with a particular emphasis on the design of longitudinal research and the analysis of longitudinal data.



Contents

1 The Challenge for Educational Research
1.1 The Long Quest
1.2 The Quest is World-Wide
1.3 What this Book is About
1.4 What to Read Next

2 The Importance of Theory
2.1 What is Theory?
2.2 Theory in Education
2.3 Voucher Theory
2.4 What Kind of Theories?
2.5 What to Read Next

3 Designing Research to Address Causal Questions
3.1 Conditions to Strive for in All Research
3.2 Making Causal Inferences
3.3 Past Approaches to Answering Causal Questions in Education
3.4 The Key Challenge of Causal Research
3.5 What to Read Next

4 Investigator-Designed Randomized Experiments
4.1 Conducting Randomized Experiments
4.1.1 An Example of a "Two-Group" Experiment
4.2 Analyzing Data from Randomized Experiments
4.2.1 The Better Your Research Design, the Simpler Your Data-Analysis
4.2.2 Bias and Precision in the Estimation of Experimental Effects
4.3 What to Read Next

5 Challenges in Designing, Implementing, and Learning from Randomized Experiments
5.1 Critical Decisions in the Design of Experiments
5.1.1 Defining the Treatment Being Evaluated
5.1.2 Defining the Population from Which Participants Will Be Sampled
5.1.3 Deciding Which Outcomes to Measure
5.1.4 Deciding How Long to Track Participants
5.2 Threats to Validity of Randomized Experiments
5.2.1 Contamination of the Treatment-Control Contrast
5.2.2 Cross-Overs
5.2.3 Attrition from the Sample
5.2.4 Participation in an Experiment Itself Affects Participants' Behavior
5.3 Gaining Support for Conducting Randomized Experiments: Examples from India
5.3.1 Evaluating an Innovative Input Approach
5.3.2 Evaluating an Innovative Incentive Policy
5.4 What to Read Next

6 Statistical Power and Sample Size
6.1 Statistical Power
6.1.1 Reviewing the Process of Statistical Inference
6.1.2 Defining Statistical Power
6.2 Factors Affecting Statistical Power
6.2.1 The Strengths and Limitations of Parametric Tests
6.2.2 The Benefits of Covariates
6.2.3 The Reliability of the Outcome Measure Matters
6.2.4 The Choice between One-Tailed and Two-Tailed Tests
6.3 What to Read Next

7 Experimental Research When Participants Are Clustered within Intact Groups
7.1 Using the Random-Intercepts Multilevel Model to Estimate Effect Size When Intact Groups of Participants Were Randomized to Experimental Conditions
7.2 Statistical Power When Intact Groups of Participants Were Randomized to Experimental Conditions
7.2.1 Statistical Power of the Cluster-Randomized Design and Intraclass Correlation
7.3 Using Fixed-Effects Multilevel Models to Estimate Effect Size When Intact Groups of Participants Are Randomized to Experimental Conditions
7.3.1 Specifying a "Fixed-Effects" Multilevel Model
7.3.2 Choosing Between Random- and Fixed-Effects Specifications
7.4 What to Read Next

8 Using Natural Experiments to Provide "Arguably Exogenous" Treatment Variability
8.1 Natural- and Investigator-Designed Experiments: Similarities and Differences
8.2 Two Examples of Natural Experiments
8.2.1 The Vietnam Era Draft Lottery
8.2.2 The Impact of an Offer of Financial Aid for College
8.3 Sources of Natural Experiments
8.4 Choosing the Width of the Analytic Window
8.5 Threats to Validity in Natural Experiments with a Discontinuity Design
8.5.1 Accounting for the Relationship between the Forcing Variable and the Outcome in a Discontinuity Design
8.5.2 Actions by Participants Can Undermine Exogenous Assignment to Experimental Conditions in a Natural Experiment with a Discontinuity Design
8.6 What to Read Next

9 Estimating Causal Effects Using a Regression-Discontinuity Approach
9.1 Maimonides' Rule and the Impact of Class Size on Student Achievement
9.1.1 A Simple "First Difference" Analysis
9.1.2 A "Difference-in-Differences" Analysis
9.1.3 A Basic "Regression-Discontinuity" Analysis
9.1.4 Choosing an Appropriate "Window" or "Bandwidth"
9.2 Generalizing the Relationship between Outcome and Forcing Variable
9.2.1 Specification Checks Using Pseudo-Outcomes and Pseudo-Cutoffs
9.2.2 RD Designs and Statistical Power
9.3 Additional Threats to Validity in an RD Design
9.4 What to Read Next

10 Introducing Instrumental Variables Estimation
10.1 Introducing Instrumental Variables Estimation
10.1.1 Investigating the Relationship Between an Outcome and a Potentially Endogenous Question Predictor Using OLS Regression Analysis
10.1.2 Instrumental Variables Estimation
10.2 Two Critical Assumptions That Underpin Instrumental Variables Estimation
10.3 Alternative Ways of Obtaining the IV Estimate
10.3.1 Obtaining an IV Estimate by the Method of Two-Stage Least-Squares
10.3.2 Obtaining an IVE by Simultaneous Equations Estimation
10.4 Extensions of the Basic IVE Approach
10.4.1 Incorporating Exogenous Covariates into IV Estimation
10.4.2 Incorporating Multiple Instruments into the First-Stage Model
10.4.3 Examining the Impact of Interactions between the Endogenous Question Predictor and Exogenous Covariates in the Second-Stage Model
10.4.4 Choosing Appropriate Functional Forms for the Outcome/Predictor Relationships in the First- and Second-Stage Models
10.5 Finding and Defending Instruments
10.5.1 Proximity of Educational Institutions
10.5.2 Institutional Rules and Personal Characteristics
10.5.3 Deviations from Cohort Trends
10.5.4 The Search Continues
10.6 What to Read Next

11 Using IVE to Recover the Treatment Effect in a Quasi-Experiment
11.1 The Notion of a "Quasi-Experiment"
11.2 Using IVE to Estimate the Causal Impact of a Treatment in a Quasi-Experiment
11.3 Further Insight into the IVE (LATE) Estimate, in the Cont…
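The contents above only name the estimation techniques the book covers; as a purely illustrative aside, the following is a minimal Python sketch of the two-stage least-squares (2SLS) idea referenced in section 10.3.1, run on simulated data. The variable names (z, x, y), the simulated coefficients, and the use of numpy are assumptions made for this illustration and are not drawn from the book.

# Minimal sketch of two-stage least-squares (2SLS) with simulated data.
# All names and coefficients here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n = 5_000

# Simulate an endogenous predictor x: it depends on an instrument z and on
# an unobserved confounder u that also affects the outcome y.
z = rng.binomial(1, 0.5, n)                    # instrument (e.g., an exogenous offer)
u = rng.normal(0, 1, n)                        # unobserved confounder
x = 0.6 * z + 0.8 * u + rng.normal(0, 1, n)    # endogenous "treatment"
y = 2.0 * x + 1.5 * u + rng.normal(0, 1, n)    # true causal effect of x is 2.0

def ols(predictor, outcome):
    """Ordinary least-squares coefficients, with an intercept column added."""
    design = np.column_stack([np.ones(len(predictor)), predictor])
    return np.linalg.lstsq(design, outcome, rcond=None)[0]

# Naive OLS of y on x is biased upward because the confounder u is omitted.
print("Naive OLS estimate:", ols(x, y)[1])

# Stage 1: regress the endogenous predictor on the instrument.
a0, a1 = ols(z, x)
x_hat = a0 + a1 * z

# Stage 2: regress the outcome on the fitted (exogenous) part of x.
print("2SLS estimate:     ", ols(x_hat, y)[1])

Running this sketch, the naive OLS slope overstates the effect while the 2SLS slope lands near the true value of 2.0. Note that the standard errors from a hand-rolled second stage are not valid for inference; dedicated IV routines in statistical software handle that correction.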

Title
Methods Matter
Subtitle
Improving Causal Inference in Educational and Social Science Research
EAN
9780199780310
ISBN
978-0-19-978031-0
Format
E-Book (PDF)
Publisher
Publication date
September 15, 2010
Copy protection
Adobe DRM
File size
2.1 MB
Number of pages
416
Year
2010
Language
English