MPP2019-525

Policy Evaluation

Edmund J. Malesky, Le Viet Phu
Date: 01/09/2018 17:42; File size: 157,500 bytes
Week 1 (of 6)

Monday, 18/06/2018, 13:30 - 15:00
Lecture 1. What is Policy Evaluation?
Importance of evaluation and how it differs from monitoring
  • KKS, pp. 3-20.
  • EGAP, "10 Strategies for Figuring Out if X Caused Y"

Wednesday, 20/06/2018, 13:30 - 15:00
Lecture 2. What is Causal Inference?
Counterfactuals and the potential outcomes framework
  • Angrist and Pischke, Introduction.
  • KKS, pp. 20-30.
  • EGAP, "10 Things You Need to Know about Causal Inference"

Friday, 22/06/2018, 10:15 - 11:45
Lecture 3. Introduction to Randomized Controlled Trials (RCTs)
Basics of field experiments and exploration of different designs.
  • KKS, pp. 33-38.
  • EGAP, "10 Types of Treatment Effects You Should Know About"
  • EGAP, "10 Things to Know about External Validity"

This course offers a first systematic approach to policy evaluation from the perspective of a practitioner. It provides the rationale for why evaluation can be used to inform and improve policy development, adoption, implementation, and effectiveness, and to build the evidence base for policy interventions. We begin with experimental approaches, the gold standard in program evaluation. The main purpose of randomized evaluations is to determine whether a program has an impact and, more specifically, to quantify how large that impact is. Impact evaluations typically measure program effectiveness by comparing the outcomes of those who received the program against those who did not. We will learn a basic set of skills for designing and evaluating policy interventions, and then practice them immediately. The first lecture will be devoted to the goals and organization of program design before we begin our discussion of the experimental ideal. Each subsequent class will delve into a particular research tool used in evaluation for attempting to recover the experimental ideal (randomized controlled trials, survey experiments, regression discontinuity designs, matching estimators, and difference-in-differences). Within each lecture, we will discuss the underlying assumptions, power estimations, and diagnostics for determining whether the tool is appropriate for the particular research question.
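The core idea of comparing outcomes of those who received a program against those who did not can be sketched in a few lines of Python. The data below are simulated (a hypothetical program with a true effect of 2 units); under randomized assignment, a simple difference in means estimates the average treatment effect (ATE):

```python
import random

random.seed(42)

# Simulated evaluation data: a hypothetical program raises the outcome by
# 2 units on average. Assignment is randomized (a coin flip), so a simple
# difference in means between treated and control units estimates the ATE.
n = 10_000
data = []
for _ in range(n):
    treated = random.random() < 0.5          # randomized assignment
    baseline = random.gauss(10.0, 3.0)       # outcome without the program
    outcome = baseline + (2.0 if treated else 0.0)
    data.append((treated, outcome))

treated_y = [y for t, y in data if t]
control_y = [y for t, y in data if not t]

ate_hat = (sum(treated_y) / len(treated_y)
           - sum(control_y) / len(control_y))
print(f"Estimated ATE: {ate_hat:.2f}")  # close to the true effect of 2
```

Randomization is what licenses this comparison: it makes the treated and control groups alike on average in everything except the program itself, so the difference in means is not contaminated by selection.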

The course will take the organizational structure of a workshop. Recognizing the challenges of teaching econometrics without formulas, we have selected a nuanced approach that combines theory, in-class discussion, and computer applications in a narrative-based format. The course assessment is based on identifying a critical policy question that students are interested in and then designing the ideal evaluation for it. The final project will be a Pre-Analysis Plan, a specialized research design that lays out the specifics of how a new policy will be evaluated.
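A Pre-Analysis Plan typically includes a power calculation. As an illustration only (the effect size and standard deviation below are made-up numbers, and the formula is the standard normal-approximation rule of thumb for a two-arm comparison of means):

```python
import math
from statistics import NormalDist


def sample_size_per_arm(effect, sd, alpha=0.05, power=0.8):
    """Minimum units per arm for a two-arm trial comparing means.

    Standard normal-approximation formula:
        n = 2 * (z_{1-alpha/2} + z_{power})^2 * sd^2 / effect^2
    """
    z = NormalDist().inv_cdf  # standard normal quantile function
    n = 2 * (z(1 - alpha / 2) + z(power)) ** 2 * (sd / effect) ** 2
    return math.ceil(n)


# Illustrative: detect a 2-unit effect when outcomes have sd = 3,
# at 5% significance with 80% power.
print(sample_size_per_arm(effect=2.0, sd=3.0))  # 36 units per arm
```

Smaller expected effects or noisier outcomes drive the required sample size up quickly, which is why the power calculation belongs in the design stage, before data collection begins.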