30480 - TOPICS IN POLITICS I
Course taught in English
Class group/s: 31
Familiarity with basic algebra and comfort with basic statistics would be helpful.
Randomized control trials (RCTs), randomized evaluations, rigorous evaluations, and impact assessments are all slightly complicated ways of saying something really quite simple: if we want to know how effective a program is, we need a comparison group. Without a comparison, we are limited in our ability to know what would have happened without the program, and the only way of obtaining an equitable comparison group is through random assignment. Randomized evaluations provide the most credible and reliable way to learn what works and what does not. They use the same methods frequently employed in high-quality medical research and rely on the random assignment of a program or policy to measure its impact on those who received the program compared to those who did not. The mission of the course is to provide students with the knowledge and tools to design, conduct and analyse randomized field experiments (RCTs) in politics in order to evaluate theories, programmes and policies.
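The logic of random assignment can be sketched in a few lines of code: simulate units with two potential outcomes, randomize who receives the program, and estimate the effect as a difference in means between the treated and control groups. This is an illustrative sketch with made-up numbers, not part of the course materials:

```python
import random
import statistics

random.seed(42)

# Hypothetical example: each unit i has two potential outcomes --
# y1 if it receives the program, y0 if it does not.
# Here the program's true effect is +2 for every unit.
n = 1000
y0 = [random.gauss(10, 3) for _ in range(n)]
y1 = [y + 2 for y in y0]

# Random assignment: half of the units, chosen at random, are treated.
treated = set(random.sample(range(n), n // 2))

# We only ever observe one potential outcome per unit.
observed = [y1[i] if i in treated else y0[i] for i in range(n)]

# Because assignment was random, the control group is an equitable
# comparison: the difference in means estimates the program's effect.
treat_mean = statistics.mean(observed[i] for i in range(n) if i in treated)
control_mean = statistics.mean(observed[i] for i in range(n) if i not in treated)
ate_hat = treat_mean - control_mean
print(round(ate_hat, 2))  # close to the true effect of 2
```

Without the random assignment step (e.g. if units could opt in), the two groups would differ systematically and the difference in means would confound the program's effect with pre-existing differences.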
Why Conduct Experiments
Designing and Executing Field Experiments
Social Contact and Prejudice Reduction
Gender and Politics
Define the fundamental principles and practice of RCTs
Describe the critical issues involved in planning, conducting and completing a successful RCT
Understand the basic statistics used to plan and analyse RCTs
Understand the importance of Research Integrity, Transparency, and Reproducibility
Design a Randomized Evaluation
Collect and manage experimental data
Structure the analytic and thinking process for public policy evaluation.
- Face-to-face lectures
- Group assignments
Working in pairs, students will be asked to write a research design essay (3,000 words) on how they would address a causal research question of their choice in Political Science using a randomized field experiment. The research design should include a short literature review, hypotheses, research design, and pre-analysis plan. Each pair will also give one 5-minute presentation of a preliminary draft, which will serve as a checkpoint for obtaining early feedback.
Assessment is based on continuous assessment, partial exams, and a general exam.
For continuous assessment, students will have to submit a group assignment and present it.
The grading criteria for the assignment will be the following:
- Research question, theory (brief literature review) and hypotheses (25%, approx. 500 words)
- Design (50%, approx. 1000 words)
  - Description of treatment(s)
  - Type of random assignment
  - Excludability and non-interference
- Analysis strategy (25%, approx. 500 words)
  - Estimator (in PO notation) / regression equations
  - Hypothesis testing (one- or two-tailed?)
  - Any adjustments for clusters? Covariates?
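As a reference point for the analysis-strategy section, the difference-in-means estimator in potential-outcomes (PO) notation and its regression equivalent can be written as follows (standard notation, not tied to any specific course reading):

```latex
% Observed outcome under the potential-outcomes (PO) framework:
% unit i reveals Y_i(1) if treated (D_i = 1), Y_i(0) otherwise.
Y_i = D_i\,Y_i(1) + (1 - D_i)\,Y_i(0)

% Difference-in-means estimator of the average treatment effect,
% with N_T treated and N_C control units:
\widehat{\tau} = \frac{1}{N_T}\sum_{i:\,D_i=1} Y_i \;-\; \frac{1}{N_C}\sum_{i:\,D_i=0} Y_i

% Equivalent regression equation (OLS of Y on the treatment dummy):
Y_i = \alpha + \tau D_i + \varepsilon_i
```

Under random assignment, the OLS coefficient on the treatment dummy equals the difference in means, which is why the two formulations are interchangeable in a pre-analysis plan.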
For the general exam, students’ assessment is based on the written exam (100%).
The written exam consists of a series of multiple-choice questions designed to assess students’ ability to apply the fundamental principles and practice of RCTs illustrated during the course, to describe the critical issues involved in planning, conducting and completing a successful RCT, and to test their understanding of the basic statistics used to plan and run randomized control trials.
Gerber, Alan and Donald P. Green. 2012. Field Experiments: Design, Analysis, and Interpretation. New York: W.W. Norton.
Glennerster, Rachel and Kudzai Takavarasha. 2013. Running Randomized Evaluations: A Practical Guide. Princeton, NJ: Princeton University Press.
John, Peter. 2017. Field Experiments in Political Science and Public Policy: Practical Lessons in Design and Delivery, London: Routledge.
Karlan, Dean and Jacob Appel. 2016. Failing in the Field. Princeton, NJ: Princeton University Press.
Journal articles and other reading list texts, as assigned.