Progress in Integrating Big Data into Program Evaluation: Case studies and lessons learned
September 4, 9am-5pm
Hosted at Independent Evaluation Group (IEG) at the World Bank
“I” Building, (Room to be announced)
1850 I St NW, Washington, DC 20006
Join us for a one-day workshop on Big Data and Evaluation!
Does “big data” still seem out of reach for you and your organization? Join us for a one-day workshop where we will “demystify” big data for evaluators and program managers. We’ll show that there is a wide range of tools and techniques that can be incorporated into evaluations, and we’ll aim to advance discussions on how to build bridges between the evaluation and big data communities.
A core part of the workshop will be the presentation and discussion of case studies illustrating big data and data analytics tools and techniques that are already being used (or tested) in program evaluation, including:
- Satellites and drones
- Mobile phones and phone record analysis
- Economical methods for the collection and analysis of evaluation data that permit significant increases in sample size
- Social media analytics
- Predictive analytics
- Artificial intelligence
- Constructing integrated databases
The workshop will conclude with a general discussion of the opportunities and challenges for integrating big data into program evaluation and some of the next steps.
Michael Bamberger has a Ph.D. in Sociology from the London School of Economics. He has 45 years of experience in development evaluation, including 23 years with the World Bank, where he worked on development evaluation, evaluation training for the Asia region, and gender and development. Since retiring from the Bank he has worked as an independent consultant on development evaluation with 10 UN agencies as well as with development banks, bilateral aid agencies, NGOs, and foundations. Michael has advised the Evaluation Office of the Rockefeller Foundation, is a member of the International Evaluation Advisory Panel of the Independent Evaluation Office of UNDP, and is a member of UN Women’s Evaluation Advisory Panel. He is also on the editorial board of a number of leading evaluation journals. Recent publications include RealWorld Evaluation: Working under Budget, Time, Data and Political Constraints (2012); Introduction to Mixed Methods in Impact Evaluation (2012); How to Design and Manage Equity Focused Evaluations (2011); Emerging Opportunities: Monitoring and Evaluation in a Tech-Enabled World (2014); Dealing with Complexity in Development Evaluation (2016); and Integrating Big Data into the Monitoring and Evaluation of Development Programs (2016).
Co-hosted and sponsored by:
Questions? Contact Linda Raftree