IMPACT-L Archives

Moderated conference on impact assessment of agricultural research: May 2014

Impact-L@LISTSERV.FAO.ORG

Date: Thu, 29 May 2014 13:57:44 +0200
This is Matthieu Stigler again (Message 15).

I would like to react to two issues: attribution vs. contribution (Message 84, by Hailemichael Taye), and the claim that impact evaluation requires quantifying an impact and hence can only be done through quantitative methods (Message 70, by Daniel Suryadarma).

I am somewhat skeptical of Daniel Suryadarma's point (Message 86) that quantitative impact evaluation (IE) methods do investigate intermediary steps in a pathway, and I would welcome references to studies that do this. [Daniel wrote: "attribution analyses can, and have been used to, provide an in-depth understanding why observed outcomes have or have not occurred as a result of an intervention. Proper attribution analyses do not merely measure impacts and then provide no explanation as to why they have occurred, but extensively use theory of change and also examine the intermediary steps between an intervention and the final outcomes"...Moderator].

A general point is that the whole quantitative IE methodology is based on the "effects of causes": for a given cause, estimate its counterfactual effect. This is the opposite of the "causes of effects" approach, which seeks the causes of observed effects. In most cases, however, IE looks at the single causal effect of a single cause, without asking whether other variables themselves have a causal interpretation (relegating them to the vague status of "covariates", "controls", etc.). In fact, I am not even sure, for example, that the book Daniel recommended in Message 70 (Gertler et al., 2011) defines what a cause is (beyond the definition of a cause's counterfactual effect).

Indeed, I have seen very few quantitative studies explicitly estimating an impact pathway (note: I define a pathway here as a non-trivial chain with more than one arrow linking output to impact). The only case I am aware of is Thirtle et al. (2003), who investigate a detailed chain from R&D to Productivity to GDP to Inequality to Poverty (to appreciate how complex this pathway is, see a representation of it prepared for the forthcoming IMPRESA guidelines for case studies, currently available at https://dl.dropboxusercontent.com/u/6113358/Permanent/IPA_Thirtle_2003.pdf). Ironically, Thirtle et al.'s (2003) study would nowadays probably be rejected by 99% of scholars doing "causal analysis"; in fact, the tools that allow us to do quantitative analysis of a pathway, structural/simultaneous equation models (SEM), are nowadays considered a thing of the past and are very rarely mentioned in "impact evaluation handbooks" (I could not find them in Gertler et al., 2011). But again, this may just be my ignorance of this domain, and I will be most happy to see methods and cases analysing pathways in a "causal manner".
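To make concrete what quantitative estimation of a pathway could look like, here is a minimal sketch (in Python, with entirely invented data, variable names and coefficient values) of estimating a recursive chain link by link, in the spirit of a triangular SEM. This is only an illustration of the idea of chaining more than one arrow; it is not taken from Thirtle et al. (2003) or any other study:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Hypothetical recursive chain: R&D -> productivity -> poverty.
rd = rng.normal(size=n)                       # research spending (standardized)
prod = 0.8 * rd + rng.normal(size=n)          # productivity responds to R&D
poverty = -0.5 * prod + rng.normal(size=n)    # poverty responds to productivity

def ols_slope(y, x):
    """Slope of y on x (with intercept) via least squares."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]

b1 = ols_slope(prod, rd)        # link 1: R&D -> productivity
b2 = ols_slope(poverty, prod)   # link 2: productivity -> poverty

# In a recursive (triangular) system with these assumptions, the effect of
# R&D on poverty along the pathway is the product of the link coefficients.
print(b1, b2, b1 * b2)
```

The point of estimating the links separately, rather than regressing poverty directly on R&D, is that the intermediary step (productivity) becomes an explicit, testable part of the analysis.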

Finally, I note that Daniel in Message 70 associates impact evaluation with quantification of effect, and hence links it exclusively to quantitative methods. [Daniel wrote: "Since the main purpose of an impact evaluation is to measure effect size, quantitative methods must be used. Without quantitative estimates, there is no impact evaluation"...Moderator]. Interestingly, the ability of quantitative methods to produce "single number estimates" has been questioned recently, with several scholars criticizing the "incredible certitude" and the very strong assumptions needed to obtain such estimates (see Manski, 2011; Manski and Pepper, 2000). When relying on more realistic assumptions, they found that all quantitative methods can deliver are intervals of numbers, or bounds such as maxima and minima, rather than "single number estimates" (technically, interval identification as opposed to point identification). In some cases, minimal assumptions allow us to infer only the sign of an (impact) estimate, not its magnitude (see Machado et al., 2013).
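For readers unfamiliar with interval identification, here is a minimal sketch (in Python, with invented data) of worst-case bounds in the spirit of Manski: when an outcome is known to lie in [0, 1] and we make no assumptions about the unobserved counterfactuals, the data bound the average treatment effect to an interval of width one rather than pinning down a single number:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000

# Hypothetical bounded outcome in [0, 1]; roughly half the units are treated.
d = rng.integers(0, 2, size=n)          # observed treatment indicator
y = np.clip(0.4 + 0.2 * d + 0.2 * rng.normal(size=n), 0, 1)

p = d.mean()                 # share treated
ey1_obs = y[d == 1].mean()   # E[Y | D=1], observed
ey0_obs = y[d == 0].mean()   # E[Y | D=0], observed

# Worst-case bounds: each unobserved counterfactual mean can be anywhere
# in [0, 1], so E[Y(1)] and E[Y(0)] are only partially identified.
ey1_lo, ey1_hi = ey1_obs * p, ey1_obs * p + (1 - p)
ey0_lo, ey0_hi = ey0_obs * (1 - p), ey0_obs * (1 - p) + p

ate_lo = ey1_lo - ey0_hi
ate_hi = ey1_hi - ey0_lo
print(f"ATE is only bounded: [{ate_lo:.2f}, {ate_hi:.2f}]")
```

Note that the interval always has width equal to the outcome's range: without further assumptions the data alone cannot deliver a single number, which is exactly the point about "incredible certitude".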

This last point is very interesting, as it links to the previous discussion on attribution vs. contribution. I must first say that I am a little confused about what "contribution" means, for it seems to be at once a definition of a type of causality (a contributory cause) and a method to infer that type of causality. It is indeed not very clear to me how "contribution" differs from "partial attribution" (see White [2010] for a similar point), or whether contribution is about effects of causes (as the definition suggests) or causes of effects (as the method suggests). My (very) personal interpretation is that contribution can be seen as an "unquantified partial attribution", where one seeks only to determine whether the effect is positive or null (or negative, depending on the context). In that sense, contribution analysis gives the same insights as the "serious" (as opposed to "rigorous") quantitative methods discussed above, with the possible additional advantage that it also claims to uncover the causes of the effects.

To conclude, the whole question of whether epIA should estimate impacts (i.e. whether it should focus on effects of causes, or also on causes of effects) depends in my opinion on the goal of the impact evaluation itself. If epIA is purely for accountability purposes, quantitative methods giving a precise estimate with less analysis of the mechanisms might be the most suited for donors or newspapers. On the other hand, if the epIA is to serve policy questions such as replicating or scaling up, qualitative methods with less precise estimates but better insights into the mechanisms might be preferred (not to mention that, as far as I know, almost no quantitative method is informative about the scaling-up-relevant "treatment effect on the untreated": zero-blinded randomized controlled trials (RCT), instrumental variables (IV) and difference-in-differences (DiD) are not; regression discontinuity design (RDD) only very marginally, by definition; matching would actually be the only such method).

Matthieu Stigler
Institut de recherche de l'agriculture biologique (FiBL)
Ackerstrasse 113, 
Case postale 219
5070 Frick,
Switzerland
www.fibl.org
e-mail: matthieu.stigler (at) gmail.com

References:

- Machado, C., A.M. Shaikh and E.J. Vytlacil (2013) Instrumental variables and the sign of the average treatment effect. Working paper. http://home.uchicago.edu/amshaikh/webfiles/sign.pdf

- Manski, C.F. and J.V. Pepper (2000) Monotone instrumental variables: with an application to the returns to schooling. Econometrica 68: 997-1010. http://faculty.smu.edu/millimet/classes/eco7377/papers/manski%20pepper%2000.pdf

- Manski, C.F. (2011) Policy analysis with incredible certitude. Economic Journal 121: F261-F289. http://www.nber.org/papers/w16207

- Thirtle, C., L. Lin and J. Piesse (2003) The impact of research-led agricultural productivity growth on poverty reduction in Africa, Asia and Latin America. World Development 31: 1959-1975. http://impact.cgiar.org/pdf/158.pdf

- White, H. (2010) A contribution to current debates in impact evaluation. Evaluation 16: 153-164.

[To contribute to this conference, send your message to [log in to unmask] The last day for sending messages to the conference is 1 June. The searchable message archive is at https://listserv.fao.org/cgi-bin/wa?A0=Impact-L ].
