In the January 31st, 2013 issue of Nature, Skip Garner published a comment entitled “Research funding: Same work, twice the money?”. In this article, Garner and his collaborators report the results of a large-scale analysis of grant applications submitted to the main US funding agencies. Their approach combines automatic text mining of the proposal abstracts with a manual review of proposals flagged by the software. The authors identified a number of proposals totaling about $70M in potentially duplicate funding.
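The comment does not spell out the details of the authors' pipeline, but the general idea, scoring pairs of proposal abstracts for textual similarity and sending the top-scoring pairs to human reviewers, is easy to illustrate. Below is a minimal Python sketch of that kind of screening; the proposal identifiers, abstracts, and similarity threshold are all hypothetical, and the TF-IDF cosine-similarity approach is an assumption on my part, not the authors' actual software.

```python
# Minimal sketch of abstract-similarity screening (assumed approach, not the
# authors' actual pipeline): score every pair of proposal abstracts with
# TF-IDF cosine similarity and flag the most similar pairs for manual review.
from itertools import combinations

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical abstracts keyed by made-up proposal identifiers.
abstracts = {
    "NIH-001": "We will study protein folding kinetics using single-molecule FRET...",
    "NSF-042": "This project studies protein folding kinetics with single-molecule FRET...",
    "DOD-117": "We propose a new catalyst for low-temperature fuel cells...",
}

ids = list(abstracts)
tfidf = TfidfVectorizer(stop_words="english").fit_transform(abstracts[i] for i in ids)
scores = cosine_similarity(tfidf)

# Flag pairs above an arbitrary threshold; flagged pairs would go to a human reviewer.
THRESHOLD = 0.6
flagged = [
    (ids[a], ids[b], scores[a, b])
    for a, b in combinations(range(len(ids)), 2)
    if scores[a, b] >= THRESHOLD
]

for id_a, id_b, score in sorted(flagged, key=lambda x: -x[2]):
    print(f"{id_a} ~ {id_b}: similarity {score:.2f}")
```

The threshold and the toy abstracts are placeholders; in a real screen one would tune the cutoff and, as the authors describe, hand the flagged pairs to human reviewers rather than treat a high score as proof of duplication.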

The article's position is controversial. At a time of escalating federal budget deficits, taxpayers and Congress are looking for wasted tax dollars. At the same time, the scientific community is struggling with historically low success rates in grant application funding. Funding agencies and scientists do not want to give the impression that they are doing a lousy job of managing limited resources.

Reactions in the blogosphere

A week after the publication, I thought it might be interesting to look at the reactions to this comment. I used my Altmetric subscription to track down relevant postings. Some of this information is available free of charge using the Altmetric bookmarklet. I encourage you to give it a try.

  • Twitter: 48 postings. This is a good indication that the paper got people’s attention, though not a very high number of tweets. The postings are fairly neutral.
  • Facebook: The paper was mentioned on four Facebook pages without any comments.
  • The New York Times published a short story that summarizes the main findings in a very matter-of-fact way.
  • Chemistry World published an article that emphasizes, using quotes from Garner, that the numbers in the Nature paper are probably underestimates.
  • Nature itself published an editorial and a news item related to Garner’s paper that includes a link to a document with statements from three funding agencies (NIH, NSF, and the Department of Defense). These accompanying pieces help put the authors’ findings in perspective.

I found three bloggers commenting on the paper. Kyle Niemeyer summarized the article’s main results in Ars Technica. He did an excellent job presenting the different sides of this story. His post received 44 comments. Ivan Oransky summarized the paper on Retraction Watch, where it received 22 comments. Finally, Hank Campbell wrote a more opinionated blog entry on Science 2.0. The comments on these blog entries stress that it is not uncommon for funding agencies to only partially fund research projects. They also emphasize the responsibilities of program directors in making funding decisions. Finally, several commenters found it reassuring that the amount of potentially duplicated funding was fairly small considering that the study covered 35 years of federal R&D budgets.

Beyond proposal summaries

The paper focuses exclusively on one dimension, the mining of research proposal summaries, to identify potential duplicate funding. It may be useful to consider other aspects of the funding cycle to get a more comprehensive perspective on this complex problem. These include the nature of the funding, the results of the funding, and the people receiving the funding.

Grants vs. contracts: The idea of duplicated funding applies well to contracts, where the government purchases products or services from external sources. In the case of a grant, the government funds an effort that benefits the public. The result of a grant is not a set of deliverables but a report showing that the money allocated by the government was used in a way consistent with the general goals of the grant. Increasingly, grants support scientists rather than projects. The NIH Pioneer and Innovator Awards or the recent recommendations of the CSR Council illustrate this trend. One can debate how much funding a PI should receive, but this is a completely different issue from duplicate funding for the same work.

Proposals vs. publications: Grant proposals outline a research plan that rarely unfolds as described. Analyzing the outcomes of a research investment would provide another, possibly more accurate, source of information on potential duplicate funding. The outcomes of most research projects are reported in scientific publications that disclose the sources of funding used to perform the work. Most publications list more than one source of funding: many authors are supported by several grants, and different coauthors of a paper may be supported by different sources. Many publications result from collaborations between scientists who bring different perspectives to a problem, and it is rare for all of these scientists to be supported by a single source of funding. I have not heard anyone argue that having different sources of funding supporting the work described in a publication is an indication of fiscal waste. It merely reflects the fact that most grants fund only a fraction of a research effort. It is the Principal Investigator’s responsibility to raise funds from different sources in order to get the job done.

Twice the money… why? What would motivate a PI to collect twice the money to do the same work? In a profit-driven environment this could make sense, because charging twice for the same service would have a positive impact on the bottom line. In an academic environment, however, it is not clear who would profit from it. The salaries of the PI and of the people charged to the grant would not be doubled. It is unlikely that the institution receiving the grant or contract would be able to use it for a completely different purpose. I fail to see why anyone working in an academic environment would have any incentive to defraud the government by charging twice for the same service. I do, however, see a number of reasons why many PIs may need several grants to make substantial progress on a research project.

Conclusions

Another possible interpretation of the authors’ findings is that the funding agencies, like any prudent investor, tend to limit their risk by spreading their investments. They fund only part of many research projects. If a project is successful, they can claim credit for supporting it. If a project fails, they limit their exposure and the embarrassment. Without access to financial information, it seems difficult to claim that similarities between summaries of research proposals imply that the same project has been funded twice.

That being said, the development of text-mining technologies to uncover similarities between research projects is a worthy goal. Such an analysis clearly needs to extend beyond proposal summaries: full proposals could be analyzed, as well as project reports. Each funding agency could use such a tool to manage its own portfolio. This could help track proposals that are submitted to two different programs within the same agency, something most agencies forbid. It could also be used to track similarities between proposals submitted by different PIs, which may simply reflect the current evolution of a discipline and the convergence of scientific leaders toward similar projects. In addition to helping funding agencies track redundancies in their portfolios, such a tool could prove useful for identifying specialties that have been underfunded.

At the federal level, the analysis of a central database of research projects could highlight how investments from different agencies complement each other. The general public could use such a portal to gain a broader perspective on which agency is funding what. To take the example of synthetic biology, I would love to better understand what has been funded in this field, and by which agency, much as I can browse related books on Amazon.com.

What do you think?

As always, we’re curious to know what you think. Have you ever witnessed duplicated funding of a research project? Do you think it is possible? Why?