Stop wasting researchers’ time

“It might be high time for Statistics Canada to start collecting data and measure the thousands of hours wasted by Canada’s researchers in filling forms and preparing proposals that lead to nowhere.” That’s what I wrote many posts ago in “The downhill race between NSERC and CIHR”, after learning that CIHR’s success rate for this year’s individual Operating Grants competition was down to 15%, and about the petition that ensued. But that turned out to be only the tip of the iceberg.

I have also written about the sharp decline in the number of scholarships and fellowships awarded by NSERC. Of the 1431 applications for postdoctoral fellowships in this year’s competition, only 133 were awarded: a success rate of 9.3%! Keep this in mind the next time you are writing yet another reference letter!

It also turned out that the success rate for NSERC’s 2010-11 Strategic Project Grants competition had dropped to 22%. That was a record low until this year’s rate was announced: a ridiculous 16%. I was told that UBC’s was down to 12%. No wonder our computer scientists and engineers are so unhappy these days!

What about the celebrated Banting fellowships? Well, there were 70 of them available for the 2010-11 competition (roughly 23 for each of NSERC, SSHRC and CIHR). The website says that 658 eligible applications were received. If you think this is bad, wait until you hear the rest of the story about the real cost of this program.

Take a look at the UBC example, a university recently ranked 22nd in the world in one of those university rankings. Only 3 of the 80 scholars who submitted a Banting application at UBC were successful. That’s a dreadful success rate of 3.75%.

The real cost becomes apparent once one realizes that the review process for the Banting goes through three stages, the first of which takes place at the university level.

UBC alone had three committees, one for each council and each comprising 6 to 9 faculty members, to select the top 30 applicants that would go on to the second stage. Besides reviewing and selecting, the panelists have to write a custom-made letter supporting each of the applicants they choose to nominate. And this is only one of four letters going from the institution to Ottawa.

· One about the “Institutional Synergy”.
· The supervisor’s statement.
· Elements for Research Environment.
· Elements for Professional Development.

The whole list of items required for the application is here (Task #8 onwards).

Add to this the cumulative work of the three peer review committees established by the three granting agencies, which picked 105 of the 658 applications that the universities put forward. Then we are off to the last stage of the review process, conducted by yet another “independent interagency Selection Board”, which evaluated each of the 105 applications and made the final recommendations for the 70 available awards.
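
To put this funnel in perspective, here is a minimal back-of-the-envelope sketch (in Python) that uses only the figures quoted in this post: 658 applications forwarded by the universities, 105 retained by the agency committees, 70 awards in the end. It does not even count the university-level stage, where at UBC only 30 of 80 applicants made the cut.

```python
# Banting review funnel, using only the figures quoted in this post.
stages = [
    ("forwarded by the universities", 658),
    ("retained by the agency committees", 105),
    ("recommended for an award", 70),
]

total = stages[0][1]
for (label_in, n_in), (label_out, n_out) in zip(stages, stages[1:]):
    print(f"{label_in} -> {label_out}: {n_out}/{n_in} = {n_out / n_in:.1%}")

print(f"overall: {stages[-1][1]}/{total} = {stages[-1][1] / total:.1%}")
```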

Let me say it again.  It is high time for Statistics Canada to start collecting data and count the thousands of hours spent by Canada’s researchers in filling forms, writing letters, preparing proposals, and evaluating grant applications.

Peer review is absolutely essential for allocating resources for research and for training. But before starting a new program, governments and policy makers need to be aware that peer review is costly, and that success rates should be part of any estimate of that cost. The closer the success rate is to zero, the more astronomical the cost per award becomes.
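
To make that last point concrete, here is a minimal sketch of the relationship. The 40 hours of applicant, referee and committee time per application is purely an assumed, illustrative figure; only the success rates are taken from the numbers quoted above.

```python
# Rough illustration: total hours spent per successful award grow as 1 / success rate.
def hours_per_award(hours_per_application: float, success_rate: float) -> float:
    """Estimated preparation and review hours spent for each award actually made."""
    return hours_per_application / success_rate

HOURS = 40  # assumption for illustration only, not a measured figure
for label, rate in [("CIHR Operating Grants", 0.15),
                    ("NSERC postdoctoral fellowships", 0.093),
                    ("Banting at UBC", 0.0375)]:
    print(f"{label} ({rate:.2%} success): ~{hours_per_award(HOURS, rate):,.0f} hours per award")
```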


5 Responses to Stop wasting researchers’ time

  1. Noel Semple says:

    Amen! These competitions should be decided based on the applicant’s record and proposal. The work that goes into those items is not wasted even if the application is unsuccessful. The same cannot be said of all the other numerous docs required. I spent a whole week this month on Banting application docs which will serve no purpose in the likely event that I am unsuccessful.

  2. melonie fullick says:

    “It might be high time for Statistics Canada to start collecting data and measure the thousands of hours wasted by Canada’s researchers in filling forms and preparing proposals that lead to nowhere.”

    Ugh, I have had this thought so many times! Well-put. Those are some numbers I would LOVE to see. I.e. precisely what such funding yields per hour of work that’s invested in the application process.

  3. Pingback: » Science Evolving Perspectives on Health, Technology, and Society

  4. anon says:

    See “Cost of the NSERC Science Grant Peer Review System exceeds the cost of giving every qualified researcher a baseline grant” by Gordon R and Poulin BJ:

    http://www.ncbi.nlm.nih.gov/pubmed/19247851

    linked from today’s huffpost:

    http://www.huffingtonpost.ca/johannes-wheeldon-phd/research-funding_b_1080238.html

    A provocative piece…

  5. In engineering, I think this has a strong effect on the rise of schools like CornellNYCTech, which seem to dispense with the notion of independent funding and simply seek it directly from industry. E.g. http://www.nytimes.com/2013/01/22/nyregion/cornell-nyc-tech-will-foster-commerce-amid-education.html?pagewanted=all&_r=0
