“It might be high time for Statistics Canada to start collecting data and measure the thousands of hours wasted by Canada’s researchers in filling forms and preparing proposals that lead to nowhere.” That’s what I wrote many posts ago in “The downhill race between NSERC and CIHR”, after learning that the CIHR’s success rate for this year’s individual Operating Grants competition was down to 15%, and about the petition that followed. But that turned out to be only the tip of the iceberg.
I have also written about the sharp decline in the number of scholarships and fellowships awarded by NSERC. Out of the 1431 applications in this year’s postdoctoral fellowships competition, only 133 were funded, a success rate of just 9.3%! Keep this in mind the next time you are writing yet another reference letter!
It also turned out that the success rate for NSERC’s 2010-11 Strategic Project Grants competition had dropped to 22%. That was a record low until this year’s rate was announced: a ridiculous 16%. I was told that UBC’s was down to 12%. No wonder our computer scientists and engineers are so unhappy these days!
What about the celebrated Banting fellowships? Well, there were 70 of them available for the 2010/11 competition, split roughly evenly (about 23 each) among NSERC, SSHRC and CIHR. The website says that 658 eligible applications were received, which works out to an overall success rate of about 10.6%. If you think this is bad, wait until you hear the rest of the story about the real cost of this program.
Take the UBC example, a university recently ranked 22nd in the world in one of those university rankings. Only 3 of the 80 scholars who submitted a Banting application at UBC were successful: a dreadful success rate of 3.75%.
The real cost becomes apparent when one realizes that the review process for the Banting goes through three stages, the first of which takes place at the university level.
UBC alone had three committees, one for each council, each comprising 6 to 9 faculty members, to select the top 30 applicants that would go on to the second stage. Besides reviewing and selecting, the panelists have to write a custom-made letter supporting each applicant they choose to nominate forward. And that is only one of four letters going from the institution to Ottawa:
· One about the “Institutional Synergy”.
· The supervisor’s statement.
· One on the “Elements for Research Environment”.
· One on the “Elements for Professional Development”.
The full list of items required for the application is here (Task #8 onwards).
Add to this the cumulative work of the three peer review committees established by the three granting agencies, which picked 105 of the 658 applications the universities put forward. Then we are off to the last stage of the review process, carried out by yet another “independent interagency Selection Board”, which evaluated each of the 105 applications and made the final recommendations for the 70 available awards.
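Putting the numbers quoted above together, a quick back-of-the-envelope calculation shows how the Banting funnel narrows at each stage. The counts are the ones cited in this post; the per-stage percentages are my own arithmetic:

```python
# Success-rate arithmetic for the 2010/11 Banting competition,
# using the figures quoted in this post.
stages = [
    ("eligible applications received", 658),
    ("forwarded by agency peer review", 105),
    ("final awards", 70),
]

overall = stages[-1][1] / stages[0][1]
print(f"overall success rate: {overall:.1%}")  # 10.6%

# Conditional success rate at each successive stage of review.
for (_, n_in), (name, n_out) in zip(stages, stages[1:]):
    print(f"{name}: {n_out}/{n_in} = {n_out / n_in:.1%}")
```

Note that even an applicant who survives the university cut and the agency peer review still faces a final board that rejects one in three dossiers.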
Let me say it again: it is high time for Statistics Canada to start collecting data and counting the thousands of hours spent by Canada’s researchers filling out forms, writing letters, preparing proposals, and evaluating grant applications.
Peer review is absolutely essential for allocating resources for research and for training. But before starting a new program, governments and policy makers need to be aware that peer review is costly, and that success rates should be part of the equation estimating these costs. The closer the success rate is to zero, the more astronomical the cost per award becomes.
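To make that closing point concrete, here is a toy model: if preparing and reviewing each application consumes a roughly fixed number of person-hours, then the hours sunk per successful award scale as the inverse of the success rate. The 40-hour figure below is purely illustrative, not a measured value:

```python
# Toy cost model: hours spent per awarded grant as a function of
# the competition's success rate. The per-application cost is a
# hypothetical figure chosen only for illustration.
HOURS_PER_APPLICATION = 40  # assumed: preparation + review, per application

for success_rate in (0.30, 0.15, 0.05):
    # Total hours across all applicants, divided by the number of awards,
    # simplifies to cost-per-application / success-rate.
    hours_per_award = HOURS_PER_APPLICATION / success_rate
    print(f"{success_rate:.0%} success rate -> {hours_per_award:.0f} hours per award")
```

Halving the success rate doubles the hidden cost of every dollar awarded, which is exactly why success rates belong in any honest accounting of a program.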