Toward predicting research proposal success

Abstract

Citation analysis and discourse analysis of 369 R01 NIH proposals are used to discover possible predictors of proposal success. We focused on two issues: the Matthew effect in science—Merton’s claim that eminent scientists have an inherent advantage in the competition for funds—and quality of writing or clarity. Our results suggest that a clearly articulated proposal is more likely to be funded than a proposal with lower quality of discourse. We also find that proposal success is correlated with a high level of topical overlap between the proposal references and the applicant’s prior publications. Implications associated with the analysis of proposal data are discussed.
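
The abstract does not describe how topical overlap between the proposal references and the applicant's prior publications was measured. As a minimal illustrative sketch only, assuming overlap is expressed as a Jaccard index over topic labels assigned to the two document sets (the function and the topic sets below are hypothetical, not the authors' method):

    # Minimal sketch, not the authors' method: topical overlap as a Jaccard index
    # over assumed topic labels for the proposal references and prior publications.

    def topical_overlap(proposal_topics: set[str], prior_pub_topics: set[str]) -> float:
        """Share of topics common to the proposal's references and the
        applicant's prior publications (0.0 = disjoint, 1.0 = identical)."""
        if not proposal_topics or not prior_pub_topics:
            return 0.0
        return len(proposal_topics & prior_pub_topics) / len(proposal_topics | prior_pub_topics)

    # Hypothetical example: a proposal citing work close to the applicant's own record.
    proposal = {"immunology", "vaccine design", "mouse models"}
    prior = {"immunology", "vaccine design", "epidemiology"}
    print(f"topical overlap = {topical_overlap(proposal, prior):.2f}")  # prints 0.50

Under this reading, a higher score corresponds to the high level of topical overlap that the abstract associates with funded proposals.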

Notes

  1. The average time required to prepare a single proposal ranges from 170 h (Von Hippel and Von Hippel 2015) to 270 h (Herbert et al. 2013) of researcher or investigator time. This equates to roughly $8,500 to $13,400 USD in salary costs alone, or about $50 per hour (see the sketch after these notes). When administrative overhead rates are included, actual costs can average around $20,000 per proposal.

  2. Professor Swales’ two most highly cited works are entitled ‘Genre analysis: English in academic and research settings’ (cited nearly 12,000 times in Google Scholar) and ‘Research genres: Explorations and applications’ (cited over 2000 times). As an emeritus professor of Linguistics at the University of Michigan, he remains very active in the field.

  3. Personal communication with Dr. Sarewitz on October 6th, 2017.
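
As a back-of-the-envelope check of the figures in note 1 (the hourly rate below is an assumption inferred from the quoted totals, roughly $50 per investigator hour, not a figure reported in the cited studies):

    # Sketch only: reproduces the salary-cost range in note 1 under an assumed hourly rate.
    HOURLY_RATE_USD = 50                # assumed fully loaded investigator cost per hour
    hours_low, hours_high = 170, 270    # Von Hippel and Von Hippel (2015); Herbert et al. (2013)

    salary_low = hours_low * HOURLY_RATE_USD    # 8,500 USD
    salary_high = hours_high * HOURLY_RATE_USD  # 13,500 USD; note 1 cites 13,400, implying a rate just under $50
    print(f"salary cost per proposal: ${salary_low:,} to ${salary_high:,}")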

References

  • Biddle, C., & Aker, J. (1996). How does the peer review process influence AANA Journal article readability? Journal of the American Association of Nurse Anesthetists, 64(1), 65–68.

  • Bornmann, L., & Daniel, H.-D. (2005). Selection of research fellowship recipients by committee peer review. Reliability, fairness and predictive validity of Board of Trustees decisions. Scientometrics, 63(2), 297–320.

  • Bornmann, L., & Daniel, H.-D. (2006). Selecting scientific excellence through committee peer review—A citation analysis of publications previously published to approval or rejection of post-doctoral research fellowship applicants. Scientometrics, 68(3), 427–440.

  • Bornmann, L., Leydesdorff, L., & van den Besselaar, P. (2010). A meta-evaluation of scientific research proposals: Different ways of comparing rejected to awarded applications. Journal of Informetrics, 4, 211–220.

  • Bornmann, L., Wallon, G., & Ledin, A. (2008). Does the committee peer review select the best applicants for funding? An investigation of the selection process for two European Molecular Biology Organization programmes. PLoS ONE, 3(10), e3480.

  • Cabezas-Clavijo, A., Robinson-Garcia, N., Escabias, M., & Jimenez-Contreras, E. (2013). Reviewers’ ratings and bibliometric indicators: Hand in hand when assessing over research proposals? PLoS ONE, 8(6), e68258.

  • Cole, S., Cole, J. R., & Simon, G. A. (1981). Chance and consensus in peer review. Science, 214, 881–886.

  • Cole, S., Rubin, L., & Cole, J. R. (1978). Peer review in the national science foundation: Phase one of a study. Washington, DC: The National Academies Press. https://doi.org/10.17226/20041.

  • Enger, S. G., & Castellacci, F. (2016). Who gets Horizon 2020 research grants? Propensity to apply and probability to succeed in a two-step analysis. Scientometrics, 109, 1611–1638.

  • Fang, F. C., Bowen, A., & Casadevall, A. (2016). NIH peer review percentile scores are poorly predictive of grant productivity. eLife, 5, e13323.

  • Gallo, S. G., Carpenter, A. S., Irwin, D., McPartland, C. D., Travis, J., Reynders, S., et al. (2014). The validation of peer review through research impact measures and the implications for funding strategies. PLoS ONE, 9(9), e106474.

  • Garfield, E. (1955). Citation indexes for science: A new dimension in documentation through association of ideas. Science, 122, 108–111.

  • Graves, N., Barnett, A. G., & Clarke, P. (2011). Funding grant proposals for scientific research: Retrospective analysis of scores by members of grant review panel. British Medical Journal, 343, d4797.

  • Herbert, D. L., Barnett, A. G., Clarke, P., & Graves, N. (2013). On the time spent preparing grant proposals: An observational study of Australian researchers. British Medical Journal Open, 3, e002800.

  • Hörlesberger, M., Roche, I., Besagni, D., Scherngell, T., Francois, C., Cuxac, P., et al. (2013). A concept for inferring ‘frontier research’ in grant proposals. Scientometrics, 97, 129–148.

  • Hornbostel, S., Böhmer, S., Klingsporn, B., Neufeld, J., & Von Ins, M. (2009). Funding of young scientist and scientific excellence. Scientometrics, 79(1), 171–190.

  • Jacob, B. A., & Lefgren, L. (2011). The impact of research grant funding on scientific productivity. Journal of Public Economics, 95(9), 1168–1177.

  • Johnson, V. E. (2008). Statistical analysis of the National Institutes of Health peer review system. Proceedings of the National Academy of Sciences of the USA, 105, 11076–11080.

  • Klavans, R., & Boyack, K. W. (2017). Research portfolio analysis and topic prominence. Journal of Informetrics, 11, 1158–1174.

  • Li, D., & Agha, L. (2015). Big names or big ideas: Do peer-review panels select the best science proposals? Science, 348, 434–438.

  • Lindner, M. D., & Nakamura, R. K. (2015). Examining the predictive validity of NIH peer review scores. PLoS ONE, 10(6), e0126938.

  • Melin, G., & Danell, R. (2006). The top eight percent: Development of approved and rejected applicants for a prestigious grant in Sweden. Science and Public Policy, 33(10), 702–712.

  • Merton, R. K. (1968). The Matthew effect in science. Science, 159(3810), 56–63.

  • Mintzberg, H., & Waters, J. A. (1985). Of strategies, deliberate and emergent. Strategic Management Journal, 6, 257–272.

  • Mutz, R., Bornmann, L., & Daniel, H.-D. (2015). Testing for the fairness and predictive validity of funding decisions: A multilevel multiple imputation for missing data approach using ex-ante and ex-post evaluation data from the Austrian Science Fund. Journal of the Association for Information Science and Technology, 66(11), 2321–2339.

  • Neufeld, J., & Hornbostel, S. (2012). Funding programmes for young scientists—Do the ‘best’ apply? Research Evaluation, 21, 270–279.

  • Neufeld, J., Huber, N., & Wegner, A. (2013). Peer review-based selection decisions in individual research funding, applicants’ publication strategies and performance: The case of ERC Starting Grants. Research Evaluation, 22, 237–247.

  • Nicholson, J. M., & Ioannidis, J. P. A. (2012). Conform and be funded. Nature, 492(7427), 34–36.

  • Reinhart, M. (2009). Peer review of grant applications in biology and medicine: Reliability, fairness and validity. Scientometrics, 81(3), 789–809.

  • Roberts, J. C., Fletcher, R. H., & Fletcher, S. W. (1994). Effects of peer review and editing on the readability of articles published in Annals of Internal Medicine. Journal of the American Medical Association, 272(2), 119–121.

  • Sarewitz, D., & Pielke, R. A., Jr. (2007). The neglected heart of science policy: Reconciling supply of and demand for science. Environmental Science & Policy, 10, 5–16.

  • Saygitov, R. T. (2014). The impact of funding through the RF President’s Grants for Young Scientists (the field—Medicine) on research productivity: A quasi-experimental study and a brief systematic review. PLoS ONE, 9(1), e86969.

  • Swales, J. (1986). Citation analysis and discourse analysis. Applied Linguistics, 7(1), 39–56.

  • Teufel, S. (2010). The structure of scientific articles: Applications to citation indexing and summarization. Stanford, CA: CSLI Publications.

  • Teufel, S., Siddharthan, A., & Batchelor, C. (2009). Towards discipline-independent argumentative zoning: Evidence from chemistry and computational linguistics. In Proceedings of the 2009 conference on empirical methods in natural language processing (pp. 1493–1502). Singapore.

  • Van den Besselaar, P., & Leydesdorff, L. (2009). Past performance, peer review and project selection: A case study in the social and behavioral sciences. Research Evaluation, 18(4), 273–288.

  • Van den Besselaar, P., & Sandström, U. (2015). Early career grants, performance, and careers: A study on predictive validity of grant decisions. Journal of Informetrics, 9, 826–838.

  • Van den Besselaar, P., & Sandström, U. (2017). Influence of cognitive distance on grant decisions. In Science, technology and innovation indicators 2017. Paris, France.

  • Van Leeuwen, T. N., & Moed, H. (2012). Funding decisions, peer review, and scientific excellence in physical sciences, chemistry, and geosciences. Research Evaluation, 21, 189–198.

  • Viner, N., Powell, P., & Green, R. (2004). Institutionalized biases in the award of research grants: A preliminary analysis revisiting the principle of accumulative advantage. Research Policy, 33(3), 443–454.

  • Von Hippel, T., & Von Hippel, C. (2015). To apply or not to apply: A survey analysis of grant writing costs and benefits. PLoS ONE, 10(3), e0118494.

  • Zuckerman, H. (1967). Nobel laureates in science: Patterns of productivity, collaboration, and authorship. American Sociological Review, 32(3), 391–403.

Author information

Corresponding author

Correspondence to Kevin W. Boyack.

About this article

Cite this article

Boyack, K.W., Smith, C. & Klavans, R. Toward predicting research proposal success. Scientometrics 114, 449–461 (2018). https://doi.org/10.1007/s11192-017-2609-2
