Evaluation and monitoring of transdisciplinary collaborations

Abstract

In this paper we focus on the governance, in particular the evaluation and monitoring, of the growing number of transdisciplinary collaborations (TDCs). In such TDCs, researchers and a variety of stakeholders collaborate to address societal challenges, such as renewable energy, healthy aging or better language teaching in schools. Commonly used practices for the evaluation of scientific research, such as accountability, rankings and benchmarking dedicated to scientific excellence, do not fit the goals of TDCs. A bottom-up, stakeholder-oriented approach is better suited: one that stimulates mutual learning as well as the development of socially robust knowledge. We introduce participatory impact pathways analysis (PIPA), a method that suits these requirements and that was developed in the context of development research. Two crucial features are the involvement of stakeholders from the start and the joint development of a theory of change, which narrates what one wants to achieve and how. From this theory of change, stakeholders construct a logical framework that serves as a source of indicators. These indicators enable monitoring ex durante, while the TDC is under way. We present empirical evidence of the use of PIPA for a TDC, from which a number of issues regarding evaluation, monitoring and indicators can be identified that require attention. Most prominent is the changed function of indicators: instead of looking back at past performance, indicators look forward to the short, intermediate and more distant future.


References (44)

  1. Benedictus, R., & Miedema, F. (2016). Redefine excellence: Fix incentives to fix science. Nature, 538, 453-455.
  2. Blankesteijn, M., Munnichs, G. M., & van Drooge, L. (2014). Contested science: Public controversies about science and policy. The Hague: Rathenau Instituut.
  3. Blume, S. S., & Spaapen, J. B. (1988). External assessment and 'conditional financing' of research in Dutch universities. Minerva, 26(1), 1-30.
  4. Bush, V. (1945). Science, the endless frontier: A report to the President. Washington, DC: United States Government Printing Office.
  5. Clark, W. (2006). Academic charisma and the origins of the research university. Chicago: University of Chicago Press.
  6. de Jong, S., Smit, J., & van Drooge, L. (2016). Scientists' response to societal impact policies: A policy paradox. Science and Public Policy, 43, 102-114.
  7. Donovan, C., & Hanney, S. (2011). The 'payback framework' explained. Research Evaluation, 20(3), 181-183.
  8. Douthwaite, B., Alvarez, B. S., Cook, S., Davies, R., George, P., Howell, J., et al. (2007a). Participatory impact pathways analysis: A practical application of program theory in research-for-development. Canadian Journal of Program Evaluation, 22(2), 127-159.
  9. Douthwaite, B., Alvarez, S., Keatinge, J., Mackay, R., Thiele, G., & Watts, J. (2009). Participatory impact pathways analysis (PIPA) and research priority assessment. In D. A. Raitzer & G. W. Norton (Eds.), Prioritising agricultural research for development (pp. 8-24). New York: CABI.
  10. Douthwaite, B., Schulz, S., Adetunji, S. O., & Ellis-Jones, J. (2007b). Impact pathway evaluation of an integrated Striga hermonthica control project in Northern Nigeria. Agricultural Systems, 92, 201-222.
  11. Edler, J., & Kuhlmann, S. (2008). Coordination within fragmentation: Governance in knowledge policy in the German federal system. Science and Public Policy, 35(4), 265-276.
  12. Ely, A., & Oxley, N. (2014). STEPS centre research: Our approach to impact. STEPS Working Paper 60. Brighton: STEPS Centre.
  13. Expert Group on Policy Indicators for RRI. (2015). Indicators for promoting and monitoring responsible research and innovation: Report from the expert group on policy indicators for RRI (EUR 26866 EN). Brussels: European Commission, DG R&I.
  14. Freeman, C., & Soete, L. (1997). The economics of industrial innovation (3rd ed.). Cambridge, MA: MIT Press.
  15. Gibbons, M., Limoges, C., Nowotny, H., Schwartzman, S., Scott, P., & Trow, M. (1994). The new production of knowledge: The dynamics of science and research in contemporary societies. London: Sage.
  16. Gray, D. O. (2011). Cross-sector research collaboration in the USA: A national innovation systems perspective. Science and Public Policy, 38(2), 123-133.
  17. Guba, E. G., & Lincoln, Y. S. (1989). Fourth generation evaluation. Newbury Park: Sage.
  18. Hemlin, S., & Rasmussen, S. (2006). The shift in academic quality control. Science, Technology and Human Values, 31(2), 173-198.
  19. Hessels, L. K. (2010). Science and the struggle for relevance (Ph.D. thesis). Utrecht: Utrecht University.
  20. Hessels, L. K., van Lente, H., & Smits, R. (2009). In search of relevance: The changing contract between science and society. Science and Public Policy, 36(5), 387-401.
  21. Hoppe, R. (2010). The governance of problems: Puzzling, powering and participation. Bristol: Policy Press.
  22. Joly, P. B., Matt, M., Gaunand, A., Colinet, L., Larédo, P., & Lemarié, S. (2015). ASIRPA: A comprehensive theory-based approach to assess societal impacts of a research organization. Research Evaluation. doi:10.1093/reseval/rvv015.
  23. Kloet, R. R., et al. (2013). Understanding constraints in the dynamics of a research program intended as a niche innovation. Science and Public Policy, 40(2), 206-218.
  24. Kuhlmann, S. (2003). Evaluation as a source of 'strategic intelligence'. In P. Shapira & S. Kuhlmann (Eds.), Learning from science and technology policy evaluation: Experiences from the United States and Europe. Cheltenham: E. Elgar Publishing.
  25. Kuhlmann, S., & Rip, A. (2014). The challenge of addressing grand challenges: A think piece on how innovation can be driven towards the 'Grand Challenges' as defined under the prospective European Union Framework Programme Horizon 2020. http://ec.europa.eu/research/innovation-union/pdf/expert-groups/The_challenge_of_addressing_Grand_Challenges.pdf.
  26. Lam, A. (2011). What motivates academic scientists to engage in research commercialization: 'Gold', 'ribbon' or 'puzzle'? Research Policy, 40, 1354-1368.
  27. Nelson, R. (1977). The moon and the ghetto: An essay on policy analysis. New York: W. W. Norton.
  28. Nowotny, H., Scott, P., & Gibbons, M. T. (2001). Re-thinking science: Knowledge and the public in an age of uncertainty. London: Wiley.
  29. OECD. (2012). Meeting global challenges through better governance: International co-operation in science, technology and innovation. Paris: OECD Publishing.
  30. Patton, M. Q. (1997). Utilization-focused evaluation: The new century text. Thousand Oaks: Sage.
  31. Pielke, R. A. (2010). The climate fix: What scientists and politicians won't tell you about global warming. New York: Basic Books.
  32. Rip, A. (2004). Strategic research, post-modern universities and research training. Higher Education Policy, 17(2), 153-166.
  33. Rogers, P. (2014). Theory of change. Methodological briefs: Impact evaluation 2. Florence: UNICEF Office of Research.
  34. Sarewitz, D. (2016). Saving science. The New Atlantis, 49, 5-40.
  35. Schot, J., & Steinmueller, W. E. (2016). Framing innovation policy for transformative change: Innovation policy 3.0. Brighton: University of Sussex.
  36. Scriven, M. (1991). Evaluation thesaurus (4th ed.). Newbury Park, CA: Sage.
  37. Scriven, M. (1996). The theory behind practical evaluation. Evaluation, 2(4), 393-404.
  38. Spaapen, J. B., & van Drooge, L. (2011). Introducing 'productive interactions' in social impact assessment. Research Evaluation, 20(3), 211-218.
  39. Stirling, A. (2006). From science and society to science in society: Towards a framework for 'co-operative research'. Luxembourg: Office for Official Publications of the European Communities.
  40. van den Akker, W., & Spaapen, J. B. (2017). Productive interactions: Societal impact of academic research in the knowledge society. LERU Position Paper. Leuven: LERU.
  41. van der Veen, G. (2005). Evaluation leading technological institutes: Final report. Amsterdam: Technopolis Group.
  42. Whitley, R., & Gläser, J. (Eds.). (2007). The changing governance of the sciences. Dordrecht: Springer.
  43. Wissema, J. G. (2009). Towards the third generation university: Managing the university in transition. Cheltenham: Edward Elgar.
  44. Worthen, B. R., Sanders, J. R., & Fitzpatrick, J. L. (1997). Program evaluation: Alternative approaches and practical guidelines (2nd ed.). White Plains, NY: Longman.