UEval: Bringing Community-Based Experiential Learning to the Evaluation Classroom
Related papers
A. Erasmus and …, 2003
This paper describes a participative project which sought to build capacity for evaluation across the community, voluntary and statutory health sectors in three authorities within Greater Manchester, UK. A systems methodology was adopted, which emphasised the inclusion of marginalised groups, multiple-stakeholder involvement and participative methods. Five action research stages were identified. A needs analysis, in which 55 community-based projects participated, was followed by nine multi-agency learning events, each of which explored, in participatory ways, issues identified by the projects. Three further multi-agency workshops explored new ways of working together in order to practise evaluation for the benefit of the local community. A follow-up survey examined how the project had led to longer-term outcomes for projects, and a number of feasible ideas for mutual support for evaluation practice are currently being explored. Overall, some 150 groups from across the three authorities, all with an interest in improving community health (broadly defined), participated in the project. This paper reports the processes and outcomes of the needs audit and the first, foundation learning event, both of which shaped the rest of the project. These stages contributed to joint understanding of what local groups find difficult in terms of prioritising, undertaking and using evaluations, and what contributes to the development of evaluation capability; joint understanding of how constructive inter-agency approaches to evaluation might be developed; and joint assessment of the utility of different models of systemic evaluation for groups with different stakes in their communities.
The EVAL framework: Developing impact evaluation scholars
Advancements in Agricultural Development
The complexities of food, agriculture, natural resources, and human sciences (FANH) programs and projects require faculty to write and secure funding in addition to mastering evaluation competencies that integrate abilities in quantitative and qualitative research methods with evaluation theory and practice. The EVAL Framework was developed to advance skill development among FANH graduates to include these competencies and to increase the pipeline of students who have the essential skills needed to advance FANH initiatives and priorities. The EVAL Framework includes four primary constructs: (a) Evaluation, (b) Value, (c) Active and Experiential Learning, and (d) Leadership. The purpose of EVAL is to build relationships with untapped FANH fields to develop a pipeline for graduates to become evaluation leaders for advancing food and agricultural sciences. This experiential learning and development model focuses on foundational and enrichment experiences, through formal co...
Do Workshops Work for Building Evaluation Capacity Among Cooperative Extension Service Faculty?
2000
A case study used survey design (pre-test, satisfaction, and post-test) to determine if a 1-day workshop affected participants' skills and self-efficacy in regard to conducting evaluation and if workshop participants applied evaluation skills afterwards. Findings indicate that the workshop was effective in building self-efficacy; however, it did not sustain evaluation practice. Formal training may be necessary to develop skills such
Who’s asking? An alternative methodology for engaging students in evaluation exercises
2019
This paper explores the application of a ‘students as partners’ approach within a project undertaking an evaluation of students’ learning experiences with technology within one institution. The full outcomes of this study are written up in a separate paper; discussed here are the practicalities and outcomes of adopting this method of student engagement to undertake an evaluative exercise. In presenting our study, we explore the issues surrounding student engagement in evaluating their own learning experiences in higher education. We examine the shortcomings of conventional forms of evaluation, and how these prompted us to seek alternative methods of investigation. We detail the method undertaken in this study of adopting students as co-researchers. The paper concludes with a discussion of the resulting data, and comments upon the depth and representativeness achieved using this method. The specific context of the inquiry was the digital learning of students, how ...
Facilitating participatory evaluation as a learning process
European Farming and Society in Search of a New Social Contract: Learning to Manage Change. Proceedings of the Sixth European IFSA Symposium on Farming and Rural Systems Research and Extension, held at Vila Real, Portugal, 3-8 April 2008, 2015
This article presents the experience of conducting an evaluation of participatory learning within an action research project conducted in South Sulawesi, Indonesia. An alternative approach to evaluation was taken, focusing on learning and on creating dialogue with the community as a main strategy. We framed the project evaluation in terms of the outcomes of the research project and, equally importantly, the learning that occurred for all groups of participants. In this connection, we discuss the basic differences between traditional and constructivist research paradigms. We conclude that the strategies and methods employed in the evaluation itself are key elements in enabling participants to clarify and articulate their norms and values, decide on action, and illuminate their learning, eventual empowerment and sense of liberation.
Navigating Theory and Practice Through Evaluation Fieldwork
American Journal of Evaluation, 2016
To explore the relationship between theory and practice in evaluation, we focus on the perspectives and experiences of student evaluators as they move from the classroom to an engagement with the social, political, and cultural dynamics of evaluation in the field. Through reflective journals, post-course interviews, and facilitated group discussions, we involve students in critical thinking about the relationship between evaluation theory and practice, which for many proved unexpectedly tumultuous, contextually dynamic, and complex. In our exploration, we are guided by the following questions: How do novice practitioners navigate between the world of the classroom and the world of practice? What informs their evaluation practice? More specifically, how can we understand the relationship between theory and practice in evaluation? A thematic analysis leads to three interconnected themes. We conclude with implications for thinking about the relationship between theory and practice in e...