
Protocol
Describing youth participatory evaluation of educational interventions as a key domain of the social determinants of health: protocol for a scoping review
Bianca Montrosse-Moorhead1, Amanda Sutter2, Chisomo Phiri3, Luna De La Cruz Perdomo4

1Department of Educational Psychology, University of Connecticut, Storrs, Connecticut, USA
2University of Connecticut, Storrs, Connecticut, USA
3University of Connecticut, Waterbury, Connecticut, USA
4Psychology, University of Connecticut, Storrs, Connecticut, USA

Correspondence to Professor Bianca Montrosse-Moorhead; bianca{at}uconn.edu

Abstract

Introduction Youth participatory evaluation is one model for monitoring global outcomes and assessing interventions to improve young people’s health equity and well-being while embracing principles of participation and empowerment. Little is known about the use of this approach in practice. This scoping review will identify and synthesise descriptions of how youth participatory evaluation is enacted and to what extent it occurs, and will describe the relationship between context and inclusion.

Methods and analysis Scoping review methods will adhere to those outlined by Arksey and O’Malley. The study will also follow the Preferred Reporting Items for Systematic Reviews and Meta-Analyses Extension for Scoping Reviews. The review will use publicly available evaluation reports (grey literature) for programmes funded by the US National Science Foundation through the Advancing Informal Science, Technology, Engineering and Mathematics (STEM) Learning programme and whose reports are archived in the repository hosted by the Reimagining Equity and Values in Informal STEM education (REVISE) Center. This scoping review is limited to education, one of the domains of the social determinants of health, more precisely STEM education, due to the report publication parameters set by the REVISE Center repository. A research team member will download citations for and PDFs of reports. These citations and reports will be managed using Zotero and exported to Covidence, a web-based program designed to manage systematic and scoping reviews. Evaluation report selection will occur in a two-step process by trained coders with clear criteria. Inclusion criteria will include: (1) report is for an evaluation study; (2) evaluation has a focus on young people, aged 10–24; (3) evaluation is for a programme serving young people, aged 10–24; and (4) report written and uploaded to the REVISE Center repository between 2017 and 2022. All reports hosted on the REVISE Center repository are based in the USA and written in English. Data charting will also be done by trained coders and facilitated by Covidence and a codebook. Several procedures will be used to uphold rigour and consistency during this process. Data analysis will be done with Dedoose.

Ethics and dissemination Human subjects research approval will not be required. This scoping review will rely on publicly available evaluation reports. No human research participants will be involved in this review. Findings will be shared through dissemination strategies, such as peer-reviewed journals, international and national conferences, and social media affiliated with academic institutions and professional associations.

Study registration This study is preregistered on Open Science Framework (https://osf.io/23jdx/). Registration DOI: https://doi.org/10.17605/OSF.IO/K6J98.

  • statistics & research methods
  • community-based participatory research
  • public health

This is an open access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited, appropriate credit is given, any changes made indicated, and the use is non-commercial. See: http://creativecommons.org/licenses/by-nc/4.0/.


Strengths and limitations of this study

  • Using and following a highly cited scoping review framework and adhering to established reporting guidelines will generate credible and transparent findings about the state of youth participatory evaluation practice within education, more precisely informal science, technology, engineering and mathematics (STEM) education.

  • This review will use publicly available US evaluation reports instead of peer-reviewed journal articles, providing a more realistic representation of practice.

  • This scoping review will not report on the quality of evaluation reports.

  • The focus on US evaluation reports in informal STEM education will not be generalisable to other contexts (eg, international evaluations, peer-reviewed literature, non-education programmes, etc) and sets the foundation for future comparisons.

Introduction

We know little about using a youth participatory approach in evaluations of interventions developed and implemented to address one or more domains of the social determinants of health. This scoping review will empirically describe the practice of youth participatory evaluation. It will be limited to education, a key domain of the social determinants of health, where we would expect young people to be involved.

Background

One model put forth for monitoring global outcomes and assessing interventions aimed at improving health equity and well-being, while also embracing principles of participation and empowerment of young people in studies, is commonly referred to as youth participatory evaluation. This youth-focused approach is defined as the process of involving young people in conducting evaluations.1 This model repositions evaluation from something done to young people to evaluation done with or by young people.2–4 Meaningful participation moves beyond young people serving only as data sources, survey respondents or interview participants; instead, it includes young people’s involvement in all evaluation phases and in decision-making roles. Young people, for example, serve as key stakeholders and engage others, assist with developing a programme description, help decide on the purpose of the evaluation, pose key evaluation questions, consider the best methods to use, help to analyse and interpret the data and determine conclusions. The purpose of youth participatory evaluation is to acknowledge young people’s legitimate and unique perspectives by meaningfully engaging and empowering them in the evaluation of programmes that serve them.

Within the social determinants of health literature, young people’s participation and empowerment are crucial to policy action towards health equity and well-being.5 This emphasis on principles of participation and empowerment has reverberated across the complex systems working to address the social determinants of health, including the evaluation of interventions aimed at one or more domains (eg, education, economic stability, living and working conditions). For example, the US Centers for Disease Control and Prevention identifies evaluation as one of the six pillars of the agency’s work to ‘reduce disparities and promote health equity’.6 This is because of a recognition that if we are to make progress on health equity and well-being worldwide, then it requires that we both monitor the current state of these outcomes and assess interventions aimed at mitigating adverse consequences. Evaluation helps accomplish both of these.

Moreover, prior work has consistently shown that the social determinants of health or social contexts hugely influence young people’s lives.5 7 8 These are significant findings because young people aged 10–24 make up a quarter of the world’s population, estimated to be 1.8 billion.9 This group of young people is a dominant force now and has the potential to continue to be one for years to come.10 On the positive side, for example, an educated and ambitious workforce ready to enter a strong labour market is good for economic stability. Employed young people earning a decent living have a higher potential to experience good living and working conditions and strong health outcomes. On the negative side, the lack of an educated or well-prepared workforce entering a weak or non-existent labour market is likely to exert pressure in various ways, for example, spurring political unrest, forcing mass migration to often unwelcoming environments, impacting marriage and birth rates, and so on. Regardless of whether the positive or negative pathway occurs, these conditions are part of the complex social contexts in which young people live. These social determinants of health profoundly impact health equity and well-being.5

Despite growing excitement about engaging young people, youth participatory evaluation practice remains underexplored. There is literature that lays out the theoretical and conceptual grounding for the approach.1–4 Guidance and resource materials have been developed to provide practical strategies for using a youth participatory evaluation approach.11–15 Case studies of the approach’s use in practice have been published.2 13 16–19 One systematic review of young people’s participation in evaluation has also been published.20 The author reported that across the 209 evaluations included in the review, evaluation practice did not match the theoretical and conceptual grounding or the practices articulated in guidance and resource materials.

This systematic review of young people’s participation in evaluation highlights a critical tension in the field. There is a vast literature, dating back to the 1950s, that describes the benefits of a participatory approach to evaluation,21–23 demonstrating strong support for the inclusion of stakeholders of all ages continuing to the present day.24–26 There have also been empirical studies describing the benefits of participation on evaluation quality and outcomes, including a systematic review of stakeholder involvement in evaluation.27 Thus, stakeholder participation in evaluation is a widely accepted practice, but as the prior systematic review highlights, the principle of participation does not appear to extend to young people.

At the same time, while the findings from the prior systematic review on young people’s participation in evaluation are important,20 several limitations curtail their usefulness. One, the study focused very narrowly on evaluation reports of out-of-school time programmes, which occur after school or during the summer. Two, the systematic review methods were not adequately described in the article. There are, for example, no research questions or methods section in the article. Instead, two embedded paragraphs note that the Harvard Family Research Project’s Out-of-School Time Programme research and evaluation database was used, that the database contained 209 evaluation reports, that three searches of the database were done and that the final sample included 6 evaluation studies (out of 209). The two paragraphs do not list search terms, inclusion or exclusion criteria, who reviewed the articles, the years covered, how data were extracted or how data were analysed. One could argue that this is a systematic review in name only. Three, the study is almost 15 years old. No follow-up review studies have been done to explore the extent to which findings have held over time or to explore findings beyond those presented, such as whether observed patterns hold across other educational contexts. Complicating potential systematic review efforts, the Harvard Family Research Project’s Out-of-School Time Programme Research and Evaluation Database was decommissioned in 2017. These collective limitations raise concerns about the transparency and quality of the prior systematic review’s methods, rendering replication impossible. For these reasons, we still know little about how youth participatory evaluation is enacted in education.

Objective

This scoping review will empirically describe the practice of youth participatory evaluation. Scoping reviews offer one way to summarise existing literature and uncover key themes through thorough, transparent and replicable processes.

This scoping review will be limited to education for several reasons. Research on the social determinants of health has shown that access to and participation in high-quality educational experiences are among the strongest structural determinants of health equity and well-being for young people.8 Many educational interventions aim to mitigate aspects that impede young people’s health equity and well-being. One example is the UK’s Free School Meals (FSM) programme. The FSM is a government-funded meal programme that provides nutritious, low-cost or free lunches daily to children who meet eligibility criteria and attend a primary or secondary state school, free school or academy.28 Many such programmes exist across other countries, such as the National School Lunch Programme in the USA. Another example is Morocco’s Reading for Success National Programme for Reading. This programme was developed in response to research documenting that despite near-universal access to education in the country, 7 out of 10 first-grade students were not reading at grade level.29 A final example is out-of-school time interventions offered in many countries (eg, after-school programmes, summer programmes, etc). Prior meta-analytic work has shown these interventions to be especially helpful for students who are below country-specific grade-level standards, are from low-income households, or who are immigrants.30–32 Education is also a domain of the social determinants of health where one would expect to see young people as intended beneficiaries of interventions. For example, primary and secondary education is free for everyone in the USA, the UK and in many other countries. Thus, if youth participatory evaluation practices occur, one would be more likely to observe them in educational contexts. Future work will explore youth participatory evaluation practice in other domains associated with the social determinants of health.

Two objectives will guide this scoping review:

  1. To identify and synthesise descriptions of how youth participatory evaluation is enacted or not (ie, is it done, who is involved, in what ways, to what extent and with what methods, strategies and actions) within US science, technology, engineering and mathematics (STEM) education programmes.

  2. To describe the relationship between context and the inclusion of young people in evaluations of US STEM education programmes.

Methods and analysis

Our study approach will be guided by steps one through five of Arksey and O’Malley’s scoping review methodological framework.33 There can be differing goals for scoping studies. Our scoping study goal will be exploratory and broad but still intended to understand the existing evidence, especially gaps in the evidence. The reporting will also conform to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses Extension for Scoping Reviews (PRISMA-ScR).34 This study is preregistered on Open Science Framework (https://osf.io/23jdx/). Collectively, our study approach and methods aim to overcome and address the limitations of the prior systematic review.20

Stage 1: defining the research questions

The theoretical grounding for this study will be the Conceptual Framework for Measuring Outcomes of Adolescent Participation.35 This ecological framework includes several core ideas. One is a recognition that participation exists on a continuum from non-participation to youth-led, with several key distinguishing markers in between (eg, consultative, collaborative). Participation must be meaningful. Meaningful in this framework is defined by four key context characteristics: it is safe for youth to express their ideas and views, youth can share their ideas and views using whatever media they prefer, youth ideas and views are respected by adults, and youth ideas and views are given due consideration during decision-making. The framework also describes socioecological spheres where participation can occur (eg, schools, with peers, in community groups and within government). It also calls attention to the environment that enables involvement (eg, social norms, skills and capacities, awareness of the right of youth to be involved).

This framework was selected because of its alignment with the idea within the social determinants of health literature that young people are a crucial component of policy action towards health equity and well-being.5 In the same way, in the Conceptual Framework for Measuring Outcomes of Adolescent Participation, young people are a critical component of evaluations of interventions meant to impact their lives. There is also alignment between the social determinants of health literature and the selected conceptual framework in that both recognise the importance of the social ecology and contexts in which young people live.

The Conceptual Framework for Measuring Outcomes of Adolescent Participation positions young people’s involvement, including in evaluation activities, as needing to be authentic and meaningful. It recognises the environmental context’s influence on participation and provides a conceptual framework for researching important components of participation.

To understand youth participatory evaluation practice as grounded in the aforementioned conceptual framework, our scoping review will address the following research questions:

  1. How have young people been included in evaluating programmes that serve them?

  2. What evaluation methodologies have been used to include young people?

  3. What evaluation strategies and preparatory actions have been used to promote successful inclusion?

  4. What contextual factors are important for the inclusion of young people?

Stage 2: identifying relevant literature

Our study will use evaluation reports publicly available on the community repository hosted by the Reimagining Equity and Values in Informal STEM education (REVISE) Center.36 This Center is government-funded by the US National Science Foundation (NSF) through the Advancing Informal STEM Learning (AISL) programme. NSF funds practice, research and evaluation proposals investigating informal STEM learning experiences and environments. Informal education is a general term used in the USA for education outside of traditional classroom settings. As such, the NSF AISL programme supports various organisations, including but not limited to STEM centres and museums, zoos, aquariums, botanical gardens and so on.

While a few possible repositories were identified in the exploration stage, the REVISE community repository is a good fit for this study for several reasons. One, NSF is a significant funder of evaluation activities in the USA; the agency requires that all funded projects be evaluated. Two, NSF total award amounts are publicly available, so we can account for the budgets associated with evaluation reports included in the REVISE repository. This is not the case for other repositories. Three, evaluators outside of the academy, of which there are many, do not have the pressure or need to publish. Many evaluation practitioners are not incentivised to publish in journals or make their reports public.37–39 The result is that many evaluation reports are neither public nor submitted for peer review to a journal. The REVISE community repository is unique because it collects and makes publicly available evaluation reports (grey literature) for NSF-funded STEM-related programmes. No other evaluation report database, including the aforementioned Harvard Family Research Project’s Out-of-School Time Programme research and evaluation database or the European OpenGrey literature database, carries this same public publishing mandate. Four, because this study is interested in describing actual practice, evaluation reports, as opposed to peer-reviewed journal articles, provide a more accurate picture of practice. The REVISE community repository provides a more complete picture of evaluation practice within NSF-funded STEM education because it limits bias in two crucial ways. It limits publication bias, or bias resulting from the failure to publish results based on the direction or strength of findings, because all reports are made publicly available regardless of findings. It also limits another type of bias that is unique to evaluation studies, which we call ‘evaluation reporting bias’ because it results from the failure to publish results based on whether the evaluation team is incentivised to do so.

Moreover, this study is part of a larger research agenda. For the reasons mentioned above, this first study focuses on US evaluation reports (grey literature). Subsequent work will review grey literature outside of the USA and peer-reviewed literature to see the extent to which patterns observed in the grey literature hold.

A research team member (CP) will download citations for and PDFs of reports. These citations and reports will be managed using Zotero and imported into Covidence (AS).40 41 Zotero is open-source, freely available reference management software. Covidence is a web-based program designed to manage systematic and scoping reviews.

Stage 3: report selection

Evaluation reports will be screened for inclusion in a two-step process (see online supplemental file 1 for more information). In step 1, two independent screeners (AS, CP) will review the title and abstract for each report against inclusion and exclusion criteria. Inclusion criteria will include: (1) report is for an evaluation study; (2) evaluation has a focus on young people, aged 10–24; (3) evaluation is for a programme serving young people, aged 10–24; and (4) evaluation report written and uploaded to the REVISE Center repository between 2017 and 2022. Most programmes were likely also delivered within this 2017–2022 window. Inclusion criteria are intentionally broad so that researchers can describe the extent to which the inclusion of young people in evaluation is or is not occurring. Both inclusion and non-inclusion of youth are essential to answer the first research question. Evaluation reports falling outside these inclusion parameters will be excluded, such as programmes for children under age 10 or adults 25 years of age and older. If the title or abstract is unclear and the inclusion criteria cannot be fully assessed, the default action will be to include the study so that it can be more fully assessed in step 2. Covidence will track results from step 1. In step 2, team members (AS, LDLCP) will review the full text of all evaluation reports. The same inclusion and exclusion criteria will be used, and Covidence will also track results.
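To make the step 1 decision logic concrete, it can be sketched as follows. This is a hypothetical illustration only: the field names and example records are invented, criteria (2) and (3) are collapsed into a single age-range overlap check for brevity, and the actual screening will be performed by trained human coders in Covidence, not programmatically.

```python
# Hypothetical sketch of the step 1 inclusion logic; field names and
# example records are invented, not an actual Covidence export format.

YOUTH_MIN, YOUTH_MAX = 10, 24
REPORT_YEARS = range(2017, 2023)  # 2017-2022 inclusive

def serves_youth(age_min, age_max):
    # True when the programme's target age range overlaps the 10-24 window
    return age_min <= YOUTH_MAX and age_max >= YOUTH_MIN

def meets_inclusion_criteria(report):
    # Criteria: evaluation study, youth focus/served (collapsed into one
    # age check here), and report uploaded between 2017 and 2022
    return (
        report["is_evaluation_study"]
        and serves_youth(report["age_min"], report["age_max"])
        and report["upload_year"] in REPORT_YEARS
    )

reports = [
    {"id": "R1", "is_evaluation_study": True, "age_min": 12, "age_max": 18, "upload_year": 2019},
    {"id": "R2", "is_evaluation_study": True, "age_min": 3, "age_max": 8, "upload_year": 2020},   # under 10: exclude
    {"id": "R3", "is_evaluation_study": False, "age_min": 14, "age_max": 17, "upload_year": 2018},  # not an evaluation
]

included = [r["id"] for r in reports if meets_inclusion_criteria(r)]
print(included)  # ['R1']
```

In keeping with the protocol's default rule, a report whose title or abstract leaves any criterion uncertain would not be excluded at this stage but carried forward to full-text review.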

The screening team will include the primary investigator (BM-M) and graduate (AS) and undergraduate trainees who have completed training on youth participatory evaluation and scoping reviews (CP, LDLCP). Several study procedures will be used to ensure consistency: (1) all reports will be reviewed by trained team members (BM-M, AS, CP, LDLCP); (2) the study Primary Investigator (PI), who is an expert in youth participatory evaluation, will serve as a critical reader and resolve conflicts (BM-M); (3) all team members will review scoping review objectives and inclusion/exclusion criteria before screening (BM-M, AS, CP, LDLCP); and (4) team members will practise applying inclusion/exclusion criteria on a subset of reports to calibrate between team members before engaging in the full screening process (BM-M, AS, CP, LDLCP).

Stage 4: data charting

Once the final list of studies for inclusion is generated, team members (LDLCP, AS) will use Covidence to chart or extract relevant information from evaluation reports aligned with the objectives of the scoping review. Information to be extracted includes author details, programme information and evaluation study components. Many of these details are evaluation reporting elements identified in the Checklist for Evaluation-Specific Standards.42 Team members (LDLCP, AS) will also extract general information on youth participatory practises.

The complete list of quantitative and qualitative variables to be extracted is available in the project codebook on Open Science Framework (https://osf.io/23jdx/). This codebook describes codes, definitions for codes, the origin of the definition, the type of data to be extracted, the corresponding Covidence field and the variable values to be used in the analysis. Additional categories may be identified during the data extraction process. These will be discussed and decided on by the entire team. Consistent with scoping review convention, the methodological quality of included studies will not be evaluated.33 34
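As a hypothetical illustration of the codebook structure described above, a single entry might be modelled as follows. The example code, values and field names are invented for this sketch; the authoritative codebook is the one hosted on Open Science Framework.

```python
# Hypothetical model of one codebook entry; the fields mirror the structure
# described in the protocol, but the example content is invented.
from dataclasses import dataclass, field

@dataclass
class CodebookEntry:
    code: str                 # short variable name used during charting
    definition: str           # what the code means
    definition_origin: str    # where the definition comes from
    data_type: str            # eg, "categorical" or "free text"
    covidence_field: str      # where the value is recorded in Covidence
    values: list = field(default_factory=list)  # values used in analysis

entry = CodebookEntry(
    code="participation_depth",
    definition="Depth of young people's involvement in the evaluation",
    definition_origin="Conceptual Framework for Measuring Outcomes of Adolescent Participation",
    data_type="categorical",
    covidence_field="Participation depth",
    values=["non-participation", "consultative", "collaborative", "youth-led"],
)
```

Structuring each entry this way keeps the code, its definition, its provenance and its permissible values together, which supports the calibration and conflict-resolution procedures described below.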

Rigour and consistency will be maintained through several procedures: (1) all team members will be trained on charting procedures (BM-M, AS, LDLCP); (2) team members will practise charting data on a subset of reports before phase one begins (BM-M, LDLCP); (3) team members will bring data charting questions, concerns and requests for a second opinion to regularly scheduled team meetings involving all team members, with notes from the discussion and decisions documented in a shared Google document (LDLCP); and (4) a codebook detailing variables, variable definitions, origins of the variable definitions, variable types and variable values will be developed, used and updated (as needed) by the PI (BM-M) during the data charting process.

Stage 5: collating, summarising, and reporting the results

Collating and summarising will be done in Dedoose by the team (AS, LDLCP). Dedoose is a web-based data analysis software program that allows for collaborative analysis of quantitative and qualitative data.43 One team member (AS) will download charted data and PDFs from Covidence and import them into Dedoose. Team members (LDLCP, AS) will further code information on youth participatory practices using Dedoose, such as inclusion type, mode of participation, depth of inclusion, ways young people were involved, strategies for inclusion, outcomes of young people’s involvement and environmental and context features enabling meaningful inclusion. The complete list of variables to be coded in Dedoose is available in the project codebook on Open Science Framework (https://osf.io/23jdx/).

We will use quantitative and qualitative data analysis methods to answer our research questions. Descriptive statistics (ie, mean, frequencies, cross-tabs) will be generated for all research questions using charted quantitative data. Thematic analysis will be used to supplement quantitative findings by providing descriptions and examples of strong youth participatory practice, such as how young people are included in evaluation studies, what methods, strategies and actions are used to engage young people in the evaluation process, and the relationship between the context in which evaluations take place and the inclusion of young people in evaluation.
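As a hypothetical illustration of the planned descriptive statistics, frequencies and a simple cross-tabulation over charted data might look like the following. The variable names and records are invented for this sketch; the actual analysis will be conducted in Dedoose using the variables listed in the project codebook.

```python
# Illustrative sketch of descriptive statistics over invented charted data;
# the study's analysis itself will be done in Dedoose.
from collections import Counter

charted = [
    {"participation": "consultative", "setting": "museum"},
    {"participation": "collaborative", "setting": "museum"},
    {"participation": "non-participation", "setting": "zoo"},
    {"participation": "consultative", "setting": "zoo"},
]

# Frequencies of participation depth across reports
freq = Counter(r["participation"] for r in charted)

# A simple cross-tabulation of participation depth by programme setting
crosstab = Counter((r["participation"], r["setting"]) for r in charted)

print(freq["consultative"])                 # 2
print(crosstab[("consultative", "zoo")])    # 1
```

Counts of this kind would then be supplemented by thematic analysis, with qualitative excerpts illustrating what sits behind each cell of the cross-tabulation.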

All dissemination efforts (eg, presentations and manuscripts) will follow PRISMA-ScR guidelines. Table 1 includes the anticipated timeline.

Table 1

Anticipated timeline

Patient and public involvement

None.

Ethics and dissemination

This study will not involve human subjects and will not require institutional review board approval. Findings will be shared through several dissemination strategies, such as peer-reviewed journals, international and national conferences and social media affiliated with academic institutions and professional associations.

Study status

The authors are nearing completion of the report selection process (stage 3). Fall 2024 is the target date for completing this review.

Discussion

To the best of our knowledge, this will be the first review to describe youth participatory evaluation as enacted in practice based on the grey literature, upholding principles of transparency and the possibility of future replication. This is important given that future work will explore youth participatory evaluation as observed in the evaluation of interventions targeting other social determinants of health domains.

Using and thoughtfully integrating scoping review methods, our conceptual framework, the Checklist for Evaluation-Specific Standards and PRISMA-ScR will serve to enhance the rigour, transparency, trustworthiness and replicability of our study design.33–35 42 We also believe our use of grey literature is a strength because it more accurately reflects what occurs in evaluation practice. We believe this scoping review design has the potential to serve as a model for others interested in advancing scholarship on youth participatory evaluation.

As with any study, there are limitations to consider. Scoping reviews, by design, are not well-suited to examine the effectiveness of evaluation methods, including those that are participatory. This focus is better explored in systematic reviews and meta-analyses. This scoping review will also not assess the quality of evaluation designs involving young people. While we will search a unique publicly available evaluation report database, we know that this database does not cover all studies involving young people or using youth participatory evaluation methods. It only covers evaluation reports (grey literature) for STEM-related programmes based in the USA and its territories and funded by NSF AISL. Future work will search other grey literature databases, such as OpenGrey and peer-reviewed papers. This will allow researchers to examine (and potentially quantify) both publication bias and evaluation reporting bias in youth participatory evaluation. It will also enable researchers to see the extent to which findings vary across different types of literature, a larger slice of the education literature (not just STEM), in the evaluation of interventions targeting other social determinants of health domains and in different geographic contexts.

This scoping review will provide a much-needed synthesis of the state of the field, including whether youth participatory evaluation is being used in practice, who is involved, to what extent, in what ways, with what methods, strategies and actions, and how context is connected to involvement in evaluation. This information will provide empirical evidence on which to base discussions and debates about the merits of and need for greater participation of young people in evaluations of interventions aligned with the social determinants of health.


Supplementary materials

  • Supplementary Data

    This web only file has been produced by the BMJ Publishing Group from an electronic file supplied by the author(s) and has not been edited for content.

Footnotes

  • X @b_mmoorhead

  • Contributors We provide a Contributor Roles Taxonomy (CRediT) author statement. BM-M: conceptualisation; methodology; validation; investigation; resources; data curation; writing – original draft; writing – review and editing; visualisation; supervision; project administration. AS: conceptualisation; methodology; validation; investigation; writing – review and editing; supervision; project administration. CP: investigation; writing – review and editing. LDLCP: investigation; writing – review and editing. BM-M is responsible for the overall content of this scoping review as guarantor.

  • Funding The authors have not declared a specific grant for this research from any funding agency in the public, commercial or not-for-profit sectors.

  • Competing interests None declared.

  • Patient and public involvement Patients and/or the public were not involved in the design, or conduct, or reporting or dissemination plans of this research.

  • Provenance and peer review Not commissioned; externally peer reviewed.

  • Supplemental material This content has been supplied by the author(s). It has not been vetted by BMJ Publishing Group Limited (BMJ) and may not have been peer-reviewed. Any opinions or recommendations discussed are solely those of the author(s) and are not endorsed by BMJ. BMJ disclaims all liability and responsibility arising from any reliance placed on the content. Where the content includes any translated material, BMJ does not warrant the accuracy and reliability of the translations (including but not limited to local regulations, clinical guidelines, terminology, drug names and drug dosages), and is not responsible for any error and/or omissions arising from translation and adaptation or otherwise.