Participatory research meaningfully includes the people whom science is intended to impact in a way that is community or constituent-driven, systematic, inclusive, value-based, and focused on practical change (Bradbury, 2015; Minkler, 2000). A key value of participatory research is to promote democratization through research, especially within historically hierarchical settings or among marginalized populations (Greenwood & Levin, 2007). Specifically, the democratizing impact of participatory research has been explored for youth within settings that serve them, such as schools and out-of-school time programs (Greenwood & Levin, 2007). When youth are involved in participatory research they are seen as experts on their experiences and as necessary members of a research team (Zeller-Berkman et al., 2013), and this involvement can create benefits for both the research project and the youth (Berg et al., 2009; Checkoway et al., 2005).

Although methods to promote youth participation in the research process exist (Peterson-Sweeney, 2005), the material and practical requirements of their implementation have yet to be explored as possible barriers to their application. Given that youth often engage in participatory research within settings that are systematically under-resourced and overburdened, such as public schools, it is necessary to assess how the resource requirements of participatory research methods may hinder implementation. In light of this context, we aim to support the development and exploration of research methods that are grounded in the values of participatory research while also responsive to the material constraints of settings that seek to promote youth empowerment.

Participatory Research with Youth

In participatory research with youth, adults collaborate with youth across the research process to understand the problems that impact youth, develop solutions, and evaluate the effectiveness of those solutions (Cammarota & Fine, 2008; Creswell et al., 2007; Hubbard, 2015). This collaboration between youth and adults can take on a variety of different forms and levels of involvement, as conceptualized within existing participatory frameworks and models (e.g., Checkoway & Richards-Schuster, 2003; Richards-Schuster & Plachta Elliott, 2019; Wong et al., 2010). These collaborations create a critical opportunity to learn from youths’ perspectives and engage them in decision-making that impacts their lives (Checkoway & Richards-Schuster, 2003).

An emerging body of literature suggests that involvement in participatory research can have positive effects on youth (Berg et al., 2009; Checkoway et al., 2005; Foster-Fishman et al., 2005; Hubbard, 2015; Ozer et al., 2010; Ozer & Douglas, 2013; Zeller-Berkman et al., 2013) and the organizations that serve them (Chen et al., 2010; Wilson et al., 2007; Zeller-Berkman et al., 2013). For example, Ozer and Douglas (2013) documented the positive impact of youth involvement in participatory research through a randomized controlled trial in which high-school-aged youth were randomly assigned to enroll in either (a) a course focusing on the development and execution of a student-led participatory research project or (b) a course focusing on direct-service peer mentoring and education with no participatory research component. In this study, youth in the participatory research course showed significantly larger pre-to-post gains in sociopolitical skills and motivation to influence schools and communities compared to youth enrolled in the peer mentoring course (Ozer & Douglas, 2013). Given the empirical and value-driven support for participatory research, it is important to establish feasible methods that can be employed within the diverse settings where youth are engaged.

Resource Challenges to Youth Engagement in Participatory Research

Many existing methods developed to facilitate meaningful youth engagement within research face pragmatic and resource challenges (Chen et al., 2010; Jacquez et al., 2013; Ozer & Douglas, 2013; Zeller-Berkman et al., 2013). Even successfully implemented participatory approaches tend to limit youths' involvement in higher-order research tasks such as data analysis and interpretation (Jacquez et al., 2013). In their review of youth involvement in community-based participatory research, Jacquez and colleagues (2013) found that youth were included in the data analysis phase in only 54% of the reviewed studies. Therefore, a key direction for the growing field of participatory research with youth is identifying and examining methods that engage youth in data analysis. Developing and utilizing tools that include youth in the data analysis process will ultimately enable researchers to better meet the democratizing and power-sharing values of participatory research (Ozer & Douglas, 2015).

Many scholars conducting participatory research with youth have also documented the significant resources needed to implement their approaches, such as cost, time, and space (e.g., Chen et al., 2010; Flicker, 2008; Ozer & Douglas, 2013; Zeller-Berkman et al., 2013). Researchers and community-based partners must manage competing tensions among participatory values, resource constraints, research quality, and internal capacity, all of which place different and unique demands on the research project (Zeller-Berkman et al., 2013). Furthermore, resource requirements are exacerbated in qualitative research, given the processes and materials needed to gather, record, and analyze these data (Gravois et al., 1992; Neal et al., 2015; Tessier, 2012). Resource requirements create a barrier to using qualitative methods in participatory research with youth, which is unfortunate given their potential to engage a wide array of youth. Effectively fostering participatory research with youth requires the development of methods that support young people's participation in all aspects of research and evaluation and that can be practically implemented with minimal resources.

Youth GO: An Approach to Gathering Youth Perspectives

In response to the need for pragmatic participatory approaches to qualitative research with youth, Stacy and colleagues (2018) recently put forth Youth Generate and Organize (Youth GO). Youth GO was developed by combining specific components of existing participatory research approaches (Foster-Fishman et al., 2010; Vaughn et al., 2011; Vaughn & Lohmueller, 2014) with the goal of maximizing their strengths and offsetting limitations. Briefly, Youth GO is a five-step participatory qualitative approach in which youth work with a facilitator to articulate and organize their perspectives on specified issues. During Step One, climate setting, the facilitator introduces the purpose and goals of Youth GO and moderates a discussion to set group expectations and rules. Step Two, generating, requires participants to develop individual responses to prompts that are then clarified in a group setting. In Step Three, organizing, participants learn and implement a process for sorting individual answers into emergent themes. Step Four, selecting, involves the group-based generation and review of categories in which to organize the themes. Finally, Step Five—debrief and discussion—focuses on processing the experience with participants and the establishment of next actions.

Stacy and colleagues (2018) conducted developmental testing of Youth GO within the context of the evaluation of an out-of-school time educational program for youth living in public housing. Specifically, these researchers conducted two implementations of Youth GO and obtained preliminary evidence of (a) feasibility, (b) usefulness in guiding programming decisions, and (c) youths' positive perceptions of the approach. In their first implementation, Stacy and colleagues (2018) demonstrated that Youth GO could feasibly be implemented with the resources typically available in out-of-school time contexts and that the findings had practical utility for informing program improvement. In their second implementation, they documented youths' high satisfaction with the approach and favorable perceptions of its appropriateness for youth, its support for relationships with adults, and its enabling of youth voice in decision-making. While this study demonstrated the preliminary feasibility and utility of the Youth GO approach, a more critical analysis of implementation resources and perceptions was needed to further explore this emerging method.

Current Study

The primary goal of the current study was to further examine the utility and value of the Youth GO approach. Specifically, the goal of this study was to compare the feasibility, resource requirements, and perceived value of Youth GO to those of focus groups. Focus groups are an appropriate comparison, given that they are often used by researchers to engage diverse groups of youth within numerous settings to elicit perspectives in an interactive and contextual manner (Brotherson, 1994; Peterson-Sweeney, 2005). To achieve this goal, we conducted a field-based, double-blind active comparison randomized trial in which facilitators and youth were assigned to one of two conditions: Youth GO or focus group. Throughout this study, we examined both conditions on implementation fidelity, implementation cost, and perceived value from adult facilitator and youth perspectives. Our specific hypotheses were that, compared to focus groups, Youth GO would:

H1. Be more feasibly implemented at high levels of fidelity.

H2. Require significantly fewer resources for implementation.

H3. Be perceived more positively by adult facilitators and youth participants.

Methods

Context

We conducted this study within a public school district serving a small legacy city in the midwestern United States (US). Legacy cities are those that have experienced some degree of economic decline over the last 50 years due to the outflux of industry (Hollingsworth & Goebel, 2017). Legacy cities face unique community challenges, such as population loss, declining workforce, and neighborhood blight (Hollingsworth & Goebel, 2017). Given the complex challenges that legacy cities face, a comprehensive student support strategy—Community Schools—was implemented within the local public school district. Community Schools are a strategy to promote health equity and educational justice by establishing four core components within a school: (1) integrated student supports and services, (2) expanded learning time and opportunities, (3) community and family engagement, and (4) collaborative leadership and practice (Maier et al., 2017). The data for this study were collected as part of an evaluation project to document early implementation of a Community Schools approach within the public school district. As one component of our evaluation, we gathered student perceptions from samples of middle-school- and high-school-aged students. It was in the context of this evaluation that we compared Youth GO and focus groups.

Participants

Youth Participants. Twenty-nine students agreed to be a part of this study; 16 were randomly assigned to Focus Group participation and 13 were assigned to participate in Youth GO. Table 1 provides a summary of participants’ responses to demographic questions.

Table 1. Youth Participant Demographics
Focus Group (n = 16) Youth GO (n = 13) Total (N = 29)
n % n % N %
School
A 2 12.5 2 15.4 4 13.8
B 5 31.3 3 23.1 8 27.6
C 7 43.8 4 30.8 11 37.9
D 2 12.5 4 30.8 6 20.7
Race
African American 13 81.3 9 69.2 22 75.9
Multi-Racial 2 12.5 2 15.4 4 13.8
Indian 1 6.3 1 7.7 2 6.9
Not Reported 0 0.0 1 7.7 1 3.4
Gender
Male 5 31.3 4 30.8 9 31.0
Female 11 68.8 9 69.2 20 69.0
M (SD) M (SD) M (SD)
Age 13.1 (1.84) 12.8 (2.0) 12.9 (1.9)
Grade 7.9 (2.0) 7.8 (1.9) 7.8 (1.9)
GPA 3.2 (0.5) 3.9 (0.5) 3.5 (0.5)

Adult Facilitators. Six adults were recruited to facilitate the evaluation activities. All adults were selected because of their experience facilitating discussion groups with youth. These adults all had some classroom-based training or experience working with youth within underserved communities, thus making them well suited to serve as facilitators of these groups. The majority of adult facilitators were white (83%) and female (83%), and the mean age was 26. Facilitators' highest level of formal education was a bachelor's degree in progress (17%), a completed bachelor's degree (50%), or a master's degree (33%). Adult facilitators were randomly assigned to facilitate either the Focus Group (n = 3) or Youth GO (n = 3) condition.

All adult facilitators received a 1.5-hour training prior to their involvement in the current study. The training included an overview on each of the following components: adolescent development and engagement, the goals of the overall evaluation project (but not the goals of this study), and skills and techniques for facilitating structured groups with youth. Once assigned to a condition, facilitators divided into teams based on the evaluation approach they were assigned and then received training on the relevant protocol. In this component of the training, the designers of each protocol led an in-depth discussion of protocol components and allotted time for questions and open discussion to ensure that research staff understood and were prepared to implement the protocol. All adult facilitators were instructed to only discuss protocol details with other members of their team.

Measures

Youth background information. Youth participants were asked to complete a written questionnaire that asked for information about their demographic background, including: gender, age, race/ethnicity, school, and grade.

Youth-adult partnerships. Two different components of youth-adult partnerships were examined in this study: youth voice in decision making and supportive adult relationships.

Youth voice in decision making. We asked participants to use a five-point Likert-type scale (ranging from 1 = Strongly disagree to 5 = Strongly agree) to respond to the four items of the Youth Voice in Decision Making subscale of the Youth-Adult Partnership scale (Zeldin et al., 2014). These items ask youth to rate the extent to which they experience voice in decision making (e.g., “The staff take my ideas seriously”). Previous research supports the internal consistency of this scale (α = .82) among a racially diverse sample of youth and adolescents, ages 11 to 24, who participated in youth development programs located in the United States, Malaysia, and Portugal (Zeldin et al., 2014). For the present study, the original item wording was slightly altered to better fit the current context (e.g., “The adults in this group take my ideas seriously”). In the current sample, the internal consistency of the modified Youth Voice in Decision Making scale measure was α = .51.

Supportive adult relationships. We asked participants to use a five-point Likert-type scale (ranging from 1 = Strongly disagree to 5 = Strongly agree) to respond to the five items of the Supportive Adult Relationships subscale of the Youth-Adult Partnership Scale (Zeldin et al., 2014). These items ask youth to rate the extent to which they perceive supportive adult relationships (e.g., “Youth and staff trust each other in this center”). Previous research supports the internal consistency of this scale (α = .87) among a racially diverse sample of youth and adolescents, ages 11 to 24, who participated in youth development programs located in the United States, Malaysia, and Portugal (Zeldin et al., 2014). For the present study, the original item wording was slightly altered to better fit the current context (e.g., “Youth and adults trust each other in this group”). In the current sample, the internal consistency of the Supportive Adult Relationships scale measure was α = .76.
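The internal consistency values reported for these subscales are Cronbach's alpha coefficients. As a rough illustration of how such a coefficient is computed from Likert-type item responses, the following sketch uses hypothetical data (not study data); the function name and the example responses are ours:

```python
# Cronbach's alpha: the ratio of shared item variance to total scale variance.
# Illustrative sketch only; the item responses below are hypothetical.

def cronbach_alpha(items):
    """items: list of per-item response lists, each with one entry per respondent."""
    k = len(items)          # number of items
    n = len(items[0])       # number of respondents

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)  # sample variance

    sum_item_vars = sum(variance(item) for item in items)
    totals = [sum(item[i] for item in items) for i in range(n)]  # per-respondent totals
    return (k / (k - 1)) * (1 - sum_item_vars / variance(totals))

# Hypothetical 5-item scale, 6 respondents, 1-5 Likert responses
responses = [
    [4, 5, 3, 4, 2, 5],
    [4, 4, 3, 5, 2, 4],
    [5, 4, 2, 4, 3, 5],
    [3, 5, 3, 4, 2, 4],
    [4, 4, 2, 5, 3, 5],
]
print(round(cronbach_alpha(responses), 2))
```

Alpha rises as items covary: respondents who score high on one item tend to score high on the others, so total-score variance outpaces the summed item variances.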

Perceptions of evaluation approach. Two different components of participant perceptions of each evaluation approach were examined in this study: satisfaction and acceptability.

Satisfaction. We asked participants to use a five-point Likert-type scale (ranging from 1 = Strongly disagree to 5 = Strongly agree) to respond to five items assessing overall satisfaction with the evaluation approach for youth their age (e.g., "I enjoyed the activities of the group I was in today"). The items were based on an evaluation form for ¡Cuídate!, a sexual risk-reduction program for Latina/o youth (Villarruel & Eakin, 2008), the Primary Intervention Rating Scale (Lane et al., 2009), and the Modified Children's Intervention Rating Profile (Mitchell et al., 2015). Existing satisfaction measures often assess a specific type of intervention (e.g., behavior modification), specific components of the intervention such as program modules (e.g., safe sex behaviors), or the implementer of the intervention (e.g., teacher). In contrast, the five items created for the current study aimed to assess satisfaction that is not specific to an intervention or its components. In the current sample, the internal consistency of the satisfaction subscale was α = .90.

Acceptability. We asked participants to respond to seven items on a five-point Likert-type scale (ranging from 1 = Strongly disagree to 5 = Strongly agree) to report their perceptions of the acceptability of the evaluation approach for youth their age (e.g., "The activities today discussed issues that were relevant to my life"). Similar to the satisfaction subscale, the seven-item acceptability subscale was developed based on an evaluation form for ¡Cuídate!, a sexual risk-reduction program for Latina/o youth (Villarruel & Eakin, 2008), the Primary Intervention Rating Scale (Lane et al., 2009), and the Modified Children's Intervention Rating Profile (Mitchell et al., 2015). The seven items were modified to assess participants' perceptions of acceptability independent of a specific intervention or its components. In the current sample, the internal consistency of the acceptability subscale was α = .80.

Facilitator questionnaire. After all Youth GO and focus group sessions were conducted, we asked adult facilitators to respond to a questionnaire via email, which included questions about their background information (e.g., gender, age, race/ethnicity, education level) and open-ended questions about their experience and training working with and facilitating groups of primarily low-SES, African American adolescents. Facilitators completed slightly modified versions of all previously described scales (i.e., youth voice in decision-making, supportive adult relationships, satisfaction, and acceptability) using a five-point Likert-type scale (ranging from 1 = Strongly disagree to 5 = Strongly agree). Items were modified to align with the different positionality of adult facilitators (e.g., "Youth and adults trust each other in this group" was modified to "Youth and adults trusted each other in the groups I facilitated"). While all facilitators were asked to respond to this questionnaire via email, only four of six facilitators provided full data.

Implementation fidelity. Youth GO facilitators completed a fidelity checklist after implementing each group. The checklist required them to indicate materials used and steps implemented within the protocol. It also required facilitators to include explanations if any deviations from the protocol occurred (e.g., materials were not used, or tasks were not carried out). We developed a parallel fidelity protocol for the focus group condition, completed by the current study’s first author using written transcripts of the discussions.
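A fidelity checklist of this kind reduces to a simple proportion of completed components. As a minimal sketch (the checklist items below are hypothetical, not the study's actual protocol components):

```python
# Minimal sketch of scoring a fidelity checklist; item names are hypothetical.

def fidelity_rate(checklist):
    """Percent of protocol components marked as completed."""
    completed = sum(1 for done in checklist.values() if done)
    return 100 * completed / len(checklist)

# Example: a debrief checklist where one of three components was skipped
debrief_checklist = {
    "summarized group themes": True,
    "invited participant reflections": True,
    "discussed next actions": False,
}
print(f"{fidelity_rate(debrief_checklist):.1f}%")  # 2 of 3 components -> 66.7%
```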

Procedures

Participant recruitment and consent. Staff who led the implementation of Community Schools at four schools serving grades seven through twelve within the district recruited students to participate in the study. Youth were recruited through fliers and word of mouth. Staff made it clear to students that participation in the activity was voluntary, and that refusing to participate would have no bearing on their, or their families', relationship with the schools, including the ability to receive services and supports. On the day that the groups occurred, students who were eligible to participate were brought to a classroom by the community school staff. Upon arrival, research staff described the purpose of the activities and explained an informed assent form to the youth. Youth who wished to participate in the activities signed the assent form. Only one youth declined participation. The use of these data was approved and deemed research exempt by an institutional review board.

Random assignment and evaluation approaches. Within each school building, all assenting participants were randomly assigned to one of two evaluation approaches using sequentially numbered, opaque, sealed envelopes. To limit the potential for manipulation, the envelopes were numbered in advance, and opened sequentially (Dettori, 2010). Both evaluation approaches were implemented concurrently in the same school building, led by two co-facilitators, and were audio recorded. Participation in the research procedures took approximately 1.5 hours. Upon completion, youth were asked to complete a post-assessment which included all measures included for this study. Research staff (who were different than research facilitators) read all post-assessment items and provided explanations or clarifications when asked. Youth completed the post-assessment individually, which took about 10 minutes. After completion of the post-assessment, research staff thanked the youth for their time, distributed research participation incentives, and dismissed youth to Community School staff.
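The sealed-envelope procedure above can be sketched in code. This is an illustration of the general technique (pre-numbered, pre-shuffled allocations opened in sequence), not the study's exact procedure; the function name and the balanced-repetition scheme are our assumptions:

```python
import random

# Sketch of preparing a sealed-envelope allocation sequence (cf. Dettori, 2010).
# Assignments are fixed and shuffled in advance, then envelopes are opened in
# numbered order so allocation cannot be influenced at the point of recruitment.

def prepare_envelopes(n_participants, conditions=("Focus Group", "Youth GO"), seed=None):
    rng = random.Random(seed)
    # Roughly balanced allocation: repeat the conditions to cover n, then shuffle.
    sequence = (list(conditions) * (n_participants // len(conditions) + 1))[:n_participants]
    rng.shuffle(sequence)
    # Pair each assignment with a sequential envelope number.
    return list(enumerate(sequence, start=1))

envelopes = prepare_envelopes(29, seed=7)
print(envelopes[:3])  # first three numbered envelopes, opened in order
```

Because the sequence is generated and sealed before recruitment, the shuffle (not the recruiter) determines each participant's condition.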

Youth Generate and Organize (Youth GO). The Youth GO approach was implemented as previously described (see Stacy et al., 2018 for a more detailed description).

Focus groups. An independent expert, kept uninformed about the goals of the study, designed the focus group protocol. The expert was asked to design a protocol that would (a) successfully address the guiding questions for the overall evaluation of the community schooling approach, and (b) adhere to existing standards for designing and reporting qualitative research (i.e., O'Brien et al., 2014; Tong et al., 2007). Adult facilitators began the focus group by explaining its purpose to the youth. They then presented base questions to participants one at a time, writing each on flip chart paper, to guide the group discussion. Facilitators used probes and follow-up questions as necessary to guide the discussion and understand students' responses. Facilitators took notes on flip chart paper during the focus groups to document the key topics of discussion.

Micro-costing analysis. Our micro-costing analysis followed the five-step framework outlined by Charles et al. (2013). Briefly, we first developed cost diaries documenting the specific expenses involved in the implementation of each type of group (i.e., Youth GO and focus groups). At step two, we gathered cost data for these expenses based on actuals incurred when implementing the project and reported by the project's principal investigator (this study's third author). In step three, we gathered costs for any expenses not adequately accounted for in step two. For example, we searched current retail costs for the types of equipment used in the project. In step four, we organized the information into tables, structured a priori to compare costs between types of groups and a posteriori to detail costs during specific implementation periods (as the differences between types of groups emerged as concentrated within specific periods). In a final step, we used these tables to conduct a micro-costing analysis comparing expenses between types of groups and across implementation periods. As part of this final step, we used these first-time implementation estimates to derive costs for second and subsequent implementations.

Results

Implementation Fidelity

We examined implementation fidelity of both the Youth GO and Focus Group approaches, as summarized in Table 2. First, we compared implementation fidelity on the use of required materials. As can be seen in Table 2, facilitators of both the Youth GO and Focus Group approaches used all required materials in implementing their respective protocols. Then, we compared implementation fidelity on both active and debrief processes. Active processes are the theoretically linked participation tasks that distinguish each approach and are thus conceptualized as underlying the hypothesized differences between conditions. Debrief processes involve facilitator-led discussion and reflection on the group experience at its conclusion. While the Youth GO adult facilitators completed a slightly higher proportion of active processes (98.5%) than did the Focus Group facilitators (95%), both conditions obtained a high rate of fidelity for such processes. With regard to debrief processes, Youth GO facilitators completed a much higher proportion (91.7%) than did the Focus Group facilitators (33.3%).

Table 2. Summary of Implementation Fidelity Data
Variable Focus Groups Youth GO
Materials
% of materials utilized  100% 100%
Protocol Components 
% of active processes completed 95% 98.5%
% of debrief process completed 33.3% 91.7%

Cost Comparison

Table 3 provides a summary of the micro-costing results, largely based on data for the year 2017. The projected implementation costs comprise preparation, delivery, and processing costs and cover only personnel and materials; no indirect costs are included.

As summarized in Table 3, although both types of groups have identical preparation costs, they diverge in the costs required for delivery and processing. During delivery, focus groups incur substantially higher first-time implementation costs, largely as a result of the cost of audio recording equipment. During processing, focus groups require considerable personnel costs to transcribe and code a session's recording. Although the differences in delivery costs are likely to decrease during subsequent implementations (as the purchase of audio recording equipment is not usually required), the differences in processing costs would persist across implementations. Stated plainly, based on actuals, and not including report preparation costs, the first-time implementation of a focus group will be about 1.7 times more costly than the first-time implementation of a Youth GO group. Although these differences should attenuate somewhat in second and subsequent implementations, based on actuals, we predict that at second and subsequent implementations focus groups would be 1.4 times more costly than Youth GO groups. Finally, we also note that each Youth GO group should take about 10.5 hours less to process than a focus group.

We note that our experience suggests that the report preparation times—and therefore, the costs—will also vary between both types of groups. Our estimate is that an evaluation report based on complete results for a single group can be prepared by an experienced evaluator in four to six hours for the Youth GO approach, compared to eight to twelve for Focus Groups. That said, because we prepared both reports concurrently and did not track actuals separately for each type of group, we omit these estimates from Table 3. As a result, Table 3 likely underrepresents the cost and time differences between the two approaches.

Table 3. Projected Total Costs (Based on Implementation Actuals) for First-Time Implementation of Focus Group and Youth GO Protocols
Unit Focus Group Youth GO
Type of Cost Description Cost Cost (Units) Cost (Units)
Preparation
Personnel:
1 Evaluator’s Time, Planning and Preparation Hourly Rate $65 $130 (2) $130 (2)
1 Evaluator’s Time, Training Hourly Rate $65 $97.5 (1.5) $97.5 (1.5)
2 Staff Members’ Time, Being Trained Hourly Rate $25.61 $76.83 (3) $76.83 (3)
Staff Members’ Time, Participant Recruitment Hourly Rate $25.61 $204.88 (8) $204.88 (8)
(Preparation Subtotal) ($509.21) ($509.21)
Delivery Costs
Materials:
Nametags 1 pack $1.91 $1.91 (1) $1.91 (1)
Adhesive Flip Chart 1 pack $15.00 $15.00 (1) $15.00 (1)
Markers 1 pack $10 $10 (1) $10 (1)
Adhesive notes 1 pack $1.55 — $1.55 (1)
Audio recorder 1 recorder $201.42 $201.42 (1) —
Personnel:
Staff Member’s Time, Set-up & Facilitation Hourly Rate $25.61 $128.05 (5) $128.05 (5)
(Delivery Subtotal) ($356.38) ($156.51)
Processing Costs
Personnel
Staff Time (Transcription at 4 hours per 1 hour of recording) Hourly Rate $25.61 $153.66 (6) —
Staff Time (Coding at 3 hours per 1 hour of recording) Hourly Rate $25.61 $115.24 (4.5) —
(Processing Subtotal) ($268.90) (—)
Total: $1,134.49 $665.72
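As a check on the arithmetic, the subtotals and the cost ratios discussed in the text can be recomputed directly from Table 3's line items (hourly rates multiplied by units). The sketch below simply re-does that arithmetic; the variable names are ours:

```python
# Recomputing Table 3's subtotals and the cost ratios discussed in the text,
# directly from the listed line items (hourly rate x units).

RATE_EVALUATOR, RATE_STAFF = 65.00, 25.61

# Preparation (identical across conditions)
preparation = (RATE_EVALUATOR * 2      # planning and preparation
               + RATE_EVALUATOR * 1.5  # training delivery
               + RATE_STAFF * 3        # two staff members being trained
               + RATE_STAFF * 8)       # participant recruitment

# Delivery
materials_shared = 1.91 + 15.00 + 10.00                    # nametags, flip chart, markers
delivery_fg = materials_shared + 201.42 + RATE_STAFF * 5   # + audio recorder
delivery_go = materials_shared + 1.55 + RATE_STAFF * 5     # + adhesive notes

# Processing (focus groups only: transcription and coding of recordings)
processing_fg = RATE_STAFF * 6 + RATE_STAFF * 4.5
processing_go = 0.0

total_fg = preparation + delivery_fg + processing_fg
total_go = preparation + delivery_go + processing_go

print(round(total_fg, 2), round(total_go, 2))
print(round(total_fg / total_go, 1))             # first-time cost ratio, about 1.7
print(round((total_fg - 201.42) / total_go, 1))  # subsequent implementations, about 1.4
```

Dropping the one-time audio recorder purchase from the focus group total yields the projected ratio for second and subsequent implementations.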

Youth & Adult Perceptions

Finally, we examined youth and adult perceptions of both Youth GO and Focus Group approaches. Youth and adult perceptions were examined in two ways: perceived youth-adult partnerships (i.e., youth voice in decision making, supportive adult relationships), and satisfaction and acceptability.

Youth voice in decision making. Youth in the Youth GO condition (M = 3.81, SD = 0.58) reported similar scores on youth voice in decision-making as those in the focus group condition (M = 3.72, SD = 0.70), t(27) = -0.37, p = .717. The average scores for the adult facilitators in the Youth GO condition (M = 4.25, SD = 0.50) and focus group condition (M = 4.13, SD = 0.38) were also similar, though the facilitator sample was too small for statistical comparison.

Supportive adult relationships. Youth in the Youth GO condition (M = 3.93, SD = 0.44) reported similar scores on the supportive adult relationships as those in the focus group condition (M = 3.83, SD = 0.94), t(27) = -0.33, p = .747. Average scores for the adult facilitators in the Youth GO condition (M = 3.70, SD = 1.00) were slightly lower than those in the focus group condition (M = 4.10, SD = 1.00).

Satisfaction and acceptability. Youth in the Youth GO condition (M = 4.57, SD = 0.43) reported similar satisfaction scores as those in the focus group condition (M = 4.51, SD = 0.57), t(27) = -0.30, p = .769. Similarly, youth in the Youth GO condition (M = 4.05, SD = 0.57) reported similar acceptability scores as those in the focus group condition (M = 4.37, SD = 0.57), t(27) = 1.46, p = .156. Average scores from adult facilitators in the Youth GO condition demonstrated high satisfaction (M = 4.57, SD = 0.43) and acceptability (M = 4.05, SD = 0.57) of the approach. Adult facilitators in the focus group condition reported similarly high ratings on satisfaction (M = 4.51, SD = 0.57) and acceptability (M = 4.37, SD = 0.57).
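The independent-samples t statistics in this section can be reproduced from the summary statistics alone. The sketch below implements a pooled-variance (Student's) two-sample t test from means, standard deviations, and group sizes, which matches the reported degrees of freedom (df = 27); it is an illustration of the arithmetic, not the authors' analysis code:

```python
import math

# Pooled-variance (Student's) two-sample t statistic from summary statistics,
# illustrated with the youth-voice-in-decision-making comparison reported above.

def t_from_summary(m1, sd1, n1, m2, sd2, n2):
    df = n1 + n2 - 2
    pooled_var = ((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / df
    se = math.sqrt(pooled_var * (1 / n1 + 1 / n2))
    return (m1 - m2) / se, df

# Focus group (n = 16) vs. Youth GO (n = 13) on youth voice in decision making
t, df = t_from_summary(3.72, 0.70, 16, 3.81, 0.58, 13)
print(f"t({df}) = {t:.2f}")  # reproduces the reported t(27) = -0.37
```

The same function applied to the other reported means and SDs recovers the remaining t values to within rounding of the published summary statistics.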

Discussion

Feasible and practical methods are necessary to conduct participatory research with youth. Specifically, methods that engage youth in collecting and analyzing data and are responsive to the resources present in community settings may foster more democratic research. Youth GO is an emerging participatory approach designed to meet these aims. In Youth GO, adults and youth collaborate to collect and analyze qualitative data on a topic that is meaningful to both constituencies. The present study compared Youth GO against focus groups to examine the fidelity, feasibility, and perceptions of the implementation of both approaches. Results from this study indicate that Youth GO was implemented with higher fidelity and required fewer economic and time resources than focus groups. Although youth and adult facilitator perceptions of both approaches were comparable on all measures (i.e., youth voice in decision-making, supportive adult relationships, satisfaction, and acceptability), the implementation cost and time comparisons demonstrate the emerging promise of Youth GO.

Examining Implementation Fidelity and Feasibility of Youth GO

In contrast to research findings that suggest community-based and youth-focused practices are often difficult to implement in school settings (Fagan et al., 2008), we implemented Youth GO with slightly higher fidelity than focus groups. Implementation fidelity is important because it is one indicator of the capacity for an intervention or method to be successfully replicated (Fagan et al., 2008). In the context of participatory research, a lack of implementation fidelity can hinder the democratization of the research process and limit the mobilization of communities towards action. In this study, we also compared Youth GO and focus groups on the resources and associated costs required to implement both approaches. Our results indicate that the costs required to implement Youth GO were strikingly lower than those of focus groups. Paired with our implementation fidelity results, these findings suggest the practical utility of Youth GO in community-based settings where funding and/or resources are limited and focus groups may not be feasible. These findings are important, given researchers' acknowledgement of resource challenges as a significant barrier to utilizing participatory methods (Chen et al., 2010; Flicker, 2008; Ozer & Douglas, 2013; Zeller-Berkman et al., 2013) and/or qualitative approaches (Gravois et al., 1992; Neal et al., 2015; Tessier, 2012). Our results suggest that by addressing these significant resource barriers, Youth GO may meet an important need within the field of participatory evaluation and research.

Despite these promising findings, our modified retrospective micro-costing analysis is only a foundational step toward more comprehensive economic evaluation (Crowley et al., 2018). Without future research that determines the value of each practice's effects and compares these against the costs, it may be that focus groups, despite requiring greater resources, also render more useful effects (Crowley et al., 2018). It is important that such future research closely adhere to relevant standards for economic evaluation, such as those put forth by the Society for Prevention Research (Crowley et al., 2018).

Examining Perceptions of Youth GO

The second aim of the current study was to examine perceptions of satisfaction and acceptability of Youth GO from youth and adult facilitator perspectives. Contrary to our hypothesis, youth and adult facilitators perceived the two approaches to be equally satisfactory and acceptable for use within this context. This finding should first be examined in light of a few notable limitations that may have impacted these results. First, to enhance the scientific rigor of this study, the active comparison focus group condition was designed by an expert who was blind to the study hypotheses. Unfortunately, this design feature may have unintentionally minimized the differences between experimental conditions, as the resulting focus group design included several developmentally appropriate best practices for group facilitation (e.g., age-appropriate active facilitation and in-vivo recording of visible summary notes using large poster-sized sheets). Although these components are helpful for engaging youth participants, they also reduced the differences between comparison conditions. Second, given that the project occurred within the context of a community-based evaluation, the study sample was subject to external community conditions (e.g., student attendance, participant recruitment), which resulted in a significantly smaller sample size than we anticipated. This small sample size adversely impacted our statistical power to identify significant effects (Cohen, 1992).

Despite these limitations, our findings regarding the perceptions of Youth GO still support the overall promise of the approach. Although youth and adult facilitators did not perceive Youth GO as more satisfactory or acceptable than focus groups, the two types of groups were perceived to be relatively equivalent. Focus groups are traditionally used in community settings to gather rich, contextually grounded qualitative data (Brotherson, 1994). However, focus groups pose several resource limitations and therefore may not be feasible in all settings (Bertrand et al., 1992; Gravois et al., 1992; Neal et al., 2015). Given that our results suggest that Youth GO may be more feasible than focus groups, equivalent perceptions of satisfaction and acceptability further underscore the approach as a viable tool for community settings with limited resources. Future research that addresses the limitations of this study could further explore perceptions of Youth GO. For instance, qualitative approaches may be particularly insightful for examining experiences with the approach and its ability to enhance the democratization of the research process. Given the importance of promoting and understanding youth perspectives, a more in-depth study to explore youths' perceptions of Youth GO is needed.

Implications for Research and Practice

The impetus for this study emerges from a growing field of research that both recognizes the inherent strengths and value of engaging youth in research and evaluation (London et al., 2003) and critically examines the utilization, impact, and experience of participatory methods (Foster-Fishman et al., 2005; Ozer & Douglas, 2015; Ozer & Wright, 2012). Specifically, our examination of Youth GO further demonstrates its potential as a practical participatory method for researchers and evaluators for several reasons. First, Youth GO involves youth in collecting and analyzing data, a stage from which youth are often omitted in extant participatory processes (Jacquez et al., 2013). Second, Youth GO also promotes rapid gathering, analysis, and utilization of qualitative data, thereby contributing to the field of rapid evaluation and research approaches (McNall & Foster-Fishman, 2007). Finally, we propose Youth GO as a tool that can be embedded into many other participatory research processes, including, but not limited to, needs assessments, action research, implementation assessments, and process evaluations. Future research could continue exploring novel contexts, settings, and projects in which Youth GO can be utilized.

Study findings also have several implications for community practice. Our findings demonstrate that Youth GO is a promising approach for collaboratively engaging youth in research and evaluation. By supporting the development of methods that engage youth, we aim to further promote the inclusion of youth voices on issues that impact their lives. Importantly, Youth GO can be a feasible tool for different community-based contexts, such as educational or school-based settings (as demonstrated in this study) and out-of-school time program settings (as demonstrated in Stacy et al., 2018). Youth GO can also be used to gather data on a wide variety of topics, depending on the project in which it is embedded (e.g., programmatic feedback and improvement, project brainstorming, program evaluation). Furthermore, Youth GO may be a good approach for practitioners to collect and analyze data that are meaningful to them, without needing extensive university resources. This may be especially empowering for communities, as it allows them to gain qualitative feedback on important topics without significant resources, outside expertise, or special software. Together, these findings point to the usefulness of Youth GO as a tool for community practitioners to engage youth in research and evaluation that impacts their lives.

Conclusion

Youth GO is an emerging participatory method for collecting and analyzing qualitative data with youth. Given its feasibility, practical utility, and integration of youth in analyzing data, Youth GO responds to several limitations within extant participatory processes. Despite its limitations, results from this study demonstrate the feasibility and usefulness of this approach in community-based settings with limited resources. In this study, Youth GO was implemented with greater fidelity, fewer resources, and equal satisfaction among youth and adult participants relative to focus groups. Future research could continue examining Youth GO to further understand its utility within different community-based settings and through different methods of inquiry. This study contributes to a field of rapid research and evaluation approaches that seek to be responsive to the material constraints and needs of communities. Through the exploration of feasible participatory methods such as Youth GO, this study aims to contribute to a field of methods that engage persons who are often omitted from the research process.