Challenging Horizons Program – After School and Mentoring Versions

Insights

In considering the key takeaways from the research on the Challenging Horizons Program that other mentoring programs can apply to their work, it’s useful to reflect on the features and practices that might have influenced its rating of “no effects” (that is, a program with strong evidence that it did not achieve justice-related goals), as well as some of the more interesting aspects of the program design.

1) This program evaluation offers a great glimpse into “The Practitioner’s Dilemma”

In just about any program that tries to improve people’s lives, there is always a tension (hopefully a healthy one) between the optimal form of support and services that would have the most impact and the often disappointing reality of what is actually achievable given the available resources and the willing participation of those involved. The evaluation of the Challenging Horizons Program (CHP) in both its Afterschool (AS) and Mentoring (M) forms offers a nice example of how programs can face challenges in implementing their theory of change in the most optimal way.

The reason that there are two versions of the program in the first place is that the developers of the program noted that implementations of the AS model had issues related to student attendance and overall attrition from the program. In their words, “staying after school 2 days a week for an entire school year is not desirable for many students,” to say nothing of the extracurricular obligations and transportation issues that may arise even for students who want to participate 2 days a week. Middle school students lead busy lives, so the Mentoring version of the program was developed to test whether a less-robust version offered during the school day could also lead to good outcomes, even if the developers themselves expected a reduced impact under the new format.

And why did they expect that reduced impact? (Which turns out to be exactly what their evaluation found…)

2) The differences in “dosage,” topics of emphasis, and delivery context really made a difference in how well these two models performed

The students served by the two versions of CHP were middle schoolers with Attention Deficit Hyperactivity Disorder (ADHD) who had already exhibited some challenges in school related to their condition. These youth differed from many of the typical students served by school-based mentoring programs in that they had a formal diagnosis of a condition that can really impact academic achievement and related aspects of life, like forming close relationships with peers and school staff. Overcoming the challenges presented by ADHD, specifically the difficulties related to staying organized, completing assignments, managing time, and paying attention during instructional time, understandably takes a lot of concerted effort and time. These students needed to build a new set of skills, strategies, and mindsets related to schoolwork, and do so halfway through their time in the K-12 system. So while the challenges were substantial, the two versions of the program approached them from pretty different perspectives, both in terms of design and execution:

  • The AS version offered students full exposure to the curriculum of the program, targeting many different skills and strategies that students could learn and employ. The Mentoring version, however, usually whittled that down to one or two components (80% of the participating youth got 2 or fewer pieces of the intervention).
  • The youth in the AS version of the program got significantly more time working directly with a Personal Counselor who was trained in the intervention. In the Mentoring version, youth were paired with a teacher from the school with whom they had some kind of rapport, but with whom they spent far less time. Students in the AS version got an average of about 32 sessions over the course of the year, at 2 hours each; students in the Mentoring version got about 25. But the real difference was in the time spent: the Mentoring version meetings averaged a paltry 12 minutes in length, meaning AS students received roughly 64 hours of direct contact over the year compared to only about 5 hours in total for Mentoring youth. It’s hard to imagine many impacts coming from a mentoring relationship with an average of 12 minutes per conversation (although some interventions in the human services realm can be successful with a frequent-but-brief check-in format). In fact, mentors in the Mentoring version spent almost as much time meeting with their program consultants to discuss the intervention (262 minutes on average) as they did meeting with the recipients of the services (305 minutes on average)!
  • Additionally, one wonders whether meeting with teachers who were already familiar to the mentees in the Mentoring version may have kept youth from feeling like this program was an exciting new thing. Mentees may have taken the program more seriously if they had been matched with a new person from outside the school (as in the AS version) instead of having these short meetings with a teacher they already knew.
  • The Mentoring version also eliminated the group time that offered peer engagement and fun activities in the AS version. One can imagine that these peer group interactions could reinforce the messages and strategies of the intervention while also allowing for some fun and deeper engagement with staff.
  • And not surprisingly, the AS version also offered more chances for parents to learn about the program and reinforce its strategies at home.

None of this is to say that CHP shouldn’t have tried the Mentoring version of the program. But the reality is that the AS version was rated as having 85% fidelity to the intervention (meaning youth got 85% of what they were supposed to), as opposed to 80% for the Mentoring version. So although the AS version was seen as more onerous for youth to participate in, those students actually got more total intervention, delivered more completely, in spite of those challenges. What this means for practitioners is that they should consider trying different versions of an intervention and testing them to see what the optimal delivery might look like, as was done here. The study authors ultimately conclude that the best version of this model is one that builds in the full curriculum and dosage, but does so during the school day to alleviate the scheduling challenges for families. So in the battle of ideal vs. realistic, the best path forward is trial and error and tweaking until a program gets it right.

3) A “training” intervention might be particularly appropriate for ADHD youth

The authors of the CHP study really emphasize the value of an intervention focused on training youth with ADHD in new strategies, techniques, and skills to help offset the impact of their condition on learning. These training interventions differ from behavior management interventions, which often try to achieve the same goals by incentivizing certain behaviors rather than approaching them as a new skill to learn and develop. But, the authors note, “our results suggest that making long-term changes in adolescents’ behavior using training interventions may take considerable time and coached practice,” something that the AS version of the program barely provided and the Mentoring version didn’t really offer at all. In fact, one wonders what a multi-year version of CHP might be able to achieve, especially if it were able to offer a special “mentoring” relationship with a teacher in the building over multiple years to keep reinforcing the lessons of the curriculum and offering ongoing coaching over time. So if your program wants to improve outcomes for youth with ADHD, plan on it being something that takes considerable time and effort, which is what those youth may need and certainly deserve from the caring adults around them.


*Note: The National Mentoring Resource Center makes these “Insights for Mentoring Practitioners” available for each program or practice reviewed by the National Mentoring Resource Center Research Board. Their purpose is to give mentoring professionals additional information and understanding that can help them apply reviews to their own programs. You can read the full review of the After School Version or the full review of the Mentoring Version on the CrimeSolutions.gov website.

For more information on research-informed program practices and tools for implementation, be sure to consult the Elements of Effective Practice for Mentoring™ and the “Resources for Mentoring Programs” section of the National Mentoring Resource Center site.