It’s the beginning of another school year, which means a mix of emotions for students both new and returning – excitement for the year ahead, disappointment over summer coming to a close, and nervous anticipation about classes and work.
With every new cohort comes a mix of new realities for the institutional agents who serve and support them. In Generation on a Tightrope, authors Arthur Levine and Diane Dean examine today’s generation of college students, and Levine readily admits that his own generational theory about students from 2001-2006 rested on a flawed assumption – that 9/11 was a watershed moment for those students. If a theorist like Levine can make incorrect assumptions about students, what problematic assumptions might any one of us make about the students we serve?
Beyond Intuition & Best Practices
Most of us who serve students have developed a ‘sixth sense’ for assessing which students may be struggling and what specific support to provide. For example, student advisors often use intuition and their own previous experiences to inform how they understand student risk. This can be useful — advisors know many of their students personally and have unique insights into what’s affecting their students.
Increasingly, however, we can see that at-risk students don’t always exhibit typical signs of risk, nor do they present such risk characteristics openly. Our recent Community Insights Report showed that approximately 40 percent of the students in our sample who leave college do so with GPAs of 3.0 or higher – GPAs that typically do not trigger risk identification on campus.
Beyond Single Data Points
In other cases, institutions examine historical data or single data points to understand trends from past students. But a single data point tells only part of the story — one not personalized enough to the individual to give the best indication of that student’s likelihood to persist.
As an example, one of our partners had previously relied on a single data point to assess risk. By running their institution-specific model, we were able to show them that this practice identified only 14 percent of their students who did not persist, missing the majority of at-risk students. But when we brought in a combination of elements from many disparate sources – creating derived variables – they could see 83 percent of the students who were at risk. Now they can use that information at the beginning of the semester and act early to get students back on a successful path.
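To make the contrast concrete, here is a minimal, entirely hypothetical sketch of why a single-trigger rule catches fewer non-persisting students than a rule combining several signals. The student records, field names, and thresholds below are invented for illustration — they are not the partner’s actual model or data.

```python
# Hypothetical illustration: recall of a single-trigger risk rule versus a rule
# built from several combined ("derived") signals. All data and thresholds are
# invented for this sketch.

STUDENTS = [
    # (gpa, weekly_lms_logins, earned_to_attempted_credit_ratio, persisted)
    (1.8, 1, 0.50, False),
    (3.2, 0, 0.60, False),  # strong GPA, but disengaged
    (3.5, 2, 0.70, False),  # strong GPA, but completing few attempted credits
    (2.9, 5, 1.00, True),
    (3.8, 6, 1.00, True),
    (2.1, 4, 0.90, True),
]

def trigger_flag(gpa, logins, ratio):
    """Single data point: flag only students with a low GPA."""
    return gpa < 2.0

def derived_flag(gpa, logins, ratio):
    """Combine several signals; any one can indicate risk."""
    return gpa < 2.0 or logins < 2 or ratio < 0.8

def recall(flag, students):
    """Share of non-persisting students the rule actually flags."""
    non_persisters = [s for s in students if not s[3]]
    caught = [s for s in non_persisters if flag(s[0], s[1], s[2])]
    return len(caught) / len(non_persisters)

print(recall(trigger_flag, STUDENTS))  # the GPA trigger misses most at-risk students
print(recall(derived_flag, STUDENTS))  # combined signals catch far more
```

In this toy data, two of the three non-persisting students have strong GPAs, so the GPA trigger alone misses them — the same pattern the Community Insights numbers describe.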
Serving Students with Precision
We find that early, accurate predictions matter deeply for student success. Our partners are seeing the power of student-level predictions to target interventions for at-risk student populations early in the semester. Because the data informing such risk can change throughout the term, our most successful partners act quickly, implementing campaigns that reach the students who need support the most, at the time interventions are most needed.
Intervention campaigns don’t have to be complicated or heavy lifts to be successful. One of our partners was planning outreach to a group of 300 students who had received a midterm grade of C- or lower. Rather than sending a generic blast that helped some students but not all, they used their predictive models to understand the varied reasons students were at risk and tailored their email campaign with specific, personalized advice. The result was a 9 percentage-point increase in persistence among the students rated as most at risk of not persisting.
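A tailored campaign like this can be sketched very simply: route each student to advice matched to their predicted risk driver, with a generic fallback. The reason labels and message text below are invented for illustration, not the partner’s actual categories or copy.

```python
# Hypothetical sketch of reason-tailored outreach: pick advice based on each
# student's primary predicted risk driver instead of one generic message.
# Reason labels and message wording are invented for this example.

MESSAGES = {
    "low_engagement": "We noticed you haven't logged into the course site lately -- "
                      "your instructor and the tutoring center can help you catch up.",
    "course_difficulty": "A C- at midterm is recoverable -- free tutoring and "
                         "supplemental instruction sessions are open this week.",
    "financial_hold": "A hold on your account can block registration -- "
                      "the financial aid office can walk you through your options.",
}

GENERIC = "Midterm grades are out -- stop by the advising center if you'd like support."

def tailor_message(primary_risk_reason):
    """Use a specific message when the risk driver is known, else fall back."""
    return MESSAGES.get(primary_risk_reason, GENERIC)

print(tailor_message("low_engagement"))
print(tailor_message("reason_not_modeled"))  # falls back to the generic message
```

The design choice is the same one the campaign made: the model supplies the “why,” and the outreach text changes accordingly, so no student gets advice aimed at someone else’s problem.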
Like Arthur Levine and Diane Dean, we can all make flawed assumptions about student needs. Our work with student data has taught us the importance of early, accurate predictions. We know that in-the-moment, focused interventions can make all the difference between a student completing college well and on time, and dropping out without returning.
Nothing replaces the human intelligence and expertise that our partners deliver to their students. Our solutions add to it, helping our partners easily identify those most at risk and providing a picture of why the risk exists. The rest is up to the hard work of our partners and their students, but a real student success strategy works when early, accurate predictions are acted upon!
RELATED: COMMUNITY INSIGHTS ISSUE 1
We collaborated with more than 50 institutions to examine historical records and found that trigger-based approaches, such as cumulative GPA thresholds, catch — on average — only 21% of non-persisting students. Meanwhile, our predictive models detect — on average — 82% of non-persisting students on the first day of the term.
Dr. Eric McIntosh is the research director at Civitas Learning. With a professional background in student affairs, Eric has worked at institutions of varying types and sizes, exploring the intersection of students’ academic journeys and personal lives. An active researcher, his interests include access issues in higher education, spirituality in higher education, transfer student integration, student thriving, and student success. Since 2007, Eric has been involved in research on student thriving, a research initiative of the Doctoral Programs in Higher Education at Azusa Pacific University.