Retaining students is a key goal for institutions of higher education (Barbera et al., 2020). Retention can be defined in terms of whether students complete their degree or whether they complete a specific course (Hagedorn, 2005). In the current learning engineering project, we investigated a potential approach for increasing students’ course persistence at Arizona State University (ASU): Academic Status Reports (ASRs) (https://students.asu.edu/academic-status-report).
The ASR system is an early alert reporting tool, implemented across the university for both in-person and online academic units, that provides students with feedback about their progress throughout the semester. This feedback is initiated by course instructors, who have the option to submit either a “positive” or “negative” ASR. Positive ASRs can serve as “kudos” to students for doing well in class, whereas negative ASRs can serve as a warning sign to students that they are not on track to succeed in the course. ASRs are sent to students via their student portal and are denoted visually with a “check engine” symbol, which students can click to access the ASR message. ASRs can also be concurrently shared with student support staff, such as academic advisors or university success coaches, who can then follow up with the student.
Effective functioning of the ASR system requires the continuous involvement of several stakeholders. The Office of the Vice Provost oversees the maintenance of the ASR system and university websites describing the system and its goals. This oversight includes sending regular communications to instructors encouraging them to submit ASRs (e.g., during the early weeks of the academic term). Instructors initiate ASRs and subsequent interactions by choosing when, why, and for whom to submit an ASR. Likewise, student support staff often receive ASR notifications and make decisions about whether to reach out to a student. Finally, and perhaps most importantly, students decide how to respond to an ASR (if at all).
Given the number of stakeholders involved in the execution of the ASR system and the scale of its implementation, we used a learning engineering approach to investigate the impact of ASRs on students at ASU. Learning engineering is a process that begins with “the challenge”, or the opportunity to improve learner outcomes (Kessler et al., 2022). Understanding the challenge is critical to successfully executing the learning engineering process. Once the learning engineering team has an initial understanding of the challenge, the team proceeds to creation (e.g., data instrumentation, designing solutions), implementation (e.g., data collection, enactment of designs, application of solutions), and investigation of impact (e.g., data analysis). The learning engineering process is iterative (Goodell et al., 2023). For example, a learning engineering team may go back and forth between creation and implementation while developing a design. Likewise, the team may revisit their understanding of the challenge as they gain new knowledge from other aspects of the process.
The ASU Learning Engineering Institute (LEI) seeks to transform learning environments and experiences to support student well-being and success. LEI does this by weaving four threads through each of our projects: Data, Assessment, Technology, and Inclusion. One way that data and technology were utilized in this project was through the ASU Learning@Scale (L@S) platform (https://learningatscale.asu.edu/). L@S is a large-scale data platform that combines disparate institutional data with the goal of empowering researchers to conduct work that will enhance student achievement and improve learning outcomes. Because of our access to ASU institutional data through L@S, LEI was well-suited to investigate the possible effects of ASRs on student success and retention.
Members of LEI brought many funds of knowledge together for this project. Our team included three research scientists (Drs. Megan Imundo, Maria Goldshtein, and Micah Watanabe), who bring expertise in research supporting student success and learning at higher education institutions. Expertise in analysis of L@S data was provided by data specialist Lilian Gong, and administrative support and project management were provided by Nicole Crosby. LEI Directors Drs. Tracy Arner, Rod Roscoe, and Danielle McNamara also brought many years of academic research experience and ensured that this project was motivated by ASU and LEI’s standards for inclusive excellence.
Many undergraduate students will encounter difficulties during their academic journeys. ASRs can be used to signal that a student may be experiencing obstacles to their academic achievement and might benefit from additional support. The long-term goal of this project is to assess the impact of receiving an ASR on a student’s academic outcomes. Thus, our overarching challenge was to understand the impact of receiving an ASR on a student’s course grades and likelihood of persisting to degree completion.
We initially planned to conduct analyses with institutional datasets to understand this challenge. However, at the beginning of this project we realized that we first needed to understand how ASRs are used and viewed by instructors, staff, and students so that we could make appropriate analysis decisions.
To obtain a more holistic understanding of the ASR context, we completed multiple cycles of the learning engineering process within “understanding the challenge” (a variation of “nested” learning engineering; Kessler & Totino, 2023). Each cycle centered on its own smaller “challenge” (e.g., Do students receive ASRs from their instructors?). Information gained from each cycle informed the next cycle’s challenge. Each cycle included an instrumentation, implementation, and investigation phase within the learning engineering process. We describe the project’s three cycles below, and Figure 1 offers a high-level visualization of these cycles and their central challenges. The ultimate goal of these nested learning engineering cycles was to prepare us to revisit our initial, overarching challenge of understanding the impact of receiving an ASR on a student’s academic outcomes.
We used L@S to explore whether the ASR system was indeed used by instructors, as ASRs are unlikely to impact students’ course grades or degree completion if they are never submitted. We worked with the L@S team to construct a data request from the larger L@S platform (Instrumentation) and then obtain those data (Implementation). Between 2018 and 2022, 10.8% of students (~10,000 out of ~99,000 total students enrolled) in the introductory courses Biology 100, Biology 181, Communication 100, English 101, and Psychology 101 received ASRs (Investigation). This analysis provided clear evidence that the ASR system was used across multiple courses at ASU.
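To illustrate the kind of computation behind this Cycle 1 finding, the sketch below tallies the share of enrolled students who received an ASR from a course-enrollment extract. It is a minimal sketch in Python, not the actual L@S pipeline; the dataframe `enrollments` and its columns (`student_id`, `course`, `received_asr`) are hypothetical stand-ins for the requested data.

```python
import pandas as pd

# Hypothetical extract: one row per student-course enrollment, with a flag
# for whether the student received at least one ASR in that course.
enrollments = pd.DataFrame({
    "student_id": [1, 2, 3, 4, 5, 6],
    "course": ["BIO 100", "BIO 181", "COM 100", "ENG 101", "PSY 101", "BIO 100"],
    "received_asr": [True, False, False, True, False, False],
})

# Overall share of enrollments with an ASR (cf. the ~10.8% reported above).
overall_rate = enrollments["received_asr"].mean()
print(f"Overall ASR rate: {overall_rate:.1%}")

# Per-course rates show whether the system is used across multiple courses.
print(enrollments.groupby("course")["received_asr"].mean())
```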
Cycle 1 informed the team that students do receive ASRs from their instructors, but it did not provide information about instructors’ intentions in submitting an ASR or about the follow-up that students receive from instructors or staff. How instructors and staff use ASRs may play an important role in whether an ASR has a positive, negative, or null effect on a student’s course grade or likelihood of persisting.
We began this cycle by examining four sources of existing data: (a) official ASU communications, descriptions, and guidelines about ASR use, (b) an internal ASU report about ASR use and student persistence, (c) social media posts in which students asked about the meaning and consequences of receiving an ASR, and (d) peer-reviewed literature on early alert systems (e.g., their design, implementation, and effectiveness). Although these existing data helped us understand how the ASR system is viewed and used, we needed to instrument additional data collection to solidify our understanding of the ASR context from the perspective of ASU instructors and staff. We therefore used these data sources to develop questions for informal conversations with instructors and staff (Instrumentation). We discussed ASR usage with three instructors and 10 members of academic advising staff from six different academic units (Implementation). These conversations focused on use of the ASR system, changes in use of the system during interviewees’ tenure at ASU, perceptions of how effective the ASR system is, and features interviewees would change or maintain within the system.
We then qualitatively analyzed these conversations via team discussion (Investigation). Instructors and advising staff reported that they appreciated that ASRs provided a unique channel of individualized communication between them and students. They also indicated that ASRs are used differently across academic units and instructors. Although ASRs can be used to report when a student is doing well or when they are doing poorly in a course, interviewees agreed that ASRs are typically used when students are underperforming. They also noted that ASRs were one of several methods they used to reach out to students, alongside email and invitations to office hours. In addition, ASR use varied by instructor, with some sending weekly ASRs, some sending ASRs to all students (rather than only to underperforming students), and others sending ASRs rarely or not at all. This variation presented challenges for staff in interpreting what an ASR meant for a student (who may, for example, have received multiple ASRs in one course but none in their other courses). This difficulty was compounded by varied student reactions to ASRs, with some students coming into appointments distressed over their ASR whereas others appeared indifferent, unsure of what an ASR was, or even unaware that they had received an ASR.
Interviewees also reported technological and practical challenges. Instructors submit ASRs but do not receive feedback regarding actions taken by advising staff. Advisors, meanwhile, cannot differentiate between positive and negative ASRs without multiple clicks in their system and report that instructors may not provide enough detail to convey a good sense of a student’s situation or needs. Both instructors and advisors consistently indicated that iterating on the system’s implementation, or on guidance for instructors and students, could increase instructor use of the system and engagement with students by support staff.
Conversations with advising staff revealed that they had observed a wide range of student knowledge about ASRs and varied student responses to receiving an ASR. Students’ responses matter because the impact of an ASR likely depends on how a student reacts to receiving it. Because these observations came from academic advisors, we decided to examine this variability more closely by drawing on students’ own perspectives. Thus, in Cycle 3 we used our existing knowledge of the ASR system and its use to develop a 34-item survey that queried students’ knowledge of and personal experiences with ASRs using a mixture of open-text and closed-response (e.g., Likert scale) items (Instrumentation). We then recruited 372 undergraduate students through the psychology subject pool to complete the survey (Implementation). After data collection, we used qualitative and quantitative analyses to draw insights from student responses (Investigation).
The student survey revealed that students had highly varied reactions to receiving an ASR. Students most commonly (43.6%) expressed that they experienced negative emotions after receiving an ASR (e.g., feeling scared or upset), and 20.9% reported feeling academic pressure. Other students reported feeling positive emotions (e.g., happiness; 11.8%) or not feeling anything at all (22.7%). This result suggests that, just as instructor submission of ASRs varies, so too do student responses to receiving an ASR.
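As an illustration of the quantitative side of this Investigation phase, the sketch below tabulates coded open-text responses into reaction categories. It is a minimal sketch that assumes the qualitative coding has already been done; the `responses` series and its labels are hypothetical stand-ins for the coded survey data.

```python
import pandas as pd

# Hypothetical coded reactions; in practice, each open-text survey response
# would be assigned a category during qualitative coding.
responses = pd.Series([
    "negative emotion", "academic pressure", "no reaction",
    "negative emotion", "positive emotion", "negative emotion",
])

# Percentage of responses per category (cf. the 43.6% negative emotion,
# 20.9% academic pressure, 22.7% no reaction, and 11.8% positive emotion
# reported above).
print(responses.value_counts(normalize=True).mul(100).round(1))
```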
In this project, we sought to learn how instructors, support staff, and students use and respond to ASRs. Overall, our investigation suggested that relevant university groups do engage with the ASR system, but that profound differences in how they engage may affect whether the system effectively bolsters student persistence.
Through three cycles of the learning engineering process, we learned that (a) ASRs provide a unique pathway for individualized communication between instructors, students, and staff, (b) inconsistent use by instructors is a barrier to effective outreach to students by support staff, and (c) ideas for system improvement were consistent across different stakeholders. These findings can inform the design and implementation of early alert systems in higher education more broadly. The variability in system use by instructors affirms the importance of not only guiding system stakeholders in how to use the system but also evaluating whether and how stakeholders’ actual use differs from its intended use. Doing so can inform the development of trainings or other informational materials for users to address this variability.
The prevalence of negative emotions experienced by students after receiving an early alert also speaks to a tension for early alert systems: reaching out to underperforming students without exacerbating the distress they may already be experiencing. Ultimately, the goal of an early alert system is to spur action by the student to improve their academic performance. Theories of motivation may inform the design and implementation of early alert systems to increase the likelihood that students will act in response to receiving an ASR. For example, self-determination theory (Deci & Ryan, 2000) suggests that individuals are more likely to act when they perceive that they have autonomy (i.e., ownership over their actions), competence (i.e., the ability to respond effectively), and relatedness (i.e., connection to others) in their environment. Prior work suggests that students reporting greater autonomy have more positive reactions to early alerts (Velasco, 2020). Thus, early alert systems might implement features informed by self-determination theory, such as personalized feedback in early alert notifications that does not compare students to their peers (Katz & Assor, 2007) or a brief set of choices (i.e., fewer than five) for how to act in response to the alert (e.g., scheduling an appointment with an advisor) (Zong & Patall, in press).
Now that we have developed an improved understanding of the ASR system context, we are preparing to address our overarching challenge (i.e., What is the impact of receiving an ASR on a student’s course grades and likelihood of persisting to degree completion?). We are currently collaborating with the Office of the University Provost to construct a dataset request (Instrumentation), obtain these data from Learning@Scale (Implementation), and conduct analyses on archival institutional data (Investigation) to provide an initial answer to this question. From there, we intend to move through the learning engineering process again by creating new educational materials about ASRs (e.g., resources for students) and/or new ASR system processes, implementing these changes in the university context, and investigating how these changes alter the impact of receiving an ASR on student success.
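As a sketch of what this planned Investigation phase could involve, the example below fits a logistic regression of degree persistence on ASR receipt with a prior-achievement covariate. This is illustrative only, run on simulated data rather than ASU records; the variable names (`persisted`, `received_asr`, `gpa_prior`) are hypothetical, and any real analysis would need to address confounding, since students who receive negative ASRs are already more likely to be struggling.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500

# Simulate student records: lower prior GPA makes an ASR more likely,
# and higher prior GPA makes persistence more likely.
gpa_prior = rng.normal(3.0, 0.5, n).clip(0.0, 4.0)
received_asr = rng.binomial(1, 1 / (1 + np.exp(2 * (gpa_prior - 2.5))))
persisted = rng.binomial(1, 1 / (1 + np.exp(-(gpa_prior - 2.0))))
students = pd.DataFrame({"persisted": persisted,
                         "received_asr": received_asr,
                         "gpa_prior": gpa_prior})

# Logistic regression of persistence on ASR receipt, adjusting for prior GPA.
# Estimates from observational data like this are associational, not causal.
model = smf.logit("persisted ~ received_asr + gpa_prior", data=students).fit()
print(model.summary())
```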
Figure 1
A Visualization of Completed and Future Learning Engineering Cycles Investigating Academic Status Report Use and Impact
The research reported here was partially supported by the ASU Learning Engineering Institute and by the Institute of Education Sciences, U.S. Department of Education, through Grant R305N210041 to Arizona State University. The opinions expressed are those of the authors and do not represent views of ASU, the institute, or the U.S. Department of Education.