EdTech Archives: The Journal of Applied Instructional Design, 14(2)

Assessing Student Perceptions and Performance in Digital Learning Spaces through an Equity Lens

Naomi T Lin, Huilin Li, Cindy Mui Perez, Yu Yan, & Karen Flammer

Abstract

As digital learning spaces become increasingly prominent across institutions, equity and access to these spaces must be examined across the student population to promote engagement in online learning. While underrepresented minority (URM) and transfer students still demonstrate equity gaps compared to their counterparts, we found no significant difference in course grade between first-generation students and their peers. Course survey ratings differed significantly across URM, transfer, and first-generation student groups, as well as across gender groups. With intentionally designed online courses, certain gaps may be lessened. Higher education institutions must invest in strategies that not only improve digital infrastructure and resources but also prioritize inclusive pedagogies and support systems.

Introduction

As digital learning spaces become increasingly prominent across institutions, equity and access to these spaces must be examined across the student population to promote engagement in online learning. Online options may open opportunities for populations who would not otherwise pursue education (Goodman et al., 2019). However, online learning also has physical constraints: access to physical resources (e.g., high-speed internet and more than one device per child) has been shown to determine engagement in virtual learning, such as completing assignments (Tate & Warschauer, 2022). Nevertheless, studies have revealed that students with access to online courses are more likely to graduate or transfer to a four-year institution (Johnson & Mejia, 2014; Fischer et al., 2019).

Beyond physical resources and logistical access to digital spaces, studies have found that online performance gaps were significantly larger for students from disadvantaged backgrounds in community college settings (Jaggars & Xu, 2010; Xu & Jaggars, 2011, 2013, 2014); specifically, male students, Black students, Hispanic students, and low-performing students had larger online performance gaps than their counterparts (Figlio et al., 2010; Xu & Jaggars, 2014). However, there are some contradictions to these equity and access gaps. For example, Fischer and colleagues (2020) found that at-risk college student populations (low-income, first-generation, and low-performing students) did not suffer additional course performance penalties from online course participation.

Given the importance of understanding student perceptions and performance across sociodemographic groups in digital learning spaces, our research questions are as follows:

  1. Does student performance (as indicated by course grade) vary across sociodemographic backgrounds in online courses?

  2. Do students from various sociodemographic backgrounds have differing experiences in online courses, as reflected in survey ratings?

  3. How do different learning experiences among students influence their performance in online classes?

Methods

We developed a survey instrument based on the National Survey of Student Engagement (NSSE) and the Quality Matters rubric to examine student experience in online courses at a Southern California public university. These online courses were intentionally designed with the support of instructional designers. The survey focused on student perceptions of course design, learning engagement, and learning effectiveness. The survey showed strong internal consistency (Cronbach's α range: 0.79–0.91), and Exploratory Factor Analysis (EFA) confirmed its construct validity. An initial pilot from 2020 to 2023 collected 812 valid responses from 15 courses. Survey data were matched with grade and demographic information to identify potential equity gaps and inform future course improvements. To explore the differing experiences of students from diverse sociodemographic backgrounds, the Kruskal-Wallis test and Mann-Whitney U test were employed due to the non-normal distribution of the data and a minor violation of the equal variance assumption (Levene's test of homogeneity of variance, F = 1.61, p = .04).
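The reliability and assumption checks described above can be illustrated with standard statistical libraries. The following Python sketch is a minimal illustration only, using synthetic data and hypothetical variable names (item_1 ... item_8, transfer); it is not the authors' analysis code.

```python
import numpy as np
import pandas as pd
from scipy import stats

# Synthetic stand-in for the survey data: Likert-style items (1-5) plus a transfer flag.
rng = np.random.default_rng(0)
n = 300
items = pd.DataFrame(
    rng.integers(1, 6, size=(n, 8)),
    columns=[f"item_{i}" for i in range(1, 9)],
)
transfer = rng.choice([True, False], size=n)

def cronbach_alpha(item_scores: pd.DataFrame) -> float:
    """Internal consistency (Cronbach's alpha) of a set of survey items."""
    k = item_scores.shape[1]
    item_variances = item_scores.var(axis=0, ddof=1).sum()
    total_variance = item_scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

print(f"Cronbach's alpha: {cronbach_alpha(items):.2f}")

# Levene's test for homogeneity of variance of an overall rating across two groups.
overall = items.mean(axis=1)
f_stat, p_val = stats.levene(overall[transfer], overall[~transfer])
print(f"Levene's test: F = {f_stat:.2f}, p = {p_val:.3f}")
```

With real survey responses, the item columns and group flags would simply be read from the matched data set in place of the synthetic values.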

Results

First, there were significant differences in student performance (as indicated by course grade; see Table 1) for underrepresented minority (URM) students and transfer students in online courses. URM students had significantly lower grades (M = 3.71, SD = .67) compared to non-URM students (M = 3.85, SD = .47; t(745) = -2.77, p = .03). Transfer students had significantly lower grades (M = 3.73, SD = .65) compared to non-transfer students (M = 3.86, SD = .45; t(745) = -2.99, p = .01). However, there were no significant differences between first-generation and non-first-generation students (t(745) = -1.88, p = .08) and no differences among gender groups (F(9, 737) = .72, p = .69).

Table 1

Undergraduate Students’ (n=747) GPA Results by Demographic Groups


Group                     N     Mean   SD
Transfer                  193   3.73   0.65
Non-Transfer              553   3.86   0.45
Underrepresented (URM)    132   3.72   0.67
Non-URM                   572   3.85   0.47
First Gen                 241   3.78   0.59
Non-First Gen             425   3.86   0.47
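A minimal sketch of the grade comparisons reported above (independent-samples t-tests and a one-way ANOVA), again using synthetic data and illustrative variable names (grade, urm, gender) rather than the study's actual data set:

```python
import numpy as np
import pandas as pd
from scipy import stats

# Synthetic stand-in for the matched grade/demographic data (n = 747 in the study).
rng = np.random.default_rng(1)
n = 747
df = pd.DataFrame({
    "grade": np.clip(rng.normal(3.8, 0.5, n), 0.0, 4.0),
    "urm": rng.choice([True, False], n, p=[0.18, 0.82]),
    "gender": rng.choice(["female", "male", "nonbinary/unknown"], n),
})

# Independent-samples t-test: URM vs. non-URM course grades.
t_stat, p_val = stats.ttest_ind(df.loc[df["urm"], "grade"], df.loc[~df["urm"], "grade"])
print(f"URM vs. non-URM: t = {t_stat:.2f}, p = {p_val:.3f}")

# One-way ANOVA comparing course grades across gender groups.
f_stat, p_val = stats.f_oneway(*[g["grade"].to_numpy() for _, g in df.groupby("gender")])
print(f"Gender groups: F = {f_stat:.2f}, p = {p_val:.3f}")
```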

Second, students across demographic groups had differing experiences, based on the survey ratings. Transfer students consistently gave higher ratings across all three survey sections (course design, learning engagement, and learning effectiveness) compared to non-transfer students (W range: 64382-66242, p < 0.01; see Figure 1). Underrepresented students displayed notably lower ratings in the learning effectiveness section (W = 63928, p = 0.03) compared to their counterparts. First-generation students gave higher ratings in the course design section (W = 67569, p = 0.01; see Figure 2) in contrast to their peers. Additionally, male students tended to give lower ratings in the learning engagement section (Kruskal-Wallis χ² = 7.5542, p = 0.02) compared to both female (p = 0.02) and non-binary/unknown students (p = 0.02).

Figure 1

Boxplots of the Three Survey Sections Disaggregated by Transfer Student Status


Figure 2

Boxplots of the Three Survey Sections Disaggregated by First-Generation Student Status

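A minimal sketch of the nonparametric rating comparisons reported above (Mann-Whitney U for two-group contrasts and Kruskal-Wallis across gender groups), again with synthetic data and illustrative column names (course_design, learning_engagement):

```python
import numpy as np
import pandas as pd
from scipy import stats

# Synthetic stand-in for section-level survey ratings (1-5) and demographics.
rng = np.random.default_rng(2)
n = 747
df = pd.DataFrame({
    "course_design": rng.integers(1, 6, n),
    "learning_engagement": rng.integers(1, 6, n),
    "transfer": rng.choice([True, False], n),
    "gender": rng.choice(["female", "male", "nonbinary/unknown"], n),
})

# Mann-Whitney U test: transfer vs. non-transfer ratings on one survey section.
w_stat, p_val = stats.mannwhitneyu(
    df.loc[df["transfer"], "course_design"],
    df.loc[~df["transfer"], "course_design"],
    alternative="two-sided",
)
print(f"Transfer vs. non-transfer (course design): W = {w_stat:.0f}, p = {p_val:.3f}")

# Kruskal-Wallis test across gender groups on the learning engagement section.
samples = [g["learning_engagement"].to_numpy() for _, g in df.groupby("gender")]
h_stat, p_val = stats.kruskal(*samples)
print(f"Gender groups (learning engagement): chi-squared = {h_stat:.2f}, p = {p_val:.3f}")
```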

Lastly, linear regression revealed no significant associations between students' grades and their survey ratings.
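For this final check, a simple linear regression of course grade on an overall survey rating can be sketched as follows; the synthetic data and variable names are illustrative only, not the study's actual model.

```python
import numpy as np
from scipy import stats

# Synthetic, high-skew grades (most students at B+ or above) and overall survey ratings.
rng = np.random.default_rng(3)
grades = np.clip(rng.normal(3.8, 0.5, 747), 0.0, 4.0)
ratings = rng.integers(1, 6, 747).astype(float)

# Simple linear regression of course grade on overall survey rating.
result = stats.linregress(ratings, grades)
print(f"slope = {result.slope:.3f}, p = {result.pvalue:.3f}, R^2 = {result.rvalue ** 2:.3f}")
```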

Discussion

Our analysis revealed that URM students and transfer students had lower grades than their counterparts in online courses. This aligns with concerns raised in prior research about the performance gaps for students from disadvantaged backgrounds. Interestingly, no significant differences were observed between first-generation and non-first-generation students, nor among gender groups, in terms of course grades. This could suggest that certain demographic factors might not directly influence online learning outcomes as previously speculated.

The variance in survey ratings paints a more detailed picture of student experiences. Transfer students, for instance, consistently provided higher ratings across all survey sections, indicating a possibly more positive perception of, or greater adaptability to, online learning environments. The higher ratings from first-generation students in the course design section could reflect a recognition or appreciation of course structures that are potentially more accessible or engaging for students navigating higher education without guidance from college-educated family members. In contrast, underrepresented students reported lower ratings in the learning effectiveness section. Moreover, the lower ratings provided by male students in the learning engagement section suggest gender-specific differences in how engagement is perceived or experienced in online settings. Lastly, there were no significant associations between grades and survey ratings, which may be attributed to the skewed distribution of high grades among the student population, with 88% achieving a grade of B+ or higher.

These findings contribute to the ongoing dialogue on the critical importance of addressing equity and access in digital learning spaces. While online learning offers unique opportunities for extending educational reach, it also necessitates thoughtful consideration of the diverse needs and backgrounds of the student population. For example, in these courses, which were intentionally designed with instructional designers, course outcomes did not differ between first-generation students and their peers. More time and intention behind course design, along with support from instructional designers, may help mitigate performance gaps. The disparities in both performance and perceptions of the online learning experience underline the need for tailored support that ensures all students can fully engage and succeed in these environments. Moving forward, higher education institutions must invest in strategies that not only improve digital infrastructure and resources but also prioritize inclusive pedagogies and support systems. Such efforts should aim to mitigate the identified disparities and foster a more equitable and accessible online learning landscape.

References

  1. Bettinger, E. P., Fox, L., Loeb, S., & Taylor, E. S. (2017). Virtual classrooms: How online college courses affect student success. American Economic Review, 107(9), 2855–2875. https://doi.org/10.1257/aer.20151193
  2. Bradshaw, A. C. (2017). Critical pedagogy and educational technology. In Culture, learning and technology: Research and practice (pp. 8–27). Routledge.
  3. Fischer, C., Xu, D., Rodriguez, F., Denaro, K., & Warschauer, M. (2020). Effects of course modality in summer session: Enrollment patterns and student performance in face-to-face and online classes. Internet and Higher Education, 45. https://doi.org/10.1016/j.iheduc.2019.100710
  4. Goodman, J., Melkers, J., & Pallais, A. (2019). Can online delivery increase access to education? Journal of Labor Economics, 37(1), 1–34. https://doi.org/10.1086/698895
  5. Johnson, H. P., & Mejia, M. C. (2014). Online learning and student outcomes in California's community colleges. San Francisco, CA: Public Policy Institute of California.
  6. National Survey of Student Engagement (NSSE) (2020). Engagement insights: Survey findings on the quality of undergraduate education. Center for Postsecondary Research, Indiana University School of Education.
  7. Quality Matters (2020). Specific review standards from the QM higher education rubric: Course design rubric standards. https://www.qualitymatters.org/sites/default/files/PDFs/StandardsfromtheQMHigherEducationRubric.pdf
  8. Tate, T., & Warschauer, M. (2022). Equity in online learning. Educational Psychologist, 57(3), 192–206. https://doi.org/10.1080/00461520.2022.2062597
  9. Xu, D., & Jaggars, S. S. (2014). Performance gaps between online and face-to-face courses: Differences across types of students and academic subject areas. The Journal of Higher Education, 85(5), 633–659. https://doi.org/10.1080/00221546.2014.11777343
  10. Xu, D. (2019). Academic performance in community colleges: The influences of part-time and full-time instructors. American Educational Research Journal, 56(2), 368–406. https://doi.org/10.3102/0002831218796131

Acknowledgments

This study was funded by the University of California Office of the President (UCOP) Online Award. We sincerely thank UCOP for their generous support, which made this study possible.