EdTech Archives: The Journal of Applied Instructional Design, 15(1)

Designing for Scale: An Iterative Case of Asynchronous Training for TAs

Erin Measom, Layne West, & Heather Leary

Introduction

Teaching assistants (TAs) play an increasingly important role in supporting the success and retention of online learners. Research suggests that TAs can improve motivation (Emenike et al., 2020), increase participation and satisfaction (Anderson & Rourke, 2002), and, when properly trained, play a vital role in fostering student success (Bensimon, 2007; Kimball, 2016). Reimagining the traditional TA role requires intentional training that equips TAs to contribute meaningfully to the learning experience. Building the capacity of teaching assistants not only enables them to better support learners but also helps them grow as learners themselves (Shannon et al., 1998). This design case describes the process and decisions involved in developing the Teaching Assistant Module (TAM) for TAs at a large university.

Context

This research is part of a Research-Practice Partnership (RPP) (Penuel & Gallagher, 2017) with administrators, design professionals, and educational researchers at a large university in the western United States. The partnership was formed to enhance online learner success and to close the gap between research and practice (Ferguson, 2005).

Since its launch in 2020, the RPP has collaborated on projects that combine conceptual, practical, and evaluative work across four areas: (a) online student readiness and success, (b) online teaching assistant support and success, (c) online faculty readiness and support, and (d) online instructional designer knowledge and practice. The TAM emerged directly from this collaborative environment, shaped by the RPP’s shared commitment to addressing authentic problems through ongoing, cross-role collaboration.

The TAM was developed in response to rapid program growth and limited staffing. TA numbers increased from about 200 each semester before the COVID-19 pandemic to more than 400, resulting in a staff-to-TA ratio exceeding 1:200. Existing training focused primarily on policies and learning management system (LMS) navigation, with limited attention to student success strategies or pedagogical principles. As new administrators sought to expand, and reimagine, the role of the online TA, a more strategic and scalable training approach became necessary to support that vision.

Early iterations of the TAM were guided by the Concerns-Based Adoption Model (CBAM) (Anderson, 1997), which helped us address not just materials and resources but also the attitudes, beliefs, and practical questions new TAs often face. As the training matured, we integrated the Community of Inquiry framework (Garrison et al., 1999) to help TAs understand how their role could support teaching, social, and cognitive presence in asynchronous online courses.

Challenges

When designing the TAM, we faced not only numerous challenges, but also the way these challenges overlapped and compounded one another. Limited staff, rapid program growth, and training TAs with diverse experience levels meant that solving one problem often created tension in another. This overlap reflected the real-world complexity of large-scale instructional design, where practical constraints, competing priorities, and diverse needs often intersect.

The first and most pressing challenge was the sheer number of TAs, more than 400 each semester, and the requirement to train all of them asynchronously. This format was necessary due to limited staffing and the wide variation in TA schedules, but it eliminated opportunities for live modeling, peer interaction, and immediate feedback. Designing for scale required us to rethink pacing, engagement, and resource access so TAs had essential information up front and support later.

Designing for both onboarding and sustained support introduced a second layer of complexity. New TAs needed concise, early-semester training on their role, tools, and expectations. As the semester progressed, they would encounter more complex situations—such as supporting struggling students, interpreting course analytics, or responding to accommodation requests. We had to give them enough early guidance to feel prepared without overloading them with information they might not need yet. Just as importantly, we had to make sure they knew where to find those resources later. Striking that balance between “what you need now” and “what you’ll need later” became a central organizational and pedagogical challenge.

The diversity of TA backgrounds further complicated the design. Some were entirely new to the role and required foundational preparation, while others returned with strong skills but sought deeper development. We addressed this by creating differentiated completion paths within a single course, maintaining consistency across the program while tailoring the experience to individual needs.

Finally, the realities of TA workloads and budget constraints meant the training had to be concise and impactful. Most TAs are full-time students, so we prioritized content that could be completed quickly yet meaningfully, resisting the temptation to add “nice-to-have” material that might dilute focus.

Another challenge that influenced the work was the composition of the design team, which initially consisted of graduate and undergraduate students with limited experience in instructional design. This required program administrators to take on mentoring and teaching roles throughout the process, balancing project deadlines with designer development and feedback. While not a direct design challenge, this dynamic shaped the pace, process, and iterative nature of the TAM's evolution. These challenges serve as a reminder that practice-based design often unfolds within imperfect but authentic working conditions.

The Product

The TAM was designed with three primary goals: (1) to prepare hundreds of TAs each semester without overwhelming their capacity as students, (2) to equip them with strategies for proactive and inclusive student support, and (3) to foster their development as peer educators through opportunities for personal and professional growth.

The TAM is housed in the Canvas LMS. Early versions consisted of three required units, with a fourth optional unit focused on institutional knowledge transfer:

Four Units

  1. Unit 1: Ready, Set, Go! (Hiring, Onboarding, and Canvas Training)

  2. Unit 2: Out the Gate (Developing Characteristics of a Good TA)

  3. Unit 3: Finish Strong ("Choose Your Own Adventure" Professional Development)

  4. Unit 4: Enjoy the Victory (Course Improvements)

Progress Over Perfection

The TAM was developed using a design-based research (DBR) approach, positioning the training as a living design meant to evolve over time based on feedback, institutional needs, and changes in the TA role. Rather than aiming for a perfect, all-inclusive solution from the start, we focused on steady improvements that would work in our real context. This “progress over perfection” approach fit the realities of practice-based scholarship, where responding to immediate needs and learning from each round of feedback mattered more than getting everything exactly right the first time.

We often used the metaphor of moving from a scooter, to a bike, to a reliable car—eventually aiming for the luxury car. This helped set expectations, especially for student designers, that our initial goal wasn’t to build the most polished, feature-rich training possible, but to improve on what existed and make meaningful changes each cycle.

Unit 3, Choose Your Own Adventure, is one example of this evolution. Designed to offer personal growth and professional development opportunities for TAs, it allowed participants to select the topics most relevant to them. The intent was that TAs who experienced meaningful growth themselves could, in turn, better support their students. Over time, we shifted the unit's topics, added microbadges, and refined its focus to maintain engagement for both new and returning TAs. Another change was discontinuing Unit 4, originally meant to capture and transfer course-specific knowledge between semesters. While valuable in theory, it proved difficult to complete during busy end-of-semester periods, so responsibility for this task shifted to individual instructors.

These and other iterations were shaped by both data and practical realities such as feasibility, staff capacity, and TA engagement. In this way, the TAM’s development reflects core principles of practice-based scholarship—grounding design decisions in authentic constraints, acting on evidence from real-world use, and prioritizing solutions that work in context over those that are perfect in theory.

Assessment and Insights

From the start, we used surveys and reflections to understand how TAs were experiencing the TAM and to guide iterative improvements. Across three years, 75–77% of TAs rated the TAM as moderately to very effective, with reflections showing how the training was applied and where it fell short. For example, one TA shared: “[The test proctoring software] gave loads of students a hard time… I referred back to the pages in the TA training course, but those suggestions didn’t work.” Feedback like this led us to update resources, clarify instructions, and address gaps such as personal wellness and TA–instructor collaboration. This process reflects the essence of practice-based scholarship: gathering evidence from authentic contexts, acting on it, and feeding it back into the design to improve both the product and the practice.

Future Directions

At the end of each academic year, the design team reviews TA feedback and meets with key stakeholders to identify areas for improvement in the Teaching Assistant Module (TAM). These conversations guide a summer redesign cycle, during which the team refines course content and structure. One of the ongoing challenges in this process is resisting the tendency to simply add more content. As the role of the online TA grows, we remain committed to keeping the training useful and time efficient.

The most recent redesign efforts have focused on several key areas:

  1. Course flow and navigation were revised to improve clarity and engagement.

  2. Digital badging is being improved to align more directly with NACE competencies and career development goals, reinforcing the training’s value beyond the TA role.

  3. AI awareness and responsible use are now emphasized, with attention to data protection.

  4. Institutional alignment is being strengthened by expanding content on how TAs can support instructors in reflecting and teaching toward the AIMS of the university.

  5. Identity and mentorship are an increasing focus, as past survey data show that TAs often struggle to balance their roles as both students and educators. The training now includes expanded guidance on support networks, mentoring, and strategies for navigating this dual-role tension.

These improvements reflect our ongoing commitment to a design-based research (DBR) approach, in which the TAM will continually evolve in response to feedback, context, and institutional goals. While sustaining and refining the training each year requires a significant time investment from a busy online learning team, we view this work as essential. Peer educators play a vital role in supporting student success, and we are committed to keeping their training both relevant and responsive. Our aim is not only to prepare TAs for their immediate responsibilities, but to continue prioritizing their growth and development as a meaningful part of the student learning experience.

Discussion and Implications for Practice

This design case began with the goal of reimagining one university’s TA training to better prepare peer educators for the relational, instructional, and technical demands of online learning environments. In doing so, we aimed to create a training program that was scalable and personally meaningful without overburdening already-busy students.

One of the most important insights to emerge from this work was a deeper appreciation for what it takes to design and maintain high-quality training at scale, including the time, expertise, and collaboration required to create experiences that are both pedagogically sound and practically useful. This realization led to the creation of a new non-student, ¾-time designer role to provide consistency and expand design capacity.

We continue to rely heavily on graduate and undergraduate designers. Far from being a limitation, this model has become a core feature of our approach, offering real-world design experience while requiring mentorship and scaffolding. Training novice designers mirrored the challenges of designing for TAs: balancing efficiency with development, structure with flexibility, and deadlines with reflection.

As the TAM matured, we saw its potential for other student-facing roles, and our experience has informed how we prepare all student employees who support learners across campus. Tensions remain, such as measuring long-term impact and improving completion rates. While we consistently met the 80% completion goal for new TAs, returning TAs only recently reached that threshold, and we continue refining incentives and accountability structures to sustain and improve that rate.

Ultimately, we believe this work offers valuable insights for instructional designers, program administrators, and educational leaders. It invites broader reflection on what it means to treat peer educators and student workers as critical contributors to the learning environment and how institutions can build sustainable systems that support their training, growth, and long-term impact. In this sense, the TAM not only served institutional needs but also exemplifies practice-based scholarship: an ongoing, iterative process where design, evidence, and context continually inform one another in pursuit of both local solutions and broader insights.

References

  1. Anderson, S. E. (1997). Understanding teacher change: Revisiting the Concerns-Based Adoption Model. Curriculum Inquiry, 27(3), 331–367. https://doi.org/10.1111/0362-6784.00057
  2. Anderson, T., & Rourke, L. (2002). Using peer teams to lead online discussions. Interactive Learning Environments, 10(2), 105–119. https://doi.org/10.1076/ilee.10.2.105.7451
  3. Bensimon, E. M. (2007). The underestimated significance of practitioner knowledge in the scholarship on student success. The Review of Higher Education, 30(4), 441–469. https://doi.org/10.1353/rhe.2007.0032
  4. Emenike, M. E., Schick, C. P., Van Duzor, A. G., Sabella, M. S., Hendrickson, S. M., & Langdon, L. S. (2020). Leveraging undergraduate learning assistants to engage students during remote instruction: Strategies and lessons learned from four institutions. Journal of Chemical Education, 97(9), 2502–2511. https://doi.org/10.1021/acs.jchemed.0c00779
  5. Ferguson, J. E. (2005). Bridging the gap between research and practice. Knowledge Management for Development Journal, 1(3), 46–54.
  6. Garrison, D. R., Anderson, T., & Archer, W. (1999). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 2(2–3), 87–105. https://doi.org/10.1016/S1096-7516(00)00016-6
  7. Kimball, E. (2016). Reconciling the knowledge of scholars & practitioners: An extended case analysis of the role of theory in student affairs. Critical Questions in Education, 7(3), 287–305.
  8. Kozan, K., & Richardson, J. C. (2014). New exploratory and confirmatory factor analysis insights into the Community of Inquiry survey. The Internet and Higher Education, 23, 39–47. https://doi.org/10.1016/j.iheduc.2014.06.002
  9. Penuel, W. R., & Gallagher, D. (2017). Creating research-practice partnerships in education. Harvard Education Press.
  10. Shannon, D., Twale, D., & Moore, M. (1998). TA teaching effectiveness: The impact of training and teaching experience. The Journal of Higher Education, 69(4), 440–466. https://doi.org/10.2307/2649274