EdTech Archives Proceedings of the Learning Engineering Research Network Convening (LERN 2026)

Developing Learning Strategy Heuristics for Active Mobile Learning Platforms

Ishrat Ahmed, Shailee Shah, & Scotty D. Craig

Abstract

This study presents a heuristic framework and four-level rubric for evaluating the pedagogical effectiveness of mobile learning applications. Drawing on the literature on active learning and mobile learning platforms, we developed an eight-category framework encompassing active learning, audio-based engagement, content access, interactivity, collaboration, digital equity, instructor feasibility, and feedback. Each category is operationalized through sub-heuristics and assessed using a rubric that rates implementation quality from no implementation to full pedagogical integration. Situated within the Learning Engineering (LE) process, this work presents a nested design cycle within the Challenge phase, in which a heuristic checklist was created to systematically identify pedagogical strengths and gaps in existing mobile learning platforms. Through a multi-phase methodology of literature review followed by framework synthesis and validation, we show how this approach offers actionable, evidence-based criteria for identifying strengths and gaps in mobile learning platforms. The framework is designed to support educators and developers in creating mobile learning environments that move beyond content delivery to foster deeper engagement, learner autonomy, and more effective learning outcomes.

Introduction

Advances in educational technology have reshaped how students in higher education access and interact with learning content. Yet this shift also presents a challenge: as learning environments become increasingly digital and self-directed, ensuring that students remain actively engaged grows more difficult. Within Active L@S, this challenge underscores the need for instructional approaches that promote agency and sustained interaction with course material. Mobile learning is one such approach, offering a scalable way to extend learning beyond traditional classroom boundaries. It enables instruction that is flexible, self-paced, and responsive to diverse learner needs, reflecting a broader shift from traditional, in-person models toward technology-enhanced education. While mobile platforms expand access and convenience, their pedagogical impact depends on how well they support meaningful learning. Active learning, defined by student engagement, reflection, and learner agency, has emerged as a critical benchmark because it fosters deeper cognitive processing and improved learning outcomes (Miyatsu, Nguyen, & McDaniel, 2018; Roscoe & Craig, 2022). This makes it essential to evaluate whether mobile learning tools strategically support learners rather than merely delivering content.

Despite the growing adoption of mobile learning tools, structured, research-based approaches for evaluating their pedagogical alignment remain limited. Existing frameworks tend to prioritize usability, access, and content delivery while overlooking the deeper cognitive and behavioral processes that support sustained learning. As a result, many mobile learning tools remain supplementary rather than strategically integrated into instruction (Rangel-de Lázaro & Duart, 2023), and common barriers, such as insufficient instructional design and lack of alignment with evidence-based strategies, continue to limit their effectiveness (Sophonhiranrak, 2021). Foundational models, such as Ozdamli and Cavus’s (2011) framework of mobile learning elements, highlight the importance of harmonizing learner needs, content delivery, instructional roles, learning environments, and assessment, yet no widely adopted evaluation tool has operationalized these principles for practical use by educators and designers.

Learning Strategy Analysis (LSA) addresses this gap by providing a structured, research-informed methodology for assessing the pedagogical alignment of digital learning systems with evidence-based instructional strategies (Villa, Craig, Zakhidov, & Zielke, 2021). Situated within the Learning Engineering (LE) process, LSA was employed in this study as a nested analytic cycle (Craig et al., 2025) within the Challenge phase to inform the design of a heuristic checklist applicable to mobile learning platforms. Specifically, LSA guided the identification and operationalization of instructional strategies such as elaboration, self-explanation, multimedia practice, and spaced practice, all known to foster deeper engagement and mastery. Using LSA-informed heuristics, checklists, and rubrics, the resulting framework enables systematic identification of pedagogical strengths and gaps in mobile learning applications, supporting evidence-based decision-making and learner-centered design. By integrating LE and LSA, this approach bridges the gap between technological deployment and pedagogical integrity, providing a practical tool for evaluating and improving the instructional quality of mobile learning environments.

Methodology

This study was conducted collaboratively by two researchers and employed a two-phase methodology to develop a pedagogically grounded framework for evaluating mobile learning applications. Guided by principles of active learning and evidence-based instructional design, the research aimed to create a practical heuristic tool that educators and designers can use to assess whether mobile platforms meaningfully support learner engagement, cognitive processing, and instructional effectiveness.

Phase 1 involved a targeted literature review on mobile learning and active learning strategies to identify relevant pedagogical constructs (e.g., elaboration, note-taking, and multimedia learning) and establish foundational design principles. Phase 2 synthesized these insights into an eight-category heuristic framework covering Active Learning, Learning with Voices/Audio, Content Access, Engagement and Interactivity, Communication and Collaboration, Digital Divide, Instructor Readiness and Feasibility, and Feedback. Each category was further divided into sub-heuristics that include actionable design recommendations. These heuristics function both as evaluative criteria for identifying areas of improvement and as guidelines for designing pedagogically grounded mobile learning tools. While not all strategies must be present in a given system, the inclusion of any single sub-heuristic, or a meaningful combination of them, can enhance active learning.
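
To make the framework concrete for evaluation tooling, the sketch below encodes the eight categories as a simple data structure. This is a minimal, illustrative Python sketch: the Heuristic class, the FRAMEWORK list, and the sub-heuristic labels (paraphrased from the next section) are our illustrative assumptions, not artifacts of the study.

```python
from dataclasses import dataclass, field

@dataclass
class Heuristic:
    """One of the eight evaluation categories and its sub-heuristics."""
    name: str
    sub_heuristics: list[str] = field(default_factory=list)

# The eight categories, with sub-heuristic labels paraphrased from the
# descriptions in the next section (illustrative, not exhaustive).
FRAMEWORK = [
    Heuristic("Active Learning",
              ["Structured note-taking", "Prompts for self-explanation",
               "Meaningful summarization", "Reflective question-answering"]),
    Heuristic("Learning with Voices/Audio",
              ["High-quality audio", "Playback controls", "Speech-to-text"]),
    Heuristic("Content Access",
              ["Cross-device synchronization", "Timestamp-linked navigation",
               "Searchable personal archives"]),
    Heuristic("Engagement & Interactivity",
              ["Gamification elements", "Simulations and applied tasks"]),
    Heuristic("Communication & Collaboration",
              ["Peer interaction", "Instructor communication",
               "Third-party integrations"]),
    Heuristic("Digital Divide",
              ["Cross-device and OS support", "Offline functionality",
               "Multilingual support"]),
    Heuristic("Instructor Readiness & Feasibility",
              ["Onboarding resources", "Flexible content authoring",
               "LMS compatibility"]),
    Heuristic("Feedback",
              ["Appropriately timed feedback", "Clear explanations",
               "Learner choice of depth and frequency", "Dialogue-based feedback"]),
]
```

Encoding categories and sub-heuristics uniformly lets a checklist or scoring tool iterate over them in the same way when applying the rubric described later in this paper.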

Learning Strategy Heuristics Related to Mobile Learning Platforms

As outlined above, we developed eight heuristic categories for evaluating mobile learning systems. The following section provides a definition of each heuristic and describes its corresponding sub-heuristics.

Active Learning Heuristic

Active learning refers to instructional methods that engage learners in the process of knowledge construction through varied activities. The Active Learning heuristic emphasizes sub-heuristics that prompt learners to construct knowledge, including note-taking, self-explanation, summarization, and question-answering (Miyatsu, Nguyen, & McDaniel, 2018; Roscoe & Craig, 2022). Mobile systems should support structured, paraphrased note-taking (Van Wyk & Van Ryneveld, 2018); prompt learners to explain concepts in their own words beyond multiple-choice responses (Bisra et al., 2018); enable meaningful summarization rather than copy-paste behavior; and incorporate reflective, open-ended questions that promote elaborative reasoning. Questions that prompt elaborative reasoning assess student understanding more effectively than factual or binary questions (Hernández-de-Menéndez et al., 2019). These strategies collectively foster deep cognitive processing, sustained engagement, and improved learning outcomes.

Learning with Voices/Audio Heuristic

This heuristic emphasizes the pedagogical value of audio-based learning in mobile contexts. Effective systems should incorporate sub-heuristics such as delivering high-quality audio, providing playback controls (e.g., pause, rewind, and timestamp tagging) that support review and note-taking without disrupting the listening experience (Schroeder et al., 2020), and integrating speech-to-text features to support accessibility (Craig & Schroeder, 2017). These elements make audio a powerful tool for mobile learning environments.

Content Access Heuristic

This heuristic refers to how easily learners can access both platform-provided content and their own user-generated content (e.g., notes, highlights). Mobile systems should offer sub-heuristics such as seamless synchronization across devices, timestamp-linked navigation between notes and content, and personal archives that are searchable and easy to review (Kiewra et al., 1995). Such features support autonomy, revisitation, and reflection, which are key components of self-regulated learning.

Engagement & Interactivity Heuristic

This heuristic focuses on learner engagement (i.e., motivational and behavioral involvement in learning) and interactivity (i.e., the learner’s ability to manipulate content or receive feedback from the system). Interactive elements, when grounded in pedagogy, can help maintain motivation and foster persistence. Gamification elements such as badges or points can enhance motivation (Sophonhiranrak, 2021), while opportunities for simulations or applied tasks help learners connect knowledge to authentic contexts (Rangel-de Lázaro & Duart, 2023; Ozdamli & Cavus, 2011). Together, these features can sustain attention in mobile environments where distractions are common.

Communication & Collaboration Heuristic

This heuristic assesses the system’s capacity to support interpersonal learning, peer exchange, and instructor-learner interaction. Such interaction reinforces understanding, supports metacognition, and builds learning communities. Effective platforms enable peer interaction through shared notes, discussion spaces, or review tools; support instructor communication through messaging or comments; and integrate with third-party tools such as WhatsApp or YouTube to expand opportunities for interaction (Sophonhiranrak, 2021; Rangel-de Lázaro & Duart, 2023; Ozdamli & Cavus, 2011).

Digital Divide Heuristic

This heuristic examines the platform’s capacity to accommodate diverse learners, particularly those with limited digital resources. Systems should function smoothly across a range of devices and operating systems, offer offline functionality for core features, and provide multilingual support (Sophonhiranrak, 2021; Ozdamli & Cavus, 2011). These elements help ensure that mobile learning reduces rather than reinforces inequities.

Instructor Readiness & Feasibility Heuristic

This heuristic evaluates how easily instructors can adopt and integrate the platform into their teaching practice. Effective systems provide intuitive onboarding resources (Sophonhiranrak, 2021; Rangel-de Lázaro & Duart, 2023), support flexible content authoring, and ensure compatibility with common learning management systems (Ozdamli & Cavus, 2011).

Feedback Heuristic

Feedback refers to the quality and timeliness of information provided to learners about their performance and engagement. Mobile systems should offer appropriately timed feedback (Mullet et al., 2014), clear and constructive explanations (Rüth et al., 2021; Shute, 2008), and options for learners to choose the depth and frequency of the feedback they receive. When given such a choice, learners are more likely to view feedback and engage more positively with the system (Kuklick, 2025). Support for dialogue-based feedback further enhances engagement and reflection (Carless & Winstone, 2023). Together, these features promote self-regulation, deeper learning, and sustained use.

Learning Strategy Heuristics Rubric

Following the heuristics described above, we created a four-level rubric for each sub-heuristic. The four levels are: No Implementation (0), indicating the feature is entirely absent; Partial Implementation (1), indicating minimal or incomplete presence; Adequate Implementation (2), indicating functional but not fully optimized support; and Full Implementation (4), indicating comprehensive, well-integrated pedagogical alignment. For example, the “prompts for self-explanation” sub-heuristic in the Active Learning heuristic ranges from no prompts for learners to articulate understanding, to occasional generic prompts, to regular but non-adaptive prompts, and finally to fully embedded, context-aware prompts that encourage deep elaboration.
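
As a minimal sketch of how these four levels might be encoded in an evaluation tool, the Python below uses an integer-valued enum; the names RubricLevel and SELF_EXPLANATION_DESCRIPTORS are illustrative assumptions, while the point values and level descriptions follow the text above.

```python
from enum import IntEnum

class RubricLevel(IntEnum):
    """The four implementation levels described above, with their point values."""
    NO_IMPLEMENTATION = 0        # feature entirely absent
    PARTIAL_IMPLEMENTATION = 1   # minimal or incomplete presence
    ADEQUATE_IMPLEMENTATION = 2  # functional but not fully optimized
    FULL_IMPLEMENTATION = 4      # comprehensive, well-integrated pedagogical alignment

# Level descriptors for the "prompts for self-explanation" sub-heuristic,
# paraphrasing the example given in the text above.
SELF_EXPLANATION_DESCRIPTORS = {
    RubricLevel.NO_IMPLEMENTATION: "No prompts for learners to articulate understanding",
    RubricLevel.PARTIAL_IMPLEMENTATION: "Occasional, generic prompts",
    RubricLevel.ADEQUATE_IMPLEMENTATION: "Regular but non-adaptive prompts",
    RubricLevel.FULL_IMPLEMENTATION: "Embedded, context-aware prompts for deep elaboration",
}
```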

The rubric offers actionable metrics with clear criteria to assess how well an app supports learner engagement, accessibility, and instructional integration. A total score out of 28 indicates overall alignment, with higher scores reflecting stronger support for effective, equitable, and interactive mobile learning experiences.
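
Continuing the RubricLevel sketch above, a small aggregation routine could sum ratings across sub-heuristics and flag those scored below Adequate; total_score, gaps, and the example ratings are hypothetical, not part of the published rubric.

```python
# Reuses RubricLevel from the sketch above.

def total_score(ratings: dict[str, RubricLevel]) -> int:
    """Sum rubric ratings across all scored sub-heuristics."""
    return sum(int(level) for level in ratings.values())

def gaps(ratings: dict[str, RubricLevel]) -> list[str]:
    """Sub-heuristics rated below Adequate, i.e., candidates for redesign."""
    return [name for name, level in ratings.items()
            if level < RubricLevel.ADEQUATE_IMPLEMENTATION]

# Hypothetical evaluation of one app's Active Learning category:
ratings = {
    "Structured note-taking": RubricLevel.FULL_IMPLEMENTATION,
    "Prompts for self-explanation": RubricLevel.PARTIAL_IMPLEMENTATION,
    "Meaningful summarization": RubricLevel.NO_IMPLEMENTATION,
    "Reflective question-answering": RubricLevel.ADEQUATE_IMPLEMENTATION,
}
print(total_score(ratings))  # 7
print(gaps(ratings))         # ['Prompts for self-explanation', 'Meaningful summarization']
```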

Conclusion

This study presents a structured, pedagogically grounded heuristic framework for evaluating mobile learning applications. By integrating evidence-based instructional principles with a four-level evaluative rubric, ranging from no implementation to full implementation, the framework enables systematic assessment of both the presence and quality of learning-supportive features. Applied in practice, the framework serves as both a diagnostic tool and a design guide, helping educators and developers identify strengths, address gaps, and enhance the instructional integrity of mobile platforms. Ultimately, this approach supports the creation of mobile learning systems that move beyond content delivery to foster deeper engagement, learner autonomy, and meaningful, high-impact learning.

Acknowledgement

The research reported here was partially supported by the Institute of Education Sciences, U.S. Department of Education, through Grant R305T240035 to Arizona State University. The opinions expressed are those of the authors and do not represent views of the Institute or the U.S. Department of Education.

References

  1. Bisra, K., Liu, Q., Nesbit, J. C., Salimi, F., & Winne, P. H. (2018). Inducing self-explanation: A meta-analysis. Educational Psychology Review, 30(3), 703–725. https://doi.org/10.1007/s10648-018-9434-x
  2. Carless, D., & Winstone, N. (2023). Dialogic feedback in higher education: A systematic review and conceptual framework. Studies in Educational Evaluation, 78, 101276. https://doi.org/10.1016/j.stueduc.2023.101276
  3. Craig, S. D., Avancha, K., Malhotra, P., C., J., Verma, V., LiKamWa, R., Gary, K., Spain, R., & Goldberg, B. (2025). Using a nested learning engineering methodology to develop a team dynamic measurement framework for a virtual training environment. In International Consortium for Innovation and Collaboration in Learning Engineering (ICICLE) 2024 Conference Proceedings: Solving for Complexity at Scale (pp. 115–132). https://doi.org/10.59668/2109.21735
  4. Craig, S. D., & Schroeder, N. L. (2017). Reconsidering the voice effect when learning from a virtual human. Computers & Education, 114, 193–205. https://doi.org/10.1016/j.compedu.2017.07.003
  5. Hernández-de-Menéndez, M., Vallejo Guevara, A., Tudón Martínez, J. C., Hernández Alcántara, D., & Morales-Menéndez, R. (2019). Active learning in engineering education: A review of fundamentals, best practices and experiences. International Journal on Interactive Design and Manufacturing, 13(3), 909–922. https://doi.org/10.1007/s12008-019-00557-8
  6. Kiewra, K. A., Benton, S. L., Kim, S. I., Risch, N., & Christensen, M. (1995). Effects of note-taking format and study technique on recall and relational performance. Contemporary Educational Psychology, 20(2), 172–187. https://doi.org/10.1006/ceps.1995.1011
  7. Kuklick, L. (2025). Effects of learner choice over automated, immediate feedback. Learning and Instruction, 96, Article 102065. https://doi.org/10.1016/j.learninstruc.2024.102065
  8. Miyatsu, T., Nguyen, K., & McDaniel, M. A. (2018). Five popular study strategies: Their pitfalls and optimal implementations. Perspectives on Psychological Science, 13(3), 390–407. https://doi.org/10.1177/1745691617710510
  9. Mullet, H. G., Butler, A. C., Berdin, B., von Borries, R., & Marsh, E. J. (2014). Delaying feedback promotes transfer of knowledge despite student preferences to receive feedback immediately. Journal of Applied Research in Memory and Cognition, 3(4), 222–229. https://doi.org/10.1016/j.jarmac.2014.05.001
  10. Ozdamli, F., & Cavus, N. (2011). Basic elements and characteristics of mobile learning. Procedia – Social and Behavioral Sciences, 28, 937–942. https://doi.org/10.1016/j.sbspro.2011.11.173
  11. Rangel-de Lázaro, G., & Duart, J. M. (2023). Moving learning: A systematic review of mobile learning applications for online higher education. Journal of New Approaches in Educational Research, 12(2), 198–224. https://doi.org/10.7821/naer.2023.7.1287
  12. Roscoe, R. D., & Craig, S. D. (2022). Learning through collaborative explanation: Processes and outcomes. Educational Psychologist, 57(3), 191–206.
  13. Rüth, M., Breuer, J., Zimmermann, D., & Kaspar, K. (2021). The effects of different feedback types on learning with mobile quiz apps. Frontiers in Psychology, 12, 665144. https://doi.org/10.3389/fpsyg.2021.665144
  14. Schroeder, N. L., Glahn, R., & Craig, S. D. (2020). The effects of pedagogical agent voice and animation on learning, motivation, and cognitive load. Computers & Education, 149, 103814.
  15. Shute, V. J. (2008). Focus on formative feedback. Review of Educational Research, 78(1), 153–189. https://doi.org/10.3102/0034654307313795
  16. Sophonhiranrak, S. (2021). Features, barriers, and influencing factors of mobile learning in higher education: A systematic review. Heliyon, 7(4), e06696. https://doi.org/10.1016/j.heliyon.2021.e06696
  17. Van Wyk, M., & Van Ryneveld, L. (2018). Affordances of mobile devices and note-taking apps to support cognitively demanding note-taking. Education and Information Technologies, 23(4), 1639–1653. https://doi.org/10.1007/s10639-017-9684-0
  18. Villa, A., Craig, S. D., Zakhidov, A., & Zielke, M. (2021). Evaluating digital learning environments with the Learning Strategy Analysis framework. Journal of Computer Assisted Learning, 37(6). https://doi.org/10.1177/1071181321651253