Problem-Based Learning (PBL) is a student-centered approach in which students learn by solving problems and reflecting on their experiences (Barrows & Tamblyn, 1980). While research has shown the benefits of PBL for K-12 students, it has also revealed challenges teachers face in implementation. Recent studies suggest learning analytics (LA) can provide key insights into students' learning processes in open-ended environments like PBL, helping teachers identify when scaffolding is needed. However, research on LA use in K-12 settings is still limited. This study uses LA to examine middle school students' performance in a PBL context, exploring how students' content knowledge relates to learning performance and behaviors.
A total of 418 sixth-grade students (211 boys and 207 girls) from three schools in the western and southwestern United States used a 3D immersive PBL program. Through this program, they learned space science by rescuing six different alien species and relocating them to suitable habitats. The program has nine built-in multimedia tools that facilitate students' problem-solving with instructional scaffolding (Table 1). These tools enable students to engage in cognitive activities to solve problems and are designed to support students' scientific inquiry (Liu et al., 2021).
The program incorporates problem-solving tasks that involve researching various aliens and planets, designing and launching probes to the planets students think are most suitable for each alien, and validating hypotheses by analyzing the data collected from the probes. Students are asked to provide a rationale for each probe they plan to launch, and these probe justifications were categorized into five types (i.e., random input, vague inquiry, vague claim, specific inquiry, and reasoning). To execute these problem-solving tasks, students must strategize, collect relevant information, and decide how to use the embedded tools effectively.
In this study, a 20-item multiple-choice science knowledge test was used to measure students' understanding of key science concepts introduced in the PBL program (Cronbach's alpha = .84). The test was administered before and after students used the PBL program. In addition, over 30,000 lines of log data capturing students' just-in-time navigation patterns were collected to understand how each student used the tools to solve the problem within the PBL program. The log data were cleaned and used to calculate the frequency (how many times each tool was accessed) and duration (how long a student used each tool, in minutes) of each student's tool use. Students' learning behavioral patterns were measured with the frequency and duration of use of each tool and the counts of probes sent.
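To make this aggregation concrete, the sketch below shows one plausible way to derive the frequency and duration measures from raw event logs. The file name, the column names (student_id, tool, timestamp), and the gap-to-next-event heuristic for visit duration are illustrative assumptions, not the study's actual pipeline.

```python
import pandas as pd

# Minimal sketch of the frequency/duration aggregation described above.
# Column names and the duration heuristic are hypothetical.
logs = pd.read_csv("tool_logs.csv", parse_dates=["timestamp"])
logs = logs.sort_values(["student_id", "timestamp"])

# Approximate each visit's duration as the time elapsed until the
# student's next logged event, converted to minutes.
logs["duration_min"] = (
    logs.groupby("student_id")["timestamp"].diff(-1).abs().dt.total_seconds() / 60
)

# Frequency: number of times each student accessed each tool.
# Duration: total minutes each student spent in each tool.
usage = (
    logs.groupby(["student_id", "tool"])
        .agg(frequency=("tool", "size"), duration_min=("duration_min", "sum"))
        .reset_index()
)
print(usage.head())
```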
Table 1
Technology-Enriched Tools in the Problem-Based Learning Program
Tool Name | Tool Functions |
---|---|
Available at the Tool Bar | |
Solar System Database | Provides information on selected planets and moons within our solar system. Data are intentionally incomplete to support the ill-structured nature of the problem-solving environment and foster the need for hypothesis testing. |
Missions Database | Provides information on past NASA missions, including detailed descriptions of probes used on these missions. |
Concept Database | Provides instructional modules on 10 selected scientific concepts designed to facilitate conceptual understanding. This tool supports real-time learning: students access it when they encounter a science concept that is unfamiliar to them but needed in Alien Rescue. |
Periodic Table | Provides a periodic table of the elements. |
Spectra | Provides information to help students interpret spectra found in the Alien Information Center. |
Notebook | A notebook to store students’ notes about their research findings. This tool has a notes-comparison feature to facilitate comparing alien needs and planet requirements. |
Available using the 4 Arrows | |
Alien Information Center | Provides information via 3D imagery and text, on the aliens’ home planet, their journey, species characteristics, and habitat requirements. |
Communication Center | Serves as a home for the program and also houses the message tool. Students receive welcome messages when they log into the program as well as submit their final solutions using the message tool. |
Probe Design Center | Provides information on scientific equipment used in past and future probe missions. Students construct probes by deciding on probe type, communication, power source, and instruments. |
Mission Control Center | Displays the data collected by the probes. Students analyze and interpret these data to develop a solution. Equipment malfunctions can occur, and poor planning may lead to mission failure and wasted budget. |
All students were categorized into low, medium, and high knowledge groups based on their post-PBL science knowledge test scores. Kruskal-Wallis tests revealed significant differences among these groups in both the frequency and duration of tool use for the Concept Database, Spectra, and Notebook, as well as in the duration of use for the Periodic Table and Alien Information Center (Table 2). Post hoc Mann-Whitney U tests detected significant differences between the high and low knowledge groups on each of these variables except the duration of Alien Information Center use (Table 3). Between the high and medium knowledge groups, significant differences were found in the durations of Periodic Table and Alien Information Center use. Between the medium and low knowledge groups, significant differences were found in both the frequency and duration of Concept Database and Notebook use.
Students in the low science knowledge group used the tools identified above more frequently and spent significantly more time with them during problem-solving than students in the high science knowledge group did. Because these tools are designed to facilitate students' science learning and help them solve problems, the findings suggest that students with low science knowledge needed more support and content scaffolding to solve the problem.
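As a concrete illustration of this analysis pipeline, the sketch below runs the omnibus and post hoc tests with SciPy. The data frame, its "group" labels, and the metric names are hypothetical stand-ins for the study's variables, and any multiple-comparison correction is omitted because the procedure is not detailed in the text.

```python
from scipy import stats

# Sketch of the group-comparison procedure reported above.
# `df` is assumed to hold one row per student, with a "group" column
# ("low"/"medium"/"high") and one column per tool-use metric.
def compare_groups(df, metric):
    low = df.loc[df["group"] == "low", metric]
    medium = df.loc[df["group"] == "medium", metric]
    high = df.loc[df["group"] == "high", metric]

    # Omnibus Kruskal-Wallis test across the three knowledge groups.
    h, p = stats.kruskal(low, medium, high)
    print(f"{metric}: H = {h:.3f}, p = {p:.4f}")

    # Post hoc pairwise Mann-Whitney U tests, run only when the
    # omnibus test is significant.
    if p < .05:
        pairs = {"high vs. low": (high, low),
                 "high vs. medium": (high, medium),
                 "medium vs. low": (medium, low)}
        for name, (a, b) in pairs.items():
            u, pu = stats.mannwhitneyu(a, b, alternative="two-sided")
            print(f"  {name}: U = {u:.1f}, p = {pu:.4f}")

# Example call, assuming a hypothetical "Notebook_duration" column:
# compare_groups(df, "Notebook_duration")
```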
Table 2
Findings from Kruskal-Wallis Tests Examining Overall Behavioral Patterns Grouped by Post-PBL Science Knowledge Test Scores
Variable | Kruskal-Wallis H |
---|---|
Tool frequency | |
Solar System Database | 2.9826 |
Missions Database | 0.9229 |
Concept Database | 7.8226* |
Periodic Table | 3.3164 |
Spectra | 7.0318* |
Notebook | 5.8617* |
Alien Information Center | 3.5829 |
Probe Design Center | 0.1830 |
Mission Control Center | 0.6835 |
Tool duration | |
Solar System Database | 1.1804 |
Missions Database | 1.2361 |
Concept Database | 10.705* |
Periodic Table | 4.9126* |
Spectra | 8.784* |
Notebook | 6.183* |
Alien Information Center | 5.8323* |
Probe Design Center | 0.0788 |
Mission Control Center | 1.4015 |
*p < .05.
Table 3
Findings from Mann-Whitney U Tests Between Students in the High, Medium, and Low Groups of Post-PBL Science Knowledge Scores
| High vs. Low | High vs. Medium | Medium vs. Low |
---|---|---|---|---|---|---|
| Median | Median | Median |
Variable | High (n = 80) | Low (n = 107) | High (n = 80) | Medium (n = 231) | Medium (n = 231) | Low (n = 107) |
Tool frequency | | | | | | |
Concept Database | 2*** | 4 | 2 | 5 | 5*** | 4 |
Spectra | 6*** | 13 | 6 | 12 | 12 | 13 |
Notebook | 20.5** | 74 | 20.5 | 54 | 54** | 74 |
Tool duration | ||||||
Concept Database | 0.1*** | 0.72 | 0.1 | 0.73 | 0.73*** | 0.72 |
Periodic Table | 1.18*** | 2.07 | 1.18 | 1.32** | 1.32 | 2.07 |
Spectra | 2.1*** | 5.12 | 2.1 | 3.97 | 3.97 | 5.12 |
Notebook | 4.69*** | 21.87 | 4.69 | 12.8 | 12.8** | 21.87 |
Alien Information Center | 61.74 | 64.35 | 61.74 | 51.7** | 51.7 | 64.35 |
*p < .05, **p < .02, ***p < .01.
We explored the behavioral patterns of the groups in relation to changes in their science knowledge from pre- to post-PBL. Students were categorized into three groups: (a) low-high (n = 32), those whose science knowledge improved from low on the pre-PBL test to high post-PBL; (b) low-low (n = 29), those whose science knowledge was low on both tests; and (c) high-high (n = 14), those whose science knowledge was high on both tests. Kruskal-Wallis tests revealed significant differences among these three groups in tool usage frequencies and durations for the Solar System Database, Concept Database, Notebook, Probe Design Center, and Mission Control Center (Table 4). In addition, the total number of probes launched, the number of successfully launched probes, and the counts of two probe justification types (vague claim and random input) differed significantly among the three groups. Post hoc Mann-Whitney U tests detected significant differences between the low-low and high-high groups in the frequency and duration of use of the Solar System Database, Notebook, and Mission Control Center, as well as in the frequency of Probe Design Center use (Table 5). Furthermore, significant differences were found between the low-high and high-high groups on all variables except the frequency of Solar System Database use. The high-high group used some tools (e.g., the Solar System Database) less frequently and for shorter durations, but used the probe design-related tools more often and for longer than the other groups. As a result, they launched more probes and had more successful probes than the other groups. This finding highlights the role of probe design in the problem-solving process.
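A minimal sketch of this pre/post regrouping is shown below. The file name, column names, and the tercile-based to_level() helper are illustrative assumptions; the study's actual cut points for "low" and "high" are not reported here, so equal-sized splits are used purely for illustration.

```python
import pandas as pd

# Sketch of the pre/post change grouping described above.
def to_level(scores: pd.Series) -> pd.Series:
    # Split scores into three equal-sized levels; the study's real
    # cutoffs may differ.
    return pd.qcut(scores, q=3, labels=["low", "medium", "high"])

tests = pd.read_csv("science_tests.csv")  # hypothetical input file
tests["pre_group"] = to_level(tests["pre_score"])
tests["post_group"] = to_level(tests["post_score"])

# Keep only the three change patterns analyzed above.
patterns = {("low", "high"): "low-high",
            ("low", "low"): "low-low",
            ("high", "high"): "high-high"}
tests["change_group"] = [
    patterns.get((pre, post))
    for pre, post in zip(tests["pre_group"], tests["post_group"])
]
subset = tests.dropna(subset=["change_group"])
print(subset["change_group"].value_counts())
```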
Table 4
Findings from Kruskal-Wallis Tests Examining Overall Behavioral Patterns Grouped by Pre- and Post-PBL Science Knowledge Test Scores
Variable | Kruskal-Wallis H |
---|---|
Tool frequency | |
Solar System Database | 10.401*** |
Missions Database | 3.689 |
Concept Database | 5.310* |
Periodic Table | 0.57 |
Spectra | 1.224 |
Notebook | 13.289*** |
Alien Information Center | 0.841 |
Probe Design Center | 11.738*** |
Mission Control Center | 13.834*** |
Tool duration | |
Solar System Database | 9.915*** |
Missions Database | 4.257 |
Concept Database | 6.385* |
Periodic Table | 0.928 |
Spectra | 1.537 |
Notebook | 12.595*** |
Alien Information Center | 4.312 |
Probe Design Center | 12.073*** |
Mission Control Center | 13.768*** |
Other | |
Probe count | 9.572*** |
Successful probe count | 12.378*** |
Solution count | 0.333 |
Successful solution count | 4.087 |
Wrong solution count | 5.708 |
Vague inquiry count | 1.439 |
Vague claim count | 11.61*** |
Random input count | 14.4*** |
Reasoning count | 4.506 |
Specific inquiry count | 0.806 |
*p < .05, **p < .02, ***p < .01.
Table 5
Findings from Mann-Whitney U Tests Between Students in the Low-Low, Low-High, and High-High Groups of Pre- and Post-PBL Science Knowledge Scores
| Low-Low vs. Low-High | Low-Low vs. High-High | Low-High vs. High-High |
---|---|---|---|---|---|---|
| Median | Median | Median |
Variable | L-L (n = 29) | L-H (n = 32) | L-L (n = 29) | H-H (n = 14) | L-H (n = 32) | H-H (n = 14) |
Tool frequency | ||||||||
Solar System Database | 229 | 128.5 | 229 | 59*** | 128.5 | 59 | ||
Concept Database | 3 | 5 | 3 | 0 | 5 | 0*** | ||
Notebook | 49 | 100.5 | 49 | 1*** | 100.5 | 1*** | ||
Probe Design Center | 85 | 54.5 | 85 | 249*** | 54.5 | 249*** | ||
Mission Control Center | 39 | 18.5 | 39 | 120*** | 18.5 | 120*** | ||
Tool duration | ||||||||
Solar System Database | 68.717 | 60.425 | 68.717 | 20.4*** | 60.425 | 20.4*** | ||
Concept Database | 0.45 | 0.6 | 0.45 | 0 | 0.6 | 0*** | ||
Notebook | 11.733 | 34.083 | 11.733 | 0.042*** | 34.083 | 0.042*** | ||
Probe Design Center | 15.3 | 5.017 | 15.3 | 35.475 | 5.017 | 35.475** | ||
Mission Control Center | 4.7 | 4.325 | 4.7 | 35.192*** | 4.325 | 35.192*** | ||
Other | ||||||||
Probe count | 3 | 1 | 3 | 6*** | 1 | 6*** | ||
Successful probe count | 1 | 0 | 1 | 5.5*** | 0 | 5.5*** | ||
Vague claim count | 0 | 0 | 0 | 1.5** | 0 | 1.5*** | ||
Random input count | 0 | 0 | 0 | 1.5*** | 0 | 1.5*** |
*p < .05, **p < .02, ***p < .01.
The findings have several implications for teachers' PBL implementation. First, teachers may need to make extra efforts with students who lack sufficient content knowledge to solve problems and should monitor these students' progress more frequently to provide timely support (Kim et al., 2019). Providing teachers with adaptive dashboard systems that show students' progress during PBL can be helpful (Tissenbaum & Slotta, 2019; van Leeuwen et al., 2022). Second, the high-high group's frequency and duration of probe design-related tool use (i.e., the Mission Control Center and Probe Design Center) suggest the importance of testing hypotheses in problem-solving, which aligns with the findings of Spires et al. (2011) regarding the positive relationship between middle-grade students' hypothesis-testing strategies and their learning outcomes in PBL. Students who entered the program with limited science knowledge often dedicated substantial time to the tools designed to teach the necessary scientific concepts, which reduced the time available for completing the problem-solving tasks. In contrast, students who already possessed a strong science foundation before the PBL experience could allocate more time and effort to hypothesis testing, as they did not need to spend as much time on the initial stages of problem-solving.
This study enriches the existing literature in two ways. First, by conducting in-depth analyses of students' behaviors via LA, we have shown that students' content knowledge plays a role in how they determine when and which tools to use to facilitate their problem-solving. Second, this study offers evidence and insights for researchers in PBL and LA who are interested in designing technology-enriched tools to support K-12 teachers in classroom implementation.
One of the limitations of this study is its reliance on log data as the sole source for analyzing student behavior. Future research could incorporate multiple data sources to enhance the transparency and robustness of data interpretation. Additionally, the study’s focus on a single PBL program limits the generalizability of the findings.