Data Analysis in ST Math (DAST) Project Page


ACCOMPLISHMENTS AND HIGHLIGHTS
Updated July 2018
Accomplishments to date under the grant:

  • Developed a system to collect and analyze fine-grained student attempts
  • Strengthened the feedback loop between ST Math developers and researchers to ensure that the strongest research results can inform practice
  • Developed an innovative way to collect student motivation and affect through a puzzle-like interface
  • Analyzed student data within ST Math to understand which game features and student behaviors are associated with success
  • Conducted observations of select teachers and classes, focusing on teacher practices in the lab and student engagement

Highlights of what we have learned and how it can be used:

· Learned: Students who replay previously-passed games after failure experiences have lower achievement than students who replay previously-passed games after success experiences.
  Suggested action: Develop ways to encourage students who are failing a level to persevere, provide more supports so that they can be successful if “stuck,” and re-position navigating to replays as a “reward” after successfully completing a challenging level.

· Learned: Among third-grade fraction games, the majority of levels displayed good learning curves, in which students progressed and then leveled off, indicating achievement of skill mastery (see the sketch that follows this list).
  Suggested action: Feedback was provided to the MIND content team to focus revision efforts on the small number of games that did not display good learning curves.

· Learned: Some games were identified as too difficult for some students as currently presented, or as not related to previous learning content in the most efficient way.
  Suggested action: Feedback about difficulty has been provided to the ST Math content team. Longer term, MIND is exploring personalized difficulty via diagnostics and “ramp-up” curricula.

· Learned: Teacher variation in curricular ordering has implications for student progress through ST Math.
  Suggested action: These associations suggested changes to the recommended ordering to improve students’ successful progress through the curriculum; feedback was provided to MIND.

· Learned: Teachers who better understood the philosophy behind ST Math were better able to support students in times of struggle within the games.
  Suggested action: Implement targeted teacher PD focused on comfort with ambiguity and learning from mistakes.

· Learned: Although students’ perception of the usefulness and importance of mathematics increases by grade level, their self-efficacy ratings for mathematics decrease.
  Suggested action: Examine ways to increase student self-efficacy through mastery-learning approaches or positive messaging.

· Learned: Student motivation for mathematics increases as test score performance increases, but this relationship differs between boys and girls.
  Suggested action: Use multiple years of achievement data to pinpoint when differences may begin; use game data to identify whether certain groups of students are more or less motivated.

· Learned: Students would like their teachers to know more about their motivation for academic and non-academic subjects.
  Suggested action: Design a way to share students’ motivation survey answers with teachers.
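The learning-curve analysis referenced above tracks how error rates change as students get repeated practice opportunities on a skill; levels whose error rates decline and flatten suggest mastery, while flat or persistently high curves flag content for revision. Below is a minimal sketch of this kind of check in Python, assuming hypothetical attempt-level records with student_id, opportunity, and error columns; the column names and synthetic data are illustrative only and do not reflect the actual ST Math log schema or the project's analysis code.

    import numpy as np
    import pandas as pd
    from scipy.optimize import curve_fit

    rng = np.random.default_rng(0)
    # Hypothetical attempt-level records; error probability decays with practice.
    records = pd.DataFrame({
        "student_id": rng.integers(0, 200, size=5000),
        "opportunity": rng.integers(1, 11, size=5000),  # 1st, 2nd, ... try at a skill
    })
    records["error"] = rng.random(5000) < 0.6 * records["opportunity"] ** -0.5

    # Empirical learning curve: mean error rate at each practice opportunity.
    curve = records.groupby("opportunity")["error"].mean()

    # A "good" learning curve is well fit by a decaying power law.
    def power_law(x, a, b):
        return a * np.power(x, -b)

    (a, b), _ = curve_fit(power_law, curve.index.values.astype(float),
                          curve.values, p0=(0.5, 0.5))
    print(f"fitted error rate ~= {a:.2f} * opportunity^(-{b:.2f})")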


DETAILS: EVALUATION FOR ACTIONABLE CHANGE: A DATA-DRIVEN APPROACH
Increasing pressures for accountability have resulted in a push for rigorous evaluation of educational programs and practice. Yet rigorous evaluations such as Randomized Controlled Trials (RCTs) are expensive and often show small effects. Even RCTs of widely adopted digital learning platforms can show disappointing results, and these results have little impact on subsequent adoption of programs already entrenched in the educational landscape. New methods are needed both to estimate effects and to indicate ways of improving outcomes for already-adopted digital learning tools. For platforms currently in wide-scale use, novel approaches to assessing use patterns and their relations with outcomes can both evaluate effectiveness and provide means for improving it. This proposal sets forth an evaluation of ST Math, a K-8 digital learning platform, through a partnership between researchers at North Carolina State University (NC State), with expertise in Educational Evaluation, Educational Data Mining, and Assessment, and program developers at the non-profit MIND Research Institute (MIND).
The PIs propose to use data collected within MIND’s ST Math to develop behavior-based techniques for assessing student progress within the system as well as the quality of the existing interventions. This research will explore novel methods for detecting, visualizing, and evaluating students’ puzzle-solving and puzzle-selection behaviors. The PIs will assess whether the detected patterns are driven by students’ incoming competence or can be used to predict their short- and long-term performance. By linking student and teacher behavior patterns with important learning and motivational outcomes, researchers can recommend promising actions to teachers and potential refinements to developers. This work has the potential not only to transform the use and success of the ST Math platform, but also to create methods that can be refined and transferred to the evaluation and implementation of other platforms.
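As one illustration of how detected behavior patterns could be linked to outcomes, the sketch below fits a simple classifier relating hypothetical per-student features (elective replays, failed levels, minutes played) to a later assessment outcome. The feature names, synthetic data, and model choice are assumptions for illustration only, not the project's actual variables or methods; patterns that predict outcomes beyond students' incoming competence would be the candidates for recommendations to teachers and developers.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    n = 500
    replay_count = rng.poisson(3, n)          # elective replays per student (hypothetical)
    failed_levels = rng.poisson(2, n)         # levels failed before passing (hypothetical)
    minutes_played = rng.normal(120, 30, n)   # total time in the game (hypothetical)

    X = np.column_stack([replay_count, failed_levels, minutes_played])
    # Synthetic outcome: whether the student passes a later assessment,
    # loosely tied to the simulated behaviors above.
    logit = 0.02 * minutes_played - 0.4 * failed_levels + 0.1 * replay_count - 1.5
    y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

    model = LogisticRegression(max_iter=1000)
    print("cross-validated accuracy:", cross_val_score(model, X, y, cv=5).mean().round(3))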

Intellectual merit: This project will explore novel, noninvasive approaches for determining the impact of the ST Math digital learning environment. It will advance the analytical basis for formative assessment using process data and build toward algorithms that improve STEM teaching and learning by facilitating the automatic recognition of teachable moments. The transformative potential of this research resides in the creation of new cross-disciplinary approaches that can be used not only to evaluate impact, but also to inform improved teaching and learning in STEM, by leveraging observed behaviors of students and teachers together with Educational Data Mining techniques.

Broader impacts: This project addresses the important national concern of strengthening the mathematical competency of our students by enhancing both their understanding of math concepts and their motivation for math learning. By investing in evaluation innovations to improve digital learning platforms such as ST Math, we can reach large numbers of children with programs that approach maximal effectiveness with each iteration; the development of automated tools for improvement can also enhance both the efficiency and the efficacy of these platforms. By removing the adversarial process from evaluation, researchers, educators, and developers can work toward the common goal of improving learning.

PUBLICATIONS

Published Conference Proceedings

Peddycord-Liu, Z., Harred, R., Karamarkovich, S. M., Barnes, T., Lynch, C., & Rutherford, T. (2018). Learning curve analysis in a large-scale, drill-and-practice serious math game: Where is learning supported? In Proceedings of the 19th International Conference on Artificial Intelligence in Education. London, UK. LINK

Peddycord-Liu, Z., Cody, C., Kessler, S. M., Barnes, T., Lynch, C., & Rutherford, T. (2017). Using serious game analytics to inform digital curricular sequencing: What math objective should students play next? In Proceedings of the ACM SIGCHI Annual Symposium on Computer-Human Interaction in Play (CHI PLAY). Amsterdam, Netherlands. LINK

Liu, Z., Cody, C., Barnes, T., Lynch, C., & Rutherford, T. (2017). The antecedents of and associations with elective replay in an educational game: Is replay worth it? In Proceedings of the 10th International Conference on Educational Data Mining. Wuhan, China. LINK

Reports

Example of de-identified report provided to one of the project districts regarding fall and winter motivation surveys. LINK

PRESENTATIONS

Karamarkovich, S. M. & Rutherford, T. (2018, August). Differences in profiles of motivation for mathematics across grades and districts. Paper accepted to the annual convention of the American Psychological Association, San Francisco, CA. LINK-Coming Soon.

Karamarkovich (Kessler), S. M. & Rutherford, T. (2017, October). Fraction errors in a digital mathematics environment: Latent class and transition analysis. Presented at the annual meeting of the Cognitive Development Society, Portland, OR. LINK

Karamarkovich (Kessler), S. M., Cao, W., & Rutherford, T. (2017, August). Predictors and components of fraction performance in a mathematics digital environment. Presented at the annual meeting of the American Psychological Association, Washington, D.C. LINK

PRODUCTS

Coming Soon.

TEAM

Faculty
PI, Dr. Teomara (Teya) Rutherford, twitter: @DrTeyaR
Co-PI, Dr. Tiffany Barnes, twitter: @DrTiffanyBarnes
Co-PI, Dr. Collin Lynch

Postdoctoral Scholar
Allison Liu (starting Sept. 2018)

Graduate Students
Sarah Karamarkovich (TELS, current)
Zhongxiu Liu (CS, current), twitter: @PirateAuroraLiu
Jessica Vandenberg (TELS, current)
Christa Cody (CS, current)
Waverly Logan (TELS, current)
Rachel Harred (CS, current)
Andrea Kunze (TELS, 2016-2018)
Wenjia Cao (TELS, 2016-2018)
Marina Wagemaker (TELS, 2016-2017)

Undergraduate Students
Ryan Edmonds (current)
Megan Armstrong (current)
Maiya Whiteside (2016-2018)
DeShaun Fontenot (2016-2018)
Chantelle Linthicum (2017)