Analytics is the process of mining routinely collected data for actionable information about an organization or activity. Schools use analytics routinely to describe, predict, and improve student performance outcomes.
Traditional analytics methods require large datasets. Tim McKay’s E2Coach is a representative traditional analytics project: it makes an early prediction of whether an individual student will pass a course based on the historical behavior patterns of other students. E2Coach builds its models from Michigan’s Data Warehouse, a repository of more than 20 years of student CMS activity records. These early warnings give instructors the opportunity to intervene with at-risk students before they fall too far behind.
Similarly, Purdue’s Course Signals Program predicts learning outcomes based on a training dataset collected from 27,276 unique students enrolled in over 3,000 sections of 597 courses in 75 departments and 9 colleges.
While these projects are impressive, traditional analytics are less useful at small institutions serving at-risk populations. Such institutions lack both the large datasets and the expertise needed to leverage analytics. In addition, these projects focus on institutional-level rather than class-level outcomes. Finally, large analytics tools are subject to elimination for budget or business reasons (case in point: Purdue’s Course Signals closed in December 2017).
Project Activities and Goals
We want to adapt big-data tools and strategies to smaller “artisan” scales (multi-section courses or even individual classes). We’re asking, “Can mining smaller datasets yield actionable data for teachers? What challenges do small-scale analytics present in terms of data collection, implementation, and interpretation?”
We have a large dataset containing 10 years of students’ instructor and course evaluation responses, collected electronically. We will mine that dataset using traditional analytics, then determine which analytics tools scale down successfully to a single- or multi-course level.
Specific methods we will be evaluating include:
- Open response text analysis of student sentiment
- Multiple regression and congruence analysis of student behaviors with course grades
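To illustrate the first method, here is a minimal sketch of lexicon-based sentiment scoring for open-response evaluation comments. The word lists and sample comments are invented for illustration; a real analysis would use a validated lexicon or tool (e.g., VADER) rather than this toy vocabulary.

```python
# Toy open-response sentiment scoring with a tiny hand-made lexicon.
# The word lists below are illustrative only, not a validated lexicon.

POSITIVE = {"helpful", "clear", "engaging", "great", "enjoyed", "fair"}
NEGATIVE = {"confusing", "boring", "unfair", "rushed", "disorganized"}

def sentiment_score(comment: str) -> float:
    """Return a score in [-1, 1]: (positives - negatives) / matched words."""
    words = [w.strip(".,!?").lower() for w in comment.split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    matched = pos + neg
    return 0.0 if matched == 0 else (pos - neg) / matched

# Hypothetical student comments
comments = [
    "Lectures were clear and the instructor was helpful.",
    "The pacing felt rushed and the grading seemed unfair.",
]
for c in comments:
    print(f"{sentiment_score(c):+.2f}  {c}")
```

Scores near +1 indicate predominantly positive language; scores near -1, negative. Even a sketch this simple can flag sections whose comments skew sharply negative for closer reading.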
Questions we are exploring:
- Which routinely collected data are predictors for final course outcomes?
- How should data be structured for analysis?
- Which analysis methods are most useful? Which can be scaled up or down successfully?
- How do we interpret the results of analysis?
- What are the simplest, clearest ways to disseminate the RESULTS of an analysis? The analysis METHODS themselves?
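To make the regression question concrete, the following sketch fits an ordinary least squares model predicting final grades from two hypothetical routinely collected behaviors (weekly LMS logins and assignments submitted). The data, variable names, and predictors are invented; it uses only the standard library, solving the normal equations with a small Gaussian elimination.

```python
# Toy multiple-regression sketch: predict final course grades from two
# hypothetical behaviors. Solves the normal equations (X^T X) b = X^T y.

def solve(A, b):
    """Solve a small dense linear system A x = b by Gaussian elimination."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]  # augmented matrix
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_ols(rows, y):
    """Ordinary least squares with an intercept column."""
    X = [[1.0] + list(r) for r in rows]
    k = len(X[0])
    XtX = [[sum(X[i][a] * X[i][b] for i in range(len(X))) for b in range(k)]
           for a in range(k)]
    Xty = [sum(X[i][a] * y[i] for i in range(len(X))) for a in range(k)]
    return solve(XtX, Xty)

# Hypothetical per-student records: (LMS logins per week, assignments submitted)
behaviors = [(2, 5), (8, 10), (5, 7), (1, 3), (9, 12), (4, 6)]
grades = [62.0, 90.0, 75.0, 55.0, 95.0, 70.0]

intercept, b_logins, b_submits = fit_ols(behaviors, grades)
print(f"grade ~ {intercept:.1f} + {b_logins:.1f}*logins + {b_submits:.1f}*submissions")
```

The fitted coefficients suggest which behaviors predict outcomes in a given dataset; whether those relationships hold at single-course scale is exactly the question this project examines.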
If successful, individual educators could use analytics to: 1) identify students who are struggling; 2) evaluate learning outcomes more consistently; and 3) identify the best predictors of student success for their students.
FERPA guidelines restrict this project to faculty and graduate students.