Meeting 1 - 2021-03-30

Materials/Agenda

Participants

  • Ed Comer
  • Andrew Rice
  • Mario Palmisano
  • Mark Masterson
  • Patrick Yoho
  • Ryan Gravette
  • Shane Foster
  • Molly Stewart
  • Sunil Pasupuleti
  • Traci Clarke
  • Tim Pritchett 
  • Kristi Richburg
  • Noah Bookman
  • Eric Jansson

Meeting Notes

Used Jam Board: https://jamboard.google.com/d/1M1MgFWnmLDcW2X4V63dcH09ubzWYM7EjnOTJJjXQqAs/edit?usp=sharing

A variety of use cases were presented:

  • Helping instructional leaders identify signs of student engagement/disengagement – e.g. via “activity gaps” or other patterns in “online presence”
  • Use of the LMS as a place to pull very current grade data
  • There were more general notes of the potential value of online LMS “events” to a variety of analytics, a la streaming user experience data
  • Use of LMS data to evaluate and recommend learning content
  • Use of assessment capability of LMS tools in a variety of tactical ways (but generally not as a substitute for a broader program of interim or formative assessments)
  • Some use cases were narrower: pulling data for teacher evaluations handled partly in LMS tools; a single-course data pull for a college-going curriculum; and pulling pre- and post-semester “student diagnostics” for courses, all driven via LMS assessment capabilities

Much of the use case conversation re-emphasized basic activity (assignments, “presence”, other transactional daily work), as that data is likely of most value to teachers in spotting engagement/disengagement.

Another common theme was the value of combining LMS data with other sources of data, particularly SIS and assessment data.

One bucketing of data presented:

  • Demographic
  • Behavioral – activity, behavior interventions, teaching and communications
  • Academic – measurement of outcomes
  • Perceptual – self perception of student learning

Some common problems with LMS data were cited:

  • Non-availability of some data useful for downstream analytics
  • Some lag times in data availability (e.g. getting older activity data)
  • Occasional lack of internal consistency in some data

The question was also raised of whether to focus on getting demographics into the LMS more reliably to improve analytics, OR on rejoining LMS data with SIS data downstream. (It was noted that Alliance work in this and other areas has generally targeted the latter.)
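As a rough illustration of the “rejoin downstream” option: the work amounts to matching LMS records to SIS records on a shared student identifier. The sketch below is hypothetical – the field names (`student_id`, `logins_last_week`, `grade_level`) are illustrative assumptions, not Ed-Fi model names.

```python
# Hypothetical sketch of rejoining LMS data with SIS data downstream.
# All field names are illustrative, not actual Ed-Fi data model names.

def join_lms_with_sis(lms_rows, sis_rows, key="student_id"):
    """Left-join LMS rows to SIS rows on a shared student identifier."""
    sis_by_id = {row[key]: row for row in sis_rows}
    joined = []
    for row in lms_rows:
        merged = dict(row)  # copy so the input rows are not mutated
        merged.update(sis_by_id.get(row[key], {}))  # add SIS fields if matched
        joined.append(merged)
    return joined

lms = [{"student_id": "s1", "logins_last_week": 4}]
sis = [{"student_id": "s1", "grade_level": "09"}]
print(join_lms_with_sis(lms, sis))
# → [{'student_id': 's1', 'logins_last_week': 4, 'grade_level': '09'}]
```

In practice the join key itself is the hard part – the two systems must share a reliable student identifier, which is part of why the meeting distinguished this option from pushing demographics into the LMS.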

Another question concerned authorization for GradeBook: how to reduce the likelihood of thrashing and/or coordinate updates by multiple parties.

Postgres support for the LMS Toolkit was also asked about – this could be a community need.

Gaps with the current Ed-Fi data model were also covered:

  • Learning content – generally not covered
  • Assignment data – generally not covered in a native LMS manner, though GradeBook is there
  • Activity data – logins/logouts, message posts, etc. generally not covered
  • GradeBook – generally needs an overhaul

In terms of focus areas for the Alliance work, there was no discrete discussion. However, the explicit suggestions that were offered focused on assignment information and activity information for students/teachers. This was generally reinforced – or at least not contradicted – by the discussion. The general sense is that an initial set of canonical use cases would look something like the following (note this was not explicitly discussed at the meeting – this is an extrapolation):

Canonical Use Case #1

Provide, at a classroom level and for a teacher, a visualization of student engagement in course work and course activities, and allow that information to be shown/combined in the context of demographic data as well as past achievement data, as evidenced by grades, assessment data, or both. The engagement picture should focus on recent trends, so as to drive and encourage action by the teacher.
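A minimal sketch of the recent-trend piece of this use case (the event shape and one-week windows below are assumptions for illustration, not anything specified at the meeting) could compare each student's activity count in the most recent week against the prior week:

```python
from datetime import date, timedelta

def weekly_activity_counts(events, today, window_days=7):
    """events: iterable of (student_id, event_date) pairs, e.g. logins
    or assignment submissions. Returns {student_id: [recent, prior]}
    counts so a teacher view can flag week-over-week activity drops.
    Window length and event shape are illustrative assumptions."""
    recent_start = today - timedelta(days=window_days)
    prior_start = today - timedelta(days=2 * window_days)
    counts = {}
    for student, when in events:
        buckets = counts.setdefault(student, [0, 0])
        if when >= recent_start:
            buckets[0] += 1      # activity in the most recent week
        elif when >= prior_start:
            buckets[1] += 1      # activity in the week before that
    return counts

events = [("s1", date(2021, 3, 29)),
          ("s1", date(2021, 3, 20)),
          ("s1", date(2021, 3, 19))]
print(weekly_activity_counts(events, today=date(2021, 3, 30)))
# → {'s1': [1, 2]}  (activity dropped from 2 events to 1)
```

A real implementation would sit downstream of the LMS Toolkit extract and join these counts with demographic and achievement data, per the use case above.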


Canonical Use Case #2

Do the same as use case #1, but at a building level and for a principal, counselor, or grade-level teacher cohort.

The overall prioritization seemed to be as follows:

  1. Assignment and activity / “presence” data
  2. Current grade data (possibly plus a GradeBook overhaul)
  3. Assessments or similar data (though some of this might fall under “presence” data)
  4. Content usage

The conversation also noted/reaffirmed the need to tackle the GradeBook/assignment issue.