SIS SIG - Meeting 20 - 2025.12.9

Participants

  • Nathan Gandomi, Ed-Fi Alliance
  • Kathleen Browning, Ed-Fi Alliance
  • Maria Ragone, Ed-Fi Alliance
  • Sean Casey, Ed-Fi Alliance
  • Anita Nesby, Focus
  • Josh Bergman, Skyward
  • Oscar Ortega, Edupoint
  • Meg Morgan, Jupiter

Agenda

  • Sending Assessment data

  • Common issues across state implementations and data hubs

  • Next meeting January 20, 2026 11:00am-12:15pm CT

    • Integration testing

    • Ed-Fi OneRoster API update (slides)


Discussion

1. Sending Assessment data

  • How widespread is the use of the Assessment domain among SIS vendors?

  • SIS vendors utilize additional resources outside typical certification processes, including assessment data. Some vendors successfully send assessment data into the ODS via APIs. Community members have expressed interest in understanding broader use of data models by vendors.

  • Next steps:

    • Assessment special interest group to also include SIS vendors who are utilizing the assessment data model.

    • Ed-Fi Alliance to gather information on other SIS vendors that are utilizing additional resources outside of the typical certification process.

2. Common issues across state implementations and data hubs

  • Next steps: Ed-Fi Alliance to act on the list of “What we can do differently” items derived from community discussion (see below)

 

Challenges, mitigations, current activities, and what we can do differently:

  1. Opaque Governance

    Mitigation:
      • Strong implementation partner that understands SIS vendor pain points; open channel to share feedback across partners/agencies
      • First-draft review by vendors (the current process shares a near-final or final draft). This review should happen before the SEA does significant work.
      • Plan with data available (see flipped model)

    What we’re doing currently:
      • Updated Implementation Partner guidance/requirements
      • Badging process considers reproducibility

    What we can do differently:
      • Official guidance: SEA should give early drafts
      • Recommend this integrated approach as a process improvement for SEAs; include vendors in the process
      • Core + Extension model alignment by SIS community

  2. Spec Instability

    Mitigation:
      • Avoid accelerating the timeline by shortening the draft/review/analysis period. Sticking to the timeline reduces rework and future complications.
      • Flipped model: turn on core Ed-Fi data, reconstruct calculations from the existing data, and identify gaps in the data. Collect all data via a core profile, then build from the existing data. (+2)

    What we’re doing currently:
      • Stick to the timeline
      • Recommend using experienced partners

    What we can do differently:
      • Prescribe the flipped model (start with core, then map after)
      • Propose a program to review state specs and extensions, and establish a review process. The Alliance is willing to review any new state's specs and extensions and propose changes. Offer direct support to agencies.

  3. Legacy Systems Persist

    Mitigation:
      • Recommend that states collect granular data and drive logic from that granular data.
      • In some cases (funding, membership, accountability), if the state is not sure how to address a requirement from granular data, it can collect the calculation rather than risk guessing wrong. It would rather support a calculation from a legacy system than an extended version in Ed-Fi: "If you don't KNOW what granular data you need, don't guess, just collect the calculated version."

    What we can do differently:
      • Produce stronger guidance on when to retire CSV pipelines.
      • Make the costs of legacy systems and dual reporting visible to agencies. How can we do this?

    Discussion:
      • A large burden arises when the agency continues to update requirements in the legacy system and pipelines. If they leave the system as-is, it is not as hard to maintain; when they keep it in sync with current reporting, maintenance gets more painful.
      • In addition, comparisons are being made between the Ed-Fi API reporting system and the updated legacy system, and it is not a 1:1 comparison. The Ed-Fi model changes some data elements and logic.
      • Recommendation: update guidance to identify the scenarios that occur with persistent legacy systems, and note that maintaining and updating legacy systems to current requirements incurs additional work and possible cost.

  4. Parallel Year

    What we can do differently:
      • Should we recommend a different process? (e.g., TN, SC)

    Discussion:
      • The community supports revising guidance to encourage turning on core data first for better implementation outcomes.
      • The group favored this approach as long as those involved in the newly recommended approach are also involved in creating the new guidance and process.
      • The Alliance will convene discussions with the teams involved in Tennessee's implementation to gather lessons learned and update the implementation playbook guidance.

  5. Underestimated LEA training needs and change management

    Mitigation:
      • When states understand the time this takes and do not rush it, the rollout goes better. States that try to shortcut, rush, or underfund it put too much burden on districts; they need to understand that it is a mindset shift for districts.
      • Spec instability makes this worse: districts are trained on one process, which then changes the following year.
      • Starting with easier chunks to collect first helps with retraining. For example, start with simpler collections, like demographics, that are closer to 1:1.

    What we’re doing currently:
      • Change management is discussed in the SEA playbook.

    What we can do differently:
      • Develop an operational model:
         – Integrate with state data conferences
         – Roadshow via ESCs/regional supports
         – LEA stakeholder meetings
      • Convene an SEA workgroup to recognize the issue and publish the model.

  6. Support for multiple API versions without tooling or funding