Ed-Fi Dashboards 2019-07-31 Meeting notes

Date

Attendees

Goals

  • Ed-Fi Dashboard Implementation Notes/Lessons Learned
  • Next Steps

Discussion items

Implementation Lessons Learned (Itzel Torres)
  • Delaware: Older v1.x implementation. The interview confirmed the state survey feedback: most usage is by administrators, with not much in the classroom. A decision on an upgrade is pending; they are working on other technology projects for the next school year and will start to look into visualization options later, with no strategy or direction decided yet.
    • Fairly Comprehensive Extensions:
      • Student Level metrics, from an early version implementation prior to the Plugin Architecture, so their extensions were managed with the MetricsDB tool.
      • Most of their drilldowns are customized
      • Early Learning and Usage module shared in the Exchange, but packaging it for community contribution carried a heavy cost.
      • NWEA Add-in functionality is very complex; even with the new options for extensibility, it is not easy to follow the extension guidelines.
      • As many custom SSIS packages as core metric packages.
    • UI experience in classroom:
      • Educators see the benefit but don't have the time to use it; dashboard data can't keep up with the changes seen in the classroom.
      • Since the data is not transactional, teachers might open the dashboard only every few weeks, so it is not going to get a lot of use in the classroom.
      • Administrators get more use out of it, and it is great for roll-ups. It serves a different purpose: for admins, superintendents, etc., it is a great quick overview. For teachers, attendance, for example, is seen every day, so there is not a lot of added value.
      • Assessments in the dashboards currently don't drill down in a way that effectively changes the path at the educator level; it takes up to 72 hours for information to show up, which is too late to change instruction.
      • Great tool for parent-teacher conferences, to look at a student overview with parents, but not to affect classroom instruction effectively. The overall view and the common language and consistency in how information is presented make it easier to have better conversations with parents. Note: Will reach out to YesPrep, as they built a parent-teacher dashboard for precisely this purpose, with the ODS as the source and a simpler dashboard implementation cost.
      • The early warning system was updated after implementation; based on research, they updated the early warning indicators.
      • Extended the UI to add more tabs and summary options.
  • Pennsylvania: Older 1.x implementation
    • Dashboards are offered as a state add-on, but, as in other states, district buy-in is a struggle. For LEAs to use them, there is a burden to get the data loaded in a way that will show up in the dashboards, and most don't see it as worth it for what they get.
    • Using the dashboards and other data visualization options like PowerBI for reporting.
    • Pain points are the technology stack, which is customized and does not seem as solid as other product offerings. They are in the middle of a project, so availability is limited, but they are very interested in participating and in seeing feedback from this group.
    • Discussions around build vs. buy seem to lack a clear understanding of long-term costs.

Survey Review
  • Interviews with Dashboard implementations so far have confirmed the state survey feedback.
  • The LEA survey questions are too specific to the dashboards, and it is unlikely that districts would have been able to answer them. We will define a new set of more open and general questions for districts to get a better sense of:
    • Technical acumen/resources available
    • Biggest pain points for administrators and educators; we want to verify that what is top of mind for districts is most likely not the same as for states
    • Budget/spend on technology implementations. We want to understand the effort and cost of implementation and maintenance so that new implementations understand the true cost of implementing reporting and visualizations.

All implementations we've talked to so far have mentioned that they did not fully understand the true cost of implementation and of the custom extensions they were designing. It is hard to evaluate the cost of ownership and maintenance without a reference or guidelines. The group would like to provide a better sense of the types of customizations and their estimated long-term cost.


Next Steps
  • Create an interview document with an initial set of questions for districts.
  • Create a quick upgrade guide for legacy Ed-Fi Dashboard implementations with a high-level overview of the changes that need to be addressed for an upgrade to the latest 3.x code base.
  • Create a quick customization guide for Ed-Fi Dashboards to give implementations a better understanding of the options and, if possible, to evaluate the types of customizations that will increase maintenance cost and code complexity.

We will cancel our next working session to give members time to review these guides and to gather any additional questions and comments from our general working session with the rest of the group.


Action items