RVWG 2021-11-15 Meeting

Participation

| First Name  | Last Name  | Organization                 |
|-------------|------------|------------------------------|
| Bucky       | Bush       | Northwest RESA               |
| David       | Clements   | Ed-Fi Alliance               |
| Rosh Joshua | Dhanawade  | Indiana University/INSITE    |
| Shane       | Fairbairn  | NEFEC                        |
| Stephen     | Fuqua      | Ed-Fi Alliance               |
| Vinaya      | Mayya      | Ed-Fi Alliance               |
| Happy       | Miller     | Rio Rancho Public Schools    |
| Chris       | Moffat     | Ed-Fi Alliance               |
| Jim         | Rife       | ESP Solutions Group, Inc.    |
| Emily       | Stallings  | Northwest RESA               |
| Jeremy      | Thigpen    | ESP Solutions Group, Inc.    |

The meeting was held on 2021-11-15 from 1:00 pm to 3:00 pm CT, at the 2021 Summit and via WebEx.

Meeting recording LINK

Refer to the PPT for additional details on the meeting minutes and discussions.

Agenda/Meeting Notes

  • Welcome
  • Summarize past charter and previous phase work
    • RVWG Charter
    • Rosh provided additional context: originally, the group was looking at a dashboard replacement, and INSITE and ESP both created partial replacements. The group learned that only one production deployment of the dashboards remained outstanding, and decided not to create a full replacement. Instead, the group will create demonstration systems showing what can be done with the ODS, rather than focusing on a specific visualization/output product.
  • Discuss the revised mission for the group and goals
    • Revised RVWG Charter
    • The 2021 Starter Kits did not emerge from this workgroup, though they do serve as useful demonstration visualizations.
    • From the workgroup perspective, useful to identify decision making processes for particular products and technology components used to deliver a solution - without needing to be overly prescriptive. Want to identify and document what is working in the field, and also identify gaps that need to be filled by the community and/or by the Alliance.
    • The POC developed by ESP used a React front end with a GraphQL backend that queried the Analytics Middle Tier, and demonstrated authentication and authorization. It is available on the Exchange as a reference project and framework to help show how a custom solution can be built.
      • One interesting challenge when replacing a system is narrowing its scope to what people really want… but then, can you continue providing some of the older functionality in some way, for the small number of adopters who became accustomed to the old system?
      • Stephen asked about refining the ESP POC documentation to draw more attention to the lessons learned and point to the different parts of the tool that can be used by others for learning how to build a system.
    • Need to provide tools to help train people on new interfaces and definitions. What does each data point really mean? Mature on the tech side, need to get more mature on the implementation side.
    • When talking about visualizations, should we differentiate two different kinds? 1) controlling day-to-day operations, 2) analytics functions. Is Power BI more relevant to the second case than the first? In other words, do teachers want to see analytics charts, or are they just looking at what they need to do next?
      • Shane: most of our work is for instructional coaches, which is more about the analytics, but sometimes have use cases with just a number (how many kids are expected here at this time?).
      • Can also think about analytics as predictive or prescriptive. What can we do that is predictive and put in front of a teacher?
    • Teachers and coaches do need different information. Build in flexibility so the system addresses those different needs, giving end users control to select the information relevant to them.
    • In this COVID-challenged world, many folks are so pulled into triaging immediate problems that they don’t have time to look ahead at predictions. They’re just trying to plug staffing gaps, find children, get food served. Many people only have limited bandwidth to process information: we might be solving a big problem, but the user can’t get there. How can this workgroup account for that? Many are still in survival mode.
    • Let’s make sure we document what is working right now, even if it is not the most elegant solution. Give us something to learn from.
    • The impact of a data engineer or analyst leaving is severe, even when a school is part of a collaborative with an expert to talk to. This can make it harder to keep the data flow stable and accurate. When new people come onboard, we must help them gain the “Ed-Fi skills” quickly to keep things running smoothly. This all affects the quality of what is being put in front of stakeholders. Recognize that visualizations pull from a large pool of data, and people need to be able to troubleshoot when the data look wrong.
    • Teachers have very few days available for training on tools, and the people doing the training in the district often lack the training and end-user context themselves. We need better professional development for edtech tools: not just how to use the system, but also how to use the results in the classroom (etc.) — interpretation and action.
      • SIDE NOTE FROM EDITOR: in the Ed-Fi Starter Kits, do we have enough guidance on how to get the dashboards out to the intended users and help them understand how to use them?
    • Ideally any tool needs usage analytics to help understand how / when / who is actually using a system. From that can learn where more attention and training is needed.
      • Should that kind of data be in the ODS? Might be too much data and too detailed. Telemetry data perhaps belongs in a different system, perhaps more appropriate for data lake solutions.
      • Arkansas had a Dashboard plugin for something like this.
      • If anything, maybe just store summary information.
    • Does this workgroup also cover “data out” in the sense of: how do we improve ability to get data out from the Ed-Fi ODS/API? Not just how to visualize it. Want to make sure everyone knows the best practices and that we’re thinking about improved solutions.
  • Discuss meeting cadence and line up initial schedule for "data out" demos
    • During the demonstrations we are looking for any kind of documentation to visualize the architecture, an explanation of tools used, and/or a demonstration of how the system works given the configuration you have built out.
    • We should cover the professional development training aspect as we have future demonstrations in this workgroup.
    • Need to know where a product is in the deployment process, whether it is actually being used in the field. Should we prioritize hearing about systems that are already in use so that we can hear how they are being received? Also need to get feedback on things deployed but not used.
    • David is aiming for meetings every 2-3 weeks with demonstrations.
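The ESP POC architecture discussed above (React front end, GraphQL backend over the Analytics Middle Tier) can be sketched roughly as follows. This is a minimal illustration only: the view name, field names, and key values are assumptions for demonstration, not the actual ESP POC schema, and the database layer is mocked.

```typescript
// Hypothetical sketch of the GraphQL-over-AMT pattern. All names here
// (view, fields, keys) are illustrative assumptions, not the POC schema.

type AttendanceRow = { studentKey: string; attendanceRate: number };

// Stand-in for a real database client. In the POC this layer would query
// an Analytics Middle Tier view in the ODS; here it returns mock data.
async function queryAnalyticsMiddleTier(sql: string): Promise<AttendanceRow[]> {
  return [{ studentKey: "604821", attendanceRate: 0.93 }];
}

// A thin GraphQL-style resolver: it can stay simple because the Analytics
// Middle Tier views already handle the joins and denormalization.
// (A real implementation would use parameterized queries, not string SQL.)
async function resolveStudentAttendance(args: {
  schoolKey: number;
}): Promise<AttendanceRow[]> {
  const sql = `SELECT StudentKey, AttendanceRate
               FROM analytics.StudentAttendanceView
               WHERE SchoolKey = ${args.schoolKey}`;
  return queryAnalyticsMiddleTier(sql);
}

resolveStudentAttendance({ schoolKey: 255901 }).then((rows) => {
  console.log(`rows returned: ${rows.length}`);
});
```

The design point the POC illustrates is that the GraphQL layer stays thin: the heavy lifting of reshaping ODS data lives in the Analytics Middle Tier views, so a custom front end can be swapped in without reworking the data model.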

Action Items:

  • Stephen and Jim to look at refactoring the ESP demo documentation to be more instructive for other solutions.

Next meeting: TBD