TAG Meeting 2017-10-05

Participants

  • Silvia Brunet-Jones
  • Dan Retzlaff
  • John Raub, WI DPI
  • Fuat Aki
  • Sherod Keen
  • Jennifer Downey, Infinite Campus
  • Jon Berry
  • Geoff McElhanon
  • Britto Augustine, AZ DOE
  • Matt Warden
  • Dirk Bradley, Michigan Data Hub
  • Don Dailey
  • Nancy Wilson, Collaboration Synergy
  • Chris Moffatt
  • Eric Jansson

Materials

Notes

See Meeting PPT agenda and slides for reference

• Overview of TAG

TAG Overview (see slides)

Reviewed the current roadmap for versions, APIs, etc., and changes in the timeline
Explained what ODS Admin App v1.0 does

Enumerations SIG update (see PPT slides): Ed-Fi currently supports two; the thinking has been to wait for demand to emerge before adding more, but would it be better to address this now? See Geoff's paper. It can be hard to evaluate without getting into the specific details.
Wisconsin, for example, aggregates race codes upward using a mapping (see the sketch below).
In New Mexico, an LMS effort is trying to feed LMS information back into Ed-Fi.
There is an issue of the LMS using its own enumerations rather than those in Ed-Fi – an opportunity to discuss this in EADM with the districts involved. The Generate Project could also shine some light on this, and the intervention domain is being looked at.
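
To illustrate the kind of mapping Wisconsin describes, here is a minimal sketch of aggregating local race codes up to a coarser reporting enumeration. The codes, category names, and function are hypothetical, invented for illustration; they are not actual Wisconsin or Ed-Fi values.

```python
# Hypothetical local-to-reporting race code mapping, illustrating the
# "aggregate up with mapping" approach described above. All codes and
# category names are invented for illustration only.
LOCAL_TO_REPORTING = {
    "W1": "White",
    "B1": "Black - African American",
    "A1": "Asian",
    "A2": "Asian",  # two local codes aggregate to one reporting category
    "P1": "Native Hawaiian - Pacific Islander",
}

def to_reporting_code(local_code: str) -> str:
    """Map a local race code to its reporting-level enumeration value."""
    try:
        return LOCAL_TO_REPORTING[local_code]
    except KeyError:
        # Unmapped codes should surface as data-quality issues rather
        # than being silently dropped.
        raise ValueError(f"No reporting mapping for local code {local_code!r}")
```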

Time and resources permitting, we should take a look at support for multiple operational contexts before ODS API 3.0 is too far along. There was general agreement from the TAG members present.

Words of wisdom from SIS vendors: it is tough, and we are in the process of doing some of this with local, state, and federal codes – a tool is needed to address this. If there were a shared mapping at the state level, would that address the local issue? This needs to be thought out more with use cases.

Models for Support of Community Activity

  • What format should this take?
  • What are the various responsibilities for those involved?
  • What is the relationship to Ed-Fi technology releases?

One idea is a model where participants join beta programs with some level of alignment and membership. For example, collecting games and outcomes data from games has commonality across states, and it helps if some of us demo something – but could there be a way to make sure that the first effort is the best effort, one that involves the community and considers alignment? This raises the visibility of community engagement as a key need, and competing efforts should be avoided. Good communication is needed to share great ideas – how would that happen, and how could others who are interested be listed? One approach is to wait to see how many adopt something before putting it into core; there are currently no hard and fast models. This could also be thought of as an app store with some levels of validation. The Alliance has considered this over the past few weeks – it is a need we don't currently know how to address, including the division of responsibilities (see slide on the concept of "Sponsored" Projects).

The question is whether, if actual outcomes are shared through the Exchange, there may not be alignment to the core – the Exchange will be backed by a repository in 2018.

Community development is accelerating, and what might be doable now may not be doable in the future. The missing piece is a coordinating role tracking which players are interested in which common areas – perhaps a template for a SIG, or a list of who is interested in a given "thing," that could be maintained by the Ed-Fi Alliance. A more structured process is needed to enable collaboration. Even the alignment piece may be something the community itself could police to some extent, given a template for contributing issues or questions to an effort.

It might be hard to know which idea will have the most value. Wisconsin uses a web-based voting process (UserVoice) with school districts to identify the highest need, and it works well. Slack could also be considered for communication and voting. Would a vendor consider an idea that doesn't align to their roadmap but has high value? It would be considered case by case, based on what a lot of people want to adopt, using Wisconsin's structured prioritization. JIRA can support a lot of collaboration, but we are still trying to drive more traffic to it and have a long way to go to get the full community using that tool. People report trouble using it – it is cumbersome – so we need to figure out how to simplify it.

WDPI built a validation engine into the WISE portal. For functionality like that, would a paper explaining what we are doing be appropriate, and if enough people thought it was worthwhile, would it go on the roadmap to build? The first step would be surfacing what the functionality is so people can respond to it.

  • Action: evaluate impacts and scope of an ODS API that supports multiple operational contexts

Data Quality (Wisconsin DPI and Placid Consulting leading - see slides)
Wisconsin built validation rules and calculations to address data quality (called level 2 validations). There are also API errors that the LEA may not realize have happened, meaning the data never reaches the state; this is also a problem in Texas. Where do you apply your business rules? Could there be a generic, plug-and-play place for them, with different failure conditions for different states and a core set that applies to all states? (A sketch of this idea follows the next paragraph.)
See Empowering Ownership Slides
With a real-time API there are extra steps; the person at the keyboard is not aware of them, and that information will not flow back to them. Data integrity errors are identified every day in Arizona, and some vendors send the error with a link to the relevant screen. From a vendor perspective, some of the API errors can be solved, but how do we get the secondary errors back to the vendors? Once we get the data we have a lot more control, but we can't control what never gets in. Districts want to know which data properties they need to fix for the data that does come down to us. Would some mechanism to get level 2 data errors back be valuable? That makes a lot of sense.
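
A minimal sketch of the "generic plug-and-play" rule idea raised above: a core rule set shared by all states, plus state-specific rules registered alongside it. All names, rule IDs, and record shapes here are assumptions for illustration; this is not Ed-Fi or WISEdata code.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ValidationError:
    rule_id: str
    message: str
    resource_id: str

# A rule inspects a record and returns any errors it finds. Rules are plain
# callables so each state can register its own next to the shared core set.
CORE_RULES = []    # rules that apply in every state
STATE_RULES = {}   # state code (e.g. "WI") -> list of state-specific rules

def rule(state: Optional[str] = None):
    """Decorator registering a rule as core (state=None) or state-specific."""
    def register(fn):
        target = CORE_RULES if state is None else STATE_RULES.setdefault(state, [])
        target.append(fn)
        return fn
    return register

@rule()  # core rule: applies to all states
def enrollment_has_entry_date(record: dict):
    if record.get("type") == "studentSchoolAssociation" and not record.get("entryDate"):
        return [ValidationError("CORE-001", "Enrollment is missing an entry date",
                                record.get("id", "?"))]
    return []

@rule(state="WI")  # hypothetical level 2 validation specific to one state
def wi_grade_level_check(record: dict):
    if record.get("type") == "studentSchoolAssociation" and record.get("gradeLevel") == "PK4":
        return [ValidationError("WI-042", "Grade level not collected by WI",
                                record.get("id", "?"))]
    return []

def validate(record: dict, state: str):
    """Run the core rules plus any rules registered for the given state."""
    errors = []
    for r in CORE_RULES + STATE_RULES.get(state, []):
        errors.extend(r(record))
    return errors
```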

The 500s are a little weird. The 409s should be able to provide more context. Wisconsin can send example error messages with identifying info removed. Vendors can provide examples of the error messages they are seeing, to better strategize how to handle this. For problems happening on the way in, do you really need codes for that, or is it better to standardize the responses? Level 1 errors need to be handled by the system sending the data; level 2 errors at the next level. Skyward has offered to send examples. Some errors may need their own codes.
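
To make "standardize the responses" concrete, here is one hypothetical shape for a structured 409 body. Every field name is invented for illustration; it does not represent the actual ODS API error format.

```python
# Hypothetical standardized error payload for a 409 Conflict, giving the
# client enough context to route the failure to the right person or screen.
# All field names are invented for illustration.
conflict_response = {
    "status": 409,
    "errorCode": "REFERENCE-CONFLICT",  # stable, machine-readable code
    "resource": "studentSectionAssociations",
    "detail": "Referenced section does not exist for this school year.",
    "failingProperties": ["sectionReference.sectionIdentifier"],
    "correlationId": "example-correlation-id",  # for support lookups
}
```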

We are seeing trends where the 403s are high in the first two months and then go way down. It may be certain types of operations: in order to see whether a client has a claim for a student, a lot of erroneous errors are generated off the bat just to determine what we have to do and figure out what process to follow. A way to check this without generating an error is needed.
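
One way to check a claim without generating an error, as asked for above, might be a dedicated lightweight probe. This is purely a sketch of the idea discussed; the /authorization-probe endpoint and its semantics are invented and do not exist in the ODS API.

```python
import requests

# Purely a sketch: probe whether this client has a claim on a student before
# attempting writes, so routine capability checks do not surface as 403
# errors. The /authorization-probe endpoint is invented for illustration and
# does not exist in the ODS API.
def has_claim(base_url: str, token: str, student_unique_id: str) -> bool:
    resp = requests.get(
        f"{base_url}/authorization-probe/students/{student_unique_id}",
        headers={"Authorization": f"Bearer {token}"},
    )
    return resp.status_code == 200  # 200 = claim exists; 404 = no claim
```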

On agency-specific validations, the concept of not changing API behavior was agreed to, as doing so would result in fundamentally different APIs that would complicate vendor adoption (this confirms earlier TAG deliberations). However, the concept of allowing processes to write error objects to the ODS, which could then be read by API clients (like SIS systems), was discussed as a model. There seemed to be general agreement that this might be a good design to follow.
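
A minimal sketch of the model discussed: a validation process writes error objects into the ODS, and an API client (such as a SIS) reads them back. The /validationErrors resource, its query parameter, and its fields are assumptions for illustration, not an actual Ed-Fi resource.

```python
import requests

# Sketch of the design discussed above: a validation process writes error
# objects to the ODS, and a SIS client reads them back through the API.
# The /validationErrors resource and its fields are assumptions.
def fetch_validation_errors(base_url: str, token: str, since: str) -> list:
    resp = requests.get(
        f"{base_url}/validationErrors",
        params={"minChangeDateTime": since},  # hypothetical incremental filter
        headers={"Authorization": f"Bearer {token}"},
    )
    resp.raise_for_status()
    return resp.json()

# Example: surface errors to the person at the keyboard.
for err in fetch_validation_errors("https://ods.example.org/api", "TOKEN", "2017-10-01"):
    print(f"{err['resource']}: {err['message']}")
```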

The first goal should be to address the API errors (403s, 409s, 500s, etc.); a second stage would consider data quality and agency-specific validations.

  • Action: form a SIG on these errors and look at some real data: Infinite Campus, Skyward, FL CODE, AZ, and MI.