Use Case Selection Process: We have some initial use cases, but we have not finalized the selection process. Before we start receiving community feedback, I suggest we agree on the following:
How to capture use cases: We have different versions of how to capture use cases, and it will be important to be consistent.
Survey responses: The GAT will want details on how long to keep the survey open, and we might want to think of ways to encourage participation. The Alliance will distribute the survey and include it in their communications, but we can certainly help get the community to participate.
i. How long should the survey stay open?
ii. How do we encourage the community to respond?
iii. How/who volunteers to review the responses and present to the group?
Important Dates
Technical Congress: Tampa, FL April 10-12
i. RVWG Session April 11 @ 9 – Should we use the time at the Technical Congress to do an initial review of the gathered use cases?
We have 3 scheduled meetings before the Congress; what does the group want to achieve in that time?
Current use cases submitted by the group are in different formats, which makes them harder to evaluate. After reviewing the format used by the AWG, the group decided to start with that document as a template and transfer the current use cases to this format. In evaluating the template, a few things to consider:
Use cases should include enough detail about data elements to allow us to evaluate commonality of use across different use cases. There might be a set of elements that can serve multiple insights, and we would want to know that.
We need to keep ADA guidelines in mind in the ultimate proposal design. Billy B. has resources from other projects that he can share.
Group members will transfer the submitted use cases to the new template and provide feedback on the format or suggested changes.
The use case template will be updated as needed based on feedback after the initial review.
Survey:
It is unclear who should be the target of the survey:
If it should be teachers, districts require approval and a process to grant access to teachers. Does it make sense to go through this approval process? Do we believe we would get quality responses from teachers?
It is unclear who the survey will reach if we rely only on the Alliance distribution list. Most likely the response rate will be low.
We need to define an acceptable number of responses to aim for. One suggestion is to conduct interviews instead of, or in addition to, sending out the survey.
50 interviews seems like a good number to hit, but we still need to see whether enough members can take on conducting these interviews.
Molly S. already participated in some data collection from teachers and administrators, and it was a big effort. She will share the data and results of that effort with the group but will not be able to conduct interviews or send out another survey, since that effort was fairly recent.
Questions:
Does the Alliance have budget to have a more formal survey conducted?
Does the survey need to be statistically significant?