See also Annual Data Collection and Reporting Cycle.
Data Systems in Support of Accountability and Continuous Improvement (aka 'Metrics White Paper - Draft') describes the intent of the Data Project. The full draft was presented to the PESB at its November 2011 regular meeting.
- Evaluate and determine, pursuant to RCW, whether each institution’s program is in compliance with the program approval standards of PESB WAC;
- Develop guidance for the PESB and programs related to planning and offering of educator preparation programs; and
- Compile aggregate non-personally identifiable information for the general public.
This commitment to redesign is further reflected in the PESB’s 2011-2015 strategic plan, which states as a goal and outcome:
Establish transparency and public accountability for preparation program quality and program approval that is clearly linked to the success of program completers, as measured by student-based evidence.
By 2015, PESB teacher and principal preparation program oversight and approval will incorporate measures of educator effectiveness, including aggregate results from the statewide evaluation system and the Washington Teacher Performance Assessment...
Such a system contains records that must:
- be accurate
- be comparable across institutions (apples to apples)
- be quickly accessible to programs (reports)
- be countable
- reflect the practices of the program.
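As an illustration only, the record requirements above could be expressed as simple validation checks on a candidate record. This sketch is not part of any PESB specification; the field names and the shared status vocabulary are hypothetical assumptions:

```python
# Hypothetical sketch: checking that a program's candidate records meet the
# requirements listed above (complete, countable, comparable across
# institutions). Field names and allowed values are illustrative only.

REQUIRED_FIELDS = {"candidate_id", "institution", "program", "completion_status"}

# "Comparable across institutions": every program draws status values
# from the same shared vocabulary.
ALLOWED_STATUSES = {"completed", "enrolled", "withdrawn"}

def validate_record(record: dict) -> list[str]:
    """Return a list of problems found in a single candidate record."""
    problems = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        problems.append(f"missing fields: {sorted(missing)}")
    if record.get("completion_status") not in ALLOWED_STATUSES:
        problems.append("completion_status not from shared vocabulary")
    return problems

record = {"candidate_id": "C001", "institution": "Example U",
          "program": "Elementary Ed", "completion_status": "completed"}
print(validate_record(record))  # [] -> record is accurate and countable
```

A record that validates cleanly at every institution can be aggregated and reported without manual reconciliation, which is the point of the list above.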
Our current system of program review uses the same questions and requests the same range of data from every program at the same set points in time, regardless of indicators of a program’s current or past performance. This design weakens any opportunity to guide systemic inquiry or to provide assistance to programs that are struggling. The purpose of building a framework of evidence is to support both program-level and state-level inquiry. For example, a program may collect information about recruiting practices that increase the diversity of the program’s field staff. This information could be used to determine how this change influences the program completion of candidates with diverse backgrounds. At the same time, the PESB may look to research outcomes to discuss changes to the effectiveness measures being reviewed for approval. Such a system is intended to be responsive to the questions that arise as programs continuously review and inquire about their practice, and it will mean that answers to the program review questions change over time.
The PESB does want to eliminate unnecessary reporting, but a data system useful for ongoing program improvement as well as accountability cannot just be about streamlining reporting. It must move beyond the current system, whose orientation focuses on “how will this data be used / reported?” and “how will it ‘count’?” rather than “how will we use this data to improve our program?” and “how can we demonstrate to PESB our program’s successes?” As a result, more data will be collected by and from programs over time. The transition to a new system is significant, but over the long term programs and PESB will use key indicators of this system to: drive deeper inquiry into programs that are demonstrating strong candidate outcomes; allow greater regulatory flexibility for programs demonstrating solid performance on key indicators; and provide information to the general public.
Program standards will continue to serve as the expectations by which programs are evaluated, but over time the evaluation tool will be increasingly populated with new metrics of evidence related to the standards, with more data available on an ongoing basis - annually or as requested - rather than presented only as part of program site visits. The diagram below depicts the new inquiry cycles, both in relationship to providing the PESB ongoing insight into program quality and how to support its continuous improvement, and the manner in which we envision the PESB’s cycle of inquiry and that of individual programs naturally intersecting.
1. Deciding on the data elements in the new framework / establishing a system for incorporating newly defined measures over time
PESB has expectations for programs and data, but many of these expectations are implied rather than explicit. For instance, PESB has implied that data surrounding the WEST tests are important, but we have not explicitly outlined what elements should be captured or what role that data plays in the review or approval process. We have implied expectations for other information, such as entrance requirements, “signature” assessments, elements about the clinical experience, and elements to be collected and maintained about a candidate’s success after completion. This data is largely unstructured and as such does not lend itself well to research or evaluating programs or systems.
2. Establishing new data systems to produce the data
Explicitly defined and structured data, i.e. what to collect and how it will be used to inform the approval process, can provide information that can be more easily collected, stored, and accessed, as well as support research to define best practices and evaluation based on predictable indicators and clearly defined outcomes. But data can only be collected, stored and accessed when appropriately structured and properly maintained.
Programs need to develop the capacity to collect, access, and retain this evidence over multiple years. The metrics for meeting each standard will be clearly specified, and programs will be able to determine, at or before the data submissions, the degree to which the program has made progress toward or met each standard. Any improvements or technical support deemed necessary will be data driven.
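To make the idea of clearly specified metrics concrete, the check a program might run before a data submission can be sketched as below. The standard names, metric names, and thresholds here are illustrative assumptions, not PESB-adopted values:

```python
# Hypothetical sketch: a program checking, ahead of its data submission,
# where it stands against specified metrics for each standard.
# Standards, metrics, and thresholds are illustrative assumptions only.

STANDARD_METRICS = {
    "Standard A (candidate outcomes)": {"assessment_pass_rate": 0.80},
    "Standard B (field experience)": {"mentor_training_rate": 0.90},
}

def progress_report(observed: dict) -> dict:
    """Map each standard to 'met' or 'progressing' given observed metric values."""
    report = {}
    for standard, thresholds in STANDARD_METRICS.items():
        met = all(observed.get(metric, 0.0) >= threshold
                  for metric, threshold in thresholds.items())
        report[standard] = "met" if met else "progressing"
    return report

print(progress_report({"assessment_pass_rate": 0.85,
                       "mentor_training_rate": 0.70}))
# {'Standard A (candidate outcomes)': 'met',
#  'Standard B (field experience)': 'progressing'}
```

Because the metrics are specified in advance, the program can see before submitting which standards would be flagged as “progressing” and plan its improvements accordingly.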
While the PESB program reapproval cycle may continue to occur on a five- or seven-year cycle, annual review of data points and trends in results can flag areas of concern earlier in that cycle, at which time PESB program staff can contact those programs for further inquiry into these indications. Such inquiry may result in a) no action by PESB, i.e. program leadership established internal responses to address the flag; b) technical assistance from PESB to assist the program in determining actions to address the flag (such a request could be program initiated or PESB initiated); or c) Board action according to WAC 181-78A-110(3).
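The annual flagging idea above can be sketched as a simple trend check: compare this year’s value for an indicator against the program’s recent history and flag large deviations for staff follow-up. The 10-point tolerance used here is an illustrative assumption, not a PESB rule:

```python
# Hypothetical sketch of the annual flagging idea: flag an indicator when
# this year's value falls well below the program's recent baseline, so
# staff can follow up earlier than the 5- or 7-year reapproval review.
# The tolerance value is an illustrative assumption only.

def flag_indicator(history: list[float], current: float,
                   tolerance: float = 10.0) -> bool:
    """Flag if `current` falls more than `tolerance` below the historical mean."""
    baseline = sum(history) / len(history)
    return current < baseline - tolerance

# Pass rates (percent) over prior years, then this year's submission:
print(flag_indicator([88.0, 90.0, 89.0], 72.0))  # True  -> staff contact program
print(flag_indicator([88.0, 90.0, 89.0], 86.0))  # False -> no action
```

A flag in this sketch triggers only the inquiry step; the outcome (no action, technical assistance, or Board action) still depends on the conversation with the program, as described above.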
Use of the framework of evidence model standardizes the metrics gathered from each program and provides clear comparison of information across programs for consumers, e.g. potential candidates, funders, etc. The model also customizes the regulatory oversight function of the PESB to the needs of each institution. Examples of this standardized customization include:
- A program’s data submissions do not trigger any flags over the course of its approval status (five or seven years). In such a case, PESB’s reapproval review will look different from the review needed for a program for which a pattern of assistance has been indicated.
- A program makes a change that results in a flag. PESB queries the program about the flag.
- The program’s responses could include, among others:
- “we were expecting the data outcome variance due to the programmatic change and this is our plan for response”;
- “we were expecting a change, but not this one; can you help us identify assistance about next steps”;
- “we weren’t expecting any change; now what?”; or
- “change? what change?”.
- Each of these program responses to the PESB query could result in different PESB action depending on the program’s status in the review cycle.
- PESB anticipates a number of requests for assistance to support programs’ data collection, review, and submission capacities. Should the reapproval review for a program needing this assistance occur while the program continues its capacity building in this area, the reapproval review team may need to include a member with specific data systems experience. By contrast, a program on the same approval timeline but without flags around this indicator would not necessarily require a similarly composed review team.
Such customization of PESB responses to the needs of the program reduces the costs of reapproval review for both the agency and the program, and it increases the efficiency and effectiveness of the reapproval process.