2018 Federal Standard of Excellence


U.S. Department of Education

Score
8
Leadership

Did the agency have a senior staff member(s) with the authority, staff, and budget to evaluate its major programs and inform policy decisions affecting them in FY18? (Example: Chief Evaluation Officer)

  • The United States Department of Education’s (USED or ED) Institute of Education Sciences (IES), with a budget of $613.5 million in FY18, has primary responsibility for education research, evaluation, and statistics. The IES Director is appointed by the President and confirmed by the U.S. Senate, and advises the U.S. Education Secretary on research, evaluation, and statistics activities. Four Commissioners support the IES Director, including the Commissioner for the National Center for Education Evaluation and Regional Assistance (NCEE), who is responsible for planning and overseeing ED’s major evaluations. IES employed approximately 170 full-time staff in FY18, including approximately 20 staff in NCEE.
  • The Assistant Secretary for the Office of Planning, Evaluation and Policy Development (OPEPD) reports to, and advises, the Secretary on matters relating to policy development and review; program performance measurement and evaluation; and the use of data and evidence to inform decision-making. OPEPD’s Policy and Program Studies Service (PPSS) has a staff of about 20 and serves as the Department’s internal analytics office. PPSS performs data analysis and conducts short-term evaluations to support continuous improvement of program implementation, working closely with program offices and senior leadership to inform policy decisions with data and evidence.
  • IES and PPSS staff collaborate closely through ED’s Evidence Planning Group (EPG) with other senior staff from OPEPD, including Budget Service, as well as staff from the Office of Innovation and Improvement (OII), the Performance Improvement Office, and the Office of the General Counsel. EPG currently includes around 25 participants from these offices within ED. EPG supports programs and advises Department leadership and staff on how evidence can be used to improve Department programs and works to provide resources and support to staff in the use of evidence. EPG has coordinated, for example, the development of revised evidence definitions and related selection criteria for competitive grant programs that align with the Elementary and Secondary Education Act, as amended by the Every Student Succeeds Act (P.L. 114-95) (ESSA), and provided guidance on the strategic use of those definitions in the Department’s grant competitions. EPG has also facilitated cross-office alignment of investments in technical assistance related to evidence and pooling program funds for evaluations.
  • Senior officials from IES, OPEPD, and OII are part of ED’s leadership structure. Officials from OPEPD and OII weigh in on major policy decisions. OPEPD plays a leading role in forming the Department’s policy positions as expressed through annual budget requests, grant competition priorities (including evidence), and technical assistance to Congress to ensure that evidence appropriately informs policy design.
Score
9
Evaluation & Research

Did the agency have an evaluation policy, evaluation plan, and research/learning agenda(s) and did it publicly release the findings of all completed evaluations in FY18?

  • ED has a scientific integrity policy to ensure that all scientific activities (including research, development, testing, and evaluation) conducted and supported by ED are of the highest quality and integrity, and can be trusted by the public and contribute to sound decision-making. In January 2017, IES published “Evaluation Principles and Practices,” which describes the foundational principles that guide its evaluation studies and the key ways in which the principles are put into practice.
  • In addition, IES works with partners across ED, including through the EPG, to prepare and submit to Congress a biennial, forward-looking evaluation plan covering all mandated and discretionary evaluations of education programs funded under ESSA (see the FY18 plan here). IES and PPSS work with programs to understand their priorities, design appropriate studies to answer the questions being posed, and share results from relevant evaluations to help with program improvement. This serves as a research and learning agenda for ED.
  • ED’s FY 2017 Annual Performance Report and FY 2019 Annual Performance Plan includes a list of ED’s current evaluations in Appendix E, organized by topic. IES also maintains profiles of all its evaluations on its website, which include key findings, publications, and products. IES publicly releases all peer-reviewed publications from its evaluations on the IES website and also in the Education Resources Information Center (ERIC). IES announces all new evaluation findings to the public via a Newsflash and through social media (Twitter, Facebook, and YouTube). IES regularly conducts briefings on its evaluations for ED, the Office of Management and Budget, Congressional staff, and the public.
  • Finally, IES manages the Regional Educational Laboratory (REL) program, which supports districts, states, and boards of education throughout the United States to use research and evaluation in decision making. The research priorities are determined locally, but IES approves the studies and reviews the final products. All REL studies are made publicly available on the IES website.
Score
6
Resources

Did the agency invest at least 1% of program funds in evaluations in FY18? (Examples: Impact studies; implementation studies; rapid cycle evaluations; evaluation technical assistance, and capacity-building)

  • Overall spending on evaluation (about $38 million in FY18) represents 0.08% of ED’s $46.1 billion discretionary budget (excluding Pell Grants and administrative funds) in FY18. This total reflects a targeted definition of program funds dedicated to evaluation, including impact studies and implementation studies. It is important to note that the timing of evaluation projects and the type of research projects proposed by the field result in year-to-year fluctuations in this amount; these fluctuations do not reflect a change in ED’s commitment to evaluation.
  • While some evaluation funding – such as that for Special Education Studies and Evaluations – is appropriated to IES ($10.8 million in FY18), most evaluations are supported by funds appropriated to ED programs. The Evidence Planning Group (EPG) described above supports program staff that run evidence-based grant competitions and monitor evidence-based grant projects, advises Department leadership and staff on how evidence can be used to improve Department programs, and provides support to staff in the use of evidence.
  • The Education Innovation and Research (EIR) and the Supporting Effective Educator Development (SEED) programs utilize technical assistance contractors that support grantees in the design and implementation of their project-level evaluations, as well as the development of evaluation resources that are now being shared publicly on the NCEE, What Works Clearinghouse, and program websites. In FY17, using funds appropriated in previous fiscal years, about $2.8 million was dedicated to this evaluation technical assistance for EIR grantees, and about $700,000 was dedicated to supporting SEED grantees.
  • In addition to the narrowly defined figure above, ED invests in evaluation by supporting states and school districts so that they can conduct studies of their own education policies and programs. For example, IES runs annual grant competitions to support researcher-practitioner partnerships between state and local education officials and research institutions, including a new program for low-cost, short-duration evaluations. IES also awarded approximately $24 million in the Statewide Longitudinal Data Systems grant program across 16 states to support efforts related to (1) increasing use of data for decision making; (2) conducting training on data use, data tools, or accessing data and reporting systems; and (3) utilizing research and analysis results. Finally, the Regional Educational Laboratories (RELs) provide extensive technical assistance on evaluation and support research alliances that conduct implementation and impact studies on education policies and programs in ten geographic regions of the U.S., covering all states, territories, and the District of Columbia. Over $55 million was appropriated for the RELs in FY18.
  • ED emphasizes evaluation and the building of evidence in a number of its grant programs through requirements that grantees conduct or participate in evaluations. This emphasis on evaluation continues even for programs supported by an existing evidence base, as the evaluation design for these programs looks at impact with new settings, different populations, and project-specific implementation. In many instances, during grant competitions, this evaluation criterion is reviewed and scored by evaluation experts familiar with the What Works Clearinghouse standards, ensuring proposed evaluation plans are of the appropriate design and rigor. Additionally, a number of OII programs dedicate resources to evaluation technical assistance and program level evaluations.
  • IES and OII grantees are expected to make the results of their evaluations public through the Education Resources Information Center (ERIC) and other grant-funded dissemination activities. In addition, all impact evaluations funded by IES and OII are reviewed by the What Works Clearinghouse (WWC), which plays a major role in summarizing and disseminating findings from the most rigorous studies to ED and the broader field.
  • ED has an opportunity to increase its annual investment in program evaluation through the reauthorized ESEA pooled evaluation authority, which allows the Department to use funds appropriated to ESEA programs to evaluate any ESEA program included in the biennial evaluation plan prepared by IES. In addition, the ESEA permits ED to reserve up to 0.5 percent of program funding for evaluation activities.
Score
9
Performance Management / Continuous Improvement

Did the agency implement a performance management system with clear and prioritized outcome-focused goals and aligned program objectives and measures, and did it frequently collect, analyze, and use data and evidence to improve outcomes, return on investment, and other dimensions of performance in FY18? (Example: Performance stat systems)

  • ED develops a four-year strategic plan and holds quarterly data-driven progress reviews of the goals and objectives established in the plan, as required by the Government Performance and Results Act Modernization Act of 2010 (GPRAMA). ED’s FY18-22 Strategic Plan includes two parallel goals, one for P-12 and one for higher education (Strategic Objectives 1.4 and 2.2, respectively), that focus on supporting agencies and educational institutions in the identification and use of evidence-based strategies and practices. The Department’s FY 2017 Annual Performance Report and FY 2019 Annual Performance Plan includes the FY17 performance results for Strategic Objective 5.3 in the previous Strategic Plan, which included evidence-related metrics; most established targets were met.
  • Per GPRAMA’s requirement that agencies conduct quarterly data-driven performance reviews, the Office of the Deputy Secretary facilitates these discussions with the goal leaders and their teams each quarter. The Deputy Secretary chairs these meetings, which involve reviewing data submitted by the goal teams, assessing performance to date, and discussing any challenges or known risks. Office and goal leaders attend these meetings in person.
  • In addition, ED has emphasized continuous improvement in evidence-based decision-making among states and districts. In 2016, ED released non-regulatory guidance, Using Evidence to Strengthen Education Investments, which recommends a five-step decision-making process to promote continuous improvement and support better outcomes for students. This guidance has served as a framework for ED’s technical assistance related to implementation of ESSA’s evidence provisions, such as the State Support Network’s community of practice on evidence-based practices that supports nine states with selection of interventions. ED has conducted outreach to build awareness of the guidance with stakeholder groups. In addition, ED included tailored guidance for these five steps in its guidance on Title II, Part A, and Title IV of ESEA. These resources supplement ED’s substantial evidence-focused technical assistance efforts, such as:
    • RELs work in partnership with policymakers and practitioners in their regions to evaluate programs and to use evaluation findings and other research to improve academic outcomes for their students.
    • Comprehensive Centers provide support to States in planning and implementing interventions through coaching, peer-to-peer learning opportunities, and ongoing direct support.
    • The State Implementation and Scaling Up of Evidence-Based Practices Center provides tools, training modules, and resources on implementation planning and monitoring.
Score
10
Data

Did the agency collect, analyze, share, and use high-quality administrative and survey data – consistent with strong privacy protections – to improve (or help other entities improve) federal, state, and local programs in FY18? (Examples: Model data-sharing agreements or data-licensing agreements; data tagging and documentation; data standardization; open data policies)

  • ED has several resources to support the collection, analysis, and use of high-quality data in ways that protect privacy. IES’ National Center for Education Statistics (NCES) serves as the primary federal entity for collecting and analyzing data related to education. Almost all of ED’s K-12 statistical and programmatic data collections are now administered by NCES via EDFacts. NCES also collects data through national and international surveys and assessments. Administrative institutional data and statistical sample survey data for postsecondary education are collected through NCES in collaboration with the Office of Postsecondary Education (OPE) and the Office of Federal Student Aid (FSA). Some data are available through public access while others are available only through restricted data licenses. ED’s Office for Civil Rights conducts the Civil Rights Data Collection (CRDC) on key education and civil rights issues in our nation’s public schools. Additionally, the Data Strategy Team helps to coordinate data activities across the Department, and the Disclosure Review Board, the Family Policy Compliance Office (FPCO), the EDFacts Governing Board, and the Privacy Technical Assistance Center all help to ensure the quality and privacy of education data.
  • Department data are made publicly available online and can be located in the ED Data Inventory. In FY 2017, ED continued to maintain and grow the Data Inventory, ensuring the information for ED contacts is up to date and expanding the library to include additional years of existing data sets as well as adding new data sets. Additionally, ED is exploring ways to leverage revisions to a technical system so that data generated through the information collection approval process can populate new entries within the Data Inventory.
  • ED made concerted efforts to improve the availability and use of its data in FY17. With the release of the revised College Scorecard, the Department now provides newly combined data in a tool that helps students choose a school that is well-suited to meet their needs, priced affordably, and consistent with their educational and career goals. Additionally, the College Scorecard promotes the use of open data by providing the underlying data in formats that researchers and developers can use, through downloadable data files and an Application Programming Interface (API). In fall 2017, ED updated the Scorecard as part of its annual data refresh and launched a new comparison tool to further promote informed educational choices. The 2018 updates are currently underway.
  • InformED, ED’s primary open data initiative, works to improve the Department’s capacity to make public education data accessible and usable in innovative and effective ways for families, policy makers, researchers, developers, advocates, and other stakeholders. Through InformED, ED has:
    • Continued to leverage its interactive data story template and used it to deliver rich and accessible data narratives around pressing education topics. This has included launching a data story focused on the educational experiences of English learners, accessible here. Additional data stories are under development or under consideration.
    • Developed an Open Data IT plan to create an enterprise-wide solution to improve data dissemination capabilities making public data more discoverable, accessible, and usable for the public, while still protecting student privacy. The plan identified enterprise solutions to enhance open data projects at ED.
    • Supported data-informed decision-making internally by piloting data dashboards that provide data on key metrics while leveraging best practices in data visualization.
    • Continued to maintain and support ED’s data landing page to make it easier to identify and navigate to data sources and data tools from across the agency.
  • ED also continued to participate in The Opportunity Project initiative, now coordinated by the U.S. Department of Commerce. In 2017, ED participated in the initiative’s federal agency cohort of projects and worked with external developers to support the development of multiple tools. The tools centered on one of two use cases identified by ED around (1) promoting access to and interest in STEM fields, and (2) supporting States in developing data report cards.
  • ED partnered with the U.S. Department of Housing and Urban Development to deliver a webinar entitled, “Connecting Housing and Education: How a Data-Sharing Partnership Can Improve Outcomes for Children in your Community.” This webinar, which had over 1,000 registrants, largely pulled from the tool: Data Sharing Road Map: Improving Student Outcomes through Partnerships between Public Housing Agencies and School Districts, which was jointly developed by ED and HUD.
  • Additionally, ED administers the Statewide Longitudinal Data System (SLDS) program ($32.3 million in FY18), which provides grants to states to develop their education-related data infrastructure and use these data for education improvement.
Score
10
Common Evidence Standards / What Works Designations

Did the agency use a common evidence framework, guidelines, or standards to inform its research and funding decisions and did it disseminate and promote the use of evidence-based interventions through a user-friendly tool in FY18? (Example: What Works Clearinghouses)

  • ED’s evidence standards for its grant programs, as outlined in the Education Department General Administrative Regulations (EDGAR), build on ED’s What Works Clearinghouse™ (WWC) evidence standards. ED uses these same evidence standards in all of its discretionary grant competitions that use evidence to direct funds to applicants proposing to implement projects that have evidence of effectiveness and/or to build new evidence through evaluation (see criterion 8 below for more detail).
  • As noted above, EPG has coordinated the development of revised evidence definitions and related selection criteria for competitive programs that align with the Every Student Succeeds Act (ESSA) and streamline and clarify provisions for grantees. These revised definitions align with ED’s suggested criteria for states’ implementation of ESSA’s four evidence levels, included in ED’s non-regulatory guidance, Using Evidence to Strengthen Education Investments. ED also developed a fact sheet to support internal and external stakeholders in understanding the revised evidence definitions. This document has been shared with internal and external stakeholders through multiple methods, including the Office of Elementary and Secondary Education ESSA technical assistance page for grantees.
  • The Office of Innovation and Improvement (OII), in coordination with staff from IES and members of EPG, developed Evidence Requirements Checklists for the Education Innovation and Research (EIR) mid-phase and expansion grant competitions. The checklists are intended to help applicants determine what studies to include on the Evidence Form with their application for the purposes of meeting the evidence requirement. Applicants can use the checklist as an informal worksheet to understand the evidence criteria used to review studies and learn about additional evidence-related resources available online. OII also worked with IES to develop two presentations to further support applicants in submitting evidence that meets the established standards: Understanding the Evidence Definitions Used for U.S. Department of Education Programs and Using the What Works Clearinghouse (WWC) to Identify Strong or Moderate Evidence of Positive Effects from Education Interventions.
  • Additionally, in 2013 IES and the National Science Foundation issued a joint report that describes six types of research studies that can generate evidence about how to increase student learning. These principles are based, in part, on the research goal structure and expectations of IES’s National Center for Education Research (NCER) and National Center for Special Education Research (NCSER). NCER and NCSER communicate these expectations through their Requests for Applications and webinars that are archived on the IES website and available to all applicants.
  • ED’s What Works Clearinghouse™ identifies studies that provide valid and statistically significant evidence of effectiveness of a given practice, product, program, or policy (referred to as “interventions”), and disseminates summary information and reports on the WWC website. The WWC has reviewed more than 10,000 studies that are available in a searchable database, including a commitment to review all publicly available evaluation reports generated under i3 grants. The WWC released four new Practice Guides in 2016 and 2017. WWC Practice Guides are based on reviews of research and experience of practitioners and are designed to address challenges in classrooms and schools.
  • In 2017, the WWC released the version 4.0 Standards and Procedures Handbooks, which take into account improvements in research methodology. More recently, in early 2018, the WWC launched an online training system for the version 4.0 standards and procedures. The online training system is free and available to anyone, anywhere, at any time. To make information about statistically significant evidence of effectiveness available to the public more quickly, the WWC has improved its suite of online reviewer tools.
  • As noted above, in 2016, ED released non-regulatory guidance, Using Evidence to Strengthen Education Investments, which recommends a five-step decision-making process, built on common evidence definitions, to promote continuous improvement and support better outcomes for students. This guidance has served as a framework for ED’s technical assistance related to implementation of ESSA’s evidence provisions, such as the State Support Network’s community of practice on evidence-based practices that supports nine states with selection of interventions. ED has conducted outreach to build awareness of the guidance with stakeholder groups. In addition, ED included tailored guidance for these five steps in its guidance on Title II, Part A, and Title IV of the ESEA. These resources supplement ED’s substantial evidence-focused technical assistance efforts, such as those described under Performance Management above.
Score
8
Innovation

Did the agency have staff, policies, and processes in place that encouraged innovation to improve the impact of its programs in FY18? (Examples: Prizes and challenges; behavioral science trials; innovation labs/accelerators; performance partnership pilots; demonstration projects or waivers with strong evaluation requirements)

  • The Education Innovation and Research (EIR) program is ED’s primary innovation program for K–12 public education. EIR grants are focused on validating and scaling evidence-based practices, and encouraging innovative approaches to persistent challenges. The EIR program incorporates a tiered-evidence framework that supports larger awards for projects with the strongest evidence base as well as promising earlier-stage projects that are willing to undergo rigorous evaluation. Funds may be used for: (1) early-phase grants for the development, implementation, and feasibility testing of an intervention or innovation which prior research suggests has promise, in order to determine whether the intervention can improve student academic outcomes; (2) mid-phase grants for implementation and rigorous evaluation of interventions that have been successfully implemented under early-phase grants or have met similar criteria for documenting program effectiveness; and (3) expansion and replication of interventions or innovations that have been found to produce a sizable impact under a mid-phase grant or have met similar criteria for documenting program effectiveness. All grantees must carry out a rigorous independent evaluation of the effectiveness of their project.
  • ED is participating in the Performance Partnership Pilots for Disconnected Youth initiative. These pilots give state, local, and tribal governments an opportunity to test innovative new strategies to improve outcomes for low-income disconnected youth ages 14 to 24, including youth who are in foster care, homeless, young parents, involved in the justice system, unemployed, or who have dropped out or are at risk of dropping out of school.
  • ED is continuing to promote the use of data in innovative ways by engaging with developers. This includes launching a new Developer Hub and GitHub platform, which provides developers with needed information and resources, and the creation of new APIs. Additionally, ED continues to be an active participant in the Opportunity Project, which encourages the use of federal data for social good by providing a process for developers, data enthusiasts, policy leaders, and communities to co-create innovative tech solutions that expand opportunity.
  • ED is currently implementing the Experimental Sites Initiative to assess the effects of statutory and regulatory flexibility for participating institutions disbursing Title IV student aid.
  • The IES Research Grants Program supports the development and iterative testing of new, innovative approaches to improving education outcomes. IES makes research grants with a goal structure. “Goal 2: Development and Innovation” supports the development of new education curricula; instructional approaches; professional development; technology; and practices, programs, and policies that are implemented at the student-, classroom-, school-, district-, state-, or federal-level to improve student education outcomes.
  • On behalf of ED, IES also administers the Small Business Innovation Research (SBIR) program, which competes funding to small businesses that propose developing commercially viable education technology projects designed to support classroom teaching and student learning. Projects must go through an iterative development process and conduct research to determine the promise of effectiveness.
  • ED has funded a number of tools to support innovation and rigorous evaluation in the field, including the following:
    • RCT-YES is a free software tool that uses cutting-edge statistical methods to help users easily analyze data and report results from experimental and quasi-experimental impact studies of education programs.
    • Downloadable programs help users build logic models and create ongoing plans for monitoring, measuring, and tracking outcomes over time to measure program effectiveness.
    • A guide for researchers on how to conduct descriptive analysis in education to help identify and describe trends in populations, create new measures, or describe samples in studies aimed at identifying causal effects.
    • The Ed Tech Rapid Cycle Evaluation Coach, a free online tool that helps users plan, conduct, and report findings from experimental and quasi-experimental impact studies of education technology products. The tool is optimized for non-technical users and employs statistical methods that allow findings to be presented to non-technical audiences.
    • CostOut, a software tool that helps users assess the cost and cost effectiveness of education interventions.
  • ED is implementing a number of Pay for Success projects:
    • Career and Technical Education (CTE): $2 million to support the development of PFS projects to implement new or scale up existing high-quality CTE opportunities.
    • English Language Acquisition: $293,000 to conduct a feasibility study that will identify at least two promising school sites that are using evidence-based interventions for early learning dual language models where a PFS project could take shape to help scale the interventions to reach more students who need them.
    • Early Learning: $3 million for Preschool Pay for Success feasibility pilots to support innovative funding strategies to expand preschool and improve educational outcomes for 3- and 4- year-olds. These grants are allowing states, school districts, and other local government agencies to explore whether Pay for Success is a viable financing mechanism for expanding and improving preschool in their communities.
    • Technical Assistance: The Office of Special Education Programs is collaborating with early childhood technical assistance centers to educate and build capacity among state coordinators in IDEA Part C and Part B to explore using PFS to expand or improve special education services for young children.
    • In addition, ED has conducted a Pay for Success webinar series for the Comprehensive Centers.
  • The Strengthening Career and Technical Education for the 21st Century Act, passed by Congress in July 2018, authorizes a new innovation fund to “create, develop, implement, replicate, or take to scale evidence-based, field-initiated innovations” in career and technical education. The fund is authorized as part of National Activities under Section 114. The Secretary may use up to 20 percent of the funds authorized for national activities for innovation fund activities; the authorized level for national activities is $7.6 million in FY19 and gradually increases to $8.2 million by FY24.
Score
9
Use of Evidence in Five Largest Competitive Grant Programs

Did the agency use evidence of effectiveness when allocating funds from its five largest competitive grant programs in FY18? (Examples: Tiered-evidence frameworks; evidence-based funding set-asides; priority preference points or other preference scoring; Pay for Success provisions)

  • ED’s five largest competitive grant programs in FY18 are: (1) TRIO ($1.01 billion in FY18); (2) Charter Schools Program ($400 million in FY18); (3) GEAR UP ($350 million in FY18); (4) Teacher and School Leader Incentive Program (TSL) ($200 million in FY18); and (5) Comprehensive Literacy Development Grants ($190 million in FY18).
  • ED uses evidence of effectiveness when making awards in all five of these largest competitive grant programs. (1) The vast majority of TRIO funding in FY18 will be used to support continuation awards to grantees that were successful in prior competitions that awarded competitive preference priority points for projects that proposed strategies supported by moderate evidence of effectiveness, including over $300 million in Student Support Services, over $150 million in Talent Search, and nearly $400 million in the Upward Bound and Upward Bound Math and Science programs combined. (2) Under the Charter Schools Program, ED generally requires or encourages applicants to support their projects through logic models – however, applicants are not expected to develop their applications based on rigorous evidence. (3) For the 2017 competition for GEAR UP, ED used a competitive preference priority for projects based on moderate evidence of effectiveness for state and partnership grants (approximately $70 million in new awards in FY17). ED is funding continuation awards in 2018 for these evidence-based projects. Additionally, ED is conducting 2018 GEAR UP competitions (nearly $130 million) including an absolute priority for applicants proposing evidence-based strategies to improve STEM outcomes. (4) The TSL statute requires applicants to provide a description of the rationale for their project and describe how the proposed activities are evidence-based, and grantees are held to these standards in the implementation of the program. In 2018, the program is paying out continuation awards from the 2017 competition. (5) The Comprehensive Literacy Development (CLD) statute requires that grantees provide subgrants to local educational agencies that conduct evidence-based literacy interventions. Plans for the 2018 competition are forthcoming, as ED can make awards under the CLD program through September 30, 2019.
  • The Evidence Planning Group (EPG) advises program offices on ways to incorporate evidence in grant programs, including by encouraging or requiring applicants to propose projects that are based on research, and by encouraging applicants to design evaluations for their proposed projects that would build new evidence.
Score
8
Use of Evidence in Five Largest Non-Competitive Grant Programs

Did the agency use evidence of effectiveness when allocating funds from its five largest non-competitive grant programs in FY18? (Examples: Evidence-based funding set-asides; requirements to invest funds in evidence-based activities; Pay for Success provisions)

  • ED’s five largest non-competitive grant programs in FY18 included: (1) Title I Grants to LEAs ($15.8 billion); (2) IDEA Grants to States ($12.3 billion); (3) Supporting Effective Instruction State Grants ($2.1 billion); (4) Impact Aid Payments to Federally Connected Children ($1.4 billion); and (5) 21st Century Community Learning Centers ($1.2 billion).
  • ED worked with Congress in FY16 to ensure that evidence played a major role in ED’s large non-competitive grant programs in the reauthorized ESEA. As a result, Section 1003 of ESEA requires states to set aside at least 7 percent of their Title I, Part A funds for a range of activities to help school districts improve low-performing schools. School districts and individual schools are required to create action plans that include “evidence-based” interventions that demonstrate strong, moderate, or promising levels of evidence.
  • Section 4108 of ESEA authorizes school districts to invest “safe and healthy students” funds in Pay for Success initiatives. Section 1424 of ESEA authorizes school districts to invest their Title I, Part D funds (Prevention and Intervention Programs for Children and Youth Who are Neglected, Delinquent, or At-Risk) in Pay for Success initiatives.
  • ED is working to align its diverse technical assistance to best serve states, school districts, and schools as they use evidence to drive improvements in education outcomes.
Score
7
Repurpose for Results

In FY18, did the agency shift funds away from or within any practice, program, or policy that consistently failed to achieve desired outcomes? (Examples: Requiring low-performing grantees to re-compete for funding; removing ineffective interventions from allowable use of grant funds; proposing the elimination of ineffective programs through annual budget requests)

  • ED seeks to shift program funds toward more effective practices by making evidence a priority or a requirement in competitive grant applications. For grant competitions where data about current or past grantees exist, or where new evidence has emerged independent of grantee activities, ED typically reviews those data to shape the design of future competitions.
  • Additionally, ED uses evidence in competitive programs to encourage the field to shift away from less effective practices and toward more effective ones. For example, ESSA’s Education Innovation and Research (EIR) program supports the creation, development, implementation, replication, and scaling up of evidence-based, field-initiated innovations designed to improve student achievement and attainment for high-need students. IES released The Investing in Innovation Fund: Summary of 67 Evaluations, which can be used to inform efforts to move to more effective practices. The Department is exploring the results to determine which lessons can be applied to other programs.
  • The President’s FY19 Budget eliminates, streamlines, or reduces 39 discretionary programs that duplicate other programs, are ineffective, or are supported with state, local, or private funds. Major eliminations and reductions in the FY19 Budget include:
    • Supporting Effective Instruction State grants (Title II-A), a savings of $2.3 billion. The program is proposed for elimination because it lacks evidence of improving student outcomes (see pp. C-17 – C-19 of the FY19 budget request) and duplicates other ESEA program funds that may be used for professional development.
    • 21st Century Community Learning Centers program, a savings of $1.2 billion. The program lacks strong evidence of meeting its objectives, such as improving student achievement. Based on program performance data from the 2014-2015 school year, more than half of program participants had no improvement in their math and English grades and nearly 60 percent of participants attended centers for fewer than 30 days (p. C-22 in the FY19 budget request).
  • In addition, the President’s Budget proposes to streamline and consolidate programs to achieve management efficiencies, focus federal investments on activities supported by evidence, and reduce the federal role in education.
  • In the previous administration, ED worked with Congress to eliminate 50 programs, saving more than $1.2 billion, including programs like Even Start (see pp. A-72 to A-73) (-$66.5 million in FY11) and Mentoring Grants (p. G-31) (-$47.3 million in FY10), which the Department recommended eliminating based on concerns grounded in evidence.
Visit Results4America.org