2018 Federal Standard of Excellence


Administration for Children and Families (HHS)

Score
8
Leadership

Did the agency have a senior staff member(s) with the authority, staff, and budget to evaluate its major programs and inform policy decisions affecting them in FY18? (Example: Chief Evaluation Officer)

  • The Administration for Children and Families’ (ACF) Deputy Assistant Secretary for Planning, Research, and Evaluation, a Senior Executive Service career official, oversees ACF’s Office of Planning, Research, and Evaluation (OPRE) and supports evaluation and other learning activities across the agency. ACF’s Deputy Assistant Secretary reports directly to the Assistant Secretary for Children and Families. ACF’s evaluation policy, which is published in the Federal Register, gives the OPRE Deputy Assistant Secretary “authority to approve the design of evaluation projects and analysis plans; and…authority to approve, release and disseminate evaluation reports.”
  • ACF’s budget for research and evaluation in FY18 is approximately $165 million. OPRE’s staffing includes 61 federal positions, filled by experts in research and evaluation methods and data analysis as well as in ACF programs, policies, and the populations they serve.
  • In the past year, ACF released evaluation impact reports on major programs including Healthy Marriage and Responsible Fatherhood programs, Personal Responsibility Education Programs, and Health Profession Opportunity Grants, as well as impact reports on strategies that can be used by ACF programs including subsidized employment and career pathways. OPRE released many other types of research reports related to ACF programs. Examples include research reports related to Temporary Assistance for Needy Families, Head Start, Child Care, the Maternal, Infant, and Early Childhood Home Visiting program, Child Welfare, the National Domestic Violence Hotline, the Domestic Victims of Human Trafficking Demonstration Projects, and Refugee Cash Assistance. Research and evaluation efforts are ongoing in other ACF program areas. The OPRE Research Library contains all publications, and the OPRE website also includes a Projects by Topic page.
  • OPRE engages in ongoing collaboration with program office staff and leadership to interpret research and evaluation findings and to identify their implications for programmatic and policy decisions. Examples of how research and evaluation findings have influenced ACF regulations and funding opportunity announcements include:
    • When ACF’s Office of Head Start significantly revised its Program Performance Standards (PPS), the regulations that define the standards and minimum requirements for Head Start services, the revisions drew from decades of research and the recommendations in the Final Report of the Secretary’s Advisory Committee on Head Start Research and Evaluation. For example, the 2016 PPS includes new and strengthened requirements for Head Start programs to use evidence-based curricula with an organized developmental scope and sequence; to use evidence-based parenting curricula; to support staff to effectively implement curricula and monitor curriculum implementation and fidelity; and to implement a research-based, coordinated coaching strategy for education staff. These requirements drew on a wide array of research efforts that developed, tested, and established an evidence base on effective curricular and professional development/coaching strategies for use in Head Start programs.
    • ACF’s Office of Child Care drew on research and evaluation findings related to eligibility re-determination, continuity of subsidy use, use of quality dollars to improve program quality, and other topics to inform regulations related to the Child Care and Development Block Grant reauthorization.
    • ACF’s Office of Family Assistance (OFA) used lessons learned and emerging findings from research and evaluation on the first round of Health Profession Opportunity Grants (HPOG) to inform the funding opportunity announcement for the second round of grants. Specifically, research findings informed the program components highlighted as important to the HPOG approach in the second round of funding. For example, based on the finding that many participants engaged in short-term training for low-wage, entry-level jobs, OFA more carefully defined the career pathways model, described specific strategies for helping participants progress along a career pathway, and identified and defined key HPOG education and training components. Based on an analysis which indicated limited collaborations with healthcare employers, OFA required second round applicants to demonstrate use of labor market information and consultation with local employers and to describe their plans for employer engagement.
Score
9
Evaluation & Research

Did the agency have an evaluation policy, evaluation plan, and research/learning agenda(s) and did it publicly release the findings of all completed evaluations in FY18?

  • ACF’s evaluation policy addresses the principles of rigor, relevance, transparency, independence, and ethics and requires ACF program, evaluation, and research staff to collaborate. For example, the policy states, “ACF program offices will consult with OPRE in developing evaluation activities.” And, “There must be strong partnerships among evaluation staff, program staff, policy-makers and service providers.” ACF established its Evaluation Policy in November 2012, and published it in the Federal Register in August 2014.
  • ACF’s annual portfolio reviews, which are publicly available on the OPRE website, describe key findings from past and recent research and evaluation work, and how ongoing projects are addressing gaps in the knowledge base and answering critical questions in the areas of family self-sufficiency, child and family development, and family strengthening, including work related to child welfare, child care, Head Start, Early Head Start, strengthening families, teen pregnancy prevention and youth development, home visiting, self-sufficiency, welfare, and employment. These portfolio reviews describe how evaluation and evidence-building activities unfold in specific ACF program and topical areas over time and how current research and evaluation initiatives build on past efforts and respond to remaining gaps in knowledge.
  • Building on this assessment of the existing evidence base and the questions being answered by ongoing research, OPRE annually updates its research plans and proposes a research and evaluation spending plan to the Assistant Secretary. This plan covers both longer-term activities that build evidence over time as well as activities to respond to current administration priorities and provide information in the near term. This plan covers areas in which Congress has currently provided authority and funding to conduct research and evaluation.
  • ACF’s evaluation policy requires that “ACF will release evaluation results regardless of findings…Evaluation reports will present comprehensive findings, including favorable, unfavorable, and null findings. ACF will release evaluation results timely – usually within two months of a report’s completion.” ACF has publicly released the findings of all completed evaluations to date. In 2017, OPRE released nearly 100 publications. OPRE publications are publicly available on the OPRE website.
Score
7
Resources

Did the agency invest at least 1% of program funds in evaluations in FY18? (Examples: Impact studies; implementation studies; rapid cycle evaluations; evaluation technical assistance, and capacity-building)

  • In FY18, ACF plans to spend approximately $165 million on research and evaluation, related technical assistance, and capacity-building, representing roughly 0.3% of ACF’s $58.6 billion FY18 budget (in addition to investments in evaluations by ACF grantees). The amount of ACF’s spending on evaluation is largely determined by Congress.
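A minimal arithmetic check of the figure above, using only the numbers cited in this bullet (rounded):

    \[
      \frac{\$165\ \text{million}}{\$58.6\ \text{billion}}
      = \frac{0.165\ \text{billion}}{58.6\ \text{billion}}
      \approx 0.0028
      \approx 0.3\%
    \]

This falls below the 1% benchmark referenced in the question above.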
Score
8
Performance Management / Continuous Improvement

Did the agency implement a performance management system with clear and prioritized outcome-focused goals and aligned program objectives and measures, and did it frequently collect, analyze, and use data and evidence to improve outcomes, return on investment, and other dimensions of performance in FY18? (Example: Performance stat systems)

  • ACF’s performance management framework focuses on outcomes and aims for coordinated and results-oriented management and operations across all ACF programs.
  • ACF aims to develop performance measures that are meaningful and can be used by program managers, leadership, outside stakeholders, and Congress to assess and communicate progress. Results for these metrics are reported annually in the ACF Congressional Budget Justification. ACF reports on approximately 140 performance measures (84 outcome measures and 54 output measures) in the FY19 Congressional Budget Justification.
  • ACF is an active participant in the HHS Strategic Review process, an annual assessment of progress on key performance measures. ACF participated in the development of HHS’s FY 2018–2022 Strategic Plan, which includes ACF-specific objectives. ACF also worked with the Department to provide ACF-specific elements (primarily within Strategic Goal 3) to support the FY19 HHS Annual Performance Plan/Report. During 2018, ACF will continue to work with HHS on the required reporting on ACF accomplishments captured in the FY 2018–2022 HHS Strategic Plan and the quarterly Strategic Review process.
  • Individual ACF programs regularly analyze and use performance data, administrative data, and evaluation data to improve performance. Two systems worth noting are the Participant Accomplishment and Grant Evaluation System (PAGES) for Health Profession Opportunity Grant (HPOG) grantees and the Information, Family Outcomes, Reporting, and Management (nForm) system for Healthy Marriage and Responsible Fatherhood grantees. Both are web-based management information systems used both to track grantee progress for program management and to record grantee and participant data for research and evaluation purposes.
Score
9
Data

Did the agency collect, analyze, share, and use high-quality administrative and survey data – consistent with strong privacy protections – to improve (or help other entities improve) federal, state, and local programs in FY18? (Examples: Model data-sharing agreements or data-licensing agreements; data tagging and documentation; data standardization; open data policies)

  • In 2016, ACF established a new Division of Data and Improvement (DDI) to provide federal leadership and resources to improve the quality, use, and sharing of data. DDI serves as ACF’s coordination point for administrative data and interoperability, with DDI staff providing support to ACF program offices and their stakeholders at all levels. DDI works to support the development of interoperable data systems, improve data quality and program integrity, and use data to build evidence and improve programs.
  • ACF’s Interoperability Initiative supports data sharing by developing standards and tools that are reusable across the country and by addressing common privacy and security requirements to mitigate risks. ACF has developed resources such as the National Human Services Interoperability Architecture, which proposes a framework to facilitate information sharing, improve service delivery, prevent fraud, and provide better outcomes for children and families; an Interoperability Toolkit to help state human services agencies connect with their health counterparts; and a Confidentiality Toolkit that supports state and local efforts by explaining rules governing confidentiality in ACF and certain related programs, by providing examples of how confidentiality requirements can be addressed, and by including sample Memoranda of Understanding and data sharing agreements. Several ACF divisions have also been instrumental in supporting cross-governmental efforts, such as the National Information Exchange Model (NIEM), which will enable human services agencies to collaborate with health, education, justice, and many other constituencies that play a role in the well-being of children and families. New pages published on the ACF website in 2018 highlight resources for Interoperability and Data Sharing and ACF Program Guidance on Sharing Administrative Data. ACF will soon announce an Interoperability Action Plan to continue this work.
  • An important element of the ACF Interoperability Action Plan is that all ACF programs will actively pursue actions that allow and encourage states and tribes to share data, including the removal of unnecessary restrictions that prevent legal, ethical, and authorized data sharing, for the benefit of clients served by these programs. ACF program offices will actively seek out opportunities to enhance and support integrated data initiatives such as coordinated case management and data-informed decision-making, and ACF will continue to expand efforts to make data available for research, evaluation, cross-program outcome measurement, and other statistical purposes to inform policymaking and program improvement. ACF and its program offices will develop and implement a Data Sharing First (DSF) strategy that starts with the assumption that data sharing is in the public interest. ACF will encourage and promote data sharing broadly, constrained only when required by law or when there are strong countervailing considerations.
  • ACF administers the Public Assistance Reporting Information System, a platform for exchanging data on benefits receipt across ACF, Department of Defense, and Veterans Affairs programs. The platform is supported by data sharing agreements among these three federal agencies and between ACF and state agencies.
  • In 2018 ACF produced a Compendium of ACF Administrative and Survey Data Resources. All major ACF person-level administrative data sets and surveys are included, including 11 administrative data sources and eight surveys. Each entry includes the following information: data ownership and staff experts, basic content, major publications and websites, available data sets (public, restricted use, in-house), restrictions on data sharing, capacity to link with other data sets along with history of such linking, data quality, and resources to collect, prepare, and analyze the data. The compendium is currently available for internal use at HHS; a public version is forthcoming.
  • ACF has numerous efforts underway to promote and support the use of data for research and improvement. Highlights of these efforts are listed below:
    • ACF has made numerous administrative and survey datasets publicly available for secondary use, such as data from the National Survey of Early Care and Education, the Child Care and Development Fund, the National Survey of Child and Adolescent Well-Being, and the Adoption and Foster Care Analysis and Reporting System, among many other examples.
    • ACF’s National Directory of New Hires has entered into data sharing agreements with numerous agencies. For example, the Department of Labor’s Chief Evaluation Office and Employment & Training Administration have interagency agreements with ACF for sharing and matching earnings data on nine different formal net impact evaluations. The NDNH Guide for Data Submission describes an agreement with the Social Security Administration to use its network for data transmission.
    • ACF’s TANF Data Innovation Project supports innovation and improved effectiveness of state TANF programs by enhancing the use of data from TANF and related human services programs. This work includes encouraging and strengthening state integrated data systems, promoting proper payments and program integrity, and enabling data analytics for TANF program improvement. The project supports the use of data for understanding the broad impact that TANF has on families, and improving knowledge of how the federal government and state partners can use data to more efficiently and effectively serve TANF clients.
    • The Family Self-Sufficiency Data Center, funded through a cooperative agreement with the University of Chicago, supports family self-sufficiency research and related activities. To date, the project has: conducted a comprehensive needs assessment; developed a prototype for a web-based data archive and analysis tool; worked with states and localities, providing modeling, analytic, and technical support to providers and users of family self-sufficiency data; and generated publicly available resources, including data models and code to help state-level data users produce analyses. The Center is currently analyzing TANF caseload data through a data sharing agreement with ACF’s Office of Family Assistance. One goal is to assess data quality and opportunities for matching with other administrative data sources; another is to produce descriptive information about caseload dynamics over time.
    • ACF is producing a resource series on Supporting the Use of Administrative Data in Early Care and Education Research. This set of resources is intended to strengthen the ability of state/territory child care administrators and their research partners to use administrative data to address policy-relevant early care and education research questions.
    • OPRE actively promotes archiving of research and evaluation data for secondary use. In FY18, ACF OPRE research contracts included a standard clause requiring contractors to make data and analyses supported through federal funds available to other researchers and to establish procedures and parameters for all aspects of data and information collection necessary to support archiving information and data collected under the contract. Many datasets from past OPRE projects are stored at archives including the ACF-funded Child Care & Early Education Research Connections site and the ICPSR data archive.
    • New pages published on the ACF website in 2018 include resources for Administrative Data for Research and Improvement.
Score
9
Common Evidence Standards / What Works Designations

Did the agency use a common evidence framework, guidelines, or standards to inform its research and funding decisions and did it disseminate and promote the use of evidence-based interventions through a user-friendly tool in FY18? (Example: What Works Clearinghouses)

  • ACF has established a common evidence framework adapted for the human services context from the framework for education research developed by the U.S. Department of Education and the National Science Foundation. The ACF framework, which includes the six types of studies delineated in the ED/NSF framework, aims to: (1) inform ACF’s investments in research and evaluation; and (2) clarify for potential grantees and others the expectations for different types of studies.
  • While ACF does not have a common evidence framework across all funding decisions, certain programs do use a common evidence framework for funding decisions. For example, the Family First Prevention Services Act (FFPSA) enables states to use funds for certain evidence-based services. ACF is currently developing an evidence framework that will be the basis for determining services eligible for funding. The Head Start Designation Renewal System determines whether Head Start and Early Head Start grants are automatically renewed, based in part on how Head Start classrooms within programs perform on the Classroom Assessment Scoring System (CLASS), an observation-based measure of the quality of teacher-child interactions. The Personal Responsibility Education Program Competitive Grants were funded to replicate effective, evidence-based program models or substantially incorporate elements of projects that have been proven to delay sexual activity, increase condom or contraceptive use for sexually active youth, and/or reduce pregnancy among youth based on a systematic evidence review.
  • ACF sponsors several user-friendly tools that disseminate and promote evidence-based interventions. In particular, several evidence reviews of human services interventions do so by rating the quality of evaluation studies (using objective standards vetted by technical experts and applied by trained, independent reviewers, similar to those used by other agencies such as the U.S. Department of Education’s What Works Clearinghouse and the U.S. Department of Labor’s CLEAR) and presenting results in a user-friendly searchable format. Reviews to date have covered teen pregnancy prevention; home visiting; marriage education and responsible fatherhood; and employment and training; and they include both ACF-sponsored and other studies.
  • Additionally, ACF is currently working to fulfill two new statutorily required evidence reviews. (1) The Consolidated Appropriations Act of 2017 directed HHS to create a “What Works Clearinghouse of Proven and Promising Projects to Move Welfare Recipients into Work” that includes “projects that used a proven approach or a promising approach in moving welfare recipients into work, based on independent, rigorous evaluations of the projects.” (2) As mentioned above, the Family First Prevention Services Act (FFPSA) enables States to use Federal funds available under parts B and E of Title IV of the Social Security Act to provide enhanced support to children and families and prevent foster care placements through the provision of evidence-based mental health and substance abuse prevention and treatment services, in-home parent skill-based programs, and kinship navigator services, and requires an independent systematic review of evidence to designate programs and services as “promising,” “supported,” and “well-supported” practices.
Score
8
Innovation

Did the agency have staff, policies, and processes in place that encouraged innovation to improve the impact of its programs in FY18? (Examples: Prizes and challenges; behavioral science trials; innovation labs/accelerators; performance partnership pilots; demonstration projects or waivers with strong evaluation requirements)

  • ACF’s Behavioral Innovations to Advance Self-Sufficiency (BIAS) project was the first major effort to apply a behavioral economics lens to programs that serve poor families in the U.S. The project conducted 15 rapid-cycle randomized tests of behavioral innovations in seven states with nearly 100,000 sample members. The results of these tests demonstrated the promise of applying insights from behavioral science to improve human services program outcomes. The Behavioral Interventions to Advance Self-Sufficiency-Next Generation (BIAS-NG) project continues ACF’s exploration of the application of behavioral science to the programs and target populations of ACF. Additionally, the Behavioral Interventions Scholars (BIS) grant program supports dissertation research that applies a behavioral science lens to research questions relevant to social services programs and policies and other issues facing low-income families.
  • ACF is in the process of procuring a contract to build knowledge about the utility of human-centered design approaches in the context of delivery of human services. Key project tasks will include expert consultation, review of the knowledge base, a synthesis of the current state of the field, and a pilot study of the feasibility of implementation of human-centered design in ACF programs. This work builds on prior work done by ACF’s Office of Family Assistance testing the utility of design thinking as a creative problem solving approach for social service organizations.
  • ACF has actively participated in the HHS IDEA Lab, an entity within HHS launched in 2013 to invest in internal innovation, leverage external innovation, and build collaborative communities to tackle cross-cutting issues of strategic importance. Projects have included the ACF Administration for Native Americans’ Application Toolkit and DataQuest: Making ACF Native Data Visible and Useful, the ACF Office of Family Assistance’s Understanding Temporary Assistance for Needy Families Through Data Visualization, and the ACF Office of Head Start’s Partnership Alignment Information Response System.
  • Several ACF grant programs are demonstration projects or allow waivers with evaluation requirements. Examples are listed below:
    • The Health Profession Opportunity Grants (HPOG) program was authorized as a demonstration program with a mandated federal evaluation. OPRE is utilizing a multi-pronged evaluation strategy to document the operations and assess the success of the HPOG program. The evaluation strategy aims to provide information on program implementation, systems change, outcomes, and impacts.
    • ACF’s Office of Child Support Enforcement administers grant-funded demonstration projects and waivers with research components. Current examples include: the Procedural Justice-Informed Alternatives to Contempt demonstration project (PJAC), which will allow grantees to examine whether incorporating procedural justice principles into child support business practices increases reliable child support payments; the Behavioral Interventions for Child Support Services Demonstration Program (BICS), which is testing how behavioral economic strategies affect child support results; the National Child Support Noncustodial Parent Employment Demonstration Project (CSPED), which is testing the efficacy of child support-led employment programs; and Parenting Time Opportunities for Children in the Child Support Program (PTOC), which is testing approaches to safely develop parenting time orders at the time child support is established.
    • ACF’s Foster Care program ($5.3 billion in FY16) has approved over 30 jurisdictions to develop and implement child welfare waiver demonstration projects to improve outcomes for children in foster care or at risk for entry or re-entry into foster care. Through these demonstrations, ACF waives provisions of law to allow flexible use of funding normally limited to foster care for other child welfare services. Many participating jurisdictions are implementing evidence-based or evidence-informed interventions, and all demonstration projects are required to have a rigorous evaluation conducted by a third-party evaluator. Although ACF does not currently have statutory authority to grant new waivers, current projects are expected to continue through September 30, 2019. General information on this program, including a fact sheet and summary of relevant legislation/policy, is available at the online Children’s Bureau portal.
Score
7
Use of Evidence in Five Largest Competitive Grant Programs

Did the agency use evidence of effectiveness when allocating funds from its five largest competitive grant programs in FY18? (Examples: Tiered-evidence frameworks; evidence-based funding set-asides; priority preference points or other preference scoring; Pay for Success provisions)

  • In FY18, the five largest competitive grant programs are: (1) Head Start ($9.9 billion); (2) Unaccompanied Children Services ($1.3 billion); (3) Early Head Start-Child Care Partnerships ($755 million, included as part of Head Start total); (4) Transitional and Medical Services ($320 million); and (5) Preschool Development Grants ($25 million).
  • ACF’s template (p. 14 in Attachment C) for competitive grant announcements includes two options, requiring grantees either to (1) collect performance management data that contributes to continuous quality improvement and is tied to the project’s logic model, or (2) conduct a rigorous evaluation for which applicants must propose an appropriate design specifying research questions, measurement, and analysis. This helps build the evidence base through ACF grantmaking.
  • In FY12, ACF significantly expanded its accountability provisions with the establishment of the Head Start Designation Renewal System (DRS). The DRS was designed to determine whether Head Start and Early Head Start programs are providing high quality comprehensive services to the children and families in their communities. Where they are not, grantees are denied automatic renewal of their grant and must apply for funding renewal through an open competition process. Those determinations are based on seven conditions, one of which looks at how Head Start classrooms within programs perform on the Classroom Assessment Scoring System (CLASS), an observation-based measure of the quality of teacher-child interactions. Data from ACF’s Head Start Family and Child Experiences Survey (FACES) and Quality Features, Dosage, Thresholds and Child Outcomes study (Q-DOT) were used to craft the regulations that created the DRS and informed key decisions in its implementation, including where to set minimum thresholds for average CLASS scores, the number of classrooms within programs to be sampled to ensure stable program-level estimates on CLASS, and the number of cycles of CLASS observations to conduct.
  • ACF has an ongoing research portfolio building evidence in Head Start. Research sponsored through Head Start funding over the past decade has provided valuable information not only to guide program improvement in Head Start itself, but also to guide the field of early childhood programming and early childhood development. Dozens of Head Start programs have collaborated with researchers in making significant contributions in terms of program innovation and evaluation, as well as the use of systematic data collection, analysis and interpretation in program operations.
  • ACF has an ongoing Study of Early Head Start-Child Care Partnerships which includes a review of the literature to summarize the current knowledge base around EHS-child care partnerships; development of a theory of change model to articulate relations among key features, characteristics, and expected outcomes of partnerships; development of approaches to measuring partnerships for existing and new data collection efforts; and the design and implementation of a descriptive study documenting the characteristics and features of EHS-child care partnerships and activities aiming to improve professional development and the quality of services to better meet families’ needs.
  • ACF’s Personal Responsibility Education Program ($75 million in FY18) includes three individual discretionary grant programs that support evidence-based competitive grants that teach youth about abstinence and contraception to prevent pregnancy and sexually transmitted infections.
  • To receive funds through ACF’s Community-Based Child Abuse Prevention (CBCAP) program ($39.8 million in FY18), states must “demonstrate an emphasis on promoting the increased use and high quality implementation of evidence-based and evidence-informed programs and practices.” CBCAP defines evidence-based and evidence-informed programs and practices along a continuum with four categories: Emerging and Evidence-Informed; Promising; Supported; and Well Supported. Programs determined to fall within specific program parameters are considered “evidence-informed” or “evidence-based” practices (EBP), as opposed to programs that have not been evaluated using any set criteria. ACF monitors progress on the percentage of program funds (most recently 61.1% in FY15) directed toward evidence-based and evidence-informed practices.
Score
7
Use of Evidence in Five Largest Non-Competitive Grant Programs

Did the agency use evidence of effectiveness when allocating funds from its five largest non-competitive grant programs in FY18? (Examples: Evidence-based funding set-asides; requirements to invest funds in evidence-based activities; Pay for Success provisions)

  • In FY18, ACF’s five largest non-competitive grant programs are: (1) Temporary Assistance for Needy Families (TANF) ($17.3 billion); (2) Child Care and Development Fund (Block Grant and Entitlement to States combined) ($8.1 billion); (3) Foster Care ($5.5 billion); (4) Child Support Enforcement Payments to States ($4.36 billion); and (5) Low Income Home Energy Assistance ($3.6 billion).
  • ACF has a long-standing and ongoing research portfolio building evidence related to TANF. Congress has recently provided ACF with additional funds and statutory requirements related to building evidence in this area. In FY17, Congress designated 0.33% of the TANF Block Grant for research, evaluation, and technical assistance, a substantial increase over previously available resources. ACF has used this money to invest in a major new research project on Building Evidence on Employment Strategies for Low-Income Families (BEES) as well as the TANF Data Innovation Project described above. Additionally, the Consolidated Appropriations Act of 2017 directed HHS to create a “What Works Clearinghouse of Proven and Promising Projects to Move Welfare Recipients into Work,” as described in criterion 6.
  • ACF has an ongoing research portfolio examining child care. Research in this area furthers our understanding of child care as a support for parental employment and for children’s developmental well-being, and of the role of child care subsidies in allowing low-income working parents to balance work and family obligations. Recently, ACF significantly increased its investment in child care research, from $14 million in FY17 to $23 million in FY18. ACF’s Office of Child Care provides evidence-based guidance based on OPRE research when providing technical assistance to grantees.
  • ACF has an ongoing research portfolio on abuse, neglect, adoption, and foster care. The child welfare research portfolio includes research on children who are maltreated or who are at risk for child maltreatment; children and families who come to the attention of child protective services; and children and families who are receiving child welfare services either in their families of origin or in substitute care settings. OPRE partners with ACF’s Children’s Bureau to conduct research covering a broad array of topics, including identification of antecedents and consequences of child maltreatment, strategies for prevention of maltreatment, and service needs and service outcomes for children who come to the attention of child welfare. ACF’s child welfare waiver demonstration projects, described above, provide further evidence in this area.
  • The Family First Prevention Services Act (FFPSA) (Division E, Title VII of the Bipartisan Budget Act of 2018) newly enables States to use Federal funds available under parts B and E of title IV of the Social Security Act to provide enhanced support to children and families and prevent foster care placements through the provision of evidence-based mental health and substance abuse prevention and treatment services, in-home parent skill-based programs, and kinship navigator services. FFPSA requires an independent systematic review of evidence to designate programs and services as “promising,” “supported,” and “well-supported” practices. Only interventions designated as evidence-based will be eligible for federal funds.
Score
8
Repurpose for Results

In FY18, did the agency shift funds away from or within any practice, policy, or program that consistently failed to achieve desired outcomes? (Examples: Requiring low-performing grantees to re-compete for funding; removing ineffective interventions from allowable use of grant funds; incentivizing or urging grant applicants to stop using ineffective practices in funding announcements; proposing the elimination of ineffective programs through annual budget requests)

  • The Head Start Designation Renewal System requires Head Start ($9.9 billion in FY18) grantees to compete for continued funding if they fail to meet criteria related to service quality, licensing and operations, and fiscal and internal controls. The 2007 Head Start Reauthorization Act made all Head Start grants renewable, five-year grants. At the end of each five-year term, grantees that are running high-quality programs have their grants renewed, while grantees that fall short of standards must compete to renew their grants. Grantees whose ratings on any of the three domains of the Classroom Assessment Scoring System, an assessment of adult-child interactions linked to improved outcomes, fall below a certain threshold or within the lowest 10 percent of grantees must also compete (see the illustrative sketch following this list).
  • ACF, in collaboration with the HHS Health Resources and Services Administration, has established criteria for evidence of effectiveness of home visiting models, and oversees the Home Visiting Evidence of Effectiveness Review (HomVEE), which determines whether models have evidence of effectiveness. To date HomVEE has reviewed evidence on 45 home visiting models, determining 20 of these to have evidence of effectiveness. Grantees must use at least 75% of their federal home visiting funds to implement one or more of these models.
  • As noted in the response to question 1, OPRE engages in ongoing collaboration with program office staff and leadership to interpret research and evaluation findings and to identify their implications for programmatic and policy decisions, including encouraging or requiring recipients of grants to use effective (and not ineffective) practices.
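To make the re-competition rule in the first bullet above concrete, here is a minimal, hypothetical sketch in Python. The three CLASS domain names come from the report; the threshold values, the per-domain reading of the “lowest 10 percent” condition, and all function and variable names are illustrative assumptions rather than the actual DRS regulation.

    from typing import Dict, List

    # CLASS domains named in the report; everything else in this sketch is hypothetical.
    CLASS_DOMAINS = ["Emotional Support", "Classroom Organization", "Instructional Support"]

    # Placeholder minimum-score thresholds; not the real DRS cutoffs.
    MIN_THRESHOLDS = {
        "Emotional Support": 5.0,
        "Classroom Organization": 5.0,
        "Instructional Support": 2.5,
    }

    def must_compete(grantee_scores: Dict[str, float],
                     all_grantee_scores: List[Dict[str, float]]) -> bool:
        """Return True if a grantee would be required to re-compete under the rule
        described above: a score on any CLASS domain below that domain's threshold,
        or in the lowest 10 percent of grantees for that domain (assumed reading)."""
        for domain in CLASS_DOMAINS:
            score = grantee_scores[domain]
            # Condition 1: the domain score falls below the minimum threshold.
            if score < MIN_THRESHOLDS[domain]:
                return True
            # Condition 2: the domain score is in the lowest 10 percent of grantees.
            ranked = sorted(g[domain] for g in all_grantee_scores)
            if ranked:
                cutoff_index = max(0, int(len(ranked) * 0.10) - 1)
                if score <= ranked[cutoff_index]:
                    return True
        return False

Calling must_compete with one grantee’s domain scores and the full set of grantee scores indicates whether that grantee would be flagged for open competition under these assumed cutoffs.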