2018 Federal Standard of Excellence

U.S. Department of Labor


Did the agency have a senior staff member(s) with the authority, staff, and budget to evaluate its major programs and inform policy decisions affecting them in FY18? (Example: Chief Evaluation Officer)

  • The U.S. Department of Labor’s (DOL) Chief Evaluation Officer is a senior official responsible for all activities of the Chief Evaluation Office (CEO) and for coordinating evaluations department-wide. In 2016, DOL’s Chief Evaluation Officer was converted to a career position, a change that more fully cements the principle of independence and reflects DOL’s commitment to institutionalizing an evidence-based culture at the agency. Evaluation results and products are approved and released by the Chief Evaluation Officer (per the DOL Evaluation Policy) and disseminated in formats appropriate to practitioners, policymakers, and evaluators.
  • The CEO includes 15 full-time staff plus a small number of contractors and one to two detailees at any given time. This staff level is augmented by staff from research and evaluation units in other DOL agencies. For example, the Employment and Training Administration (ETA) has nine FTEs dedicated to research and evaluation activities, with whom CEO coordinates extensively on the development of a learning agenda, the management of studies, and the dissemination of results. CEO staff have expertise in research and evaluation methods as well as in DOL programs and policies and the populations they serve. CEO also convenes technical working groups, whose members have deep technical and subject matter expertise, on the majority of evaluation projects. Further, CEO staff engage and collaborate with program office staff and leadership to interpret research and evaluation findings and to identify their implications for programmatic and policy decisions.
  • In FY18, the CEO is directly overseeing approximately $21.4 million in evaluation funding (this includes a direct appropriation of $8.04 million for department program evaluation and a set-aside amount of up to 0.75% of select department accounts). Additionally, many projects are co-funded with DOL agencies using programmatic dollars. CEO also collaborates with DOL program offices and other federal agencies on additional evaluations being carried out by other offices and/or supported by funds appropriated to other agencies or programs.
  • The CEO’s role is to develop and incorporate evidence and evaluation findings as appropriate, to identify knowledge gaps that might be filled by evaluations, and to convey evidence that can inform policy, program, and performance decisions. DOL’s Chief Evaluation Officer and senior staff are part of DOL’s leadership structure and play a role in forming DOL agencies’ annual budget requests, recommending how to include evidence in grant competitions, and providing technical assistance to DOL leadership to ensure that evidence informs policy design. Several mechanisms facilitate this: CEO has traditionally participated in quarterly performance meetings with DOL leadership and the Performance Management Center (PMC); CEO reviews agency operating plans and works with agencies and the PMC to coordinate performance targets, measures, and evaluation findings; quarterly meetings are held with agency leadership and staff as part of the Learning Agenda process; and meetings are held as needed to strategize around new priorities or legislative requirements.
Evaluation & Research

Did the agency have an evaluation policy, evaluation plan, and research/learning agenda(s) and did it publicly release the findings of all completed evaluations in FY18?

  • DOL has an Evaluation Policy Statement that formalizes the principles that govern all program evaluations in the department, including methodological rigor, independence, transparency, ethics, and relevance.
  • CEO works with each of the 12 operating agencies within DOL to create a learning agenda, which is rolled up into an agency-wide learning agenda (the Department evaluation plan). Learning agendas are updated every year and highlight priority questions that the operating agencies would like to answer. They are a catalyst for setting priorities, identifying questions, and conceptualizing studies that advance evidence in areas of interest to DOL agencies, the department, and the Administration.
  • CEO develops, implements, and publicly releases an annual DOL evaluation plan. The evaluation plan is based on the agency learning agendas as well as DOL’s Strategic Plan priorities, statutory requirements for evaluations, and Secretarial and Administration priorities. The evaluation plan includes the studies CEO intends to undertake in the next year using the set-aside dollars. Appropriations language requires the Chief Evaluation Officer to submit a plan to the U.S. Senate and House Committees on Appropriations outlining the evaluations that will be carried out by the Office using dollars transferred to CEO; the DOL evaluation plan serves that purpose. The 2017 plan was posted on the CEO website, and the 2018 evaluation plan is also publicly available. The evaluation plan outlines evaluations that CEO will use its budget to undertake. CEO also works with agencies to undertake evaluations and evidence-building strategies to answer other questions of interest identified in learning agendas but not undertaken directly by CEO.
  • Once contracts are awarded for new evaluation studies, study descriptions are posted on the Current Studies page of CEO’s website to provide the public with information about studies currently underway including research questions and timelines for study completion and publication of results. All DOL reports and findings are publicly released and posted on the complete reports section of the CEO website. DOL agencies, such as ETA, also post and release their own research and evaluation reports.

Did the agency invest at least 1% of program funds in evaluations in FY18? (Examples: Impact studies; implementation studies; rapid cycle evaluations; evaluation technical assistance, and capacity-building)

  • In FY18, DOL’s CEO invested approximately $21 million in evaluation and evidence building activities. This represents approximately 0.20% of DOL’s FY18 discretionary budget for agency programmatic appropriations minus salaries and expenses.
  • This amount represents only the dollars that are directly appropriated or transferred to CEO. Additionally, many DOL evaluations and research studies are supported by funds appropriated to DOL programs and/or are carried out by other offices within DOL. In fact, in addition to CEO, most agencies and program offices conduct and support evaluation activities with their own dollars. For example, ETA funds evaluations and provides evaluation technical assistance and capacity-building activities to states to help them meet evaluation and reporting requirements under the Workforce Innovation and Opportunity Act (WIOA). ETA continues funding and technical assistance to states under the Workforce Data Quality Grant Initiative (WDQI) to link earnings and workforce data with education data longitudinally (for example, in FY18, ETA will award approximately $6.0 million for WDQI grants). ETA and DOL’s Veterans’ Employment and Training Service (VETS) have also modified state workforce program reporting system requirements to include data items for a larger set of grant programs, which will improve access to administrative data for evaluation and performance management purposes. Further, several DOL agencies have separate evaluation appropriations, and DOL studies funded through individual agencies and program offices also coordinate with DOL’s CEO.
  • In many areas where DOL is undertaking evaluation activities, the evaluation budget far exceeds 1% of the budget for the program. For example, the budgets for the evaluations of a number of recent grant programs, such as the America’s Promise grant evaluation and the Reentry Grant Evaluation, are between 3% and 5% of the programmatic budget.
  • The Administration’s FY14-17 budget requests recommended allowing the U.S. Secretary of Labor to set aside up to 1% of all operating agencies’ budgets for evaluations, coordinated by CEO. In FYs 2012-2015, Congress authorized the Secretary to set aside up to 0.5% of these funds for evaluations, in addition to the separate evaluation funds that exist in many DOL agencies. In FY16-FY18, Congress authorized DOL to set aside up to 0.75% of operating agency budgets for use by CEO for evaluations.
Performance Management / Continuous Improvement

Did the agency implement a performance management system with clear and prioritized outcome-focused goals and aligned program objectives and measures, and did it frequently collect, analyze, and use data and evidence to improve outcomes, return on investment, and other dimensions of performance in FY18? (Example: Performance stat systems)

  • DOL’s Performance Management Center (PMC) is responsible for DOL’s extensive performance management system, which includes 1,000 measures that are reviewed quarterly by Department leadership. PMC leads the department’s Continuous Process Improvement (CPI) Program, which supports agencies’ efforts to gain operational efficiencies and improve performance. The program directs customized process improvement projects throughout the department and grows the cadre of CPI practitioners through Lean Six Sigma training.
  • PMC leads DOL’s implementation of the Government Performance and Results Modernization Act of 2010 (GPRMA), including requirements such as the four-year Strategic Plan and Annual Performance Report. Using a performance stat reporting and dashboard system linked to component agencies’ annual operating plans, PMC coordinates quarterly reviews of each agency’s program performance by the Deputy Secretary to analyze progress and identify opportunities for performance improvements.
  • At the agency level, ETA recently implemented extensive performance reporting requirements for programs authorized by WIOA and for related workforce programs. ETA’s workforce programs use a similar data layout for performance reporting, with the same data elements and definitions, which facilitates comparison of outcomes and information across programs. ETA uses this performance information to inform program policy and budgetary decisions.
  • An important role that DOL’s CEO plays is to facilitate interaction between program and evaluation analysts, and between performance management and evaluation. Learning agendas, updated annually by DOL agencies in collaboration with DOL’s CEO, include program performance themes and priorities for analysis needed to refine performance measures and identify strategies for improving performance. The quarterly reviews with leadership routinely include specific discussions about improving performance and about findings from recent evaluations that suggest opportunities for improvement.

Did the agency collect, analyze, share, and use high-quality administrative and survey data – consistent with strong privacy protections – to improve (or help other entities improve) federal, state, and local programs in FY17? (Examples: Model data-sharing agreements or data-licensing agreements; data tagging and documentation; data standardization; open data policies)

  • DOL makes the majority of its administrative and survey datasets publicly available for secondary use. For more information, see CEO’s Public Use Datasets and ETA’s repository of public use datasets.
  • DOL’s Bureau of Labor Statistics (BLS) (approximately $600 million in FY17) serves as the principal Federal agency responsible for measuring labor market activity, working conditions, and price changes in the economy. BLS has 110 Cooperative Agreements with 50 States and four Territories for labor market and economic data sharing. For calendar year 2016, there were 513 “letters of agreement” on data usage with academics to conduct statistical research, and eight data sharing agreements with the Bureau of Economic Analysis and the Census Bureau, for a total of 521 agreements.
  • DOL’s Employment and Training Administration (ETA) has agreements with 52 States and Territories for sharing and exchanging wage data for performance accountability purposes. In FY15, ETA began working with the Department of Education’s Office of Career, Technical, and Adult Education; Rehabilitation Services Administration; and Office of the General Counsel to revise and renegotiate these agreements so that States have better access to quarterly wage data for the performance accountability and research and evaluation requirements under the Workforce Innovation and Opportunity Act (WIOA). This work aims to expand access to wage data to Education’s Adult Education and Family Literacy Act (AEFLA) programs and Vocational Rehabilitation programs, among others. The work has continued through FY17 and is being conducted in collaboration with the State agencies that are subject to WIOA’s performance accountability and research and evaluation requirements and with the State Unemployment Insurance agencies.
  • DOL’s CEO, Employment Training Administration (ETA), and the Veterans Employment and Training Service (VETS) have worked with the U.S. Department of Health and Human Services (HHS) to develop a secure mechanism for obtaining and analyzing earnings data from the Directory of New Hires. In this past year DOL has entered into interagency data sharing agreements with HHS and obtained data to support 10 job training and employment program evaluations.
  • DOL’s worker protection agencies maintain open-data provisions on enforcement activity for firms, accessible online through the Enforcement Database (including the Mine Safety and Health Administration, Wage and Hour Division, Occupational Safety and Health Administration, and Employee Benefits Security Administration).
  • The privacy provisions for BLS and DOL’s Employment and Training Administration (ETA) are publicly available online.
  • In FY17, DOL continued efforts to improve the quality of and access to data for evaluation and performance analysis through the Data Analytics Unit in DOL’s CEO office, and through new pilots beginning in BLS to access and exchange state labor market and earnings data for statistical and evaluation purposes. The Data Analytics unit has also updated its Data Exchange and Analysis Platform (DEAP) with high processing capacity and privacy provisions to share, link, and analyze program and survey data across DOL programs and agencies and with other agencies. Internal use of DEAP is available now and public access will be available in the future.
  • The Workforce Innovation and Opportunity Act (WIOA) calls for aligned indicators of performance for WIOA-authorized programs. DOL’s Employment and Training Administration has worked within DOL and with the U.S. Department of Education to pursue the deepest WIOA alignment possible, including indicator definitions, data elements, and specifications, to improve the quality and analytic value of the data. DOL chose to include several additional DOL programs in this process, which will result in unprecedented alignment of data and definitions for 13 federal programs (11 DOL and 2 Education). DOL and ED have issued five WIOA Final Rules, all of which became effective October 18, 2016. The regulations cover WIOA programs under Titles I, II, III, and IV, in addition to other miscellaneous changes. The aligned indicators of performance are included in the DOL-ED Joint Rule for WIOA, part 677.
  • ETA continues funding and technical assistance to states under the Workforce Data Quality Initiative to link earnings and workforce data with education data longitudinally. ETA and DOL’s Veterans’ Employment and Training Service have also modified state workforce program reporting system requirements to include data items for a larger set of grant programs, which will improve access to administrative data for evaluation and performance management purposes. An example of the expanded data reporting requirements is the Homeless Veterans Reintegration Program FY16 grants.
Common Evidence Standards / What Works Designations

Did the agency use a common evidence framework, guidelines, or standards to inform its research and funding decisions and did it disseminate and promote the use of evidence-based interventions through a user-friendly tool in FY18? (Example: What Works Clearinghouses)

  • DOL uses the cross-agency Federal Evidence Framework for evaluation planning and dissemination. Additionally, DOL collaborates with other agencies (HHS, ED-IES, NSF, CNCS) on refining cross-agency evidence guidelines and developing technological procedures to link and share reviews across clearinghouses. The Interagency Evidence Framework conveys the categories of evaluations, the quality review of evaluation methodologies and results, and the use of evaluation findings. The framework is accepted department-wide.
  • DOL’s Clearinghouse for Labor Evaluation and Research (CLEAR) is an internet-based evidence clearinghouse. CLEAR’s goal is to make research on labor topics more accessible to practitioners, policymakers, researchers, and the public more broadly, so that it can inform their decisions about labor policies and programs. CLEAR identifies and summarizes many types of research, including descriptive statistical studies, outcome analyses, implementation studies, and causal impact studies. For causal impact studies, CLEAR assesses the strength of the design and methodology of studies that examine the effectiveness of particular policies and programs.
  • CLEAR reviews causal studies in a number of labor-related topic areas and assigns them a rating indicating the strength of their causal evidence. It provides an objective assessment and rating of the degree to which the research establishes the causal impact of the intervention on the outcomes of interest based on established causal evidence guidelines.
  • DOL uses the CLEAR evidence guidelines and standards to make decisions about discretionary program grants awarded using evidence-informed or evidence-based criteria. The published guidelines and standards are used to identify evidence-based programs and practices and to review studies to assess the strength of their causal evidence or to do a structured evidence review in a particular topic area or timeframe. Requests for proposals also indicate the CLEAR standards should be applied to all CEO evaluations.
  • In addition to CLEAR, ETA maintains a user-friendly technical assistance tool, Workforce System Strategies, to promote state and local service providers’ use of evidence-based interventions. It is a comprehensive database of over 1,000 profiles summarizing a wide range of findings from reports, studies, technical assistance tools, and guides that support service improvement; program management and operations; education and training; employment, retention, or advancement activities; and other workforce development topics.
  • DOL’s Evaluation Policy Statement formalizes the principles that govern all program evaluations in DOL, including methodological rigor, independence, transparency, ethics, and relevance. CLEAR standards provide DOL a tool to understand and make transparent how it defines rigor, and to assess the extent to which DOL studies meet the highest standards of methodological rigor.

Did the agency have staff, policies, and processes in place that encouraged innovation to improve the impact of its programs in FY18? (Examples: Prizes and challenges; behavioral science trials; innovation labs/accelerators; performance partnership pilots; demonstration projects or waivers with strong evaluation requirements)

  • DOL is participating in the Performance Partnership Pilots (P3) for innovative service delivery for disconnected youth which includes not only waivers and blending and braiding of federal funds, but gives bonus points in application reviews for proposing “high tier” evaluations. DOL is the lead agency for the evaluation of P3. An interim report is expected in 2018.
  • DOL recently awarded a contract to undertake a number of new behavioral insights studies and is conducting knowledge development and feasibility analyses to explore potential trials under that project. Additionally, DOL is completing a behavioral science evaluation that tested tailored outreach methods and modes targeted to women to increase their awareness of, and participation in, non-traditional occupations. The study was implemented in select sites of the American Apprenticeship grants. DOL recently completed a number of behavioral science evaluations—three in unemployment insurance, two in OSHA, one in OFCCP, and one in EBSA for pension contributions (see the CEO website for more information).
  • DOL is using Job Corps’ demonstration authority to test and evaluate innovative and promising models that could improve outcomes for youth. In 2016, DOL awarded a contract for a Job Corps pilot program, the Cascades Job Corps College and Career Academy. The pilot will test alternative strategies for the operation of Job Corps for younger participants (ages 16 to 21). Past research on the program showed that while Job Corps increased the education and earnings of program participants, it was more beneficial for youth over age 20 than for its younger participants. This pilot uses DOL’s demonstration authority to test innovative and promising strategies (including a career pathway approach that integrates academic and technical training, workforce preparation skills, and support services) to better meet the needs of this population. CEO is sponsoring a rigorous impact evaluation to examine the effectiveness of the pilot. See the study overview here.
  • DOL has two pilot projects that tested the use of a Pay for Success (PFS) financing model in pilot projects from 2013-2017. In these pilots, private for-profit and philanthropic investors paid the up-front costs of delivering an intervention designed to achieve specific outcomes within a given timeframe, knowing they would only receive a return if the project met its specific outcome targets. Both pilots employed a random assignment methodology to measure results. DOL is sponsoring a process study to document project implementation and provide information on the PFS approach for policymakers and program administrators. The first report from this study was released in 2016, documenting the development of pilots and first year of implementation. A second report will document the pilots’ longer-term operational experiences, including the extent to which the pilots achieved their performance milestones, and is expected in 2018. DOL also sits on the Supporting Social Impact Partnerships to Pay for Results interagency council and is informing implementation of the Pay for Success provisions of the Bipartisan Budget Act of 2018 (PL 115-119), based on its experience with the pilots.
  • DOL has invested more than $90 million through the ApprenticeshipUSA initiative – a national campaign bringing together a broad range of stakeholders, including employers, labor, states, and education and workforce partners, to expand and diversify Registered Apprenticeship in the United States. This includes more than $60 million for state-led strategies to grow and diversify apprenticeship, and State Accelerator Grants to help integrate apprenticeship into education and workforce systems; engage industry and other partners to expand apprenticeship to new sectors and new populations at scale; conduct outreach and work with employers to start new programs; promote greater inclusion and diversity in apprenticeship; and develop statewide and regional strategies aimed at building state capacity to support new apprenticeship programs. All of these grants include funding for data collection; additionally, ETA and CEO are conducting an evaluation of the American Apprenticeship Initiative.
  • In 2018 DOL intends to invest significantly in the Retaining Employment and Talent after Injury/Illness (RETAIN) Demonstration Project, which will test the impact of early intervention projects on stay-at-work/return-to-work outcomes. This demonstration project includes strong evaluation requirements. Grantees will be required to participate in an evaluation, which will be designed in Phase 1 and conducted during Phase 2 by an external, independent contractor.
  • CEO received grant-making authority in 2016. In January 2017, CEO awarded nine research grants aimed at supporting university-based research on workforce policies and programs. The goal is to build capacity and drive innovation among academic researchers to answer questions that will provide insight into labor policies and programs.
Use of Evidence in Five Largest Competitive Grant Programs

Did the agency use evidence of effectiveness when allocating funds from its five largest competitive grant programs in FY18? (Examples: Tiered-evidence frameworks; evidence-based funding set-asides; priority preference points or other preference scoring; Pay for Success provisions)

  • In FY18, the five largest competitive grant programs awarded were: (1) YouthBuild ($85 million); (2) Reentry Projects ($84 million); (3) RETAIN Demonstration Projects ($63 million); (4) Indian and Native American Employment and Training Program ($62 million); and (5) National Health Emergency (NHE) Dislocated Worker Demonstration Grants ($21 million).
  • All grantees have been or will be involved in evaluations designed by CEO and the relevant DOL agencies. In each case DOL required or encouraged grantees (through language in the funding announcement and proposal review criteria) to use evidence-based models or strategies in grant interventions and/or to participate in an evaluation, especially to test new interventions that theory or research suggest are promising.
  • DOL includes rigorous evaluation requirements in all competitive grant programs, involving either: (1) full participation in a national evaluation as a condition of grant receipt; (2) an independent third-party local or grantee evaluation with priority incentives for rigorous designs (e.g., tiered funding, scoring priorities, bonus scoring for evidence-based interventions, or multi-site rigorous tests); or (3) full participation in an evaluation as well as rigorous grantee (or local) evaluations.
  • For example, the YouthBuild funding announcement required applicants to demonstrate how their project design is informed by the existing evidence base on disadvantaged youth serving social programs, and in particular disadvantaged youth workforce development programs. The funding announcement also contained language stating that grantees are required to participate in an evaluation as a condition of grant award should DOL undertake one. DOL funded an evaluation of YouthBuild using a randomized controlled trial. The evaluation included 75 programs across the country and nearly 4,000 young people who enrolled in the study between 2011 and 2013. The final report, presenting the program’s effects on young people after four years, will be released in 2018.
  • The Reentry Projects grant program used a tiered evidence framework to require applicants to propose evidence-based or evidence-informed interventions, new interventions that theory or research suggests are promising, or a combination of both, that lead to increased employment outcomes for their target populations. Applicants must frame their goals and objectives to address this issue and are able to select and implement different program services and/or features of program models. The grant funding announcement includes examples of previous studies and evaluations that DOL has conducted on reentry programs, as well as other evidence-based and promising practices, and applicants were encouraged to review these resources prior to designing their intervention. DOL currently has an evaluation of the Reentry Projects grant program underway.
  • The Retaining Employment and Talent after Injury/Illness (RETAIN) Demonstration Project will test the impact of early intervention projects on stay-at-work/return-to-work outcomes (see here for the funding announcement). This project builds on current evidence and includes a rigorous evaluation. The demonstration will be structured and funded in two phases. The initial period of performance will be 18 months and will include planning and start-up activities, including the launch of a small pilot demonstration and an evaluability assessment. At the conclusion of the initial period of performance, a subset of awardees will be competitively awarded supplemental funding to implement the demonstration projects. Awardees will be required to participate in an evaluation, which will be designed in Phase 1 and conducted during Phase 2 by an external, independent contractor.
  • The funding announcement for the Native American Employment and Training Program required applicants to demonstrate how their project design is informed by the existing evidence base on disadvantaged youth serving social programs, and in particular disadvantaged youth workforce development programs, and contains language stating that grantees are required to participate in an evaluation as a condition of grant award should DOL undertake one.
  • The funding announcement for the National Health Emergency (NHE) Dislocated Worker Demonstration Grants states that a primary goal is to test innovative approaches to address the economic and workforce-related impacts of the opioid crisis, and contains language stating that grantees are required to participate in an evaluation as a condition of grant award. The purpose of these grants is to enable eligible applicants to serve or retrain workers in communities impacted by the health and economic effects of widespread opioid use, addiction, and overdose.
Use of Evidence in Five Largest Non-Competitive Grant Programs

Did the agency use evidence of effectiveness when allocating funds from its five largest non-competitive grant programs in FY18? (Examples: Evidence-based funding set-asides; requirements to invest funds in evidence-based activities; Pay for Success provisions)

  • In FY18/PY18, the five largest non-competitive grant programs at DOL are in the Employment and Training Administration: (1) Unemployment Insurance state grants ($2.5 billion in FY18); (2) Employment Service program state grants ($666 million in PY18); and three programs authorized under WIOA: (3) the Youth Workforce Investment program ($903 million in PY18); (4) the Adult Employment and Training program ($845 million in PY18); and (5) the Dislocated Workers Employment and Training program ($1 billion in PY18).
  • All ETA grant programs allocate funding by statute, and all include performance metrics (e.g., unemployment insurance payment integrity, WIOA common measures) tracked quarterly.
  • A signature feature of WIOA (Pub. L. 113-128) is its focus on the use of data and evidence to improve services and outcomes, particularly in provisions related to states’ role in conducting evaluations and research, as well as in requirements regarding data collection, performance standards, and state planning. Conducting evaluations is a required statewide activity, but there are additional requirements regarding coordination (with other state agencies and federal evaluations under WIOA), dissemination, and provision of data and other information for federal evaluations.
  • WIOA includes evidence and performance provisions which: (1) increased the amount of WIOA funds states can set aside and distribute directly from 5-10% to 15% and authorized them to invest these funds in Pay for Performance initiatives; (2) authorized states to invest their own workforce development funds, as well as non-federal resources, in Pay for Performance initiatives; (3) authorized local workforce investment boards to invest up to 10% of their WIOA funds in Pay for Performance initiatives; and (4) authorized states and local workforce investment boards to award Pay for Performance contracts to intermediaries, community based organizations, and community colleges.
  • Currently, ETA is working with states to implement the requirements of the Reemployment Services and Eligibility Assessments (RESEA) program, an Unemployment Insurance state grant program to assist individuals receiving UI benefits with reemployment. RESEA is an evidence-based strategy that combines an assessment of continuing Unemployment Insurance (UI) eligibility with the provision of reemployment services and referrals to other workforce partners. RESEA and its predecessor, Reemployment Eligibility Assessments (REA), were authorized and funded via federal appropriations acts from 2005 to 2018, with grants provided to states; 51 states and territories currently offer RESEA. The Bipartisan Budget Act of 2018 made RESEA a permanent, but still voluntary, program to serve UI claimants. DOL currently has implementation and impact evaluations of the REA program underway and intends to begin an evaluation of RESEA this year.
  • Evaluations of WIOA core programs are currently underway. For example, the WIOA implementation study is examining states’ implementation of the core workforce programs authorized under WIOA’s Title I (Adult, Dislocated Worker, and Youth) and Title III (Employment Services). It will also explore WIOA’s role in stakeholder integration among programs authorized under Titles II (Adult Education and Literacy) and IV (Vocational Rehabilitation). Additionally, two studies examining American Job Centers are underway or recently completed: the Study of American Job Center Customer Experience and the American Job Center Institutional Analysis.
Repurpose for Results

In FY18, did the agency shift funds away from or within any practice, program, or policy that consistently failed to achieve desired outcomes? (Examples: Requiring low-performing grantees to re-compete for funding; removing ineffective interventions from allowable use of grant funds; proposing the elimination of ineffective programs through annual budget requests)

  • DOL’s evidence-based strategy is focused on program performance improvement and expansion of strategies and programs on which there is evidence of positive impact from rigorous evaluations. DOL uses both evaluation results and program performance measures to make decisions about future funding.
  • Since 2014, DOL has closed three Job Corps centers based on chronic low performance. Closing underperforming centers allows DOL to shift limited program dollars to centers that better serve students by providing the training and credentials they need to achieve positive employment and educational outcomes. In a Federal Register notice published in August 2017, DOL announced a center closure and the methodology used to select centers for closure.
  • Discretionary grant performance is closely monitored and has been used to take corrective action and make decisions about continued funding. For example, YouthBuild grant funding is based heavily on past performance: organizations that have previously received and completed a YouthBuild grant are awarded up to 28 points (almost 30% of their total score) based on demonstrated past performance. This effectively screens out low-performing grantees from winning future awards. (For more information, see the Grant Funding Announcement.)
  • Additionally, DOL uses evidence in competitive grant programs to encourage the field to shift away from less effective practices and toward more effective ones. For example, recent grant programs such as the Reentry Grant Program use a tiered evidence framework that requires grantees to use evidence-based approaches and prohibits approaches that have been found to be ineffective. This supports the creation, development, implementation, replication, and scaling up of evidence-based practices designed to improve outcomes, and disincentivizes grantees from using approaches that lack supporting evidence.
  • DOL’s FY19 budget request prioritizes programs with demonstrated evidence (e.g., it proposes investing $200 million in apprenticeships, a proven strategy) and proposes reductions to unproven strategies (e.g., it proposes shifting Job Corps dollars away from serving younger youth, for whom past studies have not found positive long-term impacts, to focus the program on the older youth for whom it has been shown to be more effective).