2018 Federal Standard of Excellence


Performance Management / Continuous Improvement

Did the agency implement a performance management system with clear and prioritized outcome-focused goals and aligned program objectives and measures, and did it frequently collect, analyze, and use data and evidence to improve outcomes, return on investment, and other dimensions of performance in FY18? (Example: Performance stat systems)

Score
8
Administration for Children and Families (HHS)
  • ACF’s performance management framework focuses on outcomes and aims for coordinated and results-oriented management and operations across all ACF programs.
  • ACF aims to develop performance measures that are meaningful and can be used by program managers, leadership, outside stakeholders, and Congress to assess and communicate progress. Results for these metrics are reported annually in the ACF Congressional Budget Justification. ACF reports on approximately 140 performance measures (84 outcome measures and 54 output measures) in the FY19 Congressional Budget Justification.
  • ACF is an active participant in the HHS Strategic Review process, an annual assessment of progress on key performance measures. ACF participated in the development of HHS’s FY 2018–2022 Strategic Plan, which includes ACF-specific objectives. ACF also worked with the Department to provide ACF-specific elements (primarily within Strategic Goal 3) to support the FY 2019 HHS Annual Performance Plan/Report. During 2018, ACF will continue to work with HHS on the required reporting on ACF accomplishments captured in the FY 2018–2022 HHS Strategic Plan and the quarterly Strategic Review process.
  • Individual ACF programs regularly analyze and use performance data, administrative data, and evaluation data to improve performance. Two performance management systems worth noting are the Participant Accomplishment and Grant Evaluation System (PAGES) management information system for Health Profession Opportunity Grant (HPOG) grantees and the Information, Family Outcomes, Reporting, and Management (nForm) management information system for Healthy Marriage and Responsible Fatherhood grantees. Both are web-based management information systems that are used both to track grantee progress for program management and to record grantee and participant data for research and evaluation purposes.
Score
8
Administration for Community Living
  • ACL’s Office of Performance and Evaluation (OPE) is responsible for performance management, which includes approximately 25 output and outcome measures reported in annual budget justifications. These results are reviewed annually by ACL leadership. OPE, in coordination with ACL’s Center for Management and Budget, leads ACL’s internal Continuous Process Improvement (CPI) Program, which supports efforts across the agency to gain operational efficiencies and improve performance.
  • ACL’s performance strategy presents a high-level approach to the planning, conduct, and implementation of performance management and represents ACL’s commitment to providing rigorous, relevant, and transparent performance data highlighting all the programs and initiatives ACL supports. The strategy describes (p. 2) how ACL’s “performance data is reported and tracked (1) to monitor the administration’s progress towards achieving our departmental and agency strategic goals, objectives, and priorities (2) to support ACL’s budget justifications; and (3) to monitor program performance and support improvement.” ACL develops and maintains a repository of high-quality, robust performance data on all ACL programs and business lines to demonstrate the impact of programs and services. Under Goal 4 (p. 1), ACL works to “Encourage the utilization of the performance strategy and data in policy and practice to enhance planning and decision-making enabling ACL to easily track goals, objectives and performance across the agency.” Implementation of this strategy, in combination with the center-specific learning agenda process and NIDILRR’s Long-Range Plan (described under criterion two), contributes to ACL’s development of an agency-wide learning agenda.
  • Current information about ACL performance is available in reports to Congress and Congressional budget justifications. The 2018 justification provides an overview of ACL’s performance management approach:
    • With its aging programs, ACL focuses on three categories of performance measures: (1) improving consumer outcomes; (2) effectively targeting services to vulnerable populations; and (3) improving efficiency. Each measure is representative of activities across the Aging Services Program budget, and progress toward achievement is tracked using a number of indicators.
    • ACL has implemented a quality review system (QRS) for developmental disability programs under ACL’s Administration on Intellectual and Developmental Disabilities (AIDD). The QRS uses a three-tiered model to review program compliance, outcomes, and fiscal operations and uses review results to target and coordinate technical assistance. The first tier is an annual standardized review. The second tier is a standardized, in-depth review conducted periodically by a team of reviewers. The third tier is customized monitoring for programs about which ACL has significant compliance and performance concerns.
  • ACL continues development of a formula grant monitoring framework for Older Americans Act Title III and VII state formula grants. The framework combines assessments of grantees’ progress toward program goals and objectives with identification of risk or instances of fraud, waste, and abuse.
  • Each office within ACL follows a rigorous process to develop Program Funding Plan Memoranda, which detail the office’s proposed discretionary grant and procurement activities and justify each proposed activity as consistent with ACL’s mission and performance measures. Senior leadership has established processes for using performance data in management decision-making, including a quarterly discretionary dashboard, bi-weekly reports for the Administrator/Assistant Secretary, quarterly reviews of operating budgets, quarterly managers’ meetings, and bi-weekly center director meetings.
  • NIDILRR has developed a data and performance management program to measure progress and track outcomes of competitive grant recipients. Through this program, annual grantee performance data from 2007 to 2013 have been posted and made available to all NIDILRR staff. A public summary can be accessed online.
  • Annually, all ACL grantees report performance data, which are used to inform agency budget justifications and publicly available reports and to target technical assistance that supports continuous program improvement. For example:
    • NIDILRR grantees report through a web-based platform and are required to complete a final performance report module within 90 days of the grant’s end date. This report captures overarching, summative results and insights gained from the performance period, which grant officers feed back to grantees as part of NIDILRR’s continuous quality improvement approach.
    • Many of ACL’s Disability Program grantees use a system called ACLReporting to report on performance annually or semiannually.
    • Older Americans Act Title III and VII grantees use a system called NAPIS/CARDS for annual performance reporting. In FY 2018 ACL piloted a new web-based reporting system to improve the user experience and to include additional data security and verification tools. These performance data, along with data for the OAA Title VI program and data from a national survey used for performance reporting, are available in the Aging Integrated Database.
    • The Senior Medicare Patrol (SMP) and State Health Insurance Assistance Program (SHIP) programs have complex, standalone data systems. Their performance is reported to ACL and is used to provide more focused technical assistance and support for the programs. Data from both programs are publicized in a number of ways. Each quarter, every SHIP receives a report from ACL showing how it is doing, with a county-by-county breakdown that includes Likert-scale quality ratings and quantitative analysis of its reports. Project officers use these reports to help states understand how they are performing and where they can improve.
Score
8
Corporation for National and Community Service
  • CNCS has a focused set of Agency-Wide Priority Measures derived from its 2011–2015 Strategic Plan. Every CNCS program contributes to the Agency-Wide Priority Measures, and specific grantee/sponsor measures roll up into them, as shown in the Agency-Wide Priority Measures chart. Grantees are required to select at least one national performance measure and to report performance measure data annually. CNCS encourages grantees to use these measures for continuous program improvement, and it uses the Agency-Wide Priority Measures to assess its own progress toward attaining the goals and objectives of its strategic plan.
  • The CNCS Chief of Staff has asked the agency’s Management Team of 18 Directors to identify a core business improvement to initiate in FY18 to advance the goal of strengthening core business functions (identified in the agency Transformation and Sustainability Plan). Each Director will be responsible for the project’s outcome, impacts, timeline, and metrics for determining success.
  • The Senior Corps program has invested over $1 million in various management analyses that have been and will be used to inform operations, programming, and its research agenda moving forward. Focus groups with the field were conducted by an independent contractor to improve performance measurement, recruitment practices, volunteer incentive policies, partnership development, and information sharing among offices. Case studies were also funded (and are currently underway) to better understand various dimensions of program implementation. A contract will also be awarded this fiscal year to comprehensively assess the quality of program administrative data and its potential uses for performance management.
  • The AmeriCorps NCCC program tracks five key performance indicators: (1) alignment of NCCC teams with state-identified priorities; (2) in-kind contributions from project sponsor organizations and communities; (3) employee viewpoint trends; (4) member graduation rates; and (5) number of alumni remaining in the community post-graduation. A pilot was initiated in 2014 to determine how the program might increase its effectiveness while decreasing costs. Key performance indicators (e.g., number of service hours, number of projects and sponsors, member attrition) were compared between two classes that served and graduated from the program prior to the pilot and three classes following pilot implementation. Findings from the pilot demonstrated that costs could be reduced while maintaining the same level of community service and increasing member retention in the program. The revised program model has since been implemented in three of four program locations, with the third occurring in FY18.
Score
8
Millennium Challenge Corporation
  • MCC monitors progress towards compact and threshold program results on a quarterly basis using performance indicators that are specified in the M&E Plan for each country’s investments. The M&E Plans specify indicators at all levels (process, output, and outcome) so that progress towards final results can be tracked. Every quarter, each partner country submits an Indicator Tracking Table (ITT) that shows the actual performance of each indicator relative to the baseline established before the activity began and the performance targets established in the M&E Plan. Key performance indicators and their accompanying data by country are updated every quarter and published online. MCC management and the relevant country team review these data in a formal Quarterly Performance Review meeting each quarter to assess whether results are being achieved, and they integrate this information into project management and implementation decisions.
  • In an effort to track and aggregate evidence across its entire portfolio, MCC has implemented a common indicators structure across six sectors in which it invests. In all MCC countries, projects in these six sectors – energy, land and property rights, education, WASH, transportation, and agriculture – capture evidence across a common set of indicators to allow MCC to build an agency-wide evidence base around its investments.
  • MCC also supports the creation of multidisciplinary ‘country teams’ to manage the development and implementation of each compact and threshold program. Teams usually include the following members: coordinator, economist, private sector development specialist, social inclusion and gender integration specialist, technical specialists (project-specific), M&E specialist, environmental and social performance specialist, legal counsel, and financial management and procurement specialists. From the earliest stages, these teams develop project logics and M&E frameworks supported by data and evidence, and use them to inform the development of the projects within each program. Teams meet frequently to gather evidence, discuss progress, make project design decisions, and solve problems. Prior to moving forward with a program investment, teams are encouraged to use lessons from completed evaluations to inform their work going forward.
  • Established as a key element for success in the 2016 Open Government Plan, Knowledge Management (KM) continues to be a critical priority for MCC. In FY18, the agency formed a Knowledge Management Core Team led by a newly appointed Knowledge Management Lead. The objective of this KM initiative is to better capture and disseminate intra-agency information and resources. By leveraging accumulated knowledge, MCC will be better positioned to implement country programs and achieve development impact more efficiently and effectively.
  • Throughout FY18, MCC is implementing a new reporting system intended to enhance MCC’s credibility around results, transparency, and accountability. The Star Report and its associated business process capture key information quarterly to provide a framework for results and to improve MCC’s ability to promote and disseminate learning and evidence throughout the compact and threshold program lifecycle. For each compact and threshold program, evidence is collected on performance indicators, evaluation results, partnerships, sustainability efforts, and learning, among other elements; critically, this information will be available in one report after each program ends. Through the Star Report, MCC is able to capture how and why programs achieved certain results and to provide better reporting of compact and threshold program performance to public audiences, such as Congress, other development agencies, and the academic community. Each country will have a Star Report published roughly seven months after program completion. MCC’s first Star Report will focus on the recently closed compact in Cabo Verde and will be published in September 2018.
  • MCC reports on its performance in its Agency Financial Report (AFR), which provides the results that enable the President, Congress, and the American people to assess MCC’s performance for the fiscal year. In particular, the AFR provides an overview of MCC’s programs, accomplishments, challenges, and management’s accountability over the resources entrusted to MCC. MCC also prepares an Annual Performance Report (APR) each fiscal year that is included in its Congressional Budget Justification. Together, the AFR and APR provide a comprehensive presentation and disclosure of important financial and programmatic information related to MCC’s operations and results, including a fair assessment of MCC’s leadership and stewardship of the resources entrusted to the agency. MCC provides further information about its activities in an Annual Report.
Score
8
Substance Abuse and Mental Health Services Administration
  • In 2016, SAMHSA’s Office of Financial Resources (OFR) established a Program Integrity Review Team (PIRT) staffed by representatives from each of its four Centers and managed by OFR. On a quarterly basis, three SAMHSA discretionary grant portfolios (one from each of the three program Centers) conduct a self-analysis of grantee performance based on objective performance data, financial performance, and other factors. Program staff present their self-assessments to the PIRT and receive feedback on, for example, targets of concern. In one instance, grantees were surpassing their targets by 200–300%, prompting the PIRT to suggest that the targets be re-examined as appropriate for these high-performing grantees. In addition, the Centers have historically managed internal performance review boards that periodically review grantee performance and direct corrective actions as needed.
  • A new unified data collection system, SAMHSA’s Performance Accountability & Reporting Systems (SPARS), was put into place in early 2017. Historically, the three program Centers had independent data collection systems that did not allow for global reviews of agency activities. The new system allows for greater transparency about grantee performance across Centers. SAMHSA aligns program objectives and measures through its utilization of SPARS, SAMHSA’s online data entry, reporting, technical assistance request, and training system for grantees to report timely and accurate data. SPARS is a mechanism by which SAMHSA meets requirements of the Government Performance and Results Act of 1993 (GPRA) and the GPRA Modernization Act of 2010.
  • Pursuant to the 21st Century Cures Act, SAMHSA is required to establish standards for grant programs that, among other factors, address the extent to which grantees must collect and report on required performance measures, and SAMHSA must advance the use of performance metrics recommended both by the Assistant Secretary for Planning and Evaluation (ASPE) (Sec. 6002, pp. 464-465) and by the Director of CBHSQ (Sec. 6004, p. 470). In addition, SAMHSA’s Chief Medical Officer is required to coordinate with ASPE to assess the use of performance metrics in evaluation activities and to coordinate with the Assistant Secretary to ensure programs consistently utilize appropriate performance metrics and evaluation designs (Sec. 6003, p. 468). The Assistant Secretary must also submit a biennial report to Congress that assesses the extent to which SAMHSA’s programs and activities meet goals and appropriate performance measures (Sec. 6006, p. 477).
Score
8
U.S. Agency for International Development
  • USAID partners with the U.S. Department of State to jointly develop and implement clear strategic goals, strategic objectives, and performance goals, articulated in the FY 2018–2022 U.S. Department of State – USAID Joint Strategic Plan (JSP). Indicators measuring progress on strategic goals, strategic objectives, and performance goals are collected from across the agency, in part through the annual Performance Plan and Report (PPR), and performance is reported externally through the Annual Performance Plan/Annual Performance Report (APP/APR) and the Agency Financial Report.
  • USAID also manages operational performance to ensure that the agency achieves its development objectives and aligns resources with priorities. USAID’s Performance Improvement Officer (PIO) and Program Management Improvement Officer (PMIO) lead agency efforts to use performance data for decision-making and to improve performance and operational efficiency and effectiveness. For example, the PIO and PMIO coordinate tracking of Cross-Agency Priority (CAP) Goal and Agency Priority Goal (APG) progress; leverage performance management reviews to conduct deep dives into evidence; and oversee business process reviews and other program and management assessments to ensure that the agency more efficiently and effectively achieves its mission and goals. USAID reports on APG and CAP Goal progress on www.performance.gov.
  • USAID missions develop Country Development Cooperation Strategies (CDCS) with clear goals and objectives, along with Performance Management Plans (PMPs) that identify expected results, performance indicators to measure those results, plans for data collection and analysis, and periodic reviews of performance measures so that data and evidence can be used to adapt programs for improved outcomes.
Score
8
U.S. Department of Education
  • ED develops a four-year strategic plan and holds quarterly data-driven progress reviews of the goals and objectives established in the plan, as required by the Government Performance and Results Act Modernization Act of 2010 (GPRAMA). ED’s FY18-22 Strategic Plan includes two parallel objectives, one for P-12 education and one for higher education (Strategic Objectives 1.4 and 2.2, respectively), that focus on supporting agencies and educational institutions in identifying and using evidence-based strategies and practices. The Department’s FY 2017 Annual Performance Report and FY 2019 Annual Performance Plan includes the FY17 performance results for Strategic Objective 5.3 of the previous Strategic Plan, which included evidence-related metrics whose established targets were mostly met.
  • Per GPRAMA’s requirement that agencies conduct quarterly data-driven performance reviews, the Office of the Deputy Secretary facilitates these discussions with the goal leaders and their teams each quarter. The Deputy Secretary chairs these meetings, which involve reviewing data submitted by the goal teams, assessing performance to date, and discussing any challenges or known risks. Office and goal leaders attend these meetings in person.
  • In addition, ED has emphasized continuous improvement in evidence-based decision-making among states and districts. In 2016, ED released non-regulatory guidance, Using Evidence to Strengthen Education Investments, which recommends a five-step decision-making process to promote continuous improvement and support better outcomes for students. This guidance has served as a framework for ED’s technical assistance related to implementation of ESSA’s evidence provisions, such as the State Support Network’s community of practice on evidence-based practices, which supports nine states in selecting interventions. ED has conducted outreach to build awareness of the guidance among stakeholder groups. In addition, ED included tailored guidance for these five steps in its guidance on Title II, Part A, and Title IV of ESEA. These resources supplement ED’s substantial evidence-focused technical assistance efforts, such as:
    • Regional Educational Laboratories (RELs) work in partnership with policymakers and practitioners in their regions to evaluate programs and to use evaluation findings and other research to improve academic outcomes for their students.
    • Comprehensive Centers provide support to States in planning and implementing interventions through coaching, peer-to-peer learning opportunities, and ongoing direct support.
    • The State Implementation and Scaling Up of Evidence-Based Practices Center provides tools, training modules, and resources on implementation planning and monitoring.
Score
8
U.S. Dept. of Housing & Urban Development
  • HUD complies with federal strategic planning and performance management requirements, which include quarterly reporting on Agency Priority Goals on Performance.gov. HUD documents alignment between strategic goals and supporting objectives and performance metrics in the Annual Performance Plan and Annual Performance Report, and identifies the staff assigned lead responsibility for each objective.
  • HUD has launched an effort called “Prescription for HUD,” through which senior staff review quarterly data demonstrating progress toward Secretarial priorities and Agency Priority Goals. Prescription for HUD supersedes the original HUDstat approach to performance management.
  • In FY17, HUD launched a pilot of “Standards for Success,” a new standardized data collection and reporting framework for discretionary grant programs. The framework is intended to enable grant activities to be driven by coordinated outcomes and assessed using return on investment metrics. The framework is helping to standardize data elements, measures, definitions, metrics, and reporting periods; align programmatic data elements and measures with higher-level agency priority goals and objectives; and strengthen online reporting through record-level reports for greater analysis and responsiveness of programs. In the pilot’s first year, three HUD programs participated with a subset of their grants. The pilot is building an evidence base for scaling up Standards for Success as the common reporting framework for all discretionary grant programs.
Score
9
U.S. Department of Labor
  • DOL’s Performance Management Center (PMC) is responsible for DOL’s extensive performance management system, which includes 1,000 measures that are reviewed quarterly by Department leadership. PMC leads the department’s Continuous Process Improvement (CPI) Program, which supports DOL agencies in efforts to gain operational efficiencies and improve performance. The program directs customized process improvement projects throughout the department and grows the cadre of CPI practitioners through Lean Six Sigma training.
  • PMC leads DOL’s implementation of the Government Performance and Results Act Modernization Act of 2010 (GPRAMA), including requirements such as the four-year Strategic Plan and Annual Performance Report. Using a performance stat reporting and dashboard system linked to component agencies’ annual operating plans, PMC coordinates quarterly reviews of each agency’s program performance by the Deputy Secretary to analyze progress and identify opportunities for performance improvements.
  • At the agency level, the Employment and Training Administration (ETA) recently implemented extensive performance reporting requirements for programs authorized by the Workforce Innovation and Opportunity Act (WIOA) and related workforce programs. ETA’s workforce programs use a common data layout for performance reporting, with the same data elements and definitions, which facilitates comparison of outcomes and information across programs. ETA uses this performance information to inform program policy and budgetary decisions.
  • DOL’s Chief Evaluation Office (CEO) plays an important role in facilitating interaction between program and evaluation analysts and in linking performance management with evaluation. Learning agendas, updated annually by DOL agencies in collaboration with DOL’s CEO, include program performance themes and priorities for analysis needed to refine performance measures and identify strategies for improving performance. The quarterly reviews with leadership routinely include specific discussions about improving performance and findings from recent evaluations that suggest opportunities for improvement.

Visit Results4America.org