2018 Federal Standard of Excellence


Substance Abuse and Mental Health Services Administration (HHS)¹

Score
7
Leadership

Did the agency have a senior staff member(s) with the authority, staff, and budget to evaluate its major programs and inform policy decisions affecting them in FY18? (Example: Chief Evaluation Officer)

  • The director of the Substance Abuse and Mental Health Services Administration’s (SAMHSA) Center for Behavioral Health Statistics and Quality (CBHSQ) Division of Evaluation, Analysis and Quality (DEAQ) serves as the agency’s evaluation lead, with key evaluation staff housed in this division. In addition, the agency’s chief medical officer (CMO), as described in the 21st Century Cures Act, plays a key role in shaping evaluation approaches and promoting the use of evidence-based programs and practices among grantees; a collaborative approach between CBHSQ and the Office of the CMO is being established to ensure broad agency evaluation oversight by senior staff. The Office of the CMO is housed within the agency’s emerging Mental Health Policy Lab (currently the Office of Policy, Planning and Innovation) and will influence evaluation policy decisions across the agency in a more systematic manner. In January 2018, SAMHSA announced the creation of the National Mental Health and Substance Use Policy Lab, which is designed to “play a central role in shaping SAMHSA’s efforts to bring more science to the evidence-based practices used in the prevention, treatment, and support services being provided by behavioral health practitioners and other clinicians.”
  • SAMHSA’s Office of Policy, Planning and Innovation (OPPI) provides policy perspectives and guidance to raise awareness around SAMHSA’s research and behavioral health agenda. OPPI also facilitates the adoption of data-driven practices among other federal agencies and partners such as the National Institutes of Health, the Centers for Disease Control and Prevention, and the Centers for Medicare and Medicaid Services.
  • At this time, evaluation authority, staff, and resources are decentralized and found throughout the agency. SAMHSA is composed of four Centers: the Center for Mental Health Services (CMHS), the Center for Substance Abuse Treatment (CSAT), the Center for Substance Abuse Prevention (CSAP), and the Center for Behavioral Health Statistics and Quality (CBHSQ). CMHS, CSAT, and CSAP oversee grantee portfolios and evaluations of those portfolios. Evaluation decisions within SAMHSA are made within each Center according to its program priorities and resources, and each of the three program Centers uses its program funds to conduct evaluations of varying types. CBHSQ, SAMHSA’s research arm, provides varying levels of oversight and guidance to the Centers for evaluation activities. CBHSQ also provides technical assistance related to data collection and analysis to assist in the development of evaluation tools and clearance packages.
  • SAMHSA evaluations are funded from program funds that are used for service grants, technical assistance, and evaluation activities; evaluations have also been funded with funds recycled from grants or other contract activities. Given the broad landscape of evaluation authority and funding, a variety of evaluation models have been implemented. These include recent evaluations funded and managed by the program Centers (e.g., First Episode Psychosis, FEP); evaluations funded by the Centers but directed outside of SAMHSA (e.g., Assisted Outpatient Treatment, AOT); and those that CBHSQ directly funds and executes (e.g., Primary and Behavioral Health Care Integration, PBHCI, and the Cures-funded Opioid State Targeted Response funding). Evaluations require different degrees of independence to ensure objectivity, and the models above afford SAMHSA the latitude to enhance evaluation rigor and independence on a customized basis.
Score
7
Evaluation & Research

Did the agency have an evaluation policy, evaluation plan, and research/learning agenda(s) and did it publicly release the findings of all completed evaluations in FY18?

  • SAMHSA’s Evaluation Policy and Procedure (P&P), revised and approved in May 2017, provides guidance across the agency regarding all program evaluations. Specifically, the Evaluation P&P describes the demand for rigor, compliance with ethical standards, and compliance with privacy requirements for all program evaluations conducted and funded by the agency. The Evaluation P&P serves as the agency’s formal evaluation plan and includes a new process for the public release of final evaluation reports, including findings from evaluations deemed significant. The Evaluation P&P sets the framework for planning, monitoring, and disseminating findings from significant evaluations.
  • The Evaluation P&P requires Centers to identify research questions and appropriately match the type of evaluation to the maturity of the program. A new workgroup formed in 2017, the Cross-Center Evaluation Review Board (CCERB), composed of Center evaluation experts, began reviewing significant evaluations at critical milestones in the planning and implementation process, providing specific recommendations to the Center Director leading each evaluation.
  • SAMHSA’s Cross-Center Evaluation Review Board (CCERB) worked with the four Centers within SAMHSA (CSAP, CMHS, CSAT, and CBHSQ) to advise on, conduct, collaborate on, and coordinate all evaluation and data collection activities that occur within SAMHSA. CCERB staff provided support for program-specific and administration-wide evaluations. SAMHSA’s CMO also played a key role in reviewing evaluation proposals and clearing final reports.
  • Results from significant evaluations will be available on SAMHSA’s website, a new step SAMHSA took with its newly approved Evaluation P&P in the fall of 2017. As of July 2018, one summary was posted on the website – a process evaluation of the Safe Schools/Healthy Students (SS/HS) State Program. No other evaluation summaries, including those for ongoing evaluation studies, have been posted. Significant evaluations are those that the Center Director has identified as providing compelling information and results that can be used to make data-driven, evidence-based, and informed decisions about behavioral health programs and policy. The following criteria are used to determine whether an evaluation is significant: (1) whether the evaluation was mandated by Congress; (2) whether there are high-priority needs in states and communities; (3) whether the evaluation is for a new or congressionally mandated program; (4) the extent to which the program is linked to key agency initiatives; (5) the level of funding; (6) the level of interest from internal and external stakeholders; and (7) the potential to inform practice, policy, and/or budgetary decision-making.
  • CBHSQ is currently leading agency-wide efforts to build SAMHSA’s learning agenda. Through this process, the agency has developed agency-wide Learning Agenda templates in the critical topic areas of opioids, serious mental illness, serious emotional disturbance, suicide, health economics and financing, and marijuana; learning agendas focused on other key topic areas, such as alcohol, are underway as well. Other topics, such as cross-cutting issues related to vulnerable populations, are interwoven through these research plans. Through this multi-phased process, CBHSQ is systematically collecting information from across the agency regarding research and analytic activities, then analyzing and organizing this information into a guiding framework for decision-making related to priorities and resource allocation. SAMHSA began this process in early 2017 and planned to complete it in the winter of 2018. The template for opioid abuse, the first topic addressed in this effort and thus the most complete to date, has been used to determine research questions along with the current activities underway across the agency that are relevant to these areas. The template followed the construct outlined by OMB in the publication entitled Analytical Perspectives, Budget of the U.S. Government, Fiscal Year 2018.
  • SAMHSA’s Data Integrity Statement outlines how CBHSQ adheres to federal guidelines designed to ensure the quality, integrity, and credibility of statistical activities.
  • SAMHSA’s National Behavioral Health Quality Framework, aligned with the U.S. Department of Health and Human Services’ National Quality Strategy, is a framework to help providers, facilities, payers, and communities better track and report the quality of behavioral health care. These metrics are focused primarily on high-rate behavioral health events such as depression, alcohol misuse, and tobacco cessation, all of which impact health and health care management and thus affect a large swath of the U.S. population.
Score
0
Resources

Did the agency invest at least 1% of program funds in evaluations in FY18? (Examples: Impact studies; implementation studies; rapid cycle evaluations; evaluation technical assistance, and capacity-building)

  • RFA was unable to determine the amount of resources SAMHSA invested in evaluations in FY18.²
  • ²RFA was unable to determine the amount of resources SAMHSA invested in evaluations in FY18 for criterion #3. Therefore, to tally a final score, RFA scored criterion #3 a zero and reduced the denominator from 100 to 90 points.
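The footnote’s rescoring rule can be illustrated with a short sketch. This is an assumed reading of RFA’s tally (the exact formula is not reproduced in this document): the ten criterion scores shown on this page are summed, criterion #3 contributes zero, and the denominator is reduced from 100 to 90 points.

```python
# Sketch of the adjusted tally described in the footnote (assumed reading;
# RFA's exact formula is not reproduced here). Criterion #3 (Resources) is
# scored 0 and the maximum drops from 100 to 90 points.
criterion_scores = [7, 7, 0, 8, 8, 7, 8, 7, 8, 1]  # scores shown in this document

max_per_criterion = 10
denominator = max_per_criterion * len(criterion_scores) - max_per_criterion  # 100 -> 90
total = sum(criterion_scores)  # criterion #3 contributes 0 to the numerator

percentage = 100 * total / denominator
print(f"{total}/{denominator} = {percentage:.1f}%")
```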
Score
8
Performance Management / Continuous Improvement

Did the agency implement a performance management system with clear and prioritized outcome-focused goals and aligned program objectives and measures, and did it frequently collect, analyze, and use data and evidence to improve outcomes, return on investment, and other dimensions of performance in FY18? (Example: Performance stat systems)

  • In 2016, SAMHSA’s Office of Financial Resources (OFR) established a Program Integrity Review Team (PIRT) staffed by representatives from each of its four Centers and managed by OFR. On a quarterly basis, staff from three SAMHSA discretionary grant portfolios (one from each of the three program Centers) conduct a self-analysis examining grantee performance based on objective performance data, financial performance, and other factors. Program staff present their self-assessments to the PIRT and receive feedback on, for example, targets of concern. In one instance, grantees were surpassing their targets by 200-300%, leading the board to suggest that the targets be re-examined for these high-performing grantees. In addition, the Centers have historically managed internal performance review boards to periodically review grantee performance and provide corrective actions as needed.
  • A new unified data collection system, SAMHSA’s Performance Accountability & Reporting Systems (SPARS), was put into place in early 2017. Historically, the three program Centers had independent data collection systems that did not allow for global reviews of agency activities. The new system allows for greater transparency about grantee performance across Centers. SAMHSA aligns program objectives and measures through its utilization of SPARS, SAMHSA’s online data entry, reporting, technical assistance request, and training system for grantees to report timely and accurate data. SPARS is a mechanism by which SAMHSA meets requirements of the Government Performance and Results Act of 1993 (GPRA) and the GPRA Modernization Act of 2010.
  • Pursuant to the 21st Century Cures Act, SAMHSA is required to establish standards for grant programs that, among other factors, address the extent to which grantees must collect and report on required performance measures, and SAMHSA must advance the use of performance metrics recommended both by the Assistant Secretary for Planning and Evaluation (ASPE) (Sec. 6002, pp. 464-465) and the Director of CBHSQ (Sec. 6004, p. 470). In addition, SAMHSA’s Chief Medical Officer is required to coordinate with ASPE to assess the use of performance metrics in evaluation activities, and to coordinate with the Assistant Secretary to ensure programs consistently utilize appropriate performance metrics and evaluation designs (Sec. 6003, p. 468). The Assistant Secretary must also submit a biennial report to Congress that assesses the extent to which its programs and activities meet goals and appropriate performance measures (Sec. 6006, p. 477).
Score
8
Data

Did the agency collect, analyze, share, and use high-quality administrative and survey data – consistent with strong privacy protections – to improve (or help other entities improve) federal, state, and local programs in FY18? (Examples: Model data-sharing agreements or data-licensing agreements; data tagging and documentation; data standardization; open data policies)

  • SAMHSA has five data collection initiatives: the National Survey on Drug Use and Health (NSDUH), which collects population data; the Treatment Episode Data Set – Admissions, client-level data; the National Survey of Substance Abuse Treatment Services (N-SSATS), substance abuse facilities data; the Drug Abuse Warning Network, emergency department data; and the National Mental Health Services Survey (N-MHSS). The agency has made numerous administrative and survey datasets publicly available for secondary use, and each data collection can be sorted by metadata parameters such as geography, methodology, spotlights, data reviews, and data tables. CBHSQ oversees these data collection initiatives and provides publicly available datasets so that some data can be shared with researchers and other stakeholders while preserving client confidentiality and privacy. Some restricted data cannot be shared beyond federal staff.
  • SAMHSA’s Data Integrity Statement articulates the adherence of the administration’s Center for Behavioral Health Statistics and Quality (CBHSQ), a Federal Statistical Unit, to the federal common set of professional and operational standards that ensure the “quality, integrity, and credibility” of statistical activities.
  • SAMHSA’s Performance Accountability and Reporting System (SPARS) hosts the data entry, technical assistance request, and training system for grantees to report performance data to SAMHSA. SPARS serves as the repository for the Administration’s three program Centers: the Center for Substance Abuse Prevention (CSAP), the Center for Mental Health Services (CMHS), and the Center for Substance Abuse Treatment (CSAT). Due to concerns about confidentiality and privacy, the current data transfer agreement limits the use of grantee data to internal reports: data collected by SAMHSA grantees will not be available to share with researchers or stakeholders beyond SAMHSA, and publications based on grantee data will not be permitted. The foundational system went live in February 2017, and enhancements to improve data transparency and sharing of administrative and performance data are currently being implemented. Going forward, these changes will allow analytic reports to be shared with grantees so that performance successes and gaps can be better tracked, both by the project officers overseeing the grantees and by the grantees themselves. It is anticipated that this will improve communication and oversight as well as offer more real-time insight into program performance. Information on the latest available data for program staff can be found in the portal announcement section on the home page.
  • SAMHSA’s Substance Abuse and Mental Health Data Archive (SAMHDA) contains substance use disorder and mental illness research data available for restricted and public use. SAMHDA promotes the access and use of SAMHSA’s substance abuse and mental health data by providing public-use data files and documentation for download and online analysis tools to support a better understanding of this critical area of public health.
  • Per SAMHSA’s Evaluation Policy & Procedure (P&P), CBHSQ will work with CMHS, CSAT, and CSAP Center Directors and other program staff to develop a SAMHSA Completed Evaluation Inventory of evaluations completed between FY11 and FY17. This inventory and the evaluation final reports will then be made available on SAMHSA’s intranet and internet sites. In addition, data files from completed evaluations will be made available on the intranet, and via a restricted access mechanism such as SAMHDA.
Score
7
Common Evidence Standards / What Works Designations

Did the agency use a common evidence framework, guidelines, or standards to inform its research and funding decisions and did it disseminate and promote the use of evidence-based interventions through a user-friendly tool in FY18? (Example: What Works Clearinghouses)

  • There is great diversity across SAMHSA programming, ranging from community-level prevention activities to residential programs for pregnant and post-partum women with substance misuse issues. While this diversity allows SAMHSA to be responsive to a wide set of vulnerable populations, it limits the utility of a common evidence framework for the entire agency. Within Centers (the Center for Substance Abuse Prevention, the Center for Substance Abuse Treatment, and the Center for Mental Health Services), consistent evidence frameworks are in use and help to shape the process of grant-making (e.g., Center staff are familiar with the pertinent evidence base for their particular portfolios). At the programmatic level, staff review the state of the art for a particular topic area to facilitate grantee adoption and implementation of evidence-based practices (EBPs). While staff awareness of EBPs varies, a systematic approach to evidence classification remains to be developed. Most Center staff rely on the National Registry of Evidence-based Programs and Practices to identify evidence-based programs for grantee implementation.
  • Until 2018, SAMHSA regarded the National Registry of Evidence-based Programs and Practices (NREPP) as the primary user-friendly online tool for identifying evidence-based programs for grantee implementation. In January 2018, SAMHSA announced that it was “moving to EBP implementation efforts through targeted technical assistance and training that makes use of local and national experts and that will assist programs with actually implementing services….” At the same time, the Assistant Secretary for Mental Health and Substance Use outlined significant concerns with the rigor and effectiveness of NREPP, and reportedly terminated the contract with the organization running NREPP. It was stated that the Mental Health and Substance Use Policy Lab would now “play a central role in shaping SAMHSA’s efforts to bring more science to the evidence-based practices used in the prevention, treatment, and support services being provided by behavioral health practitioners and other clinicians.”
  • In April 2018, SAMHSA launched the Evidence-Based Practices Resource Center (Resource Center) that aims to provide communities, clinicians, policy-makers and others in the field with the information and tools they need to incorporate evidence-based practices into their communities or clinical settings. The Resource Center contains a collection of science-based resources, including Treatment Improvement Protocols, toolkits, resource guides, and clinical practice guidelines, for a broad range of audiences. Similarly, the Evidence-Based Practices (EBP) Web Guide features research findings and details about EBPs used to prevent and treat mental and substance use disorders. Stakeholders throughout the behavioral health field can use the EBP Web Guide to promote awareness of current intervention research and to increase the implementation and availability of EBPs.
  • In February 2018, SAMHSA published guidance for healthcare professionals and addiction treatment providers on appropriate prescribing practices for FDA-approved medications for opioid use disorder (OUD) and effective strategies for supporting the patients utilizing medication for the treatment of OUD.
  • In January 2018, SAMHSA announced it had released $12 million in funding to the American Academy of Addiction Psychiatry to begin the effort to utilize local expertise to provide TA and training on scientifically based evidence-based practices to combat the nation’s opioid crisis. The Opioid State Targeted Response TA program aims to provide TA on evidence-based practices across the spectrum of prevention, treatment and recovery.
  • SAMHSA has universal language about using evidence-based practices (EBPs) that is included in its Funding Opportunity Announcements (FOAs) (entitled Using Evidence-Based Practices (EBPs)). This language includes acknowledgement that, “EBPs have not been developed for all populations and/or service settings” thus encouraging applicants to “provide other forms of evidence” that a proposed practice is appropriate for the intended population. Specifically, the language states that applicants should: (1) document that the EBPs chosen are appropriate for intended outcomes; (2) explain how the practice meets SAMHSA’s goals for the grant program; (3) describe any modifications or adaptations needed for the practice to meet the goals of the project; (4) explain why the EBP was selected; (5) justify the use of multiple EBPs, if applicable; and (6) discuss training needs or plans to ensure successful implementation. Lastly, the language includes resources the applicant can use to understand EBPs. Federal grants officers work in collaboration with the SAMHSA Office of Financial Resources to ensure that grantee funding announcements clearly describe the evidence standard necessary to meet funding requirements.
  • In 2011, based on the model of the National Quality Strategy, SAMHSA developed the National Behavioral Health Quality Framework (NBHQF). With the NBHQF, SAMHSA proposes a set of core measures to be used in a variety of settings and programs, as well as in evaluation and quality assurance efforts. The proposed measures are not intended to be a complete set of measures a payer, system, practitioner, or program may want to use to monitor the quality of its overall system or the care or activities it provides. SAMHSA encourages such entities to utilize these basic measures, as appropriate, as a consistent set of indicators of quality in behavioral health prevention, promotion, treatment, and recovery support efforts across the nation.
Score
8
Innovation

Did the agency have staff, policies, and processes in place that encouraged innovation to improve the impact of its programs in FY18? (Examples: Prizes and challenges; behavioral science trials; innovation labs/accelerators; performance partnership pilots; demonstration projects or waivers with strong evaluation requirements)

  • The SAMHSA Knowledge Network, a collection of technical assistance and training resources provided by the agency, provides behavioral health professionals with education and collaboration opportunities, and ample tools and technical assistance resources that promote innovation in practice and program improvement. Located within the Knowledge Network are groups such as the Center for Financing Reform and Innovation, which works with states and territories, local policy makers, providers, consumers, and other stakeholders to promote innovative financing and delivery system reforms.
  • In addition, SAMHSA collaborates with other HHS agencies, via the HHS IDEA Lab, to promote innovative uses of data and technology across HHS to create a more effective government and improve the health of the nation. SAMHSA has co-developed and submitted several innovative data utilization project proposals to the Ignite Accelerator of the HHS IDEA Lab, such as Rapid Opioid Alert and Response (ROAR), a project to monitor and prevent opioid overdoses by linking heroin users to resources and information.
  • The agency is currently exploring the use of tiered-evidence frameworks in its award decision-making to actively encourage innovation at the grantee/program level. In addition, pursuant to the 21st Century Cures Act, SAMHSA is establishing the National Mental Health and Substance Use Policy Laboratory (Policy Lab) (Sec. 7001, p. 501) by restructuring the current Office of Policy, Planning, and Innovation (OPPI). The new Policy Lab will review programs and activities operated by the agency to identify those that are duplicative or that are not evidence-based or effective, and will formulate recommendations for coordinating, eliminating, or improving such programs (Sec. 7001, pp. 502-503).
  • To further promote innovation, per the Cures Act, SAMHSA’s Assistant Secretary may coordinate with the Policy Lab to award grants to states, local governments, tribes and tribal organizations, and other eligible organizations to develop evidence-based interventions. These grants can help support the evaluation of models and interventions that show promise, or the expansion, replication, or scaling up of interventions that have been established as evidence-based (Sec. 7001, pp. 503-504).
Score
7
Use of Evidence in Five Largest Competitive Grant Programs

Did the agency use evidence of effectiveness when allocating funds from its five largest competitive grant programs in FY18? (Examples: Tiered-evidence frameworks; evidence-based funding set-asides; priority preference points or other preference scoring; Pay for Success provisions)

  • The following represents SAMHSA’s five largest competitive grant programs for which funds were appropriated in FY18: (1) Opioid State Targeted Response ($1.5 billion in FY18); (2) Children’s Mental Health Services ($125 million in FY18); (3) Strategic Prevention Framework ($119.5 million in FY18); (4) Targeted Capacity Expansion – General ($95.2 million in FY18); and (5) Substance Abuse Treatment Criminal Justice ($89 million in FY18).
  • The President’s Budget request for SAMHSA for FY18 stipulates “that up to 10 percent of amounts made available to carry out the Children’s Mental Health Initiative may be used to carry out demonstration grants or contracts for early interventions with persons not more than 25 years of age at clinical high risk of developing first episode of psychosis.” Specifically, funds from this set-aside should address whether community-based interventions during the prodrome phase can prevent further development of serious emotional disturbances and eventual serious mental illness, and the extent to which evidence-based early interventions can be used to delay the progression of mental illness, reduce disability, and/or maximize recovery.
  • SAMHSA has universal language about using evidence-based practices (EBPs) that is included in its Funding Opportunity Announcements (FOAs) (entitled Using Evidence-Based Practices (EBPs)). This language includes acknowledgement that, “EBPs have not been developed for all populations and/or service settings” thus encouraging applicants to “provide other forms of evidence” that a proposed practice is appropriate for the intended population. Specifically, the language states that applicants should: (1) document that the EBPs chosen are appropriate for intended outcomes; (2) explain how the practice meets SAMHSA’s goals for the grant program; (3) describe any modifications or adaptations needed for the practice to meet the goals of the project; (4) explain why the EBP was selected; (5) justify the use of multiple EBPs, if applicable; and (6) discuss training needs or plans to ensure successful implementation. Lastly, the language includes resources the applicant can use to understand EBPs. SAMHSA shares evidence-based program and practice language with grantees as they compete for SAMHSA grants and describe the types of program/practice implementation they hope to engage in to address the needs of their particular target populations and communities. The review criteria contained in the FOA make clear that applicants proposing to use programs and practices with a more robust evidence base will receive higher scores and thus greater support for their funding application.
  • The President’s Budget for SAMHSA for FY18 proposes to implement a tiered-evidence approach in the Screening, Brief Intervention, and Referral to Treatment (SBIRT) program, which will allow funding allocations and awards to be based on the implementation of innovative practices or programs as well as more standard programming. Grant funding will be tied to the particular approach taken by the grantee. At present, SAMHSA does not use preference points to link funds to evidence of effectiveness; however, the 10 percent set-aside includes language suggesting that the Coordinated Specialty Care model is a first-episode approach of importance to this work.
  • Among SAMHSA’s standard terms and conditions for all grant funding is the requirement that grantees collect and report evaluation data to ensure the effectiveness and efficiency of its programs under the GPRA Modernization Act of 2010 (P.L. 111-352). In addition, grantees must comply with performance goals and expected outcomes described in Funding Opportunity Announcements (FOAs), which may include participation in an evaluation and/or local performance assessment. While exemplar FOAs are not available to be shared publicly at this juncture, SAMHSA is developing the first tiered-evidence FOA that will be funded in FY18, a key step to incentivize innovative practice/program models among grantees.
Score
8
Use of Evidence in Five Largest Non-Competitive Grant Programs

Did the agency use evidence of effectiveness when allocating funds from its five largest non-competitive grant programs in FY18? (Examples: Evidence-based funding set-asides; requirements to invest funds in evidence-based activities; Pay for Success provisions)

  • The following represents SAMHSA’s largest non-competitive grant programs for which funds were appropriated in FY18: (1) Substance Abuse Prevention and Treatment Block Grant Program ($1.8 billion in FY18); (2) Mental Health Block Grant Program ($722.5 million in FY18); (3) Projects for Assistance in Transition from Homelessness (PATH) Program ($64.6 million in FY18); and (4) Protection and Advocacy for Individuals with Mental Illness (PAIMI) Program ($36.1 million in FY18).
  • In FY18, Congress maintained the 10 percent set-aside for evidence-based programs in SAMHSA’s Mental Health Block Grant (p. 377 of the FY18 appropriations law) to address early serious mental illness (ESMI), including psychotic disorders. In its FY19 budget request (p. 121), SAMHSA expressed its desire to continue the set-aside. In FY17, the Mental Health Block Grant likewise maintained a 10 percent set-aside for evidence-based programs (p. 4) to address ESMI, including psychosis. In FY18-19 grant applications, states must describe how they will utilize the 10 percent set-aside to align with coordinated specialty care models, such as that grounded in the National Institute of Mental Health’s RAISE (Recovery After an Initial Schizophrenia Episode) initiative, or other approved evidence-based approaches. A key assumption of the block grant applications that grantees must meet is that “state authorities use evidence of improved performance and outcomes to support their funding and purchasing decisions” (p. 8). In addition, a quality improvement plan is requested from all grantees, based on the principles of Continuous Quality Improvement/Total Quality Management (CQI/TQM). Grantees are also required to comply with performance requirements, which include assessing how funds are used via data and performance management systems and other tracking approaches.
Score
1
Repurpose for Results

In FY18, did the agency shift funds away from or within any practice, program, or policy that consistently failed to achieve desired outcomes? (Examples: Requiring low-performing grantees to re-compete for funding; removing ineffective interventions from allowable use of grant funds; proposing the elimination of ineffective programs through annual budget requests)

  • SAMHSA’s FY19 budget request proposed eliminating nine programs (totaling $279.6 million), but none of them for failing to achieve desired outcomes.
  • In January 2018, SAMHSA announced it would shift resources away from the National Registry of Evidence-based Programs and Practices (NREPP) toward targeted technical assistance and training for implementing evidence-based practices. The stated reasoning was that NREPP offered a skewed presentation of evidence-based interventions, which did not address the spectrum of needs of those living with serious mental illness and substance use disorders. SAMHSA publicly confirmed that it terminated the contract for the organization running NREPP.
  • The SAMHSA budget provides performance information alongside budget information, which Congress can use to determine funding levels. Each year, when continuation funding decisions are made, the program Centers review grantees within each program, project, or activity in terms of performance and financial management. Each Center determines the factors that go into continued-funding decisions, based on guidance from the Office of Financial Management, Division of Grants Management. To the extent that costs are reduced for continuation funding, those funds can be repurposed to fund new grantees or to provide additional contract support for existing grantees. In FY17, SAMHSA underwent a stringent review process for all funding requests, utilizing both program and fiscal performance. During this process, SAMHSA utilized $51 million in unspent funding from existing grantees to fund new programs and activities.
  • CBHSQ staff conducted an evaluation inventory in the summer of 2016, requesting that program staff from the Centers provide information on how their evaluation findings inform the next iteration of their programs and/or new evaluation activities. For the most part, program staff indicated that evaluation findings were used to improve the next round of funding opportunity announcements and thus grantee implementation of programs.
Back to the Standard

Visit Results4America.org