Brief Report on Improved Treatment Outcomes for Medicaid Funded Mental Health in Maryland

Jeb Brown, Ph.D., Center for Clinical Informatics

Takuya Minami, Ph.D., University of Massachusetts - Boston

Introduction

The purpose of this report is to summarize the results of a 5-year implementation of feedback informed treatment for children receiving publicly funded mental health services in Maryland. Services were managed by Beacon Health Options, which supported feedback informed treatment through the use of ACORN questionnaires and the ACORN Decision Support Toolkit.

Description of Questionnaires

Children (or an adult who knew the child well) were asked to complete an ACORN questionnaire.  Questionnaires varied depending on the age of the child and other developmental and clinical considerations. All questionnaires show high reliability and construct validity. The Maryland sample is part of the larger ACORN normative sample of over 1.2 million clients and 3.2 million completed questionnaires.

Description of Maryland Clients

The sample consists of children and adolescents starting treatment between January 1, 2014 and November 7, 2019. A total of 11,571 cases had at least one completed questionnaire. Of these, 7,561 had an intake score in the clinical range of severity, and 4,509 of those (60%) had at least two completed questionnaires, permitting evaluation of progress.

This latter group, rather than the total sample, is used to benchmark outcomes, because (a) clients with intake scores outside the clinical range tend to show no improvement on average, and (b) the scientific literature contains no body of clinical trials for clients starting within the non-clinical range that could serve as a treatment benchmark.

Benchmarking Outcomes

The ACORN Collaboration incorporates a well-developed and validated methodology for benchmarking outcomes, permitting comparison of results from one clinic to another and among clinicians. The basic metric is the one standard in clinical trials: effect size (Cohen's d), defined as the pre-post change score divided by the standard deviation of the pretreatment assessment scores.
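
As a minimal illustration of this calculation in Python (using made-up scores rather than Maryland data, and assuming higher scores indicate greater distress):

    import statistics

    # Hypothetical intake and final questionnaire scores (illustrative values
    # only, not actual ACORN or Maryland data)
    intake_scores = [2.8, 2.5, 3.1, 2.9, 2.6]
    final_scores = [1.9, 2.0, 2.2, 2.4, 1.8]

    # Pre-post change for each case (intake minus final, so positive = improvement)
    changes = [pre - post for pre, post in zip(intake_scores, final_scores)]

    # Cohen's d: mean change divided by the standard deviation of intake scores
    effect_size = statistics.mean(changes) / statistics.stdev(intake_scores)
    print(f"Effect size (Cohen's d): {effect_size:.2f}")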

ACORN utilizes an advanced form of Cohen's effect size d statistic, known as the Severity Adjusted Effect Size (SAES), which accounts for differences in diagnosis, severity of symptoms at intake, and other relevant variables. When making comparisons between individuals, the ACORN platform also adjusts for sample size. For more information, see Brown, Simon, Cameron, and Minami (2015) and Brown, Simon, and Minami (2015).
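
The sketch below conveys the general idea behind a severity adjustment; it is not the actual ACORN model, which draws on additional case-mix variables and its own normative sample (see the references above). Here a simple linear model predicts expected change from intake severity, and each case's deviation from that expectation is expressed in effect size units:

    import numpy as np

    # Hypothetical case-level intake severities and pre-post change scores
    # (illustrative only; not the ACORN specification or Maryland data)
    intake = np.array([2.1, 2.8, 3.0, 2.5, 3.4, 2.2, 2.9, 3.1])
    change = np.array([0.4, 0.9, 1.1, 0.6, 1.3, 0.3, 0.8, 1.0])

    sd_intake = intake.std(ddof=1)

    # Unadjusted effect size for the whole sample
    raw_d = change.mean() / sd_intake

    # Expected change as a simple linear function of intake severity
    slope, intercept = np.polyfit(intake, change, deg=1)
    expected = intercept + slope * intake

    # Each case's deviation from its expected change, in effect size units,
    # added back to the sample-level effect size
    adjusted_d = raw_d + (change - expected) / sd_intake
    print(adjusted_d.round(2))

Averaging such case-level values across a clinician's caseload yields a severity-adjusted summary in the same spirit as the clinician-level SAES figures reported below, although the published ACORN methodology should be consulted for the exact specification.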

Therapists (or agencies) with an average SAES greater than d= 0.5 are characterized as delivering effective care. Those with an average SAES of d= 0.8 or greater are classified as highly effective. By way of comparison, the average effect size for well-conducted, peer-reviewed studies of so-called evidence-based treatments is approximately 0.8. This means that agencies and therapists with an SAES of d= 0.8 or better are delivering results equal to or better than what would be expected from high-quality clinical trials.

Results

The following table and graphs break down results by calendar year. Results for 2014-2017 are combined because of the low case counts for those years as the program was being implemented. Beginning in 2017, the Anne Arundel County Public Schools Expanded School Based Mental Health partnership, with assistance from Elizabeth Connors, PhD (faculty at Yale University and the University of Maryland), placed more emphasis on participation in the program (known as OnTrack, funded by Beacon Health Options) for providers of school-based services. The resulting increase in participation is evident in 2018 and 2019. In all, seven agencies contributed to these data.

The increase in effect size over the period of implementation is impressive. However, the number of assessments per case also increased, indicating that the questionnaires were being administered more consistently. This may have been in part due to a shift away from paper forms and towards use of inexpensive tablets to administer the questionnaires. Prior to 2017, 97% of forms were faxed. From January 1, 2017 to date, only 15% have come in via fax. This also had the effect of significantly reducing the cost of the program for Beacon Health Options.

There was a weak (Pearson's r= .17) but statistically significant (p< .01) correlation between session count and effect size.
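
For reference, a correlation of this kind can be computed with SciPy; the values below are placeholders, not the Maryland data:

    from scipy import stats

    # Hypothetical session counts and case-level effect sizes (placeholders only)
    sessions = [4, 6, 8, 10, 12, 5, 9, 14, 7, 11]
    effect_sizes = [0.3, 0.5, 0.7, 0.6, 0.9, 0.4, 0.8, 1.1, 0.5, 0.7]

    r, p_value = stats.pearsonr(sessions, effect_sizes)
    print(f"Pearson's r = {r:.2f}, p = {p_value:.3f}")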

[Table and graphs: results by calendar year (fig1.png, 2014.png)]

Earlier published reports using ACORN data have revealed a consistent relationship between effect size and how often a clinician views their data via the ACORN Toolkit. This appears to be the case in this sample as well. Over the past 12 months, those clinicians with at least 5 cases in the clinical range who logged in at least 24 times annually had an average effect size of d= 0.86 (n= 54 clinicians), compared to d= .74 for those who logged in less frequently (n= 19 clinicians). The difference is significant at p< .05 (one-tailed t-test).
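
The comparison can be sketched as a one-tailed independent-samples t-test on clinician-level effect sizes. The values below are hypothetical stand-ins (the actual comparison involved 54 high-engagement and 19 low-engagement clinicians):

    from scipy import stats

    # Hypothetical clinician-level average effect sizes (illustrative only)
    high_engagement = [0.95, 0.82, 1.01, 0.74, 0.90, 0.85, 0.88, 0.78]
    low_engagement = [0.72, 0.61, 0.80, 0.75, 0.66, 0.70]

    # SciPy returns a two-sided p-value; halving it gives the one-tailed test
    # when the difference falls in the predicted direction
    t_stat, p_two_sided = stats.ttest_ind(high_engagement, low_engagement)
    p_one_tailed = p_two_sided / 2 if t_stat > 0 else 1 - p_two_sided / 2
    print(f"t = {t_stat:.2f}, one-tailed p = {p_one_tailed:.3f}")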

To further test this hypothesis, we selected a sample of clinical range cases based on how many years the clinician had participated in the program, and then analyzed results from their second year of participation. This permitted us to look at change in clinician effect size over time as a function of effect size in year one, session count in year two, and engagement (login count) in year two.

Following are the results for 1,512 children and youth seen by clinicians in their second year of participation in the program. Cases are broken out by whether the treating therapist had high engagement (at least 24 logins) or low engagement in that second year. Note that there is little difference in assessment counts between the two groups.

[Table and graphs: second-year results by therapist engagement (fig2.png, Pic2.png)]

Using a General Linear Model that included the assessment count for each case, the therapist's SAES in year one, and the therapist's level of engagement in year two, we found a strong trend for the effect of high engagement in year two (p< .10), even after controlling for the other variables.
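
A sketch of this kind of model using the statsmodels formula interface appears below. The column names, coefficients, and simulated data are assumptions for illustration only; they are not the ACORN variable names or the Maryland results:

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Simulated case-level data standing in for the real data set
    rng = np.random.default_rng(0)
    n = 200
    cases = pd.DataFrame({
        "assessment_count": rng.integers(2, 10, n),    # questionnaires per case
        "therapist_saes_y1": rng.normal(0.6, 0.2, n),  # therapist SAES in year one
        "high_engagement_y2": rng.integers(0, 2, n),   # 1 = 24+ logins in year two
    })
    cases["saes"] = (
        0.3
        + 0.02 * cases["assessment_count"]
        + 0.4 * cases["therapist_saes_y1"]
        + 0.15 * cases["high_engagement_y2"]
        + rng.normal(0, 0.5, n)
    )

    # General Linear Model: case-level SAES predicted from assessment count,
    # the therapist's year-one SAES, and year-two engagement
    model = smf.ols(
        "saes ~ assessment_count + therapist_saes_y1 + high_engagement_y2",
        data=cases,
    ).fit()
    print(model.summary())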

Taken together, these results provide continued evidence that those clinicians who are actively using the Toolkit are likely to deliver more effective services regardless of the number of sessions. This difference is large and clinically meaningful. Consistent with findings using the entire ACORN sample of therapists, it also appears that Toolkit usage is associated with improvement over time.

Discussion and Implications for the Future

The number of clinicians participating actively in the use of the ACORN Toolkit has increased significantly during the past year as part of a quality improvement initiative spearheaded by Beacon Health Options and the Anne Arundel County Public Schools, with active support from Elizabeth Connors, PhD, at Yale University. We hope to be able to continue this program, as it is clearly in the best interest of the children and youth receiving services.

Funding for this program is uncertain beyond the end of this year. Beacon Health Options lost this contract to United Health Care, which has not yet provided information about the future of feedback informed treatment in Maryland. We hope to learn more in the near future and will keep the participating agencies informed.