Abstract

Background: Evidence-based practice (EBP) use varies across autism spectrum disorder community-based organizations (ASD-CBOs) in the United States.

Aim: The ACT SMART Implementation Toolkit—a multi-faceted implementation strategy guiding teams through implementation phases—was pilot tested to assess its preliminary effectiveness in increasing EBP use within ASD-CBOs.

Method: Six ASD-CBOs participated, and five completed all Toolkit phases. Supervisors and direct providers completed an agency assessment pre- and post-pilot reporting their use of the selected EBP.

Results: Effect sizes examining changes in reported EBP utilization from pre- to post-pilot were small for supervisor-reported use, supervisor-reported direct provider use, and direct provider-reported use.

Conclusions: Results indicate an increase in reported EBP use post-pilot and signal potential effectiveness of the ACT SMART Toolkit to yield provider-reported behavioral changes. This is the first multi-faceted implementation strategy designed specifically for ASD-CBOs, and findings support its facilitation of EBP uptake to improve community care for autistic individuals.

Keywords: autism spectrum disorder, implementation, community-based, video modeling, usual care, health services

Introduction

Autism spectrum disorder (ASD), a lifelong neurodevelopmental disorder, has an estimated prevalence of 1.8% in the United States (Maenner et al. 2020). ASD is characterized by impairments in social communication, as well as repetitive and/or restricted behaviors (American Psychiatric Association 2013). Evidence-based practices (EBPs) developed for this population (ASD-EBPs) are effective in improving core ASD deficits and co-occurring symptoms, and lead to improved short- and long-term outcomes (Steinbrenner et al. 2020). However, utilization of ASD-EBPs varies across ASD community-based organizations (ASD-CBOs), where ASD-EBPs have been observed to be delivered with low frequency and intensity (Brookman-Frazee et al. 2012).

Implementation science aims to address this research-to-practice gap by leveraging methods to facilitate the uptake of EBPs within usual care or community settings (Brownson et al. 2018). Specifically, implementation strategies (i.e. methods or processes used to facilitate the adoption and uptake of a practice within a given setting) can be effective tools for increasing EBP use in community-based organizations by addressing barriers throughout the implementation process and systematically supporting implementation in these settings (Powell et al. 2019). A myriad of strategies have been identified, including discrete (singular) and multi-faceted (a combination of two or more discrete strategies) implementation strategies. Implementation scientists posit that multi-faceted strategies may be more effective than discrete strategies (Powell et al. 2019); yet, limited research comparing the effectiveness of multi-faceted versus discrete strategies has been conducted. Indeed, ‘conduct[ing] more effectiveness research on discrete, multi-faceted, and tailored implementation strategies’ is a priority research area within this field (Powell et al. 2019).

ACT SMART implementation toolkit development

The Autism Community Toolkit: Systems to Measure and Adopt Research-Based Treatments (ACT SMART Toolkit) is one multi-faceted implementation strategy, developed from prior research examining barriers and facilitators to ASD-EBP implementation in ASD-CBOs, with the aim of addressing the client, provider, and contextual factors (e.g. training requirements, EBP fit, provider capacity) that affect implementation processes (Drahota et al. 2014; Drahota et al. 2021). Based on an adapted version of the Exploration, Preparation, Implementation, Sustainment (EPIS) framework (Aarons et al. 2011; Drahota et al. 2014), the ACT SMART Toolkit is a comprehensive web-based interface designed to guide organizational implementation teams through five phases of the implementation process (Figure 1): Exploration, Adoption Decision, Preparation, Implementation, and Sustainment (Drahota et al. 2021). The Toolkit was developed over the course of a year in collaboration with the Autism Model of Implementation (AMI) Collaboration, comprising nine ASD-CBO agency leaders, two ASD academic researchers, and one dissemination and implementation (D&I) scientist (Gomez et al. 2021), and was informed by an evaluation of factors influencing the adoption and use of EBPs within usual care settings for individuals on the autism spectrum (Drahota et al. 2021). Participation in the AMI Collaboration included five 2-hour meetings focused on reviewing and providing feedback on Toolkit materials developed by the research team, with the explicit goal of developing a practical, effective, and systematic process that could be delivered flexibly (Sankey et al. 2019) to implement any ASD-EBP in ASD-CBOs and that would have broad potential for organizations serving individuals on the autism spectrum. AMI community-academic partners reviewed and provided iterative feedback on Toolkit components and materials during collaborative meetings, including development of the ASD Strategies and Interventions Survey (Pickard et al. 2018), which was used in this study.

After development of the Toolkit, a pilot study was completed, and preliminary findings indicate that the Toolkit is feasible, acceptable, and useful, and that implementation teams had a high level of adherence to the Toolkit and facilitation activities (Drahota et al. 2015). This point of view presents pilot study data on the preliminary effectiveness of the ACT SMART Toolkit in increasing the utilization of an EBP adopted for agency-wide implementation by ASD-CBO implementation teams.

Methods

Study procedures were reviewed and approved by the San Diego State University Human Research Protection Program’s Institutional Review Board (vIRB # 961087) prior to study initiation. Michigan State University’s Human Research Protection Program Institutional Review Board (ID # STUDY00002382) approved use of the de-identified data for secondary data analysis as presented in this manuscript.

Participants

Six ASD-CBOs located in urban areas of Southern California participated in the pilot study (United States Department of Agriculture 2020). Participating organizations met the following eligibility criteria: existing social and/or research collaborations with agencies, researchers, and collaborative groups; existing efforts to receive additional staff training; and agency leaders’ interest in implementing new EBPs within their organization. Participating ASD-CBOs provided applied behavior analysis (ABA; n = 4), ABA and mental health care (n = 1), and speech/language pathology (SLP; n = 1) services. During recruitment, all six organizational leaders reported a need for at least one of the three EBPs selected for the ACT SMART Implementation Toolkit pilot study: self-management, social stories/narratives, or video modeling (Wong et al. 2015). These EBPs were selected for the pilot study because each is discrete (rather than a complex intervention or package) and has similar training requirements (Autism Focused Intervention Resources and Modules n.d.), allowing comparisons even if participating ASD-CBOs chose different practices for implementation. Coincidentally, all organizations identified video modeling as the EBP under consideration for implementation during Phase 2, Adoption Decision. After determining a lack of fit between the selected EBP and the agency, one organization opted not to adopt an EBP after completing Phase 2 activities of the Toolkit. Thus, five ASD-CBOs completed all phases of the Toolkit.

Procedure

Implementation teams (ITs) at each organization were guided through each of the phases, steps, and activities comprising the ACT SMART Toolkit (Figure 1). First, agencies engaged in the Exploration phase, during which participants completed an agency assessment and determined goals and next steps. Next, participants identified an appropriate EBP and determined whether or not to adopt the intervention during the Adoption Decision phase. Phase 3, Preparation, involved developing adaptation, training, and implementation plans, and Phase 4, Implementation, focused on conducting and tracking progress on each of these plans. Finally, participants began the Sustainment phase, during which the success of the implementation was evaluated and a sustainment plan was developed. Implementation teams met with ACT SMART facilitation teams monthly throughout each of the five phases to receive support as they progressed through the Toolkit. Please see Figure 1 for further detail; a compact restatement of the phases is also sketched below. The ACT SMART Toolkit was designed to be completed over the course of 12 months; the pilot study lasted a total of 13 months, with pre-pilot data collection taking place during the first month and post-pilot data collection in month 13.
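As a compact restatement of the procedure just described, the sketch below lists the five Toolkit phases, their core activities, and the pilot timeline as a small Python structure. It is purely illustrative: the phase names follow the text and Figure 1, the activity wording is paraphrased, and the code is not part of the ACT SMART web interface.

```python
# Illustrative summary only: the five ACT SMART phases and the core activities
# described in the Procedure section, paraphrased. Not part of the Toolkit software.
ACT_SMART_PHASES = {
    1: ("Exploration", ["complete agency assessment", "determine goals and next steps"]),
    2: ("Adoption Decision", ["identify an appropriate EBP", "decide whether to adopt it"]),
    3: ("Preparation", ["develop adaptation plan", "develop training plan",
                        "develop implementation plan"]),
    4: ("Implementation", ["carry out the plans", "track progress on each plan"]),
    5: ("Sustainment", ["evaluate implementation success", "develop sustainment plan"]),
}

TOOLKIT_DURATION_MONTHS = 12       # intended Toolkit timeline
PILOT_DURATION_MONTHS = 13         # month 1 = pre-pilot data; month 13 = post-pilot data
FACILITATION_MEETING_FREQUENCY = "monthly"  # implementation teams met with facilitators monthly

for number, (name, activities) in ACT_SMART_PHASES.items():
    print(f"Phase {number} ({name}): " + "; ".join(activities))
```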

Data collection

Supervisors and direct providers (DPs) at each participating organization completed an agency assessment battery at pre- and post-pilot. Due to staffing changes at each organization over the course of the pilot study, participants at each timepoint varied slightly. The agency assessment battery gathered demographic information (Table 1) and included the ASD Strategies and Interventions Survey (ASD-SIS) (Pickard et al. 2018) to obtain data on each provider’s utilization of various interventions, including video modeling. The ASD-SIS, adapted from the Therapeutic Strategies Survey (Brookman-Frazee et al. 2009) and the Therapy Procedures Checklist (Weersing et al. 2002), consists of an inventory of 55 treatment practices with descriptions (some ASD-EBPs, others non-EBPs) commonly used to treat youth on the autism spectrum, and includes the three discrete ASD-EBPs considered for adoption by the participating ASD-CBOs. Supervisors reported on their own utilization of each treatment practice (yes/no) and on their supervised DPs’ use of each practice (yes/no), and direct providers reported on their own use of each practice (yes/no). A copy of the ASD-SIS can be requested from the corresponding author.

Table 1.

| Demographics | Pre-pilot supervisors (n = 34) | Pre-pilot direct providers (n = 78) | Post-pilot supervisors (n = 37) | Post-pilot direct providers (n = 80) |
| --- | --- | --- | --- | --- |
| Sex (% female) | 88.2% | 82.1% | 91.9% | 86.3% |
| Sex: missing | – | 1.3% | – | 1.3% |
| Race: Latinx/Hispanic | 35.3% | 37.2% | 40.5% | 32.5% |
| Race: prefer not to answer | 5.9% | 3.8% | 5.4% | 8.8% |
| Race: missing | – | 1.3% | – | 1.3% |
| Ethnicity (not mutually exclusive): White | 70.6% | 71.8% | 51.4% | 58.8% |
| Ethnicity: American Indian/Alaskan Native | 2.9% | 3.8% | 5.4% | 3.8% |
| Ethnicity: Black/African American | 5.9% | 5.1% | 2.7% | 15.0% |
| Ethnicity: Asian | 8.8% | 2.6% | 5.4% | 1.3% |
| Ethnicity: Native Hawaiian/Pacific Islander | – | – | – | – |
| Ethnicity: prefer not to answer | 17.6% | 17.9% | 32.4% | 21.3% |
| Ethnicity: missing | 2.9% | 3.8% | 2.7% | 6.3% |
| Education level (a): High school diploma | – | 2.6% | – | – |
| Education level: Some college | – | 11.5% | – | 12.5% |
| Education level: Associate degree | – | 2.6% | – | 6.3% |
| Education level: Bachelor’s degree | 14.7% | 56.4% | 29.7% | 47.5% |
| Education level: Master’s degree | 67.6% | 15.4% | 67.6% | 22.5% |
| Education level: Doctorate | 14.7% | 3.8% | 2.7% | 5% |
| Education level: Other | 2.9% | 6.4% | – | 5% |
| Education level: Missing | – | 1.3% | – | 1.3% |
| Discipline (a): Psychology | 41.2% | 43.6% | 43.2% | 51.2% |
| Discipline: Marriage and family therapy | 5.9% | 1.3% | 5.4% | 2.5% |
| Discipline: Social work | 11.8% | 3.8% | 5.4% | 10.0% |
| Discipline: Speech/language/communication | 14.7% | 14.1% | 10.8% | 7.5% |
| Discipline: Education | – | 3.8% | – | – |
| Discipline: Behavior specialist | 20.6% | 20.5% | 29.7% | 16.3% |
| Discipline: Other | 5.9% | 9.0% | 5.4% | 11.3% |
| Discipline: Missing | – | 3.8% | – | 1.3% |

(a) May not equal 100% due to rounding errors. – = not reported.

Results

Due to the small sample size of the ACT SMART pilot study, effect sizes (Hedges’ g) were calculated in order to evaluate meaningful changes from pre- to post-pilot in supervisor- and DP-reported use of video modeling within each participating ASD-CBO (Hedges 1981; Ferguson 2009). Results (Table 2) indicated a small effect size (0.28) for supervisor-reported use of video modeling, suggesting a modest increase in supervisor use of video modeling from pre- to post-pilot. Findings also illustrated a small effect size (0.30) for supervisor-report of their supervised DPs’ use, as well as a small effect size (0.22) for direct provider-reported use of video modeling. Additionally, percentage use was calculated by dividing the number of supervisors who endorsed ‘yes’ to using video modeling by the total number of supervisors; the same procedure was used to calculate percentage use for supervisor-reported direct provider use and for direct provider-reported use. Results suggest that direct providers endorsed utilizing video modeling more often upon completion of the ACT SMART pilot study than before the pilot study occurred. These changes are presented in aggregate (Table 2) and by agency (Table 3).
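To make these computations concrete, the sketch below shows one standard way to obtain the reported statistics in Python. It is illustrative only: the group sizes are taken from the Table 3 denominators as an assumption, and the authors’ exact effect-size computation may differ slightly (this pooled-SD formulation gives roughly 0.26 for supervisor use with these inputs, versus the 0.28 reported in Table 2).

```python
# A minimal sketch (not the authors' analysis code): percentage use from yes/no
# endorsements, and Hedges' g from pre- and post-pilot means and SDs using the
# standard pooled-SD formulation with Hedges' small-sample correction (Hedges 1981).
import math


def percent_use(yes_count: int, total: int) -> float:
    """Percentage of respondents endorsing 'yes' to using an EBP."""
    return 100.0 * yes_count / total


def hedges_g(mean_pre: float, sd_pre: float, n_pre: int,
             mean_post: float, sd_post: float, n_post: int) -> float:
    """Hedges' g: standardized pre/post difference with small-sample correction."""
    pooled_sd = math.sqrt(((n_pre - 1) * sd_pre ** 2 + (n_post - 1) * sd_post ** 2)
                          / (n_pre + n_post - 2))
    d = (mean_post - mean_pre) / pooled_sd
    correction = 1.0 - 3.0 / (4.0 * (n_pre + n_post) - 9.0)  # Hedges' correction factor
    return d * correction


# Supervisor-reported use of video modeling (Table 2: pre mean 0.35, SD 0.48;
# post mean 0.48, SD 0.51). Group sizes assumed from Table 3 denominators.
print(f"pre-pilot % use: {percent_use(10, 30):.1f}")   # 10/30 supervisors = 33.3%
print(f"Hedges' g: {hedges_g(0.35, 0.48, 30, 0.48, 0.51, 37):.2f}")  # ~0.26
```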

Table 2.

|  | Pre-pilot: % use; mean (SD) | Post-pilot: % use; mean (SD) | Hedges’ g effect size |
| --- | --- | --- | --- |
| Supervisor use | 33.3%; 0.35 (0.48) | 41.2%; 0.48 (0.51) | 0.28* |
| Supervisor-report of direct provider use | 26.7%; 0.29 (0.46) | 35.3%; 0.43 (0.50) | 0.30* |
| Direct provider use | 27.3%; 0.30 (0.46) | 37.7%; 0.41 (0.50) | 0.22* |

0.2 = small effect size; 0.5 = medium effect size; 0.8 = large effect size.

Table 3.

| Agency | Pre-pilot supervisor use (n = 30) | Pre-pilot supervisor-report of DP use (n = 30) | Pre-pilot DP use (n = 66) | Post-pilot supervisor use (n = 37) | Post-pilot supervisor-report of DP use (n = 37) | Post-pilot DP use (n = 69) |
| --- | --- | --- | --- | --- | --- | --- |
| 1 | 30.8% (4/13) | 23.1% (3/13) | 42.9% (9/21) | 37.5% (6/16) | 37.5% (6/16) | 23.1% (3/13) |
| 2 | 100% (3/3) | 33.3% (1/3) | 0% (0/8) | 100% (4/4) | 100% (4/4) | 61.1% (11/18) |
| 4 | 75% (3/4) | 75% (3/4) | 45.4% (5/11) | 50% (1/2) | 50% (1/2) | 87.5% (7/8) |
| 5 | 0% (0/2) | 0% (0/2) | 0% (0/4) | 100% (1/1) | 100% (1/1) | 66.7% (4/6) |
| 6 | 0% (0/8) | 12.5% (1/8) | 18.2% (4/22) | 18.2% (2/11) | 27.3% (3/11) | 4.2% (1/24) |

DP = direct provider.

Discussion

Overall, exploratory results indicate differences in reported utilization of video modeling from pre- to post-pilot study of the ACT SMART Implementation Toolkit. Though effect sizes were small, these findings suggest that the Toolkit may be effective in facilitating behavioral changes in supervisors’ and direct providers’ reported delivery of video modeling to individuals on the autism spectrum at their agencies. These results additionally suggest that the Toolkit may facilitate the adoption and use of EBPs more generally. Importantly, the ACT SMART Implementation Toolkit is the first implementation toolkit designed specifically for community-based organizations providing services to individuals on the autism spectrum. These findings indicate that the ACT SMART Implementation Toolkit may be an effective multi-faceted implementation strategy to facilitate the adoption, uptake, and implementation of EBPs within ASD-CBOs.

There are several limitations that should be addressed in future research. Due to the pilot study’s design (i.e. focused on examining feasibility, acceptability, and utility) and the low number of participating ASD-CBOs, the researchers did not expect to find statistically significant differences in reported use of video modeling from pre- to post-pilot. Effect sizes were therefore used to provide a more accurate estimate of the magnitude of the effect of the ACT SMART Implementation Toolkit on use of video modeling (Ferguson 2009). Given the small sample size in this study, future work should evaluate the effectiveness of the Toolkit with a larger sample of ASD-CBOs that includes greater heterogeneity in service systems (e.g. schools, work programs, community mental health centers) and locations (e.g. rural, urban), ideally using a randomized controlled design that is appropriately powered to examine statistically significant differences in the effectiveness of this implementation strategy. Further, exploration of the mechanisms (e.g. implementation team decision-making, engagement with implementation facilitators, organizational EBP receptivity) driving the change in adoption and utilization of ASD-EBPs post-Toolkit is warranted and necessary to further advance the field of D&I science (Powell et al. 2019). Additionally, the use of self-report data may be a limitation of these findings, particularly as respondents participated in the use of the ACT SMART Implementation Toolkit. Future work should involve data collection across additional respondent groups (e.g. direct providers not involved in the use of the Toolkit) in order to minimize the impact of social desirability and potential bias in reporting EBP use. Finally, future work should test the effectiveness of the Toolkit in facilitating implementation of complex ASD-EBPs (e.g. cognitive behavioral therapy) in order to better understand the generalizability of these findings. The ACT SMART Implementation Toolkit was designed to be agnostic to ASD-EBP (i.e. to facilitate the uptake and implementation of any ASD-EBP); however, this has yet to be tested empirically.

Implications

Recent work has highlighted the need for pragmatic illustrations of the use of implementation science to increase the uptake of EBPs within the context of behavioral health (Vroom and Massey 2022). This brief report highlights promising findings related to the effectiveness of one multi-faceted implementation strategy that combined web-based implementation process activities with monthly facilitation meetings involving ASD-CBO implementation teams and Toolkit facilitators, and that was designed to increase the use of EBPs within community-based organizations providing services to individuals on the autism spectrum. The ultimate goal of the Toolkit is to increase the availability of ASD-EBPs in order to improve treatment outcomes and quality of life for this population. Additionally, this implementation toolkit was developed in collaboration with key stakeholders involved in ASD service provision; this collaborative effort may allow for greater success and sustainability in the use of such an implementation strategy by enhancing facilitators and reducing barriers to EBP adoption and use within the ASD community-based service system (Drahota et al. 2021; Gomez et al. 2021; Vroom and Massey 2022). Overall, these findings may inform the development, revision, and utilization of implementation guides and support the broader utilization of multi-faceted implementation strategies. These preliminary effectiveness results indicate that implementation toolkits may offer a practical and systematic method for integrating and implementing EBPs in community-based, usual care settings, in an effort to reduce the research-to-practice gap within the context of behavioral health services.

Acknowledgements

Amy Drahota, Ph.D., was an Assistant Research Professor in the Department of Psychology at San Diego State University and Investigator at the Child and Adolescent Service Research Center in San Diego, California, U.S.A., at the time of data collection for this study. An earlier version of this paper was presented at the INSAR 2020 Annual Meeting in Seattle, Washington, U.S.A.

Funding Statement

This study was supported by a grant from the National Institute of Mental Health, United States of America (K01MH093477; PI: Drahota).

Conflict of interest

The authors report no conflict of interest.

References

Aarons, G. A., Hurlburt, M. and Horwitz, S. M. 2011. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Administration and Policy in Mental Health, 38, 4–23.
American Psychiatric Association. 2013. Diagnostic and statistical manual of mental disorders. 5th ed. Arlington, VA: American Psychiatric Association.
Autism Focused Intervention Resources and Modules. n.d. National Clearinghouse on Autism Evidence and Practice. [online] Available at: <https://afirm.fpg.unc.edu/afirm-modules> [Accessed 9 August 2021].
Brookman-Frazee, L., Garland, A. F., Taylor, R. and Zoffness, R. 2009. Therapists’ attitudes towards psychotherapeutic strategies in community-based psychotherapy with children with disruptive behavior problems. Administration and Policy in Mental Health, 36, 1–12.
Brookman-Frazee, L., Drahota, A., Stadnick, N. and Palinkas, L. A. 2012. Therapist perspectives on community mental health services for children with autism spectrum disorders. Administration and Policy in Mental Health, 39, 365–373.
Brownson, R. C., Colditz, G. A. and Proctor, E. K., eds. 2018. Dissemination and implementation research in health: translating science to practice. 2nd ed. New York: Oxford University Press. p. 6.
Drahota, A., Meza, R. and Martinez, J. I. 2014. The Autism Community Toolkit: Systems to Measure and Adopt Research-based Treatments. [online] Available at: <actsmart.herokuapp.com> [Accessed 9 August 2021].
Drahota, A., Martinez, J. I., Meza, R., Brikho, B., Gomez, E., Stahmer, A. C., Aarons, G. A. and the Autism Model of Implementation Collaboration. 2015. ACT SMART Toolkit: Developing and pilot testing a comprehensive implementation strategy for ASD service providers. In: ABCT (Association for Behavioral and Cognitive Therapies), 49th Annual Convention on improving dissemination by promoting empirically supported principles of psychopathology and change. Chicago, Illinois, United States of America, 12–15 November 2015.
Drahota, A., Meza, R., Bustos, T. E., Sridhar, A., Martinez, J. I., Brikho, B., Stahmer, A. C. and Aarons, G. A. 2021. Implementation-as-usual in community-based organizations providing specialized services to individuals with autism spectrum disorder: A mixed methods study. Administration and Policy in Mental Health and Mental Health Services Research, 48, 482–498.
Ferguson, C. J. 2009. An effect size primer: A guide for clinicians and researchers. Professional Psychology: Research and Practice, 40, 532–538.
Gomez, E., Drahota, A. and Stahmer, A. C. 2021. Choosing strategies that work from the start: A mixed methods study to understand effective development of community–academic partnerships. Action Research, 19, 277–300.
Hedges, L. V. 1981. Distribution theory for Glass’s estimator of effect size and related estimators. Journal of Educational Statistics, 6, 107–128.
Maenner, M., Shaw, K., Baio, J., Washington, A., Patrick, M., DiRienzo, M., Christensen, D. L., Wiggins, L. D., Pettygrove, S., Andrews, J. G., Lopez, M., Hudson, A., Baroud, T., Schwenk, Y., White, T., Robinson Rosenberg, C., Lee, L., Harrington, R. A., Huston, M., Hewitt, A., Esler, A., Hall-Lande, J., Poynter, J. N., Hallas-Muchow, L., Constantino, J. N., Fitzgerald, R. T., Zahorodny, W., Shenouda, J., Daniels, J. L., Warren, Z., Vehorn, A., Salinas, A., Durkin, M. S. and Dietz, P. M. 2020. Prevalence of autism spectrum disorder among children aged 8 years – Autism and Developmental Disabilities Monitoring Network, 11 sites, United States, 2016. MMWR Surveillance Summaries, 69, 1–12.
Pickard, K., Meza, R., Drahota, A. and Brikho, B. 2018. They’re doing what? A brief paper on service use and attitudes in ASD community-based agencies. Journal of Mental Health Research in Intellectual Disabilities, 11, 111–123.
Powell, B. J., Fernandez, M. E., Williams, N. J., Aarons, G. A., Beidas, R. S., Lewis, C. C., McHugh, S. M. and Weiner, B. J. 2019. Enhancing the impact of implementation strategies in healthcare: A research agenda. Frontiers in Public Health, 7, 3.
Sankey, C., Girard, S. and Cappe, E. 2019. Evaluation of the social validity and implementation process of a psychoeducational program for parents of a child with autism spectrum disorder. International Journal of Developmental Disabilities, 67, 101–111.
Steinbrenner, J. R., Hume, K., Odom, S. L., Morin, K. L., Nowell, S. W., Tomaszewski, B., Szendrey, S., McIntyre, N. S., Yücesoy-Özkan, Ş. and Savage, M. N. 2020. Evidence-based practices for children, youth, and young adults with autism. [pdf] Available at: <ncaep.fpg.unc.edu> [Accessed 9 August 2021].
United States Department of Agriculture. 2020. Economic Research Service: Rural-urban commuting area codes. [online] Available at: <https://www.ers.usda.gov/data-products/rural-urban-commuting-area-codes.aspx> [Accessed 18 February 2022].
Vroom, E. B. and Massey, O. T. 2022. Moving from implementation science to implementation practice: The need to solve practical problems to improve behavioral health services. The Journal of Behavioral Health Services & Research, 49, 106–116.
Weersing, V. R., Weisz, J. R. and Donenberg, G. R. 2002. Development of the Therapy Procedures Checklist: A therapist-report measure of technique use in child and adolescent treatment. Journal of Clinical Child and Adolescent Psychology, 31, 168–180.
Wong, C., Odom, S. L., Hume, K. A., Cox, A. W., Fettig, A., Kucharczyk, S., Brock, M. E., Plavnick, J. B., Fleury, V. P. and Schultz, T. R. 2015. Evidence-based practices for children, youth, and young adults with autism spectrum disorder: A comprehensive review. Journal of Autism and Developmental Disorders, 45, 1951–1966.