
Using Instructional Design, Analyze, Design, Develop, Implement, and Evaluate, to Develop e-Learning Modules to Disseminate Supported Employment for Community Behavioral Health Treatment Programs in New York State

Overview of attention for article published in Frontiers in Public Health, May 2018

About this Attention Score

  • Average Attention Score compared to outputs of the same age

Mentioned by

4 X users

Citations

59 Dimensions

Readers on

314 Mendeley
Title
Using Instructional Design, Analyze, Design, Develop, Implement, and Evaluate, to Develop e-Learning Modules to Disseminate Supported Employment for Community Behavioral Health Treatment Programs in New York State
Published in
Frontiers in Public Health, May 2018
DOI 10.3389/fpubh.2018.00113
Authors

Sapana R. Patel, Paul J. Margolies, Nancy H. Covell, Cristine Lipscomb, Lisa B. Dixon

Abstract

Implementation science lacks a systematic approach to the development of learning strategies for online training in evidence-based practices (EBPs) that takes the context of real-world practice into account. The field of instructional design offers ecologically valid and systematic processes to develop learning strategies for workforce development and performance support. This report describes the application of an instructional design framework, the Analyze, Design, Develop, Implement, and Evaluate (ADDIE) model, in the development and evaluation of e-learning modules as one strategy within a multifaceted approach to the implementation of individual placement and support (IPS), a model of supported employment for community behavioral health treatment programs, in New York State. We applied quantitative and qualitative methods to develop and evaluate three IPS e-learning modules. Throughout the ADDIE process, we conducted formative and summative evaluations and identified determinants of implementation using the Consolidated Framework for Implementation Research (CFIR). Formative evaluations consisted of qualitative feedback received from recipients and providers during early pilot work. The summative evaluation, guided by the Kirkpatrick model for training evaluation, consisted of level 1 and level 2 quantitative and qualitative data (reaction to the training, self-reported knowledge, and practice change). Formative evaluation with key stakeholders identified a range of learning needs that informed the development of a pilot training program in IPS. Feedback on this pilot program informed the design document for three e-learning modules on IPS: Introduction to IPS, IPS Job Development, and Using the IPS Employment Resource Book. Each module was developed iteratively and provided an assessment of learning needs that informed successive modules. All modules were disseminated and evaluated through a learning management system.
Summative evaluation revealed that learners rated the modules positively, and self-reported knowledge acquisition was high (mean range: 4.4-4.6 out of 5). About half of learners (48-51%) indicated that they would change their practice after watching the modules. All learners who completed the level 1 evaluation demonstrated 80% or better mastery of knowledge on the level 2 evaluation embedded in each module. The CFIR was used to identify implementation barriers and facilitators in the evaluation data, which informed planning for subsequent implementation support activities in the IPS initiative. Instructional design approaches such as ADDIE may offer implementation scientists and practitioners a flexible and systematic way to develop e-learning modules, whether as a single component or as one strategy in a multifaceted approach to training in EBPs.

X Demographics

The data shown below were collected from the profiles of 4 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 314 Mendeley readers of this research output.

Geographical breakdown

Country    Count   As %
Unknown    314     100%

Demographic breakdown

Readers by professional status   Count   As %
Student > Master                  36     11%
Student > Ph. D. Student          28      9%
Student > Bachelor                22      7%
Lecturer                          20      6%
Researcher                        17      5%
Other                             64     20%
Unknown                          127     40%

Readers by discipline            Count   As %
Social Sciences                   36     11%
Medicine and Dentistry            27      9%
Nursing and Health Professions    24      8%
Computer Science                  18      6%
Psychology                        11      4%
Other                             64     20%
Unknown                          134     43%
Attention Score in Context

This research output has an Altmetric Attention Score of 2. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 07 June 2018.
All research outputs                                     #14,107,269 of 23,047,237 outputs
Outputs from Frontiers in Public Health                  #3,406 of 10,326 outputs
Outputs of similar age                                   #179,434 of 327,928 outputs
Outputs of similar age from Frontiers in Public Health   #68 of 96 outputs
Altmetric has tracked 23,047,237 research outputs across all sources so far. This one is in the 37th percentile – i.e., 37% of other outputs scored the same or lower than it.
So far Altmetric has tracked 10,326 research outputs from this source. They typically receive more attention than average, with a mean Attention Score of 10.0. This one has received more attention than average, scoring higher than 64% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 327,928 tracked outputs that were published within six weeks on either side of this one in any source. This one is in the 43rd percentile – i.e., 43% of its contemporaries scored the same or lower than it.
We're also able to compare this research output to 96 others from the same source and published within six weeks on either side of this one. This one is in the 28th percentile – i.e., 28% of its contemporaries scored the same or lower than it.
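The percentile figures above follow from the rank-in-context numbers. A minimal sketch of that arithmetic, assuming rank 1 is the highest-scoring output; the results land within a point or two of Altmetric's published percentiles, with the remaining gap plausibly due to tie handling and rounding on their side:

```python
def percentile_from_rank(rank: int, total: int) -> float:
    """Percentage of outputs scoring the same as or lower than the
    output ranked `rank` out of `total` (rank 1 = highest score)."""
    return (total - rank) / total * 100

# Rank-in-context numbers reported on this page.
contexts = {
    "All research outputs": (14_107_269, 23_047_237),
    "Outputs of similar age": (179_434, 327_928),
    "Similar age, same source": (68, 96),
}

for name, (rank, total) in contexts.items():
    print(f"{name}: ~{percentile_from_rank(rank, total):.0f}th percentile")
```

For example, ranking #68 of 96 means 28 outputs scored the same or lower, i.e. roughly the 29th percentile, close to the 28th percentile reported above.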