Learning Effectiveness: A Program Evaluation Framework
CAUCE-CNIE Conference, Waterloo, ON, Thursday, June 2, 2016
Bev Beattie, Assistant Professor, RPN to BScN Blended Learning Program, [email protected]
Steven Cairns, Assistant Professor, RPN to BScN Blended Learning Program, [email protected]
Agenda
Picture Reference: S. Cairns, 2016.
Share · Explore · Summarize · Explain
“The Mechanics of Program Evaluation”
Picture Reference: S. Cairns, 2016.
The Mechanics of Program Evaluation “The why and what”
Why conduct a program evaluation?
Know who you are and what you do.
Quality · Program Accountability · Ethical Responsibility · Stakeholder Input · Accreditation Requirements · Sustainability · Learner Satisfaction
The Mechanics of Program Evaluation “Pay now or pay later” What are the challenges when conducting program evaluation?
Facing the challenges found ‘under the hood’ is complex.
Expertise
Resources: Human, Financial, Time
Method
Collecting & Analyzing the Data
Commitment
The Mechanics of Program Evaluation “Structure is good” What process or approach do you use for program evaluation?
Building on a structured process for evaluation enables the ‘fine-tuning’ of effectiveness.
[Diagram: Program Evaluation at the centre, connecting the challenges (Expertise; Method; Resources: Human, Financial, Time; Collecting & Analyzing the Data; Commitment), the evaluation orientations (Consumer, Expertise, Decision, Participant, Program), and the reasons for evaluating (Quality, Program Accountability, Ethical Responsibility, Stakeholder Input, Accreditation Requirements, Sustainability, Learner Satisfaction).]
Copyright © 2016 by Bev Beattie & Steve Cairns
Stufflebeam’s CIPP Framework
Context (C): Mission and goals of the program; program operation; stakeholder input. “Planning decisions” (Fitzpatrick et al., 2011, p. 174).
Input (I): Financial, human, and physical resource allocation. “Structuring decisions” (Fitzpatrick et al., 2011, p. 174).
Process (P): Curriculum. “Implementing decisions” (Fitzpatrick et al., 2011, p. 174).
Product (P): Satisfaction, performance, and accomplishments. “Results and recycling decisions” (Fitzpatrick et al., 2011, p. 174).
References: Fitzpatrick et al., 2011; Horne & Sandmann, 2012; Oermann, 2016; Singh, 2004; Suhayda & Miller, 2006.
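To make the component-to-decision mapping concrete, the sketch below stores each CIPP component with the decision type it informs, using the Fitzpatrick et al. (2011) wording quoted above. This is a hypothetical Python illustration, not part of Stufflebeam’s model or any tool used in the program.

```python
# Hypothetical sketch: the four CIPP components and the decision
# type each one informs (wording from Fitzpatrick et al., 2011, p. 174).
CIPP_COMPONENTS = {
    "Context (C)": {
        "focus": ["mission and goals", "program operation", "stakeholder input"],
        "decisions": "planning decisions",
    },
    "Input (I)": {
        "focus": ["financial resources", "human resources", "physical resources"],
        "decisions": "structuring decisions",
    },
    "Process (P)": {
        "focus": ["curriculum"],
        "decisions": "implementing decisions",
    },
    "Product (P)": {
        "focus": ["satisfaction", "performance", "accomplishments"],
        "decisions": "results and recycling decisions",
    },
}

for name, component in CIPP_COMPONENTS.items():
    print(f"{name}: informs {component['decisions']}")
```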
RPN to BScN Blended Learning Program Systematic Program Evaluation Framework CIPP Program Evaluation Model ‘Draft’
Matrix columns:
Criterion
Information Required
Information Sources
Method for Collecting Information
Assigned Responsibilities: By Whom
Information Collected: Conditions
Information Collection: When
Analysis Procedures
Frequency of Review: To Whom/How and When
Matrix rows (one per CIPP component):
Context Component
Input Component: financial, human, and physical resources.
Process Component: curriculum, etc.
Product Component: satisfaction, performance, and accomplishments.
Adapted CIPP model from: Fitzpatrick, J. L., Sanders, J. R., & Worthen, B. R. (2004), evaluation plan text; Suhayda & Miller (2006), Rush University College of Nursing program evaluation matrix.
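As a purely illustrative aid, one row of the matrix above can be modelled as a small record; the Python dataclass below is a hypothetical sketch whose field names mirror the matrix columns, with example values paraphrased from the mentorship criterion shown later in this deck.

```python
from dataclasses import dataclass

# Hypothetical sketch of one row in the CIPP program evaluation matrix.
# Field names mirror the matrix columns listed above.
@dataclass
class MatrixRow:
    component: str            # Context, Input, Process, or Product
    criterion: str
    information_required: str
    information_sources: str
    collection_method: str
    assigned_to: str          # assigned responsibilities: by whom
    collected_when: str       # information collection: when/conditions
    analysis_procedures: str
    review_frequency: str     # frequency of review: to whom/how and when

# Example values paraphrased from the mentorship criterion shown later.
row = MatrixRow(
    component="Input",
    criterion="Mentorship Model",
    information_required="Instructor needs and interest in mentorship",
    information_sources="Online distance instructors in the Blended Learning Program",
    collection_method="Survey and discussion",
    assigned_to="Steve Cairns",
    collected_when="Sept.-Oct. 2015",
    analysis_procedures="Summarize and share survey results",
    review_frequency="Program and council meetings",
)
```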
RPN to BScN Blended Learning Program Systematic Program Evaluation Framework CIPP Program Evaluation Model Context (C) Component
Form columns: Criterion · Data · Actions · Assigned Responsibility · Frequency of Review
Reference: Suhayda & Miller, 2006.
RPN to BScN Blended Learning Program Systematic Program Evaluation Framework: CIPP Program Evaluation Model
Input (I) Component [Financial, Human, and Physical Resource Allocation]
Criterion: Mentorship Model
Data: Establish a mentorship model and plan for new and continuing instructors within the Blended Learning Program.
Actions · Assigned Responsibility · Frequency of Review:
1) Develop mentorship model considerations for the Blended Learning Program. · Steve Cairns · Summer 2015.
2) Survey/discussion with the goal of identifying current needs and opportunities for mentorship programming for online distance instructors. · Steve Cairns · Meeting October 15, 2015; summary November 15, 2015.
3) Discussion item: mentorship planning within the School of Nursing. · Steve Cairns (contributors: Bev Beattie, Stephanie Langford, Jennifer Black, Vivian Papaiz) · Program Manager Meeting March 17, 2016; Council Meeting April 1, 2016; Nursing Scholarship Council Meeting May 16, 2016; Blended Program Meeting May 26, 2016.
Data Action Form: highlights an identified program/curriculum issue and outlines a plan of action. Similar to an ongoing needs assessment:
Identify issues.
Outline an action plan.
Record actions taken.
Provide target dates.
Include person(s) responsible.
Reference: Suhayda & Miller, 2006.
Program Evaluation Data Action Form
Fields: Date · Criterion · Summary of the Issue/Concern · Data Action Plan · Actions Taken · Date Completed · Review/Comments/Follow-up
Adapted from: Suhayda, R., & Miller, J. (2006). Optimizing evaluation of nursing education programs. Nurse Educator, 31(5), 200-206.
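Again purely as an illustration, the form fields listed above can be kept as a reusable template; the helper below is a hypothetical Python sketch, not part of Suhayda and Miller’s published tool.

```python
# Hypothetical sketch: the Data Action Form as a reusable template.
# Keys mirror the form fields listed above.
def new_data_action_form(date: str, criterion: str, summary: str) -> dict:
    """Start a blank Data Action Form entry for a newly identified issue."""
    return {
        "date": date,
        "criterion": criterion,
        "summary_of_issue_concern": summary,
        "data_action_plan": [],          # planned steps with target dates
        "actions_taken": [],             # steps completed so far
        "date_completed": None,
        "review_comments_followup": [],  # ongoing review notes
    }

# Example entry, paraphrasing the mentorship initiative that follows.
form = new_data_action_form(
    date="2015-07-22",
    criterion="Mentorship Model",
    summary="Mentorship programming for online distance instructors",
)
form["data_action_plan"].append("Conduct a needs-assessment survey (Sept.-Oct. 2015)")
```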
Mentorship Initiative: Program Evaluation Data Action Form (CIPP Program Evaluation Model)
Dates: July 22, 2015 (Curriculum Committee Retreat Day); October 15, 2015
Criterion: Input (I) Component
Summary of the Issue/Concern: Online distance instructors have discussed mentorship programming within the Blended Nursing Program. A variety of administrative and peer support initiatives have been in place, and a ‘mentorship model’ has been developed to represent the opportunities that exist. Program Managers met on March 17, 2016 to briefly discuss potential connections within all of the SON programs.
Data: Identify current needs and opportunities for mentorship programming for online distance instructors. Instructors in the RPN to BScN Blended Learning Program. Consider for all SON programs at NU.
Literature:
Hui, Z. (2012). Mentoring engagement model. Settlement Services: Settlement, Language and Community (SLC) Services Division, Vancouver, BC. Retrieved from http://post.successbc.ca/eng/component/option,com_mtree/task,viewlink/link_id,1203/Itemid,26/
Nick, J. M., Delahoyde, T. M., Prato, D. D., Mitchell, C., Ortiz, J., Ottley, C., et al. (2012). Best practices in academic mentoring: A model for excellence. Nursing Research and Practice, 1-9. doi: 10.1155/2012/937906
Carter, L., Salyers, V., Page, P., Williams, L., Albl, L., & Hofsink, C. (2012). Canadian Journal of Learning and Technology, 38(1). ISSN 1499-6685. Retrieved from http://www.cjlt.ca/index.php/cjlt/article/view/598
Cooper, M., & Wheeler, M. (2007). Career talk: Building successful mentoring relationships. Canadian-nurse.com, Canadian Nurses Association, Ottawa, ON. Retrieved from http://fly.yale.edu/sites/default/files/files/Cooper%20and%20Wheeler%202007%20Mentoring%20Relationship.pdf
Canadian Nurses Association. (2004). Achieving excellence in professional practice: A guide to preceptorship and mentorship. Ottawa, ON. Retrieved from https://www.cna-aiic.ca/~/media/cna/page-content/pdf en/achieving_excellence_2004_e.pdf?la=en
Mentorship Initiative: Program Evaluation Data Action Form (CIPP Program Evaluation Model)
Action Plan:
i) Develop a mentorship model.
ii) Conduct a needs assessment (survey) and determine the scope of instructor interest in the Blended Learning Program.
iii) Initiate a mentorship program.
Actions Taken:
Lead: Steven Cairns, Assistant Professor.
Survey (Online Instructor Mentorship Survey), Sept.-Oct. 2015: “It is hoped that the information gathered will help indicate instructor perspectives towards the scope and direction of potential mentorship activities.”
Development of mentorship model draft (Appendix 1).
Meeting notes, October 15, 2015: survey results shared (Appendix 2).
Meeting with Program Management Team, March 17, 2016.
Discussion item, Nursing Faculty Meeting, March 23, 2016.
Summary notes for Nursing Council Meeting, March 28, 2016 (Appendix 3).
Discussion item, Nursing Council Meeting, April 1, 2016.
Discussion item, Nursing Scholarship Council Meeting, May 16, 2016.
Follow-up discussion, Blended Program Meeting, May 26, 2016.
Date Completed:
Review/Comments/Follow-up: Consideration for the School of Nursing:
Outline roles and responsibilities for structural mentorship planning within SON programs.
Identify interest for mentorship programming between SON programs.
Consider opportunities for mentorship initiatives within the SON scholarship subcommittee.
Rationale for CIPP Framework
Encompasses a systematic framework and orderly approach.
Provides useful information for decision making (Fitzpatrick et al., 2011).
Supports evaluation as the program operates, grows, and changes (Fitzpatrick et al., 2004).
Encourages formative and summative evaluation.
Promotes an organizational culture involving ongoing program improvement and accountability, maintaining program standards, and striving for adaptability and sustainability.
Aligns with program accreditation requirements.
Supported by evidence-informed literature.
Overall goal: Learning Effectiveness.
Examples of Other Approaches
Expertise-Oriented and Consumer-Oriented: Accreditation; work by M. Scriven.
Program-Oriented: Logic models, theory-based, and objectives-oriented evaluation. Work by C. Weiss, R. Tyler, and M. Provus.
Decision-Oriented: CIPP model and utilization-focused evaluation. Work by D. Stufflebeam and M. Patton.
Participant-Oriented: Developmental evaluation and stakeholder-based evaluation. Work by M. Patton, R. Stake, and Mark & Shotland.
References: Fitzpatrick et al., 2011; Oermann, 2016; Schug, 2012; Scriven, 1991.
Kirkpatrick Framework: Four Levels for Evaluating Training
1) Reaction: learner satisfaction.
2) Learning: learning occurred.
3) Behavior: changes in job behavior.
4) Results: concrete results related to quality improvement.
Added fifth level: Return on Investment (ROI) (Phillips, 1996).
Reference: Galloway, 2005.
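Phillips’ added ROI level is commonly expressed as net program benefits divided by program costs, as a percentage. The short sketch below illustrates that standard formula with invented figures; nothing here comes from the program being evaluated.

```python
# Illustrative only: the standard Phillips ROI formula with invented numbers.
# ROI (%) = (net program benefits / program costs) * 100
program_costs = 50_000.0     # hypothetical total cost of the training program
program_benefits = 65_000.0  # hypothetical monetized benefits attributed to it

net_benefits = program_benefits - program_costs
roi_percent = (net_benefits / program_costs) * 100
print(f"ROI = {roi_percent:.1f}%")  # prints: ROI = 30.0%
```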
Thank you! Questions?
Picture References: S. Cairns, 2016; Nursing Pin Designer: S. Fitzgerald.
References
Chen, H. (1996). A comprehensive typology for program evaluation. Evaluation Practice, 17(2), 121.
Fitzpatrick, J. L., Sanders, J. R., & Worthen, B. R. (2004). Program evaluation: Alternative approaches and practical guidelines (3rd ed.). San Francisco, CA: Pearson.
Fitzpatrick, J. L., Sanders, J. R., & Worthen, B. R. (2011). Program evaluation: Alternative approaches and practical guidelines (4th ed.). Boston, MA: Pearson.
Galloway, D. L. (2005). Evaluating distance delivery and e-learning: Is Kirkpatrick’s model relevant? Performance Improvement, 44(4), 21-28.
Horne, E., & Sandmann, L. (2012). Current trends in systematic program evaluation of online graduate nursing education: An integrative literature review. Journal of Nursing Education, 51(10), 570-578. doi: 10.3928/01484834-20120820-06
Neutens, J. J., & Rubinson, L. (2014). Research techniques for the health sciences (5th ed.). San Francisco, CA: Pearson. ISBN-13: 9780321896414
Oermann, M. H. (2017). A systematic approach to assessment and evaluation of nursing programs. Washington, DC: National League for Nursing, Wolters Kluwer.
Schug, V. (2012). Curriculum evaluation: Using National League for Nursing Accrediting Commission standards and criteria. Nursing Education Perspectives, 33(5), 302-305.
Scriven, M. (1991). Beyond formative and summative evaluation. In M. W. McLaughlin & D. C. Phillips (Eds.), Evaluation and education: At quarter century. Chicago, IL: University of Chicago Press.
Singh, M. (2004). Evaluation framework for nursing education programs: Application of the CIPP model. International Journal of Nursing Education Scholarship, 1(1), Article 13. doi: 10.2202/1548-923X.1023
Suhayda, R., & Miller, J. (2006). Optimizing evaluation of nursing education programs. Nurse Educator, 31(5), 200-206.