Evaluating Organizational Learning

Evaluation and Continuous Improvement - Learning in Organizations (Ford, 2020)

Evaluation:

  1. Needs Assessment:

    • Setting up an evaluation plan is a parallel process with the needs assessment.

    • The evaluation plan answers questions about the evaluation's purpose, how data will be collected, and the appropriate intensity of the effort.

  2. Five Questions Addressed by the Evaluation Plan:

    • Relevance: Does the content reflect learner needs identified in the needs assessment?

    • Content Validity: Is the content job relevant, as judged by systematic evaluation of the content domain?

    • Ratings of Job Relevance: Do learners, asked directly, rate the content as relevant to their jobs?

    • Emphasis: Is appropriate emphasis placed on each targeted knowledge and skill area?

  3. Learning Validity:

    • Identifies whether the expected level of learning was achieved relative to standards of success.

    • Measures different knowledge constructs through a variety of assessment methods.

  4. Transfer Validity:

    • Assesses changes in on-the-job behavior after learning.

    • Examines direct application of skills, learning from observing others, explaining ideas to others, and leading teams.

  5. Job Performance and Organizational Payoff:

    • Measures proficiency of job performance and contribution to team goals.

    • Gauges organizational payoff through economic impact or documented changes in performance.

  6. Return on Investment (ROI):

    • Expresses program value as net benefits (monetized benefits minus program costs) relative to costs; see the worked sketch after this list.

    • The steps involve developing a valuation plan and estimating ROI conservatively.

  7. Success Case Method:

    • Determines whether the changes the program intended are actually being achieved.

    • Identifies success (and non-success) cases through surveys or records, relying on self-reported data; a selection sketch follows this list.

  8. Informative Evaluation:

    • Determines the purpose of the evaluation and develops measures appropriate to it.

    • Collects high-quality data to support informed choices about retaining or modifying the program.

  9. Stakeholders and Quality of Measurement:

    • Identifies interested parties and their expectations.

    • Focuses on developing criterion measures with high validity.

  10. Proportionate Evaluation:

    • Creates measures, designs studies, and analyzes data proportionate to learning needs and organizational capabilities.

  11. Choice Points in Evaluation:

    • Evaluation efforts can be simple or complex based on priorities, resources, and organizational commitments.

    • Strong evaluation plans are essential for effective interventions.

  12. Internal Validity and Threats:

    • Considers whether the intervention made a difference and evaluates potential threats.

    • Threats to internal validity include history, testing, instrumentation, differential selection, and program integrity.
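
A worked sketch of the ROI calculation from item 6: ROI (%) is net benefits divided by program costs, times 100, where net benefits are monetized benefits minus costs. All figures below are hypothetical assumptions for illustration, not values from Ford (2020); the conservative-estimation step in the text would shape the benefits number.

```python
# ROI (%) = (net benefits / program costs) * 100,
# where net benefits = monetized benefits - program costs.
# All figures are hypothetical.

program_costs = 40_000.0       # design, delivery, materials, learner time
monetized_benefits = 70_000.0  # conservative estimate of performance gains

net_benefits = monetized_benefits - program_costs
roi_percent = (net_benefits / program_costs) * 100

print(f"Net benefits: ${net_benefits:,.0f}")   # $30,000
print(f"ROI: {roi_percent:.0f}%")              # 75%
```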
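
And a minimal sketch of the selection step in the Success Case Method (item 7): screen learners with a short survey, then pick the extreme high and low scorers for follow-up interviews. The names, scores, and the 20% cutoff are hypothetical assumptions.

```python
# Success Case Method selection step: rank self-reported application
# scores, then select the extremes for interviews. Data are hypothetical.

survey = {  # learner -> self-reported application score (1-5)
    "Aiko": 5, "Ben": 2, "Carla": 4, "Dev": 1, "Elena": 5,
    "Farid": 3, "Gwen": 2, "Hal": 4, "Ines": 1, "Jon": 3,
}

ranked = sorted(survey, key=survey.get, reverse=True)
k = max(1, len(ranked) // 5)   # top/bottom 20% as candidate cases

print("Likely success cases (interview for what worked):", ranked[:k])
print("Likely non-success cases (interview for barriers):", ranked[-k:])
```

Because the screening data are self-reported, the follow-up interviews serve to corroborate or discount each candidate case.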

Evaluation Designs:

  1. Learner Post-Assessment/Case Study Design:

    • Uses only a post-test, so it cannot demonstrate change.

  2. Learner Pre-and Post-Assessment Design:

    • The traditional design: a pretest and a post-test for the learning group.

    • An internal referencing strategy (also assessing content that was not trained, as a comparison baseline) can strengthen the design.

  3. Pre-Test/Post-Test, Control Group Design:

    • One group completes the pretest, the training, and the post-test; the other completes the pre- and post-tests without training. See the gain-score sketch after this list.

    • Threats include differential selection and regression to the mean.

  4. Randomized Control Group Design:

    • Similar to the above, but with random assignment of learners to the learning and control groups.

  5. Solomon Four-Group Design:

    • Highly rigorous design addressing most validity threats.

  6. Time Series Quasi-Experimental Design:

    • The learning group completes four pretests, then the training, then four post-tests. See the trend-projection sketch after this list.

    • The repeated measurements help rule out threats such as testing effects and regression to the mean.
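
A sketch of how the pretest/post-test control-group comparison (design 3) might be analyzed: compute each learner's gain score and test whether the trained group's gains exceed the control group's. All scores are hypothetical, and SciPy is assumed to be available.

```python
from scipy import stats

# Hypothetical pre/post scores for eight learners per group.
trained_pre  = [62, 58, 70, 65, 61, 59, 68, 64]
trained_post = [78, 74, 85, 80, 75, 72, 84, 79]
control_pre  = [63, 60, 69, 66, 62, 58, 67, 64]
control_post = [66, 61, 71, 68, 64, 60, 70, 66]

gain_t = [b - a for a, b in zip(trained_pre, trained_post)]
gain_c = [b - a for a, b in zip(control_pre, control_post)]

# If the training made a difference, trained gains should exceed
# control gains beyond what chance would produce.
t_stat, p_value = stats.ttest_ind(gain_t, gain_c)
print(f"Mean gain, trained: {sum(gain_t) / len(gain_t):.1f}")
print(f"Mean gain, control: {sum(gain_c) / len(gain_c):.1f}")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```

Random assignment (design 4) is what makes this comparison cleanly interpretable; without it, differential selection can still explain the gap.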
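
And a sketch of the logic behind the time-series design (design 6): fit the trend across the four pretests, project it forward, and compare the projection with the four observed post-tests. The eight observations are hypothetical.

```python
import numpy as np

pre  = np.array([60.0, 61.0, 59.0, 62.0])   # four pretests
post = np.array([74.0, 76.0, 75.0, 77.0])   # four post-tests

t_pre, t_post = np.arange(4), np.arange(4, 8)

slope, intercept = np.polyfit(t_pre, pre, 1)  # baseline linear trend
projected = slope * t_post + intercept        # the "no training" forecast

print(f"Baseline slope: {slope:.2f} points per period")
print(f"Observed post mean {post.mean():.1f} vs. projected {projected.mean():.1f}")
print(f"Level shift after training: {post.mean() - projected.mean():.1f}")
```

A stable baseline followed by a clear level shift is hard to attribute to testing effects or regression to the mean, which is why the repeated measurements strengthen the design.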

Continuous Improvement:

  1. Learning Systems Model:

    • Evaluation results feed into a continuous improvement model.

    • Feedback loops run back to design, delivery, and evaluation so the program can be modified.

  2. Feedback Loop:

    • The loop emphasizes summative and formative evaluation processes as well as external validity.

  3. Summative and Formative Evaluation:

    • Summative evaluation targets overall outcomes, including comparisons across interventions.

    • Formative evaluation focuses on understanding why outcomes were or were not achieved.

  4. External Validity Issues:

    • Summative evaluation provides information on program effectiveness.

    • External validity concerns whether findings generalize, which requires multiple studies in different settings.

  5. Rapid Evaluation and Assessment Methods (REAM):

    • Aims to balance speed and accuracy across the needs assessment, planning, implementation, and evaluation processes.

    • Involves real-time evaluations, systematic organization, data collection, and debriefing sessions.

Best Practice Guidelines:

  1. Articulate Purpose and Identify Stakeholders:

    • Clearly define evaluation purpose and identify interested parties.

    • Build relevant evaluation measures.

  2. High-Quality Measures:

    • Create measures with high levels of reliability and validity; a reliability sketch follows this list.

  3. Realistic Evaluation Plan:

    • Develop a realistic evaluation plan considering available resources.

  4. Appropriate Design:

    • Minimize threats to internal validity through appropriate design.

    • Consider quasi-experimental design with multiple time points when necessary.

  5. Formative Evaluation:

    • Use formative evaluation during a pilot program to improve instruction quality.

    • Test rather than assume the generalizability of evaluation findings.

  6. Considerations for Evaluation Designs:

    • Match evaluation efforts with learning priorities and organizational capabilities.

    • Managers and supervisors should provide post-learning assessments of transfer.

    • Resources must be available to take action based on evaluation data.

  7. Internal Validity and Threats:

    • Assess threats to internal validity, including history, testing, instrumentation, selection, and program integrity.

  8. Continuous Improvement:

    • Implement feedback loops for ongoing program modification.

    • Balance summative and formative evaluation approaches for effective continuous improvement.

  9. Rapid Evaluation (REAM):

    • Consider rapid evaluation methods for urgent needs, using mixed-method approaches.

    • Focus on approaches that are rapid, participatory, team-based, iterative, and appropriate for urgent situations.
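
As flagged in guideline 2, one common way to check a measure's reliability is internal consistency. A minimal sketch computing Cronbach's alpha on hypothetical item-level responses (rows are learners, columns are assessment items); alpha is one reliability index among several, and the data below are illustrative only.

```python
import numpy as np

# Rows = learners, columns = assessment items (hypothetical 1-5 ratings).
scores = np.array([
    [4, 5, 4, 5],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 4],
], dtype=float)

# Cronbach's alpha = k/(k-1) * (1 - sum(item variances) / var(total scores))
k = scores.shape[1]
item_vars = scores.var(axis=0, ddof=1)       # variance of each item
total_var = scores.sum(axis=1).var(ddof=1)   # variance of learners' totals

alpha = (k / (k - 1)) * (1 - item_vars.sum() / total_var)
print(f"Cronbach's alpha: {alpha:.2f}")  # ~0.91 for these data
```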
