...

Evaluation and Continuous Improvement - Learning in Organizations (Ford, 2020)

Evaluation:

  1. Needs Assessment:

    • Parallel process with setting up an evaluation plan.

    • Evaluation plan answers questions about purpose, data collection, and appropriate intensity.

  2. Five Questions Addressed by the Evaluation Plan:

    • The five questions concern relevance, learning validity, transfer validity, job performance, and organizational payoff; relevance is covered here, and the rest in the items below.

    • Relevance: Whether the program reflects learner needs identified in the needs assessment.

    • Content Validity: Assesses job relevance by evaluating how well the program covers the content domain.

    • Ratings of Job Relevance: Asks learners directly to rate the job relevance of the program content.

    • Emphasis: Assesses whether knowledge and skills receive appropriate emphasis.

  3. Learning Validity:

    • Identifies expected level of learning in relation to success standards.

    • Measures different knowledge constructs through various assessment methods.

  4. Transfer Validity:

    • Assesses changes in behavior on the job after learning.

    • Examines direct application, learning from observation, explaining ideas to others, and leading teams.

  5. Job Performance and Organizational Payoff:

    • Measures job performance proficiency and contribution to team goals.

    • Considers economic impact or changes in performance for organizational payoff.

  6. Return on Investment (ROI):

    • Calculates program value as net benefits (program benefits minus program costs) divided by program costs, expressed as a percentage.

    • Steps involve developing a valuation plan and estimating ROI conservatively.

  7. Success Case Method:

    • Determines if program-intended changes are achieved.

    • Identifies success cases through surveys or records, relying on self-reported data.

  8. Informative Evaluation:

    • Determines evaluation purpose and develops appropriate measures.

    • Collects high-quality data for informed choices about program retention and modification.

  9. Stakeholders and Quality of Measurement:

    • Identifies interested parties and their expectations.

    • Focuses on developing criterion measures with high validity.

  10. Proportionate Evaluation:

    • Creates measures, designs studies, and analyzes data proportionate to learning needs and organizational capabilities.

  11. Choice Points in Evaluation:

    • Evaluation efforts can be simple or complex based on priorities, resources, and organizational commitments.

    • Strong evaluation plans are essential for effective interventions.

  12. Internal Validity and Threats:

    • Considers whether the intervention made a difference and evaluates potential threats.

    • Threats to internal validity include history, testing, instrumentation, differential selection, and program integrity.
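
The ROI calculation in item 6 reduces to simple arithmetic. A minimal sketch, using hypothetical benefit and cost figures (the function name and dollar amounts are illustrative, not from the source):

```python
def training_roi(benefits: float, costs: float) -> float:
    """Return ROI as a percentage: net benefits divided by program costs.

    Net benefits = program benefits - program costs.
    """
    if costs <= 0:
        raise ValueError("costs must be positive")
    return (benefits - costs) / costs * 100

# Hypothetical figures: $150,000 in measured benefits, $100,000 in costs.
roi = training_roi(150_000, 100_000)
print(f"ROI: {roi:.0f}%")  # ROI: 50%
```

In line with the "estimate conservatively" step, practitioners typically use lower-bound benefit estimates so that the reported ROI understates rather than overstates program value.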

...