Chapter 4 — Evaluation & Outcomes

4.1 Policy Statement

ARmed for Medical Training and Consultancy – LLC – OP.C shall implement a comprehensive evaluation framework for all CME/CPD activities to measure effectiveness, ensure accountability, and demonstrate impact in alignment with DOH standards.
Evaluation is mandatory for every accredited activity and shall measure:

  • Learner satisfaction (Level 1).
  • Knowledge/competence acquisition (Level 2).
  • Performance-in-practice change (Level 3).
  • Patient or system outcomes (Level 4), where feasible.

Reference: DOH requires CME/CPD providers to assess activities for changes in competence, performance, and/or patient outcomes, and to retain evaluation data for 6 years.

4.2 Rationale

  • The DOH Activity Development Guide mandates that outcomes be linked to identified gaps and demonstrate measurable improvement.
  • Quality Assurance Manual requires providers to systematically evaluate activities for effectiveness and improvement.
  • International benchmarks (ACCME) emphasize evaluation as part of a Continuous Quality Improvement (CQI) cycle.

4.3 Evaluation Framework (Kirkpatrick Model)

4.3.1 Level 1 — Reaction (Satisfaction)

  • Learners’ perceptions of relevance, quality, and faculty performance.
  • Tool: Standard post-activity survey.
  • Timing: Immediately after activity.

4.3.2 Level 2 — Learning (Knowledge/Competence)

  • Measurement of knowledge gained or competence acquired.
  • Tools:
    • Pre- and post-tests.
    • MCQs mapped to objectives.
    • OSATS (Objective Structured Assessment of Technical Skills) for skills-based training.
  • Timing: At activity conclusion.

4.3.3 Level 3 — Behavior (Performance-in-Practice)

  • Measurement of practice change at the workplace.
  • Tools:
    • Follow-up survey (2–3 months post-activity).
    • Supervisor/peer attestations.
    • Clinical audits or case reviews.
  • Timing: 2–6 months post-activity.

4.3.4 Level 4 — Results (Patient/System Outcomes)

  • Assessment of impact on patient care, safety indicators, or system performance.
  • Tools:
    • QI/QA data before and after intervention.
    • Incident reports, infection rates, error reduction metrics.
    • Patient satisfaction data.
  • Timing: 6–12 months post-activity.
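
For providers that track evaluations electronically, the four levels above can be encoded as a simple schedule so that follow-up windows are not missed. The Python sketch below is one illustrative representation; the names (EvaluationLevel, EVALUATION_PLAN) are hypothetical and this is not a prescribed DOH format.

```python
from dataclasses import dataclass

@dataclass
class EvaluationLevel:
    """One Kirkpatrick level with its default tools and timing window."""
    level: int
    focus: str
    tools: list[str]
    window_months: tuple[int, int]  # (earliest, latest) months post-activity; (0, 0) = immediate

# Encodes sections 4.3.1-4.3.4 above.
EVALUATION_PLAN = [
    EvaluationLevel(1, "Reaction (satisfaction)",
                    ["Post-activity survey"], (0, 0)),
    EvaluationLevel(2, "Learning (knowledge/competence)",
                    ["Pre/post-test", "Objective-mapped MCQs", "OSATS"], (0, 0)),
    EvaluationLevel(3, "Behavior (performance-in-practice)",
                    ["Follow-up survey", "Supervisor/peer attestations",
                     "Clinical audit or case review"], (2, 6)),
    EvaluationLevel(4, "Results (patient/system outcomes)",
                    ["QI/QA data", "Incident/error metrics",
                     "Patient satisfaction data"], (6, 12)),
]
```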

4.4 Tools and Instruments

4.4.1 Standard Evaluation Survey (Level 1)

  • Relevance of topic to practice.
  • Achievement of learning objectives.
  • Faculty expertise and delivery quality.
  • Perceived bias or commercial influence (mandatory DOH item).
  • Overall satisfaction rating.

4.4.2 Knowledge Assessments (Level 2)

  • Minimum 5–10 MCQs per activity.
  • Questions mapped to higher-order levels of Bloom’s taxonomy (apply, analyze, evaluate).
  • Pre/post score comparison required to demonstrate knowledge gain.
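
One common way to operationalize the pre/post comparison is to report the mean raw gain alongside the mean normalized gain (the gain as a fraction of the marks still available at pre-test). A minimal Python sketch, assuming paired percentage scores per learner; the function name and the 0–100 scale are illustrative assumptions:

```python
def knowledge_gain(pre: list[float], post: list[float]) -> dict:
    """Summarize paired pre/post test scores on a 0-100 scale.

    Reports mean raw gain and mean normalized gain (gain divided by
    the marks still available at pre-test); learners already at 100
    are excluded from the normalized figure.
    """
    assert len(pre) == len(post) and pre, "scores must be paired per learner"
    raw = [b - a for a, b in zip(pre, post)]
    norm = [(b - a) / (100 - a) for a, b in zip(pre, post) if a < 100]
    return {
        "n_learners": len(pre),
        "mean_pre": sum(pre) / len(pre),
        "mean_post": sum(post) / len(post),
        "mean_raw_gain": sum(raw) / len(raw),
        "mean_normalized_gain": sum(norm) / len(norm) if norm else 0.0,
    }

# Illustrative values only: five learners, percentage scores.
print(knowledge_gain([40, 55, 60, 70, 50], [75, 80, 85, 90, 70]))
```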

4.4.3 Performance Assessments (Level 3)

  • Structured follow-up survey:
    • “Have you implemented [X] guideline since attending?”
    • “Did this activity change your decision-making in [Y] scenarios?”
  • Optional clinical audit forms.

4.4.4 Outcome Assessments (Level 4)

  • QI metrics (error rates, compliance with protocols).
  • Patient outcomes (readmission rates, complication rates).
  • Public health indicators (vaccination coverage, screening uptake).

4.5 Reporting and Documentation

  • Evaluation reports must be compiled within 14 days of activity completion.
  • Reports include:
    • Number of learners.
    • Survey response rate.
    • Satisfaction scores.
    • Knowledge gain (pre/post test results).
    • Follow-up performance data (if available).
    • CQI actions identified.
  • Reports presented to the Scientific Committee for review and approval.
  • DOH requires reporting of learner completions (attendance + credits) within 30 days.
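
Because the report fields above are fixed, their compilation can be semi-automated. The following Python sketch uses hypothetical field names and illustrative values; it is not a DOH reporting template:

```python
from dataclasses import dataclass, asdict, field
import json

@dataclass
class ActivityEvaluationReport:
    """Hypothetical container for the report fields listed above."""
    activity_id: str
    learners_enrolled: int
    surveys_returned: int
    mean_satisfaction: float          # e.g., on a 1-5 scale
    mean_knowledge_gain: float        # from the pre/post comparison
    followup_performance_available: bool
    cqi_actions: list[str] = field(default_factory=list)

    @property
    def response_rate(self) -> float:
        return self.surveys_returned / self.learners_enrolled

# Illustrative values only.
report = ActivityEvaluationReport(
    activity_id="ACT-2025-014",
    learners_enrolled=120,
    surveys_returned=84,
    mean_satisfaction=4.3,
    mean_knowledge_gain=22.5,
    followup_performance_available=False,
    cqi_actions=["Shorten didactic segment", "Add case-based MCQs"],
)
print(f"Response rate: {report.response_rate:.0%}")  # 70%
print(json.dumps(asdict(report), indent=2))
```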

4.6 Escalation & Corrective Actions

  • Minor gaps (e.g., survey response rate below 60%) → remedial action (e.g., additional surveys).
  • Moderate issues (e.g., low knowledge gain) → activity redesign or faculty remediation.
  • Serious issues (e.g., evidence of bias, failure to meet objectives) → activity suspension, CAPA log entry, and reporting to DOH within 10 working days.
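
Where evaluation findings are screened electronically, the tiers above lend themselves to a simple rules check. In the Python sketch below, only the 60% response-rate figure and the 10-working-day reporting deadline come from this policy; the knowledge-gain threshold is an assumption:

```python
def escalation_tier(response_rate: float, knowledge_gain: float,
                    bias_reported: bool, objectives_met: bool) -> str:
    """Map evaluation findings to the escalation tiers in section 4.6.

    The 0.60 response-rate cut-off and the DOH reporting deadline come
    from this policy; the 10-point knowledge-gain threshold is assumed.
    """
    if bias_reported or not objectives_met:
        # Serious: suspend activity, log in CAPA Register,
        # report to DOH within 10 working days.
        return "serious"
    if knowledge_gain < 10:  # mean pre/post gain in percentage points
        return "moderate"    # redesign activity or remediate faculty
    if response_rate < 0.60:
        return "minor"       # remedial action, e.g., additional surveys
    return "none"

print(escalation_tier(0.55, 18.0, bias_reported=False, objectives_met=True))  # "minor"
```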

4.7 Continuous Quality Improvement (CQI)

  • Evaluation data is aggregated annually into the CME/CPD Annual Report.
  • Oversight Committee reviews trends in:
    • Learner satisfaction.
    • Knowledge improvement.
    • Practice change.
    • Patient outcomes (if applicable).
  • Findings feed into next year’s activity planning (link between Chapter 4 and Chapter 2).
  • CQI actions are logged in the CAPA Register with timelines, owners, and effectiveness checks.

4.8 Documentation and Retention

  • All evaluation data (surveys, test results, reports, CAPA logs) must be retained for 6 years.
  • Data stored securely in both hard copy and electronic repositories.
  • Learner anonymity protected in reporting (in compliance with the UAE Personal Data Protection Law, PDPL).
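
Because the retention clock runs from a fixed date, the earliest permissible destruction date can be computed directly. A minimal Python sketch, assuming the 6-year period is measured from the activity completion date (the policy fixes the period but not the trigger date):

```python
from datetime import date

RETENTION_YEARS = 6  # DOH retention period for evaluation data

def retention_expiry(completed: date) -> date:
    """Earliest date on which evaluation records may be destroyed."""
    try:
        return completed.replace(year=completed.year + RETENTION_YEARS)
    except ValueError:
        # 29 February with a non-leap target year: fall back to 28 February.
        return completed.replace(year=completed.year + RETENTION_YEARS, day=28)

print(retention_expiry(date(2025, 3, 15)))  # 2031-03-15
```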

4.9 Templates and Appendices

  • Learner Evaluation Survey.
  • Activity Evaluation Report format.
  • CAPA Log.
  • Internal Audit Checklist (evaluation section).
