Summative Evaluation and Tools

The summative evaluation plan for this course will rely on surveys and on data collected and analyzed from the LMS, including grades, peer reviews, and discussion forum comments. It will also draw on logs kept by the instructor and support teams that track student retention and completion, workload and time on task (for both instructors and students), and the total number of technical, content, or LMS support questions for this class during the quarter.

A pre-course ENKA survey and a post-course EXKA survey will evaluate students' familiarity with key concepts and terms and their confidence in explaining or presenting those concepts and terms to others. The eLearning department also administers a standard course survey for students that consists mostly of Level I data collection, defined by the Kirkpatrick Evaluation Model as "reaction" questions that measure how well learners liked the course (Chyung, 2008).

Finally, the course will undergo a Quality Matters (QM) design review by the instructor, the program manager, and one other QM-qualified reviewer, measuring the course against the required standards of the QM rubric for instructional design. This review will take place before the end of the next academic quarter.

References

Chyung, S. Y. (2008). Foundations of instructional performance technology. Amherst, MA: HRD Press, Inc.

Summative Criteria, Primary Factors, and Data Sources

Effectiveness: Mastery of WBI goals

Primary factors:
  • Can students apply concepts and procedures to create a tool for analyzing an organization's online marketing and social media strategy?
  • Have students learned terms and concepts well enough to explain them to others in writing or in a presentation?

Data sources:
  • Student scores on the final analysis assignment and presentation
  • Student self-reflection comments collected from the discussion forum and blog
  • Comments collected and organized from the final peer reviews
  • Comparison of ENKA and EXKA scores on recognition of online and social media marketing terms and concepts
  • Quiz and midterm averages

Efficiency: Course delivered in a timely and logical manner; material accessible to all students

Primary factors:
  • Was there sufficient time for students to create, upload, and reflect on assignments?
  • Did the LMS and related technology function as expected?
  • What was the instructor's workload for troubleshooting, facilitation, and feedback?

Data sources:
  • QM standards review of the online course design (reviewed against the rubric's required standards only)
  • Logs of time spent answering logistical questions about the course, plus preparation time, facilitation workload, and feedback/scoring workload
  • Number of requests for, or uses of, "free passes" to submit late work
  • Mid-course and end-of-course surveys administered by the instructor
  • eLearning department course survey

Appeal: Gains and maintains learner attention and interest

Primary factors:
  • Will students take other distance courses from the college?
  • Did learners enjoy working with the materials in the course?
  • Which part of the course was the most engaging and enjoyable?

Data sources:
  • End-of-course survey
  • Enrolled-to-completed ratio of students
  • Self-reflection discussion forum


Data Collection Timeline Chart

[Figure: Data collection timeline chart]