SUMMATIVE EVALUATION
Implement summative evaluation plans
This artifact is a detailed report reviewing the evaluation process and outcomes for a LinkedIn Learning course, “Animating in Photoshop.” The report demonstrates that I can use Kirkpatrick’s Four Levels of Evaluation to implement a plan that helps organizations determine a program’s effectiveness and decide whether to continue using an instructional product.
Instructional designers need to gather convincing evidence of their value to organizations. This requires proactive planning before the learning design process begins, because the learning experience must target the behaviors necessary to achieve learning and organizational goals. Designers can also be called in as consultants to evaluate existing programs. In both situations, they must plan carefully to acquire relevant data to show stakeholders.

To practice this skill, we were asked to select a learning product to evaluate. We chose a LinkedIn Learning course, “Animating in Photoshop,” that could serve the internship program of an imaginary game company, Tsunami Games. We imagined that the organization wanted Cinematics interns to produce animated trailers in order to increase marketing trailer output and decrease costs, and that it also wanted to ensure a high level of satisfaction with the internship program. The goals of the evaluation were therefore to measure intern engagement, trailer production levels, and trailer production costs. We measured these goals using exit surveys, interviews, action plans, production data, and a return on investment (ROI) analysis.
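For context, the ROI analysis follows the conventional formulation used in Kirkpatrick- and Phillips-style evaluations; the specific benefit and cost figures in our report came from the project’s mock production data:

ROI (%) = [(program benefits − program costs) ÷ program costs] × 100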
The executive summary, on pages three and four of the report, reviews the key findings, results, and recommendations. Overall, the program met expectations. Data from Kirkpatrick’s Levels 1-4 (Reaction, Learning, Behavior, and Results) indicated that “Animating in Photoshop” via LinkedIn Learning was an engaging, effective, cost-saving program that taught interns tools for creating more marketing trailers. We therefore recommended continuing the program and increasing the number of trailers made by interns to further increase annual trailer production and decrease costs. Appendix C on page 23 provides all mock data and results of the summative evaluation.
This was my first time planning and implementing a summative program evaluation, but as a teacher I created summative assessments at the end of units. To develop summative assessments that accurately measured students’ skill acquisition, I had to think carefully about how the assessment questions aligned with the concepts and skills students had learned in the unit. This parallels aligning our program evaluation to specific behaviors and outcomes (Levels 3 and 4 in the Kirkpatrick model). Similarly, our teaching staff often looked closely at state testing data to strategize how to implement effective classroom teaching practices that could translate into the behaviors measured on those standardized tests.
EDCI 577 was challenging because it was my first experience with program evaluation. I initially felt intimidated by the process because of its implied stakes: if designers cannot prove their value, they will be fired. After integrating these concepts into my practice as a designer, I learned that evaluation planning strengthens my learning experiences by making my design focus more intentional; it provides a destination so that I do not design arbitrarily. Since we created mock data for this project’s results, I look forward to developing this competency in authentic situations, where I may have to present negative outcomes to stakeholders.