My name is Joelyne Marshall, and I am here to present my Final Project: the Redesign of the Overall Evaluation for the WLPI.
I will walk you through:
- My reasoning for selecting the Overall Evaluation redesign
- Which significant elements of the WLPI were used
- Which elements of the ADDIE model were used
- My learning curve as I moved through the project
- The adjustments that were made to the NEW evaluation
- The current state of the evaluation
I am the WLPI intern for the Spring session. For the internship, I am required to complete an additional project that will provide future benefit to the Institute, and I was allowed to count my Final Project as a portion of my internship responsibilities. I selected the Overall Reaction Evaluation, which you see as part of this handout. (SHOW HANDOUT) Redesigning the evaluation gave me an area where I had limited experience and could learn more. It also provided a challenge and an opportunity to gain confidence in another aspect of training and development.
Although all aspects of the WLPI provided helpful information, the three modules most prominent in my project were:
- Designing Learning: using the ADDIE model and identifying objectives
- Delivering Training: tapping into presentation skills
- Measurement and Evaluation: learning the foundation of Level 1 and Level 2 evaluations
Throughout my project I moved back and forth through the different stages of the ADDIE model. Analysis was revisited with each revision of the evaluation, and the Design and Develop stages were also addressed. Implementation will occur at session end, when the participants complete the new version. The Evaluate stage was initially touched on through the revisions; it will go a step further once actual data is collected from participants.
My learning curve went deeper than I anticipated. I started with foundational information about developing evaluations from Week 6. Working directly with Ken Phillips allowed me to better understand the intricacies and nuances of evaluations. Over several phone calls, Ken would identify areas to tweak in the current draft (wording, scale, flow of questions). The discussions included my questions about why Ken would make each change he identified, and an explanation of the reasoning, and further understanding, followed. Coupled with the WLPI session, my conversations with Ken allowed me to better understand the following.
A consistent scale throughout the evaluation is key:
- It provides clarity for the respondent and allows for easier completion due to familiarity.
- It increases validity when ratings are similar and consistent.
- It makes the data from the completed evaluations easier to administer, since the data is consistent and within similar boundaries.
Initially, I did not fully appreciate the value of going through many revisions. The evaluation is currently on its 5th draft, and I anticipate it will be adjusted again. Revisions provide a chance for a new perspective to tweak and refine along the way; a fresh view is needed to look over the questions, wording, and flow.
At this time, I'd like to reference this handout (SHOW HANDOUT) to further explain the adjustments between the OLD and NEW evaluation. We will look at the following:
- Scale difference
- More in-depth questions
- Cross-validation
- New questions
You can see that the OLD evaluation used rankings from HIGH to LOW and placed descriptors at each point. The NEW version reverses this, per the information provided in Week 6 (referencing how we count, since high-to-low may cause inaccurate results). Also, the new version only has descriptors at the anchors, which are definitive. This allows the respondent to know exactly what 5 and 1 represent. As you can see from the final evaluation copy, I tried to use the same scale throughout to allow for easier administration.
As you can see from this example, I went deeper into the question about the Mentor, or Project Coach. The OLD version had only one question devoted to the Project Coach. The NEW questions look to acquire more feedback about the Project Coach, including whether they provided feedback and whether or not they were knowledgeable about WLP.
Another added question: during several of the sessions, participants talked about the helpful tips they learned from the facilitator that they intended to use on the job. Things such as Deb Pastors' "Ah Ha" moments and others were great techniques to add to our trainer tool belts. The recurrence of this from session to session suggested that it would benefit the WLPI to know how elements are applied on the job, so I added this question to see whether participants would use the tips and tools they learned.
Currently, the evaluation is on its 5th draft. I believe the scale is consistent, as is the question wording, and that the evaluation will be easier for participants to complete. The goal was to have the evaluation touch both Level 1 and Level 2: acquiring a reaction, and identifying whether learning occurred and whether the WLPI was relevant to participants and their job situations.
I've shared why I selected the evaluation redesign for my project. I also explained which elements of the WLPI were most prominent in my project and which stages of the ADDIE model were used. I explained the learning curve I went through as I reworked the evaluation, highlighted some specific pieces of the evaluation that were adjusted, and, lastly, described the current state of the evaluation. To end, I'd like to extend a big thank-you to Ken Phillips for all of his insight and expertise. Thank you.