 Measuring Training's Value: Metrics Lite
Daniel Franklin
« Posted: November 05, 2007, 02:15:06 PM »


As a sales consulting and training company, we are often asked by clients and prospects whether we can measure the impact of our training, and occasionally whether we can guarantee increased sales. While we have confidence in our consulting, instructional design and facilitation abilities, and understand the importance of these questions from both management and training perspectives, quantifying training's impact can be a slippery slope for any training company.

Isolating training's effect from the many factors that may positively or negatively influence its success is a major pitfall in the evaluation process. Most training professionals recognize that the best training in the world may not be able to overcome the impact of poor management, low morale, unsatisfactory compensation structures, unreasonable sales goals, and a variety of other factors. Conversely, all too often mediocre training appears superb when supported by a marketing blitz, product price reductions, or an economic upswing.

Despite these inherent challenges, we agree that evaluating training is a wise and necessary process. However, the process doesn't need to become a science project amassing empirical evidence to "prove" that the training intervention was solely responsible for the results.

Many organizations want to measure results but are unable to provide the time and money to support the process. To help those organizations, we suggest a lighter (tastes great, less filling) approach: "Metrics Lite." "Metrics Lite" employs Donald L. Kirkpatrick's Four Levels of Evaluating Training Programs, simply stated below:

Level 1: Reaction - Participant reaction to the training

Level 2: Learning - Change or increase in knowledge or skills

Level 3: Behavior - Extent of the application of learning

Level 4: Results - Effect on the business resulting from the training
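As a minimal sketch of how the four levels can be organized into an evaluation plan, consider the Python outline below; the questions and instruments are our illustrative pairings (drawn from the case study that follows), not Kirkpatrick's own wording.

```python
from dataclasses import dataclass

@dataclass
class EvaluationLevel:
    level: int
    name: str
    question: str     # what this level asks about the training
    instrument: str   # one way it might be measured

# A hypothetical "Metrics Lite" plan mirroring the four levels above.
PLAN = [
    EvaluationLevel(1, "Reaction", "Did participants find it relevant and effective?", "post-workshop survey"),
    EvaluationLevel(2, "Learning", "Did knowledge or skills increase?", "written pre/post-test"),
    EvaluationLevel(3, "Behavior", "Are the skills applied on the job?", "call monitoring checklist"),
    EvaluationLevel(4, "Results", "Did the business improve?", "conversion and cross-sale rates"),
]

for lvl in PLAN:
    print(f"Level {lvl.level} ({lvl.name}): {lvl.question} -> {lvl.instrument}")
```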

To illustrate the "Metrics Lite" approach, we offer a real-life case study and our self-evaluation: what worked well, what we learned, and what might be a slippery slope. We were contacted by a leading mortgage lender's Call Center training department interested in help transforming their agents from reactive "order takers" into proactive, needs-based solution providers. Additionally, it was important to them that the training's effectiveness be measured.

We proposed Kirkpatrick's model, designing and executing Levels 1, 2, and 3; because of monetary, time and resource constraints, we would support the design of Level 4 only.

Level 1

Action: Immediately following the training workshops, participants rated the relevance and effectiveness of the training, and the training methods and techniques employed.

Result: Participants viewed the training as highly effective and relevant to their jobs, with many offering specific positive comments.

Self-evaluation: Participants rated all questions highly, with the highest scores for content relevance and the ability to transfer skills to the job. We learned that participants wanted more coverage on resolving objections and closing (definitely linked concepts!).
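To make the Level 1 tally concrete, here is a small Python sketch; the question labels and 1-5 ratings below are invented for illustration and are not the client's actual survey data.

```python
# Hypothetical Level 1 reaction survey: each list holds participants'
# 1-5 ratings for one question (illustrative numbers only).
responses = {
    "content relevance":     [5, 5, 4, 5, 4],
    "skill transfer to job": [5, 4, 5, 5, 4],
    "resolving objections":  [3, 4, 3, 4, 3],
    "closing":               [4, 3, 3, 4, 3],
}

for question, ratings in responses.items():
    avg = sum(ratings) / len(ratings)
    flag = "  <- candidate for more coverage" if avg < 4.0 else ""
    print(f"{question}: {avg:.2f}{flag}")
```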

Level 2

Action: Participants completed written Pre and Post-Tests (20 questions), testing their knowledge and the application of skills learned in the workshops. We used realistic customer situations to test the ability to apply knowledge about learned skills.

Result: Participants' Post-Test scores were higher than their Pre-Test scores, indicating increased knowledge and skill application.

Self-evaluation: The questionnaire was designed to be challenging, but may have been too easy: Pre-Test grades were already relatively high, leaving little room for a significant increase in scores. Our training philosophy is that people learn by doing ("I hear and I forget. I see and I remember. I do and I understand." - Confucius). Accordingly, we believe passing a written test has some limited merit in assessing skills learning, but it's certainly not a true indication of mastery of the training objectives.
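One way to soften the ceiling effect noted above is to report a normalized gain: the fraction of each participant's available headroom actually gained. This is a technique we are adding here for illustration; the scores below are made up.

```python
# Illustrative Pre/Post-Test scores out of 20 questions (made-up numbers).
MAX_SCORE = 20
pre  = [16, 17, 15, 18, 16]
post = [18, 18, 17, 19, 18]

raw_gains = [b - a for a, b in zip(pre, post)]
# Normalized gain: improvement relative to the room left to improve,
# so high Pre-Test scorers aren't penalized by the ceiling.
norm_gains = [(b - a) / (MAX_SCORE - a) for a, b in zip(pre, post) if a < MAX_SCORE]

print(f"mean raw gain:        {sum(raw_gains) / len(raw_gains):.2f} points")
print(f"mean normalized gain: {sum(norm_gains) / len(norm_gains):.2f}")
```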

Level 3

Action: Working with the training department, we monitored two to four recorded calls each from a random sample of training participants and non-participants. Using a customized checklist of 22 sales behaviors, we rated the degree to which participants and non-participants applied the skills. We also conducted an online "perception" survey two months after the training, asking for feedback regarding the impact and usefulness of the training and materials.

Result: Participants applied the learned skills during calls more often following training, and reported that they continued to refer to the training materials often.

Self-evaluation: We believe that assessing participants' actual performance is the most important assessment. We were thrilled to see distinct improvement in participants' skills. It was also valuable to learn that the materials were still relevant two months after the training.

We do have two comments, though. First, we are concerned that we and training department personnel conducted the monitoring. We know that we were as objective as possible, but the monitoring should be done impartially, by a party who isn't a stakeholder in the success of the training. Second, the success of the training is predicated not just on the training workshop, but also on ongoing factors such as coaching by Sales Managers and the support of management.
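As an illustration of how the checklist monitoring can be summarized, here is a short Python sketch; the per-call behavior counts are invented and are not our actual monitoring data.

```python
# Hypothetical call monitoring: for each monitored call, the number of the
# 22 checklist behaviors the agent applied (illustrative numbers only).
CHECKLIST_SIZE = 22

participant_calls     = [18, 17, 19, 16]   # trained agents
non_participant_calls = [11, 12, 10, 13]   # untrained agents

def mean_adherence(behavior_counts):
    """Average fraction of the 22 behaviors applied across a group's calls."""
    return sum(n / CHECKLIST_SIZE for n in behavior_counts) / len(behavior_counts)

print(f"participants:     {mean_adherence(participant_calls):.0%} of behaviors applied")
print(f"non-participants: {mean_adherence(non_participant_calls):.0%} of behaviors applied")
```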

Level 4

Action: The Call Center's training department compared participant and non-participant conversion rates (the ability to convert a qualified caller's interest in mortgage information into a closed mortgage) and cross-sale levels for the two months following training against prior periods.

Result: Sales Agents who went through training had higher conversion rates and cross-sales following training than in prior periods, and higher overall levels than those who did not go through training.

Self-evaluation: This is obviously the bottom line for any client. However, as stated earlier, many other factors can make the numbers better or worse, regardless of the quality of the training. Given the training department's need to spend considerable hours on call monitoring, it's questionable whether the cost in time and manpower is worth the effort. Second, while the assessment appears very objective (it is purely statistical), the results were compiled by those with a stake in the training's success: the client's training department. We believe that if a Level 4 evaluation is to be done, it should be done by a third party, either a different area of the company or an outside expert vendor.
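To show the arithmetic behind such a comparison, here is a sketch contrasting participant and non-participant conversion-rate changes. Using the non-participant change as a rough baseline for outside factors (market shifts, pricing, seasonality) is our illustrative framing, and every figure is made up.

```python
# Conversion rate = closed mortgages / qualified callers.
def rate(closed, qualified):
    return closed / qualified

# Two months before vs. two months after training (illustrative counts).
trained_before   = rate(120, 1000)
trained_after    = rate(150, 1000)
untrained_before = rate(118, 1000)
untrained_after  = rate(124, 1000)

trained_lift   = trained_after - trained_before
untrained_lift = untrained_after - untrained_before

# The untrained group's change approximates what would have happened anyway,
# so the gap between the two lifts is a rough estimate of training's share.
print(f"trained lift:          {trained_lift:+.1%}")
print(f"untrained lift:        {untrained_lift:+.1%}")
print(f"rough training effect: {trained_lift - untrained_lift:+.1%}")
```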

The American Society for Training and Development (ASTD) found that 45% of surveyed organizations gauged only trainees' reactions to courses (Bassi & van Buren, 1999). Overall, 93% of training courses are evaluated at Level 1, 52% at Level 2, 31% at Level 3, and 28% at Level 4. The data illustrate a clear preference for simple evaluations, which we believe is partly due to the difficulty of conducting objective, in-depth evaluations and partly due to monetary, time and resource constraints.

Despite the inherent challenges and pitfalls of the evaluation process, we strongly urge organizations to attempt it, even though it's not foolproof. After all, despite the "Metrics Lite" approach being less filling, it does taste great.

About the Author

Howard is co-owner and Business Manager of The Bluestar Group LLC, a leading skill-intensive consulting and training company specializing in maximizing sales performance by developing effective sales strategies, tactics and highly refined skills. For more information, visit http://www.thebluestargroup.biz.

