Building a great training program is deceptively hard.

For example, let’s say you change your refund policy and expect a savings of around $50 per refund.

You train your customer service team and launch the new policy. Refund amounts go down, but not as much as you had hoped.

When you look into it, you find that not all of your support agents have adopted the new approach, and you’re still refunding thousands of dollars more every day than you anticipated.

What happened?

Is there a problem with the policy? A glitch in the software? Did your customer service reps forget what they learned in training?

Unfortunately, the answer is often: “We have no idea what happened.”


Great Training Programs vs. Underperforming Programs

Over the last ten years, I’ve helped with the development of hundreds of training programs and educational products. I’ve seen what separates a great training program from an underperforming one.

It’s not our information or our tools.

It’s our assumptions.

We assume:

  • Our knowledge is correct and relevant to the employee
  • We’ve packaged it in a way that’s understandable to our employees
  • Employees will retain 100% of the knowledge we deliver
  • They’ll apply it correctly

In reality, any or all of these assumptions could be wrong. And most companies have no way to test if their assumptions are correct.

So… what are we to do? How can we build a training program that sticks and actually drives the changes we want to make?

Answer: constantly check the validity of our assumptions.

The Need for Data

Marketing and product teams are experts at checking assumptions. When run well, they operate like laboratories, constantly testing how new features and campaigns affect their results.


We can adopt the same experimental mindset in employee training. It requires two things:

  1. Assumptions
  2. Data

We (hopefully) have the assumptions. What we often lack is data to measure whether our assumptions are correct.

Here’s a scenario we often see: a company runs a training in hopes of improving a particular KPI, such as customer satisfaction (CSAT) scores.

It works, and management sees a jump in scores, only to watch them mysteriously fade back down over the next few weeks.

This often happens because people forget what they learned in training.

This isn’t a failure of people; it’s a failure of training.

Actionable Data in Training

Hickory’s employee training platform actively predicts when a learner is likely to forget material from a training course.

It shows you who knows what and how well, all in real time. If you want to know what someone knows today, just check your dashboard.

Once you have data on your team’s current knowledge, you can correlate it to performance, then test whether your assumptions are true.

Bottom line? With the right data, you can ensure the gains that happen in training actually stick.

Let’s return to our earlier customer service training example—with a data-driven approach to our assumptions.

You change your refund policy. You expect a savings of $50 per refund. You train your support team. Refund amounts drop after the policy change, but not as much as you expected.

If you’re tracking how well your training is working, you don’t have to guess what’s happening. Instead, you check the data in your dashboard, which shows that only 65% of the team is fully knowledgeable on your new refund policy.

Hickory’s algorithm is already working to fix the problem, automatically scheduling follow-up refreshers for underperforming members of the team. Over the next two weeks, the team reaches 95% understanding of the refund scenarios.
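The logic behind that kind of knowledge check is simple to sketch. Here’s a back-of-the-envelope example in Python, using made-up agent names, scores, and a hypothetical “fully knowledgeable” cutoff (this is an illustration, not Hickory’s actual API):

```python
# Hypothetical sketch: compute the share of the team that is "fully
# knowledgeable" on a policy, and flag agents who need a refresher.
# All names, scores, and the 0.8 threshold are illustrative assumptions.

THRESHOLD = 0.8  # assumed cutoff for "fully knowledgeable"

def knowledge_report(scores):
    """scores: dict of agent name -> latest quiz score (0.0-1.0).
    Returns (team knowledge rate, agents below the threshold)."""
    needs_refresher = [name for name, s in scores.items() if s < THRESHOLD]
    rate = (len(scores) - len(needs_refresher)) / len(scores)
    return rate, sorted(needs_refresher)

team = {
    "Avery": 0.95, "Blake": 0.60, "Casey": 0.85,
    "Drew": 0.70, "Emery": 0.90, "Finley": 0.55,
    "Gray": 0.82, "Harper": 0.88, "Indy": 0.75, "Jules": 0.91,
}
rate, refresh = knowledge_report(team)
print(f"{rate:.0%} fully knowledgeable; schedule refreshers for: {refresh}")
```

With numbers like these in hand, scheduling targeted refreshers becomes a lookup rather than a guessing game.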

You also see refund savings hit their $50-per-refund target. And your company keeps thousands of dollars a day on its bottom line as a result.


Retention Isn’t Always the Problem

One of my favorite cases involved a company that was receiving customer complaints about the hygiene of its employees.

The company assumed that training employees on the importance of hygiene would solve the problem, and we helped them design and implement a training course. At the end of the training, our data showed that employees understood and retained over 90% of the information.

However, the problem persisted. Since Hickory’s data showed it wasn’t a training issue (after all, the majority of the employees understood the training), the company went looking for other culprits—and found one.

It was an access issue. Adding cans of spray deodorant to the work environment completely solved the issue.

In the end, we removed the hygiene training from the curriculum. It simply wasn’t needed anymore.

How to Create Great Training Programs

Building a world-class training program is an iterative process. If you take the time to continuously test and optimize, you can build something great.

All you need to improve any KPI through training is the right data, assumptions you can test, and an experimental mindset.