ABOUT TRAINING STRATEGY

Translating Competencies into Daily Behaviors

One of the challenges in learning and development is helping employees turn competency models into everyday behaviors. The typical question sounds like: “What does ‘inspire leadership’ mean in practical terms? What exactly can I do when dealing with my reports to improve on this competency?” SkillGym is the ideal platform to support trainees in understanding how their daily dialogues affect their competencies. In this article, we’ll see how easy it is to move from theory to practice.
Andrea Laus
SkillGym CEO
4 min read

In another article (“Making the best of SkillGym Analytics”), we have already explored how powerful SkillGym analytics can be when it comes to turning conversations into metrics.
We have seen that there are several ways to measure one’s conversational performance, from the broadest “confidence” and “self-awareness” down to the most detailed competency model KPIs.

However, one of the main challenges when practicing is finding the exact connection between what was said during the conversation and the resulting competency scores.

This is where the Augmented Replay comes in. Let’s see how you can make the best of this amazing tool to improve faster on your confidence, self-awareness and overall performance by analyzing your past conversations.

Enter Augmented Replay

SkillGym Augmented Replay lets you review the entire conversation after it has been played. The idea is to watch and listen to the action from a third-person perspective in order to review your performance and reflect on the details.

Typically, the two main things a user may want to review are:

  1. On one side, their own behaviors as applied along the way: a way to review each sentence, identify the predominant behaviors behind it and rate how well each was applied (with some evidence of the other options available to deal with that specific step of the conversation).
  2. On the other side, the character’s body language. Recognizing other people’s body language is a challenge for almost everyone, and only the most advanced Digital Role Play platforms support this feature.

Since this article focuses on the first of the two, let’s start by saying that the most intuitive, yet most powerful, feature of the Augmented Replay is the ability to start and stop the action and move along the conversation to find the specific steps you want to review.

It may seem trivial, but before entering the Augmented Replay, the user experience is that of real-time action, with no way to step back during the conversation. This makes good sense: while practicing, we want users to feel the authenticity of real life. But once the real-time practice is complete, being able to browse through the conversation is essential in order to reflect, discuss and learn.

Navigation happens through the bottom bar, where each step is shown as a rectangle and the play/pause button sits at the far left.

Which steps to focus on depends on the scope of your analysis; you may want to look at the steps:

  • where you performed very badly, to understand what happened and find a way to improve
  • where you performed very well, to double down on those best practices
  • where your behavior somehow affected the competencies you are asked to improve

In all cases, the blue curve just above the bottom bar can give you a good idea of where to look when searching for the right step to analyze.

Once you find the step, you can:

  1. Open the detailed view showing which behavior was behind the sentence you chose at that step.
  2. Open the detailed view showing the character’s body language while listening to your sentence.
  3. Open the detailed view showing the character’s answer and summarizing the entire step in terms of your behavior, the character’s answer and the impact of the step on the overall trend of the conversation.

From competency to the conversation, and back

I opened this article by focusing on how trainees can connect the dots between a competency model and their daily behaviors.

So let’s see, in a few steps, how to learn from SkillGym on this specific point. Imagine that your competency map looks like this:

[Screenshot: example competency map]

You will certainly want to understand why you scored low on, for example, the competency called “Active listening”.

The first thing to do is to look at which behaviors are connected to that specific competency. You can see this mapping in the SkillGym Launcher.

The next step is to open the Augmented Replay and look at the steps where those behaviors were associated with your sentences.
In our example, at step 15:

[Screenshot: the Augmented Replay at step 15]

It looks like your sentence did not perform well on the behavior called “Accept arguments”.

So the next thing to do is to enter the details and discover which of your sentences underperformed in that way:

[Screenshot: the detailed view of the step]

At this point, you have a clear link between the score of a behavior tied to a specific competency and the actual choice you made in the simulation.

You can also delve deeper here:

  • for example, by looking at the character’s body language while they listen to that sentence
  • or by analyzing the impact of your specific behavior on the character’s next reaction

All these triggers will help you turn the abstract description of a competency model into very real and tangible daily actions, such as talking to your employees in one way rather than another.

Of course, several behaviors will impact a single competency, and they will do so across more than one step of the conversation.

So I would recommend focusing on one specific competency at a time and looking at all the interactions you had during that interview that somehow affected that specific KPI.

Using the Augmented Replay for this purpose is a fantastic way to build and improve self-awareness, but also to understand how close your daily behaviors are to your competency model.

Of course, we would be delighted to show you SkillGym’s solution in a 1-hour discovery call.
