Monday, January 18, 2010

The Challenge of Measuring Sales Training

To: Sales Directors, Sales Academy Staff, Sales Skills Development staff, Sales Learning/Training staff

I was speaking to a client contact the other day about the challenges they faced in justifying the investment the organisation was making in building sales skills. They had created an internal “Sales Academy” a few years back, but other than records of who had attended which courses, they didn’t have much to report.
A key lesson here is that the measures and metrics need to be established, and bought into, from the start. Then the method of gathering the data can be designed, data collected, and regular discussions can take place on progress. That said, even without thinking ahead of time, there are still things that can be done.

The two challenges are communicating upwards the value of the work that the team (or in this case the Academy) is doing, and communicating to your internal customers (the sales teams) that the Academy is helpful to them. Often we forget the value and importance of communicating to the sales teams in our zest to communicate upwards!

Of course the key to this process is the availability of data (see key lesson above). Without having planned in advance for measurement and reporting, you can be left with quite narrow choices for data. Without knowing what data your organisation has access to, it is hard to make any specific recommendations; you need to be creative with what you can get (and, occasionally, politely aggressive in getting access to it). I can offer up some suggestions, though, based on what we’ve done before.

Below are a few ideas of analyses that you may want to perform:

  1. Compare scores on the Sales Competencies to performance against sales quota. While imperfect (since it relies on the assumption that quotas are reasonably and consistently set), this analysis should be able to show that those with stronger skills are more likely to exceed quota.

    You can also quantify the value of moving the average competency score by one point (or similar), for instance: “a one-point movement in the average competency score equates to a 2% improvement in quota performance.” This can also easily be translated into dollar impact.
  2. To be even more targeted, you can group specific skills (e.g. Opportunity Development and Clarification) from the competency model and analyse their impact on performance. This is the same as the above, but much more focused on known key gap areas, or key areas where you know you can have impact. Once again, this can be translated into dollar impact: “improving our skills in opportunity development could mean $10 in top-line sales”.
  3. Compare teams that have received coaching from the Academy to those that have not. Typically, well-coached teams out-perform poorly coached (or at least less frequently coached) teams. You may not have this kind of data yet.
  4. Compare the performance of known experts in the Competencies or the Sales Process to those who do not perform as well. In our Academies, this is typically done through the accreditation levels; we see that Gold accredited individuals outperform Silver, who in turn outperform Bronze. Without accreditation, you could create quartiles for comparison. The goal here would be to be able to say something like “the top quartile performers in the competency model outperform the second quartile by 40%,” thereby further proving the value of the model and of engagement with the Academy.
  5. Compare won deals to lost deals, in particular looking at whether specific completed templates were filed against the deal. This would allow you to show that the chances of winning a deal are x% greater if the template is completed. This will help to argue against those who see the tools as simply an administrative task.
  6. As a simple win analysis, capture the value of wins where the sales team has been coached, or used the tools, etc. A simple roll-up of the value of the contracts won would allow you to communicate that “teams where we have been involved in the sales process have generated $150 million in sales”.
  7. A more complicated win analysis would be to get sales managers and account managers to assign a percentage of each win to the tools, coaching, and process received from the group during the sales process. That percentage is then multiplied by the deal value, and the resulting figures across all the won deals are added up to create an impact value (a small worked sketch of this calculation follows the list).
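To make the arithmetic in ideas 6 and 7 concrete, here is a minimal sketch in Python, assuming you can export a list of won deals with a contract value and a manager-assigned attribution percentage. The deal names, figures, and field names are purely illustrative, not from any real system.

    # Sketch of the simple roll-up (idea 6) and the attributed win analysis (idea 7).
    # Each record is a won deal where the Academy was involved; "attribution" is the
    # share of the win the sales manager credits to the tools/coaching/process.
    won_deals = [
        {"deal": "Deal A", "value": 2_500_000, "attribution": 0.30},
        {"deal": "Deal B", "value": 1_200_000, "attribution": 0.15},
        {"deal": "Deal C", "value": 4_000_000, "attribution": 0.25},
    ]

    # Idea 6: total value of contracts won where the team was supported.
    total_won_value = sum(d["value"] for d in won_deals)

    # Idea 7: impact value = sum of (attribution percentage x deal value).
    impact_value = sum(d["value"] * d["attribution"] for d in won_deals)

    print(f"Total value of supported wins: ${total_won_value:,.0f}")
    print(f"Attributed impact value:       ${impact_value:,.0f}")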

This list may well be more than anyone needs, but these ideas might get you thinking about your own internal challenges in demonstrating the impact of your programmes.

The goal, of course, is to be able to talk dollars: here’s the dollar impact of what we’ve done. Once you have quantified some business impact (“Level 4”), you can go on to show ROI (“Level 5”). This can be pretty compelling data to communicate upwards. For instance, for one account we found that every $1 spent on a training/coaching programme generated $16 of gross margin. That got some attention!
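If you want to turn a figure like that into a formal ROI percentage for the “Level 5” conversation, the standard calculation is net benefits divided by costs. A minimal sketch, using a made-up cost figure rather than the actual client data:

    # Level 5 (ROI) sketch: $16 of gross margin generated per $1 spent.
    # The cost figure below is illustrative, not the client's actual spend.
    programme_cost = 1_000_000
    programme_benefit = 16 * programme_cost   # gross margin attributed to the programme

    roi_percent = (programme_benefit - programme_cost) / programme_cost * 100
    print(f"ROI: {roi_percent:.0f}%")          # -> ROI: 1500%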

Thursday, January 14, 2010

Let's Not All Drink the Virtual Training Kool-Aid!

I have been hearing a lot of excitement from clients lately about the idea of delivering "virtual" sales training sessions. It has been coming up a lot in conversations, frequently enough that it is already time to raise some red flags. We need to be careful with this excitement, and not allow it to overwhelm the reality, nor, importantly, the impact of these sessions. (In the interest of full disclosure, before I begin, some of the excitement is, indeed, coming from me.)
Let's define what I'm talking about first. When I say virtual sales training (VST) sessions, I am talking about replacing face-to-face training/coaching/in-field support with a virtual interaction. This generally takes place via a WebEx- or NetMeeting-like technology, allowing individuals to gather around a common desktop or slide deck. The session is synchronous, meaning that the attendees are all present at the same time and all can contribute, discuss, ask questions, etc. That's enough of a definition for now.
One more thing, though: I don't want to get caught up in the argument of "Good virtual learning is better than bad face-to-face," or vice versa. Let's assume, for the sake of the argument, that we're discussing good versions of both. Arguing that good skills development programmes are good, and bad ones are bad, is something that takes up a surprising volume of space in blogs and discussion groups, but gets us nowhere. Onward.
Probably the first, and largest, red flag is the context in which VST is discussed. Most of the time it comes up around a discussion of doing more with less, and virtual sessions are immediately introduced as a way to save money. This has been particularly true during the global financial crisis (or, as Saul Eslake rightly calls it, the North Atlantic Financial Crisis) where budgets have been cut, costs slashed, people retrenched. VST comes in to save the day: we can do the same amount of training, but save heaps on travel and room hire, etc. 'Not only that, but the team doesn't need to be out of the field for 2 days in a row! We can run 4 x 2-hour sessions and get the same result!' (Yes, I have heard virtually that same quote from a client).
The second red flag comes when people start talking about how VST delivers the same thing as face-to-face interaction. Through the magic of technology, we hear, the experience is the same. One client even said that the only difference between VST and a classroom session is that with VST you can't have drinks together afterward! Someone in the room suggested they could have virtual drinks instead - ha ha!
The third flag comes when people discuss how easy VST is. People are used to virtual sessions now, they say, and the results just come. This is a variant of the classic 'If you build it, they will come' argument, tweaked to say, 'If you VST it, their behaviours will change.'
And the final flag (for now) pops up when people discuss that they have moved their course from face-to-face to VST. This is worrying when the discussion implies that it is a straight port across, or that all they had to do was eliminate a few things that "wouldn't work virtually".
So what's my point? I'm a big fan of VST, and I think it is a very useful part of a sales development programme. But we have to be realistic. Many conversations I have heard with clients or with people in the sales effectiveness space sound an awful lot like the conversations we had back in 1999: face-to-face training is dead, long live online training. The truth was, and is, much more complicated.
Here's the truth: VST is hard, just like any effort to build the skills or change the behaviours of salespeople is hard. And VST will fail, just like other approaches, if it isn't combined with good upfront communication, defined goals, measures and metrics, coaching and reinforcement, systemic support, and reporting. So instead of jumping on the bandwagon and declaring that VST is the answer to all questions and the solution we've all been looking for, focus on making sure that VST fits into a bigger-picture plan of how you're going to support whatever you're doing via VST, and change behaviours in the field.
Above and beyond all of that, VST requires thought. Creating VST is not just a process of reusing the same slide deck, the same discussions, the same exercises. Effort needs to be put into creating a session that works in a virtual world, leveraging the advantages (and avoiding the pitfalls) therein. There will likely be lots of commonality, but a sure path to poor results is a VST session that maps 100% onto something done face-to-face. VST is different – and that’s a good thing – and needs to be thought through.
Thinking of VST as a cheaper way of doing things condemns us to the same mistakes that were made a decade ago with online training, and 20 years ago with Computer Based Training. It *may* be cheaper, and it *may* be more cost effective overall, but the key thing is not the cost, it is the outcome. Again, VST is *different* and needs to be understood and discussed as such. The focus needs to be on getting good outcomes from the inputs, not just saying 'VST is cheaper'. There’ll be lots of space in other posts to discuss some of the keys to what makes VST different, and how to take advantage of that.
The deeper truth, of course, is that VST is one element that probably has its place in almost any sales development project. It should be considered as a possibility in most programmes, and included or eliminated on its own merits. The more tools and technologies we use to develop skills and embed behaviours, I believe, the better.
But my honest and heartfelt warning remains: don’t just jump on VST because it is the latest thing, or the cheapest thing, or the only thing that you can afford. Look deeper, think harder, and you will more than likely be on the right path.