Selling B2B? Why you need a social selling success tracking program.

Social Selling, the use of social media like Twitter, Facebook and LinkedIn by sales professionals, is a great buzz phrase. But does it really work?

IBM seems to think so – their social selling pilot in 2012 saw a 400% increase in sales on top of massive increases in reach. LinkedIn agrees: its research in 2016 found that salespeople who share content are 45% more likely to exceed quota.

Many other professional services firms have since followed suit – often by buying LinkedIn Sales Navigator licences for staff (e.g. Ernst & Young).

Certainly “social selling” is now dominated by LinkedIn, with its 200-million-strong professional user base, and, now backed by Microsoft, it is set to maintain that dominance.

However, social selling can and will happen on other tools and sites:

  • Quora questions and answers can deliver very targeted leads
  • Twitter provides a fast-moving environment for breaking news
  • TED talks can strengthen existing thought leadership positions
  • Presentation decks on SlideShare can keep presenting for you long after the original talk
  • Blog posts provide the space to make an argument effectively
  • Sector-focused Facebook groups can be lively and engaging
  • WhatsApp groups can trigger rapid responses among business people

New sites can pop up too, like Gartner’s new platform offering a forum for experts, while other tools can appear and disappear in just a few months.

Some businesses also run their own online social platforms – whether multi-stakeholder, such as a blogging community, or corporate-focused, such as Capgemini’s Expert Connect.

Then of course there are geographically localised sites that may offer more profitable prospecting in specific countries, such as Xing in Germany, Viadeo in France or Weibo in China.

The lesson: when it comes to B2B social selling, there is unlikely ever to be a single site that covers all the needs of all your sales-focused staff.

Analyse that!

This then presents a problem when it comes to analysing what works. Whether from a management point of view, asking “is our licence money well spent?”, or from an individual point of view, “where should I invest my time?” – the sheer number of options makes it hard to create a cohesive strategy.

One way to decide is to use a data-driven approach: look at the results of activity and link them back to success. Do more of what seems to work and less of what doesn’t.

Many platforms offer their own analytics, which go some way to providing the necessary feedback loop. They offer scores based on your activity – whether specific metrics (e.g. number of tweet impressions) or a more sophisticated composite index such as Klout or LinkedIn’s Social Selling Index. These are termed native analytics tools, as they are provided by the platform itself.

However, most native analytics tools are biased towards usage rather than value.

Take LinkedIn’s SSI, for example. One LinkedIn trainer, Andy Foote, who looked in detail at how the score is calculated, said:

“Frankly, it looks like a checklist for how to become an aggressive LinkedIn pest.”

It’s true. Every analytics package has designer bias built in. In LinkedIn’s case it makes complete sense that the metrics should prioritise getting people to use LinkedIn over other goals. Even something as simple as the order in which analytics are shown reflects the designer’s preference, yet the viewer will instinctively treat the first metric as the most important – it’s simply the way we’re wired.

How can we bias the analytics towards value for our business, not the platform?

One way to do this is to create your own social selling score and composite metrics instead. You can then order and weight metrics according to your contextual priorities, not those of the underlying communications platform.

Creating a meaningful social score for your staff need not be difficult or expensive. Using a spreadsheet, you can import raw usage data for your Sales Navigator staff from LinkedIn, and download data from Twitter Analytics too. Combine it all and you can create a social selling composite report for each sales rep that reflects your priorities as a business (and your experience of what works in your sector). Email the score out to each rep and you can get them engaged and motivated to focus on the right social selling behaviours.
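As a minimal sketch of what that spreadsheet calculation looks like in code – the metric names, example figures and weights here are purely hypothetical illustrations, not fields from any real LinkedIn or Twitter export:

```python
# Hypothetical weekly metrics per sales rep, as they might look after
# combining LinkedIn Sales Navigator and Twitter Analytics exports.
reps = [
    {"name": "Alice", "connections_added": 12, "inmails_sent": 8,
     "tweet_impressions": 4200, "content_shares": 5},
    {"name": "Bob", "connections_added": 3, "inmails_sent": 20,
     "tweet_impressions": 900, "content_shares": 1},
]

# Weights reflect *your* business priorities, not the platform's.
WEIGHTS = {
    "connections_added": 3.0,
    "inmails_sent": 1.0,
    "tweet_impressions": 0.01,
    "content_shares": 5.0,
}

def composite_score(rep):
    """Weighted sum of raw metrics: a simple custom social selling score."""
    return sum(weight * rep[metric] for metric, weight in WEIGHTS.items())

for rep in reps:
    print(f"{rep['name']}: {composite_score(rep):.1f}")
```

Reordering or reweighting is then a one-line change, which is exactly the control a native analytics tool doesn’t give you.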

If that sounds like too much work to do each week, you can of course use Rise to take away much of the heavy lifting. Rise will pull in the data automatically where possible, process it and calculate a score. Rise will then share the results with each sales rep via email or in a personal online dashboard.

If you’d like to try out scoring your sellers yourself, then we recommend a simple Staff Power 100 implementation. This app uses Kred scores as a proxy for more detailed metrics, meaning you can be up and running in half an hour.

Investing in Sales Navigator licences? Put some budget into success tracking too.

The key takeaway for me: if you are spending thousands to give your staff Sales Navigator licences, then you should spend hundreds to make sure that investment is giving you value (management reporting) and to give your staff the personal feedback they need to optimise their behaviour (personal reporting).

The human algorithm – an interview with IBM’s Marie Wallace



Towards the end of last year, I met up with employee advocacy and human network analysis guru Marie Wallace at IBM to discuss driving engagement and business results using scores.



Here’s what she had to say.


TB: Hi Marie, can you let us know a bit about your role at IBM and what you’re up to now?


MW: I work in IBM’s Analytics Group on emerging technology. Previously I worked on the natural language processing technology which is now part of IBM Watson, our cognitive computing platform, although these days I’m all about analysis of human networks. Over the last couple of years I’ve been focused on an effort called Project Breadcrumb, which is an engagement analytics system that ingests collaboration and social data, from systems like IBM Connections, to measure employee engagement. The algorithms incorporate more than 15 years of IBM research into social networks and reputation. Our system measures engagement activity, reaction, eminence, and network to provide IBM Connections users with detailed scores on their internal social effectiveness. Now we’re expanding our vision to create algorithms that go beyond the social network to other systems as well.


TB: What is an algorithm in your context?


MW: Our algorithms take into consideration a wide variety of interactions, represented in the graph as nodes, edges, and their respective properties, in order to generate a set of scores for each individual. The algorithm represents a significant competitive advantage for us and our clients, as it’s based not only on good data science but also good social science, and has been proven through a number of business value experiments to accurately characterize individual engagement. This is absolutely critical because, as they say, “be careful what you measure as you might just get it” – if the algorithm is measuring and rewarding the wrong behaviors, then it is detrimental to your organization.


TB: So do your clients ask you to create the algorithms for them?


MW: Yes! We are seeing an increasing number of internal and external clients interested in using these types of network analysis techniques to meet a number of business goals. It might be an organization implementing a change program that could benefit from understanding who the influencers or information brokers are across the organization, in order to leverage them to maximize the outcome. Or perhaps knowledge redistribution is a key objective, where the distribution of experts and expertise – such as social sharing behaviors – is critical.


One thing I always ask clients to do before they create any metric is to think very clearly about the outcome they are trying to steer. If you measure how engaged people are, they engage more. If you measure the number of deals closed, they will close more deals.


However, you have to be careful – sometimes you can optimise for the individual and not for the organisation. Taking our personal social dashboard as an example: if you just chose to measure and reward individual “activity”, then you’d generate a company of spammers, whereas if you include “reaction” and “eminence” then you drive more thoughtful sharing, which is critical to the redistribution of knowledge throughout a company.


A great algorithm will in fact take “gaming of the system” into account and still ensure the right business outcome.
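A toy illustration of that idea – the component names (“activity”, “reaction”, “eminence”) follow the interview, but the formula and weights are my own assumptions, not IBM’s actual algorithm:

```python
import math

def engagement_score(activity, reaction, eminence,
                     w_activity=1.0, w_reaction=3.0, w_eminence=3.0):
    """Toy engagement score: raw activity is dampened (log scale) so
    posting volume alone cannot dominate, while reaction (how others
    respond) and eminence (reputation) carry heavier weights."""
    return (w_activity * math.log1p(activity)
            + w_reaction * reaction
            + w_eminence * eminence)

# A "spammer" with huge activity but little response from others...
spammer = engagement_score(activity=500, reaction=2, eminence=1)
# ...scores well below a thoughtful sharer with modest activity.
sharer = engagement_score(activity=40, reaction=15, eminence=8)
```

The design choice is the point: because activity only grows logarithmically, flooding the network stops paying off, so gaming the system by posting more is already priced in.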


TB: So can an algorithm change behaviour?

MW: Absolutely. From social science we know that whatever culture you want in your organisation, the measurement and reward system you put in place can help drive it. By making this analysis available to every individual in the company, and simplifying the presentation so that everyone can understand what is driving the scores – much the same as you guys at Rise are doing – we can use the scores from an algorithm to drive and reward the desired behaviour.


TB: So what advice do you have for business leaders wanting to create their own algorithms to drive behaviour?


MW: Well, firstly I would encourage you to bring all stakeholders – analyzers and analyzees – into a room to agree first what outcomes you want, and then decide what the metrics should be. You are likely to be constrained by the data you can get hold of, so secondly I would recommend that you start to identify the data you have, or could start to collect; it always amazes me how much data companies either have but are not using for analysis, or could have but aren’t collecting. And finally, consider potential privacy and security concerns early in the project and ensure that they are taken into account.


TB: Yes, we’ve also seen that the best scoring programs are those where the players have a real voice in how the algorithm creates the score.

MW: Yes, it’s really important that the analyzees – employees in our case – are part of the discussion and that they are comfortable and happy with the analysis being undertaken and the privacy controls. Finally, it’s important not to go too fast: getting the algorithm right, and ensuring employees are happy with it, is more important than making it available for use by other parts of the business – for example, HR usage in performance reviews. This will come over time, but we must walk before we run and bring all stakeholders with us on the journey.


TB: Great insights, thanks Marie. Do you have any other materials you’d like to share for people setting out on their own people analytics journey?


MW: I’ve got heaps more content on this subject on my own blog, and for those wanting to deep dive there’s our Engagement Analytics team at IBM.