How Google is using the principles of Success Tracking

An email from the team at Google Maps landed in my inbox recently. I knew it was a “robo-generated” email, and yet I got engaged. The title of the email grabbed my attention first – “Your review is making a difference”. As I looked at it further and fully digested the message, I realised that Google was using the 5 core principles of Success Tracking.

So, here’s how:

This part of the email shows that I have opted in to receive the success tracking report (score).

And the main part of the email shows how Google is adhering to no prizes, a simple score, storified content and positive score-keeping.

This is a great example of how relevant feedback alone, storified and positive, can drive behaviour change.

Great to see #successtracking at work.


Reflections on using Rise to support conference-based gamification

This is a guest blog from:

Fiona MacNeill, Learning Technologies Adviser, University of Brighton and UCISA Digital Capabilities committee member. AboutMe: Twitter: @fmacneill

In early June I had the pleasure of implementing a conference-wide gamification activity in support of the UCISA Digital Capabilities event. The event took place at MediaCity, Salford; a vibrant and engaging venue for an event chock-full of innovative ideas. The event focused on showcasing successful practices for supporting academic staff and learners in their use of technology within further and higher education. Another goal of the event was to highlight findings from the recent Digital Capabilities survey. So when a member of the event organising committee, Iain Cameron (University of Aberdeen, and UCISA Digital Capabilities committee), mentioned the idea of a Twitter selfie (or Twelfie) competition as part of the proceedings, Rise immediately came to my mind as the right tool for the job! I had encountered Rise before at a demo at the International Confex event in 2013 and then again during the Mahara 2014 Hui held at the University of Brighton.

The rules of the game were simple and already outlined for me by the organisers. One point was awarded for original selfies; @mentions; and retweets featuring the #udigcap hashtag. Two points were awarded to reward the befriending behaviours needed to: take selfies with another delegate; take selfies with a speaker; and take selfies with organisers. Three points were awarded for imaginative selfies; selfies with passing celebrities who work at or visit the television studios of MediaCity; and a selfie with a famous landmark. Although the game was simple, we entered into it with a sense of playfulness, complemented by my donning the literal udig-cap on my head to signify my position as the twelfie official! Here’s the photo evidence.
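For anyone wanting to automate part of the tallying, the point scheme above is simple enough to encode directly. This is a minimal sketch; the category names are my own labels, not anything used at the event.

```python
# Hypothetical encoding of the twelfie point scheme described above.
# Category names are illustrative labels; point values mirror the rules.
POINTS = {
    # 1-point actions
    "original_selfie": 1, "mention": 1, "retweet": 1,
    # 2-point "befriending" actions
    "selfie_with_delegate": 2, "selfie_with_speaker": 2, "selfie_with_organiser": 2,
    # 3-point actions
    "imaginative_selfie": 3, "selfie_with_celebrity": 3, "selfie_with_landmark": 3,
}

def score(entries):
    """Total points for a list of manually classified tweet categories."""
    return sum(POINTS[category] for category in entries)
```

A delegate who posted an original selfie, a selfie with a speaker and a selfie with a landmark would score 1 + 2 + 3 = 6 points.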



Observed positive effects of using Rise Leaderboard:

  • Rise really stoked attendee engagement via Twitter. There were around 90 tweets that included twelfies. Overall, there were almost 1200 tweets related to the event, many of which were a direct result of attendees taking part in the Leaderboard.


  • The competition called for attendees to take photos of themselves with other attendees, speakers and celebrities. This encouraged both in-person and online engagement.


  • The twelfie competition promoted a sense of fun and resulted in crowdsourced documentation of the event proceedings. The documentation is now archived as a Storify.


  • The competition boosted discursive engagement and publicised the Twitter feed prior to the event. This was largely achieved by some pre-conference challenges where attendees were asked to take engaging photographs of their journey to the conference.


Top tips for using Rise in a conference situation

  1. Our photo-based metrics meant that we had to do a lot of manual scoring. I suggest using a wider variety of metrics, including a mixture of automatic metrics derived from Twitter polling and manual metrics.
  2. I recommend linking a Google doc to the active leaderboard to enable simpler player addition.
  3. Limit the number of times a certain metric can be scored. We found that some of the twelfies became repetitive, as there was no limit on the number of times that a twelfie could be scored.
  4. Include some wildcard activities to promote positive conference behaviour:
      • e.g. tweet and tag someone whom you met at lunchtime (with their permission);
      • engage in the conference treasure hunt and tweet what you found, etc.
  5. Take greater advantage of the need for a human supervisor, or games-master, and consider using them to lead tweet-ups on certain topics raised during the event. These could also have point-awarding options.
  6. Consider day-by-day scoring and options for remote attendees and second-day attendees.
  7. Points for @mentions of anything other than the conference hashtag can affect the quality of tweets’ written content due to the character limit. Best to keep it to one @mention metric.
  8. Add players in advance of the conference, if possible.
  9. Having clearly defined board release times was a good strategy and led to a sense of anticipation; e.g. breaks worked well as times to release and show the updated leaderboard. Leave at least 10 minutes for the polls to complete and to release the board. I owe this idea to Katie Piatt (University of Brighton), who used this strategy to great effect at the 2014 Mahara Hui.

Future ideas

As I contemplate gamification at the next iteration of the Digital Capabilities event I have been considering how the competitive element could be developed further. Here are a few ideas, although I won’t go into specifics, as I don’t want to give the game away in advance!

  • Make awards unexpected – as Daniel Pink explains in his 2010 book Drive, expected extrinsic rewards can negatively affect performance (pp. 63-70). Therefore, adding some unexpected rewards for completed tasks could add value. However, these rewards will not be itemised on the rules list, so a disclaimer about judge discretion may be helpful!
  • Reward introverts as well as extroverts – one of the best conferences that I have ever attended was Eyeo Festival, based in Minneapolis, Minnesota, USA. Eyeo is an awe-inspiring event focusing on data visualisation, interactivity and maker ethics. However, in the midst of all the flashy stuff, in the two years that I attended they had quiet spaces where one could engage in puzzles and inventions related to the event, sans supervision or sales influence. This was an invaluable opportunity to play and learn. Having an area like this in a conference provides opportunities for time-out and inspiration as well as hidden scoring opportunities!
  • An idea inspired by Jane McGonigal’s book, Reality is Broken (2012): we allow attendees to +1 each other. This is an in-person analogue of a “favorite” star or a “like” thumbs-up, but because it is real, perhaps it means even more within the context of the event. I like the idea of using physical +1s (think cardboard cutouts the size of a plate) which could become the subject of a selfie; a nice option for camera-shy attendees.
  • Finally, this is an idea that I owe to Pete Jenkins, who suggested making the next iteration of our competition a team-based activity. Rise Leaderboard can support this mode of use. The concept is that player interest will be more sustained if they are contributing to a group effort, as opposed to seeing individuals rapidly ascend the leaderboard and losing the will to compete due to very high leading scores. In the team model, points can still be awarded individually for small activities and these can contribute to the collective team score.

Well, I for one am excited about the next Digital Capabilities event!


McGonigal, J. (2012). Reality is Broken: Why Games Make Us Better and How They Can Change the World. London: Vintage.

Pink, D. H. (2010). Drive: The Surprising Truth About What Motivates Us. Edinburgh: Canongate Books.

More than one way to skin a bird: 8 Different Twitter Engagement Formulas

It’s all very well having a Twitter account with lots of followers – but are they really listening? The best way to monitor this is to track their reactions to your posts – this is called “engagement.”

However, there is no one standard formula for measuring engagement. Each platform is different and even on relatively straightforward platforms like Twitter, there is plenty of scope for variation.

In this blog post I want to take Twitter, and look at some of the different formulas you can use to measure engagement. Which one you pick will be up to you. Your choice will depend on your context and what you are hoping to achieve with your Twitter channel.

Here’s the list:

1) Total Engagements

Earned @Replies + Earned Retweets + Earned Mentions + Earned Favorites

Perhaps the simplest way to measure engagement is to total up all the engagements on your Twitter channel during a set period.

2) Engagement Deltas

Earned @Replies + Earned Mentions + Earned Retweets + New Followers (by day)

This focuses more on the incremental numbers. Its disadvantage is that it assumes a consistent amount of activity per day.

3) Basic Engagement Rate

(Earned @Replies + Earned Mentions + Earned Retweets) / Followers x 100

This looks at the ratio and calculates what percentage of your followers are engaging with your tweets.

Digital marketing consultant Erica Kei says that “a good engagement level is between 0.5 and 2.0%”.

4) Average Tweet Engagement Rate

((Earned @Replies + Earned Mentions + Earned Retweets) / Tweets) / Followers x 100

An Average Tweet Engagement Rate really measures the quality of your content. How many engagements did your tweets get?

While it can be done on an individual tweet basis (engagements per tweet/ number of followers) it is best averaged across all tweets for a period. Scores can range between 0.01% and 1%.

5) Engagement as calculated by Twitter itself

(Clicks + Earned Retweets + Earned Mentions) / Impressions

Twitter has more data to draw on than third-party tools (which don’t have access to click or impression data), so it is able to factor these important metrics into the calculation. You can access these engagement scores from Twitter itself.
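To make the differences between formulas 1–5 concrete, here is a minimal sketch of each as a function. The argument names are illustrative; in practice the values would come from Twitter Analytics or a third-party tool.

```python
# Sketches of engagement formulas 1, 3, 4 and 5 above.
# Counts are "earned" engagements over a chosen reporting period.

def total_engagements(replies, retweets, mentions, favorites):
    """Formula 1: simple sum of all earned engagements in the period."""
    return replies + retweets + mentions + favorites

def basic_engagement_rate(replies, mentions, retweets, followers):
    """Formula 3: what percentage of your followers are engaging."""
    return (replies + mentions + retweets) / followers * 100

def avg_tweet_engagement_rate(replies, mentions, retweets, tweets, followers):
    """Formula 4: engagement per tweet, averaged across the period."""
    return (replies + mentions + retweets) / tweets / followers * 100

def twitter_engagement(clicks, retweets, mentions, impressions):
    """Formula 5: Twitter's own ratio, which needs click and impression
    data only Twitter itself can provide."""
    return (clicks + retweets + mentions) / impressions
```

For example, 10 replies, 5 mentions and 5 retweets against 1,000 followers gives a basic engagement rate of 2% – at the top of the “good” range quoted above – but averaged over 20 tweets the per-tweet rate is just 0.1%.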

Breaking down Engagement into Conversation, Amplification and Applause

Avinash Kaushik goes further and defines more advanced ratios that break down engagement into its constituent parts. These are summarised by Shobha Thomas as follows:

6) Conversation Rate.

Because you need to be “social” on social media.
Earned Replies / Tweets

7) Amplification Rate.

How frequently are you tapping into your “second level” network, i.e. reaching followers of followers?
Earned Retweets / Tweets

8) Applause Rate.

Helps you understand what the audience likes.
Earned Favourites / Tweets
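Kaushik’s three ratios (formulas 6–8) share the same shape – an earned-engagement count divided by the number of tweets sent – so they are easy to compute side by side. A minimal sketch, with illustrative argument names:

```python
# Kaushik's three per-tweet ratios (formulas 6-8 above).

def conversation_rate(earned_replies, tweets):
    """Formula 6: replies earned per tweet sent."""
    return earned_replies / tweets

def amplification_rate(earned_retweets, tweets):
    """Formula 7: retweets earned per tweet sent (second-level reach)."""
    return earned_retweets / tweets

def applause_rate(earned_favourites, tweets):
    """Formula 8: favourites earned per tweet sent (what the audience likes)."""
    return earned_favourites / tweets
```

Comparing the three for the same period shows at a glance whether your channel is mostly sparking conversation, being amplified, or simply being applauded.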

Keeping track of all this is hard, as is deciding which Twitter engagement rate is best for you to track. That’s why we’ve made it easy by creating the Rise Twitter Engagement Club – it automatically keeps track of the most popular engagement metric (Average Engagement Rate) for you and will email you once a week with your score, coupled with a comparative benchmark with others on the board. It’s a great way to start improving your Twitter engagement.

Key KPIs to measure when commissioning a game

So you’ve decided to commission a game. Well done you.

Whether it’s for entertainment without an ulterior motive, an advergame to subtly inject your brand message or a serious game to educate and drive new behaviours, it is worth considering how you will measure success.

At times like these I like to turn to the Playfish troika of metrics. Gleaned from the successful social gaming company, these categories have been shown time and again to be the right prioritized order of metrics.

They are engagement, virality and monetization.

Let’s take each in turn.

Engagement is first and foremost – are people engaging with your game, do they enjoy it, and do they return to it time and again?

The metrics are standardised – daily active users (DAUs) and monthly active users (MAUs). Beyond the simple number of installs or downloads, MAU and DAU track actual users and engagement. For the advanced among us, you can look at dwell times (the average duration of an engagement) and churn rates (cohort declines over time), but MAU and DAU still reign supreme.

Next up is virality, because once people are engaged in your game they should want to share it with their friends. Unless your game has no social dynamics at all (unlikely, right?), virality becomes your next concern after engagement. Here the key metric is the viral coefficient (how many people each engaged player brings into the game) – any number over 1 is good; a number over 2 is stellar. For the advanced, you should look at the average time to share to get a feel for how quickly your virus will spread. If it takes people a few hours before they share with friends, great; if it takes a few months, then you have a virality issue.
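One common way to estimate the viral coefficient is invites sent per player multiplied by the invite conversion rate. A minimal sketch, with made-up numbers for illustration:

```python
# Illustrative viral coefficient calculation (one common estimation
# method; the numbers below are made up for the example).

def viral_coefficient(invites_per_player, conversion_rate):
    """New players each engaged player brings into the game."""
    return invites_per_player * conversion_rate

# 10 invites per player, 25% of invites convert -> k = 2.5 ("stellar")
k = viral_coefficient(10, 0.25)
```

Any k above 1 means each cohort of players recruits a larger one, so the player base grows without paid acquisition; below 1, growth stalls without marketing spend.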

Finally, consider monetisation. What? It’s last for a reason. People will pay, and indeed want to pay, for your game – for level-ups, for access to new features – but only once they are hooked and their friends are there too, egging them on. Monetisation is the art of converting online desires into cold hard cash. Typically your KPIs here are CAC (Customer Acquisition Cost) and CLV (Customer Lifetime Value), but the key one is ARPU (Average Revenue Per User). Ensure your ARPU exceeds your CAC and you have a viable, sustainable business; anything less and you are in trouble.

So the KPIs to measure are:

  • Engagement – MAU/DAU
  • Virality – Viral Coefficient
  • Monetisation – ARPU & CAC

Good luck!