Andy Littledale and Lee Carrotte Lunchtime Talk: A Social TV platform

Second Screening and social television: the combination of social interaction and social media with the age-old activity of watching TV.


Andy Littledale and Lee Carrotte from SecondSync Limited offer a social TV platform that does two things. First, it facilitates the creation of second screen apps: apps that synchronise with a live TV broadcast to deliver additional content around programmes, from Wikipedia descriptions to relevant images from Flickr. Second, it monitors and displays the conversations taking place on social media that relate to TV programmes, effectively mapping programmes to conversations. They use TV listings information, image recognition and subtitle data to make that mapping.

The aim behind the platform is to deliver extra experiences around what viewers are watching, enriching and adding value to the viewing experience, as well as to monitor audience responses for analysis.

SecondSync started their journey by exploring how to interact with live subtitles and what connected TV could provide in terms of apps. They began working with a concept extraction engine: free text is fed in and turned into structured data, from which key concepts are extracted.
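The talk did not go into the extractor's internals, but as a rough sketch, a minimal frequency-based concept extractor over subtitle text might look like the following (the stopword list and scoring are assumptions made for illustration, not SecondSync's implementation):

    import re
    from collections import Counter

    # A minimal, assumed stopword list; a real extractor would rely on NLP
    # (part-of-speech tagging, named-entity recognition) rather than raw counts.
    STOPWORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "is", "are",
                 "it", "that", "this", "on", "for", "with", "as", "at", "by"}

    def extract_concepts(free_text: str, top_n: int = 5) -> list[str]:
        """Turn free text (e.g. live subtitles) into a ranked list of key concepts."""
        words = re.findall(r"[a-z']+", free_text.lower())
        counts = Counter(w for w in words if w not in STOPWORDS and len(w) > 2)
        return [word for word, _ in counts.most_common(top_n)]

    subtitles = ("The polar bear hunts seals on the sea ice. "
                 "As the ice retreats, the bear must swim further to find seals.")
    print(extract_concepts(subtitles))  # e.g. ['bear', 'seals', 'ice', ...]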

These concepts are then pushed to the second screen, each with a small number representing how many related pieces of information are available for that concept; this is a way of pulling in web content related to what you are watching on TV.
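Continuing the sketch, pairing each concept with that count might look like this; count_related_items stands in for whatever index of web content (Wikipedia entries, Flickr images and so on) the platform actually queries:

    def count_related_items(concept: str) -> int:
        """Hypothetical lookup against an index of related web content;
        hard-coded here purely for the sketch."""
        fake_index = {"bear": 31, "seals": 9, "ice": 7}
        return fake_index.get(concept, 0)

    def annotate_for_second_screen(concepts: list[str]) -> list[dict]:
        """Attach the 'little number' of related items to each concept."""
        return [{"concept": c, "related_items": count_related_items(c)}
                for c in concepts]

    print(annotate_for_second_screen(["bear", "seals", "ice"]))
    # [{'concept': 'bear', 'related_items': 31}, ...]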

SecondSync then created an iPad app that interrogated the subject matter of the programme in various ways. One example listened out for animal noises; the app would then suggest thumbnails on the second screen linking to the BBC's natural history output. Rather than distracting viewers as anticipated, this was something test audiences responded well to.
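The talk gave no detail on how the noise detection worked; purely as an illustration, the trigger logic might be shaped like this, with classify_audio_window standing in for a real audio classifier and the content links invented for the sketch:

    from dataclasses import dataclass

    @dataclass
    class Thumbnail:
        title: str
        url: str

    # Hypothetical mapping from detected animal sounds to natural history
    # content; the labels and URLs are invented for this sketch.
    ANIMAL_CONTENT = {
        "birdsong": Thumbnail("Garden birds", "https://example.org/nature/birds"),
        "whale": Thumbnail("Humpback whales", "https://example.org/nature/whales"),
    }

    def classify_audio_window(samples: bytes) -> str | None:
        """Stand-in for a real classifier: 'detects' a whale when the
        window is loud enough, purely so the sketch runs end to end."""
        return "whale" if samples and max(samples) > 200 else None

    def suggest_thumbnail(samples: bytes) -> Thumbnail | None:
        """If an animal noise is detected, suggest a thumbnail for the second screen."""
        label = classify_audio_window(samples)
        return ANIMAL_CONTENT.get(label) if label else None

    print(suggest_thumbnail(bytes([0, 255, 10])))  # Thumbnail(title='Humpback whales', ...)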

Developing these ideas, SecondSync put together a platform that could span all of UK broadcasting, bringing in live feeds and working out what is best for extracting metadata from those streams in three ways: reading the subtitles as they come out, audio recognition and image recognition. They hope to integrate logo, product and people recognition in the future.
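A simplified shape for that kind of pipeline, with every recognizer stubbed out and the channel name assumed, might be:

    from datetime import datetime, timezone

    def read_subtitles(frame) -> list[str]:
        """Stub: concepts pulled from the live subtitle stream."""
        return ["polar bear"]

    def recognise_audio(frame) -> list[str]:
        """Stub: concepts pulled from the audio track."""
        return ["narration"]

    def recognise_images(frame) -> list[str]:
        """Stub: concepts pulled from the video frames."""
        return ["sea ice"]

    EXTRACTORS = [read_subtitles, recognise_audio, recognise_images]

    def extract_metadata(channel: str, frame) -> dict:
        """Merge every extractor's output into one timestamped record."""
        concepts = [c for extractor in EXTRACTORS for c in extractor(frame)]
        return {"channel": channel,
                "at": datetime.now(timezone.utc).isoformat(),
                "concepts": concepts}

    print(extract_metadata("BBC One", frame=None))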

What they end up with is a database of concepts derived from what is being talked about on TV at any given moment across all the channels. This was used as a tool for audience research, to see how people respond to and use second screen, synchronised companion apps.

Andy and Lee gave a live demo of the companion app they developed for the BBC's Frozen Planet series, serving up contextual data from a wildlife databank to deliver more in-depth information about the animals and themes explored in the series. They also showed an in-app Twitter stream that displayed the conversations taking place around #frozenplanet, allowing users to track and contribute to the dialogue.

The prototype revealed such a large volume of tweets and social media activity that it encouraged SecondSync to move away from a synchronised-content platform towards a social TV analytics platform: creating targeted searches and filters over the Twitter stream to capture the conversation happening around TV viewing. This produces a huge amount of data that is of interest to broadcasters, programme researchers, marketers, journalists and others.
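As an illustration of what a targeted filter of that kind involves, here is a minimal sketch that matches tweets against per-programme search terms; the terms and matching logic are assumptions, not SecondSync's implementation:

    # Hypothetical per-programme search terms; a real system would derive
    # these from TV listings and the concept database.
    PROGRAMME_TERMS = {
        "Frozen Planet": {"#frozenplanet", "frozen planet", "attenborough"},
    }

    def match_programmes(tweet_text: str) -> list[str]:
        """Return the programmes a tweet appears to be talking about."""
        text = tweet_text.lower()
        return [programme for programme, terms in PROGRAMME_TERMS.items()
                if any(term in text for term in terms)]

    print(match_programmes("Watching #FrozenPlanet, those polar bears!"))
    # ['Frozen Planet']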

They then perform sentiment analysis on the tweets as they come in, tuned to overcome problems specific to the TV domain. This allows a more human approach to monitoring: tweets can be assessed accurately enough to hold on to information that a naive computer analysis might dismiss as negative or inappropriate.
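To make the domain problem concrete: in TV chatter, words like "killer" or "terrifying" are often praise. A toy bag-of-words scorer with a TV-specific override lexicon (all the word lists here are invented for the sketch) might look like:

    NEGATIVE = {"boring", "awful", "killer", "scary", "terrifying"}
    POSITIVE = {"brilliant", "stunning", "love", "loved"}
    # In TV chatter these 'negative' words are usually praise, so override them.
    TV_OVERRIDES = {"killer": +1, "scary": +1, "terrifying": +1}

    def score_tweet(text: str) -> int:
        """Naive bag-of-words sentiment with TV-domain overrides applied."""
        score = 0
        for word in text.lower().split():
            word = word.strip(".,!?#")
            if word in TV_OVERRIDES:
                score += TV_OVERRIDES[word]
            elif word in POSITIVE:
                score += 1
            elif word in NEGATIVE:
                score -= 1
        return score

    print(score_tweet("That episode was terrifying, loved it!"))  # 2 (positive)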

Questions from the audience...

Has connected TV become a bit more mainstream, and are there then lots of opportunities to do more with that?
Yes, because the thing about connected TV is that you can catch people changing channels and see what they are watching at any given time, and obviously providers can plug into that data coming upstream. I think there is going to be a lot more call for the creation of synchronised content.
And also the second screen data itself can be very valuable in providing an alternative way to measure how audiences perceive the content that is being broadcast.

Have you done any user testing where you are focusing on the twitter feeds?
Under 35s love it and over 35s don’t really go for it.

So what do under 35s love about it? Is it completely random beyond the concept extraction, or have you got any leads for curation?
All we have done in the pilot is match hashtags manually: we selected what we thought people would like and passed it through to the second screen app, so we are basically filtering out some of the rubbish content. In the future this could be more automated and intelligent.
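In code terms, that manual curation amounts to little more than a whitelist plus a crude block filter; a sketch (the hashtags and blocked terms are invented):

    CURATED_HASHTAGS = {"#frozenplanet"}    # chosen by hand for the pilot
    BLOCKED_TERMS = {"spam", "follow me"}   # crude 'rubbish' filter

    def passes_curation(tweet_text: str) -> bool:
        """Keep a tweet only if it uses a curated hashtag and contains
        nothing from the blocklist."""
        text = tweet_text.lower()
        return (any(tag in text for tag in CURATED_HASHTAGS)
                and not any(term in text for term in BLOCKED_TERMS))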

What are your thoughts about where it might go in the future?
At the moment 40 percent of all Twitter traffic at peak viewing times is about TV programmes, so we are more interested in identifying, capturing and measuring that, and being able to give that data to other apps like this that can potentially give you statistics on how people are receiving a programme in real time. The analysis side of things is where we are moving: we want to create a data provision platform so that people can use that data to build whatever they need around broadcast TV. That might be content discovery through social media, where an app shows you what's playing and how people are talking about it. Or broadcasters might use the data to see at exactly what point in a particular programme Twitter activity went up, to get real in-depth insight into what's going on.
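That last use, seeing at what point in a programme Twitter activity went up, reduces to bucketing tweet timestamps; a minimal sketch, with the timestamps made up:

    from collections import Counter
    from datetime import datetime

    def tweets_per_minute(timestamps: list[datetime]) -> Counter:
        """Bucket tweet timestamps into per-minute counts so spikes stand out."""
        return Counter(ts.replace(second=0, microsecond=0) for ts in timestamps)

    stamps = ([datetime(2012, 1, 18, 21, 4, s) for s in (5, 20, 40)]
              + [datetime(2012, 1, 18, 21, 5, s) for s in range(6)])
    for minute, count in sorted(tweets_per_minute(stamps).items()):
        print(minute.strftime("%H:%M"), count)  # 21:04 3, then a spike: 21:05 6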