We Know If You’re Watching - How to Measure Your Digital Signage
Digital signage has improved hugely from a content perspective, and in terms of how easy it is to manage screens without clunky hardware. But are we still missing the third element: how to measure the success of our screens? Here's how we recently set out to solve that at ScreenCloud.
Digital signage has, until recently, had a fairly big blind spot. Conceived in the world of advertising, and more recently adopted by industries like education, retail, hospitality and places of worship as a means of refreshing and visualising key information, digital screens are something we all perceive the power of. But can we prove it? Digital signage as an industry has only recently begun to ask:
“How do I know if anyone is actually watching my screens?”
ScreenCloud's CEO Mark often compares digital signage today to early web experiences. For those of us working here who are too young to remember them, let's recap:
Bounce Rate, Attribution, and Behavioural Flow are now essential words in any self-respecting digital marketer's vocabulary. More importantly, modern websites are rarely altered unless the decision you want to make is supported by data. Choosing how to design a new landing page, where to send the user next or how to rejig your site hierarchy only comes after a good long look at Google Analytics, A/B testing platforms and other tools that tell you how your users are interacting with the site today.
Compared to the reams of data produced by websites, how many digital signage deployments can tell you the dwell time around each location, how many views there were per content piece, or how many viewing opportunities that screen generates a week?
It seems very few. Deciding what content to show, and when, on digital signage screens (excluding conventional media-buying channels, where the content schedule is optimised algorithmically) has to date been the equivalent of licking an index finger and holding it to the breeze.
So here at ScreenCloud we set out to perform a brief experiment, to answer the perennial question:
“How difficult is it to see who’s actually engaging with the digital signage content shown?”
Handily, in the modern era of SaaS, there are several analytics vendors that offer tracking methods such as footfall tracking, iBeacons (a Bluetooth Low Energy beacon protocol that can tell when a smartphone is in range), facial tracking, and more.
Having reviewed a variety of vendors and methodologies, we settled on a mutual Intel partner called Quividi, who produce anonymised facial tracking data using webcams. This was an important distinction for this initial experiment: we didn't want to use facial recognition to determine who was watching (demographics, ethnicity and so on), we just wanted to check whether anyone was watching at all.
Our setup consisted of mounting an Intel NUC behind the 4x1 video wall setup in front of the elevators on our floor, with a webcam mounted just underneath. Once we’d ensured the data was uploading correctly and the power settings were correct on the NUC, we simply left it running for an extended period of time.
You can see our webcam just below the centre right monitor:
This webcam would monitor:
- Screen traffic - how many people passed by the screens, showing when the peaks and troughs of traffic were
- Opportunities to See (OTS) vs Watchers - out of those that walked by, how many actually stopped to watch the screen
- Dwell time - how long someone gave their attention to the screen
What we found
Over the two-month duration of this experiment, we had some key learnings:
1. Nearly 50% of people passing by stopped to watch the screen content
Over the average week, we saw an OTS of 4,002, and 1,942 Watchers, expressed as a conversion ratio of 48.5%.
In other words, every second person who walks past the screens stops to look at them.
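The conversion ratio is simply watchers divided by opportunities to see. A minimal sketch of the calculation, using the figures above (the function name is our own, not a Quividi API):

```python
def conversion_ratio(watchers: int, ots: int) -> float:
    """Share of passers-by (OTS) who stopped to watch, as a percentage."""
    return 100 * watchers / ots

# Average week from our experiment: 4,002 OTS, 1,942 watchers
print(round(conversion_ratio(1942, 4002), 1))  # → 48.5
```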
2. ⅓ of the time people spend in front of the screen is spent actually watching the content
We learned that people standing in front of the screens, and people watching the screens, weren’t necessarily the same thing. For example, people might have meetings in that area or be looking at their phone, which is an important distinction to make, as many experiments would confuse passers-by with actual watchers.
On the average week, there were roughly 5,000 seconds of total attention time, compared to 13,000 seconds of dwell time. This is expressed as an attraction ratio of 38.4%.
This implies that roughly ⅓ of people’s time spent in front of the monitors is spent watching.
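The attraction ratio works the same way: attention time divided by dwell time. A quick sketch (the second counts above are rounded, so this reproduces the published 38.4% only approximately; the function name is ours):

```python
def attraction_ratio(attention_s: float, dwell_s: float) -> float:
    """Share of time in front of the screen spent actually watching it, as a percentage."""
    return 100 * attention_s / dwell_s

# Average week: roughly 5,000 s of attention vs 13,000 s of dwell time
ratio = attraction_ratio(5000, 13000)
print(f"{ratio:.1f}%")
```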
3. Engagement with the screens ebbs and flows throughout the week
Quividi gives all the above metrics broken down by day of the week and hour of the day. Using this, we gained far greater insight into the optimal times to reach viewers.
Statistically, the days where the screens made the most impact were Friday (50.59%), Thursday (49.82%) and Monday (48.43%).
Friday, stereotypically the least productive work day, produced the greatest conversion and attraction ratios. Despite nearly 100 fewer OTS than Tuesday, people watched the screens almost as much, and for more time than at any other point in the week.
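With per-day ratios in hand, ranking the days is a one-liner. A small sketch using only the three days quoted above (the data structure is our own illustration, not Quividi's export format):

```python
# Per-day conversion ratios from the experiment (percent);
# only the three days quoted in the article are included here
daily_conversion = {"Monday": 48.43, "Thursday": 49.82, "Friday": 50.59}

best_day = max(daily_conversion, key=daily_conversion.get)
for day, ratio in sorted(daily_conversion.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{day}: {ratio:.2f}%")

print(best_day)  # → Friday
```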
What we learned about the best times to schedule digital signage content
This initial experiment gave us some great insight into our own internal communications environment, and has triggered further thought and experiments on how we can better optimise the scheduling of content. There is still work to be done to reduce the amount of educated guesswork needed when determining the best time of day to schedule content, and that's something we'll strive for in the future.
In particular we’ve gained clarity on:
- Sharing employee information - The best times to show staff-relevant content such as HR benefits, metrics and new-joiner information are smaller windows than expected, with Thursday and Friday the best days. This also prompts us to think about staff who work from home on Fridays, usually for family or convenience: if we're aiming performance and HR content at specific "slots" during the week, we need to ensure all staff receive the same information.
- Improving automation - Earlier in this article, I mentioned how content placement in traditional advertising channels has become increasingly algorithmic, optimised via machine learning to produce the maximum profit. Digital signage must continue to evolve towards this, but instead of profit it must optimise for views and watch time. The power of automation, machine learning and interactivity, combined with digital signage, could produce some massive efficiency gains for companies, such as: a) automatically re-weighting content play frequency, to prioritise the pieces of content that need to meet play or view figures; b) giving compliance information greater weighting in areas with a poor completion rate; c) automatic takeovers when critical events require attention, such as product or service outages.
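The first of those ideas, re-weighting play frequency towards content that is behind its view target, can be sketched in a few lines. This is a hypothetical illustration of the idea under our own assumptions (target view counts per content piece, a small floor weight so met-target content still plays), not a ScreenCloud or Quividi feature:

```python
def reweight(targets: dict[str, int], actual_views: dict[str, int]) -> dict[str, float]:
    """Boost play weight for content furthest behind its view target.

    Returns normalised play weights summing to 1. Content that has already
    met its target keeps a small floor weight so it still appears occasionally.
    """
    floor = 0.05
    # How many views each piece still needs (never negative)
    gaps = {name: max(targets[name] - actual_views.get(name, 0), 0) for name in targets}
    total_gap = sum(gaps.values()) or 1  # avoid division by zero when all targets are met
    weights = {name: floor + gaps[name] / total_gap for name in targets}
    norm = sum(weights.values())
    return {name: w / norm for name, w in weights.items()}

# Hypothetical week: the compliance video is behind target, the menu promo has met its target
weights = reweight(
    targets={"compliance-video": 500, "menu-promo": 300},
    actual_views={"compliance-video": 120, "menu-promo": 340},
)
```

The content behind its target ends up with most of the play weight, which is exactly the "greater weighting in areas with poor completion rate" behaviour described above.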
The next step for us is to integrate a solution like Quividi with the ScreenCloud platform, to give our customers more detailed information on digital signage views.
All of the above begins with analytics, and knowing if people are watching. The answer, for at least half of us here at ScreenCloud, is yes.