Is it a question of the known unknowns, or the unknown unknowns?
When talking to organisations about what thinkTribe does, and what makes us different, we are always keen to get a live trial of our monitoring suite or testing services running on their own sites, using their own data.
Proof of Value, as it’s sometimes called. Show, don’t tell, if you prefer. We can tell you how it works, what it does, and why and how it does it, but in the end it’s like the Matrix. To misquote Morpheus: “No one can tell you what thinkTribe does, you have to see it for yourself.”
Sometimes, when we suggest this, people think that the return on their time may not be enough to justify the hassle. People are familiar with the tools they currently use, so at first they understandably assume that “website monitoring tools must all be pretty much the same”.
If they get little added value beyond alerts and basic “up or down” information out of the tool they currently have, they assume any other similar sounding tool would deliver much the same. It would be useful to have, but not business critical. It would provide some numbers for the weekly report, but not strategically vital business intelligence.
It’s the Black Swan Syndrome*.
So in some cases they’ll politely turn our offer down, saying, ‘Thanks but no thanks, I’m sure your tool is similar to what we have, but it’s not an area I’m looking to review right now’.
That’s probably why, if you take a look over our client list, you’ll see that it’s made up of people who really care about their online properties. The innovators, the forward thinkers, those out there on the cutting edge. They come in all shapes and sizes, from all kinds of industries, and with all kinds of brands and business models; but what they share is an understanding that decisions based on data, on actionable intelligence, are what will keep them out in front.
Henry Ford once said: “If I had asked my customers what they wanted, they would have said a faster horse.” Which is to say: if you don’t know something exists, how can you know whether you want it? Imagine you needed a way to travel 100 miles and were expecting a fast horse, and then someone gave you a Ferrari…
Now imagine you were expecting an alerting tool that does the kind of basic monitoring that checks through a list of pre-defined URLs and tells you if it can reach them… and then someone showed you thinkTribe’s Dynamic User Journey Monitoring! Thinking in terms of horses and sports cars, both are perfectly adequate modes of transport, and there are valid reasons for having one or the other – but they are not equally appropriate in all situations!
Most people are now aware of, if not actually using, website monitoring of some kind. A few years ago a handful of techies might have been aware of the arcane sciences of measurement, but no one else was concerned. As online grew into eCommerce and then into complex multichannel activity, monitoring quickly became a concern for the entire business.
Of course, sometimes a person has heard of what we do and comes straight to us; sometimes an individual who has used our monitoring moves to a new company and encourages their new team to trial it as an improvement on what they have. But most people we run trials for are currently using another, more traditional, system.
Looking back at the new clients we’ve started working with this year, I noticed that in most months a significant number of organisations moved from their existing monitoring tools and approaches to embrace ours. Digging deeper, I found some reasons, some unknown unknowns, that these clients had in common.
Looking at the most recent such new user, their Proof of Value started out in a common way: with a company-wide initiative to review Best Practice across all their suppliers. An IT Change Manager had pushed the Operations Team, as the main users of the website monitoring tools, to look around and see how their current monitoring arrangements compared with the best on the market.
Something obviously happened during the trial with us, because they ended up signing a multi-year deal, with several months overlap on their existing simple monitoring contract. There had been some kind of a Eureka moment.
This Eureka event, a shifting of expectations of monitoring, seems to be a shared theme amongst the clients upgrading to use our services.
For this client doing a Best Practice review, the trial ran a couple of the dynamic user journeys we’re best known for against their site. Our tools found a number of problems occurring on the client’s website that they were already aware of. Known unknowns. Things they’d not fixed, but were still tracking down. That didn’t excite them.
But more importantly, our service highlighted problems that were new to them: unknown unknowns.
We found real errors that users would suffer while their existing monitoring services and tools were, in contrast, showing all clear. One problem in particular was recurring but didn’t happen every time, maybe 1 run in 30 or 50: our virtual user was being logged out and sent to the home page when it was actually 5 or 6 pages into a purchase journey, with items in the basket.
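The arithmetic behind catching an error like that is worth spelling out. A minimal sketch (the roughly 1-in-40 failure rate comes from the example above; the run counts are purely illustrative) of how the chance of ever witnessing an intermittent error grows with the number of full journey runs:

```python
# Probability that an intermittent error, striking with probability p on
# any given journey run, is observed at least once across n runs.
def detection_probability(p: float, n: int) -> float:
    return 1 - (1 - p) ** n

p = 1 / 40  # an error hitting roughly 1 run in 40, as in the example above

for n in (1, 10, 50, 100):
    print(f"{n:>3} runs: {detection_probability(p, n):.1%} chance of seeing it")
```

A single spot-check has only a 2.5% chance of ever meeting this error, which is one reason a sporadic fault can hide behind an “all clear” dashboard; repeated full-journey runs push the odds of detection towards certainty.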
Suddenly the team, who’d initially just been looking for a comparison of simple alerting tools, was confronted with a problem in their technology that they’d not expected – an unknown unknown.
That was the eye-opening moment – it can be a shock to discover that there are identifiable patterns of errors occurring on your site that you were not aware of! But this client wasn’t tempted to go into denial; they immediately saw the benefit of getting visibility of these sporadic errors, and how the saved HTML and JS data our service provides for each one would help rapid troubleshooting, so they could fix them with the least hassle.
Across the board, all the clients who moved to our more sophisticated monitoring model had a Eureka moment concerning our Journeys. Like the users above, they realised that it was the dynamic user journey approach that was bringing these real, user-impacting errors to their attention. They saw the benefit of journeys that look into the page as it is actually served, in real time, and choose from the options and links genuinely present – and are therefore better able to find the real issues that impact real users.
Our Client Liaison team report other common Eureka moments when clients are trialling:
- Non-functional user experience problems – the journey works, but there are, say, missing product images for a few products out of the total inventory. If nobody can see it, nobody will buy it!
- Consistency of user experience – “so you mean sometimes it is really fast but actually 5% of my users can take over 2 minutes just to add something to the basket?”
- Realistic catalogue coverage with random selection – if you monitor with only a simple, static URL ‘journey’, which by definition always buys the same product, say tennis shoes, then you only know whether the processes and content involved in buying that one item are working correctly. You optimise it page by page, and anyone buying tennis shoes has the most wonderful e-commerce experience. But for all the other products on your site, the user experience may be poor. A product may offer a choice of colours, for example, which the tennis shoes do not; the system may have a bug only for certain product SKUs, or a problem with certain types or categories, or with content and data uploaded at a particular point in time; it may even be that in another section the “Add to Basket” button does not show at all! Taking live, random samples from across the product catalogue to exercise a realistic, authentic experience provides far more valuable data than an artificial reproduction of a journey frozen at the time it was created.
- Realisation that current static URL monitoring is actually following a series of URLs that not a single real customer follows! Missing redirects are the root cause. A static URL list may be accurate when first set up, but as the site evolves it’s common for redirects to be added behind the scenes. So while real users on your site may have their browser redirected 2 or 3 times between one page and the next (redirects are invisible to the user), the basic monitoring in place misses those redirect steps, so it is no longer doing what users do. Errors in the code behind those redirect steps will impact real users, but won’t be picked up by a static URL approach. The same goes for Ajax sections within a page, which static URL monitoring will often overlook entirely.
- Non user-affecting issues that upset analytics accuracy. For instance, “oh, it’s just a missing third-party tracking script” – but if that keeps breaking, the business has no reliable analytics or conversion data to use!
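The redirect point above can be sketched in a few lines of Python. This is a hypothetical simulation, not our product’s code: the URLs and the redirect map are invented, and a real monitor would follow HTTP Location headers rather than a dictionary. It shows how a URL list recorded at setup time drifts away from the chain a browser actually traverses once redirects accumulate:

```python
# Hypothetical redirect map: URL -> where the server now redirects it.
# A real monitor reads Location headers; this sketch just simulates them.
REDIRECTS = {
    "/checkout": "/secure/checkout",     # added after a platform migration
    "/secure/checkout": "/v2/checkout",  # added in a later release
}

def follow_redirects(url: str, max_hops: int = 10) -> list[str]:
    """Return the full chain of URLs a real browser would traverse."""
    chain = [url]
    while url in REDIRECTS and len(chain) <= max_hops:
        url = REDIRECTS[url]
        chain.append(url)
    return chain

# Static monitoring: checks only the URL recorded at setup time.
static_list = ["/checkout"]

# Journey-style monitoring: walks the chain as served today.
chain = follow_redirects("/checkout")

print("static list checks: ", static_list)
print("real user traverses:", chain)  # intermediate hops are invisible to
                                      # the user, and to a static check
```

A fault in either intermediate step would hit every real user passing through the chain, while a check pinned to the original `/checkout` URL alone could keep reporting green.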
*The Black Swan problem
“Not long ago people could imagine only white swans, because white swans were all they had ever seen. And so people predicted that every next swan they would see would be white. The discovery of black swans shattered this prediction. The black swan is a metaphor for the uselessness of predictions that are based on earlier experiences, in the presence of unknown unknowns.”
- Wikipedia – Black Swan theory
- Book: Taleb, Nassim Nicholas. The Black Swan: The Impact of the Highly Improbable.