What Data Should I Collect?

I got a report the other week. Loaded with figures. It was all about ‘engagement’, apparently – which was meant to justify some very large sum being spent monthly on web-based marketing. ‘What do you think of this?’ the client asked.

I answered that with a question of my own. ‘What does “engagement” MEAN?’ I won’t recount the rest of the exchange, but I will tell you how it ended. I asked, ‘How many EXTRA, QUALITY business leads did you get through this activity, and how much EXTRA cost was there as a result of doing it, including the value of your time – ALL of it?’

The answer to both questions, of course, was that they did not know. There are reasons for this.

BIAS alert: I trained as an experimental scientist! I’m a chartered accountant! I love data… IF the science behind it makes sense!

What are We Trying to Do Anyway?

The first reason is that ‘engagement’ is probably the worst-defined concept I have ever come across. If you don’t know what you are actually measuring, no measure makes sense.

Experimental science works by having an idea of how something might work (a theory), creating predictions (hypotheses) from that idea, and then trying to disprove the theory. Over time you end up with the best working model – or, as it may also be known, understanding. Data science works by measuring thousands of possibly unrelated data points and seeing which ones seem to clump together*. Measurement for measurement’s sake. Here’s the worse news. If you measure enough stuff and do enough maths on it, eventually you’ll find things that vary together (‘correlate’). Then you’ll apply the good old A-therefore-B reasoning of classical logic and you are away: A causes B. Except it may well not. A could cause D, and D cause F, and F cause G, and G cause B. Or maybe there’s no relation at all and it is just chance. Or maybe A and B are both caused by some X you haven’t thought about.
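To make the confounder point concrete, here is a minimal sketch in Python (every number invented for illustration): a hidden factor X drives both A and B, so A and B correlate strongly even though neither causes the other.

```python
# A minimal sketch of a spurious correlation: the hidden factor X
# causes both A and B, so A and B correlate even though neither
# causes the other. All numbers are invented for illustration.
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

x = rng.normal(size=n)             # X: the unmeasured common cause
a = 2.0 * x + rng.normal(size=n)   # A depends on X only
b = -1.5 * x + rng.normal(size=n)  # B depends on X only

# Strong correlation (about -0.74 here) despite no causal link A -> B.
print(np.corrcoef(a, b)[0, 1])
```

Run the A-therefore-B logic on those two columns and you’d happily conclude A drives B, when in fact neither has the slightest effect on the other.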

I have laboured this point because it is actually important: no theory means no understanding. In BD terms, the key question is HOW people decide to buy professional services. My own view is that in most instances, your good clients will ultimately come from things that data science measures little or not at all. Indeed, my view is that the huge majority of budgets for this sort of stuff represent money wasted. AI will make this better over time, but it will only ever provide answers based on the data collected…

What Does it Mean?

The second reason is that if a concept is defined in terms that are not measurable, then no measure you ever make will be meaningful.

The right time frame?

Time periods matter. If you measure any data over the wrong time period, you miss the point. Business Development (BD) directors are typically expected to show ‘results’ over a short period, and to have activities and measures that produce that.

I have always regarded short-term BD as really dumb: you produce much better ROI doing things which have an apparent ROI of zero for a long time, but then work. Networking/influencer strategies are a good example: they take ages to work, but once working, they cost peanuts to keep going.
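A back-of-envelope sketch (in Python, with every figure invented) shows the shape of the argument: a paid channel that returns a steady margin from day one, against a networking strategy that shows nothing for eighteen months and then costs peanuts to keep going.

```python
# Back-of-envelope comparison of cumulative net return.
# Every figure here is invented purely to show the shape of the curves.
def cumulative_net(months, monthly_cost, monthly_return, ramp_up=0):
    """Net return after `months`; returns only start after `ramp_up` months."""
    earning_months = max(0, months - ramp_up)
    return monthly_return * earning_months - monthly_cost * months

for m in (6, 12, 24, 36):
    ads = cumulative_net(m, monthly_cost=2000, monthly_return=2500)
    networking = cumulative_net(m, monthly_cost=500, monthly_return=4000, ramp_up=18)
    print(f"month {m:2d}: paid channel {ads:>7,}, networking {networking:>7,}")
```

On these made-up numbers the networking strategy looks like a dead loss for a year and a half, then romps past the paid channel – exactly the pattern a short-term measure will punish.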

A ‘results driven’ approach with short-term measures will never achieve optimal long-term ROI. (The other reason for this is that all such strategies are easily copied and tend to work only over the short term, requiring constant attention… which is why your SEO gurus keep on at you to spend, spend, spend.) My experience as a partner and very successful business developer is that the best (most profitable and fun) clients usually arrive because of a multitude of factors (no, not just my magnetic personality, extreme brilliance, etc.) and often over a very long time, with numerous ‘touch points’.

Fuzzy data

The joy of knowing exactly why you measure something is that it defines how and what you measure. I have seen dozens of different ‘measures of engagement’, because engagement is such a fuzzy concept that almost anything can be taken to measure it. If the data are too vague to make sense, you may have a lot of difficulty… the clue is that if ten different things are all posited as valid measures of the same concept, you are looking at a phoney concept.

Are the answers right anyway?

So you get a new client and they have to say ‘where they came from’. You give them six possible answers (this is driving data analysis, remember), of which they can choose only one. So, take the client who recalls your firm’s name from somewhere; then a friend mentioned you; then they looked at Trustpilot; then they looked up the name they were given on your website; then they called in at the office and the receptionist was much friendlier than at the other firm; and the fee-earner gave them a proper quote. Where did they come from? The answer, because it is forced and easy to say, is probably ‘via web search’. One of my big moans (yes, I do have a LOT) is that so few people in practice marketing have ever actually been responsible for driving in advisory business. These are NOT simple sales: I am often shocked at the lack of understanding of the process in some BD types, even in-house.

So what should you measure?

The bad news – I don’t know, but the aim should be to measure as little as possible over as long a period as possible. I start with how many times the phone rings with QUALITY enquiries. Then experiment! The aim should be, over time, to look at less and less and to concentrate on the things you can nail down as predictors… and to do that well. You need to spend a lot of time making sure that the data you collect are right (go back 100 words…).

We don’t even look at our web stats any more. Haven’t for years. What we do know is that our new business is utterly unrelated to the number of visitors to our website (though of course, nearly all our new clients do look at one of our websites before they pick up the phone). What we DO do is the maths up front: if the maths says an activity will work, we do it. Normally we’re right. If we’re not, we stop and start a new experiment. Over time this has worked really well, because it is based on an understanding of the process.
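For illustration, the ‘maths up front’ test might look like this sketch (in Python; every input and the function itself are hypothetical): expected quality enquiries times conversion times average client value, set against the FULL cost including your time.

```python
# A sketch of the 'do the maths up front' test. Every input is invented;
# the point is the discipline: count ALL the cost, including your time.
def expected_roi(enquiries, quality_rate, conversion_rate, avg_client_value,
                 cash_cost, hours_of_time, hourly_rate):
    expected_value = enquiries * quality_rate * conversion_rate * avg_client_value
    full_cost = cash_cost + hours_of_time * hourly_rate  # ALL of your time
    return (expected_value - full_cost) / full_cost

roi = expected_roi(enquiries=40, quality_rate=0.25, conversion_rate=0.5,
                   avg_client_value=3000, cash_cost=4000,
                   hours_of_time=20, hourly_rate=150)
print(f"expected ROI: {roi:.0%}")  # about 114% on these made-up numbers
```

If the answer comes out below zero, the experiment never starts; if it comes out positive and reality disagrees, you stop and try something else.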

Here’s the bad news. I’m not going to tell you what we do measure, other than who clicks what in our e-newsletter (well, that’s a gimme). Suffice it to say, there are only two other measures we ever look at… and the result is an extremely high ROI on our activity.

I am deeply cynical about the smoke and mirrors of the marketing industry. Successful practice growth comes from doing a mixture of effective BD things and (crucially) training and involving your people to be good at building relationships and spreading influence. Get the basics right and you will fly – and you can fly at EasyJet prices, not BA ones. Don’t, and you won’t, no matter how ‘whizz-bang’ the solution you are promised.

* Data science told the Republicans to bombard me with messages recently, having pegged me as a baseball-loving, middle-aged American from a conservative state. Alas, it didn’t catch on that I have never voted there, haven’t lived there for 50 years so couldn’t anyway, and have voted Liberal/Lib Dem all my life – so not their core constituency.

Joe Reevy, LegalRSS


Interested? Drop us a note