Last weekend at a neighborhood block party, I found myself standing in the middle of a group of 3- to 6-year-olds when one remarked, “You are tall!” Standing next to a bunch of preschool- and elementary-aged kids, I was the tallest person there. The little boy’s statement was perfectly accurate.
But as soon as I went back to the adults’ area, I was no longer tall. At 5 feet 9 inches, I was completely average. The little boy’s statement was no longer accurate: I was not tall.
What does this have to do with analytics?
In a word, everything!
Data Accuracy Is a Red Herring
Analytics is more than just providing accurate numbers. It’s about providing context and presenting data in a way that tells a story that results in actionable business outcomes. Saying I am tall while surrounded by 3- to 6-year-olds is accurate. Saying I am average height while I stand with adults is accurate. But neither tells a story or provides context for why my height even matters.
This focus on “accuracy” is a mistake I see many organizations making right now. Leaders say “I want accurate data” or “I need accurate reports,” but focusing on accuracy as a goal distracts from the real goal of extracting value from those reports.
This week, I received an email from one of my marketing colleagues. He had run an event in Q2 and was looking to understand how well the event performed. He had pulled three different reports: two from Salesforce and one from Tableau, with the goal of understanding whether his event created pipeline for the sales team.
- The first report showed a number, let’s say $100,000. The report was looking at Salesforce’s version of first-touch attribution.
- The second report showed a different number, let’s say $500,000. This report was leveraging Salesforce’s version of Campaign Influence.
- The third report showed a third number, let’s say $700,000. This report was leveraging a home-built version of Campaign Influence from Tableau.
The marketer’s email asked: “Which report is accurate?” The first thing that came to mind was “All of them.” But as I said earlier, accuracy doesn’t answer the real question the marketer wanted answered.
In actuality, none of those reports he pulled answered his original question, despite all of them being “accurate.”
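All three numbers can be “accurate” at once because each report answers a different question. As a rough illustration (this is not Salesforce’s or Tableau’s actual logic, and the campaign names, deals and amounts below are made up), here is how two common attribution models assign different pipeline credit to the very same deals:

```python
# Illustrative only: a toy model of why attribution reports disagree.
# Each deal is a (touchpoints, amount) pair; touchpoints list the campaigns
# in the order the prospect engaged with them.

def first_touch(deals):
    """All pipeline credit goes to the first campaign on each deal."""
    credit = {}
    for touches, amount in deals:
        credit[touches[0]] = credit.get(touches[0], 0) + amount
    return credit

def even_split(deals):
    """Credit is divided evenly across every campaign that touched the deal."""
    credit = {}
    for touches, amount in deals:
        share = amount / len(touches)
        for touch in touches:
            credit[touch] = credit.get(touch, 0) + share
    return credit

deals = [
    (["webinar", "q2_event"], 200_000),  # event was the second touch
    (["q2_event", "email"], 100_000),    # event was the first touch
]

print(first_touch(deals)["q2_event"])  # 100000 -- only the deal it touched first
print(even_split(deals)["q2_event"])   # 150000.0 -- a share of both deals
```

Both functions run over identical data and both outputs are correct; they simply answer different questions about how the event “created” pipeline.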
Here are three tips for moving from data accuracy to business insights.
1. Align on Terminology and Definitions
If the consumers of your dashboards don’t understand the terminology, then the dashboard is worthless. At a former company of mine, one dashboard showed results for a segment called “Americas” side by side with a segment called “North America.” Huh? Is “Americas” a superset of North America? Does it include South America? Mexico? The average consumer of this dashboard would never know!
Additionally, try to avoid obscure and confusing acronyms unless you are building a data dictionary to accompany the dashboard. At one company I worked for, CRM meant Customer Relationship Marketing, whereas at others it meant Customer Relationship Management. When you refer to CPA (cost per acquisition), does “acquisition” mean a paying customer or a top-of-funnel form completion?
The more time people spend explaining or learning confusing terminology, the less time they spend understanding and analyzing the data.
Related Article: 5 Signs Your Analytics Dashboard Needs an Update
2. Sales Reports and Marketing Reports Should Contain the Same Targets and Actuals
A surefire way to make incorrect business decisions is to base them on conflicting targets. The easiest way to avoid this is to make sure all sales reports and all marketing reports use the same underlying data. If the sales target is 1,000 opportunities, the marketing target is 600, and both reports show 800 actual opportunities, does the CEO think the business is 20% below target or 33% above? If the two teams developed their targets separately, then both reports are “accurate”!
In this scenario, marketing thinks its programs are performing well and will probably invest more in the same “successful” programs it is running now. But if marketing is actually 20% below goal, wouldn’t it want to adjust its program mix to try to improve performance?
And what about the CEO who might only see the marketing dashboard? She may now have a biased opinion of how each respective team is performing.
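The conflict here is pure arithmetic on the same actuals. A quick sketch, using the made-up targets from this section:

```python
# Same actual result, different targets: the same number reads as
# "above plan" on one dashboard and "below plan" on the other.
actual = 800
sales_target = 1_000
marketing_target = 600

def vs_target(actual, target):
    """Percent above (+) or below (-) target."""
    return (actual - target) / target * 100

print(f"vs. sales target:     {vs_target(actual, sales_target):+.0f}%")      # -20%
print(f"vs. marketing target: {vs_target(actual, marketing_target):+.1f}%")  # +33.3%
```

Neither dashboard is lying; they just disagree on the plan. Aligning both teams on one set of targets removes the ambiguity entirely.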
Related Article: Intent Data and the Gap Between Sales and Marketing
3. Divide Your Analytics Teams Into 3 Focus Areas
Many organizations think that everyone in “analytics” has the same skill set and responsibilities. This is a huge mistake. I typically separate the “analytics” function into three key areas: Reporting, Analysis, Data Science.
- Reporting: Responsible for building basic reports and dashboards and ensuring the data is correct. This person should be focused on items like tip two.
- Analysis: Responsible for answering key business questions by leveraging reports, dashboards and custom models. Also responsible for ensuring that the reports and dashboards are understood by business users (tip one).
- Data Science: Responsible for predictive data modeling, data engineering and forecasting.
With three focus areas, you can hold individual members of the analytics team accountable for responsibilities that align with their individual skill sets. Don’t expect the reporting person to be sharing insights and don’t expect the analysis person to be building predictive models. Put each person in a position to be successful based on their area of expertise.
One final anecdote: I saw a meme recently that said “85% of statistics are made up … or is it 65%?”
I thought this was clever, but it didn’t fit well for this article. So I decided to write my own (made up) meme.
“85% of reports are accurate, but only 25% of them lead to better business decisions.”
Justin Sharaf is a marketing and marketing operations leader who has worked at some of the biggest names in B2B and B2C during his 15+ year career. He is currently Vice President of Marketing Operations at Collibra.