Once, Post-It® Notes held a respected spot in my reporting toolbox.
Wait – before you start picturing me sporting seriously big hair and way-cool shoulder pads (Post-Its took off for 3M in 1980), I want to make something perfectly clear: “Once” wasn’t all that long ago. I realize this is a less-than-flattering confession from a modern marketer, but stick with me here (no pun intended). Yes, for a specific time period, the notion of “marketing metrics” and rainbow-hued cubes of adhesive-backed paper were married in my mind – but for very good reasons.
Here’s one: I was working as a social media/community manager. (For those who have chosen the lifestyle of a social professional – or have supervised someone who has taken this path – you can guess where I’m headed.) Back then, on any one day, I could be found shooting for the stars on a shoestring. Case in point: To manage and measure the social activity of my enterprise employer, I used a combination of free and affordable (small business) tools. Of course, none of these technologies spit out aggregated results with slam-dunk accuracy. Eventually, I learned how to calculate reliable results – pulling one metric from here, another from there, making apples-to-apples comparison adjustments – all the while accepting that the social truth lay somewhere between (and among) automated dashboards.
However, I wasn’t comfortable using this precarious tracking system for campaign or event measurement. Which brings me to the sticky note part of my story. One day a marketing colleague requested a report for our social activity during a conference. I decided to resort to manual recording. In between live tweeting and posting, I jotted down the number of tweets, retweets, likes, shares – as well as anecdotal details on conversations – on Post-It Notes, which I dutifully adhered to my 8 ½ x 11” day book pages. Referring to this fluorescent note mosaic and assorted screenshots, I compiled a post-event report.
Sounds like a protracted exercise in teeth-grinding, right? But not completely. In the process of scribbling, doodling and tabulating, I discovered something very valuable: the power of pause. Rather than compulsively keying in running totals for vanity metrics, I began lingering over my notes. Of course, social media pros are always scanning search columns and dashboards, looking for interesting correlations and engagement opportunities. But something about that ink-scrawled, multi-colored chaos in my day book pushed my thinking in new ways.
As I pored over my sticky note montage, unexpected trends began to emerge from the data, triggering new questions.
- Who are the social “stars” of the event so far (brands and individuals)? Are they engaging with us?
- Do these “stars” fall into any of our target personas?
- Are the event’s socially active influencers attending our speaking sessions? If not, which personas are?
- Which of our brand values have we communicated effectively? How are people responding?
- Any surprises in the social activity? How does it compare/contrast with the staff’s account of our booth experience? How can we, as a team, close the gap?
- What social experiments did we do? What worked – and what didn’t? Why? What’s working for others?
- How would we describe the social engagement with our brand? What are the most engaging topics/insights?
- Any story ideas to share with our PR/content team? Contacts to nurture?
- Any social predictions for the remaining days/hours of the event?
- Based on these observations, what should we do next?
I doubt even a third of these questions would have occurred to me, had I enjoyed access to reliable automated reporting.
And so, on the day of my epiphany, I stumbled upon the resonant truth of a popular marketing maxim: just because something is easy to track doesn’t mean we should be measuring it. Sometimes we need a break from templates, routines and assumptions to discover new needs – or even more meaningful goals and metrics. Markets are continuously changing, and so are our customers. Our marketing and measurement should evolve, too.
As Avinash Kaushik explains: “The primary reason for this is that automation is based on the assumption that every single day/week/month the question we want answered with data is exactly the same. While that was true through [the] early 1900s, it is no longer true. The world changes too much every day.”
While I wouldn’t say that all automated reporting is useless (or even, as Kaushik claims, “almost completely useless”), I’m convinced it’s dangerous for marketers because it encourages complacency.
Think I’m being melodramatic? Try this experiment, suggested by Kaushik: Turn off all automated reports on a random day/week/month and watch what happens. Is your inbox packed with urgent messages, demanding to know about a specific performance metric? Do any business initiatives come to a grinding halt? If not, you have fallen victim to Kaushik’s “Reporting Squirrel” syndrome (spending 75% or more of your time on data production and caching) and should probably revamp your approach to marketing measurement.
If we don’t make this change, our business partners will continue to rely heavily on gut-level decisions. Remember the last time your leader (or your leader’s boss) made a decision that clearly went against a well-documented data trend? When this first happened to me, I remember thinking, “Guess no one reads my monthly reports.” Later I realized it was possible that my reports were, in fact, read…but their measurements didn’t stick with (or set off sparks for) my leader and/or his executive team.
Although this automated dashboard is pretty, it lacks actionable insights and isn’t particularly meaningful to anyone outside the website and social teams that do the day-to-day work. Source: Avinash Kaushik’s Occam’s Razor blog.
So when it comes to ineffective marketing reports (and the measurements within them), a hefty part of the responsibility rests with us. When was the last time you revamped your reporting – on your own initiative? How recently have you asked for feedback on the relevance of your marketing data analysis? And I’m not talking about gently asking your leader in an email. To get candid responses (of a transformative kind), we must pursue them like they matter. (Insert mental picture of a slobbery bloodhound bounding on the heels of a fresh and irresistible scent here.)
Although collecting others’ feedback is ideal, you can also make your metrics roll-up more relevant – and your daily actions more impactful – on your own. Enter the impressive power of pause. Before you export a dashboard report, stop. Take time to let the data speak to you. Don’t be satisfied with facts. Instead, look for true insights – data interpretations that move decision makers to take action. It’s the difference between summarizing performance (aka “data puke” in Kaushik-speak) and telling a story that contextualizes the data and moves your marketing away from what’s not working and toward something else that is more likely to succeed. In addition to reporting what happened (past performance), we marketers should always explain the “why” behind the results, accompanied by recommendations for moving forward.
In a July 2014 post that I’ve bookmarked and referenced many times, Kaushik provides an example of a strategic marketing dashboard that not only tells a story (via explanatory copy), but also features actionable insights.
It’s easy to see the difference between this strategic dashboard and the aggregated data dump produced by standard marketing tools, such as the first report image featured in this post. A vast improvement, to say the least. But a report/dashboard can contain a “strategic” narrative and still be virtually useless. This happens if you’re tracking the wrong metrics. What’s the difference between “right” and “wrong” metrics? If your metrics are the right ones, your results will drive decisions that grow the business – such as optimizing what’s working to achieve your marketing goals. If you’re not taking action based on what you’re measuring, it’s a clear sign you need to change direction.
All too often, we think of a report as a wrap-up – a recurring task marking the tail end of the weekly/monthly grind. However, as Rachel Richter (our senior customer analytics and insights leader here at Dun & Bradstreet) often reminds us, the essential purpose of measurement is to drive ongoing decisions. It’s all about effectiveness, optimization and continuous learning. If you don’t understand what’s really working, you can’t make decisions that actually maximize your marketing effectiveness (via test-and-learn experiments, changes in spend levels or channels, etc.). Your measurement winds up being an empty gesture.
Rescue your marketing from this fate. Put your automated reporting on pause. Do a quick decision-making audit. How many of your metrics have resulted in strategic actions? Shut down your dashboard for a while. And what the heck. Maybe even break out a few Post-It Notes.