One of the greatest marketing lessons I have learned came from my sociology grad school advisor. After spending considerable effort preparing to present ideas for my first big research project, I opened with the comment, “In my experience…”
My advisor stopped me mid-sentence, his eyes got large, and he took a deep breath. “Listen to me,” he said. “Your personal experience is irrelevant here. You are not a representative sample.” He asked me to start my work over.
Marketers Are Not Normal
In 2013, I worked on a study titled Marketers from Mars* that compared the online habits of digital marketers with those of online consumers (basically everyone except digital marketers). Not surprisingly, digital marketers were more engaged in just about every form of online activity than the average consumer.
As a digital marketer, I spend much of my day thinking about websites, emails, and points of potential engagement. I scrutinize the copy and image selection on the sites I visit. I say things like, “I wonder if the email I just received came directly from that salesperson, or if it was written by a copywriter and set up as part of a drip campaign.” Worse still, I think about how every action might be tracked, measured, and optimized.
Your average consumer doesn’t do these things. No matter how digitally savvy consumers are becoming, the fact remains: they take a more pragmatic approach to online shopping, research, and decision making than I do.
Test Your Assumptions
We have all heard about what happens when you ass-u-me. And yet, we do it all the time in email marketing. All too often, we go with the creative options that appeal to our own senses. We imagine the timing at which we would want to receive messages. Even the creative briefs and testing plans we develop are colored by our personal experience, unless we consistently and consciously test our assumptions.
To be clear, some of the biggest breakthroughs stem from personal experience, from imagining a better way after something happens to us. But this is where, as marketers, we must separate our personal experience from our professional experience. While personal experience may provide a basis for innovation, it is completely irrelevant as a basis for “best practices.” Best practices, the bedrock on which we build email programs, should always be the result of rigorous testing.
That said, email best practices are simply rules meant to be broken and evolved.
Establish a Culture of Evolving Best Practices
We have a saying at Trendline, “We know best practices, but we don’t always follow them. If everyone followed best practices, you would simply redefine mediocrity.”
The hope behind this is to create a culture of evolution, to avoid settling. It is not to throw caution to the wind. The reality is that best practices are ever changing. They represent nothing more or less than “the best we know for our organization as of today.”
Evolving “best practices” for your organization is as simple as the following:
- Industry baseline – there are plenty of statistics that suggest what works best, but they are general and not specific to your organization (its objectives, goals, etc.). Start here as a baseline, but look to move on to the next step quickly.
- Historical performance – evaluate the performance of your own program and look at your best- and worst-performing emails. As you note similarities, you are building your own best (and worst) practices.
- Collect alternative hypotheses (challenge orthodoxies) – this often-overlooked step is simple if you apply a bit of discipline. As people in your organization ask, “Why don’t we do it this way?” or “Have we tried XYZ?”, keep a list. These alternative hypotheses form the basis for future testing. If you have trouble getting input, I have had clients create a pool: something as simple as “everyone puts in a dollar with each subject line they submit, winner takes all” can work wonders for encouraging participation.
- Beat the control – the control should embody all established best practices. Run tests with clear evaluation criteria tied as closely to your business KPIs as possible. For example, if you are trying to drive leads, measure leads, not open rate. And if you can’t measure revenue, pageviews may be the best metric you can track consistently. We have even developed models based on driving multiple KPIs, with best practices that offer different rules for driving different KPIs.
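To make the “beat the control” step concrete, here is a minimal sketch of how a variant might be compared against the control on a lead-conversion KPI, using a standard two-proportion z-test. The function name and all of the numbers are hypothetical, for illustration only; they are not from the study described above.

```python
# Minimal sketch: did the variant beat the control on a lead-conversion KPI?
# Uses a two-sided two-proportion z-test, built from the standard library only.
from math import erf, sqrt

def two_proportion_z(control_conv, control_n, variant_conv, variant_n):
    """Return (z, p_value) comparing variant vs. control conversion rates."""
    p1 = control_conv / control_n
    p2 = variant_conv / variant_n
    # Pooled conversion rate under the null hypothesis of "no difference".
    pooled = (control_conv + variant_conv) / (control_n + variant_n)
    se = sqrt(pooled * (1 - pooled) * (1 / control_n + 1 / variant_n))
    z = (p2 - p1) / se
    # Two-sided p-value from the normal CDF, expressed via erf.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical results: 50,000 sends per cell, leads as the success metric.
z, p = two_proportion_z(control_conv=400, control_n=50_000,
                        variant_conv=470, variant_n=50_000)
print(f"z = {z:.2f}, p = {p:.4f}")
if p < 0.05 and z > 0:
    print("Variant beat the control; consider promoting it to the new best practice.")
```

The point of the sketch is the discipline, not the particular statistic: the success metric plugged into the test should be the KPI itself (leads, revenue, or pageviews as a fallback), never a proxy like open rate.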
The key to evolving any program is constant improvement. It is easy to fall back into the old habit of going with what feels right to us. In contrast, email programs that take a measured approach to establishing and evolving best practices not only perform better, but also deliver consistent, predictable year-over-year gains that are the envy of marketers working in other channels.
Want to speak with us? Contact us for a no obligation discussion about your email marketing needs.
*Original publication of the Marketers from Mars piece can be found here.