This is a post about AVEs (advertising value equivalency).
Do you ever read stories like this one from Kyle Boone of CBSSports.com and think to yourself: God, this person really doesn’t fully grasp what they’re writing about?
I had a moment like this last week. I was innocently scrolling my Twitter feed when I came across the article I mentioned above and felt as if I had to write something about it.
Essentially, this is what happened…
The Gonzaga Men’s Basketball team went on a historic run in the NCAA Tournament, reaching the Final Four for the first time in school history. Obviously, March Madness is one of the most reported, tweeted, and talked-about sports events of the year. So, if you put two and two together, you can assume the University saw a larger-than-normal number of news stories about the school and its basketball team. In media measurement terms, we might say its share of voice went up.
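For anyone new to the term, share of voice is usually just a brand’s mentions divided by all tracked mentions in the set. A minimal sketch of that arithmetic (the figures below are invented for illustration; they are not Gonzaga’s actual numbers):

```python
def share_of_voice(brand_mentions: int, all_mentions: int) -> float:
    """A brand's mentions as a fraction of all tracked mentions."""
    if all_mentions == 0:
        return 0.0
    return brand_mentions / all_mentions

# Hypothetical figures for illustration only
before = share_of_voice(1_200, 48_000)    # a quiet month of coverage
during = share_of_voice(9_600, 120_000)   # a Final Four run
print(f"Share of voice rose from {before:.1%} to {during:.1%}")
```

Unlike an AVE, this metric at least describes something real: your slice of the conversation relative to everyone else’s.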
I can hear all my measurement friends across the country sighing hard at this headline. Who would be this reckless with their media measurement, not even qualifying this metric to explain how a number that high could be generated?
The article appears to reveal that the school was using a web monitoring service, Meltwater, to automatically calculate a metric for media exposure, a metric the measurement industry is turning its back on: the AVE.
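For context, an AVE is typically produced by pricing earned coverage as if it were paid advertising: take the size of the placement (airtime, column inches, etc.) and multiply by the equivalent ad rate. A rough sketch of that logic (the rate and airtime below are invented, and this is not Meltwater’s actual methodology):

```python
def broadcast_ave(airtime_seconds: float, rate_per_30s_spot: float) -> float:
    """Naive broadcast AVE: earned airtime priced as paid 30-second spots."""
    return (airtime_seconds / 30) * rate_per_30s_spot

# Invented example: 90 seconds of highlight coverage priced at a
# hypothetical $50,000 rate for a 30-second ad
print(broadcast_ave(90, 50_000))  # -> 150000.0
```

The math is trivial; the flaw is the premise. Earned coverage is not a paid ad, so pricing it as one says nothing about whether the coverage actually moved any outcome that matters.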
Here is why this is a problem, from both a media monitoring and a media measurement perspective.
When they mention “hiring Meltwater,” I think it is important to note that they are specifically talking about hiring an automated monitoring service to produce calculated metrics. With a service like that, there is normally no human touch. It is a software-as-a-service tool. In a SaaS model, you essentially set your search, modify the search terms as needed, then accept the results and automated measures the software turns out. In most cases, there is no analyst ensuring the outcomes are accurate unless you subscribe to enhanced services.
Even if this measure of ad value were valid, and I can assure you no AVE is accurate, $406 million doesn’t even capture the full scope of this coverage. Within the article, the author mentions that: “Meltwater [the monitoring service] tracked Gonzaga’s coverage on television and online sites from the beginning of March Madness through the April 3 championship game in Glendale, Arizona.” The media monitoring company left out one major media type, plus radio and possibly social media, when conducting this analysis.
The traditional print media type is not accounted for in the analysis mentioned in the article! This $406 million estimate would be much higher if it were to include stories coming from newspapers. It can also be argued that print sources are traditionally richer in ad-value… for those who are keeping track.
AVE: An Inappropriate Metric
But ultimately, the fact that they used AVEs to show VALUE is what is most concerning from a media measurement perspective. Still, I must give credit where credit is due. Within this article, Gonzaga did include some measures that are reliable and should be part of any media measurement effort.
Good PR Measures
It is mentioned in the article that the publicity appears to be helping Gonzaga recruit students to the university. Early inquiries from students for the 2018-2019 academic year are up by 10,000 compared to a year ago. In addition to that, website and social media traffic for the University have reached all-time highs, in conjunction with the run to the Final Four.
What was that? You’re up 10k+ in student inquiries, year over year, and you’re showcasing AVEs?
It seems doubtful that anyone on Gonzaga’s Board of Trustees is going to care about the publicity value (I shouldn’t assume), but they will care about an outcome metric (like a big rise in student inquiries) that is driven by this publicity.
While it is hard to knock Gonzaga for being excited about the outcomes they have seen from this Final Four publicity, it seems the AMEC (International Association for Measurement and Evaluation of Communication) community still has work to do educating the average public relations professional about the pitfalls and unreliability of AVEs.
However, don’t think that AMEC is silent on this age-old issue. One resource that can be used to educate your clients is this great article by Richard Bagnall, Chairman of AMEC and a senior global communications effectiveness consultant. His list of 22 reasons why you shouldn’t use AVEs will help PR pros move past suspect measurement, or at least get the conversation started.
What do you think? If you were a part of the Gonzaga PR department, would you use these metrics to report to your supervisor? I would love to hear your thoughts on this matter. You can Tweet me at @austinomaha and be sure to give me a follow!
Austin Gaule, PR Measurement Director at Universal Information Services