I sometimes feel like Diogenes searching for an honest man. Only I’m looking for an honest benchmark. What’s that? A yardstick for social marketing performance that tells me whether a program is doing really well or only relatively well. For example, if one community has 10,000 members and another geared to the same group with much the same objectives has 25,000 members, I might be high-fiving all around if I’m the community manager for the latter. BUT! Does the larger community look great only in relation to the smaller one, or are they both underperforming? If I look at the engagement levels of the 10,000 and find they are much more prolific than the 25,000, is the smaller community actually the high performer? We could go on and on.
Social marketing is an industry in search of benchmarks that will give organizations meaningful insights into the health and wellness of their initiatives. To a degree, I agree with a recent post by Matt Rhodes about the growth of a healthy online community. He makes the point that performance is relative to the type and purpose of a community. This is true, but benchmarks apply when we are trying to measure against peer groups. To get beyond the sophomoric comparison of page views to something truly meaningful, we need an algorithm-based approach to measurement. We need to filter multiple data points to create performance indexes. These should give us the insights we need to make surgical adjustments to community engagement and drive growth and vitality. In turn, these indexes provide a foundation for measuring performance against a peer group.
For example, we recently completed some research that looked at the differences in rewards and recognition preferences between consumer, IT Pro and developer communities. Not surprisingly, software developers have very different motivations than consumers for returning to a community and using it frequently. Before this research, we knew this intuitively. But now my client knows very specifically how to design reputation management systems that will resonate with the audiences it wants to engage. It also informs the selection of data points that can lead to meaningful performance indicators. We can, for example, look at engagement KPIs such as UGC volume, views, posts, comments, click-throughs, peer support, etc., in correlation with specific reputation management approaches. The same algorithm could be applied across a community peer group and yield benchmarks that give insights into not only what to adjust but also how to make changes.
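To make the idea concrete, here is a minimal sketch of what such an index could look like: each KPI is normalized against the best performer in the peer group, then combined into a weighted score. The KPI names, weights, and numbers below are purely illustrative assumptions, not figures from our research.

```python
# Hypothetical engagement index: normalize each KPI against the peer-group
# maximum, then take a weighted sum. Weights and KPIs are illustrative.

def engagement_index(kpis, weights, peer_max):
    """Score a community on a 0-100 scale relative to its peer group."""
    score = 0.0
    for name, weight in weights.items():
        # Normalize to [0, 1] relative to the best peer on this KPI.
        best = peer_max[name]
        normalized = kpis[name] / best if best else 0.0
        score += weight * normalized
    return round(100 * score, 1)

# Illustrative weights favoring participation over raw traffic.
weights = {"ugc_volume": 0.4, "comments": 0.3, "peer_support": 0.3}
# Best observed value for each KPI across the peer group.
peer_max = {"ugc_volume": 5000, "comments": 12000, "peer_support": 800}

# The two communities from the opening example: the smaller one is far
# more prolific per the engagement KPIs, the larger one less so.
small_community = {"ugc_volume": 4500, "comments": 12000, "peer_support": 800}
large_community = {"ugc_volume": 5000, "comments": 6000, "peer_support": 300}

print(engagement_index(small_community, weights, peer_max))  # 96.0
print(engagement_index(large_community, weights, peer_max))  # lower score,
# despite 25,000 members vs 10,000
```

On this kind of index, the 10,000-member community can outscore the 25,000-member one, which is exactly the distinction a raw membership count hides. Because every community in the peer group is scored by the same formula, the result doubles as a benchmark.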
If my score is low, I can tinker surgically to deepen engagement by changing, say, three things about the way I reward community members. This type of approach tells me both whether my community is relatively good in a meaningful way AND whether I am also really good! High fives all around.
The social marketing industry needs benchmarks to catapult it to the next stage of professionalism. I don’t think we can continue to gauge how we’re doing only in relation to our own objectives, or by using metrics that are interesting but not useful. ComBlu is looking for some smart folks who would like to collaborate on a benchmarking study for social marketing. We’re calling it the Diogenes Project. If you’re interested, contact me. In the meantime, we’ll keep searching.
Kathy’s forte is enterprise content strategy, content marketing and thought leadership. Over the past 40 years, she has worked with both emerging brands and large enterprises in developing content and thought leadership strategies. She has written several research reports and white papers, and was a key contributor to the Forbes Publish or Perish Report.