I was especially excited about this session, partly because I so enjoyed the “Evolution of Content” workshop with Randall and Elizabeth in Paris last year. But also, figuring out the right way to measure the success of content is the next step in my journey as a content strategist. I was hoping that Confab would give me some applicable things I could start using right away. I was pleased to see plenty of topics on the program that dealt with this issue — likely because now that everybody’s talking content strategy, clients are starting to ask reasonable questions about why they should care, and they expect good data to back up the answers.
Randall and Elizabeth talked about a number of things they look at and do to make data-driven content decisions. They also talked about a few things that people tend to cite in defense of content that are generally meaningless.
Search terms: Before making content recommendations, the speakers said they look at the search terms that people are using to get to the site. “People only care about what’s relevant to them,” they said. For example, search terms helped them make the case to a client (a health benefits administrator) that people were looking more for health-specific terms than product names. A big duh, perhaps, but sometimes it takes data to wake clients up to being more client-focused in their content.
Navigation summaries: Randall and Elizabeth showed how they traced where people go from page to page, and how they move from one page to another (or fail to because they bail out and exit the site), to formulate an idea of how content is working. Thinking about all the different kinds of information a certain kind of user might be interested in, and how they move through pages to find that content, allows them to create an “experience architecture” that allows the site to serve up the right content when it’s most relevant to the user.
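To make that tracing concrete, here’s a minimal sketch (in Python, with invented session data — not anything the speakers showed) of how page-to-page transitions can be tallied from visit paths, including where people bail out:

```python
from collections import Counter

# Hypothetical session paths (ordered page views per visit).
# "exit" marks where the visitor left the site. Real projects
# would pull these paths from an analytics export.
sessions = [
    ["home", "plans", "enroll", "exit"],
    ["home", "plans", "exit"],
    ["home", "faq", "plans", "enroll", "exit"],
]

# Count every consecutive (from_page, to_page) pair across all sessions.
transitions = Counter(
    (a, b) for path in sessions for a, b in zip(path, path[1:])
)

# e.g. transitions[("plans", "enroll")] == 2 and
# transitions[("plans", "exit")] == 1 — two-thirds of visitors
# who reached "plans" moved on to "enroll"; one-third bailed.
```

A table like this is the raw material for the “experience architecture” idea: it shows which pages actually hand users off to the content they came for, and where the hand-off fails.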
“Next page” data: Why does one link on a page get 40% more traffic than the link right next to it? Analytics let you examine this phenomenon to make some educated guesses about why one thing is working and one isn’t, and start to make some changes accordingly.
How do you document whether a user comprehends content? The speakers said they might videotape users reading and voicing confusion about content. A more systematic way to do this is with a cloze test (see the detailed description on Jakob Nielsen’s site): show a user a paragraph from the website with every fifth or sixth word blanked out. If the user can generally fill in the blanks and follow what’s being communicated, that’s good evidence the content is working; if not, the writing is probably harder to parse than it needs to be.
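The mechanics are simple enough to sketch. The helper functions below are hypothetical (not a tool anyone at the session mentioned): one blanks every Nth word, the other scores a reader’s guesses. Nielsen’s commonly cited guideline is that readers should score around 60% or higher for the text to count as comprehensible.

```python
def make_cloze(text, every=5):
    """Blank out every Nth word of a passage.

    Returns the gapped text and the list of removed words (the answer key).
    """
    words = text.split()
    answers = []
    for i in range(every - 1, len(words), every):
        answers.append(words[i])
        words[i] = "_____"
    return " ".join(words), answers


def cloze_score(answers, guesses):
    """Fraction of blanks guessed correctly, ignoring case and punctuation."""
    strip = ".,;:!?\"'"
    correct = sum(
        1 for a, g in zip(answers, guesses)
        if a.strip(strip).lower() == g.strip(strip).lower()
    )
    return correct / len(answers)
```

In practice you’d run the gapped paragraph past several representative users and compare scores across drafts of the same content.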
(Also see a summary of Christine Perfetti’s “Testing Content” presentation for more user testing ideas.)
The pair talked about the importance of getting the metadata right in content strategy — “it gives content longevity.” Getting the keyphrases and IA right is a huge part of what we do, and presumably analytics and user testing help us make sure we’re making the right choices with metadata. However, the speakers did offer a warning about becoming fixated on the search term as the core focus. For some clients, being desirable (and being featured in editorial, linked to from blogs, etc.) is much more important than making sure you have that one search term baked into the content. The content strategy will drive those decisions.
My So-Called Data
This was maybe my favorite part of the discussion: here’s stuff that we’re doing already that we can turn into data. For example:
Language analysis: We’re already talking about “better writing” and content tone/voice. But the cold, hard truth, the speakers said, is that most people don’t care. How can you make these recommendations more rigorous and scientific? They suggested using a model that Kristina Halvorson laid out in her book: a tone chart. Rather than just saying the content should be warm, show examples of what it should and shouldn’t sound like, breaking the language down into something more quantifiable. That makes the recommendations more usable, and more respected by clients who need backup for the decisions they’re making.
Social listening: Paying attention to what people are saying on social networks can change a client’s perception about everything. The speakers talked about a yogurt brand that wanted to rethink its content. All yogurts talk about “lowfat” in their marketing, but what consumers really care about (according to what they say on social media) is that the yogurt is “creamy.” This revelation caused the yogurt brand to rethink its entire product strategy: new packaging, new messaging, new content.
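Even a crude version of this kind of listening can be scripted before reaching for a dedicated tool. The posts and attribute list below are invented for illustration — real work would pull from a social platform’s API or a monitoring export:

```python
import re

# Invented sample of social posts about a yogurt brand.
posts = [
    "this yogurt is so creamy, love it",
    "creamy and not too sweet",
    "wish it were lowfat AND creamy",
]

# Hypothetical product attributes to track.
attributes = ["creamy", "lowfat", "sweet"]

# Count how many posts mention each attribute (whole-word match,
# case-insensitive; each post counts at most once per attribute).
mentions = {
    attr: sum(bool(re.search(rf"\b{attr}\b", p.lower())) for p in posts)
    for attr in attributes
}

# mentions -> {"creamy": 3, "lowfat": 1, "sweet": 1}
```

Even a toy count like this makes the yogurt lesson visible: the attribute the marketing leads with may not be the one customers actually talk about.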
On-site search engines: What are people entering into the search box on the home page once they get to the site? This could indicate what’s broken about the architecture — what are people looking for that they’re not finding?
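A first pass at this analysis is often just a frequency count over the site-search log. The query sample here is hypothetical (loosely echoing the health-benefits example earlier), but the shape of the exercise is real:

```python
from collections import Counter

# Hypothetical on-site search queries pulled from an analytics export.
queries = [
    "deductible", "find a doctor", "deductible", "claim form",
    "find a doctor", "deductible", "prior authorization",
]

# Normalize and rank: the most-searched terms are the things people
# couldn't find through the navigation.
top_terms = Counter(q.lower().strip() for q in queries).most_common(3)

for term, count in top_terms:
    print(f"{count:3d}  {term}")
```

If “deductible” tops the list but lives four clicks deep in the IA, that’s a data-backed argument for restructuring, not just an editorial hunch.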
Offline insights: “Call centers are your secret content strategy friend,” the speakers said. If you can figure out what people ask about most often when they call, it can inform the content you need to create online to support customers.
The speakers also cautioned that many data points everybody tends to cite may not really be good measures of content success. One example was length of time on a page or on the site — there’s no guarantee that a user is spending more time on content because they’re reading more of it; they could just be confused. Rather than taking data at face value, you have to put everything in context.