Not long ago, a client asked us the following questions:
How much content should we have on a page? What is an average page length in words?
And somewhat predictably, I answered, “It depends.” But I went one step further. I let them know why and how it depends.
While I’m not a content strategist in the sense that my fellow Seaquam high school alumnus Jeffrey MacIntyre is, or in the same way that local Vancouver CMS-meets-UX-meets-content strategy guru Rahel Anne Bailie is, I do provide our clients with a lot of web strategy, and that involves a lot of content. And I’ve been thinking about this a fair bit recently, having worked on large public website and intranet projects for the past 15 years.
Here’s my proposed content strategy in a nutshell. It stems from two simple ideas:
Understand the context of use of your content and then be channel appropriate.
Context of use? Channel appropriate? What does that mean?
First off, the concept of bounded applicability is in full force when it comes to the type of large bureaucratic corporate website content that we frequently face.
“Bounded applicability simply states that any method or tool has limits. You know you are reaching those limits as the cost/benefit ratio of handling new issues becomes adverse. At this point you should not carry on doing the established approach more furiously, but instead realise that you are approaching a boundary and gain perspective so you can look on the other side.” – Source: Dave Snowden.
The way you design your content, and what a page of content is good for, have their limits. If you find yourself questioning the usefulness of a page, its content, and its overall design, or if users tell you they still can’t find what they’re looking for, you’re probably reaching the limits of your content.
Remember that; we’ll come back to Dave in a moment.
Secondly, users are engaged in productive inquiry when they visit a corporate website. Productive inquiry is an activity where they are deliberately seeking what they need in order to do what they want to do. Said another way, it’s not inquiry in the form of general curiosity, but inquiry in the service of wanting to get things done.
What types of things are users trying to get done when they visit a corporate website? For our client, we did user research (interviews, observations, surveys, search term analytics) to find out the answer to that question. We identified over 2000 primary tasks that users described in their own terms.
These included very real-world tasks: buying things, researching things, learning things, obtaining things. Some were directly satisfied by the organization and its services; some were not. Together they painted a brilliant picture of users’ mental models, motivations, and needs.
In the midst of this information-foraging behaviour, where users come to the website to get one of these tasks done and try to make sense of the 1300+ pages our client has on the subject, we run up against the limits of human cognition.
Simply put, users don’t read, they scan.
Enter Jakob Nielsen and his usability studies of how much, or rather how little, users actually read on any given web page.
How little do users read?
Answer: On the average Web page, users have time to read at most 28% of the words during an average visit; 20% is more likely.
So what about long articles vs short articles?
Answer: Information foraging shows how to calculate your content strategy’s costs and benefits. A mixed diet that combines brief overviews and comprehensive coverage is often best.
So, as a result of the three big concepts of 1) bounded applicability, 2) productive inquiry, and 3) scanning behaviour, our client’s content strategy should be this:
- Determine which problem domain the target content resides in (simple, complicated, or complex?)
- If simple: create a web page, no more than 300 words – example: find corporate contact information
- If complicated: create a web page, no more than 900 words – example: find where and how to access a service
- If complex: delegate to the call centre – example: what happens to me and my super specific scenario
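The routing rules above can be sketched as a tiny function. This is a hypothetical illustration only; the domain labels and word limits are the ones proposed in this article, not any standard vocabulary or API.

```python
# Hypothetical sketch of the routing rules proposed above.
# Word limits per problem domain come from this article's recommendations.
WORD_LIMITS = {
    "simple": 300,       # e.g. find corporate contact information
    "complicated": 900,  # e.g. find where and how to access a service
}

def content_strategy(domain: str) -> str:
    """Return the recommended treatment for a piece of target content."""
    if domain == "complex":
        # No static page can cover every scenario: route to a human.
        return "delegate to the call centre"
    limit = WORD_LIMITS.get(domain)
    if limit is None:
        raise ValueError(f"unknown problem domain: {domain!r}")
    return f"create a web page, no more than {limit} words"

print(content_strategy("simple"))   # create a web page, no more than 300 words
print(content_strategy("complex"))  # delegate to the call centre
```

The point of the sketch is simply that “complex” never resolves to a page at all; it falls through to a person.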
The three domains are from the Cynefin Framework, a knowledge management / sensemaking framework that describes the types of problem domains in which humans interact. It’s more of Dave Snowden’s work.
Read the Wikipedia article on Cynefin for a primer and watch Shawn Callahan’s video on an introduction to the framework.
Okay? All briefed on Cynefin. Good. There’s lots of brilliant stuff in there. I encourage you to explore further or better yet, become Cognitive Edge certified like me and lots of other really interesting people around the world.
Simple is something where there’s a best practice. Where there’s a right answer, quite possibly only one answer. Where the user is engaged in sensing, categorizing, and responding. Where cause and effect is obvious. Deductive logic works here: theory, hypothesis, observation, confirmation. Short content is recommended and appropriate.
Complicated is something that may require more analysis, where there are a few right answers. It’s the domain of good practice. The user is involved in sensing, analyzing, and responding. Inductive logic works here: observe, pattern, tentative hypothesis, theory. Medium-length content is recommended and appropriate.
Complex is where cause and effect are unknown, it all “depends” – the domain of highly specific and personal, multiple variables. The user is having to probe, sense, and respond to figure out what’s going on. They’re heavily into “sensemaking” and doing their best to “figure things out.” This is the domain of emergent practice. They’re in the realm of abductive logic: half theory, half observation, mostly “hunch” or “heuristic” – again, this domain is not appropriate for your typical top-down, expert written content. You can’t cover all of the scenarios that come up. You can’t deal with all of the variables. You can’t author content that predicts all of the unknowns in this domain.
No web content other than “CALL US” should be on this page. Talk or interact directly with a human being or many human beings. We are sensemaking machines. Static web pages are not.
Having said that, there is one argument for web content to manage the Complex domain.
If our client company had a social forum area where people could discuss their problems, tell stories, and read each other’s anecdotes, then you’d have a proper body of content for the complex domain. But in our client’s case, they don’t. They have their official materials and official policy. So any social interactions need to happen between the customer and a corporate employee on the phone or in person. Answers in complex situations are highly contextual. A context strategy (not a content strategy) should be social: i.e. talk to someone. Or some people.
The endless hype around user generated content and social media seems to miss the fact that it’s sometimes totally inappropriate and that lots of problems can be answered by corporate, few-to-many, expert-written content. For our client, a large public sector bureaucracy, it was inappropriate to suggest a social strategy for their complex content simply due to the nature of their business. Privacy and legal issues were too great to allow for customers to share their experiences and benefit from a social experience. But if privacy and legal issues had not been present, this would be a big win for our client. Social forums and human dialogue (one-to-one or many-to-many) are where people make sense of complex subjects.
So, in summary, here’s my take on an integrated approach to content strategy.
- Step 1: Understand what task the user is trying to accomplish. What is the user trying to do when they come to the website?
- Step 2: Understand if the answer to that task is simple, complicated, or complex
- Step 3: Write and design content appropriate for that level [remember: users scan, they don’t read] (simple = less, complicated = a bit more, complex = call someone)
- Step 4: Publish the content, measure the results using Google Analytics and inquiries to your call centres, and determine whether it’s working through customer satisfaction surveys
- Step 5: Iterate and revise content, tuning where required
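Step 4’s measurement can start as simply as auditing published pages against the word limits proposed earlier. A minimal sketch, assuming pages are available as tagged plain text; the page data, URLs, and domain tags here are invented for illustration:

```python
# Flag pages whose word count exceeds the limit for their problem domain.
# The page records below are hypothetical examples, not real client data.
LIMITS = {"simple": 300, "complicated": 900}

pages = [
    {"url": "/contact", "domain": "simple", "text": "Call us at 555-0100 " * 10},
    {"url": "/apply", "domain": "complicated", "text": "To apply for the service " * 250},
]

def over_limit(page: dict) -> bool:
    limit = LIMITS.get(page["domain"])
    if limit is None:
        # Complex content shouldn't be a static page at all, so flag it.
        return True
    return len(page["text"].split()) > limit

flagged = [p["url"] for p in pages if over_limit(p)]
print(flagged)  # ['/apply']
```

A report like this won’t tell you whether the content is working (that’s what the analytics, call-centre data, and satisfaction surveys are for), but it catches pages drifting past their intended length between iterations.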
Given the word count of this page, perhaps next time I should take some of my own advice…