Services and content that are useful, usable and used: who wouldn't want that?
The World Bank has long produced policy guidance and technical documents - most of which are published on their website for their audiences to access.
A few years back someone in their web team decided to have a look at their web analytics - and they discovered that:
nearly 30% of their PDF reports had never been downloaded or opened. Not even once.
a further 40% had been downloaded and/or opened fewer than 100 times
only 13% had been accessed more than 250 times.
For an organisation with the stated objective of informing public debate and government policy, this is a lousy track record.
Off the back of that realisation, they decided to properly invest in user-centred design and design research, to make sure the content and services they provide are useful, usable and used.
Design research is how you make sure that you achieve the impact you’re looking for.
It’s all very well you getting your subject-matter experts together in a room to produce technically precise and intellectually satisfying policy documents and guidance.
But if the intended users of your products don’t find them useful and usable, and don’t want to use them, then you have failed.
To illustrate the difference between the three, let’s take the example of a door.
A door can be more or less useful. If it separates outside from inside, makes controlling the room temperature easier and acts as a noise barrier, then it’s useful - but if it’s a saloon door, less so.
A door can be more or less usable. A door that opens automatically when people approach is more usable than a door that needs a “push” sign on it, is heavy or has a handle.
And a door can be more or less used. For doors that are permanently propped open, and doors in locations where no-one ever goes, you have to ask - what’s the point of that door?
Enough about doors - what does this mean for your content or service?
It means you can and should do design research focused specifically on making sure your content and services are useful, usable and will be used.
Useful
Increase your confidence that your content or service will help your users accomplish their task or objective
is it relevant, current, accurate and sufficient to support users in doing what they’re trying to do?
is the content easy for your users to understand, and are the key points within it clear and memorable?
Research intent:
to help you understand more deeply your users, their context, what they’re trying to do, and how - so that you can design something that closely matches their needs
to help you gauge the comprehensibility and memorability of your content, and know whether the terms you use mean the same things to your users - so you can better achieve your communication objectives
Methods:
interviews
surveys
ethnography, diary studies
semantic analysis
text mining
customer service log analysis
cloze tests (a minimal sketch follows this list)
recall-based tests
highlighter tests
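To make one of those methods concrete: a cloze test takes a passage of your content, blanks out every nth word, and asks representative users to fill in the gaps; a high proportion of correct (or sensible) answers suggests the text is comprehensible to that audience. Here is a minimal, hypothetical sketch of generating a cloze passage - the function name, gap interval and sample text are illustrative, not taken from any particular tool:

```python
# Minimal sketch of generating a cloze test passage: blank out every nth word
# and keep the answers so you can score participants' responses afterwards.
# The function name, gap interval and sample sentence are all illustrative.

def make_cloze(text: str, every_n: int = 5) -> tuple[str, list[str]]:
    words = text.split()
    answers = []
    for i in range(every_n - 1, len(words), every_n):
        answers.append(words[i])
        words[i] = "_____"
    return " ".join(words), answers

passage, answers = make_cloze(
    "Apply online before the deadline and keep your reference number safe."
)
print(passage)  # "Apply online before the _____ and keep your reference _____ safe."
print(answers)  # ["deadline", "number"]
```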
Usable
Increase your confidence that your content or service helps your users accomplish their task quickly, easily and conveniently
Can they find what they need quickly and easily? Are services and content relevant to them and their task easy enough to discover?
Can they complete their task, from start to finish, quickly and confidently - without getting stuck, lost or confused?
Is it clear to them what they have to do - even if they were attempting the task for the first time and coming to your service with no prior knowledge?
Is the service inclusive, so that everyone can use it regardless of their circumstances or abilities?
Research intent:
to help you understand who might be excluded by your content and services - so you can improve their accessibility
to help you know how you might improve the efficiency, ease and convenience of your service
Methods:
moderated or unmoderated usability testing (task or scenario-based)
search analysis
top task analysis
eye-tracking/heatmap
open and closed card sorting, Treejack
accessibility testing - automated and moderated, covering users with a range of access needs
web and service analytics
Used
Increase your confidence that your service is being (or will be) used - and used in the way you intended
does it meet a defined need that (enough of) your users report or recognise they have?
do your users believe they would use your content or service (and related attitudinal and sentiment insight, including around trust and reliability and brand)?
do your users actually choose to use your content or service when given the opportunity (real or otherwise)?
are your users interacting with your content or service in expected ways?
are your service metrics translating into outcomes (or proxy outcomes)?
Research intent:
to help you understand the likely scale of usage and track uptake/adoption, awareness and attitudes - so you can improve your service’s reach (including through marketing and brand)
to help you understand types of use and misuse (and users and misusers) of your content and service - so you can prevent misuse
to help you test your assumptions around your theory of change for your high-level policy and how you’ve implemented it through your service - so you can improve fidelity and impact
Methods:
fake door testing
A/B testing (a minimal analysis sketch follows this list)
landing page test
field research
surveys
opinion polling and focus groups
sentiment analysis
web and service analytics
complaints and user feedback
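For the quantitative end of these methods - fake door, landing page and A/B tests in particular - the underlying analysis can be as simple as comparing two conversion rates. Below is a minimal sketch, with made-up numbers, of a standard two-proportion z-test; the scenario, variant names and figures are hypothetical, not drawn from any real service:

```python
# Minimal sketch: comparing two variants from a fake-door, landing-page or A/B
# test with a two-proportion z-test. The counts are invented for illustration;
# in practice they would come from your web or service analytics.
from math import erf, sqrt

def two_proportion_z_test(clicks_a: int, visits_a: int,
                          clicks_b: int, visits_b: int) -> tuple[float, float]:
    """Return (z statistic, two-sided p-value) for conversion rates A vs B."""
    p_a, p_b = clicks_a / visits_a, clicks_b / visits_b
    p_pool = (clicks_a + clicks_b) / (visits_a + visits_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / visits_a + 1 / visits_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical numbers: variant A is the existing journey, variant B adds a
# prominent "Download the guidance" call to action.
z, p = two_proportion_z_test(clicks_a=48, visits_a=1000,
                             clicks_b=71, visits_b=1000)
print(f"z = {z:.2f}, p = {p:.3f}")  # a small p suggests a real difference in uptake
```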
Learn from the mistakes of the many organisations that have come before you.
If you’re publishing for publishing’s sake and don’t really care about impact, then whatever.
But if you actually have a policy outcome that you’re trying to achieve, you need to remember you’re not aiming for a technically precise and intellectually satisfying document to publish in some dark corner of the internet.
There are design research methods you should use to make sure your content and services are useful, usable and used, by the users you’re targeting, to achieve the outcomes you’re seeking.
Why wouldn’t you want to do that?