Assessing product documentation

Documentation is a vital part of many products and services, especially software services. It helps customers get started with the product, teaches them about its features, and guides them to use it successfully. Documentation can therefore be as important as the product itself, yet companies often don’t invest as much time and effort in making sure it is written in a way that facilitates, rather than hinders, the process of learning about the product.

 

However, when I tried to find out how best to test the documentation for the new product, I could not find much in terms of best practices. There are articles on testing content, mostly website content, but the information that features in product documentation is different. It is not only informational in nature but also instructional, and it often has dependencies on other information in the documentation. Therefore, I found that guidance on how to test general content is not completely applicable to product documentation. I had to come up with a way to do this on my own, borrowing from best practices for testing content and figuring out the rest.

First, I wanted to see whether the documentation is easy to understand and whether, on its own, it conveys important information about the product.

So I set out to find out if:


1.     Documentation makes sense out of context
If a potential customer looked at the documentation without seeing the product, would the documentation tell the right story about the product?

2.     People can find what they are looking for easily
If a potential customer was curious about a feature or how something works in the product, would they be able to find this information easily?

3.     People understand what is written
Once the potential customer finds the information, would they be able to understand what they read or is the information written in a way that is confusing?

4.     It facilitates learning
Is the documentation written in a way that helps new customers learn quickly and if not, what can be done to improve it?

I also wanted to know how effective it is in helping people use the product when they are stuck. To explore this, I devised a task-based research plan consisting of two studies: one outside the product and one inside it.
As always, it is important to recruit participants who represent the potential customers you want to attract.


Study 1 of 2

I started each session by giving participants a scenario that offered a brief overview of what the product does, without going into detail. Then I asked participants to take some time to scan the documentation, get familiar with its contents and let me know when they were ready to proceed to a task. I created a series of tasks that explored different areas and watched participants search through the documentation to accomplish them.

I structured the tasks around these topics:


1. Can information be easily found
Tasks in this category asked participants to find certain information. I followed up each task with the Single Ease Question (SEQ), asking participants to rate, on a scale of 1-7, how easy or difficult it was to find the information they were looking for.

I also wanted to know how participants perceived the information they found, and whether they thought it was easy to understand, so I asked them to rate the information they had just read on a scale of 1-7, where 1 = not easy to understand and 7 = easy to understand.
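Ratings like these can be summarized per task to spot sections of the documentation that need attention. Here is a minimal sketch with invented data; the task names, scores, and the review threshold of 5 are all hypothetical:

```python
from statistics import mean

# Hypothetical SEQ ratings (1 = very difficult, 7 = very easy),
# one list of participant ratings per findability task.
seq_ratings = {
    "find pricing limits": [6, 7, 5, 6],
    "find export steps": [3, 4, 2, 4],
    "find API key setup": [5, 6, 6, 7],
}

def summarize_seq(ratings_by_task, threshold=5.0):
    """Return (task, mean SEQ, needs-review flag) per task."""
    return [
        (task, mean(scores), mean(scores) < threshold)
        for task, scores in ratings_by_task.items()
    ]

for task, avg, needs_review in summarize_seq(seq_ratings):
    marker = " <- review this section" if needs_review else ""
    print(f"{task}: mean SEQ = {avg:.2f}{marker}")
```

With a larger study you would likely also look at the spread of ratings, not just the mean, since one confused participant can signal a real gap.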

2. Is the information understandable
Tasks in this category asked participants to tell me about the things they read. What people say and what they do are two different things, so I also wanted to know whether their ratings of how easy the information was to understand would match their actual understanding. So I asked them to tell me about the information they had read and noted when they misunderstood it.
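One way to cross-check these two signals is to line up each participant's self-rating with whether they actually misunderstood the content when retelling it. The records and the "high rating" cutoff below are invented for illustration:

```python
# Hypothetical per-participant records for one task:
# (self-rated understandability 1-7, misunderstood when retelling?)
records = [(7, False), (6, True), (5, False), (7, True), (4, False)]

def say_do_gap(records, high=5):
    """Count cases where a high self-rating coincided with an
    observed misunderstanding - a sign the text reads as clear
    but is actually misleading."""
    return sum(1 for rating, misread in records
               if rating >= high and misread)

print(f"{say_do_gap(records)} of {len(records)} participants rated the "
      "text easy to understand yet misread it")
```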


3. How does product documentation appear at first sight
Finally, I spent some time exploring first impressions of the look and feel of the documentation. Here I learned a lot about participants’ learning preferences and what they wanted to see included in product documentation and why, such as screenshots and videos. For example, participants shared that having screenshots and videos helps them remember vital information more easily.

However, I also know that such content is often not included because screenshots take more effort to maintain, with the product changing slightly with each update or design change. Yet it might be a vital component in providing seamless onboarding for customers.


Study 2 of 2

Next, I invited the same participants to a usability session in which they got to use the product alongside the documentation. I did this because I expected that, once they started using the product, the documentation would be scrutinized more readily, and I was right. I also wanted to compare the impressions from the previous study with this one, to see how impressions change between just browsing the documentation and actually using it to accomplish tasks.

 I organized the tasks in this order:


1.     I told participants to do whatever they were curious about first
I did this because it is the closest to a real-life scenario that a usability session can get.
In other research studies, I had heard participants say that they learn by diving into the product and playing with it, slicing and dicing data, only searching for help if needed. So I knew this is probably how the majority of new customers would start using the product.

I watched participants go into the documentation when and if they got stuck, and asked them to think aloud as they completed the task so I could understand what they were thinking.


2.     Next I followed up with 3-4 tasks that varied in difficulty
I started with easy tasks and moved to more complex ones. For all of the tasks, including the previous one, I asked participants how easy or difficult it was to complete them and noted whether they referred to the documentation.

This is where you need to think carefully about the tasks. If they are too easy, participants may accomplish them without having to seek guidance from the product documentation, so make sure you also include harder tasks that will send them looking for guidance.


3.     Finally, I created a task which was vague and ambiguous on purpose
I wanted to see whether participants could find the right information and conclude confidently that something can or cannot be done in the product. Often people have an idea of what they want to do but, as new customers, don’t always know whether the product can do it. So they start from a slightly ambiguous position of trying to figure out what is possible.


As expected, these sessions yielded lower SEQ scores than the documentation-only tasks, and I started to see additional behavior, such as the use of the search function, which had not been used when the documentation was given to participants outside of the product. Participants were a lot less forgiving and patient if they could not locate the right information quickly or if the search yielded results that were not applicable to the task. Not all tasks were completed successfully. Participants also spotted gaps where additional information should be given or where things appeared ambiguous or unclear.

Benchmarking and tracking progress

This research allowed me to assess how our product documentation appears to potential customers who are just browsing and want to learn more about the product (first impressions), and how well it helps them use the product successfully. Scores (task completion and SEQ) can be used for benchmarking, to track progress as you make the needed changes and test again.
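As a sketch of what such benchmarking might look like between two rounds of testing, here is a minimal example; the rounds, per-task results, and numbers are all invented for illustration:

```python
# Hypothetical benchmark data from two rounds of testing:
# one (completed?, SEQ score) tuple per participant-task attempt.
round_1 = [(True, 6), (False, 2), (True, 5), (False, 3), (True, 4)]
round_2 = [(True, 6), (True, 4), (True, 6), (False, 3), (True, 5)]

def benchmark(results):
    """Summarize a round as (task completion rate, mean SEQ)."""
    completion_rate = sum(done for done, _ in results) / len(results)
    mean_seq = sum(score for _, score in results) / len(results)
    return completion_rate, mean_seq

for label, results in [("before changes", round_1),
                       ("after changes", round_2)]:
    rate, seq = benchmark(results)
    print(f"{label}: {rate:.0%} tasks completed, mean SEQ {seq:.1f}")
```

Comparing the same tasks across rounds keeps the benchmark fair; if you change the tasks between rounds, the scores are no longer directly comparable.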