Preference testing: Sometimes preference is about performance
I came across a very informative article on colour contrast requirements and how they impact accessibility. The conversation in the comments centred on the difference between preference (what users prefer) and performance (whether it is easy to read). People may prefer certain colour contrasts, but what about performance? Does the contrast ratio actually help visibility, and how would you separate that from a preference for certain colour combinations (e.g. white text vs black text on an orange button)?
This got me thinking, because I have done many preference tests (A/B style), whether testing colour combinations on buttons or the look of certain elements, usually online. After asking participants to choose which design they prefer, I always ask them to justify their decision by asking why they chose the one they did. On many occasions, participants' preference relates to how accessible the design is for them. What do I mean by that? Frequently participants reflect that the preferred version was ‘easier to read’ or ‘clearer’.
Below is an example of the two buttons I tested in the past for a small start-up. They are clearly very different, and I worked out the contrast ratio for both for the purpose of this experiment. Both buttons came close in terms of preference. The comments are participants’ own comments, and I highlighted performance-related comments in green and preference-related ones in red. I did this to illustrate that, in preference testing, users’ decisions can often be true preference (e.g. I like that one better, I like the colour), performance (e.g. easier to read), or even both.
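For anyone who wants to work out contrast ratios for their own designs, here is a minimal sketch of the WCAG 2.x calculation (relative luminance of each colour, then the ratio between the lighter and darker of the two). The hex values are illustrative only, not the actual colours of the buttons I tested.

```python
def relative_luminance(hex_colour: str) -> float:
    """Relative luminance per WCAG 2.x, from an RGB hex string like '#ff6600'."""
    hex_colour = hex_colour.lstrip("#")
    channels = [int(hex_colour[i:i + 2], 16) / 255 for i in (0, 2, 4)]
    # Linearise each sRGB channel before applying the luminance weights.
    linear = [c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
              for c in channels]
    r, g, b = linear
    return 0.2126 * r + 0.7152 * g + 0.0722 * b


def contrast_ratio(foreground: str, background: str) -> float:
    """Contrast ratio from 1:1 to 21:1 (WCAG AA asks for at least 4.5:1 for normal text)."""
    l1 = relative_luminance(foreground)
    l2 = relative_luminance(background)
    lighter, darker = max(l1, l2), min(l1, l2)
    return (lighter + 0.05) / (darker + 0.05)


# Illustrative example: white vs black text on a hypothetical orange button.
print(round(contrast_ratio("#ffffff", "#e8710a"), 2))  # white on orange
print(round(contrast_ratio("#000000", "#e8710a"), 2))  # black on orange
```

Running a quick comparison like this alongside a preference test is a handy way to see whether the version people say is ‘easier to read’ is also the one with the higher contrast ratio.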
This shows me that sometimes preference is not only about whether you like certain colours or not. For many people, preference will depend on functionality and accessibility. If you are just looking to get opinions on whether users like the colours, then you should be clear about what you’re asking. Asking them what they prefer may confound the results if you only want to find out about colour choices. This is why I always like to follow up with a “tell me why” or “tell me what made you say that”. Some users will prefer something without knowing why, even when asked. In face-to-face usability testing, they may be encouraged to concentrate and elaborate on certain elements, but in online preference testing, this is lost. But the majority of users will have an idea of why they prefer something, and often their preference will also come down to how well the design performs.