Qualitative Analysis: Are you reaching your goals?

Customers often have goals and KPIs when we start new projects. But once the solution has been deployed, customers often forget to verify that these goals have been achieved. Statistics may give some answers, but cannot explain how employees actually use the solution. Qualitative analysis can provide insights that help you take your solution in new and productive directions.

Image: extracts from a conversation guide used in interviews. Qualitative analysis usually requires only basic tools.

When we plan SharePoint intranets and Office 365 collaborative solutions for customers, they usually have a clear set of goals. These are often variations of the following:

  • better collaboration
  • more user engagement
  • easier content retrieval
  • better integration with existing systems
  • greater mobility

These goals are often used for internal justification and budgeting. Sometimes the goals are quantified with a list of KPIs, such as how many documents people share, the amount of time spent searching for things, and the percentage of users who actually contribute in collaboration areas.
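
To make this concrete, here is a minimal sketch of how KPIs like these could be computed from an exported usage report. The field names and numbers are made up for illustration and do not refer to any specific product or API.

    # Hypothetical usage data: one record per user from an exported report.
    usage_log = [
        {"user": "anna",  "documents_shared": 4, "minutes_searching": 12, "contributed": True},
        {"user": "bjorn", "documents_shared": 0, "minutes_searching": 35, "contributed": False},
        {"user": "carla", "documents_shared": 2, "minutes_searching": 8,  "contributed": True},
    ]

    # KPI 1: total number of documents shared.
    total_shared = sum(row["documents_shared"] for row in usage_log)

    # KPI 2: average time spent searching, in minutes per user.
    avg_search_minutes = sum(row["minutes_searching"] for row in usage_log) / len(usage_log)

    # KPI 3: percentage of users who contribute in collaboration areas.
    contribution_rate = sum(row["contributed"] for row in usage_log) / len(usage_log) * 100

    print(f"Documents shared: {total_shared}")
    print(f"Average time spent searching: {avg_search_minutes:.1f} minutes")
    print(f"Users contributing in collaboration areas: {contribution_rate:.0f}%")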

Our customers often stress that these goals are extremely important to them. Therefore, when we build solutions, we always place these goals at the heart of our planning, so that the functional specification is based on users solving tasks that support the goals and thus provide real business value for the customer. We also try to build goal tracking into the governance plans whenever possible.

However, somewhere during the project, customers tend to lose interest in the goals and focus more on deployment dates and features. Then, after the solution has been deployed and taken into use by employees, they forget about the goals completely. Usually they are just happy that the solution is "out there". We observe a "project fatigue" in organizations that shifts internal discussions toward budgets and dates rather than goal realization and KPI tracking.

These customers are also usually unwilling to spend more time and money justifying a solution they have already invested in.

At the other end of the scale are customers who actively plan their goals and how to follow up on them. They often track goals through statistics and try to make changes to the solution to get better results. However, the most important factor in analyzing collaboration solutions is the user. Statistics have major shortcomings: they do not tell you how users actually experience the solution, nor what they would like to change and how.

One way to deal with this is to perform a post-launch qualitative analysis.

What is "qualitative analysis"?

When we think of statistics and analysis, we usually think of large amounts of data that we crunch in order to figure out trends and patterns. This is known as "quantitative" analysis. Data like this is normally easy to classify and compare.

However, sometimes we need to study things that are not easy to quantify. This means performing research on data that cannot be measured through numbers and values, and trying to find explanations beyond what we can get from statistics and measurable observations. This is, in brief, "qualitative" analysis. When we talk about qualitative analysis of collaboration solutions, it often means getting people's opinions and trying to figure out what those opinions tell us about the solution.

Here are typical activities we often perform:

  • user interviews: how do users experience the solution?
  • workshops / group interviews: solve tasks in groups and find improvement areas
  • user testing: evaluate how well parts of the solution actually work
  • task evaluation: check whether the selected user tasks are actually relevant, and whether they can be solved comfortably by users

Usually our analysis work consists of one or two of these at a time. Doing all four can be a massive undertaking and is preferably spread out over a longer time span. However, our most successful customers run continual analysis and testing. When the results of qualitative analysis are evaluated together with quantitative data such as usage statistics, we often learn very important things about the solution and what we need to do to improve it.

Want to learn how to do a post-launch qualitative analysis? Read our blog post on how to perform post-launch user interviews.

August 8th, 2016

About the Author:

Tormod is Chief Strategy Officer at Puzzlepart. He has 20+ years' experience working with communication, collaboration, and analysis.