July 2019


In “The Business Value of Design,” a report recently published by McKinsey & Company, four themes emerge as key for business and design leaders who are committed to maximizing the business benefits of design in their organizations:

  • Cross-functional talent
  • Continuous iteration
  • User Experience
  • Analytical leadership

The report leaves us with some key takeaways and highlights two big opportunities for design leaders and practitioners:

  • 40% of the organizations in the study do not talk to the customer
  • 56% of these organizations do not measure the impact of design

Across many different engagements, I have interviewed designers who struggle: some have no data to work with, some are drowning in data, and some hesitate to bring data questions to their teams. On the other hand, some designers are actively exploring ways to measure the impact of design, whether the focus is on our work as a UX team or on the work we do as part of a product team and the organization. As designers, on our journey toward an analytical leadership mindset, we encounter myths and make mistakes that we have to overcome.

Common myths about data-informed design

Here are some of the most common myths designers encounter when working with data:

Data means numbers

Rolling up or consolidating the behavior of hundreds, thousands or millions of people into a single number is not always useful or reliable. Much of the time we are dealing with qualitative insights, and teams struggle to accept non-numeric information. Product teams, business leaders and even designers often do not consider qualitative data measurable, so we ignore it.

As designers, we need to work with our teams to articulate the need to supplement the hard data we collect, which tells us what happened, with the soft data that answers why it happened: what was the customer thinking, and what motivated their behavior? We need to understand the motivations and needs behind our users' actions.

Data is the objective truth

Because quantitative data is compiled by software rather than by humans, it can seem like hard fact. However, the algorithms that collect data sets are created by humans, and humans interpret the results and assign them meaning, carrying unconscious bias into that meaning. Big or small, no data is perfect. There are limitations and biases present in every type of data analysis. Design practitioners and leaders have the responsibility to minimize that bias as much as possible, or at least to identify it, describe it and provide context.

Bigger is always better

Big Data does not, on its own, have the power to reveal or predict the behavior of the people who use our products. We need to collaborate with data scientists to understand the data and draw connections between observed behaviors and the quantitative information we receive from our customers. Data analytics will not tell us everything we need to know about user behavior. We have to create meaningful categories of metrics for our products that help us evaluate, understand and keep track of actionable outcomes.

Data is for others (Managers, Developers, or Data Scientists), not for designers

It is tempting to look for data to prove or disprove decisions we make as individuals, teams or organizations. However, we should not use data to pit designers against leadership or against their teams, and we should not treat data as the arbiter of who is right or wrong. We should leverage data to learn, make improvements and discover new possibilities together, as a team, and to help us tell the story of the people using the solutions and services we put out in the marketplace.

Data undermines our ability to innovate

In a recent interview for DesignImpact, “Establishing a bond of trust to expand Design’s influence and impact,” Sami Niemelä says that “those data-driven organizations have their eyes in the rear-view mirror.” For some, data is seen as the antithesis of innovation because it is backward-looking, tactical rather than strategic, and skims the surface. While hard metrics are important, Niemelä argues that design practitioners need to bring back soft metrics such as design quality, empathy, customer feedback, and ethics.

The problem is not in the data itself, but in how we are using it. If we want to use data effectively to help us inform design decisions, we need to embrace the complexity of both quantitative and qualitative perspectives.

The myth of the right way to use data to inform design

Designers need to understand that there is no magic bullet: there is no single process or unique approach to working with data. Teams and organizations have to find an approach that makes sense for them. A few key points to consider:

  • Use data from a variety of sources to inform your design
  • Include numbers and provide context
  • Use data to track changes over time, explore new patterns, and dig deeper into problems
  • Decide on meaningful categories and metrics that help your team tell a story about the customer experience
  • Develop a way to share and discuss data in your organization, starting by defining the basics together with your team and your peers
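As an illustration of the "include numbers and provide context" point, a team might pair every number it reports with its source and a line of qualitative context, so a metric never travels without its story. This is a minimal, hypothetical sketch; the metric names and values are invented for the example:

```python
from dataclasses import dataclass

@dataclass
class Metric:
    """A single reported metric, paired with its source and context."""
    name: str
    value: float
    source: str   # where the number came from
    context: str  # the qualitative "why" behind the number

# Hypothetical examples; the values are illustrative, not real data.
report = [
    Metric("task_success_rate", 0.82, "usability test (n=12)",
           "Most failures happened on the new checkout step."),
    Metric("weekly_active_users", 4300.0, "product analytics",
           "Growth slowed after the onboarding redesign shipped."),
]

for m in report:
    print(f"{m.name}: {m.value} ({m.source}) - {m.context}")
```

The point of the structure is that the qualitative field is required, not optional: a number cannot be added to the report without its context.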

Common mistakes designers and organizations make when using data

Here are some of the most common mistakes designers, product teams and organizations make when working with data.

Using data to drive decisions, rather than inform decisions

Data alone may force a team to throw away a good idea or experiment at the first sign of trouble. Being data-informed, instead, means taking the data we have and combining it with qualitative feedback, our own design intuition, our practices, and our experience. From that point, we experiment, learn, iterate and validate the products and services we design, continuously delivering value to our customers and to the organization.

The best teams understand which discovery tools to use and when; and, most importantly, how to leverage the data they collect. That is the ultimate measure of the value of discovery work but it’s much harder to quantify.

Jeff Gothelf — Optimizing your team’s velocity (of learning)

Mistaking a vanity metric for a meaningful metric

Vanity metrics often focus on the number of releases, the number of product features, the number of new customers, or superficial adoption rates. These metrics tell us nothing about product quality or whether we are meeting customers' experience needs. Vanity metrics measure activities that make us feel good but do not actually tell us whether we are making progress. We need to be relentless in defining metrics that truly capture the value we create for people, and in measuring the impact design has on the organization.
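A minimal sketch of the difference between a vanity metric and a meaningful one: raw signups feel good but say little, while the share of those users still active a month later is closer to the value actually delivered. The numbers below are made up for illustration:

```python
# Hypothetical event counts; a real team would pull these from analytics.
signups = 10_000             # vanity metric: feels good, says little
active_after_30_days = 1_800

# Actionable metric: of the people who signed up, how many stayed?
retention_rate = active_after_30_days / signups

print(f"Signups: {signups}")
print(f"30-day retention: {retention_rate:.0%}")  # prints "30-day retention: 18%"
```

Ten thousand signups looks like progress; an 18% retention rate tells the team whether the product is actually meeting people's needs after the first impression.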

Drowning in the data we collect

Some teams track and collect data that they never use. Designers and product managers feel they are drowning in data: overwhelmed and unprepared to make sense of it. Instead, product teams should clearly articulate which metrics are likely to be useful given the nature of the application or service and the business goals. Metrics should be based on, and aligned with, those product and organizational goals.

Exploring data before formulating our hypotheses

We have to resist the temptation to retrofit our hypotheses to match the data we collect. As a team, we have to identify the riskiest hypotheses first and then devise the experiments that will help us validate and measure them in meaningful ways. We cannot leave it to chance and decide whether or not we were successful only after our product is already in the hands of our customers. Our teams and our organization should have a clear picture of what the goals are and how we will measure whether our work and our products achieved them.
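One way to guard against retrofitting is to write down the hypothesis and its success criterion before the experiment runs, so the target cannot quietly move after the results arrive. A hedged, hypothetical sketch (the statement, metric names, and numbers are invented):

```python
# Hypothetical: the hypothesis and its success criterion are agreed *before*
# looking at results, so the data cannot be retrofitted to the outcome.
hypothesis = {
    "statement": "Simplifying the signup form will increase completion",
    "metric": "signup_completion_rate",
    "baseline": 0.40,
    "target": 0.50,  # fixed up front, not adjusted after seeing the data
}

observed = 0.46  # hypothetical result from the experiment

# The experiment improved on the baseline but missed the agreed target.
validated = observed >= hypothesis["target"]
print("validated" if validated else "not validated")  # prints "not validated"
```

Note that a result above the baseline but below the target is still a miss against the criterion the team committed to; that is exactly the discipline the paragraph above argues for.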

Embracing the opportunities to create business impact with design

We have the responsibility to elevate the UX practice in our organizations and to measure design's impact. As design leaders, we need to overcome these myths and watch out for the mistakes we can make when working with data. We need to foster a cultural environment in which our teams involve customers in the design process, review and analyze the data we have, and are prepared to act on what we learn.

If we want to expand our influence within our organizations, we have to start showcasing the impact design has on the business, not the methods and processes we use.
