A vs. B: Ask Your Users, Not Your Colleague

A sneak peek into effective A/B testing.

An illustration showing two hands holding a phone. The screen of the phone is divided into two parts. The left part is labeled “A” while the right part is labeled “B.”

Illustration by Michalina Bidzinska

I used to tell myself that user interviews, surveys, and usability testing were enough to get to know your target audience and reveal what they needed. Here is why I was wrong.

According to Harvard professor Gerald Zaltman, only one out of twenty people makes a conscious purchase decision online. — Source

This surprisingly small number shows that many factors prompt users to act, and those actions cannot always be explained logically. As Zaltman points out: “Emotion is what really drives the purchasing behaviors, and also, decision making in general.”

So how can you uncover what your target users want if you can’t rely entirely on the qualitative data collected during interviews?

Calling in A/B testing for help

While visiting websites or using apps, you may have come across an unusual situation where your favorite site suddenly changed its appearance and, a few days later, returned to its typical look and feel. That’s what happened on Instagram in June 2022, when some users saw a full-screen feed while others still saw the original version.

With that experiment, Meta attempted to check whether Instagram users would be more willing to scroll through posts one by one instead of quickly moving through several posts with one swipe of a finger. The idea was similar to the viewing experience on TikTok, where users watch every video in a full-screen, more immersive mode. If you weren’t “the lucky one” taking part in the test, you can read more details about it here.

However, the new experience proved strenuous and discouraging to many dissatisfied Instagrammers, who complained about the experimental design across social media. Some time later, Meta brought the previous experience back to everyone on Instagram.

Why you should give A/B testing a go

A/B tests are an effective and widely used method of checking which of two or more variants a pool of target users prefers. Companies such as Netflix, Amazon, Apple, and, as mentioned above, Meta use A/B testing to uncover the most desirable user experience without committing to every change in the longer term. Simply put, if people love the new experience, the test wins and the change stays; if users hate it, it loses and is taken down once the testing phase is over. Implementing designs this way protects your sales: the idea is to check how a given solution would affect them before rolling it out to everyone. Conversely, A/B testing can also help you verify how a single change can dramatically encourage customers to keep using your product or services. What’s more, Nielsen Norman Group notes that A/B testing is much cheaper and less risky than other research and testing methods.

Incrementally, A/B testing is very cheap. You do need to pay a designer to create the “B” design, but most of the cost lies in the software to run and analyze the test, which is a one-time expense. Thus, if you’re going to do it at all, you should run lots of A/B tests. — Nielsen Norman Group
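To make the mechanics concrete, here is a minimal sketch, in TypeScript, of how an A/B split is commonly implemented: each user is hashed deterministically into variant A or B so they keep seeing the same version for the whole experiment. The function names, the hashing choice, and the 50/50 split are illustrative assumptions on my part, not details of Meta’s or anyone else’s platform.

```typescript
// A minimal sketch of deterministic variant assignment for an A/B test.

type Variant = "A" | "B";

// FNV-1a hash: cheap, deterministic, and good enough for even bucketing.
function fnv1a(input: string): number {
  let hash = 0x811c9dc5;
  for (let i = 0; i < input.length; i++) {
    hash ^= input.charCodeAt(i);
    hash = Math.imul(hash, 0x01000193) >>> 0;
  }
  return hash;
}

// The experiment id salts the hash, so the same user can land in
// different buckets across different experiments.
function assignVariant(userId: string, experimentId: string): Variant {
  const bucket = fnv1a(`${experimentId}:${userId}`) % 100;
  return bucket < 50 ? "A" : "B"; // assumed 50/50 split
}

// The same user/experiment pair always yields the same variant.
console.log(assignVariant("user-42", "fullscreen-feed"));
```

The deterministic part matters: the metrics only make sense if the same person doesn’t bounce between variants halfway through the experiment.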

As for Meta’s experiment, the results haven’t been publicly announced, since that kind of data is usually kept confidential. However, we can assume that Instagram’s full-screen feed test didn’t achieve the desired results, as the company decided to keep the original content-viewing experience. It’s also very likely that many users stopped or limited their use of the app, because the interaction required more scrolling effort to get through unseen posts. We can also presume that overall user engagement dropped, which prompted Meta not to roll out that variant. That leads us to my next point.

1. Testing is learning

As a designer working on A/B tests on a daily basis, I’ve encountered numerous situations where an idea was believed by everyone inside the organization to be a clear winner, yet that wasn’t the experience customers wanted. What’s incredibly beneficial about experimentation, however, is that no matter what results you obtain, they supply you with new learnings about your target users that you can always use in the future. So it’s a win either way!

2. Testing is data

Running tests helps you collect more granular data across the entire user journey and lets you check the overall level of interaction with your product, whether that’s conversion rate, click rate, exit rate, or something else. Without that knowledge, you’re forced to base your work purely on assumptions, and that’s not enough when you’re in control of the experience of a large pool of returning users. Risking their trust can cost the business much more than running a test.
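As an illustration of the kind of data this produces, here is a small TypeScript sketch that turns per-variant funnel counts into the rates mentioned above. The field names and all the numbers are invented for the example, not taken from any real experiment.

```typescript
// Illustrative only: per-variant funnel counts you might export from an
// analytics tool. The numbers below are made up.
interface VariantStats {
  visitors: number;     // users who entered the flow
  clicks: number;       // users who clicked the primary call to action
  conversions: number;  // users who completed the goal (e.g. a purchase)
  exits: number;        // users who left without interacting
}

function rates(s: VariantStats) {
  return {
    clickRate: s.clicks / s.visitors,
    conversionRate: s.conversions / s.visitors,
    exitRate: s.exits / s.visitors,
  };
}

const A: VariantStats = { visitors: 10_000, clicks: 3_100, conversions: 410, exits: 5_200 };
const B: VariantStats = { visitors: 10_000, clicks: 2_800, conversions: 465, exits: 5_600 };

console.log("A", rates(A)); // conversionRate: 0.041
console.log("B", rates(B)); // conversionRate: 0.0465
```

Tracking these rates per variant, step by step along the journey, is what lets you see not just whether people converted but where each version loses them.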

In the current period of economic instability, many companies have imposed budget limitations and opted out of experimentation because it seems to generate extra costs. However, applying bold changes to the customer and user experience without checking their impact may lead to further drops in user engagement and, subsequently, in sales.

3. Collect factual data to prove that option A is better than B

A/B testing is an incredibly useful, effective, and relatively cheap method that designers and companies should apply much more frequently to learn more about their target users. At the same time, experimentation programs can help you collect factual data proving why “solution A” works better than “solution B.” Without testing, designers often have to turn to their co-workers for feedback. However, other employees will never provide you with the same answer as target users because they ARE NOT the target.
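How do you actually “prove” that A beats B rather than just eyeball the difference? One common approach, sketched below in TypeScript, is a two-proportion z-test on conversion counts. Real experimentation platforms run such checks for you; the statistics here are standard, and the sample figures are invented.

```typescript
// A minimal sketch of a two-proportion z-test on conversion counts,
// used to judge whether the gap between two variants is more than noise.
function zTest(convA: number, nA: number, convB: number, nB: number): number {
  const pA = convA / nA;
  const pB = convB / nB;
  const pPooled = (convA + convB) / (nA + nB);
  const se = Math.sqrt(pPooled * (1 - pPooled) * (1 / nA + 1 / nB));
  return (pA - pB) / se; // z-score; |z| > 1.96 is significant at the 95% level
}

// Made-up example: 4.65% vs. 4.10% conversion on 10,000 users each.
const z = zTest(465, 10_000, 410, 10_000);
console.log(z.toFixed(2), Math.abs(z) > 1.96 ? "significant" : "not significant");
```

With these made-up numbers, the apparent 0.55-point lift comes out just short of significance at the 95% level, which is exactly the kind of nuance a quick show of hands among colleagues can never give you.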

Make your design decisions more data-driven

In customer experience, data analysis and design go hand in hand. A/B testing has taught me how crucial it is for designers to get comfortable with data analysis and even to be able to collect and interpret quantitative data on their own. I frequently meet designers who tell me how much they despise working with numbers and analytics tools. However, if you want to make your work more people-driven and impactful, you have to befriend quantitative data to uncover which solutions work, which don’t, and why.

In my case, I use web analytics tools as often as I use prototyping tools. Since adding them to my browser favorites, I’ve discovered many issues I would never have uncovered without them, and I’m not sure anyone else would have caught them either. I’m still a UXer, after all, and I pay attention to different points of friction than other professionals do.

Testing can validate your design decisions

A/B testing is a form of research and a way to validate design decisions. If you think your organization doesn’t need to test, I’ll grant that it’s unquestionably more convenient and less expensive to run an internal survey and ask your colleagues for feedback.

In reality, presenting two versions side by side to your co-workers and asking which one is better isn’t the same as testing the overall user experience of your website or app with randomly selected users. You only expose colleagues to a selected chunk of information, and they are very likely to already know how to perform actions within your interface because they use it every day. Moreover, it’s crucial to understand that end users won’t have the privilege of having you by their side when they try to unsubscribe from a service, make a purchase, or struggle to find a crucial piece of information: they are on their own.

While some decisions are purely opinion-driven, such as image or icon selection, UX teams should be given the opportunity to test various ideas on an actual pool of target users. According to American Express, acquiring a new customer can cost 6–7 times more than retaining an existing one. That’s why you should always be mindful of what your customers want instead of letting them go.

