
Ensuring quality design: A staring contest with the Eye of Sauron

source link: https://uxdesign.cc/ensuring-quality-design-a-staring-contest-with-the-eye-of-sauron-d4784c8f552a


A drawing of a woman holding a burnt object, face covered in soot. Beside her is a box for a Sabre printer.
My new Sabre printer, shipped straight from Scranton, PA

Quality is everybody’s problem.

If the marketing messaging does not align with the actual delivered product, the client experiences poor brand quality. If day-to-day work is not improved by a recently procured product, the client experiences poor value. If the design and development work is not held accountable to the overall quality of the user experience, the client receives a poor-quality product. More importantly, at each step of the process, the burden of helping our users cope with these quality problems trickles down to the support team, driving up operational costs and frustration on all sides.

Your client doesn’t care about the organizational excuses that result in a poor experience. They don’t care about breakdowns in communication, broken processes, or a lack of testing. They only care about getting what they are paying for. Even more so, they are going to be critical about what they will continue to pay for in a subscription- and service-based economy.

Over the past year and a half I’ve worked to help create a way to understand and advocate for design quality at my organization, IBM Security, and, y’all, it’s been quite the journey. Now, after months of learning, failing, growing, and revisiting, I want to share what it was like getting this work off the ground so you can build a pathway to creating design excellence at an engineering-focused organization. This involves creating a program, causing rabble with design quality assessments, and finally embedding into the existing life cycle.

It’s a long one, y’all.

The TL;DR is here:

  • Transformational processes are chaotic and painful, but worth it.
  • Creating room for accountability in an engineering-led organization is about creating space for your design quality assessment to be prioritized in the development cycle.
  • As a cross-functional leadership team, work together to understand what quality means across the customer experience (CX) for your organization.
  • Use your expected user experience as a charter for all of the cross-functional work.
  • Learn the language of how your development team handles bugs and use this language for customer experience quality defects.

For those who want the full story, read below.

This path was riddled with luck, peril, a single-minded purpose, and the warmth of camaraderie. So what better way to tell this than through a Lord of the Rings metaphor?


Leaving the Shire

A woman waving goodbye as she walks into the unknown

I’ve been a part of the design team for five years as a design researcher. A key part of this role is building partnerships, often with product management and design, two of the key stakeholders for the work we complete.

However, for this quality work to be successful, I would need to leave behind the world I knew and foray into the larger business context within which design resides. The folks I would engage with were another group altogether, most of whom had never heard of me or my work, and to whom I’d need to prove my purpose. I worked with the blessing and aid of my design executive, Haidy Perez-Francis, who enabled me to work with this larger business audience.

The program we were building would be called the Consumability Program. Consumability, funnily enough, is an IBM-created term. Its core goal was to drive quality improvements throughout the lifecycle of the product, from the moment a user starts looking for a solution to the moment they are ready to move on from our products. Each part of this customer experience would need to be enhanced.

The program itself was broken into three parts.

  • The first part was the establishment of a set of standards and guidance. After all, how would teams know if they were delivering quality experiences if we did not come to a shared definition of what quality meant?
  • The second part was to hold teams accountable for delivering this quality by performing a formalized review. How can we check to see the value of the product prior to release?
  • Finally, the last part was to report on the improvements to understand how we are constantly working towards improving the experience. How were teams prioritizing identified issues?

One does not simply walk into Mordor

The first piece of this work involved connecting with stakeholders representing all parts of the customer journey to identify our standards of excellence. This served as an initial blueprint of the destination, one unique to our business unit and its limitations.

Why did you create your own standards when so many already exist?

That’s a great point. There are so many design standards that exist, from aesthetics, to customer engagement, to usability. Additionally, there are so many icons to look up to for high-quality design and experience. However, the focus of this exercise was two-fold. First, it was important to understand the current state of affairs — what is the as-is experience as we know it? Second, I wanted to use this to understand where we wanted to go — what was the to-be experience we wanted to create?

Fundamentally, the work we were about to embark on would be representative of who we were as an organization. Not only that, I wanted it to account for the existing pain points in the process that we already knew about. This was critical. I did this by hosting 11 (!!) design thinking workshops with leaders across my organization, who brought in subject matter experts from the business, to help outline pain points, identify the goal state, and figure out which pillars of our standards would support us in this vision.

A woman catching several falling workshop screenshots

And with that, I set off from the known comfort of the Shire to other parts of the business, taking our standards with me to figure out how to get teams to actually use them.

a visual page break

And my axe

It would be remiss not to include the stars that aligned to enable this work. Luck, fortune, or fate: whatever you want to call it, if certain elements weren’t in place, we wouldn’t have been as impactful as we were. This great luck came in many forms.

First, the elements that gave us the raw material to piece together a narrative for accountability:

  1. The design program at IBM overall was focused on assessing quality, from our VP of Design down into the Cloud & Data space. We previously had external assessments done for us, but it was hard to understand how to act on them.
  2. Our new VP for our business unit wanted us to get back to the basics of doing work well. This involved a keen look at how our overall offerings were performing from a quality standpoint.
  3. Our Design Executive had been working with the leadership team to emphasize a holistic product-experience vantage point from which to judge our offerings, rather than through independent silos.

So the foundations for opportunity were available. My role was to connect these opportunities and weave them into a way for us to hold ourselves accountable to a better product experience.

The second stroke of fortune was all the people who worked to push this program forward along the way.

This was not a hero’s journey, nor an independent venture. This work wouldn’t be possible without the cross-functional, cross-disciplinary collaboration needed to ensure that the demand for quality came from voices across the organization. I am eternally grateful to my colleagues, from marketing to development to design to support, who were open to helping and to being honest and vulnerable during the formation of this program.

a visual page break

You cannot pass

What is friction?
And why do we fear it?
And can we always avoid it?

To ensure we would hold ourselves accountable to design quality, we actually needed to create more friction in the process of creation. Friction is not avoidable, only deferrable, and when we lean away from it early on, it compounds in the final user experience.

The work that must be done, then, is to figure out what will connect a team together. What could a team hold cross-functional accountability for? After all, we are not all judged on the same performance indicators. Our crafts, from performance marketing to level 2 tech support, are clearly very different.

A ring with a test case engraved on it.
One test case to rule them all (or more, depending on the scope of your release)

What I wanted to do was align teams around the user experience. We did this in the form of a test case. For all intents and purposes, this test case would be the one ring to rule them all: the test case that would ultimately determine whether teams were delivering on the promises of their product, and the single source of cross-functional accountability.

Test cases needed to include the following (a minimal sketch of this structure follows the list):

  • Persona: Identify the people who are involved in doing work in this particular test case.
  • Trigger: Describe the reason why a user would start the test case.
  • Flow: Using the steps a person would take within your product, outline the happy path, assuming all goes according to plan.
  • Success: Define what success looks like, so you know when the test case is complete and the user’s job is done!
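
For concreteness, here is a minimal sketch of that template as a structured record, written in TypeScript. The fields mirror the list above; the interface and the example content (a hypothetical security-analyst flow) are my own illustration, not an actual IBM Security test case.

```typescript
// A minimal sketch of the test case template described above.
// The field names mirror the list; the example content is hypothetical.
interface TestCase {
  persona: string;  // who is doing the work in this test case
  trigger: string;  // why the user starts this flow
  flow: string[];   // the happy-path steps, in the user's terms
  success: string;  // how we know the user's job is done
}

const investigateAlert: TestCase = {
  persona: "Security analyst triaging incoming alerts",
  trigger: "A high-priority alert appears in the queue",
  flow: [
    "Open the alert from the queue",
    "Review the attached evidence and timeline",
    "Escalate to an incident, or close as benign",
  ],
  success: "The alert is dispositioned and removed from the queue",
};
```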

Pro-tip!

The test case wasn’t just about its contents. It also served as a diagnostic for the teams. How were teams writing them? Were they written based on the click-through path, or based on the steps a user was trying to accomplish? What does this reveal about user-centricity as a mentality on a team?

It was important to emphasize the test cases as a charter for teams to align around.

  1. Across your team, align around your test cases.
  2. Use it in your planning for a release and track against the test case.
  3. Constantly validate this test case through your design and development cycle using user testing methodologies.
  4. Identify and fix the high-severity bugs related to the test case.
  5. Run another evaluation.
  6. Create a report on how your test cases are scoring relative to the experience.
  7. Share this at each release.
  8. Repeat.

As design researchers, we hunt for friction and find opportunity in it. We come to this work with really useful tools for examining the user experience. To drive this formal review process, I wanted to lean on existing usability testing methodologies to conduct the sessions. After writing their test cases and providing the other necessary prerequisites, like a demo environment, teams were asked to come in and observe how a user or proxy user would complete these test cases in a moderated study.

A woman and an elf sitting at a table while people look in from the shadows. The elf has a confused look while typing.
A woman and an elf sitting at a table while people look in from the shadows. The elf has a confused look while typing.
Elves need a little reassurance — there’s no wrong way a participant can go in a usability test!

As the DM (err, I mean facilitator) of these sessions, I relied on the test case as a way to guide the user when they got stuck. While the user was completing the test case, the product team leads came as observers, alongside a panel of design and experience professionals who wrote down issues and recommendations to play back to the team.

Each issue was scored on how severely it impacted the user’s experience, and these scores were ultimately rolled up and presented back into the reporting cycle, reaching all the way up to the executive level.
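
I can’t reproduce the actual rubric here, but the rollup logic is easy to sketch: penalize each observed issue according to its severity and aggregate per test case. Everything below, from the penalty weights to the names, is an illustrative assumption rather than the program’s real formula. (The 4-point severity scale itself is described later in this piece.)

```typescript
// Illustrative only: one plausible way to roll issue severities up
// into a per-test-case score. The penalty weights are assumptions,
// not the program's actual rubric.
type Severity = 1 | 2 | 3 | 4; // 1 = blocking; 4 = no real issue

interface Issue {
  description: string;
  severity: Severity;
}

// Heavier penalties for more severe issues; a clean run scores 100.
const PENALTY: Record<Severity, number> = { 1: 40, 2: 20, 3: 5, 4: 0 };

function scoreTestCase(issues: Issue[]): number {
  const totalPenalty = issues.reduce(
    (sum, issue) => sum + PENALTY[issue.severity],
    0
  );
  return Math.max(0, 100 - totalPenalty);
}

// e.g. one blocking issue and one cosmetic issue => 100 - 45 = 55
const releaseScore = scoreTestCase([
  { description: "Save button fails silently", severity: 1 },
  { description: "Inconsistent icon set on dashboard", severity: 3 },
]);
```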

What happened next? Well, we pissed a lot of people off.

a visual page break

At the turn of the tide

I don’t want to lean away from the difficulties in this story of how all of this came together. It was painful and hard, especially seeing how this work impacted the teams when they received a negative report.

To make real change, we also needed a shared language for reporting any bugs or defects we found. For us, that meant communicating the severity of each issue the same way our development teams did. We took their 4-point severity scale and mapped against it, adding our own slice describing how each severity impacts the user experience.

  • Severity 1: Poor user experience. The user is likely to be blocked completely from doing what they need to get done; there are significant errors within the experience that they cannot remediate on their own.
  • Severity 2: Burdensome user experience. The user encounters significant issues that impede their experience; while they may be able to complete the work they set out to do, there are large usability concerns that need to be fixed.
  • Severity 3: Usable user experience. The user can complete the work they set out to do; some visual and UI inconsistencies need to be improved in the next release to create a better, cohesive experience.
  • Severity 4: Delightful user experience. The user completes their work with little to no issue and little visual inconsistency, and may even have encountered new functionality in a way that was unexpectedly pleasant.
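
If you wanted to encode this shared language directly in your defect tooling, so that a UX defect travels through the same tracker and triage process as an engineering bug, it might look something like the sketch below. The shapes and the example defect are assumptions for illustration, not our actual system.

```typescript
// Illustrative sketch: encoding the shared severity language so UX
// defects can live alongside engineering bugs in the same tracker.
const SEVERITY_SCALE = {
  1: "Poor: user is blocked and cannot recover on their own",
  2: "Burdensome: user can finish, but large usability issues impede them",
  3: "Usable: work gets done; visual/UI inconsistencies to fix next release",
  4: "Delightful: little to no friction, even unexpectedly pleasant moments",
} as const;

// A hypothetical UX defect, filed in the same language the
// development team already uses for its bugs.
const exampleDefect = {
  title: "Alert queue loses applied filters after escalation",
  severity: 2,
  severityLabel: SEVERITY_SCALE[2],
  testCase: "investigate-an-alert", // ties the defect back to the charter
};
```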

To be honest, this is the thick of where the program is now. We are working to better understand and build into the existing engineering quality assurance cycle. Just as you unit test or performance test, so too should you test your end-to-end experience.

However, a great and fundamental lesson from this process is understanding the other parts of the organizational machine. As a designer, you’re always taught to hold empathy for your users, but it’s also important to hold empathy for each other. Design doesn’t own the user experience. Everyone does. And accountability for the user experience is shared.

By using the same technology, terms, and places where developers already are, it’s easier to prioritize the work to be done. And if you couple this with furious rallying around the test case as a cross-functional team, you will have an artifact that unites you in planning, building, improving, speaking about, and supporting your customer experience.

a visual page break

GO WHERE YOU MUST GO, AND HOPE

There is pain in transformation. It is easier to avoid it, to keep the status quo, and to lean into the ease of comfortable practice. By building a program that demands and enforces accountability for the quality of the customer experience, we force transformation, with all its hard and difficult pieces.

There’s still a lot of work to do. Some of it involves better reporting on progress, and some involves encouraging teams to do evaluative user research more consistently, so we never stray too far from the heart of those we serve. Quality simply doesn’t exist without our users being involved in the development process. Investing in design research is investing in the assurance that our teams are meeting our users’ needs. Investing in an experience quality assurance process ensures that, to our users’ discerning eyes, we are valuable.

A dwarf looks at a package for a battle axe with hearts in his eyes.
Make battle axes dwarves crave!

Designing for quality means you are designing for accountability. IBM as a whole has been pushing for quality, and it’s been great to see how other teams have rolled this process out. Even more so, it’s great to see all the unique variations teams have created to get here. Because designing for quality, for accountability, is also designing for community.

I believe in this work a thousand-million-bajillion percent. Accountability sucks. It feels awful. At first. But let’s say you stick with it, yeah? And now you’re on day 45 of your resolution to ride your bike. Sure, maybe it was hard to get here, but for 45 whole days, you did it.

And, dang, doesn’t that breeze feel good?

