Usability Testing: An Integral Part of Corporate Responsive Redesign

Redesigning Your Site to Be Responsive? You Need to Conduct Usability Testing. Often.

A vital component of our customer web site redesign at AEP is iterative usability testing. Our site is large – hundreds of pages, 7 sections, tons of content. Our analytics reveal that around 95% of our customer traffic is there to do one of 25 key tasks: view or pay their electric bill, report an outage or check outage status, or do some other account-related function. The remaining 5% of traffic is spread across everything else on our hundreds of pages. So, while our priority was to make the account-related functions prominent and easy to access, we couldn’t exactly kill all the other stuff.

As our information architecture and design work commenced, it became immediately imperative to run usability studies continuously on our wireframes and design comps – to make sure our design decisions were solid and informed by customer failures and successes.

We love the successes. But we rely on the failures to improve design.

Customer failures in our early designs led to 5 separate iterations of design-test-design, using Chalkmark for the testing. We observed a significant failure with a major task: reporting a power outage. Only around 40% of participants actually clicked in the right place – a section labeled Report Outages & Problems. Another one-third opted for the Contact Us tab, which we, as an electric utility, do not want. (We want customers to report outages through our automated web form, not a contact us form.)

In this early design, too many users were clicking Contact Us, and not Report Outages & Problems.

By the final iteration, we had created a design that worked – one where customers clicked in the right place at least 85% of the time. But I shudder to think what the business impacts would have been had we NOT done usability testing.

By consolidating the outage-related functions, the most recent design attracts most of the users to the Outages & Problems area.

Comparative Testing
Another facet of our early testing was comparative usability testing, or A-B-C testing our site alongside 2 or 3 other electric utility sites. I cannot extol the value of this enough. We gathered so much information about what works – and what does not work – in terms of design, labeling, and information architecture from our peer utilities. In many cases, our peers were vastly outperforming us in the consistency of where users clicked and how quickly they did so.

From our comparative study — we saw how much faster users could pay their bill on our peer utilities’ sites.

Eight Rounds of Mobile Testing…and Counting
Taking our wide, deep site and making it responsive invariably means making design decisions that affect layout, navigation, menus, touch interactions, and all the other nuances of presenting a web site to a small viewport.
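
To make that concrete, here’s a minimal sketch of viewport-driven behavior in TypeScript – hypothetical code for illustration, not our actual implementation. The 480px breakpoint and the “nav-compact” class name are placeholders.

```typescript
// Minimal sketch (hypothetical breakpoint and class name, not AEP's code):
// swap the navigation treatment when the viewport crosses a phone-class width.
const phoneQuery = window.matchMedia("(max-width: 480px)");

function applyNavLayout(isPhone: boolean): void {
  // "nav-compact" is a placeholder class that a stylesheet would act on,
  // e.g. collapsing the full navigation bar into a compact menu.
  document.body.classList.toggle("nav-compact", isPhone);
}

applyNavLayout(phoneQuery.matches);                // set the initial layout
phoneQuery.addEventListener("change", (event) => { // re-apply on resize or rotation
  applyNavLayout(event.matches);
});
```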

Early on, we did a few rounds of in-person wireframe usability tests. At 320 pixels wide, the comps, when loaded on a test server, fit perfectly within the confines of an iPhone and certain Android phones. We experimented with various high-level elements, like the account login, search box, and navigation menu.

A very early mobile wireframe.

As our design became more refined, we began usability testing with full-color comps, created by our expert interaction designer, Nick Carron. Using Axure (wireframing software), I took Nick’s comps and created a pseudo-interactive experience. I then conducted 5 remote, recorded mobile usability tests with representative users on iPhones and Android phones. Again, with each test, we patted ourselves on the back for the successes. But lived for the failures.

While we’ve seen several issues with navigation and other design elements along the way (which we’re addressing with each subsequent round of test-design-test), I’ll point out a couple of key failures – with icons.

First, the menu navigation – aka “hamburger” nav. While this appears to be on its way to becoming a convention, it is not there yet – especially among users who consider themselves novice to moderately experienced on mobile. While I don’t have anything close to statistical evidence for this statement, anecdotally, I observed customers in their 50s and older struggling with it more than younger users did. (That’s not to say younger ones didn’t occasionally have issues, or that older ones always did.)

What I know is that on tasks that required users to access the hamburger icon, a larger-than-expected percentage of them scrolled down the page – and in many cases, all the way down to the footer – where they found an appropriate link to tap. Recent studies and articles have actually shown this ineffectiveness: Aurora Bedford, of the Nielsen Norman Group, wrote an article on icon usability. Jennifer Aldrich blogged about it in her “User Experience Rocks” blog. Exis did a quantitative study on it.

Most users scrolled down — some all the way to the footer to find nav that was under the hamburger.

Simply adding the word “Menu” to the icon significantly improved its tapworthiness – although it’s much better at drawing users to the Account-type functions than to more general content like News or Save Energy.

Adding “Menu” to the hamburger icon helped a great deal. But it’s still not perfect.
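
For illustration, here’s a rough sketch of an icon-plus-label menu button – hypothetical code, not our production markup; the onToggle callback (whatever shows or hides the navigation panel) is assumed.

```typescript
// Rough sketch (hypothetical, not AEP's production code): pair the hamburger
// glyph with a visible "Menu" label so users don't have to decode the icon alone.
function buildMenuButton(onToggle: () => void): HTMLButtonElement {
  const button = document.createElement("button");
  button.type = "button";
  button.setAttribute("aria-expanded", "false");

  const icon = document.createElement("span");
  icon.textContent = "\u2630";              // ☰ the hamburger glyph
  icon.setAttribute("aria-hidden", "true"); // decorative; the label carries the meaning

  const label = document.createElement("span");
  label.textContent = " Menu";              // the visible word that improved tap rates

  button.append(icon, label);
  button.addEventListener("click", () => {
    const expanded = button.getAttribute("aria-expanded") === "true";
    button.setAttribute("aria-expanded", String(!expanded));
    onToggle();                             // assumed callback: show/hide the nav panel
  });
  return button;
}
```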

The second icon failure was a simple [+] sign – which we used to show an expandable section on the page. Nope. People didn’t tap it. In fact, one person, who clearly saw it, commented that “plus means to add something, and that’s not what I want to do.” Duh.

Oops. Using [+] for expandable content wasn’t intuitive.

I live for those “duh” moments. The new design has a down arrow, but as of now, we still need to test its effectiveness.
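
Here’s a quick sketch of the pattern we’re moving toward – illustrative only, assuming a simple heading/panel pair rather than our actual markup.

```typescript
// Quick sketch (assumed heading/panel structure, not AEP's markup): signal an
// expandable section with a down arrow instead of the [+] that participants
// read as "add something."
function makeExpandable(heading: HTMLElement, panel: HTMLElement): void {
  const arrow = document.createElement("span");
  arrow.setAttribute("aria-hidden", "true");
  arrow.textContent = " \u25BC";            // ▼ suggests "more content below"
  heading.append(arrow);

  panel.hidden = true;                      // collapsed by default
  heading.addEventListener("click", () => {
    panel.hidden = !panel.hidden;
    arrow.textContent = panel.hidden ? " \u25BC" : " \u25B2"; // flip to ▲ when open
  });
}
```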

There have been bunches of other fascinating findings from our mobile studies, which I won’t delve into in this blog:

  • Users will scroll a long page, but will bail out as soon as they see a link that might be remotely connected with what they’re looking for (even if it’s not).
  • Users are unforgiving about page load time on mobile — although slightly less unforgiving during a usability test.
  • Users don’t read content. They quickly scan.
  • Users will tap anything that looks like it could help them – even nonclickable headers.
  • Users have really big fingers compared to the things they have to tap.
  • Users learn your site during testing, so it’s best to re-order your tasks as you work through your list of participants (see the sketch after this list).
  • Users will say something was totally perfect after they struggled for 2 minutes trying to find it. If they say: “So far, everything is easy, no problems here” or “OK, perfect” – DO NOT BELIEVE THEM or use what they say to add to your insights. Instead, rely on your observations.
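
On the task re-ordering point above: a simple rotation does the trick. Here’s a minimal sketch – the task names are placeholders, not our actual test script.

```typescript
// Minimal sketch of rotating task order per participant, so no single task
// always benefits from users having already learned the site on earlier tasks.
// Task names are placeholders, not our actual test script.
const tasks = ["Pay bill", "Report outage", "Find rate plans", "Update account"];

function taskOrderFor(participantIndex: number): string[] {
  const offset = participantIndex % tasks.length;
  return [...tasks.slice(offset), ...tasks.slice(0, offset)];
}

// Participant 0 starts with "Pay bill"; participant 1 with "Report outage"; and so on.
console.log(taskOrderFor(1)); // ["Report outage", "Find rate plans", "Update account", "Pay bill"]
```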

Next Up – More Usability
Our next studies will be another mobile study, a desktop usability test (probably the first of 3 or 4), and an eye tracking study. In total, by the time the site launches late this fall, I expect we will have done more than a dozen in-person and remote usability studies – with input collected from more than 250 representative users.

Have you gone responsive yet? What kinds of usability activities did you do? And what amazing DUH moments have you had?

^EJD

@ericdUX on Twitter
Connect on LinkedIn.

Getting Ready for Usability Testing a Responsive Site

At AEP, we’re in the midst of a gigantic redesign of our corporate site, AEP.com. We’re developing the site using responsive design, which means our one design adapts to work on any device. We’ve already conducted at least 5 rounds of user experience testing to nail down the global navigation (how it changes from full desktop to iPad to small tablet to iPhone or Android phone), contextual navigation, the handling of long content pages, and other UI elements. Now, with the site being developed in earnest, we’re preparing to bring in customers to test the full experience.

The usability testing, which will begin next week, will have users complete common tasks that explore navigation (findability and context) in a deep site; readability of content on small devices; usability of tables containing lots of data; completing a form on a mobile device; getting customer service; and general touchability and effectiveness of interaction design.

Working with our outside recruiting firm, we’ve enlisted participants who are very mobile and internet savvy and who have experience looking at information on corporate sites. As of this writing, we will be testing on two iPhones, two iPads, two 7-inch e-Readers, one Android phone, and one 10-inch Android tablet. Our testing hardware will allow us to record both the mobile device screens and the picture-in-picture of the users’ faces.

Because responsive design is meant to work on any device, we’ve invested in a bunch of mobile and desktop gadgets. Here are a few pictures of our gadget lab.

Some of the gadgets in our lab. Not pictured are the Google TV, Playstation 3, and Samsung SmartTV.

Some of the gadgets in our lab. iOS devices, Android phones, e-Readers, even a BlackBerry.

Another view of our gadgets. The Dell monitor on the right is our eyetracking equipment.

Viewing our site on a Samsung SmartTV. Also pictured are a Sony Playstation 3 and Google TV.

We keep the devices in the UX Lab, but we’re also using the room to do QA on the devices. Thus the room has taken on the moniker of QUAX Lab, which I find amusing given that many people think my name is “Eric Dux.” It also speaks to the general craziness we’re all feeling with this project.

The week after we test on the small gadgets, we’ll run another set of participants through the full site experience — using our eyetracking equipment. This will let us see exactly where people are looking as they traverse our site, providing excellent insight to our designers. And, of course, we’ll learn how they navigate and complete tasks, and to what degree of satisfaction.

Being part of a corporate responsive design project has been a significant learning experience for everyone in our group. It’s been a heck of a lot of work, and we could probably write articles on content governance, iterative design and usability testing (aka, being agile), nuances of designing for breakpoints, structuring design and development teams, quality assurance, and so on.

Our upcoming usability testing is one portion of a huge project — but an important one to validate that the site works for real people.

Wish us luck!

Usability Testing Vendor Products

Author’s Note: This blog was originally posted on March 30, 2011, on a company Facebook page. Now that I have a WordPress presence, I thought I’d post it here. Plus, there’s some good news at the end.

This blog post looks at some recent examples of usability testing of vendor products, the challenges and resistance we faced from the vendor, and the lessons we learned about why it’s valuable to conduct vendor testing while the project stakeholders observe live.
