AEP Outage Alerts Have Launched to Customers!

When your power goes out, if you’re like most people, you want to know three things: Does your electric company know? When will the power be restored? And what caused the outage? Today, a project I’ve worked on (and off) for years has finally launched — and it will tell customers just that.

Back in 2007, customers first told me during user research that they would love to receive an alert by text or email when their power goes out. In 2009, we launched a small email-only pilot. Then during the bulk of 2014, we built the project in earnest — paving the way for email and text alerts for outages, billing and payments, or whatever other account-related functions we add down the road.

A sample power restoration email alert.


Today the first wave of alerts — Outage Alerts — has launched to AEP customers. Initially, customers can receive alerts at one email address and one phone number. If their power goes out, they will receive an alert. Then, as the status changes and AEP determines the estimated time of restoration (ETR), an additional alert will be sent. And when power is restored, customers will also receive an alert.

A sample updated restoration time text message.


I hope people don’t have to experience an outage. But if they do, and if they are signed up for alerts, they will receive reassurance that AEP knows their power is out, plus get updates on restoration, and the cause of the outage (if known). It’s a great step toward improving customer experience, and I’m proud to be part of the project team!

The sign-up form for outage alerts.


If you’re an AEP customer, you can sign up for alerts at AEPOhio.com.

^EJD
@ericdUX


Design vs. UX Viral Photo — I Object!

There’s a photo making the rounds in the User Experience world that’s gained thousands of favorites and retweets. You know the one — it shows a nice, unused paved path next to a dirt path presumably worn into the grass by loads of people whose real goals were to get from point A to point B along a somewhat more desirable line. It originally came from Twitter user @benkimediyorum.

And while I agree that what we’re looking at is Something vs. Something Else, and I believe I get the essence of what the photo is saying, I do not agree that we’re looking at Design vs. User Experience.

Why? Because Design done properly reflects a well-planned user experience. The User Experience reflects a design created with the input of the users (in this case, the walkers), but also with the input of the business (perhaps this is a school?), the constraints of the budget, the environment, and so on. Design and UX go hand in hand, and I don’t believe you can pit them against each other.

If Design had been part of the project, it would have started out with some conversations. First, hypothetically, with the users.

A user researcher or two could have conducted some simple (and fun) ethnographic research. All that means is they park themselves under a tree and just observe the comings and goings of students. That would have been a great first step. From there, some basic user interviews could have been conducted. “Why are you walking from here to there?” “I see you’re coming straight from that parking lot. Where are you headed? Where did you come from?” Presumably, with how worn that dirt path is, there would have been a lot of pedestrian traffic to interview. Eventually, if you talk to enough people, you start seeing patterns: What times of day get heavy usage? Are there athletes? Students? Are they carrying backpacks or not? What about gender? That’s probably more than enough information to decide not only where to create your path, but maybe even how wide to make it, or even which materials to use. (What if most of the people are on rollerblades because they play roller hockey in that empty lot from 6-8 PM? And you were planning on your path being cobblestone?)

Anyway, my point is that some easy-to-do user research would have uncovered a lot of actionable information.

Design’s role is also to understand the client’s goals, not just the end users’ goals. A roundtable discussion or 1-on-1 interviews with the business would have elucidated any constraints the business had. Suppose the budget is finite and it’s not possible to pave two paths. What if the land adjacent to the proposed path is notoriously flood prone, or is some kind of wildlife preserve? Now what?

Design — that ongoing conversation between user and client — would have come up with the best possible outcome, given the constraints of the client and the goals of the user.

As a User Experience Design practitioner, I can tell you this: The best possible outcome is NOT always the ideal outcome for all parties. That’s reality. But Design — good Design — will always attempt to create the ideal User Experience. The two go hand-in-hand.

All that said, for the sake of this Design vs. UX photo, let’s assume that the school had sufficient budget to create a paved path (or two) anywhere between the two points, and that the land was not some kind of preserve or swamp. No major constraints, and the project developers were eager to just go out and make a path.

The result? The photo you see. An expensive, non-Designed path alongside a worn dirt path representing what would have been a much better Designed path. A path that would have been created with much user input. A thoughtful, conversation-driven, researched, planned, studied, iterated, tested process that would have ultimately placed pavement where dirt now lies.

What I see in the photo is not Design vs. User Experience. If we’re to infer that the paved path is the undesirable one, it is actually the antithesis of Design. The dirt path is what should have been the design had Design and UX been involved. Using my mad Photoshop skills, I have recaptioned the photo: Development Without Design and Unfulfilled User Goals. The paved path is what happens when a project plows ahead without UX and Design. The dirt path represents unmet, undiscovered user goals and desires — the beginnings of a design unrealized.


What’s scary is how often I see this in the corporate world of web sites and applications. Where a business unit says, “We need a web site that does blah.” And developers plow ahead, paving a web site that accomplishes their own aims, and leaving the poor users to trudge through the dirt to accomplish their goals.

 

Usability Testing: An Integral Part of Corporate Responsive Redesign

Redesigning Your Site to Be Responsive? You Need to Conduct Usability Testing. Often.

A vital component to our customer web site redesign at AEP is iterative usability testing. Our site is large – hundreds of pages, 7 sections, tons of content. Our analytics reveal that around 95% of our customer traffic is there to do one of 25 key tasks: view or pay their electric bill, report an outage or check outage status, or do some other account-related function. The other 5% of traffic is there to do anything else among our hundreds of pages. So, while our priority was to make the account-related functions prominent and easy to access, we couldn’t exactly kill all the other stuff.

As our information architecture and design work commenced, it immediately became imperative to continuously run usability studies on our wireframes and design comps – to make sure our design decisions were solid and informed by customer failures and successes.

We love the successes. But we rely on the failures to improve design.

Customer failures in our early designs led to 5 separate iterations of design-test-design, using Chalkmark for the testing. We observed a significant failure with a major task: reporting a power outage. Only around 40% of participants actually clicked in the right place – a section labeled Report Outages & Problems. Another one-third opted for the Contact Us tab, which we, as an electric utility, do not want. (We want customers to report outages through our automated web form, not a contact us form.)
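
To illustrate how first-click results like these can be tallied, here’s a minimal sketch in Python. The click counts are hypothetical (chosen to roughly match the ~40% and one-third figures above), not our actual Chalkmark export:

```python
from collections import Counter

# Hypothetical first-click data for the task "Report a power outage".
# Each entry is the page area a participant clicked first.
clicks = (
    ["Report Outages & Problems"] * 8
    + ["Contact Us"] * 7
    + ["Service Requests"] * 3
    + ["Search"] * 2
)

tally = Counter(clicks)
total = len(clicks)

# Print each area's share of first clicks, most-clicked first.
for area, count in tally.most_common():
    print(f"{area}: {count}/{total} ({count / total:.0%})")
```

Even a simple tally like this makes it obvious when a competing label (here, “Contact Us”) is siphoning off clicks from the intended target.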


In this early design, too many users were clicking Contact Us, and not Report Outages & Problems.

By the final iteration, we had created a design that worked — one that customers clicked correctly at least 85% of the time. But I shudder to think what the business impacts would have been had we NOT done usability testing.


By consolidating the outage-related functions, the most recent design attracts most of the users to the Outages & Problems area.

Comparative Testing
Another facet of our early testing was comparative usability testing, or A-B-C testing our site alongside 2 or 3 other electric utility sites. I cannot extol the value of this enough. We gathered so much information about what works – and what does not work – in terms of design, labeling, and information architecture from our peer utilities. In many cases, our peers were vastly outperforming us in terms of consistency of where users clicked and how quickly they did so.


From our comparative study — we saw how much faster users could pay a bill on our peer utilities’ sites.

Eight Rounds of Mobile Testing…and Counting
Taking our wide, deep site, and making it responsive invariably means making design decisions that affect layout, navigation, menus, touch interactions, and all the other nuances of presenting a web site to a small viewport.

Early on, we did a few rounds of in-person wireframe usability tests. At 320 pixels wide, the comps, when loaded on a test server, fit perfectly within the confines of an iPhone and certain Android phones. We experimented with various high-level elements, like the account login, search box, and navigation menu.


A very early mobile wireframe.

As our design became more refined, we began usability testing with full-color comps, created by our expert interaction designer, Nick Carron. Using Axure (wireframing software), I took Nick’s comps and created a pseudo-interactive experience. I then conducted 5 remote, recorded mobile usability tests with representative users on iPhones and Android phones. Again, with each test, we patted ourselves on the back for the successes. But we lived for the failures.

While we’ve seen several issues with navigation and other design elements along the way (which we’re addressing with each subsequent round of test-design-test), I’ll point out a couple of key failures – with icons.

First, the menu navigation – aka “hamburger” nav. While this appears to be on its way to becoming a convention, it is not there yet – especially among mobile users who consider themselves to be novice to moderately experienced on mobile. While I don’t have anything close to statistical evidence for this statement, anecdotally, I observed customers in their 50s and older struggling with this more than younger users did. (That’s not to say younger ones didn’t occasionally have issues, or that older ones always did.)

What I know is that on tasks that required users to access the hamburger icon, a larger-than-expected percentage of them scrolled down the page – and in many cases, all the way down to the footer – where they found an appropriate link to tap. Recent studies and articles have actually shown this ineffectiveness: Aurora Bedford, of the Nielsen Norman Group, wrote an article on icon usability. Jennifer Aldrich blogged about it in her “User Experience Rocks” blog. Exis did a quantitative study on it.

Most users scrolled down — some all the way to the footer to find nav that was under the hamburger.

Simply adding the word “Menu” to the icon significantly improved the tapworthiness of it – although it’s much better at drawing in the Account-type functions than more general content like News or Save Energy.


Adding “Menu” to the hamburger icon helped a great deal. But it’s still not perfect.

The second icon failure was a simple [+] sign – which we used to show an expandable section on the page. Nope. People didn’t tap it. In fact, one person, who clearly saw it, commented that “plus means to add something, and that’s not what I want to do.” Duh.


Oops. Using [+] for expandable content wasn’t intuitive.

I live for those “duh” moments. The new design has a down arrow, but as of now, we still need to test its effectiveness.

There have been bunches of other fascinating findings from our mobile studies, which I won’t delve into deeply in this post:

  • Users will scroll a long page, but will bail out as soon as they see a link that might be remotely connected with what they’re looking for (even if it’s not).
  • Users are unforgiving about page load time on mobile — although slightly less unforgiving during a usability test.
  • Users don’t read content. They quickly scan.
  • Users will tap anything that looks like it could help them – even nonclickable headers.
  • Users have really big fingers compared to the things they have to tap.
  • Users learn your site during testing, so it’s best to re-order your tasks as you work through your list of participants.
  • Users will say something was totally perfect after they struggled for 2 minutes trying to find it. If they say: “So far, everything is easy, no problems here” or “OK, perfect” – DO NOT BELIEVE THEM or use what they say to add to your insights. Instead, rely on your observations.
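
On that re-ordering point: the simplest counterbalance is to rotate the task list by participant index, so no task always comes last and benefits from the learning effect. A minimal sketch, with hypothetical task names (this is my own illustration, not our exact protocol):

```python
def rotated_tasks(tasks, participant_index):
    """Return the task list rotated by the participant's index,
    so each participant starts at a different point in the list."""
    shift = participant_index % len(tasks)
    return tasks[shift:] + tasks[:shift]

tasks = ["Pay bill", "Report outage", "Find local office", "Update account info"]

# Each participant gets the same tasks in a different order.
for p in range(4):
    print(f"Participant {p + 1}: {rotated_tasks(tasks, p)}")
```

A full Latin square does this more rigorously, but for a handful of participants a simple rotation spreads the learning effect well enough.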

Next Up – More Usability
Our next studies will be another mobile study, a desktop usability test (probably the first of 3 or 4), and an eye tracking study. In total, by the time the site launches late this fall, I expect we will have done more than a dozen in-person and remote usability studies – with input collected from more than 250 representative users.

Have you gone responsive yet? What kinds of usability activities did you do? And what amazing DUH moments have you had?

^EJD

@ericdUX on Twitter
Connect on LinkedIn.

Security Questions Get Personal!

We’re all familiar with the traditional “mother’s maiden name” online account security question — one of the first I remember from when security questions began to proliferate. To seemingly provide more secure, unguessable options, companies have added others, like “make and model of first car,” “street you lived on in 3rd grade,” and more questions whose answers don’t ever change. Of course, we also see examples of bad questions that DO change over time. These are mainly “favorites” types of questions: “favorite actor,” “favorite song,” “favorite movie,” “favorite teacher.” If you’re like me, your favorites change over time, rendering these types of questions useless. FYI, if you’ve ever had to reset your Apple ID password, you’ve noticed that Apple is among the biggest culprits of “favorites” questions.


Apple security questions include many “favorites,” which change over time.

Favorites aren’t the only questions whose answers can change. How about “youngest child’s middle name”? Well, what if I have more children? Am I going to remember that fact a couple years down the line when I have forgotten my password, but increased my family count? Heck, questions about where you met your spouse can have ephemeral answers if you’re the divorcing type. Even questions like “nickname as a child” become difficult to answer. I had at least 4 nicknames that family and friends called me by. Which one do I choose? And what is the likelihood that, 6 months later, I’ll remember the right nickname? My oldest cousin’s name? Well, it works for now, but what happens if he or she dies? (I actually refuse to enter questions like this, out of some strange superstition I feel…but that’s a whole separate issue.)

Not only do changeable answers confound the process; arbitrary syntax rules do as well. What if the first concert I attended was “U2,” but the site demands at least 3 characters in the field? If the first concert was Bruce Springsteen, do I include the space between the names? Does upper or lower case matter? If next year, while recovering my password, I type “bruce springsteen” when the site was expecting “Bruce Springsteen,” will it give me an error? What if I spell it “Springstein”? After multiple errors, I may start to doubt my own memory of whether that was my first concert — and go down the “Barry Manilow” path. Hypothetically speaking. Of course.
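
Much of that case-and-spacing friction could be avoided if sites normalized answers before comparing them. Here’s a minimal sketch of the idea in Python — my own illustration, not any site’s actual implementation (in practice the normalized answer would also be hashed, like a password, rather than stored in the clear):

```python
import re

def normalize_answer(answer):
    """Lowercase, trim, and collapse whitespace/punctuation so that
    'Bruce Springsteen' and '  bruce  springsteen ' compare equal."""
    answer = answer.strip().lower()
    answer = re.sub(r"[^a-z0-9]+", " ", answer)  # strip punctuation
    return re.sub(r"\s+", " ", answer).strip()   # collapse spaces

def answers_match(stored, supplied):
    """Compare a stored security answer with what the user typed."""
    return normalize_answer(stored) == normalize_answer(supplied)

print(answers_match("Bruce Springsteen", "  bruce  springsteen "))
```

Normalization can’t rescue a genuine misspelling like “Springstein,” of course — but it removes the arbitrary failures that make users doubt their own memory.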

Usability problems of security questions aside, every now and then, I come across some gems that are worth capturing. These more, um, personal questions are both usable — for their answers can never change — and amusing. Seeing “What is the first name of the boy or girl that you first kissed?” from an online bank was a shocker for its uniqueness — and its being from a bank, of all places. (OK, admittedly, I spent a few seconds pondering if the girl who kissed me in first grade counted, or if the site was really after my first, shall we say, real kiss.)


Security questions get personal, with kissing questions!

This got me wondering — in companies’ efforts to come up with ever-increasing unguessable security questions — how much more personal might these questions start to get? Like, substitute “kissed” with, you know, other activities. Or maybe secret moles or other bodily anomalies that only you know the answer to? Or “How old were you when you lost your ________?” You fill in the blank. (I was thinking “first tooth.” Get your mind out of the gutter.) Regardless, I’m sure I’d glance back over my shoulder to make sure no one was watching as I typed the answers. And I’d probably never forget the answers or enter them wrong. Unless, of course, your site expects me to spell out “forty” instead of “40.”

If you see any clever security questions in your web travels, please pass them along.

^EJD
@ericdUX

Getting Ready for Usability Testing a Responsive Site

At AEP, we’re in the midst of a gigantic redesign of our corporate site, AEP.com. We’re developing the site using responsive design, which means our one design adapts to work on any device. We’ve already conducted at least 5 rounds of user experience testing in order to nail down the global navigation (how it changes from full desktop to iPad to small tablet to iPhone or Android phone), contextual navigation, dealing with long content pages, and other UI elements. Now, with the site earnestly being developed, we’re preparing to bring in customers to test the full experience.

The usability testing, which will begin next week, will have users complete common tasks that explore aspects of navigation (findability and context) of a deep site; readability of content on small devices; usability of tables containing lots of data; the usability of completing a form on a mobile device; getting customer service; general touchability and effectiveness of interaction design.

Working with our outside recruiting firm, we’ve enlisted participants who are very mobile and internet savvy and who have experience looking at information on corporate sites. As of this writing, we will be testing on two iPhones, two iPads, two 7-inch e-Readers, one Android phone, and one 10-inch Android tablet. Our testing hardware will allow us to record both the mobile device screens and the picture-in-picture of the users’ faces.

Because responsive design is meant to work on any device, we’ve invested in a bunch of mobile and desktop gadgets. Here are a few pictures of our gadget lab.

Some of the gadgets in our lab. Not pictured are the Google TV, Playstation 3, and Samsung SmartTV.

Some of the gadgets in our lab. iOS devices, Android phones, e-Readers, even a BlackBerry.


Another view of our gadgets. The Dell monitor on the right is our eyetracking equipment.

Viewing our site on a Samsung SmartTV. Also pictured are a Sony Playstation 3 and Google TV.


We keep the devices in the UX Lab, but we’re also using the room to do QA on the devices. Thus the room has taken on the moniker of QUAX Lab, which I find amusing given that many people think my name is “Eric Dux.” It also speaks to the general craziness we’re all feeling with this project.

The week after we test on the small gadgets, we’ll run another set of participants through the full site experience — using our eyetracking equipment. This will let us see exactly where people are looking as they traverse our site, providing excellent insight to our designers. And, of course, we’ll learn how they navigate and complete tasks, and to what degree of satisfaction.

Being part of a corporate responsive design project has been a significant learning experience for everyone in our group. It’s been a heck of a lot of work and we could probably write articles on content governance, iterative design and usability testing (aka, being agile), nuances of designing for breakpoints, structuring design and development teams, quality assurance, and so on.

Our upcoming usability testing is one portion of a huge project — but an important one to validate that the site works for real people.

Wish us luck!

Citibank’s $392.20 Web and Communications Failure

Today’s communications failure comes from Citibank’s credit card division, Citi Cards. For years, I had been amassing points in their Driver’s Edge program, good for rebates on automobile purchase or service. At some point in 2011, my account was converted to a ThankYou Rewards program. All along, my online account at citicards.com has shown my Driver’s Edge balance as $392.20. Great! So, earlier this month, I had new brakes installed on my car. When I ponied up the $900 for my brake job, I was glad to know that close to $400 would be covered. I found the Driver’s Edge redemption form online, filled it out, attached my receipt, and emailed it in.

Imagine my surprise when I received a letter from the ubiquitous and likely fictional “S. Larson, Customer Service” stating that my rebates had expired:

My letter from "S. Larson" stating that my points had expired.

Expired?

The web says otherwise:

My online account shows $392.20 available.

So I contacted S. Larson’s group and told them there must be a mistake — my balance online still shows $392.20. Not only that, but it specifically calls out 0.00 points as having expired.

I heard back not from S., but from Justin, who informed me that my card was converted in March 2011 and that I had 12 months from conversion date to use my points. They claim they had notified me by mail, and that they could not reinstate my points. But let’s be honest — who actually reads that stuff? Besides, as a paperless customer, I expect my communications to arrive via email. I was especially conditioned to this expectation because it seems all of my communications from Citi Cards *do* arrive by email. Except the important matter of point expiration and money in my bank.

I have looked back over every single Citi Cards email I’ve received going back to 2010 — and not one of them mentions my card converting to a new points program. I see the typical statement ready notice or online account activity confirmation — but not a single communication regarding points, account conversion, or expiration.

The history of my emails from Citi Cards. If my account was converted in March 2011, where is the notification?

I had been accumulating those points for probably close to 10 years, and had banked on them as a cushion for costly service repairs. I had probably redeemed over $800 over the years. Those points represented money in the bank for me, literally. Citibank’s failure to live up to communications expectations, combined with their misleading web site, has cost me $392.20. By the way, if you know anything about earning credit card points, you’ll know that the $392.20 actually represents $39,220 that I had spent on my Citibank card over the years. (Not to mention the tens of thousands of dollars I had spent previously and had redeemed for points.)

So, Citibank, congratulations on your $392.20 failure to communicate, and on planting your flag firmly in the land of customer indifference — where you join a select assortment of other ignominious companies who care little about their customers or their experiences.