I’ve been busy the last few weeks doing some of the most challenging usability testing I’ve ever done. There were three locations where I did day-long test sessions. But that wasn’t the challenging part. The adventure came in testing ballots for the November election.

What was wild about it?
This series of tests came together through a project with the Brennan Center for Justice and the Usability Professionals’ Association. The Brennan Center released a report in July called Better Ballots, which reviewed ballot designs and instructions, finding that

  • hundreds of thousands of voters have been disenfranchised by ballot design problems
  • there has been little or no federal or state guidance on ballot design that might have been helpful to elections officials who define and design ballots at the local level
  • usability testing is the best way to ensure that voters can use ballots to vote as they intend


Also in the report, the Brennan Center strongly urged election officials to conduct usability tests on ballots. The recommendation to include usability testing in the ballot design process is a major revelation in the election world. The UPA Voting and Usability Project has developed the LEO Usability Test Kit to help local elections officials to do their own simple, quick usability tests of ballot designs.

But not all local elections officials were ready to do their own usability tests, and some wanted objective outsiders to help evaluate ballots for this particular, important upcoming election.

I did tests in three locations — Marin County, California; Los Angeles County, California; and Clark County, Nevada (home of Las Vegas) — with about 40 participants across the three locations. Several other UPA volunteers conducted tests and reviews in Florida, New Hampshire, and Ohio. In addition, UPAers trained local elections officials on usability testing and the LEO Test Kit in Ohio, Iowa, and a couple of other spots I can’t think of right now.

Pulling together a test in just a few days, including recruiting and scheduling participants
The Brennan Center report was released toward the end of July. Most ballots must be ready to print or roll out right now, in the middle of September. The Brennan Center sent the report to every election department in the US, and the response was great. Most requests came in during August, so the five or six available members of the UPA Usability and Voting Project scrambled to cover the requests for tests.

We had the assistance of one of the Brennan Center staff to help coordinate recruiting, although it took some pretty serious networking to get people in to sessions on short notice, often within a few days.

The Brennan Center covered the expenses, but the time and effort spent by the people who worked with local elections officials and conducted the sessions was purely pro bono.

Not knowing what I would be testing until I walked onto the site
For two of the three tests, I didn’t see exactly what I would be testing until I walked in the door of the election department. (I got the other ballot two days before the test.) This happened for a couple of reasons. Sometimes the local election official didn’t have a lot of information about what could be evaluated and how that might happen. Sometimes the ballot wasn’t ready until the last minute because of final filing deadlines or other constraints. Sometimes it was all of the above.

Fortunately, the main task is pretty straightforward: Vote! Use the ballot as you normally would. But there are neat variations. Are write-ins possible? On an electronic voting machine, how do you change a vote? What if you’re mailing in a ballot — what’s different about that, and how do the design and instructions have to compensate for voters not having poll workers available to ask questions?

Giving immediate results and feedback
So, we got copies of ballots, or something close to final on an electronic voting machine. We met briefly with the local elections officials (and often with their advisory committees). We recruited participants (sometimes off the street). We conducted 8 or 10 or 15 twenty-minute sessions in one day. Now it’s time to roll up what we saw in the sessions and to talk with the person who owns the ballot about how the evaluations went.

Handling enthusiastic observers and activists
A lot of people are concerned with the usability, accessibility, and security of ballots and voting systems. You probably are. Some are more concerned about it than others. Those are the people who show up to observe sessions. They’re well informed, they’re enthusiastic, and they’re skeptical. The observers and activists (many signed up to be test participants) were also keenly interested in understanding this activity. How was this different from focus groups or reviews by experts? How do we know that the problems we’ve witnessed are generalizable to other voters in the jurisdiction?

The good news: Mostly, the ballots worked pretty well. The local elections officials usually have the ability to make small changes at this stage and they were willing, especially to improve instructions to voters. By doing this testing, we were able to effect change and to make voting easier for many, many voters. (LA County alone has more than 3 million registered voters.)

Links:
Brennan Center for Justice report Better Ballots
http://www.brennancenter.org/content/resource/better_ballots/

UPA’s Voting and Usability Project
http://www.usabilityprofessionals.org/civiclife/voting/
voting@usabilityprofessionals.org

LEO Usability Testing Kit
http://www.usabilityprofessionals.org/civiclife/voting/leo_testing.html

Ethics guidelines for usability and design professionals working in elections
http://www.usabilityprofessionals.org/civiclife/voting/ethics.html

Information about being a poll worker
http://www.eac.gov/voter/poll%20workers

EAC Effective Polling Place Designs
http://www.eac.gov/election/effective-polling-place-designs

EAC Election Management Guidelines
http://www.eac.gov/election/quick-start-management-guides