A STUDY OF THE USABILITY OF THREE UNEMPLOYMENT REGISTRATION WEBSITES

Matt Specter

ABSTRACT

For this study, 10 subjects were asked to find their way to the unemployment registration pages of Ohio, Kentucky, and Indiana. Times and comments were recorded and analyzed. Results showed an unclear picture of which state was easiest to navigate, but many users had great difficulty with Indiana's website. User frustration was high and was usually related to two issues: large quantities of unhelpful text, and difficulty in locating the dedicated new-user entrance page. Several users suggested placing greater emphasis on the links to the new-user area, or finding other ways to make this area easier to find.

INTRODUCTION

In the current economic climate, a great deal of attention is being given to statistics regarding unemployment. While most recent attention has gone to the end of the unemployment process (how long benefits last), little research has been done on the beginning of that process. Registering and applying for unemployment benefits, like any other deeply involved online interaction, can be highly complicated. In addition, since the agencies responsible for implementing these forms are not driven by a profit motive to improve usability, the process often does not meet the standard many have come to expect from similar registrations on, for example, a consumer-oriented website. What begins as a mere quality issue becomes much more important if the confusion created by poorly designed forms actually prevents unemployed persons from completing the registration process. If at any point people are frustrated to the point of giving up, then this becomes in some measure a social justice issue. It is not unreasonable to say that the agencies running unemployment programs, or any other form of public assistance, have an ethical obligation to present an easy-to-use product.
And while it is true that other registration methods exist which may provide an alternative (phone registration, for example), these methods are labor-intensive and inefficient compared to what a well-maintained, automated computerized system can achieve. In short, a well-designed user interface for every step of the process, beginning with registration and application, benefits the entire agency and the people it serves. As a person who has unfortunately been forced to use Ohio's version of this process firsthand, I developed an interest in the quality of the experience. At several points throughout my time with the online unemployment system, I cynically (and only half-sarcastically) wondered if the system was designed to be unusable. In addition, having regularly experienced hour-long wait times to speak to live help regarding the claims process, I can attest to the need to streamline the process wherever possible. The fewer government employees tied up in simple registration issues, the more are available to handle the complex questions in the claim-filing process.

This proposal is for a pilot study of a potentially much larger study, or more accurately, an entire area of study. Apart from Buie and Murray's extensive Usability in Government Systems: User Experience Design for Citizens and Public Servants, there is a limited amount of literature devoted specifically to the usability of online government and social services, particularly in the United States. Bertot, Jaeger, and McClure's "Citizen-Centered E-government Services: Benefits, Costs, and Research Needs" does, however, provide a useful backdrop against which to view the results and any suggestions for change. The possibilities for expanding this basic pilot study are immense. With 50 states and a plethora of governmental programs serving the needs of their citizens, there is a tremendous variety of designs and interfaces to explore, with the opportunity to determine best practices from a wide variety of systems. Information learned in researching ease of use in the unemployment process will likely be easily transferable to other governmental programs such as Food Assistance, Medicaid, and Social Security, and successful design concepts should be transferable across states and departments.

METHODS

For this experiment, subjects were given the task of navigating from a search engine to the registration pages of Ohio's, Kentucky's, and Indiana's unemployment systems. The intent was to simulate the experience of a newly unemployed person with no experience using the website, in order to judge the accessibility of the system itself. Data were collected on the time taken to reach the pages in question, the ease of finding the necessary web pages, and users' general thoughts about the intuitiveness of those pages. Data on time elapsed to achieve the goal were analyzed quantitatively, and user comments during and after the process were analyzed qualitatively.
It should be noted that the study recorded the time necessary not merely to arrive at the target page, but to recognize and identify the page as such. After all, if the actual page's purpose is not immediately clear, this is a major usability concern. Common themes among individual states' processes and across all states were noted. The order of the states was rotated throughout the process so as not to create irregularities in the data due to test fatigue or other factors. This was a limited study using a sample of convenience, composed of adults aged 35-64. The test was conducted in a controlled, distraction-free environment. The speed of the internet connection used in differing environments was similar enough to have a negligible effect on the results. This study was limited to the process of getting to the starting line because the researcher had no access to the systems themselves. I hope that this study could be expanded by contacting the IT departments at the various governmental programs in question and setting up means to simulate the processes of registration, claim filing, and weekly maintenance, by providing test subjects with fictitious information to be entered into a copy of the system environment, thus not creating false information on the real system. As previously mentioned, this study can also be expanded to include any and all government systems which require a forms-and-registration process or online interactions of any kind. There is also the potential to compare the ease of use of these sites to similar commercial sites, to see if anything can be learned from a business environment and applied to governmental programs.
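The rotation of state order described above can be sketched as follows. This is a minimal illustration only, assuming a simple cyclic shift of the starting state from one subject to the next; the study does not specify the exact counterbalancing scheme used.

```python
# Assign each subject a rotated presentation order so that no single
# state is always tested first (guarding against test-fatigue bias).
STATES = ["Ohio", "Kentucky", "Indiana"]

def rotated_orders(n_subjects, states=STATES):
    """Return one presentation order per subject, cycling the starting state."""
    orders = []
    for i in range(n_subjects):
        k = i % len(states)          # shift the starting point for each subject
        orders.append(states[k:] + states[:k])
    return orders

# With 10 subjects, each state appears first either 3 or 4 times.
orders = rotated_orders(10)
```

A full Latin-square design would also vary which state comes second and third independently; the cyclic shift shown here is the simplest scheme consistent with the description in the text.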

RESULTS

Testing showed a wide range of times necessary to complete the tasks. Indiana showed the longest mean time among the 10 subjects, at 5:18. Subjects averaged 4:29 to find Ohio's registration page, and only 2:15 to find Kentucky's. Two times each, subjects were unable to locate Ohio's or Indiana's registration page within the allotted 10-minute timeframe, and one user quit and refused to continue the test while searching each of Ohio's and Indiana's websites (in these instances, a time of 10 minutes was assigned). There were no such failures for Kentucky's website. In general, most subjects required 1 to 5 minutes to locate and identify the required registration pages.

These results are somewhat misleading, due to a large variation for Ohio especially, caused by the 10-minute timeouts. Without these failures, times required to reach Ohio's page were generally lower than or nearly identical to those for Kentucky's. In addition, Ohio received equal votes from subjects for being the most positive experience among the three states, in some cases even when Ohio took a subject longer to reach than another state. Demographically speaking, age showed little relation to the time required to complete the tasks, nor did gender or subjects' estimated weekly web-browsing time. Although the experiment rotated the order of states with the intent of eliminating bias due to test fatigue, the results showed the opposite trend, with many subjects showing a noticeable reduction in time from the first state to the third. However, the six reported failures were evenly distributed among the first, second, and third states attempted. The only generalizable comparative statement that could be drawn from the quantitative data is that Indiana's web environment usually took longer for subjects to navigate than Ohio's or Kentucky's.
While the numbers painted a mixed picture and were not always clear, the comments provided by users both during and after the test gave clear insight into the reasons for the difficulties experienced. In addition, patterns of behavior emerged which mirrored the comments and painted a strong picture of the areas needing improvement. A common theme among most users was feeling overwhelmed by a large amount of text. Users complained of there being "a lot of reading," called the amount of text "intimidating," and described pages as containing "a lot of information that can be confusing." One user, in a moment of exasperation, said, "Good grief, there's an awful lot of stuff here." In most instances, subjects ended up skimming or completely ignoring large blocks of text, choosing either to click on a different page or to scroll quickly down to find the link they wanted. Users only tackled these large blocks of text if they were experiencing difficulty finding the required page and beginning to show signs of frustration, and in these cases, they found the text they read to be unhelpful. Comments at this stage included "this is information I don't really want" and "Come on, just tell me how to get to this." One user even said at this point, "Is this going to do the same thing I tried not to read before?" A second common experience among most test subjects was difficulty in finding the specific area dedicated to new users or first-time applicants. All three web environments require users to
navigate through an existing user's login page in order to reach the new-user registration area. Nearly all users were able to find their way to the returning-user login page, but at this point, finding the link to the new-user area became a major challenge. Several users, unable to find the link on this page, backed out of the page while only one click away from their goal. In fact, this happened in five of the six instances where subjects were eventually unable to complete the task. Most users simply backed out and eventually circled around again, only to look more closely the second (or third) time around. The details of why this was so difficult will be covered below in an analysis of each state, but this issue accounted for a large percentage of user comments, with more than half of subjects commenting that greater emphasis needed to be given to the "New User" or "First-time applicant" section. The issues of excessive and unhelpful text and hard-to-find entry links mentioned above, coupled with other issues, led several test subjects to comment that the tasks presented would have been extremely difficult for someone with less computer and internet experience. One user commented, "How would someone less educated handle this?" and, as another put it, "If you don't know anything about it, forget it."

KENTUCKY

Kentucky's website, as mentioned above, produced the lowest mean time to complete the task of the three states tested. It also proved to be the only state in which all users were able to successfully navigate to the registration page. However, Kentucky received fewer votes as the most positive of the three than did Ohio. This is likely due to a combination of issues. Kentucky's entry page does not provide the link to the login area in a prominent location. The link is located roughly halfway down the page, under large blocks of text, and somewhat buried.
Users called this at various times "confusing" and "a complex screen," and this proved to be the major challenge of navigating Kentucky's web environment. Once this link is clicked, users are taken to another page of lengthy text. This page proved to be a source of annoyance to most users, after having already navigated a text-heavy page en route to it. However, as a result of this frustration, most users skipped the text, scrolled down quickly to the bottom of the page, and immediately found the link to the returning-user area. Thus, the very source of users' frustration caused them to advance quickly to their next destination, producing a decrease in satisfaction along with a decrease in elapsed time. The returning-user page is sparse, making it very easy to find the "New User" link, and thus no users were tempted to back out of this page as they were in other states' environments. Several subjects described finding the "New User" link on this page as much easier than in other states' environments. All in all, Kentucky's system, while frustrating to several users due to the abundance of text, seemed the easiest for test subjects to successfully navigate, largely due to the simplicity of the returning-user login page.

OHIO

Ohio's website produced the lowest mean time to complete the task among those who eventually succeeded; in other words, leaving out the three failures, Ohio was navigated very quickly. In general, users quickly found their way to the existing-account login page, but if users were at this point unable to locate the link to register as a first-time user, their likelihood of failing to complete the task increased dramatically. The trouble many users faced was that once they left that page without finding the link they needed, they assumed the page was a wrong turn on the way to their goal. Therefore, no matter how many times their navigation brought them back to the same page, they no longer scanned the page for the information needed. This made for a very frustrating experience. User comments at this point included "Here we go again" and "I've been here before and it didn't help me," along with "I'm just going around in a circle here." Several users also experienced difficulty with the browser's Back button, which does not function in Ohio's website environment. This was a minor issue, as the site contains side and top navigation menus which allowed users to restart their search. The page which gave users so much difficulty was sparse, much like Kentucky's. The reason so many users had difficulty finding the required link was possibly, in this case, a less-than-obvious choice of text. Whereas Kentucky asks new users to click on the words "New User," and Indiana even goes so far as to include a "New User?" button, Ohio chose the words "Register to file your claims online!" underlined and in italics. Even though the link was near the center of the page, the text and formatting in many ways caused the link to become buried in plain sight.
Perhaps most interestingly, however, all three users who failed to find the required page on Ohio's system found themselves navigating to a page which included the phone number to call to accomplish the same task, and all three made comments at this point about how they would probably give up and call. Comments included "I think I'll call," "Can I just call?" and "Looks like they want you to call." The significance of this was mentioned in the introduction and will be addressed again below. Meanwhile, those subjects who quickly found the required link had positive comments on the website's design and their experience using it, calling it "fairly easy" and "a lot easier." However, several users also commented on the overwhelming amount of text and its relative irrelevance to the task at hand. User comments to this effect included "Lot of stuff, it doesn't seem to be applicable" and "A lot of less important information on the landing page." Ohio's design does allow users to bypass this information by following the menu navigation links on the left side of the page, so not all users encountered this issue. In conclusion, Ohio's website provided a mixed bag of results, and it would appear that making the "New User" link more obvious and clear would make a huge difference in reducing both required time and frustration.

INDIANA

Indiana's web environment required the longest time for most subjects to navigate, produced a high level of frustration, and caused three failures to complete the task. The increase in necessary time was connected to several factors. To continue past Indiana's landing page, users must click on a large icon with a company logo and the words "Claimant Self Service." This appeared to be very unintuitive, as very few users actually chose this link as their next step. Most users resorted to seeking ideas from the menu bars placed at left and right, which contain links entitled "File for Unemployment" and "Filing for Unemployment," respectively. Users found this to be a clearer path to follow, but became quite frustrated when both of those links returned them to the page they were currently browsing. Many users at this point resorted to clicking other links on the page, which included a streaming tutorial for using the system, a collection of screenshots from the system, and a resource guide in PDF form. The resource guide, which essentially begins with ways of coping with unemployment, was quickly closed by all users. The screenshots did in fact provide an image of the target page, but did not highlight or explain how to navigate to that page, so they too were of little help. Users lost a great deal of time waiting for the lengthy tutorial to load, and all but one quickly shut it down before the narration could conclude its instructions on how to use the tutorial's navigation buttons. User comments while wading through these steps included "Oh God, I don't want to watch this," "I don't want to view a tutorial or download a resource guide," and "You don't need a tutorial if you know how to fill in blanks." (Only one subject took the time to watch the tutorial long enough to receive helpful information, using the strategy of running the tutorial in a second browser tab while continuing to search for the correct page in the first.)
It was only after this point that many users attempted the correct "Claimant Self Service" link, but this link then brought users to a page seemingly requiring them to view the tutorial. This frustration is what led one user to quit, and two others to abandon the page and not return, causing their failure to complete the task. Eventually, most users did notice the red text allowing them to continue past the tutorial (located below a large "View the Tutorial" button and a stop-sign icon). At this point, users were taken to a returning-user login page, just as in Ohio and Kentucky. And here, users again had great difficulty finding the "New User?" button located at the very bottom of the page. All this added up to a great deal of time and frustration. It is worth noting that several testing factors may have negatively influenced Indiana's results. Since subjects were instructed simply to proceed to the necessary page, and were aware of the timed nature of the test, there was no interest in sitting through the tutorial or reading the resource guide. It is entirely possible that for a genuinely unemployed person, these would be a source of welcome information, worth the wait and the time necessary to digest it. One cannot fault the state of Indiana for failing to provide information in a variety of formats. It would be worth analyzing further the content and presentation of the provided online help to look for ways to make it more immediately useful to users. In addition, this study was conducted on a laptop computer with a moderately sized screen, which required subjects to scroll down to find the "New User?" button. One subject astutely asked if the button would have been visible on a larger screen. This may be true.
A further study of the browsing habits of the populations served by these government agencies would be required to determine whether redesigning the websites with smaller screens (or even mobile devices) in mind would be a worthwhile task.

CONCLUSIONS

This study was extremely limited in scope, yet very eye-opening to anyone who has had experience navigating government agency web environments. The two themes that emerged from user comments were the overwhelming amount of (unhelpful) text, and the difficulty of finding the new-user registration area, especially compared to finding the returning-user section. Comments in general were overwhelmingly negative. While many users did complain about the amount of text, some hypothesized that this was a necessary evil in this kind of website, where regulations may well dictate that a large amount of information be disseminated. Research into how much of the information imparted on these websites truly is mandated, as well as how much of it is truly taken in by users, would constitute a new and possibly rich source of knowledge, especially considering the rapidity with which users abandoned the task of actually reading the large blocks of text. Although subjects were not specifically asked during the study to provide suggestions for improvement, many offered such advice, especially regarding the hard-to-locate new-user areas. Most subjects suggested that greater emphasis needed to be placed on the link for new users. One suggested a large banner or prominent menu option, another suggested a large button reading "New User? Click Here," and still another recommended an entirely separate, dedicated area of the site. Several users commented on how much more streamlined commercial websites were in this respect, and how much easier they were to navigate. This is to be expected to some degree, as a governmental website does not serve the same purpose as a commercial one. However, as mentioned in the introduction to this report, the government agencies that run these websites have an ethical obligation to provide an easy-to-use product. In this limited study, 30% of the users of Ohio's and Indiana's systems gave up or failed to find the necessary page.
Those users would simply have been forced to use the phone to register, a process requiring much more time and more resources on the part of the agency. Government is often already saddled with an image of being overburdened, slow to change, and moving at a glacial pace. Effective web forms and automated systems are a chance to let technology take some of the pressure off the massive tasks expected of these agencies, but only if implemented correctly. It is telling that one subject, who had experience navigating a different area of Ohio's social services website, called this first experience with the unemployment system "typical." This study may have investigated only the unemployment registration process, but this researcher would expect to find similar themes throughout the systems of all states. As already mentioned, there is very little literature regarding the usability of the websites of governments and service agencies. Hopefully this limited study can spark an interest in the area, and lead to greater research and some degree of change.

REFERENCES

Bertot, J. C., Jaeger, P. T., & McClure, C. R. (2008, May). Citizen-centered e-government services: Benefits, costs, and research needs. In Proceedings of the 9th Annual International Digital Government Research Conference (pp. 137-142).

Buie, E., & Murray, D. (2012). Usability in government systems: User experience design for citizens and public servants. Burlington: Elsevier Science.

Krug, S. (2009). Rocket surgery made easy: The do-it-yourself guide to finding and fixing usability problems (1st ed.). New Riders Publishing.

Withrow, J., Brinck, T., & Speredelozzi, A. (2000). Comparative usability evaluation for an e-government portal. Ann Arbor, MI.

Subject   Ohio   Kentucky   Indiana
1         2:12   3:44       4:49
2         1:40   1:27       1:31
3         1:05   1:04       2:36
4         0:37   1:32       5:04
5         DNF    1:19       DNF
6         DNF    5:36       DNF
7         1:36   3:01       4:07
8         DNF    1:44       DNF
9         1:16   2:07       1:18
10        6:33   1:03       3:37

Fig. 8. Data results: time per subject to locate each state's registration page (DNF = did not finish within the 10-minute limit; the accompanying chart displayed times in seconds).
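The reported mean times can be reproduced from the data above. The following is a quick sketch of that arithmetic, scoring DNF attempts at the 10-minute cap of 600 seconds, as described in the Results section:

```python
def mmss_to_seconds(t):
    """Convert an m:ss time string to seconds; DNF scores the 10-minute cap."""
    if t == "DNF":
        return 600
    m, s = t.split(":")
    return int(m) * 60 + int(s)

# Per-subject times from Fig. 8.
times = {
    "Ohio":     ["2:12", "1:40", "1:05", "0:37", "DNF", "DNF", "1:36", "DNF", "1:16", "6:33"],
    "Kentucky": ["3:44", "1:27", "1:04", "1:32", "1:19", "5:36", "3:01", "1:44", "2:07", "1:03"],
    "Indiana":  ["4:49", "1:31", "2:36", "5:04", "DNF", "DNF", "4:07", "DNF", "1:18", "3:37"],
}

for state, raw in times.items():
    secs = [mmss_to_seconds(t) for t in raw]
    mean = sum(secs) / len(secs)
    print(f"{state}: mean {int(mean // 60)}:{int(mean % 60):02d}")
# Prints means of 4:29 (Ohio), 2:15 (Kentucky), and 5:18 (Indiana),
# matching the values reported in the Results section.
```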
