Posted on April 2nd, 2012
Companies spend a lot of money developing a navigation structure but often spend little time testing it with users. We know it’s important to test our navigation structures, especially on content-rich websites, but testing often gets shortened, pushed to the end of a project, or eliminated altogether.
At ADG, we’ve developed a process for validating website structures using an online web application. The process allows us to capture feedback from users on a proposed navigation structure in a cost-effective, timely manner.
The first thing we do is develop the structure based on user research, business requirements, content audits, and web statistics. Typically we already know the audiences that will use the website, as we’ve identified them during user research.
We then develop the tasks that users will perform on the website (usually 12 to 18 tasks). Typically we are testing “findability”: we ask where a person would look for a particular piece of content or functionality on the site. We try to include a broad range of tasks that require users to look at all sections of the navigation structure, especially areas we think may cause problems.
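As an illustration of how such tasks might be organized (this is a hypothetical sketch, not Navtester's actual format; the task prompts and category names are invented), each task can pair the question shown to the user with the path in the proposed structure where the content actually lives. A quick check like the one below helps confirm the task set covers every top-level section:

```python
# Hypothetical findability tasks: each pairs the prompt shown to the user
# with the expected location in the proposed navigation structure.
tasks = [
    {"prompt": "Where would you look for the annual report?",
     "expected_path": ["About Us", "Publications", "Annual Reports"]},
    {"prompt": "Where would you find parking information for visitors?",
     "expected_path": ["Visit", "Getting Here", "Parking"]},
]

def covers_section(tasks, top_level):
    """Return True if at least one task targets the given top-level section."""
    return any(t["expected_path"][0] == top_level for t in tasks)

print(covers_section(tasks, "Visit"))  # → True
```

Running a coverage check like this across all top-level categories is one way to spot sections of the structure that no task would exercise.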
We recruit people from our different audience segments and try to get at least 25 to 30 users for the testing. While the recruiting is happening, we set up the study in our online testing tool called Navtester. The setup is similar to using a tool like Survey Monkey - you design the study and then release it when it is ready. We usually leave the study open for 3 days and monitor the responses as they come in.
For the testing, users navigate through the site categories until they find the section they believe would contain the information they’re looking for. Once users have found what they think is the correct spot in the structure, they click a button to indicate that this is where the information would be located. Users can then add a comment before moving to the next task. As users progress through the tasks, their pathways, times, comments and other information are tracked in the back end of the testing tool.
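To make the tracking concrete, here is a minimal sketch of the kind of per-task record such a tool might keep. The field names and helper below are assumptions for illustration, not Navtester's actual schema:

```python
import time

def record_attempt(task_id, clicks, answer_path, started, finished, comment=""):
    """Build an illustrative record of one user's attempt at one task."""
    return {
        "task_id": task_id,
        "clicks": clicks,                # every category the user visited, in order
        "answer_path": answer_path,      # where they said the content would live
        "seconds": round(finished - started, 2),
        "comment": comment,
    }

start = time.monotonic()
# ... user navigates the tree, backtracking once through "News" ...
attempt = record_attempt(
    task_id=1,
    clicks=["About Us", "News", "About Us", "Publications"],
    answer_path=["About Us", "Publications"],
    started=start,
    finished=start + 14.6,
)
```

Keeping the full click sequence (not just the final answer) is what later lets you distinguish users who went straight to the right place from those who got there by trial and error.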
Although Navtester helps us capture and present the results, we still need to analyze why certain patterns are occurring. Are users confused with the structure? Are category names unclear? Did they understand the task? Is there a better way to classify content?
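The kinds of patterns we look for can be summarized per task: how many users ended in the right place, how many went there directly without backtracking, and which wrong destinations attracted the most users. The sketch below (a hypothetical analysis, not Navtester's reporting) shows one way to compute these from the tracked records:

```python
from collections import Counter

def analyze(attempts, expected_path):
    """Summarize one task: success rate, direct-success rate, and the
    most common wrong destinations (illustrative, assumed record format)."""
    successes = [a for a in attempts if a["answer_path"] == expected_path]
    # "Direct" success: the click sequence matches the expected path exactly,
    # i.e. the user never backtracked or detoured.
    direct = [a for a in successes if a["clicks"] == expected_path]
    wrong = Counter(tuple(a["answer_path"]) for a in attempts
                    if a["answer_path"] != expected_path)
    return {
        "success_rate": len(successes) / len(attempts),
        "direct_rate": len(direct) / len(attempts),
        "common_wrong": wrong.most_common(3),
    }

attempts = [
    {"answer_path": ["About", "Reports"], "clicks": ["About", "Reports"]},
    {"answer_path": ["About", "Reports"], "clicks": ["News", "About", "Reports"]},
    {"answer_path": ["News"], "clicks": ["News"]},
]
summary = analyze(attempts, ["About", "Reports"])
```

A low success rate with one dominant wrong destination usually points to a labeling problem, while scattered wrong answers suggest the category itself is unclear.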
We typically conduct three iterations, but more can be done if the structure needs further refinement.
We have found the following benefits of testing a navigation structure: