If you work at an agency, you’ve likely heard these comments a few times: “Research is too expensive.” “There’s just not enough time for user testing.”
Over time, this perpetual loop of trying to convince clients of the importance of user research can frustrate and infuriate the best of us.
This post is a real-world case study to help illustrate how you can create a more efficient learning process by incorporating guerrilla methods into your user research—hopefully debunking the myth that it’s too expensive and too time-consuming.
Read on to find out how my team at Signal took this approach in order to evolve and revise an outdated website navigation for our client, a chain restaurant famous for its flame-grilled peri peri chicken.
Chicken? Yes, chicken
If you’re in the US, the first place that probably comes to mind when you think about chicken is KFC. But anyone based in the UK—and 30 other countries around the world—will immediately think of Nando’s, a wildly popular chain founded in South Africa in 1987 on a rich mix of heart and soul.
For the past 10 years, Signal has worked on over 400 projects to help bring Nando’s signature blend of food, music, art, and atmosphere to the digital world.
When the company underwent a full brand overhaul in 2015 to introduce vibrant colors, patterns, and African flair, we at Signal were tasked with bringing this new, fresh feel to the website.
We got to move beyond the old box-based, heavily textured site and create a completely new experience—with improved usability and accessibility—for their visitors and fans.
A fresh start
In the early days, getting Nando’s to think about their brand digitally was a challenge. Their entire company ethos is founded on tangible experiences, focusing on the atmosphere and service within their restaurants.
But after seeing the success of the relaunched website at the start of 2016, we found ourselves having conversations with other departments within the company who wanted to make use of this newly focused channel. This led to a clear roadmap of digital content they wished to create and publish over the next few months.
As the site took on more new and varied content, the navigation grew too complex. The rationale behind the original content groupings began to break down, and the usability of the design was put under strain.
So we planned a project that would give us the time and resources needed to focus on improving the navigation of the website.
Since the navigation is so integral to influencing and affecting the site’s user experience, it was important that we balanced stakeholder content needs with insight gathered directly from their customers. We needed to have a clear understanding of user needs so we could prioritize content to provide customers with the information they wanted to find.
To achieve this, we did user research. Generally speaking, I’ve found that research and user testing have the unfortunate stigma of being considered expensive. When budgets need to be cut or tightened, they’re usually the first to go.
In this case, it was a spike project that we were keen to keep inside the budget, so in order to get maximum insight we devised a guerrilla approach.
Ultimately, this approach is about being smarter and more efficient in the way we planned our activities and used the findings.
This meant no deliverables for the sake of deliverables, and no time wasted on lengthy write-ups. Instead, we took a practical, lean, and ready approach.
By defining working conditions that allowed us to think, design, test, learn, and repeat in shorter bursts, we arrived at solutions faster and gained better end results.
A clear direction
This approach works best when you have clearly defined objectives. This means you can choose activities as a path to help bring you to your goal, while keeping the flexibility to swap, tweak, or change as you uncover insight and knowledge.
For us, the main goal for the project was to design and build a responsive website navigation that caters to the needs of both internal stakeholders and Nando’s customers.
We further broke this down into the following objectives:
- Evaluate—and, where appropriate, reassign—where all the content was currently hosted on the website. From there, we’d create a new organizing principle for all future content.
- Assess the tone of voice used in the navigation and develop a standardized one to increase customer understanding. This involved creating a new set of labels for better signposting.
- Check the usability of the interface and design a new layout to increase ease of use.
We started by reviewing our current navigation so that we could understand where there were opportunities for improvement. We did this by testing the current navigation with users on multiple devices, giving us valuable qualitative insight into its usability. We gained further quantitative insight into web users’ current behaviors and device usage from a breakdown of Google Analytics.
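As a rough illustration of the quantitative side, here’s a minimal sketch of summarizing a device-category breakdown like the one a Google Analytics export provides. The session counts are hypothetical, not Nando’s actual numbers:

```python
# Sketch: summarizing sessions by device category, as you might after
# exporting a Google Analytics device report. Counts are hypothetical.
sessions = {"mobile": 64_000, "desktop": 28_000, "tablet": 8_000}

total = sum(sessions.values())
shares = {device: count / total for device, count in sessions.items()}

# Print device shares, largest first, to see where users actually are.
for device, share in sorted(shares.items(), key=lambda kv: -kv[1]):
    print(f"{device}: {share:.0%} of sessions")
```

Even a simple breakdown like this tells you which devices your usability testing rounds need to prioritize.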
We found that the main labels in the navigation were too ambiguous and didn’t set clear enough expectations for users. So, in order to better structure the information architecture, we created a card-sorting exercise to let Nando’s customers categorize content themselves.
We completed this using OptimalSort.
Leveraging our client’s resources, we offered participants meal vouchers instead of payment and conducted the tests at restaurants instead of needing to rent meeting rooms.
Using results to guide activities
The results let us clearly see the relationships between different types of Nando’s web content from the perspective of our sample group of customers. From there, we felt we had enough insight to develop an IA pattern, which we then tested with a larger sample of Nando’s customers.
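The core analysis behind a card sort is a co-occurrence count: how often did participants place two items in the same group? Here’s a minimal sketch of that idea — the item names and sorts are invented for illustration, not Nando’s real data:

```python
from collections import Counter
from itertools import combinations

# Sketch: co-occurrence counts from open card-sort results.
# Each participant's sort is a list of groups (sets of card names).
# Item names and groupings are hypothetical.
sorts = [
    [{"Menu", "Nutrition"}, {"Find a restaurant", "Book a table"}],
    [{"Menu", "Nutrition", "Book a table"}, {"Find a restaurant"}],
    [{"Menu", "Nutrition"}, {"Find a restaurant", "Book a table"}],
]

pairs = Counter()
for groups in sorts:
    for group in groups:
        # Count every pair of items that shared a group in this sort.
        for a, b in combinations(sorted(group), 2):
            pairs[(a, b)] += 1

# Pairs grouped together by more participants are stronger candidates
# for sharing a navigation category.
for (a, b), count in pairs.most_common():
    print(f"{a} + {b}: grouped together by {count}/{len(sorts)} participants")
```

Tools like OptimalSort produce this kind of similarity matrix for you; the sketch just shows what the numbers mean.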
We held a stakeholder workshop so that business needs were also considered during the IA definition work. From both the customer tests and the stakeholder workshops, we came up with two organizing principles: One where we grouped the content into four categories (Option A), and one where we grouped it into five (Option B).
To test these two principles, we designed a first-click test using Chalkmark, to be completed remotely by the company’s group of customer product and service testers. We had 200 participants in total and split the test 50/50, so each person tested only one organizing principle.
Each test comprised an image mockup of the Nando’s navigation panel with the category labels displayed. We asked questions such as, “If you wanted to contact Nando’s customer service, where would you click?” and had participants click the label they’d expect to find that content behind. This allowed us to understand whether people could tell what was behind the labels at first glance.
We used this method because studies have found that a participant who clicks down the right path on the first click will complete their task successfully 87% of the time, while a participant who clicks down the wrong path first succeeds only 46% of the time. This test allowed us to evaluate the effectiveness of our proposed navigation solutions.
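Scoring a first-click test comes down to a per-task success rate for each design. The sketch below uses made-up click logs and an assumed “correct” label per task — the real scoring came from Chalkmark:

```python
from collections import defaultdict

# Sketch: scoring first-click results. Each record is
# (task, option, label clicked); all data here is hypothetical.
correct = {
    ("contact customer service", "A"): "Explore",
    ("contact customer service", "B"): "Help",
}
clicks = [
    ("contact customer service", "A", "Explore"),
    ("contact customer service", "A", "Eat"),
    ("contact customer service", "A", "Play"),
    ("contact customer service", "B", "Help"),
    ("contact customer service", "B", "Help"),
    ("contact customer service", "B", "Food"),
]

hits = defaultdict(int)
totals = defaultdict(int)
for task, option, label in clicks:
    totals[(task, option)] += 1
    if label == correct[(task, option)]:
        hits[(task, option)] += 1

# Compare options task by task: the higher the first-click success
# rate, the clearer the label set.
for key in sorted(totals):
    rate = hits[key] / totals[key]
    print(f"{key}: {rate:.0%} first-click success")
```

Comparing these rates side by side per task is exactly how an ambiguous label set shows itself against a direct one.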
We discovered that some of the language used in the labels scored poorly in the first organizing principle. Although these labels reflected the brand’s fun identity, they were too ambiguous (e.g., “Eat,” “Explore,” “Play”). The second option, by contrast, achieved a success rate of over 80% on most tasks because its labels were more specific and direct signposts (e.g., “Food,” “Restaurants,” etc.).
Rapid design prototyping
The next stage was to use the insights gained through testing and begin prototyping, to work out the best way to present the UI and navigation.
I created a number of elementary prototypes using InVision that allowed us to get user feedback on usability and accessibility across various devices. Almost two-thirds of Nando’s site visitors access the site on a mobile device, so it was critical that the navigation worked well on mobile.
To support a rapid design, test, learn process, I kept my design tools and methods lean and efficient, taking advantage of Craft by InVision to allow for speedy interface amends through Sketch.
We evolved our designs from low to high fidelity through each round of testing. As the content was refined, so was the interface. We were able to make smart decisions based directly on user feedback, which meant we could find the right balance of user experience, interface elements, and brand visual styling through an iterative and user-centric approach.
The final outcome was a design that we felt met the objectives we set out to achieve. We made sure to put analytics in place so we could measure the effectiveness of the new design and look for new improvements to make in the future.
If you’re a UX designer at an organization that believes user research is too expensive and too time-consuming, I hope this post has provided a useful example of how, with some resourcefulness and willingness, it’s possible to deliver a user experience based on thorough research and valid evidence—even within an existing scope and cost.
In summary, here are some of my best tips for maximizing user research budgets:
- Set clear objectives at the start
- Define working conditions to enable a lean process free from admin
- Leverage any available resources from your client, or existing resources from previous projects
- Find a research tool that enables remote and in-person testing
- Set up a lean design, test, learn flow by leveraging plugins and integrations
- Be comfortable working in mixed fidelity
And if you do a good job, you might also be able to sell your client or exec team on design sprints—and that’ll open the door to bigger budgets in the future.