Supporting a customer-focused content site for the whole of NSW Government.

In 2022, the NSW Department of Customer Service embarked on the mammoth task of merging content across many department and agency websites under one umbrella: NSW.gov.au.

As they rolled out a new 20-topic taxonomy for this content, I planned, conducted and analysed quarterly user testing studies to validate wayfinding, information architecture and design decisions with users.

The research aimed to uncover usability issues, gain customer and stakeholder trust, and identify opportunities to drive value.

As a wide variety of government agencies brought their content on board, the NSW Digital Channels team needed to assure their agency partners that users could find and engage with their content. Similarly, if visitors to the site could reliably locate the information they were looking for, this would engender trust in the NSW Government. The CX team took the opportunity to use regular user testing not just to identify issues but also to surface opportunities to add value.

A mixed-methods approach for usability and content

Understanding beyond the brief

For this project, I decided on two different methods to fit the budget and objectives:

  • Usability interviews facilitated remotely using Lookback

  • Treejack testing using Maze

The content team initially asked for 10 usability interviews each quarter, focusing on navigational tasks on the live site. Their priorities were capturing verbatim quotes and hearing feedback from as many people as possible.

Delivering more value

I noted that this method would make it difficult to assess the topic taxonomy itself, and that for usability testing, five participants are typically sufficient to uncover most usability issues and reach saturation.
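
The five-participant guideline traces to Nielsen and Landauer's problem-discovery model; the discovery rate λ ≈ 0.31 below is their commonly cited average, not a figure from this study:

$$P(n) = 1 - (1 - \lambda)^n, \qquad P(5) = 1 - 0.69^5 \approx 0.84$$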

Instead, I recommended using a mix of moderated usability testing and unmoderated Treejack tasks to validate users’ ability to find information under the 20 topics and their 88 subtopics.

Efficient insights

For the same cost as recruiting 5 people for one-hour interviews, I was able to recruit 35 people for a 15-minute unmoderated task, and each successive quarter I tested a different set of subtopics until all were covered.
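
As a rough sketch of that trade-off (the incentive rates below are purely hypothetical placeholders; real recruitment costs vary by panel, session length and audience):

```python
# Illustrative only: hypothetical incentive rates, not real Askable pricing.
moderated_rate = 100   # hypothetical $ per 60-minute moderated interview
unmoderated_rate = 14  # hypothetical $ per 15-minute unmoderated task

budget = 5 * moderated_rate        # spend equivalent to 5 one-hour interviews
print(budget // unmoderated_rate)  # -> 35 unmoderated participants
```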

Moderating the interviews with Lookback allowed me to take my own notes and use transcripts for analysis, which freed up time to analyse the Treejack responses. It also allowed the content designers working on the project to observe sessions.

  • For each round of interviews, I selected Askable participants so that the cohort represented the diverse population of NSW: a mix of ages, genders, postcodes, employment types and home ownership statuses (where relevant to the topic pages tested). For the Treejack studies, eligible participants were selected automatically.

  • Usability interviews consisted of information-seeking tasks that I observed users complete, mixed with probing questions where needed. I noted assisted or independent task completion, task time and expressions of confusion or frustration.

    Treejack tasks asked participants to indicate the topic under which they would expect to find the answer to a given query. Their responses were then compared to the expected response (based on content live on the website). In some rounds, self-reported difficulty ratings were also captured.

  • I synthesised the interviews by reviewing the transcript generated by Lookback, supplementing it where needed and copying responses into an Excel spreadsheet.

    I organised the quotes according to the task and question asked, adding a column for my summary of the issue, along with observations about the steps the participant took and any unanticipated feedback that was valuable. I colour-coded cells containing recurring themes, which made it easy to ladder up findings across sessions.

    I maintained the same spreadsheet across all four studies so that themes spanning sessions could also be captured. This also allowed for comparison when the testing stimulus included iterations of a design over time.

    For the Treejack results, I recorded the following in the spreadsheet:

    • the question asked and the expected response

    • the most common response (and the % of respondents who answered this way)

    • the second most common response (and the % of respondents who answered this way)

    • the average difficulty rating (where captured)

    I used these to identify where participants were getting confused, where subtopics were closely related or inaccurately labelled for the content they contain, and any other issues, as sketched below.
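
A minimal sketch of that per-question summary, assuming responses arrive as a simple list of chosen topics per task. The function, topic labels and ratings are hypothetical illustrations; Maze’s actual export format differs.

```python
from collections import Counter

# Minimal sketch of the per-question Treejack summary described above.
# Function name, topic labels and ratings are hypothetical illustrations.
def summarise_question(question, expected, responses, difficulty_ratings=None):
    ranked = Counter(responses).most_common(2)  # assumes >= 2 distinct answers
    total = len(responses)
    summary = {
        "question": question,
        "expected response": expected,
        "most common response": ranked[0][0],
        "% most common": round(100 * ranked[0][1] / total),
        "second most common response": ranked[1][0],
        "% second most common": round(100 * ranked[1][1] / total),
    }
    if difficulty_ratings:
        summary["average difficulty"] = sum(difficulty_ratings) / len(difficulty_ratings)
    return summary

# Hypothetical example: 35 respondents, expected answer "Topic A"
print(summarise_question(
    "Where would you expect to find X?",
    expected="Topic A",
    responses=["Topic A"] * 21 + ["Topic B"] * 9 + ["Topic C"] * 5,
    difficulty_ratings=[2, 3, 1, 4, 2],
))
```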

Outcomes and impact

Opportunity to increase browsing and decrease errors

My research demonstrated where information across the 20 topics and their 88 subtopics overlapped. This helped content designers map content that could be categorised under multiple subtopics and ensure crucial linkages were made.

A deeper understanding of how users navigate the content, and of their mental models for categorising topics, allowed designers to structure content in a way that helps site visitors to:

  • take advantage of services, funding and vouchers they were eligible for but may not have known about

  • self-correct errors in navigation. Surfacing and linking related content allows users to find their way if they haven’t found what they were looking for on the first try.

This paved the way for not only a more usable experience but also a more valuable one, making sure public services actually serve the public.

Identifying related content to promote browsing behaviour and discovery of helpful information

The same overlap analysis surfaced useful pages to link in ‘related information’ modules and revealed the multiple paths a user might take to the same information. This enriched the content designers’ understanding of how their users interact with and understand their content.


“Caitlin is a skilful and empathetic researcher, who is able to draw useful insights from a hugely varied range of participants. Every DCU team member who did make the time to observe a moderated session was very impressed with the usability investigations that she led. Caitlin was then able to analyse the results of both the moderated and unmoderated sessions and package them up into reports that not only were able to draw consistent findings from across all participants, but also gave recommendations for specific remediation actions where appropriate.”

NSW.gov.au Digital Channels Unit
