03. Moderated User Testing & Analysis

FORD


Role: Writing discussion guides • Participant recruitment • Moderating user testing • Analysis & insight reporting

London, UK 2018

Automotive (Electric)

 

The Challenge


Ford, the American multinational automotive company, was working on concepts to provide customers with an enhanced digital experience.

They wanted to increase customer engagement with their existing app in the UK and Germany by understanding the correlation between driving behaviours and journey data.

Snapshot of moderated user testing

Method & Process

The objective for the user testing sessions was to refine the concepts by assessing the overall desirability, usability and potential for repeat usage of the app's new features. To do this I wanted to have a structured conversation with participants, guiding them through the prototypes and discussing their opinions. The main focus was on how well they understood the information and the design language used to present the data.

The research phase ran in weekly sprints, with user testing carried out simultaneously in two countries. This meant the team needed to be well organised and each aspect of the process well planned. The diagram below illustrates the iterative design process we worked to; my responsibilities included preparing discussion guides, facilitating user tests, analysing raw data and generating insights.

Moderated testing

For each sprint we conducted 10 face-to-face user testing sessions lasting 45–60 minutes each (five in the UK and five in Germany). As a moderator I needed to balance putting participants at ease with adhering to the discussion guide, so the raw data would be as comparable as possible.

To begin each session I would start with a general chat to gather background information on participants' goals, attitudes and behaviours around driving and owning their car. This was a natural segue into introducing the mobile prototype, where each new app feature concept could be demonstrated, participants' understanding of the concepts tested using scenario-based tasks, and their reactions monitored for additional insights.

Another aspect of the moderated sessions was live note taking. Using Lookback we were able to live stream each session, which brought transparency to the research process for the wider team, including clients and stakeholders. The time-stamped notes from the live recordings were an integral part of the later analysis, as they allowed the research team to pull quotes directly from customers.

Snapshot of moderated user testing


Learnings

The main outcome of this project was the insights. To reach this stage, the research team analysed the notes from the recorded user sessions, reviewing and categorising participant statements using the affinity mapping technique.

This technique allowed us to group feedback into the features participants would change or expected to see, potential new feature ideas, things they liked, and any pain points they had with the features overall.

The insights from the analysis were presented formally to stakeholders and clients as a report, then informally to the internal teams in the UK and Germany to further refine the concepts in preparation for the next round of iterations.