AI-Driven HR Tech Solution


Video Interviews
with AI Analytics

Platform Web & Mobile
Role UX Design, UI Design, Usability Testing
Team 5 people including the CTO and 3 developers
Duration 3 months

Overview


This project is part of my job as Product Designer at Neufast, a startup based in Hong Kong building online video interview software with AI analytics.

I joined the team to design a solution to the low interview completion rate problem, among a range of other projects. I transformed the interview experience from rigid and nerve-wracking to stress-free and easy to use. My final design has been deployed in the product and has increased the interview completion rate by 10%.



Working as the founding designer in a small startup, I was lucky enough to own and lead the design process with guidance from the CTO and support from the development team. I also had the chance to work on sales and marketing and reflect on their connection with UX.

Problem


The online video interview solution provided by Neufast helps job seekers interview during a pandemic, but the experience is often nerve-wracking and confusing for interviewees when their interviewer is an AI algorithm hidden behind a webpage, and worse still, the user experience is disjointed and system visibility is low. This led to a low interview completion rate of 60%, which is a disadvantage for our clients: the HR departments of small and medium-sized enterprises looking for talent.



Design Goal


1) Pinpoint the reasons why people give up on the interviews;
2) Redesign a more enjoyable system that makes interviewees stay (and attracts more talent for our clients).

Research

Problem Discovery

To understand the problem in more depth, I spoke with the CTO and learned that three recurring issues kept showing up when he observed interviewees using the interface:

  • Candidates were not sure when the recording really started and whether their answers were properly captured.
  • Interviewees didn’t understand the “preparation time” before they started recording their responses.
  • They said the interview experience was stressful and scary.

Competitive Research

Since video interviews are becoming more and more common, there might be an existing interaction model or best practices that users expect. I studied the main video interview products on the market to understand the strengths and weaknesses of their UX.


Due to the urgency of the issues and our agile work style, I initially proposed some design solutions based on my assumptions, given the information known so far:

1. Added a countdown function.
    Reason: No obvious indication of when recording starts
    Assumption: A countdown can clearly indicate the start of recording


2. Added pop-up windows with preparation time instructions.
    Reason: No easily understandable explanation of what preparation time is
    Assumption: A pop-up window with a recognizable image can help explain it


3. Added a virtual interviewer pop-up window.
    Reason: Lack of information about the interviewer
    Assumption: A pop-up window styled like a dialogue box is more engaging


Usability Tests

I wanted to find out why users felt the experience was confusing and scary. I recruited two current job seekers for a first round of user testing and found that:

  • Both users commented that pop-up windows are annoying;
  • Both thought “preparation time” was when they should immediately start answering questions;
  • Both users also said they felt pressure to answer the question on the screen when they saw the interviewer’s photo staring at them.

I used the Microsoft Desirability Toolkit to test the visual appeal of the interface, and found that users described the interface as “clean” and “organized”, but also “stressful”, “unappealing”, and “impersonal”.

User Journey

I synthesized the main pain points in a user journey map, in order to see and communicate more clearly how they emerge throughout the multi-step interview process:


Research Synthesis

1. The main reasons users give up on the interviews are usability issues, especially with the recording interface and the preparation time function, as well as unappealing UI design.

2. Implications for redesign:
   
  • The whole look of our web app needs to be redesigned, and should serve the purpose of relaxing users in an already nerve-wracking process;
  • Any hiccups on the recording screen should be eliminated, or made invisible if they do not affect the actual interview; otherwise users will think the system is broken;
  • What users are expected or recommended to do during preparation time needs to be communicated in a straightforward way that does not require pop-up windows or lengthy instruction text.

Design

Friendlier Design System

Both the B2B and B2C interfaces previously used Material Design, which has a clean look but lacks a human touch. To make the system look more relaxing and fun, I discussed with the developers and we decided to keep Material Design for the information-rich B2B (HR-facing) interface and to build a new custom design system for the B2C (interviewee) interface. I incorporated more rounded edges and irregular shapes, as in the virtual interviewer introduction window, and made illustrations to liven up the dull, rigid UI. At the time, I was reading the book Atomic Design, and the concept of atomic design helped me greatly in creating a design system from scratch.
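
As a rough sketch of what that atomic layering can look like in practice (assuming a React + TypeScript stack; the component names, class names, and illustration path here are purely illustrative, not the product code), small atoms such as buttons and illustrations compose into molecules like the interviewer introduction card:

    // Illustrative atomic-design sketch, not the actual product components.
    import React from "react";

    // Atom: a rounded, friendly button.
    const RoundedButton = (props: { label: string; onClick: () => void }) => (
      <button className="rounded-button" onClick={props.onClick}>
        {props.label}
      </button>
    );

    // Atom: an illustration slot that softens the otherwise rigid UI.
    const Illustration = (props: { src: string; alt: string }) => (
      <img className="illustration" src={props.src} alt={props.alt} />
    );

    // Molecule: the virtual interviewer introduction card, composed from atoms.
    export const InterviewerIntroCard = (props: { name: string; onStart: () => void }) => (
      <div className="intro-card">
        <Illustration src="/illustrations/interviewer.svg" alt="Virtual interviewer" />
        <p>Hi, I'm {props.name}. I'll guide you through this interview.</p>
        <RoundedButton label="Start interview" onClick={props.onStart} />
      </div>
    );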


Users eventually described the new design as “clean,” “convenient,” and “friendly/appealing.”

Before
Rigid UI communicated seriousness and made interviewees nervous
After
Softer designs, shadows, and illustrations make the interface more dynamic, fun, and appealing
Countdown

Users were confused because the video recording system shows a brief lag before recording actually starts. I designed a countdown that begins right after users press the record button, both to provide more system visibility and to mask the technical lag (a rough sketch of the flow appears below).

Before
Interviewees worried whether their response was properly recorded, and became even more stressed due to this uncertainty
After
Smooth process with no confusion, validated through usability tests
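
To make the interaction concrete, here is a minimal sketch of how such a countdown could be wired in front of the browser's MediaRecorder API. This is an illustration only, not the deployed code; showCountdown is a hypothetical UI callback.

    // Minimal sketch: run a visible countdown, then start recording and only
    // report "live" once the recorder has actually started.
    async function startAnswerRecording(
      stream: MediaStream,
      showCountdown: (secondsLeft: number) => void, // hypothetical UI callback
    ): Promise<MediaRecorder> {
      // The visible 3-2-1 countdown signals that recording is about to begin
      // and absorbs any lag while the recorder initializes.
      for (let s = 3; s > 0; s--) {
        showCountdown(s);
        await new Promise<void>((resolve) => setTimeout(resolve, 1000));
      }
      const recorder = new MediaRecorder(stream);
      return new Promise<MediaRecorder>((resolve) => {
        // Resolve only when the browser confirms recording has started, so the
        // UI never claims to be recording before it really is.
        recorder.onstart = () => resolve(recorder);
        recorder.start();
      });
    }
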
Preparation Time

Preparation time is a tricky one: it is a friendly feature that gives interviewees more time to structure a response, but our users kept ignoring it and started answering right away. By observing users and talking with them, I found that to nervous interviewees, "the interviewer staring quietly at them" means "their chance to talk". With this in mind, I decided to grey out the virtual interviewer during preparation time and overlay text explaining what preparation time is (illustrated in the sketch below). Usability tests showed this to be a better solution than explanatory pop-up windows, which users found annoying.

Before
Interviewees often started answering before preparation time was over, and they felt pressured to answer when they saw the interviewer staring at them
After
Users still hesitated a bit during preparation time, but were able to understand its purpose
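
As a rough illustration of the state change (not the product code), the interface can treat preparation and answering as two explicit phases and derive the interviewer pane's appearance from the current phase; all names and values here are assumptions:

    // Illustrative sketch: derive the interviewer pane's appearance from the
    // current phase of a question. Names and values are assumptions.
    type QuestionPhase = "preparation" | "answering";

    function interviewerPaneState(phase: QuestionPhase, prepSecondsLeft: number) {
      if (phase === "preparation") {
        return {
          // Grey out the virtual interviewer so it no longer reads as
          // "your turn to speak".
          interviewerOpacity: 0.3,
          overlayText:
            `Preparation time: ${prepSecondsLeft}s to think about your answer. ` +
            "Recording has not started yet.",
        };
      }
      // During answering, show the interviewer normally and remove the overlay.
      return { interviewerOpacity: 1, overlayText: null };
    }
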
Virtual Interviewer

In collaboration with the CEO, I designed the virtual interviewer function, where HR staff can say hi to the interviewee digitally and optionally record a video of themselves asking the questions, making the process more personal.

Before
Recruiters had limited options to make the interview feel more human
After
Recruiters can now extend a warm welcome to the firm "in person"
Design Validation

I validated my design through usability tests, and the results show that the countdown feature significantly reduced confusion and that the new design system was perceived as accessible and friendly. Although the preparation time feature still caused confusion for one user, the new design does appear to help users understand the function better. The detailed results are as follows:




Future Steps


The visual redesign was the most successful part. The “preparation time” function still risks discouraging interviewees; it needs to be made even more intuitive or removed, since interviewees tend to be stressed during an interview, and learning a new feature is a significant cognitive load for them.




What I Learned


  • The necessity of checking in directly and more frequently with developers, since programming is much more “expensive” than design (it’s easier to change a design mockup)

  • The importance of developing my own structured design process and knowing when exactly to test and validate designs, in a startup/agile development environment with minimal guidance

  • The importance of being a design advocate as the single designer in a startup (taking initiative to communicate, asking a lot of questions, and communicating the importance of interviewing real users)

  • This product also prompted me to think about AI ethics. On the one hand, AI analytics seems to provide a fair evaluation based on consistent standards, better than the sometimes biased individual recruiter; on the other hand, data used to train AI models can be biased and candidates, especially those with disabilities, can be treated unfairly. UX certainly has a role to play in making the hidden rules as transparent as possible and helping candidates bring out their best selves in a stress-free environment.