With advances in mobile graphics and machine learning, lifelike augmented reality avatars are becoming a feasible alternative to a human presence in customer-facing interactions. While touchscreen check-in systems are common in receptions, they remove the social element and the ‘face of the company’ identity that a human receptionist provides. The technology now exists for some of this reception functionality to be delivered by a virtual assistant. Using a simulated humanoid character in an augmented reality setting is an attempt to re-humanize this process, and it gives a company more control over the first impression a visitor forms when entering a building.
Our original task was to create a receptionist for use at IBM Headquarters in London. We were told that it should ideally provide information about the company and any events happening in the building, as well as being an impressive demonstration of what IBM technology could do. Additionally, as a stretch goal, we wanted to provide proof-of-concept avatars for other settings, such as a library, museum or healthcare setting, to show that this technology could potentially be used by organizations other than IBM.
We had to consider that our users could be anybody who walks into a building with a reception, and that augmented reality is a relatively new technology that many people have not seen before. It was therefore important to consult with a wide range of users and to make our project as user-friendly as possible.
We knew that not many people have used AR before, and our team had never designed or built an AR experience. We therefore conducted initial user interviews to gain insight into users’ goals and attitudes towards the idea of an AR receptionist in general. We interviewed two classmates, one who had used AR before and one who had not, to gauge how experience level might affect users’ goals and requirements for an AR receptionist.
Se Jin had used AR before but found it buggy and hard to use. He would be very curious about an AR receptionist and would want to talk to it, find out what it could do and push its boundaries. He would ask it to contact someone in the building and would also like to ask it questions about itself.
Eunice had never used AR before and felt she would prefer to interact with a human receptionist. She seemed unsure of how she would like to interact with the avatar. She would want to find out where to go, and also to ask questions about ID badges and access. She would like the avatar to be as human-like as possible.
Following our interviews, we conducted a survey in which we showed potential users several different types of avatar (realistic, cartoon, robot, humanoid, hologram, etc.) to gather information about how users might respond to an AR avatar. This helped us form further requirements about how the avatar should appear and what qualities users look for in a receptionist.
Finally, we built several different prototypes, which we tested with different users (see the HCI page for more detail). This helped us pin down the most important requirements. For example, it was at this stage that we decided the avatar needed to follow the user with its eyes or face: otherwise the avatar appeared very static and it was hard for the user to engage with it.
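The "follow the user" requirement essentially reduces to a look-at rotation: at each frame, turn the avatar's head (or body) towards the camera position reported by the AR framework. The sketch below illustrates the core calculation, assuming positions on a flat ground plane and Unity's convention that a yaw of 0° faces the +z axis; the function name and conventions are our own illustration, not taken from the project code.

```python
import math

def yaw_towards_user(avatar_pos, user_pos):
    """Return the yaw angle (degrees) that turns the avatar to face the user.

    Positions are (x, z) pairs on the ground plane; a yaw of 0 degrees
    faces the +z axis, matching Unity's left-handed convention.
    Illustrative sketch only, not the project's actual implementation.
    """
    dx = user_pos[0] - avatar_pos[0]
    dz = user_pos[1] - avatar_pos[1]
    # atan2(dx, dz) gives the angle from the +z axis towards +x,
    # which is exactly the yaw needed to face the user.
    return math.degrees(math.atan2(dx, dz))

# A user standing directly in front of the avatar needs no turn:
print(yaw_towards_user((0, 0), (0, 2)))  # 0.0
# A user off to the avatar's right requires a 90-degree turn:
print(yaw_towards_user((0, 0), (2, 0)))  # 90.0
```

In practice an engine such as Unity provides this directly (e.g. a look-at rotation applied to the head bone), and the result would be smoothed over time so the avatar turns naturally rather than snapping towards the user.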
During our user interviews, we noticed that our interviewees had quite different attitudes towards the AR receptionist, probably linked to their familiarity with and openness to new technology. We therefore encapsulated these requirements by creating two personas, differentiated by their level of experience with AR and their openness to new technology.
Here we provide a use case list for each type of receptionist and a more detailed description of three specific use cases. In the “Design” section you can see our more detailed use case diagrams for each specific chatbot that we made. N.B. our “administrator” use cases assume a technically capable user: our application does not at this point have a GUI for non-technical users to change the chatbot backend.