human-robot interaction, user research
designing behaviors for a social robot
idea
With mobile robots entering crowded environments such as theme parks, airports and restaurants, we must resolve key interaction and navigation challenges between robots and humans. We tackled these challenges head-on by creating a generalisable mobile robot platform.
challenges
- Enabling the robot to continue moving and adapt to unpredictable people without simply 'freezing'
- Testing and training the system quickly and safely
- Designing the robot to communicate intuitively with primary users and passers-by
approach
I led our human-robot interaction team as part of this project.
This involved coordinating our industrial design, behavioral system refinement, user research and testing. I integrated our work with the software and algorithms, hardware, safety/assurance, and mechanical workstreams to bring the robotic platform together.



when
2020-2022
project type
robot UX design
with
internal team at Cambridge Consultants (later adapted for a logistics client)
tools
Unity, C#, Blender, Illustrator, Mural, Figma
our primary goal was to create an adaptable system that intrinsically builds trust with users
While there was lots to learn from existing research on human-robot interaction, there were specific hypotheses we wanted to test for ourselves. My team coordinated five rounds of research, exploring questions such as:
- What are the fundamental aspects of a trustworthy human-robot interaction platform?
- What characteristics and modalities are effective in conveying trust?
- How can robots convey their intent when moving through a crowd?
- How should our robot interact with primary users during fundamental interaction flows like meet + greet, leading, following, and charging?



Early in the project, COVID-19 hit. We needed to be creative with our testing methods, as we could no longer run in-person tests with real crowds.
Our navigation algorithm was trained with simulated crowds to predict trajectories of people and adapt to new environments.
I also led the creation of a user testing suite in Unity so we could fine-tune robot behaviors for specific human-robot interactions, e.g. how to "coach" users into interacting with and following the robot.
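To give a flavour of how that tuning worked, here is a minimal Unity C# sketch of a tunable "coaching" profile for a leading behavior. The class names (BehaviourProfile, LeadingBehaviour), parameters, and ranges are illustrative assumptions for this write-up, not the project's actual code.

```csharp
// Illustrative sketch only: parameter names and ranges are assumptions,
// not the values used on the project.
using UnityEngine;

// A tunable "coaching" profile a tester could adjust live in the Unity editor
// while a simulated participant followed the robot.
[CreateAssetMenu(menuName = "Robot/BehaviourProfile")]
public class BehaviourProfile : ScriptableObject
{
    [Range(0.1f, 1.5f)] public float leadSpeed = 0.8f;    // m/s while leading
    [Range(0.5f, 3.0f)] public float waitDistance = 1.5f; // pause if user falls behind (m)
    [Range(0f, 180f)] public float headTurnAngle = 45f;   // glance back to re-engage the user
    public float glanceIntervalSeconds = 4f;               // how often to check on the user
}

public class LeadingBehaviour : MonoBehaviour
{
    public BehaviourProfile profile;
    public Transform user;   // simulated participant
    public Transform head;   // swiveling head joint
    float glanceTimer;

    void Update()
    {
        float gap = Vector3.Distance(transform.position, user.position);

        // Slow down and wait when the participant drops too far behind.
        float speed = gap > profile.waitDistance ? 0f : profile.leadSpeed;
        transform.Translate(Vector3.forward * speed * Time.deltaTime);

        // Periodically glance back to "coach" the user to keep following.
        glanceTimer += Time.deltaTime;
        if (glanceTimer >= profile.glanceIntervalSeconds)
        {
            glanceTimer = 0f;
            head.localRotation = Quaternion.Euler(0f, profile.headTurnAngle, 0f);
        }
        else if (glanceTimer >= 1f)
        {
            head.localRotation = Quaternion.identity; // face forward again
        }
    }
}
```

Exposing these values as a ScriptableObject meant a tester could swap or tweak profiles between simulated runs without touching the behavior code.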
Meanwhile, in the real world, we created works-like and looks-like prototypes. For user testing certain behaviors, we even stuck a cardboard model with basic swiveling capabilities and an iPad onto my robot vacuum. Hey, it worked.
Using simulation for rapid (user) testing
Communicating through emotion
To quickly convey the robot's limited capabilities to passers-by whilst building trust, I followed the principle of minimum modality, maximum expressibility. A single turning "head", animated eyes, non-verbal sounds, and even movement qualities like speed can do a lot to convey emotion.
I led a robot personality workshop with our team, adapting chatbot design techniques as well as Minja Axelsson's social robot design canvases.
Our initial robot concept, Abe, was born, and we created an emotion system for him to help convey key states to users, such as tiredness (low on battery), agitation (awaiting user input), and joy.
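Here is a rough sketch of that emotion mapping, assuming Unity C#; the state names, thresholds, and audio/animation hooks are illustrative rather than the values we shipped.

```csharp
// Minimal sketch: thresholds and hooks are illustrative assumptions.
using UnityEngine;

public enum Emotion { Neutral, Joy, Tiredness, Agitation }

public class EmotionSystem : MonoBehaviour
{
    [Range(0f, 1f)] public float batteryLevel = 1f;
    public bool awaitingUserInput;
    public bool taskCompleted;

    public Animator eyes;       // animated eyes on the "head" screen (triggers named after emotions)
    public AudioSource voice;   // non-verbal sounds
    public AudioClip joyChirp, tiredSigh, agitatedBeep;

    Emotion current = Emotion.Neutral;

    void Update()
    {
        Emotion next = Evaluate();
        if (next != current)
        {
            current = next;
            Express(next);
        }
    }

    // Map internal robot state to the emotion shown to users.
    Emotion Evaluate()
    {
        if (batteryLevel < 0.2f) return Emotion.Tiredness;  // low on battery
        if (awaitingUserInput) return Emotion.Agitation;    // prompting the user
        if (taskCompleted) return Emotion.Joy;
        return Emotion.Neutral;
    }

    // Express the emotion through the eyes and non-verbal sound.
    void Express(Emotion e)
    {
        eyes.SetTrigger(e.ToString());
        switch (e)
        {
            case Emotion.Joy: voice.PlayOneShot(joyChirp); break;
            case Emotion.Tiredness: voice.PlayOneShot(tiredSigh); break;
            case Emotion.Agitation: voice.PlayOneShot(agitatedBeep); break;
        }
    }
}
```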
Integrating the whole robotic system
A large part of the challenge was continuously integrating the human-robot interaction learnings into the wider robotic system. Our algorithms lead and I developed a subsystem that takes real inputs from the robot's sensors and navigation system and adapts behaviors in real time. This was tested in simulation and encoded into the software architecture.
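The simplified sketch below shows the shape of that interface, reusing the illustrative types from the earlier sketches; the NavigationState fields and thresholds are hypothetical, and the production subsystem lived in the robot's own software stack rather than Unity.

```csharp
// Simplified, hypothetical sketch of the navigation-to-behavior interface.
using UnityEngine;

public struct NavigationState
{
    public int nearbyPeople;   // from the perception pipeline
    public float clearance;    // distance to the closest predicted person trajectory (m)
    public bool goalReached;
}

public class BehaviourAdapter : MonoBehaviour
{
    public LeadingBehaviour leading;
    public EmotionSystem emotions;

    // Called whenever the navigation system publishes a new state,
    // so expressive behavior tracks what the robot is actually doing.
    public void OnNavigationUpdate(NavigationState state)
    {
        // Slow down in dense crowds; speed back up when clearance opens.
        float crowdFactor = Mathf.Clamp01(state.clearance / 2f);
        leading.profile.leadSpeed = Mathf.Lerp(0.3f, 1.0f, crowdFactor);

        if (state.goalReached)
            emotions.taskCompleted = true;
    }
}
```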
"Mary brought new perspectives and insightful research to our team. [This] project would not have been as successful... without Mary’s contribution – it was particularly good to see how she integrated well into a multidisciplinary team, and was able to work with and communicate deeply with engineers from outside her core area."
Chris Roberts
Head of Robotics and Automation, Cambridge Consultants
Building on this work, Cambridge Consultants created an industrial-design-focused case study. Read more about it on Behance!