
nuReality

Launch Date

September 2023

Role

Expressive Robotics Researcher

Related Skills

Product Design, Project Management, Usability Testing, Factory Automation

This project was published and open-sourced as part of my summer internship as an expressive robotics researcher at Motional, an autonomous vehicle company.

BACKGROUND

About Motional

Motional is a joint venture between automotive technology expert Aptiv and vehicle manufacturing leader Hyundai Motor Group. By partnering with major ride-hail companies, including Lyft, Uber Eats, and Via, Motional is developing and commercializing SAE Level 4 AVs for autonomous ride-hail and delivery. 


One of the big questions in the realm of autonomous vehicles (AVs) is: in the absence of a driver, how can a vehicle express its intended behavior to drivers and pedestrians? This is where expressive robotics comes into play.


Expressive Robotics & nuReality

In short, expressive robotics is the study and practice of making robots respond to scenarios the way a human might; in other words, making a robot less robotic. The ultimate goal of expressive robotics is to make human-robot interactions simple, familiar, and intuitive by using behaviors that already align with what someone would expect.


To investigate how expressive behaviors could help pedestrians readily recognize the intent of a driverless vehicle, Motional collaborated with the CHRLX animation studio to create a virtual reality (VR) environment for testing pedestrian reactions to different expressive AV braking behaviors. This environment was named "nuReality".


Goal

The goal of the nuReality project was to publish the findings of the VR participant study and to make the nuReality resources open source, so that academics can use them for pedestrian interaction studies.

I wrote this article in collaboration with my co-worker JiHyun Jeong, based on the research findings from a within-subjects study run by the previous group of interns. The experiment exposed participants, inside the nuReality virtual environment, to different expressive vehicle braking behaviors and measured how quickly they were willing to cross the road, as well as their feelings of safety, confidence, and intention understanding.


The findings suggest that expressive behaviors such as easing into a full stop or stopping farther away can help pedestrians make quicker decisions to cross the road. Additionally, stopping farther away from the pedestrian also resulted in higher subjective ratings of safety, confidence, and intention understanding.
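
To make the comparison concrete, here is a minimal sketch of how crossing-decision times from a within-subjects study like this could be compared across two braking behaviors. The file name, column names, condition labels, and the choice of a paired t-test are assumptions for illustration only, not the analysis reported in our paper.

# Hypothetical sketch: paired comparison of crossing-decision times between
# two braking behaviors in a within-subjects design. The CSV layout, column
# names, and condition labels below are assumptions for illustration.
import pandas as pd
from scipy import stats

# Expected layout: one row per participant per condition,
# e.g. participant, condition, crossing_time_s
df = pd.read_csv("crossing_times.csv")

# Pivot so each participant contributes one crossing time per condition
wide = df.pivot(index="participant", columns="condition", values="crossing_time_s")

# Compare a baseline "hard_stop" against an expressive "gradual_stop"
t_stat, p_value = stats.ttest_rel(wide["hard_stop"], wide["gradual_stop"])
print(f"hard_stop mean: {wide['hard_stop'].mean():.2f} s")
print(f"gradual_stop mean: {wide['gradual_stop'].mean():.2f} s")
print(f"paired t = {t_stat:.2f}, p = {p_value:.3f}")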


Our paper was accepted by IEEE Robotics and Automation Letters and published in April 2022 (Volume 7, Issue 2).

nuREALITY

Below is a sample of the nuReality Environment.

Now that we had this virtual reality environment, it didn't seem fair to keep it to ourselves. We recognized that the academic community has a growing interest in studying the critical issues of safety and psychology surrounding AV-pedestrian interactions, and that we could support those efforts by making the environment and the related vehicle behavior scenarios open source.


To prepare this environment for open-source use, I collaborated with CHRLX (the animation studio that created the environment) as well as Motional's legal team, marketing team, and frontend engineers.


I organized the 10 unique vehicle behavior scenarios, along with the Unreal Engine and Autodesk Maya source files for each, in AWS S3 so that users can download them easily. Additionally, I designed the nuReality website, which hosts the downloadable files and provides information about the project.
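
As a rough illustration of how those hosted files could be fetched programmatically, here is a small Python sketch using boto3. The bucket name, prefix layout, and public-read access are assumptions for illustration; the actual files are linked from the nuReality website.

# Hypothetical sketch: downloading one scenario's source files from S3.
# The bucket name and prefix below are made up; the real downloads are
# served through the nuReality website.
import os
import boto3
from botocore import UNSIGNED
from botocore.client import Config

BUCKET = "nureality-scenarios"        # hypothetical bucket name
PREFIX = "scenarios/scenario_01/"     # hypothetical per-scenario prefix

# Anonymous client, assuming the bucket allows public reads
s3 = boto3.client("s3", config=Config(signature_version=UNSIGNED))

paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=BUCKET, Prefix=PREFIX):
    for obj in page.get("Contents", []):
        key = obj["Key"]  # e.g. an Unreal Engine or Autodesk Maya source file
        local_path = os.path.join("downloads", key)
        os.makedirs(os.path.dirname(local_path), exist_ok=True)
        s3.download_file(BUCKET, key, local_path)
        print(f"downloaded {key}")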
