Interfaces for trust in self-driving cars

In my master thesis I developed and tested two interfaces for self-driving cars. One uses eye tracking and augmented reality, the other uses LED lights in the dashboard. Both interfaces show the passengers which other road users the car has identified in the surrounding environment.

The augmented reality interface: eye tracking and augmented reality let the user know which objects their vehicle has spotted.
The LED interface: LEDs in the dashboard let the user know which objects their vehicle has spotted.

Simulator development

I developed a simple car simulator setup that could run the interfaces either on video files or on top of the video game Grand Theft Auto V. For video files, an image recognition neural network was deployed to detect objects in real time. For the version running on top of Grand Theft Auto V (GTA), I studied existing GTA mods (user-generated additions to the game) and wrote a program that extracts the locations of road users and drives the required interface in real time.

The simulator running on top of Grand Theft Auto V. Both the LED and AR interfaces are engaged at this moment.
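
The exact components of the video-file pipeline aren't detailed here, but the idea can be sketched roughly as follows: a pretrained object detector runs on every frame and the resulting road-user detections are handed to the interface layer. In the sketch below the detector (torchvision's Faster R-CNN trained on COCO), the input file name, the class ids and the score threshold are all illustrative assumptions, not the components actually used in the thesis.

```python
import cv2
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn
from torchvision.transforms.functional import to_tensor

# Illustrative stand-in for the thesis detector: a pretrained COCO model.
model = fasterrcnn_resnet50_fpn(weights="DEFAULT").eval()

# COCO category ids that correspond to road users.
ROAD_USERS = {1, 2, 3, 4, 6, 8}  # person, bicycle, car, motorcycle, bus, truck

def detect_road_users(frame_bgr, threshold=0.6):
    """Return [x1, y1, x2, y2] boxes of road users in one video frame."""
    rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
    with torch.no_grad():
        pred = model([to_tensor(rgb)])[0]
    return [
        box.tolist()
        for box, label, score in zip(pred["boxes"], pred["labels"], pred["scores"])
        if score >= threshold and int(label) in ROAD_USERS
    ]

cap = cv2.VideoCapture("dashcam.mp4")  # hypothetical input video
while True:
    ok, frame = cap.read()
    if not ok:
        break
    for x1, y1, x2, y2 in detect_road_users(frame):
        # In the real setup these detections would feed the AR or LED
        # interface; here they are simply drawn on the frame.
        cv2.rectangle(frame, (int(x1), int(y1)), (int(x2), int(y2)), (0, 255, 0), 2)
    cv2.imshow("road users", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```

The GTA V version would not need a detector at all: the mods already expose the positions of surrounding pedestrians and vehicles, so the extraction program only has to read those out each frame and pass them to the same interface code.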

Experiments

Two series of experiments were run, involving over 50 participants. The analysis shows that users had more trust in the vehicle when using both interfaces. Tests were also run to look for demographic trends. More on the analysis can be found in the conference paper based on my thesis, listed below.

Publications