Keywords: Smart Cars | Mixed Reality | Gaze Tracking
This project analysed a driver's gaze point under varying road types, weather, lighting, and other conditions. I completed it during an internship at CSIRO's QCAT. My solution was a computer-vision-based system that combined data from an external eye tracker with telemetry from the driving simulator to estimate and overlay the driver's gaze point in real time. We observed that the gaze point was strongly affected by lighting and weather conditions while driving. We linked gaze-point transitions to the driver's confidence level and showed that when confidence was low, the gaze point stayed at a closer point in front of the car and was less responsive to rapid movements in the environment.
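The write-up does not detail the overlay geometry, but the core step of such a system is typically intersecting the tracked gaze ray with the display plane to get a pixel position. A minimal sketch of that idea, with all function and parameter names being my own illustrative choices (not from the actual project):

```python
import numpy as np

def gaze_to_screen(gaze_origin, gaze_dir, screen_origin, screen_x, screen_y,
                   screen_w_px, screen_h_px):
    """Intersect a gaze ray with a planar screen; return pixel coordinates.

    gaze_origin, gaze_dir -- eye position and unit gaze direction (tracker frame)
    screen_origin         -- top-left corner of the screen, same frame
    screen_x, screen_y    -- screen edge vectors (width and height, in metres)
    """
    normal = np.cross(screen_x, screen_y)
    denom = np.dot(gaze_dir, normal)
    if abs(denom) < 1e-9:
        return None  # gaze ray is parallel to the screen plane
    t = np.dot(screen_origin - gaze_origin, normal) / denom
    if t < 0:
        return None  # screen lies behind the viewer
    hit = gaze_origin + t * gaze_dir
    rel = hit - screen_origin
    # Express the hit point in screen-relative [0, 1] coordinates
    u = np.dot(rel, screen_x) / np.dot(screen_x, screen_x)
    v = np.dot(rel, screen_y) / np.dot(screen_y, screen_y)
    if not (0.0 <= u <= 1.0 and 0.0 <= v <= 1.0):
        return None  # gaze falls off-screen
    return (u * screen_w_px, v * screen_h_px)
```

In a live system, the simulator's telemetry (camera pose, field of view) would then map this screen point into the rendered scene before drawing the overlay marker.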

A proud achievement of mine in this project is a subsystem I created that can transform real-world map data into a virtual track in the simulator. I used it to drive around the streets of my childhood home, the loopy highways of Tokyo, and the infamous Parramatta Road in Sydney, all from my lab desk. This feature was not part of the project brief; it grew out of my own curiosity.
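The subsystem's internals aren't described here, but a common first step for map-to-track conversion is projecting latitude/longitude waypoints (for example, a road centreline exported from OpenStreetMap) into local planar metres that a simulator can consume. A minimal sketch under that assumption, using a simple equirectangular projection about the first waypoint (the function name and approach are illustrative, not the project's actual code):

```python
import math

def latlon_to_track(waypoints):
    """Project (lat, lon) waypoints to local (x, y) metres.

    Uses an equirectangular approximation centred on the first waypoint,
    which is accurate enough over track-sized areas where Earth
    curvature is negligible.
    """
    R = 6371000.0  # mean Earth radius, metres
    lat0, lon0 = waypoints[0]
    lat0_rad = math.radians(lat0)
    track = []
    for lat, lon in waypoints:
        x = R * math.radians(lon - lon0) * math.cos(lat0_rad)  # east
        y = R * math.radians(lat - lat0)                       # north
        track.append((x, y))
    return track
```

The resulting polyline can then be extruded into road geometry inside the simulator's track format.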