NextPerception’s Impact on Road Safety and Driver Well-being

Next Perception UC2 Team

Most road accidents result from driver error: according to the National Highway Traffic Safety Administration (NHTSA), over 90% of all accidents are caused by human error. These errors include distracted driving, drowsiness, and impaired driving, among others. However, the spectrum of driver mistakes extends beyond these evident examples to other behaviours and conditions that lead to poor judgements and decisions, and ultimately to accidents.


As mentioned in our previous section, the NextPerception project focuses on identifying cognitive and visual distractions, as well as drivers’ emotional states and arousal levels. To achieve this, the Driver Monitoring System (DMS) uses sensors to capture various data: the driver’s video stream for facial expressions and gaze direction, driver temperature from a thermal camera as an indicator of arousal, driving-performance metrics for behaviour analysis, and additional data from biometric sensors such as heart-rate chest bands. Together, these inputs build a comprehensive picture of the driver’s complex state.
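As a rough illustration of how such multimodal inputs might be fused into a single driver-state estimate, consider the sketch below. The channel names, weights, and thresholds are purely hypothetical assumptions for illustration, not the project’s actual model or API:

```python
from dataclasses import dataclass

@dataclass
class SensorReadings:
    """Normalized readings in [0, 1]; higher means greater risk.
    Field names and scales are illustrative, not NextPerception's."""
    gaze_off_road: float      # fraction of recent time gaze was off the road
    facial_drowsiness: float  # drowsiness score from facial-expression analysis
    thermal_arousal: float    # deviation of facial temperature from baseline
    hr_stress: float          # stress index from the heart-rate chest band
    lane_deviation: float     # driving-performance metric

# Hypothetical per-channel weights for a simple weighted average.
WEIGHTS = {
    "gaze_off_road": 0.30,
    "facial_drowsiness": 0.25,
    "thermal_arousal": 0.15,
    "hr_stress": 0.15,
    "lane_deviation": 0.15,
}

def driver_state_score(r: SensorReadings) -> float:
    """Combine the channels into one fused risk score in [0, 1]."""
    return sum(w * getattr(r, name) for name, w in WEIGHTS.items())

def classify(score: float) -> str:
    """Map the fused score onto coarse states (thresholds illustrative)."""
    if score < 0.3:
        return "safe"
    if score < 0.6:
        return "degraded"
    return "critical"
```

In practice a real DMS would use learned models rather than fixed weights, but the shape of the pipeline, per-sensor features reduced to a common scale and then fused, is the same.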


In a practical application within the RE:LAB driving simulator, the systems produced promising results and valuable insights for discussion. Further experimental recordings are nevertheless necessary to evaluate the robustness and effectiveness of the proposed systems and to enhance driver state detection, for example by involving larger sample sizes and incorporating other tasks related to visual, cognitive, and emotional conditions. Future studies could also focus on developing and integrating multi-person state detection. The NextPerception Human-Machine Interface (HMI) has undergone prototype testing; its full development requires additional design and testing phases to fine-tune parameters such as timing thresholds. An evaluation with real users will be crucial to assess the effectiveness of the recovery strategies designed to guide the driver’s state back toward a safe zone.

The project has successfully demonstrated the ability to identify shifts in a driver’s state, a capability crucial for preventing accidents and enhancing road safety. The synchronization of multiple data-acquisition systems in a distributed architecture enhances the reliability and performance of the integrated sensors and AI algorithms.

The positive reception of driver state support systems indicates a growing acceptance of advanced safety technologies among drivers. Visual HMI, ambient lighting, and voice assistants not only enhance the driving experience but also effectively deliver important warning messages, aligning with user preferences and expectations. This positive feedback underscores the significance of these features in improving overall vehicle usability and safety.

