• Year: 2020

Project

The goal of this project is to develop next-generation smart perception sensors and enhance the distributed intelligence paradigm to build versatile, secure, reliable, and proactive human monitoring solutions for the health, wellbeing, and automotive domains.

The NextPerception project intends to make a leap beyond the current state of the art of sensing and to achieve a higher level of services based on information obtained by observing people and their environment. We envision that this leap requires paradigm shifts at the following three conceptual levels of sensor system technology development:

Advanced Radar, LiDAR and Time of Flight (ToF) sensors form the eyes and ears of the system. They will be specifically enhanced in this project to observe human behaviour and health parameters and will be combined with complementary sensors to provide all the information needed in the use cases. Their smart features enable easy integration as part of a distributed intelligent system.

This emerging paradigm allows the analytics and decision-making processes to be distributed across the system to optimise its efficiency, performance, and reliability. We will develop smart sensors with embedded intelligence and facilities for communicating and synchronising with other sensors, and we will provide the most important missing toolsets, i.e., those for programming networks of embedded systems, for explainable AI, and for distributed decision making.
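To make the distributed decision-making idea concrete, the sketch below shows one simple way local estimates from several smart sensor nodes could be combined. The node names, the confidence-weighted averaging scheme, and all identifiers are illustrative assumptions, not the project's actual architecture.

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    node_id: str       # which smart sensor node produced the estimate
    estimate: float    # local estimate in [0, 1], e.g. P(driver is drowsy)
    confidence: float  # self-reported confidence in [0, 1]

def fuse(readings):
    """Confidence-weighted average of the nodes' local estimates."""
    total = sum(r.confidence for r in readings)
    if total == 0:
        return 0.5  # no information available: fall back to an uninformative prior
    return sum(r.estimate * r.confidence for r in readings) / total

# Hypothetical readings from three smart sensor nodes.
readings = [
    SensorReading("camera", 0.8, 0.9),   # eye-closure detector
    SensorReading("radar", 0.6, 0.5),    # vital-sign monitor
    SensorReading("can_bus", 0.7, 0.7),  # steering-pattern analysis
]
score = fuse(readings)
```

Because each node only ships a compact (estimate, confidence) pair rather than raw data, this style of fusion keeps bandwidth low and lets the decision logic run wherever in the network it is most efficient.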

Smart sensors and distributed intelligence will be applied to understand human behaviour and to provide the desired predictive and supportive functionality in the contexts of interest, while preserving users’ privacy.

Driver monitoring

This use case investigates and defines smart solutions intended to mitigate the risk of accidents and their consequences for individuals and, indirectly, for society (e.g. reducing the costs of fatal and disabling injuries due to road crashes). In particular, this use case (UC) focuses on defining innovative sensing solutions to monitor driver status and behaviour. In addition, the emotional state of the driver will be addressed in the specific application scenario for this UC, which considers private vehicles.

 

All these applications require video and audio acquisition and processing to detect relevant information, as well as data collection. Moreover, distributed low-power sensors can be used to improve detection accuracy and to overcome the limitations of a video-only system.

 

The main goal is to develop a Driver Monitoring System (DMS) that can classify both the driver’s cognitive states and the driver’s emotional states, as well as the activities and positions of occupants (including the driver) inside the vehicle cockpit. Examples of cognitive states are distraction (in all its forms), fatigue, workload, and drowsiness; examples of emotional states include anxiety, panic, anger/aggressiveness, and so on.

In order to achieve this, several sensors and sources of information will be considered:

• Camera-based measures, such as eye tracking, face and expression recognition, etc.

• Physiological signals, such as heart rate, blood pressure, ECG/EEG, skin conductance, and so on.

• Health-related analysis based on mm-wave radar(s), in combination with other sensor types.

• Vehicle dynamics, such as speed, yaw rate, steering wheel angle, and accelerator and braking behaviour, acquired by ad hoc sensors and from data already available on the vehicle (CAN bus).

• Driving context, such as obstacle position and velocity, the position of the ego-vehicle in the lane, and so forth, possibly supplied by a simulator.
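The sketch below illustrates how features drawn from the sources listed above could be flattened into one feature set and mapped to a driver state. The feature names, thresholds, and the toy rule-based model are illustrative placeholders standing in for a learned classifier; none of them are the project's actual DMS.

```python
def extract_features(sample):
    """Flatten heterogeneous sensor inputs into a single feature dict."""
    return {
        "eye_closure_ratio": sample["camera"]["perclos"],               # camera-based measure
        "heart_rate": sample["physio"]["heart_rate"],                   # physiological signal
        "steering_reversals": sample["vehicle"]["steering_reversals"],  # CAN-bus vehicle dynamics
        "lane_offset": sample["environment"]["lane_offset"],            # driving context
    }

def classify_state(features):
    """Toy rule-based stand-in for a trained driver-state classifier."""
    if features["eye_closure_ratio"] > 0.3 and features["steering_reversals"] > 5:
        return "drowsy"
    if features["heart_rate"] > 110 and abs(features["lane_offset"]) > 0.5:
        return "stressed"
    return "alert"

# Hypothetical sample combining all four source categories.
sample = {
    "camera": {"perclos": 0.35},
    "physio": {"heart_rate": 72},
    "vehicle": {"steering_reversals": 8},
    "environment": {"lane_offset": 0.1},
}
state = classify_state(extract_features(sample))
```

The point of the separate extraction step is that each source category can be swapped or degraded (e.g. camera occluded at night) without changing the classifier interface, which is what makes multi-source fusion more robust than a video-only pipeline.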
