Announcement


[PhD] Driver State and Perception Analysis for Shared Driving with an Autonomous System

18 April 2025


Category: PhD Positions

PhD Title: Driver State and Perception Analysis for Shared Driving with an Autonomous System

Host Laboratory:
Research Unit: Heudiasyc UMR 7253 (UTC)
Research Team: SyRI (Robotic Systems in Interaction)
Website: https://www.hds.utc.fr/recherche/equipes-de-recherche/syri-systemes-robotiques-en-interaction/

PhD Supervisors:

  • Insaf Setitra (Associate Professor, co-supervisor) – SyRI Team, Heudiasyc Lab UMR-CNRS 7253, UTC
  • Philippe Bonnifait (Professor, co-supervisor) – SyRI Team, Heudiasyc Lab UMR-CNRS 7253, UTC
  • Véronique Cherfaoui (Professor, co-advisor) – SyRI Team, Heudiasyc Lab UMR-CNRS 7253, UTC

Fields of Expertise:
Computer Science, Electronics, Mathematics

PhD Project Description:
Ensuring safe transitions and effective shared control between an autonomous driving system and a human driver is a critical challenge in vehicle automation. Meeting it requires assessing the driver's engagement in the driving task, and gaze analysis has emerged as a key indicator of driver attention and engagement (Deo & Trivedi, 2020; Kim et al., 2022).

This PhD aims to develop a driver monitoring system based on monocular vision that aligns human perception with the cyber-physical representation of the vehicle (World Model – WM). The research focuses on:
(1) Gaze detection using monocular cameras (Kim et al., 2022),
(2) Integration of saliency into the WM to highlight critical areas and the Most Important Object (MOI) (Jha et al., 2023; Yoo et al., 2021), and
(3) Managing uncertainty in gaze analysis and control transitions.

By estimating driver engagement, the system will support both binary transitions and cooperative shared control.
The approach involves real-time gaze tracking, estimation of driver engagement state, and experimental validation with calibration steps on an autonomous vehicle, thereby improving human-machine teaming for safer driving.
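As a purely illustrative sketch of the engagement-estimation idea (not part of the project specification), one could approximate driver engagement from the fraction of recent gaze samples that fall on road-relevant regions. All class names, window sizes, and thresholds below are hypothetical:

```python
from collections import deque

class EngagementEstimator:
    """Toy estimator: engagement as the fraction of recent gaze samples
    falling on road-relevant regions (sliding window)."""

    def __init__(self, window_size=30, threshold=0.6):
        self.samples = deque(maxlen=window_size)  # booleans: gaze on road?
        self.threshold = threshold                # hypothetical cutoff

    def update(self, gaze_on_road: bool) -> float:
        """Add one gaze sample; return the current on-road ratio."""
        self.samples.append(gaze_on_road)
        return sum(self.samples) / len(self.samples)

    def is_engaged(self) -> bool:
        """Binary engagement decision for transition logic."""
        if not self.samples:
            return False
        return sum(self.samples) / len(self.samples) >= self.threshold

# Example: driver looks at the road in 8 of the last 10 frames
est = EngagementEstimator(window_size=10, threshold=0.6)
for on_road in [True] * 8 + [False] * 2:
    ratio = est.update(on_road)
print(ratio)             # 0.8
print(est.is_engaged())  # True
```

A real system would of course replace the boolean input with gaze regions estimated from the monocular camera and weight them by saliency in the World Model; this sketch only shows the windowed-decision structure.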

References:

  • Deo, N. and Trivedi, M. M. (2020). Looking at the driver/rider in autonomous vehicles to predict takeover readiness. IEEE Transactions on Intelligent Vehicles, 5(1):41–52.
  • Jha, S. and Busso, C. (2023). Estimation of driver's gaze region from head position and orientation using probabilistic confidence regions. IEEE Transactions on Intelligent Vehicles, 8(1):59–72.
  • Kim, J., Kim, W., Kim, H.-S., Lee, S.-J., Kwon, O.-C., and Yoon, D. (2022). A novel study on subjective driver readiness in terms of non-driving related tasks and take-over performance. ICT Express, 8(1):91–96.
  • Yoo, S., Jeong, S., Kim, S., and Jang, Y. (2021). Saliency-based gaze visualization for eye movement analysis. Sensors, 21(15):5178. https://doi.org/10.3390/s21155178

Keywords:
Driver monitoring system, Gaze estimation, Saliency detection, World Model, Shared driving, Takeover readiness.

Candidate Profile:
The ideal candidate should hold a Master’s degree (or equivalent) in computer science, artificial intelligence, robotics, or a related field, with strong skills in computer vision (2D, 3D), deep learning, and geometric modeling. Proficiency in machine learning frameworks (e.g., PyTorch, TensorFlow), Python programming, and experience in object detection are highly desirable. Knowledge of uncertainty estimation techniques (e.g., probabilistic models) is a plus. Strong analytical skills, experience with large-scale datasets, and the ability to work in interdisciplinary teams are also valued.

PhD Start Date:
October 2025

PhD Work Location:
Université de Technologie de Compiègne
Heudiasyc Laboratory UMR-CNRS 7253

Application Instructions:
To apply, please send a CV, a motivation letter, copies of all academic records and degrees (preferably with rankings), and optionally a letter of recommendation or a referee contact.

Contact Emails:
Insaf Setitra, Philippe Bonnifait, Véronique Cherfaoui
firstname (dot) lastname (at) hds (dot) utc (dot) fr

Only complete applications will be considered. All documents must be in either French or English.
