18 December 2023
Category: Intern
Host laboratory: Connaissance et Intelligence Artificielle Distribuées (CIAD)
Keywords: Human action recognition, classification, video data, deep learning, scene interpretation, robots, autonomous vehicles.
Contacts: Abderrazak Chahi (abderrazak.chahi@utbm.fr), Yassine Ruichek (yassine.ruichek@utbm.fr)
Description of the internship topic:
Human action recognition in video sequences is a particularly difficult problem due to variations in the appearance and motion of people and actions, changing camera viewpoints, moving backgrounds, occlusions, noise, and the enormous volume of video data. Detecting and understanding human activity or motion in video is essential for a variety of applications, such as video surveillance and anomaly detection in crowded scenes, and safe, cooperative interaction between humans and robots in shared workspaces. Action and motion recognition can also be used in intelligent and/or autonomous vehicles to monitor driver behavior and improve road safety. Over the past decade, significant progress has been made in action and motion recognition using spatiotemporal representations of video sequences, optical flow information, and the fusion of the two.

The objective of this project is to develop new machine learning approaches that combine spatiotemporal information with a scene understanding model to produce a state-adaptive representation of the scene. The scene understanding model will extract situation data (interpretation, context, circumstances, etc.) related to the different states and conditions in the scene. The proposed approach combines an intermediate recognition of the scene with one or more scene understanding models; the intermediate recognition can be performed using classical image processing/classification methods or advanced techniques such as deep learning. Experiments and analysis of the results will be carried out on video datasets widely used by the scientific community in this field. We also plan to apply the developed methods on at least one of the laboratory's experimental platforms (automated vehicles and robots equipped with perception and localization sensors and communication interfaces).
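As an illustration of the spatiotemporal/optical-flow fusion mentioned above, the sketch below shows a minimal two-stream action classifier in PyTorch: one branch encodes appearance from RGB frames, the other encodes motion from stacked optical-flow fields, and their features are concatenated before classification. This is only a sketch of the general technique, not the laboratory's method; the class name (TwoStreamFusionClassifier), layer sizes, number of actions, and input shapes are all assumptions chosen for brevity.

```python
import torch
import torch.nn as nn

class TwoStreamFusionClassifier(nn.Module):
    """Illustrative two-stream model (hypothetical architecture): an appearance
    stream over RGB frames and a motion stream over stacked optical-flow fields,
    fused by concatenation before action classification."""

    def __init__(self, num_actions: int = 10, feat_dim: int = 128, num_frames: int = 8):
        super().__init__()
        # Appearance stream: 3D convolution over a short RGB clip (B, 3, T, H, W).
        self.rgb_stream = nn.Sequential(
            nn.Conv3d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1), nn.Flatten(),
            nn.Linear(32, feat_dim),
        )
        # Motion stream: 2D convolution over stacked flow fields (B, 2*T, H, W).
        self.flow_stream = nn.Sequential(
            nn.Conv2d(2 * num_frames, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, feat_dim),
        )
        # Late fusion: concatenate the two feature vectors, then classify.
        self.classifier = nn.Linear(2 * feat_dim, num_actions)

    def forward(self, rgb_clip: torch.Tensor, flow_stack: torch.Tensor) -> torch.Tensor:
        fused = torch.cat([self.rgb_stream(rgb_clip), self.flow_stream(flow_stack)], dim=1)
        return self.classifier(fused)

# Usage with random tensors standing in for a preprocessed video clip:
model = TwoStreamFusionClassifier(num_actions=10, num_frames=8)
rgb = torch.randn(4, 3, 8, 64, 64)   # batch of 4 clips, 8 RGB frames of 64x64 pixels
flow = torch.randn(4, 16, 64, 64)    # 8 optical-flow fields x 2 channels, stacked
logits = model(rgb, flow)            # shape: (4, 10), one score per action class
```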
Candidate Profile:
Applications (CV, transcripts, reference letters, …) should be sent to Abderrazak Chahi (abderrazak.chahi@utbm.fr) and Yassine Ruichek (yassine.ruichek@utbm.fr).
Starting date: February/March 2024