14 November 2023
Category: Intern
Skills:
• A Master's level in computer science, with a specialty in Virtual Reality
• Basic knowledge of deep learning
• Skills in Unity (C#), Virtual Reality, and Python
• Knowledge of ROS or RTmaps would be appreciated
• Soft skills
o Good interpersonal skills
o English writing ability
Contacts:
• Vincent HAVARD vhavard@cesi.fr, Lecturer, CESI LINEACT Rouen.
• Rim SLAMA SALMI rsalmi@cesi.fr, Lecturer, CESI LINEACT Lyon.
• Vincent VAUCHEY vvauchey@cesi.fr, Senior Research Engineer.
How to apply:
Submit your application to Vincent Havard vhavard@cesi.fr and Rim SLAMA SALMI rsalmi@cesi.fr.
Please set the email subject to: “[Internship] VR-HAR-Dataset”
The application must contain:
• A CV;
• A cover letter for the subject;
• Results (transcripts) of the current Master's degree;
• Recommendation letters, if available.
Please send your application as LASTNAME FirstName.zip.
Contract: internship of 5 to 6 months, starting in February 2024.
Location:
CESI Rouen
80 Avenue Edmund Halley
Rouen Madrillet Innovation
CS 10123
76808 Saint-Etienne-du-Rouvray.
Context:
As part of Industry 5.0, the manufacturing process is centered around the human factor. Meticulous focus is placed on operator actions and motions, while ensuring their holistic well-being. Previous work on human motion analysis has been carried out at CESI LINEACT (Slama et al., 2023) (Dallel et al., 2022). Within this framework, acquiring a comprehensive dataset for action recognition is of paramount importance, given its multifaceted applications in improving human ergonomics and manufacturing efficiency. Such a dataset has already been acquired at CESI LINEACT (Dallel et al., 2020), and the acquisition process can be time-consuming.
In parallel, digital twins and virtual reality are technologies that can address several industrial issues such as the design, simulation, and optimisation of industrial systems. Moreover, they provide tools for acquiring datasets with the ability to set up specific parameters (Dallel et al., 2023). In this context, using VR to acquire labelled datasets of operators performing their activities becomes a very interesting solution. Indeed, it makes it possible not only to acquire data and label actions instantaneously, but also to simulate different lighting conditions and camera viewpoints.
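For illustration only, the short Unity C# sketch below (not part of the offer; class, field, and label names are hypothetical) shows the core idea of auto-labelled acquisition: each frame, the pose of a tracked joint is written out together with the action label currently active in the scenario, so the recorded motion data is labelled at acquisition time.

// Minimal sketch, assuming a Unity scene with a tracked joint (e.g. a VR controller
// or an avatar bone) and scenario logic that sets the current action label.
using System.IO;
using UnityEngine;

public class LabelledPoseRecorder : MonoBehaviour
{
    // Transform of the tracked body part, assigned in the Inspector (hypothetical setup).
    public Transform trackedJoint;

    // Action label currently being performed; in a real tool the scenario logic
    // would update this field, it is not meant to be set by hand.
    public string currentActionLabel = "idle";

    private StreamWriter writer;

    void Start()
    {
        // One CSV file per session: timestamp, position, rotation and action label.
        string path = Path.Combine(Application.persistentDataPath, "session_poses.csv");
        writer = new StreamWriter(path);
        writer.WriteLine("time;px;py;pz;qx;qy;qz;qw;label");
    }

    void Update()
    {
        // Record the joint pose and its label every frame.
        Vector3 p = trackedJoint.position;
        Quaternion q = trackedJoint.rotation;
        writer.WriteLine($"{Time.time};{p.x};{p.y};{p.z};{q.x};{q.y};{q.z};{q.w};{currentActionLabel}");
    }

    void OnDestroy()
    {
        // Flush and close the file when the scene is unloaded.
        writer?.Close();
    }
}

Varying lighting conditions or camera viewpoints would then amount to changing scene parameters (lights, camera transforms) between recording sessions, without re-labelling anything.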
Work:
During this internship, the focus will be on developing an automated tool with the primary objectives of:
Technically, the main features to be developed in the acquisition tool are:
Bibliography
Dallel, M., Havard, V., Baudry, D., & Savatier, X. (2020). InHARD - Industrial Human Action Recognition Dataset in the Context of Industrial Collaborative Robotics. 2020 IEEE International Conference on Human-Machine Systems (ICHMS), 1–6. https://doi.org/10.1109/ICHMS49158.2020.9209531
Dallel, M., Havard, V., Dupuis, Y., & Baudry, D. (2022). A Sliding Window Based Approach With Majority Voting for Online Human Action Recognition using Spatial Temporal Graph Convolutional Neural Networks. 2022 7th International Conference on Machine Learning Technologies (ICMLT), 155–163. https://doi.org/10.1145/3529399.3529425
Dallel, M., Havard, V., Dupuis, Y., & Baudry, D. (2023). Digital twin of an industrial workstation: A novel method of an auto-labeled data generator using virtual reality for human action recognition in the context of human–robot collaboration. Engineering Applications of Artificial Intelligence, 118, 105655. https://doi.org/10.1016/j.engappai.2022.105655