Announcement

Contrastive learning and training strategies for deep hyperspectral unmixing

21 December 2023


Category: Intern


General description

Artificial intelligence (AI) and machine learning approaches have revolutionized computer vision and image processing. The field of remote sensing, which exploits satellite or aerial images for Earth observation, is also benefiting from the efficiency of these approaches. This, in turn, has a strong societal impact, as the applications of these methods include highly important challenges such as environmental monitoring (e.g. urban monitoring, deforestation, crop and agriculture monitoring...).

Nevertheless, the direct application of learning methods faces a number of difficulties. First, very few labeled datasets exist, which calls for methods requiring a low level of supervision. Second, as the sensor is far away from the imaged scene, the images generally suffer from low spatial resolution. This is especially an issue for hyperspectral images (HSI), which are the focus of this internship. These images are acquired in numerous spectral/colorimetric bands, each of which measures the scene over a very narrow wavelength interval. Such rich spectral information makes it possible, in principle, to identify the materials (water, sand, vegetation...) present in the scene. However, the low spatial resolution of HSIs makes this identification difficult, since several materials are usually present in each pixel. This calls for hyperspectral unmixing (HSU) methods, which aim to separate the contribution of each material.
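For reference, hyperspectral unmixing is most often formalized through the linear mixing model; the announcement does not commit to a specific model, so the formulation below (in LaTeX notation) is a standard assumption rather than part of the original text:

\[
\mathbf{y}_p \;=\; \sum_{k=1}^{K} a_{k,p}\,\mathbf{m}_k + \mathbf{n}_p,
\qquad a_{k,p} \ge 0, \qquad \sum_{k=1}^{K} a_{k,p} = 1,
\]

where \(\mathbf{y}_p \in \mathbb{R}^{B}\) is the observed spectrum of pixel \(p\) over \(B\) bands, \(\mathbf{m}_k\) the spectral signature (endmember) of material \(k\), \(a_{k,p}\) its abundance in pixel \(p\), and \(\mathbf{n}_p\) a noise term. Unmixing then amounts to estimating both the endmembers and the abundances from the image alone.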

However, as mentioned above, the use of deep learning for HSU is impeded by the lack of large labeled datasets for training the corresponding neural networks. Authors therefore often rely on simple reconstruction loss functions, which is not fully satisfactory as it can lead to spurious solutions. The objective of this internship is thus to investigate better ways of training HSU deep neural networks.
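To make the reconstruction-only baseline mentioned above concrete, here is a minimal PyTorch sketch of an unmixing autoencoder: the encoder outputs abundances (through a softmax, so they are non-negative and sum to one) and a bias-free linear decoder plays the role of the endmember matrix. All layer sizes, names and the toy data are illustrative assumptions, not taken from the internship subject.

import torch
import torch.nn as nn

class UnmixingAutoencoder(nn.Module):
    # Toy unmixing autoencoder: encoder -> abundances, linear decoder -> endmembers.
    def __init__(self, n_bands: int, n_endmembers: int):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(n_bands, 64), nn.ReLU(),
            nn.Linear(64, n_endmembers),
            nn.Softmax(dim=-1),                 # abundances: non-negative, sum to one
        )
        self.decoder = nn.Linear(n_endmembers, n_bands, bias=False)  # weight columns ~ endmembers

    def forward(self, x):
        abundances = self.encoder(x)
        return self.decoder(abundances), abundances

model = UnmixingAutoencoder(n_bands=200, n_endmembers=4)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
pixels = torch.rand(1024, 200)                  # stand-in for HSI pixels (batch x bands)

for _ in range(10):
    reconstruction, _ = model(pixels)
    loss = nn.functional.mse_loss(reconstruction, pixels)   # reconstruction error only
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

The point made in the paragraph above is precisely that this loss alone can admit spurious solutions, hence the need for additional training signals such as the contrastive terms discussed next.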

In particular, we will investigate HSU-specific contrastive training methods [chen2020simple, sarfati2023weakly, dufumier2023integrating]. In various domains, such as medical imaging, these approaches have obtained results in the unsupervised setting that are on par with supervised ones. Generally speaking, the main idea of contrastive learning is to compare unlabeled data points against each other so that the model learns which points are similar and which are different. In practice, one way of using contrastive learning, which we will consider during this internship, is to train the neural networks to be robust to specific data transformations (for instance rotations, crops, colorimetric perturbations...). Training the architectures in this way is expected to yield improved results compared to mere reconstruction errors.
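As a rough sketch of this augmentation-based contrastive idea, following the SimCLR formulation cited above [chen2020simple], the loss below compares two randomly transformed views of the same pixels or patches: the two views of a sample are pulled together while all other samples in the batch are pushed away. The function name, temperature value and the commented usage are illustrative assumptions, not the specific method of the internship.

import torch
import torch.nn.functional as F

def nt_xent_loss(z1, z2, temperature=0.5):
    # z1, z2: embeddings of two augmented views of the same batch (shape: batch x dim).
    batch_size = z1.shape[0]
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)   # 2N x dim, unit-norm rows
    sim = z @ z.t() / temperature                        # pairwise cosine similarities
    mask = torch.eye(2 * batch_size, dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(mask, float("-inf"))           # ignore self-similarities
    # The positive of view i is the other view of the same sample (index shifted by N).
    targets = torch.cat([torch.arange(batch_size, 2 * batch_size),
                         torch.arange(0, batch_size)]).to(z.device)
    return F.cross_entropy(sim, targets)

# Usage sketch (hypothetical names): encode two perturbed views of the same HSI data,
# e.g. crops or colorimetric/spectral perturbations, then combine with a reconstruction term.
# view1, view2 = augment(pixels), augment(pixels)
# _, a1 = model(view1)
# _, a2 = model(view2)
# loss = nt_xent_loss(a1, a2) + reconstruction_loss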

As an alternative, and depending on time, adversarial training strategies might also be considered.

 

Candidate

The candidate must be a Master 2 student (or equivalent) with a good knowledge of signal/image processing and machine learning. Ideally, the candidate will be familiar with the Python language (and in particular with PyTorch). Knowledge of contrastive learning or hyperspectral imaging is a plus.

The candidate will acquire expertise in remote sensing, deep learning and self-supervised training strategies. Such skills are highly sought after, both in academia and in private companies, and are transferable to many other applications such as biomedical imaging, astrophysics... In addition, if relevant, the work conducted during the internship might lead to a scientific publication.

An extension of this subject to a PhD might be considered.

 

Contact

The internship (6 months) will take place in the IMAGES team (Télécom Paris) under the supervision of Christophe Kervazo and Pietro Gori.

Contact: christophe.kervazo@telecom-paris.fr; pietro.gori@telecom-paris.fr


The full subject can be found at: https://partage.imt.fr/index.php/s/zrQoyQdiGpqFtRS