Context
Next-generation gravitational-wave (GW) observatories such as the Einstein Telescope and Cosmic Explorer will achieve unprecedented sensitivities, detecting thousands of compact binary coalescence events annually. This sensitivity introduces a critical challenge: signals from distinct events will frequently overlap in the detector data, and current analysis pipelines cannot process such overlapping signals efficiently.
Traditional Bayesian parameter estimation requires O(days) of computation per event, making real-time coordination of multi-messenger follow-up impossible. Furthermore, existing matched-filtering and deep-learning methods typically handle only one or two concurrent signals, and thus adapt poorly to the crowded detector data expected in next-generation facilities.
Recent breakthroughs address these bottlenecks independently. DINGO (Dax et al., 2021) uses neural networks as surrogates for Bayesian posteriors, reducing inference time from days to minutes while maintaining full accuracy on LIGO-Virgo events. UnMixFormer (Zhao et al., 2024) employs attention-based architectures to separate and count five or more overlapping GW signals with 99.89% accuracy. Combining these two approaches is a promising and so far unexplored direction for real-time analysis of complex multi-signal scenarios.
Objectives
The intern will develop an integrated deep learning framework that performs real-time parameter estimation on overlapping gravitational-wave signals by combining signal separation with neural posterior estimation.
Specific goals:
- Implement a two-stage pipeline: a neural-network frontend for signal separation and counting → parallel neural-network backends for parameter inference on each separated signal (see the pipeline sketch after this list)
- Generate comprehensive training datasets in which each data segment contains 2–5 overlapping GW signals (see the toy data-generation sketch after this list)
- Design and train an end-to-end architecture with a joint loss function balancing separation quality and parameter accuracy (see the loss sketch after this list)
- Validate on synthetic data and compare the recovered parameters against reference posteriors from traditional Bayesian samplers
- Quantify total inference time and assess feasibility for low-latency multi-messenger astronomy applications
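As an illustration of the two-stage pipeline, the sketch below shows one possible shape for it in PyTorch. All class names, layer sizes, and the diagonal-Gaussian posterior head are placeholders chosen to keep the example short and runnable; they are not the UnMixFormer or DINGO architectures (which use attention blocks and normalizing flows, respectively).

```python
import torch
import torch.nn as nn

class SeparatorFrontend(nn.Module):
    """Stand-in for an attention-based separation/counting network.
    Maps a strain segment to K candidate waveforms plus per-slot presence logits."""
    def __init__(self, n_samples: int, max_signals: int = 5, hidden: int = 256):
        super().__init__()
        self.max_signals = max_signals
        self.encoder = nn.Sequential(
            nn.Linear(n_samples, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.decoder = nn.Linear(hidden, max_signals * n_samples)
        self.presence = nn.Linear(hidden, max_signals)  # counting head

    def forward(self, strain):                       # strain: (B, n_samples)
        h = self.encoder(strain)
        waveforms = self.decoder(h).view(strain.shape[0], self.max_signals, -1)
        presence_logits = self.presence(h)            # (B, max_signals)
        return waveforms, presence_logits


class PosteriorBackend(nn.Module):
    """Stand-in for a neural posterior estimator. DINGO uses a normalizing flow;
    a diagonal-Gaussian head keeps this sketch short."""
    def __init__(self, n_samples: int, n_params: int = 4, hidden: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_samples, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * n_params),          # mean and log-std per parameter
        )

    def forward(self, waveform):                      # waveform: (B, n_samples)
        mean, log_std = self.net(waveform).chunk(2, dim=-1)
        return mean, log_std


class TwoStagePipeline(nn.Module):
    """Frontend separation followed by parameter inference on each separated
    waveform; a single backend is shared across all signal slots."""
    def __init__(self, n_samples: int, max_signals: int = 5, n_params: int = 4):
        super().__init__()
        self.frontend = SeparatorFrontend(n_samples, max_signals)
        self.backend = PosteriorBackend(n_samples, n_params)

    def forward(self, strain):
        waveforms, presence_logits = self.frontend(strain)
        B, K, N = waveforms.shape
        mean, log_std = self.backend(waveforms.reshape(B * K, N))
        return {
            "waveforms": waveforms,
            "presence_logits": presence_logits,
            "param_mean": mean.view(B, K, -1),
            "param_log_std": log_std.view(B, K, -1),
        }


if __name__ == "__main__":
    model = TwoStagePipeline(n_samples=2048)
    out = model(torch.randn(8, 2048))                 # batch of 8 strain segments
    print({k: v.shape for k, v in out.items()})
```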
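For the training-data goal, a toy generator like the one below (plain NumPy, analytic chirps in white noise) can serve as a starting point before moving to physical waveform approximants and realistic detector noise. Every function, parameter range, and sampling rate here is illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)

def toy_chirp(t, t0, f0, fdot, amp):
    """Toy chirp: a sinusoid with linearly increasing frequency, zero before t0.
    A realistic dataset would use physical waveform models instead."""
    phase = 2 * np.pi * (f0 * (t - t0) + 0.5 * fdot * (t - t0) ** 2)
    return np.where(t >= t0, amp * np.sin(phase), 0.0)

def make_segment(n_samples=2048, fs=256.0, max_signals=5):
    """Draw 2-5 toy signals with random start times, sum them, add white noise."""
    t = np.arange(n_samples) / fs
    n_sig = rng.integers(2, max_signals + 1)
    waveforms, params = [], []
    for _ in range(n_sig):
        p = dict(t0=rng.uniform(0.0, 0.5 * t[-1]),
                 f0=rng.uniform(20.0, 60.0),
                 fdot=rng.uniform(5.0, 30.0),
                 amp=rng.uniform(0.5, 2.0))
        waveforms.append(toy_chirp(t, **p))
        params.append(p)
    strain = np.sum(waveforms, axis=0) + rng.normal(0.0, 1.0, n_samples)
    return strain, np.stack(waveforms), params

strain, waveforms, params = make_segment()
print(strain.shape, waveforms.shape, len(params))
```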
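For the joint objective, a minimal sketch building on the pipeline output above: a weighted sum of a separation term, a counting term, and a parameter-inference term. The Gaussian negative log-likelihood stands in for the negative log-probability of a flow-based posterior, and the weights are free hyperparameters to be tuned during the internship.

```python
import torch
import torch.nn.functional as F

def joint_loss(out, target_waveforms, target_presence, target_params,
               w_sep=1.0, w_count=1.0, w_param=1.0):
    """Illustrative joint objective for the end-to-end model.
    Expected shapes for a batch of B segments, K signal slots, P parameters:
    target_waveforms (B, K, n_samples), target_presence (B, K), target_params (B, K, P)."""
    # Separation: mean-squared error between separated and injected waveforms.
    loss_sep = F.mse_loss(out["waveforms"], target_waveforms)
    # Counting: binary cross-entropy on the per-slot presence logits.
    loss_count = F.binary_cross_entropy_with_logits(
        out["presence_logits"], target_presence)
    # Parameters: Gaussian negative log-likelihood as a stand-in for the
    # negative log-probability under a normalizing-flow posterior.
    var = torch.exp(2.0 * out["param_log_std"])
    loss_param = F.gaussian_nll_loss(out["param_mean"], target_params, var)
    return w_sep * loss_sep + w_count * loss_count + w_param * loss_param
```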
Expected Outcomes
- A prototype integrated model capable of real-time parameter estimation for multiple concurrent GW events
- Quantitative benchmarks: counting accuracy, overlap (match) between separated and injected waveforms, and posterior agreement with traditional sampling codes
- Open-source Python package with trained models and documentation
- (Optional) Presentation at LIGO-Virgo-KAGRA collaboration meeting
Skills and Tools
- Background in machine learning (Python, PyTorch or JAX)
- Experience with Bayesian inference and signal processing
- Familiarity with gravitational-wave physics is a plus (training provided)
- Programming: Python, Git, GPU computing, Linux environments
Internship Duration
4–6 months (Master 2 or engineering school level)
Supervision
The internship will be conducted at L2IT (Toulouse). The student will work in collaboration with researchers involved in gravitational-wave data analysis and machine learning for astrophysics.
Contact
Antsa Rasamoela, antsa.rasamoela@l2it.in2p3.fr
Sylvain Caillou, sylvain.caillou@l2it.in2p3.fr
References
- Dax, M., et al. (2021). Real-time gravitational-wave science with neural posterior estimation. arXiv:2106.12594
- Zhao, T., et al. (2024). Compact Binary Coalescence Gravitational Wave Signals Counting and Separation. arXiv:2412.18259
- Cuoco, E., et al. (2025). Applications of machine learning in gravitational-wave research. Living Reviews in Relativity, 28(2)
