In the evolving landscape of AI, Attention LSTMs are becoming critical tools for enhancing the explainability of deep learning models. While deep learning approaches, such as LSTMs (Long Short-Term Memory networks), achieve high prognostic performance, they often come with significant computational demands and limited interpretability. Standard feature extraction models sometimes exacerbate this issue by losing information during training, leading to poorer predictions, especially in Remaining Useful Life (RUL) forecasting tasks.
Attention mechanisms integrated into LSTM models provide a solution by allowing the network to focus on the most relevant parts of the input sequence. Attention helps identify key time steps or data points that are most influential in predictions, which is particularly valuable in applications like time-series analysis and predictive maintenance.
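As a concrete illustration of this idea (a minimal NumPy sketch, not part of the internship codebase), dot-product attention over a sequence of LSTM hidden states scores each time step, normalizes the scores with a softmax, and returns a weighted context vector; the scoring vector would normally be learned, but is fixed here for clarity:

```python
import numpy as np

def attention_pool(hidden_states, scoring_vector):
    """Score each time step, softmax-normalize, and return the weighted context.

    hidden_states: (T, d) array of LSTM outputs, one row per time step.
    scoring_vector: (d,) vector; learned in practice, fixed here for illustration.
    """
    scores = hidden_states @ scoring_vector          # (T,) relevance score per step
    scores = scores - scores.max()                   # stabilize the softmax
    weights = np.exp(scores) / np.exp(scores).sum()  # attention weights, sum to 1
    context = weights @ hidden_states                # (d,) weighted summary
    return weights, context

# Toy example: 4 time steps, hidden size 3
H = np.array([[0.1, 0.2, 0.0],
              [0.9, 0.1, 0.3],
              [0.2, 0.8, 0.5],
              [0.0, 0.1, 0.1]])
v = np.array([1.0, 0.5, 0.2])
w, c = attention_pool(H, v)
```

The weights `w` are exactly the quantity that makes the model inspectable: the time steps with the largest weights are the ones the prediction relied on most.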
Incorporating hyperparameter optimization further enhances model performance. In our previous work, we introduced BootBOGS [1], a hybrid hyperparameter optimization technique that combines bootstrap resampling, Bayesian Optimization, and Grid Search. This method was designed to efficiently search hyperparameter spaces, reducing variance in model performance and refining the hyperparameter ranges for better accuracy.
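The general pattern described above can be sketched in a few lines of plain Python. This toy example is only an illustration of the idea (bootstrap-style repeated evaluation to reduce variance, then a refined grid around the best coarse value); the objective function, grid values, and names are all hypothetical, and this is not the actual BootBOGS implementation of [1]:

```python
import random

random.seed(0)

def noisy_loss(lr):
    # Hypothetical objective: validation loss as a noisy function of the learning rate.
    return (lr - 0.01) ** 2 + random.gauss(0, 1e-5)

def resampled_mean(lr, n_boot=20):
    """Average repeated evaluations to reduce variance in the score estimate."""
    samples = [noisy_loss(lr) for _ in range(n_boot)]
    return sum(samples) / len(samples)

# Coarse search over a wide learning-rate grid...
coarse = [10 ** e for e in range(-4, 0)]            # 1e-4 .. 1e-1
best = min(coarse, key=resampled_mean)

# ...then a refined grid around the best coarse value.
fine = [best * f for f in (0.25, 0.5, 1.0, 2.0, 4.0)]
best_fine = min(fine, key=resampled_mean)
```

The two stages mirror the coarse-to-fine refinement of hyperparameter ranges mentioned above: averaging over resampled evaluations keeps a noisy validation score from misleading the search.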
The goal of this internship is to integrate BootBOGS into an Attention-enhanced LSTM for RUL prediction. You will tune various parameters, including the number of LSTM units, dropout rates to mitigate overfitting, attention head sizes (in multi-head attention mechanisms), and learning rates. By optimizing these parameters, the performance and interpretability of the model will be significantly improved, allowing it to focus on the most critical features of the input sequence. This enhances the model’s ability to explain predictions, an important factor in high-stakes applications like predictive maintenance.
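The parameters listed above can be collected into an explicit search space before any tuner is applied. The sketch below shows one way to define such a space and draw a random configuration from it; the ranges are illustrative assumptions, not the ones prescribed by the internship:

```python
import random

random.seed(42)

# Hypothetical search space covering the parameters mentioned above;
# the candidate values are illustrative, not prescribed by the project.
search_space = {
    "lstm_units":      [32, 64, 128, 256],
    "dropout_rate":    [0.1, 0.2, 0.3, 0.5],
    "attention_heads": [1, 2, 4, 8],
    "learning_rate":   [1e-4, 3e-4, 1e-3, 3e-3],
}

def sample_config(space):
    """Draw one random configuration from the discrete search space."""
    return {name: random.choice(values) for name, values in space.items()}

config = sample_config(search_space)
```

Each sampled `config` would then be scored (e.g. by validation loss on a held-out split) inside whatever optimization loop is used.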
23/10/2024
KEY OBJECTIVES
1. Incorporate BootBOGS into the LSTM framework for enhanced hyperparameter tuning.
2. Perform computational experiments using the NASA C-MAPSS datasets to evaluate model effectiveness.
3. Improve both the performance and the explainability of RUL predictions.
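For objective 2, a standard preprocessing step with the C-MAPSS training files is to derive the RUL target itself: each engine unit runs to failure, so the RUL at a given row is the unit's last observed cycle minus the current cycle. A minimal dependency-free sketch of that labeling:

```python
def rul_targets(unit_ids, cycles):
    """RUL at each row = last observed cycle of its unit - current cycle.

    This follows the usual labeling convention for the C-MAPSS training
    files, where every engine unit is run until failure.
    """
    last_cycle = {}
    for u, c in zip(unit_ids, cycles):
        last_cycle[u] = max(last_cycle.get(u, 0), c)
    return [last_cycle[u] - c for u, c in zip(unit_ids, cycles)]

# Toy example: two engine units with 3 and 2 recorded cycles respectively
units  = [1, 1, 1, 2, 2]
cycles = [1, 2, 3, 1, 2]
rul = rul_targets(units, cycles)  # last row of each unit gets RUL 0
```

Many published C-MAPSS studies additionally cap this linear target at a piecewise-linear maximum (a modeling choice, not part of the dataset itself); whether to do so here would be a decision for the experiments.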
CANDIDATE PROFILE: Master 2 (M2) student or Computer Engineer.
SKILLS REQUIRED: Proficiency in Python, with experience using libraries such as TensorFlow, scikit-learn, and Keras. Background in deep learning, particularly LSTM models. Experience with time-series data, predictive maintenance applications, and/or explainable AI (XAI) techniques is a plus. The final report will be written in Overleaf.
Interested candidates should send their CV, cover letter, transcripts and rankings from Master 1 and Master 2, and any other relevant documents by email to [email protected]. Successful candidates will be invited for an interview, in person or via videoconference.