Research Intern, AI/ML & Digital Health

The Data Science and Digital Health (DSDH) function within Johnson & Johnson Innovative Medicine Research & Development is seeking a PhD‑level Research Intern to support the development of foundation models for multimodal wearable sensor data, including accelerometer, PPG, ECG, and other physiological signals. The intern will work on self‑supervised and multimodal representation learning to build scalable models that generalize across digital health applications such as human activity recognition, sleep staging, cardiovascular monitoring, gait analysis, and digital biomarker discovery.

About Johnson & Johnson Innovative Medicine R&D

Johnson & Johnson Innovative Medicine Research & Development develops treatments that improve the health of people worldwide. Its research and development spans multiple therapeutic areas (TAs), including oncology, cardiopulmonary, immunology, and neuroscience. Our goal is to help people live longer, healthier lives. We have produced and marketed many first‑in‑class prescription medications and are poised to serve the broad needs of the healthcare market, from patients to practitioners and from clinics to hospitals. To learn more about Johnson & Johnson Innovative Medicine R&D, please visit https://innovativemedicine.jnj.com/us/

The AI/ML and Digital Health team, part of the DSDH function within Johnson & Johnson Innovative Medicine Research & Development, develops innovative AI/ML and analytical solutions using a variety of data sources across multiple diseases and therapeutic areas. We are looking for an outstanding candidate for the Research Intern, AI/ML & DH role, whose responsibilities include:

Key Responsibilities
* Design and implement self‑supervised learning frameworks for wearable time‑series data.
* Train foundation models on large‑scale unlabelled multimodal sensor datasets.
* Develop architectures using transformers, contrastive learning, masked modelling, and cross‑modal attention.
* Integrate heterogeneous sensors (accelerometer, PPG, ECG, heart rate) using multimodal fusion strategies.
* Evaluate learned representations on downstream health tasks (human activity recognition, sleep staging, stress detection, gait analysis, and health outcomes).
* Contribute to reproducible research outputs, including publications and well‑documented code.