Build, train, and evaluate deep learning models for image enhancement, denoising, reconstruction, and feature extraction on NDE image/volume data
Develop robust data pipelines for preprocessing, augmentation, and efficient 2D/3D batching with GPU acceleration
Design and run structured experiments (ablations, hyperparameter sweeps), track metrics, and iterate to improve image quality
Analyze noise/artifacts and apply techniques to boost signal fidelity and effective resolution with clear visualizations
Package reproducible training/inference pipelines; optimize for speed, memory, and reliability; contribute clean, documented code
Collaborate with NDE/imaging SMEs and present progress, insights, and recommendations in regular reviews
Required Qualifications
Currently pursuing a Master’s or advanced Bachelor’s in Computer Science, Electrical/Computer Engineering, Applied Physics, Data Science, or related field.
Solid foundation in deep learning for computer vision: CNNs, encoder–decoder architectures, residual/attention blocks, loss functions, and regularization.
Hands-on experience with PyTorch or TensorFlow, plus the Python data stack (NumPy, SciPy, pandas).
Practical experience training models on image datasets; familiarity with GPU workflows (e.g., CUDA, mixed precision).
Demonstrated ability to run controlled experiments, maintain clean experiment logs, and interpret statistical results.
Strong problem-solving skills, curiosity, and attention to detail; ability to work independently and in a team.
Ideal Candidate
A candidate currently pursuing a PhD who has not yet submitted their thesis.
Preferred Qualifications
Experience with image reconstruction or enhancement in medical/industrial imaging contexts (e.g., X-ray/CT, MRI, ultrasound).
Understanding of NDE concepts and imaging physics: projections, artifacts, sampling, SNR, resolution.
Familiarity with classical image processing (OpenCV, scikit-image) and signal processing.
Experience with 3D data and volumetric processing, including memory-efficient training and inference strategies.
Knowledge of design of experiments (DoE), statistical analysis, and uncertainty quantification.
Experience with performance optimization: data loaders, mixed precision, vectorization, and profiling.
Tools and Technologies
Python, PyTorch/TensorFlow, NumPy/SciPy, scikit-learn, OpenCV, scikit-image
Visualization: Matplotlib/Seaborn/Plotly
Optional: CUDA, PyTorch Lightning, DDP, Docker