Junior Deep Learning Engineer – Python, TensorFlow
We are a fast-growing, US-based technology group delivering production-ready artificial intelligence products for regulated industries. The culture prizes knowledge-sharing, mentorship, and rapid experimentation over hierarchy. You will learn directly from senior researchers who publish research and ship code to millions of users, sometimes in the same week.
What You’ll Do
- Build, test, and deploy deep learning models for computer vision, NLP, and time-series forecasting.
- Write clean, idiomatic Python using TensorFlow and PyTorch APIs.
- Prepare and augment training data in pipelines built with Pandas, NumPy, and modern GPU toolkits.
- Evaluate model accuracy, precision, and recall—then squeeze out extra percentage points through hyperparameter tuning.
- Debug convergence issues, memory bottlenecks, and distributed-training hiccups.
- Document architectures, experiments, and findings in clear, reproducible notebooks.
- Collaborate with product managers, DevOps, and QA to integrate models into microservices.
- Monitor live models, log discrepancies, and trigger retraining workflows.
- Contribute to code reviews, pair-programming sessions, and weekly research demos.
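As a flavor of the evaluation work above, here is a minimal sketch of computing accuracy, precision, and recall for a binary classifier in plain NumPy; the function name and sample labels are illustrative, not part of our codebase.

```python
import numpy as np

def classification_metrics(y_true, y_pred):
    """Compute accuracy, precision, and recall for binary (0/1) labels."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    tp = np.sum((y_pred == 1) & (y_true == 1))  # true positives
    fp = np.sum((y_pred == 1) & (y_true == 0))  # false positives
    fn = np.sum((y_pred == 0) & (y_true == 1))  # false negatives
    accuracy = float(np.mean(y_pred == y_true))
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return accuracy, precision, recall

# Toy example: 5 predictions against ground truth.
acc, prec, rec = classification_metrics([1, 0, 1, 1, 0], [1, 0, 0, 1, 1])
```

In practice you would pull these from a library such as scikit-learn or TensorFlow's metrics API, but being able to derive them from the confusion-matrix counts is exactly the kind of fundamentals we look for.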
Tech Environment
- Python 3.11, TensorFlow 2.x, PyTorch 2.x.
- CUDA, cuDNN, and ONNX Runtime for GPU acceleration.
- JupyterLab, VS Code, Git, and GitHub Actions CI/CD.
- Docker, Kubernetes, and RESTful APIs.
- Experiment tracking with Weights & Biases and MLflow.
- Cloud GPU clusters on AWS and GCP.
What You Bring
- Bachelor’s degree in Computer Science, Data Science, or a related engineering field.
- Solid grasp of calculus, linear algebra, and probability.
- 6+ months of hands-on deep learning experience through coursework, internships, or personal projects.
- Competence in Python, NumPy, and Pandas; you know list comprehensions from lambdas.
- Familiarity with TensorFlow or PyTorch for forward and backward passes.
- Basic understanding of CNNs, RNNs, Transformers, and loss functions.
- Git workflow fluency—branch, commit, pull request, merge without fear.
- Strong written and verbal communication; you can explain gradients to non-technical peers.
- Curiosity, resilience, and a passion for keeping models honest with robust validation.
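To calibrate what "familiarity with forward and backward passes" means, here is a minimal TensorFlow sketch: a single-parameter model fit by gradient descent with `tf.GradientTape`. The toy data and learning rate are illustrative assumptions.

```python
import tensorflow as tf

# Fit w in y = w * x by gradient descent; the true w is 2.0.
w = tf.Variable(1.0)
x = tf.constant([1.0, 2.0, 3.0])
y = tf.constant([2.0, 4.0, 6.0])

for _ in range(100):
    with tf.GradientTape() as tape:
        loss = tf.reduce_mean(tf.square(w * x - y))  # forward pass: MSE
    grad = tape.gradient(loss, w)                    # backward pass
    w.assign_sub(0.1 * grad)                         # gradient-descent update
```

If you can read this loop, explain why the loss shrinks each step, and write the PyTorch equivalent with `loss.backward()`, you meet the bar.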