March 19, 2021
$100,000 - $180,000 / yr
“Hugging Face is doing the most practically interesting NLP research and development anywhere” - Jeremy Howard, fast.ai & former president and chief scientist at Kaggle
NLP - or natural language processing - is the field of artificial intelligence that deals with text (natural language), as opposed to images in computer vision or structured data in time-series analysis. NLP has become one of the most popular fields of machine learning.
Here at Hugging Face, we’re on a journey to advance and democratize NLP for everyone. Along the way, we contribute to the development of technology for the better.
This specific position is related to our current projects on training very large language models (over a billion parameters) and sharing them with the community. If you have experience building, optimizing, and training large-scale neural network models on setups ranging from tens up to thousands of GPUs; optimizing communication and compute in very large distributed setups using efficient scaling libraries like DeepSpeed, FairScale, Mesh TensorFlow, or distributed TF, PyTorch, or JAX; and have good knowledge of and a deep interest in the internals and interconnects of GPUs/TPUs and recent accelerator hardware, then we can't wait to see your application! Experience with an HPC orchestrator such as SLURM would be a plus.
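To give a feel for the kind of back-of-the-envelope reasoning this work involves, here is a minimal sketch of why sharding optimizer state across GPUs (ZeRO-style, as popularized by DeepSpeed) matters at the billion-parameter scale. All numbers are illustrative assumptions, not an actual Hugging Face configuration.

```python
# Rough memory arithmetic for sharding a large model's training state
# across many GPUs (ZeRO-style). Illustrative assumptions only:
# fp16 params (2 B) + fp32 master weights (4 B) + Adam momentum (4 B)
# + Adam variance (4 B) = 14 bytes/param, rounded up to 16 here.

def shard_bytes(n_params: int, n_gpus: int, bytes_per_param: int = 16) -> int:
    """Bytes of training state held by each GPU when the state is
    sharded evenly across n_gpus ranks (ceiling division)."""
    total = n_params * bytes_per_param
    return -(-total // n_gpus)

if __name__ == "__main__":
    n_params = 1_000_000_000  # a one-billion-parameter model
    for n_gpus in (8, 64, 512):
        gib = shard_bytes(n_params, n_gpus) / 2**30
        print(f"{n_gpus:4d} GPUs -> ~{gib:.2f} GiB of state per GPU")
```

Without sharding, every GPU would hold the full ~16 GB of state; spread across hundreds of ranks, the per-GPU footprint drops to a fraction of a gigabyte, freeing memory for activations and larger batch sizes.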
Work on challenging very large-scale (thousands of GPUs) tech problems
Work on the fastest growing open-source and open science NLP tech
Flexible working hours and remote options
Competitive salary + equity
Choose your own hardware equipment
Medical benefits, parental leave, and more
If you're interested in joining us but don't tick every box above, we still encourage you to apply! We're building a diverse team whose skills, experiences, and backgrounds complement one another. We're happy to consider where you might be able to make the biggest impact.