Who we are
Foundation models have transformed text and images, but structured data - the largest and most consequential data modality in the world - has remained untouched. Tables power every clinical trial, every financial model, every scientific experiment, every business decision. No one has built a foundation model that truly understands them.
Until now. What LLMs did for language, we're doing for tables. The next modality shift in AI is happening - and we're hiring the team that makes it happen.
Momentum: We pioneered tabular foundation models and are now the world-leading organization in structured-data ML. Our TabPFN v2 model was published in Nature and set a new state of the art for tabular machine learning. Since its release, we've scaled model capabilities more than 20x, reached 3M+ downloads and 6,000+ GitHub stars, and are seeing accelerating adoption across research and industry - from detecting lung disease with Oxford Cancer Analytics to preventing train failures with Hitachi to improving clinical trial decisions with BostonGene.
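For a sense of what that adoption looks like in practice, here is a minimal sketch of running TabPFN on a small classification task. It assumes the open-source `tabpfn` Python package with its scikit-learn-style `TabPFNClassifier` interface; the dataset and metric here are illustrative choices, not part of the posting.

```python
# Minimal sketch (assumed API): TabPFN on a small tabular classification task.
from sklearn.datasets import load_breast_cancer
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

from tabpfn import TabPFNClassifier  # assumed: open-source tabpfn package

# Load a small tabular dataset and split it.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

# No per-dataset training or hyperparameter tuning: the pretrained model
# conditions on the training set in context and predicts on the test set.
clf = TabPFNClassifier()
clf.fit(X_train, y_train)
proba = clf.predict_proba(X_test)[:, 1]

print("ROC AUC:", roc_auc_score(y_test, proba))
```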
The hardest work is in front of us. We're scaling tabular foundation models to handle millions of rows, thousands of features, real-time inference, and entirely new data modalities - while building the infrastructure to deploy them in production across some of the most demanding industries on earth. These are open problems no one else is working on at this level.
Our team: We're a small, highly selective team of 20+ engineers, researchers, and GTM specialists, drawn from over 5,000 applicants, with backgrounds spanning Google, Apple, Amazon, Microsoft, G-Research, Jane Street, Goldman Sachs, and CERN. We're led by Frank Hutter, Noah Hollmann, and Sauraj Gambhir, and advised by world-leading AI researchers such as Bernhard Schölkopf and Turing Award winner Yann LeCun. We ship fast, create top-tier research, and hold each other to an extremely high bar.
What's Next: In 2025, we raised a €9m pre-seed round led by Balderton Capital, backed by leaders from Hugging Face, DeepMind, and Black Forest Labs. The next phase of growth is here, which makes this an optimal time to join.
About The Role
Tabular data breaks the assumptions that make scaling work for language and vision. There's no natural sequence, no spatial structure, no shared vocabulary across datasets. The architectures and scaling laws that power LLMs don't transfer. We've made the first breakthrough with TabPFN - but the hardest problems are still ahead.
At Prior Labs, Research Engineers aren't supporting scientists - they are the science team. You'll design experiments, contribute to papers, and write the code that turns architectural ideas into trained models. We create cutting-edge research because the same people do both the research and the engineering. As an early team member, you'll have significant technical ownership and room to grow as we scale.
The problems we're solving:
Day-to-day, you'll design and test novel architectures, run ablations, analyze scaling behavior, and write the training and evaluation infrastructure that makes rapid experimentation possible. We hold software quality to the same standard as research quality.
What We're Looking For
Nice to Have
Location
Compensation & Benefits
Full time · Mid-level · 4/12/2026