Founding Simulation Engineer (Robotics & Autonomy)
We’re building a software‑first simulation platform for robotics and autonomous systems, enabling test‑driven development without reliance on hardware‑in‑the‑loop testing. Our work supports teams deploying autonomy in safety‑critical, real‑world environments, helping them validate behaviour faster and with far greater confidence.
This is a foundational engineering role on a small, highly technical founding team. You’ll take real robotic systems, sensors, environments, and edge cases, and turn them into simulations that customers can trust as a proxy for the physical world. Your work will directly determine how quickly and safely customers can deploy autonomous systems.
What you’ll be responsible for
End‑to‑end system ownership
- Own the full lifecycle of customer system onboarding, translating physical robots, sensors, and operating environments into validated simulation models.
- Take responsibility from initial scoping through implementation, calibration, validation, and delivery, not just isolated components.
High‑fidelity sensor simulation
- Build and calibrate physics‑based simulation models from first principles for sensing modalities including cameras, lidar, radar, infrared/thermal, and other non‑visible sensors.
- Reason deeply about sensor behaviour, noise, environmental effects, and failure modes rather than relying on surface‑level configuration.
Calibration and validation
- Define quantitative validation criteria and test plans to measure simulation fidelity against real‑world data.
- Iterate models from “good enough to explore” to “accurate enough to trust” for production autonomy workflows.
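To make the second point concrete: a minimal, purely illustrative sketch (hypothetical function names and threshold; this is not part of the role's actual codebase) of a quantitative fidelity gate comparing simulated lidar range returns against logged real‑world data:

```python
import numpy as np

def range_rmse(sim_ranges: np.ndarray, real_ranges: np.ndarray) -> float:
    """Root-mean-square error between simulated and measured range returns."""
    return float(np.sqrt(np.mean((sim_ranges - real_ranges) ** 2)))

def passes_fidelity_gate(sim_ranges, real_ranges, max_rmse_m=0.05) -> bool:
    """Example acceptance criterion: RMSE under a fixed threshold (metres).

    The 0.05 m threshold is an arbitrary illustration; in practice it would
    be derived from the sensor's datasheet accuracy and the autonomy stack's
    tolerance.
    """
    sim = np.asarray(sim_ranges, dtype=float)
    real = np.asarray(real_ranges, dtype=float)
    return range_rmse(sim, real) <= max_rmse_m

# Toy data: simulated vs. logged ranges for the same scan (metres)
sim = [10.02, 5.48, 3.31, 7.95]
real = [10.00, 5.50, 3.30, 8.00]
print(passes_fidelity_gate(sim, real))  # small errors -> True
```

In practice a validation plan would track several such metrics per modality (intensity, dropout rate, angular error), but the shape is the same: a measurable criterion, real reference data, and a pass/fail gate that an onboarding pipeline can run automatically.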
Tooling and scale
- Build internal tooling and automation in Python to support calibration, testing, repeatable onboarding, and long‑term platform scalability.
- Contribute to performance‑critical components or simulator integrations using C++ and/or Rust where appropriate.
Collaboration
- Work closely with customers and internal stakeholders to understand real systems deeply and close gaps between simulation and reality.
- Communicate technical trade‑offs clearly and confidently to both engineers and non‑specialists.
Required experience & background
Seniority
- 2–8 years of experience in robotics, autonomy, or hardware‑level simulation (academic, professional, or a combination).
Simulation experience (non‑negotiable)
- Demonstrated hands‑on, end‑to‑end ownership of at least one sensor or system simulation, including calibration and validation against real hardware or data.
- Hands‑on experience using NVIDIA Isaac Sim / Isaac Lab as a primary simulator is required; experience with MuJoCo or Gazebo is welcome as complementary or prior exposure, but does not substitute for it.
Sensor experience
- Experience with non‑visible sensing systems such as lidar, radar, infrared/thermal, or UV, in addition to or instead of camera‑based systems.
- Depth over breadth: you should be able to go deep on at least one sensing modality and explain modelling decisions in detail.
Education
- Bachelor’s degree in Mechanical Engineering, Robotics, Electrical Engineering, Mechatronics, or a closely related field.
- Strong academic performance preferred (3.7+ GPA).
- Master’s or PhD preferred, but strong professional experience with clear ownership can fully substitute.
Technical skills
- Strong Python for data analysis, validation, testing, tooling, and automation.
- Experience programming in C++ and/or Rust.
- Comfortable working from first principles and making independent technical decisions.
Working style & soft skills
- High‑agency builder who thrives in 0‑to‑1, ambiguous environments.
- Clear, personable communicator able to explain complex systems and trade‑offs.
- Willing to work broadly across sensing modalities and problem domains as the platform evolves.
- Comfortable with a small‑team, high‑ownership, high‑intensity start-up environment.
Explicit non‑fits (important)
This role is not a fit for candidates with:
- Only surface‑level simulator usage (e.g. setting up environments without owning fidelity or validation).
- A hardware‑only background without meaningful experience simulating those systems.
- Macro‑level or factory/plant simulation experience (e.g. supply chain, manufacturing flow) without sensor‑level, physics‑based modelling.
- A narrow or mercenary mindset, e.g. interest in only one sensing modality and unwillingness to go broader.