Do you want to tackle the biggest questions in finance with near infinite compute power at your fingertips?
G-Research is a leading quantitative research and technology firm, with offices in London and Dallas.
We are proud to employ some of the best people in their field and to nurture their talent in a dynamic, flexible and highly stimulating culture where world-beating ideas are cultivated and rewarded.
This is a role based in our new Soho Place office – opened in 2023 – in the heart of Central London and home to our Research Lab.
The role
We’re seeking a Software Engineering Intern to join our Core AI sub-team within the AI Engineering Group at G-Research.
The Core AI team builds and evolves the foundational platforms behind every Generative AI initiative within the firm, from RAG services to tooling that improves developer experience across quant and engineering teams.
As part of this team, you’ll contribute to the design and delivery of scalable, reliable and secure infrastructure that enables researchers, data scientists and engineers to experiment and deploy AI solutions safely and at speed.
Your work will span critical projects and workstreams, including distributed systems development, LLM orchestration and inference, RAG service integration, and driving internal adoption of third-party AI technologies.
Key responsibilities of the role include:
Who are we looking for?
We value engineers who are energised by complex, systems-level challenges, who work fluently across languages and tooling, and who care deeply about developer experience and platform ergonomics.
You should be comfortable owning services end-to-end, from design docs to production dashboards, and motivated by the opportunity to help shape the foundations of AI development at G-Research.
The ideal candidate will have the following skills and experience:
The following experience is beneficial:
Why should you apply?
Intern
4/28/2026