ABOUT DEDALE:
Dedale is a fast-growing intelligence and strategy firm operating at the intersection of data, AI, and decision-making. We help leading organizations and investors navigate complexity through sharp insights and a cutting-edge product. As we scale, data is at the heart of everything we build — and we need someone to own it end-to-end.
YOUR ROLE:
As Data Engineer, you will be the primary owner of Dedale's data stack. You will define how we govern, model, maintain, migrate, and enrich our data (proprietary and public), setting the foundation for everything from analytics to AI-powered products. Joining our Data and Tech teams of 20+ people, you will collaborate closely with our Product and Research teams.
This is a high-impact, high-autonomy role, with a clear path toward Data Architect.
- DATA GOVERNANCE & QUALITY
- Define and enforce access policies: who can see what, who can change what – in line with GDPR requirements
- Implement data quality checks and completeness monitoring across all sources
- Build and maintain a data catalogue — making data discoverable and trustworthy
- Own data observability: detect anomalies, lineage issues, and freshness gaps proactively
- DATA ARCHITECTURE – IN CLOSE COORDINATION WITH OUR PRODUCT TEAM
- Design and maintain Dedale's canonical data model — the single source of truth
- Own the full data pipeline: ingestion, transformation, storage, and serving layers
- Select and evaluate the right tools for each layer of the stack (ETL, warehouse, orchestration)
- Ensure coherence, consistency, and scalability as we grow
- DATA PIPELINES AUTOMATION AND ENRICHMENT
- Automate data workflows, reducing manual steps, improving reliability, and building for scale
- Design and build new pipelines that enrich raw data with external sources and computed signals
- Partner with analysts and product teams to deliver clean, ready-to-use proprietary datasets
- DATA MODERNISATION & MIGRATION
- Audit and progressively migrate legacy data infrastructure toward a modern, scalable architecture
- Identify technical debt in the data stack and drive its resolution methodically
- Ensure zero data loss and high availability during migration phases
YOUR PROFILE:
Must-haves:
- 3 to 5 years of experience in a data engineering role
- Proficiency in Python and SQL
- Solid experience with data modelling, pipeline design, and warehousing (e.g. Airbyte, dbt, Airflow, BigQuery, Snowflake, Spark)
- Interest in and exposure to AI/ML pipelines and feature engineering
- Hands-on knowledge of data governance principles and tooling
- Track record of working in a start-up or scale-up environment — you thrive with ambiguity
- Entrepreneurial mindset: you see the big picture, propose solutions, and take ownership
Nice-to-haves:
- Experience with data migration projects
- Familiarity with data cataloguing and observability tools (e.g. DataHub, Monte Carlo, Great Expectations)
- Interest in growing into a Data Architect role — we will invest in your development
Who you are:
- You take pride in clean, reliable data and treat it as a product
- You communicate clearly with technical and non-technical stakeholders alike
- You balance speed with rigour — you can ship fast without cutting dangerous corners
- You are proactive, curious, and always looking for ways to improve the stack
WHAT WE OFFER:
- High autonomy and direct impact from day one
- A clear career path toward Data Architect
- A collaborative, intellectually stimulating environment
- Competitive compensation package
- Flexible working arrangements