Data Engineer
Ledgy
At Ledgy, we’re on a mission to make Europe a powerhouse of entrepreneurship by building a modern, tech-driven equity management and financial reporting platform for private and public companies. In 2025, we aim to be the leading provider for European IPOs and for share-based payment reporting. We are a values-based company with a core focus on being humble, transparent, ambitious, and impactful, all in order to deliver the best experience for our customers and end users.
We are proud to partner with some of the world’s leading investors. New Enterprise Associates led our $22m Series B round in 2022, with Philip Chopin joining Sequoia’s Luciana Lixandru on our board.
We were founded in Switzerland in 2017 and today we operate globally from offices in Zurich and London. We value diversity: our international team comes from 26 countries and speaks 25 languages.
As a Data Engineer at Ledgy, your mission is to build robust data pipelines, design scalable data architecture, and collaborate with teams to deliver insights that drive business decisions. Reporting directly to the Head of Operations & AI, you’ll play a key role in driving our data engineering strategy.
At Ledgy, you will:
- Manage and optimize data infrastructure and ETL pipelines using Fivetran, Airbyte, and Google Cloud Platform, ensuring reliable data flow from multiple sources into our analytics ecosystem
- Develop, test, and maintain DBT models that transform raw data into analytics-ready datasets following best practices (a sketch of this kind of transformation follows this list)
- Create and manage LookML models in Looker to enable self-service analytics for stakeholders across the company
- Drive continuous improvement of our data engineering practices, tooling, and infrastructure as a key member of the Operations team
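
To give a concrete flavour of the modelling work above, here is a minimal sketch of the kind of staging transformation a DBT model performs, written as standalone Python/Pandas purely for illustration. The source, schema, and column names (`subscription_id`, `mrr_usd`, and so on) are hypothetical, not Ledgy’s actual data model.

```python
import pandas as pd

def stage_subscriptions(raw: pd.DataFrame) -> pd.DataFrame:
    """Turn a raw subscription export into an analytics-ready table.

    Mirrors a typical staging-model transformation: dedupe, cast types,
    and derive reporting columns. All field names are hypothetical.
    """
    staged = (
        raw
        # Keep only the most recent record per subscription id.
        .sort_values("updated_at")
        .drop_duplicates(subset="subscription_id", keep="last")
        # Normalize types so downstream models can rely on them.
        .assign(
            updated_at=lambda df: pd.to_datetime(df["updated_at"], utc=True),
            mrr_usd=lambda df: pd.to_numeric(df["mrr_usd"], errors="coerce"),
        )
    )
    # Derive a reporting flag once, instead of repeating the logic downstream.
    staged["is_active"] = staged["status"].eq("active")
    return staged[["subscription_id", "updated_at", "mrr_usd", "is_active"]]

if __name__ == "__main__":
    raw = pd.DataFrame({
        "subscription_id": [1, 1, 2],
        "updated_at": ["2024-01-01", "2024-02-01", "2024-01-15"],
        "mrr_usd": ["100", "120", "80"],
        "status": ["active", "active", "churned"],
    })
    print(stage_subscriptions(raw))
```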
The job is a good fit if you have:
- 2-3+ years of experience building production data pipelines and analytics infrastructure, using DBT, SQL, and Python (Pandas, etc.)
- Experience implementing and managing ETL/ELT tools such as Fivetran or Airbyte
- Ideally hands-on experience with GCP (BigQuery)
- Proficiency in Looker, including LookML development
- Strong plus if you have experience using n8n or similar automation tools
- Experience with SaaS data sources (HubSpot, Stripe, Vitally, Intercom)
- Familiarity with AI-powered development tools (Cursor, DBT Copilot) and a strong interest in leveraging cutting-edge tools to improve workflow
- Strong problem-solving skills and the ability to debug complex data issues (see the reconciliation sketch after this list)
- Excellent communication skills with the ability to explain technical concepts to non-technical stakeholders
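
As a flavour of the debugging mentioned above, here is a minimal sketch of a reconciliation check between a SaaS export and its warehouse copy, a typical first step when tracking down a pipeline discrepancy. The frames and the `deal_id` key are hypothetical stand-ins for real synced tables.

```python
import pandas as pd

def reconcile(source: pd.DataFrame, warehouse: pd.DataFrame, key: str) -> pd.DataFrame:
    """Report rows that are missing on either side of a sync.

    Compares primary keys between the upstream extract and the
    loaded table to locate where a pipeline dropped or kept rows.
    """
    merged = source[[key]].merge(
        warehouse[[key]], on=key, how="outer", indicator=True
    )
    # 'left_only' = in the source but never loaded; 'right_only' = stale
    # rows in the warehouse that the source no longer contains.
    return merged[merged["_merge"] != "both"]

if __name__ == "__main__":
    source = pd.DataFrame({"deal_id": [1, 2, 3]})     # e.g. a CRM export
    warehouse = pd.DataFrame({"deal_id": [2, 3, 4]})  # the synced table
    print(reconcile(source, warehouse, key="deal_id"))
```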