Job details
Company
Delivery Hero
Location
Berlin, Germany
Employment type
Full-time
Seniority
Mid level
Primary category
IT Operations
Posted date
5 May 2026
Job description
We are on the lookout for a Data Engineer to join the Quick Commerce team on our journey to always deliver amazing experiences.
As a Data Engineer, your primary mission will be to contribute to building and improving our large-scale data model setup in Google Cloud. This involves diving deep into our data platform to break down pipeline complexities, identify friction points, and optimize systems for better performance and scalability. You will be part of the team directly responsible for managing the data platform’s infrastructure components, continuously monitoring and optimizing pipelines, queries, and databases to ensure high availability and performance.
Day-to-day, you will be hands-on in designing, building, and maintaining scalable data ingestion pipelines and ETL processes using SQL and Python, with a strong focus on ensuring high data quality, integrity, and low latency. This role also carries responsibility for governance and standards. You will promote best coding practices and participate in standardization routines for data foundations, analytics, and data science. This extends to collaborating on data catalogs and access controls to ensure all data processes adhere to strict accuracy, privacy, security, and compliance standards.
You will work as a key collaborator, establishing Data Contracts between our numerous Domain Data Units and shaping efficient ways of working between various stakeholder groups. You will also work closely with cross-functional teams to define data requirements and implement or enhance data models specifically for our DataMesh environment.
Manage the data platform’s infrastructure components, continuously enhancing and monitoring them, and optimize pipelines, queries, and databases for high availability and performance.
Design, build, and maintain scalable data pipelines and ETL processes using SQL and Python, ensuring high data quality, integrity, and low latency.
Work with cross-functional teams to define requirements and implement/enhance data models for a DataMesh environment.
Collaborate on data catalogs and access controls, ensuring all data processes adhere to accuracy, privacy, security, and compliance standards.
Build and deploy foundational data products for data science and analytics teams, troubleshoot data quality problems, and provide technical support for the BI toolbox.
Be part of redefining how customers experience quick commerce. You’ll help build technology that scales our non-food offerings, reaching new market segments and driving revenue growth. By innovating within our Quick Commerce Team, you’ll make Delivery Hero the go-to platform for a broad range of products, helping us grow faster and deliver more value to customers around the world.