Digital Factory - Data Engineer - Associate
- Luxembourg
- Permanent
- Full-time
- Design, implement, and maintain batch and streaming data pipelines using Python and modern data frameworks.
- Build reliable ingestion and transformation processes from internal and external data sources (APIs, databases, files, events).
- Develop reusable data processing components and frameworks following software engineering best practices.
- Ensure clear data contracts, schemas, versioning, and documentation.
- Implement data validation, quality checks, and monitoring to ensure accuracy, completeness, and timeliness.
- Write clean, testable code with unit and integration tests for data workloads.
- Instrument pipelines with logging, metrics, and alerts; define SLAs/SLOs for critical data products.
- Troubleshoot and resolve data incidents through root-cause analysis and durable fixes.
- Optimize data processing performance, storage layouts, and query efficiency.
- Design pipelines for scalability and resilience (retries, idempotency, checkpointing, backpressure).
- Apply partitioning, parallelization, and caching strategies where appropriate.
- Contribute to data platform architecture, including lakehouse, warehouse, and event-driven designs.
- Collaborate with Backend Engineers to align APIs and services with data consumption patterns.
- Work closely with Analytics, Data Science, and Product teams to translate requirements into reliable data solutions.
- Document architecture decisions, data models, and operational runbooks.
- Implement secure data handling practices, including access controls, encryption, and secrets management.
- Ensure compliance with enterprise, regulatory, and privacy requirements (data classification, retention, lineage).
- Support audits and data governance processes.
- Own data pipelines in production: monitor, maintain, and continuously improve reliability and performance.
- Reduce operational toil through automation and standardized frameworks.
- Contribute to shared data engineering standards, templates, and best practices across the Digital Factory.
- You have an agile, growth-oriented mindset. What you know matters, but the right mindset is just as important in determining success. We're looking for people who are innovative, can work in an agile way, and keep pace with a rapidly changing world.
- You are curious and purpose driven. We're looking for people who see opportunities instead of challenges, who ask better questions to seek better answers.
- You are inclusive. We're looking for people who seek out and embrace diverse perspectives, who value differences, and team inclusively to build safety and trust.
- Bachelor's or Master's degree in Computer Science, Data Engineering, Software Engineering, or a related field.
- Professional experience as a Data Engineer or in a strongly data-focused engineering role.
- Strong experience with Python for data engineering use cases.
- Solid experience with SQL and data modeling concepts (dimensional, normalized, or lakehouse patterns).
- Hands-on experience with relational and analytical data stores (e.g., PostgreSQL, SQL Server, data warehouses, data lakes).
- Experience with CI/CD, automated testing, and code reviews for data workloads.
- Working knowledge of cloud platforms (Azure preferred) and data services.
- Familiarity with containerization (Docker) and orchestration concepts.
- Strong understanding of data security, access control, and observability.
- Experience with modern data frameworks and platforms (e.g., Spark, Airflow, dbt, Azure Data Factory, Databricks).
- Exposure to streaming and event-based data processing (e.g., Kafka, Azure Event Hubs, Service Bus).
- Experience with lakehouse or enterprise data warehouse architectures.
- Background in financial services or regulated environments.
- Familiarity with infrastructure-as-code (Terraform/Bicep).
- Experience working in distributed, cross-functional teams.
- Develop, implement, and continuously improve delivery governance standards, templates, and tooling across the Digital Factory.
- Coordinate portfolio-level reporting, risk tracking, and delivery health metrics to provide clear visibility of project status and performance.
- Support onboarding, resource planning, and performance tracking for teams and service streams, ensuring optimal allocation and utilization of resources.
- Facilitate cadence rituals such as quarterly planning, retrospectives, and delivery reviews to maintain operational rhythm and foster continuous improvement.
- Ensure alignment with enterprise delivery models, including onshore, nearshore, and offshore approaches.
- Serve as a liaison between engineering, product, and leadership teams to promote delivery transparency and effective communication.
- Drive continuous improvement by establishing feedback loops and leveraging operational insights to enhance processes and outcomes.
- Oversee license and support handover coordination (e.g., Copilot, platform tools) with Digital Operations L1/L2 teams to ensure seamless transitions and support coverage.
- An initial HR call
- Two rounds of interviews with the business