Data Pipeline Engineer
Uni Systems
Strasbourg, France
Description
At Uni Systems, we are working to turn digital visions into reality. We are continuously growing, and we are looking for a Data Pipeline Engineer to join our UniQue team.
What will you be doing in this role?
- Design, implement, and maintain data connectors between source systems and the data warehouse.
- Configure and operate open-source data integration tools (e.g. Airbyte or similar).
- Ensure reliable, observable, and fault-tolerant data pipelines.
- Build and manage batch and/or streaming data pipelines.
- Handle data extraction, loading, and basic transformation in alignment with warehouse models defined by the Data Architect.
- Implement monitoring, logging, and alerting for pipelines.
- Define and implement data access patterns to the data warehouse.
Requirements
What will you be bringing to the team?
- Master's degree in IT or a related field and at least 8 years of professional experience in IT.
- Strong experience with data pipelines and ETL/ELT architectures, ideally with OpenShift-related frameworks.
- Experience running data pipelines on Kubernetes.
- Hands-on experience with open-source data integration tools (e.g. Airbyte, Kafka Connect, Singer).
- Solid knowledge of SQL and relational databases (PostgreSQL or similar).
- Experience working with data warehouses (cloud or on-prem).
- Familiarity with Python for pipeline development and automation.
- Understanding of data access control, authentication, and authorization.
- Familiarity with dbt or similar transformation tools.
- Experience in cloud environments.
- Fluent in English (level B2 or above).
Don't forget to mention EuroTechJobs when applying.