Daily tasks
- Developing new functionality in the Python-based ETL pipeline
- Maintaining and optimizing the existing pipelines in Azure Data Factory Dataflows
- Initiating the planning and setup of new service components
- Maintaining service-related documentation in accordance with Allianz Technology standards
- Monitoring and SLA management
- Documenting, maintaining, and optimizing processes and configurations related to the environments
- Coordinating with stakeholders in development, architecture, rollout, and operations teams, as well as with customers, on releases, upgrades, and changes
- Managing and supporting service-related risk assessments and audits
Professional requirements
- Preferably at least 2 years’ experience in data engineering
- At least 1 year's experience with data-oriented Python (Polars, Pandas, PySpark, …)
- At least 1 year's experience with Microsoft Azure
- Solid understanding of cloud architecture
- Familiarity with ITIL
- Experience with the software-as-a-service (SaaS) concept
- Strong analytical skills
- Strong organizational skills
- Strong customer focus
- Ability to rapidly adapt and respond to changes in environment and priorities
- Advanced English is a must
- Excellent communication skills
Desirable skills
- Experience with Synapse, Delta Lake and/or PowerBI
- Experience in BI projects
- Familiar with DevOps processes
- Familiar with Data Vault 2.0
If you're interested in this role, click 'apply now' to forward an up-to-date copy of your CV, or call us now.
If this job isn't quite right for you, but you are looking for a new position, please contact us for a confidential discussion on your career.