We are looking for a Data Engineer who is interested in joining our partner's Scrum team for the global reporting tool.
Data Engineering tasks
- Develop new functionality for the data pipeline (including improving data-quality functions, harmonizing data across multiple countries, and deriving new KPIs from it for dashboard use)
- Maintain and improve existing Global Reporting functionalities
- Overall
- Understand and explain advantages and disadvantages of the proposed solutions to internal and external stakeholders
- Participate in Scrum ceremonies (e.g. daily stand-up, sprint review)
- Resolve incidents and change requests
- Write technical documentation and apply best practices
Key requirements/skills/experience
- Experience building and optimizing big data pipelines and architectures
- Hands-on experience with Azure Cloud services (ideally Azure Data Factory)
- Hands-on experience with AWS is a plus
- Deep knowledge of writing complex queries for RDBMS systems (PostgreSQL)
- Solid Spark knowledge
- General understanding of Networking and IT Security Principles
- Other Skills
- Bachelor (BSc), Master (MSc) or equivalent experience in a technical field (for example, Computer Science, Engineering)
- Fluent English (written and spoken) is a must
- Willingness and ability to learn new technologies and tools
- Team player open to working in an agile and flexible environment
Nice to have
- Experience with Databricks is a plus
- Hands-on experience with SQL database design is a plus
- Practical experience with Linux, shell scripting and Git (and other DevOps-related tools)
- Experience with Data Governance Frameworks (e.g. Informatica) and principles
- Hands-on experience with Power BI implementation
- German language skills
What we offer
- Competitive salary package according to the candidate's knowledge and competences
- Annual financial bonus based on individual targets
- Broad package of certified training courses
- Possibility to develop existing and new skills