We are seeking a Data Engineer to design and implement data solutions in Microsoft Azure. In this role, you will work in some of our most varied and complex client environments, using technologies such as ETL tools, SQL, and Python.
Tasks & Responsibilities
- Lead technical decisions for enterprise data architecture using Databricks
- Develop strategies for cost-effective, high-performance data architectures, covering everything from data streams to analytical storage
- Design and implement data solutions in Microsoft Azure, based on project needs
- Build and maintain ETL processes and data models for analytical pipelines, especially in Finance
- Establish and uphold software development best practices, such as version control, testing, and continuous integration (CI)
- Support pre-sales activities, including RFPs, proposals, and client communications
- Guide discussions on cloud data warehousing and machine learning platforms
- Provide technical support for vendor and third-party engagements
- Offer ongoing support for business and technology teams on platform strategy and quality assurance
Required Qualifications
- Bachelor’s degree in Computer Science, IT, or a related field
- Minimum 2 years of Databricks and 4 years of Azure experience (certifications preferred)
- Experience as a Data Architect/Engineer with hands-on expertise in Data Lakehouse, Data Lake, and Data Warehouse design
- Strong knowledge of Databricks, Data Lakes, and Spark
- Strong experience with Azure Platform Services (Identities, Network, Storage)
- Proficiency with Azure Data Services, including Azure Data Factory and Azure SQL
- Experience with streaming data technologies such as Databricks Streaming, Kafka, and Azure streaming services (e.g., Event Hubs, Stream Analytics)
- Experience with Azure DevOps, especially Azure Pipelines
- Proficiency in Python, PySpark, and SQL (see the illustrative sketch after this list)
- Familiarity with DataOps
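
To give a concrete flavor of the day-to-day work, below is a minimal, illustrative PySpark ETL sketch. It is not taken from any client project: all paths, column names, and table names are hypothetical placeholders, and writing Delta tables assumes a Databricks (or otherwise Delta-enabled) Spark runtime.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("finance-etl-example").getOrCreate()

# Read raw transactions from a hypothetical data lake location.
raw = spark.read.parquet("abfss://raw@examplelake.dfs.core.windows.net/transactions/")

# Light cleansing: drop rows with a missing amount and derive a reporting date.
cleaned = (
    raw.filter(F.col("amount").isNotNull())
       .withColumn("report_date", F.to_date("transaction_ts"))
)

# Write to an analytical Delta table, partitioned for downstream reporting.
(cleaned.write
        .format("delta")
        .mode("overwrite")
        .partitionBy("report_date")
        .saveAsTable("finance.transactions_curated"))
```

In practice, a job like this would typically be orchestrated via Azure Data Factory or Azure Pipelines and covered by automated tests, in line with the responsibilities above.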
Nice to Have
- Experience delivering cloud infrastructure solutions using Infrastructure-as-Code
- Knowledge of the Microsoft Fabric ecosystem and Power BI
- Knowledge of Azure infrastructure and security, Docker, and Kubernetes
- Experience with Apache Airflow
- Experience with open-source databases (e.g., Postgres, MySQL)
What We Can Offer
- Flexible remote working
- Home office allowance
- Training and learning opportunities
- Support and opportunities for professional development