
JOB DESCRIPTION


Responsibilities

  • Design, deploy, and manage scalable solutions using cloud platforms such as Azure and Databricks
  • Lead integration of data sources with Data Lake storage using ADF pipelines and perform ETL activities through Notebooks
  • Utilize Git or Azure DevOps for version control and collaboration, and develop scalable data solutions
  • Optimize Databricks clusters and automate workflows with PowerShell scripting
  • Collaborate with Data Engineers, Data Scientists, and Data Analysts to ensure effective and efficient data usage
  • Drive the adoption of new technologies and lead future improvement initiatives
  • Own and continuously evolve our data landscape to support business growth and scalability
  • Ensure efficient and reliable data processing workflows to maximize performance for analysts and scientists
  • Strategically identify, plan, and execute improvements in data reliability, efficiency, and quality
  • Lead the design and implementation of advanced data models and pipelines
  • Mentor and coach junior and mid-level team members, fostering professional growth and technical expertise
  • Establish and promote best practices for data modeling, quality processes, and standards for big data and cloud platforms

About you

  • Master’s degree in Computer Science, Information Technology, or a related field, or equivalent experience
  • Minimum of 5 years’ experience in Data Engineering
  • Advanced programming skills in Spark, ADF, Python, and T-SQL
  • Expert-level knowledge of PySpark and Spark SQL, including performance tuning
  • Proven experience with API interfaces and JSON file ingestion
  • Knowledge of Git and Azure DevOps for version control and collaboration
  • Comprehensive understanding of the Microsoft Azure ecosystem, plus experience with data system architecture and information security (MS Purview)
  • Proficiency in Azure DevOps for code maintenance and CI/CD
  • Experience with Power BI, GenAI, and/or Fabric is a significant advantage
  • Proven experience as a DevOps Engineer is a strong plus
  • Demonstrated ability to mentor and coach junior and mid-level engineers
  • Strong business acumen to identify and implement scalable data solutions
  • Effective collaboration and leadership across multi-disciplinary teams
  • Strong communication skills for articulating complex data concepts
  • Ability to work from the Utrecht office in a hybrid setting, or remotely with availability during CET working hours

Technical Requirements

  • A minimum internet speed of 10 Mbps download and 10 Mbps upload internationally
  • A minimum of 8 GB RAM
  • A 64-bit version of Windows 10 or newer, or macOS 10.11 or newer
  • An Intel Core i5-8260U or an equivalent or better processor
  • A smartphone usable for two-factor authentication that runs at least:
    • Android 8.0 or newer
    • Apple iOS 15.0 or newer

Salary

Competitive

Paid on a monthly basis

Location

Worldwide

Job Overview

Job Posted: 1 week ago
Job Expires: 5d 14h
Job Type: Remote
Job Role: Engineer
Education: Master’s Degree
Experience: 5 - 10 Years
Slots: 1
