We are looking for a skilled and motivated Data Engineer to join our dynamic team. In this role, you'll play a key part in designing and building scalable data infrastructure and pipelines using cutting-edge technologies on Google Cloud Platform.
Responsibilities:
Data Architecture:
Design and build advanced data pipelines on Google Cloud Platform, implementing the Modern Data Stack to enable scalable, reliable data solutions.
Advanced ETL/ELT Development:
Develop and maintain real-time and batch data processing workflows using tools such as Apache Airflow, DBT, and Dataflow.
Cloud Data Infrastructure Management:
Build, manage, and optimize Data Lakes and Data Warehouses using BigQuery.
This includes partitioning, clustering, and implementing cost optimization strategies.
Requirements:
Experience:
At least 3 years of hands-on experience in Data Engineering or BI development within enterprise environments.
Cloud Expertise:
Proven experience working with Google Cloud Platform (GCP), especially with BigQuery, Cloud Storage, Dataflow, and Cloud Functions.
Programming:
Advanced proficiency in Python and complex SQL. Familiarity with Bash/Shell scripting is a plus.
ETL/ELT Tools:
Hands-on experience with Apache Airflow, DBT, or similar data automation frameworks.
Version Control & DevOps:
Experience with Git, CI/CD pipelines, and a solid understanding of DevOps methodologies.
This position is open to all candidates.