Still searching for jobs on search engines? It's time to upgrade!
Instead of sifting through thousands of listings on your own, Jobify analyzes your CV and shows you only the jobs that truly fit you.
Over 80,000 jobs • 4,000 new ones every day
Free. No ads. No fine print.
Job description:
Design and deliver end-to-end data architecture across diverse client environments
Lead the design and contribute to the implementation of data pipelines, lakehouses, and data warehouses (batch & streaming), taking a hands-on role when needed or supporting engineering teams
Define and guide data transformation approaches using tools such as dbt, Dataform, and Spark, while collaborating closely with those implementing them
Help shape and support the implementation of serving layers, including APIs and microservices (e.g., Cloud Run, ECS), either directly or by working alongside delivery teams
Work hands-on with cloud-native technologies across AWS and GCP
Work Across Diverse Environments
Deliver projects in regulated and policy-driven environments, balancing compliance and delivery
Work with startups and enterprises, adapting to different scales, speeds, and expectations
Contribute to AI-related and advanced analytics projects, enabling modern data use cases
Operate in Real-World Constraints
Design systems that meet security, governance, and regulatory requirements
Collaborate in environments where ownership and decision-making are shared across stakeholders
Influence technical direction while navigating organizational and operational constraints
Communicate effectively with both technical teams and non-technical stakeholders, including policy organizations
Engineering & Leadership
Ensure systems are scalable, reliable, and cost-efficient
Lead architecture discussions, design reviews, and technical best practices
Mentor engineers and contribute directly to implementation
Requirements:
7+ years of experience in data engineering / data architecture
Proven experience designing and building production-grade data platforms
Strong hands-on experience with AWS and/or GCP (experience with both is a strong advantage)
Willingness to work on-site at least 3 days a week, from both the HQ office and client locations
Fluent in both Hebrew and English
Cloud & Data Stack
AWS: S3, Redshift, Glue, Lambda, ECS (or similar)
GCP: BigQuery, Cloud Run, Dataflow, GCS (or similar)
Experience with Databricks / Spark-based processing
Data & Engineering
Strong SQL and Python skills
Experience with dbt and/or Dataform
Deep understanding of:
Data modeling (warehouse & lakehouse)
ETL / ELT patterns
Batch and streaming architectures
Experience building APIs / backend services for data access
Familiarity with CI/CD, testing, and production systems
Adaptability & Stakeholder Engagement
Experience working across different types of organizations, or a strong ability to adapt
Ability to navigate complex stakeholder environments and constraints
Strong communication skills, with the ability to bridge engineering, business, and policy stakeholders
More information
Recruitment company : Yael group
Domain : Architect
Area : Gush Dan
Location : Kfar Saba
Date published : 16/04/2026
Job number : 24757
Job type : 5 days a week