Still searching for jobs on search engines? It's time to upgrade!
Instead of sifting through hundreds of listings on your own, let Jobify analyze your CV and show you only the opportunities that are truly worth your time, drawn from the largest job database in Israel.
The service is free of charge and unlimited.
The Opportunity
Join a market-leading global B2C platform with over 450 million registered users. We are at the forefront of the live-streaming industry, leveraging high-quality video technology to allow creators worldwide to engage with fans and monetize their talents.
Our engineering culture is built on pushing limits. We are a high-energy team of overachievers and creative thinkers who believe in solving "impossible" problems. As a Senior Data Engineer, you will be a key player in our R&D center, building the infrastructure that powers real-time interactions for millions of users.
Key Responsibilities
- Pipeline Architecture: Design and maintain robust ETL/ELT processes and data pipelines on GCP (BigQuery, Dataflow, Pub/Sub).
- Workflow Orchestration: Build and monitor complex workflows using Apache Airflow and Cloud Composer.
- Data Modeling: Develop scalable models (Dimensional, Star/Snowflake) to support analytics and large-scale operational workloads.
- Performance Optimization: Manage query performance and costs through partitioning, clustering, and proactive monitoring.
- Engineering Excellence: Apply software engineering best practices, including modular design, testing, and version control.
- Collaboration: Partner with Data Scientists and Analysts to translate business needs into technical solutions.
- Mentorship: Lead code reviews and mentor junior engineers to promote a culture of high-quality engineering.
Requirements
- Experience: 6+ years in Data Engineering, with at least 3 years of hands-on experience in a cloud environment (GCP preferred).
- Python Mastery: Expert-level Python skills, including OOP, testing, and code optimization.
- Cloud Infrastructure: Deep proficiency with BigQuery, Dataflow (Apache Beam), and Cloud Composer/Airflow.
- Data Design: Strong SQL expertise and a solid grasp of data warehousing and modeling concepts.
- Modern Tooling: Familiarity with CI/CD, Terraform (IaC), and data observability tools.
- Education: Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
Nice to Have
- Experience with Dataplex for governance and cataloging.
- Knowledge of streaming technologies (Kafka/Confluent) or BI tools such as Looker.
- Exposure to AI tools and methodologies (e.g., Vertex AI).
- Google Cloud Professional Data Engineer or Cloud Architect certifications.