Tel Aviv-Yafo, Tel Aviv District, Israel
Established in 2022, Guardz rapidly emerged as a noteworthy player in the cybersecurity sphere, securing $30M in funding and rallying a dedicated team of 50 industry professionals. Our vision is to foster a safer digital landscape for small and medium businesses across the globe. To this end, we introduced our comprehensive all-in-one Secure & Insure platform in early 2023, and we continue to grow our team, our partnerships, and our revenue.
As a Senior Data Engineer, you will work on creating and optimizing our data warehouse to meet the complex and evolving needs of our organization. You will collaborate closely with stakeholders from various departments to understand their requirements and deliver high-quality data solutions that enable efficient decision-making and operational processes. Your ability to understand intricate business logic and translate it into efficient, scalable data models will be essential. You will partner with cross-functional teams, including product, sales, marketing, and executives, to ensure our data infrastructure supports the company's growth.
Responsibilities:
- Design, build, and optimize our data warehouse and end-to-end data pipelines, including ingestion, orchestration, and internal data tooling, to serve the needs of the entire organization (an illustrative sketch follows this list).
- Implement complex business logic in the data warehouse using tools like dbt to transform raw data into meaningful insights.
- Collaborate with data leaders, data analysts, engineers, and business stakeholders to understand their data needs and translate them into technical requirements.
- Optimize data pipelines for both operational and business intelligence use cases.
- Ensure data quality, integrity, and governance across all systems and processes.
- Support ad hoc requests from stakeholders, including sales, marketing, product, and finance teams.
- Build operational data systems (e.g., CRM data integration) aligned with business logic.
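To give a concrete flavor of the pipeline and orchestration work described above, here is a minimal, illustrative sketch of an Airflow DAG that ingests raw data and then runs dbt models. The DAG name, task names, source, and dbt selector are hypothetical placeholders, not our actual pipeline.

```python
# Illustrative sketch only (assumes Airflow 2.x). Names of the DAG, tasks,
# source data, and dbt selector are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def ingest_raw_events(**context):
    # Placeholder: pull a batch from a source (e.g., a Segment export)
    # and land it in the warehouse's raw schema.
    print("ingesting raw events for", context["ds"])


with DAG(
    dag_id="dwh_daily_build",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest = PythonOperator(
        task_id="ingest_raw_events",
        python_callable=ingest_raw_events,
    )

    # Run the dbt models that implement the business logic on top of raw data.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --select marts.core",   # hypothetical selector
    )

    # Test the models so data-quality issues surface before stakeholders do.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --select marts.core",
    )

    ingest >> dbt_run >> dbt_test
```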
Requirements:
- 5+ years of experience in data engineering or a similar role, with a focus on building and managing data warehouses.
- 3+ years of development experience in one or more of the following programming languages: Python, Java, Scala.
- Experience leveraging dbt for dimensional data modeling and data warehouse implementation, with a focus on Snowflake or BigQuery.
- Strong understanding of ETL/ELT processes, data modeling, and data architecture.
- Experience designing, building, and orchestrating end-to-end data pipelines using a modern, cloud-based data stack (with a focus on Segment, Airflow, GCP tools, and Rivery).
- Strong understanding of and hands-on experience with CI/CD practices and Git.
- Proficient in reverse ETL processes for delivering data to operational systems (an illustrative sketch follows this list).
- Proven ability to understand complex business logic and translate it into efficient data models.
- Strong problem-solving skills and a proactive approach to finding innovative solutions.
- Excellent communication and collaboration skills to work with cross-functional teams and business leaders.
- Ability to balance speed and quality, delivering efficient solutions in a dynamic environment.
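As a rough illustration of the reverse ETL point above, the sketch below reads an aggregated table from the warehouse and pushes each row to a CRM's REST API. It assumes Snowflake via the snowflake-connector-python package; the table name, CRM endpoint, fields, and environment variables are hypothetical, and a production job would add batching, retries, and incremental state tracking.

```python
# Minimal reverse-ETL sketch. Table name, CRM endpoint, field names, and
# environment variables are hypothetical placeholders.
import os

import requests
import snowflake.connector

CRM_ENDPOINT = "https://api.example-crm.com/v1/accounts"  # hypothetical URL


def fetch_account_metrics():
    # Pull rows produced by the dbt layer (hypothetical table name).
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="REPORTING",
        database="ANALYTICS",
        schema="MARTS",
    )
    try:
        cur = conn.cursor()
        cur.execute("SELECT account_id, mrr, active_seats FROM account_metrics")
        return cur.fetchall()
    finally:
        conn.close()


def sync_to_crm(rows):
    # Push each account's metrics to the CRM so operational teams see fresh data.
    for account_id, mrr, active_seats in rows:
        resp = requests.post(
            f"{CRM_ENDPOINT}/{account_id}",
            json={"mrr": float(mrr), "active_seats": int(active_seats)},
            headers={"Authorization": f"Bearer {os.environ['CRM_API_TOKEN']}"},
            timeout=30,
        )
        resp.raise_for_status()


if __name__ == "__main__":
    sync_to_crm(fetch_account_metrics())
```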
Nice to Have:
- Experience with CRM data integration and building operational data systems.
- Experience managing infrastructure using Terraform.
- Knowledge of web application development (with a focus on Node.js).
- Familiarity with BI tools like Looker, Tableau, or Metabase.
- Prior experience working in a fast-paced startup or high-growth environment.
- Experience building production ML pipelines that leverage LLMs.