Industry: Fintech
Work model: hybrid (3 days/week from the Warsaw office)
Contract: B2B, long-term cooperation
Project language: English
Start: ASAP / 1-month notice period
REQUIREMENTS:
1. SQL / Redshift and/or PySpark: Essential for building data pipelines and transformations at scale
2. Git: Required for version control and collaboration
3. Apache Airflow, Apache Flink, AWS Lambda, AWS S3, BigQuery, Kafka Streams, Redshift, Web Scraping: Core tools in the data engineering stack for this role
4. Experience with AWS, EMR/DLT: Beneficial for handling large-scale data processing
5. Building Data Transformations and Pipelines: Ability to design and operate large-scale data transformations (see the sketch after this list)
6. Aligning Business Needs to Metrics: Translating business requirements into concrete metrics
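For illustration only, a minimal PySpark sketch of the kind of work points 1, 5, and 6 describe: turning raw events into a business metric. All bucket, table, and column names here are hypothetical, not part of the actual project stack.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # Hypothetical example: aggregate raw transactions into a daily
    # revenue metric, i.e. a business need expressed as a concrete metric.
    spark = SparkSession.builder.appName("daily-revenue-example").getOrCreate()

    # Read raw events from an (assumed) S3 landing zone in Parquet format.
    transactions = spark.read.parquet("s3://example-bucket/raw/transactions/")

    # Derive the metric: gross revenue per merchant per day.
    daily_revenue = (
        transactions
        .withColumn("event_date", F.to_date("event_ts"))
        .groupBy("merchant_id", "event_date")
        .agg(F.sum("amount").alias("gross_revenue"))
    )

    # Write curated output partitioned by date, ready to load into
    # a warehouse such as Redshift or BigQuery.
    daily_revenue.write.mode("overwrite").partitionBy("event_date").parquet(
        "s3://example-bucket/curated/daily_revenue/"
    )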
NICE TO HAVE:
1. Experience in Data Engineering: Prior experience building data pipelines and transformations is advantageous
2. Experience with Data Mesh and Self-Service Tools: Useful for developing scalable data platforms
WE OFFER:
Challenging international projects in a Scandinavian business culture.
Long-term cooperation.
Transparent working relationships built on trust and fair play.