Data Engineer – Cloud ETL & Platform Development


Warszawa
18 303 - 25 624 PLN/month (net, B2B)
Type of work
Full-time
Experience
Mid-level specialist
Employment type
B2B
Work mode
Hybrid

Required skills

English C1

ETL

Python

SQL

Airflow

Job description

Remote recruitment
Friendly offer



📍 Hybrid in Warsaw (3 days/week onsite required) | 💼 Full-time | B2B contract up to $7,000/month



Our client is a US-based technology company headquartered in New York City, delivering digital solutions and consulting services that transform businesses and drive measurable value. With offices in multiple countries, the company is now investing in a new engineering center in Warsaw, recognizing the strong talent and culture of Polish software professionals.

We are looking for a motivated, self-driven Data Engineer to join a fast-growing data platform team. In this role, you will design, build, and maintain robust, scalable, cloud-native ETL infrastructure and data pipelines, enabling real-time analytics and an AI-ready architecture for high-performance business applications.

This is a hybrid position — you will be expected to work from the Warsaw office at least 3 days per week.


Key Responsibilities

  • Design, develop, and deploy Python-based ETL pipelines using Airflow and Prefect

  • Build and optimize data warehouse structures for analytics, OLAP, and dimensional modeling

  • Model AI-compatible database schemas and ontologies for future-facing analytics

  • Migrate and transform structured, semi-structured, and unstructured data using dbt and Pandas

  • Work with streaming/event-driven architectures for real-time data processing

  • Optimize pipeline performance and scalability for large data volumes

  • Ensure data quality, validation, cleansing, and graceful error handling

  • Perform code reviews to ensure standards, scalability, and best practices

  • Collaborate with DevOps on CI/CD and automated release pipelines

  • Implement data governance, security, and observability across ETL processes


Key Requirements

  • 5+ years of experience designing and building enterprise-scale ETL pipelines

  • Strong Python skills, including use of Pandas for data processing

  • Proficient in SQL — writing complex queries and procedures across RDBMSes

  • Experience with Airflow, Prefect, or similar workflow orchestration tools

  • Solid understanding of data warehousing (OLTP, OLAP, Facts & Dimensions)

  • Familiarity with cloud-based data platforms such as RDS, Redshift, or Snowflake

  • Knowledge of cloud data architectures, messaging systems, and scalable design

  • Hands-on experience with code versioning, deployment automation, and CI/CD


Preferred Qualifications

  • Experience with Docker, Kubernetes, AWS Lambda, and Step Functions

  • Exposure to Databricks, PySpark, and advanced distributed data processing

  • Cloud certifications are a plus


What’s Offered

  • B2B contract with monthly compensation up to $7,000

  • Strong career growth opportunities in a global fintech environment

  • High-impact projects in a fast-growing sector

  • Friendly, open, and ambitious team culture

  • Hybrid model – minimum 3 days/week in the Warsaw office

