The role is available for candidates from Warsaw, Łódź, Tricity, and Wrocław!
About the role:
As a Data Engineer (Databricks), you will collect, aggregate, store, and reconcile data in support of the customer's business decisions. You will design and build data pipelines, data streams, data service APIs, data generators, and other end-user information portals and insight tools.
Day-to-day you will:
- Cooperate closely with the customer to propose and deliver solutions to business challenges
- Plan, build, and implement data solutions based on the Microsoft Azure platform and Azure Data & Analytics PaaS services
- Design and build modern data pipelines and data streams
- Develop and maintain the data warehouse
- Design and create ETL processes that supply the data warehouse
- Implement effective metrics and monitoring processes
- Preprocess structured and unstructured data and create database queries
About you:
You are a Data Engineer with strong Databricks, collaboration, and customer-facing skills. You are a curious team player with a passion for learning, likely with about 2-4 years of relevant experience.
Qualifications:
- Experience working with Azure Databricks
- Familiarity with PySpark
- Fluent Polish and English, both verbal and written
- Mastery of SQL (T-SQL or PL/SQL preferred)
- Knowledge of at least one component: Azure Data Factory, Azure Data Lake, Azure SQL DW, Azure SQL
- Experience with any programming language used for data engineering purposes: Python, Scala, R, Java, etc.
- Ability to conduct data profiling, cataloging, and mapping for technical design and construction of technical data flows
The following will be considered an advantage but are not required:
- Microsoft Certified: Azure Data Engineer Associate
- Knowledge of the following components of Azure Analytics: Azure HDInsight with Spark, Azure Stream Analytics, Azure IoT Hub, Azure Event Hub, or Azure Cosmos DB
- Experience preparing data for Data Science and Machine Learning
- Knowledge of Jupyter Notebooks or Databricks Notebooks for Python development