Form of cooperation: B2B
Remote work
Rate: 130-140 PLN/h net+VAT
Responsibilities:
- Implement Medallion Architecture: Apply the medallion architecture to organize data layers (Bronze, Silver, Gold) within the Lakehouse, ensuring progressive enhancement of data quality (see the sketch after this list).
- Design and Optimize Data Storage: Develop scalable and secure data storage solutions that align with medallion architecture principles.
- Develop Data Processing Pipelines: Construct and maintain robust data pipelines for efficient data ingestion, transformation, and consolidation.
- Ensure Data Quality and Compliance: Uphold high standards of data governance, ensuring compliance with industry regulations and data privacy laws.
- Collaborate with Cross-Functional Teams: Work closely with stakeholders across the company to understand data requirements and deliver insights that drive strategic decisions.
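For context on the Bronze/Silver/Gold flow named above, here is a minimal PySpark sketch of a medallion-style pipeline. It assumes a Spark environment with Delta Lake available (e.g. Databricks); the paths, the orders schema, and the quality rules are illustrative assumptions, not part of the role description.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: raw ingestion, data kept as-is plus load metadata.
bronze = (
    spark.read.format("json")
    .load("/landing/orders/")  # hypothetical landing path
    .withColumn("_ingested_at", F.current_timestamp())
)
bronze.write.format("delta").mode("append").save("/lakehouse/bronze/orders")

# Silver: cleaned and conformed; duplicates and bad records filtered out.
silver = (
    spark.read.format("delta").load("/lakehouse/bronze/orders")
    .dropDuplicates(["order_id"])
    .filter(F.col("amount").isNotNull() & (F.col("amount") > 0))
    .withColumn("order_date", F.to_date("order_ts"))
)
silver.write.format("delta").mode("overwrite").save("/lakehouse/silver/orders")

# Gold: business-level aggregates ready for reporting.
gold = (
    silver.groupBy("order_date")
    .agg(
        F.sum("amount").alias("daily_revenue"),
        F.count("*").alias("order_count"),
    )
)
gold.write.format("delta").mode("overwrite").save("/lakehouse/gold/daily_revenue")
```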
Requirements:
- 3+ years of experience in a similar position
- Proficiency in SQL and Python as data processing languages
- Experience with Apache Spark or Databricks as data processing engines
- Strong experience with Lakehouse patterns and medallion architecture in data engineering
- Familiarity with Azure Data Services and other relevant data management tools, including Microsoft Fabric, Synapse Analytics, and Data Factory
- English level: B2-C1
Tags: Databricks, Azure Data Factory, SQL, Python, Apache Spark, Data Engineering, Azure Data Services, Medallion Architecture