Senior Data Engineer at DATUMO sp. z o.o. #vacancy #remote

Expected: Google Cloud Platform, Azure DevOps, AWS, Big Data, Apache Airflow, Apache Spark, Snowflake Data Cloud, Python, Scala

Optional: Docker, Kafka, Databricks

Your responsibilities: Our team members tend to stick around for more than 3 years, and when a project wraps up, we don’t let them go – we embark on a journey to discover exciting new challenges for them. It’s not just a workplace; it’s a community that grows together!

Our requirements:
- 4 years of commercial experience in Big Data
- proven track record with a selected cloud provider: GCP preferred, Azure or AWS
- senior-level knowledge of:
  - programming language – Scala/Java/JVM
  - distributed data processing – Spark or similar
  - pipeline orchestration – Airflow or similar
  - distributed datastore – BigQuery, Snowflake, Hive or similar
- designing and implementing Big Data systems following best practices
- good knowledge of Python
- ensuring solution quality through automated tests, CI/CD and code review
- proven collaboration with business stakeholders
- English proficiency at B2 level, Polish at a communicative level

Optional:
- experience with the Snowflake or Databricks platform
- knowledge of Apache Kafka, Docker and Kubernetes
- experience in Machine Learning projects
- experience with Flink
- willingness to share knowledge (conferences, articles, open-source projects)
- experience in leading a team, or willingness to gain it – for the Team Leader position

Division of working time: Development – 65%, Meetings – 10%, Fixing bugs – 10%, Your own ideas – 10%, Documentation – 5%

Team size: 4–7

This is how we work: at the client’s site; you focus on a single project at a time; you can change the project; you have influence on the choice of tools and technologies; you focus on product development; agile approach.

Development opportunities we offer: conferences in Poland, development budget, intracompany training, mentoring, technical knowledge exchange within the company, time for developing your own ideas

What we offer:
- 100% remote work, with a workation opportunity
- 20 days off
- onboarding with a dedicated mentor
- the possibility to switch projects after a certain period
- an individual budget for training and conferences
- benefits: Medicover private medical care, co-financing of the Medicover Sport card
- the opportunity to learn English with a native speaker
- regular company trips and informal get-togethers

Benefits: sharing the costs of sports activities, private medical care, sharing the costs of professional training & courses, remote work opportunities, flexible working time, integration events

Recruitment stages: Quiz – 15 minutes, Soft skills interview – 30 minutes, Technical interview – 60 minutes

Development opportunities at Datumo: participation in industry conferences, establishing Datumo’s online brand presence, support in obtaining certifications (e.g. GCP, Azure, Snowflake), involvement in internal initiatives such as building technological roadmaps, training budget, access to internal technological training repositories

Discover our exemplary projects:

IoT data ingestion to cloud
The project integrates data from edge devices into the cloud using Azure services. The platform supports data streaming either via the IoT Edge environment with Java or Python modules, or via a direct connection to Event Hubs using the Kafka protocol. It also facilitates batch data transmission to ADLS. Data transformation from raw telemetry to structured tables is done through Spark jobs in Databricks, or through data connections and update policies in Azure Data Explorer (a minimal Spark sketch of this kind of transformation follows the project list).

Petabyte-scale data platform migration to Google Cloud
The goal of the project is to improve the scalability and performance of the data platform by transitioning over a thousand active pipelines to GCP. The main focus is on rearchitecting existing Spark applications to either Cloud Dataproc or BigQuery SQL, depending on the Client’s requirements, and on automating them with Cloud Composer.

Data analytics platform for an investing company
The project centers on developing and overseeing a data platform for an asset management company focused on ESG investing. Databricks is the central component. The platform, built on the Azure cloud, integrates various Azure services for diverse functionalities. The primary task involves implementing and extending complex ETL processes that enrich investment data, using Spark jobs in Scala. Integrations with external data providers, as well as solutions for improving data quality and optimizing cloud resources, have also been implemented.

Realtime Consumer Data Platform
The initiative involves constructing a consumer data platform (CDP) for a major Polish retail company. Datumo has actively participated from the project’s start, contributing to planning the platform’s architecture. The CDP is built on Google Cloud Platform (GCP), utilizing services like Pub/Sub, Dataflow and BigQuery. Open-source tools, including a Kubernetes cluster with Apache Kafka, Apache Airflow and Apache Flink, are used to meet specific requirements. This combination offers significant possibilities for the platform.
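As an illustration of the Spark transformations mentioned above, here is a minimal sketch in Scala of a raw-telemetry-to-structured-table job, assuming a simple JSON telemetry payload. The schema, column names and storage paths are hypothetical and shown only for illustration; they are not taken from any of the projects.

// Minimal sketch: flatten raw JSON telemetry into a structured table with Spark (Scala).
// All paths, column names and the schema below are hypothetical examples.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._
import org.apache.spark.sql.types._

object TelemetryToTable {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("telemetry-to-table")
      .getOrCreate()

    // Assumed payload: {"deviceId": "...", "ts": "...", "readings": {"temperature": 21.5, "humidity": 40.2}}
    val telemetrySchema = StructType(Seq(
      StructField("deviceId", StringType),
      StructField("ts", TimestampType),
      StructField("readings", StructType(Seq(
        StructField("temperature", DoubleType),
        StructField("humidity", DoubleType)
      )))
    ))

    // Read raw JSON telemetry from a hypothetical landing zone
    val raw = spark.read
      .schema(telemetrySchema)
      .json("/mnt/raw/telemetry/")

    // Flatten the nested readings and derive a partition column
    val structured = raw
      .withColumn("temperature", col("readings.temperature"))
      .withColumn("humidity", col("readings.humidity"))
      .drop("readings")
      .withColumn("event_date", to_date(col("ts")))

    // Append to a curated table; Delta Lake is a natural format on Databricks
    structured.write
      .mode("append")
      .partitionBy("event_date")
      .format("delta")
      .save("/mnt/curated/telemetry/")
  }
}

In the Azure Data Explorer variant mentioned above, the equivalent step would typically be expressed as an update policy on the target table rather than a Spark job.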

DATUMO sp. z o.o.: Datumo specializes in providing Big Data and Cloud consulting services to clients from all over the world, primarily in Western Europe, Poland and the USA. Core industries we support include e-commerce, telecommunications and life science. Our team consists of exceptional people whose commitment allows us to deliver highly demanding projects.
