Data Engineer with GCP at Ework Group #vacancy #remote

  • Higher education, preferably in IT or a related field
  • Several years of experience in the IT industry as a Data Engineer or in a similar role
  • Hands-on experience with SQL and Python programming tasks
  • Hands-on experience with the GCP and BigQuery stack, including a DWH platform, as well as Spark
  • Knowledge of an orchestration and scheduling tool, e.g. Airflow, Prefect, or Dagster
  • Understanding of DevOps concepts such as GKE and Docker
  • Knowledge of software engineering and its good practices
  • English skills, as you may work with international clients

For our client, an IT consulting company specializing in AI/Data Engineering and Analytics solutions, we are conducting a recruitment process for the position of Data Engineer with GCP.

Project Description:

The team is responsible for designing, building, and maintaining the architecture, tools, and processes used to create the data platforms that Analytics Engineers rely on.

Responsibilities:

  • Assessing and choosing suitable technologies for the project in cooperation with other team members
  • R&D activities, plus maintenance and monitoring of the platform's components
  • Implementing intricate data intake procedures
  • Working on data models
  • Implementing and executing policies aligned with strategic plans
  • Ensuring compliance with regulations on security, data privacy, and data processing
  • Cooperating with others to share knowledge and providing training where needed

Requirements: SQL, Python, GCP, BigQuery, DWH, Spark, Airflow, DevOps, Docker

Additionally: Sports subscription, Private healthcare, International environment, Support for relocation, Life insurance.

