Data Engineer at AVENGA #vacancy #remote

  • Experience in developing SQL queries (Snowflake)
  • Experience in Talend development
  • Experience working with CI/CD development environments (e.g. GitLab, Bitbucket)
  • 4+ years of experience with a programming language focused on data pipelines, e.g. Python or R
  • 3+ years of experience in data pipeline maintenance
  • 3+ years of experience with different types of storage (filesystem, relational, MPP, NoSQL) and with various kinds of data (structured, unstructured, metrics, logs, etc.)
  • 3+ years of experience working with data architecture concepts (in any of the following areas: data modeling, metadata management, workflow management, ETL/ELT, real-time streaming, data quality, distributed systems)
  • 3+ years of experience with cloud technologies, with an emphasis on data pipelines (Airflow, Glue, Dataflow, as well as other solutions for handling data in the cloud: Elastic, Redshift, BigQuery, Lambda, S3, EBS, etc.)
  • 1+ years of experience in Java and/or Scala
  • Very good knowledge of data serialization formats such as JSON, XML, and YAML
  • Excellent knowledge of Git, Gitflow, and DevOps tools (e.g. Docker, Bamboo, Jenkins, Terraform)
  • Excellent knowledge of Unix

Requirements: Python, SQL, Snowflake, Docker.
Additionally: Sport subscription, Training budget, Private healthcare, International projects.
