Remote Big Data Engineer at Saxon Global #vacancy #remote

Big Data Engineer: The volume of patient data is large. The engineer will build ETL pipelines to transform and massage the data and build data models, working closely with Data Scientists (see the Spark and Airflow sketches below for the kind of pipeline described). Stack: Spark is most important; Java, Python, or Scala (any of the three is fine). Real-time and batch system experience: Kafka, SQL. GCP is preferred, but any cloud experience is okay. Airflow is preferred; some sort of orchestration tool is needed. Experience with complex data and complicated ETL pipelines is required.
Required Skills: Spark, (Java or Python or Scala), Kafka, GCP (must have)
Basic Qualification:
Additional Skills:
Background Check: Yes
Notes:
Selling points for candidate:
Project Verification Info:
Candidate must be your W2 Employee: No
Exclusive to Apex: No
Face to face interview required: No
Candidate must be local: No
Candidate must be authorized to work without sponsorship: No
Interview times set: No
Type of project: Development/Engineering
Master Job Title: Big Data: Dev
Branch Code: Bentonville
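
For candidates wondering what "real-time ingest plus transform-and-massage" looks like in this stack, here is a minimal sketch: a Spark Structured Streaming job that reads patient events from Kafka, parses and flattens them, and lands date-partitioned Parquet on GCS. The broker, topic, bucket, and field names are all hypothetical, and the job assumes the spark-sql-kafka connector is on the classpath; treat it as an illustration of the pattern, not this project's actual pipeline.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = (
    SparkSession.builder
    .appName("patient-events-etl")  # illustrative app name
    .getOrCreate()
)

# Hypothetical event schema; a real pipeline would derive this from a schema registry.
event_schema = StructType([
    StructField("patient_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_ts", TimestampType()),
])

# Real-time ingest: subscribe to a (hypothetical) Kafka topic.
raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker-1:9092")  # placeholder broker
    .option("subscribe", "patient-events")               # hypothetical topic
    .option("startingOffsets", "latest")
    .load()
)

# "Transform and massage": Kafka values arrive as bytes, so cast to string,
# parse the JSON payload, and flatten it into typed columns.
events = (
    raw.selectExpr("CAST(value AS STRING) AS json")
    .select(F.from_json("json", event_schema).alias("evt"))
    .select("evt.*")
    .withColumn("event_date", F.to_date("event_ts"))  # partition key
)

# Land the cleaned stream as date-partitioned Parquet on a hypothetical GCS bucket.
query = (
    events.writeStream
    .format("parquet")
    .option("path", "gs://example-bucket/patient_events/")
    .option("checkpointLocation", "gs://example-bucket/checkpoints/patient_events/")
    .partitionBy("event_date")
    .trigger(processingTime="1 minute")
    .start()
)
query.awaitTermination()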
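
On the orchestration side, a minimal Airflow sketch of the batch half of the work might look like the following: one daily DAG that submits a (hypothetical) batch variant of the Spark job via spark-submit. The DAG id, schedule, paths, and script name are illustrative assumptions, and the `schedule` argument assumes Airflow 2.4+.

from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="patient_etl_daily",      # illustrative DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # one batch run per day
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=10)},
) as dag:
    # Submit the batch Spark job; the master and script location are placeholders.
    run_spark_etl = BashOperator(
        task_id="run_spark_etl",
        bash_command=(
            "spark-submit --master yarn "
            "gs://example-bucket/jobs/patient_events_batch.py "
            "--run-date {{ ds }}"    # Airflow-templated execution date
        ),
    )

Any orchestration tool would do per the posting; Airflow is shown only because it is the stated preference.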

Scala Data Science Python Apache Spark Apache Kafka Big data Google Cloud Platform (GCP) cloud-computing data-modeling SQL Airflow Java ETL
