Data Analytics Engineer/Remote/Python/Go/ETL/Kafka/AWS at Motion Recruitment #vacancy #remote

Job Description: Our client is an innovative tech firm specializing in advanced data analytics solutions. Founded with a vision to empower businesses through data-driven insights, they offer a comprehensive suite of services tailored to the evolving needs of modern enterprises. With a focus on excellence and cutting-edge technology, they deliver scalable ETL operations and robust data analytics solutions that drive informed decision-making and sustainable growth. Committed to staying ahead in the dynamic landscape of data analytics, they provide top-tier expertise and exceptional service, enabling their clients to unlock the full potential of their data assets.

They are currently looking to hire a Senior ETL Operations and Data Analytics Engineer, someone who will serve as a driving force in the data-driven decision-making process. Your role will be pivotal in designing, implementing, and maintaining ETL processes, ensuring data accuracy, and providing insights that propel business growth.

Responsibilities:

  • Develop, implement, monitor, and refine ETL workflows to guarantee data quality and optimize performance.
  • Collaborate with cross-functional teams to understand and address data requirements.
  • Fine-tune ETL processes for enhanced performance and scalability.
  • Build and maintain data models to support reporting and analysis needs.
  • Work closely with DevOps teams to deploy ETL solutions efficiently within Kubernetes environments through CI/CD pipelines.
  • Troubleshoot ETL processes and resolve issues promptly.
  • Conduct data analysis, develop dashboards, and deliver actionable insights to stakeholders.

Requirements:

  • Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
  • Minimum of 8 years of hands-on experience in ETL operations, systems operations, and data analytics.
  • Advanced proficiency in SQL, Git, common data formats (JSON, YAML, CSV), and Microsoft Excel.
  • Expertise in Python and Bash scripting, including object-oriented programming techniques in Python.
  • Familiarity with Go, Ruby, and other languages is advantageous.
  • Familiarity with Argo CD/Argo Workflows, Kubernetes (K8s), containers, GitHub Actions, Linux, and AWS.
  • Proficiency in writing raw SQL queries, with experience in MySQL or similar relational databases.
  • Solid grasp of data modeling principles and methodologies.

Desired Skills:

  • Familiarity with ELK (Elasticsearch, Logstash, Kibana) or OpenSearch for advanced log and data analysis.
  • Knowledge of reporting tools such as JasperReports and BIRT.
  • Exposure to Apache Kafka for real-time data streaming and event-driven architectures.
  • Experience with relational databases such as PostgreSQL and MySQL for managing structured data.
  • Familiarity with Apache Druid, an open-source analytics data store, and its integration into data pipelines.
  • Proficiency in Apache Superset for crafting interactive and insightful data visualizations.

The Offer: You will receive the following benefits:

  • Health Care Plan (Medical, Dental & Vision)
  • Retirement Plan (401k, IRA)
  • Life Insurance (Basic, Voluntary & AD&D)
  • Paid Time Off (Vacation, Sick & Public Holidays)
  • Short-Term & Long-Term Disability
  • Training & Development
  • Work From Home

**Due to this role's cooperation with the US State Department, United States citizenship is required. No security clearances are required.**

#LI-LT2
Posted by: Lindsay Troyer
Specialization: Data Engineering

Tags: PostgreSQL, Amazon Web Services (AWS), Apache Kafka, JSON, JasperReports, Linux, Elasticsearch, YAML, Apache Druid, MySQL, Elastic Stack, Microsoft Excel, Argo CD, Kibana, OOP, Git, Go, Data Analyst, Python, BIRT, Logstash, CSV, GitHub, SQL, Kubernetes, Bash, ETL, Ruby, Apache Superset
