Senior Cloud AWS Engineer – Remote / Telecommute at Cynet Systems #vacancy #remote

Job Description:

Responsibilities:

  • Understand the technology vision and strategic direction of the business.
  • Understand our current data model and infrastructure, proactively identify gaps, areas for improvement, and prescribe architectural recommendations with a focus on performance and accessibility.
  • Partner across engineering teams to design, build, and support the next generation of our analytics systems.
  • Partner with business and analytics teams to understand specific requirements for data systems to support both development and deployment of data workloads ranging from Tableau reports to ad hoc analyses.
  • Own and develop architecture supporting the translation of analytical questions into effective reports that drive business action.
  • Automate and optimize existing data processing workloads by recognizing patterns of data and technology usage and implementing solutions.
  • Apply a solid grasp of the intersection between analytics and engineering, maintaining a proactive approach to ensure solutions demonstrate high levels of performance, privacy, security, scalability, and reliability upon deployment.
  • Provide guidance to partners on effective use of the database management systems (DBMS) platform through collaboration, documentation, and associated standard methodologies.
  • Design and build end-to-end automation to support and maintain software currency.
  • Create automation services for builds using Terraform, Python, and OS shell scripts.
  • Develop validation and certification processes through automation tools.
  • Design integrated solutions in alignment with design patterns, blueprints, guidelines, and standard methodologies for products.
  • Participate in developing solutions by incorporating cloud-native and 3rd-party vendor products.
  • Participate in research and perform POCs (proofs of concept) with emerging technologies and adopt industry best practices in the data space for advancing the cloud data platform.
  • Develop data streaming, migration, and replication solutions.
  • Demonstrate leadership, collaboration, exceptional communication, negotiation, strategic and influencing skills to gain consensus and produce the best solutions.
  • Engage with senior leadership, business leaders at the client, and the Board to share the business value.

Qualifications:

  • Demonstrates mutual respect, embraces diversity, and acts with authenticity.
  • Bachelor's degree in Computer Science, Management Information Systems, Computer Engineering, or a related field, or equivalent work experience; advanced degree preferred.
  • Seven+ years of experience designing and building large-scale solutions in an enterprise setting, in both on-premises and cloud environments.
  • Three+ years designing and building solutions in the cloud.
  • Expertise in building and managing Cloud databases such as AWS RDS, DynamoDB, DocumentDB or analogous architectures.
  • Expertise in building Cloud Database Management Systems in Databricks Lakehouse or analogous architectures.
  • Expertise in cloud data warehouses such as Redshift or BigQuery, or analogous architectures, a plus.
  • Deep SQL expertise, data modeling, and experience with data governance in relational databases.
  • Experience with the practical application of data warehousing concepts, methodologies, and frameworks using traditional (Vertica, Teradata, etc.) and current (SparkSQL, Hadoop, Kafka) distributed technologies.
  • Refined skills using one or more scripting languages (e.g., Python, bash, etc.).
  • Experience using ETL/ELT tools and technologies such as Talend and Informatica a plus.
  • Embrace data platform thinking; design and develop data pipelines with security, scale, uptime, and reliability in mind.
  • Expertise in relational and dimensional data modeling.
  • UNIX admin and general server administration experience required.
  • Experience with Presto, Hive, SparkSQL, Cassandra, Solr, or other Big Data query and transformation technologies a plus.
  • Experience using Spark, Kafka, Hadoop, or similar distributed data technologies a plus.
  • Able to expertly express the benefits and constraints of technology solutions to technology partners, business partners, and team members.
  • Experience with leveraging CI/CD pipelines.
  • Experience with Agile methodologies and able to work in an Agile manner is preferred.
  • One or more cloud certifications.

