Data Engineer with AWS and Snowflake at DCG #vacancy #remote

DCG is a technology company that brings IT professionals together in its teams. Due to our own growth and that of our clients, and the large number of ongoing projects, we are constantly looking for new employees and associates with strong competencies in IT and related areas, consistent with the information provided in our job advertisements.

Responsibilities:

  • Building data pipelines to ingest data from various sources such as databases, APIs, or streaming platforms. Integrating and transforming data to ensure its compatibility with the target data model or format.
  • Designing and optimizing data storage architectures, including data lakes, warehouses, and distributed file systems. Implementing techniques like partitioning, compression, or indexing to optimize data storage and retrieval. Identifying and resolving bottlenecks, tuning queries, and implementing caching strategies to enhance data retrieval speed and overall system efficiency.
  • Designing and implementing data models that support efficient data storage, retrieval, and analysis. Collaborating with data scientists and analysts to understand their requirements and provide them with well-structured and optimized data for analysis and modeling purposes.
  • Collaborating with cross-functional teams including data scientists, analysts, and business stakeholders to understand their requirements and provide technical solutions. Communicating complex technical concepts to non-technical stakeholders clearly and concisely.
  • Working independently and taking responsibility for delivering solutions.
  • Working within Agile (Scrum) development methodologies.

Requirements:

  • 3+ years of professional experience in the Data & Analytics area
  • Good knowledge of Snowflake
  • Good knowledge of AWS cloud services (S3, RDS, Redshift, etc.), knowledge of other clouds is an asset
  • Very good communication skills: the ability to convey information clearly and precisely, and experience working with business users
  • English at least at B2 level, ideally C1
  • Experience working in the Agile (Scrum) methodology
  • Capable of working both independently and in a team-oriented, collaborative, cross-functional and cross-cultural environment
  • Hands-on experience in designing and implementing data pipelines, data warehouse solutions and ETL workflows
  • Passion for the Data & Analytics solutions in the Cloud environment
  • AWS and Snowflake certifications – nice to have

What we offer:

  • Unlimited access to the Udemy learning platform from day 1
  • Certificate training programs
  • Medical Insurance
  • Multisport Card
  • Fully remote work possibility
