Cloud Data Engineer at Sela #vacancy #remote

About Us  

Sela is a global cloud partner, dedicated to guiding our clients in their cloud journey. Through our deep relationships with top cloud platforms (AWS, GCP, Azure, Alibaba) and our expertise in building resilient cloud infrastructure, we enable our clients to ‘Cloud Better.’ Our team has over three decades of experience delivering excellence and thought leadership to clients. Visit our website to learn more about what we do. 

What We Are Looking For  

We’re looking for a Cloud Data Engineer to join our cloud development team. Your primary role will be to help design, build, and optimize data solutions for our clients. You’ll work with large datasets, leveraging the latest cloud technologies to build robust data pipelines and analytics platforms. You take ownership of your work, are a great communicator, and thrive in environments with changing requirements. 

Our team includes network architects, cloud architects, software developers, and security engineers. You will have the chance to gain significant real-world experience and work on high-impact projects. We also provide many opportunities for growth and will cover the costs of relevant education and certifications. 

Due to regulatory requirements, only candidates located in the US will be considered for this position. This is a fully remote role. 

What You’ll Do:  

  • Architect and develop data pipelines for batch processing, streaming, and event processing 
  • Build and manage data lakes/lakehouses 
  • Implement and optimize data warehousing solutions 
  • Collaborate with data analysts, scientists, and clients to understand requirements 
  • Implement data governance, security, and compliance controls 
  • Contribute to planning new data architectures and evaluating emerging technologies 

Relevant Skills:  

Batch and Distributed Data Processing  

  • Distributed frameworks: Apache Spark, Databricks, Flink, Ray 
  • Programming: Python, Scala, Java 

Streaming and Event Processing  

  • Streaming platforms: Kafka, Kinesis 
  • Real-time data ingestion and processing 

Data Warehousing and Analytics  

  • Cloud data warehouses: Snowflake, BigQuery, Redshift 
  • Database workloads: OLAP, OLTP 
  • Query languages and engines: SQL, Presto 
  • ETL/ELT: AWS Glue 
  • Data Lakehouses: Delta Lake 

Additional Skills:  

  • Cloud platforms: AWS, GCP, Azure 
  • Data governance, lineage, security 
  • Containerization: Docker, Kubernetes 

Great to Have  

  • Experience with Docker, containerization, and container orchestration 
  • Experience building serverless applications and workflows 
  • Experience with Python, Node.js, JavaScript, and SQL 
  • Experience making infrastructure design decisions and managing client requirements 
  • AWS certifications 

Benefits  

  • Remote-first workplace 
  • Top-tier medical, dental, and vision insurance for you and your dependents (with 100% coverage options) 
  • FSA and HSA plans available 
  • Retirement matching (3%) 
  • Unlimited PTO 
  • Home office stipend 
  • Annual team get-togethers 
  • Modern tools to get your job done (MacBook, etc.) 
  • Collegial workplace focused on individual growth 

We don’t expect candidates to be familiar with every skill listed, but we do expect comprehensive experience in the relevant domains and an eagerness to learn. If you geek out on data infrastructure and want to work on cutting-edge projects, we’d love to hear from you! This is a full-time position with a generous benefits package and a quarterly performance bonus. Pay for this position will be commensurate with experience. 

