Remote Lead Data Engineer (Databricks/AWS) at Insight Global #vacancy #remote

Position: Lead Data Engineer (Databricks/AWS)
Duration: 6-month contract, with extensions or possible conversion
Location: Remote (East Coast)

Must Haves:
- 2-3 years of experience as a data engineer
- 3 years of recent experience with Databricks; this must be the candidate's most relevant experience
- Experience with a cloud platform (Azure, GCP, or AWS)
- Experience with SQL; the role involves writing queries
- Lead experience; comfortable with more than heads-down work
- Python/PySpark experience

Plusses:
- AWS certifications
- Data governance experience
- Bachelor's degree

Day to Day:
Insight Global is seeking a Lead Data Engineer to work remotely (East Coast) for a large retail client headquartered in the greater Pittsburgh area. The client recently kicked off a major project to build its own home-grown eCommerce platform and applications that will be used internally and then deployed to its 600+ retail locations. In this role, the Lead Data Engineer will join the RISC team and will design, develop, and maintain scalable, efficient data pipelines for extracting, transforming, and loading (ETL) data from various sources into the client's data warehouse, leveraging AWS, Databricks, Python, and SQL (see the pipeline sketch at the end of this posting). Additional responsibilities include:
- Collaborate with cross-functional teams to understand data requirements and implement solutions that meet business needs.
- Optimize and troubleshoot existing data pipelines for performance and reliability.
- Implement data quality and validation processes to ensure accuracy and consistency of data.
- Stay updated on industry trends and best practices in data engineering to continuously improve the data infrastructure.
- Work closely with data scientists, analysts, and other stakeholders to understand data use cases and provide the necessary infrastructure and support.
- Assemble large, complex data sets that meet functional and non-functional business requirements.
- Identify, design, and implement internal process improvements, including re-designing infrastructure for greater scalability, optimizing data delivery, and automating manual processes.

Compensation: $70.00-80.00/HR
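
To give a concrete sense of the ETL and data-quality work described above, here is a minimal PySpark sketch of the kind of pipeline this role would own on Databricks. It is illustrative only: the S3 path, table and column names, and the 95% quality threshold are assumptions, not details from the posting.

# Minimal sketch of an extract/transform/validate/load pipeline on Databricks.
# All paths, table names, columns, and thresholds below are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read raw order events from a hypothetical S3 landing zone.
raw = spark.read.json("s3://example-landing-zone/orders/")

# Transform: basic cleanup plus a simple data-quality gate.
orders = (
    raw.filter(F.col("order_id").isNotNull())            # drop malformed rows
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .dropDuplicates(["order_id"])
)

# Validate: fail fast if the quality gate rejected too many rows.
raw_count, clean_count = raw.count(), orders.count()
if raw_count and clean_count / raw_count < 0.95:
    raise ValueError(f"Quality gate failed: kept {clean_count}/{raw_count} rows")

# Load: append to a Delta table (Delta Lake is Databricks' default table format).
orders.write.format("delta").mode("append").saveAsTable("warehouse.orders")

In a Databricks notebook a spark session is already provided, so the getOrCreate() call simply reuses it; in a scheduled job the same script runs unchanged.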

databricks PySpark Google Cloud Platform (GCP) cloud-computing data governance SQL Python Amazon Web Services (AWS) Data Engineering Azure
