Sr Data Engineer (Remote Eligible) at Sierra Nevada Corporation #vacancy #remote

Are you ready to showcase your advanced technical abilities? Dream, Innovate, Inspire and Empower the next generation to transform humanity through technology and imagination! As a Sr Data Engineer, you’ll design, develop, document, test, and debug new and existing cloud-based data pipelines and transformations. You will participate in and lead design meetings and consult with business clients to develop processes and structures that ingest data from multiple sources into our cloud-based Enterprise Data Warehouse (EDW). Within the EDW, you will use a variety of tools to implement and continuously improve data integration, master data management, data lifecycle management, data security, data quality management, metadata management, and reporting and analytics. You will also perform defect corrections (analysis, design, and code). The Sr Data Engineer also identifies and creates data management practices and processes to be included in the technical architectural standard, and collaborates with teams across the organization to define and improve the data architecture in support of the overall enterprise strategy. Finally, the Sr Data Engineer serves as an escalation point for the other Data Engineers on the team.

As SNC’s corporate team, we provide the company and its business areas with strategic direction and business support spanning executive management, finance and accounting, operations, human resources, legal, IT, information security, facilities, marketing, and communications.

Responsibilities:
- Develop, maintain, and optimize data pipelines and workflows on AWS using tools such as AWS Glue, AWS Lambda, and AWS Step Functions.
- Design and implement data models that ensure data accuracy, completeness, and consistency.
- Collaborate with stakeholders and analysts to identify data requirements and develop solutions that meet their needs.
- Troubleshoot and resolve issues related to data pipelines, data quality, and data modeling.
- Develop and maintain documentation of data pipelines, data models, and other data-related processes.
- Implement security and compliance measures to ensure data is protected and meets regulatory requirements.
- Continuously monitor and optimize production data pipelines and data models to improve performance and efficiency.
- Keep up to date with industry trends and advancements in data engineering and AWS services.
- Communicate effectively with stakeholders to provide updates on project progress and escalate issues when necessary.

Must-haves:
- Bachelor’s Degree in a related field with at least 10 years of relevant experience
- Higher education may substitute for relevant experience
- Relevant experience may be considered in lieu of required education
- Advanced working SQL knowledge and experience with relational databases, query authoring (SQL), and familiarity with a variety of databases
- Experience with AWS cloud services: S3, EC2, RDS, Redshift, Glue, Lambda, Step Functions, Athena, CloudWatch, ECS, IAM
- Experience with AWS security best practices, AWS KMS, and AWS Secrets Manager
- Experience building and optimizing data pipelines, architectures, and data sets
- Operational responsibilities (schedules, monitoring, logging, alerting, error handling, etc.)
- Experience performing root cause analysis on data to answer specific business questions, resolve issues, and identify opportunities for improvement
- Experience with object-oriented scripting languages and frameworks: Python (boto3), PySpark
- Source system integration patterns (SQL, APIs)
- Infrastructure as code: Terraform
- DevOps experience

Preferred:
- Experience building and using custom GitHub Actions and Workflows to support internal DevOps use cases
- Experience with complex infrastructure-as-code concepts such as Terraform module composition and centralization
- Experience working with dbt to model and build Data Warehouse objects
- Comfortable working with Agile tools to plan and organize large engineering efforts into work items
- Familiarity with containerization concepts and tools such as Docker or Podman

Estimated Starting Salary Range: $138,634.92 – $190,623.01. SNC considers several factors when extending job offers, including but not limited to candidates’ key skills, relevant work experience, and education/training/certifications. SNC offers annual incentive pay based upon performance that is commensurate with the level of the position. SNC offers a generous benefit package, including medical, dental, and vision plans, 401(k) with 150% match up to 6%, life insurance, 3 weeks paid time off, tuition reimbursement, and more.
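The data-quality responsibilities above (ensuring accuracy, completeness, and consistency of ingested records) can be sketched in plain Python. This is only an illustration of the kind of row-level validation gate a pipeline might apply before loading to the EDW; the field names (`order_id`, `amount`, `currency`) and rules are hypothetical, not SNC's actual schema:

```python
# Minimal sketch of a row-level data-quality gate for an ingestion pipeline.
# Field names ("order_id", "amount", "currency") are hypothetical examples.

def validate_record(record: dict) -> list[str]:
    """Return a list of data-quality problems found in one record."""
    problems = []
    # Completeness: required fields must be present and non-empty.
    for field in ("order_id", "amount", "currency"):
        if not record.get(field):
            problems.append(f"missing {field}")
    # Accuracy/consistency: amount must be a non-negative number.
    amount = record.get("amount")
    if amount is not None and (not isinstance(amount, (int, float)) or amount < 0):
        problems.append("invalid amount")
    return problems

def partition_records(records):
    """Split a batch into clean rows and quarantined (row, reasons) pairs."""
    clean, quarantined = [], []
    for rec in records:
        issues = validate_record(rec)
        if issues:
            quarantined.append((rec, issues))
        else:
            clean.append(rec)
    return clean, quarantined

batch = [
    {"order_id": "A1", "amount": 19.99, "currency": "USD"},
    {"order_id": "", "amount": -5, "currency": "USD"},
]
good, bad = partition_records(batch)
print(len(good), len(bad))  # → 1 1
```

In a real AWS Glue or Lambda job, the quarantined rows would typically be written to a separate S3 prefix or error table for the monitoring and alerting duties the posting describes.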
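The SQL and root-cause-analysis requirements can likewise be illustrated with a self-contained `sqlite3` session. The staging table and duplicate-detection query below are hypothetical stand-ins for a common warehouse investigation: finding source rows loaded more than once, which would silently inflate downstream metrics:

```python
import sqlite3

# Hypothetical investigation: find duplicated source rows that would inflate
# a revenue metric downstream (a common root-cause-analysis pattern).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE staged_orders (order_id TEXT, amount REAL, loaded_at TEXT);
    INSERT INTO staged_orders VALUES
        ('A1', 10.0, '2024-01-01'),
        ('A1', 10.0, '2024-01-02'),   -- duplicate reload of the same order
        ('B2', 25.0, '2024-01-01');
""")

# Aggregate to flag any order_id that was loaded more than once.
dupes = conn.execute("""
    SELECT order_id, COUNT(*) AS copies
    FROM staged_orders
    GROUP BY order_id
    HAVING COUNT(*) > 1
""").fetchall()
print(dupes)  # → [('A1', 2)]
```

The same query shape runs unchanged on Redshift or Athena, which is why query authoring against a variety of databases is listed as a must-have.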

IMPORTANT NOTICE: To conform to U.S. Government international trade regulations, applicants must be U.S. Citizens, lawful permanent residents of the U.S., protected individuals as defined by 8 U.S.C. 1324b(a)(3), or eligible to obtain the required authorizations from the U.S. Department of State or U.S. Department of Commerce. Learn more about the background check process for Security Clearances.

SNC is a global leader in aerospace and national security committed to moving the American Dream forward. We’re known and respected for our mission and execution focus, agility, and disruptive and rapid innovation. We provide leading-edge technologies and transformative solutions that support our nation’s most critical security needs. If you are mission-focused, thrive in collaborative environments, and want to make our country stronger with state-of-the-art technologies that safeguard freedom, join our team!

As an Equal Opportunity Employer, we welcome our employees to bring their whole selves to their work. SNC is committed to fostering an inclusive, accepting, and diverse environment free of discrimination. Employment decisions are made without regard to race, color, age, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or other characteristics protected by law. Contributions to SNC come in many shapes and styles, and we believe diversity in our workforce fosters new and greater ways to dream, innovate, and inspire.

