Azure Cloud Data Engineer - Fully Remote at Talascend #vacancy #remote

Talascend is currently seeking an Azure Cloud Data Engineer for a fully remote contract opportunity.

NOTE: This is a W2 assignment and requires US citizenship with the ability to obtain a Public Trust Security Clearance. The position is based in the EST time zone and may require occasional travel to the Washington, DC metro area.

SUMMARY:

We are seeking a customer-experience-focused Microsoft Azure Cloud Data Engineer to work with a team of Subject Matter Experts and Developers to design and implement full-lifecycle data pipeline services for Azure Cloud-based Data Lake, SQL, and NoSQL data stores. As a Data Engineer, you will translate business requirements into data engineering solutions to support an enterprise-scale Microsoft Azure-based data analytics and reporting platform. Our ideal candidate is mission-focused, delivery-oriented, and applies critical thinking to create innovative functions and solve technical issues.

Who We Are:

Our client is a Fortune 500® Technology, Engineering, and Science Solutions and Services leader working to solve the world’s toughest challenges in the Defense, Intelligence, Civil, and Health markets. Our Civil Group helps the Government modernize operations with leading-edge, AI/ML-driven data management and analytics solutions. We are a trusted partner to both government and highly regulated commercial customers looking for transformative solutions in mission IT, security, software, engineering, and operations. We work with customers including the FAA, DOE, DOJ, NASA, the National Science Foundation, the Transportation Security Administration, Customs and Border Protection, airports, and electric utilities to make the world safer, healthier, and more efficient.

RESPONSIBILITIES:

  • Support the development, lifecycle management, and deployment of data pipeline services and solutions.
  • Work with client personnel and team members to understand data requirements and implement appropriate data solutions.
  • Design and implement data models and data pipelines for relational, dimensional, data lakehouse (medallion architecture), data warehouse, data mart, and NoSQL data stores.
  • Manage and optimize data storage using Azure ADLS Gen2, Azure Synapse SQL Pools, and Azure Cosmos DB.
  • Develop batch/incremental/streaming data pipelines using Python, SQL, Spark, Scala, and Microsoft Azure services including ADLS Gen2/Blob Storage, Data Factory, Synapse Pipelines, Azure Analysis Services, Logic Apps, Azure Functions, Azure Files, Visual Studio, SQL Server Change Data Capture, and SQL Server stored procedures (see the sketch after this list).
  • Redevelop or migrate existing SSIS extract, transform, load (ETL) scripts to Azure Data Factory.
  • Identify, create, and prepare data required for advanced analytics, visualization, reporting, and AI/ML.
  • Implement data migration, data integrity, data quality, metadata management, and data security functions, and apply advanced techniques such as machine learning to optimize data pipelines.
  • Monitor and troubleshoot data related issues to maintain high availability and performance.
  • Implement governance, build, deployment, and monitoring standards to automate platform administration.
  • Apply DevOps and CI/CD principles to development, test, deployment, and release of code.
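
For a sense of the day-to-day work, here is a minimal sketch of the kind of incremental (bronze-to-silver) load described in the pipeline bullet above, written in PySpark against ADLS Gen2. The storage account, container paths, column names, and watermark value are placeholder assumptions for illustration only, not specifics of this engagement.

    # Illustrative sketch only: a minimal incremental (bronze-to-silver) load in PySpark on ADLS Gen2.
    # The storage account, paths, column names, and watermark value below are placeholder assumptions.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("incremental-bronze-to-silver").getOrCreate()

    # Assumed ADLS Gen2 locations (abfss://<container>@<account>.dfs.core.windows.net/...)
    bronze_path = "abfss://bronze@examplelake.dfs.core.windows.net/orders/"
    silver_path = "abfss://silver@examplelake.dfs.core.windows.net/orders/"

    # High-water mark from the previous run; in practice this would come from a control table
    # or a pipeline parameter (e.g., Azure Data Factory / Synapse Pipelines), not a literal.
    last_loaded_at = "2024-01-01T00:00:00"

    incremental = (
        spark.read.parquet(bronze_path)
        .filter(F.col("modified_at") > F.lit(last_loaded_at))  # pick up only changed rows
        .dropDuplicates(["order_id"])                           # basic data-quality step
        .withColumn("load_date", F.current_date())              # audit/partition column
    )

    # Append only the new or changed records to the silver layer, partitioned by load date.
    incremental.write.mode("append").partitionBy("load_date").parquet(silver_path)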

REQUIREMENTS:

  • BS degree in Computer Science or a related field and 8+ years of experience -OR- a Master's degree with 6+ years of experience.
  • 4+ years of professional experience implementing Azure Cloud-based data engineering solutions.
  • 4+ years of experience designing and building solutions utilizing various Cloud services such as Azure Blob Storage, Azure Data Lake Storage, Azure Synapse Analytics, Azure Data Factory, Integration Runtime, Azure Event Hubs, Azure Functions, Azure Logic Apps, and Azure Analysis Services.
  • 4+ years of experience designing and building ETL/ELT processes by writing custom data pipelines, including initial, incremental, and change data capture loads.
  • 4+ years of experience with more than one of the following scripting languages: SQL, T-SQL, Python, Scala, PySpark.
  • Experience working with Microsoft database and Business Intelligence tools, including SQL Server, stored procedures, SSIS, SSRS, and SSAS (cubes).
  • Knowledge and experience with shell scripting, MDX, and DAX queries.
  • Admin/system-level experience with data management, database creation, user management/access control, ETL package deployment, data modeling, scheduling, debugging, monitoring, security controls, and O&M for both on-premises and Cloud-based data assets.
  • Experience with data engineering solutions for data warehouses, data marts, multi-dimensional models, and semantic data models (e.g., Azure Analysis Services).
  • Knowledge and understanding of AWS Cloud Services including data management and data movement services.
  • Demonstrated experience in supporting production, testing, integration, and development environments.
  • Open mindset and the ability to quickly adapt new technologies to solve customer problems.
  • Experience with Agile process methodology, CI/CD automation, Cloud-based development (Azure, AWS).
  • Ability to successfully obtain a government-issued Public Trust Security Clearance.

Not required; however, additional education, certifications, and/or experience in the following areas is a plus:

  • Experience with ETL/ELT processes for Master Data Management.
  • Experience working within the Scaled Agile Framework (SAFe) process.
  • Experience working with Azure DevOps.
  • Knowledge and understanding of data governance and data discovery tools such as Alation and MS Purview.
  • Knowledge of and experience with configuring ETL pipelines in the Cloud, including VPC/VNet configuration, Integration Runtime, Gateway, and EC2/bastion access settings.
  • Microsoft certification in Azure Fundamentals, Data Engineer, or AI, or the AWS Certified Data Engineer certification.


We thank all applicants for their interest; however, only those qualified individuals who closely meet the qualifications of the position will be contacted. The details of the position are only a summary; other duties may be assigned as necessary.

Background Check and Drug Screen may be required.
