Enterprise Data Platform & Analytics Data Engineer at Thermo Fisher Scientific #vacancy #remote

Work Schedule

Standard (Mon-Fri)

Environmental Conditions

Office

DUTIES:

•           Participate in division- and group-level project data delivery and BI implementations covering Enterprise Data Platform, Data Lake, Data Warehouse, and Data Science projects.

•           Ensure data-related technical design, policies, procedures, and processes are in place to address complex business needs.

•           Lead the design and development of data delivery solutions across various business domains.

•           Build complete data platform solutions, including storage, governance, security, and support for various read and write access patterns.

•           Outline and participate in producing team deliverables (including architecture and technical design documentation, standards, code development, and QA) to high quality standards.

•           Develop foundational frameworks in various technologies, including AWS cloud data lakes and Databricks.

•           Use automation, DevOps, and Machine Learning; document and publish data operations KPI metrics, weekly status updates, and issue resolution details.

•           Build or maintain solutions using AWS, cloud technologies, DevOps, Continuous Integration, and automation methodologies.

TRAVEL:

Up to 10% domestic travel required. Can work remotely or telecommute.

REQUIREMENTS:

MINIMUM Education Requirement: Bachelor’s degree in Computer Science, Software Engineering, or related field of study.

MINIMUM Experience Requirement: 3 years of experience in IT project delivery, operations, or a related area.

Alternative Education and Experience Requirement: Master’s degree in Computer Science, Software Engineering, or related field of study, plus 1 year of experience in IT project delivery, operations, or any related occupation in which the required experience can be gained.

Required knowledge or experience with:

•           Cloud platforms, big data platforms, and DevOps in operations areas;
•           Security principles, microservices, API management, and cost and usage management of cloud-based data platforms;
•           Leading teams to perform against a pre-set Service Level Agreement (SLA) standard using a service management tool such as ServiceNow, Jira, GitHub, Jenkins, or Service Desk Express;
•           IT operations procedures and support processes; and
•           Release management, change management, problem management, and other IT operational methodologies.

DevOps, CI/CD, Amazon Web Services (AWS), Machine Learning
