Lasso, an IQVIA company, is growing and looking for curious, passionate, and driven individuals to join the team. Our people are our greatest asset, and we're committed to creating an environment where we all thrive doing what we love.

Lasso is the world's most comprehensive platform for healthcare marketing and analytics. It is changing the way healthcare marketing is done by leveraging the latest cloud-native solutions and efficient big data pipelines, processes, and technologies. We work with the largest pharmaceutical brands and media agencies in the US. We empower media planning, buying, and analytics teams with the tools they need to do their job, and do it well. By simplifying workflows that used to take days into seconds, integrating functionality that used to require multiple vendors into one, and providing faster and deeper insights than anyone in the industry, we are helping healthcare marketers cut their costs, move faster, and drive measurable results.

We are looking for a driven and dynamic Senior Data Engineer who will be responsible for expanding and maintaining our data warehouse, developing scalable data products, and helping orchestrate the terabytes of data flowing through the Lasso platform. This individual will work directly with the Data Architect and the CTO of Lasso to establish a machine learning platform and best-practice engineering standards that ensure secure and successful data solutions.

About the Job:
- Construct data pipelines using Airflow and Cloud Functions to meet business requirements set by the Product and Engineering teams (an illustrative sketch of this kind of pipeline follows the lists below)
- Maintain and optimize table schemas, views, and queries in our data warehouse and databases
- Perform ad-hoc analysis to troubleshoot stakeholder issues surrounding data and provide insights into feature usage
- Develop workflows and integrations with data partners to expand our data capabilities
- Document data architecture and integration efforts to give other team members a clear understanding of the data platform
- Provide guidance on data best practices when building out new product lines
- Automate long-running manual processes to free up internal resources
- Create a scalable framework for onboarding new data partners

Must have:
- Experience with data task orchestration (Airflow, cron, Prefect, etc.), including dependency mapping
- Data analysis and data modeling experience
- Strong experience with Python, SQL, and shell scripting
- Experience interacting with APIs, SFTP, and cloud storage locations (e.g., GCS, S3)
- Analytical problem-solving skills, including the ability to analyze application logs to troubleshoot issues
- Familiarity with cloud computing (GCP a plus)

Nice to have:
- Experience with JavaScript or Node.js
- Hands-on work with Airflow
- ML/MLOps
- Docker and Kubernetes
- NoSQL databases
- Exposure to producer/consumer messaging systems
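To give a concrete sense of the pipeline work described above, here is a minimal, illustrative Airflow sketch assuming a simple daily ingest-then-load flow. The DAG name, task names, and placeholder callables are hypothetical and are not taken from this posting.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def load_partner_file(**context):
    # Placeholder: in a real pipeline this might pull a partner file from GCS, S3, or SFTP.
    print(f"Ingesting partner file for {context['ds']}")


def update_warehouse_tables(**context):
    # Placeholder: in a real pipeline this might load staged data into warehouse tables.
    print(f"Updating warehouse tables for {context['ds']}")


with DAG(
    dag_id="partner_ingest_example",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    ingest = PythonOperator(
        task_id="ingest_partner_file",
        python_callable=load_partner_file,
    )
    update = PythonOperator(
        task_id="update_warehouse_tables",
        python_callable=update_warehouse_tables,
    )

    # Dependency mapping: the warehouse update runs only after ingestion succeeds.
    ingest >> update
```

In practice, the placeholder callables would be replaced with cloud storage or SFTP transfers and warehouse load jobs, with the same dependency structure orchestrating them.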
The Lasso Engineering team is a diverse, multinational group of professionals who work together to take healthcare marketing to the next level. We focus on innovation, delivering results, and pushing boundaries. We look for passionate people with solid technical skills and the ability to learn fast and apply new knowledge in practice. If you are curious about the latest data and cloud technologies, have experience building large-scale data platforms, and want to achieve a lot, let's talk.

#Ruth_Thompson_IQVIA #LI-Remote #Digitalenablement

IQVIA is a leading global provider of advanced analytics, technology solutions, and clinical research services to the life sciences industry. We believe in pushing the boundaries of human science and data science to make the biggest impact possible, helping our customers create a healthier world. We are committed to providing equal employment opportunities for all, including veterans and candidates with disabilities. IQVIA's ability to operate and provide certain services to customers and partners necessitates that IQVIA and its employees meet specific requirements regarding COVID-19 vaccination status.

The potential base pay range for this role, when annualized, is $116,000.00 – $174,000.00. The actual base pay offered may vary based on a number of factors, including job-related qualifications such as knowledge, skills, education, and experience; location; and/or schedule (full or part-time). Depending on the position offered, incentive plans, bonuses, and/or other forms of compensation may be offered, in addition to a range of health and welfare and/or other benefits.