Position Title: 482-Azure/Oracle/Mainframe Data Warehouse Developer-Remote (No C2C)
Location: Remote, Downey, CA, 90242
Pay Rate: $50-$55
Contract Duration: 1 Year
OT Rate: Straight Time (Exempt)
Estimated Regular Hours/Week: 40.00

This position requires a clear background check, drug test, and education check. Must be authorized to work in the US for any employer without sponsorship. (Principals only! No Corp-to-Corp.)

Skills Required:
1. Strong understanding of relational database concepts, SQL (Structured Query Language), and data modeling; knowledge of databases commonly used in data warehousing, such as Oracle, SQL Server, PostgreSQL, or MySQL.
2. Proficiency with ETL tools such as Azure Data Factory, Azure Synapse, or Microsoft SSIS (SQL Server Integration Services) to extract data from various sources, transform it to fit the target schema, and load it into the data warehouse.
3. Ability to design and implement data warehouse data models, including star schemas, snowflake schemas, and dimension hierarchies, for optimized data retrieval and analysis.
4. Understanding of data integration techniques and data quality processes to ensure data accuracy, consistency, and reliability in the data warehouse.
5. Knowledge of data warehouse architecture principles, such as data staging areas, data marts, data lakes, and overall data flow.
6. Familiarity with data warehouse development methodologies and the ability to apply best practices in building scalable and maintainable data warehouses.
7. Proficiency in scripting languages such as Python, Perl, or shell scripting for automating ETL processes and data manipulation.
8. Understanding of data security principles and compliance regulations to protect sensitive information in the data warehouse.
9. Skill in optimizing data warehouse performance, including query optimization, index creation, and partitioning.

Experience Required:
1) 3 years of experience within the past 4 years working with Oracle database architecture, data modeling, normalization, and performance optimization.
2) 3 years of experience within the past 4 years working with mainframe databases such as IBM DB2 and IMS (Information Management System), including data modeling, database design, and SQL querying.
3) 3 years of experience within the past 4 years working with the Microsoft Azure cloud platform, including familiarity with Azure services such as Azure Data Lake Storage, Azure Databricks, Azure Data Factory, and Azure DevOps.
4) 3 years of experience within the past 4 years designing and developing data warehouses on platforms such as Microsoft SQL Server, Oracle, or Teradata.
5) 3 years of experience within the past 4 years working with big data technologies such as Apache Hadoop, Apache Spark, or Apache Kafka.
6) 3 years of experience within the past 4 years working with data visualization tools such as Power BI or Cognos to create insightful visualizations and reports from data stored in a Synapse data warehouse.
7) 3 years of experience within the past 4 years working with data cleansing, data profiling, and data validation techniques to ensure high data integrity in the data warehouse.

Education Required:
A Bachelor's or Master's degree in Computer Science.

Additional Information:
The candidate must have a Live Scan and background check performed by Probation. We encourage minorities, women, protected veterans, and disabled individuals to apply for all positions for which they may be qualified. We maintain a drug-free workplace and perform pre-employment substance abuse testing and background checks.

If you are interested in this position, please submit your resume as a Word document, listing the month and year you worked at each previous position, to – and copy "482-Azure/Oracle/Mainframe Data Warehouse Developer-Remote (No C2C)" into the email subject line. Or click this email link and attach your resume in MS Word document format.

Job Posted Date: 6/21/2024