Data Architect (Remote) at First San Francisco Partners #vacancy #remote

First San Francisco Partners is a business advisory and enterprise information management (EIM) consultancy dedicated to helping companies leverage their data to improve strategic decision-making, reduce risk, create operational efficiencies and fuel business success. Our services span data governance, data quality strategies, data management architecture, master data management strategy and implementation, analytics and big data.

Job Responsibilities and Duties

We have an immediate opening for a Data Architect who will take a hands-on role in developing data architecture and modeling strategies and in designing, implementing and supporting data architecture deliverables for multiple data integration, data management, data warehousing, business intelligence and analytics projects. The Data Architect will deliver solid, extensible, highly available data models and data environments that support current and future business and technical requirements.

- Develops and maintains architectures for the high-level data environments of the enterprise, at the reference, conceptual and logical levels, ensuring that these align with overall business strategy
- Ensures that database and data storage technologies support the data management needs of the enterprise
- Develops, communicates, supports and monitors compliance with data modeling standards
- Evaluates proposals for development projects to ensure they adhere to all data architecture standards
- Develops and maintains standard patterns for data layers, data stores and utility data management processes (e.g. data movement, data integration) for application across the enterprise
- Assists development projects, either directly or indirectly by liaising with a solution architect, to ensure that good data architecture is implemented in these projects
- Evaluates currently implemented systems to determine their viability in terms of data architecture
- Participates in the oversight of setting data standards for reference data, data formats and similar needs
- Identifies standard metadata for describing data assets
- Develops standards for the semantic needs of data, including different kinds of models (e.g. subject area models, data classification schemes, standards for ontologies)
- Ensures all documentation for data architecture is of high quality and properly curated

Skills & Requirements

Skills and Qualifications:

- Excellent communication, presentation and interpersonal skills
- Ability to communicate clearly with both business and technical resources
- A demonstrated track record of making a difference and adding value
- Strong organizational skills; able to multi-task
- Ability to think creatively; highly driven and self-motivated
- Ability to work with and adjust to changing deadlines
- Ability to quickly adapt to changes, enhancements and new technologies
- Able to perform in a fast-paced, dynamic and innovative work environment and meet aggressive deadlines
- Creative problem-solving skills; able to develop relationships across the organization, working cross-functionally to get results
- Ability to present complex information in a simplified fashion to facilitate understanding
- Can effectively manipulate and analyze large amounts of data
- Ability to understand data relationships and to write and execute SQL queries
- Proficient with MS Office products
- Proficiency with SQL
- Bachelor's degree in Business Administration, Computer Science, CIS or a related field
- 3-5 years of experience with data projects

Additional qualifications:

- Experience with data and enterprise modeling tools, such as Erwin
- Experience with ETL/data quality tools such as Informatica IDQ and Trillium
- Technical expertise with analytical tools including SQL, SAS, SPSS, R and Tableau preferred
- 3+ years of experience in design, development, modification and testing of Hadoop solutions; experience with Oozie, Hive, Pig, Impala, Sqoop, Flume, HBase and Solr a plus
- 7-10 years of experience with Oracle databases
- 5+ years of experience developing complex SQL queries using tools such as Oracle and MySQL
- Understanding of Pentaho or another ETL tool
- Experience with Red Hat Enterprise Linux preferred
- Experience with designing, developing and administering SQL Server databases
- 3+ years of experience developing databases in an Agile framework with constantly changing technical requirements
- 3+ years of experience designing, developing and administering a data warehouse preferred
- 3+ years of experience designing, developing and administering Microsoft Access databases
- 3+ years of experience with T-SQL and writing stored procedures, functions and triggers
- 1+ years of experience migrating a Microsoft Access database to Microsoft SQL Server
- 1+ years of experience connecting a Microsoft Access front end to a Microsoft SQL Server back end
- Experience with NoSQL databases, including MongoDB, Neo4j, Cassandra or others
- Experience with at least one scripting language, such as Python or Perl

