REMOTE Snowflake Data Architect at Sogeti #vacancy #remote

Sogeti is a leading provider of professional technology services, specializing in Application Management, Infrastructure Management and High-Tech Engineering. Sogeti offers cutting-edge solutions around Testing, Business Intelligence, Mobility, Cloud and Security, combining world-class methodologies and its global delivery model, Rightshore®. Sogeti brings together more than 20,000 professionals in 15 countries and is present in over 100 locations in Europe, the US and India. Sogeti is a wholly owned subsidiary of Cap Gemini S.A., listed on the Paris Stock Exchange.

At Sogeti USA, we are committed to building a long and enduring relationship with our employees and to creating an environment that rewards and empowers. Our mission is to constantly exceed our employees’ expectations in the same way that we strive to exceed our clients’ expectations. We offer an environment that celebrates innovation and helps you achieve a good balance between your professional and personal life. We strive to be an employer of choice!

Key Responsibilities Include

  • Design, develop and maintain the data architecture, data models and standards for various Data Integration & Data Warehousing projects in Snowflake, combined with other cloud technologies.
  • Model new features and subject areas/data domains and integrate them with existing structures to provide a consistent solution; develop and maintain documentation of the data architecture, data flows and data models.
  • Leverage Snowflake-native technologies, including Snowpipe, SnowSQL and others.
  • Leverage integration tools that work with Snowflake, including dbt, IICS, Python and Spark.
  • Work in cross-functional teams that combine technical, business and data science competencies.
  • Provide technical leadership for data projects.
  • Take ownership of technical solutions from a design and architecture perspective, ensure the right direction, and propose resolutions to potential data pipeline-related problems.

Experiences Required – Education, Key Experiences, Knowledge and Skills:

  • 7+ years of success in consultative/complex technical sales and deployment projects, architecture, design, implementation, and/or support of highly distributed applications required
  • 5+ years’ experience working with large, complex enterprise accounts, architecting cloud solutions for data estate workloads (DW/ETL/BI projects preferred)
  • Proficiency in data modeling and a very good understanding of data management concepts such as 3NF, data mesh and dimensional modeling, and their specific applications
  • 5+ years implementing large scale data projects within the cloud environment
  • Collaboration and communication: acknowledged for driving decisions collaboratively, resolving conflicts and ensuring follow-through, with exceptional verbal and written communication skills. Ability to orchestrate, lead and influence remote teams, ensuring successful implementation of customer projects.
  • Enterprise-scale technical experience with cloud and hybrid infrastructures, architecture designs, database migrations, and technology management required
  • Breadth of technical experience and knowledge, with depth / Subject Matter Expertise in two or more of the following Data Platform Cloud solutions required:
  • Data warehousing, including SQL DW, Snowflake, Databricks, Redshift and BigQuery
  • Modern data storage technologies such as Azure Data Lake, Databricks and AWS S3
  • Technical capability and experience to learn new technologies and understand relevant cloud trends required
  • Knowledge of competitors: knowledge of competing cloud development platforms preferred

Education

You likely possess an advanced degree in a quantitative field such as computer science, applied mathematics or statistics. At Sogeti, we also support an equivalent combination of education and experience.

The Ideal Candidate Will Have

  • Experience leading 1-2 Snowflake Cloud migrations
  • Previous expertise with SnowSQL, advanced concepts (query performance tuning, time travel, etc.) and features/tools (data sharing, events, Snowpipe, etc.)
  • Experience with or understanding of data pipelines and transformations leveraging Python and Spark/PySpark
  • Snowflake SnowPro Certification is a plus
  • Experience architecting solutions using MS Azure PaaS services and AWS PaaS services
  • Experience leading distributed teams onshore and offshore

Additional Qualifications

  • A minimum of a 4-year bachelor’s degree in Computer Science or a similar field
  • 10+ years as an Enterprise Data Architect
  • Excellent verbal and written communication skills
  • Consulting experience is required

The benefits our employees enjoy:

  • 401(k) Savings Plan, matched 150% up to 6% (our 401(k) is in the top 1% of 401(k) plans offered in the US!)
  • Medical/Prescription/Dental/Vision coverage
  • $12,000 in tuition reimbursement
  • 100% company-paid mobile phone plan
  • Personal Time Off (PTO), ensuring a balance of work and home life
