- Proficiency in programming languages such as Python and SQL
- Knowledge of the BigQuery DWH platform
- Experience working with Spark and streaming/messaging systems
- Experience as a programmer and knowledge of sound software engineering principles, practices, and solutions
- Familiarity with Google Cloud Platform (GCP)
- Knowledge of at least one orchestration and scheduling tool, for example, Airflow, Prefect, Dagster, etc.
- Familiarity with the DevOps area and its tools, e.g., GKE, Docker
- Experience with a version control system, preferably Git
- Ability to actively participate in and lead discussions with clients to identify and assess concrete, ambitious avenues for improvement
A Data Engineer designs, builds, and maintains the architecture, tools, and processes an organization needs to collect, store, transform, and analyze large volumes of data. The role involves building data platforms on top of the infrastructure typically provided, and paving a clear path for the Analytics Engineers who use the system.
Responsibilities:
- Working alongside Platform Engineers to assess and choose suitable technologies and tools for the project
- R&D, maintenance, and monitoring of the platform’s components
- Implementing intricate data intake procedures
- Constructing efficient data models
- Implementing and executing policies aligned with the company’s strategic plans concerning the technologies used, work organization, etc.
- Ensuring compliance with industry standards and regulations for security and data privacy in the data processing layer
- Providing training and fostering knowledge-sharing

Requirements: Python, SQL, BigQuery, GCP, Docker, Kubernetes, Git, Airflow

Tools: Jira, GitLab, Git, Jenkins / GitLab, Agile

Additionally: Sport subscription, Private healthcare, Flat structure, Small teams, International projects, Team Events, Training budget, Free coffee, Gym, Bike parking, Playroom, Free snacks, Free beverages, In-house trainings, Startup atmosphere, No dress code, Kitchen.