Data Engineer with GCP at GFT Poland #vacancy #remote

  • Openness to work in a hybrid model (2 days from the office per week)
  • At least 2 years of experience as a Data Engineer
  • Deep knowledge of programming in Python
  • Day-to-day experience with SQL, data modelling, BigQuery and Google Cloud
  • Commercial experience working with GCP is an asset
  • Proactive attitude and strong communication skills
  • Knowledge of Tableau, Power BI or Qlik Sense is a big advantage (nice to have)

Why join GFT?

You will work with and learn from top IT experts. You will join a crew of experienced engineers: 60% of our employees are senior level.

Interested in the cloud? You will enjoy our full support in developing your skills: training programs, certifications and our internal community of experts. We have strong partnerships with top cloud providers: Google, Amazon and Microsoft – we are number one in Poland in the number of GCP certifications. Apart from GCP, you can also develop in AWS or Azure.

We are focused on development and knowledge sharing. Internal expert communities provide a comfortable environment where you can develop your skillset in areas such as blockchain, Big Data, cloud computing or artificial intelligence.

You will work in a stable company (32 years on the market) on demanding and challenging projects for the biggest financial institutions in the world.

We offer you:

  • Working in a highly experienced and dedicated team
  • Competitive salary and extra benefit package that can be tailored to your personal needs (private medical coverage, sport & recreation package, lunch subsidy, life insurance, etc.)
  • Permanent or B2B contract
  • Online training and certifications fitted to your career path
  • Free online foreign language lessons
  • Regular social events
  • Access to e-learning platform
  • Ergonomic and functional working space with 2 monitors (you can also borrow monitors for your home office)
  • Online recruitment process

Your tasks:

  • Design, build, test and deploy Google Cloud data models and transformations in the BigQuery and Data Fusion environment (e.g. SQL, stored procedures, indexes, clusters, partitions, triggers, etc.)
  • Optimize data views for specific visualization use cases, making use of schema design, partitions, indexes, down-sampling, archiving, etc. to manage trade-offs such as performance vs. flexibility
  • Review, refine, interpret and implement business and technical requirements
  • Contribute to ongoing productivity and priorities by refining User Stories, Epics and Backlogs in Jira
  • Onboard new data sources; design, build, test and deploy cloud data ingestion, pipelines, warehouses and data models/products (GCP Data Fusion, Spark, etc.)

Requirements: SQL, data modeling, BigQuery, Google Cloud Platform, Python, Tableau, Qlik Sense, Power BI

Additionally: Home office, knowledge sharing, life insurance, sport subscription, training budget, private healthcare, international projects.
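The tasks above mention managing performance/flexibility trade-offs in BigQuery with partitions and clusters. As a minimal, hedged sketch (the `sales.events` table and its columns are hypothetical, invented purely for illustration), date partitioning plus clustering on a frequently filtered column is the standard pattern: partition pruning limits how much data a query scans, and clustering co-locates rows within each partition.

```python
# Hypothetical BigQuery DDL for a date-partitioned, clustered table.
# Partitioning by event_date lets BigQuery prune whole days from a scan;
# clustering by customer_id sorts rows inside each partition so filters
# on that column read fewer blocks.
ddl = """
CREATE TABLE IF NOT EXISTS sales.events (
  event_date  DATE,
  customer_id STRING,
  amount      NUMERIC
)
PARTITION BY event_date
CLUSTER BY customer_id
"""

def partition_filter(start: str, end: str) -> str:
    """Build a WHERE clause on the partitioning column so the
    query engine can prune partitions outside the date range."""
    return f"WHERE event_date BETWEEN '{start}' AND '{end}'"

# A query using this filter scans only the partitions in range:
print(partition_filter("2024-01-01", "2024-01-31"))
```

Queries that omit a filter on the partitioning column scan every partition, so pairing the DDL with range-restricted WHERE clauses is what actually realizes the cost savings.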

