- 8+ years experience in Java, Scala or Python development
- 3+ years of experience in Big Data systems development
- Experience building complex data pipelines with SQL and data transformation tools such as DBT or Dataform.
- Expertise with BigQuery or other data warehouses, including performance tuning and monitoring.
- Understanding of data orchestration tools such as Airflow, including extending them with new data pipelines and troubleshooting operational issues.
- Cloud experience with Google Cloud Platform or AWS, Kubernetes, and Docker.
- Understanding of CI/CD, automated testing, monitoring and other DevOps practices.
- Fluency in algorithms, data structures, and platforms.
- Experience with database system design, including RDBMS and/or NoSQL.
- Bachelor’s degree in computer science, physics, or a related field. A Master’s degree is a plus.
- Communicative-level Polish and English.
You will work on all aspects of agile application development, including our enterprise platform, which interfaces with a multitude of services it depends on to deliver billions of requests per day. Your opinions will matter in every phase of product development, from requirements through validation and deployment. Working on the enterprise platform, you will collaborate with multiple distributed teams to architect, create, and deliver new features and functionality, with the goal of providing the best possible advertising experience in the market. Scalability, performance, and rock-solid reliability are all factors to consider with every line of code you write.
The Team and Project:
You will be part of the core data development team. Our exchange handles billions of ad requests daily, connecting thousands of publishers with demand partners. Each request produces data events that must be processed to extract business value. Our applications generate more than 1 PB of data every day.
Responsibilities:
- Design large-scale data processing systems.
- Work with Product to drive the requirements, and own the project end-to-end.
- Analyse and improve the efficiency, scalability, and stability of applications.
- Think long-term and be unsatisfied with band-aids.
- Identify unnecessary complexity and remove it.

Requirements: Scala, Cloud, Kubernetes, Java, Docker, Spark, Jenkins, Hadoop, Data engineering, Python
Tools: Jira, Confluence, GitHub, Git, Jenkins, Cloud Build, Agile, Scrum
Additionally: Private healthcare, flat structure, lunch card, small teams, Multikafeteria, life & group insurance, LinkedIn Learning, international projects, free coffee, canteen, bike parking, playroom, shower, free snacks, free beverages, free lunch, free parking, modern office, no dress code.