Advanced Analytics Data Engineer


Do you want to work in a young, dynamic international team united around data and customer centricity? Are you passionate about turning raw data into gold? Are you ready to shape the future of the financial industry in Central and Eastern Europe? Then you are the right person for our team!


With strong customer recognition and a position as a digital leader on the market, Raiffeisenbank Serbia is seeking to grow its freshly established Advanced Analytics Team and to drive analytical transformation at the international level for the entire Raiffeisen Group.

We are looking for ambitious candidates who will support us in our productive environment to:

  • Drive the scaling of new banking solutions with the help of data science
  • Be the mastermind in managing data workflows and turning them into practical insights
  • Contribute to RBI Group's transformation into a data-driven company and the most recommended financial institution

You will be part of our adaptive set-up, where data engineers, data scientists, MLOps engineers, and business experts work together internationally on different use cases.

What your job will look like:

  • Participate in the lifecycle of data science projects, incl. design and development of data processing and monitoring pipelines, resource planning
  • Work with state-of-the-art cloud infrastructure (AWS, Databricks)
  • Assemble large, complex data sets to meet functional / non-functional business requirements
  • Develop, maintain and optimize ELT and ETL pipelines (incl. incidents investigation, writing “postmortems”)
  • Continuously support internal consumers (data analysts, data scientists) with data engineering best practices and automation of development pipelines
  • Prepare accompanying documentation and data specifications, and contribute to the data catalogue
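To give candidates a feel for the pipeline work described above, here is a minimal sketch of an extract-transform-load step in Python. It is illustrative only: the table name (`balances`), column names, and sample data are invented for the example and are not part of the role's actual stack.

```python
import csv
import io
import sqlite3

def extract(csv_text):
    """Extract: parse raw CSV text into dict records."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(records):
    """Transform: drop rows with empty amounts and cast types."""
    return [(r["account"], float(r["amount"])) for r in records if r["amount"]]

def load(rows, conn):
    """Load: write the cleaned rows into the target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS balances (account TEXT, amount REAL)")
    conn.executemany("INSERT INTO balances VALUES (?, ?)", rows)
    conn.commit()

# Run the three stages end to end against an in-memory database.
conn = sqlite3.connect(":memory:")
raw = "account,amount\nA,100.0\nB,\nC,25.5\n"
load(transform(extract(raw)), conn)
print(conn.execute("SELECT * FROM balances ORDER BY account").fetchall())
```

In a production setting each stage would be a separately monitored task (e.g. in an orchestrator), but the same extract/transform/load separation applies.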

What you bring to the table:

  • Proven track record as a Data Engineer / ETL Developer or in a similar role (financial industry experience is a plus, but not a must)
  • Software engineering excellence, understanding of the SDLC, Linux & bash as your everyday tools
  • Deep knowledge of SQL (DDL, analytical functions, sub-queries, performance optimization principles for popular relational DBs, e.g. PostgreSQL, MySQL, ClickHouse)
  • Professional experience in designing and developing data pipelines in Python/Spark/Scala
  • Good comprehension of data warehousing principles, MDM, data models (LDM/PDM)
  • Cloud skills (experience with any of GCP, Azure, or AWS is an advantage)
  • Fluent English, spoken and written
  • Fire in the eyes, a desire to learn and to improve on the status quo
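As a small illustration of the SQL analytical (window) functions mentioned above, the snippet below computes a per-account running total. It uses Python's built-in `sqlite3` module purely for self-containment (window functions require SQLite 3.25+, shipped with modern Python builds); the schema and data are invented for the example.

```python
import sqlite3

# In-memory database with a toy transactions table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE transactions (account TEXT, day INTEGER, amount REAL);
    INSERT INTO transactions VALUES
        ('A', 1, 100.0), ('A', 2, 50.0),
        ('B', 1, 200.0), ('B', 2, 25.0);
""")

# Running balance per account, ordered by day -- a typical analytical query
# using a window function with PARTITION BY / ORDER BY.
rows = conn.execute("""
    SELECT account, day, amount,
           SUM(amount) OVER (PARTITION BY account ORDER BY day) AS running_total
    FROM transactions
    ORDER BY account, day
""").fetchall()

for row in rows:
    print(row)
```

The same pattern (ranking, lag/lead, moving aggregates) carries over directly to PostgreSQL, MySQL 8+, and ClickHouse.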

Will be a plus:

  • BSc in Computer Science, Informatics, Software Engineering or related majors
  • Solid knowledge of ML principles and frameworks, analytical libraries (e.g. pandas, numpy)
  • Familiarity with developing unit and integration tests, TDD
  • Understanding of Git and collaborative coding practices
  • Experience in developing CI/CD/CT pipelines (e.g. Jenkins, TeamCity, GitLab CI)

What we offer:

  • Be part of an international team at a leading banking group
  • Flexible working arrangements and the freedom to set your own work-life balance
  • Tailored professional development
  • Competitive salary
