Project information:
Industry: banking
Location: Gdynia, hybrid (1 day per week in the office)
Project language: English and Polish
Remuneration: up to 190 PLN/h net + VAT (B2B)
Key responsibilities:
Develop Scala/Spark programs, scripts, and macros for data extraction, transformation and analysis.
Design and implement solutions to meet business requirements.
Support and maintain existing Hadoop applications and related technologies.
Develop and maintain metadata, user access and security controls.
Develop and maintain technical documentation, including data models, process flows and system diagrams.
Description of knowledge and experience:
Minimum of 3-5 years' experience in Scala/Spark-related projects.
Create Scala/Spark jobs for data transformation and aggregation per complex business requirements.
Should be able to work in a challenging, agile environment with quick turnaround times and strict deadlines.
Write and run unit tests for the Scala code.
Raise pull requests, trigger builds, and release JAR versions for deployment via the Jenkins pipeline.
Should be familiar with CI/CD concepts and processes.
Peer-review code.
Perform root-cause analysis (RCA) of reported bugs.
Should have an excellent understanding of the Hadoop ecosystem.
Should be well versed in the following technologies: Jenkins, HQL (Hive queries), Oozie, shell scripting, Git, Splunk.
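To illustrate the kind of work described above, here is a minimal sketch of a transformation/aggregation job in Scala. The transaction schema and field names are invented for illustration only; the aggregation logic is kept as a pure function so it can be unit-tested without a Spark cluster, with the equivalent Spark DataFrame call noted in a comment.

```scala
// Hypothetical example: aggregate transaction amounts per account.
// The Txn schema below is an assumption, not taken from the project.
object TxnAggregation {
  case class Txn(accountId: String, amount: BigDecimal)

  // Pure aggregation logic, testable in a plain unit test.
  // In a Spark job the same step would typically be expressed as:
  //   df.groupBy("account_id").agg(sum("amount"))
  def totalsByAccount(txns: Seq[Txn]): Map[String, BigDecimal] =
    txns.groupBy(_.accountId)
      .view
      .mapValues(_.map(_.amount).sum)
      .toMap
}
```

Separating the business logic from the Spark wiring in this way makes the "perform unit tests of the Scala code" requirement straightforward to meet.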
Nice to have:
• Relevant certifications (e.g., Scala, Spark, Hadoop, Performance)
• Knowledge of other programming languages (e.g., Python, R)
• Insight into cloud-based solutions such as Snowflake
• Experience in Financial Services, preferably in the credit risk domain
WE OFFER:
Attractive remuneration in the B2B model depending on your competencies and experience,
Co-financing of private medical care (Medicover) and the Multisport card,
The emagine mobile application - easy working-time reporting and quick access to new offers,
Transparent relations built on trust and fair play.