Note: The application period for this listing has passed.
Job description
Veritaz is a fast-growing IT consulting firm. Our company is made up of insanely bright people from more than four countries, and we are located in Sweden, the UK, the US, and Pakistan. The journey has been incredible so far, but it is only the beginning.
Assignment description
Are you a collaborative, proactive, and curious Senior Data Engineer? Then this is an interesting position for you. Our customer is looking for someone with a genuine interest in building powerful data pipelines and a passion for clean code. You will be working on the company's Product Master Data Platform. The company manages the bank's common product catalogue and ensures that product and product-arrangement data is made available to key banking initiatives such as digitalized customer journeys and know-your-customer (KYC) processes. Together with the team, you will take end-to-end responsibility for the cross-platform product master data solution. The data published by the Master Data Platform is used by the bank's internal users, for example Pension & Insurance, FICC & Equities, Trading, Mortgage Loans, Private Loans, AML & Holding Control, and many more.
What you will be doing:
· Develop powerful ETL data pipelines that efficiently process large volumes of product master data, participating in all phases of development: requirements gathering, implementation architecture, coding, and testing.
· Enable product and product-arrangement workflows that expose performant query interfaces and offer easy-to-use integration hooks.
· Build features to tune pipeline processing and indexing to different needs and workloads.
· Build solutions using TDD and code reviews.
· Encourage innovation and embrace modern, agile ways of working.
· In practice, this means that you will:
o Help formulate the implementation roadmap that meets the solution architecture
o Take initiative and ownership in a fast-paced environment
o Drive change and work well independently, in a group, and on your own initiative
Technical and personal competences:
· Requirements:
o Spring Boot
o Spring Batch
o Kafka
o Java 17
o Google Cloud Platform
o Maven
o Microsoft SQL Server
· Nice to have:
o Hadoop data lakes
o Angular
o OpenShift
o Splunk
o JUnit
o Mockito
o REST APIs
o Swagger
o Sonar
o GitHub Actions, etc.
Who you are:
· Relevant degree and industry experience in building complex data pipelines
· Experience with DevOps maturity models and CI/CD
· Experience with messaging and data processing frameworks (e.g. Kafka, IBM MQ)
· Experience with various data lakes and warehouses
· Experience with containerization and other modern software development paradigms
· Previous experience in the financial industry is an advantage
· Can act as a mentor
· Have great communication skills
· Language skills: English