Note: The application period for this ad has passed.
Job description
Veritaz is a fast-growing IT consulting firm. Our company is made up of insanely bright people from more than four countries, and we are located in Sweden, the UK, the US and Pakistan. The journey has been incredible so far, but it is only the beginning.
We are currently looking for passionate, experienced and results-oriented candidates with a strong professional background and solid fundamentals for the role of Data Engineer to join our team.
Key Responsibilities:
In our fast-growing and sustainable journey to democratise financial data, the Data Engineer is a key player in building data pipelines across products.
You'll use your data engineering expertise and knowledge to ensure that data can be analysed by both customers and other developers. Because little work has been done in this area so far, you will have an early opportunity to shape solutions and implementations.
Build scalable batch and real-time data pipelines (a minimal, illustrative sketch of such a pipeline follows this list).
Test, maintain, and iteratively improve these pipelines.
Collaborate with product teams to help them get the most out of their data.
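To make the pipeline work above concrete, here is a minimal sketch of a daily batch pipeline written with Apache Airflow and Python, both of which appear in the requirements below. It assumes a recent Apache Airflow 2.x release, and the DAG, task bodies, and data are hypothetical placeholders for illustration only, not part of an actual Veritaz product.

# A minimal, illustrative sketch assuming Apache Airflow 2.x; the DAG and
# task bodies below are hypothetical placeholders, not a real pipeline.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def daily_batch_pipeline():
    @task
    def extract() -> list:
        # Placeholder extract step; a real pipeline would read from a source system.
        return [{"id": 1, "value": 42}]

    @task
    def transform(records: list) -> list:
        # Placeholder transform step, e.g. cleaning or enrichment.
        return [{**r, "value": r["value"] * 2} for r in records]

    @task
    def load(records: list) -> None:
        # Placeholder load step; a real pipeline might write Parquet files to S3.
        print(f"Loaded {len(records)} records")

    load(transform(extract()))


daily_batch_pipeline()

In practice, each task would be tested in isolation and the DAG improved iteratively, in line with the responsibilities above.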
We're a tech firm that's cutting-edge, approachable, agile, and always looking for new ways to improve things. We work in small groups that are responsible for all aspects of our services, from conception to implementation to operations and maintenance.
As you will be a major contributor to the development of both existing and future products, the role demands a great deal of responsibility, creativity, and motivation. It brings you closer to the final product and lets you take pride in your impact.
Education, expertise, and experience requirements:
We're looking for someone with a senior level of expertise in all or a majority of the following technologies:
Apache Airflow
AWS stack, including but not limited to S3, Kinesis, Glue, and Lambda
Avro and Parquet
Python, Scala, or Java
Microservice architecture
Kubernetes for container orchestration
Other:
It is advantageous to have the following knowledge & experience:
Apache Spark
Beam or Flink
Java, Scala, and Python
GCP
Data Lakes and/or Data Warehouses (a brief, illustrative Spark sketch follows this list)
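As an illustration of the "advantageous" items above, here is a minimal Apache Spark batch job in Python that reads Parquet data from a data lake and writes an aggregate back. It assumes PySpark and S3-compatible storage are available; the bucket names, paths, and columns are hypothetical placeholders.

# A minimal, illustrative PySpark sketch; paths, columns, and bucket names
# are hypothetical placeholders, not real Veritaz data.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example-batch-job").getOrCreate()

# Read raw events from a (hypothetical) Parquet dataset in a data lake.
events = spark.read.parquet("s3a://example-bucket/raw/events/")

# Aggregate: daily event counts per product.
daily_counts = (
    events
    .groupBy("product_id", F.to_date("event_time").alias("event_date"))
    .count()
)

# Write the result back to the lake, partitioned by date.
daily_counts.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3a://example-bucket/curated/daily_event_counts/"
)

spark.stop()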
Scope: 100%
Location: Stockholm
So, what are you waiting for? Join us on our adventure!